Re: [Python-Dev] [Python-checkins] r45925 - in python/trunk: Lib/tempfile.py Lib/test/test_os.py Misc/NEWS Modules/posixmodule.c
Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
>> BTW, and intended as an offer of compromise: should we instead
>> add the Win32 codes to the errno module (or a new winerrno
>> module)?! I can write a parser that takes winerror.h and
>> generates the module code.
>
> Instead won't help: the breakage will still occur. It would
> be possible to put symbolic constants into the except clauses,
> instead of using the numeric values, but you still have to
> add all these "except WindowsError, e" clauses, even if
> the constants were available.

Right, I meant "instead of the ErrorCode approach".

> However, in addition it would be useful: people will want to
> check for specific Win32 error codes, just because they are
> more descriptive. I propose to call the module "winerror"
> (in parallel to winerror.h, just as the errno module
> parallels errno.h)

Ok.

> Adding them all to the errno module would work for most cases,
> except that you get conflicts for errno.errcode.

I think it's better to separate the two - you wouldn't want to
load all the winerror codes on Unix.

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2006)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free !

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
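For illustration, the kind of generator offered above could be a small script that scans winerror.h for simple #define lines and writes out a module of constants. This is only a rough sketch under the assumption that the interesting definitions look like "#define ERROR_FOO 123L"; the real header also contains HRESULT-style definitions that would need extra handling:

import re

# Matches lines such as "#define ERROR_FILE_NOT_FOUND 2L".
DEFINE_RE = re.compile(r'#define\s+(ERROR_\w+)\s+(\d+)L?\b')

def generate_winerror_module(header_path, module_path):
    out = open(module_path, 'w')
    out.write('# Generated from winerror.h -- do not edit.\n')
    for line in open(header_path):
        m = DEFINE_RE.search(line)
        if m:
            out.write('%s = %s\n' % (m.group(1), m.group(2)))
    out.close()

# Example: generate_winerror_module('winerror.h', 'winerror.py')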
Re: [Python-Dev] total ordering.
Guido van Rossum wrote:
> On 5/6/06, Vladimir Yu. Stepanov <[EMAIL PROTECTED]> wrote:
> [proposing a total ordering between types]
>
> It Ain't Gonna Happen. (From now on, I'll write this as IAGH.)
>
> In Python 3000, we'll actually *remove* ordering between arbitrary
> types as a feature; only types that explicitly care to be ordered with
> respect to one another will be ordered. Equality tests are unaffected,
> x==y will simply return False if x and y are of incomparable types;
> but x<y will raise an exception.
>
> --
> --Guido van Rossum (home page: http://www.python.org/~guido/)
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/vys%40renet.ru

When a comparison like ("something" < 123) is made, raising an exception is
clearly necessary. For sorting, or for binary trees, it is not so obvious:
the desired comparison behavior depends on the needs of the user, and at
present overriding it is complicated. An example: calling the sort method on
a list can raise completely different exceptions depending on the order of
the values being sorted (thanks to David Mertz and Josiah Carlson):

>>> [chr(255), 1j, 1, u't'].sort()
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: no ordering relation is defined for complex numbers
>>> [chr(255), 1j, u't', 1].sort()
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
UnicodeDecodeError: 'ascii' codec can't decode byte 0xff in position 0:
ordinal not in range(128)

If Python 3000 behaves similarly for types such as str(), int(), complex()
and so on, and the type of exception raised varies this much, it will be
problematic to redefine the behavior of sorting. It would be good to create
one more exception class that unifies the errors raised when non-comparable
types or values are compared.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] binary trees.
Josiah Carlson wrote:
> And you can actually compare str and unicode, so, if you have a str that
> is greater than the unicode, you run into this issue. With unicode
> becoming str in Py3k, we may not run into this issue much then, unless
> bytes are comparable to str, in which case we end up with the same
> problems.
>
> Actually, according to my google of "python dev total ordering", it
> gives me...
>
> http://mail.python.org/pipermail/python-dev/2003-March/034169.html
> http://mail.python.org/pipermail/python-list/2003-March/154142.html
>
> Which were earlier discussions on this topic, which are quite topical.
> The ultimate solution is to choose a total ordering on types and
> consider the problem solved. Then list.sort(

Under the second reference there was a question: can complex be made
partially comparable with int, long and float? After all, 1 == 1+0j?
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
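A quick interpreter session illustrates the point being asked about: equality between int and complex already works, while ordering against complex does not (output shown is from Python 2.4; the exact message wording may differ between versions):

>>> 1 == 1+0j
True
>>> 1 < 1+0j
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: no ordering relation is defined for complex numbers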
[Python-Dev] python 2.4 and universal binaries
Hi,

I'd like to backport the patches I've done to the trunk regarding universal
binary support for OSX and endian issues in Mac specific modules.

The last set seems easy enough, all of those are clearly bugfixes. I'm not
sure if the universal binary patches are acceptable for backport (and
specifically the change to pyconfig.h.in).

The rationale for this is simple: Apple seems to pick up a recent copy of
Python for every new major release of OSX (Python 2.2.x for Jaguar, Python
2.3.0 for Panther, Python 2.3.5 for Tiger) and is therefore likely to use
Python 2.4.x for the next release of the OS. The official Python 2.4 tree
currently doesn't support building universal binaries. Given that Apple
ships a broken universal build of Python 2.3 with the new Intel Macs, I
worry that they won't fix this for their next OS release, which would
increase the support load on the pythonmac-sig list. By adding support for
universal binaries to Python 2.4 we'd reduce the chance of a broken Python
in the next OSX release.

Ronald
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] python 2.4 and universal binaries
Sounds like an all-round good plan to me. On 5/11/06, Ronald Oussoren <[EMAIL PROTECTED]> wrote: > Hi, > > I'd like to backport the patches I've done to the trunk regarding > universal binary support for OSX and endian issues in Mac specific > modules. > > The last set seems easy enough, all of those are clearly bugfixes. > I'm not sure if the universal binary patches are acceptable for > backport (and specifically the change to pyconfig.h.in) > > The rationale for this is simple: Apple seems to pick up a recent > copy of python for every new major release of OSX (Python 2.2.x for > Jaguar, Python 2.3.0 for Panther, Python 2.3.5 for Tiger) and is > therefore likely to use Python 2.4.x for the next release of the OS. > The official python 2.4 tree currently doesn't support building > univeral binaries. Given that Apple ships a broken universal build of > python 2.3 with the new intel macs I'm worrying that they won't fix > this for their next OS release, which would increase the support load > on the pythonmac-sig list. By adding support for universal binaries > to python 2.4 we'd reduce the change for a broken python in the next > OSX release. > > Ronald > ___ > Python-Dev mailing list > Python-Dev@python.org > http://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > http://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] total ordering.
On 5/11/06, Vladimir 'Yu' Stepanov <[EMAIL PROTECTED]> wrote: > If for Python-3000 similar it will be shown concerning types > str(), int(), complex() and so on, and the type of exceptions > will strongly vary, it will make problematic redefinition of > behavior of function of sorting. Not really. We'll just document that sort() should only be used on a list of objects that implement a total ordering. The behavior otherwise will simply be undefined; it will raise whatever exception is first raised by an unsupported comparison (most likely TypeError). In practice this won't be a problem. -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] total ordering.
Guido van Rossum wrote: > On 5/11/06, Vladimir 'Yu' Stepanov <[EMAIL PROTECTED]> wrote: >> If for Python-3000 similar it will be shown concerning types >> str(), int(), complex() and so on, and the type of exceptions >> will strongly vary, it will make problematic redefinition of >> behavior of function of sorting. > > Not really. We'll just document that sort() should only be used on a > list of objects that implement a total ordering. [...] It might be useful in some cases to have a keyword argument to sort/sorted that says to ignore exceptions arising from comparing elements, and leaves the ordering of non-comparable values undefined. -Edward ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] nag, nag -- 2.5 open issues
Neal Norwitz wrote: > Martin: msilib -- Martin/Andrew is this done? That's done, yes. Martin ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] python 2.4 and universal binaries
Ronald Oussoren wrote: > The rationale for this is simple: Apple seems to pick up a recent > copy of python for every new major release of OSX (Python 2.2.x for > Jaguar, Python 2.3.0 for Panther, Python 2.3.5 for Tiger) and is > therefore likely to use Python 2.4.x for the next release of the OS. > The official python 2.4 tree currently doesn't support building > univeral binaries. Given that Apple ships a broken universal build of > python 2.3 with the new intel macs I'm worrying that they won't fix > this for their next OS release, which would increase the support load > on the pythonmac-sig list. By adding support for universal binaries > to python 2.4 we'd reduce the change for a broken python in the next > OSX release. It's fine with me to backport this. Please be aware, though, that the release of 2.4.4 is likely to occur *after* the release of 2.5.0. Not sure when Apple decides to freeze the Python version for the next OS release, but if they really chose the most recent one "by policy", that will be either 2.5.0 or 2.4.3. Regards, Martin ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] total ordering.
Edward Loper wrote: > It might be useful in some cases to have a keyword argument to > sort/sorted that says to ignore exceptions arising from comparing > elements, and leaves the ordering of non-comparable values undefined. Why? Far better to use a key (or cmp if you really want) that imposes a total (or at least consistent) ordering. Otherwise sorting is random. Tim Delaney ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
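To make the suggestion concrete, here is one purely illustrative way to impose a consistent total ordering on a heterogeneous list with a key function. Sorting first by type name and then by repr() is an assumption of this sketch, not something proposed in the thread:

items = [chr(255), 1j, u't', 1]

# Every element maps to a (str, str) tuple, and such tuples always compare
# consistently, so the sort can never hit a cross-type comparison error.
items.sort(key=lambda x: (type(x).__name__, repr(x)))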
[Python-Dev] Efficient set complement and operation on large/infinite sets.
I'm about to write some code to manage sets, and wanted to float a few
thoughts here because I have various ideas about how to implement what I
want to do, and I think one of them could be done by changing Python's set
type in a useful and backward-compatible way.

Apologies if this is discussed in the archives. I didn't see it.

My original motivation is a need to work efficiently with very large sets,
let's say 10M elements. In some cases, you can operate on a set without
needing to explicitly represent all its elements. For example, if my set is
the Universal (U) set, then X in U should return True. If my set is U and I
union it with any other set, it is unchanged, etc. By also maintaining a
finite set of things that are _not_ in my large set, I can do other
operations efficiently. E.g., calling U.remove(X) would _add_ X to the set
of elements that were known not to be in the big set that you were trying
to represent efficiently. In most cases, such a set class would simply do
the opposite/inverse operation on its finite exclusion set.

While it's convenient to warm up by talking about the infinite Universal
set, we could just as easily be talking about arbitrarily large finite
sets. There are some examples below.

Another way to discuss the same thing is to talk about set complements. It
would be nice to be able to complement (or invert) a set. Once you did so,
you might have an infinite set on your hands, but as the last paragraph
argues, you can still operate on an infinite set efficiently. Naturally,
you can't fully enumerate it, or take its size.

I think that giving Python sets the ability to handle complementarity would
have some nice uses - and that these go well beyond the particular thing
that I need right now.

For example, suppose you want to represent the set of all floating point
numbers. You just create an empty set and complement it. Then you can test
it as usual, and begin to exclude things from it. Or if you have a system
with 10M objects and you need to support operations on these objects via a
query language that lets the user type "not X" where X is a set of objects,
there is a natural and efficient way (constant time) to execute the query -
you just mark the set as being inverted (complemented).

If I were going to implement the above by changing Python's built-in set
type, here's what I think I'd do:

Add four optional keyword arguments to the set constructor:

 - invert=False
 - universalSet=None
 - universalSetSize=None
 - universeExclusionFunc=None

invert is just a flag, indicating the sense of the set.

If invert is False:

  the set behaves exactly as a normal Python set and the other three
  new arguments are ignored.

If invert is True:

  universalSet represents the universal set. If it is None, then the
  universal set is infinite. Otherwise, universalSet is an iterable. The
  implementation should not call this iterable unless it's unavoidable, on
  the presumption that if the programmer wanted to operate directly on the
  contents of this iterable, they'd be doing it in a conventional fashion
  (e.g., with a normal set). The universalSetSize, used for len()
  calculations, is the number of elements in universalSet, if known and
  finite.

  The universeExclusionFunc can be called with a single element argument to
  see if the element should be considered excluded from the universalSet.
  This may seem like a weird idea, but it's the key to flexibility and
  efficiency.
In an invert=True set, the code would use the normal set content as a
mutable set of objects that are not in the universe, as described above,
but would, in addition, use the universeExclusionFunc (if defined) to
identify elements not in the set (because they are not in the universe),
and thus avoid the use of the expensive (or infinite) universalSet.

Note that an inverted (or normal) set can be inverted by simply toggling
the invert flag, so this requires a new invert() method (which could be
triggered by the use of something like 'not' or '!' or '~'). In this case,
an inverted set becomes a normal Python set. The elements represented by
universeExclusionFunc remain invalid - they are simply not in the universe
(and, if deemed sensible, this could be sanity checked in add()).

If it's not clear: when a set is inverted, any iterable given to __init__
(i.e., the iterable argument in the normal case of constructing a Python
set) is just treated as usual (but in this case will be taken to represent
things not in the set initially).

Here are some examples of usage:

1. Suppose you want to work with the set of integers [0, 1) and that
   initially your set is all such integers. You could create this set via:

   S = set(invert=True, universalSet=xrange(1), universalSetSize=1,
           universeExclusionFunc=(lambda x: x >= 1))

   This has the intended effect, is efficient, and no-one need call the
   iterator. You can (I t
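As a rough illustration of the core idea above (just the complement flag and the mutable exclusion set, ignoring universalSet and the exclusion function), a minimal sketch might look like the following; the class name and method set are invented for this example and are not part of the proposal:

class ComplementableSet(object):
    """Minimal sketch: a set that can be complemented in O(1).

    When inverted, the underlying set() holds the *excluded* elements.
    """
    def __init__(self, iterable=(), inverted=False):
        self._elements = set(iterable)
        self._inverted = inverted

    def invert(self):
        self._inverted = not self._inverted

    def __contains__(self, item):
        return (item in self._elements) != self._inverted

    def add(self, item):
        if self._inverted:
            self._elements.discard(item)
        else:
            self._elements.add(item)

    def remove(self, item):
        if self._inverted:
            self._elements.add(item)
        else:
            self._elements.remove(item)

everything = ComplementableSet()   # start with the empty set...
everything.invert()                # ...and complement it
everything.remove(3.14)            # exclude one element
print 3.14 in everything           # False
print 42 in everything             # True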
Re: [Python-Dev] Efficient set complement and operation on large/infinite sets.
Hm... Without reading through all this, I expect that you'd be better off
implementing this for yourself without attempting to pull the standard
library sets into the picture (especially since sets.py is obsolete as of
2.4; set and frozenset are now built-in types). You're really after rather
specialized set representations. I've done this myself and as long as you
stick to solving *just* the problem at hand, it's easy. If you want a
general solution, it's hard...

--Guido

On 5/11/06, Terry Jones <[EMAIL PROTECTED]> wrote:
> I'm about to write some code to manage sets, and wanted to float a few
> thoughts here because I have various ideas about how to implement what I
> want to do, and I think one of them could be done by changing Python's set
> type in useful and backward compatible way.
>
> Apologies if this is discussed in the archives. I didn't see it.
>
> My original motivation is a need to work efficiently with very large sets,
> let's say 10M elements. In some cases, you can operate on a set without
> needing to explicitly represent all its elements. For example, if my set is
> the Universal (U) set, then X in U, should return True. If my set is U and
> I union it with any other set, it is unchanged, etc. By also maintaining a
> finite set of things that are _not_ in my large set, I can do other
> operations efficiently. E.g., calling U.remove(X) would _add_ X to the set
> of elements that were known not to be in the big set that you were trying
> to represent efficiently. In most cases, such a set class would simply do
> the opposite/inverse operation on its finite exclusion set.
>
> While it's convenient to warm up by talk about the infinite Universal set,
> we could just as easily be talking about arbitrarily large finite sets.
> There are some examples below.
>
> Another way to discuss the same thing is to talk about set complements. It
> would be nice to be able to complement (or invert) a set. Once you did so,
> you might have an infinite set on your hands, but as the last paragraph
> argues, you can still operate on an infinite set efficiently. Naturally,
> you can't fully enumerate it, or take its size.
>
> I think that giving Python sets the ability to handle complementarity would
> have some nice uses - and that these go well beyond the particular thing
> that I need right now.
>
> For example, suppose you want to represent the set of all floating point
> numbers. You just create an empty set and complement it. Then you can test
> it as usual, and begin to exclude things from it. Or if you have a system
> with 10M objects and you need to support operations on these objects via a
> query language that lets the user type "not X" where X is a set of objects,
> there is a natural and efficient way (constant time) to execute the query -
> you just mark the set as being inverted (complemented).
>
> If I were going to implement the above by changing Python's built in set
> type, here's what I think I'd do:
>
> Add four optional keyword arguments to the set constructor:
>
> - invert=False
> - universalSet=None
> - universalSetSize=None
> - universeExclusionFunc=None
>
> invert is just a flag, indicating the sense of the set.
>
> If inverse is False:
>
> the set behaves exactly as a normal Python set and the other three
> new arguments are ignored.
>
> If inverse is True:
>
> universalSet represents the universal set. If it is None, then the
> universal set is infinite. Otherwise, universalSet is an iterable. The
> implementation should not call this iterable unless it's unavoidable, on
> the presumption that if the programmer wanted to operate directly on the
> contents of this iterable, they'd be doing it in a conventional fashion
> (e.g., with a normal set). The universalSetSize, used for len()
> calculations, is the number of elements in universalSet, if known &
> if finite.
>
> The universeExclusionFunc can be called with a single element argument to
> see if the element should be considered excluded from the universalSet.
> This may seem like a weird idea, but it's the key to flexibility and
> efficiency. In an invert=True set, the code would use the normal set
> content as a mutable set of objects that are not in the universe, as
> described above, but would, in addition, use the universeExclusionFunc
> (if defined) to identify elements not in the set (because they are not in
> the universe), and thus avoid the use of the expensive (or infinite)
> universalSet.
>
> Note that an inverted (or normal) set can be inverted by simply setting
> invert=False, so this requires a new invert() method (which could be
> triggered by the use of something like 'not' or '!' or '~'). In this case,
> an inverted set becomes a normal Python set. The elements represented by
> universeExclusionFunc remain invalid - they are simply not in the universe
> (and, if deemed sensible, this could be sanity checked in add()).
>
> If it's not clear, when a set is inverted, a
Re: [Python-Dev] Efficient set complement and operation on large/infinite sets.
A quick followup to my own posting: I meant to say something about
implementing __rand__() and pop().

I'd either add another optional function argument to the constructor - it
would return a random element from the universe, and for __rand__() and
pop() you'd call it until it (hopefully!) returned something not excluded -
or do something non-random, like return a random (non-excluded) integer, or
just raise an exception.

I think I prefer the extra argument approach, where the docs state clearly
that you can expect to wait longer and longer for random elements as you
empty a finite inverted set. I prefer this approach because getting a random
element from a set is something you really should be able to do, though just
raising an exception is the cleanest and clearest choice. One thing I
certainly would not consider is trying to mess around with the excluded set
(which may in any case be empty) to figure out a suitable return type.

And yes, I agree in advance, adding 5 new optional arguments to the set()
constructor isn't pretty. Is the added functionality worth it?

Terry
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
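A sketch of the rejection-sampling idea described above; the function and argument names are made up for illustration, and the bounded retry count is an added safety valve rather than part of the proposal:

import random

def random_member(universe_random, excluded, max_tries=100000):
    """Draw candidates from the universe until one is not excluded."""
    for _ in xrange(max_tries):
        candidate = universe_random()
        if candidate not in excluded:
            return candidate
    raise LookupError("no non-excluded element found after %d tries" % max_tries)

# Example: a set representing "all integers in [0, 10**7) except 3 and 5".
print random_member(lambda: random.randrange(10**7), excluded=set([3, 5]))

As the message says, the more elements you exclude from a finite universe, the longer each call can be expected to take.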
Re: [Python-Dev] Efficient set complement and operation on large/infinite sets.
> "Guido" == Guido van Rossum <[EMAIL PROTECTED]> writes: Guido> Hm... Without reading though all this, I expect that you'd be Guido> better off implementing this for yourself without attempting to pull Guido> the standard library sets into the picture (especially since sets.py Guido> is obsolete as of 2.4; set and frozenset are now built-in types). Guido> You're really after rather specialized set representations. I've Guido> done this myself and as long as you stick to solving *just* the Guido> problem at hand, it's easy. If you want a general solution, it's Guido> hard... I certainly would be better off in the short term, and probably the long term too. It's likely what I'll do in any case as it's much, much quicker, I only need a handful of the set operations, and I don't need to talk to anyone :-) I don't think I'm proposing something specialized. Set complement is something one learns in primary school. It's just difficult to provide in general, as you say. Aside from my own itch, which I know how to scratch, my question is whether it's worth trying to work this into Python itself. Sounds like not. Terry ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Efficient set complement and operation on large/infinite sets.
>Guido> Hm... Without reading though all this, I expect that you'd be
>Guido> better off implementing this for yourself without attempting to pull
>Guido> the standard library sets into the picture (especially since sets.py
>Guido> is obsolete as of 2.4; set and frozenset are now built-in types).
>Guido> You're really after rather specialized set representations. I've
>Guido> done this myself and as long as you stick to solving *just* the
>Guido> problem at hand, it's easy. If you want a general solution, it's
>Guido> hard...
>
>I certainly would be better off in the short term, and probably the long
>term too. It's likely what I'll do in any case as it's much, much quicker,
>I only need a handful of the set operations, and I don't need to talk to
>anyone :-)
>
>I don't think I'm proposing something specialized. Set complement is
>something one learns in primary school. It's just difficult to provide in
>general, as you say.
>
>Aside from my own itch, which I know how to scratch, my question is whether
>it's worth trying to work this into Python itself. Sounds like not.

There's room in the world for alternate implementations of sets, each with
its own strengths and weaknesses. For example, bitsets potentially offer
some speed and space economies for certain apps. Alternative implementations
will most likely start off as third-party extension modules, and if they
prove essential, they may make it to the collections module.

As for the built-in set types, I recommend leaving those alone and keeping
the API as simple as possible. The __rand__ idea is interesting, but I don't
see how to implement an equiprobable hash table selection in O(1) time --
keep in mind that a set may be very sparse at the time of the selection.

Raymond
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] python 2.4 and universal binaries
This is fine with me. Note that 2.4.4 won't be out until after 2.5.0, so it's a couple of months off yet. ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] pthreads, fork, import, and execvp
Hello everyone!

We have been encountering several deadlocks in a threaded Python application
which calls subprocess.Popen (i.e. fork()) in some of its threads. This has
occurred on Python 2.4.1 on a 2.4.27 Linux kernel.

Preliminary analysis of the hang shows that the child process blocks upon
entering the execvp function, in which the import_lock is acquired due to
the following line:

def _execvpe(file, args, env=None):
    from errno import ENOENT, ENOTDIR
    ...

It is known that when forking from a pthreaded application, acquisition
attempts on locks which were already locked by other threads while fork()
was called will deadlock.

Because of this, we were wondering if it would be better to move the above
import line out of the _execvpe call, to prevent lock acquisition attempts
in such cases.

Another workaround could be re-assigning a new lock to import_lock (such a
thing is done with the global interpreter lock) at PyOS_AfterFork or
pthread_atfork.

We'd appreciate any opinions you might have on the subject.

Thanks in advance,

Yair and Rotem
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
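For illustration, the first workaround suggested above amounts to hoisting the import to module level so a freshly fork()ed child never needs the import lock. This is only a sketch of the idea, not the actual os.py patch, and the function body is elided:

# Hoisted to module level: resolved once at import time, so _execvpe no
# longer touches the import lock when it runs in a child created by fork().
from errno import ENOENT, ENOTDIR

def _execvpe(file, args, env=None):
    # ... body unchanged; it can simply refer to ENOENT and ENOTDIR
    # as module-level names now ...
    pass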
[Python-Dev] PyThreadState_SetAsyncExc, PyErr_Clear and native extensions
I use PyThreadState_SetAsyncExc to stop a Python thread, but there are
situations where the thread doesn't stop and continues executing normally.
After some debugging, I realized that the problem is that
PyThreadState_SetAsyncExc was called while the thread was inside a native
extension that, for some reason, calls PyErr_Clear. That code happens to be
inside boost::python.

I do need to stop the thread from executing Python code as soon as possible
(as soon as it returns from a native function is also acceptable). Because
we have embedded Python's VM in our product, I'm thinking of modifying
PyErr_Clear() to return immediately if the thread was stopped (we determine
whether the thread should stop using our own functions). Example:

void PyErr_Clear(void)
{
    if (!stop_executing_this_thread())
        PyErr_Restore(NULL, NULL, NULL);
}

Does anybody see any problem with this approach? Does anybody have a
cleaner/better solution?

I've already posted this in comp.lang.python a couple of days ago but got no
answers, which is why I'm posting here: I think the only ones who could give
me a solution/workaround are the Python core developers.

Thanks a lot.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
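For context, a pure-Python illustration of how such an async exception is typically injected is below (the original poster is calling the C API directly; using ctypes here, the wrapper name, and the choice of SystemExit are assumptions for illustration only). The failure described above occurs when the target thread's extension code later calls PyErr_Clear and silently discards the pending exception:

import ctypes

def stop_python_thread(thread_id):
    """Ask the interpreter to raise SystemExit in the thread with this id."""
    res = ctypes.pythonapi.PyThreadState_SetAsyncExc(
        ctypes.c_long(thread_id), ctypes.py_object(SystemExit))
    if res > 1:
        # More than one thread state matched: revoke the request.
        ctypes.pythonapi.PyThreadState_SetAsyncExc(ctypes.c_long(thread_id), None)
        return False
    return res == 1   # 1 on success, 0 if no matching thread state was found

The thread id can be obtained in the target thread with thread.get_ident(), and the GIL must be held when calling the C function directly (ctypes.pythonapi takes care of that from Python code).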
Re: [Python-Dev] PyThreadState_SetAsyncExc, PyErr_Clear and native extensions
Gabriel Becedillas wrote: > PyThreadState_SetAsyncExc was called when the thread was inside a > native extension, that for some reason calls PyErr_Clear. Maybe PyThreadState_SetAsyncExc should set a flag that says "this is an async exception, don't clear it", and have PyErr_Clear take notice of that flag. -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | Carpe post meridiem! | Christchurch, New Zealand | (I'm not a morning person.) | [EMAIL PROTECTED] +--+ ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] pthreads, fork, import, and execvp
Yeah, I think imports inside functions are overused. On 5/9/06, Rotem Yaari <[EMAIL PROTECTED]> wrote: > Hello everyone! > > We have been encountering several deadlocks in a threaded Python > application which calls subprocess.Popen (i.e. fork()) in some of its > threads. > > This has occurred on Python 2.4.1 on a 2.4.27 Linux kernel. > > Preliminary analysis of the hang shows that the child process blocks > upon entering the execvp function, in which the import_lock is acquired > due to the following line: > > def _ execvpe(file, args, env=None): > from errno import ENOENT, ENOTDIR > ... > > It is known that when forking from a pthreaded application, acquisition > attempts on locks which were already locked by other threads while > fork() was called will deadlock. > > Due to these oddities we were wondering if it would be better to extract > the above import line from the execvpe call, to prevent lock > acquisition attempts in such cases. > > Another workaround could be re-assigning a new lock to import_lock > (such a thing is done with the global interpreter lock) at PyOS_AfterFork or > pthread_atfork. > > We'd appreciate any opinions you might have on the subject. > > Thanks in advance, > > Yair and Rotem > ___ > Python-Dev mailing list > Python-Dev@python.org > http://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > http://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PyThreadState_SetAsyncExc, PyErr_Clear and native extensions
Gabriel Becedillas wrote: > Does anybody see any problem with this approach ?, Does anybody have a > cleaner/better solution ? I don't think there *is* a solution: asynchronous exceptions and thread cancellation just cannot work. In the specific case, the caller of PyErr_Clear will continue its computation, instead of immediately leaving the function. To solve this specific problem, you would also have to give PyErr_Clear an int return code (whether or not the exception was cleared), and then you need to change all uses of PyErr_Clear to check for failure, and return immediately (after performing local cleanup, of course). You then need to come up with a protocol to determine whether an exception is "clearable"; the new exception hierarchy suggests that one should "normally" only catch Exception, and let any other BaseException through. So PyErr_Clear should grow a flag indicating whether you want to clear just all Exceptions, or indeed all BaseExceptions. Regards, Martin ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
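A Python-level analogue of the distinction Martin draws may help: swallow only "ordinary" errors, and let BaseExceptions such as KeyboardInterrupt or SystemExit (the kind an async "stop this thread" request would typically use) keep propagating instead of being cleared. The do_something function below is invented purely for the example:

def do_something():
    raise ValueError("ordinary failure")

try:
    do_something()
except Exception:
    pass  # swallowed, as PyErr_Clear would do for a plain Exception

# A KeyboardInterrupt or SystemExit derives only from BaseException (as of
# the new exception hierarchy), so it would not be caught by the handler
# above and would keep unwinding the stack -- the behaviour Martin suggests
# a flag-bearing PyErr_Clear could optionally preserve.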