[Python-Dev] PEP: 576 Title: Rationalize Built-in function classes
Hi,

At the language summit this year, there was some discussion of PEP 575. I wanted to simplify the PEP, but rather than modify that PEP, Nick Coghlan encouraged me to write an alternative PEP instead. PEP 576 aims to fulfill the same goals as PEP 575, but with fewer changes, and to be fully backwards compatible. The PEP can be viewed here: https://github.com/python/peps/blob/master/pep-0576.rst

Cheers, Mark.

P.S. I'm happy to have discussion of this PEP take place via GitHub, rather than the mailing list, but I thought I would follow the conventional route for now.

___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] subprocess shell=True on Windows doesn't escape ^ character
On 11/06/2014 21:26, anatoly techtonik wrote: I am banned from tracker, so I post the bug here: The OP's approach to the Python community is beautifully summarised here http://bugs.python.org/issue8940 -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence --- This email is free from viruses and malware because avast! Antivirus protection is active. http://www.avast.com
[Python-Dev] Pending issues
The following is a list of the 18 pending issues on the bug tracker. All have been in this state for at least one month, so I'm assuming that they can be closed, or they wouldn't have been set to pending in the first place. Can somebody take a look at them with a view to closing them, or setting them back to open if needed?

16221 tokenize.untokenize() "compat" mode misses the encoding when using an iterator
15600 expose the finder details used by the FileFinder path hook
12588 test_capi.test_subinterps() failed on OpenBSD (powerpc)
7979 connect_ex returns 103 often
17668 re.split loses characters matching ungrouped parts of a pattern
11204 re module: strange behaviour of space inside {m, n}
14518 Add bcrypt $2a$ to crypt.py
15883 Add Py_errno to work around multiple CRT issue
19919 SSL: test_connect_ex_error fails with EWOULDBLOCK
20026 sqlite: handle correctly invalid isolation_level
18228 AIX locale parsing failure
1602742 itemconfigure returns incorrect text property of text items
19954 test_tk floating point exception on my gentoo box with tk 8.6.1
21084 IDLE can't deal with characters above the range (U+-U+)
20997 Wrong URL fragment identifier in search result
6895 locale._parse_localename fails when localename does not contain encoding information
1669539 Improve Windows os.path.join (ntpath.join) "smart" joining
21231 Issue a python 3 warning when old style classes are defined.

-- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
[Python-Dev] PEP 3121, 384 Refactoring Issues
I'm just curious as to why there are 54 open issues after both of these PEPs have been accepted and 384 is listed as finished. Did we hit some unforeseen technical problem which stalled development? For these, and any other open issues, if you need some Windows testing done, please feel free to put me on the nosy list and ask for a test run. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Another case for frozendict
I find it handy to use a named tuple as my database mapping type. It allows you to perform this behavior seamlessly. -Mark

> On Jul 13, 2014, at 7:04, "Jason R. Coombs" wrote:
>
> I repeatedly run into situations where a frozendict would be useful, and every time I do, I go searching and find the (unfortunately rejected) PEP-416. I’d just like to share another case where having a frozendict in the stdlib would be useful to me.
>
> I was interacting with a database and had a list of results from 206 queries:
>
> >>> res = [db.cases.remove({'_id': doc['_id']}) for doc in fives]
> >>> len(res)
> 206
>
> I can see that the results are the same for the first two queries.
>
> >>> res[0]
> {'n': 1, 'err': None, 'ok': 1.0}
> >>> res[1]
> {'n': 1, 'err': None, 'ok': 1.0}
>
> So I’d like to test to see if that’s the case, so I try to construct a ‘set’ on the results, which in theory would give me a list of unique results:
>
> >>> set(res)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: unhashable type: 'dict'
>
> I can’t do that because dict is unhashable. That’s reasonable, and if I had a frozen dict, I could easily work around this limitation and accomplish what I need.
>
> >>> set(map(frozendict, res))
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> NameError: name 'frozendict' is not defined
>
> PEP-416 mentions a MappingProxyType, but that’s no help.
>
> >>> res_ex = list(map(types.MappingProxyType, res))
> >>> set(res_ex)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: unhashable type: 'mappingproxy'
>
> I can achieve what I need by constructing a set on the ‘items’ of the dict.
>
> >>> set(tuple(doc.items()) for doc in res)
> {(('n', 1), ('err', None), ('ok', 1.0))}
>
> But that syntax would be nicer if the result had the same representation as the input (mapping instead of tuple of pairs). A frozendict would have readily enabled the desirable behavior.
>
> Although hashability is mentioned in the PEP under constraints, there are many use-cases that fall out of the ability to hash a dict, such as the one described above, which are not mentioned at all in use-cases for the PEP.
>
> If there’s ever any interest in reviving that PEP, I’m in favor of its implementation.
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/wizzat%40gmail.com
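For what it's worth, the workaround Jason describes (hashing the items) can be wrapped up in a few lines. This is only an illustrative sketch of a minimal hashable mapping, not the rejected PEP 416 design; the name FrozenDict is made up here:

```python
from collections.abc import Mapping

class FrozenDict(Mapping):
    """A minimal immutable, hashable mapping (illustrative sketch only,
    not the PEP 416 implementation)."""

    def __init__(self, *args, **kwargs):
        self._d = dict(*args, **kwargs)
        self._hash = None

    def __getitem__(self, key):
        return self._d[key]

    def __iter__(self):
        return iter(self._d)

    def __len__(self):
        return len(self._d)

    def __hash__(self):
        # Cache the hash; frozenset of items makes it order-independent.
        if self._hash is None:
            self._hash = hash(frozenset(self._d.items()))
        return self._hash

# De-duplicating a list of dict results, as in the example above:
res = [{'n': 1, 'err': None, 'ok': 1.0}, {'n': 1, 'err': None, 'ok': 1.0}]
unique = set(map(FrozenDict, res))
print(len(unique))  # 1
```

`collections.abc.Mapping` supplies `__eq__` (and `get`, `items`, etc.) for free, so set membership works as expected.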
Re: [Python-Dev] Reviving restricted mode?
On 11/08/2014 18:42, matsjoyce wrote: Yup, I read that post. However, those specific issues do not exist in my module, as there is a module whitelist, and a method whitelist. Builtins are now proxied, and all types going in to functions are checked for modification. There may be some holes in my approach, but I can't find them. Any chance of giving us some context, or do I have to retrieve my crystal ball from the menders? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Documenting enum types
On 14/08/2014 17:14, Ethan Furman wrote: On 08/14/2014 08:51 AM, Ben Hoyt wrote: The BDFL actually wrote:- The enemy must be documented and exported, since users will encounter them. QOTW. enum == enemy? Is that you, Raymond? ;-) ROFL! Thanks, I needed that! :D -- ~Ethan~ I'll be seeing the PSF in court, on the grounds that I've just bust a gut laughing :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] PEP 4000 to explicitly declare we won't be doing a Py3k style compatibility break again?
[Moderately off-topic] On Sun, Aug 17, 2014 at 3:39 AM, Steven D'Aprano wrote: > I used to refer to Python 4000 as the hypothetical compatibility break > version. Now I refer to Python 5000. > I personally think it should be Python 500, or Py5M. When we come to create the mercurial branch, that should of course, following tradition, be called p5ym. -- Mark
Re: [Python-Dev] List insert at index that is well out of range - behaves like append
On 15/09/14 12:31, Tal Einat wrote: On Mon, Sep 15, 2014 at 6:18 AM, Harish Tech wrote: I had a list a = [1, 2, 3] when I did a.insert(100, 100) [1, 2, 3, 100] as list was originally of size 4 and I was trying to insert value at index 100 , it behaved like append instead of throwing any errors as I was trying to insert in an index that did not even existed . Should it not throw IndexError: list assignment index out of range exception as it throws when I attempt doing a[100] = 100 Question : 1. Any idea Why has it been designed to silently handle this instead of informing the user with an exception ? Personal Opinion : Lets see how other dynamic languages behave in such a situation : Ruby : > a = [1, 2] > a[100] = 100 > a => [1, 2, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, 100] The way ruby handles this is pretty clear and sounds meaningful (and this is how I expected to behave and it behaved as per my expectation) at least to me . So what I felt was either it should throw exception or do the way ruby handles it . Is ruby way of handling not the obvious way ? I even raised it in stackoverflow http://stackoverflow.com/questions/25840177/list-insert-at-index-that-is-well-out-of-range-behaves-like-append and got some responses . Hello Harish, The appropriate place to ask questions like this is python-list [1], or perhaps Stack Overflow. I think this is an OK forum for this question. If someone isn't sure if something is a bug or not, then why not ask here before reporting it on the bug tracker? 
This does seem strange behaviour, and the documentation for list.insert gives no clue as to why this behaviour was chosen. Cheers, Mark.
Re: [Python-Dev] List insert at index that is well out of range - behaves like append
On 15/09/2014 23:29, Mark Shannon wrote: On 15/09/14 12:31, Tal Einat wrote: On Mon, Sep 15, 2014 at 6:18 AM, Harish Tech wrote: I had a list a = [1, 2, 3] when I did a.insert(100, 100) [1, 2, 3, 100] as list was originally of size 4 and I was trying to insert value at index 100 , it behaved like append instead of throwing any errors as I was trying to insert in an index that did not even existed . Should it not throw IndexError: list assignment index out of range exception as it throws when I attempt doing a[100] = 100 Question : 1. Any idea Why has it been designed to silently handle this instead of informing the user with an exception ? Personal Opinion : Lets see how other dynamic languages behave in such a situation : Ruby : > a = [1, 2] > a[100] = 100 > a => [1, 2, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, nil, 100] The way ruby handles this is pretty clear and sounds meaningful (and this is how I expected to behave and it behaved as per my expectation) at least to me . So what I felt was either it should throw exception or do the way ruby handles it . Is ruby way of handling not the obvious way ? I even raised it in stackoverflow http://stackoverflow.com/questions/25840177/list-insert-at-index-that-is-well-out-of-range-behaves-like-append and got some responses . Hello Harish, The appropriate place to ask questions like this is python-list [1], or perhaps Stack Overflow. I think this is an OK forum for this question. 
If someone isn't sure if something is a bug or not, then why not ask here before reporting it on the bug tracker? This does seem strange behaviour, and the documentation for list.insert gives no clue as to why this behaviour was chosen. Cheers, Mark. I assume it's based on the concepts of slicing. From the docs "s.insert(i, x) - inserts x into s at the index given by i (same as s[i:i] = [x])". Although shouldn't that read s[i:i+1] = [x] ? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
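For the record, the docs' `s[i:i] = [x]` is the correct equivalence: `s[i:i]` is an empty slice, so assigning to it inserts without replacing anything, whereas `s[i:i+1] = [x]` would overwrite the element at index i. And because slice indices are clamped to the list's bounds, the out-of-range insert behaves like append. A quick illustration:

```python
a = [1, 2, 3]
a[1:1] = [99]        # empty slice: inserts, same as a.insert(1, 99)
print(a)             # [1, 99, 2, 3]

b = [1, 2, 3]
b[1:2] = [99]        # one-element slice: replaces b[1]
print(b)             # [1, 99, 3]

c = [1, 2, 3]
c.insert(100, 100)   # same as c[100:100] = [100]; the index is
print(c)             # clamped to len(c), so this appends: [1, 2, 3, 100]
```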
Re: [Python-Dev] Sysadmin tasks
Hi, http://speed.python.org/ could do with some love. Cheers, Mark. On 01/10/14 08:35, Shorya Raj wrote: Hello Just curious, is there any sort of tasklist for any sort of sysadmin sort of work surrounding CPython development? There seem to be plenty of tasks for the actual coding part, but it would be good to get something up for the more systems admin side of things. If there is no one managing that side yet, I would be more than happy to start to do so. Thanks SbSpider ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/mark%40hotpy.org
Re: [Python-Dev] [python-committers] [RELEASE] Python 3.4.2 is now available
On 08/10/2014 11:21, Victor Stinner wrote: 2014-10-08 10:57 GMT+02:00 Larry Hastings : You can download it here: https://www.python.org/download/releases/3.4.2 This page redirect me to https://www.python.org/download/releases/3.4.1 Maybe some web servers of the CDN don't contain the latest version. I guess that the issue will quickly disappears. Victor Further if you navigate from 3.4.1 to 3.4.2 it says "Python 3.4.2rc1 was released on October 8th, 2014.". The download itself is correct. Thanks as always to everybody who has contributed, another great piece of work. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Status of C compilers for Python on Windows
On 10/10/2014 01:29, Victor Stinner wrote:

=== MinGW

Some people tried to compile Python. See for example: https://bitbucket.org/puqing/python-mingw We even got some patches: http://bugs.python.org/issue3871 (rejected) There are 55 open issues on the bug tracker with mingw in the title. See also: https://stackoverflow.com/questions/15365249/build-python-with-mingw-and-gcc

MinGW reuses the Microsoft C library and it is based on GCC, which is very stable, actively developed, supports a lot of architectures, etc. I guess that it should be possible to reuse third party GCC tools like the famous GDB debugger? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] isinstance() on old-style classes in Py 2.7
Hi,

The problem is a side effect of the fact that old-style classes are implemented on top of new-style meta-classes. Consequently, although C is the "class" of C(), it is not its "type".

>>> type(C())
<type 'instance'>
>>> type(C()).__mro__
(<type 'instance'>, <type 'object'>)

therefore

>>> issubclass(type(C()), object)
True

which implies

>>> isinstance(C(), object)
True

Cheers, Mark.

On 21/10/14 17:43, Andreas Maier wrote: Hi. Today, I ran across this, in Python 2.7.6:

>>> class C:
...     pass
...
>>> issubclass(C, object)
False
>>> isinstance(C(), object)
True    <-- ???

The description of isinstance() in Python 2.7 does not reveal this result (to my reading). From a duck-typing perspective, one would also not guess that an instance of C would be considered an instance of object:

>>> dir(C())
['__doc__', '__module__']
>>> dir(object())
['__class__', '__delattr__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__']

-> What is the motivation for isinstance(C,object) to return True in Python 2.7?

Andy

Andreas Maier IBM Senior Technical Staff Member, Systems Management Architecture & Design IBM Research & Development Laboratory Boeblingen, Germany mai...@de.ibm.com, +49-7031-16-3654 IBM Deutschland Research & Development GmbH Vorsitzende des Aufsichtsrats: Martina Koederitz Geschaeftsfuehrung: Dirk Wittkopp Sitz der Gesellschaft: Boeblingen Registergericht: Amtsgericht Stuttgart, HRB 243294 ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/mark%40hotpy.org
Re: [Python-Dev] Status of C compilers for Python on Windows
On 26/10/2014 00:24, R. David Murray wrote: On Sun, 26 Oct 2014 00:19:44 +0200, Antoine Pitrou wrote: On Sun, 26 Oct 2014 09:06:36 +1100 Chris Angelico wrote: On Sun, Oct 26, 2014 at 8:59 AM, Antoine Pitrou wrote: How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime? This discussion revolved around supporting MinGW in addition to MSVC. If it had been supported when I was doing that, I could have spun myself up a Windows build and tested it. My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting. While true, I don't think that matters for Chris' point. Given only the ability to build with the MSVC toolchain, his code (which might even be pure python for the purposes of this discussion) would not get tested on Windows until committed and run by the buildbots. If he could build CPython using MinGW, he would, and would test his code on Windows. Even if there are C components and MSVC/MinGW compatibility issues are revealed when the buildbots eventually run the code, still the number of bugs present would probably be lower if he had tested it on Windows than if he hadn't. I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools. 
If I could use a more linux-like toolchain to build CPython on windows, I would doubtless do much more testing on windows for stuff where I think windows might behave differently (and I might look at more Windows bugs...though frankly there are plenty of bugs for me to look at without looking at Windows bugs). This is not necessarily a compelling argument for MinGW support. However, it *is* a valid argument, IMO. Note: it can be made even less compelling by making it a lot easier to build CPython on Windows without having an MSVC license (which I think means not using the GUI, for which I say *yay* :). I think Zach Ware has been working on improving the Windows build process, and I keep meaning to give it a try... --David MSVC Express Edition 2010 works perfectly for building 3.5 so no license needed. Links to older versions have been pointed out on other threads, either here or python-ideas, maybe both? Or use the command line as Antoine pointed out elsewhere. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Static checker for common Python programming errors
Hi,

I think this might be a bit off-topic for this mailing list; code-qual...@python.org is the place for discussing static analysis tools. Although if anyone does have any comments on any particular checks they would like, I would be interested as well.

Cheers, Mark.

On 17/11/14 14:49, Stefan Bucur wrote: I'm developing a Python static analysis tool that flags common programming errors in Python programs. The tool is meant to complement other tools like Pylint (which perform checks at lexical and syntactic level) by going deeper with the code analysis and keeping track of the possible control flow paths in the program (path-sensitive analysis). For instance, a path-sensitive analysis detects that the following snippet of code would raise an AttributeError exception:

if object is None:  # If the True branch is taken, we know the object is None
    object.doSomething()  # ... so this statement would always fail

I'm writing first to the Python developers themselves to ask, in their experience, what common pitfalls in the language & its standard library such a static checker should look for. For instance, here [1] is a list of static checks for the C++ language, as part of the Clang static analyzer project. My preliminary list of Python checks is quite rudimentary, but maybe could serve as a discussion starter:

* Proper Unicode handling (for 2.x) - encode() is not called on str object - decode() is not called on unicode object
* Check for integer division by zero
* Check for None object dereferences

Thanks a lot, Stefan Bucur

[1] http://clang-analyzer.llvm.org/available_checks.html

___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/mark%40hotpy.org
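To make the discussion concrete, here is a toy, purely syntax-level sketch of one of the checks Stefan lists (division by a literal zero), using the stdlib `ast` module. The function name is made up, and a real path-sensitive analyser would track values along control-flow paths rather than just matching literals:

```python
import ast

def find_division_by_zero(source):
    """Return the line numbers of divisions whose right operand is the
    literal 0.  A toy, flow-insensitive sketch of the check; it will not
    catch a variable that merely *might* be zero."""
    problems = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.BinOp)
                and isinstance(node.op, (ast.Div, ast.FloorDiv, ast.Mod))
                and isinstance(node.right, ast.Constant)
                and node.right.value == 0):
            problems.append(node.lineno)
    return problems

code = """
x = 10 / 0
y = 4 % 2
z = 7 // 0
"""
print(find_division_by_zero(code))  # [2, 4]
```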
[Python-Dev] Please reconsider PEP 479.
Hi,

I have serious concerns about this PEP, and would ask you to reconsider it.

[Very short summary: Generators are not the problem. It is the naive use of next() in an iterator that is the problem. (Note that all the examples involve calls to next()). Change next() rather than fiddling with generators.]

I have five main concerns with PEP 479.
1. Is the problem, as stated by the PEP, really the problem at all?
2. The proposed solution does not address the underlying problem.
3. It breaks a fundamental aspect of generators, that they are iterators.
4. This will be a hindrance to porting code from Python 2 to Python 3.
5. The behaviour of next() is not considered, even though it is the real cause of the problem (if there is a problem).

1. The PEP states that "The interaction of generators and StopIteration is currently somewhat surprising, and can conceal obscure bugs." I don't believe that to be the case; if someone knows what StopIteration is and how it is used, then the interaction is entirely as expected. I believe the naive use of next() in an iterator to be the underlying problem. The interaction of generators and next() is just a special case of this. StopIteration is not a normal exception, indicating a problem; rather it exists to signal exhaustion of an iterator. However, next() raises StopIteration for an exhausted iterator, which really is an error. Any iterator code (generator or __next__ method) that calls next() treats the StopIteration as a normal exception and propagates it. The controlling loop then interprets StopIteration as a signal to stop, and thus stops. *The problem is the implicit shift from signal to error and back to signal.*

2. The proposed solution does not address this issue at all, but rather legislates against generators raising StopIteration.

3. Generators and the iterator protocol were introduced in Python 2.2, 13 years ago.
For all of that time the iterator protocol has been defined by the __iter__(), next()/__next__() methods and the use of StopIteration to terminate iteration. Generators are a way to write iterators without the clunkiness of explicit __iter__() and next()/__next__() methods, but have always obeyed the same protocol as all other iterators. This has allowed code to be rewritten from one form to the other whenever desired. Do not forget that despite the addition of the send() and throw() methods and their secondary role as coroutines, generators have primarily always been a clean and elegant way of writing iterators.

4. Porting from Python 2 to Python 3 seems to be hard enough already.

5. I think I've already covered this in the other points, but to reiterate (excuse the pun): Calling next() on an exhausted iterator is, I would suggest, a logical error. However, next() raises StopIteration, which is really a signal to the controlling loop. The fault is with next() raising StopIteration. Generators raising StopIteration is not the problem. It is also worth noting that calling next() is the only place a StopIteration exception is likely to occur outside of the iterator protocol.

An example
----------
Consider a function to return the value from a set with a single member.

def value_from_singleton(s):
    if len(s) < 2:  # Intentional error here (should be len(s) == 1)
        return next(iter(s))
    raise ValueError("Not a singleton")

Now suppose we pass an empty set to value_from_singleton(s); then we get a StopIteration exception, which is a bit weird, but not too bad. However, it is when we use it in a generator (or in the __next__ method of an iterator) that we get a serious problem. Currently the iterator appears to be exhausted early, which is wrong. However, with the proposed change we get RuntimeError("generator raised StopIteration") raised, which is also wrong, just in a different way.

Solutions
---------
My preferred "solution" is to do nothing except improving the documentation of next().
Explain that it can raise StopIteration which, if allowed to propagate, can cause premature exhaustion of an iterator. If something must be done, then I would suggest changing the behaviour of next() for an exhausted iterator. Rather than raise StopIteration, it should raise ValueError (or IndexError?). Also, it might be worth considering making StopIteration inherit from BaseException, rather than Exception.

Cheers, Mark.

P.S. 5 days seems a rather short time to respond to a PEP. Could we make it at least a couple of weeks in the future, or better still specify a closing date for comments.
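For readers following along, the failure mode Mark's value_from_singleton() example describes can be reproduced with a short sketch (the wrapper generator `values` is a made-up name for illustration). Before PEP 479, the StopIteration escaping from next() silently truncated the generator; with PEP 479 semantics (the default since Python 3.7) it is converted into a RuntimeError instead:

```python
def value_from_singleton(s):
    if len(s) < 2:          # intentional bug: should be len(s) == 1
        return next(iter(s))
    raise ValueError("Not a singleton")

def values(sets):
    for s in sets:
        yield value_from_singleton(s)

# The empty set makes next() raise StopIteration inside the generator.
# Pre-PEP-479, list() below would silently return [1]; under PEP 479
# the leaked StopIteration becomes a RuntimeError.
try:
    print(list(values([{1}, set(), {2}])))
except RuntimeError as e:
    print("RuntimeError:", e)
```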
Re: [Python-Dev] Please reconsider PEP 479.
On 23/11/14 22:54, Chris Angelico wrote: On Mon, Nov 24, 2014 at 7:18 AM, Mark Shannon wrote: Hi, I have serious concerns about this PEP, and would ask you to reconsider it. Hoping I'm not out of line in responding here, as PEP author. Some of your concerns (eg "5 days is too short") are clearly for Guido, not me, but perhaps I can respond to the rest of it. [Very short summary: Generators are not the problem. It is the naive use of next() in an iterator that is the problem. (Note that all the examples involve calls to next()). Change next() rather than fiddling with generators.] StopIteration is not a normal exception, indicating a problem; rather it exists to signal exhaustion of an iterator. However, next() raises StopIteration for an exhausted iterator, which really is an error. Any iterator code (generator or __next__ method) that calls next() treats the StopIteration as a normal exception and propagates it. The controlling loop then interprets StopIteration as a signal to stop and thus stops. *The problem is the implicit shift from signal to error and back to signal.* The situation is this: Both __next__ and next() need the capability to return literally any object at all. (I raised a hypothetical possibility of some sort of sentinel object, but for such a sentinel to be useful, it will need to have a name, which means that *by definition* that object would have to come up when iterating over the .values() of some namespace.) They both also need to be able to indicate a lack of return value. This means that either they return a (success, value) tuple, or they have some other means of signalling exhaustion. You are grouping next() and it.__next__() together, but they are different. I think we agree that the __next__() method is part of the iterator protocol and should raise StopIteration. There is no fundamental reason why next(), the builtin function, should raise StopIteration, just because __next__(), the method, does.
Many xxx() functions that wrap __xxx__() methods add additional functionality. Consider max() or min(). Both of these functions take an iterable and, if that iterable is empty, they raise a ValueError. If next() did likewise then the original example that motivates this PEP would not be a problem. I'm not sure what you mean by your "However" above. In both __next__ and next(), this is a signal; it becomes an error as soon as you call next() and don't cope adequately with the signal, just as KeyError is an error. 2. The proposed solution does not address this issue at all, but rather legislates against generators raising StopIteration. Because that's the place where a StopIteration will cause a silent behavioral change, instead of cheerily bubbling up to top-level and printing a traceback. I must disagree. It is the FOR_ITER bytecode (implementing a loop or comprehension) that "silently" converts a StopIteration exception into a branch. I think the generator's __next__() method handling of exceptions is correct; it propagates them, like most other code. 3. Generators and the iterator protocol were introduced in Python 2.2, 13 years ago. For all of that time the iterator protocol has been defined by the __iter__(), next()/__next__() methods and the use of StopIteration to terminate iteration. Generators are a way to write iterators without the clunkiness of explicit __iter__() and next()/__next__() methods, but have always obeyed the same protocol as all other iterators. This has allowed code to be rewritten from one form to the other whenever desired. Do not forget that despite the addition of the send() and throw() methods and their secondary role as coroutines, generators have primarily always been a clean and elegant way of writing iterators. This question has been raised several times; there is a distinct difference between __iter__() and __next__(), and it is only the (I just mentioned __iter__ as it is part of the protocol; I agree that __next__ is the relevant method.)
latter which is aware of StopIteration. Compare these three classes:

    class X:
        def __init__(self):
            self.state = 0
        def __iter__(self):
            return self
        def __next__(self):
            if self.state == 3:
                raise StopIteration
            self.state += 1
            return self.state

    class Y:
        def __iter__(self):
            return iter([1, 2, 3])

    class Z:
        def __iter__(self):
            yield 1
            yield 2
            yield 3

Note how just one of these classes uses StopIteration, and yet all three are iterable, yielding the same results. Neither Y nor Z is breaking iterator protocol - but neither of them is writing an iterator, either.

All three raise StopIteration, even if it is implicit. This is trivial to demonstrate:

    def will_it_raise_stop_iteration(it):
        try:
            while True:
                it.__next__()
        except StopIteration:
            print("Raises StopIteration")
        except:
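Mark's earlier suggestion - make next() behave like max()/min() and raise ValueError for an exhausted iterator - can be sketched in a few lines. This is purely illustrative: checked_next is an invented name, not part of any PEP or of the stdlib.

```python
def checked_next(iterator):
    # Like max()/min() on an empty iterable: raise ValueError rather
    # than letting StopIteration escape into an enclosing loop.
    sentinel = object()
    value = next(iterator, sentinel)  # two-argument next() never raises StopIteration
    if value is sentinel:
        raise ValueError("iterator is exhausted")
    return value

def first_items(iterables):
    # Inside a generator, an exhausted inner iterator now surfaces as a
    # loud ValueError instead of silently terminating the outer loop.
    for it in iterables:
        yield checked_next(iter(it))
```

With this spelling, list(first_items([[1], [2, 3]])) yields the first element of each inner iterable, while an empty inner iterable raises ValueError instead of silently truncating the output - which is exactly the distinction between signal and error discussed above.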
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
On 29/11/14 01:59, Nathaniel Smith wrote: Hi all, [snip]

Option 3: Make it legal to assign to the __dict__ attribute of a module object, so that we can write something like

    new_module = MyModuleSubclass(...)
    new_module.__dict__ = sys.modules[__name__].__dict__
    sys.modules[__name__].__dict__ = {}  # ***
    sys.modules[__name__] = new_module

Why does MyModuleClass need to sub-class types.ModuleType? Modules have no special behaviour, apart from the inability to write to their __dict__ attribute, which is the very thing you don't want. If it quacks like a module...

Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
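Mark's "if it quacks like a module" point is easy to check: whatever object sits in sys.modules is handed straight back by the import statement, ModuleType subclass or not. A minimal sketch (fakemod is a made-up module name for the demonstration):

```python
import sys

class FakeModule:
    # A plain class - not a types.ModuleType subclass.
    attr = 42

sys.modules["fakemod"] = FakeModule()

import fakemod  # the import statement just returns the sys.modules entry
assert fakemod.attr == 42
```

The import system only consults finders and loaders for names that are *not* already in sys.modules, which is why this works.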
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
On 29/11/14 19:37, Nathaniel Smith wrote: [snip]

- The "new module" object has to be a subtype of ModuleType, b/c there are lots of places that do isinstance(x, ModuleType) checks (notably

It has to be a *subtype*; it does not need to be a *subclass*:

    >>> class M:
    ...     __class__ = ModuleType
    ...
    >>> isinstance(M(), ModuleType)
    True

Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
Hi, This discussion has been going on for a while, but no one has questioned the basic premise. Does this need any change to the language or interpreter? I believe it does not. I've modified your original metamodule.py to not use ctypes and to support reloading: https://gist.github.com/markshannon/1868e7e6115d70ce6e76

Cheers, Mark.

On 29/11/14 01:59, Nathaniel Smith wrote: Hi all, There was some discussion on python-ideas last month about how to make it easier/more reliable for a module to override attribute access. This is useful for things like autoloading submodules (accessing 'foo.bar' triggers the import of 'bar'), or for deprecating module attributes that aren't functions. (Accessing 'foo.bar' emits a DeprecationWarning, "the bar attribute will be removed soon".)

Python has had some basic support for this for a long time -- if a module overwrites its entry in sys.modules[__name__], then the object that's placed there will be returned by 'import'. This allows one to define custom subclasses of module and use them instead of the default, similar to how metaclasses allow one to use custom subclasses of 'type'.

In practice though it's very difficult to make this work safely and correctly for a top-level package. The main problem is that when you create a new object to stick into sys.modules, this necessarily means creating a new namespace dict. And now you have a mess, because now you have two dicts: new_module.__dict__, which is the namespace you export, and old_module.__dict__, which is the globals() for the code that's trying to define the module namespace. Keeping these in sync is extremely error-prone -- consider what happens, e.g., when your package __init__.py wants to import submodules which then recursively import the top-level package -- so it's difficult to justify for the kind of large packages that might be worried about deprecating entries in their top-level namespace.
So what we'd really like is a way to somehow end up with an object that (a) has the same __dict__ as the original module, but (b) is of our own custom module subclass. If we can do this then metamodules will become safe and easy to write correctly. (There's a little demo of working metamodules here: https://github.com/njsmith/metamodule/ but it uses ctypes hacks that depend on non-stable parts of the CPython ABI, so it's not a long-term solution.)

I've now spent some time trying to hack this capability into CPython and I've made a list of the possible options I can think of to fix this. I'm writing to python-dev because none of them are obviously The Right Way so I'd like to get some opinions/ruling/whatever on which approach to follow up on.

Option 1: Make it possible to change the type of a module object in-place, so that we can write something like

    sys.modules[__name__].__class__ = MyModuleSubclass

Option 1 downside: The invariants required to make __class__ assignment safe are complicated, and only implemented for heap-allocated type objects. PyModule_Type is not heap-allocated, so making this work would require lots of delicate surgery to typeobject.c. I'd rather not go down that rabbit-hole.

Option 2: Make PyModule_Type into a heap type allocated at interpreter startup, so that the above just works.

Option 2 downside: PyModule_Type is exposed as a statically-allocated global symbol, so doing this would involve breaking the stable ABI.

Option 3: Make it legal to assign to the __dict__ attribute of a module object, so that we can write something like

    new_module = MyModuleSubclass(...)
    new_module.__dict__ = sys.modules[__name__].__dict__
    sys.modules[__name__].__dict__ = {}  # ***
    sys.modules[__name__] = new_module

The line marked *** is necessary because the way modules are designed, they expect to control the lifecycle of their __dict__. When the module object is initialized, it fills in a bunch of stuff in the __dict__.
When the module object (not the dict object!) is deallocated, it deletes everything from the __dict__. This latter feature in particular means that having two module objects sharing the same __dict__ is bad news.

Option 3 downside: The paragraph above. Also, there's stuff inside the module struct besides just the __dict__, and more stuff has appeared there over time.

Option 4: Add a new function sys.swap_module_internals, which takes two module objects and swaps their __dict__ and other attributes. By making the operation a swap instead of an assignment, we avoid the lifecycle pitfalls from Option 3. By making it a builtin, we can make sure it always handles all the module fields that matter, not just __dict__. Usage:

    new_module = MyModuleSubclass(...)
    sys.swap_module_internals(new_module, sys.modules[__name__])
    sys.modules[__name__] = new_module

Option 4 downside: Obviously a hack.
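For a feel of what any of these options would enable, here is a minimal metamodule of the kind Nathaniel describes: a ModuleType subclass that deprecates an attribute. This is a sketch with invented names (DeprecatingModule, old_name, new_name), not code from the thread.

```python
import types
import warnings

class DeprecatingModule(types.ModuleType):
    # __getattr__ is only consulted after normal lookup fails, so
    # ordinary attribute access on the module is unaffected.
    _deprecated = {"old_name": "new_name"}

    def __getattr__(self, name):
        if name in self._deprecated:
            replacement = self._deprecated[name]
            warnings.warn(f"{name} is deprecated, use {replacement}",
                          DeprecationWarning, stacklevel=2)
            return getattr(self, replacement)
        raise AttributeError(name)

mod = DeprecatingModule("example")
mod.new_name = 3
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    value = mod.old_name  # warns, then forwards to new_name
assert value == 3 and caught[0].category is DeprecationWarning
```

The hard part the options above are solving is not writing such a subclass, but getting an *existing* module (with its existing __dict__) to become an instance of it.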
Re: [Python-Dev] Python 2.x and 3.x use survey, 2014 edition
I disagree. I know there's a huge focus on The Big Libraries (and wholesale migration is all but impossible without them), but the long tail of libraries is still incredibly important. It's like saying that migrating the top 10 Perl libraries to Perl 6 would allow people to completely ignore all of CPAN. It just doesn't make sense. -Mark On Thu, Dec 11, 2014 at 6:47 AM, Giampaolo Rodola' wrote: > > > On Wed, Dec 10, 2014 at 5:59 PM, Bruno Cauet wrote: > >> Hi all, >> Last year a survey was conducted on python 2 and 3 usage. >> Here is the 2014 edition, slightly updated (from 9 to 11 questions). >> It should not take you more than 1 minute to fill. I would be pleased if >> you took that time. >> >> Here's the url: http://goo.gl/forms/tDTcm8UzB3 >> I'll publish the results around the end of the year. >> >> Last year results: https://wiki.python.org/moin/2.x-vs-3.x-survey >> >> Thank you >> Bruno >> >> ___ >> Python-Dev mailing list >> Python-Dev@python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/g.rodola%40gmail.com >> > > I still think the only *real* obstacle remains the lack of important > packages such as twisted, gevent and pika which haven't been ported yet. > With those ones ported switching to Python 3 *right now* is not only > possible and relatively easy, but also convenient. > > > -- > Giampaolo - http://grodola.blogspot.com > > > ___ > Python-Dev mailing list > Python-Dev@python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/wizzat%40gmail.com > > ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Python 2.x and 3.x use survey, 2014 edition
So, I'm more than aware of how to write Python 2/3 compatible code. I've ported 10-20 libraries to Python 3 and write Python 2/3 compatible code at work. I'm also aware of how much writing 2/3 compatible code makes me hate Python as a language. It'll be a happy day when one of the two languages dies so that I never have to write code like that again. However, my point was that just because the core libraries by usage are *starting* to roll out Python 3 support doesn't mean that things are "easy" or "convenient" yet. There are too many libraries in the long tail which fulfill semi-common purposes and haven't been moved over yet. Yeah, sure, they haven't been updated in years... but neither has the language they're built on. I suppose what I'm saying is that the long tail of libraries is far more valuable than it seems the Python3 zealots are giving it credit for. Please don't claim it's "easy" to move over just because merely most of the top 20 libraries have been moved over. :-/ -Mark On Thu, Dec 11, 2014 at 12:14 PM, Dan Stromberg wrote: > On Thu, Dec 11, 2014 at 11:35 AM, Mark Roberts wrote: > > I disagree. I know there's a huge focus on The Big Libraries (and > wholesale > > migration is all but impossible without them), but the long tail of > > libraries is still incredibly important. It's like saying that migrating > the > > top 10 Perl libraries to Perl 6 would allow people to completely ignore > all > > of CPAN. It just doesn't make sense. > > Things in the Python 2.x vs 3.x world aren't that bad. > > See: > https://python3wos.appspot.com/ and > https://wiki.python.org/moin/PortingPythonToPy3k > http://stromberg.dnsalias.org/~strombrg/Intro-to-Python/ (writing code > to run on 2.x and 3.x) > > I believe just about everything I've written over the last few years > either ran on 2.x and 3.x unmodified, or ran on 3.x alone. If you go > the former route, you don't need to wait for your libraries to be > updated. 
> > I usually run pylint twice for my projects (after each change, prior > to checkin), once with a 2.x interpreter, and once with a 3.x > interpreter (using > http://stromberg.dnsalias.org/svn/this-pylint/trunk/this-pylint) , but > I gather pylint has the option of running on a 2.x interpreter and > warning about anything that wouldn't work on 3.x. > ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Python 2.x and 3.x use survey, 2014 edition
On Mon, Dec 15, 2014 at 11:30 AM, Chris Barker wrote: > Are you primarily writing packages for others to use? if so, then yes. But > I wonder how many people are in that camp? Don't most of us spend most of > our time writing our own purpose-built code? > > That might be a nice thing to see in a survey, actually. > So, I'm the guy that used the "hate" word in relation to writing 2/3 compliant code. And really, as a library maintainer/writer I do hate writing 2/3 compatible code. Having 4 future imports in every file and being forced to use a compatibility shim to do something as simple as iterating across a dictionary is somewhere between sad and infuriating - and that's just the beginning of the madness. From there we get into identifier related problems with their own compatibility shims - like str(), unicode(), bytes(), int(), and long(). Writing 2/3 compatible Python feels more like torture than fun. Even the python-future.org FAQ mentions how un-fun writing 2/3 compatible Python is. The whole situation is made worse because I *KNOW* that Python 3 is a better language than Python 2, but that it doesn't *MATTER* because Python 2 is what people are - and will be - using for the foreseeable future. It's impractical to drop library support for Python 2 when all of your users use Python 2, and bringing the topic up yields a response that amounts to: "WELL, Python 3 is the future! It has been out for SEVEN YEARS! You know Python 2 won't be updated ever again! Almost every library has been updated to Python 3 and you're just behind the times! And, you'll have to switch EVENTUALLY anyway! If you'd just stop writing Python 2 libraries and focus on pure Python 3 then you wouldn't have to write legacy code! PEOPLE LIKE YOU are why the split is going to be there until at least 2020!". And then my head explodes from the hostility of the "core Python community". 
Perhaps no individual response is quite so blunt, but the community (taken as a whole) feels outright toxic on this topic to me. Consider some statistics from PyPI:

- 13359 Python 2.7 packages
- 7140 Python 3.x packages
- 2732 Python 3.4 packages
- 4024 Python 2.7/3.x compatible packages
- 2281 2.7/3.4 compatible modules
- 9335 Python 2.7 without ANY Python 3 support
- 11078 Python 2.7 without Python 3.4 support
- 451 Python 3.4-only packages
- 3116 Python 3.x only packages
- 1004 Python 3.x modules without (tagged) Python 3.4 support

Looking at the numbers, I just cannot fathom how we as a community can react this way. The top 50 projects (!!) still prevent a lot of people from switching to Python 3, but the long tail of libraries is likely an even bigger blocker. I also don't understand how we can claim people should start ALL new projects in Python 3 - and be indignant when they don't! We haven't successfully converted the top 50 projects after SEVEN YEARS, and the long tail without 3.x support is still getting longer. Claims that we have something approaching library parity seem... hilarious, at best?

I suppose what I'm saying is that there's lots of claims that the conversion to Python 3 is inevitable, but I'm not convinced about that. I'd posit that another outcome is the slow death of Python as a language. I would suggest adding some "community health" metrics around the Python 2/3 split, as well as a question about whether someone considers themselves primarily a library author, application developer, or both. I'd also ask how many people have started a new application in another language instead of Python 3 because of the split. -Mark ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Python 2.x and 3.x use survey, 2014 edition
On Tue, Dec 16, 2014 at 2:45 AM, Antoine Pitrou wrote: > > Iterating accross a dictionary doesn't need compatibility shims. It's > dead simple in all Python versions: > > $ python2 > Python 2.7.8 (default, Oct 20 2014, 15:05:19) > [GCC 4.9.1] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> d = {'a': 1} > >>> for k in d: print(k) > ... > a > > $ python3 > Python 3.4.2 (default, Oct 8 2014, 13:08:17) > [GCC 4.9.1] on linux > Type "help", "copyright", "credits" or "license" for more information. > >>> d = {'a': 1} > >>> for k in d: print(k) > ... > a > > Besides, using iteritems() and friends is generally a premature > optimization, unless you know you'll have very large containers. > Creating a list is cheap. > It seems to me that every time I hear this, the author is basically admitting that Python is a toy language not meant for "serious computing" (where serious is defined in extremely modest terms). The advice is also very contradictory to literally every talk on performant Python that I've seen at PyCon or PyData or ... well, anywhere. And really, doesn't it strike you as incredibly presumptuous to call the *DEFAULT BEHAVIOR* of Python 3 a "premature optimization"? Isn't the whole reason that the default behavior switch was made is because creating lists willy nilly all over the place really *ISN'T* cheap? This isn't the first time someone has tried to run this line past me, but it's the first time I've been fed up enough with the topic to call it complete BS on the spot. Please help me stop the community at large from saying this, because it really isn't true at all. -Mark ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
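For context on the two spellings being argued about: iterating over the dict itself is identical on 2.x and 3.x with no shim, while the 2.x-only iteritems() micro-optimization is what forces a compatibility shim. A small sketch:

```python
d = {'a': 1, 'b': 2}

# Works identically on Python 2 and 3, no shim needed:
keys = sorted(d)            # iterate the dict itself -> keys
pairs = sorted(d.items())   # list on 2.x, view on 3.x; a list either way after sorted()

# The contested 2.x-only spelling, which needs a shim on 3.x:
# for k, v in d.iteritems(): ...   # AttributeError on Python 3
```

Whether items() vs iteritems() matters in practice depends entirely on container size, which is the crux of the disagreement above.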
Re: [Python-Dev] Python 2.x and 3.x use survey, 2014 edition
Perhaps you are correct, and I will attempt to remain more constructive on the topic (despite it being an *incredibly* frustrating experience). However, my point remains: this is a patently false thing that is being parroted throughout the Python community, and it's outright insulting to be told my complaints about writing 2/3 compatible code are invalid on the basis of "premature optimization". -Mark On Tue, Dec 16, 2014 at 10:57 AM, Brett Cannon wrote: > > Mark, your tone is no longer constructive and is hurting your case in > arguing for anything. Please take it down a notch. > > On Tue Dec 16 2014 at 1:48:59 PM Mark Roberts wrote: > >> On Tue, Dec 16, 2014 at 2:45 AM, Antoine Pitrou >> wrote: >>> >>> Iterating accross a dictionary doesn't need compatibility shims. It's >>> dead simple in all Python versions: >>> >>> $ python2 >>> Python 2.7.8 (default, Oct 20 2014, 15:05:19) >>> [GCC 4.9.1] on linux2 >>> Type "help", "copyright", "credits" or "license" for more information. >>> >>> d = {'a': 1} >>> >>> for k in d: print(k) >>> ... >>> a >>> >>> $ python3 >>> Python 3.4.2 (default, Oct 8 2014, 13:08:17) >>> [GCC 4.9.1] on linux >>> Type "help", "copyright", "credits" or "license" for more information. >>> >>> d = {'a': 1} >>> >>> for k in d: print(k) >>> ... >>> a >>> >>> Besides, using iteritems() and friends is generally a premature >>> optimization, unless you know you'll have very large containers. >>> Creating a list is cheap. >>> >> >> It seems to me that every time I hear this, the author is basically >> admitting that Python is a toy language not meant for "serious computing" >> (where serious is defined in extremely modest terms). The advice is also >> very contradictory to literally every talk on performant Python that I've >> seen at PyCon or PyData or ... well, anywhere. And really, doesn't it >> strike you as incredibly presumptuous to call the *DEFAULT BEHAVIOR* of >> Python 3 a "premature optimization"? 
Isn't the whole reason that the >> default behavior switch was made is because creating lists willy nilly all >> over the place really *ISN'T* cheap? This isn't the first time someone has >> tried to run this line past me, but it's the first time I've been fed up >> enough with the topic to call it complete BS on the spot. Please help me >> stop the community at large from saying this, because it really isn't true >> at all. >> >> -Mark >> ___ >> Python-Dev mailing list >> Python-Dev@python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/ >> brett%40python.org >> > ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] New Windows installer for Python 3.5
On 04/01/2015 22:56, Steve Dower wrote: Paul Moore wrote: Overall, this looks good. One question - will it be possible to install both 32-bit and 64-bit Python on the same machine? Currently, you need a custom install to do this (as the default directory doesn't include the architecture) and IIRC there's some oddness around install order. It would be nice if installing both versions were a supported option, both for the "default" install and in custom installs. Yes, the default install directories on the first page are different for 32-bit and 64-bit (either "Program Files [(x86)]" or "Python35[-32]"). The default value on the customize page is currently always C:\Python35, and so you'll need to change it if you're doing custom installs, but that's easy to add the "-32" by default. (I used a -32 suffix because it matches the py.exe option.) Also, what happens now with setting PATH? Is Python (and the scripts directory) added to PATH by default? If so, what happens when you install 2 versions of Python? Yes, and in general the later installed version will win and system-wide installs always beat per-user installs. As I mentioned above, using py.exe with a parameter or shebang line is the most reliable way to get the version you want. In case it's not clear, I'm thinking of the impact on build machines, which often have multiple versions, in both 32- and 64-bit forms, installed simultaneously (but can also be used as a "normal" development machine, and for that purpose will want a selected Python version as the default one. You should see my dev machines :) Most have 2.5, 2.6, 2.7 32-bit and 64-bit, 3.0, 3.1, 3.2, 3.3 32-bit and 64-bit, 3.4 32-bit and 64-bit, IronPython and often PyPy, Anaconda or Canopy. So I'm being fairly selfish when I try and make the multiple-versions scenario work better, but it will benefit everyone. Also, how does the launcher py.exe fit into the picture? Is it still installed into the Windows directory? What about for a user install? 
Are Python scripts associated with the launcher, and if so, how does it pick up the version you selected as default? py.exe is more important than ever. It's still installed into the Windows directory for all-user installs, and into the Python directory for just-for-me. It's installed in a way that will make upgrades more reliable (so if you install 3.6 and then 3.5, you'll keep the newer launcher) and all the file associations go straight to the launcher. The default Python for the launcher seems to be 2.7 if it's there and the latest version if not (though I could be wrong). Shebang lines are the best way to choose a specific version. Incidentally, whoever came up with the py.exe launcher deserves a medal. It's made dealing with multiple versions amazingly easy. If I'm reading this correctly it means that py.exe gets picked up from PATH so it could be 32 or 64 bit. Does this mean that the launcher could be or needs enhancing so 32 or 64 bit can be selected? I'm not sure if anything can be done about pyw.exe, perhaps you (plural) can throw some light on this for me. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] New Windows installer for Python 3.5
On 05/01/2015 17:09, Steve Dower wrote: Paul Moore wrote: Steve is in essence saying that it's not possible to sanely manage PATH as part of the new installer, but that py.exe makes that unnecessary. It's actually not possible to sanely manage PATH from any installer - it really needs to be handled by a user directly (though I can't ever bring myself to tell anyone to use the UI for it). The old installers were less susceptible because they didn't support per-user installs, but you'd still get the "last install Python wins" behaviour. Something that's help keep me slightly sane is the Rapid Environment Editor http://www.rapidee.com/en/about. I'm sure there are plenty of other choices but it does what I need. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Compile Python on Windows (OpenSSL)
On 13/01/2015 22:04, Victor Stinner wrote: +* Type: PCbuild\win32\python_d.exe PCbuild\prepare_ssl.py externals\openssl-1.0.1j See also http://bugs.python.org/issue23212 "Update Windows and OS X installer copies of OpenSSL to 1.0.1k" -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 484 syntax: NONONONONONONO!
On 01/02/2015 10:13, Benjamin wrote: The proposed syntax is abominable. It's the opposite of readable. I have no views on the subject as I won't be using it, but there is no need to shout to get your point across. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] (no subject)
On 10/02/2015 13:23, Antoine Pitrou wrote: On Tue, 10 Feb 2015 23:16:38 +1000 Nick Coghlan wrote: On 10 Feb 2015 19:24, "Terry Reedy" wrote: On 2/9/2015 7:29 PM, Neil Girdhar wrote: For some reason I can't seem to reply using Google groups, which is is telling "this is a read-only mirror" (anyone know why?) I presume spam prevention. Most spam on python-list comes from the read-write GG mirror. There were also problems with Google Groups getting the reply-to headers wrong (so if someone flipped the mirror to read-only: thank you!) With any luck, we'll have a native web gateway later this year after Mailman 3 is released, so posting through Google Groups will be less desirable. There is already a Web and NNTP gateway with Gmane: http://news.gmane.org/gmane.comp.python.devel No need to rely on Google's mediocre services. Regards Antoine. Highly recommended as effectively zero spam. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 370 - per-user scripts directory on Windows
On 10/02/2015 20:47, Paul Moore wrote: On 10 February 2015 at 12:38, Paul Moore wrote: Comments? If this is acceptable, I would be willing to prepare a patch for Python 3.5 implementing this. See http://bugs.python.org/issue23437 As yet untested, as I just realised I need to get Visual Studio 2015 installed to be able to build Python 3.5. I'll try to get that sorted out, but I thought it would be worth putting the patch up anyway - it's pretty simple. Paul Visual Studio 2013 is fine for building 3.5. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 370 - per-user scripts directory on Windows
On 12/02/2015 20:23, Paul Moore wrote: On 12 February 2015 at 20:11, Ethan Furman wrote: I believe you are correct; however, as the PEP for the launcher stated [1] "use as a general-purpose (aka non-python) launcher is explicitly not supported". Yes, I once used it to start Perl scripts, just because it appealed to the perverse instinct in me. Part of me wishes there had been a problem, so that I could be the one who raised a legitimate bug on bugs.python.org saying "Python encounters an error when running a Perl script" :-) Paul Get thee behind me, Satan. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] subclassing builtin data structures
> On Feb 12, 2015, at 18:40, Chris Angelico wrote: > > On Fri, Feb 13, 2015 at 12:46 PM, MRAB wrote: >>>>> class BaseInt: >> ... def __init__(self, value): >> ... self._value = value >> ... def __add__(self, other): >> ... return type(self)(self._value + other) > >> On Fri, Feb 13, 2015 at 11:55 AM, Guido van Rossum wrote: >> ... there is no reason (in general) why >> the signature of a subclass constructor should match the base class >> constructor, and it often doesn't. > > You're requiring that any subclass of BaseInt be instantiable with one > argument, namely its value. That's requiring that the signature of the > subclass constructor match the base class constructor. > > ChrisA > ___ > Python-Dev mailing list > Python-Dev@python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/wizzat%40gmail.com No, it seems like he's asking that the type return a new object of the same type instead of one of the superclass. In effect, making the Date class call type(self)(*args) instead of datetime.date(*args). He seems completely willing to accept the consequences of changing the constructor (namely that he will have to override all the methods that call the constructor). It seems like good object oriented design to me. -Mark ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
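The pattern under discussion, completed into runnable form from MRAB's BaseInt example upthread: by going through type(self) instead of hard-coding the base class, arithmetic on a subclass returns the subclass.

```python
class BaseInt:
    def __init__(self, value):
        self._value = value

    def __add__(self, other):
        # type(self), not BaseInt: a subclass that keeps the
        # one-argument constructor gets an instance of itself back.
        return type(self)(self._value + other)

class MyInt(BaseInt):
    pass

x = MyInt(1) + 2
assert type(x) is MyInt and x._value == 3
```

Guido's point is the flip side: a subclass that *changes* the constructor signature must then override every method that calls type(self)(...), which is the trade-off the poster says he accepts.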
Re: [Python-Dev] PEP 441 - Improving Python ZIP Application Support
On 15/02/2015 18:06, Steve Dower wrote: "Go ahead, make my pep." We should make a python-dev t-shirt with this on it :) I'll buy one provided p&p isn't too high :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 441 - Improving Python ZIP Application Support
I don't know what anyone else does, but in cases where I have both on my windows box, I do use python2(.x) and python3(.y). If I only have one version on the box, I use the generic name of course. (I don't often have only one version on my boxes though. 2.x inevitably gets drug in for some reason or another and I hardly ever uninstall old versions of 3.x.) I don't use the launcher though, so I might be out-of-scope entirely. (in which case, sorry for the noise) ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 441 - Improving Python ZIP Application Support
If I only have one version on my box, yes, I only use "python". But like I said, for me personally, that situation doesn't last very long. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Emit SyntaxWarning on unrecognized backslash escapes?
On 23/02/2015 21:27, Serhiy Storchaka wrote: On 23.02.15 21:58, Joao S. O. Bueno wrote: That happens all the time, and it is this use case that should possibly be addressed here - maybe something as simple as adding a couple of paragraphs in different places in the documentation could mitigate the issue (in contrast to making tons of otherwise valid code deprecated in a couple of releases). The problem is that the user doesn't know that he should read the documentation. He just finds that his script works with "C:\sample.txt", but doesn't work with "D:\test.txt". He has no idea what happened. Isn't this why users have help desks? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
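The confusion Serhiy describes is easy to reproduce. '\s' is not a recognized escape sequence, so the backslash survives; '\t' is recognized, so the backslash silently becomes a TAB. A quick sketch:

```python
# Why "C:\sample.txt" appears to work while "D:\test.txt" does not:
path1 = "C:\sample.txt"   # '\s' unrecognized: backslash kept
                          # (newer Pythons also emit a warning here)
path2 = "D:\test.txt"     # '\t' IS recognized: collapsed to a tab

assert path1 == "C:\\sample.txt"
assert "\t" in path2      # the intended backslash is gone

raw = r"D:\test.txt"      # raw strings sidestep the problem entirely
assert "\t" not in raw and raw == "D:\\test.txt"
```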
Re: [Python-Dev] PEP 488: elimination of PYO files
On 06/03/15 16:34, Brett Cannon wrote: Over on the import-sig I proposed eliminating the concept of .pyo files since they only signify that /some/ optimization took place, not /what/ optimizations took place. Everyone on the SIG was positive with the idea so I wrote a PEP, got positive feedback from the SIG again, and so now I present to you PEP 488 for discussion. [snip] Historically -O and -OO have been the antithesis of optimisation: they change the behaviour of the program with no noticeable effect on performance. If a change is to be made, why not just drop .pyo files and be done with it? Any worthwhile optimisation needs to be done at runtime or involve much more than tweaking bytecode. Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
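For context on what -O actually changes at the bytecode level: code guarded by __debug__, including assert statements, is compiled out entirely under "python -O" (and -OO additionally drops docstrings). A small sketch, run here without -O so the guard is active:

```python
# Under "python -O", the __debug__ block below is removed from the
# compiled bytecode entirely; run normally, it is present and active.
def checked_div(a, b):
    if __debug__:
        # This whole block disappears under -O.
        assert b != 0, "division by zero guard"
    return a / b

assert __debug__ is True          # normal (non -O) run
assert checked_div(6, 3) == 2.0   # guard passes, division proceeds
```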
Re: [Python-Dev] PEP 489: Redesigning extension module loading
On 16/03/2015 12:38, Petr Viktorin wrote: Hello, Can you use anything from the meta issue http://bugs.python.org/issue15787 for PEP 3121 and PEP 384 or will the work that you are doing render everything done previously redundant? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Builtin functions are magically static methods?
On 29/03/15 19:16, Paul Sokolovsky wrote: Hello, I looked into porting the Python3 codecs module to MicroPython and saw rather strange behavior, which is best illustrated with the following testcase:

==
def foo(a):
    print("func:", a)

import _codecs
fun = _codecs.utf_8_encode
#fun = hash
#fun = str.upper
#fun = foo

class Bar:
    meth = fun

print(fun)
print(fun("foo"))
b = Bar()
print(b.meth("bar"))
==

Uncommenting either _codecs.utf_8_encode or hash (both builtin functions) produces 2 similar output lines, which in particular means that it's possible to call a native function as a (normal) object method, which then behaves as if it were a staticmethod - self is not passed to a native function. Using a native object method in this manner produces a self type mismatch error (TypeError: descriptor 'upper' for 'str' objects doesn't apply to 'Bar' object). And using a standard Python function expectedly produces an error about argument number mismatch, because used as a method, the function gets an extra self argument. So the questions are: 1. How so, the native functions exhibit such magic behavior? Is it documented somewhere - I never read or heard about that (cannot say I read each and every word in the Python reference docs, but I've read enough. As an example, https://docs.python.org/3/library/stdtypes.html#functions is rather short and mentions a difference in implementation, not in meta-behavior). In fact the "magic" is exhibited by Python functions, not by builtin ones. Python functions are descriptors, builtin functions are not. 2. The main question: how to easily and cleanly achieve the same behavior for standard Python functions? I'd think it's staticmethod(), but: Write your own "BuiltinFunction" class which has the desired properties, i.e. it would be callable, but not a descriptor. Then write a "builtin_function" decorator to produce such an object from a function. The class and decorator could be the same object. 
Personally, I think that such a class (plus a builtin function type that behaved like a Python function) would be a useful addition to the standard library. Modules do get converted from Python to C and vice-versa.

>>> staticmethod(lambda: 1)()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'staticmethod' object is not callable

Surprise. (By "easily and cleanly" I mean without meta-programming tricks, like instead of real arguments accepting "*args, **kwargs" and then munging args). Thanks, Paul <pmis...@gmail.com> Cheers, Mark ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
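The wrapper class Mark describes can be sketched in a few lines. The name builtin_function is hypothetical; the point is only that a callable which is not a descriptor is never bound as an instance method, which reproduces the builtin-function behaviour from Paul's testcase:

```python
class builtin_function:
    """Wrap a Python function so it behaves like a builtin:
    callable, but NOT a descriptor, so it is never bound as a method."""
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)
    # No __get__ defined: attribute access on an instance returns the
    # wrapper itself, not a bound method.

@builtin_function
def greet(arg):
    return "func: " + arg

class Bar:
    meth = greet  # looked up like a builtin: no binding, no implicit self

b = Bar()
print(b.meth("bar"))  # → func: bar  (no self passed)
```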
Re: [Python-Dev] importante!!
Features of blender are relevant on blender mailing lists, not here . (I don't understand why you would want a 3d modeling program to be an IDE, but whatever floats your boat) Also, python-dev isn't really the place for feature requests. If you want something added, add it yourself. : ) ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Type hints -- a mediocre programmer's reaction
Just another peanut from the gallery: I pretty much agree with everything that Harry said. My current response to type annotations is "Yuck, that kills readability. I hope no code I ever have to read uses this." ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] PEP 492: No new syntax is required
Hi, I was looking at PEP 492 and it seems to me that no new syntax is required. Looking at the code, it does four things; all of which, or a functional equivalent, could be done with no new syntax.

1. Make a normal function into a generator or coroutine. This can be done with a decorator.
2. Support a parallel set of special methods starting with 'a' or 'async'. Why not just use the current set of special methods?
3. "await". "await" is an operator that takes one argument and produces a single result, without altering flow control, and can thus be replaced by a function.
4. Asynchronous with statement. The PEP lists the equivalent as "with (yield from xxx)", which doesn't seem so bad.

Please don't add unnecessary new syntax. Cheers, Mark. P.S. I'm not objecting to any of the other new features proposed, just the new syntax. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
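Point 3 can be illustrated with a pre-PEP-492 sketch: coroutines are plain generators, "yield from" plays the role of "await", and a trivial driver plays the role of the event loop. All names here are illustrative:

```python
def read_data():
    # Leaf "coroutine": yield once to simulate suspending for I/O.
    yield "io-request"
    return "payload"

def fetch():
    # Equivalent of "data = await read_data()" without new syntax.
    data = yield from read_data()
    return data + "!"

def run(coro):
    # Minimal "event loop": resume the coroutine until it finishes.
    try:
        coro.send(None)        # start it
        while True:
            coro.send(None)    # a real loop would wait for I/O here
    except StopIteration as e:
        return e.value

result = run(fetch())
print(result)  # payload!
```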
Re: [Python-Dev] PEP 492: No new syntax is required
On 26/04/15 21:40, Yury Selivanov wrote: Hi Mark, On 2015-04-26 4:21 PM, Mark Shannon wrote: Hi, I was looking at PEP 492 and it seems to me that no new syntax is required. Mark, all your points are explained in the PEP in a great detail: I did read the PEP. I do think that clarifying the distinction between coroutines and 'normal' generators is a good idea. Adding stuff to the standard library to help is fine. I just don't think that any new syntax is necessary. Looking at the code, it does four things; all of which, or a functional equivalent, could be done with no new syntax. Yes, everything that the PEP proposes can be done without new syntax. That's how people use asyncio right now, with only what we have in 3.4. But it's hard. Iterating through something asynchronously? Write a 'while True' loop. Instead of 1 line you now have 5 or 6. Want to commit your database transaction? Instead of 'async with' you will write 'try..except..finally' block, with a very high probability to introduce a bug, because you don't rollback or commit properly or propagate exception. I don't see why you can't do transactions using a 'with' statement. 1. Make a normal function into a generator or coroutine. This can be done with a decorator. https://www.python.org/dev/peps/pep-0492/#rationale-and-goals states that """ it is not possible to natively define a coroutine which has no yield or yield from statement """ which is just not true. https://www.python.org/dev/peps/pep-0492/#debugging-features Requires the addition of the CO_COROUTINE flag, not any new keywords. https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword Seems to be repeating the above. 2. Support a parallel set of special methods starting with 'a' or 'async'. Why not just use the current set of special methods? Because you can't reuse them. https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-for-and-with-statements Which seems back to front. 
The argument is that existing syntax constructs cannot be made to work with asynchronous objects. Why not make the asynchronous objects work with the existing syntax? https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names The argument here relies on the validity of the previous points. 3. "await". "await" is an operator that takes one argument and produces a single result, without altering flow control and can thus be replaced by a function. It can't be replaced by a function. Only if you use greenlets or Stackless Python. Why not? The implementation of await is here: https://github.com/python/cpython/compare/master...1st1:await#diff-23c87bfada1d01335a3019b9321502a0R642 which clearly could be made into a function. 4. Asynchronous with statement. The PEP lists the equivalent as "with (yield from xxx)" which doesn't seem so bad. There is no equivalent to 'async with'. "with (yield from xxx)" only allows you to suspend execution in __enter__ (and it's not actually in __enter__, but in a coroutine that returns a context manager). https://www.python.org/dev/peps/pep-0492/#asynchronous-context-managers-and-async-with see "New Syntax" section to see what 'async with' is equivalent to. Which, by comparing with PEP 343, can be translated as:

with expr as e:
    e = await(e)
    ...

Please don't add unnecessary new syntax. It is necessary. This isn't an argument, it's just contradiction ;) Perhaps you haven't spent a lot of time maintaining huge code-bases developed with frameworks like asyncio, so I understand why it does look unnecessary to you. This is a good reason for clarifying the distinction between 'normal' generators and coroutines. It is not, IMO, justification for burdening the language (and everyone porting Python 2 code) with extra syntax. Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 492: No new syntax is required
On 27/04/15 00:13, Guido van Rossum wrote: But new syntax is the whole point of the PEP. I want to be able to *syntactically* tell where the suspension points are in coroutines. Doesn't "yield from" already do that? Currently this means looking for yield [from]; PEP 492 just adds looking for await and async [for|with]. Making await() a function defeats the purpose because now aliasing can hide its presence, and we're back in the land of gevent or stackless (where *anything* can potentially suspend the current task). I don't want to live in that land. I don't think I was clear enough. I said that "await" *is* a function, not that it should be disguised as one. Reading the code, "GetAwaitableIter" would be a better name for that element of the implementation. It is a straightforward non-blocking function. On Sun, Apr 26, 2015 at 1:21 PM, Mark Shannon <m...@hotpy.org> wrote: Hi, I was looking at PEP 492 and it seems to me that no new syntax is required. Looking at the code, it does four things; all of which, or a functional equivalent, could be done with no new syntax.

1. Make a normal function into a generator or coroutine. This can be done with a decorator.
2. Support a parallel set of special methods starting with 'a' or 'async'. Why not just use the current set of special methods?
3. "await". "await" is an operator that takes one argument and produces a single result, without altering flow control, and can thus be replaced by a function.
4. Asynchronous with statement. The PEP lists the equivalent as "with (yield from xxx)" which doesn't seem so bad.

Please don't add unnecessary new syntax. Cheers, Mark. P.S. I'm not objecting to any of the other new features proposed, just the new syntax. 
___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/guido%40python.org -- --Guido van Rossum (python.org/~guido) ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 492: No new syntax is required
On 26/04/15 23:24, Nick Coghlan wrote: On 27 Apr 2015 07:50, "Mark Shannon" <m...@hotpy.org> wrote: > On 26/04/15 21:40, Yury Selivanov wrote: >> >> But it's hard. Iterating through something asynchronously? Write a >> 'while True' loop. Instead of 1 line you now have 5 or 6. Want to >> commit your database transaction? Instead of 'async with' you will >> write 'try..except..finally' block, with a very high probability to >> introduce a bug, because you don't rollback or commit properly or >> propagate exception. > > I don't see why you can't do transactions using a 'with' statement. Because you need to pass control back to the event loop from the *__exit__* method in order to wait for the commit/rollback operation without blocking the scheduler. The "with (yield from cm())" formulation doesn't allow either __enter__ *or* __exit__ to suspend the coroutine to wait for IO, so you have to do the IO up front and return a fully synchronous (but still non-blocking) CM as the result. True. The 'with' statement cannot support this use case, but try-except can do the job:

trans = yield from db_conn.transaction()
try:
    ...
except:
    yield from trans.roll_back()
    raise
yield from trans.commit()

Admittedly not as elegant as the 'with' statement, but perfectly readable. We knew about these problems going into PEP 3156 (http://python-notes.curiousefficiency.org/en/latest/pep_ideas/async_programming.html#using-special-methods-in-explicitly-asynchronous-code) so it's mainly a matter of having enough experience with asyncio now to be able to suggest specific syntactic sugar to make the right way and the easy way the same way. asyncio is just one module amongst thousands, does it really justify special syntax? Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Issues with PEP 482 (1)
Hi, I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :) "async" The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings, the second of which is: """It is not possible to natively define a coroutine which has no yield or yield from statements.""" This is incorrect, although what is meant by 'natively' is unclear. A coroutine without a yield statement can be defined simply and concisely, thus:

@coroutine
def f():
    return 1

This is only a few characters longer than the proposed new syntax, perfectly explicit, and requires no modification to the language whatsoever. A pure-python definition of the "coroutine" decorator is given below. So could the "Rationale and Goals" be corrected accordingly, please. Also, either the "async def" syntax should be dropped, or a new justification is required. Cheers, Mark.

#coroutine.py
from types import FunctionType, CodeType

CO_COROUTINE = 0x0080
CO_GENERATOR = 0x0020

def coroutine(f):
    'Converts a function to a generator function'
    old_code = f.__code__
    new_code = CodeType(
        old_code.co_argcount,
        old_code.co_kwonlyargcount,
        old_code.co_nlocals,
        old_code.co_stacksize,
        old_code.co_flags | CO_GENERATOR | CO_COROUTINE,
        old_code.co_code,
        old_code.co_consts,
        old_code.co_names,
        old_code.co_varnames,
        old_code.co_filename,
        old_code.co_name,
        old_code.co_firstlineno,
        old_code.co_lnotab,
        old_code.co_freevars,
        old_code.co_cellvars)
    return FunctionType(new_code, f.__globals__)

P.S. The reverse of this decorator, which unsets the flags, converts a generator function into a normal function. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 492: No new syntax is required
On 28/04/15 20:24, Paul Sokolovsky wrote: Hello, [snip] Based on all this passage, my guess is that you miss difference between C and Python functions. This is rather patronising, almost to the point of being insulting. Please keep the debate civil. [snip] Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Issues with PEP 482 (1)
On 28/04/15 20:39, Paul Sokolovsky wrote: Hello, On Tue, 28 Apr 2015 19:44:53 +0100 Mark Shannon wrote: [] A coroutine without a yield statement can be defined simply and concisely, thus:

@coroutine
def f():
    return 1

[] A pure-python definition of the "coroutine" decorator is given below. []

from types import FunctionType, CodeType

CO_COROUTINE = 0x0080
CO_GENERATOR = 0x0020

def coroutine(f):
    'Converts a function to a generator function'
    old_code = f.__code__
    new_code = CodeType(
        old_code.co_argcount,
        old_code.co_kwonlyargcount,

This is a joke, right? Well, it was partly for entertainment value, although it works on PyPy. The point is that something that can be done with a decorator, whether in pure Python or as a builtin, does not require new syntax. Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Issues with PEP 482 (1)
On 28/04/15 21:06, Guido van Rossum wrote: On Tue, Apr 28, 2015 at 11:44 AM, Mark Shannon mailto:m...@hotpy.org>> wrote: Hi, I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :) "async" The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings. The second of which is: """It is not possible to natively define a coroutine which has no yield or yield from statements.""" This is incorrect, although what is meant by 'natively' is unclear. A coroutine without a yield statement can be defined simply and concisely, thus: @coroutine def f(): return 1 This is only a few character longer than the proposed new syntax, perfectly explicit and requires no modification the language whatsoever. A pure-python definition of the "coroutine" decorator is given below. So could the "Rationale and Goals" be correctly accordingly, please. Also, either the "async def" syntax should be dropped, or a new justification is required. So here's *my* motivation for this. I don't want the code generator to have to understand decorators. To the code generator, a decorator is just an expression, and it shouldn't be required to understand decorators in sufficient detail to know that *this* particular decorator means to generate different code. The code generator knows nothing about it. The generated bytecode is identical, only the flags are changed. The decorator can just return a copy of the function with modified co_flags. And it's not just generating different code -- it's also the desire to issue static errors (SyntaxError) when await (or async for/with) is used outside a coroutine, or when yield [from] is use inside one. Would raising a TypeError at runtime be sufficient to catch the sort of errors that you are worried about? The motivation is clear enough to me (and AFAIR I'm the BDFL for this PEP :-). Can't argue with that. Cheers, Mark. 
___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Migrate python-dev to Mailman 3?
On 10/26/2017 07:28 AM, Wes Turner wrote: > > > On Thursday, October 26, 2017, Paul Moore wrote: > > On 26 October 2017 at 10:24, Victor Stinner > wrote: > > We are using Mailman 3 for the new buildbot-status mailing list and it > > works well: > > > > > https://mail.python.org/mm3/archives/list/buildbot-sta...@python.org/ > > ... > > My only use of the pipermail archives is to find permanent URLs for > mails I want to refer people to. My usage goes as follows: > > 1. Google search for a post. > 2. Paste in the URL to an email. > > Or, if I have the post already (usually in my email client). > > 1. Check the date and subject of the post. > 2. Go to the pipermail article by month, and scan the list for the > subject and author. > 3. Click on the link, check it's the right email, copy the URL. > 4. Paste it into my email. > > I don't use the archives for reading. If the above two usages are > still available, I don't care. But in particular, the fact that > individual posts are searchable from Google is important to me. A Google search narrowed with "site:mail.python.org" and perhaps "inurl:listn...@python.org" works for HyperKitty archives as well. Also, the archive itself has a "search this list" box. ...> The complexity of this process is also very wastefully frustrating to > me. (Maybe it's in the next month's message tree? No fulltext search? No > way to even do an inurl: search because of the URIs?!) I don't see these issues. There is a full text search box on the archive page and I don't see the problem with Google inurl: > Isn't there a way to append a permalink to the relayed message footers? > Google Groups and Github do this and it saves a lot of time. As you note below, there is an Archived-At: header. I have just submitted an RFE at <https://gitlab.com/mailman/mailman/issues/432> to enable placing this in the message header/footer. 
> [Re-searches for things] > > Mailman3 adds an RFC 5064 "Archived-At" header with a link that some > clients provide the ability to open in a normal human browser: > > http://dustymabe.com/2016/01/10/archived-at-email-header-from-mailman-3-lists/ > > I often click the "view it on Github" link in GitHub issue emails. (It's > after the '--' email signature delimiter, so it doesn't take up so much > room). > > "[feature] Add permalink to mail message to the footer when delivering > email" > https://gitlab.com/mailman/hyperkitty/issues/27 This needs to be in Mailman Core, not HyperKitty. As I note above, I filed an RFE with core and also referenced it in the HyperKitty issue. > Finally, how would a transition be handled? I assume the old archives > would be retained, so would there be a cut-off date and people would > have to know to use the old or new archives based on the date of the > message? > > > Could an HTTP redirect help with directing users to the new or old archives? What we did when migrating security-sig is we migrated the archive but kept the old one and added this message and link to the old archive page: "This list has been migrated to Mailman 3. This archive is not being updated. Here is the new archive including these old posts." We also redirected <https://mail.python.org/mailman/listinfo/security-sig> to <https://mail.python.org/mm3/mailman3/lists/security-sig.python.org/>. We purposely didn't redirect the old archive so that saved URLs would still work. We did the same things for security-announce and clearly can do the same for future migrations. Finally note that Mailman 3 supports archivers other than HyperKitty. For example, one can configure a list to archive at www.mail-archive.com, in such a way that the Archived-At: permalink points to the message at www.mail-archive.com. -- Mark Sapiro, San Francisco Bay Area, California "The highway is for gamblers, better use your sense" - B. Dylan ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Guarantee ordered dict literals in v3.7?
On 07/11/17 04:05, David Mertz wrote: I strongly opposed adding an ordered guarantee to regular dicts. If the implementation happens to keep that, great. Maybe OrderedDict can be rewritten to use the dict implementation. But the evidence that all implementations will always be fine with this restraint feels poor, and we have a perfectly good explicit OrderedDict for those who want that. If there is an ordered guarantee for regular dicts but not for dict literals, which is the subject of this thread, then haven't we got a recipe for the kind of confusion that will lead to the number of questions from newbies going off of the Richter scale? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
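The distinction being debated is narrower than it sounds: the guarantee is about insertion order, and a dict literal is just one way of performing insertions. A quick check of the behaviour (CPython 3.6, made a language guarantee in 3.7):

```python
# A literal and incremental construction produce the same ordering,
# since both are just a sequence of insertions.
d = {'banana': 3, 'apple': 1, 'cherry': 2}
assert list(d) == ['banana', 'apple', 'cherry']  # literal order kept

d2 = {}
d2['banana'] = 3
d2['apple'] = 1
d2['cherry'] = 2
assert list(d2) == list(d)                       # same order either way
```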
Re: [Python-Dev] Python possible vulnerabilities in concurrency
On 16/11/17 04:53, Guido van Rossum wrote: [snip] They then go on to explain that sometimes vulnerabilities can be exploited, but I object to calling all bugs vulnerabilities -- that's just using a scary word to get attention for a sleep-inducing document containing such gems as "Use floating-point arithmetic only when absolutely needed" (page 230). Thanks for reading it, so we don't have to :) As Wes said, cwe.mitre.org is the place to go if you care about this stuff, although it can be a bit opaque. For non-experts, https://www.owasp.org/index.php/Top_10_2013-Top_10 is a good starting point to learn about software vulnerabilities, Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Comments on PEP 563 (Postponed Evaluation of Annotations)
Hi, Overall I am strongly in favour of this PEP. It pretty much cures all the ongoing pain of using PEP 3107 annotations for type hints. There is one thing I don't like however, and that is treating strings as if the quotes weren't there. While this seems like a superficial simplification to make transition easier, it introduces inconsistency and will ultimately make both implementing and using type hints harder. Having the treatment of strings depend on their depth in the AST seems confusing and unnecessary:

"List[int]" becomes 'List[int]'    # quotes removed

but

List["int"] becomes 'List["int"]'  # quotes retained

Also,

T = "My unparseable annotation"
def f()->T: pass

would remain legal, but

def f()->"My unparseable annotation": pass

would become illegal. The change in behaviour between the above two code snippets is already confusing enough without making one of them a SyntaxError. Using annotations for purposes other than type hinting is legal and has been for quite a while. Also, PEP 484 type-hints are not the only type system in the Python ecosystem. Cython has a long history of using static type hints. For tools other than MyPy, the inconsistent quoting is onerous and will require double-quoting to prevent a parse error. For example,

def foo()->"unsigned int": ...

will become illegal and require the cumbersome

def foo()->'"unsigned int"': ...

Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
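For reference, the core mechanism under discussion shipped behind `from __future__ import annotations` in Python 3.7: annotations are stored as strings and never evaluated at definition time, so even an undefined name is accepted. A quick check of the basic behaviour:

```python
from __future__ import annotations

# 'List' is undefined here, yet this compiles and runs: the annotation
# expressions are kept as strings, not evaluated.
def f(x: int) -> List[int]:
    return [x]

assert f.__annotations__ == {'x': 'int', 'return': 'List[int]'}
assert f(2) == [2]
```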
[Python-Dev] Comments on PEP 560 (Core support for typing module and generic types)
Hi, I am very concerned by this PEP. By far and away the largest change in PEP 560 is the change to the behaviour of object.__getitem__. This is not mentioned in the PEP at all, but is explicit in the draft implementation. The implementation could implement `type.__getitem__` instead of changing `object.__getitem__`, but that is still a major change to the language. In fact, the addition of `__mro_entries__` makes `__class_getitem__` unnecessary. The addition of `__mro_entries__` allows instances of classes that do not subclass `type` to act as classes in some circumstances. That means that any class can implement `__getitem__` to provide a generic type. For example, here is a minimal working implementation of `List`:

class Generic:
    def __init__(self, concrete):
        self.concrete = concrete
    def __getitem__(self, index):
        return self.concrete
    def __mro_entries__(self):
        return self.concrete

List = Generic(list)

class MyList(List): pass          # Works perfectly
class MyIntList(List[int]): pass  # Also works.

The name `__mro_entries__` suggests that this method is solely related to method resolution order, but it is really about providing an instance of `type` where one is expected. This is analogous to `__int__`, `__float__` and `__index__`, which provide an int, float and int respectively. This rather suggests (to me at least) the name `__type__` instead of `__mro_entries__`. Also, why return a tuple of classes, not just a single class? The PEP should include the justification for this decision. Should `isinstance` and `issubclass` call `__mro_entries__` before raising an error if the second argument is not a class? In other words, if `List` implements `__mro_entries__` to return `list` then should `issubclass(x, List)` act like `issubclass(x, list)`? (IMO, it shouldn't.) The reasoning behind this decision should be made explicit in the PEP. Cheers, Mark. 
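For comparison with Mark's sketch, here is how the __mro_entries__ protocol behaves as eventually specified (Python 3.7+): the method receives the original bases tuple and returns a tuple of replacement bases, with the originals recorded in __orig_bases__. The Alias/ListAlias names are illustrative:

```python
class Alias:
    """A non-class object usable as a base class via __mro_entries__."""
    def __init__(self, origin):
        self.origin = origin
    def __mro_entries__(self, bases):
        # Substitute the real class for this alias at class-creation time.
        return (self.origin,)

ListAlias = Alias(list)

class MyList(ListAlias):   # ListAlias is not a class, yet this works
    pass

assert MyList.__mro__ == (MyList, list, object)   # alias replaced by list
assert MyList.__orig_bases__ == (ListAlias,)      # original base recorded
```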
___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Comment on PEP 562 (Module __getattr__ and __dir__)
Hi, Just one comment. Could the new behaviour of attribute lookup on a module be spelled out more explicitly please? I'm guessing it is now something like: `module.__getattribute__` is now equivalent to:

def __getattribute__(mod, name):
    try:
        return object.__getattribute__(mod, name)
    except AttributeError:
        try:
            getter = mod.__dict__["__getattr__"]
        except KeyError:
            raise AttributeError(f"module has no attribute '{name}'")
        return getter(name)

Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comment on PEP 562 (Module __getattr__ and __dir__)
On 19/11/17 20:41, Serhiy Storchaka wrote: 19.11.17 22:24, Mark Shannon wrote: Just one comment. Could the new behaviour of attribute lookup on a module be spelled out more explicitly please? I'm guessing it is now something like: `module.__getattribute__` is now equivalent to:

def __getattribute__(mod, name):
    try:
        return object.__getattribute__(mod, name)
    except AttributeError:
        try:
            getter = mod.__dict__["__getattr__"]
        except KeyError:
            raise AttributeError(f"module has no attribute '{name}'")
        return getter(name)

I think it is better to describe it in terms of __getattr__:

def ModuleType.__getattr__(mod, name):
    try:
        getter = mod.__dict__["__getattr__"]
    except KeyError:
        raise AttributeError(f"module has no attribute '{name}'")
    return getter(name)

The implementation of ModuleType.__getattribute__ will not be changed (it is inherited from the object type). Not quite, ModuleType overrides object.__getattribute__ in order to provide a better error message. So with your suggestion, the change would be to *not* override object.__getattribute__ and provide the above ModuleType.__getattr__. Cheers, Mark. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
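The semantics being pinned down here can be checked directly against a synthetic module object (Python 3.7+, where PEP 562 shipped): a module-level __getattr__ in the module's namespace is consulted only after normal attribute lookup fails. The "demo" module and its attributes are illustrative:

```python
import types

mod = types.ModuleType("demo")
mod.present = "found directly"

def _module_getattr(name):
    # Fallback invoked only when normal lookup fails.
    if name == "lazy":
        return "computed on demand"
    raise AttributeError(f"module 'demo' has no attribute {name!r}")

mod.__getattr__ = _module_getattr  # lands in mod.__dict__

assert mod.present == "found directly"   # normal lookup wins
assert mod.lazy == "computed on demand"  # falls back to __getattr__
assert not hasattr(mod, "nope")          # fallback raises AttributeError
```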
Re: [Python-Dev] Comments on PEP 560 (Core support for typing module and generic types)
On 19/11/17 22:36, Ivan Levkivskyi wrote: On 19 November 2017 at 21:06, Mark Shannon wrote: By far and away the largest change in PEP 560 is the change to the behaviour of object.__getitem__. This is not mentioned in the PEP at all, but is explicit in the draft implementation. The implementation could implement `type.__getitem__` instead of changing `object.__getitem__`, but that is still a major change to the language. Except that there is no such thing as object.__getitem__. Probably you mean PyObject_GetItem (which is just what is done by the BINARY_SUBSCR opcode). Yes, I should have taken more time to look at the code. I thought you were implementing `object.__getitem__`. In general, Python implements its operators as a simple redirection to a special method, with the exception of binary operators which are necessarily more complex.

    f(...) -> type(f).__call__(f, ...)
    o.a    -> type(o).__getattribute__(o, "a")
    o[i]   -> type(o).__getitem__(o, i)

Which is why I don't like the additional complexity you are adding to the dispatching. If we really must have `__class_getitem__` (and I don't think that we do) then implementing `type.__getitem__` is a much less intrusive way to do it. In fact, I initially implemented type.__getitem__, but I didn't like it for various reasons. Could you elaborate? I don't think that any of the above are changes to the language. These are rather implementation details. The only unusual thing is that while dunders are searched on the class, __class_getitem__ is searched on the object (a class object in this case) itself. But this is clearly explained in the PEP. In fact, the addition of `__mro_entries__` makes `__class_getitem__` unnecessary. But how would you implement this:

    class C(Generic[T]): ...
    C[int]  # This should work

The issue of type-hinting container classes is a tricky one. The definition is defining both the implementation class and the interface type. We want the implementation and interface to be distinct.
However, we want to avoid needless repetition. In the example you gave, `C` is a class definition that is intended to be used as a generic container. In my mind the cleanest way to do this is with a class decorator. Something like:

    @Generic[T]
    class C: ...

or

    @implements(Generic[T])
    class C: ...

C would then be a type, not a class, as the decorator is free to return a non-class object. It allows the implementation and interface to be distinct:

    @implements(Sequence[T])
    class MySeq(list): ...

    @implements(Set[Node])
    class SmallNodeSet(list): ...  # For small sets a list is more efficient than a set.

but avoids repetition for the more common case:

    class IntStack(List[int]): ...

Given the power and flexibility of the built-in data structures, defining custom containers is relatively rare. I'm not saying that it should not be considered, but a few minor hurdles are acceptable to keep the rest of the language (including more common uses of type-hints) clean. The name `__mro_entries__` suggests that this method is solely related to method resolution order, but it is really about providing an instance of `type` where one is expected. This is analogous to `__int__`, `__float__` and `__index__`, which provide an int, float and int respectively. This rather suggests (to me at least) the name `__type__` instead of `__mro_entries__`. This was already discussed over several months, and in particular the name __type__ was not liked by ... you. Ha, you have a better memory than I :) I won't make any more naming suggestions. What I should have said is that the name should reflect what it does, not the initial reason for including it. https://github.com/python/typing/issues/432#issuecomment-304070379 So I would propose to stop bikeshedding this (also Guido seems to like the currently proposed name). Should `isinstance` and `issubclass` call `__mro_entries__` before raising an error if the second argument is not a class?
In other words, if `List` implements `__mro_entries__` to return `list` then should `issubclass(x, List)` act like `issubclass(x, list)`? (IMO, it shouldn't) The reasoning behind this decision should be made explicit in the PEP. I think this is orthogonal to the PEP. There are many situations where a class is expected, and IMO it is clear that all that are not mentioned in the PEP stay unchanged. Indeed, but you do mention issubclass in the PEP. I think a few extra words of explanation would be helpful. Cheers, Mark.
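Both hooks under discussion shipped in Python 3.7, so their behaviour can now be checked directly. A minimal runnable sketch; the class names here are made up for illustration, and `@classmethod` is spelled out explicitly (the implicit conversion of `__class_getitem__` came later):

```python
# __class_getitem__ is looked up on the class object itself,
# so `Registry[int]` works without a metaclass (Python 3.7+).
class Registry:
    @classmethod
    def __class_getitem__(cls, item):
        return (cls, item)

assert Registry[int] == (Registry, int)

# __mro_entries__ lets a non-class object stand in for a base class:
# the bases of MyList are resolved to (list,) at class-creation time.
class ListAlias:
    def __mro_entries__(self, bases):
        return (list,)

alias = ListAlias()

class MyList(alias):  # resolved to `class MyList(list)`
    pass

assert MyList.__mro__ == (MyList, list, object)
assert MyList.__orig_bases__ == (alias,)  # the original bases are kept
```

Consistent with the answer Mark prefers above, `issubclass` does not consult `__mro_entries__`: passing `alias` as the second argument still raises TypeError.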
Re: [Python-Dev] What's the status of PEP 505: None-aware operators?
Hi Lukasz, I don’t have plans on editing or promoting the PEP any further, unless there is renewed interest or somebody proposes a more Pythonic syntax. -- Mark E. Haase

> On Nov 28, 2017, at 3:31 PM, Raymond Hettinger wrote:
>
>> I also cc python-dev to see if anybody here is strongly in favor or against this inclusion.
>
> Put me down for a strong -1. The proposal would occasionally save a few keystrokes but comes at the expense of giving Python a more Perlish look and a more arcane feel.
>
> One of the things I like about Python is that I can walk non-programmers through the code and explain what it does. The examples in PEP 505 look like a step in the wrong direction. They don't "look like Python" and make me feel like I have to decrypt the code to figure out what it does.
>
>     timeout ?? local_timeout ?? global_timeout
>     'foo' in (None ?? ['foo', 'bar'])
>     requested_quantity ?? default_quantity * price
>     name?.strip()[4:].upper()
>     user?.first_name.upper()
>
> Raymond
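For reference, the first of the `??` examples can be spelled in today's Python with a small helper; `coalesce` is a hypothetical name for this sketch, not part of the PEP or the standard library:

```python
def coalesce(*values):
    """Return the first argument that is not None (else None)."""
    for v in values:
        if v is not None:
            return v
    return None

# Equivalent of: timeout ?? local_timeout ?? global_timeout
assert coalesce(None, None, 30) == 30
# Unlike `or`-chaining, falsy-but-not-None values are kept:
assert coalesce(None, 0, 30) == 0
```

The `?.`-style examples have no such simple rewrite, which is part of what the PEP is arguing about.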
Re: [Python-Dev] Guido's Python 1.0.0 Announcement from 27 Jan 1994
On 27/01/18 17:05, Oleg Broytman wrote: On Sat, Jan 27, 2018 at 08:58:54AM -0800, Senthil Kumaran wrote: Someone on HackerNews shared Guido's Python 1.0.0 announcement from 27 Jan 1994. That is, on this day, 20 years ago. 24 years ago, no? (-: Correct, so we only have one year to organise the 25th birthday party. The exact time and place for the party will obviously have to be discussed on python-ideas, or do we need a new mailing list? :-) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Backward incompatible change about docstring AST
On 27/02/18 13:37, INADA Naoki wrote: Hi, all. There is a design discussion which is a deferred blocker of 3.7. https://bugs.python.org/issue32911

## Background

A year ago, I moved the docstring in the AST from the statement list to a field of the module, class and function nodes. https://bugs.python.org/issue29463 Without this change, AST-level constant folding was complicated, because "foo" can be a docstring but "fo" + "o" can't be. This simplified some other edge cases. For example, a future import must be at the top of the module, but the docstring can come before it. Docstrings are more special than other expressions/statements. Of course, this change was backward incompatible. Tools reading/writing docstrings via the AST were broken by this change. For example, it broke PyFlakes, and PyFlakes has already fixed it. https://github.com/PyCQA/pyflakes/pull/273 Since the AST doesn't guarantee backward compatibility, we can change the AST if it's reasonable. The AST module does make some guarantees. The general advice for anyone wanting to do bytecode generation is "don't generate bytecodes directly, use the AST module." However, as long as the AST -> bytecode conversion remains the same, I think it is OK to change the source -> AST conversion. Last week, Mark Shannon reported an issue about this backward incompatibility. As he said, this change lost the lineno and column of the docstring from the AST. https://bugs.python.org/issue32911#msg312567

## Design discussion

And as he said, there are three options: https://bugs.python.org/issue32911#msg312625 It seems that there are three reasonable choices:

1. Revert to 3.6 behaviour, with the addition of a `docstring` attribute.
2. Change the docstring attribute to an AST node, possibly by modifying the grammar.
3. Do nothing.

Option 1 is backward compatible for reading docstrings. But for writing, it's not DRY or SSOT; there are two sources of the docstring. For example: `ast.Module([ast.Str("spam")], docstring="egg")`. Option 2 is worth considering.
I tried to implement this idea by adding a `DocString` statement to the AST. https://github.com/python/cpython/pull/5927/files This is my preferred option now. While it seems a large change, most of the changes revert the earlier AST changes, so it's closer to the 3.6 codebase (in particular, test_ast is very close to 3.6). In this PR, `ast.Module([ast.Str("spam")])` doesn't have a docstring, for simplicity. So it's backward incompatible for both reading and writing docstrings too. But it keeps the lineno and column of the docstring in the AST.

> Option 3 is the most conservative, because 3.7b2 has been cut now and there are some tools supporting 3.7 already.

I prefer 2 or 3. If we take 3, I don't want to do 2 in 3.8. One backward incompatible change is better than two. I agree. Whatever we do, we should stick with it. Cheers, Mark.
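For reference, in the Python versions eventually released the docstring remained a regular first statement in the body, so the position information Mark cared about is available. A quick check on a modern Python (3.8+, where string constants are `ast.Constant` nodes):

```python
import ast

tree = ast.parse('"""Module docstring."""\nx = 1')

# The docstring is an ordinary statement at the front of Module.body,
# so its lineno/col_offset are preserved like any other node:
first = tree.body[0]
assert isinstance(first, ast.Expr)
assert first.lineno == 1

# ast.get_docstring extracts it without needing a special AST field:
assert ast.get_docstring(tree) == 'Module docstring.'
```

The same applies to class and function bodies, which is what tools like PyFlakes rely on.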
Re: [Python-Dev] Symmetry arguments for API expansion
On Mon, Mar 12, 2018 at 4:49 PM, Raymond Hettinger <raymond.hettin...@gmail.com> wrote:

> What is the proposal?
> * Add an is_integer() method to int(), Decimal(), Fraction(), and Real().
> Modify Rational() to provide a default implementation.

From the issue discussion, it sounds to me as though the OP would be content with adding is_integer to int and Fraction (leaving the decimal module and the numeric tower alone).

> Starting point: Do we need this?
> * We already have a simple, traditional, portable, and readable way to
> make the test: int(x) == x

As already pointed out in the issue discussion, this solution isn't particularly portable (it'll fail for infinities and nans), and can be horribly inefficient in the case of a Decimal input with a large exponent:

    In [1]: import decimal
    In [2]: x = decimal.Decimal('1e9')
    In [3]: %timeit x == int(x)
    1.42 s ± 6.27 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
    In [4]: %timeit x == x.to_integral_value()
    230 ns ± 2.03 ns per loop (mean ± std. dev. of 7 runs, 100 loops each)

> * In the context of ints, the test x.is_integer() always returns True.
> This isn't very useful.

It's useful in the context of duck typing, which I believe is a large part of the OP's point. For a value x that's known to be *either* float or int (which is not an uncommon situation), it makes x.is_integer() valid without needing to know the specific type of x.

> * It conflicts with a design goal for the decimal module to not invent new
> functionality beyond the spec unless essential for integration with the
> rest of the language. The reasons included portability with other
> implementations and not trying to guess what the committee would have
> decided in the face of tricky questions such as whether
> Decimal('1.01').is_integer()
> should return True when the context precision is only three decimal places
> (i.e.
> whether context precision and rounding traps should be applied before
> the test and whether context flags should change after the test).

I don't believe there's any ambiguity here. The correct behaviour looks clear: the context isn't used, no flags are touched, and the method returns True if and only if the value is finite and an exact integer. This is analogous to the existing is-sNaN, is-signed, is-finite, is-zero, is-infinite tests, none of which are affected by (or affect) context. -- Mark
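A small sketch of the portability point made above: the `int(x) == x` spelling breaks on non-finite floats, while `float.is_integer` answers the question directly for every float:

```python
x = float('inf')
try:
    int(x) == x
except OverflowError:
    # int() cannot convert float infinity to an integer
    pass

# float.is_integer handles the non-finite cases without raising:
assert not float('inf').is_integer()
assert not float('nan').is_integer()
assert (2.0).is_integer()
assert not (2.5).is_integer()
```

(`int(float('nan'))` similarly raises, a ValueError rather than an OverflowError.)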
Re: [Python-Dev] Symmetry arguments for API expansion
On Mon, Mar 12, 2018 at 9:18 PM, Tim Peters wrote:
> [Guido]
> > as_integer_ratio() seems mostly cute (it has Tim Peters all
> > over it),
>
> Nope! I had nothing to do with it. I would have been -0.5 on adding
> it had I been aware at the time.

Looks like it snuck into the float type as part of the fractions.Fraction work in https://bugs.python.org/issue1682 . I couldn't find much related discussion; I suspect that the move was primarily for optimization (see https://github.com/python/cpython/commit/3ea7b41b5805c60a05e697211d0bfc14a62a19fb). Decimal.as_integer_ratio was added here: https://bugs.python.org/issue25928 . I do have significant uses of `float.as_integer_ratio` in my own code, and wouldn't enjoy seeing it being deprecated/ripped out, though I guess I'd cope.

Some on this thread have suggested that things like is_integer and as_integer_ratio should be math module functions. Any suggestions for how that might be made to work? Would we special-case the types we know about, and handle only those (so the math module would end up having to know about the fractions and decimal modules)? Or add a new magic method (e.g., __as_integer_ratio__) for each case we want to handle, like we do for math.__floor__, math.__trunc__ and math.__ceil__? Or use some form of single dispatch, so that custom types can register their own handlers?

The majority of current math module functions simply convert their arguments to a float, so a naive implementation of math.is_integer in the same style wouldn't work: it would give incorrect results for a non-integral Decimal instance that ended up getting rounded to an integral value by the float conversion. Mark
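For context, `float.as_integer_ratio` returns the exact stored binary value as a numerator/denominator pair, which is precisely what makes it useful in the `fractions.Fraction` constructor mentioned above:

```python
from fractions import Fraction

# Exactly representable values give the obvious ratio:
assert (0.25).as_integer_ratio() == (1, 4)

# The ratio reflects the exact binary value, not the decimal literal:
num, den = (0.1).as_integer_ratio()
assert Fraction(num, den) == Fraction(0.1)    # the float's exact value
assert Fraction(num, den) != Fraction(1, 10)  # ...which is not 1/10
```

This exactness is also why the method is hard to replicate cheaply outside the float type itself.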
Re: [Python-Dev] Deprecating float.is_integer()
I'd prefer to see `float.is_integer` stay. There _are_ occasions when one wants to check that a floating-point number is integral, and on those occasions, using `x.is_integer()` is the one obvious way to do it. I don't think the fact that it can be misused should be grounds for deprecation. As far as real uses: I didn't find uses of `is_integer` in our code base here at Enthought, but I did find plenty of places where it _could_ reasonably have been used, and where something less readable like `x % 1 == 0` was being used instead. For evidence that it's generally useful: it's already been noted that the decimal module uses it internally. The mpmath package defines its own "isint" function and uses it in several places: see https://github.com/fredrik-johansson/mpmath/blob/2858b1000ffdd8596defb50381dcb83de2b6/mpmath/ctx_mp_python.py#L764. MPFR also has an mpfr_integer_p predicate: http://www.mpfr.org/mpfr-current/mpfr.html#index-mpfr_005finteger_005fp. A concrete use-case: suppose you wanted to implement the beta function ( https://en.wikipedia.org/wiki/Beta_function) for real arguments in Python. You'll likely need special handling for the poles, which occur only for some negative integer arguments, so you'll need an is_integer test for those. For small positive integer arguments, you may well want the accuracy advantage that arises from computing the beta function in terms of factorials (giving a correctly-rounded result) instead of via the log of the gamma function. So again, you'll want an is_integer test to identify those cases. (Oddly enough, I found myself looking at this recently as a result of the thread about quartile definitions: there are links between the beta function, the beta distribution, and order statistics, and the (k-1/3)/(n+1/3) expression used in the recommended quartile definition comes from an approximation to the median of a beta distribution with integral parameters.) 
Or, you could look at the SciPy implementation of the beta function, which does indeed do the C equivalent of is_integer in many places: https://github.com/scipy/scipy/blob/11509c4a98edded6c59423ac44ca1b7f28fba1fd/scipy/special/cephes/beta.c#L67 In sum: it's an occasionally useful operation; there's no other obvious, readable spelling of the operation that does the right thing in all cases, and it's _already_ in Python! In general, I'd think that deprecation of an existing construct should not be done lightly, and should only be done when there's an obvious and significant benefit to that deprecation. I don't see that benefit here. -- Mark
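A rough sketch of the beta-function use case described above. This is a hedged toy implementation (not SciPy's, and ignoring the sign subtleties for negative non-integer arguments); `is_integer` is used only where the text says it is needed, to detect the poles at non-positive integer arguments:

```python
import math

def beta(a, b):
    """Toy beta function B(a, b) = gamma(a)*gamma(b)/gamma(a+b)."""
    for arg in (a, b):
        # The gamma function has poles at 0, -1, -2, ...;
        # is_integer() is the natural test for them.
        if arg <= 0 and float(arg).is_integer():
            raise ValueError(f"beta undefined at pole: {arg}")
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

# B(2, 3) = 1!*2!/4! = 1/12
assert abs(beta(2, 3) - 1 / 12) < 1e-12
```

The `x % 1 == 0` spelling mentioned earlier would also work here, but reads much less clearly at the call site.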
Re: [Python-Dev] Deprecating float.is_integer()
On Wed, Mar 21, 2018 at 8:49 PM, David Mertz wrote:
> For example, this can be true (even without reaching inf):
>
> >>> x.is_integer()
> True
> >>> (math.sqrt(x**2)).is_integer()
> False

If you have a moment to share it, I'd be interested to know what value of `x` you used to achieve this, and what system you were on. This can't happen under IEEE 754 arithmetic. -- Mark
[Python-Dev] PEP 541 - Accepted
Hi All, As the BDFL-Delegate, I’m happy to announce PEP 541 has been accepted. PEP 541 has been voted on by the packaging-wg (https://wiki.python.org/psf/PackagingWG/Charter):

- Donald Stufft
- Dustin Ingram
- Ernest W. Durbin III
- Ewa Jodlowska
- Kenneth Reitz
- Mark Mangoba
- Nathaniel J. Smith
- Nick Coghlan
- Nicole Harris
- Sumana Harihareswara

Thank you to the packaging-wg and to everyone that has contributed to PEP 541. Best regards, Mark -- Mark Mangoba | PSF IT Manager | Python Software Foundation | mmang...@python.org | python.org | Infrastructure Staff: infrastructure-st...@python.org | GPG: 2DE4 D92B 739C 649B EBB8 CCF6 DC05 E024 5F4C A0D1
[Python-Dev] Bugs Migration to OpenShift
Hi All, We’re planning to finish up the bugs.python.org migration to Red Hat OpenShift by May 14th (US Pycon Sprints). For the most part everything will stay the same, with the exception of cleaning up some old URLs and redirects from the previous hosting provider: Upfront Software. We will post a more concrete timeline here by May 1st, but wanted to share this exciting news to move bugs.python.org into a more stable and optimal state. Thank you all for your patience and feedback. A special thanks to Maciej Szulik and Red Hat for helping the PSF with this project. Best regards, Mark -- Mark Mangoba | PSF IT Manager | Python Software Foundation | mmang...@python.org | python.org | Infrastructure Staff: infrastructure-st...@python.org | GPG: 2DE4 D92B 739C 649B EBB8 CCF6 DC05 E024 5F4C A0D1
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 12/04/18 17:12, Jeroen Demeyer wrote: Dear Python developers, I would like to request a review of PEP 575, which is about changing the classes used for built-in functions and Python functions and methods. The text of the PEP can be found at The motivation of PEP 575 is to allow introspection of built-in functions and to allow functions implemented in Python to be re-implemented in C. These are excellent goals. The PEP then elaborates a complex class hierarchy, and various extensions to the C API. This adds a considerable maintenance burden and restricts future changes and optimisations to CPython. While a unified *interface* makes sense, a unified class hierarchy and implementation, IMO, do not. The hierarchy also seems to force classes that are dissimilar to share a common base-class. Bound-methods may be callables, but they are not functions; they are a pair of a function and a "self" object. As the PEP points out, Cython functions are able to mimic Python functions, why not do the same for CPython builtin-functions? As an aside, rather than unifying the classes of all non-class callables, CPython's builtin-function class could be split in two. Currently it is both a bound-method and a function. The name 'builtin_function_or_method' is a giveaway :) Consider the most common "function" and "method" classes:

    >>> class C:
    ...     def f(self): pass

    # "functions"
    >>> type(C.f)
    <class 'function'>
    >>> type(len)
    <class 'builtin_function_or_method'>
    >>> type(list.append)
    <class 'method_descriptor'>
    >>> type(int.__add__)
    <class 'wrapper_descriptor'>

    # "bound-methods"
    >>> type(C().f)
    <class 'method'>
    >>> type([].append)
    <class 'builtin_function_or_method'>
    >>> type(1 .__add__)
    <class 'method-wrapper'>

IMO, there are so many versions of "function" and "bound-method", that a unified class hierarchy and the resulting restriction to the implementation will make implementing a unified interface harder, not easier. For "functions", all that is needed is to specify an interface, say a single property "__signature__". Then all that a class that wants to be a "function" need do is have a "__signature__" property and be callable.
For "bound-methods", we should reuse the interface of 'method'; two properties, "__func__" and "__self__". Cheers, Mark.
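Mark's "pair of a function and a 'self' object" description can be checked directly: the interface he proposes to reuse already exists on `method` objects, while builtin bound methods expose only part of it:

```python
class C:
    def f(self):
        return 'called'

m = C().f                         # a bound method: function + instance pair
assert m.__func__ is C.f          # the underlying function
assert isinstance(m.__self__, C)  # the instance it is bound to
assert m() == 'called'

# A builtin bound method exposes __self__ too, but is a different class
# (the 'builtin_function_or_method' give-away mentioned above):
lst = []
assert lst.append.__self__ is lst
assert type(m) is not type(lst.append)
```

This is the asymmetry the proposed split of `builtin_function_or_method` would remove.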
Re: [Python-Dev] PEP 572: Assignment Expressions
Hi, On 17/04/18 08:46, Chris Angelico wrote: Having survived four rounds in the boxing ring at python-ideas, PEP 572 is now ready to enter the arena of python-dev. I'm very strongly opposed to this PEP. Would Python be better with two subtly different assignment operators? The answer of "no" seems self evident to me. Do we need an assignment expression at all (regardless of the chosen operator)? I think we do not. Assignment is clear at the moment largely because of the context; it can only occur at the statement level. Consequently, assignment and keyword arguments are never confused despite having the same form `name = expr`.

List comprehensions
---
The PEP uses the term "simplifying" when it really means "shortening". One example is

    stuff = [[y := f(x), x/y] for x in range(5)]

as a simplification of

    stuff = [(lambda y: [y, x/y])(f(x)) for x in range(5)]

IMO, the "simplest" form of the above is the named helper function.

    def meaningful_name(x):
        t = f(x)
        return t, x/t

    [meaningful_name(i) for i in range(5)]

is longer, but much simpler to understand. I am also concerned that the ability to put assignments anywhere allows weirdnesses like these:

    try:
        ...
    except (x := Exception) as x:
        ...

    with (x := open(...)) as x:
        ...

    def do_things(fire_missiles=False, plant_flowers=False): ...
    do_things(plant_flowers:=True)  # whoops!

It is easy to say "don't do that", but why allow it in the first place? Cheers, Mark.
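The helper-function version Mark prefers runs today without any new syntax; `f` below is a stand-in for the unspecified function in the example:

```python
def f(x):
    # Stand-in for the f(x) in the example above
    return x + 1

def meaningful_name(x):
    t = f(x)
    return [t, x / t]

stuff = [meaningful_name(i) for i in range(5)]
print(stuff[0])  # [1, 0.0]
```

The walrus version saves three lines at the cost of the name `meaningful_name`, which is exactly the shortening-versus-simplifying trade-off being argued.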
Re: [Python-Dev] Bugs Migration to OpenShift
Hi All, Victor made a good point here. After discussion with Maciej, we will postpone this migration to OpenShift until after sprints since bpo will be heavily used. Maciej and I will update everyone on the timeline after sprints. Best regards, Mark On Mon, Apr 30, 2018 at 12:54 AM, Victor Stinner wrote: > Does it mean that the final switch will happen during the sprints? > Would it be possible to do it before or after? If bugs.python.org > doesn't work during the sprint, it will be much harder to contribute > to CPython during the sprints. > > (If I misunderstood, ignore my message :-)) > > Victor > > 2018-04-29 19:07 GMT+02:00 Mark Mangoba : >> Hi All, >> >> We’re planning to finish up the bugs.python.org migration to Red Hat >> OpenShift by May 14th (US Pycon Sprints). For the most part >> everything will stay same, with the exception of cleaning up some old >> URL’s and redirects from the previous hosting provider: Upfront >> Software. >> >> We will post a more concrete timeline here by May 1st, but wanted to >> share this exciting news to move bugs.python.org into a more stable >> and optimal state. >> >> Thank you all for your patience and feedback. A special thanks to >> Maciej Szulik and Red Hat for helping the PSF with this project. 
>>
>> Best regards,
>> Mark

-- Mark Mangoba | PSF IT Manager | Python Software Foundation | mmang...@python.org | python.org | Infrastructure Staff: infrastructure-st...@python.org | GPG: 2DE4 D92B 739C 649B EBB8 CCF6 DC05 E024 5F4C A0D1
[Python-Dev] PEP 576
Hi all, Just a reminder that PEP 576 still exists as a lightweight alternative to PEP 575/580. It achieves the same goals as PEP 580 but is much smaller. https://github.com/markshannon/pep-576 Unless there is a big rush, I would like to do some experiments as to whether the new calling convention should be

    typedef PyObject *(*callptr)(PyObject *func, PyObject *const *stack,
                                 Py_ssize_t nargs, PyObject *kwnames);

or whether the increased generality of:

    typedef PyObject *(*callptr)(PyObject *func, PyObject *const *stack,
                                 Py_ssize_t nargs, PyObject *kwnames,
                                 PyTupleObject *starargs, PyObject *kwdict);

is a worthwhile enhancement. An implementation can be found here: https://github.com/python/cpython/compare/master...markshannon:pep-576-minimal Cheers, Mark.
Re: [Python-Dev] Status of PEP 484 and the typing module
On 21/05/15 16:01, Guido van Rossum wrote: Hi Mark, We're down to the last few items here. I'm CC'ing python-dev so folks can see how close we are. I'll answer point by point. On Thu, May 21, 2015 at 6:24 AM, Mark Shannon wrote: Hi, The PEP itself is looking fairly good. I hope you'll accept it at least provisionally so we can iterate over the finer points while a prototype of typing.py is in beta 1. However, I don't think that typing.py is ready yet, for a number of reasons: 1. As I've said before, there needs to be a distinction between classes and types. There is no need for Any, Generic, Generic's subtypes, or Union to subclass builtins.type. I strongly disagree. They can appear in many positions where real classes are acceptable, in particular annotations can have classes (e.g. int) or types (e.g. Union[int, str]). Why does this mean that they have to be classes? Annotations can be any object. It might help to think, not in terms of types being classes, but of classes being shorthand for the nominal type for that class (from the point of view of the checker and type geeks). So when the checker sees 'int' it treats it as Type(int). Subtyping is distinct from subclassing; Type(int) <: Union[Type(int), Type(str)] has no parallel in subclassing. There is no class that corresponds to a Union, Any or a Generic. In order to support the `class C(ParameterType[T]): pass` syntax, parametric types do indeed need to be classes, but Python has multiple inheritance, so that's not a problem: `class ParameterType(type, Type): ...` Otherwise typing.Types shouldn't be builtin.types and vice versa. I think a lot of the issues on the tracker would not have been issues had the distinction been more clearly enforced. Playing around with typing.py, it has also become clear to me that it is also important to distinguish type constructors from types. What do I mean by a type constructor? A type constructor makes types. "List" is an example of a type constructor.
It constructs types such as List[T] and List[int]. Saying that something is a List (as opposed to a list) should be rejected. The PEP actually says that plain List (etc.) is equivalent to List[Any]. (Well, at least that's the intention; it's implied by the section about the equivalence between Node() and Node[Any]().) Perhaps we should change that. Using 'List', rather than 'list' or 'List[Any]', suggests an error, or misunderstanding, to me. Is there a use case where 'List' is needed, and 'list' will not suffice? I'm assuming that the type checker knows that 'list' is a MutableSequence.

2. Usability of typing as it stands: Let's try to make a class that implements a mutable mapping.

    >>> import typing as tp
    # Make some variables.
    >>> T = tp.TypeVar('T')
    >>> K = tp.TypeVar('K')
    >>> V = tp.TypeVar('V')
    # Then make our class:
    >>> class MM(tp.MutableMapping): pass
    ...
    # Oh, that worked, but it shouldn't. MutableMapping is a type
    # constructor. It means MutableMapping[Any].
    # Let's make one
    >>> MM()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/mark/repositories/typehinting/prototyping/typing.py", line 1095, in __new__
        if _gorg(c) is Generic:
      File "/home/mark/repositories/typehinting/prototyping/typing.py", line 887, in _gorg
        while a.__origin__ is not None:
    AttributeError: type object 'Sized' has no attribute '__origin__'
    # ???

Sorry, that's a bug I introduced in literally the last change to typing.py. I will fix it. The expected behavior is

    TypeError: Can't instantiate abstract class MM with abstract methods __len__

    # Well, let's try using type variables.
    >>> class MM2(tp.MutableMapping[K, V]): pass
    ...
    >>> MM2()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/mark/repositories/typehinting/prototyping/typing.py", line 1095, in __new__
        if _gorg(c) is Generic:
      File "/home/mark/repositories/typehinting/prototyping/typing.py", line 887, in _gorg
        while a.__origin__ is not None:
    AttributeError: type object 'Sized' has no attribute '__origin__'

Ditto, and sorry. No need to apologise, I'm just a bit worried about how easy it was for me to expose this sort of bug. At this point, we have to resort to using 'Dict', which forces us to subclass 'dict', which may not be what we want as it may cause metaclass conflicts.
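In released versions of typing.py the bugs above are long fixed, and both behaviours Guido describes as expected can be observed. A sketch of the mutable mapping Mark was attempting, against the present-day API (the class names are made up):

```python
from typing import MutableMapping, TypeVar

K = TypeVar('K')
V = TypeVar('V')

class MM(MutableMapping[K, V]):
    """Dict-backed mapping: only the five abstract methods are needed."""
    def __init__(self):
        self._data = {}
    def __getitem__(self, key):
        return self._data[key]
    def __setitem__(self, key, value):
        self._data[key] = value
    def __delitem__(self, key):
        del self._data[key]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)

m = MM()
m['a'] = 1
assert len(m) == 1 and m['a'] == 1

# Leaving the abstract methods out fails the way Guido said it should:
class Empty(MutableMapping[K, V]):
    pass

try:
    Empty()
except TypeError:
    pass  # Can't instantiate abstract class Empty ...
```

No subclassing of `dict` (and hence no metaclass conflict) is required.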
[Python-Dev] PEP 484 (Type Hints) announcement
Hello all, I am pleased to announce that I am accepting PEP 484 (Type Hints). Given the proximity of the beta release I thought I would get this announcement out now, even though there are some (very) minor details to iron out. (If you want to know the details, it's all at https://github.com/ambv/typehinting) I hope that PEP 484 will be a benefit to all users of Python. I think the proposed annotation semantics and accompanying module are technically sound and I hope that they are socially acceptable to the Python community. I have long been aware that as well as a powerful, sophisticated and "production quality" language, Python is also used by many casual programmers, and as a language to introduce children to programming. I also realise that this PEP does not look like it will be any help to the part-time programmer or beginner. However, I am convinced that it will enable significant improvements to IDEs (hopefully including IDLE), static checkers and other tools. These tools will then help us all, beginners included. This PEP has been a huge amount of work, involving a lot of people. So thank you to everyone involved. If I were to list names I would inevitably miss someone out. You know who you are. Finally, if you are worried that this will make Python ugly and turn it into some sort of inferior Java, then I share your concerns, but I would like to remind you of another potential ugliness: operator overloading. C++, Perl and Haskell have operator overloading and it gets abused something rotten to produce "concise" (a.k.a. line noise) code. Python also has operator overloading and it is used sensibly, as it should be. Why? It's a cultural issue; readability matters. Python is your language, please use type-hints responsibly :) Cheers, Mark.
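A tiny illustration of the "hints for tools, not rules for the interpreter" semantics that PEP 484 standardises:

```python
def double(x: int) -> int:
    """Annotated per PEP 484; the interpreter does not enforce the hints."""
    return x * 2

assert double(3) == 6
# A static checker would flag this call, but at runtime annotations are inert:
assert double('ab') == 'abab'
# The hints remain available for IDEs and checkers via __annotations__:
assert double.__annotations__ == {'x': int, 'return': int}
```

This is the sense in which the PEP helps tools (and, through them, beginners) without changing what running programs do.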
Re: [Python-Dev] Preserving the definition order of class namespaces.
On 24/05/15 10:35, Nick Coghlan wrote: On 24 May 2015 at 15:53, Eric Snow wrote: On May 23, 2015 10:47 PM, "Guido van Rossum" wrote: How will __definition_order__ be set in the case where __prepare__ doesn't return an OrderedDict? Or where a custom metaclass's __new__ calls its superclass's __new__ with a plain dict? (I just wrote some code that does that. :-) I was planning on setting it to None if the order is not available. At the moment that's just a check for OrderedDict. Is it specifically necessary to save the order by default? Metaclasses would be able to access the ordered namespace in their __new__ method regardless, and for 3.6, I still like the __init_subclass__ hook idea proposed in PEP 487, which includes passing the original namespace to the new hook. So while I'm sold on the value of making class execution namespaces ordered by default, I'm not yet sold on the idea of *remembering* that order without opting in to doing so in the metaclass. If we leave __definition_order__ out for the time being then, for the vast majority of code, the fact that the ephemeral namespace used to evaluate the class body switched from being a basic dictionary to an ordered one would be a hidden implementation detail, rather than making all type objects a little bigger. and a little slower. Cheers, Mark.
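The opt-in approach Nick describes can be sketched with a metaclass whose `__prepare__` returns an OrderedDict and whose `__new__` records the order. This is a hedged sketch, not the PEP's implementation: the `_definition_order` attribute is an invented stand-in for the proposed `__definition_order__`.

```python
from collections import OrderedDict

class OrderedMeta(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        # Use an ordered mapping as the class-body namespace.
        return OrderedDict()

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # Remember the order in which names were bound in the class body.
        cls._definition_order = tuple(namespace)
        return cls

class Point(metaclass=OrderedMeta):
    x = 0
    y = 0
    def magnitude(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

# '__module__' and '__qualname__' come first, then the body's own names:
print(Point._definition_order)
```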
Re: [Python-Dev] Computed Goto dispatch for Python 2
On 28/05/2015 15:47, Skip Montanaro wrote: On Thu, May 28, 2015 at 9:02 AM, Brett Cannon wrote: But you could argue that "Special cases aren't special enough to break the rules" and that's what we are proposing here by claiming Python 2.7 is a special case and thus we should accept a patch that is not a one-liner change to gain some performance in a bugfix release. One can read anything he wants into the Zen. I could respond with this retort: "Although practicality beats purity," but I won't. :-) Skip That's good, otherwise you'd just be repeating what Raymond said further up this subthread two hours and one minute before you didn't say it :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] RM for 3.6?
On 01/06/2015 19:33, Benjamin Peterson wrote: On Mon, Jun 1, 2015, at 14:24, Stefan Behnel wrote: Barry Warsaw wrote on 01.06.2015 at 20:07: On May 30, 2015, at 10:09 AM, Serhiy Storchaka wrote: Isn't it time to assign a release manager for 3.6-3.7? Indeed! Please welcome Ned Deily as RM for 3.6: https://www.python.org/dev/peps/pep-0494/ Does he know already? The suck^H^H^H^H man even volunteered! Was that "volunteered" as in RM or the Comfy Chair? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Unable to build regex module against Python 3.5 32-bit
On 05/06/2015 01:37, MRAB wrote: Steve Dower's post has prompted me to look again at building the regex module for Python 3.5, 32-bit and 64-bit, using just Mingw64 and linking against python32.dll. It works! Earlier versions of Python, however, including Python 2.7, still seem to want libpython??.a. This http://bugs.python.org/issue24385 should interest you. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Obtaining stack-frames from co-routine objects
On 14/06/2015 11:50, Ben Leslie wrote: Per Nick's advice I've created enhancement proposal 245340 with an attached patch. http://bugs.python.org/issue24450 as opposed to http://bugs.python.org/issue24450#msg245340 :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] PEP 493: Redistributor guidance for Python 2.7 HTTPS
On 07/07/2015 03:10, Stephen J. Turnbull wrote: Cross-posted to redirect discussion. Replies directed to Python-Ideas. Erik Bray writes on Python-Dev: > On Mon, Jul 6, 2015 at 6:21 AM, Antoine Pitrou wrote: > > On Mon, 6 Jul 2015 14:22:46 +1000, Nick Coghlan wrote: > >> > >> The main change from the last version discussed on python-ideas > > > > Was it discussed there? That list has become totally useless, I've > > stopped following it. > > Considering that a useful discussion of a useful PEP occurred there > (not to mention other occasionally useful discussions) I'd say that > such a value judgment is not only unnecessary but also inaccurate. As you point out, the words "totally" and "useless" were unnecessary and inaccurate respectively. However, the gist of his post, that the S/N on Python-Ideas has become substantially lower in the last few months, seems accurate to me. At least two recent threads could have been continued on Python-List, where they would have benefited a lot more users, and they didn't seem profitable on Python-Ideas since it was quite evident that Those Who Know About Python were adamantly opposed to the idea as discussed in the thread, while the proponent kept pushing on that brick wall rather than seeking a way around it. I myself continue to follow Python-Ideas; Nick and other committers are posting here daily, and even Guido manages to pop up occasionally, so that may be no problem (or even a good thing if it results in educating and inviting new committers in the long run). But I think it's worth considering whether we should cultivate a bit more discipline here. Again, discussion on Python-Ideas, please. From https://mail.python.org/mailman/listinfo/python-ideas This list is to contain discussion of speculative language ideas for Python for possible inclusion into the language. If an idea gains traction it can then be discussed and honed to the point of becoming a solid proposal to put to python-dev as appropriate. 
Relative to the above I believe that far too many proposals are for trivial ideas, mainly targeted at the stdlib, that would be better suited to the main python list. As for gaining traction, it's often the complete opposite; flogging a dead horse is an understatement for some threads. Gently putting the OP down with a firm but polite "it ain't gonna happen" would save a lot of time all around. Just my £0.02p worth. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] PEP 493: Redistributor guidance for Python 2.7 HTTPS
On 07/07/2015 15:42, Antoine Pitrou wrote: Whether the time required to properly follow python-ideas is a productive involvement for the average core dev is another question. The problem I see with python-ideas is that it may select on free time more than on actual, concrete contribution... (note that python-list has a similar problem with some of its old-timers and regular ranters; the difference is that python-list has a ready alternative in StackOverflow, with perhaps higher-quality answers... it's less and less relevant in the grand scheme of things) I cannot see StackOverflow as a "ready alternative" to python-list, as questions there are strictly closed and nothing in the way of debate is allowed. I'd love to explain to some people what I think of their "perhaps higher-quality answers" but I don't have enough "reputation". Having said that I have to agree about python-list, there is very little of any substance on there nowadays. Perhaps that's because people are reading any of the 387 Python lists on gmane that are dedicated to their specific area of interest? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] How far to go with user-friendliness
On 14/07/2015 23:22, Robert Collins wrote: For clarity, I think we should:
- remove the assret check, it is I think spurious.
- add a set of functions to the mock module that should be used in preference to Mock.assert*
- mark the Mock.assert* functions as PendingDeprecation
- in 3.6 move the PendingDeprecation to Deprecated
- in 3.7 remove the Mock.assert* functions and the check for method names beginning with assert entirely.
-Rob
+1 from me as not even Baldrick could do better, see https://en.wikipedia.org/wiki/Baldrick :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
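The hazard behind this whole thread can be shown in a few lines. This is a hedged sketch: the `do_something` mock and the deliberately misspelled `asert_called_with` are invented for illustration, and the typo is chosen so it does not begin with "assert" or "assret" and therefore slips past the name check being debated.

```python
from unittest import mock

m = mock.Mock()
m.do_something(42)

# The real assertion method checks the recorded call and raises
# AssertionError on a mismatch:
m.do_something.assert_called_with(42)

# A misspelling silently creates (and calls) a new child mock instead of
# checking anything -- the test "passes" without testing:
result = m.do_something.asert_called_with(99)
print(type(result))  # a Mock, not an error
```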
Re: [Python-Dev] How far to go with user-friendliness
On 18/07/2015 01:00, Ryan Gonzalez wrote: I am tempted to reply with a slightly sarcastic message involving a cookie... I'm not tempted, I will ask, what the hell are you on about? On July 17, 2015 6:40:21 PM CDT, Antoine Pitrou wrote: Frankly, this kind of inept discussion, where a bunch of folks get hung up about an extremely minor design decision (who cares whether "assret" is being special-cased or not? in the actual world, not the fantasy world of righteous indignation and armchair architects?), is amongst the reasons why I'm stopping contributing to CPython. Keep up the good work, you're making this place totally repulsive to participate in. Every maintainer or contributor now has an army of voluntary hair-splitters to bother about, most of whom probably aren't relying on said functionality to begin with. Regards Antoine. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] How far to go with user-friendliness
On 19/07/2015 22:06, Brett Cannon wrote: On Sun, Jul 19, 2015 at 8:58 AM Ethan Furman wrote: On 07/19/2015 02:22 AM, s.krah wrote: > On Sat, 18 Jul 2015 15:35:05, Stephen J. Turnbull wrote: >> s.krah writes: >>> Sorry, that amounts to twisting my words. >> >> Let's not play the dozens here. That just extends the thread to no point. > > Indeed. I'll just filter you from now on. You may as well filter me too, then, because you are acting like an ass and I'm saying so. Is the name calling really necessary? Couldn't you have just as easily said that you disapproved of Stephen K's attitude without calling him an ass? Same goes for Stephen K's comment where he could have stated he was simply going to ignore Stephen T and be less snippy about it. There are ways to get the point across just as strongly without resorting to this sort of stuff. This whole thread has shown two problems we have on this list. One is the occasional name calling and bad attitude that we let slide in the name of blowing off steam or something. We are all adults here and can get the point across that we disapprove of something without resorting to playground antics. Plus emails can be delayed until cooler heads prevail. It's this kind of thing that leads to the need for a CoC for this list and contributing in general so that people can feel okay saying they thought a comment was out of line without retaliation for it. The other problem is letting threads drag on needlessly. The longer a thread drags on, the greater the chance someone is going to say something they regret. It can also lead to some people like Antoine feeling like their time is being wasted and become frustrated. 
I think in this instance debate should have been cut sooner when no clear consensus was being reached to force a reversal of the patch, and then have someone say politely that a core dev who is the listed expert on a module made a call and if someone disliked it they could produce a patch and propose it on the issue tracker to see if they could change someone's mind (I believe both Nick and Ethan have made the same point). Our niceness can be to a fault when no one is willing to step up and simply say "this thread is in a stalemate and nothing new is being added, please move it to the issue tracker if you wish to discuss further where you can propose a patch", and then we just need to be good about telling people to move the discussion to the issue tracker if they keep replying. There is absolutely no reason we can't keep discussions cordial, friendly, and on-point on this list and prevent this sort of debacle from occurring again. +infinity -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
[Python-Dev] Building python 2.7.10 for Windows from source
I have been using Python for some time but it's been a decade since I've tried to build it from source, back in the 2.4 days. Things seem to have gotten a little more complicated now. I've read through the PCBuild/README file and got most stuff compiling. I find it a little odd that there are special instructions for building the release version of tcl/tk. Is that what the developers actually do when they cut a release, or is there some other, top-level script that does this automatically? It just seems odd. Anyhow, my specific question is around the distutils wininst stubs, provided as binaries in the release tarball. Where can I find the source files that those binaries are built from? Many thanks, Mark.
Re: [Python-Dev] Status on PEP-431 Timezones
On 25/07/2015 00:06, ISAAC J SCHWABACHER wrote: I got to "Daylight Saving Time is a red herring," and stopped reading. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] PEP 447 (type.__getdescriptor__)
Hi, On 22/07/15 09:25, Ronald Oussoren wrote: > Hi, > > Another summer with another EuroPython, which means it's time again to > try to revive PEP 447… IMO, there are two main issues with the PEP and implementation. 1. The implementation as outlined in the PEP is infinitely recursive, since the lookup of "__getdescriptor__" on type must necessarily call type.__getdescriptor__. The implementation (in C) special-cases classes that inherit "__getdescriptor__" from type. This special-casing should be mentioned in the PEP. 2. The actual implementation in C does not account for the case where the class of a metaclass implements __getdescriptor__ and that method returns a value when called with "__getdescriptor__" as the argument. Why was "__getattribute_super__" rejected as an alternative? No reason is given. "__getattribute_super__" has none of the problems listed above. Making super(t, obj) delegate to t.__super__(obj) seems consistent with other builtin methods/classes and doesn't add corner cases to the already complex implementation of PyType_Lookup(). Cheers, Mark
Re: [Python-Dev] PEP 447 (type.__getdescriptor__)
> On 26 July 2015 at 10:41 Ronald Oussoren wrote: > > > > > On 26 Jul 2015, at 09:14, Ronald Oussoren wrote: > > > > > >> On 25 Jul 2015, at 17:39, Mark Shannon >> <mailto:m...@hotpy.org>> wrote: > >> > >> Hi, > >> > >> On 22/07/15 09:25, Ronald Oussoren wrote:> Hi, > >>> > >>> Another summer with another EuroPython, which means its time again to > >>> try to revive PEP 447… > >>> > >> > >> IMO, there are two main issues with the PEP and implementation. > >> > >> 1. The implementation as outlined in the PEP is infinitely recursive, since > >> the > >> lookup of "__getdescriptor__" on type must necessarily call > >> type.__getdescriptor__. > >> The implementation (in C) special cases classes that inherit > >> "__getdescriptor__" > >> from type. This special casing should be mentioned in the PEP. > > > > Sure. An alternative is to slightly change the the PEP: use > > __getdescriptor__ when > > present and directly peek into __dict__ when it is not, and then remove the > > default > > __getdescriptor__. > > > > The reason I didn’t do this in the PEP is that I prefer a programming model > > where > > I can explicitly call the default behaviour. > > I’m not sure there is a problem after all (but am willing to use the > alternative I describe above), > although that might be because I’m too much focussed on CPython semantics. > > The __getdescriptor__ method is a slot in the type object and because of that > the > normal attribute lookup mechanism is side-stepped for methods implemented in > C. A > __getdescriptor__ that is implemented on Python is looked up the normal way by > the > C function that gets added to the type struct for such methods, but that’s not > a problem for > type itself. 
> > That’s not new for __getdescriptor__ but happens for most other special > methods as well, > as I noted in my previous mail, and also happens for the __dict__ lookup > that’s currently > used (t.__dict__ is an attribute and should be looked up using > __getattribute__, …) "__getdescriptor__" is fundamentally different from "__getattribute__" in that it is defined in terms of itself. object.__getattribute__ is defined in terms of type.__getattribute__, but type.__getattribute__ just does dictionary lookups. However, defining type.__getattribute__ in terms of __getdescriptor__ causes a circularity, as __getdescriptor__ has to be looked up on a type. So not only must the cycle be broken by special-casing "type", but "__getdescriptor__" can be defined not only by a subclass, but also by a metaclass that uses "__getdescriptor__" to define "__getdescriptor__" on the class (and so on for meta-meta-classes, etc.). Cheers, Mark
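The lookup scheme under discussion can be sketched in pure Python (invented names; this is not CPython's actual C implementation, and a real version would also scan the metatype's MRO). The sketch shows exactly where the circularity sits: finding `__getdescriptor__` on the metatype must itself avoid normal attribute lookup, here by peeking directly into the metatype's `__dict__`.

```python
def pep447_lookup(cls, name):
    """Sketch of PEP 447 lookup: each class on the MRO is asked for the
    attribute via its metatype's __getdescriptor__ hook."""
    for klass in cls.__mro__:
        meta = type(klass)
        # Peek into the metatype's __dict__ directly -- using normal
        # lookup here would recurse into __getdescriptor__ itself.
        getdesc = meta.__dict__.get("__getdescriptor__")
        if getdesc is None:
            # Default behaviour: plain dictionary lookup, as type does today.
            if name in klass.__dict__:
                return klass.__dict__[name]
        else:
            try:
                return getdesc(klass, name)
            except AttributeError:
                pass
    raise AttributeError(name)

class Virtual(type):
    def __getdescriptor__(cls, name):
        if name == "answer":
            return 42
        raise AttributeError(name)

class C(metaclass=Virtual):
    pass

print(pep447_lookup(C, "answer"))  # 42 -- an attribute with no __dict__ entry
```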
Re: [Python-Dev] Burning down the backlog.
On 26/07/2015 22:59, Paul Moore wrote: On 26 July 2015 at 16:39, Berker Peksağ wrote: I'm not actually clear what "Commit Review" status means. I did do a quick check of the dev guide, and couldn't come up with anything, https://docs.python.org/devguide/triaging.html#stage Thanks, I missed that. The patches I checked seemed to have been committed and were still at commit review, though. Doesn't the roundup robot update the stage when there's a commit? (Presumably not, and people forget to do so too). Paul I wouldn't know. I certainly believe that the more time we spend assisting Cannon, Coghlan & Co on the core workflow, the quicker, in the medium to long term, we put the backlog of issues to bed. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Building python 2.7.10 for Windows from source
Thanks, that got me a bit further. Now I'm wondering how I figure out which version of tcl, tk and Tix actually got built with the 2.7.10 installer. Tools\buildbot\external.bat conflicts with the versions found in PC\build_tkinter.py, and the version in PC\VS8.0\build_tkinter.py. I am assuming the buildbot script is the one that's actually used? I would submit a patch to clean some of this up, but it sounds as though it's in the pipeline. On Fri, Jul 24, 2015 at 2:46 PM, Zachary Ware wrote: > On Jul 24, 2015 8:30 AM, "Mark Kelley" wrote: >> >> I have been using Python for some time but it's been a decade since >> I've tried to build it from source, back in the 2.4 days. Things seem >> to have gotten a little more complicated now. >> >> I've read through the PCBuild/README file and got most stuff >> compiling. I find it a little odd that there are special instructions >> for building the release version of tcl/tk. Is that what the >> developers actually do when they cut a release, or is there some >> other, top-level script that does this automatically? It just seems >> odd. > > That used to be standard procedure, yes. However, I just recently backported > the project files from 3.5, which include project files for building Tcl/Tk > and Tix, in both Debug and Release configurations, so I may have missed some > stuff that could be removed from PCbuild/readme.txt. You do need some extra > stuff to build 2.7 with its new project files, though (which I know is now > covered in readme.txt). There hasn't been a release with those project files > yet though, they're just in the hg repo. > >> Anyhow, my specific question is around the distutils wininst stubs, >> provided as binaries in the release tarball. Where can I find the >> source files that those binaries are built from? > > I believe the source for those is in PC/bdist_wininst/, or some very similar > path. 
> > Hope this helps, > -- > Zach > (On a phone)
Re: [Python-Dev] Status on PEP-431 Timezones
On 27/07/2015 15:45, Nick Coghlan wrote: On 28 July 2015 at 00:27, Steve Dower wrote: Am I the only one feeling like this entire thread should be moved to python-ideas at this point? Since this is an area where the discussion of implementation details and the discussion of the developer experience can easily end up at cross purposes, I'm wondering if there may be value in actually splitting those two discussions into different venues by creating a datetime-sig, and specifically inviting the pytz and dateutil developers to participate in the SIG as well. The traffic on a similarly niche group like import-sig is only intermittent, but it means that by the time we bring suggestions to python-ideas or python-dev, we've already thrashed out the low level arcana and know that whatever we're proposing *can* be made to work, leaving the core lists to focus on the question of whether or not the change *should* be made. Whether or not to do that would be up to the folks with a specific interest in working with dates and times, though. Cheers, Nick. Would it be worth doing a straw poll to gauge how many people really need this level of complexity (from my perspective, anyway)? I've used datetimes a lot, but I don't even need naive timezones, completely dumb suits me. Alternatively just go ahead, knowing that if the proposal isn't accepted into the stdlib it can at least go on pypi. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 01:58, Tres Seaver wrote: On 07/27/2015 06:03 PM, Tim Peters wrote: Even if days weren't a distinguished unit for timedelta, I'd still much rather write, e.g., timedelta(days=5, hours=3) than timedelta(hours=123) or timedelta(hours=5*24 + 3) etc. The intent of the first spelling is obvious at a glance. From a human's perspective, "a day from now" is always potentially ambiguous, just like "a month from now" or "a year from now", whereas "24 hours from now" is never so. In a given application, a user who doesn't care can always write a helper function to generate hours; in an application whose developer *does* care, the 'days' argument to timedelta in its current form does *not* help achieve her goal: it is an attractive nuisance she will have to learn to avoid. To me a day is precisely 24 hours, no more, no less. I have no interest in messing about with daylight savings of 30 minutes, one hour, two hours or any other variant that I've not heard about. In my mission critical code, which I use to predict my cashflow, I use code such as: timedelta(days=14) Is somebody now going to tell me that this isn't actually two weeks? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 03:15, Tim Peters wrote: [Mark Lawrence] To me a day is precisely 24 hours, no more, no less. I have no interest in messing about with daylight savings of 30 minutes, one hour, two hours or any other variant that I've not heard about. In my mission critical code, which I use to predict my cashflow, I use code such as: timedelta(days=14) Is somebody now going to tell me that this isn't actually two weeks? Precisely define what "two weeks" means, and then someone can answer. One week == 7 days == 7 * 24 hours Two weeks = 2 * (one week) The timedelta in question represents precisely 14 24-hour days, and ignores the possibility that some day in there may suffer a leap second. As I've said elsewhere I've no interest in DST, at least right here, right now, let alone leap seconds. When I run my cashflow forecast the balance in my bank account one year from today isn't going to be influenced by UK clocks falling back to GMT at the end of October and on to BST at the end of next March. It remains unclear to me which of those outcomes _you_ consider to be "actually 14 days". But my bet is that you like what Python already does here (because "tz-naive arithmetic" is exactly what _I_ want in all my financial code). Correct. What I would like to know is how many people are in my position, how many people are in the situation of needing every possible combination of dates, times, daylight saving, local time zone rules and anything else you can think of under the sun, and how many are on the scale somewhere in between these two extremes. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
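Mark's reading is exactly what datetime's tz-naive arithmetic implements: `timedelta(days=14)` is a fixed 14 × 24 hours, and adding it to a naive datetime moves the calendar forward 14 days without consulting any time zone rules. A small sketch (the dates are chosen for illustration; UK clocks went forward to BST on 29 March 2015):

```python
from datetime import datetime, timedelta

two_weeks = timedelta(days=14)
assert two_weeks == timedelta(weeks=2) == timedelta(hours=14 * 24)

# Naive (tz-free) arithmetic: the calendar moves on 14 days and the
# wall-clock time is untouched, even across the DST transition.
start = datetime(2015, 3, 22, 9, 0)
later = start + two_weeks
print(later)  # 2015-04-05 09:00:00
```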
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 06:21, Lennart Regebro wrote: On Tue, Jul 28, 2015 at 3:22 AM, Mark Lawrence wrote: To me a day is precisely 24 hours, no more, no less. OK. In my mission critical code, which I use to predict my cashflow, I use code such as: timedelta(days=14) Is somebody now going to tell me that this isn't actually two weeks? Yes, I'm telling you that, now. The two claims "One day is always precisely 24 hours" and "14 days is two weeks" are not both true. You have to choose one. //Lennart You can tell me, but as far as I'm concerned in my application both are true, so I don't have to choose one. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 07:54, Lennart Regebro wrote: On Tue, Jul 28, 2015 at 8:11 AM, Tim Peters wrote: [Tim] timedelta objects only store days, seconds, and microseconds, [Lennart Regebro] Except that they don't actually store days. They store 24 hour periods, Not really. A timedelta is truly an integer number of microseconds, and that's all. That's what I said. Timedeltas internally assume that 1 day is 24 hours. Or 86,400,000,000 microseconds. That's the assumption internally in the timedelta object. The problem with that being that in the real world that's not true. In my real world it is. We clearly have parallel worlds. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
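The normalization both sides are describing is easy to see directly: timedelta reduces whatever it is given to days, seconds, and microseconds, fixing 1 day == 24 hours == 86,400 seconds by definition.

```python
from datetime import timedelta

# Everything is normalised down to days, seconds and microseconds:
d = timedelta(hours=123)
print(d.days, d.seconds, d.microseconds)  # 5 10800 0
print(d)                                  # 5 days, 3:00:00

# Consequently these are all the same duration to timedelta:
assert timedelta(days=1) == timedelta(hours=24) == timedelta(seconds=86400)
```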
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 05:26, Tim Peters wrote: Python's datetime supports microsecond precision. Mere seconds are for wimps ;-) Microseconds are for wimps https://bugs.python.org/issue22117 :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 13:35, Lennart Regebro wrote: On Tue, Jul 28, 2015 at 1:55 PM, Mark Lawrence wrote: One week == 7 days == 7 * 24 hours Two weeks = 2 * (one week) Right, and that of course is not true in actual reality. I know you are not interested in DST, but with a timezone that has DST, two times a year, the above statement is wrong. Tim asked for my definition of two weeks so I've given it. With respect to that in reality this is true, for me, with my application, making my statement above correct. For my application we could go from GMT to BST and back on successive days throughout the year and it wouldn't make any difference. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence
Re: [Python-Dev] Building python 2.7.10 for Windows from source
I got my MSI built, after numerous modifications to the various build scripts. The installed file set bears little resemblance to the official release for the same version, which is a bit of a fail for the Open-source principle, but it seems nobody cares, so I'll split.
Re: [Python-Dev] Status on PEP-431 Timezones
On 28/07/2015 16:47, Chris Angelico wrote:
> On Tue, Jul 28, 2015 at 10:06 PM, Mark Lawrence wrote:
>> On 28/07/2015 06:21, Lennart Regebro wrote:
>>> On Tue, Jul 28, 2015 at 3:22 AM, Mark Lawrence wrote:
>>>> To me a day is precisely 24 hours, no more, no less. In my mission
>>>> critical code, which I use to predict my cashflow, I use code such as:
>>>>
>>>>     timedelta(days=14)
>>>>
>>>> Is somebody now going to tell me that this isn't actually two weeks?
>>>
>>> Yes, I'm telling you that, now. The two claims "One day is always
>>> precisely 24 hours" and "14 days is two weeks" are not both true. You
>>> have to choose one.
>>
>> You can tell me, but as far as I'm concerned in my application both are
>> true, so I don't have to choose one.
>
> (and subsequently)
>
>> Tim asked for my definition of two weeks so I've given it. With respect
>> to that in reality this is true, for me, with my application, making my
>> statement above correct. For my application we could go from GMT to BST
>> and back on successive days throughout the year and it wouldn't make
>> any difference.
>
> When your clocks go from winter time to summer time, there are two possibilities:
>
> 1) Your application says "days=14" and actually gets 167 or 169 hours
> 2) Your application says "days=14" and ends up with the time changing

My cashflow forecast doesn't give two hoots how many hours there are in two weeks, which I've defined elsewhere. It doesn't care if the time changes. Neither does it care how many days there are in a month, for that matter. It can even cope with plotting data with a tick on the 29th of each month when we have a leap year and February is included in the plot, thanks to dateutil's rrule.

> (Or equivalently if you say "days=1" or "hours=24" or whatever.) A naive
> declaration of "two weeks later" could conceivably mean either. When I
> schedule my weekly Dungeons & Dragons sessions, they are officially
> based on UTC [1], which means that one session starts 168 hours after
> the previous one. Currently, they happen when my local clock reads noon;
> in summer, my local clock will read 1PM. Was it still "a week later"
> when it was noon once and 1PM the next time?

Don't know and don't care; your application is not working in the same way that mine does.

> Conversely, my (also weekly) Thinkful open sessions are scheduled every
> week at 8AM US Eastern time (America/New_York). For anyone on the
> Atlantic coast of the US, they will occur every Wednesday and the clock
> will read 08:00 every time. Sometimes, one will happen 167 hours after
> the previous one, or 169 hours afterwards. Is that "a week later"?

Ditto my above remark.

> Your application has to make a choice between these two interpretations.
> This is a fundamental choice that MUST be made. Trying to pretend that
> your application doesn't care is like trying to say that Code Page 437
> is good enough for all your work, and you can safely assume that one
> byte is one character is one byte. No.
>
> ChrisA
>
> [1] Leap seconds aren't significant, as people are often several minutes
> early or late, so UTC/UT1/GMT/TAI are all effectively equivalent.

Precisely my point. For me hours are not significant, days are.

--
My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language.

Mark Lawrence
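The two interpretations Chris describes can be seen directly in the stdlib. Aware-datetime arithmetic with zoneinfo (Python 3.9+) picks interpretation 2: "days=14" preserves the wall-clock time, so the elapsed real time shrinks by an hour across the spring-forward. A sketch using the 2015 GMT-to-BST transition (clocks went forward on 29 March 2015):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

london = ZoneInfo("Europe/London")
start = datetime(2015, 3, 22, 12, 0, tzinfo=london)  # noon, GMT
later = start + timedelta(days=14)                   # wall-clock arithmetic

# Interpretation 2: the wall clock still reads noon, 14 days later
# (now in BST).
assert later.hour == 12

# But the elapsed real time is only 335 hours, not 14 * 24 = 336,
# because the clocks went forward one hour on 29 March 2015.
elapsed = later.astimezone(timezone.utc) - start.astimezone(timezone.utc)
assert elapsed == timedelta(hours=335)
```

Interpretation 1 (exactly 336 elapsed hours, with the wall clock drifting to 1PM) is what you get by doing the same addition in UTC and converting back to Europe/London afterwards.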
[Python-Dev] Issues not responded to.
There are over 400 issues on the bug tracker that have not had a response to the initial message, roughly half of them from within the last eight months alone. Is there a (relatively) simple way that we can share these out between us, to sort those that are likely to need dealing with in the medium to longer term from the simple short-term ones, e.g. very easy typo fixes?

--
My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language.

Mark Lawrence