Re: [Python-Dev] [snakebite] snakebite for GSoC?
On Thu, Mar 19, 2009 at 10:32:03AM -0700, ajaksu wrote:
> Does anyone have good ideas for assigning students to snakebite? Is it
> too early?

Perhaps a little too early, python-dev@ won't know anything about
Snakebite yet as I haven't publicly announced it there ;-) Watch this
space closer to PyCon. FWIW, though, we're planning for Snakebite to be
*very* involved with GSoC/GHOP.

> I think the client-side 'Snakebite daemon' and server-side stuff
> described at http://tinyurl.com/beyond-buildbot would be great
> projects.

Indeed.

    Trent.
[Python-Dev] Proposal: new list function: pack
I propose a new function for lists that packs values of a list and slides
over them, so we can do things like this:

for i, j, k in pack(range(10), 3, partialend=False):
    print i, j, k

I propose this because I often need pack-and-slide behaviour over lists,
and this combines the two in a generator way.

def pack(l, size=2, slide=2, partialend=True):
    length = len(l)
    for p in range(0, length - size, slide):
        def packet():
            for i in range(size):
                yield l[p + i]
        yield packet()
        p = p + slide
    if partialend or length - p == size:
        def packet():
            for i in range(length - p):
                yield l[p + i]
        yield packet()

a = range(10)
print a
print 'pack(a, 2, 2, True):', [list(p) for p in pack(a, 2, 2, True)]
print 'pack(a, 2, 2, False):', [list(p) for p in pack(a, 2, 2, False)]
print 'pack(a, 2, 3, True):', [list(p) for p in pack(a, 2, 3, True)]
print 'pack(a, 2, 3, False):', [list(p) for p in pack(a, 2, 3, False)]
print 'pack(a, 3, 2, True):', [list(p) for p in pack(a, 3, 2, True)]
print 'pack(a, 3, 2, False):', [list(p) for p in pack(a, 3, 2, False)]
print 'pack(a, 3, 3, True):', [list(p) for p in pack(a, 3, 3, True)]
print 'pack(a, 3, 3, False):', [list(p) for p in pack(a, 3, 3, False)]

paul bedaride
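For comparison, a more compact sketch with the same windowing behaviour,
built on list slicing. This is one reading of the proposal, not Paul's
code; it yields lists rather than generators, and edge cases such as an
input shorter than size may differ:

def pack(iterable, size=2, slide=2, partialend=True):
    seq = list(iterable)
    for start in range(0, len(seq), slide):
        window = seq[start:start + size]
        # drop a short trailing window unless partial ends are requested
        if partialend or len(window) == size:
            yield window

for i, j, k in pack(range(10), 3, 3, partialend=False):
    print i, j, k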
Re: [Python-Dev] [snakebite] Re: snakebite for GSoC?
On Fri, Mar 20, 2009 at 07:37:40AM +, Trent Nelson wrote:
->
-> On Thu, Mar 19, 2009 at 10:32:03AM -0700, ajaksu wrote:
-> > Does anyone have good ideas for assigning students to snakebite? Is it
-> > too early?
->
-> Perhaps a little too early, python-dev@ won't know anything about
-> Snakebite yet as I haven't publicly announced it there ;-) Watch
-> this space closer to PyCon.

I do have a snakebite-motivated project, listed here:

http://ivory.idyll.org/blog/mar-09/gsoc-projects.html

(#7)

Right now an independent study student is building something, but he
can't work on it over the summer, so continuing it in various ways could
be a GSoC project.

cheers,
--titus
--
C. Titus Brown, c...@msu.edu
Re: [Python-Dev] Proposal: new list function: pack
On Fri, Mar 20, 2009, paul bedaride wrote:
>
> I propose a new function for lists that packs values of a list and
> slides over them:

Please switch this discussion to python-ideas.
--
Aahz (a...@pythoncraft.com)          <*>          http://www.pythoncraft.com/

"Programming language design is not a rational science. Most reasoning
about it is at best rationalization of gut feelings, and at worst plain
wrong."  --GvR, python-ideas, 2009-3-1
Re: [Python-Dev] Proposal: new list function: pack
On Fri, 20 Mar 2009, paul bedaride wrote:

> I propose a new function for lists that packs values of a list and slides
> over them, so we can do things like this:
>
> for i, j, k in pack(range(10), 3, partialend=False):
>     print i, j, k
>
> I propose this because I often need pack-and-slide behaviour over lists,
> and this combines the two in a generator way.

See the Python documentation for zip():

http://docs.python.org/library/functions.html#zip

And this article in which somebody independently rediscovers the idea:

http://drj11.wordpress.com/2009/01/28/my-python-dream-about-groups/

Summary: except for the "partialend" parameter, this can already be done
in a single line. It is not for me to say whether this nevertheless would
be useful as a library routine (if only perhaps to make it easy to specify
"partialend" explicitly).

It seems to me that sometimes one would want izip instead of zip. And I
think you could get the effect of partialend=True in 2.6 by using
izip_longest (except with an iterator result rather than a list).

> def pack(l, size=2, slide=2, partialend=True):
>     length = len(l)
>     for p in range(0, length - size, slide):
>         def packet():
>             for i in range(size):
>                 yield l[p + i]
>         yield packet()
>         p = p + slide
>     if partialend or length - p == size:
>         def packet():
>             for i in range(length - p):
>                 yield l[p + i]
>         yield packet()

Isaac Morland                       CSCF Web Guru
DC 2554C, x36650                    WWW Software Specialist
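Concretely, the "single line" referred to above, plus the izip_longest
variant for keeping the final partial group, might look like this (a
sketch against the Python 2.6 itertools; note that izip_longest pads with
a fillvalue rather than truncating, which is the main difference from
partialend=True):

from itertools import izip, izip_longest

l = range(10)

# Non-overlapping groups of 3, dropping the short group at the end:
print list(izip(*[iter(l)] * 3))
# -> [(0, 1, 2), (3, 4, 5), (6, 7, 8)]

# Keeping the final partial group, padded with None:
print list(izip_longest(*[iter(l)] * 3))
# -> [(0, 1, 2), (3, 4, 5), (6, 7, 8), (9, None, None)]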
Re: [Python-Dev] Proposal: new list function: pack
Isaac Morland wrote:
>> I propose this because I often need pack-and-slide behaviour over lists,
>> and this combines the two in a generator way.

I've written functions with a subset of this functionality on more than
one occasion. Having it in itertools looks like it would be useful to a
lot of people.

> See the Python documentation for zip():
>
> http://docs.python.org/library/functions.html#zip

zip can be used to achieve this purpose, but only with serious
itertools-fu. If I want to iterate over a list [1, 2, 3, 4] looking at
pairs (1, 2) and (3, 4), it would be much nicer to write:

for a, b in itertools.pack(l, 2):
    ...

than

for a, b in itertools.izip(*[iter(l)]*2):
    ...

which is what the zip documentation proposes. The former would be clear
to anyone looking at the documentation of "pack" (and maybe even without
it, if we choose a better name), while the latter requires quite some
deciphering, followed by carefully looking at izip's documentation to see
that it's actually legal to rely on argument evaluation order and on not
peeking at the iterables, like that code does.

izip is not the only contender for this pattern; something similar is
possible using groupby, but it's hard to make it fit into an easily
understandable line either. This is the shortest I came up with:

def pack(iterable, n):
    cycler = (i for i in itertools.count() for j in xrange(n))
    return (g for k, g in itertools.groupby(iterable, lambda x: cycler.next()))

This has the nice property that it returns iterables rather than tuples,
although tuples are probably good enough (they seem to be good enough for
izip). The name "pack" is a bit too cryptic, even by itertools standards,
so it might be better to choose a name that conveys the intention of
returning "groups of n adjacent elements" (group_adjacent?). To fit with
the rest of itertools, and to be really useful, the function shouldn't
insist on sequences, but should accept any iterable.

> http://drj11.wordpress.com/2009/01/28/my-python-dream-about-groups/

That posting ends with:

"""
It still scares me a bit. This code is obviously ridiculous. I can’t help
feeling I’ve missed a more Pythonic way of doing it.
"""

Looking at izip(*[iter(l)]*n), I tend to agree.
[Python-Dev] Summary of Python tracker Issues
ACTIVITY SUMMARY (03/13/09 - 03/20/09)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue
number. Do NOT respond to this message.

2392 open (+27) / 14957 closed (+14) / 17349 total (+41)

Open issues with patches:        836
Average duration of open issues: 661 days.
Median duration of open issues:  402 days.

Open Issues Breakdown
   open  2362 (+25)
pending    30 ( +2)

Issues Created Or Reopened (44)
_______________________________

PATCH: Armin's attribute lookup caching for 3.0                  03/14/09
       http://bugs.python.org/issue1568    reopened   pitrou        patch

merge json library with latest simplejson 2.0.x                  03/17/09
       http://bugs.python.org/issue4136    reopened   pitrou        patch

profile doesn't support non-UTF8 source code                     03/20/09
       http://bugs.python.org/issue4282    reopened   haypo         patch, needs review

doc copyedits                                                    03/13/09
CLOSED http://bugs.python.org/issue5486    created    dsm001        patch

Parts of Tkinter missing (but not when running from IDLE)        03/14/09
CLOSED http://bugs.python.org/issue5487    created    oc

nb_inplace_divide slot is missing in docs                        03/14/09
CLOSED http://bugs.python.org/issue5488    created    donlorenzo    patch

Broken DLL                                                       03/14/09
CLOSED http://bugs.python.org/issue5489    created    JCoder

Broken DLL                                                       03/14/09
CLOSED http://bugs.python.org/issue5490    created    JCoder

Clarify contextlib.nested semantics                              03/15/09
CLOSED http://bugs.python.org/issue5491    created    ncoghlan

Error on leaving IDLE with quit() or exit() under Linux          03/15/09
       http://bugs.python.org/issue5492    created    gerluijten

Rephrasing the doc of object.__nonzero__                         03/15/09
CLOSED http://bugs.python.org/issue5493    created    ezio.melotti

Failure in test_httpservers on Linux                             03/15/09
       http://bugs.python.org/issue5494    created    gerluijten

ValueError exception of tuple.index(x) gives imprecise error mes 03/15/09
CLOSED http://bugs.python.org/issue5495    created    Retro         patch

codecs.lookup docstring is misleading                            03/15/09
CLOSED http://bugs.python.org/issue5496    created    exarkun

openssl compileerror with original source                        03/17/09
       http://bugs.python.org/issue5497    created    ocean-city    patch

Can SGMLParser properly handle tags?                             03/17/09
       http://bugs.python.org/issue5498    created    once-off

only accept byte for getarg('c') and unicode for getarg('C')     03/17/09
       http://bugs.python.org/issue5499    created    haypo         patch

tarfile: path problem in arcname under windows                   03/17/09
       http://bugs.python.org/issue5500    created    ellioh

Update multiprocessing docs re: freeze_support                   03/17/09
       http://bugs.python.org/issue5501    created    bcorfman
[Python-Dev] Py_ssize_t support for ctypes arrays and pointers
I received some (so far unfinished) patches for ctypes that will allow
creating arrays with more than 2**31 elements and that will eventually
also support pointer offsets larger than an int, on 64-bit platforms.

Since I do not have a machine with that much memory: does one of the
buildbots allow running tests for this feature, or do I have to wait for
the snakebite farm?
--
Thanks,
Thomas
Re: [Python-Dev] Py_ssize_t support for ctypes arrays and pointers
Will testing an array of chars do? You can easily allocate 4-5 GB on a
regular 64-bit PC, even with only 1 GB of RAM, given that your swap space
is sufficient. If you want to exercise your array, then you might get
some paging, but it's not completely impossible.

K

-----Original Message-----
From: python-dev-bounces+kristjan=ccpgames@python.org
[mailto:python-dev-bounces+kristjan=ccpgames@python.org] On Behalf Of Thomas Heller
Sent: 20. mars 2009 19:01
To: python-dev@python.org
Subject: [Python-Dev] Py_ssize_t support for ctypes arrays and pointers

I received some (so far unfinished) patches for ctypes that will allow
creating arrays with more than 2**31 elements and that will eventually
also support pointer offsets larger than an int, on 64-bit platforms.

Since I do not have a machine with that much memory: does one of the
buildbots allow running tests for this feature, or do I have to wait for
the snakebite farm?
--
Thanks,
Thomas
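For what it's worth, a smoke test along those lines might look something
like this (an untested sketch: it assumes a 64-bit build with the pending
ctypes patches applied, and the probe values are arbitrary):

import ctypes

N = 2**31 + 10                 # just past the old 32-bit int limit
buf = (ctypes.c_char * N)()    # ~2 GB char array; presumably fails without the patches
buf[0] = 'a'
buf[N - 1] = 'z'
assert buf[0] == 'a' and buf[N - 1] == 'z'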
[Python-Dev] Core projects for Summer of Code
> Summer of Code is ramping up. Every year the common complaint is that not
> enough Python core projects get proposed by students, and of course a big
> reason for that is often the only encouragement we offer prospective
> students is a link to the PEP index.
>
> The challenge is finding project ideas for them that could reasonably
> occupy them for the entire Summer and which the results of their work can
> be demonstrated. They're being paid for specific projects so "Spend the
> Summer fixing bugs on the tracker" is a no-go, and Google has outlined
> that Summer of Code is about code, not documentation.

Improve doctest by allowing it to be aware of nested test scopes, such
that a variable defined at "class-level scope" (i.e. the variable b
defined in the class-level doctest """>>> b = Bag("abacab")""") can be
used in "method-level scopes" without re-defining it every time for each
method's doctest (each method would reset the given variable, if used, to
its original state rather than carry live mutations between equal-level
scopes).

This would be a great improvement for doctest in my opinion--both in
ease-of-use and in reduction of redundant, error-prone ("did you define
your test variable the same in each method?") code--as well as other
benefits.

Appreciate any consideration...

marcos
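A minimal illustration of the redundancy being described, using the
poster's Bag example (this sketches current doctest behaviour, not the
proposed feature; the class body is invented for illustration):

class Bag(object):
    """
    >>> b = Bag("abacab")        # defined once at class-level scope...
    """
    def __init__(self, letters):
        self.letters = letters

    def count(self, letter):
        """
        >>> b = Bag("abacab")    # ...yet each method doctest must redefine it
        >>> b.count("a")
        3
        """
        return self.letters.count(letter)

    def size(self):
        """
        >>> b = Bag("abacab")    # and redefine it again here
        >>> b.size()
        6
        """
        return len(self.letters)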
[Python-Dev] Multiprocessing on Solaris
Hello fellow co-developers!

Today I was in contact with a Python user who tried to compile
pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
to run because Solaris is missing two features (HAVE_FD_TRANSFER and
HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
test the settings? Neither Python 2.6 nor my backup have the correct
settings for Solaris.

Christian
Re: [Python-Dev] Multiprocessing on Solaris
> Today I was in contact with a Python user who tried to compile
> pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
> to run because Solaris is missing two features (HAVE_FD_TRANSFER and
> HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
> test the settings? Neither Python 2.6 nor my backup have the correct
> settings for Solaris.

I don't quite understand what it is that you want tested - what
"settings"?

Most likely, the answer is yes, I can test stuff on Solaris (both SPARC
and x86/amd64).

Regards,
Martin
Re: [Python-Dev] Multiprocessing on Solaris
Known issue: http://bugs.python.org/issue3110

I haven't had time to look into it; I was planning on working on many of
the mp bugs during the sprint at PyCon.

On Fri, Mar 20, 2009 at 8:18 PM, Christian Heimes wrote:
> Hello fellow co-developers!
>
> Today I was in contact with a Python user who tried to compile
> pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
> to run because Solaris is missing two features (HAVE_FD_TRANSFER and
> HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
> test the settings? Neither Python 2.6 nor my backup have the correct
> settings for Solaris.
>
> Christian
Re: [Python-Dev] Multiprocessing on Solaris
Martin v. Löwis schrieb:
>> Today I was in contact with a Python user who tried to compile
>> pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
>> to run because Solaris is missing two features (HAVE_FD_TRANSFER and
>> HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
>> test the settings? Neither Python 2.6 nor my backup have the correct
>> settings for Solaris.
>
> I don't quite understand what it is that you want tested - what
> "settings"?
>
> Most likely, the answer is yes, I can test stuff on Solaris (both SPARC
> and x86/amd64).

According to the user's experience, multiprocessing should not compile
and run correctly unless this patch is applied. I'm not sure if the
value "solaris" for platform is correct. You may also need to change
libraries to ['rt'].


Index: setup.py
===================================================================
--- setup.py    (revision 70478)
+++ setup.py    (working copy)
@@ -1280,6 +1280,14 @@
                 )
             libraries = []
 
+        elif platform == 'solaris':
+            macros = dict(
+                HAVE_SEM_OPEN=1,
+                HAVE_SEM_TIMEDWAIT=0,
+                HAVE_FD_TRANSFER=0,
+                )
+            libraries = []
+
         else:                                   # Linux and other unices
             macros = dict(
                 HAVE_SEM_OPEN=1,
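On the platform-string question: sys.platform on Solaris is normally
'sunos5', not 'solaris', so if setup.py's platform variable mirrors
sys.platform (an assumption, not verified against r70478), the branch
would presumably need to match that value or its prefix, roughly:

import sys

if sys.platform.startswith('sunos'):    # 'sunos5' on Solaris 2.x
    macros = dict(
        HAVE_SEM_OPEN=1,
        HAVE_SEM_TIMEDWAIT=0,
        HAVE_FD_TRANSFER=0,
        )
    libraries = ['rt']                  # librt may be needed, per the note above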
Re: [Python-Dev] Multiprocessing on Solaris
On Fri, Mar 20, 2009 at 8:50 PM, Christian Heimes wrote:
> Martin v. Löwis schrieb:
>>> Today I was in contact with a Python user who tried to compile
>>> pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
>>> to run because Solaris is missing two features (HAVE_FD_TRANSFER and
>>> HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
>>> test the settings? Neither Python 2.6 nor my backup have the correct
>>> settings for Solaris.
>>
>> I don't quite understand what it is that you want tested - what
>> "settings"?
>>
>> Most likely, the answer is yes, I can test stuff on Solaris (both SPARC
>> and x86/amd64).
>
> According to the user's experience, multiprocessing should not compile
> and run correctly unless this patch is applied. I'm not sure if the
> value "solaris" for platform is correct. You may also need to change
> libraries to ['rt'].
>
>
> Index: setup.py
> ===================================================================
> --- setup.py    (revision 70478)
> +++ setup.py    (working copy)
> @@ -1280,6 +1280,14 @@
>                  )
>              libraries = []
>
> +        elif platform == 'solaris':
> +            macros = dict(
> +                HAVE_SEM_OPEN=1,
> +                HAVE_SEM_TIMEDWAIT=0,
> +                HAVE_FD_TRANSFER=0,
> +                )
> +            libraries = []
> +
>          else:                                   # Linux and other unices
>              macros = dict(
>                  HAVE_SEM_OPEN=1,

If this should be addressed in trunk/3k, we need to track this in the
tracker in the bug I cited in the other email. I can't speak for the
original pyprocessing code.

-jesse
Re: [Python-Dev] Multiprocessing on Solaris
Jesse Noller wrote:
> On Fri, Mar 20, 2009 at 8:50 PM, Christian Heimes wrote:
>> Martin v. Löwis schrieb:
>>>> Today I was in contact with a Python user who tried to compile
>>>> pyprocessing - the ancestor of multiprocessing - on Solaris. It failed
>>>> to run because Solaris is missing two features (HAVE_FD_TRANSFER and
>>>> HAVE_SEM_TIMEDWAIT). Does anybody have a Solaris box at his disposal to
>>>> test the settings? Neither Python 2.6 nor my backup have the correct
>>>> settings for Solaris.
>>> I don't quite understand what it is that you want tested - what
>>> "settings"?
>>>
>>> Most likely, the answer is yes, I can test stuff on Solaris (both SPARC
>>> and x86/amd64).
>> According to the user's experience, multiprocessing should not compile
>> and run correctly unless this patch is applied. I'm not sure if the
>> value "solaris" for platform is correct. You may also need to change
>> libraries to ['rt'].
>>
>>
>> Index: setup.py
>> ===================================================================
>> --- setup.py    (revision 70478)
>> +++ setup.py    (working copy)
>> @@ -1280,6 +1280,14 @@
>>                  )
>>              libraries = []
>>
>> +        elif platform == 'solaris':
>> +            macros = dict(
>> +                HAVE_SEM_OPEN=1,
>> +                HAVE_SEM_TIMEDWAIT=0,
>> +                HAVE_FD_TRANSFER=0,
>> +                )
>> +            libraries = []
>> +
>>          else:                                   # Linux and other unices
>>              macros = dict(
>>                  HAVE_SEM_OPEN=1,
>
> If this should be addressed in trunk/3k, we need to track this in the
> tracker in the bug I cited in the other email. I can't speak for the
> original pyprocessing code.
>

I just checked out the trunk on a Sparc Solaris 8 box, and on the trunk,
those defines are specified differently:

building '_multiprocessing' extension
gcc -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC
-DHAVE_SEM_OPEN=1 -DHAVE_FD_TRANSFER=1 -DHAVE_SEM_TIMEDWAIT=1
-IModules/_multiprocessing -I. -I./Include -I/usr/local/include -IInclude
-I/nfs/nfs2/home/scratch/scodial/python-trunk -c
trunk/Modules/_multiprocessing/multiprocessing.c -o
build/temp.solaris-2.8-sun4u-2.7/trunk/Modules/_multiprocessing/multiprocessing.o

However, the build is still without issue:

trunk/Modules/_multiprocessing/multiprocessing.c: In function `multiprocessing_sendfd':
trunk/Modules/_multiprocessing/multiprocessing.c:100: warning: implicit declaration of function `CMSG_SPACE'
trunk/Modules/_multiprocessing/multiprocessing.c:117: warning: implicit declaration of function `CMSG_LEN'
trunk/Modules/_multiprocessing/connection.h: In function `connection_new':
trunk/Modules/_multiprocessing/connection.h:51: warning: unknown conversion type character `z' in format
trunk/Modules/_multiprocessing/connection.h:51: warning: too many arguments for format
trunk/Modules/_multiprocessing/connection.h: In function `connection_repr':
trunk/Modules/_multiprocessing/connection.h:401: warning: unknown conversion type character `z' in format
trunk/Modules/_multiprocessing/connection.h: In function `connection_new':
trunk/Modules/_multiprocessing/connection.h:51: warning: unknown conversion type character `z' in format
trunk/Modules/_multiprocessing/connection.h:51: warning: too many arguments for format
trunk/Modules/_multiprocessing/connection.h: In function `connection_repr':
trunk/Modules/_multiprocessing/connection.h:401: warning: unknown conversion type character `z' in format

--
Scott Dial
sc...@scottdial.com
scod...@cs.indiana.edu
Re: [Python-Dev] Multiprocessing on Solaris
Jesse> Known issue:
Jesse> http://bugs.python.org/issue3110

Jesse> I haven't had time to look into it, I was planning on working on
Jesse> many of the mp bugs during the sprint at pycon.

Jesse, I will be at the sprints for a couple days and should be able to
test things out on Solaris or let you look over my shoulder as we poke
around the machines at work if you need.

Skip
Re: [Python-Dev] Multiprocessing on Solaris
On Fri, Mar 20, 2009 at 9:51 PM, wrote:
> Jesse> Known issue:
>
> Jesse> http://bugs.python.org/issue3110
>
> Jesse> I haven't had time to look into it, I was planning on working on
> Jesse> many of the mp bugs during the sprint at pycon.
>
> Jesse, I will be at the sprints for a couple days and should be able to
> test things out on Solaris or let you look over my shoulder as we poke
> around the machines at work if you need.
>
> Skip

Sweet, do you think a 64-bit OpenSolaris VM would work too?
Re: [Python-Dev] What level of detail wanted for import and the language reference?
Doc changes are now checked in. Someone who has not been staring at
import for over two years should probably go in and clean it up, as it is
probably not clear to a newbie (but then again, newbies should not be
reading the language ref; I am more worried about the docs in sys).

On Mon, Mar 16, 2009 at 15:39, Brett Cannon wrote:
> At this point importlib is done for its public API for Python 3.1. That
> means it's time to turn my attention to making sure the semantics of
> import are well documented. But where to put all of the details? The
> language reference for import
> (http://docs.python.org/dev/py3k/reference/simple_stmts.html#the-import-statement)
> explains the basics, but is lacking all of the details of PEP 302 and
> other stuff like __path__ that have existed for ages.
>
> My question is if I should flesh out the details in the language
> reference or do it in importlib's intro docs. The main reason I could
> see not doing it in the language reference (or at least duplicating it)
> is it would be somewhat easier to reference specific objects in
> importlib, but I am not sure if the language reference should try to
> stay away from stdlib references.
>
> -Brett
Re: [Python-Dev] What level of detail wanted for import and the language reference?
2009/3/20 Brett Cannon :
> Doc changes are now checked in. Someone who has not been staring at
> import for over two years should probably go in and clean it up, as it
> is probably not clear to a newbie (but then again, newbies should not be
> reading the language ref; I am more worried about the docs in sys).

It would be nice to have at least the sys docs backported to the trunk.

--
Regards,
Benjamin
Re: [Python-Dev] What level of detail wanted for import and the language reference?
On Fri, Mar 20, 2009 at 20:18, Benjamin Peterson wrote:
> 2009/3/20 Brett Cannon :
> > Doc changes are now checked in. Someone who has not been staring at
> > import for over two years should probably go in and clean it up, as it
> > is probably not clear to a newbie (but then again, newbies should not
> > be reading the language ref; I am more worried about the docs in sys).
>
> It would be nice to have at least the sys docs backported to the trunk.

That would also require backporting stuff from the glossary. In other
words, I ain't doing it now, but you might be able to convince me at
PyCon. I will at least create a bug about it, though.

-Brett
Re: [Python-Dev] Proposal: new list function: pack
Hrvoje Niksic wrote:
> Looking at izip(*[iter(l)]*n), I tend to agree.

Note that the itertools recipes page in the docs includes the following:

def pairwise(iterable):
    "s -> (s0,s1), (s1,s2), (s2, s3), ..."
    a, b = tee(iterable)
    next(b, None)
    return izip(a, b)

There are a couple of other variants here:
http://code.activestate.com/recipes/439095/

And a different take on providing similar functionality here:
http://code.activestate.com/recipes/544296/

However, the idea of providing a general windowing function in itertools
has been considered in the past and emphatically rejected:
http://mail.python.org/pipermail/python-dev/2006-May/065305.html

Cheers,
Nick.

--
Nick Coghlan | ncogh...@gmail.com | Brisbane, Australia
---
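As a side note, the pairwise recipe generalises to wider windows along
these lines (an illustrative sketch in the same style, not something taken
from the itertools docs):

from itertools import tee, izip

def window(iterable, n=2):
    "s -> (s0,...,s[n-1]), (s1,...,s[n]), ..."
    iters = tee(iterable, n)
    for i, it in enumerate(iters):
        for _ in xrange(i):        # advance the i-th copy by i positions
            next(it, None)
    return izip(*iters)

print list(window(range(6), 3))
# -> [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5)]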
[Python-Dev] PEP 380 (yield from a subgenerator) comments
I really like the PEP - it's a solid extension of the ideas introduced by
PEP 342.

The two changes I would suggest are that the PEP be made more explicit
regarding the fact that the try/finally block only encloses the yield
expression itself (i.e. no other parts of the containing statement), and
that the caching comment be updated with a list of specific semantic
elements that the caching should not affect.

For the first part, I would prefer if the example was changed to use
capitals for the variant non-keyword parts of the statement:

RESULT = yield from EXPR

And that it formally expanded to:

_i = iter(EXPR)
try:
    _u = _i.next()
    while 1:
        try:
            _v = yield _u
        except Exception, _e:
            _m = getattr(_i, 'throw', None)
            if _m is not None:
                _u = _m(_e)
            else:
                raise
        else:
            if _v is None:
                _u = _i.next()
            else:
                _u = _i.send(_v)
except StopIteration, _e:
    _expr_result = _e.value
finally:
    _m = getattr(_i, 'close', None)
    if _m is not None:
        _m()
RESULT = _expr_result

I believe writing it that way would make it clearer that the scope of the
try/finally block doesn't include the assignment part of the statement.

For the second part, the specific semantics that I believe should be
noted as not changing, even if an implementation chooses to cache the
bound methods, are these:

- The "send" and "throw" methods of the subiterator should not be
  retrieved if those methods are never called on the delegating generator
- If "send" is called on the delegating generator and the subiterator has
  no "send" method, then an appropriate "AttributeError" should be raised
  in the delegating generator
- If retrieving the "next", "send" or "throw" methods from the subiterator
  results in an exception, then the subiterator's "close" method (if it
  has one) should still be called

Cheers,
Nick.

--
Nick Coghlan | ncogh...@gmail.com | Brisbane, Australia
---
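For readers who have not seen the PEP, a rough before-and-after sketch of
the delegation pattern it targets (the generator names are invented, and
the yield from form is the proposed syntax, shown only in a comment
because it will not run on current Python):

def chunks(data, size):
    for i in xrange(0, len(data), size):
        yield data[i:i + size]

# Today: delegating by hand forwards only iteration, not send(),
# throw() or close().
def chunked_lines(lines, size):
    for line in lines:
        for chunk in chunks(line, size):
            yield chunk

# Under PEP 380 the inner loop collapses to a single statement that also
# forwards send(), throw() and close() to the subgenerator:
#
#     for line in lines:
#         yield from chunks(line, size)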