Re: [Python-Dev] PEP 377 - allow __enter__() methods to skip the statement body

2009-03-21 Thread Nick Coghlan
James Pye wrote:
> The identification of this issue came from an *experiment* attempting to
> create a *single* "daemonized()" CM that would execute the
> with-statement's block in a new child process and, of course, not
> execute it in the parent. At first, I ran into the RuntimeError in the
> parent process, and then after rewriting the CMs as classes, I realized
> the futility.
> 
> with daemonized():
>  run_some_subprocess()
> 
> Of course it was all possible if I used the component CMs directly:
> 
> with parent_trap():
>  with fork_but_raise_in_parent():
>   run_some_subprocess()

When updating the PEP with the rejection notice, it occurred to me that
it is fairly easy to handle specific use cases like this reasonably
cleanly by including a callable in the design that is always used inline
in the body of the outermost with statement. For example:

  @contextmanager
  def make_daemon():
      class SkipInParent(Exception):
          pass
      def startd():
          # Fork the process, then raise SkipInParent
          # in the parent process. The child process
          # continues running as a daemon.
          pass
      try:
          yield startd
      except SkipInParent:
          pass


  with make_daemon() as startd:
      startd()
      # Daemon code goes here

With that approach, since it is startd() rather than __enter__() that
raises the exception, __exit__() will always be given the chance to
suppress it.
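A minimal runnable sketch of the same pattern, without the fork (the
names Skip/skippable are illustrative, not from the PEP):

```python
from contextlib import contextmanager

class Skip(Exception):
    pass

@contextmanager
def skippable():
    def skip():
        raise Skip  # raised in the *body*, not in __enter__()
    try:
        yield skip
    except Skip:
        pass  # __exit__() receives the exception and suppresses it

ran = []
with skippable() as skip:
    ran.append("before")
    skip()
    ran.append("after")  # never reached

print(ran)  # → ['before']
```

Because the exception originates in the statement body, the normal
with-statement machinery handles it; no change to __enter__() semantics
is needed.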

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
---
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Antoine Pitrou
Nick Coghlan  gmail.com> writes:
> 
> And that it formally expanded to:
> 
> 

Do we really want to add a syntactic feature which has such a complicated
expansion? I fear it will make code using "yield from" much more difficult to
understand and audit.




Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Nick Coghlan
Antoine Pitrou wrote:
> Nick Coghlan  gmail.com> writes:
>> And that it formally expanded to:
>>
>>  conditionals>
> 
> Do we really want to add a syntactic feature which has such a complicated
> expansion? I fear it will make code using "yield from" much more difficult to
> understand and audit.

Yes, I think we do. The previous argument against explicit syntactic
support for invoking subiterators was that it was trivial to do so by
iterating over the subiterator and yielding each item in turn.

With the additional generator features introduced by PEP 342, that is no
longer the case: as described in Greg's PEP, simple iteration doesn't
support send() and throw() correctly. The gymnastics needed to support
send() and throw() actually aren't that complex when you break them
down, but they aren't trivial either.
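A small sketch of what breaks with simple iteration (written in modern
Python 3 syntax; the names are illustrative):

```python
def sub():
    received = yield "first"
    yield "sub saw %r" % (received,)

def naive_delegate():
    # Plain iteration only ever calls next() on sub(),
    # so values passed in via send() never reach it
    for item in sub():
        yield item

g = naive_delegate()
print(next(g))          # → 'first'
print(g.send("hello"))  # → "sub saw None" - the sent value was
                        # consumed by naive_delegate's own yield
```

Forwarding send() and throw() by hand is exactly the "gymnastics" the
expansion below spells out.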

Whether different people find code using "yield from" difficult to
understand will have more to do with their grasp of the concepts of
cooperative multitasking in general than with the underlying trickery
involved in allowing truly nested generators.

Here's an annotated version of the expansion that will hopefully make
things clearer:

  # Create the subiterator
  _i = iter(EXPR)
  # Outer try block serves two purposes:
  #  - retrieve expression result from StopIteration instance
  #  - ensure _i.close() is called if it exists
  try:
      # Get first value to be yielded
      _u = _i.next()
      while 1:
          # Inner try block allows exceptions passed in via
          # the generator's throw() method to be passed to
          # the subiterator
          try:
              _v = yield _u
          except Exception, _e:
              # An exception was thrown into this
              # generator. If the subiterator has
              # a throw() method, then we pass the
              # exception down. Otherwise, we
              # propagate the exception in the
              # current generator.
              # Note that SystemExit and
              # GeneratorExit are never passed down.
              # For those, we rely on the close()
              # call in the outer finally block.
              _m = getattr(_i, 'throw', None)
              if _m is not None:
                  # throw() will either yield
                  # a new value, raise StopIteration
                  # or reraise the original exception
                  _u = _m(_e)
              else:
                  raise
          else:
              if _v is None:
                  # Get the next subiterator value
                  _u = _i.next()
              else:
                  # A value was passed in using
                  # send(), so attempt to pass it
                  # down to the subiterator.
                  # AttributeError will be raised
                  # if the subiterator doesn't
                  # provide a send() method
                  _u = _i.send(_v)
  except StopIteration, _e:
      # Subiterator ended, get the expression result
      _expr_result = _e.value
  finally:
      # Ensure close() is called if it exists
      _m = getattr(_i, 'close', None)
      if _m is not None:
          _m()
  RESULT = _expr_result


On further reflection (and after reading a couple more posts on
python-ideas relating to this PEP), I have two more questions/concerns:

1. The inner try/except is completely pointless if the subiterator
doesn't have a throw() method. Would it make sense to have two versions
of the inner loop (with and without the try block) and choose which one
to use based on whether or not the subiterator has a throw() method?
(Probably not, since this PEP is mainly about generators as cooperative
pseudo-threads and in such situations all iterators involved are likely
to be generators and hence have throw() methods. However, I think the
question is at least worth thinking about.)

2. Due to a couple of bug reports against 2.5,
contextlib.GeneratorContextManager now takes extra care when handling
exceptions to avoid accidentally suppressing StopIteration instances
that were explicitly thrown in. However, the current expansion in PEP
380 doesn't check whether the StopIteration caught by the outer try
statement was originally thrown into the generator rather than an
indication that the subiterator naturally reached the end of its
execution. That behaviour isn't difficult to eliminate, but it does
require a slight change to the semantic definition of the new expression:

  _i = iter(EXPR)
  _thrown_exc = None
  try:
      _u = _i.next()
      while 1:
          try:
              _v = yield _u
          except Exception, _e:
              _thrown_exc = _e
              _m = getattr(_i, 'throw', None)
              if _m is not None:
                  _u = _m(_e)
              else:
                  raise
          else:
              if _v is None:
                  _u = _i.next()
              else:
                  _u = _i.send(_v)
  except StopIteration, _e:
      if _e is _thrown_exc:
          # The StopIteration was explicitly thrown in,
          # so propagate it rather than treating it as
          # the subiterator finishing naturally
          raise
      _expr_result = _e.value

Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Antoine Pitrou
Nick Coghlan  gmail.com> writes:
> 
> Whether different people find code using "yield from" difficult to
> understand will have more to do with their grasp of the concepts of
> cooperative multitasking in general than with the underlying trickery
> involved in allowing truly nested generators.

I don't agree. Cooperative multitasking looks quite orthogonal to me to the
complexity brought by this new statement. You can perfectly well "grasp the
concepts of cooperative multitasking" without finding the semantics of this new
statement easy to understand and remember. Hiding so many special cases behind a
one-line statement does not help, IMO. And providing a commented version of the
expansion does not really help either: it does not make the expansion easier to
remember and replay in the case you have to debug something involving such a
"yield from" statement.

(remember, by the way, that a third-party package like greenlets already
provides cooperative multitasking without any syntax addition, and that
libraries like Twisted already have their own generator-based solution for
cooperative multitasking, which AFAIR no one demonstrated would be improved by
the new statement. I'm not sure where the urgency is, and I don't see any
compelling use case.)




Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread P.J. Eby

At 04:45 PM 3/21/2009 +1000, Nick Coghlan wrote:

I really like the PEP - it's a solid extension of the ideas introduced
by PEP 342.


(Replying to you since I haven't seen any other thread on this)

My concern is that allowing 'return value' in generators is going to 
be confusing, since it effectively causes the return value to 
"disappear" if you're not using it in this special way with some 
framework that takes advantage.


However, if you *do* have some framework that takes advantage of 
generators to do microthreads, then it is most likely already written 
so as to have things like 'yield Return(value)' to signal a return, 
and to handle 'yield subgenerator()' without the use of additional syntax.


So, I don't really see the point of the PEP.  'yield from' seems 
marginally useful, but I really dislike making it an expression, 
rather than a statement.  The difference seems just a little too 
subtle, considering how radically different the behavior 
is.   Overall, it has the feel of jamming a framework into the 
language, when doing the same thing in a library is pretty trivial.


I'd almost rather see a standard or "reference" trampoline added to 
the stdlib (preferably with a way to register handling for 
specialized yielded types IO/scheduling hooks), than try to cram half 
a trampoline into the language itself.
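For reference, such a trampoline can be quite small. A hypothetical
sketch of the "yield Return(value)" style being described (every name
here is invented for illustration, not an existing stdlib API):

```python
import inspect

class Return(object):
    """Yielded by a task to signal its 'return value'."""
    def __init__(self, value):
        self.value = value

def run(task):
    """Minimal trampoline: yielded generators become sub-calls,
    a yielded Return ends the current task with a value."""
    stack = [task]
    to_send = None
    while stack:
        try:
            yielded = stack[-1].send(to_send)
        except StopIteration:
            # Task fell off the end without a Return
            stack.pop()
            to_send = None
            continue
        if isinstance(yielded, Return):
            stack[-1].close()        # task is done; discard it
            stack.pop()
            to_send = yielded.value  # hand the value to the caller
        elif inspect.isgenerator(yielded):
            stack.append(yielded)    # "call" the sub-task
            to_send = None
        else:
            to_send = yielded        # plain value: just echo it back
    return to_send

def add(a, b):
    yield Return(a + b)

def main():
    total = yield add(1, 2)  # sub-call; receives 3
    yield Return(total * 10)

print(run(main()))  # → 30
```

A real framework would dispatch on more yielded types (I/O requests,
sleeps, and so on), but the sub-call plumbing is only this much code.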




Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Greg Ewing

Antoine Pitrou wrote:


Do we really want to add a syntactic feature which has such a complicated
expansion? I fear it will make code using "yield from" much more difficult to
understand and audit.


As I've said before, I don't think the feature itself is
difficult to understand. You're not meant to learn about
it by reading the expansion -- that's only there to pin
down all the details for language lawyers.

For humans, almost all the important information is
contained in one paragraph near the top:

"When the iterator is another generator, the effect is the same as if
the body of the subgenerator were inlined at the point of the ``yield
from`` expression. Furthermore, the subgenerator is allowed to execute
a ``return`` statement with a value, and that value becomes the value of
the ``yield from`` expression."

Armed with this perspective, do you still think there will
be difficulty in understanding or auditing code?
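(For what it's worth, once PEP 380 eventually landed in Python 3.3,
that paragraph could be demonstrated in a few lines:)

```python
def inner():
    yield 1
    yield 2
    return "done"  # becomes the value of the yield from expression

def outer():
    result = yield from inner()
    yield result

print(list(outer()))  # → [1, 2, 'done']
```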

--
Greg


Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Greg Ewing

P.J. Eby wrote:

My concern is that allowing 'return value' in generators is going to be 
confusing, since it effectively causes the return value to "disappear" 
if you're not using it in this special way with some framework that 
takes advantage.


But part of all this is that you *don't* need a special
framework to get the return value -- all you need is a
caller that uses a yield-from statement. There are uses
for that besides threading systems.

--
Greg


Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread Paul Moore
2009/3/21 Greg Ewing :
> P.J. Eby wrote:
>
>> My concern is that allowing 'return value' in generators is going to be
>> confusing, since it effectively causes the return value to "disappear" if
>> you're not using it in this special way with some framework that takes
>> advantage.
>
> But part of all this is that you *don't* need a special
> framework to get the return value -- all you need is a
> caller that uses a yield-from statement. There are uses
> for that besides threading systems.

Can they be added to the PEP? Personally, I find the proposal
appealing, and I don't find the semantics hard to understand (although
certainly the expansion given in the "formal semantics" section makes
my head hurt ;-)) but I don't see many actual reasons why it's useful.
(My own use would most likely to be the trivial "for v in g: yield v"
case). More motivating examples would help a lot.
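One non-threading example along these lines (a hypothetical sketch,
using the syntax PEP 380 eventually gave Python 3.3): a pass-through
generator that also computes a summary, which the caller picks up as
the value of the yield-from expression:

```python
def average_chunk(items):
    total = 0
    n = 0
    for x in items:
        yield x          # pass items through unchanged
        total += x
        n += 1
    return total / n     # summary becomes the yield-from value

def report(items):
    avg = yield from average_chunk(items)
    yield "average: %s" % avg

print(list(report([1, 2, 3])))  # → [1, 2, 3, 'average: 2.0']
```

The refactoring is impossible with plain "for v in g: yield v", because
that form has nowhere to put the computed average.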

Paul.


Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-21 Thread P.J. Eby

At 10:21 AM 3/22/2009 +1200, Greg Ewing wrote:

P.J. Eby wrote:

My concern is that allowing 'return value' in generators is going 
to be confusing, since it effectively causes the return value to 
"disappear" if you're not using it in this special way with some 
framework that takes advantage.


But part of all this is that you *don't* need a special
framework to get the return value -- all you need is a
caller that uses a yield-from statement. There are uses
for that besides threading systems.


Such as?  I've been wracking my brain trying to come up with any 
*other* occasion where I'd need -- or even find it useful -- to have 
one generator yield the contents of another generator to its caller, 
and then use a separate return value in itself.  (I'm thus finding it 
hard to believe there's a non-contrived example that's not doing I/O, 
scheduling, or some other form of co-operative multitasking.)


In any case, you didn't address the confusion issue: the inability of 
generators to return a value is there for a good reason, and adding a 
return value that doesn't actually return anywhere unless you use it 
in a yield-from expression -- an expression that both looks like a 
statement and has control-flow side-effects -- seems both 
over-complex and an invitation to confusion.


This is different from plain yield expressions, in that plain yield 
expressions are *symmetric*: the value returned from the yield 
expression comes from the place where control flow is passed by the 
yield.  That is, 'x = yield y' takes value y, passes control flow to 
the caller, and then returns a result from the caller.  It's like an 
inverse function call.  'x = yield from y', on the other hand, first 
passes control to y, then the caller, then y, then the caller, an 
arbitrary number of times, and then finally returns a value from y, 
not the caller.
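To make the control flow being described concrete (again using the
syntax PEP 380 eventually gave Python 3.3; the names are illustrative):

```python
def sub():
    x = yield "ready"   # receives the value sent by the outer caller
    return x * 2

def outer():
    doubled = yield from sub()  # send()s pass straight through to sub()
    yield doubled

g = outer()
print(g.send(None))  # → 'ready' (yielded by sub(), two frames down)
print(g.send(21))    # → 42 (sub() returned, outer yields the result)
```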


This is an awful lot of difference in control flow for only a slight 
change in syntax -- much more of a difference than the difference 
between yield statements and yield expressions.


So at present (for whatever those opinions are worth), I'd say -0 on 
a yield-from *statement* (somewhat useful but maybe not worth 
bothering with), +0 on a reference trampoline in the stdlib (slightly 
better than doing nothing at all, but not by much), and -1 on 
yield-from expressions and return values (confusing complication with 
very narrowly focused benefit, reasonably doable with library code).




Re: [Python-Dev] Py_ssize_t support for ctypes arrays and pointers

2009-03-21 Thread Trent Nelson
On Fri, Mar 20, 2009 at 08:00:46PM +0100, Thomas Heller wrote:
> Since I do not have a machine with so much memory: Does one
> of the buildbots allow to run tests for this feature, or
> do I have to wait for the snakebite farm?

Will you be at PyCon?  The wait might not be as bad as you think ;-)

Trent.