Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).

Some highlights:

- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Another Anonymous Block Proposal

2005-04-27 Thread Jason Diamond
Hi.
I hope you don't mind another proposal. Please feel free to tear it apart.
A limitation of both Ruby's block syntax and the new PEP 340 syntax is 
the fact that they don't allow you to pass in more than a single 
anonymous block parameter. If Python's going to add anonymous blocks, 
shouldn't it do it better than Ruby?

What follows is a proposal for a syntax that allows passing multiple, 
anonymous callable objects into another callable. No new protocols are 
introduced and none of it is tied to iterators/generators which makes it 
much simpler to understand (and hopefully simpler to implement).

This is long and the initial syntax isn't ideal so please bear with me 
as I move towards what I'd like to see.

The Python grammar would get one new production:
    do_statement ::=
        "do" call ":" NEWLINE
        ( "with" funcname "(" [parameter_list] ")" ":" suite )*
Here's an example using this new "do" statement:
    do process_file(path):
        with process(file):
            for line in file:
                print line
That would translate into:
    def __process(file):
        for line in file:
            print line

    process_file(path, process=__process)
Notice that the name after each "with" keyword is the name of a 
parameter to the function being called. This will be what allows 
multiple block parameters.

The implementation of `process_file` could look something like:
    def process_file(path, process):
        try:
            f = file(path)
            process(f)
        finally:
            if f:
                f.close()
There's no magic in `process_file`. It's just a function that receives a 
callable named `process` as a parameter and it calls that callable with 
one parameter.

There's no magic in the post-translated code, either, except for the 
temporary `__process` definition which shouldn't be user-visible.

The magic comes when the pre-translated code gets each "with" block 
turned into a hidden, local def and passed in as a parameter to 
`process_file`.
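Stripped of the new syntax, the translation described above can be written out by hand in any Python. The following is a runnable sketch in modern Python 3 (the `print` statements of the example become list appends so the result is observable; the file contents are invented for illustration):

```python
import os
import tempfile

def process_file(path, process):
    # No magic here: open the file, hand it to the supplied callable,
    # and always close it (mirrors the proposal's process_file).
    f = open(path)
    try:
        process(f)
    finally:
        f.close()

# Hand-written form of what
#     do process_file(path):
#         with process(file): ...
# would compile to: a hidden local def passed in as a keyword argument.
collected = []

def __process(f):
    for line in f:
        collected.append(line.rstrip("\n"))

with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("first line\nsecond line\n")
process_file(tmp.name, process=__process)
os.unlink(tmp.name)
```

The only part the proposed syntax automates is the creation and passing of `__process`; everything else is ordinary function-call machinery.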

This syntax allows for multiple blocks:
    do process_file(path):
        with process(file):
            for line in file:
                print line
        with success():
            print 'file processed successfully!'
        with error(exc):
            print 'an exception was raised during processing:', exc
That's three separate anonymous block parameters, with a varying number 
of parameters in each one.

This is what `process_file` might look like now:
    def process_file(path, process, success=None, error=None):
        try:
            try:
                f = file(path)
                process(f)
                if success:
                    success()
            except:
                if error:
                    error(sys.exc_info())
                raise
        finally:
            if f:
                f.close()
I'm sure that being able to pass in multiple, anonymous blocks will be a 
huge advantage.

Here's an example of how Twisted might be able to use multiple block 
parameters:

    d = do Deferred():
        with callback(data): ...
        with errback(failure): ...
(After typing that in, I realized the do_statement production needs an 
optional assignment part.)

There's nothing requiring that anonymous blocks be used for looping. 
They're strictly parameters which need to be callable. They can, of 
course, be called from within a loop:

    def process_lines(path, process):
        try:
            f = file(path)
            for line in f:
                process(line)
        finally:
            if f:
                f.close()

    do process_lines(path):
        with process(line):
            print line
Admittedly, this syntax is pretty bulky. The "do" keyword is necessary 
to indicate to the parser that this isn't a normal call--this call has 
anonymous block parameters. Having to prefix each one of these 
parameters with "with" is just following the example of "if/elif/else" 
blocks. An alternative might be to use indentation the way that class 
statements "contain" def statements:

    do_statement ::=
        "do" call ":" NEWLINE
        INDENT
        ( funcname "(" [parameter_list] ")" ":" suite )*
        DEDENT
That would turn our last example into this:
    do process_lines(path):
        process(line):
            print line
The example with the `success` and `error` parameters would look like this:
    do process_file(path):
        process(file):
            for line in file:
                print line
        success():
            print 'file processed successfully!'
        error(exc):
            print 'an exception was raised during processing:', exc
To me, that makes it much easier to see that the three anonymous block 
statements are part of the "do" statement.

It would be ideal if we could even lose the "do" keyword. I think that 
might make the grammar ambiguous, though. If it was possible, we could 
do this:

    process_file(path):
        process(file):
            for line in file:
                print line
        success():
            print 'file processed successfully!'
        error(exc):
            print 'an exception was raised during processing:', exc

Re: [Python-Dev] Another Anonymous Block Proposal

2005-04-27 Thread Jason Diamond
Paul Svensson wrote:
> You're not mentioning scopes of local variables, which seems to be
> the issue where most of the previous proposals lose their balance
> between hairy and pointless...
My syntax is just sugar for nested defs. I assumed the scopes of local 
variables would be identical when using either syntax.

Do you have any pointers to discussions that go into the issues I'm probably missing?
Thanks.
--
Jason


[Python-Dev] Re: Re: anonymous blocks

2005-04-27 Thread Fredrik Lundh
Guido van Rossum wrote:

> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).
>
> Some highlights:
>
> - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
> - __next__() argument simplified to StopIteration or ContinueIteration 
> instance
> - use "continue EXPR" to pass a value to the generator
> - generator exception handling explained

+1 (most excellent)

 





Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
I'm still trying to build a case for a non-looping block statement, but the 
proposed enhancements to generators look great. Any further suggestions I make 
regarding a PEP 310 style block statement will account for those generator changes.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] defmacro

2005-04-27 Thread Stephen J. Turnbull
> "Greg" == Greg Ewing <[EMAIL PROTECTED]> writes:

Greg> I didn't claim that people would feel compelled to eliminate
Greg> all uses of lambda; only that, in those cases where they
Greg> *do* feel so compelled, they might not if lambda weren't
Greg> such a long word.

Sure, I understood that.  It's just that my feeling is that lambda
can't "just quote a suite", it brings lots of other semantic baggage
with it.

Anyway, with dynamic scope, we can eliminate lambda, can't we?  Just
pass the suites as quoted lists of forms, compute the macro expansion,
and eval it.  So it seems to me that the central issue is scoping, not
preventing evaluation of the suites.  In Lisp, macros are a way of
temporarily enabling certain amounts of dynamic scoping for all
variables, without declaring them "special".  It is very convenient
that they don't evaluate their arguments, but that is syntactic sugar,
AFAICT.

In other words, it's the same idea as the "collapse" keyword that was
proposed, but with different rules about what gets collapsed, when.
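In Python, the nearest approximation to passing "quoted lists of forms" is passing zero-argument callables whose evaluation is deferred until the receiver invokes them. A minimal sketch (the `unless` construct is invented for illustration); note that, unlike a Lisp macro, the lambdas still close over their *lexical* scope, which is exactly the scoping question at issue here:

```python
def unless(condition, then_suite, else_suite=lambda: None):
    # A "macro-ish" control construct: both suites arrive unevaluated,
    # as zero-argument callables, and only the chosen one is ever run.
    if not condition:
        return then_suite()
    return else_suite()

log = []
result = unless(
    1 > 2,
    then_suite=lambda: (log.append("then ran"), "then")[-1],
    else_suite=lambda: (log.append("else ran"), "else")[-1],
)
```

Only one of the two suites executes, so the deferral works; what this does not give you is any dynamic rebinding of the caller's variables inside the suites.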

-- 
School of Systems and Information Engineering http://turnbull.sk.tsukuba.ac.jp
University of Tsukuba    Tennodai 1-1-1 Tsukuba 305-8573 JAPAN
   Ask not how you can "do" free software business;
  ask what your business can "do for" free software.


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Jim Fulton
Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
This looks pretty cool.
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that happens
   to evaluate to None inside a block simply exits the block, rather
   than exiting a surrounding function. Did I miss something, or is this
   a bug?
2. I assume it would be a hack to try to use block statements to implement
   something like interfaces or classes, because doing so would require
   significant local-variable manipulation.  I'm guessing that
   either implementing interfaces (or implementing a class statement
   in which the class was created before execution of a suite)
   is not a use case for this PEP.
Jim
--
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Greg Ewing wrote:
Nick Coghlan wrote:
def template():
  # pre_part_1
  yield None
  # post_part_1
  yield None
  # pre_part_2
  yield None
  # post_part_2
  yield None
  # pre_part_3
  yield None
  # post_part_3
def user():
  block = template()
  with block:
# do_part_1
  with block:
# do_part_2
  with block:
# do_part_3

That's an interesting idea, but do you have any use cases
in mind?
I was trying to address a use case which looked something like:
    do_begin()
    # code
    if some_condition:
        do_pre()
        # more code
        do_post()
    do_end()
It's actually doable with a non-looping block statement, but I have yet to come 
up with a version which isn't as ugly as hell.

I worry that it will be too restrictive to be really useful.
Without the ability for the iterator to control which blocks
get executed and when, you wouldn't be able to implement
something like a case statement, for example.
We can't write a case statement with a looping block statement either, since 
we're restricted to executing the same suite whenever we encounter a yield 
expression. At least the non-looping version offers some hope, since each yield 
can result in the execution of different code.

For me, the main sticking point is that we *already* have a looping construct to 
drain an iterator - a 'for' loop. The more different the block statement's 
semantics are from a regular loop, the more powerful I think the combination 
will be. Whereas if the block statement is just a for loop with slightly tweaked 
exception handling semantics, then the potential combinations will be far less 
interesting.

My current thinking is that we would be better served by a block construct that 
guaranteed it would call __next__() on entry and on exit, but did not drain the 
generator (e.g. by supplying appropriate __enter__() and __exit__() methods on 
generators for a PEP 310 style block statement, or __enter__(), __except__() and 
__no_except__() for the enhanced version posted elsewhere in this rambling 
discussion).

However, I'm currently scattering my thoughts across half-a-dozen different 
conversation threads. So I'm going to stop doing that, and try to put it all 
into one coherent post :)

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

> Guido van Rossum wrote:
>> I've written a PEP about this topic. It's PEP 340: Anonymous Block
>> Statements (http://python.org/peps/pep-0340.html).
>> 
> Some observations:
> 
> 1. It looks to me like a bare return or a return with an EXPR3 that
> happens 
> to evaluate to None inside a block simply exits the block, rather
> than exiting a surrounding function. Did I miss something, or is
> this a bug?
> 

No, the return sets a flag and raises StopIteration which should make the 
iterator also raise StopIteration at which point the real return happens.

If the iterator fails to re-raise the StopIteration exception (the spec 
only says it should, not that it must), I think the return would be ignored 
but a subsequent exception would then get converted into a return value. I 
think the flag needs to be reset to avoid this case.

Also, I wonder whether other exceptions from next() shouldn't be handled a 
bit differently. If BLOCK1 throws an exception, and this causes the 
iterator to also throw an exception then one exception will be lost. I 
think it would be better to propagate the original exception rather than 
the second exception.

So something like (added lines to handle both of the above):

itr = EXPR1
exc = arg = None
ret = False
while True:
try:
VAR1 = next(itr, arg)
except StopIteration:
if exc is not None:
if ret:
return exc
else:
raise exc   # XXX See below
break
+   except:
+   if ret or exc is None:
+   raise
+   raise exc # XXX See below
+   ret = False
try:
exc = arg = None
BLOCK1
except Exception, exc:
arg = StopIteration()


[Python-Dev] Integrating PEP 310 with PEP 340

2005-04-27 Thread Nick Coghlan
This is my attempt at a coherent combination of what I like about both proposals 
(as opposed to my assortment of half-baked attempts scattered through the 
existing discussion).

PEP 340 has many ideas I like:
  - enhanced yield statements and yield expressions
  - enhanced continue and break
  - generator finalisation
  - 'next' builtin and associated __next__() slot
  - changes to 'for' loop
One restriction I don't like is the limitation to ContinueIteration and 
StopIteration as arguments to next(). The proposed semantics and conventions for 
ContinueIteration and StopIteration are fine, but I would like to be able to 
pass _any_ exception in to the generator, allowing the generator to decide if a 
given exception justifies halting the iteration.

The _major_ part I don't like is that the block statement's semantics are too 
similar to those of a 'for' loop. I would like to see a new construct that can 
do things a for loop can't do, and which can be used in _conjunction_ with a for 
loop, to provide greater power than either construct on its own.

PEP 310 forms the basis for a block construct that I _do_ like. The question 
then becomes whether or not generators can be used to write useful PEP 310 style 
block managers (I think they can, in a style very similar to that of the looping 
block construct from PEP 340).

Block statement syntax from PEP 340:
block EXPR1 [as VAR1]:
BLOCK1
Proposed semantics (based on PEP 310, with some ideas stolen from PEP 340):
blk_mgr = EXPR1
VAR1 = blk_mgr.__enter__()
try:
try:
BLOCK1
except Exception, exc:
blk_mgr.__except__(exc)
else:
blk_mgr.__no_except__()
finally:
blk_mgr.__exit__()
'blk_mgr' is a hidden variable (as per PEP 340).
Note that nothing special happens to 'break', 'return' or 'continue' statements 
with this proposal.

Generator methods to support the block manager protocol used by the block 
statement:
def __enter__(self):
try:
return next(self)
except StopIteration:
raise RuntimeError("Generator exhausted before block statement")
def __except__(self, exc):
try:
next(self, exc)
except StopIteration:
pass
def __no_except__(self):
try:
next(self)
except StopIteration:
pass
def __exit__(self):
pass
Writing simple block managers with this proposal (these should be identical to 
the equivalent PEP 340 block managers):

  def opening(name):
      opened = open(name)
      try:
          yield opened
      finally:
          opened.close()

  def logging(logger, name):
      logger.enter_scope(name)
      try:
          try:
              yield
          except Exception, exc:
              logger.log_exception(exc)
      finally:
          logger.exit_scope()

  def transacting(ts):
      ts.begin()
      try:
          yield
      except:
          ts.abort()
      else:
          ts.commit()
Using simple block managers with this proposal (again, identical to PEP 340):
  block opening(name) as f:
      pass
  block logging(logger, name):
      pass
  block transacting(ts):
      pass
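To make the proposed semantics concrete, here is a runnable sketch in modern Python: `run_block` transcribes the block-statement expansion given above directly (with BLOCK1 modelled as a one-argument callable receiving VAR1), and `Transacting` is a hand-written class equivalent of the `transacting()` generator, recording events so the control flow is visible:

```python
def run_block(blk_mgr, block_body):
    # Direct transcription of the proposed block-statement expansion.
    var1 = blk_mgr.__enter__()
    try:
        try:
            block_body(var1)
        except Exception as exc:
            blk_mgr.__except__(exc)
        else:
            blk_mgr.__no_except__()
    finally:
        blk_mgr.__exit__()

class Transacting:
    # Class-based equivalent of the transacting() generator above.
    def __init__(self):
        self.events = []
    def __enter__(self):
        self.events.append("begin")
        return self
    def __except__(self, exc):
        self.events.append("abort")
    def __no_except__(self):
        self.events.append("commit")
    def __exit__(self):
        self.events.append("exit")

def failing_body(ts):
    raise ValueError("simulated failure")

ok = Transacting()
run_block(ok, lambda ts: None)   # no exception: commit, then exit
bad = Transacting()
run_block(bad, failing_body)     # exception: abort, then exit
```

Note that, as in the pseudocode, the exception is considered handled once `__except__` returns; a manager that wants propagation would re-raise there.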
Obviously, the more interesting block managers are those like auto_retry (which 
is a loop, and hence an excellent match for PEP 340), and using a single 
generator in multiple block statements (which PEP 340 doesn't allow at all). 
I'll try to get to those tomorrow (and if I can't find any good use cases for 
the latter trick, then this idea can be summarily discarded in favour of PEP 340).

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Jim Fulton
Duncan Booth wrote:
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that
happens 
   to evaluate to None inside a block simply exits the block, rather
   than exiting a surrounding function. Did I miss something, or is
   this a bug?


No, the return sets a flag and raises StopIteration which should make the 
iterator also raise StopIteration at which point the real return happens.
Only if exc is not None
The only return in the pseudocode is inside "if exc is not None".
Is there another return that's not shown? ;)
I agree that we leave the block, but it doesn't look like we
leave the surrounding scope.
Jim
--
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Samuele Pedroni
Jim Fulton wrote:
Duncan Booth wrote:
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that
   happens to evaluate to None inside a block simply exits the block,
   rather than exiting a surrounding function. Did I miss something, or
   is this a bug?


No, the return sets a flag and raises StopIteration which should make 
the iterator also raise StopIteration at which point the real return 
happens.

Only if exc is not None
The only return in the pseudocode is inside "if exc is not None".
Is there another return that's not shown? ;)
I agree that we leave the block, but it doesn't look like we
leave the surrounding scope.
That we are having this discussion at all seems a signal that the 
semantics are likely too subtle.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

>> No, the return sets a flag and raises StopIteration which should make
>> the iterator also raise StopIteration at which point the real return
>> happens. 
> 
> Only if exc is not None
> 
> The only return in the pseudocode is inside "if exc is not None".
> Is there another return that's not shown? ;)
> 

Ah yes, I see now what you mean. 

I would think that the relevant pseudo-code should look more like:

except StopIteration:
if ret:
return exc
if exc is not None:
raise exc   # XXX See below
break


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 12:30 AM 4/27/05 -0700, Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration 
instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
Very nice.  It's not clear from the text, btw, if normal exceptions can be 
passed into __next__, and if so, whether they can include a traceback.  If 
they *can*, then generators can also be considered co-routines now, in 
which case it might make sense to call blocks "coroutine blocks", because 
they're basically a way to interleave a block of code with the execution of 
a specified coroutine.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 04:37 AM 4/26/05 -0700, Guido van Rossum wrote:
*Fourth*, and this is what makes Greg and me uncomfortable at the same
time as making Phillip and other event-handling folks drool: from the
previous three points it follows that an iterator may *intercept* any
or all of ReturnFlow, BreakFlow and ContinueFlow, and use them to
implement whatever cool or confusing magic they want.
Actually, this isn't my interest at all.  It's the part where you can pass 
values or exceptions *in* to a generator with *less* magic than is 
currently required.

This interest is unrelated to anonymous blocks in any case; it's about 
being able to simulate lightweight pseudo-threads ala Stackless, for use 
with Twisted.  I can do this now of course, but "yield expressions" as 
described in PEP 340 would eliminate the need for the awkward syntax and 
frame hackery I currently use.
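In today's Python, a value can be passed into a suspended generator and appear as the result of its yield expression; a minimal sketch of the pseudo-thread interleaving Phillip describes (the request/response strings are invented for illustration):

```python
def pseudo_thread():
    # A lightweight task: each yield hands a request to the scheduler,
    # and the yield expression evaluates to the scheduler's response.
    replies = []
    reply = yield "request-1"
    replies.append(reply)
    reply = yield "request-2"
    replies.append(reply)
    yield replies  # hand the accumulated responses back

task = pseudo_thread()
first_request = next(task)             # start the task
second_request = task.send("response-a")
result = task.send("response-b")
```

The scheduler and the task each run in turns, with ordinary function-call overhead and no frame manipulation.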



[Python-Dev] Re: Re: anonymous blocks

2005-04-27 Thread Fredrik Lundh
Phillip J. Eby wrote:
This interest is unrelated to anonymous blocks in any case; it's about 
being able to simulate lightweight pseudo-threads ala Stackless, for use 
with Twisted.  I can do this now of course, but "yield expressions" as 
described in PEP 340 would eliminate the need for the awkward syntax and 
frame hackery I currently use.
since when does
    def mythread(self):
        ...
        yield request
        print self.response
        ...
qualify as frame hackery?



Re: [Python-Dev] Another Anonymous Block Proposal

2005-04-27 Thread Josiah Carlson

Jason Diamond <[EMAIL PROTECTED]> wrote:
> 
> Paul Svensson wrote:
> 
> >  You're not mentioning scopes of local variables, which seems to be
> >  the issue where most of the previous proposals lose their balance
> >  between hairy and pointless...
> 
> My syntax is just sugar for nested defs. I assumed the scopes of local 
> variables would be identical when using either syntax.
> 
> Do you have any pointers to discussions that go into the issues I'm probably missing?

We already have nested defs in Python, no need for a new syntax there.

The trick is that people would like to be able to execute the body of a
def (or at least portions of it) in the namespace where it is lexically
defined (which makes block syntaxes seem less appealing), and some even
want to execute the body of the def in the namespace where the function
is evaluated (which has been discussed as being nearly, if not entirely,
impossible).
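The lexical half of this is what closures already give, and what the later `nonlocal` declaration (Python 3) completes for rebinding; a small sketch of the boundary Josiah describes:

```python
def make_counter():
    count = 0
    def bump():
        # A nested def reads the enclosing scope freely, but rebinding an
        # enclosing local needs an explicit declaration; without it,
        # "count" would simply become a new local inside bump().
        nonlocal count
        count += 1
        return count
    return bump

counter = make_counter()
first = counter()
second = counter()
```

Executing the body in the *caller's* namespace, by contrast, has no equivalent: the closure is fixed at definition time.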

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum <[EMAIL PROTECTED]> wrote:
> 
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).
> 
> Some highlights:
> 
> - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
> - __next__() argument simplified to StopIteration or ContinueIteration 
> instance
> - use "continue EXPR" to pass a value to the generator
> - generator exception handling explained

Your code for the translation of a standard for loop is flawed.  From
the PEP:

for VAR1 in EXPR1:
BLOCK1
else:
BLOCK2

will be translated as follows:

itr = iter(EXPR1)
arg = None
while True:
try:
VAR1 = next(itr, arg)
finally:
break
arg = None
BLOCK1
else:
BLOCK2


Note that in the translated version, BLOCK2 can only ever execute if
next raises a StopIteration in the call, and BLOCK1 will never be
executed because of the 'break' in the finally clause.

Unless it is too early for me, I believe what you wanted is...

itr = iter(EXPR1)
arg = None
while True:
VAR1 = next(itr, arg)
arg = None
BLOCK1
else:
BLOCK2

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> I would think that the relevant pseudo-code should look more like:
> 
> except StopIteration:
> if ret:
> return exc
> if exc is not None:
> raise exc   # XXX See below
> break

Thanks! This was a bug in the PEP due to a last-minute change in how I
wanted to handle return; I've fixed it as you show (also renaming
'exc' to 'var' since it doesn't always hold an exception).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
On 4/27/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).

So block-statements would be very much like for-loops, except:

(1) iter() is not called on the expression
(2) the fact that break, continue, return or a raised Exception
occurred can all be intercepted by the block-iterator/generator,
though break, return and a raised Exception all look the same to the
block-iterator/generator (they are signaled with a StopIteration)
(3) the while loop can only be broken out of by next() raising a
StopIteration, so all well-behaved iterators will be exhausted when
the block-statement is exited

Hope I got that mostly right.

I know this is looking a little far ahead, but is the intention that
even in Python 3.0 for-loops and block-statements will still be
separate statements?  It seems like there's a pretty large section of
overlap.  Playing with for-loop semantics right now isn't possible due
to backwards compatibility, but when that limitation is removed in
Python 3.0, are we hoping that these two similar structures will be
expressed in a single statement?

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> Your code for the translation of a standard for loop is flawed.  From
> the PEP:
> 
> for VAR1 in EXPR1:
> BLOCK1
> else:
> BLOCK2
> 
> will be translated as follows:
> 
> itr = iter(EXPR1)
> arg = None
> while True:
> try:
> VAR1 = next(itr, arg)
> finally:
> break
> arg = None
> BLOCK1
> else:
> BLOCK2
> 
> Note that in the translated version, BLOCK2 can only ever execute if
> next raises a StopIteration in the call, and BLOCK1 will never be
> executed because of the 'break' in the finally clause.

Ouch. Another bug in the PEP. It was late. ;-)

The "finally:" should have been "except StopIteration:". I've updated
the PEP online.

> Unless it is too early for me, I believe what you wanted is...
> 
> itr = iter(EXPR1)
> arg = None
> while True:
> VAR1 = next(itr, arg)
> arg = None
> BLOCK1
> else:
> BLOCK2

No, this would just propagate the StopIteration when next() raises it.
StopIteration is not caught implicitly except around the next() call
made by the for-loop control code.
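The corrected expansion can be checked in plain Python. This sketch omits the PEP's argument-passing, since in today's Python the two-argument `next(it, default)` means something different (a default value rather than a value sent in); the point is just that StopIteration is caught only around the next() call:

```python
def desugared_for(expr1, block1, block2):
    # Hand expansion of "for VAR1 in EXPR1: BLOCK1 else: BLOCK2" with the
    # fix applied: only StopIteration raised by next() ends the loop and
    # triggers the else-suite.
    itr = iter(expr1)
    while True:
        try:
            var1 = next(itr)
        except StopIteration:
            block2()            # BLOCK2 runs on normal exhaustion
            break
        block1(var1)            # StopIteration raised here would escape

seen = []
desugared_for([1, 2, 3], seen.append, lambda: seen.append("else"))
```

A `break` in BLOCK1 (not expressible through a callable here) would skip BLOCK2, matching the real statement's else-clause behaviour.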

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


[Python-Dev] ZipFile revision....

2005-04-27 Thread Schollnick, Benjamin
Folks,

There's been a lot of talk lately about changes to the ZipFile 
module...  Along with people stating that there are few "real life"
applications for it

Here's a small "gift"...

A "Quick" Backup utility for your files

Example:

c:\develope\backup\backup.py --source c:\install_software   --target
c:\backups\  --label installers
c:\develope\backup\backup.py --source c:\develope   --target
c:\backups\  --label development -z .pyc
c:\develope\backup\backup.py --source "C:\Program Files\Microsoft SQL
Server\MSSQL\Data"  --target c:\backups\  --label sql

It's evolved a bit, but still could use some work.  It's
currently only tested in a Windows
environment...  So don't expect Mac OS X resource forks to be
preserved.  But it creates and verifies 1Gb+ zip files.

If you wish to use this to help benchmark, test, etc, any
changes to the ZipFile module
please feel free to...

- Benjamin


"""Backup Creator Utility

This utility will backup the tree of files that you indicate, into a
archive
of your choice.

"""
#

# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
__version__ = '0.95'    # Human Readable Version number
version_info = (0, 9, 5)    # Easier format version data for comparisons,
                            # i.e. if version_info > (1, 2, 5)
                            #
                            # (if __version__ > '1.00' is a little more
                            # contrived.)

__author__  = 'Benjamin A. Schollnick'
__date__= '2004-12-28'  # yyyy-mm-dd
__email__   = '[EMAIL PROTECTED]'
__module_name__ = "Archive Backup Tool"
__short_cright__= ""

import  bas_init
import os
import os.path
import sys
import time
import zipfile


###
class   zip_file_engine:
"""The archive backup tool uses pregenerated classes to allow
multiple styles of archives to be created.

This is the wrapper around the Python ZIPFILE module.
"""
def __init__   ( self ):
"""
Inputs  --
None

Outputs --
None
"""
self.Backup_File= None
self.Backup_Open= False
self.Backup_ReadOnly= None
self.Backup_FileName= None

def close_Backup (self ):
"""This will close the current Archive file, and reset the
internal structures to a clean state.

Inputs  --
None

Outputs --
None
"""
if self.Backup_Open <> False:
self.Backup_File.close ()

self.Backup_File= None
self.Backup_Open= False
self.Backup_ReadOnly= None
self.Backup_FileName= None

    def open_Backup (   self,
                        readonly = False,
                        filename = r"./temp.zip"):
        """This will open an archive file.  Currently appending is not
        formally supported...  The Read Only / Read-Write status is set
        via the readonly flag.

        Inputs  --

            Readonly:
                True  = Read Only
                False = Read/Write

            Filename contains the full file/pathname of the zip file.

        Outputs --
            None
        """
        if self.Backup_Open == True:
            self.close_Backup ()

        if readonly:
            self.Backup_File    = zipfile.ZipFile ( filename, "r",
                                                    zipfile.ZIP_DEFLATED )
            self.Backup_ReadOnly= True
        else:
            self.Backup_File    = zipfile.ZipFile ( filename, "w",
                                                    zipfile.ZIP_DEFLATED )
            self.Backup_ReadOnly= False
        self.Backup_Open    = True
        self.Backup_FileName= filename

    def Verify_ZipFile ( self, FileName ):
        """Will create a temporary Zip File object, and verify the
        Zip file at that location.

        Inputs  -
            FileName - The filename of the ZIP file to verify.

        Outputs -
            True  - File Intact, CRCs match.

            Anything else, File Corrupted.  String contains
            the 1st corrupted file.
        """
        temporary_Backup_File = zip_file_engine ( )
        temporary_Backup_File.open_Backup ( True, FileName )
        test_results = temporary_Backup_File.Backup_File.testzip ()
        temporary_Backup_File.close_Backup ()
        if test_results is None:
            return True
        return test_results

def Verify_Backup (self, FileName ):
""" Generic Wrapper around the Verify_ZipFile object.
"""

RE: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Andrew Koenig

> that we are having this discussion at all seems a signal that the
> semantics are likely too subtle.

I feel like we're quietly, delicately tiptoeing toward continuations...




Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum <[EMAIL PROTECTED]> wrote:
> Ouch. Another bug in the PEP. It was late. ;-)
> 
> The "finally:" should have been "except StopIteration:" I've updated
> the PEP online.
> 
> > Unless it is too early for me, I believe what you wanted is...
> > 
> > itr = iter(EXPR1)
> > arg = None
> > while True:
> > VAR1 = next(itr, arg)
> > arg = None
> > BLOCK1
> > else:
> > BLOCK2
> 
> No, this would just propagate the StopIteration when next() raises it.
> StopIteration is not caught implicitly except around the next() call
> made by the for-loop control code.

Still no good.  On break, the else isn't executed.

How about...

itr = iter(EXPR1)
arg = None
while True:
try:
VAR1 = next(itr, arg)
except StopIteration:
BLOCK2
break
arg = None
BLOCK1
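
A quick runnable check of this expansion (the plain builtin next() stands in
for the PEP's two-argument next(); BLOCK1/BLOCK2 just record what ran):

```python
# BLOCK1 appends the item; BLOCK2 is the else-equivalent that runs only
# when the iterator is exhausted, never on an early break.
def expanded_for(seq):
    log = []
    itr = iter(seq)
    while True:
        try:
            var = next(itr)
        except StopIteration:
            log.append("BLOCK2")
            break
        log.append(var)    # BLOCK1
    return log

print(expanded_for([1, 2, 3]))    # [1, 2, 3, 'BLOCK2']
```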

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).
> 
> Some highlights:
> 
> - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
> - __next__() argument simplified to StopIteration or ContinueIteration 
> instance
> - use "continue EXPR" to pass a value to the generator
> - generator exception handling explained
> 

I am at least +0 on all of this now, with a slow warming up to +1 (but then it
might just be the cold talking  =).

I still prefer the idea of arguments to __next__() be raised if they are
exceptions and otherwise just be returned through the yield expression.  But I
do realize this is easily solved with a helper function now::

     def raise_or_yield(val):
         """Return the argument if not an exception, otherwise raise it.

         Meant to have a yield expression as an argument.  Worries about
         Iteration subclasses are invalid since they will have been handled
         by the __next__() method on the generator already.
         """
         if isinstance(val, Exception):
             raise val
         else:
             return val

My objections that I had earlier to 'continue' and 'break' being somewhat
magical in block statements has subsided.  It all seems reasonable now within
the context of a block statement.

And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to buy this =).

Basically block statements are providing a simplified, syntactically supported
way to control a generator externally from itself (or at least this is the
impression I am getting).  I just had a flash of worry about how this would
work in terms of abstractions of things to functions with block statements in
them, but then I realized you just push more code into the generator and handle
it there with the block statement just driving the generator.  Seems like this
might provide that last key piece for generators to finally provide cool flow
control that we all know they are capable of but just required extra work
beforehand.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip Eby]
> Very nice.  It's not clear from the text, btw, if normal exceptions can be
> passed into __next__, and if so, whether they can include a traceback.  If
> they *can*, then generators can also be considered co-routines now, in
> which case it might make sense to call blocks "coroutine blocks", because
> they're basically a way to interleave a block of code with the execution of
> a specified coroutine.

The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)

I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.

But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:

def safeLoop(seq):
for var in seq:
try:
yield var
except Exception, err:
print "ignored", var, ":", err.__class__.__name__

block safeLoop([10, 5, 0, 20]) as x:
print 1.0/x

This should print

0.1
0.2
ignored 0 : ZeroDivisionError
0.05
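
For what it's worth, this is essentially the generator throw()/send()
machinery that later shipped with PEP 342; a runnable sketch of the same
example, with a manual driver loop standing in for the proposed block syntax:

```python
# safe_loop is the generator from the message above; the driver loop
# below simulates the proposed 'block' statement by hand, using the
# generator throw() that eventually shipped (PEP 342).
def safe_loop(seq):
    for var in seq:
        try:
            yield var
        except Exception as err:
            print("ignored", var, ":", type(err).__name__)

results = []
g = safe_loop([10, 5, 0, 20])
try:
    x = next(g)
    while True:
        try:
            results.append(1.0 / x)
        except ZeroDivisionError as e:
            x = g.throw(e)    # hand the exception back to the generator
        else:
            x = next(g)
except StopIteration:
    pass

print(results)    # [0.1, 0.2, 0.05]
```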

I've been thinking of alternative signatures for the __next__() method
to handle this. We have the following use cases:

1. plain old next()
2. passing a value from continue EXPR
3. forcing a break due to a break statement
4. forcing a break due to a return statement
5. passing an exception EXC

Cases 3 and 4 are really the same; I don't think the generator needs
to know the difference between a break and a return statement. And
these can be mapped to case 5 with EXC being StopIteration().

Now the simplest API would be this: if the argument to __next__() is
an exception instance (let's say we're talking Python 3000, where all
exceptions are subclasses of Exception), it is raised when yield
resumes; otherwise it is the return value from yield (may be None).

This is somewhat unsatisfactory because it means that you can't pass
an exception instance as a value. I don't know how much of a problem
this will be in practice; I could see it causing unpleasant surprises
when someone designs an API around this that takes an arbitrary
object, when someone tries to pass an exception instance. Fixing such
a thing could be expensive (you'd have to change the API to pass the
object wrapped in a list or something).

An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?

I'll add this to the PEP as an alternative for now.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> I feel like we're quietly, delicately tiptoeing toward continuations...

No way we aren't. We're not really adding anything to the existing
generator machinery (the exception/value passing is a trivial
modification) and that is only capable of 80% of coroutines (but it's
the 80% you need most :-).

As long as I am BDFL Python is unlikely to get continuations -- my
head explodes each time someone tries to explain them to me.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread David Ascher
On 4/27/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:

> As long as I am BDFL Python is unlikely to get continuations -- my
> head explodes each time someone tries to explain them to me.

You just need a safety valve installed. It's outpatient surgery, don't worry.

--david


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 01:27 PM 4/27/05 -0700, Guido van Rossum wrote:
[Phillip Eby]
> Very nice.  It's not clear from the text, btw, if normal exceptions can be
> passed into __next__, and if so, whether they can include a traceback.  If
> they *can*, then generators can also be considered co-routines now, in
> which case it might make sense to call blocks "coroutine blocks", because
> they're basically a way to interleave a block of code with the execution of
> a specified coroutine.
The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)
I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.
Well, you need that feature in order to retain traceback information when 
you're simulating threads with a stack of generators.  Although you can't 
return from a generator inside a nested generator, you can simulate this by 
keeping a stack of generators and having a wrapper that passes control 
between generators, such that:

def somegen():
result = yield othergen()
causes the wrapper to push othergen() on the generator stack and execute 
it.  If othergen() raises an error, the wrapper resumes somegen() and 
passes in the error.  If you can only specify the value but not the 
traceback, you lose the information about where the error occurred in 
othergen().

So, the feature is necessary for anything other than "simple" (i.e. 
single-frame) coroutines, at least if you want to retain any possibility of 
debugging.  :)
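
Phillip's wrapper can be sketched with the send()/throw() methods that later
shipped in PEP 342 (all names here are illustrative, not from the PEP or his
actual code):

```python
import types

def run(gen):
    # A pseudothread driver: keeps a stack of generator frames, pushing a
    # nested generator whenever one is yielded, and passing values and
    # exceptions back up the stack as frames finish or fail.
    stack = [gen]
    value = None
    error = None
    while stack:
        top = stack[-1]
        try:
            if error is not None:
                exc, error = error, None
                yielded = top.throw(exc)   # resume the caller with the error
            else:
                yielded = top.send(value)  # send(None) also primes a new frame
                value = None
        except StopIteration as stop:
            stack.pop()                    # frame finished; 'return x' becomes
            value = stop.value             # the caller's yield-expression value
            continue
        except BaseException as exc:
            stack.pop()
            if not stack:
                raise                      # escaped the outermost frame
            error = exc                    # deliver to the calling frame,
            continue                       # traceback and all
        if isinstance(yielded, types.GeneratorType):
            stack.append(yielded)          # "call" the nested generator
            value = None
        else:
            value = yielded                # echo plain values back
                                           # (one possible convention)
```

Because the error is delivered with throw(), the traceback unwinds through
both the failing frame and its pseudothread callers, which is the point
Phillip is making about debuggability.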


But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:
def safeLoop(seq):
for var in seq:
try:
yield var
except Exception, err:
print "ignored", var, ":", err.__class__.__name__
block safeLoop([10, 5, 0, 20]) as x:
print 1.0/x
Yes, it would be nice.  Also, you may have just come up with an even better 
word for what these things should be called... patterns.  Perhaps they 
could be called "pattern blocks" or "patterned blocks".  Pattern sounds so 
much more hip and politically correct than "macro" or even "code block".  :)


An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I think it'd be simpler just to have two methods, conceptually 
"resume(value=None)" and "error(value,tb=None)", whatever the actual method 
names are.



RE: [Python-Dev] Re: switch statement

2005-04-27 Thread Michael Chermside
Guido writes:
> You mean like this?
>
> if x > 0:
>...normal case...
> elif y > 0:
> abnormal case...
> else:
> ...edge case...
>
> You have guts to call that bad style! :-)

Well, maybe, but this:

if x == 1:
   do_number_1()
elif x == 2:
   do_number_2()
elif x == 3:
   do_number_3()
elif y == 4:
   do_number_4()
elif x == 5:
   do_number_5()
else:
   raise ValueError

is clearly bad style. (Even knowing what I did here, how long does it
take you to find the problem? Hint: line 7.)

I've seen Jim's recipe in the cookbook, and as I said there, I'm impressed
by the clever implementation, but I think it's unwise. PEP 275 proposes
an O(1) solution... either by compiler optimization of certain
if-elif-else structures, or via a new syntax with 'switch' and 'case'
keywords. (I prefer the keywords version myself... that optimization
seems awfully messy, and wouldn't help with the problem above.) Jim's
recipe fixes the problem given above, but it's an O(n) solution, and to
me the words 'switch' and 'case' just *scream* "O(1)". But perhaps
it's worthwhile, just because it avoids repeating "x ==".
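
For comparison, the constant-time behaviour that "switch"/"case" suggests is
usually spelled with a dict of callables (a sketch; the handler names are
stand-ins for those in the example above):

```python
# The O(1) alternative: dispatch through a dict keyed on x.
def do_number_1(): return "one"
def do_number_2(): return "two"
def do_number_3(): return "three"

DISPATCH = {1: do_number_1, 2: do_number_2, 3: do_number_3}

def handle(x):
    try:
        handler = DISPATCH[x]
    except KeyError:
        raise ValueError(x)
    return handler()

print(handle(2))    # two
```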

Really, this seems like a direct analog of another frequently-heard
Python gripe: the lack of a conditional expression. After all, the
problems with these two code snippets:

 if x == 1:|if condition_1:
do_1() |y = 1
 elif x == 2:  |elif condition_2:
do_2() |y = 2
 elif x == 3:  |elif condition_3:
do_3() |y = 3
 else: |else:
default()  |y = 4

is the repetition of "x ==" and of "y =". As my earlier example
demonstrates, a structure like this in which the "x ==" or the
"y =" VARIES has a totally different *meaning* to the programmer
than one in which the "x ==" or "y =" is the same for every
single branch.

But let's not start discussing conditional expressions now,
because there's already more traffic on the list than I can read.

-- Michael Chermside



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
> >I'm not sure what the relevance of including a stack trace would be,
> >and why that feature would be necessary to call them coroutines.

[Phillip]
> Well, you need that feature in order to retain traceback information when
> you're simulating threads with a stack of generators.  Although you can't
> return from a generator inside a nested generator, you can simulate this by
> keeping a stack of generators and having a wrapper that passes control
> between generators, such that:
> 
>  def somegen():
>  result = yield othergen()
> 
> causes the wrapper to push othergen() on the generator stack and execute
> it.  If othergen() raises an error, the wrapper resumes somegen() and
> passes in the error.  If you can only specify the value but not the
> traceback, you lose the information about where the error occurred in
> othergen().
> 
> So, the feature is necessary for anything other than "simple" (i.e.
> single-frame) coroutines, at least if you want to retain any possibility of
> debugging.  :)

OK. I think you must be describing continuations there, because my
brain just exploded. :-)

In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice? I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().

> Yes, it would be nice.  Also, you may have just come up with an even better
> word for what these things should be called... patterns.  Perhaps they
> could be called "pattern blocks" or "patterned blocks".  Pattern sounds so
> much more hip and politically correct than "macro" or even "code block".  :)

Yes, but the word has a much loftier meaning. I could get used to
template blocks though (template being a specific pattern, and this
whole thing being a non-OO version of the Template Method Pattern from
the GoF book).

> >An alternative that solves this would be to give __next__() a second
> >argument, which is a bool that should be true when the first argument
> >is an exception that should be raised. What do people think?
> 
> I think it'd be simpler just to have two methods, conceptually
> "resume(value=None)" and "error(value,tb=None)", whatever the actual method
> names are.

Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much. Your resume() would be
__next__(), but that means your error() would become __error__(). This
is more along the lines of PEP 288 and PEP 325 (and even PEP 310), but
we have a twist here in that it is totally acceptable (see my example)
for __error__() to return the next value or raise StopIteration. IOW
the return behavior of __error__() is the same as that of __next__().

Fredrik, what does your intuition tell you?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: switch statement

2005-04-27 Thread Shane Hathaway
Michael Chermside wrote:
>  if x == 1:|if condition_1:
> do_1() |y = 1
>  elif x == 2:  |elif condition_2:
> do_2() |y = 2
>  elif x == 3:  |elif condition_3:
> do_3() |y = 3
>  else: |else:
> default()  |y = 4

This inspired a twisted thought: if you just redefine truth, you don't
have to repeat the variable. <0.9 wink>

True = x
if 1:
do_1()
elif 2:
do_2()
elif 3:
do_3()
else:
default()

Shane


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> If the iterator fails to re-raise the StopIteration exception (the spec
> only says it should, not that it must) I think the return would be ignored
> but a subsequent exception would then get converted into a return value. I
> think the flag needs to be reset to avoid this case.

Good catch. I've fixed this in the PEP.

> Also, I wonder whether other exceptions from next() shouldn't be handled a
> bit differently. If BLOCK1 throws an exception, and this causes the
> iterator to also throw an exception then one exception will be lost. I
> think it would be better to propagate the original exception rather than
> the second exception.

I don't think so. It's similar to this case:

try:
raise Foo
except:
raise Bar

Here, Foo is also lost.
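
That is easy to check; and for what it's worth, later Python 3 at least keeps
the replaced exception reachable via __context__ (PEP 3134), even though Bar
is still what propagates:

```python
# Bar replaces Foo as the propagating exception; since PEP 3134,
# Python 3 keeps Foo reachable as Bar's __context__.
class Foo(Exception): pass
class Bar(Exception): pass

try:
    try:
        raise Foo
    except:
        raise Bar
except Exception as exc:
    caught = exc

print(type(caught).__name__)    # Bar
assert isinstance(caught.__context__, Foo)
```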

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Jim Fulton]

> 2. I assume it would be a hack to try to use block statements to implement
> something like interfaces or classes, because doing so would require
> significant local-variable manipulation.  I'm guessing that
> either implementing interfaces (or implementing a class statement
> in which the class was created before execution of a suite)
> is not a use case for this PEP.

I would like to get back to the discussion about interfaces and
signature type declarations at some point, and a syntax dedicated to
declaring interfaces is high on my wish list.

In the mean time, if you need interfaces today, I think using
metaclasses would be easier than using a block-statement (if it were
even possible using the latter without passing locals() to the
generator).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Brett C. wrote:
And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to buy this =).
I'm surprisingly close to agreeing with you, actually. I've worked out that it 
isn't the looping that I object to, it's the inability to get out of the loop 
without exhausting the entire iterator.

I need to think about some ideas involving iterator factories, then my 
objections may disappear.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Guido van Rossum wrote:
An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I'll add this to the PEP as an alternative for now.
An optional third argument (raise=False) seems a lot friendlier (and more 
flexible) than a typecheck.

Yet another alternative would be for the default behaviour to be to raise 
Exceptions, and continue with anything else, and have the third argument be 
"raise_exc=True" and set it to False to pass an exception in without raising it.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Integrating PEP 310 with PEP 340

2005-04-27 Thread Guido van Rossum
[Nick Coghlan]
> This is my attempt at a coherent combination of what I like about both 
> proposals
> (as opposed to my assortment of half-baked attempts scattered through the
> existing discussion).
> 
> PEP 340 has many ideas I like:
>- enhanced yield statements and yield expressions
>- enhanced continue and break
>- generator finalisation
>- 'next' builtin and associated __next__() slot
>- changes to 'for' loop
> 
> One restriction I don't like is the limitation to ContinueIteration and
> StopIteration as arguments to next(). The proposed semantics and conventions 
> for
> ContinueIteration and StopIteration are fine, but I would like to be able to
> pass _any_ exception in to the generator, allowing the generator to decide if 
> a
> given exception justifies halting the iteration.

I'm close to dropping this if we can agree on the API for passing
exceptions into __next__(); see the section "Alternative __next__()
and Generator Exception Handling" that I just added to the PEP.

> The _major_ part I don't like is that the block statement's semantics are too
> similar to those of a 'for' loop. I would like to see a new construct that can
> do things a for loop can't do, and which can be used in _conjunction_ with a 
> for
> loop, to provide greater power than either construct on their own.

While both 'block' and 'for' are looping constructs, their handling of
the iterator upon premature exit is entirely different, and it's hard
to reconcile these two before Python 3000.

> PEP 310 forms the basis for a block construct that I _do_ like. The question
> then becomes whether or not generators can be used to write useful PEP 310 
> style
> block managers (I think they can, in a style very similar to that of the 
> looping
> block construct from PEP 340).

I've read through your example, and I'm not clear why you think this
is better. It's a much more complex API with less power. What's your
use case? Why should 'block' be disallowed from looping? TOOWTDI or do
you have something better?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
> > An alternative that solves this would be to give __next__() a second
> > argument, which is a bool that should be true when the first argument
> > is an exception that should be raised. What do people think?
> >
> > I'll add this to the PEP as an alternative for now.

[Nick]
> An optional third argument (raise=False) seems a lot friendlier (and more
> flexible) than a typecheck.

I think I agree, especially since Phillip's alternative (a different
method) is even worse IMO.

> Yet another alternative would be for the default behaviour to be to raise
> Exceptions, and continue with anything else, and have the third argument be
> "raise_exc=True" and set it to False to pass an exception in without raising 
> it.

You've lost me there. If you care about this, can you write it up in
more detail (with code samples or whatever)? Or we can agree on a 2nd
arg to __next__() (and a 3rd one to next()).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 02:50 PM 4/27/05 -0700, Guido van Rossum wrote:
[Guido]
> >I'm not sure what the relevance of including a stack trace would be,
> >and why that feature would be necessary to call them coroutines.
[Phillip]
> Well, you need that feature in order to retain traceback information when
> you're simulating threads with a stack of generators.  Although you can't
> return from a generator inside a nested generator, you can simulate this by
> keeping a stack of generators and having a wrapper that passes control
> between generators, such that:
>
>  def somegen():
>  result = yield othergen()
>
> causes the wrapper to push othergen() on the generator stack and execute
> it.  If othergen() raises an error, the wrapper resumes somegen() and
> passes in the error.  If you can only specify the value but not the
> traceback, you lose the information about where the error occurred in
> othergen().
>
> So, the feature is necessary for anything other than "simple" (i.e.
> single-frame) coroutines, at least if you want to retain any possibility of
> debugging.  :)
OK. I think you must be describing continuations there, because my
brain just exploded. :-)
Probably my attempt at a *brief* explanation backfired.  No, they're not 
continuations or anything nearly that complicated.  I'm "just" simulating 
threads using generators that yield a nested generator when they need to do 
something that might block waiting for I/O.  The pseudothread object pushes 
the yielded generator-iterator and resumes it.  If that generator-iterator 
raises an error, the pseudothread catches it, pops the previous 
generator-iterator, and passes the error into it, traceback and all.

The net result is that as long as you use a "yield expression" for any 
function/method call that might do blocking I/O, and those functions or 
methods are written as generators, you get the benefits of Twisted (async 
I/O without threading headaches) without having to "twist" your code into 
the callback-registration patterns of Twisted.  And, by passing in errors 
with tracebacks, the normal process of exception call-stack unwinding 
combined with pseudothread stack popping results in a traceback that looks 
just as if you had called the functions or methods normally, rather than 
via the pseudothreading mechanism.  Without that, you would only get the 
error context of 'async_readline()', because the traceback wouldn't be able 
to show who *called* async_readline.


In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice?
If you're planning to make 'raise' reraise it, such that 'raise exc' is 
equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you 
mean?  (i.e., just making it easier to pass the darn things around)

If so, then I could probably do what I need as long as there exist no error 
types whose instances disallow setting a 'traceback' attribute on them 
after the fact.  Of course, if Exception provides a slot (or dictionary) 
for this, then it shouldn't be a problem.

Of course, it seems to me that you also have the problem of adding to the 
traceback when the same error is reraised...

All in all it seems more complex than just allowing an exception and a 
traceback to be passed.


I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().
The point of passing it in is so that the traceback can be preserved 
without special action in the body of generators the exception is passing 
through.

I could be wrong, but it seems to me you need this even for PEP 340, if 
you're going to support error management templates, and want tracebacks to 
include the line in the block where the error originated.  Just reraising 
the error inside the generator doesn't seem like it would be enough.


> >An alternative that solves this would be to give __next__() a second
> >argument, which is a bool that should be true when the first argument
> >is an exception that should be raised. What do people think?
>
> I think it'd be simpler just to have two methods, conceptually
> "resume(value=None)" and "error(value,tb=None)", whatever the actual method
> names are.
Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much.
I was thinking that maybe these would be a "coroutine API" or "generator 
API" instead.  That is, something not usable except with 
generator-iterators and with *new* objects written to conform to it.  I 
don't really see a lot of value in making template blocks work with 
existing iterators.  For that matter, I don't see a lot of value in 
hand-writing new objects with resume/error, instead of just using a generator.

So, I guess I'm thinking you'd have something like tp_block_resume and 
tp_block_error type slots, and generators' tp_iter_next would just be the 
same as tp_block_resume(None).

But maybe this is the part you're thinking is complicated.  :)
___

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
- temporarily sidestepping the syntax by proposing 'block' instead of
'with'
- __next__() argument simplified to StopIteration or
ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
+1
A minor sticking point - I don't like that the generator has to re-raise any 
``StopIteration`` passed in. Would it be possible to have the semantics be:

   If a generator is resumed with ``StopIteration``, the exception is
   raised at the resumption point (and stored for later use). When the
   generator exits normally (i.e. ``return`` or falls off the end) it
   re-raises the stored exception (if any) or raises a new
   ``StopIteration`` exception.

So a generator would become effectively::

    try:
        stopexc = None
        exc = None
        BLOCK1
    finally:
        if exc is not None:
            raise exc
        if stopexc is not None:
            raise stopexc
        raise StopIteration
where within BLOCK1:

   ``raise EXC`` is equivalent to::

       exc = EXC
       return

   The start of an ``except`` clause sets ``exc`` to None (if the clause is
   executed, of course).

   Calling ``__next__(exception)`` with ``StopIteration`` is equivalent
   to::

       stopexc = exception
       (raise exception at resumption point)

   Calling ``__next__(exception)`` with ``ContinueIteration`` is equivalent
   to::

       (resume generator with exception.value)

   Calling ``__next__(exception)`` with any other value just raises that
   value at the resumption point - this allows for calling with arbitrary
   exceptions.

Also, within a for-loop or block-statement, we could have ``raise EXC``
be equivalent to::

   arg = EXC
   continue
This also takes care of Brett's concern about distinguishing between 
exceptions and values passed to the generator. Anything except StopIteration 
or ContinueIteration will be presumed to be an exception and will be raised. 
Anything passed via ContinueIteration is a value.

Tim Delaney 

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip]
> Probably my attempt at a *brief* explanation backfired.  No, they're not
> continuations or anything nearly that complicated.  I'm "just" simulating
> threads using generators that yield a nested generator when they need to do
> something that might block waiting for I/O.  The pseudothread object pushes
> the yielded generator-iterator and resumes it.  If that generator-iterator
> raises an error, the pseudothread catches it, pops the previous
> generator-iterator, and passes the error into it, traceback and all.
> 
> The net result is that as long as you use a "yield expression" for any
> function/method call that might do blocking I/O, and those functions or
> methods are written as generators, you get the benefits of Twisted (async
> I/O without threading headaches) without having to "twist" your code into
> the callback-registration patterns of Twisted.  And, by passing in errors
> with tracebacks, the normal process of exception call-stack unwinding
> combined with pseudothread stack popping results in a traceback that looks
> just as if you had called the functions or methods normally, rather than
> via the pseudothreading mechanism.  Without that, you would only get the
> error context of 'async_readline()', because the traceback wouldn't be able
> to show who *called* async_readline.

OK, I sort of get it, at a very high-level, although I still feel this
is wildly out of my league.

I guess I should try it first. ;-)

> >In Python 3000 I want to make the traceback a standard attribute of
> >Exception instances; would that suffice?
> 
> If you're planning to make 'raise' reraise it, such that 'raise exc' is
> equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you
> mean?  (i.e., just making it easier to pass the darn things around)
> 
> If so, then I could probably do what I need as long as there exist no error
> types whose instances disallow setting a 'traceback' attribute on them
> after the fact.  Of course, if Exception provides a slot (or dictionary)
> for this, then it shouldn't be a problem.

Right, this would be a standard part of the Exception base class, just
like in Java.

> Of course, it seems to me that you also have the problem of adding to the
> traceback when the same error is reraised...

I think when it is re-raised, no traceback entry should be added; the
place that re-raises it should not show up in the traceback, only the
place that raised it in the first place. To me that's the essence of
re-raising (and I think that's how it works when you use raise without
arguments).
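That is indeed how a bare raise behaves, and in modern Python, where the traceback did become a standard attribute of the exception (``__traceback__``), the effect is easy to demonstrate:

```python
import traceback

def boom():
    raise ValueError("original raise site")

def reraise():
    try:
        boom()
    except ValueError:
        raise                    # bare raise: original traceback preserved

try:
    reraise()
except ValueError as e:
    # The traceback travels on the exception object itself.
    names = [f.name for f in traceback.extract_tb(e.__traceback__)]

# The innermost frame is still boom(), the place that raised it first.
assert names[-1] == "boom"
assert "reraise" in names
```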

> All in all it seems more complex than just allowing an exception and a
> traceback to be passed.

Making the traceback a standard attribute of the exception sounds
simpler; having to keep track of two separate arguments that are as
closely related as an exception and the corresponding traceback is
more complex IMO.

The only reason why it isn't done that way in current Python is that
it couldn't be done that way back when exceptions were strings.

> >I really don't want to pass
> >the whole (type, value, traceback) triple that currently represents an
> >exception through __next__().
> 
> The point of passing it in is so that the traceback can be preserved
> without special action in the body of generators the exception is passing
> through.
> 
> I could be wrong, but it seems to me you need this even for PEP 340, if
> you're going to support error management templates, and want tracebacks to
> include the line in the block where the error originated.  Just reraising
> the error inside the generator doesn't seem like it would be enough.

*** I have to think about this more... ***

> > > I think it'd be simpler just to have two methods, conceptually
> > > "resume(value=None)" and "error(value,tb=None)", whatever the actual 
> > > method
> > > names are.
> >
> >Part of me likes this suggestion, but part of me worries that it
> >complicates the iterator API too much.
> 
> I was thinking that maybe these would be a "coroutine API" or "generator
> API" instead.  That is, something not usable except with
> generator-iterators and with *new* objects written to conform to it.  I
> don't really see a lot of value in making template blocks work with
> existing iterators.

(You mean existing non-generator iterators, right? Existing
*generators* will work just fine -- the exception will pass right
through them and that's exactly the right default semantics.)

Existing non-generator iterators are indeed a different case, and this
is actually an argument for having a separate API: if the __error__()
method doesn't exist, the exception is just re-raised rather than
bothering the iterator.

OK, I think I'm sold.

> For that matter, I don't see a lot of value in
> hand-writing new objects with resume/error, instead of just using a generator.

Not a lot, but I expect that there may be a few, like an optimized
version of lock synchronization.
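A hand-written lock-synchronization template might look like this under the proposed protocol (a hypothetical sketch; the ``__next__(arg)``/``__error__(exc)`` signatures follow the API being discussed here, not any real Python API):

```python
import threading

class locking:
    # Hypothetical non-generator block template: the first __next__
    # acquires the lock and "yields" control to the block; the second
    # __next__ (normal block exit) releases it; __error__ releases
    # the lock and re-raises whatever went wrong in the block.
    def __init__(self, lock):
        self.lock = lock
        self.entered = False

    def __next__(self, arg=None):
        if not self.entered:
            self.entered = True
            self.lock.acquire()
            return None                  # value bound by 'block ... as VAR'
        self.lock.release()
        raise StopIteration

    def __error__(self, exc):
        self.lock.release()
        raise exc

# Simulate the block-statement translation by hand:
lock = threading.Lock()
tmpl = locking(lock)
tmpl.__next__()                          # enter the block
assert lock.locked()
try:
    tmpl.__next__()                      # leave the block normally
except StopIteration:
    pass
assert not lock.locked()                 # lock released on exit
```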

> So, I guess I'm thinking you'd have something like tp_block_resume and
> tp_block_error type slots, and generators' tp_iter_next would just be the
> same as tp_block_resume(None).

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> A minor sticking point - I don't like that the generator has to re-raise any
> ``StopIteration`` passed in. Would it be possible to have the semantics be:
> 
> If a generator is resumed with ``StopIteration``, the exception is raised
> at the resumption point (and stored for later use). When the generator
> exits normally (i.e. ``return`` or falls off the end) it re-raises the
> stored exception (if any) or raises a new ``StopIteration`` exception.

I don't like the idea of storing exceptions. Let's just say that we
don't care whether it re-raises the very same StopIteration exception
that was passed in or a different one -- it's all moot anyway because
the StopIteration instance is thrown away by the caller of next().

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Tim Delaney wrote:
Also, within a for-loop or block-statement, we could have ``raise EXC``
be equivalent to::
   arg = EXC
   continue
For this to work, builtin next() would need to be a bit smarter ... 
specifically, for an old-style iterator, any non-Iteration exception would 
need to be re-raised there.

Tim Delaney 

___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
A minor sticking point - I don't like that the generator has to
re-raise any ``StopIteration`` passed in. Would it be possible to
have the semantics be: 

If a generator is resumed with ``StopIteration``, the exception
is raised at the resumption point (and stored for later use).
When the generator exits normally (i.e. ``return`` or falls off
the end) it re-raises the stored exception (if any) or raises a
new ``StopIteration`` exception. 
I don't like the idea of storing exceptions. Let's just say that we
don't care whether it re-raises the very same StopIteration exception
that was passed in or a different one -- it's all moot anyway because
the StopIteration instance is thrown away by the caller of next().
OK - so what is the point of the sentence::
   The generator should re-raise this exception; it should not yield
   another value.  

when discussing StopIteration?
Tim Delaney
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> OK - so what is the point of the sentence::
> 
> The generator should re-raise this exception; it should not yield
> another value.
> 
> when discussing StopIteration?

It forbids returning a value, since that would mean the generator
could "refuse" a break or return statement, which is a little bit too
weird (returning a value instead would turn these into continue
statements).

I'll change this to clarify that I don't care about the identity of
the StopIteration instance.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Nick Coghlan wrote:
> Brett C. wrote:
> 
>> And while the thought is in my head, I think block statements should
>> be viewed
>> less as a tweaked version of a 'for' loop and more as an extension to
>> generators that happens to be very handy for resource management (while
>> allowing iterators to come over and play on the new swing set as
>> well).  I
>> think if you take that view then the argument that they are too
>> similar to
>> 'for' loops loses some luster (although I doubt Nick is going to be
>> buy this  =) .
> 
> 
> I'm surprisingly close to agreeing with you, actually. I've worked out
> that it isn't the looping that I object to, it's the inability to get
> out of the loop without exhausting the entire iterator.
> 

'break' isn't enough for you as laid out by the proposal?  The raising of
StopIteration, which is what 'break' does according to the standard, should be
enough to stop the loop without exhausting things.  Same way you stop a 'for'
loop from executing entirely.

-Brett
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 03:58 PM 4/27/05 -0700, Guido van Rossum wrote:
OK, I sort of get it, at a very high-level, although I still feel this
is wildly out of my league.
I guess I should try it first. ;-)
It's not unlike David Mertz' articles on implementing coroutines and 
multitasking using generators, except that I'm adding more "debugging 
sugar", if you will, by making the tracebacks look normal.  It's just that 
the *how* requires me to pass the traceback into the generator.  At the 
moment, I accomplish that by doing a 3-argument raise inside of 
'events.resume()', but it would be really nice to be able to get rid of 
'events.resume()' in a future version of Python.
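The effect of that 3-argument raise can be sketched with the generator throw() method that later came out of this discussion (PEP 342); in 2005 the same thing required ``raise type, value, tb`` inside the driver (the names worker and events.resume here are illustrative):

```python
def worker():
    try:
        yield "waiting for I/O"
    except IOError as e:
        # The exception arrives with its original traceback, so the
        # error context looks like a normal call stack.
        yield "saw: %s" % e

g = worker()
assert next(g) == "waiting for I/O"

# Simulate an async I/O failure: raise it *inside* the generator at the
# point where it yielded, traceback and all (the job of events.resume()).
try:
    raise IOError("connection reset")
except IOError as exc:
    result = g.throw(exc)        # 2.x spelling: raise t, v, tb in resume()

assert result == "saw: connection reset"
```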


> Of course, it seems to me that you also have the problem of adding to the
> traceback when the same error is reraised...
I think when it is re-raised, no traceback entry should be added; the
place that re-raises it should not show up in the traceback, only the
place that raised it in the first place. To me that's the essence of
re-raising (and I think that's how it works when you use raise without
arguments).
I think maybe I misspoke.  I mean adding to the traceback *so* that when 
the same error is reraised, the intervening frames are included, rather 
than lost.

In other words, IIRC, the traceback chain is normally increased by one 
entry for each frame the exception escapes.  However, if you start hiding 
that inside of the exception instance, you'll have to modify it instead of 
just modifying the threadstate.  Does that make sense, or am I missing 
something?
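That is in fact how it works in modern Python, where the traceback lives on the exception instance: one entry is added for each frame the exception escapes, by updating the exception's stored traceback rather than only the threadstate:

```python
import traceback

def inner():
    raise RuntimeError("original failure")

def outer():
    inner()          # one more frame for the exception to escape

try:
    outer()
except RuntimeError as e:
    entries = traceback.extract_tb(e.__traceback__)

# One entry per escaped frame: the calling frame, then outer, then inner.
assert len(entries) == 3
assert [f.name for f in entries[-2:]] == ["outer", "inner"]
```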


> For that matter, I don't see a lot of value in
> hand-writing new objects with resume/error, instead of just using a 
generator.

Not a lot, but I expect that there may be a few, like an optimized
version of lock synchronization.
My point was mainly that we can err on the side of caller convenience 
rather than callee convenience, if there are fewer implementations.  So, 
e.g. multiple methods aren't a big deal if it makes the 'block' 
implementation simpler, if only generators and a handful of special 
template objects are going need to implement the block API.


> So, I guess I'm thinking you'd have something like tp_block_resume and
> tp_block_error type slots, and generators' tp_iter_next would just be the
> same as tp_block_resume(None).
I hadn't thought much about the C-level slots yet, but this is a
reasonable proposal.
Note that it also doesn't require a 'next()' builtin, or a next vs. 
__next__ distinction, if you don't try to overload iteration and 
templating.  The fact that a generator can be used for templating, doesn't 
have to imply that any iterator should be usable as a template, or that the 
iteration protocol is involved in any way.  You could just have 
__resume__/__error__ matching the tp_block_* slots.

This also has the benefit of making the delineation between template blocks 
and for loops more concrete.  For example, this:

block open("filename") as f:
...
could be an immediate TypeError (due to the lack of a __resume__) instead 
of biting you later on in the block when you try to do something with f, or 
because the block is repeating for each line of the file, etc.

___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Neil Schemenauer
On Wed, Apr 27, 2005 at 12:30:22AM -0700, Guido van Rossum wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).

[Note: most of these comments are based on version 1.2 of the PEP]

It seems like what you are proposing is a limited form of
coroutines.  Just as Python's generators are limited (yield can only
jump up one stack frame), these coroutines have a similar
limitation.  Someone mentioned that we are edging closer to
continuations.  I think that may be a good thing.  One big
difference between what you propose and general continuations is in
finalization semantics.  I don't think anyone has figured out a way
for try/finally to work with continuations.  The fact that
try/finally can be used inside generators is a significant feature
of this PEP, IMO.

Regarding the syntax, I actually quite like the 'block' keyword.  It
doesn't seem so surprising that the block may be a loop.

Allowing 'continue' to have an optional value is elegant syntax.
I'm a little bit concerned about what happens if the iterator does
not expect a value.  If I understand the PEP, it is silently
ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
any worse than a caller not expecting a return value.

It's interesting that there is such similarity between 'for' and
'block'.  Why is it that block does not call iter() on EXPR1?  I
guess the fact that 'break' and 'return' work differently is a more
significant difference.

After thinking about this more, I wonder if iterators meant for
'for' loops and iterators meant for 'block' statements are really
very different things.  It seems like a block-iterator really needs
to handle yield-expressions.

I wonder if generators that contain a yield-expression should
properly be called coroutines.  Practically, I suspect it would just
cause confusion.

Perhaps passing an Iteration instance to next() should not be
treated the same as passing None.  It seems like that would make
implementing the iterator easier.  Why not treat an Iteration instance
like any normal value?  Then only None, StopIteration, and ContinueIteration
would be special.

Argh, it took me so long to write this that you are already up to
version 1.6 of the PEP.  Time to start a new message. :-)

  Neil
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
> [Guido]
> 
>>>An alternative that solves this would be to give __next__() a second
>>>argument, which is a bool that should be true when the first argument
>>>is an exception that should be raised. What do people think?
>>>
>>>I'll add this to the PEP as an alternative for now.
> 
> 
> [Nick]
> 
>>An optional third argument (raise=False) seems a lot friendlier (and more
>>flexible) than a typecheck.
> 
> 
> I think I agree, especially since Phillip's alternative (a different
> method) is even worse IMO.
> 

The extra argument works for me as well.

> 
>>Yet another alternative would be for the default behaviour to be to raise
>>Exceptions, and continue with anything else, and have the third argument be
>>"raise_exc=True" and set it to False to pass an exception in without raising 
>>it.
> 
> 
> You've lost me there. If you care about this, can you write it up in
> more detail (with code samples or whatever)? Or we can agree on a 2nd
> arg to __next__() (and a 3rd one to next()).
> 

Channeling Nick, I think he is saying that the raising argument should be made
True by default and be named 'raise_exc'.

-Brett
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip]
> It's not unlike David Mertz' articles on implementing coroutines and
> multitasking using generators, except that I'm adding more "debugging
> sugar", if you will, by making the tracebacks look normal.  It's just that
> the *how* requires me to pass the traceback into the generator.  At the
> moment, I accomplish that by doing a 3-argument raise inside of
> 'events.resume()', but it would be really nice to be able to get rid of
> 'events.resume()' in a future version of Python.

I'm not familiar with Mertz' articles and frankly I still fear it's
head-explosive material. ;-)

> I think maybe I misspoke.  I mean adding to the traceback *so* that when
> the same error is reraised, the intervening frames are included, rather
> than lost.
> 
> In other words, IIRC, the traceback chain is normally increased by one
> entry for each frame the exception escapes.  However, if you start hiding
> that inside of the exception instance, you'll have to modify it instead of
> just modifying the threadstate.  Does that make sense, or am I missing
> something?

Adding to the traceback chain already in the exception object is
totally kosher, if that's where the traceback is kept.

> My point was mainly that we can err on the side of caller convenience
> rather than callee convenience, if there are fewer implementations.  So,
> e.g. multiple methods aren't a big deal if it makes the 'block'
> implementation simpler, if only generators and a handful of special
> template objects are going need to implement the block API.

Well, the way my translation is currently written, writing next(itr,
arg, exc) is a lot more convenient for the caller than having to write

    # if exc is True, arg is an exception; otherwise arg is a value
    if exc:
        err = getattr(itr, "__error__", None)
        if err is not None:
            VAR1 = err(arg)
        else:
            raise arg
    else:
        VAR1 = next(itr, arg)

but since this will actually be code generated by the bytecode
compiler, I think callee convenience is more important. And the
ability to default __error__ to raise the exception makes a lot of
sense. And we could wrap all this inside the next() built-in -- even
if the actual object should have separate __next__() and __error__()
methods, the user-facing built-in next() function might take an extra
flag to indicate that the argument is an exception, and to handle it
appropriate (like shown above).
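A sketch of that user-facing built-in (hypothetical name ``block_next`` so it doesn't shadow anything; the ``__next__(arg)``/``__error__(exc)`` protocol is the one proposed above):

```python
def block_next(itr, arg=None, exc=False):
    # exc=True means 'arg' is an exception to deliver to the iterator.
    if exc:
        err = getattr(itr, "__error__", None)
        if err is not None:
            return err(arg)
        raise arg                 # no __error__: don't bother the iterator
    return itr.__next__(arg)

class Echo:
    # Toy object implementing the proposed protocol, for illustration.
    def __next__(self, arg=None):
        return ("value", arg)
    def __error__(self, e):
        return ("handled", type(e).__name__)

itr = Echo()
assert block_next(itr, 42) == ("value", 42)
assert block_next(itr, ValueError("boom"), exc=True) == ("handled", "ValueError")
```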

> > > So, I guess I'm thinking you'd have something like tp_block_resume and
> > > tp_block_error type slots, and generators' tp_iter_next would just be the
> > > same as tp_block_resume(None).
> >
> >I hadn't thought much about the C-level slots yet, but this is a
> >reasonable proposal.
> 
> Note that it also doesn't require a 'next()' builtin, or a next vs.
> __next__ distinction, if you don't try to overload iteration and
> templating.  The fact that a generator can be used for templating, doesn't
> have to imply that any iterator should be usable as a template, or that the
> iteration protocol is involved in any way.  You could just have
> __resume__/__error__ matching the tp_block_* slots.
> 
> This also has the benefit of making the delineation between template blocks
> and for loops more concrete.  For example, this:
> 
>  block open("filename") as f:
>  ...
> 
> could be an immediate TypeError (due to the lack of a __resume__) instead
> of biting you later on in the block when you try to do something with f, or
> because the block is repeating for each line of the file, etc.

I'm not convinced of that, especially since all *generators* will
automatically be usable as templates, whether or not they were
intended as such. And why *shouldn't* you be allowed to use a block
for looping, if you like the exit behavior (guaranteeing that the
iterator is exhausted when you leave the block in any way)?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> It seems like what you are proposing is a limited form of
> coroutines.

Well, I though that's already what generators were -- IMO there isn't
much news there. We're providing a more convenient way to pass a value
back, but that's always been possible (see Fredrik's examples).

> Allowing 'continue' to have an optional value is elegant syntax.
> I'm a little bit concerned about what happens if the iterator does
> not expect a value.  If I understand the PEP, it is silently
> ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
> any worse then a caller not expecting a return value.

Exactly.

> It's interesting that there is such similarity between 'for' and
> 'block'.  Why is it that block does not call iter() on EXPR1?  I
> guess that fact that 'break' and 'return' work differently is a more
> significant difference.

Well, perhaps block *should* call iter()? I'd like to hear votes about
this. In most cases that would make a block-statement entirely
equivalent to a for-loop, the exception being only when there's an
exception or when breaking out of an iterator with resource
management.

I initially decided it should not call iter() so as to emphasize that
this isn't supposed to be used for looping over sequences -- EXPR1 is
really expected to be a resource management generator (or iterator).

> After thinking about this more, I wonder if iterators meant for
> 'for' loops and iterators meant for 'block' statements are really
> very different things.  It seems like a block-iterator really needs
> to handle yield-expressions.

But who knows, they might be useful for for-loops as well. After all,
passing values back to the generator has been on some people's wish
list for a long time.

> I wonder if generators that contain a yield-expression should
> properly be called coroutines.  Practically, I suspect it would just
> cause confusion.

I have to admit that I haven't looked carefully for use cases for
this! I just looked at a few Ruby examples and realized that it would
be a fairly simple extension of generators.

You can call such generators coroutines, but they are still generators.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Neil Schemenauer
On Wed, Apr 27, 2005 at 03:58:14PM -0700, Guido van Rossum wrote:
> Time to update the PEP; I'm pretty much settled on these semantics
> now...

[I'm trying to do a bit of Guido channeling here.  I fear I may not
be entirely successful.]

The __error__ method seems to simplify things a lot.  The
purpose of the __error__ method is to notify the iterator that the
loop has been exited in some unusual way (i.e. not via a
StopIteration raised by the iterator itself).

The translation of a block-statement could become:

    itr = EXPR1
    arg = None
    while True:
        try:
            VAR1 = next(itr, arg)
        except StopIteration:
            break
        try:
            arg = None
            BLOCK1
        except Exception, exc:
            err = getattr(itr, '__error__', None)
            if err is None:
                raise exc
            err(exc)

The translation of "continue EXPR2" would become:

    arg = EXPR2
    continue

The translation of "break" inside a block-statement would become:

    err = getattr(itr, '__error__', None)
    if err is not None:
        err(StopIteration())
    break

The translation of "return EXPR3" inside a block-statement would become:

    err = getattr(itr, '__error__', None)
    if err is not None:
        err(StopIteration())
    return EXPR3

For generators, calling __error__ with a StopIteration instance
would execute any 'finally' block.  Any other argument to __error__
would get re-raised by the generator instance.

You could then write:

    def opened(filename):
        fp = open(filename)
        try:
            yield fp
        finally:
            fp.close()

and use it like this:

    block opened(filename) as fp:
        ...

The main difference between 'for' and 'block' is that more iteration
may happen after breaking or returning out of a 'for' loop.  An
iterator used in a block statement is always used up before the
block is exited.
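Neil's opened() template is essentially what eventually shipped (via PEP 343) as contextlib.contextmanager in Python 2.5; for comparison, the same generator works there verbatim, with 'with' playing the role of 'block':

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def opened(filename):
    fp = open(filename)
    try:
        yield fp
    finally:
        fp.close()

# usage (a scratch file stands in for "filename"):
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as out:
    out.write("hello\n")

with opened(path) as fp:
    first = fp.readline()

assert first == "hello\n"
assert fp.closed          # closed even though we read only one line
os.remove(path)
```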

Maybe __error__ should be called __break__ instead.  StopIteration
is not really an error.  If it is called something like __break__,
does it really need to accept an argument?  Offhand I can't think of
what an iterator might do with an exception.

  Neil
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
[SNIP]
>>It's interesting that there is such similarity between 'for' and
>>'block'.  Why is it that block does not call iter() on EXPR1?  I
>>guess the fact that 'break' and 'return' work differently is a more
>>significant difference.
> 
> 
> Well, perhaps block *should* call iter()? I'd like to hear votes about
> this. In most cases that would make a block-statement entirely
> equivalent to a for-loop, the exception being only when there's an
> exception or when breaking out of an iterator with resource
> management.
> 

I am -0 on changing it to call iter().  I do like the distinction from a 'for'
loop and leaving an emphasis for template blocks (or blocks, or whatever hip
term you crazy kids are using for these things at the moment) to use
generators.  As I said before, I am viewing these blocks as a construct for
external control of generators, not as a snazzy 'for' loop.

-Brett
___


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 05:19 PM 4/27/05 -0700, Guido van Rossum wrote:
[Phillip]
> This also has the benefit of making the delineation between template blocks
> and for loops more concrete.  For example, this:
>
>  block open("filename") as f:
>  ...
>
> could be an immediate TypeError (due to the lack of a __resume__) instead
> of biting you later on in the block when you try to do something with f, or
> because the block is repeating for each line of the file, etc.
I'm not convinced of that, especially since all *generators* will
automatically be usable as templates, whether or not they were
intended as such. And why *shouldn't* you be allowed to use a block
for looping, if you like the exit behavior (guaranteeing that the
iterator is exhausted when you leave the block in any way)?
It doesn't guarantee that, does it?  (Re-reads PEP.)  Aha, for *generators* 
it does, because it says passing StopIteration in, stops execution of the 
generator.  But it doesn't say anything about whether iterators in general 
are allowed to be resumed afterward, just that they should not yield a 
value in response to the __next__, IIUC.  As currently written, it sounds 
like existing non-generator iterators would not be forced to an exhausted 
state.

As for the generator-vs-template distinction, I'd almost say that argues in 
favor of requiring some small extra distinction to make a generator 
template-safe, rather than in favor of making all iterators 
template-promiscuous, as it were.  Perhaps a '@block_template' decorator on 
the generator?  This would have the advantage of documenting the fact that 
the generator was written with that purpose in mind.

It seems to me that using a template block to loop over a normal iterator 
is a TOOWTDI violation, but perhaps you're seeing something deeper here...?



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Neil Schemenauer wrote:
> On Wed, Apr 27, 2005 at 03:58:14PM -0700, Guido van Rossum wrote:
> 
>>Time to update the PEP; I'm pretty much settled on these semantics
>>now...
> 
> 
> [I'm trying to do a bit of Guido channeling here.  I fear I may not
> be entirely successful.]
> 
> The __error__ method seems to simplify things a lot.  The
> purpose of the __error__ method is to notify the iterator that the
> loop has been exited in some unusual way (i.e. not via a
> StopIteration raised by the iterator itself).
> 
> The translation of a block-statement could become:
> 
> itr = EXPR1
> arg = None
> while True:
>     try:
>         VAR1 = next(itr, arg)
>     except StopIteration:
>         break
>     try:
>         arg = None
>         BLOCK1
>     except Exception, exc:
>         err = getattr(itr, '__error__', None)
>         if err is None:
>             raise exc
>         err(exc)
> 
> 
> The translation of "continue EXPR2" would become:
> 
> arg = EXPR2
> continue
> 
> The translation of "break" inside a block-statement would
> become:
> 
> err = getattr(itr, '__error__', None)
> if err is not None:
>     err(StopIteration())
> break
> 
> The translation of "return EXPR3" inside a block-statement would
> become:
> 
> err = getattr(itr, '__error__', None)
> if err is not None:
>     err(StopIteration())
> return EXPR3
> 
> For generators, calling __error__ with a StopIteration instance
> would execute any 'finally' block.  Any other argument to __error__
> would get re-raised by the generator instance.
> 
> You could then write:
> 
> def opened(filename):
>     fp = open(filename)
>     try:
>         yield fp
>     finally:
>         fp.close()
> 
> and use it like this:
> 
> block opened(filename) as fp:
> 
> 

Seems great to me.  Clean separation of when the block wants things to keep
going if it can and when it wants to let the generator know it's all done.

> The main difference between 'for' and 'block' is that more iteration
> may happen after breaking or returning out of a 'for' loop.  An
> iterator used in a block statement is always used up before the
> block is exited.
> 

This constant use of the phrase "used up" for these blocks is bugging me
slightly.  It isn't like the passed-in generator is having next() called on it
until it stops, it is just finishing up (or cleaning up, choose your favorite
term).  It may have had more iterations to go, but the block signaled it was
done and thus the generator got its chance to finish up and pick up after
itself.
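For concreteness, Neil's translation can be simulated in ordinary Python.  In
this sketch, blockdo() is an invented stand-in for the proposed
block-statement syntax, and the generator send()/throw() methods (the PEP 342
protocol) stand in for the proposed next(itr, arg) and __error__():

```python
import os
import tempfile

def blockdo(gen, body):
    """Rough simulation of the proposed block-statement translation.

    The name and signature are invented for illustration: each value the
    generator yields is bound and passed to the body, and an exception
    raised by the body is passed back into the generator (throw() here
    plays the role of the proposed __error__ method).
    """
    arg = None
    while True:
        try:
            var = gen.send(arg)
        except StopIteration:
            break
        try:
            arg = None
            body(var)
        except Exception as exc:
            gen.throw(exc)

def opened(filename):
    # Neil's resource-management generator, unchanged in spirit.
    fp = open(filename)
    try:
        yield fp
    finally:
        fp.close()

fd, path = tempfile.mkstemp()
os.close(fd)
seen = []
blockdo(opened(path), seen.append)  # like: block opened(path) as fp: ...
print(seen[0].closed)  # -> True: the file is closed once the block exits
os.remove(path)
```

Breaking out of (or returning from) the body maps onto the exception path, so
the generator's finally clause runs no matter how the block is left.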

> Maybe __error__ should be called __break__ instead.

I like that.

> StopIteration
> is not really an error.  If it is called something like __break__,
> does it really need to accept an argument?  Offhand I can't think of
> what an iterator might do with an exception.
> 

Could just make the default value be StopIteration.  Is there really a perk to
__break__ only raising StopIteration and not accepting an argument?

The real question is whether people would use the ability to raise other
exceptions passed in from the block.  If you view yield expressions as method
calls, then being able to call __break__ with other exceptions makes sense
since you might code up try/except statements within the generator and that
will care about what kind of exception gets raised.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 05:43 PM 4/27/05 -0700, Guido van Rossum wrote:
Well, perhaps block *should* call iter()? I'd like to hear votes about
this. In most cases that would make a block-statement entirely
equivalent to a for-loop, the exception being only when there's an
exception or when breaking out of an iterator with resource
management.
I initially decided it should not call iter() so as to emphasize that
this isn't supposed to be used for looping over sequences -- EXPR1 is
really expected to be a resource management generator (or iterator).
Which is why I vote for not calling iter(), and further, that blocks not 
use the iteration protocol, but rather use a new "block template" 
protocol.  And finally, that a decorator be used to convert a generator 
function to a "template function" (i.e., a function that returns a block 
template).

I think it's less confusing to have two completely distinct concepts, than 
to have two things that are very similar, yet different in a blurry kind of 
way.  If you want to use a block on an iterator, you can always explicitly 
do something like this:

@blocktemplate
def iterate(iterable):
    for value in iterable:
        yield value

block iterate([1,2,3]) as x:
    print x
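For what it's worth, here is a rough sketch of what such a decorator might
look like.  Every name here (BlockTemplate, blocktemplate, the extended
next() signature) is invented for illustration; the point is only that the
wrapper deliberately does not implement __iter__, so iter() and for-loops
reject it while a block-statement could still drive it:

```python
class BlockTemplate:
    """Hypothetical 'block template' wrapper: supports the PEP's
    extended next(itr, arg) protocol but not plain iteration."""

    def __init__(self, gen):
        self._gen = gen

    def next(self, arg=None):
        # send() stands in for the proposed next(itr, arg).
        if arg is None:
            return next(self._gen)
        return self._gen.send(arg)

def blocktemplate(genfunc):
    # Mark a generator function as intended for block use by wrapping
    # the generators it produces.
    def make_template(*args, **kwargs):
        return BlockTemplate(genfunc(*args, **kwargs))
    return make_template

@blocktemplate
def iterate(iterable):
    for value in iterable:
        yield value

t = iterate([1, 2, 3])
print(t.next(), t.next(), t.next())  # -> 1 2 3

try:
    iter(t)  # no __iter__: a for-loop would fail immediately
except TypeError:
    print("not usable in a for-loop")
```

This gives the immediate TypeError Phillip describes, instead of a confusing
failure later inside the loop body.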

> I wonder if generators that contain a yield-expression should
> properly be called coroutines.  Practically, I suspect it would just
> cause confusion.
I have to admit that I haven't looked carefully for use cases for
this!
Anything that wants to do co-operative multitasking, basically.


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
Neil Schemenauer wrote:
> For generators, calling __error__ with a StopIteration instance
> would execute any 'finally' block.  Any other argument to __error__
> would get re-raised by the generator instance.

This is only one case right?  Any exception (including StopIteration)
passed to a generator's __error__ method will just be re-raised at the
point of the last yield, right?  Or is there a need to special-case
StopIteration?

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
Phillip J. Eby wrote:
> At 05:19 PM 4/27/05 -0700, Guido van Rossum wrote:
> >I'm not convinced of that, especially since all *generators* will
> >automatically be usable as templates, whether or not they were
> >intended as such. And why *shouldn't* you be allowed to use a block
> >for looping, if you like the exit behavior (guaranteeing that the
> >iterator is exhausted when you leave the block in any way)?
> 
> It doesn't guarantee that, does it?  (Re-reads PEP.)  Aha, for *generators*
> it does, because it says passing StopIteration in, stops execution of the
> generator.  But it doesn't say anything about whether iterators in general
> are allowed to be resumed afterward, just that they should not yield a
> value in response to the __next__, IIUC.  As currently written, it sounds
> like existing non-generator iterators would not be forced to an exhausted
> state.

I wonder if something can be done like what was done for (dare I say
it?) "old-style" iterators:

"The intention of the protocol is that once an iterator's next()
method raises StopIteration, it will continue to do so on subsequent
calls. Implementations that do not obey this property are deemed
broken. (This constraint was added in Python 2.3; in Python 2.2,
various iterators are broken according to this rule.)"[1]

This would mean that if next(itr, ...) raised StopIteration, then
next(itr, ...) should continue to raise StopIteration on subsequent
calls.  I don't know how this is done in the current implementation. 
Would it be hard to do so for the proposed block-statements?

If nothing else, we might at least clearly document what well-behaved
iterators should do...
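For concreteness, here is a sketch (invented here, not from the thread) of a
wrapper that retrofits the "once StopIteration, always StopIteration"
behavior onto an arbitrary iterator, next to a deliberately broken iterator
that violates the rule:

```python
class Exhaustible:
    """Force any iterator to keep raising StopIteration once it has
    raised it, as the documented protocol requires."""

    def __init__(self, iterable):
        self._it = iter(iterable)
        self._done = False

    def __iter__(self):
        return self

    def __next__(self):
        if self._done:
            raise StopIteration
        try:
            return next(self._it)
        except StopIteration:
            self._done = True  # latch: stay exhausted forever
            raise

class Flaky:
    """A 'broken' iterator that alternates between raising
    StopIteration and yielding again."""

    def __init__(self):
        self.calls = 0

    def __iter__(self):
        return self

    def __next__(self):
        self.calls += 1
        if self.calls % 2:
            raise StopIteration
        return self.calls

broken = Flaky()
print(list(broken), list(broken))    # -> [] [2]: yields again after stopping

wrapped = Exhaustible(Flaky())
print(list(wrapped), list(wrapped))  # -> [] []: stays exhausted
```

Whether the block-statement machinery should impose such a latch, or merely
document the requirement, is exactly the open question here.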

STeVe

[1] http://docs.python.org/lib/typeiter.html
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Greg Ewing
Neil Schemenauer wrote:
The translation of a block-statement could become:
itr = EXPR1
arg = None
while True:
    try:
        VAR1 = next(itr, arg)
    except StopIteration:
        break
    try:
        arg = None
        BLOCK1
    except Exception, exc:
        err = getattr(itr, '__error__', None)
        if err is None:
            raise exc
        err(exc)
That can't be right. When __error__ is called, if the iterator
catches the exception and goes on to do another yield, the
yielded value needs to be assigned to VAR1 and the block
executed again. It looks like your version will ignore the
value from the second yield and only execute the block again
on the third yield.
So something like Guido's safe_loop() would miss every other
yield.
I think Guido was right in the first place, and __error__
really is just a minor variation on __next__ that shouldn't
have a separate entry point.
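The dropped yield is easy to demonstrate with the generator throw() method
(the PEP 342 protocol), which plays the role of __error__ here: when the
generator catches the exception and yields again, throw() returns that new
value, and a translation that ignores the return value of err(exc) discards
exactly that yield.  A minimal sketch, with tolerant() as an invented example
in the spirit of Guido's safe_loop():

```python
def tolerant():
    # Catches an error and keeps yielding, like safe_loop().
    while True:
        try:
            yield "value"
        except ValueError:
            pass

g = tolerant()
first = next(g)                  # the generator's first yield
second = g.throw(ValueError())   # generator catches it and yields again;
                                 # throw() *returns* that yielded value
print(first, second)             # -> value value
```

Neil's translation calls err(exc) as a statement, so the "second" value above
would never reach VAR1 or re-run the block.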
Greg



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Greg Ewing
Guido van Rossum wrote:
And surely you exaggerate.  How about this then:
    The with-statement is similar to the for-loop.  Until you've
    learned about the differences in detail, the only time you should
    write a with-statement is when the documentation for the function
    you are calling says you should.
I think perhaps I'm not expressing myself very well.
What I'm after is a high-level explanation that actually
tells people something useful, and *doesn't* cop out by
just saying "you're not experienced enough to understand
this yet".
If such an explanation can't be found, I strongly suspect
that this doesn't correspond to a cohesive enough concept
to be made into a built-in language feature. If you can't
give a short, understandable explanation of it, then it's
probably a bad idea.
Greg