Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-05-02 Thread Arnaud Delobelle
On 1 May 2015 at 21:27, Yury Selivanov  wrote:
> On 2015-05-01 4:24 PM, Arnaud Delobelle wrote:
>>
>> On 1 May 2015 at 20:24, Yury Selivanov  wrote:
>>>
>>> On 2015-05-01 3:19 PM, Ethan Furman wrote:
>>
>> [...]

 If we must have __aiter__, then we may as well also have __anext__;
 besides being more consistent, it also allows an object to be both a
 normal iterator and an async iterator.
>>>
>>>
>>> And this is a good point too.
>>
>> I'm not convinced that allowing an object to be both a normal and an
>> async iterator is a good thing.  It could be a recipe for confusion.
>>
>
>
> I doubt that it will be a popular thing.  But disallowing this
> by merging two different protocols in one isn't a good idea
> either.

I haven't been arguing for merging two different protocols.  I'm saying
that allowing an object to be both a normal and an async iterable is not
an argument for having separate protocols, because it's not a good thing.
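For concreteness, the kind of dual-protocol object under discussion might look like this (a sketch; the class and its behaviour are made up, not from the thread):

```python
import asyncio

class Both:
    """Sketch: one object implementing both iterator protocols."""
    def __init__(self, items):
        self._it = iter(items)

    # normal iterator protocol
    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    # async iterator protocol (PEP 492)
    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            raise StopAsyncIteration

print(list(Both("ab")))      # normal iteration

async def drain():
    return [x async for x in Both("cd")]

print(asyncio.run(drain()))  # async iteration over the same class
```

Nothing stops a class from doing this under the separate-protocol design; whether that is a feature or a confusion is exactly the point being argued.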

Cheers,

-- 
Arnaud
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev


Re: [Python-Dev] PEP 492: What is the real goal?

2015-05-02 Thread Arnaud Delobelle
On 29 April 2015 at 20:42, Yury Selivanov  wrote:
> Everybody is pulling me in a different direction :)
> Guido proposed to call them "native coroutines".  Some people
> think that "async functions" is a better name.  Greg loves
> his "cofunction" term.
>
> I'm flexible about how we name 'async def' functions.  I like
> to call them "coroutines", because that's what they are, and
> that's how asyncio calls them.  It's also convenient to use
> 'coroutine-object' to explain what is the result of calling
> a coroutine.

I'd like the object created by an 'async def' statement to be called a
'coroutine function' and the result of calling it to be called a
'coroutine'.

This is consistent with the usage of 'generator function' and
'generator', and has two advantages IMO:
- they both follow the pattern: an 'X function' is a function
definition that, when called, returns an 'X'.
- When the day comes to define generator coroutines, then it will be
clear what to call them: 'generator coroutine function' will be the
function definition and 'generator coroutine' will be the object it
creates.
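The analogy with generator functions can be made concrete with the standard introspection helpers (a sketch using today's Python, which postdates this thread):

```python
import asyncio
import inspect

async def fetch():      # the 'coroutine function' (defined by 'async def')
    return 42

coro = fetch()          # calling it creates the 'coroutine'
print(inspect.iscoroutinefunction(fetch))
print(inspect.iscoroutine(coro))
coro.close()            # avoid the "never awaited" warning

print(asyncio.run(fetch()))
```

This mirrors the generator pair exactly: `def f(): yield` is a generator function, and `f()` is a generator.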

Cheers,

-- 
Arnaud


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-05-02 Thread Arnaud Delobelle
On 1 May 2015 at 20:24, Yury Selivanov  wrote:
> On 2015-05-01 3:19 PM, Ethan Furman wrote:
[...]
>> If we must have __aiter__, then we may as well also have __anext__;
>> besides being more consistent, it also allows an object to be both a
>> normal iterator and an async iterator.
>
>
> And this is a good point too.

I'm not convinced that allowing an object to be both a normal and an
async iterator is a good thing.  It could be a recipe for confusion.

-- 
Arnaud


Re: [Python-Dev] PEP 492: What is the real goal?

2015-05-02 Thread Paul Sokolovsky
Hello,

On Thu, 30 Apr 2015 18:53:00 +1200
Greg Ewing  wrote:

> Skip Montanaro wrote:
> > According to Wikipedia, the
> > term "coroutine" was first coined in 1958, so several generations
> > of computer science graduates will be familiar with the textbook
> > definition. If your use of "coroutine" matches the textbook
> > definition of the term, I think you should continue to use it
> > instead of inventing new names which will just confuse people new
> > to Python.
> 
> I don't think anything in asyncio or PEP 492 fits that
> definition directly. Generators and async def functions
> seem to be what that page calls a "generator" or "semicoroutine":
> 
> they differ in that coroutines can control where execution
> continues after they yield, while generators cannot, instead
> transferring control back to the generator's caller.

But of course it's only a Wikipedia page, which doesn't mean it has to
provide a complete and well-defined picture; the quality of some
(important) Wikipedia pages is indeed pretty poor and isn't improving.
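The distinction the quoted page draws is at least easy to see in Python itself: a generator can only transfer control back to whichever caller resumed it, and that caller chooses when each resumption happens:

```python
def gen():
    yield 1   # control returns to gen's caller; gen cannot pick the target
    yield 2

g = gen()
print(next(g))  # the caller decides when to resume
print(next(g))
```

A "full" coroutine in the textbook sense could instead name the coroutine to transfer to, which neither generators nor `async def` functions do directly.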


-- 
Best regards,
 Paul  mailto:pmis...@gmail.com


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-05-02 Thread Alexander Walters
Out of curiosity, how much of a breaking change would making unary 
operators stack arbitrarily be?



On 4/30/2015 23:57, Nathaniel Smith wrote:


On Apr 30, 2015 8:40 PM, "Guido van Rossum" wrote:

>
> On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith wrote:

>> The actual effect of making "await" a different precedence is to
>> resolve the ambiguity in
>>
>>   await x ** 2
>>
>> If await acted like -, then this would be
>>   await (x ** 2)
>> But with the proposed grammar, it's instead
>>   (await x) ** 2
>> Which is probably correct, and produces the IMHO rather nice
>> invariant that "await" binds more tightly than arithmetic in general
>> (instead of having to say that it binds more tightly than arithmetic
>> *except* in this one corner case...)

>
> Correct.
>
>> AFAICT these and the ** case are the only expressions where there's
>> any difference between Yury's proposed grammar and your proposal of
>> treating await like unary minus. But then given the limitations of
>> Python's parser plus the desire to disambiguate the expression above
>> in the given way, it becomes an arguably regrettable, yet inevitable,
>> consequence that
>>
>>   await -fut
>>   await +fut
>>   await ~fut
>> become parse errors.
>
> Why is that regrettable? Do you have a plan for overloading one of
> those on Futures? I personally consider it a feature that you can't do
> that. :-)


I didn't say it was regrettable, I said it was arguably regrettable. 
For proof, see the last week of python-dev ;-).


(I guess all else being equal it would be nice if unary operators 
could stack arbitrarily, since that really is the more natural parse 
rule IMO and also if things had worked that way then I would have 
spent this thread less confused. But this is a pure argument from 
elegance. In practice there's obviously no good reason to write "await 
-fut" or "-not x", so meh, whatever.)


-n





Re: [Python-Dev] What's missing in PEP-484 (Type hints)

2015-05-02 Thread Florian Bruhin
* Dima Tisnek  [2015-04-30 13:41:53 +0200]:
> # lambda
> Not mentioned in the PEP, omitted for convenience or is there a rationale?
> f = lambda x: None if x is None else str(x ** 2)
> Current syntax seems to preclude annotation of `x` due to colon.
> Current syntax sort of allows lamba return type annotation, but it's
> easy to confuse with `f`.

Not sure if you'd really want to stuff type annotations into a
lambda... at that point you'd IMHO be better off by using a real
function.
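For concreteness, a sketch of that trade-off (the names f and g are illustrative):

```python
from typing import Optional

# no room for a parameter annotation: `lambda x: int: ...` won't parse
f = lambda x: None if x is None else str(x ** 2)

# a real function states the types directly
def g(x: Optional[int]) -> Optional[str]:
    return None if x is None else str(x ** 2)

print(g(3), g(None))
```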

> # local variables
> Not mentioned in the PEP
> Non-trivial code could really use these.
> 
> 
> # global variables
> Not mentioned in the PEP
> Module-level globals are part of API, annotation is welcome.
> What is the syntax?
> 
> 
> # comprehensions
> [3 * x.data for x in foo if "bar" in x.type]
> Arguable, perhaps annotation is only needed on `foo` here, but in
> complex comprehensions, e.g. below, the intermediate comprehension
> could use an annotation:
> [xx for y in [...] if ...]
> 
> 
> # class attributes
> s = socket.socket(...)
> s.type, s.family, s.proto  # int
> s.fileno  # callable
> If annotations are only available for methods, it will lead to
> Java-style explicit getters and setters.
> Python language and data model prefers properties instead, thus
> annotations are needed on attributes.
> 
> 
> # plain data
> user1 = dict(id=123,  # always int
> name="uuu",  # always str
> ...)  # other fields possible
> smth = [42, "xx", ...]
> (why not namedtuple? b/c extensible, mutable)
> At least one PHP IDE allows to annotate PDO.
> Perhaps it's just bad taste in Python? Or is there a valid use-case?

Most (all?) of this is actually mentioned in the PEP:
https://www.python.org/dev/peps/pep-0484/#type-comments
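For example, the PEP's type-comment syntax covers the local and module-level variable cases raised above (a sketch; the names are made up):

```python
from typing import Dict, List

MAX_USERS = 100  # type: int  (module-level global)

def register(names):
    # type: (List[str]) -> Dict[str, int]
    ids = {}  # type: Dict[str, int]  (local variable)
    for i, name in enumerate(names):
        ids[name] = i
    return ids

print(register(["ann", "bob"]))
```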

Florian

-- 
http://www.the-compiler.org | m...@the-compiler.org (Mail/XMPP)
   GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc
 I love long mails! | http://email.is-not-s.ms/




[Python-Dev] Sub-classing pathlib.Path seems impossible

2015-05-02 Thread Christophe Bal
Hello.

In this post, I have noticed a problem with the following code.

from pathlib import Path

class PPath(Path):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

test = PPath("dir", "test.txt")
This gives the following error message.



Traceback (most recent call last):
  File "/Users/projetmbc/test.py", line 14, in <module>
    test = PPath("dir", "test.txt")
  File "/anaconda/lib/python3.4/pathlib.py", line 907, in __new__
    self = cls._from_parts(args, init=False)
  File "/anaconda/lib/python3.4/pathlib.py", line 589, in _from_parts
    drv, root, parts = self._parse_args(args)
  File "/anaconda/lib/python3.4/pathlib.py", line 582, in _parse_args
    return cls._flavour.parse_parts(parts)
AttributeError: type object 'PPath' has no attribute '_flavour'
>
This breaks sub-classing from the Python point of view. In the post,
I give a hack to sub-class Path, but it's a bit unpythonic.
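For reference, one workaround that works with 3.4's pathlib is to subclass the concrete platform class, which does define _flavour (a sketch, not necessarily the hack from the linked post):

```python
import os
import pathlib

# Path.__new__ dispatches on cls._flavour, which only the concrete
# classes (PosixPath / WindowsPath) define -- so subclass those instead.
_Base = pathlib.WindowsPath if os.name == "nt" else pathlib.PosixPath

class PPath(_Base):
    pass

test = PPath("dir", "test.txt")
print(test)
```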

*Christophe BAL*
*Enseignant de mathématiques en Lycée **et développeur Python amateur*
*---*
*French math teacher in a "Lycée" **and **Python **amateur developer*


Re: [Python-Dev] Unicode literals in Python 2.7

2015-05-02 Thread Stephen J. Turnbull
Adam Bartoš writes:

 > I'll describe my picture of the situation, which might be terribly wrong.
 > On Linux, in a typical situation, we have a UTF-8 terminal,
 > PYTHONIOENCODING=utf-8, GNU readline is used. When the REPL wants input
 > from a user the tokenizer calls PyOS_Readline, which calls GNU readline.
 > The user is prompted >>> , during the input he can use autocompletion and
 > everything and he enters u'α'. PyOS_Readline returns b"u'\xce\xb1'" (as
 > char* or something),

It's char*, according to Parser/myreadline.c.  It is not str in Python
2.

 > which is UTF-8 encoded input from the user.

By default, it's just ASCII-compatible bytes.  I don't know offhand
where, but somehow PYTHONIOENCODING tells Python it's UTF-8 -- that's
how Python knows about it in this situation.

 > The tokenizer, parser, and evaluator process the input and the result is
 > u'\u03b1', which is printed as an answer.
 >
 > In my case I install custom sys.std* objects and a custom readline
 > hook.  Again, the tokenizer calls PyOS_Readline, which calls my
 > readline hook, which calls sys.stdin.readline(),

This is your custom version?

 > which returns an Unicode string a user entered (it was decoded from
 > UTF-16-LE bytes actually). My readline hook encodes this string to
 > UTF-8 and returns it. So the situation is the same.  The tokenizer
 > gets b"u'\xce\xb1'" as before, but now it results in u'\xce\xb1'.
 > 
 > Why is the result different?

The result is different because Python doesn't "learn" that the actual
encoding is UTF-8.  If you have tried setting PYTHONIOENCODING=utf-8
with your setup and that doesn't work, I'm not sure where the
communication is failing.

The only other thing I can think of is to set the encoding
sys.stdin.encoding.  That may be readonly, though (that would explain
why the only way to set the PYTHONIOENCODING is via an environment
variable).  At least you could find out what it is, with and without
PYTHONIOENCODING set to 'utf-8' (or maybe it's 'utf8' or 'UTF-8' --
all work as expected with unicode.encode/str.decode on Mac OS X).

Or it could be unimplemented in your replacement module.

 > I though that in the first case PyCF_SOURCE_IS_UTF8 might have been
 > set. And after your suggestion, I thought that
 > PYTHONIOENCODING=utf-8 is the thing that also sets
 > PyCF_SOURCE_IS_UTF8.

No.  PyCF_SOURCE_IS_UTF8 is set unconditionally in the functions
builtin_{eval,exec,compile}_impl in Python/bltins.c in the cases that
matter AFAICS.  It's not obvious to me under what conditions it might
*not* be set.  It is then consulted in ast.c in PyAST_FromNodeObject,
and nowhere else that I can see.



[Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Ethan Furman
According to https://www.python.org/dev/peps/pep-0492/#id31:

  The [types.coroutine] function applies CO_COROUTINE flag to
  generator-function's code object, making it return a coroutine
  object.

Further down in https://www.python.org/dev/peps/pep-0492/#id32:

   [await] uses the yield from implementation with an extra step of
   validating its argument. await only accepts an awaitable,
   which can be one of:

 ...

 - A generator-based coroutine object returned from a generator
   decorated with types.coroutine().

If I'm understanding this correctly, types.coroutine's only purpose is to add
a flag to a generator object so that await will accept it.

This raises the question of why can't await simply accept a generator
object?  There is no code change to the gen obj itself, there is no
behavior change in the gen obj, it's the exact same byte code, only a
flag is different.

types.coroutine feels a lot like unnecessary boiler-plate.

--
~Ethan~


Re: [Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Yury Selivanov

On 2015-05-02 1:04 PM, Ethan Furman wrote:

According to https://www.python.org/dev/peps/pep-0492/#id31:

   The [types.coroutine] function applies CO_COROUTINE flag to
   generator-function's code object, making it return a coroutine
   object.

Further down in https://www.python.org/dev/peps/pep-0492/#id32:

[await] uses the yield from implementation with an extra step of
validating its argument. await only accepts an awaitable,
which can be one of:

  ...

  - A generator-based coroutine object returned from a generator
decorated with types.coroutine().

If I'm understanding this correctly, types.coroutine's only purpose is to add
a flag to a generator object so that await will accept it.

This raises the question of why can't await simply accept a generator
object?  There is no code change to the gen obj itself, there is no
behavior change in the gen obj, it's the exact same byte code, only a
flag is different.



Because we don't want 'await' to accept random generators.
It can't do anything meaningful with them; in a world where
all asyncio code is written with the new syntax, passing a generator
to 'await' is just a bug.

'types.coroutine' is something that we need to ease transition
to the new syntax.
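A minimal sketch of that transition story (legacy_op and the manual driving are illustrative): the decorator is what lets an old generator-based coroutine appear on the right-hand side of await.

```python
import types

@types.coroutine
def legacy_op():               # old-style, generator-based coroutine
    result = yield "suspend"
    return result

async def main():
    return await legacy_op()   # accepted only because of @types.coroutine

# drive it by hand, standing in for an event loop:
coro = main()
print(coro.send(None))         # the value yielded out to the "loop"
try:
    coro.send("resumed")
except StopIteration as stop:
    print(stop.value)          # main's return value
```

Without the decorator, `await legacy_op()` raises a TypeError, which is the point: plain generators are rejected.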

Yury


Re: [Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Ethan Furman
On 05/02, Yury Selivanov wrote:
> On 2015-05-02 1:04 PM, Ethan Furman wrote:

>> If I'm understanding this correctly, types.coroutine's only purpose is to add
>> a flag to a generator object so that await will accept it.
>> 
>> This raises the question of why can't await simply accept a generator
>> object?  There is no code change to the gen obj itself, there is no
>> behavior change in the gen obj, it's the exact same byte code, only a
>> flag is different.
> 
> Because we don't want 'await' to accept random generators.
> It can't do anything meaningful with them, in a world where
> all asyncio code is written with new syntax, passing generator
> to 'await' is just a bug.

And yet in current asyncio code, random generators can be accepted, and not
even the current asyncio.coroutine wrapper can guarantee that the generator
is in fact a coroutine.

For that matter, even the new types.coroutine cannot guarantee that the
returned object is a coroutine and not a generator -- so basically it's just
there to tell the compiler, "yeah, I really know what I'm doing, shut up and
do what I asked."

> 'types.coroutine' is something that we need to ease transition
> to the new syntax.

This doesn't make sense -- either the existing generators are correctly
returning coroutines, in which case the decorator adds nothing, or they
are returning non-coroutines, in which case the decorator adds nothing.

So either way, nothing has been added besides a mandatory boiler-plate
requirement.

--
~Ethan~


Re: [Python-Dev] Unicode literals in Python 2.7

2015-05-02 Thread Adam Bartoš
I think I have found out where the problem is. In fact, the encoding of the
interactive input is determined by sys.stdin.encoding, but only in the case
that it is a file object (see
https://hg.python.org/cpython/file/d356e68de236/Parser/tokenizer.c#l890 and
the implementation of tok_stdin_decode). For example, by default on my
system sys.stdin has encoding cp852.

>>> u'á'
u'\xe1' # correct
>>> import sys; sys.stdin = "foo"
>>> u'á'
u'\xa0' # incorrect

Even if sys.stdin contained a file-like object with a proper encoding
attribute, it wouldn't work, since sys.stdin has to be an instance of the
built-in file type. So the question is whether it is possible to make a
file instance in Python that is also customizable, so it may call my code.
For a start, how does one change the value of the encoding attribute of a
file object?


Re: [Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Yury Selivanov



On 2015-05-02 2:14 PM, Ethan Furman wrote:

On 05/02, Yury Selivanov wrote:

On 2015-05-02 1:04 PM, Ethan Furman wrote:

If I'm understanding this correctly, types.coroutine's only purpose is to add
a flag to a generator object so that await will accept it.

This raises the question of why can't await simply accept a generator
object?  There is no code change to the gen obj itself, there is no
behavior change in the gen obj, it's the exact same byte code, only a
flag is different.

Because we don't want 'await' to accept random generators.
It can't do anything meaningful with them, in a world where
all asyncio code is written with new syntax, passing generator
to 'await' is just a bug.

And yet in current asyncio code, random generators can be accepted, and not
even the current asyncio.coroutine wrapper can guarantee that the generator
is a coroutine in fact.


This is a flaw in the current Python that we want to fix.


For that matter, even the new types.coroutine cannot guarantee that the
returned object is a coroutine and not a generator -- so basically it's just
there to tell the compiler, "yeah, I really know what I'm doing, shut up and
do what I asked."


Well, why would you use it on some generator that is not
a generator-based coroutine?



'types.coroutine' is something that we need to ease transition
to the new syntax.

This doesn't make sense -- either the existing generators are correctly
returning coroutines, in which case the decorator adds nothing, or they
are returning non-coroutines, in which case the decorator adds nothing.

So either way, nothing has been added besides a mandatory boiler-plate
requirement.


It's not nothing; it's backwards compatibility.  Please read
https://www.python.org/dev/peps/pep-0492/#await-expression

@types.coroutine marks a generator function so that its generator
objects are awaitable.

Yury


Re: [Python-Dev] PEP 492 quibble and request

2015-05-02 Thread Ron Adam

On 05/02/2015 04:18 PM, Arnaud Delobelle wrote:

On 1 May 2015 at 20:59, Guido van Rossum  wrote:

>On Fri, May 1, 2015 at 12:49 PM, Ron Adam  wrote:

>>
>>
>>Another useful async function might be...
>>
>>async def yielding():
>>pass
>>
>>If a routine is taking a very long time, just inserting "await yielding()"
>>in the long calculation would let other awaitables run.
>>

>That's really up to the scheduler, and a function like this should be
>provided by the event loop or scheduler framework you're using.

Really?  I was under the impression that 'await yielding()' as defined
above would actually not suspend the coroutine at all, therefore not
giving any opportunity for the scheduler to resume another coroutine,
and I thought I understood the PEP well enough.  Does this mean that
somehow "await x" guarantees that the coroutine will suspend at least
once?

To me the async def above was the equivalent of the following in the
'yield from' world:

def yielding():
 return
 yield # Just to make it a generator

Then "yield from yielding()" will not yield at all - which makes its
name rather misleading!


It was my understanding that yield from also suspends the current thread, 
allowing others to run.  Of course if it's the only thread, it would not.  
But maybe I'm misremembering earlier discussions.  If it doesn't suspend 
the current thread, then you need to put scheduler.sleep() calls throughout 
your co-routines.


I think Guido is saying it could be either.

Cheers,
   Ron






Re: [Python-Dev] PEP 492 quibble and request

2015-05-02 Thread Guido van Rossum
On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle  wrote:

> On 1 May 2015 at 20:59, Guido van Rossum  wrote:
> > On Fri, May 1, 2015 at 12:49 PM, Ron Adam  wrote:
> >>
> >>
> >> Another useful async function might be...
> >>
> >>async def yielding():
> >>pass
> >>
> >> If a routine is taking a very long time, just inserting "await yielding()"
> >> in the long calculation would let other awaitables run.
> >>
> > That's really up to the scheduler, and a function like this should be
> > provided by the event loop or scheduler framework you're using.
>
> Really?  I was under the impression that 'await yielding()' as defined
> above would actually not suspend the coroutine at all, therefore not
> giving any opportunity for the scheduler to resume another coroutine,
> and I thought I understood the PEP well enough.  Does this mean that
> somehow "await x" guarantees that the coroutine will suspend at least
> once?
>

You're correct. That's why I said it should be left up to the framework --
ultimately what you *do* have to put in such a function has to be
understood by the framework. And that's why in other messages I've used
await asyncio.sleep(0) as an example. Look up its definition.
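For reference, asyncio.sleep(0) is special-cased to yield control to the event loop exactly once, so it behaves as the cooperative "yielding()" discussed above. A sketch of the effect:

```python
import asyncio

async def busy(results):
    for i in range(3):
        results.append("busy %d" % i)
        await asyncio.sleep(0)   # really suspends: lets other tasks run

async def other(results):
    results.append("other ran")

async def main():
    results = []
    await asyncio.gather(busy(results), other(results))
    return results

print(asyncio.run(main()))
```

Replacing the sleep(0) with `await` of a coroutine that never suspends would leave "other ran" waiting until busy() finished entirely.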


> To me the async def above was the equivalent of the following in the
> 'yield from' world:
>
> def yielding():
> return
> yield # Just to make it a generator
>
> Then "yield from yielding()" will not yield at all - which makes its
> name rather misleading!
>

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Ethan Furman
On 05/02, Yury Selivanov wrote:
> On 2015-05-02 2:14 PM, Ethan Furman wrote:
>> On 05/02, Yury Selivanov wrote:
>>> On 2015-05-02 1:04 PM, Ethan Furman wrote:

>> And yet in current asyncio code, random generators can be accepted, and not
>> even the current asyncio.coroutine wrapper can guarantee that the generator
>> is a coroutine in fact.

> This is a flaw in the current Python that we want to fix.

Your "fix" doesn't fix it.  I can decorate a non-coroutine generator with
types.coroutine and it will still be broken and a bug in my code.


>> For that matter, even the new types.coroutine cannot guarantee that the
>> returned object is a coroutine and not a generator -- so basically it's just
>> there to tell the compiler, "yeah, I really know what I'm doing, shut up and
>> do what I asked."
> 
> Well, why would you use it on some generator that is not
> a generator-based coroutine?

I wouldn't, that would be a bug; but decorating a wrong type of generator is
still a bug, and types.coroutine has not fixed that bug.

It's worse than mandatory typing because it can't even check that what I have
declared is true.


>> So either way, nothing has been added besides a mandatory boiler-plate
>> requirement.
> 
> It's not nothing; it's backwards compatibility.  Please read
> https://www.python.org/dev/peps/pep-0492/#await-expression

I have read it, more than once.  If you lift the (brand-new) requirement that a
generator must be decorated, then types.coroutine becomes optional (no more
useful, just optional).  It is not a current requirement that coroutine
generators be decorated.

--
~Ethan~


Re: [Python-Dev] PEP 492 and types.coroutine

2015-05-02 Thread Guido van Rossum
Ethan, at this point your continued arguing is not doing anybody anything
good. Please stop.

On Sat, May 2, 2015 at 2:31 PM, Ethan Furman  wrote:

> On 05/02, Yury Selivanov wrote:
> > On 2015-05-02 2:14 PM, Ethan Furman wrote:
> >> On 05/02, Yury Selivanov wrote:
> >>> On 2015-05-02 1:04 PM, Ethan Furman wrote:
>
> >> And yet in current asyncio code, random generators can be accepted, and
> not
> >> even the current asyncio.coroutine wrapper can gaurantee that the
> generator
> >> is a coroutine in fact.
>
> > This is a flaw in the current Python that we want to fix.
>
> Your "fix" doesn't fix it.  I can decorate a non-coroutine generator with
> types.coroutine and it will still be broken and a bug in my code.
>
>
> >> For that matter, even the new types.coroutine cannot guarantee that the
> >> returned object is a coroutine and not a generator -- so basically it's
> just
> >> there to tell the compiler, "yeah, I really know what I'm doing, shut
> up and
> >> do what I asked."
> >
> > Well, why would you use it on some generator that is not
> > a generator-based coroutine?
>
> I wouldn't, that would be a bug; but decorating a wrong type of generator
> is
> still a bug, and types.coroutine has not fixed that bug.
>
> It's worse than mandatory typing because it can't even check that what I
> have
> declared is true.
>
>
> >> So either way, nothing has been added besides a mandatory boiler-plate
> >> requirement.
> >
> > It's not nothing; it's backwards compatibility.  Please read
> > https://www.python.org/dev/peps/pep-0492/#await-expression
>
> I have read it, more than once.  If you lift the (brand-new) requirement
> that a
> generator must be decorated, then types.coroutine becomes optional (no more
> useful, just optional).  It is not a current requirement that coroutine
> generators be decorated.
>
> --
> ~Ethan~



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 quibble and request

2015-05-02 Thread Greg Ewing

Guido van Rossum wrote:
On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle wrote:


Does this mean that
somehow "await x" guarantees that the coroutine will suspend at least
once?


No. First, it's possible for x to finish without yielding.
But even if x yields, there is no guarantee that the
scheduler will run something else -- it might just
resume the same task, even if there is another one that
could run. It's up to the scheduler whether it
implements any kind of "fair" scheduling policy.
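The first point is easy to demonstrate by driving a coroutine by hand (a minimal sketch): if the awaited coroutine never yields, the caller runs to completion in a single step.

```python
async def immediate():
    return 1            # no suspension point inside: never yields

async def caller():
    return await immediate()

coro = caller()
try:
    coro.send(None)     # completes immediately, without suspending
except StopIteration as stop:
    print(stop.value)
```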

--
Greg


[Python-Dev] ABCs - Re: PEP 492: async/await in Python; version 4

2015-05-02 Thread Stefan Behnel
Stefan Behnel wrote on 2015-05-02 at 06:54:
> Yury Selivanov wrote on 2015-05-01 at 20:52:
>> I don't like the idea of combining __next__ and __anext__.
> 
> Ok, fair enough. So, how would you use this new protocol manually then?
> Say, I already know that I won't need to await the next item that the
> iterator will return. For normal iterators, I could just call next() on it
> and continue the for-loop. How would I do it for AIterators?

BTW, I guess that this "AIterator", or rather "AsyncIterator", needs to be
a separate protocol (and ABC) then. Implementing "__aiter__()" and
"__anext__()" seems perfectly reasonable without implementing (or using) a
Coroutine.

That means we also need an "AsyncIterable" as a base class for it.

Would Coroutine then inherit from both Iterator and AsyncIterator? Or
should we try to separate the protocol hierarchy completely? The section on
"Coroutine objects" seems to suggest that inheritance from Iterator is not
intended.

OTOH, I'm not sure if inheriting from AsyncIterator is intended for
Coroutine. The latter might just be a stand-alone ABC with
send/throw/close, after all.

I think that in order to get a better understanding of the protocol(s) that
this PEP proposes, and the terminology that it should use, it would help to
implement these ABCs.

That might even help us to decide if we need new builtins (or helpers)
aiter() and anext() in order to deal with these protocols.
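As a sketch of what such a stand-alone async-iteration protocol looks like in practice (the Counter class is a made-up example; it implements only __aiter__/__anext__, with no send/throw/close):

```python
import asyncio

class Counter:
    """An async iterator that is not a coroutine: no send/throw/close."""
    def __init__(self, n):
        self.i, self.n = 0, n

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        self.i += 1
        return self.i

async def collect():
    return [x async for x in Counter(3)]

print(asyncio.run(collect()))
```

(For what it's worth, the helpers mooted here did eventually appear: aiter() and anext() became builtins in Python 3.10.)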

Stefan

