Re: [Python-Dev] On the METH_FASTCALL calling convention

2018-07-07 Thread Stefan Behnel
Jeroen Demeyer wrote on 05.07.2018 at 16:53:
> The only case when this handling of keywords is suboptimal is when using
> **kwargs. In that case, a dict must be converted to a tuple. It looks hard
> to me to support efficiently both the case of fixed keyword arguments
> (f(foo=x)) and a keyword dict (f(**kwargs)). Since the former is more
> common than the latter, the current choice is optimal.

Wrappers that adapt or add some arguments (think partial()) aren't all that
uncommon, even when speed is not irrelevant. But I agree that actual
keyword arguments should rarely be involved in those calls.
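
As a rough illustration of the kind of wrapper meant here (a made-up example,
just to have something concrete):

    from functools import partial

    def log(message, *, level="INFO"):
        print("[%s] %s" % (level, message))

    # A wrapper that pre-binds one argument and forwards the rest unchanged.
    log_error = partial(log, level="ERROR")
    log_error("disk almost full")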

Typically, it's calls with 1 to ~3 positional arguments that occur in
performance critical situations. Often just one, rarely more, and zero
arguments is a fast case anyway. Keyword arguments will always suffer some
kind of penalty compared to positional arguments, regardless of how they
are implemented (at runtime). But they can easily be avoided in many cases,
and anyone designing a performance relevant API that *requires* keyword
arguments deserves to have their code forked away from them. :)

The current keyword calling conventions seem fine to me and I do not see
a reason why we should invest discussion time and distributed brain
capacity into "improving" them.

Stefan


PS: Passing keyword arguments through wrappers unchanged might be a case to
consider in the future, but the calling PEPs don't seem like the right place to
discuss those, as it's not just a call issue but also a compiler issue.



Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods

2018-07-07 Thread Stefan Behnel
INADA Naoki wrote on 07.07.2018 at 06:10:
> How often is a "custom method type" used?
>
> I thought Cython used it by default.
> But when I read the code generated by Cython, I couldn't find it.
> It uses normal PyMethodDef and tp_methods.
>
> I found CyFunction in the Cython repository, but I can't find
> out how to use it.  The Cython documentation doesn't explain anything
> about it.

Its usage is disabled by default because of some of the problems that
Jeroen addresses in his PEP(s).

You can enable Cython's own function type by setting the compiler directive
"binding=True", e.g. from your setup.py or in a comment at the very top of
your source file:

# cython: binding=True
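
The same directive can also be set from setup.py; roughly like this (a sketch
with a made-up module name, assuming Cython.Build.cythonize is used):

    from setuptools import setup
    from Cython.Build import cythonize

    setup(
        ext_modules=cythonize(
            "mymodule.pyx",  # hypothetical module name
            compiler_directives={"binding": True},
        ),
    )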

The directive name "binding" stems from the fact that CyFunctions bind as
methods when put into classes, but it's really misleading these days
because the main advantage is that it makes Cython compiled functions look
and behave much more like Python functions, including introspection etc.

Stefan



Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods

2018-07-07 Thread INADA Naoki
On Sat, Jul 7, 2018 at 4:35 PM Stefan Behnel  wrote:
>
> INADA Naoki wrote on 07.07.2018 at 06:10:
> > How often is a "custom method type" used?
> >
> > I thought Cython used it by default.
> > But when I read the code generated by Cython, I couldn't find it.
> > It uses normal PyMethodDef and tp_methods.
> >
> > I found CyFunction in the Cython repository, but I can't find
> > out how to use it.  The Cython documentation doesn't explain anything
> > about it.
>
> Its usage is disabled by default because of some of the problems that
> Jeroen addresses in his PEP(s).
>
> You can enable Cython's own function type by setting the compiler directive
> "binding=True", e.g. from your setup.py or in a comment at the very top of
> your source file:
>
> # cython: binding=True
>
> The directive name "binding" stems from the fact that CyFunctions bind as
> methods when put into classes, but it's really misleading these days
> because the main advantage is that it makes Cython compiled functions look
> and behave much more like Python functions, including introspection etc.
>
> Stefan
>

Thank you.  Do you plan to make it default when PEP 580 is accepted
and implemented?

Personally speaking, I have used Cython as a quick & easy alternative way of
writing extension types.
I don't need compatibility with pure Python functions.  I prefer
something minimal and lightweight.
So I will disable it explicitly or stop using Cython.

But if you believe PEP 580 makes many Cython users happy, I believe you.

-- 
INADA Naoki  


Re: [Python-Dev] Naming comprehension syntax [was Re: Informal educator feedback on PEP 572 ...]

2018-07-07 Thread Glenn Linderman

On 7/6/2018 9:01 PM, Terry Reedy wrote:
In any case, Python's comprehensions use an English-syntax version of 
extended set builder notation.  "In Python, the set-builder's braces 
are replaced with square brackets, parentheses, or curly braces, 
giving list, generator, and set objects, respectively. Python uses an 
English-based syntax."


Also, "generator builder" is not much more expressive than 
"generator expression",


I looked for an alternative 'x' to 'comprehension' such that 
'generator|list|set|dict x' works and is specific to the notation. 
'Builder' is a reasonable choice. 


I'm not sure if your quote above was quoting documentation, or was a 
suggested quote to add to the documentation; I think the latter, as 
Google didn't find it.


The conflict between the "Builder pattern" and "set-builder notation" 
can be disambiguated by consistently using the hyphenated "set-builder" 
(as Wikipedia does). And happily, by using Wikipedia terms, they would 
be easily found with explanations outside of the Python docs as well as (if 
this is done) inside.  We do not need


[ typ + ' builder' for typ in ('set', 'list', 'dict', 'generator')]

only set-builder.  The fencing and : determine the type of the result.  
We could use


[ typ + ' form of set-builder'  for typ in ('set', 'list', 'dict', 
'generator')]


in the few places where the type of the set-builder must be 
disambiguated, avoiding the need for the compound terms.


The result of  ( set-builder ) is a generator. We do not need the term 
"generator expression" or "generator comprehension".  Use "generator 
form of set-builder"... yes, it is one or two syllables longer, but is 
clearer.


A generator can be produced in one of two ways: either a function 
containing a yield, or a set-builder delimited by parentheses or used as 
an actual parameter to a function, both of which can be referred to as 
the "generator form of set-builder".


Glenn


Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods

2018-07-07 Thread Stefan Behnel
INADA Naoki wrote on 07.07.2018 at 10:08:
> Thank you.  Do you plan to make it default when PEP 580 is accepted
> and implemented?

It will become the default at some point, I'm sure. Note that we will still
have to support older Python versions, though, currently 2.6+, which would
not have the improvements available. Some things might be backportable for
us, at least to older Py3.x releases, but we'll see.


> Personally speaking, I have used Cython as a quick & easy alternative way of
> writing extension types.
> I don't need compatibility with pure Python functions.  I prefer
> something minimal and lightweight.
> So I will disable it explicitly or stop using Cython.

I'll try to keep the directive available as a compatibility switch for you. ;)


> But if you believe PEP 580 makes many Cython users happy, I believe you.

It's more of a transitive thing, for the users of your code. If the
functions and methods in the code that I write behave like Python
functions, then people who use my code will not run into surprises and
quirks when trying to do their stuff with them and things will generally
"just work", e.g. inspecting the functions when debugging or using them
interactively, looking up their signature, default arguments and
annotations, generating documentation from them, pickling, assigning them
as methods ... basically anything that people implicitly expect to be able
to do with Python functions (or even callables in general) and that doesn't
work (well, or at all) with PyCFunction.

Stefan



Re: [Python-Dev] On the METH_FASTCALL calling convention

2018-07-07 Thread Serhiy Storchaka

05.07.18 17:53, Jeroen Demeyer wrote:
In other words: I see nothing to improve in the calling convention of 
METH_FASTCALL. I suggest to keep it and make it public as-is.


You have considered the bytecode for calling functions, but this 
actually is not directly related to the calling convention. All opcodes 
(CALL_FUNCTION, CALL_FUNCTION_KW, CALL_FUNCTION_EX and CALL_METHOD) can 
be used for calling any callable. CALL_FUNCTION and CALL_FUNCTION_KW were 
designed for reducing the overhead from the caller side for the most common 
cases, and CALL_FUNCTION_EX is used for the rest.


The calling conventions METH_FASTCALL and METH_FASTCALL|METH_KEYWORDS were 
designed for reducing the overhead of creating a temporary tuple and 
dict. They make it possible to avoid allocation and copying when the 
CALL_FUNCTION and CALL_FUNCTION_KW opcodes are used, and in most cases when 
the C API is used for calling a function. But this is not the only possible 
design; some details could be changed without performance loss, and maybe 
even with a gain.


After passing positional and keyword arguments to the C function, we 
need to convert Python objects to C values. For positional-only 
parameters we can use PyArg_ParseTuple() (for METH_VARARGS) and 
_PyArg_ParseStack() (for METH_FASTCALL). If there are keyword 
parameters, we need to use a more complicated API: 
PyArg_ParseTupleAndKeywords() or private 
_PyArg_ParseTupleAndKeywordsFast() (for METH_VARARGS|METH_KEYWORDS) and 
private _PyArg_ParseStackAndKeywords() (for 
METH_FASTCALL|METH_KEYWORDS). _PyArg_ParseTupleAndKeywordsFast() and 
_PyArg_ParseStackAndKeywords() are private, complex, unstable and can be 
used only internally in CPython (mostly in the code generated by 
Argument Clinic). They have complex signatures and their code mostly 
duplicates one another (but with some important differences).


There is a wish to inline the argument parsing functions in the Argument 
Clinic generated code. It is easier to do for PyArg_ParseTuple() and 
_PyArg_UnpackStack(). But functions for parsing keyword arguments are 
more complex, because they perform two things: match keyword argument 
names to parameter positions and convert argument values to C.


Here is my idea. Split each of the keyword argument parsing functions into 
two parts. The first part linearizes the keyword arguments: it converts 
positional and keyword arguments (in whatever form they were presented) 
into a linear array of PyObject* (with NULLs for unspecified optional 
arguments). The second part is common and similar to 
_PyArg_ParseStack(), but supports NULLs. It converts an array of 
PyObject* to a sequence of C values. I tried to implement this idea; it 
is not simple, and the results were mixed, but I haven't lost hope.
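
A rough Python-level model of that first, linearizing part (the names and
error handling here are made up; None stands in for C NULL):

    def linearize(pos_args, kw_args, param_names):
        # One slot per parameter; None marks optional parameters not passed.
        slots = list(pos_args) + [None] * (len(param_names) - len(pos_args))
        for name, value in kw_args.items():
            i = param_names.index(name)  # raises ValueError for unknown names
            if i < len(pos_args):
                raise TypeError("argument %r given twice" % name)
            slots[i] = value
        return slots

    # linearize((1, 2), {'z': 3}, ('x', 'y', 'z', 'w'))  ->  [1, 2, 3, None]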


And here we return to METH_FASTCALL|METH_KEYWORDS. The first part of 
handling arguments can be made outside of the C function, by the calling 
API. Then the signature of the C function can be simpler, the same as 
for METH_FASTCALL. But we need to expose the list of keyword parameter 
names as an attribute of CFunction.


I don't know whether this idea is vital or dead, but I'm going to try it. 
And implementing it will change the METH_FASTCALL|METH_KEYWORDS calling 
convention a lot.




Re: [Python-Dev] Removal of install_misc command from distutils

2018-07-07 Thread Paul Moore
On 7 July 2018 at 01:25, Ned Deily  wrote:
> On Jul 6, 2018, at 19:43, Alexander Belopolsky 
>  wrote:

>> I think it will be prudent to get this command back in Python 3.7.1.  My 
>> work-around was to simply copy the 20-something lines that define 
>> install_misc from Python 3.6 to my setup.py file.

You'll still need those 20 lines in your code if you want to support
Python 3.7.0, and *not* supporting 3.7.0 but supporting 3.7.1 doesn't
seem like a good idea to me. TBH, I'd consider "copy the 20 lines of
code into your own setup.py" to be a perfectly acceptable workaround.

>> It was my impression that it is precisely due to situations like this, 
>> distutils was considered more or less frozen for development and all new 
>> features went to setuptools.

For a long while distutils was frozen, and yes it was basically
because of a fear that nothing could be changed without potentially
disrupting someone. But that resulted in essentially a paralysis of
work on packaging for many years, and wasn't a good thing. The embargo
on changes to distutils was lifted a long time ago - although we
remain cautious about making changes because we don't want to break
people's code. As Ned noted, you had the whole Python 3.7 alpha and
beta phase to test your code, and if you had raised the issue then, we
could have considered reverting this change (but my view would still
have been "we'll remove it and you should just copy the code").

A quick search of bpo [1] shows many changes to distutils in recent time, FWIW.

> If you want to pursue it, I think the best thing to do would be to bring up 
> the issue on the distutils-sig mailing list where the change probably should 
> have been discussed first if it wasn't and also reopen 
> https://bugs.python.org/issue29218.  AFAIK, the removal hasn't come up as a 
> problem before in the nearly 18 months since the change was first merged into 
> the feature branch.

IMO, a tracker issue was fine, and  the OP could have commented there.
Distutils-sig isn't really the right audience for minor detail changes
like this - although if someone wanted to raise the broader principle
"let's reinstate the total freeze on distutils changes" that would be
appropriate for distutils-sig. I'd fully expect a resounding rejection
for that proposition, though.

At this point, I think the only realistic resolution would be to add a
note to the "Porting to Python 3.7" docs explaining that users of
install_misc should copy the code from 3.6 into their setup.py. But
it's quite likely that the only person who needs that documentation is
Alexander (we've not had any other reports of this issue to my
knowledge) so that may be overkill...

Paul

[1] 
https://bugs.python.org/issue?%40search_text=&ignore=file%3Acontent&title=&%40columns=title&id=&%40columns=id&stage=6&creation=&creator=&activity=&%40columns=activity&%40sort=activity&actor=&nosy=&type=&components=3&versions=&dependencies=&assignee=&keywords=&priority=&status=2&%40columns=status&resolution=3&nosy_count=&message_count=&%40group=&%40pagesize=50&%40startwith=0&%40sortdir=on&%40action=search


Re: [Python-Dev] On the METH_FASTCALL calling convention

2018-07-07 Thread Nathaniel Smith
On Sat, Jul 7, 2018 at 12:19 AM, Stefan Behnel  wrote:
> Typically, it's calls with 1 to ~3 positional arguments that occur in
> performance critical situations. Often just one, rarely more, and zero
> arguments is a fast case anyway. Keyword arguments will always suffer some
> kind of penalty compared to positional arguments, regardless of how they
> are implemented (at runtime). But they can easily be avoided in many cases,
> and anyone designing a performance relevant API that *requires* keyword
> arguments deserves to have their code forked away from them. :)

In numpy we do sometimes hesitate to use kwargs, even when we
otherwise ought to, because of the slowdown they incur. You're right
that there's always going to be some overhead here, and I can't say
how important these kinds of optimizations are compared to other
things people could be spending their time on. But kwargs do improve
readability, and it's nice when we can make readable code fast, so
people aren't tempted to obfuscate things in the name of speed.

-n

-- 
Nathaniel J. Smith -- https://vorpus.org


Re: [Python-Dev] On the METH_FASTCALL calling convention

2018-07-07 Thread Stefan Behnel
Hi Serhiy!

Serhiy Storchaka wrote on 07.07.2018 at 10:55:
> Here is my idea. Split each of the keyword argument parsing functions into
> two parts. The first part linearizes the keyword arguments: it converts
> positional and keyword arguments (in whatever form they were presented)
> into a linear array of PyObject* (with NULLs for unspecified optional
> arguments). The second part is common and similar to _PyArg_ParseStack(),
> but supports NULLs. It converts an array of PyObject* to a sequence of C
> values. I tried to implement this idea; it is not simple, and the results
> were mixed, but I haven't lost hope.

That proposal sounds good to me. Cython currently does something similar
/inside/ of its function entry code, in that it executes an unrolled series
of PyDict_GetItem() calls for the expected keyword arguments (instead of
iterating over a dict, which turned out to be slower), and maps those to an
array of arguments, all before it passes over that array to convert the
values to the expected C types. I agree that it makes sense to do the name
matching outside of the callee since the caller knows best in what way
(sequence, dict, ...) the arguments are available and can decide on the
fastest way to map them to a flat array, given the expected argument names.
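
In Python terms, what that generated entry code does is roughly this (only a
sketch, not Cython's actual output):

    _MISSING = object()

    def map_keywords(kwargs, expected_names):
        # One direct lookup per expected name instead of iterating the dict.
        values = [kwargs.get(name, _MISSING) for name in expected_names]
        # Any leftover key in kwargs must be an unexpected keyword argument.
        if sum(v is not _MISSING for v in values) != len(kwargs):
            raise TypeError("got an unexpected keyword argument")
        return values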

And I think the proposal would fit nicely into PEP-580.


> And here we return to METH_FASTCALL|METH_KEYWORDS. The first part of
> handling arguments can be made outside of the C function, by the calling
> API. Then the signature of the C function can be simpler, the same as for
> METH_FASTCALL. But we need to expose the list of keyword parameter names as
> an attribute of CFunction.

And names should be expected to be interned, so that matching the keywords
can be done via pointer comparisons in almost all cases. That should make
it pretty fast, and errors can be detected in a slow separate pass if the
pointer matching fails. I think we cannot strictly assume a predictable,
helpful ordering of the keyword arguments on the calling side (that would
allow for a one-pass merge), but it's rather common for users to pass
keyword arguments in the order in which the signature expects them, so I'm
sure there's a fast algorithm (e.g. something like Insertion Sort) to match
both sides in negligible time.
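
Sketched at the Python level (not real CPython or Cython code), the matching
idea is simply:

    def find_parameter(kwname, expected_names):
        # Fast path: keyword names are normally interned, so an identity
        # check (the Python-level stand-in for a pointer comparison)
        # succeeds almost always.
        for i, name in enumerate(expected_names):
            if kwname is name:
                return i
        # Slow path: fall back to value comparison and error reporting.
        for i, name in enumerate(expected_names):
            if kwname == name:
                return i
        raise TypeError("unexpected keyword argument %r" % kwname)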

Stefan



Re: [Python-Dev] Comparing PEP 576 and PEP 580

2018-07-07 Thread Mark Shannon

On 07/07/18 00:02, Jeroen Demeyer wrote:

On 2018-07-06 23:12, Guido van Rossum wrote:

It's your PEP. And you seem to be struggling with something. But I can't
tell quite what it is you're struggling with.


To be perfectly honest (no hard feelings though!): what I'm struggling 
with is getting feedback (either positive or negative) from core devs 
about the actual PEP 580.



At the same time I assume you want your PEP accepted.


As I also said during the PEP 575 discussion, my real goal is to solve a 
concrete problem, not to push my personal PEP. I still think that PEP 
580 is the best solution but I welcome other suggestions.



And how do they feel about PEP 576? I'd like to see some actual debate
of the pros and cons of the details of PEP 576 vs. PEP 580.


I started this thread to do precisely that.

My opinion: PEP 580 has zero performance cost, while PEP 576 does make 
performance for bound methods worse (there is no reference 
implementation of the new PEP 576 yet, so that's hard to quantify for 

There is a minimal implementation and has been for a while.
There is a link at the bottom of the PEP.
Why do you claim it will make the performance of bound methods worse?
You provide no evidence of that claim.

now). PEP 580 is also more future-proof: it defines a new protocol which 
can easily be extended in the future. PEP 576 just builds on PyMethodDef 
which cannot be extended because of ABI compatibility (putting 
__text_signature__ and __doc__ in the same C string is a good symptom of 
that). This extensibility is important because I want PEP 580 to be the 
first in a series of PEPs working out this new protocol. See PEP 579 for 
the bigger picture.

PEP 576 adds a new calling convention which can be used by *any* object.
Seems quite extensible to me.



One thing that might count against PEP 580 is that it defines a whole 
new protocol, which could be seen as too complicated. However, it must 
be this complicated because it is meant to generalize the current 
behavior and optimizations of built-in functions and methods. There are 
lots of little tricks currently in CPython that must be "ported" to the 
new protocol.



OK, so is it your claim that the NumPy developers don't care about which
one of these PEPs is accepted or even whether one is accepted at all?


I don't know, I haven't contacted any NumPy devs yet, so that was just 
my personal feeling. These PEPs are about optimizing callables and NumPy 
isn't really about callables. I think that the audience for PEP 580 is 
mostly compilers (Cython for sure but possibly also Pythran, numba, 
cppyy, ...). Also certain C classes like functools.lru_cache could 
benefit from it.



Yet earlier in
*this* thread you seemed to claim that PEP 580 requires changes to
FASTCALL.


I don't know what you mean by that. But maybe it's also confusing 
because "FASTCALL" can mean different things: it can refer to a 
PyMethodDef (used by builtin_function_or_method and method_descriptor) 
with the METH_FASTCALL flag set. It can also refer to a more general API 
like _PyCFunction_FastCallKeywords, which supports METH_FASTCALL but 
also other calling conventions like METH_VARARGS.


I don't think that METH_FASTCALL should be changed (and PEP 580 isn't 
really about that at all). For the latter, I'm suggesting some API 
changes but nothing fundamental: mainly replacing the 5 existing private 
functions _PyCFunction_FastCallKeywords, _PyCFunction_FastCallDict, 
_PyMethodDescr_FastCallKeywords, _PyMethodDef_RawFastCallKeywords, 
_PyMethodDef_RawFastCallDict by 1 public function PyCCall_FASTCALL.



Hopefully this clears some things up,
Jeroen.


[Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Giampaolo Rodola'
Sorry in advance for opening yet another topic about PEP-572. With PEP-572
being officially accepted I know debating its inclusion in the language is
a useless exercise at this point, but since it's still in "draft" status I
would like to express my opinion as I think this is a feature which can
potentially be abused fairly easily. FWIW I initially found myself
disliking the idea as a whole but
https://github.com/python/cpython/pull/8122 made me (and others) reconsider
it quite a bit (see: https://twitter.com/grodola/status/1015251302350245888).
PR-8122 clearly shows an improvement in expressiveness and compactness
(many folks argue this is too much) but PEP-572 as it currently stands is
too permissive IMHO. My concern about "easily abusable ugly cases" still
remains, and I think they should be banned instead of just discouraged in
the PEP or in the doc. Since we spend way more time *reading* code rather
than writing it, as a "reader" I would expect a more prudent approach to
the problem.

Proposal


1) allow only one := assignment per line in "if" statements:
>>> if x := val1 and y := val2:   # SyntaxError or SyntaxWarning
>>> if x == val1 and y := val2:   # SyntaxError or SyntaxWarning
>>> if x := val1 and y == val2:   # SyntaxError or SyntaxWarning
>>> if x := val1:  # OK
>>> if (x := val1):  # OK

2) allow := in "while" statements, "if" statements and comprehensions only:
>>> foo(x := 0)  # SyntaxError
>>> yield x := 3  # SyntaxError
>>> assert y := 3  # SyntaxError

3) (debatable) disallow := if the variable is already defined:
>>> x = 5
>>> if (x := val):  # SyntaxError or SyntaxWarning

4) ban "a = (b := c)", "x = a := (b := (c := d))" and similar (they're just
too ugly IMHO)

Rationale 1
===

In visual terms assignments in Python have always occurred at the BEGINNING
of the line and always on the most LEFT side:

>>> foo = fun1()
>>> bar = fun2()
>>> baz = fun3()

That is where I naturally expect an assignment to be when reading code. My
main concern with PEP-572 is that an assignment can now occur at *any
point* in the line:

>>> foo = fun1()
>>> bar = fun2()
>>> if foo == val1 and bar == val2 and baz := fun3():
......

That forces me to visually scan the whole line horizontally from left to
right 'till its end, looking for possible variables being set. I'm
concerned that I will miss := occurrences because visually they are very
similar to == unless parentheses are made mandatory:

>>> if foo == val1 and bar == val2 and (baz := fun3()):
......

Also, in case of multi-line conditionals I have to visually scan the
construct both horizontally AND vertically:

>>> if (foo == val1 and \
...bar == val2 and \
...baz := val3):
... ...

Again, that is not a place where I would expect to find or look for a
variable assignment. I know I wouldn't like to read or review a code which
does that and I suspect linters will likely end up wanting to emit a
warning in that case (see: https://github.com/PyCQA/pylint/issues/2246).
https://github.com/python/cpython/pull/8116/files avoids using multiple :=
per line and that's why the result appears readable enough IMO.

Rationale 2
===

PEP-572 states:

> The := operator may be used directly in a positional function call
argument

That means allowing:

>>> foo(x := 0)

I honestly don't see why anyone would want to call a function AND assign a
variable value at the same time (except in comprehensions). With this in
place I not only have to guard against "if" statements assigning values at
any point in the code, but also function calls, both horizontally and
vertically e.g.:

>>> foo(some_long_var_name, another_one, x := bar(),
y := fun())

To me this looks like the perfect example of where this functionality can
be abused. Also, I'm not clear what PEP-572 intends to do about "all other
places". E.g. should these cases be allowed? (IMO no)

>>> yield x := 3
>>> assert y := 3

--
Giampaolo - http://grodola.blogspot.com


Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Chris Angelico
On Sat, Jul 7, 2018 at 11:09 PM, Giampaolo Rodola'  wrote:
> To me this looks like the perfect example of where this functionality can be
> abused. Also, I'm not clear what PEP-572 intends to do about "all other
> places". E.g. should these cases be allowed? (IMO no)
>
> >>> yield x := 3
> >>> assert y := 3

Special cases aren't special enough to break the rules.

Can we stop trying to nerf this into complexity, please?

ChrisA


Re: [Python-Dev] Comparing PEP 576 and PEP 580

2018-07-07 Thread Nick Coghlan
On 7 July 2018 at 07:12, Guido van Rossum  wrote:
> On Fri, Jul 6, 2018 at 2:52 AM Jeroen Demeyer  wrote:
>> The Cython developers (in particular Stefan Behnel) certainly support my
>> work. I have talked with them in person at a workshop and they posted a
>> few emails to python-dev and they also gave me some personal comments
>> about PEP 580.
>
>
> And how do they feel about PEP 576? I'd like to see some actual debate of
> the pros and cons of the details of PEP 576 vs. PEP 580. So far I mostly see
> you and INADA Naoki disagreeing about process, which doesn't give me the
> feeling that there is consensus.

I think part of the confusion here stems from the fact that Jeroen's
original PEP 575 tried to cover a lot of different things, with two of
the most notable being:

1. Getting functions implemented in C to act more like their Python
counterparts from an introspection and capability perspective (e.g.
having access to their defining module regardless of whether they're a
top level function or a method on a type definition)
2. Allowing third party compilers like Cython to route function calls
through the CPython C API more efficiently than the existing public
APIs that require building and deconstructing Python tuples and dicts

Hence the request that the PEP be split up into an overview PEP
describing the problem space (now available as
https://www.python.org/dev/peps/pep-0579/ ), and then follow-up PEPs
targeting specific sub-topics within that PEP.

That's happened for PEP 580 (since Jeroen was working on both PEPs at
the same time as an initial replacement for PEP 575), but PEP 576
hasn't been updated yet to specify which of the subtopics within PEP
579 it is aiming to address.

My current reading is that PEP 576 isn't really targeting the same
aspects as PEP 580: PEP 580 aims to allow third party callable
implementations to be as fast as native CPython internal ones
regardless of which callable type they use (point 2 above), while PEP
576 has the more modest aim of eliminating some of the current reasons
that third parties find it necessary to avoid using the CPython native
callable types in the first place (point 1 above).

That said, I think Inada-san's request for benchmarks that clearly
demonstrate the challenges with the status quo and hence can be used
to quantify the potential benefits is a sound one, as those same
benchmarks can then be used to assess the complexity of adapting
existing third party tools and libraries like Cython and NumPy to
implement the proposals in order to produce updated benchmarking
numbers (for PEP 580, add code to implement the new protocol method,
for PEP 576, switch to inheriting from one of the two newly defined C
level types).

At a micro-benchmark level, that would probably involve just comparing
mapping builtin functions and methods over a range with the performance
of variants of those functions implemented using only the public C API
(the specific functions and methods chosen for the benchmark will need
to be those where the optimisations for particularly simple function
signatures don't apply).
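
A very rough pure-Python sketch of that kind of micro-benchmark (the lambda
below is only a stand-in for a third-party callable that cannot use the
internal fast calling paths):

    import timeit

    setup = "r = range(1000)\nwrapped_abs = lambda x: abs(x)"
    # Builtin routed through CPython's internal fast call machinery:
    print(timeit.timeit("list(map(abs, r))", setup=setup, number=2000))
    # The same work through a wrapper that pays generic call overhead:
    print(timeit.timeit("list(map(wrapped_abs, r))", setup=setup, number=2000))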

At a macro-benchmark level, it would likely require first choosing or
defining a language level computational performance benchmark that's
based on real world code using libraries like
NumPy/pandas/scikit-learn, akin to the domain specific benchmarks we
already have for libraries like SQL Alchemy, dulwich, and various
templating engines (django, mako, genshi).

One possibility that may make sense could be to set up comparisons of
https://github.com/numpy/numpy/tree/master/benchmarks numbers between
a conventional NumPy and one with more optimised C level calls, and
doing something similar for
https://pandas.pydata.org/pandas-docs/stable/contributing.html#running-the-performance-test-suite.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


[Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Mark Shannon

Hi,

We seem to have a plethora of PEPs where we really ought to have one (or 
none?).


Traditionally when writing a new piece of software, one gathered 
requirements before implementing the code. Let us return to that 
venerable tradition.


IMO, mailing lists are a terrible way to do software design, but a good 
way to gather requirements as it makes it less likely that someone will be 
forgotten.


So, let us gather the requirements for a new calling API.

Here are my starting suggestions:

1. The new API should be fully backwards compatible and shouldn't break 
the ABI
2. The new API should be used internally so that 3rd party extensions 
are not second class citizens in term of call performance.
3. The new API should not prevent 3rd party extensions having full 
introspection capabilities, supporting keyword arguments or another 
feature supported by Python functions.
4. The implementation should not exceed D lines of code delta and T 
lines of code in total size. I would suggest +200 and 1000 for D and T 
respectively (or is that too restrictive?).

5. It should speed up CPython for the standard benchmark suite.
6. It should be understandable.

What am I missing? Comments from the maintainers of Cython and other 
similar tools would be appreciated.


Cheers,
Mark.


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Nick Coghlan
On 7 July 2018 at 23:38, Mark Shannon  wrote:
> Hi,
>
> We seem to have a plethora of PEPs where we really ought to have one (or
> none?).
>
> Traditionally when writing a new piece of software, one gathered
> requirements before implementing the code. Let us return to that venerable
> tradition.
>
> IMO, mailing lists are a terrible way to do software design, but a good way
> to gather requirements as it makes it less likely that someone will be
> forgotten.

That's the purpose of PEP 579: gather the background information on
the problems that folks want to solve such that the competing proposed
solutions aren't defining the problem that needs to be solved in
different ways.

If PEP 579 isn't working as a problem specification from your
perspective, then I'd suggest posting a PR that Jeroen could review
(although I think this thread is a good idea as well).

> So, let us gather the requirements for a new calling API.

> Here are my starting suggestions:
>
> 1. The new API should be fully backwards compatible and shouldn't break the
> ABI
> 2. The new API should be used internally so that 3rd party extensions are
> not second class citizens in term of call performance.
> 3. The new API should not prevent 3rd party extensions having full
> introspection capabilities, supporting keyword arguments or another feature
> supported by Python functions.
> 4. The implementation should not exceed D lines of code delta and T lines of
> code in total size. I would suggest +200 and 1000 for D and T respectively
> (or is that too restrictive?).
> 5. It should speed up CPython for the standard benchmark suite.
> 6. It should be understandable.

I like points 1, 2, 3, and 6, but I think point 4 should be a design
trade-off rather than a requirement, since minimising the delta in
CPython becomes an anti-goal if the outcome of doing so is to make the
change harder to adopt for third party projects (at the same time, a
delta that's too large is unlikely to be accepted, reviewed and
merged, which is what makes it a trade-off).

I don't think point 5 is a goal here either, as the problem isn't that
these calling optimisations don't exist, it's that they don't
currently have a public API that third party projects can access (the
most recent METH_FASTCALL thread covers that pretty well).

My own additional concern that I think is also on the debatable border
between "design requirement" and "design trade-off" is whether or not
it's acceptable for us to require that existing third party projects
change their parent CPython type in order to access the optimised
calling conventions. Changing Python base types in an extension module
can end up being an annoyingly intrusive change, since it changes the
memory layout in your instances. Whether or not that's a problem
depends on exactly what you're doing, but when the new calling
convention is tied to a protocol that any type can implement (as PEP
580 proposes), the concern doesn't even arise.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Stefan Behnel
Nick Coghlan wrote on 07.07.2018 at 16:14:
> when the new calling
> convention is tied to a protocol that any type can implement (as PEP
> 580 proposes), the concern doesn't even arise.

Nick, +1 to all of what you said in your reply, and I also really like the
fact that this proposal is creating a new, general protocol that removes
lots of type special casing from places where objects are being called
"efficiently". We're essentially designing a Fast Duck Calling convention here.

Stefan



Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Antoine Pitrou
On Sun, 8 Jul 2018 00:14:13 +1000
Nick Coghlan  wrote:
> 
> > So, let us gather the requirements for a new calling API.  
> 
> > Here are my starting suggestions:
> >
> > 1. The new API should be fully backwards compatible and shouldn't break the
> > ABI
> > 2. The new API should be used internally so that 3rd party extensions are
> > not second class citizens in term of call performance.
> > 3. The new API should not prevent 3rd party extensions having full
> > introspection capabilities, supporting keyword arguments or another feature
> > supported by Python functions.
> > 4. The implementation should not exceed D lines of code delta and T lines of
> > code in total size. I would suggest +200 and 1000 for D and T respectively
> > (or is that too restrictive?).
> > 5. It should speed up CPython for the standard benchmark suite.
> > 6. It should be understandable.  
> 
> I like points 1, 2, 3, and 6, but I think point 4 should be a design
> trade-off rather than a requirement, since minimising the delta in
> CPython becomes an anti-goal if the outcome of doing so is to make the
> change harder to adopt for third party projects (at the same time, a
> delta that's too large is unlikely to be accepted, reviewed and
> merged, which is what makes it a trade-off).
> 
> I don't think point 5 is a goal here either, as the problem isn't that
> these calling optimisations don't exist, it's that they don't
> currently have a public API that third party projects can access (the
> most recent METH_FASTCALL thread covers that pretty well).

Agreed.  The goal is not to speed up CPython but to bring third-party
extensions up to speed (both literally and figuratively).

Regards

Antoine.




Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Antoine Pitrou
On Sat, 7 Jul 2018 16:39:08 +0200
Stefan Behnel  wrote:
> Nick Coghlan wrote on 07.07.2018 at 16:14:
> > when the new calling
> > convention is tied to a protocol that any type can implement (as PEP
> > 580 proposes), the concern doesn't even arise.  
> 
> Nick, +1 to all of what you said in your reply, and I also really like the
> fact that this proposal is creating a new, general protocol that removes
> lots of type special casing from places where objects are being called
> "efficiently". We're essentially designing a Fast Duck Calling convention 
> here.

The Quick Quack protocol?





Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread INADA Naoki
> IMO, mailing lists are a terrible way to do software design, but a good
> way to gather requirements as it makes it less likely that someone will be
> forgotten.
>

Agreed.  There are several topics we should discuss for these PEPs.
The mailing list is hard to follow.

Can we have another communication channel?  A dedicated GitHub repository?
Zulip? Or discuss.python.org?

> So, let us gather the requirements for a new calling API.
>
> Here are my starting suggestions:
>
> 1. The new API should be fully backwards compatible and shouldn't break
> the ABI

Agreed.  We have a chance to break the ABI/API slightly at Python 4, although
the breakage should be very small compared with Python 3.

Until then, we should keep backward compatibility as much as possible.

> 2. The new API should be used internally so that 3rd party extensions
> are not second class citizens in term of call performance.

These PEPs propose a new public protocol which can be implemented
by 3rd party extensions, especially Cython.
In that sense, it's not used only *internally*.

> 3. The new API should not prevent 3rd party extensions having full
> introspection capabilities, supporting keyword arguments or another
> feature supported by Python functions.

OK.

> 4. The implementation should not exceed D lines of code delta and T
> lines of code in total size. I would suggest +200 and 1000 for D and T
> respectively (or is that too restrictive?).

Hmm, I think this should be considered as (Frequency * Value) / (Complexity).
Especially, if PEP 580 can remove 2000 lines of code, T>1000 seems OK.

> 5. It should speed up CPython for the standard benchmark suite.

I think that's impossible in the short term.  We already have specialized
optimizations (FASTCALL and LOAD_METHOD/CALL_METHOD).
These optimizations make simple method calls 30% faster.
These PEPs let 3rd party callable types utilize these optimizations.
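
(For example, the specialized opcodes are visible with the dis module; this
snippet is just an illustration, not from any benchmark:)

    import dis
    # On CPython 3.7 this shows LOAD_METHOD / CALL_METHOD rather than
    # LOAD_ATTR followed by CALL_FUNCTION.
    dis.dis("obj.method(arg)")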

> 6. It should be understandable.
>

OK.
While the main audience is Cython, C extension writers should be able to use
the new protocols in handwritten extensions.

> What am I missing? Comments from the maintainers of Cython and other
> similar tools would be appreciated.
>
> Cheers,
> Mark.


-- 
INADA Naoki  


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Stefan Behnel
INADA Naoki wrote on 07.07.2018 at 17:16:
>> 2. The new API should be used internally so that 3rd party extensions
>> are not second class citizens in term of call performance.
> 
> These PEPs propose a new public protocol which can be implemented
> by 3rd party extensions, especially Cython.
> In that sense, it's not used only *internally*.

I think Mark meant that the API should *also* be used internally, in the
same way that external code uses it. Meaning, there shouldn't be a separate
internal API.

Stefan



Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread INADA Naoki
> > > 5. It should speed up CPython for the standard benchmark suite.
...
> >
> > I don't think point 5 is a goal here either, as the problem isn't that
> > these calling optimisations don't exist, it's that they don't
> > currently have a public API that third party projects can access (the
> > most recent METH_FASTCALL thread covers that pretty well).
>
> Agreed.  The goal is not to speed up CPython but to bring third-party
> extensions up to speed (both literally and figuratively).
>

To clarify, the main goal is not just making 3rd party extensions faster.
Making some private APIs public would be enough for that.

The goal of PEP 576 (GitHub version) and PEP 580 is making
custom callable types (especially method-like objects) faster.

Because most functions and methods are defined with PyMethodDef
and m_methods / tp_methods, these PEPs are not needed for them.

I think the main motivation for these PEPs is modern Python usage:
Jupyter notebook + Cython.

Unlike extension module writers, we shouldn't expect users to know the
difference between C and Python.  That's why Cython wants to emulate
normal Python functions/methods as closely as possible.

Regards,

-- 
INADA Naoki  


Re: [Python-Dev] A "day of silence" on PEP 572?

2018-07-07 Thread Steve Dower
There has been off-list discussion with the authors, for sure. But most of the 
recent threads are disputes and not useful.

At this point, if you're not helping clarify what’s in the PEP, you’re not 
helping by posting an opinion.

But since we can’t possibly convince everyone not to post their opinions, 
perhaps all the rest of us (especially the PEP authors!) should abandon the 
list for a few days and let them shout it out :)

Top-posted from my Windows 10 phone

From: Steve Holden
Sent: Friday, July 6, 2018 16:30
To: Ryan Gonzalez
Cc: Antoine Pitrou; Python-Dev@Python.Org
Subject: Re: [Python-Dev] A "day of silence" on PEP 572?

On Sat, Jul 7, 2018 at 12:18 AM, Ryan Gonzalez  wrote:
On July 6, 2018 5:04:05 PM Antoine Pitrou  wrote:

(or contact the PEP's authors
privately).

Honestly, this feels like a recipe for a disaster...
Many of the people who have strong opinions in this know the PEP authors from
years of working together.

They might feel that personal channels are appropriate. I'd agree it would be
a bit presumptuous and spammy of others to use them.



Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Guido van Rossum
This seems more suitable for a style guide.

Enforcing such restrictions in the grammar would actually be complicated,
due to nesting -- but even if it wasn't, I wouldn't want to, as I don't
want to limit future generations to only the benefits of the new construct
that we can now come up with. Orthogonality is a good thing in my mind,
else we might never have had nested functions or conditional imports.

As to why you might want to use := in a function call, I could imagine
writing

if validate(name := re.search(pattern, line).group(1)):
    return name

The benefit of combining the assignment with the if would be more apparent
if there was an if-elif...elif-else pattern, like here:
https://github.com/python/peps/pull/695/files#diff-09a4f112ea673a2339f0bec6014ff47fR409
(click off the comments to see it better).
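
A made-up sketch of that kind of cascade (not the code from the PR itself):

    import re

    def classify(line):
        if m := re.match(r"#\s*(.*)", line):
            return ("comment", m.group(1))
        elif m := re.match(r"(\w+)\s*=\s*(.*)", line):
            return ("assignment", m.group(1), m.group(2))
        else:
            return ("other", line)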



On Sat, Jul 7, 2018 at 6:12 AM Giampaolo Rodola'  wrote:

> Sorry in advance for opening yet another topic about PEP-572. With PEP-572
> being officially accepted I know debating its inclusion in the language is
> a useless exercise at this point, but since it's still in "draft" status I
> would like to express my opinion as I think this is a feature which can
> potentially be abused fairly easily. FWIW I initially found myself
> disliking the idea as a whole but
> https://github.com/python/cpython/pull/8122 made me (and others)
> reconsider it quite a bit (see:
> https://twitter.com/grodola/status/1015251302350245888). PR-8122 clearly
> shows an improvement in expressiveness and compactness (many folks argue
> this is too much) but PEP-572 as it currently stands is too permissive
> IMHO. My concern about "easily abusable ugly cases" still remains, and I
> think they should be banned instead of just discouraged in the PEP or in
> the doc. Since we spend way more time *reading* code rather than writing
> it, as a "reader" I would expect a more prudent approach to the problem.
>
> Proposal
> 
>
> 1) allow only one := assignment per line in "if" statements:
> >>> if x := val1 and y := val2:   # SyntaxError or SyntaxWarning
> >>> if x == val1 and y := val2:   # SyntaxError or SyntaxWarning
> >>> if x := val1 and y == val2:   # SyntaxError or SyntaxWarning
> >>> if x := val1:  # OK
> >>> if (x := val1):  # OK
>
> 2) allow := in "while" statements, "if" statements and comprehensions only:
> >>> foo(x := 0)  # SyntaxError
> >>> yield x := 3  # SyntaxError
> >>> assert y := 3  # SyntaxError
>
> 3) (debatable) disallow := if the variable is already defined:
> >>> x = 5
> >>> if (x := val):  # SyntaxError or SyntaxWarning
>
> 4) ban "a = (b := c)", "x = a := (b := (c := d))" and similar (they're
> just too ugly IMHO)
>
> Rationale 1
> ===
>
> In visual terms assignments in Python have always occurred at the
> BEGINNING of the line and always on the most LEFT side:
>
> >>> foo = fun1()
> >>> bar = fun2()
> >>> baz = fun3()
>
> That is where I naturally expect an assignment to be when reading code. My
> main concern with PEP-572 is that an assignment can now occur at *any
> point* in the line:
>
> >>> foo = fun1()
> >>> bar = fun2()
> >>> if foo == val1 and bar == val2 and baz := fun3():
> ......
>
> That forces me to visually scan the whole line horizontally from left to
> right 'till its end, looking for possible variables being set. I'm
> concerned that I will miss := occurrences because visually they are very
> similar to == unless parentheses are made mandatory:
>
> >>> if foo == val1 and bar == val2 and (baz := fun3()):
> ......
>
> Also, in case of multi-line conditionals I have to visually scan the
> construct both horizontally AND vertically:
>
> >>> if (foo == val1 and \
> ...bar == val2 and \
> ...baz := val3):
> ... ...
>
> Again, that is not a place where I would expect to find or look for a
> variable assignment. I know I wouldn't like to read or review a code which
> does that and I suspect linters will likely end up wanting to emit a
> warning in that case (see: https://github.com/PyCQA/pylint/issues/2246).
> https://github.com/python/cpython/pull/8116/files avoids using multiple
> := per line and that's why the result appears readable enough IMO.
>
> Rationale 2
> ===
>
> PEP-572 states:
>
> > The := operator may be used directly in a positional function call
> argument
>
> That means allowing:
>
> >>> foo(x := 0)
>
> I honestly don't see why anyone would want to call a function AND assign a
> variable value at the same time (except in comprehensions). With this in
> place I not only have to guard against "if" statements assigning values at
> any point in the code, but also function calls, both horizontally and
> vertically e.g.:
>
> >>> foo(some_long_var_name, another_one, x := bar(),
> y := fun())
>
> To me this looks like the perfect example of where this functionality can
> be abused. Also, I'm not clear what PEP

Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Tim Peters
[Guido]

> ...
> As to why you might want to use := in a function call, I could imagine
> writing
>
> if validate(name := re.search(pattern, line).group(1)):
> return name
>

When I was staring at my code, I never mentioned the very first plausible
use I bumped into (in code I was actively working on at the time):

while not probable_prime(p := randrange(lo, hi)):
 pass
# and now `p` is likely a random prime in range

I never mentioned it because I expected it would annoy people on 3(!)
counts:

- assigning in a function call
- reducing the loop body to `pass`
- using the binding long after the loop ended

Indeed, for those reasons it wasn't "an obvious" win to me - or an obvious
loss.  So I just moved on.

However, after staring at hundreds of other cases, it does strike me as "a
small win" today - my brain cells have rewired to recognize more ":="
patterns at a glance.

Whether that's a good thing or not I don't know, but it is a real thing ;-)


Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Terry Reedy

On 7/7/2018 12:53 PM, Tim Peters wrote:

[Guido]

...
As to why you might want to use := in a function call, I could
imagine writing

     if validate(name := re.search(pattern, line).group(1)):
     return name


If name has to be non-blank to pass validate, one can avoid the 
assignment within the function call by adding a redundant pre-test.


if name := re.search(pattern, line).group(1) and validate(name):
    return name

Giampaolo would presumably prefer this, but I don't think such 
preference should be enforced on everyone.


If name == '' is valid, then the alternative is the current one, using a 
separate assignment statement.


name = re.search(pattern, line).group(1)
if validate(name):
    return name

When I was staring at my code, I never mentioned the very first 
plausible use I bumped into (in code I was actively working on at the time):


while not probable_prime(p := randrange(lo, hi)):
      pass
# and now `p` is likely a random prime in range


As long as lo excludes 0:

while p := randrange(lo, hi) and not probable_prime(p):
    continue

I can see how someone might prefer this stylistically, but it is buggy. 
If this is contained in a function (very likely) and lo could be <= 0, 
because it is either passed in or calculated, 0 could be passed on as a 
likely prime!
I never mentioned it because I expected it would annoy people on 3(!) 
counts:


- assigning in a function call


This is a style preference that people can and will disagree on.  In any 
case, I think correctness trumps beauty, just as it trumps speed.



- reducing the loop body to `pass`


I fixed that ;-).  'continue' better expresses the 'try again' part of 
English versions, such as "While the trial value is not acceptable, try 
again."



- using the binding long after the loop ended


The same is true for the current 4-line loop-and-a-half.

while True:
    p = randrange(lo, hi)
    if probable_prime(p):
        break  # p used somewhere else

Indeed, for those reasons it wasn't "an obvious" win to me - or an 
obvious loss.  So I just moved on.


However, after staring at hundreds of other cases, it does strike me as 
"a small win" today - my brain cells have rewired to recognize more ":=" 
patterns at a glance.


Whether that's a good thing or not I don't know, but it is a real thing ;-)


I must admit that I too am already more comfortable with := now than I 
was originally.


--
Terry Jan Reedy




Re: [Python-Dev] Naming comprehension syntax [was Re: Informal educator feedback on PEP 572 ...]

2018-07-07 Thread Brett Cannon
On Fri, Jul 6, 2018, 16:32 Guido van Rossum,  wrote:

> On Fri, Jul 6, 2018 at 4:19 PM Terry Reedy  wrote:
>
>> Since Guido, the first respondent, did not immediately shoot the idea
>> down, I intend to flesh it out and make it more concrete.
>>
>
> Maybe I should have shot it down. The term is entrenched in multiple
> languages by now (e.g. https://en.wikipedia.org/wiki/List_comprehension).
> Regarding "list builder" one could argue that it would just add more
> confusion, since there's already an unrelated Builder Pattern (
> https://en.wikipedia.org/wiki/Builder_pattern) commonly used in Java.
> (Though I worry about the presence of a Python example in that Wikipedia
> page. :-)
>
> Also, "generator builder" is not much more expressive than "generator
> expression", and the key observation that led to this idea was that it's
> such a mouthful to say "comprehensions and generator expressions". Maybe
> it's not too late to start calling the latter "generator comprehensions" so
> that maybe by the year 2025 we can say "comprehensions" and everyone will
> understand we mean all four types?
>
> FWIW more people should start using "list display" etc. for things like
> [a, b, c].
>

I can get behind "generator comprehension" and "list display". The builder
idea isn't doing it for me.
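
For reference, the four comprehension forms plus a display, side by side:

squares_list = [x*x for x in range(5)]     # list comprehension
squares_set  = {x*x for x in range(5)}     # set comprehension
squares_dict = {x: x*x for x in range(5)}  # dict comprehension
squares_gen  = (x*x for x in range(5))     # generator expression / "generator comprehension"
display      = [0, 1, 4, 9, 16]            # a list display: [a, b, c, ...]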

-Brett


> --
> --Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Jeroen Demeyer

On 2018-07-07 15:38, Mark Shannon wrote:

Hi,

We seem to have a plethora of PEPs where we really ought to have one (or
none?).


- PEP 575 has been withdrawn.
- PEP 579 is an informational PEP with the bigger picture; it does 
contain some of the requirements that you want to discuss here.
- PEP 580 and PEP 576 are two alternative implementations of a protocol 
to optimize callables implemented in C.



5. It should speed up CPython for the standard benchmark suite.


I'd like to replace this by: must *not slow down* the standard benchmark 
suite, and preferably should not slow down anything.



Re: [Python-Dev] Comparing PEP 576 and PEP 580

2018-07-07 Thread Jeroen Demeyer

On 2018-07-07 14:54, Mark Shannon wrote:

There is a minimal implementation and has been for a while.
There is a link at the bottom of the PEP.


Yes, I saw that but the implementation does not correspond to the PEP. 
In particular, this sentence from the PEP has not been implemented:


When binding a method_descriptor instance to an instance of its owning 
class, a bound_method will be created instead of a 
builtin_function_or_method


It's not clear to me whether you still want to implement that or whether 
it should be dropped from the PEP.
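
For reference, the current behaviour that this sentence would change can 
be seen directly from the interpreter:

>>> type(list.append)
<class 'method_descriptor'>
>>> type(list.append.__get__([], list))   # binding currently produces this
<class 'builtin_function_or_method'>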



PEP 576 adds a new calling convention which can be used by *any* object.
Seems quite extensible to me.


Yes and no. Yes, it can do anything. But because it can do anything, 
callers cannot optimize certain special cases. For example, in PEP 576 
you need an extra flag Py_TPFLAGS_FUNCTION_DESCRIPTOR because your 
protocol doesn't specify anything about __get__. Imagine that you want 
to support more optimizations like that in the future, how do you plan 
to do that? Of course, you can always add more stuff to PyTypeObject, 
but a separate structure like what I propose in PEP 580 might make more 
sense.
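
As a rough Python-level analogue of the property that flag asserts (the 
class below is invented purely for illustration), think of a callable 
type whose __get__ binds exactly like a plain Python function does:

import types

class CFunctionLike:
    def __call__(self, obj, arg):
        return (obj, arg)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self                     # accessed on the class: unbound
        return types.MethodType(self, obj)  # accessed on an instance: bound

# e.g. with  class Spam: meth = CFunctionLike()
# Spam().meth(42) returns (the Spam instance, 42)

A caller that knows this (via the flag) can, in principle, skip creating 
the bound object and pass the instance as the first positional argument 
instead.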



Jeroen.


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Brett Cannon
On Sat, Jul 7, 2018, 08:17 INADA Naoki,  wrote:

> > IMO, mailing lists are a terrible way to do software design, but a good
> > way to gather requirements as it makes less likely that someone will be
> > forgotten.
> >
>
> Agreed.  There are several topics we should discuss for these PEPs.
> Mailing list is hard to follow.
>
> Can we have other communication channel?  Dedicated Github repository?
> zulip? or discuss.python.org?
>
> > So, let us gather the requirements for a new calling API.
> >
> > Here are my starting suggestions:
> >
> > 1. The new API should be fully backwards compatible and shouldn't break
> > the ABI
>
> Agreed.  We have a chance to break the ABI/API slightly at Python 4, although
> the breakage should be very small compared with Python 3.
>
> Until then, we should keep backward compatibility as possible.
>
> > 2. The new API should be used internally so that 3rd party extensions
> > are not second class citizens in term of call performance.
>
> These PEPs propose a new public protocol which can be implemented
> by 3rd party extensions, especially Cython.
> In that sense, it's not used only *internally*.
>
> > 3. The new API should not prevent 3rd party extensions having full
> > introspection capabilities, supporting keyword arguments or another
> > feature supported by Python functions.
>
> OK.
>
> > 4. The implementation should not exceed D lines of code delta and T
> > lines of code in total size. I would suggest +200 and 1000 for D and T
> > respectively (or is that too restrictive?).
>
> Hmm, I think this should be considered as (Frequency * Value) /
> (Complexity).
> Especially, if PEP 580 can remove 2000 lines of code, T>1000 seems OK.
>

I don't think any concrete number is really going to be helpful. This is
probably going to come down to a subjective "will this be complicated and
hard to maintain?" And that call will probably come down to the BDFL for
the PEP.

-Brett


> > 5. It should speed up CPython for the standard benchmark suite.
>
> I think it's impossible in the short term.  We already have specialized
> optimizations (FASTCALL and LOAD_METHOD/CALL_METHOD).
> These optimizations make simple method calls 30% faster.
> These PEPs let 3rd party callable types utilize these optimizations.
>
> > 6. It should be understandable.
> >
>
> OK.
> While the main audience is Cython, C extension writers should be able to use
> the new protocols in handwritten extensions.
>
> > What am I missing? Comments from the maintainers of Cython and other
> > similar tools would be appreciated.
> >
> > Cheers,
> > Mark.
>
>
> --
> INADA Naoki  


[Python-Dev] Making the status of Provisional PEPs clearer

2018-07-07 Thread Nick Coghlan
Hi folks,

After review from Barry & Guido, I've just merged an update to PEP 1
and the PEP index generator that separates out provisionally accepted
PEPs to their own state in the PEP flow:
https://github.com/python/peps/commit/307dda38d4e7a5760dd4979ae9978a4eb1e70589

To date, that status has been reported as "Accepted" both in the
original PEPs and in the main PEP index; now it gets reported as
Provisional in both places.

Cheers,
Nick.

P.S. As part of this, I switched the flow diagram in PEP 1 from a PNG
to an SVG, which seems to have confused python.org's image rendering:
https://github.com/python/peps/issues/701

I'm already looking into it, but am open to tips from folks more
familiar with the website's rendering machinery.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Making the status of Provisional PEPs clearer

2018-07-07 Thread Guido van Rossum
Should we update some PEPs with the new status? E.g. PEP 484.

On Sat, Jul 7, 2018 at 7:46 PM Nick Coghlan  wrote:

> Hi folks,
>
> After review from Barry & Guido, I've just merged an update to PEP 1
> and the PEP index generator that separates out provisionally accepted
> PEPs to their own state in the PEP flow:
>
> https://github.com/python/peps/commit/307dda38d4e7a5760dd4979ae9978a4eb1e70589
>
> To date, that status has been reported as "Accepted" both in the
> original PEPs and in the main PEP index, now it gets reported as
> Provisional in both places.
>
> Cheers,
> Nick.
>
> P.S. As part of this, I switched the flow diagram in PEP 1 from a PNG
> to an SVG, which seems to have confused python.org's image rendering:
> https://github.com/python/peps/issues/701
>
> I'm already looking into it, but am open to tips from folks more
> familiar with the website's rendering machinery.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
-- 
--Guido (mobile)


Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Mike Miller



On 2018-07-07 06:09, Giampaolo Rodola' wrote:
 I initially found myself disliking the idea as a whole but 
https://github.com/python/cpython/pull/8122 made me (and others) reconsider it 
quite a bit (see: https://twitter.com/grodola/status/1015251302350245888). 
PR-8122 clearly shows an improvement in expressiveness and compactness (many 
folks argue this is too much) 



One of the requirements from the PEP (informing its design) is that there would 
be a significant need and opportunities to use it with multiple and/or compound 
conditions.


While it may be a function of Victor's choices, from my reading of the pull 
request the vast majority of the improved lines have but a single, simple 
condition, such as:


 while (data := f.readframes(1024)):
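
The same single-condition shape, as a fully runnable toy (Python 3.8+), 
with an in-memory buffer standing in for the wave file from the PR:

import io

f = io.BytesIO(b"\x00" * 3000)
chunks = []
while data := f.read(1024):
    chunks.append(data)
# three chunks: 1024 + 1024 + 952 bytes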

I believe Giampaolo has a good point.  These expressions are useful, but they 
don't seem to have much value outside these examples.


On the subject of imagining how they would be used, I suppose we could look at 
JavaScript or other expression-oriented languages for best practices.  I didn't 
find many admonitions, however, other than to keep it simple:


https://stackoverflow.com/q/9726496/450917

-Mike


Re: [Python-Dev] Making the status of Provisional PEPs clearer

2018-07-07 Thread Nick Coghlan
On 8 July 2018 at 12:48, Guido van Rossum  wrote:
> Should we update some PEPs with the new status? E.g. PEP 484.

Aye, 3 PEPs were given the new status as part of the commit:

- 484 (type hinting syntax)
- 518 (pyproject.toml/[build-system].requires)
- 517 (pyproject.toml/[build-system].backend)

Checking the other Accepted PEPs, it looks like a few of them need to
be marked Final now that 3.7 has been released, but I don't believe
any of them are Provisional.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Making the status of Provisional PEPs clearer

2018-07-07 Thread Guido van Rossum
Thanks!


On Sat, Jul 7, 2018 at 8:29 PM Nick Coghlan  wrote:

> On 8 July 2018 at 12:48, Guido van Rossum  wrote:
> > Should we update some PEPs with the new status? E.g. PEP 484.
>
> Aye, 3 PEPs were given the new status as part of the commit:
>
> - 484 (type hinting syntax)
> - 518 (pyproject.toml/[build-system].requires)
> - 517 (pyproject.toml/[build-system].backend)
>
> Checking the other Accepted PEPs, it looks like a few of them need to
> be marked Final now that 3.7 has been released, but I don't believe
> any of them are Provisional.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
>
-- 
--Guido (mobile)


Re: [Python-Dev] PEP 575, 576, 579 and 580

2018-07-07 Thread Nathaniel Smith
On Sat, Jul 7, 2018 at 6:38 AM, Mark Shannon  wrote:
> 1. The new API should be fully backwards compatible and shouldn't break the
> ABI

Which ABI? The stable ABI (PEP 384)? I don't think object layout is
exposed there, though I'm not sure of the details.

The regular ABI that almost everyone actually uses? That's already
broken on every minor release, so you shouldn't spend any time
worrying about preserving compatibility there.

-n

-- 
Nathaniel J. Smith -- https://vorpus.org


Re: [Python-Dev] Call for prudence about PEP-572

2018-07-07 Thread Tim Peters
>
> [Tim]
> > When I was staring at my code, I never mentioned the very first
> > plausible use I bumped into (in code I was actively working on at the
> time):
> >
> > while not probable_prime(p := randrange(lo, hi)):
> >   pass
> > # and now `p` is likely a random prime in range
>

[Terry Reedy]
>
> As long as lo excludes 0:
>
> while p := randrange(lo, hi) and not probable_prime(p):
>  continue
>
> I can see how someone might prefer this stylistically, but it is buggy.
> If this is contained in a function (very likely) and lo could be <= 0,
> because it is either passed in or calculated, 0 could be passed on as a
> likely prime!
>

I never write code that uses "and" relying on context-specific data
constraints to "guarantee" the LHS is always true.  That combines a delicate
precondition with "a trick".  Dreadful.

I won't even write it this way, which keeps "the trick" but eliminates the
hidden data assumption:

while [p := randrange(lo, hi)] and not probable_prime(p):

A singleton list is always truthy, so at least now it makes no assumptions
about the value bound to `p`.

I could be paid to write it this way, but most employers couldn't afford to
pay enough to do it twice ;-) :

while [(p := randrange(lo, hi)), not probable_prime(p)][-1]:

That always works and doesn't rely on "a trick", but is ugly, obscure, and
inefficient.
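
To make the comparison concrete, here is a toy version of the variants 
that actually runs (Python 3.8+); probable_prime() below is just a naive 
stand-in, and the "and" form needs parentheses around the walrus to mean 
what was intended:

from random import randrange

def probable_prime(n):
    # naive trial division, purely for illustration
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

lo, hi = 10, 100

# original form: assignment inside the call
while not probable_prime(p := randrange(lo, hi)):
    continue

# the "and" variant: silently accepts 0 if lo <= 0, because a falsy p
# short-circuits the primality test
while (p := randrange(lo, hi)) and not probable_prime(p):
    continue

# singleton-list trick: always truthy, so no hidden assumption about p
while [p := randrange(lo, hi)] and not probable_prime(p):
    continue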


> > I never mentioned it because I expected it would annoy people on 3(!)
> > counts:
> >
> > - assigning in a function call
>
> This is a style preference that people can and will disagree on.  In any
> case, I think correctness trumps beauty, just as it trumps speed.
>

All else being equal (and, yup, correctness is more equal than the others),
I like to push assignments "to the left" as much as possible.


> > - reducing the loop body to `pass`
>
> I fixed that ;-).  'continue' better expresses the 'try again' part of
> English versions, such as "While the trial value is not acceptable, try
> again."
>

Thanks!  Now that you mention it (it had not occurred to me), I like
`continue` much better than `pass` here too.


> > - using the binding long after the loop ended
>
> The same is true for the current 4-line loop and a half.
>
> while True:
>  p = randrange(lo, hi)
>  if probable_prime(p):
>  break  # p used somewhere else
>

Sure.  But this PEP _started_ with a fancier model wherein the language
would magically limit the scope of assignment targets in these
block-opening tests.  That was eventually removed, but I'm sure we'll see
"style guides" demanding that it "should" never be used unless the target
is in fact never referenced (at least not before re-binding) after the
associated block ends.

It's been mildly surprising to me to see how often that _is_ the case in
real code.  But, as in the example above, I won't be following such a rule
rigidly.


> > Indeed, for those reasons it wasn't "an obvious" win to me - or an
> > obvious loss.  So I just moved on.
> >
> > However, after staring at hundreds of other cases, it does strike me as
> > "a small win" today - my brain cells have rewired to recognize more ":="
> > patterns at a glance.
> >
> > Whether that's a good thing or not I don't know, but it is a real thing
> ;-)
>


> I must admit that I too am already more comfortable with := now than I
> was originally.


The stories about its uncanny ability to destroy entire projects with a
single use may have been exaggerated ;-)

But, ya, I've tried playing with it much more than most so far, and my bar
for "obvious little win" has lowered.  Not much, though, and it seems to
have bottomed out with that example.

So, in the end, I expect I'll use it as sparingly - and gratefully! - as in
all the other languages I've used with assignment expressions.

Next project:  rehabilitate the much-maligned GOTO statement ;-)