Re: [Python-Dev] Can we split PEP 489 (extension module init) ?

2018-08-11 Thread Jeroen Demeyer

> Would this be better than a flag + raising an error on init?

Exactly. PEP 489 only says "Extensions using the new initialization 
scheme are expected to support subinterpreters". What's wrong with 
raising an exception when the module is initialized the second time?
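
To illustrate, here is a rough pure-Python sketch of the flag-plus-raise
idea; in a real extension the check would live in the C-level Py_mod_exec
slot, and the names below are made up for the example:

    # Pure-Python sketch of "flag + raise on second init".  In an actual
    # extension this guard would sit in the Py_mod_exec slot.
    _already_initialized = False

    def exec_module(module):
        """Stand-in for the extension's Py_mod_exec slot."""
        global _already_initialized
        if _already_initialized:
            raise ImportError(
                f"{module.__name__} does not support multiple initialization "
                "(e.g. loading into a subinterpreter)")
        _already_initialized = True
        # ... normal module initialization would go here ...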



Jeroen.


Re: [Python-Dev] Let's change to C API!

2018-08-11 Thread Antoine Pitrou


Hi Armin,

On Fri, 10 Aug 2018 19:15:11 +0200
Armin Rigo wrote:
> Currently, the C API only allows Psyco-style JITting (much slower than
> PyPy).  All three other points might not be possible at all without a
> seriously modified C API.  Why?  I have no proof, but only
> circumstantial evidence.  Each of (2), (3), (4) has been done in at
> least one other implementation: PyPy, Jython and IronPython.  Each of
> these implementations has also got its share of troubles with emulating
> the CPython C API.  You can continue to think that the C API has got
> nothing to do with it.  I tend to think the opposite.  The continued
> absence of major performance improvements for either CPython itself or
> for any alternative Python implementation that *does* support the C
> API natively is probably proof enough---I think that enough time has
> passed, by now, to make this argument.

Jython and IronPython never got significant manpower AFAIK, so even
without being hindered by the C API, chances are they would never have
gotten very far.  Neither even seems to have a stable release
implementing the Python 3 language...

That leaves us with CPython and PyPy, which are only two data points.
And there are enough differences, AFAIK, between those two that singling
out "supports the C API natively" as the primary factor behind the
performance difference sounds arbitrary.

(the major difference being IMHO that PyPy is written in RPython, which
opens up possibilities that are not realistic with a C implementation,
such as the JIT being automatically able to inspect implementations of
core / stdlib primitives; in a CPython-based JIT such as Numba, you
have to reimplement all those primitives in a form that's friendly to
the JIT compiler)

Regards

Antoine.


Re: [Python-Dev] Let's change to C API!

2018-08-11 Thread Stefan Behnel
Antoine Pitrou wrote on 11.08.2018 at 15:19:
> On Fri, 10 Aug 2018 19:15:11 +0200 Armin Rigo wrote:
>> Currently, the C API only allows Psyco-style JITting (much slower than
>> PyPy).  All three other points might not be possible at all without a
>> seriously modified C API.  Why?  I have no proof, but only
>> circumstantial evidence.  Each of (2), (3), (4) has been done in at
>> least one other implementation: PyPy, Jython and IronPython.  Each of
>> these implementations has also got its share of troubles with emulating
>> the CPython C API.  You can continue to think that the C API has got
>> nothing to do with it.  I tend to think the opposite.  The continued
>> absence of major performance improvements for either CPython itself or
>> for any alternative Python implementation that *does* support the C
>> API natively is probably proof enough---I think that enough time has
>> passed, by now, to make this argument.
> [...]
> That leaves us with CPython and PyPy, which are only two data points.
> And there are enough differences, AFAIK, between those two that singling
> out "supports the C API natively" as the primary factor behind the
> performance difference sounds arbitrary.

While it's not clear to what extent the C-API hinders performance
improvements or the JIT-ability of code in CPython, I think it's fair to
assume that internals are easier to improve when they stay internal and are
not part of a public API. Whether it's worth the effort to design a new
C-API, or at least make major changes to the current one, I cannot say,
lacking an actual, comparable implementation of such a design that
specifically targets better performance.

As it stands, extensions can actually make good use of the fact that the
C-API treats them (mostly; see e.g. PEPs 575/580) as first-class citizens
in the CPython ecosystem. So the status quo is at least a tradeoff.

Stefan



Re: [Python-Dev] Can we split PEP 489 (extension module init) ?

2018-08-11 Thread Stefan Behnel
Petr Viktorin wrote on 10.08.2018 at 13:48:
> Would this be better than a flag + raising an error on init?

OK, I've implemented this in Cython for now, to finally move the PEP 489
support forward. The somewhat annoying drawback is that module reloading
previously only *seemed* to work, simply because it didn't actually do
anything. Now people will get an exception in cases that previously passed
silently. An exception would probably have been better from the beginning,
because it clearly tells people that what they are trying is not supported,
but now it's a bit of a breaking change. We'll see how it goes.

Thanks for your feedback on this.

Stefan



[Python-Dev] Class decorators can't be pickled, which breaks multiprocessing and concurrent.futures. Any plans of improving this?

2018-08-11 Thread Santiago Basulto
Hello folks! I'm using `concurrent.futures.ProcessPoolExecutor` with a
couple of functions that have been decorated with a class decorator. Both
`concurrent.futures` and `multiprocessing` break because "the object can't
be pickled". There's a really simple fix for this: instead of "decorating"
the function (with the @ syntax), instantiate the decorator and use it
directly.

Example. This is my (very simple, for demonstration purposes) decorator:

    class CheckOnlyIntegers:
        def __init__(self, fn):
            self.fn = fn

        def __call__(self, *args):
            if not all([type(arg) == int for arg in args]):
                raise ValueError("Invalid param is not an integer")
            return self.fn(*args)

If I define a simple `add` function and decorate it using the
`CheckOnlyIntegers` decorator:

    @CheckOnlyIntegers
    def add(x, y):
        return x + y

and try using a regular `ProcessPoolExecutor().submit(add, 2, 3)`, it fails
with:

```
Can't pickle <function add at 0x...>: it's not the same object as
__main__.add
```

The fix for this is simple: instead of "decorating" the function,
instantiate the class and use a different name:

    def add(x, y):
        return x + y

    add_2 = CheckOnlyIntegers(add)

In this case `ProcessPoolExecutor().submit(add_2, 2, 3)` works correctly.
(here's the full sample code)
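
For reference, here is the workaround as a self-contained script; the
`if __name__ == "__main__"` guard and the result check are additions for
the example:

    from concurrent.futures import ProcessPoolExecutor

    class CheckOnlyIntegers:
        def __init__(self, fn):
            self.fn = fn

        def __call__(self, *args):
            if not all([type(arg) == int for arg in args]):
                raise ValueError("Invalid param is not an integer")
            return self.fn(*args)

    def add(x, y):
        return x + y

    # Instantiate the decorator under a different name instead of using "@",
    # so the global name "add" still refers to the plain function.
    add_2 = CheckOnlyIntegers(add)

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            print(pool.submit(add_2, 2, 3).result())  # prints 5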

I know this is an issue with the pickle module (not concurrent.futures or
multiprocessing). But is there any way of improving this in future
versions? Not being able to pickle functions decorated with class
decorators seems like an unnecessary limitation.

Thanks for your feedback!

-- 
Santiago Basulto.-
Up!


Re: [Python-Dev] Class decorators can't be pickled, which breaks multiprocessing and concurrent.futures. Any plans of improving this?

2018-08-11 Thread Serhiy Storchaka

On 11.08.18 23:08, Santiago Basulto wrote:
Hello folks! I'm using `concurrent.futures.ProcessPoolExecutor` with
a couple of functions that have been decorated with a class decorator.
Both `concurrent.futures` and `multiprocessing` break because "the
object can't be pickled". There's a really simple fix for this: instead
of "decorating" the function (with the @ syntax), instantiate the
decorator and use it directly.


Example. This is my (very simple, for demonstration purposes) decorator:

     class CheckOnlyIntegers:
         def __init__(self, fn):
             self.fn = fn

         def __call__(self, *args):
             if not all([type(arg) == int for arg in args]):
                 raise ValueError("Invalid param is not an integer")
             return self.fn(*args)

If I define a simple `add` function and decorate it using the 
`CheckOnlyIntegers` decorator:


     @CheckOnlyIntegers
     def add(x, y):
         return x + y

and try using a regular `ProcessPoolExecutor().submit(add, 2, 3)`, it 
fails with:


```
Can't pickle <function add at 0x...>: it's not the same object as
__main__.add

```


By default, instances of Python classes are pickled by pickling their
attributes. Functions, on the other hand, are pickled by name. But the
global name of self.fn ("add") now refers to the decorated object, not to
the wrapped function itself, so pickling it by name fails.
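
Here is a short script that reproduces that mismatch, reusing the decorator
and function from your message (the try/except is only there to show the
error):

    import pickle

    class CheckOnlyIntegers:
        def __init__(self, fn):
            self.fn = fn

        def __call__(self, *args):
            if not all([type(arg) == int for arg in args]):
                raise ValueError("Invalid param is not an integer")
            return self.fn(*args)

    @CheckOnlyIntegers
    def add(x, y):
        return x + y

    # The global name "add" is now bound to the decorator instance, while
    # the wrapped function it holds is a different object.
    print(type(add).__name__)   # CheckOnlyIntegers
    print(add.fn is add)        # False

    try:
        # Pickling the instance pickles its attributes, including self.fn;
        # pickle tries to serialize that function by name and finds a
        # different object bound to "add" in __main__.
        pickle.dumps(add)
    except pickle.PicklingError as exc:
        print(exc)   # Can't pickle <function add ...>: it's not the same object as __main__.add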


You can implement explicit pickle support for your decorator to bypass
this limitation:


    def __reduce__(self):
        # Returning a string makes pickle serialize this object by
        # (global) name rather than by its attributes.
        return self.fn.__qualname__

Now the decorated function will be pickled by name.

It may also help to set self.__module__ = self.fn.__module__ in the
constructor, so that pickle looks the name up in the right module.
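
Putting both suggestions together, here is a self-contained sketch; the
`if __name__ == "__main__"` guard and the pickle round-trip at the end are
only there to demonstrate that it works:

    import pickle

    class CheckOnlyIntegers:
        def __init__(self, fn):
            self.fn = fn
            # Point pickle at the module that holds the decorated name, in
            # case the decorator class is defined in a different module.
            self.__module__ = fn.__module__

        def __call__(self, *args):
            if not all([type(arg) == int for arg in args]):
                raise ValueError("Invalid param is not an integer")
            return self.fn(*args)

        def __reduce__(self):
            # Returning a string makes pickle serialize this object by
            # (global) name, i.e. it is looked up again on unpickling.
            return self.fn.__qualname__

    @CheckOnlyIntegers
    def add(x, y):
        return x + y

    if __name__ == "__main__":
        restored = pickle.loads(pickle.dumps(add))
        print(restored is add)   # True: pickled by name
        print(restored(2, 3))    # 5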

