[Python-Dev] Are undocumented functions part of the stable ABI?
Hello,

I have a question about PEP 384: can undocumented functions ever be considered part of the stable ABI? By "undocumented" I mean not appearing in the "Python/C API Reference Manual". Whatever the answer to this question is, it would be good to make it explicit in PEP 384.

I am asking in particular about the functions starting with PyCFunction_ that appear in Include/methodobject.h.

Thanks,
Jeroen.
Re: [Python-Dev] Are undocumented functions part of the stable ABI?
On 2018-04-04 17:56, Guido van Rossum wrote:
> It would be helpful if you explained the context of your request.

The context is PEP 575. I guess my question is mostly about PyCFunction_Check(). I will not be able to keep it 100% backwards compatible, simply because the goal of that PEP is precisely to change the classes of some objects. So the question is: am I allowed to change the implementation of PyCFunction_Check()? If it is considered part of the stable ABI, then the answer is immediately "no".

By the way, does anybody happen to know why the PyCFunction_* functions are undocumented? Is it just an oversight in the docs or is it intentional?

But regardless of the context, I think that the question "Are undocumented functions part of the stable ABI?" should be answered in PEP 384.

Jeroen.
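For context, this is roughly what the macro in question looks like; the sketch below is based on the pre-3.8 headers and the exact definition may differ between versions. Because it is a macro, this expansion is baked into extension modules at compile time.

    /* Sketch of Include/methodobject.h (pre-3.8); check the headers of
       your Python version for the exact definition. */
    #define PyCFunction_Check(op) (Py_TYPE(op) == &PyCFunction_Type)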
Re: [Python-Dev] Are undocumented functions part of the stable ABI?
On 2018-04-08 05:17, Nick Coghlan wrote:
> Changing macro definitions doesn't break the stable ABI, as long as
> the *old* macro expansions still do the right thing.

To me, it looks like a bad idea to change macros. Imagine that the PyCFunction_Check macro changes in Python 3.8. Then an extension module compiled on 3.7 (but run on 3.8) would behave differently from the same extension compiled on 3.8. I cannot imagine that this is in line with the "stable ABI" philosophy.

Jeroen.
Re: [Python-Dev] Are undocumented functions part of the stable ABI?
On 2018-04-10 13:49, Nick Coghlan wrote:
> If it's only a semantic level change in the way the macro gets
> expanded, then whether or not it needs an ABI version guard gets
> judged on a case-by-case basis, and in this particular case, my view
> would be that developers should be able to write extensions using the
> stable ABI that accept function subclasses on 3.8+, without having to
> *require* the use of 3.8+ to import their module.

I don't really get this paragraph, but in any case I decided *not* to change PyCFunction_Check in PEP 575. It doesn't seem worth the trouble, as this macro is probably not used often anyway. Also, it's hard to guess what it should be replaced with: why would extensions be calling PyCFunction_Check()?
[Python-Dev] PEP 575: Unifying function/method classes
Dear Python developers,

I would like to request a review of PEP 575, which is about changing the classes used for built-in functions and for Python functions and methods. The text of the PEP can be found at

https://www.python.org/dev/peps/pep-0575/

No substantial changes to the contents of the PEP were made compared to the first posting. However, many details have been changed, clarified or added, based on comments from the initial discussion thread and on the work on an implementation.

My implementation is at https://github.com/jdemeyer/cpython/tree/pep575 This is certainly not meant to be ready to merge upstream; in particular, the Python test suite does not fully pass. Nevertheless, it should be good enough to review the PEP. If the PEP is accepted, I plan to continue working on the implementation, including adding tests and documentation.

Jeroen.
Re: [Python-Dev] PEP 575: Unifying function/method classes
To make it easier to test and try out PEP 575, I created a binder repo:

https://mybinder.org/v2/gh/jdemeyer/pep575binder.git/master?filepath=index.ipynb
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-13 21:30, Raymond Hettinger wrote:
> It would be nice to have a section that specifically discusses the
> implications with respect to other existing function-like tooling:
> classmethod, staticmethod, partial, itemgetter, attrgetter,
> methodgetter, etc.

My hope is that there are no such implications. An important design goal of this PEP (which I believe I achieved) is that as long as you're doing duck typing, you should be safe. I believe that the tools in your list do exactly that. It's only when you use inspect or when you do type checks that you will see the difference with this PEP.

After implementing the C code part of my PEP, there were only a relatively small number of test failures. You can look at this commit, which contains all Python code changes of my implementation; it doesn't look so bad:

https://github.com/jdemeyer/cpython/commit/c404a8f1b7d9525dd2842712fe183a051a4b5094

> For example, I would need to update the code in random._randbelow().

For the record, there are no test failures related to this, but maybe that's just because tests for this are missing.
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-13 15:23, Nick Coghlan wrote:
> There's also a section in the rationale which refers to METH_USRx
> flags, which I'm guessing from context are an idea you were
> considering proposing, but eventually dropped from the rest of the
> PEP.

No, I actually still want to propose it. In my latest update of the PEP, I hope to have made it more clear.
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-14 23:14, Guido van Rossum wrote:
> That actually sounds like a pretty big problem. I'm sure there is
> lots of code that doesn't *just* duck-type nor calls inspect but uses
> isinstance() to decide how to extract the desired information.

In the CPython standard library, the *only* fixes that are needed because of this are in:

- inspect (obviously)
- doctest (to figure out the __module__ of an arbitrary object)
- multiprocessing.reduction (something to do with pickling)
- xml.etree.ElementTree (to determine whether a certain method was overridden)
- GDB support

I've been told that there might also be a problem with Random._randbelow, even though it doesn't cause test failures.

The fact that there is so little breakage in the standard library makes me confident that the problem is not so bad. And in the cases where it does break, it's usually pretty easy to fix. Finally: changing the classes of certain objects is exactly the point of this PEP, so it's impossible to achieve 100% backwards compatibility.

Jeroen.
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-16 02:32, Raymond Hettinger wrote:
> I don't think that confidence is warranted. The world of Python is
> very large. When public APIs (such as that in the venerable types
> module) get changed, it is virtually assured that some code will
> break.

Yes, *some* code will break, I never denied that. I just think that there is not much existing code which needs to distinguish between different kinds of methods, so not much code will break. And if existing code does need to make that distinction, the reason for it might go away after PEP 575 (this is what Nick Coghlan also alluded to).

The hard question is whether the expected breakage is bad enough to reject this PEP. This is my first PEP, so I honestly don't have a good idea how high the bar for backwards compatibility is.

Jeroen.
Re: [Python-Dev] PEP 572: Assignment Expressions
On 2018-04-18 02:13, Chris Angelico wrote:
> I'm much happier promoting a full-featured assignment expression than
> something that can only be used in a limited set of situations. Is
> there reason to believe that extensions to the := operator might take
> it in a different direction? If not, there's very little to lose by
> permitting any assignment target, and then letting style guides frown
> on it if they like.

This is a very good argument: why artificially restrict the operator? This reminds me of the artificial restriction of decorator syntax (why is @foo()() not legal?). There was never a rationale given for that and now we are stuck with it.
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-14 23:14, Guido van Rossum wrote:
> That actually sounds like a pretty big problem. I'm sure there is
> lots of code that doesn't *just* duck-type nor calls inspect but uses
> isinstance() to decide how to extract the desired information.

I have been thinking about this some more...

One solution to improve backwards compatibility would be to duplicate some classes. For example, make a separate class for bound methods in extension types, which would be literally a duplicate of the existing types.MethodType class (possibly with a different name). In other words, a bound method of an extension type would work exactly the same way as an existing bound method, but it would artificially be a different class for the benefit of non-duck-typing code.
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
Hello,

I just saw this PEP. There is a bit of overlap between PEP 573 and PEP 575, since these both change the calling convention for built-in methods. In particular, PEP 575 also proposes to add a "defining class" member (for different reasons). In PEP 575, this is added to the PyCFunction struct itself instead of a separate struct PyCMethod.

It would be nice to justify whether you really need a new class (PyCMethod_Type) to support METH_METHOD. It looks strange to me that the class of some object depends on an implementation detail like whether METH_METHOD is specified. The current PEP 573 implies that backwards compatibility concerns would arise every time that METH_METHOD is added to an existing method. People have asked questions on PEP 575 about that: it would break code depending on "types.BuiltinFunctionType", for example. You could instead just change PyCFunctionObject to add that field (that's what I did in PEP 575).

For practical reasons, it would be nice to implement PEP 573 and PEP 575 together, as they affect the same code (assuming that both PEPs are accepted, of course).

Jeroen.
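For concreteness, here is a sketch of the kind of METH_METHOD calling convention being discussed: a method that receives its defining class as an extra argument. This matches the form PEP 573 eventually settled on (METH_METHOD combined with METH_FASTCALL | METH_KEYWORDS); the draft under review here may differ in its details.

    /* Sketch: a method that receives its defining class explicitly. */
    typedef PyObject *(*PyCMethod)(PyObject *self,
                                   PyTypeObject *defining_class,
                                   PyObject *const *args, Py_ssize_t nargs,
                                   PyObject *kwnames);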
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
In PEP 573, instead of passing the defining class to the C function, why not pass the function object itself? That is far more general: once you have the function object, you can still access the defining class using your PyCMethod_CLASS. It's also more future-proof: if we ever decide to add even more attributes to the function object, those could be accessed the same way.

In PEP 575, I'm already proposing a flag (METH_ARG0_FUNCTION) to pass the function *instead* of self. Unless PEP 573 is rejected, maybe that should change to passing the function *in addition* to self.

Of course, this doesn't quite work with your current version of PEP 573, since METH_METHOD really does two things: it changes the class of the function object (which is not a good idea anyway) and it changes the calling convention. It could work if you add mm_class to PyCFunctionObject instead of creating a new class.

Jeroen.
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
On 2018-04-24 14:53, Nick Coghlan wrote:
>> In PEP 575, I'm already proposing a flag (METH_ARG0_FUNCTION) to
>> pass the function *instead* of self. Unless PEP 573 is rejected,
>> maybe that should change to passing the function *in addition* to
>> self.
>
> That would definitely be an elegant way of addressing both use cases.

On the other hand, if you are passing the function object, then you can get __self__ from it (unless it's an unbound method: in that case __self__ is NULL and self is really args[0]). So there wouldn't be a need for passing "self". I'm not saying that this is better than passing "self" explicitly... I haven't yet decided what is best.

In any case, these things would be handled by Argument Clinic anyway, so it only matters if you are parsing arguments "by hand".

Jeroen.
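As an illustration only: METH_ARG0_FUNCTION is a flag proposed in PEP 575, not an existing CPython API, and the sketch below is a hypothetical example of how a C function receiving the function object as its first argument could recover __self__. The names and the exact convention are assumptions, not part of any accepted API.

    /* Hypothetical METH_ARG0_FUNCTION-style method: arg0 is the function
       object rather than __self__ (illustrative sketch only). */
    static PyObject *
    demo_meth(PyObject *func, PyObject *args)
    {
        PyObject *self = PyCFunction_GET_SELF(func);
        if (self == NULL) {
            /* unbound call: self is really the first positional argument */
            if (PyTuple_GET_SIZE(args) < 1) {
                PyErr_SetString(PyExc_TypeError, "self argument missing");
                return NULL;
            }
            self = PyTuple_GET_ITEM(args, 0);
        }
        return PyObject_Repr(self);
    }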
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
On 2018-04-24 16:34, Jeroen Demeyer wrote:
> On the other hand, if you are passing the function object, then you
> can get __self__ from it (unless it's an unbound method: in that case
> __self__ is NULL and self is really args[0]). So there wouldn't be a
> need for passing "self". I'm not saying that this is better than
> passing "self" explicitly... I haven't yet decided what is best.

One thing I realized from PEP 573: the fact that __self__ for built-in functions is set to the module is considered a feature. I never understood the reason for it (and I don't know if the original reason was the same as the reason in PEP 573). If we want to continue supporting that and we also want to support __get__ for built-in functions (to make them act as methods), then there are really two "selfs": there is the "self" from the method (the object that it's bound to) and the "self" from the built-in function (the module).

To support that, passing *both* the function and "self" seems like the best way.

Jeroen.
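To make the "module as __self__" feature concrete, here is a minimal extension-module sketch; the module name "demo" and its function are made up for illustration. For a module-level built-in like this, the self argument the C function receives is the module object, which is what demo.hello.__self__ (or len.__self__ for the builtins module) exposes.

    #include <Python.h>

    static PyObject *
    hello(PyObject *self, PyObject *unused)
    {
        /* For a module-level function, "self" is the module object. */
        return PyObject_GetAttrString(self, "__name__");
    }

    static PyMethodDef demo_methods[] = {
        {"hello", hello, METH_NOARGS, "return the name of the defining module"},
        {NULL, NULL, 0, NULL}           /* sentinel */
    };

    static struct PyModuleDef demo_module = {
        PyModuleDef_HEAD_INIT, "demo", NULL, -1, demo_methods
    };

    PyMODINIT_FUNC
    PyInit_demo(void)
    {
        return PyModule_Create(&demo_module);
    }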
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-20 12:02, Jeroen Demeyer wrote:
> One solution to improve backwards compatibility would be to duplicate
> some classes. For example, make a separate class for bound methods in
> extension types, which would be literally a duplicate of the existing
> types.MethodType class (possibly with a different name). In other
> words, a bound method of an extension type would work exactly the
> same way as an existing bound method but it would artificially be a
> different class for the benefit of non-duck-typing.

I elaborated on this:

https://www.python.org/dev/peps/pep-0575/#two-phase-implementation
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
On 2018-04-25 20:33, Petr Viktorin wrote:
> Perhaps "m_objclass" could point to the module in this case

That was exactly my idea today as well. Instead of treating m_objclass as the defining class, we should generalize it to be the "parent" of the function: either the class or the module.
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
> - In Python code, __objclass__ should be the defining class, not the module.

Indeed. My idea would be to add an accessor __parent__ returning the m_parent field (whatever it is) and then implement __objclass__ as something like:

    @property
    def __objclass__(self):
        parent = getattr(self, "__parent__", None)
        if isinstance(parent, type):
            return parent
        else:
            raise AttributeError

In PEP 575, I don't plan to add a Python attribute specifically for getting the defining module: I'll leave that to PEP 573.
Re: [Python-Dev] PEP 573 -- Module State Access from C Extension Methods
On 2018-04-26 16:37, Nick Coghlan wrote:
> PEP 487 refers to this as the "owner" of a descriptor

That's just copied from the Python docs:

https://docs.python.org/3.8/reference/datamodel.html#object.__get__

Anyway, I never liked the name "owner" there; "cls" would have been much clearer.

For PEP 575, I thought of "owner" too, but I don't like it because it sounds too possessive. It sounds like the "owner" somehow has complete control over the function, and I don't want that association.
[Python-Dev] PEP 575 (Unifying function/method classes) update
Hello all,

I have updated PEP 575 and its reference implementation. See https://www.python.org/dev/peps/pep-0575/

The main differences with respect to the previous version are:

* METH_PASS_FUNCTION now passes the function *in addition* to self (previously, it was passed *instead* of self).

* __objclass__ was generalized to __parent__ and stores either the defining class or the defining module of a built-in function/method.

* Proposed a two-phase implementation for better backwards compatibility (at the cost of added complexity).

The first two items on the above list are meant to prepare for PEP 573, but they are sufficiently useful by themselves to add them to PEP 575.

On this mailing list, there have been concerns about backwards compatibility. This PEP does indeed affect code that does not use duck typing but instead uses type checks or things like inspect.isbuiltin(). Note that "affect" != "break". I don't know how bad this presumed breakage is. Personally, I think it will be acceptable, but others may disagree. What I *do* know for sure is that very little breaks in the Python standard library. If anybody has a clever idea to estimate the breakage, I would love to know.

Jeroen.
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-04-30 15:38, Mark Shannon wrote:
> While a unified *interface* makes sense, a unified class hierarchy
> and implementation, IMO, do not.

The main reason for the common base class is performance: when calling an object in the bytecode interpreter, CPython currently has separate special cases for calling Python functions, methods, method descriptors and built-in functions. By introducing a common base class, we reduce the number of special cases. Second, we allow custom classes to use this fast path: with PEP 575, it is possible to create new classes with the same __call__ performance as the current built-in function class.

> Bound-methods may be callables, but they are not functions, they are
> a pair of a function and a "self" object.

From the Python language point of view, that may be true, but that's not how you want to implement methods. When I write a method in C, I want it to be callable either as an unbound method or as a bound method: the C code shouldn't see the difference between the calls X.foo(obj) and obj.foo(). And you want both calls to be equally fast, so you don't want the bound method to just wrap the unbound method. For this reason, it makes sense to unify functions and methods.

> IMO, there are so many versions of "function" and "bound-method",
> that a unified class hierarchy and the resulting restriction to the
> implementation will make implementing a unified interface harder,
> not easier.

PEP 575 does not add any restrictions: I never claimed that all callables should inherit from base_function. Regardless, why would the common base class add restrictions? You can still add attributes and customize whatever you want in subclasses.

Jeroen.
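A small sketch of the bound/unbound point above; "SomeType" and "identity" are made-up names and the fragment would be part of an extension type's tp_methods table. Whether Python code calls SomeType.identity(obj, x) (unbound) or obj.identity(x) (bound), the C function receives the same (self=obj, arg=x) pair, so the C code indeed cannot tell the difference.

    /* A method of an extension type, implemented with METH_O. */
    static PyObject *
    SomeType_identity(PyObject *self, PyObject *arg)
    {
        Py_INCREF(arg);
        return arg;
    }

    static PyMethodDef SomeType_methods[] = {
        {"identity", SomeType_identity, METH_O, "return the argument unchanged"},
        {NULL, NULL, 0, NULL}           /* sentinel */
    };
    /* ...installed as the tp_methods slot of SomeType... */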
Re: [Python-Dev] PEP 575: Unifying function/method classes
On 2018-05-03 11:30, Victor Stinner wrote:
> Please don't queue backward incompatible changes for Python 4.0. You
> should use the regular deprecation process.

I don't really see how that can be done here. As Stefan said, the problem is that this change does not really fit into the deprecation cycle, since there is no specific use case to warn about.

The PEP proposes to change an implementation detail, and it's really hard to determine at runtime whether code is relying on that implementation detail. We could insert a DeprecationWarning in some places, but those would mostly be false positives (a DeprecationWarning would be shown but the code won't break). On top of that, there is no way to show a DeprecationWarning for code like "type(x) is foo".

Jeroen.
[Python-Dev] PEP 575 (Unifying function/method classes) update
Hello all,

I have updated PEP 575 in response to some posts on this mailing list and to some discussions in person with the core Cython developers. See https://www.python.org/dev/peps/pep-0575/

The main differences with respect to the previous version are:

* "builtin_function" was renamed to "cfunction". Since we are changing the name anyway, "cfunction" looked like a better choice because the word "built-in" typically refers to things from the builtins module.

* defined_function now only defines an API (it must support all attributes that a Python function has) without specifying the implementation.

* The "Two-phase Implementation" proposal for better backwards compatibility has been expanded and now offers 100% backwards compatibility for the classes and for the inspect functions.

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-06 09:35, Nick Coghlan wrote:
> Thanks for this update Jeroen! If it doesn't come up otherwise, I'll
> try to claim one of the lightning talk slots at the Language Summit
> to discuss this with folks in person :)

Sounds great! I'd love to hear what people think.

As an example of how the new functionality of PEP 575 can be used, I changed functools.lru_cache to implement the _lru_cache_wrapper class as a subclass of base_function. I added this to the reference implementation:

https://github.com/jdemeyer/cpython/tree/pep575
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-14 22:38, Petr Viktorin wrote:
> Why are these flags added? They aren't free – the space of available
> flags is not infinite. If something (Cython?) needs eight of them, it
> would be nice to mention the use case, at least as an example. What
> should Python do with a m_methods entry that has METH_CUSTOM set?
> Again it would be nice to have an example or use case.

They have no specific use case. I just added this because it made sense abstractly. I can remove this from my PEP to simplify it.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-14 19:56, Petr Viktorin wrote:
> It does quite a lot of things, and the changes are all intertwined,
> which will make it hard to get reviewed and accepted.

The problem is that many things *are* already intertwined currently. You cannot deal with functions without involving methods, for example.

An important note is that it was never my goal to create a minimal PEP. I did not aim for changing as little as possible. I was thinking: we are changing functions, what would be the best way to implement them? The main goal was fixing introspection, but a secondary goal was fixing many of the existing warts with functions. Probably this secondary goal will in the end be more important for the general Python community.

I would argue that my PEP may look complicated, but I'm sure that the end result will be a simpler implementation than we have today. Instead of having four related classes implementing similar functionality (builtin_function_or_method, method, method_descriptor and function), we have just one (base_function). The existing classes like method still exist with my PEP, but a lot of the core functionality is implemented in the common base_function. This is really one of the key points: while my PEP *could* be implemented without the base_function class, the resulting code would be far more complicated.

> Are there parts that can be left to a subsequent PEP, to simplify the
> document (and implementation)?

It depends. The current PEP is more or less a finished product. You can of course pick parts of the PEP and implement those, but then those parts will be somewhat meaningless individually. But if PEP 575 is accepted "in principle" (you accept the new class hierarchy for functions), then the details could be spread over several PEPs. But those individual PEPs would only make sense in the light of PEP 575. A few small details could be left out, such as METH_BINDING, but that wouldn't yield a significant simplification.

> It seems to me that the current complexity is (partly) due to the
> fact that how functions are *called* is tied to how they are
> *introspected*.

The *existing* situation is that introspection is totally tied to how functions are called. So I would argue that my PEP improves on that by removing some of those ties by moving __call__ to a common base class.

> Maybe we can change `inspect` to use duck-typing instead of
> isinstance?

That was rejected on https://bugs.python.org/issue30071

> Then, if built-in functions were subclassable, Cython functions could
> need to provide appropriate __code__/__defaults__/__kwdefaults__
> attributes that inspect would pick up.

Of course, that's possible. I don't think that it would be a *better* solution than my PEP though. Essentially, my PEP started from that idea. But then you realize that you'll need to handle not only built-in functions but also method descriptors (unbound methods of extension types). And you'll want to allow __get__ for the new subclasses. For efficiency, you really want to implement __get__ in the base classes (both builtin_function_or_method and method_descriptor) because of optimizations combining __get__ and __call__ (the LOAD_METHOD and CALL_METHOD opcodes). And then you realize that it makes no sense to duplicate all that functionality in both classes. So you add a new base class. You already end up with a major part of my PEP this way.

> That still leaves the issue of what inspect.isfunction() should do.

Often, "isfunction" is used to check for "has introspection", so you certainly want to allow custom built-in function classes to satisfy inspect.isfunction(). So you need to involve Python functions too in the class hierarchy. And that's more or less my PEP.

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-14 22:38, Petr Viktorin wrote:
> Why are these flags added?

I made a minor edit to the PEP to remove those flags:

https://github.com/python/peps/pull/649
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-15 18:36, Petr Viktorin wrote:
> What is your ultimate use case?

(I'll just answer this one question now and reply to the more technical comments in another thread.)

My ultimate use case is being able to implement functions and methods which (A) are equally fast as the existing built-in functions and methods, and (B) behave, from a user's point of view, like Python functions.

With objective (A) I want no compromises: CPython has many optimizations for built-in functions and all of them should work for my new functions.

Objective (B) means more precisely:

1. Implementing __get__ to turn a function into a method.

2. Being recognized as "functions" by tools like Sphinx and IPython.

3. Introspection support such as inspect.signature() and inspect.getsource().

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-15 18:36, Petr Viktorin wrote:
> Naturally, large-scale changes have less of a chance there.

Does it really matter that much how large the change is? I think you are focusing too much on the change instead of the end result. As I said in my previous post, I could certainly make less disruptive changes. But would that really be better? (If you think that the answer is "yes" here, I honestly want to know.) I could make the code less different than today, but at the cost of added complexity. Building on top of the existing code is like building on a bad foundation: the higher you build, the messier it gets. Instead, I propose a solid new foundation. Of course, that requires more work to build, but once it is built, the finished building looks a lot better.

> With such a "finished product" PEP, it's hard to see if some of the
> various problems could be solved in a better way -- faster, more
> maintainable, or less disruptive.

With "faster", do you mean runtime speed? I'm pretty confident that we won't lose anything there. As I argued above, my PEP might very well make things "more maintainable", but this is of course very subjective. And "less disruptive" was never a goal for this PEP.

> It's also harder from a psychological point of view: you obviously
> already put in a lot of good work, and it's harder to waste that work
> if an even better solution is found.

I hope that this won't be my psychology. As a developer, I prefer to focus on problems rather than on solutions: I don't want to push a particular solution, I want to fix a particular problem. If an even better solution is accepted, I will be a very happy man. What I would hate is for this PEP to be rejected because some people claim that the problem can be solved in a better way, but without actually suggesting such a better way.

> Is a branching class hierarchy, with quite a few new flags for
> feature selection, the kind of simplicity we want?

Maybe yes, because it *concentrates* all complexity in one small place. Currently, we have several independent classes (builtin_function_or_method, method_descriptor, function, method) which all require various forms of special casing in the interpreter, with some code duplication. With my PEP, this all goes away and instead we need to understand just one class, namely base_function.

> Would it be possible to first decouple things, reducing the
> complexity, and then tackle the individual problems?

What do you mean by "decouple things"? Can you be more concrete?

> The class hierarchy still makes it hard to decouple the introspection
> side (how functions look on the outside) from the calling mechanism
> (how the calling works internally).

Any class which wants to profit from fast function calls can inherit from base_function. It can add whatever attributes it wants and it can choose to implement documentation and/or introspection in whatever way it wants. It can choose not to care about that at all. That looks very decoupled to me.

> Starting from an idea and ironing out the details lets you (and,
> since you published results, everyone else) figure out the tricky
> details. But ultimately it's exploring one path of doing things – it
> doesn't necessarily lead to the best way of doing something.

So far I haven't seen any other proposals...

> That's a good question. Maybe inspect.isfunction() serves too many
> use cases to be useful. Cython functions should behave like "def"
> functions in some cases, and like built-in functions in others.

From the outside, i.e. the user's point of view, I want them to behave like Python functions. Whether they are implemented in C or Python should just be an implementation detail. Of course there are attributes like __code__ which dive into implementation details, so there you will see the difference.

> Before we change how inspect.isfunction ultimately behaves, I'd like
> to make its purpose clearer (and try to check how that meshes with
> the current use cases).

The problem is that this is not easy to do. You could search CPython for occurrences of inspect.isfunction() and you could search your favorite Python projects. This will give you some indication, but I'm not sure whether that will be representative.

From what I can tell, inspect.isfunction() is mainly used as a guard for attribute access: it implies for example that a __globals__ attribute exists. And it's used by documentation tools to decide that an object should be documented as a Python function whose signature can be extracted using inspect.signature().

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-16 17:31, Petr Viktorin wrote:
> The larger a change is, the harder it is to understand

I already disagree here... I'm afraid that you are still confusing the largeness of the *change* with the complexity of the *result* after the change is implemented. A change that *removes* complexity should be considered a good thing, even if it's a large change.

That being said, if you want me to make smaller changes, I could do it. But I would do it for *you* personally, because I'm afraid that other people might rightly complain that I'm making things too complicated. So I would certainly like some feedback from others on this point.

> Less disruptive changes tend to have a better backwards compatibility
> story.

Maybe in very general terms, yes. But I believe that the "disruptive" changes that I'm making will not contribute to backwards incompatibility. Adding new ml_flags flags shouldn't break anything, and adding a base class shouldn't either (I doubt that there is code relying on the fact that type(len).__base__ is object). In my opinion, the one change that is most likely to cause backwards compatibility problems is changing the type of bound methods of extension types. And that change is even in the less disruptive PEP 576.

> Mark Shannon has an upcoming PEP with an alternative to some of the
> issues.

I'm looking forward to a serious discussion about that. However, from a first reading, I'm not very optimistic about its performance implications.

> Currently, the "outside" of a function (how it looks when
> introspected) is tied to the "inside" (what happens internally when
> it's called). Can we better enable pydoc/IPython developers to tackle
> introspection problems without wading deep in the internals and call
> optimizations?

I proposed complete decoupling in https://bugs.python.org/issue30071 and that was rejected. Anyway, decoupling of introspection is not the essence of this PEP. This PEP is really about allowing custom built-in function subclasses. That's the hard part where CPython internals come in. So I suggest that we leave the discussion about introspection and focus on the function classes.

> But, it still has to inherit from base_function to "look like a
> function". Can we remove that limitation in favor of duck typing?

Duck typing is a Python thing; I don't know what "duck typing" would mean on the C level. We could replace the existing isinstance(..., base_function) check with a different fast check. For example, we (together with the Cython devs) have been pondering a new type field, say tp_cfunctionoffset, pointing to a certain C field in the object structure. That would work, but it would not be so fundamentally different from the current PEP.

*PS*: On Friday, I'm leaving for two weeks of holidays. So if I don't reply to comments on PEP 575 or alternative proposals, don't take it as a lack of interest.

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-16 17:31, Petr Viktorin wrote:
> Less disruptive changes tend to have a better backwards compatibility
> story. A less intertwined change makes it easier to revert just a
> single part, in case that becomes necessary.

I'll just repeat what I said in a different post on this thread: we can still *implement* the PEP in a less intertwined and more gradual way. The PEP deals with several classes, and each class can be changed separately.

However, there is not much point in starting this process if you don't intend to go all the way. The power of PEP 575 is really in using this base_function class in many places. A PEP just adding the class base_function as base class of builtin_function_or_method, without using it anywhere else, would make no sense by itself. Still, that could be a first isolated step in the implementation.

If PEP 575 is accepted, I would like to follow it up with PEPs to add more classes to the base_function hierarchy (candidates: staticmethod, classmethod, classmethod_descriptor, method-wrapper, slot wrapper, functools.lru_cache).

Jeroen.
Re: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes
On 2018-05-19 11:15, mark wrote:
> PEP 576 aims to fulfill the same goals as PEP 575

(This is a copy of my comments on GitHub before this PEP was official.)

**Performance**

Most importantly, changing bound methods of extension types from builtin_function_or_method to bound_method will yield a performance loss. It might be possible to mitigate this somewhat by adding specific optimizations for calling bound_method. However, that would add extra complexity and it will probably still be slower than the existing code.

I would also like to know whether it will be possible for custom built-in function subclasses to implement __get__ to change a function into a method (like Python functions do) and whether/how the LOAD_METHOD opcode will work in that case.

**Introspection**

When I want "introspection support", that goes beyond the call signature. For example, inspect.getfile should also be supported. Currently, that simply raises an exception for built-in functions.

I think it's important to specify the semantics of inspect.isfunction. Given that you don't mention it, I assume that inspect.isfunction will continue to return True only for Python functions. But that way, these new function classes won't behave like Python functions.

> fully backwards compatible.

I wonder why you think it is "fully backwards compatible". Just like PEP 575, you are changing the classes of certain objects. I think it's fairer to say that both PEP 575 and PEP 576 might cause minor backwards compatibility issues. I certainly don't think that PEP 576 is significantly more backwards compatible than PEP 575.

PS: in your PEP, you write "bound_method", but I guess you mean "method". PEP 575 proposes to rename "method" to "bound_method".

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-05-19 15:29, Nick Coghlan wrote:
> That's not how code reviews work, as their complexity is governed by
> the number of lines changed (added/removed/modified), not just the
> number of lines that are left at the end.

Of course, you are right. I didn't mean literally that only the end result matters, but it should certainly be considered. If you only do small incremental changes, complexity tends to build up, because choices which are locally optimal are not always globally optimal. Sometimes you need to do some refactoring to revisit some of that complexity. This is part of what PEP 575 does.

> That said, "deletes more lines than it adds" is typically a point
> strongly in favour of a particular change.

This certainly won't be true for my patch, because there is a lot of code that I need to keep for backwards compatibility (all the old code for method_descriptor in particular).

Going back to the review of PEP 575, I see the following possible outcomes:

(A) Accept it as is (possibly with minor changes).

(B) Accept the general idea but split the details up into several PEPs which can still be discussed individually.

(C) Accept a minimal variant of PEP 575, only changing existing classes but not changing the class hierarchy.

(D) Accept some yet-to-be-written variant of PEP 575.

(E) Don't fix the use case that PEP 575 wants to address.

Petr Viktorin suggests (C). I am personally quite hesitant, because that would only add complexity and it wouldn't be the best choice for the future maintainability of CPython. I also fear that this hypothetical PEP variant would be rejected for that very reason. Of course, if there is general agreement that (C) is the way to go, then that is fine for me.

If people feel that PEP 575 is currently too complex, I think that (B) is a very good compromise. The end result would be the same as what PEP 575 proposes. Instead of changing many things at once, we could handle each class in a separate PEP. But the motivation of those mini-PEPs would still be PEP 575. So, in order for this to make sense, the general idea of PEP 575 needs to be accepted: adding a base_function base class and making various existing classes subclasses of that.

Jeroen.
Re: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes
For the record: the only reason I replied on GitHub was that the proposal had not yet been posted (as far as I know) to any mailing list. Typically, a post is made to a mailing list at more or less the same time as the PEP is created. In this case, there was a delay of a few days, maybe also because of unrelated issues with the compilation of the PEPs.
Re: [Python-Dev] Stable ABI
On 2018-06-01 17:18, Nathaniel Smith wrote:
> Unfortunately, very few people use the stable ABI currently, so it's
> easy for things like this to get missed.

So there are no tests for the stable ABI in Python?
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
Hello,

I have been working on a slightly different PEP which uses a new type slot tp_ccalloffset instead of the base_function base class. You can see the work in progress here:

https://github.com/jdemeyer/PEP-ccall

By creating a new protocol that each class can implement, the features of a class are fully decoupled from the class hierarchy (such coupling was complained about during the PEP 575 discussion). So I have become convinced that this is a better approach.

It also has the advantage that changes can be made more gradually: this PEP changes nothing at all on the Python side; it only changes the CPython implementation. I still think that it would be a good idea to refactor the class hierarchy, but that's now an independent issue.

Another advantage is that it's more general and makes it easier for existing classes to use the protocol (PEP 575, on the other hand, requires subclassing from base_function, which may not be compatible with an existing class hierarchy).

Jeroen.
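A very rough sketch of the idea for readers who have not followed the draft; the struct layout and field names below are illustrative and may not match the work-in-progress text exactly. The type stores an offset (tp_ccalloffset), and at that offset each instance stores the data needed to call it.

    /* Illustrative sketch only -- names may differ from the draft. */
    typedef PyObject *(*PyCFunc)(PyObject *self, PyObject *const *args,
                                 Py_ssize_t nargs, PyObject *kwnames);

    typedef struct {
        uint32_t  cc_flags;    /* calling convention flags */
        PyCFunc   cc_func;     /* the C function to call */
        PyObject *cc_parent;   /* defining class or module */
    } PyCCallDef;

    /* Located in each instance at the offset given by tp_ccalloffset: */
    typedef struct {
        PyCCallDef *cr_ccall;
        PyObject   *cr_self;   /* __self__ */
    } PyCCallRoot;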
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-06-17 14:50, Ronald Oussoren wrote:
> This looks interesting. Why did you add a tp_ccalloffset slot to the
> type with the actual information in instances instead of storing the
> information in a slot?

Think of built-in functions. Every built-in function is a different callable and calls a different C function. So it must be stored in the instances. However, the case where all instances share a PyCCallDef is also possible: all instances would then simply have the same PyCCallDef pointer.

Jeroen.
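For comparison, the existing built-in function object already keeps its call information per instance rather than per type. This is simplified from Include/methodobject.h of CPython 3.7; check your version's headers for the exact definition.

    typedef struct {
        PyObject_HEAD
        PyMethodDef *m_ml;          /* describes the C function to call */
        PyObject    *m_self;        /* __self__: object or module, or NULL */
        PyObject    *m_module;      /* the __module__ attribute */
        PyObject    *m_weakreflist; /* list of weak references */
    } PyCFunctionObject;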
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-06-18 03:34, INADA Naoki wrote:
> Victor had tried to add `tp_fastcall` slot, but he suspended his
> effort because it's benefit is not enough for it's complexity.
> https://bugs.python.org/issue29259

I had a quick look at that patch and it's really orthogonal to what I'm proposing. I'm proposing to use the slot *instead* of the existing fastcall optimizations. Victor's patch was about adding fastcall support to classes that didn't support it before.

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-06-18 15:09, Victor Stinner wrote:
> 2) we implemented a lot of other optimizations which made calls
> faster without having to touch tp_call nor tp_fastcall.

And that's a problem, because these optimizations typically only work for specific classes. My PEP wants to replace those with something more structural.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-06-18 16:55, INADA Naoki wrote:
> Speeding up most python function and some bultin functions was very
> significant. But I doubt making some 3rd party call 20% faster can
> make real applications significant faster.

These two sentences are almost contradictory. I find it strange to claim that a given optimization was "very significant" in specific cases while saying that the same optimization won't matter in other cases.

People *have* done benchmarks for actual code and this is causing actual slow-downs of around 20% in actual applications. That is the main reason why I am trying to push this PEP (or PEP 575, which solves the same problem in a different way).

Jeroen.
Re: [Python-Dev] PEP 575 (Unifying function/method classes) update
On 2018-06-18 15:09, Victor Stinner wrote:
> There are multiple issues with tp_fastcall:

Personally, I think that you are exaggerating these issues. Below, I'm writing the word FASTCALL to refer to tp_fastcall in your patch as well as to my C call protocol in the PEP-in-progress.

> * ABI issue: it's possible to load a C extension using the old ABI,
>   without tp_fastcall: it's not possible to write type->tp_fastcall
>   on such type. This limitation causes different issues.

It's not hard to check for FASTCALL support and have a case distinction between using tp_call and FASTCALL.

> * If tp_call is modified, tp_fastcall may be outdated.

I plan to support FASTCALL only for extension types. Those cannot be changed from Python. If it turns out that FASTCALL might give significant benefits also for heap types, we can deal with those modifications: we already need to deal with such modifications anyway for existing slots like __call__.

> * Many public functions of the C API still requires the tuple and
>   dict to pass positional and keyword arguments, so a compatibility
>   layer is required to types who only want to implement FASTCALL.
>   Related issue: what if something calls tp_call with (args: tuple,
>   kwargs: dict)? Crash or call a compatibility layer converting
>   arguments to FASTCALL calling convention?

You make it sound as if such a "compatibility layer" is a big issue. You just need one C API function to put in the tp_call slot which calls the object using FASTCALL instead.
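To illustrate the kind of shim meant here: the type "MyCallable" and its "fc_call" field are hypothetical, not an existing CPython API. A single generic tp_call implementation like the one below unpacks a classic (args, kwargs) call and forwards it to the instance's fast entry point.

    typedef PyObject *(*fastcallfunc)(PyObject *self, PyObject *const *args,
                                      Py_ssize_t nargs, PyObject *kwargs);

    typedef struct {
        PyObject_HEAD
        fastcallfunc fc_call;       /* hypothetical fast entry point */
    } MyCallable;

    /* Generic tp_call bridging to the FASTCALL-style entry point. */
    static PyObject *
    MyCallable_call(PyObject *self, PyObject *args, PyObject *kwargs)
    {
        MyCallable *obj = (MyCallable *)self;
        return obj->fc_call(self,
                            &PyTuple_GET_ITEM(args, 0),  /* items as a C array */
                            PyTuple_GET_SIZE(args),
                            kwargs);
    }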
[Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods
Hello,

Let me present PEP 579 and PEP 580.

PEP 579 is an informational meta-PEP, listing some of the issues with functions/methods implemented in C. The idea is to create several PEPs, each fixing some part of the issues mentioned in PEP 579.

PEP 580 is a standards track PEP to introduce a new "C call" protocol, which is an important part of PEP 579. In the reference implementation (which is work in progress), this protocol will be used by built-in functions and methods. However, it should be used by more classes in the future.

You can find the texts at

https://www.python.org/dev/peps/pep-0579
https://www.python.org/dev/peps/pep-0580

Jeroen.
Re: [Python-Dev] C-level calling
On 2018-06-20 08:00, Stefan Behnel wrote:
> Just to add another bit of background on top of the current
> discussion, there is an idea around, especially in the scipy/big-data
> community, (and I'm not giving any guarantees here that it will lead
> to a PEP + implementation, as it depends on people's workload) to
> design a dedicated C level calling interface for Python. Think of it
> as similar to the buffer interface, but for calling arbitrary C
> functions by bypassing the Python call interface entirely. Objects
> that wrap some kind of C function (and there are tons of them in the
> CPython world) would gain C signature meta data, maybe even for
> overloaded signatures, and C code that wants to call them could
> validate that meta data and call them as native C calls.

See also

https://www.python.org/dev/peps/pep-0579/#allowing-native-c-arguments

I specifically designed PEP 580 to be extensible, such that it would be possible to add such features later.
Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods
On 2018-06-20 16:09, Antoine Pitrou wrote:
> But there seems to be some complication on top of that:
> - PyCCall_FastCall() accepts several types for the keywords, even a
>   dict;

That is actually a *simplification*, not a *complication*. Currently, there is a huge amount of code duplication between _PyMethodDef_RawFastCallKeywords and _PyMethodDef_RawFastCallDict. Folding both of these into one function actually makes things simpler.

> does it get forwarded as-is to the `cc_func` or is it first
> transformed?

Transformed (obviously, otherwise it would be a huge backwards incompatibility problem).

> - there's CCALL_OBJCLASS and CCALL_SLICE_SELF which have, well,
>   non-obvious behaviour (especially the latter), especially as it is
>   conditioned on the value of other fields or flags

It's actually quite obvious when you think about it: both are needed to support existing use cases. Perhaps it's just not explained well enough in the PEP.

> I wonder if there's a way to push some of the specificities out of
> the protocol and into the C API that mediates between the protocol
> and actual callers?

Sorry, I have no idea what you mean here. Actually, those flags are handled by the C API: the actual C functions don't need to care about those flags.
Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods
On 2018-06-20 16:42, Antoine Pitrou wrote:
> I'm wondering what amount of code and debugging is needed for, say,
> Cython or Numba to implement that protocol as a caller, without going
> through the C API's indirections (for performance).

The goal is to have a really fast C API without a lot of indirections. If Cython or Numba can implement the protocol faster than CPython, we should just change the CPython implementation to be equally fast.
Re: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579
On 2018-06-20 17:42, INADA Naoki wrote:
> I don't have any idea about changing METH_FASTCALL more. If Victor
> and Serhiy think so, and PyPy maintainers like it too, I want to make
> it public as soon as possible.

There are two different things here.

The first is documenting METH_FASTCALL such that everybody can create built-in functions using the METH_FASTCALL signature. I think that the API for METH_FASTCALL (with or without METH_KEYWORDS) is fine, so I support making it public. This is really just a documentation issue, so I see no reason why it couldn't be added to 3.7.0 if we're fast.

The API for *calling* functions using the FASTCALL convention is more of a mess though. There are functions taking keyword arguments as a dict and functions taking them as a tuple. As I mentioned in PEP 580, I'd like to merge these and simply allow either a dict or a tuple. Since this would require an API change, this won't be for 3.7.0.

Jeroen.
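For readers who haven't used it: this is roughly what the METH_FASTCALL signatures look like in CPython 3.7 (the function names here are made up; check the headers/docs of your version for the exact typedefs).

    /* METH_FASTCALL: positional arguments arrive as a C array. */
    static PyObject *
    count_args(PyObject *self, PyObject *const *args, Py_ssize_t nargs)
    {
        /* args[0] .. args[nargs-1] are the positional arguments */
        return PyLong_FromSsize_t(nargs);
    }

    /* METH_FASTCALL | METH_KEYWORDS: kwnames is a tuple of keyword
       argument names; their values follow the positional arguments
       in the args array. */
    static PyObject *
    count_all_args(PyObject *self, PyObject *const *args, Py_ssize_t nargs,
                   PyObject *kwnames)
    {
        Py_ssize_t nkw = (kwnames == NULL) ? 0 : PyTuple_GET_SIZE(kwnames);
        return PyLong_FromSsize_t(nargs + nkw);
    }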
Re: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods
On 2018-06-21 11:22, Victor Stinner wrote: https://www.python.org/dev/peps/pep-0580/#the-c-call-protocol CCALL_VARARGS: cc_func(PyObject *self, PyObject *args) If we add a new calling convention This is not a *new* calling convention, it's the *existing* calling convention for METH_VARARGS. Obviously, we need to continue to support that. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
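For comparison, a minimal sketch (illustration only) of that existing METH_VARARGS convention: the callee receives all positional arguments packed into a tuple and typically unpacks them with PyArg_ParseTuple.

    #include "Python.h"

    /* METH_VARARGS / CCALL_VARARGS style: args is a tuple of the
       positional arguments. */
    static PyObject *
    add2_varargs(PyObject *self, PyObject *args)
    {
        PyObject *a, *b;
        if (!PyArg_ParseTuple(args, "OO:add2", &a, &b))
            return NULL;
        return PyNumber_Add(a, b);
    }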
[Python-Dev] About [].append == [].append
Currently, we have: >>> [].append == [].append False However, with a Python class: >>> class List(list): ... def append(self, x): super().append(x) >>> List().append == List().append True In the former case, __self__ is compared using "is" and in the latter case, it is compared using "==". I think that comparing using "==" is the right thing to do because "is" is really an implementation detail. Consider >>> (1).bit_length == (1).bit_length True >>> (1).bit_length == (1+0).bit_length False I guess that's also the reason why CPython internally rarely uses "is" for comparisons. See also: - https://bugs.python.org/issue1617161 - https://bugs.python.org/issue33925 Any opinions? Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] About [].append == [].append
On 2018-06-21 13:33, Ivan Pozdeev via Python-Dev wrote: First, tell us what problem you're solving. There is no specific problem I want to solve here. I just noticed an inconsistency and I wondered if it would be OK to change the implementation of comparisons of builtin_function_or_method instances. It's a valid question to ask even if it doesn't solve an actual problem. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] PEP 580 (C call protocol) draft implementation
Hello all, I have a first draft implementation of PEP 580 (introducing the C call protocol): https://github.com/jdemeyer/cpython/tree/pep580 Almost all tests pass, only test_gdb and test_pydoc fail for me. I still have to fix those. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] About [].append == [].append
On 2018-06-23 03:50, Steven D'Aprano wrote: I think it is more important that builtin methods and Python methods behave the same. +1 This inconsistency is the *real* problem here. It's one little extra complication for merging those classes (which was proposed in PEP 575, 576 and 579). ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Policy on refactoring/clean up
Hello, On https://github.com/python/cpython/pull/7909 I encountered friction for a PR which I expected to be uncontroversial: it just moves some code without changing any functionality. So basically my question is: is there some CPython policy *against* refactoring code to make it easier to read and write? (Note that I'm not talking about pure style issues here) Background: cpython has a source file "call.c" (introduced in https://github.com/python/cpython/pull/12) but the corresponding declarations are split over several .h files. While working on PEP 580, I found this slightly confusing. I decided that it would make more sense to group all these declarations in a new file "call.h". That's what PR 7909 does. In my opinion, the resulting code is easier to read. It also defines a clear place for declarations of future functionality added to "call.c" (for example, if we add a public API for FASTCALL). Finally, I added/clarified a few comments. I expected the PR to be either ignored or accepted. However, I received a negative reaction from Inada Naoki on it. I don't mind closing the PR and keeping the status quo if there is a general agreement. However, I'm afraid that a future reviewer of PEP 580 might say "your includes are a mess" and he will be right. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Policy on refactoring/clean up
On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: AFAICS, your PR is not a strict improvement What does "strict improvement" even mean? Many changes are not strict improvements, but still useful to have. Inada pointed me to YAGNI (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I disagree with that premise: there is a large gray zone between "completely useless" and "really needed". My PR falls in that gap of "nice to have but we can do without it". You may suggest it as a supplemental PR to PEP 580. Or even a part of it, but since the changes are controversial, better make the refactorings into separate commits so they can be rolled back separately if needed. If those refactorings are rejected now, won't they be rejected as part of PEP 580 also? ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Policy on refactoring/clean up
On 2018-06-26 13:54, INADA Naoki wrote: No, YAGNI is posted by someone and they removed their comment. Sorry for that, I misunderstood the email that GitHub sent me. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Policy on refactoring/clean up
On 2018-06-26 13:54, Ivan Pozdeev via Python-Dev wrote: This is exactly what the YAGNI principle is about, and Inada was right to point to it. Until you have an immediate practical need for something, you don't really know the shape and form for it that you will be the most comfortable with. Thus any "would be nice to have" tinkerings are essentially a waste of time and possibly a degradation, too: you'll very likely have to change them again when the real need arises -- while having to live with any drawbacks in the meantime. It is important to clarify that this is exactly what I did. I *have* an implementation of PEP 580 and it's based on that PR 7909. I just think that this PR makes sense independently of whether PEP 580 will be accepted. So, if you suggest those changes together with the PEP 580 PR That sounds like a bad idea because that would be mixing two issues in one PR. If I want to increase my chances of getting PEP 580 and its implementation accepted, I shouldn't bring in unrelated changes. To put it in a different perspective: if somebody else made a PR to one of my projects doing a refactoring and adding new features, I would ask them to split it up. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Policy on refactoring/clean up
On 2018-06-26 13:54, INADA Naoki wrote: Real need is important than my preference. If it is needed PEP 580, I'm OK. Of course it's not needed. I never claimed that it was. I think it's *nice to have* right now and slightly more *nice to have* when changes/additions are made to call.c in the future. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 576
On 2018-06-26 21:43, Mark Shannon wrote: https://github.com/markshannon/pep-576 Your version of PEP 576 looks very different from the "official" PEP 576 at https://www.python.org/dev/peps/pep-0576/ So can you please make a pull request to https://github.com/python/peps/pulls Also feel free to add pointers to your PEP on PEP 579. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Policy on refactoring/clean up
On 2018-06-27 00:02, Guido van Rossum wrote: And TBH a desire to refactor a lot of code is often a sign of a relatively new contributor who hasn't learned their way around the code yet, so they tend to want to make the code follow their understanding rather than letting their understanding follow the code. ...or it could be that the code is written the way it is only for historical reasons, instead of being purposely written that way. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 576
On 2018-06-26 21:43, Mark Shannon wrote: https://github.com/markshannon/pep-576 This actually looks close to Victor Stinner's bpo-29259. But instead of storing the function pointer in the class, you're storing it in the instance. One concern that I have is that this might lead to code duplication. You require that every class implements its own specialized _FOO_FastCallKeywords() function. So you end up with _PyCFunction_FastCallKeywords(), _PyMethodDescr_FastCallKeywords(), _PyFunction_FastCallKeywords(). If I want to implement a similar class myself, I have to reinvent that same wheel again. With PEP 580, I replace all those _FOO_FastCallKeywords() functions by one PyCCall_FASTCALL() function. Admittedly, my PyCCall_FASTCALL() is more complex than each of those _FOO_FastCallKeywords() individually. But overall, I think that PEP 580 leads to simpler code. Second, you still have a performance problem for methods. You made sure that the method optimizations in the Python bytecode interpreter continue to work, but method calls from C will be slowed down. I don't know to what extent and whether it really matters, but it's something to note. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Comparing PEP 576 and PEP 580
Hello all, in order to make reviewing PEP 576/580 easier and possibly take some ideas from one PEP to the other, let me state the one fundamental difference between these PEPs. There are many details in both PEPs that can still change, so I'm focusing on what I think is the big structural difference. To be clear: I'm referring to the PEP 576 version at https://github.com/markshannon/pep-576/blob/master/README.rst (this really should be merged in the main PEP repo). Both PEPs add a hook for fast calling of C functions. However, they do that on a different level. Let's trace what _PyObject_FastCallKeywords() currently does when acting on an instance of builtin_function_or_method:

A. _PyObject_FastCallKeywords() calls
B. _PyCFunction_FastCallKeywords() which calls
C. _PyMethodDef_RawFastCallKeywords() which calls
D. the actual C function (*ml_meth)()

PEP 576 hooks the call A->B while PEP 580 hooks the call B->D (getting rid of C).

Advantages of the high-level hook (PEP 576):
* Much simpler protocol than PEP 580.
* More general since B can be anything.
* Not being forced to deal with "self".
* Slightly faster when you don't care about B.

Advantages of the low-level hook (PEP 580):
* No need to duplicate the code from B (see the various existing _{FOO}_FastCallKeywords functions).
* Enables certain optimizations because other code can make assumptions about what B does.

In my personal opinion, the last advantage of PEP 580 is really important: some existing optimizations depend on it and it also allows extending the protocol in a "performance-compatible" way: it's easy to extend the protocol in a way that callers can benefit from it. Anyway, it would be good to have some guidance on how to proceed here. I would really like something like PEP 580 to be accepted and I'm willing to put time and effort into achieving that. Thanks, Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-04 03:31, INADA Naoki wrote: I think both PEPs are relying on FASTCALL calling convention, and can't be accepted until FASTCALL is stable & public. First of all, the fact that FASTCALL has not been made public should not prevent us from discussing those PEPs and even making a (provisional?) decision on them. I don't think that the precise API of FASTCALL really matters that much. More importantly, I don't think that you can separate making FASTCALL public from PEP 576/580. As you noted in [1], making FASTCALL public means more than just documenting METH_FASTCALL. In particular, a new API should be added for calling objects using the FASTCALL convention. Here I mean both an abstract API for arbitrary callables as well as a specific API for certain classes. Since PEP 580 (and possibly also PEP 576) proposes changes to the implementation of FASTCALL, it makes sense to design the public API for FASTCALL after it is clear which of those PEPs (if any) is accepted. If we fix the FASTCALL API now, it might not be optimal when either PEP 576 or PEP 580 is accepted. Jeroen. [1] https://mail.python.org/pipermail/python-dev/2018-June/153956.html ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-05 05:41, INADA Naoki wrote: And stabilizing the calling convention is a prerequisite for designing new calling APIs. I don't see why. I made my PEP with the assumption that the METH_FASTCALL calling convention won't change. As far as I know, nobody advocated for changing it. But even if we decide to change METH_FASTCALL, I could trivially adapt my PEP. That's why I suggest discussing METH_FASTCALL first. I certainly agree that it's a good idea to discuss METH_FASTCALL, but I still don't see why that should block the discussion of PEP 576/580. I can understand that you want to wait to *implement* PEP 576/580 as long as METH_FASTCALL isn't public. But we should not wait to *discuss* those PEPs. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-05 13:32, INADA Naoki wrote: Core devs interested in this area are a limited resource. I know, and unfortunately there is nothing that I can do about that. It would be a pity if PEP 580 (or a variant like PEP 576) were not accepted simply because no core developer cares enough. As far as I understand, there are some important topics to discuss.

a. Low-level calling convention, including the argument parsing API.
b. New API for calling objects without an argument tuple and dict.
c. How more types can support FASTCALL, LOAD_METHOD and CALL_METHOD.
d. How to reorganize existing builtin types, without breaking the stable ABI.

Right, that's why I wanted PEP 580 to be only about (c) and nothing else. I made the mistake in PEP 575 of also involving (d). I still don't understand why we must finish (a) before we can even start discussing (c). A reference implementation helps discussion. METH_FASTCALL and argument parsing for METH_FASTCALL are already implemented in CPython. Not in documented public functions, but the implementation exists. And PEP 580 also has a reference implementation: https://github.com/jdemeyer/cpython/tree/pep580 Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] On the METH_FASTCALL calling convention
Hello all, As discussed in some other threads ([1], [2]), we should discuss the METH_FASTCALL calling convention. For passing only positional arguments, a C array of Python objects is used, which is as fast as it can get. When the Python interpreter calls a function, it builds that C array on the interpreter stack:

>>> from dis import dis
>>> def f(x, y): return g(x, y, 12)
>>> dis(f)
  1           0 LOAD_GLOBAL              0 (g)
              2 LOAD_FAST                0 (x)
              4 LOAD_FAST                1 (y)
              6 LOAD_CONST               1 (12)
              8 CALL_FUNCTION            3
             10 RETURN_VALUE

A C array can also easily and efficiently be handled by the C function receiving it. So I consider this uncontroversial. The convention for METH_FASTCALL|METH_KEYWORDS is that keyword *names* are passed as a tuple and keyword *values* in the same C array with positional arguments. An example:

>>> from dis import dis
>>> def f(x, y, z): return f(x, foo=y, bar=z)
>>> dis(f)
  1           0 LOAD_GLOBAL              0 (f)
              2 LOAD_FAST                0 (x)
              4 LOAD_FAST                1 (y)
              6 LOAD_FAST                2 (z)
              8 LOAD_CONST               1 (('foo', 'bar'))
             10 CALL_FUNCTION_KW         3
             12 RETURN_VALUE

This is pretty clever: it exploits the fact that ('foo', 'bar') is a constant tuple stored in f.__code__.co_consts. Also, a tuple can be efficiently handled by the called code: it is essentially a thin wrapper around a C array of Python objects. So this works well. The only case when this handling of keywords is suboptimal is when using **kwargs. In that case, a dict must be converted to a tuple. It looks hard to me to support efficiently both the case of fixed keyword arguments (f(foo=x)) and a keyword dict (f(**kwargs)). Since the former is more common than the latter, the current choice is optimal. In other words: I see nothing to improve in the calling convention of METH_FASTCALL. I suggest to keep it and make it public as-is. Jeroen. [1] https://mail.python.org/pipermail/python-dev/2018-June/153945.html [2] https://mail.python.org/pipermail/python-dev/2018-July/154251.html ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
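To make the caller's side concrete as well, here is a rough sketch of the C equivalent of f(x, foo=y, bar=z). This is my own illustration and it uses _PyObject_FastCallKeywords, which is a private helper in the current CPython sources, so take the exact entry point as an assumption.

    #include "Python.h"

    /* Call f(x, foo=y, bar=z) using the FASTCALL convention: only x is
       positional (nargs == 1); the keyword values y and z follow it in
       the same C array, and kwnames holds the keyword names. */
    static PyObject *
    call_f(PyObject *f, PyObject *x, PyObject *y, PyObject *z)
    {
        PyObject *stack[3] = {x, y, z};
        PyObject *kwnames = Py_BuildValue("(ss)", "foo", "bar");
        if (kwnames == NULL)
            return NULL;
        PyObject *result = _PyObject_FastCallKeywords(f, stack, 1, kwnames);
        Py_DECREF(kwnames);
        return result;
    }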
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-05 14:20, INADA Naoki wrote: What you can do is "divide and conquer". Split PEP in small topics we can focus. The PEP is already small and focused, I really did my best to make it as minimal as possible. I don't see a meaningful way to split it up even further. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-05 21:57, Guido van Rossum wrote: Would it be possible to get outside experts to help? I don't understand what you mean: to help with what? Designing the PEP? Discussing the PEP? Accepting the PEP? Lobbying Python core devs? The Cython developers (in particular Stefan Behnel) certainly support my work. I have talked with them in person at a workshop and they posted a few emails to python-dev and they also gave me some personal comments about PEP 580. As for NumPy, one obvious place where some ideas of PEP 579 could be applied is ufuncs. However, typically ufuncs are not *called* in tight loops, they implement tight loops internally. So I don't think that there is pressing need for anything like PEP 580 in NumPy. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] On the METH_FASTCALL calling convention
On 2018-07-06 06:07, INADA Naoki wrote: Maybe, one way to improve METH_FASTCALL | METH_KEYWORDS can be this. kwds can be either tuple or dict. But that would be just pushing the complexity down to the callee. I'd rather have a simpler protocol at the expense of a slightly more complex supporting API. I also don't see the point: the calls where performance truly matters typically don't use keyword arguments anyway (independently of whether the called function accepts them). Moreover, the large majority of functions take normal keyword arguments, not **kwargs. When parsing those arguments, the dict would need to be unpacked anyway. So you don't gain much by forcing the callee to handle that instead of doing it in PyCCall_FASTCALL(). Functions just passing through **kwargs (say, functools.lru_cache) don't need a dict either: they can implement the C call protocol of PEP 580 with METH_FASTCALL and then call the wrapped function also using FASTCALL. So really the only remaining case is when the callee wants to do something with **kwargs as dict. But I find it hard to come up with a natural use case for that, especially one where performance matters. And even then, that function could just use METH_VARARGS. So I don't see any compelling reason to allow a dict in METH_FASTCALL. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
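To illustrate what "pushing the complexity down to the callee" would mean in code, here is a hypothetical sketch (not something anyone has proposed verbatim) of the extra branch every callee would need if the keywords argument could be either a tuple of names or a dict:

    #include "Python.h"

    /* Hypothetical: find the number of keyword arguments if `kwds` may
       be either a tuple of names (with the values stored in the args
       array after the positional arguments) or a dict. */
    static Py_ssize_t
    count_keywords(PyObject *const *args, Py_ssize_t nargs, PyObject *kwds)
    {
        (void)args;
        (void)nargs;
        if (kwds == NULL)
            return 0;                        /* no keyword arguments at all */
        if (PyTuple_Check(kwds))
            return PyTuple_GET_SIZE(kwds);   /* values live at args[nargs]... */
        return PyDict_Size(kwds);            /* dict carries names and values */
    }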
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-05 14:20, INADA Naoki wrote: like you ignored my advice about creating realistic benchmark for calling 3rd party callable before talking about performance... I didn't really want to ignore that, I just didn't know what to do. As far as I can tell, the official Python benchmark suite is https://github.com/python/performance However, that deals only with pure Python code, not with the C API. So those benchmarks are not relevant to PEP 580. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-06 23:12, Guido van Rossum wrote: It's your PEP. And you seem to be struggling with something. But I can't tell quite what it is you're struggling with. To be perfectly honest (no hard feelings though!): what I'm struggling with is getting feedback (either positive or negative) from core devs about the actual PEP 580. At the same time I assume you want your PEP accepted. As I also said during the PEP 575 discussion, my real goal is to solve a concrete problem, not to push my personal PEP. I still think that PEP 580 is the best solution but I welcome other suggestions. And how do they feel about PEP 576? I'd like to see some actual debate of the pros and cons of the details of PEP 576 vs. PEP 580. I started this thread to do precisely that. My opinion: PEP 580 has zero performance cost, while PEP 576 does make performance for bound methods worse (there is no reference implementation of the new PEP 576 yet, so that's hard to quantify for now). PEP 580 is also more future-proof: it defines a new protocol which can easily be extended in the future. PEP 576 just builds on PyMethodDef which cannot be extended because of ABI compatibility (putting __text_signature__ and __doc__ in the same C string is a good symptom of that). This extensibility is important because I want PEP 580 to be the first in a series of PEPs working out this new protocol. See PEP 579 for the bigger picture. One thing that might count against PEP 580 is that it defines a whole new protocol, which could be seen as too complicated. However, it must be this complicated because it is meant to generalize the current behavior and optimizations of built-in functions and methods. There are lots of little tricks currently in CPython that must be "ported" to the new protocol. OK, so is it your claim that the NumPy developers don't care about which one of these PEPs is accepted or even whether one is accepted at all? I don't know, I haven't contacted any NumPy devs yet, so that was just my personal feeling. These PEPs are about optimizing callables and NumPy isn't really about callables. I think that the audience for PEP 580 is mostly compilers (Cython for sure but possibly also Pythran, numba, cppyy, ...). Also certain C classes like functools.lru_cache could benefit from it. Yet earlier in *this* thread you seemed to claim that PEP 580 requires changes ro FASTCALL. I don't know what you mean with that. But maybe it's also confusing because "FASTCALL" can mean different things: it can refer to a PyMethodDef (used by builtin_function_or_method and method_descriptor) with the METH_FASTCALL flag set. It can also refer to a more general API like _PyCFunction_FastCallKeywords, which supports METH_FASTCALL but also other calling conventions like METH_VARARGS. I don't think that METH_FASTCALL should be changed (and PEP 580 isn't really about that at all). For the latter, I'm suggesting some API changes but nothing fundamental: mainly replacing the 5 existing private functions _PyCFunction_FastCallKeywords, _PyCFunction_FastCallDict, _PyMethodDescr_FastCallKeywords, _PyMethodDef_RawFastCallKeywords, _PyMethodDef_RawFastCallDict by 1 public function PyCCall_FASTCALL. Hopefully this clears some things up, Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 575, 576, 579 and 580
On 2018-07-07 15:38, Mark Shannon wrote: Hi, We seem to have a plethora of PEPs where we really ought to have one (or none?).

- PEP 575 has been withdrawn.
- PEP 579 is an informational PEP with the bigger picture; it does contain some of the requirements that you want to discuss here.
- PEP 580 and PEP 576 are two alternative implementations of a protocol to optimize callables implemented in C.

5. It should speed up CPython for the standard benchmark suite.

I'd like to replace this by: it must *not slow down* the standard benchmark suite and preferably should not slow down anything. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Comparing PEP 576 and PEP 580
On 2018-07-07 14:54, Mark Shannon wrote: There is a minimal implementation and has been for a while. There is a link at the bottom of the PEP. Yes, I saw that but the implementation does not correspond to the PEP. In particular, this sentence from the PEP has not been implemented: When binding a method_descriptor instance to an instance of its owning class, a bound_method will be created instead of a builtin_function_or_method It's not clear to me whether you still want to implement that or whether it should be dropped from the PEP. PEP 576 adds a new calling convention which can be used by *any* object. Seems quite extensible to me. Yes and no. Yes, it can do anything. But because it can do anything, callers cannot optimize certain special cases. For example, in PEP 576 you need an extra flag Py_TPFLAGS_FUNCTION_DESCRIPTOR because your protocol doesn't specify anything about __get__. Imagine that you want to support more optimizations like that in the future, how do you plan to do that? Of course, you can always add more stuff to PyTypeObject, but a separate structure like what I propose in PEP 580 might make more sense. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] On the METH_FASTCALL calling convention
On 2018-07-07 10:55, Serhiy Storchaka wrote: The first part of handling arguments can be made outside of the C function, by the calling API. Sure, it could be done but I don't see the advantage. I don't think you will gain performance because you are just moving code from one place to another. And how do you plan to deal with *args and **kwds in your proposal? You'll need to make sure that this doesn't become slower. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 575, 576, 579 and 580
On 2018-07-08 23:13, Mark Shannon wrote: I've added your suggestion, and everyone else's, to this github repo: https://github.com/markshannon/extended-calling-convention Feel free to comment on github, submit PRs or just email me directly if you have anything else you want to add. Do you agree to add this to PEP 579 instead? ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Micro-benchmarks for function calls (PEP 576/579/580)
Here is an initial version of a micro-benchmark for C function calling: https://github.com/jdemeyer/callbench I don't have results yet, since I'm struggling to find the right options to "perf timeit" to get a stable result. If somebody knows how to do this, help is welcome. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Micro-benchmarks for PEP 580
OK, I tried with --duplicate 200 and you can see the results at https://gist.github.com/jdemeyer/f0d63be8f30dc34cc989cd11d43df248 In short, the timings with and without PEP 580 are roughly the same (which is to be expected). Interestingly, a small but significant improvement can be seen when calling *unbound* methods. The real improvement comes from supporting a new calling protocol: formerly custom classes could only implement tp_call, but now they can use FASTCALL just like built-in functions/methods. For this, there is an improvement of roughly a factor 1.2 for calls without arguments, 1.6 for calls with positional arguments and 2.8 for calls with keywords. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Micro-benchmarks for PEP 580
On 2018-07-10 14:59, INADA Naoki wrote: Currently, we create a temporary long object for passing an argument. If there is a protocol for exposing the format used by PyArg_Parse*, we can bypass the temporary Python object and call myfunc_impl directly. Indeed, I already mentioned this at https://www.python.org/dev/peps/pep-0579/#allowing-native-c-arguments The way I see it, these kinds of improvements should be done on top of PEP 580. Maybe I didn't emphasize this enough, but one of the goals of PEP 580 is to create an *extensible* protocol where such features can easily be added later in a compatible way. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Micro-benchmarks for PEP 580
On 2018-07-11 00:48, Victor Stinner wrote: About your benchmark results: "FASTCALL unbound method(obj, 1, two=2): Mean +- std dev: 42.6 ns +- 29.6 ns" That's a very big standard deviation :-( Yes, I know. My CPU was overheating and was slowed down. But that seemed to have happened for a small number of benchmarks only. But given that you find these benchmarks stupid anyway, I assume that you don't really care. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Micro-benchmarks for PEP 580
On 2018-07-11 10:27, Antoine Pitrou wrote: I agree PEP 580 is extremely complicated and it's not obvious what the maintenance burden will be in the long term. But the status quo is also very complicated! If somebody wrote a PEP describing the existing implementation of builtin_function_or_method and method_descriptor with all its optimizations, you would probably also find it complicated. Have you actually looked at the existing implementation in Python/ceval.c and Objects/call.c for calling objects? One of the things that PEP 580 offers is replacing 5 (yes, five!) functions _PyCFunction_FastCallKeywords, _PyCFunction_FastCallDict, _PyMethodDescr_FastCallKeywords, _PyMethodDef_RawFastCallKeywords, _PyMethodDef_RawFastCallDict by a single function PyCCall_FASTCALL. Anyway, it would help if you could say why you (and others) think that it's complicated. Sure, there are many details to be considered (for example, the section about Descriptor behavior), but those are not essential to understand what the PEP does. I wrote the PEP as a complete specification, giving full details. Maybe I should add a section just explaining the core ideas without details? Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Micro-benchmarks for PEP 580
On 2018-07-11 10:50, Victor Stinner wrote: As you wrote, the cost of function calls is unlikely the bottleneck of an application. With that idea, METH_FASTCALL is not needed either. I still find it very strange that nobody seems to question all the crazy existing optimizations for function calls in CPython, yet at the same time claims that those are just stupid micro-optimizations which are surely not important for real applications. Anyway, I'm thinking about real-life benchmarks but that's quite hard. One issue is that PEP 580 by itself does not make existing code faster, but allows faster code to be written in the future. A second issue is that Cython (my main application) already contains optimizations for Cython-to-Cython calls. So, to see the actual impact of PEP 580, I should disable those. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] PEP 580 (C call protocol) minor update
I made some minor updates to PEP 580 (PEP editors: please merge https://github.com/python/peps/pull/741) and its reference implementation:

- Added a new introductory section explaining the basic idea.
- The C protocol no longer deals with __name__; a __name__ attribute is required but the protocol does not deal with its implementation.
- The PEP no longer deals with profiling. This means that profiling only works for actual instances of builtin_function_or_method and method_descriptor. Profiling arbitrary callables would be nice, but that is deferred to a future PEP.

The last two items are meant to simplify the PEP (although this is debatable since "simple" is very subjective). Enjoy! Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Specification of C function profiling?
Hello, it seems to me that there is no clear specification for the sys.setprofile() event c_call: the docs say "A C function is about to be called" but it's not clear what that means exactly, in particular when that C function is an unbound method like list.append. I also noticed that Lib/test/test_sys_setprofile.py doesn't test any of the c_* events. I'm asking in the context of https://bugs.python.org/issue34125 I found out that list.append([], None) *does* generate a c_call event but list.append([], None, **{}) does not. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Benchmarks why we need PEP 576/579/580
Hello, I finally managed to get some real-life benchmarks for why we need a faster C calling protocol (see PEPs 576, 579, 580). I focused on the Cython compilation of SageMath. By default, a function in Cython is an instance of builtin_function_or_method (analogously, method_descriptor for a method), which has special optimizations in the CPython interpreter. But the option "binding=True" changes those to a custom class which is NOT optimized. I ran the full SageMath testsuite several times without and with binding=True to find out any significant differences. The most dramatic difference is multiplication for generic matrices. More precisely, with the following command:

python -m timeit -s "from sage.all import MatrixSpace, GF; M = MatrixSpace(GF(9), 200).random_element()" "M * M"

With binding=False, I got

10 loops, best of 3: 692 msec per loop

With binding=True, I got

10 loops, best of 3: 1.16 sec per loop

This is a big regression which should be gone completely with PEP 580. I should mention that this was done on Python 2.7.15 (SageMath is not yet ported to Python 3) but I see no reason why the conclusions shouldn't be valid for newer Python versions. I used SageMath 8.3.rc1 and Cython 0.28.4. I hope that this finally shows that the problems mentioned in PEP 579 are real. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-21 19:07, INADA Naoki wrote: Good job. But I already +1 for adding support for extension callable type. Do you think this benchmark can be optimized more in future optimization which is possible by PEP 580, but not 576? I should clarify that the benchmark did not involve an implementation of PEP 576 or PEP 580. It simply shows what kind of regressions one gets when *not* implementing something like those PEPs. So this can't be used to compare PEP 576 versus PEP 580. I still think that PEP 576 would slow down bound method calls but without a reference implementation, I can only guess. (I know that PEP 576 claims a reference implementation but it doesn't correspond to the PEP. I'm basing myself on the text of PEP 576, not the "reference implementation".) Do you mean you backport LOAD_METHOD and fastcall to Python 2.7 for benchmarking? No, I did not. This is just benchmarking the difference between tp_call and more specialized call functions (In Python 2.7, that is call_function()). Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-21 19:55, Guido van Rossum wrote: I don’t think we can safely assume Python 3.7 has the same performance, actually. A lot has changed. I'm not denying that some things have changed. Rather, I claim that those changes wouldn't invalidate the benchmarks. I am comparing calls through tp_call (A) versus optimized call paths (B). I only need to assume that the speed improvements to (A) between 2.7 and 3.7 are not bigger than the speed improvements to (B). Most optimizations which have been done in Python 3.x target (B). In fact, I'm not aware of any optimization to (A) apart from maybe some minor code improvements. So I think it's a relatively safe assumption that the speed difference between (A) and (B) did not decrease from 2.7 to 3.7. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-22 08:27, INADA Naoki wrote: It's interesting... But I failed to build sage. What went wrong? Its build step is too different from a normal Python package. That's true because Sage considers itself a distribution rather than a package. But it's possible to pick the distribution apart and build just the Python package "sage". In fact, various Linux distros package Sage that way. The reason for it being a distribution is mainly that it has a huge number of dependencies (many of them not Python), so it wouldn't be possible to do "pip install sage" anyway. It took a very long time to build. That's just a matter of waiting a few hours. And the "install from source" document only describes the steps to make the `./sage` command work. It doesn't describe the steps to make `import sage` work. Those two are pretty much equivalent. If you really want just the latter, you can run "make sageruntime" in the Sage root. The target application should be easy to test, benchmark and profile for all core devs interested in these PEPs. I feel like the bar for this PEP is being raised all the time. First, you ask for an application benchmark and I provided an application benchmark. Now you complain that my application is not suitable. Why don't you just believe my timings? Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-22 01:14, Guido van Rossum wrote: Jeroen was asked to provide benchmarks but only provided them for Python 2. The reasoning that not much has changed that could affect the benchmarks feels a bit optimistic, that’s all. The micro-benchmarks showed a clear difference on Python 3.8 (git master) between tp_call and optimized call paths. The application benchmark on Python 2.7.15 shows that the difference between tp_call and optimized call paths matters in real applications. I agree that this isn't 100% bullet-proof evidence, but at least it's a strong indication that it might be worth it to start discussing the actual PEP. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-22 18:14, Guido van Rossum wrote: Sorry I don't have better news. I don't consider that particularly bad news (but not good news either). I feel like the status on PEP 580 isn't anywhere near accepted anyway. I just hope that Python development won't stall completely. Even if no formal action can be taken on this PEP (or any other), I hope that there will still be informal discussion. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-22 14:52, Stefan Behnel wrote: Someone has to maintain the *existing* code base and help newcomers to get into it and understand it. This is an important point that people seem to be overlooking. The complexity and maintenance burden of PEP 580 should be compared to the status-quo. The existing code is complicated, yet nobody seems to find that a problem. But when PEP 580 makes changes to that complicated code (and documents some of the existing complexity), it's suddenly the fault of PEP 580 that the code is complicated. I also wonder if there is a psychological difference simply because this is a PEP and not an issue on bugs.python.org. That might give the impression that it's a more serious thing somehow. Previous optimizations (https://bugs.python.org/issue26110 for example) were not done in a PEP and nobody ever mentioned that the extra complexity might be a problem. Finally, in some ways, my PEP would actually be a simplification because it replaces several special cases by one general protocol. Admittedly, the general protocol that I propose is more complicated than each existing special case individually but the overall complexity might actually decrease. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-22 22:32, Antoine Pitrou wrote: Two things: - the issue26110 changes are not very large, it's just two additional opcodes and a bit of support code I didn't mean to compare PEP 580 to that specific issue, it was just an example. Even if issue26110 is small, the total complexity added by many incremental changes can still be big. - more importantly, issue26110 is entirely internal changes, while you are proposing to add a new protocol (which is like a new API) Just to make sure I understand you: your point is that it's automatically more complicated because it's an API instead of an internal change? ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
I did exactly the same benchmark again with Python 3.7 and the results are similar. I'm copying and editing the original post for completeness: I finally managed to get some real-life benchmarks for why we need a faster C calling protocol (see PEPs 576, 579, 580). I focused on the Cython compilation of SageMath. By default, a function in Cython is an instance of builtin_function_or_method (analogously, method_descriptor for a method), which has special optimizations in the CPython interpreter. But the option "binding=True" changes those to a custom class which is NOT optimized. I ran the full SageMath testsuite several times on Python 2.7 without and with binding=True to find out any significant differences. I then checked if those differences could be reproduced on Python 3.7 (SageMath has not been fully ported to Python 3 yet). The most dramatic difference is multiplication for generic matrices. More precisely, with the following command:

python3 -m timeit -s "from sage.all import MatrixSpace, GF; M = MatrixSpace(GF(9), 200).random_element()" "M * M"

With binding=False, I got

1 loop, best of 5: 1.19 sec per loop

With binding=True, I got

1 loop, best of 5: 1.83 sec per loop

This is a big regression which should be gone completely with PEP 580. I used Python 3.7, SageMath 8.3.rc1 (plus a few patches to make it work with binding=True and with Python 3.7) and Cython 0.28.4. I hope that this finally shows that the problems mentioned in PEP 579 are real. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-23 01:54, Ivan Pozdeev via Python-Dev wrote: All the material to discuss that we have in this thread is a single test result that's impossible to reproduce and impossible to run in Py3. I just posted that it can be reproduced on Python 3.7: https://mail.python.org/pipermail/python-dev/2018-July/154740.html I admit that it's not entirely trivial to do that. The Python 3 port of SageMath is still work in progress and the Python 3.7 port even more so. So it requires a few patches. If somebody wants to reproduce those results right now, I could give more details. But really, I would recommend to wait a month or so and then hopefully those patches will be merged. It's however impossible to say from this how frequent these scenarios are in practice And how would you suggest that we measure that? All benchmarks are artificial in some way: for every benchmark, one can find reasons why it's not relevant. and how consistent the improvement is among them. I only posted the most serious regression. As another data point, the total time to run the full SageMath testsuite increased by about 1.8% when compiling the Cython code with binding=True. So one could assume that there is an average improvement of 1.8% with a much larger improvement in a few specific cases. Likewise, it's impossible to say anything about the complexity the changes will reduce/introduce without a proof-of-concept implementation. Why do you think that there is no implementation? As mentioned in PEP 580, there is an implementation at https://github.com/jdemeyer/cpython/tree/pep580 Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-23 00:28, Guido van Rossum wrote: So does your implementation of the PEP result in a net increase or decrease of the total lines of code? 12 files changed, 918 insertions(+), 704 deletions(-) That's a net increase, so there is no obvious win here. Still, I have various excuses for this increased number of lines of code:

1. I'm adding more comments and lines containing only characters from the set " {};". This accounts for about 60% of the increase in the number of lines of code. Clearly, this shouldn't count against me.

2. I still need to support some old ways of doing things. For backwards compatibility, functions and methods are still defined using a PyMethodDef. So I need to convert this old structure to the new protocol. I also keep support for some functions that my PEP makes obsolete, such as PyCFunction_Call(). All this requires code, but that code is simple.

3. In a few cases, specialized code is replaced by general code. For example, code that needs the __name__ of a function changes from accessing a known field in a C structure (func->m_ml->ml_name) to an actual Python attribute lookup. And Python code like

def __qualname__(func):
    name = func.__name__
    try:
        parent_qualname = func.__parent__.__qualname__
    except AttributeError:
        return name
    return str(parent_qualname) + "." + name

is conceptually pretty simple, but it becomes 37 lines of C code in my implementation.

4. The PEP does actually add a completely new feature: the flag CCALL_FUNCARG. That is something that I could easily remove for now, but PEP 580 would be a lot less useful without it. So it's something that I would certainly want to add in a later PEP anyway. It's also required for PEP 573.

Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Benchmarks why we need PEP 576/579/580
On 2018-07-23 12:13, Antoine Pitrou wrote: IMHO the main point of discussion should be judging the solution you are proposing Yes please. I would very much welcome a discussion about the actual content of the PEP instead of meta-discussions like how complicated it is. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] PEP 576/579/580 benchmark: mistune
Hello all, since my latest benchmark for PEP 580 [1] involved SageMath, which is quite a big project, I instead propose a much simpler benchmark involving mistune. mistune [2] is a Markdown parser implemented in the Python language. It optionally allows Cython compilation. It doesn't use any kind of optimization beyond that, but I created a branch [3] to use extension types instead of Python classes. Cython can either use built-in functions/methods or a custom class (which is not optimized but which would be optimized with PEP 580). I benchmarked mistune with custom classes [3] (binding=True, the default) and with built-in functions/methods [4] (binding=False). This is the median time of 5 runs:

Binding=True: 9.063s
Binding=False: 8.658s

So this shows again that PEP 580 improves performance in actual real-world use cases. Jeroen. [1] https://mail.python.org/pipermail/python-dev/2018-July/154740.html [2] https://github.com/lepture/mistune [3] https://github.com/jdemeyer/mistune/tree/cython_pxd [4] https://github.com/jdemeyer/mistune/tree/cython_pxd_nobinding ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Let's change to C API!
My first impression is that making things faster and hiding implementation details in the ABI are contrary goals. I agree with hiding implementation details in the API but not in the ABI. For example, you mention that you want to make Py_INCREF() a function call instead of a macro. But since Py_INCREF is very common, I would guess that this would make performance worse (not by much maybe but surely measurable). Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
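For context, a simplified sketch of the difference being discussed (the macro shown is a simplification of the real one, ignoring Py_REF_DEBUG bookkeeping): today the refcount bump is inlined at every call site, whereas an opaque function would hide the object layout at the cost of a call on a very hot path.

    #include "Python.h"

    /* Roughly what the Py_INCREF macro does today: an inlined field access. */
    #define MY_INCREF(op) (((PyObject *)(op))->ob_refcnt++)

    /* The alternative being discussed: an opaque function call, which
       hides ob_refcnt from extension modules but adds call overhead. */
    static void
    My_IncRef(PyObject *op)
    {
        op->ob_refcnt++;
    }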
[Python-Dev] Error message for wrong number of arguments
Hello, I noticed an inconsistency in the error messages for the number of arguments to a method call. For Python methods, the "self" argument is counted. For built-in methods, the "self" argument is *not* counted:

>>> class mylist(list):
...     def append(self, val): super().append(val)
>>> f = list().append
>>> f(1,2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: append() takes exactly one argument (2 given)
>>> g = mylist().append
>>> g(1,2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: append() takes 2 positional arguments but 3 were given

I think it has been argued before that it's a feature that self is counted. So I consider the error message for list().append a bug. This is one of the many oddities I noticed while working on improving built-in functions. Would you agree to change the error message for built-in methods to be closer to Python methods? Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 576/579/580 benchmark: mistune
On 2018-07-30 13:11, INADA Naoki wrote: Like the previous SageMath bench, this is caused by Cython's specialization in __Pyx_PyObject_CallOneArg. It specializes calls to PyFunction and PyCFunction, but it is not specialized for calling CyFunction. Yes, I saw that too. But this is exactly what CPython does (it optimizes PyFunction and PyCFunction but not CyFunction), so I would still argue that the benchmark is fair. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] [PEP 576/580] Reserve one type slot for Cython
On 2018-07-30 15:35, INADA Naoki wrote: As repeatedly said, PEP 580 is a very complicated protocol for just implementing a callable object. Can you be more concrete about what you find complicated? Maybe I can improve the PEP to explain it better. Also, I'm open to suggestions to make it less complicated. It is optimized for implementing a custom method object, although almost only Cython wants the custom method type. For the record, Numba also seems interested in the PEP: https://groups.google.com/a/continuum.io/d/msg/numba-users/2G6k2R92MIM/P-cFKW7xAgAJ I'm not sure about adding such a complicated protocol almost only for Cython. If CyFunction can be implemented behind PEP 576, it may be better. I recall my post https://mail.python.org/pipermail/python-dev/2018-July/154238.html explaining the main difference between PEP 576 and PEP 580. I would like to stress that PEP 580 was designed for maximum performance, both today and for future extensions (such as calling with native C types). * PEP 576 and 580 are not strictly mutually exclusive; PEP 576 may be accepted in addition to PEP 580 I don't think that this is a good idea: you will mostly end up with the disadvantages of both approaches. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Error message for wrong number of arguments
On 2018-07-30 20:22, Chris Barker wrote: is it possible for the interpreter to know when this error is generated that this is a bound method expecting a "self", rather than an arbitrary function with n parameters? That would be quite hard. The error message is generated by the underlying function. At that point, the information of how it was called (as bound method or not) is already gone. Jeroen. ___ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com