Re: [Python-Dev] PEP 580 and PEP 590 comparison.
On 2019-04-14 13:34, Mark Shannon wrote:

I'll address capability first. I don't think that comparing "capability" makes a lot of sense, since neither PEP 580 nor PEP 590 adds any new capabilities to CPython. They are meant to allow doing things faster, not to allow more things. And yes, the C call protocol can be implemented on top of the vectorcall protocol and conversely, but that doesn't mean much.

Now performance.

> Currently the PEP 590 implementation is intentionally minimal. It does
> nothing for performance.

So, we're missing some information here. What kind of performance improvements are possible with PEP 590 which are not in the reference implementation?

> The benchmark Jeroen provides is a micro-benchmark that calls the same
> functions repeatedly. This is trivial and unrealistic.

Well, it depends what you want to measure... I'm trying to measure precisely the thing that makes PEP 580 and PEP 590 different from the status quo, so in that sense those benchmarks are very relevant.

I think that the following 3 statements are objectively true:

(A) Both PEP 580 and PEP 590 add a new calling convention, which is as fast as builtin functions (and hence faster than tp_call).

(B) Both PEP 580 and PEP 590 keep roughly the same performance as the status quo for existing function/method calls.

(C) While the performance of PEP 580 and PEP 590 is roughly the same, PEP 580 is slightly faster (based on the reference implementations linked from PEP 580 and PEP 590).

Two caveats concerning (C):

- the difference may be too small to matter. Relatively, it's a few percent of the call time, but in absolute numbers it's less than 10 CPU clock cycles.

- there might be possible improvements to the reference implementation of either PEP 580 or PEP 590. I don't expect big differences though.

> To repeat an example from an earlier email, which may have been
> overlooked, this code reduces the time to create ranges and small lists
> by about 30%

That's just a special case of the general fact (A) above and of using the new calling convention for "type". It's an argument in favor of both PEP 580 and PEP 590, not for PEP 590 specifically.

Jeroen.
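For readers who want to see the kind of measurement being argued about, the following is a rough, illustrative micro-benchmark sketch using timeit. It is not the benchmark used for the PEP 580/590 numbers (those live in the reference implementations); it simply times the "create ranges and small lists" calls mentioned above, which go through the calling convention for "type".

    # Illustrative micro-benchmark sketch, not the benchmark from the PEPs.
    # It repeatedly calls "type" objects (range, list), i.e. exactly the kind
    # of call the new calling convention is meant to speed up.
    import timeit

    for stmt in ("range(10)", "list(range(10))", "list((1, 2, 3))"):
        # repeat() returns several timings; take the minimum to reduce noise.
        best = min(timeit.repeat(stmt, number=1_000_000, repeat=5))
        print(f"{stmt:20s} {best:.3f} s per 1,000,000 calls")

Comparing the output of such a script on a stock interpreter and on one of the patched branches is, in spirit, how the "30%" style figures in this thread are obtained.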
Re: [Python-Dev] PEP 590 discussion
On 2019-04-14 13:30, Mark Shannon wrote:
> PY_VECTORCALL_ARGUMENTS_OFFSET exists so that callables that make onward
> calls with an additional argument can do so efficiently. The obvious
> example is bound-methods, but classes are at least as important.
> cls(*args) -> cls.__new__(cls, *args) -> cls.__init__(self, *args)

But tp_new and tp_init take the "cls" and "self" as separate arguments, not as part of *args. So I don't see why you need PY_VECTORCALL_ARGUMENTS_OFFSET for this.

> The updated minimal implementation now uses `const` arguments. Code that
> uses args[-1] must explicitly cast away the const.
> https://github.com/markshannon/cpython/blob/vectorcall-minimal/Objects/classobject.c#L55

That's better indeed.

Jeroen.
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
On Thu, 11 Apr 2019 08:26:47 -0700 Steve Dower wrote:
> On 10Apr2019 1917, Nathaniel Smith wrote:
> > It sounds like --with-pydebug has accumulated a big grab bag of
> > unrelated features, mostly stuff that was useful at some point for
> > some CPython dev trying to debug CPython itself? It's clearly not
> > designed with end users as the primary audience, given that no-one
> > knows what it actually does and that it makes third-party extensions
> > really awkward to run. If that's right then I think Victor's plan
> > to sort through what it's actually doing makes a lot of sense,
> > especially if we can remove the ABI breaking stuff, since that causes
> > a disproportionate amount of trouble.
>
> Does it really cause a "disproportionate" amount of trouble? It's
> definitely not meant for anyone who isn't working on C code, whether in
> CPython, an extension or a host application. If you want to use
> third-party extensions and are not able to rebuild them, that's a very
> good sign that you probably shouldn't be on the debug build at all.

I can't really agree with that. There are third-party extensions that have non-trivial build requirements. The fact that you have to rebuild third-party dependencies is a strong deterrent against using pydebug builds even when they may be actually useful (for example when debugging an extension module of your own).

If you could just install mainstream binary packages (e.g. from Anaconda or PyPI) on a debug build interpreter, the pain would go away.

Regards

Antoine.
[Python-Dev] Season of Docs
Hello Python Developers,

Google is running a program called Season of Docs (https://developers.google.com/season-of-docs/) to encourage technical writers to improve the documentation of Open Source Projects.

As Python-Dev, and Python Software Foundation, do you think:

a) We should participate?
b) If yes to a), are you willing to be a mentor and identify project ideas?

If you are willing to mentor and have project ideas, please let us know and we can think about the next steps. The deadline for org application is April 23, 2019.

This discussion started here: https://discuss.python.org/t/will-python-apply-for-season-of-docs-and-allow-suborgs/

Thank you,
Senthil
Re: [Python-Dev] [Core-mentorship] Season of Docs
I don't know if Julien Palard is on this mailing list, but maybe he could be interested in this initiative.

On 04/15, Senthil Kumaran wrote:
> Hello Python Developers,
>
> Google is running a program called Season of Docs
> (https://developers.google.com/season-of-docs/) to encourage technical
> writers to improve the documentation of Open Source Projects.
>
> As Python-Dev, and Python Software Foundation, do you think:
>
> a) We should participate?
> b) If yes to a), are you willing to be a mentor and identify project ideas?
>
> If you are willing to mentor and have project ideas, please let us know
> and we can think about the next steps. The deadline for org application
> is April 23, 2019.
>
> This discussion started here:
> https://discuss.python.org/t/will-python-apply-for-season-of-docs-and-allow-suborgs/
>
> Thank you,
> Senthil

--
Stéphane Wirtel - https://wirtel.be - @matrixise
[Python-Dev] Collaboration on a set of Python snaps
Hi Python devs,

I work on the Snapcraft [0] team at Canonical. I'm looking for a Python contributor to collaborate with us on making snaps of supported releases of Python available in the Snap Store [1].

Travis CI and Canonical are looking for someone (preferably North-America based) to participate in an in-person Snapcraft Summit in downtown Montreal, Canada from 11th to 13th June. We're sponsoring a number of software vendors, device manufacturers, and people from the robotics sector to come. We'd love someone from the Python project to join us. We've published this blog post to explain the event in more detail: https://snapcraft.io/blog/snapcraft-summit-montreal

The goal would be to create snaps of the major supported releases of Python, and authoritatively publish them in the Snap Store. This would enable users of many different Linux distributions to easily obtain up-to-date supported versions of Python directly from the Python project. It also enables providers of CI systems (such as Travis) to make the latest builds of Python easily available to developers who use their services.

We've done this previously with NodeJS [2] and Ruby [3] - among others. It would be great to have Python available via this method too.

All the best,
Al.

[0] - https://snapcraft.io/
[1] - https://snapcraft.io/store
[2] - https://snapcraft.io/node
[3] - https://snapcraft.io/ruby

--
Alan Pope
Community Advocate
Canonical - Ubuntu Engineering and Services

+44 (0) 7973 620 164
alan.p...@canonical.com
http://ubuntu.com/
[Python-Dev] PEP 589 discussion (TypedDict) happening at typing-sig@
Hi everyone,

I submitted PEP 589 (TypedDict: Type Hints for Dictionaries with a Fixed Set of Keys) for discussion to typing-sig [1].

Here's an excerpt from the abstract of the PEP:

PEP 484 defines the type Dict[K, V] for uniform dictionaries, where each value has the same type, and arbitrary key values are supported. It doesn't properly support the common pattern where the type of a dictionary value depends on the string value of the key. This PEP proposes a type constructor typing.TypedDict to support the use case where a dictionary object has a specific set of string keys, each with a value of a specific type.

Jukka Lehtosalo

[1] https://mail.python.org/mailman3/lists/typing-sig.python.org/
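As a rough illustration of what the PEP is proposing, the class-based declaration looks like the sketch below. It uses mypy_extensions, where TypedDict currently lives; the PEP proposes typing.TypedDict as the eventual location, and the details are still under discussion on typing-sig.

    # Sketch of the class-based TypedDict syntax described in PEP 589.
    # TypedDict currently ships in mypy_extensions; the PEP proposes moving
    # it to the typing module.
    from mypy_extensions import TypedDict

    class Movie(TypedDict):
        name: str
        year: int

    movie: Movie = {"name": "Blade Runner", "year": 1982}

    # A type checker would reject a wrong value type or an unknown key, e.g.
    # movie = {"name": "Blade Runner", "year": "1982"}   # year must be int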
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
On 12Apr2019 1819, Nathaniel Smith wrote:

On Fri, Apr 12, 2019 at 5:05 PM Steve Dower wrote:

On 12Apr.2019 1643, Nathaniel Smith wrote:

On Thu, Apr 11, 2019 at 8:26 AM Steve Dower wrote:

The very first question I asked was whether this would let us converge the ABIs, and the answer was "no". Otherwise I'd have said go for it, despite the C runtime issues.

I don't see that in the thread... just Victor saying he isn't sure whether there might be other ABI incompatibilities lurking that he hasn't found yet. Did I miss something?

"I don't know" means we can't say the APIs are converged, which is a no. I don't think you missed anything, but just read it through a different filter.

I'm mostly interested in this because of the possibility of converging the ABIs. If you think that the C runtime thing isn't a blocker for that, then that's useful information. Though obviously we still need to figure out whether there are any other blockers :-).

[SNIP]

Do you happen to have a list of places where the C API leaks details of the underlying CRT? (I'm mostly curious because whenever I've looked my conclusion was essentially: "Well... I don't see any places that are *definitely* broken, so maybe mixing CRTs is fine? but I have zero confidence that I caught everything, so probably better to play it safe?". At least on py3 – I know the py2 C API was definitely broken if you mixed CRTs, because of the exposed FILE*.)

Not since the discussions about migrating to VS 2015, but a few off the top of my head:

* locale
* file descriptors
* stream buffers
* thread locals
* exception [handler] state (yes, there are exceptions used within the CRT, and they occasionally intentionally leak out past the C code)
* atexit handlers
* internal callbacks (mostly debug handlers, but since we're talking about debugging...)

I'm pretty sure if I did some digging I'd be able to figure out which of these come from vcruntime140.dll vs ucrtbase.dll, and then come up with some far-too-clever linker options to make some of these more consistent, but there's no complete solution other than making sure you've got a complete debug or complete release build.

For the most part, disabling optimizations in your own extension but using the non-debug ABI is sufficient, and if you're having to deal with other people's packages then maybe you don't have any choice (though I do know of people who have built debug versions of numpy before - turns out Windows developers are often just as capable as non-Windows developers when it comes to building things ;)

I'm not sure why you think I was implying otherwise? I'm sorry if you thought I was attacking your users or something. I did say that I thought most users downloading the debug builds were probably confused about what they were actually getting, but I didn't mean because they were stupid Windows users, I meant because the debug builds are so confusing that even folks on the Python core team are confused about what they're actually getting.

"Our users", please :)

In my experience, Windows developers just treat debug and release builds as part of the normal development process. The only confusion I've seen has been related to CPython's not-quite-Windows-ish approach to debug builds, and in practically every case it's been enough to explain "release CPython uses a different CRT to your debug extension, but once you align those it'll be fine".

I definitely *do not* want to force or encourage package developers to release debug ABI versions of their prebuilt packages. But at the same time I don't want to remove the benefits that debug builds currently include. Basically, I'm happy with the status quo, and the users I talk to are happy with it. So I'd rather not worry about optimising debug builds for speed or memory usage.

(It's a question of direction more than anything else, and until we get some official statement of direction then I'll keep advocating a direction based on my experiences ;) )

Cheers,
Steve
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
On Mon, 15 Apr 2019 12:50:00 +0200 Antoine Pitrou wrote:
> On Thu, 11 Apr 2019 08:26:47 -0700 Steve Dower wrote:
> > On 10Apr2019 1917, Nathaniel Smith wrote:
> > > It sounds like --with-pydebug has accumulated a big grab bag of
> > > unrelated features, mostly stuff that was useful at some point for
> > > some CPython dev trying to debug CPython itself? It's clearly not
> > > designed with end users as the primary audience, given that no-one
> > > knows what it actually does and that it makes third-party extensions
> > > really awkward to run. If that's right then I think Victor's plan
> > > to sort through what it's actually doing makes a lot of sense,
> > > especially if we can remove the ABI breaking stuff, since that causes
> > > a disproportionate amount of trouble.
> >
> > Does it really cause a "disproportionate" amount of trouble? It's
> > definitely not meant for anyone who isn't working on C code, whether in
> > CPython, an extension or a host application. If you want to use
> > third-party extensions and are not able to rebuild them, that's a very
> > good sign that you probably shouldn't be on the debug build at all.
>
> I can't really agree with that. There are third-party extensions that
> have non-trivial build requirements. The fact that you have to rebuild
> third-party dependencies is a strong deterrent against using pydebug
> builds even when they may be actually useful (for example when
> debugging an extension module of your own).

Oh, and as a datapoint, there are user requests for pydebug builds in Anaconda and conda-forge:
https://github.com/ContinuumIO/anaconda-issues/issues/80
https://github.com/conda-forge/staged-recipes/issues/1593

The problem is, while it's technically relatively easy to build and distribute a special build of Python, to make it useful implies also building a whole separate distribution of Python libraries as well. I suspect the latter is why those issues were never acted upon.

So, there's actual demand from people who would (probably) benefit from it, but who are blocked by the burden of recompiling all dependencies.

Regards

Antoine.
Re: [Python-Dev] PEP 578: Python Runtime Audit Hooks
On 28/03/2019 23.35, Steve Dower wrote:
> Hi all
>
> Time is short, but I'm hoping to get PEP 578 (formerly PEP 551) into
> Python 3.8. Here's the current text for review and comment before I
> submit to the Steering Council.
>
> The formatted text is at https://www.python.org/dev/peps/pep-0578/
> (update just pushed, so give it an hour or so, but it's fundamentally
> the same as what's there)
>
> No Discourse post, because we don't have a python-dev equivalent there
> yet, so please reply here for this one.
>
> Implementation is at https://github.com/zooba/cpython/tree/pep-578/ and
> my backport to 3.7 (https://github.com/zooba/cpython/tree/pep-578-3.7/)
> is already getting some real use (though this will not be added to 3.7,
> unless people *really* want it, so the backport is just for reference).

Hi Steve,

(memory dump before I go to bed)

Steve Grubb from Red Hat security pointed me to some interesting things [1]. For instance, there is some work on a new O_MAYEXEC flag for open(). Steve came to conclusions similar to ours, e.g. that streaming code from stdin is insecure.

I think it would also be beneficial to have auditing events for the import system, to track when sys.path or import loaders are changed.

Christian

[1] https://marc.info/?l=linux-fsdevel&m=155535414414626&w=2
Re: [Python-Dev] PEP 578: Python Runtime Audit Hooks
On 15Apr2019 1344, Christian Heimes wrote:
> Hi Steve,
>
> (memory dump before I go to bed)
>
> Steve Grubb from Red Hat security pointed me to some interesting things
> [1]. For instance, there is some work on a new O_MAYEXEC flag for open().
> Steve came to conclusions similar to ours, e.g. that streaming code from
> stdin is insecure.
>
> [1] https://marc.info/?l=linux-fsdevel&m=155535414414626&w=2

Thanks for the pointer! Using this for open_code() by default on platforms that support it might be a good opportunity in the future. But I'm glad I'm not the only one who thinks this is the right approach :)

> I think it would also be beneficial to have auditing events for the
> import system, to track when sys.path or import loaders are changed.

Already in there (kind of... the "import" events include the contents of the sys properties that are about to be used to resolve it. Since these are plain-old lists and can be easily reassigned, passing them through here allows you to add a check if you really want it, but otherwise not pay the cost of replacing the sys module with a special implementation and its attributes with special lists).

Cheers,
Steve
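To make the mechanism concrete, a minimal sketch of an audit hook using the PEP 578 names (sys.addaudithook and the "import" event) is shown below. The exact event payload layout follows the PEP draft and may change before the implementation lands; this only works on an interpreter with PEP 578 applied.

    # Minimal sketch of a PEP 578 audit hook that watches import resolution.
    # Assumes the PEP's "import" event, whose first argument is the module
    # name, followed by the sys properties used to resolve it.
    import sys

    def audit_hook(event, args):
        if event == "import":
            module = args[0]
            print(f"import audited: {module!r}", file=sys.stderr)

    sys.addaudithook(audit_hook)

    import json  # triggers the hook on a PEP 578 interpreter

A hook like this is where one could, as discussed above, add a check against a tampered sys.path without replacing the sys module itself.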
[Python-Dev] Cannot find documented API in PEP-376 (Database of Installed Python Distributions)
Hello,

I am on a PEP scouting effort to check the current status of python packaging and its historical context, mostly for learning purposes. I noted that the PEP defines some functions for pkgutil (e.g. get_distributions), but I cannot find them.

I tried to do some searching on the mailing list history, but I came up with pretty much nothing of value. It appears that the topic was last considered in 2009 (the year of the PEP). dist-info was then implemented, but I cannot find any information about the missing API, nor any additional PEP, except for a brief reference in PEP-427.

Does anyone have some context for this?

I understand it was 10 years ago, so it's mostly a curiosity. Thanks.

--
Kind regards,
Stefano Borini
Re: [Python-Dev] Cannot find documented API in PEP-376 (Database of Installed Python Distributions)
On Mon, 15 Apr 2019 at 22:35, Stefano Borini wrote:
> Hello,
>
> I am on a PEP scouting effort to check the current status of python
> packaging and its historical context, mostly for learning purposes. I
> noted that the PEP defines some functions for pkgutil (e.g.
> get_distributions), but I cannot find them.
> I tried to do some searching on the mailing list history, but I came
> up with pretty much nothing of value. It appears that the topic was
> last considered in 2009 (the year of the PEP). dist-info was then
> implemented, but I cannot find any information about the missing API,
> nor any additional PEP, except for a brief reference in PEP-427.
>
> Does anyone have some context for this?
>
> I understand it was 10 years ago, so it's mostly a curiosity. Thanks.

PEP 376 was part of a rather grand plan to re-engineer a lot of Python's packaging tools (distutils and setuptools at the time, mainly). Although the PEP was accepted, a lot of the coding never got done and ultimately the project was abandoned, and we moved over to a more incremental approach of improving what was there, rather than wholesale replacing things. So the PEP itself is something of a mixture now: some parts that are implemented, some parts that are relevant in principle but whose details never got filled in, and some parts that simply never happened.

From what I recall (I was around at the time), a lot of the discussion was on distutils-sig - did you check the archives of that list in your searching? But there was a lot of what I would describe as "heated debate" going on at that point, so it may be hard to find anything particularly informative.

Hopefully that's of some use - good luck in your investigations!

Paul
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
> The main question is if anyone ever used Py_TRACE_REFS? Does someone
> use sys.getobjects() or PYTHONDUMPREFS environment variable?

I used sys.getobjects() today to track down a memory leak in the mypyc-compiled version of mypy.

We were leaking memory badly but no sign of the leak was showing up in mypy's gc.get_objects() based profiler. Using a debug build and switching to sys.getobjects() showed that we were badly leaking int objects. A quick inspection of the values in question (large and random looking) suggested we were leaking hash values, and that quickly pointed me to https://github.com/mypyc/mypyc/pull/562.

I don't have any strong feelings about whether to keep it in the "default" debug build, though. I was using a debug build that I built myself with every debug feature that seemed potentially useful.

-sully
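For anyone curious what that workflow looks like, here is a rough sketch. It assumes a Py_TRACE_REFS build, where (as I understand it) sys.getobjects(limit) returns the most recently allocated live objects, with a limit of 0 meaning "all of them"; the census function and its name are illustrative, not part of mypy's actual profiler.

    # Sketch: count live objects by type using sys.getobjects(), which only
    # exists in Py_TRACE_REFS (--with-pydebug) builds.  Unlike
    # gc.get_objects(), it also sees objects that don't participate in the
    # cyclic GC, so a leak of plain ints stands out.
    import sys
    from collections import Counter

    def live_object_census(limit=0):
        # limit=0 means "all tracked objects"; pass a positive number to
        # sample only the most recently allocated ones.
        counts = Counter(type(obj).__name__ for obj in sys.getobjects(limit))
        return counts.most_common(20)

    for name, n in live_object_census():
        print(f"{n:10d}  {name}")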
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
On Mon, Apr 15, 2019, 15:27 Michael Sullivan wrote:
> > The main question is if anyone ever used Py_TRACE_REFS? Does someone
> > use sys.getobjects() or PYTHONDUMPREFS environment variable?
>
> I used sys.getobjects() today to track down a memory leak in the
> mypyc-compiled version of mypy.
>
> We were leaking memory badly but no sign of the leak was showing up in
> mypy's gc.get_objects() based profiler. Using a debug build and switching
> to sys.getobjects() showed that we were badly leaking int objects. A quick
> inspection of the values in question (large and random looking) suggested
> we were leaking hash values, and that quickly pointed me to
> https://github.com/mypyc/mypyc/pull/562.
>
> I don't have any strong feelings about whether to keep it in the "default"
> debug build, though. I was using a debug build that I built myself with
> every debug feature that seemed potentially useful.

This is mostly to satisfy my curiosity, so feel free to ignore: did you try using address sanitizer or valgrind?

-n
[Python-Dev] PEP 591 discussion (final qualifier) happening at typing-sig@
I've submitted PEP 591 (Adding a final qualifier to typing) for discussion to typing-sig [1].

Here's the abstract:

This PEP proposes a "final" qualifier to be added to the ``typing`` module---in the form of a ``final`` decorator and a ``Final`` type annotation---to serve three related purposes:

* Declaring that a method should not be overridden
* Declaring that a class should not be subclassed
* Declaring that a variable or attribute should not be reassigned

Full text at https://www.python.org/dev/peps/pep-0591/

-sully

[1] https://mail.python.org/mailman3/lists/typing-sig.python.org/
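A quick sketch of the three uses, paraphrasing the style of examples in the PEP. The names are the proposed typing.final and typing.Final; at the time of this thread they would have to come from typing_extensions, and the details may change during discussion.

    # Sketch of the three PEP 591 uses; only a type checker enforces these.
    from typing import final, Final  # proposed location per the PEP

    ID: Final[int] = 1            # reassigning ID elsewhere is flagged

    class Connection:
        @final
        def close(self) -> None:  # overriding close() in a subclass is flagged
            ...

    @final
    class FinalConnection(Connection):  # subclassing FinalConnection is flagged
        ...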
Re: [Python-Dev] PEP 589 discussion (TypedDict) happening at typing-sig@
Hi Jukka,

Thanks for submitting this PEP, I think it will be a net plus for the python language. I have been using TypedDict from the mypy_extensions module and it's been a great help.

I found that the one thing that may be less intuitive in its design is the totality property. The fact that you need to use inheritance to compose TypedDicts that contain both required and optional keys creates situations that may be a little verbose for some use cases. Perhaps an "optional" property taking a list of keys that type checkers would recognize as (no surprise) optional could be an alternative design with some merit.

Best regards,
Philippe

On Mon, Apr 15, 2019 at 12:44 PM Jukka Lehtosalo wrote:
> Hi everyone,
>
> I submitted PEP 589 (TypedDict: Type Hints for Dictionaries with a Fixed
> Set of Keys) for discussion to typing-sig [1].
>
> Here's an excerpt from the abstract of the PEP:
>
> PEP 484 defines the type Dict[K, V] for uniform dictionaries, where each
> value has the same type, and arbitrary key values are supported. It doesn't
> properly support the common pattern where the type of a dictionary value
> depends on the string value of the key. This PEP proposes a type
> constructor typing.TypedDict to support the use case where a dictionary
> object has a specific set of string keys, each with a value of a specific
> type.
>
> Jukka Lehtosalo
>
> [1] https://mail.python.org/mailman3/lists/typing-sig.python.org/
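To make the totality point concrete, the inheritance-based composition being referred to looks roughly like the sketch below (class and key names are made up for illustration; TypedDict is imported from mypy_extensions, where it currently lives).

    # Sketch of mixing required and optional keys via inheritance + total=False,
    # the pattern discussed above.
    from mypy_extensions import TypedDict

    class _MovieRequired(TypedDict):
        # total defaults to True: these keys must be present.
        name: str

    class Movie(_MovieRequired, total=False):
        # Keys declared in a total=False body are optional.
        year: int

    m1: Movie = {"name": "Alien"}                  # ok: year may be omitted
    m2: Movie = {"name": "Alien", "year": 1979}    # ok

An "optional=[...]" style parameter, as suggested above, would let the same thing be written in a single class body instead of two.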
Re: [Python-Dev] PEP 591 discussion (final qualifier) happening at typing-sig@
On Mon, Apr 15, 2019 at 5:00 PM Michael Sullivan wrote:
> I've submitted PEP 591 (Adding a final qualifier to typing) for discussion
> to typing-sig [1].

I'm not on typing-sig [1] so I'm replying here.

> Here's the abstract:
> This PEP proposes a "final" qualifier to be added to the ``typing``
> module---in the form of a ``final`` decorator and a ``Final`` type
> annotation---to serve three related purposes:
>
> * Declaring that a method should not be overridden
> * Declaring that a class should not be subclassed
> * Declaring that a variable or attribute should not be reassigned

I've been meaning to start blocking subclassing at runtime (e.g. like [2]), so being able to express that to the typechecker seems like a nice addition.

I'm assuming though that the '@final' decorator doesn't have any runtime effect, so I'd have to say it twice?

    @typing.final
    class MyClass(metaclass=othermod.Final):
        ...

Or on 3.6+ with __init_subclass__, it's easy to define a @final decorator that works at runtime, but I guess this would have to be a different decorator?

    @typing.final
    @alsoruntime.final
    class MyClass:
        ...

This seems kinda awkward. Have you considered giving it a runtime effect, or providing some way for users to combine these two things together on their own?

-n

[1] https://github.com/willingc/pep-communication/issues/1
[2] https://stackoverflow.com/a/3949004/1925449

--
Nathaniel J. Smith -- https://vorpus.org
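For reference, the __init_subclass__-based runtime decorator mentioned above can be sketched in a few lines. This is purely illustrative (the decorator name is made up), and it is separate from PEP 591's typing.final, which has no runtime effect of its own.

    # Rough sketch of a runtime-enforced "final class" decorator using
    # __init_subclass__ (Python 3.6+).  Not part of the PEP.
    def runtime_final(cls):
        def _refuse_subclass(subcls, **kwargs):
            raise TypeError(f"{cls.__name__} is final and cannot be subclassed")
        # __init_subclass__ must be a classmethod when assigned after the fact;
        # it runs whenever a subclass of cls is created.
        cls.__init_subclass__ = classmethod(_refuse_subclass)
        return cls

    @runtime_final
    class MyClass:
        ...

    # class Broken(MyClass):   # raises TypeError at class-creation time
    #     ...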
Re: [Python-Dev] No longer enable Py_TRACE_REFS by default in debug build
On Mon, Apr 15, 2019 at 4:06 PM Nathaniel Smith wrote:
> On Mon, Apr 15, 2019, 15:27 Michael Sullivan wrote:
> > > The main question is if anyone ever used Py_TRACE_REFS? Does someone
> > > use sys.getobjects() or PYTHONDUMPREFS environment variable?
> >
> > I used sys.getobjects() today to track down a memory leak in the
> > mypyc-compiled version of mypy.
> >
> > We were leaking memory badly but no sign of the leak was showing up in
> > mypy's gc.get_objects() based profiler. Using a debug build and switching
> > to sys.getobjects() showed that we were badly leaking int objects. A quick
> > inspection of the values in question (large and random looking) suggested
> > we were leaking hash values, and that quickly pointed me to
> > https://github.com/mypyc/mypyc/pull/562.
> >
> > I don't have any strong feelings about whether to keep it in the
> > "default" debug build, though. I was using a debug build that I built
> > myself with every debug feature that seemed potentially useful.
>
> This is mostly to satisfy my curiosity, so feel free to ignore: did you
> try using address sanitizer or valgrind?

I didn't, mostly because I assumed that valgrind wouldn't play well with cpython. (I've never used address sanitizer.)

I was curious, so I went back and tried it out. It turned out not to need that much fiddling to get to work. It slows things down a *lot* and produced 17,000 "loss records", though, so maybe I don't have it working right. At a glance the records did not shed any light.

I'd definitely believe that valgrind is up to the task of debugging this, but my initial take with it shed much less light than my sys.getobjects() approach. (Though note that my sys.getobjects() approach was slotting it into an existing python memory profiler we had hacked up, so...)

-sully