[Python-Dev] clarification on PEP 3124 status
Could somebody please clarify the status of PEP 3124? At http://ftp.python.org/dev/peps/ , it is listed as "under consideration", but http://ftp.python.org/dev/peps/pep-3124/ says it has been deferred.

I was reading through the discussion on the python-3000 mailing list archive, and at one point somebody asked for other examples where generic functions are used in the community. The numpy project has a basic generic function mechanism for numpy's ufuncs (regular functions that operate on arrays), where subclasses of numpy.ndarray can define __array_prepare__ (this method will be added in numpy-1.4) and __array_wrap__ methods, which are sort of analogous to @before and @after in PEP 3124 (ndarray subclasses define an __array_priority__ attribute to decide how to dispatch). The numpy approach is not a general solution and is not as flexible as what is described in the PEP, but it can be used by functions that operate on subclasses that implement matrices, masked arrays, or arrays with physical units.

I would be very interested in seeing a framework for generic functions in the numpy standard library. I think it would be simpler and more flexible than what we currently have. Is there still interest/motivation for supporting generic functions in the standard library?

Darren
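
P.S. For anyone on the list who is unfamiliar with the numpy mechanism mentioned above, here is a minimal sketch of how an ndarray subclass hooks into ufuncs today. The UnitArray class and its units attribute are invented for illustration; only __array_priority__, __array_finalize__, and __array_wrap__ are real numpy hooks::

    import numpy as np

    class UnitArray(np.ndarray):

        __array_priority__ = 10.0   # higher priority wins when types are mixed

        def __new__(cls, data, units=None):
            obj = np.asarray(data).view(cls)
            obj.units = units
            return obj

        def __array_finalize__(self, obj):
            self.units = getattr(obj, 'units', None)

        def __array_wrap__(self, out_arr, context=None):
            # context is (ufunc, args, domain); a real implementation would
            # inspect it to decide the units of the result
            result = out_arr.view(type(self))
            result.units = self.units
            return result

    a = UnitArray([1, 2, 3], units='m/s')
    print(type(np.multiply(a, np.array([2, 2, 2]))))   # -> UnitArray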
Re: [Python-Dev] clarification on PEP 3124 status
On Sat, Sep 12, 2009 at 9:57 AM, Darren Dale wrote:
> I would be very interested in seeing a framework for generic functions in the numpy standard library.

Sorry, I meant to say "python standard library".
Re: [Python-Dev] clarification on PEP 3124 status
Hi Martin,

On Sun, Sep 13, 2009 at 2:29 AM, "Martin v. Löwis" wrote:
>> Could somebody please clarify the status of PEP 3124? At http://ftp.python.org/dev/peps/ , it is listed as "under consideration", but http://ftp.python.org/dev/peps/pep-3124/ says it has been deferred.
>
> This isn't really contradictory. "under consideration" means "in progress": it has neither been accepted nor rejected.
>
> If Phillip doesn't respond here, you may want to ask him directly. My impression is that it is deferred because nobody is pursuing it actively (including Phillip Eby). It's common for a PEP to be in that state for several years; "deferred" then is an indication that readers shouldn't expect a resolution in the short term.

That is why I asked. I wondered whether it is being actively considered and pursued, or whether it had been deferred or, worse, abandoned.

> That said: my personal feeling is that this PEP is way too large, and should be broken into separate pieces of functionality that can be considered independently. There is a lot of stuff in it that isn't strictly necessary to provide the feature listed in the rationale.

It would be nice to have a suitable foundation upon which more elaborate third party dispatchers could build. The potential generic functions have in a project like numpy is pretty exciting.

Darren
Re: [Python-Dev] clarification on PEP 3124 status
Hi Paul,

On Sun, Sep 13, 2009 at 10:54 AM, Paul Moore wrote:
> 2009/9/13 Darren Dale :
>>> If Phillip doesn't respond here, you may want to ask him directly. My impression is that it is deferred because nobody is pursuing it actively (including Phillip Eby). It's common for a PEP to be in that state for several years; "deferred" then is an indication that readers shouldn't expect a resolution in the short term.
>>
>> That is why I asked. I wondered whether it is being actively considered and pursued, or whether it had been deferred or, worse, abandoned.
>>
>>> That said: my personal feeling is that this PEP is way too large, and should be broken into separate pieces of functionality that can be considered independently. There is a lot of stuff in it that isn't strictly necessary to provide the feature listed in the rationale.
>>
>> It would be nice to have a suitable foundation upon which more elaborate third party dispatchers could build. The potential generic functions have in a project like numpy is pretty exciting.
>
> You may also be interested in http://bugs.python.org/issue5135 which is a (much) simpler attempt to introduce generic functions into the standard library.

Thanks for the pointer. I actually read through the discussion there yesterday. I don't think simplegeneric would be especially useful to numpy. For example, multiplying a numpy.array([1,2,3]) with a quantities.Quantity([1,2,3], 'm/s') should produce a new Quantity regardless of the order in which they are provided to numpy.multiply(). Numpy can handle this particular example now, but the mechanisms are a bit convoluted.

> Generally, these things get stalled because the core developers don't have sufficient interest in the topic to do anything directly, and the arguments in favour aren't compelling enough to make a difference. Maybe the benefits numpy would get would help the case.

I am a relatively new contributor to the numpy project, contributing bug fixes and features (most of which have been related to -- or could benefit from -- generic functions) to better support subclasses like Quantity. Numpy has different kinds of arrays (ndarrays, array scalars, masked arrays, matrices) and supports many different data types (int8, float32, complex64, etc). The ability to dispatch based on the object type (or combinations thereof), or on combinations of data types, or perhaps on the units of quantities, seems like a good example of where predicate dispatch would be useful.

I am primarily trying to get up to speed to help with the effort to transition numpy to python-3. Perhaps generic functions could help make the numpy source code more accessible and maintainable, so that maybe someday there would even be interest in including numpy or some subset thereof in the standard library. Anyway, it is helpful to me to see where generic functions stand and how they might develop in the standard library as we work on numpy support for python 3.

Regards,
Darren
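
P.S. To illustrate why single dispatch (like simplegeneric) falls short for the multiply example, here is a toy registry keyed on the types of *both* operands. This is only an illustration, not PEP 3124 and not a proposal; all of the names are invented::

    _registry = {}

    def implemented_for(left_type, right_type):
        def decorator(func):
            _registry[(left_type, right_type)] = func
            return func
        return decorator

    def multiply(a, b):
        # A real dispatcher would consider subclasses (walk the MRO) or use
        # predicates; exact-type lookup keeps the sketch short.
        try:
            func = _registry[(type(a), type(b))]
        except KeyError:
            raise TypeError("no implementation for (%s, %s)"
                            % (type(a).__name__, type(b).__name__))
        return func(a, b)

    class Quantity(list):
        """Stand-in for quantities.Quantity."""

    @implemented_for(Quantity, list)
    @implemented_for(list, Quantity)
    @implemented_for(Quantity, Quantity)
    def _multiply_with_quantity(a, b):
        # either operand order produces a Quantity
        return Quantity(x * y for x, y in zip(a, b))

    print(type(multiply([1, 2, 3], Quantity([4, 5, 6]))))   # -> Quantity
    print(type(multiply(Quantity([1, 2, 3]), [4, 5, 6])))   # -> Quantity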
[Python-Dev] nonstandard behavior of reflected functions
According to http://docs.python.org/reference/datamodel.html , the reflected operand methods like __radd__ "are only called if the left operand does not support the corresponding operation and the operands are of different types. [3] For instance, to evaluate the expression x - y, where y is an instance of a class that has an __rsub__() method, y.__rsub__(x) is called if x.__sub__(y) returns NotImplemented."

Consider the following simple example:

==
class Quantity(object):

    def __add__(self, other):
        return '__add__ called'

    def __radd__(self, other):
        return '__radd__ called'


class UnitQuantity(Quantity):

    def __add__(self, other):
        return '__add__ called'

    def __radd__(self, other):
        return '__radd__ called'


print 'Quantity()+Quantity()', Quantity()+Quantity()
print 'UnitQuantity()+UnitQuantity()', UnitQuantity()+UnitQuantity()
print 'UnitQuantity()+Quantity()', UnitQuantity()+Quantity()
print 'Quantity()+UnitQuantity()', Quantity()+UnitQuantity()
==

The output should indicate that __add__ was called in all four trials, but the last trial calls __radd__. Interestingly, if I comment out the definition of __radd__ in UnitQuantity, then the fourth trial calls __add__ like it should.

I think this may be an important bug. I'm running Python 2.6.4rc1 (r264rc1:75270, Oct 13 2009, 17:02:06) on Ubuntu Karmic. Is it a known issue, or am I misreading the documentation?

Thanks,
Darren
Re: [Python-Dev] nonstandard behavior of reflected functions
On Sun, Oct 18, 2009 at 10:50 AM, Darren Dale wrote:
> According to http://docs.python.org/reference/datamodel.html , the reflected operand methods like __radd__ "are only called if the left operand does not support the corresponding operation and the operands are of different types. [3] For instance, to evaluate the expression x - y, where y is an instance of a class that has an __rsub__() method, y.__rsub__(x) is called if x.__sub__(y) returns NotImplemented."
>
> [example snipped]
>
> The output should indicate that __add__ was called in all four trials, but the last trial calls __radd__. Interestingly, if I comment out the definition of __radd__ in UnitQuantity, then the fourth trial calls __add__ like it should.
>
> I think this may be an important bug. I'm running Python 2.6.4rc1 (r264rc1:75270, Oct 13 2009, 17:02:06) on Ubuntu Karmic. Is it a known issue, or am I misreading the documentation?

I'm sorry, I should have read further down the page in the documentation: "Note: If the right operand’s type is a subclass of the left operand’s type and that subclass provides the reflected method for the operation, this method will be called before the left operand’s non-reflected method. This behavior allows subclasses to override their ancestors’ operations."
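
Returning a distinct string from each class makes the documented rule visible. A minimal illustration (plain Python semantics, nothing project-specific)::

    class Quantity(object):
        def __add__(self, other):
            return 'Quantity.__add__'
        def __radd__(self, other):
            return 'Quantity.__radd__'

    class UnitQuantity(Quantity):
        def __add__(self, other):
            return 'UnitQuantity.__add__'
        def __radd__(self, other):
            return 'UnitQuantity.__radd__'

    print(Quantity() + UnitQuantity())    # UnitQuantity.__radd__ -- the subclass hook runs first
    print(UnitQuantity() + Quantity())    # UnitQuantity.__add__
    print(Quantity() + Quantity())        # Quantity.__add__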
Re: [Python-Dev] Proposing PEP 386 for addition
On Thu, Dec 10, 2009 at 7:24 AM, sstein...@gmail.com wrote:
>
> On Dec 10, 2009, at 3:44 AM, Malthe Borch wrote:
>
>> On 12/8/09 6:16 PM, Tarek Ziadé wrote:
>>> I believe that the current situation is as close to consensus as we will get on distutils-sig, and in the interests of avoiding months of further discussion which won't take things any further, I propose to allow final comments from python-dev and then look for a final decision.
>>
>> Great work, Tarek. I think you've managed to establish a good body of knowledge on this and the proposal seems sound.
>>
>> That said, I think the terms ``LooseVersion`` and ``StrictVersion`` are less than optimal. Really, what's meant is ``LexicalVersion`` and ``ChronologicalVersion`` (or ``NumberedVersion``). It's not about strictness or looseness.
>
> I agree about the impreciseness of these terms. I'm not sure what the correct terminology is...

Those aren't new proposals, though; they already exist in distutils.

Darren
Re: [Python-Dev] Proposing PEP 386 for addition
On Thu, Dec 10, 2009 at 7:43 AM, Malthe Borch wrote:
> 2009/12/10 Darren Dale :
>> Those aren't new proposals, though; they already exist in distutils.
>
> I see. Thanks for clarifying -- maybe the PEP should better explain this.

It is already pretty clear: "Distutils currently provides a StrictVersion and a LooseVersion class that can be used to manage versions."

Darren
Re: [Python-Dev] Proposing PEP 386 for addition
On Thu, Dec 10, 2009 at 8:54 AM, Antoine Pitrou wrote:
> Tarek Ziadé writes:
>>
>> Do you have a better suggestion ? I was thinking about StandardVersion but "Standard" doesn't really express what we want to achieve here I think,
>
> I think StandardVersion is fine.

I prefer StandardVersion as well.

Rational (according to websters.com):

1. agreeable to reason; reasonable; sensible: a rational plan for economic development.
2. having or exercising reason, sound judgment, or good sense: a calm and rational negotiator.

Standard (according to websters.com):

1. something considered by an authority or by general consent as a basis of comparison; an approved model.
2. an object that is regarded as the usual or most common size or form of its kind
3. a rule or principle that is used as a basis for judgment

Darren
Re: [Python-Dev] [Distutils] At least one package management tool for 2.7
On Wed, Mar 24, 2010 at 6:26 AM, Tarek Ziadé wrote:
> The open question is: do we want to include a full installer that takes care of installing / removing dependencies as well ?
>
> I think not. Pip already provides this feature on the top of distutils (and distutils2 later I guess) and is not hard to install on the top of Python.

Is pip able to determine and install dependencies recursively, like easy_install does? Or is it up to the requested package to specify its dependencies (and its dependencies' dependencies) in a pip requirements file that is distributed separately?

Darren
Re: [Python-Dev] [Distutils] At least one package management tool for 2.7
On Wed, Mar 24, 2010 at 1:19 PM, Ian Bicking wrote:
> On Wed, Mar 24, 2010 at 7:27 AM, Olemis Lang wrote:
>> My experience is that only `install_requires` is needed (unless you want to create app bundles AFAICR), but in practice I've noticed that *some* easy_installable packages are not pip-able (though I had no time to figure out why :-/ )
>
> Usually this is because Setuptools is poking at objects to do its work, while pip tries to work mostly with subprocesses. Though to complicate things a bit, pip makes sure the Setuptools monkeypatches to distutils are applied, so that it's always as though the setup.py says "from setuptools import setup". easy_install *also* does this.
>
> But then easy_install starts calling methods and whatnot, while pip just does:
>
>     setup.py install --single-version-externally-managed --no-deps --record some_tmp_file
>
> The --no-deps keeps Setuptools from resolving dependencies

Seeking clarification: how can pip recursively install dependencies *and* keep Setuptools from resolving dependencies?

Darren
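
P.S. Is my mental model of the above roughly right? Something like the following sketch, which is certainly not pip's actual code; resolve() is just a placeholder for a real dependency resolver::

    import subprocess

    def resolve(requirements):
        # Placeholder: a real resolver would read each requirement's metadata,
        # recurse into its dependencies, and return source directories in
        # install order.
        return []

    def install_all(requirements):
        # The installer walks the dependency graph itself, then installs each
        # distribution with Setuptools' own resolver disabled via --no-deps.
        for source_dir in resolve(requirements):
            subprocess.check_call(
                ['python', 'setup.py', 'install',
                 '--single-version-externally-managed',
                 '--no-deps', '--record', 'record.txt'],
                cwd=source_dir)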
[Python-Dev] question/comment about documentation of relative imports
I have a couple questions/comments about the use of PEP 328-style relative imports. For example, the faq at http://docs.python.org/py3k/faq/programming.html#what-are-the-best-practices-for-using-import-in-a-module reads:

"Never use relative package imports. If you’re writing code that’s in the package.sub.m1 module and want to import package.sub.m2, do not just write from . import m2, even though it’s legal. Write from package.sub import m2 instead. See PEP 328 for details."

There is no explanation to support the claim that relative imports should "never" be used. It seems to me that someone read the following in PEP 328::

    from .moduleY import spam
    from .moduleY import spam as ham
    from . import moduleY
    from ..subpackage1 import moduleY
    from ..subpackage2.moduleZ import eggs
    from ..moduleA import foo
    from ...package import bar
    from ...sys import path

    Note that while that last case is legal, it is certainly discouraged
    ("insane" was the word Guido used).

... and interpreted it to mean that relative imports are in general discouraged. I interpreted it to mean that relative imports should not be used to import from python's standard library. There are cases where it is necessary to use relative imports, like a package that is included as a subpackage of more than one other project (when it is not acceptable to add an external dependency, for example due to version/compatibility issues); a sketch of that layout appears at the end of this message.

There is some additional context on relative imports in the programming faq for python-2.7 at http://docs.python.org/faq/programming.html#what-are-the-best-practices-for-using-import-in-a-module :

"Never use relative package imports. If you’re writing code that’s in the package.sub.m1 module and want to import package.sub.m2, do not just write import m2, even though it’s legal. Write from package.sub import m2 instead. Relative imports can lead to a module being initialized twice, leading to confusing bugs. See PEP 328 for details."

Is there some documentation explaining why the module may be initialized twice? I don't see it in PEP 328. Is this also the case for python-3, or does it only apply to the old-style (pre-PEP 328) relative imports in python-2?

If relative imports are truly so strongly discouraged, then perhaps warnings should also be included in places like http://docs.python.org/library/functions.html#__import__ , and especially http://docs.python.org/tutorial/modules.html#intra-package-references and http://www.python.org/dev/peps/pep-0328/ (which, if I have misinterpreted it, is ambiguously written, though I doubt this is the case). There is also this warning against relative imports in PEP 8:

    - Relative imports for intra-package imports are highly discouraged.
      Always use the absolute package path for all imports. Even now that
      PEP 328 [7] is fully implemented in Python 2.5, its style of explicit
      relative imports is actively discouraged; absolute imports are more
      portable and usually more readable.

... but one could argue, as I just have, that relative imports are more portable, not less. In a sense, the statement "explicit relative imports is actively discouraged" is objectively false. They are passively discouraged. If they were actively discouraged, perhaps performing a relative import would raise a warning, or maybe distutils would raise a warning at install time, or maybe an additional import would be required to enable them.

Up until now, I was not aware that use of PEP 328 relative imports might be discouraged. I'm still unclear as to why they might be discouraged.

I recently helped convert a popular package to use PEP 328 relative imports. Would the python devs consider this a mistake?

Thanks,
Darren
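
P.S. Here is the kind of layout I have in mind for the subpackage case mentioned above; all of the project and module names are made up for illustration::

    # foo is maintained elsewhere as a top-level package, but is also shipped
    # as a subpackage inside two unrelated projects:
    #
    #   projectA/_vendored/foo/__init__.py
    #   projectA/_vendored/foo/m1.py
    #   projectA/_vendored/foo/m2.py
    #
    #   projectB/bundled/foo/...   (the same files)
    #
    # Inside foo/m1.py, explicit relative imports keep foo relocatable no
    # matter which project it is embedded in:
    from . import m2
    from .m2 import helper   # "helper" is a hypothetical name

    # The absolute spelling would tie foo to a single host project:
    # from projectA._vendored.foo import m2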
Re: [Python-Dev] question/comment about documentation of relative imports
On Tue, Oct 5, 2010 at 12:43 PM, Antoine Pitrou wrote:
> On Tue, 05 Oct 2010 17:18:18 +0100, Michael Foord wrote:
>> >
>> > Generally I'm +0 on relative imports as a whole.
>>
>> As the OP pointed out, for code that may be *included* in other projects there is no other choice. This is often useful for packages shared between one or two projects that nonetheless don't warrant separate distribution.
>
> You can put several packages in a single distribution.

That's not the point, though. Due to compatibility issues, maybe I don't want to expose the code at the top level. Maybe the foo package is distributed elsewhere as a top-level package, but I need to use an older version due to compatibility problems. I certainly don't want to risk overwriting a pre-existing installation of foo with my required version of foo. This is not hypothetical: we once had exactly this problem when we distributed an old version of enthought.traits with matplotlib (even though we checked for pre-existing installations, crufty build/ directories containing the out-of-date traits package were overwriting existing installations).

Darren
Re: [Python-Dev] question/comment about documentation of relative imports
On Tue, Oct 5, 2010 at 1:45 PM, Antoine Pitrou wrote:
> On Tuesday, 5 October 2010 at 13:28 -0400, Darren Dale wrote:
>> >> As the OP pointed out, for code that may be *included* in other projects there is no other choice. This is often useful for packages shared between one or two projects that nonetheless don't warrant separate distribution.
>> >
>> > You can put several packages in a single distribution.
>>
>> That's not the point, though. Due to compatibility issues, maybe I don't want to expose the code at the top level. Maybe the foo package is distributed elsewhere as a top-level package, but I need to use an older version due to compatibility problems. I certainly don't want to risk overwriting a pre-existing installation of foo with my required version of foo. This is not hypothetical: we once had exactly this problem when we distributed an old version of enthought.traits with matplotlib
>
> That use case requires that the third-party package, not your package, use relative imports. I don't think you can require other projects to follow your coding style recommendations (unless of course you maintain both).

I'm not talking about requiring other projects to follow my coding style.

> I'm not sure I understand the issue.

The issue is implementing a PEP with nice support for relative imports, and then documenting that it should never be used.

Darren
Re: [Python-Dev] question/comment about documentation of relative imports
On Tue, Oct 5, 2010 at 3:37 PM, Terry Reedy wrote:
> On 10/5/2010 2:21 PM, Guido van Rossum wrote:
>>
>> On Tue, Oct 5, 2010 at 11:17 AM, Darren Dale wrote:
>>>
>>> The issue is implementing a PEP with nice support for relative imports, and then documenting that it should never be used.
>>
>> Isn't this mostly historical? Until the new relative-import syntax was implemented there were various problems with relative imports. The short-term solution was to recommend not using them. The long-term solution was to implement an unambiguous syntax. Now it is time to withdraw the anti-recommendation. Of course, without going overboard -- I still find them an acquired taste; but they have their place.
>
> Darren, if you have not yet done so, open a tracker that quotes the above and gives your recommended changes at which locations.

Done: http://bugs.python.org/issue10031

Darren
[Python-Dev] styleguide inconsistency
I was recently searching for some guidance on how to name packages and modules, and discovered an inconsistency in the style guides published at www.python.org.

http://www.python.org/doc/essays/styleguide.html says "Module names can be either MixedCase or lowercase." That page also refers to PEP 8 at http://www.python.org/dev/peps/pep-0008/, which says "Modules should have short, all-lowercase names. ... Python packages should also have short, all-lowercase names ...".

Some discussion on dev.lang.python has so far turned up the following points of view:

1) There isn't technically a contradiction, because "can be" is not the same as "should be". However, since this is a style guide and not a syntax guide, I still think the documents are contradictory.

2) There isn't any confusion, because the styleguide refers to the PEPs, so they have priority. However, styleguide.html does not explain that the PEPs are more up-to-date. We shouldn't expect someone to go to the PEPs after finding an answer to their question in the styleguide.

Perhaps one of these documents could be revised to make the situation more clear?

Thanks,
Darren Dale
[Python-Dev] improvement to declaring abstract properties
I suggested at python-ideas a way that the declaration of abstract properties could be improved to support the decorator syntax: http://mail.python.org/pipermail/python-ideas/2011-March/009411.html . A relatively small change to the property builtin would allow properties to identify themselves as abstract when they are passed abstract methods (the same way that function objects identify themselves as abstract when decorated with @abstractmethod). As a result, @abstractproperty would no longer be needed.

I submitted a patch at http://bugs.python.org/issue11610. It includes the changes to the property builtin, documentation, and unit tests. Unfortunately, I have not been able to build python-3.3 from a mercurial checkout on either Ubuntu 11.04 or OS X 10.6.6 (for reasons unrelated to the patch), and so I have not been able to test the patch.

Darren
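
P.S. To make the intent concrete, here is a minimal pure-python sketch of the behavior the patch aims for. This is only an illustration (the patch itself modifies the builtin); the class name is invented::

    from abc import ABCMeta, abstractmethod

    class AbstractableProperty(property):
        # Report the property as abstract whenever any of the methods it
        # wraps was marked with @abstractmethod.
        @property
        def __isabstractmethod__(self):
            return any(getattr(func, '__isabstractmethod__', False)
                       for func in (self.fget, self.fset, self.fdel))

    class AbstractFoo(metaclass=ABCMeta):
        @AbstractableProperty
        @abstractmethod
        def bar(self):
            pass

    class ConcreteFoo(AbstractFoo):
        @property
        def bar(self):
            return 1

    try:
        AbstractFoo()
    except TypeError as exc:
        print(exc)              # can't instantiate: bar's getter is abstract
    print(ConcreteFoo().bar)    # 1 -- the override made bar concrete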
Re: [Python-Dev] improvement to declaring abstract properties
On Sat, Mar 19, 2011 at 3:06 PM, Darren Dale wrote:
> I suggested at python-ideas a way that the declaration of abstract properties could be improved to support the decorator syntax: http://mail.python.org/pipermail/python-ideas/2011-March/009411.html . A relatively small change to the property builtin would allow properties to identify themselves as abstract when they are passed abstract methods (the same way that function objects identify themselves as abstract when decorated with @abstractmethod). As a result, @abstractproperty would no longer be needed.
>
> I submitted a patch at http://bugs.python.org/issue11610. It includes the changes to the property builtin, documentation, and unit tests.

This patch has been improved so that it only touches abc.abstractproperty. "make test" yields the same results with and without the patch. As a result of the review (http://bugs.python.org/review/11610/show), the documentation was improved to make it clear that the changes are backward compatible.

The reviewer seemed satisfied and provided some encouraging feedback, but will not be available to guide the patch to submission, so I am soliciting another core committer to continue the review and hopefully apply the changes for python-3.3.

Thanks,
Darren
[Python-Dev] Can we improve support for abstract base classes with desciptors
I would like to try to address some shortfalls with the way python deals with abstract base classes containing descriptors. I originally was just concerned with improving support for defining abstract properties with the decorator syntax and converting between abstract and concrete properties, but recently realized that the problem extends to descriptors in general.

ABCs
----

First, a bit of background may be in order. An abstract base class is defined by specifying its metaclass as ABCMeta (or a subclass thereof)::

    class MyABC(metaclass=ABCMeta):
        @abstractmethod
        def foo(self):
            pass

When trying to instantiate MyABC or any of its subclasses, ABCMeta inspects the current class namespace for items tagged with __isabstractmethod__=True::

    class ABCMeta(type):
        #[...]
        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            # Compute set of abstract method names
            abstracts = {name
                         for name, value in namespace.items()
                         if getattr(value, "__isabstractmethod__", False)}

ABCMeta then checks if any of the base classes define any items tagged with __isabstractmethod__ and whether they remain abstract in the current class namespace::

            for base in bases:
                for name in getattr(base, "__abstractmethods__", set()):
                    value = getattr(cls, name, None)
                    if getattr(value, "__isabstractmethod__", False):
                        abstracts.add(name)
            cls.__abstractmethods__ = frozenset(abstracts)

In Objects/typeobject.c, __abstractmethods__ is actually a descriptor, and setting it gives the type a chance to set an internal flag specifying if it has any abstract methods defined. When object_new is called in typeobject.c, the flag is checked and an error is raised if any abstract methods were identified.

Issues with ABCs and descriptors
--------------------------------

In order for this scheme to work, ABCMeta needs to identify all of the abstract methods, but there are some limitations when we consider descriptors. For example, Python's property is a composite object, whose behavior is defined by the getter, setter, and deleter methods with which it is composed. Since there is already an @abstractmethod decorator, I would have expected that defining abstract properties would be intuitive::

    class MyABC(metaclass=ABCMeta):

        @abstractmethod
        def _get_foo(self):
            pass

        @abstractmethod
        def _set_foo(self, val):
            pass

        foo = property(_get_foo, _set_foo)

        @property
        @abstractmethod
        def bar(self):
            pass

        @bar.setter
        @abstractmethod
        def bar(self, val):
            pass

Ideally, one would want the flexibility of defining a concrete getter and an abstract setter, for example. However, ABCMeta does not inspect the descriptors of a class to see if they contain any abstract methods. It only inspects the descriptor itself for a True __isabstractmethod__ attribute. This places the burden on every descriptor implementation to provide its own support for ABC compatibility. For example, support for abstract properties was attempted by adding abstractproperty to the abc module. abstractproperty subclasses the property builtin (as opposed to the relationship between every other abstract and concrete class in the python language). Here is the definition of abstractproperty, in its entirety (modulo docstrings)::

    class abstractproperty(property):
        __isabstractmethod__ = True

A number of problems manifest with this approach, and I think they all can be traced to the fact that the abstractedness of a descriptor is currently not dependent upon the abstractedness of the methods with which it is composed.

The documentation for abstractproperty doesn't suggest using @abstractmethod::

    class C(metaclass=ABCMeta):
        def getx(self): ...
        def setx(self, value): ...
        x = abstractproperty(getx, setx)

which leads to:

Issue #1: What is abstract about C.x? How does a subclass of C know whether it needs to override the getter or setter?

Issue #2: The decorator syntax cannot be used to convert an abstract property into a concrete one. (This relates to Issue #1: how would a descriptor even know when such a conversion would be appropriate?) Running the following code::

    from abc import ABCMeta, abstractmethod, abstractproperty

    class AbstractFoo(metaclass=ABCMeta):

        @abstractproperty
        def bar(self):
            return 1

        @bar.setter
        def bar(self, val):
            pass

    class ConcreteFoo(AbstractFoo):

        @AbstractFoo.bar.getter
        def bar(self):
            return 1

        @bar.setter
        def bar(self, val):
            pass

    foo = ConcreteFoo()

yields::

    TypeError: Can't instantiate abstract class ConcreteFoo with abstract methods bar
Re: [Python-Dev] Can we improve support for abstract base classes with desciptors
On Wed, Jun 8, 2011 at 11:55 AM, Nick Coghlan wrote:
> On Thu, Jun 9, 2011 at 1:01 AM, Darren Dale wrote:
> [snip excellent analysis of the problem]
>
> I have some suggestions regarding a few details of your current code, but your basic proposal looks sound to me.
>
> I would tweak __new__ along the following lines though:

[snip]

Thank you, I agree. Concerning the following block:

> def get_abstract_names(ns):
>     names = []
>     for item in ns.items():
>         names.extend(get_abstract_names_for_item(item))
>     return names
>
> abstract_names = get_abstract_names(namespace.items())

That should be "get_abstract_names(namespace)", since ns.items() gets called again in the for loop. I think the get_abstract_names function isn't needed though, since it is only ever called that one time. Any reason not to replace the above block with::

    abstract_names = []
    for item in namespace.items():
        abstract_names.extend(get_abstract_names_for_item(item))

> for base in bases:
>     for name in getattr(base, "__abstractmethods__", ()):
>         # CHANGE 4: Using rpartition better tolerates weird naming in the metaclass
>         # (weird naming in descriptors will still blow up in the earlier search for abstract names)

Could you provide an example of weird naming?

Darren
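
P.S. For anyone following along without the full patch in front of them: get_abstract_names_for_item is the descriptor-aware helper under discussion. A rough sketch of what such a helper could look like (my shorthand for the list, not the code from the patch)::

    def get_abstract_names_for_item(item):
        # A plain abstract callable contributes its own name; a property-like
        # descriptor composed of abstract accessors contributes dotted names
        # such as "foo.fget", which the base-class check can later split back
        # apart with name.rpartition(".").
        name, value = item
        if getattr(value, "__isabstractmethod__", False):
            return [name]
        names = []
        for attr in ("fget", "fset", "fdel"):
            func = getattr(value, attr, None)
            if getattr(func, "__isabstractmethod__", False):
                names.append("{0}.{1}".format(name, attr))
        return names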
Re: [Python-Dev] Can we improve support for abstract base classes with desciptors
On Wed, Jun 8, 2011 at 10:01 PM, Nick Coghlan wrote:
> On Thu, Jun 9, 2011 at 8:51 AM, Darren Dale wrote:
>>> for base in bases:
>>>     for name in getattr(base, "__abstractmethods__", ()):
>>>         # CHANGE 4: Using rpartition better tolerates weird naming in the metaclass
>>>         # (weird naming in descriptors will still blow up in the earlier search for abstract names)
>>
>> Could you provide an example of weird naming?
>
> >>> class C(object):
> ...     pass
> ...
> >>> setattr(C, 'weird.name', staticmethod(int))
[...]
> This is definitely something that could legitimately be dismissed as "well, don't do that then" (particularly since similarly weird names on the descriptors will still break). However, I also prefer the way partition based code reads over split-based code, so I still like the modified version.

Yes, I like your modified version as well. I just wanted to understand your concern, since it had never occurred to me to try something like "setattr(C, 'pathological.name', ...)".

> Full tolerance for weird naming would require storing 2-tuples in __abstractmethods__ which would cause a whole new set of problems and isn't worth the hassle.

I'm glad you feel that way. I'll work on a patch that includes docs and unit tests and post it at http://bugs.python.org/issue11610.

What do you think about deprecating abstractproperty, or removing it from the documentation?

Darren