Re: [Python-Dev] PEP 380 (yield from a subgenerator) comments

2009-03-27 Thread Michele Simionato
On Fri, Mar 27, 2009 at 1:33 PM, Jesse Noller  wrote:
> Antoine Pitrou:
>> As a matter of fact, the people whom this PEP is supposed to benefit haven't
>> expressed a lot of enthusiasm right now. That's why it looks so academic.
> That's because most of us who might like this have been patently
> avoiding this thread.

I have been avoiding this thread too - even if I have implemented my
own trampoline, like everybody else here - because I had nothing to say
that had not been said already.
But just to add a data point, let me say that I agree with Eby.
I am +0 on the syntax, but please keep the hidden logic simple and
absolutely do NOT add confusion between yield and return. Use
yield Return(value) or raise SomeException(value), as you like.
The important thing for me is to have a trampoline in the
standard library, not the syntax.

   Michele Simionato
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] decorator module in stdlib?

2009-04-07 Thread Michele Simionato
On Tue, Apr 7, 2009 at 11:04 PM, Terry Reedy  wrote:
>
> This probably should have gone to the python-ideas list.  In any case, I
> think it needs to start with a clear offer from Michele (directly or relayed
> by you) to contribute it to the PSF with the usual conditions.

I have no problem contributing the module to the PSF and maintaining it.
I would just prefer to have the ability to change the function signature in
the core language rather than include a clever hack in the standard library.

   M. Simionato


Re: [Python-Dev] decorator module in stdlib?

2009-04-08 Thread Michele Simionato
On Wed, Apr 8, 2009 at 8:10 AM, Jack diederich  wrote:
> Plus he's a softie for decorators, as am I.

I must admit that while I still like decorators, I do not like them as
much as I did in the past.
I also see an overuse of decorators in various libraries, for things that could
be done more clearly without them ;-(
But this is tangential.
What I would really like to know is the future of PEP 362, i.e. having
a signature object that could be taken from an undecorated function
and added to the decorated function.
I do not recall people having anything against it in principle,
and there is also an implementation in the sandbox, but
after three years nothing has happened. I guess this is just not
a high priority for the core developers.


Re: [Python-Dev] decorator module in stdlib?

2009-04-08 Thread Michele Simionato
On Wed, Apr 8, 2009 at 7:51 PM, Guido van Rossum  wrote:
>
> There was a remark (though perhaps meant humorously) in Michele's page
> about decorators that worried me too: "For instance, typical
> implementations of decorators involve nested functions, and we all
> know that flat is better than nested." I find the nested-function
> pattern very clear and easy to grasp, whereas I find using another
> decorator (a meta-decorator?) to hide this pattern unnecessarily
> obscuring what's going on.

I understand your point, and I will freely admit that I have always had mixed
feelings about the advantages of a meta-decorator over
plain simple nested functions. I see pros and cons.
If functools.update_wrapper could preserve the signature, I
would probably use it over the decorator module.

> I also happen to disagree in many cases with decorators that attempt
> to change the signature of the wrapper function to that of the wrapped
> function. While this may make certain kinds of introspection possible,
> again it obscures what's going on to a future maintainer of the code,
> and the cleverness can get in the way of good old-fashioned debugging.

Then perhaps you misunderstand the goal of the decorator module.
The raison d'etre of the module is to PRESERVE the signature:
update_wrapper unfortunately *changes* it.

When confronted with a library I do not know, I often run pydoc,
sphinx, or a custom-made documentation tool over it to extract the
signatures of its functions. For instance, if I see a method
get_user(self, username) I have a good hint about what it is supposed
to do. But if the library (say a web framework) uses non signature-preserving
decorators, my documentation tool tells me there is a function
get_user(*args, **kwargs), which frankly is not enough [and this is the
optimistic case, when the author of the decorator has taken care
to preserve the name of the original function].
I *hate* losing information about the true signature of functions, since I also
use IPython, Python help, etc. a lot.
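The lost information is easy to demonstrate with a made-up ``logged`` decorator (the name is illustrative). Even with functools.wraps copying the metadata, the wrapper's own code object keeps the generic signature that documentation tools report:

```python
import functools

def logged(func):
    # a typical wrapping decorator, with the usual functools.wraps treatment
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@logged
def get_user(self, username):
    "Fetch a user record."

# wraps copies the name and docstring...
assert get_user.__name__ == 'get_user'
# ...but the wrapper itself still takes (*args, **kwargs): its code
# object declares zero named positional parameters.
assert get_user.__code__.co_argcount == 0
```

(Later Python versions taught inspect.signature to follow the `__wrapped__` attribute that wraps sets, but the wrapper function itself is still generic.)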

>> I must admit that while I still like decorators, I do like them as
>> much as in the past.

Of course there was a missing NOT in this sentence, but you all understood
the intended meaning.

> (All this BTW is not to say that I don't trust you with commit
> privileges if you were to be interested in contributing. I just don't
> think that adding that particular decorator module to the stdlib would
> be wise. It can be debated though.)

Fine. As I have repeated many times, that particular module was never
meant for inclusion in the standard library. But I feel strongly about
the possibility of being able to preserve (not change!) the function
signature.

> To me, introspection is mostly useful for certain
> situations like debugging or interactively finding help, but I would
> hesitate to build a large amount of stuff (whether a library,
> framework or application) on systematic use of introspection. In fact,
> I rarely use the inspect module and had to type help(inspect) to
> figure out what you meant by "signature". :-) I guess one reason is
> that in my mind, and in the way I tend to write code, I don't write
> APIs that require introspection -- for example, I don't like APIs that
> do different things when given a "callable" as opposed to something
> else (common practices in web frameworks notwithstanding), and
> thinking about it I would like it even less if an API cared about the
> *actual* signature of a function I pass into it. I like APIs that say,
> for example, "argument 'f' must be a function of two arguments, an int
> and a string," and then I assume that if I pass it something for 'f'
> it will try to call that something with an int and a string. If I pass
> it something else, well, I'll get a type error. But it gives me the
> freedom to pass something that doesn't even have a signature but
> happens to be callable in that way regardless (e.g. a bound method of
> a built-in type).

I do not think anybody disagrees with your point here. My point still
stands, though: objects should not lie about their signature, especially
during debugging and when generating documentation from code.


Re: [Python-Dev] decorator module in stdlib?

2009-04-09 Thread Michele Simionato
On Thu, Apr 9, 2009 at 2:11 PM, Nick Coghlan  wrote:
> One of my hopes for PEP 362 was that I would be able to just add
> __signature__ to the list of copied attributes, but that PEP is
> currently short a champion to work through the process of resolving the
> open issues and creating an up to date patch (Brett ended up with too
> many things on his plate so he wasn't able to do it, and nobody else has
> offered to take it over).

I am totally ignorant about the internals of Python and I certainly cannot
take that role. But I would like to hear from Guido whether he wants to support
a __signature__ object or does not care. In the first case
I think somebody will take the job; in the second case it is better to
reject the PEP and be done with it.
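(For the record, PEP 362 was eventually accepted, for Python 3.3; the resulting `__signature__` attribute is exactly the copyable hook discussed here. A minimal sketch of a decorator using it:)

```python
import inspect

def logged(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    # attach the undecorated function's Signature object to the wrapper,
    # as PEP 362 makes possible
    wrapper.__signature__ = inspect.signature(func)
    return wrapper

@logged
def get_user(self, username):
    "Fetch a user record."

# introspection now reports the true signature
assert str(inspect.signature(get_user)) == '(self, username)'
```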


Re: [Python-Dev] Lack of sequential decompression in the zipfile module

2007-02-16 Thread Michele Simionato
Derek Shockey  gmail.com> writes:

> 
> Though I am an avid Python programmer, I've never forayed into the area of
> developing Python itself, so I'm not exactly sure how all this works. I was
> confused (and somewhat disturbed) to discover recently that the zipfile module
> offers only one-shot decompression of files, accessible only via the read()
> method. It is my understanding that the module will handle files of up to 4 GB
> in size, and the idea of decompressing 4 GB directly into memory makes me a
> little queasy. Other related modules (zlib, tarfile, gzip, bzip2) all offer
> sequential decompression, but this does not seem to be the case for zipfile
> (even though the underlying zlib makes it easy to do).
> Since I was writing a script to work with potentially very large zipped files,
> I took it upon myself to write an extract() method for zipfile, which is
> essentially an adaption of the read() method modeled after tarfile's extract().
> I feel that this is something that should really be provided in the zipfile
> module to make it more usable. I'm wondering if this has been discussed before,
> or if anyone has ever viewed this as a problem. I can post the code I wrote as a
> patch, though I'm not sure if my file IO handling is as robust as it needs to be
> for the stdlib. I'd appreciate any insight into the issue or direction on where
> I might proceed from here so as to fix what I see as a significant problem.

This is definitely a significant problem. We had to face it at work, and
in the end we decided to use zipstream
(http://www.doxdesk.com/software/py/zipstream.html) instead of zipfile,
but of course having the functionality in the standard library would be
much better.
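(Historical note: the sequential API requested here did land, in Python 2.6, as ZipFile.open(), which returns a file-like object. A self-contained sketch of chunked decompression with it:)

```python
import io
import zipfile

# build a small in-memory archive for the demonstration
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.writestr('big.txt', b'x' * 100000)

# read the member back incrementally instead of via one-shot read()
total = 0
with zipfile.ZipFile(buf) as zf:
    with zf.open('big.txt') as member:
        while True:
            chunk = member.read(4096)   # decompress 4 KB at a time
            if not chunk:
                break
            total += len(chunk)

assert total == 100000
```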

 Michele Simionato




Re: [Python-Dev] Py2.6 ideas

2007-02-19 Thread Michele Simionato
Raymond Hettinger  verizon.net> writes:
> * Add a pure python named_tuple class to the collections module.  I've been 
> using the class for about a year and found that it greatly improves the 
> usability of tuples as records. 
> http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/500261

The implementation of this recipe is really clean and I like it a lot
(I even think of including it in our codebase), but there are a few issues
that I would like to point out.

1. I find the camelcase confusing and I think that NamedTuple should be 
   spelled namedtuple, since it is a function, not a class. The fact that it 
   returns classes does not count ;)

2. I agree with Giovanni Bajo, the constructor signature should be consistent
   with regular tuples. For instance I want to be able to map a named tuple 
   over a record set as returned by fetchall.

3. I would like to pass the list of fields as a sequence, not as
   a string. It would be more consistent and it would make the
   programmatic creation of NamedTuple classes at runtime easier.

4. I want help(MyNamedTuple) to work well; in particular it should
   display the right module name. That means
   that in the m dictionary you should add a __module__ attribute:
   
__module__ = sys._getframe(1).f_globals['__name__']

5. The major issue is that pickle does not work with named tuples, since the
   __module__ attribute is wrong. The suggestion in #4 would solve even
   this issue for free.

6. The ability to pass a show function to the __repr__ seems over-engineering
   to me.

In short, here is how I would change the recipe:

import sys
from operator import itemgetter

def namedtuple(f):
    """Returns a new subclass of tuple with named fields.

    >>> Point = namedtuple('Point x y'.split())
    >>> Point.__doc__   # docstring for the new class
    'Point(x, y)'
    >>> p = Point((11,), y=22)  # instantiate with positional args or keywords
    >>> p[0] + p[1] # works just like the tuple (11, 22)
    33
    >>> x, y = p# unpacks just like a tuple
    >>> x, y
    (11, 22)
    >>> p.x + p.y   # fields also accessible by name
    33
    >>> p   # readable __repr__ with name=value style
    Point(x=11, y=22)

    """
    typename, field_names = f[0], f[1:]
    nargs = len(field_names)

    def __new__(cls, args=(), **kwds):
        if kwds:
            try:
                args += tuple(kwds[name] for name in field_names[len(args):])
            except KeyError, name:
                raise TypeError(
                    '%s missing required argument: %s' % (typename, name))
        if len(args) != nargs:
            raise TypeError(
                '%s takes exactly %d arguments (%d given)' %
                (typename, nargs, len(args)))
        return tuple.__new__(cls, args)

    template = '%s(%s)' % (
        typename, ', '.join('%s=%%r' % name for name in field_names))

    def __repr__(self):
        return template % self

    m = dict(vars(tuple))  # pre-lookup superclass methods (for faster lookup)
    m.update(__doc__='%s(%s)' % (typename, ', '.join(field_names)),
             __slots__=(),  # no per-instance dict
             __new__=__new__,
             __repr__=__repr__,
             __module__=sys._getframe(1).f_globals['__name__'],
             )
    m.update((name, property(itemgetter(index)))
             for index, name in enumerate(field_names))

    return type(typename, (tuple,), m)


if __name__ == '__main__':
    import doctest
    TestResults = namedtuple(['TestResults', 'failed', 'attempted'])
    print TestResults(doctest.testmod())




Re: [Python-Dev] Py2.6 ideas

2007-02-20 Thread Michele Simionato
Raymond Hettinger  rcn.com> writes:

> 
> More thoughts on named tuples after trying-out all of Michele's suggestions:
> 
> * The lowercase 'namedtuple' seemed right only because it's a function, but
> as a factory function, it is somewhat class-like.  In use, 'NamedTuple' more
> closely matches my mental picture of what is happening and distinguishes
> what it does from the other two entries in collections, 'deque' and 
> 'defaultdict'
> which are used to create instances instead of new types.

This is debatable. I remember Guido using lowercase for metaclasses
in the famous descrintro essay. I still prefer lowercase for
class factories. But I will not fight over this ;)

> * I remembered why the __repr__ function had a 'show' argument.  I've
> changed the name now to make it more clear and added a docstring.
> The idea was that some use cases require that the repr exactly match
> the default style for tuples and the optional argument allowed for that
> possibility with almost no performance hit.

But what about simply changing the __repr__?

In [2]: Point = NamedTuple('Point','x','y')

In [3]: Point(1,2)
Out[3]: Point(x=1, y=2)

In [4]: Point.__repr__ = tuple.__repr__

In [5]: Point(1,2)
Out[5]: (1, 2)

It feels clearer to me.

  Michele Simionato



Re: [Python-Dev] Py2.6 ideas

2007-02-20 Thread Michele Simionato
Raymond Hettinger wrote:
> The constructor signature has been experimented with
> several time and had best results in its current form
> which allows the *args for casting a record set returned
> by SQL or by the CSV module as in Point(*fetchall(s)),

I think you mean something like [Point(*tup) for tup in fetchall(s)],
which I don't like for the reasons explained later.

> and it allows for direct construction with Point(2,3) without the
> slower and weirder form: Point((2,3)).  Also, the current signature
> works better with keyword arguments:  Point(x=2, y=3) or
> Point(2, y=3)  which wouldn't be common but would be
> consistent with the relationship between keyword arguments
> and positional arguments in other parts of the language.

I don't buy this argument. Yes, Point(2,3) is nicer than Point((2,3))
in the interactive interpreter and in doctests, but in real life
one always has tuples coming back as return values from functions.
Consider your own example, TestResults(*doctest.testmod()). I would
argue that the * does not feel particularly good and that it would be
better to just write TestResults(doctest.testmod()).
Moreover, I believe that having a subclass constructor incompatible
with the base class constructor is very evil. First of all, you must be
consistent with the tuple constructor, not with
"other parts of the language".
Finally I did some timing of code like this::

 from itertools import imap
 Point = namedtuple('Point x y'.split())

 lst = [(i, i*i) for i in range(500)]

 def with_imap():
     for _ in imap(Point, lst):
         pass

 def with_star():
     for _ in (Point(*t) for t in lst):
         pass

and as expected, performance is worse with the * notation.
In short, I don't see any substantial benefit coming from the *args
constructor.

> The string form for the named tuple factory was arrived at
> because it was easier to write, read, and alter than its original
> form with a list of strings:
>Contract = namedtuple('Contract stock strike volatility
> expiration rate 
> iscall')
> vs.
>Contract = namedtuple('Contract', 'stock', 'strike', 'volatility', 
> 'expiration', 'rate', 'iscall')
> That former is easier to edit and to re-arrange.  Either form is trivial to 
> convert
> programmatically to the other and the definition step only occurs
> once while the
> use of the new type can appear many times throughout the code.
> Having experimented with both forms, I've found the string form to
> be best though it seems a bit odd.  Yet, the decision isn't central to
> the proposal and is still an open question.

``Contract = namedtuple('Contract stock strike volatility expiration rate 
iscall'.split())`` is not that bad either, but I agree that this is a
second order issue.

Michele Simionato




Re: [Python-Dev] A "record" type (was Re: Py2.6 ideas)

2007-02-20 Thread Michele Simionato
Steven Bethard  gmail.com> writes:
> Here's a simple implementation using __slots__:
> 
> http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/502237

That's pretty cool! Two suggestions:

1. rename the _items method to __iter__, so that you have easy casting
to tuples and lists;

2. put a check in the metaclass such as
   ``assert '__init__' not in bodydict`` to make it clear to users
   that they cannot override the __init__ method; that's the metaclass' job.

Great hack!
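A toy rendering of those two suggestions (modern metaclass syntax, illustrative names; not the recipe's actual code):

```python
class RecordMeta(type):
    def __new__(mcl, name, bases, dic):
        # suggestion 2: __init__ is the metaclass's job, so forbid overriding it
        assert '__init__' not in dic, '__init__ is reserved for the metaclass'
        fields = dic.get('__slots__', ())

        def __init__(self, *values):
            for field, value in zip(fields, values):
                setattr(self, field, value)

        # suggestion 1: __iter__ gives easy casting to tuples and lists
        def __iter__(self):
            return (getattr(self, field) for field in fields)

        dic['__init__'] = __init__
        dic['__iter__'] = __iter__
        return type.__new__(mcl, name, bases, dic)

class Point(metaclass=RecordMeta):
    __slots__ = ('x', 'y')

p = Point(1, 2)
assert tuple(p) == (1, 2) and list(p) == [1, 2]
```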

 Michele Simionato



Re: [Python-Dev] Py2.6 ideas

2007-02-21 Thread Michele Simionato
Michele Simionato  gmail.com> writes:
> Finally I did some timing of code like this::
> 
>  from itertools import imap
>  Point = namedtuple('Point x y'.split())
> 
>  lst = [(i, i*i) for i in range(500)]
> 
>  def with_imap():
>      for _ in imap(Point, lst):
>          pass
> 
>  def with_star():
>      for _ in (Point(*t) for t in lst):
>          pass
> 
> and as expected the performances are worse with the * notation

BTW, I take back this point. It is true that the generator expression
is slower, but the direct equivalent of imap is starmap, and using
that I don't see a significant difference in execution times. So it seems that
starmap is smart enough to avoid unnecessary tuple unpacking (as you
probably know ;-). I still don't like a subclass having a signature
incompatible with its parent class, but that's just me.
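With today's stdlib collections.namedtuple (which, unlike the recipe under discussion, takes the type name and the field names as separate arguments), the starmap equivalence looks like this:

```python
from collections import namedtuple
from itertools import starmap

Point = namedtuple('Point', ['x', 'y'])
lst = [(i, i * i) for i in range(500)]

# starmap unpacks each tuple into positional arguments,
# so it is the direct equivalent of Point(*t) for each t
via_starmap = list(starmap(Point, lst))
via_star = [Point(*t) for t in lst]
assert via_starmap == via_star
assert via_starmap[3] == Point(x=3, y=9)
```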

 Michele Simionato



Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Tue, Aug 26, 2008 at 5:32 PM, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> What I would really like to see is a fourth blog entry that shows
> how to use super() reliably and correctly.

That could be arranged.

> In general, I opposed
> to creating documentation in the form of "danger, danger this
> could explode "   IMO, there is not much point in dwelling on
> bugs that have already been fixed, nor is there an advantage to
> showing all the ways the tool can be misused.

Yep. The parts about the bugs of super in 2.2 and 2.3 were written years
ago, when they were relevant. Nowadays they are less relevant, but since they
were already written and since there are still people using older versions
of Python I decided to keep them. I would not keep them in a revised version
intended as "semi-official" documentation of super. Still, I think they are
fine as a blog post.

> For cooperative multiple inheritance, I take issue with the abstracted
> examples provided (i.e. inconsistent signatures).  In a real app that
> actually needs cooperative multiple inheritance, it becomes self-evident
> what "cooperative" actually means -- the methods *have* to be
> designed to interoperate -- it is intrinsic to the problem at hand.

> Cooperative multiple inheritance is *not* about mixing two unrelated
> parents that just happen to use the same method name but have
> different semantics and were not designed to cooperate with each other.
>
> The A-B-C-D diagrams and foo/bar methods in the examples are
> deceptive because they silently drop the precondition of cooperation
> while attempting to demonstrate a supposed flaw in the tool.

They just show that the tool is delicate and not easy to use.

> If I understand the problem correctly, in the rare cases where you do
> need cooperative multiple inheritance, then super() is the only workable
> solution short of designing some equivalent using composition instead
> of inheritance.

In my experience, one can go a long way using composition instead of
inheritance.
I also think that Python would not lose much without cooperative
multiple inheritance.
This is however a personal opinion, and in any case the point is moot
because the language is the way it is. Still, in a blog post personal
opinions and even rants have their place. That part could be removed in a
"semi-official" document.

> Also, it may be controversial, but there may be some merit in de-documenting
> the "unbound" case.  It seems to add more confusion than it's worth.

Fine with me.

> Lastly, I take issue with one other part of the blogs.  While they show
> a clear dislike for cooperative multiple inheritance, they take a potshot
> at multiple inheritance in general.  I don't follow the logic here.  IMO,
> mixin classes like DictMixin have proven themselves as being very
> useful.  Plenty of frameworks share this approach.  Likewise, the
> new ABCs offer mixin capabilities that are really nice.
>
> I think it is a non sequitur to reason from "diamond diagrams are
> complicated" to "mixins should be disallowed".  Instead, I think it better
> to simply recommend that a key to happiness is to keep various mixin classes
> completely orthogonal to one another (no overlapping method names).

I am not completely against multiple inheritance. I am against multiple
inheritance as it is now. A restricted form of multiple inheritance in which
mixin classes are guaranteed to be orthogonal would be fine with me
(provided it is not abused).
This concept already exists in other languages, where the orthogonal
mixins are called "traits".

I have written a trilogy of papers about mixins: if you read them, you
will see where I come from (Zope, which I do not like) and you will
also see that I like DictMixin instead.
I will publish the papers on the blog sooner or later, but you can find
the Italian versions here:

http://stacktrace.it/articoli/2008/06/i-pericoli-della-programmazione-con-i-mixin1/
http://stacktrace.it/articoli/2008/07/i-pericoli-della-programmazione-con-i-mixin2/
http://stacktrace.it/articoli/2008/08/i-pericoli-della-programmazione-con-i-mixin3/


 Michele Simionato


Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Tue, Aug 26, 2008 at 8:56 PM, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> I would state this differently:  "The use cases for cooperative multiple
> inheritance don't arise often in practice; so, if we dropped support
> for those cases, you probably wouldn't notice until you encountered
> one of the rare occasions where it was the right answer to your problem."
>
> There was some quote floating around that expressed the situation
> well -- it went something like: "Python makes most problems easy
> and hard problems possible".  The use cases for cooperative multiple
> inheritance fall in the latter category.

It is just a matter of how rare the use cases really are. Cooperative
methods were introduced 6+ years ago. In all this time surely
they must have been used. How many compelling uses of cooperation
can we find in real life code? For instance in the standard library or
in some well known framework? This is a serious question I have been
wanting to ask for years. I am sure people here can find some examples,
so just give me a pointer and we will see.

> BTW, I really like your paper explaining the MRO.  Excellent work.

The issue with that paper is that I wrote it when my Python experience
amounted to six months and my experience with real life large object-oriented
frameworks was zero. Nowadays I value simplicity more.


Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Wed, Aug 27, 2008 at 3:30 AM, Alex Martelli <[EMAIL PROTECTED]> wrote:
> On Tue, Aug 26, 2008 at 6:16 PM, Michele Simionato
> <[EMAIL PROTECTED]> wrote:
>   ...
>> It is just a matter of how rare the use cases really are. Cooperative
>> methods has been introduced 6+ years ago. In all this time surely
>> they must have been used. How many compelling uses of cooperation
>> we can find in real life code? For instance in the standard library or
>> in some well known framework? This is a serious question I have been
>> wanting to ask for years. I am sure people here can find some example,
>> so just give me a pointer and we will see.
>
> http://www.koders.com/default.aspx?s=super&btn=&la=Python&li=* finds
> over 5,000 hits, but it would take substantial work to sift through
> them (in particular because not all refer to the built-in super, as
> you'll see even in the first page!)

Yep. Notice (I am sure you understood the point correctly, but just to clarify)
that I am not interested in random occurrences of super, but in
code/frameworks expressly designed to leverage cooperation,
and doing it in a compelling way. IOW, I want to see cases where using
cooperation is really better than relying on other techniques. Guido gives
an example in
http://www.python.org/download/releases/2.2.3/descrintro/#cooperation
with a .save method, so in theory there are good use cases, but I
wonder how common they are in practice and whether they are
frequent enough to justify the added complication.
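Guido's .save example can be condensed to show what cooperation buys: in a diamond, explicit base-class calls would run Saver.save twice, while super() runs every method exactly once, in MRO order. A sketch with illustrative class names:

```python
class Saver(object):
    def save(self, out):
        out.append('Saver')

class ColorSaver(Saver):
    def save(self, out):
        super(ColorSaver, self).save(out)
        out.append('color')

class SizeSaver(Saver):
    def save(self, out):
        super(SizeSaver, self).save(out)
        out.append('size')

class ColorSizeSaver(ColorSaver, SizeSaver):
    pass

out = []
ColorSizeSaver().save(out)
# MRO is ColorSizeSaver -> ColorSaver -> SizeSaver -> Saver,
# and each save method runs exactly once
assert out == ['Saver', 'size', 'color']
```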

   M.S.


Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Tue, Aug 26, 2008 at 6:13 PM, Michele Simionato
<[EMAIL PROTECTED]> wrote:
> I not completely against multiple inheritance. I am against multiple 
> inheritance
> as it is now. A restricted form of multiple inheritance in which mixins 
> classes
> are guaranteed to be orthogonal would be fine with me (provided it is
> not abused).
> This concept exists already in other languages, the orthogonal mixins
> are called "traits".

I must correct myself here. Even if for practical purposes traits look
like a restricted form of multiple inheritance, in principle it is better
to think of them as an enhanced single inheritance.
With traits there is always a single superclass: traits are just
single inheritance with a nice syntax to include methods (as in Ruby)
and a guarantee that methods will not be overridden silently (this
guarantee is missing in Ruby).
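A toy version of that guarantee (a hypothetical include_trait helper, not taken from any real trait library): plain single inheritance, plus an explicit error when a trait method would silently override an existing one:

```python
def include_trait(trait, cls):
    # collect the trait's public methods
    methods = dict((name, attr) for name, attr in vars(trait).items()
                   if not name.startswith('__'))
    # refuse to override silently: this is the guarantee Ruby lacks
    clashes = [name for name in methods if hasattr(cls, name)]
    if clashes:
        raise TypeError('trait would override: %s' % ', '.join(clashes))
    # still plain single inheritance: cls is the only superclass
    return type(trait.__name__ + cls.__name__, (cls,), methods)

class Base(object):
    def ping(self):
        return 'ping'

class Greeter(object):          # the "trait": a bag of orthogonal methods
    def greet(self):
        return 'hello'

GreeterBase = include_trait(Greeter, Base)
obj = GreeterBase()
assert obj.greet() == 'hello' and obj.ping() == 'ping'
assert GreeterBase.__bases__ == (Base,)   # a single superclass
```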


   M.S.


Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Wed, Aug 27, 2008 at 5:15 AM, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> ISTR pointing out on more than one occasion that a major use case for
> co-operative super() is in the implementation of metaclasses.  The __init__
> and __new__ signatures are fixed, multiple inheritance is possible, and
> co-operativeness is a must (as the base class methods *must* be called).
>  I'm hard-pressed to think of a metaclass constructor or initializer that
> I've written in the last half-decade or more where I didn't use super() to
> make it co-operative.
>
> That, IMO, is a compelling use case even if there were not a single other
> example of the need for super.
I have been giving a lot of thought to this use case, at least
since the time of the metaclass conflict recipe. I have always wondered
why the recipe had to be so complicated. In the end, I have come to
the conclusion that the problem was not with the recipe but with
multiple inheritance itself.
Let me explain the argument.

A possible use case for multiple inheritance of metaclasses is the
following: suppose I have a DebugMeta metaclass which adds some
debugging support to classes; now I want to apply it to a third party
framework which uses a FrameworkMeta metaclass internally. Let us
suppose the framework author wrote its metaclass correctly, by
supporting cooperation:

.. code-block:: python

 class FrameworkMeta(type):
     def __new__(mcl, name, bases, dic):
         print "Adding framework features to %s" % name
         return super(FrameworkMeta, mcl).__new__(mcl, name, bases, dic)


>>> class FrameworkClass(object):
...     __metaclass__ = FrameworkMeta
Adding framework features to FrameworkClass

Moreover, suppose I wrote my DebugMeta to support cooperation
correctly:

.. code-block:: python

 class DebugMeta(type):
     def __new__(mcl, name, bases, dic):
         print "Adding debugging features to %s" % name
         return super(DebugMeta, mcl).__new__(mcl, name, bases, dic)


Now I can add the debugging features to a class in this way:

.. code-block:: python

 class DebugFrameworkMeta(DebugMeta, FrameworkMeta):
     pass


>>> class DebugFrameworkClass(FrameworkClass):
... __metaclass__ = DebugFrameworkMeta
Adding debugging features to DebugFrameworkClass
Adding framework features to DebugFrameworkClass

As you see, everything works fine. Now, let's travel to the fictional
world of a fictional language called T-Python, which is just like
Python except that it lacks multiple inheritance but has some support for
traits.  By this I mean that there is an ``include_mixin`` function
working more or less like this (it could be enhanced, but I am keeping
it dead simple here for the sake of the argument):

.. code-block:: python

 def include_mixin(mixin, cls): # could be extended to use more mixins
 # traits as in Squeak take the precedence over the base class
 dic = vars(mixin).copy() # could be extended to walk the ancestors
 return type(mixin.__name__ + cls.__name__, (cls,),  dic)


I will argue that T-Python is not worse than Python for this use
case (composition of metaclasses).

In the fictional world there is no need for super, since
all hierarchies are linear and you can just call the base class;
FrameworkMeta could have been written as

.. code-block:: python

 class FrameworkMeta2(type):
def __new__(mcl, name, bases, dic):
print "Adding framework features to %s" % name
return type.__new__(mcl, name, bases, dic)


and DebugMetas as

.. code-block:: python

 class DebugMeta2(type):
def __new__(mcl, name, bases, dic):
print "Adding debugging features to %s" % name
return mcl.__base__.__new__(mcl, name, bases, dic)


Notice that DebugMeta2 is performing a sort of cooperative call here
(``mcl.__base__.__new__``) but dead simple since there is just one base class.

The analogous of FrameworkClass can be defined as

>>> class FrameworkClass2(object):
... __metaclass__ = FrameworkMeta2
Adding framework features to FrameworkClass2

and the analogous of DebugFrameworkClass as

>>> class DebugFrameworkClass2(FrameworkClass2):
... __metaclass__ = DebugFrameworkMeta2
Adding debugging features to DebugFrameworkClass2
Adding framework features to DebugFrameworkClass2

So, as you see, it works. Checks of the kind
``isinstance(DebugFrameworkClass2, DebugMeta2)`` would fail, but this
is not a big issue (isinstance should not be used, or you could
register DebugMeta2 as a base class even if it is not by using
Python 2.6 ABC's).

Now, I am not claiming that I have thought of all possible usages of
multiple inheritance and metaclasses: however I have not found yet a
use case that I could not rewrite by using single-inheritance + traits
as the one I have just shown. Possibly there are cases which are
difficult to rewrite: but how common are they?

Notice that I am not advocating rewriting Python. The argument here is
purely hypothetical and concerns a fictional language: I just want to
understand if full multiple inheritance is really needed.

Re: [Python-Dev] Things to Know About Super

2008-08-26 Thread Michele Simionato
On Tue, Aug 26, 2008 at 11:10 PM, Steve Holden <[EMAIL PROTECTED]> wrote:
> If you aren't aware of it you should take a look at Enthought's traits
> package. It's part of the Enthought Tool Suite (ETS).

I know of the existence of that framework, however it is quite
large and I don't see the relation with the concept of traits
I have in mind, which is more or less the one described here:
http://www.iam.unibe.ch/%7Escg/Archive/Papers/Scha03aTraits.pdf

Basically, these are the properties of traits:

1. the methods/attributes in a trait go logically together;
2. if a trait enhances a class, then all subclasses are enhanced too;
3. if a trait has methods in common with the class, then the
   methods defined in the class have the precedence;
4. the ordering of traits is not important, i.e. enhancing a class
   first with trait T1 and then with trait T2 or viceversa is the same;
5. if traits T1 and T2 have names in common, enhancing a class both
   with T1 and T2 raises an error unless there is an explicit overriding;
6. if a trait has methods in common with the base class, then the
   trait methods have the precedence;

Properties from 4 to 6 are the distinguishing properties of traits
with respect to multiple inheritance and mixins.
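The six properties above can be sketched in a few lines of plain Python. This is a toy illustration only; the name ``add_traits`` is made up for the example and is not the API of the library mentioned later in this thread:

```python
def add_traits(cls, *traits):
    """Return a subclass of cls enhanced with the given traits (a sketch)."""
    dic = {}
    for trait in traits:
        for name, attr in vars(trait).items():
            if name.startswith('_'):
                continue
            if name in dic and dic[name] is not attr:
                # property 5: common names must be overridden explicitly
                raise TypeError('%r provided by more than one trait' % name)
            dic[name] = attr
    for name in vars(cls):
        # property 3: methods defined in the class itself take precedence
        dic.pop(name, None)
    # subclassing realizes properties 2 and 6: subclasses of the result are
    # enhanced too, and trait methods shadow methods inherited from ancestors
    return type(cls.__name__ + 'WithTraits', (cls,), dic)

class Base(object):
    def foo(self):
        return 'base-foo'

class C(Base):
    def baz(self):
        return 'class-baz'

class T1(object):
    def foo(self):
        return 'trait-foo'

class T2(object):
    def baz(self):
        return 'trait-baz'

D = add_traits(C, T1, T2)
assert D().foo() == 'trait-foo'   # property 6: trait beats the base class
assert D().baz() == 'class-baz'   # property 3: the class itself wins
assert D().foo() == add_traits(C, T2, T1)().foo()  # property 4: order-free
```

Property 1 (methods in a trait go logically together) is of course a design discipline, not something the code can enforce.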
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Things to Know About Super

2008-08-27 Thread Michele Simionato
On Wed, Aug 27, 2008 at 4:30 PM, Alex Martelli <[EMAIL PROTECTED]> wrote:
> Maybe you could help me understand this by giving a fully executable
> Python snippet using __bases__[0] instead of the hypothetical
> __base__?

Sorry Alex, I have the fully functional snippet but evidently I have
sent some other blurb instead (it was early in the morning)
It is on my machine at home and now I am at work, so have patience ;)

 Michele


Re: [Python-Dev] Things to Know About Super

2008-08-27 Thread Michele Simionato
Alex Martelli wrote:
> What's DebugFrameworkMeta2?  I assume it's some kind of mix but I
> don't see it defined anywhere so I'm having to guess.

Sorry Alex, here is the definition, which got lost in cut&paste:

DebugFrameworkMeta2 = include_mixin(DebugMeta2, FrameworkMeta2)

> I'd like to understand what, in this example, removes the apparent
> "fragility" (or, lack of flexibility) where DebugMeta2 specifically
> uses mcl.__base__ -- IOW, if I have another "mixin metaclass" just
> like DebugMeta2 but called (say) RemotableMeta which does a print
> "Adding remoting features" and then also calls
> mcl.__base__.__new__(mcl ... just like DebugMeta2, what gets both of
> their __new__ methods called in the right order?

If you want to reimplement full cooperation between mixin classes
you must rework the example a bit, but it does not take a big effort
(see later). However my main point is: do we really want cooperative
methods? Multiple inheritance of metaclasses is perhaps
the strongest use case for multiple inheritance, but is it strong
enough? I mean, in real code how many times did I need that?
I would not mind making life harder for gurus and simpler for
application programmers. I do not think removing cooperation
would be so bad in practice. In many practical cases, one could just write
the metaclass by hand, in this example something like

class RemotableDebugFrameworkMeta(FrameworkMeta):
    def __new__(mcl, name, bases, dic):
        print "Adding remoting features to %s" % name
        print "Adding debugging features to %s" % name
        return FrameworkMeta.__new__(mcl, name, bases, dic)

Maybe you would need to duplicate a couple of lines and/or introduce
a helper function, but you would have the benefit of a more
explicit metaclass, with a simpler hierarchy (see the appendix for
an alternative solution).

> Maybe you could help me understand this by giving a fully executable
> Python snippet using __bases__[0] instead of the hypothetical
> __base__?

To the best of my knowledge __base__ is a valid class attribute:
it denotes the "right" class to use as base. Usually it is the same
as __bases__[0], but there is at least one case when it is different,
when composing old style and new style classes:

  >>> class OldStyle: pass
  >>> class NewStyle(object): pass
  >>> class New(OldStyle, NewStyle): pass
  >>> New.__bases__[0]
  <class __main__.OldStyle at 0x...>
  >>> New.__base__
  <class '__main__.NewStyle'>

Appendix: how to reimplement cooperation in a single-inheritance world


Quoting Raymond: "To achieve substantially the
same functionality, you would likely have to
re-invent much of what super() does for us automatically, and
you would still be imposing constraits on the composed classes
that are substantially the same as what you have with inheritance."

Raymond of course is right, but I am arguing that one does not really
need to re-invent cooperation because the use case for cooperation
are exceedingly rare. Still, if one really wants to reimplement
cooperation, here is my take at the challenge:

$ cat cooperative_mixins.py
"Implements cooperative mixins using single inheritance only"

## everything in three lines
def include_mixin(mixin, cls): # could be extended to use more mixins
    # traits as in Squeak take the precedence over the base class
    dic = vars(mixin).copy() # could be extended to walk the ancestors
    dic['_%s__super' % mixin.__name__] = cls
    return type(mixin.__name__ + cls.__name__, (cls,), dic)

## examples:

class FrameworkMeta(type): # example metaclass
    def __new__(mcl, name, bases, dic):
        print "Adding framework features to %s" % name
        return type.__new__(mcl, name, bases, dic)

class DebugMeta(type): # mixin metaclass
    def __new__(mcl, name, bases, dic):
        print "Adding debugging features to %s" % name
        return mcl.__super.__new__(mcl, name, bases, dic)

class RemotableMeta(type): # another mixin metaclass
    def __new__(mcl, name, bases, dic):
        print "Adding remoting features to %s" % name
        return mcl.__super.__new__(mcl, name, bases, dic)

class FrameworkClass(object):
    __metaclass__ = FrameworkMeta

DebugFrameworkMeta = include_mixin(DebugMeta, FrameworkMeta)

print ' creating DebugFrameworkClass'
class DebugFrameworkClass(FrameworkClass):
    __metaclass__ = DebugFrameworkMeta

RemotableDebugFrameworkMeta = include_mixin(
    RemotableMeta, DebugFrameworkMeta)

print ' creating RemotableDebugFrameworkClass'
class RemotableDebugFrameworkClass(FrameworkClass):
    __metaclass__ = RemotableDebugFrameworkMeta
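For readers on Python 3, where the ``__metaclass__`` hook is replaced by the ``metaclass`` keyword argument, the same name-mangling trick can be sketched as follows; a ``trace`` list stands in for the print statements so the call order is visible (this rendering is mine, not part of the original script):

```python
# include_mixin reproduced from the script above, Python 3 rendering
def include_mixin(mixin, cls):
    dic = vars(mixin).copy()
    dic['_%s__super' % mixin.__name__] = cls
    return type(mixin.__name__ + cls.__name__, (cls,), dic)

trace = []

class FrameworkMeta(type):
    def __new__(mcl, name, bases, dic):
        trace.append('framework:' + name)
        return type.__new__(mcl, name, bases, dic)

class DebugMeta(type):
    def __new__(mcl, name, bases, dic):
        trace.append('debug:' + name)
        # name-mangled to mcl._DebugMeta__super, set by include_mixin
        return mcl.__super.__new__(mcl, name, bases, dic)

class FrameworkClass(metaclass=FrameworkMeta):
    pass

DebugFrameworkMeta = include_mixin(DebugMeta, FrameworkMeta)

class DebugFrameworkClass(FrameworkClass, metaclass=DebugFrameworkMeta):
    pass

assert trace == ['framework:FrameworkClass',
                 'debug:DebugFrameworkClass',
                 'framework:DebugFrameworkClass']
```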


Re: [Python-Dev] Things to Know About Super

2008-08-28 Thread Michele Simionato
On Aug 28, 5:30 pm, "Phillip J. Eby" <[EMAIL PROTECTED]> wrote:
> How is that making things easier for application programmers?

We have different definitions of "application programmer". For me a typical
application programmer is somebody who never fiddles with metaclasses,
which are the realm of framework builders. But the borders are fluid, I agree.

> >Maybe you would need to duplicate a couple of lines and/or to introduce
> >an helper function,
>
> ...which then has to have an agreed-upon protocol that all metaclass
> authors have to follow...  which we already have...  but which you're
> proposing to get rid of...  so we can re-invent it lots of
> times...  in mutually incompatible ways.  :)

Notice that I was discussing a hypothetical language. I was arguing
that in principle one could write a language different from Python,
with single inheritance only, and not lose much expressivity. I am not
advocating any change to current Python. My point is about language
design: I want to know how much I can remove from a language and still
have something useful, in the spirit of the famous Saint-Exupery quote.

 Michele Simionato


Re: [Python-Dev] Things to Know About Super

2008-08-28 Thread Michele Simionato
On Thu, Aug 28, 2008 at 8:54 PM, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> I created a "universal metaclass" in
> DecoratorTools whose sole function is to delegate metaclass __new__,
> __init__, and __call__ to class-level methods (e.g. __class_new__,
> __class_call__, etc.), thereby eliminating the need to have custom
> metaclasses for most use cases in the first place.  Now, wherever possible,
> I use that single metaclass in my frameworks, so that there's no need to mix
> them.

easy_installed DecoratorTools and found it: classy_class.
From the point of view of the code, this is a beautiful and elegant
snippet. However, suppose that from tomorrow everybody starts
using it. Since metaclasses would become so easy to use, possibly a
lot of people would take advantage of them. Then we would have
potentially complex (multiple) inheritance hierarchies with
chains of methods (__class_new__/__class_init__) calling
themselves cooperatively in the MRO. Would the resulting
code be readable? How easy would it be for an average framework user
to understand what is happening to his class?
I think class decorators would be a much better solution than
classy_class for most use cases and we should push that way,
not the cooperative inheritance way.

Generally speaking I prefer solutions based on functional composition
(as in WSGI, which you know very well) to solutions based on method
cooperation. Rather than improve the support for inheritance, I would
like (in an ideal world) to reduce it, to make the choice easier
between inheritance and the alternatives (object composition,
delegation, functional composition). In the real world, I am content
with documenting the pitfalls of super, warning people about the
dangers of complex designs involving multiple inheritance and
cooperation, and suggesting alternatives.
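To make the comparison concrete, here is a minimal sketch of the class-decorator alternative alluded to above: the class is enhanced after creation, with no metaclass and no MRO cooperation involved. The names ``debugged`` and ``traced`` are made up for illustration:

```python
calls = []

def traced(func):
    # wrap a method so that each call is recorded
    def wrapper(self, *args, **kw):
        calls.append(func.__name__)
        return func(self, *args, **kw)
    return wrapper

def debugged(cls):
    """Class decorator adding debugging support, no metaclass needed."""
    for name, attr in list(vars(cls).items()):
        if callable(attr) and not name.startswith('_'):
            setattr(cls, name, traced(attr))
    return cls

@debugged
class Point(object):
    def __init__(self, x, y):
        self.x, self.y = x, y
    def norm2(self):
        return self.x ** 2 + self.y ** 2

p = Point(3, 4)
assert p.norm2() == 25
assert calls == ['norm2']   # __init__ skipped (leading underscore)
```

Unlike a metaclass, the decorator affects only the decorated class, not its subclasses, which is exactly the explicitness being argued for here.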

   Michele Simionato


Re: [Python-Dev] Things to Know About Super

2008-08-28 Thread Michele Simionato
On Fri, Aug 29, 2008 at 6:22 AM, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> You're right, let's abolish inheritance, too, because then you might have to
> read more than one class to see what's happening.

You are joking, but I actually took this idea quite seriously. Once
(four years ago or so) I implemented an object system from scratch
in Scheme, completely without inheritance, to see how far it would
go. It didn't go far, of course (nor did I expect it to), but
at least I learned exactly what (single) inheritance is good for.
OTOH, for what concerns multiple inheritance, I am still not
convinced it is really worth it. I mean, the MRO is beautiful,
elegant and all that on paper, but in real-life code things are different,
especially from the side of the users of frameworks heavily
based on inheritance.

> Naturally, if you can design a system to use delegates instead of class
> hierarchy to represent a chain of responsibility, it might well be an
> improvement.  But there are tradeoffs, and no matter what you are going to
> end up coding chains of responsibility.

Agreed, it is all about tradeoffs. We have a different opinion on what
a good tradeoff is in this case, but that's fine. I guess it depends
on personal experience and the kind of code one has to work with.
For instance I have never had to integrate different frameworks
using different metaclasses in my daily work, so I don't see
a very strong case for classy_class over class decorators,
but I could change my mind in the future, who knows?

Anyway, it would be nice to have a good simple *real life*
use case of cooperative inheritance not involving metaclasses,
suitable for a beginners' tutorial about super, but I haven't
found one yet :-(

  M.S.


Re: [Python-Dev] Things to Know About Super

2008-08-29 Thread Michele Simionato
On Fri, Aug 29, 2008 at 6:15 PM, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> The mixin methods in the ABC machinery would be a lot less useful
> without multiple inheritance (and the collections ABCs would be a whole
> lot harder to define and to write).
>
> So if you're looking for use cases for multiple inheritance, I'd suggest
> starting with the Python 2.6 collections module and seeing how you would
> go about rewriting it using only single inheritance. I believe the new
> io module is also fairly dependent on multiple inheritance.

I am very well aware of the collection module and the ABC mechanism.
However, you are missing that mixins can be implemented in a single-inheritance
world without the complications of the MRO. See my answer to Alex
Martelli in this same thread.


Re: [Python-Dev] Things to Know About Super

2008-08-29 Thread Michele Simionato
On Fri, Aug 29, 2008 at 8:33 PM, Casey Duncan <[EMAIL PROTECTED]> wrote:
> - There are good alternatives to multiple inheritance for many cases, but
> there are cases where multiple inheritance is arguably best.

Maybe, but I am still biased in the opposite direction ;)

>Traits are a
> possible alternative that deserve further study. I think that study would be
> greatly aided by a 3rd-party library implementing traits for Python. If
> traits are to gain any traction or ever be considered for inclusion into the
> language such a library would need to exist first and demonstrate its
> utility.

I wrote a trait library which I plan to release sooner or later. However
it is intended as a proof of concept, not as a candidate for inclusion
in the standard library. As of now, I don't think we should have
a different way of doing mixins in the standard library. There should
be only one obvious way, and the obvious way in current Python
is multiple inheritance as it is now. The proof of concept is
important for educational purposes, to open the mind to
alternatives, to give inspiration to the creators of new languages:
it is not intended to add complication (however small) to
current Python. Having said that, maybe once I release the
library people will start using it in production and will ask
for its inclusion in the standard library, but this is not my goal
now. This happened with my decorator module years ago:
when I wrote it I did not expect people to use it; I saw it
as a temporary hack until we got better support for
fiddling with function signatures in the standard library.
Nevertheless now a lot of people are using it and I am
not even sure it is a good thing (I have seen many decorator
abuses out there). This is the strange thing that happens when
you release a library: you never know what people will
end up using it for ;)

  Michele Simionato


Re: [Python-Dev] Things to Know About Super

2008-08-31 Thread Michele Simionato
On Sat, Aug 30, 2008 at 6:16 AM, Michele Simionato
> I wrote a trait library which I plan to release sooner or later.

Ok, just for the people here that cannot wait I have prepared a pre-alpha
snapshot and uploaded it to my site:

http://www.phyast.pitt.edu/~micheles/python/strait.html

At some moment I want to release it officially, but as of now
I do not feel I have nailed down all the details, and there may be
difficulties I have not foreseen. If so, I am sure Phillip
will find all the loopholes ;)

Nevertheless, I feel that I have covered a lot of the use
cases that I cared about, and that there is something good
in there, so have fun with this foolish attempt to put multiple
inheritance straight! ;-)

      Michele Simionato


Re: [Python-Dev] [issue3769] Deprecate bsddb for removal in 3.0

2008-09-03 Thread Michele Simionato
On Thu, Sep 4, 2008 at 1:41 AM, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> The release
> candidate seems to be the wrong time to
> yank this out (in part because of the surprise
> factor) and in part because I think the change
> silently affects shelve performance so that the
> impact may be significantly negative but not
> readily apparent.

I do not use bsddb directly, but I use shelve which on Linux usually
takes advantage of bsddb. Does removing bsddb mean that
I will not be able to read shelve files written with Python 2.5
with Python 3.0? That would be quite disturbing to me.


Re: [Python-Dev] ',' precedence in documentation

2008-09-29 Thread Michele Simionato
I like Martin's proposals (use a function, remove -O) very much.
Actually I wanted to propose the same months ago. Here is my take at
the assert function, which I would like to be able to raise exceptions
other than AssertionError:

import inspect

def assert_(cond, exc, *args):
"""Raise an exception if cond is not satisfied. exc can be a template
string (then args are the interpolation arguments) or an exception
class (then args are passed to the constructor). Here are a few
examples:

>>> assert_(False, 'ahia!')
Traceback (most recent call last):
   ...
AssertionError: ahia!

>>> assert_(False, ValueError)
Traceback (most recent call last):
  ...
ValueError

>>> a = 1
>>> assert_(isinstance(a, str), TypeError, '%r is not a string' % a)
Traceback (most recent call last):
TypeError: 1 is not a string

"""
if isinstance(exc, basestring):
raise AssertionError(exc % args)
elif inspect.isclass(exc) and issubclass(exc, Exception):
raise exc(*args)
else:
raise TypeError('The second argument of assert_ must be a string '
'or an exception class, not %r' % exc)

 M. Simionato


Re: [Python-Dev] Feedback from numerical/math community on PEP 225

2008-11-08 Thread Michele Simionato
On Sat, Nov 8, 2008 at 5:01 AM, Fernando Perez <[EMAIL PROTECTED]> wrote:
> Hi all,
>
> a while back there was a discussion about new operators for the language, 
> which
> ended in people mentioning that the status of PEP 225 was still undecided and
> that it was the most likely route to consider in this discussion.  I offered
> to collect some feedback from the numerical and math/scientific computing
> communities and report back here.

While not a user of numpy, I feel it is legitimate to give my feedback
as a person with a scientific background.
I personally have always felt the lack of a binary operator expressing
non-commutative multiplication.
It could be used for matrices, but also for function composition and
other more abstract things. I think a single new operator is all that is
needed.
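For illustration, the composition half of this wish can already be emulated with operator overloading; here is a sketch abusing ``*`` for that purpose (the ``Composable`` wrapper is made up for the example, not a proposal):

```python
class Composable(object):
    """Function wrapper overloading '*' as (non-commutative) composition."""
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kw):
        return self.func(*args, **kw)
    def __mul__(self, other):
        # (f * g)(x) == f(g(x)); order matters, hence non-commutative
        return Composable(lambda *a, **kw: self.func(other(*a, **kw)))

double = Composable(lambda x: 2 * x)
succ = Composable(lambda x: x + 1)
assert (double * succ)(3) == 8   # double(succ(3))
assert (succ * double)(3) == 7   # succ(double(3))
```

The point of a dedicated operator, of course, is to get this without hijacking ``*`` or forcing everything through a wrapper class.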

    Michele Simionato


[Python-Dev] my first post: asking about a "decorator" module

2005-05-04 Thread Michele Simionato
My first post to python-dev, but I guess my name is not completely unknown
in this list ;)

Actually, I have been wondering about subscribing to python-dev for at least 
a couple of years, but never did it, because of the limited amount of time 
I have to follow all the interesting mailing lists in the world.

However, in the last few months I have been involved with teaching Python 
and I have decided to follow more closely the development to keep myself
updated on what is going on.

Plus, I have some ideas I would like to share with people in this list.

One of them concerns decorators.

Are there plans to improve decorators support in future Python versions?
By "improving decorator support" I mean for instance a module in the standard
library providing some commonly used decorators such as ``memoize``,
or utilities to create and compose decorators, and things like that.

I have been doing some work on decorators lately and I would be
willing to help if there is general interest in a "decorator"
module. Actually, I already have a good candidate function for that module,
and plenty of recipes.

I submitted an early version of the idea some time ago on c.l.py

http://groups-beta.google.com/group/comp.lang.python/browse_frm/thread/60f22ed33af5dbcb/5f870d271456ccf3?q=simionato+decorate&rnum=1&hl=en#5f870d271456ccf3

but I could as well flesh it out and deliver a module people can
play with and see if they like it. This is especially interesting in this
moment, since decorators may address many of the use cases 
of PEP 340 (not all of them).

I need to write down some documentation, but it could be done by tomorrow.

What do people think?


   Michele Simionato


Re: [Python-Dev] my first post: asking about a "decorator" module

2005-05-06 Thread Michele Simionato
On 5/5/05, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> 
> Yes, there has been quite a bit of interest including several ASPN
> recipes and a wiki:
> 
>http://www.python.org/moin/PythonDecoratorLibrary

Thanks, I didn't know about that page. BTW, I notice that all the decorators
on that page are improper, in the sense that they change the signature of
the functions they decorate. So, all those recipes would need some help
from my decorator module, to make them proper ;-)

http://www.phyast.pitt.edu/~micheles/python/decorator.zip


Re: [Python-Dev] The decorator module

2005-05-06 Thread Michele Simionato
On 5/6/05, Jim Jewett <[EMAIL PROTECTED]> wrote:
> Thank you; this is very good.
> 
> I added a link to it from http://www.python.org/moin/PythonDecoratorLibrary;
> please also consider adding a version number and publishing via PyPI.

Yes, this was in my plans. For the moment, however, this is just version 0.1;
I want to wait a bit before making an official release.

> Incidentally, would the resulting functions be a bit faster if you compiled
> the lambda instead of repeatedly eval ing it, or does the eval overhead still
> apply?
> 
> -jJ
> 

Honestly, I don't care, since "eval" happens only once at decoration time.
There is no "eval" overhead at calling time, so I do not expect to have
problems. I am waiting for volunteers to perform profiling and
performance analysis ;)

Michele Simionato


Re: [Python-Dev] The decorator module

2005-05-06 Thread Michele Simionato
On 5/6/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> [Michele]
> > Honestly, I don't care, since "eval" happens only once at decoration time.
> > There is no "eval" overhead at calling time, so I do not expect to have
> > problems. I am waiting for volunteers to perform profiling and
> > performance analysis ;)
> 
> Watch out. I didn't see the code referred to, but realize that eval is
> *very* expensive on some other implementations of Python (Jython and
> IronPython). Eval should only be used if there is actual user-provided
> input that you don't know yet when your module is compiled; not to get
> around some limitation in the language (there are usually ways around
> that, and occasionally we add one, e.g. getattr()).

I actually posted the code on c.l.py a month ago asking if there was
a way to avoid "eval", but I got no answer. So, let me repost the code
here and see if somebody comes up with a good solution.
It is only ~30 lines long (+ ~30 of comments & docstrings).

## I suggest you uncomment the 'print lambda_src' statement in _decorate
## to understand what is going on.

import inspect

def _signature_gen(func, rm_defaults=False):
argnames, varargs, varkwargs, defaults = inspect.getargspec(func)
argdefs = defaults or ()
n_args = func.func_code.co_argcount
n_default_args = len(argdefs)
n_non_default_args = n_args - n_default_args
non_default_names = argnames[:n_non_default_args]
default_names = argnames[n_non_default_args:]
for name in non_default_names:
yield "%s" % name
for i, name in enumerate(default_names):
if rm_defaults:
yield name
else:
yield "%s = arg[%s]" % (name, i) 
if varargs:
yield "*%s" % varargs
if varkwargs:
yield "**%s" % varkwargs

def _decorate(func, caller):
signature = ", ".join(_signature_gen(func))
variables = ", ".join(_signature_gen(func, rm_defaults=True))   
lambda_src = "lambda %s: call(func, %s)" % (signature, variables)
# print lambda_src # for debugging
evaldict = dict(func=func, call=caller, arg=func.func_defaults or ())
dec_func = eval(lambda_src, evaldict)
dec_func.__name__ = func.__name__
dec_func.__doc__ = func.__doc__
dec_func.__dict__ = func.__dict__ # copy if you want to avoid sharing
return dec_func

class decorator(object):
"""General purpose decorator factory: takes a caller function as
input and returns a decorator. A caller function is any function like this:

def caller(func, *args, **kw):
# do something
return func(*args, **kw)

Here is an example of usage:

>>> @decorator
... def chatty(f, *args, **kw):
... print "Calling %r" % f.__name__
... return f(*args, **kw)
    >>> @chatty
... def f(): pass
>>> f()
Calling 'f'
"""
def __init__(self, caller):
self.caller = caller
def __call__(self, func):
return _decorate(func, self.caller)
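For the record, later Pythons grew ``functools.wraps`` (2.5+), which avoids ``eval`` entirely by copying ``__name__``, ``__doc__`` and ``__dict__`` onto the wrapper; note that, unlike the module above, it does not make ``inspect.getargspec`` report the original signature (only ``inspect.signature``, following the ``__wrapped__`` attribute, learned to do that much later). A sketch in Python 3 syntax:

```python
import functools

calls = []

def chatty(func):
    @functools.wraps(func)  # copies __name__, __doc__, __dict__, sets __wrapped__
    def wrapper(*args, **kw):
        calls.append(func.__name__)
        return func(*args, **kw)
    return wrapper

@chatty
def add(x, y=1):
    "Add two numbers."
    return x + y

assert add(2) == 3 and calls == ['add']
assert add.__name__ == 'add' and add.__doc__ == "Add two numbers."
```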

Michele Simionato


Re: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340

2005-05-06 Thread Michele Simionato
On 5/6/05, Steven Bethard <[EMAIL PROTECTED]> wrote:
> FWIW, I'm +1 on this.  Enhanced Iterators
>  * updates the iterator protocol to use .__next__() instead of .next()
>  * introduces a new builtin next()
>  * allows continue-statements to pass values to iterators
>  * allows generators to receive values with a yield-expression
> The first two are, I believe, how the iterator protocol probably
> should have been in the first place.  The second two provide a simple
> way of passing values to generators, something I got the impression
> that the co-routiney people would like a lot.

Thank you for splitting the PEP. Conceptually, the "coroutine" part
has nothing to do with blocks and stands on its own; it is right
to discuss it separately from the block syntax.

Personally, I do not see an urgent need for the block syntax (most of
the use cases can be managed with decorators) nor for the "coroutine"
syntax (you can already use Armin Rigo's greenlets for that).

Anyway, the idea of passing arguments to generators is pretty cool,
here is some code I have, adapted from Armin's presentation at the
ACCU conference:

from py.magic import greenlet

def yield_(*args):
return greenlet.getcurrent().parent.switch(*args)

def send(key):
return process_commands.switch(key)

@greenlet
def process_commands():
while True:
line = ''
while not line.endswith('\n'):
line += yield_()
print line,
if line == 'quit\n':
print "are you sure?"
if yield_() == 'y':
break

process_commands.switch() # start the greenlet

send("h")
send("e")
send("l")
send("l")
send("o")
send("\n")

send("q")
send("u")
send("i")
send("t")
send("\n")
  

Michele Simionato


Re: [Python-Dev] The decorator module

2005-05-08 Thread Michele Simionato
On 5/6/05, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> In this case, the informally-discussed proposal is to add a mutable
> __signature__ to functions, and have it be used by inspect.getargspec(), so
> that decorators can copy __signature__ from the decoratee to the decorated
> function.

Are there any plans for a facility to copy functions? Currently I am doing

import new

def copyfunc(func):
    "Creates an independent copy of a function."
    c = func.func_code
    nc = new.code(c.co_argcount, c.co_nlocals, c.co_stacksize, c.co_flags,
                  c.co_code, c.co_consts, c.co_names, c.co_varnames,
                  c.co_filename, c.co_name, c.co_firstlineno,
                  c.co_lnotab, c.co_freevars, c.co_cellvars)
    return new.function(nc, func.func_globals, func.func_name,
                        func.func_defaults, func.func_closure)
 
and I *hate* it!
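For what it's worth, modern Python can spell the same copy without the long-deprecated `new` module, using the `types.FunctionType` constructor directly. A sketch in Python 3 spelling, not the code that went into the decorator module:

```python
import types
import functools

def copyfunc(func):
    "Create an independent copy of a function (modern spelling)."
    nf = types.FunctionType(func.__code__, func.__globals__, func.__name__,
                            func.__defaults__, func.__closure__)
    # copy __doc__, __dict__, __module__, __qualname__, ...
    functools.update_wrapper(nf, func)
    nf.__kwdefaults__ = func.__kwdefaults__
    return nf

def add(x, y=1):
    return x + y

add10 = copyfunc(add)
add10.__defaults__ = (10,)   # the copy is independent of the original
assert (add(5), add10(5)) == (6, 15)
```

Mutating the copy's `__defaults__` leaves the original untouched, which is the independence the `new`-based version was after.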

I have updated my module to version 0.2, with an improved discussion
of decorators in multithreaded programming ("locked", "threaded",
"deferred"): http://www.phyast.pitt.edu/~micheles/python/decorator.zip


Michele Simionato


Re: [Python-Dev] The decorator module

2005-05-10 Thread Michele Simionato
On 5/9/05, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> 
> Choices:
> - submit a patch adding a __copy__ method to functions,
> - submit a patch for the copy module, or
> - submit a feature request, assign to me, and wait.

Well, actually I am even more ambitious than that: not only would I like
to be able to copy functions, but I would also like to be able to subclass
FunctionType with a user-defined __copy__ method.

Don't worry, I will submit the feature request ;)

  Michele Simionato 

P.S. I have added yet another example to the documentation of
the decorator module, now arrived at version 0.3:

http://www.phyast.pitt.edu/~micheles/python/decorator.zip


Re: [Python-Dev] The decorator module

2005-05-10 Thread Michele Simionato
On 5/10/05, Michele Simionato <[EMAIL PROTECTED]> wrote:
> 
> Well, actually I am even more ambitious than that: not only would I like
> to be able to copy functions, but I would also like to be able to subclass
> FunctionType with a user-defined __copy__ method.

BTW, it seems possible to copy closures, but how about *pickling* them?
Is that technically feasible with a reasonable effort or is it a mess?

  Michele Simionato


[Python-Dev] a patch to inspect and a non-feature request

2005-05-12 Thread Michele Simionato
Well, I logged into Sourceforge with the idea of filing my feature request
about copying functions, and then my eye fell on my previous submissions.
It seems it takes some time to fix non-critical bugs, doesn't it? ;)

Two years ago, I discovered a bug with pydoc for classes containing "super" 
objects:

>>> class C(object):
...     pass

>>> C.s = super(C)

>>> help(C) # aargh!!

I filed that bug 25 months ago and it is still there (actually Brett
Cannon fixed it, but then somebody else broke his patch).
 
Clearly nobody uses this feature and the fix is not at all urgent, but
it still disturbs me, so I have worked out a patch. Actually, the
problem is not in pydoc but in inspect, which treats super objects as
methods, whereas they should be treated as data. Here is the patch:

$ diff -c /home/micheles/python/dist/src/Lib/inspect.py inspect.py
*** /home/micheles/python/dist/src/Lib/inspect.py   Thu May 12 13:05:10 2005
--- inspect.py  Thu May 12 13:06:55 2005
***
*** 77,83 
  and not hasattr(object, "__set__") # else it's a data descriptor
  and not ismethod(object)   # mutual exclusion
  and not isfunction(object)
! and not isclass(object))

  def isdatadescriptor(object):
  """Return true if the object is a data descriptor.
--- 77,84 
  and not hasattr(object, "__set__") # else it's a data descriptor
  and not ismethod(object)   # mutual exclusion
  and not isfunction(object)
! and not isclass(object)
! and not isinstance(object, super))

  def isdatadescriptor(object):
  """Return true if the object is a data descriptor.

It changes the code of ismethoddescriptor to make sure that super objects
are not treated as methods.

BTW, I have downloaded the CVS version of Python and run test_inspect
against the patch and it is working. However, introspection tools have
the tendency to be very fragile (especially with the rate of changes
in Python) and it is possible that this fix would break something else.

Let The Powers That Be to decide.

The test suite should be augmented with a test such as:

>>> inspect.ismethoddescriptor(C.s)
False

In my experience super is a huge can of worms, and actually I have a
non-feature request about the descriptor aspect of super: I would like
super's __get__ method and the possibility to call super with just one
argument to be removed in Python 3000.
They are pretty much useless (yes, I know of "autosuper") and error prone.

   Michele Simionato


Re: [Python-Dev] a patch to inspect and a non-feature request

2005-05-12 Thread Michele Simionato
On 5/12/05, Steven Bethard <[EMAIL PROTECTED]> wrote:
>super doesn't work with "meta-attributes" and classmethods:
> 
> py> super(C, C).__name__
> Traceback (most recent call last):
>   File "", line 1, in ?
> AttributeError: 'super' object has no attribute '__name__'

Actually this is the Right Thing to do for super. It is something
to be aware of, not something to change. Since __name__ is
a descriptor defined in the type metaclass and not an attribute
defined in the base class, super correctly does not retrieve it.
It is enough to add some documentation about "super" caveats
and nonobvious points.
What I really dislike is super called with only one argument, since
it has many unpleasant surprises and no real advantages :-(
   
Michele Simionato


Re: [Python-Dev] Merging PEP 310 and PEP 340-redux?

2005-05-12 Thread Michele Simionato
On 5/12/05, Benji York <[EMAIL PROTECTED]> wrote:
> if the file object had these methods:
> 
> def __enter__(self): return self
> def __exit__(self, *args): self.close()
> 
> you could write
> 
> do file('whatever) as f:
>  lines = f.readlines()
> 
> Or a lock:
> 
> def __enter__(self): self.aquire(); return self
> def __exit__(self, *args): self.release()
> 
> do my_lock:
>  a()
>  b()
>  c()

Ah, finally a proposal that I can understand! 
But maybe the keyword should be "let":

let lock:
    do_something

let open("myfile") as f:
    for line in f: do_something(line)

or even, without need of "as":

let f = file("myfile"):
    for line in f: do_something(line)

which I actually like more.

 Michele Simionato


[Python-Dev] the current behavior of try: ... finally:

2005-05-12 Thread Michele Simionato
All this talk about try: ... finally: and exceptions reminded me of a curious
behavior I discovered a while back, i.e. that finally can swallow
your exceptions. This is a contrived example, but it shows the point:

def divide1(n1, n2):
    try:
        result = n1/n2
    finally:
        print "cleanup"
        result = "Infinity\n"
        return result # the exception is swallowed away

def divide2(n1, n2):
    try:
        result = n1/n2
    finally:
        print "cleanup"
        result = "Infinity\n"
    return result # the exception is NOT swallowed away

print divide1(10, 0) # no exception
print divide2(10, 0) # error

If there is an indentation error in "divide2" and the return line is too
indented, the exception gets swallowed by the finally clause.

I am not sure if this is good or bad, but it sure surprised me that a
finally clause could hide my exception.
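Stripped to the bone, the surprising behavior is just this (a minimal sketch in modern Python 3 spelling):

```python
def swallow():
    try:
        raise ValueError("never seen")
    finally:
        # a `return` inside `finally` discards the in-flight exception
        return "Infinity"

assert swallow() == "Infinity"   # no ValueError escapes
```

The `return` in the `finally` suite takes over the frame's exit, so the pending `ValueError` is silently dropped; the same happens with `break` and `continue`.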

 Michele Simionato


Re: [Python-Dev] the current behavior of try: ... finally:

2005-05-12 Thread Michele Simionato
On 5/13/05, Greg Ewing <[EMAIL PROTECTED]> wrote:
> Michele Simionato wrote:
> 
> > def divide1(n1, n2):
> >     try:
> >         result = n1/n2
> >     finally:
> >         print "cleanup"
> >         result = "Infinity\n"
> >         return result # the exception is swallowed away
> 
> What would you prefer to have happen in this case?
> 
> Or do you think return (and break and continue) should
> be disallowed in a finally?
> 

Honestly, I don't know. This is why I ask here ;)

  Michele Simionato


[Python-Dev] threadless brownian.py

2006-04-09 Thread Michele Simionato
Recently I downloaded Python 2.5a1 and I have started playing with it.
In doing so, I have been looking at the Demo directory in the distribution,
to check if demos of the new features have been added there. So I
rediscovered brownian.py, in Demo/tkinter/guido. I just love this little
program, because it reminds me of one of my first programs, a long long
time ago (a brownian motion in AmigaBasic, with sprites!). It is also one
of the first programs I looked at when I started studying threads four
years ago, and I thought it was perfect. However, nowadays I know better,
and I have realized that brownian.py is a perfect textbook example of a
case where you don't really need threads and you can use generators
instead. So I thought it would be nice to add a threadless version of
brownian.py to the Demo directory.
Here it is. If you like it, I donate the code to the PSF!



# Brownian motion -- an example of a NON multi-threaded Tkinter program ;)

from Tkinter import *
import random
import sys

WIDTH = 400
HEIGHT = 300
SIGMA = 10
BUZZ = 2
RADIUS = 2
LAMBDA = 10
FILL = 'red'

stop = 0    # set when main loop exits
root = None # main window

def particle(canvas):   # particle = iterator over the moves
    r = RADIUS
    x = random.gauss(WIDTH/2.0, SIGMA)
    y = random.gauss(HEIGHT/2.0, SIGMA)
    p = canvas.create_oval(x-r, y-r, x+r, y+r, fill=FILL)
    while not stop:
        dx = random.gauss(0, BUZZ)
        dy = random.gauss(0, BUZZ)
        try:
            canvas.move(p, dx, dy)
        except TclError:
            break
        else:
            yield None

def move(particle): # move the particle at a random time
    particle.next()
    dt = random.expovariate(LAMBDA)
    root.after(int(dt*1000), move, particle)

def main():
    global root, stop
    root = Tk()
    canvas = Canvas(root, width=WIDTH, height=HEIGHT)
    canvas.pack(fill='both', expand=1)
    np = 30
    if sys.argv[1:]:
        np = int(sys.argv[1])
    for i in range(np):  # start the dance
        move(particle(canvas))
    try:
        root.mainloop()
    finally:
        stop = 1

if __name__ == '__main__':
    main()



[Python-Dev] feature request: inspect.isgenerator

2006-05-29 Thread Michele Simionato
Is there still time for new feature requests in Python 2.5?
I am missing an isgenerator function in the inspect module. Right now
I am using

def isgenerator(func):
    return func.func_code.co_flags == 99

but it is rather ugly (horrible indeed).


   Michele Simionato




Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-01 Thread Michele Simionato
Neal Norwitz  gmail.com> writes:
> 
> > I wonder whether a check shouldn't just return (co_flags & 0x20), which
> > is CO_GENERATOR.
> 
> Makes more sense.

Okay, but my point is that the end user should not be expected to know
about those implementation details. The one obvious way to me is to have an
inspect.isgenerator, analogous to inspect.isfunction, inspect.ismethod, etc.
The typical use case is in writing a documentation/debugging tool. In my
case, I was writing a decorator that needed to know whether it was
decorating a regular function or a generator.
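The flags check the thread converges on can be written out as a small predicate. A sketch (the helper name `isgenfunc` is invented here; the same predicate later landed in the stdlib as `inspect.isgeneratorfunction`, in Python 2.6):

```python
import inspect

def isgenfunc(func):
    # test the CO_GENERATOR bit instead of comparing co_flags to a magic 99
    return bool(func.__code__.co_flags & inspect.CO_GENERATOR)

def g():
    yield 1

def f():
    return 1

assert isgenfunc(g) and not isgenfunc(f)
assert inspect.isgeneratorfunction(g)   # the stdlib spelling since 2.6
```

Masking a single flag bit is robust, whereas `co_flags == 99` breaks as soon as the compiler sets any other flag on the code object.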

Michele Simionato 



Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-01 Thread Michele Simionato
Terry Reedy  udel.edu> writes:
> To me, another obvious way is isinstance(object, gentype) where
> gentype = type(i for i in []) # for instance
> which should also be in types module.

No, that check would match generator objects, not generators tout court.
On a related note, inspect.isfunction gives True on a generator, such as

def g(): yield None

This could confuse people; however, I am inclined to leave things as they are.
Any thoughts?

 Michele Simionato



Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-01 Thread Michele Simionato
Georg Brandl  gmx.net> writes:

> I'd say, go ahead and write a patch including docs, and I think there's no
> problem with accepting it (as long as it comes before beta1).

I was having a look at http://docs.python.org/dev/lib/inspect-types.html
and it would seem that adding isgenerator would imply changing
inspect.getmembers() and its documentation. Also, should one add
a GeneratorType, perhaps as a subclass of FunctionType?



Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-01 Thread Michele Simionato
Georg Brandl  gmx.net> writes:
> 
> > Also, should one add
> > a GeneratorType, perhaps as a subclass of FunctionType?
> 
> Add GeneratorType where? There is already one in the types module.

Yep, this is the crux. types.GeneratorType refers to generator objects,
which in an improper sense are "instances" of a "generator function".
I.e.

def g(): yield 1 # this is a generator

go = g() # this is a generator object

I want isgenerator(g) == True, but isgenerator(go) == False.

So, what should be the class of g? Maybe we can keep FunctionType
and not bother.

 Michele Simionato







Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-06 Thread Michele Simionato
Phillip J. Eby  telecommunity.com> writes:

> I think the whole concept of inspecting for this is broken.  *Any* 
> function can return a generator-iterator.  A generator function is just a 
> function that happens to always return one.
> In other words, the confusion is in the idea of introspecting for this in 
> the first place, not that generator functions are of FunctionType.  The 
> best way to avoid the confusion is to avoid thinking that you can 
> distinguish one type of function from another without explicit guidance 
> from the function's author.

Nolo contendere.

I am convinced and I am taking back my feature request.


 Michele Simionato



Re: [Python-Dev] feature request: inspect.isgenerator

2006-06-06 Thread Michele Simionato
Terry Reedy  udel.edu> writes:
> tout court?? is not English or commonly used at least in America

It is French:

http://encarta.msn.com/dictionary_561508877/tout_court.html

I thought it was common in English too, but clearly I was mistaken.
 
> Ok, you mean generator function, which produces generators, not generators 
> themselves.  So what you want is a new isgenfunc function.  That makes more 
> sense, in a sense, since I can see that you would want to wrap genfuncs 
> differently from regular funcs.  But then I wonder why you don't use a 
> different decorator since you know when you are writing a generator 
> function.

Because in a later refactoring I may want to replace a function with a
generator function or vice versa, and I don't want to use a different
decorator. The application I had in mind was a Web framework
where you can write something like

@expose
def page(self):
    return 'Hello World!'

or

@expose
def page(self):
    yield 'Hello '
    yield 'World!'

indifferently. I seem to remember CherryPy has something like that.

  Michele Simionato



Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-18 Thread Michele Simionato
On 10/18/05, Antoine Pitrou <[EMAIL PROTECTED]> wrote:
> On Tuesday, 18 October 2005 at 10:57 -0400, Barry Warsaw wrote:
> Currently I never use properties, because it makes classes much less
> readable for the same kind of reasons as what Jim wrote.

Me too, I never use properties directly. However, I have experimented
with using helper functions to generate the properties:

_dic = {}

def makeproperty(x):
    def getx(self):
        return _dic[self, x]
    def setx(self, value):
        _dic[self, x] = value
    return property(getx, setx)

class C(object):
    x = makeproperty('x')

c = C()
c.x = 1
print c.x

But in general I prefer to write a custom descriptor class, since it
gives me much more control.
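The "custom descriptor class" alternative he mentions can be as small as this. A sketch (the class name `ManagedAttr` and the underscore-storage convention are invented for illustration):

```python
class ManagedAttr(object):
    "A minimal data descriptor: full control over attribute get/set."
    def __init__(self, name):
        self.name = '_' + name           # where the value is stored
    def __get__(self, obj, objtype=None):
        if obj is None:
            return self                  # accessed on the class itself
        return getattr(obj, self.name)
    def __set__(self, obj, value):
        setattr(obj, self.name, value)

class C(object):
    x = ManagedAttr('x')

c = C()
c.x = 1
assert c.x == 1
assert c._x == 1   # the value lives on the instance, not in a global dict
```

Unlike the `_dic` helper above, storage stays on the instance, so objects need not be hashable and nothing leaks when they are garbage-collected.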

Michele Simionato


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-19 Thread Michele Simionato
On 10/18/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> I wonder if at some point in the future Python will have to develop a
> macro syntax so that you can write
>
> Property foo:
> def get(self): return self._foo
> ...etc...

This reminds me of an idea I have kept in my drawer for a couple of years or so.
Here is my proposition: we could have the statement syntax

<callable> <name>(<args>):
    <definitions>

to be syntactic sugar for

<name> = <callable>(<name>, <args>, <dic>)

For instance properties could be defined as follows:

def Property(name, args, dic):
    return property(
        dic.get('fget'), dic.get('fset'), dic.get('fdel'), dic.get('__doc__'))

Property p():
    "I am a property"
    def fget(self):
        pass
    def fset(self):
        pass
    def fdel(self):
        pass

Another typical use case could be a dispatcher:

class Dispatcher(object):
    def __init__(self, name, args, dic):
        self.dic = dic
    def __call__(self, action, *args, **kw):
        return self.dic.get(action)(*args, **kw)

Dispatcher dispatch(action):
    def do_this():
        pass
    def do_that():
        pass
    def default():
        pass

dispatch('do_this')

Notice that the proposal is already implementable by abusing the class
statement:

class <name>:
    __metaclass__ = <callable>
    <definitions>

But abusing metaclasses for this task is ugly. BTW, if the proposal were
implemented, the 'class' statement would become redundant and could be
replaced by 'type':

class <name>:
    <definitions>

<=>

type <name>:
    <definitions>


;)
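For what it's worth, the class-statement "abuse" described above is runnable today. A sketch in Python 3 spelling, where any callable may be passed as the metaclass (the `Property` example from earlier in the message, assuming a trivial `fget`):

```python
def Property(name, bases, dic):
    # the metaclass hook receives exactly the (name, args, dic) triple
    # the proposed statement would pass to the callable
    return property(dic.get('fget'), dic.get('fset'),
                    dic.get('fdel'), dic.get('__doc__'))

class C:
    class p(metaclass=Property):
        "I am a property"
        def fget(self):
            return 42

assert C().p == 42
assert C.p.__doc__ == "I am a property"
```

Since `Property` is not a type, Python simply calls it with the class name, bases, and namespace, and binds whatever it returns; the nested "class" statement never produces a class at all.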

   Michele Simionato


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-19 Thread Michele Simionato
On 10/19/05, Paul Moore <[EMAIL PROTECTED]> wrote:
>
> One question - in the expansion, "name" is used on both sides of the
> assignment. Consider
>
> something name():
>     <definitions>
>
> This expands to
>
> name = something(name, (), <dic>)
>
> What should happen if name wasn't defined before? A literal
> translation will result in a NameError. Maybe an expansion
>
> name = something('name', (), <dic>)
>
> would be better (ie, the callable gets the *name* of the target as an
> argument, rather than the old value).
>
> Also, the <definitions> bit needs some clarification. I'm guessing
> that it would be a suite, executed in a new, empty namespace, and the
> <dic> is the resulting modified namespace (with
> __builtins__ removed?)
>
> In other words, take <definitions>, and do
>
> d = {}
> exec <definitions> in d
> del d['__builtins__']
>
> then <dic> is the resulting value of d.
>
> Interesting idea...
>
> Paul.
>

<name> would be a string and <dic> a dictionary.
As I said, the semantics would be exactly the same as the current
way of doing it:

class <name>:
    __metaclass__ = <callable>
    <definitions>

I am just advocating for syntactic sugar; the functionality is already there.

   Michele Simionato


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-20 Thread Michele Simionato
As others explained, the syntax would not work for functions (and it is
not intended to).
A possible use case I had in mind is to define inlined modules to be
used as bunches of attributes. For instance, I could define a module as

module m():
    a = 1
    b = 2

where 'module' would be the following function:

import types

def module(name, args, dic):
    mod = types.ModuleType(name, dic.get('__doc__'))
    for k in dic:
        setattr(mod, k, dic[k])
    return mod
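A quick check that the helper behaves as intended, calling it directly with the same `(name, args, dic)` triple the proposed `module m(): ...` block would supply. A self-contained sketch (the helper is repeated so the block runs on its own):

```python
import types

def module(name, args, dic):
    # builds a module object instead of a class from a class-like body
    mod = types.ModuleType(name, dic.get('__doc__'))
    for k in dic:
        setattr(mod, k, dic[k])
    return mod

# the dict a `module m(): a = 1; b = 2` block would produce
m = module('m', (), {'__doc__': 'a bunch of attributes', 'a': 1, 'b': 2})
assert isinstance(m, types.ModuleType)
assert (m.a, m.b) == (1, 2)
```

The same callable would also work today as a `__metaclass__` hook, which is exactly the "abuse" discussed earlier in the thread.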


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-24 Thread Michele Simionato
On 10/23/05, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> Very nice indeed. I'd be more supportive if it was defined as a new statement
> such as "create" with the syntax:
>
> create TYPE NAME(ARGS):
>     BLOCK

I like it, but it would require a new keyword. Alternatively, one
could abuse 'def':

def TYPE NAME(ARGS):
    BLOCK

but then people would likely be confused, as Skip was earlier in this thread,
so I guess 'def' is not an option.

IMHO a new keyword could be justified for such a powerful feature,
but only Guido's opinion counts on these matters ;)

Anyway I expected people to criticize the proposal as too powerful and
dangerously close to Lisp macros.

 Michele Simionato


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-24 Thread Michele Simionato
On 10/24/05, Josiah Carlson <[EMAIL PROTECTED]> wrote:
> I would criticise it for being dangerously close to worthless.  With the
> minor support code that I (and others) have offered, no new syntax is
> necessary.
>
> You can get the same semantics with...
>
> class NAME(_(TYPE), ARGS):
> BLOCK
>
> And a suitably defined _.  Remember, not every X line function should be
> made a builtin or syntax.
>
>  - Josiah

Could you re-read my original message, please? Sugar is *everything*
in this case. If the functionality is to be implemented via a __metaclass__
hook, then it should be considered a hack that nobody in his right mind
should use. OTOH, if there is a specific syntax for it, then it means
that the usage has the benediction of the BDFL. This would be a HUGE change.
For instance, I would never abuse metaclasses for that, whereas I
would freely use a 'create' statement.

   Michele Simionato


Re: [Python-Dev] Definining properties - a use case for class decorators?

2005-10-24 Thread Michele Simionato
On 10/24/05, Ronald Oussoren <[EMAIL PROTECTED]> wrote:
> I'd say using a class statement to define a property is metaclass
> abuse, as would anything that wouldn't define something class-like.
> The same is true for other constructs; using a decorator to define
> something that is not a callable would IMHO also be abuse.

+1

> That said, I don't really have an opinion on the 'create' statement
> proposal yet. It does seem to have a very limited field of use.

This is definitely not true. The 'create' statement would have lots of
applications. Off the top of my head, I can think of 'create' applied to:

- bunches;
- modules;
- interfaces;
- properties;
- usage in frameworks, for instance providing sugar for
  Object-Relational mappers or for making templates (e.g. a 'create HTMLPage');
- building custom minilanguages;
- ...

This is why I say the 'create' statement is a frighteningly powerful
addition to the language.

 Michele Simionato