I'm trying to write a class factory to create new classes dynamically at
runtime from simple 'definition' files that happen to be written in Python as
well. I'm using a class factory since I couldn't find a way to use properties
on dynamically generated instances. For example, I would prefer this, but it
doesn't work:
class Status(object):
    def __init__(self, definitions):
        for key, function in definitions:
            # setting a property on the instance like this has no effect
            setattr(self, key, property(function))
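By 'doesn't work' I mean that reading the attribute just hands back the
property object instead of calling the function (uptime is just a name I made
up for illustration):

def uptime(self):
    return 12345

s = Status([("uptime", uptime)])
print s.uptime    # prints something like <property object at 0x...>, not 12345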
This works (and it's fine by me):
def makeStatus(definitions):
    class Status(object):
        pass
    for key, function in definitions:
        setattr(Status, key, property(function))
    return Status()
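For reference, the definitions are just (name, callable) pairs, so the call
looks roughly like this (the names are made up):

def cpu_usage(self):
    # the real function may be expensive; it only runs on attribute access
    return 42

def disk_free(self):
    return 1024

status = makeStatus([("cpu", cpu_usage), ("disk", disk_free)])
print status.cpu    # calls cpu_usage(status) on access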
But I would also like the functions to be evaluated only when necessary, since
some of them may be costly, so I want to do the following:
def makeStatus(definitions):
    class Status(object):
        pass
    for key, function, data in definitions:
        setattr(Status, key, property(lambda x: function(data)))
    return Status()
But all my properties now act as if they were invoked with the same data, even
though each one should have been a new lambda with its own associated data. It
seems as if Python is 'optimizing' all the lambdas into the same object, even
though that's clearly not what I want. Does anyone have any suggestions as to:
1) why this happens
2) what I should do about it
3) a better way to implement this pattern
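In case it helps, here is a stripped-down case (no properties, just lambdas in
a loop) that I believe shows the same behaviour:

funcs = []
for data in (1, 2, 3):
    funcs.append(lambda: data)

# I expected [1, 2, 3], but every lambda returns the value from the
# last pass through the loop:
print [f() for f in funcs]    # -> [3, 3, 3]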
Cheers,
-Craig