creating many similar properties


Lee Harr

I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?

Thanks for your time.
 

James Stroud

Lee said:
I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?

Thanks for your time.

You want a "class factory". This can either be the esoteric "metaclass"
(using "type"), which one can only understand for moments at a time, or
something more grounded in intuition, or a combo (for fun). You can
probably tighten this a little, but this is the idea:

def c_factory(cname, pname, num, modulus, default):
    def accessorize(prop):
        def _get(self):
            return getattr(self, prop)
        def _set(self, v):
            v_new = v % modulus
            setattr(self, prop, v_new)
        return (_get, _set)
    def _init(self):
        for i in xrange(num):
            setattr(self, '%s%s' % (pname, i), default)
    _C = type(cname, (object,), {'__init__': _init})
    for i in xrange(num):
        prop = '_%s%s' % (pname, i)
        setattr(_C, '%s%s' % (pname, i), property(*accessorize(prop)))
    return _C

E.g.:


py> def c_factory(cname, pname, num, modulus, default):
....     def accessorize(prop):
....         def _get(self):
....             return getattr(self, prop)
....         def _set(self, v):
....             v_new = v % modulus
....             setattr(self, prop, v_new)
....         return (_get, _set)
....     def _init(self):
....         for i in xrange(num):
....             setattr(self, '%s%s' % (pname, i), default)
....     _C = type(cname, (object,), {'__init__': _init})
....     for i in xrange(num):
....         prop = '_%s%s' % (pname, i)
....         setattr(_C, '%s%s' % (pname, i), property(*accessorize(prop)))
....     return _C
....
py> Bob = c_factory('Bob', 'bob', 4, 256, 128)
py> b = Bob()
py> b.bob0
128
py> b.bob1
128
py> b.bob1 = 258
py> b.bob1
2
py> b.bob3
128
py> dir(b)

['__class__',
'__delattr__',
'__dict__',
'__doc__',
'__getattribute__',
'__hash__',
'__init__',
'__module__',
'__new__',
'__reduce__',
'__reduce_ex__',
'__repr__',
'__setattr__',
'__str__',
'__weakref__',
'_bob0',
'_bob1',
'_bob2',
'_bob3',
'bob0',
'bob1',
'bob2',
'bob3']
py> Bob
<class '__main__.Bob'>




--
James Stroud
UCLA-DOE Institute for Genomics and Proteomics
Box 951570
Los Angeles, CA 90095

http://www.jamesstroud.com/
 

Marc 'BlackJack' Rintsch

But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?

Use `__getattr__()` and `__setattr__()` methods and a dictionary of names
mapped to values and another that maps the names to default values for the
modulo operation.
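A possible sketch of this suggestion (the attribute names and the 127 default come from the original question; everything else is illustrative):

```python
class RC(object):
    # names mapped to default values; the modulus is applied on assignment
    _defaults = {'pwm01': 127, 'pwm02': 127}

    def __init__(self):
        # store values in a plain dict; object.__setattr__ avoids
        # triggering our own __setattr__ during initialization
        object.__setattr__(self, '_values', dict(self._defaults))

    def __getattr__(self, name):
        # called only when normal attribute lookup fails
        try:
            return self._values[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        if name in self._defaults:
            self._values[name] = value % 256
        else:
            object.__setattr__(self, name, value)

rc = RC()
assert rc.pwm01 == 127   # default
rc.pwm02 = 1312
assert rc.pwm02 == 32    # 1312 % 256
```

This trades the per-name property objects for two hook methods and a dict, at the cost of intercepting every attribute assignment.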

Ciao,
Marc 'BlackJack' Rintsch
 

Michele Simionato

Lee said:
I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?

Yes, what about this?

import sys

def defprop(name, default=127):
    loc = sys._getframe(1).f_locals
    prop = '_%s' % name
    def _set(self, v):
        v_new = v % 256
        setattr(self, prop, v_new)
    def _get(self):
        return getattr(self, prop, default)
    loc[name] = property(_get, _set)

class RC(object):
    defprop('pwm01')
    defprop('pwm02')

rc = RC()

print rc.pwm01 # 127
print rc.pwm02 # 127
rc.pwm02 = 1312
print rc.pwm02 # 32

This is a bit hackish, but I would prefer it over a metaclass
solution, since it does not add any hidden magic to your class.

Michele Simionato
 

Carl Banks

Lee said:
I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?


The metaclass solution. I use this idiom occasionally, whenever I want
to fiddle with the class dict before letting the type constructor at
it.

class mod256metatype(type):
    def __new__(metatype, name, bases, clsdict):
        for sym in clsdict.get('__mod256__', ()):
            prop = '_%s' % sym
            def _set(self, v):
                setattr(self, prop, v % 256)
            def _get(self):
                return getattr(self, prop)
            clsdict[sym] = property(_get, _set)
        return type.__new__(metatype, name, bases, clsdict)

class RC(object):
    __metaclass__ = mod256metatype
    __mod256__ = ["pwm01", "pwm02"]


Carl
 

George Sakkis

Michele said:
Lee said:
I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?

Yes, what about this?

import sys

def defprop(name, default=127):
    loc = sys._getframe(1).f_locals
    prop = '_%s' % name
    def _set(self, v):
        v_new = v % 256
        setattr(self, prop, v_new)
    def _get(self):
        return getattr(self, prop, default)
    loc[name] = property(_get, _set)

class RC(object):
    defprop('pwm01')
    defprop('pwm02')

rc = RC()

print rc.pwm01 # 127
print rc.pwm02 # 127
rc.pwm02 = 1312
print rc.pwm02 # 32

This is a bit hackish, but I would prefer it over a metaclass
solution, since it does not add any hidden magic to your class.

Why is this less hidden or magical than a metaclass ? I'd prefer the
following instead:

from itertools import chain, izip, repeat

def ByteProperties(*names, **defaulted_names):
    def byte_property(name, default):
        return property(lambda self: getattr(self, name, default),
                        lambda self, v: setattr(self, name, v % 256))
    def make_class(clsname, bases, dict):
        for name, default in chain(izip(names, repeat(127)),
                                   defaulted_names.iteritems()):
            assert name not in dict # sanity check
            dict[name] = byte_property('_' + name, default)
        return type(clsname, bases, dict)
    return make_class


class RC(object):
    __metaclass__ = ByteProperties('pwm01', pwm02=64)

rc = RC()
print rc.pwm01, rc.pwm02 # 127 64
rc.pwm01 = 1312
print rc.pwm01, rc.pwm02 # 32 64


George
 

George Sakkis

Carl said:
Lee said:
I understand how to create a property like this:

class RC(object):
    def _set_pwm(self, v):
        self._pwm01 = v % 256
    def _get_pwm(self):
        return self._pwm01
    pwm01 = property(_get_pwm, _set_pwm)


But what if I have a whole bunch of these pwm properties?

I made this:

class RC(object):
    def _makeprop(name):
        prop = '_%s' % name
        def _set(self, v):
            v_new = v % 256
            setattr(self, prop, v_new)
        def _get(self):
            return getattr(self, prop)
        return property(_get, _set)

    pwm01 = _makeprop('pwm01')
    pwm02 = _makeprop('pwm02')


Not too bad, except for having to repeat the name.

I would like to just have a list of pwm names and
have them all set up like that. It would be nice if
each one was set to a default value of 127 also....

Any thoughts?


The metaclass solution. I use this idiom occasionally, whenever I want
to fiddle with the class dict before letting the type constructor at
it.

class mod256metatype(type):
    def __new__(metatype, name, bases, clsdict):
        for sym in clsdict.get('__mod256__', ()):
            prop = '_%s' % sym
            def _set(self, v):
                setattr(self, prop, v % 256)
            def _get(self):
                return getattr(self, prop)
            clsdict[sym] = property(_get, _set)
        return type.__new__(metatype, name, bases, clsdict)

class RC(object):
    __metaclass__ = mod256metatype
    __mod256__ = ["pwm01", "pwm02"]

There's a subtle common bug here: all _get and _set closures will refer
to the last property only. You have to remember to write
"def _set(self, v, prop=prop)" and similarly for _get to do the right thing.
By the way, I can't think of a case where the current behavior (i.e.
binding the last value only) is the desired one. Is this just an
implementation wart or am I missing something ?
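The pitfall can be seen in a minimal, self-contained sketch (hypothetical names):

```python
def make_getters(names):
    # buggy: each _get closes over the loop variable itself
    getters = []
    for name in names:
        def _get(obj):
            return getattr(obj, name)  # 'name' is looked up at call time
        getters.append(_get)
    return getters

def make_getters_fixed(names):
    getters = []
    for name in names:
        def _get(obj, name=name):  # default argument freezes the current value
            return getattr(obj, name)
        getters.append(_get)
    return getters

class Obj(object):
    a, b = 1, 2

o = Obj()
assert [g(o) for g in make_getters(['a', 'b'])] == [2, 2]        # both see 'b'
assert [g(o) for g in make_getters_fixed(['a', 'b'])] == [1, 2]  # each keeps its own name
```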

George
 

Michele Simionato

George said:
Why is this less hidden or magical than a metaclass ?

Because it does not use inheritance. It is not going to create
properties on subclasses without you noticing it. Also, metaclasses are
brittle: try to use them with __slots__, or with non-standard classes
(i.e. extension classes), or try to use multiple metaclasses. I wrote a
paper about metaclass abuses which should be published sooner or later;
you can see the draft here:
http://www.phyast.pitt.edu/~micheles/python/classinitializer.html

Michele Simionato
 

Michele Simionato

George said:
from itertools import chain, izip, repeat

def ByteProperties(*names, **defaulted_names):
    def byte_property(name, default):
        return property(lambda self: getattr(self, name, default),
                        lambda self, v: setattr(self, name, v % 256))
    def make_class(clsname, bases, dict):
        for name, default in chain(izip(names, repeat(127)),
                                   defaulted_names.iteritems()):
            assert name not in dict # sanity check
            dict[name] = byte_property('_' + name, default)
        return type(clsname, bases, dict)
    return make_class


class RC(object):
    __metaclass__ = ByteProperties('pwm01', pwm02=64)

Notice that you are NOT using a custom metaclass here, you are just
using the metaclass hook, and you will avoid all the issues of custom
metaclasses. This is exactly the approach I advocate in the paper I
referred to before, so I think your solution is pretty safe in that
respect. Still, I think that in this particular problem avoiding
__metaclass__ altogether is possible and should be preferred, just for
the sake of simplicity, not of safety.

Michele Simionato


Carl Banks

George said:
Michele said:
import sys

def defprop(name, default=127):
    loc = sys._getframe(1).f_locals
    prop = '_%s' % name
    def _set(self, v):
        v_new = v % 256
        setattr(self, prop, v_new)
    def _get(self):
        return getattr(self, prop, default)
    loc[name] = property(_get, _set)

class RC(object):
    defprop('pwm01')
    defprop('pwm02')

rc = RC()

print rc.pwm01 # 127
print rc.pwm02 # 127
rc.pwm02 = 1312
print rc.pwm02 # 32

This is a bit hackish, but I would prefer it over a metaclass
solution, since it does not add any hidden magic to your class.

Why is this less hidden or magical than a metaclass ?

Devil's Advocate: he did say "hidden magic TO YOUR CLASS".

If you use a (real) metaclass, then you have the icky feeling of a
class permanently tainted by the unclean metaclass (even though the
metaclass does nothing other than touch the class dict upon creation);
whereas if you use Michele Simionato's hack, the icky feeling of using
a stack frame object goes away after the property is created: you are
left with a clean untainted class.

Personally, the former doesn't make me feel icky at all.


Carl Banks
 

Michele Simionato

Carl said:
Devil's Advocate: he did say "hidden magic TO YOUR CLASS".

If you use a (real) metaclass, then you have the icky feeling of a
class permanently tainted by the unclean metaclass (even though the
metaclass does nothing other than touch the class dict upon creation);
whereas if you use Michele Simionato's hack, the icky feeling of using
a stack frame object goes away after the property is created: you are
left with a clean untainted class.

Yep, exactly.
Personally, the former doesn't make me feel icky at all.

Please, do this experiment: take all classes defined in the Python
standard library and add
to them a custom do-nothing metaclass. See what happens.


Michele Simionato
 

Carl Banks

George said:
There's a subtle common bug here: all _get and _set closures will refer
to the last property only. You have to remember to write "def
_set(self,v,prop=prop)" and similarly for _get to do the right thing.

Sorry. My mistake.

By the way, I can't think of a case where the current behavior (i.e.
binding the last value only) is the desired one. Is this just an
implementation wart or am I missing something ?

def some_function(a):
    def printvars():
        print "DEBUG: %r,%r,%r" % (a, b, i)
    for i in some_range():
        b = something(i)
        printvars()

If you fix the value of the closure at function definition time,
printvars() above doesn't work. One way or another, someone's going to
get surprised. Better to let it be the experts.


Carl
 

Carl Banks

Michele said:
Yep, exactly.


Please, do this experiment: take all classes defined in the Python
standard library and add
to them a custom do-nothing metaclass. See what happens.

Do you expect the result to be better or worse than if you applied
stack frame hacks to the whole library?

Come on, I don't think anyone's under the impression we're being
indiscriminate here.


Carl Banks

(BTW, most of the standard library still uses old-style classes.)
 

Michele Simionato

Carl said:
Come on, I don't think anyone's under the impression we're being
indiscriminate here.

Ok, but I don't think that in the case at hand we should recommend a
metaclass solution.

Michele Simionato
 

Carl Banks

Michele said:
Ok, but I don't think that in the case at hand we should recommend a
metaclass solution.

You sound as if you're avoiding metaclasses just for the sake of
avoiding them, which is just as bad as using them for the sake of using
them.

Here's how I see it: either it's ok to fiddle with the class dict, or
it isn't. If it's ok, then a metaclass is the way to do it. If it's
not ok to fiddle with the class dict, then he should be using
__setattr__, or creating properties longhand. Messing with frames is
not the answer for production code.

....

Just for the hell of it, I decided to accept your challenge to run the
standard library with a different metaclass applied to all new-style
classes. I ran the 2.4.3 regression test, replacing the builtin object
with a class that used the mod256metaclass I presented in this thread.
The results:

229 tests OK.
34 tests failed:
test___all__ test_asynchat test_cgi test_cookielib test_copy
test_copy_reg test_cpickle test_decimal test_descr test_descrtut
test_email test_email_codecs test_httplib test_imaplib
test_inspect test_logging test_mailbox test_mimetools
test_mimetypes test_minidom test_pickle test_pyclbr
test_robotparser test_sax test_sets test_socket test_socket_ssl
test_sundry test_timeout test_urllib test_urllib2 test_urllib2net
test_urllibnet test_xpickle

Not A-OK, but not exactly mass-pandemonium either. There were plenty
of modules that used the new base object and worked fine.

There were only two causes for failure:
1. A class attempting to use __weakref__ slot.
2. There were also a few metaclass conflicts.

IMO, neither of these failure modes argues against using a metaclass to
preprocess the class dict. The __weakref__ error is not applicable,
since it's an error to use it on any class with an instance dict. (In
fact, the metaclass wasn't even causing the error: the same error would
have occurred if I had replaced the builtin object with an empty
subclass of itself, without the metaclass.)

The metaclass conflict would occur in the present case only if
someone wanted to subclass it AND specify a different metaclass.
Arguing that metaclasses should be avoided just to guard against this
rare possibility is defensive in the extreme.

I did not uncover any kinds of subtle, unexpected behavior that can
occur when metaclasses do weird things. I know such things are
possible; how likely they are is another question. The tests I ran
didn't uncover any. So for now, the results of these tests don't seem
to support your point very well.

Carl


Appendix: Here's how I ran the test. I inserted the following code at
the top of Lib/test/regrtest.py, and
ran make test. I ran the tests on the Python 2.4.3 source tree, in
Linux.

=======================
import sys

class mod256metatype(type):
    def __new__(metatype, name, bases, clsdict):
        print >> sys.__stdout__, \
            "++++++++ Creating class %s of type mod256metatype" % name
        def makeprop(sym):
            prop = '_%s' % sym
            def _set(self, v):
                setattr(self, prop, v % 256)
            def _get(self):
                return getattr(self, prop)
            return property(_get, _set)
        for sym in clsdict.get('__mod256__', ()):
            clsdict[sym] = makeprop(sym)
        return type.__new__(metatype, name, bases, clsdict)

class _object:
    __metaclass__ = mod256metatype

import __builtin__
__builtin__.object = _object
=======================
 

James Stroud

Michele said:
Because it does not use inheritance. It is not going to create
properties on subclasses without you noticing it. Also, metaclasses are
brittle: try to use them with __slots__, or with non-standard classes
(i.e. extension classes), or try to use multiple metaclasses. I wrote a
paper about metaclass abuses which should be published sooner or later;
you can see the draft here:
http://www.phyast.pitt.edu/~micheles/python/classinitializer.html

I am in the midst of reading this paper, and, while I don't share your
level of expertise, I modestly share your opinion that solving problems
with dynamically generated classes unnecessarily complicates code. For
example, the problem of the OP could be solved easily with a dictionary
and two functions used to access the dictionary and then, if making a
class were truly necessary, including these functions as members of the
class. But the OP mentioned something about "properties" and all hell
broke loose.

However, I think that what you are saying about metaclasses being
brittle relates more to implementation than language. In theory, these
should be equivalent:

(1) class Bob(object): pass
(2) Bob = type('Bob', (), {})

And indeed a cursory inspection of the resulting classes show that they
are indistinguishable.

That they wouldn't be seems an implementation bug, and perhaps that bug
should be fixed rather than promoting the avoidance of (2) because it
does not create classes that behave like those from (1).

James


--
James Stroud
UCLA-DOE Institute for Genomics and Proteomics
Box 951570
Los Angeles, CA 90095

http://www.jamesstroud.com/
 

Michele Simionato

Carl said:
You sound as if you're avoiding metaclasses just for the sake of
avoiding them, which is just as bad as using them for the sake of using
them.

Do you realize that you are effectively saying "avoiding a complex
tool in favor of a simpler one is just as bad as avoiding the simple
tool in favor of the complex one"?
Here's how I see it: either it's ok to fiddle with the class dict, or
it isn't. If it's ok, then a metaclass is the way to do it. If it's
not ok to fiddle with the class dict, then he should be using
__setattr__, or creating properties longhand. Messing with frames is
not the answer for production code.

I agree that messing with frames is not nice (however I should note
that this is how Zope interfaces are implemented, and they have been in
production use for years) but I disagree with your point about the
class dict. You should use a custom metaclass *only if you want to
mess with the class dict of all subclasses at each derivation*: this is
rarely the case, and definitely was not requested for the OP's problem.
Just for the hell of it, I decided to accept your challenge to run the
standard library with a different metaclass applied to all new-style
classes. I ran the 2.4.3 regression test, replacing the builtin object
with a class that used the mod256metaclass I presented in this thread.
The results:

229 tests OK.
34 tests failed:
test___all__ test_asynchat test_cgi test_cookielib test_copy
test_copy_reg test_cpickle test_decimal test_descr test_descrtut
test_email test_email_codecs test_httplib test_imaplib
test_inspect test_logging test_mailbox test_mimetools
test_mimetypes test_minidom test_pickle test_pyclbr
test_robotparser test_sax test_sets test_socket test_socket_ssl
test_sundry test_timeout test_urllib test_urllib2 test_urllib2net
test_urllibnet test_xpickle

34 tests failed, worse than I expected.
Not A-OK, but not exactly mass-pandemonium either. There were plenty
of modules that used the new base object and worked fine.

There were only two causes for failure:
1. A class attempting to use __weakref__ slot.
2. There were also a few metaclass conflicts.

Yes, this agrees with my findings. I was curious to know if there were
additional issues.
IMO, neither of these failure modes argues against using a metaclass to
preprocess the class dict.

But they argue against using metaclasses in general, IMO! (or at least
against using them for users who are not aware of all the potential
pitfalls).
The __weakref__ error is not applicable, since it's an error to use it
on any class with an instance dict. (In fact, the metaclass wasn't even
causing the error: the same error would have occurred if I had replaced
the builtin object with an empty subclass of itself, without the
metaclass.)

Correct, this is more a problem of __slots__ playing havoc with
inheritance than a problem of metaclasses.
The metaclass conflict would occur in the present case only if
someone wanted to subclass it AND specify a different metaclass.
Arguing that metaclasses should be avoided just to guard against this
rare possibility is defensive in the extreme.

Not too extreme in my opinion. Real life example: I had a debugging
tool using a custom metaclass; I tried to run it on Zope 2.7 classes
and I got segmentation faults. In Zope 2.8 I get "only" metatype
conflicts, and to avoid those I had to rewrite the tool :-(
I did not uncover any kinds of subtle, unexpected behavior that can
occur when metaclasses do weird things. I know such things are
possible; how likely they are is another question. The tests I ran
didn't uncover any. So for now, the results of these tests don't seem
to support your point very well.

Well, this is a matter of opinion. In my opinion your tests support my
point pretty well, better than I expected ;)
Appendix: Here's how I ran the test. I inserted the following code at
the top of Lib/test/regrtest.py, and
ran make test. I ran the tests on the Python 2.4.3 source tree, in
Linux.

=======================
import sys

class mod256metatype(type):
    def __new__(metatype, name, bases, clsdict):
        print >> sys.__stdout__, \
            "++++++++ Creating class %s of type mod256metatype" % name
        def makeprop(sym):
            prop = '_%s' % sym
            def _set(self, v):
                setattr(self, prop, v % 256)
            def _get(self):
                return getattr(self, prop)
            return property(_get, _set)
        for sym in clsdict.get('__mod256__', ()):
            clsdict[sym] = makeprop(sym)
        return type.__new__(metatype, name, bases, clsdict)

class _object:
    __metaclass__ = mod256metatype

import __builtin__
__builtin__.object = _object
=======================

I used a similar approach. I added in sitecustomize.py the following
lines:


import __builtin__

class chatty_creation(type):
    "Print a message every time a class is created"
    def __new__(mcl, name, bases, dic):
        try:
            cls = super(chatty_creation, mcl).__new__(mcl, name, bases, dic)
        except Exception, e:
            print e
            print 'Could not enhance class %s' % name
            cls = type(name, tuple(b for b in bases if b is not Object), dic)
            # removing Object from the bases is enough only in the
            # trivial cases :-(
        else:
            print 'Creating class %s.%s' % (dic.get('__module__'), name)
        return cls

class Object:
    __metaclass__ = chatty_creation

__builtin__.object = Object

Now it is impossible to run both Zope and Twisted together due to
metatype conflicts. I haven't looked in detail, but the first conflict
in Twisted is due to a metaclass-enhanced class which is setting
properties. This is the typical example of what I call metaclass
*abuse* in my paper, since the custom metaclass could have been avoided
(for instance using George Sakkis's trick) and the conflict avoided
with it.

Now, I know how to solve the conflict
(http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/204197) but I
would rather avoid it altogether.

The problem with metaclasses is that you are adding magic to the
classes of your USERS, and users are known to play all kinds of dirty
tricks. You (speaking in general, of you as the author of a framework)
should strive to keep things as clean as possible.

I agree that the problems are rare: but just for this reason they are
prone to very subtle bugs, the hardest to find. And there is no
documentation of metaclass pitfalls AFAIK :-(


Michele Simionato
 

Michele Simionato

James said:
However, I think that what you are saying about metaclasses being
brittle relates more to implementation than language. In theory, these
should be equivalent:

(1) class Bob(object): pass
(2) Bob = type('Bob', (), {})

And indeed a cursory inspection of the resulting classes show that they
are indistinguishable.

That they wouldn't be seems an implementation bug, and perhaps that bug
should be fixed rather than promoting the avoidance of (2) because it
does not create classes that behave like those from (1).

You got something wrong ;)

'type' is the builtin metaclass; it works, I have nothing against it,
and (1) and (2) are *exactly* equivalent. My gripe is against *custom*
metaclasses, i.e. subclasses of 'type'. The paper is all about avoiding
custom metaclasses and using 'type' instead (i.e. use the
__metaclass__ hook, but not custom metaclasses). It is the same trick
used by George Sakkis in this same thread.
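The distinction can be sketched without any __metaclass__ statement at all (names are illustrative): a plain function can play the class-building role once, calling the builtin 'type' directly, and the resulting class remains an ordinary instance of 'type' with no custom metaclass attached.

```python
def add_greeting(clsname, bases, dic):
    # fiddle with the class dict once, then build with plain 'type'
    dic['greet'] = lambda self: 'hello'
    return type(clsname, bases, dic)

C = add_greeting('C', (object,), {})

assert type(C) is type        # no custom metaclass survives the construction
assert C().greet() == 'hello' # but the injected attribute is there
```

Subclasses of C are then created by plain 'type' as usual, with none of the inheritance or conflict issues a custom metaclass would bring.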

Michele Simionato
 
