Singleton-like pattern

Pedro Werneck

I need to implement an object that behaves just like immutable objects,
like int, str and tuple: if you create two objects with the same value
at the same time, you get the same object, like a singleton, but
instead of storing a unique object, store a set of objects, with no
duplicates. I don't know if this is a common pattern (and if there's a
name for it), but I remember reading something about it some time
ago... the ASPN cookbook has some similar patterns, but none works for
me...

I know there are several ways to implement this, but I think this
metaclass approach is more elegant than using __new__ and
inheritance... at least it solves my problem (I'm working on a card
game) but I am sure that there are better ways to do it... I could not
find any other value suitable as a key and using str(args) + str(kwds)
is ugly and easy to break...


class MultiSingleton(type):
    def __call__(cls, *args, **kwds):
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj
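For instance, usage looks like this (sketched with Python 3's metaclass syntax so it runs on its own; the Card class is just an illustration, not my real game code):

```python
class MultiSingleton(type):
    # same metaclass as above, repeated so the snippet is self-contained
    def __call__(cls, *args, **kwds):
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj

class Card(metaclass=MultiSingleton):
    def __init__(self, rank, suit):
        self.rank = rank
        self.suit = suit

# two "different" cards with the same value are the same object
assert Card('ace', 'spades') is Card('ace', 'spades')
assert Card('ace', 'spades') is not Card('two', 'hearts')
```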


Any help will be much appreciated... thanks in advance

Pedro Werneck
 
Michele Simionato

I need to implement an object that behaves just like immutable objects,
like int, str and tuple: if you create two objects with the same value
at the same time, you get the same object, like a singleton, but
instead of storing a unique object, store a set of objects, with no
duplicates. I don't know if this is a common pattern (and if there's a
name for it), but I remember reading something about it some time
ago... the ASPN cookbook has some similar patterns, but none works for
me...

I know there are several ways to implement this, but I think this
metaclass approach is more elegant than using __new__ and
inheritance... at least it solves my problem (I'm working on a card
game) but I am sure that there are better ways to do it... I could not
find any other value suitable as a key and using str(args) + str(kwds)
is ugly and easy to break...


class MultiSingleton(type):
    def __call__(cls, *args, **kwds):
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj


Any help will be much appreciated... thanks in advance

Pedro Werneck

"memoize" is a possible name for this. Notice that the metaclass is a
bit of overkill; you may well use a simple function for this job.
About the issue of finding a suitable key, in the same situation I have
used the tuple (args, kw) as key. But I too would like to ask if this is
a good idea. What's the customary solution for getting a good key from
a dictionary?
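Something like this, I mean (just a sketch; the key has the same weakness as yours):

```python
def memoize(cls):
    # wrap a class so that calls with the same arguments
    # return the same instance
    cache = {}
    def wrapper(*args, **kwds):
        tag = str(args) + str(kwds)   # same fragile key as in your version
        if tag not in cache:
            cache[tag] = cls(*args, **kwds)
        return cache[tag]
    return wrapper

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

Point = memoize(Point)
assert Point(1, 2) is Point(1, 2)
```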


Michele
 
Bengt Richter

[email protected] (Pedro Werneck) wrote:
I need to implement an object that behaves just like immutable objects,
like int, str and tuple: if you create two objects with the same value
at the same time, you get the same object, like a singleton, but
instead of storing a unique object, store a set of objects, with no
duplicates. I don't know if this is a common pattern (and if there's a
name for it), but I remember reading something about it some time
ago... the ASPN cookbook has some similar patterns, but none works for
me...

I know there are several ways to implement this, but I think this
metaclass approach is more elegant than using __new__ and
inheritance... at least it solves my problem (I'm working on a card
game) but I am sure that there are better ways to do it... I could not
find any other value suitable as a key and using str(args) + str(kwds)
is ugly and easy to break...


class MultiSingleton(object):              # was: class MultiSingleton(type)
    def __new__(cls, *args, **kwds):       # was: def __call__(cls, *args, **kwds)
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = '%r%r' % (args, kwds)        # might be an alternative to str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj


Any help will be much appreciated... thanks in advance

Pedro Werneck

Michele Simionato said:
"memoize" is a possible name for this. Notice that the metaclass is a
bit of overkill; you may well use a simple function for this job.

Why not just change as above, and subclass to inherit the behavior
(and create subclass-specific caches)?

Michele Simionato said:
About the issue of finding a suitable key, in the same situation I have
used the tuple (args, kw) as key. But I too would like to ask if this is
a good idea. What's the customary solution for getting a good key from
a dictionary?

Does (args, kw) work in general? I would think you could easily get
something unhashable. I would also think using a tuple of actual args may
be a bad idea, since it would prevent callers' temp args from being
garbage collected. I use repr to avoid that, maybe mistakenly?

Regards,
Bengt Richter
 
Carl Banks

Michele said:
About the issue of finding a suitable key, in the same situation I have
used the tuple (args,kw) as key.

Surely not? Perhaps you meant (args, tuple(kw.items()))? That gets
rid of the unhashable kw dict.
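That is (a quick sketch; this only works if the values themselves are hashable, of course):

```python
def make_key(args, kw):
    # a dict is unhashable, but a tuple of its items is,
    # provided all the values are hashable
    return (args, tuple(kw.items()))

cache = {}
cache[make_key(('ace',), {'suit': 'spades'})] = 'some object'
assert make_key(('ace',), {'suit': 'spades'}) in cache
```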

But me too I would like to ask if this is a
good idea. What's the custom solution for getting a good key from
a dictionary ?

Howabout cPickle.dumps((args,kwargs),1)?
 
Pedro Werneck

class MultiSingleton(object):              # was: class MultiSingleton(type)
    def __new__(cls, *args, **kwds):       # was: def __call__(cls, *args, **kwds)
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = '%r%r' % (args, kwds)        # might be an alternative to str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj

This is exactly what I did at first... I only changed it to a metaclass
because I was adding it to a module that already contains some other
metaclasses I use; I think it's more elegant, as I said before... a
'quick and dirty' benchmark with the timeit module returned the
following results, and since in this project I will be dealing with a
database of 7000+ items, that 20% may help a little:

__metaclass__:
[39.744094967842102, 40.455733060836792, 42.027853965759277]

__new__:
[47.914013981819153, 48.721022009849548, 49.430392026901245]


"memoize" is a possible name for this. Notice that the metaclass is a
bit of overkill, you may well use a simple function for this job.

Hey... you wrote such a good article on python metaclasses and now don't
want people to use them ? :)
Bengt Richter said:
Does (args, kw) work in general? I would think you could easily get
something unhashable. I would also think using a tuple of actual args may
be a bad idea, since it would prevent callers' temp args from being
garbage collected. I use repr to avoid that, maybe mistakenly?
Michele Simionato said:
About the issue of finding a suitable key, in the same situation I have
used the tuple (args, kw) as key. But I too would like to ask if this is
a good idea. What's the customary solution for getting a good key from
a dictionary?

Actually, (args, kw) doesn't work at all, even if you only get hashable
arguments... the 'kw' dict invalidates it as a key... besides, there's the
garbage collection problem, as Bengt Richter mentioned...

The "%r%r" % (args, kwds) works, as well as str((args, kwds)), but it
breaks if you get as argument an object with the default __repr__, as the
object id may be different even if the objects are equal. Overriding
__repr__ and returning a real representation avoids this problem.
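That is, something like this (a small sketch of the failure and the fix; the class names are just for illustration):

```python
class Plain:
    # relies on the default __repr__, which embeds the object's id
    def __init__(self, v):
        self.v = v

class WithRepr:
    def __init__(self, v):
        self.v = v
    def __repr__(self):
        # a "real" representation, depending only on the value
        return 'WithRepr(%r)' % self.v

a, b = Plain(1), Plain(1)
assert repr(a) != repr(b)                       # equal values, different keys: cache broken
assert repr(WithRepr(1)) == repr(WithRepr(1))   # safe as a cache key
```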

Thanks for your time.

Pedro Werneck
 
Michele Simionato

class MultiSingleton(object):              # was: class MultiSingleton(type)
    def __new__(cls, *args, **kwds):       # was: def __call__(cls, *args, **kwds)
        cache = cls.__dict__.get('__cache__')
        if cache is None:
            cls.__cache__ = cache = {}
        tag = '%r%r' % (args, kwds)        # might be an alternative to str(args) + str(kwds)
        if tag in cache:
            return cache[tag]
        obj = object.__new__(cls)
        obj.__init__(*args, **kwds)
        cache[tag] = obj
        return obj

This is exactly what I did at first... I only changed it to a metaclass
because I was adding it to a module that already contains some other
metaclasses I use; I think it's more elegant, as I said before... a
'quick and dirty' benchmark with the timeit module returned the
following results, and since in this project I will be dealing with a
database of 7000+ items, that 20% may help a little:

__metaclass__:
[39.744094967842102, 40.455733060836792, 42.027853965759277]

__new__:
[47.914013981819153, 48.721022009849548, 49.430392026901245]


"memoize" is a possible name for this. Notice that the metaclass is a
bit of overkill, you may well use a simple function for this job.

Hey... you wrote such a good article on python metaclasses and now don't
want people to use them ? :)

Maybe it is because I know them that I don't want to abuse them ;)
Consider also that not everybody knows about metaclasses, and
I want my code to be readable to others; finally, there is
Occam's razor argument (i.e., do not use metaclasses without necessity).
Pedro Werneck said:
Actually, (args, kw) doesn't work at all, even if you only get hashable
arguments... the 'kw' dict invalidates it as a key... besides, there's the
garbage collection problem, as Bengt Richter mentioned...

Yes, as I replied to Carl Banks, I actually had problems with
(args, kw) and I think in the end I used args + tuple(kw.iteritems()),
but I had forgotten that at the time of posting.
The "%r%r" % (args, kwds) works, as well as str((args, kwds)), but it
breaks if you get as argument an object with the default __repr__, as the
object id may be different even if the objects are equal. Overriding
__repr__ and returning a real representation avoids this problem.

Thanks for your time.

Pedro Werneck


Michele
 
Steven Taschuk

Quoth Michele Simionato:
[...]
Actually you are right, I remember now that I got problems with
(args, kw) and in the end I used (args, tuple(kw.items())). But I was
unsure if this was a good solution.

I'd worry about the order of kw.items().
 
Michele Simionato

Steven Taschuk said:
Quoth Michele Simionato:
[...]
Actually you are right, I remember now that I got problems with
(args, kw) and in the end I used (args, tuple(kw.items())). But I was
unsure if this was a good solution.

I'd worry about the order of kw.items().

Yes, indeed. Also, the args tuple can contain nested dictionaries and
become unhashable; in such a case I should recursively flatten all
the dictionaries to tuples, taking into account the ordering.
Too much work for me... I really have to look at the cPickle solution.
How does it solve the ordering issue?

Michele
 
Carl Banks

Michele said:
Steven Taschuk said:
Quoth Michele Simionato:
[...]
Actually you are right, I remember now that I got problems with
(args, kw) and in the end I used (args, tuple(kw.items())). But I was
unsure if this was a good solution.

I'd worry about the order of kw.items().

Yes, indeed. Also, the args tuple can contain nested dictionaries and
become unhashable; in such a case I should recursively flatten all
the dictionaries to tuples, taking into account the ordering.
Too much work for me... I really have to look at the cPickle solution.
How does it solve the ordering issue?


Unfortunately, not well.
'(dp1\nI9\nI1\nsI1\nI1\ns.'
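The trouble is easy to reproduce (sketched here with today's pickle module, where dicts remember insertion order; the particular dicts are just an illustration):

```python
import pickle

d1 = {1: 1, 9: 1}
d2 = {9: 1, 1: 1}
assert d1 == d2
# equal dicts, but the items are pickled in iteration order,
# so the two serializations (and thus the cache keys) differ:
assert pickle.dumps(d1) != pickle.dumps(d2)
```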
 
