Removing inheritance (decorator pattern?)


George Sakkis

I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").

However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only do they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

class Joinable(object):
    def __init__(self, words):
        self.__words = list(words)
    def join(self, delim=','):
        return delim.join(self.__words)

class Sorted(Joinable):
    def __init__(self, words):
        super(Sorted,self).__init__(sorted(words))
    def join(self, delim=','):
        return '[Sorted] %s' % super(Sorted,self).join(delim)

class Reversed(Joinable):
    def __init__(self, words):
        super(Reversed,self).__init__(reversed(words))
    def join(self, delim=','):
        return '[Reversed] %s' % super(Reversed,self).join(delim)

class SortedReversed(Sorted, Reversed):
    pass

class ReversedSorted(Reversed, Sorted):
    pass

if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()


So I'm wondering, is the decorator pattern applicable here? If yes,
how? If not, is there another way to convert inheritance to
delegation?

George


[1] http://en.wikipedia.org/wiki/Decorator_pattern
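
For concreteness, here is a minimal, hypothetical sketch of the standard, delegation-based decorator applied to the Joinable class above (the wrapper classes are made up for illustration). Note that such a wrapper can only change the behaviour of join(); it has no say in how the wrapped object's word list was built, which is exactly the twist described above.

class JoinDecorator(object):
    # Standard decorator: hold a reference to an already constructed
    # Joinable and delegate to it instead of inheriting from it.
    def __init__(self, joinable):
        self._joinable = joinable
    def join(self, delim=','):
        return self._joinable.join(delim)

class Prefixed(JoinDecorator):
    # Overrides join() only; it cannot influence Joinable.__init__.
    def join(self, delim=','):
        return '[Decorated] %s' % self._joinable.join(delim)

if __name__ == '__main__':
    words = 'this is a test'.split()
    print Prefixed(Joinable(words)).join()   # -> [Decorated] this,is,a,test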
 

Diez B. Roggisch

George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").

However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

class Joinable(object):
    def __init__(self, words):
        self.__words = list(words)
    def join(self, delim=','):
        return delim.join(self.__words)

class Sorted(Joinable):
    def __init__(self, words):
        super(Sorted,self).__init__(sorted(words))
    def join(self, delim=','):
        return '[Sorted] %s' % super(Sorted,self).join(delim)

class Reversed(Joinable):
    def __init__(self, words):
        super(Reversed,self).__init__(reversed(words))
    def join(self, delim=','):
        return '[Reversed] %s' % super(Reversed,self).join(delim)

class SortedReversed(Sorted, Reversed):
    pass

class ReversedSorted(Reversed, Sorted):
    pass

if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()


So I'm wondering, is the decorator pattern applicable here ? If yes,
how ? If not, is there another way to convert inheritance to
delegation ?

Factory - and dynamic subclassing, as shown here:

import random

class A(object):
    pass

class B(object):
    pass


def create_instance():
    superclasses = tuple(random.sample([A, B], random.randint(1, 2)))
    class BaseCombiner(type):

        def __new__(mcs, name, bases, d):
            bases = superclasses + bases
            return type(name, bases, d)

    class Foo(object):
        __metaclass__ = BaseCombiner
    return Foo()

for _ in xrange(10):
    f = create_instance()
    print f.__class__.__bases__



Diez
 

Terry Reedy

|I have a situation where one class can be customized with several
| orthogonal options. Currently this is implemented with (multiple)
| inheritance but this leads to combinatorial explosion of subclasses as
| more orthogonal features are added. Naturally, the decorator pattern
| [1] comes to mind (not to be confused with the Python meaning of
| the term "decorator").
|
| [1] http://en.wikipedia.org/wiki/Decorator_pattern

I read the first part of the article. The following
"This difference becomes most important when there are several independent
ways of extending functionality. In some object-oriented programming
languages, classes cannot be created at runtime, and it is typically not
possible to predict what combinations of extensions will be needed at
design time. This would mean that a new class would have to be made for
every possible combination"

suggests to me that this pattern is not needed in Python, where all user
classes are created at runtime. One can define a class factory with an
'extensions' parameter that creates a class and adds methods according to
the extensions. One could even, for instance, start with a basic text for
__init__, add lines according to the extensions, compile the definition,
and add *that*. One could even generate a particularized value for the
__name__ attribute.

If it is not important that instances in the general group have different
__class__ attributes, one might consider a master class with all methods
and an init function that only adds the data attributes needed
(borderwidth, scroller state, etc.).

I did not read your toy example enough to quite see how it connected to the
wiki article.

tjr
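
As a rough illustration of the class-factory idea above (a hedged sketch of my own, not Terry's code; the factory function, the extension dictionaries and their keys are all made up for the example):

# Hypothetical sketch: a class factory keyed on an 'extensions' parameter.
# Each extension contributes a data transformation and a prefix; the factory
# assembles one class per requested combination at runtime, with its own
# generated __init__, join() and __name__.

def make_joinable_class(extensions=()):
    def __init__(self, words):
        words = list(words)
        for ext in extensions:              # let each extension transform the data
            words = ext['transform'](words)
        self._words = words

    def join(self, delim=','):
        prefixes = ' '.join(ext['prefix'] for ext in extensions)
        return '%s %s' % (prefixes, delim.join(self._words))

    name = 'Joinable' + ''.join(ext['name'] for ext in extensions)
    return type(name, (object,), {'__init__': __init__, 'join': join})

SORTED = {'name': 'Sorted', 'prefix': '[sorted]', 'transform': sorted}
REVERSED = {'name': 'Reversed', 'prefix': '[reversed]',
            'transform': lambda ws: list(reversed(ws))}

if __name__ == '__main__':
    words = 'this is a test'.split()
    cls = make_joinable_class((SORTED, REVERSED))
    print cls.__name__                      # JoinableSortedReversed
    print cls(words).join()

The point is only that combinations are assembled on demand, so no SortedReversed/ReversedSorted classes need to be written by hand.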
 

Diez B. Roggisch

Diez said:
George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").

However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

class Joinable(object):
    def __init__(self, words):
        self.__words = list(words)
    def join(self, delim=','):
        return delim.join(self.__words)

class Sorted(Joinable):
    def __init__(self, words):
        super(Sorted,self).__init__(sorted(words))
    def join(self, delim=','):
        return '[Sorted] %s' % super(Sorted,self).join(delim)

class Reversed(Joinable):
    def __init__(self, words):
        super(Reversed,self).__init__(reversed(words))
    def join(self, delim=','):
        return '[Reversed] %s' % super(Reversed,self).join(delim)

class SortedReversed(Sorted, Reversed):
    pass

class ReversedSorted(Reversed, Sorted):
    pass

if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()


So I'm wondering, is the decorator pattern applicable here ? If yes,
how ? If not, is there another way to convert inheritance to
delegation ?

Factory - and dynamic subclassing, as shown here:

import random

class A(object):
    pass

class B(object):
    pass


def create_instance():
    superclasses = tuple(random.sample([A, B], random.randint(1, 2)))
    class BaseCombiner(type):

        def __new__(mcs, name, bases, d):
            bases = superclasses + bases
            return type(name, bases, d)

    class Foo(object):
        __metaclass__ = BaseCombiner
    return Foo()

for _ in xrange(10):
    f = create_instance()
    print f.__class__.__bases__

Right now I see of course that I could have spared myself the whole
__metaclass__-business and directly used type()... Oh well, but at least it
worked :)

Diez
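
For the record, a minimal sketch of what that type()-only variant might look like (my illustration of the remark above, not code from the thread):

import random

class A(object):
    pass

class B(object):
    pass

def create_instance():
    # Build the combined class directly with type(); no metaclass needed.
    superclasses = tuple(random.sample([A, B], random.randint(1, 2)))
    Foo = type('Foo', superclasses, {})
    return Foo()

for _ in xrange(10):
    f = create_instance()
    print f.__class__.__bases__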
 

George Sakkis

Diez said:
George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").
However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:
class Joinable(object):
    def __init__(self, words):
        self.__words = list(words)
    def join(self, delim=','):
        return delim.join(self.__words)
class Sorted(Joinable):
    def __init__(self, words):
        super(Sorted,self).__init__(sorted(words))
    def join(self, delim=','):
        return '[Sorted] %s' % super(Sorted,self).join(delim)
class Reversed(Joinable):
    def __init__(self, words):
        super(Reversed,self).__init__(reversed(words))
    def join(self, delim=','):
        return '[Reversed] %s' % super(Reversed,self).join(delim)
class SortedReversed(Sorted, Reversed):
    pass
class ReversedSorted(Reversed, Sorted):
    pass
if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()
So I'm wondering, is the decorator pattern applicable here ? If yes,
how ? If not, is there another way to convert inheritance to
delegation ?
Factory - and dynamic subclassing, as shown here:
import random
class A(object):
     pass
class B(object):
     pass
def create_instance():
     superclasses = tuple(random.sample([A, B], random.randint(1, 2)))
     class BaseCombiner(type):
         def __new__(mcs, name, bases, d):
             bases = superclasses + bases
             return type(name, bases, d)
     class Foo(object):
         __metaclass__ = BaseCombiner
     return Foo()
for _ in xrange(10):
     f = create_instance()
     print f.__class__.__bases__

Right now I see of course that I could have spared myself the whole
__metaclass__-business and directly used type()... Oh well, but at least it
worked :)

Diez


Ok, I see how this would work (and it's trivial to make it cache the
generated classes for future use) but I guess I was looking for a more
"mainstream" approach, something that even a primitive statically
typed language could run :) Even in Python though, I think of Runtime
Type Generation as being like eval(): it's good that it exists, but it should be
used as a last resort. Also, RTG doesn't play well with pickling.

Since I don't have many useful subclasses so far, I'll stick with
explicit inheritance for now but I'll consider RTG if the number of
combinations becomes a real issue.

George
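
To illustrate the caching remark above, a hedged sketch (the helper function and the cache are my own names, not from the thread):

# Hypothetical sketch: memoise dynamically generated classes so that each
# combination of bases is built only once. Pickling instances would still
# require binding the generated class to an importable module-level name.

_class_cache = {}

def combined_class(*bases):
    try:
        return _class_cache[bases]
    except KeyError:
        name = ''.join(base.__name__ for base in bases)
        cls = _class_cache[bases] = type(name, bases, {})
        return cls

# e.g. combined_class(Sorted, Reversed) returns the same class object on
# every call with those bases, instead of generating a fresh one each time.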
 

Diez B. Roggisch

Ok, I see how this would work (and it's trivial to make it cache the
generated classes for future use) but I guess I was looking for a more
"mainstream" approach, something that even a primitive statically
typed language could run :) Even in Python though, I think of Runtime
Type Generation like eval(); it's good that it exists but it should be
used as a last resort. Also RTG doesn't play well with pickling.

Since I don't have many useful subclasses so far, I'll stick with
explicit inheritance for now but I'll consider RTG if the number of
combinations becomes a real issue.

I wouldn't compare the usage of the type-constructor to "eval". Python *is*
a dynamic language, and it explicitly exposes parts of its internals
through things like metaclasses, descriptors and such.

And your requirements simply can't be met by a "traditional" language. There
it's either decoration or similar approaches, or nothing; in particular, you
can't create objects with aggregated subclasses that will survive static
type analysis. You necessarily lose that information through the delegating
nature of these recipes.

Diez
 

Gerard flanagan

George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").

However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

I don't know if it will map to your actual problem, but here's a
variation of your toy code. I was thinking of the Strategy pattern:
different classes have different initialisation strategies? But then you
could end up with as many Strategy classes as subclasses, I don't know.
(Also in vaguely similar territory -
http://bazaar.launchpad.net/~grflanagan/python-rattlebag/trunk/annotate/head:/src/template.py
)


class MetaBase(type):

    def __init__(cls, name, bases, data):
        cls.strategies = []
        cls.prefixes = []
        for base in bases:
            print base
            if hasattr(base, 'strategy'):
                cls.strategies.append(base.strategy)
            if hasattr(base, 'prefix'):
                cls.prefixes.append(base.prefix)
        super(MetaBase, cls).__init__(name, bases, data)

class Joinable(object):
    __metaclass__ = MetaBase
    strategy = list
    prefix = ''

    def __init__(self, words):
        self._words = words
        for strategy in self.strategies:
            self._words = strategy(self._words)

    def join(self, delim=','):
        return '%s %s' % (' '.join(self.prefixes), delim.join(self._words))

class Sorted(Joinable):
    strategy = sorted
    prefix = '[sorted]'

class Reversed(Joinable):
    strategy = reversed
    prefix = '[reversed]'

class SortedReversed(Sorted, Reversed):
    pass

class ReversedSorted(Reversed, Sorted):
    pass

if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()
 

George Sakkis

George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").
However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

I don't know if it will map to your actual problem, but here's a
variation of your toy code. I was thinking the Strategy pattern,
different classes have different initialisation strategies? But then you
could end up with as many Strategy classes as subclasses, I don't know.
(Also in vaguely similar territory -http://bazaar.launchpad.net/~grflanagan/python-rattlebag/trunk/annota...
)

class MetaBase(type):

     def __init__(cls, name, bases, data):
         cls.strategies = []
         cls.prefixes = []
         for base in bases:
             print base
             if hasattr(base, 'strategy'):
                 cls.strategies.append(base.strategy)
             if hasattr(base, 'prefix'):
                 cls.prefixes.append(base.prefix)
         super(MetaBase, cls).__init__(name, bases, data)

class Joinable(object):
     __metaclass__ = MetaBase
     strategy = list
     prefix = ''

     def __init__(self, words):
         self._words = words
         for strategy in self.strategies:
             self._words = strategy(self._words)

     def join(self, delim=','):
         return '%s %s' % (' '.join(self.prefixes), delim.join(self._words))

class Sorted(Joinable):
     strategy = sorted
     prefix = '[sorted]'

class Reversed(Joinable):
     strategy = reversed
     prefix = '[reversed]'

class SortedReversed(Sorted, Reversed):
     pass

class ReversedSorted(Reversed, Sorted):
     pass

if __name__ == '__main__':
     words = 'this is a test'.split()
     print SortedReversed(words).join()
     print ReversedSorted(words).join()

This doesn't solve the original problem, the combinatorial explosion
of empty subclasses. At the end of the day, I'd like a solution that
uses a (mostly) flat, single-inheritance hierarchy, allowing the
client to say:

j = Joinable(words)
if sort:
    j = Sorted(j)
if reverse:
    j = Reversed(j)
...
print j.join()


George
 

Maric Michaud

On Monday 16 June 2008 20:35:22, George Sakkis wrote:
George said:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").

However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:

I don't know if it will map to your actual problem, but here's a
variation of your toy code. I was thinking the Strategy pattern,
different classes have different initialisation strategies? But then you
could end up with as many Strategy classes as subclasses, I don't know.
(Also in vaguely similar territory
-http://bazaar.launchpad.net/~grflanagan/python-rattlebag/trunk/annota...
)

class MetaBase(type):

     def __init__(cls, name, bases, data):
         cls.strategies = []
         cls.prefixes = []
         for base in bases:
             print base
             if hasattr(base, 'strategy'):
                 cls.strategies.append(base.strategy)
             if hasattr(base, 'prefix'):
                 cls.prefixes.append(base.prefix)
         super(MetaBase, cls).__init__(name, bases, data)

class Joinable(object):
     __metaclass__ = MetaBase
     strategy = list
     prefix = ''

     def __init__(self, words):
         self._words = words
         for strategy in self.strategies:
             self._words = strategy(self._words)

     def join(self, delim=','):
         return '%s %s' % (' '.join(self.prefixes),
delim.join(self._words))

class Sorted(Joinable):
     strategy = sorted
     prefix = '[sorted]'

class Reversed(Joinable):
     strategy = reversed
     prefix = '[reversed]'

class SortedReversed(Sorted, Reversed):
     pass

class ReversedSorted(Reversed, Sorted):
     pass

if __name__ == '__main__':
     words = 'this is a test'.split()
     print SortedReversed(words).join()
     print ReversedSorted(words).join()

This doesn't solve the original problem, the combinatorial explosion
of empty subclasses. At the end of the day, I'd like a solution that
uses a (mostly) flat, single-inheritance hierarchy, allowing the
client to say:

Yes, and it fails to implement the strategy pattern as well... which would
have solved the problem, since it is intended exactly for this purpose.
j = Joinable(words)
if sort:
    j = Sorted(j)
if reverse:
    j = Reversed(j)
...
print j.join()

The example given by Gerard is hard to translate into a strategy pattern
because it's more often a use case for a decorator-like design (which is
easily rendered with method decorators in Python).

An abstract strategy pattern is basically implemented as follows.

class Strategy(object) :

    def do_init(self, *args) : raise NotImplementedError
    def do_job(self, *args) : raise NotImplementedError
    def do_finalize(self, *args) : raise NotImplementedError


# modules can define their own strategies now

class algo_strategy(Strategy) :
    ...

class algo1_strategy(Strategy) :
    ...

# whether this is possible or not depends on the implementation
# and should be documented
class algo2_strategy(algo1_strategy) :
    ...

class MyClassUsingStrategies(object) :

    def __init__(self, meth1_strategies=[], meth2_strategies=[]) :
        self._meth1_strategies = meth1_strategies
        if [ s for s in meth2_strategies
             if not isinstance(s, algo_strategy) ] :
            raise RuntimeError("Not a valid strategy !")
        self._meth2_strategies = meth2_strategies

    def meth1(self, arg) :
        for i in self._meth1_strategies :
            i.do_init(...)

        ....

        for i in self._meth1_strategies :
            i.do_job(...)

        ....

        for i in self._meth1_strategies :
            i.do_finalize(...)

    ...

The class complexity problem is actually solved by:

inst_with_alg1 = MyClassUsingStrategies((algo1_strategy,), (algo1_strategy,))
inst_with_alg1_alg2 = MyClassUsingStrategies(
    (algo1_strategy,),
    (algo2_strategy,)
)
inst_with_alg12 = MyClassUsingStrategies(
    (algo1_strategy, algo2_strategy),
    (algo1_strategy, algo2_strategy)
)


etc...
 

Maric Michaud

On Tuesday 17 June 2008 05:10:57, Maric Michaud wrote:
The class complexity problem is actually solved by:

inst_with_alg1 = MyClassUsingStrategies((algo1_strategy,), (algo1_strategy,))
inst_with_alg1_alg2 = MyClassUsingStrategies(
    (algo1_strategy,),
    (algo2_strategy,)
)
inst_with_alg12 = MyClassUsingStrategies(
    (algo1_strategy, algo2_strategy),
    (algo1_strategy, algo2_strategy)
)


etc...

Ah! They should be instances here; this also permits extra configuration
parameters to be passed to the strategy constructors:

inst_with_alg12 = MyClassUsingStrategies(
    (algo1_strategy(), algo2_strategy()),
    (algo1_strategy(), algo2_strategy())
)
 

Gerard flanagan

Maric said:
On Monday 16 June 2008 20:35:22, George Sakkis wrote:
variation of your toy code. I was thinking the Strategy pattern,
different classes have different initialisation strategies? But then you
could end up with as many Strategy classes as subclasses, I don't know.
[...]
This doesn't solve the original problem, the combinatorial explosion
of empty subclasses.
[...]

Yes, and it fails to implement the strategy pattern as well... which would
have solved the problem as it is intended exactly for this purpose.

Ok, better would have been 'my made-up strategy pattern, any resemblance
to other patterns, either living or dead, is purely coincidental' :)

Non-canonically,

G.
 

George Sakkis

On Monday 16 June 2008 20:35:22, George Sakkis wrote:


George Sakkis wrote:
I have a situation where one class can be customized with several
orthogonal options. Currently this is implemented with (multiple)
inheritance but this leads to combinatorial explosion of subclasses as
more orthogonal features are added. Naturally, the decorator pattern
[1] comes to mind (not to be confused with the Python meaning of
the term "decorator").
However, there is a twist. In the standard decorator pattern, the
decorator accepts the object to be decorated and adds extra
functionality or modifies the object's behavior by overriding one or
more methods. It does not affect how the object is created, it takes
it as is. My multiple inheritance classes though play a double role:
not only they override one or more regular methods, but they may
override __init__ as well. Here's a toy example:
I don't know if it will map to your actual problem, but here's a
variation of your toy code. I was thinking the Strategy pattern,
different classes have different initialisation strategies? But then you
could end up with as many Strategy classes as subclasses, I don't know.
(Also in vaguely similar territory
-http://bazaar.launchpad.net/~grflanagan/python-rattlebag/trunk/annota....
)
class MetaBase(type):

    def __init__(cls, name, bases, data):
        cls.strategies = []
        cls.prefixes = []
        for base in bases:
            print base
            if hasattr(base, 'strategy'):
                cls.strategies.append(base.strategy)
            if hasattr(base, 'prefix'):
                cls.prefixes.append(base.prefix)
        super(MetaBase, cls).__init__(name, bases, data)

class Joinable(object):
    __metaclass__ = MetaBase
    strategy = list
    prefix = ''

    def __init__(self, words):
        self._words = words
        for strategy in self.strategies:
            self._words = strategy(self._words)

    def join(self, delim=','):
        return '%s %s' % (' '.join(self.prefixes), delim.join(self._words))

class Sorted(Joinable):
    strategy = sorted
    prefix = '[sorted]'

class Reversed(Joinable):
    strategy = reversed
    prefix = '[reversed]'

class SortedReversed(Sorted, Reversed):
    pass

class ReversedSorted(Reversed, Sorted):
    pass

if __name__ == '__main__':
    words = 'this is a test'.split()
    print SortedReversed(words).join()
    print ReversedSorted(words).join()
This doesn't solve the original problem, the combinatorial explosion
of empty subclasses. At the end of the day, I'd like a solution that
uses a (mostly) flat, single-inheritance hierarchy, allowing the
client to say:

Yes, and it fails to implement the strategy pattern as well... which would
have solved the problem, since it is intended exactly for this purpose.

As someone in another newsgroup demonstrated, it can be solved with a
combination of strategy and decorator: http://tinyurl.com/5ulqh9

George
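
The code behind the link isn't reproduced in the thread, but here is a rough, hypothetical sketch of how such a strategy/decorator combination might look for the toy example (my reconstruction, not the code from the other newsgroup):

# Hypothetical sketch: the construction-time work is handled by strategies
# passed to the base class, while the join() behaviour is layered on by a
# delegating decorator, so no combination subclasses are needed.

class Joinable(object):
    def __init__(self, words, strategies=()):
        for strategy in strategies:          # e.g. sorted, reversed
            words = strategy(words)
        self._words = list(words)
    def join(self, delim=','):
        return delim.join(self._words)

class Labelled(object):
    # decorator by delegation: wraps any object with a join() method
    def __init__(self, joinable, prefix):
        self._joinable = joinable
        self._prefix = prefix
    def join(self, delim=','):
        return '%s %s' % (self._prefix, self._joinable.join(delim))

def make_joinable(words, sort=False, reverse=False):
    # choose construction-time strategies, then layer on matching decorators
    strategies = []
    if sort:
        strategies.append(sorted)
    if reverse:
        strategies.append(reversed)
    j = Joinable(words, strategies)
    if sort:
        j = Labelled(j, '[Sorted]')
    if reverse:
        j = Labelled(j, '[Reversed]')
    return j

if __name__ == '__main__':
    words = 'this is a test'.split()
    print make_joinable(words, sort=True, reverse=True).join()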
 
