Zope 3.0, and why I won't use it

Thomas Heller

Tim Peters said:
Doing something other than that (for example, maybe popping up a
README file in Notepad at the end of installation) would require
changes to distutils, since X3's Windows installer is built by
distutils. The point of that isn't that changing distutils makes it
impossible, the point is that since distutils *today* doesn't have
such abilities, no distutils-built installer built today (whether X3's
or any other Python project's) can do such things.

Recent distutils' bdist_wininst command has a post_install_script (or
something like that), which is run *after* the installation. It can do
anything that Python does, so os.startfile("readme.txt") or
os.startfile("readme.html") would be possible.

I do not know which Python version zope uses, and I'm not sure the
post_install_script stuff has been ported to 2.3. OTOH, there are
several ways to use 2.4's distutils with earlier Python releases.

(Although I doubt it would help at all - experience shows that nearly
nobody reads this stuff)
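For the curious, the mechanism looks roughly like this; the option and file
names here are from memory of later distutils releases, so check the docs for
the version you actually have:

```python
# setup.py -- build the installer with:
#   python setup.py bdist_wininst --install-script=postinstall.py
from distutils.core import setup

setup(
    name="myproject",
    version="1.0",
    py_modules=["mymodule"],
    scripts=["postinstall.py"],  # the post-install script must be listed here too
)

# postinstall.py -- run by the installer itself; sys.argv[1] is '-install'
# after installation and '-remove' before uninstallation.
import os
import sys

if sys.argv[1] == '-install':
    os.startfile("readme.html")  # pop up the README at the end of installation
```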

Thomas
 
Ian Bicking

Ville said:
Alex> where builtin 'adapt' does protocol adaptation. On the
Alex> other hand, if the semantics of those 'as' clauses were
Alex> something like:

Alex> def x(y, z):
Alex>     if type(y) is not foo: raise TypeError, 'y must be foo'
Alex>     if type(z) is not bar: raise TypeError, 'z must be bar'
Alex>     <body of x>

Alex> then that's the day I move to Ruby, or any other language
Alex> that remained halfway sensible (if any).

Even if it allowed portions of the code to be statically compiled to
reach "native" performance? Or if it allowed resolving ambiguous
situations in type inference? Type declarations would of course be
optional, and they could be a big win esp. for non-cpython
implementations of the language. Beats reverting to C#/whatever for
situations where you want good JITtability. In CPython "adaptation"
might be the way to go, of course, but that would still enable the
same code to run on different variants of Python.

I would think of it differently. I don't see a need for static typing
to deal with the outside world. There's always a bridge when
communicating with the outside world, and if that bridge is smart enough
it should be okay. You can annotate things (I think type annotation for
PyObjC was one of the motivations behind the decorator syntax), that's
different from static typing, and is just a guide for coercion.

And obviously foreign functions cannot take arbitrary arguments.
Nothing in Python can, there's always limits; though foreign functions
may not accept imitation objects (e.g., list subclasses), and that's
just part of the API. That's why people tend to create wrappers around
non-Python libraries.

I can see some performance advantages to static typing. I don't think
it would be worth it for any external interface, but for internal
interfaces it might be worth it, where there's less of a concern about
flexibility. In that case, decorators might be all that's necessary, e.g.:

@constrain_types(a=float, b=str, c=int)
def foo(a, b):
    c = 1
    ...

Imagining an implementation of constrain_types that knew how to pick
apart a function's code, apply the type information and optimizations,
and put it back together. It really doesn't need pretty syntax, because
it's not a pretty operation (though the decorator syntax is actually
reasonably attractive). It would be nice if this applied both to
arguments and local variables.
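A much weaker but implementable-today cousin -- a purely runtime checker that
only validates arguments, not locals, with the decorator name made up here --
could look like:

```python
import inspect
from functools import wraps

def constrain_types(**declared):
    """Hypothetical decorator: check argument types at call time.
    (The optimizing version imagined above would instead pick the
    function's code apart; nothing here does that.)"""
    def decorate(func):
        argnames = inspect.getfullargspec(func).args
        @wraps(func)
        def wrapper(*args, **kwargs):
            # map positional and keyword arguments back to parameter names
            bound = dict(zip(argnames, args))
            bound.update(kwargs)
            for name, expected in declared.items():
                if name in bound and not isinstance(bound[name], expected):
                    raise TypeError('%s must be %s, got %s' % (
                        name, expected.__name__, type(bound[name]).__name__))
            return func(*args, **kwargs)
        return wrapper
    return decorate

@constrain_types(a=float, b=str)    # note: the local 'c' can't be checked here
def foo(a, b):
    c = 1
    return '%s/%s' % (a, b)
```

Called as foo(1.0, "x") this passes; foo("no", "x") raises TypeError at the
call, which is exactly the policeman behavior discussed further down.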

Now, I could imagine this turning into a Cargo Cult programming
practice, where people mindlessly apply this type information despite
there being no reason or performance gain, thus adding weight to their
code and reducing flexibility. But then, I'd rather work under the
expectation that the programmer will do the right thing.
 
Josiah Carlson

Since the actual need for type-checking is extremely rare, I contend it
should not _at all_ be a candidate for a syntax to facilitate it. Since
the actual need for adaptation is much wider, I contend it's the one
obvious choice for the "nice syntax" if any.

Sounds reasonable, but there will be a group of people who will beg and
plead for type checking, because they want "static type checking on
function call and return".

Type checking is ugly, so it's quite fitting that it be implemented in
ugly ways -- by boilerplate or user-coded splats. Adaptation is
beautiful, and beautiful is better than ugly (see, I can quote the zen
of Python too...), so it should have nice syntax (<var> 'as' <itf>).

Adaptation is quite nifty, but being that I am a rare user of either, it
probably isn't worth the effort to attempt to convince me that
adaptation is the one true way, especially considering that I've got
little to no pull in python-dev.

Type-based dispatch is a little bit of a mess to implement with
decorators, IMHO. Consider the hypothetical code...:

class A(object):
    @ dispatch(int)
    def foo(x): ...

    @ dispatch(float)
    def foo(x): ...

class B(object):
    @ dispatch(int)
    def foo(x): ...

    @ dispatch(str)
    def foo(x): ...

Sure, dispatch can be a H**2OF which returns a dispatching-foo and
stashes somewhere a (say) foo__int, foo__float or whatever, but the
problem is, _WHAT_ 'somewhere'. dispatch isn't told what class body
it's being called from, so it can't easily stash stuff in the obvious
place, the class dictionary being built. Attributes of the
dispatching-foo don't seem good, because of the problem of getting at
the right previously-returned dispatching-foo object on successive calls
of dispatch from within the same class body (but not from separate class
bodies). Maybe some peek-into-the-stack black magic or custom
metaclass can help, but it does seem a little bit of a mess, unless I'm
overlooking something obvious. If the implementation is hard to
explain, it's a bad idea...

It can be a mess, which is one reason why I haven't been using it, and
generally don't use it. But you know those C/C++ guys who are all like
"hey, where's my polymorphism?"

I agree that it would be best if one could place everything into the
class dict without frame hacks, but for simplicity in implementation's
sake, I would stick with a dispatch-factory and jam all of the requisite
dispatch information into the closure (or instance) that is generated.
Sure, you need to say 'hey, I'm going to do type-based dispatch here'
at the beginning of your desired non-overlapping namespaces, but it is
explicit what one would want in such a case. Heck, with a properly
defined metaclass or class decorator, a post-decoration pass could be
done to extract all of the dispatches and place them into foo__int, etc.

@ multi_dispatch_to_methods
class A(object):
    dispatch = dispatch_factory()

    @ dispatch(int)
    def foo(x): ...

    @ dispatch(float)
    def foo(x): ...


It may have a couple warts, but it is implementable today.
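For what it's worth, here's a minimal sketch of that closure-based approach
(dispatch_factory and multi_dispatch_to_methods are the made-up names from the
example above, not any real library; it dispatches on the exact type of the
first argument only, and unlike the sketch the methods take an explicit self):

```python
def dispatch_factory():
    """Each call returns an independent dispatch decorator with its own
    type -> implementation table, so separate class bodies don't collide."""
    table = {}
    def dispatch(typ):
        def register(func):
            table[typ] = func
            def dispatcher(self, x, *args, **kwargs):
                try:
                    impl = table[type(x)]
                except KeyError:
                    raise TypeError("no implementation for %r" % type(x))
                return impl(self, x, *args, **kwargs)
            dispatcher.dispatch_table = table   # stashed for post-processing
            return dispatcher
        return register
    return dispatch

def multi_dispatch_to_methods(cls):
    """Class decorator: a post-decoration pass that extracts the stashed
    tables and exposes foo__int-style names on the class as well."""
    for name, attr in list(vars(cls).items()):
        table = getattr(attr, 'dispatch_table', None)
        if table is not None:
            for typ, impl in table.items():
                setattr(cls, '%s__%s' % (name, typ.__name__), impl)
    return cls

@multi_dispatch_to_methods
class A(object):
    dispatch = dispatch_factory()

    @dispatch(int)
    def foo(self, x):
        return 'int: %d' % x

    @dispatch(float)
    def foo(self, x):
        return 'float: %g' % x
```

An A().foo(3) call routes to the int implementation, A().foo(2.5) to the
float one, and anything unregistered raises TypeError -- no frame hacks
needed, just the explicit dispatch_factory() line per class body.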

- Josiah
 
Ville Vainio

Ian> I would think of it differently. I don't see a need for
Ian> static typing to deal with the outside world. There's always
Ian> a bridge when communicating with the outside world, and if
Ian> that bridge is smart enough it should be okay. You can

With things like IronPython, there isn't necessarily any "outside
world". The compiler can look at the code, determine that it's "typed
enough" and compile it to a more static form of code that can be
JIT-compiled to native code.

Ian> it would be worth it for any external interface, but for
Ian> internal interfaces it might be worth it, where there's less
Ian> of a concern about flexibility. In that case, decorators
Ian> might be all that's necessary, e.g.:

Ian> @constrain_types(a=float, b=str, c=int)
Ian> def foo(a, b):
Ian>     c = 1
Ian>     ...

Guido has stated that we shouldn't jump all over decorator syntax for
function/method signatures, because type declarations will appear in
language syntax at some point. We'll see how that pans out - in the
meantime, perhaps having some kind of "standard" type declaration
(-ish) decorator in the standard library would be conducive to having
One Obvious Way of telling IDEs about the types.
 
Alex Martelli

Carlos Ribeiro said:
Well, my name is not Alex, and my answer will probably fall short of a
comprehensive definition :) But let's see if I can help here...

I think you did a great job! May I recommend some further reading...:

http://www.python.org/peps/pep-0246.html
http://www.aleax.it/os04_pydp.pdf
http://peak.telecommunity.com/protocol_ref/module-protocols.html

Adaptation is the act of taking one object and making it conform to a
given protocol (or interface). Adaptation is the key to make dynamic
code that takes parameters from arbitrary types work in a safe, well
behaved way.

Hear, hear!
The basic concept underlying adaptation is the "protocol", also called
"interface" in some implementations. For all purposes of this
discussion, and for simplicity reasons, we can safely assume that
protocols and interfaces are equivalent.

Right, but, for the curious...: "Interface" mostly describes a certain
set of methods and their signatures; "Protocol" adds semantics and
pragmatics -- some level of conceptualization of what the methods _do_
and constraints on how they are used (e.g. "after calling .close(), no
other method can ever be called on the object any more"). This is a
reasonably popular use of the words in question, though far from
universal.

A protocol is a set of primitives that is supported by a given object.
For example: the iterator protocol defines the following primitives:
__iter__ and next() (as documented in
http://docs.python.org/lib/typeiter.html). Any object from any class
that implements these methods, regardless of its ancestors, is said to
support the iterator protocol.

...if the semantics and pragmatics are also respected, e.g.:
x.__iter__() is x
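For instance, a tiny class that honors both the letter and the spirit of the
protocol (shown in the Python 3 spelling, where the next() named in this
thread became __next__):

```python
class Countdown(object):
    """Supports the iterator protocol, regardless of its ancestry."""
    def __init__(self, start):
        self.current = start
    def __iter__(self):
        return self           # the semantic constraint: x.__iter__() is x
    def __next__(self):       # spelled next() in the Python of this thread
        if self.current <= 0:
            raise StopIteration
        value = self.current
        self.current -= 1
        return value

print(list(Countdown(3)))     # -> [3, 2, 1]
```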

Any object that supports the iterator protocol can be used whenever an
iterable is acceptable. This includes for loops and list
comprehensions. The biggest advantage of adaptation comes when one
realizes how flexible this design is, especially when compared with
old-style type checking. In an old-style type-checking environment,
parameters to a given routine must conform to the declared type of the
arguments. For iterators, it would mean that only objects descending
from some abstract class (let's say, "Iterable") would be accepted.
Now, even with multiple inheritance, this design is severely limited.

In old-style Python, inheritance isn't really the issue (except for
exceptions, where inheritance does matter). Rather, a protocol is
defined by a given set of methods.

An iterable is an object supplying a special method __iter__ which,
called without arguments, returns an iterator (any object which respects
the iterator protocol). A sequence besides being iterable supports
__len__, AND __getitem__ with integer and slice arguments, and there is
a rich semantic and pragmatic web of mutual constraints between behavior
of __getitem__ and __len__ and iteration. A mutable sequence is a
sequence which also supports __setitem__ (again with specific
constraints wrt __getitem__, __len__...) and is also supposed to expose
a rich set of other methods such as pop, append, extend, etc, etc.

This IS great BUT limited by what the LANGUAGE designer(s) sanction(s)
as 'blessed protocols'. There are quite a few, but they're never enough
to cover the needs of an application or field of applications, of
course. With protocols based on certain special methods, you have a
great concept which however is not really extensible, nor flexible
enough to help the authors of large frameworks and applications.

Framework authors do define new protocols, of course -- they can't help
doing that. "X must be an object supplying methods 'foo' and 'bar' with
the following constraints...:". This is OK for somebody who's writing
an application using just one framework -- they can code their classes
to the framework's specifications.

The problem comes in when you're writing an application that uses two or
more frameworks... the two frameworks likely haven't heard of each
other... one wants objects supplying 'foo' and 'bar', the other supplies
objects supplying 'oof' and 'rab' instead, with subtly different
semantics and pragmatics. So, what do you do then? You write adapters:
wrappers over Y with its oof and rab which provide an X with its foo and
bar. Python is _great_ at that kind of job!
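In sketch form, with purely illustrative class and method names lifted from
the paragraph above:

```python
class Y(object):
    """From framework 2: supplies 'oof' and 'rab'."""
    def oof(self):
        return 'oof-result'
    def rab(self, n):
        return n * 2

class YasX(object):
    """The adapter: wraps a Y and supplies the 'foo' and 'bar' that
    framework 1 requires, smoothing over any semantic differences."""
    def __init__(self, y):
        self._y = y
    def foo(self):
        return self._y.oof()
    def bar(self, n):
        return self._y.rab(n)

def framework1_code(x):
    """Framework-1 code: requires an object supplying foo() and bar(n)."""
    return x.foo(), x.bar(21)

# the tedious, boilerplatish part: applying the adapter by hand
print(framework1_code(YasX(Y())))   # -> ('oof-result', 42)
```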

But, who applies the adapters, where, when? Well, unless we do get PEP
246 in... _you_, the application writer, are the only one who can.
You'd like to spend your time developing your application, with
frameworks to relieve you from some parts of the job, but to do that you
also need to develop adapters _and_ tediously, boilerplatishly, apply
them to every object from one framework that's going over to the other,
etc. Basically, you risk ending up with very *thick glue* (cfr Eric
Raymond's excellent "The Art of Unix Programming", great book also
available online for free) -- throughout your application, there will be
knowledge about the details of all frameworks you're using, spread in
thick layers of glue.

Now, back to Python world. In many situations, there is no need for
adaptation; the object itself supports the protocol, and can be
supplied directly. But there are situations when the object itself
can't be immediately used; it has to be adapted, or prepared, to
support the protocol. Another situation is when an object is passed to
a routine that *doesn't* support the required protocol; this is an
error, that can be caught by the adapt() framework in a superficially
similar but fundamentally different approach from type checking (and
that's what Alex has been pointing out).

Oh yes, VERY different. Let me try an analogy...

A policeman's job is to ensure you respect the law. He does that by
trying to catch you violating the law, and punishing you for that.

A civics teacher's job is to ensure you respect the law. He does that
by teaching you the law, explaining its rationale, engaging you in
discussion to find instances where your spontaneous behavior might
violate the law, and working out together with you how to adapt your
behavior and your instincts so that the law gets respected.

Type checking is a policeman. Adaptation is a civics teacher.

The adapt protocol (as presented on PEP246 -
http://www.python.org/peps/pep-0246.html) defines a very flexible
framework to adapt one object to a protocol. The result of the
adaptation (if possible) is an object that is guaranteed to support
the protocol. So, using adapt(), we can write code like this:

def myfunc(obj):
    for item in adapt(obj, Iterable):
        ...

...and yet the text of PEP 246 is still missing the specs about
registering "third party adapters". Phil Eby's PyProtocols is much
better that way!!! (I promise I'll do something about PEP 246 updating:
just as soon as I'm done with the 2nd ed of the cookbook....!!!!).

Hmmm, yes, we can. It's a standard protocol so not the best of
examples, but still, it may be meaningful.

Finally, one may be wondering, is there any situation when an object
needs to be "adapted"? Why not just check for the availability of
the interface? There are many reasons to use the adapt framework.
Protocol checking is one of the reasons -- it allows errors to be
caught much earlier, and at a better location. Another possible
reason is that complex objects may support several protocols, and
there may be name clashes between some of the methods. One such
situation is when an object supports different *versions* of the same
protocol. All versions have the same method names, but semantics may
differ slightly. The adapt() call can build a new object with the
correct method names and signatures, for each protocol or version
supported by the object. Finally, the adaptation method can optionally
build an opaque "proxy" object, that hides details of the original
methods' signatures, and is thus safer to pass around.

The main motivation I'd give is that different frameworks not knowing
about each other may define [1] what the object supplies and [2] what
the object is REQUIRED to supply -- there are often discrepancies, and
an adapter in-between is gonna be required. With 246 (once suitably
updated;-) we can write the adapter ONCE, register it in a suitable
global registry, and 'adapt' will just find it. Oh bliss -- as long as
adapt DOES get called all over the place!-)
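A toy version of that registry (this is NOT PEP 246's actual interface -- the
real protocol also consults __conform__ on the object and __adapt__ on the
protocol -- just the write-the-adapter-ONCE idea in miniature, with
hypothetical names throughout):

```python
_registry = {}

def register_adapter(from_type, protocol, adapter):
    """Register, once and globally, how to turn a from_type into
    something satisfying protocol."""
    _registry[(from_type, protocol)] = adapter

def adapt(obj, protocol):
    """Return obj itself if it already satisfies protocol, else look up
    and apply a registered third-party adapter, else complain loudly."""
    if isinstance(obj, protocol):
        return obj
    adapter = _registry.get((type(obj), protocol))
    if adapter is not None:
        return adapter(obj)
    raise TypeError("can't adapt %r to %s" % (obj, protocol.__name__))

class TextSource(object):            # the 'protocol', here just a base class
    def read_text(self):
        raise NotImplementedError

class LegacyDoc(object):             # from the *other* framework
    def __init__(self, data):
        self.data = data

class LegacyDocAsTextSource(TextSource):
    def __init__(self, doc):
        self.doc = doc
    def read_text(self):
        return self.doc.data

register_adapter(LegacyDoc, TextSource, LegacyDocAsTextSource)

def render(source):
    # adapt DOES get called here, so callers can pass either kind of object
    return adapt(source, TextSource).read_text()
```

The adapter is written and registered once; render() neither knows nor cares
which framework its argument came from.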

Well, that's a broad overview of the subject. There is a lot of stuff
to learn, and using adaptation properly is something that takes some
time. Hope it helps.

My compliments for your excellent presentation! I hope my enrichment of
it may have proved useful rather than distracting....


Alex
 
Ian Bicking

Josiah said:
Sounds reasonable, but there will be a group of people who will beg and
plead for type checking, because they want "static type checking on
function call and return".

Yes, but they've been begging for it for years. They seem insistent
that this Python Thing will never catch on without these important features.

Anyway, a much more general and useful feature would be contracts.
Which is just to say, a formal place to put constraints on input and
output for a function. Contracts could include type checking --
sometimes quite validly -- but not necessarily. There's no type that
can indicate "only valid username strings" or "only integers constrained
by some global variable".

@ multi_dispatch_to_methods
class A(object):
    dispatch = dispatch_factory()

    @ dispatch(int)
    def foo(x): ...

    @ dispatch(float)
    def foo(x): ...

It may have a couple warts, but it is implementable today.

Phillip Eby just did this, in like the last week. It's very cool:

http://peak.telecommunity.com/DevCenter/VisitorRevisited
http://www.eby-sarna.com/pipermail/peak/2004-November/001916.html

It looks similar to what you were thinking:

class A(object):
    @dispatch.on('x')
    def foo(x):
        """This is a dummy function, for documentation purposes"""

    @foo.when(int)
    def foo(x): ...

    @foo.when(float)
    def foo(x): ...

Of course, you can use interfaces in addition to types, and you can also
use arbitrary expressions (e.g., @foo.when("x % 2") and @foo.when("not x
% 2") to dispatch odd and even numbers separately).
 
Carlos Ribeiro

Right, but, for the curious...: "Interface" mostly describes a certain
set of methods and their signatures; "Protocol" adds semantics and
pragmatics -- some level of conceptualization of what the methods _do_
and constraints on how they are used (e.g. "after calling .close(), no
other method can ever be called on the object any more"). This is a
reasonably popular use of the words in question, though far from
universal.

Nice summary of the difference.

Oh yes, VERY different. Let me try an analogy...

A policeman's job is to ensure you respect the law. He does that by
trying to catch you violating the law, and punishing you for that.

A civics teacher's job is to ensure you respect the law. He does that
by teaching you the law, explaining its rationale, engaging you in
discussion to find instances where your spontaneous behavior might
violate the law, and working out together with you how to adapt your
behavior and your instincts so that the law gets respected.

Type checking is a policeman. Adaptation is a civics teacher.

Good example. I have my own take on this: type checking is about being
strict (to the point of intolerance). Adaptation is about being
flexible.

...and yet the text of PEP 246 is still missing the specs about
registering "third party adapters". Phil Eby's PyProtocols is much
better that way!!! (I promise I'll do something about PEP 246 updating:
just as soon as I'm done with the 2nd ed of the cookbook....!!!!).

That's another thing that I thought about including... but I was
afraid of broadening the scope too much. But at least a pointer to
PyProtocols is needed, if this small intro is to be turned out into a
'what is adaptation' tutorial of sorts.

Hmmm, yes, we can. It's a standard protocol so not the best of
examples, but still, it may be meaningful.

I thought that it was a good example to show what adapt() does... for
someone who never thought about it before.

The main motivation I'd give is that different frameworks not knowing
about each other may define [1] what the object supplies and [2] what
the object is REQUIRED to supply -- there are often discrepancies, and
an adapter in-between is gonna be required. With 246 (once suitably
updated;-) we can write the adapter ONCE, register it in a suitable
global registry, and 'adapt' will just find it. Oh bliss -- as long as
adapt DOES get called all over the place!-)

I was not entirely satisfied with this part of my explanation, but I
was afraid of taking too many things at once. If I was to rewrite it
now I would probably restructure it, and include a few considerations
about the case with similar (but slightly and annoyingly different)
frameworks.


--
Carlos Ribeiro
Consultoria em Projetos
blog: http://rascunhosrotos.blogspot.com
blog: http://pythonnotes.blogspot.com
mail: (e-mail address removed)
mail: (e-mail address removed)
 
Terry Reedy

example?

The example used in the Zope3 programmer tutorial:

You have a database of (USA) Buddy objects, each with firstname, lastname,
streetaddress, zipcode, and email fields. You want to use them in a
context that requires the (redundant) city and state fields. So you write
a zip to city/state lookup function and then an ExpandedBuddy adapter that
gets the zip from the adapted Buddy, calls city_state(zip), and packages
the result together as needed in the usage context.
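A sketch of that adapter (the lookup table inside city_state is, of course, a
made-up stand-in for a real zip database):

```python
ZIP_TABLE = {'10001': ('New York', 'NY')}   # hypothetical stand-in data

def city_state(zipcode):
    """Zip to (city, state) lookup."""
    return ZIP_TABLE[zipcode]

class Buddy(object):
    def __init__(self, firstname, lastname, streetaddress, zipcode, email):
        self.firstname = firstname
        self.lastname = lastname
        self.streetaddress = streetaddress
        self.zipcode = zipcode
        self.email = email

class ExpandedBuddy(object):
    """Adapter: presents a Buddy plus the (redundant) city/state fields."""
    def __init__(self, buddy):
        self.buddy = buddy
        self.city, self.state = city_state(buddy.zipcode)
    def __getattr__(self, name):
        return getattr(self.buddy, name)   # delegate the original fields
```

Usage contexts that need city and state get them from the adapter; everything
else falls through to the adapted Buddy unchanged.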

Terry J. Reedy
 
Bryan

Terry said:
The example used in the Zope3 programmer tutorial:

You have a database of (USA) Buddy objects, each with firstname, lastname,
streetaddress, zipcode, and email fields. You want to use them in a
context that requires the (redundant) city and state fields. So you write
a zip to city/state lookup function and then an ExpandedBuddy adapter that
gets the zip from the adapted Buddy, calls city_state(zip), and packages
the result together as needed in the usage context.

Terry J. Reedy

thanks carlos, alex, terry...

the part i was missing was that adapt() retrieves a registered customized adapter from a registry. so, i see that this
would completely decouple code which wants objects of protocol Y from code that created objects with different protocols
X[n]. this seems to be the basic adapter pattern, but with the added feature of the registry lookup. i'm sure there
are some other dynamic benefits you get by using python. i'll read the links that were posted and catch
up... thanks,

bryan
 
Fred Pacquier

Tim Peters said:
Ah, but where would you put this file? In a *tarball* distribution,
it's easy: you stick README_FILES_WITH_SCREAMING_NAMES in the root of
the distribution, and then they're easy to find.

But this is a Windows *installer*, and every file it installs is
"hiding" under some subdirectory of your pre-existing Windows Python
installation. Where could the X3 Windows installer put
WINDOWS_INSTALL_README.TXT where a new user is likely to find it?

Yes, good point. I was sort of speaking for myself because I had actually
browsed through the newly created directories under site-packages looking
for some such, and would have found it there. So it would work for people
used to installing Python packages (like most of those who've posted in
this thread), but it's clearly not a bulletproof "end-user" solution.
 
Doug Holton

Alex said:
include static type support. The Zope project has driven numerous past
changes to Python. What's funny about this?



Glad to hear that. As for me, I'd love it if function arguments, only,
could bear an 'as' clause (making 'as' a keyword is long overdue):

def x(y as foo, z as bar):
    <body of x>


I know many people here already know about this, but you can use this
syntax in boo:

def foo(y as int, z as bool) as bool:
    <body returns a boolean>

You can also use attributes (similar to decorators):

def foo([required(value > 3)] value as int):
    pass

http://boo.codehaus.org/

If anyone here familiar with Zope, mod_python, etc. ever tries out boo
and ASP.NET (or Mono's XSP/mod_mono), I'd appreciate you sharing any
notes about it on the boo mailing list to help others getting started.
 
