Python syntax in Lisp and Scheme

Pascal Costanza

Carlo said:
sure, but it seems like no one was able to let CLOS have
(virtual) inner classes,
methods inside methods,
virtual methods (yeah I know about those stupid generic functions :),
method overloading,

+ Inner classes only make sense when the language requires you to put
method definitions inside of class definitions. It doesn't make a lot of
sense to put a class definition inside another class when it only
consists of field definitions, as is the case in CLOS. (Except for
having the benefit of additional namespaces, but namespaces are handled
differently in Common Lisp.)

+ So what you want is method definitions inside of other methods. Of
course, this is possible. Here is a little toy example that sketches how
you can achieve this:

(defclass person ()
  ((name :accessor name :initarg :name)
   (address :accessor address :initarg :address)))

(defun make-out-method (person)
  (with-slots (name address) person
    (defmethod out ((p (eql person)))
      (format t "Name: ~A; address: ~A~%" name address))))

(defvar *pascal* (make-instance 'person :name "Pascal" :address "Bonn"))

(make-out-method *pascal*)

(out *pascal*)

=> Name: Pascal; address: Bonn
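For the cross-posted Python readers, here is a rough Python analogue of the same per-instance trick, a sketch using only a closure attached to one instance (the names are mine, not from any library):

```python
class Person:
    def __init__(self, name, address):
        self.name = name
        self.address = address

def make_out_method(person):
    # Attach an 'out' function to this one instance only,
    # roughly analogous to the EQL-specialized CLOS method above.
    def out():
        return "Name: %s; address: %s" % (person.name, person.address)
    person.out = out

pascal = Person("Pascal", "Bonn")
make_out_method(pascal)
print(pascal.out())  # Name: Pascal; address: Bonn
```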

+ All methods in CLOS are virtual. What do you mean?

+ Method overloading is a way to have static dispatch, and this doesn't
fit well with a dynamic language. (Apart from that, static dispatch is a
source of some nasty bugs.)

What you probably really mean here is that there are some strict
compatibility requirements wrt the lambda lists of methods that belong
to the same generic function. I don't think Common Lispers have serious
issues with these requirements.

In general, dynamic type checking in Common Lisp makes these things much
easier than you might think in case you have only considered statically
typed languages so far.
A decent API (I tried playing with it.. it doesn't even have a freaking
date library as standard ;-p

No language with an official standard (ANSI, ISO, etc.) defines
everything you might ever need in its standard. That's simply not
possible. Standardized languages rely on vendor support, and more often
than not, community-driven de-facto standards emerge.

Single-vendor languages follow a totally different approach in this
regard. You are comparing apples and oranges here.

One can have a debate about language standards vs. single-vendor
languages, but that's a different issue altogether.

Baseline: If you are looking for a decent date library, check out what
Common Lisp vendors have to offer and/or what is available from third
parties.
Yes, I agree that compile-time macro expansion is a nice thing.
However, if I want to make some serious changes to the structure of
objects and classes (i.e. create a new kind of object), then I have to
spend a long time finding out how the CLOS people hacked together their
representation of classes, methods, method calls, etc. It has been far
easier for me to just make some small changes using __getattribute__ and
metaclasses in Python. So in that respect I'm not really sure the macro
idea is advantageous for anything other than 'straight away' macros...

Are you sure that you are not confusing macros and the CLOS MOP here?
(Your remarks are too general to be able to comment on this.)
yes this mail is provocative.. please count slowly to 10 before replying
if you disagree with my point of view (and I know Pascal will disagree
;-) ... not that I've ever seen him angry ;-)

grrr

;)


Pascal
 
Andreas Rossberg

Dirk said:
you can use macros to
do everything one could use HOFs for (if you really want).

Really? What about arbitrary recursion?

--
Andreas Rossberg, (e-mail address removed)-sb.de

"Computer games don't affect kids; I mean if Pac Man affected us
as kids, we would all be running around in darkened rooms, munching
magic pills, and listening to repetitive electronic music."
- Kristian Wilson, Nintendo Inc.
 
Dirk Thierbach

Andreas Rossberg said:
Dirk Thierbach wrote:

I should have added: As long as it should execute at compile time, of
course.
Really? What about arbitrary recursion?

I don't see the problem. Maybe you have an example? I am sure the
Lisp'ers here can come up with a macro solution for it.
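A sketch of the point at issue (my own toy example, from neither poster): a higher-order function can recurse to a depth that depends on runtime data, which no finite compile-time macro expansion can unroll.

```python
def repeat(f, n, x):
    # Apply f to x, n times. The recursion depth is the runtime
    # value n, so it cannot be fixed by any compile-time expansion.
    return x if n == 0 else repeat(f, n - 1, f(x))

import sys
n = len(sys.argv)            # depth unknown until the program runs
print(repeat(lambda v: v + 1, n, 0) == n)  # True
```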

- Dirk
 
Bruce Lewis

Pascal Bourguignon said:
Most probably, you would write a macro named WITH-INFIX and thus
automatically scope the infix part:

(with-infix 1 / x + 1 / ( x ^ 3 ) + 1 / ( x ^ 5 ) )

How about a with-algebraic macro? Someone mentioned that Python uses a
nice algebraic syntax. That would obviously be different from the infix
syntax you illustrated. With infix syntax, it takes some examination to
notice that the above expression is the sum of three fractions. You
have to think about operator precedence and everything. With algebraic
syntax you could see it at a glance:

  1       1       1
----- + ----- + -----
           3       5
x + 1     x       x


Of course, prefix notation would also make it obvious at a glance that
you have the sum of three fractions:

(+ (/ 1 (+ x 1))
   (/ 1 (expt x 3))
   (/ 1 (expt x 5)))

You're already pretty close to algebraic syntax once you upgrade from
infix to prefix notation. But that actual algebraic syntax must be
really cool in Python. Isn't there some math tool out there that also
does it?

When Python programmers put algebraic formulas in their code, does it
mess up the indentation at all? I'm curious exactly how it works.
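For what it's worth, the "algebraic" syntax in question is ordinary one-dimensional infix; inside parentheses an expression can span several lines without disturbing Python's block indentation at all:

```python
x = 2.0
# The same sum of three fractions, continued across lines
# inside parentheses; the surrounding indentation is untouched.
total = (1 / (x + 1)
         + 1 / x ** 3
         + 1 / x ** 5)
print(total)
```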
 
Alex Martelli

Doug Tolton wrote:
...
Alex, this is pure un-mitigated non-sense.

Why, thanks! Nice to see that I'm getting on the nerves of _some_
people, too, not just having them get on mine.
Python's Metaclasses are
far more dangerous than Macro's. Metaclasses allow you to globally
change the underlying semantics of a program.

Nope: a metaclass only affects (part of the semantics of) the
classes that instantiate it. No "globally" about it. When I
write a class I can explicitly control what metaclass it uses,
or inherit it. E.g., by writing 'class My(object):', with no
explicit metaclass, I ensure type(My)==type(object). The type
of the built-in named 'object' is the built-in named 'type'
(which most custom metaclasses subclass), which is also the
type of most other built-in types (numbers, strings, list,
tuple, dict, functions, methods, module, file, ...). I.e.,
your assertion is pure un-mitigated FUD.
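The locality claim is easy to check directly; a minimal sketch in modern Python syntax (the class names here are made up):

```python
class Meta(type):
    # A custom metaclass that tags every class it creates.
    def __new__(mcs, name, bases, ns):
        ns['tagged'] = True
        return super().__new__(mcs, name, bases, ns)

class Mine(metaclass=Meta):   # opts in to the metaclass explicitly
    pass

class Other(object):          # unaffected: its type is plain 'type'
    pass

print(type(Mine) is Meta, Mine.tagged)   # True True
print(type(Other) is type,
      hasattr(Other, 'tagged'))          # True False
```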
Macros only allow you
to locally change the Syntax.

"Locally"? Not globally? Care to explain? Each 'invocation'
(expansion) of a macro must occur in a particular locus, sure,
but isn't the _point_ of (e.g.) defining a 'loop-my-way' macro
that every occurrence of 'loop-my-way' is such an expansion?

As for that mention of "the Syntax" AS IF somehow contrasted
with the "underlying semantics" just before, don't even bother
to try explaining: one example of macros' wonders offered by a
particularly vocal and emotional advocate was a macro
'with-condition-maintained' that was somehow supposed to make
whatever alterations might be needed in the control program of
a reactor in order to regulate temperature -- and it was
passed that code as three calls to functions (or expansions
of macros) NOT defined inside it, so how it could possibly
work "only...locally", when to do anything at all it MUST
necessarily find, "expand", and alter the local instantiations
of those functions (or macros)...?!

If that's an example of "only allow you to locally change
the syntax", what would be an example of *EVEN DEEPER
AND MORE PERVASIVE* changes ?!
Your comparison is spurious at best.

What "my comparison" are you blabbering about? My text that
you quoted, and called "pure un-mitigated non-sense", had no
"comparisons", neither my own nor others'. I see YOU attempting
to draw some "comparison" (eminently spurious, to be sure)
between metaclasses and macros...

Your argument simply shows a serious mis-understanding of Macros.
Macros, as has been stated to you *many* times, are similar to
functions. They allow a certain type of abstraction to remove
extraneous code.

Yeah, right. Kindly explain that 'with-condition-maintained'
example and the exalted claims made and implied for it, then.

Based on your example you should be fully campaigning against
Metaclasses, FP constructs in python and Functions as first class
objects. All of these things add complexity to a given program,

"FUD" and "nonsense" (with or without a hyphen) would be serious
understatements in an attempt to characterize *THIS*. *HOW* do
"functions as first class objects" perform this devilish task of
"adding complexity to a given program", for example?! The extra
complexity would be in rules trying to FORBID normal usage of an
object (passing as argument, returning as value, appending to a
list, ...) based on the object's type. There is obviously no
complexity in saying "_WHATEVER_ x happens to stand for, you
can correctly call somelist.append(x)" [passing x as argument to
a method of the somelist which appends x to the list], for example.
The added complexity would come if you had to qualify this with
"UNLESS ..." for whatever value of ``...''.
however they also reduce the total number of lines. Reducing program
length is to date the only effective method I have seen of reducing
complexity.

For some (handwaving-defined) "appropriate" approach to measuring
"length" (and number of lines is most definitely not it), it is ONE
important way. But you're missing another crucial one, which is
the count of interactions, actual and potential, between "parts"
of the program -- the key reason why global effects do not in fact
effectively reduce complexity, but rather bid fair to increase it,
even though they might often "reduce the total [[length]]", is
exactly this. E.g., if large parts of my program needed all kinds
of comparisons between strings (including comparison-related
functionality such as hashing) to be case-insensitive, it might
make my program 'shorter' if I could set case insensitivity as
the global default -- but it might easily mess up totally unrelated
and otherwise stable modules that rely on the usual case sensitive
operations, causing weird, hard-to-trace malfunctionings. I've
mentioned my youthful APL experiences: with its quad-IO to globally
set index origin for arrays, and its quad-I forget what to globally
set comparison tolerance in all comparisons between floating point
numbers, APL was a prime example of this (among other things,
reusability-destroying) global-effects risk. Sure, it was cool
and made my program shorter to be able to check if "a < b" and
have this IMPLICITLY mean "to within N significant digits" (or
whatever) -- but it regularly broke other otherwise-stable modules
and thus destroyed reuse. Not to mention the mind-boggling effects
when a<b, a>b and a=b can ALL be 'true' at once thanks to the
"to within N significant digits" IMPLICIT proviso...
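The hazard is easy to reconstruct in any language; a sketch (the global TOLERANCE and approx_eq below are my own stand-ins for APL's quad-CT, not real APIs):

```python
TOLERANCE = 0.1   # imagine this set globally, quad-CT style

def approx_eq(a, b):
    # 'Equal to within the global tolerance'; note this relation
    # is not transitive, which is the mind-boggler described above.
    return abs(a - b) <= TOLERANCE

a, b, c = 1.00, 1.08, 1.16
print(approx_eq(a, b), approx_eq(b, c))  # True True
print(approx_eq(a, c))                   # False: transitivity is gone
```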

Complexity is not just program length, and reducing program length
not the only important thing in reducing complexity. Removing
*repetition* (boilerplate), sure, that's nice -- and if there was
a way to constrain macros to ONLY do that (as opposed to ending up
with examples such as 'with-condition-maintained', see above) I
would be very interested in seeing it. I doubt there is one, though.

If you truly believe what you are saying, you really should be
programming in Java. Everything is explicit, and most if not all of

Hmmm, one wonders -- are you a liar, or so totally ignorant of what
you're talking about that you don't even KNOW that one of Java's
most "cherished" features is that the "self." is just about ALWAYS
implicit...? Anyway, in my text which you quoted and characterized
as "pure un-mitigated non-sense" I was speaking of UNIFORMITY as
a plus -- and Java's use of { } for example ensures NON-uniformity
on a lexical plane, since everybody has different ideas about where
braces should go:).

But I've NEVER argued in favour of boilerplate, of repetitiousness.
I think that the occasional error that you can catch by forcing
redundancy is generally outweighed by all the errors that just
would not be there if the language let me state things "once, and
only once". So, for example, when I write
x = 23
I most definitely don't WANT to have to redundantly state that,
by the way, there is a variable x, and, whaddyaknow, x refers
to an integer. As to whether it makes more sense to later let
the same name x in the same scope refer to OTHER objects (of
the same type; or, of any type) -- I still don't know; maybe
a single-assignment kind of functional language would in fact be
preferable, or maybe Python's relaxed attitude about re-bindings
is best, or maybe something in-between, allowing re-bindings but
only within a single type's items (for "re-bindings" you may
choose to read "assignments" if you wish, I'm not trying to
reopen THAT particular lexical flamewar for further debate;-).

So far, I'm pretty happy with Python's permissive approach to
mutation and re-binding, but I notice I don't mind (differently
from many others) the inability to re-bind SOME references
(e.g., items of tuples, or lexically-outer names) -- and in
Haskell or ML I don't recall ever feeling confined by the
inability to have the same name refer to different values at
successive times (in the same scope). [I _do_ recall some
unease at being unable to mutate "large" data structures, as
opposed to rebinding simple names, so it's not as if I can
claim any natural affinity for the functional [immutable-data]
approach to programming -- I just wonder if perhaps the current
widespread _emphasis_ on rebinding and mutation may not be a
TAD overdone -- but, enough for this aside].

I do, of course, truly believe in what I'm saying -- what
WOULD have stopped me from taking up any of a zillion different
languages, instead of Python, when I started studying it
about four years ago? Indeed, my opportunities for making
money, and the audience for my books, would be vaster if I
had stuck with what I was mainly using at work then (mostly C++,
some Java, VB, Perl), my academic respectability higher if I
had stuck with Haskell or some ML. But while I don't mind
money, nor fans, I care most about other values -- and the
amount to which "Python fits my brain" and makes me most
comfortable and productive meets and exceeds all claims I had
heard to this effect, PLUS, I have experiential proof (enough
to convince me personally, if nobody else:) that it's just
as comfortable and productive for many others, from programming
newbies to highly experienced professionals. Sure, Java would
let me program my cellphone (which currently doesn't support
Python) -- oh well, I'll have to eschew that crucial pursuit
for a while longer now...


Alex
 
David Rush

Yes, this discussion is frustrating. It's deeply frustrating to hear
someone without extensive experience with Macros arguing why they are
so destructive.

You know I think that this thread has so far set a comp.lang.* record for
civility in the face of a massively cross-posted language comparison
thread. I was even wondering if it was going to die a quiet death, too.

Ah well, we all knew it was too good to last. Have at it, lads!

Common Lisp is an ugly language that is impossible to understand with
crufty semantics

Scheme is only used by ivory-tower academics and is irrelevant to real
world programming

Python is a religion that worships at the feet of Guido van Rossum,
combining the syntactic flaws of lisp with a bad case of feeping
creaturisms taken from languages more civilized than itself

There. Is everyone pissed off now?

david rush
 
Marco Antoniotti

David said:
My answer sucked in a couple ways.

(1) As Bengt Richter pointed out up-thread, I should have changed David
Eppstein's names 'filter' and 'iter' to something other than the
built-in names.

(2) The function categorize_compose() IS named correctly, but it doesn't
DO what I said it would. If you want to fulfill ALL the filters, you
don't want to compose them, but... well, 'all()' them:

| def categorize_jointly(preds, it):
|     results = [[] for _ in range(len(preds))]
|     for x in it:
|         results[all(preds)(x)].append(x)
|     return results

Now if you wonder what the function 'all()' does, you could download:

http://gnosis.cx/download/gnosis/util/combinators.py

But the relevant part is:

from operator import mul, add, truth
apply_each = lambda fns, args=[]: map(apply, fns, [args]*len(fns))
bools = lambda lst: map(truth, lst)
bool_each = lambda fns, args=[]: bools(apply_each(fns, args))
conjoin = lambda fns, args=[]: reduce(mul, bool_each(fns, args))
all = lambda fns: lambda arg, fns=fns: conjoin(fns, (arg,))

For 'lazy_all()', look at the link.
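In today's Python the builtin `all` already covers the conjunction, so the function can be exercised without the combinator library (a Python 3 re-sketch of the idea; the quoted code above is Python 2):

```python
def categorize_jointly(preds, it):
    # Two buckets: index 0 for items failing some predicate,
    # index 1 for items satisfying all of them (True == 1).
    results = [[], []]
    for x in it:
        results[all(p(x) for p in preds)].append(x)
    return results

failed, passed = categorize_jointly(
    [lambda n: n > 0, lambda n: n % 2 == 0],
    [-2, -1, 0, 1, 2, 3, 4])
print(passed)  # [2, 4]
print(failed)  # [-2, -1, 0, 1, 3]
```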

See, Python is Haskell in drag.

Come on. Haskell has a nice type system. Python is an application of
Greenspun's Tenth Rule of programming.

Cheers
 
Marco Antoniotti

Carlo said:
sure, but it seems like no one was able to let CLOS have
(virtual) inner classes,
methods inside methods,
virtual methods (yeah I know about those stupid generic functions :),
method overloading,

From what you are saying it is obvious that you do not know what you
are talking about.

True, you do not have "inner" classes, but that has never stopped
anybody from writing good code. As for your comments on methods and
generic functions it is obvious that you do not know what multiple
dispatching is (yes, there is an ugly hacked-up Python library to do
that floating around; I do not know if it will make it into 3.0), so your
comment loses value immediately.
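For readers who have not met the term: multiple dispatch chooses an implementation based on the classes of *all* arguments, not just the first. A bare-bones Python sketch (my own names; a real library would also handle inheritance):

```python
_methods = {}

def defimpl(*types):
    # Register an implementation keyed on the argument classes.
    def register(fn):
        _methods[types] = fn
        return fn
    return register

def collide(a, b):
    # Dispatch on the classes of both arguments at once.
    return _methods[type(a), type(b)](a, b)

class Asteroid: pass
class Ship: pass

@defimpl(Asteroid, Ship)
def _(a, b): return "ship destroyed"

@defimpl(Asteroid, Asteroid)
def _(a, b): return "asteroids merge"

print(collide(Asteroid(), Ship()))      # ship destroyed
print(collide(Asteroid(), Asteroid()))  # asteroids merge
```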
A decent API (I tried playing with it.. it doesn't even have a freaking
date library as standard ;-p


Apart from the fact that the language has GET-UNIVERSAL-TIME,
DECODE-UNIVERSAL-TIME etc etc, you can get a nice and portable (across
all n > 1.8 CL implementations) date parsing library at

http://www.cliki.net/net-telent-date


Yes I agree with the compile time

The term "compile" should already make you think.
macro expansion is a nice thing.
However, if I want to make some serious changes to the structure of
objects and classes (i.e. create a new kind of object), then I have to
spend a long time finding out how the CLOS people hacked together their
representation of classes, methods, method calls, etc. It has been far
easier for me to just make some small changes using __getattribute__ and
metaclasses in Python. So in that respect I'm not really sure the macro
idea is advantageous for anything other than 'straight away' macros...

And how exactly does CLOS forbid you to do the same? You can do that
using accessor, reader and writer generic functions (ooops, I forgot
that you do not know enough about them :) ) If that is not enough, the
CLOS Metaobject Protocol is available in practically all major CL
implementations (and that is more than the 1.8 Python implementations
out there). And it seems to me that
yes this mail is provocative.. please count slowly to 10 before replying
if you disagree with my point of view (and I know Pascal will disagree
;-) ... not that I've ever seen him angry ;-)


I counted until 42 :)

Cheers
 
Doug Tolton

Doug Tolton wrote:
...

Why, thanks! Nice to see that I'm getting on the nerves of _some_
people, too, not just having them get on mine.

Yes, this discussion is frustrating. It's deeply frustrating to hear
someone without extensive experience with Macros arguing why they are
so destructive. Particularly to hear the claim that Macros make it
impossible to have large teams work on a project, while at the same
time supporting features in Python that make it far more difficult to
share code between people (i.e., white-space delimiting, the ability to
rebind internals, Metaclasses). Personally I'm not against any of
these features, however they all suffer from serious potential
drawbacks.
Nope: a metaclass only affects (part of the semantics of) the
classes that instantiate it. No "globally" about it. When I
write a class I can explicitly control what metaclass it uses,
or inherit it. E.g., by writing 'class My(object):', with no
explicit metaclass, I ensure type(My)==type(object). The type
of the built-in named 'object' is the built-in named 'type'
(which most custom metaclasses subclass), which is also the
type of most other built-in types (numbers, strings, list,
tuple, dict, functions, methods, module, file, ...). I.e.,
your assertion is pure un-mitigated FUD.
Please explain to me how, according to your logic, a semantic change to
the language is good, but a syntactic change is bad. Your logic is
extremely confusing to me. On the one hand you think Macros are bad
because I can define a construct such as
(do-something-useful-here
  arg1
  arg2
  arg3)
which operates according to all the regular semantic rules. In fact
there is even an explicit construct that will show me *exactly* what
this code is doing. Yet you apparently think using Metaclasses to
change the underlying semantics is somehow ok, because you can check
to see if it's built-in? So are you saying that only built-in
constructs are good to use? If someone else gives you a class to use
which uses Metaclasses to change how it operates for some reason or
another, are you ok with that? What if they need to re-bind some of
the builtins to do something? Because you can't prevent that in python
either. In fact any piece of python code that runs on your system
could do that, yet you are ok with that?
"Locally"? Not globally? Care to explain? Each 'invocation'
(expansion) of a macro must occur in a particular locus, sure,
but isn't the _point_ of (e.g.) defining a 'loop-my-way' macro
that every occurrence of 'loop-my-way' is such an expansion?
yes, but the point is that if I just write 'loop-my-way', it doesn't
change existing code in unexpected ways. It only affects new code
that is written using 'loop-my-way'. Whereas re-binding the builtins
*will* change existing code.
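This point can be demonstrated concretely (a sketch using the Python 3 `builtins` module; in the Python of this thread the module was `__builtin__`):

```python
import builtins

def count_items(seq):
    # Innocent pre-existing code that relies on the builtin len.
    return len(seq)

print(count_items([1, 2, 3]))            # 3

real_len = builtins.len
builtins.len = lambda seq: 42            # a global re-binding...
try:
    print(count_items([1, 2, 3]))        # 42: existing code misbehaves
finally:
    builtins.len = real_len              # restore the real builtin
```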
As for that mention of "the Syntax" AS IF somehow contrasted
with the "underlying semantics" just before, don't even bother
to try explaining: one example of macros' wonders offered by a
particularly vocal and emotional advocate was a macro
'with-condition-maintained' that was somehow supposed to make
whatever alterations might be needed in the control program of
a reactor in order to regulate temperature -- and it was
passed that code as three calls to functions (or expansions
of macros) NOT defined inside it, so how it could possibly
work "only...locally", when to do anything at all it MUST
necessarily find, "expand", and alter the local instantiations
of those functions (or macros)...?!
I'm not going to attempt to explain with-condition-maintained, as you
are clearly ignoring the general idea in favor of nitpicking
non-essential details of a hypothetical construct. When you descend
to the statements you have made about that construct, it's no longer
worth continuing discussion of it.

That's my exact problem though, your statements continually brand you
as ignorant of what macros are and how they operate on a fundamental
level, yet for some reason you feel qualified to extol their evils as
if you actually have significant experience with them.
If that's an example of "only allow you to locally change
the syntax", what would be an example of *EVEN DEEPER
AND MORE PERVASIVE* changes ?!
No idea what you are talking about here.
What "my comparison" are you blabbering about? My text that
you quoted, and called "pure un-mitigated non-sense", had no
"comparisons", neither my own nor others'. I see YOU attempting
to draw some "comparison" (eminently spurious, to be sure)
between metaclasses and macros...
The only reason you think it's spurious is because of your
fundamentally flawed conception of Macros.
Yeah, right. Kindly explain that 'with-condition-maintained'
example and the exalted claims made and implied for it, then.
I thought you didn't want it explained to you? If you are serious
about wanting to know what the *point* of the code snippet was, then
I'll explain it. If on the other hand you are going to make some
ridiculous argument about how with-condition-maintained isn't in fact
hooked to any control room circuitry, then forget it.
Based on your example you should be fully campaigning against
Metaclasses, FP constructs in python and Functions as first class
objects. All of these things add complexity to a given program,

"FUD" and "nonsense" (with or without a hyphen) would be serious
understatements in an attempt to characterize *THIS*. *HOW* do
"functions as first class objects" perform this devilish task of
"adding complexity to a given program", for example?! The extra
complexity would be in rules trying to FORBID normal usage of an
object (passing as argument, returning as value, appending to a
list, ...) based on the object's type. There is obviously no
complexity in saying "_WHATEVER_ x happens to stand for, you
can correctly call somelist.append(x)" [passing x as argument to
a method of the somelist which appends x to the list], for example.
The added complexity would come if you had to qualify this with
"UNLESS ..." for whatever value of ``...''.
Hmm... what if using a class with Metaclasses behaves in a totally
non-standard way? What if a function re-binds the builtins? What if
they overuse FP constructs and nest 50 maps and filters? Are you ok
with all of these things? They are certainly more confusing than
Macros. To make the statement that *any* technology can't be abused
is foolish. To make that claim implies there is no correct usage,
only usage. In other words if there is no correct way to use
Metaclasses or re-bind builtins then any way that someone sees fit to
do it *is* the right way. We all know that is a ridiculous claim.
Macros are like any other sufficiently powerful technology. If they
aren't used right, they will complicate a program not simplify it.

I believe the crux of our difference is that you don't want to give
expressive power because you believe it will be misused. I on the
other hand want to give expressive power because I believe it could be
used correctly most of the time. For the times when it's not, well
that's why I have debugging skills. Sadly not everyone uses looping
the way I would, but using my brain I can figure out what they are
doing.
For some (handwaving-defined) "appropriate" approach to measuring
"length" (and number of lines is most definitely not it), it is ONE

Both from my experience and from Fred Brooks, it's the only actual way
I've seen of measuring the time it will take to write a program.
important way. But you're missing another crucial one, which is
the count of interactions, actual and potential, between "parts"
of the program -- the key reason why global effects do not in fact
effectively reduce complexity, but rather bid fair to increase it,
even though they might often "reduce the total [[length]]", is
exactly this.

I understand this point very well. That's why I believe in building
layered software, and using good higher-order constructs to achieve
this. As I've said before, your statements reveal your fundamental
misunderstanding of the way Macros work. To support Metaclasses,
classes, functions, first-class functions, etc. as tools to support this
concept while at the same time reviling Macros is simply showing an
uneducated bias about Macros. I wouldn't be surprised to hear you
respond with some argument about how you've read the writings by
people who have used Macros (as you've done in the past), but I
believe you do not have sufficient understanding to make the claims
you are making. If you really understood macros, I don't believe you
would be making such statements.
E.g., if large parts of my program needed all kinds
of comparisons between strings (including comparison-related
functionality such as hashing) to be case-insensitive, it might
make my program 'shorter' if I could set case insensitivity as
the global default -- but it might easily mess up totally unrelated
and otherwise stable modules that rely on the usual case sensitive
operations, causing weird, hard-to-trace malfunctionings. I've
mentioned my youthful APL experiences: with its quad-IO to globally
set index origin for arrays, and its quad-I forget what to globally
set comparison tolerance in all comparisons between floating point
numbers, APL was a prime example of this (among other things,
reusability-destroying) global-effects risk. Sure, it was cool
and made my program shorter to be able to check if "a < b" and
have this IMPLICITLY mean "to within N significant digits" (or
whatever) -- but it regularly broke other otherwise-stable modules
and thus destroyed reuse. Not to mention the mind-boggling effects
when a<b, a>b and a=b can ALL be 'true' at once thanks to the
"to within N significant digits" IMPLICIT proviso...

Well, that was a long-winded digression into something that is
completely unrelated to Macros. It seems like a good argument for why
re-binding the builtins is bad, though.
Complexity is not just program length, and reducing program length
not the only important thing in reducing complexity. Removing
*repetition* (boilerplate), sure, that's nice -- and if there was
a way to constrain macros to ONLY do that (as opposed to ending up
with examples such as 'with-condition-maintained', see above) I
would be very interested in seeing it. I doubt there is one, though.
I agree that reducing complexity is the goal. I disagree that you can
*ever* guarantee a high order construct is always used correctly
though.
Hmmm, one wonders -- are you a liar, or so totally ignorant of what
you're talking about that you don't even KNOW that one of Java's
most "cherished" features is that the "self." is just about ALWAYS
implicit...? Anyway, in my text which you quoted and characterized
as "pure un-mitigated non-sense" I was speaking of UNIFORMITY as
a plus -- and Java's use of { } for example ensures NON-uniformity
on a lexical plane, since everybody has different ideas about where
braces should go:).
Where braces should go is a trivial issue. However, if braces are an
issue that seriously concerns you, then I can see why macros are giving
you a heart attack.
But I've NEVER argued in favour of boilerplate, of repetitiousness.
I think that the occasional error that you can catch by forcing
redundancy is generally outweighed by all the errors that just
would not be there if the language let me state things "once, and
only once". So, for example, when I write
x = 23
I most definitely don't WANT to have to redundantly state that,
by the way, there is a variable x, and, whaddyaknow, x refers
to an integer. As to whether it makes more sense to later let
the same name x in the same scope refer to OTHER objects (of
the same type; or, of any type) -- I still don't know; maybe
a single-assignment kind of functional language would in fact be
preferable, or maybe Python's relaxed attitude about re-bindings
is best, or maybe something in-between, allowing re-bindings but
only within a single type's items (for "re-bindings" you may
choose to read "assignments" if you wish, I'm not trying to
reopen THAT particular lexical flamewar for further debate;-).

So far, I'm pretty happy with Python's permissive approach to
mutation and re-binding, but I notice I don't mind (differently
from many others) the inability to re-bind SOME references
(e.g., items of tuples, or lexically-outer names) -- and in
Haskell or ML I don't recall ever feeling confined by the
inability to have the same name refer to different values at
successive times (in the same scope). [I _do_ recall some
unease at being unable to mutate "large" data structures, as
opposed to rebinding simple names, so it's not as if I can
claim any natural affinity for the functional [immutable-data]
approach to programming -- I just wonder if perhaps the current
widespread _emphasis_ on rebinding and mutation may not be a
TAD overdone -- but, enough for this aside].

I do, of course, truly believe in what I'm saying -- what
WOULD have stopped me from taking up any of a zillion different
languages, instead of Python, when I started studying it
about four years ago? Indeed, my opportunities for making
money, and the audience for my books, would be vaster if I
had stuck with what I was mainly using at work then (mostly C++,
some Java, VB, Perl), my academic respectability higher if I
had stuck with Haskell or some ML. But while I don't mind
money, nor fans, I care most about other values -- and the
amount to which "Python fits my brain" and makes me most
comfortable and productive meets and exceeds all claims I had
heard to this effect, PLUS, I have experiential proof (enough
to convince me personally, if nobody else:) that it's just
as comfortable and productive for many others, from programming
newbies to highly experienced professionals. Sure, Java would
let me program my cellphone (which currently doesn't support
Python) -- oh well, I'll have to eschew that crucial pursuit
for a while longer now...
The ironic thing is that I'm not bashing Python. I really like
python. It's a great language. I think we both use Python and Lisp
(for me) for the same reasons. If I wanted a higher paying job I'd be
using Java. I have aspirations to write books as well, I agree that
Python and Lisp aren't the biggest markets, and yet I use them because
they fit my brain also.

What I am in point of fact upset by is your constant barrage against
Macros. I feel that your stance is based on ignorance and
misinformation. You certainly don't have significant first-hand
exposure to Lisp-style Macros or you wouldn't be making statements
that were so obviously incorrect. Why don't you seriously try to
learn them? If you don't care to, why argue about them so much? I
haven't seen anyone bring up (I could've missed it) putting Macros
into Python again. I personally don't think Macros would work very
well in Python, at least not as well as they do in Lisp. So
understanding that I'm not pushing for Macros in Python, why are you
so vehement against them? Are you on a campaign to get Macros out of
Lisp?

Doug
Doug Tolton
(format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")
 

Daniel P. M. Silva

<posted & mailed>

Alex said:
Daniel P. M. Silva wrote:
...

Right: you need to code this very differently, namely:
with_directory("/tmp", do_something)
*deferring* the call to do_something to within the with_directory
function. Python uses strict evaluation order, so if and when you
choose to explicitly CALL do_something() it gets called,

So, I would code:

import os

def with_directory(thedir, thefunc, *args, **kwds):
    pwd = os.getcwd()
    os.chdir(thedir)
    try: return thefunc(*args, **kwds)
    finally: os.chdir(pwd)

this is of course a widespread idiom in Python, e.g. see
unittest.TestCase.assertRaises for example.
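A self-contained version of this idiom can be run directly; `tempfile.gettempdir()` here merely stands in for whatever target directory you care about:

```python
import os
import tempfile

def with_directory(thedir, thefunc, *args, **kwds):
    # Save the current directory, switch to thedir, call thefunc,
    # and always switch back -- even if thefunc raises.
    pwd = os.getcwd()
    os.chdir(thedir)
    try:
        return thefunc(*args, **kwds)
    finally:
        os.chdir(pwd)

before = os.getcwd()
inside = with_directory(tempfile.gettempdir(), os.getcwd)
```

After the call, `inside` names the temporary directory while the process is back in its original working directory, whether or not the callable raised.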

The only annoyance here is that there is no good 'literal' form for
a code block (Python's lambda is too puny to count as such), so you
do have to *name* the 'thefunc' argument (with a 'def' statement --
Python firmly separates statements from expressions).

That was my point. You have to pass a callable object to with_directory,
plus you have to save in that object any variables you might want to use,
when you'd rather say:

x = 7
with_directory("/tmp",
    print "well, now I'm in ", os.getcwd()
    print "x: ", x
    x = 3
)
A "using" statement (which would take a specialized object, surely not
a string, and call the object's entry/normal-exit/abnormal-exit methods)
might often be a good alternative to try/finally (which makes no provision
for 'entry', i.e. setting up, and draws no distinction between normal
and abnormal 'exits' -- often one doesn't care, but sometimes yes). On
this, I've seen some consensus on python-dev; but not (yet?) enough on
the details. Consensus is culturally important, even though in the end
Guido decides: we are keen to ensure we all keep using the same language,
rather than ever fragmenting it into incompatible dialects.

The point is that the language spec itself is changed (along with the
interpreter in C!) to add that statement. I would be happier if I could
write syntax extensions myself, in Python, and if those extensions worked
on CPython, Jython, Python.Net, Spy, etc.
Some people use Python's hooks to create little languages inside Python
(eg. to change the meaning of instantiation), which are not free of
problems:

class Object(object):
    def __init__(this, *args, **kwargs):

[invariably spelt as 'self', not 'this', but that's another issue]

        this.rest = args
        this.keys = kwargs

def new_obj_id(count=[0]):
    count[0] = count[0] + 1
    return count[0]

def tag_obj(obj, id):
    obj.object_id = id
    return obj

def obj_id(obj): return obj.object_id

type.__setattr__(Object, "__new__", staticmethod(lambda type, *args:
    tag_obj(object.__new__(type), new_obj_id()))) ...

# forgot to check for this case...
print Object(foo="bar")

It's not an issue of "checking": you have written (in very obscure
and unreadable fashion) a callable which you want to accept (and
ignore) keyword arguments, but have coded it in such a way that it
in fact refuses keyword arguments. Just add the **kwds after the
*args. This bug is not really related to "little languages" at all:
you might forget to specify arguments which you do want your callable
to accept and ignore in a wide variety of other contexts, too.
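The fix Alex describes is a one-token change; a minimal sketch (the function names here are made up for illustration):

```python
def strict(*args):
    # accepts positional arguments only; any keyword
    # argument raises TypeError
    return len(args)

def lenient(*args, **kwds):
    # also accepts -- and silently ignores -- any
    # keyword arguments
    return len(args)
```

`lenient(1, 2, foo="bar")` returns 2, while the same call on `strict` raises a TypeError, which is exactly the bug in the lambda above.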

I think changing the meaning of __new__ is a pretty big language
modification...

- Daniel
 

Alexander Schmolck

Pascal Bourguignon said:
Well, I would say that kanji is badly designed, compared to latin
alphabet. The voyels are composed with consones (with diacritical
marks) and consones are written following four or five groups with
additional diacritical marks to distinguish within the groups. It's
more a phonetic code than a true alphabet.

Huh? You seem to be confused (BTW French is misleading here: it's vowels and
consonants in English). *Kanji* are not phonetic, you seem to be talking about
*kana*. And the blanket claim that Japanese spelling in kana is badly designed
compared to say, English orthography seems really rather dubious to me.

'as
 

Marco Antoniotti

Dirk said:
I can't parse this sentence, but of course you can also use HOFs in Lisp
(all flavours). The interesting part is that most Lisp'ers don't seem
to use them, or even to know that you can use them, and use macros instead.

The only real advantage of macros over HOFs is that macros are guaranteed
to be executed at compile time. A good optimizing compiler (like GHC
for Haskell) might actually also evaluate some expressions including
HOFs at compile time, but you have no control over that.
HOFs can of course be used directly in CL, and you can use macros to
do everything one could use HOFs for (if you really want).

The advantage of HOFs over macros is simplicity:

As R^nRS shows, simplicity leads to language specs without useful things
(like records/struct) in them.

You want to make things simple, not any simpler (was it Einstein who
said that?)
You don't need additional
language constructs (which may be different even for different Lisp
dialects, say),

As we well know, there is now one dominant Lisp, which is Common by
name. (No. ELisp does not count as you do (require 'cl) in your .emacs
file) This argument is moot.
and other tools (like type checking) are available for
free;

Yes. Type Checking is in CMUCL/SBCL.
and the programmer doesn't need to learn an additional concept.

The programmer needs to learn to use the tool at its best. If your tool
is limited you just have to learn less.

Cheers
 

Pascal Bourguignon

(defconstant -> '-> "More sugar")

;; Example usage
(convert *thing* -> (class-of *other-thing*))

Of course, these are lame examples, but they show that Lisp *can*
incorporate little ascii-picture-symbols. Good examples would
necessarily be very domain-dependant.

Have a look at DrScheme. There, you can use real images (gif, jpg) as
values. It should not be too difficult to use them as symbol names
too...
 

Alex Martelli

Daniel P. M. Silva wrote:
...
...
That was my point. You have to pass a callable object to with_directory,
plus you have to save in that object any variables you might want to use,
when you'd rather say:

x = 7
with_directory("/tmp",
    print "well, now I'm in ", os.getcwd()
    print "x: ", x
    x = 3
)

I'm definitely NOT sure I'd "rather" use this specific syntax to pass
a block of code to with_directory (even in Ruby, I would delimit the
block of code, e.g. with do/end).

I *AM* sure I would INTENSELY HATE not knowing whether

foo('bar', baz())

is being evaluated by normal strict rules -- calling baz and passing
the result as foo's second argument -- or rather a special form in
which the 'baz()' is "a block of code" which foo may execute zero or
more times in special ways -- depending on how foo happens to be
bound at this time. *SHUDDER*. Been there, done that, will NEVER
again use a language with such ambiguities if I can possibly help it.

The point is that the language spec itself is changed (along with the
interpreter in C!) to add that statement. I would be happier if I could
write syntax extensions myself, in Python, and if those extensions worked
on CPython, Jython, Python.Net, Spy, etc.

So, I hope the cultural difference is sharply clear. To us, consensus
is culturally important, we are keen to ensure we all keep using the
same language; *you* would be happier if you could use a language that
is different from those of others, thanks to syntax extensions you
write yourself. Since I consider programming to be mainly a group
activity, and the ability to flow smoothly between several groups to
be quite an important one, I'm hardly likely to appreciate the divergence
in dialects encouraged by such possibilities, am I?

I think changing the meaning of __new__ is a pretty big language
modification...

Surely you're jesting? Defining a class's __new__ and __init__
just means defining the class's *constructor*, to use the term
popular in C++ or Java; as I can change any other method, so I
can change the constructor, of course (classes being mutable
objects -- by design, please note, not by happenstance). "Pretty
big language modification" MY FOOT, with all due respect...


Alex
 

Thomas F. Burdick

Carlo v. Dango said:
sure, but it seems like noone was able to let CLOS have
(virtual) inner classes,

This is kind of like saying we weren't able to have setjmp/longjmp;
yeah, but doing so makes no sense.
methods inside methods,

There was a proposal to add lexically-scoped methods, but it got
tossed because no one liked it.
virtual methods (yeah I know about those stupid generic functions :),

As has been already stated, we only have "virtual methods".
method overloading,

How could you have both noncongruent argument lists, and multiple
dispatch? With an either/or like that, Lisp chose the right one.
A decent API (I tried playing with it.. it doesn't even have a freaking
date library as standard ;-p

Who does? Have all that stuff standard, I mean. Python doesn't even
have a standard. We have some date support in ANSI -- get the rest from your
vendor (commercial or free).
yes this mail is provocative..

Seems more ignorant, to me. I guess when you're conversing on an
archived forum, that can seem like the same thing, though.

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'
 

Thomas F. Burdick

Alexander Schmolck said:
And the blanket claim that Japanese spelling in kana is badly
designed compared to say, English orthography seems really rather
dubious to me.

And you can tell by his phrasing that he was straining against writing
that in English ;-)

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'
 

Joe Marshall

Alex Martelli said:
Yeah, right. Kindly explain that 'with-condition-maintained'
example and the exalted claims made and implied for it, then.

I posted some lisp macros used in a production environment.
(the msgid <[email protected]> )

The `with-condition-maintained' was a hypothetical, but the ones I
posted are in actual use. The claim is that these macros
significantly *reduce* the intellectual burden of the people that
write and maintain the code that uses them.
 

Erann Gat

method overloading,

How could you have both noncongruent argument lists, and multiple
dispatch?

C++ seems to manage it somehow.

#include <stdio.h>

void foo(int x, int y) { printf("1\n"); }
void foo(double x, int y) { printf("2\n"); }
void foo(char* x) { printf("3\n"); }

int main() {
    foo(1, 2);
    foo(1.2, 2);
    foo("foo");
}

compiles and runs without complaint.
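For contrast, a rough Python sketch of the same overload set, dispatching by hand on the runtime types of the arguments (where the C++ compiler resolves each call statically):

```python
def foo(*args):
    # Pick an "overload" by inspecting the runtime types of
    # the arguments, mimicking the three C++ definitions above.
    if len(args) == 2 and isinstance(args[0], int):
        return 1
    if len(args) == 2 and isinstance(args[0], float):
        return 2
    if len(args) == 1 and isinstance(args[0], str):
        return 3
    raise TypeError("no matching overload")

foo(1, 2)    # -> 1
foo(1.2, 2)  # -> 2
foo("foo")   # -> 3
```

The three calls select the same "overloads" as the C++ version; the difference is only *when* the selection happens.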

E.
 

Daniel Silva

I'm definitely NOT sure I'd "rather" use this specific syntax to pass
a block of code to with_directory (even in Ruby, I would delimit the
block of code, e.g. with do/end).

I *AM* sure I would INTENSELY HATE not knowing whether

foo('bar', baz())

is being evaluated by normal strict rules -- calling baz and passing
the result as foo's second argument -- or rather a special form in
which the 'baz()' is "a block of code" which foo may execute zero or
more times in special ways -- depending on how foo happens to be
bound at this time. *SHUDDER*. Been there, done that, will NEVER
again use a language with such ambiguities if I can possibly help it.

My mistake in choosing stx_id(stmt). Maybe foo{'bar', baz()} would be
better, or:

    foo bar:
        baz()

So, I hope the cultural difference is sharply clear. To us, consensus
is culturally important, we are keen to ensure we all keep using the
same language; *you* would be happier if you could use a language that
is different from those of others, thanks to syntax extensions you
write yourself. Since I consider programming to be mainly a group
activity, and the ability to flow smoothly between several groups to
be quite an important one, I'm hardly likely to appreciate the divergence
in dialects encouraged by such possibilities, am I?

Yes, we disagree about how we use a language. It's too bad the choice
isn't left to the programmer, though.
Surely you're jesting? Defining a class's __new__ and __init__
just means defining the class's *constructor*, to use the term
popular in C++ or Java; as I can change any other method, so I
can change the constructor, of course (classes being mutable
objects -- by design, please note, not by happenstance). "Pretty
big language modification" MY FOOT, with all due respect...

__init__ is the constructor, but isn't __new__ the allocator or factory?

A mutable __new__ means you are not guaranteed that Cls() gives you a Cls
object.
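Daniel's point takes only a few lines to demonstrate; a deliberately pathological sketch (the class name is made up):

```python
class Surprise(object):
    def __new__(cls):
        # __new__ is the allocator/factory: nothing forces it to
        # return an instance of the class being "constructed".
        return "not a Surprise at all"

obj = Surprise()
```

Here `Surprise()` yields a plain string; note that CPython only calls `__init__` when `__new__` returns an instance of the class, so `__init__` is silently skipped as well.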

- Daniel
 

Andrew Dalke

Doug Tolton:
I believe the crux of our difference is that you don't want to give
expressive power because you believe it will be misused. I on the
other hand want to give expressive power because I believe it could be
used correctly most of the time. For the times when it's not, well
that's why I have debugging skills. Sadly not everyone uses looping
the way I would, but using my brain I can figure out what they are
doing.

That point has been made over and over to you. The argument is
that expressive power for a single developer can, for a group of
developers and especially those comprised of people with different
skill sets and mixed expertise, reduce the overall effectiveness of the
group.

If this is indeed the crux, then any justification which says "my brain"
and "I" is suspect, because that explicitly ignores the argument. By
comparison, Alex's examples bring up
- teaching languages to others
- interference between his code and others' (the APL example)
- production development
"Imagine a group of, say, a dozen programmers, working together ...
to develop a typical application program of a few tens of thousands
of
function points -- developing about 100,000 new lines of delivered
code
plus about as much unit tests, and reusing roughly the same amount"
- writing books for other people

which at the very least suggests the expertise and background
by which to evaluate the argument. It may be that his knowledge of
how and when to use macros is based on the statements of people he
respects rather than personal experience, but given the discussions on
this topic and the exhibited examples of when macros are appropriately
used, it surely does seem that metaclasses, higher-level functions, and
iterators can be used to implement a solution with a roughly equal amount
of effort and clarity. The only real advantage to macros I've seen is the
certainty of "compile-time" evaluation, hence better performance than
run-time evaluation.

Alex:
You:
Both from my experience and Fred Brooks it's the only actual way I've
seen of measuring the time it will take to write a program.

You mean "estimating"; for measuring I suspect you can use a
combination of a clock and a calendar. (This from a guy who recently
posted that the result of 1+1 is 4. ;)

You should use McConnell as a more recent reference than Brooks.
(I assume you are arguing from Mythical Man Month? Or from his
more recent writings?) In any case, in Rapid Development McConnell
considers various alternatives then suggests using LOC, on the view
that LOC is highly correlated with function points (among 3rd
generation programming languages! see below) and that LOC has a
good correlation to development time, excluding extremes like APL
and assembly. However, his main argument is that LOC is an easy
thing to understand.

The tricky thing about using McConnell's book is the implications
of table 31-2 in the section "Using Rapid Development Languages",
which talks about languages other than the 3rd generation ones used
to make his above estimate.

Table 31-2 shows the approximate "language levels" for a wider
variety of languages than Table 31-1. The "language level" is
intended to be a more specific replacement for the level implied
by the phrases "third-generation language" and "fourth-generation
language." It is defined as the number of assembler statements
that would be needed to replace one statement in the higher-level
language. ...

The numbers ... are subject to a lot of error, but they are the best
numbers available at this time, and they are accurate enough to
support this point: from a development point of view, you should
implement your projects in the highest-level language possible. If
you can implement something in C, rather than assembler, C++
rather than C, or Visual Basic rather than C++, you can develop
faster.

And here's Table 31-2

                            Statements per
Language             Level  Function Point
--------             -----  --------------
Assembler              1         320
Ada 83                 4.5        70
AWK                   15          25
C                      2.5       125
C++                    6.5        50
Cobol (ANSI 85)        3.5        90
dBase IV               9          35
spreadsheets         ~50           6
Focus                  8          40
Fortran 77             3         110
GW Basic               3.25      100
Lisp                   5          65
Macro assembler        1.5       215
Modula 2               4          80
Oracle                 8          40
Paradox                9          35
Pascal                 3.5        90
Perl                  15          25
Quick Basic 3          5.5        60
SAS, SPSS, etc.       10          30
Smalltalk (80 & V)    15          20
Sybase                 8          40
Visual Basic 3        10          30

Source: Adapted from data in 'Programming Languages
Table' (Jones 1995a)


I'll use Perl as a proxy for Python; given that that was pre-OO
Perl I think it's reasonable that that sets a minimum level for
Python. Compare the Lisp and Perl numbers

Lisp    5    65
Perl   15    25

and the difference in "statements per function point" (which isn't
quite "LOC per function point") is striking. It suggests that
Python is more than twice as concise as Lisp, so if LOC is
used as the estimate for implementation time then it's a strong
recommendation to use Python instead of Lisp because it
will take less time to get the same thing done. And I do believe
Lisp had macros back in the mid-1990s.
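For reference, the arithmetic behind "more than twice as concise", using the table's figures:

```python
# Statements per function point, from Table 31-2.
lisp_stmts = 65
perl_stmts = 25  # Perl as a proxy for Python, per the argument above

# How many times more statements Lisp needs per function point.
ratio = lisp_stmts / float(perl_stmts)  # 2.6
```

So, taken at face value, the table says Lisp needs about 2.6 statements for every one Perl (hence, by proxy, Python) statement delivering the same functionality.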

Sadly, this is a secondary reference and I don't have a
copy of
Jones, Capers, 1995a. "Software Productivity Research
Programming Languages Table," 7th ed. March 1995.
and the referenced URL of www.spr.com/library/langtbl.htm
is no longer valid and I can't find that table on their site.

Well, that was a long-winded digression into something that is
completely unrelated to Macros. Seems like a good argument for why
rebinding the builtins is bad, though.

It was a long winded digression into how LOC can be a
wrong basis by which to judge the appropriateness of a
language feature.
Where braces should go is a trivial issues. However if braces is an
issue that seriously concerns you then I can see why macros are giving
you a heart attack.

See that smiley and the "--"? This is a throwaway point at the end
of the argument, and given Alex's noted verboseness, if it was a
serious point he would have written several pages on the topic.

Andrew