Python syntax in Lisp and Scheme

Marco Antoniotti

Andrew said:
Edi Weitz ....


I'm already making my living from doing Python, so I've got an
incentive to stay with it. ;)

In the scientific conferences I attend, no one I've seen uses Lisp
for their work, excepting those old enough that they started before
there were other high-quality high-level languages.

Maybe those of us "old enough" know that some high-level high-quality
languages are better than others :)

No one has told me they would hire me for contract work "if only
you were a Lisp programmer."

If doing what are commonplace tasks requires that I buy a commercial
Lisp, then it's much less likely others will use my code. I like
having others use my code.

(Then why do I use Python? It's a tradeoff, since writing Java/C++
is just too tedious. And I like the people in Python (Hi Laura!).
And I'm picky about the domain -- I like doing computational life sciences.)

Well, I am doing that too. Do you know what is the core of
Biocyc/Ecocyc/Metacyc written in?
And I'm doing it for Python. For my domain, it seems like a much
better language choice, for reasons I've mentioned here several times.

Your reasons seem to boil down to the "I do not know Lisp enough" thingy
you hear over and over. I know I sound trite, but that is exactly the
point. Meanwhile, CL languishes because people don't understand
Greenspun's Tenth :)

I know I am whining :) I *am* an old geezer :)

Technically I'm cross-posting from c.l.py. And I actually complain
for other reasons. ;)




So far I've made ... 4(?) half-hearted attempts at learning Lisp.
And 1 at learning Haskell. And 0.1 at learning OCaml.




"A rising tide lifts all boats". The same is true in Python, in
Java, in Ruby, in ...

With the main difference that Greenspun's Tenth Rule of Programming does
not apply in only one case :)

Cheers

Marco Antoniotti

Andrew said:
Thomas F. Burdick:



Not at the mercy of your vendor unless you want to use something
which isn't in the standard, like unicode (esp "wide" unicode, >16bit),

Given that there are more than 1.84 implementations of Common Lisp, yes,
you are at the mercy of the implementor to have access to a good UNICODE
implementation. (Now, when is the last time I really really really
needed to write error messages in Tamil script? >:| )
regular expressions (esp. regexps of unicode),

There are several completely portable regexp libraries. For UNICODE
see above.

There are at least two completely portable sockets libraries for CL.

Last I checked UFFI did pretty much the right thing.
But that's just flaming -- ignore me. ;)

I am a fireman :)

Cheers

Alex Martelli

Kenny Tilton wrote:
...
...
Hey! No pulling rank! :) Actually, I think we heard Mr. Martelli say
something along these lines at one point, tho I grok that he does know
his stuff. As for having "a better understanding", hmmm, check your
lotus, I think you'll find something in there about Beginner's Mind.

Doug was not talking about understanding, but experience. I've had
such experience -- perhaps not "extensive" enough for him? -- and I
have personally experienced and suffered the problems.

Alex reports his experience of The Divergence Problem and blames macros.
Hell, I worked in Tall Buildings for years. I saw Divergence,
Convergence, /and/ the dread Regurgitation Problems everywhere I went.
No macros, tho, just groups of programmers.

So I think the groups are the problem.

I never claimed that without macros there would be no possible problems
whatsoever. Human biology guarantees that such sociological problems
remain possible. However, technology aspects, as well as cultural ones,
can either ameliorate or exacerbate the issues. Python's overall culture
of consensus and simplicity matters: at a BOF at OSCON, somebody was
asking "...so I don't know if I should just do the simple thing here or
rather do something clever..." and the audience immediately gave their
feedback -- if you think of it as "something clever", _don't do it_.
(Not in production code, not in code that you AND more relevantly many
others will have to maintain, enhance, and change for years to come).

Are you familiar with the "JAPH" idea, those bazillion clever ways that
Perl programmers have dreamed up to emit the string "just another perl
hacker" in delightfully contorted ways? Well, somebody once asked how
a Pythonista would go about it -- and the answer was unanimous:
print "Just another Python hacker"
Sure, this will get you no kudos for cleverness, but the point is that
cleverness does NOT garner kudos among Pythonistas. Simplicity, clarity,
explicitness, directness -- these are the community's core values. Do
we all always live by them? No way -- we're not saints, just practical
guys trying to get our jobs done -- and perhaps make the world a little
better along the way.

Technological aspects interplay with the cultural ones. Again speaking
in terms of ideals and targets, I quote Antoine de Saint-Exupery:
"La perfection est atteinte non quand il ne reste rien à ajouter, mais
quand il ne reste rien à enlever." (perfection is achieved not when
nothing is left to add, but when nothing is left to take away). Now,
since "practicality beats purity", one doesn't (e.g.) remove 'if' just
because it can reasonably be substituted by 'while' -- we're not talking,
in fact, about truly minimalist practice. But the only apparently
irreducible use of macros would appear to be in (some form of) code
that "reasons about itself" (not just by simple reflection and
introspection, quite easy to achieve without macros, but in that
"with-condition-maintained" example which was apparently [or allegedly]
able to analyze and modify reactor-control code to affect the reactor's
temperature limits). Do I need or want such esoterica in the kind of
code that I am interested in writing, and helping others write? No way:
such "creative", deep merging of discourse and meta-discourse does not
appear to be at all necessary in these endeavours -- and if it is not
strictly necessary, I would MUCH rather use a simpler, lighter-weight
tool that does not support it. When I need to modify (e.g.) some
class's constructor at runtime, I can do it by such blazingly obvious
code as
someclass.__new__ = staticmethod(constructor_implementation)
though the Lisp'er originally proposing such needs, in an attempt to
show how complicated Python's approach was, used a hugely messy call to
type.__setattr__ with a weirdly and unjustifiably long lambda in it.
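A runnable sketch of that one-liner in context (the class and the attribute set by the replacement constructor are hypothetical, purely for illustration):

```python
# Hypothetical class and replacement constructor, for illustration only.
class SomeClass:
    def __init__(self, n):
        self.n = n

def constructor_implementation(cls, *args, **kwargs):
    # Build the instance ourselves instead of via the default __new__.
    obj = object.__new__(cls)
    obj.patched = True
    return obj

# The blazingly obvious line from the text:
SomeClass.__new__ = staticmethod(constructor_implementation)

obj = SomeClass(42)   # __new__ is replaced; __init__ still runs afterwards
```

Since the replaced `__new__` returns an instance of the class, `__init__` is still invoked on it, so `obj.n` is set as usual.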

Won't fly, friends. We're simple, down-to-earth folks, doing simple,
down-to-earth things. I suspect some kind of 10/90 rules apply: our
simple tools may be barely 10% of what CL has got, but they cover the
needs arising in 90% of applications. (Maybe it's 15/85 or whatever:
no, I don't have scientific studies in the matter to quote so you
can exert your ingenuity at shooting them down:). When we need to
process some specialized language we may use roughly-traditional parser
technology, keeping language and meta-language separated, rather than
embed and entwine them inside each other.
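As a toy instance of that separation (the gate mini-language below is invented for illustration), the specialized language stays plain text and is handled by an ordinary parser written in the general-purpose language:

```python
import re

def parse_netlist(text):
    """Parse a tiny hardware-description mini-language: 'gate AND a b -> c'."""
    rule = re.compile(r'gate\s+(\w+)\s+(\w+)\s+(\w+)\s*->\s*(\w+)\s*$')
    gates = []
    for line in text.strip().splitlines():
        m = rule.match(line.strip())
        if not m:
            # Errors are reported in the mini-language's own terms,
            # not as host-language errors.
            raise ValueError(f"bad gate line: {line!r}")
        gates.append(m.groups())   # (kind, input_a, input_b, output)
    return gates

netlist = parse_netlist("gate AND a b -> c\ngate OR c d -> e")
```

The point of the sketch: language and meta-language never mix, so a malformed description fails in the parser with a message about the description, not about the host language.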

My perhaps-not-extensive-enough experience with macros showed them
being used to merge language and meta-language -- in widely different
ways in different labs or even within a given lab -- while at the
same time other firms were using languages without macros (APL and
variants thereof) and processing them with different and separate
metalanguages AND thereby managing to achieve better (intra-firm, at
least) cooperation. As "adventures in programming", those glorious
lisp-dialects-cum-hardware-description-languages-inside-them were,
no doubt, a hoot. For somebody looking for a more productive way to
design chips, and particularly to foster cooperation in this design
task, they looked (and in retrospect still look) sub-optimal to me.

The macros ended up being used to bend and stretch the general
purpose language to express specialized issues (about hardware design,
in that case) which it was not optimally suited to express -- and
since it WAS a case of bending and stretching, it was inevitable that
each different lab and faction would stretch and bend in divergent
directions. The computer-scientists in question were no doubt happy
as larks with their toys and in some cases their shiny new lisp
machines (I think TI ended up making a LM of their own a bit later,
but that was after my time); us engineers weren't _quite_ as happy,
though. And the chips didn't get designed as well, nor as fast...


Alex

Marco Antoniotti

Alex said:
Kenny Tilton wrote:
.....

The very 'feature' that was touted by Erann Gat as macros' killer advantage
in the WITH-CONDITION-MAINTAINED example he posted is the crucial
difference: functions (HO or not) and classes only group some existing code
and data; macros can generate new code based on examining, and presumably to
some level *understanding*, a LOT of very deep things about the code
arguments they're given. If all you do with your macros is what you could
do with HOF's, it's silly to have macros in addition to HOF's -- just
MTOWTDItis encouraging multiple different approaches to solve any given
problem -- this, of course, in turn breeds divergence when compared to a
situation in which just one approach is encouraged. If you do use the
potential implied in that example from Gat, to do things that functions and
classes just couldn't _begin_ to, it's worse -- then you're really
designing your own private divergent language (which most posters from
the Lisp camp appear to assert is an unalloyed good, although admittedly
far from all). This is far from the first time I'm explaining this, btw.

I am extremely careful to design new macros for my "extensions". And
when I do so I do it in my specialized packages. Moreover, I am
personally against blindly importing names when you do not actually need to.

This may or may not cause language divergence. It is a social issue
that is rather independent. For example, people forget Greenspun's
Tenth Rule of programming every other day and continue to diverge :)

Oh, and if you're one of those who disapprove of Gat's example feel free
to say so, but until I hear a substantial majority denouncing it as idiotic
(and I haven't seen anywhere near this intensity of disapproval for it from
your camp) I'm quite justified in taking it as THE canonical example of a
macro doing something that is clearly outside the purview of normal tools
such as functions and classes. As I recall there was a lot of that going
on in TI labs, too -- instead of writing and using compilers for hardware
description languages, circuit simulators, etc, based on appropriate and
specialized languages processed with the help of general-purpose ones,
the specialized languages (divergent and half-baked) were embedded in
programs coded in the general-purpose languages (Lisp variants, including
Scheme; that was in 1980) using macros that were supposed to do
everything but serve you coffee while you were waiting -- of course when
the snippets you passed (to represent hardware operation) were correct
from the GP language viewpoint but outside the limited parts thereof that
the macros could in fact process significantly down to circuit design &c,
the error messages you got (if you were lucky enough to get error
messages rather than just weird behavior) were QUITE interesting.

What people were doing not too long ago (1998) in a major electronic CAD
company was to develop special intermediate languages to represent some
design modules (we are talking about a not-so-cheap application here).
Guess what. They were using a tabbed format. Going from version 1.0 of
the product to version 2.0 involved writing a complex "migration" tool,
as the previous format would break (not to mention the commonplace "cut
and paste" errors).

How would you do that today? You would write an XML DTD (or Schema, if
you are so inclined) to achieve the same goal. Now, given that XML is
S-exprs in drag, Greenspun's Tenth applies again.
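The "S-exprs in drag" point is easy to make concrete (the <module> document below is a made-up example):

```python
import xml.etree.ElementTree as ET

# The same tree in both notations; the XML document is invented.
xml_doc = "<module><name>adder</name><ports><in>a</in><in>b</in></ports></module>"

def to_sexpr(elem):
    """Render an ElementTree node as an S-expression string."""
    children = list(elem)
    if not children:
        return f"({elem.tag} {elem.text})"
    return "(" + elem.tag + " " + " ".join(to_sexpr(c) for c in children) + ")"

sexpr = to_sexpr(ET.fromstring(xml_doc))
# (module (name adder) (ports (in a) (in b)))
```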

This has nothing to do with HOF vs Macros etc etc, but it shows that you
are always using some "language design" thingy while you program. After
all, Stroustrup correctly said that "library design" is "language
design". Jumping back to the topic, the bottom line is that you want
both macros and HOFs. If you do not want both you are just reconfirming
Greenspun's Tenth Rule of Programming :)

Do they do things a HOF or class could do? If so why bother using such
an over-powered tool as macros instead of HOFs or classes? If not, how do
they specially process and understand code snippets they're passed?

Because you have them and because they are easier to use than a HOF. If
you have both you can make the best of both. If you miss either, you
have one less tool in your belt. As for the previous examples, you do
not necessarily need to understand the code snippets that are passed to
the macros. Most of the time macros are used as code transformations.
If you use them carefully, then your (Common Lisp) programs get more
succinct and more readable (and, incidentally more efficient, as Common
Lisp can use macros to shortcut the road to the *NATIVE CODE* compiler).
You cannot achieve this effect if you do not have both.

Cheers

Alex Martelli

Bengt Richter wrote:
...
This way lambda would only be needed for backwards compatibility, and
since "def(" is a syntax error now, IWT it could be introduced cleanly.

In theory, yes, I think it could (and wrt my similar idea with 'do' has
the advantage of not requiring a new keyword). In practice, trying to
hack the syntax to allow it seems a little nightmare. Wanna try your
hand at it? I'm thinking of Grammar/Grammar and Modules/parsermodule.c ...


Alex

Marco Antoniotti

Ok. At this point I feel the need to apologize to everybody for my
rants and I promise I will do my best to end this thread.

I therefore utter the H-word and hopefully cause this thread to stop.

Cheers

Alex Martelli

Lulu of the Lotus-Eaters wrote:
...
Python never had an "aspiration" of being "just a scripting language",

Hmmm, at the start it sure seemed like that. Check out

http://www.python.org/search/hypermail/python-1992/0001.html

actually from late '91. By 1991 Guido was already calling Python
"a prototyping language" and "a programming language" (so, the
"just a scripting language" was perhaps only accurate in 1990), but
in late '91 he still wrote:

"""
The one thing that Python definitely does not want to be is a GENERAL
purpose programming language. Its lack of declarations and general laziness
about compile-time checking is definitely aimed at small-to-medium-sized
programs.
"""

Apparently, it took us (collectively speaking) quite a while to realize
that the lack of declarations and compile-time checks aren't really a
handicap for writing larger programs (admittedly, Lispers already knew
it then -- so did I personally, thanks also to experiences with e.g.
Rexx -- but I didn't know of Python then). So, it _is_ historically
interesting to ascertain when the issue of large programs first arose.
nor WAS it ever such a thing. From its release, Python was obviously a
language very well suited to large scale application development (as

Well, clearly that was anything but obvious to Guido, from the above
quote. Or maybe you mean by "release" the 1.0.0 one, in 1994? At
that time, your contention becomes quite defensible (though I can't
find a Guido quote to support it, maybe I'm just not looking hard
enough), e.g. http://www.python.org/search/hypermail/python-1994q1/0050.html
where Bennett Todd muses
"""
I think Python will outstrip every other language out there, and Python
(extended where necessary in C) will be the next revolutionary programming
tool ... Perl seems (in my experience) to be weak for implementing large
systems, and having them run efficiently and be clear and easy to maintain.
I hope Python will do better.
"""
So, here, the idea or hope that Python "will do better" (at least wrt
Perl) "for implementing large systems" seems already in evidence, though
far from a community consensus yet.


I do find it fascinating that such primary sources are freely available
on the net -- a ball for us amateur historians...!-)



Alex

Alex Martelli

Pascal said:
This presumes that language features can be judged in isolation. I think
it's rather more likely that good programming languages are holistic
systems, in the sense that the whole language is more than the sum of
its features.

....and/or less, if N features are just offering N different ways to
perform essentially the same tasks, of course. Still, be the whole
more or less than "the sum of the parts", one still can't rule out
(as no "hard-scientific studies" are ever likely to exist) such
non-linearities and complications. This, of course, points out that
programming languages are NOT "mathematics", as some claim -- they
are engineering designs, and interact with human minds, sociology
of groups, cultural and educational features, at least as much as
they interact with the architecture and capabilities of computers.


Alex

Alex Martelli

Edi Weitz wrote:
...
Thanks, I think my reading comprehension is quite good. What you said
doesn't change the fact that Mr. Martelli's wording insinuates that in
Scheme and Python functions are first-class objects and in Common Lisp

I put "first class" in quotes and immediately explained what I meant.
they're not. For the sake of c.l.p readers who might not know CL I
think this should be corrected.

* (let ((fn (lambda (x) (* x x))))
    (mapcar fn (list 1 2 3 4 5)))

(1 4 9 16 25)

There you have it. I can create a function, assign it to a variable
and pass it to another function like any other object, that's what I'd
call a "first-class object."
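(For c.l.py readers, the same example in Python, where functions live in the single value namespace, so no #' or FUNCALL is needed:)

```python
# Create a function, bind it to a variable, pass it to another function.
fn = lambda x: x * x
squares = list(map(fn, [1, 2, 3, 4, 5]))   # [1, 4, 9, 16, 25]
```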

Yes, but:
This old namespace debate only makes me yawn, sorry.

If so then why jump on assertions related exactly just to that --
namespaces? Re-read my quote above, o you of self-proclaimed "quite
good" reading comprehension: I was trying to explain to Doug Tolton
why many think that Haskell, Scheme or Python "do HOFs better",
while he was claiming that the use of #' is "far clearner" (sic)
because "in lisp with #' it's immediately obvious that you are
receiving or sending a HOF that will potentially alter how the
call operates". I.e., it IS strictly a namespace debate from the
word go. Whether it SHOULD be emphasized with horns and bells
that "warning, HOF coming!!!" -- as Doug claimed -- or not.

If you're bored by debating namespaces, don't jump into a debate
on namespaces -- seems simple common sense (as opposed to
common lisp?)...


Alex

Pascal Costanza

Alex said:
Pascal Costanza wrote:




...and/or less, if N features are just offering N different ways to
perform essentially the same tasks, of course. Still, be the whole
more or less than "the sum of the parts", one still can't rule out
(as no "hard-scientific studies" are ever likely to exist) such
non-linearities and complications. This, of course, points out that
programming languages are NOT "mathematics", as some claim -- they
are engineering designs, and interact with human minds, sociology
of groups, cultural and educational features, at least as much as
they interact with the architecture and capabilities of computers.

I definitely agree. Computer science is more a sociological science than
a natural science IMHO.

Pascal

Pascal Bourguignon

+---------------
| (and yes, I know about the lawsuit against disk drive manufacturors
| and their strange definition of "gigabyte"... )
+---------------

Oh, you mean the fact that they use the *STANDARD* international
scientific/engineering notation for powers of 10 instead of the
broken, never-quite-right-except-in-a-few-cases pseudo-binary
powers of 10?!?!?


No we mean the fact that they surreptitiously switched from the
industry standard of defining giga as 2^30 to the scientific standard
of defining giga as 10^9, which allowed them to display bigger size
while in fact they did not have bigger hard drives. That was a pure
marketing trick. Happily, after these lawsuits, they now write the
exact number of bytes storable on their devices. But be assured that
they would have never switched if 2^30 had been smaller than 10^9.

[Hmmm... Guess you can tell which side of *that*
debate I'm on, eh?] The "when I write powers of 10 which are 3*N
just *assume* that I meant powers of 2 which are 10*N" hack simply
fails to work correctly when *some* of the "powers of 10" are *really*
powers of 10. It also fails to work correctly with things that aren't
intrinsically quantized in powers of 2 at all.

Examples: I've had to grab people by the scruff of the neck and push
their faces into the applicable reference texts before they believe me
when I say that gigabit Ethernet really, really *is* 1000000000.0 bits
per second [peak payload, not encoded rate], not 1073741824, and that
64 kb/s DS0 telephone circuits really *are* 64,000.0 bits/sec, not 65536.
[And, yes, 56 kb/s circuits are 56000 bits/sec, not 57344.]

Yes, that's because telecoms are not computers. In particular,
telecoms were invented long before computers and binary base became
interesting.

On the other hand, hard drives are purely computer stuff...

Solution: *Always* use the internationally-recognized binary prefixes
<URL:http://physics.nist.gov/cuu/Units/binary.html> when that's really
what you mean, and leave the old scientific/engineering notation alone,
as pure powers of 10. [Note: The historical notes on that page are well
worth reading.]

Perhaps we should start seriously to use the kibi (Ki), mebi (Mi),
gibi (Gi), tebi (Ti), etc., that have been proposed.
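The size of the gap in question is easy to check in round numbers:

```python
decimal_gb = 10 ** 9          # the prefix the drive makers use
binary_gib = 2 ** 30          # the "giga" computing had been using

# A "500 GB" drive reported in binary units:
reported = 500 * decimal_gb / binary_gib   # about 465.7 GiB
shortfall = 1 - decimal_gb / binary_gib    # about 6.9% at the giga step
```

The discrepancy grows with each prefix step, which is exactly why the switch was worth making for marketing purposes.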

Alex Martelli

Kenny Tilton wrote:
...
Sheesh, who hasn't been exposed to basic? From my generation, that is.
:) But no matter, the point is anyone can handle parens if they try for
more than an hour.

Yes, but will that make them most happy or productive? The Autocad
case appears to be relevant, though obviously only Autodesk knows
for sure. When I was working in the mechanical CAD field, I had
occasion to speak with many Autocad users -- typically mechanical
drafters, or mechanical or civil engineers, by training and main working
experience -- who HAD painfully (by their tales) learned to "handle
parens", because their work required them occasionally to write Autocad
macros and once upon a time Autolisp was the only practical way to do it --
BUT had jumped ship gleefully to the VBA interface, happily ditching
years of Autolisp experience, just as soon as they possibly could (or
earlier, i.e. when the VBA thingy was very new and still creaky in its
integration with the rest of Autocad -- they'd rather brave the bugs
of the creaky new VBA thingy than stay with the Autolisp devil they
knew). I don't know if syntax was the main determinant. I do know
that quite a few of those people had NOT had any previous exposure to
any kind of Basic -- we're talking about mechanics-junkies, more likely
to spend their spare time hot-rodding their cars at home (Bologna is,
after all, about 20 Km from Ferrari, 20 Km on the other side from
Minardi, while the Ducati motorcycle factory is right here in town,
etc -- *serious* mechanics-freaks country!), rather than playing with
the early home computers, or program for fun.

So, I think Autocad does prove that non-professional programmers
(mechanical designers needing to get their designs done faster) CAN
learn to handle lisp if no alternatives are available -- and also
that they'd rather not do so, if any alternatives are offered. (I
don't know how good a lisp Autolisp is, anyway -- so, as I mentioned,
there may well be NON-syntactical reasons for those guys' dislike
of it despite years of necessarily using it as the only tool with
which they could get their real job done -- but I have no data that
could lead me to rule out syntax as a factor, at least for users
who were OCCASIONAL users anyway, as programming never was their
REAL, MAIN job, just means to an end).

Boy, you sure can read a lot into a casually chosen cliche. But can we
clear up once and for all whether these genius scientists are or are not
as good a programmer as you? I thought I heard Python being recommended
as better for non-professional programmers.

Dunno 'bout Andrew, but -- if the scientists (or their employers) are
paying Andrew for programming consultancy, training, and advice, would
it not seem likely that they consider that he's better at those tasks
than they are...? Otherwise why would they bother? Most likely the
scientists are better than him at _other_ intellectual pursuits -- be
it for reasons of nature, nurture, or whatever, need not interest us
here, but it IS a fact that some people are better at some tasks.
There is too much programming to be done, to let ONLY professional
programmers do it -- just like there's too much driving to be done, to
let only professional drivers do it -- still, the professionals can be
expected to be better at their tasks of specialistic expertise.


Alex

Jon S. Anthony

Bruce Lewis said:
I find no significant difference in startup time between python and
mzscheme.

Category error. The context (I would have thought) clearly indicated
that "startup costs" concerned the effort needed to use the language!

/Jon

Alex Martelli

Andrew Dalke wrote:
...[quoting me indirectly]...
If Python's syntax defined
other forms of suites, e.g. hypothetically:

with <object>:
    <suite>

meaning to call the object (or some given method in it, whatever)
with the suite as its argument, it would be just as explicit as, e.g.:

for <name> in <object>:
    <suite>


A reasonable point. However, inside the 'with' statement it's hard
to know if

print x


Sorry, I was NOT using 'with' in a Pascal/Basic sense, but rather
to mean, and I quote: "meaning to call ... with the suite" (others
have proposed 'using' etc for this construct in python-dev). I
was using 'with' only because so many macros quoted on the xposted
thread appear to start with "WITH-..." ...!-)
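A sketch of how such a construct can be emulated in today's Python, with the suite packaged as an ordinary function ('twice' is an invented example object):

```python
def twice(suite):
    # The receiving object decides how and when to run the suite.
    suite()
    suite()

log = []

def suite():
    log.append("ran")

twice(suite)   # plays the role of the hypothetical: with twice: <suite>
```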


Alex

Lulu of the Lotus-Eaters

|Lulu of the Lotus-Eaters wrote:
|> I would think Lisp is more like cricket: wickets bracket both ends, no
|> one can actually understand the rules, but at least the players wear
|> white.

|Oh, come on! Anyone can understand cricket! There are two teams.
|The team that's in sits out, except for two batsmen...

I apologize, I overstated it. I meant "No American can understand..."

FWIW. I very much enjoyed watching part of an amateur cricket match on
my vacation to Vancouver a few weeks back. But exactly what they were
doing was as perplexing as the Lisp code in this thread :).

Yours, Lulu...

P.S. It's odd that I hadn't KNOWN about my Dutch ancestry... but Python
fits my brain, so there must be some.

Vis Mike

I think everyone who has used Python will agree that its syntax is
the best thing going for it. It is very readable and easy
for everyone to learn. But Python does not have very good
macro capabilities, unfortunately. I'd like to know if it may
be possible to add a powerful macro system to Python, while
keeping its amazing syntax, and if it could be possible to
add Pythonistic syntax to Lisp or Scheme, while keeping all
of the functionality and convenience. If the answer is yes,
would many Python programmers switch to Lisp or Scheme if
they were offered indentation-based syntax?

What about an editor that simply hides outer parentheses and displays them as
tabs, for Scheme for example. Then you could edit in any program, or use an
editor designed for it. Kind of like editing raw HTML or using an HTML
editor.

I might just adapt this idea for my pet language which uses indentation for
blocks. I like code to flow like an outline, with as few extraneous symbols
and junk as possible.

Mike

Vis Mike

Lulu of the Lotus-Eaters said:
|Something like this seems more logical to me:
|for line in file('input.txt').lines:
| do_something_with(line)
|for byte in file('input.txt').bytes:
| do_something_with(byte)

Well, it's spelled slightly differently in Python:

for line in file('input.txt').readlines():
    do_something_with(line)

for byte in file('input.txt').read():
    do_something_with(byte)

Of course, both of those slurp in the whole thing at once. Lazy lines
are 'fp.xreadlines()', but there is no standard lazy bytes.

xreadlines()? What kind of naming convention is that? :)

what about 'eachline()'?
A method 'fp.xread()' might be useful, actually. And taking a good idea
from Dave Benjamin up-thread, so might 'fp.xreadwords()'. Of course, if
you were happy to write your own class 'File' that provided the extra
iterations, you'd only need to capitalize one letter to get these extra
options.
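A sketch of that 'File' class in today's Python; the xread/xreadwords names are the hypothetical ones from the discussion, not standard library methods:

```python
class File:
    def __init__(self, name):
        self.name = name

    def xreadlines(self):              # lazy lines
        with open(self.name) as fp:
            for line in fp:
                yield line

    def xread(self):                   # lazy characters
        with open(self.name) as fp:
            while True:
                ch = fp.read(1)
                if not ch:
                    return
                yield ch

    def xreadwords(self):              # lazy whitespace-separated words
        for line in self.xreadlines():
            yield from line.split()
```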

Mike

Alex Martelli

Kenny Tilton wrote:
...
As for non-professional programmers, the next question is whether a good
language for them will ever be anything more than a language for them.

Being a professional programmer, I find Python makes me very productive
at my job -- and yet I know from experience it's also an excellent language
for non-professional programmers. So, the experiential answer is clear to
me.
Perhaps Python should just stay with the subset of capabilities that
made it a huge success--it might not be able to scale to new
sophistication without destroying the base simplicity.

Interestingly enough, you and I probably agree on that. I don't _want_
Python to grow in any way that would "destroy the base simplicity"; indeed,
I'm looking forward to 3.0 (even though it's probably 3 years off or so)
exactly because it may get simplified again then, shedding some accumulated
legacy baggage (there is no reason, except backwards compatibility, to
have e.g "classic classes", range, xrange, etc etc). Any proposed new
feature, in my opinion, must be judged strictly on the criterion of: how
much better than today's set of features would it let us do our jobs? If
it provides another roughly equivalent way to perform some of the same
tasks, that's a substantial minus (as it encourages divergence of language
dialects) and must be compensated by really big advantages elsewhere.

Today's Python has the features we need to productively build ambitious
frameworks for asynchronous network clients and servers (Twisted), spam
filters that apparently work better than Graham's (spambayes), search
engines (Verity Ultraseek, nee Infoseek, as well as Google), ship-design
optimization apps (Tribon Vitesse), commercial games (Freedom Force, EVE
Online, Star Trek Bridge Commander...), collaborative enterprise app
frameworks (CAPS), scientific visualization tools such as MayaVi, business
logic for factory and tool control (IBM/Philips Fishkill plant)... oh,
you're perfectly capable of reading the various "Python success stories"
sites and booklets yourself, I don't want to bore you TOO much:). The
point is, it MIGHT, as you point out, be unfeasible to "scale to new
sophistication" -- beyond the few 100,000s function points MAXIMUM of
any of these Python successes (roughly equivalent to, say, a few 10's
of millions of lines of C code), to e.g. many millions of function points.

I think the world needs LOTS AND LOTS of application programs in
the 1000-100,000 function points range -- and Python's current set of
features has proven amply sufficient to provide those without damage
to "the base simplicity" which you mention. If many applications of
many millions FP's (roughly equivalent to a billion lines of C, or so)
are needed, I don't know -- but, if so, I share your doubts about it making
any sense to destroy Python's simplicity in an attempt to tackle THOSE
monsters, "scaling to new sophistication".

You (Alex?) also worry about groups of programmers and whether what is
good for the gurus will be good for the lesser lights. What you are
saying is that the guru will dazzle the dorks with incomprehensible
gobbledygook. That does happen, but those kinds of gurus should be fired.

That's only part of the problem, of course -- programmers who are not
quite as good as they think they are and inflict their "enhancements" to
the language on everybody else are another issue. Maybe they should
be fired, but such turnover in the team would decrease productivity
anyway.
On a sufficiently large project (well, there's another point: with Lisp
one does not hire ten people (unless one is doing three projects)) the

Lisp isn't able to let 10 people work together productively on the
same project? Oh my -- now THAT would be a huge problem;-).
team should be divided into those who will be building the
domain-specific embedded language and those who will be using it.
Ideally the latter could be not just non-professional programmers, but
even non-programmers.

If they're programming (in whatever language) they can't be
non-programmers, by definition. But anyway, this misses the key
issue: who are the *real* experts of the application domain that
your "domain-specific" language is supposed to address so well?
E.g., the *real* experts on turbo-compressor design, on optimization
of ship designs, on the business logic of tool control, on logical
and physical design of integrated circuits, etc, etc? Answer: they
are likely to be non-professional programmers. Are THEY designing
the domain-specific language -- or is it going to be designed by
computer scientists who don't really grasp the intricacies of turbo
compressors, ships, etc, etc?
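To make concrete what the thread means by a "domain-specific embedded language", here is a minimal sketch in plain Python -- the domain (ship hulls), the class, and every rule are invented for illustration, not taken from any of the applications named above. The point is that the "language" is just ordinary Python whose names mirror the domain vocabulary, so the domain expert can read and tweak the rules directly:

```python
# Hypothetical example: an "embedded DSL" for design constraints,
# built from nothing but a class with domain-flavoured method names.

class Hull:
    def __init__(self, length_m, beam_m):
        self.length_m = length_m
        self.beam_m = beam_m
        self.constraints = []

    def require(self, description, predicate):
        # Each rule pairs a plain-language description with a check.
        self.constraints.append((description, predicate))
        return self  # returning self lets the spec read as a chain

    def violations(self):
        # Report the descriptions of all rules that fail.
        return [desc for desc, pred in self.constraints if not pred(self)]

# The chained "spec" below is the part a domain expert would own:
hull = (Hull(length_m=120, beam_m=30)
        .require("length/beam ratio at least 5",
                 lambda h: h.length_m / h.beam_m >= 5)
        .require("beam under 32 m",
                 lambda h: h.beam_m < 32))

print(hull.violations())  # -> ['length/beam ratio at least 5']
```

Whether the class is written by a "guru" and the spec by a "peon", or both are grown together pair-programming style, is exactly the disagreement in this subthread.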

The Agile Programming (Extreme Programming, in particular) answer
to this enormously important issue is that the whole team, customer
included, *works together* and *collectively owns* the whole body of code.

The computer scientist learns enough about turbo compressors, and
the turbo compressor expert enough about programming, by the
incredibly productive social process of *pair-programming* -- sitting
side by side and working together at testing and building code (in
this order -- but I won't bore you with test-driven-design paeans...;-).

This holistic approach is incompatible with your favourite "the gurus
build the domain-specific language, the peons just use it" approach.
It works particularly well when the language is as simple, as good at
"getting out of your way", as Python (or, admittedly, Ruby -- I have
enormous respect for Ruby! -- though I have some issues on quite
another plane that make me keep preferring Python for this specific,
and very important to me, kind of tasks).

It surely can feel "cooler" for the Guru (uppercase G mandatory) not
to have to mingle with such low-lifes as the lusers (a Guru's favourite
spelling of "users", apparently) -- to just sit in their ivory tower
spitting out domain-specific embedded languages for domains they
_aren't_ as expert at as the peons. But I've seen that approach at
work (for hardware design with various lisp variants and dialects) and
don't like the results I have observed, particularly in application domains
where the intended users ARE quite expert in their field (which is the
case for many interesting apps -- not just those targeting the people
which our society acknowledges as "respected professionals", mind you:
a good secretary knows FAR more than I ever will on how to make an
office run, a shopkeeper on how things work in a shop, etc, etc).


Alex
 

Matthias

Alex Martelli said:
...and/or less, if N features are just offering N different ways to
perform essentially the same tasks, of course. Still, be the whole
more or less than "the sum of the parts", one still can't rule out
(as no "hard-scientific studies" are ever likely to exist) such
non-linearities and complications. This, of course, points out that
programming languages are NOT "mathematics", as some claim -- they
are engineering designs, and interact with human minds, sociology
of groups, cultural and educational features, at least as much as
they interact with the architecture and capabilities of computers.

You are right, of course.

But that it is a complicated matter to study does not mean that it's
not worthwhile: An example where human minds, sociology, culture,
etc. intervene in a complicated way is education. In Germany we've
had /ages/ of hot debate on how to educate children (at elementary and
highschool, mainly). Then scientists came and did some tests. They
defined some educational goals ("children at a certain age should be
able to read this text and solve such kind of math problem") and
looked which factors influence how well students met the goals
previously defined. The results were quite surprising to our
education experts: Certain factors which were previously believed to
matter a great deal (like teacher/student ratio) were found to be of
almost no importance. Other factors (like parental income and ethnic
origin) had an alarmingly high influence. Now our education experts
have, for the first time, real data as input, and they can start to
work on the real problems.

In the context of programming languages I find studies from Lutz
Prechelt <http://www.ipd.uka.de/~prechelt/Biblio/> or Erann Gat's
Lisp/Java paper interesting. Doing such studies on a larger scale and
with non-self selected participants should be possible. In "Patterns
of Software" Richard Gabriel reports (p. 128) that a group of advanced
Lisp developers experienced a 30% drop in productivity one year after
switching to C++. This is merely an anecdote, but if you have a
reasonable measure of programmers' productivity (I know, that's hard)
and examine how language-switchers in industry perform after 1, 2, 3
years you might find other interesting results. One could also try to
compare small software companies which do well (e.g., financially)
with those that do not so well.

In all these cases defining acceptable performance measures and/or
getting enough data is hard, and no single study would reveal "the
truth". But scientifically examining the act of producing software
should be possible (within limits) if one tries and has enough
funding. ;-)
 
