Python syntax in Lisp and Scheme

J

james anderson

Alex said:
Doug Tolton wrote:

...

I think it's about a single namespace (Scheme, Python, Haskell, ...) vs
CLisp's dual namespaces. People get used pretty fast to having every
object (whether callable or not) "first-class" -- e.g. sendable as an
argument without any need for stropping or the like. To you, HOFs may
feel like special cases needing special syntax that toots horns and
rings bells; to people used to passing functions as arguments as a way
of living, that's as syntactically obtrusive as, say, O'CAML's mandate
that you use +. and not plain + when summing floats rather than ints
(it's been a couple of years since I last studied O'CAML, so for all I
know they may have changed that now, but, it IS in the book;-).

it can't really be the #' which is so troubling.

? (defmacro define (name parameters &rest body)
    `(set (defun ,name ,parameters ,@body)
          (function ,name)))
DEFINE
? (define lof (a b) (cons a b))
#<Compiled-function LOF #x78467E6>
? (mapcar lof '(1 2 3) '(a s d))
((1 . A) (2 . S) (3 . D))
?
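The same point can be made from the Python side: because there is a single namespace, a function is an ordinary value and needs no #'-style stropping to be passed along. A minimal sketch:

```python
# A function is just a value bound to a name; passing it to map
# needs no special quoting syntax.
def lof(a, b):
    return (a, b)

pairs = list(map(lof, [1, 2, 3], ['a', 's', 'd']))
print(pairs)  # [(1, 'a'), (2, 's'), (3, 'd')]
```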

what is the real issue?

....
 
K

Kenny Tilton

Pascal said:

Oh, please:

"My point is... before I started teaching Scheme, weak students would
get overwhelmed by it all and would start a downward spiral. With
Scheme, if they just keep plugging along, weak students will have a
strong finish. And that's a great feeling for both of us!"

That kind of anecdotal crap is meaningless. We need statistics!
Preferably with lots of decimal places so we know they are accurate.

:)
 
A

Alex Martelli

Doug Tolton wrote:
...
don't know me or my background. Alex has stated on many occasions that
he has not worked with Macros, but that he is relying on second hand
information.

I never used Common Lisp in production: in the period of my life when I
was hired (by Texas Instruments) specifically for my knowledge of "Lisp",
that meant Scheme and a host of other dialects (mostly but not entirely now
forgotten). I did use things that "passed for" macros in those dialects:
I had no choice, since each TI lab or faction within the lab was using a
different divergent mutant thing, all named "lisp" (save a few were named
"scheme" -- hmmm, I do believe that some were using Prolog, too, but I
did not happen to use it in TI), with some of the divergence hinging on
locally developed sets of macros (and some on different vendors/versions).

For all I know, CLisp's macros are SO head and shoulders above any of a
quarter century ago that any vaguely remembered technical problem from
back then may be of purely historical interest. I do believe that the
divergence problem has more to do with human nature and sociology, and
that putting in a language features that encourage groups and subgroups
of users to diverge that language cannot be compensated by technical
enhancements -- it _will_, in my opinion, cause co-workers in any middle-
or large-sized organization to risk ending up standing on each others'
feet, rather than on each others' shoulders. (Remedies must of course
be sociological and lato sensu political first and foremost, but the way
the language & tools are designed CAN help or hinder).

So, I'm nowhere near an _expert_ -- over 20 years' hiatus ensures I
just can't be. But neither is it totally 2nd hand information, and if
I gave the mistaken impression of never having used macros in a
production setting I must have expressed myself badly. I do know I
jumped on the occasion of moving to IBM Research, and the fact that
this would mean going back to APL instead of "lisp" (in the above
vague sense) did matter somewhat in my glee, even though I still
primarily thought of myself as a hardware person then (the programming
was needed to try out algorithms, simulate possible hardware
implementations thereof, etc -- it was never an end in itself).

I don't claim to be a guru on Lisp, however I believe I understand it
far better than Alex does. If the people who actually know and use
Common Lisp think I am mis-speaking and mis-representing Lisp, please
let me know and I will be quiet.

Given that I've heard "everything and its opposite" (within two constant
parameters only: S-expressions are an unalloyed good -- macros are good,
some say unconditionally, others admit they can be prone to abuse) from
posters on this thread from "people who actually know and use" Lisp, I
don't know how you could "mis-speak and mis-represent" as long as you
stick to the two tenets of party doctrine;-).

Like I said, I'm not an expert at Lisp, but I think I understand the
spirit and semantics of Lisp far better than Alex, and from what I've

If by Lisp you mean Common Lisp and exclude Scheme, I'm sure you do; if
Scheme is to be included, then I'm not sure (but it's quite possible,
nevertheless) -- at least the "spirit" of the small core and widespread
HOFs w/single-namespace seem to be things I understand more (but the
"spirit" of why it's so wonderful to have extensible syntax isn't:).


Alex
 
D

Dave Benjamin

But being a function, it'd have the nasty property of a
separate scope (yes, that can be nasty sometimes). I'd perhaps
want to do

open('input.txt', { |f| data = f.read() })

But alas, 'data' would be local to the anonymous function and
not usable outside.

Well, that's the same problem that lambda's got. I don't really have a
solution for that, other than the usual advice: "Use a namespace". =)

Dave
 
R

Rainer Deyke

Pascal said:
Pick the one Common Lisp implementation that provides the stuff you
need. If no Common Lisp implementation provides all the stuff you
need, write your own libraries or pick a different language. It's as
simple as that.

Coming from a C/C++ background, I'm surprised by this attitude. Is
portability of code across different language implementations not a priority
for LISP programmers?
 
D

Dave Benjamin

Dave Benjamin wrote (answering Mike Rovner):
...

I don't see that there's anything "implicit" in the concept that a
special operation works as indicated by its syntax. I.e., I do not
find this construct any more "implicit" in the first line than in
its second one, which is the juxtaposition of a name and a pair of
parentheses to indicate calling-with-arguments -- and alternatives...

What's implicit to me is that the use of an iterator is never specified.
For instance, we could (and I'm *not* suggesting this) do this:

iterator = file('input.txt')
while iterator.has_next():
    line = iterator.next()
    do_something_with(line)

This has nothing to do with "eschewing code blocks", btw; code blocks
are not "eschewed" -- they are simply syntactically allowed, as
"suites", only in specific positions. If Python's syntax defined
other forms of suites, e.g. hypothetically:

with <object>:
    <suite>

meaning to call the object (or some given method in it, whatever)
with the suite as its argument, it would be just as explicit as, e.g.:

for <name> in <object>:
    <suite>

or

<object>(<object>)


This would be an interesting alternative, but it's back to being a special
case, like Ruby has. I think it'd be more flexible as a literal that returns
a callable.
You could do so even if you HAD to say iter(<object>) instead of
just <object> after every "for <name> in" -- it wouldn't be any
more "explicit", just more verbose (redundant, boiler-platey). So
I do not agree with your motivation for liking "for x in y:" either;-).
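(For reference, the machinery the for statement invokes implicitly is the real iterator protocol, iter() / next() / StopIteration, which one can spell out by hand; this is a sketch, not anyone's proposal from the thread:)

```python
# What "for item in iterable:" does behind the scenes.
def explicit_for(iterable, body):
    it = iter(iterable)          # ask the object for an iterator
    while True:
        try:
            item = next(it)      # pull the next value
        except StopIteration:    # the protocol's end-of-sequence signal
            break
        body(item)

collected = []
explicit_for(['a\n', 'b\n'], collected.append)
print(collected)  # ['a\n', 'b\n']
```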

Well, let's just say I've been on the Java side of the fence for a little
while, and it has redefined my personal definition of explicit. One of the
reasons Python code is so much smaller than Java code is that a lot of
things are implicit that are required to be explicit in Java. I see this as
a good thing.
Not huge, but the abundance of ({ | &c here hurts a little bit.

Well, we all __pick__ our __poisons__...
Here, you're arguing for redundancy, not for explicitness: you are claiming
that IF you had to say the same thing more than once, redundantly, then
mistakes might be more easily caught. I.e., the analogy is with:

file('foo.txt').write('wot?')

where the error is not at all obvious (until runtime when you get an
exception): file(name) returns an object *open for reading only* -- so
if you could not call file directly but rather had to say, e.g.:

file.open_for_reading_only('foo.txt').write('wot?')

Nah, I'm not arguing for redundancy at all. I'm saying that there is some
voodoo going on here. When the *constructor* for a file object behaves like
a generator that loops over newline-delimited lines of a text file, doesn't
that seem like it's been specialized for a particular domain in an unobvious
way? Why lines? Why not bytes, words, unicode characters? I mean, it's more
convenient for people that do a lot of text processing, but I don't see
anything specific to text or lines in the phrase "file('foo.txt')". That's
all I'm saying.
Already addressed above: nothing implicit there.

Likewise, and I still disagree... =)
There are none, so how could such a nonexisting thing be EITHER implicit
OR explicit? Variables don't HAVE types -- OBJECTS do.

The very fact that variables do not have types, and following that, that
variables do not have manifest types, is an example of implicit being
chosen over explicit. I know your argument, and I understand that Python
variables are Post-It sticky notes and all of that, but please, just try to
look at it from a non-Python-centric perspective. Other languages (like C++,
which I hear you are vaguely familiar with ;) require you to be explicit
about what type of thing you're defining and sending where. Python does not.
This is one of its strengths, because it allows for ad-hoc interfaces and
polymorphism without a lot of boilerplate.
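(The ad-hoc polymorphism described here is easy to demonstrate; the class and method names below are made up for illustration:)

```python
# Any object with a .quack() method works -- no declared interface,
# no type annotations, no boilerplate.
class Duck:
    def quack(self):
        return "quack"

class Person:
    def quack(self):
        return "I'm quacking!"

def make_it_quack(thing):
    # No check on thing's type; only the method matters.
    return thing.quack()

print([make_it_quack(x) for x in (Duck(), Person())])
```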
Etc, etc -- can't spend another 1000 lines to explain why your "lots of
things" do not indicate violations of "explicit is better than implicit".

They're not *violations*. Correct me if I'm wrong, but the Zen of Python is
not the LAW! It's a poem! It's very beautiful, very concise, inspiring, and
thoughtful, but it's not the 10 commandments! I just get very tired of every
idea getting shot down because of some rule from Tim Peters. I really don't
think he intended for it to be used to prove the validity of ideas.

The implicit/explicit thing is one of the most abused, in my opinion,
because it can quite frankly be used to shut down any attempt at creating
abstraction. In fact, for that reason alone, I'm tempted to say "Implicit is
better than explicit". Say what you want, not how you want it. Be abstract,
not concrete.
Sometimes it is (to avoid perilous nesting), sometimes it isn't (to
avoid wanton naming). I generally don't mind naming things, but it IS
surely possible to overdo it -- without going to the extreme below,
just imagine a language where ONLY named argument passing, and no use
of positional arguments, was allowed (instead of naming arguments being
optional, as it is today in Python).

I don't have to imagine. It's called Smalltalk, and also (to some extent)
Tcl/Tk. Even Tkinter seems to be named-argument only. It's not that bad.
I still like positional parameters, though.
If a Pythonic syntax can't be found to solve ALL use cases you've
raised, then the "balance" may be considered not nice enough to
compensate for the obvious problem -- a serious case of MTOWTDI.

That's another argument for another day. ;)
Cite pls? I knew that Logo and ABC had been specifically designed
with children in mind, but didn't know that of Smalltalk.
http://ei.cs.vt.edu/~history/GASCH.KAY.HTML
http://www.cosc.canterbury.ac.nz/~wolfgang/cosc205/smalltalk1.html
http://www.cs.washington.edu/homes/dugan/history.html


As an ex-user of APL (and APL2) from way back when, I think you're
both talking through your respective hats: neither list comprehensions
(particularly in the Python variation on a Haskell theme, with
keywords rather than punctuation) nor code blocks resemble APL in the least.

Well, it was a rough analogy, and I've never done any APL myself, but here's
my justification, FWIW:

- APL provides syntactical constructs for high-level array processing
- List comprehensions do this also
- Code blocks have nothing inherently to do with array processing

But I agree that neither resemble APL as I've seen. I guess it's like saying
a carrot is more like APL than a rutabaga.

Dave
 
K

Kenny Tilton

Alex said:
... I do believe that the
divergence problem has more to do with human nature and sociology, and
that putting in a language features that encourage groups and subgroups
of users to diverge that language ....

Can someone write a nifty Python hack to figure out how many times
Lispniks have tried to get Alex to explain how macros are any different
than high-order functions or new classes when it comes to The Divergence
Problem? I love that we have given it a name, by the way.

One popular macro is WITH-OUTPUT-TO-FILE. My budding RoboCup starter kit
had a vital WITH-STD-ATTEMPT macro. Oh god, no! I need to see the ANSI
Lisp commands for these things so I can really understand them. Better
yet...

why not the disassembly? preferably without meaningful symbols from the
HLL source. I think we are finally getting somewhere with TDP. Those
high order classes, functions, and macros keep me from seeing what is
really going on. Now if I could only see the microcode....

:)

kenny
 
J

james anderson

Rainer said:
Coming from a C/C++ background, I'm surprised by this attitude. Is
portability of code across different language implementations not a priority
for LISP programmers?

there are some things which the standard does not cover.

....
 
M

Matthias

Kenny Tilton said:
Oh, please:

"My point is... before I started teaching Scheme, weak students would
get overwhelmed by it all and would start a downward spiral. With
Scheme, if they just keep plugging along, weak students will have a
strong finish. And that's a great feeling for both of us!"

That kind of anecdotal crap is meaningless. We need statistics!
Preferably with lots of decimal places so we know they are accurate.

:)

Why the smiley? Many hours of discussions could be spared if there
were real, scientific, solid studies on the benefit of certain
language features or languages in certain domains or for certain types
of programmers. It would help successful languages become
accepted in slow/big/dumb organizations. It would point language
designers in the right directions. Project leaders could replace
trial-and-error by more efficient search techniques. (Assuming for a
second, programmers or managers would make rational decisions when
choosing a programming language and having available trustworthy
data.)

I imagine such studies are quite hard to do properly, but having them
would be useful.
 
T

Thomas F. Burdick

Andreas Rossberg said:
I'm not terribly familiar with the details of Lisp macros but since
recursion can easily lead to non-termination you certainly need tight
restrictions on recursion among macros in order to ensure termination of
macro substitution, don't you? Or at least some ad-hoc depth limitation.

I'm not terribly familiar with the details of Python's iteration constructs
but since iteration can easily lead to non-termination you certainly need tight
restrictions on ...

In some cases, recursive macros and functions are easier to get right
(avoid infinite recursion) than their iterative counterparts.
Careless coders will always find a way to code themselves into
infinite loops. The easy way to avoid infinite recursion is:

(cond
  ((===> base case <===) ...)
  ((===> another base case? <===) ...)
  ((...) recursive call)
  ((...) recursive call)
  ...
  (t recursive call))

Most of the time, it's easy to ensure that all recursive calls "move
up" the cond tree. Times when you can't do that (or not easily), you
should be writing iterative code, or you're just doing something
inherently difficult.
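The same base-case-first discipline translates directly to Python; in this toy sketch (the function is invented for illustration) every branch either hits a base case or recurses on a strictly shorter list:

```python
# Collapse runs of adjacent blank strings down to one blank.
def collapse_blanks(items):
    if not items:                    # base case: empty list
        return []
    if len(items) == 1:              # base case: single element
        return list(items)
    if items[0] == '' == items[1]:   # drop one of two adjacent blanks
        return collapse_blanks(items[1:])
    return [items[0]] + collapse_blanks(items[1:])

print(collapse_blanks(['a', '', '', 'b']))  # ['a', '', 'b']
```

Since each recursive call receives a list one element shorter, termination is immediate to verify.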
"Computer games don't affect kids; I mean if Pac Man affected us
as kids, we would all be running around in darkened rooms, munching
magic pills, and listening to repetitive electronic music."
- Kristian Wilson, Nintendo Inc.

(That's a great sig!)

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'
 
D

Doug Tolton

Alex said:
Doug Tolton wrote:
...



I never used Common Lisp in production: in the period of my life when I
was hired (by Texas Instruments) specifically for my knowledge of "Lisp",
that meant Scheme and a host of other dialects (mostly but not entirely now
forgotten). I did use things that "passed for" macros in those dialects:
I had no choice, since each TI lab or faction within the lab was using a
different divergent mutant thing, all named "lisp" (save a few were named
"scheme" -- hmmm, I do believe that some were using Prolog, too, but I
did not happen to use it in TI), with some of the divergence hinging on
locally developed sets of macros (and some on different vendors/versions).

For all I know, CLisp's macros are SO head and shoulders above any of a
quarter century ago that any vaguely remembered technical problem from
back then may be of purely historical interest. I do believe that the
divergence problem has more to do with human nature and sociology, and
that putting in a language features that encourage groups and subgroups
of users to diverge that language cannot be compensated by technical
enhancements -- it _will_, in my opinion, cause co-workers in any middle-
or large-sized organization to risk ending up standing on each others'
feet, rather than on each others' shoulders. (Remedies must of course
be sociological and lato sensu political first and foremost, but the way
the language & tools are designed CAN help or hinder).
I can understand and respect honest differences of opinions. I too
believe that causes of divergence are largely sociological. I differ
though in thinking that features which allow divergence will necessarily
result in divergence.

I have this personal theory (used in the non-strict sense here) that
given enough time any homogenous group will split into at least two
competing factions. This "theory" of mine had its roots in a nice
dinner at Medieval Times in California. We had arrived for dinner and
we were waiting to be seated, everyone was milling around in a sort of
shop/museum area. We had been given "crowns" for dinner, but no one
paid much attention to them. We were one large group of people, bound
by nothing and separated by nothing. Then one of the staff took a
microphone and began giving us instructions. She told us the color of
our hats indicated the color of the Knight we would be rooting for, and
that we would be sitting only with people of similar colored crowns.
Immediately the group (without instructions from the hostess) began
separating into similarly colored groups. Then they began calling the
groups by color to be seated. When they called our group, and we were
ascending the staircase, I looked over my shoulder at the remaining
groups. I was utterly shocked to see apparent hatred and revulsion
toward our group on people's faces. To me this was a game, but to some
people in the crowd, having a different colored crown was a serious
cause for enmity.

I have long looked back on that incident, and I have since compared it
to many situations I have observed. Over time it seems to me that human
beings are incapable of remaining as one single cohesive group, rather
that they will always separate into several competing factions. Or at
the very least groups will splinter off the main group and form their
own group.

So it doesn't surprise me when groups splinter and diverge if they are
not strictly controlled from an organizational or sociological point of
view.

However in the opensource world I expect splinters to happen frequently,
simply because there is little to no organizational control. Even
Python hasn't been immune to this phenomenon with both Jython and
Stackless emerging.

Some people want power and expressiveness. Some people want control and
uniformity. Others still will sacrifice high level constructs for raw
pedal to the metal speed, while others wouldn't dream of this sacrifice.

What I'm getting at is that I can understand why people don't like
Macros. As David Mertz said, some people are just wired in dramatically
different ways.
So, I'm nowhere near an _expert_ -- over 20 years' hiatus ensures I
just can't be. But neither is it totally 2nd hand information, and if
I gave the mistaken impression of never having used macros in a
production setting I must have expressed myself badly. I do know I
jumped on the occasion of moving to IBM Research, and the fact that
this would mean going back to APL instead of "lisp" (in the above
vague sense) did matter somewhat in my glee, even though I still
primarily thought of myself as a hardware person then (the programming
was needed to try out algorithms, simulate possible hardware
implementations thereof, etc -- it was never an end in itself).
Thank you for that clarification. I must have been mis-interpreting
something, because I did think you had never used them.
Given that I've heard "everything and its opposite" (within two constant
parameters only: S-expressions are an unalloyed good -- macros are good,
some say unconditionally, others admit they can be prone to abuse) from
posters on this thread from "people who actually know and use" Lisp, I
don't know how you could "mis-speak and mis-represent" as long as you
stick to the two tenets of party doctrine;-).
For me it isn't about party doctrine. :p My mindset very closely
matches Paul Graham's. I can understand why other people have a
different mindset, and from what you've said I can even understand why
you don't like Macros, I just have a different viewpoint.

What gets me is when people (and I do this sometimes as well) express an
opinion as fact, assuming that all rational people will agree with them. So,
for what it's worth, for the times I have expressed my opinion as the one
true way of thinking, I'm sorry.
If by Lisp you mean Common Lisp and exclude Scheme, I'm sure you do; if
Scheme is to be included, then I'm not sure (but it's quite possible,
nevertheless) -- at least the "spirit" of the small core and widespread
HOFs w/single-namespace seem to be things I understand more (but the
"spirit" of why it's so wonderful to have extensible syntax isn't:).

Honestly I've only used scheme in trivial things. My preference has
been more towards Common Lisp, primarily because I need it for building
real systems, rather than doing language research.

I'm sure there many things that you know that I don't. From what I
understand you've been at this a bit longer than I have. I've only been
doing serious programming for a little over ten years now. In that
time I have been involved with some very large projects on a very large
scale. I think I understand the concepts of abstraction and code reuse
pretty well, and how to build large systems that integrate the efforts
of numerous people.

I personally just don't believe macros are "evil" per se. I believe
they like any other tool can be used effectively, or misused
effectively. However most of my problems don't come from people who
misuse advanced features of the language, rather they come from people
who don't understand basic concepts of optimization and code reuse.

In any event, I think Lisp and Python are both great languages. I use
Python every day at work, and I work on learning more about Lisp (and a
top secret pet project ;) ) every day. I very much respect your
knowledge Alex, because I do believe you have some good insights, and I
do enjoy discussing issues that we disagree on (when we aren't being
"bristly" ;) ) because you have many times helped me to understand my
own point of view better. So even though we don't always agree, I still
appreciate your opinions.
 
V

Vis Mike

[snip]

Something like this seems more logical to me:

for line in file('input.txt').lines:
    do_something_with(line)

for byte in file('input.txt').bytes:
    do_something_with(byte)

Is it possible?

Mike
 
T

Thomas F. Burdick

Kenny Tilton said:
One popular macro is WITH-OUTPUT-TO-FILE. My budding RoboCup starter kit
had a vital WITH-STD-ATTEMPT macro. Oh god, no! I need to see the ANSI
Lisp commands for these things so I can really understand them. Better
yet...

why not the disassembly?

Fortunately for insane, paranoid programmers like Kenny who don't read
docstrings, and refuse to believe that others' libraries might work
correctly, Lisp is quite accommodating. You want to see what a macro
call expands to? Hey, I think we have tools for just that problem.
Disassembly? Now I could be mistaken, but I remember it being far
easier than in most languages ... oh yeah, DISASSEMBLE. Wow, I didn't
have to dig through a whole object file, or anything!
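(Python has an analogous tool in its standard library, for what it's worth: the dis module disassembles a function's bytecode straight from the REPL, much like DISASSEMBLE.)

```python
# Show the bytecode the interpreter runs for a small function.
import dis

def add_one(x):
    return x + 1

dis.dis(add_one)  # prints the bytecode listing for add_one
```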

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'
 
D

Dave Benjamin

[snip]

Something like this seems more logical to me:

for line in file('input.txt').lines:
    do_something_with(line)

for byte in file('input.txt').bytes:
    do_something_with(byte)

I like that. =)
Is it possible?

Depends on your definition of "possible".
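(As a sketch: .lines and .bytes are hypothetical, not attributes real file objects have, but a small wrapper class can supply them as lazy generators today:)

```python
# Wrapper giving a file path the .lines / .bytes view Mike proposed.
class Reader:
    def __init__(self, path):
        self.path = path

    @property
    def lines(self):
        # Yield one line at a time, newline included.
        with open(self.path) as f:
            for line in f:
                yield line

    @property
    def bytes(self):
        # Yield one byte at a time.
        with open(self.path, 'rb') as f:
            while True:
                b = f.read(1)
                if not b:
                    return
                yield b
```

With that in place, `for line in Reader('input.txt').lines:` reads exactly as written above.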

Dave
 
A

Andrew Dalke

Pascal Costanza:
So what's the result of ("one" - "two") then? ;)

It's undefined on strings -- a type error. Having + doesn't
mean that - must exist.

(A more interesting question would be to ask what
the result of "throne" - "one" is. But again, a type error.)
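(Python draws the same line, for the record: + is defined on strings, - is not, and the mismatch surfaces as a TypeError at runtime:)

```python
# + concatenates strings; - has no string meaning at all.
print("one" + "two")  # onetwo
try:
    "throne" - "one"
except TypeError as e:
    print("TypeError:", e)
```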

Pascal Costanza:
It's a myth that bytes are restricted to 8 bits. See
http://www.wikipedia.org/wiki/Byte

(I can't connect to the URL but I know what you're talking
about.)

Sure. But I'm just barely old enough to have programmed on
a CDC Cyber. When I see the word 'byte' I assume it means
8 bits unless told otherwise. When I buy memory, I don't ask
the sales staff "so is this an 8 bit byte or a 60 bit byte?" (And
yes, I know about the lawsuit against disk drive manufacturers
and their strange definition of "gigabyte", but even then, they
still use an 8 bit byte.)

Me:
Pascal Costanza:
No, not yet. ANSI CL was finalized in 1994.

Sure. That's why I asked about consensus. De facto rather
than de jure. SAX for XML processing is a de facto standard
but portable across different implementations and even
portable across different languages (that is, there's a very
mechanical and obvious way to convert from one language
to another.)
Again, not part of ANSI CL. Don't judge a standardized language with the
measures of a single-vendor language - that's a different subject.

I understand your point of view. OTOH, it's like when I used to
work with C. It was standardized, but required that I download
a slew of packages in order to do things. Eg, I had to evaluate
several different regexp packages before I found one which was
appropriate. I know there are good reasons for a standard to
leave out useful packages, but I know there are good reasons for
an implementation to include a large number of useful packages.

Is there a free Lisp/Scheme implementation I can experiment with
which include in the distribution (without downloading extra
packages; a "moby" distribution in xemacs speak):
- unicode
- xml processing (to some structure which I can use XPath on)
- HTTP-1.1 (client and server)
- URI processing, including support for opening and reading from
http:, file:, and https:
- regular expressions on both 8-bit bytes and unicode
- XML-RPC
- calling "external" applications (like system and popen do for C)
- POP3 and mailbox processing

As far as I can tell, there isn't. I'll need to mix and match packages
from a variety of sources. It isn't like these are highly unusual
requirements; I use them pretty frequently in Python. For examples:

- connect to my POP server, delete messages with .exe attachments
on the assumption that it's spam
- Use DAS (see http://www.biodas.org/) to get genomic sequence
annotations. Requires HTTP and XML processing (mostly
Latin-1 encoding)
- Make an XML-RPC server which takes as input molecular
structure information (eg, "CCO" is ethanol) and calls several
existing command-line packages to compute properties about
the compound. Parse the output with regular expressions and
combine and return all the results.
- Make a client which uses that server.

(Okay, looks like https isn't needed for these, but the rest are.)
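(The first of those tasks really is just standard library work in Python; this is a sketch, with HOST, USER, and PASSWORD as placeholders rather than anything from the original post:)

```python
# Delete POP3 messages carrying .exe attachments.
import poplib
from email.parser import Parser

def has_exe_attachment(message_text):
    """True if any MIME part's filename ends in .exe."""
    msg = Parser().parsestr(message_text)
    for part in msg.walk():
        name = part.get_filename()
        if name and name.lower().endswith('.exe'):
            return True
    return False

def sweep(host, user, password):
    box = poplib.POP3(host)
    box.user(user)
    box.pass_(password)
    count = len(box.list()[1])
    for i in range(1, count + 1):
        text = b'\n'.join(box.retr(i)[1]).decode('latin-1')
        if has_exe_attachment(text):
            box.dele(i)   # flag it; actually removed on quit()
    box.quit()

print(has_exe_attachment("Subject: hi\n\nplain text\n"))  # False
```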
(Apart from that, Jython also doesn't provide everything that Python
provides, right?)

No, but there is a good overlap. I believe all of the above are
supported on both implementations.
Pick the one Common Lisp implementation that provides the stuff you
need. If no Common Lisp implementation provides all the stuff you need,
write your own libraries or pick a different language. It's as simple as
that.

Which Common Lisp *distribution* provides the above? I
don't doubt that an implementation + some add-in packages do
all that, but each new package means one extra lump on the
barrier to entry and one extra worry when I want others to
use the code I've written.

I use Python in part because of the "batteries included"
philosophy. There will always be 3rd party packages (PIL for
images, ReportLab for PDF generation, PyDaylight or OEChem
for chemical informatics), but there's a lot which comes in
the standard distributions.
You can ask these things in comp.lang.lisp or in one of the various
mailing lists. Common Lispniks are generally very helpful.

Understood. I did manage to get the biolisp code working, btw.

Andrew
 
A

Alex Martelli

Kenny said:
Can someone write a nifty Python hack to figure out how many times
Lispniks have tried to get Alex to explain how macros are any different
than high-order functions or new classes when it comes to The Divergence
Problem? I love that we have given it a name, by the way.

The very 'feature' that was touted by Erann Gat as macros' killer advantage
in the WITH-CONDITION-MAINTAINED example he posted is the crucial
difference: functions (HO or not) and classes only group some existing code
and data; macros can generate new code based on examining, and presumably to
some level *understanding*, a LOT of very deep things about the code
arguments they're given. If all you do with your macros is what you could
do with HOF's, it's silly to have macros in addition to HOF's -- just
MTOWTDItis encouraging multiple different approaches to solve any given
problem -- this, of course, in turn breeds divergence when compared to a
situation in which just one approach is encouraged. If you do use the
potential implied in that example from Gat, to do things that functions and
classes just couldn't _begin_ to, it's worse -- then you're really
designing your own private divergent language (which most posters from
the Lisp camp appear to assert is an unalloyed good, although admittedly
far from all). This is far from the first time I'm explaining this, btw.

Oh, and if you're one of those who disapprove of Gat's example feel free
to say so, but until I hear a substantial majority denouncing it as idiotic
(and I haven't seen anywhere near this intensity of disapproval for it from
your camp) I'm quite justified in taking it as THE canonical example of a
macro doing something that is clearly outside the purview of normal tools
such as functions and classes. As I recall there was a lot of that going
on in TI labs, too -- instead of writing and using compilers for hardware
description languages, circuit simulators, etc, based on appropriate and
specialized languages processed with the help of general-purpose ones,
the specialized languages (divergent and half-baked) were embedded in
programs coded in the general-purpose languages (Lisp variants, including
Scheme; that was in 1980) using macros that were supposed to do
everything but serve you coffee while you were waiting -- of course when
the snippets you passed (to represent hardware operation) were correct
from the GP language viewpoint but outside the limited parts thereof that
the macros could in fact process significantly down to circuit design &c,
the error messages you got (if you were lucky enough to get error
messages rather than just weird behavior) were QUITE interesting.

One popular macro is WITH-OUTPUT-TO-FILE. My budding RoboCup starter kit
has a vital WITH-STD-ATTEMPT macro. Oh god, no! I need to see the ANSI

Do they do things a HOF or class could do? If so why bother using such
an over-powered tool as macros instead of HOFs or classes? If not, how do
they specially process and understand code snippets they're passed?
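Alex's question has a concrete answer for WITH-OUTPUT-TO-FILE in particular: it is pure setup/teardown around a body, so a HOF covers it; in Python the idiomatic form is a context manager. A minimal sketch using only the standard library (the helper name is invented for illustration):

```python
import contextlib
import io

@contextlib.contextmanager
def with_output_to_string():
    # Python analogue of a WITH-OUTPUT-TO-* macro: redirect output for
    # the duration of a body, restore it afterwards. No code analysis
    # is involved, which is why a HOF/context manager suffices here.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        yield buf

with with_output_to_string() as out:
    print("hello")

captured = out.getvalue()
# captured is "hello\n"
```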


Alex
 

Andrew Dalke

james anderson:
i realize that this thread is hopelessly amorphous, but this post did
introduce some concrete issues which bear concrete responses...

Thank you for the commentary.
i got only as far as the realization that, in order to be of any use, unicode
data management has to support the eventual primitive string operations. which
introduces the problem that, in many cases, these primitive operations
eventually devolve to the respective os api. which, if one compares the apple
and unix apis, are anything but uniform. it is simply not possible to provide them
with the same data and do anything worthwhile. if it is possible to give some
concrete pointers to how other languages provide for this i would be
grateful.

Python does it by ignoring the respective os APIs, if I understand
your meaning and Python's implementation correctly. Here's some
more information about Unicode in Python

http://www.python.org/peps/pep-0100.html
http://www.python.org/peps/pep-0261.html
http://www.python.org/peps/pep-0277.html

http://www.python.org/doc/current/ref/strings.html

http://www.python.org/doc/current/lib/module-unicodedata.html
http://www.python.org/doc/current/lib/module-codecs.html

and i have no idea what people do with surrogate pairs.

See PEP 261 listed above for commentary, and you may want
to email the author of that PEP, Paul Prescod. I am definitely
not the one to ask.
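Andrew's point that Python sidesteps the platform APIs can be seen directly: string operations work on abstract code points, and encoding to bytes happens only at the boundary. A small sketch in modern Python 3 syntax (the thread predates it, but the model is the one the PEPs above established); the last lines touch the surrogate-pair question, since surrogates appear only in the UTF-16 encoding, not in the string itself:

```python
import unicodedata

s = "caf\u00e9"                     # 'café' as abstract code points
assert len(s) == 4                  # length counts code points, not bytes

name = unicodedata.name("\u00e9")   # character properties, platform-independent

utf8 = s.encode("utf-8")            # byte form chosen only at the boundary
utf16 = s.encode("utf-16-le")       # same string, different boundary encoding

emoji = "\U0001F600"                # one code point outside the BMP...
pair = emoji.encode("utf-16-le")    # ...becomes a surrogate pair (4 bytes)
# len(emoji) == 1, len(pair) == 4
```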
yes, there are several available common-lisp implementations for http clients
and servers. they offer significant trade-offs in api complexity,
functionality, resource requirements and performance.

And there are several available Python implementations for the same;
Twisted's being the most notable. But the two main distributions (and
variants like Stackless) include a common API for it, which makes
it easy to start, and for most cases is sufficient.

I fully understand that it isn't part of the standard, but it would be
useful if there was a consensus that "packages X, Y, and Z will
always be included in our distributions."
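The "common API" Andrew means is the stdlib HTTP client: urllib in the 2.x line of the time, urllib.request today. A minimal sketch with the modern names; the `fetch` helper is invented for illustration, and the `data:` URL lets the same opener be exercised without a network:

```python
import urllib.request

def fetch(url, timeout=10):
    # One-call GET using only the standard library -- the
    # "sufficient for most cases" API the post refers to.
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

# data: URLs go through the same default opener, so the API can be
# demonstrated offline:
body = fetch("data:text/plain,hello")
# body is b"hello"
```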
if one needs to _port_ it to a new lisp, yes. perhaps you skipped over the
list of lisps to which it has been ported. if you look at the #+/-
conditionalization, you may observe that the differences are not
significant.

You are correct, and I did skip that list.

Andrew
(e-mail address removed)
 

Kenny Tilton

Matthias said:
Why the smiley?


Sorry, I was still laughing to myself about that study with the lines of
code count (and measuring the power of a language by the number of
machine instructions per line or whatever that was).
...Many hours of discussions could be spared if there
were real, scientific, solid studies on the benefit of certain
language features or languages...

Studies schmudies. Everyone knows 10% of the people do 90% of the code
(well it might be 5-95). Go ask them. I think they are all saying (some)
Lisp and/or Python right now.

in certain domains or for certain types
of programmers.

There's that relativism thing again. I think a good programming language
will be good for everyone, not some. What many people do not know is
that Lisp (macros aside!) is just a normal computer language with a
kazillion things done better, like generic functions and special
variables to name just two. Norvig himself talked about this, pardon my
laziness in not scaring up that well-known URL: Python is getting to be a
lot like Lisp, though again macros forced him into some hand-waving.
.. It would help get successful languages become
accepted in slow/big/dumb organizations.

Why you starry-eyed dreamer, you! Yes, here comes the PHB now waving his
copy of Software Engineering Quarterly.

It would point language
designers in the right directions. Project leaders could replace
trial-and-error by more efficient search techniques. (Assuming for a
second, programmers or managers would make rational decisions when
choosing a programming language and having available trustworthy
data.)

Careful, any more of that and the MIB will come get you and send you
back to the planet you came from.
I imagine such studies are quite hard to do properly, but having them
would be useful.

OK, I am smiling again at the first half of that sentence. But there is
hope. My Cells package naturally exposes the interdependency of program
state, something Brooks (correctly) identified as a huge problem in
software engineering, hence his (mistaken) conviction there could be no
silver bullet.

Now Cells can be (and have been, to various degrees) ported to C++,
Java, and Python. If those ports were done as fully as possible, such
that they passed the regression tests used on the Lisp reference
implementation, we could then measure productivity, because (I am
guessing) the internal state dependencies will serve quite nicely as a
measure of "how much" program got written by a team, one which could be
used to compare intelligently the productivity on different projects in
different languages. (You can't have the same team do the same project,
and you can't use two different teams, for obvious reasons.)
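The interdependency bookkeeping Kenny describes can be sketched in a few lines of Python. This is a toy illustration of the dataflow idea behind Cells (a value computed from other cells stays consistent with them), not a port of the actual package; real Cells caches and propagates changes rather than recomputing on every read:

```python
class Cell:
    # Toy dataflow cell: holds either a direct value or a formula over
    # other cells. Reads re-run the formula, so dependents always
    # reflect current inputs.
    def __init__(self, value=None, formula=None):
        self._value = value
        self._formula = formula

    def get(self):
        return self._formula() if self._formula else self._value

    def set(self, value):
        self._value = value

a = Cell(3)
b = Cell(4)
total = Cell(formula=lambda: a.get() + b.get())

first = total.get()    # 7
a.set(10)
second = total.get()   # 14 -- the dependency is maintained automatically
```

It is exactly this web of formula-to-input edges that could serve as the language-neutral "how much program got written" measure proposed above.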

kenny
 

Andrew Dalke

Pascal Bourguignon:
Because the present is composed of the past. You have to be
compatible, otherwise you could not debug a Deep Space 1 probe
160 million km away, (and this one was only two or three years old).

Huh? I'm talking purely in the interface. Use ASCII '[' and ']' in the
Lisp code and display it locally as something with more "directionality".
I'm not suggesting the unicode character be used in the Lisp code.
Take advantage of advances in font display to overcome limitations
in ASCII.
Mathematicians indeed overload operators, taking into account
their precise properties. But mathematicians are naturally
intelligent. Computers and our programs are not. So it's easier if
you classify operators per properties; if you map the semantics to the
syntax, this allows you to apply transformations on your programs based
on the syntax without having to recover the meaning.

Ahhh, so make the language easier for computers to understand and
harder for intelligent users to use? ;)

Andrew
(e-mail address removed)
 

Kenny Tilton

Thomas said:
Fortunately for insane, paranoid programmers like Kenny who don't read
docstrings, and refuse to believe that others' libraries might work
correctly, Lisp is quite accommodating. You want to see what a macro
call expands to? Hey, I think we have tools for just that problem.
Disassembly? Now I could be mistaken, but I remember it being far
easier than in most languages ... oh yeah, DISASSEMBLE. Wow, I didn't
have to dig through a whole object file, or anything!
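For comparison, Python's stdlib offers a rough analogue of DISASSEMBLE in the dis module (macro expansion has no Python counterpart, but bytecode inspection is likewise one call away):

```python
import dis
import io

def double(x):
    return x * 2

# Capture the disassembly as text, much as DISASSEMBLE prints a
# function's compiled code at the REPL.
buf = io.StringIO()
dis.dis(double, file=buf)
listing = buf.getvalue()
# listing contains opcodes such as LOAD_FAST and RETURN_VALUE
```

The exact opcode names vary across CPython versions, so treat the listing as diagnostic output rather than a stable interface.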

<rofl> yer right! Lisp is sick. I am going to go disassemble DOTIMES
right now, find out once and for all what those plotters over at Franz
are up to.

kenny

ps. Arnold is your governor, Arnold is your governor, nyeah, nyeah, nya,
nyeah, nyeah.
 
