Is LISP the ultimate programming language?


Stefan Ram

Jerry Coffin said:
As for the rest, you seem far more interested in advocating a
viewpoint than being accurate. Characterizing six out of thirteen as
"most" is little short of a blatant lie.

On that page, I read:

»binary-trees 1/3
regex-dna 1/3
k-nucleotide 1/3
reverse-complement 1/3
mandelbrot 1/2
chameneos-redux 1/1
pidigits 1/1
spectral-norm 1/1
fannkuch 1/1
n-body 1/1
fasta 1/1«

http://shootout.alioth.debian.org/u32q/benchmark.php?test=all&lang=gpp&lang2=java&box=1

These are 6 entries with »1/1« and 5 others.

»1/1« seems to indicate a ratio of run-times of
approximately 1.
 

osmium

Juha Nieminen said:
As others have concluded, there's no silver bullet. However, I like
C++ much more than C for many reasons.

For one, standard C++ offers enormously more tools than standard C
does, right out of the box. In C, if you want to do anything even
slightly complicated, you either need to implement all data containers
and such from scratch, download (and learn) third-party libraries which
may or may not be simple to use, or keep a personal collection of such
libraries (self-made or third-party) which you always "carry around"
wherever you might have to write a small C program.

While standard C++, quite naturally, doesn't offer ready solutions for
every possible problem, it nevertheless offers good solutions for a great
many common problems. There aren't many programs I have written in C++
where I wouldn't have found the C++ standard library quite useful and
practical (i.e. the parts which C lacks).

Also, C++ makes it a lot easier to use e.g. generic data containers and
algorithms than C does. For example, if I need a balanced binary search
tree of strings, almost everything I may ever need to do with it can be
done with one-liners or two-liners. Most importantly, I don't need to
worry about the memory management of those strings (nor the tree nodes)
in any way. And moreover, if I need a similar binary tree containing
something other than strings (like integers or some struct instances),
the exact same library can very easily be used for that too, with an
almost identical syntax. In C, such generic data containers are by
necessity complicated to use, error-prone and often less efficient.
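
Something like this minimal sketch is what I mean (illustrative only;
std::set is typically implemented as a balanced binary search tree):

#include <iostream>
#include <set>
#include <string>

int main()
{
    // A balanced binary search tree of strings: inserting, searching and
    // iterating are one-liners, and the strings' memory is managed for us.
    std::set<std::string> words;
    words.insert("lisp");
    words.insert("c++");
    words.insert("fortran");

    if (words.count("c++"))
        std::cout << "found c++\n";

    for (std::set<std::string>::const_iterator it = words.begin();
         it != words.end(); ++it)
        std::cout << *it << '\n';

    // The exact same container works unchanged for other element types.
    std::set<int> numbers;
    numbers.insert(42);
}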

Automatic memory management of stack-allocated objects is also a huge
timesaver and makes it enormously easier to write safe programs which
handle dynamically allocated memory. Such programs become shorter,
cleaner, easier to understand and maintain and, most importantly, safer.
Also, if standard containers are used, other C++ programmers will
understand the code much more easily because it will contain
significantly less custom code and custom programming conventions.
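
For instance, in a purely illustrative function like the one below, nothing
has to be freed by hand, whether it returns normally or an exception
propagates:

#include <stdexcept>
#include <string>
#include <vector>

std::vector<std::string> read_tokens(bool fail)
{
    std::vector<std::string> tokens;   // owns all of its memory
    tokens.push_back("alpha");
    tokens.push_back("beta");

    if (fail)
        throw std::runtime_error("parse error");   // tokens cleaned up here

    return tokens;                     // or handed back to the caller
}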

C++ might be a "Swiss army knife", but if all those features help me
write shorter, cleaner and safer programs more easily, I'm all for it.
I don't consider being a "Swiss army knife" a bad thing, even though the
term is often used in a derogatory manner.

Compared to the "Swiss army knife" that C++ is, C is just a regular
knife. Without a handle. You might be able to do something with it, but
be careful not to cut your fingers off.

I think we are in pretty complete agreement: why would anyone use C when
they can use C++?

My argument was that the improvement in productivity is closer to 40% than
to the 1000% suggested by Kanze. Put starkly, I think C++ is overhyped.
Thinking about my post after I made it, I think even 40% may be too high.
There are lots of problems where the C++ advantage is minuscule.

Overstating the advantages of a language one will spend hundreds of hours
learning is not good; it can only lead to disappointment.
 

Anand Hariharan


[ ... ]
Google for "Worse is Better".  It was written at a time when C was
becoming more popular than LISP and gives insights on why it was so.

Mostly it gives insights into the phenomenon most commonly referred
to as "sour grapes". I.e. it was written by a guy who could see that
what he liked was losing out to something he didn't like. He
proceeded to proclaim, in essence, "Look, the whole rest of the band
is out of step!"

FWIW, I disagree with your editorial.

Yes, Richard Gabriel was clearly prejudiced in favour of LISP, against
UNIX and C. Yes, many of his arguments were strawmen.

That said, my takeaway from his essay was that for a programming
language (or for that matter, most anything) to succeed -- the world
expects/insists simplicity rather than correctness from it.

The irony here is that C++ is anything but simple!
 

Stefan Ram

Anand Hariharan said:
That said, my takeaway from his essay was that for a programming
language (or for that matter, most anything) to succeed -- the world
expects/insists simplicity rather than correctness from it.

Another opinion:

"Unfortunately, the success or failure of a computer
language is often dependent on factors unrelated to its
technical merits (...)"

S. H. Valentine in "The Computer Journal", Vol. 17, No. 4, p. 331

@article{DBLP:journals/cj/Valentine74,
  author    = {Samuel H. Valentine},
  title     = {Comparative Notes on ALGOL 68 and PL/I.},
  journal   = {Comput. J.},
  volume    = {17},
  number    = {4},
  year      = {1974},
  pages     = {325-331},
  bibsource = {DBLP, http://dblp.uni-trier.de}
}

http://dblp.uni-trier.de/rec/bibtex/journals/cj/Valentine74
 

Balog Pal

Jerry Coffin said:
[ ... ]
- At the time LISP was created, FORTRAN was IIRC
unable to do recursive function calls, so possibly
that was implemented in LISP first, too. See

Simon, Newell and Shaw's IPL implemented recursion before McCarthy
even started working on Lisp.

As for the rest, you seem far more interested in advocating a
viewpoint than being accurate.

LOL. I'm not sure that was the actual intent, but the post definitely gives
that feeling, and uses selective cuts that in turn twist reality.
Characterizing six out of thirteen as
"most" is little short of a blatant lie.

The point here is not the count, but that the selection of tests is fishy in
itself; the figures beyond the isolated benchmark time show a huge memory
footprint, which in a real app translates to wasted time; and it uses gcc as
the strawman, which is known for being way lousy in the optimisation field.
Quoting Joel Spolsky
makes you seem overly credulous -- while he seems like a perfectly
decent guy, he doesn't seem to have any credentials to qualify him as
an authority on much of anything.

The referenced article is a pretty good one with a ton of insight. The quoted
portion is not its main thrust, but what is more important, in the sense used
there C++ counts in the 'managed' group :)). Unless, of course, one tries to
sell C code as C++. Normal C++ code does not manage memory, but uses
vector, string and locals naturally, and for the tiny portion requiring dynamic
creation, the suite of smart pointers.

And that in effect brings superior productivity. Joel is right here too.

Going to the essence of the thought, it should not pick 'memory' as a
special item, but go for more generality: a language that allows automation
of pesky management is way more productive than one that relies on the
programmer to do everything. And C++ with RAII enabled is the absolute king,
covering more than memory.
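
As a purely illustrative sketch (not from any particular library), a
hand-rolled RAII wrapper for a C FILE handle shows the "more than memory"
point: the destructor releases the resource on every exit path.

#include <cstdio>
#include <stdexcept>

class File {
public:
    File(const char* name, const char* mode)
        : handle_(std::fopen(name, mode))
    {
        if (!handle_)
            throw std::runtime_error("cannot open file");
    }
    ~File() { std::fclose(handle_); }    // runs on return *and* on exceptions

    std::FILE* get() const { return handle_; }

private:
    File(const File&);                   // non-copyable (pre-C++11 style)
    File& operator=(const File&);

    std::FILE* handle_;
};

void append_line(const char* path)
{
    File f(path, "a");                   // resource acquired here
    std::fputs("hello\n", f.get());
}                                        // released here, no matter what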
Those, however, pale to insignificance when you quote Ian Joyner.

He says in the intro:
http://burks.brighton.ac.uk/burks/pcinfo/progdocs/cppcrit/index001.htm
"Another factor has been the publishing of Bjarne Stroustrup's "Design and
Evolution of C++" [Stroustrup 94]. This has many explanations of the
problems of extending C with object-oriented extensions while retaining
compatibility with C. In many ways, Stroustrup reinforces comments that I
made in the original critique, but I differ from Stroustrup in that I do not
view the flaws of C++ as acceptable, even if they are widely known, and many
programmers know how to avoid the traps. Programming is a complex endeavour:
complex and flawed languages do not help."

That translates to the usual "hey, I could create so much better a language
from scratch". We have hundreds of those cool languages, written for the
drawer.

His following quote of the Java white paper made me LOL, then cry. Guess this guy
thinks Java 1.0 is some good thing. 'Nuff said.


Just to add something positive: when it comes to critiques of C++ I would rather
choose Matthew Wilson's "Imperfect C++".


(OTOH, IMO LISP is "the ultimate language" in many senses)
 

Jerry Coffin


[ ... ]
That said, my takeaway from his essay was that for a programming
language (or for that matter, most anything) to succeed -- the world
expects/insists simplicity rather than correctness from it.

The irony here is that C++ is anything but simple!

So either C++ flopped completely, or his claim of a preference for
simplicity was wrong -- and C++ doesn't seem to have completely
flopped.
 

Jerry Coffin

[ ... ]
These are 6 entries with »1/1« and 5 others.

»1/1« seems to indicate a ratio of run-times of
approximately 1.

My apologies -- the '13' was a typo. It should have read '11'. That
doesn't really change anything though -- 6 out of 11 still isn't
"most", or even very close to it.
 

Keith H Duggar

Going to the essence of the thought, it should not pick 'memory' as a
special item, but go for more generality: a language that allows automation
of pesky management is way more productive than one that relies on the
programmer to do everything. And C++ with RAII enabled is the absolute king,
covering more than memory.
[snip]

(OTOH, IMO LISP is "the ultimate language" in many senses)

That is intriguing given that you appreciate RAII+RRID (resource
release is destruction) as a general method for controlling any
resource (not just memory); and given that LISP does not support
RAII+RRID. Any further thoughts on this one aspect of comparison?

KHD
 

Rui Maciel

Anand said:
That said, my takeaway from his essay was that for a programming
language (or for that matter, most anything) to succeed -- the world
expects/insists simplicity rather than correctness from it.

The irony here is that C++ is anything but simple!

I don't believe that simplicity is an important factor. If that were the case then languages such as Perl
would never have caught on.


Rui Maciel
 

Keith H Duggar

I don't believe that simplicity is an important factor. If that were
the case then languages such as Perl would never have caught on.

I think if one considers "worse is better" more carefully
you find that the essence is pontification vs production.
In other words, those who sit about pontificating the ivory
perfection of the "ideal" solution lose to those who
produce a workable solution in the meantime.

KHD
 

Balog Pal

"Juha Nieminen"
Where are you getting such an exact number as "40%" from?

I have programmed both in C and C++, and I consider myself quite
proficient in both (but more in the latter, naturally), and I have to
agree with the (rough) estimate of 1000% improved productivity.

It is probably way dependent on what quality level you aim at. If it is
high (i.e. no tolerance for problems) the gap opens wide. TMK James works
in such an environment. Me too. I'd also vouch for the 1000% estimate, though
I must add that I'm yet to see the desired quality reached with C, which leaves
little to compare...
 

Balog Pal

Keith H Duggar said:
I think if one considers "worse is better" more carefully
you find that the essence is pontification vs production.
In other words, those who sit about pontificating the ivory
perfection of the "ideal" solution lose to those who
produce a workable solution in the meantime.

It's still a poor way to put it. Why not stick with Voltaire's "Perfect is
the enemy of good"?
 

James Kanze

"Juha Nieminen"
It is probably way dependent on what quality level you aim at.
If it is high (i.e. no tolerance for problems) the gap opens
wide. TMK James works in such an environment. Me too. I'd also
vouch for the 1000% estimate, though I must add that I'm yet to
see the desired quality reached with C, which leaves little to
compare...

It also depends on the application domain. If you're writing
small numeric applications, where the only real types you're
dealing with are double and fixed sized arrays of double (and
int for indexing, of course), then the gap is considerably
smaller than if you're having to deal with more complex
abstractions (and in older C, even complex numbers would qualify
as a "more complex abstraction"), dynamic allocations and large
teams.
 

James Kanze

I don't believe that simplicity is an important factor. If
that were the case then languages such as Perl would never
have caught on.
I think if one considers "worse is better" more carefully you
find that the essence is pontification vs production. In
other words, those who sit about pontificating the ivory
perfection of the "ideal" solution lose to those who produce a
workable solution in the meantime.

Stated that way, your statement is as tendentious as the original
paper :-). The major problem with the original paper is that it
is arguing a foregone conclusion.

There is a hint of a valid point in the original paper: if I had
to categorize it, it would be something like "works today vs.
correct tomorrow" (not that the Lisp community has any monopoly
on "correct tomorrow", and the definitions the paper uses for
"better" don't always correspond to what I would consider
"correct"). If you expand the idea, however, you realize that
you can't always ignore the importance of "works today", and
that the two positions don't really have to be in opposition: a
lot of places I've seen do use the principle of a working
prototype, as soon as possible, followed up with a more rigorous
implementation later. One could argue that in doing this, you
should use a very dynamic language (e.g. something like
Smalltalk) for the prototype, and a bondage and discipline
language (Ada?) for the final version; perhaps one of the things
C++ has going for it is that it can be used both ways, so you
don't need to use two different languages.
 

Nick Keighley

That is intriguing given that you appreciate RAII+RRID (resource
release is destruction) as a general method for controlling any
resource (not just memory); and given that LISP does not support
RAII+RRID. Any further thoughts on this one aspect of comparison?

Scheme has its (dynamic-wind before proc after), which calls "proc" and
calls "before" before entry to proc and "after" on exit from proc.
Since continuations allow proc to be exited (and re-entered), this is
not a matter of simply calling one at the beginning and one at the end.
Scheme has the same early exit possibilities as C++ exceptions. So for
resources other than memory you could build something like RAII.
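
For comparison, the closest everyday C++ analogue of that before/after
bracketing is a constructor/destructor pair; a rough, illustrative sketch
(it cannot express re-entry via continuations):

#include <iostream>

// Illustrative only: runs one action on scope entry and another on every
// scope exit, roughly what dynamic-wind's before/after thunks do.
struct Bracket {
    Bracket()  { std::cout << "before\n"; }
    ~Bracket() { std::cout << "after\n"; }
};

void proc()
{
    Bracket guard;
    std::cout << "body\n";
    // returning or throwing here still prints "after"
}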

I'm not sure if Common Lisp (what is often meant by "Lisp") has
corresponding features.



nick keighley
 

ardjussi

And that is your problem.  There is no magic bullet.  There are
a number of important paradigms which improve productivity, each
important, and the strength of C++ is that all of them are
possible in it.  Any one may be better implemented in another
language, but the real productivity gain is in using whichever
one is appropriate in a given situation.


Perhaps.  The screwdriver in a Swiss army knife probably isn't
as good as a purpose built screwdriver, but if you need to drive
a screw, it's a lot better than a hammer.


That's certainly a problem, and a more Pascal-like syntax (with
a lot fewer special characters) would certainly improve things.
But the number of special characters, or the use of {} instead
of begin/end is, in the end, a detail.  The real difference comes
with encapsulation (and access control), the ability to use
polymorphism when appropriate (and the fact that you're not
stuck with it when it isn't), the ability to use programming by
contract idioms (with private virtual functions), the ability to
define abstract types (like std::vector), the ability to
separate interface (in the header file) and implementation (in
the source).  None of these are unique to C++, and for any one,
you could probably find a better language, but having all of the
possibilities at hand makes C++ a powerful tool.  Not perfect,
but it works, and the alternatives all seem to have some fatal
weakness.
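
For concreteness, the "programming by contract with private virtual
functions" idiom mentioned above looks roughly like this illustrative
sketch (the non-virtual interface idiom; names are invented here):

#include <cassert>

class Shape {
public:
    virtual ~Shape() {}

    // Public, non-virtual: enforces the contract for every derived class.
    double area() const
    {
        double a = doArea();     // the customizable step
        assert(a >= 0.0);        // post-condition checked in one place
        return a;
    }

private:
    // Derived classes override this, but cannot bypass the check above.
    virtual double doArea() const = 0;
};

class Square : public Shape {
public:
    explicit Square(double side) : side_(side) {}
private:
    virtual double doArea() const { return side_ * side_; }
    double side_;
};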
In my understanding there is a single weakness in the alternatives: in
this discussion
http://www.artima.com/forums/flat.jsp?forum=106&thread=268226
Achilleas Margaritis gives an interesting insight into C++ being the only
language combining high and low level.

br Jussi
 

Jerry Coffin

[ ... ]
In my understanding there is a single weakness in the alternatives: in
this discussion
http://www.artima.com/forums/flat.jsp?forum=106&thread=268226
Achilleas Margaritis gives an interesting insight into C++ being the only
language combining high and low level.

I can't agree that it's the only one -- Ada (for one example)
supports about the same levels of abstraction.

If we eliminate that part, I'd agree -- or more accurately, he's
agreeing with something I said around 5 years ago:

Then again, C++ does have one advantage in this respect:
it supports enough different levels of abstraction that
it can be used right down to the metal, or in a fairly
safe, high-level manner, and perhaps most importantly,
more or less seamlessly marrying the two, so even in my
system programming, much of what I write works at a fairly
high level of abstraction.

http://groups.google.com/group/comp.lang.java/msg/36943ca538b19724
 

Stefan Ram

On the other hand, many features today's Lisp programmers
are fond of, like macros and CLOS, were not a part of the
classical LISP.

Indeed, there is a paper claiming that Lisp macros were
an »inspiration« for C macros:

»Lisp programmers have long used macros to extend their
language. Indeed, their success has inspired macro
notations for a variety of other languages, such as C
and Java.«

http://www.cs.brown.edu/~sk/Publications/Papers/Published/sk-automata-macros/paper.pdf

This influence is neither contradicted nor confirmed by:

»Many other changes occurred around 1972-3, but the most
important was the introduction of the preprocessor,
partly at the urging of Alan Snyder [Snyder 74], but
also in recognition of the utility of the
file-inclusion mechanisms available in BCPL and PL/I.«

http://cm.bell-labs.com/cm/cs/who/dmr/chist.pdf

It is possible that C macros indeed were influenced by LISP
macros:

(Begin of quote from <[email protected]>)
From: Kaz Kylheku <[email protected]>
Newsgroups: comp.lang.lisp,comp.lang.java.programmer
Subject: Re: macros
Date: Fri, 26 Jun 2009 22:30:12 +0000 (UTC)
Message-ID: <[email protected]>
(...)
The Software Preservation Group web page for Lisp history,
http://www.softwarepreservation.org/projects/LISP/ has a link to an
October 22, 1963 paper "MACRO Definition for LISP" by Timothy P. Hart,
ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-057.pdf

I don't know enough about Lisp to evaluate whether what it describes is
the foundation of the current Lisp feature. Perhaps someone with the
necessary knowledge could take a look at it?

It positively, definitely describes the foundations of macros.
(End of quote from <[email protected]>)
 

Jerry Coffin

[ ... ]
Indeed, there is a paper claiming that Lisp macros were
an »inspiration« for C macros:

If you honestly want to know what "inspired" C's macros, take a look
at Macro-8 (an assembler for the PDP-8), or Control Data 160G GASS
(General ASsembly System). E.g. See page 5 of:

http://www.bitsavers.org/pdf/cdc/160/CDC_160G_Brochure_Feb64.pdf

C was designed as a "portable assembly language", and by the early
1970's when it was designed, everybody "knew" that any sort of
assembly language needed to have macros.
 

James Kanze

[ ... ]
Indeed, there is a paper claiming that Lisp macros were
an »inspiration« for C macros:
If you honestly want to know what "inspired" C's macros, take
a look at Macro-8 (an assembler for the PDP-8), or Control
Data 160G GASS (General ASsembly System). E.g. See page 5 of:

C was designed as a "portable assembly language", and by the
early 1970's when it was designed, everybody "knew" that any
sort of assembly language needed to have macros.

I'm not sure that macros were added because C was trying to be
like assembler---C's macros are a lot weaker than anything I've
seen in assembler---, but what is sure is that all assemblers
back then did have some support for macros, and that the people
developing C were very familiar with assembler, and macro
technology, so when the need for e.g. "inline functions" was
felt, macros would be a more or less natural response.

What I don't get is Lisp proponents trying to imply that C
macros are derived from Lisp. Even if it were true, it's
something that I'd try to hide, rather than claim credit for;
C's preprocessor is not exactly the best feature of the
language.
 
