Python syntax in Lisp and Scheme

Dave Benjamin

That's true, although you don't really need macros for that,
just something like Smalltalk-style code blocks -- if anyone
can come up with a way of grafting them into the Python
syntax...

Well, you'll have to do a better job than I did, because my proposal made
a distinct "thud" when I dropped it here. ;)
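
(For context, a minimal sketch of how far current Python gets without
such blocks; the function name "block" is just illustrative. lambda is
limited to a single expression, so anything with statements has to
become a named function passed around by reference:)

# lambda covers single-expression "blocks" only:
squares = list(map(lambda x: x * x, range(10)))

# Multi-statement blocks need a name before they can be passed around:
def block(item):
    print("processing", item)
    return item.upper()

results = [block(s) for s in ("a", "b")]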

Dave
 
Rayiner Hashem

Ahh, but overloading only works at compile time:
void foo( SomeBaseObject* object );
void foo( SomeDerivedObject* object );

doesn't work if you're using a base class pointer for all your derived
classes.
I think that the point was that the overload resolution rules can handle the
situation. Nothing in these rules prevents them from being applied to a
dynamically dispatched case.
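
To make that concrete, here is a hedged sketch in modern Python, whose
functools.singledispatch (added long after this thread) applies exactly
overload-style resolution at run time, from the dynamic type of the
argument; the class names just mirror the C++ snippet above:

from functools import singledispatch

class SomeBaseObject: pass
class SomeDerivedObject(SomeBaseObject): pass

@singledispatch
def foo(obj):
    raise TypeError("no applicable version of foo")

@foo.register(SomeBaseObject)
def _(obj):
    return "base version"

@foo.register(SomeDerivedObject)
def _(obj):
    return "derived version"

# The reference is "generic"; resolution still finds the derived version:
obj = SomeDerivedObject()
print(foo(obj))    # -> derived version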
 
Neil Hodgson

james anderson:
i'm trying to understand how best to approach unicode representations.
i am told the pep-0261 is the standard for python.

PEP 261 is the standard for the 4-byte-wide implementation. It was
implemented after 2-byte Unicode, which was documented after the fact
in PEP 100.
it was not clear what mechanism it entails for access to os-level text
management facilities on the order of osx's "apple type services for unicode
imaging"[0].

ATSUI is a text rendering library. Core Python doesn't include
text-rendering, leaving this up to GUI toolkits. Python does ship with Tk,
which has Unicode text support.
i looked through the mac extensions, but did not discern anything
relevant. can anyone point me to code in wide and narrow builds which uses
such os-level facilities. i was given a reference which appeared to concern
windows' file names, but that, as is the case with direct stream codecs, is
primarily a static situation.

Static as opposed to what? A fixed API that is explicitly wrapped versus
a dynamically wrapped system call convention as is done on Windows by
PythonCOM or ctypes?
i would also be interested to hear if there have been any data collected on
preponderance of wide builds, and on the consequences in those installations
for storage and algorithm efficiency.

Red Hat Linux 9.0 ships with a 4-byte-wide build of Python, and that
is quite widely distributed. On Windows, I would expect 4-byte builds
to be very rare, as 2 bytes matches the system conventions and the
binary downloads available from python.org are 2-byte builds.
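
(A quick way to check which build a given installation is, assuming a
CPython new enough to expose sys.maxunicode:)

import sys

# Narrow (2-byte) builds report 0xFFFF; wide (4-byte, PEP 261)
# builds report 0x10FFFF. A character beyond the BMP shows the
# practical difference:
if sys.maxunicode == 0xFFFF:
    print("narrow build:", len(u"\U00010000"))   # -> 2 (surrogate pair)
else:
    print("wide build:", len(u"\U00010000"))     # -> 1 (one code point)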

Neil
 
Ingvar Mattsson

Kenny Tilton said:
I agree with everything you said except that last bit, and I only
disagree with that because of what I have heard from Pythonistas, so
maybe I missed something. I did not think Python (or GVR or both) had
aspirations of being a full-blown language vs just being a powerful
scripting language.

Do they ever plan to do a compiler for it?

Python always compiles to byte-code, saved in a ".pyc" file, apart
(possibly) from the main file. Things that get "imported" will be
compiled and saved out, and re-compiled if the source file is newer
than the dumped compiled code.
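
A small sketch of that step, using the stdlib's py_compile module; the
module name example_mod is made up, and the cache lands beside the
source on old Pythons but in __pycache__/ on modern ones:

import py_compile

# Create a throwaway module, then byte-compile it explicitly;
# a plain "import" triggers the same compile-and-cache machinery.
with open("example_mod.py", "w") as f:
    f.write("GREETING = 'hello'\n")

py_compile.compile("example_mod.py")   # writes the .pyc byte-code file

import example_mod                      # later imports reuse the cache
print(example_mod.GREETING)             # -> hello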

//Ingvar
 
Paul F. Dietz

Andrew said:
Pascal Costanza:



Is the implementation free to *not* compile the code when the
COMPILE function is called? That is, to leave it as is? How
would a user tell the difference without running a timing test?

In CL, COMPILE (and COMPILE-FILE) are not required by the CL spec
to convert lisp code to machine language. Many implementations do,
however, and it tends to be one of the major defining characteristics
of a CL implementation.

A conforming implementation must perform certain minimal steps
at compilation time. See section 3.2.2.2 of the CL spec. It states:

Minimal compilation is defined as follows:

* All compiler macro calls appearing in the source code being
compiled are expanded, if at all, at compile time; they will
not be expanded at run time.

* All macro and symbol macro calls appearing in the source code
being compiled are expanded at compile time in such a way that
they will not be expanded again at run time. macrolet and
symbol-macrolet are effectively replaced by forms corresponding
to their bodies in which calls to macros are replaced by their expansions.

* The first argument in a load-time-value form in source code
processed by compile is evaluated at compile time; in source code
processed by compile-file, the compiler arranges for it to be
evaluated at load time. In either case, the result of the evaluation
is remembered and used later as the value of the load-time-value
form at execution time.

Other than this, a conforming implementation is allowed to leave the
source code alone, compile it to byte codes, compile it to machine
language, or apply any other correctness-preserving transformation.

It would be conforming to do the minimal compilation, produce byte codes, then
dynamically convert the byte codes to machine language at run time as in Psyco.
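
Python's compile() builtin offers a loose analogy for that last point
(an analogy only, not CL semantics): its output is byte code, plainly
the product of compilation yet nothing like machine language:

import dis

# compile() returns a code object -- byte code, not machine code:
code = compile("x + 1", "<example>", "eval")
dis.dis(code)                   # disassemble the byte-code instructions
print(eval(code, {"x": 41}))    # -> 42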

Paul
 
Björn Lindberg

Andrew Dalke said:
Björn Lindberg:

I wasn't the one who pointed out those micro benchmarks. Kenny
Tilton pushed the idea that more concise code is better and that Lisp
gives the most concise code, and that Perl is much more compact
than Python. He suggested I look at some comparisons, so I followed
his suggestion and found that 1) the Lisp code there was not more
succinct than Python and 2) neither was the Perl code.


While OCaml, which has the smallest size, does type inferencing....

The good Lisp compilers do type inferencing too AFAIK, but since Lisp
is fully dynamic it is not always possible for the compiler to do full
type inference at compile time. Declarations help the compiler in this
respect.

Absolutely correct. Both Alex Martelli and I tried to dissuade
Kenny Tilton from the idea that LOC was the best measure of
succinctness and appropriateness, and he objected.

I think in general there *is* a correlation between LOC and
succinctness, eg LOC(assembler) > LOC(C) > LOC(awk). It is probably
not a very strong correlation though, and it would probably be more
accurate for larger programs than small code snippets.
Agreed. I pointed out elsewhere that there has been no systematic
study to show that Lisp code is indeed "so much shorter than the
equivalent code in other languages" where "other languages" include
Python, Perl, or Ruby.

It would be interesting to see such studies made.
The closest is
http://www.ipd.uka.de/~prechelt/Biblio/
where the example program, which was non-trivial in size,
took about 100 LOC in Tcl/Rexx/Python/Perl and about 250 LOC
in Java/C/C++.

That is an interesting study, although there are some possible flaws
(eg no controlled selection of participants). The programming problem
in that study is far too small to meaningfully test any abstraction
capabilities in the language on the level of macros, OO or HOF
though.

In any case, it implies you need to get to some seriously sized
programs (1,000 LOC? 10,000 LOC? A million?) before
the advantages of Lisp appear to be significant.

I think that goes for any advantages due to abstraction capabilities
of macros, OO or HOF. The small program in the study above seems to
capture the scripting languages' higher level compared to the
close-to-the-machine languages C & C++. (I have not read all of it
though.) To show advantages of the abstraction facilities we have been
discussing in this thread, I believe much larger programs are needed.


Björn
 
Alex Martelli

Pascal Bourguignon wrote:
...
The question being whether it's better to need several different
languages to solve a set of problems because none of them is
powerful enough, or better to have one good and powerful
language that helps you solve all your problems?

A reasonably good way to highlight the key difference between the "horses
for courses" and "one ring to bind them all" schools of thought.

Would I rather have just one means of transportation "powerful enough"
to help me solve all my "going from A to B" problems? Nope! I want a
bicycle for going short and middle distances on sunny days, a seat on
a reasonably fast jet plane for much longer trips, and several things
in-between. A single ``thing'' able to cater for such hugely disparate
needs would be way too complicated to be an optimal solution to any
single one of them.

Would I rather have just one woodworking tool "powerful enough" to help
me solve all my "working wood" problems? No way! A suitably large
and complicated "Swiss Army Knife" may be handy in emergencies, but in
my little woodworking shop I want several separate tools, each
optimized for its own range of tasks. A single "multi-blade" tool able
to cater for all of the disparate needs that arise in working wood would
be way too complicated and unwieldy to be an optimal solution to any
single one of them.

Do I want a single writing tool, or separate pencils, pens, markers,
highlighters...? Do I want a single notation to write down ANYthing
on paper, or separate ones for music, algebraic formulas, shopping
lists, accounting, ...? Do I want a single font to be used for ANY
writing, from books to billboards to shops' signs to handwriting...?

More generally, are there ANY situations in which "one size fits all"
is a GOOD, OPTIMAL solution, rather than a make-do approach? Maybe
some can be found, but it seems to me that in most cases a range of
tools / solutions / approaches tailored to different classes of
problems may well be preferable. So, it should come as no surprise
that I think this applies to computer languages. In fact I am
sometimes amazed at how wide a range of problems I can solve quite
well with Python -- but I still want C for e.g. device drivers,
spreadsheets to "program" simple what-if scenarios and play around
interactively with parameters, make (or one of its successors, such
as SCons), bash for interactively typed one-liners, HTML / SGML etc
for documents, XML for data interchange with alien apps, SQL to
access relational databases, etc, etc -- and no doubt more besides.

See http://www.strakt.com/sol_capsao_7.html for example -- not
very detailed, but a generic description of a specialized declarative
language for Entity - Relationship descriptions, with embedded
actions in procedural languages [currently, Python only], that we're
developing as part of the CAPS framework (no macros were used in the
production of that language, just traditional boring parsers &c in
Python -- of course, it IS quite possible that _the specialized
language itself_ might benefit from having macros, that's a separate
issue from the one of _implementing_ that "BLM language").


Alex
 
Pascal Costanza

David said:
It's only been out, what, twenty years? And another twenty before that
for other lisps... How much time do you think you need?

AFAIK, Lisp was very popular in the 70's and 80's, but not so in the
90's. At the moment, Common Lisp is attracting a new generation of
programmers.

The basic idea of Lisp (programs = data) was developed in the 50's (see
http://www.paulgraham.com/rootsoflisp.html ). This idea is fundamentally
different and much more powerful than the approach taken by almost all
other languages.

You can't argue with that. You can argue whether you want that power or not,
but Lisp is definitely more powerful than other languages in this
regard. As Eric Raymond put it, "Lisp is worth learning for the profound
enlightenment experience you will have when you finally get it; that
experience will make you a better programmer for the rest of your days,
even if you never actually use Lisp itself a lot."

And, as Paul Graham put it, if you take a language and "add that final
increment of power, you can no longer claim to have invented a new
language, but only to have designed a new dialect of Lisp". (see
http://www.paulgraham.com/diff.html )

These are the essential reasons why it is just a matter of time before
Lisp is reinvented and/or rediscovered again and again, and will
continue to attract new followers. It is a consequential idea once you
have got it.

"What was once thought can never be unthought." - Friedrich Dürrenmatt

Pascal
 
Rob Warnock

+---------------
| (and yes, I know about the lawsuit against disk drive manufacturers
| and their strange definition of "gigabyte"... )
+---------------

Oh, you mean the fact that they use the *STANDARD* international
scientific/engineering notation for powers of 10 instead of the
broken, never-quite-right-except-in-a-few-cases pseudo-binary
powers of 10?!?!? [Hmmm... Guess you can tell which side of *that*
debate I'm on, eh?] The "when I write powers of 10 which are 3*N
just *assume* that I meant powers of 2 which are 10*N" hack simply
fails to work correctly when *some* of the "powers of 10" are *really*
powers of 10. It also fails to work correctly with things that aren't
intrinsically quantized in powers of 2 at all.

Examples: I've had to grab people by the scruff of the neck and push
their faces into the applicable reference texts before they believe me
when I say that gigabit Ethernet really, really *is* 1000000000.0 bits
per second [peak payload, not encoded rate], not 1073741824, and that
64 kb/s DS0 telephone circuits really *are* 64,000.0 bits/sec, not 65536.
[And, yes, 56 kb/s circuits are 56000 bits/sec, not 57344.]

Solution: *Always* use the internationally-recognized binary prefixes
<URL:http://physics.nist.gov/cuu/Units/binary.html> when that's really
what you mean, and leave the old scientific/engineering notation alone,
as pure powers of 10. [Note: The historical notes on that page are well
worth reading.]
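
(A quick arithmetic check of how large that gap gets, using a
hypothetical "200 GB" drive:)

GB = 10**9     # gigabyte: SI power of 10, as the drive makers use it
GiB = 2**30    # gibibyte: IEC binary prefix

drive = 200 * GB
print(drive / GiB)   # -> ~186.26: the "GB" an OS using binary units shows
print(GiB / GB)      # -> 1.0737...: about a 7.4% gap at the giga scale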


-Rob

p.s. If you're hot to file a lawsuit, go after the Infiniband Trade
Association for its repeated claims that 4x IB is "10 Gb/s". It isn't,
it's 8 Gb/s [peak user payload rate, not encoded rate]. Go read the
IBA spec if you don't believe me; it's right there.
 
Paul F. Dietz

Pascal said:
David Eppstein wrote:


AFAIK, Lisp was very popular in the 70's and 80's, but not so in the
90's. At the moment, Common Lisp is attracting a new generation of
programmers.

Early lisp usage was driven by government money and the AI bubble.
That declined in the late 1980s or in the 1990s. Individuals could
not afford adequate hardware to run lisp until sometime in the 1990s.

But now, hardware has more than caught up with the demands of
lisp and individuals are carrying it forward. This is something
that drives the newer languages also. The cycle time for
improving the languages or their implementations goes down
as the hardware gets faster.

Paul
 
Daniel Berlin

Wow, that IS great news! Does it apply to 32-bit Intel-oid machines
(the most widespread architecture)

Yes, but not Windows.

and the newest releases of MS VC++ (7.1)
and gcc, the most widespread compilers for it?

GCC, yes.

MS is not participating in the ABI (take that to mean what you will),
AFAIK.

http://codesourcery.com/cxx-abi
(it's not really draft anymore since compilers are shipping using it,
but it is updated for bug fixes occasionally)

"This document was developed jointly by an informal industry coalition
consisting of (in alphabetical order) CodeSourcery, Compaq, EDG, HP,
Intel, Red Hat, IBM and SGI. Additional contributions were provided by
a variety of individuals."

I can't find any docs on what
switches or whatever I should give the two compilers to get seamless
interop.

Specifically, the standard Python on Windows has long been built with
MSVC++, and this has given problems to C-coded extension writers who
don't own that product -- it IS possible to use other compilers to
build the extensions, but only with much pain and some limitations
(e.g. on FILE* arguments). If this has now gone away there would be
much rejoicing -- with proper docs on the Python side of things
and/or use of whatever switches are needed to enable this, if any,
when we do the standard Python build on Windows.



I'm not very familiar with Python on the Mac but I think it uses
another commercial compiler (perhaps Metrowerks?), so I suspect the
same question may apply here.

It depends. I've built it with both.
It's not as crucial on other architectures where Python is more
normally built with free compilers, but it sure WOULD still be nice
to think of possible use of costly commercial compilers with
hypothetically great optimizations for the distribution of some
"hotspot" object files, if that sped the interpreter up without
giving any interoperability problems.

At least on Mac, Apple's gcc -fast is better than any other compiler
around, according to recent benchmarks.

Unsurprising to me, but i'm a gcc hacker, so i might be biased a bit. :p

Most, if not all, optimizations that commercial compilers implement
are already in gcc or are being implemented for 3.5/3.6.

--Dan
 
Joe Marshall

Andrew Dalke said:
No one has told me they would hire me for contract work "if only
you were a Lisp programmer."

Next time I'm hiring, I'll be sure to let you know.
 
Alex Martelli

Björn Lindberg wrote:
...
It would be interesting to see such studies made.

Absolutely! But funding such studies would seem hard. Unless some
company or group of volunteers had their own reasons to take some
existing large app coded in Lisp/Python/Perl/Ruby, and recode it in
one of the other languages with essentially unchanged functionality,
which doesn't seem all that likely. And if it happened, whatever
group felt disappointed in the results would easily find a zillion
methodological flaws to prove that the results they dislike should
be ignored, nay, reversed.

In practice, such a re-coding would likely involve significant
changes in functionality, making direct comparisons iffy, I fear.

I know (mostly by hearsay) of some C++/Java conversions done
within companies (C++ -> Java for portability, Java -> C++ for
performance) with strong constraints on functionality being "just
the same" between the two versions (and while that's far from
a "scientific result", a curious pattern seems to emerge: going
from C++ to Java seems to produce the same LOC's, apparently a
disappointment for some; going from Java to C++ seems to expand
LOC's by 10%/20%, ditto -- but how's one to say if the C++ code
had properly exploited the full macro-like power of templates,
for example...?). But I don't even have hearsay about any such
efforts between different higher-level languages (nothing beyond
e.g. a paltry few thousand lines of Perl being recoded to Python
and resulting in basically the same LOC's; or PHP->Python similarly,
if PHP can count as such a language, perhaps in a restricted context).

I think that goes for any advantages due to abstraction capabilities
of macros, OO or HOF. The small program in the study above seems to
capture the scripting languages' higher level compared to the
close-to-the-machine languages C & C++. (I have not read all of it
though.) To show advantages of the abstraction facilities we have been
discussing in this thread, I believe much larger programs are needed.

Yes, and perhaps to show advantages of one such abstraction facility
(say macros) wrt another (say HOFs) would require yet another jump up
in application size, if it could be done at all. Unless some great
benefactors with a few megabucks to wast^H^H^H^H invest for the general
benefit of humanity really feel like spending them in funding such
studies, I strongly suspect they're never really going to happen:-(.


Alex
 
Bruce Lewis

If your problems are trivial, I suppose the presumed lower startup
costs of Python may mark it as a good solution medium.

I find no significant difference in startup time between python and
mzscheme.
 
Joe Marshall

Pascal Costanza said:
AFAIK, Lisp was very popular in the 70's and 80's, but not so in the
90's.

`Popular' in this case being somewhat relative, much like
the way pustular psoriasis is more popular than leprosy.
 
Kenny Tilton

You might be thinking of someone else. I remember a recent discussion
focused on this, but IIRC all other things were equal.

All else being equal, shorter is better. But then right away things can
get longer, since cryptic languages like APL, K, and (I gather) Perl are
not equal in value to nice long function and data names.

As for that ridiculous study, it includes VB. VB suffers from The 4GL
Problem. It reduces LOC by making decisions for you. But no general tool
can successfully get the decision right for all the people all the time.
And 4GL tools are not meant to be tailored to individual requirements.
Where hooks even exist, one ends up in the dread situation of Fighting
the Tool.

So leave me out of this. :)

All things being equal.
 
Joe Marshall

Paul Foley said:
Oh, come on! Anyone can understand cricket! There are two teams.
The team that's in sits out, except for two batsmen, and the other
team come out and try to get the men that are in out. When a man goes
out, he goes in and another man comes out. When the team that's in
are all out, except for the one who's not out, the other team goes in,
until they're all out, too; and then a second innings is played.
That's more or less all there is to it!

In other words, the man that's in may be out or in. If he's in, he
can go back out, but if he's out, then he can't go back in. Once
everyone is out, everyone goes out, then once the in team is out
again, the out team goes in again and everyone in can go out again.

Thanks for straightening that out!
 
Björn Lindberg

Bruce Lewis said:
I find no significant difference in startup time between python and
mzscheme.

My preliminary results in this very important benchmark indicate that
Python performs comparably to the two benchmarked Common Lisps:

200 bjorn@nex:~> time for ((i=0; i<100; i++)); do lisp -noinit -eval '(quit)'; done

real 0m2,24s
user 0m1,36s
sys 0m0,83s
201 bjorn@nex:~> time for ((i=0; i<100; i++)); do lisp -noinit -eval '(quit)'; done

real 0m2,24s
user 0m1,39s
sys 0m0,82s
202 bjorn@nex:~> time for ((i=0; i<100; i++)); do clisp -q -x '(quit)'; done

real 0m2,83s
user 0m1,74s
sys 0m1,03s
203 bjorn@nex:~> time for ((i=0; i<100; i++)); do clisp -q -x '(quit)'; done

real 0m2,79s
user 0m1,67s
sys 0m1,09s
204 bjorn@nex:~> time for ((i=0; i<100; i++)); do python -c exit; done

real 0m2,41s
user 0m1,85s
sys 0m0,52s
205 bjorn@nex:~> time for ((i=0; i<100; i++)); do python -c exit; done

real 0m2,41s
user 0m1,89s
sys 0m0,52s

</sarcasm>


Björn
 
Marco Antoniotti

Rainer said:
Coming from a C/C++ background, I'm surprised by this attitude. Is
portability of code across different language implementations not a priority
for LISP programmers?

It is. However, history has run against Lisp in this respect. First of
all, there are more than 1.84 implementations of Lisp (4 commercial
ones), and the vendors do not have much incentive to make something
completely portable. OTOH *there are* cross-platform compatibility
layers for many of the things you need. But the problem facing any
Common Lisp library writer is deciding how far to go in terms of
cross-implementation and cross-platform portability.

Having said that, let's note, however, that the actual footprint of CL
is large enough to let you write nice portable programs much more
easily than in e.g. Scheme or in pre- (and, to some extent, post-)
STL C++.

Cheers
 
