Python vs. Lisp -- please explain

El Loco

Kay said:
Yes, it's Guido's master plan to lock programmers into a slow language
in order to dominate them for decades. Do you also believe that Al
Qaida is a phantom organization of the CIA, founded by neocons in the
early '90s who planned to invade Iraq?

Actually, it was created by Bruce Lee, who is not dead but working
undercover for the Hong Kong police to fight the Chinese triads. At
this point you might wonder what Bruce Lee has to do with Al Qaida.
Well my friend, if you don't understand this, you don't get it at all!
(Hint: he's a C++ hacker who wanted to build a tracking system for
terrorists in the Middle East, but the project got halted when Guido
van Rossum, whose real name is Abdul Al Wazari, convinced him to use
Python.) Now the system is so slow that Bin Laden never gets caught!

El Loco
 
Torsten Bronger

Hi there!

[...] Do you truly believe that fewer people would use Python if
its execution were faster?

I think I can answer my own question: yes. Since posting, I came
across a different follow-up where Alexander explains that he sees
healthy elements of the Python ethos--the focus on a reliable, widely
used library, the willingness to make Python-C partnerships, and so
on--as resulting at least in part from the early acceptance of Python
as intrinsically slow. That's too interesting an argument for me to
respond to without more thought.

I was rather stunned, too, when I read his line of thought.
Nevertheless, I think it's not pointless, albeit formulated in an
awkward way. Of course, Python has not been deliberately slowed
down.

I don't know how strong the effect is by which a language design that
doesn't allow for (easily implemented) fast execution makes you write
cleaner code; however, I must say that I feel it, too. When I came
from C++ to Python I had to find my Pythonic style, and part of it was
that I threw away those ubiquitous little optimisations and
concentrated on formulating my ideas in a clear and expressive way.

By the way, this is my main concern about optional static typing: it
may change the target group, i.e. it may move Python closer to those
applications where speed really matters, which in turn would have an
effect on what is considered Pythonic.

Bye,
Torsten.
 
Kay Schluehr

Steven said:
Of course not. The alternative, that Osama has been able to lug his
dialysis machine all over the Pakistani and Afghan mountains without
being detected for four years, is *much* more believable. *wink*

Osama? Who is Osama? A media effect, a CNN invention.
I don't think it was the poster's implication that Guido deliberately
created a slow language for the sake of slowness. I think the implication
was more that Guido made certain design choices that increased
productivity and code clarity. (That much is uncontroversial.) Where the
poster has ruffled some feathers is his suggestion that if Guido had only
known more about the cutting edge of language design from CS, Python would
have been much faster, but also much less productive, clear and popular.

I guess the feather ruffling is because of the suggestion that Guido
merely _didn't_know_ about language features that would have increased
Python's speed at the cost of productivity, rather than deliberately
chose to emphasise productivity at the expense of some speed.

Alexander's hypothesis is completely absurd. It turned out over the
years that the optimization capabilities of Python are lower than those
of Lisp and Smalltalk. But that's a system effect and an epiphenomenon
of certain design decisions. This might change with PyPy - who knows?
The Lisp/Smalltalk design is ingenious, radical and original, and both
languages were considered too slow for many real-world applications
for decades. But no one has ever claimed that Alan Kay intentionally
created a slow language in order to hold the herd together - and it
accidentally turned out to be reasonably fast with JIT technology in
the late '90s.

Smalltalk was killed by Java, while Lisp was killed by the paradigm
shift to OO in the early '90s. The IT world has absolutely no interest
in the hobby horses of computer scientists or language lovers (like
me). It is consolidating around a small set of Algol successors,
namely C, C++, Java and C#, plus some dynamically typechecked languages
like Perl, Python and Ruby that play nicely with the big mainstream
languages as their more flexible complement. A conspiracy-like theory
to explain what's going on is needless.

Kay
 
Steven D'Aprano

The reason this isn't just an abstruse philosophical argument where it
makes sense for us to obtusely cling to some indefensible point of view
is that, as the man points out, there are differences that we can't hide
forever from potential Python users. The most obvious to me is that
your Python program essentially includes its interpreter - it can't go
anywhere without it, and any change to the interpreter is a change to
the program. There are various strategies to address this, but
pretending that Python isn't interpreted is not one of them.

Python programs (.py files) don't contain an interpreter. Some of those
files are *really* small; you can't hide a full-blown Python interpreter
in just half a dozen lines.

What you mean is that Python programs are only executable on a platform
which contains the Python interpreter, and if it is not already installed,
you have to install it yourself.

So how exactly is that different from the fact that my compiled C program
is only executable on a platform that already contains the correct machine
code interpreter, and if that interpreter is not already installed, I have
to install a machine code interpreter (either the correct CPU or a
software emulator) for it?

Moving compiled C programs from one system to another one with a different
CPU also changes the performance (and sometimes even the semantics!) of
the program. It is easier for the average user to change the version of
Python than it is to change the CPU.

Nobody denies that Python code running with no optimization tricks is
(currently) slower than compiled C code. That's a matter of objective
fact. Nobody denies that Python can be easily run in interactive mode.
Nobody denies that *at some level* Python code has to be interpreted.

But ALL code is interpreted at some level or another. And it is equally
true that at another level Python code is compiled. Why should one take
precedence over the other?
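
You can watch the compilation half happen from any Python prompt (a
minimal sketch using the standard compile() built-in and the dis
module; the snippet being compiled is made up for illustration):

    import dis

    # Compile Python source into a code object -- the same compilation
    # step CPython performs before any execution happens.
    source = "x = 3\nprint(x + 4)\n"
    code = compile(source, "<example>", "exec")

    # The code object carries genuinely compiled output: raw bytecode...
    print(repr(code.co_code))

    # ...which dis renders as the instructions the Python VM then
    # interprets.
    dis.dis(code)

The source is compiled to bytecode, and the bytecode is interpreted:
both halves of the story are right there.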

The current state of the art is that the Python virtual machine is slower
than the typical machine code virtual machine built into your CPU. That
speed difference is only going to shrink, possibly disappear completely.

But whatever happens in the future, it doesn't change two essential facts:

- Python code must be compiled to execute;
- machine code must be interpreted to execute.

Far from being indefensible, philosophically there is no difference
between the two. Python and C are both Turing Complete languages, and both
are compiled, and both are interpreted (just at different places).

Of course, the difference between theory and practice is that in theory
there is no difference between theory and practice, but in practice there
is. I've already allowed that in practice Python is slower than machine
code. But Python is faster than purely interpreted languages like bash.

Consider that Forth code can be as fast (and sometimes faster) than the
equivalent machine code despite being interpreted. I remember an
Apple/Texas Instruments collaborative PC back in the mid to late 1980s
with a Lisp chip. Much to their chagrin, Lisp code interpreted on a
vanilla Macintosh ran faster than compiled Lisp code running on their
expensive Lisp machine. Funnily enough, the Apple/TI Lisp machine sank
like a stone.

So speed in and of itself tells you nothing about whether a language is
interpreted or compiled. A fast interpreter beats a slow interpreter,
that's all, even when the slow interpreter is in hardware.

Describing C (or Lisp) as "compiled" and Python as "interpreted" is to
paint with an extremely broad brush, both ignoring what actually happens
in fact, and giving a false impression about Python. It is absolutely true
to say that Python does not compile to machine code. (At least not yet.)
But it is also absolutely true that Python is compiled. Why emphasise the
interpreter, and therefore Python's similarity to bash, rather than the
compiler and Python's similarity to (say) Java or Lisp?
 
Alexander Schmolck

Torsten Bronger said:
I was rather stunned, too, when I read his line of thought.
Nevertheless, I think it's not pointless, albeit formulated in an
awkward way. Of course, Python has not been deliberately slowed
down.

Indeed -- and I'm really not sure what defect in someone's reading skills or
my writing skills would make anyone think I suggested this.

'as
 
Paul Boddie

Torsten said:
By the way, this is my main concern about optional static typing: it
may change the target group, i.e. it may move Python closer to those
applications where speed really matters, which in turn would have an
effect on what is considered Pythonic.

Yes, I think that with optional static typing, it's quite likely that
we would see lots of unnecessary declarations and less reusable code
("ints everywhere, everyone!"), so I think the point about not
providing people with certain features is a very interesting one, since
people have had to make an additional, and not insignificant, effort to
optimise for speed. One potential benefit is that, should better tools
than optional static typing be considered and evaluated, the "ints
everywhere!" line of thinking could prove to be something of a dead end
in all but the most specialised applications. Consequently, the Python
platform could end up better off, providing superior tools for
optimising performance whilst not compromising the feel of the language
and environment.

Paul
 
Alexander Schmolck

Kay Schluehr said:
Alexander's hypothesis is completely absurd.

You're currently not in the best position to make this claim, since you
evidently misunderstood what I wrote (I certainly did not mean to suggest that
Guido *deliberately* chose to make python slow; quite the opposite in fact).

Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
bring about enlightenment, I'll try a restatement.
It turned out over the years that the optimization capabilities of
Python are lower than those of Lisp and Smalltalk. But that's a system
effect and an epiphenomenon of certain design decisions.

The point is that the design decisions, certainly for Common Lisp, Scheme and
particularly for Dylan, were also informed by what could be done
*efficiently*, because the people who designed these languages knew a lot
about advanced compiler implementation strategies for dynamic languages and
thought that they could achieve high levels of expressiveness whilst retaining
the possibility of very fast implementations (IIRC Dylan specifically was
meant to come within 90% of C performance). CL and Dylan were
also specifically designed for building very large and sophisticated systems,
whereas it seems Guido originally thought that Python would scale to about 500
LOC.
This might change with PyPy - who knows? The Lisp/Smalltalk design is
ingenious, radical and original, and both languages were considered too slow
for many real-world applications for decades. But no one has ever claimed
that Alan Kay intentionally created a slow language in order to hold the
herd together - and it accidentally turned out to be reasonably fast with
JIT technology in the late '90s.

I'm pretty sure this is wrong. Smalltalk and Lisp were both quite fast and
capable before JIT technology came along in the '90s -- just not necessarily
on hardware optimized for C-like languages, partly because no one anticipated
that the x86 and co. would become so dominant (I also roughly remember Alan
Kay expressing his frustration not too long ago over the fact that despite a
50000-fold increase in processing speed, current hardware would only run early
Smalltalk 100x faster than the Lisa -- I almost certainly misremember the
details but you get the picture).
A conspiracy-like theory to explain what's going on is needless.

Indeed.

'as
 
Paul Rubin

I'm wondering if someone can explain to me please what it is about
Python that is so different from Lisp that it can't be compiled into
something as fast as compiled Lisp. From the above website and
others, I've learned that compiled Lisp can be nearly as fast as C/C++,
so I don't understand why Python can't also eventually be as efficient.
Is there some *specific* basic reason it's tough?

The issues of compiling Python and compiling Lisp are similar. Lisp
implementers tend to care about performance more, so Lisp tends to be
compiled. There's a Python compiler called Psyco which can be used
with CPython and which will be part of PyPy. I'd expect its output
code to be comparable to compiled Lisp code.
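
For the curious, enabling Psyco was a one-liner affair (a historical
Python 2 sketch, assuming Psyco is installed; the dot() function is
invented for illustration, and none of this runs on modern Python 3):

    # Historical Python 2 sketch -- assumes the Psyco extension module
    # is installed (it never made the jump to Python 3).
    import psyco

    def dot(xs, ys):
        # A numeric inner loop: the kind of code Psyco specialized well.
        total = 0
        for x, y in zip(xs, ys):
            total += x * y
        return total

    psyco.bind(dot)    # JIT-specialize just this function...
    # psyco.full()     # ...or everything Psyco can reach.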
 
Paul Rubin

Steven D'Aprano said:
Python is not slow by design. Python is dynamically typed by design, and
relative slowness is the trade-off that has to be made to get dynamic
types.

I think both of you are missing the point of the question, which is
that Lisp is dynamically typed exactly the way Python is and maps to
Python almost directly; yet good Lisp implementations are much faster
than CPython.
The Python developers have also done marvels at speeding up Python since
the early days, with the ultimate aim of the PyPy project to make Python
as fast as C, if not faster. In the meantime, the question people should
be asking isn't "Is Python fast?" but "Is Python fast enough?".

That is the real answer: CPython doesn't reach performance parity with
good Lisp implementations, but it is still fast enough for lots of
purposes. Psyco and PyPy are ongoing efforts to close the performance
gap, and they are showing promise of success.
 
Alexander Schmolck

Michele Simionato said:
Just for personal enlightenment, where do you think Lisp is more
dynamic than Python? Can you name a few features?

Sure (I'm assuming that by lisp you mean common lisp):

- development model

- by default the development model is highly interactive and you can
redefine all sorts of stuff (functions, methods, classes, even the syntax)
without having to start over from scratch (you can also save the current
state of affairs as an image). By contrast, changing modules or classes in
interactive Python sessions is rather tricky to do without screwing things
up, and Python doesn't support images, so sessions tend to be much more
short-lived.

Emacs+slime also offers more powerful functionality than emacs+python mode
(autocompletion, jumping to definitions, various parallel sessions or
remotely connecting to running applications via sockets etc.)

- hot-patching

- partly as a consequence of the above, you can also arrange to hot-patch
some running application without too much trouble (IIRC Graham claims in
one of his essays that one of his favorite pranks was fixing a bug in the
application whilst on the phone to the customer as she reported it and
then telling her that everything in fact seemed to work fine and to try
again)

- error handling:

- by default you end up in the debugger if something goes wrong, and in many
cases you can correct it in place and just continue execution (if you've
ever had some long numerical computation die on plotting the results
because of an underflow as you exponentiate to plot in transformed
coordinates, you'd appreciate being able to just return 0 and continue)

- Apart from "usual" exception handling CL has much more powerful resumable
exceptions that offer far more fine grained error handling possibiliies.

For more info let me recommend the chapter from Peter Seibel's excellent
"practical common lisp" (available in print, and also freely online):

<http://www.gigamonkeys.com/book/beyond-exception-handling-conditions-and-restarts.html>

(The whole book is a great resource for people who want to have a quick
play with Common Lisp and see how its features can be leveraged for real
applications (such as HTML creation, or writing an mp3 server or id3
parser). Peter also makes an easy-to-install, preconfigured "lispbox"
bundle of Emacs + a free Lisp implementation available on his web page).

- OO:

- You know this already, but I'd argue that multimethods and method
combinations give you more dynamism than class-centric OO.

- Also, if you change the definition of a class, all existing instances will
be updated automatically. You can get a similar effect in Python by
laboriously mutating the class, provided it doesn't use __slots__ etc.
(a rough Python analogue is sketched after this list), but that is more
brittle and also more limited -- for example, in CL you can specify what
happens to instances when the class is updated.

- chameleon-like nature:

It's much easier to make CL look like something completely different (say
Prolog, or some indentation-based, class-centric language like Python) than
it would be with Python. In particular:

- there are no reserved keywords

- you can e.g. implement new syntax like Python-style """-strings easily
(I've done so in fact) with reader macros. Indentation-based syntax should
also be possible along the same lines, although I haven't tried.

- you can introduce pretty much any sort of control structure you might
fancy and you can carry out very sophisticated code transformations behind
the scenes.

- finally, with Lisp machines there at least used to be whole operating
systems written in Lisp all the way down, and according to all the testimony
you could interactively modify quite basic system behaviour. The only extant
systems that come even remotely close would be Smalltalks, but even Squeak,
which is incredibly malleable and a cross-platform mini-OS in its own right,
uses the host OS for many basic tasks.
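
As a rough illustration of the class-update point above, here is what
the Python half-equivalent looks like (a minimal sketch; the class and
method names are invented):

    class Account:
        def __init__(self, balance):
            self.balance = balance

    acct = Account(100)

    # Attach a new method to the class after the instance already
    # exists; attribute lookup goes through the class, so the live
    # instance picks it up.
    def describe(self):
        return "balance: %d" % self.balance

    Account.describe = describe
    print(acct.describe())    # -> balance: 100

    # Unlike CL's update-instance-for-redefined-class, though, there is
    # no hook for saying what should happen to existing instances when
    # the class changes shape.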


'as
 
Donn Cave

Quoth Steven D'Aprano <[email protected]>:
....
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence? All I'm saying is that Python matches
what people think of as an interpreted language. You can deny it, but
it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long. If you
give me a Python program, you have 3 choices: cross your fingers and
hope that I have the required Python interpreter version, slip in a
25Mb Python interpreter install and hope I won't notice, or come clean
and tell me that your program needs an interpreter and I should check to
see that I have it.

Donn Cave, (e-mail address removed)
 
Steven D'Aprano

Donn said:
Quoth Steven D'Aprano <[email protected]>:
...
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence?

There seem to be two positions in this argument:

The "Python is interpreted and not compiled" camp, who
appear to my eyes to dismiss Python's compilation stage
as a meaningless technicality.

The "Python is both interpreted and compiled" camp, who
believe that both steps are equally important, and to
raise one over the other in importance is misleading.

> All I'm saying is that Python matches
> what people think of as an interpreted language.

Most people in IT I know of still think of
"interpreted" as meaning that every line of source code
is parsed repeatedly every time the code is executed.
Even when they intellectually know this isn't the case,
old habits die hard -- they still think of
"interpreted" as second class.

If you think that Python has to parse the line "print
x" one hundred times in "for i in range(100): print x"
then you are deeply, deeply mistaken.
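
You can check this yourself (a small sketch using the standard dis
module, not part of the original post):

    import dis

    # Compile the loop once and disassemble it: the loop body appears a
    # single time as bytecode; nothing is re-parsed on any iteration.
    code = compile("for i in range(100):\n    print(x)\n", "<loop>", "exec")
    dis.dis(code)

The line is parsed exactly once, at compile time; the hundred
iterations merely re-execute the already-compiled bytecode.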

That's why Sun doesn't describe Java as interpreted,
but as byte-code compiled. They did that before they
had JIT compilers to compile to machine code.
Consequently nobody thinks of Java source having to be
parsed, and parsed, and parsed, and parsed again.

> You can deny it, but
> it's going to look like you're playing games with words, and to no
> real end, since no one could possibly be deceived for very long.

Pot, meet kettle.

A simple question for you: does Python compile your
source code before executing it? If you need a hint,
perhaps you should reflect on what the "c" stands for
in .pyc files.

> If you
> give me a Python program, you have 3 choices: cross your fingers and
> hope that I have the required Python interpreter version, slip in a
> 25Mb Python interpreter install and hope I won't notice, or come clean
> and tell me that your program needs an interpreter and I should check
> to see that I have it.

Hey Donn, here is a compiled program for the PowerPC,
or an ARM processor, or one of IBM's Big Iron
mainframes. Or even a Commodore 64. What do you think
the chances are that you can execute it on your
x86-compatible PC? It's compiled, it should just
work!!! Right?

No of course not. If your CPU can't interpret the
machine code correctly, the fact that the code is
compiled makes NO difference at all.

In other words, I have three choices:

- cross my fingers and hope that you have the required
interpreter (CPU);

- slip in an interpreter install (perhaps an emulator)
and hope you won't notice;

- or come clean and tell you that my program needs an
interpreter ("Hey Donn, do you have a Mac you can run
this on?") and you should check to see that you have it.
 
Ben Sizer

Steven said:
The "Python is both interpreted and compiled" camp, who
believe that both steps are equally important, and to
raise one over the other in importance is misleading.
That's why Sun doesn't describe Java as interpreted,
but as byte-code compiled. They did that before they
had JIT compilers to compile to machine code.
Consequently nobody thinks of Java source having to be
parsed, and parsed, and parsed, and parsed again.

They also described it that way to help marketing, and I don't think
that should be overlooked. They would have known full well that calling
their language "interpreted" would have affected public perceptions.

It's interesting to see how culture affects things. You talked of 'IT
people' in your post and held up Java as an example of how byte-code
doesn't mean slow, the implication being that Python uses the same
mechanisms as Java and is therefore good enough. In the general IT
world, Java is quite popular (to make a bit of an understatement) and
it would often be used as some sort of benchmark.

On the other hand, the backgrounds I have familiarity with are computer
game development and embedded development. In these areas, we would
point to Java as evidence that 'interpreted' bytecode is too slow and
that anything using a similar technology is likely to be a problem.

I'm not saying you're wrong, just highlighting that comparisons
themselves always sit in some wider context which can make the
comparison unimportant.

I think it's also important to note that 'interpreted' doesn't
necessarily mean 'parsed repeatedly'. Many older machines which came
with BASIC installed would store their statements in a tokenised form -
arguably bytecode with a large instruction set, if you look at it a
certain way. This isn't necessarily so far from what Python does, yet
few people would argue that those old forms of BASIC weren't
interpreted.
 
Kay Schluehr

Alexander said:
You're currently not in the best position to make this claim, since you
evidently misunderstood what I wrote (I certainly did not mean to suggest that
Guido *deliberately* chose to make python slow; quite the opposite in fact).

Like everyone else. It's sometimes hard to extract the intended meaning,
in particular if it's opposed to the published one. I apologize if I
overreacted.
Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
bring about enlightenment, I'll try a restatement.


The point is that the design decisions, certainly for Common Lisp, Scheme and
particularly for Dylan, were also informed by what could be done
*efficiently*, because the people who designed these languages knew a lot
about advanced compiler implementation strategies for dynamic languages and
thought that they could achieve high levels of expressiveness whilst retaining
the possibility of very fast implementations (IIRC Dylan specifically was
meant to come within 90% of C performance). CL and Dylan were
also specifically designed for building very large and sophisticated systems,
whereas it seems Guido originally thought that Python would scale to about 500
LOC.

O.K. To repeat it in an accurate manner: Python was originally designed
by Guido to be a scripting language for a new OS, as a more complete
version of a shell scripting language. Unlike those, its design was
strongly influenced by the usability ideas of the ABC development team.
Speed considerations were therefore not the primary concern, but rather
an open model that was easily extendable both on the C-API level and the
language level. So a VM architecture was chosen to achieve this. Adding
new opcodes should have been as simple as interfacing with the C-API.
After the language grew strongly in the late '90s, large-scale projects
such as Zope emerged, and many users started to request more Python
performance, since they wanted to escape from the dual-language model.
Writing C code was not self-evident for a new generation of programmers
who had grown up with Java, and the FFI turned out to be a hurdle. After
the remodeling of the object core ("new-style classes"), progressive
optimizations took hold. In 2002 a new genius programmer entered the
scene, namely Armin Rigo, who came up with Psyco and launched the PyPy
project together with a few other Python hackers in order to
aggressively optimize Python using Python's introspective capabilities.
That's where we still are.

Remembering the historical context, we might draw some parallels to
other contexts and language design intentions. We might also figure out
parallels and differences between the motives of language designers and
of the leading persons who drive language evolution. Python is not just
Guido, although his signature is quite pervasive. In his latest musings
he comes back to his central idea of language design as a kind of user
interface design. It's probably this shift in perspective that can be
attributed as original to him, and which goes beyond making things just
"simple" or "powerful" or "efficient" (at least he made this shift
public and visible). It is also the most controversial aspect of the
language, because it is still inseparable from technical decisions
(non-true closures, explicit self, the statement-expression distinction,
anonymous functions as expressions with limited abilities, etc.).

Kay
 
Peter Mayne

Torsten said:
My definition would be that an interpreted language has, in its
typical implementation, an interpreting layer necessary for typical
hardware. Of course, now we could discuss what is "typical";
however, in practice one would know it, I think. In the case of Python:
CPython and all important modern processors.

In a previous century, I used something called UCSD Pascal, which at the
time was a typical implementation of Pascal. It ran on (amongst other
things) an Apple ][, which at the time was typical hardware. It worked
by compiling Pascal source to bytecode (called p-code), and interpreting
the p-code. So, in practice, one would know that Pascal was an
interpreted language.

Later on, I used a typical implementation called VAX Pascal: a compiler
reduced Pascal source to VAX object code. In practice, Pascal was not an
interpreted language. Of course, more than one of the VAXen we had did
not implement the entire VAX instruction set, and some instructions were
emulated, or interpreted, if you will, by other VAX instructions. So, in
practice, some of the Pascal was interpreted.

And, as someone in this thread has pointed out, it is likely that your
important modern (x86) processor is not natively executing your x86
code, and indeed meets your definition of having "in its typical
implementation an interpreting layer necessary for typical hardware".

Another example: is Java the bytecode, which is compiled from Java the
language, interpreted or not? Even when the HotSpot JIT cuts in? Or when
a native Java processor is used? Or when your Java program is compiled
with GCJ (if GCJ does what I think it does)? Does this make Java an
interpreted language or not?

Personally, in practice I don't care, so don't ask me. Ponder on getting
angels to dance on the head of a pin before you worry about whether the
dance can be interpreted or not.

PJDM
 
Chris Mellon

Quoth Steven D'Aprano <[email protected]>:
...
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence? All I'm saying is that Python matches
what people think of as an interpreted language. You can deny it, but
it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long. If you
give me a Python program, you have 3 choices: cross your fingers and
hope that I have the required Python interpreter version, slip in a
25Mb Python interpreter install and hope I won't notice, or come clean
and tell me that your program needs an interpreter and I should check to
see that I have it.


You're correct as far as it goes, but can you provide a reasonable
definition for "interpreted" that matches the common usage? Most
people can't.

When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...

They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.

Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.

There is an obvious difference between Python and C. Nobody would deny
that. But it's a fairly hard thing to *quantify*, which is why people
make sloppy categorizations. That's not a problem as long as there
isn't prejudice associated with the categorization, which there is.

I wonder how "interpreted" people would think Python is if the
automagic compilation to .pyc was removed and you had to call
"pythonc" first.
 
Paul Boddie

Chris said:
You're correct as far as it goes, but can you provide a reasonable
definition for "interpreted" that matches the common usage? Most
people can't.

I thought Torsten's definition was good enough: if the instructions
typically produced when preparing your programs for execution can be
handled directly by the CPU then let's call it a "compiled language";
otherwise, let's call it an "interpreted language". I think we all know
about the subtleties of different levels of virtual machines, but if
you want an arbitrary definition that lots of people feel is intuitive
then that's the one to go for.
When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...

Right: compiled Perl and Python instructions typically aren't executed
directly by the hardware; Ruby visits the parse tree when executing
programs (see [1] for some casual usage of "interpreted" and "compiled"
terms in this context), although other virtual machines exist [2];
JavaScript varies substantially, but I'd imagine that a lot of the
implementations also do some kind of parse tree walking (or that the
developers don't feel like documenting their bytecodes), although you
can also compile JavaScript to Java class files [3]; BASIC varies too
much for any kind of useful summary here, but I'd imagine that early
implementations have tainted the language's "compiled" reputation
substantially.
They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.

That's why I put Java and Python in the same category elsewhere in this
thread. Bear in mind, though, that Java's just-in-time compilation
features were hyped extensively, and I imagine that many or most
implementations have some kind of native code generation support,
either just-in-time or ahead-of-time.
Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.

Well, I think Torsten's definition was more objective and yet arrives
at the same result. Whether we're happy with that result, I have my
doubts. ;-)
There is an obvious difference between Python and C. Nobody would deny
that. But it's a fairly hard thing to *quantify*, which is why people
make sloppy categorizations. That's not a problem as long as there
isn't prejudice associated with the categorization, which there is.

I refer you again to Torsten's definition.
I wonder how "interpreted" people would think Python is if the
automagic compilation to .pyc was removed and you had to call
"pythonc" first.

Well, such things might have a psychological impact, but consider
removing Python's interactive mode in order to enhance Python's
non-interpreted reputation, and then consider Perl (an interpreted
language according to the now-overly-referenced definition), which
doesn't have an interactive mode (according to [4] - I don't keep up
with Perl matters myself) but which allows expression evaluation at
run-time. No-one would put Perl together with C in a compiled vs.
interpreted categorisation. Removing the automatic compilation support
might strengthen the compiled feel of both languages further, but
with knowledge of the technologies employed, both languages (at least
in their mainstream forms) are still on the other side of the fence
from C.

Paul

[1] http://www.rubygarden.org/faq/entry/show/126
[2] http://www.atdot.net/yarv/
[3] http://www.mozilla.org/rhino/doc.html
[4] http://dev.perl.org/perl6/rfc/184.html
 
Dave Hansen

On Tue, 21 Feb 2006 08:36:50 -0600 in comp.lang.python, "Chris Mellon" wrote:

[...]
When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...

They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.

IMHO, it's marketing. Soon after (as soon as?) Sun introduced Java,
they announced microprocessors that would implement the JVM natively.
Thus on those machines, Java would not be "interpreted."

AIUI, the reason native Java chips never took off is 1) limited
utility (who wants a chip that can only run Java programs?), and 2)
performance on native chips wasn't even better than JVMs running on
commodity microprocessors, so what's the point?
Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.

I think another reason "perl, Python etc." are known to be interpreted
and Java is not is the interactivity afforded by the former group. This
is also why, e.g., Lisp and Forth are thought of as interpreted (at
least by those with only a passing familiarity with the languages),
though native compilers for both languages are readily available.

Regards,
-=Dave
 
D H

Donn said:
I can say "Python can serve as a scripting language for some applications",
but not "Python is a scripting language!"
> as soon as you say "interpreted, scripting", people think "not
> serious".

Cameron said:
> I *think* you're proposing that,
> were Guido more knowledgeable, he would have created a Python
> language that's roughly as we know now, implemented it with
> FASTER software ... and "to its own detriment".

Fredrik said:
> define "scripting language".
>
> the only even remotely formal definition I've ever seen is "language
> designed to script an existing application, with limited support for
> handling its own state". Early Tcl and JavaScript are scripting
> languages, Python is not.

Kay said:
> Yes, it's Guido's master plan to lock programmers into a slow language
> in order to dominate them for decades.

Donn said:
> All I'm saying is that Python matches
> what people think of as an interpreted language. You can deny it, but
> it's going to look like you're playing games with words, and to no
> real end, since no one could possibly be deceived for very long.

Steven said:
> Describing C (or Lisp) as "compiled" and Python as "interpreted" is to
> paint with an extremely broad brush, both ignoring what actually
> happens in fact, and giving a false impression about Python. It is
> absolutely true to say that Python does not compile to machine code.
> (At least not yet.) But it is also absolutely true that Python is
> compiled. Why emphasise the interpreter, and therefore Python's
> similarity to bash, rather than the compiler and Python's similarity
> to (say) Java or Lisp?

Paul said:
> Yes, I think that with optional static typing, it's quite likely that
> we would see lots of unnecessary declarations and less reusable code
> ("ints everywhere, everyone!"), so I think the point about not
> providing people with certain features is a very interesting one,
> since people have had to make an additional, and not insignificant,
> effort to optimise for speed. One potential benefit is that, should
> better tools than optional static typing be considered and evaluated,
> the "ints everywhere!" line of thinking could prove to be something of
> a dead end in all but the most specialised applications. Consequently,
> the Python platform could end up better off, providing superior tools
> for optimising performance whilst not compromising the feel of the
> language and environment.

Torsten said:
> By the way, this is my main concern about optional static typing: it
> may change the target group, i.e. it may move Python closer to those
> applications where speed really matters, which in turn would have an
> effect on what is considered Pythonic.

Steven said:
> The "Python is both interpreted and compiled" camp, who
> believe that both steps are equally important, and to
> raise one over the other in importance is misleading.
> That's why Sun doesn't describe Java as interpreted,
> but as byte-code compiled. They did that before they
> had JIT compilers to compile to machine code.

Bruno said:
> It's not a "scripting" language, and it's not interpreted.


It will all be sorted out once and for all in Python 3000: The Reckoning
 
Donn Cave

Steven D'Aprano said:
Hey Donn, here is a compiled program for the PowerPC,
or an ARM processor, or one of IBM's Big Iron
mainframes. Or even a Commodore 64. What do you think
the chances are that you can execute it on your
x86-compatible PC? It's compiled, it should just
work!!! Right?

No of course not. If your CPU can't interpret the
machine code correctly, the fact that the code is
compiled makes NO difference at all.

In other words, I have three choices:

- cross my fingers and hope that you have the required
interpreter (CPU);

- slip in an interpreter install (perhaps an emulator)
and hope you won't notice;

- or come clean and tell you that my program needs an
interpreter ("Hey Donn, do you have a Mac you can run
this on?") and you should check to see that you have it.

Sure, all this is true, except for the term "interpreter."
You would surely not use the word that way, unless you
just didn't want to communicate.

Your paragraph above that starts with "No of course not"
even omits a point that everyone understands: you can in
fact expect a .py file to work independent of machine
architecture - like any interpreted language. We all know
what native code compilation buys you and what it doesn't.

Donn Cave, (e-mail address removed)
 
