Why I love Python.


Michael Scarlett

There is an amazing article by Paul Graham about Python, and an even
better discussion about it on Slashdot. The reason I point this out
is that the more I read both articles, the more I realised how we would be
mutilating the language with that god-forsaken @ decorator.
I don't know about the rest of you, but I learned Python and fell in
love with its syntax and simplicity. Python just works. So please,
GvR, don't complicate it. Leave it as is. Work on making it faster,
not uglier. Work on - in some cases - better algorithms for certain
modules, not on making it even closely resemble C or Perl or god knows
whateverotherlanguagethereisoutthere. Am I the only one with a
visceral reaction to this thing???

paul Graham article: http://www.paulgraham.com/pypar.html

Slashdot discussion:
http://developers.slashdot.org/developers/04/08/12/1721239.shtml?tid=156&tid=218
 

Robert

Michael Scarlett said:
There is an amazing article by Paul Graham about Python, and an even
better discussion about it on Slashdot. The reason I point this out
is that the more I read both articles, the more I realised how we would be
mutilating the language with that god-forsaken @ decorator.
I don't know about the rest of you, but I learned Python and fell in
love with its syntax and simplicity. Python just works. So please,
GvR, don't complicate it. Leave it as is. Work on making it faster,
not uglier. Work on - in some cases - better algorithms for certain
modules, not on making it even closely resemble C or Perl or god knows
whateverotherlanguagethereisoutthere. Am I the only one with a
visceral reaction to this thing???

Nope, I have the same reaction.
 

Mark Bottjer

Michael said:
I don't know about the rest of you, but I learned python and fell in
love with its syntax and simplicity.

That's the funny thing about Python. It really isn't simple, but it sure
seems like it is. There's tons of little niggling rules about scoping,
inheritance, and what have you--but you can be blissfully ignorant of
most of them and still get work done. That's pretty unique.
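One of those niggling rules, as a concrete illustration (my example, not Mark's): assigning to a name anywhere in a function makes that name local to the whole function, even before the line that assigns it.

```python
x = 10

def read_global():
    # Reading a global name needs no declaration at all.
    return x + 1

def write_local():
    # Because x is assigned somewhere in this function, Python treats
    # x as local everywhere in it, so the read on the right-hand side
    # fails before the assignment ever happens.
    try:
        x = x + 1
    except UnboundLocalError:
        return "UnboundLocalError"
    return x

print(read_global())   # 11
print(write_local())   # "UnboundLocalError"
```

Most people never hit this until the day they do, which is exactly the "blissfully ignorant" experience described above.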

I think that's why people are so concerned about @pie: it has the
*potential* to be an obstacle early in the learning process, instead of
after someone is already hooked.
Am I the only one with a visceral reaction to this thing???

Goodness, no! Why do you think we've all been pissing and moaning so much?

-- Mark
 

Nick Patavalis

There is an amazing article by paul graham about python, and an even
better discussion about it on slashdot.

Yes, the article is very good.
Don't complicate it. Leave it as is. Work on making it faster, not
uglier.

Python needs drastic performance improvements if it is to scrape off
the "scripting language" stigma. The only way to get these improvements
is to make it possible for a Python implementation to produce *efficient*
*compiled* code. At the same time, the dynamically-typed nature of the
language is one of its most valuable characteristics, and this is one
of the hardest problems when trying to write a decent Python
compiler. If you define a function like:

def sum(a, b):
    return a + b

How can the compiler know what code to produce? It could trace all the
applications of sum(), and decide what types of arguments sum() is
actually applied on. But this is not easy, and sometimes it is
straight-out impossible.

A different approach would be for the programmer to *suggest* what
kind of types the function will *most probably* be applied on. The
programmer might suggest to the compiler that "a" and "b" will *most
probably* be integers or floats, so the compiler will have to produce
code for a function that handles these cases (or code for two
"functions", one for each case). One might say that the function could
be "decorated" by the programmer regarding the type of its
arguments. Notice that in such a case the decoration does not alter
the behavior of the function in any way! The function can still be
called with string arguments, in which case execution will be
dispatched to an "interpreted" version thereof.
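Sketched in today's Python, such a suggestion could be carried by an ordinary decorator. Note that `suggest_types` below is purely hypothetical, invented for illustration; nothing in the interpreter acts on it, which is exactly the point - the hint changes no behavior.

```python
def suggest_types(*types):
    """Attach a type *hint* to a function without changing its behavior.

    A hypothetical optimizing compiler could read __suggested_types__
    and emit specialized code for those cases; the plain interpreter
    simply ignores the attribute.
    """
    def decorate(func):
        func.__suggested_types__ = types
        return func
    return decorate

@suggest_types(int, float)
def sum(a, b):          # shadows the builtin, matching the thread's example
    return a + b

print(sum(1, 2))                 # 3 - could run specialized code
print(sum("a", "b"))             # 'ab' - still works, interpreted fallback
print(sum.__suggested_types__)   # (int, float) - hint visible to tools
```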

So I believe that "making it faster" requires some fundamental
work, and not simply devising "better algorithms for some
modules". Better algorithms for some modules will give you something
like a point-something improvement in performance. Being able to produce
efficient compiled code could give you an improvement of a factor of
ten or more (depending on the type of program). The same is true for
"making it more secure" (e.g. by providing the programmer a way to
specify what types of arguments are *allowed* to be passed to a
function).

In general, Python must break free from its Perl-ish adolescence
(a language for small things, which doesn't have to be very fast or very
safe), but without losing its agility. Decorators might be a step in
the right direction, or at least they might allow some experimentation
with such matters.

Because of this, they are welcome.

Just my 2c
/npat
 

John Roth

Nick Patavalis said:
Python needs drastic performance improvements if it is to scrape off
the "scripting language" stigma.

More performance would be helpful. There are a number
of projects working toward that end, of which
the most visible is the PyPy project. Jim Hugunin claims
that he's getting substantial improvements with his port
to the .NET framework, but see Fredrik Lundh's August
4 post on the subject.

As far as I'm aware, the biggest current performance
sink is function and method call overhead. Lookup for
module and built-in level variables is also a significant
time sink - and both module level and builtin identifiers
are used quite frequently.
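That call overhead is easy to observe with the standard `timeit` module (a rough sketch; the absolute numbers depend entirely on the machine):

```python
import timeit

# Same arithmetic, inline versus wrapped in a function call.
inline = timeit.timeit("x = 3 + 4", number=1_000_000)
called = timeit.timeit("x = add(3, 4)",
                       setup="def add(a, b): return a + b",
                       number=1_000_000)

# The called version is typically several times slower, and the gap
# is almost entirely call/lookup overhead, not arithmetic.
print(inline, called)
```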

Another thing to notice is the garbage collection
algorithm. Python uses reference counting as the
basic algorithm, which wasn't that bad a choice
a decade ago. Today, real garbage collection
technology has outstripped it, so that maintaining
the reference counts is another time sink.

The descriptor technology in new-style classes
is a stunning technical achievement, but in the
worst case it requires a full scan of the class
hierarchy before the mechanism can decide if
it's appropriate to insert an attribute into an
instance or invoke a property.
The only way to get these improvements
is to make it possible for a Python implementation
to produce *efficient* *compiled* code.

I think there are lots of people that would dispute
you on that. Current Java environments run close
to C++ performance due to the JIT compilers
that are built into the runtimes. Current JIT
technology doesn't require pre-declaration of
variable types; it's perfectly happy to insert checks
at appropriate points so that it can reuse code when
the object types don't change (which they don't
most of the time.)

John Roth
 

Jack Diederich

That's the funny thing about Python. It really isn't simple, but it sure
seems like it is. There's tons of little niggling rules about scoping,
inheritance, and what have you--but you can be blissfully ignorant of
most of them and still get work done. That's pretty unique.

I think that's why people are so concerned about @pie: it has the
*potential* to be an obstacle early in the learning process, instead of
after someone is already hooked.
Agreed, Python isn't simple, and those hidden things are actually useful for
getting real work done. I've been using Python industrially for three years
and I'm a big fan of decorators; decorators would help me get things done.
I liked the look of the [decorators]-before-the-colon option more, but the
current situation of

def foo(a, b, c):
    #
    # 60 lines of code here
    #
foo = mutate(foo)  # oh, and by the way, the 'def foo'
                   # signature might be misleading

The 'foo = mutate(foo)' line is boilerplate, and Python is nice because it
eschews boilerplate.
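With the proposed syntax, the same transformation sits right at the def. A sketch, where `mutate` is a stand-in that just tags the function:

```python
def mutate(func):
    # Stand-in transformation: a real decorator might add tracing,
    # synchronization, registration, memoization, ...
    func.mutated = True
    return func

@mutate
def foo(a, b, c):
    # 60 lines of code here
    return a + b + c

print(foo.mutated)  # True - the transformation is visible at the def
```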

While the decorator syntax might not be obvious to newbies, they won't see
it in simple code. When they do see it, having @mutate right next to the
func def has to be more of a clue than 'foo = mutate(foo)' lines or screens away.

-Jack
 

Nick Patavalis

More performance would be helpful. There are a number
of projects that are working toward that end, of which
the most visible is the PyPy project.

Yes, I know about PyPy, and I think what they are trying to do is
write Python itself in a Python-subset that can be efficiently
compiled, or something along these lines. This is interesting (to say
the least).
As far as I'm aware, the biggest current performance
sink is function and method call overhead [...]

Another thing to notice is the garbage collection
algorithm [...]

Both very true!
I think there are lots of people that would dispute
you on that. Current Java environments run close
to C++ performance due to the JIT compilers
that are built into the runtimes.

You're right, I was maybe a bit too dogmatic on my point. But you
must accept that JIT compilers are, nevertheless, compilers! They may
be more intelligent and more flexible than traditional "ahead of time"
compilers, but they are still fundamentally compilers. Furthermore,
in most cases, it might be possible for an AOT compiler to produce a
"binary" that doesn't contain the compiler itself.
Current JIT technology doesn't require pre-declaration of variable
types; it's perfectly happy to insert checks at appropriate points
so that it can reuse code when the object types don't change (which
they don't most of the time.)

What you mean, I guess, is that the first time a function is applied,
it is compiled to native code, and a signature for the application is
generated. The next time, the application is checked against the
signature, and if they match, the existing code is used; otherwise the
function is re-compiled (preserving the previously compiled one too,
in some sort of "cache"). Or am I getting it wrong? Even in such a
case, though, pre-declarations would help.
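That caching scheme can be mimicked in pure Python. This is a toy model of the idea only, not of any real JIT - real JITs emit native code per observed type signature, while this merely caches one wrapped version per tuple of argument types:

```python
def specialize(func):
    """Toy model of per-signature specialization with a cache."""
    cache = {}

    def compile_for(sig):
        # Pretend to compile: in a real JIT this would emit native
        # code specialized for the types in 'sig'.
        def specialized(*args):
            return func(*args)
        return specialized

    def dispatch(*args):
        sig = tuple(type(a) for a in args)
        if sig not in cache:                  # first call with this signature:
            cache[sig] = compile_for(sig)     # "compile" and remember it
        return cache[sig](*args)              # otherwise reuse the cached one

    dispatch.cache = cache
    return dispatch

@specialize
def add(a, b):
    return a + b

add(1, 2); add(3, 4); add("x", "y")
print(len(add.cache))   # 2 - one entry for (int, int), one for (str, str)
```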

Do you happen to know of any efforts to build such "AOT"/"JIT"
compilation/execution environments for Python?

Regards
/npat
 

Anthony Baxter

Python needs drastic performance improvements if it is to scrape off
the "scripting language" stigma.

I'm biased, having done a paper on this at the most recent PyCon, but
I firmly believe that many of the "Python is too slow" arguments can be
answered with "too slow for what?" See the PyCon proceedings, but I've
been doing VoIP in Python, complete with audio mixing, and it's been
more than fast enough.

Another large app is a database ETL tool - Python is more than
adequate for flinging around a very large number of rows of data.
Indeed, it could be 4-5 times as slow, and Oracle would still be the
bottleneck.

Sure, you're not going to get great performance for your numerical
computation in Python, but luckily, we have numarray for this.

Yes, additional performance would be a nice-to-have, but I've not
really found the existing interpreter's performance to be that much
of a problem. I suspect that one of the many new implementations
will provide us with some wins here.
 

Anthony Baxter

[ pie decorators ]
Am I the only one with a
visceral reaction to this thing???

So did you have a similar reaction on first hitting the indentation for
blocks? I know I dimly recall thinking that this was very strange and
horrible (dimly, because it was 1992 or 1993).
 

Sam Holden

I'm biased, having done a paper on this at the most recent PyCon, but
I firmly believe that many of the "Python is too slow" arguments can be
answered with "too slow for what?" See the PyCon proceedings, but I've
been doing VoIP in Python, complete with audio mixing, and it's been
more than fast enough.

I've also been doing RTP voice in Python - on an iPAQ H3800... I'm using
your rtp.py code (I'm not doing SIP), so I can't take credit for it
though :)

I don't know if it's "more than fast enough", but it's "fast enough".
 

Anthony Baxter

I've also been doing RTP voice in Python - on an iPAQ H3800... I'm using
your rtp.py code (I'm not doing SIP), so I can't take credit for it
though :)

I don't know if it's "more than fast enough", but it's "fast enough".

Neat! Well, on that sort of extremely limited hardware, it's not surprising
that it's more of a struggle. Someone else got the full shtoom working on
WinCE or similar.
 

Erik de Castro Lopo

Python needs drastic performance improvements if it is to scrape off
the "scripting language" stigma. The only way to get these improvements
is to make it possible for a Python implementation to produce *efficient*
*compiled* code. At the same time, the dynamically-typed nature of the
language is one of its most valuable characteristics, and this is one
of the hardest problems when trying to write a decent Python
compiler. If you define a function like:

def sum(a, b):
    return a + b

How can the compiler know what code to produce?

I know of at least one language which has solved this problem: OCaml.

http://www.ocaml.org/

It's called type inference, and since there is at least one working
implementation, it can't be THAT hard.


Erik
--
+-----------------------------------------------------------+
Erik de Castro Lopo (e-mail address removed) (Yes it's valid)
+-----------------------------------------------------------+
Never argue with stupid people. They'll just drag you down to
their level and beat you with experience
 

Erik Max Francis

Erik said:
I know of at least one language which has solved this problem: OCaml.

http://www.ocaml.org/

It's called type inference, and since there is at least one working
implementation, it can't be THAT hard.

That's actually the kind of thing that is planned for Python with
Starkiller, however silly a project name that might be.
 

Brian Quinlan

Erik said:
I know of at least one language which has solved this problem: OCaml.

http://www.ocaml.org/

It's called type inference, and since there is at least one working
implementation, it can't be THAT hard.

You are comparing apples and oranges. The programmer provides OCaml
with additional information that allows it to infer the type.
Looking at the example above, the OCaml equivalent would be:

let sum x y = x + y;;

But this function would only work for integers, because the + operator
only applies to integers. If you wanted to add floats, then you
would write:

let sum x y = x +. y;;

So there is no magic in OCaml, just a different way of providing type
information.

Cheers,
Brian
 

Reinhold Birkenfeld

Erik said:
I know of at least one language which has solved this problem: OCaml.

http://www.ocaml.org/

It's called type inference, and since there is at least one working
implementation, it can't be THAT hard.

Refer to the task "Typed Python" somewhere in the past if you want more
information about Python and Type inferencing.

Reinhold
 

Reinhold Birkenfeld

Reinhold said:
Refer to the task "Typed Python" somewhere in the past if you want more
information about Python and Type inferencing.

s/task/thread/

Reinhold
 

John Roth

Nick Patavalis said:
You're right, I was maybe a bit too dogmatic on my point. But you
must accept that JIT compilers are, nevertheless, compilers! They may
be more intelligent and more flexible than traditional "ahead of time"
compilers, but they are still fundamentally compilers. Furthermore,
in most cases, it might be possible for an AOT compiler to produce a
"binary" that doesn't contain the compiler itself.

It's generally regarded as not worth doing, simply because
JITs might compile different code for each time through a
method if the signature changes dynamically.
What you mean, I guess, is that the first time a function is applied,
it is compiled to native code, and a signature for the application is
generated. The next time, the application is checked against the
signature, and if they match, the existing code is used; otherwise the
function is re-compiled (preserving the previously compiled one too,
in some sort of "cache"). Or am I getting it wrong? Even in such a
case, though, pre-declarations would help.

Exactly, although the scope is smaller than a function - it has
to check other variables that the method might refer to.
Declarations don't help unless they can provide a solid
guarantee of the variable's type. If they can't, they're
useless because the JIT has to insert the type checking
code anyway.
Do you happen to know of any efforts to build such "AOT"/"JIT"
compilation/execution environments for Python?

That's part of the plan for PyPy.

John Roth
 

Nick Patavalis

It's generally regarded as not worth doing, simply because
JITs might compile different code for each time through a
method if the signature changes dynamically.

What is regarded as not worth doing? I don't really understand this
remark.
Declarations don't help unless they can provide a solid
guarantee of the variable's type. If they can't, they're
useless because the JIT has to insert the type checking
code anyway.

Agreed! The only way to avoid type-checking at runtime is to have
static typing, but nobody wants this, do they? Declarations, though, can
help by indicating to the compiler what types of applications it is
worth optimizing (i.e. do the best you can for strings, but for ints
and floats I do want this code to be fast).

/npat
 

Nick Patavalis

That's actually the kind of thing that is planned for Python with
Starkiller, however silly a project name that might be.

Correct me if I'm wrong, but I think that Starkiller produces
optimized code (i.e. native code) only if it can unambiguously
infer the types a priori, and there are cases (in a dynamically
typed language like Python) where this is impossible. In these cases,
I believe, Starkiller does nothing. Are there any plans for treating
such cases? And how?

/npat
 

Nick Patavalis

I'm biased, having done a paper on this at the most recent PyCon, but
I firmly believe that many of the "Python is too slow" arguments can be
answered with "too slow for what?" See the PyCon proceedings, but I've
been doing VoIP in Python, complete with audio mixing, and it's been
more than fast enough.

Yes, but what parts of it were done in Python, and what parts were done
inside modules written in C?

Do you believe, for example, that a web server written in Python could
outperform Apache? How about an H.323 implementation, or a TCP/IP
stack? Or a font renderer? Or a ray tracer? A gate-level circuit
simulator? A web browser? A relational database?
Sure, you're not going to get great performance for your numerical
computation in Python, but luckily, we have numarray for this.

If numarray was written *in Python*, I would be delighted. But even
with numarray, if you want to do an FFT, you do it in C, not in
Python. And if FFT is not good for you and you need a DCT, again it's in
C. And if the FFT of numarray is not sufficient (e.g. you want an
integer version with certain bit-exact properties), hmmm, sorry, you
have to do it in C.

At this moment Python is an excellent *glue* language for stuff written
in low-level languages. It is also an excellent prototyping language. It
has a long way to go before becoming a true "production" language (in
the sense outlined above). Most of this way has to do with Python
*implementations* and not with Python-the-language. But it seems that
there are some steps that must be taken by the language itself in
order to open the road to efficient implementations.

/npat
 
