Why I love Python.


Erik de Castro Lopo

Brian said:
You are comparing apples and oranges.

Yes, OCaml and Python are very different languages, but ...
The programmer provides OCaml with additional information that allows
it to infer the type.

But OCaml does have parametric polymorphism, ...
Looking at the example above, the OCaml equivalent would be:

let sum x y = x + y;;

But this function would only work for integers, because the + operator
only applies to integers. If you wanted to add floats then you
would write:

let sum x y = x +. y;;

So there is no magic in OCaml, just a different way of providing type
information.
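For contrast, a minimal sketch of the Python side of the comparison (the
original Python example from earlier in the thread is not quoted here, so
this is only a guess at its shape): the same function handles ints and
floats, because the dispatch on + happens at run time.

def my_sum(x, y):        # hypothetical name, so as not to shadow the builtin sum()
    return x + y

print(my_sum(1, 2))      # 3   -- integer addition
print(my_sum(1.5, 2.5))  # 4.0 -- float addition, same function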

Using ints and floats is a bad example because OCaml has different
operators for float and int. A better example might be a function to
do the Python equivalent of

string.join(list_of_strings, ", ")

Ie:

(* The list version *)
let rec comma_join lst =
  match lst with
  | [] -> ""
  | hd :: [] -> hd
  | hd :: tl -> hd ^ ", " ^ (comma_join tl)
;;

(* The array version, an example only. *)
let comma_join ary =
  comma_join (Array.to_list ary)
;;


OCaml has no problem with two functions having the same name (the second
definition simply shadows the first); it infers the type of each by looking
at how the function arguments are used, and assumes a function is generic
if insufficient information is available.

Erik
--
+-----------------------------------------------------------+
Erik de Castro Lopo (e-mail address removed) (Yes it's valid)
+-----------------------------------------------------------+
"One World, one Web, one Browser." - Microsoft promotion
"Ein Volk, ein Reich, ein Fuhrer." - Adolf Hitler
 

Michael J. Fromberger

At this moment Python is an excellent *glue* language for stuff written
in low-level languages. It is also an excellent prototyping language. It
has a long way to go before becoming a true "production" language (in
the sense outlined above). Most of this way has to do with Python
*implementations* and not with Python-the-Language. But it seems that
there are some steps that must be taken by the language itself in
order to open the road to efficient implementations.

Let me play the Devil's advocate for a moment here.

Why is it important to write an entire program in a single language
(e.g., Python), versus using a hybrid approach? If you can use Python
at all, that means your platform already has good support for a C
compiler, so there is no reason not to use C extensions if you really
need performance.

Now, perhaps you'll argue that C extensions are not as portable as
Python ones. And yet, portability failures usually arise from
differences in how you access hardware (e.g., graphics cards, audio
hardware, input devices) or operating system APIs, and those
differences are going to crop up in Python as well. If you are going to
have to write system-specific code anyway, and assuming you are very
concerned about "high performance," you might as well just provide
multiple C extensions to accommodate the differences, and let the Python
glue code remain the same.
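To make the hybrid approach concrete, here is a minimal, hedged sketch (not
from the original post) of Python acting as glue around already-compiled C
code, using the ctypes module; it assumes a Unix-like system where the C math
library can be located.

import ctypes
import ctypes.util

# Locate and load the platform's C math library (the name varies per system).
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # the actual computation runs in compiled C code

In a real application the C side would of course be your own
performance-critical routine rather than libm, with one such library per
platform if needed, while the Python glue stays the same.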

By this view, I would argue that Python is a much better "production"
language than many other languages currently being used in that role.
It is no harder to write extensions for Python than to write native
methods for Java (and, I would argue, easier for several common cases).
Furthermore, Python can be stripped down and embedded without too much
pain, so that the developer is not forced to maintain a single
monolithic code-base for their entire application in Python, simply to
take advantage of a few of its powerful features.

In short, I would argue that Python's ability to play nicely in a
multiple-language development project is actually a sign of its maturity
as a production tool. More cool languages are killed by their inability
to interface nicely with other cool languages than by all other
reasons combined.

-M
 

Dave Brueck

Nick said:
Yes but what parts of it were done in python, and what parts were done
inside modules written in C?

Do you believe, for example, that a web-server written in python could
outperform apache?

Yes. Apache is not that fast, and web servers are often more network
bound than CPU bound.
How about an H323 implementation, or a TCP/IP
stack? Or a font renderer? Or a ray-tracer? A gate-level circuit
simulator? A web-browser? A relational database?

Nobody is arguing that Python is as fast as C. But being slower does not
imply that Python is unsuitable for those tasks. I'd consider your list
to be pretty atypical of normal development (how many TCP stacks and
relational databases need to get written each year?), but even so most
of the items on your above list _have_ been done in Python and have done
pretty well for a number of applications. I'd wager that the vast
majority of programs written have at their disposal more CPU than they
need, so using more of that spare CPU power (by using a higher level
language like Python) is a cost many people are ready to pay for many,
many applications.

Note also that all or most of the programs on your list at one time
had to be partially implemented in assembly language even if the main
language was C or C++, and yet that didn't make C or C++ unsuitable
development languages for the task (nor did it make them only "glue
languages"). The same can hold true for Python in many cases - if a
small portion needs to be developed in a lower-level language you can
still derive great benefit from doing the rest of the application in
Python.

In nearly all of the cases where I was sure I'd have to later recode a
portion in C for performance, that day never arrived. For some reason
additional performance is always welcome, but the lack thereof rarely
ends up becoming a big deal. (And this is not just in my own projects -
when other people/companies are driving the requirements they are pretty
much always more interested in getting it to market and adding new
features. On one project in particular I have on my todo list to go
rewrite the performance "critical" core in C and it's been on my todo
list for a couple of _years_ now because I'm the only one left who cares
that it could be faster - everyone else is focused on feature set. And
since the core is partially CPU bound its performance has more than
doubled during that time due to faster CPUs - here I am sitting still
and the problem is going away :) ).

If numarray was written *in Python* I would be delighted. But even
with numarray, if you want to do FFT, you do it in C, not in
Python. And if FFT is not good for you and you need DCT, again in
C. And if the FFT of numarray is not sufficient (e.g. you want an
integer version with certain bit-exact properties), hmmm, sorry, you
have to do it in C.
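To illustrate the gap being described, here is a rough sketch (not from the
original post) contrasting a naive pure-Python DFT, where every multiply runs
in the interpreter loop, with a call into a compiled FFT routine; numpy.fft is
used below purely as a stand-in for numarray's C-backed FFT.

import cmath
import numpy

def dft(samples):
    # Naive O(N^2) discrete Fourier transform, all work done in Python bytecode.
    n = len(samples)
    return [sum(samples[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

signal = [0.0, 1.0, 0.0, -1.0]
print(dft(signal))            # interpreted: fine for 4 points, painful for 2**20
print(numpy.fft.fft(signal))  # the same transform, computed in compiled code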

At this moment Python is an excellent *glue* language for stuff written
in low-level languages. It is also an excellent prototyping language. It
has a long way to go before becoming a true "production" language (in
the sense outlined above).

I have to disagree - we use it as our main production language for so
many different things it's hard for me to sit still when it's
pigeonholed as just a glue language (*especially* when a lot of our
Python programs sit idle for large blocks of time waiting for e.g. the
database to get done or for the network pipe to become less saturated).

Maybe it all comes down to domain, but for me the cases you describe are
rare and oddball enough that if a little C is needed to get the job done
then it's no big deal because they make up such a tiny minority of all
the problems we're solving.

C/C++ are becoming less and less suitable for production use - their
main remaining advantage is performance and that becomes a smaller and
smaller issue each year. Everything from manual memory management to
hacky, primitive data structures (even _with_ C++/STL) make them more of
a liability than an asset - development in them is slow, error-prone,
and therefore too expensive. For personal projects I don't have enough
spare time to waste it coding in something so low-level as C, and for
professional projects the raw speed is generally valued but much less so
than time-to-market and cost of change so I can't justify C/C++/etc
there either.

99% of the time the tradeoff for using Python comes down to this:

Benefits: low cost, fast time to market, cheap addition of new features,
fewer bugs

Costs: use CPU cycles that were already idle anyway

Score!

-Dave
 

Neuruss

I guess you are looking for type inference or something along these
lines.
There's a very ambitious project called "Starkiller", which is a
static type inferencer and a compiler from Python to C++.
It's being developed by Michael Salib, an MIT graduate, and as far as
I know it will be released very soon.

Preliminary results show speedups by a factor of 60.
http://www.python.org/pycon/dc2004/papers/1/presentation.pdf
 

Reinhold Birkenfeld

Nick said:
Correct me if I'm wrong, but I think that Starkiller produces
optimized code (i.e. native code) only if it can unambiguously
infer the types a priori, and there are cases (in a dynamically
typed language like Python) where this is impossible. In these cases,
I believe, Starkiller does nothing. Are there any plans for treating
such cases? And how?

I think the dynamic nature does make it impossible to do anything in
such cases in the first place. Consider:

klass = raw_input()
classobj = eval(klass + "()")
print classobj.whatami

A compiler can tell absolutely _nothing_ about the resulting class
object since typing information is not contained in the program.

One would have to tell the "compiler" explicitly which types the
variable will be allowed to hold, such as:

klass = raw_input()
classobj as (FooObject, BarObject, BazInterface) = eval(klass + "()")
print classobj.whatami

But that requires "typed Python" extensions, and as such isn't pure type
inferencing any more.
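As a purely illustrative aside (not part of Reinhold's post), the closest one
can get without new syntax is a run-time check rather than a compile-time
guarantee; FooObject and BarObject below are hypothetical stand-ins for the
classes in the example above.

class FooObject:
    whatami = "a Foo"

class BarObject:
    whatami = "a Bar"

klass = raw_input()
classobj = eval(klass + "()")
if not isinstance(classobj, (FooObject, BarObject)):
    raise TypeError("unexpected class: %s" % klass)
print classobj.whatami

Such a check still happens at run time, of course, so it gives a static
compiler like Starkiller nothing extra to work with.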

Reinhold
 

Nick Patavalis

Yes. Apache is not that fast, and web servers are often more network
bound than CPU bound.

We're obviously interested in cases where the problem is
CPU-bound. In a network-bound server, there is no *meaning* in
speaking about performance (with respect to the implementation
language).
Nobody is arguing that Python is as fast as C. But being slower does
not imply that Python is unsuitable for those tasks. I'd consider
your list to be pretty atypical of normal development (how many TCP
stacks and relational databases need to get written each year?),

I also mentioned web-browsers, ray-tracers, circuit-simulators. I
could add word-processors, spreadsheets, video editing programs, and
GUI toolkits to the list. Are they still too exotic? To cut the thread
short, what I mean is that an application that has to do something
like:

for i in range(N):
    a = b + c

is bound to be 10 to 100 times slower than the equivalent coded in
C. Which means that the cost of doing *computation* in Python is
prohibitively high! Have you ever seen, say, an AVL-tree
implementation in production Python code? Probably not. Have you ever
seen someone implementing some sort of string-lookup algorithm in
Python (instead of using the built-in dictionaries)? Again no. Is it
because Python has found the "one-size-fits-all",
"best-algorithm-ever-devised" solution? Or is it because the weight of
the language itself is such that even a suboptimal algorithm
implemented in C will never be matched by a python implementation?

The very fact that the Python interpreter itself is implemented in C
(and not in Python) is indicative.
Note also that all or most of the programs on your list at one time
had to be partially implemented in assembly language even if the main
language was C or C++, and yet that didn't make C or C++ unsuitable
development languages for the task (nor did it make them only "glue
languages").

No, but the performance difference between C and Assembly was
*small*. And at some point the C compilers became so good, that you
couldn't beat them by hand coding something (of considerable length)
in assembly. As for C++, one of its primary design goals was "zero
unneeded overhead"; so it *is* possible to write a C++ program that is
as fast as a C program, if you want to do it.
The same can hold true for Python in many cases - if a small portion
needs to be developed in a lower-level language you can still derive
great benefit from doing the rest of the application in Python.

Of course you can! Nobody argued that Python is useless. Python is one
of the cleanest, most pleasant, and most productive languages one
could wish for. For me it would not be an exaggeration to say that
Python has brought a lot of fun back in programming (and in many
ways). The reason I'm writing this is that *I also* hate to see it
pigeon-holed as a "glue" or "scripting" language. Our difference, I
guess, is that I believe that there is *some* truth in such labels,
and this has to do with the current, immature state of the Python
*environments*. So my point is that we should not relax in the cozy
feeling that "Python is great for most applications, even if it's a
little slow, but who cares". I want to be able to write signal
processing functions in Python, or implement that optimized
special-case search algorithm, and I want to be sure that---by guiding
the compiler properly---it will produce code that is as efficient as a
well-written C program, or hand-coded assembly (or at least close to
that). I want the next GUI toolkit I use to be written in Python
(instead of written in C++ and simply wrapped in Python). And I
believe that this *is* possible, provided that we don't ignore all the
years that have been spent advancing compiler technology, and that we
don't treat the current Python environments as the "end of the
line". CPython is a good proof that Python works, and that it is a
great language. For Python to become a "primary" language, there's
still much work to be done. Most of this work, as I said before, has
to do with the environments (interpreters, AOT/JIT compilers,
optimizers, runtime modules, etc). But some of it has to do with
ensuring---at the language level---that efficient environments are
possible. Considering CPython and Python one and the same leads
straight to a Perl-ish hell!

/npat
 

Nick Patavalis

Why is it important to write an entire program in a single language
(e.g., Python), versus using a hybrid approach? If you can use Python
at all, that means your platform already has good support for a C
compiler, so there is no reason not to use C extensions if you really
need performance.

If this is the best thing possible then yes, this is a solution. But
you see, I believe that even for these performance-sensitive parts,
Python has a lot to offer in terms of expressive power. The question
is: is it possible to make a Python environment fast enough to produce
code as efficient as a C compiler's? I believe the answer is
yes, and we shouldn't be satisfied with the approach "do these glue
things in Python, and for the computationally expensive ones, well
code them in C".

/npat
 

Nick Patavalis

I think the dynamic nature does make it impossible to do anything in
such cases at the first place. Consider:

klass = raw_input()
classobj = eval(klass + "()")
print classobj.whatami

Yes, that's exactly what I meant. The only solution in such a case
would be for the environment to call the compiler at run time, and
compile classobj then. This means of course that in such cases the
compiler must be included in the "executable".

I believe this has been done in other dynamic languages.

Typed-extensions, as you mention, would also help.

/npat
 

Dan Schmidt

| To cut the thread short, what I mean is that an application that has
| to do something like:
|
| for i in range(N):
|     a = b + c
|
| is bound to be 10 to 100 times slower than the equivalent coded in
| C.

Although note that (at least by my timing)

a = map(operator.add, b, c)

is 3 times as fast as your Python version, bringing the code up to
being only 3 to 33 times slower than C.

I'm sure that Pyrex could bring the speed up a heck of a lot more, too.
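For what it's worth, here is one way such a comparison could be timed with the
standard timeit module (a sketch, not part of Dan's post; it treats b and c as
equal-length sequences, as the map() version implies, and the sizes and repeat
counts are arbitrary):

import timeit

setup = "import operator; b = range(100000); c = range(100000)"
loop_stmt = "a = [b[i] + c[i] for i in range(len(b))]"
map_stmt = "a = list(map(operator.add, b, c))"

print(timeit.Timer(loop_stmt, setup).timeit(number=100))
print(timeit.Timer(map_stmt, setup).timeit(number=100))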
 

kosh

On 2004-08-13, Reinhold Birkenfeld
Yes, that's exactly what I meant. The only solution in such a case
would be for the environment to call the compiler at run time, and
compile classobj then. This means of course that in such cases the
compiler must be included in the "executable".

Why is there a need for a stand alone executable? At least on all the unixes
whether something is executable is just determined by the executable bit on
the file. I can execute a python program just as transparently as one in
compiled c, c++, etc. I really don't see the point of that.

Overall I would rather that there was more reliance on runtimes and that psyco
was improved to the point that it was just part of Python and could save its
JITed versions of code for reuse later. That way I can upgrade libraries, the
runtime, etc., and as long as the system is still source compatible the
application would still work, and it would speed up as it ran as things were
compiled to optimized code as needed.

Overall I think that standalone binaries are bad long term. I would prefer
source compatibility since that is more flexible long term. With a JIT the
code should run just as fast, but it would make things like security updates
and updating pieces of the system simpler.
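For reference, a minimal sketch of how psyco was typically switched on at the
time (it assumes the psyco extension is installed, and it only ever targeted
32-bit x86; the checksum function is just a made-up CPU-bound example):

try:
    import psyco
    psyco.full()   # specialize functions to machine code as they are called
except ImportError:
    pass           # no psyco available: fall back to the plain interpreter

def checksum(data):
    total = 0
    for value in data:
        total = (total + value) % 65521
    return total

print(checksum(range(1000000)))

As noted above, psyco threw its compiled code away at exit; caching it between
runs is exactly the kind of improvement being asked for here.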
 

Dave Brueck

Nick said:
I also mentioned web-browsers, ray-tracers, circuit-simulators. I
could add word-processor, spreadsheets, video editing programs, and
GUI toolkits to the list. Are they still too exotic?

No - but I still don't think they reflect anywhere near a majority of
the development that goes on. I'm not at all saying that there aren't
applications where performance matters, just that (1) it tends to be far
less common than most people believe/realize (2) when it does matter, it
usually matters in a very tiny portion of the total code, and (3) rarely
in today's development do you need to build from scratch everything down
to the building blocks themselves anyway (so if you're e.g. building a
web browser, in many cases you won't even be dealing with the lowest-level
data anyway - you won't be handling individual pixels but will be
calling a library somewhere else to render an image for you - and as the
developer of the web browser, that's just peachy).

IOW, it'd be lovely to have a blazingly fast-as-C Python, but the lack
of fast-as-C performance is rarely the most important problem in practice.
To cut the thread
short, what I mean is that an application that has to do something
like:

for i in range(N):
    a = b + c

is bound to be 10 to 100 times slower than the equivalent coded in
C. Which means that the cost of doing *computation* in Python is
prohibitively high!


Not necessarily, and that's the point. You're making the assumption that
10-100 times slower is too slow. In some cases it most definitely is.
In many cases it most definitely is not.
Have you ever seen, say, an AVL-tree
implementation in production Python code? Probably not. Have you ever
seen someone implementing some sort of string-lookup algorithm in
Python (instead of using the built-in dictionaries)? Again no. Is it
because Python has found the "one-size-fits-all",
"best-algorithm-ever-devised" solution?

Or is it because in 99% of the cases what is there works well enough, so
much so that closing the gap is not worth the opportunity cost
of working on something else?

If you have an overabundance of a particular resource, and you can gain
some advantage in exchange for some of that abundance, it's nearly
always worth the trade. Such is the case with CPU - for many, many
programs we have oodles of excess CPU time lying around, so it's a
worthwhile trade. And that's exactly why everybody would love a faster
Python but most people aren't willing to invest time working on it.

I welcome any speed boost we see from people working on improving
performance, but it's just not what's standing between me and most of my
development goals. Heck, right now I'm writing this message, I've got my
mail & IM messages going, a couple of code editors open, a virtual PC
instance running my database and webserver, and I'm building web pages
off content in the database and saving them to disk. CPU usage is
hovering around 5%. If I were to increase the speed of all of these
applications by a factor of 1000 I wouldn't even notice. Add a few
features to any of them, and I would.
Or is it because the weight of
the language itself is such that even a suboptimal algorithm
implemented in C will never be matched by a python implementation?

Practice has shown that not only is this not true, but a lot of
times working in a higher-level language is also worth it because the
cost of discovering and implementing a better algorithm is lower, so
you could end up with better performance than by going to C. Or you'd
arrive at plenty-fast-enough sooner.
No, but the performance difference between C and Assembly was
*small*.

Over time, yes, but certainly not initially. I still remember how
appallingly slow my graphics routines were in C - way too much overhead
- while in assembly they had no problems at all.
And at some point the C compilers became so good, that you
couldn't beat them by hand coding something (of considerable length)
in assembly.

That happened later, at least on PCs. The real transition happened as
CPU speed grew so much that the difference between e.g. 4.77 MHz and 10
MHz was boring.
As for C++, one of its primary design goals was "zero
unneeded overhead"; so it *is* possible to write a C++ program that is
as fast as a C program, if you want to do it.

It was a design goal, but (1) implementations didn't achieve it very
well initially and (2) in the end it didn't matter that they didn't
achieve it. More and more people migrated to C++ because the cost of
doing so (overhead) fell steadily over time - even more quickly than the
compilers improved.
and I want to be sure that---by guiding
the compiler properly---it will produce code that is as efficient as a
well-written C program, or hand-coded assembly (or at least close to
that).

Ugh - any time you spend guiding the compiler is time you could have
spent actually solving a problem. I find it interesting that when
programming in C we don't use the 'register' compiler hint anymore. Why?
It's a combination of smarter compilers and faster computers, but either
way its existence was awful IMO - don't distract the programmer like that.

I *love* it whenever I see that Pystone benchmarks are improving, or
that any of the various VM implementations are making headway, and I'll
gladly use them. But at the same time, I have to admit that they aren't
solving any problems I encounter on a daily basis.
For Python to become a "primary" language, there's
still much work to be done.

Couldn't disagree more. Yes, things like interpreters, compilers, etc.
could use more maturity and will continue to evolve over time, but even
_lacking_ those things it's still far enough ahead of other "primary"
languages to make the _net_ result a huge advantage. With the evolution
of those things the advantage will just become more pronounced.
But some of it has to do with
ensuring---at the language level---that efficient environments are
possible.

Again, I disagree. IMO one of the benefits of higher-level languages is
that the underlying implementation technology takes care of details
that the developer need not be concerned with.

Overall, every little performance improvement in a Python implementation
extends the domain in which Python is a good tool for the job, and
that's great. But AFAICT it's already a terrific fit for a massive chunk
of real-world development, so much so that increasing its speed by, say,
a factor of 10 isn't going to even double its domain of usability.

-Dave
 

Christopher T King

Why is there a need for a stand alone executable? At least on all the unixes
whether something is executable is just determined by the executable bit on
the file.

Not if you don't have the interpreter installed.
I can execute a python program just as transparently as one in
compiled c, c++, etc. I really don't see the point of that.

Indeed, you can do that just as easily on Windows, too. The point of a
stand-alone executable is not to make running the script easier, but to
make distribution easier. Users don't need to install Python to run a
Python script if it's a stand-alone executable.
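For context, a rough sketch of how such a stand-alone Windows executable was
commonly built at the time with py2exe (assuming py2exe is installed;
"hello.py" is a hypothetical script name, and other freezing tools existed as
well):

# setup.py -- run with:  python setup.py py2exe
from distutils.core import setup
import py2exe   # importing this registers the "py2exe" distutils command

setup(console=["hello.py"])

The result is a dist/ directory containing an .exe, the Python DLL, and the
modules the script uses, which can be shipped without a separate Python
install.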
 

kosh

Not if you don't have the interpreter installed.

So install the runtime. If you want to run the .NET stuff you need to have
the .NET CLR or Mono installed. If you want to run Java apps you need the JVM,
etc. Overall, once a runtime is installed it makes distributing apps a lot
easier, since the actual thing you need to send someone is tiny. Also, at least
on unixes I have not run into a box in about 6 years or so that did not have
Python and Perl installed, so in practice I have not run into that problem.
Indeed, you can do that just as easily on Windows, too. The point of a
stand-alone executable is not to make running the script easier, but to
make distribution easier. Users don't need to install Python to run a
Python script if it's a stand-alone executable.

Overall it would be better if there was an easy way on Windows to get the
runtime installed, since then you can send users far smaller files and smaller
updates, and it makes it easier for people to patch their systems. I have seen
more than a few cases where a bug like temp file creation was found to be a
problem in Python and in some C code. However, the difference is that you can
update the Python runtime and all affected Python programs are fixed. The
same is not true of the C versions.

One of them I have run into which is a pain is stuff like openssl. When that
gets updated it seems a whole bunch of programs have to be recompiled to work
with it again. The change is source compatible, but for whatever reason the
bug fix breaks binary compatibility for a number of apps. Just update the
runtime, though, for things like Python, Java, etc., and all apps on those
runtimes just become fixed.
 

Christopher T King

So install the runtime. If you want to run the .NET stuff you need to
have the .NET CLR or Mono installed. If you want to run Java apps
you need the JVM etc.

And if you want to run Python scripts you need the Python interpreter
installed.
Overall once a runtime is installed it makes distributing apps a lot
easier since the actual thing you need to send someone is tiny.

Oftentimes users will only have one Python app. They'd much rather
download a 2 MB ZIP file and dump it somewhere than download a 20 MB Python
distribution and install it somewhere, download X MB of extension modules
needed by the app (e.g. PIL and numarray, to name a few), and then finally
download and install your script.
Also at least on unixes I have not run into a box in about 6 years or so
that did not have python and perl installed so in practice I have not
run into that problem.

And hence the lack of an executablization program for Unix.
Overall it would be better if there was an easy way on windows to get the
runtime installed since then you can send users far smaller files, smaller
updates and it makes it easier for people to patch their systems.

True. I don't see that happening anytime soon, though.
I have seen more than a few cases where a bug like temp file creation
was found to be a problem in Python and in some C code. However the
difference is that you can update the Python runtime and all affected
Python programs are fixed. The same is not true of the C versions.

What? C programs use a runtime library, just the same as any other
language. Google for "libc.so" or "msvcrt.dll" if you don't believe me.
One of them I have run into which is a pain is stuff like openssl. When that
gets updated it seems a whole bunch of programs have to be recompiled to work
with it again. The change is source compatible but for whatever reason the
bug fix breaks binary compatibility on a number of apps.

That sounds like an openssl-specific problem, perhaps relating to
configuration issues. Since C libraries are linked dynamically, source
compatibility inherently translates to binary compatibility, assuming
functions were not moved to different libraries or rewritten as macros (or
vice-versa).

In a perfect world, all OSes would use proper package management systems,
and single-executable programs would not be needed. Unfortunately, this
isn't true.
 

Nick Patavalis

Why is there a need for a stand alone executable? At least on all the unixes
whether something is executable is just determined by the executable bit on
the file. I can execute a python program just as transparently as one in
compiled c, c++, etc. I really don't see the point of that.

Perhaps your target system has no Python environment installed. And
perhaps it has no resources to have a complete Python environment
installed (apart from the fact that it might not need one). Don't
think of your 2GHz / 512MB desktop. Think of your cell-phone.
 

kosh

What? C programs use a runtime library, just the same as any other
language. Google for "libc.so" or "msvcrt.dll" if you don't believe me.

I do know that C programs can be dynamically linked, and most of the time on
unixes they seem to be. But it seems to be far too common that the library
changes in some way that requires that the program be recompiled. I have seen
it with both KDE and GNOME, where just recompiling them with no changes to
their code at all fixed library problems that were being reported.

I have never seen that kind of thing in python.
That sounds like an openssl-specific problem, perhaps relating to
configuration issues. Since C libraries are linked dynamically, source
compatibility inherently translates to binary compatibility, assuming
functions were not moved to different libraries or rewritten as macros (or
vice-versa).

I don't know why they break. I know that they do.
In a perfect world, all OSes would use proper package management systems,
and single-executable programs would not be needed. Unfortunately, this
isn't true.

Well, in my world everything is written for unixes and deployed on unixes, and
most of them are Linux boxes, which have good package systems.
 

kosh

Perhaps your target system has no Python environment installed. And
perhaps it has no resources to have a complete Python environment
installed (apart from the fact that it might not need one). Don't
think of your 2GHz / 512MB desktop. Think of your cell-phone.

Cell phones should be cell phones, not multifunction devices that can run all
kinds of apps, need virus scanners, etc. I don't want my cell phone to run
Python, Java, Ruby, C#, etc. I want it to just be a telephone and do that
job well. Most of the modern cell phones are crap; if you want all of that
stuff, get a PDA and get a virus scanner for it.
 

Michael Scarlett

Anthony Baxter said:
[ pie decorators ]
Am I the only one with a
visceral reaction to this thing???

So did you have a similar reaction on first hitting the indentation for
blocks? I know I dimly recall thinking that this was very strange and
horrible (dimly, because it was 1992 or 1993).


One of the first books I read on Python was Magnus Lie Hetland's
Practical Python.
http://hetland.org/writing/practical-python/
In the introduction he had a few quotes:
"A C program is like a fast dance on a newly waxed floor by people
carrying razors"

"C++: hard to learn and built to stay that way"

"Java is, in many ways, C++"

"And now for something completely different....."

The last was his intro to learning Python. When the TRS-80 was out
from Radio Shack, I used to code in BASIC on it. I lost interest in
computers and only picked it up a few years ago. I investigated Python
and fell in love with the language: its elegance, its simplicity (at
least for the programmer), and its sheer delight to code in. I don't
work in the IT field, and programming isn't my bread and butter. But
just because it's so fun, Python brought me back into computers - I've
created a few websites with Python on the back end, manipulated
files, and done a few other projects out of intellectual curiosity and
to make my day-to-day work and home life easier. As a result of learning
and coding in Python, I wanted to learn more about it, and so I turned to
C, and am now actively learning it simply to learn how to integrate
with and build on Python. I tolerate C's ugliness because I know the end
result is a labour of love when I can work with it and Python. Silly
maybe, but I'm going on emotion here, not logic. Python is simply fun
to code in, and when it's fun you're more productive and excited to learn
and tackle new problems, because you're not bogged down in remembering how
the increment (++) operator works for pointers in a particular function
that's supposed to dynamically allocate memory, terminating every
statement with a ";", or manipulating fgets to discard the '\n'. Yada
yada yada...
The point is Python just works.
Someone once said Python is runnable pseudocode. And it works, and
works well at that. I couldn't agree more. I think and then I code.
Simple.

Addendum: I've had this article bookmarked for some time because once
you read it, you have to wonder whether they are talking about Python,
because you realise Python is there already. For those of you interested,
read it and comment on it:

http://archive.gamespy.com/legacy/articles/devweek_b.shtm

That's my $0.02
 

Nick Patavalis

No - but I still don't think they reflect anywhere near a majority of
the development that goes on.

Yes, the majority of development goes into little *glue programs* that
take data from a database and format it as XML/HTML, or aggregate and
analyze data stored in a database, and stuff like that. But for all
these to be possible, a massive amount of *infrastructure* is
required. And this infrastructure cannot be created in Python. So you
are not saying that Python isn't a glue language, but that the greatest
percentage of development that currently goes on *is* glue-stuff
development.

This of course presupposes that the infrastructure *is* available,
that it is stable, and that it doesn't have to be modified or
augmented. For me a "primary" language is not the language in which
you develop most of the software, but the language in which you
develop the current and future software
*infrastructure*. Quantitatively most of the software is glue-stuff
anyway!

Put yourself in this position: It's a few years ago (say 1998 or 1999),
and no graphical web-browser exists for Linux. You are planning to
develop the "iso-standard" web-browser for this operating
system. Would you do it in Python? Remember that no HTML parsers
exist, no decent HTML renderers, the GUI toolkit is more or less
primitive, and the low-end desktops run at about 200-something
MHz. You might argue "this is not the case today", but how can you
preclude that *similar* challenges do not occur today, or will not
occur in the future? Are you saying that all the computationally hard
problems have already been solved? Or are you saying that, as a Python
programmer, you don't want to deal with them? Another example: It's
2004 again, and you decide to scrap and replace the age-old X11 window
system; do away with it and start from scratch. Build a modern
windowing system: 3D all over, fully network transparent, with widget
support on the server side, a fully object-oriented interface, and so
on. How much of it would you be able to code in Python? How much
*more* would you rather be able to code in Python?

/npat
 

Nick Patavalis

I don't want my cell phone to run python, java, ruby, c# etc etc. I
want it to just be a telephone and do that job well.

I understand. You want a nice analog cell-phone, with a large rotary
dial and a very long cord. Sorry, but resistors, capacitors, and diodes
can only go so far. For everything else you need large clusters of
transistors (integrated circuits, they are called by some) and a lot
of them need (God forbid!) "software".

/npat

P.S. I *have* to sign with this :)
 
