Fortran vs Python - Newbie Question


Nomad.C

So, after reading much of the animated debate here, I think few would
suggest that Python is going to be faster than FORTRAN when it comes to raw
execution speed. Numeric and SciPy are Python modules that are geared
towards numerical computing and can give substantial performance gains over
plain Python.

A reasonable approach (which has already been hinted at here) is to try
to have the best of both worlds by mixing Python and FORTRAN - doing most of
the logic and support code in Python and writing the raw computing routines
in FORTRAN. Another reasonable approach is to simply make your application
work in Python, then use profiling to identify which parts are slowest and
move those parts into a compiled language such as FORTRAN or C if overall
performance is not fast enough. Unless your number crunching project is
truly massive, you may find that Python is a lot faster than you thought and
may be plenty fast enough on its own.
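The profile-first workflow described above can be sketched with Python's built-in cProfile and pstats modules. The function names here (slow_part, application) are made up for illustration; the point is that the sorted statistics tell you which routines are worth moving to a compiled language.

```python
import cProfile
import io
import pstats

def slow_part(n):
    # Stand-in for a numerical kernel; in a real project this would be
    # the routine you might later rewrite in Fortran or C.
    total = 0.0
    for i in range(n):
        total += i * i
    return total

def application():
    # Most of the "logic and support code" stays in plain Python.
    return sum(slow_part(10000) for _ in range(50))

profiler = cProfile.Profile()
profiler.enable()
result = application()
profiler.disable()

# Print functions sorted by cumulative time; the top entries are the
# only candidates worth moving to a compiled language.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
print(stream.getvalue())
```

Only if slow_part dominates the report does rewriting it in Fortran or C make sense.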

So, there is a tradeoff of resources between development time, execution
time, readability, understandability, maintainability, etc.

psyco is a module I haven't seen mentioned here - I don't know a lot
about it, but have seen substantial increases in performance in what little
I have used it. My understanding is that it produces multiple versions of
functions tuned to particular data types, thus gaining some advantage over
the default, untyped bytecode Python would normally produce. You can think
of it as a JIT compiler for Python (but that's not quite what it is doing).
The home page for that module is here: http://psyco.sourceforge.net/
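Psyco's documented entry points included psyco.full() (compile everything) and psyco.bind() (compile a specific function). A minimal sketch, with the import guarded since psyco only ever worked on 32-bit Python 2 interpreters:

```python
def dot(xs, ys):
    # A plain-Python inner loop of the kind psyco could specialize
    # per argument type.
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

try:
    import psyco
    psyco.bind(dot)  # generate type-specialized machine code for dot()
except ImportError:
    # psyco unavailable: the function still runs, just unaccelerated.
    pass

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```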

Hope that helps,
-ej

Ok
Thank you all for giving a little insight into what Python can
actually do. I think I've read enough to convince me that Python is
generally a very flexible, fast, powerful language that can be used in
a wide variety of applications, instead of focusing on numerical
functions like Fortran does.
Thanks again!
 

Steven D'Aprano

Fortran also appears to be a compiled language, whereas Python is an
interpreted language.

Sheesh. Do Java developers go around telling everybody that Java is an
interpreted language? I don't think so.

What do you think the "c" in ".pyc" files stands for? "Cheese"?
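Steven's point that CPython compiles source before running it is easy to observe from Python itself: the compile() built-in exposes the same step that produces .pyc files.

```python
# CPython compiles source text to a bytecode "code object" before
# executing it; .pyc files are just these objects cached on disk.
code = compile("x = 2 + 3", "<example>", "exec")
namespace = {}
exec(code, namespace)
print(namespace["x"])       # 5
print(type(code).__name__)  # code
```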
 

Tal Einat

OK...
I've been told that both Fortran and Python are easy to read, and are
quite useful in creating scientific apps for number crunching, but
that Python is a tad slower than Fortran because of its high-level
nature, so what are the advantages of using Python for
creating number-crunching apps over Fortran?
Thanks
Chris

Personally, my #1 reason for favoring Python is the interpreter. When
developing code which implements complex algorithms/calculations, I
always find I want to play around with things during development.
Using Python's interpreter this is a real joy!

Now, I don't mean tweaking the almost-finalized implementation - I
mean tweaking bits and pieces of code during all stages of
development. For instance: Optimizing an algorithm's parameters by
trying different combinations out under different circumstances. Or
checking the potential gains from pre-processing something. Or writing
a specialized version of a certain function and testing it for
correctness and performance. Etc, etc.

As for my background - I have developed "number-crunching" software
with C, C++ and Python over the past 5 years, and have recently
learned Fortran as well. Python has consistently failed to disappoint
me :)

- Tal Einat
reduce(lambda m,x:[m+s[-1] for i,s in enumerate(sorted(m))],
[[chr(154-ord(c)) for c in '.&-&,l.Z95193+179-']]*18)[3]
 

Beliavsky

While I have never personally dealt with Fortran, I looked it up here:

http://en.wikipedia.org/wiki/Fortran_code_examples

The code examples speak for themselves. I recommend you look those
over and then look over a similar simple program written in Python.

Code written using the features of modern Fortran (90 or later),
including free source form, looks better than code written in Fortran
66 or 77. Apart from Fortran using keywords rather than indentation (as
in Python) to terminate blocks, properly indented Fortran does not look
much different from Python. Neither language requires curly braces to
delimit blocks or semicolons to terminate lines.

It does not make sense to argue against programming in Fortran 95 now
because of limitations that were removed long ago.
 

sturlamolden

Thank you all for giving a little insight into what Python can
actually do. I think I've read enough to convince me that Python is
generally a very flexible, fast, powerful language that can be used in
a wide variety of applications, instead of focusing on numerical
functions like Fortran does.

I wouldn't go as far as to say Python is a 'fast' language. But it's
certainly fun (particularly that), powerful, readable, and useful,
even for numerical computing.

Speed in numerical computing is a strange beast. First, a small
bottleneck in your code is likely to dominate the execution time. But
very often it is impossible to guess where it is; common sense usually
doesn't help, for some strange reason. You can use the Python profiler
to identify the real bottlenecks, but you may be surprised when you
find where they are. These bottlenecks are the only portion of the
code that can benefit from being moved to Fortran or C. The
bottlenecks will dominate regardless of what you do to the rest of your
code.

You may gain nothing from moving the bottlenecks to C or Fortran.
Don't let that surprise you. Python may be doing the work as fast as
it can be done. It may actually be the case that moving this code to C
or Fortran causes a slowdown. Python may be calling hand-tuned C,
assembly or Fortran code that you cannot match. You can hand code a
sorting routine in C, but will you be able to beat Python's 'timsort'?
Probably not! Timsort is one of the fastest stable sorting
routines known to man. It has taken years to develop and fine tune,
and you simply cannot beat that, even if you spend hours on end
optimizing your code. The same holds for the hashing routines in
Python's dictionaries. You can write a hashtable in C, but I doubt it
can match the speed of Python's dictionaries. And then there are files,
strings, network sockets and protocols, access to databases,
encryption, etc. Python's libraries are tuned by people who knew what
they were doing and had the time to do the fiddling. Most likely you
don't (well, I certainly do not). Finally, you will most likely use
numerical libraries like LAPACK and BLAS. These are written in Fortran
or C anyway. Whether you call the libraries from Fortran or Python may
not matter very much! In the end, code that can really benefit
from being moved to C or Fortran is rare to come by.
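The timsort claim above is easy to demonstrate: Python's built-in sort is timsort, and its stability means records that compare equal keep their original relative order.

```python
# Python's built-in sort is timsort: stable, so items with equal keys
# keep their original relative order.
records = [("apple", 3), ("pear", 1), ("plum", 3), ("fig", 1)]
by_count = sorted(records, key=lambda r: r[1])
print(by_count)
# Equal counts preserve input order: pear before fig, apple before plum.
```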

Python is a very high-level language. That means there are certain
things that put constraints on the attainable speed. Most importantly:
keep the number of interpreter evals as low as possible. If you
make a for loop, the interpreter may evaluate the lines within the
loop several times. Languages like C, C++ and Java really teach you
bad habits when it comes to an interpreted language like Python. In
these languages, loops are almost free. In Python, they may be very
expensive as the interpreter is invoked multiple times. But if you can
'vectorize' the loop statement into a one-liner (or a few lines) using
slicing, list comprehensions or functional programming (e.g. the
lambda, map and filter intrinsics), the overhead will be very small.
That is the secret to getting high performance from a language like
Python. If you can get away from the C++ habit of using for loops for
any kind of iteration, Python may not feel sluggish at all.
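The 'vectorizing' advice above, in pure Python, looks like this: the explicit loop, the list comprehension, and the map form all compute the same thing, but in the latter two the iteration machinery runs inside C-implemented code paths.

```python
data = list(range(100000))

# C/C++-style explicit loop: bytecode is dispatched every iteration.
squares_loop = []
for x in data:
    squares_loop.append(x * x)

# "Vectorized" forms: the loop runs inside the interpreter's C code.
squares_comp = [x * x for x in data]
squares_map = list(map(lambda x: x * x, data))

print(squares_loop == squares_comp == squares_map)  # True
```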

In the end, the main determinant of performance in numerical tasks is
really big-O, not the language you prefer to use. Choice of algorithm
is far more important than the Python vs. Fortran issue!
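A small illustration of the big-O point: switching the data structure (and thus the algorithm) matters far more than the language. Membership testing in a list is O(n); in a set it is O(1) on average. The timings below are rough sketches and vary by machine.

```python
import timeit

n = 10000
haystack_list = list(range(n))
haystack_set = set(haystack_list)
needle = n - 1  # worst case for the linear scan through the list

# Same question asked of two data structures: O(n) scan vs O(1) hash lookup.
t_list = timeit.timeit(lambda: needle in haystack_list, number=1000)
t_set = timeit.timeit(lambda: needle in haystack_set, number=1000)
print("list: %.4fs  set: %.4fs" % (t_list, t_set))
```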
 

Beliavsky

You can get the speed of fortran in Python by using libraries like
Numeric without losing the readability of Python.

Numeric and Numpy will be faster than raw Python for array operations,
but I don't think they will match well-written C or Fortran, because
compilers can better optimize code in those "less dynamic" languages.

Someone recently mentioned here ("Quantum term project code- for
interest and for help") an example where the code was 180 times faster
in Fortran 95 using gfortran than in Python with Numeric, asking for
advice on how to speed up the Python code -- see
https://wiki.asu.edu/phy416/index.php/A_Simple_Bose-Einstein_Condensate_Simulation
. No one replied.
 

Beliavsky

As said by others, "Portability, scalability & RAD" as advantages of
Python are probably far more important.

All of those claimed advantages can be debated, although they may
exist for some tasks.

(1) Portability. Fortran has been run on hundreds if not thousands of
platforms since 1957. People who value portability often want
assurance that their code will be supported by compilers/interpreters
produced in the future. Standard-conforming Fortran 95 code is
conforming Fortran 2003 code, and the standards committee has decided
not to remove features in future versions. Python 3 is still somewhat
up in the air, and it will NOT be backward compatible with Python 2.x,
although migration tools will be provided.

(2) Scalability. If talking about parallel computing, the widely used
OpenMP Application Program Interface (API) supports multi-platform
shared-memory parallel programming only in C/C++ and Fortran. In
general, high performance computing is done in C, C++, and Fortran.

(3) RAD. Scripting programs WILL be faster to write in Python, because
of duck typing, the many built-in data structures, and other features.
For larger programs, a Fortran (or C++ or Java) compiler will catch
some errors at compile time that are caught only at run time in
Python, perhaps after considerable time has elapsed. Furthermore, the
higher speed of Fortran may mean that the time between program runs is
1 minute vs. 10 minutes in the corresponding Python program. This can
speed the development cycle.
 

Steven D'Aprano

Python is a very high-level language. That means there are certain
things that put constraints on the attainable speed. Most importantly:
keep the number of interpreter evals as low as possible. If you
make a for loop, the interpreter may evaluate the lines within the
loop several times. Languages like C, C++ and Java really teach you
bad habits when it comes to an interpreted language like Python. In
these languages, loops are almost free. In Python, they may be very
expensive as the interpreter is invoked multiple times.

Jeez-Louise, this is why my blood boils when I read ignora^H^H^H^H
misguided people referring to Python as "interpreted": it leads other
people to imagine that Python is interpreting lines over and over and over
again.

Is the interpreter invoked multiple times in a for loop? Let's find out.

Here is a function with a for loop:

def looper():
    x = 1
    for i in range(1000000):
        print x
        x += i
    return x


Let's look at the compiled code:
  2           0 LOAD_CONST               1 (1)
              3 STORE_FAST               0 (x)

  3           6 SETUP_LOOP              35 (to 44)
              9 LOAD_GLOBAL              0 (range)
             12 LOAD_CONST               2 (1000000)
             15 CALL_FUNCTION            1
             18 GET_ITER
        >>   19 FOR_ITER                21 (to 43)
             22 STORE_FAST               1 (i)

  4          25 LOAD_FAST                0 (x)
             28 PRINT_ITEM
             29 PRINT_NEWLINE

  5          30 LOAD_FAST                0 (x)
             33 LOAD_FAST                1 (i)
             36 INPLACE_ADD
             37 STORE_FAST               0 (x)
             40 JUMP_ABSOLUTE           19
        >>   43 POP_BLOCK

  6     >>   44 LOAD_FAST                0 (x)
             47 RETURN_VALUE

Certainly Python isn't interpreting the lines in the for loop one million
times. It interprets them once, compiles them once, and executes them one
million times.

But if you can
'vectorize' the loop statement into a one-liner (or a few lines) using
slicing, list comprehensions or functional programming (e.g. the
lambda, map and filter intrinsics), the overhead will be very small.

This is generally good advice because it moves the loop from
moderately fast Python code to very fast C code, not because Python is
interpreting each line over and over again.
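The difference Steven describes (a bytecode loop vs. a loop that runs in C) can be measured with the standard timeit module. This is a rough sketch; absolute numbers vary by machine, but the built-in sum, whose loop runs in C, reliably beats the bytecode loop.

```python
import timeit

setup = "data = list(range(10000))"

# Explicit Python loop: bytecode executed once per element.
loop_stmt = """
total = 0
for x in data:
    total += x
"""

t_loop = timeit.timeit(loop_stmt, setup=setup, number=200)
# Built-in sum(): the same loop, but running inside C.
t_builtin = timeit.timeit("sum(data)", setup=setup, number=200)
print("python loop: %.4fs   built-in sum: %.4fs" % (t_loop, t_builtin))
```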
 

Beliavsky

You couldn't be more incorrect. I have run some very old (pre-Fortran
77) programs that are very far from trivial.


This is a lie. I've seen some Fortran code that was hellspawned, and
some that was clear as glass. The latter came about without a "truly
insane amount of trouble".

I quite agree with you. People have written big, mission-critical
programs to run nuclear reactors and design planes in Fortran 66 and
77, although I hoped that they ran static analysis programs such as
FTNCHEK to verify them. Fortran 90 and later versions have modules,
whose use allows the compiler to check types in procedure calls. I
used Fortran 77 in my physics PhD program and find that I am much more
productive in Fortran 95 now, making fewer errors not caught at
compile time. The operations on arrays and array sections in Fortran
90+ help one to write number-crunching code that is "clear as glass".
 

Beliavsky

Yes.

Several, in fact--all available at no charge. The Python
world is different from what experience with Fortran might
lead you to expect.

Your experience with Fortran is dated -- see below.
I'll be more clear: Fortran itself is a distinguished
language with many meritorious implementations. It can be
costly, though, finding the implementation you want/need
for any specific environment.

Gfortran, which supports Fortran 95 and a little of Fortran 2003, is
part of GCC and is thus widely available. Binaries for g95, also based
on GCC, are available for more than a dozen platforms, including
Windows, Mac OS X, and Linux. I use both and consider only g95 mature,
but gfortran does produce faster programs. Intel's Fortran compilers
cost about $500 on Windows and Mac OS and $700 on Linux. It's not
free, but I would not call it costly for professional developers.

Speaking of money, gfortran and g95 have free manuals, the latter
available in six languages (http://ftp.g95.org/). Final drafts of
Fortran standards, identical to the official ISO standards, are freely
available. The manual for Numpy costs $40 per copy.
 

Alex Martelli

Cameron Laird said:
Yes and no. Alex, while useful scientific computing under Mac OS X
will almost certainly eventually involve installation of XCode and
so on, Nomad.C can start to learn Python without a need to install
ANYTHING. As you know, Python is already there, and the version that
comes with 10.4 (2.3.5, as nearly as I can easily tell) is easily
adequate to take him through the Tutorial (with minor exceptions).

I disagree: the Python 2.3.5 distributed by Apple as part of MacOSX 10.4
comes _without_ readline -- meaning that an up-arrow at the interactive
interpreter prompt gives Esc-[-A, a left-arrow gives Esc-[-D, etc,
instead of usefully recovering previous lines and allowing easy edit of
the current line. There is really no reason one should suffer through
that! python.org's 2.5 DMG (and, if I recall correctly, 2.4.4 as well)
comes with a working IDLE, which is even nicer to use, and at any rate a
working readline, so that one can easily correct minor mistakes made
entering code at the interactive interpreter.

XCode, or other add-ons, may surely come later, but downloading the 2.5
or 2.4.4 DMG from python.org is HIGHLY recommended anyway.

Also, while I, like you, am aware of no minimal-gcc package from
Apple, I think third parties make it available. However, I'm not
motivated enough at this point to track down the details. I think
Nomad.C should start with what he has under 10.4, and plan to move
on later to all of XCode.

10.4 does come with a version of XCode (on MacOSX's DVD, if not already
on disk in /Applications/Installers) -- though not the latest and
greatest, it's quite likely to be adequate for any learning whatsoever.
The bundled Python is a different issue, and I do _not_ consider it
adequate -- I'd suggest 2.5 instead, though 2.4.4 will be fine too.


Alex
 

Alex Martelli

I've been told that Both Fortran and Python are easy to read, and are
quite useful in creating scientific apps for the number crunching, but

Incidentally, and a bit outside what you asked: if your "number
crunching" involves anything beyond linear systems, run, don't walk, to
get Forman Acton's "Real Computing Made Real",
<http://www.amazon.com/Real-Computing-Made-Engineering-Calculations/dp/0486442217/ref=ed_oe_p/002-1610918-5308009>
-- the original hardback was
a great buy at $50, and now you can have the Dover paperback at $12.44
-- a _steal_, I tell you!

You don't need any programming background (and there's no code in the
book): you need good college-level maths, which is all the book uses (in
abundance). It doesn't teach you the stuff you can easily find on the
web, such as, say, Newton's method for the numerical solution of general
equations: it teaches you to _understand_ the problems you're solving,
the foibles of numerical computation, how to massage and better
condition your equations not to run into those foibles, when to apply
the classic methods (those you can easily find on the web) and to what
versions of your problems _and why_, etc, etc. It may be best followed
with a cheap calculator (and lots of graph paper, a pencil, and a good
eraser:), though Python+matplotlib will be OK and Fortran+some
equivalent plotting library should be fine too.


Let me just give you my favorite, _simplest_ example... say we want to
find the two values of x that solve the quadratic:
a x**2 + b x + c = 0

Looks like a primary/middle school problem, right? We all know:

x = ( -b +/- sqrt(b**2 - 4 a c) ) / (2 a)

so we verify that b**2 > 4 a c (to have two real solutions), take the
square root, then do the sum and the difference of the square root with
- b, and divide by 2 a. Simple and flawless, right?!

Whoa, says Acton! What if 4 a c is much smaller than b**2? Then that
sqrt will be very close to b -- and inevitably, depending on the sign of
b, either the sum or the difference will be a difference between two
numbers that are VERY close to each other.

Such operations are the foremost poison of numeric computation! When
you take the difference between numbers that are very close, you're
dropping significant digits all over the place: one of your roots will
be well computed, the other one may well be a disaster.

Solution: take EITHER the sum OR the difference -- specifically, the one
that is actually a sum, not a difference, depending on the sign of b.
That gives you one root with good precision. Then, exploit the fact
that the product of the roots is c/a - compute the other root by
dividing this constant by the root you've just computed, and the
precision will be just as good for the other root, too.
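Acton's trick can be written down in a few lines. This is a sketch of the stable formula described above (it assumes two real roots and a, b nonzero in the degenerate paths; the function name is mine):

```python
import math

def stable_quadratic_roots(a, b, c):
    """Roots of a*x**2 + b*x + c = 0, avoiding the cancellation that
    occurs when 4*a*c is much smaller than b**2 (assumes real roots)."""
    disc = b * b - 4.0 * a * c
    if disc < 0:
        raise ValueError("complex roots")
    sqrt_disc = math.sqrt(disc)
    # Take the combination that is a genuine sum, never a difference of
    # nearly equal numbers: -b and -sign(b)*sqrt_disc reinforce each other.
    q = -0.5 * (b + math.copysign(sqrt_disc, b))
    x1 = q / a   # the well-conditioned root
    x2 = c / q   # recovered from the product of the roots, x1*x2 = c/a
    return x1, x2

# x**2 + 1e8*x + 1 = 0 has roots near -1e8 and -1e-8; the naive formula
# loses most of the small root's digits to cancellation.
x1, x2 = stable_quadratic_roots(1.0, 1e8, 1.0)
print(x1, x2)
```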


Sure, this is a trick you expect to be already coded in any mathematical
library, in the function that solves quadratics for you -- exactly
because it's so simple and so general. But the key point is, far too
many people doing "number crunching" ignore even such elementary issues
-- and too many such issues are far too idiosyncratic to specific
equations, quadratures, etc, to be built into any mathematical library
you may happen to be using. Acton's book should be mandatory reading
for anybody who'll ever need to "crunch numbers"...!-)

(I've taught "Numerical Analysis" to undergrad-level engineers, and
while I know I've NOT done anywhere near as good a job as Acton does, even
forearming my students with 1/4th of the techniques so well covered in
the book did, in my opinion, make them better "computors", to borrow
Acton's term, than any of their peers... _I_ was never taught any of
this precious, indispensable stuff in college; I had to pick it all up
later, in the school of hard knocks!-)


Alex
 

Chris Smith

Carl said:
You couldn't be more incorrect. I have run some very old (pre-Fortran
77) programs that are very far from trivial.




This is a lie. I've seen some Fortran code that was hellspawned, and
some that was clear as glass. The latter came about without a "truly
insane amount of trouble".




Perhaps this is your impression because it's the only Fortran code
you've ever been exposed to?




I suspect you're speaking from a narrow perspective, because "number
crunching", as you define it, is still a problem of interest and
heavily researched. Maybe it's not in your field. Anyways, you seem
to be ignorant of the complexities of "simple operations over vast
arrays", as if you could accomplish these operations with a few
lines of Python and numpy. That might be true for your homework, but
good number-crunching codes often did a lot of stuff under the covers.
Hear hear. Python and Fortran both have their place. I'm a grad student
in Electromagnetics (radio frequency research) and I depend a lot on
"number crunching" to help me design everything from the latest and
greatest radar array to the coolest cell phone that will connect
anywhere. I actually use Python to speed up my development of code for
Fortran. I prototype some function that I want in Python and then use
the final draft of it in my Fortran code. What used to take several
hours I can do in less than one by leveraging both languages for what
they're good for: Python for RAD and Fortran for fast number crunching
of my research.

Chris
 

Cameron Laird

.
.
.
Your experience with Fortran is dated -- see below.


Gfortran, which supports Fortran 95 and a little of Fortran 2003, is
part of GCC and is thus widely available. Binaries for g95, also based
on GCC, are available for more than a dozen platforms, including
Windows, Mac OS X, and Linux. I use both and consider only g95 mature,
but gfortran does produce faster programs. Intel's Fortran compilers
cost about $500 on Windows and Mac OS and $700 on Linux. It's not
free, but I would not call it costly for professional developers.

Speaking of money, gfortran and g95 have free manuals, the latter
available in six languages
http://ftp.g95.org/ . Final drafts of Fortran standards, identical to
the official ISO standards, are freely available. The manual for Numpy
costs $40 per copy.

My experience with Fortran is indeed dated. However,
I still work with university groups that balk at $500
for valuable software--sometimes because of administrative
conflicts with licensing (example: the group
needs an educational license that fits its team
perfectly, but educational licenses have to be approved
by a campus-wide office that involves the group in
expenses uncovered by its grants, and ... complications
ensue). Intel's compiler, for example, is a great deal,
and recognized as a trivial expense sometimes--but
judged utterly impossible by a research group down a
different corridor.

My summary: practical success depends on specific
details, and specific details in the Fortran and Python
worlds differ.

Also, Beliavsky, thanks for your report on the pertinent
Fortran compilers. There *are* other proprietary Fortran
compilers extant; do you expect them to fade away,
leaving only g* and Intel, or are you simply remarking
on those two as the (intellectual) market leaders?
 

Mark Morss

So, after reading much of the animated debate here, I think few would
suggest that Python is going to be faster than FORTRAN when it comes to raw
execution speed. Numeric and SciPy are Python modules that are geared
towards numerical computing and can give substantial performance gains over
plain Python.

A reasonable approach (which has already been hinted at here) is to try
to have the best of both worlds by mixing Python and FORTRAN - doing most of
the logic and support code in Python and writing the raw computing routines
in FORTRAN. Another reasonable approach is to simply make your application
work in Python, then use profiling to identify which parts are slowest and
move those parts into a compiled language such as FORTRAN or C if overall
performance is not fast enough. Unless your number crunching project is
truly massive, you may find that Python is a lot faster than you thought and
may be plenty fast enough on its own.

So, there is a tradeoff of resources between development time, execution
time, readability, understandability, maintainability, etc.

psyco is a module I haven't seen mentioned here - I don't know a lot
about it, but have seen substantial increases in performance in what little
I have used it. My understanding is that it produces multiple versions of
functions tuned to particular data types, thus gaining some advantage over
the default, untyped bytecode Python would normally produce. You can think
of it as a JIT compiler for Python (but that's not quite what it is doing).
The home page for that module is here: http://psyco.sourceforge.net/

Hope that helps,
-ej

The question as originally framed was a little ignorant, of course.
Python and Fortran are by no means substitutes. Python is interpreted,
comprehensively interoperates with just about anything, and is
relatively slow. Fortran is compiled, interoperates with almost
nothing and is blindingly fast. So it is like a battle between an
elephant and a whale.

If there is possible substitution, and hence competition, it is
between Python+Numpy/Scipy on the one hand and Python+Fortran, via
F2PY, on the other. My personal taste is to do things in Fortran when
I can. It is really pretty simple to write well-structured, clear
code in Fortran 95, and I don't find it troublesome to compile before
I run. I don't find type declarations to be a nuisance; on the
contrary, I think they're very useful for good documentation. Also I
am somewhat mistrustful of Numpy/Scipy, because when I visit their
forums, almost all the chatter is about bugs and/or failure of some
function to work on some operating system. Perhaps I am wrong, but
Python+Numpy/Scipy looks a little unstable.

I understand that the purpose of Numpy/Scipy is to make it possible to
do large-scale numerical computation in Python (practically all
serious numerical computation these days is large-scale) without
paying too much of a penalty in speed (relative to doing the same
thing with a compiled language), but I have not yet been persuaded to
take the trouble to learn the special programming vocabulary,
essential tricks, and so forth, necessary for Numpy/Scipy when Fortran
is ready to hand, very well established, and definitely faster.

I do value Python very much for what it was designed for, and I do
plan eventually to hook some of my Fortran code to Python via F2PY, so
that I can get interoperability with spreadsheets, OLAP and the like on
the front and back ends of my information flow.

Maybe somebody reading this will be able to convince me to look again
at Numpy/Scipy, but for the time being I will continue to do my
serious numerical computation in Fortran.
 

Jaap Spies

Mark said:
Maybe somebody reading this will be able to convince me to look again
at Numpy/Scipy, but for the time being I will continue to do my
serious numerical computation in Fortran.

What I am missing in this discussion is a link to Pyrex to speed up
Python: Pyrex is almost Python with the speed of compiled C.
http://www.cosc.canterbury.ac.nz/greg.ewing/python/Pyrex/

Pyrex is adapted in SAGE (Software for Algebra and Geometry
Experimentation) as Sagex: http://modular.math.washington.edu/sage/

Jaap
 

Mark Morss

What I am missing in this discussion is a link to Pyrex to speed up
Python: Pyrex is almost Python with the speed of compiled C.
http://www.cosc.canterbury.ac.nz/greg.ewing/python/Pyrex/

Pyrex is adapted in SAGE (Software for Algebra and Geometry
Experimentation) as Sagex: http://modular.math.washington.edu/sage/

Jaap

Well, the discussion was about Python vs. Fortran, and Pyrex, as I
understand it, is a tool for linking C to Python. So I am not sure of
the relevance of Pyrex to this particular discussion. F2PY is the
leading tool for linking Fortran to Python, and I did mention that.
 

Erik Johnson

Sheesh. Do Java developers go around telling everybody that Java is an
interpreted language? I don't think so.

What do you think the "c" in ".pyc" files stands for? "Cheese"?

On the contrary... Sun is very careful to make sure you understand that Java
is *COMPILED*!
Remember, remember, always remember: Java is COMPILED! See that: the java
"compiler": javac. You have to call it explicitly when you build your Java
software so that it compiles Java source code (that way Java executes really
fast)!! (And don't forget, Java source is *compiled*, just like C++.)

What's a JVM? Why would you need one since Java is *compiled*, remember?

But seriously... I'm not a language or architecture guru. Is there any
real difference between a JVM and an interpreter? I mean, I have some
general feel that bytecode is a lower-level, more direct and more efficient
thing to be interpreting than Java or Python source, but at the bottom
level, you are still running an interpreter, which is going to be
(significantly?) more inefficient than executing native machine instructions
directly on the CPU, right?

Why is Python able to automatically compile source into bytecode on the
fly (when needed) but Java still forces you to do so explicitly?

I don't mean to bash Java - I think it has its place as well, but I
mean to note that Java is very carefully marketed, whereas Python's image is
not managed by a major, international corporation.
 

Marc 'BlackJack' Rintsch

Mark Morss said:
Well, the discussion was about Python vs. Fortran, and Pyrex, as I
understand it, is a tool for linking C to Python.

I think it's more than that. It's more a subset of Python with a little
static typing.

Ciao,
Marc 'BlackJack' Rintsch
 

Beliavsky

.
.
.






My experience with Fortran is indeed dated. However,
I still work with university groups that balk at $500
for valuable software--sometimes because of administrative
conflicts with licensing (example: the group
needs an educational license that fits its team
perfectly, but educational licenses have to be approved
by a campus-wide office that involves the group in
expenses uncovered by its grants, and ... complications
ensue). Intel's compiler, for example, is a great deal,
and recognized as a trivial expense sometimes--but
judged utterly impossible by a research group down a
different corridor.

My summary: practical success depends on specific
details, and specific details in the Fortran and Python
worlds differ.

Also, Beliavsky, thanks for your report on the pertinent
Fortran compilers. There *are* other proprietary Fortran
compilers extant; do you expect them to fade away,
leaving only g* and Intel, or are you simply remarking
on those two as the (intellectual) market leaders?

My point was that a few years ago an advantage of Python+Numeric or
Octave or R over Fortran was that the former let one work at a much
higher level, if one restricted oneself to using only free tools. The
creation of g95 and gfortran has changed that somewhat, and the
existence of commercial compilers is a plus, since they can surpass
the free compilers in performance (Intel), functionality (creating
Windows GUI programs entirely in Fortran, for example) or diagnosing
errors (NAG, Lahey/Fujitsu and Salford/Silverfrost). A research group
could purchase a single license of a commercial compiler to use in
nightly builds but use the free compilers for development.

Which commercial compilers will fade away? Decent free compilers will
hurt the market for mediocre commercial ones, which may explain the
demise of the compiler from NA Software. The Fortran 2003 standard
adds many new features to Fortran 95, thus making big demands on
vendors, and I have heard that Lahey and Salford will not be upgrading
their compilers to the new standard. The active vendors appear to be
Absoft, IBM, Intel, Pathscale, Portland, and Sun.
 
