I'd take that as consistent with my previously mentioned observation
that most programmers paid little attention to portability. I failed to
mention the qualifier, but my observations at the time were solely
within the US. Not until a bit later did I get any exposure to work
across the pond. In the environments I mostly saw, there was only a
single machine at your place of employment, and that machine was likely
to be the only one you used for somewhere close to a decade, with
software compatibility being a big argument for replacing it with
another from the same vendor when the decade was up.
Yes, this was really the point that I was trying to make before. I
think most programmers up through the '70s (in the US; that was my
only experience at the time) worked on only a single machine. It
might have been an IBM shop, or a Univac shop, or a CDC shop, or
whatever, but that was the hardware that was available to a given
programmer, and his job was to squeeze everything out of it that he
could for his set of applications. That often meant using
machine-specific extensions for things like bit operators, access
to some hardware widget or I/O device, or operations on character
strings, and so on. Given the choice between slow portable code and
fast machine-specific code, the pressure always seemed to be toward
the latter.
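Something like the following fragment shows the flavor of those
extensions; the DEC/VAX-style integer .AND. is just one example from
the era (other vendors spelled it differently), and the portable
intrinsic shown is from MIL-STD-1753, later standardized in f90:

      PROGRAM BITS
      INTEGER WORD, MASK, IRES
      WORD = 12
      MASK = 10
C     Common nonstandard extension of the era (DEC/VAX style):
C     logical operators applied directly to integers --
C        IRES = WORD .AND. MASK
C     Portable spelling, via the MIL-STD-1753 bit intrinsics:
      IRES = IAND(WORD, MASK)
      PRINT *, IRES
      END

The nonstandard line is left as a comment so the fragment compiles
as-is with a standard compiler.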
Then in the early '80s, when some of the new hardware became
available, such as Cray vector processors, some programmers kept
this same mindset. I've seen rewrites of electronic structure codes
for Cray machines that would have had almost no chance of compiling
on any other hardware. Every other line of code seemed to have some
kind of special vector operator or something in it. I remember
seeing someone multiply an integer by ten by doing a
shift2-add-shift2 sequence (or maybe it was shift4-shift2-add, I
forget; one correct such sequence is sketched below) because he had
counted clock cycles and determined that that was the best way to do
it with that version of the hardware and compiler.

But as more and more hardware became available, and it became
necessary to port your codes quickly from one machine to another, or
to be able to run your code simultaneously on multiple combinations
of hardware and software, this kind of coding died out pretty
quickly. Low-level computational kernels, such as the BLAS, were
still done that way, but the higher-level code was written to be
portable, even at the cost of an extra machine cycle here and there
if necessary. All of this was also driven by the need to work with
collaborators who were using different hardware than yours, by the
need to access and contribute to software libraries such as netlib
that were used on a wide range of hardware, and by network access to
remote machines at the various NSF/DOE/DoD supercomputer centers
around the country.
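For what it's worth, one sequence that does compute ten times an
integer with two shifts and an add is shift-left-2, add,
shift-left-1, since 10*i = (4*i + i)*2. A minimal sketch using the
MIL-STD-1753 ISHFT intrinsic (whether this matches the exact
sequence that programmer used, I can't say):

      PROGRAM TEN
      INTEGER I, J
      I = 7
C     10*I computed as (I*4 + I)*2, with left shifts standing in
C     for the (then relatively slow) integer multiply:
      J = ISHFT(ISHFT(I, 2) + I, 1)
      PRINT *, J
      END

Of course, a modern compiler will do this sort of strength reduction
itself, which is exactly why the hand-coded version stopped paying
its way.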
In the old environment a programmer might take several months or
years to optimize some code for a specific machine, and then that
code might be used for a decade. In the new environment, the code
needed to be ported in a matter of days or weeks, and used for a few
months, at which time the machine might be replaced with new
hardware, or your NSF allocation would expire, or whatever. It was
this newer environment (at least in the US) that I think drove
programmers toward writing portable code, and in many ways that
meant conforming to the fortran standard.
There were some exceptions to this, of course, which have been
discussed often here in clf. One of them was the practical
observation that the nonstandard REAL*8 declarations were more
portable in many situations than the standard REAL and DOUBLE
PRECISION declarations. (The storage sizes of REAL and DOUBLE
PRECISION varied from machine to machine; on a Cray, where default
REAL was already 64 bits, DOUBLE PRECISION meant slow 128-bit
arithmetic, while REAL*8 named the 64-bit storage you actually
wanted.) This was an example of the standard actually inhibiting
portability rather than promoting it. The KINDs finally introduced
in f90 solved this dilemma, but that is probably a part of f90 that
should have been included in a smaller revision to the standard in
the early '80s rather than a decade later. There may be other
examples of this, but this is the only one that comes to mind where
I purposely avoided the standard syntax and chose instead the common
nonstandard extension.
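To make that concrete, here is a minimal sketch of the f90
replacement; the kind parameter name wp and the 12-digit precision
request are my choices for illustration, not anything mandated:

   program kinds
      ! Pre-f90, the common nonstandard but widely supported form was
      !    real*8 x
      ! The standard f90 form requests the precision and lets the
      ! compiler pick the kind, whether DOUBLE PRECISION is a distinct
      ! hardware type (IBM, VAX) or default REAL is already 64 bits
      ! (Cray):
      integer, parameter :: wp = selected_real_kind(12)
      real(wp) :: x
      x = 1.0_wp / 3.0_wp
      print *, x
   end program kinds

One declaration to change in one place, and the same source then
compiles with the intended precision everywhere.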
Then when f90 was finally adopted (first by ISO, then force-fed to
the foot-dragging ANSI committee), this was one of the first
features that I incorporated into my codes. I even wrote some sed
and perl scripts to help automate these conversions, violating my
"if it ain't broke, don't fix it" guiding principle of code
maintenance.
$.02 -Ron Shepard