:This is progress, the language is dynamic, always changing and progressing.
:Without this, we end up programming by decade old rules on machines far more
:capable than the ones the rules were written for; in effect holding back
:progress in the very industry we support.
That is more or less BS. The last I read, researchers still aren't able
to hold more than 3 quantum transistors stable simultaneously -- so
unless you want to count DNA programming (which does actually work,
if you are willing to get into wetware), then you are still firmly in
the realm of Turing Completeness, which can be summarized as
"Anything that can be computed in one sufficiently powerful logic
system can be computed in any other sufficiently powerful logic system."
"Machines" are NOT "far more capable" than 20 years ago: they are only
far -faster- (and have some interesting extensions to their I/O interfaces.)
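To make the equivalence point concrete, here is a throwaway
illustration (mine, not anything from the research): the same function
computed in two structurally different styles. Iteration and recursion
look nothing alike as "logic systems", yet neither can compute anything
the other can't.

#include <stdio.h>

/* Illustrative only: factorial by iteration -- state mutated in a loop. */
static unsigned long fact_loop(unsigned n)
{
    unsigned long r = 1;
    while (n > 1)
        r *= n--;
    return r;
}

/* The same function by recursion -- no mutation, just self-application. */
static unsigned long fact_rec(unsigned n)
{
    return n > 1 ? n * fact_rec(n - 1) : 1;
}

int main(void)
{
    for (unsigned n = 0; n <= 10; n++)
        printf("%2u! = %8lu (loop) = %8lu (recursion)\n",
               n, fact_loop(n), fact_rec(n));
    return 0;
}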
:Hardware reaches new generations
:in a matter of months,
*Snort*. Continents reach new positions "in a matter of years".
Read the durned press releases about inventions like new transistors
and computing on amorphous silicon: the usual timeline to take them
out of the lab into *any* product is a minimum of 5 years, and the
usual timeline given before they are expected to find their way to
any kind of consumer product is a decade.
It's the old story about the singer who toils in obscurity for
25 years before becoming "an overnight success". What you see is the
Pop Culture of electronics, not the Practice! Practice! Practice!
that precedes it.
:many programming languages are still essentially
:where they were in the early 80s. Where software once drove hardware,
:demanding progress, we are now approaching a point where the situation is
:reversing... hardware is now driving software. It would be a mistake to not
:keep up.
This is known as "The Software Deficit". Hardware -is- getting faster
(but with much, much longer lead times than you are observing),
and it is true that software is not keeping pace. This is a significant
concern to some very bright people and is being actively investigated.
It is, though, not entirely clear to me that the situation is capable
of a meaningful solution. People thought they had the answer with
"4GL", Fourth Generation Programming Languages, which were supposed to be
simple enough that "even managers" could write code, with all the
details being taken care of by the compiler. The result was, of course,
a lot of very bad code, and the recognition that large programs
*are* complex and that the best that was really being done was to remove
a constant factor from the complexity analysis, rather than reducing
the inherent complexity.
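To put rough numbers on it (a back-of-the-envelope of mine, not
anything from the 4GL literature): an application embodying 10,000
business rules at 10 lines of code each is 100,000 lines; a 4GL
needing only 1 line per rule leaves you 10,000 lines -- but still
10,000 rules to get right, test, and maintain. The constant factor
shrank tenfold; the inherent complexity, the rule count, didn't
move an inch.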
I have to ask whether you are perhaps expecting too much. In Newton's
time, people were able to create a machine that broke one tree limb
in two pieces over the course of hours -- long enough that one could
be there to catch half of it. After Watt's invention of the steam
engine, and after Ford's essential implementation of standardized
parts (based upon earlier proposals), we are now at the point that
we have machines that can grind an entire large tree into shavings
within a matter of minutes. Tree-destroying hardware has progressed.
Do you feel significant concern that Evolution hasn't seen fit to
speed up human reactions to the point where we can still catch
half of that output in our bare hands over a matter of minutes?
:Moreover, having worked in Delphi and done a test run with c++ I'm pretty
:sure OOP is not the way to do this. It's not better, it's just easier.
:Underlying all those fancy visual design tools and object methods is still a
:procedural core upon which the CPU itself relies. C still underlies C++,
:Pascal still underlies Delphi. That procedural languages are not
:progressing in the forefront of programming is, ummm, disappointing.
Well, I can tell you that adding a true String type to C would not have
made any significant difference towards that end.
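For the curious, a "true String type" grafted onto C might look
something like the sketch below -- my own hypothetical, not any real
proposal, with error checking omitted for brevity. Note that it is
still plain procedural code: more convenient, but with no new
expressive power anywhere in sight.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch: a length-counted string instead of a
 * NUL-terminated one. */
typedef struct {
    size_t len;    /* bytes in use                   */
    size_t cap;    /* bytes allocated in data        */
    char  *data;   /* NOT necessarily NUL-terminated */
} String;

static String str_from(const char *cstr)
{
    String s;
    s.len  = strlen(cstr);
    s.cap  = s.len;
    s.data = malloc(s.cap);
    memcpy(s.data, cstr, s.len);
    return s;
}

static void str_append(String *s, const String *t)
{
    if (s->len + t->len > s->cap) {
        s->cap  = (s->len + t->len) * 2;
        s->data = realloc(s->data, s->cap);
    }
    memcpy(s->data + s->len, t->data, t->len);
    s->len += t->len;
}

int main(void)
{
    String a = str_from("hello, ");
    String b = str_from("world");
    str_append(&a, &b);
    printf("%.*s\n", (int)a.len, a.data);
    free(a.data);
    free(b.data);
    return 0;
}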
Modern procedural languages do not have significantly more expressive
power? That's like saying that Predicate Calculus should be retired
because it hasn't advanced significantly since Goedel's Completeness
Theorem of about 1930. The essential logical elements required for
programming were invented a long time ago, and haven't advanced
significantly because they were shown to be logically sufficient. It's
like the transition from DFA (Deterministic Finite Automata) to NDFA
(Non-Deterministic Finite Automata) -- it might look wonderful on the
surface, but it turns out to be able to express only the same things after
all. For software to "advance" markedly would require that something be
discovered that could not be computed with what we have now -- the
equivalent of finding that we have to go to LALR(1) grammars. Except
that Goedel and Turing and Church have shown that there *isn't*
anything like that waiting out there for us, not until we get into
either infinities (traditional computing is not very good at producing
infinite answers in a finite time), or into non-deterministic
computations.
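To make the DFA/NDFA point concrete, here is a toy of mine (not
anything out of the theory texts): a three-state NDFA for the language
(a|b)*ab, simulated by tracking the *set* of currently-active states
in a bitmask. Each distinct bitmask is precisely one state of the
equivalent DFA -- the subset construction done on the fly, and the
reason the non-determinism buys no new languages.

#include <stdio.h>

/* NDFA for (a|b)*ab over alphabet {a,b}:
 * state 0 (bit 1u) loops on a,b and may also move to state 1 on 'a';
 * state 1 (bit 2u) moves to state 2 on 'b'; state 2 (bit 4u) accepts. */
static unsigned step(unsigned states, char c)
{
    unsigned next = 0;
    if (states & 1u) {              /* state 0 active */
        if (c == 'a' || c == 'b')
            next |= 1u;             /* self-loop */
        if (c == 'a')
            next |= 2u;             /* guess: the final "ab" starts here */
    }
    if ((states & 2u) && c == 'b')  /* state 1 active */
        next |= 4u;
    return next;
}

int main(void)
{
    const char *tests[] = { "ab", "abaab", "ba", "aab" };
    for (int i = 0; i < 4; i++) {
        unsigned states = 1u;       /* start: state 0 only */
        for (const char *p = tests[i]; *p; p++)
            states = step(states, *p);
        printf("%-6s %s\n", tests[i],
               (states & 4u) ? "accepted" : "rejected");
    }
    return 0;
}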
Do you know what the most common solution is for "fixing"
"The Software Deficeit" ? It is producing standardized library
toolkits to act as "building blocks" for programs, so that
people can fit together standardized, debugged, well-understood
tools instead of having to write from scratch so much.
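The canonical small example -- my own, picked for familiarity -- is
the C library's qsort(): a debugged, standardized sort that you bolt
a comparison function onto instead of writing the loop for the
ten-thousandth time.

#include <stdio.h>
#include <stdlib.h>

/* Comparison callback: the only part the programmer writes. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);   /* avoids the overflow of x - y */
}

int main(void)
{
    int v[] = { 42, 7, 19, 3, 88 };
    size_t n = sizeof v / sizeof v[0];

    qsort(v, n, sizeof v[0], cmp_int);  /* the standardized block */

    for (size_t i = 0; i < n; i++)
        printf("%d ", v[i]);
    putchar('\n');
    return 0;
}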
And the odd thing about the mass toolkit/building-block
approach is that it relies *heavily* upon standardization,
upon nailing down every last behaviour and saying "Things will
mean this one thing and this one thing only". Which is the
opposite of what you are advocating -- you want evolution
through the gradual adoption of non-standard components,
whose meaning and API gets defined by historical accident and
historical pressure rather than being rigorously defined by
a group dedicated to producing precise and logically coherent
foundations.
The "evolution and common adoption by the masses" approach has
already been tried: that's a capsule summary of the Microsoft
Windows family of operating systems ("You can't make us publish
API's, that would reduce our capacity to innovate!"). The foibles
of MS Windows are legendary (but unfortunately, not mythical.)
If the future of computing is more of the same, more of MS Windows'
way of operating, then IMHO computing would be in -serious- trouble.