Martin Krischik wrote:
[ ... ]
Did they? Or did they just implement some 80% of the new features?
My experience with C/C++ (and I have 10+ years of that) is that at
no time was a fully compliant C compiler available. There were
always a lot of compilers available that claimed to be
<small>almost</small> compliant - but never one which really was.
Partly because - unlike Ada (
http://en.wikipedia.org/wiki/ISO_18009)
- there is no official test suite for a C/C++ compiler and
runtime library. Such an official test suite would do C/C++ all sorts
of good.
I'm quite impressed with the statements above -- lots of Usenet posts
contain errors, but most of them are pretty obvious. You've managed to
fit more plausible-sounding errors into fewer sentences than nearly
any other post I've ever seen.
Let's address the Ada side first. Official Ada validation was done
under the auspices of NIST, which delegated this task to the Ada Joint
Program Office. The AJPO ceased to exist years ago, and the job was
never turned over to anybody else when that happened. Meanwhile, NIST
has discontinued _all_ of its compiler validation programs, not just
the Ada program. Currently, both the ISO standard and a number of FIPS
pubs _require_ Ada compilers to be officially validated, but at least
in the US, there is absolutely NO agency to do that.
The situation on the C side isn't nearly as different as you seem to
think. After the C standard was published, NIST and BSI (to name only
two that I know of with certainty) started doing validation of C
compilers. BSI certified at least one C implementation in 1990, the
same year the ISO C standard was approved. While that first one was for
code-checking, not production of executables, other implementations
(e.g. at least one version from Borland) were certified as well.
As mentioned above, NIST no longer validates/certifies compilers, so at
least in the US, there is no such thing as an officially validated
compiler for C, Ada, or any other language.
What do you mean by "exists today"? C99 is 5 years old and still no
compiler is available which supports all C99 features. "restrict" -
missing in MS C (even though restrict can be implemented as a
no-op) - VLA arrays (safety-critical feature) - missing in
MS C, buggy in GCC.
I take it you've never used or even tested the Comeau compiler?
Maybe, just maybe - if there really was any standard compiler available
- but there isn't - the C/C++ compiler vendors are always one
release behind the actual ISO standard.
I see Greg has entered the thread, so he can speak for himself, but
unless memory serves me particularly poorly today, he had C++ 2003
implemented _before_ it was officially approved.
In fairness, I should add that I found a _few_ defects in the Borland
compiler that was certified -- but back when I worked in Ada, I found
defects in every certified Ada compiler I used as well. In both cases
the defects were small and easily avoided, but the defects in the C
compilers were smaller and more easily avoided.
[ ... ]
It is true that a programming language needs some minimum feature set
to be useful. And that feature set is a lot larger than most believe.
If a successful language does not provide that set, it will be bolted
on later. If anything, the current C/C++ ISO standards clearly show
that the advocates for slim languages have been wrong all along.
More nonsense, I'm afraid. First of all, the argument is defective from
the beginning: changes in C and/or C++ absolutely cannot prove anything
about languages in general. Second, some extremely slim languages have
clearly been useful at times, easily disproving your conclusion on
minimum feature sets as well.
None of this proves, or even provides evidence for, anything directly
related to what is claimed by most advocates of smaller languages. Most have said,
for example, that all else being equal, a smaller feature set is easier
to understand completely. Now you may be convinced that a larger
feature set outweighs this fact, and you might even be right -- but
that doesn't make them wrong.
IMO, arguments about the benefits of "small" ("slim", etc.) vs. larger
languages mostly miss the point though. IMO, there's a much more
fundamental and useful distinction to be considered. That is the
distinction between those where the language itself is the major
player, and those where the language is mostly a way of producing and/or
using libraries.
Ada certainly provides facilities useful for writing libraries, but at
least to me seems to fall into the former group -- it works well for
writing code directly, but attempting to write good libraries in it
tends to be frustrating.
C++, while certainly providing some features useful for direct coding,
is strongly oriented toward the language providing facilities for
building libraries, and much (in many cases, MOST) of what the end-user
does is use the libraries.
Looking at things from this perspective, Ada may be the last of its
line. 30 years ago, Lisp was nearly the only library-oriented language,
with a minuscule market share.
Now, the library-oriented languages dominate. Fortran and Cobol may
never die, but they're certainly not the market leaders they once were.
PL/I is dead, and Ada is hardly dominant. C++ has already been
mentioned. Java is a fairly small language with a huge library. The
next obvious step is .NET, which de-emphasizes languages to the point
that .NET itself IS simply a huge library, with facilities to make it
easy to use that library from any (or all) of a large and growing
collection of languages.
Of course, if you want to discuss slim languages, there's always
Smalltalk -- the language itself is absolutely puny, providing little
more than the ability to send messages to objects, and receive back
results. Everything else is in the standard library, even such basics
as creating an object. This means the language needs a fairly large,
pre-existing standard library to be able to do anything at all.