That is incorrect in every particular.
Nonsense. You don't substantiate even *one* counter-claim.
[...] There is *no* part of C99 that
has not been implemented. Many parts were implemented before they were
standardized.
*Sigh*. *OBVIOUSLY*. But that's a ridiculously low bar.
They had to make sure that the whole shebang could all be done at
once. But it's obvious from looking at the Intel effort and the gcc
effort (both groups being *motivated* to implement C99) that
something stopped each of them. In the end you end up with just a
couple of marginal C99 implementations, which can only exist because
they are so marginal that they don't have any backward compatibility
problems to deal with.
[...] They have been implemented precisely because they *are*
of value to actual programmers who need or want them.
You have to be kidding me. Complex numbers that are incompatible with
C++?!?! Randomly positioned declarations? This is nonsense that
special interest groups may have asked for -- but do you see any of
these same people demanding that C99 compilers actually be implemented?
Do you?
If you want to know what programmers want, you have to test it against
the directions that programmers have *actually gravitated* toward. The
evidence comes from two main directions: 1) modern CPU capabilities
that are used in practice and yet cannot be expressed in the language,
and 2) ideas from modern languages like Java, Python, Perl and so on.
It's interesting to note that GCC is the only major compiler that has
never been represented in the standardization process. I consider that
a major loss for GCC, the C standard, and the C community in general.
I agree. So what the hell is the committee planning on doing about
it? If you look through the mailing lists for gcc, you will find that
many vendors talk to them and influence them all of the time (I am
specifically thinking of WindRiver and AMD).
While it's not too hard to handle simple complex arithmetic yourself,
the syntax you have to use is quite awkward, at least some of the
operations are apt to be implemented poorly (unless you're a numerical
analyst), and you end up having to implement the whole blessed math
library if you want to do actual math (as opposed to arithmetic), which
is definitely a non-trivial task.
What? First of all, if you need to *deal* with complex numbers in a
non-trivial manner (like implementing trigonometric functions, numerical
integration, or something like that), you had better *BE* good enough
to implement your own complex number library.
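Just so we're talking about the same thing, here is a minimal sketch of
the kind of hand-rolled complex arithmetic in question (the names cplx,
cplx_mul and cplx_div are mine, not from any standard); it shows both
how little code the basic operations take and how awkward the call
syntax gets compared to built-in operators:

    typedef struct { double re, im; } cplx;   /* hand-rolled complex type */

    static cplx cplx_mul(cplx a, cplx b)
    {
        cplx r;
        r.re = a.re * b.re - a.im * b.im;
        r.im = a.re * b.im + a.im * b.re;
        return r;
    }

    /* Naive division; a numerically careful version would scale the
       operands to avoid overflow/underflow in the denominator. */
    static cplx cplx_div(cplx a, cplx b)
    {
        double d = b.re * b.re + b.im * b.im;
        cplx r;
        r.re = (a.re * b.re + a.im * b.im) / d;
        r.im = (a.im * b.re - a.re * b.im) / d;
        return r;
    }

    /* (a*b)/c has to be written cplx_div(cplx_mul(a, b), c) -- the
       "awkward syntax" being complained about. */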
The C solution was never intended to be taken forward -- the C++
solution is perfectly fine for C++, but it's not viable for C.
And you are saying that embedded/driver applications have a great need
for really marginal and difficult-to-implement complex numbers?
C++ is the *path* to take C forward. That means the areas C++ carves
out should be left to C++, and the C standard committee should accept
this fact. This is the only, and best, way for it to remain relevant.
This lack of consideration for C++ has helped *cause* C99 to be ignored
by end-users.
[...] The
intent was to come up with a solution for C that was "compatible" with
the C++ solution, meaning that it is possible with just a little
preprocessor magic to write code that will compile as either C or C++.
And if you want to do number theory on Gaussian integers, is it still
just a matter of preprocessor tricks?
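For reference, the kind of "preprocessor magic" being described boils
down to something like this sketch (the typedef name dcomplex is mine;
the exact glue people use varies):

    #ifdef __cplusplus
      #include <complex>
      typedef std::complex<double> dcomplex;
    #else
      #include <complex.h>
      typedef double complex dcomplex;
    #endif

    /* The arithmetic operators (+ - * /) then work in both languages;
       the seams show up at things like creal(z) in C99 versus z.real()
       in C++, which is where additional macro glue comes in. */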
Why do you think anyone cares about problems already solved in C++
being solved yet again in C (which is now most commonly available as
just another mode of a C++ compiler)?
Look. C is a malloc/free based language. Every new language has
implemented garbage collection as a means of escaping the weaknesses
of the C model. This is one of the most glaringly obvious issues with
C. Now, I don't think C needs to *SWITCH* to GC. Instead, this
outlines the need to do *BETTER* with the malloc/free model, as a
*COUNTER* to those who are running away from it because of its
weaknesses. We also see from things like Electric Fence, Purify, and
Visual C++'s memory debugging facilities that there is both a need for,
and demonstrated ways of, improved memory management. This is obvious,
and all this information has been around forever. So who in the
standards committee has looked into improved memory management for
C99?
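To be concrete about what "doing better" with malloc/free could mean,
here is a toy sketch (my own, not from any standard or from those
tools) of the bookkeeping-wrapper idea that Electric Fence, Purify and
the debug heaps do far more thoroughly:

    #include <stdio.h>
    #include <stdlib.h>

    /* Toy debugging allocator: records where each block came from so
       leaks and stray frees can be reported.  Real tools add guard
       pages, fill patterns, call stacks, etc.  A production version
       would also pad the header to the strictest alignment. */
    struct blockhdr {
        const char *file;
        int         line;
        size_t      size;
    };

    static void *dbg_malloc(size_t n, const char *file, int line)
    {
        struct blockhdr *h = malloc(sizeof *h + n);
        if (!h) return NULL;
        h->file = file;
        h->line = line;
        h->size = n;
        return h + 1;                  /* hand the caller the payload */
    }

    static void dbg_free(void *p)
    {
        struct blockhdr *h;
        if (!p) return;
        h = (struct blockhdr *)p - 1;
        fprintf(stderr, "freeing %lu bytes allocated at %s:%d\n",
                (unsigned long)h->size, h->file, h->line);
        free(h);
    }

    #define MALLOC(n) dbg_malloc((n), __FILE__, __LINE__)
    #define FREE(p)   dbg_free(p)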
Many CPUs have bit scan, bit count, and widening multiply
instructions. All of these operations can be emulated portably, but
only very slowly, and they cannot reasonably be mapped to the fast
hardware via "as-if" rules. They also all have practical applications
in the real world. If the C standard committee merely added APIs for
these to the standard library, it would open a vast, unmatchable
performance wedge against all other languages (including C++, until it
added them to its own standard as well). So which of these functions
were added, or proposed to be added, to the C99 standard?
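For the sake of argument, here is what the portable emulation of two of
those operations looks like (illustrative code of mine; the function
names are not from any standard or proposal):

    #include <stdint.h>

    /* Portable population count: walks every bit.  A single hardware
       popcount-style instruction does the same job almost for free. */
    static int bit_count32(uint32_t x)
    {
        int n = 0;
        while (x) {
            n += (int)(x & 1u);
            x >>= 1;
        }
        return n;
    }

    /* Portable 32x32 -> 64 widening multiply built from 16-bit halves,
       which is what you write if you cannot assume a 64-bit type or
       trust the compiler to spot the idiom.  Most CPUs produce the full
       product with one MUL instruction. */
    static void widening_mul32(uint32_t a, uint32_t b,
                               uint32_t *hi, uint32_t *lo)
    {
        uint32_t al = a & 0xFFFFu, ah = a >> 16;
        uint32_t bl = b & 0xFFFFu, bh = b >> 16;
        uint32_t t0 = al * bl;
        uint32_t t1 = ah * bl + (t0 >> 16);
        uint32_t t2 = al * bh + (t1 & 0xFFFFu);

        *lo = (t0 & 0xFFFFu) | (t2 << 16);
        *hi = ah * bh + (t1 >> 16) + (t2 >> 16);
    }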
Every database-like program that implements ACID does so with a very
twisted sort of file management. People who used CVS heavily during
the mid-to-late 90s know exactly what the problem with this is (CVS
was unsound for much of its lifetime due to bad file-locking
implementations). Different file systems have different
characteristics, and guaranteeing atomicity has to be done on a
case-by-case basis. If there were an API for some atomic file
primitives in the C standard, it could force OSes to expose a unified
API, which would in turn force the disk drive manufacturers to provide
some kind of guarantee about their product. What did the ANSI C
committee say about this?
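For the record, the way this gets faked today is roughly the following
(a sketch of mine, not a proposal; rename() and remove() are ISO C, but
the atomicity of rename() and the need for things like fsync() are
entirely platform-specific, which is exactly the problem):

    #include <stdio.h>

    /* Write the new contents to a temporary file, then rename it over
       the old one.  On many POSIX file systems rename() replaces the
       target atomically; ISO C promises nothing of the sort (and on
       Windows rename() fails if the target exists), and neither fflush
       nor fclose guarantees the data has reached the disk. */
    static int replace_file(const char *path, const char *data, size_t len)
    {
        char tmp[FILENAME_MAX];
        FILE *f;
        int ok;

        if (snprintf(tmp, sizeof tmp, "%s.tmp", path) >= (int)sizeof tmp)
            return -1;

        f = fopen(tmp, "wb");
        if (!f) return -1;
        ok = (fwrite(data, 1, len, f) == len);
        if (fclose(f) != 0) ok = 0;
        /* A real implementation would fsync() the file and its
           directory here -- but those calls are POSIX, not ISO C. */
        if (!ok || rename(tmp, path) != 0) {
            remove(tmp);
            return -1;
        }
        return 0;
    }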
Just think about those things for a second. If C99 had implemented
any of those, don't you think that there would at least be *SOME*
buy-in from end users?
And that ignores what the C standard committee *should* do just for
sanity reasons (fix the ftell/fseek nonsense, fix the time APIs to be
non-static and >= 64 bits).
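To spell those two out (my own illustration):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* ftell/fseek traffic in long, so with a 32-bit long a file
           beyond 2 GB cannot be addressed portably; the fgetpos/fsetpos
           escape hatch only yields an opaque fpos_t that you cannot do
           arithmetic on. */
        FILE *f = fopen("big_file.bin", "rb");   /* hypothetical file */
        if (f) {
            long pos;
            fseek(f, 0L, SEEK_END);
            pos = ftell(f);        /* fails or lies past LONG_MAX bytes */
            printf("size, if it fits in a long: %ld\n", pos);
            fclose(f);
        }

        /* localtime/gmtime return pointers into static storage, so the
           second call may silently clobber the first result -- and
           time_t is not guaranteed to be 64 bits wide. */
        {
            time_t now = time(NULL);
            struct tm *a = gmtime(&now);
            struct tm *b = localtime(&now);    /* may overwrite *a */
            printf("%d %d\n", a->tm_hour, b->tm_hour);
        }
        return 0;
    }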