I agree. It is THE de-facto standard on the desktop. That is my point.
Most desktop "C" programmers are in fact using a C++ compiler. As you
have pointed out recently, the current crop of MS compilers is very good
and does adhere to the C++ standard.
Not sure what your point is then. AFAIK, *every* C++ compiler is also
a C compiler these days. If each side of the compiler honors its
particular standard well enough, most programmers think of them as two
different compilers implementing two different but closely related
programming languages.
I can also find millions of hits to show that Elvis lives, that Roswell
was a fake AND that it was real aliens, that the Da Vinci Code is ALL
real and all fake, and that there are WMD in Iraq... Just because lots
of people believe it does not make it true.
Finding millions of hits on Google proves absolutely nothing. I think
it was Robert Heinlein who said "never underestimate the power of
human stupidity".
Sorry, but it does prove something, human capacity for stupidity
notwithstanding. Google ain't the real world, but it is a first-order
estimator of what's out there. Same for C.l.c. No matter how many
times someone pedantically insists that there is no such language as
C/C++, lots of folks out in the real world continue to use the term
to refer to the mix of code they use to solve real-world problems.
By contrast, I just Googled:
Pascal/Cobol and got one hit in Korean,
Eiffel/PL/I and got two hits, one in Japanese
So those combinations of languages seem to make less sense treated as
a single entity than C and C++. Also, FWIW, I wrote an article a year
or two ago called "The C/C++ Programming Language" which did *not*
drown in derision from subsequent letters to the editor.
There's something there that has meaning to people, deny it as much
as you'd like. Wouldn't it be more helpful to get out of denial and
start considering the role C.l.c. could play in helping the millions
of programmers, managers, etc. who profitably assume that C/C++ is
a term that makes sense?
No, but it's a step closer, and it shows that C++ walked alongside
C as far as it could to make a standard published before 1999.
That is the *next* version of C++, BTW. When is it due out?
The "0" in C++0X indicates a fervent hope that it'll happen before
2010. (Remember C9X?) And it once again indicates that the languages
are making serious efforts to converge, after several years of
drifting in different directions.
Most(?) desktop users of C actually use a [MS]C++ compiler, and MS has
taken this off in its own direction.
Well, yes, but they seem to have gotten religion about standards
conformance lately.
Yes, they have put a lot of work into ECMA and, as you have pointed out,
become a lot more ISO compliant.
They haven't done export for C++ yet (because
it's damned hard and next to nobody is asking for it)
This is a bit of a recurring theme where no one is pushing for
compatibility with the latest standards.
You misunderstand the emphasis. It has almost never been the case
that a vendor has aimed for 100 per cent conformance to a programming
language standard. (I say this despite the free use of ANSI and ISO
in advertising.) I remember a decade ago Tom Plum expressing frustration
that he couldn't get any of his customers to eliminate the last three to
five test failures when running their C compilers against his validation
suite. Each had damn good reasons -- usually involving backward
compatibility -- why they wouldn't close the gap completely. With C++,
a way more complex language, the fuzz level is typically measured in
the dozens or hundreds of tests, but the same reasons apply. And both
C++ and C99 have made matters worse by setting such a high bar for
100 per cent conformance that few vendors are motivated to approach that
Apollonian ideal.
Nevertheless, the C and C++ Standards carry weight, if only because
many enterprises insist that *certain* portions of each language
implementation conform closely, and there are tools from Plum Hall and
Perennial to probe those portions. So the paucity of 100 per cent
conforming implementations should not be taken as proof that standards
have failed. Their success these days is just somewhat less absolute
than we all hoped for a decade and a half ago.
This is where we all came in. Why does no one (very few) want C99
conformance?
See above. And other reasons given. C99 is an asymptote these days,
not a pressing goal. Same for C++ (which is also a moving target).
thanks
I know. AFAIK most of the major embedded compilers use that combination.
See
http://www.edg.com and
http://www.dinkumware.com. I will put the
advert in for you.
As you say they do give C99 but.....
though AFAIK not all of the embedded implementers do a full C99 back
end.
I just plain don't know. I've stopped asking folks like Green Hills,
Wind River, etc. what they're going to release when. Practically all
of our support for our OEMs is helping them give their customers what
they want *now*.
All those things in C99 that people say are broken. They also seem to
work for our customers. But what do we know?
I know. Bloody nightmare. But probably not the place to go into it in
detail.
Ask the usual suspects.... Though one or two seem to have given up and
wandered off.
Well, that'll make for a refreshing interlude.
OK. Though you have said there are parts of C99 you wish had not been
put in.
Of course. Every programming language standard has oodles of compromises,
many made for "political" reasons (as if that were a bad thing). I didn't
like all aspects of ANSI C89, yet I voted to approve the final draft.
I certainly didn't like C++98, yet I did the same. I have no trouble
griping about the problems in the standards I try to implement, while
trying at the same time to implement them well and acknowledging a
committee's right to go against the will of any minority. A standard
doesn't have to be 100 per cent lovely, in the eyes of every admirer,
to be useful. Quite the contrary.
They were arguing against adding any TRs or anything else until all the
DRs had been fixed.
At least the TRs are non-normative. And, thanks to the persistence of
John Benito, the WG14 Convener, the DRs are well under control. If there
are still massive flaws not addressed by open DRs, that's the fault of
those who perceive the flaws, not the diligence of WG14.
Also there are Nick's worries about the whole maths
model.
Yes, I've gotten a dose or two of Nick's worries lately. And been
somewhat surprised about what he doesn't know about what he's
criticizing.
That's good. However, as MS is the de facto standard on the desktop and
they are going ECMA C++/CLI, isn't that where the vast majority of C++
users will go?
Dunno. Maybe. I know quite a few people who use VC++ as a Standard C
and a Standard C++ compiler (I hesitate to say C/C++ here). The lure
of the managed environment will doubtless draw many, but there are
also many who want nontrivial code to work on Windows, Linux, Unix,
and other platforms. The siren call is not always obeyed.
I think you have more influence than I do in that respect.
Perhaps, but much of the time my influence is bugger all. Particularly
in C++. But at least that gives me more opportunities to gripe...
P.J. Plauger
Dinkumware, Ltd.
http://www.dinkumware.com