Peter Amey wrote:
> [ ... ]
> Actually, a close reading of the thread should have made it clear
> that the additional safety is indeed "free". Since the majority
> of Ada's checks are compile time they do not impact on run-time
> efficiency.
Some of the cited cases provide checks that are free in terms of
run-time, but run-time is hardly the major cost in most software.
Though I haven't used Ada recently, back when I did, nowhere close
to all of the checks were free even in terms of run-time.
Furthermore, even when a run-time check itself imposes no extra code
execution, that doesn't mean it's necessarily free in terms of run-time
cost. Simple economic reality is that writing a good compiler is
NOT a quick, cheap, or simple thing to do. Generally speaking, it's
done to a budget of some sort and time and effort that's put into
compile-time checks more or less directly translates into time and
effort that was NOT put into better code generation and optimization.
This can be ameliorated in a case like GNAT, where the Ada compiler
shares a back-end with something else, but the effect is almost never
really eliminated -- time and effort are almost always limited
resources, so expending them in one way nearly always reduces their
expenditure elsewhere.
> Where Ada's goals can only be met by run-time checks these are
> no more expensive than equivalent manually-inserted checks in
> any other language (and are often less because Ada provides the
> compiler with more information by which to judge when they can
> safely be optimised away).
Maybe...or maybe not. IME, absolute statements about generalities (e.g.
all implementations of a language) are rarely accurate. Ada compilers
vary quite widely, and I've certainly seen some emit code that included
checks that were logically unnecessary. I'd guess that the current
compilers are better, but perfection in this respect would surprise me.
In the end, my experience has been that Ada compilers produce _slightly_
worse code in general than C and C++ compilers, but it would take
considerable work to determine how much of this is due to the source
language, and how much simply because their smaller market share
supports less time and effort in optimization. In fairness, I should
add that the differences are rarely very noteworthy. In most typical
situations, if one produces adequately fast output, so will the other.
Of course, if you're dealing with hard real-time requirements and a C++
compiler meets the requirement with only 0.5% to spare, there's a pretty
fair chance that Ada would fail -- but this sort of situation is
unusual (probably even rare).
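To make concrete what "equivalent manually-inserted checks" means, here's a rough C++ sketch (the function name is mine, purely illustrative) of what an Ada-style index check amounts to when written out by hand -- the "free" case is when the compiler can prove the test dead and delete it:

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <stdexcept>

// Hand-written equivalent of the index check an Ada compiler inserts
// automatically on every array access it cannot prove safe. When the
// index is provably in range, a good compiler removes the branch.
int checked_get(const std::array<int, 10>& a, std::size_t i) {
    if (i >= a.size())
        throw std::out_of_range("array index out of range");
    return a[i];
}
```

std::array::at() does essentially the same thing, so the cost comparison between the two languages here really is about how often the compiler can discharge the check statically.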
[ ... ]
> It should also have been clear from the thread that Ada imposes no
> limits on expressiveness.
Nonsense -- examples of Ada expressing a few specific concepts have
been given, but this hardly proves a lack of limits. The fact is,
_every_ programming language places severe limits on expressiveness;
anybody who believes otherwise simply hasn't given the subject much
real thought. Programming languages in general express only a small
range of specific actions that are relevant to (most) computers. By
design, none of them is really expressive in areas such as human
emotions.
Even sticking to programming types of things, Ada has some limits to
its expressiveness.
Just for one obvious example, Ada doesn't provide an easy way to
express/do most of the things one can do with the C or C++
preprocessor. It provides some alternative in _some_ cases, but quite
frankly, these are really the exceptions rather than the rule.
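A quick sketch of the kind of thing I mean -- the classic "X-macro" idiom, where a single token list generates both an enum and its matching name table. Ada has no textual macro facility, so there's no direct equivalent; you'd maintain the two lists by hand:

```cpp
#include <cassert>
#include <string>

// One list of entries, expanded twice by the preprocessor:
// once to declare the enum, once to build the name table.
#define COLOR_LIST \
    X(Red)         \
    X(Green)       \
    X(Blue)

enum Color {
#define X(name) name,
    COLOR_LIST
#undef X
};

const char* color_name(Color c) {
    switch (c) {
#define X(name) case name: return #name;
        COLOR_LIST
#undef X
    }
    return "unknown";
}
```

Adding a color means touching exactly one line; the enum and the name table can never drift out of sync.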
Some areas are somewhat more questionable -- calling a class a "tagged
record" is clearly a mistake, but it's open to question whether it
should be classified under poor expression of the concept, or just
general idiocy.
The Ada folks who insist that things should be part of the base
language rather than an add-on library may have strings, but have
nothing equivalent to a dtor in C++. The possible presence of garbage
collection does little to mitigate this, as dtors are useful for _far_
more than just releasing memory when no longer in use. Ada even tacitly
admits to this shortcoming by providing Ada.Finalization in the
library. My understanding, however, is that this requires anything that
wants a destructor be derived from their base class (oops -- their base
tagged record). This may be usable, but it's hardly what I'd call a
clean expression of the concept. In fairness, I should add that I've
never used Ada.Finalization, so my understanding of it may well be
flawed, and corrections about this area would be welcome.
Speaking of strings, I'll digress for a moment: personally, I find it a
bit humorous when Ada advocates talk about things like having five
string types as an advantage. IMO, the need, or even belief, that five
string types are necessary or even useful is a _strong_ indication that
all five are wrong.
Ada's exception handling is also primitive at best (exceptionally so,
if you'll pardon a pun). In particular, in Ada what you throw is
essentially an enumeration -- a name that the compiler can match up
with the same name in a handler, but nothing more. Only exact matches
are supported and no information is included beyond the identity of the
exception.
In C++ you can throw an arbitrary type of object with an arbitrary
value. All the information relevant to the situation at hand can be
expressed cleanly and directly. The usual inheritance rules apply, so
an exception handler can handle not only one specific exception, but an
entire class of exceptions. Again, this idea can be expressed directly
rather than as the logical OR of the individual values. And, once
again, the addition of tagged records to Ada 95 testifies to the fact
that even its own designers recognized the improvement this adds in
general, but (whether due to shortsightedness, concerns for backward
compatibility or whatever) didn't allow this improvement to be applied
in this situation.
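A small sketch of the two points above (ParseError and its payload are invented for illustration): the thrown object carries arbitrary data, and a handler written against the base class catches the entire family of derived exceptions:

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

// An exception type derived from a standard base, carrying extra data.
struct ParseError : std::runtime_error {
    int line;
    ParseError(const std::string& what, int l)
        : std::runtime_error(what), line(l) {}
};

int parse_or_throw(const std::string& s) {
    if (s.empty())
        throw ParseError("empty input", 1);  // payload travels with the throw
    return static_cast<int>(s.size());
}
```

A `catch (const std::runtime_error&)` handler catches ParseError and every other runtime_error-derived type in one clause -- no enumeration of individual names required.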
> Can you say what led you to the opposite conclusion?
Study and experience. Can you say what leads you to believe your own
claim?