[snips]
> A tool that by design is unsafe to use is a badly designed tool.
So let's get rid of hammers, axes, knives, screwdrivers, cutting torches,
most saws, drills, forks, pencils, pens, computers, TVs, cars, toasters...
Every single one of those is a tool. Every single one, used improperly,
is dangerous - even potentially fatal. Most *cannot* be designed to be
safe if used improperly; therefore, by your argument, they are badly
designed and need to be replaced... except... if they cannot be designed
to meet your requirements, all that's left is to abandon them completely.
This, to you, makes sense, does it?
> If there were not literally hundreds of exploits in production code
> that have actually been used for evil purposes
And tens of thousands of people killed in car crashes, by knives,
thousands more killed by assorted workshop tools, more still killed by TVs
and radios and toasters...
> , then these "C strings are safe" arguments would make sense. Expert
> coders who are trying their hardest still cause errors in this regard.
First, a string isn't a tool; it's a piece of data. The tools in this
case are the functions which operate upon strings. Those tools, like any,
require a certain attention to detail, a certain skill.
Part of that skill, part of that attention, is simply knowing how to use
them correctly in the first place. Another part, at least for "critical"
code, involves items such as code reviews, use of unit testing, use of
scattershot and other randomized testing tools and so forth.
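To put a concrete face on "knowing how to use them correctly": here's a
rough sketch - the function name and sizes are mine, purely for
illustration - of a copy routine that knows its destination size and
refuses to overflow it, rather than assuming everything fits:

    #include <stdio.h>
    #include <string.h>

    /* Copy src into dst, which holds dst_size bytes.  Returns 0 on
     * success, -1 if dst is too small - the caller decides what to do. */
    static int copy_string(char *dst, size_t dst_size, const char *src)
    {
        size_t needed;

        if (dst == NULL || src == NULL)
            return -1;

        needed = strlen(src) + 1;       /* include the terminating NUL */
        if (needed > dst_size)
            return -1;                  /* refuse rather than overflow */

        memcpy(dst, src, needed);
        return 0;
    }

    int main(void)
    {
        char name[16];

        if (copy_string(name, sizeof name, "far too long for this buffer") != 0)
            fprintf(stderr, "copy_string: destination too small\n");

        return 0;
    }

Nothing exotic - just the attention to detail the tool asks for.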
I'm willing to bet you will find *damned* few instances of
C-string-related exploits which have passed a proper testing and
examination regimen.
> So now, let the first person who has written at least 10,000 lines of
> code and who has also never had even one bug in his code raise his hand.
Umm... er... okay. Mine. 35,000+ lines in a single project... over
200,000 units shipped. While a few design issues did come up
- as in "it'd be nice if it could do this" - there was not a *single*
actual bug report.
Mine is raised.
> Then perhaps we should assume that the users of the tools are not
> perfect and (in fact) prone to mistakes.
A tool cannot be prone to anything; it is inert, sitting there waiting for
you to use it. *How* you use it determines the risk. If you choose to
walk into a crowded mall with a high powered rifle, put on a blindfold and
empty the clip in random directions, it is not the fault of the tool that
someone gets hurt.
> These mistakes do happen, on a frequent basis.
No, they don't - at least, not to seasoned developers using proper
development, testing and verification strategies.
It is *precisely* when one becomes blasé about the use of the tools - I'm
not talking strcpy and the like, I'm talking the entire language, and even
more generally, the entire world of software development, regardless of
language or platform - that dangers arise.
There's an old adage, it goes something like this: "If engineers built
buildings the way programmers write programs, the first woodpecker to
come along would destroy civilization."
There's a certain truth to that. I have met far too many developers over
the years who simply _don't care_. They rely on functions to "just work"
and don't bother checking return codes. They assume allocations succeed.
They assume they have enough space to copy a string or write a block of
data.
There is *nothing* you can do about such people other than identify them
and either train them properly or try to prevent them ever developing
software. No language, no tool in existence will ever stop them writing
bad software, software full of holes, full of risks. They are simply
incapable - through lack of training, lack of concern, whatever - of
producing quality software.
But that's the whole point: there is *nothing* you can do, from a language
perspective, to stop them. Nothing. They will always manage to do
something wrong. If it's not a failed string copy, it's a division where
they don't check for division by zero. Or something, anything. Lacking
the skills or the concern to do it right, they do it wrong, and the only
way to prevent it is to prevent them writing software - not to try to
create some magical language where it is impossible to write bad code.
Professional developers, writing production code, use professional
methods. They use well-structured designs, for starters. They examine
and handle errors and allocation failures. They make sure buffers are
large enough.
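For instance, checking an allocation instead of assuming it succeeded
costs a couple of lines. A small illustrative sketch - the function name
is mine, nothing standard:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Duplicate a string, checking the allocation rather than assuming
     * it succeeds.  Returns NULL on failure; the caller must check. */
    static char *dup_string(const char *src)
    {
        size_t len;
        char *copy;

        if (src == NULL)
            return NULL;

        len = strlen(src) + 1;
        copy = malloc(len);
        if (copy == NULL)               /* the allocation CAN fail */
            return NULL;

        memcpy(copy, src, len);
        return copy;
    }

    int main(void)
    {
        char *s = dup_string("hello");

        if (s == NULL) {
            fprintf(stderr, "dup_string: out of memory\n");
            return EXIT_FAILURE;
        }

        puts(s);
        free(s);
        return 0;
    }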
However, a professional developer *also* knows he is human and can make
mistakes, so he'll use other things to help him. He'll compile with
maximal warning levels and treat warnings as fatal errors. He'll run lint
and similar tools. He'll do coverage analysis. He'll use random
injection tests to validate that a piece of code doesn't break when handed
data too large or too small or with improper values. Where possible,
he'll submit the code to review.
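A random injection test needn't be elaborate, either. Here's an
illustrative sketch - the routine under test and its limits are invented
for the example - that hammers a little parser with random byte strings
and checks that it either parses cleanly or reports an error, and never
does anything else:

    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Routine under test: parse a string of decimal digits into an int,
     * rejecting anything it was not designed to handle. */
    static int parse_digits(const char *s, int *out)
    {
        long value = 0;

        if (s == NULL || *s == '\0')
            return -1;

        for (; *s != '\0'; s++) {
            if (*s < '0' || *s > '9')
                return -1;              /* reject garbage, don't guess */
            value = value * 10 + (*s - '0');
            if (value > 1000000)
                return -1;              /* reject out-of-range values  */
        }

        *out = (int)value;
        return 0;
    }

    int main(void)
    {
        char buf[64];
        int i, value;

        srand(12345);                   /* fixed seed: reproducible runs */

        for (i = 0; i < 100000; i++) {
            size_t len = (size_t)(rand() % (int)(sizeof buf - 1));
            size_t j;

            for (j = 0; j < len; j++)
                buf[j] = (char)(rand() % 255 + 1);  /* no embedded NULs */
            buf[len] = '\0';

            /* Must either succeed with a sane value or report an error;
             * it must never crash or hand back nonsense. */
            if (parse_digits(buf, &value) == 0)
                assert(value >= 0 && value <= 1000000);
        }

        puts("random injection test passed");
        return 0;
    }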
Check that failing code you go on about. See how much of it has gone
through all those levels of design, testing and validation and *still*
managed to produce fatal errors.
> These mistakes also cause billions of dollars in damage. This is not a
> theoretical argument. It is a statement of fact, using what we actually
> observe.
Really? Tell you what. How about you provide a single example where the
tools - say the C string handling functions - actually caused billions in
damages... when used correctly.
> The claim is that responsible coders will not introduce mistakes that
> cause damage because of the way C strings are designed.
No, the claim is that professional developers apply professionalism to
development and take steps to ensure that this sort of thing doesn't
happen in programs properly designed and used within the limits of those
designs. The claim, further, is that professional programmers apply
professionalism to testing and verifying that their code actually does
work as intended, despite being given garbage data.
There's another old saying, GIGO - Garbage In, Garbage Out. Trite and
well known as it is, invoking it is actually the mark of a poor programmer.
The saying suggests that it is okay - even expected - for a program to
produce flawed results when handed bad data.
Ask any professional programmer what his routines do when they encounter a
value out of range, for example. The response will almost invariably be
something like "bail out, reporting an error."
There's a reason for it; if something is outside the realm of what the
code is designed to handle, then something has gone fatally wrong: the
program is being used incorrectly, or outside the scope of the design it
was built to, or a device - or user - is giving it invalid input, or some
other routine has failed in some manner.
Garbage in does not mean garbage out, not if you're a pro; it means
_errors_ out. Not crashes, not overflows, not random writes to random
bits of memory - errors. As in "I don't know what to do with this, so
tell someone about it and let them sort it out."
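In code, "errors out" looks something like this - an illustrative sketch,
with the routine and its checks invented for the example - where input
outside the design is reported to the caller instead of being turned into
a flawed result (or a division by zero):

    #include <stdio.h>

    /* Average an array of values.  Inputs the routine was never designed
     * to handle are reported as errors, not silently mangled. */
    static int average(const int *values, size_t count, double *result)
    {
        double sum = 0.0;
        size_t i;

        if (values == NULL || count == 0)
            return -1;          /* "I don't know what to do with this" */

        for (i = 0; i < count; i++)
            sum += values[i];

        *result = sum / (double)count;
        return 0;
    }

    int main(void)
    {
        double avg;

        if (average(NULL, 0, &avg) != 0)
            fprintf(stderr, "average: invalid input rejected\n");

        return 0;
    }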
Again, it is precisely the lack of concern and diligence that causes these
problems - not the tools. Short of making a "perfect" - and perfectly
unusable - language, you cannot stop poor developers writing poor code.
The best you can accomplish is making it difficult for good programmers to
write good code - and a good programmer doesn't need that hand-holding in
the first place. He already has the tools necessary to detect and deal
with these problems, and he already *uses* those tools.