But a similar example would be the size of a block of memory. For
instance, memcpy takes an unsigned type (size_t) for its third
parameter, presumably because you cannot copy a negative number of
bytes. But this loses the chance to do some error checking.
This is a good thing:
1) The function signature correctly documents that negative numbers are
not valid
2) The called function does not have to waste time performing a
comparison for invalid data. The invalid data cannot be represented in
the parameters.
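For reference, the familiar prototype:

void *memcpy(void *dest, const void *src, size_t n);   /* n is unsigned: size_t */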
Or, you may have a strange memcpy which allows you to designate the _end_
of the block of interest, where a negative size indicates that you want
to copy that many bytes _before_ the pointer....
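A minimal sketch of what such a variant might look like (memcpy_rel is a
made-up name, not anything in a real library):

#include <string.h>   /* memcpy */
#include <stddef.h>   /* size_t, ptrdiff_t */

/* Hypothetical: when count is negative, src designates the end of the
   region, and the |count| bytes immediately before it are copied. */
void *memcpy_rel(void *dest, const void *src, ptrdiff_t count)
{
    if (count >= 0)
        return memcpy(dest, src, (size_t)count);
    return memcpy(dest, (const char *)src + count, (size_t)-count);
}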
For instance, I might write some code like this:
size_t off = p - q; // p >= q
memcpy(a, p, off);
And then I promptly beat you to a pulp (hypothetically) (OK, I promptly
explain to you...) for not range-checking the relative values of p and q.
(See pretty much any of the STL algorithms which take two iterators...
the algorithms don't check that the iterators are correctly ordered;
that's the programmer's responsibility. Same sort of idea.)
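The same convention in STL terms, as a minimal illustration: std::copy
simply assumes [first, last) is a valid range and never checks the
ordering itself.

#include <algorithm>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3, 4};
    std::vector<int> out(v.size());

    std::copy(v.begin(), v.end(), out.begin());    // fine: a valid range
    // std::copy(v.end(), v.begin(), out.begin()); // undefined behaviour: reversed
                                                   // range, and nothing checks it
    return 0;
}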
In the memcpy snippet above, p and q are pointers to locations within the
char array a. I am assuming that off is a positive or zero quantity. But
suppose that, because of buggy code, that assumption is wrong: memcpy
cannot refuse to copy the very large positive number of bytes that
results, because it is designed to copy any unsigned quantity. But if
memcpy took a signed type as its third parameter, it could quite
reasonably refuse to copy a meaningless negative number of bytes, thus
reducing the chance of a buffer overflow.
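Very roughly, the signed version being argued for might look like this
(memcpy_signed and its return-NULL-on-refusal convention are made up
purely for illustration):

#include <string.h>
#include <stddef.h>

void *memcpy_signed(void *dest, const void *src, ptrdiff_t count)
{
    if (count < 0)
        return NULL;    /* refuse: a negative byte count is meaningless */
    return memcpy(dest, src, (size_t)count);
}

Written that way, ptrdiff_t off = p - q; stays negative when p < q, so
the bogus call can be rejected instead of silently turning into a huge
unsigned size.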
Then you would be wasting the great majority of your calls to memcpy
checking for a condition which _might_ happen only during debugging.
It would probably be more effective to link against a debug version of
the C runtime which raised an exception on any call to memcpy with a
block larger than 0x7FFFFFFFUL (assuming a 32-bit architecture...).
Once you've determined with sufficient certainty that your program is
bug-free (HA!), you link against the release-mode memcpy, which doesn't
waste time checking for a condition that will never (famous last
words....) happen.
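One way that debug/release split might be expressed, purely as a sketch
(MEMCPY_DBG is a made-up wrapper, and the 0x7FFFFFFFUL limit assumes a
32-bit size_t):

#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Debug builds trap "impossibly large" sizes; with NDEBUG defined
   (release builds) this is just a plain memcpy, no extra check. */
#ifdef NDEBUG
#define MEMCPY_DBG(dest, src, n)  memcpy((dest), (src), (n))
#else
#define MEMCPY_DBG(dest, src, n) \
    (assert((size_t)(n) <= 0x7FFFFFFFUL), memcpy((dest), (src), (n)))
#endif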
It could not have happened but for the mix. Surely you're not
proposing to ban signed integers. Now that really would be ridiculous!
No more ridiculous than banning unsigned integers...
I think if you look at the code, the programmer did not ignore compiler
warnings. I'm guessing, of course, but I think he added a cast to the
while-loop condition because of a compiler warning, but did not add one
to the subtraction because that particular compiler does not produce a
warning for that.
Actually, that's worse. The compiler dutifully warned the programmer
that there might be a problem here, and the programmer willfully ignored
the warning and forced the compiler to shut up.
Of course a better reaction would have been to change the type of
cbRead to unsigned.
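To be clear about what that kind of cast does, here is a made-up
reconstruction of the general shape of the problem. It is not the code
being discussed; cbRead is just a stand-in name borrowed from the thread.

#include <stdio.h>
#include <stddef.h>

int main(void)
{
    char buf[64];
    int cbRead = -1;    /* e.g. a read that failed and returned -1 */

    /* The compiler warns about comparing signed with unsigned; a cast
       silences it, but (size_t)-1 is a huge value, so the comparison no
       longer means what it appears to mean. */
    if ((size_t)cbRead > sizeof buf)
        printf("the cast turned -1 into a huge 'size'\n");

    /* The better reaction: make the variable unsigned in the first place
       (reporting errors some other way), so there is no mixed-sign
       comparison to warn about or cast away. */
    size_t cbRead2 = 0;
    if (cbRead2 <= sizeof buf)
        printf("no warning, no cast needed\n");

    return 0;
}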
If only everywhere were so enlightened. But what do you do about bogus
warnings? I've known compilers to warn me about this:
if (x == 0 && y < 10 || y > 10)
telling me that I really should put brackets around y < 10 || y > 10.
Arguably I should, but it's a style issue, not something a compiler
should warn me about.
Personally, I'd prefer to stick a whole bunch of extra parens in there
anyway...
(I like the style... thus: if ((x == 0) && (y != 10)) )