I make them more than most people. As a compensating tactic, I can be pretty
careful about working things out and testing them... But I tend not to when
posting to Usenet (although I always check my facts before posting nonsense
to Usenet!).
In general, no. Someone pointed that one out, and I concede cheerfully that,
yes, that's a pretty stupid mistake. I make stupid mistakes a *lot*, so I'm
used to that. Note that I make them a lot less often on much more
complicated things than on simpler things; a neat quirk of brain chemistry
or something. And I don't just mean "less often, compared to other people"; I
mean less often in absolute terms. I'm much more likely to get something
trivial wrong than something interesting and challenging.
I disagree. The problem is not the lack of an errata list; it's that he
still doesn't even *understand* how feof() works. He's written a series
of Pascal books which have been cleverly relabeled "C" and had some of the
punctuation altered, in effect. He got several of the "easy" mistakes that
were, say, posted on my page, fixed for the 4th edition. He didn't fix the
much more serious logic flaws introduced by his unwillingness to comprehend
how end-of-file conditions are handled in C.
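For context (my own sketch, not code from any of Schildt's books): the classic misuse is testing feof() *before* a read instead of testing the read's own result. feof() only reports that a previous read has already hit end-of-file, so the broken loop processes the failing read as if it were data:

```c
#include <stdio.h>

/* Illustrative sketch only: feof() reports that a PREVIOUS read already
 * failed at end-of-file, so testing it before reading counts one bogus
 * extra "character" (the EOF return value). */
static int broken_count(FILE *fp)
{
    int n = 0;
    while (!feof(fp)) { /* classic error: feof() tested before the read */
        fgetc(fp);      /* the final call returns EOF, yet we count it */
        n++;
    }
    return n;
}

static int correct_count(FILE *fp)
{
    int c, n = 0;
    while ((c = fgetc(fp)) != EOF) /* test the read's result instead */
        n++;
    return n;
}
```

On a three-character stream, broken_count reports 4 while correct_count reports 3; the same off-by-one is what makes the usual `while (!feof(fp)) { fgets(...); ... }` loop process the last line of a file twice.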
And that's the thing. Errata would be one thing. But if every example I
read using feof() used it incorrectly, and multiple examples of how to do
input loops used feof() incorrectly, that suggests to me that these are not
merely "stupid mistakes" (fencepost errors, typos, etcetera), but a genuine
failure to comprehend something which I think can be reasonably regarded as
fairly basic.
In Herb's defense, I'd say that intelligent people find it hard to
treat mistakes (such as the design of feof) with enough respect to
remember the deviant implementation that feof is. Herb's failure to
recall a massive boner to mind means he's an intelligent person.
Whereas nasty little clerks (who, as we saw this morning, make their own
stupid mistakes but expect to be forgiven) take a sour delight in
"learning" claptrap.
This isn't programming, Seebach. It's Trainspotting.
An intelligent programmer, when confronted with the incredibly poor
design of feof, turns right around and changes feof into a simple
macro:
#define READ(f) ((attemptToRead(f), feof(f)) ? -1 : 0) /* change
attemptToRead to fgetc, fgets, etc.; -1 signals end-of-file */
He can then forget the design error, which is normalized deviance from
hell, since it forces the program to try to read SOMETHING just to
detect end-of-file. In any number of environments, this can change the
state of the underlying hardware in unpredictable ways. Even if the
hardware can signal to the runtime "don't you dare try to read from me,
I have nothing for you now", the C runtime HAS TO READ, which can easily
throw off auditing software or firmware by recording, in all cases, one
extra read.
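The wrapper idea above can be sketched as a small function rather than a macro (read_line is a name invented here for illustration; the post's attemptToRead stands for fgetc, fgets, and so on). The point is that one call performs the read AND the end-of-file test, so callers never touch feof() directly:

```c
#include <stdio.h>

/* Hedged sketch of the wrapper idea: bundle the read and the
 * end-of-file test into one call whose return value is the only
 * thing callers ever inspect. read_line is a hypothetical name,
 * not a standard function. */
static int read_line(FILE *fp, char *buf, size_t size)
{
    if (fgets(buf, (int)size, fp) == NULL)
        return -1;  /* end of file (or a read error) */
    return 0;       /* buf now holds the next line */
}
```

A caller then writes `while (read_line(fp, buf, sizeof buf) == 0) { ... }` and processes every line exactly once, with no separate feof() probe.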
Someone late at night then has to remember, while nodding, weak and
weary, that oh yeah, this is a C program.
How many ruined marriages, late nights, alcoholic benders, and
suicides has this caused? How many little, stunted, substance abusing
and creepy personalities has this caused? And why do people think that
knowing these coding horrors is anything but an expense of spirit in a
waste of shame?
Herb should have done a better job, but the error is so common among
good programmers and intelligent people as to cast aspersions on C
itself, and on standards committee members too cowardly and incompetent
to fix it.
In case you hadn't noticed, the only reason for using a programming
language is to make programming intellectually manageable. The blame
lies upon the designers of C, and you had NO STANDING in pretending to
be more qualified than Herb Schildt, and in thereby advancing your
career without suitable academic preparation. You should have been
taking evening classes in comp sci. You especially have NO STANDING in
light of your own frequent errors.
The intended meaning of "C: The Complete Nonsense" was "look at me, I
am more qualified than Herb Schildt". But your errors in this thread,
and your failure to post decent code, mean that you are far LESS
qualified.
You blame Herb for not publishing errata. New editions trump errata.
Sure, he should have fixed the bug: but what about the refusal of the
C standardizers to reform the numerous errors in C? I'd say this is
the greater crime, because again, the only reason for using a high
level language is to make problems intellectually manageable.
I've shown that this is possible in C with the replace() example this
week. My solution (and only my solution) showed aspects of the
problem which other solutions concealed. Had I been Herb, I would have
addressed the feof() issue head-on, but I'm not, and it may be that he
didn't want to confuse his readers.
But in light of how frequent feof() errors are (see the first Google
hit for "feof C" at
http://www.cplusplus.com/reference/clibrary/cstdio/feof/:
it warns about the behavior of this turkey and then has a code sample
with the classic error), it was in fact libel for you to try to make
this global and normalized deviance stick to Schildt.
You have TWICE in the past week posted trivial code snippets with
errors (%s and off by one strlen). You had NO STANDING ten years ago,
and you have NO STANDING now in talking about other people's errors,
especially because, with regard to McGraw Hill and here, you act like
an infant, refusing to work collaboratively and compassionately with
others while expecting us to have compassion for your oh-so-precious
and oh-so-unique personality disorders.
You need to WITHDRAW and APOLOGIZE FOR "C: The Complete Nonsense".
NOW.
This is your self-serving interpretation. However,
* Posting a solution that is supposed to replace %s and replaces
instead all percents and the character following them, which may cause
a memory fault,
* Posting a strlen replacement which managed to be off by one in a
few short lines of code,
* Claiming that "the 'Heap' is a DOS term",
* Consistently asking forgiveness for errors whilst uncharitably
withholding it from colleagues,
* Being ignorant of the use of concrete memory layouts of stacks and
heaps in classrooms to explain how runtimes work,
* Throwing away email from Apress colleagues unread, and calling them
"morons" and impugning their credibility and sanity online
together indicate a far more troubling personality.