So? Why not use canonical forms that are guaranteed to work
*everywhere*, regardless of whether the underlying platform is case-
insensitive or not?
This was the thing that frustrated me the most about his book and one
of the reasons why it wound up in the trash; a good chunk of his
examples wouldn't even compile for me since I was working on VAX/VMS,
and the VAX C compiler would inevitably choke on some DOS-ism. Why
Then the VAX was arguably wrong. And why are we discussing out-of-date
systems, and why is Richard Heathfield quoting Aho, Sethi et al. (the
1986 Dragon compiler book)? Are we in a time warp?
limit your audience by writing examples that only work for a subset of
possible platforms? Why not write examples that work on all systems
with conforming compilers? K&R did it. H&S did it. Why couldn't
No they didn't. No intelligent programmer (there are very few)
programs by moronically copying code snippets. Nothing really works,
without change, for all platforms, and this isn't Herb's doing: it's
what vendors do to make money. Even if the code doesn't have to be
changed for a particular target, anyone who expects not to have to
change it, and who in consequence doesn't do his homework, deserves
what he gets.
Herb?
Seebach correctly points out that using all caps may not always work.
One would think that information would be valuable.
Yes, it is. Even more valuable would have been an explanation of the
culture divide in computing between case insensitivity in IBM
mainframes and subsequently PCs, and Vaxen and many other systems
which were case-sensitive.
It's not Herb's fault that this difference exists. I do feel that for
best results, he needed a co-author who was expert on non-Microsoft
systems. But this lack was absolutely no reason for the campaign of
abuse to which he was subjected.
It is tribalism to confuse shibboleths such as these with knowledge,
where a "shibboleth" is used to recognized tribe members. It is a
barbarism.
Okay, let's talk about *knowledge*. Let's talk about another example
cited by Seebach:
Page 53
The following code:
/* Write 6 integers to a disk file. */
void put_rec(int rec[6], FILE *fp)
{
int len;
len = fwrite(rec, sizeof rec, 1, fp);
if (len != 1) printf("write error");
}
Is described as causing all of rec to be written, no matter what
size of array is being used.
This is *not* a trivial mistake. We're not arguing over style, we're
not arguing over platform idiosyncrasies. This is basic shit that
Schildt gets *wrong*; any student who read this was grossly misinformed
on the semantics of arrays as function parameters.
The shit is in the design of C.
Or how about this:
char s1[] = "hello ";
char s2[] = "there.";
Hello instant security hole!
It's actually a bug in C akin to the sprintf issue (sprintf is
intrinsically unsafe on all platforms). As I have said, a professional
doesn't accept the correctness of code in books, any more than a
mathematician expects all the statements in an elementary math
textbook to be literally true given what we now know: the explanations
in calculus, in particular, can be wildly off-base.
As a humanist and not an autistic twerp who worships abstractions and
machines because he can't get laid, I think Schildt's peace of mind
and reputation were MORE important than using a language in which it is
(almost by design) insanely difficult to write correct code. The
standards efforts had a chance to rectify this situation, and they
failed to do so owing to vendor greed.
Again, these aren't trivial mistakes, and they aren't rare in
Actually, they are. Seebach lists only twenty mistakes and says in "C:
The Complete Nonsense" that these are the known errors.
Schildt's book. God knows how many students and professionals
repeated those mistakes in their own code.
I said it in another thread, I believe there's a correlation between
the popularity of Schildt's books and the generally abysmal quality of
C code written in the '80s and '90s. Schildt caused real harm to the
industry, and your quixotic defense of his honor is puzzling to say
the least. Unless you are Herb himself, it simply doesn't make sense
why someone would waste their time defending the indefensible.
Coders who write crap code are in general aliterate or autistic and
either don't read at all, or read standards manuals exclusively.
People who actually can read do so critically. As in the case of my
own experience in 1970 (getting a book about the IBM 7094 in a class
that used the 1401) the "errors" are a learning experience.
I today teach classes in *critical* reading of texts such as Joseph
Conrad. You don't understand until you've found aporias and errors in
a text so arguably Schildt does his readers an unintentional service
with his errors, which as I have said, are fewer in number than
claimed.
In an ideal world computer books would contain "nothing but the
truth". But in the real world, programmers, who proclaim their
dedication to truth, actually are so completely dependent on their
jobs and health insurance that when one reads THEIR code, it is full
of errors and bad practice in nearly all cases. This is because, as
paraprofessionals, they must work at a pace consistently set above the
rational and humane, and the result is real code that really sucks.
Furthermore, I see where you don't have the balls to post a fix to the
strcat. Let me try:
char s1[] = "hello ";
char s2[] = "there.";
No, that doesn't work, does it?
OK, how about
s1[6] = '\0'; p = strcat(s1, s2);
That might work (I haven't tested it.)
The point is that C IS A JOKE especially in string handling. As soon
as any competent programmer starts using C (whether Bjarne Stroustrup
in the 1970s or I in the early 1990s) he is appalled by it and starts
using its macro and function facility to craft his own C.
Stop blaming, and scapegoating, Schildt for your own poor taste in
programming languages.