And you decided to do so? Ok, whatever.
Bite me, Snidely.
What is misplaced about criticizing a misleading title?
It is in line with the ordinary usage of hard-working programmers, who
by a USEFUL reference mean a book which covers all the issues, but in a
usable way. The Standards document is a "reference" in a different and
legalistic sense. Kernighan and Ritchie is not a reference.
A beginner also needs an explanation that's *correct*. Many of
Schildt's errors are just errors; there would be no pedagogical harm
in correcting them.
No, a brilliant teacher can make errors in detail. In fact, the best
often admit doubt and error. Einstein in particular stumbled in
math, according to his 2007 biographer Walter Isaacson.
Sitting in front of, and auditing, a teacher who is always right is
neither a necessary nor a sufficient condition for learning.
Confucius asked more questions than he gave answers.
You are correct, however, in saying that there would be no harm in
correcting Schildt. But CTCN fails to do this in relation to its
implication that there are "hundreds" (n > 100) or "dozens" (n >
approximately 48) of errors. And CTCN, not CTCR, is the problem here.
Furthermore, to the ordinary, hard-working programmer, C, then and now,
constitutes a language with many different dialects. A Platonic idea
of absolute truth as regards such a gelatinous artifact as C does not
exist. Instead, everyday pragmatism (is the book useful?) applies, and
this is ordinarily verified by what the free market says. In the free
market, the book went to four editions. Case closed.
[...]
The very first error in "C the complete nonsense" is:
In general, negative numbers are represented using the two's
complement approach...
This is not a C feature. It is a common implementation, but it is
specifically not required. (Binary is, but one's complement is not
unheard of.)
This is spurious; Schildt qualifies the claim with "in general".
The phrase "in general" is ambiguous; it can mean either "usually" or
"always". Assuming that "in general" was meant as "usually", he could
(and IMHO should) have mentioned that other representations exist.
That's just wrong. The only use of "general" to mean "always" is in
formal logic, which is signally misleading as a guide to English usage
(as in the case of the false belief that in ordinary language anything
can be "clear but wrong"). A logical "generalization" is a statement
symbolized (x)[P(x)], which is true for all x.
In ordinary English usage, however, "in general" means "usually".
And it is therefore ambiguous, and therefore IMHO should be avoided in a
book that purports to teach C.
Sez who? My experience in the corporate world is that newbie C
programmers, especially in financial firms, lack a computer science
background, save in "rocket science" on Wall Street; not, in my
experience, in credit scoring on LaSalle Street. They need to learn
important CS distinctions before language law.
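
(Coming back to the technical substance of that first point: here is a
minimal sketch, not taken from either book and assuming nothing beyond
C99, which reports which of the three permitted signed-integer
representations an implementation actually uses. The test works because
the two low-order bits of -1 come out differently under each scheme.)

#include <stdio.h>

int main(void)
{
    /* C99 6.2.6.2 allows three representations for negative integers:
       two's complement, ones' complement, and sign-and-magnitude.
       The two low-order bits of -1 differ under each, which lets a
       program report the one in use. */
    int minus_one = -1;

    if ((minus_one & 3) == 3)
        puts("two's complement");
    else if ((minus_one & 3) == 2)
        puts("ones' complement");
    else if ((minus_one & 3) == 1)
        puts("sign and magnitude");
    return 0;
}

On every compiler the average reader is likely to meet, this prints
"two's complement", which is presumably what Schildt's "in general" was
gesturing at; the point of contention is only that the Standard does
not require it.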
So why did you stop after just two errors? If it was inappropriate
for Seebs to list just a subset of the errors rather than covering the
entire book, is it fair (to your readers, if not to Seebs, Schildt,
or Nilges) to criticize C:TCN based on just the first two errors?
Blow me. I then treated ALL of CTCN, point by point, and as usual
you're lying here.
In a quick reading, it appears to me that the first two listed errors
happen to be the least substantial. Keep reading. The third error
is a use of "%f" to print an expression of type size_t (followed by
a use of "%d" for the same purpose, but that's not *quite* as bad
an error). The fourth is an application of sizeof to a parameter
of type int[6], which is really of type int*. These are just plain
wrong, and they're demonstrations that Schildt didn't even try his
code before publishing it. The printf format error *might* be a
typo, perhaps one introduced in typesetting, but the sizeof error
is just a fundamental conceptual misunderstanding on Schildt's part.
You don't know this, and there is no confirmation of your theory in
the noncode text.
Furthermore, the issue isn't Schildt: it's Seebach. He's plainly
incompetent as a C programmer according to the evidence he's posted
and CTCN is inadequate.
And he claims to be teaching C.
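
(For reference, here is what those third and fourth errors look like in
isolation: a minimal sketch, not Schildt's text, assuming a C99
compiler so that "%zu" is available. The wrong conversion for size_t
and the array parameter are the two things CTCN complains about.)

#include <stdio.h>

/* A parameter declared int a[6] is adjusted to int *a, so inside the
   function sizeof a is the size of a pointer, not 6 * sizeof(int). */
static void f(int a[6])
{
    printf("sizeof a inside f(): %zu\n", sizeof a);
}

int main(void)
{
    int a[6] = {1, 2, 3, 4, 5, 6};

    /* sizeof yields a size_t, whose conversion is "%zu"; "%f" expects
       a double and "%d" an int, so both are wrong here. */
    printf("sizeof a in main(): %zu\n", sizeof a);
    printf("element count: %zu\n", sizeof a / sizeof a[0]);

    f(a);
    return 0;
}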
As long as I'm posting I'll mention that
The "heap" is a DOS term...
is a perfectly correct statement. It doesn't necessarily imply
that it's *only* a DOS term. It also happens to be a Unix term,
and a Windows term, and a Symbian term, and so forth (and yes,
an updated version of the web page should probably clarify that).
The point is that it isn't a C term.
Again, misuse of ill-understood and second-hand concepts from formal
logic to criticise ordinary English shows that you understand neither
formal logic nor English, which is a common failure of incompetent
programmers. Just as "clear but wrong" is a solecism except when
applied to relatively uninteresting analytic falsehoods expressed in
formal language, and just as you confused the English phrase "in
general" with universal quantification in formal logic, here you rely
on the overgeneral, logical fact that the English copular verb "is"
can mean logical identity OR subset inclusion to buttfuck "is" into a
true interpretation.
However, a literate person knows how to disambiguate "is", as do
literate OO programmers, who use "is-a" to show the subset relation in
inheritance, as opposed to "has-a" to show a reference to an object,
and "==" in C Sharp and Java to show identity.
The rule is that in "a is b", if a names a class with n > 1 members
(or, as in "the 'heap' is a DOS term", a unit set coextensive with its
single member), and b evidently names a class with m > n members, then
what is being stated, formally, is that a is either a member or a
subset of b.
Whereas when a and b are at the same level, both singular things or
unit sets, or sets of roughly equal cardinality, "a is b" asserts
"a == b".
Since there are many "DOS terms" whereas "the heap" is one "term",
Seebach was asserting, wrongly, that the word "heap" is used only in
reference to MS-DOS.
You barbarize language to be right.
The only thing that you say that's true is that "the 'heap' is not a C
term". No, it isn't. But it is impossible, an ignotus per ignotum
(explaining the unknown by the unknown), to "teach" C using only C
terms. As I have said, Herb was describing an instance of the runtime
in the same way a high school teacher illustrates the Pythagorean
theorem using chalk.
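
(For what it's worth, the terminological dispute fits in a dozen lines
of portable C: a minimal sketch, from neither book, of the allocation
everyone is arguing about. So far as I can tell the Standard describes
what malloc returns as storage with "allocated" duration and never
calls it a "heap"; where the bytes actually live is the
implementation's business, on DOS, Unix, or anything else.)

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Storage with "allocated" duration, in the Standard's vocabulary;
       "the heap" is an informal, implementation-level name for it. */
    double *p = malloc(10 * sizeof *p);
    if (p == NULL) {
        fputs("allocation failed\n", stderr);
        return EXIT_FAILURE;
    }
    p[0] = 3.14;
    printf("%f\n", p[0]);
    free(p);
    return 0;
}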
(Nilges doesn't seem to understand -- or maybe he does -- that the
more he keeps pushing his agenda, the more attention will be brought
to Schildt's errors.)
That is unfortunately true, and a concern of mine. I would be
delighted if you downloaded Microsoft .Net Visual Basic Express and my
compiler for Quickbasic written for my book "Build Your Own .Net
Language and Compiler", and analyzed my code for errors since this
would increase my sales and make me money..."there is no such thing as
bad publicity".
But the problem at this time is not Schildt, it is Seebach, who has
fraudulently based his career in some measure on a document that is
far more flawed than Schildt's book.
The ball is in your court, and that of Seebach. You can end this,
including the troubling information that has emerged about Seebach's
competence, by encouraging Peter to simply withdraw the document and
substitute a blank page. If he does so, I will not post on this
issue. At this point, to spare Schildt further anguish, I waive my
request for an apology.
I disagree completely. "C: The Complete Nonsense" is a valuable
warning to those who might otherwise be misled by reading Schildt's
books. It could stand some revision, particularly an update to the
latest edition of the book.
To believe that anyone can be "misled" by a book is Fascism. I was
assigned Sherman's "Programming and Coding for Digital Computers" in
my first computer science class. It described the IBM 7094 computer, a
fixed-word-length, single-address mainframe with a 36-bit word. The
computer the class used was the IBM 1401, a "variable word length",
two-address mainframe. I failed to attend the lecture in which the
professor explained this, sharing some of Peter's intellectual vanity
when I was 20.
Because Sherman provided no clue on 1401 machine-language programming
for the first assignment, I bought a separate McGraw-Hill book on the
1401 and a 1401 reference manual from the IBM shop in the IBM building
that used to be on the Chicago River.
However, despite the fact that Sherman's book was globally wrong in
relation to our needs, I read it from cover to cover several times,
and it was valuable, since it, and not the McGraw-Hill book or the IBM
manual, described how one computer could simulate another. It also
explained floating point, and a number of other computer science topics.
The worst books on "programming" (where "programming" books bear an
ambiguous and rather louche relationship to computer science) I ever
saw were written by the manager in the computer center at my uni. They
concerned a forgotten mini-mainframe, the IBM System/3 of 1970, which
was intended for small and medium businesses so IBM could stop
supporting the extremely popular 1401.
This manager was a nice guy, if stuffy and pompous to a fault; a
former IBMer, he wore blue suits and pocket hankies. Once he got a
load of my long hair and antiwar armband, he was hostile, until he
happened to pick up a listing of my assembler code and liked the fact
that beside each line of code I had a literate comment, and that the
code was great.
This was because we were both of Irish Catholic educated background
and were both verbose...he perhaps to a fault, since his books were
absurdly stuffy and pompous in tone. Many people think there's nothing
more stuffy and pompous than my own prose, but he broke the mold.
After he discovered my listing he was much kinder, but he still advised
me to get a haircut.
His books were "bad" in your sense. But intelligent people never use
the mental model in which one can be misled by a book. That in fact
was the philosophy behind the Catholic Church's "Index of Prohibited
Books", which is no more. A grown-up knows that it is his
responsibility to evaluate all books fairly. A grown-up programmer
tests code snippets, and treats the time taken to correct the
creator's mistakes as valuable time for learning.
Look at
http://www.cvm.qc.ca/gconti/905/BABEL/Index Librorum Prohibitorum-1948.htm
This is the Catholic Church's "Index of Prohibited Books" in 1948.
Here are entries for Spinoza:
Spinoza, Benedictus de Tractatus theologico-politicus, continens
dissertationes aliquot, quibus ostenditur libertatem philosophandi non
tantum salva pietate et reipublicae pace posse concedi, sed eandem
nisi cum pace reipublicae ipsaque pietate tolli non posse. 1679
Spinoza, Benedictus de "Opera posthuma." 1690
Imagine what it would mean for someone to find himself on this index,
and how he would feel.
I'm afraid that for you, technology has been marked off as a laager or
zone of control in which you don't have to mature intellectually, and
can practice out-of-date habits of intolerance. And I'm afraid that,
given your approval of a buggy, off-by-one strlen last month, this
intolerance does not imply competence.