In the Matter of Herb Schildt: a Detailed Analysis of "C: The Complete Nonsense"


spinoza1111

This matter has become very time consuming, perhaps for all concerned.
Therefore, unilaterally, I'm going to declare a temporary "cease fire"
in order for people to have time to study my document and let cooler
heads prevail.

I will stop posting and reading until Sunday 11 April China time.

I still propose that Peter Seebach undertake to remove "C: the
Complete Nonsense" based on its lack of valid and sufficient content
in proportion to the damage it has done, and to replace it with an
apology for the considerable misunderstanding it has created. I waive
his having to apologize to me for the considerable harm he has done
me.

If Seebach undertakes to do both before 11 April, I would appreciate
his notifying me of this decision and undertaking by email to
(e-mail address removed).

If this is not done, I will have to continue this matter, with
escalation to the next level.

Edward G. Nilges

Countermanding this, because otherwise Malcolm and Navia will
unnecessarily fight a separate battle, alone. And this IS a battle,
albeit metaphorically. It started to be a war when Peter attacked
Schildt's first edition, probably without reading it. It's not about
Schildt; it is about a whole attitude of incompetent programmers, one
of dull hostility towards language.
 

spinoza1111

[snip]

printf("%f", sizeof f);
printf("%d", sizeof(int));
Seebach: “Clearly wrong; sizeof is not a double or float. It is also
not an int; it is an unsigned integral type, thus, one of unsigned
char, unsigned short, unsigned int, or unsigned long.”
“The only safe way to do this is: printf(“%lu”, (unsigned long)
sizeof(int)); while this is larger, a clear explanation of why it is
required will go a long way towards helping people understand C.”
Although I do not know why Herb used %f and %d format codes, he did
know, as Seebach seems not to, that all ints are floats and all floats
are doubles in well-structured languages.

It's not a matter of whether ints are subsets of floats. It's a matter
of how the printf() function expects conversion specifiers and the
types of the corresponding arguments to match.  If you tell printf()
you're going to pass it a value of one type but actually pass a value
of a different type (with a different size and/or representation),
then you're not going to get the correct output.
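
A minimal sketch of the point (the variable name is invented, and the
"%zu" line assumes a C99-or-later library):

#include <stdio.h>

int main(void)
{
    float f = 0.0f;

    /* Wrong: sizeof yields size_t, an unsigned integer type, so "%f"
       (which expects a double) and "%d" (which expects an int) both
       mismatch the argument -- the behavior is undefined. */
    /* printf("%f", sizeof f);    */
    /* printf("%d", sizeof(int)); */

    /* Portable under C89/C90: convert to a type with a known specifier. */
    printf("%lu\n", (unsigned long) sizeof f);
    printf("%lu\n", (unsigned long) sizeof(int));

    /* C99 and later provide "%zu", which matches size_t directly. */
    printf("%zu\n", sizeof(int));

    return 0;
}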
Because at the time and
even, to an extent, now, C was almost as diverse as the languages of
China, Herb used the educated programmer’s maxim, which is to first
code as if the compiler writers and language designers knew at least
as much as the intelligent programmer, and fix problems later.

Don't waste the energy trying to defend this snippet; you'll hurt
yourself.  This kind of error is just plain sloppiness/carelessness on
Schildt's part.  Even at their most fractured (which was nowhere near
what you suggest), the various implementations of C defined the same
interface for printf().  printf("%f", sizeof f) wasn't going to work
*anywhere*.

This was an honest-to-God mistake on Schildt's part.  Deal with it and
move on.

Yes, it was, but this isn't about Schildt. It is about CTCN which has
a higher density of errors than Schildt's large book, errors which
aren't typos or errata but which betray:

* Lack of education in at least English and computer science ("clear
but wrong" is a solecism; the "heap" is not a DOS term)

* A dull and savage hostility towards the written word

Peter, your errors in CTCN approach unity.
 

spinoza1111

FSVO "ASCII" (i.e., not the 7-bit version).

At this point, blm, you are grasping at straws.
In the Matter of Herb Schildt: an Analysis of "C: the Complete
Nonsense"
Let's now deconstruct Peter Seebach's document "C: the Complete
Nonsense", an attack on Herb Schildt's first edition of "C: the
Complete Reference" which, in becoming the sole source of subsequent
claims that Schildt wrote "bad books", unfairly damaged his good
name. Let's examine it, line by line.

Others have commented on various parts of the critique.  I'll try to
address only points that I don't think have been made already.

[ snip ]

printf("%f", sizeof f);
printf("%d", sizeof(int));
Seebach: "Clearly wrong; sizeof is not a double or float. It is also
not an int; it is an unsigned integral type, thus, one of unsigned
char, unsigned short, unsigned int, or unsigned long."
"The only safe way to do this is: printf("%lu", (unsigned long)
sizeof(int)); while this is larger, a clear explanation of why it is
required will go a long way towards helping people understand C."
Although I do not know why Herb used %f and %d format codes, he did
know, as Seebach seems not to, that all ints are floats and all floats
are doubles in well-structured languages.

On many systems, of course, integers and floating-point values
are represented differently.  In that sense, ints are *not*
floats, and even a claim that any int can be represented as,
or converted to, a float depends on the relative sizes of the
two types; if they are the same (as for example they are in Java
[*]), there will be some values that can be exactly represented
as ints but not as floats.
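
A small illustration of that last point (the specific value is mine,
and it assumes a 32-bit int with IEEE-754 float and double, which is
common but not required by the C standard):

#include <stdio.h>

int main(void)
{
    int i = 16777217;               /* 2^24 + 1, fits in a 32-bit int */
    float  as_float  = (float) i;   /* float has only 24 significand bits */
    double as_double = (double) i;  /* double's 53 bits hold any 32-bit int exactly */

    printf("int:    %d\n", i);
    printf("float:  %.1f\n", as_float);   /* typically 16777216.0 -- rounded */
    printf("double: %.1f\n", as_double);  /* 16777217.0 -- exact */
    return 0;
}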

That is correct. However, a competent programmer will treat the
mathematical relationship of integer to real number as having
precedence over implementations where there can be integer RESULTS
(not, as you claim, integers) that cannot be represented as floats.

Also, I am afraid that a "competent" programmer in America means
someone like Stallman, Ted Nelson, or myself. I do not class myself
with such geniuses except in one regard. This is that each of us
learned mathematics before we had access to a computer, and this
taught the absolute lexical priority of mathematics (and human needs)
over the limitations of computers, including bit precision.

This is possibly why, today, most American programmers are incompetent
and are no longer permitted, in many organizations, to actually code
real software, and must instead content themselves, as Peter seems to,
with writing unneeded scripts and inappropriately making modifications
to existing software out of vanity.

Whereas many seasoned, even young, Indian developers had no access to
computers until recently.

In this case, Herb seems to have remembered something more important
than that which was forgotten. Sure, the example is wrong, but de
minimis.
[*] Whether Java meets your criteria for "well-structured language" --
<shrug>

[ snip ]

Schildt: "You may also declare main() as void if it does not return a
value."
Seebach: "Specifically untrue. ANSI mandates two declarations for
main, and says that main may have declarations compatible with those.
Both return int."
C was not standardized at the time this book was written; it existed
in several different dialects. In fact, I discovered (on behalf, as it
happens, of John "A Beautiful Mind" Nash) that the Microsoft compiler,
which many of Schildt's readers were using, is nonstandard at least as
regards the evaluation of compile-time constant expressions. While it
has become a Shibboleth or Secret Handshake among non-Microsoft
Illuminati that you must declare main as int, it's actually better
style to make it void unless you have an important message, such as
"up yours!" to the OS.
But this shibboleth has become an article of faith amongst the anti-
Microsoft crowd who compensate for the meaninglessness of their lives
and general incompetence by fantasizing that they are Special, and the
OS gives a hoot.

It may be that all(?) of the ways of invoking a program in a
Windows environment ignore its return value.  This is not the
case in all operating systems:

Command shells for UNIX-like systems may (usually do?) provide
access to the return value via a special parameter (e.g., $? in POSIX
shells), and some
shell scripts make use of it.  Other mechanisms for invoking
programs (e.g., fork/exec*/waitpid) also provide access to the
return code.

And didn't JCL for the venerable IBM OS/VS operating system(s) use
the value returned by a called program to control execution flow?
That's how I remember it anyway.

To me it seems like good practice to write code that complies with
the requirements of as many environments as possible.  <shrug>
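
As a minimal sketch of what that looks like in practice (the program
and the file check are invented for illustration; the parent-side
details in the comment assume a POSIX system):

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical utility: exits with success if its argument can be
   opened, failure otherwise.  A POSIX shell sees the value in $?
   (e.g., "./checkfile data.txt && echo ok"), a parent that used
   fork/exec can recover it with waitpid() and WEXITSTATUS(), and
   JCL-style environments have their own equivalents. */
int main(int argc, char *argv[])
{
    FILE *fp;

    if (argc < 2) {
        fprintf(stderr, "usage: %s FILE\n", argv[0]);
        return EXIT_FAILURE;    /* nonzero: tell the caller we failed */
    }
    fp = fopen(argv[1], "r");
    if (fp == NULL) {
        fprintf(stderr, "cannot open %s\n", argv[1]);
        return EXIT_FAILURE;
    }
    fclose(fp);
    return EXIT_SUCCESS;        /* zero: success */
}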

This is the best explanation I've seen of the main() quibble, but it
remains a quibble.

It is true that it would be great to expect that ANY C program with a
main() (e.g., most) would return something that could be trusted by ANY
shell script to mean failure or success...in the unix and linux world,
based as that world is on plugging together reusable software
components.

However, not all of us regard unix-linux as the zenith of
accomplishment. In fact, Jaron "You Are Not a Gadget" Lanier is quite
amusing on this subject.

He points out two related facts based on wide experience, wider and
deeper than anyone's here including my own:

* Since 1990 all Pop music is retro. There's a Sixties "sound", a
Seventies "sound" (get down!) and even an Eighties "sound", but today
to have a distinct "sound" is "lame" and "sucks" because as here,
people are afraid to try to live outside what Lanier calls the "hive
mind".

* The two most notable achievements of the coder kiddies of this same
era? Linux and wikipedia. And what is Linux? Uh, a rewrite of unix
based on a 1989 textbook on something called minix, which I used to
own. And what is Wikipedia? An encyclopedia which is trustworthy only
on hard science and which is based on the theft of intellectual
property of virtual and unwitting slaves, edited by convenience store
clerks, which may be tax fraud.

Therefore, I don't think that a language such as C should be designed
around the needs of an out of date OS. Furthermore, it's bad practice
to assume without good reason that the main() return means anything at
all.
[ snip ]
Hitler was not anonymous; but his followers were, in many cases. And
Mike Godwin is wrong; the probability of comparison to Hitler in
online discussions "converges to unity" not because people are being
shrill and foolish, but because Hitler is our inner troll, as Lanier
calls it. He's the face in the crowd in Munich in August 1914 baying
for war who yearns to be on the podium, and non-anonymous.
Actually I should have been able to deduce Fascism from the memory of
my childhood.

Your childhood?  my, you must be older than I thought ....

Sorry. I italicised the Adorno quote at wordpress as you now know. But
my ex wife calls me an "old soul", and Adorno was an early instance of
the former academic who gets a white collar job, since Hitler
destroyed his academic career and he had to join a "radio research
project" to survive. His statements on Fascism are eerily appropriate,
because Fascism arises in the white collar lower middle class.
[ snip ]
occurring?

but you remembered to double the "r" this time ....

Do me the courtesy of not patronizing me, dear blm. You are nothing
more than a low level computer instructor as I was, but your literacy
has the usual low upper bound. You are posting dishonestly as is
Julienne because you're afraid to take an unpopular view in a
newsgroup which turns upon and brutalizes women posters when they
disagree with the normalized deviance of the ng.
[ snip ]
embarrassed

and here

[ snip ]
In Fascism, the nightmare of childhood has
realized itself.
Theodor Wiesengrund Adorno, Minima Moralia, 1948

*OH!*  Those weren't your words, were they?  Well, perhaps there
were quotation marks, or indentation, or something identifying this
text as a quotation in the version of this review posted at wordpress.

Also, do me the courtesy of focusing more carefully on what Adorno
says.
 

Seebs

We have evidence that he has criticised the third edition (and with
ample justification). What evidence do you have that he has "attacked"
the first edition?

The page is 100% unambiguous as to which edition is attacked. Admittedly
the evidence for this was not embedded intentionally, but!

The example cited as page 53 on the existing page CANNOT be from any
other edition.

* In the 2nd edition, the test is written using "<>" rather than "!=".
* In the 4th edition, the sizeof() operator is used differently.

What probably happened is that Nilges saw an observation that the page was
based on the previous edition, and then his natural tendency to reimagine
events to make other people seem worse, and himself seem better, took over;
after a couple of months, he'd persuaded himself that it was actually the
first edition, rather than the third, and no amount of correction, even with
100% unambiguous evidence, will change his mind. (Consider how long it took
him to sort of almost pretend to grant that, in fact, he'd been mistaken*
all along when claiming that I was not a professional programmer, but some
sort of "clerk".)

-s
[*] Normally, when someone continually reasserts a disproven claim
despite having demonstrated that he's seen the disproofs, we call it
"lying", but Nilges gives every sign of being sufficiently entrapped
by his layers of self-deception that he is, in fact, being completely
sincere in claiming these ridiculous things. As such, it is probably
not supportable to call it "lying", it's just a sort of devoted and
persistent refusal to stop being wrong. Totally different, I'm sure.
 

blmblm

At this point, blm, you are grasping at straws.

No, just nitpicking. I do that. <shrug>

[ snip ]
printf("%f", sizeof f);
printf("%d", sizeof(int));
Seebach: "Clearly wrong; sizeof is not a double or float. It is also
not an int; it is an unsigned integral type, thus, one of unsigned
char, unsigned short, unsigned int, or unsigned long."
"The only safe way to do this is: printf("%lu", (unsigned long)
sizeof(int)); while this is larger, a clear explanation of why it is
required will go a long way towards helping people understand C."
Although I do not know why Herb used %f and %d format codes, he did
know, as Seebach seems not to, that all ints are floats and all floats
are doubles in well-structured languages.

On many systems, of course, integers and floating-point values
are represented differently. In that sense, ints are *not*
floats, and even a claim that any int can be represented as,
or converted to, a float depends on the relative sizes of the
two types; if they are the same (as for example they are in Java
[*]), there will be some values that can be exactly represented
as ints but not as floats.

That is correct. However, a competent programmer will treat the
mathematical relationship of integer to real number as having
precedence over implementations where there can be integer RESULTS
(not, as you claim, integers) that cannot be represented as floats.

I have no idea what distinction you're trying to make here (between
"integer RESULTS" and "integers"). Of course I'm aware that in the
world of mathematics the integers are a subset of the real numbers.
However, when I write "int" I do not mean an integer as it exists
in the world of mathematics, but an instance of the C data type
"int", and when I write "float" I do not mean a real number,
but an instance of the C data type "float".

Competent programmers will be aware of the ways in which numeric
types in programming languages differ from numbers as they exist
in the world of mathematics. There *may* be some programming
languages in which there are no important differences, but there
are also many programming languages in which the differences
are significant. For example, beginners are often startled by
the behavior of the following C code:

float f;
for (f = 0.0; f != 1.0; f += 0.1) {
printf("%g\n", f);
}

Similar code can be written in Java, and it behaves similarly.
Also, I am afraid that a "competent" programmer in America means
someone like Stallman, Ted Nelson, or myself. I do not class myself
with such geniuses except in one regard. This is that each of us
learned mathematics before we had access to a computer, and this
taught the absolute lexical priority of mathematics (and human needs)
over the limitations of computers, including bit precision.

I'm not sure I understand what you mean here, but ignoring the very
real (ha) differences between numbers as they exist in mathematics
and numbers as they are represented in computer hardware does
not strike me as consistent with being a competent programmer.

(For what it's worth, I also learned most of what I know about
mathematics before having access to a computer. I'm not sure
whether I'd be a better programmer, or a worse one, if that had
not been the case.)

[ snip ]

[ snip ]
This is the best explanation I've seen of the main() quibble, but it
remains a quibble.

It is true that it would be great to expect that ANY C program with a
main() (e.g., most) would return something that could be trusted by ANY
shell script to mean failure or success...in the unix and linux world,
based as that world is on plugging together reusable software
components.

However, not all of us regard unix-linux as the zenith of
accomplishment. In fact, Jaron "You Are Not a Gadget" Lanier is quite
amusing on this subject.

He points out two related facts based on wide experience, wider and
deeper than anyone's here including my own:

* Since 1990 all Pop music is retro. There's a Sixties "sound", a
Seventies "sound" (get down!) and even an Eighties "sound", but today
to have a distinct "sound" is "lame" and "sucks" because as here,
people are afraid to try to live outside what Lanier calls the "hive
mind".

* The two most notable achievements of the coder kiddies of this same
era? Linux and wikipedia. And what is Linux? Uh, a rewrite of unix
based on a 1989 textbook on something called minix, which I used to
own. And what is Wikipedia? An encyclopedia which is trustworthy only
on hard science and which is based on the theft of intellectual
property of virtual and unwitting slaves, edited by convenience store
clerks, which may be tax fraud.

Are you quoting here, or paraphrasing? Not that it matters much,
I suppose.
Therefore, I don't think that a language such as C should be designed
around the needs of an out of date OS.

Whatever you think of the design of C, or of the design of UNIX (and I
would not claim that either represents the last word in its domain),
it seems to me that a book that claims to be a reference on C should
describe the language as it actually exists, including whatever the
standard says about the return value from main().
Furthermore, it's bad practice
to assume without good reason that the main() return means anything at
all.

Well, I suppose it's bad practice in general to assume that all programs
were written by competent programmers. Other than that, what's your
point?

[ snip ]
Sorry. I italicised the Adorno quote at wordpress as you now know.

Only by your saying so here. I have not followed the link to the
version of your critique posted there.

[ snip ]
[ snip ]
occurring?

but you remembered to double the "r" this time ....

Do me the courtesy of not patronizing me, dear blm.

Not so much patronizing [*] as describing how I first started to
suspect that this might not be your writing.

[*] Snide, maybe. Fair cop.
You are nothing
more than a low level computer instructor as I was,

Am I? I've been deliberately cagy about my actual job title [*],
so why assume that it's the lowest one that would allow me to
say truthfully that I teach undergraduate CS classes?

[*] For reasons that seem good to me, some of which can perhaps
be inferred from my signature below, particularly the part about
not speaking for my employers.
but your literacy
has the usual low upper bound. You are posting dishonestly as is
Julienne because you're afraid to take an unpopular view in a
newsgroup which turns upon and brutalizes women posters when they
disagree with the normalized deviance of the ng.

Does this have something to do with the current discussion? (Yes,
I'm being a bit snide here too.)

As for being afraid -- I have not observed women posters being
"brutalized" in this newsgroup [*], but I *have* observed you
proposing to escalate disagreements here beyond the bounds of Usenet.

[*] Admittedly I don't read all posts, so I suppose I could have
missed occurrences of this supposed phenomenon.
Also, do me the courtesy of focusing more carefully on what Adorno
says.

My point, insofar as I had one, was that I was rather surprised that
someone who has been quick to take others to task for not making it
clear which words were their own and which were quoted would make the
same mistake himself -- even if it was an oversight.

That I don't read carefully all quoted material -- eh. That's not
why I spend some of my 168 hours a week reading posts here. <shrug>
 

spinoza1111

snipola

I have no idea what distinction you're trying to make here (between
"integer RESULTS" and "integers").  Of course I'm aware that in the
world of mathematics the integers are a subset of the real numbers.
However, when I write "int" I do not mean an integer as it exists
in the world of mathematics, but an instance of the C data type
"int", and when I write "float" I do not mean a real number,
but an instance of the C data type "float".

Then you're not a very good programmer, for your code is not connected
with reality. You need to code what you mean, and what you mean has to
be something human in a way mathematics is, and computers are not. If
you don't do this, then your code probably resembles Seebach's.

Even within coding, in cleaned up systems, the integers are indeed a
subset of the reals in the sense that all ints in C Sharp, as a matter
of a REAL standard (ECMA), can be converted without error to doubles.
And note that Microsoft had to go outside ANSI to get a usable and
trustworthy standard for C Sharp, because ANSI has been so privatized
in a way that the Euro standard has not.

Truly competent programmers WANT to program in the best languages
available, and do not think using an inferior language is a mark of
genius. When I used machine language, I wanted an assembler. When I
got assembler (later on in my first computer science class) I debugged
a Fortran compiler in machine language. When I got Fortran and later
Cobol, I wanted Pascal. When I got Pascal, I wanted C in order to
short-circuit. And when I got C I realized I needed OO, so I got OO
VB .Net. When I got that, I wanted a C like syntax and so I got C
Sharp and Java. In other words, I don't sit around with my head up my
ass.
Competent programmers will be aware of the ways in which numeric
types in programming languages differ from numbers as they exist
in the world of mathematics.  There *may* be some programming
languages in which there are no important differences, but there
are also many programming languages in which the differences
are significant.  For example, beginners are often startled by
the behavior of the following C code:

float f;
for (f = 0.0; f != 1.0; f += 0.1) {
  printf("%g\n", f);

}

Similar code can be written in Java, and it behaves similarly.

We all know this. But you celebrate it as if a mistake or here, a
necessary (?) limitation in bit length is human knowledge. No, while
we learn from our mistakes and the limitations of our tools, those
aporias are not first-class knowledge. They are cautionary tales.
I'm not sure I understand what you mean here, but ignoring the very
real (ha) differences between numbers as they exist in mathematics
and numbers as they are represented in computer hardware does
not strike me as consistent with being a competent programmer.

I did not say we ignore it. Instead, we separate concerns; read
Dijkstra. He was actually told much the same thing, that it was a
fault of him not to understand the time and cost limitations on the
IBM 1620 designers that caused them to fail to support more than one
level of subroutine call, as if (again) mistakes and limitations are
knowledge.

But: Kant (who Dijkstra probably had to learn in school) addressed
this in Kant's essay "On the Old Saw". Science already accounts in
principle for the limitations of the real world. Kant, in writing Zum
Ewigen Frieden, his tract on how to bring about universal peace (which
was one of the sources of the EU constitution), had of necessity to
take the "practical limitations" of diplomacy into account, but his
enemies claimed, falsely, that he had not.

But this implies something that neither Kant nor Dijkstra ever
believed, but in my experience is a self-serving article of faith
amongst MBA types. This is the false proposition that once "science"
as such is finished, there can be an additional layer of monkeyshines
and rules of thumb concerning its application...a layer which if
investigated is almost always sheer fraud, exhibit A being the so-
called "rocket science" of Wall Street, which has turned out since
2008 to be nothing more than a lot of bad code which sucks and which
has destroyed what was left of the American middle class.

"We must store the return address of the subroutine in a fixed
register" on the IBM 1620, or "all things must be done with line by
line commands" in unix are what Kant called the "saws" of practical
men which want the authority and power of Science but simply don't
deserve it. The first "we must" is actually a self-referential, and
self-serving, assertion of the actual 1620 designers who today as
codgers are being unjustly celebrated at the Computer Museum in
California in such a way that we cannot treat their mistakes as
cautionary tales but must use them as precedent for further mistakes
(such as over use of "global variables").

In other fields, this treatment of saws as an additional and
controlling layer of science gets people killed. Lord David Owen used
saws in the failed "cantonizing" of Bosnia-Hercegovina in 1992 and the
result was the Srebrenica massacre. In the Columbia disaster, the saw
that "if it's on our power points it is under control" killed the
first and so far only Indian woman astronaut. Donald Rumsfeld used
saws in waging the disastrous Iraq war; when confronted with the
chaos in Baghdad in 2003, chaos that US Marines were willing and able
to end if martial law was declared (as it should have been according
to the Geneva convention, part of the "science" of international
relations), Rumsfeld came up with a truly creepy saw, one that made
the skin crawl. "Shit happens".
(For what it's worth, I also learned most of what I know about
mathematics before having access to a computer.  I'm not sure
whether I'd be a better programmer, or a worse one, if that had
not been the case.)

Pity you're a counterexample, then.

snip yourself
Are you quoting here, or paraphrasing?  Not that it matters much,
I suppose.

Paraphrasing, dear heart.
Whatever you think of the design of C, or of the design of UNIX (and I
would not claim that either represents the last word in its domain),
it seems to me that a book that claims to be a reference on C should
describe the language as it actually exists, including whatever the
standard says about the return value from main().

No useful standard says anything useful about this. Instead, a good
unix or linux programmer will try to return int so his code can be
useful in shell procedures. SEPARATION OF CONCERNS means that this
should NOT be part of the programming language. Why? Because we would
like to use C on Microsoft and embedded platforms, and we don't all
want to use linux. Why recreate the world of the 1984 Apple ad?
Well, I suppose it's bad practice in general to assume that all programs
were written by competent programmers.  Other than that, what's your
point?

I fail to see yours. In Windows we generally don't want to use the
return code. Therefore it is an excrescence to return anything and as
Herb says, void is the best style here.

[ snip ]
Sorry. I italicised the Adorno quote at wordpress as you now know.

Only by your saying so here.  I have not followed the link to the
version of your critique posted there.

Do your homework.
[ snip ]
[ snip ]
occurring?
but you remembered to double the "r" this time ....
Do me the courtesy of not patronizing me, dear blm.

Not so much patronizing [*] as describing how I first started to
suspect that this might not be your writing.  

Don't start. I am more literate and better read than you, that's
plain. No corporate checklist approach will change this fact. Ooooooh
a spelling error is Human Resources thinking.
[*] Snide, maybe.  Fair cop.
You are nothing
more than a low level computer instructor as I was,

Am I?  I've been deliberately cagy about my actual job title [*],
so why assume that it's the lowest one that would allow me to
say truthfully that I teach undergraduate CS classes?

Litera scripta manet, dear heart. Nothing in your writing conveys
anything more than a middling level of intelligence or literacy.

[*] For reasons that seem good to me, some of which can perhaps
be inferred from my signature below, particularly the part about
not speaking for my employers.
but your literacy
has the usual low upper bound. You are posting dishonestly as is
Julienne because you're afraid to take an unpopular view in a
newsgroup which turns upon and brutalizes women posters when they
disagree with the normalized deviance of the ng.

Does this have something to do with the current discussion?  (Yes,
I'm being a bit snide here too.)

As for being afraid -- I have not observed women posters being
"brutalized" in this newsgroup [*], but I *have* observed you
proposing to escalate disagreements here beyond the bounds of Usenet.

In response to brutalization, I will do so. And you weren't here in
the 1980s when women with brains and courage were driven out of here
and out of programming positions. I'm not making this shit up, luv.
The New York Times did a followup circa 1997 of the Princeton computer
science fe-male contingent class of '87 to find that all of them had
been driven by the pressure of working with assholes like Seebach and
Heathfield, into teaching and other skirt occupations. This has caused
the quality of code produced in America to significantly decline to
the point of Seebach's horrors, in my direct experience.
[*] Admittedly I don't read all posts, so I suppose I could have
missed occurrences of this supposed phenomenon.
Also, do me the courtesy of focusing more carefully on what Adorno
says.

My point, insofar as I had one, was that I was rather surprised that
someone who has been quick to take others to task for not making it
clear which words were their own and which were quoted would make the
same mistake himself -- even if it was an oversight.  

Tu quoque is the favorite argument of criminals, and Women Who Love
and Enable Criminals Because They Don't Want to Be Bitch-Slapped
Online. But you can forget it. I refuse to be classed with anyone
here. I simply have more experience and knowledge inside and outside
of computing. No, I'm not as smart as Nash, nor Kernighan. But apart
from Kenny, McClean, Navia, Willem, the late Dik Winter, and to a
certain extent Bacarisse and Harter, the regs here are extraordinarily
and malevolently stupid, because they've drunk a toxic corporate Kool-
Ade which they actually think is scientific asceticism and wisdom.
 

spinoza1111

The page is 100% unambiguous as to which edition is attacked.  Admittedly
the evidence for this was not embedded intentionally, but!

Actually, you've given us ANOTHER deliberate error, amounting to
another lie. Yes, you fail, don't you, to identify the edition you are
referencing.

Wow. Not one, not two, but many smoking guns with your fingerprints.
This is one for the red file. I am going to ADD this to my
deconstruction on wordpress.

YOU DON'T EVEN SPECIFY THE EDITION. Your reader probably won't have
the edition you're referencing. But what the **** do YOU care,
asshole? The damage to a reputation has been done. And this was your
malicious purpose: to appear as a professional programmer because you
hope that your readers will conclude that you are the one who is
vastly more qualified on C...when not one of your code examples here
that I have seen is of acceptable quality!

The example cited as page 53 on the existing page CANNOT be from any
other edition.

This is absurd! Your reader would have to have a stack of three
editions to confirm this!

It would have been a simple matter for you to clarify which edition
you were talking about. You did not do so, because you were committing
a malicious libel.
* In the 2nd edition, the test is written using "<>" rather than "!=".
* In the 4th edition, the sizeof() operator is used differently.

What probably happened is that Nilges saw an observation that the page was
based on the previous edition, and then his natural tendency to reimagine
events to make other people seem worse, and himself seem better, took over;
after a couple of months, he'd persuaded himself that it was actually the
first edition, rather than the third, and no amount of correction, even with
100% unambiguous evidence, will change his mind.  (Consider how long it took
him to sort of almost pretend to grant that, in fact, he'd been mistaken*
all along when claiming that I was not a professional programmer, but some
sort of "clerk".)

You're lying. I have not once backed down from my conclusion that
you're not a qualified programmer: quite the opposite. I THOUGHT last
year that you were well-qualified because of your affiliation with C
standards. Then you spoiled my illusion when you said that you paid
your way onto the committee and on that committee did not contribute
but were there to learn. Then you spoiled it even further when in
January you published your silly %s example. Then one morning I logged
on to find that your strlen was off by one, and I was the one to
report this to the group. Then I read your queue.c code.

But above all, you have told us that you find errors in a compiler and
report same. Peter, this is not a programming job.
-s
[*]  Normally, when someone continually reasserts a disproven claim
despite having demonstrated that he's seen the disproofs, we call it
"lying", but Nilges gives every sign of being sufficiently entrapped
by his layers of self-deception that he is, in fact, being completely
sincere in claiming these ridiculous things.  As such, it is probably
not supportable to call it "lying", it's just a sort of devoted and
persistent refusal to stop being wrong.  Totally different, I'm sure.

You keep posting content-free claims that I'm mad when you run out of
arguments or ways to establish credibility and you're going to find
yourself in escalating trouble owing to them.

I noticed that you started posting your Coding Horrors here only after
questions arose about your competence and credibility. The first was a
routine you'd been assigned at work because, like many people who are
basically computer operators and not full-time professional
programmers, you occasionally find opportunities to program. This is a
laudable thing in most cases; when my job title at uni was "keypunch
operator for the Registrar" I wrote a lot of assembler code for the
IBM 1401 to make my job easier and have fun.

But alarmingly, none of them exhibit any competence whatsoever,
especially given your claim to have been in the field so many years.
Professional programmers either don't use strchr to find %s, or if
they do so for time pressures, they don't brag about it on
comp.lang.c. Professional programmers don't code off by one bugs in
one line of code, and they use structured programming, not switch
fallthrough. They don't leave variables undefined without assigning
null in the declaration/definition. They don't think that copyleft and
copyright can both apply.
 

BruceS

Now this is just sad. You said you "will stop posting and reading
until Sunday 11 April China time", and yet are posting again a mere
SIX HOURS LATER?! Whiskey Tango Foxtrot? Can't you keep a promise
for even one day?
 

blmblm

Then you're not a very good programmer, for your code is not connected
with reality. You need to code what you mean, and what you mean has to
be something human in a way mathematics is, and computers are not.

I'm guessing that by "code what you mean" you mean ....

Oh. I was going to guess that you mean "declare variables with
types that correspond to their intended use rather than based on
implementation-specific details", but perhaps you mean to also require
that the variable names follow your scheme, or a similar scheme,
for indicating the type of data they're meant to contain. ?
If
you don't do this, then your code probably resembles Seebach's.

The string-replacement-and-test-framework code I posted in message

<[email protected]>

is a fairly representative sample, if you want to judge for yourself
(about whether it resembles Seebs's) rather than speculating. If I
were going to polish it, I'd review the comments and probably add a bit
to at least some of them, but as best I can tell from a quick skim it's
not something I'm particularly embarrassed about, with two exceptions:

(*) It was an oversight not to make it clearer that the correctness
tests were yours (indeed, I think originally I didn't plan to
post them at all, and somehow in the flurry of combining various
files into a single post included more than I had intended).
But I've already apologized for that.

(*) The function to obtain the time of day with (fairly) high
resolution doesn't even try to be portable. But as far as I
know it's not possible to write such a function in portable C,
and the comments are pretty clear about this code needing to be
reviewed for platforms other than the one I developed on.
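
Not the actual function, but a sketch of why such code ends up
platform-specific (the first branch assumes a POSIX system whose
<time.h> defines CLOCK_MONOTONIC and clock_gettime(), possibly
needing -lrt on older systems; the fallback is standard C but
usually much coarser):

#include <stdio.h>
#include <time.h>

static double elapsed_seconds(void)
{
#ifdef CLOCK_MONOTONIC
    struct timespec ts;                     /* POSIX high-resolution clock */
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
#else
    return (double) clock() / CLOCKS_PER_SEC;   /* portable, but CPU time */
#endif
}

int main(void)
{
    double start = elapsed_seconds();
    /* ... code being timed would go here ... */
    printf("elapsed: %g seconds\n", elapsed_seconds() - start);
    return 0;
}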
Even within coding, in cleaned up systems, the integers are indeed a
subset of the reals in the sense that all ints in C Sharp, as a matter
of a REAL standard (ECMA), can be converted without error to doubles.

I don't know C#, but the same thing is true in Java -- but the
ints are *NOT* a subset of the floats in Java.

And if we're talking about doubles rather than floats, there are
almost surely many implementations of C in which all ints can be
converted without loss of precision to doubles.
And note that Microsoft had to go outside ANSI to get a usable and
trustworthy standard for C Sharp, because ANSI has been so privatized
in a way that the Euro standard has not.

Truly competent programmers WANT to program in the best languages
available, and do not think using an inferior language is a mark of
genius. When I used machine language, I wanted an assembler. When I
got assembler (later on in my first computer science class) I debugged
a Fortran compiler in machine language. When I got Fortran and later
Cobol, I wanted Pascal. When I got Pascal, I wanted C in order to
short-circuit. And when I got C I realized I needed OO, so I got OO
VB .Net. When I got that, I wanted a C like syntax and so I got C
Sharp and Java. In other words, I don't sit around with my head up my
ass.

In my opinion, truly competent programmers try to choose the best
tool for the job, where "best" is determined by a number of factors.
I'd be unlikely to choose C for developing a new application-level
program, but there are other purposes for which I think it's a
reasonable choice.
We all know this.

I do. You say you do. Beginners often don't, and the evidence
suggests that many drop-in participants in this newsgroup, and
other newsgroups I follow, also do not.
But you celebrate it as if a mistake or here, a
necessary (?) limitation in bit length is human knowledge.

The difficulty here is not that one has finitely many bits to work
with. People who don't know what the real problem is are apt to
think that representing currency amounts as doubles is a good idea.
Perhaps you wouldn't make this mistake, but many people do.
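
A small illustration of why (the amounts are invented): one hundred
ten-cent additions in double arithmetic do not sum to exactly 10.00,
while integer cents stay exact.

#include <stdio.h>

int main(void)
{
    double total = 0.0;
    long cents = 0;
    int i;

    for (i = 0; i < 100; i++) {
        total += 0.10;      /* 0.10 has no exact binary representation */
        cents += 10;        /* integer arithmetic stays exact */
    }

    printf("double total:   %.17f\n", total);
    printf("equals 10.00?   %s\n", total == 10.00 ? "yes" : "no");  /* usually "no" */
    printf("integer cents:  %ld (i.e. %ld.%02ld)\n",
           cents, cents / 100, cents % 100);
    return 0;
}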
No, while
we learn from our mistakes and the limitations of our tools, those
aporias are not first-class knowledge. They are cautionary tales.

I don't understand what distinction you're making here; to me it
seems obvious that one cannot claim to be a competent programmer
without understanding that most (all?) tools have limitations,
and a prudent programmer informs himself, or herself ....
I did not say we ignore it.

Not in so many words, no. If I misunderstood you -- ah well.
Instead, we separate concerns; read
Dijkstra. He was actually told much the same thing, that it was a
fault of him not to understand the time and cost limitations on the
IBM 1620 designers that caused them to fail to support more than one
level of subroutine call, as if (again) mistakes and limitations are
knowledge.

[ snip ]
No useful standard says anything useful about this. Instead, a good
unix or linux programmer will try to return int so his code can be
useful in shell procedures. SEPARATION OF CONCERNS means that this
should NOT be part of the programming language. Why? Because we would
like to use C on Microsoft and embedded platforms, and we don't all
want to use linux. Why recreate the world of the 1984 Apple ad?

You seem to be saying that one should not bother to write code as
portably as possible. I don't agree. But even if I did -- again,
I'm not arguing for or against the specifics of the C standard, but
instead arguing that *given that it is what it is*, good programmers
work with it, rather than against it, when they reasonably can.

Oh, and that comment about writing code for embedded systems --
aren't there different rules (about the signature of main()) for
free-standing and hosted implementations anyway?
I fail to see yours. In Windows we generally don't want to use the
return code. Therefore it is an excrescence to return anything and as
Herb says, void is the best style here.

I don't agree. It's one more thing that would have to be changed if
one ever wanted to port the code to a platform on which the return
value mattered.
[ snip ]
Sorry. I italicised the Adorno quote at wordpress as you now know.

Only by your saying so here. I have not followed the link to the
version of your critique posted there.

Do your homework.

Why? Isn't it more polite to take your word for it that you got it
right at wordpress, even though you got it wrong here?
[ snip ]
[ snip ]
occurring?

but you remembered to double the "r" this time ....
Do me the courtesy of not patronizing me, dear blm.

Not so much patronizing [*] as describing how I first started to
suspect that this might not be your writing.

Don't start. I am more literate and better read than you, that's
plain. No corporate checklist approach will change this fact. Ooooooh
a spelling error is Human Resources thinking.

It's not so much that it's an error as that it's one you seem to make
often, hence an additional clue that this might be quoted text.

Again -- I really don't care that much about spelling errors, though
I usually do notice them. I've explained elsethread why I initially
mentioned yours.

The thing that's amusing here, or ironic, or something, is that
for me the fact that your writing is for the most part free of the
more obvious kinds of errors lends it a credibility it otherwise
might not have.
[*] Snide, maybe. Fair cop.
You are nothing
more than a low level computer instructor as I was,

Am I? I've been deliberately cagy about my actual job title [*],
so why assume that it's the lowest one that would allow me to
say truthfully that I teach undergraduate CS classes?

Litera scripta manet, dear heart. Nothing in your writing conveys
anything more than a middling level of intelligence or literacy.

Well, you never know. One thing I figured out a long time ago is
that some people come across as being smarter than they actually
are, by virtue of being articulate and self-confident, while others
who are actually very bright fly under the radar, so to speak.
Which group I'm in -- oh, I think it's all relative anyway.
Certainly I've worked with people who are intellectually out of my
league, and with others who are probably no smarter but somehow get
more done. I've also on occasion worked with people about whom I
think "how is it possible for anyone to be this dim?" <shrug>

I do find it amusing to speculate about what you would make of my
educational and other credentials, given that you seem to regard
your good grades and Schildt's degrees as reliable indicators of --
something. But since I'm not willing to post them here, for reasons
that seem good to me, it's a moot point, I guess.
[*] For reasons that seem good to me, some of which can perhaps
be inferred from my signature below, particularly the part about
not speaking for my employers.
but your literacy
has the usual low upper bound. You are posting dishonestly as is
Julienne because you're afraid to take an unpopular view in a
newsgroup which turns upon and brutalizes women posters when they
disagree with the normalized deviance of the ng.

Does this have something to do with the current discussion? (Yes,
I'm being a bit snide here too.)

As for being afraid -- I have not observed women posters being
"brutalized" in this newsgroup [*], but I *have* observed you
proposing to escalate disagreements here beyond the bounds of Usenet.

In response to brutalization, I will do so. And you weren't here in
the 1980s when women with brains and courage were driven out of here
and out of programming positions.

"Here" .... If you mean Usenet, my recollection is that I discovered
it in the late 1980s.

I'm guessing, though, that you mean something along the lines of
"computing as a profession", and if so, for the record:

My rather peculiar career path includes about a decade's worth of
jobs with titles including the word "programmer", starting in
the late 1970s. Certainly I have heard very unhappy stories about
women being treated unfairly in such jobs, but I don't think
I witnessed any of that myself. Just sayin', maybe.
I'm not making this shit up, luv.

Again with the patronizing forms of address .... Knock it off,
would you?
The New York Times did a followup circa 1997 of the Princeton computer
science fe-male

Okay, I guess I'm going to ask -- why the hyphen?
contingent class of '87 to find that all of them had
been driven by the pressure of working with assholes like Seebach and
Heathfield, into teaching and other skirt occupations.

The _Times_ mentioned Seebach and Heathfield? Wow. (Yes, yes, you
almost surely didn't mean to imply that -- or at least not that they
did so by name.)

I can believe that things have gotten worse in the years since I
left "industry" to pursue an advanced degree. For what it's worth,
my decision to do that had nothing to do with the kind of people
I was working with at the time, though -- they were almost without
exception both capable and collegial. Again, it's probable that I
was lucky in that regard. Just sayin', maybe. As for why I took
a teaching job rather than going back to industry -- I thought I'd
enjoy teaching, and the academic-job lifestyle, and on the whole
I have.
This has caused
the quality of code produced in America to significantly decline to
the point of Seebach's horrors, in my direct experience.

[ snip ]
Tu quoque is the favorite argument of criminals,

I don't agree that I'm making a "tu quoque" argument; I'm making
a point about what I perceive as -- oh, "hypocrisy" is too strong
a word, I suppose, but I can't think of a milder one.
 

Mark Bluemel

If this is not done, I will have to continue this matter, with
escalation to the next level.

CAPITAL LETTERS AND MULTIPLE EXCLAMATION MARKS!!!!!!!!!!!!
 

blmblm

[ snip ]
I do. You say you do. Beginners often don't, and the evidence
suggests that many drop-in participants in this newsgroup, and
other newsgroups I follow, also do not.


The difficulty here is not that one has finitely many bits to work
with.

Oops. Actually that's *exactly* what the problem is. What I meant
to write was "the difficulty is not that one has a limited number
of bits to work with". Why oh why does are some errors apparently
invisible until the post has been dispatched ....
People who don't know what the real problem is are apt to
think that representing currency amounts as doubles is a good idea.
Perhaps you wouldn't make this mistake, but many people do.

[ snip ]
It's not so much that it's an error as that it's one you seem to make
often, hence an additional clue that this might be quoted text.

Again -- I really don't care that much about spelling errors, though
I usually do notice them. I've explained elsethread why I initially
mentioned yours.

The thing that's amusing here, or ironic, or something, is that
for me the fact that your writing is for the most part free of the
more obvious kinds of errors lends it a credibility it otherwise
might not have.

FSVO "credibility". (Yes, I guess that's a bit snide. <shrug> )

[ snip ]
 

Nick Keighley

it isn't

It's not funny enough to bother reading the whole thing, but here's one
howler:
[regarding the use of %f and %d printf format specifiers for sizeof]
Although I do not know why Herb used %f and %d format codes, he did
know, as Seebach seems not to, that all ints are floats and all floats
are doubles in well-structured languages.

such as? Languages like scheme do such slippery promotions but they
don't have such crude things as floats and doubles.

You wouldn't, because you don't know "programming languages"
independent of a set of facts about a specific programming language.
[...]

If you like to learn from books, please read "Programming Language
Pragmatics" by Michael L. Scott (Morgan Kauffman).

any good? Nice things get said about it on Amazon and it sounds
interesting.

You need to learn
theoretic constructs such as the fact that in sensible programming
languages, numbers form a nested set down from double to short or
byte.

really? Which languages do this?

You also need to learn that competent programmers try to write
sensible code based on these constructs, adapting their language to
their model rather than celebrating its mistakes.

I tend to think in a slightly higher level language than C (and I've
been trying to raise this thinking level) but I still use C
idioms like assigning and testing in an if statement and fall-thru case
statements. I probably do this less than many C programmers though. I
don't know if this necessarily is the only way to program.
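
For anyone unfamiliar with those two idioms, a minimal sketch (the
scenario is invented): an assignment tested inside the if condition,
and a deliberate switch fall-through where each level also gets the
output of the levels below it.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char option[] = "verbosity=2";
    char *p;
    int level = 0;

    /* Idiom 1: assign and test in the same if condition. */
    if ((p = strchr(option, '=')) != NULL)
        level = atoi(p + 1);

    /* Idiom 2: deliberate switch fall-through -- each higher level
       also gets everything the lower levels print. */
    switch (level) {
    case 2:
        printf("debug messages on\n");
        /* falls through */
    case 1:
        printf("progress messages on\n");
        /* falls through */
    default:
        printf("errors reported\n");
        break;
    }
    return 0;
}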

Just because you can write Fortran in any language doesn't mean you
should.

My guess would be that the actual motivation was "there is a float object,
I'm typing quickly, I'll write %f".  Because I've made that mistake; the
difference is that I recognize that it's a mistake and fix it.

But our [spinoza's] experience is that you DON'T fix or see trivial errors such as
off by one. However, it seems that here, and at most 20 other places
(and, probably more like 6), Herb ran out of time to test code which
was in the process of being transformed into a hard-to-change PDF.
This is common in software publishing, and it's why computer books
disclaim warranty.

how do you know this? Have you spoken to him? There seem to be a *lot*
of errors and many of them don't look like typos.
No, using a poorly structured language makes the search for good
structure even more important.

yes. I have to be pretty structured if I write assembler (but it's a
while since I did that).
This is why I developed "virtually
structured programming" for assembler and Fortran, and it is a point
made long ago by Brian Kernighan in The Elements of Programming Style.

the problem is you can miss the language's natural paradigm. I'm
learning scheme (Lisp) but my scheme tends to look like C. I have to
fight quite hard not to code in C.

My suspicion is that any programmer who uses the mistakes of a
language to excuse his own poor practice is covering up his
incompetence, and this has been confirmed, for me, by your code here.

who decides what the mistakes of the language are? If it isn't Pascal
it's a mistake?


it's certainly getting tougher
Petitio principii:

1. Nilges is mad

2. Therefore anything he says is wrong (a questionable assumption on
two bases: on the monkey/typewriter model, approximately 50% of what I
say is true,

only if you confine yourself to predicates. And probably not even
then. The space of wrong statements is *much* larger than the space of
right statements. Oh, and whichever one of you implied it- mad people
aren't stupid.
and on a Romantic basis, I might have special insight)

sounds like crap
This is the Rainman Hypothesis is it? If I can't understand him he
must have special insight?
3. He thinks C:TCR is coherent

4. Therefore he's mad

Dweebach, CTCR is not at issue here: CTCN has to stand on its own
merits.

I don't think you can separate them. Hell, I'm sure you can't. It
actually matters if Seebach's criticisms are correct or not. This isn't
poetry we're reviewing or a nineteenth century novel. It purports to be
a technical book. It can be wrong.

It may be incoherent. I noticed long ago that many computer
books are semi-coherent, and I learned why: computer book publishing
is a business, not "truth".

wow. If only I'd known. Except I expect my technical books to be
technically accurate.
But CTCN fails to make your case. It's self-contradictory, misuses
words such as "clarity",

i.e., uses the dictionary definition instead of yours. It's stuff like
this that leads people to call you mad. You can't back off. You can't
see that "oh yes you're right I got hold of the wrong end of the stick
there" is sometimes far more sensible than "Standing By My
Principles".
and is completely disorganized. To make your
case, you would have had to list the "hundreds" or "dozens" of errors
you refer to in CTCN. It is illogical to reason from 20 to 24 or 100.

what do you think of the "random page program"?

Sampling can be a valid statistical technique.

The book is in its fourth edition. Therefore, you are dishonest,
because you're telling people about the flaws, with page numbers, in
an edition that is out of print. Dishonest enough to believe that in
the above, you may be lying, and that CTCN is based on the first
edition. This is verifiable by checking page numbers.

well you chose to dig up a 15 year old document

<snip>


--

"A clear statement is a statement to which the opposite is either true
or false. A deep statement is a statement to which the opposite is
another
deep statement." ­ Niels Bohr
 

Seebs

The _Times_ mentioned Seebach and Heathfield? Wow. (Yes, yes, you
almost surely didn't mean to imply that -- or at least not that they
did so by name.)

Heh.

Mostly it's just weird that he makes this stuff up. I have consistently
been told, both directly and indirectly, that my coworkers like me and do
not consider me at all abusive or hostile. I'm actually famed for being
calm, polite, and extremely difficult to anger, and I'm consistently quick
to credit others for successes, and quick to take responsibility for
failures. I was taught these skills early on by good managers (they exist,
really!), and I've stuck with them. I have no interest in trying to fool
people into thinking I'm better than I am; I want my coworkers to have an
accurate picture of what I can, and can't, do. I don't want to be one of
those assholes who wrecks the workplace; I want to do what I can to make
coming in to work something interesting and fun to look forward to.

It turns out that these are very useful traits in creating a work environment
where no one feels particularly harassed, "brutalized", or bullied.

-s
 

spinoza1111

Heh.

Mostly it's just weird that he makes this stuff up.  I have consistently
been told, both directly and indirectly, that my coworkers like me and do
not consider me at all abusive or hostile.  I'm actually famed for being
calm, polite, and extremely difficult to anger, and I'm consistently quick
to credit others for successes, and quick to take responsibility for
failures.  I was taught these skills early on by good managers (they exist,
really!), and I've stuck with them.  I have no interest in trying to fool
people into thinking I'm better than I am; I want my coworkers to have an
accurate picture of what I can, and can't, do.  I don't want to be one of
those assholes who wrecks the workplace; I want to do what I can to make
coming in to work something interesting and fun to look forward to.

That's wonderful, Peter. In fact, I read your blog on workplace
bullying and I completely agree with it.

But this just makes it more troubling that you come in here and call
me "insane" and a "moron", doesn't it?

Kind of like Dr Jekyll and Mr Hyde, or Ted Bundy, wouldn't you say?

I am well aware that modern software offices are places where people
conduct themselves respectfully *most of the time*. But I'm afraid
that, in compensation for the unrealistically high standards of such
offices, where the delights of the offices I worked in during my
twenties (smoking on the job, hot girls at the old Xerox machine
winking at me because I looked so sweet and innocent) have passed
away, the gentility of the modern workplace takes place inside a very
bright line.

Homeless people can't come in to get warm in these offices on a
subzero day, no, not even mothers with children. And when someone is
fired, even if it's because some little dweeb of an incompetent
programmer back-stabbed him, people look away as that person is
escorted out by security (a former boss of mine in North Carolina got
into trouble for deliberately getting up to hug an African American
coworker who was being laid off since, the CEO said, his behavior was
"inappropriate").

I suggest you take a long, hard look at yourself, Seebach. I suggest
you are no bully in your office only because you're a physical
coward, afraid of physical confrontation; here you have no such
fear, so you bully Schildt, you bully Navia, you shit on our visitors
from mainland China, and you bully me. You lie about people
continuously; you make unsupported inferences about Schildt and claim
I don't know switch when it's clear I do. I suggest, with the best of
intentions, that you won't understand bullying until you've read "The
Authoritarian Personality" by T. W. Adorno.
It turns out that these are very useful traits in creating a work environment
where no one feels particularly harassed, "brutalized", or bullied.

...then you go home and shit on this newsgroup.
 
S

spinoza1111

it isn't
It's not funny enough to bother reading the whole thing, but here's one
howler:
[regarding the use of %f and %d printf format specifiers for sizeof]
Although I do not know why Herb used %f and %d format codes, he did
know, as Seebach seems not to, that all ints are floats and all floats
are doubles in well-structured languages.

such as? Languages like Scheme do such slippery promotions, but they
don't have such crude things as floats and doubles.
You wouldn't, because you don't know "programming languages"
independent of a set of facts about a specific programming language.
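
[For reference on the narrow printf point before the thread moves on:
sizeof yields a value of type size_t, so the conversion specifier has
to match that type. A minimal sketch, assuming a C99-or-later library
for %zu; on C90 the usual workaround is to cast to unsigned long and
print with %lu.]

#include <stdio.h>

int main(void)
{
    float f = 0.0f;

    /* %zu is the specifier that matches size_t (C99 and later) */
    printf("sizeof f    = %zu\n", sizeof f);
    printf("sizeof(int) = %zu\n", sizeof(int));
    return 0;
}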
[...]
If you like to learn from books, please read "Programming Language
Pragmatics" by Michael L. Scott (Morgan Kaufmann).

any good? Nice things get said about it on Amazon and it sounds
interesting.
You need to learn
theoretic constructs such as the fact that in sensible programming
languages, numbers form a nested set down from double to short or
byte.

really? Which languages do this?
You also need to learn that competent programmers try to write
sensible code based on these constructs, adapting their language to
their model rather than celebrating its mistakes.

I tend to think in a slightly higher-level language than C (and I've
been trying to raise the level of the language I think in), but I
still use C idioms like assigning and testing in an if statement and
fall-through case statements. I probably do this less than many C
programmers, though. I don't know if this is necessarily the only way
to program.

Just because you can write Fortran in any language doesn't mean you
should.
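
[Since the two idioms are named above, here is a minimal,
self-contained sketch of both, for readers who haven't met them.
It is illustrative only and is not drawn from anyone's posted code.]

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s = "key:value";
    char *p;
    int a[3] = {1, 2, 3};
    int n = 2, total = 0;

    /* Idiom 1: assign and test in the same if condition. */
    if ((p = strchr(s, ':')) != NULL)
        printf("separator at offset %d\n", (int)(p - s));

    /* Idiom 2: deliberate fall-through between case labels,
       here to sum the first n elements of a[]. */
    switch (n) {
    case 3: total += a[2]; /* fall through */
    case 2: total += a[1]; /* fall through */
    case 1: total += a[0]; /* fall through */
    case 0: break;
    }
    printf("sum of first %d elements: %d\n", n, total);
    return 0;
}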

But our [spinoza's] experience is that you DON'T fix or see trivial
errors such as off-by-one. However, it seems that here, and in at most
20 other places (probably more like 6), Herb ran out of time to test
code which was in the process of being transformed into a
hard-to-change PDF. This is common in software publishing, and it's
why computer books disclaim warranty.

how do you know this? Have you spoken to him? There seem to be a *lot*
of errors and many of them don't look like typos.
No, using a poorly structured language makes the search for good
structure even more important.

yes. I have to be pretty structured if I write assembler (but it's a
while since I did that).
This is why I developed "virtually
structured programming" for assembler and Fortran, and it is a point
made long ago by Brian Kernighan in The Elements of Programming Style.

the problem is you can miss the language's natural paradigm. I'm
learning Scheme (Lisp) but my Scheme tends to look like C. I have to
fight quite hard not to code in C.
My suspicion is that any programmer who uses the mistakes of a
language to excuse his own poor practice is covering up his
incompetence, and this has been confirmed, for me, by your code here.

who decides what the mistakes of the language are? If it isn't Pascal
it's a mistake?

it's certainly getting tougher
Petitio principii:
1. Nilges is mad
2. Therefore anything he says is wrong (a questionable assumption on
two bases: on the monkey/typewriter model, approximately 50% of what I
say is true,

only if you confine yourself to predicates. And probably not even
then. The space of wrong statements is *much* larger than the space of
right statements. Oh, and whichever one of you implied it- mad people
aren't stupid.
and on a Romantic basis, I might have special insight)

sounds like crap
This is the Rainman Hypothesis is it? If I can't understand him he
must have special insight?
3. He thinks C:TCR is coherent
4. Therefore he's mad
Dweebach, CTCR is not at issue here: CTCN has to stand on its own
merits.

I don't think you can separate them. Hell, I'm sure you can't. It
actually matters whether Seebach's criticisms are correct or not. This
isn't poetry we're reviewing, or a nineteenth-century novel. It
purports to be a technical book. It can be wrong.

Poetry can be wrong, too. It can go
Off the rails, off the side of the black road
This is something that only poets know
Oh heavy is the poet's gravid load.
Whereas to write about technology
Which to the simple hath such referent
Is recreational...science fiction and fantasy
About which geeks are...over-reverent.
They create a fable, and into it they climb
Being unable to deal with people
There to while away a weary time
By onanizing themselves with imaginary sheeple.
Whereas poetry is truer than true itself is true
Truer, surely, than me or you.
wow. If only I'd known. Except I expect my technical books to be
technically accurate.

Actually, since they describe man-made artifacts that are so easily
changed, simulated, or interfaced in such a way that they can appear,
under Turing's results, to be something else, you're dealing not with
a real, resistant world but with a pornographic fantasy, and arguably
it's less important that books be "technically accurate" (although
this is important) and more that they make you a fit denizen for the
orgy.
ie. uses the dictionary definition instead of yours. It's stuff like

Sorry, we've been through this. The Compact Oxford English Dictionary
defines clarity as conducive to understanding and understanding to
knowledge, and knowledge to justified true belief.
this that leads people to call you mad. You can't back off. You can't

I'd suggest that the collectively mad can't not refrain from joining a
cybernetic mob, but wtf.
see that "oh yes you're right I got hold of the wrong end of the stick
there" is sometimes far more sensible than "Standing By My
Principles".


what do you think of the "random page program"?

Sampling can be a valid statistical technique.

And you're an idiot. You're sampling noise, not text.
well you chose to dig up a 15 year old document

...because a recent article on Wikipedia, which is still in violation
of Biographies of Living Persons, referenced this out-of-date
document, which fails to specify which edition it is about. Peter
Seebach chose not to maintain that document, and it is massively
incorrect. For example, it claims that "the heap is a DOS term" and
that all C programmers must return an int from main, and both of these
statements are incorrect.
 
S

spinoza1111

(*) The function to obtain the time of day with (fairly) high
resolution doesn't even try to be portable.  But as far as I
know it's not possible to write such a function in portable C,
and the comments are pretty clear about this code needing to be
reviewed for platforms other than the one I developed on.

I need a full link; this has ellipses. I look forward, quite
seriously, to reading your code.
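
[The full code isn't reproduced in this thread, so the sketch below is
only an illustration of the portability point being made: one common
way to get sub-second wall-clock time uses POSIX clock_gettime, which
is not part of standard C. Standard C offers only time(), with
one-second resolution, and clock(), which measures CPU time rather
than wall-clock time. Assumes a POSIX system; older glibc may also
need -lrt at link time.]

#define _POSIX_C_SOURCE 199309L   /* request the POSIX.1b declarations */
#include <stdio.h>
#include <time.h>

/* Non-portable: clock_gettime and CLOCK_REALTIME are POSIX, not ISO C. */
static double seconds_now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
}

int main(void)
{
    double t0, t1;

    t0 = seconds_now();
    /* ... the work being timed would go here ... */
    t1 = seconds_now();
    printf("elapsed: %.9f seconds\n", t1 - t0);
    return 0;
}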


snipola
I don't know C#, but the same thing is true in Java -- but the
ints are *NOT* a subset of the floats in Java.  

That is correct. But I'd hazard they are a subset of the doubles. Most
competent C Sharp programmers, who use a mental model derived
independently of the mistakes of geeks, prefer doubles to floats for
this reason.
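
[In concrete terms, and assuming IEEE-754 float/double and a 32-bit
int, which no language standard here guarantees: double carries 53
bits of significand, so every 32-bit int converts to it exactly,
while float carries only 24 bits and cannot represent every int.
A minimal C sketch of the difference:]

#include <stdio.h>

int main(void)
{
    int big = 16777217;            /* 2^24 + 1: needs 25 significant bits */
    float  f = (float)big;         /* ~24-bit significand: rounds          */
    double d = (double)big;        /* 53-bit significand: exact            */

    printf("int    : %d\n", big);
    printf("float  : %d\n", (int)f);   /* 16777216 on IEEE-754 systems */
    printf("double : %d\n", (int)d);   /* 16777217 */
    return 0;
}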
In my opinion, truly competent programmers try to choose the best
tool for the job, where "best" is determined by a number of factors.

No, "best" is determined by correctness, the programming manifestation
of truth. Whereas "a number of factors" usually includes peer group
and management pressure to conform to a normalized deviance. The
pretense is that this is a form of engineering, but if it is, it's
self-reflexive, which means that the "skilled" programmer in such a
milieu is the least ethical, and the most willing to act in ways that
are either self-destructive (as in the frequent phenomenon of
excessive hours) or destructive to others (as in the case of Peter
Seebach, who manages to lie about people, frequently).

I do.  You say you do.  Beginners often don't, and the evidence

Snide, as always, in the passive aggressive corporate register. This
just in, honey. In a "civil" conversation, phrases which imply
dishonesty (such as "you say you do") cross the line, and are an
invitation, sugar, to end the civil, Habermasian conversation. As
such, they are far worse than the usual corporate suspects, such as
"sexism", babe.
The difficulty here is not that one has finitely many bits to work
with.  People who don't know what the real problem is are apt to
think that representing currency amounts as doubles is a good idea.
Perhaps you wouldn't make this mistake, but many people do.

(Sigh) (Eye roll) (Crotch grab)
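
[Whatever one makes of the tone, the point in the quoted paragraph is
easy to demonstrate: binary floating point cannot represent most
decimal fractions exactly, which is why currency is usually kept in
integer cents or a decimal type rather than in double. A minimal
sketch; the printed results assume IEEE-754 double.]

#include <stdio.h>

int main(void)
{
    double total = 0.0;
    long cents = 0;
    int i;

    /* add ten cents, one hundred times; "should" be exactly 10.00 */
    for (i = 0; i < 100; i++)
        total += 0.10;

    printf("double total : %.17f\n", total);           /* not exactly 10 */
    printf("equal to 10? : %s\n", total == 10.0 ? "yes" : "no");

    /* the usual remedy: keep money in integer cents */
    for (i = 0; i < 100; i++)
        cents += 10;
    printf("integer cents: %ld (exact)\n", cents);
    return 0;
}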
I don't understand what distinction you're making here; to me it
seems obvious that one cannot claim to be a competent programmer
without understanding that most (all?) tools have limitations,
and a prudent programmer informs himself, or herself ....

I'm tired of the tool metaphor, and paraprogrammers who rise out of
the mechanical sort and who overuse the "software as tool" or
"computer as car" metaphor. These people shouldn't do serious
development and shouldn't waste my time.

If I were in charge of the world, and ain'tcha glad I am not, my
examination for prospective programmers would resemble the Imperial
(Chinese) civil service examination. I would expect the candidates to
be able to write poetry. I get more and more serious and less and less
humorous about this as the years go by.
Not in so many words, no.  If I misunderstood you -- ah well.

Shit happens, right, doll?
You seem to be saying that one should not bother to write code as
portably as possible.  I don't agree.  But even if I did -- again,

Quite the opposite, I'd say. Why code for Linux all the time? It's
basically (cf. Lanier's book) just a copy of Unix, based on the work
of an author other than the millionaire Torvalds, work that was stolen
by Torvalds much as MS-DOS was stolen. The same sleaze and inferior
praxis occurs in both communities.
Oh, and that comment about writing code for embedded systems --
aren't there different rules (about the signature of main()) for
free-standing and hosted implementations anyway?

Which means, of course, that the Linux expectation should not control,
get it yet?
I don't agree.  It's one more thing that would have to be changed if
one ever wanted to port the code to a platform on which the return
value mattered.

Which exposes the main() return bullshit for what it is (why is it not
permitted on the Internet to say "****" and "shit" but it's ok to make
a foul, if misspelled, word out of Herb's patronym? Have human beings
ceased to matter? **** me if I know.)

It is the false belief, powered in fact by a corporation which remains
one of the most powerful, if most obscure, forces on the planet (good
old IBM, which continues to maintain control of the computers that
really, really matter: vast server farms and secret databases), that
we can, after all, force all the technopeasants into one tribe
dominated by Linux and Wikipedia.

Back to 1984...
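
[On the narrow standards question raised above, apart from the
argument around it: in a hosted implementation the forms of main the
C standard describes are int main(void) and int main(int argc, char
*argv[]), and the returned value is the program's exit status; a
freestanding implementation, typical of embedded work, defines its own
startup and is not bound by those rules. A minimal hosted sketch:]

#include <stdlib.h>

int main(void)      /* or: int main(int argc, char *argv[]) */
{
    /* the return value is reported to the host environment; shells,
       make, and scripts can and do test it */
    return EXIT_SUCCESS;
}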

Why?  Isn't it more polite to take your word for it that you got it
right at wordpress, even though you got it wrong here?

Point taken.

It's not so much that it's an error as that it's one you seem to make
often, hence an additional clue that this might be quoted text.

Is it an error? And, of course, orthography and pronunciation, as
opposed to grammar and style, are the usual refuge of the half-
literate.
Again -- I really don't care that much about spelling errors, though
I usually do notice them.  I've explained elsethread why I initially
mentioned yours.

You care enough to keep bothering me.
The thing that's amusing here, or ironic, or something, is that
for me the fact that your writing is for the most part free of the
more obvious kinds of errors lends it a credibility it otherwise
might not have.

The meaning of my literacy is that I am more intelligent and more
decent than most people here. In fact, this has been pointed out in
numerous "performance reviews" in which the subtle message was that my
intelligence was out of scale in the dumbed-down corporate world, as
were my outspokenness and even my decency. My female coworker at Bell
Northern Research was told that she was "too good" for the "dog eat
dog corporate world".

Well, you never know.  One thing I figured out a long time ago is
that some people come across as being smarter than they actually
are, by virtue of being articulate and self-confident, while others
who are actually very bright fly under the radar, so to speak.

This is an urban legend. In fact, Dijkstra's test for programming
competence included a degree of literacy which most Americans, even
formally educated ones, no longer have. Corporations, however, select
for low but acceptable literacy because highly literate people tend to
get uppity.

You can't be mute and unsung, and a Milton, all your life. Sooner or
later, it's time to **** or walk. I'd be the first to applaud Peter if
he ever said anything truly intelligent.
Which group I'm in -- oh, I think it's all relative anyway.
Certainly I've worked with people who are intellectually out of my
league, and with others who are probably no smarter but somehow get
more done.  I've also on occasion worked with people about whom I
think "how is it possible for anyone to be this dim?"  <shrug>

I do find it amusing to speculate about what you would make of my
educational and other credentials, given that you seem to regard
your good grades and Schildt's degrees as reliable indicators of --
something.  But since I'm not willing to post them here, for reasons
that seem good to me, it's a moot point, I guess.

In the absence of other information, "good grades" and Schildt's MSCS
are in fact all that separate us from the barbarism of *les anciens
régimes*, in which careers were not open to talents, and in which
people were beaten for even thinking of speaking out. I've had it up
to here with the populism of claiming that one's own poor grades
indicate in themselves that it is "the system" which is at fault,
because white programmers like Seebach use poor school performance or
the absence of coursework so consistently, and so paradoxically, as to
make the gesture meaningless. They mean that they are of the race
expected to do well and that any information or any failure to the
contrary is a conspiracy against their Genius.

As a result, a new *ancien régime* is formed of people with money and
their henchmen, selected according to class background and race by
"human resources" departments, and careers are once more closed to
talents.

If Seebach manifested Ben Bacarisse's talent, I would be the first to
waive my expectations as an MA was waived on my behalf in 1973 and I
taught logic at university level. But he does not, and this realigns
the evidence against him.

Again with the patronizing forms of address ....  Knock it off,
would you?

Not until you start showing more solidarity with the victims of the
cybernetic mobs that so frequently form in this newsgroup owing to
enabling language expressed in dulcet tones, hon.

"Patronizing forms of address" are not a matter of syntax, but of
intent, and it is a form of fashionable autism to judge another's
sexism by means of keywords alone. I refuse to allow you to make any
inferences about my sexism for essentially the same reason I refuse to
allow Seebach to make inferences about what Schildt knows based on his
own, very limited and very biased, knowledge.

Language, in this and many other newsgroups, is used so often
ironically by chattering ape-men who in a truly bizarre fashion have a
cargo cult theory that words mean single things. They use it to lie
and then they hold others to the truth.

My sexism is ironic. Real malice, of the sort shown Kenny, Navia,
Schildt, Chinese visitors and myself, as well as competent female
programmers, is my concern here.
Okay, I guess I'm going to ask -- why the hyphen?

A deliberate affectation.
The _Times_ mentioned Seebach and Heathfield?  Wow.  (Yes, yes, you
almost surely didn't mean to imply that -- or at least not that they
did so by name.)

Finding Dumb and Dumber interpretations as a way of critiquing writing
is a poor way of improving anyone's writing.
I can believe that things have gotten worse in the years since I
left "industry" to pursue an advanced degree.  For what it's worth,
my decision to do that had nothing to do with the kind of people
I was working with at the time, though -- they were almost without
exception both capable and collegial.  Again, it's probable that I
was lucky in that regard.  Just sayin', maybe.  As for why I took
a teaching job rather than going back to industry -- I thought I'd
enjoy teaching, and the academic-job lifestyle, and on the whole
I have.

I am not saying that Seebach at his worksite is not collegial and
capable in proportion to expectations which have been dumbed-down. In
fact, he has a nice blogpost on how not to be an asshole at work.

The problem is that "work" is so obviously a laager, which is marked
off, and that outside this line (as in my examples elsethread of what
happens to the laid-off, and Seebach's somewhat Ted Bundy like
persona) the artificial constraints on expression at work issue in
deviance, here out of control bullying.

I don't agree that I'm making a "tu quoque" argument; I'm making
a point about what I perceive as -- oh, "hypocrisy" is too strong
a word, I suppose, but I can't think of a milder one.

Well how about "I like making tu quoque arguments?"
<snip>
 
R

rigs

snippety snip





Who knew the Frankfurt School offered degrees in Internet Kookery?
 
S

Seebs

Your doubt was well-founded.

Yeah. Never assume that something a narcissist says is going to stay true
if it would prevent him from being the center of attention.

FWIW, I unkillfiled him briefly to see if he'd gotten any closer to lucidity.
He hadn't.

I've addressed what tiny fragmentary legitimate criticism there was over the
older version of C:TCN, and I think I've now adequately established that the
4th edition of C:TCR is still badly written, glossing over or ignoring core
functionality, littered with errors, and just plain not a good book from
which to learn C. I think I'm done with this stuff, unless someone has
something really interesting to bring up.

-s
 
