Efficiency and the standard library

B

blmblm

On Feb 19, 9:28 pm, Nick Keighley <[email protected]> wrote:
spinoza1111 wrote:
Heathfield wrote a "linked list tool"
without pointing to data,
[...] my point was that the DATA in the list is the DATA and
not a link, and nobody with a halfway decent education in data
structures would have proposed such a stupid design, save, perhaps,
for a linked list of data elements whose type is simple and fixed in
length...not a "reusable tool".
Alex Stepanov, the designer of the C++ STL library, is untutored in
[computer science?] [this line reinserted]

note the question mark! See further down in this post for my take on
this.
Did you read Nick's whole sentence before starting to type your
response? He said

It was a rhetorical question. The C++ Standard Template Library is
generally regarded as quite impressive. And Alex Stepanov is generally
regarded as a pretty substantial computer scientist. The STL is
"value" based (this is nothing to do with "values" in the moral or
ethical sense) and copies things.

I was using the STL as an example of a well known and usually highly
regarded generic library that copies things. I really don't understand
how you could fail to understand my point!

which to me says nothing at all about whether he thinks Stepanov
is "untutored in computer science", but is instead an attempt to
suggest that if Heathfield's approach is indeed faulty, he's in
good company.

exactly backwards!

Just for the record -- yes, I rather thought you regarded Stepanov
as more than competent.

If you mean "what?" .... I meant that I thought it was odd
that Mr. Nilges had retained text he apparently ignored in
composing his reply. Something like that.
 
B

blmblm

But how often do they do so in order to call you a douche?

"Massengill" [*] is actually one of the more common variations, and
you are not the first to notice the similarity to the brand name ....

[*] Which isn't actually the variation chosen by Mr. Nilges, exactly.
Never ascribe to incompetence that which is clearly much better explained
by malice, especially coming from someone with a long and visible public
history of excessive malice.

Aside from the accusation of theft of intellectual property, he
hasn't been especially rude to me. "Yet"?
 
S

spinoza1111

spinoza1111 wrote:
spinoza1111 wrote:
The job of a linked list tool is NOT to "read" data.
fair enough. But you need at least one [data] structure to hold the
actual data.
Yes, you "need" "at least one" "[data]" "structure". But making a linked
list is at base creating a view of data and it needs to support a
minimalist way of creating that view.

you could load the data into a linked list and then have other
structures point to it.
Copying the data is one extra
function point which can go wrong when there is a lot of data.

doesn't seem a particularly error-prone area.
"Orthogonal" is here misused since in CS it means "providing a
complete set of features naturally", not "an independent question".

wikipedia:
"Computer science
Orthogonality is a system design property facilitating feasibility and
compactness of complex designs. Orthogonality guarantees that
modifying the technical effect produced by a component of a system
neither creates nor propagates side effects to other components of the
system."

Wikipedia is, as usual, quite wrong. Orthogonality is the provision of
expected features in a natural way: it is symmetry. An orthogonal
language allows the programmer to naturally experiment without
constantly checking a reference manual or, worse, an unreadable
"standard".

C used to be orthogonal. For example, the fact that assignment is an
operator is an orthogonality. In most older languages, it could only
occur in one place, between an lvalue and an expression. But in C, one
can simulate the side-effects that make assembler code more efficient,
because assignment falls under the category of "operator"
orthogonally.
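
To make the point concrete, the classic idiom that assignment-as-operator
permits looks like this (a minimal sketch, not from the original post):

#include <stdio.h>

int main(void)
{
    int c;

    /* assignment used as an expression inside the loop condition,
       something a statement-only assignment form would not allow */
    while ((c = getchar()) != EOF)
        putchar(c);
    return 0;
}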

But because vendor greed drove standardization, we now find a lack of
orthogonality in C, for example in the fact that a pointer to void
isn't really a pointer. It's spiritually non-orthogonal to more or
less forbid the programmer to use it himself while requiring him to
use it in qsort.
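
For reference, the qsort usage at issue looks like this (a minimal
sketch; the array and comparator are illustrative, not from the thread):

#include <stdio.h>
#include <stdlib.h>

/* qsort hands the comparator two pointers to void; the programmer
   must convert them back to the real element type */
static int cmp_int(const void *a, const void *b)
{
    const int *x = a;
    const int *y = b;
    return (*x > *y) - (*x < *y);
}

int main(void)
{
    int v[] = { 3, 1, 2 };

    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
    printf("%d %d %d\n", v[0], v[1], v[2]);
    return 0;
}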

Bureaucrats and other destructive individuals are the enemies of
orthogonality. For example, in some early Fortrans, you could only use
a variable as a subscript. The use of more complex expressions as
subscripts was years later still being called "too complicated" in
Fortran code reviews, and this, in my opinion, was bullshit.
as does the C++ STL



links could be indices into arrays

(Sigh) I suppose so. Although I don't know why you'd want to allocate
a huge array and "point" to it when you have pointers.
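
For what it's worth, the index-based variant Nick mentions looks
roughly like this (a minimal sketch; the names and the fixed pool size
are assumptions, not from the thread):

#include <stdio.h>

/* a list whose links are array indices; -1 plays the role of
   the null pointer */
struct inode {
    int value;
    int next;
};

int main(void)
{
    struct inode pool[4];
    int head, i;

    pool[0].value = 10; pool[0].next = 2;  /* links need not be sequential */
    pool[2].value = 20; pool[2].next = -1;
    head = 0;

    for (i = head; i != -1; i = pool[i].next)
        printf("%d\n", pool[i].value);
    return 0;
}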
Python quote

Members of the upper classes, such as were the Cambridge students who
founded Python and the East Coast elite who started Saturday Night
Live, use humor to disempower even when they're funny, as they often
are.
no, the preprocessor isn't the only way.

Didn't say it was. Said it was the best way given the problem
statement, which was to create a basic reusable linked list.
you put some sort of marker in the data to indicate its type. I'm not
sure how you are using the preprocessor or why you are so insistent
that your way is the only way.

It's not the only way, but it's the best way given the problem
statement. I can avoid using pointer to void.

I would probably write a generic void* based linked list then wrap it
with something that was type specific. That might involve the pre-
processor or a simple code generator.

In other words, you're going to sorta do it my way and steal the
credit. Knock yourself out.
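
The generic-list-plus-wrapper approach Nick describes might look like
this (a minimal sketch; the names gnode, gpush, and push_double are
invented for illustration):

#include <stdlib.h>

/* generic node: the payload is an untyped pointer */
struct gnode {
    void *data;
    struct gnode *next;
};

/* push onto the front of the list; returns the new head,
   or NULL on allocation failure */
static struct gnode *gpush(struct gnode *head, void *data)
{
    struct gnode *n = malloc(sizeof *n);
    if (n != NULL) {
        n->data = data;
        n->next = head;
    }
    return n;
}

/* type-specific wrapper: restores type checking at the interface */
static struct gnode *push_double(struct gnode *head, double *d)
{
    return gpush(head, d);
}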

But you can avoid any use of void pointers:

/* requires <stdlib.h> for malloc and <assert.h> for assert */

#define NODE(nodeName, type) \
struct nodeName { type *link; struct nodeName *next; };

#define MAKENODE(nodeName, data, p) \
( (p = malloc(sizeof(struct nodeName))) == 0 ? 0 : \
  ((*p).link = (data), (*p).next = 0, p) )

#define ATTACHNODE(list, newNode, p) \
{ assert((list) != 0); p = (list); \
  while ((*p).next) p = (*p).next; \
  (*p).next = (newNode); }

Sweet... you've solved the problem. No O(N) byte moves at all,
although ATTACHNODE is of order the number of entries and needs to be
improved. The above code is extempore, so it may contain errors. But
it's still sweet if you ask me. Let me know if you see any problems, and,
when I get time, I will make it into a real program. And blow
Heathfield out of the water. As usual.
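
As a concrete check, the macros above might be exercised like this (a
sketch; the int payload and the names are assumptions, and handling of
malloc failure is omitted):

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

NODE(intNode, int)               /* declares struct intNode */

int main(void)
{
    int a = 1, b = 2;
    struct intNode *head, *tail, *p;

    MAKENODE(intNode, &a, head); /* node pointing at a */
    MAKENODE(intNode, &b, tail); /* node pointing at b */
    ATTACHNODE(head, tail, p);   /* O(n) append, as noted above */

    for (p = head; p != 0; p = (*p).next)
        printf("%d\n", *(*p).link);
    return 0;
}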
 
S

spinoza1111

spinoza1111  said:
[ snip ]

[ snip ]
Thanks for finding this bug, Professor!

In the context of these discussion (and also in your change record),
"BL" or "Ms" is fine.  (I would actually not mind *not* being credited.
Up to you, maybe.)
Yes, you are correct. I did not initialize the ptrSegmentStructStarts
pointer and the program failed on a null master string. Since it is
without meaning to search for a null target, I'd inserted a check for
a null target, but a search in a null master is correct.
I'd been laboring under an incorrect presumption, which was that the
dialect of C supported by Microsoft C++ Express .Net in C mode does
not support variable initialization, therefore I'd initialized
selectively and "by hand". I was also mistakenly under the impression
that the pointer would get initialized inside the loop the first time
a structure instance is created but NO instances need be created, nor
are created, for null masters.

Interesting that gcc warned me that variable might not be properly
initialized, and apparently the compiler you're using didn't ....
Then again, I almost always compile with flags that generate extra
warnings; when I compile without those, I don't get the warnings
about possibly-uninitialized variables.  Maybe the compiler you're
using also has some settings that would have helped you avoid this
bug?

My "warning level" in Microsoft C++ Express .Net (used in C mode with
a file type of C for the source) was W3. I set it to the highest
level, W4, and removed an initialization in a declaration. I got
warned about the uninitialized variable and two other issues as well,
where I didn't use something. Thanks for the sage advice, Professor
Massingill, I shall use W4 from now on.
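
For readers following along, the kind of code at issue looks something
like this (a contrived sketch, not the actual program under discussion):

#include <stdio.h>

int main(void)
{
    int n;                  /* initialized on only one path */

    if (getchar() == 'y')
        n = 1;
    /* Microsoft's C4701 ("potentially uninitialized local variable
       used") is a level-4 warning, reported at /W4 but not at /W3;
       gcc's -Wall typically flags the same line, at least when
       optimization is enabled */
    printf("%d\n", n);
    return 0;
}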
 
S

spinoza1111

spinoza1111  said:
[snips]
On Sat, 13 Feb 2010 22:17:10 -0800, spinoza1111 wrote:
No argument there, none whatsoever. But lemme ask you a question. Where
the **** do you get off criticising people who are, if not perfect,
better than you? People like Schildt?

[ snip ]
Arguably, McGraw Hill should (or should have, when the book was
published) devote/d more resources to debugging, and that they were
inclined to do so is indicated by the fact that they offered Peter
Seebach, whom they did not know, money to participate. He refused
their offer, he tells us, because it wasn't enough...money.

Has he not said, in this discussion, that in hindsight that might not
have been a good decision?  People with little experience of the world
sometimes make decisions that they later recognize as mistakes.  Why
keep bringing up a decision that has already been admitted by, um,
the decider, to have been flawed?  (Rhetorical question, really.)

I don't know where he said that. However, he can delete "C: The
Complete Nonsense" and I believe he should do so based on its
questionable content and the fact that it targets a person without
advancing the state of the art.
 
W

Walter Banks

Richard said:
It is clear, however, that Don Knuth has a very different
story to tell!

They will happily take and publish the manuscript when
he says it's done :)

w..
 
R

Richard Tobin

spinoza1111 said:
Wikipedia is, as usual, quite wrong. Orthogonality is the provision of
expected features in a natural way: it is symmetry. An orthogonal
language allows the programmer to naturally experiment without
constantly checking a reference manual or, worse, an unreadable
"standard".

Orthogonality is a widely-used metaphor, whose metaphier is the
arrangement of lines at right angles to each other. The main
paraphier is the fact that moving in one dimension doesn't affect your
position in a perpendicular direction: moving east-west doesn't affect
your north-south position. The paraphrands are thus varieties of
independence and non-interference.

Of course such independence will often produce the effect you describe:
once you have learnt about arrays by using an integer array, you will
be able to apply your experience to arrays of doubles or structs
without reconsulting a manual.

(For the terms "metaphier" etc, see Julian Jaynes's "The Origin
of Consciousness in the Breakdown of the Bicameral Mind".)

-- Richard
 
N

Nick Keighley

<snip>

[from Pascal to C]
Funny, I was much the same.  "WTF is with all those stupid braces?  Gimme
a break."

I can't say I necessarily love C for aesthetic reasons; it can still
often look like a cross between line noise and something the cat barfed
up,

you've never come across perl then? :)

but then I compare it to languages where whitespace matters (and tab
vs space controversies can *kill* an app) and thank the PTB for a
language which, however aesthetically questionable, at least isn't
designed to drive me insane. :)

I get on fine with python; I thought someone had stolen my p-code.
 
S

spinoza1111

spinoza1111  said:
[ snip ]




Because in fact men have been in most societies definitional of full
humanity. This is unfair to women but the fact is that in patriarchal
societies most women haven't been independent enough for us to make
all but a few (Joan of Arc, Marie Curie, Sophie Germain) into
exemplars of what it is to be human.
When I use the words, "act like a man" I mean acting like a free and
independent moral agent who tries to do the right thing, like my
father for example, or me at my best. I think most of the male posters
here are not men but guys.

I'm inclined to think that the admonition "act like a man",
addressed to a woman, means something different from the same
admonition addressed to a man.  But it's possible I'm being
influenced by the difference between "he acts like a man"
(almost invariably positive) and "she acts like a man" (in some
contexts negative).  "Whatever", maybe.

I have to explain my use of words more than the average person because
the language itself is changing for the worse. I believe that World
Wars I and II had an effect on the language by killing so many decent
men, and that post-WWII corporate life led to the creation of poor
role models for boys, in the form of absent or cowardly fathers and
over-mothering. The American writer Richard Yates (author of the book
that became the recent movie Revolutionary Road) said it most
succinctly: World War II for American GIs was a father who was never
satisfied with you. Likewise, the best of Britain's men perished on
the Somme, their lives thrown away by a generation of uncaring
"fathers".

This caused men to forget how to be men, because the world wars made
masculinity an impossible ideal. In programming, what Dijkstra called
the "cruelty" of computer science is the fact that it presents Father-
like demands, and the default in programming environments outside of
certain government laboratories is to consistently expect programmers
to "prove themselves" by working unpaid hours. Women programmers have
escaped this in part by creating productive and humane development
teams or by leaving the field; a New York Times followup study of
women computer science graduates from Princeton found that most of
them had left for "softer" professions.

The result is the personality damage seen in this newsgroup, and the
constant, grinding transformation of what could be decent and humane
technical discussion into a form of scapegoating, in which programmers,
all too inwardly conscious of inadequacy, reason that because
Schildt or Nilges is so "incompetent", they must be competent,
normal individuals. It's crazy, because none of us can post bug-free
code without at least testing and at best advice from people such as
you or Bacarisse, and one of us (who is loudest in the denunciation of
Schildt, who is the source of the Schildt canard) posted a one line
strlen that was off by one...which was gravely tech reviewed by
another of us (who likes to call people "trolls") and approved.

The "structured walkthrough" movement was a product of the Sixties, in
which people like John Lennon said that they didn't have to be
warriors and right all the time to be men, and who were willing to
accept and celebrate their love and vulnerability. Gerald Weinberg was
one of the few employees of IBM to wear colored shirts and sport a
beard, and his humanism informed his book "The Psychology of Computer
Programming". But Lennon was shot, and the structured walkthrough
quickly became the structured walkover.

I had dinner with women on the business page staff of the New York
Times of the 1980s including Sylvia Nasar, who did a series of
articles for the Times on John Nash and for whom I was a source. They
were fed up with the nonsensical games they were being forced to play
by their male managers merely to keep their jobs, and most of them
(including Nasar) fled the Times for jobs teaching journalism. Their
managers had collectively decided that "nice guys finish last" and
forced their staff to snitch and backstab as in my experience
programmers were snitching and back-stabbing at Bell Northern
Research.

A woman I worked with at Bell Northern Research tried to enforce
software quality standards that had been mandated, in writing, by
BNR's parent, Northern Telecom. Her male reports complained to her
manager that she was "wasting time" by asking them to follow the
rules, and her (male) manager told her to *ignore* the Northern
expectations, because, he said, "this is a dog eat dog world here in
Silicon Valley, and we have to beat the competition to market". He
gave her a poor performance review.

I think the scapegoating of Schildt is normed here because, in fact,
most posters, on their real jobs, make many far worse mistakes, and
are forced to make other mistakes...such as a one-off program that
breaks when % is not followed by s. To tell a programmer (as often
happens) who's taken the same amount of time, a shorter amount of
time, or even a longer amount of time by writing a replace() in the %s
case that he's "wasted time" is in my view an offense to simple human
dignity, because even in situations where we're told to do a bad job,
we need as human beings to do the best job.

It's logically a straight self-contradiction to praise a person for
doing a slapdash job in a short amount of time, as even Kernighan (in
the recent O'Reilly collection Beautiful Code) praises Rob Pike for
writing something that is NOT a regex processor in only an hour. In
view of the real net harm done by so many software systems ("rocket
science" software causing the credit crisis, "intelligence" software
telling us that Saddam Hussein had WMDs, body-counting software
telling Bob McNamara that we were "winning" in Vietnam, etc.), the
world would be far better off if software took a lot longer. During
its extended development time, programmers could be mandated to get
input, not just from strategically placed thugs in management, but
from the actual workers, consumers, and members of the public, even as
Bjarne Stroustrup and his mentor, Ole-Johan Dahl, were mandated in
Denmark, by Danish law, to review their ideas with factory workers...a
mandate from which we get object-oriented programming.

But as it is, Seebach is the deviant norm, who's attacked Schildt for
bugs but expects us to forgive him his bugs and to agree that he's
both very good at what he does yet not, in a contradiction I have yet
to figure out. Sure, it's true that the great programmer is humble
about his own capacities, and it is true that Seebach speaks, as
Heathfield speaks, with what I consider a rather nauseating, Uriah-
Heepish humility at times.

But it's news to me that humility can be delinked from charity, and
that one can pretend to be "humble" when calling Nilges a "moron" or
Schildt incompetent. It's true that I give as well as I get, but I
don't claim to be humble, do I? Increasingly, delinked humility linked
instead to vicious bullying strikes me as a loss of true masculinity.

The "male" competition at the New York Times was such an all-
consuming, no-holds barred affair that it had toxic results as it does
in programming. As a direct result of the pressure on reporters,
Jayson Blair ruined the Times' reputation by filing phony stories
earlier this decade. In programming, it made a fetishized default out
of incorrect software done "on time", such as Seebach's code for (not)
finding %s and Pike's non-regex processor.

Men in this ng (one hesitates to apply the word) become the Father of
WWII and the Somme for the same reason the (on the face of it absurd)
"Swift Boating" of a real military officer worked in the 2004 election
in the US. You can't say, for example, "I wrote a compiler" to people
who've never written a compiler and don't have clue one how, without
them wasting your time parsing "compiler" to exclude a compiler that
generates interpreted code...despite the fact that historically, many
compilers do just that. This is the ethic of the chicken hawk, who sat
out the Vietnam conflict with a hairy growth on his butt like Rush
Limbaugh, but is all too ready to challenge Kerry. The chicken hawks
of this newsgroup have learned the political lesson all too well, and
they will stop at nothing to make their case.

I like it when women act like the best of men: competently, wisely,
compassionately and never pointing the finger at a scapegoat, instead
(like you) focusing on problem solving. I don't know why "acting like
a man" for the "feminist" woman became acting like Margaret Thatcher, and
authorizing the slaughter of 300 Argentine seamen to save her Prime
Ministership. "Being a man" in my book means building life skills so
you don't have to be a cheesy little office backstabber SOB. I'd
expected that programming would be such a skill, but "rationalization"
in an irrational world means ensuring that anybody, including the
least intelligent and least principled, can crank code.
 
S

spinoza1111

But how often do they do so in order to call you a douche?

"Massengill" [*] is actually one of the more common variations, and
you are not the first to notice the similarity to the brand name ....

[*] Which isn't actually the variation chosen by Mr. Nilges, exactly.
Never ascribe to incompetence that which is clearly much better explained
by malice, especially coming from someone with a long and visible public
history of excessive malice.

When one sups with the Devil, one needs a long spoon:
To fight the greater malice I may seem malicious.
But I call selected targets "douchebags" once in a blue moon:
Because in all truth, I regard their conduct as malicious.
I've turned the other cheek far too often in my life
Only to get bitch-slapped within an inch of my life.
The time has come to speak the inconvenient truth
Which I shall do so best as I can without favor and without Ruth.
The douchebags here, and you know who you are
Have too long shat 'pon the intelligent and fair:
I think Heathfield has little more than a big mouth:
I think Kiki should take a long rest down South.
And as for Seebach don't get me started:
Schildt ain't perfect, but Seeb's just farted.
Aside from the accusation of theft of intellectual property, he
hasn't been especially rude to me.  "Yet"?

That accusation I withdraw, because as soon as I noted the omission
you corrected it. Your conduct is exemplary.
 
S

spinoza1111

spinoza1111 wrote:

I see no reason why that should be the case. If Seebs were to point out
what he thought was an error in the book, which was not in fact an error
but purely an artifact of Seebs's ignorance in that area of the
language, then anyone who knows the language can point this out. The
detection of errors is a matter of objectivity, not supposed authority
based on "qualifications". In fact, the appalling nature of Schildt's
books serves, if anything, to cast a certain amount of doubt over the
validity of those qualifications.



Wrong. For example, Clive Feather is a recognised authority on C (e.g.
he administered the UK Public Comments for all Committee Drafts for
C99). He has published a critique of Schildt's "Annotated C Standard".
The C FAQ was written not by Seebs but by Steve Summit, another
recognised authority on C. (For example, W Richard Stevens acknowledges
him in UNP.)


And how many of these "balanced commentators at the Amazon sites" are
recognised authorities on C?


It has been, not only by Seebs and Clive, but also by many other C
specialists. I have yet to learn of any recognised C authority who is
prepared to defend Schildt's text.

<snip>

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -http://www. +rjh@
"Usenet is a strange place" - dmr 29 July 1999
Sig line vacant - apply within

Test reply
 
S

spinoza1111

Only if "qualifications" change to "being intimately familiar with,
and treating as Holy Writ, the mistakes of the past as sanctified by
vendor greed". Yes, they don't teach you in computer science skewl how
to live with a fucked up feof() (or how to **** it up) or that it's an
unholy thing to return something other than an int to the holy OS.
Industry has in fact a very poor track record in what should be Job
One, which is correct software. The only solution for C, which is a
laboratory for creating messes, is to use it to get out of C, but this
was not Herb's mandate.

It's arguably far more appalling to make the claims you make in C
Unleashed. One such claim is that C is viable for creating new
software. Another is that it's cool to create a "reusable linked list"
by unnecessarily copying unpredictable amounts of data into each node.

Both these errors illustrate the saw that an "expert" is one who
avoids the small errors while sweeping on to the grand fallacy.

Both used Seebach's nonsense as their source. Furthermore, if Seebach
essentially bought his way onto the committee without a single
computer science class, I'd be inclined to question Feather's
qualifications.

It's the wikipedia problem. Pimply little convenience store clerks MAY
know a lot, but there's absolutely no way of certifying their
knowledge. You abandon academic qualifications and substitute a false
"knowledge" learned in little computing shops. Why should we trust
you?


The most sensible was.

It needs neither defense nor attack. Under US law, a book author is in
fact free to make unknowingly or even knowingly false claims on
anything under the sun (and Herb's claims were made unknowingly when
false). This is because the onus is on the user to apply Herb's
lessons in a professional manner, which means just using it as a
starting point.

As soon as you start to criticise even trivial technical tomes as
"perniciously wrong", you're on the way to book burning.

In the case of feof(), for example, it might be better to use it to do
one extra read unnecessarily, decrementing any count you make by one,
simply to make the result more Pascal-like, since Pascal and not C is
the model for software reliability. But if I'd been Herb, I'd have
recommended a macro approach to the problem.
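
For context, the feof() pitfall being alluded to is the classic one
(a sketch, not Schildt's or anyone else's actual code):

#include <stdio.h>

/* feof() reports end-of-file only after a read has already failed,
   so a loop guarded by !feof(fp) processes one bogus extra item
   and overcounts by one. The usual correct idiom tests the read
   itself: */
long count_chars(FILE *fp)
{
    long count = 0;
    int c;

    while ((c = fgetc(fp)) != EOF)
        count++;
    return count;
}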
 
S

spinoza1111

[snips]
On Sat, 13 Feb 2010 22:17:10 -0800, spinoza1111 wrote:
No argument there, none whatsoever. But lemme ask you a question. Where
the **** do you get off criticising people who are, if not perfect,
better than you? People like Schildt?
[ snip ]
Arguably, McGraw Hill should (or should have, when the book was
published) devote/d more resources to debugging, and that they were
inclined to do so is indicated by the fact that they offered Peter
Seebach, whom they did not know, money to participate. He refused
their offer, he tells us, because it wasn't enough...money.
Has he not said, in this discussion, that in hindsight that might not
have been a good decision?  People with little experience of the world
sometimes make decisions that they later recognize as mistakes.  Why
keep bringing up a decision that has already been admitted by, um,
the decider, to have been flawed?  (Rhetorical question, really.)

And here's a rhetorical answer. If spinoza1111 allows (as the rest of us
allow) mistakes to drop once they have been acknowledged by the person
making the mistake, he'd have nothing to say. And we can't have that,
can we?

I'm afraid that Seebach's track record (and Kiki's) (ruining Schildt's
name over trivia: calling people "trolls") justifies demonstrating
that they make the errors they decry in others.

Nay, I will: that's flat:
He said, he would not ransome Mortimer:
Forbad my tongue to speake of Mortimer.
But I will finde him when he lyes asleepe,
And in his eare, Ile holla Mortimer.
Nay, Ile haue a Starling shall be taught to speake
Nothing but Mortimer, and giue it him,
To keepe his anger still in motion.
 
S

spinoza1111

[snips]
On Sat, 13 Feb 2010 22:17:10 -0800, spinoza1111 wrote:
No argument there, none whatsoever. But lemme ask you a question. Where
the **** do you get off criticising people who are, if not perfect,
better than you? People like Schildt?
[ snip ]
Arguably, McGraw Hill should (or should have, when the book was
published) devote/d more resources to debugging, and that they were
inclined to do so is indicated by the fact that they offered Peter
Seebach, whom they did not know, money to participate. He refused
their offer, he tells us, because it wasn't enough...money.
Has he not said, in this discussion, that in hindsight that might not
have been a good decision?  People with little experience of the world
sometimes make decisions that they later recognize as mistakes.  Why
keep bringing up a decision that has already been admitted by, um,
the decider, to have been flawed?  (Rhetorical question, really.)

And here's a rhetorical answer. If spinoza1111 allows (as the rest of us
allow) mistakes to drop once they have been acknowledged by the person
making the mistake, he'd have nothing to say. And we can't have that,
can we?

Since when, Jerk Face, did you let go of Schildt? You've brought him
up repeatedly: I learned about his name from tirades you posted ten
years ago. He's been made a byword and a bad word by you creeps.

This is why Seebach shall ever in my book be Off By One Peter, and
Kiki, He Who Missed It.

You enabled a ridiculous and overblown campaign by Programmer Dude in
2003 against me.

Therefore you shall be "Mr Doesn't Know How to Construct a Linked
List".

Knock your shit off and apologize to the numerous people whose
reputations and employability you've impugned over the years, people
like Navia, and I shall stop bringing up your silly linked list. Knock
off calling people incompetent or "trolls" on insufficient data and a
narrow, technical school education and a Fundamentalist world view.

Then I'll forget your silly mistakes and those of Seebie and Kiki.
 
S

Seebs

I have seen quite a few authors' contracts; in every one the
author sets their own delivery dates, not the publisher.
How is Apress different?

In my case, tools problems ended up making it much harder to make progress
on schedule than planned, so the schedule slipped some.

It turns out that having to do writing in a buggy version of MS Word (think
"if you switch between Word and another app more than a few times, it
crashes") dramatically reduces my productivity.

-s
 
S

Seebs

actually it's your obsession. I doubt he'd get a mention if you didn't
keep banging on about him.

Pretty much. I'd completely forgotten about him by 1999 or so. I got maybe
one comment on the C:TCR page in the following decade, which was actually
after the Great Nilges Schildtwar began, possibly from someone who was
influenced by him. Then Nilges posted his hilarious screed to clcm (and he's
still never answered the substantive questions I asked), and it's been funny
again.

I didn't expect anyone to still care.

-s
 
S

Seebs

Has he not said, in this discussion, that in hindsight that might not
have been a good decision? People with little experience of the world
sometimes make decisions that they later recognize as mistakes. Why
keep bringing up a decision that has already been admitted by, um,
the decider, to have been flawed? (Rhetorical question, really.)

I have no idea. But if he's at least gotten to the point of admitting that,
it's a step up from the repeated assertions that McGraw-Hill rejected my
criticisms.

-s
 
S

Seebs

This brings up a vaguely topical question:

Can anyone name a dialect of C that has ever not supported variable
initialization?
Interesting that gcc warned me that variable might not be properly
initialized, and apparently the compiler you're using didn't ....

MS is not famous for great compiler work, or for reasonable default
options.

-s
 
S

Seebs

You misunderstand. BLM is referring not to spinoza1111's hindsight, but
to yours - i.e. to your acknowledgement (some time ago now) that you
might have been wiser to accept McGraw-Hill's pittance.

Ahh.

And so I see.

....

D'oh! AND THE LIGHT GOES ON!

I think I have come to understand why it is that spinoza1111 has consistently
missed these things.

Have you ever noticed how some of his posts have quoted material ending with
something like "Show quoted text" or "read more"?

I betcha that, if you were to look at the posts in which I first pointed that
out in Google Groups, you might find that my explanations were *below* that
line. And in general, he has consistently failed to respond to information
past that line, suggesting that he doesn't know how to show the rest of the
post.

Which might explain why he's never responded to or acknowledged the multiple
times I've said this. But not all of them:

Message-ID: said:
1. Notice many errors described on Usenet.
2. Find spectacular examples (e.g., "if (x<>1)") in a coworker's copy.
3. Check current ed in bookstore, find many errors remaining.
4. Write McGraw Hill to complain.
5. Get answer back offering small honorarium for a tech review.
6. Send them a note saying that it would cost more than that. (An error
on my part, because I didn't understand publishing at all.)
7. Get a fax from them, containing a forwarded fax from Schildt, containing
two pages or so of standard quotes and poor arguments defending "void main".
8. Decide to just go ahead and write stuff up.

That was in mid-November. I think there were other earlier comments to
similar effect, but my newsreader wasn't archiving at the time.

-s
 
