subroutine stack and C machine model

B

Ben Pfaff

Richard Heathfield said:
But it's still a bad idea. I remember that, a few years ago, someone
(and I'm reasonably sure it was you, Ben) suggested all getting
together for some kind of meet. My reaction (paraphrased here, since
I can't face battling the archives right now) was less than
optimistic: "it'd be five minutes of excited chatter, ten minutes of
heated debate, fifteen minutes of mindless violence, and a night in
the cells".

Yes, that was me, back in 2001. But it turned out well: Chris
Torek, Micah Cowan, and myself managed to have a good
conversation over a decent Chinese restaurant meal in downtown
Palo Alto. No violence or legal proceedings ensued.
 
K

Kenny McCormack

Oh puh-leaze.

Actually, I believe him. But note that what he says is simply "...
strike me ..." - i.e., the statement is entirely self-referential.

Explanation by way of analogy: If someone tells you "I believe the sky
is purple", there is no way to falsify that statement.

Similarly, I'm not the least surprised by Seebs's statement above. I'm
not at all surprised that Seebs thinks that way. In fact, nothing the
regs say anymore surprises me. They are a very self-selected bunch.
 
S

Seebs

Oh puh-leaze.

Obviously, you're one of the other exceptions. Same issue; you view the
world in terms of cliques and groups and status, and can't handle the concept
of a claim being technically correct or incorrect regardless of who looks
good if we accept the claim.

I have worked with people with that trait before. Well, no. I've shared
a workplace with people with that trait before. Generally, people who
actually wanted to get stuff done were able to mitigate the harm.

If you ever want to try judging things based on technical merit rather
than envy, you may find it rewarding. It has cool benefits, such as actually
producing usable results.

-s
 
S

Seebs

Similarly, I'm not the least surprised by Seebs's statement above. I'm
not at all surprised that Seebs thinks that way. In fact, nothing the
regs say anymore surprises me. They are a very self-selected bunch.

Just as you guys are. You've got your little "we're the out-group, we're
so cool, we know how things REALLY work" meme going, you select other people
who are jealous and would rather attack people than solve problems, and you
sit around being smug.

The only difference? We produce working code. You produce snide remarks
and envy.

-s
 
S

Seebs

Delighted to hear it, but the above raises a poignant point - when
will Chris T return to comp.lang.c? (And how long must we put up with
the antichris?)

Dunno. But given that he's one of my coworkers these days (and, curious
data point, this is intentional -- I tracked down his employer and tried
to get a job there some years back, because I figured he wouldn't be working
there if it sucked), I'll ping him.

-s
 
S

Seebs

I have no idea what you're waffling on about. That para above was
meaningless.
Doubtless.

I view the world in terms of what makes things tick. And a bunch of
pedantic morons preening themselves rarely makes for a good productive
development team.

Of course not. No group of morons preening themselves makes for a good
productive development team.

The mere fact that you participate in the delusion about "the regulars"
was enough to establish that you're mistaking status for reality. The
"preening" remark confirms that you mistranslate your experiences into
narratives about status. Sure, status exists -- but most stuff isn't
about status, and indeed, a fair bit of stuff doesn't involve it at all.
If you sit around thinking about the "regulars", you're wasting your
time on illusions.
Envy? What the **** are you talking about?

Your remarks about "the regulars". You are acting precisely as someone
acts who believes that there is a status relationship going on, and that
e has been unfairly denied high status. You consistently justify and
support your actions in terms of how you compare yourself to "the regulars",
rather than in any functional terms.
I have about 25 years
development experience and am well able to judge (to my standards of
course) how and what makes a positive contribution to team spirit and
productivity and I can assure you people who constantly nit pick do not.

I don't believe you. Quite simply, anyone well able to judge what improves
productivity knows that "nit picking" is a derisive way of stating a technical
reality: Finding bugs before they bite.

The reason we pick nits is that nits make lice. The reason we point out
possible problems that could, on some systems, not turn out to create a
bug yet, is that they are likely to in the future.

In short, I don't buy it. Constant nit-picking, done by competent and
experienced people, produces better code -- and the people you jump on
for nit-picking here are consistently good at catching things which really
will affect real-world systems.

Someone who is well able to judge impact on productivity ought to be aware
of that. Now, team spirit is more open to debate. Some people feel that
being "picky" is bad for morale. However, it has been my experience that
if your choices are "good" morale based on not correcting errors, or
correcting errors come what may, you should probably correct errors. The
question of what is good for team spirit, then, becomes one of whether we
should lie to people or figure out how to teach people the art of accepting
constructive criticism graciously. The latter, it turns out, can also produce
good team spirit. Much better team spirit, in fact.

Think about it this way. If you had a code reviewer whom you knew would
never raise "picky" points for fear of hurting your feelings, would you
have justified confidence in your code reviews? No, you wouldn't. But if
you had someone like me poking at things and asking questions about
hypothetical future compilers, and you managed to get code past a review,
you could be pretty sure that the code was actually good -- and would stay
good through the next major compiler update.

(This is why people pick me to review stuff that they never, ever, want to
have to touch again. You can sneak stuff past me, but it's hard, because
I am a very picky reviewer.)
It's not envy. Almost the opposite I can assure you.

It is functionally envy, even if the internal narrative puts it the other
way around.
This is not meant in any way denigrate your technical knowledge and I am
also not including you (yet) in the clique to whom I refer.

That you think there is a significant clique suggests that you are imposing an
inappropriate narrative on a world it does not accurately describe.

-s
 
I

Ian Collins

Richard said:
I have about 25 years
development experience and am well able to judge (to my standards of
course) how and what makes a positive contribution to team spirit and
productivity and I can assure you people who constantly nit pick do not.

Those who nit pick and are ignored are the ones who say "I told you so"
when an expensive and embarrassing bug escapes into the field. Every team
needs a nit picker or two to keep the bugs in check.
 
K

Kenny McCormack

....
That you think there is a significant clique suggests that you are
imposing an inappropriate narrative on a world it does not accurately
describe.

Or that you are completely insane.

Me, I choose the latter. Something about Occam and a razor.
 
S

Seebs

Or that you are completely insane.

Not hugely likely. In particular, since I'm mildly insane in a
well-understood way, we have a GREAT deal of data about my internal mental
state.
Me, I choose the latter. Something about Occam and a razor.

Ahh, well. Funny you should mention that. Occam's Razor is specifically
the assertion that you should not unnecessarily multiply entities -- that
is to say, don't assume something exists if you can explain the world without
it.

As it happens, I don't think there's a clique thing going on, and I don't
need to appeal to anything special to explain the apparently-mysterious
tendency for the so-called "regulars" (along with nearly any other competent
programmer) to tend to agree:

We're discussing engineering. C programmers agreeing as to whether a
particular technique is unportable is about as much evidence of a clique
as structural engineers agreeing as to whether a given design is likely
to collapse.

You're observing a number of people, many of whom are experts or at least
pretty experienced in a very well-researched field, responding to the same
sets of posts. That they tend to agree is not evidence of some kind of
hidden social norming going on. It's just what you'd ordinarily expect. If
you watch, in fact, you'll note that there are consistent patterns of
disagreements on some issues, of about the sort you'd normally find between
people who have different preferences but accept the same underlying claims.

The interesting question is, have there been recorded instances of people
who think everything is about status imposing a status narrative on events
which had no such causality in them? Why, yes. In fact, it's a
well-documented phenomenon, and in general, we know that people's accounts
of social events are only marginally reliable, and especially
unreliable when they involve people who have different baseline social
instincts.

If I were describing your behavior, I would likely mistakenly underrate
the degree to which you were weighing status concerns or trying to react
to social groupings. I know about this, but I don't try to correct for
it very much because it's hard for me to do accurately. But, correspondingly,
you're very likely to overrate the degree to which I am weighing status
concerns or trying to react to social groupings.

The difference is that, for me, it's a conscious process, so I can discuss
it and try to correct for it, and in you, it's instinct, and until you're
willing to consider the possibility that your instincts are not an infallible
source of information about other people's thoughts, you can't do anything
about it.

-s
 
S

spinoza1111

Obviously, you're one of the other exceptions.  Same issue; you view the
world in terms of cliques and groups and status, and can't handle the concept
of a claim being technically correct or incorrect regardless of who looks
good if we accept the claim.

I have worked with people with that trait before.  Well, no.  I've shared
a workplace with people with that trait before.  Generally, people who
actually wanted to get stuff done were able to mitigate the harm.

If you ever want to try judging things based on technical merit rather
than envy, you may find it rewarding.  It has cool benefits, such as actually
producing usable results.

The problem is that technical merit, whether defined by a workgroup or
an individual, is founded on a vast number of unfortunately
sociological premises. He-who-codes is at the end of a vast amount of
technical work and presumes such work. He is in Francis Fukuyama's
terms a "last man" who contains within himself the seeds only of
devolution.

Evidence here is the mistreatment of Schildt.
 
S

spinoza1111

Just as you guys are.  You've got your little "we're the out-group, we're
so cool, we know how things REALLY work" meme going, you select other people
who are jealous and would rather attack people than solve problems, and you
sit around being smug.

The only difference?  We produce working code.  You produce snide remarks
and envy.

The good old yawp. "My code works", where "works" is never really
defined and means on investigation "consumes resources and the user is
happy".
 
S

spinoza1111

Of course not.  No group of morons preening themselves makes for a good
productive development team.

The mere fact that you participate in the delusion about "the regulars"
was enough to establish that you're mistaking status for reality.  The
"preening" remark confirms that you mistranslate your experiences into
narratives about status.  Sure, status exists -- but most stuff isn't
about status, and indeed, a fair bit of stuff doesn't involve it at all.
If you sit around thinking about the "regulars", you're wasting your
time on illusions.


Your remarks about "the regulars".  You are acting precisely as someone
acts who believes that there is a status relationship going on, and that
e has been unfairly denied high status.  You consistently justify and
support your actions in terms of how you compare yourself to "the regulars",
rather than in any functional terms.


I don't believe you.  Quite simply, anyone well able to judge what improves
productivity knows that "nit picking" is a derisive way of stating a technical
reality:  Finding bugs before they bite.

But complementary to your ability to solve narrow problems, which is
probably in your case admirable to the point of overdevelopment, as is
Ben Bacarisse's certainly, is associated the inability to organize a
coherent case against Schildt without blowing trivia up. It's the fact
that you have no real way of explaining how, given the fact that
compiler optimization can (and should) be completely independent of
source code, the indeterminacy of a rather incoherent set of C
constructs (a()+b() versus a()||b()) makes code easier to optimize.

You also need to face this autism issue squarely, where autism means
in some measure the inability to recognize the personhood of other
people and to assign the most charitable meaning to what they say, all
other things being equal.

I now realize given your own admission that presuming that Herb didn't
know about alternative LIFO structures or free(NULL) was what autistic
people do. They form a set of syntactical rules that allows them to
ignore the issue of motivation, interpretation and meaning. They are
only able to function within these rules.

Given that, and given the fact that you persist in the autistic
assumptions made in "C: The Complete Nonsense", I don't think it would
be appropriate for you to rewrite the document. I think it would be
appropriate to replace it by an apology.

I don't like having to link this issue to your self-confessed
limitations, but people here, including you, have shown no restraint
in making all sorts of wild and unfounded statements about me.

I suggest it's time for you to apologize to Herb.
 
S

Seebs

I suggest it's time for you to apologize to Herb.

Yes, you do, over and over.

But.

Since you have demonstrated yourself consistently unable to get even
the most fundamental facts about C right, *why should I listen to you*?

You're quite simply not qualified to speak as to whether the document
in question is good or bad. You don't understand the material to begin
with. You've gotten everything wrong from order of evaluation to
whether or not the operand of ++ needs to be an lvalue (it does, in C,
and always has).

Given this, there's simply no point in placing any weight on what you
think about anything to do with C. You make up elaborate conspiracy
theories about the standardization process, and when it's revealed that
your completely made-up stories were wrong, you make up new stories about
how that's bad too.

There is nothing to suggest that your responses here are in any way based
on the actual C language, or the actual process of standardization. Whatever
your problem is, it's got nothing to do with C, and nothing to do with the
people who worked on the standard. It doesn't seem to have anything to do
with Schildt, either -- surely, if you actually cared about his books,
you'd have read them, right? But you obviously haven't, because you can't
even get things right when he does a nice job of making them perfectly
clear to even a novice programmer.

It is totally unclear why you think you care about this. You have made it
totally obvious that you don't have any actual interest in any of the subject
matter, or any knowledge of it. We can only presume that it's a proxy for
your battles with your inner demons, but really, the correct venue for that
is not comp.lang.c. Go write a Russian novel or something.

-s
 
S

Seebs

ROCKIN'! You can actually spell "complementary"! Really, I'm
impressed. Regrettably few people seem able to do that.

Most people can spell both complementary and complimentary, but not so many
people can tell them apart.

-s
 
C

Colonel Harlan Sanders

You guys are so cute! Petey gets one easy word pair right and Dickie
wets himself! Talk about a Mutual Admiration and Complimentary Dick
Sucking Society!

Actually, Heathfield was complimenting you.

Which you'd know if you knew how to use a newsreader, or quote
correctly. Or could remember a post you made just 4 hours ago.

And I leave the determination of who is sucking whose dick to others
less disgusted by your garish scenarios.
 
S

spinoza1111

...


Or that you are completely insane.

He's not. That would get him off the hook under M'Naghten for
attacking Schildt. He's a good person and I think he will apologize to
Schildt and withdraw "C: The Complete Nonsense".
 
R

Richard Tobin

spinoza1111 said:
For example, Ted Nelson thought that the Internet should support
double pointers TO information and BACK TO its source. He was mocked
and reviled for this view

Really? By whom?
Tim Berners-Lee's design makes multiple citations of the same
evidence appear to be more than one piece of evidence.

One of Tim Berners-Lee's insights was that it doesn't have to be
perfect to be useful. Internet-scale hypertext would still be just a
neat idea if we had continued to wait for reliable back-links.

-- Richard
 
S

spinoza1111

[Note to Seebs - the very last paragraph is just for you.]

spinoza1111 wrote:

A programmer who has a certain expectation of the way in which his
program should work, which is violated by a bad software system, is
to me more intelligent than the people who created the bad system.
He's what Kant would call "the citizen of a better world".

If you are referring to left-to-right evaluation, you have yet to
demonstrate that a significant number of programmers expect that
order, and have yet to acknowledge that unspecified evaluation order
is a hallmark of a great many languages, not just C. Do you consider
them all flawed for that reason?

Unspecified evaluation order was a "hallmark" of older programming
languages because it was thought to be something appropriately
determined by the programmers of compilers...for the same reason that
different compilers for the same language, in the early days, gave
variorum results for Boolean evaluation, parameter sequencing, and
expression operand sequencing beyond precedence. Notoriously the
obsolete language APL enforced right to left evaluation.

This was unnoticed because most programmers stayed on one compiler, so
each compiler's choice was a de facto standard.

Then, in the case of C (an ** older ** language), vendors who didn't
want to change compilers without a "business case" (money to be made
by the wealthy), enforced the "standard" that there was no usable
standard, preferring in the case of C to impose nondeterminacy on C
and make intelligent people look stupid...all in service of Holy
Private Property.

But taking an idiot vote including old practice doesn't decide the
issue. Instead we find that newer, and more truly standardized,
languages enforce determinism, in part because optimization is
possible without making source code non-deterministic:

Java was designed as a reliable replacement for C in applications. Here is
the deal with Java:

http://java.sun.com/docs/books/jls/second_edition/html/expressions.doc.html:

"The Java programming language guarantees that the operands of
operators appear to be evaluated in a specific evaluation order,
namely, from left to right." Note that they need only "appear" to be
evaluated in l-r order: they can be sensibly optimized.

Microsoft, which is powerful enough to enforce de facto standards over
and above conformance to multivendor standards, spares itself trouble
by declaring in MSDN that the order is NOT non-deterministic:

"The precedence and associativity of C operators affect the grouping
and evaluation of operands in expressions. An operator's precedence is
meaningful only if other operators with higher or lower precedence are
present. Expressions with higher-precedence operators are evaluated
first. Precedence can also be described by the word "binding."
Operators with a higher precedence are said to have tighter binding."

"The following table summarizes the precedence and associativity (the
order in which the operands are evaluated) of C operators, listing
them in order of precedence from highest to lowest. Where several
operators appear together, they have equal precedence and are
evaluated according to their associativity."

C's nondeterminacy is recognized as a bug and not a feature in
academia. This journal article recognizes "sequence points" as a C
idiom, and idioms are usually signs of a language mistake:

http://journals.cambridge.org/actio...72398F7187C.tomcat1?fromPage=online&aid=54521

"The presence of side effects in even a very simple language of
expressions gives rise to a number of semantic questions. The issue of
evaluation order becomes a crucial one and, unless a specific order is
enforced, the language becomes non-deterministic. In this paper we
study the denotational semantics of such a language under a variety of
possible evaluation strategies, from simpler to more complex,
concluding with unspecified evaluation order, unspecified order of
side effects and the mechanism of sequence points that is particular
to the ANSI C programming language. In doing so, we adopt a dialect of
Haskell as a metalanguage, instead of mathematical notation, and use
monads and monad transformers to improve modularity. In this way, only
small modifications are required for each transition. The result is a
better understanding of different evaluation strategies and a unified
way of specifying their semantics. Furthermore, a significant step is
achieved towards a complete and accurate semantics for ANSI C."


The C Sharp standard, ISO/IEC 23270:

"Except for the assignment operators and the null coalescing operator,
all binary operators are left-associative, meaning that operations are
performed from left to right."

Alas, that is *so* true.


Any stick to beat him with, eh? Too academic, not academic enough,
good at mathematics (or "autistic", as you appear to prefer),
uneducated and yet elitist... Emerson redoubled in spades.

Yes, any stick that's appropriate. To insist on irrelevancy may
"sound" academic, but in truth, it's not. In fact, programming needs
more academic theory.

But just because the theory and its consequent praxis emerges from
outside Microsoft, this does not in logic entail that the theory and
praxis are academic, and any less idiomatic than Microsoft.

It was also the case in the mainframe era that programmers at non-IBM
companies consoled themselves for being part of one of the "seven
dwarfs" with the idea that they were superior beings using high theory.
only true of Burroughs, because it's just silly to use such a crude
measure of being academically superior.

Today, Microsoft programmers are regarded with scorn by people who
actually know little about Microsoft's actual design procedures, who
are themselves incompetent in their own specialties.
Feel free to try to back up that claim.




Firstly, C99 *is* the (de jure) C Standard. Secondly, it's a standard
to which Microsoft's compiler does not conform. The ++ operator
requires an operand that is a modifiable lvalue, not only in C99 but
also in C89 and indeed in C compilers that pre-date C89.

Microsoft enforces this rule. So in what way is Microsoft
nonconformant? In not being non-deterministic as regards expression
evaluation?

Note that in fact few or no compilers, Microsoft or other, actually
CONFORM to the nondeterminacy called for in the standard in a()+b().
This is because most compile in a particular sequence, and in the
preponderance of cases this will be left to right. To literally
conform, the C compiler would have to use random number generation to
make the order indeterminate.

This shows the near-criminal misuse of standardization, for making
nondeterminacy a standard was not a favor to coders, nor did it
improve, or for that matter even "standardise" C semantics.

Quite the reverse, for non-determinacy is by definition not standard!

A useless non-determinacy was made the standard to retroactively bless
as many compilers as possible, to pimp Microsoft, and to preserve Holy
Profits, Batman.
Any stick to beat him with.

Quit whining. I have a lot of sticks because my case sticks, mate.
In programming, the details *matter*.

To nasty little clerks. The rest of us automate the detail work.
Then please get on with it.

That's bullshit and you know it. You've learned about most of my
errors when I've admitted them and/or a guy like Bacarisse has found
them. I've corrected them, most recently the grammar error in the
parser discussion, where tonight I've posted C Sharp code to parse
using the corrected grammar.
Eventually, sometimes. You will learn faster if you stop assuming
people are wrong to correct you.

Ben is right to correct me most of the time: Peter some of the time.
You, almost never, but sometimes, such as when you complImented me on
my knowledge of a simple word pair. Keep improving.
Presumably the difficulty in optimising C explains why it wipes the
performance floor with other languages.

Wrong answers are still wrong answers when arrived at fast. What you
imply may be true: C may be more resistant to optimization. But this
means that some idiot's opinion trumps the collective wisdom of
automated optimization.
Autism is not a learning disorder. It is a neural development
disorder. Not quite the same thing.


I see no paradox in Seebs being correct. He is correct (most of the
time) because he has taken the trouble to learn the language.

I think that the design of C is so poor that learning it destroys
other parts of the brain. Dickens saw this in the lawyers of Bleak
House whose knowledge of Jarndyce destroyed them in all other
respects.
because the need for intelligent interpretation is beyond you,

Actually, his articles reek of intelligent interpretation.
errors threaten you

His articles don't give any hint that he feels threatened. He
occasionally gets exasperated, but who doesn't? But not threatened.
and when you see others make them
you are horrified by way of psychological transference.

No, he's just pointing out that they're errors.
["Oh my they might laugh at me like back in school."]

Projecting again?
But, in programming, we know how to deal with errors.

Well, we do. It is not evident that you are particularly skilled in
that area. Acknowledging errors is a vital precursor to dealing with
them, and you're weak in that area.

SO IS DECENCY AND RESPECT, and not jumping to unwarranted conclusions
about what Schildt does or does not know based on his attempt to be
clear...especially when you concede that the attempt is successful.
People won't admit errors in an environment dominated by autistic
twerps who globally question their competence based on one data point.
You made a FOOL out of yourself pulling that shit on me in 2003 when
you so generalized based on one data point, that being my use, for
readability, of repeated limit evaluation in a for loop. You later were
embarrassed when people brought your attention to my book.
Using "autistic" as a pejorative is just pathetic. As for twerps,
well, twerp is as twerp does. Seebs's articles do not seem to me to
be particularly twerpoid.

In view of the foul abuse which you have enabled from the zanies
here, "autistic twerp" is both documented and defensible.
At least you have the good sense not to learn it from Schildt books.
So there's some hope for you yet. (Unfortunately, Schildt is by no
means the only C author who doesn't know C very well, so beware.)

Get it straight. For the same reason that clarity implies
understandability, and understandability implies truth, the knowledge
of mistakes coupled with the belief that mistakes make for a more
"efficient" language should NOT be called knowledge at all, just as a
lawyer in Bleak House who knows Jarndyce and nothing else does not
know the law. C's nondeterminacy was a mistake.

In fact, the programmer who codes a()+b() is smarter than the twerp of
a compiler developer who inverts the order for shits and giggles. This
is because the twerp, when said twerp decides to get gay and invert
the order, perhaps because of some feature of long-dead hardware, was
HIMSELF probably forgetting that in the language in which he considers
himself an Expert, the operands may have side effects!

Whereas the intelligent Java or C Sharp programmer, forced to maintain
some Fat Bastard's C code, has learned properly of propriety only to
be blind-sided by indeterminacy.

Worse, his code works because in so many cases the order is left to
right, but another Fat Bastard who sleeps with the Standard under his
pillow tells him the code is buggy because it is "not standard" and
"might" not work in the (unlikely) event that we chuck the PC and get
a Univac mainframe.

Whereupon the intelligent programmer tells Fat Bastard to take a hike.
I agree. I don't believe it will convince you either.


*This* is the mainframe era. As for covering up errors, that's a
losing strategy. Those who value correctness in others should also
value it in themselves. You can't correct an error if you won't
acknowledge it.

Equivocation. I admit technical errors, but what you call "errors" are
mostly matters of opinion, and you're one of those pub ranters who
must always be right. What's worse: you're a sober pub ranter.
That's why the better authors provide errata pages. Could you please
point me to Schildt's comprehensive and well-maintained errata page?
I can't seem to find it anywhere.


What's the point in learning wrong stuff?

You learned about learning not in institutions of learning, or it
didn't take, because what you think of as "learning" is what they
teach bairns in Borstal, Army recruits and in corporate training
classes. "Learning" is nothing like being bawled at by a Sergeant
Major to disassemble a Lee-Enfield his way lest the Fuzzy Wuzzies
conquer your sorry ass. It is a DIALOGUE between teacher and student
in which the teacher might not always be right. It is one of MUTUAL
respect in which the teacher, unlike the Sergeant Major, does not
belittle the student any more than the student belittles the
teacher, as Seebach belittles Schildt.
Really? When was that?


Bad organisation, not bad content.


I thought you said he was an autistic twerp. Perhaps you think he's a
worthwhile and talented autistic twerp?

Yes. A worthwhile and talented autistic twerp.
 
S

spinoza1111

spinoza1111 wrote:



If you're talking about Microsoft's C compiler, I'd be very interested
in seeing some evidence to support your claim that it claims
conformance with C99, since all indications from MS so far have been
to the contrary.

Well, it doesn't use a random number generator to order a()+b().
Should it?
 
S

spinoza1111

You are going wayyyy too far from too little data.  What makes you think I
never learned to discuss CS outside of overspecific problems?

Because you make, for example, the claim that nondeterminacy makes a
language more efficient. Whereas in my book I point out that strictly
speaking, a language can neither be efficient nor inefficient. You
seem incapable of understanding how, when, and why we optimize.
It's a concrete example of a value you can pass to free even if it didn't
come from one of those.  Were I writing the page today, though, I'd probably
be more focused on the more significant issue, which is that his writing
could easily leave the reader thinking that:

        x = malloc(5);
        free(x);
        free(x);

is okay, because x is a pointer returned by malloc.

That's absurd. In saying that for every free() there must be a malloc:

(1) Schildt was not denying a minor free(NULL) because especially in
teaching, de minimis non curat lex: the law does not deal in
trivialities. Only an INCOMPETENT teacher insists on this level of
detail when introducing students to essentials. The tyro has no use
for free(NULL).

(2) It's crazy to attack Herb for saying in effect (∀x)[A(x)] (for
all x, property A is true) when he does not go on to say, there must
be only ONE free(). The student who's awake knows already that free(x)
returns x to available storage and that because of this x has no
referent. You're asking him, in speaking, to repeat other facts in
a way that would only confuse. You say he's clear, and in this
you are right. You want him to be as unclear as the Standard would be
for the beginner!

This is a fascinating theory, but completely wrong.  I'm literal-minded
to a fault because I'm autistic, not because of when I did or didn't discuss
things.

You're literal minded to a fault indeed. You don't realize that in
programming, one can be in a uniquely dazzling way right as when one
finds a bug no-one else can see, and in a complementary fashion,
spectacularly wrong in an equally dazzling way, because programming
exists at the end of a long chain of industrial development. As such,
it provided a catchment area for the attention-disordered and
autistic.

Outside of programming, you need to start assuming the best of people
and not the worst.
In fact, I've spent quite a while discussing abstractions in a human context,
because I was raised by mathematics teachers.  I'm a lot better at it now
than I was 15 years ago, though.


It's not useless, though.  It's an extremely useful thing, because it ensures
that it is always safe to free (once) a value returned by malloc or realloc.
It eliminates hundreds of "if (p)" tests.  Very useful.

For boneheads, who need to free() more than once because their code is
leaky.
This is just semantically incoherent.

Naw, I just used a fancy French word. Do you know what a *frisson* is?
Which is a shame, because I took those too.  :)

Not good.
Ahh, and we're back to your little hobbyhorse.

What you say could actually have been a good argument.  I think there's a lot
to be said for describing the logically-nested activation records as a
conceptual "stack", and showing how this could be implemented in various ways,
including the specific case of just having a large chunk of memory and running
up it and down it.

Now you're getting somewhere...
Oh oh...
The objection to Schildt's writing on the topic is not to his use of the
abstraction "stack".  It can't be, because he never deals with the
abstraction.  He talks only about a specific concrete form, and gives it,
not as an example of one way of doing things, but as The Way Things Are.

Schildt nowhere claimed that the stack must be laid out in any
particular way any more than a math teacher says that in order to
conform to Euclid, the triangle must be the same size as that which he
draws on the board.

Had he talked about the abstraction, then shown an example of one way of
implementing it, I certainly wouldn't be complaining.

Aho, Sethi and Ullman start with the abstraction in academic style,
which makes their book inaccessible without a considerable amount of
prep. Whereas my book, while it certainly mentions abstractions,
STARTS with specific examples.

I downplayed code generation and wrote an interpreter for a Basic
machine so as not to get bogged down. I suppose in your book I purvey
the illusion that REAL compilers generate what I call Nutty Professor
code...to run on a virtual machine, but I don't make this claim any
more than Herb did.

The student needs, at the cost of some illusions which can be
unlearned at a later date, to be helped over difficult ground. When
you've both taught and written a computer book, you'll have standing
in this field. Lists of errors don't give you standing.


However, we can tell these apart immediately by his use of language.  It
is true that the set of activation records is *a* stack.  It is only true
on some implementations that the set of activation records is stored in *the*
stack.  The indefinite article refers to an abstraction; the definite article
refers to a specific and concrete implementation.

In your dreams. A stack, like a right triangle, is a "the" not an "a".
Herb, by referring to "the" stack was referring EITHER to the stack he
illustrated or its functional equivalent.
I already posted the hunk of text from C:TCR (3e) showing exactly this point.

I have always thought that Schildt was a clear and lucid communicator; I

In this you concede game, set and match. As I have shown, "clarity"
leads to understanding: understanding is knowledge of that which is
true.
merely think he's wrong about a lot of things.  By contrast, your careful
distinction between the abstraction of "a stack" and specific implementations
is a marked improvement over his material on the stack, which always felt a
bit clumsy.  You are almost right; that would have been a good way to express
it, and if he'd expressed it that way, instead of the way he did, it would be
ridiculous for people to criticize him for it.

It wouldn't have worked in my classes in C for the IBM Mainframe at
Trans Union in Chicago, and probably not even in my classes in C for
prospective computer science majors at Princeton. As it happened, a
few students at Trans Union complained that I used too much math. My
remit at Princeton wasn't to teach computer science and abstract data
structures. It was to get some students started in C.

Give newbies the credit most of them deserve. When I was handed Sherman's
"Programming and Coding for Digital Computers" and its 7094 based
machine, and then learned about the 1401, I realized, having already
read Turing, that the differences merely masked the fact that as
Sherman pointed out, any computer can simulate any other given enough
time and memory.

I think you padded "C: the Complete Nonsense" by counting what Herb
did not say as positive sins of omission as if he should have written
a computer science treatise AND a standard. Do this, and, as Hamlet
said, none should 'scape whipping.

I don't think it would have been appropriate to prefix use of the
stack with a pompous prolegomenon on the Idea of the Stack. Computer
people KNOW that things can be done in different ways.
 
