subroutine stack and C machine model


Seebs

I don't have 30 quid otherwise I'd have read it. I probably shouldn't
have cited it but I wanted to show that C's indeterminacy is
recognized in academia.

What you actually showed was that you didn't understand the paper's
abstract.

-s
 

Seebs

I'm the
best thing that's happened to clc in a long time since I create lively
discussion, I do my homework, and while I'm rusty on C, I have a
serious background with the language.

It's not that you're rusty, it's that you hold a bewildering variety
of opinions which are just plain incoherent, and you have a tinfoil-hat
quality theory of what standardization is like and why. That said,
you're pretty amusing.

-s
 

spinoza1111

It makes it more amenable to efficient implementations.

THIS IS NONSENSE. "Amenable" is a word people use in low level
corporate jobs when they don't know what they are talking about. For
example, in a corporation, "I'm amenable to that" means "I'm getting
screwed, I have no health insurance, but you're the only employer in
East Jesus and I want to climb Saddle Mountain once more before I die,
therefore I will do as you say".

There are two ways to make code efficient:

(1) As in assembler, manipulate the code "by hand"
(2) Use a compiler optimizer

Given the nondeterminacy of a()+b(), it is impossible to do (1) in the
sense of computing something inside a() that is then used inside b().

It's unnecessary for the order of evaluation of a()+b() to be
nondeterministic to do (2). Modern optimizing compilers construct a
data structure (often, but not always, a DAG) that captures all
knowable dependencies, and they can rearrange a and b only when it is
safe to do so.
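A minimal sketch of point (2), using hypothetical side-effect-free
versions of a() and b(): even if the source fixes the order with
temporaries, the as-if rule still lets an optimizer rearrange work
whose order is unobservable.

#include <stdio.h>

/* Hypothetical pure functions, for illustration only. */
static int a(void) { return 3; }   /* no side effects */
static int b(void) { return 5; }   /* no side effects */

int main(void)
{
    /* These two declarations fix the abstract order (a before b), but
     * an optimizer may still inline, fold, or reorder the work, since
     * no observable behavior depends on the order. */
    int ta = a();
    int tb = b();
    printf("%d\n", ta + tb);   /* prints 8 */
    return 0;
}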

Optimization works better on more-deterministic languages because the
higher determinism means that the optimizer has more information. For
example, C makes pointer analysis difficult, according to Aho et al.:

"Pointer-alias analysis for C programs is particularly difficult,
because C programs can perform arbitrary computations on pointers." -
COMPILERS, Principles, Techniques and Tools, p 934 2nd ed.
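The kind of difficulty Aho et al. mean is easy to illustrate. A minimal
sketch, with a hypothetical function: when two pointers may alias, the
compiler cannot keep the pointed-to value in a register across a store
through the other pointer.

#include <stdio.h>

/* If out and in may point to the same int, the store through out can
 * change *in, so *in must be re-read rather than cached in a register. */
static void add_twice(int *out, int *in)
{
    *out += *in;   /* if out == in, this store changes *in ... */
    *out += *in;   /* ... so *in has to be reloaded here       */
}

int main(void)
{
    int x = 1, y = 10;
    add_twice(&x, &y);   /* distinct objects: x becomes 21  */
    add_twice(&x, &x);   /* aliased: x doubles twice, to 84 */
    printf("%d\n", x);
    return 0;
}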

C programmers like to brag that their "language" is more
"efficient" (although as I point out in my own book, this statement is
another assault on clear English).

But on the one hand it is significantly more difficult to
automatically optimize than C Sharp or Java. On the other we find that
the vaunted ability to hand optimize is ringed about with weird and
non-orthogonal strictures created by vendor greed and the cowardice of
people who make "standards".
It can, however, be amenable to an efficient implementation, or extremely
difficult to implement efficiently.

I'm not "amenable to that" for the reasons stated above.
In the real world, optimization relies on finite resources.  One of the
things that people do when building optimizers is allocate finite
resources both to implementing the optimizer, and to the execution of
the optimizer.

There are compilers out there which can spend hours to days optimizing
specific hunks of code... And this can be *cost-effective*.  Keep that
in mind.

So, the question is:  Are there real-world cases where the option of
reordering evaluations would permit optimizations which would be otherwise
prohibited?  The answer is a completely unambiguous "yes".

Now, you might argue, perhaps even persuasively, that either such
optimizations will change the behavior of the code, or they could be
performed anyway.  True enough.

But!  What you have apparently missed is that they might change the
behavior of the code *in a way that the user does not care about*.

Let's take the example from Schildt's text:

        x = f1() + f2();

As he correctly notes, the order of execution of f1 and f2 is not defined.

Now, consider:

        int
        f1(void) {
                puts("f1");
                return 3;
        }

        int
        f2(void) {
                puts("f2");
                return 5;
        }

Obviously, these functions have side effects.  Reordering them results in
an observable change.

However, it is not at all obvious that the user cares which is which.
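For anyone who wants to see it happen, here is a complete version of the
example (a hosted-implementation sketch; which of the two lines appears
first is unspecified):

#include <stdio.h>

int f1(void) { puts("f1"); return 3; }
int f2(void) { puts("f2"); return 5; }

int main(void)
{
    /* Whether "f1" or "f2" prints first is unspecified; x is 8 either way. */
    int x = f1() + f2();
    printf("x = %d\n", x);
    return 0;
}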

"Cares about", like "amenable", is another corporate barbarism, and
this example is absolutely unprofessional and appalling.

"The user" is also barbaric. What this (to Dijkstra untranslatable)
word means is that the programmer wishes at the point of use to be **
relieved of the responsibility for thinking **.

And amazingly you have claimed that in all cases of the above it
doesn't matter, now or at any time, which string comes out first,
because you are too unimaginative to speculate that the puts may not
be directed at a screen or piece of paper.

It's as if you're unaware of one of the major and most useful features
of unix, piping and redirection. If one version of the code is plugged
into another program and the second program parses the output of the
first, you will BREAK the second program if you use a different
compiler that sequences differently! And because for most compilers a
de facto order is always enforced, you will never know this until the
last possible minute.

Furthermore, "getting answers in random order" is NOT "efficient". It
is wrong.
If we were in a case where the ABI's rules made it more expensive to perform
operations in one order than another, the option of reordering them could
improve performance.  If it produces changes in behavior, that *might* be
a problem -- unless those changes are harmless.  If all it changes is the
order of log messages, the developer might well prefer a faster form to
a slower one, even though it "changes" the output -- because the change is
irrelevant.

One version of a program displays messages (or sends them to another
program) when compiled with one C compiler in one order, and in a
different order when compiled with another. This is not acceptable at
all.
But it is what he said.


Ahh, but this is not a book labeled "Baby's First C Primer".  It claims
to be "The Complete Reference".  You could dispute whether the original
text ought to have mentioned this, but consider the description of
free() on page 434:

        free() must only be called with a pointer that was previously
        allocated with the dynamic allocation system's functions (either
        malloc(), realloc(), or calloc()).  Using an invalid pointer in
        the call will probably destroy the memory management system and
        cause a system crash.

Here, we are not talking about an introduction to the essentials; we are
talking about a *reference work*, billed as such, and it states something
flatly untrue.  It is absolutely the case that a null pointer is "an invalid
pointer".

No, NULL is not "invalid" if as you say free(NULL) is valid. Herb
means clearly by "invalid" a pointer that doesn't point to an
allocated region or one that's freed.
(2) It's crazy to attack Herb for saying that in effect (x)[A(x)] (for
all x, property A is true) when he does not go on to say, there must
be only ONE free(). The student who's awake knows already that free(x)
returns x to available storage and that because of this x has no
referent. You're asking him when speaking to repeat other facts in
such a way that would only confuse. You say he's clear, and in this
you are right. You want him to be as unclear as the Standard would be
for the beginner!

People have asked on clc before why free(x) is failing when it worked the
previous time.  After all, it was previously allocated.
Outside of programming, you need to start assuming the best of people
and not the worst.

I do!  For instance, when I first saw one of your long posts on this topic,
I responded with careful thoughtful analysis to many of your claims and
asked for evidence for them, which was not forthcoming.

Having realized that you're generally unwilling to support your claims,
I've stopped bothering; now I'm just killing time and enjoying the fun.

I have of course supported my claims. And now you confess to
unprofessional levels of insincerity. And isn't "killing time and
enjoying the fun" what trolls do? Fortunately, you're not a good
troll, instead you're just making a fool of yourself.

Have you ever heard the phrase "belt and suspenders" applied to computing?

Yes, from the sort of people who use "amenable". And if the emperor as
here has no clothes, a belt and suspenders won't help much.

If you want to write robust code, it is not enough to be sure that someone
should have done something right -- even if you are that someone and have
verified it.  You should be careful anyway.

Starting by not using C.
Consider the following data structure:

        struct foo {
                int x;
                int y;
                unsigned char *name;
        };

You might find that you have a number of these, and that most but not all of
them have a name.  What should foo_free() look like?

        void
        foo_free(struct foo *p) {
                if (p) {
                        free(p->name);
                        free(p);
                }
        }

Now, why are we checking 'if (p)'?  Because if an error elsewhere in the
program results in inadvertently passing a null pointer in, the reference
to p->name would invoke undefined behavior.
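A short usage sketch, assuming the struct foo and foo_free() definitions
above (the bare malloc() stands in for whatever constructor the program
really uses):

#include <stdlib.h>

int main(void)
{
    struct foo *p = malloc(sizeof *p);

    if (p) {
        p->x = 1;
        p->y = 2;
        p->name = NULL;   /* one of the foos that has no name */
    }
    foo_free(p);          /* fine whether or not the malloc() succeeded */
    foo_free(NULL);       /* also fine, thanks to the 'if (p)' guard    */
    return 0;
}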

This nonsense is what I find most tiresome about C, since its
incoherent claim to efficiency is undercut by the need to wear belt,
suspenders, two condoms and a raincoat.
In pre-ISO C, you had to write:

                if (p) {
                        if (p->name)
                                free(p->name);
                        free(p);
                }

and this usage, though common, was obnoxious -- and people sometimes forgot.
Fixing this improved the language.

From a mess to a mess powered.
Yeah, I just don't see the relevance.


Incorrect.  A math teacher might refer to "a" triangle, but will rarely
refer to "the" triangle.

Again, I'll quote for you directly from the book:

        Figure 16-1 shows conceptually how a C program would appear in memory.
        The stack grows downward as it is used.

        +----------------+
        |      Stack     |
        |        |       |
        |        v       |
        +----------------+
        |        ^       |
        |        |       |
        |  Free memory   |
        |      for       |
        |   allocation   |
        +----------------+
        |Global variables|
        +----------------+
        |                |
        |     Program    |
        |                |
        +----------------+

        (Figure 16-1 lovingly reproduced in vi, use fixed-pitch font plz.)
You missed Herb's "conceptually" and his "would". These words mean
that "this is an example, Otto".
He continues:

        "Memory to satisfy a dynamic allocation request is taken from the
        heap, starting just above the global variables and growing towards the
        stack.  As you might guess, under fairly extreme cases the stack may
        run into the heap."

This is not a mere illustration of one possible way of doing things; this is
a clear assertion that they are done in a particular way, and that that
particular way has consequences the reader must be aware of.

I would not write Herb's way but it made sense since everything he's
saying is under the scope of "conceptually" and "would". It is
subjunctive, one possibility among others. He's talking about the non-
virtual and constrained memory of his time in which job one was often
preventing a stack/heap collision.
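For what it's worth, where these regions actually land is
implementation-specific and easy to inspect. A minimal sketch
(hypothetical variable names; the printed addresses mean nothing
portable, they only show the arrangement of one particular run, which
may or may not resemble Figure 16-1):

#include <stdio.h>
#include <stdlib.h>

int a_global;                     /* static storage ("global variables") */

int main(void)
{
    int a_local;                  /* automatic storage (the "stack") */
    void *dynamic = malloc(16);   /* allocated storage (the "heap")  */

    printf("global:  %p\n", (void *) &a_global);
    printf("local:   %p\n", (void *) &a_local);
    printf("dynamic: %p\n", dynamic);

    free(dynamic);
    return 0;
}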

Real programming students are, to a striking extent, selected from
populations excluded by classism and racism from high level university
education, and they often combine an interest in math with inability
to think in terms of abstractions. They need to see the abstraction
implemented in a "real world" situation once and can then be trusted
to generalize.

The great Edward G. Nilges, in chapter 2 (A Brief Introduction to
the .Net Framework) in his redoubtable book "Build Your Own
Goddamn .Net Language and Compiler" quotes Marx: all that is solid
melts into air. This means that to work with any given generation of
technology, one needs to introduce mechanisms that are later out of
date as instances of the pure idea.
Uh.

I have both taught (though not in college), and written a computer book.

I wrote a book on shell programming, which has only had one complaint made
so far about it, which is that the publisher insisted on tacking the word
"beginning" onto something that the most experienced shell programmers I
know have all considered to be rather advanced.

I have been, at some length, told that I did a very good job of presenting
the shell in a way that allows new readers to pick up the basics without
misleading them.

Amazon link, please.
Ahh, but "its functional equivalent" might never "run into the heap".
He really was referring to a specific thing, not to an abstract
generalization.

It might not run into the heap but it will run out of room if the
programmer gets gay and recursively calls his code in a loop. And even
in a modern implementation, the heap is the other region beyond the
stack and the code. Even in a modern implementation it makes sense to
picture the stack and heap as fighting each other for storage while
the code stands idly by.

Indeed, at the most abstract level, what is there at runtime but some
sort of stack, some sort of heap, and a space for code? Do tell us Mr
Shell expert...
You haven't shown it, you've asserted it.  At most, you've established
that I was mistaken to claim that he was clear, but really, the dictionary
hasn't got your back this time.


That's nice.


I have found that relying on the reader's telepathy makes for a poor
learning experience.  Fundamentally, while it's true that *most* readers
may be able to guess what you meant even when you say something else, it
is not true that *all* will -- and even "most" won't get it right all the
time.

If it were impossible to write easy-to-understand text without introducing
grave errors that the reader must magically guess were intended to be viewed
as possible examples rather than definitions, I think you would have a case;
we could reasonably assert that we have to make

You have no standing in either speaking about computer science or
practical instruction.

"How progress and regression are intertwined today, can be gleaned
from the concept of technical possibilities. The mechanical processes
of reproduction have developed independently of what is reproduced and
have become autonomous. They count as progressive, and anything which
does not take part in them, as reactionary and narrow-minded. Such
beliefs are promoted all the more, because the moment the super-
gadgets remain unused, they threaten to turn into unprofitable
investments. Since their development essentially concerns what under
liberalism was called “packaging,” and at the same time crushing the
thing itself under its own weight, which anyway remains external to
the apparatus, the adaptation of needs to this packaging has as its
consequence the death of the objective claim."

- TW Adorno, Minima Moralia

"The moment the super-gadgets are unused", writes Adorno, "they
threaten to turn into unprofitable investments". This updates Marx's
insight that the factory owner must run the factory day and night to
amortize his investment even if sleepless children must fall into the
machine and be killed.

C, as one of Adorno's "super-gadgets", needed from the start, and
continues to need, a Legion of the Undead to follow it. In the 1970s,
given that most of my friends were going "back to the land", I
wondered who would be interested in or support the new technology that
was already appearing. I was amazed to find that for material reasons,
these very hippie-assed scoundrels were driving the technological bus
by 1978.

This was because the super-gadgets of the time, representing such an
enormous risk and investment on the part of men who weren't hippies,
and who were like Ed Roberts former military sorts, required use in
the form of programming and the entrepreneurs of that time were
willing for hippies to work on their systems as an alternative to
losing everything. The crackdown came as soon as Reagan was elected.

But by this time, people had been trained to follow the "super
gadgets" by way of negative and positive conditioning (where the
negative conditioning was almost as seen in the 1950s science fiction
novel The Atlantic Abomination).

However, for them to be loyal to abstract computer science, and to
criticise "gadgets" like C from this perspective would have destroyed
wealth, therefore people were carefully divided into tribes, each
passionately loyal, not to truths of mathematics but to C or what ev
er.

It seemed to me at the time (for example, in Robert Tinney's crude
paintings of technical concepts on the covers of Byte Magazine) that
everyone was thinking in childish pictures and as a result becoming
the overly loyal followers of one paradigm or another, and this was
moronizing them while enriching the few. The men who their loyalty
enriched didn't seem to me to give a rat's ass about software
correctness or the public interest.

Eerily, prophetically, writing in 1948, Adorno predicts "the death of
the objective claim" and here we see that death. Everyone's
passionately loyal, not to truth or even common decency, but to some
artifact, some gadget, some goddamn piece of shit programming language
past its sell-by date.

They do not know it, but what motivates them is the fact that rich
people need them to continue to use the artifact and to sing its
praises.
 

spinoza1111

<snip>

You are posting an increasing amount of nonsense of late.  I don't
have the time or the inclination to read a fraction of it but this
stood out as a particular perversion of logic:






A technical matter to discuss, an academic journal, and a quote.  It
looks impressive, but how does the quote back up the claim?

Well, it does not.  Spinoza1111 goes from a quote that says that C's
sequence points are "particular to ANSI C" to a claim that this is an
idiom.  From there he injects the idea that "idioms are usually signs
of a language mistake".  But he is not yet done, because all that
misdirection was about sequence points.  To extend that to unspecified
subexpression evaluation order, he simply relies on textual proximity:
by putting his claim just before the one that he says can be drawn
from the paper, he suggests that they are linked when, in fact, there
is no connection at all.  It helps that the paper also discusses
evaluation order, but since the quote says nothing about C's choice
one way or the other, sleight of hand is needed to suggest that it is
critical.

Of course, the article /might/ be critical of C's choices, but the
quote does not support the claim.  It is there just to fluff up the

Perhaps I was corrupted by reading The Vicious Tirade.

The abstract, however, confirmed my view that "sequence points" are
without standing in computer science and were invented for "standard"
C.

Now, this is quite astonishing. Occam's Razor is as Fluellen would say
an excellent moral, and what, exactly, were the Clerkish Twerps of C89
or C99 doing?

A neologism is a neoideologue, and a new idea needs a justification.
EITHER the Clerkish Twerps were like Galileo, seeing a new planet swim
into their ken, making a new DISCOVERY, or else they were fabricating
an abomination to make up for a deficiency.

But by 1989 we'd learned enough about control flow and compiler
optimization theory to know that we DO NOT NEED THE IDIOM. Quite the
opposite: Bohm and Jacopini had shown us that we needed only straight
line code (with absolutely no nondeterminacy of sequence), if then,
and do while. Dijkstra had confirmed that this particular form of
asceticism made code more reliable, and had, being a Prometheus, also
given us semaphores for the case of multiple threads.

There was no reason for sequence points to exist EXCEPT the need to
standardize as many compilers as possible, and this was a need owing
to commercial pressure.

Like "falling dog poo" or "it's not easy to get stupid" in Cantonese,
"sequence points" are idiomatic.
 

spinoza1111

In <[email protected]>,

spinoza1111 wrote:


He made no such claim. The claim he made was that leaving the choice
of evaluation order up to implementors allowed them to make efficient
choices.


You have already established that you don't know what you're talking
about. Why should we trust what your book says?



He has done both. (So have I.)


No, he has standing in this field because he knows what he's talking
about.


Especially if you're the one that makes the errors.




Rubbish. It is easy to be clear and *wrong*. Schildt manages it
effortlessly. Clarity is of no use if the clear statement is false.
"All elephants are green" is perfectly clear - but wrong. The clarity
doesn't help.

No, it's not clear, and that you think it's clear shows your lack of
linguistic ability.

Read later Wittgenstein. He saw, as do linguists, that "all elephants
are green" cannot be considered in isolation (for the same reason it's
dangerous to copy and paste "code snippets" without research).

If a real speaker says "Hey everybody! All elephants are green!" his
statement would be:

(1) Considered very unclear in this world, where no elephants are
green.

(2) Considered "clear" at the limit and tautologous (too clear) in a
possible world where all elephants are green.

(3) Considered clear ONLY in a world where all elephants are green but
people believe in a magical White Elephant whose appearance would bode
famine.

Herb says at worst that "in the bush, if you would shoot an elephant,
you should know that all elephants are grey". This is true as a
practical matter in the sense of utility and that it covers nearly all
cases EVEN IF some of Herb's bearers mutter that bwana doesn't know
about the White Elephant.
 

spinoza1111

Just as you guys are.  You've got your little "we're the out-group, we're
so cool, we know how things REALLY work" meme going, you select other people
who are jealous and would rather attack people than solve problems, and you
sit around being smug.

The only difference?  We produce working code.  You produce snide remarks
and envy.

Where's the working code? I've been reviewing this newsgroup since
1999, and NOT ONCE do I recall either you or Richard Heathfield
actually posting more than exemplary snippets which you've probably
copied from someplace else.
 

spinoza1111

Doesn't matter.  They could have standardized that the language would be
implemented entirely by squealing pigs, and if you then claimed that it
was implemented on digital hardware rather than squealing pigs, it'd show
you to be unqualified to comment on it.

Rather Orwellian of you to say that, wouldn't you say?
The fact is that what they standardized is clearly documented and known,
and if you can't correctly describe what they did, then it is ridiculous
for anyone to pay attention to your criticisms of what you imagine they
might have done in an alternate reality.

But it's not clear. It's nondeterministic. Schildt is clear and if I
were going to hire a C programmer, I would hire someone who's read
Schildt, not the Standard. That's because the latter chap would be
coding for the possibility of running on any machine. If that was my
concern, I'd make him use Java, wouldn't I.
As long as we're talking merely about expectations, there is certainly
some merit to such a view.  But we're not talking about expectations.  We're
talking about you claiming that things are true of C, which aren't.

The problem is that "we don't know" is not a useful truth.

We take you now to Mathematica, Inc. Herb Schildt, Peter Seebach, and
Ed Nilges all work for Edgar Whiteman, CEO:

Edgar: look, Peter, I know you don't know. In fact, it doesn't
surprise me. As it happens, Herb has found out that a()+b() is
executed sequentially on our compiler and our hardware. Therefore I
need you to leave the code alone because it works on our platform, and
I have no plans to convert, while keeping C, to change platforms. If I
decide to do so, I am gonna have you, Herb, and Ed Nilges come in here
on weekends and convert the code, line by line, to JAVA, you dig me?"

Peter [lower lip trembling]: But the STANDARD says...

Edgar: I DON'T GIVE A RAT'S ASS WHAT THE STANDARD SAYS. And
furthermore, I need you to stop gossiping about Herb and Ed behind
their back. Just because you're gay doesn't mean I have to put up with
a drama queen like you. I need you to be a macho man and do your job,
and I DON'T want to have Herb and Ed in here complaining to me about
you! Now get out of here before I get gay myself, and fire your ass!
Let's explore this a bit.  I ran into a guy once, forgot his name, who was
firmly convinced that C should have standardized order of evaluation, for
instance, of function arguments.  I disagreed with him, but I did not think
he was an idiot.  That's because he did not claim that it had previously
standardized order of evaluation then ceased to.  He did not claim that 9 out
of 10 compilers did it this way, and then turn out not to be able to give
a single actual example of a compiler doing it.  In short, he made it clear
that he actually *knew* what C was doing, and that he disagreed with it.

You have consistently gotten the question of what C actually does wrong.
That makes your comments much less credible.
Most compilers have left to right evaluation as it happens. We know
this. When will you learn it?
So, your claim here is that you have a compelling argument as to why there
is no evidence for your claim... Okay, granted.  There is no evidence
for your claim.  If you ever hear back from people in an alternate reality
where there is evidence for one of your claims, do let us know what that
evidence turned out to be.

It's called the written record.
I would be much more sympathetic to this view had I not been obliged to deal
with many of those "thousands of people" here, where it consistently turned
out that they were having a very hard time getting from the DOS box Schildt
described into other environments.  Since they consistently turned out to
be having trouble caused by his errors, I concluded that his errors caused
trouble.

The reason is that C is not truly portable, something which I have
maintained all along. It has nothing to do with Schildt, who is being
scapegoated.
You'll note an interesting technique I used in the above:  I took real-world
observations as an input, and from them, tried to develop a theory of what
those observations might mean.  Your tactic, of declaring a theory and then
asserting that such observations would have existed if only someone else
had designed the Internet, is certainly innovative, but I'm not sure it
works better.


It's been that way since 1978.

My guess would be that you tried this on a C++ compiler at one point,
not aware that C++ changed that rule, and you then came here and blamed
C for getting something wrong that C never got wrong.


Individually, no, but the *pattern* of mistakes goes straight to credibility.

Corporatese. It excuses a failure to listen (read).
You're not giving the reader any reason to believe that your opinions are
of value.  When you make an assertion such as "order of evaluation should
be defined" or "order of evaluation should be unspecified", you must either
provide compelling evidence (not just assertions that it's obvious to
people who are smart like you), or have credibility to begin with.  You
have no credibility, so you'd need to provide evidence.

My experience is that people with "credibility" in corporate data
processing are mostly loudmouths and that they use it to defends bugs
so embedded as to be features. I saw this in securitization and I see
this in C.

You mean like the way you keep trying to blame me for stuff I had nothing
to do with?


I originally mentioned the autism thing in the hopes that you would do
no research at all, pick up a couple of vague half-baked stereotypes
of what the word "autism" means, and then try to drag the word into
nearly every paragraph of your future writing as an attack or an insult.
I am pleased to note that you have performed precisely as expected.

Have you considered how this works for your case?  Most people feel
uncomfortable with attacks on people who are viewed to be disabled.  Many
people have wrestled with some kind of mild mental illness, such as
depression; most people have had friends who have done so.  Because autism
is much overrepresented in the programming industry, the chances are that
many of the readers here have autistic friends or coworkers.  And here
you are, using "autistic" as a derogatory epithet.  And yet, true to
form, getting the facts wrong anyway.

When all else fails, try identity politics. Being autistic and having
a learning disorder isn't the same thing as being black, female or
gay. I had gay coworkers as early as 1981 at Bell Northern Research
and their gayness had no impact on their code.

On the other hand, my ADD made me excessively comment my code and this
interfered with other people's ability to read the actual code, so
they asked me to comment less in reviews, which I did.

Whereas you want your autism, as evidenced in your inability to take
what Herb said other than in the most literal way, to be privileged to
the extent of destroying a reputation.

This is unacceptable. People with learning disorders have to learn to
keep their learning disorder from harming others.


There are three essential forms of argument; ethos, pathos, and logos.
(Or, for the Frank Zappa fans, five:  ethos, pathos, porthos, aramis,
and brut cologne.)

Ethos refers to your character and credibility; the degree to which
people are likely to believe what you say because of whom they
believe you to be.  You presumably expect that hammering on the
"autism" point works to undermine my ethos, but you are gravely mistaken.
Most people working in computers these days have come to associate
"autistic" with "precise, careful, and honest, but also rude."  But
no one cares whether I'm rude; they care whether I'm right or not.
However, on your end, it undermines your ethos rather severely, because
you're picking on the disabled guy.  (That the disabled guy finds
it hilarious may mitigate the effect some, but not very much usually.)
You are coming across as a bully, whether or not you mean to.

Boo fucking hoo. You've endangered Herb Schildt's employability and
given pain to his extended family by making his name a byword. You
started this shit, I'm giving you a way out, and you are refusing it.
You're the bully, just like the gay men who would have sex with other
gay men and tell them afterward that they had AIDs, and being
fashionably precise but rude is no excuse.


For logos (pure reasoning and logic), we find that your arguments
are heavily undermined by the key fact that you don't actually make
arguments, in the form of premises which lead to conclusions.  You
just sort of throw random assertions out and then assert your
conclusions.  If there are connections, they are too subtle for
anyone to have picked them out yet.

Your best case is the pathos argument -- your appeal to the stories
of the programmers who were laid off by the compiler vendors might
carry weight, except that no one seems to believe them to be true,
probably because C99 was a huge effort and involved substantial new
work by large numbers of people at various vendors.
My errors are correctable; like Winston Churchill, who told Lady Astor
that he was drunk but would be sober in the morning, whereas she would
be ugly, I am relearning C and willingly admitting error in what I
consider an important cause. I shall be right on the morrow, whereas
you will still be a stupid little twerp who thinks that free(NULL) is
a counterexample to the need to balance free() with allocation and

As I pointed out already, it is indeed a counterexample, and my other
objection to that passage was more significant.
that C needs to be semirandom because that allows it to be optimized
(the reverse is true: C is more difficult to optimize than more
deterministic languages).

You've said this before, but you haven't supported it.
Because of your learning disorder, which paradoxically makes you
correct in detail because the need for intelligent interpretation is
beyond you, errors threaten you and when you see others make them you
are horrified by way of psychological transference.

Again, you are mistaken.  The root of my complaints was the discovery
that newbies coming to comp.lang.c were getting confused.

Actually, that's something I'd think you'd understand, as an author:
If you need "intelligent interpretation", what you probably actually need
is to go do some editing.  Text which is not clear without relying on
the reader to guess at what the author meant instead of what the author
said can make for delightful fiction, but is poor form in technical
writing.
["Oh my they might laugh at me like back in school."]

You really need to do some basic research on autism.  Hint:  The
predominant characteristic in question would be *not caring* whether
people might laugh at me.  If they're laughing, great!  That probably
means they're happy.
But, in programming, we know how to deal with errors. And one major
way is a free and open discussion undominated by autistic twerps.

I have rarely dominated a discussion.  :)
So stop replying.

But you're *funny*.

Don't you get it?  No one cares.  Your posts are funny.  I show them to
friends and coworkers as jokes, like linking them to a picture of a cat
with a funny caption or one of those youtube videos with the WW2 movie
and hilarious subtitles.  ("Hitler's

 

Seebs

Rather Orwellian of you to say that, wouldn't you say?

Not really. To be qualified to comment on something, you have to be able
to accurately describe it, even if it's wrong.

If your descriptions are wrong, you're not qualified to say whether the
thing itself is wrong, because you're not talking about it, you're talking
about something else you made up.
But it's not clear.

It seems to be clear.
It's nondeterministic.

Not really, but even if it were, that doesn't mean it can't be clear.
Schildt is clear and if I
were going to hire a C programmer, I would hire someone who's read
Schildt, not the Standard. That's because the latter chap would be
coding for the possibility of running on any machine. If that was my
concern, I'd make him use Java, wouldn't I.

You go ahead and do your kernel in Java. When I need a kernel to run on,
say, 60+ different embedded boards, I think I'll stick with C. :)
The problem is that "we don't know" is not a useful truth.

Sure it is.
We take you now to Mathematica, Inc. Herb Schildt, Peter Seebach, and
Ed Nilges all work for Edgar Whiteman, CEO:

Your masturbatory fantasies are not an argument.
Most compilers have left to right evaluation as it happens.

Do they?
We know this.

Do we?
When will you learn it?

When someone presents me with evidence of it.

int a() { putchar('a'); return 1; }
int b() { putchar('b'); return 1; }
int c() { putchar('c'); return 1; }
printf("\n%d %d %d\n", a(), b(), c());

yields "cba\n1 1 1\n" with gcc on my system. There's a sound technical
reason for which should be the case, I might add -- there is a concrete
technical advantage to evaluating those args right-to-left. It's not an
arbitrary choice.

To make it more interesting:

x = a() + b() * c();

What do you predict the output will be? (Hint: The MSDN page is wrong.)
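For anyone who wants to try that question, here is a complete version
(a sketch; the observed order will differ by compiler and options, and
any order of the three calls is permitted):

#include <stdio.h>

int a(void) { putchar('a'); return 1; }
int b(void) { putchar('b'); return 1; }
int c(void) { putchar('c'); return 1; }

int main(void)
{
    /* Precedence only means x = a() + (b() * c()); it says nothing
     * about which of the three calls runs first.  "abc", "cba", "bca"
     * and the rest are all permitted; x is 2 in every case. */
    int x = a() + b() * c();
    printf("\nx = %d\n", x);
    return 0;
}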
It's called the written record.

That's nice. You have not presented this written record, or shown how the
existing records we've looked at support your point.
The reason is that C is not truly portable, something which I have
maintained all along. It has nothing to do with Schildt, who is being
scapegoated.

You miss the point.

People learning from K&R don't have those problems. People learning from
H&S don't have those problems. People learning from King don't have those
problems. Only people learning from Schildt had those problems.
Corporatese. It excuses a failure to listen (read).

Not corporatese at all. Heuristics. If I know that you're often wrong
when you make claims about C, what does that tell me?
My experience is that people with "credibility" in corporate data
processing are mostly loudmouths and that they use it to defend bugs
so embedded as to be features. I saw this in securitization and I see
this in C.

Ahh, but I wasn't talking about "credibility" but credibility. As in,
the reasonable observer's estimate of the likelihood that a statement is
true, given only the information that you made it.
Whereas you want your autism, as evidenced in your inability to take
what Herb said other than in the most literal way, to be privileged to
the extent of destroying a reputation.

Never said anything of the sort. But there's a foundational flaw, again:
First off, his continued publication suggests that his reputation hasn't
been destroyed. Secondly, if factually accurate claims about the quality
of your writing destroy your reputation, you need to learn to write better.
You've endangered Herb Schildt's employability and
given pain to his extended family by making his name a byword.

Not demonstrably, but, again, even if true, then maybe he should have learned
the fucking language instead of not doing so.
You started this shit, I'm giving you a way out, and you are refusing it.

No, I wrote a web page which no one even remembered until you came on the
scene and started posting massive screeds about how much people hated
Schildt and how big a deal it was.

But, even if it were theoretically true that there were a problem... You
aren't offering a "way out". You're offering to cease harassing people if
they lie in a way that makes you feel like you won something. It wouldn't
change anything. If there's damage done to the guy's reputation by your
multi-year internet crusade against factual reporting about errors in his
books, it's far too late to change that.

-s
 

spinoza1111

Not really.  To be qualified to comment on something, you have to be able
to accurately describe it, even if it's wrong.

If your descriptions are wrong, you're not qualified to say whether the
thing itself is wrong, because you're not talking about it, you're talking
about something else you made up.


It seems to be clear.


Not really, but even if it were, that doesn't mean it can't be clear.


You go ahead and do your kernel in Java.  When I need a kernel to run on,
say, 60+ different embedded boards, I think I'll stick with C.  :)


Sure it is.


Your masturbatory fantasies are not an argument.


Do they?


Do we?


When someone presents me with evidence of it.

int a() { putchar('a'); return 1; }
int b() { putchar('b'); return 1; }
int c() { putchar('c'); return 1; }
printf("\n%d %d %d\n", a(), b(), c());

yields "cba\n1 1 1\n" with gcc on my system.  There's a sound technical
reason for which should be the case, I might add -- there is a concrete
technical advantage to evaluating those args right-to-left.  It's not an
arbitrary choice.

It does the same on Windows.

Of course, if I were autistic, I would say that the above code fails
because if I paste it in as is it doesn't compile. It doesn't have a
main().


Don't you know that all C programs that are run from the command line
must have a main?

My goodness Peter Seebach doesn't know that programs run from the
command line must have a main!

Alert the media.

Peter Seebach doesn't know that programs run from the command line
must have a main.
Peter Seebach doesn't know that programs run from the command line
must have a main.
Peter Seebach doesn't know that programs run from the command line
must have a main.

Nyah nyah.

There is an incompetent programmer named Pete
Who is a stupid idioeeet
He doesn't know
His ass from his elbow
That incompetent programmer named Pete.


NOW do you understand?

You did not mean to imply that the above code could be pasted and used
as is by including neither a main() nor an ellipsis (...) after the
procedures and before the printf().

But I did not conclude that you didn't know your job (which you do,
fortunately, since otherwise you'd be saying you want fries with that)
nor did I conclude that you meant to cover up the truth and confuse
newbies. At worst, you did not exercise the care I would have
exercised even in posting here.


OK, it prints cba and I understand your point. As to there being a
good reason, I disagree. The fact that the operands are best reversed
is an artifact of parsing which could be fixed at any time by a simple
reordering. The order is therefore a historical curio as bad as
Fortran's preserved mistakes.

The parameters are recognized left to right by most parsing
algorithms. But code is generated to push them on the stack and this
naturally reverses their order. All the compiler needs to do is send
the code to an array (with an upper bound that in all cases would be
small, and that could be itself malloc'd case by case) and then move
backwards through the array to generate the code in a more orthogonal
order.

But note that the standard isn't really "undetermined". The filthy
little secret (in twerpland secrets matter more than knowledge) is
that the de facto standard is right to left for parameters.

This means that there's code which uses this fact, in which in ...x(),
y()..., y() calculates a value that is used by x(). The twerp who
developed this code is quite proud of his "skill and knowledge", but
this type of coding style, which is rife and rampant in twerpland, is
not what I'd call professional programming.

As I have said I abandoned C in 1992 when I realized how bad it was.
Had I continued using it I probably would have been bit by this
problem and fixed things so enough because I check and test my code
(do you?) Because outside of Hebrew and certain other languages (where
Hebrew was revived from the dead by the gangsters who founded Israel),
literate people read left to right.

These people make better programmers than otherwise unemployable
autistic twerps because they can communicate without bullying and
offending people and they and not people like you should be
programmers. But having a modicum of self esteem they generally flee
the field.

I did not because I was a responsible father but once my kids were
grown I was glad to find first a job in marketing and today as a
teacher, because I won't work with autistic twerps that talk behind
other people's backs and trash reputations for fun.
To make it more interesting:

x = a() + b() * c();

What do you predict the output will be?  (Hint:  The MSDN page is wrong.)

On my system it is abc and this is correct even according to the Twerp
Standard. It's nondetermined, remember. The problem:

(1) Intelligent people will reason, like intelligent people, from the
left to right order of the sum to the left to right order of the
procedure call.

(2) Stupid people will laugh at them for not knowing that C is not
orthogonal because stupid people prefer ugly and foul things.

The original reason for designing programming languages was NOT to
create a catchment basin for inadequate males. It was to allow
scientists to avoid having to shoe-horn their thoughts into the
restrictions imposed by machine and assembler languages. It was to
allow corporate executives to control their companies. It was to allow
government officials to govern fairly and effectively.

Using an older form of language it was to allow free men to act freely
like men, but of course bimbo feminism forbids men from naming
themselves, whereupon men seek fashionable labels like autistic or
gay.

But as it happens, since the elite prefer secrets, lies and videotape
and need someplace to stuff the inadequate males created in the family
systems of the Fifties, Sixties and Seventies, family systems in which
mothers did all the emotional work, the poor design of artifacts like
C has become welfare for white people.

The only reason you have a job is because data systems have been in
part designed to preserve white male privilege, and in the main to
preserve corporate profits.
That's nice.  You have not presented this written record, or shown how the
existing records we've looked at support your point.


You miss the point.

People learning from K&R don't have those problems.  People learning from
H&S don't have those problems.  People learning from King don't have those
problems.  Only people learning from Schildt had those problems.

I don't think anyone ever learned programming from a book. Instead,
they've learned from a variety of sources including books, coworkers,
mentors and the machine itself.
Not corporatese at all.  Heuristics.  If I know that you're often wrong
when you make claims about C, what does that tell me?

You haven't proven that, since I told you already I abandoned the
language, and I've probably said about 5 "wrong" things to 50 correct
things. "Wrong" would be my claim that "gee, left to right evaluation
probably applies after precedence. At least I sure as hell hope so."
"Right" would be my claim that "C sucks, doesn't it, because you
people confuse everything when you discuss it and when we learn the
truth we find that the truth creates bugs". I am wrong in the trivial
and right in the main: you the reverse.
Ahh, but I wasn't talking about "credibility" but credibility.  As in,
the reasonable observer's estimate of the likelihood that a statement is
true, given only the information that you made it.


Never said anything of the sort.  But there's a foundational flaw, again:
First off, his continued publication suggests that his reputation hasn't
been destroyed.  Secondly, if factually accurate claims about the quality
of your writing destroy your reputation, you need to learn to write better.

These ngs are all first drafts. What's missing is any proof of wrong
in my magazine articles or book. If you'd prove what you claim, you
need to write a "C: The Complete Nonsense" about my book.

I'd send you a free copy to help you get started, but I need your
money.

And be advised that unlike Herb, I will be on your ass like a fly on
shit once you publish your new Vicious Tirade.
Not demonstrably, but, again, even if true, then maybe he should have learned
the fucking language instead of not doing so.

I don't need to since this isn't a technical issue. You have tried and
partly succeeded to destroy a man's reputation at a libelous level.
I'm not a little creep of a computer programmer. Think of me as your
manager. Think of me as your parole officer. Think of me as the
prosecuting attorney nailing your butt. Think of me as your father.

Attorneys chortle when programmers start nattering on about technology
because the law has more majesty than some little programming
language.
 

Seebs

Also, (artificial) deadlines make life very hard for technical
authors, especially when you bear in mind that they are trying to do
a full day's work at the same time and in many cases have families. I
can fully understand how code that won't even compile could get into
a Schildt book. And I don't have a huge problem with that. The onus
is heavier on a tutorial author to get things right, but hey, people
are human. No, the problem is not the errors themselves. It's the
lack of a working mechanism for acknowledging them and publishing
corrections.

I had a hellish time with my book due to the need to use MS Word as
a writing "tool" (using the term either extremely loosely or as a
euphemism, not sure which). For the most part, the formatting didn't
bite too much, but the sheer time sink of fighting the software definitely
hurt.

That said... The sizeof(rec) error strikes me as a kind of error that
should NEVER have made it into a book on C. There's tons of obvious
kinds of mistakes that people legitimately make even when they know
something, but I don't think I've ever seen a C programmer with more
than a month or two of experience get that one wrong.

-s
 

Keith Thompson

spinoza1111 said:
Not in this scenario. A true constant expression such as 25*8 (or
TWENTY_FIVE * EIGHT where TWENTY_FIVE and EIGHT are preprocessor
variables, or more sensibly FORCE * MASS) has no type at all and I
believe the compiler developer can choose what he thinks is best. I
think widest precision is best.

Either you're using the phrase "true constant expression" to mean
something that doesn't exist in C, and is therefore most likely
irrelevant both to this newsgroup and to whatever problem you
and John Nash were working on, or you are just factually wrong.
C constant expressions do have types. 25*8, for example, is of type
int, and compiler developers can either accept that and implement
it that way, or not claim to be developing a conforming C compiler.
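A small illustration of the type rule (a sketch only; sizeof can't
strictly prove a type, but the output is consistent with the rule that
an int times an int yields an int):

#include <stdio.h>

int main(void)
{
    /* Both operands of 25*8 have type int, so the constant expression
     * has type int and value 200. */
    printf("%d\n", 25 * 8);                           /* 200 */
    printf("%zu %zu\n", sizeof(25 * 8), sizeof(int)); /* same value */
    return 0;
}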

If you're unwilling to accept the way C defines things, you might
consider finding (or, if necessary, inventing) another language,
which you can discuss elsewhere.
 

Seebs

It does the same on Windows.

As we'd expect.
Of course, if I were autistic, I would say that the above code fails
because if I paste it in as is it doesn't compile. It doesn't have a
main().

Really? That's odd. I have a limited sample space, to be sure, but 100%
of my tested sample space of autistic people understand that code fragments
are not the same as complete programs.
NOW do you understand?

I understand that you don't, anyway.
OK, it prints cba and I understand your point. As to there being a
good reason, I disagree. The fact that the operands are best reversed
is an artifact of parsing which could be fixed at any time by a simple
reordering. The order is therefore a historical curio as bad as
Fortran's preserved mistakes.

Wrong.

It's not a question of parsing.

Imagine that we are to push the arguments onto the stack. If we push them
left to right, how do we find the format string? We have to push an
additional value to indicate how far back up the stack to go to find the
last non-variadic argument. If we push them right to left, the format
string is, of course, the first argument, and we can then go back up the
stack, using the format string to figure out what's next.

In short, it has NOTHING to do with parsing -- it has to do with runtime,
and the right-to-left order saves several operations per call.
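A minimal <stdarg.h> sketch of the same constraint seen from the
callee's side, using a hypothetical sum_ints(): the callee can only
start walking the variable arguments from the last named parameter,
which is why it matters that the leftmost arguments be easy to find.

#include <stdarg.h>
#include <stdio.h>

/* Sums 'count' ints passed after the one named parameter. */
static int sum_ints(int count, ...)
{
    va_list ap;
    int total = 0;

    va_start(ap, count);           /* positioned just past the named argument */
    for (int i = 0; i < count; i++)
        total += va_arg(ap, int);  /* variable arguments are consumed in order */
    va_end(ap);
    return total;
}

int main(void)
{
    printf("%d\n", sum_ints(3, 10, 20, 30));   /* prints 60 */
    return 0;
}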
The parameters are recognized left to right by most parsing
algorithms. But code is generated to push them on the stack and this
naturally reverses their order.

Not really "naturally". Some compilers do right-to-left only for
variadic functions, for instance. Example:

#include <stdio.h>

int a() { putchar('a'); return 2; }
int b() { putchar('b'); return 2; }
int c() { putchar('c'); return 2; }

int foo(int d, int e, int f) {
        putchar('\n');
        return 0;
}

int
main(int argc, char *argv[]) {
        foo(a(), b(), c());
        printf("\n%d %d %d\n", a(), b(), c());
        return 0;
}

On an x86 box using gcc, I got:

abc
cba
2 2 2

... but only at higher optimization:

cba
cba
2 2 2
... at -O2
abc
cba
2 2 2
... at -O3
All the compiler needs to do is send
the code to an array (with an upper bound that in all cases would be
small, and that could be itself malloc'd case by case) and then move
backwards through the array to generate the code in a more orthogonal
order.

Oh, sure. But the most natural thing would be to run them in the order
they're going to get pushed.
But note that the standard isn't really "undetermined". The filthy
little secret (in twerpland secrets matter more than knowledge) is
that the de facto standard is right to left for parameters.

You seem to be obsessed with this idea that there's secret knowledge
involved. Honestly, until I actually ran these tests, I had NO IDEA
what order gcc evaluated arguments in. Why would I? I've only been
programming in C for about 20 years; it hasn't come up, because it's
not significant.
This means that there's code which uses this fact, in which in ...x(),
y()..., y() calculates a value that is used by x().

Code like that is broken.
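A sketch of the kind of code being described, with hypothetical names,
showing why it is broken: the value handed to f() depends on which
argument a given compiler happens to evaluate first.

#include <stdio.h>

static int shared;

static int y(void) { shared = 42; return 1; }
static int x(void) { return shared; }   /* hopes y() already ran */

static int f(int left, int right) { return left + right; }

int main(void)
{
    printf("%d\n", f(x(), y()));   /* 1 on some compilers, 43 on others */
    return 0;
}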
The twerp who
developed this code is quite proud of his "skill and knowledge", but
this type of coding style, which is rife and rampant in twerpland, is
not what I'd call professional programming.

This is another beautiful inversion. I've argued all along for the idea
that it doesn't matter what order things are evaluated in and you
shouldn't be caring. You spent quite a while arguing that good programmers
wrote code which relied on order of evaluation, but that this was broken
by the standard. Now you're arguing that bad programmers like the people
telling you the order is unspecified and you shouldn't use it (a matter
on which we are in agreement with everyone from Schildt to K&R) are, in
fact, secretly relying on the order of evaluation which we just told you
is unspecified and should not be relied on.
As I have said I abandoned C in 1992 when I realized how bad it was.

But you've made it quite clear that you don't understand it well enough to
be competent to have an opinion. The closest we could get was to grant
that it's clearly a language that is ill-suited to your use.
Had I continued using it I probably would have been bit by this
problem and fixed things so enough because I check and test my code
(do you?)

I check my code very carefully, but I have never, ever, been bitten
by this "alleged problem". How could I be? I learned the first time
I opened a book on C that you were supposed to not rely on order of
evaluation. Problem solved.
Because outside of Hebrew and certain other languages (where
Hebrew was revived from the dead by the gangsters who founded Israel),
literate people read left to right.

I'll be sure to let the Chinese and Japanese know that you are smarter
than them.
On my system it is abc and this is correct even according to the Twerp
Standard. It's nondetermined, remember.

Good. That's what I get too.

But the MSDN pages say it should be bca, because they claim that order
of evaluation follows precedence.
(1) Intelligent people will reason, like intelligent people, from the
left to right order of the sum to the left to right order of the
procedure call.

You keep making this handwaving assertion. In fact, intelligent people
usually do not make up arbitrary rules and demand that the world ought to
follow them. Thinking the world is stupid for not being like you is not,
in general, actually a sign of intelligence.
(2) Stupid people will laugh at them for not knowing that C is not
orthogonal because stupid people prefer ugly and foul things.

I would love to know what you think the word "orthogonal" means.
The only reason you have a job is because data systems have been in
part designed to preserve white male privilege, and in the main to
preserve corporate profits.

See, this stuff is why I still read your posts. It's awesome. I mean, sure,
it's rambling, incoherent, probably hebephrenic. But it's still pretty
fucking awesome. It is not every day that I get to see a Dan Brown level
of conspiracy theory used to justify the mysterious decision to hire someone
who loves to work on compilers and writes reliable code to do stuff pertaining
to compilers.

Those people who hired me, they're crazy. I heard they also use shovels to
dig sometimes, eat soup with a spoon, and use a butter knife to spread butter.
I don't think anyone ever learned programming from a book.

I don't know about programming, but I am quite sure people have learned
English from a book [Fawlty 75]. However, you contradict yourself again
(you are large, you contain multitudes). You only recently pointed out the
hypothetical example of someone who needed to learn C and had only Schildt's
book to guide him. Well, we've had that guy show up here, confused and
distraught.
Instead,
they've learned from a variety of sources including books, coworkers,
mentors and the machine itself.

Ahh, yes. But the book is the source you have with you to explain what the
machine is doing -- if it lies to you, you're in trouble.
You haven't proven that, since I told you already I abandoned the
language, and I've probably said about 5 "wrong" things to 50 correct
things.

You've made a number of totally untestable claims, but your testable claims
have been very frequently wrong. You've made a number of claims about
the standardization process that have been consistently, ludicrously,
wrong.
"Wrong" would be my claim that "gee, left to right evaluation
probably applies after precedence. At least I sure as hell hope so."

And also your claim that this changed in C99, then your claim that it
changed in C89, your claim that it was done in response to vendors with
broken compilers, and so on. In fact, it was that way from the very
beginning, for sound reasons, and your assertion that "9 out of 10"
compilers did it a particular way remains totally unsupported.
"Right" would be my claim that "C sucks, doesn't it, because you
people confuse everything when you discuss it and when we learn the
truth we find that the truth creates bugs".

Actually, uhm, no. We don't confuse stuff, we just reveal that you
were confused all along. You've yet to establish that the truth "creates
bugs" in the matter of order of evaluation.
These ngs are all first drafts.

True, but irrelevant.
What's missing is any proof of anything wrong
in my magazine articles or book.

True, but irrelevant, because I was not discussing your magazine articles
or your book, but Schildt's book. Keep in mind that "you" in English can
be used as a generic pronoun; the "your reputation" in the above applies
to all writers, not just spinoza1111.
If you want to prove what you claim, you
need to write a "C: The Complete Nonsense" about my book.

Not really. I don't care about your book; I have no interest in learning
another flavor-of-the-month Microsoft API. Since I started programming,
the official API you should target for Microsoft systems has changed about
six times. If they're able to keep .NET sufficiently useless to everyone
else, they might run with it for a while, otherwise they'll switch again.
I'm not gonna waste time learning any of them until I have a much more
concrete reason to.
And be advised that unlike Herb, I will be on your ass like a fly on
shit once you publish your new Vicious Tirade.

If I were inclined to bother, this would surely matter to me.
I don't need to since this isn't a technical issue.
Non-sequitur.

You have tried, and partly succeeded, in destroying a man's reputation
at a libelous level.

Truth is, as the nice folks say, an absolute defense against libel in these
parts. If in fact he's wrong, then he's wrong, and I'm not wrong to say
so.
I'm not a little creep of a computer programmer.

I would never have thought so.
Think of me as your manager.

You're a sheep farmer?
Think of me as your parole officer.

If I'm ever on parole, I'll totally keep that in mind.
Think of me as the prosecuting attorney nailing your butt.

If I ever commit a crime, I'll totally keep that in mind.

(Disclaimer: Given the complexity of the US legal system, the chances
that I've committed some sort of crime are near unity.)
Think of me as your father.

Wow, it's true, zombies ARE stupider than they were when they were alive.
Attorneys chortle when programmers start nattering on about technology
because the law has more majesty than some little programming
language.

In other words, you're basically conceding that there are errors, but you
like to feel that, even though the law is pretty clear on what is or isn't
libel, it OUGHT to be the case that there's some kind of way to do an end
run around it.

You still haven't explained what your big obsession with Schildt's books is.
There are other books on C out there, some quite readable and much more
accurate. What's wrong with them?

-s
 
K

Keith Thompson

Richard Heathfield said:
No it doesn't, but in general the point is a good one.

I had no end of trouble with closing braces. They were present in my
code, the code that ended up on the CD - but somehow they
occasionally got dropped from the text. That caused a lot of
head-scratching for a while, especially as it was an intermittent
fault and thus hard to chase.

Also, (artificial) deadlines make life very hard for technical
authors, especially when you bear in mind that they are trying to do
a full day's work at the same time and in many cases have families. I
can fully understand how code that won't even compile could get into
a Schildt book. And I don't have a huge problem with that. The onus
is heavier on a tutorial author to get things right, but hey, people
are human. No, the problem is not the errors themselves. It's the
lack of a working mechanism for acknowledging them and publishing
corrections.

Again, here's the code in question:

/* write 6 integers to a disk file */
void put_rec(int rec[6], FILE *fp)
{
int len;

len = fwrite(rec, sizeof rec, 1, fp);
if(len<>1) printf("write error");
}

Hmm. You obviously know more about the publishing world than I do,
but I think you're letting Schildt off the hook too easily. Missing
braces are the kind of problem that's perfectly understandable (it
would be nice if there were a mechanism that prevented it, but the
errors Schildt made in the particular example are not. The code,
as written, could not have worked, and is not textually similar to
any code that could have worked. It's not entirely unreasonable,
I suppose, for someone to write such code and think it was correct.
But the only way it could have survived to appear in print is if
the author *never actually tried it*.
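
For contrast, here is a sketch of roughly what a working version would
have to look like (an illustration only, not anything from the book):
the comparison operator must be !=, and because the array parameter
decays to a pointer, sizeof rec gives the size of a pointer rather than
the size of six ints, so the size has to be spelled out.

#include <stdio.h>

/* write 6 integers to a disk file */
void put_rec(int rec[6], FILE *fp)
{
    size_t len;

    /* "sizeof rec" would be sizeof(int *) here, because the array
       parameter decays to a pointer, so the size is spelled out. */
    len = fwrite(rec, sizeof(int), 6, fp);

    /* fwrite returns the number of items actually written */
    if (len != 6)
        printf("write error");
}
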

And I understand that he may have been busy, but I suggest that if
he was too busy to get such fundamental things right, he was too
busy to write a book.

Yes, an errata list that acknowledged and corrected such errors
would mitigate the harm, but I honestly don't see how a competent
author could reasonably have made that particular mistake in the
first place and not caught it before publication.
 
F

Flash Gordon

Richard said:
[Attribs mildly tidied up]

In

<40 lines of unaddressed context snipped because Spinoza1111 is too
lazy or too stupid to do it himself>
No, it's not clear,

It's clear to everyone else. Which bit don't you understand?
and that you think it's clear shows your lack of linguistic ability.

That you think it's unclear shows your lack of brain.

It will be easier if you use his definition of what "clear" means and
then say that, by that definition, what Schildt wrote was not clear,
since it was wrong.
 
K

Kenny McCormack

Except that Monty Python is, you know, funny.

(Now watch Spinoza complain that I've insulted him by saying he's not
funny.)

You've routinely done essentially the same thing. That is, taken great
pains to interpret things said about or to you as insults to your
manhood.
 
S

Seebs

You've routinely done essentially the same thing. That is, taken great
pains to interpret things said about or to you as insults to your
manhood.

lolwut

Seriously, that's just dumb. Given that I don't think the category "insults
to my manhood" is semantically valid (I am certified 100% macho-free), I
think this is another case of you assuming that everyone else is just like
you, and interpreting their behavior accordingly.

You don't actually seem to have any interest in discussing C; is it really
that good a use of your time to sit around and do nothing but bitch about
people who do?

-s
 
K

Kenny McCormack

lolwut

Seriously, that's just dumb. Given that I don't think the category "insults
to my manhood" is semantically valid (I am certified 100% macho-free), I
think this is another case of you assuming that everyone else is just like
you, and interpreting their behavior accordingly.

(FYI) I was talking to Kiki, not you.
You don't actually seem to have any interest in discussing C; is it really
that good a use of your time to sit around and do nothing but bitch about
people who do?

This is a comedy newsgroup. I read it (and post to it) for the
entertainment value. It hasn't really been about C (in any useful
sense) for decades.
 
S

spinoza1111

Either you're using the phrase "true constant expression" to mean
something that doesn't exist in C, and is therefore most likely
irrelevant both to this newsgroup and to whatever problem you
and John Nash were working on, or you are just factually wrong.
C constant expressions do have types.  25*8, for example, is of type
int, and compiler developers can either accept that and implement
it that way, or not claim to be developing a conforming C compiler.
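
(A quick illustration of that point, assuming nothing beyond standard C;
sizeof applied to the expression reflects the size of that type:)

#include <stdio.h>

int main(void)
{
    /* 25 * 8 is a constant expression of type int; with an L suffix it
       becomes a constant expression of type long. */
    printf("%zu\n", sizeof(25 * 8));   /* same as sizeof(int)  */
    printf("%zu\n", sizeof(25L * 8));  /* same as sizeof(long) */
    return 0;
}
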

But here we learn that conformance is impossible because to "conform"
you have to run nondeterministically. As to your example, the standard
wasn't in common use at the time.
If you're unwilling to accept the way C defines things, you might
consider finding (or, if necessary, inventing) another language,
which you can discuss elsewhere.

Nice little gag rule. Kiki, "C" doesn't mean "standard C". The
standard is trash. C means "what people code when they say they code
C", because in reality, they are more right than the standard which
was defined as I have said to protect the profits of vendors.

Herb was in fact reporting actual practice amongst actually productive
programmers and as such did a far more valuable service. There is no
need for standards, especially standards used by companies to increase
profits. Just do a halfway decent job of language design. The
developers of C did not.
 
