subroutine stack and C machine model


spinoza1111

At this point, I feel obliged to help Spinny out:  I never graduated from
high school, nor do I have a GED or anything comparable.  Oh, and I
have a learning disorder!  Also, depending on whom you ask, I may technically
be considered "gay".  And I like cats but not dogs.  Finally, and I think
this is particularly important, I really liked Death Magnetic and it's
probably my favorite Metallica album.

Although it's not relevant whether you're gay, and not relevant if you
failed to graduate from high school other than that it might indicate
dysfunction, I would expect you graduated from university. Please
provide on-topic information as to your major.
 

Nick Keighley

The standard, statistically, continued to be Microsoft's C compilers

I don't think this is true. I don't think Microsoft were that big a
player in the C compiler market in 1989. Weren't Borland bigger? Even
DEC was probably more important.

for better or worse. Better in that the developers at Microsoft seem
to have believed in precedence followed by left to right.

right to left

<snip nash story>
[...] the Standardizers ignored Microsoft being unwilling to confront its
market power.

Whilst it's a waste of bits to contradict everything you say that is
wrong, I like to stick one in every now and again. I don't think the C
standardisers gave a tuppenny whistle what Microsoft thought in 1989.
 

spinoza1111

In

spinoza1111wrote:



All conforming compilers. The Standard guarantees that b()'s return
value will not be used and that b() will not be called. It also
guarantees that, in a() && b(), if a() returns 0, b()'s return value
will not be used and b() will not be called.
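A minimal sketch of the guarantee being described (the bodies of a() and
b() here are hypothetical, added only to make the side effects visible):

#include <stdio.h>

static int a(void) { printf("a called\n"); return 0; }
static int b(void) { printf("b called\n"); return 1; }

int main(void)
{
    if (a() && b())           /* a() returns 0, so b() is never called */
        printf("both nonzero\n");
    else
        printf("short-circuited\n");
    return 0;
}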

Richard, please don't waste our time by telling us what we know.
 

spinoza1111

The issue never came up. No one suggested that this should be changed.


Not when that language is used for high-performance computing.
Many users depend on the sorts of compiler magic enabled by that
feature.

"users" and "compiler magic" are meaningless and barbaric words.
Simply not true. This isn't "invalid reordering". It was a considered
decision, back in the day, to allow for this sort of reordering to allow
compilers to better adapt code to processors with different semantics.
This is nonsense, since a||b and a&&b have always been implemented
such that b is not evaluated when a is true (in the case of or) or b
is false (in the case of and). Furthermore, even "back in the day" it
was already known that reordering that breaks code with side effects
was a bad idea.

As it happened, certain processors made it convenient for compiler
developers to reorder a()+b() and similar constructs. When they did so
(often in consequence of stupid mistakes very similar to the first way
I constructed a grammar this week so as to right-associate minus and
divide), they blamed the victims of their incompetence. They did not
foresee the use of function calls and penalized the best programmers in
the typical fashion of incompetents.
No. First off, again, the issue simply was not raised that I know of;
if it was, surely you can give us the defect number of the defect someone
raised that would have brought the question to light? No?

Wow, just because it got lost in the paperwork doesn't mean it doesn't
exist. In fact, these "tracking systems" were shown, in the official
reports on the (NASA) Columbia and Challenger disasters, to be
excellent ways to cover up errors and (in the words of an official
researcher) to "normalize deviance".

Look at the misuse of the language alone. To say "hey, you idiots, you
are ignoring the fact that we now know how to reorder, not source
code, but its internal representation while preserving correctness,
and for this reason there is no reason, any longer, to say that the
order of operations is invalid" is even in bureaucratese an issue and
not a defect.

The fact that the damage to Columbia's heat shielding, which in the
words of the official report was simply not supposed to happen, was
"registered" as either a "defect" or an "issue", coupled with sloppy
English, caused the Columbia engineers to say "oh yeah, we know about
dat defect, dat issue" and to ignore it.

It appears this happened wrt C99.
Secondly, it's not a question of compiler courses, but of philosophy -- a
subject you should appreciate! Philosophically, C's policy has been to
let you specify things if you mean to, and let you leave them unspecified
when you want the compiler to try to get better performance and you have
no preference between a couple of choices.
You do not express yourself well, and I think in general inability to
say what one means here causes flaming and bullying. This has gone on
so long in the educational system, and propagated so much on the
Internet, that good style is now considered, as in my case, to
indicate mental disorder, as in Idiocracy: "you talk like a fag and
your shit's all fucked up".

What you seem to be saying is that "C" (considered as the group of
compilers) does things differently and in some cases evaluation is in
the unexpected right to left order, etc. This was due in the past
mostly to the state of the art at the time the compiler was designed
and the rage on the part of programmers to think of themselves as
"engineers", "squeezing performance" out of a metaphorical car.

But there was no reason to Standardize this bullshit owing to the fact
that when the Standard was written, we knew how to preserve the
illusion of strictly sequenced source and to optimize in a hidden way.
Your remit seems to have been to make broken compilers "standard" in a
similar way to how the Columbia engineers were authorized to approve a
launch...in which people got killed.
Again, unsupported.

Why do you keep making these dogmatic claims about events you obviously
never witnessed, in a field of inquiry where you've aggressively and
actively pursued militant ignorance?

I think the ignorance is yours. Standardizing "no defined order" was
your way to make different orders standard and different vendors
happy. In doing so, you used a bureaucratic procedure and language,
the "knowledge" of which was a form of insider trading. The result is
the manufacture of ignorance and incoherence, because a()||b() cannot
be reordered but a()+b(), apparently, can be reordered.
 

Dik T. Winter

> news:[email protected]... ....
>
> The C for-loop certainly is amateurish-looking, in that the compiler almost
> has to be instructed every time exactly how to code a for-loop instead of
> figuring it out for itself.
>
> The Algol 60 version at least deals with that properly.

In what way?
> The C for-loop has some useful aspects, but for the basic purpose of
> iterating over a range, it's fiddly to write with extra scope for error, as
> the loop variable has to be written 3 times, together with the terminating
> condition (is it < or <=...) and the increment step.

The basic purpose of the C for-loop is not looping over a range, I think. It
is more like a while loop.
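For example (a sketch only; struct node, head, and process() are
hypothetical names used to illustrate the point):

struct node { int value; struct node *next; };
void process(struct node *p);

void walk(struct node *head)
{
    struct node *p;

    /* The three clauses need not describe a numeric range at all;
       this is really a while loop in for-loop clothing. */
    for (p = head; p != NULL; p = p->next)
        process(p);
}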
 

spinoza1111

In <[email protected]>,

spinoza1111wrote:

Computer Science classes [...] DID NOT EXIST
when Dijkstra was starting out [...],

Yes, precisely. And therefore, by your own argument, he was
unqualified to participate in the field of Computer Science. Your
inability to see your own point is fascinating. Fortunately, people
like Dijkstra, Turing, von Neumann, Knuth, Moore, Floyd, Hoare, and
the like were not so snobbish as to think that a person without an X
degree is not qualified to express opinions on X. Had they been so,
CS would never have got off the ground.

Apples and oranges. Seebach is "no Edsger Dijkstra" because it was
clear from "C: The Complete Nonsense" that he was unable to organize a
document, separating important from unimportant issues.

I don't care if he's "gay". Turing was gay. But it is astonishing to
me that he doesn't appear to have a complete degree, did not finish
high school or equivalent, and, above all, that he did not ever take a
computer science class. In short, he'd never functioned meaningfully
within a community of knowers. Instead, he seems to have been selected
to work on the standard based on his "genius" (an artifact in part of
avoiding, apart from participation in a community of knowers, the very
real bullshit that also exists in academia) and his willingness based
on his lack of academic experience to go along with vendor
requirements. He was in some sense a *tabula rasa* who could be
counted on to shoehorn language in bureaucratic containers, and, in
corporate style, personalize issues as in the case of Schildt, who has
a COMPLETE undergraduate degree in philosophy, and a Master's degree
in computer fucking science.

Furthermore, your "reasoning" shows no awareness of the time dimension
whatsoever. In Korporate style, you reason that if it's a Rule it must
have always been a Rule (Eurasia has always been at war with Oceania
in 1984; "we have always done it this way").
Let me put it more clearly. A degree in X is NOT NECESSARY for
expressing a competent opinion on X or doing work in the field of
X[1].

Wow. Hey, moron, replace X by brain surgery.
Let me put it even more clearly. A degree in X is NOT SUFFICIENT for
an opinion on X, or work in the field of X, to be guaranteed
competent.

This is the wikipedia legend, fostered historically by the fact that
in business data processing, companies wanted the appearance of
accurate computation without its reality insofar as the reality would
interfere with profits. For this reason, it has been said repeatedly
that computer science is worthless as a degree, mostly by people who
don't have such a degree. If they are retained in jobs with fat
salaries, they reason that they know their jobs, but it's clear to me
that you don't.
<snip>

[1] Except in certain fields, for legal reasons which have nothing to
do with actual competence and everything to do with perceived
competence. In other words, many legislators make the same mistake as
you.

So brain surgery should be performable by you? Right...
 

bartc

Dik T. Winter said:
In what way?

By being written as (iirc):

FOR I:=A UNTIL B DO ...

instead of:

for (I=A; I<=B; ++I);

which requires you to tell it pretty much how to code the loop, diverting
some concentration from other matters. (In fact when I wrote the above, even
though I was taking care, it first came out as:

for (I=A; I<=B; ++B);

but using uppercase the error was quickly noticed.)
The basic purpose of the C for-loop is not looping over a range, I think.
It is more like a while loop.

Exactly. It's quite a nice looping construct but is clumsy for a for-loop.
 

Nick Keighley

By being written as (iirc):

FOR I:=A UNTIL B DO ...

FOR I := A STEP 1 UNTIL B DO ...
instead of:

for (I=A; I<=B; ++I);

which requires you to tell it pretty much how to code the loop, diverting
some concentration from other matters. (In fact when I wrote the above, even
though I was taking care, it first came out as:

for (I=A; I<=B; ++B);

but using uppercase the error was quickly noticed.)



Exactly. It's quite a nice looping construct but is clumsy for a for-loop.

Whilst Algol-60 never looked messy

for j:=I+G,L,1 step 1 until N, C+D do A[k,j]:=B[k,j] do something;
for dummy:=0, dummy + 1 while dummy < count do thingy;
 

Dik T. Winter

>
> As it stands, this makes no sense.

In the expression:
a + b
there are three evaluations, the evaluation of the first operand, the
evaluation of the second operand and the operation itself. What was
talked about was the order of evaluation of 'a' and 'b', not about how
the actual operation was performed, but you focus on the latter.
> I understand that the C standard
> didn't mandate violating the laws of mathematics,

The current version does not even violate numerical mathematics, that
is, an expression can only be rewritten if it is known that the exact
result of the rewrite is not different from the result of the original.

So I wonder what your point is, mathematics does not tell whether 'a'
must be evaluated before 'b' or not.
 

Nick Keighley

 > >  > "Except for the assignment operators and the null coalescing operator,
 > >  > all binary operators are left-associative, meaning that operations are
 > >  > performed left to right."

 > > Again you are confusing the order of evaluation of arguments to an
 > > operation and order of evaluation of the operations themselves.
 >
 > As it stands, this makes no sense.

In the expression:
    a + b
there are three evaluations, the evaluation of the first operand, the
evaluation of the second operand and the operation itself.  What was
talked about was the order of evaluation of 'a' and 'b', not about how
the actual operation was performed, but you focus on the latter.

 > I understand that the C standard didn't mandate violating the laws of mathematics,

whatever /they/ are...

Spinoza: consider the expression
a - b

The C Standard does not allow an implementation to re-arrange this to
b - a

because that would be wrong, but it does allow the implementation to
generate code like this (R1 and R2 are registers)
R1 <- VALUE(a)
R2 <- VALUE(b)
R3 <- R1 - R2

or like this
R1 <- VALUE(b)
R2 <- VALUE(a)
R3 <- R2 - R1

In each case R3 contains the result whilst the evaluations of 'a' and
'b' have been done in a different order.
The current version does not even violate numerical mathematics, that
is, an expression can only be rewritten if it is known that the exact
result of the rewrite is not different from the result of the original.

So I wonder what your point is, mathematics does not tell whether 'a'
must be evaluated before 'b' or not.

and note that since 'a' and 'b' are potentially arbitrarily complicated
expressions it could make a difference

i = g();
a = i + j;
s = f() + a;

by stashing a away before the call to f() you could save
recalculating it.
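A small sketch of the point (the bodies of f() and g() are hypothetical,
added to make the order of evaluation observable):

#include <stdio.h>

static int f(void) { printf("f evaluated\n"); return 1; }
static int g(void) { printf("g evaluated\n"); return 2; }

int main(void)
{
    int s = f() + g();   /* s is always 3, but whether "f evaluated" or
                            "g evaluated" prints first is unspecified */
    printf("s = %d\n", s);
    return 0;
}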
 

Dik T. Winter

> > Exactly. It's quite a nice looping construct but is clumsy for a for-loop.
>
> Whilst Algol-60 never looked messy
>
> for j:=I+G,L,1 step 1 until N, C+D do A[k,j]:=B[k,j] do something;
> for dummy:=0, dummy + 1 while dummy < count do thingy;

for a:= 1, 3, 8, 16, 22, 11, 9 do i:= i + 1;
 

Seebs

Apples and oranges. Seebach is "no Edsger Dijkstra" because it was
clear from "C: The Complete Nonsense" that he was unable to organize a
document, separating important from unimportant issues.

Since you're interested in philosophy, you will be fascinated to note
that it is very hard to determine whether or not you have knowledge of
the above.

Consider the common assertion that knowledge is "justified true belief".
A man looks out into a field while passing it in a train, and sees something;
he believes there is a sheep in the field, because he has seen a fluffy white
thing. But in fact, what he saw was an unusually shaped white rock. But!
Behind the rock, completely invisible to him, is a smallish sheep.

Does he have knowledge of the sheep? Probably not.

The relevance of this is that your "criticisms" of the CTCN page have
been consistently found to be rambling, incoherent, or just plain wrong...
And yet, by total coincidence, I think it's safe to say that you're quite
right that, back when I wrote it, I was unable to organize a document,
separating important from unimportant issues. (That said, I would also
point out that I was making no attempt to do so; I picked issues I found
interesting or informative, and they were presented strictly in page
order.)
I don't care if he's "gay". Turing was gay. But it is astonishing to
me that he doesn't appear to have a complete degree, did not finish
high school or equivalent, and, above all, that he did not ever take a
computer science class. In short, he'd never functioned meaningfully
within a community of knowers.

You seem to have missed a key point: What makes you think I don't have
a complete degree? I completed all but a couple of requirements of math
and philosophy degrees, and *did* complete a psychology degree. I functioned
fairly adequately in college -- quite well if you make allowances for the
fact that, at the time, we had no idea that I had a learning disability, nor
any clue how to treat it or accommodate it. Life has since improved.
Instead, he seems to have been selected
to work on the standard based on his "genius"

No.

There is no "selected to". I wanted to, it sounded fun, people in the
standards community knew me and said I should try it out, so I did. There's
no selection; you pay your dues and show up (or don't show up, if you don't
want to).
(an artifact in part of
avoiding, apart from participation in a community of knowers, the very
real bullshit that also exists in academia) and his willingness based
on his lack of academic experience to go along with vendor
requirements.

The latter is a great example of something which isn't knowledge, because
it is neither true nor justified. There simply is no history of wanting
people to "go along with vendor requirements". During the entire time I
worked on C, we consistently had non-vendors present and active in the
process, and they were listened to carefully, because vendors wanted to make
sure they didn't screw their users by not being aware of user requirements.

In short, what actually happened was the opposite of what you describe.
He was in some sense a *tabula rasa* who could be
counted on to shoehorn language in bureaucratic containers, and, in
corporate style, personalize issues as in the case of Schildt, who has
a COMPLETE undergraduate degree in philosophy, and a Master's degree
in computer fucking science.

Your assertion that this was "personalized" fascinates me, because I was
unaware of any personal aspect to the thing.

However. You raise an interesting concern. You have pointed out in
the past a belief that people like me had "AP'd out of" early CS courses,
and overspecialized later, leading to a disconnected ivory-tower view
of things, uninformed by significant awareness of the sorts of things
one picks up earlier in a CS program.

What, then, should we make of someone who apparently skipped over most of
undergraduate CS entirely, and came in only at the master's degree level,
which is purely academic? Possibly that doesn't tell us anything. But what
if that person then demonstrates a total unawareness that systems other
than MS Windows may have genuinely different architectures? Why, then
I think we're starting to see signs of an ivory tower academic whose lack
of real-world experience hurts his ability.
Furthermore, your "reasoning" shows no awareness of the time dimension
whatsoever. In Korporate style, you reason that if it's a Rule it must
have always been a Rule (Eurasia has always been at war with Oceania
in 1984; "we have always done it this way").

If you wanted to say "anyone who doesn't have a degree in CS, but attended
college at a time when they were generally available", you should have. And
as noted, my college didn't have CS degrees when I attended, although they
had one of the earlier CS programs.
This is the wikipedia legend, fostered historically by the fact that
in business data processing, companies wanted the appearance of
accurate computation without its reality insofar as the reality would
interfere with profits. For this reason, it has been said repeatedly
that computer science is worthless as a degree, mostly by people who
don't have such a degree. If they are retained in jobs with fat
salaries, they reason that they know their jobs, but it's clear to me
that you don't.

I don't think anyone's claimed that CS is "worthless" as a degree, merely
that it's not strictly necessary. I might well be more effective in some
ways if I had a CS degree. However, it's unclear that the benefit would
justify the time sink.
So brain surgery should be performable by you? Right...

Who would you rather have representing you in court, assuming it were legal
for both to do so?

* Someone who is one course short of a law degree, but who loves to read
case histories and competed in national debate tournaments.
* Orly Taitz, who has a law degree.

-s
 

Seebs

You don't really expect to get a meaningful answer to that question,
do you?

Not really, no. But! He has an unusually high probability of yielding
a claim about C which is untrue in an interesting way, and thus yields
a topical and rewarding discussion.

-s
 

Seebs

Although it's not relevant whether you're gay, and not relevant if you
failed to graduate from high school other than that it might indicate
dysfunction, I would expect you graduated from university. Please
provide on-topic information as to your major.

I did all but a couple of courses of math and philosophy majors, and graduated
with a degree in psychology. The motivation was that I suddenly realized that
I could go do other stuff sooner if I picked the right four courses and
finished any degree, and I didn't feel like another year in college.

It may or may not be relevant that I spent my free time in college hanging
out on computers, learning to program, and helping CS students with their
homework. :p

-s
 

Seebs

"users" and "compiler magic" are meaningless and barbaric words.

Meaningless, I'd dispute. Barbaric, by contrast, isn't even wrong.
This is nonsense, since a||b and a&&b have always been implemented
such that b is not evaluated when a is true (in the case of or) or b
is false (in the case of and).

"Always" may be a bit strong, there may be some stuff in the 1972 era
that wasn't sure about it.

But! That is a special case, introduced because it allowed some particular
idioms to work, such as "s && *s".

It is, to use the term correctly (a rarity), the exception that proves the
rule. The fact that these are called out as a special exception shows that
the rule exists. The reason they're called out as a special exception even
shows you why the rule exists.
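A sketch of that idiom, for the record (show() is a hypothetical wrapper
added only to make the fragment self-contained):

#include <stdio.h>

void show(const char *s)
{
    /* If s is a null pointer, the left operand is 0 and *s is never
       evaluated, so there is no null-pointer dereference. */
    if (s && *s)
        puts(s);
}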
Furthermore, even "back in the day" it
was already known that reordering that breaks code with side effects
was a bad idea.

No, it was known that writing code which would be broken if it were reordered
was a bad idea.
As it happened, certain processors made it convenient for compiler
developers to reorder a()+b() and similar constructs. When they did so
(often in consequence of stupid mistakes very similar to the first way
I constructed a grammar this week so as to right-associate minus and
divide), they blamed the victims of their incompetence. They did not
foresee the use of function calls and penalized the best programmers in
the typical fashion of incompetents.

This is a fascinating bit of revisionist history, but you've yet to provide
any evidence at all that there is any time of which this is true.

You have not shown that there were "stupid mistakes" involved in the
decision to reorder these things. You haven't even hinted at why we should
imagine that you would know.
Wow, just because it got lost in the paperwork doesn't mean it doesn't
exist. In fact, these "tracking systems" were shown, in the official
reports on the (NASA) Columbia and Challenger disasters, to be
excellent ways to cover up errors and (in the words of an official
researcher) to "normalize deviance".

Nice dodge, but not buying it. I do not believe a single defect report has
ever been lost. Could you identify for us some of the gaps in the defect
report sequence which must be true for your claim to hold? Alternatively,
you could simply tell us who raised the issue?
Look at the misuse of the language alone. To say "hey, you idiots, you
are ignoring the fact that we now know how to reorder, not source
code, but its internal representation while preserving correctness,
and for this reason there is no reason, any longer, to say that the
order of operations is invalid" is even in bureaucratese an issue and
not a defect.

No such distinction is made in the standards stuff.

Also, who said anything about reordering source code? Compilers reorder
their internal representation... But even BEFORE they did any such reordering,
the language clearly indicated that the order of execution was intended to
be potentially variable. Why? Because the people doing it were experienced
compiler developers who were familiar with a variety of languages and wished
to do something they felt would be rewarding.
It appears this happened wrt C99.

No, it doesn't.
You do not express yourself well, and I think in general inability to
say what one means here causes flaming and bullying.

That would be one explanation for your stream-of-consciousness abusive posts
in which you somehow never quite manage to provide a shred of evidence for
your claims.
What you seem to be saying is that "C" (considered as the group of
compilers) does things differently and in some cases evaluation is in
the unexpected right to left order, etc.

That's not "unexpected", though, except by you.
This was due in the past
mostly to the state of the art at the time the compiler was designed
and the rage on the part of programmers to think of themselves as
"engineers", "squeezing performance" out of a metaphorical car.

Again, you're ascribing implausible motivations to people
you've never met.
But there was no reason to Standardize this bullshit owing to the fact
that when the Standard was written, we knew how to preserve the
illusion of strictly sequenced source and to optimize in a hidden way.

Sure. We also knew how to render three-dimensional pictures of monkeys,
but we didn't standardize that either. What were we thinking?
Your remit seems to have been to make broken compilers "standard"

Except you've never actually given any evidence that they're "broken",
except that they fail to conform with your personal preferred behavior.
You haven't shown that your preference is a good one.
I think the ignorance is yours.

If so, you could easily resolve it by providing information other than
your unsupported bald assertions.
Standardizing "no defined order" was
your way to make different orders standard and different vendors
happy.

Do you have any evidence to support this claim?
In doing so, you used a bureaucratic procedure and language,
the "knowledge" of which was a form of insider trading.

Do you have evidence to support this claim?
The result is
the manufacture of ignorance and incoherence, because a()||b() cannot
be reordered but a()+b(), apparently, can be reordered.

The "apparently" there suggests that you're not quite clear on this language.

You're quite right that there is a difference between those. There is a
sound logical reason for that difference.

Hint:

Is it ever possible to determine the value of "a()||b()" without evaluating
b()? If so, could that become a useful feature?

By contrast, is it ever possible to determine the value of "a()+b()" without
evaluating b()?

The rule, which is pretty obvious, is that in general you can omit evaluation
when it's not necessary. (The comma operator's evaluation of unused values is
a special case, but is useful for expressing some likely idioms.)
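A brief sketch of that special case (the buffer and loop are hypothetical,
chosen only to show the idiom):

#include <stdio.h>

int main(void)
{
    char buf[16];
    char *p;
    int i;

    /* The comma operator evaluates its left operand, discards the value,
       then evaluates its right operand; here it lets two updates share
       one clause of the for loop. */
    for (i = 0, p = buf; i < 10; ++i, ++p)
        *p = (char)('0' + i);
    buf[10] = '\0';
    printf("%s\n", buf);
    return 0;
}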

-s
 

spinoza1111

I did all but a couple of courses of math and philosophy majors, and graduated
with a degree in psychology.  The motivation was that I suddenly realized that
I could go do other stuff sooner if I picked the right four courses and
finished any degree, and I didn't feel like another year in college.

It may or may not be relevant that I spent my free time in college hanging
out on computers, learning to program, and helping CS students with their
homework. :p

This means as far as I can tell you never learned to discuss CS
outside of overspecific problems and that you may have reinvented some
wheels. A "knack for programming" might take you far, but in my
experience it makes you confuse issues in dialog. An example would be
the way you thought, in "C: The Complete Nonsense", that the existence
of free(NULL) falsifies Herb's point that one needs to balance free()
calls with calls to malloc(), calloc(), and realloc().
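For the record, a small sketch of the behaviour in question (the
Standard makes free(NULL) a no-op, which does not remove the need to
pair each successful allocation with a free):

#include <stdlib.h>

int main(void)
{
    char *p = NULL;

    free(p);              /* well defined: free(NULL) does nothing */

    p = malloc(100);
    if (p != NULL) {
        /* ... use the buffer ... */
        free(p);          /* the allocation still needs its matching free */
    }
    return 0;
}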

You never discussed abstractions in a human context, only specific
problems expressed in a programming language in a machine context, which
made you astoundingly literal-minded to a fault but willing to use
this fault to harm others. Since programming languages seemed to you
to have single meanings, you acted as if Herb could ONLY mean, by his
assertion, that he did not know of free(NULL) (a useless piece of shit),
and as if your interpretation matched in logical force the *frisson*
you got when your code worked.

I do not mean "classes in communications". I mean discussing a
scientific discipline such as computer science and understanding how
the model, say of a concrete stack, can be instantiated in many
different ways as long as one can distinguish its essential
abstraction in the concrete.
 

spinoza1111

It seems to me that to be a "participant observer" in a standards
effort in which the English language was so misused (confusing
"defect" with "issue") and which had been taken over by vendors, would
have been to be corrupted. Perhaps this is why Schildt stayed away.
Not really, no.  But!  He has an unusually high probability of yielding
a claim about C which is untrue in an interesting way, and thus yields
a topical and rewarding discussion.

I think you have lost all rights to use "untrue", since "truth" here
is a document which ratified ignorance and bad practice. It appears to
me now that the Standards effort was an Idiocracy. Sure, you MIGHT
have been a savant who'd not taken computer science, although this is
unlikely: the first computing savants had no such access but would
have been glad to learn what they created in order to advance the
state of the art.

But your lack of formal CS education shows if you can seriously
maintain that the Standard had to feign ignorance so that optimal code
could be generated. As I have tried to explain to you, it is NOT
NECESSARY to make evaluation order undefined in order to optimize
because optimization must preserve the original intent of the code,
and the movement of the code takes place not in source but in its
intermediate representation.

To declare (as both K & R and the Standard apparently have) that
a()+b() must be reorderable for optimization, for efficiency as a
matter of a language standard, is using efficiency, as incompetent
programmers so often use that word, as a coverup. A language that
imposes stricter rules is actually easier to optimize since the
optimizer has more information about the intent of the code.
 
