subroutine stack and C machine model

Seebs

But here we learn that conformance is impossible because to "conform"
you have to run nondeterministically.

Where do you get that? There's nothing anywhere requiring nondeterminism.

"unspecified" doesn't mean "you can't just pick something and do it", it
means "you don't have to pick the same thing all the time *if you don't
want to*".
As to your example, the standard wasn't in common use at the time.

Hardly matters -- the type rules predate the standard by years.
Nice little gag rule. Kiki, "C" doesn't mean "standard C". The
standard is trash. C means "what people code when they say they code
C", because in reality, they are more right than the standard which
was defined as I have said to protect the profits of vendors.

You've said this, but you've produced NO evidence to support it. By
contrast, we've given multiple examples of things which were put into
the standard for reasons clearly contrary to your claim.

Your best attempt so far was your argument about the unspecified order
of evaluation -- which, if it had been a change, might even have been
plausibly argued in some way. But when it turns out that the decision
in question is more than twenty years older than you thought it was,
and predates "vendors" entirely, that sort of undermines your theory.

The problem here is that you don't actually hold beliefs based on arguments.
You have beliefs, and you can make up arguments for them retroactively,
but if the arguments fail, it doesn't matter, you still have the belief.

You've admitted that you don't know much about C. You've proven that
you know virtually nothing about C standardization. And yet, all the
support you have given is "I have said". Well, yes, you have said. But
since it's not true, who cares? That's just another instance of you
being wrong.
Herb was in fact reporting actual practice amongst actually productive
programmers and as such did a far more valuable service.

Well, actually, on the order of evaluation thing, he made two claims,
one of which contradicts yours, and the other of which is blatantly stupid
and did not in fact reflect actual practice on any implementation I've
ever heard of.
There is no
need for standards, especially standards used by companies to increase
profits.

Oh, how dare they come up with a common language definition so that people
can buy whatever compiler they want and expect to know what it does! Those
dastardly fools! They've increased profits! OH NO!!!!!

When you were just talking about vendors increasing profits, your position
was unsupported and basically implausible; it was in the territory where no
response more dignified than "remember, shiny side OUT" is really going
to affect anything.

But *companies* includes the vast majority of users of C. And right you
are; the standard has been used by those companies to increase profits, by
allowing them to develop software more quickly, more reliably, and with
greater confidence that it will port reasonably to new machines, even machines
which hadn't been designed yet when the software was written.
Just do a halfway decent job of language design.

Could you give us a single concrete example of a language you believe has
done a "halfway decent job of language design"? Hint: If it has any
of the traits you bitch about in C, I won't believe you.
The developers of C did not.

You keep saying this, but:

1. You've made it completely clear that you have never had a solid
understanding of C, so your criticisms are, fundamentally, pointed at
something which isn't actually C, but rather your half-crazed imaginings.
2. You've yet to provide a single argument stronger than a totally
unsupported assertion that you are "intelligent" and that other "intelligent"
people will have the same expectations as you, which languages should
therefore conform to.

-s
 
Keith Thompson

spinoza1111 said:
But here we learn that conformance is impossible because to "conform"
you have to run nondeterministically.

This is completely wrong, as has been explained to you at some length.
As to your example, the standard
wasn't in common use at the time.

How does that affect the argument?

[snip]
If you're unwilling to accept the way C defines things, you might
consider finding (or, if necessary, inventing) another language,
which you can discuss elsewhere.

Nice little gag rule. Kiki, [...]

Back in the Bozo bin with you, Spinny. If you need help correcting
your many misconceptions (and believe me, you do), you won't be
getting it from me. You probably won't even understand why.
 
Seebs

Back in the Bozo bin with you, Spinny. If you need help correcting
your many misconceptions (and believe me, you do), you won't be
getting it from me. You probably won't even understand why.

Of course he does. He's presumably counting on it. You can count that
as a concession on the merits.

-s
 
Dik T. Winter

> It can reorder a+b where a and b are unaliased lValues, but a good
> optimizer would not reorder a(...)+b(...), a+b(...), or a(...)+b,
> unless it had full information about the internals of a and b.
>
> Perhaps this is why Algol failed long term. Not knowing how to
> optimize, the developers of Algol adopted what Adorno called the
> "scientifically ascetic" viewpoint. Whereas languages controlled by
> vendors such as early Fortran were more portable as a practical matter
> simply because they ran on IBM machines or clones.

You know that in Fortran in the expression:
A(...) + B(...)
A and B can be evaluated in any order? Worse, given the expression:
A(...) + A(...)
when the argument lists are the same, A need only be called once.

And I have seen many non-portable Fortran programs and libraries in the
course of time.
 
Dik T. Winter

> There are two ways to make code efficient:
>
> (1) As in assembler, manipulate the code "by hand"
> (2) Use a compiler optimizer
>
> In the nondeterminacy of a()+b(), it is impossible to do (1) in the
> sense of performing something inside a() that's used inside b().
>
> It's unnecessary for the order of evaluation of a()+b() to be
> nondeterministic to do (2). This is because modern optimizing
> compilers construct a data structure (often but not always a DAG)
> which finds all knowable dependencies and can rearrange a and b only
> when it's safe.

And how do I express that I do not care in what order it is evaluated, but
that I only want the most efficient way? Note moreover that when 'a' and
'b' come from a different compilation unit, the compiler may not be
able to determine whether there is a dependency at all. That is the
reason that Fortran standards forbid side-effects in functions called within
an expression that have influence on other parts of the expression (one of
those nearly unenforceable prohibitions in the Fortran standards).
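
A minimal C sketch of that separate-compilation point (the file and
function names are hypothetical): the caller's compiler sees only the
declarations, so it cannot prove that a() and b() are independent,
even though their shared state makes the evaluation order observable.

/* other_unit.h -- all the caller's compiler ever sees */
int a(void);
int b(void);

/* other_unit.c -- invisible to the caller's compiler: both functions
   touch the same hidden state */
static int counter = 0;
int a(void) { return counter += 1; }
int b(void) { return counter *= 2; }

/* caller.c -- calling a() first gives 1 + 2 = 3; calling b() first
   gives 0 + 1 = 1, and either result is a conforming outcome */
#include "other_unit.h"
int sum(void) { return a() + b(); }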
 
Dik T. Winter

> Nice little gag rule. Kiki, "C" doesn't mean "standard C". The
> standard is trash. C means "what people code when they say they code
> C", because in reality, they are more right than the standard which
> was defined as I have said to protect the profits of vendors.

Ah, yes, gcc is from a big vendor.
> Herb was in fact reporting actual practice amongst actually productive
> programmers and as such did a far more valuable service. There is no
> need for standards, especially standards used by companies to increase
> profits.

And Microsoft is not.
 
spinoza1111

...
 > Nice little gag rule. Kiki, "C" doesn't mean "standard C". The
 > standard is trash. C means "what people code when they say they code
 > C", because in reality, they are more right than the standard which
 > was defined as I have said to protect the profits of vendors.

Ah, yes, gcc is from a big vendor.

Open Source as it happened has more than served the interests of big
vendors. It is a form of virtual slavery, because while "theft of
intellectual property" is a crime, "theft of intellectual production"
is Standard Operational Bullshit in the computer industry, and it
started with unpaid overtime in Silicon Valley to "meet deadlines".
Open source is another way for vendors to make big bucks while its
foolish contributors are unrecognized and unrewarded. At least
Microsoft pays its people.
 > Herb was in fact reporting actual practice amongst actually productive
 > programmers and as such did a far more valuable service. There is no
 > need for standards, especially standards used by companies to increase
 > profits.

And Microsoft is not.

Microsoft sets standards and this is a good thing.
 
spinoza1111

...
 > There are two ways to make code efficient:
 >
 > (1) As in assembler, manipulate the code "by hand"
 > (2) Use a compiler optimizer
 >
 > In the nondeterminacy of a()+b(), it is impossible to do (1) in the
 > sense of performing something inside a() that's used inside b().
 >
 > It's unnecessary for the order of evaluation of a()+b() to be
 > nondeterministic to do (2). This is because modern optimizing
 > compilers construct a data structure (often but not always a DAG)
 > which finds all knowable dependencies and can rearrange a and b only
 > when it's safe.

And how do I express that I do not care in what order it is evaluated, but
that I only want the most efficient way?

In this case, it should evaluate left to right.

Note moreover that when 'a' and
'b' come from a different compilation unit, the compiler may not be
able to determine whether there is a dependency at all. That is the
reason that Fortran standards forbid side-effects in functions called within
an expression that have influence on other parts of the expression (one of
those nearly unenforceable prohibitions in the Fortran standards).

That you regard it as unenforceable shows you need to learn
optimization theory IMO.
 
Tim Streater

spinoza1111 said:
[...]
But here we learn that conformance is impossible because to "conform"
you have to run nondeterministically.

This is completely wrong, as has been explained to you at some length.
As to your example, the standard
wasn't in common use at the time.

How does that affect the argument?

[snip]
If you're unwilling to accept the way C defines things, you might
consider finding (or, if necessary, inventing) another language,
which you can discuss elsewhere.
Nice little gag rule. Kiki, [...]

Back in the Bozo bin with you, Spinny.  If you need help correcting
your many misconceptions (and believe me, you do), you won't be
getting it from me.  You probably won't even understand why.

The standards do NOT create a fixed language worth knowing in detail,
because they bless, for example, a program which prints a or b in
random order depending on compiler. You fail to see that leaving
things undefined creates multiple languages. One that consistently
evaluates a() and b() left to right implements a different language in
terms of sensible programming language design than one that evaluates
right to left, but both implementations are "standard". This means
that learning "standard C" is impossible.

More of Spinny's "soviet" logic.
It also means that Herb Schildt erred when he agreed to write about
"standard C" because the nondeterminacy of standard C means that it's
literally impossible to write a sensible reference!

I have seen in these discussions, in fact, no reference to any book or
reference on the Standard, other than the Standard, but the Standard's
goal is not to show people how to use C, it's to tell VENDORS if their
compiler is standard. This is in fact the mission of any standard.

Yes you have, bozo, K&R has been mentioned many times.
 
spinoza1111

...
 > It can reorder a+b where a and b are unaliased lValues, but a good
 > optimizer would not reorder a(...)+b(...), a+b(...), or a(...)+b,
 > unless it had full information about the internals of a and b.
 >
 > Perhaps this is why Algol failed long term. Not knowing how to
 > optimize, the developers of Algol adopted what Adorno called the
 > "scientifically ascetic" viewpoint. Whereas languages controlled by
 > vendors such as early Fortran were more portable as a practical matter
 > simply because they ran on IBM machines or clones.

You know that in Fortran in the expression:
       A(...) + B(...)
A and B can be evaluated in any order?  Worse, given the expression:
       A(...) + A(...)
when the argument lists are the same, A need only be called once.

And I have seen many non-portable Fortran programs and libraries in the
course of time.

And this infantile disorder is a precedent we need to use?
 
spinoza1111

Of course he does.  He's presumably counting on it.  You can count that
as a concession on the merits.

You guys are a riot, one major reason: you have never disambiguated
convenience, administration and truth.

"Il servo padrone. [Italian: the master as servant] – In regards to
the dull-witted tasks, which are demanded by the ruling culture from
subordinate classes, these latter become capable of such solely
through permanent regression. Precisely what is unformed in them is
the product of social form. The creation of barbarians through culture
is however constantly deployed by this latter, in order to preserve
its own barbaric essence. Domination delegates the physical violence,
on which it rests, to the dominated."

- TW Adorno
 
spinoza1111

Where do you get that?  There's nothing anywhere requiring nondeterminism.

What, pray, means "undefined", then?
"unspecified" doesn't mean "you can't just pick something and do it", it
means "you don't have to pick the same thing all the time *if you don't
want to*".

That's nondeterminism, chump.
Hardly matters -- the type rules predate the standard by years.


You've said this, but you've produced NO evidence to support it.  By
contrast, we've given multiple examples of things which were put into
the standard for reasons clearly contrary to your claim.

Your best attempt so far was your argument about the unspecified order
of evaluation -- which, if it had been a change, might even have been
plausibly argued in some way.  But when it turns out that the decision
in question is more than twenty years older than you thought it was,
and predates "vendors" entirely, that sort of undermines your theory.

You had a chance to change it and you blew it.
The problem here is that you don't actually hold beliefs based on arguments.
You have beliefs, and you can make up arguments for them retroactively,
but if the arguments fail, it doesn't matter, you still have the belief.
The arguments don't fail. It's absurd to allow a()+b() to be reordered
when this isn't done for a()||b(). And you apparently don't even
understand my contention: that the "indeterminacy" is not a positive
and observable thing, because most C programmers stay with one
compiler or a narrow range, and they see determinacy...yet are blamed
for your screwup because they haven't read the standard!

You've admitted that you don't know much about C.  You've proven that

No, I have said that while I was asked by Princeton to teach C, and
assist a Nobelist with C, and delivered classes in C at a Chicago
corporation, I realized that C was inadequate for development and
stopped using it circa 1993. Therefore, I've probably "forgotten more
than you know".

Since programming languages are not hard to learn if one has a solid
foundation (which you appear to lack) I am rapidly getting back up to
speed, not because I like C, but because I think it's repellent what
you do in its name to people like Navia and Schildt. I think your
behavior is unacceptable and if I have to relearn C to tear you a new
asshole, I will.

I advise you in fact to review the code I posted in response to the
request for guidance to the poster on infix to Polish. I challenge you
to post a Vicious Little Tirade about the errors in this code. It is
the result of my relearning process and it took two hours.
you know virtually nothing about C standardization.  And yet, all the

What I have learned makes me sick.
support you have given is "I have said".  Well, yes, you have said.  But
since it's not true, who cares?  That's just another instance of you
being wrong.


Well, actually, on the order of evaluation thing, he made two claims,
one of which contradicts yours, and the other of which is blatantly stupid
and did not in fact reflect actual practice on any implementation I've
ever heard of.

Reproduce it here.
Oh, how dare they come up with a common language definition so that people
can buy whatever compiler they want and expect to know what it does!  Those
dastardly fools!  They've increased profits!  OH NO!!!!!

If the order of evaluation of a()+b() is undefined across compilers
they cannot do this.
When you were just talking about vendors increasing profits, your position
was unsupported and basically implausible; it was in the territory where no
response more dignified than "remember, shiny side OUT" is really going
to affect anything.

The rule you apply here is administrative, although expressed
colloquially: "don't bite the hand that feeds you".
But *companies* includes the vast majority of users of C.  And right you
are; the standard has been used by those companies to increase profits, by
allowing them to develop software more quickly, more reliably, and with
greater confidence that it will port reasonably to new machines, even machines
which hadn't been designed yet when the software was written.

It doesn't work that way. Instead, "consultants" make fat hourlies
while working excessively to port the code. Business is irrational and
barbaric. The only reason C is preserved is that C programmers won't
learn better languages.
Could you give us a single concrete example of a language you believe has
done a "halfway decent job of language design"?  Hint:  If it has any
of the traits you bitch about in C, I won't believe you.

C Sharp & Java. Both of these reject reordering of a()+b() and after
precedence, evaluate left to right.
You keep saying this, but:

1.  You've made it completely clear that you have never had a solid
understanding of C, so your criticisms are, fundamentally, pointed at
something which isn't actually C, but rather your half-crazed imaginings.
2.  You've yet to provide a single argument stronger than a totally
unsupported assertion that you are "intelligent" and that other "intelligent"
people will have the same expectations as you, which languages should
therefore conform to.

Yes, I am, and I refuse, any more, to be made into a savage by the
tools of production. 1401 assembler language and Cobol were bad
enough. But the hypocrisy of C was that it was presented as a modern
and "clean" language.

And yes, I am a highly intelligent person, so much so, in fact, that
in communities of stupid people some think me crazy. Why do you
suppose Nash lived in Princeton, and not Roanoke, where he tried
living with his sister (cf Sylvia Nasar's book)?

It's because he was thought "strange". Likewise, the morons here act
like 14 year olds when I use a complex sentence.

It also drives narrow and brutalized specialists like Heathfield, you
and Thompson that I didn't have to make "learning C" a lifetime
calling. Narrow and feeble little men love finding something that they
can pretend is complicated, like the lawyers in Dickens' Bleak House.

The fact is that each one of us has had to sell too much of his time
and spirit to some uncaring corporation or other organization. Nearly
all of you have had to work unpaid overtime.

Techies console themselves for this waste of spirit in an expense of
shame by pretending that they are learning something Important on the
job and here, it's a pathological programming language.

Nobody's capable of thinking themselves at all "intelligent" unless
constantly and addictively reassured by meaningless victories over
code that was in most cases poorly designed in a hack o rama, like
rats pressing a lever.

I know I'm intelligent, and for this reason when I had only "symbolic
processing system" I was insulted by the long hours I had to work to
produce reliable and quality solutions for Roosevelt University (not
that they gave a ****, of course) and I worked nights to fix Fortran.
I have never in fact understood the love for make-work programming
languages like C, which I now believe constitute welfare for
inadequate white males.

I saw musicians and writers work at Bell Northern Research as
programmers and "technical" writers only to be deprived, year after
year, of autonomy and common decency by a criminal management at
Northern Telecom that was raping the company, and then tossed aside
with neither pensions nor, outside of Canada, health insurance.
Therefore I am not at all impressed by the savagery of your Vicious
Little Tirade because people are more important than "correct"
software, especially because your claims that it doesn't matter
whether a or b is printed first shows you wouldn't know correct
software if it bit you in the butt.
 
spinoza1111

[...]
But here we learn that conformance is impossible because to "conform"
you have to run nondeterministically.

This is completely wrong, as has been explained to you at some length.
As to your example, the standard
wasn't in common use at the time.

How does that affect the argument?

[snip]
Nice little gag rule. Kiki, [...]

Back in the Bozo bin with you, Spinny.  If you need help correcting
your many misconceptions (and believe me, you do), you won't be
getting it from me.  You probably won't even understand why.

The standards do NOT create a fixed language worth knowing in detail,
because they bless, for example, a program which prints a or b in
random order depending on compiler. You fail to see that leaving
things undefined creates multiple languages. One that consistently
evaluates a() and b() left to right implements a different language in
terms of sensible programming language design than one that evaluates
right to left, but both implementations are "standard". This means
that learning "standard C" is impossible.

It also means that Herb Schildt erred when he agreed to write about
"standard C" because the nondeterminacy of standard C means that it's
literally impossible to write a sensible reference!

I have seen in these discussions, in fact, no reference to any book or
reference on the Standard, other than the Standard, but the Standard's
goal is not to show people how to use C, it's to tell VENDORS if their
compiler is standard. This is in fact the mission of any standard.

It was probably McGraw Hill marketing that had the bright idea of
making, in 1991, a "complete reference" on C, and in 1999 a book on
"standard" C, and they asked Herb to write these books. I didn't
suggest a book on compilers to Apress: Dan Appleman responded to a
note from me, praising one of his articles, with an enquiry as to
whether I wanted to write a book about practical .Net compiling.

At worst, Schildt got his tits in the wringer and made the best of a
bad situation. He did not deserve the shitstorm of abuse he got.
 
Seebs

In this case, it should evaluate left to right.

Even if that's 20% slower, and the order makes no difference to the user?
That you regard it as unenforceable shows you need to learn
optimization theory IMO.

Perhaps you could point us at a system that enforces it in all cases?

-s
 
Seebs

What, pray, means "undefined", then?

"undefined" means anything is permissible. It doesn't mean you have to do
something unpredictable.

Note that "undefined" and "unspecified" are NOT the same thing.
That's nondeterminism, chump.

No, it isn't.

Here's the difference. You have advocated that the standard should
*require* left-to-right evaluation. In fact, the standard makes no
requirements as to order of evaluation except when there's a sequence
point.

You have claimed that the standard "requires nondeterminism". If this
were the case, the standard would be *prohibiting* left-to-right evaluation,
but in fact, it does not; an implementor is free to implement left-to-right
evaluation, just as an implementor is free to implement right-to-left
evaluation, or evaluation which is left-to-right for most functions
and right-to-left for variadic functions.

There is a difference between "you may do whatever you wish" and "you must
do something random".
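
A minimal sketch of the distinction (function names hypothetical): both
possible outputs below are conforming, and so is a compiler that always
produces the same one.

#include <stdio.h>

static int f(void) { puts("f ran"); return 1; }
static int g(void) { puts("g ran"); return 2; }

int main(void)
{
    /* The order in which f() and g() are called is unspecified:
       "f ran" before "g ran" is correct, and so is the reverse.
       Nothing obliges an implementation to vary its choice. */
    int x = f() + g();
    printf("%d\n", x);   /* always prints 3 */
    return 0;
}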
You had a chance to change it and you blew it.

Except that you have yet to make any argument stronger than "some random
guy on Usenet claims without support that intelligent people would expect
this".
The arguments don't fail.

They certainly do. Hint: If you haven't persuaded anyone, that usually
means your argument has failed.
It's absurd to allow a()+b() to be reordered
when this isn't done for a()||b().

Except you've already been told why it isn't.
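
For what it's worth, the sequencing of || is not an arbitrary
inconsistency: || and && have a sequence point after the left operand
because short-circuit evaluation is load-bearing in idiomatic C. A
small sketch (hypothetical names):

#include <stddef.h>

struct node { int value; };

/* Well defined only because || evaluates left to right with a
   sequence point: p is tested before p->value is read, and the
   right operand is skipped when the left one settles the result. */
int is_empty_or_zero(const struct node *p)
{
    return p == NULL || p->value == 0;
}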
And you apparently don't even
understand my contention: that the "indeterminacy" is not a positive
and observable thing, because most C programmers stay with one
compiler or a narrow range, and they see determinacy...yet are blamed
for your screwup because they haven't read the standard!

Name one.

Seriously, you're claiming "most C programmers" are confused by this, but
you've yet to demonstrate that even ONE programmer has been confused by
this. Every book I'm aware of states it explicitly, and I have never in
all my years of programming and reading C code seen a single person make
a mistake with this.
No, I have said that while I was asked by Princeton to teach C, and
assist a Nobelist with C, and delivered classes in C at a Chicago
corporation, I realized that C was inadequate for development and
stopped using it circa 1993. Therefore, I've probably "forgotten more
than you know".

You currently don't know much about it, and demonstrably, your beliefs about
C are full of crazy.

I've met many people who have taught classes they were woefully unqualified
for.
Since programming languages are not hard to learn if one has a solid
foundation (which you appear to lack)

You haven't really established this. FWIW, I've never had any trouble picking
up programming languages.
I am rapidly getting back up to
speed, not because I like C, but because I think it's repellent what
you do in its name to people like Navia and Schildt. I think your
behavior is unacceptable and if I have to relearn C to tear you a new
asshole, I will.

Good luck with that. If I'm wrong, I anticipate being corrected.
I advise you in fact to review the code I posted in response to the
request for guidance to the poster on infix to Polish. I challenge you
to post a Vicious Little Tirade about the errors in this code. It is
the result of my relearning process and it took two hours.

Wow.

Well, I did indeed respond to it. I just sorta skimmed after a while,
there was not much point.
What I have learned makes me sick.

It ain't what you don't know that gets you, it's what you know that ain't
so.

I have yet to see you make a claim about C standardization that was not
false.
Reproduce it here.

SURRENDER BAD C!

Okay. C: TCR, 3e, page 735:

The way an order-of-evaluation error usually occurs is through
changes to an existing statement. For example you may enter
the statement
x = *p++;
which assigns the value pointed to by p to x and then increments
the pointer p. Say, however, that you later decide that x
really needs the value pointed to by p squared. To do this,
you try
x = *p++ * (*p);
However, this can't work because p has already been incremented.
The proper solution is to write:
x = *p * (*p++);
Errors like this can be very hard to find.

We can't tell whether he thinks the parentheses matter, or whether he
thinks this is evaluated left-to-right. You might think it's the latter,
but consider page 56:

The ANSI C standard does not specify the order in which the
subexpressions of an expression are evaluated. This leaves the
C compiler free to rearrange the expression to produce more optimal
code. However, it also means that your code should never rely
on the order in which subexpressions are evaluated. For example,
the expression
x = f1() + f2();
does not ensure that f1() will be called before f2().

Presumably, then, he thinks the parentheses change this or something? We
may never know.
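
For completeness, a well-defined rewrite of that example (a sketch, with
invented data): the trouble with both of Schildt's variants is that p is
modified and read again in the same expression with no sequence point in
between, which is undefined behavior, not merely an order-of-evaluation
question. Separating the read from the increment removes the problem.

#include <stdio.h>

int main(void)
{
    int data[] = { 3, 4, 5 };
    int *p = data;
    int x;

    x = *p * *p;   /* square the value p points to: 3 * 3 = 9 */
    p++;           /* then advance the pointer, as a separate statement */

    printf("%d\n", x);   /* prints 9 */
    return 0;
}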
If the order of evaluation of a()+b() is undefined across compilers
they cannot do this.

Unspecified, not undefined. The terms have precise meanings.

And yes, they can: They can write code which does not depend on that
decision. Which everyone I have ever met has been doing successfully
for years.
The rule you apply here is administrative, although expressed
colloquially: "don't bite the hand that feeds you".

No, it's actually much simpler: "Don't listen to the crazy guy who has
never presented even a single anecdotal story to support his claims."

Come on, at least give us ONE concrete example of a vendor laying off
compiler developers because they had a guy who would have been working on
fixing order of evaluation if it had changed, but then it didn't get
changed so they fired him.
It doesn't work that way.

Well, it does, actually.
Instead, "consultants" make fat hourlies
while working excessively to port the code.

Please let us know where these "fat hourlies" are available for porting C
code. I've never seen it.
Business is irrational and
barbaric. The only reason C is preserved is that C programmers won't
learn better languages.

And partially because, for many of the things C is good at, no better
languages are yet on offer.
C Sharp & Java. Both of these reject reordering of a()+b() and after
precedence, evaluate left to right.

Can't speak to C Sharp much. (Obviously, "portability" is a moot point
there.) But let's keep Java in mind, then.
And yes, I am a highly intelligent person, so much so, in fact, that
in communities of stupid people some think me crazy.

I'm glad for you. I'm not especially dumb myself. But that doesn't
mean that your intuition is the gold standard of language design. SURRENDER
EVIDENCE!
It also drives narrow and brutalized specialists like Heathfield, you
and Thompson that I didn't have to make "learning C" a lifetime
calling.

You dropped a word, but actually, you'd be wrong anyway. I wouldn't call
"learning C" a lifetime calling. It's been one of my favorite hobbies, but
it's not as though I've spent more time on C than I have on, say, Dungeons
& Dragons, or reading science fiction, or playing video games, or writing.
The fact is that each one of us has had to sell too much of his time
and spirit to some uncaring corporation or other organization. Nearly
all of you have had to work unpaid overtime.

Probably, although $dayjob's pretty good about making it up to us.
Nobody's capable of thinking themselves at all "intelligent" unless
constantly and addictively reassured by meaningless victories over
code that was in most cases poorly designed in a hack o rama, like
rats pressing a lever.

Actually, I'm pretty much capable of thinking myself intelligent regardless,
because I have a psych degree and have been through more various tests
of cognitive function than I once suspected had ever been developed.
Therefore I am not at all impressed by the savagery of your Vicious
Little Tirade because people are more important than "correct"
software, especially because your claims that it doesn't matter
whether a or b is printed first shows you wouldn't know correct
software if it bit you in the butt.

You keep doing all this ranting, but you never actually present evidence
for your claims. You brag about how qualified you are to present the
evidence. Fine, let's assume you're qualified. So present the evidence
already!

-s
 
Richard Bos

Ben Pfaff said:
Seebs said:
I guess I tend to view clashes like that as pretty minor and easily
resolved. I'd expect either of you to be pretty easy to get along
with. [...]

You might be right.

Hey, let's try it out. Any comp.lang.c regulars need a job? My
company is hiring ;-)

Yeah, me, but I'm not moving to the States for it...

Richard
 
Flash Gordon

Richard said:
Ben Pfaff said:
Seebs said:
I guess I tend to view clashes like that as pretty minor and easily
resolved. I'd expect either of you to be pretty easy to get along
with. [...]
You might be right.

Hey, let's try it out. Any comp.lang.c regulars need a job? My
company is hiring ;-)

Yeah, me, but I'm not moving to the States for it...

I wouldn't mind going to the States, but it would depend on what the job
on offer was. ;-)
 
Moi

Name one.

Seriously, you're claiming "most C programmers" are confused by this,
but you've yet to demonstrate that even ONE programmer has been confused
by this. Every book I'm aware of states it explicitly, and I have never
in all my years of programming and reading C code seen a single person
make a mistake with this.

You probably have forgotten about Herb Schildt. Herb was the kind of
red-blooded American guy who only needed to buy a Borland compiler
and get himself a copy of the C standard to write a book about the
infamous C programming language standard, that almost won him
the Pulitzer Prize, and he was nominated for the Nobel prize
for economics, too!


But he taught it at Princeton, you fool!

OMG

AvK
 
Seebs

You probably have forgotten about Herb Schildt. Herb was the kind of
red-blooded American guy who only needed to buy a Borland compiler
and get himself a copy of the C standard to write a book about the
infamous C programming language standard, that almost won him
the Pulitzer Prize, and he was nominated for the Nobel prize
for economics, too!

Heh.

Actually, Schildt got that right in at least one place in C:TCR.

-s
 
Nick Keighley

Seriously, you're claiming "most C programmers" are confused by this, but
you've yet to demonstrate that even ONE programmer has been confused by
this.  Every book I'm aware of states it explicitly, and I have never in
all my years of programming and reading C code seen a single person make
a mistake with this.

I think there may be a bit of "luck" involved here. I suspect that if
a sample of C programmers were asked which order f and g are called
in in f() + g(), most would answer "f". The reason they hardly ever
get bitten by this is a natural shyness of obscure code. And exploiting
multiple functions for their value and their side effects might be
regarded as obscure (even if they didn't quite phrase it that way). The
best one I can think of is some sort of stream reader, but then you'd
have to be unlucky enough to do arithmetic on the values.

/* calculate number of entries from start and end */
num_vals = -(read_val() - read_val());

which would probably cause most people to blink.

/* calculate velocity from distance and time */
velocity = read_val() / read_val();

/* read word */
word = read_val() << 8 | read_val();

the last one is plausible
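
One way to keep that last idiom while pinning the order down is to put
the two calls in separate statements, so the reads are sequenced. A
sketch, reusing the hypothetical read_val() and word from the fragments
above:

/* read word, high byte first, with a guaranteed order: the statement
   boundary sequences the first read_val() before the second */
unsigned high = read_val();
unsigned low  = read_val();
word = (high << 8) | low;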


"ALGOL 60 was a language so far ahead of its time that it
was not only an improvement on its predecessors but also
on nearly all its successors".
--C.A.R. Hoare

Can't speak to C Sharp much.  (Obviously, "portability" is a moot point
there.)  But let's keep Java in mind, then.

I've heard it said that Java and C Hash are equally
unportable, as both only run on a single platform (their VM)
:)


speak for yourself


<snip>
 
