[Note to Seebs - the very last paragraph is just for you.]
In <[email protected]>, spinoza1111 wrote:
A programmer who has a certain expectation of the way in which his
program should work, which is violated by a bad software system, is
to me more intelligent than the people who created the bad system.
He's what Kant would call "the citizen of a better world".
If you are referring to left-to-right evaluation, you have yet to
demonstrate that a significant number of programmers expect that
order, and have yet to acknowledge that unspecified evaluation order
is a hallmark of a great many languages, not just C. Do you consider
them all flawed for that reason?
Unspecified evaluation order was a "hallmark" of older programming
languages because it was thought to be something appropriately
determined by the programmers of compilers...for the same reason that
different compilers for the same language, in the early days, gave
variorum results for Boolean evaluation, parameter sequencing, and
expression operand sequencing beyond precedence. Notoriously, the
obsolete language APL enforced right-to-left evaluation.
This was unnoticed because most programmers stayed on one compiler, so
each compiler's choice was a de facto standard.
Then, in the case of C (an *older* language), vendors who didn't
want to change compilers without a "business case" (money to be made
by the wealthy) enforced the "standard" that there was no usable
standard, preferring to impose nondeterminacy on C and make
intelligent people look stupid...all in service of Holy Private
Property.
But taking an idiot vote including old practice doesn't decide the
issue. Instead we find that newer, and more truly standardized,
languages enforce determinism, in part because optimization is
possible without making source code non-deterministic:
Java was designed as a reliable replacement for C in applications. Here is
the deal with Java:
http://java.sun.com/docs/books/jls/second_edition/html/expressions.doc.html:
"The Java programming language guarantees that the operands of
operators appear to be evaluated in a specific evaluation order,
namely, from left to right." Note that they need only "appear" to be
evaluated in left-to-right order: the code can still be sensibly
optimized.
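For the record, here is a minimal C sketch of my own (a() and b() are
hypothetical) showing what is at stake when the order is left
unspecified, the very thing Java pins down:

    #include <stdio.h>

    static int trace = 0;

    static int a(void) { trace = trace * 10 + 1; return 1; }
    static int b(void) { trace = trace * 10 + 2; return 2; }

    int main(void)
    {
        /* C leaves the order of a() and b() unspecified. */
        int sum = a() + b();

        /* Left-to-right (the Java guarantee) leaves trace == 12;
           right-to-left leaves trace == 21. The sum is 3 either
           way, but the side effects differ. */
        printf("sum = %d, trace = %d\n", sum, trace);
        return 0;
    }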
Microsoft, which is powerful enough to enforce de facto standards over
and above conformance to multivendor standards, spares itself trouble
by declaring in MSDN that the order is NOT non-deterministic:
"The precedence and associativity of C operators affect the grouping
and evaluation of operands in expressions. An operator's precedence is
meaningful only if other operators with higher or lower precedence are
present. Expressions with higher-precedence operators are evaluated
first. Precedence can also be described by the word "binding."
Operators with a higher precedence are said to have tighter binding."
"The following table summarizes the precedence and associativity (the
order in which the operands are evaluated) of C operators, listing
them in order of precedence from highest to lowest. Where several
operators appear together, they have equal precedence and are
evaluated according to their associativity."
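What the MSDN table describes can be seen in a few lines of C (my own
illustration, not Microsoft's): associativity fixes the grouping of
operands.

    #include <stdio.h>

    int main(void)
    {
        /* '-' is left-associative, so this groups as (10 - 4) - 3. */
        int r = 10 - 4 - 3;

        printf("%d\n", r); /* prints 3, not the 9 of 10 - (4 - 3) */
        return 0;
    }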
C's nondeterminacy is recognized as a bug and not a feature in
academia. This journal article recognizes "sequence points" as a C
idiom, and idioms are usually signs of a language mistake:
http://journals.cambridge.org/actio...72398F7187C.tomcat1?fromPage=online&aid=54521
"The presence of side effects in even a very simple language of
expressions gives rise to a number of semantic questions. The issue of
evaluation order becomes a crucial one and, unless a specific order is
enforced, the language becomes non-deterministic. In this paper we
study the denotational semantics of such a language under a variety of
possible evaluation strategies, from simpler to more complex,
concluding with unspecified evaluation order, unspecified order of
side effects and the mechanism of sequence points that is particular
to the ANSI C programming language. In doing so, we adopt a dialect of
Haskell as a metalanguage, instead of mathematical notation, and use
monads and monad transformers to improve modularity. In this way, only
small modifications are required for each transition. The result is a
better understanding of different evaluation strategies and a unified
way of specifying their semantics. Furthermore, a significant step is
achieved towards a complete and accurate semantics for ANSI C."
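To make the abstract concrete, here is a short sketch of my own (not
from the paper) separating the two cases it studies: unspecified order
of side effects, and outright undefined behavior between sequence
points.

    #include <stdio.h>

    static int a(void) { puts("a"); return 1; }
    static int b(void) { puts("b"); return 2; }

    int main(void)
    {
        int i = 1;

        /* Unspecified order: a() and b() each run exactly once, but
           "a" before "b" and "b" before "a" are both legal outputs. */
        int sum = a() + b();

        /* Undefined behavior: i is modified twice with no intervening
           sequence point; the Standard promises nothing about n or i. */
        int n = i++ + i++;

        printf("%d %d\n", sum, n);
        return 0;
    }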
The C Sharp standard, ISO/IEC 23270, says:
"Except for the assignment operators and the null coalescing operator,
all binary operators are left-associative, meaning that operations are
performed from left to right."
Alas, that is *so* true.
Any stick to beat him with, eh? Too academic, not academic enough,
good at mathematics (or "autistic", as you appear to prefer),
uneducated and yet elitist... Emerson redoubled in spades.
Yes, any stick that's appropriate. To insist on irrelevancy may
"sound" academic, but in truth, it's not. In fact, programming needs
more academic theory.
But just because the theory and its consequent praxis emerge from
outside Microsoft, it does not in logic follow that they are merely
academic, or any less idiomatic than Microsoft's.
It was also the case in the mainframe era that programmers at non-IBM
companies, each one of the "seven dwarfs", consoled themselves with
the belief that they were superior beings using high theory. This was
arguably true only of Burroughs; for the rest, it's just silly to use
so crude a measure of academic superiority.
Today, Microsoft programmers are regarded with scorn by people who
know little about Microsoft's actual design procedures, and who are
themselves incompetent in their own specialties.
Feel free to try to back up that claim.
Firstly, C99 *is* the (de jure) C Standard. Secondly, it's a standard
to which Microsoft's compiler does not conform. The ++ operator
requires an operand that is a modifiable lvalue, not only in C99 but
also in C89 and indeed in C compilers that pre-date C89.
Microsoft enforces this rule. So in what way is Microsoft
nonconformant? In not being non-deterministic as regards expression
evaluation?
Note that in fact few or no compilers, Microsoft or other, actually
CONFORM to the nondeterminacy called for in the standard in a()+b().
This is because most compile in a particular sequence, and in the
preponderance of cases this will be left to right. To conform
literally, the C compiler would have to use random number generation
to make the order indeterminate.
This shows the near-criminal misuse of standardization, for making
nondeterminacy a standard was not a favor to coders, nor did it
improve, or for that matter even "standardise", C semantics.
Quite the reverse, for non-determinacy is by definition not standard!
A useless non-determinacy was made the standard to retroactively bless
as many compilers as possible, to pimp Microsoft, and to preserve Holy
Profits, Batman.
Any stick to beat him with.
Quit whining. I have a lot of sticks because my case sticks, mate.
In programming, the details *matter*.
To nasty little clerks. The rest of us automate the detail work.
Then please get on with it.
That's bullshit and you know it. You've learned about most of my
errors when I've admitted them and/or a guy like Bacarisse has found
them. I've corrected them, most recently the grammar error in the
parser discussion, where tonight I've posted C Sharp code to parse
using the corrected grammar.
Eventually, sometimes. You will learn faster if you stop assuming
people are wrong to correct you.
Ben is right to correct me most of the time: Peter some of the time.
You, almost never, but sometimes, such as when you complimented me on
my knowledge of a simple word pair. Keep improving.
Presumably the difficulty in optimising C explains why it wipes the
performance floor with other languages.
Wrong answers are still wrong answers when arrived at fast. What you
imply may be true: C may be more resistant to optimization. But this
means that some idiot's opinion trumps the collective wisdom of
automated optimization.
Autism is not a learning disorder. It is a neural development
disorder. Not quite the same thing.
I see no paradox in Seebs being correct. He is correct (most of the
time) because he has taken the trouble to learn the language.
I think that the design of C is so poor that learning it destroys
other parts of the brain. Dickens saw this in the lawyers of Bleak
House whose knowledge of Jarndyce destroyed them in all other
respects.
because the need for intelligent interpretation is beyond you,
Actually, his articles reek of intelligent interpretation.
His articles don't give any hint that he feels threatened. He
occasionally gets exasperated, but who doesn't? But not threatened.
and when you see others make them
you are horrified by way of psychological transference.
No, he's just pointing out that they're errors.
["Oh my they might laugh at me like back in school."]
Projecting again?
But, in programming, we know how to deal with errors.
Well, we do. It is not evident that you are particularly skilled in
that area. Acknowledging errors is a vital precursor to dealing with
them, and you're weak in that area.
SO IS DECENCY AND RESPECT, and not jumping to unwarranted conclusions
about what Schildt does or does not know based on his attempt to be
clear...especially when you concede that the attempt is successful.
People won't admit errors in an environment dominated by autistic
twerps who globally question their competence based on one data point.
You made a FOOL out of yourself pulling that shit on me in 2003 when
you so generalized based on one data point, that being my use, for
readability, of repeated limit evaluation in a for loop. You were
later embarrassed when people brought your attention to my book.
Using "autistic" as a pejorative is just pathetic. As for twerps,
well, twerp is as twerp does. Seebs's articles do not seem to me to
be particularly twerpoid.
In view of the foul abuse which you have enabled from the zanies
here, "autistic twerp" is both documented and defensible.
At least you have the good sense not to learn it from Schildt books.
So there's some hope for you yet. (Unfortunately, Schildt is by no
means the only C author who doesn't know C very well, so beware.)
Get it straight. For the same reason that clarity implies
understandability, and understandability implies truth, the knowledge
of mistakes coupled with the belief that mistakes make for a more
"efficient" language should NOT be called knowledge at all, just as a
lawyer in Bleak House who knows Jarndyce and nothing else does not
know the law. C's nondeterminacy was a mistake.
In fact, the programmer who codes a()+b() is smarter than the twerp of
a compiler developer who inverts the order for shits and giggles. This
is because the twerp, when he decides to get cute and invert the
order, perhaps because of some feature of long-dead hardware, is
HIMSELF probably forgetting that in the language in which he considers
himself an Expert, the operands may have side effects!
Whereas the intelligent Java or C Sharp programmer, forced to maintain
some Fat Bastard's C code, has learned properly of propriety only to
be blind-sided by indeterminacy.
Worse, his code works because in so many cases the order is left to
right, but another Fat Bastard who sleeps with the Standard under his
pillow tells him the code is buggy because it is "not standard" and
"might" not work in the (unlikely) event that we chuck the PC and get
a Univac mainframe.
Whereupon the intelligent programmer tells Fat Bastard to take a hike.
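Or, if one must appease Fat Bastard, the fix is at least mechanical.
A sketch, again with hypothetical a() and b(): the semicolon ending
each declaration is a sequence point, so these statements force the
left-to-right behavior that the bare expression a()+b() does not
guarantee.

    #include <stdio.h>

    static int a(void) { puts("a"); return 1; }
    static int b(void) { puts("b"); return 2; }

    int main(void)
    {
        int ra = a();      /* the ';' is a sequence point: a() is done... */
        int rb = b();      /* ...before b() begins */
        int sum = ra + rb; /* same sum, deterministic side effects */

        printf("sum = %d\n", sum);
        return 0;
    }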
I agree. I don't believe it will convince you either.
*This* is the mainframe era. As for covering up errors, that's a
losing strategy. Those who value correctness in others should also
value it in themselves. You can't correct an error if you won't
acknowledge it.
Equivocation. I admit technical errors, but what you call "errors" are
mostly matters of opinion, and you're one of those pub ranters who
must always be right. What's worse: you're a sober pub ranter.
That's why the better authors provide errata pages. Could you please
point me to Schildt's comprehensive and well-maintained errata page?
I can't seem to find it anywhere.
What's the point in learning wrong stuff?
You learned about learning not in institutions of learning, or it
didn't take, because what you think of as "learning" is what they
teach bairns in Borstal, Army recruits, and employees in corporate
training classes. "Learning" is nothing like being bawled at by a
Sergeant
Major to disassemble a Lee-Enfield his way lest the Fuzzy Wuzzies
conquer your sorry ass. It is a DIALOGUE between teacher and student
in which the teacher might not always be right. It is one of MUTUAL
respect in which the teacher, unlike the Sergeant Major, does not
belittle the student, any more than the student belittles the
teacher as Seebach belittles Schildt.
Really? When was that?
Bad organisation, not bad content.
I thought you said he was an autistic twerp. Perhaps you think he's a
worthwhile and talented autistic twerp?
Yes. A worthwhile and talented autistic twerp.