Syntax for union parameter


Rick C. Hodgin

Do you have any reason why you would have chosen to code in a manner
contrary to the one that you yourself have prescribed?

Do you mean why I use Microsoft's compiler instead of GCC? If so, it
is exactly, and only, for one precise reason: edit-and-continue abilities.
If so, then in
what sense have you prescribed it? If not, then they are in disagreement
with you, if only with regard to whether or not the reason they had was
sufficient to justify ignoring your prescription.

I view edit-and-continue as a sufficient enough advantage to continue using
Microsoft products (until I get my own toolset completed) despite even fairly
significant deviations from other well known compiler standards or practices.
I can re-code those variations (should there be any in the type of coding I
do) faster than I can develop code using GCC from the get-go without the
many advantages of the Visual Studio IDE and related plugins.

If your point was something else, I did not understand it.

Best regards,
Rick C. Hodgin
 

Keith Thompson

Rick C. Hodgin said:
Seriously, James? Are we back to this (again)? *SIGH* This will be my
last post to you on any issue relating to whether or not compiler authors
agree with me.

The scope here is "on this point." The GCC compiler authors agree with
me (on this point) because they wrote their compilers to operate as I
have indicated. Regardless of their underlying motives, this is what
they wrote. It is the way I have indicated. We are in agreement (on
this point).

I think you're using the word "agree" in a manner different from the way
James and I (and probably most people) understand it.

If the gcc developers "agreed" with you, it would mean that they share
your thoughts about how expressions *should* be evaluated, that the
order you prefer is right and any other order is wrong.

What I strongly suspect is that they do not share your opinions, and
that if their compiler happens to generate code that behaves in a way
that conforms to your expectations, it's coincidental. They could,
without changing their minds about anything, release a new version
tomorrow that behaves differently (in areas where the C standard doesn't
specify the behavior).

Compilers are complicated things. A particular order of evaluation
might be an emergent property that the designers didn't intend.

[...]
*SIGH* By your own line of reasoning whether or not they actually
disagree with me cannot be known. For example, you say that the GCC
authors could've disagreed with me, but for whatever reason chose to
code in the manner I prescribe. Well, the same applies here, James.

It could be known if someone asked them, or if they've already said
something about it.

[...]
Applying this same logic to GCC, either way, their authors are agreeing
with me.

Ok, you're definitely using the word "agreeing" in a manner that's
inconsistent with my understanding of the word.

Suppose, hypothetically, that you think a person should always start
walking on his or her left foot, and that starting on the right foot is
simply wrong. I would disagree with you on that point; I don't think it
matters which foot I start on. If I happen to start on my left foot,
that doesn't imply that I agree with you; I still think you're wrong
in your idea that it matters.

[...]
I've never seen a language that pushes left-to-right, but all of them
push right-to-left. My experience here is limited to x86 and ARM
implementations using various toolsets, but is by no means comprehensive.

*Languages* don't push left-to-right. The C language is defined by the
ISO C standard, which explicitly leaves the order unspecified. Why do
you have such difficulty either understanding or believing that?

[...]
Yeah, I would argue that today on modern CPUs it makes little difference.
I'm sure the C standards I've seen (of right-to-left) now exist for backward
compatibility reasons.

You're misusing the word "standards".

[...]
 

James Kuyper

How about just one?
#include <stdio.h>
int main(void) {
    int arr[] = { 10, 20, 30, 40, 50, 60 };
    int i = 0;
    arr[i++ + ++i] = i++ + ++i;
    for (int i = 0; i < (int)(sizeof arr / sizeof arr[0]); i++) {
        printf("%d ", arr[i]);
    }
    putchar('\n');
}

Using gcc 4.7.2, the output without optimization is:
10 20 4 40 50 60
With optimization, the output is:
10 20 30 40 4 60


A flaw in the optimization engine, or else one which takes advantage of
the fact that C has no hard definition for this type of behavior (again,
a flaw in the optimization engine, in that it produces code which
executes inconsistently with its own unoptimized form).


As far as the C standard is concerned, optimizations that cause problems
for code with undefined behavior do NOT indicate a defect in the
optimization engine; they indicate a defect in the code. If the gcc
authors share that position, then another possibility is that the
optimization engine is working precisely as intended, even if you
disapprove of intentionally creating an engine that works that way.
 

James Kuyper

Do you mean why I use Microsoft's compiler instead of GCC? If so, it
is exactly, and only, for one precise reason: edit-and-continue abilities.

No - I mean if you were creating the assembly code yourself, either
manually or by the use of RDC, rather than relying upon either MS or GCC
to create it.
 

Rick C. Hodgin

*Languages* don't push left-to-right. The C language is defined by the
ISO C standard, which explicitly leaves the order unspecified. Why do
you have such difficulty either understanding or believing that?

Languages are useless without an implementation. The implementations of
the languages I've seen on x86 and ARM only push right-to-left.
You're misusing the word "standards".

"Standards," in this case, referring to the way they are written and operate
in practice (as per the last couple/three decades), as is visible by examining
their code in disassembly mode in the debugger, or through generated ASM output.

And again I say, my understanding of this is limited only to the things I've
seen on x86, and more recently, ARM. It is in no way comprehensive, but as
to my memory, this is an exhaustive list of the things I've seen. And I
seem to recall vaguely that I have seen compiler switches which allow you
to specify left-to-right as an option.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

As far as the C standard is concerned, optimizations that cause problems
for code with undefined behavior do NOT indicate a defect in the
optimization engine, they indicate a defect in the code.

That is a categoric flaw on behalf of the compiler coders (to be thusly
inconsistent with their own unoptimized code), and another categoric
flaw in the C standard for allowing such behavior to even exist in the
first place.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

No - I mean if you were creating the assembly code yourself, either
manually or by the use of RDC, rather than relying upon either MS or GCC
to create it.

I still don't understand. How have I "chosen to code in a manner contrary
to the one that [I myself] prescribed"?

Best regards,
Rick C. Hodgin
 

James Kuyper

That is a categoric flaw on behalf of the compiler coders (to be thusly
inconsistent with their own unoptimized code), and another categoric
flaw in the C standard for allowing such behavior to even exist in the
first place.

That "flaw" exists because both the authors of the C standard and most
of the implementors who implement C disagree with you about whether it's
a flaw. Not having to detect and deal specially with code that has
undefined behavior allows simpler, faster, and more effective
optimization engines. Users want that greater speed, so implementors
have an incentive to provide it, and members of both groups were on the
committees that deliberately decided to write the standard in a way that
accommodates those desires.
 

Keith Thompson

Rick C. Hodgin said:
That is a categoric flaw on behalf of the compiler coders (to be thusly
inconsistent with their own unoptimized code), and another categoric
flaw in the C standard for allowing such behavior to even exist in the
first place.

It is not a flaw, categoric or otherwise. It is a language design
decision, made deliberately and thoughtfully for reasons that you happen
not to agree with.

Let me try another analogy. You ask me to go to the store and buy some
milk. I buy a quart. You ask someone else to go to the store and buy
some milk. He buys a half gallon. Is it a "categoric flaw" in the
English language that the phrase "some milk" allows for differing
precise quantities? Is (at least) one of us wrong for buying an amount
other than what you expected? Of course not; the phrase is ambiguous,
and deliberately so. By saying "some milk", you (perhaps deliberately,
perhaps accidentally) left it up to me to decide how much to buy. If
you wanted to ask me to buy some specific amount of milk, you could
easily have said so.

English permits ambiguous statements that do not precisely specify all
aspects of what's being said. It would be a far weaker language if it
didn't. The phrase "some milk" does not mean "precisely one US quart of
whole milk at a temperature of 40 degrees Fahrenheit in a plastic
container".

C permits ambiguous expressions that do not precisely specify the order
of the operations performed. It's less obvious that this ambiguity is
*necessary*; it permits some optimizations, but one could argue that the
resulting confusion is too high a price to pay. But it was, as I've
said, a deliberate design decision.

The C expression (x + y) *does not mean* evaluate x, then evaluate y,
then compute their sum. It means compute the sum of x and y. If you
want to specify the order of evaluation, you can do so by writing
different code.

If you don't like it (and I'm not arguing that you should), then I
advise you either to stop using C and stop hanging out in a forum that
discusses C, or to accept that C is defined in a way that you dislike
and learn to live with it.
 

Keith Thompson

Rick C. Hodgin said:
Languages are useless without an implementation. The implementations of
the languages I've seen on x86 and ARM only push right-to-left.

Your experience is limited (as is mine, of course). Languages and
implementations are not the same thing.
"Standards," in this case, referring to the way they are written and operate
in practice (as per the last couple/three decades), as is visible by examining
their code in disassembly mode inthe debugger, or through generated ASM output.

As I said you're misusing the word "standards". ISO/IEC 9899:2011 (E)
is a standard. The particular behavior of a particular implementation
is not a standard.

[...]
 

Rick C. Hodgin

That "flaw" exists because both the authors of the C standard and most
of the implementors who implement C disagree with you about whether it's
a flaw. Not having to detect and deal specially with code that has
undefined behavior allows simpler, faster, and more effective
optimization engines. Users want that greater speed, so implementors
have an incentive to provide it, and members of both groups were on the
committees that deliberately decided to write the standard in a way that
accommodates those desires.

I agree with wanting greater speed. So, add a command line override which
lets the compiler take advantage of some feature on a particular platform
which allows it to run faster when enabled, thereby removing some peculiar
requirement of the language on obscure code, but then for the rest of the
time, for the rest of the developers, adhere to the (what should be an
existent) standard and allow the code to operate the same on all platforms.

This "undefined behavior" allowance is horrid. It should define a behavior,
and then people who write compilers, who have experience in optimizing in
certain unique ways for each platform, can then expose those features they've
found which allow it to go faster. A developer need only use the switch to
get the greater speed, while simultaneously reading: "Using this switch
will remove strict adherence to clause x.y.z of the official C standard,"
and possibly even point out in your source code where the sacrifice is
being made so the developer can code around it.

Anything less is ... something less than what should be being done.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

Your experience is limited

I believe I volunteered that information.
(as is mine, of course). Languages and
implementations are not the same thing.

Obviously. And that's my point. GCC's "standards" are what they have
released on a given version. They may change, but for that version it
is what it is, right, wrong, or indifferent. I cannot expect GCC to
adhere to a C standard which exists if in a particular release there is
a bug which does not adhere to it. At that point, were I forced to use
that errant version, I would have to acknowledge the NEW STANDARD which
exists, the result of the publication of the flawed release.
As I said you're misusing the word "standards". ISO/IEC 9899:2011 (E)
is a standard. The particular behavior of a particular implementation
is not a standard.

Keith, there are other definitions of the word standard. It is not a formal
spec, which is called a standard, but it is actually a standard, a de facto
standard (as per its published functionality, meaning it exhibits certain
behavior in a publicly released version that one can rely upon for that
version (at least)). It is something which, over time, has come to be a
particular way, has been coded for, has been unit tested, and so on. It
is not an arbitrary fluke that will toggle with each release based on
whoever it was that kicked off the release build that day. No, it is a
standard.

https://en.wikipedia.org/wiki/De_facto_standard

You consistently demonstrate that you do not understand my posts because
you rigidly adhere to positions which are tenuous at best, and place a
large gulf or barrier between us. I, for one, would be content to cease
all discussions between us, and allow both of us to continue on with our
lives.

Agreed?

Best regards,
Rick C. Hodgin
 

David Brown

Oops! I obviously meant right-most function call parameters are
pushed first (as such, right-to-left).


I don't know. I can see arguments either way. The first is the nature
of the early C compiler implementations. They were parsing parameters
left-to-right on the line, and they parsed all the way through to make
sure it was a valid syntax before encoding it. That meant recursion
through the parameters, and then explicit code generation writes on the
way back out. I could be wrong, but it is at least one possibility.

I've never seen a language that pushes left-to-right, but all of them
push right-to-left. My experience here is limited to x86 and ARM
implementations using various toolsets, but is by no means comprehensive.

The order of parameter passing can be defined by the ABI for the
platform, by the language, or it can be implementation-dependent. Some
ABI's make it specific (in order to allow linking code in different
languages), and some platforms support multiple ABI's. The C standards
do not dictate any order (they do not even dictate using a stack), but
/most/ implementations use a few registers for the first parameters,
then a stack for the rest in right-to-left order. And /most/ ABI's have
that as their basic calling convention.

However, on x86 there have been many different conventions. "cdecl" is
the most commonly used for C, with all parameters passed on the stack,
right-to-left. The 16-bit Windows API used the "pascal" convention,
which passes the parameters on the stack in left-to-right order (the
later "stdcall" convention keeps cdecl's right-to-left order but has the
callee clean the stack). Left-to-right ordering is natural for Pascal,
which always orders evaluation left-to-right because the language is
designed to be parsed in a single pass, with no more than one symbol of
read-ahead, and it does not have the variable-length parameter lists
that force C to use right-to-left parameter stacking.


<http://en.wikipedia.org/wiki/X86_calling_conventions>
 

Keith Thompson

Rick C. Hodgin said:
I believe I volunteered that information.


Obviously. And that's my point. GCC's "standards" are what they have
released on a given version. They may change, but for that version it
is what it is, right, wrong, or indifferent. I cannot expect GCC to
adhere to a C standard which exists if in a particular release there is
a bug which does not adhere to it. At that point, were I forced to use
that errant version, I would have to acknowledge the NEW STANDARD which
exists, the result of the publication of the flawed release.

The order of evaluation of operands, in cases where the ISO C
standard does not specify it, is not a "standard", it's mere
happenstance -- particularly if the gcc documentation doesn't
bother to mention it. If you're going to be upset when the order
changes in a new release, or with different optimization options,
or with the phase of the moon, then you're likely to spend a great
deal of time being upset.

You use the word "standard", in a forum whose subject matter centers
around a particular published ISO standard, to refer to something
for which nobody else here uses the word "standard".

[snip]
You consistently demonstrate that you do not understand my posts because
you rigidly adhere to positions which are tenuous at best, and place a
large gulf or barrier between us. I, for one, would be content to cease
all discussions between us, and allow both of us to continue on with our
lives.

And apparently it's my fault that you're misunderstood.

If you're going to make statements here about C, I reserve the right to
discuss those statements, and to publicly correct any misinformation
that you propagate here. Recent experience has not indicated that this
will do you any good, but it might do some good for others.

If you wish to cease communications, feel free to do so.
 

James Kuyper

No - I mean if you were creating the assembly code yourself, either
manually or by the use of RDC, rather than relying upon either MS or GCC
to create it.

I still don't understand. How have I "chosen to code in a manner contrary
to the one that [I myself] prescribed"?

I didn't say that you have. I was asking whether there could be any
reason why you would. I provided two different responses, prefixed with
"if so" and "if not", respectively, depending upon whether you answered
that question with a "yes" or a "no". I'm not suggesting any specific
reason, merely asking whether any such reason could exist.

In context, it should be a reason that could plausibly also apply to
Microsoft - so gangsters holding guns to your head forcing you to
violate your prescription would not be an example of the kind of
"reason" I was asking about.
 

David Brown

No - I mean if you were creating the assembly code yourself, either
manually or by the use of RDC, rather than relying upon either MS or GCC
to create it.

I still don't understand. How have I "chosen to code in a manner contrary
to the one that [I myself] prescribed"?

You suggested that perhaps MS agreed with you about specifying a
particular order of evaluation, and then decided to implement the
opposite order in their compiler "for whatever reason". James was
trying to ask you if you could think of a reason why /you/ might specify
one method then implement the opposite. If you can't imagine such a
reason for yourself, why do you think it is a realistic possibility for MS?
 

James Kuyper

On 02/07/2014 02:53 PM, Rick C. Hodgin wrote:
....
I agree with wanting greater speed. So, add a command line override which
lets the compiler take advantage of some feature on a particular platform
which allows it to run faster when enabled, thereby removing some peculiar
requirement of the language on obscure code, but then for the rest of the
time, for the rest of the developers, adhere to the (what should be an
existent) standard and allow the code to operate the same on all platforms.

The standard says nothing about command line options, which is good,
because if it did, it would prohibit the use of compilers that are built
into an IDE, with the option list controlled by menu choices, rather
than a command line. This is a prime example of the advantages of
avoiding unnecessary specificity in a standard.

Also, keep in mind that the standard allows such optimizations; but
in no way does it mandate them. Therefore, an implementation is free to
provide exactly the features you describe, while remaining fully
conforming regardless of which option you choose. Many do provide a
feature similar to that, except that the interface is reversed: certain
minimum optimizations, some of which may require exercising the
implementation's right to rearrange the order of evaluation, are on by
default. A special option is required to turn them off, not on. It's
done that way because it matches the preferences of most of the users,
who prefer to turn off the default optimizations only when they want to
make the behavior more predictable (for instance, for tracking down bugs).
 

David Brown

1) I agree with you, Rick. C would be a better language if it were
completely standardized. And by that I mean that the results are
predictable (see below for more on this). But it isn't, and we have to
live with that.

I don't know that C would be better by being /completely/ standardised -
but it could be better if some things were fixed in stone. Personally,
I would like to see things like two's complement arithmetic being
mandated, as well as making certain conversions explicitly specified.
There are a lot of things that are consistent for all modern practical
processors and C compilers, but are merely "implementation dependent" in
the standards. Writing portable code would be easier if these were
fixed, and nothing would need to be changed in any toolchains (unless
anyone wanted to write a compiler for an ancient mainframe following
these new standards).

The order of evaluation of parts of an expression (or of arguments to a
function call) does not fall into that category. Personally, I prefer
these to be unspecified - it gives the compiler more freedom, and there
is no problem writing explicitly ordered code if that is needed.
2) If you want D, you know where to find it. Or RDC...

Note: The fact is that C is not a "safe" language. And by safety, I mean
at least the following 3 things:

1) The behavior of any syntactically valid program is predictable and
consistent (across implementations).

I suspect that requirement is equivalent to the halting problem, and
therefore impossible. There is room for improvement, of course, both in
the C standards and in implementations (which could always be better at
warning about possibly undefined behaviour).
2) The behavior of any syntactically valid program is safe (none of
this crashing your system, reformatting your hard drive, or nasal
daemons stuff). I'm sure we all remember the first time we found
out that a perfectly reasonable, syntactically valid C program
could easily crash your system (for example, if you allocate too
big of an automatic [aka, "stack"] array).

That would require a sandbox environment for running the program, such
as a virtual machine - or at the very least, a "virtual virtual machine"
using JIT compilation. That is why "safe" languages like Java and C#
need virtual machines. "Big" systems, like x86 systems running a full
OS, can get quite a lot of this through memory protection and process
privileges - they do that today. On small systems, it is completely
unrealistic as the hardware does not support such control.
3) Runtime errors are caught and handled sensibly (without the
programmer having to do anything - i.e., write error handler code for
every system call, as in C). Normally, this means generating a
sensible error message and aborting the program.

Again, this is only feasible on big, hosted systems - C is also used on
embedded systems which cannot handle errors like that.
 

David Brown

David Brown said:
On Thursday, February 6, 2014 4:52:52 PM UTC-5, Keith Thompson wrote: [...]
C has done quite well *without* defining the order of evaluation of most
expressions.

Until I switch from compiler A to compiler B ... then C becomes useless.

As long as you write proper C, then there is no problem. If you want to
write some sort of mess that is syntactically correct and happens to be
accepted by the compiler, but is undefined, then you can't expect to
rely on getting the same nasal daemons from two different compilers.

There's plenty of C code that depends on compiler-specific extensions
and other characteristics -- and if you need to use those extensions,
there's nothing wrong with that.

Oh, I know that - I use such extensions all the time. But that is not
the situation here (I don't think we need to make it more complicated!).
The Linux kernel, for example, can't be compiled by C compilers other
than gcc (or compilers closely compatible with gcc). Plenty of
Windows-specific code depends on Microsoft's compiler. And so forth.

In my line of work, you write the code to be as portable as practically
possible, but still fix the toolchain used for a particular project. I
don't even change minor versions of a toolchain within a project (and
thus keep lots of old toolchains lying around for different targets).
Just because theory says it all /should/ work (baring compiler-specific
extensions), does not mean that it all /will/ work. And libraries are
also tied in with specific toolchain versions.
[...]
C has been standardised since 1989 - and the standards explicitly say
that the ordering of actions between sequence points is not defined, and
that code that depends on a particular ordering (such as "a[i] =
a[i++]") has undefined behaviour.


It's not quite that simple. For example, the behavior of this:

printf("foo") + printf("bar");

depends on the order of evaluation of the operands of the "+" operator,
but the behavior is not undefined. It must (attempt to) print either
"foobar" or "barfoo"; which one it prints is *unspecified*.


Fair enough - things are /never/ quite that simple!
 
