Syntax for union parameter


Rick C. Hodgin

No, that's why traffic lights are always ordered red/yellow/green from
top to bottom (at least in the US).

I was thinking ... this red/green system we have with vertical and
horizontal formats, red typically on the top or left, but probably not
always (with certainly no mechanical system to rigidly require it to
be that way) ... that such a system would never come into existence
were it created today and offered as a solution for traffic stops.
There are far too many flaws.

It's very much like aspects of C in that way. :)

What I'm trying to do is create a new traffic system cue, one that does
not rely on legacy baggage from an era when things were not the way
they are now.

Best regards,
Rick C. Hodgin
 

James Kuyper

The reasoning is the sequence of operations:
// a[i] = a[i++];
t1 = i; // i of "i++"
i = i + 1; // ++ of "i++"
t2 = a[t1]; // value of "a[i++]"
a[i] = t2; // stored into "a[i]"


The flaw in that reasoning is that C gives implementations greater
freedom in the order of evaluation than that.
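For instance, an implementation could just as legitimately determine the
lvalue a[i] before the increment (a sketch in the same style; t1 and t2
are illustrative temporaries, and since the behavior is undefined, no
particular ordering is guaranteed):

t1 = i; // i of "i++"
t2 = a[t1]; // value of "a[i++]"
a[t1] = t2; // stored into "a[i]", using the not-yet-incremented i
i = i + 1; // ++ of "i++", deferred to the end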


That is a flaw in C. If a different order is required, it should be coded
explicitly by the developer in source code. The compiler should always
do things in an explicit order, at all times.


The ideal order for one platform could be quite different from the ideal
order for another platform. C allows developers to ignore that issue,
writing a single version of their code for a wide variety of different
platforms. Instead, it puts the onus on implementors of C. An
implementor doesn't have to support platform-specific optimizations, but
as soon as there are at least two different implementations targeting
the same platform, competition for customers will provide a motivation
to do so.

A small number of implementors need to be very familiar with each
platform they're implementing C on, allowing a large number of C
developers to write code without having to worry about such issues, and
to easily port their code from one platform to another. If you want to
worry about such issues, creating a different version of your code for
each platform, C isn't the language for you; try assembler instead.

Hence my use of the cask. If you need an intermediate value, use the cask
to explicitly obtain exactly what you need, and don't rely on a wonky
compiler flexibility/liability which may or may not work as you expect
from one version to another. D'oh! That's a no-brainer!

The current syntax of C allows complete specification of the order,
should it be important to do so. Just create explicitly named temporary
variables, and use multiple statements rather than trying to cram
everything into one statement. It's not necessary to add anything to the
language for that purpose.
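For example, a fully specified rewrite of a[i] = a[i++], assuming the
intent is "store the old element at the new index" (old_i is an
illustrative name):

int old_i = i; // capture the index explicitly
i = old_i + 1; // the increment, as its own statement
a[i] = a[old_i]; // every read and write now happens in a definite order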

I've not paid much attention to your "cask" concept, so I've no idea
whether it would be useful for any other purpose. The quality of your
other ideas gives me little justification for bothering to investigate
that one.
a[ti] = a[((|ifp|ti||)i)++];
Make heads or tails of that without a GUI IDE!

That is a prime example of why the C committee didn't feel that such
code deserved protection.

The C committee considered casks? AWESOME! I thought they were my
invention. Parallel discoveries! I love it!

For all I know, they might have, though if they did, I doubt it was
under that name; obviously, if they did consider it, they rejected it,
so I wouldn't get too happy about the possibility that they might have
considered it.

I was referring only to the complexity of your code, not to its use of
features that RDC has which are different from those mandated by the C
standard.
 

Rick C. Hodgin

Yes, we've only been using text-based information for a thousand
years or so, and the verbal equivalent for considerably longer.

If they had computer technology back then, they would've been using GUIs
as well.

I'll admit text has its place. But GUIs offer far better features in
almost all cases. You can do some things in text faster than with a
GUI, but there are ways around most of those as well. For example,
with a GUI you click on something and press the Delete key to delete
it. You don't have to drag-and-drop. And many others.

Best regards,
Rick C. Hodgin
 

Martin Shobe

Not exactly, Mr. Thompson.

http://www.city-data.com/forum/general-u-s/666062-traffic-lights-pictures-your-states-region.html

In Indianapolis where I live, there is a low underpass bridge used by
trains. It is a tight, confusing intersection when all is well. Add
a little rain and it starts to flood, a little snow and people get
killed at that intersection. It has horizontal traffic lights because
it cannot take up so much space vertically and still meet height
requirements.

They are typically oriented vertically, but not always. And I'm
sure if I was making the case for the red/green system, someone named
Keith Thompson or James Kuyper would point out that I have not
considered the times where vertical placement just isn't possible or
practical. :)

Same difference, except here the red is always on the left. You can
still tell which one means stop without being able to see color.

Martin Shobe
 

James Kuyper

On 02/06/2014 10:43 AM, Rick C. Hodgin wrote:
....
I would welcome your experience on my project, James,

I doubt it - I would routinely use that experience to bring your
attention to issues you've decided to ignore in your enthusiasm for your
own concept of what a programming language should be. No major change
has ever been achieved except by those who have the enthusiasm to see it
through to its conclusion. However, enthusiasm isn't enough; your ideas
must also be good enough to justify your enthusiasm, and you need to
understand, with more sympathy than you're currently showing, the reasons
why other people have made different choices for their languages than you did.
Not everyone who made a different choice than you did so because they
lacked your understanding; many of them made a different choice because
their understanding was superior to yours. Others made a different
choice because they were targeting a different computing environment
than the one you're familiar with, one where their choice was better
than yours. It isn't always the case that the environment was different
for reasons that are now obsolete, though I understand that it seems
that way to you.
 

Rick C. Hodgin

Same difference, except here the red is always on the left. You can
still tell which one means stop without being able to see color.

Martin Shobe

Not exactly, Mr. Shobe.

http://www.flickr.com/photos/hankrogers/6998795712/

With regards to traffic lights, in the U.S., there seem to be common
practices, but no hard standards (kind of like C language
implementations, which on certain things can do whatever they choose
on platform X, and something completely different on platform Y).

Best regards,
Rick C. Hodgin
 

glen herrmannsfeldt

(snip, someone wrote)
Repeating a claim does not make it true.
The compiler must do what the specification says it must do for
specified behaviour, and should generate the best code it can for
implementation dependent behaviour. It can do what it likes for
undefined behaviour.
The programmer should learn to use the language and the tools.

Note that Fortran and PL/I don't have this problem, but they do have
another, similar one. In array expressions, such as:

X=X+X(5);

evaluation order matters. Fortran and PL/I both defined the result,
but in different ways.

Fortran requires that the result be as if the whole right hand side
was evaluated before the left hand side is changed. If the compiler
can't verify aliasing, it needs a temporary array.

PL/I requires that the change to X(5) happen as if the array elements
were accessed in order. As machines with vector registers didn't exist
at the time, that made some sense, but now it restricts the
optimizations that can be done. (But in many other cases, PL/I will
generate temporary arrays.)
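The difference can be sketched in C terms (x, N, and the values are
illustrative; X(5) corresponds to x[4]):

#include <stdio.h>
#define N 8

int main(void)
{
    double x[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    double fortran_x[N], tmp[N];
    int k;

    /* Fortran semantics: the whole right-hand side first, via a temporary. */
    for (k = 0; k < N; k++) tmp[k] = x[k] + x[4];
    for (k = 0; k < N; k++) fortran_x[k] = tmp[k];

    /* PL/I semantics: elements assigned in order, so elements after the
       fifth see the already-updated x[4]. */
    for (k = 0; k < N; k++) x[k] = x[k] + x[4];

    printf("element 6: Fortran %g, PL/I %g\n", fortran_x[5], x[5]);
    return 0;
}

Here the sixth element comes out as 11 under the Fortran rule and 16
under the PL/I rule.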

Leaving things implementation defined or undefined allows the compiler
to optimize the specific case, even for hardware not thought up at
the time the language was specified.

-- glen
 

Keith Thompson

Rick C. Hodgin said:
a[i] = a[i++];
It's interesting that Java, which as I recall requires strict
left-to-right evaluation and makes `a[i] = a[i++]` well defined,
gives a different result. I'm not saying that Java gets it right
(or even that C gets it right), but even if you think the behavior
should be well defined, it's not obvious *how* it should be defined.


My position is this: it's patently exactly obvious.

For example, you have to compute something before you can assign it. As
a result, when the assignment time of the calculation arises, whatever
value is in i at that time is what's used.

No ambiguity whatsoever. It always works properly in all cases.


Possibly the designers of Java also thought the answer was "patently
exactly obvious". And yet they came up with a different set of rules
than you did.

There's nothing wrong with defining a specific evaluation order for all
expressions, as long as the definition is consistent. But there are
also real advantages to leaving the order undefined or unspecified.
Your refusal to acknowledge the possibility that that might be the case,
or even that there's more than one set of rules that might make sense,
does not suggest to me that you have a good understanding of the issues.
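To illustrate, both rule sets can be spelled out with explicit
temporaries (a minimal sketch; the starting values are illustrative):

#include <stdio.h>

int main(void)
{
    /* Rule set A (Java-like, strict left-to-right): the LHS index is
       fetched before the RHS and its increment. */
    int a1[] = {10, 20};
    int i1 = 0;
    int lhs = i1;      /* LHS index uses the old i */
    int val = a1[i1];  /* value of "a[i++]" */
    i1 = i1 + 1;       /* ++ of "i++" */
    a1[lhs] = val;     /* a1 stays {10, 20}, i1 == 1 */

    /* Rule set B: the RHS and its increment first, then the LHS index. */
    int a2[] = {10, 20};
    int i2 = 0;
    int val2 = a2[i2]; /* value of "a[i++]" */
    i2 = i2 + 1;       /* ++ of "i++" */
    a2[i2] = val2;     /* a2 becomes {10, 10}, i2 == 1 */

    printf("A: {%d,%d}  B: {%d,%d}\n", a1[0], a1[1], a2[0], a2[1]);
    return 0;
}

Both are internally consistent; they simply store to different elements.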
 

glen herrmannsfeldt

(snip)
It's interesting that Java, which as I recall requires strict
left-to-right evaluation and makes `a[i] = a[i++]` well defined,
gives a different result. I'm not saying that Java gets it right
(or even that C gets it right), but even if you think the behavior
should be well defined, it's not obvious *how* it should be defined.


I didn't know that Java defined this one, but the exception model of
Java does put a lot of restrictions on expression evaluation.
In simple cases, you have overflow, underflow, and bounds checking,
but add method calls and it gets more complicated.

-- glen
 

glen herrmannsfeldt

(snip)
The issue is that
a[i] = a[i++];
could "evaluate" a[i++] before a[i], making the statement do something like
(snip)

In a slightly more complicated version:

int a[] = {1, 2, 3};
int i = 0;
int *p1 = &i; /* two pointers aliasing the same object i */
int *p2 = &i;
a[*p1] = a[(*p2)++]; /* same unsequenced read/modify of i, hidden behind pointers */
(snip)
has the exact same undefined behavior, but it might not be detectable at
the problem line. It is also possible, and allowed, that this might
result "any" behavior. There are (or at least were) processors, where
the code generated could bus fault when executing this statement.

Bus fault reminds me of an interesting case for the IBM S/370.

S/360 had a translate (TR) instruction, very useful for code
translation, such as ASCII <--> EBCDIC, but also useful for many
other cases. It does, pretty much, what C does for:

unsigned char *a, *b;
for(i=0;i<n;i++) a[i]=b[a[i]];

For n between 1 and 256. All in one instruction! (If you find
that interesting, see what TRT, TRanslate and Test, does.)

It was, however, defined such that it would still work if entries of
b were not addressable, as long as they weren't accessed.

That was fine for S/360 (well, except for the 360/67), but bad for
S/370 with virtual memory. Consider that b might cross a page boundary,
and the page on one side or the other might not exist. For S/370 and
later processors, if b crosses a page boundary it requires a trial
execution, where the whole thing is done without storing, and any
page faults are recognized. After the trial execution, it then does
it for real.

So, yes, another example where defining the obvious meaning for
something turned out to cause problems later.

-- glen
 

glen herrmannsfeldt

(snip on undefined operations such as a[i]=a[i++];)
While that's true, C was a very early example of a language that did
this. Most of the languages you know and love that do this today (Java,
JavaScript to name but two) took the concept from C but specified it
more rigorously.

For Java, I believe a lot of the reason is the exception model, and
the requirement that exceptions be consistent.
I'm dubious that we'll ever know the motives behind some of the early
design decisions but several vaguely plausible ones are:
- because it makes writing the compiler easier
- because it makes the compiler run faster
- because no-one thought about the problem and by the time someone did
there was already heaps of code that assumed one behaviour or the other
- because on a popular architecture of the day there was a logical way
to map this to the machine code and so no-one bothered to specify it
further.

There are some interesting address modes for VAX that are undefined
for similar reasons. Especially interesting are ones that
autoincrement or autodecrement PC. Some of the reason was that the
PDP-11 allowed for some similar modes, which might have complicated
some implementations.
Generally though C leaves a lot more things unspecified (or defined
as undefined!) to allow the compiler maximum freedom to reorder and
restructure code to best suit the target architecture.
Both Java and JavaScript have very different models (compiled to a
single standard pseudoarchitecture; interpreted) to C (compiled onto a
vast array of very different hardware).

-- glen
 

Rick C. Hodgin

There's nothing wrong with defining a specific evaluation order for all
expressions, as long as the definition is consistent.

EXACTLY! :)
But there are
also real advantages to leaving the order undefined or unspecified.
Disagree.

Your refusal to acknowledge the possibility that that might be the case,
or even that there's more than one set of rules that might make sense,
does not suggest to me that you have a good understanding of the issues.

I can accept that. It may also indicate I'm right though (because I hold
strongly to the other side in spite of the apparent "evidence" which
suggests that not having standards is a good thing).

Best regards,
Rick C. Hodgin
 

Keith Thompson

Rick C. Hodgin said:
EXACTLY! :)


Disagree.

Those advantages have been discussed at some length in this thread.
I can accept that. It may also indicate I'm right though (because I hold
strongly to the other side in spite of the apparent "evidence" which
suggests that not having standards is a good thing).

You say that defining the order of evaluation is better than the C
approach. That's fine. I don't necessarily agree, but it's a perfectly
valid opinion.

You insist that there are *no* advantages to leaving the order of
evaluation undefined. That's simply incorrect. It permits compilers to
generate faster machine code from source code that manages to avoid
undefined behavior (at the cost of unpredictable results for code that
doesn't do so).

Finally, you insist that your particular definition is obviously
superior to any other (for example, to Java's rule that imposes strict
left-to-right evaluation, which strikes me as simple and easy to
understand). I find that absurd.
 

Martin Shobe

Not exactly, Mr. Shobe.

http://www.flickr.com/photos/hankrogers/6998795712/

With regards to traffic lights, in the U.S., there seem to be common
practices, but no hard standards (kind of like C language
implementations, which on certain things can do whatever they choose
on platform X, and something completely different on platform Y).

There are indeed standards. Whoever installed those got it wrong.

Martin Shobe
 

Rick C. Hodgin

There are indeed standards. Whoever installed those got it wrong.
Martin Shobe

Not comforting words to the dead, colorblind person's family.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

You insist that there are *no* advantages to leaving the order of
evaluation undefined. That's simply incorrect. It permits compilers to
generate faster machine code from source code that manages to avoid
undefined behavior (at the cost of unpredictable results for code that
doesn't do so).

No. It allows a compiler designer to create a compiler that implements
what he decides for the target architecture to produce "faster code."
But that doesn't mean it's producing actual real faster code because
it's all built upon the whims of its own standards, so you are no longer
comparing apples to apples. If you want to see which one generates
faster code, then parse it the same way and apply the logic the same way.

On the other hand, I would agree that there should be full flexibility
in how the mechanics of the language could be implemented. For example,
if a particular platform has a blazingly fast floating point engine, then
there's no reason integer values couldn't be implemented entirely in the
floating point engine, being converted to integer when needed, and so on.
However, those mechanical implementations cannot violate the rules of the
language, or else the language is rendered useless. And if the language
does not have specification on something as fundamental as order of
processing across an equal assignment, then it needs to be addressed.
Finally, you insist that your particular definition is obviously
superior to any other (for example, to Java's rule that imposes strict
left-to-right evaluation, which strikes me as simple and easy to
understand). I find that absurd.

I do think that a particular definition is superior to another, but I
also agree that having a common standard is something that's perfectly
acceptable as well (because everyone is working from the same sheet of
music).

I believe Java got it wrong in this case (if Java really does left to
right, I've never observed Java code being different from C code, but
maybe it's just the coding I've done). That's my opinion, and I would
carry it forward to any other similar language that went that way.

FWIW, I tested the a[i] = a[i++] code in Visual C++, GCC, and G++.
It works as I specified in GCC and G++, and the exact opposite way in
Visual C++. I would find that completely unacceptable were I to encounter
such a thing in normal programming. With RDC, it won't be an issue.
It may produce slightly slower code on one platform, but at 3GHz and
higher ... who cares for 99.9% of applications. For the other .1%,
write some custom assembly, or re-write the algorithm.
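A sketch of the kind of test involved (the array contents are
illustrative):

#include <stdio.h>

int main(void)
{
    int a[] = {10, 20, 30};
    int i = 0;
    a[i] = a[i++]; /* undefined behavior: conforming compilers may differ */
    printf("i=%d a={%d,%d,%d}\n", i, a[0], a[1], a[2]);
    return 0;
}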

By definition, and for all time, I would argue that the order in which
a source code line must be filled should be the absolute determining
factor in how that code is executed. And that is always how RDC will
parse its source files.

Best regards,
Rick C. Hodgin
 

Keith Thompson

Rick C. Hodgin said:
No. It allows a compiler designer to create a compiler that implements
what he decides for the target architecture to produce "faster code."
But that doesn't mean it's producing actual real faster code because
it's all built upon the whims of its own standards, so you are no longer
comparing apples to apples. If you want to see which one generates
faster code, then parse it the same way and apply the logic the same way.

Apples-to-apples comparisons (for C as it's currently defined) are
entirely possible. Just compare the speed of the generated code for C
source code whose behavior is well defined by the C standard. (Behavior
of code whose behavior is undefined is not relevant in the context of C.
Feel free to care about it yourself if you so choose, but expect me to
ignore you on that particular point).
On the other hand, I would agree that there should be full flexibility
in how the mechanics of the language could be implemented. For example,
if a particular platform has a blazingly fast floating point engine, then
there's no reason integer values couldn't be implemented entirely in the
floating point engine, being converted to integer when needed, and so on.
However, those mechanical implementations cannot violate the rules of the
language, or else the language is rendered useless. And if the language
does not have specification on something as fundamental as order of
processing across an equal assignment, then it needs to be addressed.

C has done quite well *without* defining the order of evaluation of most
expressions.
Finally, you insist that your particular definition is obviously
superior to any other (for example, to Java's rule that imposes strict
left-to-right evaluation, which strikes me as simple and easy to
understand). I find that absurd.

I do think that a particular definition is superior to another, but I
also agree that having a common standard is something that's perfectly
acceptable as well (because everyone is working from the same sheet of
music).

I believe Java got it wrong in this case (if Java really does left to
right, I've never observed Java code being different from C code, but
maybe it's just the coding I've done). That's my opinion, and I would
carry it forward to any other similar language that went that way.

FWIW, I tested the a[i] = a[i++] code in Visual C++, GCC, and G++.
It works as I specified in GCC and G++, and the exact opposite way in
Visual C++. I would find that completely unacceptable were I to encounter
such a thing in normal programming. With RDC, it won't be an issue.
It may produce slightly slower code on one platform, but at 3GHz and
higher ... who cares for 99.9% of applications. For the other .1%,
write some custom assembly, or re-write the algorithm.


It's your expectations that are the root of the problem.

As far as C is concerned, and as far as I personally am concerned, the
behavior of `a[i] = a[i++]` *simply doesn't matter*. Why does it matter
to you? Would you seriously write a line of code like that in
real-world code?

In most cases, code whose behavior is undefined because of evaluation
order issues is also just plain poorly written code.

The behavior of `a[i] = a[i++]` is undefined. Your solution is to
drastically change the language definition so that the behavior is
defined. My solution is to write it more clearly, for example:

a[i] = a[i+1];
i++;
By definition, and for all time, I would argue that the order in which
a source code line must be filled should be the absolute determining
factor in how that code is executed. And that is always how RDC will
parse its source files.

"By definition, and for all time". By *what* definition (other than one
you've made up)? It's difficult to take you seriously when you write as
if your personal preferences were Fundamental Truths of the Universe.
 

Rick C. Hodgin

Apples-to-apples comparisons (for C as it's currently defined) are
entirely possible. Just compare the speed of the generated code for C
source code whose behavior is well defined by the C standard. (Behavior
of code whose behavior is undefined is not relevant in the context of C.
Feel free to care about it yourself if you so choose, but expect me to
ignore you on that particular point).

So the apples-to-apples comparison is for those things defined in C.
Anything left up to the implementation cannot be compared. Useful. :)
C has done quite well *without* defining the order of evaluation of most
expressions.

Until I switch from compiler A to compiler B ... then C becomes useless.
I would argue that having C without standards has allowed the creation
of niche empires. A developer with a large code base dare not migrate
away from a toolset, lest they face so many bugs that the migration
becomes impossible to execute.
It's your expectations that are the root of the problem.

They said the same thing to Philo Farnsworth: "Send moving pictures
through invisible air? Lunatic!"
As far as C is concerned, and as far as I personally am concerned, the
behavior of 'a[i] = a[i++]' *simply doesn't matter*. Why does it
matter to you?


It doesn't. It's an example demonstrating the thing I do care about:
order of operation.
Would you seriously write a line of code like that in real-world code?

I would do it on a bet or a dare, or if I was trying to prove a point,
or maybe to sneak in some code to amuse and confuse those who come after
me who will be examining my code. I can't imagine a case where I would
write code like that as a matter of course, however.
In most cases, code whose behavior is undefined because of evaluation
order issues is also just plain poorly written code.
Agreed.

The behavior of 'a[i] = a[i++]' is undefined. Your solution is to
drastically change the language definition so that the behavior is
defined. My solution is to write it more clearly, for example:
a[i] = a[i+1];
i++;


I'm penning the language definition for RDC. There are no drastic
changes. This is how I have always had it in mind to be.
"By definition, and for all time". By *what* definition (other than one
you've made up)?

None. It is the one I have determined.
It's difficult to take you seriously when you write as if your personal
preferences were Fundamental Truths of the Universe.

I believe that this methodology of writing code is better than the other
choices. The authors of GCC agree with me. The authors of Microsoft's
compiler disagree with me. And I'm sure if we got another 10 compiler
authors lined up they'd have varying opinions as well. It is simply
that mine fall on the side of the exact order required to carry out the
instruction, and in no other order, and using whatever exists at that
point based on any prior modifications.

It's my position, and it's the definition of RDC. Your language may vary.

Best regards,
Rick C. Hodgin
 

James Kuyper

No. It allows a compiler designer to create a compiler that implements
what he decides for the target architecture to produce "faster code."
But that doesn't mean it's producing actual real faster code because
it's all built upon the whims of its own standards, so you are no longer
comparing apples to apples. If you want to see which one generates
faster code, then parse it the same way and apply the logic the same way.

But the only thing that matters as far as the C standard is concerned is
the connection between the inputs and the outputs of the program.
Radically different ways of arranging the internal processing, which
produce the same final result, are allowed by the C language precisely
because they don't matter.
If your interest in apples and oranges were solely due to their calorie
content, without regard to their other nutrients or their flavor, then
it is entirely right and proper to compare apples and oranges - in terms
of their calorie content. Similarly, comparing two different
implementations of C for their speed and memory usage when implementing
the same code is entirely appropriate (if they're targeting the same
platform), even if the detailed machine code they generate does the same
thing in radically different ways.

Of course, if the sub-expressions have side-effects, the order in which
they are evaluated CAN make a big difference. In that case, if you want
to get those side-effects to occur in a specific order, all you have to
do is re-write your C code in such a way as to explicitly specify the
order. That has the considerable benefit of making the code easier to read
and understand.
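A minimal sketch of that rewrite (f and g are illustrative functions
with side effects):

#include <stdio.h>

static int f(void) { puts("f"); return 1; }
static int g(void) { puts("g"); return 2; }

int main(void)
{
    int sum1 = f() + g(); /* order of the two calls is unspecified */

    int left = f();       /* splitting the expression pins it down: */
    int right = g();      /* f's side effects now precede g's */
    int sum2 = left + right;

    printf("%d %d\n", sum1, sum2);
    return 0;
}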

....
language, or else the language is rendered useless. And if the language
does not have specification on something as fundamental as order of
processing across an equal assignment, then it needs to be addressed.

C has been in existence for more than three decades, without that issue
being addressed, as a result of a deliberate decision that it was
unnecessary to address it. Experienced programmers know how to
explicitly specify the order, when it matters, and implementors know how
to design compilers to take advantage of the fact that the order isn't
specified, when it doesn't matter. This strongly suggests to me that the
"need to address" this issue exists mainly in your head.
I believe Java got it wrong in this case (if Java really does left to
right, I've never observed Java code being different from C code, but
maybe it's just the coding I've done).

Since C leaves it unspecified, it's not necessarily different from Java.
Therefore, a failure to observe a difference from Java may have just
been the result of bad luck. You shouldn't write C code where the order
matters - perhaps you've been accidentally paying more attention to that
rule than you thought?
 

Rick C. Hodgin

Since C leaves it unspecified, it's not necessarily different from Java.
Therefore, a failure to observe a difference from Java may have just
been the result of bad luck. You shouldn't write C code where the order
matters - perhaps you've been accidentally paying more attention to that
rule than you thought?

When I write code, I go out of my way to make it clear. I do that for a
host of reasons, but the biggest is long term maintenance concerns. I do
not mind sacrificing a little performance, or possibly missing out on an
optimization, so that when I come back to the code five years later I'm
able to immediately follow, understand, and make a change.

Best regards,
Rick C. Hodgin
 
