I was surprised by OP's question, and slightly saddened
by most of the responses, many of which seemed to
agree with OP that having a separate '->' operator
was a historical misfortune, and that a better-designed
language would overload wherever possible.
But fortunately I read down to Alan Curry's beautiful
response:
How about this for a reason: if it acts different, it should look different.
Dereferencing is different from not-dereferencing.
You found a pair of operations that could be represented by a single symbol,
since there's no set of operands that is legal for both. Does this mean it's
a good idea? No. Representing different operations with the same symbol is
called "overloading", and C is not a very overloaded language.
The arithmetic operators are pretty heavily overloaded, and this does cause
problems. Integer division gets used accidentally where floating point
division was intended. Pointer addition gets done with the wrong value
because people forget that it's automatically scaled up by the sizeof the
target.
'.' and '->' are the opposite of overloaded. They're underloaded. At any
particular spot in a program, at most one of them is legal. The distinction
between them is a pure redundancy. And redundancy is good. Redundancy allows
us to assign a meaning to only some inputs, and reject the others.
The elimination of redundancy makes more inputs meaningful, which makes the
language more delicate. It's helpful to have a lot of meaningless inputs
lying near the meaningful ones, because meaningless inputs are errors
detected at compile time.
Hooray for redundancy! Hooray for meaninglessness! Hooray for underloading!
I absolutely agree with Mr. Curry. If others insist that a "friendlier"
language would overload wherever possible, I'll answer that I fell in
love with the C language because it *isn't* friendly in their sense.
It's WYSIWYG, and was often called the "portable assembler" for that
reason.
The rest of you may want languages that *guess* what you want.
Go down the hall, turn left, and look for a different language.
C does what it's told. And BTW where would you draw the line?
int a, **pp;
a += pp;
Compiler: treat 'pp' as '**pp' to make it legal.
User: Oh thank you, compiler. This way I don't need
to wear out my '*' key. The '**' just confuses
my manager anyway.
x = y / 0;
Compiler: change the illegal '/' to a legal (though useless) '+'
Where do you draw the line? Maybe some day compilers will
be so much smarter than people that we expect them to silently
correct our errors. But that day is not this day.
And speaking to the question of redundancy in language, I
wish I'd saved a wonderful article in some information-theory
journal which pointed out that natural languages need to
*combine* information with redundancy. (Without the redundancy,
babies could never learn language.)
James Dow Allen