Keith said:
August Derleth said:
Kevin Goodsell wrote:
[...]
The question Ben is asking is, what would you propose pointer
multiplication do? How would you define the operation? I can't think
of any way in which multiplying a pointer by any other value would
be useful or even meaningful.
The way I reason is like this: If I take i and assign an address to it
(that is, I make it a pointer), i is the name of a block of memory
that holds a certain string of bits or trits or decimal digits that
compose that address. At this layer of abstraction, it's no different
from an int or a float. Since I can multiply two ints and stand a good
chance of getting a meaningful result, why not two pointers? Or an int
and a pointer?
Because you *can't* multiply two pointers, or an int and a pointer,
and stand a chance of getting a meaningful result.
Which doesn't stop BCPL, for instance.
Pointers are not integers. Pointer arithmetic in C is not defined in
terms of treating the bit patterns composing the pointer values as
integers and performing integer arithmetic on them; it's defined in
terms of what object the resulting pointer value points to.
I know that. I am satisfied with the existing pointer arithmetic,
actually, even though I posited my remarks from the perspective of
changing it.
For example, if p is a pointer, the machine-level operations that are
performed to evaluate the expression (p + 4) depend on the type of p.
If p is a char*, (p + 4) points to a location 4 bytes "after" the
location pointed to by p; if p is an int*, (p + 4) points
4*sizeof(int) bytes "after" the location pointed to by p. If p is a
void*, the expression (p + 4) is illegal (though some compilers may
support it as a language extension (unwisely, IMHO)).
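For concreteness, here's a minimal sketch of that scaling (the printed
addresses are implementation-specific; only the distances between them
matter):

    #include <stdio.h>

    int main(void)
    {
        int arr[8] = {0};
        char *cp = (char *)arr;
        int  *ip = arr;

        /* cp + 4 advances 4 bytes; ip + 4 advances 4 * sizeof(int)
           bytes, because addition is scaled by the pointed-to type. */
        printf("char*: %p -> %p\n", (void *)cp, (void *)(cp + 4));
        printf("int*:  %p -> %p\n", (void *)ip, (void *)(ip + 4));
        return 0;
    }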
This is where my ideas about multiplication fall down in one respect:
Addition (by positive and negative numbers) is scaled, but there's no
meaningful way to scale multiplication. Or division, for that matter.
And how would you scale the square root?
We could invent ways, surely, but it's hardly worth it.
(If you want to know, I was imagining pointers as being treated like
integers in my imaginary C-like language.)
Another example: On Cray vector machines, a machine address points to
a 64-bit word. The C compiler implements 8-bit chars (CHAR_BIT == 8)
by "cheating" a little. (I put the word "cheating" in quotation marks
because what the C compiler does is perfectly legitimate; it just
doesn't map directly to the underlying hardware.) A char* is
implemented as a word address with a byte offset kludged into the
high-order 3 bits. (I actually don't know how much hardware support
there is for this format.) Multiplying two such pointers makes as
much sense as taking the square root of a struct.
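To illustrate the kind of encoding I mean (the exact field layout here
is my own guess for the sake of the sketch, not the real Cray format;
all I'm assuming is the byte offset in the high-order 3 bits):

    #include <stdint.h>
    #include <stdio.h>

    /* Pack a word address and a byte offset (0..7) into one 64-bit
       value, with the offset kludged into the high-order 3 bits. */
    static uint64_t pack_char_ptr(uint64_t word_addr, unsigned byte_off)
    {
        return (word_addr & 0x1FFFFFFFFFFFFFFFULL)
             | ((uint64_t)(byte_off & 0x7u) << 61);
    }

    static uint64_t word_part(uint64_t p) { return p & 0x1FFFFFFFFFFFFFFFULL; }
    static unsigned byte_part(uint64_t p) { return (unsigned)(p >> 61) & 0x7u; }

    int main(void)
    {
        uint64_t p = pack_char_ptr(0x1000, 5);
        printf("word 0x%llx, byte %u\n",
               (unsigned long long)word_part(p), byte_part(p));
        return 0;
    }

Multiplying two values in that format clearly produces garbage in both
fields at once.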
This is fascinating. Which compiler do you use? Does gcc support the
Cray vector machines?
It's fascinating, and it makes my ideas sound rather foolish. You could
multiply two such pointers, but the machine-level semantics would always
be in doubt and the result would be universally meaningless.
If, for some reason, you want to treat the contents of two pointers as
if they were integers, multiply them, and store the resulting integer
bit pattern into a pointer, you can do so. I don't think a cast from
a pointer to an integer of the appropriate size, or vice versa, is
guaranteed to just copy the bits, but it probably does so on most or
all implementations. If you're concerned about that, you can use
memcpy(), which is guaranteed to copy the bits. Using explicit
conversions, you can multiply two pointers (treating the pointers' bit
patterns as integers and treating the resulting integer bit pattern as
a pointer) as easily as you can take the square root of a struct
(treating the struct's bit pattern as a double). The compiler won't
hold your hand while you do this, but it won't stop you. Of course,
you'll invoke undefined behavior, and I can't think of any possible
use for the result.
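A sketch of that exercise, assuming the implementation provides
uintptr_t (it's optional); the product is meaningless and the
reconstructed pointer must never be dereferenced:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int a = 1, b = 2;
        int *p = &a, *q = &b;
        uintptr_t ip = 0, iq = 0;

        /* memcpy is guaranteed to copy the bits, unlike a cast.
           (If uintptr_t is wider than a pointer, the extra bytes
           stay zero.) */
        memcpy(&ip, &p, sizeof p);
        memcpy(&iq, &q, sizeof q);

        uintptr_t product = ip * iq;  /* plain integer multiplication
                                         of the two bit patterns */

        int *r;
        memcpy(&r, &product, sizeof r);  /* bits back into a pointer */
        (void)r;  /* r is not a valid pointer; using it in any way
                     invokes undefined behavior */

        printf("the \"product\": 0x%lx\n", (unsigned long)product);
        return 0;
    }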
All of that is fair enough, and it's all I can reasonably expect. After
all, the compiler won't hold my hand, will it?
memcpy() seems like a somewhat sideways means of achieving this, but
its semantics are obvious, whereas the semantics of multiplying two
pointers would not be.