kuyper
Joe said: No, they can't. The bits of a byte are ordered as they are. The bit
order cannot change between int and char.
Citation please?
I don't see anything in the standard that requires the value bits of
any two unrelated integer types to be in the same order. It's certainly
feasible, though very expensive, for an implementation to represent
'int' values using the bits within each byte in the reverse of the
order in which those bits would be interpreted as unsigned char. Such
an implementation would be very unnatural, but it would be perfectly
feasible, and could be done in a way that's perfectly conforming. If
you can find a clause in the standard prohibiting such an
implementation, please cite it.
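
To be concrete, here's a minimal sketch (mine, purely for
illustration) of the one form of inspection the standard does
guarantee: reading an object's bytes through unsigned char.

#include <stdio.h>

int main(void)
{
    int n = 0x1234;
    unsigned char *p = (unsigned char *)&n;

    /* Reading the object representation one byte at a time
       through an unsigned char pointer is well-defined. */
    for (size_t i = 0; i < sizeof n; i++)
        printf("byte %zu: %02X\n", i, (unsigned)p[i]);

    return 0;
}

Nothing this prints tells you how the value bits of int map onto the
bits of each byte; that mapping is precisely the detail the standard
leaves open.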
It would be much more plausible at the hardware level: I think it would
be quite feasible to design a chip where instructions that work on
2-byte words interpret the values of bits in precisely the reverse
order from the way they're interpreted by instructions that work on
one byte at a time. I can't come up with any good reason to do so, but
I suspect it could be done fairly efficiently, achieving almost the
same speeds as more rationally designed hardware.
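
To make that concrete, here's a toy model in C (entirely mine; no real
chip that I know of works this way) of how a word instruction on such
hardware would assign values: it sees the same two bytes that the
one-byte instructions see, but reads the bits of each byte in reverse
order.

#include <stdio.h>

/* Reverse the order of the 8 bits in a byte. */
static unsigned reverse_bits(unsigned char b)
{
    unsigned r = 0;
    for (int i = 0; i < 8; i++)
        r = (r << 1) | ((b >> i) & 1u);
    return r;
}

/* Hypothetical: the value a 2-byte word instruction on this
   imaginary chip would compute from the bytes (hi, lo) as the
   one-byte instructions interpret them. */
static unsigned word_value(unsigned char hi, unsigned char lo)
{
    return (reverse_bits(hi) << 8) | reverse_bits(lo);
}

int main(void)
{
    /* The bytes 0x12 and 0x34 would form 0x1234 on a
       conventional big-endian chip; here they form 0x482C. */
    printf("%04X\n", word_value(0x12, 0x34));
    return 0;
}

The point of the toy is only that the mapping from bit patterns to
values is a choice; consistent word arithmetic could be built on
either one.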
The point isn't that there's any good reason to do this; I can't come
up with one. The point is that the standard deliberately leaves such
details unspecified. I believe that the people who wrote the standard
worked on the principle that it should avoid specifying anything it
doesn't have a pressing need to specify. That makes it possible to
implement C on a wide variety of platforms, including ones using
technologies that didn't even exist when the standard was first
written. Can you think of any reason why the standard should specify
that unrelated integer types order their bits within each byte the same
way?