pete wrote, replying to an earlier post:

> The standard guarantees CHAR_BIT >= 8.
> That is, the world outside the C continuum assumes
> char = 8 bits.
> But hardware designers are sometimes crude and build machines
> that have no way to address fewer than
> 9, 12, 16, 32, or more bits. As C is designed to be
> efficient, the implementation then defines CHAR_BIT as the smallest
> number of bits the machine can address in its memory.
> As external chars are commonly exactly 8 bits wide, the
> implementation will convert between the 8 external bits and the
> internal char width, and back. The result is that your
> 32-bit char has 24 bits unused (or sign-extended, in the case of
> signed char).
> So when your implementation tells you that sizeof(char) == sizeof(int)
> == sizeof(long) == 1, then your
> - unsigned char uses 8 bits,

If sizeof(long) equals one,
then unsigned char uses at least 32 bits.
If sizeof(long) equals one,
then UCHAR_MAX is at least 0xffffffff.
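
To see where a given implementation stands, here is a minimal probe
(a sketch assuming only a hosted C99 implementation; every value it
prints except sizeof(char) is implementation-defined):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("CHAR_BIT     = %d\n",  CHAR_BIT);
    printf("sizeof(char) = %zu\n", sizeof(char));   /* 1 by definition */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    printf("UCHAR_MAX    = %lu\n", (unsigned long)UCHAR_MAX);
    return 0;
}

The reasoning behind the correction: long must have at least 32 value
bits, and sizeof counts in chars, so sizeof(long) == 1 forces
CHAR_BIT >= 32 and therefore UCHAR_MAX >= 0xffffffff.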

> the sign bit ignored

What sign bit?
unsigned char doesn't have a sign bit.
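
A short sketch of why the question makes no sense for unsigned char
(hypothetical example; it relies only on unsigned arithmetic being
defined to wrap):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned char c = UCHAR_MAX;   /* all bits set: a value, not -1 */
    c = c + 1;                     /* wraps to 0 modulo UCHAR_MAX + 1 */
    printf("%u\n", (unsigned)c);   /* prints 0 */
    return 0;
}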

> and the unused bits zeroed (or set machine-dependent)

What?
Are you still talking about unsigned char?
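
For unsigned char there are no "unused" bits at all: the standard
(C99 6.2.6.2) gives it a pure binary representation with no padding
bits, so UCHAR_MAX is exactly 2^CHAR_BIT - 1 and there is nothing left
over to zero or set. A sketch that checks this on any conforming
implementation (hypothetical example):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Set all CHAR_BIT bits one at a time; with no padding bits,
       the result must equal UCHAR_MAX exactly. */
    unsigned char c = 0;
    int i;
    for (i = 0; i < CHAR_BIT; i++)
        c = (unsigned char)((c << 1) | 1);
    printf("%d\n", c == UCHAR_MAX);   /* prints 1 */
    return 0;
}

Signed char is the different case: it has CHAR_BIT - 1 value bits plus
a sign bit, which is where the quoted sign-extension remark would apply.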