Christian Kandeler
Hi,
consider the following program:
#include <stdio.h>

int main(void)
{
    struct test {
        unsigned int x : 1;
    } test;
    test.x = 1;
    printf("%lu\n", (unsigned long) (test.x << 31));
    return 0;
}
On a platform with 64-bit longs and 32-bit ints, this prints
18446744071562067968, i.e. a number that has the upper 33 bits set to 1.
This stunned me at first, but I think I have now figured out what happens:
(1) Because the bit-field is only one bit wide, all of its values fit into a
signed int, so integer promotion converts test.x to one.
(2) Therefore, the result of the shift operation is a signed int, too.
(3) Since the resulting value is negative on this platform, converting it to
unsigned long adds ULONG_MAX + 1 to it, yielding the value mentioned above.
Is this correct?
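As a sanity check on step (3), here is a minimal sketch (assuming a 32-bit
two's-complement int and a 64-bit unsigned long, as on my platform) that
performs the conversion from step (3) in isolation:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* On this platform, 1 << 31 lands in the sign bit, giving INT_MIN */
    int shifted = INT_MIN;
    /* Converting the negative value to unsigned long adds ULONG_MAX + 1 */
    printf("%lu\n", (unsigned long) shifted);
    return 0;
}

With those assumptions, this prints 18446744071562067968, the same value as
above.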
If it is, the (unwanted) sign extension is the result of (1), which converts
the unsigned bit-field to a signed int. This could then easily be avoided
by casting the bit-field to an unsigned int before the shift. However, the
resulting program
#include <stdio.h>

int main(void)
{
    struct test {
        unsigned int x : 1;
    } test;
    test.x = 1;
    printf("%lu\n", (unsigned long) ((unsigned int) test.x << 31));
    return 0;
}
still prints the same value with gcc 3.3.3. All other compilers I have tried
(including gcc 4) print 2147483648, as I had originally expected. Is my
assumption correct that gcc 3 is wrong here? Or am I overlooking something,
and the behavior is actually implementation-defined?
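Incidentally, a sketch of a workaround that should sidestep the promotion on
any of these compilers is to widen to unsigned long before the shift, so that
no signed intermediate ever appears:

#include <stdio.h>

int main(void)
{
    struct test {
        unsigned int x : 1;
    } test;
    test.x = 1;
    /* The cast widens first, so the shift happens in unsigned long */
    printf("%lu\n", (unsigned long) test.x << 31);
    return 0;
}

But I would still like to understand whether the original program is
well-defined.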
Thanks,
Christian