Gurus -
Can anyone explain the behavior here?
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned char c = 0;
    printf("c = %u, ~c = %u\n", c, ~c);
    return 0;
}
The output of the above code (compiled with gcc on a Linux box) is:
c = 0, ~c = 4294967295
The value printed for ~c is (2^32) - 1.
Isn't an unsigned char 8 bits long?
If that's the case, shouldn't the value of ~c be (2^8) - 1 == 255?
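In case it helps, here is a small variation I put together (the sizeof check and the cast back to unsigned char are my additions; I'm guessing that c being promoted to int before ~ is applied is what matters):

#include <stdio.h>

int main(void)
{
    unsigned char c = 0;

    /* Confirm that unsigned char is one byte on this box. */
    printf("sizeof c          = %zu\n", sizeof c);

    /* c is promoted to int before ~ is applied, so the result is
       int-sized: on a 32-bit int, all bits set prints as 4294967295. */
    printf("~c                = %u\n", (unsigned)~c);

    /* Casting back to unsigned char keeps only the low 8 bits. */
    printf("(unsigned char)~c = %u\n", (unsigned)(unsigned char)~c);

    return 0;
}

If the promotion story is right, this should print 1, 4294967295, and 255.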
I'm confused.
Thanks,
OmiSols