Christian Christmann said:
I would like to hear your opinion about the C standards.
When I have a
signed char a = 130;
I explicitly generate an overflow, since the maximum value a signed char
can accommodate is 127. Is this overflow behavior defined in any way, or
do I get an undefined state of "a" according to the C standard?
Assuming SCHAR_MAX==127 (which is common but not universal), the
expression 130 is of type int, so it's implicitly converted to type
signed char. To find out what happens, we have to read C99 6.3.1.3,
which covers conversion of signed and unsigned integers:
    When a value with integer type is converted to another integer
    type other than _Bool, if the value can be represented by the
    new type, it is unchanged.

    Otherwise, if the new type is unsigned, the value is converted
    by repeatedly adding or subtracting one more than the maximum
    value that can be represented in the new type until the value is
    in the range of the new type.

    Otherwise, the new type is signed and the value cannot be
    represented in it; either the result is implementation-defined
    or an implementation-defined signal is raised.
The third paragraph applies here; either an implementation-defined
value is assigned to a, or an implementation-defined signal is raised.
(The wording about signals is new in C99.)
It's likely that the implementation-defined value will be -126 (on a
typical two's-complement implementation, 130 wraps to 130 - 256 = -126),
but the standard doesn't guarantee that.
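For concreteness, here is a minimal sketch, assuming an 8-bit two's-complement
signed char with SCHAR_MAX == 127 (typical, but not required); the printed
values are what such an implementation is likely to produce, not what the
standard guarantees:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* 130 has type int; initializing a signed char converts it.
           Since 130 > SCHAR_MAX here, the result is implementation-defined
           (or an implementation-defined signal is raised, per C99). */
        signed char a = 130;

        /* Likely output on such an implementation:
           a = -126, SCHAR_MAX = 127 */
        printf("a = %d, SCHAR_MAX = %d\n", a, SCHAR_MAX);
        return 0;
    }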
The rules are different for conversions and for arithmetic operations.
If an arithmetic operation overflows for a signed type, the behavior
is undefined.
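To illustrate the distinction, a small sketch (the conversion result noted
in the comment is an assumption about a typical two's-complement
implementation, not a guarantee):

    #include <limits.h>

    int main(void)
    {
        /* Conversion of an out-of-range value to a signed type:
           implementation-defined result or signal, but not undefined
           behavior. Likely 44 on a two's-complement system (300 - 256). */
        signed char c = (signed char)300;

        /* Arithmetic overflow of a signed type: undefined behavior.
           The line below must not be executed; it is shown commented out. */
        /* int bad = INT_MAX + 1; */

        (void)c;
        return 0;
    }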