pemo
As far as I understand it, a trap representation [TR] means something like
this: an uninitialised automatic variable might /implicitly/ hold a
bit-pattern that, if read, *might* cause a 'trap' (I'm not sure what 'a
trap' actually means here - anyone?).
I also read into this that an *initialised* automatic variable may never
hold a bit pattern that might, when read, cause a 'trap'. I.e., if an auto
is explicitly initialised, it can *never* hold a trap value, no matter how
it's been initialised - right?
So, firstly, I'd like to know whether my interpretations of the standard
are correct here - allowing for the fact that my phrasing and choice of
words may not be precise (words are always contentious to a degree, of
course)?
Now, this second bit builds on my first bit. I think!
It seems to me that, for example, a char cannot possibly contain an
implicit set of bits that could cause a TR - or am I just not considering
some possible alphabet/possible machine where this is feasible [and, if I
could, is it really worth worrying about?]? For example, consider this code:
// ex1.
char x;
int i = x;
As 'x' is read, the C99 standard says that it may contain a TR - however,
given that a char is eight bits (CHAR_BIT is always 8, right?), and that
every one of the 256 possible bit patterns that could 'accidentally' be
sitting in 'x' corresponds to a valid character value, 'x' can *never* hold
a set of bits that could cause a trap. Right?
Now, *if* there were such a set of bits, it seems to me that *if* an auto
could *always* contain that set by default, that would be *great* - it
would prevent a lot of bugs like the one in ex1 above. But given my
example, this doesn't seem possible - all possible bit patterns are legal
here - and the only way of knowing that the bits in 'x' are 'wrong' is to
know that 'x' wasn't explicitly initialised. Building on that, every C
compiler I've ever used issues a warning along the lines of "'x' is used
uninitialised" - if such a diagnostic is *required* by the C99 standard,
then traps should never occur - if, of course, you're paying attention to
the warnings your compiler issues!
Also, if it were possible [to always trap in such a situation], it would
require some runtime checking, right - either by the OS, or by the
compiled code itself? And the latter seems to go against a bit of the C
ethos as I understand it, i.e., the compiler doesn't check [at compile
time], nor does it generate code that checks at runtime - a C compiler
assumes that you know what you're doing, and that you'll be ever diligent
when you use it [the C language]?
Lastly, am I right in thinking that TRs would simply /go away/ *if*
compilers/the standard *mandated* that every automatic be initialised -
whether it be a struct, union, or whatever? Does such a constraint seem
like something that a /later/ incarnation of the C standard might impose -
and is the ground being prepared via the introduction of TRs?
Oh - OK, there's another 'lastly': what was the rationale/driving force
behind putting TRs into the standard - does anyone here know, or is this
part a question for comp.std.c?