You seem to be implying that all-bits-0 cannot be a trap representation
for any integer type. If I am misunderstanding, please clarify.
Otherwise, I'd be interested in how you justify this claim. My reading
of the standard so far strongly suggests that any non-character integer
type can have padding bits, and that those bits can be used to form a
trap representation. Furthermore, the standard does not specify what
values those padding bits take in any circumstance, and as far as I've
seen it nowhere says that all-padding-bits-0 (with the value bits as
don't-cares) cannot be a trap representation.
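
As an aside, it's easy to check whether a given implementation actually
uses padding bits in, say, unsigned int, because UINT_MAX has every
value bit set and no padding bits. This is only a sketch, not a
conformance test; on every implementation I'm aware of it reports zero
padding bits:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Counting the set bits of UINT_MAX yields the number of value
       bits in unsigned int; anything left over out of the object's
       CHAR_BIT * sizeof bits must be padding. */
    unsigned object_bits = (unsigned)(CHAR_BIT * sizeof(unsigned int));
    unsigned value_bits = 0;
    unsigned int max = UINT_MAX;

    while (max != 0) {
        value_bits += max & 1u;
        max >>= 1;
    }
    printf("object bits : %u\n", object_bits);
    printf("value bits  : %u\n", value_bits);
    printf("padding bits: %u\n", object_bits - value_bits);
    return 0;
}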
The standard does not explicitly guarantee this, but any
implementation that treated all-bits-zero as a trap
representation would never see the light of day, if only
because of the common practice of:
int array[2];
memset(array, 0, sizeof array);  /* requires <string.h> */
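
To make the point concrete, here's that idiom as a complete program.
On a hypothetical implementation where all-bits-zero were a trap
representation for int, merely reading array[0] below would be
undefined behavior; in practice this prints "0 0" everywhere:

#include <stdio.h>
#include <string.h>

int main(void)
{
    int array[2];

    /* Sets every byte, and therefore every bit, of both ints to 0. */
    memset(array, 0, sizeof array);
    printf("%d %d\n", array[0], array[1]);
    return 0;
}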