yeti said:
Sorry, in fact the standard requires computers to have deterministic
behaviour. There would be no standard if demons flew from my nose.
Within your own logic: if, for a particular code situation, demons
-always- flew from your nose, then that would be deterministic,
and thus within the bounds of what you claim about the Standard.
But your claim that the Standard requires computers to have
deterministic behaviour is doubtful. The ISO C89 Rationale
(not normative) indicates:
Undefined behavior gives the implementor license to not catch
certain program errors that are difficult to diagnose. It also
identifies areas of possible conforming language extension:
the implementor may augment the language by providing a definition
of the officially undefined behavior.
Thus, for any particular "undefined behavior", the implementor could
choose to define the behavior as being non-deterministic, and that
would be within the scope of allowed designs.
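To make that concrete, here is a minimal sketch of my own (not
anything taken from the Standard or the Rationale): signed integer
overflow is one such undefined behaviour, and an implementation is
free to document a meaning for it -- GCC's -fwrapv option, for
example, defines it as two's-complement wraparound -- while another
implementation could just as legitimately leave the result
unpredictable.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    volatile int n = INT_MAX;

    /* Signed integer overflow is undefined behaviour in ISO C.  An
     * implementation may document a meaning for it (e.g. GCC with
     * -fwrapv wraps to INT_MIN), may trap, or may give a result
     * that varies from run to run -- all are conforming choices.  */
    n = n + 1;

    printf("INT_MAX + 1 gave %d on this implementation\n", n);
    return 0;
}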
Also, if the program does something like overwrite the end of
a local variable, and in so doing bashes control information being
used by the implementation, then flow of control could end up
directed to any point in memory, through any -possible- machine
instruction sequence. That instruction sequence could include
a machine-level instruction that was defined non-deterministically,
or even one that was not well defined at all and whose operation
turned out to depend upon random transient voltages.
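A minimal sketch of the kind of overrun being described (the array
bound, the overrun size, and the effect are all illustrative; what
actually gets clobbered depends entirely on the implementation's
stack layout):

#include <stdio.h>

/* Stores 'count' ints into a 4-element local array.  Any count
 * above 4 writes past the end of the array -- undefined behaviour
 * which, on a typical stack-based implementation, can land on
 * control information such as the saved return address.          */
static void fill(int count)
{
    int buf[4];
    int i;

    for (i = 0; i < count; i++)
        buf[i] = 0;

    printf("stored %d ints, buf[0] = %d\n", count, buf[0]);
}

int main(int argc, char **argv)
{
    (void)argv;
    /* Stays in bounds by default; pass any argument to force the
     * overrun and see what your implementation happens to do.    */
    fill(argc > 1 ? 64 : 4);
    return 0;
}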
Similar things can happen if function pointers are abused --
remember the old trick of taking an array of integers, casting it
to a function pointer, and invoking it. If you believe that the
Standard actively disallows the implementation from ever branching
into arbitrary code, then we will be wanting Chapter and Verse
(C&V): the sections of the Standard that you believe nail down
the behaviour.
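And a sketch of that old trick (the array contents here are just
placeholder zeros, not real machine code for any CPU; both the
pointer conversion and the call are outside what the Standard
defines):

#include <stdio.h>

int main(int argc, char **argv)
{
    /* Placeholder values -- in the old trick these would be the
     * encoding of actual machine instructions for the target CPU. */
    static unsigned int code[4] = { 0u, 0u, 0u, 0u };

    /* Converting an object pointer to a function pointer and
     * calling through it is not defined by the C Standard;
     * whatever the CPU does with the bits at that address is what
     * you get.                                                    */
    void (*fp)(void) = (void (*)(void))(void *)code;

    (void)argv;
    if (argc > 1)
        fp();   /* very likely a crash, but anything may happen */
    else
        printf("pass any argument to call the array as code\n");

    return 0;
}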