sprintf


CBFalconer

yeti said:
Sorry, in fact the standard requires computers to have deterministic
behaviour. There would be no standard if demons flew from my
nose. All lines in the standard would shrink to "How am I supposed
to know? Go ask your computer."

while (!(near_an_insane_bomb_laden_jihadist())) continue;
puts("\aDeadly insult to Mohammed and Allah\a");

exhibits undefined behaviour and free will.
 

yeti

CBFalconer said:
while (!(near_an_insane_bomb_laden_jihadist())) continue;
puts("\aDeadly insult to Mohammed and Allah\a");

exhibits undefined behaviour and free will.

Unfortunately NO. near_an_insane_bomb_laden_jihadist() will never show
unpredictable behaviour.
Computers (as of now) always have predictable behaviour (at least in
theory... or else we would be in trouble; think of an autopilot).
Ever heard a DFA generate a true random number? I haven't.
"Free will", if it exists at all, is not yet suitable for
computers.
And perhaps it would be beneficial for the serenity of this group if we
refrain from remarks or code (for code is just another way to express oneself)
which would not be taken in good taste by a lot of people. I hope you
understand.
 

yeti

Walter said:
Within your own logic: if for a particular code situation, demons
-always- flew from your nose, then that would be deterministic,
and thus within the bounds of what you claim about the Standard.
Ah, the futility of reason. Well, my mistake. I thought "flying of demons
from my nose" was rather an improbable event, and that if it did happen, the
behaviour would not be deterministic, for demons at least may have
mood swings.
But your claim that the Standard requires computers to have
deterministic behaviour is doubtful. The ISO C89 Rationale
(not normative) indicates,

Undefined behavior gives the implementor license to not catch
certain program errors that are difficult to diagnose. It also
identifies areas of possible conforming language extension:
the implementor may augment the language by providing a definition
of the officially undefined behavior.
Where do the lines above say anything about the behaviour of the computer? It is
a basic assumption for all programming languages (including assembly and
machine code) that computers should have perfectly predictable
behaviour.
Without this assumption you won't be sure what a computer would do after
you gave it an instruction to execute.
And the words ".. by providing a definition of the officially
undefined behavior .." in fact suggest that what is left undefined by
the standard need not be inherently unpredictable, for a particular
implementation can define it anyway.
Thus, for any particular "undefined behavior", the implementor could
choose to define the behavior as being non-deterministic, and that
would be within the scope of allowed designs.

Also, if the program does something like overwrites the end of
a local variable, and so doing bashes control information being
used by the implementation, then flow of control could end up
directed to any point in memory, through any -possible- machine
instruction sequence. That instruction sequence could include
invoking a machine-level instruction that was defined
non-deterministically, or even a machine-level instruction that
was not well defined and whose operation turned out to depend upon
random transient voltages. Similar things can happen if function
pointers are abused -- remember the old trick of taking an
array of integers and casting that to a function pointer and invoking
that. If you believe that the Standard actively disallows the
implementation from ever branching into arbitrary code, then we
will be wanting Chapter and Verse (C&V) of the sections of the
Standard that you believe nails down the behaviour.
LoL... the behaviour will still be perfectly predictable (if not easy to
predict). If you take the same computer with the same initial state and run
the same crap code which messes up the control information, then you
WILL end up with the same sequence of instructions and output EVERY TIME.
 

jaysome

Ah, the futility of reason. Well, my mistake. I thought "flying of demons
from my nose" was rather an improbable event, and that if it did happen, the
behaviour would not be deterministic, for demons at least may have
mood swings.

Where do the lines above say anything about the behaviour of the computer? It is
a basic assumption for all programming languages (including assembly and
machine code) that computers should have perfectly predictable
behaviour.
Without this assumption you won't be sure what a computer would do after
you gave it an instruction to execute.
And the words ".. by providing a definition of the officially
undefined behavior .." in fact suggest that what is left undefined by
the standard need not be inherently unpredictable, for a particular
implementation can define it anyway.

LoL... the behaviour will still be perfectly predictable (if not easy to
predict). If you take the same computer with the same initial state and run
the same crap code which messes up the control information, then you
WILL end up with the same sequence of instructions and output EVERY TIME.

I think you're right, but you may be erroneously equating
"predictable" behavior with "defined" behavior. The C Standard is
perfectly fine with predictable, undefined behavior (it's even
perfectly fine with unpredictable, undefined behavior, which I think
is what you are arguing against, and within the context of your
arguments, as far as I can tell, I agree with you).

Undefined behavior is undefined behavior, and it should be avoided if
you aspire to be a C programmer, especially if you put these kinds of
bullets in your resume:

-- expert at ANSI C

Best Regards
 

Charlton Wilbur

y> Unfortunately NO. near_an_insane_bomb_laden_jihadist() will
y> never show unpredictable behaviour. Computers (as of now)
y> always have predictable behaviour (at least in theory... or
y> else we would be in trouble; think of an autopilot). Ever
y> heard a DFA generate a true random number? I haven't. "Free
y> will", if it exists at all, is not yet suitable for
y> computers.

You're not understanding what the Standard means by "undefined
behavior." It doesn't mean that *any given* compiler will produce
unpredictable behavior; it means that the behavior of the computer
when it encounters that particular bit of code is defined by the
implementation, and there is no guarantee at all about what the
behavior will be.

For instance, I can know with an absolute certainty what one compiler
will do if I say i = i++ * i++; but that knowledge is limited to that
compiler. Another compiler, written by someone more pedantic and
hostile, might take advantage of flaws in the hardware to make the
computer catch on fire when it encounters that code. Both of these
would be acceptable behavior according to the C standard.

y> And perhaps it would be beneficial for the serenity of this group
y> if we refrain from remarks or code (for code is just another
y> way to express) which would not be taken in good taste by a lot
y> of people. I hope you understand.

It is most beneficial for the serenity of this group for people to not
post as experts when they are demonstrably wrong and then not continue
to post defensive responses when their error is pointed out.
Remaining in good political taste is a distant second.

Charlton
 

Keith Thompson

Charlton Wilbur said:
y> Unfortunately NO. near_an_insane_bomb_laden_jihadist() will
y> never show unpredictable behaviour. Computers (as of now)
y> always have predictable behaviour (at least in theory... or
y> else we would be in trouble; think of an autopilot). Ever
y> heard a DFA generate a true random number? I haven't. "Free
y> will", if it exists at all, is not yet suitable for
y> computers.

You're not understanding what the Standard means by "undefined
behavior." It doesn't mean that *any given* compiler will produce
unpredictable behavior; it means that the behavior of the computer
when it encounters that particular bit of code is defined by the
implementation, and there is no guarantee at all about what the
behavior will be.

Correct, but the phrase "defined by the implementation" is a bit
problematic. The standard uses the word "defined" to refer to
something that must be documented. There is a class of behavior
called "implementation-defined behavior", defined as

unspecified behavior where each implementation documents how the
choice is made

"Unspecified behavior" is

use of an unspecified value, or other behavior where this
International Standard provides two or more possibilities and
imposes no further requirements on which is chosen in any instance

For completeness, the definition of "undefined behavior" is

behavior, upon use of a nonportable or erroneous program construct
or of erroneous data, for which this International Standard
imposes no requirements

The implementation needn't document what happens, and the behavior
needn't be consistent or even deterministic. You can argue that all
computer behavior must be deterministic on some level, and it may be
so for many systems, but the C standard does not require this. A
computer system might, for example, provide a device that generates
truly random (not pseudo-random) data, and undefined behavior can
legally depend on the output of such a device. (Note that the rand()
function can't use truly random data.) Perhaps more realistically, it
can depend on the content and timing of keyboard input, which can
depend on anything you like.
 

Charlton Wilbur

KT> The implementation needn't document what happens, and the
KT> behavior needn't be consistent or even deterministic.

True; I usually think in terms of programs where the source is
available, and thus the source is a sort of documentation. Obviously
this colors my thinking.

Charlton
 

Keith Thompson

Charlton Wilbur said:
KT> The implementation needn't document what happens, and the
KT> behavior needn't be consistent or even deterministic.

True; I usually think in terms of programs where the source is
available, and thus the source is a sort of documentation. Obviously
this colors my thinking.

Do you mean the source of the implementation (e.g., of the compiler)?
The source of the application won't do you any good in determining how
the program will behave in cases of undefined behavior. (You probably
know that; I just wanted to make sure.)

Incidentally, your non-standard quoting style is ok, but a bit
verbose. You might consider just using attribution lines and "> "
prefixes like the rest of us do. Or at least drop the whitespace in
front of the "KT> " tags. Thanks.
 

Nelu

Charlton Wilbur said:
KT> The implementation needn't document what happens, and the
KT> behavior needn't be consistent or even deterministic.

True; I usually think in terms of programs where the source is
available, and thus the source is a sort of documentation. Obviously
this colors my thinking.

I don't think the source will do you any good (or at least not in all
cases). For example, free() releases memory allocated with malloc(), calloc(),
or realloc(). If those functions just forward requests to
the operating system, then when you free memory that was not allocated with
those three functions, the behavior may be dictated not by the compiler but
by the operating system.
 

Ben Pfaff

Nelu said:
I don't think the source will do you any good (or at least not in all
cases). For example, free() releases memory allocated with malloc(), calloc(),
or realloc(). If those functions just forward requests to
the operating system, then when you free memory that was not allocated with
those three functions, the behavior may be dictated not by the compiler but
by the operating system.

That just means you need source for the operating system.
 

Walter Roberson

yeti said:
LoL... the behaviour will still be perfectly predictable (if not easy to
predict). If you take the same computer with the same initial state and run
the same crap code which messes up the control information, then you
WILL end up with the same sequence of instructions and output EVERY TIME.

You are basically arguing the mechanistic view of the universe --
that if you just knew the state of everything precisely enough, that
you could predict -exactly- what would happen. That view has
been disproven by quantum physics experiments. And basically,
every modern fast computer is a quantum physics experiment -- modern
chip designers put a lot of work into reducing the influence of
random quantum behaviour (except where they -want- random quantum
behaviour.)

You also have a restricted view of what a computer is, and of how
computing is performed. There are, for example, biological computers,
in which starting and ending protein hooks are inserted into vats of
proteins, allowing the answer to self-assemble biologically. And qubit
(quantum bit) based computers are actively being worked on;
unfortunately they aren't particularly stable as yet.

It is a
basic assumption for all programming languages (including assembly and
machine code) that computers should have perfectly predictable
behaviour.
Without this assumption you won't be sure what a computer would do after
you gave it an instruction to execute.

You are ignoring multiprocessor computers: when you have multiple
processors, the relative order that things happen in becomes
indeterminate, with one run seldom being like the next; nanosecond
differences in interrupts determine which processor gets a shared
resource first, and the happenstance orders *do* propagate into
the problem as a whole.

You are also not taking into account network computers. When
computer X sends a message to computer Y, the timing with which
Y receives the message is variable, and it is trivial to construct
examples in which the timing variations trigger different actions.
Y might not receive the message at all. Y might receive two (or
more) copies of the message. Y might sometimes be able to detect
that something went missing (or is taking the slow boat), but
for some protocols Y won't be able to tell. Y might request
a resend -- that's different behaviour, not predictable by X.
Possibly everything will be straightened out by the time you
get to the application layer, but there are important applications
(such as video broadcasting) in which losses are expected and
a lot of work goes into making the data stream robust "enough"
for use.

And you are not taking into account that some cpus have
(deliberate) random-number generators, and that those random
numbers are important to a variety of activities, including
various kinds of security (or even just making a computer
game less predictable.)

Non-determinism happens in a wide variety of circumstances
in computing, and there are different strategies for dealing
with it, down to chip doping strategies and up to the application
layers.

When the C standard says that something has undefined behaviour and
that the implementation can define that behaviour if it wants, the
standard *does* mean to include the possibility that the implementation
will behave non-deterministically. There is a different kind of
behaviour, "unspecified behavior" if my mind isn't too asleep yet, for
which the standard does not say exactly what happens, but for which the
implementation is not allowed to cause the program to fail. For
undefined behaviour, the results really are subject to change without
notice.
 

yeti

Charlton said:
y> Unfortunately NO. near_an_insane_bomb_laden_jihadist() will
y> never show unpredictable behaviour. Computers (as of now)
y> always have predictable behaviour (at least in theory... or
y> else we would be in trouble; think of an autopilot). Ever
y> heard a DFA generate a true random number? I haven't. "Free
y> will", if it exists at all, is not yet suitable for
y> computers.

You're not understanding what the Standard means by "undefined
behavior." It doesn't mean that *any given* compiler will produce
unpredictable behavior; it means that the behavior of the computer
when it encounters that particular bit of code is defined by the
implementation, and there is no guarantee at all about what the
behavior will be.

For instance, I can know with an absolute certainty what one compiler
will do if I say i = i++ * i++; but that knowledge is limited to that
compiler. Another compiler, written by someone more pedantic and
hostile, might take advantage of flaws in the hardware to make the
computer catch on fire when it encounters that code. Both of these
would be acceptable behavior according to the C standard.
Thank you, and I certainly appreciate the pedagogic value of what you
have written. But I'm sorry, it doesn't say anything which I don't know.
The whole argument started with people (correctly) saying that the
behaviour was left "undefined" by the C standard, and I asked a simple
question: "should we leave it at that?". Does saying that "the C standard
doesn't have an answer" answer the question? Some code is behaving in
a particular way, and there has to be an explanation why it is doing so.
Yes, it may vary from one implementation to another. But everywhere it
will have a rational explanation. Even if it sets your computer on
fire, there still will be an explanation of "why" and "how" it did so.
y> And perhaps it would be beneficial for the serenity of this group
y> if we refrain from remarks or code (for code is just another
y> way to express) which would not be taken in good taste by a lot
y> of people. I hope you understand.

It is most beneficial for the serenity of this group for people to not
post as experts when they are demonstrably wrong and then not continue
to post defensive responses when their error is pointed out.
Remaining in good political taste is a distant second.
Well, I just hoped that you would understand; it doesn't matter if you
didn't.
 

yeti

Walter said:
You are basically arguing the mechanistic view of the universe --
that if you just knew the state of everything precisely enough, that
you could predict -exactly- what would happen. That view has
been disproven by quantum physics experiments. And basically,
every modern fast computer is a quantum physics experiment -- modern
chip designers put a lot of work into reducing the influence of
random quantum behaviour (except where they -want- random quantum
behaviour.)
Now why do chip designers put a lot of effort into reducing the influence
of random quantum behaviour?
Perhaps to make things more deterministic, and that's what I stated.
You also have a restricted view of what a computer is, and of how
computing is performed. There are, for example, biological computers,
in which starting and ending protein hooks are inserted into vats of
proteins, allowing the answer to self-assemble biologically. And qubit
(quantum bit) based computers are actively being worked on;
unfortunately they aren't particularly stable as yet.



You are ignoring multiprocessor computers: when you have multiple
processors, the relative order that things happen in becomes
indeterminate, with one run seldom being like the next; nanosecond
differences in interrupts determine which processor gets a shared
resource first, and the happenstance orders *do* propagate into
the problem as a whole.
Indeterminate possibly is not the right word here.
You are also not taking into account network computers. When
computer X sends a message to computer Y, the timing with which
Y receives the message is variable, and it is trivial to construct
examples in which the timing variations trigger different actions.
Y might not receive the message at all. Y might receive two (or
more) copies of the message. Y might sometimes be able to detect
that something went missing (or is taking the slow boat), but
for some protocols Y won't be able to tell. Y might request
a resend -- that's different behaviour, not predictable by X.
Oh, I guess then nothing in the world is deterministic, for everything in
the world is composed of quantum particles which inherently have
non-deterministic behaviour. But what I may like to point out is that
all the non-determinism applies at the quantum level and not in the
macro world. The nice thing about nature is that all the random behaviour
at the sub-atomic level sums up into deterministic behaviour at the macro
level.
Possibly everything will be straightened out by the time you
get to the application layer, but there are important applications
(such as video broadcasting) in which losses are expected and
a lot of work goes into making the data stream robust "enough"
for use.

Again, why is it "straightened out"? Isn't it that we want things to
behave deterministically?
And you are not taking into account that some cpus have
(deliberate) random-number generators, and that those random
numbers are important to a variety of activities, including
various kinds of security (or even just making a computer
game less predictable.)
Now why exactly are special random number generators needed if CPUs
themselves behave non-deterministically?
Well, why talk of a "random number generator"? A human attached to the
computer may do the job as well.
But then this randomness would be externally generated and not
inherent to the computing device.
We need separate random number generators precisely because computers
themselves are unable to generate random numbers.
 

Gordon Burditt

exhibits undefined behaviour and free will.
y> Unfortunately NO. near_an_insane_bomb_laden_jihadist() will
y> never show unpredictable behaviour. Computers (as of now)
y> always have predictable behaviour (at least in theory... or
y> else we would be in trouble; think of an autopilot). Ever
y> heard a DFA generate a true random number? I haven't. "Free
y> will", if it exists at all, is not yet suitable for
y> computers.

You're not understanding what the Standard means by "undefined
behavior." It doesn't mean that *any given* compiler will produce
unpredictable behavior;

Although it *COULD* do exactly that. Some CPUs have hardware
random-number generators that use thermal noise to generate numbers
that, according to quantum theory, are truly random. (Intel
put this in some Pentium CPUs.) The compiler *could* generate code
to use these random numbers to determine which of many obnoxious
things to do, as if it had been written:

switch (_PentiumTrueRandomGenerator()) {
    case 1: abort(); break;
    case 0: break; /* do nothing */
    case 2: _ElectrifyKeyboardWith50000Volts(); break;
    ...
}

it means that the behavior of the computer
when it encounters that particular bit of code is defined by the
implementation, and there is no guarantee at all about what the
behavior will be.

For instance, I can know with an absolute certainty what one compiler
will do if I say i = i++ * i++; but that knowledge is limited to that
compiler. Another compiler, written by someone more pedantic and
hostile, might take advantage of flaws in the hardware to make the
computer catch on fire when it encounters that code. Both of these
would be acceptable behavior according to the C standard.

Another way of saying this is, no matter what it does, you have
no right to complain about it or try to collect on the warranty.
 

yeti

Keith said:
Correct, but the phrase "defined by the implementation" is a bit
problematic. The standard uses the word "defined" to refer to
something that must be documented. There is a class of behavior
called "implementation-defined behavior", defined as

unspecified behavior where each implementation documents how the
choice is made

"Unspecified behavior" is

use of an unspecified value, or other behavior where this
International Standard provides two or more possibilities and
imposes no further requirements on which is chosen in any instance

For completeness, the definition of "undefined behavior" is

behavior, upon use of a nonportable or erroneous program construct
or of erroneous data, for which this International Standard
imposes no requirements

The implementation needn't document what happens, and the behavior
needn't be consistent or even deterministic. You can argue that all
computer behavior must be deterministic on some level, and it may be
so for many systems, but the C standard does not require this. A
computer system might, for example, provide a device that generates
truly random (not pseudo-random) data, and undefined behavior can
legally depend on the output of such a device. (Note that the rand()
function can't use truly random data.) Perhaps more realistically, it
can depend on the content and timing of keyboard input, which can
depend on anything you like.
A random number generator or a human sitting in front are both
external entities.
And if the implementation makes use of the random numbers
generated either by the human or by a random number generator, there
would still be an explanation, taking into account that a random
number has been used.
Now I do understand that the C standard may not (for the behaviour left
"undefined") say anything about what will happen at runtime, but that
doesn't mean that we can't explain the runtime behaviour of the
program.
 

Richard Bos

Thank you, and I certainly appreciate the pedagogic value of what you
have written. But I'm sorry, it doesn't say anything which I don't know.

It clearly states something you may know in theory, but obviously don't
really understand, though.
The whole argument started with people (correctly) saying that the
behaviour was left "undefined" by the C standard, and I asked a simple
question: "should we leave it at that?".
Yes.

Does saying that "the C standard doesn't have an answer" answer the question?
Yes.

Some code is behaving in a particular way and there has to be an
explanation why it is doing so.

No. There doesn't even have to be an explanation.
Yes, it may vary from one implementation to another. But everywhere it
will have a rational explanation.

No.


Really. You're just fundamentally mistaken on this issue. I know that's
hard to grasp, but that's no reason to be stubborn in the face of better
experts than you or me.

Richard
 

yeti

Gordon said:
Although it *COULD* do exactly that. Some CPUs have hardware
random-number generators that use thermal noise to generate numbers
that, according to quantum theory, are truly random. (Intel
put this in some Pentium CPUs.) The compiler *could* generate code
to use these random numbers to determine which of many obnoxious
things to do, as if it had been written:

switch (_PentiumTrueRandomGenerator()) {
    case 1: abort(); break;
    case 0: break; /* do nothing */
    case 2: _ElectrifyKeyboardWith50000Volts(); break;
    ...
}



Another way of saying this is, no matter what it does, you have
no right to complain about it or try to collect on the warranty.
Certainly. But I am not complaining. I am just seeking an explanation
for the behaviour. And the explanation can be as simple as "the behaviour is
governed by the hardware random number generator".
 

yeti

Richard said:
It clearly states something you may know in theory, but obviously don't
really understand, though.


No. There doesn't even have to be an expl_a_nation.
Yes, there has to be, maybe not required by the C standard but required by
common sense.
No.


Really. You're just fundamentally mistaken on this issue. I know that's
hard to grasp, but that's no reason to be stubborn in the face of better
experts than you or me.
1. I do accept that the C standard leaves the behaviour (refer to the first
post) undefined.
2. I am arguing that there has to be a rational explanation for all
kinds of behaviour shown by a computer. It doesn't matter if the C standard
covers it or not.
3. Due to point 2, all behaviour (of computers), even that left
undefined (or defined by the implementation) in the C standard, has a
rational explanation.
4. I am no expert at computing and hence am trying to learn from the
people who are.
5. So far none of the posts here have been able to convince me.

PS: I'm sorry if you felt that I am being stubborn, but that's the way
it is.
 

Keith Thompson

Nelu said:
Right, but I was referring to the source code for the implementation.

The standard's definition of "implementation" is (C99 3.12):

particular set of software, running in a particular translation
environment under particular control options, that performs
translation of programs for, and supports execution of functions
in, a particular execution environment

If the OS "supports execution of functions", it's part of the
implementation.
 
