C Test Incorrectly Uses printf() - Please Confirm

Seebs

All I read that as is the value of 'a + 5' is not defined.

No, the *BEHAVIOR* is not defined.

Undefined behavior means the compiler is allowed to die screaming.

Undefined behavior means that, on a CPU where you get a trap signal
if you have two accesses to the same register without an intervening
nop or access to another register, the compiler is allowed to generate
two accesses to the same register without an intervening nop or access
to another register, because YOU SCREWED UP. Your code is wrong. The
compiler is under no obligation to make it work.
printf("%d\n", *++p, p == &a[0] ? 1 : 0);
This program will always print '4' on any C platform.

Not necessarily. I'd agree that it seems *likely*, but imagine an
architecture -- and such have existed -- where pointers are divided
into more than one part, and ++p is implemented as something to the
effect of:
extract both parts of p
modify p appropriately
zero storage register
mask in part 1
mask in part 2

The compiler is allowed to interleave this with the comparison, resulting
in an access to a pointer register that's in a state which causes a trap.

There have been real systems where simply loading the value of an
invalid pointer was a trap, even if you didn't dereference it.

Seriously! WE ARE NOT MAKING THIS STUFF UP TO BE RANDOM. The reason it's
undefined behavior is that there have been real historical systems where
VERY SURPRISING things happened. Well, surprising if you're trying to reason
out what the machine can do while assuming that it's an extremely simple
machine with no special states.

Consider:
a = (++a % 4);

I saw a machine where this generated a series of pretty much arbitrary
values which mostly had only one or two bits set, but could be as high as 513.

-s
 
Keith Thompson

Tom St Denis said:
I'm not arguing that one should write [nor a compiler accept] such
code. I'm merely speaking practically about how existing compilers
[and most processors] work.

Mostly I'm writing to hear the sound of my own keyboard. :)

And that special sound made by typing the command that posts the
article is the best of all, isn't it? :cool:}
 
Ike Naar

[snip]
Your explanation
gives the impression that there is a finite number of possible outcomes from this
particular code. (Which would place it under "unspecified", rather than
"undefined", as I recall.)

Good point. I should have added that the given list of interleavings
was not exhaustive.
Consider a CPU where "ar1" and "aw0" in your example are run in parallel.
Since this is perfectly legal if they refer to different objects, if the
standard were to require this to be defined (even if unspecified) behavior,
then the compiler might be forced to produce suboptimal code "just in case".
(Consider the case of two pointers, both pointing to the same int, which
cannot be detected at compile time.)

Indeed.
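
A minimal sketch of that two-pointer case (the function and the driver below are hypothetical, not from the thread):

#include <stdio.h>

/* Whether p and q refer to the same int generally cannot be known at
   compile time.  If they point to different objects the call below is
   well defined; if they alias, it violates 6.5p2.  Requiring a defined
   interleaving would force conservative code for every such call,
   "just in case". */
void show(int *p, int *q)
{
    printf("%d\n", ++*p, *q + 5);
}

int main(void)
{
    int a = 1, b = 1;
    show(&a, &b);        /* distinct objects: well defined, prints 2 */
    /* show(&a, &a); */  /* same object: undefined behaviour */
    return 0;
}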
 
Niklas Holsti

Kenneth said:
[ about printf("%d", ++a, a + 5); ]
'++a' and 'a + 5' are separate expressions.
The value of 'a' is intact until the sequence point.

...

In `` printf("%d", ++a, a + 5); '' the first sequence point, S0, is
before the statement starts. The next sequence point, S1, is before the
call to printf. In between those sequence points, the arguments
(the expressions ``"%d"'', ``++a'' and ``a+5'') are evaluated.
Several accesses to ``a'' will happen here: the ``a'' in ``++a''
will be read to obtain its previous value (event ar0),
the ``a'' in ``++a'' will be modified to store
the incremented value (event aw0) and ``a'' in ``a+5'' will be
read to compute the sum of ``a'' and ``5'' (event ar1).

Necessarily, ar0 must happen before aw0, because the value written
at aw0 depends on the value read at ar0.

Things can be more complex on machines where "int" is not an "atomic"
type, for example 8-bit microcontrollers. There, ar0 is two memory
accesses (low octet, high octet) and likewise aw0. To implement "++a",
these accesses can be interleaved per octet. For example, the access
order can be ar0(low octet), aw0(low octet), ar0(high octet), aw0(high
octet).

The accesses in ar1 can be interleaved with those accesses, so the value
read in ar1 could be a mixture of the new (incremented) value of the low
octet and the old value of the high octet.
Consider a CPU where "ar1" and "aw0" in your example are run in
parallel.

Interleaving octet accesses is a bit like that.
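
To put numbers on that, a small sketch (the values are hypothetical, chosen only to show the effect): with a == 0x00FF, the increment produces 0x0100, and a read interleaved per octet can observe either mixture.

#include <stdio.h>

/* Hypothetical illustration of a torn read of a 16-bit int on a machine
   that accesses it one octet at a time. */
int main(void)
{
    unsigned int old_a = 0x00FFu;    /* a before ++a          */
    unsigned int new_a = old_a + 1u; /* a after ++a: 0x0100   */

    /* ar1 catches the new low octet but the old high octet ... */
    unsigned int torn1 = (old_a & 0xFF00u) | (new_a & 0x00FFu);  /* 0x0000 */
    /* ... or the old low octet but the new high octet.          */
    unsigned int torn2 = (new_a & 0xFF00u) | (old_a & 0x00FFu);  /* 0x01FF */

    printf("possible torn reads: 0x%04X 0x%04X\n", torn1, torn2);
    return 0;
}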
 
spinoza1111

I took an online C test a few months ago. I actually thought the test
was better than some I've taken, but one question in particular I
think has the wrong answer. The question is this:

   What is printed:

   int a = 1;
   printf("%d", ++a, a + 5);

   a. 1
   b. 2
   c. 7
   d. undefined

I selected d. This is the explanation given as to why b is the correct
answer.

   The first expression in the parameter list following the format
string is paired with the first (and only) conversion specification.
The increment is a prefix so the result of the operation is the a + 1.
Since there are more items in the value list than there are conversion
specifications, the extra value is not shown.

I believe the correct answer is d because according to K&R2 (and by
implication the Standard) the order in which function arguments are
evaluated is not defined; in fact, K&R2's example, in Section 2.12,
shows the variable n being used twice in the same printf call (albeit
with the correct number of conversion specifications).

Am I correct that d is the correct answer?
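
(For what it's worth, a sketch of my own, not from the test or from K&R, showing that the order of evaluation of function arguments is left open even in code with no undefined behaviour:)

#include <stdio.h>

/* Hypothetical helpers with visible side effects. */
static int first(void)  { puts("first evaluated");  return 1; }
static int second(void) { puts("second evaluated"); return 2; }

int main(void)
{
    /* A conforming compiler may call first() and second() in either
       order; the final line is always "1 2", but the two puts lines
       may appear in either order. */
    printf("%d %d\n", first(), second());
    return 0;
}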

The test maker probably tried it out and on his platform the answer
was (b). But the error isn't his fault. It's that neither Kernighan
and Ritchie nor the C standards team had the courage to define C
semantics properly. They covered this up by appeals to pragmatism and
a strange sort of "freedom" (to be chained to a computer 24/7), but in
fact they served the objective needs of Wall Street capitalists who
didn't want to keep compiler developers on staff to make their
compilers conformant.
 
Eric Sosman

Right. The two possibilities are "something unexpected" and
"everything else."

"Undefined" means *undefined,* as in *not defined,* as in
*not defined at all, not limited or constrained in any way.* The
Standard washes its hands to the point of sterility, it walks away
to an infinite distance in zero time, and you are On Your Own. No
guarantees, no promises, no recourse, no no no no no.

All I read that as is the value of 'a + 5' is not defined. All the
work up to that point IS defined. Computing an expression like 'a +
5' can have no side effects in this application since it's either 6 or
7 which are both valid 'int' values [no overflow].

I get what you're saying that since the ENTIRE statement is not
composed of defined expressions it's rejected as UB, but if we speak
practically for a second, it's a throwaway statement that can't have
side effects [other than taking time to compute].

Have you ever heard the adage "Code to the interface, not to
the implementation?"

The "interface" here is the Standard, the agreement between user
and implementor. If you choose to ignore your responsibilities under
that agreement, fine -- it's your choice, sometimes your prerogative
and even your imperative to do so. But if the implementation then does
something unwelcome, you cannot whack the implementor upside the head
with the Standard you yourself have chosen to ignore. Bargains work
when both parties adhere to them; one-party bargains are empty.

So: When you appeal to "it's a throwaway statement" and to "it has
no side-effects" (as if that mattered) or "it's a pleasing color, with
soft tannins and medium body," you are deciding to abandon your side of
the bargain. Thereafter, you can no longer argue that "the Standard
requires" or "the Standard promises," because you've already rejected
the Standard. You can write "C" or you can write "Frobozz Magic C" --
both have their uses, but do not mistake either for the other.
In fact, since it's not even printed out it could just optimize out
the expression altogether.

I would therefore expect with every C compiler on this planet in
common use that the statement would print '2' as its output with
absolutely no undefined behaviour whatsoever. More so, I think this
is just a hole in the spec in which you have UB but it could be
defined [or at least mitigated] with a bit of logical deduction.

You seem to have forgotten that "undefined behavior" includes
"what I hoped would happen." Of course, it also includes "what I
hoped would happen, except when it really matters ..."

As for the "bit of logical deduction," I offer you

void f(int, ...);
int x = 42;
f(x++, ++x, --x, x--, strlen((char*)x));

.... and invite you to make whatever logical deductions you can.
But finally, I do agree that it's a bad thing to do since in general
it could lead to UB with side effects, e.g.

It doesn't "lead to" U.B., it *is* U.B. It is undefined because
the Standard declines to define it.
int a[4], *p = &a[0];
a[1] = 4;
printf("%d\n", *++p, *(p + 3));

For instance it might safely print '4' or it might crash with a bus error
[or whatever UB you can imagine].

Compare that to say

printf("%d\n", *++p, p ==&a[0] ? 1 : 0);

This program will always print '4' on any C platform.

Chapter and verse?
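
(That last claim is doubtful for the same 6.5p2 reason: p is modified by ++p and separately read by p == &a[0] with no intervening sequence point. A sketch of a version that really is well defined, the rewrite being mine rather than Tom's:)

#include <stdio.h>

/* Sequence the increment before the call, so p is not both modified and
   independently read between the same pair of sequence points. */
int main(void)
{
    int a[4], *p = &a[0];
    a[1] = 4;

    ++p;                                     /* side effect completed here */
    printf("%d\n", *p, p == &a[0] ? 1 : 0);  /* prints 4; the extra argument
                                                is evaluated and ignored */
    return 0;
}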
 
Eric Sosman

[...]
Seriously! WE ARE NOT MAKING THIS STUFF UP TO BE RANDOM. The reason it's
undefined behavior is that there have been real historical systems where
VERY SURPRISING things happened. Well, surprising if you're trying to reason
out what the machine can do while assuming that it's an extremely simple
machine with no special states. [...]

A further reason it's undefined is to avoid constraining the
compiler -- "the optimizer" -- any more than is unavoidable. We
all love optimizers for the way they make our ham-handed code (I'm
speaking for myself) run sleekly and swiftly, and we love to give
them the greatest possible freedom their wonders to perform. As
soon as we insist on particular outcomes for dubious constructs,
we limit the optimizer and bar it from applying transformations that
might have been to our benefit.

Sometimes (IMHO) the desire to assist the optimizer produces
wrong-headed results, as in the recent "all loops terminate" folly.
But since there's little if any good reason for `++a,a+5' in an
argument list, it would be folly to demand a particular outcome, or
even one of a set of specified outcomes. You want freedom, you've
got to accept responsibility. TANSTAAFL.
 
spinoza1111

Where does the standard say that the side effect of incrementing a must
not occur until the sequence point?  (Hint: It doesn't.)


The expression in question is the entire function call.
By themselves, the subexpressions ++a and a + 5 are well defined.
But because they appear together as part of a larger expression
with no intervening sequence points, the behavior is undefined.

Once again, 6.5p2:

    Between the previous and next sequence point an object shall
    have its stored value modified at most once by the evaluation
    of an expression.  Furthermore, the prior value shall be read
    only to determine the value to be stored.

The printf violates the second sentence; the second subexpression a +
5 reads the value of a, but not to determine the value to be stored.

That second sentence can be confusing (it confused me for a long
time).  The point is that if an object is read just to determine
the value to store in that same object, there's no problem, since
the read logically has to occur before the write.  But if the object
is read in a part of the expression that doesn't contribute to the
computation of the value to be stored, then the read and the write
could occur in either order, at the whim of the compiler.
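
A short illustration of that distinction, as a sketch (the variables here are arbitrary):

int a = 1;
int b = 0;

a = a + 1;            /* fine: a is read only to determine the value to be
                         stored in a */

/* b = a + (a = 5); */
/* undefined: a is modified once, by (a = 5), but the read of a in the left
   operand does not determine the value being stored, so the second sentence
   of 6.5p2 is violated */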

The authors of the standard *could* have placed some limitations
on how such expressions are evaluated, saying, for example, that
either ++a must be evaluated (including its side effect) before a +
5, or vice versa.  But that would constrain the optimizations that

This is completely false.

Compiler optimization is not done to source code. It is done to an
intermediate representation whilst preserving the abstract semantics
of the language at all times. Real compiler optimization NEVER results
in different results for different optimizing compilers that are
correct.

However, neither K&R nor the authors of the standard had the courage
or industry to fully specify the semantics of C. Although they covered
their incompetence up with talk of a strange sort of American
"freedom" (to work 24/7 in front of a computer), their objective
reason was to preserve the investment of Wall Street capitalists who
refused to change compilers without what the MBA boys call a "business
case" (where "a business case" means "how does your plan make me
richer").

This appeal to the needs of "optimization" is familiar to me. It
occurs when little techies refuse to face the objective fact that they
are not the independent "scientists" that they fancy themselves to be,
and know that if they suggest something which costs money without
making it, the big boys will unman them.
compilers are able to perform and make the standard more complex.

"Too complex", also, means "don't make me think".

"Far from perceiving such prohibitions on thought as something
hostile, the candidates ['for posts'] – and all scientists are
candidates – feel relieved. Because thinking burdens them with a
subjective responsibility, which their objective position in the
production-process prevents them from fulfilling, they renounce it,
shake a bit and run over to the other side."

Adorno accurately perceived as a white collar worker in the Depression
in the US that all "jobs" then as now were so conditional, so
temporary-even-when-permanent, that to be a white collar "scientist"
was to be at all times a candidate, for a new and better post that
could be used for leverage at one's current job, and for one's own
job. They internalized unspoken and unwritten boundaries upon thought
and at these limits, would turn remarkably anti-intellectual and
perverse in a way that would belie their offline selves, listeners to
Beethoven discussing great events in their mortgaged living rooms.

We think of ourselves as "middle class". However, Adorno knew, since
he lived during the era in which it happened, that World War I had
destroyed the linkage of money with gold, so that any putative member
of the middle class could be impoverished by vast changes over which
he had no control. Today, we're dependent not on government largesse
(which would be bad enough) but on corporate largesse (which is
worse). Interestingly, Adorno reflects Ron Paul's views on this matter
alone, without Paul's foolish and racist libertarianism.

Computer technicians suffer from a lack of fit, seen here, between their
self-image as "independent professionals" and their reality as corporate
slaveys without a bank account that's independent of free-floating
exchange rates; when confronted with the realities of this lack of fit,
they turn to savagery, such as is happening here to someone who, rightly,
questions the wisdom of calling something "undefined" when, for any
given compiler, it isn't.

Developing a compiler is as much a scientific venture as it is
industrial, but as soon as the welfare state, with its expenditure
unlinked to the flourishing of the wealthy, appeared in America and
Britain, the privately owned media named this to be "waste".

The lower middle class, internalizing this Puritanism, was ready in
computing to differentiate sharply between direct labour such as
writing an application program, and indirect labour such as writing an
insanely great tool that would automatically generate hundreds of
actual programs. Capitalism (cf Max Weber) having become a new
religion, a priesthood of CEOs was jealous of government making its
fundamental gesture, that of spending money to make money, and
government successes such as the Tennessee Valley Authority and rural
electrification horrified this priesthood.

Therefore, both K&R and the standards writers knew instinctively that
to actually define the semantics of C would be to tell private firms
to do something with no clear reward, and this would be decried in the
same terms as were government programs in the arts: "elitist",
"telling people what to do", "airy-fairy", "theoretical" and so on.

The result? A fucking mess, and constant assaults on people who
haven't got religion.
 
spinoza1111

And it's generally easy enough to avoid writing such code in the
first place.

Actually, Kiki, we have a saying here in Hong Kong, roughly nan dak wu
to, "it's not easy to get stupid".
 
spinoza1111

This makes no sense to me.

The real-world problem that I usually see is people writing code which
looks reasonable to them because they don't understand the formal
definition of the language, which is a problem that would easily be fixed
by more information.

You can't prevent bad style by telling people not to know what they're
doing.  Your advice here is analogous to telling engineers not to study
materials specifications, because this could lead them to designing
bridges which are just barely capable of supporting their expected load,
on a technicality.

Maybe it could, but the problem with such an engineer did not originate
with materials specifications, and the risks for an engineer who doesn't
actually know about the qualities of various materials are a lot worse,
and a lot harder to mitigate.


Has nothing to do with preincrement, has only to do with side-effects.  It's
a pretty simple rule, and one which is of great value to just about anyone
trying to avoid getting bitten by an optimizer.

-s

Note what Seebach is doing here.

Deeply insecure, and rightfully so, about his lack of qualifications
(he's never taken a comp sci class), he starts here talking about "the
secret contour of [our] weakness": "the bad engineer" which, in the
absence of acceptance of proper certification (something the
corporation can do without), and in the absence of fathers and male
mentoring, everyone here fears themselves to be. He's instantiated a
club that he plans later on to use on Shao Miller.

He starts using the pseudo-democratic language of the hacker which
pretends to replace the "bad" hierarchy of the Suits with a "good"
hierarchy.

At the top of the "good" hierarchy, as a fantasy imported directly
from Star Wars and Lord of the Rings and crap like that, rule the self-
images of the regs here, kindly, bearded "good fathers" Who Always
Admit Their Mistakes and believe in Rough Consensus and Working Code.

[Sighs and flute notes.]

But when one examines the writings of these Men Without Bones, one
discovers, as in Seebs' attack on Schildt, no structure. One discovers
that their sacred texts have devolved from vigorous hierarchies and
proper outlines to lists of shibboleths chanted as it were by
Troglodyte Monks.

Doing things in the accepted way and even using the right buzz words
(such as "it's undefined") becomes as important as writing a correct
program and understanding compilers. Not threatening the seniority of
the senior is also as important.
 
spinoza1111

Not so.


Not so.

Modern computers are full of strange stuff, like instruction scheduling
windows and out-of-order execution, and it is genuinely possible for code
with this kind of undefined behavior to do something which does not
correspond to EITHER of the expected "orders".

I've pointed out that all this strangeness must always preserve
semantics, which means it's profoundly stupid to use C in the first
place, since its developers and standards writers were too lazy,
cowardly, and incompetent to specify its semantics.
 
spinoza1111

[...]
Seriously!  WE ARE NOT MAKING THIS STUFF UP TO BE RANDOM.  The reason it's
undefined behavior is that there have been real historical systems where
VERY SURPRISING things happened.  Well, surprising if you're trying to reason
out what the machine can do while assuming that it's an extremely simple
machine with no special states. [...]

     A further reason it's undefined is to avoid constraining the
compiler -- "the optimizer" -- any more than is unavoidable.  We
all love optimizers for the way they make our ham-handed code (I'm
speaking for myself) run sleekly and swiftly, and we love to give
them the greatest possible freedom their wonders to perform.  As
soon as we insist on particular outcomes for dubious constructs,
we limit the optimizer and bar it from applying transformations that
might have been to our benefit.

This, as I have pointed out, is bullshit. The C standard allows
programs to give different results for different compilers when the
programmer uses a construct identified in the standard as undefined.

But what this means is that C has no fixed semantics.

However, an optimizing compiler must at all times PRESERVE the
semantics of code (cf. Aho et al, Compilers: Principles, Techniques
and Tools), which means that C CANNOT BE OPTIMIZED.

However, on each compiler, the undefined semantics are defined by the
compiler writers with in fact the blessing of the lazy, cowardly, and
incompetent writers of standards, at least one of whom (Peter Seebach)
has boasted that he's never taken a computer science class. These
semantics are then optimized.

Far from making optimization easier, saying that a construct is
undefined prevents the writers of optimizing compilers from optimizing
existing code.
     Sometimes (IMHO) the desire to assist the optimizer produces
wrong-headed results, as in the recent "all loops terminate" folly.
But since there's little if any good reason for `++a,a+5' in an
argument list, it would be folly to demand a particular outcome, or
even one of a set of specified outcomes.  You want freedom, you've
got to accept responsibility.  TANSTAAFL.

Bullshit. You're making good programmers responsible for the cowardice
and incompetence of K&R and the standards boys. The real "hacker
ethic"? Some fat and bearded clown in an office talking portentiously
as above about "freedom and responsibility", usually when he manifests
no real responsibility in his personal affairs and his relations with
others. In so doing, his effort is continuously to show mere human
beings that the artifact and the collective idea is more important
than them.

Fortunately, in my world experience as a consultant, the world is
walking away from C which is perceived outside the USA as Yankee
imperialist bullshit.
 
Nick Keighley

Because [studying the standard] could lead to people
writing code which is strictly conforming, but squeaks
through on a technicality.
This makes no sense to me.
Someone who knows the standard might write
  a << b % c.
Someone who doesn't is forced to write
  a << (b % c).
those who don't know the standard might write
     word = b1 << 8 + b2;
(actually they *did*)

I'm not sure what your point is.  That this is perfectly OK?

my intent was that it was wrong! I thought from context that the
user's intent was to combine two 8-bit bytes to make a 16-bit word.

#include <stdio.h>

int main (void)
{
    unsigned char b1 = 0x7;
    unsigned char b2 = 0xe;
    unsigned short w1, w2, w3;

    w1 = b1 << 8 + b2;      /* parses as b1 << (8 + b2), not what was meant */
    w2 = (b1 << 8) + b2;
    w3 = b1 << 8 | b2;

    printf ("%x %x %x\n", w1 & 0xffff, w2 & 0xffff, w3 & 0xffff);
    return 0;
}

this produces the output
0 70e 70e

But there's potential UB in there (with 16-bit int, the shift count
8 + b2 is at least the width of int). It plainly doesn't do what the
user intended.
It is to
me, but that may be for reasons that have nothing to do with reading the
C standard.  

I suppose it depends what you thought it was supposed to do... Though
the potential UB should have set off alarm bells. Oh, wait, you don't
consider UB to be important! :)
The problem with Malcolm's example is that programmers
who've done a lot of C++ will be perfectly happy with ordinary
arithmetic operators (+, *, etc.) in an unbracketed << operand.  It's
very common in C++.

good point. I think the bit of my brain that handles C++ is distinct
from the C bit. If I switch languages I switch idioms as well. My C++
code uses 0 rather than NULL.

Missing ()s.

damn!
if (!(item_store = malloc (item_count * sizeof * Item)))

in my defense I'd have coded it (which I suppose might be your point)

if ((item_store = malloc (item_count * sizeof (*Item))) == NULL)
These are indeed obscure to many C beginners, but the solution is simple
enough -- read a good C book (the standard might not be the place to get
used to this sort of thing).

I suppose my trouble is I think the C standard is pretty readable and
it wouldn't do any C programmer (or quite a few non-C programmers) any
harm to study it. You're right, most brand-new beginners would need
something gentler.

To be honest the people who know the standard well are *not* the same
people who write horribly obscure code. The standards readers tend to
be the more careful and thorough people.
 The flip side can be equally problematic:
people who don't know C well who fight its idioms and write some other
language transcribed into C.

Ug. When I was learning C there was a guy around who kept on pushing "as
far as possible you want to make your C look like Pascal". "Why?" And
I was a Pascal programmer.


--
10.3. Transput declarations
{ "So it does!" said Pooh, "It goes in!"
"So it does!" said Piglet, "And it comes out!"
"Doesn't it?" said Eeyore, "It goes in and out like
anything,"
Winnie-the-Pooh, A.A. Milne.}
From "Revised Report on the Algorithmic Language ALGOL 68"
 
Nick Keighley

And my view is that these should be avoided wherever possible.

really? Many of those idioms are pretty common. Even if you don't use
'em you're going to see them. Do you really not do assignments in your
tests? (you'd think I'd learn not to put code fragments in my posts by
now...)

if ((in_stream = fopen ("mumble.dat", "r")) != NULL)
    process_mumble_data();

It's quite a bit clumsier without the assignment. Some loops are
messier without this

while (fgets(line, LINE_MAX, fp) != NULL)
    process_line (line);

For instance, often there is a choice between using array notation or a
travelling pointer. I go for the array.

I'm quite happy with the pointer. It seems more elegant. I don't like
unnecessary pointer notation.

/* array */
msg_type = buff [0];
bcc = buff [1];
/* skip padding */
bs = buff [4];


/* ok pointer */
p = &buff [0];

msg_type = *p++;
bcc = *p++;
p += 2; /* skip pad */
bs = *p++;

I submit that the pointer version is easier to read and easier to
write correctly. And easier to modify.

/* not so ok */
p = &buff [0];

msg_type = *p;
bcc = *(p + 1);
/* skip padding */
bs = *(p + 4);

The worst of all possible worlds. You see code like this from people
who've drunk the "pointers are more efficient than arrays" kool aid.
A really primitive compiler
will generate less efficient code because it stores the index
variable, but this type of micro-optimisation seldom matters.

Agreed, that isn't why I use pointer notation.

The Prime Directive: write it clearly
 
Ben Bacarisse

Nick Keighley said:
my intent was that it was wrong! I thought from context that the
user's intent was to combine two 8-bit bytes to make a 16-bit word.

Ah, sorry. I was reading it in the context of the previous example that
was clearly not intended to do anything (at least not anything obvious).
I see now that you chose the names carefully and I carefully ignored
them!

<snip>
 
Nick Keighley

Ah, sorry.  I was reading it in the context of the previous example that
was clearly not intended to do anything (at least not anything obvious).
I see now that you chose the names carefully and I carefully ignored
them!

as most of the code I post is nonsense I can hardly blame you!
 
Malcolm McLean

What advantage is there in defining the dereference of NULL?
A program that dereferences NULL might produce wrong but seemingly
plausible results. Depending on what type of program it was, these
could be very negative (for instance if you send someone a gas bill
for 150 pounds when the actual amount is 100 pounds, you could find
yourself on fraud charges).

If a null pointer dereference is defined to always terminate the
program with an error message, this can't happen.
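
(A sketch of that policy done by hand, since the language itself does not promise it; the helper and its message are hypothetical:)

#include <stdio.h>
#include <stdlib.h>

/* Dereference through a checking helper: a null pointer always produces a
   diagnostic and a deterministic termination, instead of whatever the
   platform happens to do. */
int deref_checked(const int *p)
{
    if (p == NULL) {
        fprintf(stderr, "fatal: null pointer dereference\n");
        abort();
    }
    return *p;
}

int main(void)
{
    int amount = 100;
    printf("%d\n", deref_checked(&amount));  /* prints 100 */
    deref_checked(NULL);                     /* always terminates with the message */
    return 0;
}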
 
Shao Miller

What advantage is there in defining the dereference of NULL?
It's already defined by the C Standard as leading to behaviour
undefined by the Standard.

Implementations probably do define some reaction to the construct.

Advantages can be debated.
What advantage is there in forcing a certain order of evaluation?
Programmers could expect consistent treatment across any conforming
implementation.

Advantages can be debated.

:)
 
Walter Banks

spinoza1111 said:
Fortunately, in my world experience as a consultant, the world is
walking away from C

Unfortunately the experiences you cite are twenty to thirty years
old and hardly relevant to the current discussion.
 
Willem

Malcolm McLean wrote:
)>
)> What advantage is there in defining the dereference of NULL?
)>
) A program that dereferences NULL might produce wrong but seemingly
) plausible results. Depending on what type of program it was, these
) could be very negative (for instance if you send someone a gas bill
) for 150 pounds when the actual amount is 100 pounds, you could find
) yourself on fraud charges).

A program that <insert some bug here> might produce wrong <etcetera>.

(For example, to dereference a previously freed pointer).

Why would dereferencing NULL pointers have to get special treatment,
when it's just one of a whole slew of possible bugs ?

) If a null pointer dereference is defined to always terminate the
) program with an error message, this can't happen.

So instead of 1000 ways to go wrong, there are only 999 ways to go wrong.


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 
