subroutine stack and C machine model


Seebs

It seems to me that to be a "participant observer" in a standards
effort in which the English language was so misused (confusing
"defect" with "issue")

That characterization mistakes a particular set of formal distinctions for
the English language. Not every formal process defines those terms
identically. That's nothing to do with the English meanings of terms, except
that normally names are chosen which are roughly evocative of the intended
role of an item in a process.
and which had been taken over by vendors,

You keep asserting this, but you have no information, and your claim is
contrary to every report we've had from people who participated in the
process.
would have been to be corrupted. Perhaps this is why Schildt stayed away.

More likely, I always assumed, was that he had no interest in the standard,
only in having a credential to wave at people.
I think you have lost all rights to use "untrue",

Even granting that you presumably do think this, so what? You've gotten
claims wrong with a bewildering speed and variety. You don't actually seem
to hold any particular position on any issue unless it's either unverifiable
or false.
since "truth" here
is a document which ratified ignorance and bad practice.

No, "truth" here is the set of claims which accurately correspond to the
world they nominally describe.
It appears to
me now that the Standards effort was an Idiocracy.

That's even stupider than your previous theory. I'm an outlier by any
standard. Most of the people involved were long-term professionals
experienced in the field, whether they worked for compiler vendors or
test suite vendors or whatever. We had, that I know of:

* compiler vendor representatives
* software developer representatives
* test suite developer representatives
* consultants
* writers
* educators

In short, a broad range of people who had different perspectives on C.
And, consistently, this was taken into account and raised specifically.
Vendors actively sought feedback from software developers who would be
using the resulting compilers. Software developers who wanted features
would seek feedback as to how well those features might work, or what
could be done to provide them.
Sure, you MIGHT
have been a savant who'd not taken computer science, although this is
unlikely: the first computing savants had no such access but would
have been glad to learn what they created in order to advance the
state of the art.

This is semantically incoherent.
But your lack of formal CS education shows if you can seriously
maintain that the Standard had to feign ignorance so that optimal code
could be generated.

It's not really a question of ignorance, I don't think. Consider that
K&R 1 (1978) documents order of evaluation as unspecified.

What order should we have specified, were we to specify one? For whose
benefit?
As I have tried to explain to you, it is NOT
NECESSARY to make evaluation order undefined in order to optimize
because optimization must preserve the original intent of the code,
and the movement of the code takes place not in source but in its
intermediate representation.

Your "explanation" here remains surreal and incoherent. No one claimed
that movement takes place in source under any circumstances. Obviously,
optimizers work on intermediate representation rather than on source (except
some peephole optimizers, which worked on assembly output and may well have
actually worked by directly modifying the assembly source, but that's a
special case).

The problem is that, while you're right that it is possible by analysis
of syntax to divide expressions into those which definitely have no
side-effects, and those which might have side effects, the latter category
is far too large. Worse, you don't allow for the possibility that a user
may be well aware that there are side-effects from both operands of an
operator, *but not care which order they happen in*. Consider:

x = a++ + b++;

Do you think the user cares whether a or b is incremented first? I don't.
To declare (as both K & R and the Standard apparently have) that a()+b()
must be reorderable for optimization, for efficiency as a matter of
a language standard is using efficiency, as incompetent programmers so
often use that word, as a coverup. A language that imposes stricter
rules is actually easier to optimize since the optimizer has more
information about the intent of the code.

Except that some of that information is false. The information that the
user "intended" a() to occur first may be untrue. The user may have not
cared, but there's no longer a way for the user to express that.

Which is why C lets you specify ordering when you do care:
x = a();
x += b();

-s
 

spinoza1111

Since you're interested in philosophy, you will be fascinated to note
that it is very hard to determine whether or not you have knowledge of
the above.

Consider the common assertion that knowledge is "justified true belief".
A man looks out into a field while passing it in a train, and sees something;
he believes there is a sheep in the field, because he has seen a fluffy white
thing.  But in fact, what he saw was an unusually shaped white rock.  But!
Behind the rock, completely invisible to him, is a smallish sheep.

Actually, I am familiar with the Gettier counter-example to "justified
true belief". The man who came up with it was not a specialist in
epistemology and did no other work. He thought he was clever in
shooting down hundreds of years of philosophical work...you and he
might get along IF you still think you were clever about Schildt.

It is jejune since it's based on a wearisome British empiricism in
which things are "things" and unquestionable.

Madonna is having a photo shoot in Central Park. Simultaneously, you
pass it by, see a cutout of Madonna and tell your friends that
"Madonna was in Central Park today". My riposte to Gettier is that in
this silly and jejune case, I got lucky, and have "justified true
belief" but "by accident" as a favor of the gods. The real Madonna and
her lifelike image are both sense data, and neither one has priority.

The opinion that you don't is based on a silly illusion that one must
"work" for the knowledge. As it happens, other forms of knowledge such
as an intuition based on wrong evidence that is later confirmed
(justified) by valid evidence are, in fact, knowledge retroactively.

If you have a "hunch" that you smell a rat in a language standard
effort and this is later confirmed by the admitted failure of the
effort, and the revelation that voting members without CS degrees gave
silly reasons for not doing their job, then you validly say "I knew it
[sic] all along".

Foreknowledge based on intuition that is later confirmed is knowledge.
I smell a rat in C standardization and my intuition (which is itself
based not on astrology or haruspication but on my own experience with
the thuggish irrationality of software management for profit) is being
confirmed.

I knew it, I shall soon say, all along.
Does he have knowledge of the sheep?  Probably not.

The relevance of this is that your "criticisms" of the CTCN page have
been consistently found to be rambling, incoherent, or just plain wrong....
And yet, by total coincidence, I think it's safe to say that you're quite
right that, back when I wrote it, I was unable to organize a document,
separating important from unimportant issues.  (That said, I would also
point out that I was making no attempt to do so; I picked issues I found
interesting or informative, and they were presented strictly in page
order.)

OK, fine. If you prefer that formulation, if indeed I have been mostly
mistaken about C (which is perfectly possible for the reason I have
stated, that after being Princeton IC's expert on C, after teaching it
to CS majors, after assisting Nash blah blah), you concede that the
document was disorganized and failed to separate important issues, it
is now time for you to erase the document, and update wikipedia,
removing the section on Criticisms of Schildt, since that document,
that disorganized document, is its one and only source.

If you:

(1) Remove the document "C: The Complete Nonsense"
(2) Apologize at the site of the former document, admitting that it
failed to separate important issues out, and also admitting the harm
done needlessly to Schildt's reputation
(3) Remove the section on Criticisms from Schildt's wikipedia
biography

then I will consider this matter closed.
You seem to have missed a key point:  What makes you think I don't have
a complete degree?  I completed all but a couple of requirements of math
and philosophy degrees, and *did* complete a psychology degree.  I functioned
fairly adequately in college -- quite well if you make allowances for the
fact that, at the time, we had no idea that I had a learning disability, nor
any clue how to treat it or accommodate it.  Life has since improved.

I am sorry you have what you call a learning disability. I myself had
what is now termed ADD through university and I self-medicated with
cigarettes and coffee, "coming down" with alcohol.

My learning disability made me consistently try to use structured and
complex English to describe my goals in programming and at times today
gives me the appearance of verbosity when in fact I am at times
overcompensating, and trying to be attentive to all issues, whether in
programming or debate.

However, my successes in programming, which were never based on
trashing reputations, showed me that formalism in English, and what
formalism I was able to use in math, were key to success in
programming, and this personal discovery coincided with the rise of
slower and more structured ways of programming. This 1970s movement
also introduced humanism into programming, in the writings of the
middle Dijkstra.

But a learning disability is not an excuse, I think you may agree,
for failure to take responsibility for a document which, despite its
thinness and disorganization, has been amplified in the echo chamber
of the Internet into what appears to be a mass of evidence, because
Internet users confuse mere repetitions of a text with multiple texts.

Therefore, you can and should make amends.
No.

There is no "selected to".  I wanted to, it sounded fun, people in the
standards community knew me and said I should try it out, so I did.  There's
no selection; you pay your dues and show up (or don't show up, if you don't
want to).


The latter is a great example of something which isn't knowledge, because
it is neither true nor justified.  There simply is no history of wanting
people to "go along with vendor requirements".  During the entire time I
worked on C, we consistently had non-vendors present and active in the
process, and they were listened to carefully, because vendors wanted to make
sure they didn't screw their users by not being aware of user requirements.

So goes the playbook. You need some experience in the real world.
In short, what actually happened was the opposite of what you describe.

The appearance was certainly carefully crafted that it was, in order
to get buy-in to what was a failure to serve the public interest from
guys like you.
Your assertion that this was "personalized" fascinates me, because I was
unaware of any personal aspect to the thing.

OK, you had no personal animus against Schildt, any more than the
"mean kids" campaign against Kathy Sierra had against her. However, he
took steps to end the process his "critique" started and www.meankids.org
is no more. Can you take responsibility for removing "C: The Complete
Nonsense" and fixing the wikipedia bio?

However.  You raise an interesting concern.  You have pointed out in
the past a belief that people like me had "AP'd out of" early CS courses,
and overspecialized later, leading to a disconnected ivory-tower view
of things, uninformed by significant awareness of the sorts of things
one picks up earlier in a CS program.

What, then, should we make of someone who apparently skipped over most of
undergraduate CS entirely, and came in only at the master's degree level,
which is purely academic?  Possibly that doesn't tell us anything.  But what
if that person then demonstrates a total unawareness that systems other
than MS Windows may have genuinely different architectures?  Why, then
I think we're starting to see signs of an ivory tower academic whose lack
of real-world experience hurts his ability.

I don't think that likely at all because if anything, at the Master's
level, one learns all sorts of things about non-Microsoft and non-IBM
platforms, more so than at the undergraduate level. In my own MSCS
program, I had to write the firmware for a DEC PDP-8 in my Computer
Architecture class (my grade was A). We had a full range of systems
available, IBM, Microsoft and so on. Herb studied at the Univ of
Illinois, a leading school for computer science.
If you wanted to say "anyone who doesn't have a degree in CS, but attended
college at a time when they were generally available", you should have.  And
as noted, my college didn't have CS degrees when I attended, although they
had one of the earlier CS programs.


I don't think anyone's claimed that CS is "worthless" as a degree, merely
that it's not strictly necessary.  I might well be more effective in some
ways if I had a CS degree.  However, it's unclear that the benefit would
justify the time sink.


Who would you rather have representing you in court, assuming it were legal
for both to do so?

* Someone who is one course short of a law degree, but who loves to read
  case histories and competed in national debate tournaments.
* Orly Taitz, who has a law degree.

You are by your own admission more than one course short. And please
replace law by medicine, and reboot.
 

Seebs

This means as far as I can tell you never learned to discuss CS
outside of overspecific problems and that you may have reinvented some
wheels.

You are going wayyyy too far from too little data. What makes you think I
never learned to discuss CS outside of overspecific problems?
A "knack for programming" might take you far, but in my
experience it makes you confuse issues in dialog. An example would be
the way you thought, in "C: The Complete Nonsense" that the existence
of free(NULL) falsifies Herb's point that one needs to balance free()
calls with calls to malloc(), calloc, and realloc().

It's a concrete example of a value you can pass to free even if it didn't
come from one of those. Were I writing the page today, though, I'd probably
be more focused on the more significant issue, which is that his writing
could easily leave the reader thinking that:

x = malloc(5);
free(x);
free(x);

is okay, because x is a pointer returned by malloc.
You never discussed abstractions in a human context, only specific
problems expressed in programming language in a machine context, which
made you astoundingly literal-minded to a fault but willing to use
this fault to harm others.

This is a fascinating theory, but completely wrong. I'm literal-minded
to a fault because I'm autistic, not because of when I did or didn't discuss
things.

In fact, I've spent quite a while discussing abstractions in a human context,
because I was raised by mathematics teachers. I'm a lot better at it now
than I was 15 years ago, though.
Since programming languages seemed to you
to have single meanings, you acted as if Herb could mean ONLY one thing by his
assertion that he did not know of free(NULL) (a useless piece of shit)

It's not useless, though. It's an extremely useful thing, because it ensures
that it is always safe to free (once) a value returned by malloc or realloc.
It eliminates hundreds of "if (p)" tests. Very useful.
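
A minimal sketch of what I mean (the struct and its field names are
invented for illustration):

#include <stdlib.h>

struct buffer {
    char *data;
    char *scratch;
};

/* Because free(NULL) is defined to do nothing, cleanup code can
   free every field unconditionally; no "if (p)" guard is needed,
   and clearing the pointers afterwards keeps a later call to this
   function harmless as well. */
void buffer_destroy(struct buffer *b)
{
    free(b->data);
    free(b->scratch);
    b->data = NULL;
    b->scratch = NULL;
}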
and that your interpretation matched in logical force the *frisson*
you got when your code worked.

This is just semantically incoherent.
I do not mean "classes in communications".

Which is a shame, because I took those too. :)
I mean discussing a
scientific discipline such as computer science and understanding how
the model, say of a concrete stack, can be instantiated in many
different ways as long as one can distinguish its essential
abstraction in the concrete.

Ahh, and we're back to your little hobbyhorse.

What you say could actually have been a good argument. I think there's a lot
to be said for describing the logically-nested activation records as a
conceptual "stack", and showing how this could be implemented in various ways,
including the specific case of just having a large chunk of memory and running
up it and down it.

But.

The objection to Schildt's writing on the topic is not to his use of the
abstraction "stack". It can't be, because he never deals with the
abstraction. He talks only about a specific concrete form, and gives it,
not as an example of one way of doing things, but as The Way Things Are.

Had he talked about the abstraction, then shown an example of one way of
implementing it, I certainly wouldn't be complaining.
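
Something like this, say (a rough sketch, all names invented): the
abstraction is the push-on-call, pop-on-return discipline, and this
particular implementation happens to use linked, heap-allocated records
rather than one contiguous block of memory.

#include <stdlib.h>

/* One of many ways to realize "a stack" of activation records. */
struct frame {
    struct frame *caller;    /* link to the caller's record */
    int saved_state;         /* stand-in for locals, return info, etc. */
};

static struct frame *top = NULL;   /* the abstract "top of stack" */

static void push_frame(int state)  /* conceptually, on function entry */
{
    struct frame *f = malloc(sizeof *f);
    if (f == NULL)
        abort();
    f->caller = top;
    f->saved_state = state;
    top = f;
}

static void pop_frame(void)        /* conceptually, on function return */
{
    struct frame *f = top;         /* assumes the stack is non-empty */
    top = f->caller;
    free(f);
}

Swap the linked records for one contiguous block of memory and a moving
pointer, and the abstract behaviour is the same; that's the distinction
I'd have liked the book to draw.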

However, we can tell these apart immediately by his use of language. It
is true that the set of activation records is *a* stack. It is only true
on some implementations that the set of activation records is stored in *the*
stack. The indefinite article refers to an abstraction; the definite article
refer to a specific and concrete implementation.

I already posted the hunk of text from C:TCR (3e) showing exactly this point.

I have always thought that Schildt was a clear and lucid communicator; I
merely think he's wrong about a lot of things. By contrast, your careful
distinction between the abstraction of "a stack" and specific implementations
is a marked improvement over his material on the stack, which always felt a
bit clumsy. You are almost right; that would have been a good way to express
it, and if he'd expressed it that way, instead of the way he did, it would be
ridiculous for people to criticize him for it.

-s
 

spinoza1111

whatever /they/ are...

Spinoza: consider the expression
      a - b

The C Standard does not allow an implementation to re-arrange this to
      b - a

because that would be wrong, but it does allow the implementation to
generate code like this (R1 and R2 are registers)
      R1 <- VALUE(a)
      R2 <- VALUE(b)
      R3 <- R1 - R2

or like this
      R1 <- VALUE(b)
      R2 <- VALUE(a)
      R3 <- R2 - R1

In each case R3 contains the result whilst the evaluations of 'a' and
b have been done in a different order.

Right. The problem was that specific compilers for specific machines
generally stayed with one way or another in cases
like this, and programmers mistook this behavior for a property of C.
Moreover, I would guess that in more than nine out of ten cases the
order of evaluation was left to right.

Well, this fact, alongside the fact that other languages stay with
left to right, means that K&R and the standard erred, since they
created a hidden fact. It should have been cleaned up in the standard,
and would have been if the standard was in the public interest. It was
not cleaned up to preserve vendor investment.

Nor is it necessary for optimization: the above reversal is possible
for the most common case when a and b are simple constants or lValues
without aliasing, and whether they are can be determined by the
optimizer.

If Schildt erred in understanding this he was the victim of what was
essentially a hoax similar to Alan Sokal's infamous paper or that
fucking stunt played on Dan Rather with Bush's draft records.
Foreknowledge and intuition that is later confirmed is KNOWLEDGE
(Gettier is WRONG). Rather is in court and he will succeed, I hope and
believe, in bringing Bush to justice.

It was a hoax in Schildt's case because Schildt's experience and
knowledge on the vastly most prevalent platforms were valuable to real
people, to the extent that the existence and popularity of Schildt's
book should have been a reason to correct the standard. Instead,
sponsored by vendors, the standards committee performed an adolescent
stunt by preserving a completely unnecessary property of C, one that
is mostly invisible to intelligent C programmers.



The current version does not even violate numerical mathematics, that
is, an expression can only be rewritten if it is known that the exact
result of the rewrite is not different from the result of the original.
So I wonder what your point is, mathematics does not tell whether 'a'
must be evaluated before 'b' or not.

and note since 'a' and b are potentially arbitrarily complicated
expressions it could make a difference

   i = g();
   a = i + j;
   s = f() + a;

by stashing a away before the call to f() you could save
recalculating it.
 

Seebs

If you have a "hunch" that you smell a rat in a language standard
effort and this is later confirmed by the admitted failure of the
effort, and the revelation that voting members without CS degrees gave
silly reasons for not doing their job, then you validly say "I knew it
[sic] all along".

Ahh, but you have been consistently wrong in every aspect of your commentary
on the standard. You are treating my comments, in 2009, of why I think a
given decision has worked out well, as though they are the reasons that it was
made in 1975 or so.
I smell a rat in C standardization and my intuition (which is itself
based not on astrology or haruspication but on my own experience with
the thuggish irrationality of software management for profit) is being
confirmed.

Only, actually, everything you've predicted has been directly false.
OK, fine. If you prefer that formulation, if indeed I have been mostly
mistaken about C (which is perfectly possible for the reason I have
stated, that after being Princeton IC's expert on C, after teaching it
to CS majors, after assisting Nash blah blah),

I don't care what alleged credentials you have. You make elementary newbie
mistakes, like mistaking function call argument separators for comma
operators, asserting that at some point in the past everyone used
left-to-right evaluation, and so on. (Interestingly, you originally claimed
that everyone did left-to-right; when someone pointed out that MS did
right-to-left, you then manufactured a theory that the C89 people cared
what MS did; so far as I know, MS was completely uninvolved in C
standardization and no one knew or cared what they were doing.)
you concede that the
document was disorganized and failed to separate important issues,

It is still not especially organized.
it
is now time for you to erase the document, and update wikipedia,
removing the section on Criticisms of Schildt, since that document,
that disorganized document, is its one and only source.

False on two counts.

First off, the document is disorganized, but *still correct*. Bad
organization does not invalidate data.

Secondly, it's not the only source. That document postdates a great deal
of public criticism of C:TCR.
(1) Remove the document "C: The Complete Nonsense"
(2) Apologize at the site of the former document, admitting that it
failed to separate important issues out, and also admitting the harm
done needlessly to Schildt's reputation
(3) Remove the section on Criticisms from Schildt's wikipedia
biography
then I will consider this matter closed.

Wouldn't it be far better, given that there are serious and fundamental
errors in C:TCR, for me to go back to this document and improve it?

Consider that since I wrote that, I've written probably a couple hundred
thousand words of technical material professionally, on topics ranging from
compiler design to shell programming. I'm an experienced writer, capable
of organizing book-length works effectively and making points persuasively.

Clearly, the document would be much improved if I went and reworked it.

An apology for "needless" harm seems unlikely, since the harm is ultimately
either nonexistent or needful. His book is full of nonsense. Failure to
clearly indicate this to people would be a disservice.
I am sorry you have what you call a learning disability. I myself had
what is now termed ADD through university and I self-medicated with
cigarettes and coffee, "coming down" with alcohol.

I didn't medicate it at all. :)
But a learning disability is not an excuse, I think you may agree,
for failure to take responsibility for a document which, despite its
thinness and disorganization, has been amplified in the echo chamber
of the Internet into what appears to be a mass of evidence, because
Internet users confuse mere repetitions of a text with multiple texts.

Ahh, but that's not at issue. There's no such confusion. My document is
the most conveniently accessible, and is given credibility by some, but there
are plenty of other sources. Francis Glassborow has criticized Schildt's
writings, and is arguably more qualified than I am -- he's read, and reviewed,
many more C books than I have. People might dispute Clive's sense of timing,
but I don't think anyone would dispute his competence to speak to the quality
of writing about C.

Don't mistake it being the thing that convinced Wikipedia that the criticisms
were notable for it being the only thing that could have convinced them.
Therefore, you can and should make amends.

I have done no wrong. I may have done right relatively ineffectually, but I
don't know that it's worth correcting.

Does anyone CARE whether Schildt's books were awful in 1995? I'm not sure
there's much need to pursue this; he's gone off to bother other languages,
which he may well know better.
So goes the playbook. You need some experience in the real world.

Back to philosophy so soon? I assume, admittedly without justification, that
the world I experience is real.
The appearance was certainly carefully crafted that it was, in order
to get buy-in to what was a failure to serve the public interest from
guys like you.

This is a pretty elaborate conspiracy theory, but you've offered no support
for it, nor have you even hinted at addressing the significant flaws.

Most noticeably, you've claimed that the goal of C99 was to avoid making
vendors do more work, so they wouldn't have to retain compiler developers.
However, C99 is full of features which required a great deal of work, many
of which were proposed *by the vendors*. This completely scuttles your
premise. Without that premise, all your speculations as to the vast
conspiracies which could have concealed that intent are useless.

If indeed there was intent to produce a crappy standard to save vendors work,
that intent was pursued so completely ineptly that it is ludicrous to imagine
a conspiracy to cover it up succeeding. The impositions on vendors in C99
turn out to be large and significant, in some cases noticeably exceeding
the impositions placed by C89.

You seem absolutely obsessed with this order-of-evaluation thing, but there's
simply nothing else, in anyone's writing, in any source, suggesting that it
has ever been a significant consideration. Actually, wait, that's not quite
true. I once saw a web page from a guy who had the same argument. But that
could have been you; I don't know if I have archives far enough back to
check that.
OK, you had no personal animus against Schildt, any more than the
"mean kids" campaign against Kathy Sierra had against her. However, he
took steps to end the process his "critique" started and www.meankids.org
is no more.

Who's "he"? And what the **** does this "Kathy Sierra" person I've never
heard of have to do with anything?
Can you take responsibility for removing "C: The Complete
Nonsense" and fixing the wikipedia bio?

No, because no one has given me any reason to believe that the information
there is fundamentally misleading. Poorly organized, I'd grant. If you'd
really like, I'd be willing to put a couple of hours into cleaning it up
and making the explanations clearer, and picking out a few more of the
really impressive errors; I only made it a little way into the book before
I got bored, so I haven't really checked it all carefully.
I don't think that likely at all because if anything, at the Master's
level, one learns all sorts of things about non-Microsoft and non-IBM
platforms, more so than at the undergraduate level.

Perhaps so, but Schildt's writing consistently demonstrates non-awareness
of other platforms. Maybe his program was different from yours.
In my own MSCS
program, I had to write the firmware for a DEC PDP-8 in my Computer
Architecture class (my grade was A). We had a full range of systems
available, IBM, Microsoft and so on. Herb studied at the Univ of
Illinois, a leading school for computer science.

Could be, but his information about C continues to be inaccurate.
You are by your own admission more than one course short. And please
replace law by medicine, and reboot.

Well, that's sort of the point, isn't it -- the various fields are different.

FWIW, I have heard of many people getting much better results from a midwife
with no university education at all than from a doctor fully trained in the
latest totally speculative theories as to what a childbirth would be or how
it would work. It's not necessarily as true now as it used to be, but there
have certainly been times when the formalized structure was worthless.

While it's certainly true that I've never taken courses in CS, it's been
nearly twenty years since I left college, and I've spent that time studying
programming, reading programming books, and programming professionally.
I never claimed that, at 18 or so, I was comparably qualified to someone
with a CS degree. Now, though, I don't necessarily expect a CS degree to mean
that someone will know more about CS than I do, although they're likely to
have some areas of knowledge I haven't picked up yet.

-s
 

Seebs

Right. The problem was that specific compilers for specific machines
generally stayed with one way or another in cases
like this, and programmers mistook this behavior for a property of C.

You've claimed this, but do you have any kind of evidence that anyone but
you has regularly made that mistake?
Moreover, I would guess that in more than nine out of ten cases the
order of evaluation was left to right.

Again, any support for that?
Well, this fact, alongside the fact that other languages stay with
left to right, means that K&R and the standard erred, since they
created a hidden fact. It should have been cleaned up in the standard,
and would have been if the standard was in the public interest. It was
not cleaned up to preserve vendor investment.

It was not "hidden".
Nor is it necessary for optimization: the above reversal is possible
for the most common case when a and b are simple constants or lValues
without aliasing, and whether they are can be determined by the
optimizer.

But it's most USEFUL in the case where they might have side effects.
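
A rough sketch (the function names are invented): the value of the
subtraction is the same either way, but which of the two messages
appears first is up to the implementation.

#include <stdio.h>

static int get_a(void) { printf("evaluating a\n"); return 5; }
static int get_b(void) { printf("evaluating b\n"); return 3; }

int main(void)
{
    /* Always prints 2, but the "evaluating" lines may appear
       in either order. */
    printf("%d\n", get_a() - get_b());
    return 0;
}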
If Schildt erred in understanding this

Interestingly, he didn't. As already pointed out, he correctly states,
on page 53 (or maybe 56? my memory's shot) of C:TCR 3e that the order
of evaluation is undefined in a statement like:

f = x1() + f2();

Where he errs is later on; he offers "x = *p + *p++" as an example of an
invalid operation, but "x = *p + (*p++)" as an example of a valid one.
Interestingly, this isn't true even if you trust the Microsoft C++ web
site which (erroneously, even for their compilers, so far as I know)
claims that precedence controls order of evaluation.
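
To spell it out (a sketch of my own, not text from the book), the
parentheses change nothing here:

int arr[2] = { 1, 2 };
int *p = arr;
int x;

x = *p + *p++;     /* undefined: p is modified, and also read for a
                      purpose other than computing its new value,
                      with no intervening sequence point */
x = *p + (*p++);   /* still undefined: the parentheses only group
                      the subexpression, they do not order its
                      evaluation */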
he was the victim of what was
essentially a hoax similar to Alan Sokal's infamous paper or that

No, really, not at all.

The information in question is not secret, it is widely available, everyone
but you I've talked to seems to have known it all along, and even Schildt
specifically points out that he knows it in his book.
It was a hoax in Schildt's case because Schildt's experience and
knowledge on the vastly most prevalent platforms were valuable to real
people, to the extent that the existence and popularity of Schildt's
book should have been a reason to correct the standard.

Except that, as pointed out, Schildt's book specifically disagrees with
your theory. Furthermore, it's not clear at all that Schildt's
Windows-specific view of the world counts as the "vastly most prevalent
platforms" for C, because C originated in a Unix environment, and was
widely used in embedded systems. C's prevalence has always been
greater outside of the Windows world. By the time of C:TCR 3e, Windows
development was mostly built around C++ anyway.
Instead,
sponsored by vendors, the standards committee performed an adolescent
stunt by preserving a completely unnecessary property of C, one that
is mostly invisible to intelligent C programmers.

Except that you've still never supported the claim that this was driven
primarily by vendors. You've argued that this is unnecessary, but you
haven't offered concrete numbers for what the performance difference is
for real-world applications.

And indeed, you continue to fail to respond to a number of specific
examples of cases in which reordering, or interleaved ordering,
demonstrably offer potential for improvements in performance which
might affect some side effects, but where experienced C programmers
won't have relied on the order in which the side effects occur. (Determining
whether or not there are side effects doesn't come close to solving the
problem; you have to figure out whether or not the order of the side effects
matters. As an interesting example, if you're looking at two routines
which each produce logging messages, the output of the file may vary with
order of evaluation, but the developer doesn't care which comes first.)

-s
 

Tim Streater

spinoza1111 said:
On Oct 29, 10:14 pm, Nick Keighley <[email protected]>
[snip]
or like this
      R1 <- VALUE(b)
      R2 <- VALUE(a)
      R3 <- R2 - R1

In each case R3 contains the result whilst the evaluations of 'a' and
b have been done in a different order.

Right. The problem was that specific compilers for specific machines
generally stayed with one way or another in cases
like this, and programmers mistook this behavior for a property of C.
Moreover, I would guess that in more than nine out of ten cases the
order of evaluation was left to right.

Well, this fact, alongside the fact that other languages stay with
left to right, means that K&R and the standard erred, since they
created a hidden fact. It should have been cleaned up in the standard,
and would have been if the standard was in the public interest. It was
not cleaned up to preserve vendor investment.

Where do you get this "hidden" nonsense from? It's already been pointed
out that this was well-documented. If the standardisers had decided to
reverse or alter how this worked, I and thousands of other programmers
might have suddenly been left with potential bugs in our software. The
standardisers would have been booted from here to hell and back. Hardly
in the public interest at all. And so what if it preserved vendor
investment. By doing so it avoided creating bugs in software already
deployed.

Yes, the standardisers left things as *clearly* *documented* by K&R.
Good for them - very much in the public interest.
 

Richard Tobin

spinoza1111 said:
A language that imposes stricter
rules is actually easier to optimize since the optimizer has more
information about the intent of the code.

Do you have any evidence for that, or is it just something you think
must be true?

In a language with strict ordering rules, how do you express your
intent that the relative order of evaluation of a() and b() is
unimportant?

In C you express it by writing a() + b(). If you want to impose an
order, you use temporary variables.

A compiler may be able to determine that the order is unimportant -
regardless of the intent of the programmer - by analysing the
functions, but that is not "easier".
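
For example (tmp is just a scratch variable for the sake of the sketch):

/* Relative order unimportant: either call may happen first. */
x = a() + b();

/* Order imposed with a temporary: a() is called before b(). */
tmp = a();
x = tmp + b();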

-- Richard
 

Richard Tobin

Tim Streater said:
Where do you get this "hidden" nonsense from? It's already been pointed
out that this was well-documented. If the standardisers had decided to
reverse or alter how this worked, I and thousands of other programmers
might have suddenly been left with potential bugs in our software.

The status quo was that the order was unspecified. Changing it
to require a particular order would not break any correct programs,
since the required order would be one of the previously legal
behaviours.

-- Richard
 

Tim Streater

spinoza1111 said:
Don't think I ever posted to PHP or Javascript.

Oh but you did. Several hundred posts-worth, in fact, to comp.lang.php
or .javascript. I don't recall the topics now or what you were doing
there in the first place, and don't intend to bother looking, either,
uninterested as I am in electronic dumpster-diving.
If it's pop, how is it impenetrable? Your anomie obscures it for you.

Hardly. I also get bored (or would do, if I read it) by the stuff that
spammers add to e-mails to confuse spam-filters, and for much the same
reasons as with your stuff. Seebs has much greater clarity of expression.
That's because you'd rather watch TV. In fact, a structured
walkthrough constitutes a community form of informal proof of
correctness, but the sort of trashing behavior we see here, ...

Zzzzz zzzz ...
Corporatese. Tell me, how is it possible for people to be "happy" on
the job? It's not a question of whether anyone was happy, it's a
question whether your software was correct.

By "happy" I in fact meant they were satisfied with the tool I offered
them, which saved them much time and walking. They also provided useful
feedback from time to time.

Still, I'd say it's quite possible to be happy at work, not that this is
germane to the issue. The first and most important step is to get on
with people. If you really can't do that where you're at, then leave.

As for the "correctness" of my software tool, which ran without stopping
for months at a time, given that it controlled devices which themselves
typically had several hundred users on at once, you can imagine I'd have
heard pretty damn quick if it wasn't correct. I don't say the design
couldn't have been improved, but that's another matter.
 

Moi

On Oct 29, 2:00 am, Moi <[email protected]> wrote:
the bug I found on behalf of Nash. Being passive-aggressive, the
Standardizers ignored Microsoft being unwilling to confront its market
power.

Microsoft's power in the "C" compiler market was not that big in 1989.
It was the time when the "real compilers" on the PC market
(IIRC: MWC, Lattice, Watcom, Whitesmith) were pushed out of the (home user-)
market by the turbo-stuff with fancy IDEs.
Microsoft at first ignored "C" and stepped in later, mainly because Bill Gates
had once envisioned that anything could be done in Basic.


AvK
 

Seebs

Not quite true. Ralph Ryan from Microsoft was the Environment
Subcommittee Chair on C89. Two other Microsoft participants: David
Weil, Mitch Harder.

Oh, cool. I guess they were just gone by the time I got there. During
my time, I don't think MS had people at meetings often. (In fact, I'm
not even sure there was an MS rep at the meeting we had on the Microsoft
campus -- by that time, they'd settled on doing C and C++ in adjacent
weeks at the same location or something similar, so they could have had
the whole thing there for C++.)

-s
 

Phil Carmody

Seebs said:
And I like cats but not dogs. Finally, and I think
this is particularly important, I really liked Death Magnetic and it's
probably my favorite Metallica album.

Pervert!

Phil
 

Phil Carmody

Richard Heathfield said:
In the workplace, I've taught several CS graduates. They didn't know
spit about programming. None of them.

Ditto. It's the strongest correlation I've found in the work-place.
(At least in my programming jobs.) Age, nationality, gender, religion,
build, hair-length, sexuality, music-taste, preferred drug - all
irrelevant. CS degree - warning lights!!!

(And, yes, with few exceptions, they did all have "Design Patterns"
on their bookshelf.)

Phil
 

Seebs

That book is quite possibly the worst I've ever bought. Even
"Numerical Recipes" is better.

I'm curious: What about it do you dislike (or should I ask that elsewhere)?

I have a copy of that, along with a ton of other books, which I've variously
read or skimmed or whatever. My impression of DP was sort of ambivalent...
I'm not sure why, but it just seemed like it would be likely to yield a lot
of mis-applied designs. It may be one of those things where having them
handy and being familiar with them is useful to an experienced programmer,
but if you aren't experienced enough, they're just dangerous.

I'm much more interested by the notion of "antipatterns", which I think are
in some ways a better way to learn about software, and certainly something
people ought to be aware of. (I say this having inherited a database design
that I was able to get onto the front page of the Daily WTF at one point.)

-s
 

Ben Bacarisse

Do you have any evidence for that, or is it just something you think
must be true?

To me, it seems self-evidently false.

If some particular order of evaluation allows for easier optimisation,
any C compiler may assume that that order was explicitly intended
(since any order is permitted) and may therefore get the supposed
benefit. As far as the compiler is concerned, I don't see how there
can be any benefit to a specified order of evaluation that can't be
realised in a language that permits any order.

<snip>
 

spinoza1111

You are going wayyyy too far from too little data.  What makes you think I
never learned to discuss CS outside of overspecific problems?

"C: The Complete Nonsense" is the major evidence. Added to that is
your writing here. No first rate person would say that optimization
justifies abandoning left to right (or right to left, or any fixed)
ordering, although third rate programmers defend unreliable code as
being more efficient, invalidly reasoning that if the code is
unreliable it must be fast.
It's a concrete example of a value you can pass to free even if it didn't
come from one of those.  Were I writing the page today, though, I'd probably
be more focused on the more significant issue, which is that his writing
could easily leave the reader thinking that:

        x = malloc(5);
        free(x);
        free(x);

is okay, because x is a pointer returned by malloc.

This shows lack of reading comprehension. Herb wrote that free "must
only be called with a pointer that was previously allocated with one
of the dynamic allocation system's functions (either malloc(), realloc
(), or calloc())."

In your example, after the first free(x), x is no longer allocated.
Common sense tells the reader that at the point where the second free()
is executed, x is not allocated, therefore not "previously
allocated" although allocated in the past. The adverb and participle
are ambiguous but clear in context.

Logically, if I am NO LONGER allocated, I am not "previously
allocated" although I was allocated in the past. The reader comes to
the passage knowing that free deallocates.
This is a fascinating theory, but completely wrong.  I'm literal-minded
to a fault because I'm autistic, not because of when I did or didn't discuss
things.

If you are as you say autistic, then I ask the court for summary
judgement and costs, because while I have compassion for autism, your
autism disqualifies you from writing on programming. This is because
part of autism includes inability to deal with natural language and
use common sense.

Apologize to Herb and seek treatment.
In fact, I've spent quite a while discussing abstractions in a human context,
because I was raised by mathematics teachers.  I'm a lot better at it now
than I was 15 years ago, though.

Evidence of that would be an apology.
It's not useless, though.  It's an extremely useful thing, because it ensures
that it is always safe to free (once) a value returned by malloc or realloc.
It eliminates hundreds of "if (p)" tests.  Very useful.

Only if you're lazy. Sounds like more catering to incompetence and a
good way to cover up free() statements when programmers have failed to
balance malloc() and free().
This is just semantically incoherent.

If you're autistic, you have language difficulties. If you have
language difficulties, you have no standing in pronouncing things
"semantically incoherent". It's like the half-literates here who say I
can't write or am verbose. They have no opinion I need respect, and
the echo chamber of the Internet is the only reason why your tirade
was influential.

Sure, I'll research Glassborow et al. But all I have seen so far are
repetitions to the effect that "Schildt sucks and contains thousands
of errors". I have not seen any list of those errors: your document
makes reference to them but lists fewer than 100.

Another smoking gun: your strange belief that something can be
"clear" (understandable, generative of justified true belief of
something true by my argument last month, which you haven't refuted)
yet untrue is repeated without change or consideration of its
weirdness.
Which is a shame, because I took those too.  :)

It shows, because like the C Standard they are a bureaucratic way of
certifying behavior that should not be certified. Gee, a lot of
compilers evaluate things every which way. Their vendors' CEO might
have my Daddy by the balls. Best call them all standard. Gee, little
Johnny can't read for the key point, such as "allocate and free should
be isomorphic, in one to one correspondence, and he takes his rage out
on people who can on the Internet. Let's let him take a class in
communications and pass him."
Ahh, and we're back to your little hobbyhorse.

What you say could actually have been a good argument.  I think there's a lot
to be said for describing the logically-nested activation records as a
conceptual "stack", and showing how this could be implemented in various ways,
including the specific case of just having a large chunk of memory and running
up it and down it.

But.

The objection to Schildt's writing on the topic is not to his use of the
abstraction "stack".  It can't be, because he never deals with the
abstraction.  He talks only about a specific concrete form, and gives it,
not as an example of one way of doing things, but as The Way Things Are.

Had he talked about the abstraction, then shown an example of one way of
implementing it, I certainly wouldn't be complaining.

However, we can tell these apart immediately by his use of language.  It
is true that the set of activation records is *a* stack.  It is only true
on some implementations that the set of activation records is stored in *the*
stack.  The indefinite article refers to an abstraction; the definite article
refer to a specific and concrete implementation.

I already posted the hunk of text from C:TCR (3e) showing exactly this point.

I have always thought that Schildt was a clear and lucid communicator; I
merely think he's wrong about a lot of things.  By contrast, your careful
distinction between the abstraction of "a stack" and specific implementations
is a marked improvement over his material on the stack, which always felt a
bit clumsy.  You are almost right; that would have been a good way to express
it, and if he'd expressed it that way, instead of the way he did, it would be
ridiculous for people to criticize him for it.

The problem is that the ability to generalize and separate essentials
from accidents should pre-exist the exposure to programming. You don't
give Schildt's readers enough credit for this when in fact, as the example
of sports talk shows (which Noam Chomsky has pointed out is highly informed
and able to deal with abstraction), "ordinary" people are able to tell
the difference between essentials and accidents. A machinist sees the
micrometer of difference between a real and an ideal triangle and
compensates for it while still using Euclid.

You may have mistaken your own way of thinking, which you admit is
autistic in some measure, with that of the "normal" person. This isn't
to say that a normal person is better than you. It's to say that you
need to develop great software and not try to be a technical writer.

You should develop a tool, a super-lint, to find problems caused by
side effects in code that assumes left to right. That's because no
matter how well YOU know that left to right may not apply, this is a
natural thing for intelligent people to assume, and their intelligence
should have caused the Standard to standardize left to right. It also
should have abandoned "sequence points" and mandated that pre and post
increment and decrement be applied only to lValues.
 

Seebs

"C: The Complete Nonsense" is the major evidence.

Hmm. So you're saying that something I wrote up in maybe an hour or two
something like fourteen years ago is clear evidence as to what I have ever
learned? That's interesting.
Added to that is
your writing here. No first rate person would say that optimization

I hear... bagpipes?

... No, surely not. That's no true Scotsman.
justifies abandoning left to right (or right to left, or any fixed)
ordering,

Well, oddly, a number of people frequently regarded as "first rate" think
so.
although third rate programmers defend unreliable code as
being more efficient, invalidly reasoning that if the code is
unreliable it must be fast.

Ahh, but no one has made such an argument. You're also forgetting the
key distinction between the language and the code. We're talking about
the language, not code written in it. Code written by experienced
C programmers is not rendered "unreliable" by the order of evaluation
rule selected, *because they write code with that rule in mind*.
This shows lack of reading comprehension. Herb wrote that free "must
only be called with a pointer that was previously allocated with one
of the dynamic allocation system's functions (either malloc(),
realloc(), or calloc())."
In your example, after the first free(x), x is no longer allocated.

But it was previously allocated. He doesn't say "one that is now allocated",
he says "one that was previously allocated". That's sort of the point.
Common sense tells the reader that at the point where the second free()
is executed, x is not allocated, therefore not "previously
allocated" although allocated in the past. The adverb and participle
are ambiguous but clear in context.

Not really.
Logically, if I am NO LONGER allocated, I am not "previously
allocated" although I was allocated in the past.

Have you ever heard the phrase "previously owned"? Hint: "Previously
owned by a little old lady" does not mean "and still owned now, because
if she'd sold it it would no longer be previously owned."
The reader comes to
the passage knowing that free deallocates.

Yes, and is told that as long as it was previously allocated, it's fair
game. Ashes were previously on fire. That doesn't mean they're on fire
now, it means they were in the past. Putting out a fire doesn't make it
no longer the case that it "was previously on fire".
If you are as you say autistic,

That is what the nice lady told me, and since she's a specialist in the
field, and her judgement coincides with that of a number of other specialists,
and because her proposed explanation has both explanatory and predictive
power, I have provisionally accepted her claim.
then I ask the court for summary
judgement and costs,

Denied. *bangs gavel*
because while I have compassion for autism,

Like hell.
your autism disqualifies you from writing on programming.

No it doesn't.
This is because
part of autism includes inability to deal with natural language and
use common sense.

Not so. Perhaps before you start having "compassion" you should start
with "comprehension".

I certainly am a bit light on the unconscious set of responses people
usually refer to as "common sense". However, I have demonstrated a great
deal of facility with natural language -- which is quite common among
us "high-functioning" autistics. (The term covers pretty much anyone who
can get dressed in the morning without professional help, I figure I'm
probably covered.)

The proof, as they say, is in the pudding. Ask anyone who's ever worked with
me whether or not I can use natural language competently. No, really, go
on. Ask them. Ask them about the books I've written. (Note: I only have
one book out there in print that I'm listed as an author of, but that doesn't
mean there aren't other books I've written; it just means that the others
were, say, written for an employer or something.)
Apologize to Herb and seek treatment.

Nothing doing. I don't apologize for things that weren't wrong, and I'm
already as treated as one gets for autism.
Evidence of that would be an apology.

No. Evidence of that would be writing clearly and effectively, and (for
instance) successfully explaining issues pertaining to C, or human interface
design, or other things to people. Which, conveniently, I've done
professionally at some length.
Only if you're lazy. Sounds like more catering to incompetence and a
good way to cover up free() statements when programmers have failed to
balance malloc() and free().

Interesting theory, but... Considering how atrociously poor your demonstrated
knowledge of C is, why should I accept your judgement? What have you done
that qualifies you to comment?
If you're autistic, you have language difficulties.

Not really. I'm on the hyperlexia side of things. I have difficulties
with some specific categories of communication (such as speech acts), but
in terms of general language usage, I'm fine.

(Hint: It doesn't fit the stereotype if I go out of my way to wave the
word "autistic" in front of you because it will make you do funny things.)
If you have
language difficulties, you have no standing in pronouncing things
"semantically incoherent".

Ahh, but what if instead I have abnormal facility with language? That would
sort of reverse things.
It's like the half-literates here who say I
can't write or am verbose.

You obviously can write. I don't know about verbose, but you certainly
run on tangents at great length, and you're not a particularly effective
communicator, I'd say. You've simply been unable to offer coherent
support for many of your positions, and your hammering on the degree
and later on the autism thing suggests that you have no clue at all about
the role of ethos in effective persuasive writing.
They have no opinion I need respect, and
the echo chamber of the Internet is the only reason why your tirade
was influential.

I don't even know that it was influential. The closest I've seen is that
when some anonymous idiot who kept using open proxies to spam wikipedia
about how offensive he found the claims of "criticism of Schildt" finally
hit a wall, the wall was that someone pointed out that page and my
nominal credentials, which got the wikipedia editors to finally say "yes,
that's substantive".
Sure, I'll research Glassborow et al. But all I have seen so far are
repetitions to the effect that "Schildt sucks and contains thousands
of errors". I have not seen any list of those errors: your document
makes reference to them but lists fewer than 100.

I don't think I've claimed thousands. I'd probably say there's over a hundred
errors, including minor stuff and repetitions, and at least twenty or thirty
significant errors that cannot be adequately explained as typos. In a third
edition, no less.
Another smoking gun: your strange belief that something can be
"clear" (understandable, generative of justified true belief of
something true by my argument last month, which you haven't refuted)

Your arguments bored me, you refused to answer direct questions, so I plonked
you. I unplonked you when I found out you were funny.
yet untrue is repeated without change or consideration of its
weirdness.

And you think *I* have difficulties with language?

Let's try examples:

* The feline's autoambulatory facilities permitted an exposition of the
question of whether comestibles could be had.
* The cat can check whether it has food, because it can walk.
* Nevertheless a significant component of feline comestible-discovery
tactics relies on quantum entanglement derived powers not dissimilar
to those purportedly held by the prognosticators who advertise on
television.
* The cat uses psychic powers to see whether it has food.

These are, in order:
* Unclear, but true.
* Clear, and true.
* Unclear, and untrue.
* Clear, and untrue.

Schildt's writing frequently expresses a given thought eloquently and
unambiguously; however, it is unfortunately the case that some of the
beliefs thus expressed are untrue.
It shows, because like the C Standard they are a bureaucratic way of
certifying behavior that should not be certified.

You make a lot of random guesses about the contents of classes you never
took.
Gee, a lot of
compilers evaluate things every which way. Their vendors' CEO might
have my Daddy by the balls. Best call them all standard.

Again, no matter how many times you make this stuff up, it ain't what
happened. K&R (1978) stated that order of evaluation was not specified
by the language, and ISO C (1989) formalized that without altering it,
because no one seemed to feel it needed to be altered.
Gee, little
Johnny can't read for the key point, such as "allocate and free should
be isomorphic, in one to one correspondence, and he takes his rage out
on people who can on the Internet. Let's let him take a class in
communications and pass him."

This would be way, way, cooler if you'd not misused "isomorphic".

While it's true that a 1-1 relationship may be an isomorphism, not all
1-1 relationships are isomorphic. It's 1-1 and onto that makes something
isomorphic.
The problem is that the ability to generalize and separate essentials
from accidents should pre-exist the exposure to programming.

No, the problem is that a good writer has to indicate which are the essentials
and which are the accidents, because otherwise even a fairly smart reader
may not be able to tell which is which.

Consider. In 1978, C was unambiguously defined to allow arbitrary orders
of evaluation. And yet. Years later, you were using C in Microsoft
environments, and formed the belief that "nine out of ten" compilers
evaluated left-to-right -- despite the fact that you haven't been able
to name a single compiler which did so, and the two compilers other
people have been able to test thus far did not.

Which is to say, that years after your first CS class (1970, you tell us),
you were unable to separate essentials from accidents.
You don't
give Schildt's readers enough credit for this

I was active on Usenet during a time when many of his readers were
posting here, and demonstrably, they could not separate essentials from
accidents in this respect.

They had been told that there was "The Stack", which was a single block
of memory, and they were shocked to discover that this might not always
be the case.
You may have mistaken your own way of thinking, which you admit is
autistic in some measure, with that of the "normal" person.

No, I haven't. Not ever, really -- for me, the realization that other
people had a way of thinking was pretty much the same as the realization
that it was not the same as mine. There may have been as much as a year's
gap between those, but probably much less, and it happened before I was
reading reliably.
This isn't
to say that a normal person is better than you. It's to say that you
need to develop great software and not try to be a technical writer.

I'll totally think about that. However, what I've consistently found is
that, while I'm certainly a decent software developer, it's the ability
to translate from thinking-like-machines to thinking-like-people that's
been my most significant marginal advantage -- and that shows up mostly
in technical writing.
You should develop a tool, a super-lint, to find problems caused by
side effects in code that assumes left to right.

No, I shouldn't.

This may sound familiar, but:

You may have mistaken your own way of thinking, with that of the
"normal" person.

Because you totally have.
That's because no
matter how well YOU know that left to right may not apply, this is a
natural thing for intelligent people to assume,

I'd like to suggest a bit of a homework assignment for you. Please
go find a dictionary, and look up the word "evidence". See if your
amazing intelligence will allow you to perceive ways in which it could
be related to the statement you just made.
and their intelligence
should have caused the Standard to standardize left to right.

I have bad news for you.

It's true that you have never provided any support for your premise.
And it's also true that your conclusion is not related to your premise.
But. It turns out that two wrongs do not make a right. And a conclusion
which is unrelated to an unsupported premise is not "proven".
It also
should have abandoned "sequence points"

I don't see why.
and mandated that pre and post
increment and decrement be applied only to lValues.

How the **** am I supposed to try to take you seriously when you demand
that the standard "should have" mandated that pre and post increment and
decrement be applied only to lvalues? Why don't you demand that it
"should have" specified that execution starts at main(), or that it
"should have" included <stdio.h>, or that it "should have" been written
in English and distributed in printed form with black text on white
pages?

Hint: Of course they have to be lvalues. In fact, they have to be
*modifiable* lvalues. Duh.
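
For the record, a small illustration of what that constraint already
gives you:

int i = 0;
const int c = 0;

i++;        /* fine: i is a modifiable lvalue */
(i + 1)++;  /* constraint violation: (i + 1) is not an lvalue */
c++;        /* constraint violation: c is an lvalue, but not modifiable */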

-s
 

Keith Thompson

Seebs said:
Again, no matter how many times you make this stuff up, it ain't what
happened. K&R (1978) stated that order of evaluation was not specified
by the language, and ISO C (1989) formalized that without altering it,
because no one seemed to feel it needed to be altered.
[...]

Going even further back:

<http://cm.bell-labs.com/cm/cs/who/dmr/cman.pdf> is the C Reference
Manual that came out with 6th Edition Unix in May 1975. In section 7,
it says:

Otherwise the order of evaluation of expressions is undefined. In
particular the compiler considers itself free to compute
subexpressions in the order it believes most efficient, even if
the subexpressions involve side effects.

<http://cm.bell-labs.com/cm/cs/who/dmr/kbman.html>, dated January 7,
1972, describes B, the predecessor of C. I don't see a general
statement that the order of evaluation of expressions is undefined,
but there is a specific statement that, in a function call:

The expressions In parentheses are evaluated (in an unspecified
order) to rvalues and assigned to the function's parameters.

<http://www.masswerk.at/algol60/modified_report.htm>, the Modified
Report on the Algorithmic Language Algol 60, says:

The order of evaluation of primaries within an expression is not
defined. If different orders of evaluation would produce different
results, due to the action of side effects of function
designators, then the program is undefined.
 
