Comparison of C Sharp and C performance


spinoza1111

spinoza1111 wrote:



No, the point was that it is important not to confuse the textual
substitution performed by the preprocessor with the compile-time
arithmetical simplification performed (optionally) by the translator, be
it an interpreter or a compiler or something in between.
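
A minimal illustration of the distinction, purely illustrative:

#define SIZE 2+3          /* the preprocessor merely pastes the tokens 2+3 */

int a[SIZE * 2];          /* becomes: int a[2+3 * 2];  i.e. 8 elements, not 10 */
int b = 4 * 256;          /* no macro involved; the compiler MAY fold this to 1024 */

The first declaration is pure textual substitution; the folding of 4 * 256
is an optional arithmetic simplification done by the translator, not by
the preprocessor.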

Yeah, I cleared this up for you, didn't I.
You wear that like a badge, but if your C knowledge is anything to go
by, I wouldn't trust your compiler to compile "hello world" properly.

Stop making a fool of yourself.
 

spinoza1111

spinoza1111 wrote:

I'm not calling anyone ignorant. Learn to read for comprehension.

Stop making a fool of yourself. I swear to God I'm gonna write you a
sonnet on that theme.

Here's what you said about Peter Neumann: "the moderator of
comp.risks, whom I was crediting with good sense - apparently
mistakenly. I'm not even remotely interested in what ignorant people
believe about you."

Don't make total fool of thy sweet self
Thou art already mostly fashion'd fool:
With pomposity and pretense you exhaust pelf:
Perhaps you need to go back to school.
And learn there in fool school manners
Such breeding as can be taught you,
How to write parsers and also scanners,
And how to wipe your arse after you poo.
Such a Borstal you deserve, dearest Dick
Not the victor's crown or laurel so green:
The loser's lousy lot forever you will lick
Until you cultivate sensitivity and understanding.
Let this indeed be a lesson unto ye
Stop causing this thread such unspeakable Misery.
 

spinoza1111

You were wise to abandon philosophy.

WTF? I didn't. The head of the department asked me, after I got my BA
and had no plans to go to graduate school, to teach philosophy.

But after programming thereafter for ten years, I woke up in a sort of
Anbar Awakening, and saw for the first time the dull souls of my
coworkers. I became an autodidact but took whatever opportunity I
could to Get Smart, including bypassing higher-paid Korporate jobs for
a job at Princeton University.

But you can read the story in my published book, since Apress allowed
me to put in biographical details while discussing how to build a
compiler in an amusing fashion.

Programming was merely for me a draft-dodging scheme that got out of
hand (the draft didn't end until 1973 and I took my first computer
science class in 1970). But one gets interested in reified crap.
 

Richard Tobin

spinoza1111 said:
Thanks for the clarification, if indeed this is a "clarification" in
the sense of being "clear" and therefore true. I don't have time,
right now, to work through your example. But is it fair to say that
the preprocessor rescans until all define symbols are gone? If this is
the case, can source code loop the preprocessor? If this is the
case, does this not suck?

Rescanning is done, but not in a way that can cause indefinite
recursion. See section 6.10.3.4 of C99, "Rescanning and further
replacement".
And do you know whether there is any difference
between inside out expansion and rescanning?

The very article you were replying to showed an example of this. But
if you "didn't have time to work through it" - it should only take a
few seconds - I'm not going to spend my time spelling it out to you.

-- Richard
 

Nick Keighley

In other words, I'm being trashed and you're afraid of being subject
to the same treatment.

no. It's surprising how often "in other words" is used to introduce a
phrase that has a radically different meaning from that which it
replaces (a sort of natural language macro).

I was actually trying to spin your post so it made some sort of sense.
Foolish of me.

Your macro rule, paraphrased:

"parenthesize formal parameters in macro definitions, decide whether
to [expand to] an
expression or a statement, [not anything] else, if you [expand to] an
expression [enclose] it in its own set of round parentheses, if you
[expand to] a statement [enclose] the statement list in its own set of
braces"

This rules out expanding to a complete function (I've done this) or a
chunk of code. Your rule is more a rule of thumb, one that I've seen
violated many times. Sometimes sensibly.
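
For what it's worth, here is a sketch of one macro of each kind the
paraphrased rule allows (the names are illustrative):

/* expression macro: parameters and the whole replacement parenthesized */
#define SQUARE(x) ((x) * (x))

/* statement macro: the statement list enclosed in its own braces */
#define SWAP_INT(a, b) { int swap_tmp_ = (a); (a) = (b); (b) = swap_tmp_; }

Many shops prefer do { ... } while (0) to bare braces for statement
macros, so that a trailing semicolon doesn't break an if/else; the rule
as paraphrased above asks only for braces.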
 

Nick Keighley

If you said it, or Francis said it, I'd probably think that was obviously
what was meant.  Spinny, though, is sufficiently completely incapable of
getting technical details right (consider that this whole subthread was
inspired by his assertion that the preprocessor was doing the constant
folding)

I know; I meant to dig it out, but this thread is too long.

that I would not be comfortable guessing as to what he meant...

He also changes what he claims to mean when he realises he's wrong.
 

spinoza1111

In other words, I'm being trashed and you're afraid of being subject
to the same treatment.

no. It's surprising how often "in other words" is used to introduce a
phrase that has a radically different meaning from that which it
replaces (a sort of natural language macro).

I was actually trying to spin your post so it made some sort of sense.
Foolish of me.

Your macro rule para-phrased:-

"parenthesize formal parameters in macro definitions, decide whether
to [expand to] an
expression or a statement, [not anything] else, if you [expand to] an
expression [enclose] it in its own set of round parentheses, if you
[expand to] a statement [enclose] the statement list in its own set of
braces"

this rules out expanding to a complete function (I've done this) or
chunk of code. Your rule is more a rule of thumb; that I've seen

George Orwell's most important rule in "Politics and the English
Language" (http://www.mtholyoke.edu/acad/intrel/orwell46.htm) is
little known:

"Never use a metaphor, simile, or other figure of speech which you are
used to seeing in print."

I'd add "or hearing from your mates especially in drunken bull
sessions". Orwell was addressing the sort of people that read left-
wing crap and repeated it mindlessly in the 1930s, but today he'd be
ranting about Fox News idiots and the stupid, empty things ordinary
people say.

"Chunk of code" is precisely that kind of programmer phrase which
shows that someone ain't thinking. The rule specifically PROHIBITS you
from thinking in terms of "chunks of code".

However, you have suggested a third possibility, which would be the
generation of function definitions or declarations. The important
thing is that the macro is documented by something which defines it as
one of an expression, statement, declaration or definition macro.
violated many times. Sometimes sensibly.

Why is it preferable apart from the cheap thrill factor to violate
rules rather than change them? Must we so conform to bourgeois
morality that instead of daring to redraft the rules we violate them
in that sort of anarchism which is a precondition of Fascism?
 

Nick Keighley

Your macro [writing] rule para-phrased:-
"parenthesize formal parameters in macro definitions, decide whether
to [expand to] an
expression or a statement, [not anything] else, if you [expand to] an
expression [enclose] it in its own set of round parentheses, if you
[expand to] a statement [enclose] the statement list in its own set of
braces"
this rules out expanding to a complete function (I've done this) or
chunk of code. Your rule is more a rule of thumb;

George Orwell's most important rule in "Politics and the English
Language" (http://www.mtholyoke.edu/acad/intrel/orwell46.htm) is
little known:

"Never use a metaphor, simile, or other figure of speech which you are
used to seeing in print."

I'd add "or hearing from your mates especially in drunken bull
sessions". Orwell was addressing the sort of people that read left-
wing crap and repeated it mindlessly in the 1930s, but today he'd be
ranting about Fox News idiots and the stupid, empty things ordinary
people say.

in your opinion

"Chunk of code" is precisely that kind of programmer phrase which
shows that someone ain't thinking. The rule specifically PROHIBITS you
from thinking in terms of "chunks of code".

and I'm disagreeing with it. I don't agree that "chunk of code" is an
"empty thing"; I used it because it says what I want.
However, you have suggested a third possibility, which would be the
generation of function definitions or declarations. The important
thing is that the macro is documented by something which defines it as
one of an expression, statement, declaration or definition macro.

or you could just read the macro definition.
Why is it preferable apart from the cheap thrill factor to violate
rules rather than change them?

because it's hard to lay down rules for coding standards that work in
all situations. This is why most coding standards include "let-out"
clauses: "Don't violate the rules in this standard without some
thought. Preferably document the reason for the violation." Some
standards insist on a formal process for documenting standards
violations. If programming could be reduced to simple rules, we'd write
one last program and bugger off and do something else.

<snip>

and, incidentally, I agree that stuffing arbitrary lumps of code into
macros is generally a poor idea. I've debugged code like that and it's
not fun.


--
"High Integrity Software: The SPARK Approach to Safety and Security"
Customers interested in this title may also be interested in:
"Windows XP Home"
(Amazon)
 

Daniel

no. It's surprising how often "in other words" is used to introduce a
phrase that has a radically different meaning from that which it
replaces (a sort of natural language macro).
I was actually trying to spin your post so it made some sort of sense.
Foolish of me.
Your macro rule para-phrased:-
"parenthesize formal parameters in macro definitions, decide whether
to [expand to] an
expression or a statement, [not anything] else, if you [expand to] an
expression [enclose] it in its own set of round parentheses, if you
[expand to] a statement [enclose] the statement list in its own set of
braces"
this rules out expanding to a complete function (I've done this) or
chunk of code. Your rule is more a rule of thumb; that I've seen

George Orwell's most important rule in "Politics and the English
Language" (http://www.mtholyoke.edu/acad/intrel/orwell46.htm) is
little known:

"Never use a metaphor, simile, or other figure of speech which you are
used to seeing in print."

I'd add "or hearing from your mates especially in drunken bull
sessions". Orwell was addressing the sort of people that read left-
wing crap and repeated it mindlessly in the 1930s, but today he'd be
ranting about Fox News idiots and the stupid, empty things ordinary
people say.

"Chunk of code" is precisely that kind of programmer phrase which
shows that someone ain't thinking. The rule specifically PROHIBITS you
from thinking in terms of "chunks of code".

However, you have suggested a third possibility, which would be the
generation of function definitions or declarations. The important
thing is that the macro is documented by something which defines it as
one of an expression, statement, declaration or definition macro.
violated many times. Sometimes sensibly.

Why is it preferable apart from the cheap thrill factor to violate
rules rather than change them? Must we so conform to bourgeois
morality that instead of daring to redraft the rules we violate them
in that sort of anarchism which is a precondition of Fascism?

I must say, it's kind of sickening to see many of the posts. I just
happened on the discussion looking for something else, and am not going
to be sucked in (other than this one post).

In a way I feel kind of sorry for spinoza1111. I hope he gets into
something more productive. It's quite clear to me he's a pretty bad
programmer and lacking in knowledge of C. His rantings do seem to
cause a huge waste of time and energy, and cause harm to a group
that I respect and make use of.

By the way, although I'm a C programmer, I really love C#. C# is
very well designed and a pleasure to write code with. But it is
slow. I know. I completely ported a C database application to it.
But I had to give it up because of the slow performance.

Daniel
 

Phil Carmody

Richard Heathfield said:
I don't expect you to understand that paragraph. In fact, at this
stage I can hardly bring myself to expect you to understand the
alphabet.

Yes. Because he's ignorant of so much.

Get over it. Killfile and get on with your life; your time is too
valuable. In particular, your time in front of Usenet is valuable to
_others_, so please don't waste it.

Phil
 

Keith Thompson

Morris Dovey said:
Richard's time here is valuable precisely /because/ he understands
that knowledge is not well-served by ignoring error.

In most cases, I agree with you. In the case of "spinoza1111",
trying to correct his errors is a waste of time. Most of what
he posts is nonsense (insults, irrelevant rants); much of the
rest is just wrong. If he dropped the nonsense and just posted
actual information that's mostly wrong, correcting him would
probably be useful, for the sake of other readers if nothing else.
But correcting his factual errors just encourages him to post
more nonsense.

I believe this newsgroup would be greatly improved by his absence,
but we can get most of the benefit by ignoring him.
 

Richard Bos

Beej Jorgensen said:
Indeed, the standard doesn't even take a stab at the definition of
"pseudo-random". I think it's probably happy to let the free market
sort that one out. :)

Well, yes, that was more or less my point, wasn't it? The Standard
probably _can't_ make demands on the quality of rand() that would
satisfy everyone. Some people want high-quality PRNs, some want an
ultra-fast sequence of noise without being overly concerned about
statistical rigour. You can't satisfy both.
All a Standard for general use can do is note that most programmers, at
some point, want a small number of numbers which the common user will
see as random. Beyond that, you are going to have to define _very_
precisely what exactly you mean with "pseudo-random", and that
definition is guaranteed to be wrong for some people. As it is, the
Standard has provided a rand() which suffices for trivial (i.e., most)
use, and
"So, your PRNG just outputs the number 3490 over and over... that's
crap, see."

No, see, that is part of what, IMO, the Standard _can_, and does,
demand: that the advertised limits of your rand() are correct. "There
must be at least N degrees of randomness in each of the M bit-sets of
the random number stream" is such a specialised job that I've had to
resort to gobbledegook in that sentence. "A range of numbers shall be
generated from 0 to RAND_MAX, inclusive" is much easier both to specify,
and for less heavy-weight implementations to comply with. And IYAM, just
3490 repeated does not comply with that simplest of demands, the only
one which the Standard _can_ reasonably make.
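
For concreteness, something along the lines of the portable example the
Standard itself gives for rand() (C99 7.20.2.2) meets that minimal demand.
This is a sketch, assuming RAND_MAX is 32767; it is low quality, but its
advertised range, 0 to 32767 inclusive, is honest:

static unsigned long next_seed = 1;

int my_rand(void)                 /* illustrative name, not the library rand() */
{
    next_seed = next_seed * 1103515245 + 12345;
    return (int)((next_seed / 65536) % 32768);
}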

Richard
 

Keith Thompson

Beej Jorgensen said:
The problem is with the 'pseudo-random'. It fails to specify what
tests the generated sequence must pass in order to be accepted.

Indeed, the standard doesn't even take a stab at the definition of
"pseudo-random". I think it's probably happy to let the free market
sort that one out. :) [...]

"So, your PRNG just outputs the number 3490 over and over... that's
crap, see."

No, see, that is part of what, IMO, the Standard _can_, and does,
demand: that the advertised limits of your rand() are correct. "There
must be at least N degrees of randomness in each of the M bit-sets of
the random number stream" is such a specialised job that I've had to
resort to gobbledegook in that sentence. "A range of numbers shall be
generated from 0 to RAND_MAX, inclusive" is much easier both to specify,
and for less heavy-weight implementations to comply with. And IYAM, just
3490 repeated does not comply with that simplest of demands, the only
one which the Standard _can_ reasonably make.

Would a rand() implementation that does a decent job of generating
numbers from 1 to RAND_MAX, but that never produces 0, be conforming?
(Assume that this can be proven by examining the source code.)
Note that for a large RAND_MAX, this might be good enough for
most purposes.

Must rand() be able to return *all* numbers from 0 to RAND_MAX?

These are mostly rhetorical questions; I don't think they can be
definitively answered given the wording of the standard. But I
think the difference between a rand() that generates numbers from
1 to RAND_MAX and one that just generates 3490 is more qualitative
than quantitative.

Intuitively, a rand() that always returns 3490 is Just Wrong, but
there's no clear dividing line between correct and incorrect
implementations.
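
One way to poke at that question empirically (a sketch only; sampling can
suggest, but never prove, that a value such as 0 is unreachable):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long i, zeros = 0, top_hits = 0;
    int lo = RAND_MAX, hi = 0;

    srand(12345);                         /* arbitrary fixed seed */
    for (i = 0; i < 10000000UL; i++) {
        int r = rand();
        if (r < lo) lo = r;
        if (r > hi) hi = r;
        if (r == 0) zeros++;
        if (r == RAND_MAX) top_hits++;
    }
    printf("observed %d..%d (RAND_MAX %d), zeros=%lu, RAND_MAX hits=%lu\n",
           lo, hi, RAND_MAX, zeros, top_hits);
    return 0;
}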
 

spinoza1111

Most of these "errors" are rumors. We all make "errors": Seebach has
claimed that "the heap is a DOS term", Richard has claimed that I'm
not in comp.risks, etc.

However, a poster who, like me, is a statistical outlier, in that he's
not a half-literate, narrow little technician, attracts jealousy and
hatred here from the same sort of people who supported Hitler in
Germany: the white-collar lower middle loser class.

Anyway, there are better things to do than code C, such as sending money
to Haiti in aid of the earthquake victims.
 

spinoza1111

Yes. Because he's ignorant of so much.

Just because a person makes an error doesn't mean he's ignorant. Is
Heathfield unable to search correctly, given his claim that I'm not in
comp.risks based on searching titles of digests? Is Seebach ignorant
of basic computer science given his claim that "the 'heap' is a DOS
term"?

There are three possibilities with respect to an error:

(1) Ignorance
(2) Malicious lying (a strong possibility in Heathfield's case based
on his global conduct)
(3) Honest errors of the sort Einstein made in mathematics and von
Neumann in coding

A gentleman assumes (3) in most cases.

However, people of the white collar lower middle class, of the same
sort who supported Hitler and the slaughter of the Jews, tend through
their own psychological insecurity (which is based on their material
insecurity, for their "professional" jobs are held by grace and favor
of large corporations) to ascribe (1).

However, there are better things to do than code C, such as helping the
people of Haiti. They're in hell thanks in part to the fact that
shitty little white computer programmers have supported the American and
French governments, which have been punishing Haiti for two hundred years
for taking the rights of man and citizen seriously while black.
 

spinoza1111

On Jan 12, 7:49 pm, Nick Keighley <[email protected]>
wrote:
On Jan 11, 4:16 pm, Nick Keighley <[email protected]>
wrote:
spinoza1111 wrote:
He didn't want to explain "my" 1990 rule (parenthesize formal
parameters in macro definitions, decide whether to return an
expression or a statement, return nothing else, if you return an
expression return it in its own set of round parentheses, if you
return a statement return the statement list in its own set of braces)
because there might have been counter-examples and the rule is rather
complicated.
There is no sensible rule, simple or complicated, about what macros
return, because macros don't return anything. We covered this already.
How silly. And for years myself and other long term programmers managed
to understand what "the macro returns X" means.
to be fair spinoza seems to be using the term in a slightly different
fashion. he's talking about the textual expansion of the macro. So
#define INC(X) (X + 1)
INC(y);
"returns" (y + 1)
I was going to say that spinoza's usage, though non-standard, was
clear but perhaps it isn't...
In other words, I'm being trashed and you're afraid of being subject
to the same treatment.
no. It's surprising how often "in other words" is used to introduce a
phrase that has a radically different meaning from that which it
replaces (a sort of natural language macro).
I was actually trying to spin your post so it made some sort of sense.
Foolish of me.
Your macro rule para-phrased:-
"parenthesize formal parameters in macro definitions, decide whether
to [expand to] an
expression or a statement, [not anything] else, if you [expand to] an
expression [enclose] it in its own set of round parentheses, if you
[expand to] a statement [enclose] the statement list in its own set of
braces"
this rules out expanding to a complete function (I've done this) or
chunk of code. Your rule is more a rule of thumb; that I've seen
George Orwell's most important rule in "Politics and the English
Language" (http://www.mtholyoke.edu/acad/intrel/orwell46.htm) is
little known:
"Never use a metaphor, simile, or other figure of speech which you are
used to seeing in print."
I'd add "or hearing from your mates especially in drunken bull
sessions". Orwell was addressing the sort of people that read left-
wing crap and repeated it mindlessly in the 1930s, but today he'd be
ranting about Fox News idiots and the stupid, empty things ordinary
people say.
"Chunk of code" is precisely that kind of programmer phrase which
shows that someone ain't thinking. The rule specifically PROHIBITS you
from thinking in terms of "chunks of code".
However, you have suggested a third possibility, which would be the
generation of function definitions or declarations. The important
thing is that the macro is documented by something which defines it as
one of an expression, statement, declaration or definition macro.
Why is it preferable apart from the cheap thrill factor to violate
rules rather than change them? Must we so conform to bourgeois
morality that instead of daring to redraft the rules we violate them
in that sort of anarchism which is a precondition of Fascism?

I must say, it's kind of sickening to see many of the posts. I just
happened on the discussion looking for something else, and am not
going
to be sucked in (other than this one post).

In a way I feel kind of sorry for spinoza1111. I hope he gets into
something more productive. It's quite clear to me he's a pretty bad

The "bad programmer" is a social construct. It's the image of what the
real programmer, who is objectively a powerless member of the lower
middle class, fears himself to be. He is in fact the Jew in Hitler's
Germany who the non-Jew of the white collar lower middle class feared
himself to be and therefore murdered.

Most objectively good programmers have left or been driven from the
field, since in a corporation it's more important to not rock a boat
constructed nearly completely from incompetent code. I'm now a teacher
of English, social studies, and occasionally computer science.

As the author of "Build Your Own .Net Language and Compiler" who was
asked by Princeton Univ to assist John Nash with C, I don't lose any
beauty sleep over these wild accusations. At the same time, the
exclusion of great programmers from real programming was a real harm
to them, because they've been replaced by liars and fools, like Peter
Seebach, who hasn't written, as far as I can tell, anything other than
shell scripts, who believes that "the 'heap' is a DOS term", and who's
actually proud of never taking comp sci.

Anyway, I am coding some C to see if I can implement "ropes",
considered as unrestricted strings that can contain NULs and have no
length limits. I am also donating time and money to Haiti, and I
suggest you do so too, instead of insulting strangers on the Internet
(which isn't quite the same thing as defending oneself as a man and
gentleman).
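
For what it's worth, a counted representation is the usual starting point
for strings that may contain NUL bytes; the sketch below is illustrative
only (the names are mine, not code from this thread), and a rope proper
would then be a balanced tree of such pieces:

#include <stdlib.h>
#include <string.h>

/* Length is stored explicitly; no terminator is implied, so the bytes
   may include NULs. */
struct span {
    char   *data;
    size_t  len;
};

/* Copy len bytes, NULs and all. Returns 0 on success, -1 on failure. */
static int span_init(struct span *s, const char *bytes, size_t len)
{
    s->data = malloc(len ? len : 1);
    if (s->data == NULL)
        return -1;
    if (len)
        memcpy(s->data, bytes, len);
    s->len = len;
    return 0;
}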
 

Nick Keighley

However, there are better things to do than code C. Such as help the
people of Haiti. They're in hell thanks in part to the fact that
<expletive> little white computer programmers have supported American and
French governments have been punishing Haiti for two hundred years for
taking the rights of man and citizen seriously while black.

The Americans and the French have been programming computers for two
hundred years! Who knew?!
 

spinoza1111

The Americans and the French have been programming computers for two
hundred years! Who knew?!

Before you programmed them, you were computers: cf. When Computers
Were Human (Princeton Univ Press). The focus is on your relations with
the rest of society, and in those relations, it is necessary to
fantasize that you have a "skill" even when you don't have one
independent of your relations to the means of production.
 

Phil Carmody

Morris Dovey said:
Richard's time here is valuable precisely /because/ he understands
that knowledge is not well-served by ignoring error.

Nilges bilge doesn't count as 'error', it counts as 'noise'.

Phil
 

spinoza1111

In most cases, I agree with you.  In the case of "spinoza1111",
trying to correct his errors is a waste of time.  Most of what
he posts is nonsense (insults, irrelevant rants); much of the

They're not rants.
rest is just wrong.  If he dropped the nonsense and just posted
actual information that's mostly wrong, correcting him would

Can you provide pithy examples? Is it for example wrong that a
benchmark showed C Sharp only ten percent slower than C in code with
so many loops that an interpreter would have been an order of
magnitude slower? Or are you exaggerating mistakes whilst forgiving
really egregious errors ("Nilges isn't in comp.risks" and "the 'heap'
is a DOS term") in your butt buddies?
probably be useful, for the sake of other readers if nothing else.

I haven't been making false claims about C. Where I've made errors,
they have been retracted, despite the fact that you thugs then use the
retractions to "prove" you are Tough Guys Who Never Make a Mistake in
the rhetorical sense.
But correcting his factual errors just encourages him to post
more nonsense.

I believe this newsgroup would be greatly improved by his absence,
but we can get most of the benefit by ignoring him.

But you're not. You're merely talking to others about me in my
presence, which would be considered very ill-bred in meat space.

**** you, asshole (and that's far less ill-bred than you).
 
