Comparison of C Sharp and C performance


spinoza1111

spinoza1111 wrote:


It seems fair to deduce from that response that you can't justify your
assertion, and you're relying on insults to cover your error. It may be
working - perhaps hardly anyone realises you're wrong because they're
too busy realising other things about you.

Again: Orwell had nothing but contempt for men who spoke in phrases
they read in books or heard from commissars, and repeated without
thought.

"It seems fair to deduce" - it seems fairer to flip you the bird
"You can't justify your assertion" - oh but I have
"Insults" - don't presume even to be the sort of person of whom it's
meaningful to say he's insultable.
 

Mark

spinoza1111 said:
But more seasoned programmers who have used Turing-complete macro
processors such as the PL/1 macro processor, MASM and BAL for IBM
mainframes tend to say that the macro is an active piece of code which
returns something. They understand that the absence of loops or go to
statements in the C preprocessor means that it is not Turing complete
at preprocessor time, but if they've studied computer science they
understand from their automata class that there are lower-powered
abstract machines nonetheless capable of wide classes of computation,
such as the finite state automaton.

Not wishing to get into the slanging match...

One issue I have with treating macros as active pieces of code returning
something (without indicating this is a simplification) is that students
may go away with some confusion over what's happening. For example,
they may later wish to look at the result of optimisation and not
understand that the code has been directly substituted in and is,
therefore, subject to a different set of optimisation rules from a
(non-inlined*) function.

If the person doesn't need to know this level of detail, it may well be
reasonable that the simplification is good enough.

* And, of course, even inlined code may or may not be treated as truly
inlined.
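
To make the contrast concrete (a minimal sketch; the names are
illustrative, not from the thread): the macro below is substituted into
the caller's text before the compiler proper ever runs, whereas the
function, inline keyword or not, remains a function that the optimiser
may or may not choose to inline.

  /* Substituted textually by the preprocessor; the compiler only ever
     sees the expanded expression. */
  #define SQUARE_M(x) ((x) * (x))

  /* A real function: whether it is inlined is the compiler's decision,
     and it is optimised under the rules for function calls. */
  static inline int square_f(int x)
  {
      return x * x;
  }

  int use_both(int n)
  {
      return SQUARE_M(n) + square_f(n);   /* first term is already (n) * (n) here */
  }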
Programmers who've studied computer science in a real university have
also had the ability to cultivate critical thinking, so they draw the
conclusion that C preprocessor macros can do a lot of damage. Thus
they evolve rules for its use such as "a C preprocessor macro must
return either an expression or a statement: if it returns an
expression, its definition must be surrounded by round parentheses: if
it returns a statement, its definition must be surrounded by curly
brackets: furthermore, each formal parameter in the body of either
type of macro must be surrounded by round parentheses wherever it
appears".

There are lots of caveats to use of macros - whether people learn them
or not rather depends on the path they take to learning the language.
Of course, that includes the books they read as well as the advice given
in class/handouts and, in all cases, that includes the simplifications
used.
Note these things about this rule:

(1) It was too advanced for the level at which Schildt was writing
(2) It belongs in a more advanced textbook on C style and standards
<snip>
(4) It is not important to the beginner, who should not even think
about writing new macros

I think these are reasonable, but I do believe simplifications should be
flagged as such, even if the detail of the simplification isn't explored.
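
As an illustration of the quoted rule (a sketch only; the macro names
are invented for the example): an expression-like macro gets a fully
parenthesised body and parenthesised parameters, while a statement-like
macro gets a braced body.

  /* Expression macro: body and every use of the parameter in round
     parentheses, so it composes safely inside larger expressions. */
  #define DOUBLE_OF(x) ((x) + (x))

  /* Statement macro: body wrapped in braces (here via the common
     do { ... } while (0) form so a trailing semicolon behaves well). */
  #define SWAP_INT(a, b) do { int tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)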
 

Ben Bacarisse

spinoza1111 said:
Because not all shops follow my rule, and it belongs less in an
introductory text and more in a book on style.

No comment at all about the questions I raised? Is the ABS macro
wrong? How bad must the code in a C book be for you to criticise it?
No, I want you to show me that you have the skill to explain this,
right here and now.

No thank you. I am not at your beck and call.
In fact, my version of your do while loop has a
real problem, and I want you to show me how you avoid it.

It is that in

#DEFINE GROSS2NET do { net = gross - taxes; } while(0)

the user must remember to terminate the macro call with a semicolon.
This in addition to its unreadability makes my solution better, I
believe, and I want you to show me if I'm wrong.

You are wrong because #DEFINE is not #define. C is case sensitive.

<snip>
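
For anyone following the exchange, here is the quoted macro with the
directive's case corrected, plus a minimal, purely illustrative program
showing why the do { ... } while (0) form expects the caller to supply
the trailing semicolon:

  #include <stdio.h>

  /* The directive is #define, not #DEFINE: directive names are
     case-sensitive, as Ben points out. */
  #define GROSS2NET do { net = gross - taxes; } while (0)

  int main(void)
  {
      double gross = 1000.0, taxes = 150.0, net = 0.0;

      /* Because the body is a single do/while statement, the caller's
         semicolon completes it cleanly, even as a branch of if/else. */
      if (gross > 0.0)
          GROSS2NET;
      else
          net = 0.0;

      printf("net = %.2f\n", net);
      return 0;
  }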
 

spinoza1111

spinoza1111 wrote:


Allow me to translate: although you're wrong and you know you're wrong,
you'd rather die than admit it, so you're trying to paint it over with
lots of words.

If you /could/ prove that you were right, you'd have done so by now.

(Sigh). In the sense independent of your considerable limitations,
I've proved I am right. But persuading you is probably beyond my
rhetorical skill. I can only say that you're losing credibility FAST
with others and myself with silly claims such as "macros don't return
anything", "Nilges isn't in comp.risks", etc. ad nauseum.
 

spinoza1111

No comment at all about the questions I raised?  Is the ABS macro
wrong?  How bad must the code in a C book be for you to criticise it?

No thank you.  I am not at your beck and call.

You are wrong because #DEFINE is not #define.  C is case sensitive.

Grasping at straws, aren't we? Yes, we know that #define is case
sensitive. I wanted to emphasise my point, and don't you DARE start
any shit with me over this because I will finish it, punk, since I've
used #define since you were in Doctor Dentons. At worst, in the
mainframe tradition, I tend to use upper case and lower case
interchangeably in writing about code, because it's aliterate nonsense
to believe that the upper and lower case words are connotatively
different.

So. Have you anything substantive to say?
 

Nick Keighley

How silly. And for years myself and other long term programmers managed
to understand what "the macro returns X" means.

to be fair spinoza seems to be using the term in a slightly different
fashion. he's talking about the textual expansion of the macro. So

#define INC(X) (X + 1)

INC(y);

"returns" (y + 1)

I was going to say that spinoza's usage, though non-standard, was
clear but perhaps it isn't...
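
The "textual expansion" reading Nick describes can be checked directly
by stopping the compiler after preprocessing; with gcc or clang, for
instance, that is the -E option (the file name here is just for the
example):

  /* inc.c */
  #define INC(X) (X + 1)

  int y = 0;
  int z = INC(y);    /* after preprocessing this line reads: int z = (y + 1); */

Running something like `gcc -E inc.c` shows the macro call replaced by
its expansion: by the time the compiler proper runs, there is no call
left and nothing is "returned" at run time.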
 

spinoza1111

spinoza1111 wrote:


Well, that's half true. He knows it, but you don't. If you'd known it,
you wouldn't have used upper case for a #define directive.

Of course, I've known that C is case-sensitive (more as a creepy
fashion statement than in any attempt to be useful) for years, as does
everybody else here.

Stop making a fool of yourself.
 

spinoza1111

spinoza1111 wrote:


That is, in *your* eyes you're right, but by any objective standard
you're wrong. I can accept that.

 > But persuading you is probably beyond my


Not so. Firstly, macros /don't/ return anything. Secondly, I never said
"Nilges isn't in comp.risks" (if you think I did, provide a message ID

Yes, we know that you used spinoza1111 and then lied.
to support your claim) and what I *did* say was more about my opinion of
the comp.risks moderator than about you. As for my losing credibility,
it seems I am losing more credibility by bothering to putting your C
mistakes right than you could ever hope to gain by avoiding the issue
all the time.

You ignore moral credibility. You cannot be trusted. If it was an
error for me to provide an example in which I upper cased define for
emphasis in a form of publication language, it was immoral of you to
try to make ignorant people believe your claim that "Nilges isn't in
comp.risks" by using a search method which even your butt buddy Kiki
pointed out was stupid.

Stop making a fool of yourself.
 

spinoza1111

to be fair spinoza seems to be using the term in a slightly different
fashion. he's talking about the textual expansion of the macro. So

#define INC(X) (X + 1)

INC(y);

"returns" (y + 1)

I was going to say that spinoza's usage, though non-standard, was
clear but perhaps it isn't...

In other words, I'm being trashed and you're afraid of being subject
to the same treatment.
Thanks for enlightening us.
 

Seebs

to be fair spinoza seems to be using the term in a slightly different
fashion. he's talking about the textual expansion of the macro. So

#define INC(X) (X + 1)

INC(y);

"returns" (y + 1)

I was going to say that spinoza's usage, though non-standard, was
clear but perhaps it isn't...

If you said it, or Francis said it, I'd probably think that was obviously
what was meant. Spinny, though, is sufficiently completely incapable of
getting technical details right (consider that this whole subthread was
inspired by his assertion that the preprocessor was doing the constant
folding) that I would not be comfortable guessing as to what he meant...

-s
 

Andrew Poelstra

If you said it, or Francis said it, I'd probably think that was obviously
what was meant. Spinny, though, is sufficiently completely incapable of
getting technical details right (consider that this whole subthread was
inspired by his assertion that the preprocessor was doing the constant
folding) that I would not be comfortable guessing as to what he meant...

I can't think of anything else he might have meant - after all, once
the preprocessor is done with the code there is nothing that even
looks like a function call left. So I can't grok any other meaning
of 'return'.

(And that a preprocessor runs is hardly a "technical detail" that
any regular here would miss.)
 

Seebs

I can't think of anything else he might have meant - after all, once
the preprocessor is done with the code there is nothing that even
looks like a function call left. So I can't grok any other meaning
of 'return'.

Again, he thought the preprocessor did constant folding, so I'm not
sure I trust him at any level.

-s
 

spinoza1111

Again, he thought the preprocessor did constant folding, so I'm not
sure I trust him at any level.

Briefly, for the space of a few hours. Like Churchill I got straight,
but like Lady Astor, "you're still ugly". How's that DOS heap coming
along?
 

spinoza1111

spinoza1111 wrote:

<snip>

Well, *you* know that, just like you know lots of things that ain't so.

Aye, it was.

 > it was immoral of you to

Go read the original. My claim was actually not really about you at all,
but about the moderator of comp.risks, whom I was crediting with good
sense - apparently mistakenly. I'm not even remotely interested in what
ignorant people believe about you.

Keep talking chump, you look more foolish and more dishonest with each
post. Here's who you're calling ignorant:

"I have been a member of the SRI International Computer Science
Laboratory since September 1971. I spent eight years at Harvard
(1950-58, with my A.B. in Math in 1954, S.M. in Applied Math in 1955,
and PhD in 1961 after returning from my two-year Fulbright in Germany
(1958-60), where I also received the German Dr rerum naturarum in
1960."

Peter Neumann

versus

"I have worked for an airline and some banks and insurance companies,
and stuff, and I edited a book"
 

spinoza1111

If you said it, or Francis said it, I'd probably think that was obviously
what was meant.  Spinny, though, is sufficiently completely incapable of
getting technical details right (consider that this whole subthread was
inspired by his assertion that the preprocessor was doing the constant

That statement was redacted. How's that DOS heap coming along? And
your belief that a psychology major and a fashionable disease entitles
you to destroy reputations?

The point was that in effect that's what happens. The preprocessor
RETURNS preprocessed C and SOME compilers resolve the value of
subexpressions. Those of us who have written compilers understand
this. Those of us who are trusted only to pass bugs on lest they mess
up don't.
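
A small illustration of the distinction being argued over (the macro
name is invented for the example): the preprocessor only substitutes
text; any evaluation of the result, constant folding included, is done
later by the compiler under ordinary C grammar.

  #define TWO 1 + 1          /* deliberately left unparenthesised */

  int a = TWO * 3;           /* expands to: int a = 1 + 1 * 3;  i.e. 4, not 6 */
  int b = (1 + 1) * 3;       /* the compiler is free to fold this to 6 at compile time */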
 

spinoza1111

His usage is clear enough but it is problematic.  

There are two different languages involved, the C preprocessor
and the C language proper, that are effectively merged.  The
preprocessor is a text transformation language.  In that context
INC is a function that accepts an argument and returns a value.

The problem is that the C standard (and ordinary usage) talks
about the two languages together, i.e., C source code is usually
a mixture of text from the two languages. Ergo it is
desirable to use different wording when talking about preprocessor
functions and C functions.  After all, the two are very different
kinds of animals.

Part of education is getting rid of the habit of reification, and the
understanding that CONCEPTS are not THINGS (especially not animals:
this is a regression to childhood). People are diverted into crappy
but apparently well paid jobs as "computer programmers" and wind up
using childish metaphors unaware that they are metaphors, but some of
us who were so diverted know this.

C is the union of the "main" language and an excrescence called the
preprocessor. While doing their thing, both functions and macros in
some unified sense return values, but we are aware that this is done
differently.

If a C #define macro contains a #defined symbol, the inner symbol has
in fact to be translated before the outer symbol. Therefore the
preprocessor uses a stack and is in fact the "runtime" of a "macro
machine". #define symbols are brain-damaged programs for this machine
and may be usefully spoken in the active voice.

There's no reason why the C preprocessor could not be Turing complete,
with a #dowhile added. In the spirit of "vanity C", in which the C
compiler writer vain-gloriously ships extra features which he snarkily
enables or disables using a handy-dandy little switch (like,
apparently, void pointer fun in GCC) this would be nifty, if only
because it would confuse and then enrage Heathfield.
 

Ben Bacarisse

spinoza1111 said:
If a C #define macro contains a #defined symbol, the inner symbol has
in fact to be translated before the outer symbol.

That's wrong. Here are two examples:

#define INNER1 arg
#define OUTER1(arg) INNER1

#define INNER2(x) #x
#define OUTER2(y) INNER2(y)

OUTER1(A)
OUTER2(A)

Work out what would happen if the inner macro is expanded before the
outer one. Then try the examples out.

The C pre-processor does re-scanning rather than nested inside-out
expansion.

<snip>
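
Worked through under the standard's rules (argument substitution, then
rescanning of the result), Ben's two invocations come out like this; a
hypothetical inside-out expander would give different answers:

  #define INNER1 arg
  #define OUTER1(arg) INNER1

  #define INNER2(x) #x
  #define OUTER2(y) INNER2(y)

  OUTER1(A)   /* OUTER1's body has no use of its parameter, so A is dropped;
                 rescanning then expands INNER1, giving:  arg
                 (inside-out expansion would have given:  A)            */

  OUTER2(A)   /* y is replaced by A, giving INNER2(A); rescanning then
                 stringises the argument, giving:  "A"
                 (inside-out expansion would have stringised y itself,
                 giving "y" no matter what argument was passed)         */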
 

spinoza1111

That's wrong.  Here are two examples:

  #define INNER1 arg
  #define OUTER1(arg) INNER1

  #define INNER2(x) #x
  #define OUTER2(y) INNER2(y)

  OUTER1(A)
  OUTER2(A)

Work out what would happen if the inner macro is expanded before the
outer one.  Then try the examples out.

The C pre-processor does re-scanning rather than nested inside-out
expansion.

Thanks for the clarification, if indeed this is a "clarification" in
the sense of being "clear" and therefore true. I don't have time,
right now, to work through your example. But is it fair to say that
the preprocessor rescans until all define symbols are gone? If this is
the case, can source code loop the preprocessor? If this is the
does this not suck? And do you know whether there is any difference
between inside out expansion and rescanning?

Note to corporate autists: a language lawyer level knowledge of so
flawed an artifact as C is NOT competence at the only meaningful task,
which is programming and maintaining software, and of the creeps here,
only Ben seems to have this competence. Kenny and the Twink may have
it as well, and I mention them because they are not creeps. This is
because competent programmers try to avoid exposure to the thought
viruses of C by not using stupid tricks, and competent programmers
rewrite poorly written code filled with Stupid C Tricks.

Thank you for your attention.
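
On the question raised above of whether source code can make rescanning
loop forever: under the standard's rules it cannot, because while a
macro's replacement is being rescanned, that macro's own name is not
expanded again. A minimal sketch:

  #define LOOP LOOP + 1

  int n = LOOP;   /* expands exactly once, to: int n = LOOP + 1;
                     the inner LOOP is left as a plain identifier rather
                     than being re-expanded, so the preprocessor cannot
                     loop (the line then fails to compile unless LOOP is
                     declared as an ordinary variable elsewhere). */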
 
