Comparison of C Sharp and C performance

S

spinoza1111

You are completely missing the point, technical terms always mean
exactly what Nilges says they do and if we disagree it is because we are
ignorant and did not study CS at the same places he did.

Well, you are ignorant, but technical terms (as I've been saying) take
their meaning from multi-term structures in context, not from what I say
they mean.
 
S

spinoza1111

You are completely missing the point, technical terms always mean
exactly what Nilges says they do and if we disagree it is because we are
ignorant and did not study CS at the same places he did.

Of course in the real world that the rest of us inhabit many terms have
changed their meaning over the last 50 years. Just think about how the
word 'computer' has changed its meaning. Come to that, exactly what is a
computer (and be careful, because careless definitions will include many
things that most of us would not actually consider to be computers)?

Golly think ponder ponder...it's a Turing-complete device?
 
S

spinoza1111

That means you're too educated.

I think Spinny's actually made the key point himself, probably
unintentionally:


The point, of course, being that generally a "compiled language" is one
which doesn't have the interpreter.  A normal C implementation doesn't
have an interpreter; it generates code which is run natively.  Schildt
did not write a thing which generated native code, so it's not what is
normally called a "compiler" -- thus his emphasis on it being an
interpreter.

From the GNU C Compiler documentation:

"Historically, COMPILERS [my emphasis] for many languages, including C+
+ and Fortran, have been implemented as “preprocessors” which emit
another high level language such as C. None of the compilers included
in GCC are implemented this way; they all generate machine code
directly."

Because of sexual anxieties, many developers reserve words with high
positive connotation to mean complex and more difficult to implement
things, where the software becomes Lacan's and Zizek's "big
other" (the Stalinist father of WWII who by definition is never
satisfied with the son). Which of course goes against the meaning of
the adage that great engineering is simple engineering. If you're
writing a demo or instructional compiler as were Herb and I, the first
edition best emits interpreted code, and for this sacrifice of speed
you get better debugging.

The GNU documentation clearly implies that a COMPILER which "generates
machine code directly" is better than one that doesn't. But it also
uses "compiler" to refer to what Herb wrote.

You could argue, in a sort of pedantic way, that the interpreter actually
compiles-to-X, where X is something other than native code, and then
something else interprets the X.

Gee, you could.
So, given Java as an example to draw from:  Does Schildt's "C interpreter"
allow you to convert C programs to some kind of data file, which another
program or piece of hardware then executes?

Java and .Net do not typically interpret code: we've told you that,
dear little Peter. They translate bytecode to native code the first
time the path of logic that contains the code is executed in most
implementations, although there exist implementations which are
interpreters. Confusingly, esp. to people without proper education in
computer science, Microsoft documentation calls this step JIT
compilation, which is incorrect if we reserve the term "compiler" for
something that translates a Chomsky type 1 or type 0 source
language writable and readable by humans.
 
S

spinoza1111

That means you're too educated.

Wow. Wow. "Too educated". That's one for the red book:

"The 'Heap' is a DOS term"
"You're too educated"

"Too" educated?

Oh that this too too solid Flesh, would melt,
Thaw, and resolue it selfe into a Dew
I think Spinny's actually made the key point himself, probably
unintentionally:


The point, of course, being that generally a "compiled language" is one

Yes. The participle is in opposition to interpreted. But the use isn't
grammatically consistent.
which doesn't have the interpreter.  A normal C implementation doesn't
have an interpreter; it generates code which is run natively.  Schildt
did not write a thing which generated native code, so it's not what is
normally called a "compiler" -- thus his emphasis on it being an
interpreter.

You could argue, in a sort of pedantic way, that the interpreter actually
compiles-to-X, where X is something other than native code, and then
something else interprets the X.

So, given Java as an example to draw from:  Does Schildt's "C interpreter"
allow you to convert C programs to some kind of data file, which another
program or piece of hardware then executes?

Or that the Euerlasting had not fixt
316: His Cannon 'gainst Selfe-slaughter. O God, O God!
317: How weary, stale, flat, and vnprofitable
318: Seemes to me all the vses of this world?
319: Fie on't? Oh fie, fie, 'tis an vnweeded Garden
320: That growes to Seed: Things rank, and grosse in Nature
321: Possesse it meerely. That it should come to this!
 
S

spinoza1111

That is still no answer.  I don't have that book anymore but I don't
recall anything much about parsing type 1 grammars in it.




Do you not want me to correct your initial statement so that it makes
sense, then?  I am pretty sure you meant "Chomsky type 2".  Why not
just use the more usual term "context free grammars" so you don't
have to remember what type number they are?

No thanks, because in my readings about automata theory (which started
with independent reading of Hopcroft and Ullman in 1971 and included a
graduate-level class), this numbering system has VARIED.

When will you learn that clerkish knowledge isn't knowledge? And when
will you start using your skills for good and not evil?
 
S

spinoza1111

spinoza1111 wrote:




OK that tells us something about the programmers you normally come in
contact with. Contract workers come in two bands, those who are too

"You are the homeless men and crack hos I see from my car" is not an
argument. Nearly all contract programmers are basically rejects from a
bad system, which says something about them and the system.
 
B

Ben Bacarisse

spinoza1111 said:
No thanks, because in my readings about automata theory (which started
with independent reading of Hopcroft and Ullman in 1971 and included a
graduate-level class), this numbering system has VARIED.

Citation? Nothing I've seen by anyone as reputable as Hopcroft or
Ullman uses any other numbering. Who does (other than yourself)?
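
For reference, the numbering as Chomsky gave it, and as far as I can
tell as every standard text repeats it:

        type 0 - unrestricted grammars (recursively enumerable languages)
        type 1 - context-sensitive grammars
        type 2 - context-free grammars
        type 3 - regular grammars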

<snip>
 
B

Ben Bacarisse

spinoza1111 said:
The GNU documentation clearly implies that a COMPILER which "generates
machine code directly" is better than one that doesn't. But it also
uses "compiler" to refer to what Herb wrote.

Citation? Specifically where the GNU documentation uses "compiler"
for what is clearly an interpreter. I can't find any such use.

<snip>
 
J

JW

I'm impressed.  Very few people could define an N^2 algorithm for calculating
factorials.

Hint:  When you call factorial(19), you calculate factorial(19) iteratively,
and then you calculate 19 * factorial(18).  You then calculate factorial(18)
iteratively, then calculate 18 * factorial(17).  Et cetera.

In short, for factorial(19), instead of performing 38 multiplications and
19 calls, you perform 19 multiplications and 19 calls for the recursive
calculation, plus 164 multiplications for the iterative calculations.

This is not a reasonable way to go about things.  This is a pretty
impressive screwup on your part, and you can't blame the language design;
this is purely at the algorithm-design level, not any kind of mysterious
quirk of C.

Again, the problem isn't with C's design; it's that you are too muddled
to design even a basic test using two algorithms, as you embedded one
in another.

Here are two test programs.  One's yours, but I switched to 'long double'
and used 24! instead of 19! as the test case, and multiplied the number
of trials by 10.

The main loop is unchanged except for the change in N and the switch to
%Lf.
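
The main loop itself isn't reproduced in the thread; a minimal sketch of
the kind of harness being described might look like this (only the 24,
the 10000000 trials, and the %Lf output come from above; the use of
clock() and the variable names are my guesses):

        #include <stdio.h>
        #include <time.h>

        long double factorial(long double N);   /* either version below */

        int main(void)
        {
            long double result = 0;
            clock_t start, stop;
            long i;

            start = clock();
            for (i = 0; i < 10000000L; i++)
                result = factorial(24);
            stop = clock();

            printf("24! is %.0Lf: %.2f seconds to calculate 10000000 times\n",
                   result, (double)(stop - start) / CLOCKS_PER_SEC);
            return 0;
        }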

Yours:
        long double factorial(long double N)
        {
            long double nFactorialRecursive;
            long double nFactorialIterative;
            long double Nwork;
            if (N <= 2) return N;
            for ( nFactorialIterative = 1, Nwork = N; Nwork > 1; Nwork-- )
                nFactorialIterative *= Nwork;
            nFactorialRecursive = N * factorial(N-1);
            if (nFactorialRecursive != nFactorialIterative)
               printf("%Lf! is %Lf recursively but %Lf iteratively wtf!\n",
                      N,
                      nFactorialRecursive,
                      nFactorialIterative);
            return nFactorialRecursive;
        }

Mine:

        long double ifactorial(long double N)
        {
            long double nFactorialIterative;
            long double Nwork;
            if (N <= 2) return N;
            for ( nFactorialIterative = 1, Nwork = N; Nwork > 1; Nwork-- )
                nFactorialIterative *= Nwork;
            return nFactorialIterative;
        }

        long double rfactorial(long double N)
        {
            long double nFactorialRecursive;
            if (N <= 2) return N;
            nFactorialRecursive = N * rfactorial(N-1);
            return nFactorialRecursive;
        }

        long double factorial(long double N)
        {
            long double nFactorialRecursive;
            long double nFactorialIterative;
            nFactorialIterative = ifactorial(N);
            nFactorialRecursive = rfactorial(N);
            if (nFactorialRecursive != nFactorialIterative)
               printf("%Lf! is %Lf recursively but %Lf iteratively wtf!\n",
                      N,
                      nFactorialRecursive,
                      nFactorialIterative);
            return nFactorialRecursive;
        }

Output from the main loops:

24! is 620448401733239409999872: 14.00 seconds to calculate 10000000 times
24! is 620448401733239409999872: 5.00 seconds to calculate 10000000 times

... Which is to say, no one cares whether C# is faster or slower than C
by a few percent, when non-idiotic code is faster than idiotic code by
nearly a factor of three.

-s

Sorry to interrupt. I am just curious why the second implementation
is significantly faster than the first implementation. My guess is
that the second implementation breaks the two factorial calculations
into two subroutines, so the memory can be released after each
calculation, which boosts the speed. Is my guess reasonable?

-jw
 
S

spinoza1111

Citation?  Nothing I've seen by anyone as reputable as Hopcroft or
Ullman uses any other numbering.  Who does (other than yourself)?



It's not important, Ben. What's important is the taxonomy: regular
languages, context free languages, context sensitive languages, and
everything else. I studied this stuff outside the workplace and on my
own time in 1971 and revisited it in a graduate school class in which I
got an A. However, I've never used the information on the job nor have
I taught it. I think that the awareness that the taxonomy exists is
the important thing here, and this awareness doesn't exist in the
typical auto-didact.

Having one's own notation could indicate that one appreciates it at a
deeper level than the rote-learning swot, or invented it on one's own
like some sort of crazed savant. In my case, the former is true and
the latter is false.
 
N

Nick Keighley

the amusing thing is you name-drop Chomsky then apparently get it
wrong. It probably would have taken you 60s tops to check it out.

once you start using technical vocabulary you enter the realm of
exactness.


so you were self taught on this stuff?

It's not important, Ben. What's important is the taxonomy: regular
languages, context free languages, context sensitive languages, and
everything else.

If you'd said that at the beginning it would have been impressive.

I studied this stuff outside the workplace and on my
own time in 1971 and revisited it in a graduate school class in which I
got an A. However, I've never used the information on the job nor have
I taught it. I think that the awareness that the taxonomy exists is
the important thing here, and this awareness doesn't exist in the
typical auto-didact.

doesn't auto-didact mean "self taught"?

Having one's own notation could indicate that one appreciates it at a
deeper level than the rote-learning swot, or invented it on one's own
like some sort of crazed savant. In my case, the former is true and
the latter is false.

riight...
 
N

Nick Keighley

On 2010-01-03, Ben Bacarisse <[email protected]> wrote:
That means you're too educated.
I think Spinny's actually made the key point himself, probably
unintentionally:
The point, of course, being that generally a "compiled language" is one
which doesn't have the interpreter.  A normal C implementation doesn't
have an interpreter; it generates code which is run natively.  Schildt
did not write a thing which generated native code, so it's not what is
normally called a "compiler" -- thus his emphasis on it being an
interpreter.

From the GNU C Compiler documentation:

"Historically, COMPILERS [my emphasis] for many languages, including C+
+ and Fortran, have been implemented as “preprocessors” which emit
another high level language such as C. None of the compilers included
in GCC are implemented this way; they all generate machine code
directly."

If you're
writing a demo or instructional compiler as were Herb and I, the first
edition best emits interpreted code, and for this sacrifice of speed
you get better debugging.

did Schildt's "translator" (to use a neutral term) emit any sort of
code?
The GNU documentation clearly implies that a COMPILER which "generates
machine code directly" is better than one that doesn't.

no it doesn't. It says many compilers emit HLL output (for instance
Cfront, the first C++ compiler, emitted C which could then be compiled
to native code using a traditional C compiler). gcc emits native code;
it doesn't say (or imply) one is "better" than the other.
But it also
uses "compiler" to refer to what Herb wrote.

I don't think it does.
Gee, you could.

point being?

I'm playing around with a Scheme (Lisp) compiler that emits C which
can then be compiled with a C compiler. I'm happy to call that a
compiler. I got the impression Schildt's translator almost directly
executed the source code.

Java and .Net do not typically interpret code: we've told you that,
dear little Peter. They translate bytecode to native code the first
time the path of logic that contains the code is executed in most
implementations, although there exist implementations which are
interpreters. Confusingly, esp. to people without proper education in
computer science, Microsoft documentation calls this step JIT
compilation, which is incorrect if we reserve the term "compiler" for
something that translates a Chomsky type 1 or type 0 source
language writable and readable by humans.

JIT compiler seems a reasonable term. I'm not convinced "compiling" is
limited to human-readable texts.

Isn't it theoretically possible for JIT compilers to be faster than
normal compilers because they actually know what environment the code
is running in? That is, they could (in theory) profile the code as it
runs.
 
N

Nick Keighley

You are completely missing the point, technical terms always mean
exactly what Nilges says they do and if we disagree it is because we are
ignorant and did not study CS at the same places he did.

Of course in the real world that the rest of us inhabit many terms have
changed their meaning over the last 50 years. Just think about how the
word 'computer' has changed its meaning. Come to that, exactly what is a
computer (and be careful, because careless definitions will include many
things that most of us would not actually consider to be computers)?

an information transforming device? I'm happy to accept much embedded
stuff to be a computer though I feel I've let in everything from a
microphone to a printing press.
 
D

Dennis (Icarus)

Francis Glassborow said:
That, of course, is the problem. Nilges' "Turing-complete device" is too
restrictive because such devices cannot actually exist (they require
unlimited resources) and more practical definitions tend to let in more
than we feel comfortable with. Is a mobile (cell) phone a computer? What
about a digital camera? A TV? Where do we draw the line?

Cell phones, digital cameras, and TVs oftentimes have processors embedded
within.
A computer allows end-user access to that processor. :)

Dennis
 
A

Aatu Koskensilta

spinoza1111 said:
What you call "errors" are in fact Schildt's bold attempt to make
sense of a mess for real people, and in a self-contradictory fashion,
nearly all of Schildt's critics call him "clear", not knowing that
"clarity" logically implies truth.

This is a peculiar doctrine. On any usual understanding it is certainly
possible to be both clear and wrong.
 
B

Ben Bacarisse

Sorry to interrupt. I am just curious why the second implementation
is significantly faster than the first implementation. My guess is
that the second implementation breaks the two factorial calculations
into two subroutines, so the memory can be released after each
calculation, which boosts the speed. Is my guess reasonable?

No. Well, maybe I should say that it is a reasonable guess, but it is
not a correct one!

The faster code does less computation. The slow code calls itself
when doing the recursive computation, but that includes another
calculation of the iterative factorial. By separating them, less work
is done overall. Much less in the long run.
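
If you want to see that concretely, you can just count the multiplications.
A minimal sketch (the counter and the slow_/fast_ names are mine, the
comparison printf is stripped out, and fast_factorial is only the recursive
part of the separated version):

        #include <stdio.h>

        static unsigned long mults;   /* counts every floating-point multiply */

        /* the embedded version: each recursive level redoes the whole
           iterative loop before recursing */
        long double slow_factorial(long double N)
        {
            long double iter, Nwork;
            if (N <= 2) return N;
            for (iter = 1, Nwork = N; Nwork > 1; Nwork--) {
                iter *= Nwork;
                mults++;
            }
            mults++;                  /* the multiply on the next line */
            return N * slow_factorial(N - 1);
        }

        /* the recursion on its own: one multiply per level */
        long double fast_factorial(long double N)
        {
            if (N <= 2) return N;
            mults++;
            return N * fast_factorial(N - 1);
        }

        int main(void)
        {
            mults = 0;
            slow_factorial(24);
            printf("embedded version:  %lu multiplies\n", mults); /* grows ~N*N/2 */

            mults = 0;
            fast_factorial(24);
            printf("separated version: %lu multiplies\n", mults); /* grows ~N */
            return 0;
        }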
 
N

Nick Keighley

This is a peculiar doctrine. On any usual understanding it is certainly
possible to be both clear and wrong.

Spinoza uses an "unusual" definition of "clear". Attempts to refute him
by quoting dictionaries result in you being told you are using an
incorrect dictionary.

Humpty-dumptyism run wild.
 
S

spinoza1111

You are completely missing the point, technical terms always mean
exactly what Nilges says they do and if we disagree it is because we are
ignorant and did not study CS at the same places he did.

Of course in the real world that the rest of us inhabit many terms have
changed their meaning over the last 50 years. Just think about how the
word 'computer' has changed its meaning. Come to that, exactly what is a
computer (and be careful, because careless definitions will include many
things that most of us would not actually consider to be computers)?

On p. 2, the Dragon book (2nd edition) says that a compiler translates
source to target without specifying that the target must be
executable machine language, and in the next paragraph it says "IF
[emphasis mine] the target program is an executable machine language
program...". At the bottom of the page it says that "Java language
processors combine compilation and interpretation", meaning that the
authors believe that they combine the "front end" of a compiler
(scanning and parsing) with an interpreter. Of course, insofar as Java
runtimes translate bytecodes to native machine codes, even the mighty
Aho et al. are having a Homeric nod, since they're wrong: but all
this means is that even high-quality texts can be found to contain
"errors" when subjected to enough deconstruction, and that Scripto Boy
(Seebach) was a fool to single out Schildt.
 
S

spinoza1111

Spinoza uses an "unusual" definition of "clear". Attempts to refute him
by quoting dictionaries result in you being told you are using an
incorrect dictionary.

Humpty-dumptyism run wild.

No, just literacy. Had I been Scripto Seebie, I would have called
Herb's "clarity" an "apparent" clarity, since to be "clear" is to have
a clear relationship with the truth. The main definition in the
Compact Oxford English Dictionary defines "clear" as "understandable",
and "understanding" as knowledge of the "truth".
 
S

spinoza1111

<snip code reproduced below>







No.  Well, maybe I should say that it is a reasonable guess, but it is
not a correct one!

The faster code does less computation.  The slow code calls itself
when doing the recursive computation, but that includes another
calculation of the iterative factorial.  By separating them, less work
is done overall.  Much less in the long run.

Which missed the point, since the point was to burn cycles in a way
that would have brought an interpreter to its knees. It didn't
faze C Sharp running under .Net, which only ran about ten percent
slower, since C Sharp isn't interpreted. Instead, bytecodes are
transformed into native code the first time they are executed by
the .Net runtime.
 
