Progress in data processing


jacob navia

OK, I am running Vista.

My old machine died with a disk controller failure and I had to buy
a new one. The new one was cheaper than the old one (620 Euros vs.
1100 Euros) but had twice as much RAM (2GB), twice as much disk
space (500GB) and twice the processor (a dual-core, 64-bit AMD).

Within Vista, I installed a Virtual PC running Windows XP,
to remember the old days.

And then, I compiled the source code of lcc-win32 using the
lcc-win32 compiler.

Vista: 3.5 seconds
Windows XP (running in the virtual PC under Vista): 4.4 seconds...

Can you imagine?
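
For what it is worth, the measurement itself is nothing fancy. A tiny
harness like the following is enough (a hypothetical sketch, not the
script I actually used; the build command line is only an example):

/* Hypothetical timing harness: run a build command with system() and
   report the elapsed wall-clock time using the Win32 GetTickCount call. */
#include <stdio.h>
#include <stdlib.h>
#include <windows.h>

int main(void)
{
    DWORD start = GetTickCount();
    int rc = system("lcc -c lcc.c");   /* example command, not the real build line */
    DWORD elapsed = GetTickCount() - start;

    printf("exit code %d, elapsed %lu.%03lu s\n", rc,
           (unsigned long)(elapsed / 1000),
           (unsigned long)(elapsed % 1000));
    return 0;
}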

I wonder whether, if I set up a Windows 98 virtual machine, things
would actually run faster than they do natively under Vista, even
though they would be running inside a virtual PC!!!

Everything is slower or at best the same speed. I start
Microsoft C and it takes forever, just as it did under
XP, and far longer than it ever did under MSDOS.

Then, surfing the web, I found this (via a Slashdot pointer):
http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins

They measured the time it takes to do common tasks on a 1986 Mac and on
a Vista/AMD dual-core machine: working in an Excel spreadsheet, using
Word, booting the system, etc.

< QUOTE >
Check out the results! For the functions that people use most often, the
1986 vintage Mac Plus beats the 2007 AMD Athlon 64 X2 4800+: 9 tests to
8! Out of the 17 tests, the antique Mac won 53% of the time! Including a
jaw-dropping 52 second whipping of the AMD from the time the Power
button is pushed to the time the Desktop is up and usable.
< END QUOTE >

Yes, we wait longer for results today than we did in 1986. The huge
benefits that this hardware speed could bring are completely
destroyed by the bloated software, written in bloated languages, that we
run today.

Why do I still use C?

Precisely because of that: because the language still goes against the
trend.

Simple software, simple languages are now a thing of the past.
Instead of progress we have regression. We have to run ever
faster just to stay in the same place.

I am not implying that C is perfect or that I do not see the
huge gaps in the language. What I am pointing out is that the
need for a simple and fast language is not reflected in the present
trends of software development.

Actually, this could be very good news for C. Obviously there are
applications out there that could do better in terms of speed. :)

But the problem with C is that it is seen as obsolete. Most people
at the company where I did my last consulting job used C++
and would laugh at anyone who dared question their
templated bloat.

Who cares about speed, they said. Who cares about disk space or
memory consumption.

Ram is cheap, disk is cheap. BLOAT IT!!!!!!

A disk costs the same whether it is spinning with 50GB or with
350GB inside. FILL IT!

What now?

There is a much simpler solution to templates. It is called
aspect oriented programming.
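
To give a rough idea of what I mean, here is a hypothetical sketch in
plain C (the TRACED macro and the account() function are invented for
illustration; this is not the mechanism I will present): a cross-cutting
concern such as tracing is wrapped around existing calls with one macro,
without templates and without touching the function itself.

/* A crude "aspect-style" wrapper in plain C: the cross-cutting concern
   (tracing) is added around an existing call with a single macro. */
#include <stdio.h>
#include <time.h>

static double account(int id, double amount)  /* ordinary business code */
{
    (void)id;                     /* unused in this toy example */
    return amount * 1.196;        /* add VAT, say */
}

static clock_t trace_start;       /* not reentrant; this is only a sketch */
static double  trace_result;

/* The "aspect": log entry and exit, and a rough elapsed time. */
#define TRACED(call)                                                 \
    ( printf("enter: %s\n", #call),                                  \
      trace_start = clock(),                                         \
      trace_result = (call),                                         \
      printf("leave: %s (%.0f ms)\n", #call,                         \
             1000.0 * (clock() - trace_start) / CLOCKS_PER_SEC),     \
      trace_result )

int main(void)
{
    double total = TRACED(account(42, 100.0));
    printf("total = %.2f\n", total);
    return 0;
}

A real aspect system would weave this in for you instead of making you
change the call site, but the point is how little machinery the idea
needs.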

That is the subject of the next installment. The objective of this
one is to point out that keeping things simple can be an
objective *per se*. And to keep them simple and fast, a
language without excessive bloat is needed.

C (with some improvements) fits this description.

jacob
 

Karl Heinze

On Thu, 31 May 2007 21:28:29 +0200, jacob navia

Hi Jacob,

I completely agree with (almost) all of what you've said.

K. H.
 

Ian Collins

jacob said:
> Yes, we wait longer for results today than we did in 1986. The huge
> benefits that this hardware speed could bring are completely
> destroyed by the bloated software, written in bloated languages, that we
> run today.

Not those of us who choose operating systems that get faster and lighter
with each new release...

> But the problem with C is that it is seen as obsolete. Most people
> at the company where I did my last consulting job used C++
> and would laugh at anyone who dared question their
> templated bloat.
>
> Who cares about speed, they said. Who cares about disk space or
> memory consumption.
>
> Ram is cheap, disk is cheap. BLOAT IT!!!!!!

Sounds like a bunch of piss-poor C++ programmers. Piss-poor programmers
work in all languages.
 

Jeremy Thomson

I have to smile.
I started working in D.P. (that's Data Processing to you
whippersnappers) in 1985.
The computer was an old NCR Century 100; it was 18 years old and I was
17.
This machine was one of the first to use MOS memory, 256K in all,
though it still had about 4K of boot core memory.
Every job was started with punched cards; we had a 'console', which was
a typewriter keyboard and thermal printer.
Disks were removable disk packs; they had been 'unstrapped' to
increase the storage to 200MB. There were three of them, and the drives
were the size of washing machines.
Backup was to 1/2-inch mag tape at a maximum density of 3200 BPI (bits
per inch), though we ran them at 800 BPI for safety.
Our biggest client was a chain of menswear stores; we loaded each day's
transactions from their POS registers from cassette tape, about 35 of
them.
In that small system we ran Debtors, Creditors, General Ledger and
Sales Analysis; some clients had stock control.
Though the memory was 256K, it was 'partitioned' into 4 so you could
run 4 jobs at once.
Most of the jobs would run in 48K; the OS used up 96K.
It didn't have virtual memory. You could page in code under program
control, but it would have been too slow to do that per transaction, so a
debtors end-of-month roll really did run in 48K.
The systems were programmed in Cobol; the programs were patched with
patch cards to change clients' names and addresses, by poking the strings
into memory locations once the program was loaded. There were only a
few utilities to copy, back up or sort files.
We had a 32K partition where we could run the editor; it drove a dumb
terminal that ran an edlin-style line editor.

Makes me wonder what computers will be like in another 40 years.

Jeremy Thomson
 

Cesar Rabak

jacob navia wrote:
> OK, I am running Vista.
[snipped]
> Then, surfing the web, I found this (via a Slashdot pointer):
> http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins
>
> They measured the time it takes to do common tasks on a 1986 Mac and on
> a Vista/AMD dual-core machine: working in an Excel spreadsheet, using
> Word, booting the system, etc.

The comparison is not exactly fair, because the versions, and with them
the _functionality_, of the 'same' programs are too different.

If you could get your hands on versions of Excel and Word from the same
period (probably 2.0) for an Intel machine, you would probably arrive at
similar results.

You'll need to find a means of running Windows 3.1 on your new box :)

Regards,
 

Old Wolf

Ian Collins said:
> Not those of us who choose operating systems that get faster and
> lighter with each new release...

Does such a thing exist? Linux used to comfortably
fit on a CD; now it takes seven or more.
 

Ian Collins

Old Wolf said:
> Does such a thing exist? Linux used to comfortably
> fit on a CD; now it takes seven or more.

That'll be all the bundled extras. If Linux is like Solaris, the core OS
gets faster with each release and still runs well on old hardware.
 

Clark Cox

Old Wolf said:
> Does such a thing exist? Linux used to comfortably
> fit on a CD; now it takes seven or more.

It still does fit on a CD. I've got a couple of distros in my CD-wallet.
 

Guest

jacob navia wrote:
> OK, I am running Vista.
[snipped]

Cesar Rabak wrote:
> The comparison is not exactly fair, because the versions, and with them
> the _functionality_, of the 'same' programs are too different.
>
> If you could get your hands on versions of Excel and Word from the same
> period (probably 2.0) for an Intel machine, you would probably arrive at
> similar results.
>
> You'll need to find a means of running Windows 3.1 on your new box :)

Word 2.0 seems to run just fine on Win98. (I just tried it).
I don't know about any later OS.
 

Richard Heathfield

Old Wolf said:
> Does such a thing exist? Linux used to comfortably
> fit on a CD; now it takes seven or more.

DSL fits in 50MB, by design. A month or so ago, my son installed it onto
a ten-year-old laptop, and he's delighted with the performance gain.
Mind you, it /was/ running Win98, so I suppose that's not surprising.
 

Martin Neitzel

JN> Yes, we wait longer for results today than we did in 1986. The huge
JN> benefits that this hardware speed could bring are completely
JN> destroyed by the bloated software, written in bloated languages, that we
JN> run today.

True. But don't forget that you can also run __old__ software on
modern hardware. For example, Borland's Turbo C 2.0 compiles and
links so blazingly fast even on a 133MHz Pentium 1 that you think
you are sitting in front of an interpreter.

Thanks for keeping lcc-win32 non-bloated and zippy!

Martin
 

Al Balmer

Richard Heathfield said:
> DSL fits in 50MB, by design. A month or so ago, my son installed it onto
> a ten-year-old laptop, and he's delighted with the performance gain.
> Mind you, it /was/ running Win98, so I suppose that's not surprising.

DSL? All I can think of is Digital Subscriber Line, but you must mean
something else.
 

Johan Bengtsson

jacob said:
> Vista: 3.5 seconds
> Windows XP (running in the virtual PC under Vista): 4.4 seconds...
>
> Can you imagine?

Actually I can: anything running in a virtual machine of any kind should
be somewhat slower than it is in the main OS. Perhaps a little more
difference than I would have expected, but not surprising. Very
processor-intensive tasks take almost the same amount of time, but as
soon as file system or screen I/O is involved there should be *some*
difference.
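
A crude way to see this is to time a CPU-bound loop and a file-writing
loop in both environments and compare. A rough sketch (plain C with the
Win32 GetTickCount call; the file name and loop counts are picked
arbitrarily) could look like this:

/* Compare a CPU-bound task with a file-I/O-bound task; run it once on
   the host OS and once inside the virtual machine. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    DWORD t0, t1;
    volatile double x = 0.0;
    static char buf[64 * 1024];   /* zero-filled 64KB write buffer */
    FILE *f;
    int i;

    /* CPU-bound: should take about the same time in and out of the VM. */
    t0 = GetTickCount();
    for (i = 0; i < 50000000; i++)
        x += (double)i * 0.5;
    t1 = GetTickCount();
    printf("cpu-bound: %lu ms\n", (unsigned long)(t1 - t0));

    /* I/O-bound: this is where the virtualization overhead tends to show. */
    t0 = GetTickCount();
    f = fopen("vm_io_test.tmp", "wb");
    if (f != NULL) {
        for (i = 0; i < 500; i++)           /* about 32MB in total */
            fwrite(buf, 1, sizeof buf, f);
        fclose(f);
    }
    t1 = GetTickCount();
    printf("i/o-bound: %lu ms\n", (unsigned long)(t1 - t0));

    remove("vm_io_test.tmp");
    return 0;
}

The CPU figure should come out nearly identical on the host and in the
virtual machine, while the file figure is where I would expect the gap.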

Could you try installing vista in a virtual machine and compare?

Or did you swap the numbers? Or did I misunderstand you?

> I wonder whether, if I set up a Windows 98 virtual machine, things would
> actually run faster than they do natively under Vista, even though they
> would be running inside a virtual PC!!!

I doubt it, but it is possible

[snip]

For the rest I absolutely agree with you...
 

jacob navia

CBFalconer said:
> But you can't (sometimes) run new software on old hardware.
> lcc-win32 is an example, which creates some form of trap the moment
> you try the debugger. The docs specify it runs under W98, but it
> doesn't.

IT WILL NOT RUN ON A 486, CHUCK!!!!!

I have told you this a thousand times.

It needs a Pentium 1 or higher.
 

Barry

CBFalconer said:
> Sounds much like an early version of the HP3000 circa 1973 (I was
> about 42), but with a much poorer OS. That could function with
> 128kB (64 kW) of memory. We mounted a C on it about 10 years
> later, when the memory was up to some number of megs, and the
> machine was much smaller.

Well that was a few years before my time, but you probably
remember the INP (it hung off the GIC). Early versions had
a multitasking OS in 16K. Of course it didn't have a file system.
 

Harald van Dĳk

CBFalconer said:
> I know that. However W98 runs on a 486, and you claim it runs
> under W98. Which means your documentation is non-trustworthy.
> I've said this before.

Quoting from <http://www.cs.virginia.edu/~lcc-win32/>:
"Minimum requirements:
Windows 95 or later for the command line tools, Windows 2000 or later for
the IDE. All later operating systems (XP/2003/NT) are fully supported."

Does lcc-win32 use a command line debugger? It's been a long while since I
gave it a try, but as I recall it had a graphical debugger.
 

jacob navia

Harald said:
> Quoting from <http://www.cs.virginia.edu/~lcc-win32/>:
> "Minimum requirements:
> Windows 95 or later for the command line tools, Windows 2000 or later for
> the IDE. All later operating systems (XP/2003/NT) are fully supported."
>
> Does lcc-win32 use a command line debugger? It's been a long while since I
> gave it a try, but as I recall it had a graphical debugger.

No, there is no command line debugger. I will change the requirements
to Win98 + Pentium 1.

jacob
 

stdazi

jacob navia wrote:
> OK, I am running Vista.
[snipped]
> Why do I still use C?
>
> Precisely because of that: because the language still goes against the
> trend.
[snipped]

Hehe. It's funny how people draw conclusions like "Linux is FASTER
than Windows", "C is faster than C++", etc...

To me, it's all a matter of programming, and I don't think that C's
great value resides in its speed. (Is it even defined by the
standard? ;-) )
 
