Keep in mind that we're talking about a language that's extremely
Indeed, when the human time needed for a one-off task, such as a
one-time data conversion or ever-changing code in support of
scientific research, is a hundred times the computer time, and the
computer time is "dirt cheap" ($20/month for computer time compared
to $50/hr for professional software-engineer time), then 99.9% of
the cost of the project is human professional labor while only 0.1%
is computer time, and shaving the computer time by a factor of two,
or even ten, is an utter waste of effort.
Only software that is stable for a long time and used billions of
times between upgrades, where the CPU cost is a significant
fraction of the total system cost, would benefit from bumming code
at the expense of extra human labor to make it that factor of two
faster. I'm thinking MS-Windows may be the only software in the
world that could truly benefit from spending lots of human time to
make it run faster, because of its extremely large user base,
except that MS-Windows typically sits idle 99% of the time, running
in short bursts to refresh the screen whenever the user moves
something, so perhaps even MS-Windows doesn't qualify as being
worth the human-labor cost of tweaking it a little bit faster.
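As a rough sanity check of the cost split claimed above (round
illustrative numbers only; the exact split depends on the task):

```python
# Rough cost split for a one-off task, using round illustrative numbers.
human_cost = 50.0 * 10          # 10 hours of labor at $50/hr
computer_cost = 0.50            # a small slice of $20/month "dirt cheap" hosting

total = human_cost + computer_cost
print(f"human share:    {human_cost / total:.1%}")    # ~99.9%
print(f"computer share: {computer_cost / total:.1%}") # ~0.1%
# Even a 10x speedup recovers less than half a dollar:
print(f"saved by 10x speedup: ${computer_cost * 0.9:.2f}")
```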
From: Bart <[email protected]>
> ... I'm working on a dynamic language now which at present is
> some 3 to 10 times as slow as optimised C (although there's some
> way to go yet...). ... But that's measured for tight integer code.
> When you throw in some string processing, higher level datatypes,
> and calls into the runtime, then they can be comparable, say
> between 1 and 2 times as slow, for a language considerably more
> expressive (ie. increase in apparent runtime of 0 to 100%). In
> theory...
If it's expressive enough to cut human time by a factor of two in
designing new algorithms and getting them working, then I'd say the
CPU time lost (compared to C) is not worth crying over.
> And ultimately, for programs with a short runtime, it really
> doesn't matter if it takes 100ms or 200ms.
Actually in some cases it *does* matter. A case in point: I'm
currently building TinyURL.Com/NewEco. TinyURL.Com/Portl1 links to
the portal into what I have so far, running in PHP/MySQL on a free
hosting service. From the start I planned to charge users for how
long their scripts take to run, paid for by their labor
contributing to the project. From my experience with CGI/CMUCL, I
expected typical scripts to take a second or two. Well, I installed
clock-checks at the start and end of each script, subtracting to
get the script's run time, and typical times are less than one
millisecond for short scripts and up to maybe 20 milliseconds for
longer scripts. So this means that if one of my users performs a
mere ten seconds of labor to earn ten seconds of credit, that's
good for hundreds of script-runs, maybe thousands. But if something
suddenly took an extra 100ms, that would be **noticed**, for sure.
And if I eventually have millions of simultaneous users, each
executing one script every ten seconds, that's hundreds of
thousands of script-runs per second; even the small number of
milliseconds each script-run now takes might be too slow, and I
might need to distribute the load onto multiple servers.
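The start+end clock-check described above can be sketched like this
(a minimal sketch in Python rather than the PHP the site actually
runs; `run_script` is a hypothetical stand-in for the real page
logic):

```python
import time

def run_script(work_items: int) -> int:
    """Hypothetical stand-in for the real script body."""
    return sum(i * i for i in range(work_items))

# Clock-check at start, clock-check at end, subtract for the run time.
start = time.perf_counter()
result = run_script(10_000)
elapsed_ms = (time.perf_counter() - start) * 1000.0

print(f"script took {elapsed_ms:.3f} ms")  # typically a few ms or less
```

The measured `elapsed_ms` is exactly the quantity to charge against
a user's earned credit.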
There's a difference between a program that takes 200ms to run to
completion on a single-user system, with most of the CPU time spent
waiting for the user to issue the next command, and a CGI or PHP
server application that has millions of simultaneously active
users, where even a few milliseconds per user's script-run puts a
heavy load on the server/CPU. Deeper inside a server-side
application, if different components of the application are using
SOAP to communicate with each other, there may be millions of SOAP
transactions per second, and a few extra milliseconds each may be a
killer.
I think it's time to go into Monty Python mode: no, not "SPAM",
instead "profile". If your application is too slow, find out what
exactly is making it slow; don't guess, run a profiler or something
equivalent. It may be that the user interface is too hard for users
to grasp, so they take longer to react. Or it may be some part of
the computer code that is too slow and is run so very often that it
becomes critical. You can't just guess from principles, although
principles can guide you in what to measure first in the hope of
finding the answer quickly.
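For instance, in Python the standard-library profiler shows exactly
where the time goes (a minimal sketch; the two functions are made
up for illustration):

```python
import cProfile
import io
import pstats

def slow_part():
    # Deliberately wasteful: quadratic string concatenation in a loop.
    s = ""
    for i in range(20_000):
        s += str(i)
    return s

def fast_part():
    # The idiomatic alternative.
    return "".join(str(i) for i in range(20_000))

profiler = cProfile.Profile()
profiler.enable()
slow_part()
fast_part()
profiler.disable()

# Print the top functions by cumulative time; the culprit tops the list.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

Measure first, then optimize only what the profile says is hot.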
Going deeper into that side topic, for anyone who is curious about
NewEco: after finishing the rescue of my 59 megabytes of GeoCities
files last weekend, I've spent the past several days making tweaks
to the core accounting parts of NewEco/Portl1, namely:
- Create account;
- Login;
- Spend 4-10 seconds filling in the missing word in a randomly
  chosen sentence/phrase to prove you're not a spambot, thereby
  earning 4-10 seconds of CPU time, i.e. 4000-10000 milliseconds of
  CPU time, i.e. hundreds to thousands of script-runs;
- Logout;
and as of earlier tonight I have it in good enough shape that I'm
now seriously planning the next core feature, namely surveys. I'm
too tired to start actual coding tonight, so that's why I'm
responding to newsgroup articles instead. The basic way these
surveys will work is that any user can *invest* any portion of
his/her current account in any particular item in any particular
survey. This is not a payment, it's an *investment*: like a bank
account it can be withdrawn at any time, but like an investment in
a business it accrues productive value during the whole time it's
deposited, before being withdrawn.
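The earn/spend/invest accounting described above (labor earns
milliseconds of CPU credit, each script-run spends its measured
cost, and survey stakes are withdrawable at any time) might look
something like this; all class and method names are hypothetical:

```python
class Account:
    """Hypothetical sketch of NewEco-style CPU-time accounting, in ms."""

    def __init__(self):
        self.balance_ms = 0.0   # spendable CPU-time credit
        self.invested_ms = {}   # survey item -> amount staked

    def earn_labor(self, seconds: float):
        # e.g. 4-10 seconds filling in a missing word earns 4000-10000 ms.
        self.balance_ms += seconds * 1000.0

    def charge_script_run(self, elapsed_ms: float):
        if elapsed_ms > self.balance_ms:
            raise ValueError("insufficient credit")
        self.balance_ms -= elapsed_ms

    def invest(self, item: str, amount_ms: float):
        # An investment, not a payment: it can be withdrawn at any time.
        if amount_ms > self.balance_ms:
            raise ValueError("insufficient credit")
        self.balance_ms -= amount_ms
        self.invested_ms[item] = self.invested_ms.get(item, 0.0) + amount_ms

    def withdraw(self, item: str):
        self.balance_ms += self.invested_ms.pop(item, 0.0)

acct = Account()
acct.earn_labor(10)            # ten seconds of labor
acct.charge_script_run(1.0)    # a typical 1 ms script-run
acct.invest("TruFut", 5000.0)  # stake half the credit on a survey item
acct.withdraw("TruFut")        # withdrawable at any time
print(acct.balance_ms)         # 9999.0 ms left, good for thousands of runs
```

(Accrual of productive value while a stake is deposited is omitted
here, since the mechanism isn't specified yet.)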
So what surveys? I'll start off with two:
* What survey would you most like to invest in?
- Survey of what survey you'd like to invest in (i.e. this survey itself);
- Survey of what NewEco features you'd like me to work on next;
- Fill in blank: ___________________________________________________
* What NewEco feature would you prefer that I implement next?
- FilJob = Filtering job ads to eliminate the ones you don't qualify for;
- TruFut = Truth-futures market, estimating truth of claims;
- Contract work: Post RequestForBids, lowest bidder does work and gets paid;
- PAlert = Priority-alert notification system;
- Fill in blank: ___________________________________________________
(Suggestion: Look at TinyURL.Com/NewEco and
http://www.rawbw.com/~rem/WAP/projectIdeas.html
for ideas of online services I'm eager to implement if you show interest.)
My current idea for the design is 3 tables:
- List of surveys
- List of items in surveys
- List of investments by users in items in surveys
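A minimal sketch of those three tables, using Python's built-in
sqlite3 here rather than the MySQL the site actually uses; all
table and column names are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- List of surveys
    CREATE TABLE surveys (
        survey_id INTEGER PRIMARY KEY,
        title     TEXT NOT NULL
    );
    -- List of items in surveys
    CREATE TABLE survey_items (
        item_id   INTEGER PRIMARY KEY,
        survey_id INTEGER NOT NULL REFERENCES surveys(survey_id),
        label     TEXT NOT NULL
    );
    -- List of investments by users in items in surveys
    CREATE TABLE investments (
        user_id   INTEGER NOT NULL,
        item_id   INTEGER NOT NULL REFERENCES survey_items(item_id),
        amount_ms REAL NOT NULL,  -- stake in milliseconds of CPU credit
        PRIMARY KEY (user_id, item_id)
    );
""")

conn.execute("INSERT INTO surveys VALUES (1, 'What NewEco feature next?')")
conn.execute("INSERT INTO survey_items VALUES (1, 1, 'TruFut')")
conn.execute("INSERT INTO investments VALUES (42, 1, 5000.0)")

total = conn.execute(
    "SELECT SUM(amount_ms) FROM investments WHERE item_id = 1"
).fetchone()[0]
print(total)  # 5000.0
```

Summing `amount_ms` per item then gives each survey item's current
standing in the survey.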