Trickle?
Ok... only it's multiplied by a billion:
http://en.wikipedia.org/wiki/Transistor_count
A typical desktop computer uses less than 500 watts for *everything*
except the screen: hard drives, DVD burner, keyboard, mouse, USB devices,
network card, sound card, graphics card, etc. (Actually, 350W is more
typical.)
Moore's Law observes that processing power has doubled about every two
years. Over the last decade, processing power has increased by a factor
of 32. If *efficiency* had increased at the same rate, that 500W power
supply in your PC would now be a 15W power supply. Your mobile phone
would last a month between recharges, not a day. Your laptop could use a
battery half the size and still last two weeks on a full charge.
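To make that arithmetic concrete, here's a rough back-of-the-envelope
sketch in Python (purely illustrative, reusing the 500W and one-day
figures above):

    # Rough, illustrative numbers only.
    def growth_factor(span_years, doubling_period_years):
        # Growth from doubling every `doubling_period_years` years
        # sustained over `span_years` years.
        return 2 ** (span_years / doubling_period_years)

    factor = growth_factor(10, 2)   # Moore's-Law pace: 2**5 = 32
    print(500 / factor)             # ~15.6 W - the hypothetical power supply
    print(1 * factor)               # ~32 days between phone charges, up from 1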
In practice, hard drives are not likely to get more efficient, since you
have to spin up a lump of metal. (Solid state drives tend to be either
slow and unreliable, or blindingly fast and even more unreliable. Let me
know how they are in another ten years.) Network cards etc. are
relatively low-power. It's only the CPU and some of the bigger graphics
cards that really eat electrons. Moore's Law for power efficiency is
probably asking too much, but is it too much to ask that CPUs should
double their efficiency every five years? I don't think so.
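For comparison, here's what that more modest target works out to over the
same decade (same illustrative sketch, same assumed 500W baseline):

    # Doubling every 5 years gives only 2**2 = 4x over a decade.
    modest = 2 ** (10 / 5)
    print(500 / modest)             # 125 W - far from 15 W, but still a big cut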
If you are arguing that computers should not use millions/billions of
transistors, I won't argue, since I don't know the technology.
No. I'm arguing that they shouldn't convert 90% of their energy input
into heat.
Only pointing out that a billion is a large number in pragmatic terms - so
is a million, for that matter.
- Actually, not so sure even on that count.
[Never counted beyond a hundred!]
Not really. A single grain of salt contains billions of billions of
atoms. A billion transistors is still a drop in the ocean. Wait until we
get the equivalent of an iPhone's processing power in a speck of dust
that can float in the air.
http://www.technovelgy.com/ct/content.asp?Bnum=245