OT: This Swift thing


Chris Angelico

One would think that in 2014, a device called a "thermostat" would shut
down the power before expensive equipment goes up in a ball of smoke.

That exchange actually happened back in 2005 (wow! ages ago now), but
same difference. However, I think there are very few thermostats that
can cut the power quickly enough for an overclocked chip that loses
its heat sink. MAYBE if the heat sink is still on and the fan isn't,
but not if the heat sink falls off. "Under two seconds" might become
"the blink of an eye".

ChrisA
 

Sturla Molden

Chris Angelico said:
That exchange actually happened back in 2005 (wow! ages ago now), but
same difference. However, I think there are very few thermostats that
can cut the power quickly enough for an overclocked chip that loses
its heat sink. MAYBE if the heat sink is still on and the fan isn't,
but not if the heat sink falls off. "Under two seconds" might become
"the blink of an eye".

If the heat sink falls off, yes, that is really bad news... But if the fan
fails, the warm-up shouldn't be that rapid. I thought we were talking about
fan failure, not a detached heat sink.

Sturla
 

Steven D'Aprano

That exchange actually happened back in 2005 (wow! ages ago now), but
same difference. However, I think there are very few thermostats that
can cut the power quickly enough for an overclocked chip that loses its
heat sink. MAYBE if the heat sink is still on and the fan isn't, but not
if the heat sink falls off. "Under two seconds" might become "the blink
of an eye".

The fact that CPUs need anything more than a passive heat sink is
*exactly* the problem. A car engine has to move anything up to a tonne of
steel around at 100kph or more, and depending on the design, they can get
away with air-cooling. In comparison, a CPU just moves around a trickle
of electric current.

(No currently designed car with an internal combustion engine uses air-
cooling. The last mass market car that used it, the Citroën GS, ceased
production in 1986. The Porsche 911 ceased production in 1998, making it,
I think, the last air-cooled vehicle apart from custom machines. With the
rise of all-electric vehicles, perhaps we will see a return to air-
cooling?)

CPU technology is the triumph of brute force over finesse.
 

Rustom Mody

That exchange actually happened back in 2005 (wow! ages ago now), but
same difference. However, I think there are very few thermostats that
can cut the power quickly enough for an overclocked chip that loses its
heat sink. MAYBE if the heat sink is still on and the fan isn't, but not
if the heat sink falls off. "Under two seconds" might become "the blink
of an eye".
The fact that CPUs need anything more than a passive heat sink is
*exactly* the problem. A car engine has to move anything up to a tonne of
steel around at 100kph or more, and depending on the design, they can get
away with air-cooling. In comparison, a CPU just moves around a trickle
of electric current.

Trickle?
OK... only it's multiplied by a billion:
http://en.wikipedia.org/wiki/Transistor_count
(No currently designed car with an internal combustion engine uses air-
cooling. The last mass market car that used it, the Citroën GS, ceased
production in 1986. The Porsche 911 ceased production in 1998, making it,
I think, the last air-cooled vehicle apart from custom machines. With the
rise of all-electric vehicles, perhaps we will see a return to air-
cooling?)
CPU technology is the triumph of brute force over finesse.

If you are arguing that computers should not use millions/billions of
transistors, I won't argue, since I don't know the technology.

Only pointing out that billion is a large number in pragmatic terms
- So is million for that matter
- Actually not so sure even on that count
[Never counted beyond hundred!]
 

Rustom Mody

Sturla Molden wrote:
If the heat sink falls off, yes, that is really bad news... But if the fan
fails, the warm-up shouldn't be that rapid. I thought we were talking about
fan failure, not a detached heat sink.

Don't know about 'fall off'.
However, one day I tried to 'clean' my 'dirty' computer
- which included removing the CPU fan, dusting it and fitting it back
- didn't know about thermal paste

The machine shut down in a minute (if I remember right)
with a message about overheating.

When the (new!) thermal paste was applied, it started again.
I vaguely remember that the BIOS remembered the untoward event and some
resetting was required, though I don't remember what.
 

Rustom Mody

The fact that CPUs need anything more than a passive heat sink is
*exactly* the problem. A car engine has to move anything up to a tonne of
steel around at 100kph or more, and depending on the design, they can get
away with air-cooling. In comparison, a CPU just moves around a trickle
of electric current.
(No currently designed car with an internal combustion engine uses air-
cooling. The last mass market car that used it, the Citroën GS, ceased
production in 1986. The Porsche 911 ceased production in 1998, making it,
I think, the last air-cooled vehicle apart from custom machines. With the
rise of all-electric vehicles, perhaps we will see a return to air-
cooling?)
CPU technology is the triumph of brute force over finesse.

BTW people are going this way:
http://www.silentpcreview.com/
http://www.endpcnoise.com/
 

Steven D'Aprano

Trickle?
OK... only it's multiplied by a billion:
http://en.wikipedia.org/wiki/Transistor_count

A typical desktop computer uses less than 500 watts for *everything*
except the screen. Hard drives. DVD burner. Keyboard, mouse, USB devices,
network card, sound card, graphics card, etc. (Actually, 350W is more
typical.)

Moore's Law observes that processing power has doubled about every two
years. Over the last decade, processing power has increased by a factor
of 32. If *efficiency* had increased at the same rate, that 500W power
supply in your PC would now be a 15W power supply. Your mobile phone
would last a month between recharges, not a day. Your laptop could use a
battery half the size and still last two weeks on a full charge.
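
(A quick check of that arithmetic, as a Python sketch -- the two-year
doubling period and the 500W figure are the ones from the paragraph
above:

    >>> 2 ** (10 // 2)   # doubling every 2 years, over 10 years
    32
    >>> 500.0 / 32       # a 500W supply shrunk by the same factor, in watts
    15.625

So the "15W power supply" is just 500W divided by that factor of 32.)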

In practice, hard drives are not likely to get more efficient, since you
have to spin up a lump of metal. (Solid state drives tend to be either
slow and unreliable, or blindingly fast and even more unreliable. Let me
know how they are in another ten years.) Network cards etc. are
relatively low-power. It's only the CPU and some of the bigger graphics
cards that really eat electrons. Moore's Law for power efficiency is
probably asking too much, but is it too much to ask that CPUs should
double their efficiency every five years? I don't think so.

If you are arguing that computers should not use millions/billions of
transistors, I won't argue, since I don't know the technology.

No. I'm arguing that they shouldn't convert 90% of their energy input
into heat.

Only pointing out that billion is a large number in pragmatic terms - So
is million for that matter
- Actually not so sure even on that count
[Never counted beyond hundred!]

Not really. A single grain of salt contains billions of billions of
atoms. A billion transistors is still a drop in the ocean. Wait until we
get the equivalent of an iPhone's processing power in a speck of dust
that can float in the air.

http://www.technovelgy.com/ct/content.asp?Bnum=245
 

Rustom Mody

No. I'm arguing that they shouldn't convert 90% of their energy input
into heat.

Strange statement.
What should they convert it into then?

JFTR: Information processing and (physics) energy are about as convertible
as say: "Is a kilogram smaller/greater than a mile?"
 

Steven D'Aprano

Strange statement.
What should they convert it into then?

Useful work, duh.

Everything *eventually* gets converted to heat, but not immediately.
There's a big difference between a car that gets 100 miles to the gallon,
and one that gets 1 mile to the gallon. Likewise CPUs should get more
"processing units" (however you measure them) per watt of electricity
consumed.

See, for example:

http://www.tomshardware.com/reviews/fx-power-consumption-efficiency,3060.html

http://en.wikipedia.org/wiki/Performance_per_watt

Quote:

Theoretically, room-temperature computer memory operating
at the Landauer limit could be changed at a rate of one
billion bits per second with only 2.85 trillionths of a
watt of power being expended in the memory media. Modern
computers use millions of times as much energy.

http://en.wikipedia.org/wiki/Landauer's_principle
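
(That 2.85 figure is easy to verify in a couple of lines of Python -- a
sketch, assuming "room temperature" means T = 298 K:

    >>> from math import log
    >>> k, T = 1.38e-23, 298      # Boltzmann constant (J/K), room temp (K)
    >>> per_bit = k * T * log(2)  # Landauer limit per bit flip, in joules
    >>> print("%.3g" % (per_bit * 1e9))  # watts at a billion bits/second
    2.85e-12

i.e. 2.85 trillionths of a watt, matching the quote.)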


Much to my surprise, Wikipedia says that efficiency gains have actually
been *faster* than Moore's Law. This surprises me, but it makes sense: if
a CPU uses ten times more power to perform one hundred times more
computations, it has become much more efficient but still needs a much
bigger heat sink.

http://en.wikipedia.org/wiki/Koomey's_law

JFTR: Information processing and (physics) energy are about as
convertible as say: "Is a kilogram smaller/greater than a mile?"

(1) I'm not comparing incompatible units. And (2) there is a fundamental
link between energy and entropy, and entropy is the reverse of
information. See Landauer's Principle, linked above. So information
processing and energy are as intimately linked as (say) current and
voltage, or mass and energy, or momentum and position.
 

Rustom Mody


Useful work, duh.
Everything *eventually* gets converted to heat, but not immediately.
There's a big difference between a car that gets 100 miles to the gallon,
and one that gets 1 mile to the gallon. Likewise CPUs should get more
"processing units" (however you measure them) per watt of electricity
consumed.
See, for example:

http://www.tomshardware.com/reviews/fx-power-consumption-efficiency,3060.html
http://en.wikipedia.org/wiki/Performance_per_watt

Hey thanks for that!
Always thought something like this should exist but did not know what/where/how!

Theoretically, room-temperature computer memory operating
at the Landauer limit could be changed at a rate of one
billion bits per second with only 2.85 trillionths of a
watt of power being expended in the memory media. Modern
computers use millions of times as much energy.

Right, so we are still very much in the theoretical zone.
As the next paragraph there says:

| If no information is erased, computation may in principle be achieved
| which is thermodynamically reversible, and require no release of
| heat. This has led to considerable interest in the study of reversible
| computing.

Particularly interesting, as no-information-erasure corresponds to functional
(or maybe relational) programming. Of course, it's still all theoretical.
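
(A toy illustration of reversibility -- a sketch, not from the thread's
links: the Toffoli gate, a universal reversible logic gate, is its own
inverse, so applying it twice recovers the input and no information is
erased:

    def toffoli(a, b, c):
        # CCNOT: flip c iff both control bits are set; a and b pass through
        return a, b, c ^ (a & b)

    >>> bits = (1, 1, 0)
    >>> toffoli(*toffoli(*bits)) == bits
    True

An ordinary AND gate, by contrast, maps two input bits down to one output
bit, so it necessarily erases information.)
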
Much to my surprise, Wikipedia says that efficiency gains have actually
been *faster* than Moore's Law. This surprises me, but it makes sense: if
a CPU uses ten times more power to perform one hundred times more
computations, it has become much more efficient but still needs a much
bigger heat sink.

That was essentially my point
 

Roy Smith

Steven D'Aprano said:
Moore's Law observes that processing power has doubled about every two
years. Over the last decade, processing power has increased by a factor
of 32. If *efficiency* had increased at the same rate, that 500W power
supply in your PC would now be a 15W power supply.

I think you're using a strange definition of efficiency. I would define
it as processing_power_out / electric_power_in. If processing power has
gone up by a factor of 32, and electric power used has stayed more or
less the same (which it has), then efficiency has gone up.
Your mobile phone would last a month between recharges, not a day.
Your laptop could use a battery half the size and still last two
weeks on a full charge.

One of the real industrial problems facing today's society is storage of
electrical energy in batteries. The lead-acid batteries in our cars are
not terribly different from the ones in our grandparents' cars (or even
our great-grandparents', if they had cars). The storage capacity has
gone up a little, mostly because the plastic shells we use now are
thinner than the bakelite shells they used to use, so there's more
internal volume for the same external size container.

And, yes, we now have other chemistries (lithium ion, metal hydride,
etc) which are better in various ways, but the energy density (joules /
kg) really hasn't changed much in 100 years.
No. I'm arguing that they shouldn't convert 90% of their energy input
into heat.

Actually, they convert 100% of their energy input into heat. The trick
is having them do something useful along the way.
 

Michael Torrie

A typical desktop computer uses less than 500 watts for *everything*
except the screen. Hard drives. DVD burner. Keyboard, mouse, USB devices,
network card, sound card, graphics card, etc. (Actually, 350W is more
typical.)

Moore's Law observes that processing power has doubled about every two
years. Over the last decade, processing power has increased by a factor
of 32. If *efficiency* had increased at the same rate, that 500W power
supply in your PC would now be a 15W power supply. Your mobile phone
would last a month between recharges, not a day. Your laptop could use a
battery half the size and still last two weeks on a full charge.

Actually that's not what Moore's law is about. Moore's law states that
the number of transistors on the die doubles every 18 months. Any
doubling of something else is entirely coincidental.
<snip>

No. I'm arguing that they shouldn't convert 90% of their energy input
into heat.

All electronic circuits that don't create a motive force that performs
work convert 100% of their electrical energy into heat. I'm using "work"
defined in the physics sense. CPUs take in electricity and dissipate 100%
of it as heat, and do so immediately. This conversion to heat does
happen to do something useful along the way (flipping states on
transistors that represent information). We used to tell people that
computers make very efficient space heaters. Because in fact they do.
 

Gene Heskett

Looking at the whole system, about the only energy input that is not
converted to heat would be the milliwatt or three of sound from the speaker
when it beeps at you, and the additional energy to spin the fans. That is
all calculable if you have experience in air moving, as in HVAC.
Strange statement.
What should they convert it into then?

JFTR: Information processing and (physics) energy are about as
convertible as say: "Is a kilogram smaller/greater than a mile?"

;-)

Cheers, Gene Heskett
--
"There are four boxes to be used in defense of liberty:
soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)
Genes Web page <http://geneslinuxbox.net:6309/gene>
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
 

Marko Rauhamaa

Gene Heskett said:
Looking at the whole system, about the only energy input that is not
converted to heat would be the milliwatt or three of sound from the speaker
when it beeps at you, and the additional energy to spin the fans.

That all becomes heat as well.

The dust particles that stick to the ceiling would be an example of
energy not wasted as heat (gravitational potential energy).


Marko
 

Marko Rauhamaa

Michael Torrie said:
We used to tell people that computers make very efficient space
heaters. Because in fact they do.

And that's no joke. Our home in Finland is heated with electric
radiators. They are on 8-9 months a year. During those months, the use
of all electrical appliances is free (apart from wear and tear).


Marko
 

Dennis Lee Bieber

(No currently designed car with an internal combustion engine uses air-
cooling. The last mass market car that used it, the Citroën GS, ceased
production in 1986. The Porsche 911 ceased production in 1998, making it,

Sorry, but the VW Bug was still being produced in some countries up to
2003 (VW shut down the last production line June 2003 in Mexico). The 911
is still in production -- though the air-cooled engine was discontinued in
1998.
 

alex23

The nice thing with optional type annotations and a hypothetical Python
compiler would be that you could, e.g., continue using the interpreter
during development and then compile for production use.

s/annotations/decorators/ and you effectively have Cython's "pure
Python" mode.
 

Gregory Ewing

Rustom said:
JFTR: Information processing and (physics) energy are about as convertible
as say: "Is a kilogram smaller/greater than a mile?"

Actually, that's not true. There is a fundamental
thermodynamic limit on the minimum energy needed to
flip a bit from one state to the other, so in that
sense there's a relationship between watts and
bits per second.

We're nowhere near reaching that limit with
current technology, though. In principle, our
CPUs could be a lot more energy-efficient.

(That doesn't mean they would convert a smaller
proportion of their energy input into heat. It
means they would need less energy input in the
first place).
 

Gregory Ewing

Steven said:
Everything *eventually* gets converted to heat, but not immediately.
There's a big difference between a car that gets 100 miles to the gallon,
and one that gets 1 mile to the gallon.

With a car, the engine converts some of its energy to
kinetic energy, which is subsequently dissipated as heat,
so it makes sense to talk about the ratio of kinetic
energy produced to energy wasted directly as heat.

But when you flip a bit, there's no intermediate form
of energy -- the bit changes state, and heat is produced.
So all of the heat is waste heat.
 

Gregory Ewing

Chris said:
So, let me get this straight. A CPU has to have a fan, but a car
engine doesn't, because the car's moving at a hundred kays an hour. I
have a suspicion the CPU fan moves air a bit slower than that.

If the car were *always* moving at 100km/h, it probably
wouldn't need a fan.

In practice, all cars do have fans (even the ones that
aren't air-cooled), for the occasions when they're not
moving that fast.

(BTW, so-called water-cooled engines are really air-cooled
too, just not by air flowing directly over the engine
block. (Although marine engines may be an exception.))
 
