When Hardware is Free, Power is Expensive

Bill Gates has often said that over time, the cost of computer hardware approaches zero. Here's one such example:


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2007/05/when-hardware-is-free-power-is-expensive.html

For the record, it is possible to design a power supply that’s efficient at low load levels. But it’s far from common.

http://www.silentpcreview.com/article263-page4.html

The Fortron Zen is pretty amazing; 77% efficiency at 52w, going all the way up to 88% efficiency at 300w.
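
Those efficiency figures translate directly into wall draw. Assuming efficiency here means DC power delivered divided by AC power drawn (the usual definition), a quick Python sketch:

# What "77% efficient at 52 W" means at the wall: the supply has to
# draw the DC load divided by its efficiency from the outlet.
def wall_draw(dc_load_watts, efficiency):
    return dc_load_watts / efficiency

print(wall_draw(52, 0.77))   # ~67.5 W drawn to deliver 52 W
print(wall_draw(300, 0.88))  # ~341 W drawn to deliver 300 W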

14.8 cents/kwh? I would love to get electricity that cheap. Over here in Germany we pay 0.17 € cents/kwh (which is 0.22 $-cents). And that’s already pretty low…

Sorry, I meant 0.17 €, not € cents. And also 0.22 US-$, not cents.

I’m not running it, but I’ve heard anecdotally that Vista (with its Aero Glass and other graphical hoopla) IS running video cards, etc. at a much higher level at “idle” than people were seeing under XP.

This doesn’t in any way negate what you’re describing, but I think it might be reasonable to tweak your calculations to account for the fact that maybe Google’s problems just aren’t our problems YET.

14.28 cents is expensive, huh? Sheesh, you Americans don’t know you’re born! Next thing, you’ll be whinging about the price of “gas” :wink:

Insanely expensive power rates in your area?

I believe you just proved the point that electricity is way too cheap in California. Where is the incentive to buy/use more efficient power supplies at those (cheap) rates? As you point out, the technology is out there, but nobody will buy it unless it’s priced competitively and amortizes quickly. As Jonas points out, power is much more expensive in Europe, but then again consumers there take power usage into consideration when purchasing appliances, which has a positive side effect for everybody’s future.

As a side note, Google is using so much electricity that solar energy is starting to become economical for them.
http://www.greenbiz.com/news/news_third.cfm?NewsID=34136

I’ve been considering buying the PicoPSU ( http://www.silentpcreview.com/article601-page1.html ) for my already low-power server (no video at all! 5V fans, runs at 300 MHz at idle), but since it’s been over two years since I built it and a year since I pulled the video card, I haven’t bothered; it can’t be using more than 70-80W as-is.
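
Even 70-80 W around the clock adds up, though. A rough annual-cost sketch, treating both the 75 W draw and the ~14 cents/kWh rate discussed in this thread as assumptions:

# Annual cost of an always-on box drawing ~75 W at ~$0.14/kWh (both assumed).
watts = 75
rate_per_kwh = 0.14
kwh_per_year = watts * 24 * 365 / 1000   # 657 kWh
print(kwh_per_year * rate_per_kwh)       # roughly $92 a year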

As a side note, I love how exhaustively researched all of your articles are, Jeff. =D

The last calculation here is actually only correct in your situation where you already have a power supply.

If someone is looking into buying new hardware anyway, your calculation shows that a better but more expensive power supply can be cheaper after 2 years, even if it costs $30-50 more.
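
In other words, the break-even point is just the price premium divided by the yearly savings. A small sketch (the $40 premium and $20/year savings are illustrative assumptions, not measured figures):

# Payback period for a pricier but more efficient power supply.
def payback_years(extra_cost, annual_savings):
    return extra_cost / annual_savings

# e.g. a $40 premium that saves about $20/year in electricity
print(payback_years(40, 20))   # 2.0 years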

You are missing the point that 12v is primarily used for mechanical components, whereas 5v and 3.3v are much better for logic. A change to all 12v would require:
Redesigning chips to work on more voltage
Making bigger chips to handle the extra voltage (wires or leads need to be bigger)
Requiring the processor to have its own voltage regulator, because all the motors connected create interference.

And of course, ALL the chips in the system need to be redesigned. Not just the CPU. ALL of them. I don’t particularly think that this is such a good idea.

Also, a PC operates on 12v, -12v, 5v, -5v, 3.3v and sometimes 1.5v (not provided by a regular ATX PSU; this is generated on the motherboard).

Good as always Jeff, nice reading. :slight_smile:

#jeffpage {
background-color: #00FF00;
}

Anyway, as Jonas said, your prices are very low. Here in Denmark we pay around $0.33/kWh. Gas prices have just exploded, to around 11 DKK/l, which should be around $7.45/gallon as far as I can calculate.
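
For anyone checking that conversion: litres to US gallons, then DKK to USD (the exchange rate below is an assumption for illustration):

# DKK per litre to USD per US gallon.
dkk_per_litre = 11
litres_per_us_gallon = 3.785
usd_per_dkk = 0.18   # assumed exchange rate, roughly 5.5 DKK per USD
print(dkk_per_litre * litres_per_us_gallon * usd_per_dkk)   # ~$7.5/gallon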

Hi,
Just checking, but did you calculate the efficiency values in that table by dividing one voltage by the other? That’s not how you do it; that would only be the ratio of the different voltages.

Or is it just coincidence that the percentages come out that way?

Best regards
Steve
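
For what it’s worth, supply efficiency is normally power out divided by power in at a given load, not a ratio of voltages:

# Efficiency = DC power delivered / AC power drawn from the wall.
def efficiency(dc_watts_out, ac_watts_in):
    return dc_watts_out / ac_watts_in

# e.g. delivering 150 W while drawing 200 W from the outlet
print(efficiency(150, 200))   # 0.75, i.e. 75%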

I don’t get it. You said prices are “insanely” high, but you also say it would take years to recoup the initial cost of a better power supply. Isn’t something wrong there?

If energy prices were higher, the better power supply would amortize in the first year and more people would make the switch. So energy is TOO CHEAP. The first priority should always be energy savings, not your personal money issues…

But still a great post, as always :slight_smile:

Sorry, scratch that…I misread it while scanning…

S.

I got a Playstation 3 a few months back, and was initially excited to join the Folding@Home project with it. Excited until I discovered that running Folding@Home (maxing out the CPU) was costing $30/month in power. (This is partly because it bumped me up one tier in PGE’s fee schedule…you pay one rate for “baseline”, another for +100% baseline and even more for +200% baseline. This last rate was $0.33/kWh on my last bill.)

The PS3 uses around 180 watts while at max CPU. Between it and my usually-on server (~140 watts) the den is often ten degrees hotter than the rest of the house.
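
Those figures roughly check out. Treating the console as always on, and the blended rate as an assumption:

# Rough check on the "$30/month" Folding@Home cost.
ps3_watts = 180
kwh_per_month = ps3_watts * 24 * 30 / 1000   # ~130 kWh
print(kwh_per_month * 0.33)   # ~$43 at the top-tier rate
print(kwh_per_month * 0.23)   # ~$30 at an assumed blended rate across tiers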

Lately I’ve been wondering if there could be some way to capture all this wasted heat energy and use it for some useful purpose.

How efficient are laptop bricks at conversion? They too are power supplies.

blah blah blah, they’re so innovative.

That’s why they chose NC as their datacenter site. Tax breaks, a couple of COAL power plants, and the legislature cheaply bought.

Google are locating their data centers next to cheap power sources (like dams) where the direct cost is a lot lower, and then kitting out their roofing with solar technology. I’d say they’re very aware of the cost implications and are reacting to them sensibly.

Thanks for the story. One small correction: the “Holze” referenced briefly above (the guy personally offended by inefficient power supplies) is actually “Holzle”, or more precisely Urs Hölzle, Google’s Senior VP of Operations.

“Idle power consumption for a typical desktop PC ranges between 120 and 150 watts. Thus, the real challenge is to deliver 90%+ efficiency at typical idle power consumption levels-- 120-150 watts.”

No, the real challenge is to drop power consumption levels on idle to something much lower.

On idle, putting the CPU into a lower-energy mode, dropping its speed to the 100 MHz range, spinning down disks, powering off CD drives, switching off displays, and making displays and other peripherals draw much less power in standby modes: these are the real challenges that will give real savings at home.
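
On Linux, at least, some of that is just configuration. A hedged sketch using the standard cpufreq sysfs interface (requires root, and assumes your kernel ships the "ondemand" governor):

# Switch every CPU to the "ondemand" governor so the clock drops at idle.
import glob

for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
    with open(path, "w") as f:
        f.write("ondemand")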

They might not work for Google, as Google probably doesn’t have its boxen idling very much. But for the home, this will give much bigger benefits.