Upgrading to a High Efficiency Power Supply

Just about every device you plug into a wall consumes power whether it is on or not.

Why? Simple: the switch usually interrupts the output voltage of the transformer, not the input voltage. That means power is ALWAYS running through the transformer coil in the machine. (Almost nothing but irons, hot plates, and some lights run at a full 120 V; just about everything else needs to step the voltage down. If a device doesn't use an external adapter, the adapter is just hidden inside the machine as a permanent part.)

Why would someone design it this way? Easy: ease of use. Remotes need power to receive a signal, so any device with a remote is ALWAYS on. TV tubes would be as slow to start today as they were in the '50s if they weren't kept in a "warm" state.

Heck, even your doorbell is hooked to a 12-volt transformer which is ALWAYS running (it wouldn't be too safe to run 120 volts to something that people mash on and that could get wet).

Just turn all your devices off and go watch your power meter spin!

Fortunately, you CAN get some relief by converting some of your computers to lower-powered dedicated boxes. Want a file server? Get a network-attached storage device instead of a computer. Firewall? Most cable routers do that job and consume a fraction of the power of a full computer.

I figure my power bill drops at least 20 dollars a month for each server I take offline and replace with a dedicated brick.
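For the curious, here's the kind of back-of-the-envelope math behind that figure. The $0.12/kWh rate and the wattages below are assumptions, not measurements:

```python
# Rough monthly cost of an always-on box, assuming $0.12/kWh.
def monthly_cost(watts, rate_per_kwh=0.12, hours=24 * 30):
    """Dollars to run a constant load for a 720-hour month."""
    return watts / 1000 * hours * rate_per_kwh

print(monthly_cost(250))  # older tower server idling ~250 W: ~$21.60/month
print(monthly_cost(20))   # small NAS brick idling ~20 W: ~$1.73/month
```

At those (assumed) numbers, swapping one idle tower for a NAS brick saves roughly $20 a month.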

Oh, and LCD displays consume a LOT less power than CRTs, because they don't need to keep an electron gun hot and ready at all times.

A better solution to the computer supply problem is my home DC power supply idea. Energy is wasted all over the house by those little black power transformers; if one feels warm to the touch, it's wasting energy. I propose that UL come up with a home DC power standard: a new electrical socket with a three-prong outlet carrying ground, +5 V, and +12 V. From this, every small electronic device in the house could be powered (including your PC), with one efficient, central switching supply providing the DC power for the whole house. This would save lots of electricity, and it would also eliminate the waste of every device manufacturer having to include its own power supply. The cost of everything would be reduced and we would save electricity in the process.

Waste heat from appliances or any other devices, including computers, is not necessarily waste. Waste heat is 100% efficient at heating the space it is situated in…assuming that space indeed needs to be heated, of course. In fact, nearly ALL of the energy consumed by a computer, efficient or not, heats the space it sits in. A computer is basically one big resistive heating element converting electricity to heat with near-perfect efficiency. During the heating months, this is a nice side benefit. In the cooling months, well…turn off the A/C and open the window if you really want to save some energy.

Although, in the end, what's the point!? Electrical power is relatively cheap and, ultimately, energy is limitless. Heresy to you!? Well, you really need to read the book "The Bottomless Well" for an actual scientific exploration of the topic. It continues to amaze me how easily most people are influenced by the inept and scientifically clueless media…especially on the perpetually popular topics of energy use and environmental issues.

Standby power has had a lot of press coverage lately in the UK and there are now a couple of companies offering solutions:

http://www.oneclickpower.co.uk/
These sell power strips where the entire strip turns off automatically when it detects that the device in the 'master' socket has gone into standby.

http://www.byebyestandby.co.uk/
These offer a radio remote to switch off their 'smart sockets' from a distance.

Of course in the UK our sockets always have a switch on them too - so the easiest and cheapest way to save standby power is to turn it off at the wall.

Edison has the last laugh here.

dnm, electrical heating is actually far more efficient than most other forms. Electrical energy is only about 25-33% more expensive than natural gas here (assuming my math is right when converted to joules), and due to the need to exhaust the combustion byproducts, most furnaces are only 75% efficient in the best of times, while high-efficiency furnaces cost several times more.
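For anyone who wants to redo the joules math, here is a sketch. The prices are placeholders (rates vary wildly by region), so plug in your own:

```python
# Cost per megajoule of *delivered* heat for electricity vs. natural gas.
KWH_TO_MJ = 3.6       # 1 kWh = 3.6 MJ
THERM_TO_MJ = 105.5   # 1 therm of natural gas is about 105.5 MJ

def cost_per_mj_electric(rate_per_kwh, efficiency=1.0):
    # Resistive heating delivers essentially all of its energy as heat.
    return rate_per_kwh / (KWH_TO_MJ * efficiency)

def cost_per_mj_gas(rate_per_therm, furnace_efficiency=0.75):
    # Some heat goes up the flue, so divide by furnace efficiency.
    return rate_per_therm / (THERM_TO_MJ * furnace_efficiency)

elec = cost_per_mj_electric(0.12)  # placeholder $/kWh
gas = cost_per_mj_gas(1.20)        # placeholder $/therm
print(f"electric: ${elec:.4f}/MJ, gas: ${gas:.4f}/MJ, ratio {elec / gas:.2f}x")
```

With these placeholder prices electricity comes out around 2.2x the cost of gas per delivered megajoule; the ratio depends entirely on local prices, which is why it can be much closer in some areas.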

Even if the electricity is generated using natural gas, what do you think will be more efficient: a small number of plants operated by companies with engineers maintaining the equipment, or each and every furnace, which hopefully gets occasional maintenance (but whose individual losses are well under the cost of hiring the aforementioned engineers)?

As to the comment that the heat isn't in the most ideal location: my house has these things called convection currents, which take care of heat distribution within individual rooms, averaging out the temperature reasonably well.

My office's temperature is roughly the same in the various places I have measured (within a degree), and it is heated primarily by the computers it contains, since it's the only occupied room for many hours during the day. I don't mind the rest of the house being much colder while unoccupied, and neither do the cats; they just curl up in my office with me. I do have a couple of fans in my office, but I do not run them in the winter.

During the winter I run distributed computing projects at certain times of the day (like a poor man's thermostat); during the summer I do not.
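Something like this sketch, for instance. The schedule and the service name are purely hypothetical stand-ins for whatever compute client you actually run:

```python
# A hypothetical "poor man's thermostat": run a heavy compute job only
# during the heating season and occupied hours.
import datetime
import subprocess

HEATING_MONTHS = {11, 12, 1, 2, 3}  # assumed heating season
RUN_HOURS = range(7, 23)            # assumed occupied hours

def should_heat(now=None):
    now = now or datetime.datetime.now()
    return now.month in HEATING_MONTHS and now.hour in RUN_HOURS

if __name__ == "__main__":
    action = "start" if should_heat() else "stop"
    # "distributed-compute.service" is a made-up unit name; substitute
    # however you launch your own client.
    subprocess.run(["systemctl", "--user", action, "distributed-compute.service"])
```

Run it from cron every hour and the compute load follows the heating schedule.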

@Dave:
Direct electrical resistance heating is not as efficient as a heat pump driven by the same electricity: the heat pump moves heat from outside rather than generating it, so it delivers more heat per watt consumed.
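A quick way to see the difference. The COP (coefficient of performance) values below are typical-range assumptions, not measurements of any particular unit:

```python
# Resistance heat converts each electrical watt into one watt of heat.
# A heat pump *moves* heat instead, so it can deliver several watts of
# heat per watt consumed; that multiplier is its COP.
def heat_delivered(watts_in, cop):
    """Heat output in watts for a given electrical input and COP."""
    return watts_in * cop

print(heat_delivered(1000, 1.0))  # resistive element: 1000 W of heat
print(heat_delivered(1000, 3.0))  # assumed air-source heat pump: 3000 W
```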

Here is the problem: if we save money by using more efficient power supplies, our energy company WILL increase our rates. It's as simple as 1+1=2, and it's as guaranteed as the sun rising in the east.

An example? We had a major drought in the southeast USA and were forced(!) to stop using water (or really, really skimp on its use) for almost every purpose. What happened? Most of us obeyed, and our local water utilities saw major reductions in revenue. How did they make up that lost revenue? By increasing our rates.

Thanks. It wouldn't shock me (no pun intended) if our electric companies did the VERY SAME THING once we figured out how to save 30% on our electric bills.

The entire system is broken. Before we figure out how to reduce electricity usage, water usage, etc., we need to first fix our political and civil systems.

I'm using a Pico120 regulator on the motherboard with a separate power brick. This reduced my server's idle draw from 47 W to 35 W; it will pay for itself in 4 years! I'm running an AMD 4850e dual-core 2.5 GHz processor in a motherboard with the NVIDIA 7025/630 chipset and 2 WD 500 GB Green disks (mirrored). With such low power it's trivial to keep it cool (and quiet: a 68 Ω series resistor slows the case fan down nicely, as there isn't a PSU fan).
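The payback math works out roughly like this. The 12 W saving comes from the measurements above; the electricity rate and the hardware cost are assumptions:

```python
# Rough payback period for the PicoPSU-style swap.
savings_watts = 47 - 35                          # measured idle reduction
kwh_per_year = savings_watts * 24 * 365 / 1000   # ~105 kWh/year
rate = 0.12                                      # assumed $/kWh
dollars_per_year = kwh_per_year * rate           # ~$12.60/year
hardware_cost = 50.0                             # assumed board + brick cost
print(f"payback: {hardware_cost / dollars_per_year:.1f} years")  # ~4.0
```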

Funny how it's often the power supply that gets overlooked: not in terms of output (you've got to run your components and leave headroom for the next video card upgrade), but in terms of actual consumption.

In 2006, Google wrote:

Instead of the typical efficiencies of 60-70%, our servers’ power supplies now run at 90% efficiency or better, cutting down the energy losses by a factor of four.
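Here's where a factor of four comes from, worked through at an illustrative 200 W DC load:

```python
# Watts dissipated inside the supply to deliver a given DC load.
def loss_watts(dc_load, efficiency):
    return dc_load / efficiency - dc_load

load = 200  # illustrative DC load in watts
print(loss_watts(load, 0.70))  # ~85.7 W wasted at 70% efficiency
print(loss_watts(load, 0.90))  # ~22.2 W wasted at 90% efficiency
# 85.7 / 22.2 is about 3.9: losses drop by roughly a factor of four.
```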

Ten years later, now that 80 Plus Platinum and 80 Plus Titanium are a thing, 90% and even 94% efficiency can be bought off the shelf in 2017.

I just upgraded from an 80 Plus Gold power supply I bought in 2011 to an 80 Plus Titanium, and here are my results with an i7-7700k and a 1080 Ti:

Prime95: 252 W to 235 W (7% better)
Rthdribl + Prime95: 443 W to 426 W (4% better)

Efficiency does depend on the load factor and seems to peak at 50% load, but impressively enough, Titanium guarantees 90% efficiency even at 10% load.
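For a sense of what that drop is worth in dollars, here is a rough sketch; the hours of heavy load per day and the electricity rate are assumptions:

```python
# Annual savings from the measured full-load drop (443 W -> 426 W).
old_w, new_w = 443, 426   # measured wall power, gold vs. titanium
hours_per_day = 4         # assumed heavy-load hours per day
rate = 0.12               # assumed $/kWh

kwh_per_year = (old_w - new_w) * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/yr, ${kwh_per_year * rate:.2f}/yr saved")
# ~25 kWh/yr, ~$2.98/yr under these assumptions; idle-time savings,
# where the Titanium 10%-load guarantee kicks in, would add to this.
```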


I specified a platinum PSU in a recent AMD build, my first since 2006!

Looking within the (excellent!) Silverstone brand, pricing looks like $220 for an 800 W Titanium SFX,

$165 for a 700 W Platinum SFX,

$100 for a 500 W Gold SFX.

It's hard to do apples-to-apples, since the fancier efficiency tiers are typically only built for the higher wattage requirements.
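One rough way to compare them anyway is dollars per watt, using the prices above:

```python
# Dollars per watt for the three quoted Silverstone SFX options.
options = [("Titanium", 220, 800), ("Platinum", 165, 700), ("Gold", 100, 500)]
for name, price, watts in options:
    print(f"{name}: ${price / watts:.3f}/W")
# Titanium: $0.275/W, Platinum: $0.236/W, Gold: $0.200/W
```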
