The Cost of Leaving Your PC On

Between my server and my Windows Media Center home theater PC, I have at least two PCs on all the time at home. Have you ever wondered how much it's costing you to leave a computer on 24 hours a day, 7 days a week?

This is a companion discussion topic for the original blog entry at:

Good tips, thanks.

Another option, especially in California, is to get a solar power system. With PG&E’s “Net Metering”, at peak times you get a 3-to-1 ratio (in dollars) on power you generate vs. power you use. I have 5 servers and a bunch of other electronic devices (e.g. 4 ReplayTVs) running 24/7 and my PG&E bill is still only $5/month (due to minimum state fees).

For more details (technical, financial, etc…) see:

Wow, that’d be great… if I could afford to buy a home in the Bay Area!

Strong demand and low interest rates through the early spring helped push the median price for a single-family home in the nine counties to $622,000.

So much for that whole American dream thing…

I used to have this big server fetish back in the day… had a crazy dual-Xeon with 2GB of memory and 6 40GB HDs, all stuffed in a full-size ATX chassis with a bazillion fans, inside my closet.

What did I use it for? Personal web host, a place to host a few non-production client apps/sites, and somewhere to store a bunch of media files so I could access them from around the house. Power consumption was off the charts.

Fast forward 3 years. I realized a) I didn’t need that much power and b) more powerful stuff got cheaper and less power hungry. Tossed the old box and replaced it with a mini-ITX setup (tiny box, tiny PSU, 1 fan (soon to be replaced by a passive cooling solution)) with integrated everything (LAN, video, etc.) and a single 30GB drive to hold Win2k3 and SQL 2000. No CD-ROM. I was going to extend the theme and use a notebook drive as well, but I wanted something reliable.

Then I grabbed an el-cheapo NAS dealie and tossed in a pair of 200GB Seagates to hold all my documents, media, recorded TV shows, etc. Now I just sit back and see how long it takes for an $8-a-month power savings to recoup the $1000+ I put into this new setup. =)

Now I just sit back and see how long it takes for an $8-a-month power savings to recoup the $1000+ I put into this new setup. =)

LOL, exactly :wink:

It’s all about striking a balance between cash outlay and potential longer-term savings.
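For anyone curious how that balance actually works out, here’s a quick sketch of the payback math, using the $1000 outlay and $8/month savings quoted above:

```python
# Rough payback estimate: how long monthly savings take to recoup an
# upfront cost. The $1000 outlay and $8/month savings are the figures
# from the comment above.
def payback_months(outlay_dollars, monthly_savings):
    return outlay_dollars / monthly_savings

months = payback_months(1000, 8)
print(f"{months:.0f} months (~{months / 12:.1f} years)")  # 125 months (~10.4 years)
```

At over a decade to break even, the new setup is more about the fun of the build than the power bill.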

I like mini-itx, but those damn CPUs are way underpowered. For reference, a 1ghz C3 “Nehemiah” CPU benchmarks out at around the P2-500 level. A Pentium-M system is easily 3x-4x more powerful at the same wattage. It’s also much more expensive, of course…

Now that I think about this, that $20 a month I’m spending on power might be better used to pay a web hosting service. That would take me to zero watts, but I’d be sacrificing personal control over my stuff, too.

I use second hand laptops for server stuff. Especially trivia like firewalls, print servers etc can easily be handled by even the oldest laptop. I buy the occasional laptop hard disk, but the last laptop I was given has USB2 so it’ll take a fast external disk if I ever need that. Note that you don’t care if the screen is broken, as long as it’ll drive an external monitor when you need it to (or run linux straight off the CD drive for many things).

Power consumption is very low (45 watts peak, 10W or less with the disk spun down), and most laptops now use relatively efficient switch mode power supplies.

The savings mount up - my two laptops have cost me about US$100 so far (that new hard disk), which should pay off in about a year.


160 W = 2-3 lightbulbs. All power savings should be measured in light bulbs; your standard 60W bulb is a good yardstick. It’s a unit most people can intuitively grasp.

It can cost several hundred dollars to drop the power consumption of a computer from 160W average to 100W (1 light bulb saved), and even more to get down to 40W (2 light bulbs). It costs about $5 to replace an incandescent bulb drawing 60W with a fluorescent bulb drawing 10-15W. Replacing all the lightbulbs in your typical house (between 20 and 50!) with fluorescent bulbs will result in significant power savings for a fairly modest cash outlay (one that tends to be recouped in reduced bulb-replacement costs alone).
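To put dollar figures on those wattages, here’s a small sketch of the annual cost of a continuous load. The $0.10/kWh rate is an assumption for illustration; plug in your own utility’s rate:

```python
# Annual electricity cost of a load running 24/7.
# RATE_PER_KWH is an assumed rate ($0.10/kWh); substitute your own.
RATE_PER_KWH = 0.10

def annual_cost(watts, hours_per_day=24):
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * RATE_PER_KWH

# Compare the PC figures above with a 60W incandescent and a 15W fluorescent.
for watts in (160, 100, 40, 60, 15):
    print(f"{watts:4d} W -> ${annual_cost(watts):7.2f}/year")
```

At that rate, the 160W PC runs about $140/year, while swapping one 60W bulb for a 15W fluorescent saves roughly $40/year for a $5 outlay - which is the point about bulbs being the better first target.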

While saving power is good, don’t target your computer first. Better places to start are your hot water system, air conditioning and heating, your refrigerator (bad seals on a fridge can be a huge power draw!), the laundry, TV, and radio. Nearly all of these will provide bigger power savings for the buck, and many just require you to change habits - e.g. don’t leave the TV on if it is not being watched.

A detailed refutation that leaving your computer on all the time is, in any way, a good idea:

With all due respect… duh?

I certainly wasn’t advocating leaving a PC on all the time just because.

I need the home theater PC on to record television and allow remote scheduling over HTTP; I need the server on to serve this site along with other web content, SMTP, POP3, FTP, and other miscellaneous long-running tasks.

It is possible to use S3 suspend with Windows MCE and scheduled recordings; however, my HTPC developed a disturbing habit of hanging on wakeup about 20 percent of the time…

Or you could just shut down your MCE box and have your server wake it when you need it (via HTTP access to a page served by said server and backed by a script, and/or on a schedule). WOL is your friend, and if it fails you, there are some sub-$5 solutions that take a more hardware-oriented approach. We use this regularly with embedded machines here in the coin-op part of IT.
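For anyone who hasn’t played with WOL: the “magic packet” is just 6 bytes of 0xFF followed by the target’s MAC address repeated 16 times, broadcast over UDP (port 9 by convention). A minimal sketch - the MAC address here is a placeholder, substitute your MCE box’s:

```python
# Minimal Wake-on-LAN sender. The magic packet is 6 x 0xFF followed by
# the target MAC repeated 16 times (102 bytes total), sent as a UDP
# broadcast. Port 9 ("discard") is the customary choice.
import socket

def magic_packet(mac):
    # Accepts "aa:bb:cc:dd:ee:ff" or "aa-bb-cc-dd-ee-ff" forms.
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake_on_lan(mac, broadcast="255.255.255.255", port=9):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# wake_on_lan("00:11:22:33:44:55")  # placeholder MAC - use your machine's
```

The target machine’s NIC and BIOS both need WOL enabled for this to work, and it generally only reaches machines on the local broadcast domain.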

That’s probably a low-ball number you’ve got there. Keep in mind that the kWpH the electricity company charges you for is how many watts you are using per-second. This means that it would cost you less to have a 10 Watt bulb on for two hours than it would to have a 20 Watt bulb on for one hour.

I’ve read a report somewhere stating that most hardware failures occur during power-up (or power-down), and it even had figures showing that companies that leave their computers running 24/7 have lower hardware costs in the long run.

Now if I could just get you to turn off your three LCD monitors when you leave for the day…



They turn off after 20 minutes. Otherwise who’s gonna see my cool screen saver running across three monitors?

It’s like Laser Floyd, but without the lasers. Or the Floyd.

Hey, I’m glad somebody brought up the idea of laptops. Your average laptop is optimized for a far lower power drain simply because it’s designed to run on battery power. So, if you have a home desktop machine plus a laptop that you tote around, plugging the laptop in when you get home instead of powering up the desktop machine is probably going to save you some juice right off the bat. (Do the math, though, obviously.)

harveyswik: That goes against everything I’ve ever heard about electricity billing. Care to back it up?
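For what it’s worth, the standard billing unit is the kilowatt-hour, which is power multiplied by time, not “watts per second” - so the two bulbs in the earlier example cost exactly the same:

```python
# A kWh is energy: power (kW) x time (hours). A 10 W bulb for two hours
# and a 20 W bulb for one hour draw exactly the same 20 Wh.
def watt_hours(watts, hours):
    return watts * hours

print(watt_hours(10, 2))  # 20
print(watt_hours(20, 1))  # 20
```

Either way you’re billed for 0.02 kWh, which at typical residential rates is a fraction of a cent.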

Or just get rid of that physical box and start using a virtual server. I don’t want to seem like a spammer so I won’t go on about where I get mine, but it’s a great way of being environmentally friendly while maintaining your own personal server thingy.

(oh what the chuff. I get mine from - I recommend them.)

Oops, I just realized how old this discussion is…

Good thing it will only take a couple months to pay off the measuring device :stuck_out_tongue:

Much better ways to conserve energy are to make sure your house is insulated properly. Most electricity is used for heating and cooling, and proper insulation makes sure the hot or cold air stays where you want it. Running around making sure every electrical appliance you have in your home is drawing minimal power is kind of like a fat guy doing wrist curls at the gym. He’s got bigger fish to fry.

Ha! What a great, geeky post. You’d get along great with my wife.

I bet Bre at could figure out a way to power your server with your hamster, or with excess energy from microwaving popcorn.