Hardware is Cheap, Programmers are Expensive

Given the rapid advance of Moore's Law, when does it make sense to throw hardware at a programming problem? As a general rule, I'd say almost always.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2008/12/hardware-is-cheap-programmers-are-expensive.html

Hardware is cheap, but without programmers it’s trash.

Hmm, what an ideal picture you paint. The hardware I work with is more than adequate for my job - sure, it’d be nice to have cutting-edge bits of kit and to not worry about how our services perform. WOMM (works on my machine), so I’m past caring.

No, rather I would work smarter and with a more efficient frame of mind, learning to eke out every tiny ounce of performance I can before I hit the ‘Buy It Now’ button in accounts. As another poster comments, telling customers to do the same would be the death of a company.

I agree with the Knuth quote, and it’s worth pointing out that people all too often pay attention to only half of it, forgetting that in the critical 3% of cases you actually should be paying attention to the right kinds of optimization.

Always try to spend your way out of a performance problem first by throwing faster hardware at it

Bullshit.

This should be preceded by:

  1. Perf test your code
  2. Profile
  3. Optimize bottlenecks

Only when the lowest-hanging fruit has been reaped does it make sense to throw more hardware at the problem. Most of the initial perf issues are easily solved; it’s only when you reach the area of diminishing returns that things become expensive. (A quick profiling sketch follows below.)
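For step 2, here’s a minimal sketch in Python, assuming a hypothetical hot function handle_request as the entry point (substitute your own):

    import cProfile
    import pstats

    def handle_request():
        # hypothetical hot path standing in for your real entry point
        total = 0
        for i in range(1_000_000):
            total += i * i
        return total

    # profile the call, then print the ten most expensive functions
    cProfile.run("handle_request()", "perf.out")
    stats = pstats.Stats("perf.out")
    stats.sort_stats("cumulative").print_stats(10)

Optimize whatever dominates that list first; only then price out new hardware.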

I’ll put those economics up your ass, write my web apps in C++, and run them on a Pentium IV.

I’m not being sarcastic or joking.

That’s why we have Wirth’s Law:
Software is getting slower more rapidly than hardware becomes faster

Can that salary chart be right? A programmer with less than a year of experience makes more than one with 1-4 years, and even a little more than one with 5-9 years of experience? So if that is right and you are looking for a job, just leave your resume blank until you have 10 years of experience!

I think that he is judging this as the cost to the employer, with benefits included:

http://swz.salary.com/salarywizard/layouthtmls/swzl_compresult_national_IT10000010.html

Breakdown:

25th percentile: $46,047
50th percentile: $52,141
75th percentile: $59,139

Breakdown WITH benefits:
50th percentile: $76,448
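For concreteness, here’s what that loaded figure works out to per hour, with a hypothetical programmer-week of tuning and a hypothetical server price for comparison:

    # rough cost comparison, using the $76,448 fully loaded figure above
    LOADED_SALARY = 76_448        # per year, with benefits
    HOURS_PER_YEAR = 2_080        # 52 weeks * 40 hours

    hourly = LOADED_SALARY / HOURS_PER_YEAR   # ~$36.75/hour
    week_of_tuning = hourly * 40              # ~$1,470 per programmer-week

    SERVER = 1_500                # hypothetical commodity server price
    print(f"hourly: ${hourly:.2f}, week of tuning: ${week_of_tuning:.0f}, server: ${SERVER}")

On those (assumed) numbers, one week of one programmer’s time costs about as much as a whole commodity box.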

That salary chart has only 23 samples recorded for the less-than-1-year guys vs 2,000+ samples for the other bands.

It’s a statistical glitch.
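To see how a glitch like that happens, here’s a toy illustration with made-up salaries: a few outliers swing a 23-sample mean far more than a 2,000-sample one.

    from statistics import mean
    import random

    random.seed(0)
    # made-up data: typical salaries around $55k, plus three $110k superstars
    big_band = [random.gauss(55_000, 8_000) for _ in range(2_000)]
    small_band = [random.gauss(55_000, 8_000) for _ in range(20)] + [110_000] * 3

    print(f"2,000-sample mean: ${mean(big_band):,.0f}")   # stays near $55k
    print(f"23-sample mean:    ${mean(small_band):,.0f}") # pulled up by 3 outliers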

While I agree in principle, I’d caution against assuming that the cost of adding a server to a hosting farm is equal to the cost of buying the hardware. First there’s the supporting infrastructure - more servers mean more storage, which means more backups, just to name one element.

On top of that, more servers mean more work for the systems people, whether you have them on staff or pay a hosting provider. A well designed infrastructure architecture (e.g. good automated configuration management) will mean you can scale your number of servers more quickly than your staff, but it’s not insignificant.
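To put rough numbers on that point (all of these are hypothetical, not from the article), the sticker price is only part of what each added server costs per year:

    # hypothetical annual cost model for adding one server to a farm
    HARDWARE = 1_500 / 3           # $1,500 box amortized over 3 years
    POWER_AND_SPACE = 600          # hosting, power, cooling per year
    STORAGE_AND_BACKUP = 400       # extra storage plus backup capacity
    ADMIN = 0.05 * 80_000          # 5% of a sysadmin's loaded salary

    total = HARDWARE + POWER_AND_SPACE + STORAGE_AND_BACKUP + ADMIN
    print(f"annual cost per added server: ${total:,.0f}")  # ~$5,500/year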

How can the top chart be correct? After seeing that I stopped reading - if all the conclusions were based on an incorrect chart, then why bother with the rest of the article?

Yes, the salary chart is right. That’s because it’s based on newly employed programmers with x years of experience - driven by job ads.

And newly employed people with no experience include a small number of superstars who have just left college (and not just with a BS - some will have a master’s or even a PhD) who lift the average, but no one advertises $100,000 jobs for PhDs with two years’ commercial experience.

On high volume low cost products (a typical consumer product embedded system for example), smart coding for cheap hardware definitely pays dividends. Often the performance is limited by the need to run from batteries.

Jeff is using the wrong chart. The chart he is using is for Sr. Software Engineer/Developer/Programmer.

Here is the non-Sr. chart link:

http://www.payscale.com/research/US/Job=Software_Engineer_%2f_Developer_%2f_Programmer/Salary

There, less than 1 year is $57k, with 20 years or more at $81k.

Ok, it is 5:40 on the west coast. Why are you awake!?

Also. I have had a couple of employers who wouldn’t spend a dime on their developers, designers or anything. I recently came to a very amazing company where each employee received a brand new computer and individual licenses for CS3, VS 2008, VS 2005, Vista Ultimate and a few others. Now that is awesome!

Also. I have had a couple of employers who wouldn’t spend a dime on their developers, designers or anything.

Obviously this is a very bad sign, and I’d avoid working for companies this dumb.

Views shared by some very smart people.

Does Django scale?

Yes. Compared to development time, hardware is cheap, and so Django is designed to take advantage of as much hardware as you can throw at it.

Oh, you forgot: make it scalable. No sense in trying to spend your way out of misery if your code uses only 1 core…
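Fair point. A minimal sketch of the difference in Python, spreading a hypothetical CPU-bound work() across all cores with multiprocessing:

    from multiprocessing import Pool, cpu_count

    def work(n):
        # hypothetical CPU-bound task
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 16

        # single core: the extra cores you paid for sit idle
        serial = [work(n) for n in jobs]

        # all cores: throwing hardware at the problem actually helps
        with Pool(cpu_count()) as pool:
            parallel = pool.map(work, jobs)

        assert serial == parallel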

We are waiting to receive our brand-new workstations. My old one is no slouch, and I’m still going to take a few months to transition before I finally turn it in. In the meantime I’ll finally have something to run Vista on. My previous machine could, technically, run it, but the performance was underwhelming.

For programming, I think it’s easy to throw more hardware at the problem - almost every IDE I use is a memory/CPU hog. When it comes time to ship, I keep an old P4 around just for testing. While my company has the resources to buy good new hardware regularly, a lot of our users don’t have the same capabilities.