Hardware is Cheap, Programmers are Expensive

I’ve also had a couple of employers who wouldn’t spend a dime on their developers, designers, or anything else.

Obviously this is a very bad sign, and I’d avoid working for companies this dumb.

That’s easy to say, but it’s something I’ve encountered several times. Production hardware comes out of one budget and the money for the standardized desktop hardware out of another, so the developers get stuck with the same machines as everybody else. The trouble is that you often get only one machine (because nobody else needs more than one either), so you end up running a virtual machine on top of your OS and dev tools to replace the test box you had ten-odd years ago.

IME this mostly happens in bigger companies, where the bureaucracy of acquiring additional hardware or software can be so bad that some programmers (OK, me for starters) will cough up for some tools themselves. But that’s software, obviously.

Sadly, sometimes, somewhere, a low-end office computer - just about the cheapest you can get - costs the same as a programmer’s monthly salary.

I’m personally a little bit more expensive than that (just a little), but getting any kind of hardware (or desk, or chair, or software) is almost completely impossible.

Just like any good thing, this can be taken too far.

IME as a web developer, the single biggest performance problem stems from interacting inefficiently with a database, a poorly optimized database, or (shudder) both.

Now, however, I hear web developers tossing around maxims about the dangers of code optimization and using them as reasons not to change expensive database transactions.

…bigger companies where the bureaucracy of acquiring additional hardware or software can be so bad that some programmers (OK, me for starters) will cough up for some tools themselves…

I ninja-stealth upgraded my own computers at a previous job where we had restrictions like that. I basically gutted the inside of the PC, so on the outside it looked like a normal corporate box, but on the inside it had a modern motherboard, PSU, CPU, hard drive, and so on. I just imaged the old drive over to the new one and let it redetect all the new hardware.

Granted this is a little (ok maybe a lot) extreme, but that’s how much I believe in giving programmers the hardware they deserve!

While I agree that companies unwilling to spend cash on developers and design are dumb, jobs are not falling from trees anymore here in the SE US. It’s a real gamble to drop your established job and pick one up with another company, not knowing if it will be around in 12–18 months.

Also, can that pie chart on optimization be right? 14% sounds really high for micro-optimization. How many hand-tuned assembly algorithms is the average developer running that offer 10+% performance boosts from optimizing them?

I ninja-stealth upgraded my own computers at a previous job where we had restrictions like that. I basically gutted the inside of the PC, so on the outside it looked like a normal corporate box, but on the inside it had a modern motherboard, PSU, CPU, hard drive, and so on. I just imaged the old drive over to the new one and let it redetect all the new hardware.

I just bought my own laptop; it had the hardware I wanted (for the development I do) and a second monitor output. I let work know, and told them it would be used purely as a work machine (which it is), and that until they get me a rig with the specs I want, I’ll just use the laptop.

Okay, so it cost me money and the company none. But I have a pretty highly specced laptop that’s my own, and if work supplies me with the rig I want, then bonus: I’ve got a laptop to use at home.

You may want to read http://www.diovo.com/2008/10/the-funny-caching-problem/

Hey Now Jeff,

Great post, nice charts. It’s good to see Coding Horror at #3 on the top 125 NOOP dev blogs for Q4. Hardware is cheap.

Coding Horror fan,
Catto

I work in the games industry; in the short term, at least, your argument doesn’t work for us. I like it that way.

Throwing hardware at a problem makes very good sense when talking about a hosted application. When you move to Desktop development, it doesn’t make as much sense. Intuit can’t expect all their users of Quickbooks to throw a couple hundred bucks at their machines every couple of years.

At my last job, we used quite a bit of in-house developed software. These were not (and could not be) hosted applications – applications for drawing and marking up architectural drawings and things, software that had to run disconnected from the net, etc. It made more sense to our bosses to optimize code rather than upgrade 10,000+ machines.

But, it seems that no desktop developers exist anymore, or at least they don’t read blogs and comment on them, so no one here will care. :)

This is true, but as always, experience comes in to tell you when it isn’t.

In web development, anyway, I’ve quite often seen people do something really algorithmically stupid – needlessly expensive for big n or small n – and defend it with the ‘hardware is cheap’ argument when you know it’ll be a problem. In one case I saw a nested loop that executed SQL: it would run a query, loop over the results, and run another query for each row. Eventually the home page was doing 1500 queries, which ‘worked’ because of a cache – unless you were the lucky visitor who hit it at expiration time. I found it because the dev server’s home page took 15 minutes to load (since no customers were hitting it, the cache never helped).

It took me 15 minutes to write the proper query, though it took more thought than what was there before.

So I think an additional consideration needs to be the complexity of the optimization. If it’s easy – like pulling expensive operations out of loops when they don’t need to be there – that makes sense to me, and you should do it right in the first place. (I’m sure you’d agree with this.) But if it requires a whole data and object remodel, buy new hardware first.
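The nested-query mistake described above (the classic N+1 query pattern) and its fix can be sketched with a throwaway SQLite database; the schema and table names here are made up for illustration:

```python
import sqlite3

# Hypothetical schema: posts and their comments, in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
    INSERT INTO posts VALUES (1, 'first'), (2, 'second');
    INSERT INTO comments VALUES (1, 1, 'a'), (2, 1, 'b'), (3, 2, 'c');
""")

def comments_per_post_slow(conn):
    # The anti-pattern: one extra query per row of the outer result,
    # so N posts cost N+1 round trips to the database.
    result = {}
    for post_id, title in conn.execute("SELECT id, title FROM posts"):
        rows = conn.execute(
            "SELECT body FROM comments WHERE post_id = ?", (post_id,)
        ).fetchall()
        result[title] = [body for (body,) in rows]
    return result

def comments_per_post_fast(conn):
    # The fix: one JOIN, grouped in application code.
    result = {}
    query = """
        SELECT p.title, c.body
        FROM posts p JOIN comments c ON c.post_id = p.id
        ORDER BY p.id, c.id
    """
    for title, body in conn.execute(query):
        result.setdefault(title, []).append(body)
    return result
```

Both functions return the same mapping; the difference is that the second one issues a single query no matter how many rows the home page needs.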

Never send a human to do a machine’s job.

– Agent Smith

I agree with Matt above:

…as a web developer, the single biggest performance problem stems from interacting inefficiently with a database, a poorly optimized database, or (shudder) both.

Jeff’s third step (benchmark your code to identify specifically where the performance problems are) probably means: figure out which database queries are run most often and take the longest to execute. Then take them out of the damn for loop.
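One low-tech way to do that benchmarking is to wrap query execution with a timer and tally per-statement call counts and total time. This is a hypothetical sketch, not any particular framework’s API:

```python
import time
from collections import defaultdict

# sql text -> [call count, total seconds]
query_stats = defaultdict(lambda: [0, 0.0])

def timed_query(execute, sql, *params):
    # Run a query through whatever execute callable your app uses,
    # recording how often each statement runs and how long it takes.
    start = time.perf_counter()
    result = execute(sql, *params)
    elapsed = time.perf_counter() - start
    stats = query_stats[sql]
    stats[0] += 1
    stats[1] += elapsed
    return result

def worst_queries(n=5):
    # Sort by total time; a huge call count on one statement is the
    # signature of a query sitting inside a loop.
    return sorted(query_stats.items(), key=lambda kv: kv[1][1], reverse=True)[:n]
```

After one page load, `worst_queries()` tells you which statements dominate and which ones are being fired hundreds of times.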

David is right above. This is the better salary chart:

http://www.payscale.com/research/US/Job=Software_Engineer_%2f_Developer_%2f_Programmer/Salary

But, regardless, Jeff’s point is still valid.

All I have to say is that YMMV.

Take, for instance, a preprocessor program that removed duplicate CDRs (call detail records) before sending data on to billing, etc.

The program, which used to run very quickly, was by then taking more time to process a CDR file than it took for a CDR file to be produced – a very non-optimal situation, of course. This was all CPU time; no I/O or memory constrained the program.

Now, we could look for hardware twice as fast, but it wouldn’t really be easy – the hardware in use was already very fast. Distributing the work wouldn’t help either, unless you wrote a whole new application with a completely different processing algorithm.

But let’s stay with hardware. Suppose we decided to take that route: that’s three months before we could have the new hardware ready to use. I’m happy for the smaller shops that could have a new machine by the afternoon, but a LOT of corporations out there just don’t work that way. Their loss? Maybe, but that’s still the system we have to deal with.

Now, being unable to charge our clients for 3 months really isn’t good for business. So, my manager decided to ask me – not a member of the programming team, and a much more expensive resource – to take a look at it.

Two hours later, I had a 5000% performance increase tested and ready to go. It just happened that the fields in each CDR were extracted through pattern matching, and only a dozen or so fields among the first hundred or so were actually used. The total number of fields had grown from a couple hundred to more than 600 over the years, which made the matching much slower. So I just changed it to ignore everything after the last significant field before extraction.
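The trick described here – stop parsing once you’re past the last field you actually need – can be sketched with Python’s `str.split` and its `maxsplit` argument; the delimiter, field count, and function names are assumptions for illustration:

```python
# Hypothetical: billing only uses fields up to index 12 of each record.
LAST_NEEDED_FIELD = 12

def extract_fields_slow(record):
    # Splits every field in the record, all 600+ of them,
    # even though only the first dozen are used.
    return record.split(";")[:LAST_NEEDED_FIELD]

def extract_fields_fast(record):
    # maxsplit stops splitting after the fields we need; the
    # unsplit tail is discarded by the slice.
    return record.split(";", LAST_NEEDED_FIELD)[:LAST_NEEDED_FIELD]
```

Both return the same dozen fields, but the fast version never touches the hundreds of trailing fields that grew over the years.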

And the moral of the story is: hardware may be cheaper than a month of a programmer’s salary, but getting a good programmer to look at the problem for a day or so might be cheaper and faster still.

I bought myself a Mac Pro with 8 cores and 10 GB of RAM to be able to virtualize Windows XP and Vista as many times as I wished. I couldn’t be happier. :)

These observations might be valid for server-based applications.
For desktop development, shrinkwrap software, or industrial applications with a defined hardware environment, you still cannot replace senior development craftsmanship with hardware.
Which is a good thing :)

Good analysis, although I have to counter: programmers are expensive relative to what? Compared to server hardware costs they would seem expensive, but have you considered the cost of providing a service like stackoverflow without software? If stackoverflow were paper-based, for instance, you would have the cost of ink, paper, printing equipment, transportation, labor, and so on. That kind of makes programmers inexpensive.

I agree with this article wholeheartedly.

In my experience, it is amazing how many managers and old programmers out there disagree on this subject.

On the other hand, sometimes premature optimization turns out to be the major selling point of a new product: http://alumnit.ca/~apenwarr/log/?m=200806#24