The Years of Experience Myth

Well…

It’s certainly true that 10 more years won’t make a good programmer out of a bad talent.

But the converse is true: 10 more years of experience WILL make a better programmer out of a great talent. Sometimes a MUCH better programmer.

Generally I’ve heard it said that it takes 15 years to master any kind of fundamental craft, be it programming, carpentry, motor mechanics, or something else. I think that’s pretty much true.

(Also, it’s been claimed that most people can only learn one such craft properly throughout the span of a lifetime. I also believe that to be true… even though it’s an entirely different issue altogether.)

@Gerry

You’re so right that a smart programmer with 10 years’ experience will outshine a smart programmer with only 1 year. But I’d rather hire a smart programmer with a single year of experience than a lousy programmer with 10.

I’ve been interviewing programmers for the last couple of weeks (I’d like to thank Jeff for running this story and the one about telephone interviews — talk about excellent timing). The HR department slapped “5 years’ experience required” on the ad. We rejected over 90% of the applicants based on their CVs alone. Telephone interviews got rid of another 5%. So far I haven’t come across one I’d like to hire. I’m beginning to wonder whether all the smart programmers with over 5 years’ experience are simply happy where they are (either that or there are very few smart programmers). I persuaded the HR department to re-advertise the role without the 5-year requirement. I know we’ll get a load of CVs, most of which will be rubbish, but I’m hoping we’ll find some young smart programmer who hasn’t found his dream job yet.

I really agree with you… the same thing happened in our company: a guy with 6 months’ experience is much better than the oldest employee of our company.

It is a kind of seniority over talent, a whole philosophy on its own.

Or the paradox of the good plumber: he works faster, thus gets paid less. Translate that to programmers: learning faster means having less experience.

I have seen very extreme cases of this, where the job required a combined 20 years of experience (in radically different areas), yet the candidate could be no older than 25.

For one job with a somewhat pointless years-of-specific-experience requirement, I was able to get the hiring manager to like me, give me a phone interview and everything. Then I talked to the lead developer, who just asked how many years of experience I had with that particular technology. I told him (this information was available on the resume) and he responded, “I’m sorry to waste your time, we need someone with more experience with this particular technology.” It was for a Junior position, too. In this case the HR person was actually much more reasonable than the lead developer.

I was upset at first, but then realized that I probably didn’t want to work for a guy that short-sighted anyway.

Experience matters, up to a point.

Different kinds of experience matter differently. People seem to improve for about a year or so after picking up a platform. Beyond that, they might improve in other ways.

It takes more than a year or so to learn coding and data structures and algorithms and algorithm-analysis and business problem analysis.

Learning to get along productively with different types of people takes a bit longer. Marketers, difficult bosses, engineers in other disciplines, business customers, and so on.

Learning how to present software to non-software people takes a while too. People seem to improve over a period of decades. But the non-software people are also becoming more sophisticated over that time scale.

The experience of going to a good school matters too. Nothing replaces the basic math and the study of the basic algorithms. People don’t seem to learn much in the way of new programming paradigms outside of school. Look at the adoption of object-oriented and now functional programming. Courses, at least, are the way to go.

Learning about the application space can be done in school but you can be trained in finance and end up working on embedded systems or web apps. And the other way around. So application space learning has to be on the job and is often per-company. HP does things differently than IBM, which is different from Microsoft, Google, etc.

There is, however, no cure for stupidity.

Oddly enough Martin Fowler has a similar post today. http://martinfowler.com/bliki/CheaperTalentHypothesis.html

Completely agree. Whenever I have been asked the experience question regarding a technology I don’t know in an interview I have always answered roughly the same.

For example, “Although I don’t have knowledge of that particular language/technology at the moment, I have experience in many languages/technologies and the ability to learn. If that skill is essential on day one of the new job, I will use my notice period to ensure that I have at least a rudimentary level of knowledge by the time my new position begins.”

If the company doesn’t like this answer and would refuse the position because I don’t have three years of the particular skill then I probably wouldn’t like working there anyway.

I believe Jeff is pretty much right on the money. I’d like to add a little bit more though. Nobody is saying years of experience isn’t helpful. Every programmer needs to go through the pain of learning multi-threading correctly. That 1 college course that touched the topic briefly clearly isn’t enough. Only years of experience will get you that. However, years of experience is secondary to “smart” and “gets things done” (http://www.amazon.com/Smart-Gets-Things-Done-Technical/dp/1590598385). An experienced dodo that’s a waste of space is a costly mistake.

Moreover, that programmer out there who’s smart, gets things done, and is experienced isn’t available for you to hire. He or she is already working at a great company (or owns their own) and doesn’t want to leave. Your only hope is a hefty signing bonus or a disgusting amount of stock options to lure them over.

Saying “experience doesn’t matter when hiring” sounds to me a lot like “experience is next to worthless” or “coders don’t learn anything on the job”.

I am the first to remind people that the actual number of years of experience doesn’t really matter (we’ve all met many useless programmers with several years of experience).
However, it is also true that even the most brilliant people learn and gain much from professional experience, and are usually not much use for serious work until they have it.

Note that experience does not always equate to specific skills. Simply working as a professional programmer for a year or two teaches MANY lessons beyond a specific skill like TCP/IP: how to estimate work, how to make engineering tradeoffs, how to work in a team, how to design and implement code so that you can maintain it two years after the fact, how to write robust, error-aware code. Even serious hobbyist programmers are usually missing those skills, simply because they never had to learn them.

Also, some skills are not so easily acquired. For others, we simply don’t have time for a new employee to pick them up. For example, if we are working on embedded systems under a tight schedule, I don’t have time to wait for the new Java guy to grok pointers. Our new employee needs to know C well enough, and we also expect him/her to be able to deal with new platforms almost unassisted. We don’t have time to babysit them.
These are not the kind of skills that people “pick up” in 3 to 6 months (and even if they could, that’s still too long); they take a year or two to develop.

And of course, there are the various fields of knowledge: signal or image processing, scientific computing, speech compression, etc.

So yeah, when we look for candidates, we look at previous work experience. We will accept (and have accepted) candidates without prior professional work experience, but even then they normally have some years of experience as hobbyist programmers.

I think that the simplest thing to say about this is that ‘years’ isn’t the right unit of measurement for ‘experience.’ If somebody told you that they got hit by a car, you wouldn’t ask “How long?” You would ask questions that aimed to discover the person’s memories of the components of that experience, that person’s synthesis of the components of that experience within a wider context of just the experience itself, and finally the changes that the experience brought about within the person that would affect the way that they would approach other situations in future.

Comparing a ‘smart programmer with 1 year of experience’ to a ‘smart programmer with 10 years of experience’ is just silly. I’ve been playing guitar for 10 years longer than I’ve been programming, but I’m a lot better programmer than I am a guitar player.

also:

“analysis of congitive development shows it takes ten years to truly master a discipline”
“it takes 15 years to master any kind of fundamental craft”

is completely unsourced nonsense that substitutes research-y sounding words for research.

and:

Smart, fast learning people are no substitute for real experience.

adds the adjective ‘real’ to destroy any meaning in the sentence. Reality isn’t measured in years. If the reality you’re talking about is extent of retained facts, depth of synthesis of known facts, and effective modification of strategies based on those facts, this is something done better by “smart, fast-learning people” by definition.

Agreed, however: Good programmers get better at programming every year. Average programmers get better at their niche every year.

" … there’s about even odds that they have no idea what they’re doing."

Among the things interesting (to me, anyway) about this quote is its implicit self-loathing: half of all developers are clueless? Seems an extreme charge. What other professions judge themselves so harshly?

But the answer to that is fairly obvious: the social sciences. Talk to any economist, psychologist, sociologist, educational theorist, political scientist, etc., and they’ll quite willingly let you know that most of their colleagues are idiots and charlatans.

And so it is with programmers–methodological wars, competing theories and schools, and endless oscillations between true-believer enthusiasm and the dark night of disillusion and self-hatred. Perhaps programming is a social science, rather than a type of engineering. It does feel that way.

But more to the immediate point: hiring requirements specify x years of experience because (1) they don’t want to pay for that 6-12 month learning period, and (2) hanging on to a position longer than that gives some hope that you did indeed ‘get it’. HR is not swinging for the fences in creating job requirements, they’re just trying to avoid striking out. It’s about risk aversion, not excellence.

And in a field that rates itself as 50% incompetent, doesn’t that seem a sensible approach?

bobD

I agree with the post wholeheartedly. I first experienced the disconnect when I became an MFC developer back in 2000. Previously I had only worked on C and Unix. I took some time to learn C++ and then MFC before applying for new work. I had about 4 years of industry experience under my belt at that point, doing projects. I had 0 years of project experience with C++ and MFC.

After looking for several months I found a place that was willing to hire me as a junior MFC developer. I had some things to learn still, but after a few months I was basically as good as most of the other developers there. After several months my project manager became very impressed with me. She noticed that I was able to handle client meetings well, that I had good communication skills, and that I anticipated problems before they happened. But of course, I thought to myself. I had experience with these things from the 4 years I spent at my previous job.

People also noticed my work ethic, that I put in my hours, and got substantive things accomplished. No one really said it, but I got the sense that I should’ve been hired on at a higher rank than I was. In fact rumor had it that I was up for a promotion, but then the bottom fell out of the IT market in 2001, and I and most of the people I knew were out of a job. Oh well.

Since then I haven’t thought well of the “X years of Y” mentality that most employers have. I wish they’d get a clue that they’re not even using the right criteria.

My own sense of why this mentality persists is that business schools, and society in general, still think in terms of the Industrial Age. Companies act like they’re hiring electricians, or plumbers. Each time a new language comes out they think it’s a whole new system, and therefore they have to find people who know it. What they’re missing is that a new language is just a new set of rules, often with the same basic architectural assumptions as several other languages. Today it’s more important to learn about APIs–a kind of “architecture inside of architecture”–than languages. The APIs are what take the longest to learn.

Years of experience are not a myth. They matter a lot.

This blog is a joke and I really do not thank you for giving such advice on conducting an interview.
IT interviews are some of the most boring in the market due to geek questions like these.

It is a real torture for the candidate.
Do you really need to ask a candidate to write 15 methods to do this and that, and ask him to recite by heart what this collection or that one does?
Jesus Christ …
We, programmers, are paid to code and to learn.
When we have a problem, we do some research and some learning, then we code.

This is very very true… most of my experience is very basic, using design templates and drag-and-drop developer tools for websites and simple programs, but my near-perfect GRE score has allowed me to solve pretty much any problem I’ve encountered thus far. I taught MIT and Stanford prospects for 2 years and did not find them particularly gifted, but they will be the ones hired. And that’s wonderful, since I am happily off solving puzzles and enjoying gadgets and tech. :O)

I agree in terms of technical skill… however, interpersonal skill may take longer to build up. Both sets of skills should be considered when hiring a candidate, IMO.

The following would be jokes, except that they are true stories from the '70s.

ca. 1971:

MIT student comes back from winter vacation. Roommate asks, “Did you get the summer programming job you were going to interview for?”

“No. They asked me if I knew COBOL. I said no, so they said they couldn’t give me the job. Do you think maybe they don’t have any Language Reference manuals?”

(Source: the roommate.)

ca. 1976

A guy interviews as a systems programmer at a small mainframe shop. They ask him how he learned IBM’s DOS/VS and CICS at his previous job. He said, “From the IBM manuals.”

They didn’t hire him. Because the manuals were so hard to understand, they figured he was lying, so they didn’t want him. Alternatively, if he was telling the truth and really had learned from the manuals, he couldn’t be normal, so they didn’t hire him.

(Source: one of the interviewers.)

Dakra

I structure my interviews to find people who are capable, not necessarily qualified. BlackWasp is the guy I’m looking for: he’s capable of picking up the necessary technology, although not qualified today to use it.

Agree when applied to programming languages or technologies.
But not necessarily to whole fields.
It will be more difficult for a Java programmer to do low-level TCP/IP coding.
And there are fields where experience cannot be replaced. Years of experience in security, or internationalization, or working close to the machine, are impossible to replace. That experience transfers easily to other platforms or programming languages.
These are kind of orthogonal things, and one should not confuse them.