Learning on the Battlefield

Summary of thread so far:

  1. People who haven’t got a CS degree think degrees are overrated, because they didn’t go to a university and still became great coders.

  2. People who have a CS degree think degrees are vital, because they did go to a university and became great coders.

There are obviously many paths to wisdom.

Peter,

Academia isn’t always years ahead of industry.

Two recent notable examples: graphics hardware / rendering and data center scale engineering / operations.

Of course, if you insist that academia is 5-30 years ahead, then I’ll agree… it’s just that oftentimes they’re years ahead in an impractical (“wrong”) direction :slight_smile:

~L

The Canadian Academy of Engineering requires a year’s worth of internship before it allows you to graduate with an engineering degree (which comp sci degrees are considered to be in Canada). I wish this were a requirement of more programs in the US.

I’m with Jamie on this topic. I have a diploma from a technical college. The course work was very practical. We did learn the theory, but we also applied it through labs and assignments. There wasn’t a single data structure or advanced algorithm we didn’t write by hand. After graduating I was able to go head to head with CS graduates who had several years of experience. I have worked for at least 2 companies that actually give preference to graduates from the college I attended over applicants with a CS degree, mainly because we can produce right from the get-go. On several occasions I have had fresh CS grads put on my teams and found, with a couple of exceptions, that they took longer to come up to speed with the technology used and took longer to produce quality results, even after a year or two on the job.

I’ve been working as a developer now for 18 years, and have never had a problem finding work - in fact I turn it down on a fairly regular basis. I went independent (contracting) 7 years ago and it’s been steady ever since.

Don’t get me wrong. I’m not trying to knock or slag CS degrees. I have worked with many CS grads who were quite excellent. I have also worked with diploma holders from “lesser” technical colleges who were totally useless. I considered going back to uni to get a CS degree myself (all courses from my tech college are directly transferable to the uni in my city - no other tech college here can do that - so it wouldn’t take long). After a couple of years working, though, I just didn’t see the point. In this city, at least, there doesn’t seem to be any extra benefit. I think those of you who exclude people because they don’t have a “degree” are likely missing out on some fantastic resources. Not all tech colleges are bad (though I agree there are a lot that are). Some are very good. A degree, in and of itself, does not mean the person holding it is a better developer. It just means they have a degree. /end rant.

Jeff,

I have to say I disagree with you on many aspects of this article, but first, I do agree that there is no substitute for experience. However:

Trying and failing does no good if you don’t know what you’re doing in the first place. In a scholastic environment, you can get the help you need to figure out what’s going on, so your failures can be corrected and you can show what you know. In a professional environment, this scenario would result in you finding a new job.

There has to be a foundation for any job. You don’t become CEO of a bank without knowing business math, and you don’t become a programmer without knowing software principles such as memory usage, syntax, and the like. If you just jump right into the industry with no prior knowledge, odds are (as I have seen) your code is going to be very inefficient, ugly, and buggy.

“the work you’re doing is far more relevant than any classes you’re taking” - wrong. Most of my classes at my school took the principles of computing and built on them so that I can use any programming language I want. From learning about basic data structures to sorting algorithms to compilers, I gained a thorough understanding of how software works, and with that foundation, I can currently use C, C++, C#, and Delphi to write any application I want. Any other language I want, I can pick up in about a week, compared to months for someone who’s never had any formal training. Training is vital to being able to work.
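For what it’s worth, here is a rough sketch of the kind of thing I mean - a hand-written insertion sort in C. It’s my own toy example, not taken from any particular course; the point is that once you’ve implemented the basic algorithms yourself, the surface syntax of whatever language comes next is the easy part.

    #include <stdio.h>

    /* Insertion sort: the sort of basic algorithm a CS curriculum has you
       write by hand. Sorts the array in place in O(n^2) time. */
    static void insertion_sort(int a[], int n)
    {
        for (int i = 1; i < n; i++) {
            int key = a[i];
            int j = i - 1;
            /* Shift larger elements one slot to the right. */
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    int main(void)
    {
        int values[] = { 5, 2, 9, 1, 7 };
        int n = (int)(sizeof(values) / sizeof(values[0]));

        insertion_sort(values, n);
        for (int i = 0; i < n; i++)
            printf("%d ", values[i]);   /* prints: 1 2 5 7 9 */
        printf("\n");
        return 0;
    }

The same idea translates almost mechanically into C++, C#, or Delphi; only the syntax changes.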

“seek out internships like your life depends on it” - probably the best piece of advice for students out there. However, schools offer something that you can’t get anywhere else: a pre-existing network that you’re automatically in. In the workplace, you have to build a network of contacts, friends, and supervisors so that if something happens, you can find another job fairly easily. Typically, that network is quite small for a long time. However, through schools, that network is built through the instructors, your peers, and the support faculty (secretaries, career services, etc). It’s already in place and ready to direct you to a great job if you have the skills and knowledge.

I won’t go into the rest of it, but will close by saying: experience is great, but lack of skills will get you nowhere. It’s 100 times harder to get those skills with experience only, and a good school will set you up for great success.

LKM, you’re absolutely right. I messed up :smiley:

There are paradigms and concepts that will stay the same for a long, if not infinite, period of time (procedural, functional, OO, general, etc.). But there are technologies that are constantly evolving. If you know the concepts you can do anything in theory. But in practice you ought to know more than just the concepts.

In ancient Japan, sword masters tested katanas on wood… and peasants. It’s blood that makes a sword a weapon.

Same in CS. Swords stay swords. Steel changes, shapes evolve, and so on. But you still need to spill the blood.

Once you know the concepts, you are a blacksmith. And when you kill - you are a sword master.

So by that I mean one thing: a good programmer is always in the process of learning and applying knowledge in practice (yeah, nothing new here :D), not just waiting for something/someone to teach him.

Training is vital to being able to work.

I’m sorry, but I do not agree with this. Learning how to learn is vital to being able to work, and that’s something a good university will teach you.

But the courseware itself? Largely irrelevant in our field.

For best results, get out there and immerse yourself in the thick of it with your peers. Make mistakes!

tcliu – You are my hero.

Both of your comments are insightful… the first, into our industry. The second, into the people that make up our industry.

How awesome.

g

No way. You’re saying that experience is important? That’s unheard of. Theory and practice are the same thing, aren’t they? Universities produce all or almost all of the software that’s used by millions of people around the world, don’t they? Evil corporations don’t innovate, they just steal the ideas from universities, right?

I’m one of those who were misled into thinking they should go to school first, get a degree, and then find a job. I never dared to apply for a programming job before completing my degree. So, I learned the science and management of programming. I made my code clean and maintainable, I understood the importance of good requirements and well-designed models, I even solved a few nasty puzzles and had a peek at a few related fields. Then, I was dropped onto the battlefield.

I didn’t have the acronyms. I didn’t have any specs. Nobody had any kind of methodology whatsoever. All I was expected to care about were “implementation details”, as my teachers called them. Nobody had the faintest idea of what professional software development was about. They just hired programmers to do the technical stuff.

In the years that I fought, I acquired a few language skills, learned to configure system services, followed some of the hype and got a feel for the type of applications I’m developing. I don’t really model. I don’t test much either. Who would care anyways? In some ways, what I’m doing today professionally is much closer in nature to what I was doing in my basement as a teen in BASIC than what I learned to do in school.

Sometimes, I wonder if I haven’t regressed.

Jeff, you say - "As long as you’re out on the battlefield fighting the good fight, you’re bound to improve."
But what if you’re out on the battlefield, but fighting the BAD fight? The wrong fight? You’re bound to…

I disagree with the notion that it is necessary to attend a university to prepare one’s self for real work as a software developer. I taught myself C++ by writing code and reading books. It took about a year in between stocking shelves at the local supermarket to master the basics. After realizing that I wanted to write code for a living, I decided that I should probably go to school.

I wouldn’t say it was a mistake, because it opened a door to an internship that grew into a full time position. But, I learned far less useful information through my 4 years of (part-time) college than I did in that first year of actual programming. After working full-time and taking night classes for 4 years, I decided to put school on hold and haven’t looked back since.

If you lack the ability to learn complex new concepts without the structure of a university environment, you will not survive in this field. Going to a university may teach you how to learn, but if you already can, teach yourself the skills necessary to build something impressive. Then do it. Contribute to an open source project, or create your own. In my book, a background of excellent work is far more impressive than a college degree.

When hiring Software Engineers I look for a CS or IS degree. No degree, no job…

Phil, I think you are a moron, truly. I find this method of hiring really discriminatory, especially concerning programming. I left uni, where I was studying computer science. Why? Because my professors were idiots. If I have the discipline to teach myself, at a level and pace that supersedes uni, how am I lesser than the other programmer who stayed in class? That degree requirement of yours seems too rigid in my view. And if I had to guess, you’re probably a bad programmer.

Ultimately, as Jeff says in another post, you have to have the mindset to be a programmer. I don’t believe that a CS or IS degree is any more important than a mathematics degree. Programming requires logical thinking, and a college/university course can filter out the least suitable.

One thing that I don’t think has been mentioned enough is that CS courses tend to favour Open Source software, which means zero in terms of candidate suitability in a Microsoft house.

In my company we tend to hire people from a numerate science discipline (physics, chemistry, bioinformatics, …), mainly because our customers are scientists, engineers, etc. Learning SW engineering/development is done on the job plus specific training. We use few CS graduates since they wouldn’t fit our customers. Yes, some more formal education may be helpful but I agree with the notion that the more interesting bits are happening in industry, not at university: look at agile/XP, test-driven, refactoring, patterns, …

Do generals make analogies to software when talking about battles? I doubt it.

The battlefield analogy is cute, but it isn’t great. Analogies in general bite because someone always picks holes in some aspect, and they’re never perfect.

Custer’s last stand, anyone? Er, it reminds me of many software projects, actually.

The battlefield analogy is poor because the way it was presented was basically saying ‘experience is good; don’t let your lack of schooling stop you from getting practical experience before you finish your degree’.

However, there are so many more aspects to battlefields that are being conveniently sidestepped for the sake of a good article. If you think armies just wake up one day, run to a field, and start fighting, you’re an idiot. Those that do, die. Battles are planning, and planning takes skill. Skill is learnt through a combination of experience and revision (of history, tactics, etc). Soldiers train, practise manoeuvres, and so on.

Battles themselves are short, explosive bursts of activity following orders and adapting to situations.

Writing software is talking to clients about features, prioritising, coding, testing, prototyping, meeting, talking about more features, reprioritising, changing code, maintenance, more testing, regression testing, more meetings, and the odd successful, on-time delivery…

So how exactly is a battle in any way similar to the process of delivering software?

Because one should never criticise ideas without offering their own, here’s my newbie advice:

Don’t expect to be an architect or know everything out the door. You won’t solve things the best way the first time. Failure is ok. Have a goal.

AVOID analysis paralysis. Coding and testing something will give you a better indication of how things work than looking at a sheet of paper all day and drawing pretty pictures. Don’t just throw away those prototypes though, refactor them into something useful.

Write tests first (a rough sketch of what that can look like follows these points).

DO find a good ‘commander’ (ie mentor) and get in the trenches. Listen to them, take what they say, but don’t believe everything you hear. Think critically.

DO move around. You don’t want to get stuck in a job for more than 2 years. (it’s better for your salary as well as for your experience)

DO NOT feel bad about quitting.

DO say NO. Avoiding bad experience is better than no experience, so don’t just take anything that comes along. It’s your life, and if someone asks you to write bs code for their horrible WTF of a system you should pause, take a look around you, and if you see your future spiralling into the drain then leave. It’s just not worth it. You don’t have kids or a mortgage yet, and you can manage eating KD for a few more weeks.
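On the “write tests first” point above, here is a bare-bones illustration of the idea in C, using nothing but assert(). The add() function and the test are made up purely for illustration - the point is that the test exists, and fails, before the code it exercises.

    #include <assert.h>
    #include <stdio.h>

    /* Hypothetical function under test. In a test-first workflow, the
       assertions below are written (and fail) before this exists. */
    static int add(int a, int b)
    {
        return a + b;
    }

    /* The test defines what "done" means before any production code is written. */
    static void test_add(void)
    {
        assert(add(2, 2) == 4);
        assert(add(-1, 1) == 0);
        assert(add(0, 0) == 0);
    }

    int main(void)
    {
        test_add();
        printf("all tests passed\n");
        return 0;
    }

Whether you use plain assert() or a proper test framework matters far less than the habit of writing the test before the code.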

I’ll just echo many of the previous comments by saying that education is a crucial aspect of any programmer’s career. However, without any experience “on the job”, a well-trained programmer often knows just enough to be dangerous.

I’ve seen that some people consider certifications to fall under “experience”. Personally, I’d consider them a proof of an education. Somebody who has a Sun Java Certification most likely knows what they’re doing. Being Sun Certified says leaps and bounds more to me than a mention of passing a Java course or two at some state college somewhere.

@squidbot:

That depends highly on the faculty that “gets” the Comp Sci program. You’d be correct if it’s the engineering faculty (or you’re confusing it with a software engineering degree).

Many universities have the Science faculty responsible for Comp Sci, in which case Comp Sci students escape with a BSc, not an engineering degree (and so are not engineers). There’s also the weird case of the University of Waterloo, which has a Math faculty (which gets the CS program, obviously).

You also need to stay on the battlefield long enough to learn from your mistakes. For example, if you design a system and then leave before it goes live, you’ve only seen part of the battle. You don’t know what happened in the end, so you don’t know the ultimate outcome of your decisions. Naturally, you’ll assume that your contribution was successful, but was it? What were the final consequences of your decisions? Did you make a mistake? Did you omit something important? If you don’t hang around until the end, you’ll never know.

Too much “experience” in this industry consists of people doing stuff, but then moving on to other projects before they even have a chance to identify their mistakes, never mind learn from them.

I read somewhere (but can’t find the link) that proficiency in software development is not correlated with years of experience, but with the number of times a person has been through the whole lifecycle of a project, right through to completion.

Also, you learn faster if you can get accurate feedback faster: http://www.agilekiwi.com/making_better_programmers.htm

Personally, I agree that talent and work experience are paramount, and that most programming jobs can be done without an academic background (though I think it adds a lot of value).

However, I take issue with the tone from the article and some of the commentators, as if universities are a thing of the past (“universities were great…”) or as if they are behind the times in research.

As in most industries, the software industry is, and has always been, years behind the research front in academia.

Some examples (also from electrical engineering): distributed computing, garbage collection, communications, voice and video compression, image analysis and processing, control systems, parallel processing. I could go on.

This should not come as a surprise. The purpose of academia is to research. Even if the area is obscure. Even if it seems of little practical use. This is the difference between science and engineering.

Later come the engineers in industry, who have a problem and need a solution (or who saw a solution and are looking for a problem). These academically trained engineers keep on top of current (or older) research. Or maybe they vaguely remember something from their studies. Perhaps they have friends in academia who remember reading this or that article.

The result is that these engineers read the research, then apply the solution (for example, an algorithm) to the problem. However, research is seldom interested in practical considerations. So the engineers need to understand the solution, and then adapt it to the real world. Of course, that takes a lot of time.

This is the norm in most disciplines, and not that different in many areas of software engineering.

This is also why it pays to have an academic background. You know so much more; you have a wide base of knowledge to draw from. You have a strong theoretical background, meaning you can quickly become familiar with new subjects just by reading the basic book on whatever subject you are currently working on. Some jobs are so hard that you need an actual academic expert on the subject. Just try developing a V.17 modem yourself.

Of course, there are exceptions. They are exceptions.