Learning, or, Learning How To Learn

Unlike your tour guide, I can think of quite a number of things I learned in college that have been and are applicable to my job. It was in college that pointers “clicked”, that I figured out recursion, and that I learned all about big-O notation and how to organize my code. I learned the principle of DRY, though we didn’t call it that, and I was encouraged to break things down into manageable units - back then it was Pascal and C and the unit was the function; later it was C++ and classes. I learned about search and sort algorithms, reusability, the fundamentals of computer graphics, and event-driven programming. These days I don’t have to write my own linked list implementation, but I could if I needed to. There was, and still is, value in understanding all of that.
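For what it’s worth, here is a minimal sketch of the kind of linked list exercise I mean - my own illustration in plain C, not code from any actual course:

    #include <stdio.h>
    #include <stdlib.h>

    /* A node in a singly linked list of ints. */
    struct node {
        int value;
        struct node *next;
    };

    /* Push a new value onto the front of the list; returns the new head. */
    struct node *push(struct node *head, int value)
    {
        struct node *n = malloc(sizeof *n);
        if (n == NULL)
            return head; /* allocation failed; leave the list unchanged */
        n->value = value;
        n->next = head;
        return n;
    }

    /* Print every value, then free the whole list. */
    void print_and_free(struct node *head)
    {
        while (head != NULL) {
            struct node *next = head->next;
            printf("%d\n", head->value);
            free(head);
            head = next;
        }
    }

    int main(void)
    {
        struct node *list = NULL;
        for (int i = 1; i <= 3; i++)
            list = push(list, i);
        print_and_free(list); /* prints 3, 2, 1 */
        return 0;
    }

Trivial to write once pointers have clicked; baffling until they do.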

I agree that college can teach you how to learn. I also know that I applied quite a bit of my college CS knowledge to my work and still do today, though the languages have mostly changed. And that is just the CS stuff - I have used some of the math I learned in college too. College was anything but a waste of time for me despite my best efforts at the time to slack off.

Programming is not an industry of any particular innate social significance. All the talk of software being necessary for various other fields is just rationalization that could be applied just as easily by other fields – medical advances are expensive to finance, so you could even argue that investment bankers are the real heroes.

Surely Jeff’s original context is that keeping one’s knowledge up to date is at least as vital to one’s career in software as it is in pretty much any other field. Heart surgery is a more valuable field to society, but AFAIK, every technique you learn as a surgeon doesn’t get thrown out the window every 5 or 10 years the way it does in software.

Learning how to learn is definitely a valuable skill in the field of software development. Without it we’d all be a big bunch of cut-and-pasters. One of the reasons it may not be perceived as most critical in IT could be that we are not really in the business of saving lives here. At least not in the field of business IT. But in one indirect way or another, I’d like to think that we actually are.

Jeff,

Playing by the rules we invented? LOL!!! You must be on crack! Let me let you in on a little secret, buddy: we are still constrained by the rules of physics: space, time, and matter, not to go any deeper (see computational theory, NP-hard problems, the very foundation of all the crap we build).

Now if you’re referring to the rules by which we must play to program in a framework that some dimwit came up with, then yes… you would be fighting against the rules the dimwit’s limited creative capability was able to conceive of…

I agree that learning how to learn is essential.

However, almost none of my schooling taught me how to learn. What it taught me was:

  • Learning is tedious and mainly involves memorizing apparently random and pointless facts.
  • Work, which you’ll be doing the rest of your life, is similarly tedious; that’s why we give you tedious homework to prepare you for it.
  • It doesn’t matter what you learn, or especially if you learn more than you “have” to. What matters is the grade you get.

And I went to “good” public schools, mind you. It took a few years after I quit going to school before I could enjoy learning again.

My goal is to home-school my kid so he won’t have to grow up thinking learning is boring and pointless like I did.

I was just saying the other day:

  • The IT guy at work is learning to play the guitar
  • My brother who works in a Network Operations Center is learning French
  • I am STILL learning Software Engineering

Every day it changes. And every day I want to know how and why.

  • Until the Von Neumann architecture is replaced (in commonly used computers), we sure do play by the rules.

  • A professor of mine pointed out that the difference between a professional and a laborer was that the professional had, and used, a library.

  • My bestest boss ever kept a copy of Snedecor to hand, and referred to it regularly.

  • If that guide never used anything he learned at MIT, neither have Click and Clack, the Tappet Brothers.

Kyralessa, you summed it up perfectly. That’s exactly what I would have said for public high schools/universities.

In reply to the post of John A. Davis on June 28, 2007 03:34 PM:

And so much time wasted making the flashcards.

Underlining items in the book works great if your memory can keep up and a quick glance reminds you which parts don’t need to be read any more. And it doesn’t waste time; you can underline the stuff very quickly, unlike making flashcards or writing down summaries.

To “Ben on June 28, 2007 12:38 PM”:

Heheh. Your niece was right. What you were saying to her was probably worded in a very overcomplicated way. That is not for kids; they will not learn that way. They need other ways of learning stuff.

One of my profs said “getting your undergraduate degree is just proof that you can be trained”. A master’s and a doctorate, in most disciplines, do require actual critical thinking and new approaches. For most physical sciences at least, a B.S. = What, an M.S. = How, and a PhD = Why. In your undergraduate courses you learn the frame of reference and terms for your major, in your master’s thesis and courses you learn how those things interact, and in your dissertation you explain why things work the way they do.

Great post, Jeff.
After teaching college for the past 23 years, I couldn’t agree more.

When I started in this industry my favorite programming language was solder, and PROMs had an amazing 256 bytes of storage. Everyone programmed in assembler, and we struggled to write self-modifying code, spending hours optimizing and rewriting to try to save even 1 byte of space.

EVERYTHING fit on one floppy: O/S, development system, application and data.

The entire documentation for everything fit in one slim three ring binder which could easily be read in a single sitting.

Modern systems have expanded in complexity by what, 10,000 times? (Pulling a number out of my … hat.)

No other discipline has seen the amount of change that computing has, and in the same time frame - not even close.

Og, the caveman doctor, was testing drugs and experimental surgery techniques long ago, and nothing much has really changed since then. Test it on a patient, see what happens, shrug.

Architecture and engineering haven’t changed all that much in the last 1,000 years either. Not really. Fancier slide rules, better materials.

And here we are, programming systems on machines that would have been considered supercomputers 25 years ago, in a global network that was unthinkable 20 years ago, in a language that didn’t exist 12 years ago, using methodologies that were unheard of 10 years ago, exchanging data using technologies that didn’t exist 5 years ago, and displaying it with techniques that didn’t exist 3 years ago.

25% of the material I teach gets replaced every year, with a major platform shift every 4 years. I teach very little (if anything) of what I taught 5 years ago. Everything gets pushed down and taught in courses and schools before mine.

Every year brings a new layer; what started as a small, single-layer application now has over 20 layers when you look at all of the hardware and software involved.

I specialize in disciplines that did not exist 10 years ago: application development security and development methodologies (UP, Agile, etc.) in a hostile environment where bringing up an unpatched server will result in your machine being taken over by 10-year-old hackers and turned into an IRC porn server within 48 hours of turning it on.

My last few projects would have been in a Bond movie 5 years ago: wireless RFID tracking of a manufacturing process utilizing fingerprint scanners, PDAs, GPS, and GIS, with all the data being captured by SAP.

The only thing that has been consistent at my job is how much stuff has changed.

When I first started in this field I was proud of how much I knew; now I am all too aware of how much I don’t know…

Yeah, I agree… sometimes we have to ‘unlearn’ some knowledge in order to learn new stuff. Our life is a continuous cycle of learning and ‘unlearning’ knowledge as we move ahead in life and gain experience with the many things we encounter on the journey.

Jeff, where does the picture in “Learning On the Battlefield” come from? Good post as usual.

There are several folks here where I work who have Masters Degrees and such, and almost all of them are subpar programmers at best…

Jeff, you may have Jumped the Shark, seriously.

“Nowhere is the importance of learning how to learn more critical than in the field of software development”? Are you mad? What about open heart surgery, or dozens of other professions?

You elevate software development to some sort of mystical status, as if it were somehow saving the world or something, but it’s not. Men who had 100K of memory, had to swap things in and out of it, and had to write hex - now those guys I could respect.

Jeez

Great post. I recently tried to explain this to my niece, who hates history. I told her every smart person I know has knowledge of and opinions on historical events that have no importance to their lives. I tried to tell her that history itself isn’t as important as learning to retain, organize, and evaluate information that may not be immediately relevant.
She rolled her eyes and stopped telling me about school. Kids these days . . .

Steve,

I don’t think that’s what Jeff is saying. He’s pointing out that software engineering can be pretty faddish, where new things replace old. Object-oriented programming, Java, version control, design patterns, ORM, etc., etc. If you like Java, and find it tough to learn a new programming language, then you’re going to be unhappy.

Whether this kind of learning is meaningful is doubtful. Someone may argue that they should learn one language and be done with it, and work on something that requires real smarts. Instead, in this field, a great deal of time is devoted to mastering things that quickly become obsolete.

Yet, if you aren’t keeping up, then you might find yourself out of a job.

Nice post, Jeff. Enjoyable as always.

Steve - I think what Jeff means is that in very few professions do the primitives change so often as in software development. Medical technology may advance, but human organs are pretty much the same as they were thousands of years ago. I think the guys we should really respect are the ones who at one time wrote things in hex, but have adapted to new technology over the years and are now at home in .NET, Java, etc.

“You elevate software development to some sort of mystical status that is somehow saving the world or something”

Software touches EVERY OTHER INDUSTRY, or at least it should. Writing good software means creating tools that help surgeons learn to be surgeons, and later on how to be better ones. Software advances fields of study. That is a fact.

Two years ago, I had major skull and jaw surgery. Three-dimensional X-rays were taken and imported into an application that allowed my surgeon to perform the exact procedure virtually a week before cutting into me.

That’s not just a model. That is a model with my exact bone and muscle structure. I was not allowed to see the actual software in use, but the doctor was more than willing to answer my technical questions afterwards, since I was nearly more amazed by the software than by the actual procedure itself. He knew which teeth had awkward roots into my jawline without making a single cut.

“I can’t imagine my life before this modeling thing.”

Hmmmm. I hear a lot of that from people using good software. “I can’t imagine my life without [insert software/hardware product]”.

Maybe doctor-programmers worked on the development of this application, but more likely than not it was some awesome programmers working closely with doctors. Nobody dismisses software development as easy, and Jeff was not saying it was more important than medicine either.

However, name one single field of expertise in the world today, other than hardware/software (including firmware) development, that touches so many other industries with a direct effect on their advancement, and I will, I dunno, become a plumber.