Fifty Years of Software Development

O'Reilly's History of Programming Languages poster is fascinating reading.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2006/09/fifty-years-of-software-development.html

I can’t believe that source control wasn’t popular until 1999. I’ve never worked for a company that didn’t use it. Before cvs there was rcs, and before rcs I’m sure there was something else.

Maybe source control is new for the hobbyist.

– What we haven’t been able to cope with so well is how long it takes for the human beings to catch up with the hardware.

arguably, not. the Wintel monopoly is so named because of the death spiral they dance. Intel has no (at least, little) engine of growth without a need to suck cycles. M$ has no (at least, little) engine of growth without a source of cycles to suck up with new “features”.

They desperately need each other. the rise of linux, and the $100 machine, is their collective worst nightmare.

how to productively use the cycles courtesy of Moore’s Law, now that’s a different issue. your recent articles about Vista’s requirements and “features”, should give those in bed with Wintel the shivers.

speaking as one who is currently embedded in commercial applications, a BCD multiplier is a much better way to use that real estate and cycles. hasn’t happened. less pixel dust and more real computation. gamers likely don’t see it that way.

a more disturbing thread has been bothering me of late. to wit: “The Soul of a New Machine” was published in 1981. the notion that a bunch of folks could, or even know how to, design a processor today from a parts list (which probably doesn’t exist either) is farfetched.

hardware is becoming a mono-culture. unless, and i just don’t know, hardware engineers are having a party in the embedded space. i hope so. if not, then where are we really headed? how soon will we stop even training hardware engineers, because it is decided that X86 is good enough for all purposes?

I agree that 1999 sounds crazy late for source control.

I started using PVCS, a commercial package, at a game development company in 1991. Game programmers are quick to tackle new technology, but pretty slow to get up to speed on development practices, so I’m assuming even in '91 we were late adopters.

Granted, we used it primitively, i.e. no multiple checkouts and no branching, but in retrospect it was much more powerful and reliable than SourceSafe, which I ended up using for most of the late 90’s, until things like Perforce came along.

I’m with bb. I’ve worked as a game developer since 1995, and even back then working without source control would have been almost unthinkable. And still with bb, we’re usually pretty far behind on good development practices. I cannot believe source control wasn’t widespread outside game development well before 1995. I don’t have much evidence either way, though I did briefly work for a non-game developer in 1996 where CVS was already well entrenched.

I’ve also been present at two different game developers for a switch from SourceSafe repositories to CVS. One of those SS repositories predated 1995, but the other maybe not.

One of my colleagues used to work at Boeing at the turn of the century and they had to do some work on VAX machines. VAX (at least this is what he told me) has source control built into the operating system, and I would have said VAX machines have been around since the late ’70s or early ’80s.

It’s not that the VMS filesystem has source control built in, exactly. When a file is changed, the file’s version number is automatically incremented, so instead of foo.c you’d have foo.c;5. If you needed to go back to a previous revision of that file you could open foo.c;4 and save it over foo.c, creating foo.c;6 with the same contents.

Deleting a file deletes all revisions, but in the normal usage pattern you just plain don’t delete most files.
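For readers who never used VMS, here’s a minimal Python sketch of the save-always-makes-a-new-version behavior described above. The class and method names are my own invention for illustration, not any VMS API:

```python
class VersionedStore:
    """Toy in-memory model of VMS-style file versioning: every save
    creates a new version (foo.c;N) rather than overwriting foo.c."""

    def __init__(self):
        self.files = {}  # name -> list of contents; index i holds version i+1

    def save(self, name, contents):
        """Saving always appends a new version, as VMS does on every write."""
        self.files.setdefault(name, []).append(contents)
        return f"{name};{len(self.files[name])}"

    def read(self, name, version=None):
        """Open foo.c;N explicitly, or the latest version when none is given."""
        versions = self.files[name]
        return versions[-1] if version is None else versions[version - 1]

    def revert(self, name, version):
        """Roll back the way the comment describes: re-save an old
        version's contents, producing a newer version with the old text."""
        return self.save(name, self.read(name, version))


store = VersionedStore()
store.save("foo.c", "int main() { return 0; }")  # creates foo.c;1
store.save("foo.c", "int main() { return 1; }")  # creates foo.c;2
newest = store.revert("foo.c", 1)                # creates foo.c;3 with v1's text
print(newest, store.read("foo.c"))
```

Note that, just as the comment says, there is no real history management here: “reverting” is simply saving old contents as a newer version.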

Hi

I read the blog in an online news reader and all images are replaced with a “WTF” image. I tried it in several online readers and it’s the same.

Any chance you could authorise images when displayed on newsisfree.com domain?

Perhaps instead of ‘mainstream’ you meant small teams?

Source Safe was used 1993 at Corel.
RCS was used at my present company, a unix company, at that time.

However, I worked at a small company with a team of fewer than 10 developers, and they were just using a source copy on the network and doing diffs with Norton Commander…

Small teams definitely think of source control as a hindrance rather than a help.
Today I use ClearCase and I can’t live without source control – I check in a snapshot of my work every day.

I’ve never worked for a company that didn’t use it.

I worked for plenty of small businesses that had no idea what source control was. I stand by my assertion that mainstream programmers didn’t have ready access to source control until around 1999.

There’s a very good reason why source control is item #1 in Joel’s twelve-item checklist… which was coincidentally written in the year 2000.

A lot of those new languages I’d consider scripting languages; they’re used to get around platform dependence on the internet.

So yes, people who use these don’t necessarily need to know about hardware, but they do need to know the capabilities of the framework they work under - which could be considered a virtual machine.

Our repository goes back to mid-1996.

By contrast, I’ve worked with software companies well into this millennium who were stubbornly ignorant of source control.

I’m really not sure what this means, except that the issue is probably a lot more complicated than it looks, and I doubt any one person’s individual experience is likely to shed any statistically meaningful light on the subject.

Instead, the patterns should be used as signposts to the failures of the programming language. As in all programming, the identification of commonalities should be followed by an abstraction step in which the common parts are merged into a single solution.

That’s a very good point that never occurred to me before. But it makes so much sense.

Take for example the “once” keyword in Eiffel: it completely does away with the need for Singleton patterns. Or rather, it provides the Singleton pattern with a single language keyword.

Obviously there’s a lot of fluff going on with “Design Patterns” today, but I wonder what other patterns would be considered essential enough to justify a dedicated language construct.
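As a rough illustration of Eiffel’s `once` semantics in another language, here is a minimal Python sketch. The `once` decorator name is my own; Eiffel provides this at the language level, not as a library:

```python
import functools


def once(fn):
    """Rough Python analog of an Eiffel 'once' routine: the body runs
    on the first call only; later calls return the cached result."""
    sentinel = object()
    result = sentinel

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        nonlocal result
        if result is sentinel:
            result = fn(*args, **kwargs)
        return result

    return wrapper


@once
def get_config():
    # Expensive setup happens exactly once, Singleton-style.
    print("loading configuration...")
    return {"debug": False}


a = get_config()  # prints "loading configuration..."
b = get_config()  # body skipped; the cached object is returned
print(a is b)     # True
```

The point of the Eiffel feature is exactly this: the caching, the “created yet?” check, and the single shared instance all come for free from one keyword instead of a hand-written pattern.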

I tell you, kids these days…think they know everything. Maybe you Microsoft folks only took to source code control in 1999, but it’s not like that with everyone.

I was working with old SCCS-controlled source trees in 1986 (a system which dates from the 70s…look for Rochkind’s paper). Almost everyone I dealt with then working on non-trivial programs used SCCS or eventually RCS and considered tracking changes essential.

Maybe that’s just a Unix thing… :-)

You whippersnappers make me feel old. Sigh. Back in the day, we, too, had “source control.” It was called “paperwork” and organizational “procedure.” Mind you, this was back in the 70s in the military. But, for us, no one touched the code until we assigned them to it. The only way their modifications could get into the official source code was at the end of the release cycle, when we accepted their Hollerith cards for inclusion into the official source. That’s after all the myriad forms of testing of that particular hunk of code and its forward and backward dependencies (which we kept track of on a paper “chart”).

This is another example of how all the stuff people figured out in the mainframe world half a century ago was just ignored and then had to be re-invented in the PC world when it started to hit the fan. “1999:” if I could find my dentures, I’d say “pshaw.”

Your point about patterns being examples of what’s missing from a language is well made, but your comments about source control are way off-mark.

I’ve been in the business since 1980, and my department was using source code control well before I started. Self-built systems on a Vax, with proper check-in and check-out. We even had a part-time librarian (yes, they were actually called that). Shrinkwrap PRODUCTS might not have been commonplace, but the need for control was well understood.

There may well be organizations that don’t use source control: but they’re not professional, and I wouldn’t work for one.

I stand by my assertion that mainstream programmers didn’t have ready access to source control until around 1999.

Define mainstream. I too have used SCC since the early nineties…PVCS '91-'95; VSS '95-'97; …

Add me to the list of people who were using source control long before 1999. I began using it in the early 1990s, and it was already old-hat then.

Part of the problem is you don’t seem to realize that CVS is a 3rd generation software package. CVS was a layer on top of RCS. RCS, in turn, was basically scripts to extend the original system, SCCS. RCS basically made some SCCS operations simpler, CVS added real branches to it.

Bob Moore - reading your comment, you reminded me of something; back in the mid-80’s I co-oped for the US Navy. My project was to implement the DIFF algorithm in Fortran IV, so that they could easily see the changes made between revisions in their source control system - they actually tagged every line of code with the name of the programmer who last altered it.
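The heart of a line-based diff like the one that co-op project implemented is a longest-common-subsequence computation. A minimal Python sketch of the textbook approach (obviously not the Fortran IV original) might look like:

```python
def lcs_diff(a, b):
    """Textbook diff via longest-common-subsequence dynamic programming:
    lines only in the old revision get '-', lines only in the new get '+',
    and common lines get two spaces."""
    m, n = len(a), len(b)
    # lcs[i][j] = length of the LCS of a[i:] and b[j:]
    lcs = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m - 1, -1, -1):
        for j in range(n - 1, -1, -1):
            if a[i] == b[j]:
                lcs[i][j] = lcs[i + 1][j + 1] + 1
            else:
                lcs[i][j] = max(lcs[i + 1][j], lcs[i][j + 1])

    # Walk the table to emit the edit script.
    out, i, j = [], 0, 0
    while i < m and j < n:
        if a[i] == b[j]:
            out.append("  " + a[i]); i += 1; j += 1
        elif lcs[i + 1][j] >= lcs[i][j + 1]:
            out.append("- " + a[i]); i += 1
        else:
            out.append("+ " + b[j]); j += 1
    out += ["- " + line for line in a[i:]]   # old-revision leftovers
    out += ["+ " + line for line in b[j:]]   # new-revision leftovers
    return out


rev4 = ["int x = 1;", "print(x);"]
rev5 = ["int x = 2;", "print(x);"]
for line in lcs_diff(rev4, rev5):
    print(line)
```

Tagging each output line with the author who last touched it, as that Navy system did, is then just a matter of carrying a per-line annotation alongside the text — essentially what `cvs annotate` (and later `blame` commands) do.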

I’m also hard pressed to understand what any of the “new” languages listed offer that the older languages don’t…

I agree with Mark Dominus’ comments, but what are we mere developers to do? I took a compiler construction course at Uni (which was by far and away my favourite subject), but I’m smart enough to realise that creating my own ‘perfect’ language for the typical projects I encounter just isn’t feasible. Maybe I’m too smart.

I work in C# every day, and the above is one of the many reasons that I wish the language supported macros. Not C/C++ style text replacements, but honest-to-goodness syntax tree modifying macros (with intellisense support thrown in too thank you very much). I HATE typing ‘string.Format(“output the value of {0}”, blah)’ (much like a sprintf in C). Such a waste of time. If I could, I’d make a macro that allowed me to do this ‘F"output the value of $blah$"’ in a heartbeat.
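For what it’s worth, the interpolation being wished for here is roughly what some languages provide natively; a small Python illustration of the format-call style versus built-in interpolation:

```python
# The wished-for F"..." syntax is roughly what Python's f-strings
# (and, later, C#'s $"..." interpolated strings) provide natively.
blah = 42

# The string.Format / sprintf style the comment complains about:
formatted = "output the value of {0}".format(blah)

# The interpolated equivalent the commenter wants a macro for:
interpolated = f"output the value of {blah}"

print(formatted == interpolated)  # True: both read "output the value of 42"
```

The difference is purely ergonomic, which is exactly the commenter’s point: it’s the kind of boilerplate a syntax-level macro facility would let users eliminate themselves.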

I remember reading a post by one of the C# language developers who was quite adamant that C# would not support macros anytime in the near future (the reason being potential misuse). I remember when I first started in C# I thought that was smart. Now I must admit I think it’s a little arrogant. Leave the decision in the hands of the developer as to whether macros are a good or a bad idea. I think sealing your classes or C#’s way of not making all methods virtual by default is arrogant. Let me decide what I want to change and override. How can you possibly know what my needs in the future are going to be?

Now that I’ve finished my mini-rant, I wish C# supported macros. I wish it would be easy for me to build my language from the ground up. I wish that I could just do away with all boilerplate code. (seems like the ranting could start again so I’ll leave it there)