The Greatest Invention in Computer Science

What do you think the single greatest invention in computer science is? Besides the computer itself, I mean.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2008/06/the-greatest-invention-in-computer-science.html

Of course older languages that are still in use rely on routines. As somebody who works in Java, VB.NET and COBOL [http://en.wikipedia.org/wiki/COBOL#History_and_specification], I can say routines are used in all of them, although they are sometimes called different things.

Jeff, you often talk about raising the abilities of the mass of average developers. By that standard, there is no development more important than the Relational Database and SQL. Its basic theory and tools are still in use after nearly 40 years, and it’s at the heart of almost every major system.

Where are all the alternatives to databases, the way languages have alternatives? Where are all the new frameworks and platforms? The storage and retrieval strategies? The theoretical work?

Databases have seen (tentative) changes like Object database systems and XML storage. These are trivial (and poor) compared to the changes that “the procedure” has seen, such as Object Oriented programming and functional languages. Codd got it right from the start, but procedures… we’re still not sure how to do them right.

@Matt Johnson-

Applying the Atwood scale to a non-linear space, it would be the grape, and the oboe, respectively.

oops, that was Simon Wright

Codd: Tools designed for mediocre developers only serve to keep them mediocre. The relational model hasn’t moved an inch precisely because developers concerned with such things lack the capacity to improve upon it.

Sorry to nitpick, but this is a pet peeve of mine:

each one more exquisitely polished and finely cut than the next.

So, you’re saying, polished(n) > polished(n+1)?

You mean that they get progressively LESS polished and finely cut?

While that may actually be the case in practice, as we write sloppier and sloppier code as a deadline approaches, it’s hardly something we should aspire to.

I think you meant:

each one more exquisitely polished and finely cut than the last.

The polished(n+1) > polished(n) is so essential. We’re not perfect; we never will be. I’m happy when I look at code I wrote last month and see how utterly terrible it is, but how much better it is than the code I wrote last year. When I stop being grossed out by last year’s code, I’ve stopped growing, and it’s time to stop coding and be a manager.

I disagree with all of you! :slight_smile:

No, seriously: I contend that, taken together as a single invention, the Graphical User Interface and the Mouse is the single greatest invention in computer science ever. Without that, computers would have forever remained the playground of ascetic men that wear white lab coats and thick glasses to work - computers would never have been commercialized.

Without commercialization, there would have been only a fraction of the funds available to develop matters to where we are today. Even worse - without commercialization, computers would have had no real significance in the world - okay, maybe they would have been useful for larger corporations… think the airline industry. But certainly not on the scale we see today. Would we even have had PDA’s? iPods? Cellphones? It’s hard to say.

Interesting timing of this post, Jeff - I was just reading through some of the code for Pac-Man (http://cubeman.org/arcade-source/pacman.asm) and wishing it was written in something a bit more readable than assembly language. I kept wishing there were some routines!

Does anyone know when routines started coming to the fore? Even better - does anyone know who wrote the first routine? That’d be an interesting fact for the resume: “created the backbone of Computer Science”.

Really? The single greatest invention in computer science? Not PageRank, or Quicksort, or the Chomsky hierarchy of formal languages, or pointers or recursion or the von Neumann architecture, but mere routines?

Of course, the perspective is that of a former Editor in Chief of IEEE Software, whose latest publications include such CS heavyweights as “Should You Adopt Open Source Software?” and “Making Statistics Part of Decision Making in an Engineering Organization”. IEEE Software is not a Computational Theory organization, but a Software Engineering and Engineering Management publication.

Surely you (and Steve McConnell) mean “The Greatest Invention in Software Engineering!” Joel would be most displeased with your bastardization of the great field of lambda calculus, computability theory, and finite state automatons.
/TongueInCheek

Does anyone know who wrote the first routine? That’d be an interesting fact for the resume: “created the backbone of Computer Science”.

Perhaps Fortran II in 1958? One of the primary features is “separate compilation of subroutines”.

http://en.wikipedia.org/wiki/Fortran#FORTRAN_II

I don’t know if there is a proper term for it, but rather than any single thing, I think the greatest invention in CS is the kind of design that allows one interface (as in code, not GUI) to be used for conceptually similar things. I guess I mean generic programming, but broader, not just templates.

Besides templates, there is inheritance, which allows everything that can be done to a parent class object to be done to subclass objects. Then there is function overloading and implicit type conversion of function parameters, as done in C++. And finally, duck typing, which only cares that an object has an interface for doing the asked thing, with no predetermined knowledge about the object required.

Obviously all these can be used badly, but I like the idea of write-once-use-for-everything-that-has-the-interface. Without these kinds of features, everything would be a special case and demand special treatment.
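To make that concrete, here is a minimal C++ sketch of the write-once idea (the names makeItTalk, Duck, and Robot are made up for illustration; all the template requires is that the type provides a speak() member):

#include &lt;iostream&gt;

// Any type with a speak() member will do; the template has no other requirements.
template &lt;typename T&gt;
void makeItTalk(T&amp; thing) {
    thing.speak();
}

struct Duck  { void speak() { std::cout &lt;&lt; "quack\n"; } };
struct Robot { void speak() { std::cout &lt;&lt; "beep\n"; } };

int main() {
    Duck d;
    Robot r;
    makeItTalk(d);   // one routine, reused for every type that offers the interface
    makeItTalk(r);
    return 0;
}

The same body serves both types without inheritance or any shared base class, which is roughly the compile-time cousin of duck typing.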

I have a better question: Why does every code monkey consider himself qualified to comment authoritatively on matters of computer science?

Ah, yes, I remember my early days with the C-64 and countless “GOTO” and “GOSUB” calls. Remember: do NOT number your code lines sequentially. Otherwise, if you have to insert code in between, you are screwed. You can of course add to existing lines, but I think the C-64 was limited to 2 “screen-lines” per code line.

Nothing says “I love you!” more than a 1500-line BASIC program where you need to insert something between lines 50 and 51. Ah yes, it’s an easy change of course…

50 GOSUB 1501
1501 old code of line 50 + the new stuff
150x RETURN
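For contrast, here is a hypothetical sketch of the same kind of change with named routines (modern C++ rather than C-64 BASIC; readKeyboard, validateInput, and updateScreen are made-up stand-ins for whatever lines 50 and 51 actually did):

#include &lt;iostream&gt;

// Made-up stand-ins for the work the old lines 50 and 51 performed.
void readKeyboard()  { std::cout &lt;&lt; "read keyboard\n"; }
void updateScreen()  { std::cout &lt;&lt; "update screen\n"; }
void validateInput() { std::cout &lt;&lt; "validate input\n"; }  // the newly inserted step

int main() {
    // Inserting work "between line 50 and 51" is just another statement in the body;
    // nothing has to be renumbered or GOSUB'd off to a spare line number.
    readKeyboard();
    validateInput();
    updateScreen();
    return 0;
}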

If there were a way to get the flow of a typical C-64 BASIC application, I think you would get a diagram that looks like the organization structure of many big companies…

Anyway, just keep in mind that “computer science” is a narrow and a wide subject at the same time. The routine is good for programmers, but people in other areas of computer science would possibly give other answers. Network Engineers may hail some network protocol, e.g. “The best invention was TCP/IP, as it finally allowed students from the east coast and west coast to share their porn collection without much hassle, at the company’s expense”, whereas Mainboard Designers might think of some standard (e.g. the IBM AT or later the ATX) that made it possible to create compatible third-party mainboards that fit a wide range of existing PCs.

By the way, your blog silently eats < and > brackets :slight_smile:
That should have been:
1501 existing code of line 50 with the additions you want to make

“Does anyone know who wrote the first routine?”

That would probably depend on how loosely you want to interpret this.

Konrad Zuse WROTE routines (he called them plans) in a relatively high-level language called Plankalkül that he designed in the early 1940s… but nobody actually built a compiler for that language until 1998. Other than that, early programmers used functions in machine code and assembly languages, which underneath act just like functions in C: you pushed your arguments and the location to jump back to onto the stack, then jumped execution to the address where the function was located; the function read the arguments from the stack, processed them, then jumped back to the stored return address. Then they used macros to help simplify the push/call/pop/return logic. That’s pretty much how C functions work underneath; C just makes them a lot easier to write and more readable.
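A minimal sketch of that call-and-return mechanism, written here in C++ for illustration (the comments describe roughly what the generated code does; the exact details depend on the calling convention and the optimizer):

#include &lt;iostream&gt;

// The callee: reads its arguments, does its work, and returns to whoever called it.
int add(int a, int b) {
    return a + b;   // leave the result where the caller expects it, then jump to the saved return address
}

int main() {
    // At the call site the generated code (roughly) loads or pushes the arguments,
    // saves the return address, and jumps to add(); the function's return jumps back here.
    int sum = add(2, 3);
    std::cout &lt;&lt; sum &lt;&lt; "\n";
    return 0;
}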

But isn’t the routine an essential development in the progress of computer science?
I think it would have been implemented by someone, somewhere, sometime anyhow. It is greatly useful and greatly efficient, but it is not a great original idea.

Maurice Wilkes must be a candidate: he’s credited with the development of microprogramming (there’s a paper from 1953), macros and subroutine libraries. Aged 94, still at the University of Cambridge Computer Lab - which he ran for 35 years.

http://en.wikipedia.org/wiki/Maurice_Wilkes

“But isn’t the routine an essential development in the progress of computer science?”

Yes, and it was actually more essential in the beginning than it is now, because memory constraints were such that it was really impossible to do anything at all useful without them. I’d be willing to bet that some form of routine was used on the very first computer that was able to be programmed.