The Greatest Invention in Computer Science

According to Maurice Wilkes (mentioned in Dave Smith’s comment), there are only two ideas in Computer Science: abstraction and caching.

When he told us undergrads this, it was obviously a bit annoying, having just spent 3 years trying to get our heads round the Cambridge CS course. However, I still often think about what he said and, unsurprisingly, I’ve yet to come up with a counterexample.

To steal Robert Cooper’s phrase, every CS invention is just a paint job over those two ideas.

I’ll add my vote to the compiler list. Compilers are so much more than ‘glorified assemblers’. In their day, assemblers were a great advance over machine coding. But compilers gave rise to a myriad of different languages whose individual dialects could be tailored to specific industries and problem sets.
Favorite examples: COBOL, Fortran, APL

The biggest improvement in most compilers was the incorporation and application of optimizing techniques.
Favorite example: Turbo Pascal

Coming in second place to compilers are Interpreters and Virtual Machines. However, these didn’t become viable solutions until the hardware was fast enough and memory big/cheap enough to run interpreted applications.
Favorite examples: REXX, Java, PHP

===============
I disagree with the routine as the greatest candidate. It is simply a packaging of code. Look back to the COBOL paragraph: a block of code that can be executed once or repeatedly. These code blocks were made available by a compiler. Assemblers made the macro available to repeat blocks of code. It isn’t the concept of a block of code (the routine) that is the greatest, but the technology that enables this blocking.

James: I don’t think it’s true that everything is just a paint job over those two ideas (abstraction and caching). I think it’s more accurate to say that there are only two kinds of ideas: abstraction ideas and caching ideas.

The distinction there is that many – most, in fact – of the abstraction ideas are not merely “oh, let’s use abstraction here”; the actual way that one uses abstraction, and the abstractions that one chooses to use, are a critical part of the idea.

(Or perhaps you could say that an art gallery only has two basic paintings: paper and canvas. Everything else is just a paint job over those two.)

My first reaction was to have flashbacks to BASIC programming on the CBM64: numbering lines by tens to be able to insert new ones, and living with GOSUB but no parameter passing. Somehow we didn’t seem to mind so much, though.

My second reaction was: well, what about the class? In OOP, creating a good class (and its well-polished methods) is as important today as it was to create a generally well-tested and useful routine back in the days of structured programming (in C and Pascal in my case). Surely it is efficient organization of code that is hard to master, not just in the context of routines.

My third reaction was that I feel old, and I’m only 34. Such is the life of a programmer.

Aikimark: APL has never been a compiled language in the sense you describe.

I say the Turing machine (together with the Church-Turing thesis), if it qualifies as an “invention”, is the most radical one in the computer SCIENCE field.

I vote for routine as a process of abstraction. We have a long and productive history of hardening large concepts into smaller representations. Today’s macro is tomorrow’s micro. This process enables us to think and program at a higher level.

Routine is a form of sequence control. In the 1940s, the Mark I was called an automatic sequence controlled calculator. In the 1950s, electronic computers had only transfer or jump operations to control their sequence - but then programmers used a technique called “sub-programs” to implement routines. In the 1960s, this technique was hardened into machine op codes like “transfer and set index” to effect subroutine linkage. By the time of the IBM 360 in 1964, the “branch and link” opcode itself was a call to a microcode routine.
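To make the linkage idea concrete, here is a toy C# sketch of how “branch and link” works: the call saves the address of the next instruction in a link register, and the subroutine returns by branching through that register. The “addresses” and register names below are invented purely for illustration and don’t correspond to any real machine.

```csharp
using System;

class BranchAndLinkSketch
{
    static void Main()
    {
        int pc = 0;     // program counter: which "instruction" runs next
        int link = -1;  // link register: holds the return address
        int acc = 5;    // accumulator

        bool halted = false;
        while (!halted)
        {
            switch (pc)
            {
                // "branch and link" to the subroutine at address 10:
                // save the address of the next instruction, then jump.
                case 0: link = 1; pc = 10; break;

                case 1: Console.WriteLine(acc); halted = true; break; // prints 10

                // subroutine: double the accumulator, then return by
                // branching to whatever address the link register holds.
                case 10: acc *= 2; pc = link; break;

                default: halted = true; break;
            }
        }
    }
}
```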

So this process of abstraction takes useful ideas and compacts them into lower / harder layers of the system to allow programmers to create at higher / softer levels. A good example is the stack, which moved from a very useful programming technique to a machine instruction within a few years.
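And for the stack example, here is a minimal sketch of the stack as a pure programming technique - just an array and a top index - before any dedicated push/pop instructions or stack pointer register existed. The capacity and names are made up for illustration.

```csharp
using System;

// A software stack: an array plus a top-of-stack index. Hardware later
// folded this same push/pop pattern into dedicated instructions.
class SoftwareStack
{
    private readonly int[] slots = new int[256]; // arbitrary capacity for the sketch
    private int top = 0;                         // index of the next free slot

    public void Push(int value) { slots[top++] = value; }
    public int Pop() { return slots[--top]; }

    static void Main()
    {
        var stack = new SoftwareStack();
        stack.Push(1);
        stack.Push(2);
        Console.WriteLine(stack.Pop()); // 2 - last in, first out
        Console.WriteLine(stack.Pop()); // 1
    }
}
```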

For a while, this process produced very high levels of abstraction in the form of programming languages like APL. But more recently, perhaps due to the drop in hardware costs, there has been a dramatic drop back down to languages like C and its derivatives, where a routine more closely resembles its 50-year-old origin.

If we talk about “first” in terms of inventions, it wasn’t 1937, or Konrad Zuse, or whoever claims to have first built something called a computer - no, it happened about a century earlier with Charles Babbage’s Difference Engine and, later, his Analytical Engine. Although the machine was never built during Babbage’s lifetime, the first programs were already being written for it by (surprise, surprise) Ada Byron Lovelace. Yes, right. The first programmer - at least to our knowledge - was a woman. And she did it more than 150 years ago.

For a short, simplified overview see http://www.maxmon.com/1830ad.htm

It “was intended to use loops of Jacquard’s punched cards to control an automatic calculator, … This machine was also intended to employ several features subsequently used in modern computers, including sequential control, branching, and looping.”

So it might already have had subroutines. :wink:

Before your article, if given the more specific question “What is the Greatest Invention in Programming?”, I would, for better or worse, have answered “The Object”. However, as the Object is a direct descendant of the routine, I’d have to recognize its superior claim. Of course, I should note that your checklist for good routines is a good guide for well-balanced objects.

Pointers + Records. Programmers are provided with considerably more power with a combination of pointers and records than they are with routines. Linked lists, stacks, queues, heaps, binary trees, all the other kinds of trees, graphs… The concept of pointers to structs that contain more pointers is the single most powerful invention with regard to data structures.
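To make the “pointers to structs that contain more pointers” point concrete, here is a minimal C# sketch of a singly linked list, with object references standing in for pointers (the type and field names are mine, just for illustration):

```csharp
using System;

// A record that holds a reference ("pointer") to another record of the
// same kind - the building block of lists, trees, and graphs.
class Node
{
    public int Value;
    public Node Next;
}

class LinkedListSketch
{
    static void Main()
    {
        // Build the list 1 -> 2 -> 3 by hand.
        var head = new Node
        {
            Value = 1,
            Next = new Node { Value = 2, Next = new Node { Value = 3 } }
        };

        // Follow the pointers until we run off the end of the list.
        for (var n = head; n != null; n = n.Next)
            Console.WriteLine(n.Value);
    }
}
```

The same shape - a record pointing at more records - gives you trees (two or more links per node) and graphs (arbitrary links), which is exactly why the combination is so powerful.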

What you mean is this…

After structured programming:

public void PrintCenteredString(string text)
{
    // Build the leading padding so the text is centered in an 80-column line.
    int center = (80 - text.Length) / 2;
    string spaces = "";
    for (int i = 0; i < center; i++)
    {
        spaces += " ";
    }
    Print(spaces + text);
}

Note how each curly brace (or scope in C/C++) has its own line? It’s the proper formatting that doesn’t look like some sort of abortion. And every time I see the formatting you use, it makes me think of this: imagine a man walking along writing something, but then he slips and falls, and OOPS, all his writing gets messed up. That’s exactly what the inconsistent depth level of the curly braces reminds me of.

I also remember the C64 and its GOTOs and GOSUBs; that was the BASIC language back in the day (and I guess it sort of still is).
Pascal also had the goto, but it didn’t use line numbers, so you could (as my uncle told me back in those days) write goto h*ll, and I thought that was so cool… (I was 10 back then, and everything with that four-letter word was cool).

And so when I tried out (and bought) Delphi I had to try out the goto H*ll thingy.

What I also remember from those days is 6502 assembler. I remember writing some easy demos that made sprites move over the screen while I could type something at the same time. $EA was NOP back then, and I remember that $A9 also meant something; I just can’t remember what it was now…
Ah, those were the days. We even had some great games back then :wink:

Hm.
I think it is just impossible to define the single best invention in computer science, as there are lots of things you can’t really compare.

For programming, yes, routines - or, being a bit more abstract, general abstractions over goto (that is, loops and routines) - are one of the really big things.

However, how do you compare complexity theory to this?
Is the knowledge that finding a shortest Hamiltonian cycle takes a lot of time better or worse than saving a bunch of keystrokes?

Is the idea of having a program look at your program, optimize it a bit, and turn it into assembler better or worse than the idea of stacks (that is, the basis of many programming languages)?

Or is the idea of the lambda calculus better or worse than the idea of using magnetic hard drives?

I am unable to answer such questions, as all of these are significant inventions.

If it’s all abstraction and caching, then abstraction is rather broadly defined. Consider: coding schemes, the concept of an instruction set and higher level languages that can be translated down to it, the concept of manipulating sequences of instructions as data, the concept of an algorithm, the concept of indirection, the concept of a virtual machine, the concept of hierarchical decomposition of a problem, the various techniques that are used to implement hierarchical decomposition, the various themes like hashing that are broadly useful to implement efficient algorithms…

So a statement like “the only ideas are caching and various abstractions” might not be false, but how much information does it convey?

Abstractions are like potato chips: handy algorithmic themes like alpha-beta which evidently took people years to find, computability and Turing completeness and asymptotic complexity classes like NP, compressibility and information theory and learning, continuations, in fact the entire menagerie of techniques to express synchronous and asynchronous parallelism, inheritance, GC, the concept of “purely functional” and the laziness abstractions that make it work, the menagerie of public-key-cipher-related tricks, COME-FROM…

Really should include a reference to this book:

http://www.cs.inf.ethz.ch/~wirth/books/AlgorithmE0/

Umm… Could you define “routines”?

As far as I can tell, sub-programs have been around almost as long as computers. I used to write sub-programs in assembly language. I had them in Fortran and COBOL. Heck, I can’t think of any language I used that didn’t have some concept of a gosub or function. (Well, APL didn’t, but you could never read what you wrote anyway.)

To me the big change was the localization of variables. Older languages didn’t have THAT concept. For example, in the late 1970s and early 1980s, I used to write in a proprietary language called Cadol. Cadol didn’t have variables, but registers. Some registers could store alphanumeric data and some numeric data. Basically, all registers were global from boot-up until you shut down the machine.

We had sub-routines, but we didn’t have local variables. To get around this, we divided the registers into “used to store data” and “used by sub-routines”. When you called a sub-routine, you’d copy the necessary “used to store data” registers to the “used by sub-routine” registers, call the sub-routine, and then copy the “used by sub-routine” registers back to the “used to store data” registers.
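Roughly, that discipline looks like the sketch below - C# is only standing in for the idea, and the register names and layout are invented; real Cadol obviously looked nothing like this:

```csharp
using System;

class GlobalRegisterSketch
{
    // Everything lives in global "registers", visible to all code - no
    // local variables anywhere, just as described above.
    static int[] dataRegisters = new int[8];    // "used to store data"
    static int[] scratchRegisters = new int[8]; // "used by sub-routines"

    // A sub-routine that only touches the scratch registers.
    static void DoubleScratch0()
    {
        scratchRegisters[0] *= 2;
    }

    static void Main()
    {
        dataRegisters[0] = 21;

        // The calling convention, by hand: copy data into the sub-routine's
        // registers, call it, then copy the result back out.
        scratchRegisters[0] = dataRegisters[0];
        DoubleScratch0();
        dataRegisters[0] = scratchRegisters[0];

        Console.WriteLine(dataRegisters[0]); // 42
    }
}
```

Local variables make that whole copy-in/copy-out dance disappear, which is the revelation described next.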

When I learned Pascal back in the mid-1980s, I was awestruck by the concept that functions and procedures had local variables. That was a great revelation, and it made programming much, much easier.

I think Fortran II (1958) was the first language to have procedures, but I don’t remember whether those procedures had local variable space or not. I don’t think they did.

While a routine might be the greatest invention in programming, it certainly isn’t the greatest invention in Computer Science. If it weren’t for Turing’s Universal Computing Machine or the Von Neumann Machine, we might not have Computer Science at all.

“According to Maurice Wilkes (mentioned in Dave Smith’s comment), there are only two ideas in Computer Science: abstraction and caching.”

That’s a particularly dumb quote. Abstraction is an extremely important idea, a crucial idea - its various flavours literally define how languages work - whereas caching is an implementation detail used to give greater efficiency to certain classes of program, very useful but hardly in the same ballpark or even the same city as the ballpark of abstraction.

Why is “caching” more important than: hashing; inheritance; data structures (not the same as abstraction, almost the reverse); “graphs” (linked structures); O() analysis?

I just don’t see it. Seems like something someone says to be cute, rather than after thinking deeply about it.

Is “logic”, or similar concepts, an invention?