The Greatest Invention in Computer Science

Tjerk:

Depends how you define “invention in computer science”. Is abstraction really an invention, per se? I would say that regardless, it is certainly not an invention in computer science. Abstraction FAR predates computer science, even under the “computers have nothing to do with computer science” definitions.*

Even accepting that abstraction is greater and the routine is just a manifestation of it, that does not stop the routine from being an invention – an invention I would still argue is a lot greater than the others you list along many dimensions of “greatness”.

  • This goes for “logic” and a bunch of other suggestions too. We needed these “inventions” for fire and a bunch of other inventions that came far before comsci or anything else related.

What about Lisp and Smalltalk? They’re just as modern as any modern programming language, and they were invented in 1958 and 1972 respectively.

I think you are all wrong. For inventions that changed the world, you just can’t beat the CMOS chip. (I know it’s clunky now) What CMOS enabled us to do is learn a programming language and port it from one machine to another that is designed very differently. CMOS made it possible for the PC Clones to be usable, for the PC market to grow, for us to have systems to work on. I’d like to think CMOS is the precursor to all modern programming. Maybe I am biased.

“Actually, I believe the single greatest invention in C.S. is the Data Type. Seriously, think about it: the array, the binary tree, the hash table… How would you do anything useful without them?”
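For the skeptics, here is roughly what “doing something useful” with one of those structures looks like. This is only a sketch of a fixed-size hash table in C – the names and sizes are made up for illustration, not taken from any particular library – but it shows why the data type earns its keep.

```c
/* A minimal sketch of one of the data structures mentioned above: a
 * fixed-size, open-addressing hash table mapping strings to integers.
 * All names and sizes here are illustrative only. */
#include <stdio.h>
#include <string.h>

#define TABLE_SIZE 64

struct entry {
    const char *key;   /* NULL means the slot is empty */
    int value;
};

static struct entry table[TABLE_SIZE];

/* djb2-style string hash */
static unsigned long hash(const char *s)
{
    unsigned long h = 5381;
    while (*s)
        h = h * 33 + (unsigned char)*s++;
    return h;
}

/* Insert or update; linear probing resolves collisions.
 * (No full-table check – fine for a sketch.) */
static void put(const char *key, int value)
{
    unsigned long i = hash(key) % TABLE_SIZE;
    while (table[i].key && strcmp(table[i].key, key) != 0)
        i = (i + 1) % TABLE_SIZE;
    table[i].key = key;
    table[i].value = value;
}

/* Returns 1 and fills *value if the key is present, else 0. */
static int get(const char *key, int *value)
{
    unsigned long i = hash(key) % TABLE_SIZE;
    while (table[i].key) {
        if (strcmp(table[i].key, key) == 0) {
            *value = table[i].value;
            return 1;
        }
        i = (i + 1) % TABLE_SIZE;
    }
    return 0;
}

int main(void)
{
    int v;
    put("fortran", 1957);
    put("lisp", 1958);
    if (get("lisp", &v))
        printf("lisp -> %d\n", v);
    return 0;
}
```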

“He [Dijkstra] makes the case for abstraction by arguing that we have small brains, and have to live within their limits, so anything that helps us limit the amount of “stuff” we consider at one time is good.”

Data Types and Structures (OOP) are neat and all, but seriously, the greatest achievement in C.S. has got to be that we’re constantly learning new ways to deal with and manage computing power. That we’re actually admitting to ourselves our own limits with respect to computers.

re Phil on June 7, 2008 09:39 PM’s comments about stack vs BALR

If I recall, many small machine architectures in the 1960s and 1970s (the 1130, some microprocessors) did not pass subroutine or function parameters in registers or on a stack, nor did they point to them from any register other than the return-address register. Self-modifying code was OK. The parameter values were placed into memory immediately following the call instruction. In architectures where the caller set the return address, it set it to just after the last parameter. In architectures where the hardware stored a return address pointing just past the call instruction, the subroutine or function had to add enough to that return address to make it point to the instruction after the last parameter. Either way, if there was a variable number of parameters and/or variable-length parameters, the exercise got to be fun.
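To make that convention concrete, here is a rough sketch in C that merely simulates the memory layout (real machines did this with actual addresses and self-modifying code; every name and number below is made up for illustration).

```c
/* A toy model of the convention described above: the caller places the
 * parameter words immediately after the "call", and the callee reads them
 * through the return address, then bumps that address past the parameters
 * before returning.  The "memory" array just stands in for core storage. */
#include <stdio.h>

/* Word 0 is the call site, words 1..2 are in-line parameters,
 * word 3 is where execution should resume afterwards. */
static int memory[] = { 0 /* call SUM2 */, 7, 35, 0 /* next instruction */ };

/* The callee receives the raw return address (index of the word after the
 * call), picks up its two parameters from there, and returns the adjusted
 * resume address, i.e. the word after the last parameter. */
static int sum2(int return_addr, int *result)
{
    int a = memory[return_addr];        /* first in-line parameter  */
    int b = memory[return_addr + 1];    /* second in-line parameter */
    *result = a + b;
    return return_addr + 2;             /* skip past the parameters */
}

int main(void)
{
    int result;
    int resume = sum2(1, &result);      /* "call" with return address = 1 */
    printf("sum = %d, execution resumes at word %d\n", result, resume);
    return 0;
}
```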

re David W. on June 8, 2008 01:10 AM question: Did Fortran II have local variables?
YES. Function and Subroutine variables were local. So were their names. The same variable names could be used independently in each routine.
If you wanted them to be global, you had to put them in COMMON.
See http://en.wikipedia.org/wiki/Fortran
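Fortran itself isn’t shown here, but a rough modern analogue of that rule, sketched in C with made-up names, is locals inside each routine versus one explicitly shared variable standing in for a COMMON block.

```c
/* A rough C analogue of the Fortran II rule described above: variables
 * declared inside a routine are local to it, and sharing requires an
 * explicit shared area (COMMON in Fortran, a file-scope variable here).
 * Names are made up for illustration. */
#include <stdio.h>

static double shared_total;   /* plays the role of a COMMON block */

static void add_sample(double x)
{
    double scaled = x * 2.0;  /* "scaled" is local; invisible elsewhere */
    shared_total += scaled;
}

static void report(void)
{
    /* A different local also named "scaled" here would be a separate variable. */
    printf("total = %f\n", shared_total);
}

int main(void)
{
    add_sample(1.5);
    add_sample(2.0);
    report();
    return 0;
}
```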

re: Dr. Goulu on June 8, 2008 05:07 AM "In fact a pure “Von Neumann” architecture isn’t really usable without a stack…"
Tell that to the mainframe folks. IBM mainframes do not have PUSH and POP instructions.
In z/Architecture Principles of Operation, SA22-7832-06 (https://www-01.ibm.com/servers/resourcelink/lib03010.nsf/B9DE5F05A9D57819852571C500428F9A/$File/SA22-7832-06.pdf), the words “push” and “pop” do not appear. “Stack” appears 983 times, but usually referring to the program linkage [traceback] stack.

David

The stored-program computer (a.k.a. the von Neumann architecture) and code as data.

http://blog.uncommons.org/2008/06/08/great-innovations-in-computing/

As well as making computers much more flexible, it made it possible to write programs that write programs (e.g. compilers).
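A tiny illustration of “programs that write programs”: the sketch below (plain C, with an arbitrary file name) emits the source of another program, which a compiler – itself a program operating on code as data – can then turn into an executable.

```c
/* This program writes the source of another C program to a file.
 * The file name "generated.c" is arbitrary. */
#include <stdio.h>

int main(void)
{
    FILE *out = fopen("generated.c", "w");
    if (!out)
        return 1;

    fputs("#include <stdio.h>\n", out);
    fputs("int main(void)\n", out);
    fputs("{\n", out);
    fputs("    printf(\"I was written by another program.\\n\");\n", out);
    fputs("    return 0;\n", out);
    fputs("}\n", out);

    fclose(out);
    return 0;
}
```

Compiling the generated file (e.g. `cc generated.c`) then produces a second executable, written entirely by the first.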

The stack.

It’s the simplest data structure, and yet the most useful, so essential that its handling is now engraved in the silicon. In fact a pure “Von Neumann” architecture isn’t really usable without a stack, while you could build languages such as Forth and computers such as the HP “reverse Polish notation” calculators around a stack.
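As a rough sketch of that stack-centred style – not Forth itself, just a toy RPN evaluator in C with made-up names – operands are pushed, and each operator pops two values and pushes its result.

```c
/* A minimal sketch of stack-based (RPN) evaluation: operands are pushed,
 * each operator pops two operands and pushes its result.
 * Input handling is deliberately crude. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static double stack[64];
static int top;                       /* number of values on the stack */

static void push(double x) { stack[top++] = x; }
static double pop(void)    { return stack[--top]; }

/* Evaluate a space-separated RPN expression, e.g. "3 4 + 2 *". */
static double eval_rpn(char *expr)
{
    char *tok = strtok(expr, " ");
    while (tok) {
        if (strcmp(tok, "+") == 0)      { double b = pop(), a = pop(); push(a + b); }
        else if (strcmp(tok, "-") == 0) { double b = pop(), a = pop(); push(a - b); }
        else if (strcmp(tok, "*") == 0) { double b = pop(), a = pop(); push(a * b); }
        else if (strcmp(tok, "/") == 0) { double b = pop(), a = pop(); push(a / b); }
        else                            { push(atof(tok)); }
        tok = strtok(NULL, " ");
    }
    return pop();
}

int main(void)
{
    char expr[] = "3 4 + 2 *";        /* (3 + 4) * 2 = 14 */
    printf("%g\n", eval_rpn(expr));
    return 0;
}
```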

I’d say the single greatest invention in computer science is the concept of treating data in memory as instructions. Such data has come to be called “software”. The earliest computers had no such concept. To change the problem the computer was to solve, you had to mount a different plugboard. The invention of software is usually attributed to mathematician John von Neumann.

The pointer.

I vote for compilers

Guess what? If there were no routines we could still program computers to do our bidding. The routine is a great artifact for human comprehension, but it’s hardly necessary.

Surely greater inventions include the algorithm, the stored program, or (more fundamentally) the function (in the mathematical sense)? Or stored state?

I guess when you have a popular blog you feel compelled to say something periodically even when you have nothing to say.

I think saying that those languages are so young is misleading. The pace of invention has accelerated over the years. For languages from the 70s, for example, 10 years was not that much time, because the main users of those languages were small groups of researchers.

However, for today’s languages, 10 years is a much longer time. Language development has sped up considerably, thanks to the Internet and other factors such as open-source software, which allow the average person, not just researchers, to contribute to development.

The routine??? hahaah you truly are stupid.

The routine is just a form of abstraction, and abstraction is at the heart of computer science. The real inventions are the lambda calculus, model checking, and automata.

Seriously, you’re stupid

I think that XBase, starting with dBase II in 1983 and later Clipper and FoxPro, deserves a very important place in this list. Its power, simplicity, and interpreted style paved the way for many improvements in modern languages like .NET.
Even today XBase is still alive and kicking.

I was just reading through some of the code for Pac-Man
(http://cubeman.org/arcade-source/pacman.asm)
and wishing it was written in something a bit more readable than assembly language.
I kept wishing there were some routines!

But it does use routines: just look for “call” and “ret” (return) in the listing.

Asm is way more readable than pure machine code :wink:

Public key cryptography, or cryptography in general, must surely rank as one of the most important inventions for Computer Science. It has made online commerce possible and even easy. In order for a technology to survive it must be economically feasible. Without secure transactions over the Internet, I’m sure that it wouldn’t have grown to where it is today.
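For anyone who hasn’t seen it, the public/private split is easy to demonstrate with the classic textbook RSA toy numbers (p = 61, q = 53). The sketch below is hopelessly insecure at this size and skips padding entirely; it only shows that one exponent encrypts and a different one decrypts.

```c
/* Toy "textbook RSA": n = 61 * 53 = 3233, public exponent e = 17,
 * private exponent d = 2753.  Anyone can encrypt with (n, e); only the
 * holder of d can decrypt.  A classroom example, not real cryptography. */
#include <stdio.h>

/* Modular exponentiation: (base^exp) mod m, by repeated squaring. */
static unsigned long long powmod(unsigned long long base,
                                 unsigned long long exp,
                                 unsigned long long m)
{
    unsigned long long result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1)
            result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

int main(void)
{
    const unsigned long long n = 3233, e = 17, d = 2753;
    unsigned long long message = 65;                      /* must be < n */

    unsigned long long cipher = powmod(message, e, n);    /* public key  */
    unsigned long long plain  = powmod(cipher, d, n);     /* private key */

    printf("message = %llu, cipher = %llu, decrypted = %llu\n",
           message, cipher, plain);
    return 0;
}
```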

One can actually make a pretty good case for subroutines being #1, as it demonstrates what makes human reasoning what it is – the ability to imagine an abstract concept, give it a name, and then refer to it by that name going forward. That’s basically what language is all about, no?

0 - Zero

http://en.wikipedia.org/wiki/0_(number)

OK. I’m going to go with one other poster here and say “the algorithm”, and if I have to choose one then it would be the “sorting algorithm”. The routine is just the mechanics of implementing something, but “what to do” is way more important!
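To put a concrete face on “the algorithm” versus “the routine”, here is a minimal insertion sort in C (a sketch only, with made-up names). The wrapper function is just packaging; the interesting part is the idea of shifting larger elements right until each new element finds its place.

```c
/* A small concrete example of "the algorithm": insertion sort. */
#include <stdio.h>

static void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {  /* shift larger elements right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;                 /* drop the element into place */
    }
}

int main(void)
{
    int data[] = { 5, 2, 9, 1, 7 };
    int n = sizeof data / sizeof data[0];

    insertion_sort(data, n);
    for (int i = 0; i < n; i++)
        printf("%d ", data[i]);
    printf("\n");
    return 0;
}
```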

The human brain…
Without it, nothing else matters.