The Greatest Invention in Computer Science

Hmmm… but you can write code without functions.
You can’t write code (at least useful code) without jz or jnz

Come on!

The binary digit (bit) was an invention. Check out “Code” by Charles Petzold.

The stored-program idea: that a program is, at some level, just data for something else to run. Inherent in the work of Turing, polished by von Neumann.

Everything since is just optimization.

“No offense to the software squad, but if you really want to call it Computer Science, then the top dog is without a doubt the microprocessor.”

That depends on whose definition of computer science you use. There is plenty of disagreement on that topic, but in my experience computer engineering is a separate field from computer science, and the microprocessor would fall under an advancement in computer engineering.

I think the context here clearly isn’t about hardware :stuck_out_tongue:

Best Regards,
Gerald

Gerald said:
“Unless you’re programming for embedded systems where clock cycles are still at a premium, you probably want to avoid using macros unless you have a really really good reason. The few clock cycles you gain by using macros instead of functions is no longer worth the loss of type and scope safety in most projects.”

I go by the rule that if a subroutine uses more cycles to be called than it takes to execute, it’s not worth being a subroutine. The real problem is small functions that are called all the time and are recursive. When I said “whenever possible use C-like macros” I should have said “when it is viable”.

Routines?
Computer Programs are routines…
I guess the most important invention is networking computers

I’m going to claim that the compiler and the computer itself are special cases of the most important computer science invention: the idea of a universal machine.

The idea is a machine which can take a description of a machine in data, and emulate it. The computer is an implementation of this idea in hardware. The interpreter in software. The compiler translates from one way of writing the description to another.

The universal machine also came first, in 1937. Software predates hardware.
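To make the idea concrete, here is a minimal sketch of a universal machine in miniature, in C++ (the instruction set and all names are invented for illustration): the program is handed to the interpreter as plain data, and the interpreter emulates the machine that data describes.

#include <cstdio>
#include <vector>

// A made-up three-instruction machine description. Each step is an opcode
// plus an operand; the interpreter below emulates whatever program it is fed.
enum Op { PUSH, ADD, PRINT };
struct Instr { Op op; int arg; };

void run(const std::vector<Instr>& program) {
    std::vector<int> stack;
    for (const Instr& i : program) {
        switch (i.op) {
            case PUSH:  stack.push_back(i.arg); break;
            case ADD: { int b = stack.back(); stack.pop_back();
                        stack.back() += b; break; }
            case PRINT: std::printf("%d\n", stack.back()); break;
        }
    }
}

int main() {
    // The "machine" being emulated is just data handed to the interpreter.
    run({ {PUSH, 2}, {PUSH, 3}, {ADD, 0}, {PRINT, 0} });  // prints 5
}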

I also agree that this kind of optimization should be one of the last resorts for improving performance, but programmers should always keep the overhead in mind.

“I go by the rule that if a subroutine uses more cycles to be called than it takes to execute, it’s not worth being a subroutine. The real problem is small functions that are called all the time and are recursive. When I said “whenever possible use C-like macros” I should have said “when it is viable”.”

There would be very few cases where a subroutine/function uses fewer clock cycles than calling it does. And in that case, using an inline function is a better choice than a macro: you get the performance benefits of a macro with the maintenance, type safety, and scoping benefits of a function.
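As a small illustration (the names here are made up), a macro and an inline function compile to essentially the same code, but only the function evaluates its argument once and gets type checking:

#include <iostream>

// Macro version: no type checking, and the argument expression is pasted
// in twice, so any side effects happen twice.
#define SQUARE_MACRO(x) ((x) * (x))

// Inline function: comparable performance once the compiler inlines it,
// but type-safe, properly scoped, and the argument is evaluated once.
inline int square(int x) { return x * x; }

int next_value() {
    static int n = 0;
    return ++n;                // side effect: advances a counter
}

int main() {
    std::cout << square(next_value()) << '\n';        // calls next_value() once: 1*1 = 1
    std::cout << SQUARE_MACRO(next_value()) << '\n';  // calls it twice: returns 2 and 3, giving 6
}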

The only time I ever use macros any more is when they make things a lot easier to read and understand, such as in message mapping, e.g.:

START_PACKETMAP(pack)
    PACKET_MAP( PACKET_FOO,    OnFoo )
    PACKET_MAP( PACKET_FOOTOO, OnFooToo )
END_PACKETMAP
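For anyone who hasn’t seen this pattern, here is one plausible, entirely hypothetical set of definitions for macros like these (the real ones could differ); the point is that the map is just a switch statement behind a more declarative surface.

// All names here are invented for illustration; the real macros could differ.
struct Packet { int id; /* payload omitted */ };

enum { PACKET_FOO = 1, PACKET_FOOTOO = 2 };

void OnFoo(const Packet&)    { /* handle foo */ }
void OnFooToo(const Packet&) { /* handle footoo */ }

// The map expands into a single dispatch function built around a switch.
#define START_PACKETMAP(name)                    \
    void name##_Dispatch(const Packet& packet) { \
        switch (packet.id) {

#define PACKET_MAP(id, handler)                  \
            case id: handler(packet); break;

#define END_PACKETMAP                            \
            default: break;                      \
        }                                        \
    }

// With these definitions, the snippet above expands into a pack_Dispatch()
// function that routes each packet to its handler.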

Best Regards,
Gerald

Logic, because you can’t solve problems without it, followed by memory, to bypass the effort required to re-solve those problems.

Jeff,

for accuracy, consider that Computer Science in itself has little to do with computers, or even with programming. The single greatest achievement in Computer Science is the concept of tractability and intractability of problems. You know, the O(n) stuff. They are so fundamental that an algorithm is just a jumbled assembly of words from a language grammar without a tractable problem to implement. Languages in themselves are not a computer term either, but another concept deeply entrenched in Discrete Mathematics that found a very cool use in building computer languages.
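As a rough, illustrative C++ sketch of that divide (the functions are invented for this example): summing n numbers is O(n) and stays feasible forever, while brute-forcing all 2^n subsets stops being feasible almost immediately.

#include <cstddef>
#include <vector>

// Tractable: touches each element once, O(n).
long long sum(const std::vector<int>& v) {
    long long total = 0;
    for (int x : v) total += x;
    return total;
}

// Intractable in general: brute-force subset sum enumerates all 2^n subsets,
// so it is O(2^n * n). Keep n small (well under 25) or it never finishes.
bool subset_sums_to(const std::vector<int>& v, long long target) {
    const std::size_t n = v.size();
    for (std::size_t mask = 0; mask < (std::size_t{1} << n); ++mask) {
        long long s = 0;
        for (std::size_t i = 0; i < n; ++i)
            if (mask & (std::size_t{1} << i)) s += v[i];
        if (s == target) return true;
    }
    return false;
}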

Software Engineering as a science is closer to what you are referring to. Computer Science is much drier than Software Engineering and much more methodical in its approach. The “bad code” you often write about is an outgrowth of the ignorance of so many “programmers” out there who found their way into the software field when the money seemed good and roadside coding education replaced the good sense of learning the fundamentals before pounding the computer metal.

Afraid my vote goes for Assembler, invented (I believe) by David Wheeler in Cambridge in 1951-1952. It’s the earliest point I can think of where software was taken out of the hands of theoretical mathematicians (an endless roll of paper and a pencil with an eraser on the end) and physicists (mercury delay tubes and bit switches, what fun).

Assembler (generically) may be a bit of a headache, but at least it allows fairly normal people to think syntactically and semantically about what they are doing.

Not that Dr Wheeler was exactly normal. I was taught by him … it was an interesting experience.

I personally feel the Keyboard is the most important invention.
Of course keyboards existed before computers, but connecting one to a computer actually gave programmers enough mental leeway to start thinking about ways to improve their programs.

Languages are a tiny part of CS.
The greatest was von Neumann’s architecture. Then Denning’s working-set virtual memory. Dijkstra’s semaphore.

One could argue Multics and Tenex operating systems.

You’re not a raving lunatic Simon :stuck_out_tongue:

But I don’t agree with you, though I guess it depends on what type of software you’re developing. Object-oriented design doesn’t work well in some areas. That’s one of the reasons I still do a lot of programming in C++ instead of languages that force me to use objects for everything: I have the flexibility to mix object-oriented and procedural design. Although the same effect can be had by using singletons, it just feels wrong to me sometimes.
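A small, purely hypothetical example of the kind of mix I mean: a class where state and behavior genuinely belong together, next to a plain free function that needs no wrapper class or singleton.

#include <cmath>
#include <iostream>

// Object-oriented where state and behavior genuinely belong together...
class Player {
public:
    void MoveTo(double x, double y) { x_ = x; y_ = y; }
    double X() const { return x_; }
    double Y() const { return y_; }
private:
    double x_ = 0.0, y_ = 0.0;
};

// ...and plain procedural code where a free function is all that is needed.
double Distance(const Player& a, const Player& b) {
    return std::hypot(a.X() - b.X(), a.Y() - b.Y());
}

int main() {
    Player p, q;
    q.MoveTo(3.0, 4.0);
    std::cout << Distance(p, q) << '\n';   // 5
}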

I think overall object-oriented design is a great thing in the industry. I certainly wouldn’t say it has hurt the industry.

“Even if you could design the perfect object model for your application, it will quickly disintegrate, under the weight of maintenance, into a collection of “bucket” classes with weak cohesion and tight coupling.”

If you design the right object model, there’s no way that happens, unless you’re not giving much thought to the object model when you’re doing maintenance and just throwing around a bunch of quick hacks.

Best Regards,
Gerald

Well, I think that if we are talking about the greatest inventions in Computer Science, you’d have to start at the Turing Machine. After all, this is the theoretical abstraction for all computing machines.

After that, I’d probably pick something like context free grammars (which you could probably argue isn’t even Computer Science, but it surely has been a major tool for developing programming languages).

We’re really talking about block structure, not routines (which existed long before Autocoder hit the scene). Learn Algol 60 if you want to appreciate it in all its glory. Seriously. You have no idea.

I’ve always thought the stack is one of the great computer inventions.

Certainly, routines existed before the stack, but the concept of a stack is less obvious than “let’s try and reuse this code”. Before stacks, routines were called by (for instance) BALR, “Branch And Link Register”[*]. The next IP address was placed in the named register, and the PC was set to the routine’s address. Returning consisted of jumping to the address in the register. But think of how painful this is: there’s no place to put local variables, and recursion can’t be done (unless you build your own area to store the link-register values).

[*] IBM 360. They didn’t have IP and PC registers, I’m just stealing the Intel terms.
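A hedged illustration in C++ (purely for exposition): ordinary recursion leans on the call stack to give every invocation its own return address and locals, and you can see what the hardware is doing for you by managing that stack by hand.

#include <iostream>
#include <vector>

// With BALR-style linkage there is one register holding the return address,
// so a nested call clobbers it. The call stack fixes this by giving every
// invocation its own slot for the return address and its locals.

// Ordinary recursion: the compiler spills the return address and the local
// n onto the stack for every call, so six nested calls coexist happily.
unsigned long long fact_recursive(unsigned n) {
    return n <= 1 ? 1 : n * fact_recursive(n - 1);
}

// The same computation with the stack made explicit, roughly what the
// compiler and hardware do on our behalf.
unsigned long long fact_explicit(unsigned n) {
    std::vector<unsigned> pending;      // one "frame" per suspended call
    while (n > 1) { pending.push_back(n); --n; }
    unsigned long long result = 1;
    while (!pending.empty()) { result *= pending.back(); pending.pop_back(); }
    return result;
}

int main() {
    std::cout << fact_recursive(6) << ' ' << fact_explicit(6) << '\n';  // 720 720
}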

The first routine is likely to have been in either Fortran II or Lisp (or FORTRAN II and LISP, as they were then, lower-case letters not having been invented at the time).

Of course, it’s entirely likely that people did them in assembly beforehand; it’s even possible that some early processors had built-in support for them (as does the M68K, for instance).

I said the internet :frowning: