The Greatest Invention in Computer Science

the code itself will be bigger but faster.

Yeah, right. If the code still fits into the processor cache, that’s fine. But once it doesn’t anymore, the performance impact of cache misses is much larger than that of any “non-inlined” simple routine you did not write as a macro (not to mention the “performance impact” on the developer when it comes to actually maintaining such code).
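To make the trade-off concrete, here is a minimal C sketch (SQUARE_MACRO, square and sum_of_squares are made-up names for illustration): the macro expands its body at every use site and bloats the code, while the function exists once and a modern compiler is free to inline it where that actually pays off.

#define SQUARE_MACRO(x) ((x) * (x))   /* expanded at every use site; no type checking, argument evaluated twice */

static inline int square(int x) {
    return x * x;                     /* one copy of the code; the compiler decides whether inlining is worth it */
}

int sum_of_squares(const int *values, int count) {
    int total = 0;
    for (int i = 0; i < count; i++)
        total += square(values[i]);   /* SQUARE_MACRO(values[i]) would compile too, just with more generated code */
    return total;
}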

And by all means try not to use recursion.

You wouldn’t believe what a good job most of today’s compilers do on recursive code. Tail recursion is especially simple - you can’t do better with handcrafted assembler, since all that’s needed is a jump back to the top of the routine. :)
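A minimal sketch of why that is (gcd is just a stand-in example, not from the post): the recursive call is the last thing the function does, so the compiler can reuse the current stack frame, which gives exactly the loop below.

int gcd(int a, int b) {
    if (b == 0) return a;
    return gcd(b, a % b);   /* tail call: an optimizing compiler turns this into a jump back to the top */
}

/* Roughly what such a compiler emits for the function above. */
int gcd_loop(int a, int b) {
    while (b != 0) {
        int r = a % b;
        a = b;
        b = r;
    }
    return a;
}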

One piece of advice: measure before you optimize. Or as D. E. Knuth said: “Premature optimization is the root of all evil.”

So if you still write macros to gain speed, you may actually be making it slower. Measurably. Believe me. BTDT.

You’re hardly comparing like with like. Which one of these is easier to read/understand?

Before:
1000 CENTER% = (80 - LEN(TEXT$)) / 2
1010 SPACES$ = ""
1020 FOR N = 0 TO CENTER% STEP 1
1030 SPACES$ = SPACES$ + " "
1040 NEXT N
1050 PRINT SPACES$ + TEXT$
1060 RETURN

After:
public void PrtCntdStr(string t) {
int c = (80 - t.Length) / 2;
string s = "";
l:
if (c == 0) goto p;
s += " ";
c--;
goto l;
p:
Print(s + t);
}

At the end of the day, while I broadly agree that I couldn’t do my job without routines, the examples given didn’t do much to support the argument (as I hope my counter-example demonstrates when you are prepared to muddy the water a little).

With this post I feel obliged to explain how the computer does routines.
Some computers use that sort of mechanism: those whose CPUs have too few registers to hold the arguments passed, or which run OSes with requirements to support such CPUs (x86 compatibility mode rather than x64). Other computers, such as SPARC-based machines or an x64 machine running a 64-bit OS, pass arguments in registers for most calls. Good compilers will inline functions to avoid the overhead on either kind of system.
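A tiny C sketch of the difference (add and caller are made-up names): under a 32-bit cdecl-style convention the arguments below are pushed on the stack, while under the x86-64 System V ABI they arrive in the edi and esi registers; and a good compiler will usually inline a call this small in either case.

static int add(int a, int b) {
    return a + b;
}

int caller(void) {
    /* 32-bit x86: a and b pushed on the stack; x64 System V: passed in edi/esi.
       After inlining, this likely compiles down to "return 5" either way. */
    return add(2, 3);
}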

And by all means try not to use recursion.
If you have a recursive problem, use recursion, and profile the cost of maintaining your own stack against the one the OS and hardware support. Even if the arguments are passed in registers, you still have to save the values somewhere for a non-tail-recursive call, and building your own structures to support that in an iterative manner is unlikely to be worthwhile, either in terms of performance or of code clarity.
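A quick C sketch of that point (the Node type and the fixed-size stack are assumptions for illustration): the recursive sum leans on the hardware call stack, while the iterative one has to maintain its own stack and ends up longer without being any clearer.

#include <stddef.h>

typedef struct Node {
    int value;
    struct Node *left, *right;
} Node;

/* Non-tail-recursive: the saved state lives on the call stack for free. */
int sum_recursive(const Node *n) {
    if (n == NULL) return 0;
    return n->value + sum_recursive(n->left) + sum_recursive(n->right);
}

/* The same traversal with a hand-rolled stack: more code, rarely faster. */
int sum_iterative(const Node *root) {
    const Node *stack[64];            /* assumed maximum depth, just for the sketch */
    int top = 0, total = 0;
    if (root != NULL) stack[top++] = root;
    while (top > 0) {
        const Node *n = stack[--top];
        total += n->value;
        if (n->left != NULL)  stack[top++] = n->left;
        if (n->right != NULL) stack[top++] = n->right;
    }
    return total;
}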

You can have too many lines in a routine, but if you create lots of little routines you end up with a really long list of them, and pretty soon the names become less meaningful and it gets harder to get a handle on the code.
So often I still write long routines, dividing them into logical chunks with long comment lines of '///////////////////////////////, rather than writing shorter ones. There’s a happy medium.
Maybe it’s a bit like the goto debate.

Well, I guess I was on the same wavelength, because in all honesty I picked the algorithm, the element of linear (potentially recursive) thought, as the greatest contribution of computer science.

Some more candidates:

Goto/Jmp/Br

Without jumps, only very trivial programs would be possible. These days they are deeply hidden behind the compiler, much as quantum physics is hidden behind Einsteinian physics.

Pointers

Without pointers, stacks would be impossible and thus routines would be seriously crippled. Yet again, these are now generally hidden, but more like Einsteinian physics behind Newtonian.

Structures

Without structures, parameter passing would be cumbersome. The simile here might be sand on a beach or atoms behind molecules perhaps.

The hierarchy continues up the scale through routines, objects, libraries, protocols, etc. …

I’m going with “free compilers” as the greatest invention. Poor college students and people in developing countries would otherwise never get to know Computer Science, because of the high cost of compilers.

I think Binary Arithmetic is the most important part of computer science (outside of hardware).

Not that the routine isn’t important, but I think you should give some credit to Alan Turing’s Bombe (http://en.wikipedia.org/wiki/Bombe). It did help save the free world.

I agree the example is poor; both versions are almost line-for-line equivalent, and if there’s any difference in readability it’s down to indentation and names.

The significant difference that isn’t highlighted here is that the first version is called using GOSUB, and TEXT$ is a global variable rather than an argument, with all the problems that entails.

Which then shows the benefit when you need to nest routine calls with arguments and return results, e.g. when the second version is implemented more realistically as:

void PrintCenteredString(string text, int lineLength)
{
    Print(CenterString(text, lineLength));
}

string CenterString(string text, int lineLength)
{
    // Assumed body for illustration: pad with leading spaces to center the text.
    int padding = (lineLength - text.Length) / 2;
    return new string(' ', padding) + text;
}

wishing it was written in something a bit more readable than assembly language. I kept wishing there were some routines!

One of my project managers back in 1980 was a strong advocate of macro assembler. It was possible to create routines and code that were surprisingly readable in assembler.

Gerald said:
“Unless you’re programming for embedded systems where clock cycles are still at a premium, you probably want to avoid using macros unless you have a really really good reason. The few clock cycles you gain by using macros instead of functions are no longer worth the loss of type and scope safety in most projects.”

I forgot to mention that I’m in fact programming (in VHDL) a processor inside an FPGA, and with that processor I will build a crude game (using a character map as graphics) in assembly. The problem is that the FPGA only has 2kb of memory, of which only 1kb is available for the program and the data. So that is why I’m a little bit crazy about processor cycles nowadays. It’s mind-numbing to think about how to design a computer instruction, and how the processor grows exponentially with every single instruction that you add.

For example, in order to save memory we created 3 separate instructions to load data into a register: LOAD, LOADLO, LOADHI. The processor is 16-bit, so we use 4 bits for the instruction and address the registers with 4 bits (16 registers). So LOAD is [instruction number, 4 bits] [the register where the data will be loaded, 4 bits], and in the next cycle comes the address of the data in 16 bits. LOADLO and LOADHI serve to load constants into a register, but each uses only one cycle (LOAD uses two, one for the instruction and the other for the address, using twice as much memory too); the first loads data into the 8 least significant bits of the register and the second loads data into the 8 most significant bits. So LOADLO is [instruction number, 4 bits] [the register where the data will be loaded, 4 bits] [the constant, 8 bits].
So to load 00 0A (hex, 10 in decimal) in the register I have:
LOADLO Rx, 0A
To load 0A 00 (hex, 2560 in decimal):
LOADHI Rx, 0A
To load 0A0A (hex, 2570 in decimal):
LOADLO Rx, 0A
LOADHI Rx, 0A

If you think about it, most constants you load in a given program are less than 256, so I’m saving quite a bit of memory doing this.
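To double-check the encoding described above, here is a small C sketch (the opcode numbers and the encode/execute helper names are invented; only the 4-bit opcode / 4-bit register / 8-bit constant layout comes from the design itself):

#include <stdint.h>
#include <stdio.h>

enum { OP_LOADLO = 0x1, OP_LOADHI = 0x2 };   /* opcode values assumed for the sketch */

/* Pack a one-word instruction: [opcode 4 bits][register 4 bits][constant 8 bits]. */
static uint16_t encode(unsigned op, unsigned reg, unsigned imm8) {
    return (uint16_t)(((op & 0xF) << 12) | ((reg & 0xF) << 8) | (imm8 & 0xFF));
}

/* Execute the two immediate-load forms against a 16-register file. */
static void execute(uint16_t instr, uint16_t regs[16]) {
    unsigned op   = instr >> 12;
    unsigned reg  = (instr >> 8) & 0xF;
    unsigned imm8 = instr & 0xFF;
    if (op == OP_LOADLO)
        regs[reg] = (uint16_t)((regs[reg] & 0xFF00u) | imm8);          /* low byte  */
    else if (op == OP_LOADHI)
        regs[reg] = (uint16_t)((regs[reg] & 0x00FFu) | (imm8 << 8));   /* high byte */
}

int main(void) {
    uint16_t regs[16] = {0};
    execute(encode(OP_LOADLO, 3, 0x0A), regs);   /* R3 = 0x000A */
    execute(encode(OP_LOADHI, 3, 0x0A), regs);   /* R3 = 0x0A0A */
    printf("R3 = %04X\n", regs[3]);
    return 0;
}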

Sorry about all this mumbling about cycles and memory; I’m going crazy with all this binary stuff, and sharing makes the nightmares go away.

Come on people… the greatest invention in computer science (so far) is easy to answer. It’s the Mouse.

Grace Hopper’s first compiler (A-0) was fundamentally subroutines…
see http://en.wikipedia.org/wiki/A-0_programming_language

Frankly I’m not sure the concept of “most important” is at all valid. “What are the most important concepts?” or “critical points in history” are better questions.

But if I do play the game… my vote is: the compiler (and that includes assemblers really); programming in binary is a real bummer.

a href="http://niyati123.blogspot.com/2008/02/first-compiler.html"http://niyati123.blogspot.com/2008/02/first-compiler.html/a

If you ever have the opportunity I recommend seeing Robert Martin do his talk on functions. It helped me to understand and master the power of the routine.

It’s the internets - guyz!

I wouldn’t vote for the routine. I’d vote for the concept of putting code and data in the same memory.

It looks like you’re looking for a fight, so… the greatest invention in computer science, I’ll have to go with Dope Wars. It got millions of kids to get their TI-82s to stop displaying the Sierpinski gasket and start shopping for the best prices on heroin.

Or…

O-notation. The logic gate. The integrated circuit. S-expressions, defmacro, and eval. TCP/IP. Digital systems.

I think one thing we struggle with here is an inability to go back to just before these great discoveries and look forward at what leaps they were. There’s an awful lot below the subroutine that you’re taking for granted.

Even in the arena where the subroutine does the most, information hiding and abstraction, earlier discoveries like the transistor and assembly language compete for attention…

No offense to the software squad, but if you really want to call it Computer Science, then the top dog is without a doubt the microprocessor.

Someone quoted Dijkstra, which is really apropos for this topic. He started programming in 1952 and contributed much to the field, inventing the semaphore, for instance. He gave a Turing Award lecture in 1972 called “The Humble Programmer” which is a classic; here’s a link to the text:

http://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340.html

He makes the case for abstraction by arguing that we have small brains, and have to live within their limits, so anything that helps us limit the amount of “stuff” we consider at one time is good. This piece is from an era when there was debate over whether subroutines were a good thing or not… we’ve come a long way!