Pressing the Software Turbo Button

Does anyone remember the Turbo Button from older IBM PC models?


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2008/12/pressing-the-software-turbo-button.html

I can’t really disagree with the idea that developers need fast machines to be productive.

Of course, I wish these same developers with very fast whiz-bang workstations would remember that the rest of us users probably do not have the latest and greatest hardware at our fingertips.

Back in college (early 70s) we were learning Fortran on a mainframe. Input was via punch cards. To speed up the turnaround for testing our programs, they would put the card reader online. They would have one operator feed the card reader and one sit behind the line printer to rip apart the printouts. You could literally see your card deck get read in and the output fly out of the printer. If you got in line early, you could get a couple of iterations of running and fixing your code before the hour was up and the card reader went back to spooling jobs for later running.

I think he means that compilers should be super optimized and fast. This is of paramount importance. But for any other software, just get faster hardware. That’s good enough for those dirty idiot users. They probably wouldn’t notice if you optimised your code anyway.

There’s a related problem, of course, with game consoles. Many of the games for the NES and SNES were dependent not on CPU speed but on TV refresh rates. That is, NES and SNES games were essentially programs designed to draw a fixed list of frames.

The code for games had to be tweaked separately for the NTSC and PAL versions because PAL draws to the screen 50 times a second, while NTSC draws 60 times a second. So the PAL version had to be sped up per frame by about 20% (60/50) to compensate for the fact that 10 fewer frames were being drawn per second.

This unfortunately resulted in a bug in the PAL version of Super Mario Kart, which made it impossible to get to a secret area in one of the ghost house runs. There was a very long jump you had to make, and the physics engine just ran slightly too fast to make the jump.
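The scaling itself is just arithmetic on the frame rate. A rough sketch in C, purely illustrative and not from any actual NES/SNES codebase:

```c
/* Per-frame movement derived from a frame-rate-independent speed.
   Consoles of that era baked the per-frame value into the code, so a
   PAL build needed every such constant rescaled by 60/50 (about 20%). */
#define NTSC_FPS 60.0
#define PAL_FPS  50.0

double per_frame_speed(double units_per_second, double fps) {
    return units_per_second / fps;   /* fewer frames -> bigger step each frame */
}

/* per_frame_speed(120.0, NTSC_FPS) == 2.0 units per frame
   per_frame_speed(120.0, PAL_FPS)  == 2.4 units per frame */
```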

Another problem for NES games (especially in the Mega Man series) is that if the code for calculating a frame takes slightly more than 1/60th of a second to run, the program has to wait for the next scan to draw the frame it calculated. So on screens with just slightly too many enemies, the game would suddenly run at half speed: even if the program overshot the frame-drawing interrupt by only 1 millisecond, you’d waste a whole frame of CPU time while the program waits for the next interrupt.

Seven damn Mega Man games and they never addressed this. Couldn’t they have written a subroutine to detect whether they missed a frame and compensate in the next one? Apparently not.
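For what it’s worth, the usual cure (assuming the hardware has cycles to spare, which the NES often didn’t) is catch-up logic along these lines. This is a hedged modern sketch, not how any actual Mega Man game worked, and the helper functions are hypothetical:

```c
/* Frame-skip compensation: if rendering overshoots a vblank, run the
   simulation once per missed frame so game speed stays constant and
   only the visual frame rate drops. */
extern volatile unsigned vblank_counter;   /* incremented by the vblank interrupt */
void update_game_state(void);              /* hypothetical: input, physics, enemies */
void draw_frame(void);                     /* hypothetical: render the current state */

void game_loop(void) {
    unsigned last = vblank_counter;
    for (;;) {
        while (vblank_counter == last) { /* spin until the next vblank */ }
        unsigned now     = vblank_counter;
        unsigned elapsed = now - last;     /* 1 normally, 2+ if we overshot */
        last = now;

        for (unsigned i = 0; i < elapsed; i++)
            update_game_state();           /* catch up on every missed frame */

        draw_frame();                      /* draw once, whatever we skipped */
    }
}
```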

The turbo button was very useful for gamers. Games back in those early days did not adapt to different clock speeds if more processing power was available. They expected 4.77 MHz and a certain video RAM access speed. Change that in any way and the timing in the game changed. Running your car racing game at 10 MHz (turbo) instead of 4.77 MHz meant that things happened twice as fast as they should have.
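The way around that, which nothing written for a fixed 4.77 MHz clock bothered with, is to scale motion by measured elapsed time instead of by loop iteration. A minimal sketch using POSIX timing, obviously nothing like period DOS code:

```c
#include <time.h>

/* Clock-speed-independent movement: a faster CPU just means more frames,
   not a faster car, because each step is scaled by real elapsed time. */
static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

void race(double car_speed_units_per_sec) {
    double position = 0.0;
    double prev = now_seconds();
    for (;;) {
        double t  = now_seconds();
        double dt = t - prev;                       /* seconds since last frame */
        prev = t;
        position += car_speed_units_per_sec * dt;   /* same result at 4.77 or 10 MHz */
        /* draw_car(position);  -- hypothetical renderer */
    }
}
```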

I used to flip in and out of turbo mode constantly on my old turbo XT clone…

I really thought your point was going to be about purposely slowing down software for whatever reason. The point of the turbo button was to slow down the computer as needed. Not to speed it up. It was actually misnamed.

I think it goes without saying that you want your software to run as fast as it can all the time. So unless you have a specific reason to want it to run slower, this whole post is really pointless.

It sounds like you had a blast from the past and were thinking about the old turbo button. Instead of just producing a “hey, you guys remember the turbo button?” post, you tried to come up with some analogy. The analogy simply doesn’t work.

It’s not the hardware; it’s the network that is the bottleneck these days.

I saw an article that compared the performance of a 1986 Mac and a 2007 PC. For the most mundane tasks – booting, launching Word, opening a file, doing a search and replace, etc. – the old Mac was slightly faster. We’ve squandered most of the benefits of Moore’s law on bloatware.

This reminds me of an old DOS screensaver I once had, EXPLOSIVE.COM or something, which drew fireworks. I copied it over to my new Windows 95 PC and ran it; crikey, those fireworks were launching and exploding fast!

And then there’s the game Police Quest, which at last check still worked on XP. The character moved too fast on screen to be able to play it properly: move right and suddenly you’re at the far end of the map.

Is this some kind of indirect encouragement to overclock my work computer?

My first job was writing code for a system that took 12 hours to build. So we’d write code all day and it would build all night. Then we’d come in the next morning and start debugging. If, at 9:01, I found a bug, I’d note it on my printout and then I’d need to patch it to go on testing.

Of course, there was no assembler in the debugger so I would need to modify x86 hex directly. I’d need to find a place where I could insert a long jump to empty RAM, and then write my new code in hex, and then jump back. This would continue on until about 2pm. I’d take my printouts/notes to a VT100 and type in my changes. Repeat.

The biggest programming arguments were between the old guys who used nothing but fixed length arrays (which overflowed) and the ‘new’ guys who preferred linked lists (with pointers that sometimes got corrupted).

The longest patch I wrote was 2K bytes. I pretty much had that pocket x86 instruction book memorized. For a long time after that, when I wasn’t sure of the syntax for something in C, I’d look at the assembly output to verify that I’d written the code correctly :slight_smile:
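For anyone who never had to do this by hand, the arithmetic behind that kind of patch looked roughly like this. The addresses are made up, but 0xE9 really is the 16-bit near JMP opcode, with a displacement measured from the end of the instruction:

```c
#include <stdint.h>
#include <stdio.h>

/* Build the 3 bytes of a 16-bit near JMP (opcode 0xE9) that you would
   hand-poke over existing code to divert execution into free RAM.
   Addresses here are illustrative offsets within one segment. */
void make_jmp_patch(uint16_t patch_at, uint16_t jump_to, uint8_t out[3]) {
    uint16_t rel = (uint16_t)(jump_to - (patch_at + 3)); /* relative to next instr */
    out[0] = 0xE9;             /* JMP rel16 */
    out[1] = rel & 0xFF;       /* little-endian displacement */
    out[2] = rel >> 8;
}

int main(void) {
    uint8_t bytes[3];
    make_jmp_patch(0x1234, 0x8000, bytes);   /* patch site -> scratch RAM */
    printf("%02X %02X %02X\n", bytes[0], bytes[1], bytes[2]);
    return 0;
}
```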

What’s slowing me down?

People are listening to your last post and aren’t optimizing their software :wink:

Actually, I went from PHP to compiled languages, and for programs of a reasonable size, while the compilation takes some time, it ends up saving me lots of time thanks to the type checking the languages I’m using employ.

I think of languages like PHP and Perl, with fewer compile-time checks, as “hurry up and fail” languages. I’ve come to think of compile-time checks as a sweet mercy from God (or a vendor, I suppose).
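A trivial illustration of what that mercy buys, sketched in C only because it’s the kind of compiled language in question (the struct and field names are made up):

```c
#include <stdio.h>

struct user {
    const char *name;
    int age;
};

int main(void) {
    struct user u = { "Ada", 36 };

    /* If this said u.agee, or passed u.name where an int is expected,
       the compiler would reject the program before it ever ran. In PHP
       or Perl the equivalent typo tends to surface only when that code
       path is finally executed. */
    printf("%s is %d\n", u.name, u.age);
    return 0;
}
```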

I agree with Matt. Plus this article doesn’t really make sense; it’s almost as if you quoted without reading.

Hey guys, remember the turbo button? [quotes someone else explaining that the point of the turbo button was a compatibility mode for certain software] It’s funny, I always wanted it turned on! And even nowadays I wouldn’t turn off the turbo for my computer! And I suspect other people also prefer fast computers! [quotes someone talking about Turbo Pascal, then applies emphasis as if what they were talking about was somehow related to the turbo button]

Just an additional (and more relevant) aside: I think we all remember putting some old game on a new PC and wondering who put my avatar on crack as he ran around the screen in fast motion.

When I originally heard that the problem was that the programmers assumed processor speed, it seemed incomprehensible to me that such a major assumption could have been made. Computers aren’t going to get faster?

What other assumptions are we making? The year 2000 will never come. Now it’s 2038 I believe. We’ll never need more than a byte for a character. SMTP is a trusting protocol, etc. What are we doing today that will be blindingly obvious tomorrow?

I completely agree with the need for enough of the following:

- RAM
- Processor speed

And it’s amazing to me how many developers come back with a reply that they don’t need 4 GB of RAM and that 2 will suffice. It’s amazing that some developers think having a slower laptop or desktop is completely fine, and that waiting for things to compile (for lack of a 7200 RPM hard disk in a laptop), or waiting literally many minutes a day for things to load and launch, is acceptable when developers are expected to produce at a usually unreasonable pace by most businesses these days.

Well, at least 3 GB on an XP machine, but preferably 4+ GB on a 64-bit machine, is ideal for any developer, and it costs the company a measly amount extra considering they pay developers good salaries to begin with.

The only thing I ever remember having to use the turbo button for was to make our Packard Bell 486DX2/66 run slowly enough for Tank Wars (http://en.wikipedia.org/wiki/Tank_Wars).

The only real use I remember was in Space Quest 1, to get through the cave with boiling acid. If I hadn’t been running at 16 MHz, I’d surely have fried my pixelated Roger Wilco.

@Bill Pudim

Actually, it appears more like you commented without reading… but anyways…