Is It Time for 64-bit on the Desktop?

About the performance issues…

I work at AMD (where the 64-bit extensions to x86 were invented) in the group that, among other things, does liaison with the compiler vendors, working with them on code generation, etc., so I have some real basis for what I’m saying…

Your average large C/C++ program will probably run 10%-15% faster just from recompiling it for 64-bit, and the reason is almost entirely the fact that there are twice as many registers in the ISA, so the compiler can keep more values in registers and eliminate spills to memory.
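A tiny illustration of the register point (my own sketch, not anything specific to AMD’s toolchain): compiled as 32-bit x86 (e.g. g++ -O2 -m32), a loop like this usually has to spill some of its live values to the stack, while the same source compiled as x86-64 (g++ -O2 -m64) can keep them all in registers thanks to the extra r8-r15.

```cpp
#include <cstddef>

// Four accumulators, two pointers, an index and a bound are all live at
// once -- more than the 8 general-purpose registers of 32-bit x86 can
// comfortably hold, but easy for the 16 registers of x86-64.
int dot4(const int* a, const int* b, std::size_t n) {
    int s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (std::size_t i = 0; i + 4 <= n; i += 4) {
        s0 += a[i + 0] * b[i + 0];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    return s0 + s1 + s2 + s3;
}
```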

A second advantage is that where 32-bit has Lord knows how many different calling conventions (__stdcall, __cdecl, FORTRAN, PASCAL, WINAPI, …), 64-bit has only one calling convention, and it passes most parameter values in registers, not on the stack.
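A small sketch of that contrast (mine, hypothetical declarations only): on 32-bit Windows the annotations below actually change how arguments are passed and who cleans up the stack; under the single x64 convention they are accepted and ignored, and the first few integer/pointer arguments arrive in registers (RCX, RDX, R8, R9 on Windows; RDI, RSI, RDX, RCX, R8, R9 in the System V ABI).

```cpp
#ifdef _M_IX86
// 32-bit MSVC: the calling convention is part of the function's type.
int __cdecl   add_cdecl(int a, int b)   { return a + b; } // args on stack, caller cleans up
int __stdcall add_stdcall(int a, int b) { return a + b; } // args on stack, callee cleans up
#else
// x64 build: one convention; 'a' and 'b' arrive in registers, not on the stack.
int add(int a, int b) { return a + b; }
#endif
```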

Sure, your mileage will vary, and there are some pathological cases where performance actually decreases, but on average it’s a modest win, about 1 processor speed bin.

Of course, you don’t notice much difference in most desktop apps, since most desktop apps are not aggressively optimized for speed. A 3GHz processor used to surf the web is idle most of the time waiting for your page to download, so who would notice the difference?

As others have pointed out, only pointers and size_t are expanded to 64-bit; int, long, float, double, short, and char are still the same size they were for 32-bit builds, so the program .exe size and data space requirements do increase, but only modestly. Your data structures don’t double in size, unless they’re all pointers and size_t.
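A quick check of that point (a sketch assuming the Windows LLP64 model the comment describes; on Linux/macOS LP64, `long` widens to 8 bytes as well):

```cpp
#include <cstdio>

int main() {
    std::printf("int    %zu\n", sizeof(int));     // 4 on both 32- and 64-bit builds
    std::printf("long   %zu\n", sizeof(long));    // 4 on Win32 and Win64 (LLP64)
    std::printf("double %zu\n", sizeof(double));  // 8 on both
    std::printf("void*  %zu\n", sizeof(void*));   // 4 -> 8 when built 64-bit
    std::printf("size_t %zu\n", sizeof(size_t));  // 4 -> 8 when built 64-bit
    return 0;
}
```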

You could argue that the switch from ASCII to Unicode, by doubling the size of all the text strings, probably did a lot more to increase program size and memory footprint, but nobody seems to be too upset by that.

Jeff,

One of the compelling benefits of 64-bit Windows I didn’t see you mention is security. Jeff Jones calls out the following three security benefits unique to 64-bit Vista [1]:

  • Hardware NX protection on globally by default.
  • Kernel Patch Protection aka Patchguard.
  • Mandatory Kernel Module and Driver Signing.

Basically, the introduction of a new CPU architecture with no backward compatibility restrictions gave the OS folks some liberties to do things more securely than they could with 32-bit and its legacy drivers/programs/etc…

[1]: http://blogs.technet.com/security/archive/2006/08/03/windows-vista-x64-security-pt-1.aspx

There are 64 squares on a chess board and bit-twiddling chess game writers have already figured out how to exploit 64-bit hardware. Hopefully the bit transitions will continue until there are enough bits to represent a 4x4 matrix of adequate precision for my preferred set of games.
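For readers curious what that bit-twiddling looks like, here is a minimal bitboard sketch (my own illustration, not taken from any particular engine): the whole board fits in one 64-bit word, one bit per square, so operations on whole piece sets become single shifts and ANDs.

```cpp
#include <cstdint>

using Bitboard = std::uint64_t;

// One bit per square; square index = rank * 8 + file, both in 0..7.
constexpr Bitboard square(int file, int rank) {
    return Bitboard{1} << (rank * 8 + file);
}

// All squares where a white pawn could capture a black piece: shift the
// pawn set one rank forward and one file left/right, masking off the
// edge files so the shift doesn't wrap around the board.
Bitboard pawn_attacks(Bitboard white_pawns, Bitboard black_pieces) {
    constexpr Bitboard not_a_file = 0xFEFEFEFEFEFEFEFEULL;
    constexpr Bitboard not_h_file = 0x7F7F7F7F7F7F7F7FULL;
    Bitboard left  = (white_pawns & not_a_file) << 7;
    Bitboard right = (white_pawns & not_h_file) << 9;
    return (left | right) & black_pieces;
}
```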

(Not related to this article) Do you have a list of favorite Blogs?

And the reason why this is possible (and I didn’t know this) is that Mac OS X does not map the kernel into the user address space. An app gets the entire 4GB of available address space (with quite a bit of it reserved for Apple and third-party libraries, though).

@benb:
Your assertion about true 64-bit chips is false. You are correct if writing about Intel. You are not, however, when writing about AMD.

I am usually referring to the AMD Programmer’s Manual because I have that one in print, so I am even more certain that what I say applies to their chips than to Intel’s.
(I assume you mean my statement that x86_64 has only 48 virtual / 40 physical address bits. AMD indicates that the former should be considered an architectural limit (though it would be easy to raise), while the latter is only an implementation-specific limitation.)
In case you wonder where the 40-bit limit probably comes from: that is the number of address bits that fit in a standard 64-bit HyperTransport control packet; 64-bit physical addresses would need the extended 96-bit format.
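Just to put numbers on those limits (my arithmetic, not part of the comment): 48 virtual address bits cover 256 TiB of address space, and 40 physical address bits cover 1 TiB of RAM.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    std::uint64_t virt = std::uint64_t{1} << 48;  // 48-bit virtual: 256 TiB
    std::uint64_t phys = std::uint64_t{1} << 40;  // 40-bit physical: 1 TiB
    std::printf("virtual:  %llu bytes = %llu TiB\n",
                (unsigned long long)virt, (unsigned long long)(virt >> 40));
    std::printf("physical: %llu bytes = %llu TiB\n",
                (unsigned long long)phys, (unsigned long long)(phys >> 40));
    return 0;
}
```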

Hey Jeff, I just wanted to let you know that Beryl is no more; it’s merged into http://compiz-fusion.org/ . Also, you don’t really need a midrange card to run it. I am currently driving a 1920x1080 monitor with the onboard Intel chipset in a Mac mini with no problems. A coworker is also using a 6-year-old Dell laptop with the onboard Intel chipset and running Compiz just fine.

I’m not certain that the underlying assumptions about 128-bit memory hold; address space need not equal available physical space (we already see this in the 32-bit world with virtual memory).

Many areas of CS, for instance, must deal with convoluted algorithms to fold sparse arrays back into a finite contiguous memory space. This is basically memory management done in software (pretty much the least processor-efficient way of doing it); such arcana (and the accompanying bloat, testing, and optimization) could be done away with by letting us address a very large memory space directly and having fast, dedicated hardware map it back onto physical memory locations. I hope such desirable goodies will not have to wait for the next generation to reach adulthood.
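In fact, 64-bit virtual memory already gets you part of the way there. A sketch of the idea (Linux-specific, and subject to the system’s overcommit settings; the sizes and names are my own): reserve a huge, sparse virtual range and let the OS back only the pages you actually touch, instead of folding the sparse array into a small contiguous buffer by hand.

```cpp
#include <sys/mman.h>
#include <cstddef>
#include <cstdio>

int main() {
    const std::size_t len = 1ULL << 40;  // 1 TiB of address space; no RAM committed yet
    double* sparse = static_cast<double*>(
        mmap(nullptr, len, PROT_READ | PROT_WRITE,
             MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0));
    if (sparse == MAP_FAILED) { std::perror("mmap"); return 1; }

    sparse[0]            = 1.0;   // only the two pages actually touched
    sparse[123456789012] = 2.0;   // get physical memory, on first write
    std::printf("%f %f\n", sparse[0], sparse[123456789012]);

    munmap(sparse, len);
    return 0;
}
```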

And my favorite application (I’m biased, though, obviously), Paint.NET, is fully native 64-bit. We take advantage of 64 bits in a few places in order to avoid spending weeks optimizing 32-bit code: instead of futzing around with 32-bit integers, we just use 64-bit longs. This is necessary in many places where we have to sample many pixels from an image and blend them together to create a final pixel (the distortion effects do this a lot).
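Not Paint.NET’s actual code (that’s C#), just a sketch of the idea: with a 64-bit accumulator you can sum a huge number of pixel samples without the careful scaling you’d otherwise need to keep a 32-bit intermediate from overflowing.

```cpp
#include <cstddef>
#include <cstdint>

// Average `count` 8-bit samples. A 32-bit sum of values up to 255
// overflows after roughly 16.8 million samples; an int64_t accumulator
// can't overflow for any realistic image.
std::uint8_t average(const std::uint8_t* samples, std::size_t count) {
    std::int64_t sum = 0;
    for (std::size_t i = 0; i < count; ++i)
        sum += samples[i];
    return count ? static_cast<std::uint8_t>(sum / static_cast<std::int64_t>(count)) : 0;
}
```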

The memory management in Paint.NET sadly uses what in hindsight was a short-sighted design. All bitmaps are created with one big allocation instead of the tiling or paging scheme that GIMP, Photoshop, et al. use. Because of this, Paint.NET often fumbles on large images, as evidenced by the numerous “out of memory” crash logs I get in my inbox every day from 32-bit users. I was planning on 64-bit adoption gaining traction sooner, so that this wouldn’t be a problem, but I’ve been completely wrong :( Oh well, I’m hoping to do a lot of stuff in v4.0 that will make use of tiling for more than just memory management.
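A rough sketch of the tiling alternative being contrasted with the single big allocation (the details are my assumptions, not Paint.NET’s actual design): the image is a grid of fixed-size tiles allocated independently, so a large bitmap never needs one huge contiguous block.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Tile {
    static constexpr int kSize = 256;               // 256x256 pixels per tile
    std::vector<std::uint32_t> pixels =
        std::vector<std::uint32_t>(kSize * kSize);  // ~256 KB each, allocated separately
};

class TiledBitmap {
public:
    TiledBitmap(int width, int height)
        : tilesX_((width  + Tile::kSize - 1) / Tile::kSize),
          tilesY_((height + Tile::kSize - 1) / Tile::kSize),
          tiles_(static_cast<std::size_t>(tilesX_) * tilesY_) {}

    // Map a pixel coordinate to its tile, then to the offset within that tile.
    std::uint32_t& at(int x, int y) {
        Tile& t = tiles_[(y / Tile::kSize) * tilesX_ + (x / Tile::kSize)];
        return t.pixels[(y % Tile::kSize) * Tile::kSize + (x % Tile::kSize)];
    }

private:
    int tilesX_, tilesY_;
    std::vector<Tile> tiles_;
};
```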

beryl is done. compiz-fusion all the way.

you say: 8-bit 2^8 256 bits
you mean: 8-bit 2^8 256 bytes

I would not suggest upgrading to Vista at all. It is too raw, unfortunately. SP1 and time (when it comes to drivers) should fix that, but right now that OS is not good. The 64-bit version is twice as bad as 32-bit Vista. So even if you have 4GB of RAM (which is very unlikely for the usual user, just like having 3 displays :-)), think twice before you upgrade your OS to Vista and, moreover, to 64-bit Vista.
Interestingly, I just wrote about troubles I had with Vista on my blog.

The one place where Vista 64 really shines (IMO) is with SuperFetch. The OS can cache so much stuff in RAM that I find my app-launch speed is, frankly, amazing. With 3GB of addressable space, this just isn’t possible if you do any serious multitasking – you run out of memory too soon. I now find myself more frustrated because

  1. I have an iPhone, and Apple hasn’t gotten around to supporting my damned OS!
    and
  2. Most PC configurations (still!) max out at 4GB of RAM – you typically have to go FB-DIMM and Xeon to get any large amount of RAM in your machine
    and
  3. Edit and Continue isn’t supported in VS 2005 for x64 binaries. Annoying. Anyone know if this has been fixed in VS 2008?

8-bit 2^8 256 bits
typo?

It’s really a vicious circle right now. The switch from 32-bit to 64-bit is more or less tied to the switch from XP to Vista, including a new driver model and adaptation to DirectX 10 for games.

Actually, gamers are one of the groups who benefit MOST from 64-bit. Modern games can easily allocate 2 GB of memory in large multiplayer situations.

“Once applications begin to push the 2GB addressing space limitation of Win32 (something we expect to hit very soon with games) or total systems need more than 4GB of RAM, then Vista x64 in its current incarnation would be a good choice.”

The following AnandTech article documents the issue in Supreme Commander, Company of Heroes, and Lost Planet.
http://www.anandtech.com/gadgets/showdoc.aspx?i=3034

@ David W.:
Sorry, Jeff is right. The Mac OS X kernel is still 32-bit. This was mainly done for driver compatibility.

How exactly can a “32 bit” kernel allocate TERABYTES of virtual address space and receive 64 bit calls? Could you (and Jeff) enlighten us a bit here?

Actually, you’re right and I’m wrong. I’ve looked into “Mac OS X Internals” by Amit Singh, and here’s what he has to say on the topic (caveat lector, this is accurate as of Tiger, Leopard may have changed things):

… the kernel is still 32-bit. Although the kernel manages as much memory as the system can support, it does not directly address more than 4GB of physical memory concurrently. To achieve this, the kernel uses appropriately sized data structures to keep track of all memory, while itself using a 32-bit virtual address space with 32-bit pointers. Similarly, device drivers and other kernel extensions remain 32-bit.

End quote. So in fact Mac OS X is a 32-bit OS that appears 64-bit to 64-bit apps. Two other interesting tidbits I’ve found are:

  • 32-bit PPC processes can use all the 64-bit assembly instructions.
  • Memory futzing is completely hidden from 64 bit apps, so they can allocate the entire available VM address space, if they so desire.

I’ve found programming in 64-bit assembler (AMD64) a joy compared to 32-bit; plus, the more you do it, the more tricks you find, so you find yourself programming in a completely different way. I think more performance will come the longer compiler vendors have to play with 64-bit code. After all, it took quite a while for 32-bit compilers to get up to speed.
Just as you program C# in a different style from C++ for performance, it might take a while to get up to speed with 64-bit, so to speak.

David W, in what way is git “doing things the hard way”? I’m no genius but I find it pretty easy to use, and that’s coming from a Subversion background. The only thing that was difficult was adjusting my thinking to the concepts of decentralized source control; the tool itself is pretty easy.

Santana: “I seriously hope you realize that windows is a piece of sh** and start taking that into account in your blog, if not then in my opinion you are just another writer who think he knowns whats up but doesn’t have a clue.”

That rise to 5% market share’s really going to your head, isn’t it? Just wait till it’s 90% and everyone’s talking about how only losers use Linux, with all those viruses people write for it now that it’s being used by non-geeks who can’t protect their kernels.

Terry Pratchett [adapted]: “I need no code on my screen to be a geek. Nor do I need to hate Windows. What kind of creature defines itself by hate?”