At the NVIDIA GeForce 7800 launch event today, one of the rendering technology highlights was high dynamic range lighting. Almost all video cards in use today are limited to 32-bit color values-- that's 8 bits each for red, green, and blue, with the remaining 8 bits typically thrown away. 24 bits is enough to represent most of the colors the human eye can see. But those 8 bits per color also represent intensity. That means the brightest white, 255, 255, 255, is only 256 times brighter than the blackest black. This vastly underrepresents both the dynamic range of light in the real world (10^12 to 1) and the dynamic range of the human eye (1000 to 1).
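The arithmetic behind that limitation is simple enough to sketch. This is just an illustration of the ratios mentioned above, not anything from an actual graphics driver:

```python
# Why 8 bits per channel limits dynamic range: the brightest
# representable white is only a few hundred times brighter than
# the darkest non-black value, versus ~10^12:1 in the real world.

BITS_PER_CHANNEL = 8
levels = 2 ** BITS_PER_CHANNEL        # 256 discrete intensity levels
max_ratio = (levels - 1) / 1          # brightest (255) vs darkest non-zero (1)

real_world_ratio = 10 ** 12           # dynamic range of real-world light

print(levels)                         # 256
print(max_ratio)                      # 255.0
print(real_world_ratio // int(max_ratio))  # how far short we fall
```

Even generously counting 256:1, the displayable range is roughly nine orders of magnitude short of real-world lighting.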
This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2005/06/high-dynamic-range-lighting.html
In terms of “extra bits”, e.g. 32 − 24 = 8 bits extra, they’re often used as an alpha value, which then also has a range of 0-255.
So I’d guess in the 64-bit case the leftover 16 bits will also be used as an alpha channel, now with a range of 0-65535.
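To make the packing concrete, here’s my own illustration (not from any driver or API) of how four 8-bit channels fit into a 32-bit pixel, with the “extra” byte serving as alpha, and how the same layout scales to 16 bits per channel:

```python
# Packing R, G, B, A into a single pixel value. In the 32-bit case
# each channel gets 8 bits (0-255); in a 64-bit format each channel
# gets 16 bits, so alpha's range grows to 0-65535.

def pack_rgba8(r, g, b, a):
    """Pack four 8-bit channels (0-255 each) into one 32-bit value."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba8(pixel):
    """Recover (r, g, b, a) from a packed 32-bit pixel."""
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF,
            pixel & 0xFF, (pixel >> 24) & 0xFF)

def pack_rgba16(r, g, b, a):
    """Same idea with 16 bits per channel (0-65535 each), 64 bits total."""
    return (a << 48) | (r << 32) | (g << 16) | b

px = pack_rgba8(200, 100, 50, 255)
print(unpack_rgba8(px))  # (200, 100, 50, 255)
```

The actual channel ordering varies by format (ARGB, BGRA, etc.), but the principle is the same: the alpha channel is just the leftover bits, however wide they happen to be.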
It’s possible they slap some other framebuffer-related data into those extra bits, but I can’t find any definitive source on this. Brian Hook (formerly of id Software) made a reference to a “destination alpha channel” which is “sometimes used for some special effects”.
And John Carmack said “However, only the Riva TNT has a destination alpha channel (what the other 8-bits in a 32-bit framebuffer take up) that I’m aware of.”
Still, I can’t find any indication of what you would actually USE that data for… wait, scratch that, here’s an ATI paper that documents how to use the destination alpha channel to do a depth of field effect.
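Roughly the idea, as I understand it (this is my own sketch, not ATI’s actual shader): store a per-pixel blurriness factor in destination alpha, then use it to blend a blurred copy of the frame over the sharp one:

```python
# Depth-of-field via destination alpha, in spirit: alpha holds how
# out-of-focus each pixel is (0 = in focus, 255 = fully blurred),
# and drives a per-pixel blend between the sharp and blurred frames.

def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def depth_of_field(sharp, blurred, alpha):
    """Blend two rows of pixel intensities using alpha as blur amount."""
    return [lerp(s, b, a / 255.0) for s, b, a in zip(sharp, blurred, alpha)]

row_sharp   = [100, 200, 50]
row_blurred = [120, 180, 90]
row_alpha   = [0, 255, 128]   # in focus, fully blurred, half blurred
print(depth_of_field(row_sharp, row_blurred, row_alpha))
```

So 8 bits is enough here because alpha is only a blend weight, not a depth value: 256 gradations of blurriness is plenty.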
You just randomly stuff data in there for whatever purpose you can dream up, I guess. LOL. I’m surprised 8 bits is enough precision for this purpose…
Ah, here we go: John Carmack stumping for more than 32 bits per pixel.
HDR as applied to digital photographs. Amazing results, and exactly the same theory as HDR in 3D rendering.