Yes, 64-bit is pretty much mandatory once you start talking about video cards with 1 GB of RAM; on a 32-bit OS, the card’s memory gets mapped into the same 4 GB address space as everything else, eating into what’s left for system RAM.
64-bit Vista is extremely mature, and I’ve had no bitness issues whatsoever, as I remarked here:
Ah, you use these cards for graphics?
You’ve obviously not seen the uses the cryptography people put them to. I don’t do either - I stopped gaming years ago because my time started to disappear down holes I didn’t want it to - used to play stuff like Civ, which doesn’t need the lunatic graphics anyway.
Like I say - I’m sure there’s other stuff you can do with all that massively parallel processing power.
D’oh
Map reduce? Using racks of these cards instead of racks of blades?
Hmmm
Clever enough to think of it (ha!) but don’t ask me how you’d even begin to make it work
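Just to make the idea concrete with today’s tooling, here’s a minimal sketch in Python of a map and a reduce both running on the GPU. It assumes the CuPy library and an NVIDIA card with CUDA; the squaring stands in for the map step and the sum for the reduce step:

```python
import cupy as cp  # GPU array library; assumes an NVIDIA card with CUDA

# "Map": square a million values in parallel on the GPU.
data = cp.arange(1_000_000, dtype=cp.float32)
mapped = cp.square(data)

# "Reduce": sum the mapped values, also on the GPU.
total = mapped.sum()

# Only the single scalar result gets copied back to the CPU.
print(float(total))
```

Real map-reduce frameworks add scheduling, shuffles, and fault tolerance on top, but the basic map-then-reduce shape fits GPU hardware surprisingly well.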
I bought a GeForce 4 for €500 … and 2x GeForce 6800 for about €800. But then along came the Xbox 360, and for the same money I got two 360s with some change left over for a decent video card for my PC.
I used to love playing the modern games silky smooth with my new setup. But is it worth all the money? In hindsight I would say no. The GeForce 4 is now in a machine that can barely be used for casual gaming. Of the two 6800s, one is defective; the other is in a third-string machine mainly used for watching videos and the occasional 4-year-old game.
I must admit … I’ve become reasonable. And I don’t know why. Gotta be the age. Gotta be. I’m 33 and I’m becoming sensible … already? Noooo, I don’t want to!!! :}
I’m just pissed that as fast as the tech advances, you only get to be cool for about 6 months before something way better blows your stuff out of the water. In a way it’s great, but in another it means you’d better have some serious cash to lay out to stay in ‘the game’ (wonder how many results that would turn up? Wanders off to go see…)
nice
And what about watts? ^^
Do you have a three-monitor-setup, and if so what do you use to hook up that third screen?
I was originally going to get a 2-monitor setup, but I definitely think I’ll go for 3 now.
I was a little put off by all this SLI rubbish, but after finding out it’s not compulsory, 3 monitors and 2 cards is the way to go (unless you can run one from onboard graphics).
I’m not an addict, and I pretty much insist on passive cooling, but I just upgraded from an 8600 GT to a 9600 GT for a cool $150 (Canadian) and was amazed at the difference in performance. Probably gets about half the FPS that yours does at the same perf settings, but nevertheless, I didn’t even realize how sluggish some games were until I tried them on the new card.
I wish you hadn’t posted that comment about Fallout being Oblivion with guns, because that makes me want to try it and I’ve already wasted enough productivity cycles at home.
weak sauce.
Interesting blog, but I wonder: which Fallout artwork was on the invitation?
Dammit Jeff…I had just about convinced myself my 7900GT is fine.
Although I have been noticing that frame rates on newer games are starting to approach unplayable at higher resolutions.
Just double checking you’re running Vista x64 drivers with no problems?
I’m curious to know if you have beaten Fallout 3 yet?
If you have, then what are your thoughts?
I beat the main story and am a little disappointed.
However, the game itself, especially the environments (towns, wasteland, etc.), was incredible.
For Aaron,
Fallout 3 is pretty incredible. So was Oblivion, on which I wasted over 650 hours. I’ve done Fallout 3 twice, since it does have an end (without the expansions). I’ve yet to load the new expansions and play again, probably with mods this time.
On a note related to the video cards, I started playing with a GTX 8800/320 MB, but noticed that outdoors, if I tried moving too fast, usually by spinning around quickly, the graphics would stutter and lag. I upgraded to a Radeon 4850/1 GB and the problems went away.
Just noticed this blog post here now.
While this is old, there is a bit of truth that can be gleaned from my input. When you’re tweaking your cards, don’t forget the rotational speed of your fan.
You’d think that by running it at 85 or 100% you’d get the maximum cooling effect when you have the best thermal paste selected. You would be wrong.
Use the command line to step the fan speed through each percentage, let it run for a minute at each step, and record the samples by piping them into a file to view later. What I’ve found is that about 97% of max rotational speed seems to be the best a cooler can actually handle; going to the maximum of 100% just wastes power and is a little bit louder.
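In case anyone wants to reproduce that sweep, here’s a rough Python sketch of the idea. The nvidia-settings attribute names and the Coolbits requirement are assumptions that vary by driver version and OS (this is Linux-flavored), so treat it as an outline rather than a recipe:

```python
import subprocess
import time

LOG = "fan_sweep.csv"  # hypothetical output file for later review

def set_fan_speed(percent):
    # Assumes Linux with the proprietary NVIDIA driver and Coolbits enabled;
    # these attribute names differ across driver versions.
    subprocess.run(["nvidia-settings",
                    "-a", "[gpu:0]/GPUFanControlState=1",
                    "-a", f"[fan:0]/GPUTargetFanSpeed={percent}"],
                   check=True)

def read_temperature():
    # nvidia-smi reports the GPU core temperature in degrees C.
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

with open(LOG, "w") as log:
    log.write("fan_percent,temp_c\n")
    for pct in range(50, 101, 5):  # sweep 50%..100% in 5% steps
        set_fan_speed(pct)
        time.sleep(60)             # let the temperature settle for a minute
        log.write(f"{pct},{read_temperature()}\n")
```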
Also, the type of heat sink graphics cards use isn’t really ideal. The underside that makes contact with the top of the chip isn’t maximally flat. This is especially true with the newer R9 series from AMD: they have this tiny little diamond/square that transfers all that heat from the die to the heat sink, and that’s a bit of a problem.
I know this idea will be hotly contested (pun intended) but having a maximally flat undersurface is just the beginning, the top of the graphics chips also need to be done the same way.
But wait… There’s more!
Now, what will really throw you is creating a recess in the dead center of the heat sink (assuming it’s for one chip only) that fits that square/diamond really snugly when the heat sink and chip are cool to the touch (not running). The recess must be machined to within 10 micrometers of tolerance, and the inside of the recess must be maximally flat on all sides, including the top.
When you apply that thin layer of thermal paste on top of the chip, it must cover the sides of the thermal contact on the chip too. When this is done correctly and the heat sink is put in place with even mounting tension, you’ve increased the surface area of the thermal contact, which makes your heat sink and paste more efficient.
Make sure you have at least a quarter of a centimeter of clearance above the chip itself, as some chips have passive components on top too, like resistors or capacitors. Metal expands at different rates as it heats up, depending on the metals involved. The metal on the GPU will expand slightly faster than the heat sink will, but it won’t make the heat sink pop up; it will contact the sides and start transferring heat faster than if the heat sink were just sitting on top of it.
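To put some rough numbers on that, here’s a quick back-of-the-envelope sketch in Python using the standard linear expansion formula (ΔL = α · L · ΔT); the coefficients are approximate textbook values, and the dimensions are made up for illustration:

```python
# Linear thermal expansion: delta_L = alpha * L * delta_T
# Coefficients (per kelvin) are approximate textbook values.
ALPHA = {"copper (heat sink)": 16.5e-6,
         "aluminum (heat sink)": 23e-6,
         "silicon (GPU die)": 2.6e-6}

L = 10e-3   # 10 mm contact patch (illustrative size)
DT = 60.0   # warming from ~25 C idle to ~85 C under load

for material, alpha in ALPHA.items():
    growth_um = alpha * L * DT * 1e6  # expansion in micrometers
    print(f"{material:22s} grows ~{growth_um:4.1f} um over {DT:.0f} K")
```

Over a 10 mm patch and a 60 K swing, copper grows about 10 µm while silicon grows under 2 µm, so the mismatch between the parts lands right at the scale of the 10-micrometer tolerance mentioned above.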