Feeding My Graphics Card Addiction

Hello, my name is Jeff Atwood, and I'm an addict.

I'm addicted... to video cards.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2008/11/feeding-my-graphics-card-addiction.html

wtf? 142 watts idle still sounds like a lot to me…

Fallout 3 runs at over 30 FPS on my $150 ATI 4850 512MB with a two-year-old Core 2 Duo E6400. It’s not a good test for the graphics card. Try something like GRID (I can’t max AA on it) or the first Crysis.

I usually buy a new graphics card in the $150–200 range every 6 to 9 months. It’s cheaper, and I get to enjoy just about everything maxed out.

But I am not an addict :wink:

This right here is the main problem with PC gaming: sure, the screenshot looks amazing, and I would LOVE to play Fallout 3 looking like that. The problem is that the hype is built around images like that, when the reality is that most PCs won’t be able to handle it. And most gamers either don’t have the money or aren’t willing to spend it on a graphics card. I sometimes believe this is unfair to the average PC user.

And that’s why I think consoles are so successful: you buy your really expensive console and then you get new games at a decent quality, without having to worry that it’s too old and won’t be able to handle them. And the lifespan of a console is around 5 years, so I think it’s a good deal.

I KNOW that the PC will always have better graphics, and I also know that what I say only applies if you have a current-gen console, but I believe it to be a valid point.

Fastest? Not so fast, Jeff. Take a look at the Gainward ATI Radeon HD 4870 X2. In two tests it beats the NVIDIA GTX 280 hands down.

3DMark 06 – ATI: 14812 (1600x1200) vs NVIDIA: 12407 (1600x1200)

3DMark Vantage GPU – ATI: 9240 (1680x1050) vs NVIDIA: 7520 (1680x1050)
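For scale, those scores work out to ATI leading by roughly 19% and 23% respectively; a quick sketch of the arithmetic, using the figures quoted above:

```python
# Relative-performance deltas computed from the benchmark scores quoted above.
scores = {
    "3DMark 06":          {"ati": 14812, "nvidia": 12407},
    "3DMark Vantage GPU": {"ati": 9240,  "nvidia": 7520},
}

for test, s in scores.items():
    # Percentage lead of the ATI score over the NVIDIA score.
    delta = (s["ati"] - s["nvidia"]) / s["nvidia"] * 100
    print(f"{test}: ATI leads by {delta:.1f}%")
```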

142 watts idle still sounds like a lot to me…

Sure, but we’re talking the absolute highest end single video card. And by that metric, it’s a fantastic idle number. If you want efficiency and decent 3D performance there are better choices – notably the 9600 GT.

Take a look at the Gainward ATI Radeon HD 4870 X2

Yes, but that’s SLI on a stick – not quite the same thing. SLI doesn’t always work in every game title, doubles the power requirements, complicates multiple-monitor setups, etc.

Haha… yeah. Graphics cards are one thing that I’ve always felt to be a waste of money, but still worth investing in :wink:
I wish I had the moolah for a GTX 260, though. For now I have to satisfy myself with my mobo’s built-in Radeon HD 3200 :frowning:

~Raj

I wish I had the money for something better than a shitty GeForce 7300 :(. In Argentina video cards are ridiculously expensive; they cost nearly twice as much as in the US, so you need at least 1000 dollars for a GTX 280 and 550 dollars for a GTX 260 :(. This is why there is no SLI or Crossfire over here; in fact, owning anything better than integrated crap is a luxury for the rich. Over here a GeForce 8800 is high-end, I AM NOT KIDDING. (I don’t think it’s still high-end in the US, right? It’s two series behind.)

Alex, isn’t the strategy there to buy used video cards from American eBay sellers who are willing to ship internationally? Even factoring in crazy-expensive shipping, you should come out ahead. The incredibly weak dollar probably helps as well.

I know that, starting about 3 years ago, about half my eBay items would regularly sell internationally, if I ticked that checkbox…

(Yes, I do use eBay to sell off my old video card addictions before they become worthless. Did I ever tell you about the time I accidentally shipped the video card that was supposed to go to Paris to Chile, and vice versa? Oh, that was bad. Very, very bad.)

I still think Glide is impressive… I was replaying Unreal 1, which I got with my first-ever graphics card (a Voodoo3 3000)… I noticed instantly how assy the game looked and thought, this can’t be right, it wasn’t that bad, was it? Sure enough, it wasn’t… I found a Glide emulator and ran the game again… wicked!

I remember Glide especially because I spent ages not actually ‘playing’ the game, but standing and staring at puddles of blood with animated ripples… or mirrored-floor rooms where you could actually see yourself moving around in complete fidelity…

Undoubtedly they had some interesting tricks for doing the reflections… it’s extremely rare today to find a DirectX/OpenGL game that has true reflections… most of them go for pre-calculated cube maps, so you only really see low-fidelity world reflections.

This is why I’m somewhat interested in what ray tracing can bring to the table… there are plenty of hacks that work really well with raster rendering, but ray tracing can implicitly give you lots of other cool effects for ‘free’ due to its considerably more ‘pure’, less hacky nature.

My PC gaming has come to a halt over the last year, so I haven’t bothered with a GPU upgrade thus far. I used to be incredibly interested in the latest and greatest card, though. In Australia we pay around the $1000 mark for a new top-of-the-line card at launch. I can still remember my brand-new NVIDIA 6800 Ultra OC at $900. Worth every penny.

Powerful and quiet is probably what most of us would ideally like to have.

Personally I’m prepared to sacrifice a little bit of power for a bit more quiet, so I’m a big fan of the Gigabyte SilentPipe graphics cards, which are entirely passively cooled.

Ideally I’d like a graphics card that would be silent and happy being passively-cooled for basic Windows use, but would switch on some fan-cooling when I start to stress it with a game.

would switch on some fan-cooling when I start to stress it with a game

Almost all modern (mid-range or higher) video cards already do this – they scale fan speed to actual GPU temperatures.
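The temperature-to-fan-speed mapping is usually just a simple fan curve with linear interpolation between a few breakpoints. A hypothetical sketch of the idea (the temperatures and duty cycles below are invented for illustration; real cards set these in firmware or drivers):

```python
# Hypothetical GPU fan curve: map core temperature (°C) to fan duty cycle (%).
# The breakpoints are made up for illustration, not taken from any real card.
CURVE = [(40, 20), (60, 35), (75, 60), (85, 100)]  # (temp °C, duty %)

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate the duty cycle between adjacent curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]  # below the curve: idle at the minimum speed
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # above the curve: pin the fan at 100%

print(fan_duty(50))  # halfway between 40°C and 60°C -> 27.5% duty
```

The same shape explains the behavior people hear in practice: the fan sits near its floor at the desktop and only ramps up once a game pushes the GPU into the steeper part of the curve.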

However, the default exhaust grille is incredibly restrictive.

The grille might be so restrictive because larger openings would not meet FCC emissions requirements, and a wire mesh is expensive.

That’s what separates us from those knuckle-dragging Mac users: skill.

I’m going to studiously ignore this comment. Why you chose to include it in the article is beyond me.

Jeff, God, I hate you. I was already jonesing for a GPU upgrade, and I have absolutely no way of justifying this to my other half. You ruined my day, man. You are now required to play an extra hour of Fallout 3 for those of us unfortunate enough to have to use medium settings.

I’m Matt, and it’s been 3 years since my last video card upgrade.

Why must you tempt us with such sweet, juicy graphics power!! Now I have to find a way to justify spending more on a video card than on an entire game console system…

I remember the birth of hardware 3D very fondly. I had the first 3dfx card, which was a pass-through card: you actually ran a VGA cable from your original video card into this one, so it could take over the output when you launched an application that used it.

Then there was the explosion of games that took advantage of it. Many of them got very creative with their lighting and effects. While today’s games are certainly much more sophisticated and detailed, there was just something about that time period I think we’ve lost. Maybe it’s just the initial wonder of it making me nostalgic. I remember buying games just because they supported the 3dfx card, so I could see what they were doing with it.

I don’t really care about the graphics card stuff, but I just want to say, those are amazingly cool wedding invitations. Any woman that would go along with that is a keeper, that’s for sure.

It’s fun to juxtapose your comment above about PC/Mac users (PC users are better because of the ability to get under the hood and tweak things, to understand how things work) and your opinions about C and higher level languages (I don’t need to know how it works, I trust my language/framework designers to do it right for me).

I kid, I kid.

I remember having all of those cards too, although I started, before the wonders of Voodoo, with an S3 ViRGE, which had hardware acceleration for like two games total, and you had to be told what to look for to see the difference :wink:

One point of confusion: does that card seriously eat 160 watts at idle? My ENTIRE COMPUTER under FULL LOAD only eats 80 watts!