Multiple Video Cards

I know the third card doesn't do much for the frames, but it does something else: it rids you of the horrible micro-stutter.

Keep in mind that frames per second, while a good general measure of performance, is not always the best metric, especially when using multiple video cards. http://techreport.com/articles.x/21516
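The gist of that article is that average fps can hide stutter, so looking at worst-case frame times tells you more. A minimal sketch of the idea, with made-up numbers (none of these are measurements from the article):

```python
# Hypothetical frame times (ms) for two setups with near-identical average
# fps but very different smoothness. Numbers invented for illustration.
smooth = [16.7] * 100                 # steady ~60 fps
stutter = [12.0] * 95 + [110.0] * 5   # similar average fps, big spikes

def avg_fps(frame_times_ms):
    """Average fps, the usual headline number."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_frame_ms(frame_times_ms, pct=99):
    """Frame time at the given percentile -- what stutter actually feels like."""
    ordered = sorted(frame_times_ms)
    return ordered[int(len(ordered) * pct / 100) - 1]

print(avg_fps(smooth), worst_frame_ms(smooth))    # ~59.9 fps, 16.7 ms
print(avg_fps(stutter), worst_frame_ms(stutter))  # ~59.2 fps, 110.0 ms
```

Both setups report roughly "60 fps", but the second one hitches hard every couple of seconds, which is exactly what an fps counter won't show you.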

Hold on, the lowest possible config on PC requires an 8800GT? In other words, if I haven’t upgraded within the past few years, I won’t actually be able to play the game? Unlike my console brethren, who’ll be able to run it just fine on ancient hardware?

This is why anti-consolization people drive me nuts. I don’t want to spend hundreds of bucks on upgrades, so why should I have a lesser experience on PC than on consoles, even though I have substantially better hardware?

Slightly off topic, but I just found this:

http://i.joystiq.com/2011/10/18/heres-the-real-way-to-play-battlefield-3/

I wonder what fps they can achieve?

Thanks for the feedback. I’ll stick with the 6870 for a year or so then move to the next gen.

I am sorry, but I find this whole concept ridiculous.

First of all, I’ve been a gamer for the last 22 years of my life, starting with an ancient Amstrad 6128, and a software developer for the last 15. I could always find time to play games; sometimes I would even lock myself away for 1-2 days to complete a game so that I could get back to my work (I co-own a media & software company). I am not really into multiplayer, since I prefer a solid story and gameplay over a clan. Even so, I like being able to play a game with maximum details at my monitor’s resolution, and as such I usually buy at least decent video cards (such as the Radeon X800, the nVidia 8800GTX, and the Radeon HD 6850 a few months ago).

What I cannot fathom is the need to increase my electricity bill and my computer’s noise to ridiculous extents by buying two power-hungry video cards just to get slightly better visuals in a single game. I often wander around the net and see people who will always attempt to install and overclock the very best in their computers, usually on overclocker forums - users who in the vast majority of cases have absolutely no need for such watt-hour burners. The only “extreme” need I had was a third monitor, and that had a real and valuable ROI.

The main problem with the dual video card approach is that it allows game companies to stay inefficient. I could buy an HD 6970, which is an immense powerhouse of a card (similar to what the 580 is), and it still wouldn’t be enough according to Battlefield’s developers! F*** that noise; I’d rather you sit down and make your engine more efficient, like Rage, Burnout: Paradise City and Crysis (2) did, long before Battlefield.

That’s only half of the story. The second issue I have is with the gamers who buy into the super-hardware trend.
Honestly, do you really care if you are “only” able to achieve 2x or even 4x antialiasing on your computer, instead of edge-detect 8x? Will it make such a difference if the shadowmap used is of 8192^2 resolution instead of 16384^2? Will you really notice x64 tessellation?
After a point, cost brings diminishing returns (to a ridiculous extent). Nor do you have to go very far to achieve 80% of the maximum graphical experience. Beyond that, you’ll probably never even notice the remaining 20% of a game’s graphic potential in a game like Battlefield. I would go as far as to say that this is a con we willingly submit to, not based on any realistic need, even if you see yourself as a gamer.
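To put a number on how steep that last stretch is: a square shadowmap’s memory cost grows with the square of its side. A quick sketch, assuming 4 bytes per texel (a 32-bit depth format - formats vary by engine, so this is illustrative only):

```python
# Memory footprint of a square shadow map, assuming 4 bytes per texel.
def shadow_map_mib(side_texels, bytes_per_texel=4):
    """Size in MiB of a side_texels x side_texels shadow map."""
    return side_texels * side_texels * bytes_per_texel / (1024 ** 2)

for side in (2048, 4096, 8192, 16384):
    print(f"{side}^2 -> {shadow_map_mib(side):.0f} MiB")
# Doubling the side quadruples the memory: 16, 64, 256, 1024 MiB.
```

Going from 8192^2 to 16384^2 costs four times the memory (and comparable fill-rate) for a difference most people would struggle to spot in motion.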

I had a few more things to say about the whole commando mentality some gamers develop (while lucky enough never to have had to join an actual army), but I digress. Cheers.

Wow, your cabinet is very well laid out. No clutter of wires inside, and you have also managed to avoid the extra dirt that accumulates inside a cabinet over time.

What is that material (grey coloured - is that foam?) that you have put in the blank spaces of your cabinet? I think that is doing the trick of saving your cabinet from dust accumulation. How did you manage the cables inside your cabinet?

Please shed some light on that too.

Thanks

Note that the plain single 6970 still puts out over 40 fps on that scale.

Faster refresh than television, and this is on a 27" monster monitor (2560x1600, à la Dell or Apple) at maximum quality.

As you sort-of state, this is all a complete waste of time.

But then, people who can buy a $900-1000 monitor for gaming can afford two high end video cards for marginal improvement, I guess…

(I have a 6950 at home, and I cannot get it to be “slow” in maximum quality on any game I’ve tried on a 1080p monitor.)

What I cannot fathom is the need to increase my electricity bill and my computer’s noise to ridiculous extents by buying two power-hungry video cards just to get slightly better visuals in a single game.

BF3 is a truly next-generation engine, and real-time visuals like that simply require more horsepower than older engines.

Video quality is a bit sketchy, but these tech talks cover a lot of it:

http://www.youtube.com/watch?v=vuhEQsAhUjo
http://www.youtube.com/watch?v=O9rqk2kL7zI
http://www.youtube.com/watch?v=6ekktuRD5ao

It’s amazing stuff.

(I agree that 4x AA isn’t always necessary, but higher resolutions – ideally the native resolution of your LCD panel – do look significantly better.)

I’d also add that 60fps is kind of a major quality of life improvement for fast-paced multiplayer. 30fps isn’t enough IMHO; 60fps is the target we should be shooting for.
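Put another way, the fps target sets the time budget the engine has to produce each frame. A trivial sketch of the arithmetic:

```python
# Per-frame time budget at common fps targets: doubling the target fps
# halves the time the engine has to render each frame.
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given fps target."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```

That 33ms vs 17ms gap is also roughly the difference in worst-case input-to-display lag, which is why 60fps feels so much more responsive in fast-paced multiplayer.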

Jeff, no argument on 30 vs 60fps; the raw numbers don’t even come close to conveying how much smoother 60fps feels.

My argument is on the need to max out the graphics performance for the actual returns. You are paying $60 for the game plus $(triple digit number here) for the VGA plus the increase in the electricity cost. Is a game really worth all this?

True, you may not want to wait 2 years to get better hardware and be able to play Battlefield 3 smoothly, by which point everyone will have jumped ship to Battlefield 4 or Call of Duty: Space Warfare or whatever - but in those two years your two VGAs will have become obsolete and will have to be replaced. Let’s have a closer look:

I paid a rather measly €150 (about $200) for my 6850. Your 2x 5870s cost about $620 (about €450) according to the prices you’ve posted. If by saving $300 I have to stay content with Medium graphics, then by all means, so be it. And if we are talking about a single player experience that offers replayability, I could revisit the “powerhouse” later with a single Radeon HD8870 or nVidia 770, which will probably use less electricity too; I doubt any game can become ugly in that amount of time.

It’s the same story with the 24GB of RAM posted a while ago. Sure, it’s nice and cheap, but you’ll have to throw the DIMMs away when Sandy Bridge’s successor arrives. Unless you have a real need for that much memory (by real, I mean something that makes you money and absolutely cannot be done efficiently with less memory), you’ve effectively wasted your money.

PS. To be honest, I have experienced a difference between 2x and 4x AA - it’s actually noticeable. Still, even 2x looks better than no AA, so I’ll try to at least achieve that.
Regarding the native resolution: if I have to reduce my resolution for performance reasons, I usually exit full screen mode and play windowed. My main display is a 27" one, so the game looks big enough even at a 1440x900 window resolution, plus Alt-Tabbing works much more robustly. :)

Jeff, have you considered aftermarket fans/heatsinks for your GPUs? I can’t tell if they already have them or not; your image looks like it has heat pipes sticking out, but I’m not sure if those were part of the stock 5870.

I have a single 6950 with its BIOS flashed to a 6970; it was just too noisy with the small stock fans, though. I added a Thermalright Shaman heatsink and fan (a 14 cm fan) and now it’s virtually silent, especially as I reduce the PWM fan speed, and it still runs cooler. Admittedly this takes up a third PCI slot, but the difference is incredible compared to stock. Not sure if you could fit two such cards in, though, even blocking all your PCI/PCI Express slots.

I think we need better motherboard and case designs for GPUs; to be honest, we shouldn’t have to block expansion slots so often.

Tried gaming in stereo 3D? It’s awesome!

I’d rather you sit down and make your engine more efficient, like Rage, Burnout: Paradise City and Crysis (2) did, long before Battlefield.

Rage is your example of an optimized game? There are lots of reported issues even on consoles, not to mention on PC. It’s basically unplayable on my HD6870 + E8400 system, which can handle Crysis 2 at ultra-high settings - and Crysis isn’t a good example of an optimized game either. The Battlefield 3 beta performed better for me than Crysis 2 did.

I just bought a Radeon 6870 and that low 37.3 benchmark score makes me think I should take it back and get something else.

This was exactly my situation and train of thought, until I realized that the benchmark was for a resolution larger than 1080p and used 4x AA.

“Rage is your example of an optimized game? There are lots of reported issues even on consoles, not to mention on PC.”

Less so once patched and with the latest ATI drivers.

I spoke about optimization, not QA. True, Rage has serious bugs with AMD VGAs, which is inexcusable, but on the other hand, after installing the update and their special driver I got excellent performance with the textures locked at 8192. The game didn’t look as great as it was made out to be, but it was smooth at what its developers perceived as high quality settings.

Crysis 2 had a perfect framerate at high settings; it only got really bad later in the game, when you reach a rainy stage. Plus it was a “meh” of a game too.

In any case, I got hold of BF3 and tried it out. It behaved perfectly at Ultra settings with AA disabled (a steady 30-60fps), and while it wasn’t as smooth as at Low settings, now more than ever I can’t justify the cost of a second video card. In other words, DICE actually did a good job with its engine, but employed lots of scare tactics (perhaps to push people into buying expensive hardware). Keep in mind that I played the game at 1920x1200.

PS. The graphics of Rage are a joke compared to Frostbite 2’s. It’s difficult for me to believe that a genius like Carmack would produce something that lacks so many modern features (such as dynamic lighting), especially since these were implemented in older versions of his engine.

I bet that foam keeps your CPU and other components nice, snug and warm.