
Looking for a new ~£300 GPU, to run alongside my 8800GTS for 3 displays

Posted in Graphics & Displays
June 24, 2010 12:24:28 PM

Hi all,

Next Tuesday, I should be getting well paid for a recent job, and I'm in the market for a new GPU (budget around £300). Here's a summary of the things I'd like to get out of this GPU:
- A fairly future-proof gaming GPU (preferably for the next 3-4 years - my average GPU upgrade period).
- The ability to run it alongside my current 8800GTS for 3 displays - details to follow.
- Reliability (so basically, if anyone knows of problems with the newer GPUs, I'd like to avoid them if possible) - I don't plan to overclock either card.

The relevant parts of my current spec (hopefully I haven't missed anything):
Asus Maximus Formula SE X38
Intel Q6850 @3.00GHz
2x2GB OCZ 240DIMM PC2-6400 Gold RAM
XFX GeForce 8800GTS 640MB PCIE
*Antec 650W EPS PSU Black
1x1920x1080 primary and 2x1600x1200 secondary
Windows 7 Ultimate 32bit

* I realise this PSU is both underpowered for 2 cards, and only has 2x PCIE 6-pin connectors, where I'd need 3 for the 8800 and a newer card. I can borrow a power supply from a friend to test everything works ok (and possibly borrow it until I can get a new one). I guess I'm looking at something like an Antec TruePower Quattro 1000W, although any alternative suggestions would be most appreciated (assuming it'll fit ok in an Antec Titan 650 Server Case).

Anyone see any further compatibility/other problems with the proposed setup?

As for the 3-displays part, I'm not looking at spanning games over all 3 (unless it's actually possible - I'm not really bothered about this). I've got a 1920x1080 primary monitor (DVI-to-HDMI cable - it's a TV) which I'm quite happy gaming on, along with 2 1600x1200 secondaries (DVI) - only one of which I can currently use with my 8800. My plan is to let the new GPU power the primary (so I can use its full power for gaming), and the 8800 power the two secondaries - this would be very useful for my work as well as handy recreationally. Hopefully I'm not oversimplifying how this would all work, or overlooking anything vital.

On the gaming front, the games (which come to mind) that I've had to run on lowish-mid settings atm are GTA4, ARMA2 and Far Cry 2 (which several times tried to murder my 8800). I'd quite like to see these in all their glory at 1920x1080 (no small feat I realise), and be future proofed for newer titles (DX11 etc.).

Finally, the cards I've been looking at (although please feel free to suggest any alternatives provided you can detail solid reasons why ;) ):
Asus GeForce GTX 470 1280MB
XFX ATI Radeon HD 5870 1024MB

Possible problems I can think of (answers from experts/first hand experience obviously preferable to conjecture):
- Mixing ATI/Nvidia in PCIE slots (although reading around it seems Windows 7 doesn't mind, and I'm pretty sure I can get Ubuntu to be happy with it for my work).
- I'm currently using Nvidia's TwinView configuration with my 8800 (I guess this is just a span, since I can drag windows between the displays, etc.). How would this work when mixing cards and/or ATI & Nvidia drivers?
- I guess my total resolution would be 5120x1200 - is this ok with 2 cards, Windows 7, etc.?
- I've got a lot of expansion cards in my motherboard, and I'm a little worried about cooling, seeing as the GPUs are both 2-slot and so would end up taking the top 4 slots, right next to each other. I've got 4 case fans (I think they're 80mm, although I'm not 100% certain): 2 on the front, 1 on the back, and 1 on the side. I can open the case up and take a closer look if necessary.
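For the record, here's the quick sum behind that 5120x1200 figure (just my three monitors laid side by side; a Python sketch, with the layout assumed left-to-right):

```python
# Extended desktop bounding box for the three monitors side by side.
# Sizes are the ones from my setup; the left-to-right order is assumed.
monitors = [(1600, 1200), (1920, 1080), (1600, 1200)]

total_width = sum(w for w, h in monitors)  # 1600 + 1920 + 1600
max_height = max(h for w, h in monitors)   # tallest panel sets the height

print(f"{total_width}x{max_height}")  # 5120x1200
```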

I'm from the UK, and prefer to get my hardware from Overclockers/Dabs/Amazon (although Scan seem to have some really good offers, I just haven't used them before). I don't mind delaying my buy (although excitement prevails nonetheless) if there's a likely decent price drop on some of the newer gear in the next month or two.

Finally, apologies for the really long post - I'm sure you all realise now why it had to be so detailed - and many thanks in advance to anyone who actually gets through it all/can offer any advice.
June 24, 2010 1:54:50 PM

Hey, if you get the 5870, then you won't be able to keep using your old 8800GTS alongside it. However, the 5870 does sport 3 video outputs, so you'd still be able to use your 3 monitors. Furthermore, you could then group them into a single display and game across all 3 (the resolution would be set to the lowest screen's maximum - 1600x1200 - which would mean your main monitor wouldn't be running at native resolution and would look pretty bad). Your PSU should be able to run a single 5870 if it's a good-quality unit (I don't know much about it).

If, however, you go with the Nvidia option and the GTX 470, you can re-use your 8800GTS as a PhysX card. You could also use it to power your 2 extra monitors, but you won't be able to game across the displays. I think you will be able to drag open windows across the displays, but if I remember correctly, that slowed the app down drastically when I tried it with my two 8800GTs back in the day.

Another worry with the 470 is that it's a Fermi card and generates a LOT of heat. Also, you'd have to get a new PSU to run the 470 alongside your 8800GTS, so that's also a bummer.

With the 5870, you'd get about the same performance, be able to use all 3 displays, you (most likely) wouldn't need to change your PSU, you'd have an extra slot free for maybe a sound card, and you wouldn't generate a crazy amount of heat.

I like Nvidia btw, so I'd still get the 470 =D
June 24, 2010 5:42:50 PM

Thanks for the reply. I've been very happy with my Nvidia card too (save for a few driver screwups on their part), so I'd certainly like to stick with what I know (the last ATI card I had was over a decade ago). However, from what you're saying it does sound like the 5870 is the better option - plus it doesn't hurt to have the 8800GTS as a backup in case anything ever happened, especially considering this is the only PC I can actually do my work on anyway. Reading around, the Eyefinity thing is a bit of a gamble - especially considering I'm mixing 4:3 and 16:9 aspect ratios - but since I'm not all that bothered about it, I guess it's just a bonus feature to play with at my leisure.

So, I guess, like you say, it's now just down to the power supply. It came with the case, which wasn't exactly cheap. The card is supposed to draw a maximum of 188 watts, which leaves me a little over 450 watts for the rest of the machine. I did a quick calculation and I'm pretty sure this is more than enough. Still, in the half a week until I order it, if someone could give a second opinion on that, I'd be most grateful.
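For a second opinion's sake, here's my back-of-the-envelope sum (only the 188 W figure is from the card's spec; the other draws are my rough guesses for this build, so do correct them):

```python
# Rough PSU budget sketch. Only the 188 W GPU figure comes from the
# 5870's spec; the remaining numbers are ballpark guesses, not measurements.
psu_watts = 650
loads = {
    "HD 5870 (max board power)": 188,
    "quad-core CPU (guess)": 130,
    "motherboard/RAM/fans (guess)": 60,
    "drives + expansion cards (guess)": 50,
}

total = sum(loads.values())
headroom = psu_watts - total
print(f"estimated draw {total} W, headroom {headroom} W")
# estimated draw 428 W, headroom 222 W
```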

Another downside to the 5870 is that the active DP-to-DVI adapter I'd need is about £80 - still, I guess that's half what I'd be spending on a new PSU.

Guess I'll do some final reading around while I wait to be paid (and prepare to part with £410 - the pain!), but you've helped me make up my mind anyway - thanks.
July 15, 2010 3:51:41 AM

Hey, no probs, glad I could help. Let us know what you end up getting.
July 15, 2010 3:51:01 PM

I got the 2GB Sapphire 5870 Vapor-X in the end, with the active DP-to-DVI adapter (which took a couple of weeks to reach me on preorder). It works brilliantly out of the box on Windows (still gotta figure out how to get 3 displays going in Linux/CCC), plus Eyefinity actually works quite well even on my mismatched monitors. I run them all at 1360x768 (the highest commonly supported mode). I guess the fisheye effect is countered slightly by the fact that the outer monitors are squashed horizontally, but in the end it all looks rather nice. I've been playing Dirt 2 (on top DX11 settings, I might add =D) more than I should, and it's amazing with the peripheral vision. Once I can figure out how to enable bezel compensation (looks like an EDID hack will work), I'll be completely happy with this until ATI bring Eyefinity support for different size/resolution monitors (apparently it's in the works). All in all, I'm very happy indeed with my investment (the only painful thing is paying it off lol - the VAT would have hurt enough on its own).
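For anyone curious, the maths on the Eyefinity group works out like this (the bezel figure below is just a placeholder of mine - I haven't worked out the real compensation values yet):

```python
# Eyefinity group resolution with three panels at a common 1360x768 mode.
# The 3-wide layout is my setup; the per-bezel pixel correction is a
# made-up illustrative value, not a real figure from ATI's tools.
panels, mode_w, mode_h = 3, 1360, 768

group_w = panels * mode_w
print(f"group without bezel compensation: {group_w}x{mode_h}")  # 4080x768

bezel_px = 120  # hypothetical correction per bezel gap, in pixels
group_w_bc = group_w + (panels - 1) * bezel_px
print(f"with bezel compensation: {group_w_bc}x{mode_h}")  # 4320x768
```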