New GPU suggestions, dual widescreen monitor setup.

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
I would first like to thank everyone for helping me choose a video card for my PC. Let's get the details out of the way...
Windows 7 Ultimate 64 bit
CPU - AMD Athlon 64 X2 6000+ Dual Core
4GB RAM
GPU - ATI Radeon HD 2400 Pro
Main Board - MS-7260 Chipset nVidia nForce 550
500 Watt power supply

I recently upgraded to dual widescreen monitors that I got from a friend, along with my current video card, after my old card (an nVidia 8600 GT XXX) blew some capacitors.
I am looking to upgrade my current card to something that I can play games on (mostly FPS games) while still running dual monitors. I want to keep the price under $100. Any suggestions?

P.S. I am new to dual-monitor setups. Is it best to use my current card for one monitor and get a separate card for the other, or just use one card for both? My current card came with an adapter that plugs into it and has DVI inputs, but I do not know what type of connector on the card the adapter plugs into, which is why I am confused.

Edit: I was looking at these cards but do not know whether they support dual monitors.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814261059

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150475

I am not stuck on these; I would happily get a completely different card if someone out there says that would be better.

Edit 2: Just to clear something up, I am perfectly fine playing games on just one monitor, not both; I just like having both while not gaming.
 

TheTrueGamer

Distinguished
Jul 30, 2008
101
0
18,690

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
Thank you! Is it worth the extra $10 to get the one with the helicopter-style fan? I see it is 1GB, but I was wondering if you knew whether that is worth it.

Edit: Also, I read that you would need one monitor on HDMI and the other on DVI. Would it be OK to use an HDMI-to-DVI cable for my second monitor?
 

TheTrueGamer

Distinguished
Jul 30, 2008
101
0
18,690
Compared to that wimpy GT 240 you showed me, for the same price (the Sapphire) or for $10 more (the Gigabyte) you get about 4.1x more stream processors (and more is always good), 512MB more VRAM (on the Gigabyte version), and because it's an HD 5xxx series card, it has DX11 support, so you get a little futureproofing there as well.
 

TheTrueGamer

Distinguished
Jul 30, 2008
101
0
18,690
And the extra VRAM does help, since it will be shared between the two monitors. But if you don't go over 1440x900 on either of them, 512MB will be sufficient.
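For a rough idea of what the desktops themselves actually consume out of that memory, here is a quick back-of-the-envelope calculation. This is only a minimal Python sketch; the 32-bit color depth and double buffering are assumptions, and the game's textures and render targets account for most of the rest of the VRAM.

# Rough framebuffer math for two 1440x900 desktops at 32-bit color.
BYTES_PER_PIXEL = 4  # 32-bit color

def framebuffer_mb(width, height, buffers=2):
    # Memory for one display's front/back buffers, in megabytes.
    return width * height * BYTES_PER_PIXEL * buffers / (1024 ** 2)

per_monitor = framebuffer_mb(1440, 900)  # ~9.9 MB double-buffered
print(f"{per_monitor:.1f} MB per monitor, {2 * per_monitor:.1f} MB for both")

So at those resolutions the desktop framebuffers are only a small slice of 512MB; it is the game itself, at your gaming resolution, that determines whether the extra memory on the 1GB card matters.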

Also, I specifically chose HD 5670s that have both a VGA and a DVI port, so all you need is a VGA-to-DVI adapter and you're set. :)

DVI to VGA adapter
http://www.newegg.com/Product/Product.aspx?Item=N82E16814998101&cm_re=VGA_to_DVI-_-14-998-101-_-Product


Edit: Also, if you are looking for a GT 240 vs. HD 5670 benchmark, look at this:

http://www.ultimatehardware.net/ati/sapphire_radeon_hd_5670_page3.htm

The HD 5670 beat the GT 240 in every game they were pitted against, even in games that traditionally favor Nvidia hardware (i.e., Crysis and Far Cry).
 

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
Sorry for all the questions, but would you choose the XFX over the GIGABYTE? The GIGABYTE has 1GB of memory and the XFX 512MB.
I do agree XFX has the best support around; they were going to replace my blown card for free, but I did not have the original fan anymore.
 

TheTrueGamer

Distinguished
Jul 30, 2008
101
0
18,690
I've just had some "quality" troubles with Gigabyte motherboards; I've gone through quite a few of their boards, so you can consider me biased. On my old gaming rig, I had three XFX 9800 GTXs running strong for ~3 years after I overclocked them myself, without a single hiccup.

However, I suppose the Gigabyte could be better for your situation since you are planning to run dual monitors. You can just ignore my XFX recommendation; I'm just biased. :p
 

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
Thanks for the help. I think I am going to go with the GIGABYTE since it has more memory. The HD 4850 is around or above $100 on Newegg, and the GIGABYTE card looks better for my setup IMO, unless I am missing something...
 
Every current graphics card that I can think of will support two monitors. Some will have different types of ports, so you may need an adapter for one.

That said, if you have two cards, I would use them both.

Put your gaming monitor on the stronger of the two cards. That way the entire set of video resources will be dedicated to the monitor that needs it the most.

 

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
"That said, if you have two cards, I would use them both.

Put your gaming monitor on the stronger of the two cards. That way ehe entire set of video resources will be dedicated to the monitor that needs it the most."



Thank you for the reply. I just was not sure how the system would handle having a different card for each monitor; will it act the same as when both are connected to one card? Right now I am using one card and a program called UltraMon that helps with dual-monitor setups.
 


It works just fine; I have done it that way a couple of times before. Under Windows 7 you can even mix an Nvidia card and an ATI card, although you will need two different drivers. No special software is required.

I also found that if the two monitors are identical, you get a better experience when dragging images from one monitor to the other. The colors stay consistent, and the size does not change.
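If you want to double-check that Windows sees both monitors as one extended desktop regardless of which card each is plugged into, here is a quick check. It is just an illustrative Python sketch using the standard Win32 system metrics, not something the setup actually requires.

# Sanity check of a Windows multi-monitor setup via Win32 system metrics.
import ctypes

user32 = ctypes.windll.user32
SM_CXVIRTUALSCREEN = 78  # width of the combined desktop
SM_CYVIRTUALSCREEN = 79  # height of the combined desktop
SM_CMONITORS = 80        # number of attached display monitors

monitors = user32.GetSystemMetrics(SM_CMONITORS)
width = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
height = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)
print(f"{monitors} monitor(s), combined desktop {width}x{height}")

With two 1440x900 monitors side by side, that should report 2 monitors and a 2880x900 desktop whether the displays are on one card or two.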
 

deadpool405

Distinguished
Jan 6, 2011
27
0
18,530
Alright, thanks again. The monitors are the exact same model; it is just frustrating trying to get them to match in color and brightness, lol. Looks like I will get the GIGABYTE card and use my current card for the secondary monitor, and then in the future I can get another GIGABYTE card and CrossFire those bad boys.