Nvidia & AMD Cards in the same system - reduced performance on both

Prodigal

Distinguished
Aug 10, 2009
23
0
18,510
I recently added a second monitor to my gaming rig, and loved it so much I've decided to add a third. Before purchasing the third display I plugged it into an LCD TV via the HDMI port and was saddened to see that my 570 GTX will only output to a total of two monitors. As the most graphically intense thing that will likely be running on the secondary monitors is Eve Online at max settings, I bought myself a Radeon HD 6770 and hooked my second monitor up to it.

More often than not when playing Eve, I'm running 3 instances of the Eve client at once with no performance issues when both monitors are running off the 570 GTX. However, after adding the 6770 and connecting the second monitor to it, performance in Eve was significantly decreased, even when using just 2 clients, one on each monitor. Oddly enough, the frame rate on both monitors/clients is exactly the same; a change on one seems to translate instantly to the other.

My previous computer had an SLI configuration, but aside from that I don't have any experience with dual video cards. Both cards are running the latest drivers. I'm not sure what the problem is here and am really not even sure where to look. If my performance is going to go down with the addition of a second discrete GPU, I might as well get rid of it.

The only thing I can think of is that using 2 PCIe video cards causes my 570 GTX to be kicked down to x8 mode, but even then I don't have any idea why the framerate in both clients would be identical. I do remember seeing something called LucidLogix Virtu, or something along those lines, while thumbing through the motherboard's manual. If I understood it correctly, it would allow me to use the onboard Sandy Bridge GPU to power a third monitor and shift load to the discrete GPU as necessary. It wasn't clear if that would be practical or sufficient to run Eve at decent settings (not that Eve takes significant power to run, but I want to be able to do it at max settings).

Any thoughts or advice would be greatly appreciated.

My system specs:

Core i7-2600K @ 4.5Ghz
ASUS P8Z68-V pro
16GB Corsair DDR3-1600
GeForce 570 GTX 1280MB
Radeon HD 6770 1024MB
256GB SSD
3x 500GB HDDs in RAID 5
24" Monitor @ 1920x1200
21.5" Monitor @ 1920x1080
Win7 Ultimate x64
 

bobusboy

Distinguished
Jul 3, 2009
764
0
19,060
I was not under the impression you could have both an AMD and an Nvidia card in the same machine and have both of them output data at the same time.

If it is possible, note that the 6770 is MUCH weaker than the GTX 570. This means the maximum output of the machine is only as fast as the slowest card.

That is how it would be with a normal CrossFire setup, anyway.

Click below for the two of them compared.

http://www.gpureview.com/show_cards.php?card1=658&card2=640
 

Prodigal

It's definitely possible... I have it set up and it "works"; the primary monitor even runs at normal performance as long as the secondary isn't trying to do any 3D rendering. I did a bunch of searching before I bought the second card and read from several sources that you can run both an AMD and an Nvidia card in the same system as long as you use WinXP or 7, as Vista doesn't allow for multiple GPU drivers.

I know the 6770 is much weaker, but as I don't need it to do anything more than run Eve at max settings I didn't think it would be an issue.
 

Prodigal

Could this perhaps be a driver issue? If I replace the 6770 with, say, a GeForce 560, will I still have the issue of the primary card only being able to work as well as the secondary?

To be clear, this isn't an SLI setup and I'm not trying for one. I want to use the two separate cards discretely.
 


I believe you will find most of those refer to using AMD cards as the primary vehicle and then a light-duty Nvidia card for PhysX.

http://hothardware.com/cs/forums/p/56478/403485.aspx
 

blacksci

Distinguished
Jan 25, 2008
818
0
19,010
Why don't you buy an active adapter and hook up all 3 monitors to the AMD card, pull the Nvidia one out, sell it, and go CrossFire? Eyefinity gives you 3-monitor support with 1 card: you get a desktop across 3 monitors, and they don't even have to be the same monitor. With Nvidia, on the other hand, you need 2 cards in SLI and all 3 monitors have to be the same resolution, refresh rate, yadda yadda.

As a side note, I love Eve and run a 4-toon mining crew when I play. I plan on playing it again after I build my new computer with my tax refund, and plan on going the Nvidia route, but I'm buying 3 new monitors as well.
 

cbrunnem

Distinguished
Don't bump a thread on this website; it's against the rules.

I don't think you will find the answer you are looking for here, because 99 out of 100 times a person will buy the same GPU for CrossFire or SLI setups.

I think you are playing with fire and should return the second (AMD) card, and if you can, return the Nvidia and get a 6970 instead.
 

Prodigal

I am going to return the AMD card, unfortunately returning the Nvidia card isn't an option as I've had it for months. I'm not a big fan of AMD and don't have any need for SLI. It looks like my next step is to see what happens when I try to use another (albeit weaker) Nvidia card in addition to the 570.
 

cbrunnem



My point was that a single AMD card supports up to 4 monitors.
 

Prodigal

So now I'm really confused...

It turns out I had a GTS 250 lying around from an old system that I had forgotten about. I installed it in the system and connected it to my secondary monitor. At first it seemed as if it would work as I wanted, but the weirdest thing happened when I started running additional Eve clients.

With 1 Eve client on the GTS 250, the game ran beautifully: 60 fps at idle (V-sync enabled).
When I started running an additional Eve client on the GTX 570, both clients went down to 30 fps.
Running 2 Eve clients on the monitor connected to the GTX 570 and 1 on the GTS 250's monitor, the fps on all of them went to 20.

It seems as though total GPU power is being divided between the displays... but this doesn't make sense. If I run 3 clients across 2 monitors connected only to the GTX 570, I get significantly higher performance than when running fewer instances of the client across multiple video cards.
 

cbrunnem

This makes total sense to me: 1 monitor, 60 fps; 2 monitors, 60/2 = 30 fps; 3 monitors, 60/3 = 20 fps. The question is which monitor the desktop is showing up on. Is it a monitor connected to the 250 or the 570? I suspect it's the 250... at least I hope.
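The division pattern described above can be sketched in a few lines. This is purely an illustration of the arithmetic the thread reports (60 fps cap split evenly across running clients), not actual driver behavior; the function name is made up for this example.

```python
def effective_fps(vsync_cap: int, num_clients: int) -> float:
    """Per-client frame rate if a V-sync cap were split evenly
    across simultaneously running 3D clients (hypothetical model)."""
    return vsync_cap / num_clients

# The three cases reported in the thread, with a 60 fps V-sync cap:
for clients in (1, 2, 3):
    print(clients, effective_fps(60, clients))  # 60.0, then 30.0, then 20.0
```

The model matches the reported numbers exactly, which is what makes the observation suspicious: real multi-GPU setups should not pool and divide frame rate this way when each card drives its own independent client.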
 

Prodigal

Unfortunately that's not actually what's happening. At present there are only 2 monitors connected: 1 to the 570 and 1 to the 250. I'm not sure what you mean by which monitor the desktop is showing up on; I'm running them both in extended desktop mode in Windows. The framerate drops only occurred when the game was running in fullscreen "fixed window" mode with at least one client on each monitor. So whenever 2 or more game clients were running, there was at least one on each monitor.

At any rate... I don't understand why I'm losing performance when I use 2 GPUs. If the 570 can power 3 clients across 2 monitors without issue, why is performance going down when the 570 is only powering 2 clients and the 250 is powering the 3rd?
 

FtsArTek

Distinguished
Sep 11, 2011
368
0
18,810
I've done this before, albeit with a GTX 460 and an HD 2400 PRO... The problem is that your primary GPU renders all three monitors, but the secondary GPU also tries to render them all. While the secondary is doing this, the primary transmits the images for display 3 to the secondary over the PCIe bus, and the secondary can't work out what to do with them properly, so it tries to re-render them before showing them. The second card just can't keep up with the primary. Really, your best option would be to sell those and buy an HD 6970, a GTX 590, or a 6990.
 

Prodigal

Is there any way to tell the video cards to only render things to the monitors they are plugged into?

Getting rid of the GTX 570 sadly isn't an option worth exploring. My system is watercooled and the waterblock for the GPU was $100 by itself; having to find, pay for, and install a waterblock for a new card and redo the loop really isn't worth it just so I can play multiple instances of one game on 3 monitors.

If I were to add a second GTX 570 would this allow proper performance across multiple monitors? I suppose I might be able to justify that as it would give me the option to do SLI if I felt so inclined at some point.
 

cbrunnem



Yes, it would, and you would/should SLI them.
 

Prodigal

That website seems to indicate that the monitors all have to run at the same resolution. 2 of mine will run at 1920x1080, but my primary runs at 1920x1200. I've tried to find other 16:10 monitors, but they're considerably more expensive :(
 

blacksci

That would be correct, which is why I told you to use the AMD card and Eyefinity instead; it's cheaper than buying a video card AND a new monitor. The only reason I'm going this route is that I plan on buying 3 monitors at the same time, so it's not a problem for me.