Hi, I'm currently looking into building a new rig around one or more GTX 980s with a multi-monitor setup (not 3D Surround). However, I have always used AMD GPUs, and I have no experience with Nvidia cards or their driver software capabilities. So before I even think about the new rig, I need one quirky but crucial question resolved.
Is it possible to have a multi-monitor setup with two separate monitors, one VGA CRT monitor (using a VGA-to-DVI adapter) and one DVI LCD monitor, on two GTX 980s, where each monitor is connected to a separate card? So basically each card drives one monitor. Obviously SLI would be disabled.
But when SLI is active, both cards switch to driving the display on the primary monitor; the second monitor, connected to the slave card, then obviously no longer receives any output, and I can just turn it off while SLI is on.
The reason I ask is that the setup described above is exactly what I have up and running on my existing AMD cards in the old rig (2x 5870s in CrossFire). But I'm thinking of switching to camp Nvidia for the next rig, and was wondering whether my existing display setup would work with Nvidia drivers/cards.
Just to be clear:
SLI turned off = Each card becomes independent, detects that it's connected to one monitor, and drives a display on its respective monitor. (This would just appear as two separate displays in Windows 7.)
SLI turned on = The primary card becomes the master and the second card becomes its slave; the SLI array as a whole (both cards) detects that it's connected to ONE monitor, namely the one connected to the master card, and both cards then drive the display (e.g. playing a game) on that one monitor. At that point I can just power down the second monitor, because it will no longer be receiving any output.
Thanks in advance.