Guest
OK, here's what we are trying to do. We have one CPU (Dell Dimension DPS061 running Vista) with a GeForce 7900GS graphics card. We have two monitors set up on hubby's desk, and two other monitors set up on wife's desk about 15 feet away. Originally, hubby's monitors were connected to his work laptop, but due to his layoff, laptop went bye-bye. We are trying to set up a system where hubby can use the dual monitors on his desk when he wants to work on the PC, and wife can use the dual monitors on her desk when she wants to work on the PC (not at the same time of course).
Each set of monitors includes a DVI (primary monitor) and a VGA (extended desktop). We thought that we could accomplish this simply by using a DVI splitter on one port of the graphics card (going to each of the DVI monitors) and a VGA splitter on the second port of the graphics card (going to each of the VGA monitors). The DVI splitter is working fine for the primary monitors on each desk (same image on both monitors, which is what we want), but the VGA splitter isn't working. If both VGA cables are plugged into the splitter, neither secondary monitor works. If one VGA cable is plugged into the splitter, that secondary monitor works, but the display flickers and looks blurry.
Any ideas what to do to get these two VGA monitors to clone each other? The only thing we can think of that may be the problem is that both ports of the graphics card are DVI - we used a DVI to VGA converter plugged directly into the graphics card, and then attached the VGA splitter to the converter. Didn't realize until we pulled the CPU out that the graphics card had two DVI ports and that the VGA monitor had been hooked up via the converter all this time! If we switched the VGA splitter to another DVI splitter and then just did the DVI to VGA conversion at the individual monitors, would that make any difference? Does it make a difference which of the two ports on the graphics card the primary monitor is plugged into? (I'm guessing not, since you can adjust which monitor is primary in Display Settings.) Thanks in advance for any ideas!!