Will using a DisplayPort to VGA/DVI adapter bottleneck the display?

Apr 8, 2018
I'm getting a 3-monitor setup, and the graphics card I will be purchasing is a ZOTAC GeForce GTX 1060, 3GB. The problem is the GPU does not have the same ports as my monitors, which are HDMI, VGA, and DVI. Would getting an adapter affect the quality of the display?

Thanks
 
Solution
Not quite sure what you are asking. If all three monitors have those three ports, then you could connect one via DVI, one via HDMI, and use a DisplayPort adapter for either HDMI or DVI on the third, avoiding analog altogether.

Yes, VGA has a maximum resolution limit dependent on the RAMDAC of the adapter. Generally, though, even cheap commodity RAMDACs will be able to drive up to at least WUXGA 1920×1200. However, being analog, the image quality is highly dependent on the quality of the output components. The problem is that a cheap active adapter designed to sell for $15 tends to be made of much worse parts than a graphics card originally intended to sell for hundreds. So the image often starts to get blurry above 768p or so, especially if high refresh rates are used.
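To see why resolution and refresh rate matter here, you can estimate the pixel clock a VGA link has to carry and compare it against what the adapter's RAMDAC can handle. This is a rough sketch: the 20% blanking overhead and the example RAMDAC limits are assumptions for illustration, not specs of any particular adapter.

```python
# Rough pixel-clock estimate for an analog VGA link.
# The 1.20 blanking-overhead factor is an assumption (reduced-blanking
# timings add roughly 10-15%; classic timings add more).
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# Example (assumed) RAMDAC limits: a good graphics card's DAC might be
# rated around 400 MHz, while a cheap adapter's may be far lower.
for w, h in [(1366, 768), (1920, 1080), (1920, 1200)]:
    print(f"{w}x{h}@60: ~{pixel_clock_mhz(w, h, 60):.0f} MHz")
```

The takeaway is that 1920×1200@60 needs roughly four times the analog bandwidth of 768p, so any weakness in a cheap adapter's output stage shows up as blur much sooner at high resolutions or refresh rates.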

At this time it's probably smarter to invest that $15 toward a used digital monitor than to buy an adapter.
 