External (Second) Monitor Won't Go to Higher Resolutions

Andy11466

Distinguished
Mar 21, 2013
465
0
19,060
I've had this two-monitor setup for a while now.

A GTX 670 2GB driving my 1920x1080 monitor for gaming,

and a Radeon HD 5770 for the second monitor (movies, YouTube, etc.).

It worked perfectly; the two drivers never gave me stability issues.

I recently decided to take out the Radeon 5770 and connect the external monitor to the GTX 670 as well to save power, since the GTX 670 2GB can handle two monitors just fine.

The external monitor is connected through an HDMI-to-VGA adapter, this exact product:

http://www.accellcables.com/J129B-003B.html

The primary monitor is connected to a standard DVI port.

The PROBLEM is that the external monitor won't go beyond 1024x768.
The monitor is a 1600x900 screen, but in the screen resolution menu 1024x768 is marked as (recommended). If I go any higher, the second monitor goes black and shows an error box floating around saying "Input not supported".
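
For diagnosis: the screen resolution menu only offers the modes Windows sees for that display, and with an active HDMI-to-VGA converter that list is often limited to whatever the converter reports (or fails to pass through) from the monitor's EDID; 1024x768 (recommended) is the usual sign that no proper EDID is getting through. Below is a rough Windows-only sketch for dumping the reported modes per display, using ctypes and the documented EnumDisplayDevicesW / EnumDisplaySettingsW calls (the \\.\DISPLAYn device names come back from the first call):

```python
import ctypes
from ctypes import wintypes

# Rough Windows-only sketch: dump the display modes Windows reports for
# each attached display adapter output (roughly what the monitor's or
# converter's EDID advertises plus the driver's standard modes).

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Display-oriented layout of DEVMODEW (the printer union member has the
    # same size, so the later field offsets are unchanged).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def list_modes(device_name):
    """Return the (width, height, refresh) modes reported for one display."""
    modes, i = set(), 0
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    while user32.EnumDisplaySettingsW(device_name, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    return sorted(modes)

# Walk the display adapters and print what each one offers.
i = 0
dd = DISPLAY_DEVICEW()
dd.cb = ctypes.sizeof(DISPLAY_DEVICEW)
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
    print(dd.DeviceName, "-", dd.DeviceString)
    for w, h, hz in list_modes(dd.DeviceName):
        print(f"  {w}x{h} @ {hz} Hz")
    i += 1
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(DISPLAY_DEVICEW)
```

If 1600x900 never shows up in that list for the adapter-connected display, the converter (not the card or the monitor) is what's holding the resolution down.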

I don't want to go back to using the 5770, as it creates a lot of heat inside the case and leaves less space to work with.


Thanks in advance.
 
Hello... VGA is lower resolution... and is an analog signal, not a digital HD signal... and your monitor is telling you this... Doesn't the monitor have an HDMI or DVI input too? The best setup would be using all DVI and HDMI connectors/cables and monitors...
That converter just takes your high-quality HD digital signal and turns it into a low-quality VGA analog signal... you don't want that.
 

Andy11466

Distinguished
Mar 21, 2013
465
0
19,060
How is it that I can run 1600x900 if I just use two graphics cards then?

You say VGA is a lower-quality connector, yet BOTH my monitors can run HD when connected through a DVI-I port.

I'm not quite following your logic here. Sorry.
 
Hello... get rid of that converter... Use both DVI/HDMI outputs of the GTX card and the DVI/HDMI inputs of the monitors... use straight-through DVI/HDMI cables only.

VGA is analog and frequency-sensitive... it is not reliable when connecting to modern digital video cards/monitors/TVs... It is a BAND-AID approach and sometimes requires playing with the frequency of the video card's output signal... Results will vary... Monitors with DVI/HDMI inputs are so cheap these days... All your computer's information is digital... why do you want to convert it to LOW-performance analog?
 
Hope I can clear up the CONFUSION:

1) DVI-I:
Many DVI outputs are "DVI-I", which means "DVI Integrated". Thus, all the output pins for both DVI and VGA are present.

*If you attach a normal VGA adapter to a DVI-I output, you connect to just the VGA pins.

THIS is the DVI-I to VGA adapter (again, it won't work on DVI-D) which often comes with a graphics card:
http://ca.startech.com/Cables/Audio-Video/Video-Adapter/DVI-to-VGA-Cable-Adapter-Black-Male-to-Female~DVIVGAMFBK

2) DVI-D
Digital pins only. VGA does not work.

3) VGA:
If your card had a basic VGA output, obviously you'd use this. It has generally been replaced by DVI-I, which again carries BOTH the VGA and DVI pins.

SETUP:
Please link your graphics card next time, but it should go something like this:

1) VGA monitor:
DVI-I output -> VGA adapter (for the card) -> VGA monitor

2) DVI monitor:
2nd DVI output (or HDMI->DVI if no 2nd output exists)

SUMMARY:
- Use the DVI-I graphics card output and a normal DVI-I-to-VGA adapter for your VGA monitor.

 
Solution

Andy11466

Distinguished
Mar 21, 2013
465
0
19,060
@Photonboy
Thanks for the input.

My GTX has: DVI-I, HDMI x2, DVI-D.

I'm running the 1920x1080 primary monitor as:

1) DVI-I output -> DVI-I-to-VGA adapter -> monitor

As for the second (1600x900) monitor:

My option is HDMI to DVI-I, because my only DVI-I slot is already being used.

Why couldn't I go straight to an HDMI-to-VGA converter, rather than chaining two adapters (HDMI to DVI -> DVI to VGA)?


Also, I tried the HDMI to VGA and can only push the resolution so far before I get an "Input not supported" error.
 


*Use your two DVI outputs.

It doesn't matter which one, as both handle 1920x1080. (One is likely "Dual-Link DVI", good for up to 2560x1440 or 2560x1600.)
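
For reference, the single-link/dual-link difference is just pixel-clock headroom: single-link DVI tops out around a 165 MHz pixel clock, dual-link at roughly double that. Here is a rough back-of-the-envelope check (the total-with-blanking timings below are commonly quoted reduced-blanking figures, assumed here rather than taken from this thread):

```python
# Rough single-link vs. dual-link DVI sanity check.
# htotal/vtotal are approximate CVT reduced-blanking totals (active + blanking).
SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI pixel clock ceiling
DUAL_LINK_MAX_MHZ = 330.0     # two TMDS links, roughly double

modes = {
    "1600x900@60":  (1760, 926),
    "1920x1080@60": (2080, 1111),
    "2560x1600@60": (2720, 1646),
}

for name, (htotal, vtotal) in modes.items():
    clock_mhz = htotal * vtotal * 60 / 1e6
    verdict = ("single-link OK" if clock_mhz <= SINGLE_LINK_MAX_MHZ
               else "needs dual-link" if clock_mhz <= DUAL_LINK_MAX_MHZ
               else "exceeds dual-link")
    print(f"{name}: ~{clock_mhz:.1f} MHz -> {verdict}")
```

So both monitors in this thread fit comfortably within single-link; dual-link only matters around 2560x1440/2560x1600 and up.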

Here's a quick summary which includes most combinations:
1) DVI-I:
DVI cable

2) DVI-I:
VGA cable via "DVI-I to VGA" adapter.

3) DVI-D:
DVI cable

4) HDMI:
HDMI cable

5) HDMI:
HDMI->DVI cable (no sound)
HDMI->DVI adapter, then to DVI cable

6) VGA:
VGA cable.

I won't bother getting into HDMI as an HDTV output. Again, you never, ever use VGA if DVI exists. HDMI is simply DVI + Audio.
 
Hi,
You STILL didn't mention what monitor inputs you have.

You listed the graphics card outputs, thanks, but not the monitor inputs, so I'm still having trouble helping you.

You said you could run BOTH monitors using DVI, so why are we even talking about VGA at all?

Simply run both monitors from your two DVI outputs. End. Of. Story.
 

Andy11466

Distinguished
Mar 21, 2013
465
0
19,060
I only have one DVI output.

Both my monitors have VGA.

I stated that I'm using a DVI-to-VGA adapter on my first monitor; then I tried an HDMI-to-VGA converter for my second, which didn't work. Both of my monitors have a VGA connector. Sorry for the confusion.

In your earlier post you said to use TWO DVI outputs, but I only have ONE.

Unless there is a DVI-I to VGA Y-splitter adapter.
 
Hello... get off of the converter/adapter thought mode... if $$ is an issue... go to your local computer recycler store for a DVI/HDMI monitor... and cables... your graphics cards are beautifully engineered digital graphics munchers... give them the monitors they deserve... and your EYES will thank you for it later!!!

What is the exact model number of your GTX card, and what output connections are on it? Your card should be able to drive 3 monitors at once.
 


Monitor #1:
DVI cable

Monitor #2:
DVI-I to VGA adapter (as I linked earlier) to VGA cable.

*As I explained earlier, you likely have a DVI-I output on your graphics card (sometimes there are two DVI outputs, but one is DVI-I and one is DVI-D, as explained above). This connector has BOTH the DVI and VGA output pins. When you use VGA you need that passive adapter. Again, it connects to the existing pins.

**Your graphics card processes DIGITAL signals natively. The VGA output is actually the DVI signal, split off at the output and run through a DAC (Digital-to-Analog Converter).
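
As a toy illustration of that DAC stage (the roughly 0-0.7 V full-scale level per VGA color line is a nominal figure assumed here for illustration, not something specific to this card):

```python
# Toy sketch of what the RAMDAC behind a DVI-I's analog pins does:
# each 8-bit digital color value becomes an analog voltage,
# nominally 0-0.7 V full scale per VGA color line (assumed figure).
VGA_FULL_SCALE_V = 0.7

def channel_to_voltage(value_8bit: int) -> float:
    """Map one 8-bit color channel (0-255) to its nominal analog level."""
    return (value_8bit / 255) * VGA_FULL_SCALE_V

# Example: a mid-grey pixel (128, 128, 128)
print([round(channel_to_voltage(c), 3) for c in (128, 128, 128)])  # [0.351, 0.351, 0.351]
```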

CHECKLIST:
1. Get a "DVI-I to VGA" adapter if you don't have one (every card I've bought came with one)
2. Connect only to a DVI-I output (won't fit otherwise)