
Computer only detecting one HDMI display out of a monitor and an HDTV.

August 22, 2012 9:21:52 PM

Hello,

I recently bought an HDMI splitter http://www.newegg.com/Product/Product.aspx?Item=N82E168... in order to make use of both my HDTV and my monitor. Both are connected via HDMI, but only the monitor shows up in the Nvidia Control Panel. I want to extend these displays, but regardless of what settings I choose, all it does is clone them. I've tried going into the Windows 7 resolution settings to see if I could detect the TV there, but to no avail. I have a GTX 570 GPU with one Mini HDMI port, which I am using to feed the HDMI splitter to the monitor and the TV. Is it possible to get both displays detected using this method? I want to use HDMI on both displays because the monitor is 3D capable and the TV doesn't take DVI.

Any information would be greatly appreciated.

I'd also like to note that the monitor is 120Hz, but Nvidia is only allowing 60Hz.
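
As a quick sanity check on the detection question, it can help to look at how many outputs Windows itself reports, independent of the Nvidia Control Panel. Below is a minimal sketch, assuming Python 3 on Windows (none of this comes from the thread itself), that walks the outputs through the Win32 EnumDisplayDevices call; a passive splitter that clones the signal will typically show up here as a single active output rather than two.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    # Layout of the DISPLAY_DEVICEW structure from winuser.h.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

# Enumerate the outputs the OS knows about and show which are active.
i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break  # no more outputs
    active = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print("{}  {}  active={}".format(dev.DeviceName, dev.DeviceString, active))
    i += 1

If the splitter is cloning at the cable level, only one output will show as active here, which matches what the Nvidia Control Panel is reporting.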
August 22, 2012 10:24:54 PM

You can't use a splitter to do that; you need a dedicated output per display. Also, TVs don't actually have a 120Hz input. The only exception is when running in 3D, in which case the TV receives 60Hz per eye. The 120Hz your TV can do is interpolated (fake).
August 22, 2012 10:33:25 PM

I wanted to be able to use HDMI ports for both the monitor and the TV. Is there another alternative I could use that won't degrade the picture? Are there specialized DVI cables that support 3D? I'm not too informed on the matter, and I greatly appreciate you clearing that up for me.
August 22, 2012 10:44:22 PM

On newer video cards, the DVI connector actually outputs HDMI signals; you simply need to use a DVI to HDMI cable. Your solution is easy: connect your TV via Mini HDMI, and connect your monitor via a DVI to HDMI cable. This passes audio as well, although only one device will act as the audio device, since HDMI can still only select one connected display to handshake the full signal to.

If your displays are 3D, you need to connect them with an HDMI 1.3a or faster cable. You will also need the Nvidia 3D software; check Nvidia.com for a listing of supported displays.
August 22, 2012 11:00:02 PM

I have a DVI to HDMI connection, so that wouldn't be able to pass 3D, correct? And there are no switchers that could accomplish that?

Thanks for clearing this up.
August 22, 2012 11:03:38 PM

You can do 3D with that connector, as long as the display is capable of 3D.
August 23, 2012 4:17:16 AM

I would try it with what you have before you buy more. If you run into sync or display issues, then go buy 1.4 stuff.
August 23, 2012 5:38:10 AM

http://www.amazon.com/SANOXY-Plated-Female-DVI-D-Adapto...

http://www.amazon.com/Mediabridge-Ultra-Ethernet-Certif...

Does it matter what kind of connector I buy? I could just buy one on Amazon for $1, no biggie. Or does it have to be 1.4 compliant, if that even exists?

EDIT: I get the option to go to a 120Hz refresh rate with the dual-link DVI connection, and only 60Hz with the HDMI hookup. Since I purchased the Nvidia software, I can enable 3D on the monitor, but I can't view it in 3D correctly (very distorted), which I believe is because it's DVI. It doesn't even give me the option to mess around with the stereoscopic settings when the HDMI hookup is connected, so I am going to assume that it's an older HDMI cord.
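
For anyone who wants to confirm what refresh rate Windows is actually driving (as opposed to what the Nvidia panel offers), the current vertical refresh of the primary display can be read with the Win32 GetDeviceCaps call. Another small sketch, again assuming Python 3 on Windows and covering only the primary display:

import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

# Be explicit about handle types so this also works on 64-bit Python.
user32.GetDC.restype = ctypes.c_void_p
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.GetDeviceCaps.argtypes = [ctypes.c_void_p, ctypes.c_int]

VREFRESH = 116  # GetDeviceCaps index for the current vertical refresh rate

hdc = user32.GetDC(None)  # device context for the primary display
hz = gdi32.GetDeviceCaps(hdc, VREFRESH)
user32.ReleaseDC(None, hdc)

print("Primary display refresh rate: {} Hz".format(hz))

If the dual-link DVI hookup really is driving the monitor at 120Hz, this should report 120; with the HDMI hookup described above it would presumably stay at 60.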