GPU won't output to TV unless monitor is also connected

moosa17

I've just acquired a "new" GPU (a GTX 970) and installed it in my machine, with my monitor connected via DVI and my 4K TV connected via HDMI. This works fine while either mirroring my screen across both displays or extending the desktop with both set to 1080p.

I wanted to try running the TV at 4K, but this is where I started having problems. In extended mode with my monitor at 1080p and the TV set to 4K resolution, the TV displays "no signal" and the monitor periodically goes black for a moment. I figured this must be because the GPU can't output a 4K signal and another 1080p signal at the same time, so I decided to try using the TV by itself, and this is where things start getting weird.

Basically the TV will not display without the monitor also connected.
If I unplug the monitor, the TV shows "no signal."
If I set Windows to "only show desktop on this display" (the TV), the monitor switches off and the TV shows "no signal."
If I plug the monitor into the HDMI jack, it works fine, but if I then switch it for the TV in the same jack, it shows "no signal."
It doesn't seem to matter what is plugged in when I boot the machine, but if I plug in only the TV and reboot, the TV WILL show a signal once the animated "Windows is starting" screen pops up, but then will switch to "no signal" immediately after!

Once again it displays fine while mirroring or extending the screen at 1080p each.

My only thought is that perhaps the TV is receiving a signal but isn't recognizing it, or there's some kind of miscommunication going on between the devices... again, my monitor works just fine on its own in the same HDMI jack, so I'm not sure what the difference is here. And I have no idea why it decides to work with the TV when my monitor is also connected.

If anyone could give me some advice on this I would be hugely appreciative!

MOBO: MSI Z68A-GD80 (G3)
CPU: i7-2600K
GPU: GTX 970
RAM: 16 GB DDR3
OS: Windows 7
Displays: LG 1080p (DVI), Sony 4K (HDMI)
 
Solution
Basically, it sounds like the TV works fine when you force it to receive a 1080p signal and acts flaky when your computer tries to send it a 4K signal. HDMI 1.4 only supports 4K at 30Hz. The animated "Windows is starting" screen is probably rendered at 1080p or lower, so it shows up fine on the TV. And when you have the TV and monitor connected together, Windows is forcing the TV to display a 1080p image.

To display 4K@60Hz requires that the GPU, cable, and TV all be HDMI 2.0 compliant.

The Nvidia GTX 970 should be HDMI 2.0 compliant, so you'll need to check the cable and TV. A lot of first-generation 4K TVs were sold with HDMI ports that weren't capable of receiving a 4K@60Hz signal; they'll usually have some alternative input like DL-DVI or DisplayPort for 4K@60Hz input.
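
If it helps to see the arithmetic behind the 30Hz/60Hz split, here's a rough back-of-the-envelope sketch. The 4400x2250 totals are the standard CTA-861 timing values for 3840x2160 (active pixels plus blanking), and the 340 MHz / 600 MHz figures are approximate pixel-clock ceilings for HDMI 1.4 and 2.0; nothing here is measured from your particular setup.

```python
# Rough check: why 4K@30Hz fits within HDMI 1.4 but 4K@60Hz needs HDMI 2.0.
# Timing totals are the standard CTA-861 values for 3840x2160 including blanking;
# the clock limits are approximate spec maximums (assumptions, not measurements).

H_TOTAL, V_TOTAL = 4400, 2250          # 3840x2160 active area plus blanking

HDMI_1_4_MAX_CLOCK_MHZ = 340           # approx. HDMI 1.4 pixel clock ceiling
HDMI_2_0_MAX_CLOCK_MHZ = 600           # approx. HDMI 2.0 pixel clock ceiling

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock needed to drive 3840x2160 at the given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (30, 60):
    clock = pixel_clock_mhz(hz)
    print(f"4K@{hz}Hz needs ~{clock:.0f} MHz -> "
          f"HDMI 1.4: {'OK' if clock <= HDMI_1_4_MAX_CLOCK_MHZ else 'too fast'}, "
          f"HDMI 2.0: {'OK' if clock <= HDMI_2_0_MAX_CLOCK_MHZ else 'too fast'}")
```

This works out to roughly 297 MHz for 4K@30Hz (within HDMI 1.4's limit) and roughly 594 MHz for 4K@60Hz (beyond HDMI 1.4, within HDMI 2.0), which is why every link in the chain has to be HDMI 2.0 capable for 4K@60Hz.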
 