TV Screen blinking with ASUS GTX 1050

tazz85

Distinguished
Aug 25, 2011
96
0
18,640
OK, I've read some posts about TV screens blinking when connected over HDMI. I have two 32-inch Samsung TVs with 1366x768 resolution. I'm using the computer to watch about 32 cameras: 16 on the 1st TV and 16 on the 2nd.

My build is this:

PSU: Seasonic 600W Gold
Core i7-7700
ASUS STRIX GTX 1050
8GB RAM
120GB SSD

I first connected the 2 TVs using a DisplayPort-to-HDMI adapter and an HDMI-to-HDMI cable. The problem is this: when I click on one camera to show it full screen, the TV starts to blink/flash. The camera switches to its main stream. For the first 2 days it didn't do that; on the 3rd day it started happening whenever you select a single camera, and the screen keeps flashing until you double-click again.

Someone told me it's caused by the resolution change: the video card pushes a bigger resolution to the TV, and the TV, with its lower native resolution, can't handle it. But when I change the port, the problem stops! I've now connected using a DVI-to-HDMI cable; the problem stopped again for 2 days, and after that it came back. I think DVI is more stable, because when I first tested it I switched between the 2 DVI ports on the ASUS card and only 1 try out of 4 showed problems.

Someone suggested replacing the TVs with Full HD ones, saying that would fix it. Is there any other way, some setting in NVIDIA Control Panel, to make this card work with these Samsung TVs? The 2nd TV is connected directly to the DVR and shows 1280x1024 @ 60Hz without these problems, but that's an analogue DVR.
Thank you.
 
I'm not quite sure what the problem is, but there is one NVIDIA Control Panel setting you might try.

NVIDIA Control Panel ->
Adjust desktop size and position ->

choose "Aspect ratio" scaling and set "Perform scaling on" to GPU

*Not sure if this even applies to HDTVs, since they use a different protocol; outputting "1920x1080" to a monitor is different from outputting "1080p" to an HDTV.

**Do your cameras have a video output themselves?
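To illustrate what "Aspect ratio" scaling on the GPU does: the output mode sent to the TV stays fixed (so the TV never has to re-sync), and the GPU fits the image into it, adding black bars if the ratios differ. A rough sketch of the fitting math, just plain Python arithmetic for illustration, not any NVIDIA API:

```python
def fit_aspect(src_w, src_h, out_w, out_h):
    """Scale a source image into a fixed output mode, preserving aspect ratio."""
    scale = min(out_w / src_w, out_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    # Black bars fill whatever is left of the fixed output mode.
    bars = (out_w - w, out_h - h)
    return (w, h), bars

# A 1920x1080 camera main-stream fit into the TV's native 1366x768 mode:
print(fit_aspect(1920, 1080, 1366, 768))  # ((1365, 768), (1, 0))
```

The point is that the TV keeps receiving the same 1366x768 signal the whole time; only the picture inside it changes, so there is no mode switch to trigger blinking.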
 

tazz85

Distinguished
Aug 25, 2011
96
0
18,640
I'm watching 16 cameras with remote-viewing software. When I double-click one camera to view it full screen, the TV starts blinking: it goes black, shows the picture, then goes black again... and it shows me HDMI1 and 1366x768 resolution.
 


I see. I found your post a little confusing to read.

It's such an unusual problem, not to mention that it worked and then stopped, that I'm not sure how to help troubleshoot it.

I don't see how changing to a 1080p HDTV would help. In fact, you should already be able to choose a 1080p output option on the PC, and the HDTV should downscale that internally. I do that with my Blu-ray player instead of choosing the 720p option. (The PC desktop is 768p, I guess, which should scale fine, but try 1080p output if it's available; I'm not sure why it would help, though.)

Not sure if that's an option on the PC, but see if it is.
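One detail worth noting on the 768p-vs-1080p question: 1920x1080 is an exact 16:9 ratio, while 1366x768 is only approximately 16:9. That's one possible (guessed) reason a fixed 1080p output that the TV downscales internally can behave better than handing it 1366x768. A quick check in plain Python:

```python
from math import gcd

def ratio(w, h):
    """Reduce a resolution to its simplest width:height ratio."""
    g = gcd(w, h)
    return (w // g, h // g)

print(ratio(1920, 1080))  # (16, 9) -- exact 16:9
print(ratio(1366, 768))   # (683, 384) -- ~1.7786, only close to 16:9 (~1.7778)
```

Whether that's actually what's tripping up the Samsung panels is speculation; it's just a reason the 1080p experiment is worth trying.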

Otherwise I can only suggest trying one HDTV at a time, or checking whether that camera software has a support site or user-feedback forum.
 
