Flickering on TV

kokonutz

Sep 4, 2011
I use the monitor for most things and have the HDTV set up as an extended desktop for when I want to watch movies from my computer. The problem is an intermittent horizontal flickering on the HDTV whenever there is a lot of motion on the screen. The more motion, the more flickering. This only occurs on the HDTV, not on the monitor.

The TV is hooked up using an HDMI cable (with a DVI adapter that came with the video card). I've also tried a DVI-to-HDMI cable and an S-Video cable. The refresh rate for the TV is set to 60 Hz in the Nvidia control center. I've tried to force a lower rate without any success.

Any ideas if this can be fixed? And how I would go about doing so if it can?

Sorry, I copied and pasted this from someone who had the EXACT same problem as me about two years ago. I am currently running a GTX 260 and using DVI cables to both the monitor and the LED TV, with the TV at 1920x1080 and the monitor at 1680x1050. My drivers are up to date.
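In case it helps with troubleshooting: you can ask the Windows display driver directly which refresh rates it will accept for the TV output, without actually applying anything. This is only a rough ctypes sketch of the standard ChangeDisplaySettingsEx call with the CDS_TEST flag; the \\.\DISPLAY2 device name is a guess for the second output, so adjust it to whatever yours is actually called.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# DEVMODEW with only the display fields spelled out; the trailing
# ICM/reserved/panning DWORDs are kept as padding so dmSize matches
# the real structure size.
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmTrailing", wintypes.DWORD * 8),  # ICM/reserved/panning, unused here
    ]

DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0

def mode_accepted(device_name, width, height, hz):
    # CDS_TEST asks the driver whether this mode would work, without applying it.
    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = width, height, hz
    result = user32.ChangeDisplaySettingsExW(device_name, ctypes.byref(dm),
                                             None, CDS_TEST, None)
    return result == DISP_CHANGE_SUCCESSFUL

# "\\.\DISPLAY2" is only a guess at the TV's GDI device name; check yours.
for hz in (50, 59, 60, 75):
    print("1920x1080 @ %d Hz accepted: %s"
          % (hz, mode_accepted("\\\\.\\DISPLAY2", 1920, 1080, hz)))

If 60 Hz is the only mode the driver accepts at 1920x1080, forcing anything else in the Nvidia panel won't stick.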

Specs:
Phenom X6 1100T
6GB RAM
750W Corsair PSU
Windows 7 X64 Ultimate
GTX 260

Displays:
hpf210
LG 42HL90

Both my displays are HDCP compliant.

Setting my HDTV as the primary display seems to have solved the problem, but now I do not have a taskbar on my LCD (which is still a problem).
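For reference, Windows 7 only puts the taskbar on whichever display it considers primary, so making the TV primary drags the taskbar along with it. If anyone wants to double-check which output Windows currently treats as primary, here is a rough ctypes sketch around the standard EnumDisplayDevices call (nothing Nvidia-specific):

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

i = 0
while True:
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        primary = dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE
        tag = " (primary - the Win7 taskbar lives here)" if primary else ""
        print(dev.DeviceName, dev.DeviceString, tag)
    i += 1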
 

kokonutz

Sep 4, 2011
That would mean the HDTV is set as the main display in my control panel, which does solve the problem. However, it also moves my desktop to the TV, rendering my LCD useless.
 

Pilk

Jan 6, 2010
I'm still here. Have you updated the display drivers for the card? Let's start with the basics and work our way up. If you have, maybe go back a few versions and try an older driver. Have you tried that?
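If it helps to confirm what's actually installed before and after rolling back, you can read the driver version straight out of WMI. A quick sketch using the wmic tool that ships with Windows 7 (the output formatting is whatever wmic prints):

import subprocess

# Lists the display adapter name(s) and the installed driver version via WMI.
output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    universal_newlines=True)
print(output)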
 

greatforce

Aug 31, 2011
About the taskbar issue:
There are different settings you can apply to a second monitor:
- Duplicate: both screens show the same image
- Extend: the desktop is stretched over both screens
- Single monitor: the image is shown on one monitor only

Try setting it to Duplicate, then adjust the resolution so both screens use the same resolution.

The taskbar should now show up.

The problem with Windows is that it doesn't support two different resolutions.
Of course the image can be displayed at different resolutions, but Windows elements like the taskbar can't.

I can tell you this from my own experience!
I sometimes connect my laptop to an external monitor, so I can play a game on the laptop and keep my mail on the other screen.
The external monitor has a different resolution, so the taskbar and desktop icons aren't shown there.
That's no problem for me, but if I want everything to show up on the second screen I just adjust the resolution of my laptop screen and it works.
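For what it's worth, on Windows 7 you can also flip between Duplicate (clone) and Extend without going through the dialogs. This is only a rough ctypes sketch of the stock SetDisplayConfig topology call, not anything in the Nvidia panel:

import ctypes

user32 = ctypes.windll.user32

# Topology flags for SetDisplayConfig (Windows 7 and later).
SDC_TOPOLOGY_CLONE = 0x00000002   # "Duplicate" in the Screen Resolution UI
SDC_TOPOLOGY_EXTEND = 0x00000004  # "Extend"
SDC_APPLY = 0x00000080

def set_topology(topology_flag):
    # Null path/mode arrays plus a topology flag tell Windows to switch
    # to that topology using its last saved settings for it.
    result = user32.SetDisplayConfig(0, None, 0, None, topology_flag | SDC_APPLY)
    if result != 0:  # ERROR_SUCCESS is 0
        raise OSError("SetDisplayConfig failed with code %d" % result)

# Mirror both screens, as suggested above; use SDC_TOPOLOGY_EXTEND to go back.
set_topology(SDC_TOPOLOGY_CLONE)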
 

kokonutz

Sep 4, 2011
The problem with that is that my monitor is 1680x1050 and my LED TV is 1080p; it would be such a waste if I just ran the TV at the same resolution (assuming it is even supported).

I did update my drivers to the latest, downloaded nTune, and played with that, to no avail. There's always a line when I set my monitor as the main display.
 

kokonutz

Sep 4, 2011
Resolution does not seem to be a factor: when I set the LED TV as the main display, it no longer tears.
I did try it anyway, and it still tears, though only when there's motion.
@greatforce
The problem with that is that Windows elements only show on the display I set as the main display, and I cannot set the LCD as the main display without the LED tearing.
 

greatforce

Aug 31, 2011
Did you check your advanced video settings in the Nvidia control center?
There you can adjust the conversion of 1080p footage, as well as the image quality settings and the automatic optimization.
It could also be the graphics card itself.
 

greatforce

Aug 31, 2011
I don't know how your control panel looks, because you have a different graphics card than me.

Did you set the Nvidia control panel to advanced mode?
If you did, there are categories on the left side of the control panel.
At the bottom of the list should be the video settings (with two underlined subcategories).
The first underlined subcategory lets you adjust the advanced color settings (and other things like refresh rate).
The second lets you adjust the video optimization settings (sharper edges, 1080p footage restoration, etc.).
 

greatforce

Aug 31, 2011
Of course it is a scale; they are all sliders, and each slider does a different thing.

If you can't find the advanced settings, try switching from basic to advanced mode (I think the button was at the bottom of the control panel; it looked like small underlined text).

If you could somehow add a screenshot of your control panel, it would be easier for me to see what you have to do.
 

greatforce

Aug 31, 2011
Okay.

If you can add a screenshot, it would be easier for me to see what you have to do.
If you can't, you'll have to take a look at the menu on the left of the control panel.
It should contain different categories and different (underlined) subcategories.

Search this menu for the video settings (screen adjustments, or anything similar).
Once you have found the right one, you'll see a number of sliders (gamma, color correction, etc.) and also a second tab with more sliders. Try playing with these sliders.

In the other subcategory you should find two sliders and a tickbox.
The first slider is for optimized image quality and the second is for correcting distortions.
The tickbox turns 1080p footage conversion on or off.

There should also be a category where you can adjust the refresh rate and all that.
Don't force the refresh rate down, but up.
The higher the refresh rate, the smoother the image and the less flickering.
Your LED TV should support higher refresh rates, because most new TVs are built to display smooth images at a high refresh rate.
 

kokonutz

Sep 4, 2011
The max input any TV can take is 60 Hz. Even if you try to force the refresh rate higher, it will not work because of bandwidth limits at that resolution. I do not see a 1080p footage conversion option, but I do see the sliders. I have already tried playing with them, to no avail. I'm really not sure what it is. I may have to just buy a new graphics card... ugh.
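For the curious, the 60 Hz ceiling at 1080p is mostly a link-bandwidth thing. A rough back-of-the-envelope check, assuming the standard 1080p timing of 2200x1125 total pixels and the 165 MHz TMDS clock limit of single-link DVI / early HDMI (which is roughly what a DVI-to-HDMI adapter on a card of this era gives you):

# Pixel-clock check for 1080p over a single-link DVI / early-HDMI connection.
H_TOTAL, V_TOTAL = 2200, 1125       # active + blanking for standard 1080p timing
SINGLE_LINK_LIMIT_MHZ = 165.0       # single-link DVI / HDMI before 1.3

for hz in (60, 75, 120):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * hz / 1e6
    fits = pixel_clock_mhz <= SINGLE_LINK_LIMIT_MHZ
    print("1080p @ %3d Hz needs a %.1f MHz pixel clock -> fits: %s"
          % (hz, pixel_clock_mhz, fits))

# 60 Hz comes to 148.5 MHz and just fits; 75 Hz (~185.6 MHz) and 120 Hz (297 MHz) don't.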
 

greatforce

Aug 31, 2011
Weird; my TV can handle 60 Hz and my monitor 70 Hz (for an 8-year-old monitor that is awesome).
There are TVs that can handle a higher refresh rate, but I don't know which ones.

I don't know what else it could be either.
It could be the TV, but also the graphics card.
If you have a spare graphics card, you could try it with that one,
or just borrow a graphics card from a friend before you buy a new one.
Because if it's the TV, a new card would be a waste of money.
 

kokonutz

Sep 4, 2011
I really don't think it is the TV, since the screen tearing goes away when I set it as the main display.

Yeah, my monitor can go up to 70, but what I meant with the TV is that it will only accept a 60 Hz input. The advertised 240 Hz, 480 Hz stuff is just a marketing gimmick and is not the true refresh rate (although it does make things LOOK smoother, which is the whole point).
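Just to keep the units straight: refresh rate and frame period are reciprocals, and that's a separate thing from the pixel response time in milliseconds that panel ads also throw around. A quick conversion:

# Refresh rate in Hz vs. how long one frame stays on screen in ms.
for hz in (60, 120, 240):
    frame_period_ms = 1000.0 / hz
    print("%d Hz means a new frame every %.2f ms" % (hz, frame_period_ms))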
 

greatforce

Aug 31, 2011
The funny thing is though, the TVs do have that refresh rate.
But only if the input also delivers that refresh rate.
All the receivers, etc. are 60 Hz, so you will never actually get that high a refresh rate.

I agree, it must be the graphics card then, if the tearing goes away when it is set as the main display.
It shouldn't happen, but in your case it did.

Luckily my laptop doesn't have the same problems.
I can (if I want to) plug 3 external monitors into my laptop: one on the VGA output, one on the DVI output and one on the S-Video output. Not bad for a 4-year-old Acer Aspire 5520 with an Nvidia 8400M G graphics card.
So, I think if it isn't the TV it must be the graphics card.
 

kokonutz

Sep 4, 2011
The funny thing is though, the TVs do have that refresh rate.
But only if the input also delivers that refresh rate.
All the receivers, etc. are 60 Hz, so you will never actually get that high a refresh rate.

^ This is not true; I'm not sure where you got that info. The advertised refresh rate isn't the "true" refresh rate.


I agree, it must be the graphics card then, if the tearing goes away when it is set as the main display.
It shouldn't happen, but in your case it did.

- I think this is my only option. It's weird, because the graphics card isn't even that old. I suppose it's time for an upgrade anyway.

 

greatforce

Aug 31, 2011
It's about the refresh rate of the hardware versus the refresh time of the screen itself in milliseconds.
The advertisements are based on the refresh time of the screen, but instead of milliseconds they use hertz.
The TV itself must support the refresh rate given by the input source, otherwise it won't work.
TVs with a built-in digital TV tuner can in some cases achieve the high refresh rates, due to upscaling technology.
So, it's definitely possible.

It's indeed weird, because it is a brand new card.
If you could fit a friend's card in your system and try it out, you'd know whether it is the graphics card or not.