HDMI Screen Cut off + Mouse and Keyboard lag

mtader

Reputable
Oct 13, 2014
14
0
4,510
Hello all, I just put in my new GTX 1070, and after installing the drivers the edges of my screen are cut off and both my mouse and keyboard are lagging. I am connected over HDMI. I have read online that I should adjust the underscan/overscan value in the NVIDIA Control Panel, but what I have is called "GeForce Experience". Is that the same thing? I can't find an underscan/overscan option in it. Would that also fix the keyboard/mouse lag?

Thanks

Edit: I should also mention the screen is pretty fuzzy, but the Windows 10 settings say I am at 1920x1080, which is the correct resolution.
 
When underscan/overscan scaling is active, the screen is often blurry in exactly the way you describe.

Both the input lag and the sizing issue you describe are often related to using a TV as your display device.

If you are using a TV as your display device, see if you can set your input to a 1:1 or Dot-by-Dot mode, or remap the input to be the PC input. This usually fixes scaling issues and disables the video processing that causes input lag. Not all TVs support this, however, so your mileage may vary.

While I haven't seen NVIDIA's current control panel, the NVIDIA Control Panel is where you want to change your underscan/overscan settings, not GeForce Experience.
 

mtader

Reputable
Oct 13, 2014
14
0
4,510
Okay, so I turned overscan off on my TV and the edges of the screen are no longer cut off. However, the screen is still a little grainy. Is there something else I can try?

Edit: The keyboard/mouse lag has also been fixed
 

Wiz33

Distinguished
Sep 10, 2006
177
0
18,760
What brand, model, and year? Most LCD TVs made up to 2015 do not support 4:4:4 chroma, so your text, instead of being solid black, will look fuzzy with different color tints on the edges. Try this:

http://www.geeks3d.com/20141203/how-to-quickly-check-the-chroma-subsampling-used-with-your-4k-uhd-tv/
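To see why missing 4:4:4 support tints the edges of text, here is a minimal sketch (not any vendor's actual pipeline) of 4:2:2 chroma subsampling: luma is kept per pixel, but color is averaged over pairs of pixels, so a sharp color edge comes back smeared. The conversion uses the standard BT.601 coefficients.

```python
# Illustrative sketch: why 4:2:2/4:2:0 chroma subsampling makes sharp
# colored edges (like text) fuzzy. Toy BT.601 full-range RGB<->YCbCr math.

def rgb_to_ycbcr(r, g, b):
    # BT.601 full-range conversion.
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

def subsample_422(row):
    """4:2:2: keep every pixel's luma, but average chroma over pixel pairs."""
    ycc = [rgb_to_ycbcr(*px) for px in row]
    out = []
    for i in range(0, len(ycc), 2):
        pair = ycc[i:i + 2]
        cb = sum(p[1] for p in pair) / len(pair)
        cr = sum(p[2] for p in pair) / len(pair)
        out.extend(ycbcr_to_rgb(p[0], cb, cr) for p in pair)
    return out

# One scanline: a pure red pixel next to a pure blue pixel (a sharp color
# edge). With 4:4:4 both pixels survive intact; with 4:2:2 both receive
# the *averaged* chroma, so the edge comes back tinted purple.
row = [(255, 0, 0), (0, 0, 255)]
print(subsample_422(row))
```

With grayscale text on a gray background the chroma channels carry little information and subsampling is nearly invisible, which is why the damage shows up most on colored text and UI edges.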
 

mtader

Reputable
Oct 13, 2014
14
0
4,510
It's a Sharp TV and it's a few years old; I'm not sure of the model. I tested my TV with that website and it looks like it is not 4:4:4. So I guess I'll just try using VGA. I was using that before; it's just that the new graphics card has a DVI-D port, so I'm going to have to buy an adapter. Not too bad though.
 
The problem is pretty simple to pinpoint but sometimes impossible to fix, and it's 100% driver related.

Drivers have terrible default settings, but they at least work for 99% of people. Both NVIDIA and AMD refuse to fix the issues for the 1% who realize that their display quality is not what it should be.

Instead of giving you direct control over how HDMI connections are handled, these companies insist on babying their users and burying settings where they really don't need to be.

It's clearly not an issue with your TV, since it worked correctly with your last graphics card; however, convincing an NVIDIA card to display crisp output on a TV can be an exercise in frustration. It's difficult enough at times on an AMD card. I stopped using NVIDIA years ago for this reason.

The most obnoxious fix I've had to resort to is overriding the EDID information that Windows retrieved from the screen, replacing it with data that told the graphics card it was a monitor rather than a TV, at which point the output was magically fixed. This sort of idiocy could be solved in the driver control panels with a single toggle switch, but it's not going to happen.

As far as 4:4:4 support goes, that's not necessarily relevant, as the previous graphics card's output was clearly sharp. Besides, at least on AMD, you can pick different pixel formats, so whether or not the TV supports 4:4:4, the card could be outputting something else. I have no idea if NVIDIA lets you choose different pixel formats, but I would be surprised if it weren't possible, if not through their control panel software then through a Registry edit somewhere.