I searched the Flat Panel, Graphics Cards and Graphics & Displays forums for answers but couldn't find a solution. I asked this question on the Flat Panel Monitors forum to no avail.
I have a 22" LCD TV (Teac LCDV2255HD) which has a native resolution of 1680x1050 and says it's 1080i. I'm trying to use it as a monitor for my computer:
Intel Core2Duo E8500
4GB 800MHz DDR2
Gigabyte GTX 260 core 216 GV-N26OC-896I
Antec EarthWatts EA500 PSU
Windows 7 64
The issue I have is that when I plug the monitor in, the image stretches beyond the sides of the screen no matter what resolution I try.
I have tried different VGA cables and a DVI-to-HDMI cable, but the issue stays the same. I've also tried a TV with a native resolution of 1360x768, which had the same problem.
I've sent emails to TEAC but received no response. When I called them I was put through to a service technician, who said it didn't sound like a hardware issue with the TV. He said I could bring it in for service, but it would cost me even if they found nothing, which he thought was the likely outcome.
I have since consulted Gigabyte about the GTX 260 and they have a firmware update (for loss of signal), which I applied. Still, every resolution is stretched outside the borders of the screen; some are blurry and some aren't. The NVIDIA control panel says the native resolution is 1080i (1920x1080) instead of what the TV specifications say (1680x1050, yet 1080i). I've tried scaling in the NVIDIA control panel and I've tried adding custom resolutions, but they either work only momentarily, before I press the "Yes" button on the test popup, or the test fails outright.
I'm getting a bit sick of 1280x1024 on my 17" monitor when the card should be able to handle more. But I don't know whether to buy a dedicated LCD monitor, a new card or a more powerful PSU.
I thought that the two standards were conflicting. I was going to point that out if no one else noticed it.
It says it's 1080i, 720p and in VGA it supports 1440x900@60Hz along with the three lowest 4:3 resolutions.
When I tried forcing detection in the control panel, most of the time it would settle on 1680x1050@30Hz at the right size but blurry. Sometimes it would set to 1920x1080 and stretched outside the borders of the screen.
If I have to use it at 1280x720, I might as well just buy a monitor. Judging by how much of the image appears when high resolutions are stretched, 1280x720 seems to be the visible area.
At the moment, I'm just trying to make sure that it's the TV that's causing the problem, so that I can replace it. Any ideas on that front?
I tried running games at 1680x1050, but the TV would spit out "Invalid Format", which I assume is because the games were set to 4:3 (my 17" monitor's resolution, 1280x1024). I was able to run FurMark full screen at 1680x1050 and it ran fine. So I'm led to believe the card is fine. Would you guys agree?
There are a few possible issues that could cause this. The first thing to check is whether the TV is applying overscan, which trims the edges of a television broadcast to hide artifacts that were often present at the borders. If it applies overscan to a PC signal, it will cut off the edges exactly as you're seeing.
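To give a sense of how much of the picture overscan eats, here's a quick sketch. The 5% per edge figure is a common TV default I'm assuming for illustration, not something from this TV's specs:

```python
# Rough sketch: how much of a 1920x1080 signal survives overscan.
# The ~5% trim per edge is a typical TV default (assumed, not measured).
width, height = 1920, 1080
overscan = 0.05  # fraction trimmed from each edge

visible_w = int(width * (1 - 2 * overscan))
visible_h = int(height * (1 - 2 * overscan))
print(visible_w, visible_h)  # 1728 972
```

So even a modest overscan hides a band of desktop (including the taskbar and window edges) all the way around, which matches the "stretched beyond the borders" symptom.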
The other possibility is that the TV simply doesn't have an image processor that can handle a high-resolution input from a computer. The screen lists a maximum supported input of 1080i, meaning it likely supports 480i, 480p, 720p and 1080i inputs. 720p matches the 1280x720 you said works properly, but 1680x1050 doesn't match up with a standard TV resolution, so it all depends on the internal components. Unfortunately, that's not a brand I've heard of, so it likely isn't a high-end model and may be limited.
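The reasoning above boils down to a simple membership check: is the requested mode one of the standard TV input resolutions the panel's scaler was built for? A sketch, with the mode list assumed from the 480i/480p/720p/1080i support mentioned above:

```python
# Sketch: does a requested mode match a standard TV input resolution?
# Mode list assumed from the 480i/480p/720p/1080i support described above.
tv_modes = {(640, 480), (720, 480), (1280, 720), (1920, 1080)}

def is_standard_tv_mode(width, height):
    """Return True if (width, height) is a mode the TV scaler expects."""
    return (width, height) in tv_modes

print(is_standard_tv_mode(1280, 720))   # True  -> displays cleanly
print(is_standard_tv_mode(1680, 1050))  # False -> scaler must resample
```

Anything outside that set has to be resampled (or rejected) by the TV's internal scaler, which is consistent with the blurry 1680x1050@30Hz result and the "Invalid Format" errors.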