I just recently hooked up my HDTV to a computer input and had it running fine at 1920x1080 @ 60 Hz for about a week...
Then I fired up Warcraft III... which doesn't support native 1920x1080 resolution. I switched resolutions in the game options until I found one that looked good... played for about an hour...
When I exited to the Windows desktop (XP Pro), my display was cut off on the edges and dim...
It had dropped to 1536x1080 @ 60 Hz.
The panel will no longer display 1920x1080 in Windows unless I reduce the refresh rate to 50 Hz. Otherwise it outputs the reduced, dim, garbled picture...
Did running at a non-HDTV standard resolution kill my TV?
If it did, what is the lesson to be learned here? Don't game on HDTVs, or just game and hope you don't stumble across a resolution that kills your panel?
Thanks so much for your input!
point at the desktop (not at any icon)
right-click once
left-click Properties; this opens Display Properties
left-click the Settings tab
can you reset the resolution from there?
No, I don't think it killed your panel; XP just reset its own resolution.
I tried manually resetting the resolution back to 1920x1080 in both the XP display properties and the ATI CCC tab. However, if I selected a 60 Hz refresh rate (the TV's default), it would garble the display and overscan, showing only 1536x1080...
I tried reducing the refresh rate to 50 Hz and the display cleared right up...
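For what it's worth, the 50 Hz vs. 60 Hz difference probably isn't a bandwidth limit in the panel: under the standard CEA-861 timings (the blanking totals below are assumed from that spec; the TV may use different timings on this input), 1080p50 and 1080p60 need exactly the same pixel clock, well under single-link DVI's nominal 165 MHz ceiling. A quick sketch of the arithmetic:

```python
# Rough pixel-clock arithmetic for the two modes in question.
# The horizontal/vertical totals are the standard CEA-861 1080p
# timings -- an assumption; the TV may use different blanking.
DVI_SINGLE_LINK_MHZ = 165  # nominal single-link DVI limit

modes = {
    "1920x1080 @ 60 Hz": (2200, 1125, 60),  # CEA-861 1080p60 totals
    "1920x1080 @ 50 Hz": (2640, 1125, 50),  # CEA-861 1080p50 totals
}

for name, (h_total, v_total, rate) in modes.items():
    clock_mhz = h_total * v_total * rate / 1e6
    verdict = "OK" if clock_mhz <= DVI_SINGLE_LINK_MHZ else "over limit"
    print(f"{name}: pixel clock = {clock_mhz:.1f} MHz ({verdict})")
```

Both modes work out to 148.5 MHz, so a link that carries 1080p50 can carry 1080p60 just as well, which points at the driver or mode handling rather than a damaged panel.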
I have since updated to the newest ATI drivers and... the Windows desktop seems to be working at 1920x1080 @ 60 Hz again.
Perhaps it was just a bug in the display drivers, but it was occurring both in XP and in Ubuntu 9.10.
I'm going to try some other games at 1920x1080 tonight and see what the results are...
I'll post an update then, hopefully with final news that this is resolved.
I tried downloading the CCC and drivers directly from AMD, and they malfunctioned. I tried downloading the CCC and drivers from the video card manufacturer and they worked "mostly" OK. There was also a video card BIOS flash update from the manufacturer, but I have not tried to install it.
Perhaps you should try deleting the video drivers altogether and loading either the manufacturer's or AMD's CCC and drivers to see which works better, or flashing the video card BIOS if you have the motivation.
I found that ATI drivers are really lacking in some ways, and that ATI cards do not function as they are supposed to in every way advertised. There are definitely bugs in the drivers, and the manufacturer sells the cards complete with the bugs included (that's Sapphire).
Windows also ships some video drivers that can sometimes be replaced by the monitor manufacturer's drivers. Downloading a Windows Update or automatic update has also been known to corrupt the system, or cause the video to malfunction or fail completely.
For this reason I removed Service Pack 3, turned off Automatic Updates permanently, installed Tweak UI from Microsoft PowerToys (to shut off the message balloons: "Your system may be at risk! Turn on Automatic Updates!"), and have had no more "surprises" from Automatic Update causing the system to malfunction.
Also, doing a repair install with the original OS disc (not a reformat) will knock the system back to the better drivers (not the Automatic Update drivers).
It seems to me that the automatic updates corrupt the system in order to motivate XP users to switch to Windows 7. Therefore I recommend installing (or repair installing) Service Pack 2 or 3 from the disc only, and not using Automatic Updates.
It's alive... updating the drivers did the trick. I think running at a non-typical resolution in WC3 somehow freaked the driver out, and it stayed stuck that way until I installed the new drivers. Thanks for your help.