I have an Insignia TV that is capable of displaying 1080p. When I connect the TV to my Radeon 6850's HDMI port (no adapters, just a straight HDMI cable), the TV displays 1080i instead of 1080p. During BIOS startup it is in 1080p, but when Windows starts it changes to 1080i. If I connect my Acer H233H to the computer in the exact same way, it displays in 1080p. So it seems the problem is between this TV and the computer.
The TV is an Insignia NS-32L450A11; the computer recognizes it by that name, but I can't find any drivers for it, if any exist.
So why is the TV only receiving a 1080i signal?
I'm at work and have an older AMD driver, but try this: go to the Catalyst Control Center (right-click on the desktop), then Monitor Properties -> HDTV Support, and see if you can check the box for 1080p60.
I selected "Add 1080p60/24/50 format to the display manager (NTSC/HD/PAL)" under "HDTV modes supported by this display" and clicked Apply.
But I didn't know how to use those formats other than going to the section below on the same page, labeled "Predefined and Custom HDTV Formats", and selecting the 1080p24 standard option, since the other frequencies weren't visible.
Then I clicked Apply Format, and now the TV reports 1080p, which is good enough for me.
Is that how I was supposed to do it? I don't know which display manager they are referring to, or where the "force" button is.
Well, if those other options don't show up, it most likely means your TV can't support them.
CoolBOBob1, I too have an insignia 1080p HDTV.
Connecting my PS3 system: detects and uses 1080p
Connecting my old Vista 64 Gateway Laptop with nVidia 9800M: detects and plays 1080p 60hz
Connecting my new Win 7 64 desktop with a Radeon HD 6670: detects as 1080i/30 Hz.
The Radeon-equipped desktop refuses to allow 1080p/60, which I have been using with two other devices since I purchased the HDTV over a year ago. Also note: it detects as interlaced, not progressive, on the Win 7/Radeon box. This is a 1080p HDTV, not a 1080i one.
I'd rather not go the route of a DVI-HDMI converter. It doesn't support audio, AFAIK.
I have exactly the same problem. I had a Radeon 4870 X2 with two DVI outputs on the back; one was hooked up directly to the monitor, the other to the TV via a DVI-HDMI dongle.
With that setup, the TV had full 1080p via DVI/HDMI. Sound too.
I've just upgraded to an HD 7950 with only one DVI output and an HDMI port. That's where the trouble started...
The DVI port will output pure 1080p. The HDMI port will too, but only without the AMD display driver installed.
How do I know? At BIOS boot I get cloned displays on the monitor and TV, and the TV reports 1080p. Everything is fine until Windows boots. Then the TV picture dies and it reports that it is unable to display. It will display 1080i, but not 1080p.
That's not all: the maximum resolution it will display progressive is 720p. Anything above that is interlaced only.
So - I plugged in the DVI/HDMI dongle into the DVI port hooked it up to the TV - and guess what, I get 1080p - in Windows. I also get audio via the TV!
Great, you might think - except that if I hook the monitor up to the HDMI port, the monitor will ONLY display a maximum resolution of 720p (1280x720). That's all, no more. The monitor, of course, displays progressive scan only; it doesn't do interlace...
So, to cut a long story short, my conclusion is this: the DVI port will output a maximum of 1080p, the HDMI port a maximum of 720p, or higher if interlaced is available. And it's only the AMD display driver that imposes this limit.
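For what it's worth, the "720p progressive, 1080i above that" ceiling is consistent with a pixel-clock cap: per the standard CEA-861 timings, 720p60 and 1080i60 both need a 74.25 MHz pixel clock, while 1080p60 needs twice that. A quick sketch using the spec's total-timing numbers (these are standard values, not anything measured from the card):

```python
# Standard CEA-861 timings: (h_total, v_total, frames_per_second).
# An interlaced mode scans half the frame's lines per field, so
# 1080i at 60 fields/s is only 30 full frames/s.
modes = {
    "720p60":  (1650, 750,  60),
    "1080i60": (2200, 1125, 30),
    "1080p60": (2200, 1125, 60),
}

for name, (h_total, v_total, fps) in modes.items():
    pixel_clock_mhz = h_total * v_total * fps / 1e6
    print(f"{name}: {pixel_clock_mhz:.2f} MHz")
```

720p60 and 1080i60 both come out at 74.25 MHz, and 1080p60 at 148.50 MHz - so anything (driver or link) capped around 74 MHz would behave exactly as described.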
EDIT: I've found a sort of fix. The display isn't reporting EDID information correctly via HDMI for some reason. For this partial fix (it only works on the MONITOR, not the TV, so plug the TV into the DVI output and the MONITOR into HDMI): right-click on the Desktop > Screen Resolution, select your monitor display > Advanced settings > List All Modes.
If the maximum resolution reported is 1280x720, click the Monitor tab, untick "Hide modes that this monitor cannot display", and click OK. Click OK again to close Screen Resolution.
Open Screen Resolution again and you should have the option to select 1920x1080. If for some reason your desired resolution isn't there (e.g. I wanted 1920x1200, 16:10), you'll have to create a custom resolution in the registry or use ToastyX's tool to create it for you: http://www.monitortests.com/forum/Thread-Custom-Resolut...
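ToastyX's tool does the timing arithmetic for you, but if you're curious what numbers a custom mode needs, here is a simplified sketch of the VESA CVT reduced-blanking calculation (the usual choice for digital connections): a fixed 160-pixel horizontal blank and a minimum 460 microsecond vertical blank. This is my own reading of the formula, it only computes the totals and pixel clock, not sync placement:

```python
def cvt_rb_mode(h_active, v_active, refresh_hz):
    """Simplified VESA CVT reduced-blanking timing.
    Returns (h_total, v_total, pixel_clock_mhz)."""
    H_BLANK = 160          # RB: fixed horizontal blanking, in pixels
    MIN_VBLANK_US = 460.0  # RB: minimum vertical blanking period
    CLOCK_STEP = 0.25      # pixel clock granularity, in MHz

    frame_us = 1_000_000.0 / refresh_hz
    # Estimate the line period from the active portion of the frame,
    # then count how many blank lines fit in the minimum vblank time.
    h_period_us = (frame_us - MIN_VBLANK_US) / v_active
    vblank_lines = int(MIN_VBLANK_US / h_period_us) + 1

    h_total = h_active + H_BLANK
    v_total = v_active + vblank_lines
    clock_mhz = refresh_hz * h_total * v_total / 1e6
    clock_mhz = int(clock_mhz / CLOCK_STEP) * CLOCK_STEP  # round down
    return h_total, v_total, clock_mhz

print(cvt_rb_mode(1920, 1200, 60))  # the 16:10 mode I was after
print(cvt_rb_mode(1920, 1080, 60))
```

For the cases I checked, this reproduces the published CVT-RB values: 2080x1235 total at 154.00 MHz for 1920x1200@60, and 2080x1111 total at 138.50 MHz for 1920x1080@60.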