1080p TV outputs 1080i from desktop

schornery

Honorable
May 24, 2012
2
0
10,510
Hello,
I have an Insignia TV that is capable of displaying 1080p. When I connect the TV to my Radeon HD 6850's HDMI port (no adapters, just a straight HDMI cable), the TV displays 1080i instead of 1080p. During BIOS startup it is in 1080p, but when Windows starts it changes to 1080i. If I connect my Acer H233H the exact same way, it displays 1080p, so the problem seems to be specific to this TV-and-computer combination.
The TV is an Insignia NS-32L450A11; the computer recognizes it, and I can't find any drivers for it, if any exist.
So why is the TV only receiving a 1080i signal?
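
For anyone who wants to check this from the Windows side, here is a rough Python sketch that prints the mode Windows thinks each attached output is actually running, including whether it is interlaced. It assumes the pywin32 package (pip install pywin32) and uses only the stock EnumDisplayDevices/EnumDisplaySettings calls; nothing AMD-specific.

Code:
# Sketch: print the current mode of every display attached to the desktop.
# Assumes pywin32 is installed (pip install pywin32).
import win32api
import win32con

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1  # wingdi.h StateFlags bit
DM_INTERLACED = 0x2                       # wingdi.h DisplayFlags bit

i = 0
while True:
    try:
        dev = win32api.EnumDisplayDevices(None, i)
    except win32api.error:
        break  # pywin32 raises once the index runs past the last adapter
    if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        mode = win32api.EnumDisplaySettings(dev.DeviceName,
                                            win32con.ENUM_CURRENT_SETTINGS)
        scan = "interlaced" if mode.DisplayFlags & DM_INTERLACED else "progressive"
        print(f"{dev.DeviceName} ({dev.DeviceString}): "
              f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz, {scan}")
    i += 1

A 1080i signal usually shows up here as 1920x1080 at 29 or 30 Hz with the interlaced flag set, which matches what the TV's info banner reports.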
 

CoolBOBob1

Distinguished
Jul 17, 2009
45
0
18,540
I'm at work and have an older AMD driver, but try this: go to your Catalyst Control Center (right-click on the desktop), go to Monitor Properties -> HDTV Support, and see if you can check the box for 1080p60.
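
If you want to see what the driver is actually exposing before and after ticking that box, a rough pywin32 sketch like this lists every 1920x1080 mode Windows knows about for one output. The \\.\DISPLAY2 name is only an example; substitute whichever device name corresponds to the TV.

Code:
# Sketch: list every 1920x1080 mode the driver exposes for one display output.
# Assumes pywin32 (pip install pywin32). The device name below is only an
# example; the HDMI-connected TV may be \\.\DISPLAY1 or \\.\DISPLAY2 on your box.
import win32api

DM_INTERLACED = 0x2  # wingdi.h DisplayFlags bit
DEVICE = r"\\.\DISPLAY2"

i = 0
while True:
    try:
        mode = win32api.EnumDisplaySettings(DEVICE, i)
    except win32api.error:
        break  # pywin32 raises once the mode index runs past the end
    if mode.PelsWidth == 1920 and mode.PelsHeight == 1080:
        scan = "i" if mode.DisplayFlags & DM_INTERLACED else "p"
        print(f"1080{scan} @ {mode.DisplayFrequency} Hz, {mode.BitsPerPel}-bit")
    i += 1

If 1080p at 60 Hz never shows up in that list, the driver simply isn't offering the mode for that output, and that is what the HDTV Support checkbox is meant to change.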
 

schornery

Honorable
May 24, 2012
2
0
10,510

I selected "Add 1080p60/24/50 format to the display manager (NTSC/HD/PAL)" in "HDTV modes supported by this display" and clicked apply.
But I didn't know how to implement them other than to go to the section below on the same page labled "Predefined and Custom HDTV Formats" and select the 1080p24 standard option as the other frequencies were not visible.
Then I clicked apply format and now it is in 1080p according to my TV which is good enough for me.
Was that how I was supposed to do that? I don't know which display manager they are referring to and where the "force" button is.
 

-mt

Honorable
Nov 10, 2012
1
0
10,510


CoolBOBob1, I too have an Insignia 1080p HDTV.
Connecting my PS3 system: detects and uses 1080p :)
Connecting my old Vista 64 Gateway laptop with an nVidia 9800M: detects and plays 1080p at 60 Hz :)
Connecting my new Win 7 64 desktop with a Radeon HD 6670: detects as 1080i at 30 Hz. :pfff:

The Radeon-equipped desktop refuses to allow 1080p60, which I have been using with the two other devices since I purchased the HDTV over a year ago. Also note: it detects as interlaced, not progressive, on the Win 7/Radeon box. This is a 1080p HDTV, not a 1080i HDTV. :sarcastic:

I'd rather not go the route of a DVI-to-HDMI converter; it doesn't carry audio, AFAIK.
 

Blackjack Davy

Distinguished
Dec 7, 2009
17
0
18,520
Bumping the thread...

I have exactly the same problem. I had a Radeon HD 4870 X2 with two DVI outputs on the back: one was hooked up directly to the monitor, the other to the TV via a DVI-to-HDMI dongle.

With that setup, the TV had full 1080p via DVI/HDMI. Sound, too.

I've just upgraded to an HD 7950 with only one DVI output and an HDMI port. That's where the trouble started...

The DVI will output pure 1080p. The HDMI will too, but only without the AMD display driver installed.

How do I know? At BIOS boot I get cloned displays on the monitor and the TV, and the TV reports 1080p. Everything is fine until Windows boots. Then the TV display dies and it reports that it cannot display the signal. It will display 1080i, but not 1080p.

That's not all: the maximum resolution it will display progressive is 720p. Anything above that is interlaced only.

So I plugged the DVI-to-HDMI dongle into the DVI port, hooked it up to the TV, and guess what: I get 1080p, in Windows. I also get audio via the TV!

Great, you might think, except that if I hook the monitor up to the HDMI port, the monitor will ONLY display a maximum resolution of 720p (1280x720). That's all, no more. The monitor, of course, only does progressive scan; it doesn't do interlaced...

So, to cut a long story short, my conclusion is this: the DVI port will output a maximum of 1080p, the HDMI port a maximum of 720p (higher only via interlaced modes), and it is only the AMD display driver that imposes this limit.

EDIT: I've found a sort of fix. The display isn't reporting its EDID information correctly via HDMI for some reason. For this partial fix (it only works on the MONITOR, not the TV, so plug the TV into the DVI output and the MONITOR into HDMI): right-click on the desktop > Screen Resolution, select your monitor's display > Advanced settings > List All Modes.

If the maximum resolution reported is 1280x720, click Monitor > untick "Hide modes that this monitor cannot display", then click OK. Click OK again to close Screen Resolution.

Open Screen Resolution again and you should have the option to select 1920x1080. If for some reason your desired resolution isn't there (e.g. I wanted 1920x1200, 16:10), you'll have to create a custom resolution in the registry or use ToastyX's tool to create it for you: http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU
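
Since the whole issue seems to revolve around the EDID the display hands over HDMI, here is a rough Python sketch for pulling the EDID that Windows has cached in the registry and decoding its preferred (first) detailed timing, so you can see whether the panel is actually advertising 1080p60 or the driver is misreading it. It only reads the standard SYSTEM\CurrentControlSet\Enum\DISPLAY keys; the decoding follows the EDID 1.3 detailed-timing layout and is just a sanity check, not a substitute for CRU.

Code:
# Sketch: dump the preferred timing from every EDID Windows has cached.
# Reads HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<model>\<instance>\Device Parameters\EDID.
import winreg

def preferred_timing(edid):
    """Decode the first 18-byte detailed timing descriptor (EDID offset 54)."""
    d = edid[54:72]
    clock_hz = (d[0] | d[1] << 8) * 10_000       # pixel clock stored in 10 kHz units
    h_active = d[2] | (d[4] & 0xF0) << 4
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] & 0xF0) << 4
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    interlaced = bool(d[17] & 0x80)
    rate = clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    if interlaced:
        v_active *= 2                            # DTD stores active lines per field
    return h_active, v_active, rate, interlaced  # rate is the field rate if interlaced

root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                      r"SYSTEM\CurrentControlSet\Enum\DISPLAY")
for i in range(winreg.QueryInfoKey(root)[0]):
    model = winreg.EnumKey(root, i)
    model_key = winreg.OpenKey(root, model)
    for j in range(winreg.QueryInfoKey(model_key)[0]):
        instance = winreg.EnumKey(model_key, j)
        try:
            params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
            edid, _ = winreg.QueryValueEx(params, "EDID")
        except OSError:
            continue                             # instance without a cached EDID
        if len(edid) < 128:
            continue
        w, h, hz, il = preferred_timing(bytes(edid))
        print(f"{model}\\{instance}: preferred {w}x{h} @ {hz:.1f} Hz "
              f"({'interlaced' if il else 'progressive'})")

A healthy 1080p panel should report a preferred timing of 1920x1080 at roughly 60 Hz, progressive; if that is what the EDID says and the driver still refuses 1080p60 over HDMI, the driver is the one at fault.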

Hope that helps someone. It's been quite a ride.
 

Mantas Balaisa

Honorable
Jan 31, 2014
2
0
10,510


Bumping again, as I'm having exactly the same problem. My AMD R9 270X hooked up to a Samsung F5000 LED TV over a straight HDMI-to-HDMI cable outputs a maximum of 1080i at 30 Hz. The driver settings show that the TV is capable of full HD 1080p at 60 Hz (and I know it is, because I can connect my notebook with its GT 540M without any problem), but when I select that mode the TV shows either no signal or "Resolution not supported". (It actually auto-selects 1080p 60 Hz after the driver install, so I had to boot Windows in low-resolution mode to change 1080p to something lower.) However, everything works brilliantly (1080p 60 Hz) right after a Windows 8.1 install, at least until the AMD display drivers replace the native Windows ones. Should I try getting a DVI-to-HDMI cable? I just don't get it. Why doesn't AMD fix this when it is such a common problem? I was weighing whether to pay more for a GTX 760 or go for the better-value R9 270X; it seems my decision was wrong... Now I understand why you have to pay a little more for an Nvidia product.

 

CoolBOBob1

Distinguished
Jul 17, 2009
45
0
18,540
It's actually an issue with the TV not identifying itself correctly. I always seem to answer this when I'm at work, but under the advanced menu and display options there should be a button that lets you choose "unsupported" resolutions. It'll give you a list; choose your resolution, and it will then be available wherever you normally change resolutions. At least AMD seems to have the fewest issues with overscan on my old flat screen.

 

Mantas Balaisa

Honorable
Jan 31, 2014
2
0
10,510
If it's a TV problem, then why does the exact same TV, with the exact same cable, work like a charm when connected to my laptop with its GeForce GT 540M? It instantly outputs 1920x1080 @ 60 Hz without any problems.
 

CoolBOBob1

Distinguished
Jul 17, 2009
45
0
18,540
OK, I'm finally at home. Go to your Catalyst Control Center and make sure you're in advanced view. Go to "My Digital Flat-Panel" -> "HDTV Support (Digital Flat-Panel)", then check the box that your TV is compatible with; in my case it was "1080p60 (NTSC)". If you're international it might be "1080p50 (PAL)".

Then go to where you normally change your resolution in Catalyst Control Center; it should now be listed as a choice.
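
If the new mode still doesn't appear there, you can also ask the driver for it directly and see whether it accepts or rejects the mode. Below is a rough pywin32 sketch (pip install pywin32); the \\.\DISPLAY2 device name is only an example, so point it at whichever output is the TV.

Code:
# Sketch: ask the driver for 1920x1080 @ 60 Hz progressive on one output and
# report whether it accepts the mode. Assumes pywin32 (pip install pywin32).
import win32api
import win32con

DEVICE = r"\\.\DISPLAY2"   # example name; substitute the TV's output

mode = win32api.EnumDisplaySettings(DEVICE, win32con.ENUM_CURRENT_SETTINGS)
mode.PelsWidth = 1920
mode.PelsHeight = 1080
mode.DisplayFrequency = 60
mode.DisplayFlags = 0      # clear the interlaced bit -> progressive
mode.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT |
               win32con.DM_DISPLAYFREQUENCY | win32con.DM_DISPLAYFLAGS)

# CDS_TEST only validates the mode against the driver without switching to it.
result = win32api.ChangeDisplaySettingsEx(DEVICE, mode, win32con.CDS_TEST)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print("Driver accepts 1080p60; switching (not persisted to the registry).")
    win32api.ChangeDisplaySettingsEx(DEVICE, mode, 0)
else:
    print(f"Driver rejected 1080p60 (ChangeDisplaySettingsEx returned {result}).")

If the test call still fails after the CCC checkbox is ticked, the driver is filtering the mode out, and a custom resolution (e.g. via CRU, linked earlier in the thread) is probably the next step.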
 

Mr_Zimmerman

Reputable
Jun 10, 2014
1
0
4,510


I have done this, but the option I've checked on the HDTV Support screen doesn't appear anywhere else as selectable. I've been trying to get this working on the 14.6b CCC, but I think I will simply roll it back to 14.4 because I didn't have this issue until I changed driver packages.
 

louis kemp

Reputable
Apr 14, 2015
1
0
4,510
If you have an Nvidia graphics card, this problem is easily fixed.
Go to your desktop and right-click, then select Nvidia Control Panel. Once it's open, select "Change resolution", which should be under "Display". From there you can choose what resolution your PC outputs: select 1080p (1920x1080), then click Apply in the bottom corner of the screen and click Yes to keep the changes.

 

teoteodore

Distinguished
Mar 9, 2015
16
0
18,510
I know this is an old thread, but I'd like an update. I am using a Samsung 32" 1080p HDTV with a new Dell XPS 8700 equipped with an AMD Radeon R9 video card on Windows 8.1, connected via an HDMI cable. At the recommended 1920x? resolution the result is illegible text and cartoonlike pictures and video. After hours of trial and error with all the settings in AMD, Windows, and the HDTV, I thought I'd hook up my old Dell monitor to see if the HDTV was defective.

I used dual monitors, plugging the smaller monitor into the DVI port of the video card, and voilà: the HDTV switched to the lesser monitor's output in perfect resolution (full height but half width). As soon as I unplugged the smaller monitor, the HDTV defaulted back to the supposedly larger image and poor quality. For a while I kept the smaller monitor under my desk (unplugged from power, its plug still connected to the DVI port) and used the HDTV as a slave monitor to it, just to get the perfect image on a larger screen. Does all this sound like it really could be as simple as an HDMI-to-DVI adapter? (I ordered one anyway.)
 

Chrysaliarus

Commendable
Dec 25, 2016
3
0
1,510


What I did to fix this problem is go into the advanced display settings. Under the related display settings there is "Display adapter properties".
[Screenshot: advanced display settings page showing the "Display adapter properties" link]


Click "Display adapter properties" and go to the Monitor tab; there you have to change the screen refresh rate from whatever it was before to your monitor's actual refresh rate. (Mine was at 29 Hz for some reason.)
[Screenshot: monitor properties dialog with the screen refresh rate dropdown]
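
If the dropdown keeps falling back to 29/30 Hz, the same change can be attempted from a script, which at least tells you whether the driver is rejecting 60 Hz outright. Here is a rough pywin32 sketch (pip install pywin32); the device name is only an example.

Code:
# Sketch: bump the refresh rate of one output to 60 Hz without touching the
# resolution. Assumes pywin32 (pip install pywin32); device name is an example.
import win32api
import win32con

DEVICE = r"\\.\DISPLAY1"   # substitute the display whose rate is stuck at 29/30 Hz

mode = win32api.EnumDisplaySettings(DEVICE, win32con.ENUM_CURRENT_SETTINGS)
print(f"Current: {mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz")

mode.DisplayFrequency = 60
mode.Fields = win32con.DM_DISPLAYFREQUENCY
result = win32api.ChangeDisplaySettingsEx(DEVICE, mode, win32con.CDS_TEST)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    win32api.ChangeDisplaySettingsEx(DEVICE, mode, 0)
    print("Switched to 60 Hz.")
else:
    print(f"60 Hz rejected by the driver (code {result}); it may only offer 29/30 Hz here.")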


Hope this helps
 
