Display with new graphics card looks terrible

Hello!
I have just installed a new graphics card (Dataland RX 560 4G X-Serial War), and after that I also installed the AMD driver, but the display looks terrible. The resolution is fine, but the colors look like Borderlands graphics.
Reply to chillaxton
  1. chillaxton said:
    Hello!
    I have just installed a new graphics card (Dataland RX 560 4G X-Serial War), and after that I also installed the AMD driver, but the display looks terrible. The resolution is fine, but the colors look like Borderlands graphics.


    Make sure to set the native resolution and color depth of your monitor:

    Right-click the desktop, then go to Screen resolution > Advanced settings > Monitor > Colors.

    Also make sure the refresh rate is set to the monitor's native rate.
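
    If you want a quick way to confirm what the desktop is actually outputting, something like the following works too. It's only a rough sketch (Windows only, reading the standard GDI device caps through ctypes), but it prints the current resolution, color depth, and refresh rate in one go:

        # Rough sketch: query the current desktop mode through GDI (Windows only).
        # Handy for confirming the desktop really is 1920x1080 at 32-bit color and
        # the refresh rate you expect, without hunting through the Settings app.
        import ctypes

        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32

        # GetDeviceCaps index constants from wingdi.h
        HORZRES, VERTRES, BITSPIXEL, VREFRESH = 8, 10, 12, 116

        hdc = user32.GetDC(0)  # device context for the primary display
        try:
            width = gdi32.GetDeviceCaps(hdc, HORZRES)
            height = gdi32.GetDeviceCaps(hdc, VERTRES)
            depth = gdi32.GetDeviceCaps(hdc, BITSPIXEL)   # bits per pixel; 32 = "32-bit color"
            refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)  # vertical refresh in Hz
        finally:
            user32.ReleaseDC(0, hdc)

        print(f"{width}x{height}, {depth}-bit color, {refresh} Hz")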
    Reply to maxalge
  2. maxalge said:
    Make sure to set the native resolution and color depth of your monitor:

    Right-click the desktop, then go to Screen resolution > Advanced settings > Monitor > Colors.

    Also make sure the refresh rate is set to the monitor's native rate.


    The resolution and refresh rate are native, but I don't know about the color depth.
    I tried playing with the AMD display settings to fix it, but it's not helping.
    I am using Windows 10.
    Reply to chillaxton
  3. Some HDMI displays tend to be treated as TVs by the display driver and are subject to chroma subsampling issues.

    If you're using a TV as your display, you need to ensure the input is set as 1:1, Dot-by-Dot, DVI-PC, or whatever the manufacturer of the TV specifies to give you a raw HDMI input.

    In your graphics card's control panel, ideally you want to set the output to RGB 4:4:4 pixel format or better.
    Reply to bigpinkdragon286
  4. chillaxton said:
    The resolution and refresh rate are native, but I don't know about the color depth.
    I tried playing with the AMD display settings to fix it, but it's not helping.
    I am using Windows 10.


    Make and model of your monitor?
    Reply to maxalge
  5. bigpinkdragon286 said:
    Some HDMI displays tend to be treated as TVs by the display driver and are subject to chroma subsampling issues.

    If you're using a TV as your display, you need to ensure the input is set as 1:1, Dot-by-Dot, DVI-PC, or whatever the manufacturer of the TV specifies to give you a raw HDMI input.

    In your graphics card's control panel, ideally you want to set the output to RGB 4:4:4 pixel format or better.


    Yes! I am using a TV. I have set the output to RGB 4:4:4, but I don't really know how to set the input to 1:1.

    My TV model is: Philips 221TE2L
    Reply to chillaxton
  6. maxalge said:
    Make and model of your monitor?


    Philips 221TE2L
    Reply to chillaxton
  7. I am not sure how to explain this problem: all the colors look ugly, black areas have a white glow, and although the resolution is 1080p it still looks pixelated; nothing looks smooth.
    Reply to chillaxton
  8. chillaxton said:
    I am not sure how to explain this problem: all the colors look ugly, black areas have a white glow, and although the resolution is 1080p it still looks pixelated; nothing looks smooth.


    1080p with 32-bit color: is it set to 32-bit color?

    Are you using HDMI?
    Reply to maxalge
  9. maxalge said:
    1080p with 32-bit color: is it set to 32-bit color?

    Are you using HDMI?


    Yes, today I got a new HDMI cable. It's set to 1080p, but I don't know how to check whether it's 32-bit.
    Reply to chillaxton
  10. chillaxton said:
    Yes, today I got a new HDMI cable. It's set to 1080p, but I don't know how to check whether it's 32-bit.


    I already posted how; read my previous post:

    Right-click the desktop, then go to Screen resolution > Advanced settings > Monitor > Colors.
    Reply to maxalge
  11. Yeah, I saw that, but it's not like that for me.
    Reply to chillaxton
  12. I have Display settings and Personalization.
    Reply to chillaxton
  13. Reply to chillaxton
  14. maxalge said:
    I already posted how; read my previous post:

    Right-click the desktop, then go to Screen resolution > Advanced settings > Monitor > Colors.


    Yeah, that wasn't working for me as my settings are different, but I was trying to find it and I think I found it, but I still can't see that colors setting. Here is a screenshot: https://cdn.discordapp.com/attachments/215008064829521921/346039335558184962/unknown.png
    Reply to chillaxton
  15. The only possible section I can find in the manual for your TV is on page 21:

    Philips Manual - Page 21

    I would cycle through available options for both the Wide mode and HDMI overscan settings. One of those might give you a 1:1 pixel output on the display. Setting your TV for 1:1 will likely fix the problem, if you can, but some TVs actually lack the ability to perform 1:1 digital output, making them poorly suited for use as a PC display. Never having tried with the Philips brand, I really can't say with any certainty about your model.
    Reply to bigpinkdragon286
  16. bigpinkdragon286 said:
    The only possible section I can find in the manual for your TV is on page 21:

    Philips Manual - Page 21

    I would cycle through available options for both the Wide mode and HDMI overscan settings. One of those might give you a 1:1 pixel output on the display. Setting your TV for 1:1 will likely fix the problem, if you can, but some TVs actually lack the ability to perform 1:1 digital output, making them poorly suited for use as a PC display. Never having tried with the Philips brand, I really can't say with any certainty about your model.


    Yeah, I tried those settings but they don't help. Before, I was using a GTX 650 1GB over VGA on this monitor and it had zero problems.
    Reply to chillaxton
  17. That's because VGA inputs on TVs are always considered and treated as 1:1 inputs to the display's output. HDMI is still subject to the idiocy of overscan. The drawback to VGA is that it's analog.

    As a last resort, you could try forcing the graphics card driver to output using DVI mode. The biggest downside is that you lose audio over HDMI because, well, that's just the way AMD programmed their drivers to behave. I have to do this with one of my TVs that AMD keeps identifying as a TV rather than a DVI device. The other downside is that it requires you to jump through some hoops to install an unsigned .inf file for the Monitor device in Device Manager.
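
    The .inf trick overrides how the monitor device is identified, so it can help to first see what the driver actually read from the display (the EDID block it identifies the TV by). This is only a sketch, assuming a stock Windows 10 install; it walks the standard PnP display registry keys and prints the first bytes of each stored EDID:

        # Sketch: list the monitor instances Windows has enumerated and show the
        # start of each one's EDID, as stored under the standard PnP display keys.
        import winreg

        BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
            i = 0
            while True:
                try:
                    model = winreg.EnumKey(display, i)  # e.g. a "PHL..." ID for Philips
                except OSError:
                    break
                i += 1
                with winreg.OpenKey(display, model) as model_key:
                    j = 0
                    while True:
                        try:
                            instance = winreg.EnumKey(model_key, j)
                        except OSError:
                            break
                        j += 1
                        try:
                            sub = instance + r"\Device Parameters"
                            with winreg.OpenKey(model_key, sub) as params:
                                edid, _ = winreg.QueryValueEx(params, "EDID")
                                print(model, instance, edid[:16].hex())
                        except OSError:
                            pass  # some instances have no EDID stored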
    Reply to bigpinkdragon286
  18. bigpinkdragon286 said:
    That's because VGA inputs on TVs are always considered and treated as 1:1 inputs to the display's output. HDMI is still subject to the idiocy of overscan. The drawback to VGA is that it's analog.


    So if I get an HDMI-to-VGA adapter, will that help?
    Reply to chillaxton
  19. bigpinkdragon286 said:
    That's because VGA inputs on TVs are always considered and treated as 1:1 inputs to the display's output. HDMI is still subject to the idiocy of overscan. The drawback to VGA is that it's analog.

    As a last resort, you could try forcing the graphics card driver to output using DVI mode. The biggest downside is that you lose audio over HDMI because, well, that's just the way AMD programmed their drivers to behave. I have to do this with one of my TVs that AMD keeps identifying as a TV rather than a DVI device. The other downside is that it requires you to jump through some hoops to install an unsigned .inf file for the Monitor device in Device Manager.


    Well, I am not sure if it will be visible, but if I take a picture of my screen tomorrow, can you tell me for sure whether it is a chroma subsampling issue?
    Reply to chillaxton
  20. Chroma Subsampling link

    Scroll about 3/5 down the page and click the image next to the heading "How to test for Chroma Subsampling"

    You'll be able to quickly tell for yourself. If the image is clear in all areas, Chroma compression isn't the issue.
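
    If you'd rather make your own pattern, something like this works too. It's just a sketch using the Pillow library (pip install pillow): alternating one-pixel red and blue columns carry most of their detail in the chroma channels, so a 4:2:2 or 4:2:0 link smears them toward purple, while a true RGB 4:4:4 link shows crisp stripes. View the file full-screen at native resolution:

        # Sketch: generate a simple chroma-subsampling test pattern with Pillow.
        # Alternating 1-px red and blue columns stay crisp over an RGB 4:4:4 link
        # but smear toward purple when the output is chroma subsampled.
        from PIL import Image

        W, H = 1920, 1080
        img = Image.new("RGB", (W, H))
        px = img.load()
        for x in range(W):
            color = (255, 0, 0) if x % 2 == 0 else (0, 0, 255)
            for y in range(H):
                px[x, y] = color
        img.save("chroma_test.png")  # display 1:1 (no scaling) to judge the result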
    Reply to bigpinkdragon286
  21. You can't use a mere HDMI-to-VGA adapter. You need a proper converter, as you are converting a digital signal to an analog one, which requires an active process. They run under $20 here in the States, so hopefully, if you go that route, you can find something for a similar price where you are.
    Reply to bigpinkdragon286
  22. bigpinkdragon286 said:
    Some HDMI displays tend to be treated as TVs by the display driver and are subject to chroma subsampling issues.

    If you're using a TV as your display, you need to ensure the input is set as 1:1, Dot-by-Dot, DVI-PC, or whatever the manufacturer of the TV specifies to give you a raw HDMI input.

    In your graphics card's control panel, ideally you want to set the output to RGB 4:4:4 pixel format or better.


    bigpinkdragon286 said:
    Chroma Subsampling link

    Scroll about 3/5 down the page and click the image next to the heading "How to test for Chroma Subsampling"

    You'll be able to quickly tell for yourself. If the image is clear in all areas, Chroma compression isn't the issue.


    I tried that test. Besides all the colors looking ugly, there is a very slight blur on red-on-black, a fairly noticeable blur on blue-on-black, and red-on-blue and blue-on-red are hardly visible; everything else is OK.

    Here is a picture of it: https://ibb.co/dHX1sF
    Reply to chillaxton
  23. bigpinkdragon286 said:
    That's because VGA inputs on TVs are always considered and treated as 1:1 inputs to the display's output. HDMI is still subject to the idiocy of overscan. The drawback to VGA is that it's analog.

    As a last resort, you could try forcing the graphics card driver to output using DVI mode. The biggest downside is that you lose audio over HDMI because, well, that's just the way AMD programmed their drivers to behave. I have to do this with one of my TVs that AMD keeps identifying as a TV rather than a DVI device. The other downside is that it requires you to jump through some hoops to install an unsigned .inf file for the Monitor device in Device Manager.


    Well, I don't really care about HDMI audio, as I have speakers anyway. So how can I force it to output using DVI mode?
    Reply to chillaxton
  24. bigpinkdragon286 said:
    You can't use a mere HDMI-to-VGA adapter. You need a proper converter, as you are converting a digital signal to an analog one, which requires an active process. They run under $20 here in the States, so hopefully, if you go that route, you can find something for a similar price where you are.


    So which adapter or cable would help me? I think I will be able to get one for less than $10.
    Reply to chillaxton