Doesn't fit the native resolution in Samsung S22D300 monitor on Radeon R9 280

kanishmarsh

I'm using a Samsung S22D300 22" monitor on a Sapphire Radeon R9 280. The monitor's native resolution is 1920x1080, but even after installing the Radeon graphics driver the picture doesn't fit the monitor - it always comes up at 1024x768. Even with the very latest AMD graphics driver the problem is the same.

I installed all the motherboard drivers and the monitor driver too. My motherboard is a B85M-D3H (system specs: i5-4590, 8 GB RAM). I'm using a DVI to VGA (D-Sub) adapter to connect the monitor.

Because of that problem I have to change the resolution manually through the AMD Radeon settings [Catalyst Control Center --> Properties (VGA Display) --> Monitor Attributes].

Once I change the resolution in Catalyst Control Center, Windows lets me change it manually through its own display settings as well. Before that, it only offered 1024x768, with no option to change the resolution.

The problem is the same on Windows 7 Ultimate, Windows 8.1 Pro and Windows 10 Enterprise, yet games always detect the monitor's native 1080p resolution.

I just want to know the reason for this problem.
 
rhysiam

I don't know exactly what's going on, but it's almost certainly down to the adapter you're using. At the moment you're running an analogue signal, which isn't ideal at that resolution anyway.

Why not grab a cheap cable and get an adapter-free digital connection? There's a 90% chance that'll fix it, and it'll give you a better (pure digital) picture to boot.

Your monitor has HDMI according to the specs I googled - is that right? If your R9 280 has an HDMI port, grab a standard HDMI cable. If it doesn't, any DVI to HDMI cable should work. The signals are identical at 1080p, so no electronics or conversion is necessary.
 

Solution

The VGA connector is analog. It dates back to the CRT days, when monitors didn't have a fixed resolution, just ranges of horizontal and vertical scan rates they could handle. A VGA connection doesn't inherently tell your video card the maximum resolution the monitor supports; the monitor has to report its capabilities over the connector's two DDC data pins, as a small block of data called the EDID.

It sounds like that information isn't reaching your card properly. The fault could be in the monitor's VGA port, the VGA cable, the DVI to VGA adapter, or your video card's DVI port. When the video card doesn't know what resolutions and scan frequencies the monitor can handle, it errs on the side of safety and picks the minimum - 1024x768 in your case. The reason is that driving a monitor at higher frequencies than it can handle can destroy it; this is one of the few parts of a computer where a software setting can damage or destroy hardware.
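If you're curious what that capability data actually looks like, it's a 128-byte block the monitor hands over on request. Here's a rough Python sketch of how the preferred (native) mode is packed into it - the file name is just a placeholder, you'd need to dump the EDID first with whatever tool you have on hand:

```python
# Minimal sketch: pull the preferred (native) mode out of a raw 128-byte EDID dump.
# "edid.bin" is only an example name - dump the EDID however you like first.

def preferred_mode(edid: bytes):
    # Sanity-check the base EDID block: fixed 8-byte header plus a checksum.
    if len(edid) < 128 or edid[0:8] != bytes.fromhex("00ffffffffffff00"):
        raise ValueError("not a valid EDID header")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed - data got mangled on the way over")

    # The first 18-byte detailed timing descriptor (offset 54) is the preferred mode.
    d = edid[54:72]
    if d[0] == 0 and d[1] == 0:
        raise ValueError("first descriptor is not a detailed timing descriptor")
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # horizontal active pixels
    v_active = d[5] | ((d[7] & 0xF0) << 4)  # vertical active lines
    return h_active, v_active

if __name__ == "__main__":
    with open("edid.bin", "rb") as f:
        w, h = preferred_mode(f.read())
    print(f"Monitor advertises a preferred mode of {w}x{h}")
```

On a healthy S22D300 that should report 1920x1080; if the data never arrives, or arrives corrupted, the card falls back to a safe minimal mode instead.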

Doing what rhysiam suggested is probably the best solution. Straight HDMI, or DVI to HDMI, gives you a pure digital signal, so no more sync or EDID issues, and the picture might even become slightly clearer. The only catches: some monitors reduce your color-tweaking options on a digital input (those options exist mainly to compensate for signal degradation in an analog VGA cable), and some monitors assume an HDMI input is a video source and automatically overscan it (make the picture larger than the screen, cutting off the edges), so you may have to find the setting to turn overscan off.

If you insist on using the VGA connection, you have two options. You can go through all the parts I listed and replace them one at a time to figure out which is causing the problem (my bet is the DVI to VGA adapter). Or you can see if Samsung offers a monitor driver for the S22D300. Monitor "drivers" aren't really drivers - they're just an INF file listing the supported resolutions and sync frequencies, and Windows uses that list to decide which resolutions and refresh rates to show you as available.
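One way to check which side is at fault: Windows caches whatever EDID it last received from each monitor in the registry, under the plug-and-play display keys. A rough Python sketch (assuming Windows and the standard registry layout - adjust as needed) to list what's cached:

```python
# Minimal sketch (Windows only): list the EDID blobs Windows has cached for monitors
# under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY.
import winreg

def cached_edids():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    found = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):            # monitor model keys
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):  # instance keys
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            found.append((model, instance, bytes(edid)))
                    except OSError:
                        pass  # no EDID cached for this monitor instance
    return found

if __name__ == "__main__":
    for model, instance, edid in cached_edids():
        print(f"{model} ({instance}): {len(edid)} bytes of EDID")
```

If the Samsung shows up with no cached EDID, or with a blob that fails the checksum from the earlier sketch, that points at the cable/adapter chain rather than the graphics driver.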
 