I can have this monitor use all the available space on the screen at 1680 x 1050, but when I try to take it to 1920 x 1080 it cuts off part of the screen and leaves it black. I have updated my ATI HD 5770 drivers to the most recent version and rebooted, and I still have the same problem. I have no idea where to take this; it does the same thing in 1080p mode.
In my desktop properties, under the basic tab, it only offers me 1680 x 1050. I am confused because Windows recognizes 1920 x 1080 as my native resolution. The only other thing I could think of is that my older monitor is also connected to the PC, and it runs at a native resolution of 1680 x 1050. Does anyone have ideas on this puzzle?
Are you running both your old and new monitor at the same time?
Make sure you set each monitor individually to its native resolution. If you run duplicate/clone mode with two monitors, you are limited to the resolution of the smaller display, which leaves black bars on your larger display.
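The clone-mode limitation above can be sketched as a small calculation. This is a hypothetical Python snippet, not something read from your actual hardware; the mode lists are made-up examples:

```python
# Hypothetical supported-mode lists for the two monitors (illustrative only).
new_monitor = [(1920, 1080), (1680, 1050), (1280, 720)]
old_monitor = [(1680, 1050), (1280, 1024), (1024, 768)]

# In duplicate/clone mode both displays must show the same mode,
# so only modes common to both lists are usable.
common = [mode for mode in new_monitor if mode in old_monitor]
clone_mode = max(common, key=lambda wh: wh[0] * wh[1])

# In extended-desktop mode each display runs its own native mode independently.
extended = (max(new_monitor, key=lambda wh: wh[0] * wh[1]),
            max(old_monitor, key=lambda wh: wh[0] * wh[1]))

print(clone_mode)  # best mode both can share: (1680, 1050)
print(extended)    # each at native: ((1920, 1080), (1680, 1050))
```

So with these example monitors, clone mode caps both screens at 1680 x 1050, while extended mode lets the new screen run at 1920 x 1080.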
Are you using HDMI or DVI cables? VGA may not give you 1080p. If you are using HDMI and black borders are your problem, make sure you set overscan to 0%; the setting can be found in your Catalyst drivers.
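To see why a non-zero overscan/underscan setting produces black borders, here is the arithmetic for a hypothetical 10% underscan on a 1920 x 1080 signal (the 10% figure is just an example value, not what your driver necessarily uses):

```python
# 1920 x 1080 signal with a hypothetical 10% underscan applied by the driver.
width, height = 1920, 1080
underscan = 0.10  # example value; the Catalyst slider lets you take this to 0%

# The driver shrinks the image by the underscan factor, and the unused
# edge of the panel is filled with black.
visible_w = int(width * (1 - underscan))   # 1728
visible_h = int(height * (1 - underscan))  # 972

border_w = (width - visible_w) // 2   # 96 px of black on each side
border_h = (height - visible_h) // 2  # 54 px of black top and bottom
```

Setting the slider to 0% makes the visible area equal the full 1920 x 1080 panel, which removes the borders.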
Can you try a different cable or connection port, or try the monitor on someone else's computer? The monitor itself could be defective, or Windows could have corrupted something.
I had a somewhat similar issue around two years ago with my NEC LCD2690WUXi. It worked fine for about a year, but suddenly I no longer had the option to select 1920 x 1200. However, I was able to use it at 1920 x 1080 without any issues. I left it like that for around two months, trying every now and then to correct it by doing the following:
1. installing new drivers
2. installing old drivers
3. playing around with CCC settings
4. switching to a spare cable
5. switching DVI ports
6. scanning for viruses / trojans
Nothing worked until I did a fresh install of Windows.