I've got a Samsung 245T monitor, a 24" LCD model that I bought about four years ago. Its native resolution is 1920x1200 (16:10 aspect ratio), and I've used it at that resolution successfully in the past.
I recently tore apart my system and rebuilt it, and since then the monitor has not offered 1920x1200 as a resolution choice in Windows (7 Professional) when I attach it to the system.
I'm trying to use an HDMI cable, and I've tried attaching it to a GTX 660 graphics card, the motherboard's onboard video, and even my laptop as a secondary display. In each case, I am offered 1920x1080 as the "highest" resolution.
In the past, I was using a DVI cable, but I had some issues with the cable itself, so I thought I'd switch to HDMI since I have a couple of extra HDMI cables lying around the house.
Does anyone have any ideas what could be causing this behavior?
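For what it's worth, the modes a monitor advertises come from its EDID block, and the native mode lives in the first 18-byte detailed timing descriptor. Here's a minimal sketch of decoding the active resolution from such a descriptor, using synthetic bytes for a 1920x1200 mode (only the fields being decoded are filled in; a real EDID would come from the OS or a tool like `read-edid`):

```python
def decode_detailed_timing(dtd: bytes) -> tuple[int, int]:
    """Decode the active resolution from an 18-byte EDID detailed timing descriptor.

    Horizontal/vertical active pixel counts are split across a low byte and
    the high nibble of a shared byte, per the VESA EDID layout.
    """
    if len(dtd) != 18:
        raise ValueError("descriptor must be 18 bytes")
    hactive = ((dtd[4] >> 4) << 8) | dtd[2]   # high nibble of byte 4 + byte 2
    vactive = ((dtd[7] >> 4) << 8) | dtd[5]   # high nibble of byte 7 + byte 5
    return hactive, vactive

# Synthetic descriptor for a 1920x1200 mode (1920 = 0x780, 1200 = 0x4B0)
dtd = bytearray(18)
dtd[2] = 0x80   # horizontal active, low 8 bits
dtd[4] = 0x70   # high nibble: horizontal active, upper 4 bits
dtd[5] = 0xB0   # vertical active, low 8 bits
dtd[7] = 0x40   # high nibble: vertical active, upper 4 bits
print(decode_detailed_timing(bytes(dtd)))  # -> (1920, 1200)
```

If the decoded native mode is 1920x1200 but Windows only offers 1920x1080, that points at the link (cable or port) rather than the monitor itself.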
I scavenged around and found the packaging for the HDMI cable I'd been using, and it said "max resolution 1080p," just as you suggested. I swapped in a new DVI-D cable and all is right with the world! Thanks, nasty!