plzhelpwithinternet :
Thanks for the reply! But can I get a bit of a walkthrough? I just bought an HDMI to VGA adapter that supports 1920x1200 at a 60 Hz refresh rate, but it works worse than the previous one. With the previous one I was able to use CRU to add a 1600x900 @ 60 Hz resolution, which works fine, but since the native resolution is messed up, many games are not playable at 1600x900 (without black bars) and no games are playable at 1280x720 (without black bars).
When in 1600x900 @ 60 Hz, hit the auto-adjust button on your monitor. (Sometimes it's buried in the monitor's menu options.) This forces your monitor to measure the input signal and "learn" how to synchronize to this exact signal, so the pixels and borders line up correctly, eliminating any black bars.
Repeat when you're displaying 1280x720. You have to do this at each resolution and refresh rate you intend to use for the monitor to "learn" how to map the incoming analog signal to the screen.
Repeat it again if you encounter similar problems with a full-screen game. Sometimes the 1600x900 @ 60 Hz a game uses is not exactly the same as the 1600x900 @ 60 Hz that Windows is using, and your monitor needs to learn the second variant the game is using.
As I said, analog video is very complicated. The auto-adjust button handles about 4 different settings you used to have to adjust manually by turning knobs back in the old days.
UPDATE: Also, I might've figured out the issue, but I really need some confirmation. My old adapter supported 4:3 resolutions, while this new one (1920x1200) is 16:10. My monitor's native resolution should be 16:9. Maybe I should buy a 16:9 VGA adapter?
It's not the aspect ratio. The adapter manufacturer has to program in support for the exact resolution and refresh rate (horizontal and vertical scan rates and pixel clocks). If they didn't program in 1600x900 @ 60 Hz, the adapter won't be able to do it.
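To give a sense of what "programming in a resolution" involves, here is a rough sketch of the timing math an adapter has to get right for each mode. This is an approximation of the VESA CVT Reduced Blanking v1 rules (fixed 160-pixel horizontal blanking, at least 460 µs of vertical blanking, pixel clock rounded down to a 0.25 MHz step); a real adapter's firmware may use different timing tables, so treat the numbers as illustrative, not authoritative.

```python
import math

def cvt_rb_timing(hactive, vactive, refresh_hz):
    """Approximate CVT Reduced Blanking v1 timing for one video mode.

    Returns (htotal, vtotal, pixel_clock_mhz, hfreq_khz).
    """
    # RB v1 fixes horizontal blanking at 160 pixels.
    htotal = hactive + 160

    # Vertical blanking must last at least 460 microseconds.
    # Solve vblank / ((vactive + vblank) * refresh) >= 460e-6 for vblank.
    k = 460e-6 * refresh_hz
    vblank = math.ceil(k * vactive / (1 - k))
    vtotal = vactive + vblank

    # Pixel clock, rounded down to the 0.25 MHz granularity CVT uses.
    pclk_hz = htotal * vtotal * refresh_hz
    pclk_mhz = (pclk_hz // 250_000) * 0.25

    # Horizontal scan frequency the monitor must lock onto.
    hfreq_khz = pclk_mhz * 1000 / htotal
    return htotal, vtotal, pclk_mhz, hfreq_khz

htotal, vtotal, pclk, hfreq = cvt_rb_timing(1600, 900, 60)
print(f"1600x900 @ 60 Hz: {htotal}x{vtotal} total, "
      f"{pclk} MHz pixel clock, {hfreq:.2f} kHz hsync")
```

The point is that "1600x900 @ 60 Hz" is really a bundle of numbers like these (total pixels per line, total lines per frame, pixel clock, scan frequency), and the adapter's DAC has to be programmed to generate that exact bundle.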