There seems to be some communication gap between VGA and DVI... this is something I found in one post... might be helpful...
"DVI's clock speed determines the maximum bandwidth, which is roughly resolution times refresh rate. You can get a higher resolution by lowering the refresh rate - some LCD monitors will let you run at, say, 50Hz instead of 60Hz, and while the screen is a little slower to update, it doesn't flicker the way old CRTs did. Single-link DVI has a specified maximum clock of 165MHz, though various unofficial 'overclocking' hacks exist. Dual-link DVI has at least twice the bandwidth of ordinary DVI, and, according to Wikipedia, has no official upper limit on clock speed, so it 'is constrained only by hardware'. For example, 3840x2400@31Hz is practical with the right hardware. Short, good-quality cables help.
VGA, being an analogue rather than a digital connection, tends to degrade gradually as bandwidth increases. Higher resolutions just aren't as crisp as over DVI, even with a high-quality cable and a good monitor. (This isn't so much the fault of the VGA cable as of the electronics at either end. It may be that these days anyone who cares uses DVI, so even high-end monitors use cheap electronics for the analogue-to-digital conversion.) I run 1920x1080 over VGA, but the display ends up a bit smudged compared to using DVI."
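The bandwidth arithmetic the quoted post describes can be sketched out. This is a rough back-of-the-envelope estimate, not an exact timing calculation: real video modes add blanking intervals around the visible pixels, and the ~25% overhead used here is just a ballpark assumption (reduced-blanking timings like CVT-RB need less). It also shows the refresh-rate trick: a mode that slightly overshoots single-link DVI's 165MHz limit at 60Hz can fit at 50Hz.

```python
# Rough pixel-clock estimate: visible pixels x refresh rate, padded by an
# assumed ~25% blanking overhead. Real timings (CVT, CVT-RB) differ somewhat.

SINGLE_LINK_LIMIT_MHZ = 165  # specified single-link DVI maximum clock

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for w, h, hz in [(1920, 1080, 60), (2048, 1152, 60), (2048, 1152, 50)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits single-link" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual-link"
    print(f"{w}x{h}@{hz}Hz -> ~{clk:.0f} MHz ({verdict})")
```

With these assumptions, 1920x1080@60Hz comes out around 156MHz (just under the limit), 2048x1152@60Hz overshoots it, and dropping the same resolution to 50Hz brings it back under - exactly the trade-off the post describes.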