DVI supports much higher resolutions, while VGA tops out around 2048x1536 @ 85 Hz. VGA is analog and DVI is digital, I believe; DVI is meant to provide better quality, but I can't tell the difference between DVI and VGA on a desktop.
With VGA the entire screen has to refresh, but with DVI only the pixels that have changed need to be refreshed, which is why you don't see scan lines running down the LCD monitors in the background of a TV show like you did with the old CRT ones.
Mouse, you're confusing LCD with CRT, and VGA with DVI.
And Lmeow, you're confusing the RAMDAC limitation with a VGA limit. If the RAMDACs were more capable (i.e. above 400 MHz) you could send higher resolutions along the connection.
And while VGA is analogue, DVI can be either; I use DVI-A at work to drive my IBM P260.
And DVI has lower resolution/bandwidth than 1920x1200 @ 60 Hz needs if a single-link TMDS transmitter is driving it, or if it's just a plain single-link connection (few are overdriven the way ATi's single-link HDMI is).
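To put rough numbers on the bandwidth claims above, here's a back-of-envelope sketch in Python. The ~25% blanking overhead and 165 MHz single-link TMDS ceiling are my assumptions for illustration; real CVT/GTF timings vary, so treat the outputs as ballpark figures, not exact specs.

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock required, in MHz.

    Assumes blanking (non-visible timing) adds roughly 25% on top of
    the visible pixels unless told otherwise. Real timing standards
    (GTF, CVT, CVT-RB) compute this differently.
    """
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# VGA's practical ceiling tracks the RAMDAC speed:
print(round(pixel_clock_mhz(2048, 1536, 85)))        # ~334 MHz, near a 400 MHz RAMDAC

# Single-link DVI tops out at a 165 MHz pixel clock:
print(round(pixel_clock_mhz(1920, 1200, 60)))        # ~173 MHz: over the limit with full blanking
print(round(pixel_clock_mhz(1920, 1200, 60, 0.05)))  # ~145 MHz: fits only with reduced blanking
```

Which is roughly why 1920x1200 @ 60 Hz over single-link DVI only works with reduced-blanking timings.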
For practical purposes, the differences between DVI and VGA come down to DVI's ability to carry audio (not widely used outside of ATi and nVidia) and its ability to pass along HDCP tokens/keys for watching protected material.
The difference in quality depends a lot on the quality of both the cables and the hardware at each end. If any of them are crap you can end up with a crummy image, but a digital connection is a little less susceptible to noise and may give you a better experience when dealing with dodgy components.