HDMI and DVI are digital video. Your computer tells the monitor exactly what color and brightness to make each individual pixel. You should not be getting fuzzy text. Or rather, there is no mechanism by which these cables/adapters can be responsible for text being fuzzy. There is no way for the instructions for one pixel to "bleed" into adjacent pixels with digital video.
Converting to VGA with an adapter will not improve your situation; all that does is insert a digital -> analog -> digital step in the middle. The only way it would help is if the problem is caused by one of the reasons below, and the VGA input happens to suffer from it less or bypasses it entirely.
- Check to make sure the ClearType settings are to your liking. There's one setting within ClearType which is based on subpixel arrangement (RGB vs. BGR) and will make text appear fuzzy if you pick the wrong one. In Windows 7 you can reach it via Control Panel -> Display -> Adjust ClearType text (or by running cttune.exe). If you'd rather just inspect the current state, see the registry sketch after this list.
- Check that your computer's desktop resolution matches the monitor's native resolution (the second sketch after this list shows one way to read the current setting). If they don't match, the monitor has to stretch the picture to cover the entire screen. Most monitors do a decent job of this, but some do it very poorly, resulting in blurry/blocky images. A lot of older people set their desktop resolution low to make the icons and text bigger, which causes exactly this problem. If you're one of these people, set the resolution to match the monitor, then raise the DPI setting instead (right-click desktop -> Personalize -> Display). You may have to log out and back in before the change takes effect.
- Check your monitor's settings to make sure overscan is turned off. HDMI is also used for TVs, and many TV signals carry garbage at the edges of the picture (the analog closed-captioning signal, for example, is encoded there as black/white dots). To prevent you from seeing this, most TVs overscan: they stretch the picture slightly so the edges fall off-screen. That stretching degrades the picture just like the resolution mismatch above, but you don't really notice it with images and video. With computer output, though, the fuzzy or blocky images and text are pretty obvious. You can run into this problem if your TV/monitor mistakes your computer's output for a TV signal, so you have to force overscan off. The setting may also be called 1:1, Just Scan, or direct, depending on the manufacturer.
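If you want to check the ClearType state without clicking through the tuner, the relevant values live in the registry under HKEY_CURRENT_USER\Control Panel\Desktop. Here's a read-only sketch in Python (standard winreg module); the value names are the standard ones, but any of them may be absent if the tuner has never been run, in which case Windows just uses its defaults.

```python
# Read-only sketch: inspect Windows font-smoothing settings from the registry.
# Values live under HKEY_CURRENT_USER\Control Panel\Desktop; any of them may
# be missing if the ClearType tuner was never run.
import winreg

def read_value(key, name):
    try:
        value, _type = winreg.QueryValueEx(key, name)
        return value
    except FileNotFoundError:
        return None  # value not present; Windows falls back to its default

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    smoothing = read_value(key, "FontSmoothing")                # "2" = smoothing on
    smoothing_type = read_value(key, "FontSmoothingType")       # 2 = ClearType, 1 = standard
    orientation = read_value(key, "FontSmoothingOrientation")   # 1 = RGB, 0 = BGR

print("Font smoothing enabled:", smoothing == "2")
print("ClearType (vs. standard AA):", smoothing_type == 2)
print("Subpixel order:",
      {1: "RGB", 0: "BGR", None: "not set (defaults to RGB)"}.get(orientation, orientation))
```

If the subpixel order it reports doesn't match your panel, run the ClearType tuner rather than editing the registry by hand; the tuner writes these values for you.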
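For the resolution check in the second bullet, a quick way to see what the desktop is actually set to is to ask Windows directly. This is a small Windows-only Python sketch using ctypes and GetSystemMetrics; compare the numbers it prints against the native resolution from your monitor's spec sheet or OSD info page. The SetProcessDPIAware call is there so a DPI setting above 100% doesn't cause Windows to report a scaled "virtual" resolution to the script.

```python
# Sketch (Windows only): print the current desktop resolution so you can
# compare it against the monitor's native resolution.
import ctypes

user32 = ctypes.windll.user32

# Without this, a DPI setting above 100% can make Windows report a scaled
# "virtual" resolution to a DPI-unaware process instead of the real one.
try:
    user32.SetProcessDPIAware()
except AttributeError:
    pass  # very old Windows versions lack this call

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # primary display width/height in pixels
width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)

print(f"Desktop is set to {width}x{height}")
print("If that doesn't match the monitor's native resolution, the monitor is scaling the image.")
```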
Pretty much all modern video cards have HDMI out in addition to DVI out, and I have to figure a 7950 has an HDMI output on it. Is there some reason you aren't just using a straight HDMI-to-HDMI cable? It's usually no big deal, except that a DVI-HDMI adapter may not preserve HDCP (High-bandwidth Digital Content Protection). Meaning if you ever try to watch a Blu-ray movie with that setup, it may get downscaled to DVD resolution.