Would VGA to DVI-D give me a digital signal, or still an analog one?

Solution

Paperdoc

Well, such a system does end up sending the information to the monitor as digital signals. However, it also means the signal has been converted at least twice, with each step introducing a small amount of error.

ALL video information in a computer is in digital form. A video card that sends its information out in digital form (DVI or HDMI, for example) does not NEED to modify it. It really only has to accept the data sent to it from the CPU and pass it on out, accompanied by a few timing signals. It's not always done just that way, though. Very often the resolution of the image in the source data is not the resolution of the monitor being fed, and one of the important jobs a video card (or on-board chip) does is re-scale the data to fit the resolution you have told the video system to output. That re-scaling is done digitally, of course, but it does introduce slight errors.
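
To make the re-scaling point concrete, here is a minimal sketch (plain Python with NumPy, a toy model rather than anything a real video card runs; the 1920- and 1280-pixel widths are just example resolutions) that scales one scanline down to a different horizontal resolution and back, then measures how far the result drifts from the original:

```python
import numpy as np

# A hypothetical 1920-pixel scanline with fine detail (a simple test pattern).
native = np.sin(np.linspace(0, 60 * np.pi, 1920))

def rescale(line, new_len):
    """Linear interpolation of a scanline to a new horizontal resolution."""
    old_x = np.linspace(0.0, 1.0, len(line))
    new_x = np.linspace(0.0, 1.0, new_len)
    return np.interp(new_x, old_x, line)

# Scale down to a 1280-pixel mode, then back up to 1920 for comparison.
downscaled = rescale(native, 1280)
round_trip = rescale(downscaled, 1920)

# The residual is the "slight error" the re-scaling step introduces.
print("mean absolute re-scaling error:", np.mean(np.abs(round_trip - native)))
```

The residual never comes back to exactly zero, which is precisely the slight error the re-scaling step adds.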

However, when your video system puts out its information in VGA format, that is an analog signal system. The video system's job is to accept the digital video information given to it, re-scale it to the resolution required by its settings, and then use digital-to-analog conversion to turn that information into an analog signal for each of the three colors. So there is potential for error both in the re-scaling and in the D-to-A conversion.
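
For a sense of scale, here is a tiny sketch of that D-to-A step, assuming 8-bit color values and the conventional 0.7 V full-scale level for a VGA color signal (the exact numbers vary by hardware):

```python
# Assumed parameters: 8-bit color values, 0.7 V full-scale VGA color signal.
FULL_SCALE_V = 0.7
LEVELS = 256

def dac(value):
    """Map an 8-bit color value to the analog voltage a VGA output would drive."""
    return value / (LEVELS - 1) * FULL_SCALE_V

step_mv = (dac(1) - dac(0)) * 1000.0
print(f"one digital step is only about {step_mv:.2f} mV on the analog line")
print(f"e.g. value 200 -> {dac(200):.4f} V, value 201 -> {dac(201):.4f} V")
```

With steps that small, only a couple of millivolts of noise on the cable is enough to push a value into the neighboring code once it gets re-digitized.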

If you then take that VGA signal and use a dedicated active circuit in a box to sample and digitize it at a specified resolution, that conversion process adds yet another source of error. The end result, compared with having used a digital video output in the first place to feed the monitor, is that the VGA-to-digital image will contain more errors or "fuzziness". That MAY be noticeable, or it may be so small you don't care. A lot depends on the resolution of the VGA display mode your video system is using for its output before it reaches the converter.
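
As a rough illustration of the whole chain, the sketch below (again a toy model; the 0.7 V level and the 1 mV noise figure are assumptions, not measurements of any real converter) takes 8-bit pixel values, converts them to analog voltages as a VGA output would, adds a little analog noise, re-digitizes them the way an active VGA-to-DVI box must, and counts how many values come back changed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Original 8-bit digital pixel values for one color channel of a frame.
pixels = rng.integers(0, 256, size=1920 * 1080, dtype=np.uint8)

FULL_SCALE_V = 0.7   # assumed analog full-scale level for a VGA color signal
NOISE_MV = 1.0       # assumed cable/converter noise, in millivolts

# D-to-A step inside the video card: digital value -> analog voltage.
analog = pixels.astype(np.float64) / 255.0 * FULL_SCALE_V

# Analog transmission picks up a small amount of noise.
analog += rng.normal(0.0, NOISE_MV / 1000.0, size=analog.shape)

# A-to-D step inside the VGA-to-DVI converter box: voltage -> 8-bit value again.
redigitized = np.clip(np.round(analog / FULL_SCALE_V * 255.0), 0, 255).astype(np.uint8)

changed = np.count_nonzero(redigitized != pixels)
print(f"{changed} of {pixels.size} pixel values altered by the analog round trip")
```

Even with that modest assumed noise, a noticeable fraction of values shift by a code or so; whether that shows up as visible fuzziness depends on the content and the resolution, exactly as described above.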
 