Think of it this way: the card produces a native digital signal. If your monitor has a digital input, it can use that signal directly, with the least amount of loss.
If your digital monitor only has a VGA input, the card uses its D/A converter to turn the signal into analog and sends that analog signal to the monitor's VGA port. The monitor then uses its own A/D converter to change the analog signal back to digital for display. That's two extra conversion steps you wouldn't need if the monitor could accept a digital signal directly.
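To see why those two extra steps matter, here's a rough sketch of the round trip. The numbers are illustrative assumptions, not real hardware specs: 8-bit pixel values per channel, a 0-0.7 V analog swing (roughly what VGA uses), and a small amount of cable/electrical noise injected between the D/A and A/D stages. Even modest noise means some pixel values come out of the monitor's A/D converter different from what the card sent:

```python
import random

VMAX = 0.7  # assumed analog voltage range per color channel (volts)

def dac(value):
    """Model the card's D/A converter: 8-bit pixel value -> analog voltage."""
    return value / 255 * VMAX

def adc(voltage):
    """Model the monitor's A/D converter: analog voltage -> 8-bit value."""
    return max(0, min(255, round(voltage / VMAX * 255)))

def analog_link(value, rng, noise=0.004):
    """One pixel through the VGA path: D/A, noisy cable, A/D."""
    return adc(dac(value) + rng.uniform(-noise, noise))

rng = random.Random(42)
pixels = list(range(256))                      # every possible 8-bit value
received = [analog_link(p, rng) for p in pixels]
errors = sum(1 for sent, got in zip(pixels, received) if sent != got)
print(f"{errors} of 256 pixel values changed in transit")
```

A pure digital connection skips both converters, so there is nothing in the path to introduce this kind of error in the first place.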