I noticed that any GeForce above the 8600 GT doesn't come with a native D-SUB port for CRT monitors; they only come with the DVI ports that LCD monitors use, or even HDMI ports. There's usually an optional adapter included with these cards that converts the DVI output to D-SUB if required, but I'm unclear on what that method of output does to the quality.
Does a card without a D-SUB port still have a RAMDAC? On a CRT monitor, would the image quality be better from a native D-SUB port on the card than from a DVI-to-D-SUB adapter? Does the adapter cause a loss of quality?
This kind of thing has me nervous... it's like they're trying to phase out CRT monitors. But I loves my CRT... loves it.
No disadvantages for your CRT. I used an adapter for a few years (had to with my 6800 GT) and never noticed anything that stuck out. It worked just fine. If you love your CRT and have the room on your desk for it, you're not going to feel any pain using the adapter.
Thank you for the replies. Yes, that's what I figured too, and why I posted the question here. But monst0r, I had a question for you:
You said the signal goes analog > digital > analog? But I thought DVI output was digital, so the digital signal from the DVI port gets converted into analog by the adapter for the D-SUB. Isn't it just digital > analog, the same as a card with a RAMDAC and a native D-SUB port? I don't understand where the first analog in your conversion list comes from; you have analog listed twice.
My biggest concern, of course, was that surely the RAMDAC on the card would do a better job of converting the digital signal to analog than a little adapter plugged onto the end, but blacksci says he never noticed anything with his equipment.
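For what it's worth, the reason the adapter shouldn't cost you anything can be sketched as a simple wire map: a DVI-I connector carries the card's RAMDAC output on dedicated analog pins alongside the digital ones, and a passive DVI-to-D-SUB adapter just routes those analog pins to the D-SUB connector, with no second conversion anywhere. The pin assignments below are my reading of the DVI spec and the standard VGA pinout, so treat them as illustrative rather than authoritative:

```python
# Sketch: a passive DVI-I -> VGA (D-SUB) adapter is just a wire map.
# The RAMDAC on the card drives these analog pins directly; the
# adapter contains no DAC and does no re-conversion of the signal.
# Pin numbers are illustrative, based on the DVI-I and VGA pinouts.

DVI_I_TO_VGA = {
    "C1 (analog red)":     1,   # VGA pin 1: red video
    "C2 (analog green)":   2,   # VGA pin 2: green video
    "C3 (analog blue)":    3,   # VGA pin 3: blue video
    "C4 (analog h-sync)":  13,  # VGA pin 13: horizontal sync
    "8 (analog v-sync)":   14,  # VGA pin 14: vertical sync
}

def adapter(dvi_pin):
    """A passive adapter only rewires pins -- no signal processing."""
    return DVI_I_TO_VGA[dvi_pin]

for dvi_pin, vga_pin in DVI_I_TO_VGA.items():
    print(f"{dvi_pin:20s} -> VGA pin {vga_pin}")
```

So in this picture the signal path with the adapter is identical to a native D-SUB port: RAMDAC > analog pins > CRT, which would explain why blacksci saw no difference.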