
VGA & DVI options

Last response: in Graphics & Displays
September 17, 2002 11:47:06 PM

If you had a choice between getting a VGA to DVI adaptor, a KVM switcher, a VGA switcher, or upgrading to a DVI 3D graphics card for the PC, which would you go for on price vs. performance?

The goal is to run two platforms that are both currently VGA-out, into an LCD with a D-Sub port and a dual DVI-I digital/analog port.

(The PC is currently running on a Siluro T400 VGA card by the way.)

Jay

(Off topic my arse!)


September 18, 2002 3:02:05 AM

A DVI graphics card, because with the converters you get even worse quality than if you hooked it up straight over analog.

What if you had admin rights to life?
September 18, 2002 5:40:19 PM

So I guess the next question is: what decent DVI cards could you recommend that equal or better the Siluro T400? (It's a GeForce2 MX400: NVIDIA 256-bit 2D/3D, AGP 2x/4x support, 350 MHz RAMDAC, 64 MB SDRAM on a 128-bit bus at 166 MHz, 200 MHz core clock, etc., if any of that helps.)

Bonus question, just to be clear on this: do analog signals suck in comparison to DVI even before the degradation caused by going through a KVM/monitor/VGA switcher? I'm hearing conflicting views on switchers; another source tells me the difference is very negligible, though this is from a non-gaming POV. Is yours more a gamer's POV?

Jay
September 18, 2002 8:09:40 PM

Well, I think a gamer would notice poor image quality even less, because blazing through a game you don't usually take time to notice textures unless something is really apparent. DVI may or may not make a difference, depending on the LCD. On some, using analog creates a somewhat unclear image (poorer colors, blurriness, etc.) while DVI makes it sharp, well contrasted, and so on. On others there isn't a difference. I like running in DVI because I don't have to deal with a lot of the controls needed to tune an analog image to its best. My point of view is that of a gamer and a graphics designer, where image quality matters greatly. Also, for the GFX card, how much $$$ are you willing to spend?

What if you had admin rights to life?
September 18, 2002 8:10:36 PM

And what sucks about an analog signal into an LCD is that it has to be converted back to digital.

What if you had admin rights to life?
September 23, 2002 2:11:47 AM

I'd pay up to about $300, which should get me one of the better cards. I'd settle for a decent cheaper model, though. You're right about VGA switchers, man, that sucked! Didn't burn too big a hole in my pocket, though. Unlike the DVI card I'm going to have to get when I come back from my holiday.

This LCD has a D-Sub (analog?) and a DVI-I port. I've only run on the D-Sub, but it looks OK; from what you're saying there would be a noticeable improvement when I run my PC into the DVI port, right? Point taken on adjustments: I had to tweak the refresh rate to 75 Hz to get it up to an acceptable standard.
September 23, 2002 2:20:36 AM

You could go for a Ti4200 at $120, or at $322 a Radeon 9700 Pro (but don't get upset if it makes your system unstable or has issues with games)...

What if you had admin rights to life?
October 2, 2002 1:51:53 AM

Anything more mid range you could suggest?

Is the R9700 temperamental, or is it that it's too ninja for a lot of systems to cope with?

(Sorry, been on holiday)

Jay
October 2, 2002 5:52:03 AM

POV of a gamer first, 3D designer second. DVI is a straight digital connection. Anything you display from your computer is encoded digitally. If you're running through a VGA port, your video card has to run the signal through a D/A converter before spitting it out. That's fine for a CRT, because it's an analog display device. But an LCD is a digital display, so using a VGA input requires the signal to be run through an A/D converter before being displayed. So DVI cuts out two extra steps. As for what card to get, I just ordered a 9700 Pro. I hope I made the right decision, but I've been reading Tom's stuff for six years, and he hasn't steered me wrong yet!
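The "two extra steps" point can be tallied in a toy Python sketch. The path and step names here are just my own labels for illustration, not anything from a spec:

```python
# Conversion steps between the GPU's digital framebuffer and the display,
# per the explanation above. Names are illustrative only.
PATHS = {
    "VGA card -> VGA cable -> CRT": ["D/A in card"],                # CRT displays the analog signal directly
    "VGA card -> VGA cable -> LCD": ["D/A in card", "A/D in LCD"],  # two conversions before display
    "DVI card -> DVI cable -> LCD": [],                             # stays digital end to end
}

for path, steps in PATHS.items():
    print(f"{path}: {len(steps)} conversion(s) {steps}")
```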
October 7, 2002 2:54:32 PM

So what's doing the D/A conversion, my VGA graphics card? Not particularly relevant, just curious; trying to get my head round all the aspects of this. If I change my PC to DVI, it means the Mac will be using VGA; or, if I go the route of an ADC to DVI converter, then my Mac gets the benefit of the DVI port. It's a damn sight cheaper that way than getting a spanky DVI graphics card. Going round in bloody circles here! I need to figure this one out.
October 8, 2002 4:09:46 AM

Yes, a VGA port is an analog connection, so in order for the graphics card to send information out through it, the digital data gets run through an onboard D/A (digital-to-analog) converter first. A DVI port eliminates this step. CRTs by their nature display analog information. This is why you will not find any CRTs with a DVI connection; that would necessitate a D/A converter inside the CRT as well, and what's the point when the graphics card makers already put good ones in their cards? Anyway, LCDs work right off the digital bitstream, so DVI bypasses the need for the D/A converter in the graphics card. Whereas if you connect an LCD with a VGA-type connection, the digital bitstream runs through the D/A in the graphics card and then gets reconverted back into a digital bitstream inside the LCD with an A/D converter. That just doesn't make any sense compared with a straight DVI (digital) connection.

Does this make more sense now, or did I just confuse you worse? Maybe you'll want to read it a few more times... just joking.
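One toy way to see why the D/A-then-A/D round trip can cost quality: each conversion quantizes the signal, and any noise picked up on the analog side shifts the recovered value. A minimal Python sketch, assuming made-up 8-bit levels and the roughly 0.7 V swing VGA uses; the function names are mine:

```python
def dac(level, vmax=0.7):
    """Map an 8-bit level (0-255) to an analog voltage (VGA swings about 0.7 V)."""
    return level / 255 * vmax

def adc(voltage, noise=0.0, vmax=0.7):
    """Sample a (possibly noisy) voltage back to the nearest 8-bit level."""
    v = min(max(voltage + noise, 0.0), vmax)
    return round(v / vmax * 255)

level = 200
print(adc(dac(level)))               # clean round trip recovers the level
print(adc(dac(level), noise=0.004))  # a few millivolts of noise shifts it
```

A straight DVI link never leaves the digital domain, so there is nothing for noise to shift.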
October 8, 2002 10:14:58 AM

I'm getting the idea; there's a lot to take in. So if I were running a game on a DVI card to an LCD and my friend was running the same game on his analog card to his CRT, would I get better image quality because DVI is better, or is it just better for LCDs? I'm just making sure I understand you, rather than being blatantly stupid!

My other (cheaper, short-term) option is to convert the ADC signal of my Mac to DVI and run it to the LCD's DVI port, assuming the converter can do DVI-I. Obviously it means the PC will still be running VGA, which isn't the best setup. Dunno how clued up you are on Mac technology; some idea of the capability of ADC compared to the other signal types is another issue to consider. Is there also some degrading of the signal going on here as well, due to the conversion process of ADC to DVI, even with DVI's superiority over VGA?

Hope that makes sense I'm starting to lose myself here!
October 8, 2002 4:42:38 PM

Quote:
So if I was running a game on a DVI card to an LCD and my friend was running the same game on his analog card to his CRT I would get better image quality as DVI is better, or just better for LCD?

"Or just better for LCD?" - this is correct, because an LCD by its nature is a digital display; it uses fixed pixels. Remember Battleship, with the whole grid thing? Imagine expanding on that idea a couple hundred times: each coordinate is identified by its location (e.g. B6), so the digital bitstream can be read by the LCD and immediately displayed. A CRT, however, is an analog display device; although we rate its resolution in pixels, it has no fixed pixel locations. I could try to give you an idea of how the technology works, but there are probably much more complete descriptions elsewhere.
http://nina.ecse.rpi.edu/shur/advanced/Notes/Noteshtm/D...

But yes, VGA is better for CRTs.
And DVI is better for LCDs.
And I'm not really familiar with Mac stuff. Before, when you brought up ADC, I was thinking A/D converter... but apparently you're talking about something else?
October 8, 2002 8:05:29 PM

Ok, I'm with you on the fact digital signals are better for LCDs and I've a pretty good idea why from your explanation. The link took me to a place my brain wasn't built to understand, thanks for trying though. Still not 100% sure who gets the better deal - the LCD running a digital signal or the CRT running an analog signal. My money is on the LCD. It's not relevant to my situation really though, just curious. It would be nice to know I'd have the edge when I meet up with my mates for some LAN gaming, they all use CRTs and VGA cards.

I was talking about both A/D converters and ADC, just to confuse the issue. Putting a DVI card in my PC means, as I'm using the LCD cross-platform, that my Mac will be running into the VGA (D-Sub) port via a DVI to VGA converter; but I suppose the PC, being my gaming machine, will prefer the benefit of true DVI. The other option I was looking into was keeping my PC on VGA, temporarily until I can afford a DVI card anyway, and running the Mac to the DVI port using an ADC to DVI adaptor. (Annoyingly, the DVI on the Mac won't work with the DVI on the LCD.) I was hoping you might know a little about this, in particular whether there is any signal loss in converting ADC to DVI, but as you don't, there's not much point going into it really.

Jay
October 9, 2002 5:49:48 PM

Yeah, I am kinda lost there, sorry Jay...Good luck!