DVI vs. VGA

May 24, 2007 2:30:12 AM

What are the technical differences between DVI (digital) and VGA (analog) cabling? Is there really a quality boost? Is there a performance hit from sending a digital signal?


Thanks for any help


May 24, 2007 5:21:04 PM

Personally, having tried out both, I do not see any difference whatsoever. Thus, I use the D-Sub in order to get the 75 Hz refresh rate.
May 24, 2007 8:05:30 PM

You can't get 75 Hz with DVI? What if both of the outputs on my card are DVI? Should I use the conversion plug?
May 25, 2007 7:55:00 AM

DVI is limited to 60 Hz. My card has 2 DVI outputs and I use the converter to turn one of them (I don't use the other output anyway) into D-Sub and thus get 75 Hz.
May 25, 2007 8:03:02 PM

OK... now is there a difference between DVI at 60 and D-Sub at 75?
May 26, 2007 5:31:38 PM

As these are TFTs and not CRTs, there is not a noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 if I am not mistaken. Why not have 75 Hz and thus 75 FPS if your graphics card can cope with that, though?
June 22, 2007 4:35:29 PM

Quote:
As these are TFTs and not CRTs, there is not a noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 if I am not mistaken. Why not have 75 Hz and thus 75 FPS if your graphics card can cope with that, though?


WTH are you talking about?!?!?! I can get up to 200 FPS on my monitor in CS:Source, and I'm using a 7900GS that is hooked up to a Samsung 940BW 19" WS through DVI! And that's probably because I have all the settings turned up to max.

To the OP, if you want to know the difference, go here:
Digital Visual Interface (DVI)
Video Graphics Array (VGA)
June 25, 2007 2:14:40 AM

Hrm, I'm about to choose an LCD, and have the same question...

However, I must choose between a D-Sub VGA monitor or one with DVI input. The VGA D-Sub monitor is slightly nicer, so I'm leaning towards getting it. Am I really going to notice the difference between DVI and VGA? I'd have something like the following:

video card (dual DVI outputs) -> adapter DVI to VGA -> VGA cable -> monitor VGA d-sub input

Thanks!
June 25, 2007 7:06:45 PM

Quote:
As these are TFTs and not CRTs, there is not a noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 if I am not mistaken. Why not have 75 Hz and thus 75 FPS if your graphics card can cope with that, though?


WTH are you talking about?!?!?! I can get up to 200 FPS on my monitor in CS:Source, and I'm using a 7900GS that is hooked up to a Samsung 940BW 19" WS through DVI! And that's probably because I have all the settings turned up to max.

To the OP, if you want to know the difference, go here:
Digital Visual Interface (DVI)
Video Graphics Array (VGA)

Think about it. Your graphics card can push 200 fps, but the monitor can only refresh the image 60 or 75 times per second. Thus you may get 'torn' images, unless you activate the V-Sync setting, which restricts the fps according to the monitor's refresh rate. Another reason to prefer high-end CRTs that can go upwards of 100 Hz.
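
To make the V-Sync idea concrete, here's a minimal sketch in Python, assuming a 60 Hz display. The render_frame stand-in and the timing loop are purely illustrative; a real driver waits on the actual vertical blank rather than sleeping on a timer:

import time

REFRESH_HZ = 60                   # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ   # ~16.7 ms between displayed frames

def render_frame():
    pass  # stand-in for however long the card takes to draw a frame

def game_loop(vsync=True, seconds=1.0):
    frames = 0
    start = time.perf_counter()
    next_vblank = start + FRAME_BUDGET
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
        if vsync:
            # wait for the next vertical blank before presenting,
            # so the card never outruns the display
            delay = next_vblank - time.perf_counter()
            if delay > 0:
                time.sleep(delay)
            next_vblank += FRAME_BUDGET
    return frames

print("with V-Sync:   ", game_loop(vsync=True), "frames in one second")   # ~60
print("without V-Sync:", game_loop(vsync=False), "frames in one second")  # as fast as render_frame allows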

MalcolmCarmen: Go for the monitor with the best image quality/specifications, irrespective of the input signal.
June 25, 2007 11:59:18 PM

[quote="Eurasianman
WTH are you talking about?!?!?! I can up to 200 FPS on my monitor in CS:Source and I'm using a 7900GS that is hooked up to a Samsung 940BW 19" WS through DVI! And that's probably because I have all the settings turned up to max.[/quote]

Right-click on your desktop
Go into properties

In your display settings somewhere, you'll find that your LCD screen's refresh rate is 60 Hz. You're not displaying 200 FPS on your LCD screen, not that it would matter anyway, as your human eye (hopefully...) wouldn't be able to tell the difference between 60 FPS and 200 FPS.
November 15, 2007 4:21:03 PM

Actually I'm pretty sure you're wrong there. As LCD monitors do not actively refresh the image a given number of times per second, but instead only change the parts of the image that are moving, display Hz becomes irrelevant - shouldn't effective fps instead be restricted by the response time? Due to the way the drivers are coded this will likely still cause image tearing however, though that has never been a problem for me.

Moreover, changing the Hz to above 60 has been proven to have a potential negative impact on any overdrive/response-time technology. Lastly don't get me started on that human eye bullcrap. The human eye can easily see upwards of 200fps, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial.
November 19, 2007 6:49:33 PM

HarkeN said:
Actually I'm pretty sure you're wrong there. As LCD monitors do not actively refresh the image a given number of times per second, but instead only change the parts of the image that are moving, display Hz becomes irrelevant - shouldn't effective fps instead be restricted by the response time? Due to the way the drivers are coded this will likely still cause image tearing however, though that has never been a problem for me.

Moreover, changing the Hz to above 60 has been proven to have a potential negative impact on any overdrive/response-time technology. Lastly don't get me started on that human eye bullcrap. The human eye can easily see upwards of 200fps, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial.


Point out the device you used to compare 200 fps to the inferior sub 200 fps displays.
Thanks in advance.
November 20, 2007 4:10:50 PM

HarkeN said:
The human eye can easily see upwards of 200fps, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial.


Wow, you must be living in perpetual "bullet time" to be able to see 200 fps, or 200 individual "snapshots" per second. You should therefore be able to see each individual wing flap that a hummingbird makes, since they can "only" flap their wings up to 80 times per second (depending on the species).

Most mortals cannot break down time into 5 ms increments (1 second / 200 frames).
November 20, 2007 10:57:58 PM

Even if you send a 75 Hz signal to an LCD, most if not all LCDs just drop the extra 15 frames. 75 Hz signals can be blurrier than 60 Hz signals, so even if you are using VGA, most of the time 60 Hz will look better.

A DVI (digital) signal will look crisper than a VGA (analog) one. It won't have any ghosting or funny colors.
November 30, 2007 11:49:57 PM

HarkeN said:
Actually I'm pretty sure you're wrong there. As LCD monitors do not actively refresh the image a given number of times per second, but instead only change the parts of the image that are moving, display Hz becomes irrelevant - shouldn't effective fps instead be restricted by the response time? Due to the way the drivers are coded this will likely still cause image tearing however, though that has never been a problem for me.

Moreover, changing the Hz to above 60 has been proven to have a potential negative impact on any overdrive/response-time technology. Lastly don't get me started on that human eye bullcrap. The human eye can easily see upwards of 200fps, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial.


No, the human eye is stuck around 35-ish frames per second. It is humanly impossible to even see 60 fps. I repeat: it is impossible. This is not a wives' tale but medical fact.
November 30, 2007 11:51:54 PM

azgard said:
Point out the device you used to compare 200 fps to the inferior sub 200 fps displays.
Thanks in advance.


An LCD that actually can display over 150 fps would cost in the area of $6000, and I am betting you can't find one outside of a specialty store in a very rich area.
December 1, 2007 7:27:37 PM

jerseygamer said:
An LCD that actually can display over 150 fps would cost in the area of $6000, and I am betting you can't find one outside of a specialty store in a very rich area.


My original post was to call him out on utter BS. And LCDs don't function in the same way as a CRT, so an FPS measurement is irrelevant.
July 22, 2008 8:05:59 PM

ToothFaerie said:
http://www.100fps.com/how_many_frames_can_humans_see.ht...

This explains it better than anything else I have seen on the net. Flickering is only a product of fluffy or sharp images. Read and enjoy; I think you will see there is no definitive answer.


Exactly. This thread is full of misinformation; the linked article is a good guide that should clear things up.
July 22, 2008 8:30:40 PM

The linked article is a good guide. One thing it didn't talk about, though, that I'd be curious to see is how short a flash of light is discernible not just by its existence, but by its other qualities. In other words, if you can see a 1/400-second flash but can't tell the difference between it and the same flash lasting 1/60 of a second, then 60 fps can still be fast enough to reproduce anything you could see. If you can tell the difference, though, faster could be useful.
August 26, 2008 11:58:18 PM

jerseygamer said:
No, the human eye is stuck around 35-ish frames per second. It is humanly impossible to even see 60 fps. I repeat: it is impossible. This is not a wives' tale but medical fact.

WHAT THE HELL are you talking about? Jesus Christ, are you 3 years old?
Go play any game, set max fps to 35 fps, THEN SET IT TO 60 FPS! Don't be an idiot and post idiotic messages!

Now, like others said about the 200 fps: yes, the computer can generate 200 fps, but the monitor caps what it SHOWS at the screen refresh. A 60 Hz refresh means that even if you got 99 fps, the monitor shows 60 fps. There is no visual difference to humans after 60 or 70 fps.
Anything LOWER than 60 fps, we can definitely see.

September 2, 2008 3:01:23 PM

Actually, I think you'll find that the eye does not work like a screen where we see individual pictures however many times per second. It is in fact constantly monitoring any changes, and it is 'persistence of vision' that allows us to blur together the individual pictures flashed up by our screens.

This explains how we can see a flash lasting 1 ms or even less, yet can't distinguish it from one lasting longer. It also explains how we cannot tell the difference between 60 fps and 2309320932 fps.

Lastly, I'd like to agree that your computer could do 200 fps, but if your screen doesn't, then why not cap your fps and help save the environment by using less power? :D
September 16, 2008 9:25:00 AM

This thread is pretty hilarious, and for people who are into hardware and gaming, it's surprising how misinformed a few of the posts are.

I have dealt with flicker and high refresh/FPS quite a lot in my life because my eyes are overly sensitive. As a result, I can tell you that refresh rate does NOT equal FPS at all. Over 99% of humans can't see 30 FPS. I am one of the unlucky few who can (just barely), so a slight flicker registers, especially when FPS is lower. Most people can't see the difference between a 75 Hz refresh and a 100 Hz refresh, but a small number can. Many more can see the difference between 60 Hz and higher, as that's the normal threshold for being able to detect the flicker.

200 FPS is a hilarious marker, people. It's a number used for testing equipment in a vacuum; there are no humans that can see anything close to that. Trust me, I am an outlier and I peak in the 40's at the most. 200 FPS is something a graphics card can muster in a test, and there would be no visual difference between that and 100 FPS to us. The only difference comes when you put those results under stress (heat, bottlenecks, etc.) and the performance is pushed way down. When your 200 FPS video card is put under heavy stress, perhaps it will show 35 FPS when a lesser card would show 20 FPS, and then you'll notice the difference. Otherwise, go brag to your friends; it's useless in the real world.

In any case, LCDs don't refresh in nearly the same way that CRTs did, and that whole discourse hasn't been updated in common talk, so it's hard to get people on the same page. 60 Hz can look very different on different screens, because their vertical/horizontal refreshes are synced differently.

And I'm using an LCD on a DVI connection that's refreshing at 100 Hz right now, so unless something has changed since the earlier posts, DVI can support over 60 Hz (although someone mentioned that it drops; not sure if that's the case).
September 17, 2008 7:28:32 PM

It's resolution dependent, actually. Single-link DVI only supports up to 60 Hz at 1920x1200, for example, but at lower resolutions it can go much higher.
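
A rough back-of-the-envelope check on that, in Python: single-link DVI tops out at a 165 MHz pixel clock, and the achievable refresh rate is that clock divided by the total pixels per frame, blanking included. The blanking totals below are approximate reduced-blanking figures, so real monitor timings may differ slightly:

# Single-link DVI pixel clock ceiling and what it means per resolution.
SINGLE_LINK_PIXEL_CLOCK = 165e6  # pixels per second

# resolution: (total width, total height) including blanking intervals
# (approximate reduced-blanking timings)
approx_totals = {
    "1280x1024": (1440, 1060),
    "1680x1050": (1840, 1080),
    "1920x1200": (2080, 1235),
}

for res, (h_total, v_total) in approx_totals.items():
    max_refresh = SINGLE_LINK_PIXEL_CLOCK / (h_total * v_total)
    print(f"{res}: about {max_refresh:.0f} Hz max over single-link DVI")

# 1920x1200 comes out only just above 60 Hz, which is why 60 Hz is the
# practical ceiling there, while lower resolutions have headroom.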
September 24, 2008 4:00:02 PM

Yes, any resolution that uses a 10:9 aspect ratio or higher usually cannot go any higher than 60 Hz, so going VGA with a widescreen is basically pointless.
September 27, 2008 9:30:24 AM

Eurasianman said:
Quote:
As these are TFTs and not CRTs, there is not a noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 if I am not mistaken. Why not have 75 Hz and thus 75 FPS if your graphics card can cope with that, though?


WTH are you talking about?!?!?! I can get up to 200 FPS on my monitor in CS:Source, and I'm using a 7900GS that is hooked up to a Samsung 940BW 19" WS through DVI! And that's probably because I have all the settings turned up to max.

To the OP, if you want to know the difference, go here:
Digital Visual Interface (DVI)
Video Graphics Array (VGA)


Do you have V-Sync on?

You can get as many fps as your graphics card can output, but your monitor is only showing a part of that. That's what V-Sync does: it limits your graphics card's output to what your monitor can handle, so your graphics card doesn't work in vain, since lots of frames would otherwise be dropped by your monitor.

Considering you're using an LCD: LCDs have lower refresh rates than CRTs, being around 60-100 Hz,

so there's no way it's showing more than 100 fps.
Anonymous
October 13, 2008 3:06:05 PM

Well, two things.
If you are recording a video game at 105 fps, you will not be able to see the difference from 35 fps (according to what some people say), but you will see a difference in jagged edges, as the edges refresh more. And when you record it and slow the video clip down to 1/3 of the speed, it will still show at 35 fps, because of the fps it was recorded at; how high the fps is determines how far the video can be slowed down, for purposes such as slow-motion evidence against speed hackers (knobheads, aren't they? but so thrilling to kill).
November 30, 2008 1:52:16 AM

I have to ask then: my LCD TV has a native res of 1366 x 768. It has a VGA port and an HDMI port as well. Should I use a DVI-to-HDMI adapter or just VGA? I'm pretty sure I'll be getting a GTX 280 soon, and I'm perfectly fine with sitting on 720p resolution; I'm pretty sure that would be the best the TV could manage. Any suggestions?
November 30, 2008 5:34:42 AM

At 720p, DVI and VGA should look almost identical. I can't really tell the difference between the two until higher resolutions - at least 1680x1050, if not 1920x1200, where the VGA starts to look a little blurry.
November 30, 2008 2:41:40 PM

Alright, very cool. Now I have another noob question, since I'm just now seriously getting into the realm of gaming GPUs. I know the GTX 280 is overkill at 720p, but just for the sake of future-proofing, I can use my LCD TV with dual GTX 280s at 720p using a VGA cable, right?
December 2, 2008 3:03:41 PM

You should be able to, yes. You'll need an adapter, but the card should come with one.
February 9, 2009 10:46:34 PM

I think monitor specs are doing my nut in :) 
March 19, 2009 6:48:33 PM

jerseygamer said:
No, the human eye is stuck around 35-ish frames per second. It is humanly impossible to even see 60 fps. I repeat: it is impossible. This is not a wives' tale but medical fact.


Da Vinci was probably one of those rare people who had super sight. He was able to see way beyond 35 frames per second, as he made sketches of how birds flew in extreme detail. When researchers had cameras to see birds in flight in slow motion, they could confirm Da Vinci's near-flawless sketches of a bird's wing positions in multiple stages of flight.

So no, not a medical fact.
Anonymous
September 11, 2009 12:57:25 AM

Did he draw a hummingbird though? It's pretty easy to see a normal bird's wings in motion, or at least estimate. Anyway, my question is: are there any health issues with staying at 60 Hz? I heard somewhere that prolonged exposure to it could damage your eyes and that going to a higher Hz was healthier. But I'm a little confused, since your eyes couldn't see the screen flickering at 60 Hz anyway.
September 12, 2009 7:10:47 PM

wtf are you all talking about? LOL

DVI VGA USB FPS HDMI RCA PIN WEP CNN NBC CBS WB PCP LSD ROFL WTF BRB OMFG
September 15, 2009 11:48:53 PM

Just to clarify, the human eye sees at 23 FPS. However, when adrenaline is introduced to the body, it shoots up to 30 FPS. Hence why your vision is enhanced when you have adrenaline in your body.

As for the question of 200 fps... the human eye CAN distinguish up to 60 fps (due to the brain/some stuff which I forgot in biology) ON A MONITOR OR TV... Anything higher than that is irrelevant for the eye's capabilities...

DVI is better; however, the eye can't tell the difference, so it's irrelevant, once again.
October 25, 2009 4:24:54 PM

DVI is better, but you should go for the monitor of better quality.

About the refresh rate: humans should be able to recognize all refresh rates that aren't going faster than light. Eyes don't see in a picture-refresh system, but are constantly receiving light. You are the difference; it's what you are used to. If you always play games at 30 Hz you won't see the flickering. If you're used to 75, you will at 30 Hz.

And it's the Hertz (refresh rate of the monitor) that matters. But if your fps is higher than the refresh rate, you will possibly see jaggies (caused by half-renewed pictures).
If your fps is lower, there will just be identical frames in the case of LCD monitors. In the case of CRT monitors, there will be a light flickering.

And to show that you can see the flicker anyway, just look a little to the side of, above, or below your monitor. If your eyes are still right, you will see a little (maybe big, whatever) flicker.
December 1, 2009 7:17:51 AM


I have a Samsung T220GN with a D-Sub (VGA) only input. I was looking for a 22" monitor with DVI support, but this one looks very, very nice and I did not notice any drawbacks in the shop. So I took it.

Then...

I plugged it into the VGA (!!!) output of my GeForce 9500 and I got one-pixel shadows on the right side of black text on a white background. I knew this could happen, because these artefacts are what I get when trying to connect any LCD to the VGA output of my old laptop. But I did not notice anything like this in the shop... so...

I plugged the monitor into the DVI (!!!) output of my GeForce 9500 using a DVI-to-VGA cable. And (yes!) no shadows: a very crisp and perfect image. I was happy for 3 days, until today...

I have some sort of labs neighbouring below and above my office. Sometimes they turn on their hellish time machines with large generators or something, and they manage to distort the image on a CRT monitor in my office. And guess what?! They also manage to distort the image on my brand new 22" LCD (!!!!!!!!!). It looks like a small noise or fluid effect. It becomes very noticeable at lower resolutions.

Other LCDs with DVI input feel pretty good in my office at the same time. So I do not recommend that anybody buy D-Sub (analog-only) monitors.
December 17, 2009 3:05:37 AM

Quote:
"Just to clarify the human eye sees at 23 FPS. However when adrenaline is introduced to the body, it shoots up to 30 fps."
"da vinci was probably one of those rare people who had super sight. he was able to see way beyond 35 frames per second"
"over 99% of humans can't see 30FPS."
"Trust me, I am an outlier and I peak in the 40's at the most."
"\WHAT THE HELL are you talking about? Jesus christ, are you 3 years old? ...There is no visual differance[sic] to humans after 60 fps or 70fps. anything LOWER than 60fps, we can definatly[sic] see."
"No the human eye is stuck around 35ish frames per second. It is humanly impossible to even see 60fps. I repeat it is impossible. This is not a wifes tale but medical fact."
"The human eye can easily see upwards of 200fps, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial."
"which is still perfectly acceptable as the human eye does not distinguish shuttering above 23 if I am not mistaken."
Personally I see at exactly 36.5 frames per second. However, if I alternate blinking my eyes really fast, I can see at 73 frames per second.
Quote:
"You should therefore be able so see each individual wing flap that a hummingbird does per second since they can "only" flap thier wings up to 80 times per second"
So I'm just short of seeing individual wing flaps.
Quote:
Anyway, my question is: are there any health issues with staying at 60 Hz? I heard somewhere that prolonged exposure to it could damage your eyes and that going to a higher Hz was healthier. But I'm a little confused, since your eyes couldn't see the screen flickering at 60 Hz anyway.
It's generally thought that a CRT screen under 75 Hz can potentially cause eye fatigue or headaches.
This is because a CRT refreshes the entire screen, so the flicker is more pronounced. Some fluorescent lamps also flicker at around 50-60 Hz, and combined with a CRT flickering at about the same rate, this can increase eye fatigue even further.
Since LCD monitors emit constant light and don't need to completely refresh, the eye-fatigue problem is a non-issue.

So your eyes will be completely fine with an LCD at 60 Hz, but may get some strain from a CRT at the same rate (usually only when staring at it constantly for hours). However, if the CRT is at 75 Hz, you will most likely be fine.
December 17, 2009 11:50:05 AM

Fluorescent lamps typically flicker at 120 Hz (twice the 60 Hz mains frequency), aside from the ones with electronic ballasts, which flicker at several thousand hertz.
December 18, 2009 4:44:26 PM

I would appreciate thoughts on the DVI vs. VGA question for a business user doing mostly financial transactions, stock and currency trading (which is mostly 2D graphics), and regular internet surfing. I do also watch YouTube videos, but I'm not a gamer or high-end graphic designer.

The VGA LCD monitors are usually cheaper. Would I notice much or any difference between DVI and VGA for what I do?

My graphics card is a PNY Quadro NVS 440, which will run 4 monitors from one slot. It's mostly for financial trading work.

I buy mostly from the Egg and would probably buy either the HannsG or Acer brands. Either 19" or 22".

I already have 2 AGM 19" LCD monitors from about 2006. They are VGA only as well. This would give a total of 4 monitors. I'll put them on one of those special display holders that take 4 monitors on one stand.

Thanks all. Happy Holidays and a prosperous New Year to us all.
December 18, 2009 9:01:24 PM

Resolution is the factor you should care about. I've found that below 1920x1200, DVI vs. VGA makes basically no difference.
December 19, 2009 12:44:43 AM

cjl said:
Resolution is the factor you should care about. I've found that below 1920x1200, DVI vs. VGA makes basically no difference.


Thanks CJ. The 1920x1200 resolution is generally a 24" monitor. Is that correct? Most 19" monitors are 1440x900 or 1680x1050, and 22" monitors seem to all be 1680x1050 as well.


So a VGA monitor at these resolutions won't make much difference either? I'm no expert, so please forgive the newbie question.

For what I'm doing, it sounds like VGA vs. DVI won't make much difference.

Thanks for the info.
December 19, 2009 3:42:02 AM

Some 22" monitors are 1920x1080, which is pushing it, but for the most part, you're right. You should be fine with VGA.
December 19, 2009 3:39:23 PM

cjl said:
Some 22" monitors are 1920x1080, which is pushing it, but for the most part, you're right. You should be fine with VGA.



Thanks for the clarification CJ, I appreciate your taking the time to respond.
Anonymous
January 13, 2010 6:17:23 AM

Higher fps has the benefit of rendering so many frames that your monitor is picking up the maximum number of clean frames. Say you have a game running at 60 fps and a monitor at 60 Hz. They may be running at the same rate but not showing the same exact frames: the monitor refreshes at, say, 1 ms where the game finishes its frame at 1.2 ms, so they are never on the exact same frame, giving it a choppy look even at the monitor's maximum fps. Whereas if you have the game running at 200 fps, the chance of a freshly rendered frame landing right before the monitor's refresh is much higher. It's a hard concept to explain, but I tried.
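
For what it's worth, here's a toy Python simulation of that idea: a 60 Hz display grabbing whichever frame finished most recently. The phase offset and timings are made up for illustration, not measured from real hardware, but they show that rendering faster than the refresh rate does leave a fresher frame waiting at each refresh:

def average_frame_age_ms(render_fps, refresh_hz=60.0, phase=0.2, seconds=2.0):
    """Average age of the newest completed frame at each display refresh."""
    frame_interval = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    offset = phase * frame_interval   # renderer is not phase-locked to the display
    ages = []
    t = refresh_interval
    while t <= seconds:
        completed = int((t - offset) // frame_interval)       # frames finished so far
        newest_frame_time = offset + completed * frame_interval
        ages.append((t - newest_frame_time) * 1000.0)
        t += refresh_interval
    return sum(ages) / len(ages)

for fps in (60, 75, 120, 200):
    print(f"rendering at {fps:>3} fps -> newest frame is on average "
          f"{average_frame_age_ms(fps):.1f} ms old at each 60 Hz refresh")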
February 11, 2010 11:47:59 AM

All this discussion of FPS and what the eye can see is completely off base from the topic at hand: what makes DVI different from VGA.
That is the topic we should be discussing, not whether you have superhuman visual perception; nor is it something we need to dissect as an issue. Does DVI create a better quality experience, or is it pretty much the same? Think of it this way: VGA is analog and DVI is digital. When we watch TV or play a PlayStation or Xbox with the analog input rather than the RGB or HDMI input, where are the differences? Is it more vivid, or what?
February 20, 2010 8:31:15 PM

Agree with Spartichaos...thank you...Reading it though made me smile...

Hot pluggable: Yes
External: Yes
Video signal: Digital video stream
  Single link: WUXGA (1920 × 1200) @ 60 Hz
  Dual link: WQXGA (2560 × 1600) @ 60 Hz
  Analog RGB video (−3 dB at 400 MHz)
Data signal: R, G, B data + clock and display data channel
Bandwidth: 3.96 Gbit/s (single link), 7.92 Gbit/s (dual link)
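
Those bandwidth numbers follow directly from the usual DVI parameters, by the way. A quick check in Python, assuming 3 TMDS data channels per link at a 165 MHz clock, each carrying 8 payload bits per clock (the 10-bit TMDS symbol on the wire encodes 8 data bits):

TMDS_CLOCK = 165e6        # Hz, single-link maximum
CHANNELS_PER_LINK = 3     # R, G and B data channels
PAYLOAD_BITS_PER_CLOCK = 8

single_link = TMDS_CLOCK * CHANNELS_PER_LINK * PAYLOAD_BITS_PER_CLOCK
dual_link = 2 * single_link

print(f"single link: {single_link / 1e9:.2f} Gbit/s")  # 3.96 Gbit/s
print(f"dual link:   {dual_link / 1e9:.2f} Gbit/s")    # 7.92 Gbit/s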

Rick, Leics
February 21, 2010 12:11:06 AM

Sorry, I also haven't answered the whole question. Digital signals pass through without being converted, so you see the image as it was meant to be. VGA leads can be susceptible to interference/disruption if they are not shielded (I don't think that's much of a problem these days). You can also get variations of DVI leads: some are compatible with analog signals and some handle digital signals only. In general it's a cleaner digital signal, as intended; just see it as a peripheral that does its job with no other side effects.
March 3, 2010 1:54:02 AM

To put all this mumbo jumbo simply:

DVI is better than VGA,

because DVI is digital: instead of your computer converting the digital image to analog (VGA) and the monitor converting it back, the signal goes all the way through digitally.