
PJ101

Distinguished
Mar 5, 2006
What are the technical differences between DVI (digital) and VGA (analog) cabling? Is there really a quality boost? Is there a performance hit from sending a digital signal?


Thanks for any help
 

realzeus

Distinguished
Apr 16, 2007
Personally, having tried out both, I do not see any difference whatsoever. Thus, I use the D-Sub in order to get the 75 Hz refresh rate.
 

realzeus

Distinguished
Apr 16, 2007
DVI is limited to 60 Hz. My card has two DVI outputs, and I use a converter to turn one of them (I don't use the other output anyway) into D-Sub and thus get 75 Hz.
 

realzeus

Distinguished
Apr 16, 2007
As these are TFTs and not CRTs, there is no noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 FPS, if I am not mistaken. Why not have 75 Hz, and thus 75 FPS, if your VGA card can cope with that, though?
 

Eurasianman

Distinguished
Jul 20, 2006
[quote="realzeus"]As these are TFTs and not CRTs, there is no noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 FPS, if I am not mistaken. Why not have 75 Hz, and thus 75 FPS, if your VGA card can cope with that, though?[/quote]

WTH are you talking about?!?! I can get up to 200 FPS on my monitor in CS:Source, and I'm using a 7900GS hooked up to a Samsung 940BW 19" WS through DVI! And that's even with all the settings turned up to max.

To the OP, if you want to know the difference, go here:
Digital Visual Interface (DVI)
Video Graphics Array (VGA)
 

MalcolmCarmen

Distinguished
Jun 25, 2007
Hrm, I'm about to choose an LCD, and have the same question...

However, I must choose between a D-Sub VGA monitor and one with DVI input. The VGA D-Sub monitor is slightly nicer, so I'm leaning towards getting it. Am I really going to notice the difference between DVI and VGA? I'd have something like the following:

video card (dual DVI outputs) -> adapter DVI to VGA -> VGA cable -> monitor VGA d-sub input

Thanks!
 

realzeus

Distinguished
Apr 16, 2007
[quote="Eurasianman"][quote="realzeus"]As these are TFTs and not CRTs, there is no noticeable difference, but then again DVI effectively restricts you to 60 FPS, which is still perfectly acceptable, as the human eye does not distinguish stuttering above 23 FPS, if I am not mistaken. Why not have 75 Hz, and thus 75 FPS, if your VGA card can cope with that, though?[/quote]

WTH are you talking about?!?! I can get up to 200 FPS on my monitor in CS:Source, and I'm using a 7900GS hooked up to a Samsung 940BW 19" WS through DVI! And that's even with all the settings turned up to max.

To the OP, if you want to know the difference, go here:
Digital Visual Interface (DVI)
Video Graphics Array (VGA)[/quote]

Think about it. Your VGA card can push 200 FPS, but the monitor can only refresh the image 60 or 75 times per second. Thus you may get "torn" images, unless you activate the V-Sync setting, which restricts the FPS to the monitor's refresh rate (see the sketch below). Another reason to prefer high-end CRTs, which can go upwards of 100 Hz.

MalcolmCarmen: Go for the monitor with the best image quality and specifications, irrespective of the input signal.
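
To make the tearing point concrete, here is a toy Python model (my own sketch, not any real graphics API): a renderer that finishes a frame every 5 ms (200 FPS) feeding a 60 Hz display that takes about 16.7 ms to scan out each refresh.

[code]
REFRESH_HZ = 60
RENDER_FPS = 200
SCANOUT_MS = 1000 / REFRESH_HZ   # ~16.67 ms to draw one refresh
FRAME_MS   = 1000 / RENDER_FPS   # 5 ms per rendered frame

def frames_in_refresh(i, vsync):
    """Which rendered frames contribute to refresh number i."""
    start, end = i * SCANOUT_MS, (i + 1) * SCANOUT_MS
    if vsync:
        # V-Sync: the buffer only swaps between refreshes, so the whole
        # refresh shows the newest frame finished before it started.
        return {int(start // FRAME_MS)}
    # No V-Sync: every frame finished during the scanout replaces the
    # buffer immediately, so one refresh mixes several frames (tearing).
    first = int(start // FRAME_MS)
    last  = int((end - 1e-9) // FRAME_MS)
    return set(range(first, last + 1))

for i in range(3):
    print(f"refresh {i}: no v-sync -> frames {sorted(frames_in_refresh(i, False))},"
          f" v-sync -> frame {sorted(frames_in_refresh(i, True))}")
[/code]

Either way, only 60 distinct images per second reach your eyes; the surplus rendered frames either tear across a refresh or never get displayed.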
 

stefx

Distinguished
Nov 27, 2006
[quote="Eurasianman
WTH are you talking about?!?!?! I can up to 200 FPS on my monitor in CS:Source and I'm using a 7900GS that is hooked up to a Samsung 940BW 19" WS through DVI! And that's probably because I have all the settings turned up to max.[/quote]

Right-click on your desktop and go into Properties.

In your display settings somewhere, you'll find that your LCD screen's refresh rate is 60 Hz. You're not displaying 200 FPS on your LCD screen; not that it would matter anyway, as your human eye (hopefully...) wouldn't be able to tell the difference between 60 FPS and 200 FPS.
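
For the curious, the same check can be scripted on Windows. A minimal sketch, assuming a stock Windows machine, using the Win32 EnumDisplaySettingsW call through ctypes; the DEVMODEW struct below is truncated after the refresh-rate field, which is all we need here:

[code]
# Windows-only: read the current display mode, including the refresh
# rate, via EnumDisplaySettingsW (no third-party packages required).
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        # For a display device this region is really a union; the
        # printer-style fields below just keep the byte offsets right.
        ("dmOrientation", ctypes.c_short),
        ("dmPaperSize", ctypes.c_short),
        ("dmPaperLength", ctypes.c_short),
        ("dmPaperWidth", ctypes.c_short),
        ("dmScale", ctypes.c_short),
        ("dmCopies", ctypes.c_short),
        ("dmDefaultSource", ctypes.c_short),
        ("dmPrintQuality", ctypes.c_short),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(mode)):
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} "
          f"@ {mode.dmDisplayFrequency} Hz")
[/code]

On other platforms the equivalent would be xrandr (Linux) or the Displays panel (macOS).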
 

HarkeN

Distinguished
Nov 15, 2007
Actually, I'm pretty sure you're wrong there. As LCD monitors do not actively refresh the image a given number of times per second, but instead only change the parts of the image that are moving, display Hz becomes irrelevant; shouldn't effective FPS instead be restricted by the response time? Due to the way the drivers are coded this will likely still cause image tearing, though that has never been a problem for me.

Moreover, changing the Hz above 60 has been shown to have a potential negative impact on any overdrive/response-time technology. Lastly, don't get me started on that human-eye bullcrap. The human eye can easily see upwards of 200 FPS, and if you can't, then you're probably either visually impaired, not paying enough attention, or in denial.
 

azgard

Distinguished
Dec 20, 2002


Point out the device you used to compare 200 FPS to the inferior sub-200 FPS displays.
Thanks in advance.
 


Wow, you must be living in perpetual "bullet time" to be able to see 200 FPS, i.e. 200 individual "snapshots" per second. You should therefore be able to see each individual wing flap a hummingbird makes, since they "only" flap their wings up to 80 times per second (depending on the species).

Most mortals cannot break time down into 5 ms increments (1 second / 200 frames).
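
Spelling out that arithmetic, here is the per-frame time budget at the rates argued about in this thread (plain Python; nothing assumed beyond the division itself):

[code]
# Frame time is simply 1000 ms divided by the rate.
for fps in (23, 35, 60, 75, 100, 200):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")
[/code]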
 

MagicPants

Distinguished
Jun 16, 2006
Even if you send a 75 Hz signal to an LCD, most if not all LCDs just drop the extra 15 frames per second. 75 Hz signals can also be blurrier than 60 Hz signals, so even if you are using VGA, 60 Hz will look better most of the time.

A DVI (digital) signal will look crisper than a VGA (analog) one. It won't have any ghosting or funny colors.
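
If the panel really does just show the newest input frame at each of its own 60 updates, which is one plausible reading of "drops the extra 15", a quick sketch shows which input frames never appear:

[code]
# 75 Hz in, 60 Hz panel: each panel update shows the newest input frame,
# so 1 in every 5 input frames is never displayed (75/60 = 5/4).
SRC_HZ, PANEL_HZ = 75, 60
shown   = {int(i * SRC_HZ / PANEL_HZ) for i in range(PANEL_HZ)}  # one second
dropped = sorted(set(range(SRC_HZ)) - shown)
print(f"shown {len(shown)}/{SRC_HZ} input frames; dropped: {dropped}")
# -> drops frames 4, 9, 14, ... (every 5th one)
[/code]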
 

jerseygamer

Distinguished
Nov 9, 2007


No, the human eye is stuck at around 35-ish frames per second. It is humanly impossible to even see 60 FPS. I repeat: it is impossible. This is not an old wives' tale but medical fact.
 

jerseygamer

Distinguished
Nov 9, 2007


An LCD that actually can display over 150 FPS would cost in the area of $6,000, and I am betting you can't find one outside of a specialty store in a very rich area.
 

azgard

Distinguished
Dec 20, 2002


My original post was to call him out on utter BS. And LCDs don't function in the same way as CRTs, so an FPS measurement is irrelevant.
 

BlueNovember

Distinguished
Jul 18, 2008


Exactly. This thread is full of misinformation; the linked article is a good guide that should clear things up.

One thing it didn't talk about that I'd be curious to see, though, is how short a flash of light is discernible not just by its existence but by its other qualities. In other words, if you can see a 1/400-second flash but can't tell the difference between it and the same flash lasting 1/60 of a second, then 60 FPS can still be fast enough to reproduce anything you could see. If you can tell the difference, though, faster could be useful.
 

BoredErica

Distinguished
Aug 17, 2007

WHAT THE HELL are you talking about? Jesus Christ, are you 3 years old?
Go play any game and set max FPS to 35, THEN SET IT TO 60 FPS! Don't be an idiot and post idiotic messages!

Now, like others said: yes, the computer can generate 200 FPS, but the monitor shows FPS capped at the screen refresh. A 60 Hz refresh means that even if you get 99 FPS, the monitor shows 60 FPS. There is no visual difference to humans after 60 or 70 FPS;
anything LOWER than 60 FPS, we can definitely see.

 

bnot

Distinguished
Nov 17, 2007
Actually, I think you'll find that the eye does not work like a screen, where we see individual pictures so-many times per second; it is in fact constantly monitoring any changes, and it is 'persistence of vision' that allows us to blur together the individual pictures flashed up by our screens.

This explains how we can see a flash lasting 1 ms or even less, yet can't distinguish it from one lasting longer, and also how we cannot tell the difference between 60 FPS and 2309320932 FPS.

Lastly, I'd like to agree that your computer could do 200 FPS, but if your screen doesn't, then why not cap your FPS (see the sketch below) and help save the environment by using less power? :D
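
A minimal sketch of such a cap, assuming nothing beyond the Python standard library (real games do this internally; the render call here is just a stand-in): render a frame, then sleep off the rest of that frame's time budget.

[code]
import time

CAP_FPS = 60
BUDGET  = 1.0 / CAP_FPS          # seconds each frame is allowed to take

def run_capped(render_one_frame, seconds=1.0):
    """Run a render loop, but never faster than CAP_FPS."""
    end = time.monotonic() + seconds
    frames = 0
    while time.monotonic() < end:
        start = time.monotonic()
        render_one_frame()
        frames += 1
        leftover = BUDGET - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)  # idle instead of rendering unseen frames
    return frames

print(run_capped(lambda: None), "frames in ~1 s")  # ~60, not thousands
[/code]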
 

hipcheck

Distinguished
Sep 16, 2008
This thread is pretty hilarious; for people who are into hardware and gaming, it's surprising how misinformed a few of the posts are.

I have dealt with flicker and high refresh rates/FPS a lot in my life because my eyes are overly sensitive. As a result, I can tell you that refresh rate does NOT equal FPS at all. Over 99% of humans can't see 30 FPS; I am one of the unlucky few who can (just barely), so a slight flicker registers, especially when the FPS is lower. Most people can't see the difference between a 75 Hz refresh and a 100 Hz refresh, but a small number can. Many more can see the difference between 60 Hz and higher rates, as that's the normal threshold for being able to detect flicker.

200 FPS is a hilarious marker, people. It's a number used for testing equipment in a vacuum; there are no humans who can see anything close to that. Trust me, I am an outlier and I peak in the 40s at most. 200 FPS is something a graphics card can muster in a test, and there would be no visual difference between that and 100 FPS to us. The only difference comes when you put those results under stress (heat, bottlenecks, etc.) and the performance is pushed way down: when your 200 FPS video card is put under heavy stress, perhaps it will show 35 FPS where a lesser card will show 20 FPS, and then you'll notice the difference. Otherwise, go brag to your friends; it's useless in the real world.

In any case, LCDs don't refresh in nearly the same way CRTs did, and that whole discourse hasn't been updated in common talk, so it's hard to get people on the same page. 60 Hz can look very different on different screens, because their vertical/horizontal refreshes are synced differently.

And I'm using an LCD on a DVI connection that's refreshing at 100 Hz right now, so unless something has changed since the earlier posts, DVI can support over 60 Hz (although someone mentioned that it drops frames; I'm not sure if that's the case).
 

inferno1337

Distinguished
Sep 24, 2008
Yes, any resolution that uses a 10:9 aspect ratio or higher usually cannot go any higher than 60 Hz, so going VGA with a widescreen is basically pointless.
 