
DVI-D vs DB-15 comparison

January 30, 2007 6:31:04 PM

Are there any articles or comparisons between these two connections?

My brother and I tested both of our monitors (my 215TW and his Viewsonic WX2235WM, both native at 1680x1050, both HDTV capable) side-by-side using both connections with various games, HD movies, and still images... everything looks much better with DVI.

But a forum poster over at another board says that the differences aren't noticeable. Yet my brother and I both strongly agreed that no matter which of our monitors we used, the one on the VGA connection was noticeably inferior (washed-out colors, lots of Vsync problems, and not as sharp). He claims that both of our monitors are faulty...

I'd like to read a professional comparison of the two to make sure we're not going nuts.


EDIT for new info: Both the monitors my brother and I tested are LCDs. Changed title to fix n00b mistake.


January 30, 2007 6:55:25 PM

Why don't you just go by your own opinion? That should be the only thing that counts.
January 31, 2007 2:50:17 AM

dvi is better..it should be..its made to be...noob

thats like saying my 6200le is the same as my 7900gtx in oblivion at like 1920x1080 res....wtf??? learn some more about computers
January 31, 2007 2:58:14 AM

Actually many people that are very knowledgeable on the subject will argue there is no difference. Learn more about computers noob.
January 31, 2007 3:28:43 AM

It depends.

Some people report a very noticeable difference; I for one find DVI delivers a much nicer image.

But others don't notice anything. Not sure why; could be their eyes, could be that they were getting a good VGA signal. I don't know, there are any number of other factors.

DVI is better on a technical level, though.
January 31, 2007 3:30:36 AM

o sorry iam freaking 13 and ive built like 5 computers already...

sorry for not knowing what iam talking about...i hooked my monitor up in vga then dvi..and dvi has a lot chrisper picture..and more vivid colors..its just plain clearer
January 31, 2007 3:33:23 AM

dvi is a digital signal and vga is analog...people who no stuff no that...digital is always better than analog...the monitor has to take the time to convert the analog signal to digital to display it on the screen
January 31, 2007 3:34:38 AM

I've performed the test myself and do quite a bit of video and graphics work, and I see no perceivable difference on the PC. My HDMI connection makes a difference on my HD TV. Your video card may make a difference, but with the 8800 I see none.

Unless you connect two calibrated displays side by side, there's no way you could perform the video test. With static images you could photograph the settings with a camera in manual mode. There is no way you could do an accurate comparison by memory.

If it makes you feel better, go for it.
January 31, 2007 3:43:29 AM

The GPU is a definite factor, since an analog VGA signal can vary heavily in quality from card to card.


There are many factors, but DVI is undeniably the safer bet, and many people find it looks nicer.
January 31, 2007 3:51:34 AM

i believe DVI to be superior. why else would the industry move to a digital signal?
January 31, 2007 4:01:41 AM

Quote:
o sorry iam freaking 13 and ive built like 5 computers already...

sorry for not knowing what iam talking about...i hooked my monitor up in vga then dvi..and dvi has a lot chrisper picture..and more vivid colors..its just plain clearer


... There is no visible difference. I have all sorts of different connections on my network and different graphics cards, and they all look the same. This is across ~100 LCDs....

Maybe for certain monitors it makes a difference; honestly I think it's because you knew which one was hooked up and in your mind you made the DVI better...
January 31, 2007 4:12:10 AM

Maybe better monitors have a better DVI signal than VGA, and maybe crappier monitors do too... I don't know.

It could also be a psychological thing.


Beats me. Psychological or not, it's better on mine.
January 31, 2007 4:29:08 AM

FYI, digital is not always better, but it's almost always cheaper. There shouldn't be an appreciable difference if set up properly, but I think VGA is more prone to errors, image variation, etc.
January 31, 2007 4:59:37 AM

Quote:
dvi is better..it should be..its made to be...noob

thats like saying my 6200le is the same as my 7900gtx in oblivion at like 1920x1080 res....wtf??? learn some more about computers


That's a complete apples-to-oranges comparison, not to mention completely out of proportion.


Quote:
o sorry iam freaking 13...

That's obvious from your immaturity and your complete lack of any sort of grammatical skills. You jumped on the guy for asking a question and were proven wrong. Way to go.

Now back on topic


DVI is always better than VGA; whether it's noticeable or not is another story. On my 17" LCD I can't say I see a noticeable difference. On my 20" using VGA, the color doesn't seem quite as good and the picture seems blurred, to the point where it hurts my eyes to look at, while the DVI looks a million times better. It really depends on the quality of the ADC and DAC in use. In most cases it won't matter much.


But simply put: if you have the option, always go DVI; if not, go VGA. In most situations it's not that much different.
January 31, 2007 5:50:03 AM

Of course DVI is better, but the thing is, it costs more, and the difference isn't as substantial unless you go up in size (i.e. 22-inchers).
Does DVI on a 17 or 19 inch justify spending an extra 30-40 dollars? If you have the money, by all means go for it, but if you're on a budget and you're only aiming for a good 19-inch, I don't think you need to go DVI.
January 31, 2007 2:42:11 PM

Quote:
dvi is a digital signal and vga is analog...people who no stuff no that...


STFU n00b!

"People who no stuff no that" huh?
Yeah, maybe that's the problem, they don't know as much as they think they do.

Ask yourself what makes the DVI look better. Is it because DVI setups often autodetect the panel and then automatically adjust the settings to a better default level, so the n00bs are better off not having to play with colour temperatures, etc.? Or is it because you're looking at a digital panel that poorly converts an analogue signal? Is it because the card itself sucks?

Quote:
digital is always better than analog...the monitor has to take the time to convert the analog signal to digital to display it on the screen


Not always, and that statement proves you don't know jack. Try running the cable over a greater distance and then you tell me which is better. VGA can give you usable cable lengths much longer than standard DVI. So in that case the loss from cable length or even a bad TMDS on the DVI side may be worse than the loss from conversion errors or analog signal noise on the VGA side.


And it's not just 'analog vs digital': BNC-connected analogue can pump out far higher resolutions and colour depths than even dual-link TMDS DVI, and over really long distances. Of course, if you have quality repeaters you can improve DVI, but that goes for VGA too.

Seriously, both are good and bad depending on their use; your own personal experience (though vast, with that handful of computers :roll: ) is not indicative of much beyond that. And considering the crap monitors most people compare with at home anyway, I wouldn't trust anyone else's perception of good/bad.

Betcha the VGA and DVI-A images on my properly calibrated P260 here at work look crisper and more true than any DVI-connected image you put on your cheap panel, but it won't be the DVI/VGA alone that would have the largest impact.

The only thing you said that makes sense is that the OP shouldn't care what people think, but calling him a n00b when you're obviously a FAQin' dumbA$$ is laughable. At 13 you should consider SingTFU and listening more than posting.

Seriously, go play in traffic kid!
January 31, 2007 2:58:07 PM

Quote:
i believe DVI to be superior. why else would the industry move to a digital signal?


I suppose SATA II is superior to SATA I? :p 

PCI-e 8x < PCI-e 16x?

Same situation, yes the newer tech can be better, but often it goes unnoticed that there is no actual performance gain.


I don't see any difference in DVI on my monitor, I'm sure some people do and some don't. For reference I have a Westinghouse 22" Widescreen 5ms response time. 1680 x 1050 native.
January 31, 2007 3:41:34 PM

Quote:
Actually many people that are very knowledgeable on the subject will argue there is no difference. Learn more about computers noob.


You should check your own facts before calling someone else names when they're right.

There is a massive difference. DVI is digital. RGB (what he's calling VGA) is analog.

A picture from an analog source on an LCD panel has gone through two lossy conversions: D-to-A by the video card and A-to-D in the monitor. It has also been degraded (twice) by quantization, plus signal loss in the cable, and its timing signals are analog, so the H and V sync is less accurate.

A digital image has been through no conversions and suffers no loss, as both it and its timing are discrete signals.

It's exactly like comparing the sound quality of a DVD to a cassette tape. One is digital, one is analog.

Anyone who says RGB looks as good as digital is probably using a CRT monitor (which by its nature is an analog device) or maybe a resolution that isn't the LCD panel's native resolution. In either case, they aren't getting the best because of basic mistakes, so they are technically clueless and most definitely wrong.
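As a rough illustration of that double-conversion point, here is a minimal Python/numpy sketch (not a real video pipeline; the cable-noise figure is invented purely for illustration) comparing a pure digital path against a DAC-to-ADC round trip for one scanline:

Code:
import numpy as np

rng = np.random.default_rng(0)
line = rng.integers(0, 256, size=1920)                  # one 1920-pixel, 8-bit scanline

# Digital (DVI) path: bits in, bits out -- no conversion step.
digital_out = line.copy()

# Analog (VGA) path: idealized DAC, assumed cable noise, then the monitor's ADC re-quantizes.
analog = line / 255.0                                   # DAC to a normalized level
analog += rng.normal(0.0, 0.004, size=analog.shape)     # assumed cable/supply noise
recovered = np.clip(np.round(analog * 255.0), 0, 255)   # ADC back to 8 bits

print("digital path, pixels changed:", np.count_nonzero(digital_out != line))
print("analog path, pixels changed:", np.count_nonzero(recovered != line))
print("analog path, worst error (8-bit steps):", int(np.max(np.abs(recovered - line))))

On this toy model the analog path typically disturbs a handful of pixels by a step or two while the digital path is bit-exact, which is all the "two lossy conversions" argument amounts to.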
January 31, 2007 3:44:48 PM

DVI will usually look better out of the box, but if you tweak the settings on the analog one, you can typically make it appear to have the same quality.

If things look fuzzy on the analog one, play around with the monitor's settings; the auto-optimize for analog on some LCDs is better than others.
January 31, 2007 3:45:43 PM

Quote:
i believe DVI to be superior. why else would the industry move to a digital signal?


I suppose SATA II is superior to SATA I? :p 

OOOOooooohhh! Watch out, don't call it SATAII or the SATA-IO guys will getcha:
http://www.theinquirer.net/default.aspx?article=37254

Can't remember the thread, but there was one of them nutterz here a few weeks back.

Of course they forget that it can still be a SATAII drive in the sense that it's a SATAII device-spec drive. :roll:

BTW, the iRAM drive from Gigabyte can actually saturate SATA 1/1.1 and might benefit from the SATA2/II/3G... speed boost.
January 31, 2007 4:03:32 PM

Quote:
I've performed the test myself and do quite a bit of video and graphics work, and I see no perceivable difference on the PC. My HDMI connection makes a difference on my HD TV. Your video card may make a difference, but with the 8800 I see none.

Unless you connect two calibrated displays side by side, there's no way you could perform the video test. With static images you could photograph the settings with a camera in manual mode. There is no way you could do an accurate comparison by memory.

If it makes you feel better, go for it.


Are you using an LCD panel or a CRT? I know lots of graphics pros still prefer CRTs because of the more natural colour gradient. If you're using a CRT then you'll hardly see any difference between RGB and DVI, as CRTs are analog by their nature.

Also, if you're mostly looking at graphics and photos it's no wonder, as there are no sharp edges compared to, say, text.

If you're using an LCD: to see the problem everyone else sees, set the Windows screen resolution to your LCD's native resolution, make sure you turn off any crappy Microsoft font-blurring technology like ClearType, and boot up with only the DVI cable connected.

Now look at some small text. You should see a lovely sharp picture. Now reboot with just the RGB cable connected. You will see lots of blurriness around the text by comparison.
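If you would rather script the ClearType step of that test on Windows than dig through the Display control panel, a small Python ctypes sketch along these lines should work (illustrative only; it uses the documented SPI_GETFONTSMOOTHING / SPI_SETFONTSMOOTHING parameters of SystemParametersInfo and assumes an ordinary desktop session):

Code:
import ctypes

SPI_GETFONTSMOOTHING = 0x004A
SPI_SETFONTSMOOTHING = 0x004B
SPIF_UPDATEINIFILE = 0x0001
SPIF_SENDCHANGE = 0x0002

user32 = ctypes.windll.user32

def font_smoothing_enabled():
    # reads the current font-smoothing (ClearType/standard) state
    state = ctypes.c_uint(0)
    user32.SystemParametersInfoW(SPI_GETFONTSMOOTHING, 0, ctypes.byref(state), 0)
    return bool(state.value)

def set_font_smoothing(enabled):
    # turns font smoothing on or off and broadcasts the change
    user32.SystemParametersInfoW(SPI_SETFONTSMOOTHING, int(enabled), None,
                                 SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)

print("font smoothing currently on:", font_smoothing_enabled())
set_font_smoothing(False)    # turn it off for the sharpness comparison
# ...look at small text over DVI, then over VGA...
set_font_smoothing(True)     # put it back when you're done

Remember to set the desktop to the panel's native resolution first; the script only handles the font-smoothing part of the test.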
January 31, 2007 4:12:18 PM

Quote:

Betcha the VGA and DVI-A images on my properly calibrated P260 here at work look crisper and more true than any DVI-connected image you put on your cheap panel, but it won't be the DVI/VGA alone that would have the largest impact.



More of grapeape's usual standard of crap.

The P260 is a 19.1-inch-viewable CRT with a current value of $39.95 on eBay, so it's nothing to boast about.
http://popular.ebay.com/ns/Computers-Networking/Ibm+P26...

Of course you won't see a difference between DVI and analog on a CRT.

And if you think any CRT could ever be as sharp as even a cheap digital panel then you really are clueless.
January 31, 2007 4:21:32 PM

VGA is the wrong term to be using guys.


EVGA, SVGA, WSVGA, these are all standards. RGB input is what the OP is talking about.

Also, the 3-color, 5-cable BNC analog (with V/H inputs) looks far better to a graphics artist on a good analog monitor than a digital monitor that overhypes its colors and contrast.

It all depends on the application of both digital and analog signals. Both have positives and negatives associated with each technology.


As for the 13-year-old SNOT:
What about an analog camera? Are you saying a digital camera can take better pictures than a regular film camera? I don't think so. Sure, the brain inside a digital camera helps people take better pictures by setting the f-stop and shutter speed for them, but used properly, a film SLR will look much better than a digital SLR in most cases.


Edit: though at smaller image sizes, digital has almost caught up, if not drawn about equal. However, performance is still poor.
January 31, 2007 4:29:02 PM

Quote:

You should check your own facts before calling someone else names when they're right.


Right according to what? And his name-calling was in response to the previous poster.

Quote:
There is a massive difference. DVI is digital. RGB (what he's calling VGA) is analog.


Massive difference, eh!?! What, like 2.7 x 10^9 or less?
Being digital or analogue itself isn't the issue; it depends on the quality of the components and how they are being used. What's better, digital or analogue audio? (Depends on the quality and rate, doesn't it?) The same goes for video. You can have digital artifacts just like analogue artifacts. And like I said, both have their weak links.

Quote:
A picture from an analog source on an LCD panel has gone through two lossy conversions: D-to-A by the video card and A-to-D in the monitor. It has also been degraded (twice) by quantization, plus signal loss in the cable, and its timing signals are analog, so the H and V sync is less accurate.


And yet you can still get a better image from a quality VGA cable and good filtered output than from a poor TMDS on crap cables outside their range. The potential is better for DVI within a given range, but it doesn't always result in a better picture, which is what the discussion has been so far. So if someone came in saying their VGA was giving them a better picture than their DVI, your argument and that of LB7 would be that it's their fault, because DVI is better than VGA. That isn't always the case, as many of us with experience of both know.

Quote:
It's exactly like comparing the sound quality of a DVD to a cassette tape. One is digital, one is analog.


Obviously you've never really looked at all the options when authoring a DVD. Make the bit rate low enough and your DVD will look worse than a VHS tape (let alone Super-Beta); it isn't just digital vs analogue alone.

Quote:
Anyone who says RGB looks as good as digital is probably using a CRT monitor (which by its nature is an analog device) or maybe a resolution that isn't the LCD panel's native resolution. In either case, they aren't getting the best because of basic mistakes, so they are technically clueless and most definitely wrong.


Niz, you prove over and over that you just don't know what you're talking about. It's not because of 'mistakes', unless they are using the wrong hardware, and that can happen with both DVI and VGA. Seriously, if you knew what you were talking about, you'd take a more centrist position and know that the output image depends on a lot of factors, many of which are heavily influenced by the user (be it requirements or limitations).

Quote:
More of grapeape's usual standard of crap.


Awww, poor baby. Just because I keep schooling you, NIZ, there's no need to assume other people post crap like you do.

Quote:
The P260 is a 19.1-inch-viewable CRT with a current value of $39.95 on eBay, so it's nothing to boast about.
http://popular.ebay.com/ns/Computers-Networking/Ibm+P26...


eBay!?! C'mon, you're gonna argue eBay prices? We're talking quality, not prices, dumbA$$, or did you think this was an economics discussion I was schooling you in yet again? The P260 is among the best Trinitrons made, and no consumer-level LCD can compete with its quality. A good (and expensive) LED-backlit LCD panel can compete, as can the extremely expensive HDR panels, but then again, that's the panel, not the connection, like I said. Your inability to distinguish between the two shows your ignorance.

Quote:
Of course you won't see a difference between DVI and analog on a CRT.


That's not the issue now, is it? Since there is no digital on a CRT, DVI-A is still analogue, or did you not know that? What I was saying is that it's not always the interface that matters, so comparing against a DVI-I-connected LCD from 4-5 years ago isn't going to guarantee a better picture; there's more involved.

Quote:
And if you think any CRT could ever be as sharp as even a cheap digital panel then you really are clueless.


Well, based on your previous posts and the fact that I work with both, as well as with higher-end panels, I KNOW you're clueless; just look at all your suppositions and how many times you got them wrong.

You THINK I'm wrong; everyone else, including myself, can clearly see you're wrong! Maybe it's the quality of the connection between your brain and your eyes/keyboard. :tongue:
January 31, 2007 4:49:49 PM

Quote:
VGA is the wrong term to be using guys.


EVGA, SVGA, WSVGA, these are all standards. RGB input is what the OP is talking about.


Yeah, it would help to say DB-15, but few people know the term. VGA has become synonymous.

Quote:
Also, the 3-color, 5-cable BNC analog (with V/H inputs) looks far better to a graphics artist on a good analog monitor than a digital monitor that overhypes its colors and contrast.

It all depends on the application of both digital and analog signals. Both have positives and negatives associated with each technology.


EXACTLY!

Glad someone else understands the concept.
In the end, go with what works for you and your application.
January 31, 2007 4:50:05 PM

Quote:
dvi is better..it should be..its made to be...noob

thats like saying my 6200le is the same as my 7900gtx in oblivion at like 1920x1080 res....wtf??? learn some more about computers


I believe that is the whole point of posting in a forum and asking a question, is it not? Usually a good way to learn something is to ask a question about something you don't know and get an answer.

We were all n00bs at some point in our lives. We all probably got to where we are now by asking a few dumb questions along the way.
January 31, 2007 5:18:16 PM

Quote:
o sorry iam freaking 13 and ive built like 5 computers already...
sorry for not knowing what iam talking about...i hooked my monitor up in vga then dvi..and dvi has a lot chrisper picture..and more vivid colors..its just plain clearer


Hmmmm.... this just makes it too easy. Young, and a whopping 5 computer builds.... I'm not seeing anything here that screams expert to me at all. Perhaps you shouldn't be too trigger-happy with the noob label... we're not in video-game land anymore.
January 31, 2007 5:36:19 PM

Quote:
Are there any articles or comparisons between these two connections?

My brother and I tested both of our monitors (my 215TW and his Viewsonic WX2235WM, both native at 1680x1050, both HDTV capable) side-by-side using both connections with various games, HD movies, and still images... everything looks much better with DVI.

But a forum poster over at another board says that the differences aren't noticeable. Yet my brother and I both strongly agreed that no matter which of our monitors we used, the one on the VGA connection was noticeably inferior (washed-out colors, lots of Vsync problems, and not as sharp). He claims that both of our monitors are faulty...

I'd like to read a professional comparison of the two to make sure we're not going nuts.


Well, what you have seen most likely has much more to do with your GPU than your monitor. The quality of analog output greatly depends on the quality of the analog components.

Years ago, when VGA was the only option, I remember I had to sell my Nvidia TNT card and buy a Matrox, because the quality of the TNT's analog output was far inferior to the Matrox's.

I guess today, when DVI is almost the default, GPU makers simply ignore the quality of the analog output and save costs there.

Mirek
January 31, 2007 5:55:09 PM

I have 4 identical monitors... Gateway FPD1830s, 18.3" LCD screens... each with DVI and VGA...

I run a dual-monitor setup on an AGP nVidia 7600GS... soon to be tri-monitor with the help of Matrox's TripleHead2Go...

Having run dual monitors for so long, I can tell you this... DVI is technologically capable of providing a more 'true to life' image than VGA (brighter/darker colors, red looks more 'red', blue looks more 'blue', etc.)... whether it does or not depends on a few factors:

1.) capabilities/configuration of the monitor...
2.) capabilities/configuration of the graphics card...
3.) lighting environment
4.) length/quality of the cables used
5.) visual limitations of the viewer

quick explanations:
1.) capabilities/configuration of the monitor...
- monitor contrast ratio (the higher the ratio, the more advantageous DVI will become)
- monitor response time (mainly for comparison during video)... the faster the response time (12ms, 8ms, 6ms), the crisper motion video will appear over DVI

2.) capabilities/configuration of the graphics card...
- watching video, playing games, image editing, etc. all depend on how it's signaled by the graphics card... some graphics cards are capable of cleaning up an image (especially when the card itself has to render the image)...
- graphics cards are generalized for the public... they're made to take an attached monitor, display an image, and render 3D... that's it... that's why ATi and nVidia provide color-balancing software with their drivers... to maximize the output of your card to your monitor, and to push the limit of your monitor on your card...

3.) lighting environment
- mainly light diffusion... this occurs when the light from the colors on the screen intersects with the light from other colors and causes color-mismatching when the image is viewed by a person... RED will look RED under a magnifying glass, but next to a couple hundred other red, green, and blue pixels, viewed from 18-36 inches away, it will not look as red... this is why some monitors include a plastic film over the screen (BrightView, for example)... this plastic helps reduce color/light diffusion
- ambient light around the monitor (in the room) also contributes to light/color diffusion... try using a monitor in a dark/red room for photo development and tell me if everything on your screen still looks the same... same under blacklight conditions... does it look the same? no

4.) length/quality of the cables used
- the longer the length, the more the signal is degraded... DVI definitely does better in terms of video quality when longer cables have to be used, because the signal is already digital and less prone to line noise than an analog (VGA) signal... consider digital to be either AIR or CONCRETE (0 or 1)... can you inject noise into air? no, it drops out... can you inject noise into concrete? no, it runs off... analog, however, is more like pudding... sometimes yes, sometimes no :)  (there's a toy model of this a bit further down in this post)
- shielding is also an important factor, which contributes to the quality of the cable... thicker shielding will help VGA more than DVI, because the shielding will help minimize the amount of noise the cable is exposed to... ever gone to Circus City or Bad Buy and had them try to sell you those $150 Monstrocity Cables for your new 1080i TV? they were expensive, but they're worth it, because your signal is analog... same goes for car audio (I do car audio installation as well as computer teching)... better shielding = clearer signal... period

5.) visual limitations of the viewer
- I hate to bring this up, but some people are color blind... I used to be an inbound customer service rep for SBC Yahoo! DSL, and the modem's lights could be 1 of 3 colors: red, orange, or green... a lot of times, though, people said the light was yellow (usually they meant green)... they're color blind... sometimes they'd say the light was orange (they meant red)... some people are just physically incapable of noticing a difference in colors, because their brains literally don't process them the same way... red and green are the two colors most commonly lost in colorblindness, and yellow and orange are usually the colors they are mistaken for... some people have such strong colorblindness that they call red 'yellow' because that's what it looks like to them...

So you see, these 5 main factors are what control the display of DVI and VGA... I personally DO notice a difference between DVI and VGA (on DVI, mainly in the richness of colors... black looks more black, white looks more white, and every color looks deeper and richer)... gradients display better on DVI... but in the end, it's up to the user...
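To put some toy numbers behind point 4, here is a small Python/numpy model of an analog level versus a thresholded digital bitstream as cable length grows. The attenuation and noise figures are invented purely for illustration; only the shape of the result matters:

Code:
import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=10000)          # a digital bitstream
level = 0.35                                   # an analog signal level (e.g. a 35% grey)

for length_ft in (6, 25, 50, 100, 150):
    atten = 0.995 ** length_ft                 # assumed loss per foot (made up)
    noise = rng.normal(0.0, 0.002 * length_ft, size=bits.shape)

    # Digital path: the receiver re-thresholds, so errors only appear once
    # the noise is big enough to flip a bit outright.
    rx_bits = (bits * atten + noise) > (0.5 * atten)
    bit_errors = np.count_nonzero(rx_bits != bits.astype(bool))

    # Analog path: whatever level arrives IS the picture.
    rx_level = level * atten + rng.normal(0.0, 0.002 * length_ft)
    print(f"{length_ft:>3} ft   bit errors: {bit_errors:>5}   analog level error: {abs(rx_level - level):.3f}")

In this toy model the digital line stays perfect until the noise finally swamps the threshold and then falls apart all at once, while the analog error just creeps up with length, which is roughly what the air/concrete vs pudding analogy is getting at.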

That's all that matters in computers, period... if you're satisfied with your results and the way you got there, stick with it until your satisfaction disappears... then upgrade :) 

Hope this helps, and please, guys, stop arguing on the threads... we all act like idjuts sometimes :thumbsup: but some of you seem to have forgotten when to stop acting :thumbsdown:

- Cheers -=Mark=-
January 31, 2007 5:57:36 PM

Analog VGA can theoretically be as good as DVI. However, in real-life implementations, the VGA cable can make a big difference in realizing that. A bad VGA cable without the right shielding can make a mess of the picture. The connectors on the video card and the monitor, if they have quality-control problems, would also contribute to a degraded picture. Verify those things before jumping to conclusions.
January 31, 2007 6:03:05 PM

Quote:
dvi is a digital signal and vga is analog...people who no stuff no that...


STFU n00b!

"People who no stuff no that" huh?
Yeah, maybe that's the problem, they don't know as much as they think they do.


Seriously, go play in traffic kid!

The best post I've read in quite a while, and sound advice too. :D 
January 31, 2007 6:15:15 PM

And people who know stuff actually know how to spell words too. Just because something is theoretically better doesn't mean it always is. Many, many benchmarks will show you that.
January 31, 2007 6:16:04 PM

Quote:
VGA is the wrong term to be using guys.


EVGA, SVGA, WSVGA, these are all standards. RGB input is what the OP is talking about.


Yeah, it would help to say DB-15, but few people know the term. VGA has become synonymous.

Quote:
Also, the 3-color, 5-cable BNC analog (with V/H inputs) looks far better to a graphics artist on a good analog monitor than a digital monitor that overhypes its colors and contrast.

It all depends on the application of both digital and analog signals. Both have positives and negatives associated with each technology.


EXACTLY!

Glad someone else understands the concept.
In the end, go with what works for you and your application.


Hint to all future A+ exam takers: what is the standard analog video connection? DB-15.
January 31, 2007 6:19:17 PM

Quote:
And people who know stuff actually know how to spell words too. Just because something is theoretically better doesn't mean it always is. Many, many benchmarks will show you that.


I wouldn't equate spelling skills with general knowledge. Plenty of famous writers had bad grammar and spelling, and I know a few scientists with poor spelling.

Then there are people who just don't care about typos... You get my point.
January 31, 2007 6:25:18 PM

Quote:
i believe DVI to be superior. why else would the industry move to a digital signal?


Because it's pretty hard to DRM an analogue signal.

If you have good D/A and A/D converters in your GPU and monitor, you probably won't notice the difference between VGA and DVI. However, I don't think I've ever heard of DVI being worse than VGA, and there are some cases where VGA may be noticeably worse than DVI. So, since VGA is never better and sometimes worse, all else being equal the choice is simply DVI.
January 31, 2007 6:36:25 PM

DVI has more bandwidth than VGA

so maybe it won't matter with your 14in CRT but the difference may become noticeable with your new 24in WS Apple Cinema Display
January 31, 2007 6:39:04 PM

Quote:
DVI has more bandwidth than VGA

so maybe it won't matter with your 14in CRT but the difference may become noticeable with your new 24in WS Apple Cinema Display
:lol: 
January 31, 2007 6:44:44 PM

That has an analog input?


And you assume the 23-inch from Apple has a higher res than a good CRT?
January 31, 2007 6:49:16 PM

Quote:
That has an analog input?


And you assume the 23-inch from Apple has a higher res than a good CRT?


That must be a helluva 14in CRT, reaching a higher res than the Apple Cinema.
January 31, 2007 9:39:13 PM

Quote:
DVI has more bandwidth than VGA


That's a pretty generalised statement. The fact is that in a standard configuration, single-link DVI has LESS bandwidth than what is commonly output via VGA/DB-15. However, dual-link DVI will output more than what is commonly output via VGA. But this depends a lot on the RAMDAC vs TMDS configuration, not the connector itself. Also, BNC physically carries more than both of them, but now you're getting into the physical cable properties of the connectors, not the implementations of them. So this 'discussion' can be taken to wicked extremes.
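If anyone wants to sanity-check that bandwidth claim, here is a back-of-the-envelope Python sketch. It assumes the 165 MHz single-link TMDS limit from the DVI 1.0 spec, a typical 400 MHz RAMDAC of the era on the analog side, and a flat ~25% blanking overhead rather than exact GTF/CVT timings, so treat the numbers as ballpark only:

Code:
SINGLE_LINK_DVI_MHZ = 165            # TMDS pixel-clock limit per the DVI 1.0 spec
DUAL_LINK_DVI_MHZ = 2 * 165
TYPICAL_RAMDAC_MHZ = 400             # common analog (VGA) RAMDAC rating circa 2007

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    # crude estimate: active pixels plus ~25% blanking overhead
    return width * height * refresh_hz * blanking / 1e6

for w, h, hz in [(1280, 1024, 85), (1680, 1050, 60), (1920, 1200, 60), (2048, 1536, 85)]:
    clk = approx_pixel_clock_mhz(w, h, hz)
    print(f"{w}x{h}@{hz}Hz  ~{clk:5.0f} MHz   "
          f"single-link DVI: {'ok' if clk <= SINGLE_LINK_DVI_MHZ else 'no'}   "
          f"dual-link DVI: {'ok' if clk <= DUAL_LINK_DVI_MHZ else 'no'}   "
          f"400 MHz RAMDAC: {'ok' if clk <= TYPICAL_RAMDAC_MHZ else 'no'}")

# Note: with reduced blanking, 1920x1200@60 actually squeaks under the single-link
# limit (~154 MHz), which is exactly the kind of implementation detail that a bare
# "DVI has more bandwidth" claim glosses over.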

BTW, here's one for your brain: consider that it's the analogue properties of the DVI cable and its signal that affect its ability to maintain a proper signal over greater lengths. Ask your local physics grad to explain that to you if you need to.

Quote:
so maybe it won't matter with your 14in CRT but the difference may become noticeable with your new 24in WS Apple Cinema Display


Depends, because if you run that 24" Cinema Display on DVI from a crap single-link TMDS output, you may notice it too, because you'll be near the limits of single-link support.

But using a 14" CRT as an example is ridiculous, since it's not a consumer product but a commercial one, and the Matrox cards were used with hi-res analogue displays for medical imaging.

Once again it depends on the use, because that same Cinema Display is crap for CT and MRI scans, whereas the QUXGA IBM T221 and its ViewSonic rival @ 22" would blow the 23-24" Cinema away. They don't use a normal 'DVI' connector, but they have better image quality, except for fast motion, where those panels have slow refresh; for more static images they're truly awesome. So there's a perfect example of not everything fitting into every mould, at least not until the magical DisplayPort. 8)
January 31, 2007 10:09:01 PM

Quote:
dvi is better..it should be..its made to be...noob

thats like saying my 6200le is the same as my 7900gtx in oblivion at like 1920x1080 res....wtf??? learn some more about computers



NOT NECESSARY....... DICK!!!
January 31, 2007 10:18:50 PM

Looks like the max DVI cable length (retaining image quality) is 50 feet.
Max DB-15 cable length is 150 feet.

I really couldn't find a respectable source; instead I went by projector/home-theater setup forums as well as cable suppliers. I could find high-quality DVI cables certified for a max length of 50 ft, and quality VGA (DB-15) cables up to 150 ft, with people saying they have not had any signal degradation.

Resolutions are 1280x1024.
January 31, 2007 10:46:41 PM

Quote:
digital is always better than analog


Not in all cases.

Ever tried comparing audio?

Try an old vinyl record against a CD.

The record sounds way better.
January 31, 2007 10:51:07 PM

You know, that is entirely up to the listener. I prefer analog sound (vinyl). In fact, I enjoy it MUCH more than DVD-A... though I hate the quality possible with most tape recordings.
January 31, 2007 10:57:54 PM

niz wrote:
Quote:
And if you think any CRT could ever be as sharp as even a cheap digital panel then you really are clueless.


mr. ape wrote:
Quote:
Well, based on your previous posts and the fact that I work with both, as well as with higher-end panels, I KNOW you're clueless; just look at all your suppositions and how many times you got them wrong.

You THINK I'm wrong; everyone else, including myself, can clearly see you're wrong! Maybe it's the quality of the connection between your brain and your eyes/keyboard.


I have experience with LCD, DLP and CRT.

CRT always wins in sharpness and overall picture quality.

So yes, I agree with you, mr. ape.
January 31, 2007 11:00:20 PM

Quote:
I prefer analog sound (vinyl).



same here.
January 31, 2007 11:40:41 PM

Borrowing your Reply button, CompTIA...

I remember trying to get a Hitachi LCD connected to a laptop dock, and the cables were weird. Turns out, even though it was a DVI port, it was analog:

http://www.datapro.net/techinfo/dvi_info.html#Page06

That could very well be the reason why some people notice a difference and others don't. Maybe they don't know if their signal is really digital or not.
January 31, 2007 11:53:42 PM

Quote:
i believe DVI to be superior. why else would the industry move to a digital signal?


I suppose SATA II is superior to SATA I? :p 

PCI-e 8x < PCI-e 16x?

Same situation, yes the newer tech can be better, but often it goes unnoticed that there is no actual performance gain.


I don't see any difference in DVI on my monitor, I'm sure some people do and some don't. For reference I have a Westinghouse 22" Widescreen 5ms response time. 1680 x 1050 native.

Uh huh... and one day those flash-based hard drives will benefit from SATAII, just like video cards are starting to benefit from x16 PCI-E. My point is that D-sub has limitations. DVI does too, but not like D-sub.

So I ask again: why else would the industry move to a digital signal?

A better answer would have been DRM and HDCP. ;-)

Don't make me school you again, boy.
January 31, 2007 11:54:15 PM

Calm down, sparky!