LCD finally ok for video games?

October 23, 2006 1:13:14 PM

I've been looking around and I see all these new monitors with 600:1 contrast ratios and like 2ms response times for like 180-250 bucks. If all this is true, does that mean ghosting is gone and crystal-clear pictures are the norm now for video games?

My first and last Viewsonic a few years back, with its 12ms response time and 500:1 contrast ratio, looked terrible with games. Can anyone help a brother out and let me know if you are having a premium experience with your new LCD monitors?

As of now I'm using a flat CRT, a Samsung SyncMaster 793MB. It looks great, but it's already starting to die on me at only 2-3 years old... so it's time for something new.
October 23, 2006 1:39:51 PM

Tom's did a review a while back:
http://www.tomshardware.com/2006/03/27/the_spring_2006_...

The new LCDs are getting pretty good, and most people find them acceptable now. Look closely at the ones advertising 2ms response times: they may claim to be faster, but their overall image quality was rated lower than many with 4-8ms response times, and some of the very fast-rated LCDs actually performed worse than their slower-rated counterparts.
A good 8ms LCD can provide a decent gaming experience with today's models.
My advice: go to a store where they have a bunch set up, and view them yourself before making a decision.
October 23, 2006 1:42:51 PM

I've been using a Dell 17" LCD monitor (not the cheap-o kind) for a while now and it games perfectly well. Monitors these days are nigh equivalent to CRTs and game just as well. Even better are the widescreen LCDs, although there are resolution issues...

If you want to game, then a good LCD monitor won't disappoint. Tom's has a good guide, but it is kind of old; there are many new monitors out there.

Me, I'm looking at the VX2235WM, Viewsonic's 22" widescreen monitor.
October 23, 2006 2:08:53 PM

Try walking into Best Buy if one is near you; they have a handful of PCs running a game like BF2, each connected to an LCD monitor. It may not be connected to the LCD monitor you want, but it can give you an idea of how they perform nowadays.
October 23, 2006 2:11:00 PM

Unless you're a hardcore FPS addict, you won't notice the difference on an 8ms screen.

Just make sure you do your homework first, since not every LCD has the response time it advertises (know the difference between G-T-G and B-T-W response, overdrive panels, etc.). Viewsonic and Samsung tend to be the most popular LCDs among gamers.

Also, don't get a widescreen if you value FPS framerates.

Otherwise you'll be pleasantly surprised with the quality and reclaimed desk space :)
October 23, 2006 2:25:36 PM

I have the NEC 20WGX2 and it's awesome. The Viewsonic VX922 is very good by all accounts too.
October 23, 2006 2:43:17 PM

I've been using a 22" Acer for a few months with no problems... before that, a 19" BenQ. LCDs are much better than the old CRT folks suggest. CRTs are outdated, suck power, and give me headaches.

I've had zero problems, and I used to play Call of Duty in TWL and CAL.
October 23, 2006 2:47:37 PM

I'd say LCD gaming is definitely a viable option today. I just bought a 42" Westinghouse 1080p LCD TV and it's sweeeeeeeeet. I have to use VSYNC in FPS games, but that doesn't really bother me.

-DW
October 23, 2006 6:10:47 PM

Thing about widescreen is, if there are compatibility issues, the screen will just black out the edges so it can go back to a more normal size. But then you have a "smaller screen"... shrug. I balance game and movie time, and since I'm in a dorm, I can't really have both an HDTV and a monitor...
October 23, 2006 6:13:56 PM

What compatibility issues? Most games released in the last 2 years or so scale up to widescreen quite beautifully. If you do a search for "widescreen gaming" you can find websites that tell you how to fix most games so they work perfectly. This issue bothered me before I dove into widescreens, but it really isn't a problem at all.

-DW
October 23, 2006 8:09:57 PM

That is my problem: I'm a hardcore FPS addict... both ways, actually: first-person shooters and frames per second. I never really paid much attention to how monitors and such work; I just buy whatever people say is the one for price/performance.

I've been Newegging for a while, but obviously I won't trust everyone who reviews there...

So far I've had my eyes on the 19-inchers, and I think the only 3 companies I would look at are Acer, BenQ, and Samsung. I don't really care for Viewsonic anymore because of my bad experiences with them, but they aren't necessarily out of the question... especially if one of theirs seems to be the ish.
October 23, 2006 8:12:23 PM

Thanks for the link and the warning. For some reason I couldn't find monitor reviews; I was only coming up with graphics cards.

I figured 2ms, 4ms, and 8ms would be one of those things that had to perform as advertised... but once again I guess I get to see the side of the marketing industry that destroys my trust in all companies.
October 23, 2006 8:24:49 PM

I also recommend using DVI and not analog. The difference is like HDTV (DVI) compared to standard TV (analog).
October 23, 2006 8:37:05 PM

Eh, I still play a lot of old school games, but yeah, most new games will run without a problem since widescreen's are becoming more and more commonplace. I'm just saying that compatability might be a problem...
October 23, 2006 8:46:29 PM

Quote:
I also recommend using DVI and not analog. The difference is like HDTV (DVI) compared to standard TV (analog).


The only reason DVI would be "higher quality" than VGA is if you converted the signal, i.e. DVI->VGA. On the other hand, if you do a VGA->DVI conversion, your quality is less than VGA->VGA... either way it is not really noticeable. The real reason for DVI is to enable HDCP and keep you from copying protected HD content; that simply wasn't possible with an analog (VGA) signal. It's not as if HD resolutions are higher than PC resolutions already in use...
October 23, 2006 8:56:48 PM

Quote:
That is my problem: I'm a hardcore FPS addict...



Perfect, me too. I'm an FPS-o-holic... I have 11 FPS games installed. I'm also in Counter-Strike: Source CAL (the Counter-Strike: Source league), where speed and accuracy matter the most.

I use the Viewsonic VX924 with a 3ms response time (which, strangely enough, got better ratings than the VX922 with a 2ms response time) and I have no issues with it whatsoever.

I don't recommend purchasing monitors online, only because you're almost guaranteed to get shipped one with many dead pixels... you're better off going to your local Best Buy, Circuit City, CompUSA, or Fry's.

The VX922 and the VX924 are great monitors (the best monitors I have ever seen, second to the 30" Apple Cinema)... if you find one, buy it.
October 23, 2006 9:25:38 PM

Well, I'm no expert, but everybody except the pickiest of DTP folks will agree LCDs are good enough for games these days.

I also agree with the 'Best Buy' test. Specs are not something to put a lot of faith in (response time should really be listed as a range rather than an absolute, and different manufacturers measure things differently...).

The VX922 and VX924 are solid gaming monitors, though they do suffer from an image-quality standpoint. They're based on 'TN' technology, which means 6-bit colours and a narrower viewing angle, but a fast response time.

The VA family of monitors doesn't really fit this thread if FPS is the #1 concern, but since Tom's did such a great send-up of the VP930B, I might as well mention it as the 'best all-around' LCD in the 4:3 19" category.

Actually, speaking of 8-bit and Viewsonic, the VX2025wm (20" wide) is almost fast enough to be considered a 'good' gaming LCD. Again, the 'Best Buy' test will determine whether it'll keep you happy.

The family I like most is IPS. Like VA, they used to be too slow for gaming, but a few years of progress has fixed that (for most users). Like the NEC model mentioned earlier in the thread, these are, IMHO, as close to CRT as you're ever going to get from LCD.

It all depends on where your preferences are. If it's gaming only, TN is the best and cheapest answer (the Viewsonic VX922/924, Samsung 940BF, and LG 1932TX being my top 3 to check out; not a BenQ or Acer fan, sorry).

And call me crazy, but I often recommend people check out Dell for monitor deals. They basically buy the same panels at bulk prices and pass the savings on to you, though their weaker quality control/assembly is the gamble you take (backlight bleeding, mostly). Things like the 1707FP or 1907FP can be had pretty cheap.

If possible, I'd recommend seeing if you can get a 'zero dead pixel' warranty deal; lots of vendors are offering them and they are worth the money (I could never, ever, ever live with a dead pixel). ncix.com does zero-pixel coverage.
October 23, 2006 9:42:35 PM

Quote:
I also recommend using DVI and not analog. The difference is like HDTV (DVI) compared to standard TV (analog).

The only reason DVI would be "higher quality" than VGA is if you converted the signal, i.e. DVI->VGA. On the other hand, if you do a VGA->DVI conversion, your quality is less than VGA->VGA... either way it is not really noticeable. The real reason for DVI is to enable HDCP and keep you from copying protected HD content; that simply wasn't possible with an analog (VGA) signal. It's not as if HD resolutions are higher than PC resolutions already in use...

I never said HD was better than high PC resolutions... I was simply drawing a comparison between the two. And for me, going from CRT to DVI LCD was like night and day! Running digital over analog is always better.
October 23, 2006 10:25:13 PM

What IPS monitors do you recommend? What is the fastest IPS monitor out there?
October 23, 2006 10:58:48 PM

Well, if you've got the duckets: the NEC MultiSync 20WMGX2 ($600+).

AFAIK it uses the newest AS-IPS technology from LG. Each generation of LCDs improves in almost every area (i.e. response time, contrast, brightness, and price :)

Sadly, it's the old 'you get what you pay for'. I still consider the monitor to be (almost) the most important part of a computer system. (Sure, if it's a P2-300 the monitor won't matter much, but I think most people know what I mean.)

The main reason I recommend going all out on the monitor is if you also want to watch video, do regular computer work (even browsing), and play games. TN panels look kinda weak even straight on, and if you move side to side you get some funky stuff going on.

The obvious reason not to get a 20" widescreen is if your GPU can't dish out good FPS at 1680x1050 (its native resolution).

Sure, they can be set to lower resolutions, but the best monitors only look like the best monitors when set to their native resolution.

The other reason not to spend so much coin on an LCD is the fact that it is still a (fast-)developing technology. But the same holds true for almost any computer purchase (wait a bit, get better stuff for less money...).

With the trend moving toward widescreen, and manufacturers not liking to give up the goods on real LCD specs, the panel technology used, etc., it's pretty hard to recommend other models right now (plus I'm kinda lazy and busy studying crap for my new job right now...).

You'd probably be happy with a VX922/924 if gaming is the sole purpose, but I can't watch movies on those things. Plus, once you 'go widescreen' you don't go back. It all depends on your GPU and planned usage.

Though I'm still a fan of 'try before you buy'. Finding a store with display models to meddle with is almost a necessity with LCDs. I usually load a game to test the response time, load a movie to see how the contrast and brightness fare, then load a colour gradient to ensure it isn't some crappy dithering 6-bit panel, and finally load a high-quality JPG I'm very familiar with to check the overall colour stability (granted, most LCDs in stores haven't had their colours set up properly, so you might need to use the OSD to set the RGB... woohoo, TLAs...).
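If you want to bring your own test file along, here's a rough sketch of that colour-gradient check, assuming Python with the Pillow imaging library installed (the filename is just my example, nothing standard):

from PIL import Image

# Build a smooth 0-255 grayscale ramp. A true 8-bit panel shows a continuous
# sweep; a dithering 6-bit TN panel tends to show visible banding instead.
img = Image.new("RGB", (1024, 256))
for x in range(1024):
    level = x * 255 // 1023  # map column 0..1023 onto gray level 0..255
    for y in range(256):
        img.putpixel((x, y), (level, level, level))
img.save("gradient_test.png")  # carry it to the store on a USB stick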
October 23, 2006 11:20:55 PM

Duckets? That's a new word for me.

Yes, that's not a problem. I'm looking for an LCD to go with the system I'm building around Jan/Feb. I'm waiting for the reviews of the G80 and R600 to come out. I may even go CrossFire/SLI. So FPS is not a problem.
October 23, 2006 11:43:41 PM

Here-

http://www.widescreengamingforum.com/wiki/index.php?tit...

Covers the games, how to "convert" the ones that are not widescreen, and monitors.

Widescreen is the way to go. I am on a 24" Dell 2405, and love it.

The video card is king, though, as most "good" cards give good frame rates on a common 1280x1024 LCD (the 17-19" range).

The 1600x1200 (4:3) and 16xx x 12xx / 1920x1200 widescreen panels start sucking framerates even with a mack-daddy card.

But it sure is pretty on all that screen real estate!
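To put rough numbers on that, pixel count is the quick back-of-the-envelope check, since the card has to render every pixel each frame. A little sketch (nothing card-specific, just arithmetic):

# Relative rendering load of common native resolutions vs a 1280x1024 LCD.
base = 1280 * 1024
for w, h in [(1280, 1024), (1600, 1200), (1680, 1050), (1920, 1200)]:
    px = w * h
    print(f"{w}x{h}: {px / 1e6:.2f} Mpix, {px / base:.2f}x the pixels of 1280x1024")

1920x1200 works out to about 1.76x the pixels of 1280x1024, which is why the framerates take such a hit.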
October 23, 2006 11:56:17 PM

Well, then the NEC is the best I can think of (somebody else???).

Though the 2407 could be a good consideration. I can't remember all the nitty-gritties, but the 2007 had an inferior Samsung panel compared to the 2005's LG one. But then again, I also read it's 'whatever was in stock' when orders were placed with Dell (i.e. you might get a Samsung, might get an LG). But bigger is definitely better, once you get used to moving your eyes around the gigantic desktop space.

Anyway, as long as the place you buy from has a decent return policy, it isn't such a big gamble. But if you're brave enough / have to do mail order, it's probably worth the extra effort upfront to avoid the return headache.

And 'duckets'... well, sorry, my new job has me speaking all kinds of funky stuff. I'm just a Canadian white boy selling Cronk Cell media to American hip-hop kiddies... didn't know what Cronk was till last month, but it's selling like crack in the ghetto (sorry for the sarcasm, but it's a necessary evil in exploiting the hip-hop generation... contempt for your customers isn't a sin, is it?).
October 24, 2006 12:28:17 AM

I only play FPS. For the last year and a half I've been playing on a Samsung 930B 19", and I've got no complaints about the LCD's performance in CS:S / CoD2 / BF2 / UT2004.
October 24, 2006 12:31:27 AM

I have the 20.1" widescreen from Dell...

I love it as well... one of the best purchases I have made...

They are bright (with some bleed, but it is minimal; the first gen was the worst and later gens were corrected).

Something to also keep in mind about widescreen monitors:

I physically measured both my 20.1" wide and my 21" 4:3 at work.

Turns out that the 16:10 (not 16:9 in this case) actually has 7 square inches MORE viewable space than my 21" CRT.

That is a lot of real estate... Also, side-by-side doc review/update is excellent.

Just some more food for thought...
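If anyone wants to redo that measurement on paper instead of with a tape measure, here's a quick sketch of the geometry. The CRT's true viewable diagonal (usually well under the advertised tube size) is the number that decides the comparison, so plug in your own:

import math

# Viewable area from a diagonal and an aspect ratio: scale the aspect pair so
# its hypotenuse equals the diagonal, then multiply width by height.
def viewable_area(diagonal_in, aspect_w, aspect_h):
    k = diagonal_in / math.hypot(aspect_w, aspect_h)
    return (aspect_w * k) * (aspect_h * k)

print(f'20.1" 16:10 LCD: {viewable_area(20.1, 16, 10):.0f} sq in')
# For the CRT, substitute its tape-measured viewable diagonal, not the tube size.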

I play UT2k4 and CoD:UO and some retro (Descent 3), and I have not seen any ghosting on this display at 1680x1050 with a *cough* ATI 9600 Pro */cough*.
October 24, 2006 12:34:13 AM

PS..

My Dell 20.1" is only 12ms G-T-G... And it looks great...

That is why people are telling you to review them in person if available.
October 24, 2006 12:35:53 AM

You and I think alike. I used to think Viewsonic was the bottom of the barrel as far as any monitor was concerned. I had a Samsung flat-screen CRT, a 9NE or something like that, for 7 years that I just loved. It finally started really showing its age, so I had to part with it. Some of the new Viewsonic LCDs, though, are pretty nice; don't write them off completely without comparing some of the 922 series. I don't know where you live, but go out and do the comparison thing with the various makes up and running; you'll be pleasantly surprised how much better LCDs are today compared to even a year or so back.
If you notice, I have a 902b, not Viewsonic's best by far; I am more into strategy and turn-based games. I am really pleased with it.
October 24, 2006 12:37:06 AM

Quote:
Quote:
I also recommend using DVI and not analog. The difference is like HDTV (DVI) compared to standard TV (analog).

I never said HD was better than high PC resolutions... I was simply drawing a comparison between the two. And for me, going from CRT to DVI LCD was like night and day! Running digital over analog is always better.

So if I posted a screenshot or a photo of my desktop on two LCDs, one over DVI and one over 15-pin D-sub, do you think you could tell which is which? It seems to me that if there were that kind of noticeable difference, it would be easy. In reality you would have a 50/50 chance. I work with this all day and can't tell the difference.
October 24, 2006 1:25:39 AM

FLA,

I guarantee you will not be able to see the difference between the two posted images... This will not be the fault of the monitors; it will be the fault of the camera... (pretty tough to see motion artifacts/ghosting/etc. when it is a still).

You need to consider these displays' native modes. Most LCDs are now made for digital (with some exceptions, like some Acers). The same goes for most HDTVs...

If you were to place two monitors of the exact same spec side by side and then pass VGA and DVI to one each respectively, I guarantee you would be able to tell a difference... That said, it is up to the monitor manufacturer to decide how good the inputs for each are...

Go to AVS Forum and read for yourself what EVERYONE is using (Blu-ray disc players aside) for their input on HDTVs. They are mostly sticking with the DVI/HDMI inputs because there is a noticeable difference in many aspects of the display devices...

To include:

Motion jitter
Color gamut (the HDTV gamut is more accurately displayed)
Digital artifacts
Banding...

Many also find inputs such as component video "soft"...

Just some food for thought..
October 24, 2006 1:34:13 AM

I have to agree here. Unless the system's DVI output is subpar, every time I've seen identical monitors side by side, DVI vs VGA, the crispness of the digital stands out.
October 27, 2006 1:25:45 AM

Yeah, I may very well buy one more CRT and just wait a couple more years then. I should be able to get a better picture than I currently have, and absolutely no ghosting, for about 225 dollars.

Right now I'm on a Samsung SyncMaster 793MB. It's only a 17", I believe... forgot ^^... but it's starting to do that weird thing where the screen either flickers or shuts off. Sometimes the screen starts doing this wavy thing, or even shrinks down really tiny and dies.

The picture is good, but I think it's starting to fade on me.
October 27, 2006 1:55:09 AM

Well, I don't know if such a service exists in your area, but CRTs can be refurbished pretty easily.

I used to work for IBM Global Services, and we had a shop there full of guys who'd recalibrate crusty old CRTs all day long. At least if a pro looks at yours, he should be able to determine whether it just needs a little work or is beyond repair.

And as long as you don't plan on carting it around, have the desk space, and don't mind a little extra heat and hydro bill, top-notch CRTs are still a beautiful thing to look at. It's kind of a shame they never really offered many 'widescreen' models, since I don't think I can get used to 4:3 after 16:10... Then again, once you get a 20" wide on an ergo-flex arm, you don't usually want to live with the ergonomic nightmare of a big fat CRT.
October 27, 2006 2:10:52 PM

Quote:
I have to agree here. Unless the system's DVI output is subpar, every time I've seen identical monitors side by side, DVI vs VGA, the crispness of the digital stands out.

The only way that would happen is if you were comparing two different monitors, or if the monitors were really crappy.

DVI or VGA, the resulting images should be identical.

The reason behind DVI (and HDMI) is that, of course, DVI is digital. Beyond the marketing of "digital is better," this is why:

Your video card generates a digital image. The card then has to use a D-A converter or DSP to turn the digital image into an analog signal. This is fine for CRTs, because CRTs actually use the analog signal. When LCDs came along and started rendering images digitally, the LCD had to convert the analog signal back into a digital one. It made no sense to keep doing a useless analog round trip, because that just adds cost. It also adds the possibility of quality loss during conversion.

If you saw a difference in quality between DVI and VGA, it was because your monitor (CRT/LCD) is of poor quality, or your cable was faulty. Ultimately, DVI and VGA are just transport media; the signal at the source and at the destination should always be the same, regardless of how it's transmitted (unless you have a poor-quality DSP in your device).

VGA is actually better for CRTs, and DVI is better for LCDs. Unless you bought a crappy brand or model, though, there won't be a difference in what you're looking at: VGA or DVI, CRT or LCD.
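To illustrate (not measure) that conversion step, here's a toy sketch of the digital -> analog -> digital chain. The 0.7 V figure is VGA's nominal video swing, but the noise level is a made-up stand-in for cable and interference effects:

import random

# Framebuffer code -> DAC -> noisy analog cable -> the monitor's ADC, per pixel.
def vga_round_trip(values, full_scale=0.7, bits=8, noise_sigma=0.001):
    step = full_scale / (2 ** bits)  # DAC output voltage per code
    out = []
    for v in values:
        analog = v * step + random.gauss(0.0, noise_sigma)  # add cable noise
        code = round(analog / step)                         # monitor re-digitizes
        out.append(max(0, min(2 ** bits - 1, code)))
    return out

levels = list(range(256))
recovered = vga_round_trip(levels)
flipped = sum(a != b for a, b in zip(levels, recovered))
print(f"{flipped}/256 levels changed in the round trip")

With clean hardware and a good cable the count stays at or near zero, which is the point: the chain only degrades the picture when something in it is poor.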
October 27, 2006 3:56:17 PM

I agree with a lot of what you are saying. Here's my thinking on our over-marketed world (thanks to our great country)... this may be for you math dorks:

Digital is discrete, and analogue is continuous. By my logic, if we had kept updating the VCR instead of just throwing it away (VCR technology hasn't really been updated since it was released), anything analogue in nature should have a much richer and more accurate picture.

If it captures everything that's there, shouldn't we technically be able to see everything, if our monitors could show everything? Obviously with digital we are missing out on an infinite amount of pixels, or information.
I realize there is only so much we perceive and whatnot, and no, I'm not stoned, but it just seems to me we may be missing out on something in our digital age.

But honestly... I don't really care. I just want an extremely nice picture... fast... and cheap.

Thanks, everyone, for the help. I've been looking in various stores, and some of the Viewsonics do look very respectable now on display... I just wish I could actually play Battlefield 2 on them, or Counter-Strike.
October 27, 2006 4:20:06 PM

Okay, both you guys are on crack (wiz & johan).

Since a computer's video buffer is digital and the VGA signal is analog, you have conversions which, although they 'shouldn't' mess up the signal, are just voltage approximations, which are never perfect (i.e. go buy the best ADC and DAC, run stuff through them forever, and watch the signal lose quality).

Second, johan, I'm an analog junkie myself in the audio world, but I long ago realized that every time you make an analog copy, you introduce noise, just as in A-D/D-A conversions. Therefore VCRs will always suck, because they don't make a 'perfect' copy the way digital formats do. Formats like DAT fixed that, but these days it's all about HDs 'n' flash, so tape and plastic discs shouldn't even be considered useful for anything.

The new world will be all digital except the final output stage. To preach that digital can't ever replace analog's quality is to admit you're too ignorant of how easily fooled our senses are. Picture printing, audio recording, and all media work by fooling the eye/ear/etc.

Vinyl is way better than CD, but only because CD was 16-bit/44 kHz. 24-bit/96 kHz rocks vinyl, and I'm a believer that 1080p digital will fool my eye enough to forget I ever grew up with magnetic tape.
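For the math dorks, the two yardsticks behind that last claim are the Nyquist limit (usable bandwidth is half the sample rate) and the ideal-converter rule of thumb of roughly 6.02 dB of signal-to-noise per bit. A quick sketch, on-paper numbers only:

# Compare the CD-era format with 24-bit/96 kHz using the standard formulas.
for bits, rate_khz in [(16, 44.1), (24, 96.0)]:
    snr_db = 6.02 * bits + 1.76   # ideal quantization SNR for N bits
    bw_khz = rate_khz / 2.0       # Nyquist: highest representable frequency
    print(f"{bits}-bit/{rate_khz:g} kHz: ~{snr_db:.0f} dB SNR, up to {bw_khz:g} kHz")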
October 27, 2006 5:37:29 PM

Quote:
Okay, both you guys are on crack (wiz & johan).

Since a computer's video buffer is digital and the VGA signal is analog, you have conversions which, although they 'shouldn't' mess up the signal, are just voltage approximations, which are never perfect (i.e. go buy the best ADC and DAC, run stuff through them forever, and watch the signal lose quality).

You're simply wrong on too many points.

The analog VGA signal will produce the same picture as a digital DVI signal unless you have serious RF interference, a crappy DSP in your monitor, or a bad cable. Your statement that VGA isn't 'perfect' has no basis, no proof, and simply doesn't make sense. An analog video signal is currently as precise as a digital signal and will produce the same image. Why would every computer in the world be based on a standard that produces poor-quality images? What do you think CAD has been using for decades?

DVI was created to avoid unnecessary conversions to and from analog; however, it provides no better picture (unless your device has a crappy DSP).
October 27, 2006 10:25:09 PM

I won't waste my time counting the number of points I made, nor bother trying to prove the ones you didn't counter-argue, but since you like to talk crap, I'll give you the little digital 101 lesson you are in such desperate need of.

Digital = a sequence of square-wave DC voltage pulses (where, say, 5 V = 1 and 0 V = 0).

Analog = a continuous voltage waveform with an unlimited number of wave shapes.

No matter how good your DAC/ADC is, it will never, ever be perfect (like analog copies of tapes). On top of that, as you mentioned, there are phenomena that can further degrade the already-not-'perfect' analog signal before it reaches the next ADC (in the LCD).

So digital-to-digital communication is perfect, unless you feel like arguing that.

And digital-to-analog conversion is never perfect, unless you feel like explaining the idea of infinite sampling to the crowd here.

Again, your eye may not be able to tell the difference, and DACs/ADCs are pretty amazing these days, but you can never, ever say that a converted signal is as good as an unconverted signal.
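To put one number on 'never perfect': an ideal N-bit converter can only land on 2^N discrete levels, so the worst-case round-off is half a step. A tiny sketch, using VGA's nominal 0.7 V swing as the example range:

# Worst-case quantization error of an ideal converter over a given range.
def max_quant_error_mv(full_scale_volts, bits):
    step = full_scale_volts / (2 ** bits)  # smallest representable increment
    return (step / 2) * 1000               # half a step, in millivolts

print(f"8-bit over 0.7 V: {max_quant_error_mv(0.7, 8):.2f} mV worst case")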

I've even read that the VGA signal from a DVI-I connector is superior to that from the standard HD15 VGA connector. VGA originally didn't need to support today's high frequencies, since it was designed back around 1987.

The whole reason video signals moved away from digital after EGA was the limits of digital chips/signals in the mid-80s. Guess what: it's 2006 and those limits are gone.

So with digital, not only do you not have to worry about analog's crosstalk and other signal-degradation issues, you also avoid the 2 conversions required to take a digital image from the computer, over an analog cable, and back to digital for the LCD (or a CRT, but that's a whole other story).

But again, I have no proof that DAC/ADC isn't perfect, nor would anybody dare make such a statement with such highly educated people around, and obviously, it simply doesn't make sense.

Plus, the human eye is so perfect it would notice if an image wasn't exactly the same...
October 27, 2006 11:09:55 PM

I have a Samsung 940BF with a 2ms response time. I've also seen firsthand a Viewsonic VX922, or whatever it is.
Overall I'd say LCDs are quite good now, assuming you read reviews, see them in person, and do a little buyer's research. Not all are gaming-worthy, but many are.

Mine is outstanding, and I was very worried since it was my first LCD. I have noticed the slightest smear in one game, if I study the screen closely looking for it. If I'm just playing and not looking specifically for it, I never notice a thing, and I'm quite a particular person. Maybe overly so, according to my wife. hehe
October 27, 2006 11:11:13 PM

I completely understand what you are saying, and I even agree that digital is better right now. It's just that, with my limited knowledge of the subject, it would seem there should be an even better alternative to digital in some type of new analogue device. Obviously I can see the difference between the nice new shiny HD TVs and my old-school one, but it would seem that the missing-information gap would make a bigger difference than it does.

And I myself enjoy the two Technics MK2s that are sitting next to me on my desktop... as well as my large vinyl collection.

In either case, I just wish I could actually test out these damn monitors. Just really frustrating.
October 27, 2006 11:17:50 PM

I see plenty of LCDs and CRTs at work on a day-to-day basis. The quality of LCD screens has come up by leaps and bounds in the past 18 months. I hardly ever see a dead pixel anymore.

I highly doubt that if you buy a decent LCD you will ever be unhappy with it. Just avoid the ultra-cheap, previous-generation, 'B'-quality displays and go with a good name. Personally I love my widescreen Viewsonic; however, I have seen plenty of other great displays too. :)
October 30, 2006 11:22:36 AM

Quote:
I won't waste my time counting the number of points I made, nor bother trying to prove the ones you didn't counter-argue, but since you like to talk crap, I'll give you the little digital 101 lesson you are in such desperate need of.

Digital = a sequence of square-wave DC voltage pulses (where, say, 5 V = 1 and 0 V = 0).

Analog = a continuous voltage waveform with an unlimited number of wave shapes.

No matter how good your DAC/ADC is, it will never, ever be perfect (like analog copies of tapes). On top of that, as you mentioned, there are phenomena that can further degrade the already-not-'perfect' analog signal before it reaches the next ADC (in the LCD).

So digital-to-digital communication is perfect, unless you feel like arguing that.

And digital-to-analog conversion is never perfect, unless you feel like explaining the idea of infinite sampling to the crowd here.

Again, your eye may not be able to tell the difference, and DACs/ADCs are pretty amazing these days, but you can never, ever say that a converted signal is as good as an unconverted signal.

I've even read that the VGA signal from a DVI-I connector is superior to that from the standard HD15 VGA connector. VGA originally didn't need to support today's high frequencies, since it was designed back around 1987.

The whole reason video signals moved away from digital after EGA was the limits of digital chips/signals in the mid-80s. Guess what: it's 2006 and those limits are gone.

So with digital, not only do you not have to worry about analog's crosstalk and other signal-degradation issues, you also avoid the 2 conversions required to take a digital image from the computer, over an analog cable, and back to digital for the LCD (or a CRT, but that's a whole other story).

But again, I have no proof that DAC/ADC isn't perfect, nor would anybody dare make such a statement with such highly educated people around, and obviously, it simply doesn't make sense.

Plus, the human eye is so perfect it would notice if an image wasn't exactly the same...


...............................

I'm so tired of people like you. I'd school you on D/A conversion like you wouldn't believe, but that's not even the topic of the thread. Just because you mix music and can Google stuff doesn't mean you know what you're talking about. That doesn't matter, though. I don't need to counter every (retarded) point you made. My point is simple:

You get the same picture, DVI or VGA, unless you have a crappy monitor. Period. I don't care about A/D conversion. If you want to argue a point, try arguing the OP's. I'm sure you'd be surprised how many screens use the analog, NOT the digital, pins of the DVI connector anyway.

If you really want to prove your point, you can research the analog transmission method versus the digital signal. You might even know what you're talking about next time you post.

I'm sure you'd also be interested to know that a digital DVI connection has a (much) shorter maximum cable length than an analog VGA signal, and that the digital DVI signal lacks ECC, meaning that when you approach the max length you get signal corruption without any way to compensate. So all of this "digital is always better" is BS.

DVI-I is better for flat panels, and analog is better for CRTs, because that guarantees the fewest signal conversions for the end user. Analog VGA produces the same quality as digital DVI, and even extends over longer distances. DVI has better forward compatibility; it doesn't produce a better picture. It's that simple.