Can an LCD match a high end CRT for gaming? (as of late 2008)

September 25, 2008 11:04:57 PM

I was wondering whether LCD monitors can yet match high-end CRT monitors quality-wise in the same price range (say $250 to $350). To be more specific, I'm referring to color depth, clarity, sharpness, color accuracy, and whatever else I might be missing. The context is gaming related (fast-paced FPS, graphic design).

I just can't decide what type of monitor I should pursue based on a few factors. :( 

Obviously, I don't have hands-on experience with either one to know which would be better to buy yet. (Hoping to get this resolved in this thread.)

Are there even any good high-end (new) CRT monitors left? (We're in the final quarter of 2008 now!)

Is LCD technology up to par yet with high-end CRTs?

Lastly, does VGA or DVI make a difference in which CRT monitor I should pick?

I would be looking for specific brands and models. :bounce: 

Note that size and weight are not an issue. I'm just looking for sheer professional, high-end quality within the $250 to $350 price range.

Input is greatly appreciated. ;) 
September 26, 2008 7:55:32 AM

Simple answer is no.


If gaming and graphics design are the top two priorities, then my personal choice would be the NEC LCD2490WUXi-SV. Priced at about $1,300.

SV = SpectraView: NEC's proprietary colorimeter system for color-accuracy calibration.


September 30, 2008 9:00:39 PM

High-end CRTs are hard to beat. Cheap LCDs use TN panels, which only have 6 bits (64 shades of brightness) per color channel. VA panels have 8 bits (256 shades per channel, or 16.7 million true colors), but suffer from slow response times (motion blur). IPS panels are the best, but some still have motion blur issues (over 8 ms response time) and they're expensive.
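To put rough numbers on that bit-depth claim, here's the back-of-envelope arithmetic (just a sketch of the math, nothing panel-specific):

```python
# Shades per channel and total colors for the panel bit depths
# mentioned above: 6-bit TN vs. 8-bit VA/IPS.
for bits in (6, 8):
    shades = 2 ** bits       # brightness levels per color channel
    colors = shades ** 3     # total R*G*B combinations
    print(f"{bits}-bit panel: {shades} shades/channel, {colors:,} colors")
```

That's where the "16.7 million colors" figure for 8-bit panels comes from; 6-bit TN panels fake the rest through dithering.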

http://www.pchardwarehelp.com/guides/s-ips-lcd-list.php

Part of this has to do with the way a standard LCD works. The brightness of each pixel is held at the same level for 16.67 ms (assuming a 60 Hz DVI connection), then switched to a new brightness, and your brain notices the sudden jumps.

Although a CRT running at 60 Hz is also only drawing a frame every 16.7 ms, its low-persistence phosphors are fading the entire time, so your brain doesn't perceive motion blur. However, the phosphor persistence in a computer CRT monitor isn't set up for 60 Hz, so there's flicker; I have to run monitors at 85 Hz or faster to get rid of it. A CRT can run at 100 to 120 Hz at 1600x1200 (if your video card can do this).
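The frame periods behind those refresh rates are just 1000/Hz; a quick sketch of the numbers the two posts above are using (an LCD holds each frame for the whole period, a CRT's phosphors fade within it):

```python
# Frame period (ms) at the refresh rates mentioned above.
for hz in (60, 85, 100, 120):
    period_ms = 1000 / hz
    print(f"{hz} Hz -> {period_ms:.2f} ms per frame")
```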

If you're running Windows XP (versus Vista), another issue is native resolution. I have a 20" viewable monitor (sold as 21") that I run at 1280x960 for the desktop and desktop-type applications, but at 1600x1200 for some games. The equivalent viewing size would be a 24-inch 1920x1200 LCD, using the inner 1600x1200 pixels. However, at 1280x960 the scaling would blur the LCD image, and at 1600x1200 the icons are too small. A CRT, being analog, just adjusts beam width and sweep rates (the phosphor mask is effectively much higher resolution, with 2048x1536 being the highest supported mode), so the image isn't blurred by resolution changes.



October 24, 2008 8:46:38 PM

I will have to agree with everybody else here. High-end CRTs are just fantastic.