I used to have an ATI Radeon 9800 Pro a year ago, and I could play some games at almost max graphics, including Half-Life 2. With that card the graphics were still beautiful, with rich colors and smooth blending. Now I have built a new computer and bought two GeForce 7600 GTs in SLI. The graphics are nice, but for some reason the color looks weird. It looks like I am playing Half-Life 2 in 16-bit color mode instead of 32-bit. I have checked, and it is set to 32-bit. I have checked all my settings, and everything is set to max in the game and in my desktop display settings as well. In the game, when I look at the sky or at areas with less light, I get weird purplish, green, pixelated, blotchy blending rather than smooth color transitions. I can see where the purple stops and the next color begins. It looks like old 16-bit VGA. Is that just because of the cards? I know I should be seeing better graphics than that with these cards.
I have downloaded the latest drivers too. Here are my computer specs:
eVGA 680i Mobo
2GB of Corsair CM2X1024-6400 DDR2 800 4-4-4-12 @ 2.1volts
2x GeForce 7600 GT OC 256MB graphics cards in SLI
320GB 7200RPM HD with 16MB cache
Intel E6600 Core 2 Duo processor
700w Xtreme Gamer OCZ power supply
That's interesting. I just moved up from a ViewSonic A7f to an LCD ViewSonic VG2030wm 20.1" 1680x1050. I noticed some greenish colors if I wasn't looking at the monitor just right while playing BF2. Everything else seems fine, and Guild Wars looks spectacular. I'm currently using CrossFired X1950 Pros. I believe I also read somewhere that some LCDs have fewer displayable colors or lower bit depth, and depend on pixel response speed to trick the eye into seeing more color.
Anyone else out there more familiar with LCD color tech??
That monitor doesn't have a true 8-bit-per-channel panel; it is 6 bits per channel, so it is not surprising you're seeing lower quality than 24-bit truecolor. These panels get their low response time at the cost of color fidelity. If color fidelity is important to you, take it back and get something else, not 22" and not a TN panel, since all the 22" panels are TN panels made by CMO. Look for "16.7 million colors" in the specs for true 8-bit; 16.2 million means 6-bit plus Hi-FRC, like on your ViewSonic 22". Basically, the panel rapidly switches between two colors it can display to simulate one it cannot, which is how it reaches 16.2 million colors, whereas plain 6-bit can only display 262k colors. The great majority of cheaper monitors use these TN panels, so you're going to have to pay a little more. Just remember that those 8-bit panels are slower in response time.
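For what it's worth, those spec-sheet numbers fall out of simple arithmetic. Here's a minimal sketch; the ~253 effective levels per channel for FRC dithering is a common vendor approximation I'm assuming here, not a figure from this thread:

```python
# Color counts for an RGB panel: (levels per channel) cubed.
def total_colors(levels_per_channel: int) -> int:
    """Total displayable colors for an RGB panel."""
    return levels_per_channel ** 3

true_8bit = total_colors(2 ** 8)   # 256 levels/channel -> 16,777,216 ("16.7 million")
plain_6bit = total_colors(2 ** 6)  # 64 levels/channel  -> 262,144 ("262k")
# FRC dithering is usually credited with ~253 effective levels per
# channel (an assumed approximation), which lands on the "16.2 million" spec:
frc_6bit = total_colors(253)       # 16,194,277 (~16.2 million)

print(f"{true_8bit:,} / {plain_6bit:,} / {frc_6bit:,}")
```

So when a spec sheet says 16.2 million instead of 16.7 million, that gap is the tell that it's a 6-bit panel with dithering rather than true 8-bit.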
I hate to say it, ira176, but your panel is a 6-bit panel too. If you can't notice the difference, then good; being able to notice tiny details is very costly.
Wow! Pretty nice reply there. Well, I do ultimately love the huge widescreen, especially for games like Call of Duty 2, Battlefield, and racing games. So that answers my question, thank you very much. You're pretty smart, and I am at least happy that it is not my GPUs.
I have a Chi Mei 221D (a recent acquisition) and an Asus PW191 on my main machine, and a ViewSonic VA2012WB for my second machine. My gripe with the 22" is that it doesn't offer any more pixels than the 20"... I was eyeing that Dell 24" with 1920x1200 colorific pixels 8O That was a little more than what I wanted to spend.
Chiadog, I had a pretty good idea it wasn't a 16.7 million color display after seeing some color anomalies in certain games. Fortunately it doesn't ruin everything; I just wish ViewSonic would use the better panel.