skit75:
Gaming wouldn't be nearly as popular as it is if gamers had to buy 10-bit displays. They are just too costly, and we wouldn't even be able to see a difference. There is other eye candy we would much rather have from our displays, such as refresh rate, resolution & physical size.
You can check some of the Firepro cards, and there is another company called BlackMagic (check Adobe for support) that can produce 10-bit images.
Thanks, skit75.
However, I should state the following for anyone curious about 10-bit vs. 8-bit monitors.
I wouldn't dismiss 10-bit as mere eye candy, although it is serious eye candy, and how sweet it is! Whereas an 8-bit monitor can display roughly 16 million colors, a 10-bit-per-channel monitor (three channels) can display over a billion.
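For anyone who wants to see where those numbers come from, here's a quick back-of-the-envelope check (plain Python, nothing display-specific):

```python
# Per-channel levels cubed across R, G, B gives the full palette size.
for bits in (8, 10):
    levels = 2 ** bits                      # shades per channel
    total = levels ** 3                     # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel -> {total:,} colors")

# 8-bit:  256 levels/channel -> 16,777,216 colors
# 10-bit: 1024 levels/channel -> 1,073,741,824 colors
```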
The I/O cards made for the pro video production market are not made to accelerate video rendering the way CUDA, OpenGL and OpenCL are designed to do. A significant portion of image rendering is handed off to the GPU nowadays, but only the pro workstation cards, such as the Quadro and, more recently, the Firepro, output 10-bit per channel to 10-bit monitors.
Video pros rely on both a fast GPU card for speedy rendering and a pro video I/O card, which also allows external pro monitoring.
Pro video production cards, by Black Magic Design and others, are strictly I/O devices that transcode the video to a pro intermediate CODEC, which greatly facilitates real-time editing. They also output to external pro production monitors compliant with various broadcast color standards, such as Rec. 709. The available CODEC transcode choices range from 8-bit to 10-bit. Of course, a 10-bit CODEC greatly enhances the ability to do far more exacting color correction. Also, for people not versed in video production tech, the so-called banding problem disappears in 10-bit video, as is amply illustrated by Sony's Bravia sales literature.
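To make the banding point concrete, here's a toy sketch of my own (not anything from a production pipeline; real workflows also involve gamma, dithering, etc.): quantizing the same smooth gradient to 8-bit leaves far coarser steps than 10-bit does, and those coarse steps are what show up on screen as visible bands.

```python
# Quantize a smooth 0..1 luminance ramp to 8-bit and 10-bit levels
# and count how many distinct steps survive.
N = 10_000
ramp = [i / (N - 1) for i in range(N)]       # ideal smooth gradient

for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = [round(v * levels) / levels for v in ramp]
    steps = len(set(quantized))
    print(f"{bits}-bit ramp: {steps} distinct steps")   # 256 vs. 1024
```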
So, why wouldn't the gaming experience be greatly enhanced by 10-bit output and display?
I understand that the costs are higher, but I see lots of quality-obsessed gamers spending lots of money on rigs. Surely they'd want a far richer viewing experience. I have to wonder whether game developers are using 10-bit displays to develop the games, even if their output is rendered to 8-bit in the interest of file size and the end user only displays it in 8-bit.
Can we be so sure that 10-bit gaming is not coming for the general gaming public?
To address peegeenyc's question: any of the pro workstation cards, such as the Nvidia Quadro or AMD Firepro, output 10-bit and can, as far as I'm aware, be used in a gaming rig. I could be wrong, but I'm not aware of any trade-off between powerful OpenCL and 10-bit output.
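As a rough idea of what "outputting 10-bit" means on the application side: the program has to ask the driver for a deep-color framebuffer. With the Python glfw bindings the request looks roughly like the sketch below. This is just my illustration, not something from this thread or from any vendor's documentation, and whether the request is actually honored depends on the GPU, driver and monitor; consumer cards often fall back to 8 bits per channel silently.

```python
import glfw

# Ask the driver for a 10-bit-per-channel (deep color) framebuffer.
if not glfw.init():
    raise RuntimeError("GLFW init failed")

glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)    # common 10-10-10-2 layout

window = glfw.create_window(1280, 720, "10-bit test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("No suitable framebuffer configuration")

glfw.make_context_current(window)
print("Window created; confirm the actual bit depth with vendor/driver tools.")
glfw.terminate()
```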
The GPU card manufacturers probably don't allow 10-bit monitoring on non-workstation cards because gamers aren't screaming for it.
I suspect that if they did, the manufacturers would eventually allow it and the price of 10-bit GPU cards would drop for everyone.