10bit graphics cards

wsmith

Honorable
Oct 23, 2013
6
0
10,510
I'm wondering what is available now, and what the near-term future looks like, for graphics cards supporting 10bit processing and 10bit output. I'm a video pro on Windows and Adobe, and until recently I was limited to costly Nvidia Quadro cards if I wanted that. Now that Adobe supports AMD and ATI cards via OpenCL, I have more options to consider.

I need to know about ATI and AMD cards that support 10bit.

Would I be incorrect in assuming that it won't be long before 10bit is the standard for gamers, and even for everyday users who just want better-quality monitors and an overall better viewing experience?

Do any AMD or ATI cards support 10bit processing and 10bit output for monitoring on an external pro production monitor (which is very different from a PC monitor)?

Thanks!

Solution
Gaming wouldn't be nearly as popular as it is if gamers had to buy 10-bit displays. They are just too costly, and we wouldn't even be able to see the difference. There are other eye-candies we would much rather have from our displays, such as refresh rate, resolution & physical size.

You can check some of the Firepro cards, and there is another company called Blackmagic (check Adobe for support) whose hardware can produce 10-bit images.

wsmith

Honorable
Oct 23, 2013
6
0
10,510


Thanks, Skit75,

I've just been looking at what AMD is up to with their Firepro and OpenCL technology. Interesting. But they probably need to polish their drivers up to the level of stability that Nvidia has achieved with their proprietary CUDA code.

As for Blackmagic, I know them well and use them. I think the sun is setting on that type of I/O hardware, ever since the advent of cameras recording to memory cards: there's no more need to ingest tape-based files. External monitoring on a pro video production monitor is now very practical with many video cards, via HDMI or DisplayPort. It's just 10bit monitoring that is lagging. It's available with the pro workstation cards, but they can be quite costly, on the order of $2k-$3k and up.

But if one looks at the GTX Titan, with 6GB of RAM and staggering processing power, it's got what it takes except for 10bit support. And it's a great deal cheaper.

I wish Tom's had included some more meaningful benchmarks regarding video performance in its recent analysis. Hopefully it will do so in the future.

Thanks!
 

peegeenyc

Honorable
Nov 6, 2013
1
0
10,510
Yes, I too have been looking at this, as a photo pro on a workstation who needs OpenCL acceleration for various programs (CS6, Capture One raw processing) AND wants 10bit support for my 10bit monitors and color-accurate workflow. However, it seems you are forced to choose: a high-end gaming card with powerful OpenCL, OR 10bit pipes. Not both.

I'd be happy to learn I'm wrong on this, if anyone has a solution from AMD or NVIDIA that gives both together.
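
For anyone wanting to verify the OpenCL half of that equation on their own machine, here's a rough sketch (it assumes the pyopencl package and the vendor's OpenCL drivers are installed; the output is whatever your hardware reports) that lists which GPUs actually expose OpenCL to applications:

```python
# Rough sketch: enumerate the OpenCL platforms and GPUs the drivers expose.
# Requires the pyopencl package and working vendor OpenCL drivers.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform: {0} ({1})".format(platform.name, platform.vendor))
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        print("  GPU: {0}, compute units: {1}, global memory: {2} MB".format(
            device.name, device.max_compute_units,
            device.global_mem_size // (1024 ** 2)))
```

If a card shows up here, the OpenCL side is at least visible to software; the 10bit output question is separate and still depends on the driver/card class.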
 

wsmith

Honorable
Oct 23, 2013
6
0
10,510


Thanks, Skit75.

However, I should state the following for anyone curious re 10 bit vs. 8 bit monitors.

I wouldn't dismiss 10bit as mere eye candy, although it is serious eye candy, and how sweet it is! Whereas an 8bit monitor can display roughly 16 million colors, a 10bit (per channel, x3) monitor can display over a billion colors.
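
The arithmetic behind those figures is simple; here's a quick back-of-the-envelope sketch in plain Python (nothing card- or monitor-specific):

```python
# 8 vs 10 bits per channel, three channels (R, G, B)
channels = 3
colors_8bit = (2 ** 8) ** channels    # 16,777,216   (~16.7 million)
colors_10bit = (2 ** 10) ** channels  # 1,073,741,824 (~1.07 billion)

print("8-bit/channel:  {0:,} colors".format(colors_8bit))
print("10-bit/channel: {0:,} colors".format(colors_10bit))
```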

The i/o cards made for the pro video production market are not made to accelerate video rendering the way CUDA, OpenGL and OpenCL are designed to do. A significant portion of image rendering is handed off to the GPU nowadays, but only the pro workstation cards, such as the Quadro and, more recently, the Firepro cards, output 10bit per channel to 10bit monitors.

Video pros rely on both a fast GPU card for speedy rendering and a pro video i/o card, which also allows external pro monitoring.

Pro video production cards, by Blackmagic Design, etc., are strictly i/o devices that transcode the video to a pro intermediate codec, which greatly facilitates realtime editing. They also output to external pro production monitors compliant with various broadcast color standards, such as Rec. 709. The available transcode choices range from 8bit to 10bit codecs. Of course a 10bit codec greatly enhances the ability to do far more exacting color correction. Also, for people not versed in video production tech, the so-called banding problem disappears in 10bit video, and that is amply illustrated by Sony's Bravia sales literature.
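
To make the banding point concrete, here is a minimal sketch (it assumes NumPy is installed and is purely illustrative, not tied to any particular codec or card): quantizing the same smooth luminance ramp at 8 and 10 bits shows how many more steps a 10bit pipeline has available to hide visible bands in a gradient.

```python
# Quantize an ideal continuous gradient to 8-bit and 10-bit steps
# and count how many discrete levels each depth can actually use.
import numpy as np

ramp = np.linspace(0.0, 1.0, 100000)                  # an ideal smooth gradient
levels_8bit = np.unique(np.round(ramp * 255)).size    # up to 256 steps
levels_10bit = np.unique(np.round(ramp * 1023)).size  # up to 1024 steps

print("8-bit gradient:  {0} discrete levels".format(levels_8bit))
print("10-bit gradient: {0} discrete levels".format(levels_10bit))
```

Four times as many steps across the same brightness range means each step is far smaller, which is why the bands stop being visible.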

So, why wouldn't the gaming experience be greatly enhanced by 10bit output and display?

I understand that the costs are higher, but I see lots of quality-obsessed gamers spending lots of money on rigs. Surely they'd want a far richer viewing experience. I have to wonder if the game developers are using 10bit displays to develop the games, even if they render their output to 8bit in the interest of file size and the end user only sees an 8bit display.

Can we be so sure that 10bit gaming is not coming for the general gaming public?

To address peegeenyc's question: any of the pro workstation cards such as the Nvidia Quadro or AMD Firepro output 10bit and can, as far as I'm aware, be used in a gaming rig. I could be wrong, but I'm not aware of any trade-off between powerful OpenCL and 10bit output.

The GPU card manufacturers probably don't allow 10bit monitoring on non-workstation cards because gamers aren't screaming for it.

I suspect that if they did, the manufacturers would eventually allow it, and the price of 10bit GPU cards would drop for everyone.
 


It may indeed be on the way.

Right now, it seems image effects are a priority over color depth/accuracy, and this is always weighed against frames per second, since that is the single most important gaming requirement an end user can detect. The added effects cost frames, which demands more from the hardware, so more powerful hardware is where manufacturers apply their focus.

I suppose that when video hardware starts to hit a ceiling, much as the CPU market ran into sheer clock-speed limits versus stability/reliability and therefore moved to multiple cores, the focus of GPU manufacturers will begin to shift to what they can improve without such a direct challenge to the physics of the problem, and we would eventually get better color depth/accuracy. It is just lower on the priority list, is all.

I also am curious if developers are working in 10-bit for world/level/map creation. I would assume that they are.