The name "32 bit" is misleading, I believe: it usually means 8 bits per channel for RGB plus an 8 bit alpha channel, not extra colour depth. The highest colour depth in common use is 10 bits per channel, which gives over a billion possible colours, compared to the 16.7 million you get at 8 bits per channel. So as you can see there is quite a difference.
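The maths behind those figures is just two to the power of the per-channel bit depth, cubed for the three RGB channels. A quick sketch (the function name is mine, just for illustration):

```python
def colour_count(bits_per_channel: int) -> int:
    # 2^bits values per channel, cubed for R, G and B
    return (2 ** bits_per_channel) ** 3

print(colour_count(8))   # 16,777,216 -> the "16.7 million" figure
print(colour_count(10))  # 1,073,741,824 -> over a billion
```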
That's the gist I got from online research.
I'm pretty sure the only NVIDIA cards to support 10 bit output are Quadros, not GeForce, unfortunately.
Oh right, yeah. I suppose this adds to the debate over GTX vs Quadro for a workstation. An 8 bit IPS is still going to look great compared to a TN panel, but if you are doing film-grade work then you need a Quadro or similar to get a billion-plus colours.
It's frustrating because I suspect that, again, the GTX cards are up to the task hardware-wise, but the drivers aren't enabled for this.
Has anyone heard of a driver hack to get 10 bit output on a GeForce card?
Thanks guys, I think I'm going to order the PA246Q anyway.