I don't have a strong reason to get a card capable of 30-bit color, but since I'm quite likely to buy a U2711, I'd like to get such a card as well. As far as I can tell, this more or less requires DisplayPort, which mandates 30-bit color, as opposed to HDMI 1.3, where 30-bit is optional and (since few care) likely to be reported incorrectly in the specs. Also, HDMI driven through a DVI adaptor looks even less likely to support 30-bit color. Did I get it right?
They have been making 30-bit capable cards since the late 1990s.
All ATI and Nvidia workstation cards, like the FirePro and Quadro lines, should be 30-bit capable.
So if you aren't a professional graphics designer, a workstation card is probably overkill just for extra colors your eyes won't notice.
I'm interested in desktop solutions - GeForce. The question is which cards do "deep color". After some more research I found that DisplayPort doesn't guarantee 30-bit color either. There's a document from nVidia stating that some Quadro cards don't do 30-bit color even though they have DisplayPort - at http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_ne... . Now it looks like only the GeForce 210, GT 220, and GT 240 can do deep color (at least according to what I found on nVidia's site). Still, I wonder if the card manufacturers have changed anything related to this. (I haven't looked at ATI because their cards tend to have issues under Linux.)
All the current lines from ATi and nVidia are 10-bit per channel capable, and the HD5000 series from ATi supports 12-bit per channel xvYCC extended colour space via the HDMI 1.3 connector. DVI and DP are currently limited to 10 bits per channel on both ATi and nVidia.
I thought I should look at ATI as well. From their specs at http://www.amd.com/us/products/desktop/graphics/ati-rad... it looks like 30-bit color only works over HDMI, and for some reason HDMI tops out at 1920x1200. It's the same for all HD 5xxx cards. So I wonder why they would have such a limitation. Can anybody explain / confirm / deny this?
You're reading it wrong.
10-bit per channel (30-bit) is via DVI, DP and VGA; 12-bit per channel xvYCC wide gamut is via HDMI. The max resolution is essentially a limitation of the typical use case, to leave overhead for audio. It isn't using dual-link TMDS; it's a much faster single TMDS link, which supports a higher resolution and bit depth than a standard single TMDS link can.
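To see why a faster single TMDS link makes 1920x1200 at 12 bits per channel plausible over HDMI 1.3, here's a back-of-the-envelope check. The figures are my own assumed round numbers (roughly a CVT reduced-blanking pixel clock, the single-link DVI TMDS ceiling, and the HDMI 1.3 maximum TMDS clock), not taken from any card's spec sheet:

```python
# Rough TMDS clock check for 1920x1200 @ 60 Hz (assumed figures).
PIXEL_CLOCK_MHZ = 154  # approx. CVT reduced-blanking pixel clock

# HDMI deep color scales the TMDS clock by (bits-per-channel / 8).
for bpc in (8, 10, 12):
    tmds_mhz = PIXEL_CLOCK_MHZ * bpc / 8
    fits_single_dvi = tmds_mhz <= 165  # single-link DVI TMDS ceiling
    fits_hdmi13 = tmds_mhz <= 340      # HDMI 1.3 max TMDS clock
    print(f"{bpc}-bit/channel: TMDS ~= {tmds_mhz:.0f} MHz, "
          f"single-link DVI: {fits_single_dvi}, HDMI 1.3: {fits_hdmi13}")
```

At 12 bits per channel the required TMDS clock (~231 MHz here) blows past the old single-link limit but still fits comfortably under HDMI 1.3's ceiling, which matches the "much faster single TMDS" point above.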
It must be set within the CCC and your monitor must fully support the format (and be reporting support via EDID info).
Many monitors have high-bit-depth internal look-up tables and processing, but actually only accept lower-precision input/output, usually 8-bit. The U2711 you're looking at also mentions "12-bit internal processing" and may in fact be limited to 8-bit input/output like their 24" model. I haven't found the U2711's full specs to verify that (not the useless specs on the product page).
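Regarding the monitor reporting support via EDID: you can check what the panel actually declares yourself. Here's a rough sketch of pulling the declared bit depth out of a raw EDID blob; note the bit-depth field only exists from EDID 1.4 on, and the byte offsets below are my reading of that revision, so treat this as an assumption-laden example, not a reference parser. On Linux the blob can be dumped from /sys/class/drm/*/edid or seen in `xrandr --verbose`.

```python
# Sketch: read the panel's declared bits-per-channel from an EDID 1.4 blob.
# Byte 20 (video input definition): bit 7 = digital input,
# bits 6-4 = colour bit depth (assumed EDID 1.4 layout).

DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def edid_bit_depth(edid: bytes):
    """Return declared bits per channel, or None if not reported."""
    if len(edid) < 21 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        return None                       # not a valid EDID header
    if (edid[18], edid[19]) != (1, 4):
        return None                       # field only defined in EDID 1.4
    vid = edid[20]
    if not vid & 0x80:
        return None                       # analog input: no bit-depth field
    return DEPTHS.get((vid >> 4) & 0x07)  # bits 6-4: colour bit depth
```

If this returns 8 (or None) for the U2711, the "12-bit internal processing" line on the product page wouldn't mean much for the actual input.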
You're reading it wrong.
10-bit per channel (30-bit) is via DVI, DP and VGA
What concerns me is that the only place they mention anything about Deep Color is the HDMI section. I couldn't find any mention of 10-bit or 30-bit.
[...] I haven't found the U2711's full specs to verify that (not the useless specs on the product page).
Yeah, me neither. But again, the only mention of anything higher than 8-bit is when they talk about HDMI. At any rate, "1 billion colors" is mentioned in many places; it's just that it's not clear if it's HDMI only.
A related question is whether DP mandates that its technical specs actually be implemented (as opposed to HDMI 1.3, which leaves many of its features optional).