So, each video card manufacturer only needs to write an OpenGL- or DirectX-compatible driver, and then any game (or application) written to the OpenGL or DirectX standard will magically work on their hardware.
I think your confusion is that you don't quite understand what an API is, and by association don't understand DirectX.
Yeah, but you don't understand that the question is a hardware one, and you guys are derailing him into the API/compiler direction, which will only confuse him more. The GPU has hard hardware limits: regardless of the firmware, BIOS, or API, some tasks simply can't be done no matter what extensions you add (like FP16 HDR + AA). What the API et al. might be able to do is work with what the hardware has got to make things similar, if not equal, and that is once again a hardware consideration more than anything, because it's unlikely that the APIs or compilers will be the limiting factor.
There are important hardware limits, and for the PS3/RSX they would fall within the current OGL 2.0++ limits of the GF6/7 series. Regardless of the API, the hardware not only needs to be compliant but must have the architecture to support the feature set required to achieve a desired effect/result. While nV and ATi have different proprietary extensions (now close to equal, but still favouring nV), the biggest differences are in how they are implemented and how they can be exploited for a general path. An example would be Doom 3, where both had 'capable' hardware that was OGL 1.5+ compliant, but the GF6 outperformed the X8 series partly because of the way D3, using OGL, handled early Z-culling compared to the 'typical' method in DX; focusing the architecture to play to one API more than another is therefore a real factor. The hardware itself is of great importance of course, and the two are different and will play to different features.

The Xbox 360 is designed for M$'s usual DX form and, while not DX10 compliant, sits somewhere between DX9.0c and DX10, giving it more features, which of course just adds to its options when programming. The Xenos VPU can use FP10 or FP16 for its HDR+AA calculations depending on what speed the programmer calls for, and still keep floating-point accuracy throughout; it can also handle integer values, just like the RSX. The RSX, on the other hand, cannot do FP render targets with AA within the core, since it still does the FP blending in the ROPs, and will require CPU/FPU assistance for some processes if it wants to achieve similar effects, doing the AA work with Cell processing power instead of all within the VPU.
Second, as to the developer side of DX vs. OGL: Cg makes that less of an issue than it has to be. If you KNOW you have to do both, then it's worth the extra effort to go through Cg to make quick compiling into both OGL and DX possible instead of staggered. It's a longer process than a single route, but usually faster than doing both separately, though from what I hear it's a pain in the A$$ compared to picking one or the other alone.
Simply put:
NEITHER the Xbox 360 nor the RSX will be able to support ALL the DX10/OGL 2.0+++ features available to the next-gen cards, but both will likely support many of them, and likely the X360 will support more within the VPU; that, however, may be greatly mitigated by how the API, and more specifically the game itself, is implemented (there's work being done for some PS3 titles involving int32 HDR with AA to get a similar effect to FP16 HDR+AA; it's supposedly close, with some issues, and there's talk of int64 as well, front-ended on the Cell side of the equation).
So whether we'll notice any difference in quality will be all in the eye of the beer holder, just like how some people did and didn't notice the difference between FP24 and partial-precision FP32+FP16+FX12 in previous generations. I think regardless of everything, Crytek will do its best to make it look good on all systems, especially if the desktop is any indication (they brought more new stuff to all sides [nV, ATi, AMD/M$ {64-bit support}] than anyone else IMO). I don't think anyone will really care if it's FP16 or int32 HDR+AA if it looks/plays the same.