
Does the PS3 support DirectX 10 card game effects?

Tags:
  • Graphics Cards
  • Playstation
  • Games
  • Directx
  • Graphics
Last response: in Graphics & Displays
May 16, 2006 1:42:36 PM

Just wondering if the PlayStation 3 is able to show DirectX 10-quality game effects. The gap between DirectX 9 and 10 is huge. How will they solve this gap? Or will the PS3 already be outdated in graphics quality in some way when Crysis (the first DX10 game) is released this year? I know Crysis will also play on DX9, and it looks stunning already, but if I'm to believe their stories then Crysis will be even more beautiful on DX10. Maybe it's better to buy a new DirectX 10 card instead of buying a console around that time? I'd like to see some opinions on this matter. :)


May 16, 2006 2:14:26 PM

If I'm not mistaken, the PS3 doesn't use DirectX at all; it runs Linux or some other *nix core. Who says you have to use DirectX?
May 16, 2006 2:37:36 PM

The PS3 is not a PC. It doesn't use DirectX.

Its graphics processor will be equivalent to Nvidia's newest technology, though, so it'll be fine.

The gap between DirectX 9 and 10 is almost nil; from what I've seen the changes are more on the back end. The famous Crysis demo was also shown in DX9, from what I understand.
May 16, 2006 2:42:27 PM

The PS3 uses OpenGL ES for the 3D graphics API as well as NVIDIA’s Cg shader language.

Also, I should mention that DirectX 10 is just an API. It does not inherently have quality game effects. It's really up to the developer how they use it.

Rest assured that the PS3 will launch with games that look as good as competing Xbox 360 and PC titles.
May 16, 2006 3:12:13 PM

I do not believe the PS3's GPU would support DX10 in a PC environment, since it's the same GPU as your standard 7800/7900 card.

The Xbox 360's GPU comes closer, though, since it is based on the unified shader architecture that ATI will use in its upcoming cards. And I assume it uses DirectX, since it's Microsoft's console.
May 16, 2006 5:38:45 PM

Umm... I was just wondering if the PS3 is able to show DirectX 10 effects. I know it does not use DirectX 10, since that's Microsoft's. lol But it's kind of weird if it only uses the GPU to do this. If that were the case, why do we need DirectX after all, if it can be implemented in the GPU itself? There should be software that tells the GPU what to do, right? The BIOS or something. So my question is: how is the PS3 able to do exactly the same thing in Crysis without DX10 functions?
May 16, 2006 6:04:02 PM

A video card will have native code running on it (call it firmware). In order to tell the video card what to do, you need a driver to speak to it. Back in the old days, you had to code every game to match every different video card driver you wanted to support. This is obviously cumbersome and inefficient, so they came up with some standard APIs (Application Programming Interfaces), such as OpenGL and DirectX. So, each manufacturer of the video card only needs to write an OpenGL or DirectX compatible driver and then games (or any application) written to the OpenGL or DirectX standard will magically work on their hardware.
I think your confusion is that you don't quite understand what an API is, and by association don't understand DirectX. Read the wikipedia link above for APIs and then read the entry on DirectX. Hopefully between what I wrote earlier and those two links, you will be able to answer your own question. Let me know if not.
May 16, 2006 6:09:15 PM

Well, DirectX is just a graphics API standard (which is basically controlled by Microsoft).

It's kind of like the saying that "there is more than one way to skin a cat": the point of having a standard [graphics API] is so that coders know their software will run basically the same way on all systems that use the API, and it gives them a set of rules to work within.

The PS3 has its own graphics API set (I have no idea what it is called), some say based off of OpenGL (again, another graphics API standard, though not controlled/dictated by one company), which I'm pretty sure can/will produce graphics effects equal to what you have seen in DirectX demos. The main limiting factor will be hardware, not the API.

So, short answer: no, it doesn't use the DirectX API, but yes, it should be able to produce similar effects.
May 17, 2006 12:13:37 AM

Quote:
So, each manufacturer of the video card only needs to write an OpenGL or DirectX compatible driver and then games (or any application) written to the OpenGL or DirectX standard will magically work on their hardware.
I think your confusion is that you don't quite understand what an API is, and by association don't understand DirectX.


Yeah, but you don't understand that the question is a hardware one, and you guys are derailing him into the API/compiler direction, which will confuse him more. The GPU has hardware limits, and regardless of the firmware, BIOS, or API, there are tasks it might never be able to perform no matter what extensions you add (like FP16 HDR + AA); they just can't be done. What the API et al. might be able to do is work with what they've got to make things similar if not equal, and that is once again a hardware consideration more than anything, because it's unlikely that the APIs or compilers will be the limit.

There are important hardware limits, and for the PS3/RSX they would fall within the current OGL 2.0++ limits that the GF6/7 series has. Regardless of the API, the hardware not only needs to be compliant but must have the architecture to support the feature set called for or required to achieve a desired effect/result. While nV and ATi have different proprietary extensions (now close to equal, but still favouring nV), the biggest differences are how they are implemented and how they can be exploited for a general path. An example would be D3, where both have 'capable' hardware that is OGL 1.5+ compliant/capable, but the GF6 outperformed the X8 series partly because of the way D3, using OGL, handled early Z-culling compared to the 'typical' method in DX; focusing the architecture to play to one API more than another is then a factor.

The hardware itself is of great importance of course, and the two are different and will play to different features. The Xbox 360 is designed for M$'s usual DX form, and while not DX10 compliant it is somewhere between DX9.0c and DX10, giving it more features, which of course just adds to its options when programming. The Xenos VPU can use FP10 or FP16 for its HDR+AA calculations depending on what speed the programmer calls for and still keep floating-point accuracy throughout; it can also do integer values as well, just like the RSX. The RSX, on the other hand, cannot do FP targets within the core for AA, since it still has the FP blending in the ROPs, and will require CPU/FPU assistance for some processes if it wants to achieve similar effects by doing the AA process using Cell processing power instead of all within the VPU.

Second, as to the developer, DX vs OGL: Cg makes that less of an issue than it has to be. If you KNOW that you have to do both, then it's worth the extra effort to go through Cg to make quick compiling into both OGL and DX possible instead of staggered. It's a longer process than a single route, but usually faster than doing both separately, though from what I hear it's a pain in the A$$ compared to picking one or the other alone.

Simply put, NEITHER the Xbox 360 nor the RSX will be able to support ALL the DX10/OGL 2.0+++ features available to the next-gen cards, but they will likely be able to support many of them, and the X360 will likely be able to support more within the VPU. But that may be greatly mitigated by implementations within the API and, more specifically, the game itself (there's work being done for some PS3 titles involving int32 HDR with AA to get a similar effect to FP16 HDR + AA; it's supposedly close, with some issues, and there's talk of int64 as well, front-ended on the Cell side of the equation).

So whether we'll notice any difference in quality will be all in the eye of the beer holder, just like how some people did and didn't notice the difference between FP24 and partial-precision FP32+FP16+FX12 in previous generations. I think regardless of everything, Crytek will do its best to make it look good on all systems, especially if the desktop is any indication (they brought more new stuff to all sides [nV, ATi, AMD/M$ {64-bit support}] than anyone else, IMO). I don't think anyone will really care whether it's FP16 or int32 HDR+AA if it looks/plays the same.
May 17, 2006 8:10:35 AM

TheGreatGrapeApe, you got my head spinning all the way, lol. That was a very professional speech, thank you. So it's true... the next-gen DX10 PC cards will be able to push the boundaries further due to more features. :) Next time I will ask my questions a bit differently, as it might have been a bit confusing. Thanks for the explanation, guys.