
ATI and PhysX Co-exist on the Nintendo Wii

Source: Tom's Hardware US | 32 comments

Like a huge slap in the face to ATI, rival company Nvidia has signed a "tools and middleware" license agreement with Nintendo, bringing PhysX technology to the Wii console.

Yesterday Tom's reported that Nvidia signed a deal with Sony Computer Entertainment Inc. that gives PlayStation 3 developers access to the PhysX software development kit (SDK). According to the company, the kit is now available as a free download on the SCEI Developer Network and consists of a full-featured API and a "robust" physics engine. But because the console's RSX GPU, based on Nvidia's G70 architecture (think GeForce 7800), doesn't support PhysX in a hardware (or CUDA) sense, the middleware relies on the Cell's Synergistic Processing Units (SPUs) to process the physics rather than dumping the entire load on the Cell's Power Processor Unit (PPU).
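To make that split concrete, here is a minimal C++ sketch of the pattern: the main thread hands physics work to a small pool of workers and synchronizes later, much as the PS3 build farms simulation out to the SPUs instead of the PPU. The `PhysicsJob` type and `PhysicsWorkerPool` class are illustrative stand-ins, not part of the actual PhysX SDK.

```cpp
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>
#include <vector>

// Hypothetical stand-in for one unit of physics work
// (e.g. integrating a single island of rigid bodies for this frame).
using PhysicsJob = std::function<void()>;

class PhysicsWorkerPool {
public:
    explicit PhysicsWorkerPool(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Run(); });
    }

    ~PhysicsWorkerPool() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : threads_) t.join();  // workers drain the queue, then exit
    }

    // Called from the main ("PPU") thread; returns immediately.
    void Submit(PhysicsJob job) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }

private:
    void Run() {  // each worker plays the role of one SPU
        for (;;) {
            PhysicsJob job;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // solve contacts, integrate bodies, etc.
        }
    }

    std::vector<std::thread> threads_;
    std::queue<PhysicsJob> jobs_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    PhysicsWorkerPool pool(2);  // pretend we have two "SPUs"
    for (int i = 0; i < 4; ++i)
        pool.Submit([i] { std::printf("physics job %d done\n", i); });
}   // destructor waits for the queue to drain
```

On the real console the workers would be SPU jobs rather than OS threads, but the shape of the offload is the same: the main thread only submits work and syncs on the results, instead of doing the simulation itself.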

Now Nvidia is taking another step toward dominating the gaming industry by inking a deal with Nintendo that grants Wii developers access to the PhysX SDK as well. “Nintendo has reshaped the home entertainment and video game market with the success of the Wii console. Adding a PhysX SDK for Wii is key to our cross-platform strategy and integral to the business model for our licensed game developers and publishers,” said Tony Tamasi, senior vice president of content and technology at Nvidia. “With Nvidia PhysX technology, developers can easily author more realistic game environments for the evolving demands of a broad class of Wii gamers.”

Currently the Nintendo Wii is the heavyweight champ in terms of overall console sales, having sold over 22 million units in North America alone since its launch back in November 2006, and 48 million worldwide. While porting the PhysX technology over to the blockbuster console is smart in a business sense for Nvidia, what makes the whole announcement rather curious is just how the Nintendo Wii hardware can handle physics processing at all. Of the three major consoles on the market today, the Nintendo Wii is the least powerful in a visual sense, relying more on the interaction provided by the Wii Remote.

Let's look at it this way: the Nintendo Wii relies on the PowerPC-based "Broadway" processor, clocking in at 729 MHz and built on a 90 nm SOI CMOS process. On the graphics side, the visuals are rendered by ATI's "Hollywood" GPU, clocking in at 243 MHz and built on a 90 nm CMOS process, with 3 MB of embedded GPU texture memory and framebuffer thrown in as well. As for the console's memory, there's 88 MB total: 64 MB of "external" GDDR3 SDRAM and 24 MB of "internal" 1T-SRAM integrated into the graphics package.

So how will the Nintendo Wii carry the burden? That question has yet to be answered. However, after a closer look at the Gamebryo LightSpeed announcement released last week (link), reporting that Emergent Game Technologies integrated PhysX into its Gamebryo 2.6 development platform for the Wii, today's announcement should not have come as a surprise. According to an Nvidia rep, PhysX has been a part of Wii game development for some time; the company merely made it official with today's announcement. With the new SDK implementation, Nvidia can now make changes directly to the middleware without the need for developer involvement.
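That last point is worth a quick illustration. Middleware of this kind works because games link against a stable interface while the vendor controls everything behind it. Below is a hedged C++ sketch of the idea; the `IPhysicsEngine` interface and `CreatePhysicsEngine` factory are invented for illustration and are not the real PhysX API.

```cpp
#include <cstdio>
#include <memory>

// Hypothetical stable interface the game ships against. As long as
// this contract holds, the vendor can revise everything behind it.
class IPhysicsEngine {
public:
    virtual ~IPhysicsEngine() = default;
    virtual void Step(float dt) = 0;  // advance the simulation by dt seconds
};

// Vendor-side implementation, free to change between releases
// (e.g. shipped as an updated shared library).
class PhysicsEngineV2 : public IPhysicsEngine {
public:
    void Step(float dt) override {
        std::printf("simulating %.4f s with the improved solver\n", dt);
    }
};

// In a real shared-library setup, this factory would be the exported
// entry point the game resolves at load time.
std::unique_ptr<IPhysicsEngine> CreatePhysicsEngine() {
    return std::make_unique<PhysicsEngineV2>();
}

int main() {
    auto physics = CreatePhysicsEngine();
    physics->Step(1.0f / 60.0f);  // one 60 Hz frame
}
```

Because the game only ever sees the interface, the vendor can, in principle, ship an updated implementation and every licensed title picks it up without being recompiled.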

Still, with an ATI GPU under the hood of Nintendo's Wii console, it seems almost comical that Nvidia has invaded its rival's "space," so to speak, assimilating all three gaming consoles into the PhysX collective. As with yesterday's report on Sony's PlayStation 3, hopefully Nvidia will shed a bit more light on how the PhysX middleware will interact with the Wii hardware, and whether gamers will see any performance issues as a result.

Comments (32)
This thread is closed for comments.
  • +2
    trainreks, March 19, 2009 7:44 PM
    Who cares?

    It's on a Wii.
  • +3
    RiotSniperX, March 19, 2009 7:57 PM
    trainreks: "Who cares? It's on a Wii."

    AHAHAHAHAHA! So true! What's a Wii going to do with PhysX? Make Boom Blox more realistic?
  • +5
    thedipper, March 19, 2009 7:58 PM
    "Like a huge slap in the face to ATI"
    Because ATI doesn't have a physics engine that they're trying to make mainstream, that opener is pretty much as retarded as they get.

    Let's not pretend ATI systems can't run PhysX to its fullest. It DOES run on the CPU, FYI.
  • +5
    nukemaster, March 19, 2009 8:01 PM
    Kind of funny how, when it was Ageia PhysX, no one wanted it; now that Nvidia owns them, everyone is signing up.
  • +2
    hellwig, March 19, 2009 8:28 PM
    I say if Nvidia can get PhysX on everything, good. ATI will probably end up licensing it, and then games will actually start using it. With quad-core CPUs pretty much the norm these days, there's no reason any computer couldn't run PhysX (Nvidia GPU or not). Besides, my understanding is the PhysX overhead on the GPU is too burdensome; as if the GPU didn't have enough to do in modern games already.
  • +3
    thedipper, March 19, 2009 8:33 PM
    Well Hellwig, if PhysX ran on ATI GPUs and used AMD Stream properly, there's really no question that ATI would have the clear advantage in physics rendering.

    I believe this is why it isn't currently usable on an ATI GPU.
  • +6
    blazer_123, March 19, 2009 8:50 PM
    IMHO, the reason AMD doesn't use PhysX is that they do not want to be at the whim of their main competitor. PhysX is a software solution that has been modified to run on AMD parts. AMD is fearful that if they officially consent to PhysX use, then the market share, and hence leverage, of PhysX will increase.

    This would leave AMD in a bad position. First, development of PhysX would be controlled by Nvidia. Second, you can guarantee that AMD would perpetually be in catch-up mode with poor relative performance.

    Why would you possibly consent to something that is only going to put you at a disadvantage as it becomes prevalent? For the good of the consumer? HAHAHAHAHA. First rule of business: profit maximization.
  • -5
    hairycat101, March 19, 2009 9:15 PM
    ATI should license and start using PhysX. The reason is this: if I'm getting a new card, I want to still have a use for the old one when I upgrade. If I get an Nvidia card, I can still use the old one for PhysX and the new one as the GPU. There is no use for an old ATI card... unless you have an old system that you want to slap it in.
  • +9
    armistitiu, March 19, 2009 9:19 PM
    First of all, ATI can do PhysX. It has been shown that CUDA can be enabled on ATI. They could make drivers to support it, but they don't want to. ATI Stream apparently is not that popular, but as soon as the OpenCL SDK is out I think a lot of people will try using it, because it's supported by both GPU vendors and because it's an open standard, and I think that's the most important thing. I tend to support ATI on this one (not enabling PhysX) because I hate closed proprietary software. BTW, OpenCL is very similar to CUDA, and my guess is you could easily implement PhysX in it.
  • +4
    armistitiu, March 19, 2009 9:20 PM
    hairycat101: "There is no use for an old ATI card... unless you have an old system that you want to slap it in."

    Folding@Home? :)
  • -1
    hustler539, March 19, 2009 9:38 PM
    This is a good thing. As more consoles accept PhysX, game makers are more likely to embrace it as well. That means more eye candy for us, and I for one definitely don't mind more realism :D
  • +4
    joseph85, March 19, 2009 9:50 PM
    hustler539: "This is a good thing. As more consoles accept PhysX, game makers are more likely to embrace it as well. That means more eye candy for us, and I for one definitely don't mind more realism."

    Honestly, I don't care about graphics. PhysX could be a damn good thing if it adds to gameplay and nuances game structure. However, I feel a little uncomfortable with one company licensing all that software without competition. ATI/AMD paying for PhysX recalls the trap they're currently in with Intel. While it's not necessarily a bad thing that an industry settles on a standard, other tech advances could be overlooked because of such situations.

    TL;DR: Game physics don't really need a standard, in my opinion.
  • +4
    curnel_D, March 19, 2009 10:28 PM
    I agree with one of the posts: once the OpenCL SDK is out, CUDA and PhysX will likely become just what they are, software middlemen, and be long forgotten at that.

    PhysX was a cool technology before Nvidia bought it. Then they ruined the whole idea by dropping the PPU. Waste. Meh.

    Despite that, there's still a modder working on bringing PhysX to ATI. He has just gone underground for a while so he doesn't have to deal with the abuse from ATI and Nvidia. Once he's done, I'm sure the more powerful stream processing of the ATI cards will give them the advantage in physics processing.
  • 0
    The Schnoz, March 19, 2009 10:47 PM
    I recall a programmer in Israel who got PhysX to work on his ATI graphics card. The end result was that Nvidia supported the programmer, but then ATI told him to stop. http://www.tgdaily.com/html_tmp/content-view-38283-135.html
  • 0
    pharge, March 19, 2009 11:13 PM
    Hmm... on the PS3, the "middleware relies on the Cell's Synergistic Processing Units (SPUs) to process the physics rather than dumping the entire load on the Cell's Power Processor Unit (PPU)."

    Since most PC games on the market use no more than 2 cores of our CPUs... wouldn't it be cool if PhysX could also dump some load onto those unused or idle cores? (A sketch of this idea follows the thread below.)
  • 0
    curnel_D, March 20, 2009 12:45 AM
    The Schnoz: "I recall a programmer in Israel who got PhysX to work on his ATI graphics card. The end result was that Nvidia supported the programmer, but then ATI told him to stop. http://www.tgdaily.com/html_tmp/co [...] 3-135.html"

    That's the one I'm talking about. Nvidia reneged on him, and ATI was being even worse.
  • 0
    megamanx00, March 20, 2009 4:20 AM
    Yeah, I don't know about the whole slap-in-the-face thing. Heck, ATI is still dragging their feet with their way-overdue Havok GPU support. Anyway, I'm assuming that PhysX is implemented on the Wii CPU and will probably be used for more realistic interactions, like objects bouncing off each other, but probably not for things like realistic liquids or glass shattering. Since developers are learning to use PhysX elsewhere, it makes sense to have these tools available on the Wii in order to shorten development time. As for the PS3, I'm sure some of the physics processing can be offloaded to the GPU, but because of its 7800-related design it wouldn't be too efficient.
  • +2
    Dmerc, March 20, 2009 7:06 AM
    I tried PhysX on my 9800GT playing UT3. At 1280 x 720 with everything at full, it was unplayable. The CPU (i7 920 overclocked to 3.2 GHz) was only 30% utilized while I was playing. I just wish PhysX would use a core of the CPU instead of the GPU; that would let me play at 1920 x 1200 at full detail with proper physics.
  • +1
    Anonymous, March 20, 2009 11:49 AM
    "There's got to be something wrong with your system, because PhysX is designed to make games run faster, not slower, and every game I have that uses it runs faster since I got PhysX installed."

    Wrong. PhysX is one more thing your system has to process, and it makes things slower. Every time. I have a GTX 280 and I get noticeable slowdown in UT3 with everything turned up and PhysX on.

    Too bad it's still a gimmick. Mirror's Edge with PhysX is laughable.
    They need to actually make something that isn't terrible.
  • -1
    hairycat101, March 20, 2009 12:11 PM
    armistitiu: "Folding@Home? :)"

    As far as I can tell, Folding is mainly used by gamers as a benchmark for their systems. Has this really helped anyone out, or just raised the electric bill of folks with old computers lying around and nothing better to do with them?

    I've not heard a good reason to "fold" at home.
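As for pharge's question above about idle cores: on the PC side that kind of offload is straightforward to sketch. Here is a minimal C++ example, assuming a frame in which the physics step is independent of the game-logic step; `SimulatePhysics` and `UpdateGameLogic` are hypothetical stand-ins for engine calls, not PhysX functions.

```cpp
#include <cstdio>
#include <future>
#include <thread>

// Hypothetical per-frame work items.
void SimulatePhysics(float dt) { /* integrate bodies for dt seconds */ }
void UpdateGameLogic(float dt) { /* AI, input, scripting */ }

int main() {
    const float dt = 1.0f / 60.0f;
    std::printf("hardware threads available: %u\n",
                std::thread::hardware_concurrency());

    // If a spare core exists, run the physics step concurrently with
    // game logic instead of serializing both on one core.
    for (int frame = 0; frame < 3; ++frame) {
        auto physics = std::async(std::launch::async, SimulatePhysics, dt);
        UpdateGameLogic(dt);  // runs on this core in the meantime
        physics.wait();       // synchronize before rendering the frame
    }
}
```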