ATI and PhysX Co-exist on the Nintendo Wii
Like a huge slap in the face to ATI, rival company Nvidia has signed a "tools and middleware" license agreement with Nintendo, bringing PhysX technology to the Wii console.
Yesterday Tom's reported that Nvidia signed a deal with Sony Computer Entertainment Inc. that gives PlayStation 3 developers access to the PhysX software development kit (SDK). According to the company, the kit is now available as a free download on the SCEI Developer Network and consists of a full-featured API and "robust" physics engine. But because the console's RSX GPU--based on Nvidia's G70 architecture (think GeForce 7800)--doesn't support PhysX in a hardware (or CUDA) sense, the middleware instead relies on the Cell's Synergistic Processing Units (SPUs) to process the physics rather than dumping the entire load on the Cell's Power Processor Unit (PPU).
Now Nvidia is taking another step toward dominating the gaming industry by inking a deal with Nintendo that grants Wii developers access to the PhysX SDK as well. “Nintendo has reshaped the home entertainment and video game market with the success of the Wii console. Adding a PhysX SDK for Wii is key to our cross-platform strategy and integral to the business model for our licensed game developers and publishers,” said Tony Tamasi, senior vice president of content and technology at Nvidia. “With Nvidia PhysX technology, developers can easily author more realistic game environments for the evolving demands of a broad class of Wii gamers.”
Currently the Nintendo Wii is the heavyweight champ in terms of overall console sales, having sold over 22 million units in North America alone since its launch back in November 2006, and 48 million units worldwide. While porting PhysX over to the blockbuster console makes business sense for Nvidia, what makes the whole announcement rather curious is just how the Nintendo Wii hardware can even handle physics processing. Of the three major consoles on the market today, the Nintendo Wii is the least powerful visually, relying more on the interaction provided by the Wii Remote.
Let's look at it this way: the Nintendo Wii relies on the PowerPC-based "Broadway" processor, clocked at 729 MHz and built on a 90 nm SOI CMOS process. On the graphics side, the visuals are rendered by ATI's "Hollywood" GPU, clocked at 243 MHz and built on a 90 nm CMOS process; there's 3 MB of embedded GPU texture memory and framebuffer thrown in there as well. As for the console's memory, there's 88 MB total: 64 MB of "external" GDDR3 SDRAM and 24 MB of "internal" 1T-SRAM integrated into the graphics package.
So how will the Nintendo Wii carry the burden? That question has yet to be answered. However, after a closer look at the Gamebryo LightSpeed announcement released last week (link), which reported that Emergent Game Technologies had integrated PhysX into its Gamebryo 2.6 development platform for the Wii, today's announcement should not have come as a surprise. According to an Nvidia rep, PhysX has been a part of Wii game development for some time; the company merely made it official with today's announcement. With the new SDK implementation, Nvidia can now make changes directly to the middleware without the need for developer involvement.
Still, with an ATI GPU under the hood of Nintendo's Wii console, it seems almost comical that Nvidia has invaded ATI's "space," so to speak, assimilating all three gaming consoles into the overall PhysX collective. As with yesterday's report on Sony's PlayStation 3, hopefully Nvidia will shed a bit more light on how the PhysX middleware will interact with the Wii hardware, and on whether gamers will see any performance issues as a result.
It's on a Wii.
AHAHAHAHAHA! So true! What's a Wii going to do with PhysX? Make Boom Blox more realistic?
Because ATI doesn't have a physics engine that it's trying to make mainstream, that opener is about as ridiculous as they come.
Let's not pretend ATI systems can't run PhysX to its fullest. It DOES run on the CPU, FYI.
I believe this is why it isn't currently usable on an ATI GPU.
This would leave AMD in a bad position. First, development of PhysX would be controlled by Nvidia. Second, you can guarantee that AMD would perpetually be in catch-up mode with poor relative performance.
Why would you possibly consent to something that is only going to put you at a disadvantage as it becomes prevalent? For the good of the consumer? HAHAHAHAHA. First rule of business: Profit Maximization.
Folding@home?
Honestly, I don't care about graphics. PhysX could be a damn good thing if it adds to gameplay and nuanced game structure. However, I feel a little uncomfortable with one company licensing all that software without competition. ATI/AMD paying for PhysX recalls the trap they're currently in with Intel. While it's not necessarily a bad thing that an industry settles on a standard, other tech advances could be overlooked because of such situations.
TL;DR: game physics don't really need a standard, in my opinion.
PhysX was a cool technology before Nvidia bought it. Then they just ruined the whole idea by dropping the PPU. Waste. Meh.
Despite that, there's still a modder working on bringing PhysX to ATI. He has just gone underground for a while so he doesn't have to deal with the abuse from ATI and Nvidia. Once he's done, I'm sure the more powerful stream processing of the ATI cards will give them the advantage in physics processing.
Since most PC games on the market use no more than two CPU cores... wouldn't it be cool if PhysX could also dump some of its load onto those unused or idle cores?
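In principle that's exactly what middleware can do: within a single simulation step, each rigid body's update is independent of the others, so the work can be farmed out to idle cores. Here's a minimal sketch of the idea (illustrative only -- this is not PhysX's actual API, and the bodies, time step, and Euler integration are made-up assumptions) using Python's standard `multiprocessing` pool:

```python
from multiprocessing import Pool

DT = 1.0 / 60.0      # fixed 60 Hz simulation step (assumed)
GRAVITY = -9.81      # m/s^2

def step_body(body):
    # Naive Euler integration for one rigid body, stored as a
    # (position, velocity) tuple. Each body is independent, so
    # this function can run on any core.
    pos, vel = body
    new_vel = vel + GRAVITY * DT
    new_pos = pos + new_vel * DT
    return (new_pos, new_vel)

def step_world(bodies, pool):
    # Spread the per-body work across worker processes; the main
    # process is free to run game logic meanwhile.
    return pool.map(step_body, bodies)

if __name__ == "__main__":
    bodies = [(10.0, 0.0)] * 1000  # 1000 bodies dropped from 10 m
    with Pool() as pool:           # one worker per available core
        bodies = step_world(bodies, pool)
```

Real engines use worker threads and job queues rather than processes, but the parallelism argument is the same: if two cores sit idle, that's free physics budget.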
That's the one I'm talking about. Nvidia reneged on him, and ATI was being even worse.
Wrong. PhysX is something else your system has to process, and it makes things slower. EVERY TIME. I have a GTX 280 and I get noticeable slowdown in UT3 with everything turned up and PhysX on.
Too bad it's still a gimmick. Mirror's Edge with PhysX is laughable.
They need to actually make something that isn't terrible.
As far as I can tell, folding is mainly used by gamers as a benchmark for their systems. Has this really helped anyone, or has it just raised the electric bill of folks with old computers lying around and nothing better to do with them?
I've not heard a good reason to "fold" at home.