
Physics Driver Outrage: Is Nvidia Guilty?

Source: Tom's Hardware

Opinion - When we published an article detailing Nvidia’s advantage in 3DMark Vantage, we had a good feeling that the data might spark some controversy. Using the GPU for physics calculations in a CPU benchmark looks highly suspicious any way you look at it, and at first it appeared that Nvidia had been caught with its hand in the cookie jar. Finger-pointing was the result, but as it turns out, there are always two sides to the story, and the benchmark maker has an entirely different opinion.

It isn’t like Nvidia and ATI have always played nice. If you notice anything out of the ordinary in this industry, you suspect cheating, just as in this recent physics outrage, which flared up when Nvidia claimed a huge advantage in the 3DMark Vantage physics test. Earlier this year, Nvidia was involved in the highly controversial "Creedgate," when Ubisoft found that its "The Way It’s Meant To Be Played" title Assassin’s Creed was faster on ATI cards. The company decided to remove DX10.1 support, explaining that the game was not just slower, but also unstable on GeForce cards. That explanation was received as rather doubtful.

In the most recent round of accusations, AMD claims that Nvidia has been fiddling with the 3DMark Vantage benchmark. Just hours after that story was published, ATI partners contacted us with similar claims about Unreal Tournament 3. The whole affair revolves around Nvidia’s driver version 177.39.

It all would be fine and dandy, if only somebody had actually called representatives of the two software companies and asked them what really happened. We were able to contact those companies and received surprising statements from Oliver Baltuch, president of Futuremark, and Mark Rein, vice president at Epic Games.

Let’s take a step back and look at how the physics case unfolded:

November 2004: Nvidia introduces its first commercial chipset supporting multiple GPUs, the nForce 4 SLI for the AMD platform. AMD downplays the value of multi-GPU setups, claiming that the real value lies in a single-die GPU.

February 2005: Nvidia announces that the company has shipped three million SLI-capable chipsets.

March 2005: ATI unveils the X850 Crossfire platform, consisting of two Radeon X850 boards connected via an external cable. The solution never actually launched in any real volume.

March 2006: GPU physics begins its life as a marketing gimmick between ATI and Nvidia. Both companies announce GPU physics at GDC Spring in San Francisco using Havok FX, a subset of the Havok physics API that used the GPU to "animate" physics. It was never true physics simulation.

May 2006: At E3 2006 in Los Angeles, key game developers criticize Havok FX and decide to go with either Havok or Ageia’s PhysX API, since both run on the CPU and work on almost all platforms.

June 2006: ATI is the first company to demonstrate GPU physics, at Computex in Taipei, using a system with three Radeon X1900 XTX graphics cards.

September 2007: Intel buys Havok, and both Nvidia and AMD open negotiations with Ageia. AMD does not want to pay for Ageia and decides that physics on the GPU should be buried.

November 2007: At the AMD Phenom launch in Warsaw, AMD’s developer relations manager says: "GPU physics is dead for now".

February 2008: Nvidia announces the acquisition of Ageia.

April 2008: During its Financial Analyst Day, Nvidia announces that a physics driver will be available to the general public by mid-summer. The first public demonstration did not go as planned, but the potential was clear.

June 2008: Nvidia releases PhysX Application Software 8.06.12 first to the general press, then to the public. This version of PhysX enables GPU acceleration of the PhysX API. Controversy sparks around 3DMark Vantage and Unreal Tournament 3.

Looking back at this history, we notice that ATI’s first reaction to multi-GPU was negative, yet the company followed suit with Crossfire and is now preaching the advantages of smaller GPUs over large monolithic dies. Later, AMD downplayed the value of GPU physics, then announced that it had reached an agreement with Intel/Havok. But this move was "too little, too late" for companies like Epic and Futuremark, who made their design calls years ago. AMD didn’t work on GPU physics and even tried to bury it. As a result, PhysX has become the physics API of choice for more than 150 games, and Futuremark used PhysX in its benchmark.

AMD’s Official Statement: Nvidia fools 3DMark Vantage

The issue of AMD attacking Nvidia over 3DMark was summed up in an interesting article by my ex-colleague Charlie Demerjian. We received an official statement from Dave Baumann, former head of Beyond3D and now in a senior technical role inside AMD’s graphics unit:

"We believe physics simulation, whether performed on the CPU or the GPU, will be an increasingly important feature of upcoming games. The powerful parallel processing capabilities of modern GPUs have been proven to be very useful for accelerating some types of physics calculations, such as cloth simulations and rigid body collisions, used to enhance game visuals. However, using the GPU in this way only makes sense if it doesn’t detract from graphics rendering performance. In other words, adding a few more moving objects into a scene isn’t necessarily beneficial if it requires other 3D effects to be simplified, or sacrifices resolution and frame rate.

3DMark Vantage attempts to address the growing importance of game physics by including support for GPU-accelerated physics in the GPU tests, implemented using DirectX 10 geometry shaders. The developers balanced the physics and rendering workloads in a way they felt was reflective of what we would see in next-generation games. Additionally, they included CPU tests that supported the use of Ageia PhysX PPUs to offload some physics calculations from the CPU. This decision was made prior to the acquisition of Ageia by Nvidia, and the subsequent discontinuation of discrete PPU products.

Recently released drivers from Nvidia (ForceWare 177.39) fool the 3DMark Vantage benchmarks into thinking an Ageia PhysX PPU is installed, while actually doing the additional physics processing on the GPU. Since Vantage has separate GPU & CPU benchmarks which both include physics processing, this causes the performance benefits of GPU physics to be double-counted, resulting in an artificial inflation of the final score. Real games can be expected to limit the amount of GPU physics processing to avoid significantly impacting rendering performance. Also, we are confident that the vast majority of upcoming game titles will not include support for PhysX, but will instead rely on more popular physics middleware (such as Havok) or proprietary physics engines, which will not benefit in any way from Nvidia’s PhysX drivers."
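AMD’s double-counting argument is easiest to see with a little arithmetic. Below is a minimal sketch in Python, assuming a simplified weighted scoring model with made-up weights and subscores (the real 3DMark Vantage formula differs); it shows how offloading the CPU test’s physics to the GPU inflates the CPU subscore and, through it, the overall result:

```python
# Hypothetical, simplified scoring model -- NOT Futuremark's actual formula.
# The overall score is taken as a weighted geometric mean of the subscores.

def overall_score(gpu_score: float, cpu_score: float,
                  gpu_weight: float = 0.75, cpu_weight: float = 0.25) -> float:
    """Combine subscores into one number (weights are illustrative only)."""
    return (gpu_score ** gpu_weight) * (cpu_score ** cpu_weight)

# Baseline: physics in the CPU tests runs on the CPU.
baseline = overall_score(gpu_score=10000, cpu_score=4000)

# With GPU-accelerated PhysX reported as a "PPU", the CPU test's physics
# workload is offloaded, so the CPU subscore jumps although the CPU itself
# is unchanged -- and the GPU subscore already credits the same hardware.
inflated = overall_score(gpu_score=10000, cpu_score=12000)

print(f"baseline overall: {baseline:,.0f}")      # ~7,953
print(f"inflated overall: {inflated:,.0f}")      # ~10,466
print(f"gain: {(inflated / baseline - 1):.0%}")  # ~32%
```

Whatever the exact weights, any formula that folds the CPU subscore into the final result will credit the GPU twice, which is the core of AMD’s complaint.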

Summing up, AMD claims that the ForceWare 177.39 driver "fools" the 3DMark Vantage benchmark. We’re not so sure. Ageia built a very solid library of titles that use the PhysX API; being a standard library within the Unreal Engine alone got Ageia more than one hundred contracts. PhysX (which absorbed the NovodeX and Meqon APIs) is the most commonly used physics API for console games today: Sony licensed the PhysX SDK as the official physics engine for the PlayStation 3 console, Microsoft licensed PhysX for its Robotics Studio, and the list goes on. So why would we undermine PhysX’s value as an API? Because of Intel’s Havok?

Wasn’t this a case of "fooling" a benchmark in the first place?

Futuremark: No cheating here

3DMark Vantage is in the hot seat as far as Nvidia’s PhysX goes, because this is the first time a GPU is influencing CPU scores. AMD claims that Nvidia violates the driver rules of Futuremark’s Benchmark Development Program (BDP). This is what Futuremark had to say:

"The driver in question has not been submitted for authorization and is only for demo purposes only. Nvidia has followed the correct rules for driver authorization and the BDP by sending us the 177.35 published driver (the same as AMD has now sent us the 8.6 published driver), both of which are currently undergoing the Authorization process in our Quality Assurance area at this moment.

Only drivers that have passed WHQL and our driver Authorization Process have comparable results that will be allowed for use in our ORB database and hall of fame. Other drivers which have not been submitted will not be commented on. Otherwise, we would have to inspect every Beta and press driver that is released.

Our application is not changed in any way, thus any statement implying otherwise is incorrect."

According to Futuremark, Nvidia did not violate the BDP driver rules. Then again, the company didn’t state that the 177.39 driver was legit, either. Note, however, that it is not the 177.39 display driver that enables physics on the GPU; that is done by the PhysX 8.06.12 Application Software. We spoke with Oliver and other members of the Futuremark team and learned that they have no issues with the PhysX Software 8.06.12 because it is WHQL-certified, but the display driver has to be certified as well. Once that is done, both PhysX 8.06.12 and 177.35 will be approved for use on ORB.

The only difference between the future WHQL driver 177.35 and 177.39 is that the latter adds support for the GeForce 9800 GTX+, a strangely named 55 nm die-shrink of the G92 chip.

Epic Games: It’s customer service

Following our conversation with Futuremark, we spoke with Mark Rein, VP of Epic Games. Mark is known to be quite knowledgeable when it comes to new technologies, and his company isn’t shy about pointing fingers, even at the largest corporations.

When it comes to physics on a GPU, Mark believes that using the GPU for physics is not a cheat. Rather, he considers the feature a bonus for Unreal Tournament 3 players who own Nvidia graphics cards:

"Those users can now play games that offer features that were designed for an Ageia PPU simply by using their high-end GPU. The only thing that Nvidia did was to change the library that we shipped with the game, and ultimately made those levels run better."

Mark called it ironic that nobody cried foul when Epic released the UT3 Bonus Pack, which contained PhysX-enabled levels and required an Ageia PPU to run. Nvidia’s purchase of Ageia made this technology available to millions of gamers instead of several thousand (according to some sources, Ageia shipped only 120,000 boards).

Mark chose to call Nvidia’s PhysX driver "customer support." He mentioned that Nvidia has a long history of going the extra mile to improve its customers’ PC gaming experience through driver features and optimizations created by working closely with developers.
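Mark’s "change the library" remark describes a simple runtime-dispatch pattern: the game calls one physics API, and the runtime decides whether an accelerated backend is available. The sketch below is a hypothetical Python illustration of that pattern; the names (has_hardware_accel, simulate_step and friends) are invented for this example and are not the actual PhysX SDK API:

```python
# Hypothetical sketch of runtime backend selection -- not the real PhysX SDK.

def has_hardware_accel() -> bool:
    """Stand-in for the runtime's capability query: originally 'is an Ageia
    PPU present?', and with the new PhysX software also 'is a CUDA-capable
    GeForce present?'.
    """
    return False  # assume a CPU-only machine for this demo

def step_cpu(bodies, dt):
    print(f"CPU solver: {len(bodies)} bodies, dt={dt:.4f}s")

def step_accelerated(bodies, dt):
    print(f"Accelerated solver: {len(bodies)} bodies, dt={dt:.4f}s")

def simulate_step(bodies, dt):
    # The game ships a single call site. Swapping the backend library, as
    # Nvidia did for the UT3 PhysX levels, changes which branch runs --
    # the game code itself stays untouched.
    (step_accelerated if has_hardware_accel() else step_cpu)(bodies, dt)

simulate_step(bodies=[object()] * 500, dt=1 / 60)
```

Seen this way, PPU owners and GeForce owners simply take different branches of the same code, which is why Epic regards the swap as a feature rather than a cheat.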

Personally, I have been advocating physics in games for ages, because it is the only way to enable the creation of truly realistic games. For me, physics can come via Havok, PhysX or even ray tracing; games and other applications should have physics, because looks are nothing if the creators cannot put their vision of the world into motion.

How many times have we complained about race cars not crashing when they clip a curb or cross the grass at very high speed? Physics is the answer, and Nvidia’s PhysX is one of the roads that game developers can take.
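To make that concrete, here is a toy rigid-body step in Python, a minimal sketch with made-up numbers of the kind of integrate-and-collide update a physics engine (Havok, PhysX or homegrown) performs for every object, every frame; it is not any real engine’s API:

```python
# Toy physics step: semi-implicit Euler integration plus a curb collision.
# All values are made up for illustration.

def step(pos, vel, dt, gravity=-9.81, curb_height=0.12, restitution=0.4):
    """Advance one body by dt; bounce it off a curb instead of clipping through."""
    vx, vy = vel
    vy += gravity * dt                           # accelerate
    x, y = pos[0] + vx * dt, pos[1] + vy * dt    # move
    if y < curb_height and vy < 0.0:             # hit the curb?
        y = curb_height
        vy = -vy * restitution                   # impulse: bounce, lose energy
    return (x, y), (vx, vy)

pos, vel = (0.0, 0.5), (30.0, 0.0)   # a car-like body at speed, airborne
for _ in range(120):                 # two seconds at 60 Hz
    pos, vel = step(pos, vel, dt=1 / 60)
print(f"pos=({pos[0]:.1f}, {pos[1]:.3f})  vel=({vel[0]:.1f}, {vel[1]:.2f})")
```

Multiply this by thousands of debris pieces, cloth vertices and wheel contacts per frame, and it becomes obvious why a massively parallel GPU is an attractive place to run it.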


Nvidia: Public PhysX driver has arrived, PhysX component is WHQL certified

At the end of the day, we asked Nvidia when the PhysX driver would be available to the general public and, more importantly, when there would be a WHQL-certified driver that Futuremark could approve. We were given an answer by Bryan Del Rizzo, one of the members of Nvidia’s PR team:

"The PhysX system software is WHQL. [The] 177.39 display driver is BETA (so the co-installer will be under beta downloads)"

This answer didn’t exactly satisfy us; we wanted to hear when the driver would be WHQL-certified and when it would go through Futuremark’s BDP driver certification process.

Nvidia sources told us that the WHQL display driver is currently being certified at Microsoft, with the release expected by mid-summer. While a specific date was not given, common sense suggests that the WHQL driver will be available when 9800 GTX+ cards ship, which will be July 11.

For now, you can find the official beta driver on the following pages: 32-bit Windows XP, 32-bit Windows Vista and 64-bit Windows Vista.
If you want the PhysX Application Software in stand-alone form, the download link is here. So far, supported boards include the GeForce 8, 9 and GTX 200 series.


Conclusion

To us, this matter has a clear conclusion: AMD rained on its own parade and on the launch of its excellent Radeon 4850 and 4870 cards. We got the impression that the software vendors in question believe their products benefit from GPU physics and that accelerating in-game physics with a GPU is a positive move. Forget corporate politics for one minute and consider this: Nvidia modified PhysX and brought it to millions of GeForce owners. We would say around 10 million, since we do not think that the 8600 models and below are capable of serving PhysX demands alongside graphics.

If you own a GeForce 8800, 9600/9800 or GTX 200 series card, we can only recommend downloading the latest drivers and PhysX software when they become available and playing those levels in Unreal Tournament 3, Ghost Recon and other PhysX games.

Comments
  • +5 | swiftpulse, June 25, 2008 11:58 PM
    While your conclusion holds true for games like UT3, still the issue of 3dmark remains an issue. I can't understand why "ATI partners" complain about UT3 but it's clear that 177.39 can inflate the CPU score of the 3dmark and produce misleading result.

    Of course as far as Futuremark is concerned, no cheating happened because the drivers were not authorized by them, and Nvidia can't be faulted to enable a feature on their product, no matter the timing. It falls to the press and publishers to take the 3dmark results and point out that the CPU score in 3dmark vantage will have little effect in games.

    The real benefits should be sought in games like UT3.

    My 2 cents anyway.
  • -6 | techguy911, June 26, 2008 12:05 AM
    Since when is using a gpu to increase the speed of math calculations cheating in bench marks? its not artificial its a REAL increase due to calculations being done faster in the gpu.

    The problem is ati is crying wolf because they didn't think of something like this first and don't have anything in the works.
  • +3 | njalterio, June 26, 2008 12:20 AM
The reason why this is cheating is because the gpu is being used to assist the cpu when only the cpu is supposed to be tested. Whenever a system with an Nvidia graphics card using the controversial driver is tested, the cpu score will be higher than its actual value.
  • +5 | KITH, June 26, 2008 12:25 AM
I'm thinking the point is that you don't automatically get the cpu processing boost and graphics at the same time. the difference between marketing claims and reality. it can do this and that but not both together necessarily.
  • +4 | Christopher1, June 26, 2008 12:31 AM
    KITH hits the nail on the head. The reason that this is such an absolute outrage is that in real life conditions..... physics processing and the other processing are going to be done AT THE SAME TIME.
    This is basically punking the software program and making it appear that a card is better than it actually is. Futuremark would do well to realize this, and do the physics tests and the other tests that cause the controversy AT THE SAME TIME from now on, so that there can be no punking of the tests.
  • +2 | njalterio, June 26, 2008 12:32 AM
    ^Exactly
  • +4 | kaldemeo, June 26, 2008 1:13 AM
    the problem is that nvidia owns PhysX.. I really hope the game industry in the future will choose a open standard and not PhysX
  • +4 | xBruce88wXx, June 26, 2008 1:29 AM
    ... on nVidia's site, the link you gave for the physx download, the "Products supported" tab only lists; GeForce GTX 280 GPUs, GeForce GTX 260 GPUs, GeForce 9800 GTX GPUs, and AGEIA PhysX Processor (All). It does not list any of the 8 series cards. Screenshot
  • +16 | porksmuggler, June 26, 2008 1:39 AM
    Hey Tom's you would be better off just presenting the official statements from each company. ATI and nVidia get an equal share in most the systems I build, and fanboy rants like Theo's really destroy this site's credibility (like the switch to Bestofmedia hasn't enough already). nVidia's intent is obviously to manipulate the benchmark, regardless of any discussion of real world performance.
  • +1 | chesterman, June 26, 2008 2:04 AM
hey, i have a 8800gts320mb and i'd be rlly rlly happy if my card have the support to physx, but in the read me of physx driver and the beta 177.39 driver says that only the GeForce GTX 280 GPUs, GeForce GTX 260 GPUs and GeForce 9800 GTX GPUs supports the new feature. after all, my card have or dont have the support for physx?
  • +4 | GT-Force, June 26, 2008 2:22 AM
    Yep. nVidia was saying that PhysX will support 8800 and above, but no dice so far.
    Tom's crew should read better before they post an article!
  • +2 | nukemaster, June 26, 2008 2:27 AM
    Nvidia will be making the drivers available later for your gts 320 and all 8 and 9 series cards. just wait it out. Now if they can fix the 8800GTX + Vista 64 glitches with random MMO(i know many are no name games, but this should not happen) games and QUAKE(the first one runs like ****. no joke, full screen actually frames drops when you pick up items. This was tested with an alternative card and all problems do not happen)
  • -4 | creepster, June 26, 2008 5:33 AM
I think people are a little confused as to how Vantage tabulates the scores. it is clear you have never used the program because CPU and graphics performance is tabulated from all the benchmarks, the first 2 are weighted towards graphics and the last 2 are weighted towards CPU, but all 4 benches contain many many physx effects.

    In the first bench you have the cloth covering the boat that is a physx process, the clothing on the models is all physx enabled, most notably the white shirt as it moves with that girls giant jugs. I'm not sure if the water has any physx effects applied as well, I wouldn't be surprised if it did

    in the 2nd bench there are thousands of asteroids on screen all bouncing around, that is all physx calculations right there.

    the new driver and software improves frame rates across ALL tests even the ones that are geared more towards graphics. the problem is 3d mark is giving extra points to the cpu when in reality it should be giving those points to the graphics card.
  • -3 | dragonsprayer, June 26, 2008 6:40 AM
    Nvidia cheats again? wow are we suprised?

    Nvidia is the dirty player, while all the goverment officials investigate intel for discounts not to use amd - which i think is fair! Nvidia cancels SLI for intel chipsets for almost 3 years now - ask yourself why is there no investigation of an SLI monopoly?

    Nvidia cuda is the most exciting procesing technolgy since X86, mean while nvidia still plays dirty with no regard to the public!

    Did nvidia buy ageia to help it's customers? Then why no intel chipset SLI drivers?

    Can you believe AMD lets intel build crossfire but nvidia does not licence SLI?

    I got a message for nvidia - when the 4870 you can say byby to 9800GTX sales from me! When the 4870X2 comes out we say byby to the 9800GX2 and GTX 280! Go make super computers and get out of gaming!

    Nvidia only cares about Nvidia and not the customer that is fact!
  • 0 | Anonymous, June 26, 2008 8:14 AM
    It's not cheating get over it. AMD will soon do the same:

    ATI falls for PhysX
    http://www.overclock3d.net/news.php?/gpu_displays/ati_falls_for_physx/1

    The bigger question is when is this available for 8800 cards? Tom's last paragraph says:

    "If you own a GeForce 8800, 9600/9800 or GTX 200 series, we can only recommend a download of the latest drivers and PhysX software when they become available and start playing those levels in Unreal Tournament 3, Ghost Recon and other PhysX games."

    But when you check the 177.39 Beta driver page it says:
    "Adds support for NVIDIA PhysX Technology for GPU PhysX hardware acceleration on GeForce 9800 GTX, GeForce GTX 280, and GeForce GTX 260 GPUs."

    You got my hopes up too early Tom's :(  :( 
  • +9 | TheGreatGrapeApe, June 26, 2008 8:25 AM
    So Theo, why didn't FM give you this nice little tidbit they gave hothardware?
    http://hothardware.com/cs/forums/t/39136.aspx
    "Outside of this matter, we have been introduced to this technology from NVIDIA and it is truly innovative for future games and game development. As you know, we have an actual game coming as well and it could also make use of PhysX on the GPU."
    Guess you wouldn't want to make too much out of a product you're licensing for your game. Don't want those fees to go from the 'Free' to 'Moderate', eh?

    Theo, I'm also curious, if ATi's changing the order of the shaders to run better on their hardware without changing the end result was considered an offense by FM, how would this be any different in the future, where they would have to reconfigure how the code was handled in order to work on the GPU.

Essentially this invalidates Vantage for anything else other than nV to nV comparisons. However how many reviewers do you think will stop using it or even bother to check whether drivers are BDP approved.

    Bungholiomarks, who cares, stop using in reviews. More than ever it's nothing more than a pretty cut-scenes and an internal stability check.

    BTW, just on some facts, Brook GPU did physics before ATi and nV started pimping it on their own, and ATi has multi-GPU in their Evans and Sutherland SimFusion rigs long before nVidia did, and the only way they got their SLi (not SLI) to work on their rigs was to use ATi and Metabyte's IP in using AFR and SFR formats instead of 3DFx's dead SLI.

    Personally I hate Micro$oft, but more than ever I wish they had stopped dragging their feet on DirectPhysics and brought an agnostic physics API to the market instead of having these IHV-biased solutions.
  • 0 | Horhe, June 26, 2008 8:33 AM
    I don't understand, if physics are calculated on the GPU, doesn't this mean less power for graphics?
  • 0 | darthvaderkenneth, June 26, 2008 8:34 AM
    i hav a 9800gx2 but it says 9800gtx or gtx 200 series so can i download it?
  • -5 | wh3resmycar, June 26, 2008 10:18 AM
    ati fanboys floods em more with your rant...

    you guys are just envious because our 8s and 9s do physics while your 2s and 3s cant..

    oh btw, wait for the 55nm gt200, i aint talking about the gtx+ mind you.
  • 0 | choirbass, June 26, 2008 10:39 AM
    article aside; the current downloadable 177.35 drivers can be used by any cuda based nvidia gpu, even if the setup app indicates otherwise. just go through a manual .inf install in device manager, and itll end up showing your gpu as something its not (eg. 8800GT is instead recognized as GTX280 afterwards). not that the model mislabeling matters, its the performance improvement that does.