The Scientists' Opinions on Gaming Physics

Convincing The Consumers

Discussions about separate processors handling physics (and other compute-intensive aspects of games such as artificial intelligence) have been around for over a decade, but only now do we finally have the real thing. So how come the reception has been so mild? From a consumer's point of view, the main issue is what performance you get for your money. Coughing up $300 for another component needs to be justified: it has to result in a clearly better gaming experience. And as a vendor, you have to present the product in a genuinely convincing way - something Ageia's PhysX card has yet to manage, at least according to the testing done so far, where the added physics only seems to slow games down further. The apparent selling points have so far been few, if any at all.

Simply put, physics is how "stuff" behaves when it interacts with other "stuff". Presenting physics isn't as straightforward as presenting graphics - you won't get that "wow" effect just by showing a couple of screenshots or a short gameplay movie. You need to experience it first-hand. The physics is there, but invisible until you start moving and touching things.
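
To make the idea concrete, here is a minimal sketch (with names and numbers of my own choosing, not Ageia's actual solver) of what a per-frame physics update does: integrate velocities and positions for every object and resolve collisions. The point is that this work happens every frame, for every simulated object, whether or not anything looks different in a screenshot.

    #include <cstdio>
    #include <vector>

    // A generic sketch of a per-frame physics step (not Ageia's solver):
    // every simulated object gets its velocity and position integrated each
    // frame, plus a simple collision response against the ground. This is
    // the per-object, per-frame work a physics processor is meant to take
    // off the CPU.
    struct Body {
        float y;   // height above the ground (m)
        float vy;  // vertical velocity (m/s)
    };

    void stepPhysics(std::vector<Body>& bodies, float dt) {
        const float gravity = -9.81f;    // m/s^2
        const float restitution = 0.5f;  // how much energy a bounce keeps
        for (Body& b : bodies) {
            b.vy += gravity * dt;        // integrate acceleration
            b.y  += b.vy * dt;           // integrate velocity
            if (b.y < 0.0f) {            // hit the ground: bounce back
                b.y  = 0.0f;
                b.vy = -b.vy * restitution;
            }
        }
    }

    int main() {
        // A few hundred crates dropped from 10 m, simulated for one second
        // at 60 steps per second.
        std::vector<Body> crates(300, Body{10.0f, 0.0f});
        for (int frame = 0; frame < 60; ++frame)
            stepPhysics(crates, 1.0f / 60.0f);
        std::printf("crate 0 is %.2f m above the ground after 1 s\n", crates[0].y);
    }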

Yes, Ageia tries to show off physics with Cell Factor, where you get a massive number of objects to interact with in the game. But the added physics seems to slow the graphics down, and that isn't a strange consequence at all: add hundreds of new objects that all need rendering to scenes that are already graphics-limited in some way, and that's what you get.

Cell Factor is accelerated by Ageia's PhysX.
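
Why that happens can be shown with a back-of-envelope calculation - the numbers below are assumptions picked for illustration, not measurements of Cell Factor or the PhysX card. Even if the physics itself were computed for free, every extra object still has to be drawn, and on a scene that is already close to GPU-limited that alone eats the frame budget.

    #include <cstdio>

    // Back-of-envelope illustration with made-up numbers: the render cost
    // of the extra objects, not the physics math, is what pushes the frame
    // time past the 60 fps budget.
    int main() {
        const double frameBudgetMs = 16.7;  // target: 60 fps
        const double baseRenderMs  = 14.0;  // assumed cost of the original scene
        const double msPerObject   = 0.01;  // assumed render cost per extra object
        const int counts[] = {0, 200, 500, 1000};
        for (int extraObjects : counts) {
            double frameMs = baseRenderMs + extraObjects * msPerObject;
            std::printf("%4d extra objects -> %.1f ms/frame (%.0f fps)%s\n",
                        extraObjects, frameMs, 1000.0 / frameMs,
                        frameMs > frameBudgetMs ? "  <- misses 60 fps" : "");
        }
    }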

Ageia is also facing another issue: both ATI and Nvidia want us to know that they, too, can do physics. They were quick to show off demos with thousands of moving objects computed in real time by their regular GPUs instead of a separate physics unit. This, of course, came accompanied by big numbers that in theory should prove their technology is better than Ageia's. But are bigger numbers all there is to it?

The short answer is "no". Compare a 3D game running with and without a graphics card and the difference is quite apparent. Today, stunning visuals can be drawn in real time, without any pre-rendered scenes. But then again, these cards have been around for ten years now, and the technology has evolved quite a bit since it first arrived. Without a 3D card, any modern game would bring even today's high-end CPUs to their knees, which makes it clear that a GPU is a good investment.

Even sound benefits from dedicated processing, since it takes some load off the CPU - although not nearly as much as graphics does. So adding another processor to handle physics does make sense. Offloading the CPU is always a good thing, especially if it also opens up the possibility of making games more realistic.
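
As a software analogy (a sketch of the general idea, not of how the PhysX hardware actually hooks into a game engine), running the physics step in parallel with rendering hides its cost in much the same way a dedicated processor would take it off the main CPU core:

    #include <chrono>
    #include <cstdio>
    #include <future>
    #include <thread>

    // Stand-ins for a real physics solver and a real renderer; the sleeps
    // just represent work that takes a fixed amount of time per frame.
    static void simulatePhysics() {
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
    }
    static void renderFrame() {
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }

    int main() {
        for (int frame = 0; frame < 3; ++frame) {
            auto start = std::chrono::steady_clock::now();
            // Hand the physics step to another core, render in parallel,
            // then wait for both before starting the next frame.
            auto physicsDone = std::async(std::launch::async, simulatePhysics);
            renderFrame();
            physicsDone.wait();
            double ms = std::chrono::duration<double, std::milli>(
                            std::chrono::steady_clock::now() - start).count();
            std::printf("frame %d took %.1f ms (physics hidden behind rendering)\n",
                        frame, ms);
        }
    }

A dedicated card takes the same idea one step further: the work leaves the CPU entirely instead of just moving to another core.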

But here's where things get a bit tricky...