
Can Ageia's PhysX Card Bring Real-World Physics to Games?

Tags:
  • Memory
  • Games
  • PhysX
June 19, 2006 10:19:16 AM

Ageia says its new physics processing device and engine will do nothing less than revolutionize PC gaming. We gauge just how well Ageia's PPU (physics processing unit) handles flying debris and shrapnel in Ghost Recon Advanced Warfighter, which you can also see for yourself by downloading our video.


June 19, 2006 11:02:07 AM

As I stated in the 3D gaming notebooks thread ->> http://forumz.tomshardware.com/mobile/modules.php?name=Forums&file=viewtopic&p=192302#192302

It's largely a question of whose technique will be most successful in this "physics war". The main problem, if it persists, will be that some percentage of games will sport PhysX and some percentage Havok. If Nvidia only supports Havok in SLI mode (and ATI does the same), then you will need a PhysX card to get the most out of the PhysX games and SLI/Crossfire to get the most out of the Havok games, a very costly solution.

Worse would be if ATI goes with PhysX support for Crossfire, meaning an ATI-based system would have no Havok support.

Possibly the best outcome would be one of the physics companies going under (not likely for Havok, seeing that an already fairly large installed base will support its physics rendering when ForceWare 90 comes out), or a buyout resulting in a merging of technologies.

Which comes back to the point I made in the gaming laptops thread: with potentially three chips in your computer able to do the physics work, would you be able to set a primary/secondary chip for physics? If you have a dual-core CPU, SLI, and a PhysX card, could you set the PhysX card as the primary physics device, with the CPU as secondary and spare GPU capacity as tertiary?

Perhaps this sort of setup would help prevent some of the slowdowns, by keeping the "primary" physics card from getting bogged down and using the other available resources for the remaining work.
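To be clear about what I mean, something like this (a hypothetical sketch — these device names aren't any real driver API, it's just the fallback idea):

```python
# Hypothetical sketch of a primary/secondary physics device chain.
# None of these device names correspond to a real Ageia/Nvidia driver
# API; the point is just the priority/fallback idea described above.

def pick_physics_device(available, priority):
    """Return the highest-priority physics device actually present."""
    for device in priority:
        if device in available:
            return device
    return "cpu"  # the CPU can always do the work, just more slowly

# A rig with a dual-core CPU, SLI, and a PhysX card:
rig = {"ppu", "spare_gpu", "cpu"}

# User-chosen order: PPU first, CPU cores second, spare GPU capacity third.
order = ["ppu", "cpu", "spare_gpu"]

print(pick_physics_device(rig, order))      # the PPU gets the job
print(pick_physics_device({"cpu"}, order))  # no PPU or SLI: falls back to the CPU
```

The driver would just walk the user's preferred order and hand the work to the first chip that exists, which is all the "primary/secondary" setting really needs to be.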
June 19, 2006 11:16:28 AM

I think the whole physics fad is really interesting, and it may eventually make gaming more fun. How it all shakes out is not really important to me right now.
Here is my issue: "real-world physics" is not what people are looking for, or expect. They are looking for Hollywood action-movie physics. Please don't get me wrong, my intention is not to burn anyone; this is just my observation. I have a friend contracting in Iraq. I am reasonably certain that if he gets into a sticky situation, he will be aiming for center of mass, not paying homage to Atari's Combat with trick shots. At least, I hope not.
That's my 2 cents, thanks.
June 19, 2006 11:18:48 AM

Yeah, the PPU is a bad idea right now. It's got some major problems and only causes games to slow down. Ageia had better find a solution fast, before Nvidia or ATI come up with something better.
June 19, 2006 11:50:09 AM

What I'm failing to understand is how Ageia sees the adoption scenario unfolding in their favor.

Extra CPU cores are becoming the norm very quickly, and it's inevitable: they're spreading far faster than custom physics hardware, which is too new to even call a fad.

A solution that uses extra cores will be vastly more successful than one that requires a $300 card. Devs will not be interested in pigeonholing themselves into a solution that restricts sales to 500 people because of the low saturation of dedicated cards.

This is what will kill Ageia. And it's so obvious that it makes my head hurt that anyone thinks they can succeed on their current path.

Ageia has the mindshare right now, thanks to a ton of work and money spent in the marketing channels. But their product faces too much of an uphill battle.

Frankly, Havok is the solution most likely to win the battle here, but Ageia is the current darling, due to focusing more heavily on marketing and having the "only PPU" out there.

But if Havok can get the message out to the audience that they offer the same as Ageia does, without the $300 card, then Ageia is done for.

Ageia's only chance of success is to start selling the cards for $50 and get the chip added onboard new motherboards. Only then will it become widespread enough that devs will lock their games to needing a PPU. Then Ageia can focus on selling upgrades to better units.

But even this is ultimately a weak stance, because unlike GPUs, people want their PPU to CHANGE gameplay, not just make it prettier. And the amount of gameplay change attainable is limited by A) the presence of a PPU and B) the capabilities of the PPU present. The likelihood of ANY game taking full advantage while a non-PPU system plays alongside a PPU-on-motherboard system alongside a second-generation PPU system is too far-fetched. Games will have to play to the lowest common denominator, and that is the trap that makes PPUs a troubled market before the market even really exists.

Ageia has been spending a LOT of money to build its name and create the market for PPUs, so that they can in turn provide the solution to the problem. But it's a false problem. Now and in the future, games = online. And online means you have to accommodate all the players, not just the ones with a PPU.

Just because Ageia can build a PPU does not mean it's the right way to bring physics to gaming. The solution that can succeed is the right solution, not the one with the bigger marketing budget.

Ageia's sales pitch to devs: "Hey, over 1,000 people have Ageia cards now. Use our system to revolutionize your game and you might generate as much as $20,000 in gross sales. And even more people might spend $300 out of the blue to get our card."

Nvidia/ATI's sales pitch to devs: "Trust us, we can talk gamers into paying an extra $500 for a second or third GPU. I know that less than 1% have it now, but trust us, we have big names."

Havok's sales pitch to devs: "A third of your potential customers have multi-core systems, and more buy them every day. Use our system and you can revolutionize your game for the masses and achieve maximum sales."

Now honestly, where do you think success lies?
June 19, 2006 12:20:01 PM

Even "Hollywood" physics are based on "real-world" physics.

For example, in "Hollywood" physics, if a door is blown off a car it will very rarely end up standing on its edge (a poor example of a PhysX attempt). It may fly further than you would generally expect, but its movement, bouncing, spinning, etc. would be what you would expect from something of that size, shape, and weight travelling at that speed and trajectory.

Change the weight, speed, shape, or trajectory, and the resulting movement should be what you would expect from those changes.

Just from that it sounds complex, but that's nothing compared to the number of calculations required for that sort of movement to take place, especially considering that where the main force hits, what part of the object strikes the ground and bounces, and the resulting aerodynamics are all variable, and all affect the number and complexity of the calculations taking place.

Going back to your real-world example, the physics involved in those situations is fairly immense. A bullet has an arcing trajectory; as far as I know, no current games take this into account, bullets just fly straight. I would find this alone very interesting, and it would make games much more enjoyable because, as in real situations, you would have to account for wind speed and distance on long-range shots.

How bodies react when collisions occur is another real-world example. In the real world you may not be watching and thinking "wow, that looks so realistic and cool", but in a game it is just another thing that, when noticed, holds the player's attention and captivates them.

All these "real-world" physics examples are the base of "Hollywood" physics: drop the weight of an item and suddenly it flies twice as far; explosions blow bodies higher into the air and over greater distances, all while running the same set of physics calculations, so the actors react as you would expect.

The idea is to make things happen as you would expect them to happen, not for items to react unnaturally. If it's unnatural, it generally sticks out like a sore thumb; if it's natural, you won't notice it as much: you will see it, it will do what you expect, and you will be satisfied and forget about it, unless you're paying close attention to make sure it does what it should. Whether you realise it or not, your mind is doing a huge number of rough physics calculations itself while watching these games, which is why bad physics is so obvious: your mind predicts one rough outcome, something completely different occurs, and it's like an error pops up in your head.

One prime example is the collision handling in Oblivion. Whenever you run into a movable object, it reacts in an extreme manner, like flying across the room and slamming into the wall with enough force to bounce about ten feet back. One time a piece of wood even spun around on one end for about half a minute before falling over! You notice it because it's not what you would expect to happen; you laugh because it looks funny, but in the end it's a loss for the physics engine (or the values going into it), because an incorrect calculation is producing unnatural effects.

I suppose to concisely sum up this rant, I'd say this should give you some idea why physics is so costly (at least resource-wise!) to implement, and how, when properly implemented, the results are not stunning unless you're paying a lot of attention to what is going on; otherwise it will just be what you expect, and nothing out of the ordinary.
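Just to give a feel for the bullet-drop case above, here's a toy sketch (my own made-up numbers and a crude Euler integration, nothing like a real exterior-ballistics model): even this bare-bones version has to update six quantities per projectile, every millisecond.

```python
# Toy ballistic sketch: constant gravity plus a crosswind push.
# All numbers are invented for illustration; a real model would
# add drag, air density, spin drift, and so on.

def fly(muzzle_speed, wind, height=1.5, dt=0.001):
    """Integrate until the bullet hits the ground; return (range, drift)."""
    x, y, z = 0.0, height, 0.0          # downrange, height, lateral drift
    vx, vy, vz = muzzle_speed, 0.0, 0.0
    g = 9.81
    while y >= 0:
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vy -= g * dt                    # gravity pulls the round down
        vz += wind * dt                 # crude crosswind acceleration
    return x, z

rng, drift = fly(muzzle_speed=850.0, wind=2.0)
print(round(rng), round(drift, 2))      # roughly 470 m downrange, ~0.3 m of drift
```

Multiply that by hundreds of debris fragments bouncing and colliding every frame and you can see where the calculation cost comes from.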

--edit--
silly spelling mistakes
June 19, 2006 12:24:24 PM

Why don't you people just give it a chance? If you don't have money, don't buy it. If you have money and want to spend it on something else, do that. It's pointless to complain about what it can or can't do; it's just new technology, and you can't expect much from it yet. You haven't given it any chance to mature or become optimized. Why not wait till UT2007 comes out to see whether it's worth buying? Until then, just sit and wait instead of complaining about how bad it is.
June 19, 2006 12:33:21 PM

I actually like ATI's products a lot, but I'm disappointed with their idea of using a third dedicated GPU for physics processing. I thought the AGEIA PhysX card was a bad idea, and now ATI wants me to buy a third video card for physics? I don't care what you want to call it; it's just as bad as buying the dedicated physics card.
June 19, 2006 12:44:06 PM

We aren't complaining so much as discussing. As you say, so far there is too much unknown and too much that is untestable to justify purchasing a PhysX card, or, if you don't have one already, a second graphics card or a multicore CPU for the physics.

The fact of the matter, though, is that physics calculations are going to get more complex, and one of these solutions will be required to keep up. So far it doesn't look like the PhysX card will cut it, because of what appear to be some serious limitations in the implementation.

It will take more than one game to show the potential of this and the other solutions; it will also take maturity of the system, both of which take valuable time. The Havok solution will have the easier rollout: if you have SLI cards, just upgrade the driver and it goes, and assuming all Havok-engine-driven games are supported, there will immediately be several testable titles to run against.

I just hope Nvidia offers an option for a primary/secondary physics device. If I get a quad-core CPU, I'm sure I'd rather its cores do more of the physics work than the SLI cards; let them deal with the rendering!
June 19, 2006 2:18:43 PM

Physics is the future for all gaming, not as big as graphics, mind you, but a separate physics solution is needed to provide the best quality. Obviously I would like to see Ageia make a breakthrough, and it is way too early to tell if it will. Consider that it does have some big games, UT2007 for example. Dedicated solutions have always worked better than software or driver solutions, and ATI's and Ageia's both look promising, but just like quad SLI, the software for them has to be worked out.
June 19, 2006 2:32:19 PM

The issue here is that it is not even bleeding-edge tech, it's just bleeding - the one game that uses it doesn't represent it well, there are few others on the horizon, and it is very expensive to boot. Not a good combo - it seems like Ageia thought they could get by on the novelty factor until they improved the product and the number of supported games. It's almost like they are charging $299 to be a beta tester - ouch...
June 19, 2006 2:55:16 PM

Excellent way of putting it. Why should I spend money to be a guinea pig for a product that does not yet (essentially) work?

PhysX seems to be not a niche market, but a non-market. What niche does it fill? Nobody can use it effectively, and we don't know if it will actually start being used significantly in the future.

$300 US is insane for such a product; I can buy a video card that will reasonably run any game on the market for that price. This product is currently supported only by a game I'm not going to buy. I knew the result of this article before it was written, because I thought to myself, "what possible need do I have for this product?" At least with the Voodoo add-on cards, I saw immediate, significant, tangible increases in performance.
June 19, 2006 3:16:50 PM

The PhysX card is just a waste of money, as well as power and space in one's rig. Besides, it only supports a handful of games; it would be much better if it were backwards-compatible with older games such as CoD2. It seems that Ageia is trying to take over the "real-world physics" scene in games, but also that they haven't spent enough time creating this piece of hardware. My verdict: wait until ATI or Nvidia come out with something better.
June 19, 2006 3:41:26 PM

While it's not worth it right now, I'm hoping the PhysX card succeeds because it's the only implementation that's not just eye-candy. The problem with Nvidia and ATI's solutions is that the resulting physics calculations don't affect gameplay. PhysX has the possibility for in-game physics to alter the gameplay and I think that could be really awesome.
June 19, 2006 4:03:03 PM

I'm just wondering how physX will be in AMD's 4x4 era......HT link to a dedicated PPU, anyone? 8O
June 19, 2006 4:23:25 PM

They should have used a PCIe x1 slot instead, since you could get close to twice the bandwidth of a plain PCI slot.
June 19, 2006 4:44:29 PM

In my opinion this is just another idea for how to get more money out of PC users. I fear that if Ageia's popularity rises, it's just a matter of time before you'll have to buy 10 different cards to play anything more than Minesweeper. Motherboard producers will follow, and try to imagine the size of your "desktop PC" then. ;-)
June 19, 2006 5:06:26 PM

Isn't this the kind of technology that could be added to a graphics card, or even built into the motherboard? This just sounds like another way for people to waste their hard-earned money and get very little in return. Advertisements can be very seductive and very deceiving. Don't believe the hype!
June 19, 2006 5:31:57 PM

Quote:
Isn't this the kind of technology that could be added to a graphics card, or even built into the motherboard? This just sounds like another way for people to waste their hard-earned money and get very little in return. Advertisements can be very seductive and very deceiving. Don't believe the hype!


read the whole article...
June 19, 2006 5:47:18 PM

I actually liked the article. Thanks. :) 
I was waiting for this article for a long time.

It had some important issues, like the old "clipping" problem that never seems to get fixed. The more realistic weight causes very unrealistic leaning angles. Frame rates are slower, etc.
I think all the PhysX card hype is a small step in the right direction, but there is a LOT more work to do than just adding some hardware. The problems also seem bigger than "Havok vs. Ageia" or "1 card vs. 3 cards", etc. (they're all far from ideal).

I am hoping developers can start taking physics more seriously once they know the hardware can handle the extra load, and users are expecting more realistic physics along with more object interaction.
In a few years there might even be very usable physics hardware on the motherboard, like AC'97 audio or cheap video with shared memory.
June 19, 2006 6:27:59 PM

Seeing as there is a hack for Ghost Recon that enables the same effects without a PhysX card, I don't see their solution coming out on top.

It will be interesting to see what UT2007 gets from the PPU.

Actually, I'm a bit surprised that MS hasn't come out with a DirectX standard for physics.
June 19, 2006 6:29:55 PM

The solution always seemed obvious to me: ATI and Nvidia need to slap a PPU onto all their new cards, and then all game developers will support it. ATI proved with the All-in-Wonder series that they could combine two hardware components, so why can't they do it now?

I honestly thought ATI's developers were out of their minds when I saw they wanted us to buy three video cards to get physics.
June 19, 2006 6:36:48 PM

The "old clipping problem" points out another weakness with the PhysX card, or any other hardware solution: if the game isn't written to allow it, the hardware can't fix it. Doors that don't fall over, clipping, and other oddities are part of the game's software. The way I see it, and I may be wrong, the PhysX card will do more to expose bugs and poor software code than it will to actually fix the base problem and give realistic explosions, etc.

I think the solutions from Nvidia and ATI will offer more in the long run than the PhysX card, and their solutions will force the game companies to write games that use the hardware. Time will tell, so we can only wait and see at this point. For now, I'm not buying anything.
June 19, 2006 6:55:21 PM

A particle-effects accelerator, yeah. Maybe if software comes out that really shows something interesting I would consider one, but sure as heck not for a few extra bits of shrapnel.
June 19, 2006 7:06:24 PM

First of all this card debuting at $300 is not just expensive...it's utterly insane. Paying $300 for something that at present is little more than a paper weight is mind boggling to me.

I don't know what the marketing geniuses at ageia are thinking, but this company is not going to go far with this kind of planning.

This is not a graphics card. It is not essential for gamers. At present it's a novelty accessory at best.
Most people are not willing to pay $300 for a graphics card, so what makes you think they will pay that for a useless accessory like this?

This item should've debuted in the $50-$100 range, and even then it would've had a hard battle ahead (just look at how many people actually spend that kind of money on sound cards... not many, as most people just go with the onboard option).

Second, although Ageia is desperately awaiting the release of UT2007, I don't think it will give them any sort of boost in sales.
UT2007 is a multiplayer game, and many of the LAN problems outlined in the article will be present there too. All the physics eye candy will be client-side, which means most people will turn it off anyway, because it will most likely put them at a disadvantage. Nobody wants their view obstructed by a pile of smoke and debris while their opponent has a perfectly clear view.

The only way I see this product succeeding is if Ageia either slashes prices to reasonable levels (around $100) or, even better, goes for the integrated option, on either the motherboard or the video card.
Otherwise they should start looking for new jobs…
June 19, 2006 7:08:37 PM

I love the idea of physics acceleration in general, but Ageia is in a bad position, to say the least. They are trying to establish a market here, knowing that nVidia and ATI (800 lb. gorillas, both of them) will respond, but they lack the track record that would convince a lot of developers to support their product. UT2007 is a big release to be sure, but if it is not a killer app for PhysX, or if it is delayed much, Ageia may well be done early in this game. The fact that nVidia has gone with an existing engine (Havok FX) is problematic for Ageia as well; one would think it is easier to modify game code for an established, more widely used product. I wish them luck, but it sounds like David needs to slay both Goliath and his younger brother to win this battle...
June 19, 2006 7:13:11 PM

Quote:
I actually like ATI's products a lot, but I'm disappointed with their idea of using a third dedicated GPU for physics processing. I thought the AGEIA PhysX card was a bad idea, and now ATI wants me to buy a third video card for physics? I don't care what you want to call it; it's just as bad as buying the dedicated physics card.


You may be disappointed, but in reality it's to be expected from both ATI
and Nvidia, with their more-is-better (SLI, Crossfire, quad) attitude.
June 19, 2006 8:19:05 PM

I am a programmer, but not a game programmer.
Anyway, I don't know exactly how games are made now.
I guess for physics there is an API and a driver for the hardware.
The game programmer doesn't need to know EVERYTHING about physics or all about 3D graphics; most of the work is done in API calls while the hardware accelerates them. I never thought hardware could fix anything. (Garbage in, garbage out.)
Quote:

If the game isn't written to allow it, the hardware can't fix it.

So, to permanently fix physics problems, just use a good physics API (don't re-invent the wheel), fix the current APIs, and update the drivers. (This will be true for any company that makes special cards.)
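To illustrate the split I mean (a toy sketch with invented names; no real physics SDK looks like this): the game codes against a stable API, and which chip actually does the math is the backend's problem.

```python
# Toy illustration of the API/driver split described above.
# "PhysicsAPI" and "CpuBackend" are invented for this sketch;
# they do not correspond to any real SDK.

class CpuBackend:
    name = "cpu"
    def integrate(self, pos, vel, dt):
        # The driver layer hides who actually does this math.
        return pos + vel * dt

class PhysicsAPI:
    """The stable interface the game programmer codes against."""
    def __init__(self, backend):
        self.backend = backend  # a PPU or GPU driver could be swapped in here

    def step(self, pos, vel, dt):
        return self.backend.integrate(pos, vel, dt)

api = PhysicsAPI(CpuBackend())
print(api.step(pos=10.0, vel=2.0, dt=0.5))  # 11.0
```

Fix a bug in the API or the driver once, and every game built on top of it is fixed, regardless of whose card is in the slot.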
Just my 2 cents.
June 19, 2006 8:48:00 PM

I did read the whole article Dude! It was a rhetorical question :!:
June 19, 2006 9:45:42 PM

Hmmm... I'm having flashbacks to the 486/68000 days and the math co-processor you needed to do any serious number crunching (and which some games required).
Maybe AMD has something with the AM2 x 2 idea.
PhysX on a PCI card sounds like a step back, but as a co-processor it would have direct access to the GPU and memory.
June 19, 2006 10:21:43 PM

I would get it for $200,
nothing more, because $300 is too expensive.
June 20, 2006 7:08:12 AM

I think if many horses were eaten this would be better.
June 20, 2006 11:32:26 AM

When I first heard about a dedicated physics card, I thought that would be awesome... But after almost half a year I have seen nothing that has impressed me about Ageia's product...

Now here are some engines that have impressed me, and they don't need a physics card to run.

Dark Messiah uses a heavily modified Havok engine, and wow, it looks great and is going to have a great impact on gameplay.

The CryEngine 2... This game (Crysis) looks *beep*ing brilliant... The way the leaves move when brushing past, the way the truck drove through the wooden building, and the way the trees break when shot at... If you have not seen the engine in action, do yourself a favour and download the latest demo... And once again, no mention that you will need an Ageia PhysX card.

And last but not least, Hellgate: London also doesn't need a physics card... As a matter of fact, it will include support for the Havok FX physics engine... (One up for Nvidia) :D

These are some of the most awaited games of the year, if not the last two years...

So... I will not be buying an Ageia PhysX card until they can show me a game that blows me away like Crysis does... I don't want to see a demo that they have done...

Oh yes, and if you guys haven't seen this, have a look...

http://www.firingsquad.com/features/ageia_physx_respons...

It's what Havok had to say about Ageia concerning Ghost Recon Advanced Warfighter...
And did anyone notice that Havok played a bigger part in GRAW than Ageia did...

All I want now is for the Havok engine to utilise the second core on my X2; then I will be a happy *beep*er.
June 20, 2006 1:34:20 PM

Quote:
Hmmm... I'm having flashbacks to the 486/68000 days and the math co-processor you needed to do any serious number crunching (and which some games required).
Maybe AMD has something with the AM2 x 2 idea.
PhysX on a PCI card sounds like a step back, but as a co-processor it would have direct access to the GPU and memory.


Oh, the good old days! We were practically selling our bodies :lol: to get hold of a math coprocessor so rendering would finish quicker......

But anyway, I think a dedicated PPU is just the first step; it has to be integrated into more common hardware to become really widespread.
June 20, 2006 1:47:05 PM

FINALLY, somebody who gets it! The PhysX card is just a math engine, specialized and much better at it than the general-purpose CPU. If the game developers don't code it correctly, then there's nothing the PPU, CPU, or GPU can do about it. This is all just like the first 3D cards: it took a year or two at least before everyone figured out what to do with them. Drivers will also evolve for this new hardware, for the same reason. Not every development team has the talent to do this, I suppose. Crytek of course being an exception, and a very rare bird. Far Cry is a couple of years old now, and there are still new games being released that don't come close to its level of realism.

And the whole Hollywood physics comparison doesn't hold. Movies just film our real world, so of course a door does fall correctly and boxes do splinter. If it's a Pixar film, their budget is 10 times a game budget and they take 4 years to make, so those animated films had better behave realistically, as the audience expects. Games will get there; it will just take time for the coders to learn how to do it. BUT they can never, ever do everything they want, because they want to sell as many copies as possible so they can stay in business and write the next game. So they have to write each game to perform at least adequately on mid-range systems as well as on $3000+ systems.

Unfortunately, GRAW was patched toward the end of development to support a PPU card and isn't that well integrated. Until they get a big title (UT2007 or CellFactor) that really makes full use of the card, it seems a bit weak. I know I don't see the value in having more than one graphics card, so I still lean toward a PPU. I hope they survive long enough for developers to learn to use it well. If that chip were on motherboards, that would be ideal: it would just be there when needed, and all the other software physics solutions could be out there as well.

As for the $300 cost, sure it's expensive right now. It's called earning back your development money. Just as FEAR comes out at $50 but eventually comes down to $19, the newest graphics card is $600 until newer models outdate it and prices come down, once the new core has sold enough to earn back what was put into it.

Quote:
The "old clipping problem" points out another weakness with the PhysX card, or any other hardware solution: if the game isn't written to allow it, the hardware can't fix it. Doors that don't fall over, clipping, and other oddities are part of the game's software. The way I see it, and I may be wrong, the PhysX card will do more to expose bugs and poor software code than it will to actually fix the base problem and give realistic explosions, etc.

I think the solutions from Nvidia and ATI will offer more in the long run than the PhysX card, and their solutions will force the game companies to write games that use the hardware. Time will tell, so we can only wait and see at this point. For now, I'm not buying anything.
June 20, 2006 4:02:24 PM

Again, I don't think UT2007 will help Ageia much. I think the addition of this card will almost always give you a performance drop.

The physics card only calculates the positions of objects and how they interact, but in the end all that extra debris still has to be rendered.
So the card takes away some CPU load, but it adds a lot more GPU load, because there is more junk on the screen. The PPU helps the CPU, NOT the GPU. It actually makes more work for the GPU.

The question is: is the CPU relief so great that it makes up for the GPU overload and gives an overall increase in performance? Most likely not.
Unless you have a top-of-the-line $3000 rig, and very few people do, your game performance is most likely bound by the GPU, not the CPU, so the addition of the physics card will almost always result in a performance drop.
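Here's the back-of-the-envelope version of that argument, with invented numbers, treating a frame as waiting on whichever of the CPU and GPU stages is slower:

```python
# Crude frame-time model for the argument above. All the numbers
# are invented; the point is only that a frame waits on the slower
# of the CPU and GPU stages.

def frame_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

# A GPU-bound rig without a PPU:
before = frame_ms(cpu_ms=12.0, gpu_ms=20.0)

# Add the PPU: CPU load drops, but all the extra debris raises GPU load.
after = frame_ms(cpu_ms=8.0, gpu_ms=26.0)

print(before, after)  # 20.0 26.0 -> the "accelerated" frame is slower
```

The CPU savings are wasted because the CPU wasn't the bottleneck; the GPU was, and the card just made that worse.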

Sadly, as much as I hate to say it, the only way I see hardware-accelerated physics going anywhere is if M$ gets in the game and incorporates a physics API into DirectX.
Currently the physics scene resembles the pre-DirectX graphics API scene: a million companies, each with their own API, trying to push their own solution.

If M$ incorporates a physics API into DirectX, then it is very likely to become the standard, and hardware companies will have a safer and more widely accepted set of specs to build hardware for.

Nvidia/ATI could then just incorporate a physics co-processor on their GPUs, or mobo manufacturers could put one onboard, much as they do with sound.
June 20, 2006 6:12:38 PM

I think Ageia's biggest mistake was not introducing a bug-free, "jaw-dropping" runtime environment, like Crysis' physics engine, into the market first, and only then presenting its PhysX processor.
You've got to get them addicted first and charge them later: once there is a background of many games using the runtime, you can launch many products supported by that background.
I think that was Voodoo's mistake too, and I can forecast Ageia's fate: either Ageia's stock goes to dust, or it gets bought by ATI (and becomes the third card :-) ).
But for certain, physics will be part of DirectX 11 =P and will be supported by software, graphics processors, physics cards, or even a PPU in a 4x4 socket (who knows).
The idea of gameplay with particle and fluid dynamics and without the clipping bug is a gamer's dream.
Hopefully it'll happen soon, so I can play my America's Army 3.0.0 with real-world physics =)))
June 20, 2006 7:21:24 PM

I would give this card another chance. I have spoken with Ageia, and they said themselves that Ghost Recon: Advanced Warfighter was just a glimpse of the power that could really come from Ageia PhysX. If you don't believe me, check out CellFactor, with real cloth and liquids making that "revolution" come true.
June 20, 2006 7:25:04 PM

I saw the CellFactor footage, and it's really impressive!
Maybe Tom's Hardware could give us a test with this game.
June 20, 2006 9:54:20 PM

Quote:
I think Ageia's biggest mistake was not introducing a bug-free, "jaw-dropping" runtime environment, like Crysis' physics engine, into the market first, and only then presenting its PhysX processor.
You've got to get them addicted first and charge them later: once there is a background of many games using the runtime, you can launch many products supported by that background.
I think that was Voodoo's mistake too, and I can forecast Ageia's fate: either Ageia's stock goes to dust, or it gets bought by ATI (and becomes the third card :-) ).
But for certain, physics will be part of DirectX 11 =P and will be supported by software, graphics processors, physics cards, or even a PPU in a 4x4 socket (who knows).
The idea of gameplay with particle and fluid dynamics and without the clipping bug is a gamer's dream.
Hopefully it'll happen soon, so I can play my America's Army 3.0.0 with real-world physics =)))


They actually have "gotten them addicted first" in a sense; you just haven't seen it yet. Rather than the hundreds of thousands of dollars a developer would have to shell out for a Havok license, they can turn to Ageia and get the PhysX SDK for free, including support. They already have over 60 developers on board incorporating PhysX into new games.

Developer support will continue to grow, since a free tool for adding physics to their products is a value-add for them, with less overhead than licensing Havok or developing their own solutions.

From what I've read, the ATI and Nvidia solutions being proposed are basically for particle effects, which I'm sure will look pretty but won't offer the kind of immersive effects that a real physics solution will.

IMO the real hurdle will be demonstrating physics effects in a tangible way to the general public.
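For readers wondering what a "real physics solution" actually computes every frame, here is a minimal sketch in plain Python. It is a generic illustration of rigid-body integration with invented names, not Ageia's actual SDK or API:

```python
# Generic per-frame physics work: semi-implicit Euler integration
# plus a trivial ground-plane collision response. Illustrative only.
from dataclasses import dataclass

@dataclass
class Body:
    y: float                  # height above the ground plane (m)
    vy: float                 # vertical velocity (m/s)
    restitution: float = 0.5  # fraction of speed kept on bounce

def step(bodies, dt=1 / 60, gravity=-9.81):
    """Advance every body by one fixed timestep."""
    for b in bodies:
        b.vy += gravity * dt   # integrate acceleration
        b.y += b.vy * dt       # integrate velocity
        if b.y < 0.0:          # hit the ground plane
            b.y = 0.0
            b.vy = -b.vy * b.restitution

# Drop a crate from 10 m and simulate one second at 60 Hz.
crate = Body(y=10.0, vy=0.0)
for _ in range(60):
    step([crate])
```

A PPU's pitch is running exactly this kind of loop, for tens of thousands of bodies with full 3D collision, on dedicated hardware instead of the CPU.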
June 21, 2006 3:47:11 AM

What would make me want to buy a PhysX card is if Ageia implemented some sort of software physics override that let me set the level of physics in ANY game (single-player, of course), much like the console and regular cheats for PC games. I would much rather have a simple slider than have to dig into documents to change numeric values.

I write this because this thread reminded me of playing Quake 2 on the high school LAN with low gravity. Aerial battles gave the game a whole new perspective, and it would rock to do the same thing in newer games. It also opens up new gameplay opportunities (i.e. playing "tennis" with a car in an FPS, shooting that car into a group of enemies to wipe them out, anything else you might be able to imagine).
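Quake-engine games already expose something like this through console variables such as `sv_gravity`. The slider idea above could be a thin mapping on top of whatever value a physics override layer pushes into the engine; this is a hypothetical sketch with invented names, not an actual Ageia feature:

```python
# Hypothetical "physics slider": map a 0-100 in-game slider onto the
# gravity value a software override layer could feed to the engine.
def slider_to_gravity(slider, normal=-9.81):
    """0 = zero-g, 50 = normal gravity, 100 = double gravity."""
    if not 0 <= slider <= 100:
        raise ValueError("slider must be between 0 and 100")
    return normal * (slider / 50.0)
```

The same mapping could scale other quantities (friction, explosion impulse) for the "Hollywood physics" effect discussed elsewhere in this thread.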
June 21, 2006 4:13:54 AM

I can see what people are saying: the card is $300, the effects you actually see are marginal, and there are flaws it does not correct (i.e. the clipping brought up in the article).

The PhysX card is most likely marketed toward enthusiasts, as opposed to the casual buy-a-7600GT gamer. Which isn't to say that people with that config would not buy a PPU; it's just that if you're going to spend $300 on a PPU, you're going to have high-end video cards too, which is most likely what you'll need anyway, since the extra debris will demand more effort from the GPU.

Another thing to take into account, if y'all remember back in the day, is the 3D accelerator... and look what happened.
June 21, 2006 4:21:13 AM

Quote:
Here is my issue. "Real-World Physics". This is not what people are looking for , or expect. They are looking for Hollywood action movie physics.


I think you hit the nail on the head. Real world physics of bullets, impacts and explosions are often too fast for the unaided eye to even see.

We don't need no stinking realities.
June 21, 2006 11:01:20 AM

Had to register just to respond to this thread! ;>

1. No game is out yet that shows the power of the PhysX card.
2. The game in this article had limited support for the PPU; they just slapped it in and added debris, etc.
3. It will be a long time before a game comes along that REALLY uses the power of the PPU in an awesome way, beyond added debris and bigger explosions.
4. Clipping is what the game devs program into the game. A PPU can't remove clipping or really do anything unless it's programmed into the game, so the people who think all a PPU can do is add some debris (because that's all we've seen) will change their minds once a game comes along that really uses the card to the max.
5. Taking over math loads is what the card is designed for: AI, physics, deformable and dynamic objects. A CPU (even the fastest ones today) cannot match the computing power of a PhysX card; they are built differently, and are really more different than some seem to think. If game developers learn to use this computing power properly, instead of just adding debris at the last second before release, system load will be drastically reduced.
6. There are several ways to utilize the card: more eye candy (which of course stresses your graphics card more), really smart AI never before possible, or games that haven't even been attempted yet because computers have too many bottlenecks. Lighting calculations, AI, objects, dynamic stuff; pretty much anything built on big math loads that would normally make any CPU die (yes, even the future quad cores). A CPU cannot be compared to a dedicated physics device in calculation power.

A lot of people don't really understand what it can be used for. Game developers are the ones who hail this card more than anyone, because it can allow them to do so much more than is possible today. Gamers, of course, want value for their cash; I can't disagree with that! :D Ageia did release the card badly and with too little support, but they're a small company and don't have the money the giants have; they probably released it because they had no choice but to stop waiting. The giants are helping out a bit, though, since they want this card to make it; they know what it can do for gaming, and most of the big titles coming will have support. How much they utilize the card remains to be seen. I'm sure hoping they use it better than the game in this article did. ^^

That's what's missing: a game that really uses the card, so people can understand it isn't a GFX card or a PCI CPU. It's a very powerful card that crunches math, and math is the biggest bottleneck in computers today, because CPUs aren't designed for it in the same way. MHz and memory figures can't be compared with a CPU's, since the designs are extremely different and made for two different jobs. A CPU is a general-purpose processor. Compare a regular car with a top fuel dragster: you drive the car to work and you race the dragster. They work roughly the same way, but under the hood there's more difference than meets the eye; nobody sane would drive a top fuel dragster to work every day, it wasn't designed for it, and a normal Volvo isn't something you race top fuel with. Might have been a bad example, lol! People might not get it.

The article was good; however, I felt the writers, and basically most people, don't understand the job a PPU is designed for. It's not designed to just add a little more debris to games; that was just the poor support the game devs provided. I really hope a game comes along that uses the card fully. Only then will people stop being skeptical and see what it was really meant to do. Basically nobody seems to know what it's made to do, and the article was kind of bad in that sense, since the writers didn't really sound like they knew what it could do either, beyond more debris; mostly complaints about a small company trying to give game developers more power to play with. ^^ Once the devs know how to use the card, and the card starts to show what it's capable of, the story will turn around, IMO.

Until then, don't buy it unless you basically want to help Ageia.
As for the ATI and Nvidia stuff: a GPU is better at these calculations than a CPU, but it doesn't come close to PhysX. It would help, but it would need upgrading every few months as the cards get more powerful, whereas a PhysX card has enough potential power for a couple of years, IF game devs start using it. Using the card to the max requires a lot more programming in games, which is one reason it will take a long time before we get to see (or should I say feel?) what the card can do, because it isn't eye candy this card is built for; it's game realism, having things happen in games the way they would happen in real life. Crash a jet plane into a forest and watch realistic fire burn down the trees, with every single part of the plane, the ground, and the trees reacting fully realistically to the crash! That requires loads of programming, so again it's up to the game devs to show the power they can bring out of the card. A CPU would fry attempting such a thing; a GPU would be able to show the graphics of it, but the calculations would only be doable on a PhysX. That said, you need a hardcore graphics card to enable the eye candy of a fully realistic plane crash, and a good CPU, in combination with the PPU; that's why they refer to it as the third wheel in the circle. Still, it depends on the game devs and how they utilize the card. They could make a hardcore FPS and use the card only for extremely realistic AI and objects; in that case there's no added eye candy, so it would reduce system load. On the other hand, some devs may decide to make a game with explosions everywhere and tons of textures flying around, which means added GFX work and may reduce performance, but then the purpose was to add to the gameplay.

It's up to the game devs to show the card's power and how it can be used; whether it goes toward extreme realism and calculations, or extra eye candy that loads your GPU even more, is up to them. As I said, it requires lots of programming time, so wait until some dev spends the time to explore it. As for the first game with PhysX support, the support stinks, people think this is all the card does, and so the card gets dissed because one game used it to add a bit more debris. Reviewers don't really help either when they say the card is meant to improve performance; that's not necessarily true, it depends on how the devs use it. Most will use it to add eye candy, others will use it for AI, and the two load the system differently. Of course more textures hurt performance, but in that case the card was used for gameplay and the game experience: not relieving the system of calculations, but adding more calculations for gameplay purposes.

Oh well, I'm done. Don't buy this card until you know and have seen what it can do, and that will take time, even more time considering it's a small company. But the big players are helping them out and seem to believe in them, so I will too. ^^ Just give it time and let game developers use the card to show what they can really do with it. The games will be amazing once devs start taking the extra time to utilize the card, but sadly development time = money, so again, it will take some time... ^^

Just wait and see. Sorry for the spelling and grammar, but at least you didn't hear me try to say it ;> that would sound bad... xD I hope some people stop thinking the card automatically does magic without the game devs putting in the time to program it (removing clipping, etc.).

edit: Oh, and Havok! If they gave that engine hardware support for PhysX, it would do everything it does now, except it wouldn't load your CPU or GFX card with the calculations; it would load the PhysX card, and that's what the card is for. The card isn't a standalone upgrade; it needs code and calculations to be useful. Dump Havok's work onto PhysX and they could evolve that engine, adding vastly more realism and calculation with the same system impact the original engine has. THAT is what the card is for. ^^ No software overhead, no CPU clogging, no bandwidth usage, no memory usage; just pure hardware! Remember the debut of 3D cards? It's kind of like that, but for calculations instead of graphics. Today those calculations are done by drivers, game code, APIs, etc., loaded onto the CPU, using lots of memory and system resources that are needed for tons of other things. Add hardware that takes that over and your game engine becomes vastly more powerful, because you can do a LOT more than you could before without impacting system performance or resources.

But that is only if the card is used that way by game devs and game engines. If it's used to cram in more explosions and textures than the old bottlenecks allowed, expect your GFX card and other resources to have more work from the added eye candy. (And lol @ the rendering comment by the guy above; it shows how little people understand this.) It adds load to the system if used one way and removes load if used the other, and it's up to the game developers. Havok is the perfect partner for PhysX to really show what it can do and why it was made; ask Havok's devs how much more they could do by loading the PhysX card instead of system resources. They've had to be creative to make an engine that works well within limited resources. With that creativity plus PhysX, system resources could be spared, the engine sped up, and more features added without impacting the rest of the system, so the hardware requirements for running games would drop dramatically. Release the limits devs have had on physics and AI and they could do magic with games: huge amounts of AI would make for some damn good games! Physics would get a boost, with realism like computer-generated movies but in real time; not the graphics, but the movements and flows, everything that needs calculation power: true realistic lip sync, true hair movement, cloth, fluids, grass, clouds, cars with real mass whose impacts feel real. The list is endless, and today those things are computed on a very limited CPU, using memory and system resources they don't need to touch. Realistic water sprays, waterfalls, and cloth would still use graphics power, but not the limited CPU or memory, and the speed at which the calculations could be done would improve enormously.

So don't confuse things, and wait for the power to actually be used. ^^ Havok is the perfect way to show what the card can do to relieve stress and bottlenecks, basically removing the bandwidth problems that force AI, dynamics, real liquids, etc. to be simplified. Showing off PhysX with a few extra debris particles is the biggest flop ever xD; it's like taking the most powerful GFX card today and playing a ten-year-old 2D game on it, except this is about calculations and physics rather than graphics.
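The "dump Havok's calculations onto PhysX" idea described above is essentially a backend abstraction: the engine code stays the same, and only the place the math runs changes. A hypothetical sketch with invented names (not Havok's or Ageia's actual APIs):

```python
# Hypothetical physics-backend abstraction: identical results whether
# the math runs on the CPU or is handed to dedicated hardware.
class SoftwareBackend:
    name = "cpu"

    def integrate(self, positions, velocities, dt):
        # Plain CPU loop; competes with everything else for cycles.
        return [p + v * dt for p, v in zip(positions, velocities)]

class HardwareBackend(SoftwareBackend):
    name = "ppu"
    # A real PPU backend would upload the state and run the same
    # integration on the card; the numerical result is the same.

def make_backend(ppu_present):
    """Pick the PPU when one is installed, else fall back to software."""
    return HardwareBackend() if ppu_present else SoftwareBackend()

backend = make_backend(ppu_present=False)
new_pos = backend.integrate([0.0, 1.0], [2.0, -1.0], dt=0.5)
```

This is the same pattern 3D APIs used during the software-to-hardware rendering transition: one interface, swappable implementations.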

edit2: Think about it: in FPS games, how realistically do bodies fall? With resources freed, animations could improve greatly; a guy's arm could react, flex, or even break off at animated-movie quality (that loads the GFX card, but as I said, with resources freed, graphics cards would become far more capable and could really display that level of detail once unloaded from the useless work they do today). And with great graphics come great physics: watching that arm break as if it were real, helmets of fallen soldiers deforming realistically as a tank drives over them, bodies falling realistically instead of the way they do today. A raindrop falling on a leaf could make the leaf move and behave naturally. Picture a huge storm rattling windows, shoving cars and trucks, blowing off roofs, making leaves and small objects swirl realistically, while tiny raindrops hitting tiny things like leaves make them react exactly right; even a leaf hit in mid-air by a raindrop would respond realistically, and things would only blow away if the mass and force were actually big enough to do it. That is the PPU's potential, and it is impossible with GFX cards or CPUs; it works together with them, but they can't calculate that stuff. I sure hope people read this second edit and try to picture how a game, and everything in it, could be.

But as I said, don't buy one yet >.< Let the game devs make a game engine that uses the hardware, instead of just patching an engine to add debris and useless stuff. That game engine is Havok... :D Someone make it happen!

/ragger in a volvo
June 21, 2006 2:34:01 PM

Oh wow, where to start…
Well, first of all, take a writing class, or at least look up the word "paragraph" in the dictionary.
Second, your condescending tone is completely unnecessary. Most people who post on these boards, including the article writers, are well educated in the IT field; nobody needs a lesson on "how things really work".

A lot of us work as programmers and know very well (a lot better than you, apparently) how CPUs/GPUs etc. are designed, how they work, and what it takes to program them efficiently.
Coming here and blabbering about how nobody understands what this card can do except you (the enlightened one), and then throwing out statements like "this card can calculate lighting and AI", makes you sound quite the opposite of smart (hmmm… what's that word again?), and devalues any argument you might have had.

We're not here discussing whether or not a discrete hardware solution for physics is a good idea. We all want added realism in our games, not only in physics but in graphics, AI, and pretty much everything else that goes into them.

We're here discussing whether this particular approach by Ageia, and this particular product, is an effective solution at this point in time.
Saying that this card COULD do all kinds of things...maybe even wash your car... is completely irrelevant.
The truth is at this point in time this product is extremely overpriced for the services it delivers, and its uses are extremely limited.

It's not up to us to make excuses for the company that released it; our job is to judge and review the product for what it is. If the time was not right for it yet, then they shouldn't have released it.
But since they did, the job of the reviewers is to objectively inform their readers about what this product delivers, and at this time that is not a whole lot.

And a little piece of advice… next time you have a thought, let it marinate a little and try to form a cohesive argument. Don't try to spit out everything that comes into your head and repeat yourself a hundred times. Nobody wants to waste their time reading the same thing for pages on end.
*************************************
Getting back to the point I was making in my previous post… I found this on ExtremeTech: http://www.extremetech.com/article2/0,1697,1979452,00.asp
It's still a little unclear at this point which route M$ will go, because they have licensed the Ageia SDK but at the same time want the physics calculations to happen on the GPU.
Either way this is good news and a step forward.
June 21, 2006 6:40:29 PM

I'm actually just in the middle of implementing a math problem on a Xilinx FPGA, and thus was pretty curious what these little beasts could do in the world of gaming.
However, halfway through the article the feeling crept in that the author does not know more about this thing than anyone who has googled PhysX for half an hour, read some product-release whitewash, and played around with a game that has partial support for it.
Somehow, I expected more insight into which physics effects are actually accelerated by offloading the math onto the PhysX, and what speedups ARE achieved. Comparing a game with an effect disabled to the same game with the effect enabled but offloaded to the PhysX hardware tells us, from the "innovation" point of view, nothing at all; it is purely phenomenological.
From my (however very limited) FPGA experience, I dare say it is true that the CPU can in principle do what dedicated hardware can do, so yes, the entire PhysX API can be implemented in software (though that misses the point). But to speculate on what a different API (Havok) might do in the future on different hardware (a graphics card), without even going into the details of which "physics" effects (and thus math loads) we're talking about, is pretty much rubbish.
As pointed out by others, looking at clipping errors in the graphics in order to decide whether PhysX is worth it IS kind of telling. The article pretty much boils down to "no wow effect coded yet", "300 bucks for the bang" and "we shall see".
June 21, 2006 6:53:56 PM

I did apologise for my writing; I did not mean to offend anyone. That said, good luck with "your" views. I fail to see how people who think a PhysX card accelerates graphics, or anything to do with graphics for that matter, can be any sort of experts in the area, as you made it sound. I gave lots of examples of how it can be used to greatly benefit both gaming and game development. What the card does and can do is more relevant than anything, since too many fail to see the point of it.

However, I do agree it's not worth buying; not until there are games coming that will show what it's capable of. And as I said, maxing out such a card requires a lot more programming in games, so I think it will take some time before developers add all that extra work to their games. But the potential of the card, used right as in the examples I mentioned, is completely realistic. Time will tell...
June 21, 2006 7:08:37 PM

Ghost Recon was never meant to be run with Ageia's PhysX. If THG actually played with CellFactor, which is built for Ageia's PhysX from the ground up, they would see the true potential of what the Ageia PhysX processor can really do.

Ghost Recon is just giving this Ageia PPU a bad image.

Quote:
In Ghost Recon, the differences between having a PhysX card and not having one are not as dramatic as I had first hoped they would be. Honestly, it would be a hard sell to most users to ask them to pay as much as $250 extra for a product that performs only what we saw here. There is no doubt that in our videos we see a much more complex system of physics at work -- the sparks and car destruction are more detailed, the amount of "rubble" from shooting the ground or buildings is definitely more complex and realistic and the various particle systems (dirt and smoke) are more impressive -- but are they $250 more impressive? Probably not, but with other titles such as Unreal 2007 coming out with even more advanced usage models on the PhysX PPU the market will eventually get there.

Our time with the Cell Factor demo showed what a true PhysX game could potentially look like, and it is impressive. The amount of interactivity shown in our videos is unrivaled in any other title out now or that I have seen in development.


source
June 21, 2006 8:45:13 PM

I'm dying to see the PhysX processor hit Brazilian shelves.
The whole idea of physics is very interesting, but the price and that Ghost Recon game scare me a little bit.
The CellFactor footage is pretty impressive, and I think THG could test the PhysX with this game first.
I don't know much about what kind of hardware is inside this accelerator, so I can't weigh in on whether a PhysX can do what a CPU can only dream of, but I believe DirectX physics will let us choose between software, SLI or CrossFire (one card for graphics, the other for physics), or a dedicated physics card. And in the future we'll have upgrades like physics cards compatible with DirectPhysicsX 25 :-)
I think the future is very promising.