
Can anyone name a single game that supports a physics card?

Last response: in Graphics & Displays
April 17, 2006 2:50:45 PM

Today is April 17th 2006 and Dell is already proudly offering an AGEIA® PhysX® Physics Accelerator as a $249 upgrade on their XPS line... will ANY game support this card today? $249 sounds like a lot to pay for ZERO instant gratification.

Anyone?
April 17, 2006 2:52:16 PM

No...

Or was the question rhetorical? :) 
April 17, 2006 3:40:06 PM

Not long ago someone posted a list here of games that support the physics card, along with upcoming titles. I'm getting this maybe next year, as games by then should fully support it.
April 17, 2006 3:47:26 PM

Quote:
i know that ghost recon game for the 360 supports it.



The Xbox 360 has an AGEIA physics card built in...?
April 17, 2006 4:06:46 PM

http://physx.ageia.com/titles.html

City of Villains is on it; I'm playing that. Can't play it at 1920x1200 with max details without it being laggy horsecrap. Dunno, might be that my X1900 XT ain't cutting it...
April 17, 2006 4:27:22 PM

Ghost Recon 3, Splinter Cell: Chaos Theory w/ patch update,
CoD2 w/ patch update. It is only a matter of time before most games will support the PhysX processor. :twisted:
April 17, 2006 4:39:52 PM

Many more games will use this feature, and most likely most new video cards in the future will have a PhysX PPU. Myself, in most online games I turn effects down to low so I can see what I'm shooting at. Most Wanted is a great example of effects; just try racing someone with motion blur on, it's hard to be first with that effect enabled.
April 17, 2006 5:18:43 PM

I think they are using Bet On Soldier to demo some stuff on that card. That game is out now, not that anyone cares... the reviews were not so good. Has anyone here played it?
April 17, 2006 5:40:09 PM

Why does a game need to support it? Why can't we get a driver that will offload the physics processing to the separate card... oh wait, then nVidia/AMD would make less money, never mind...
April 17, 2006 6:09:37 PM

I can't wait until they're released, but I won't have enough to buy one yet =(. It's only around a month until they're released to consumers, right?
April 17, 2006 6:10:29 PM

Good point, man. And if physics were offloaded to just another CPU core, I would bet that the Intel chips would fare much better if you could benchmark just that aspect... Of course, I think that the GPU will do better than any CPU, and with that you are also right that ATI has much more calc power in that regard than Nv...

Until we see games doing it beyond demos, we will have to wait and see...
April 17, 2006 6:25:16 PM

A normal CPU would stand absolute zero chance at matching a specialized floating point processor such as a GPU or a PPU for physics processing. Most current physics in games is done in software. It is the physics equivalent of rendering your graphics in DOOM 3 or whatnot... IN SOFTWARE MODE.

These physics processors are on the order of HUNDREDS of times faster for such calculations than a multipurpose processor like a CPU. There is absolutely no comparison: a $250 PPU would beat any $1500 multipurpose CPU in floating point benchmarks by orders of magnitude.

What I'm trying to say is: No, Intel doesn't stand a Chance.

And to the person that said Intel is better for raw calculations, that isn't true; the A64/K8 architecture has better floating point performance than NetBurst (P4/Pentium D).
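For anyone curious what "software mode" physics actually looks like, here's a toy sketch (my own illustration, nothing from Ageia's SDK): every particle gets the same independent arithmetic, which is exactly the data-parallel pattern a wide floating point chip like a GPU or PPU can chew through in parallel, while a scalar CPU loop grinds through it one particle at a time.

```python
def step(positions, velocities, dt=0.016, gravity=-9.8):
    """One Euler integration step over all particles, in scalar 'software mode'.

    Each particle's update is independent of every other particle's,
    which is why specialized hardware can run thousands side by side.
    """
    for i in range(len(positions)):
        velocities[i] += gravity * dt        # identical op for every particle
        positions[i] += velocities[i] * dt   # no cross-particle dependency
    return positions, velocities

# 10,000 particles, all starting at rest at the origin
pos = [0.0] * 10_000
vel = [0.0] * 10_000
step(pos, vel)
```

The per-particle math is trivial; the cost is purely the width of the loop, which is the part dedicated FP hardware parallelizes.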
April 17, 2006 6:27:49 PM

I caught an interview with an ATI dude (think it was on firingsquad.com, but can't remember), and he listed a bunch of numbers, from MIPS to GFLOPS to shader ops, etc. While CPUs were not even in the running, and PhysX was somewhat ambiguous, the only "real" comparison was between Nv and ATI, and it showed the number of calculations astronomically in ATI's favor. Even taking into account that there may have been fudging to favor ATI, it was such a landslide that Nv was closer to a CPU's capability than to ATI's.

After that, they said that even if the PhysX board was super efficient (and based on Ageia's own numbers it has to be, to compete), ATI has so much capability to spare that they did not have to be efficient and could still rock all takers.

While it was a lot of hype, there seemed to be enough truth to point out that ATI is gonna be a real powerhouse in physics, if all goes the way the players seem to want it to.
April 17, 2006 6:48:01 PM

Quote:
Good point, man. And if physics were offloaded to just another CPU core, I would bet that the Intel chips would fare much better if you could benchmark just that aspect... Of course, I think that the GPU will do better than any CPU, and with that you are also right that ATI has much more calc power in that regard than Nv...

Until we see games doing it beyond demos, we will have to wait and see...


Great statement. Why the push for physics when HT and multi-core have not even been OPTIMIZED for full usage? To be honest with you, I'm sure physics is a great new thing, but until I see companies like Intel and AMD optimizing their chips for OpenGL, Direct3D, Havok, etc., I really don't see a need for this. I actually see this as a marketing gimmick or niche sales: a "wasn't being sold before, so let's sell it" kind of mentality.
April 17, 2006 6:53:34 PM

I guess I have to ask myself this question: Would I rather have better graphics or more realistic physics in a game? $250 put towards a graphics card instead of a physics card could certainly go a long way. I'd rather see better graphics instead of better physics.
April 17, 2006 7:06:36 PM

Quote:
i think we all know about the cpu's but i am sure intel can beat it at encoding and what, they rely on pure maths i thought.


You might have this backwards... Encoding relies on clockspeed.

AMD will kick Intel's butt at floating point calculations, not clockspeed... AMDs do a lot more calculations per clock; that's why Athlon 64s always beat the snot out of Intels when benched with ScienceMark. Physics simulations favor AMD.

A lot of games use Havok physics calculations already; who knows, this might be part of why AMDs are so good at gaming in the first place...


On a side note, I think it's inevitable that PhysX will be the premier physics engine as time goes on... it's free, and Havok costs a sizable amount of cash. If PhysX is any good, the developers will eat it up...
April 17, 2006 7:13:25 PM

Ghost Recon Advanced Warfighter for the PC.
April 17, 2006 7:46:19 PM

not sure... was not "logged in" to be able to see whatever you had linked. ;) 

I found what I was talking about, though: here. It was at Firing Squad... from a week or so ago. I remember different graphs, but either it was different before archiving or I really did first read it on another site. You'll get the idea of what I was saying, though.
April 17, 2006 8:09:51 PM

AHA! I found it... I thought that FiringSquad one looked funny. I found the one with the numbers and stuff: here it is.

This was the one that raised my eyebrows... and it was a bit more than a week ago, now that I see the dates... I must have had a few articles all jumbled in my brain. Guess I'm getting a bit scatterbrained these days. ;)
April 17, 2006 8:26:28 PM

What does the SDK enable? For games to run physics on the GPU, or for you to play with it on your own? Just curious...
April 17, 2006 8:30:37 PM

I hear ya on that one. For me to spend $250 just to plug something useless into my 'puter seems a little... um... flaky.

But hey, Ageia has some nifty marketing demos that MIGHT work with the card :wink:

Personally, if there were widespread support for this technology, I'd have no problem getting the card, even with Ageia's partners selling the "high-end" version for over $200. Presently, though, the price is too high; but that never seems to stop the early adopters from spending their disposable income on something silly.
April 17, 2006 8:52:05 PM

Right, I thought you were talking about something more specific, like a kit that already enabled physics in games... I had seen links about that SDK, but figured that I code all day at work and like to enjoy my hardware with minimal work on my part at home. I will wait for the devs to work it into their games first... ;)
April 17, 2006 9:06:30 PM

So now I get to look forward to shelling out an additional $300 for something that my processor should be able to take care of? While I'm an advocate for the advancement of technology, I feel this "physics card" is more of a lateral step than a forward one.
April 17, 2006 9:14:24 PM

Guys, in no way can a CPU handle the amount of collisions the PhysX card can handle.

We're talking semi-realistic water, fog simulations... huge amounts of collisions. It's just not doable on a CPU.

You can do 3D without a graphics card, just on the CPU, but it's not in the same league... it's basically the same deal with hardware-accelerated physics.

You can debate whether we need it or not, but it's hard to make sweeping statements that it's a rip-off until you've seen the difference. Go to the PhysX site and look at some of the demos before deciding it's useless.
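To put a rough number on "huge amounts of collisions" (a back-of-the-envelope sketch of my own, not Ageia's actual algorithm, which presumably uses smarter spatial partitioning): a naive collision pass tests every object against every other, so the work grows quadratically with object count, and that's before any of the per-pair math even starts.

```python
def naive_pair_tests(n):
    """Number of pair tests when every object is checked against every other."""
    return n * (n - 1) // 2

# Each frame, at 30+ frames per second:
for n in (100, 1_000, 10_000):
    print(f"{n:>6} objects -> {naive_pair_tests(n):>11,} pair tests per frame")
```

Going from 1,000 debris chunks to 10,000 multiplies the pair count by roughly 100, which is why a chip built to do nothing but this kind of math has room to shine.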
April 17, 2006 9:20:36 PM

That's not what the devs are saying.

Even the HavokFX presentation made the distinction that GPU physics are non-collidable... basically just pretty, but you can't interact with it.

That's a huge difference... from what I've read, the Ageia hardware is much more powerful than GPU hardware when used in this capacity.
April 17, 2006 9:30:17 PM

Um... yeah. Did you?

Seriously. There's this huge thread about how horrible the PhysX card is, how we're being robbed because it might be done on the GPU or multicore CPU...

At the same time, everything I've seen points to the PhysX card being far more powerful than either. Would it be nice if the GPU could do it? Sure. Is there a usable GPU solution we can compare to the PhysX? No. Christ, there isn't even a PhysX game that we can demo. All of this is farting in the wind.

I mean, I'm sure a comparo will be done when they're available. Until then this is all BS.
April 17, 2006 9:32:41 PM

Read that last article I posted: ATI is saying the exact opposite of what you said; that they are more powerful than the PhysX chip, but less efficient. They are just so much more powerful that efficiency is not a drain on performance, 'cause they have calcs to spare.

The GPU is specialized for graphics, and based on what ATI and Nv are saying, the PhysX is simply a more specialized type of "GPU" (if you want to think of it that way).
April 17, 2006 9:35:25 PM

ATI says one thing, Ageia says another.

Seriously, it's kind of hard to take at face value, they kind of have a vested interest in their point of view, don't you think?

I'll say it again; until there are two physical solutions to compare, all we have to eat is everybody's corporate spin.
April 17, 2006 9:36:38 PM

If it really is a better setup, I'd like to see ATI/nVidia cards with a physics coprocessor from these guys, all on one card. It's obvious that future games are going to be physics-intensive for realism. It wouldn't be that hard to do, since someone recently put two physical cores on the same card. Then you wouldn't be working across the PCI bus either; the PPU could talk directly to the GPU and tell it WHERE to render things, leaving the GPU to deal with HOW to render them.
April 17, 2006 9:43:24 PM

I think the best-case scenario is that games do not require the physics chip. You want to see the extra effects? Sure, pay the piper. You don't? You don't have to...

This model works great until someone comes up with a "killer app" that requires craploads of collisions and advanced physics, though. After it becomes a "must have", there's no option. Like the graphics card.
April 17, 2006 9:58:23 PM

Just to point out: Intel single-core chips MAY be able to compete, but until Conroe, the dual-core offerings from Intel really do not compete. Not even in raw calculations. That will probably change in the next six months, but AMD's dual-core chips really do have better price/performance, and really just have much better overall performance.
April 17, 2006 10:09:38 PM

I agree man, with the ati/Nv option you at least have an "extra" card that is useful for more than just the 1 or 2 games that are worth having.

Overall, I am very curious/excited to see where everyone (companies) is at when the dust settles and we see what we need for the next batch of wicked games. Of course I hope that the ati solution works if only to remove a need to buy another card in the short term... ;) 
April 17, 2006 10:29:18 PM

Quote:
The GPU is specialized for graphics, and based on what ATI and Nv are saying, the PhysX is simply a more specialized type of "GPU" (if you want to think of it that way).


After much reading, that's how I'm now starting to view this whole concept of the "physics card": a specialized piece of hardware. Before we know it, a "physics card" will become an essential component, much like the graphics card is now. With time, we'll all be accustomed to purchasing the latest and greatest physics card, just as we've done with everything else in the past.

Moving on, any word on how this physics card will interact with Vista? I guess what I'm looking for is: will this allow Vista to run a little more smoothly by taking some of the load off the CPU/GPU? Will this bring about new possibilities for future operating systems? (i.e. dynamic GUIs, the ability to manipulate, contort, rotate, skew, etc. windows to your liking) Geez, just thinking about that makes me wish I could wake up tomorrow, 10 years in the future. :twisted:
April 17, 2006 10:34:09 PM

Physics in graphics is more for interaction between rendered objects. AFAIK this does not apply at all to Vista: it renders windows and such, but they do not "bump" into each other or interact like that.

Basically, the answer is no. No smoother at all.

I am not so sure about physics cards becoming that commonplace either; we are in a position less like the video card genesis (1996, the birth of 3D) and more like the Beta/VHS war (late '70s, I think...).

The big diff is that the players are not fighting with similar weapons like in '96, but with differing ideas, like the Beta bash.

Just my thoughts, but I think we will have a winner; it's just not guaranteed to be PhysX.
April 17, 2006 10:37:06 PM

Quote:
Today is April 17th 2006 and Dell is already proudly offering an AGEIA® PhysX® Physics Accelerator as a $249 upgrade on their XPS line... will ANY game support this card today? $249 sounds like a lot to pay for ZERO instant gratification.

Anyone?


http://physx.ageia.com/titles.html

Learn to read and search, moron; this was resolved in a single Google search.

*edit*
Just read some stuff over here, and I'm surprised how dumb people are, trying to compare a multipurpose CPU to a specialized CPU. I loved the example of a CPU doing a 3D render: IT'S SLOW.

Even the most modern multipurpose CPU still can't do an "acceptable" job of descrambling a pay-TV signal, and that is done by a single RISC chip at 10 MHz...

It's just like asking a muscleman (from the WWF) to race Carl Lewis or other athletes in the 100 meters: they were NOT TRAINED nor PREPARED for such a different job than the one they were trained and prepared for. The same goes for CPUs, GPUs, etc.

I'd say using the GPU will surely have a HUGE IMPACT, but it would work best when some games don't do well in SLI or Crossfire mode, loading the otherwise-unused GPU of the second video card with pure physics.

Besides, there have been some "rumors" at AnandTech about some ATI cards getting their "PPU" next to the GPU on the same single graphics card.
April 17, 2006 10:40:56 PM

Mentioned at E3, ya daft southern seppo!!!
April 17, 2006 11:12:38 PM

Quote:
Ghost Recon Advanced Warfighter for the PC.


Says who? Is that available today, or is that pending some not-yet-released patch?
April 18, 2006 2:10:00 PM

Well it's pretty early in the game; it's very hard to see how all this will play out. Right at the moment, the card sure as hell isn't worth $249.

In any case, technologies like this are always introduced to a theoretical market and therefore at some risk. It's pretty hard to perfectly synchronize the advent of new hardware with the corresponding optimized software.

I wouldn't buy one of these now unless I had money to burn. Game development isn't the only issue---this technology is nascent, and if it succeeds we have a scenario similar to DVD burners, which debuted at a whopping 1x, then 2x, and so on. You could be the cool kid on the block for a hot minute, but then if the tech really takes off, better cards will be right around the corner anyway.

Also, I haven't been exactly blown away by the demos, but the idea of offloading physics is great, so I think we're in for a real treat once the tech is developed.
April 18, 2006 4:52:15 PM

Quote:
i have to say this, you forgot ATI and intel. can i ask what made you miss them out? it is getting on my wick people thinking they can dictate who's cool and who ain't. sorry to harp on but i'm already arguing this thing in another thread where people think the race is between havoc's FX/nvidia and AGeia without mentioing ATI and now it seems intel is rubbish too is it. AMD beats intel in games, but you have to remember physics ain't gaming, it raw calculations.


That's because the most popular combination for gaming rigs right now is AMD/nVidia, because they complement each other so well.

If I had to guess, I would say AMD > intel for raw calculations, just because their 64-bit and dual core technologies appeal to me more for some reason.
April 18, 2006 6:31:00 PM

Quote:
http://physx.ageia.com/titles.html

learn to read and search moron
this was resolved in a single google search.

I think he was trying to start a discussion about the PPU; the list of games is easy to find, and Dell actually provides the link for info.

I play City of Villains, but waiting for the next issue is killing me! I think this game might die before anyone has the hardware to play it at its full potential!

There are far too few games upon release to bother now; maybe by the end of the year it will be another story, but for now I'm just burning up valuable cubicle time :p
April 19, 2006 3:41:35 AM

I agree man. Not always the most popular, but probably the best. You simply can't compete with AMD right now in games (not trying to start flames from Conroe lovers; it's not out yet, so it's not on the list yet). And for graphics: while Nv is good with the 7900, ATI's picture/graphics quality spanks them (and has for many years), and the tech specs, while not better on all tests, show that the x1x00 series is going where the future is.

I happen to have both combos: AMD/Nv and AMD/ATI ;)

In fact, the last system I upgraded from was also that combo (Athlon XP, 9700 Pro, nForce2).

Nv seems to be where everyone wants to go for graphics (probably for the same reason I stayed with the nForce mobo), but I think ATI is where it's at for quality graphics. Anytime someone says Nv has comparable quality, all you have to say is: AA+HDR anyone? :lol:
April 19, 2006 3:51:10 AM

Quote:
the tech specs, while not better on all tests, show that the x1x00 series is going where the future is.


The future is in DX10 cards. The architecture will be much different, or I would guess it will be. With 90nm on the rise and all the other goodies popping out here and there, I think it's safe to say that the PC gaming community is really benefiting from the tight competition.
April 19, 2006 4:10:55 AM

Word on the street is that DX10 will have "general" or unified shaders; the current gen has specific pixel/vertex shaders. Of course, the ATI chip in the 360 is made up of those very unified shaders... and that architecture is used in the X1900, only not unified... that is why I say ATI is pointing at the future.

So ya, the coming DX10 era is different... different from Nvidia. ;)
May 1, 2006 6:52:25 AM

I don't know, but it seems it could also support some games like Age of Empires III, since they use the Havok physics simulator. A lot of the good gameplay is lost (mainly online) when you have a bad processor, since a lot of the processing time goes to simulation. Will these physics cards take the same place the first graphics accelerator cards had, easing the load on the main processor? Many new games are beginning to follow the trend of physics simulation, so these "physics games" may represent the near future of computer gaming.

I'm only offering some ideas for further "research" on the internet; I'm not sure if this card will support AoE III or other games. This is just speculation (before it becomes a rumor).

LGNR
May 1, 2006 1:47:57 PM

Ya, you're right... Ageia only supports Ageia (or NovodeX, if the game info still uses that name).

But I could see a hack being made to allow Havok to work on the card, even in a limited fashion. Completely "unsupported", wink-wink. Or even a possibility that, if Ageia really takes off, others (like Havok) cut a deal to get theirs working. That second one is far-fetched, I know, 'cause Ageia would be shooting themselves in the foot to let the competition get a level playing field. But you never know... maybe Ageia will buy them out if they become dominant.

Or Ageia could fall on its face and never go anywhere... if ATI/Nv have anything to say about it, that is what they want, I'm sure.

I say, though, that you shouldn't count on anything but Ageia working on it for a good while at least.
May 1, 2006 3:38:57 PM

Innovations are just coming too fast. It's like the video card industry vs. the game industry: ATI and nVidia are pumping out video cards faster than the game designers can produce games that take advantage of them, which is why there are no games with full support for DX 9.0c.

Maybe if card makers took a break for a while, consumers would see a greater benefit from having a 7600 GT over a 6800 GS.
May 2, 2006 2:33:13 AM

Well, UT2007, part of one of the most popular PC game series, is gonna use it, along with a bunch of other games; check out the Ageia site.
May 2, 2006 2:55:30 AM

The latest Ghost Recon uses it and runs slower. :lol: 