Can anyone name a single game that supports a physics card?

rodney_ws

Splendid
Dec 29, 2005
3,819
0
22,810
Today is April 17th 2006 and Dell is already proudly offering an AGEIA® PhysX® Physics Accelerator as a $249 upgrade on their XPS line... will ANY game support this card today? $249 sounds like a lot to pay for ZERO instant gratification.

Anyone?
 

chuckshissle

Splendid
Feb 2, 2006
4,579
0
22,780
Not long ago someone posted a list here of games that support the physics card, along with future games as well. I'm getting this maybe next year, as games by then should fully support it.
 

bluntside

Distinguished
Mar 22, 2006
744
0
19,010
Ghost Recon 3, Splinter Cell Chaos Theory w/ patch update.
CoD2 w/ patch update. It is only a matter of time before most games will support PhysX processing. :twisted:
 

gomerpile

Distinguished
Feb 21, 2005
2,292
0
19,810
Many more games will use this feature, and most likely most new video cards in the future will have a PhysX PPU. Myself, in most online games I turn effects down to low so I can see what I'm shooting at. Most Wanted is a great example of effects; just try racing someone with motion blur on, it's hard to finish first with that effect on.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
I think they are using Bet On Soldier to demo some stuff on that card. That game is out now. Not that anyone cares... reviews were not so good. Does anyone here play it, or has anyone played it?
 

theaxemaster

Distinguished
Feb 23, 2006
375
0
18,780
Why does a game need to support it? Why can't we get a driver that will offload the physics processing to the separate card... oh wait, then Nvidia/AMD would make less money, never mind...
 

ProdigyMS

Distinguished
Dec 21, 2005
172
0
18,680
I can't wait until they're released, but I won't have enough to buy one yet =(. It's only around a month until it gets released to consumers, right?
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Good point, man, and if physics were offloaded just to another CPU core, I would bet that the Intel chips would fare much better if you could benchmark just that aspect... Of course I think that the GPU will do better than any CPU, and with that you are also right that ATI has much more calc power in that regard than Nv...

Until we see games doing it beyond demos, we will have to wait and see...
 

DougalJ

Distinguished
Mar 6, 2006
9
0
18,510
A normal CPU would stand absolutely zero chance of matching a specialized floating-point processor such as a GPU or a PPU for physics processing. Most current physics in games is done in software. It is the physics equivalent of rendering your graphics in DOOM 3 or whatnot... IN SOFTWARE MODE.

These physics processors are on the order of HUNDREDS of times faster for such calculations than a multipurpose processor such as a CPU. There is absolutely no comparison. A $250 PPU would decimate any $1500 multipurpose CPU in any floating-point benchmark, by orders of magnitude.

What I'm trying to say is: no, Intel doesn't stand a chance.

And to the person who said Intel is better for raw calculations: that isn't true. The A64/K8 architecture has better floating-point performance than NetBurst (P4/Pentium D).
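To put rough numbers on why a serial CPU drowns here, consider a back-of-envelope sketch of naive pairwise physics. All figures below (ops per pair, steps per second, the ~10 GFLOPS CPU budget) are my own illustrative assumptions, not Ageia's or Intel's specs:

```python
# Rough cost of naive pairwise collision/force checks for N objects,
# versus the floating-point throughput of a 2006-era desktop CPU.
# All numbers here are illustrative assumptions, not vendor specs.

def flops_needed(n_objects, ops_per_pair=20, steps_per_sec=60):
    """Naive O(N^2) pairwise physics: every pair checked each step."""
    pairs = n_objects * (n_objects - 1) // 2
    return pairs * ops_per_pair * steps_per_sec

CPU_FLOPS = 10e9  # assume ~10 GFLOPS sustained for a desktop CPU

for n in (1_000, 10_000, 30_000):
    need = flops_needed(n)
    print(f"{n:>6} objects: {need / 1e9:6.1f} GFLOPS needed "
          f"({need / CPU_FLOPS:.1f}x the assumed CPU budget)")
```

With these made-up but plausible constants, a thousand objects fit in the CPU budget, while tens of thousands blow past it by an order of magnitude or two; that is the gap a parallel FP part like a PPU or GPU is built for.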
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
I caught an interview w/ an ATI dude (I think it was on firingsquad.com but can't remember) and he listed a bunch of numbers, from MIPS to GFLOPS and shader ops etc... While CPUs were not even in the running, and PhysX was somewhat ambiguous, the only "real" comparison was between Nv and ATI, and it showed the number of calculations astronomically in ATI's favor. Even taking into account that there may have been fudging to favor ATI, it was such a landslide that Nv was closer to a CPU's capability than to ATI's.

After that they said that even if the PhysX board was super efficient (and based on Ageia's own #s it has to be to compete), ATI has so much capability to spare that they did not have to be efficient and could still rock all takers.

While it was a lot of hype, there seemed to be enough truth to point out that ATI was gonna be a real powerhouse in physics if all goes the way the players seem to want it to.
 

hongkongphuey

Distinguished
Feb 14, 2006
83
0
18,630
Good point, man, and if physics were offloaded just to another CPU core, I would bet that the Intel chips would fare much better if you could benchmark just that aspect... Of course I think that the GPU will do better than any CPU, and with that you are also right that ATI has much more calc power in that regard than Nv...

Until we see games doing it beyond demos, we will have to wait and see...

Great statement. Why the push for physics when HT and multi-core have not even been OPTIMIZED for full usage? To be honest with you, I'm sure physics is a great new thing, but until I see companies like Intel and AMD optimizing their chips for OpenGL, Direct3D, Havok, etc., I really don't see a need for this. I actually see this as a marketing gimmick or niche sales, a "wasn't being sold before, so let's sell it" kind of mentality.
 

superbrett2000

Distinguished
Mar 30, 2006
53
1
18,535
I guess I have to ask myself this question: Would I rather have better graphics or more realistic physics in a game? $250 put towards a graphics card instead of a physics card could certainly go a long way. I'd rather see better graphics instead of better physics.
 

cleeve

Illustrious
I think we all know about the CPUs, but I am sure Intel can beat it at encoding and whatnot; they rely on pure maths, I thought.

You might have this backwards... Encoding relies on clock speed.

AMD will kick Intel's butt at floating-point calculations, not clock speed... AMDs do a lot more calculations per clock; that's why Athlon 64s always beat the snot out of Intels when benched with ScienceMark. Physics simulations favor AMD.

A lot of games use Havok physics calculations already. Who knows, this might be part of why AMDs are so good at gaming in the first place...


On a side note, I think it's inevitable that PhysX will be the premier physics engine as time goes on... it's free, and Havok costs a sizable amount of cash. If PhysX is any good, the developers will eat it up...
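The "more calculations per clock" point is just throughput = IPC × clock speed. A toy comparison (the IPC and clock figures are illustrative assumptions I picked to show the shape of the argument, not measured values):

```python
# Sustained throughput is roughly instructions-per-clock times clock speed.
# The IPC and clock figures below are illustrative assumptions, not
# measurements of any real Athlon 64 or Pentium 4.

def throughput(ipc, clock_ghz):
    """Billions of operations per second, to a first approximation."""
    return ipc * clock_ghz

athlon64 = throughput(ipc=2.0, clock_ghz=2.4)  # lower clock, higher IPC
pentium4 = throughput(ipc=1.2, clock_ghz=3.6)  # higher clock, lower IPC

print(f"Athlon 64: {athlon64:.2f} Gop/s   Pentium 4: {pentium4:.2f} Gop/s")
```

With these assumed numbers the lower-clocked chip comes out ahead, which is the whole ScienceMark point: clock speed alone tells you nothing about floating-point throughput.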
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Not sure... I was not "logged in" to be able to see whatever you had linked. ;)

I found what I was talking about, though: here. It was at FiringSquad... from a week or so ago. I remember different graphs, but either it was different before archiving or I really did first read it on another site. You'll get the idea of what I was saying, though.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
AHA! I found it... I thought that FiringSquad one looked funny. I found the one w/ the numbers and stuff: here it is.

This was the one that raised my eyebrows... and it was a bit more than a week ago, now that I see the dates... I must have had a few articles all jumbled in my brain. Guess I'm getting a bit scatterbrained these days. ;)
 

bweir

Distinguished
Feb 22, 2006
179
0
18,680
I hear ya on that one. For me to spend $250 just to plug something useless into my 'puter seems a little... um... flaky.

But hey, Ageia has some nifty marketing demos that MIGHT work with the card :wink:

Personally, if there were widespread support for this technology, I'd have no problem getting the card, provided Ageia's partners were selling the "high-end" version for under $200. Presently, though, the price is too high, but that never seems to stop the early adopters from spending their disposable income on something silly.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Right, I thought you were talking about something more specific, like a kit that already enabled physics in games... I had seen links about that SDK, but figured that I code all day at work and like to enjoy my hardware w/ minimal work on my part at home. I will wait for the devs to work it into their games first... ;)
 

kumana1

Distinguished
Dec 20, 2005
74
0
18,630
So now I get to look forward to shelling out an additional $300 for something that my processor should be able to take care of? While I'm an advocate for the advancement of technology, I feel this "physics card" is more of a lateral step than a forward one.
 

cleeve

Illustrious
Guys, in no way can a CPU handle the amount of collisions the PhysX card can handle.

We're talking semi-realistic water, fog simulations... huge amounts of collisions. It's just not doable on a CPU.

You can do 3D without a graphics card, just on the CPU, but it's not in the same league... it's basically the same deal with hardware-accelerated physics.

You can debate whether we need it or not, but it's hard to make sweeping statements that it's a rip-off until you've seen the difference. Go to the PhysX site and look at some of the demos before deciding it's useless.
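To see why "huge amounts of collisions" swamps a CPU, here's a minimal sketch of the naive broad phase a software engine has to beat. This is purely illustrative plain Python, not how PhysX or Havok actually work internally; the point is only that the pair count grows as N*(N-1)/2:

```python
import itertools
import time

# Naive collision broad phase: test every pair of axis-aligned squares.
# The pair count grows quadratically, which is exactly what dedicated
# physics hardware (or a smarter broad-phase algorithm) exists to tame.

def overlaps(a, b):
    """a, b are (x, y, half_size) axis-aligned squares."""
    return (abs(a[0] - b[0]) < a[2] + b[2] and
            abs(a[1] - b[1]) < a[2] + b[2])

def count_collisions(boxes):
    return sum(overlaps(a, b)
               for a, b in itertools.combinations(boxes, 2))

# 2,000 squares on a loose grid: already ~2 million pair tests per step.
boxes = [(i % 100 * 1.5, i // 100 * 1.5, 1.0) for i in range(2_000)]
t0 = time.perf_counter()
hits = count_collisions(boxes)
dt = time.perf_counter() - t0
print(f"{len(boxes)} boxes, {hits} overlaps, {dt * 1000:.0f} ms for one step")
```

One simulation step over a couple of thousand objects is already millions of pair tests; real engines cut this down with spatial partitioning, but the raw arithmetic behind water, debris, and fog is what a parallel PPU is pitched at.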