Is Ageia's PhysX Card worth it yet?

Last response: in Graphics & Displays
October 10, 2007 3:19:10 AM

Hello everyone,
I would like to know how everyone feels about the whole PhysX card deal now, considering the price drops and new games. I have a pretty decent setup already but feel something is missing. Is this it? These are the specs of my system right now. Tell me what you think.

Zumax 550W, 38A combined
GA-P35-DQ6
Core 2 Duo E4400 @ 3.24GHz (9x 360MHz, 1.5V)
Crucial Ballistix Tracer DDR2 1066 @ 1152MHz, 5-4-4-12, 2.2V, 1GB x 2
2x Western Digital 160GB, 16MB cache, RAID 0
Asus EAX1900XTX 512MB GDDR3, overclocked a little
Sound card, DVD burner, big case with a couple of big fans

I might get an E6750, but I am unsure whether it will make a big difference or not.



October 10, 2007 3:48:03 AM

I can give you honest feedback from my own experience. A few months ago I purchased the Asus physics card and installed it, along with some of the demos that came with it. I was not impressed with the demos' graphics. They looked two or more years behind current game graphics, and I didn't see anything that impressed me one iota.

After seeing this, I decided to do some research and find out what other people were saying about these cards. I found several reviews showing that benchmark scores actually decreased with the physics card installed. From what I understood, there seems to be a bottleneck in the data transfer needed to offload the workload to the physics card.

Needless to say, I returned my card and got my money back: $249.99.

Here's my honest opinion.

Do I think a physics card could be a good idea? Yeah, one day in the future, when they improve the technology and create games that actually utilize its capability. But not now.

My advice to you is to invest your hard-earned money in the CPU upgrade. That will be your best investment for now.

I would keep an eye on the physics technology, though, because I do think it will catch on and be worthwhile down the road someday.

Follow this advice and you will do yourself a favor.

Hope this helps...
October 10, 2007 4:06:31 AM

truehighroller said:
Hello everyone,
I would like to know how everyone feels about the whole PhysX card deal now, considering the price drops and new games. I have a pretty decent setup already but feel something is missing. Is this it? [...]


The time of Ageia has come and gone. Investing in a USB coffee-cup warmer would benefit your gaming performance far more.
October 10, 2007 4:09:28 AM

Unless you are a hardcore, diehard GRAW fan, it's not worth even the silicon it's made on... and even if you were a GRAW fan, it still wouldn't be worth it.
October 10, 2007 5:52:50 AM

I'm a GRAW fan and I still wouldn't buy it. I think it's a great idea to offload physics onto a dedicated card, but only a few games support it, GRAW being one. So I would also say spend your money elsewhere.
October 10, 2007 6:22:51 AM

A quad-core PC more than makes up for the lack of a physics card. Basically, a multi-core CPU can handle physics by itself with no problems.
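Since that is the crux of the multi-core argument, here is a toy sketch of the idea: the game loop hands its physics step to a worker thread so a second core can absorb the work while the main thread keeps rendering. All names here (PhysicsWorld, step, frame) are invented for illustration, not from any real engine.

```python
# Toy sketch: run the physics step on a worker thread while the
# main thread renders. A second CPU core absorbs the physics work.
from concurrent.futures import ThreadPoolExecutor

class PhysicsWorld:
    def __init__(self, positions, velocities):
        self.positions = list(positions)
        self.velocities = list(velocities)

    def step(self, dt, gravity=-9.8):
        # Simple semi-implicit Euler integration for a few falling objects.
        self.velocities = [v + gravity * dt for v in self.velocities]
        self.positions = [p + v * dt
                          for p, v in zip(self.positions, self.velocities)]
        return self.positions

def frame(world, executor, dt):
    # Kick off the physics step on the worker thread...
    future = executor.submit(world.step, dt)
    # ...the main thread would render the previous frame here...
    return future.result()  # then collect the new state for the next frame

world = PhysicsWorld(positions=[100.0], velocities=[0.0])
with ThreadPoolExecutor(max_workers=1) as pool:
    for _ in range(3):
        positions = frame(world, pool, dt=0.1)
```

Real engines overlap the work far more aggressively than `future.result()` per frame, but the division of labor is the same: simulation on one core, rendering on another.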
October 10, 2007 8:48:49 AM

An E6750 is not worth the small performance boost over your nicely overclocked E4400. If you do upgrade, get a Q6600.

The only thing worth upgrading may be your video card, to an 8800 or 2900.
October 10, 2007 10:08:14 AM

Wait to see what it does in Unreal Tournament 3, then decide whether to get it or not.
October 10, 2007 1:53:59 PM

Thanks guys, I appreciate all the help, I really do. I figured this as well from reading up on it, and I figured the processor might not make that big of a difference either. I am wondering, though, why my highest 3DMark score is only 7058. I see people with the same config as me getting something like 11,000 points. I don't know if everyone else is aware of something I am not or what; as tweaked as my system is, I figure I should be getting at least 10,000, you know. I kick in ATITool when I start the benchmark, I have the memory timings on my video card tweaked as tight as possible, and it's overclocked a little as well. Maybe you guys can give me a hint as to why this is. Thanks again, everyone.
October 10, 2007 2:14:58 PM

I thought Intel just bought them. We will see some benefit in a year or two, but I think it will come from the CPU.
October 10, 2007 2:18:20 PM

I don't think you can get 11,000 with the 1900XTX.
October 10, 2007 2:27:07 PM

I've seen other users with scores that high for the 1900XTX on their score-server thingy. Maybe they have two of them and it just doesn't show that part... guessing here, mind you. I know this much: before I got ATITool set up and tweaked out, my card wasn't kicking into 3D mode at all when I was playing games and whatnot or running the bench with 3DMark... Don't know why, but whatever.
October 10, 2007 2:42:28 PM

joewho said:
I thought Intel just bought them. We will see some benefit in a year or two, but I think it will come from the CPU.
Intel bought Havok, the company that makes the physics engine used in Half-Life 2, F.E.A.R., Halo 2 and 3, BioShock, Stranglehold, Age of Empires III, Oblivion, and just about every other game on the market.
October 10, 2007 2:49:30 PM

7k in 3DMark06 is about the average score for the X1900XTX; with overclocking I could get it to 8000. I always used to wonder how these guys were getting 11,000 with a single/dual core and one card. Probably cheating, the dirty bstards :)
October 10, 2007 2:57:17 PM

joewho said:
I thought Intel just bought them. We will see some benefit in a year or two, but I think it will come from the CPU.


No, Intel bought Havok.... the smarter physics.
October 10, 2007 2:58:59 PM

If you want to spend some money on that setup, then pick up a new graphics card next month after the new cards come out.

Isn't the PhysX card like 2-3 years old already? That makes me think it can't really be all that powerful.
October 10, 2007 3:35:46 PM

True, true. I'm thinking of getting the quad now, G0 stepping, then stepping up video-card-wise when prices drop some more. I think the quad is as cheap as it will get for a while, you know.
October 10, 2007 4:38:48 PM

Hatman said:
Wait to see what it does in Unreal Tournament 3, then decide whether to get it or not.

I'm quite curious how UT3's implementation of physics will turn out. Taking into consideration that the Ageia cards most likely have sub-1% penetration of the gamer market, it would be insane for a company to do anything more than slap the Ageia sticker on its box; it's just not economical.
While I hope for a major breakthrough in gaming physics, I doubt it will involve Ageia at all.
October 10, 2007 4:53:12 PM

I can think of only one good reason for buying a physx card: it will help Ageia's finances. Other than that, don't bother. Spend your money on a better CPU or video card if you want better performance, with my vote going to a better video card.

On another note, when I was running an X1900 XTX with my overclocked FX-60, my 3DMark06 score was a little over 8000, so a score in the 7000s isn't all that bad in my opinion. As to the CPU, I would agree that going to the Q6600 would be better than the E6750, especially for the future.
October 10, 2007 4:58:56 PM

Please correct me if I am wrong, but aren't they putting small physics processors in video cards these days? I thought I read somewhere that the 2900 (XT) and the 8800 (GTX) both have some form of physics engine built into them.
October 10, 2007 5:04:10 PM

I didn't know what an FX-60 was, lol; I Googled it, though. Better yet, I think I am going to fill my memory slots and get two more Ballistix sticks, considering I bought these two just last week and they have already dropped forty more dollars 8o. I have my E4400 as high as it will let me push it. I am waiting on the video card until everything drops a little; my 1900XTX is pretty damn good, IMO. I bought it a couple of weeks back for ~$190 and I have seen it on Newegg and other places still priced at around $400. I don't think the extra 20fps is worth what they're asking, you know. I will get the quad ASAP as well, though.
October 10, 2007 5:15:52 PM

I guess the short answer to your original question of whether the Ageia PhysX card is worth it is...

NO!
October 10, 2007 5:17:02 PM

spaztic7 said:
Please correct me if I am wrong, but aren't they putting small physics processors in video cards these days? I thought I read somewhere that the 2900 (XT) and the 8800 (GTX) both have some form of physics engine built into them.


You're not completely right, or wrong. According to the ATI site, the 2900 has physics processing support, but not a dedicated physics engine. I think Nvidia does the same: physics support, but no separate engine for it.
October 10, 2007 5:22:47 PM

Only if it has a rebate exceeding the original purchase price!
October 10, 2007 5:23:18 PM

I was thinking the same thing spaztic7. NO!! lol
October 10, 2007 5:31:14 PM

A bit off topic, but has anybody noticed how the join dates are all Tues, Jan 1 of 1970? I'd bet a fair number of people here weren't even born yet, much less joining Tom's back in 1970. I know I was in the Air Force back then, worrying about Vietnam, and my first computer purchase was many years in the future.
October 10, 2007 5:57:16 PM

I thought NVIDIA and ATI were using drivers to simulate physics processing while running two GPUs in SLI/Crossfire mode, effectively using the "number two" card as a PPU. Is that an incorrect assumption?
October 10, 2007 6:00:04 PM

It only improves visuals, so say exploding barrels, or shooting cloth, etc.

Nothing that affects gameplay at all.

(At least, that was how it was.)
October 10, 2007 6:00:35 PM

That's an incorrect assumption, at least in regards to running SLI or Crossfire. Both companies will do some physics processing with a single card.
October 10, 2007 6:05:19 PM

I saw some guy saying that he bought a 1600XT for that reason, and they never came out with the driver to do it like they boasted they would.
October 10, 2007 6:09:08 PM

truehighroller said:
I saw some guy saying that he bought a 1600XT for that reason, and they never came out with the driver to do it like they boasted they would.


As best I understand, the simulated physics processing is only available with the 2900-series and 8800-series cards, so buying a 1600XT would not serve the purpose.
October 10, 2007 6:22:15 PM

He had bought it like a year ago lol. Poor guy wasted his money.
October 10, 2007 7:08:34 PM

truehighroller said:
He had bought it like a year ago lol. Poor guy wasted his money.

I believe I heard something similar... about a board with 3 slots for video cards... the idea being you have a crossfire setup and a 3rd card handling the physics calculations. Yes, I'm sure the guy feels dumb... but... he was just doing what "the man" wanted him to.
October 10, 2007 8:01:41 PM

Do I recall correctly from the reviews that Ageia was bandwidth limited by the bus? In that case, until that problem is solved, there is nothing you can do to save it, as no amount of computational power - no matter how perfectly it can be used - at the far side of the bus is useful if you can't feed the data to / from it fast enough.
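A back-of-envelope version of that bus argument can be run with assumed, illustrative numbers: classic 32-bit/33MHz PCI has a theoretical peak around 133 MB/s, and the per-body state size here (64 bytes) is a made-up figure, not a measured one. On those assumptions, raw throughput alone permits a surprisingly large number of objects per frame, which is consistent with the reply that Ageia denied being bandwidth-limited; round-trip latency is a separate question.

```python
# Back-of-envelope check of the PCI bandwidth argument.
# All figures are illustrative assumptions, not measurements.
PCI_BANDWIDTH = 133e6   # bytes/s, theoretical peak of 32-bit/33MHz PCI
BYTES_PER_BODY = 64     # assumed per-object state (position, orientation, velocities)
FPS = 60

def max_bodies_per_frame(bandwidth=PCI_BANDWIDTH, fps=FPS,
                         bytes_per_body=BYTES_PER_BODY):
    # State must cross the bus to the card and results must come back,
    # so each body's state is counted twice per frame.
    per_frame_budget = bandwidth / fps
    return int(per_frame_budget // (2 * bytes_per_body))

n = max_bodies_per_frame()  # thousands of bodies fit in the raw budget
```

Of course this ignores protocol overhead, bus contention with other devices, and the latency of each round trip, any of which could still produce the stalls described above.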
October 10, 2007 11:02:10 PM

Ageia itself said that they WEREN'T limited by the bandwidth of PCI, which pretty much killed the idea that a PCIe update would help.
October 11, 2007 12:22:39 AM

Well, that's a bummer, I guess. Thanks, Ape. What was the explanation for the momentary hang-up before the effects, then? It fits the pattern to me, but who am I, anyway.

Admittedly, there are more problems than just the bus, such as software support, the effectiveness of the hardware, and how widespread the use of the library is. As far as the OP's question is concerned, as of right now it is useless, but Unreal promised support, at the very least, so it remains to be seen whether that support translates into an appreciable difference. If it does, there's hope, because the Unreal engine has a history of being widely used in other titles.
October 11, 2007 12:28:09 AM

Yah Unreal's gonna kick a$$.
October 11, 2007 1:43:07 AM

NO WAY!!
All it's gonna do is bottleneck your system and (if you read other THG articles) actually give you LOWER benchmark scores... Save your money, or cough up a little more for a better graphics card.
October 11, 2007 2:15:43 AM

russki said:
As far as the OP's question is concerned, as of right now it is useless, but Unreal promised support, at the very least, so it remains to be seen whether that support translates into an appreciable difference. If it does, there's hope, because the Unreal engine has a history of being widely used in other titles.


Yeah, I'm interested in seeing what's implemented. So far the only reviewer to look at the PhysX hardware support has said it's very limited and will not be throughout the game like we thought, but more pre-staged effects, plus an area created to show off the PPU, called Tornado.

The biggest problem is that someone using a PPU cannot play against someone who doesn't have one with the effects enabled. So PPU owners must play other PPU owners or turn the effects off.

I think not having a universal API that is PPU-, VPU-, and CPU-agnostic makes interaction and adoption difficult and slow. Too bad M$ never really did much with DirectPhysics.
October 11, 2007 3:34:27 AM

How come no one is helping me with the cooler question, lol? I got a ton of replies here. What is the best air cooler for a video card nowadays, guys? I will also add, grape ape guy :p that that sounds really $hity! If that is true then that company will not do good yet if, they do good now that Intel bought out Havok and all.
October 11, 2007 12:06:14 PM

sailer said:
You're not completely right, or wrong. According to the ATI site, the 2900 has physics processing support, but not a dedicated physics engine. I think Nvidia does the same: physics support, but no separate engine for it.


Thank you for explaining this to me. I was not sure what they meant, but I got it now.
October 11, 2007 12:22:05 PM

OK, first: don't mess with TheGreatGrapeApe. He is master of parts, king of the drews, and lord of all that breathes... or something of that sort.

truehighroller said:
How come no one is helping me with the cooler question, lol? I got a ton of replies here. What is the best air cooler for a video card nowadays, guys? I will also add, grape ape guy :p that that sounds really $hity! If that is true then that company will not do good yet if, they do good now that Intel bought out Havok and all.


Might I add... WHAT? I am not sure I'm getting what you are asking. "If that is true then that company will not do good yet if, they do good now that Intel bought out Havok and all." What?

So I guess that since Intel bought out the leading physics company, that is somehow going to make Ageia better? Yes, I agree, since Intel is such a horrible company and has no idea what they are doing. I mean, come on, look at their processors! And who would not want to buy the world's largest physics company? I would put money down that Intel is going to optimize their new processors (I'm guessing later gens of Penryn and the soon-to-be Nehalem) for physics capabilities.

I think you should do a little research on Havok and find out how many titles already use their software based physics engine.

Also, Havok had enough time to look up how to spell physics. They didn't have to use Ebonics.
October 11, 2007 12:30:20 PM

Hatman said:
Wait to see what it does in Unreal Tournament 3, then decide whether to get it or not.



Well, from reviews:

"If you have an AGEIA PhysX game physics hardware card installed, UT 3 has an option to offload the game's physics to the card but there won't be any extra effects generated "

Got that off FiringSquad :)

So, personally... no, still not worth it :D
October 11, 2007 3:58:18 PM

TheGreatGrapeApe said:
Yeah, I'm interested in seeing what's implemented. So far the only reviewer to look at the PhysX hardware support has said it's very limited and will not be throughout the game like we thought, but more pre-staged effects, plus an area created to show off the PPU, called Tornado.

...

I think not having a universal API that is PPU-, VPU-, and CPU-agnostic makes interaction and adoption difficult and slow. Too bad M$ never really did much with DirectPhysics.


Yeah, not having a single API definitely hurts, and not just from the agnostic perspective; it's almost akin to the hi-def wars (Blu-ray vs. HD DVD). We have two major competing APIs, plus now nV and ATI are pushing their own solutions (although I am not sold on that: most games are still gfx-bound, so there is no computing power to spare on physics there, although the architecture might well lend itself to the task), and that slows down adoption. It's a tough environment to push new hardware in. I'll be honest, at first I thought Ageia had a chance to be a sort of 3dfx for physics, but now I am not so certain and am actually leaning the other way. 3dfx was the only one of its kind at the time, so that was the only option you had, even if adoption was fairly slow at first.

But Havok seems to have a lot more widespread support at the moment, so unless Unreal can capture the imagination of users and developers, I think Ageia is done for.

It remains to be seen what happens with nV and ATI.

A single API would go a long way toward at least making sure all developers support it; the workload would then be allocated at the API level through the drivers, although I imagine that's a hot mess to actually implement.
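To make that idea concrete, here is a minimal hypothetical sketch of such a single physics interface: the game codes against one entry point, each installed "driver" registers whichever backend (CPU, GPU, PPU) is present, and the highest-priority backend wins. None of these names correspond to a real API; this is just the registration-and-dispatch shape the paragraph above describes.

```python
# Hypothetical hardware-agnostic physics API: drivers register
# backends, and the game always calls best_backend().simulate(...).
class PhysicsBackend:
    """Fallback CPU backend; a PPU or GPU driver would register its own."""
    name = "cpu"

    def simulate(self, bodies, dt):
        # bodies is a list of (position, velocity); trivial Euler step.
        return [(p + v * dt, v) for p, v in bodies]

_registry = {}

def register_backend(backend, priority):
    # Called by each installed "driver" at load time.
    _registry[backend.name] = (priority, backend)

def best_backend():
    # The API layer picks the most capable backend that registered itself;
    # the game never needs to know which hardware is doing the work.
    return max(_registry.values(), key=lambda entry: entry[0])[1]

register_backend(PhysicsBackend(), priority=0)

result = best_backend().simulate([(0.0, 1.0), (5.0, -2.0)], dt=0.5)
```

A PPU driver would simply call `register_backend(ppu_backend, priority=10)` and every game written against the API would pick it up, which is exactly the adoption shortcut the thread is wishing for.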

The funny thing is, there is still so much that can be improved physics-wise; it's all pretty rough at the moment. But I guess the same can be said about the gfx, with normal maps, shadow maps, etc.: not truly real-time computed effects!