Fans of PhysX
Closed
Last response: in Graphics & Displays
January 21, 2010 3:34:07 AM

The question is: will all games in the future be optimized for PhysX? When I say all, I mean all! Only then would PhysX make sense. If PhysX is something game developers use only when they want to, it may never be a total success.
January 21, 2010 3:38:36 AM

What games are you playing? I've seen your complaint before, but I'm baffled, because I've only seen you complain about one game that used PhysX. All the other games you claim PhysX was better in didn't use PhysX at all.
January 21, 2010 3:41:30 AM

Batman: AA is the only game where PhysX makes a real difference, and that was mostly a design choice. That they didn't even enable AA on ATI cards (like every other game around) just shows how much the developers were in Nvidia's pocket on that one.
January 21, 2010 3:48:51 AM

Everything PhysX does can be (or has been) done better with Havok, at lower cost and with better performance.
January 21, 2010 4:11:41 AM

Stop with the fanboyisms already. I don't buy games specifically because they have PhysX capabilities; I buy them to play the game. PhysX is a real asset to have when it does exist in any given game. If Nvidia gives you a better show, then yes, Nvidia is better. I think some of you don't even know it exists in some of the games you play, but you still knock it. Clouds, smoke, particles, explosions, and chunks flying all over are/contain certain amounts of "physx" components. I can name more than ONE game that has it, unlike the other poster who thinks it exists in just ONE game. I think Nvidia's "physx" takes it to the nth degree, providing more for the eyes to see vs. ATI. Will it change? Yes, but not overnight. ATI/AMD needs to pump money into game development, or they need to hope the "open", "free" physx gets adopted quicker by game developers. It'll take years for that; Nvidia already has it.
January 21, 2010 4:12:56 AM

Seriously, you've got an ATI 5850, a cutting-edge DX11 card, and you think PhysX overshadows that??? Seriously, after I bought my GTX 295 and the 5870 came out a month later, I beat myself up for not waiting.
January 21, 2010 4:17:45 AM

PhysX doesn't seem to matter to me in the games I play. The only time I noticed a difference was after I saw the video of it in Batman, and it made me kind of want it, but not really. Anyway, I think it's nice, but it's not a game changer for me. I think nVidia has done a better job of marketing it than competitors like Havok, but Havok still offers similarly impressive effects, as welshmousepk said. I plan on upgrading to an ATI DX11 card in the near future; nVidia's delay of a DX11 card is weakening the position of PhysX as people buy ATI cards instead.
January 21, 2010 4:19:40 AM

Quote:
Clouds, smoke, particles, explosions and chunks flying all over are/contain certain amounts of "physx" components.


None of those are even related to PhysX. They are physics, but done with other, open-platform software, usually Havok. PhysX has so far allowed for nothing that Havok can't do better, and it leeches GPU power to do so.
When physics calculations are done on a CPU, the performance hit is negligible; thus far, GPU-accelerated physics (PhysX) has proved to cause massive performance loss.

I'd love to see PhysX attempt to handle volumetric clouds and smoke, or realistic terrain deformation, but it hasn't. And like I said, Havok can do it better anyway.
January 21, 2010 4:20:49 AM

swifty_morgan said:
Stop with the fanboyisms already. I don't buy games specifically because they have PhysX capabilities; I buy them to play the game. PhysX is a real asset to have when it does exist in any given game. If Nvidia gives you a better show, then yes, Nvidia is better. I think some of you don't even know it exists in some of the games you play, but you still knock it. Clouds, smoke, particles, explosions, and chunks flying all over are/contain certain amounts of "physx" components. I can name more than ONE game that has it, unlike the other poster who thinks it exists in just ONE game. I think Nvidia's "physx" takes it to the nth degree, providing more for the eyes to see vs. ATI. Will it change? Yes, but not overnight. ATI/AMD needs to pump money into game development, or they need to hope the "open", "free" physx gets adopted quicker by game developers. It'll take years for that; Nvidia already has it.


I have seen him complaining about physics in Crysis, Grand Theft Auto, Far Cry 2, and how much better it was with Nvidia. Those games do not use PhysX; they are not on any list of PhysX titles. This is why I don't believe he is actually seeing a difference. I have both cards in my system, so I'd be happy to have PhysX work well, but the games I've seen him talk about don't use it at all.

The reason I said he complained about only one game is that in this other post, he did include Batman.
January 21, 2010 5:20:08 AM

jyjjy said:
Batman: AA is the only game where PhysX makes a real difference, and that was mostly a design choice. That they didn't even enable AA on ATI cards (like every other game around) just shows how much the developers were in Nvidia's pocket on that one.


And it being one of very few games to utilize PhysX decently, the additional effects are quite minuscule. I've played with PhysX on and off with a GTX 285 in Batman: AA. The thing I noticed most wasn't the flying papers or moving spiderwebs; the most noticeable effect was the SIGNIFICANT dip in framerate. I'm actually a fan of PhysX, but I don't want to have to use a dedicated PhysX card, nor do I want to have to SLI, in order to keep at least 60 FPS.

PhysX is only a small luxury of having an nVidia card, just like 3D Vision. Both are great eye candy in some cases, but they both damage framerate. It's nothing so outstanding that it would make you regret buying an ATI card.
January 21, 2010 5:37:35 AM

bystander said:
What games are you playing? I've seen your complaint before, but I'm baffled, because I've only seen you complain about one game that used PhysX. All the other games you claim PhysX was better in didn't use PhysX at all.


Well, I have a lot of games. The ones that I play the most, or even mess around with PhysX in, are Crysis, Crysis WH, GTA4, Batman AA, Stalker, Far Cry 2, Resident Evil 5, L4D2, Gears of War, CoD MW2, CoD5, CoD4, and Fallout. Most of these games have a sense of physics, but nothing like Crysis, Batman, and GTA4. Unfortunately I cannot use PhysX anymore, as my GPU is now a 5850 (from a 9800GTX+), so I miss the option of using it over Havok.

I also have DiRT but have only tried it a few times; I downloaded it with my 5850. As for the list of games I have, it goes on and on and on and on.
January 21, 2010 5:41:02 AM

liquidsnake718 said:
Well, I have a lot of games. The ones that I play the most, or even mess around with PhysX in, are Crysis, Crysis WH, GTA4, Batman AA, Stalker, Far Cry 2, Resident Evil 5, L4D2, Gears of War, CoD MW2, CoD5, CoD4, and Fallout. Most of these games have a sense of physics, but nothing like Crysis, Batman, and GTA4. Unfortunately I cannot use PhysX anymore, as my GPU is now a 5850 (from a 9800GTX+), so I miss the option of using it over Havok.

I also have DiRT but have only tried it a few times; I downloaded it with my 5850. As for the list of games I have, it goes on and on and on and on.


Of the games you listed, only one uses PhysX.

You don't seem to realize the difference between PhysX and physics. PhysX is Nvidia-proprietary and very few games use it; the best in-game physics are done with Havok.
January 21, 2010 5:44:20 AM

I also agree about the sudden drop in framerate when using PhysX, even on an Nvidia card (I have a GTX 260). IMO, better framerates give me better gameplay than lower framerates with added eye candy.
January 21, 2010 6:12:54 AM

L4D2 is a Source game and doesn't support PhysX. I suspect several of the other games don't support it either. (I don't think GTA4 does. I'd look them up, but not on this slow laptop.) This makes you seem like you don't know what you're talking about when you claim game X supports it.

I actually happen to like PhysX, or at least the idea of PhysX. Offloading things from the CPU to make things faster and/or adding physical effects is great. But it needs to be more than just extra papers flying around in an alley, or bigger explosions. We need total control over our virtual environment, to be able to do things in games that are possible in real life. PhysX is a good start, but I'll be very disappointed if bigger bangs is as far as it gets.
January 21, 2010 6:29:06 AM

welshmousepk said:
Quote:
Clouds, smoke, particles, explosions and chunks flying all over are/contain certain amounts of "physx" components.


None of those are even related to PhysX. They are physics, but done with other, open-platform software, usually Havok. PhysX has so far allowed for nothing that Havok can't do better, and it leeches GPU power to do so.
When physics calculations are done on a CPU, the performance hit is negligible; thus far, GPU-accelerated physics (PhysX) has proved to cause massive performance loss.

I'd love to see PhysX attempt to handle volumetric clouds and smoke, or realistic terrain deformation, but it hasn't. And like I said, Havok can do it better anyway.


I know what you're getting at, but find the new Nvidia PhysX trailer with the guy on the rocket sled. I think HardOCP had it, and maybe Guru3D. Check out what I'm talking about: it's all particle PhysX. I can't find the whole rocket sled video, but this might work: http://www.youtube.com/watch?v=BGCZtXg5LyA
January 21, 2010 7:14:30 AM

Who cares? If it's not an actual video game using PhysX to do something Havok can't, then it's pointless.

Hardware-accelerated physics is a great idea, but it needs to be an open platform. PhysX is just a gimmick, and as long as Nvidia keeps it locked to their hardware, it will never be more than that.

January 21, 2010 7:15:57 AM

It's a non-issue.

PhysX is pretty weak, but if that's your bag, then get a cheap nVidia card to do it REGARDLESS of whether your graphics card is ATi or nVidia, because adding the PhysX workload to any nVidia card robs it of graphics performance; the argument isn't really relevant to the graphics choice. Even in the early days, when nVidia loved Havok-GPU and bad-mouthed PhysX, we were all talking about turning old cards into physics co-processors, not about adding the workload to the main GPU.

Want to play with PhysX features? Then buy a GTS 240/250 and add it to your rig for that purpose. To me, though, it still isn't a compelling feature until it actually does something more than debris physics. Until PhysX actually has an effect on gameplay physics, the CPU will remain the source of the best physics features in games.
January 21, 2010 7:54:41 AM

Batman looked nice and was probably the best showcase of what PhysX has to offer: smoke and paper. I prefer Havok, if only because the implementations have always been better, albeit cruder. HL2 had physics that were actually useful, although it would have been better if Valve had tried something other than the see-saw thing. No physics engine is really "better" as far as what it looks like or what it can do; this comes down to each individual developer's implementation. Crysis and Ghostbusters both use their own engines (neither has Havok, PhysX, or any GPU-accelerated physics) and are quite impressive, although Crysis does have some bugginess in physics that defies many laws.

PhysX does what any other engine can do, but it kills your framerate as a bonus (unless you do what Ape said).
January 21, 2010 8:23:07 AM

liquidsnake718 said:
Well, I have a lot of games. The ones that I play the most, or even mess around with PhysX in, are Crysis, Crysis WH, GTA4, Batman AA, Stalker, Far Cry 2, Resident Evil 5, L4D2, Gears of War, CoD MW2, CoD5, CoD4, and Fallout. Most of these games have a sense of physics, but nothing like Crysis, Batman, and GTA4. Unfortunately I cannot use PhysX anymore, as my GPU is now a 5850 (from a 9800GTX+), so I miss the option of using it over Havok.

I also have DiRT but have only tried it a few times; I downloaded it with my 5850. As for the list of games I have, it goes on and on and on and on.


Take a look at nVidia's list of GPU PhysX PC games.
Of the games you play, only Batman AA uses GPU PhysX.
Of the games that use GPU PhysX, only Batman AA shows any noticeable improvement from it.

As others have stated, there is nothing PhysX offers that Havok or other CPU-based physics engines cannot match or top.
GPU-processed physics was a nifty idea that just does not pan out.

Also, if you are using XP or Win 7, you can use your old nVidia GPU as a PhysX accelerator card.
Just be sure to use the 186 or older driver package.
January 21, 2010 11:03:20 AM

jyjjy said:
Batman: AA is the only game where PhysX makes a real difference, and that was mostly a design choice. That they didn't even enable AA on ATI cards (like every other game around) just shows how much the developers were in Nvidia's pocket on that one.


Don't bring up that argument; the Unreal Engine [which Batman AA uses] is not CAPABLE of standard AA due to the way it processes/renders images. In this case, a special AA algorithm is needed that is not part of the DX API. NVIDIA helped the devs implement their proprietary AA algorithm that allows AA to run on that game engine, and for whatever reason (lazy devs, evil NVIDIA, or clueless ATI, your choice), ATI's implementation didn't make it in. Other Unreal 3 games have the same exact problem.

PhysX is by far a deeper API than Havok. But because ATI refuses to support it, no dev will be crazy enough to build a physics engine around the advanced functions [which would only work on NVIDIA cards]. So we get some cheesy effects tacked on.

We should be well past the point where bullet type x causes y damage; physics should have advanced to the point where you can track a bullet's path and, when it hits an object, determine whether it passes through, how much speed/force remains, and finally how much damage it does to the object. No game in existence has implemented even this yet, let alone the fully interactable environments that I dream of. From a gameplay perspective, physics should be far more important than improving rasterization at this point.

We need a universal physics API that can run far more advanced physics calculations in a parallel manner. As of today, PhysX is the closest we have to that API, and it's gimped because AMD refuses to support it.
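To illustrate the kind of calculation that post is asking for, here is a rough energy-based penetration sketch in Python. Everything here (function name, the energy-to-damage rule, the numbers) is hypothetical, not from any real engine: the bullet's kinetic energy is compared against the energy the material absorbs, which yields both an exit speed and the damage deposited in the object.

```python
import math

def penetrate(speed, mass, thickness, resistance):
    """Return (exit_speed, damage) for a projectile hitting a slab.

    Energy absorbed by the slab is resistance * thickness (joules);
    whatever kinetic energy remains sets the exit speed, and the
    energy deposited in the slab is the damage dealt to it.
    """
    energy_in = 0.5 * mass * speed ** 2
    energy_lost = resistance * thickness
    if energy_lost >= energy_in:
        # Projectile stops inside: all of its energy becomes damage.
        return 0.0, energy_in
    exit_speed = math.sqrt(2.0 * (energy_in - energy_lost) / mass)
    return exit_speed, energy_lost

# An 8 g round at 800 m/s through 2 cm of material absorbing 50 kJ per metre:
v_out, dmg = penetrate(speed=800.0, mass=0.008, thickness=0.02, resistance=50_000)
```

A real engine would of course run this per material layer along the swept ray, but the point is that the whole check is a handful of arithmetic operations, which is why posters here argue the CPU can handle gameplay physics like this.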
January 21, 2010 11:10:58 AM

We don't need another proprietary API like PhysX, we need something to be used that is integrated into existing APIs like DirectX, or an open API as you mentioned (although I think integrating it is better than having an API department store on your PC). Physics should always be done on the CPU too.
January 21, 2010 11:12:49 AM

gamerk316 said:
PhysX is by far a deeper API than Havok. But because ATI refuses to support it, no dev will be crazy enough to build a physics engine around the advanced functions [which would only work on NVIDIA cards]. So we get some cheesy effects tacked on.

My understanding was that nVidia will not allow an open-source implementation of GPU PhysX, instead relying on a proprietary CUDA implementation.
As CUDA is a closed, nVidia-only standard, ATI is not able to support it.
I believe ATI has stated they would be happy to support an open-standard implementation of PhysX (OpenCL or DirectCompute).
January 21, 2010 11:46:18 AM

Ya. Developing something and not sharing is bad! :D 

ATI should also develop something like PhysX, so future games would have both implemented. Then you could have a card of either brand and not worry about the eye candy!
January 21, 2010 1:14:07 PM

liquidsnake718 said:
http://www.tomshardware.com/news/nvidia-physx-amd-gpu-m...

I made the mistake of believing the marketing hype and bought a 5850 ATI card, which is still very pleasing and worth the money, but in the end I realized what Nvidia had to offer with their cards and proprietary software. PhysX makes the difference.

What do you think?


I think you should sell the 5850, go back to your nvidia and stop making garbage posts like this one every day.
January 21, 2010 3:43:25 PM

outlw6669 said:
My understanding was that nVidia will not allow an open-source implementation of GPU PhysX, instead relying on a proprietary CUDA implementation.
As CUDA is a closed, nVidia-only standard, ATI is not able to support it.
I believe ATI has stated they would be happy to support an open-standard implementation of PhysX (OpenCL or DirectCompute).


NVIDIA suggested AMD could easily port the existing PhysX code using CUDA, essentially reusing NVIDIA's own implementation. They refused. Nothing is stopping them from creating their own implementation using AMD Stream, or the new Compute Shaders in DX11.

Quote:

Ya. Developing something and not sharing is bad! :D 

ATI should also develop something like PhysX, so future games would have both implemented. Then you could have a card of either brand and not worry about the eye candy!


God no. Back in the days when Glide and OpenGL were competing, companies simply used the lowest common standard both APIs supported, to cut down on development time and maintain compatibility across platforms. API wars are always bad for the consumer: neither standard gains acceptance, more code is needed for the same effects across all configs, etc.

I find the argument against a proprietary API baseless anyway. DirectX is a proprietary closed API; PhysX is a proprietary open API. Why does PhysX get singled out in this way?
January 21, 2010 4:21:35 PM

gamerk316 said:
NVIDIA suggested AMD could easily port the existing PhysX code using CUDA, essentially reusing NVIDIA's own implementation. They refused. Nothing is stopping them from creating their own implementation using AMD Stream, or the new Compute Shaders in DX11.


That's the big limitation. They aren't allowing them to port it using Stream, OpenCL, or DirectCompute, and they are locking out other IHVs with the device check, so it's not likely they are in any way trying to be open about the implementation.

Quote:
I find the argument against a proprietary API baseless anyway; DirectX is a proprietary closed API, PhysX is a proprietary open API; why does PhysX get singled out in this way?


For the same reason non-nVidia hardware gets singled out, despite the people who bought the Ageia PPU for their rigs so they wouldn't have to choose another vendor.

They can say it's 'open', but all the evidence points to the contrary, especially since nVidia said that PhysX could be ported over to OpenCL very easily yet has no intention of doing so. When asked, they used the opportunity to make snide comments about the HD4K series' relative F@H performance, which is low because of an implementation limit, not a hardware one; things like MilkyWay@home show that the standard implementation actually works better.

It's like saying you want to help, but then turning your back on the situation, as if saying you want to donate to charity were the same as giving to charity. :heink: 
January 21, 2010 4:31:25 PM

Well, Fearless leader did hold up a card on Fermi day, so therefore, it must be so.
nVidia is as shallow as a child
January 21, 2010 4:55:36 PM

Yeah, we've known this from before, but then it turns into a he-said-she-said. Meanwhile, nVidia has had plenty of opportunities to go the open route since then, and nothing.
Instead of pointing the finger at ATI or questioning them, write nVidia and ask them, as it's all up to them, isn't it?
January 21, 2010 6:59:58 PM

Quote:
That's the big limitation. They aren't allowing them to port it using Stream, OpenCL, or DirectCompute, and they are locking out other IHVs with the device check, so it's not likely they are in any way trying to be open about the implementation.


To be fair, NVIDIA never officially supported ATI + NVIDIA PhysX from the start. Sure, an idiotic move by them, but understandable considering the potential for customer grief due to incompatibilities later on.
January 21, 2010 7:09:52 PM

As long as their (nVidia's) hardware still supports PhysX and the OS supports dual graphics drivers, where would there be any incompatibilities?
Both drivers load, the nVidia driver takes over PhysX calculations (like the earlier Ageia cards, or a second card in an nVidia-only system), and the ATI drivers handle the 3D.
Sounds pretty darn simple to me, and it seems to work just fine for people using the 186 driver package or older.
January 21, 2010 10:46:54 PM

gamerk316 said:

To be fair, NVIDIA never officially supported ATI + NVIDIA PhysX from the start. Sure, an idiotic move by them, but understandable considering the potential for customer grief due to incompatibilities later on.


But they went beyond just 'not supporting it': they actively disable it, which wasn't necessary, just spiteful. To use it you needed to own an nVidia card anyway, but also having an ATi card made you a second-class citizen. :pfff: 

January 22, 2010 12:48:46 AM

ArmA II does a pretty excellent job of bullet physics.

But on the PhysX topic, as I said before: as long as Nvidia's PhysX remains locked to their hardware, it will never be more than a gimmick. I would love them to open it up as a platform and license it out like Havok. I really think hardware-accelerated physics can do a lot, but PhysX is pointless as it is, and Havok and many other platforms are cheaper and easier to implement and do not alienate the audience that buys a certain architecture.

All Nvidia has to do is open the platform and allow developers or coders the opportunity to get it working on whatever hardware they wish. Then it could become a genuine evolution of physics processing. Right now, it's just a marketing gimmick to sell cards.

January 22, 2010 2:22:02 AM

^^ That should ideally be the case. Give both sets of hardware a level playing field and see who thumps whom. Given ATI's recent advancements, I'd say that if PhysX does open up, nVidia won't be able to keep up with them, because they just don't know there's a term in manufacturing called a "deadline"!!!! :D 
January 22, 2010 4:46:08 AM

^ But they surely do know how to make a fake one. :) 
January 22, 2010 10:54:37 AM

TheGreatGrapeApe said:
But they went beyond just 'not supporting it': they actively disable it, which wasn't necessary, just spiteful. To use it you needed to own an nVidia card anyway, but also having an ATi card made you a second-class citizen. :pfff: 


The problem comes down to liability: if for whatever reason PhysX acted oddly in such a setup, which vendor would be to blame? I don't agree with the decision, but I understand their thinking.

Quote:
But on the PhysX topic, as I said before: as long as Nvidia's PhysX remains locked to their hardware, it will never be more than a gimmick. I would love them to open it up as a platform and license it out like Havok. I really think hardware-accelerated physics can do a lot, but PhysX is pointless as it is, and Havok and many other platforms are cheaper and easier to implement and do not alienate the audience that buys a certain architecture.


Also to be fair, software-level PhysX is doing very well (even overtaking Havok in usage, thanks to its integration into the Unreal Engine). The problem is that the more advanced effects simply require too much horsepower to run on the CPU (especially if more of it gets used by properly coded multi-CPU-aware programs).

Frankly, I couldn't care less what the physics API is; I just want it focused exclusively on physics, multiplatform, not tied to a gameplay engine, and without a major competing API. Right now, PhysX is the closest we have to that.
January 22, 2010 11:04:05 AM

Some would argue Havok does most of this as well, nullifying any PhysX exclusivity.
Here's a little something from the willowengine config:

[Engine.ISVHacks]
DisableATITextureFilterOptimizationChecks=True
UseMinimalNVIDIADriverShaderOptimization=True

Changing them to False with a 4890, I got a good 20-30% jump, and that was after the huge jump from disabling smooth framerate, changing the max refresh to 100, and upping the max FPS to 101.
http://www.xtremesystems.org/forums/showpost.php?p=4209...
January 22, 2010 3:30:03 PM

Quote:

DisableATITextureFilterOptimizationChecks


I believe that switch was an old workaround to get in-game movies to work properly with ATI cards.

Quote:
UseMinimalNVIDIADriverShaderOptimization=True


This one I do know; it's a workaround that gets Unreal 3 games running at a steadier pace on GeForce 6/7-series cards. (I still had a 7800GTX when the first Unreal 3 games came out, and this hack helped a ton. :D) It might do something else as well, but that's primarily how I used it.

I should note that most Unreal games use a .ini like this so users can easily change settings for extra speed/compatibility depending on their configs, so maybe reading up on those two switches would help?

www.tweakguides.com/UT3_7.html [Note: I can't actually open the link at work, but both hacks point to and are explained on this page]
January 22, 2010 8:06:17 PM

PhysX is ridiculous: when GPU acceleration does make a difference (High settings in Batman: AA), it demands a 9800GTX/GTX+/GTS 250 as a dedicated card.

If a good studio like Crytek codes CPU physics well, it would easily beat the 9800GTX. It would be like using a multi-socket server/workstation motherboard with a Phenom II 955 as the main CPU and an Athlon II X4 930 just for physics (Athlon = $99, GTS 250 = $112); you get four real, powerful cores.
If Crysis were optimized for 8 cores (1 for OS background, 3 for AI/game, 4 for physics), its physics would blow the dedicated 9800GTX out of the water.
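The core-splitting idea above can be sketched roughly. This is a hypothetical Python toy (a real engine would do this with native threads in C++); it only shows the partitioning: per-body integration is farmed out to a dedicated pool of physics workers, leaving the remaining cores for the OS, AI, and game logic.

```python
from concurrent.futures import ThreadPoolExecutor

def step_body(body, dt=0.016):
    # Semi-implicit Euler: integrate one rigid body (position,
    # velocity, acceleration) forward by one ~60 Hz frame.
    x, v, a = body
    v += a * dt
    x += v * dt
    return (x, v, a)

def step_world(bodies, physics_threads=4):
    # Dedicated pool of physics workers, sized to the cores you
    # are willing to give physics (4 of 8 in the post's example).
    with ThreadPoolExecutor(max_workers=physics_threads) as pool:
        return list(pool.map(step_body, bodies))
```

In CPython the GIL limits true parallelism for pure-Python math, so this is purely an illustration of the work-partitioning scheme, not a benchmarkable claim.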
January 22, 2010 8:29:08 PM

Physics can make a real gameplay difference. In Crysis, you hide in a shack, an enemy tank sees you go in, and it runs you over; the physics kill you. If it were simply a visual effect, you would still be alive (as long as the tank did not directly hit you).
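The distinction being made here, physics that drives game state versus physics that is only eye candy, can be shown with a tiny hypothetical sketch (the function name and the energy-to-damage rule are made up for illustration):

```python
def resolve_collapse(player_hp, debris_energy, gameplay_physics=True):
    """Apply a collapsing-shack hit to the player.

    With gameplay physics, the debris' kinetic energy translates
    into real damage; with a cosmetic-only effect, the rubble
    animates but the player's state is untouched.
    """
    if gameplay_physics:
        damage = int(debris_energy // 10)  # toy energy-to-damage rule
        return max(0, player_hp - damage)
    return player_hp  # pretty rubble, no consequence

# Same event, two outcomes: simulated debris crushes the player,
# a purely animated collapse leaves them unharmed.
hp_sim = resolve_collapse(100, 2000)        # crushed
hp_fx = resolve_collapse(100, 2000, False)  # unharmed
```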
January 22, 2010 8:46:54 PM

Quote:
Also, the biggie is that Crysis's physics were gameplay physics, no matter how seemingly basic; still one up on GPU effects.

I've said it before: replace the "physics" with animations that look the same to the end user and they would see no difference. So why do they need calculations that require such power?


I directly answered your query: they would see a difference. If a piece of wood flew at them and went right through, they could see it; if it goes through any other object (trees, houses, rocks), they can see it.
January 22, 2010 10:12:31 PM

jennyh said:
I think you should sell the 5850, go back to your nvidia and stop making garbage posts like this one every day.

lol... well, I am one to see a good discussion going. I happen to like PhysX and Nvidia, and yes, most of the games I have played (except Batman, Mirror's Edge, etc.) DO NOT use PhysX, which I thought they did. I was truly under the impression that Crysis utilized PhysX, as I noticed the physics in the game changed when I bought my new card (ATI). The physics felt "stale", and the "bounce" or "reaction factor" felt like Half-Life or an older game rather than the Crysis I'm used to with an Nvidia card. I can't explain it, but it did feel different with the 5850: faster in FPS but "cheaper" in physics. Thus I figured it was PhysX... maybe it's not... but it's good to see your opinions on this as well.
January 22, 2010 10:27:02 PM

It's not, it never was.

Don't you think you'd see others saying it too? :p 

What is happening here is a placebo effect, you think you are missing PhysX and you've convinced yourself of it. Drop the 5850, stick in your old card and you'll soon see that you are imagining it.

And yes it is a good discussion...somehow :p 
January 22, 2010 10:27:33 PM

liquidsnake718 said:
lol... well, I am one to see a good discussion going. I happen to like PhysX and Nvidia, and yes, most of the games I have played (except Batman, Mirror's Edge, etc.) DO NOT use PhysX, which I thought they did. I was truly under the impression that Crysis utilized PhysX, as I noticed the physics in the game changed when I bought my new card (ATI). The physics felt "stale", and the "bounce" or "reaction factor" felt like Half-Life or an older game rather than the Crysis I'm used to with an Nvidia card. I can't explain it, but it did feel different with the 5850: faster in FPS but "cheaper" in physics. Thus I figured it was PhysX... maybe it's not... but it's good to see your opinions on this as well.


Go to settings, there is a Physics dropdown menu, turn it higher.
January 23, 2010 6:15:09 PM

http://www.nzone.com/object/nzone_physxgames_all.html

A link with a list of some games that use a PPU.

I don't get why people bash it; it must be some type of 'fanboyism', or they just don't understand how great it is.

Look what PhysX can do for your CPU during games. This is from my computer's Vantage test.



I am a fan of PhysX.

I even had it when it first came out back in 2006. I still have my BFG Ageia PPU lying around on my desk.
January 23, 2010 6:22:12 PM

seth89 said:
http://www.nzone.com/object/nzone_physxgames_all.html

A link with a list of some games that use a PPU.

I don't get why people bash it; it must be some type of 'fanboyism', or they just don't understand how great it is.

Look what PhysX can do for your CPU during games. This is from my computer's Vantage test.

http://img.techpowerup.org/100123/PPUadvantage.png

I am a fan of PhysX.

I even had it when it first came out back in 2006. I still have my BFG Ageia PPU laying around on my desk.


It's not entirely about disliking PhysX; it's disliking Nvidia for not allowing PhysX to be used alongside ATI, or even to work in the presence of an ATI card, making it not work for everyone.

With the new changes for DX11, which will work on all DX11 cards, ATI or Nvidia, we'll likely see a shift to OpenCL, compute shaders, and whatever other GPU computing is standard between both vendors.
January 23, 2010 6:58:53 PM

seth89 said:
http://www.nzone.com/object/nzone_physxgames_all.html

A link with a list of some games that use a PPU.

I don't get why people bash it; it must be some type of 'fanboyism', or they just don't understand how great it is.

Look what PhysX can do for your CPU during games. This is from my computer's Vantage test.

http://img.techpowerup.org/100123/PPUadvantage.png

I am a fan of PhysX.

I even had it when it first came out back in 2006. I still have my BFG Ageia PPU laying around on my desk.

That Vantage CPU score isn't anywhere close to representative. Most games do not use PhysX, which means it makes absolutely no difference. It won't just automatically boost CPU performance or anything like that.
January 23, 2010 7:02:47 PM

liquidsnake718 said:
Well, I have a lot of games. The ones that I play the most, or even mess around with PhysX in, are Crysis, Crysis WH, GTA4, Batman AA, Stalker, Far Cry 2, Resident Evil 5, L4D2, Gears of War, CoD MW2, CoD5, CoD4, and Fallout. Most of these games have a sense of physics, but nothing like Crysis, Batman, and GTA4. Unfortunately I cannot use PhysX anymore, as my GPU is now a 5850 (from a 9800GTX+), so I miss the option of using it over Havok.

I also have DiRT but have only tried it a few times; I downloaded it with my 5850. As for the list of games I have, it goes on and on and on and on.

Of all the games listed, only Batman AA uses PhysX. In all other games, the physics engine is entirely CPU-based (and it isn't necessarily Havok) and will run the same on both systems. I own and play several of those (including Crysis); I have used both an 8800GTX and my current 4870X2 setup, and there's no difference.

PhysX can only make a difference if the game is coded to use it. The list of games that support it is short. Overall, it's simply not a big deal.