
PhysX card on a Crossfire ATI gaming rig

May 9, 2010 7:06:50 PM

Dear community,

I am building a new ATI-based gaming rig around the Phenom II X6 1090T processor and dual Radeon 5850s in a CrossFire arrangement. However, realizing that many of the games I wish to play at the best possible graphics quality have extensive PhysX support, I am contemplating the best way to counteract Nvidia's egoistic brainchild, at a price I can reasonably justify.

From reading the forums I have come to the conclusion that fitting a secondary Nvidia graphics card to run only the PhysX calculations would be the best step forward (taking into account the Windows 7 workaround patch that is currently available).

I guess my questions would be:

Is the above actually technically possible? I.e. Windows 7 can support two sets of graphics drivers, but will a CrossFire pair plus a third Nvidia card be feasible, or efficient?

What Nvidia card would you suggest for this configuration? (I would like to keep the cost of this component reasonably low.)

What alternative solutions could you recommend?

P.S. I do not think that cooling should be a problem, thanks to the low power usage of the 5850s and the abundant cooling equipment in a well-ventilated case (liquid cooling, a good CPU fan, and a PCI fan under the CrossFire cards; the Nvidia card would go under it).

Sincerely yours.

Dmitry
May 10, 2010 12:24:53 AM

1: PhysX with ATI was never developed; it only works with Nvidia cards and older drivers.
2: PhysX only works on Vista or XP.
3: You do realize Nvidia stopped PhysX support a while ago, right? There have been rumors they might put out a standalone driver, but nothing more official has ever been said, so I doubt anything will ever come of it.
May 10, 2010 3:12:31 AM

Hmm, I would like a pro to reply to this if possible. Most of what you are suggesting is (a) irrelevant and (b) outdated, inaccurate information:

1) Yes, ATI PhysX was never finished, and Havok was scrapped at Intel's level. However, there are ways to make contemporary Nvidia cards work alongside ATI cards for PhysX.

2) This, my dear sir, is an utter, shameless misconception. Windows 7 has the ability to run multiple graphics cards/drivers together; however, out of Nvidia's desire to keep its market share, hardware PhysX deactivates on a rig with a secondary non-Nvidia card. That happens UNLESS you apply the widely available third-party patch that removes the non-Nvidia card check. As such, PhysX WORKS on Windows 7 in a mixed-card setup. Not perfectly, but it does. (A quick way to confirm that Windows actually sees both vendors' cards is sketched after these points.)

3) See the point above. Nvidia is NOT interested in letting other developers use the PhysX engine, but there are ways to make it work under Windows 7 with modern patches.
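
As a sanity check for point 2, here is a minimal sketch (my own, not from the patch; it assumes a Windows machine with Python installed and the wmic tool on the path) that lists the video controllers Windows reports and flags whether both an ATI/AMD and an Nvidia adapter are visible once both driver stacks are installed:

    import subprocess

    # Ask WMI for the names of every installed video controller.
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "Name"],
        text=True,
    )
    names = [line.strip() for line in out.splitlines()
             if line.strip() and line.strip() != "Name"]
    print("Adapters found:", names)

    # Both vendors must show up for a mixed CrossFire + PhysX setup to work.
    has_ati = any("ATI" in n or "Radeon" in n for n in names)
    has_nvidia = any("NVIDIA" in n or "GeForce" in n for n in names)
    print("Mixed ATI + Nvidia setup visible:", has_ati and has_nvidia)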

My questions, however, were not about the possibility of the above points. They were:

Whether, under Windows 7, it is possible to set up an Nvidia PhysX card AND maintain another two cards in a CrossFire arrangement?

Whether there may be a better way of going about this (short of replacing my chosen cards)?

If it is possible to achieve PhysX via the 2 x 5850 + 1 Nvidia card method, what Nvidia card would people recommend for this?


May 10, 2010 3:42:43 AM

popatim said:
1: PhysX with ATI was never developed; it only works with Nvidia cards and older drivers.
2: PhysX only works on Vista or XP.
3: You do realize Nvidia stopped PhysX support a while ago, right? There have been rumors they might put out a standalone driver, but nothing more official has ever been said, so I doubt anything will ever come of it.

Oh, so wrong... I believe you are thinking of Ageia PhysX, which would make all your statements true. That, however, is not what the OP asked, and all of your statements should be vehemently ignored.

Sorry I can't answer the question at hand, but I'm curious whether this is still possible too. I was under the impression that it no longer was, but I'd love to be wrong. FYI, 8-series Nvidia cards and up support PhysX, and any 8-series should do fine as a dedicated card; no need to go nuts. My concern would be the inability to mark the card as dedicated to PhysX in the Nvidia drivers, as that option is usually only available with multiple Nvidia cards installed. The opinion I've seen on PhysX is that if you have an old card lying around collecting dust, throw it in for PhysX, but it's not especially worth spending extra money just to have it.
May 10, 2010 5:09:20 AM

I still can't tell much of a difference between PhysX and non-PhysX when I see screenshots...
May 10, 2010 5:17:10 AM

blaze200 said:
I still can't tell much of a difference between PhysX and non-PhysX when I see screenshots...


Well, the idea of PhysX isn't to make the game look better; it's a gameplay enhancement.

I.e. you have to be PLAYING the game to notice.
May 10, 2010 6:56:46 AM

I would suggest using a 9600GT or an 8800GT.

I tried the old ATI/PhysX thing with my 5870 and my old 260. I found it to be a serious waste of time and was shocked at how much my temps went up, which you will find even worse with 2 x 5850s!

I was not very comfortable with the temps and the extra drivers, so I purged my system. Other people have had different experiences, though, so give it a try, but be warned: you will see a dramatic temperature increase.
May 10, 2010 9:55:45 AM

Thanks for the suggestions :) 

I am interested in trying out Mirror's Edge, I love my Unreal Tournament 3, and there is the new Ghost Recon when it comes out. While this may not sound like much of a reason to some, I want to see the cost and the problems I would take on to make my system PhysX capable.

Marney_5, would you happen to recall your approximate idle 5870 temperature, and roughly how much it jumped when you tried that setup? Furthermore, I wanted to enquire about your objective opinion of the ventilation and cooling in your PC.

I would appreciate any info from anybody who has successfully configured a CrossFire setup with 5800-series cards plus a dedicated PhysX Nvidia card in a modern rig (or failed miserably, so that their experiences can be taken into account).
May 10, 2010 2:05:14 PM

Why would an X6 Phenom running two 5850s in CrossFire need a PhysX card alongside it? If all that is going to need help from a PhysX card to handle what I am going to be doing with it, I would seriously rethink the whole build from the ground up.
If you need PhysX that badly, or believe it is going to make that much difference, why don't you go Intel and SLI?
Makes no sense to me.
May 10, 2010 5:08:36 PM

Valid point, Jitpublisher.

For me, PhysX is just an added perk which I would like in my PC; however, I like the setup without it just as well. It is a solid system. I have the 3850s fitted already but was just wondering whether I can go the ATI way and still get the eye candy called PhysX in the games that support it.

I would not cry if I do not have it, but if I can get it for the low price of a low/mid-range graphics card, why not? :)
May 10, 2010 5:12:45 PM

Edit button not working:
5850s* rather than 3850s

Lack of PhysX would not break the system in my eyes; however, if for a small price I am able to get that benefit too, why not?

May 10, 2010 5:48:06 PM

Just ignore the physics; it won't provide enough benefit even in terms of eye candy.
May 11, 2010 12:32:22 AM

ShadowFlash said:
FYI, 8-series Nvidia cards and up support PhysX, and any 8-series should do fine as a dedicated card; no need to go nuts.

Only the 8600GT and above, not the 8500GT or 8400GS.
May 11, 2010 12:50:17 AM

jitpublisher said:
Why would an X6 Phenom running two 5850s in CrossFire need a PhysX card alongside it? If all that is going to need help from a PhysX card to handle what I am going to be doing with it, I would seriously rethink the whole build from the ground up.
If you need PhysX that badly, or believe it is going to make that much difference, why don't you go Intel and SLI?
Makes no sense to me.


PhysX is above and beyond whatever CPU + GPU power you may have on tap. If you enable PhysX within the game, the game looks for the PhysX processor and the extra code is executed by it; it can't just magically be run by your GPU or your CPUs instead.
Then whatever extras are added appear. They are things that fall under the definition of added detail, similar to added shadows or added particles; it may be papers or bullet fragments. Anything can be trivialized: take the water splashes added with DX11 code in DiRT 2 versus the normal water splashes in DX9. Oh, and those cost DX11 adopters about 25% of their framerate.
So new, better ideas don't always get implemented smoothly, or look and act perfect in every execution.
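
To put that 25% figure in concrete terms, here is a tiny worked example (my own; the only number taken from the thread is the roughly 25% cost quoted for the DiRT 2 splashes):

    def fps_with_overhead(base_fps, overhead_fraction):
        # Framerate left over after an effect that costs a fixed fraction of throughput.
        return base_fps * (1.0 - overhead_fraction)

    print(fps_with_overhead(60.0, 0.25))  # 45.0: a 60 fps game drops to 45 fps
    print(fps_with_overhead(40.0, 0.25))  # 30.0: a 40 fps game drops to 30 fps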
A quick Google search found this fellow with a guide:
http://www.mymobile88.com/enable-activate-physx-run-on-...
GT 240s have been recommended as good PhysX cards, 220s too. There have been steep discounts and rebates on some GT 240s, so just look around. The 240 is also 40nm and should not add any heat in and of itself, beyond there simply being an extra physical card.
May 11, 2010 2:02:58 AM

Kukushka said:
Edit button not working:
5850s* rather than 3850s

Lack of PhysX would not break the system in my eyes; however, if for a small price I am able to get that benefit too, why not?


Well, yep, fair enough. If it will work, and you want it, then by all means do it.
But I am really curious to see just what eye candy that PhysX card could add. With that setup, your games have to look pretty darn snappy even without PhysX helping them. I don't think there are that many mainstream popular titles that actually use it.
I remember a couple of years ago someone posted a whole big long list of games that did use PhysX (because everyone kept saying there were very few games that actually used it back then too), but seriously, three quarters of the games on the list I had never even heard of. They were also saying that more and more games were going to be using it, but I don't think that ever really happened. There are probably fewer games released using it today than there were a couple of years ago.
May 11, 2010 9:59:21 AM

The only game I've seen that uses PhysX well was Batman. I don't see a lot of it in Dragon Age, and the other games I own, like Crysis, have their own version of PhysX.

To be honest, I didn't miss PhysX when I changed to ATI from Nvidia.
May 11, 2010 10:11:41 AM

There is a big difference between PhysX (which all computers are capable of running) and GPU-assisted PhysX acceleration, which is used to "unlock" some hidden textures and effects in the few games outlined by mindless728. The unlock depends on detecting a running Nvidia setup, the result of a conscious developer decision to give preference to that hardware (and, arguably, of Nvidia sponsorship of their products).

I found a thread on how to successfully combine a CrossFire arrangement with an Nvidia card, so the topic has been resolved:

http://forums.techpowerup.com/showthread.php?t=72035&hi...

(in case you are using a single ati card: http://www.mymobile88.com/enable-activate-physx-run-on-... )

While these may not be ideal methods, as outlined by the latest posts in those threads, I'll settle for this halfway solution. Thanks for the help, folks!

Sincerely yours,
Dmitry

P.S. Thanks for the recommended cards.
May 11, 2010 12:38:07 PM

marney_5 said:
The only game I've seen that uses PhysX well was Batman. I don't see a lot of it in Dragon Age, and the other games I own, like Crysis, have their own version of PhysX.

To be honest, I didn't miss PhysX when I changed to ATI from Nvidia.


Don't confuse PhysX with physics. Crysis doesn't use "its own version of PhysX"; it has its own physics engine. Physics and PhysX are not the same thing.

@Kukushka, with the very small list of GPU-accelerated PhysX games, I don't think it's worth it. Though if you go that route, good luck.
May 11, 2010 1:07:41 PM

Here is a video that shows the PhysX ADDITIONS in Metro 2033; they begin at about the 50-second mark:
weapon effects
impact debris
grenade explosions
additional particles for destructible objects
May 11, 2010 1:27:25 PM

notty22 said:
Here is a video that shows the PhysX ADDITIONS in Metro 2033; they begin at about the 50-second mark:
weapon effects
impact debris
grenade explosions
additional particles for destructible objects


And this game works fine with CPU PhysX; the GPU doesn't add a whole lot.

BTW, this is why we say PhysX is a gimmick: it adds nothing to actual gameplay; these are mostly after-effects.
May 11, 2010 1:52:47 PM

Well, true enough; however, this returns us to the debate of gameplay vs. the experience of the game as a whole. After all, after-effects such as extra textures, a few more particles, and smoke that lasts just a bit longer do augment the product.

A more relevant question, of course, is whether the PhysX perk is worth its cost in time and money.
May 11, 2010 1:59:14 PM

Kukushka said:
Well, true enough; however, this returns us to the debate of gameplay vs. the experience of the game as a whole. After all, after-effects such as extra textures, a few more particles, and smoke that lasts just a bit longer do augment the product.

A more relevant question, of course, is whether the PhysX perk is worth its cost in time and money.


Well, is a $100 PhysX card worth the "extras" in a dozen or so games? Not to mention the added cost of getting a motherboard with 3 x PCIe x16 slots (for AMD, a 790FX/890FX board).

EDIT: another reason not to get the card is to push it one step closer to dying; we don't need a proprietary solution for accelerated physics.
May 11, 2010 3:18:45 PM

mindless728 said:
And this game works fine with CPU PhysX; the GPU doesn't add a whole lot.

BTW, this is why we say PhysX is a gimmick: it adds nothing to actual gameplay; these are mostly after-effects.


mindless728 said:
Well, is a $100 PhysX card worth the "extras" in a dozen or so games? Not to mention the added cost of getting a motherboard with 3 x PCIe x16 slots (for AMD, a 790FX/890FX board).

EDIT: another reason not to get the card is to push it one step closer to dying; we don't need a proprietary solution for accelerated physics.

These fanboy missions fail.
May 11, 2010 3:41:43 PM

notty22 said:
These fanboy missions fail.


And proprietary APIs like this usually fail, especially since no game dev will use GPU PhysX exclusively.

Anyway, I already said that GPU-accelerated physics is pointless, since the GPU should be rendering, especially when it is usually the bottleneck in most games.
Let the CPU handle the physics, even if it's mostly scripted at this time, and allow scalability to more than 2 cores (quad cores are already pretty mainstream for gaming builds).
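
For illustration of that multi-core scaling, here is a minimal sketch (my own; the particle model and numbers are made up, not from any real engine) of one physics step spread across every available CPU core:

    from multiprocessing import Pool

    DT = 1.0 / 60.0   # one 60 Hz frame, in seconds
    GRAVITY = -9.81   # m/s^2

    def step(particle):
        # particle = (height, vertical velocity); integrate gravity for one frame
        y, vy = particle
        vy += GRAVITY * DT
        y += vy * DT
        return (y, vy)

    if __name__ == "__main__":
        particles = [(10.0, 0.0)] * 100000
        with Pool() as pool:  # one worker per CPU core by default
            particles = pool.map(step, particles, chunksize=4096)
        print(particles[0])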
June 8, 2010 8:22:15 PM

I'm on the verge of getting a GT220 for $65, and that's all I need to get PhysX. I've been playing Metro 2033 as of yesterday and I'm really liking it, but seeing that video of the PhysX comparisons really makes me want the PhysX card (thanks Notty, I was trying to google exactly that!). Obviously, if you're broke and can barely run games, then it doesn't make the slightest difference whether you have it or not. However, in my case a PhysX card is basically the last little thing I need to have all the bells and whistles. After spending over $2000, I really can't see failing to justify $65 for PhysX.

I don't like the stranglehold Nvidia has on PhysX, but I do like PhysX. And if Nvidia could just loosen its grip and provide drivers for their cards as dedicated PhysX processors only, I'd be very happy, and IMO that's a good business strategy for them. Tons of people would consider it if they knew more about it and could get it for a reasonable price, not to mention it could pick up a lot more support in game development. And it means more sales of their low-end cards: people will still buy either ATI or Nvidia main GPUs, but will be forced to buy an Nvidia card if they want a PPU.

Mmmm, tessellation and PhysX... /drool
June 9, 2010 12:16:10 AM

Getting hold of the 275 beta Nvidia driver (the first released without the BRAND check) should also help you get your system running with an ATI main GPU; the check is not there.
http://blogs.nvidia.com/ntersect/2010/05/introducing-th...

I came to the same conclusion as you did elsewhere regarding Metro 2033 and selecting the PhysX option in the menu (with NO PhysX GPU): it does not affect framerates, because the extra code for the added details/effects is simply not running. Some games are obviously coded so that they attempt to run Nvidia GPU PhysX even without a PhysX-capable GPU, and their frame rates come to a standstill.
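
As a toy illustration of those two behaviors (my own sketch; no real SDK calls, and the 0.9 and 0.25 multipliers are assumptions, not measurements):

    BASE_FPS = 60.0

    def effective_fps(effects_enabled, physx_gpu_present):
        if not effects_enabled:
            return BASE_FPS           # the extra effect code never runs at all
        if physx_gpu_present:
            return BASE_FPS * 0.9     # assumed modest cost with a dedicated card
        return BASE_FPS * 0.25        # assumed CPU fallback: near-standstill

    for enabled in (False, True):
        for gpu in (False, True):
            print(enabled, gpu, effective_fps(enabled, gpu))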
October 27, 2010 2:02:07 AM

Hello Notty22, I know this is a fairly old post, but yes, there is a way to run PhysX with ATI cards in CrossFire mode. You need a decent power supply, though, and I recommend an 8800 series card or higher. Search on Google for the PhysX mod 1.04; if you have two monitors, it will make things a lot easier for you. I will PM you my email info and we can go from there.
January 10, 2011 12:26:44 PM

Hello,

I used to have the old CrossFire setup with two X1900s (master and XTX versions) on an AMD Socket 939 FX-60, completely water-cooled platform. After having it for three years, I started to notice that it couldn't handle some of the newer games at the time, so I bought the Ageia PhysX card for 100 quid and noticed an improvement in all games (e.g. games weren't lagging). But it broke two days after the one-year warranty ran out (what a con), and then a water-cooling leak broke the master X1900 card, leaving me with just the one X1900 for a year until I got myself an HD4890, with which I have been happy to this day, playing a bit of Modern Warfare 2, Dawn of War 2, CSS, and TF2.
January 10, 2011 4:24:18 PM

This topic has been closed by Mousemonkey