
vapor x 7970 with dedicated physx card?

June 26, 2013 11:25:08 AM

Hey guys, just got my Vapor-X 7970 today and was wondering if I should use this new GT 545 DDR3 card I have as a dedicated PhysX card. I was surprised: as I was packaging up my 550 Tis to sell, I threw this card in and it was playing BF3 on high in single player at 55-60 fps with vsync on. Would it be worth using this card alongside the 7970, or should I just let the CPU do the work? I'd say the card would be smoother.
June 26, 2013 12:47:42 PM

The first question I have is: do you even play any games that support GPU-accelerated PhysX? Battlefield 3 does not support PhysX. You can find a list of games that support GPU-accelerated PhysX here.

You can try to run one of your Nvidia cards as a PhysX card alongside your 7970, but it requires a hacked driver. Nvidia has PhysX set up so that it won't work if it detects "the enemy's" card in your system. You can find more info here.
June 26, 2013 12:58:00 PM

PhysX is an Nvidia feature. It won't work alongside an AMD card (unless you really mess with things, as Supernova1138 mentioned). If you want things to work properly, and to avoid more problems than it's worth, forget about PhysX.

Though I enjoy PhysX, it's only available in a few games. Even then, many games that do support PhysX (the Batman titles, for example) aren't massive power hogs. Some people think PhysX is amazing, and it does add some eye candy, but it's not earth-shattering in my mind.
June 26, 2013 9:49:16 PM

You can, but you will need to hack the Nvidia driver because of their anti-competitive measures.

Besides, such a slow card will probably bog down performance almost as much as CPU PhysX.
June 27, 2013 6:53:38 AM

smeezekitty said:
You can, but you will need to hack the Nvidia driver because of their anti-competitive measures.

Besides, such a slow card will probably bog down performance almost as much as CPU PhysX.


I've heard a dedicated PhysX card doesn't need to be that powerful. I'm not really worried about it though, since Metro is the only game I have at the moment that uses PhysX anyway, and I don't play it that much.

I'm still running my GT 545 for my second monitor for some reason. Also, what would be better for gaming: three 1920x1080 monitors or one 2560x1440 monitor? Will a single 7970 be able to handle tri-monitor gaming? I'm not sure whether it will or not.
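The triple-1080p vs single-1440p question comes down largely to raw pixel load, which can be worked out directly. A quick back-of-the-envelope sketch (the `pixel_count` helper is just for illustration):

```python
def pixel_count(width, height, monitors=1):
    # Total pixels the GPU has to shade every frame.
    return width * height * monitors

triple_1080p = pixel_count(1920, 1080, monitors=3)   # 6,220,800 pixels
single_1440p = pixel_count(2560, 1440)               # 3,686,400 pixels

# Three 1080p screens are roughly 1.7x the pixel load of one 1440p screen.
print(triple_1080p / single_1440p)   # 1.6875
```

So a card that manages a given frame rate at 2560x1440 has to push about 69% more pixels to hold that rate across three 1080p panels.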
June 27, 2013 7:40:43 AM

smeezekitty said:
You can, but you will need to hack the Nvidia driver because of their anti-competitive measures.

Besides, such a slow card will probably bog down performance almost as much as CPU PhysX.


How is it anti-competitive when it was AMD's head of marketing who said AMD/ATi users didn't want or need PhysX, because Bullet physics was going to be so much better? You keep posting the same claims without anything to back them up, which as far as I'm concerned is knowingly spreading misinformation, which I do believe I can ban you for.
June 27, 2013 8:43:57 AM

06yfz450ridr said:
I've heard a dedicated PhysX card doesn't need to be that powerful. I'm not really worried about it though, since Metro is the only game I have at the moment that uses PhysX anyway, and I don't play it that much.

I'm still running my GT 545 for my second monitor for some reason. Also, what would be better for gaming: three 1920x1080 monitors or one 2560x1440 monitor? Will a single 7970 be able to handle tri-monitor gaming? I'm not sure whether it will or not.


Assuming your 7970 has three ports on the back, yes. However, you will have to lower the eye candy a bit.

[BF3 benchmark chart]

It keeps 30 fps with up to 4x AA, but I would drop the detail to high or medium with 4x AA and vsync on to keep it at a constant 60 fps. More benches here.
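The vsync advice above comes down to frame-time budgets: with double-buffered vsync on a 60 Hz panel, any frame that misses the ~16.7 ms deadline waits for the next refresh, so the effective rate snaps straight from 60 down to 30. A minimal sketch of that behavior (assumes double buffering; triple buffering avoids the hard snap):

```python
import math

def vsync_fps(raw_fps, refresh_hz=60.0):
    """Effective frame rate under double-buffered vsync.

    A finished frame is only shown on a refresh boundary, so a frame
    that takes longer than one refresh interval waits for the next one,
    snapping the rate to refresh_hz / 1, / 2, / 3, ...
    """
    frame_time = 1.0 / raw_fps            # seconds to render one frame
    refresh_interval = 1.0 / refresh_hz   # seconds per display refresh
    intervals = math.ceil(frame_time / refresh_interval)
    return refresh_hz / intervals

print(vsync_fps(75))   # 60.0 -- faster than refresh, capped at 60
print(vsync_fps(55))   # 30.0 -- just misses the budget, snaps to 30
```

That is why dropping the detail until the raw rate stays above 60 fps matters so much with vsync on: 55 fps doesn't show up as 55, it shows up as 30.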
June 27, 2013 9:01:16 AM

Master467 said:
Assuming your 7970 has three ports on the back, yes. However, you will have to lower the eye candy a bit. It keeps 30 fps with up to 4x AA, but I would drop the detail to high or medium with 4x AA and vsync on to keep it at a constant 60 fps.


Thanks, I couldn't find where that chart was. That's also not the GHz edition card, I bet, so 925 MHz vs 1050 MHz might be a hair higher, but I'd still need to turn down the resolution.

Also, do you know if it's possible to use two 1440x900 monitors for left and right and a 1680x1050 for the middle in Eyefinity? I think it should work, but the two different resolutions might be weird. It would be a lot cheaper for me to get one 1440x900 monitor to try it out.
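For the mixed-resolution idea, the combined surface can be sketched out. This assumes a side-by-side layout where the usable height is limited by the shortest panel (taller panels get letterboxed or cropped); actual Eyefinity driver behavior varies by version, and the `eyefinity_surface` helper is purely illustrative:

```python
def eyefinity_surface(panels):
    """Rough usable surface of a side-by-side display group.

    Assumes a 'fit to shortest panel' behavior: widths add up, and the
    usable height is limited by the shortest panel. Real driver
    behavior differs between versions, so treat this as a sketch.
    """
    widths = [w for w, _ in panels]
    heights = [h for _, h in panels]
    return sum(widths), min(heights)

# 1440x900 | 1680x1050 | 1440x900
print(eyefinity_surface([(1440, 900), (1680, 1050), (1440, 900)]))
# (4560, 900)
```

Under that assumption, the center 1680x1050 panel would lose 150 rows of usable height, which is part of why mixed groups can look odd.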
June 27, 2013 9:09:25 AM

06yfz450ridr said:
Thanks, I couldn't find where that chart was. That's also not the GHz edition card, I bet, so 925 MHz vs 1050 MHz might be a hair higher, but I'd still need to turn down the resolution.

Also, do you know if it's possible to use two 1440x900 monitors for left and right and a 1680x1050 for the middle in Eyefinity? I think it should work, but the two different resolutions might be weird. It would be a lot cheaper for me to get one 1440x900 monitor to try it out.


I reeeeeally doubt it would work...
If you are going to cheap out, you might as well not do it at all.
June 27, 2013 10:20:29 AM

Mousemonkey said:
How is it anti-competitive when it was AMD's head of marketing who said AMD/ATi users didn't want or need PhysX, because Bullet physics was going to be so much better? You keep posting the same claims without anything to back them up, which as far as I'm concerned is knowingly spreading misinformation, which I do believe I can ban you for.

What I mean by "anti-competitive" is Nvidia deliberately disabling hybrid PhysX.

Essentially, they also hurt themselves, because they had the potential to sell low-end cards to AMD/ATI users.
June 27, 2013 10:23:59 AM

smeezekitty said:
What I mean by "anti-competitive" is Nvidia deliberately disabling hybrid PhysX.

Essentially, they also hurt themselves, because they had the potential to sell low-end cards to AMD/ATI users.

They were told to, pretty much, by an AMD employee, so how is it an "anti-competitive" thing to do when the competitor says they don't want or need it?
June 27, 2013 10:30:07 AM

Mousemonkey said:
They were told to, pretty much, by an AMD employee.

No offense intended, but I would like to see proof that AMD told them to disable it. I find it hard to believe.

Of course ATI/AMD chose not to support PhysX on their own cards, but I am talking about hybrid PhysX, like the OP is asking about.

June 27, 2013 10:33:20 AM

Mousemonkey said:
They were told to, pretty much, by an AMD employee, so how is it an "anti-competitive" thing to do when the competitor says they don't want or need it?


It's anti-competitive because Nvidia stopped users with an AMD GPU from putting a low-end GTX GPU in their system. (It has NOTHING to do with anything some AMD guy might have said at some time, somewhere, possibly. [Citation needed])
It would, theoretically, make more people buy more expensive GPUs from Nvidia so they could use PhysX.
June 27, 2013 10:49:21 AM

smeezekitty said:
No offense intended, but I would like to see proof that AMD told them to disable it. I find it hard to believe.

Of course ATI/AMD chose not to support PhysX on their own cards, but I am talking about hybrid PhysX, like the OP is asking about.


Here's your starter for ten, then: http://www.google.com/webhp?hl=en&tab=Tw#hl=en&sclient=... And do bear in mind that he used to work for Nvidia before going to AMD.
June 27, 2013 11:12:53 AM

What he says is that it's for quality-assurance reasons. But realistically, all Nvidia would have to do is put up a disclaimer that users are forced to accept before enabling PhysX.
June 27, 2013 11:17:34 AM

smeezekitty said:
What he says is that it's for quality-assurance reasons. But realistically, all Nvidia would have to do is put up a disclaimer that users are forced to accept before enabling PhysX.


And as soon as there was the slightest hint of a problem, the ATi/AMD lot would be slagging Nvidia off the same way they do whenever they have a problem with any TWIMTBP title, even if it turns out to be an AMD/ATi driver issue.

How much of the history did you go through, by the way?
June 27, 2013 11:26:36 AM

Mousemonkey said:
And as soon as there was the slightest hint of a problem, the ATi/AMD lot would be slagging Nvidia off the same way they do whenever they have a problem with any TWIMTBP title, even if it turns out to be an AMD/ATi driver issue.

That is true, but quite typical when there is competition between two companies with significant fanbases. And of course, disabling it drew a significant negative reaction from AMD/ATI users.

Mousemonkey said:
How much of the history did you go through, by the way?

I only looked briefly.

In any case, I will stay with AMD because I don't agree with Nvidia. I don't own any PhysX games anyway, so it makes no difference to me.

June 27, 2013 11:31:49 AM

smeezekitty said:
That is true, but quite typical when there is competition between two companies with significant fanbases. And of course, disabling it drew a significant negative reaction from AMD/ATI users. In any case, I will stay with AMD because I don't agree with Nvidia. I don't own any PhysX games anyway, so it makes no difference to me.


That's fair enough, but it's still incorrect to misrepresent the history of how things came to be if you don't know that history and can't be bothered to research it. I know it goes back quite a ways, but some of us have been watching it unfold from the very beginning. Did you know, for instance, that it was originally ATi who proposed the very idea of GPU-accelerated physics?
June 27, 2013 11:35:49 AM

Well, I find it quite difficult to separate fact from fanboy nonsense on both sides.

In any case, I haven't watched it all unfold, because I only got into 3D graphics in the mid-2000s.
June 27, 2013 11:41:58 AM

smeezekitty said:
Well, I find it quite difficult to separate fact from fanboy nonsense on both sides.

In any case, I haven't watched it all unfold, because I only got into 3D graphics in the mid-2000s.


I'm surprised you don't know all about it then, because ATi first announced their plans for GPU-accelerated physics back in 2006, before they considered buying Ageia.
June 27, 2013 11:45:42 AM

I was stuck with Intel onboard garbage at the time.
June 27, 2013 11:47:43 AM

smeezekitty said:
I was stuck with Intel onboard garbage at the time.

Eeeew!
June 27, 2013 12:06:17 PM

Quite an interesting convo, haha. Found out you can actually use all different-size monitors in Eyefinity with no problems, and it won't make all the resolutions the same. If I can find a cheap monitor for now, I'll get one to try it until I can get a 1440p or 1600p for a decent price, or maybe for my birthday, haha. It sucks that any decent IPS one is like 500 bucks.
June 27, 2013 9:31:51 PM

06yfz450ridr said:
Quite an interesting convo, haha. Found out you can actually use all different-size monitors in Eyefinity with no problems, and it won't make all the resolutions the same. If I can find a cheap monitor for now, I'll get one to try it until I can get a 1440p or 1600p for a decent price, or maybe for my birthday, haha. It sucks that any decent IPS one is like 500 bucks.

It's going to look all weird, brah, unless somehow they are exactly the same height and exactly the same aspect ratio, that is.