Ati for graphics or Nvidia for physics?

Which of the two is the best choice? Should I go for Nvidia (GTX 295) for the extra included physics, or ATI for extreme DX11 graphics (HD5970)?

OR, as a third option, wait :S (I am going to wait until August or so anyway, because I need to save up my money for the GPU :) )

Which leaves a chance that Nvidia releases a better DX11 dual-GPU card.

But once again (*phfff*) my PSU is only 650 watts, and has just enough power to run the HD5970, while Nvidia's new cards will most likely need more power than that.....

Please help me out, I have until August to make a decision :D


Soulmachiklamo
37 answers Last reply
  1. The HD5970 uses 300W of power. Assuming 80% efficiency, your 650W unit gives you about 520W to work with, and the GPU leaves you with 220 watts of that. Any decent CPU that won't bottleneck the HD5970 will use 120W, and the motherboard and peripherals will use about 80W. You're cutting it quite close, my friend.

    But ultimately, the HD5970 is currently the best graphics card for gaming in the world. nVidia's GTX380, which (IIRC) is a little better than the HD5870, will not be used in the making of the dual-GPU card because of power/space issues. The GTX375/360 would be used, I think, and would therefore not be as powerful as 2x HD5870s.

    ^ I'm probably wrong somewhere in there, though. Take a spoon of salt.
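The headroom estimate above boils down to simple arithmetic. A rough sketch (the wattage figures and the "only 80% usable" pessimism are the post's assumptions, not measured values):

```python
# Rough PSU headroom check using the figures assumed in the post above.
PSU_RATED_W = 650       # rated PSU output
USABLE_FRACTION = 0.80  # the post's pessimistic "80% efficiency" assumption

# Assumed component draws in watts (from the post, not measured).
GPU_W, CPU_W, BOARD_AND_PERIPHERALS_W = 300, 120, 80

usable = PSU_RATED_W * USABLE_FRACTION   # 520 W usable
after_gpu = usable - GPU_W               # 220 W left after the HD5970
headroom = after_gpu - CPU_W - BOARD_AND_PERIPHERALS_W

print(f"usable: {usable:.0f} W, after GPU: {after_gpu:.0f} W, headroom: {headroom:.0f} W")
# usable: 520 W, after GPU: 220 W, headroom: 20 W
```

On those assumptions only about 20 W of slack remains, which matches the "cutting it quite close" verdict above.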
  2. Don't get the 295, it's old tech and overpriced; the 5970 is somewhat overpriced and very hard to find. Do you have a 2560x1600 monitor to take full advantage of such a card?

    If you are going to wait till August, then wait for Fermi to come out and see how it performs. A 650W PSU would most likely run one Fermi, because Nvidia will be limited by the same 300W PCIe cap that ATI had with its 5970.
  3. shadow187 said:
    The HD5970 uses 300W of power. Assuming 80% efficiency, your 650W unit gives you about 520W to work with, and the GPU leaves you with 220 watts of that. Any decent CPU that won't bottleneck the HD5970 will use 120W, and the motherboard and peripherals will use about 80W. You're cutting it quite close, my friend.

    But ultimately, the HD5970 is currently the best graphics card for gaming in the world. nVidia's GTX380, which (IIRC) is a little better than the HD5870, will not be used in the making of the dual-GPU card because of power/space issues. The GTX375/360 would be used, I think, and would therefore not be as powerful as 2x HD5870s.

    ^ I'm probably wrong somewhere in there, though. Take a spoon of salt.



    You were very helpful, 'my friend' :D

    I think I'll go for the HD5970; my PSU is a TruePower, is SLI certified, and has great reviews. Still, I know 100% efficiency is impossible. But I bought a decent and expensive PSU XD for this kind of thing. I saw people running this beast on a 550 watt PSU! That must've been a decent one :D

    But your speculations helped me anyway, because I don't know that much about Nvidia's new cards.

    BTW, would you choose physics or amazing graphics? I'll take ATI :D

    Physics is great but not needed to make a game great. Although it's great eye candy :S :sol:
  4. paperfox said:
    Don't get the 295, it's old tech and overpriced; the 5970 is somewhat overpriced and very hard to find. Do you have a 2560x1600 monitor to take full advantage of such a card?

    If you are going to wait till August, then wait for Fermi to come out and see how it performs. A 650W PSU would most likely run one Fermi, because Nvidia will be limited by the same 300W PCIe cap that ATI had with its 5970.


    I'll wait and see :p

    But if the HD5970 performs better on graphics, even without GPU-simulated physics, I am going to choose ATI.
    Only if Nvidia comes out with a card which is great (and at the same price as the ATI one) will I buy it.


    Thanks :)
  5. To magneezo: Dude, this is the 3rd time I've seen you post the same link. Have you even read the specification sheet of that card? It's from 10/2006.
  6. magneezo said:

    That no longer exists as a current product; the closest thing today would be EVGA's GTX275/250 Hybrid PhysX card.
  7. DarkMantle said:
    To magneezo: Dude, this is the 3rd time I've seen you post the same link. Have you even read the specification sheet of that card? It's from 10/2006.

    OK, my bad, just trying to help my fellow computer-mates who want to know how they can get PhysX and still use their ATI cards.
    Slicing the cake to have a piece and eat it too.
  8. magneezo said:
    OK, my bad, just trying to help my fellow computer-mates who want to know how they can get PhysX and still use their ATI cards.
    Slicing the cake to have a piece and eat it too.

    Are you 100% sure that the card will work with the latest drivers and that people can actually go and buy that card from any retailer?
  9. PhysX =/= physics

    PhysX is just an old, useless, and dead GPU physics acceleration method. DX11 can do the same on any DX11 card, ATI or nVidia, and will be easier and cheaper to implement. DX11 is the future of GPU-accelerated physics.
  10. Mousemonkey said:
    Are you 100% sure that the card will work with the latest drivers and that people can actually go and buy that card from any retailer?

    Who... little ol' me?
    Let me ask you a question...
    Did you go to the site and click on the tab that says 'Buy Now' and the 'Support' tab that takes you directly to the newest PhysX driver update?
  11. Yes, and none of those links take me to a store that still has them in stock, as they are no longer a current product (they were discontinued after Nvidia bought Ageia), and I can't buy one here any more; I would bet that many others would find the same. That, together with the fact that quite a few owners of this card have reported that it no longer works in their system alongside the ATI graphics card that they have, leads me to believe that you may not have been keeping up with current events.
  12. I would seriously just wait and see what Nvidia's final product is when it comes out. If you're not buying until August, you should see what Nvidia has to offer. People can say their cards will be faster or slower, but there has been no final released product or price, so you should wait for Nvidia's response. Anyway, by the time August rolls around there may have been even more revisions of the current products, so asking several months beforehand isn't going to give very useful info about what could be a completely different video card world in August.
  13. Mousemonkey said:
    Yes, and none of those links take me to a store that still has them in stock, as they are no longer a current product (they were discontinued after Nvidia bought Ageia), and I can't buy one here any more; I would bet that many others would find the same. That, together with the fact that quite a few owners of this card have reported that it no longer works in their system alongside the ATI graphics card that they have,
    leads me to believe that you may not have been keeping up with current events.

    I did drop out of sight for a couple of years... my account was still around when I popped back in around late December.
    This place was always a lot of good information, and I've learned many things coming here.
    You may get a good rousing by some, but it's good to know the place hasn't changed except for the layout.
  14. magneezo said:
    I did drop out of sight for a couple of years... my account was still around when I popped back in around late December.
    This place was always a lot of good information, and I've learned many things coming here.
    You may get a good rousing by some, but it's good to know the place hasn't changed except for the layout.

    :) I know what you mean, some things change whilst others stay the same. The graphics scene has changed somewhat with things like the whole PhysX debate and renaming schemes but other things stay the same, like the various mindsets of the opposing sides for instance.
  15. Multi-monitor, 3D, and gesture-activated tech are some of the directions things are headed in, just for starters, and GPU and CPU makers have been eyeing the optical data transfer possibilities that may lie ahead, so it's all been good fun.
  16. jonnyboyC said:
    I would seriously just wait and see what Nvidia's final product is when it comes out. If you're not buying until August, you should see what Nvidia has to offer. People can say their cards will be faster or slower, but there has been no final released product or price, so you should wait for Nvidia's response. Anyway, by the time August rolls around there may have been even more revisions of the current products, so asking several months beforehand isn't going to give very useful info about what could be a completely different video card world in August.


    That's what Nvidia wants people to do anyway. Playing mind games just to deter people from buying what is in the market right now with a promise of a better future. :p

    @OP: I would not be so concerned with Nvidia's PhysX, as it is not that widely used. Physics is not only meant for Nvidia cards; your CPU can do it too. Even Crysis doesn't depend on Nvidia cards, but it damn well looks good. :)
  17. PhysX is only a gimmick. It's not going to enhance your gameplay by too much.

    @OP, if you're waiting until August, then just wait for an ATI refresh. The HD5890 might be out, too, ^_^.
  18. That's what Nvidia wants people to do anyway. Playing mind games just to deter people from buying what is in the market right now with a promise of a better future.

    Well, if you read his original post (or later in my post), he doesn't plan on buying his card till August. The computer tech world could change quite a bit between now and then, that's all I'm saying.
  19. Let's see, what's more of a gimmick, Eyefinity or PhysX? Mmmmmm...
    By a ratio of 100-1, eyeinsanity is the gimmick. 100% of Nvidia users and a large number of ATI users can and do use PhysX in gaming, whereas maybe 1% of users will even try Eyefinity, and only about a third of those will get it running with ATI's driver woes, their grey-screening and flickering with dual monitors. But you ATI fanboys keep talking up the wonderful 5 series and how there can never be anything more powerful than what they offer, because of some power limit. We are now at the epitome and will never have more powerful cards, LOL. Wake up and smell the coffee!!
  20. ^^ Yes, Eyefinity is so good that Nvidia had to create their own version of it. And yet we don't see ATI creating a version of PhysX...
  21. paperfox said:
    And yet we don't see ATI creating a version of PhysX...

    They have talked about it though, Bullet Physics or some such, but there has been talk, no action, just talk so far.
  22. Yes, and although the basic principle behind GPU PhysX is a great one in my opinion, the fact that ATI cards cannot use it (regardless of why) moves it into the "gimmick" category, because no developer in their right mind would ship a game that requires GPU PhysX: they would be lopping off at least 30% of their market right off the bat, which is poor strategy. Thus, unless PhysX runs on ATI cards, I can't see it doing anything more than "looking good"; any essential physics will still have to be done on the CPU.
  23. Notty22, the same people that buy the Eyefinity cards (the ones with six outputs), and perhaps some of the people with just the standard cards, can and will use the Eyefinity capability. Three-monitor setups are rarely seen because they require a workstation card, aren't supported by games, or need CrossFire/SLI. ATI addresses those three issues with a single low-budget card.

    Gimmick? I think not.
  24. Quote:
    Notty22, the same people that buy the Eyefinity cards (the ones with six outputs), and perhaps some of the people with just the standard cards, can and will use the Eyefinity capability. Three-monitor setups are rarely seen because they require a workstation card, aren't supported by games, or need CrossFire/SLI. ATI addresses those three issues with a single low-budget card.

    Gimmick? I think not.


    And 100% of the people that buy Fermi will have PhysX and can use it, or at least have the in-game option of enabling it. And ATI fanbois can all tell us how, if they went out and bought three of the same monitors and a DisplayPort adapter, figured out the driver mess, got past the bezels, found a game that supports it, and enjoyed watching slow-motion fps, they could have Eyefinity. Not for me, no thanks.

    Edit: But I also don't go into every, or for that matter ANY, ATI thread and say that Eyefinity is not for me or *** on it, unlike the troll AMW1011 who whines like a little boy about how PhysX is dead in threads where members ask about it or are troubleshooting it.
  25. notty22 said:
    Quote:
    Notty22, the same people that buy the Eyefinity cards (the ones with six outputs), and perhaps some of the people with just the standard cards, can and will use the Eyefinity capability. Three-monitor setups are rarely seen because they require a workstation card, aren't supported by games, or need CrossFire/SLI. ATI addresses those three issues with a single low-budget card.

    Gimmick? I think not.


    And 100% of the people that buy Fermi will have PhysX and can use it, or at least have the in-game option of enabling it. And ATI fanbois can all tell us how, if they went out and bought three of the same monitors and a DisplayPort adapter, figured out the driver mess, got past the bezels, found a game that supports it, and enjoyed watching slow-motion fps, they could have Eyefinity. Not for me, no thanks.

    Edit: But I also don't go into every, or for that matter ANY, ATI thread and say that Eyefinity is not for me or *** on it, unlike the troll AMW1011 who whines like a little boy about how PhysX is dead in threads where members ask about it or are troubleshooting it.




    Seems you're alone; Eyefinity seems like a success to those guys. Notice how buttery smooth those games are on a single 5870.

    That said, Eyefinity isn't for me, but I don't assume my opinion is law and dictate that because I think it isn't worth it, everyone must agree.

    Oh, and the difference between Eyefinity and PhysX is that there is no cross-platform software that does the same as Eyefinity on both nVidia and ATI. I never said GPU-accelerated physics is useless, just that PhysX GPU acceleration is dead because there is a cross-platform alternative. I have absolutely zero problems with nVidia, and I am excited for Fermi as the rumors are looking up; I am just not kidding myself when it comes to PhysX.

    Oh, and go ahead and call me all the names you wish, notty; it says more about you than me. Regardless, notty, I DO respect you even if I do not respect your opinion or your motives, so, as much as I find your name-calling entertaining, let's try to keep this civil. I will begin: sorry for calling you a troll; whether you are or not is not my place to state.
  26. But what about those PhysX cards? I don't wanna buy one :) I don't like putting in an extra card only for physics, which will be included in DX11 too.

    Anyways, are those cards PCIe x1, x4, or x16?

    thanks for all the info :)
  27. PCIe 2.0 x4 begins to bottleneck cards (I believe); anything above that is fine.
  28. Soulmachiklamo said:
    Anyways, are those cards PCIe x1, x4, or x16?


    It is not the video card but the motherboard PCIe slot which is rated as x4, x8, or x16. Almost all cards can work in any of them, but the x4 interface is known to cause a noticeable performance drop for video cards. If your plan is a single-card graphics solution, then just stick the card into the available PCIe x16 slot and be happy with it.
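The slot-width discussion above comes down to bandwidth arithmetic. A sketch using the standard nominal per-lane rates (250 MB/s for PCIe 1.x, 500 MB/s for PCIe 2.0, already net of 8b/10b encoding; protocol overhead ignored):

```python
# Nominal one-way bandwidth per PCIe lane, in MB/s.
PER_LANE_MB_S = {"1.x": 250, "2.0": 500}

def slot_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Total one-way bandwidth of a PCIe slot, in MB/s."""
    return PER_LANE_MB_S[gen] * lanes

# A PCIe 2.0 x4 slot moves a quarter of what an x16 slot gives the same
# card, which is why x4 can noticeably throttle a fast GPU.
print(slot_bandwidth_mb_s("2.0", 4))   # 2000
print(slot_bandwidth_mb_s("2.0", 16))  # 8000
```

So a card dropped into an x4 slot loses 75% of the x16 transfer rate, consistent with the performance drop described above.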
  29. First off, if you give PhysX any value, it's simple: get a separate GPU for PhysX and buy the best graphics card for graphics.
    Which means at this time get the best HD5 series card for graphics, and then buy something like a GTS 250 if you want to enable PhysX effects.

    If you wanna wait 'til August, there are likely to be better solutions from both camps by that time, and even then it will likely be better to use a separate GPU for PhysX even with Fermi, instead of robbing its graphics power to do those effects. Also by August we'll have titles out like Alien vs Predator, which launches next week, and you can decide whether they and their shiny DX11 goodness are better than PhysX's shiny physics. :sol:


    magneezo said:
    Who... little ol' me?
    Let me ask you a question...
    Did you go to the site and click on the tab that says 'Buy Now' and the 'Support' tab that takes you directly to the newest PhysX driver update?


    Did you bother to see if that 'newest driver' was really 'new' at all?
    It's an 8-series driver; the latest ones are 9.xx.xx generation drivers. :hello:
  30. TheGreatGrapeApe said:
    First off, if you give PhysX any value, it's simple: get a separate GPU for PhysX and buy the best graphics card for graphics.
    Which means at this time get the best HD5 series card for graphics, and then buy something like a GTS 250 if you want to enable PhysX effects.

    If you wanna wait 'til August, there are likely to be better solutions from both camps by that time, and even then it will likely be better to use a separate GPU for PhysX even with Fermi, instead of robbing its graphics power to do those effects. Also by August we'll have titles out like Alien vs Predator, which launches next week, and you can decide whether they and their shiny DX11 goodness are better than PhysX's shiny physics. :sol:


    Did you bother to see if that 'newest driver' was really 'new' at all?
    It's an 8-series driver; the latest ones are 9.xx.xx generation drivers. :hello:

    It showed an old driver listed on the BFG support page, but when I hit the link it took me directly to the Nvidia 9.09.1112 driver page.
    I really apologize for not knowing it was so old. I feel kind of foolish.....
    From here on out I think my ears are open wider than my mouth.
    Cheers.
  31. How much of a card do you REALLY need for PhysX? I've heard of people pairing HD5870s with a 9600GT.
  32. From THG's test it seems the GTS250 (a super-sized GF9800) is the best balance that lets you max out PhysX. This may change, but currently there's no benefit from more power than that as a dedicated GPU-PPU.
    Price-wise the GF9800GT might be a good idea, and while the GF9800GT variant without a power connector consumes less energy, it also sits only about halfway between the GF9600GT and the regular GF9800GT in performance, so it might be a bit sluggish. If you want effects but not on high, a GF9500GT or 9600GSO/GT seems perfect, though on high PhysX settings it might bottleneck you a bit. However, even enabling PhysX on a dedicated GTX285 as a GPU-PPU takes a bit of a performance hit; it's mild, and it seems similar to that experienced by the GTS250 on high settings.
  33. TheGreatGrapeApe said:
    From THG's test it seems the GTS250 (a super-sized GF9800) is the best balance that lets you max out PhysX. This may change, but currently there's no benefit from more power than that as a dedicated GPU-PPU.
    Price-wise the GF9800GT might be a good idea, and while the GF9800GT variant without a power connector consumes less energy, it also sits only about halfway between the GF9600GT and the regular GF9800GT in performance, so it might be a bit sluggish. If you want effects but not on high, a GF9500GT or 9600GSO/GT seems perfect, though on high PhysX settings it might bottleneck you a bit. However, even enabling PhysX on a dedicated GTX285 as a GPU-PPU takes a bit of a performance hit; it's mild, and it seems similar to that experienced by the GTS250 on high settings.


    The 9800 GT can hit GTS 250 speeds if overclocked. Does a dedicated PhysX card show up under nTune or RivaTuner, and can it be overclocked?
  34. I don't know; I haven't seen anything about OCing a PPU-oriented graphics card. I can't see why it wouldn't work (heck, you could hack the BIOS if you wanted), but for something like this I would almost always recommend simply getting the lower-process-node GTS250 and sticking to stock speeds to keep heat and power under control. For the adventurous type, sure, give it a go; OCing is far less risky than it used to be (although BIOS flashing is more risky).
  35. magneezo said:
    From here on out I think my ears are open wider than my mouth.

    You mean your eyes, dude? Unless you're using one of those accessibility tools such as the Narrator. :D

    Agree with the above. Might as well go with the two-GPU setup to get the best of both worlds while running the old Nvidia drivers.
  36. masterjaw said:
    You mean your eyes, dude? Unless you're using one of those accessibility tools such as the Narrator. :D

    I'm sorry, explain it again please?