GTX 570 or ATI 6900 series

Just wanted to know which video card would be better for playing graphically demanding games such as Crysis. I was deciding between the Nvidia GTX 570 and the ATI 6950 and 6970. Any suggestions?

Here are my specs...

Genuine Windows 7 Home Premium 64-bit
Intel Core i7-860 quad-core processor [2.8GHz, 1MB L2 + 8MB shared L3 cache]
8GB DDR3-1333MHz SDRAM [4 DIMMs] (upgraded from 6GB)
1TB 7200 rpm SATA 3Gb/s hard drive (upgraded from 640GB)
Corsair 650W
  1. I would buy the GTX 570. I will actually buy one in January, when I build my Sandy Bridge rig :)
  2. Go for the 570; it's the better overall card, plus it has more overclocking headroom and stable drivers. Just see the reviews for yourself.

    You may also want to check out these threads:
    http://www.overclock.net/ati/889655-6970-50-dissapointing.html

    Real-world user benchmarks of the 6970 vs. the 570 and 580 (even an overclocked 470 outperformed a 6970):
    http://www.overclock.net/ati/890792-my-6970-benchmarks.html

    It's funny how some people are going around saying that the 6970 is slightly faster than a 570, when from everything I have read, the 570 has a slight edge, it's $20 cheaper, and has specific benefits like CUDA, PhysX, and 3D Surround.

    Plus, they are comparing AMD's fastest SINGLE GPU to Nvidia's 2nd fastest, it's like comparing a GTX 580 to the 6950, let's look those up in the benchmarks, LOL

    Just clarifying this for you AHEAD of time.
  3. Yep it will - if you think you might SLI down the road, be sure to get a 750W or 850W Corsair PSU.

    But if you're going with a single card, that PSU will handle it.
  4. kg2010 said:
    Yep it will - if you think you might SLI down the road, be sure to get a 750W or 850W Corsair PSU.

    But if you're going with a single card, that PSU will handle it.


    Sweet! thanks for your help!
  5. What kind of HDMI cable do I need if I get the GTX 570?

    Here is what comes with it... http://www.newegg.com/Product/Product.aspx?Item=N82E16814130595
  6. Anyone? I would appreciate it if I could get some help.
  7. Most HDMI cables sold in shops will be compatible, as the card ships with a mini-HDMI to HDMI adapter... look for ones that support HDMI 1.3a.
  8. If you wouldn't mind flashing the BIOS, then a 6950 would be AWESOME bang for the buck. You can unlock a 6950 into a 6970 just by flashing the BIOS, and since the 6950 has a dual BIOS, even if the flash goes wrong the card will still work fine :)
  9. phenom90 said:
    Most HDMI cables sold in shops will be compatible, as the card ships with a mini-HDMI to HDMI adapter... look for ones that support HDMI 1.3a.


    So I should buy a basic HDMI cable? I don't need a mini-HDMI cable?
  10. No. If you buy the EVGA GTX 570, it should come with the adapter, as shown in the picture...
  11. I had to choose between these also. I have always used ATI cards in the past, but I really wanted to try something new, so I went with the 570. I am sure glad that I did, because I love it. The 570 is everything I always expected my other graphics cards to be. I bought the Gigabyte card, and I certainly recommend it.

    Yes you need mini HDMI, my Gigabyte card came with one, but I bought a 6ft version that I use instead.

    Have fun!
  12. I was in this same situation quite recently. I ALMOST (and I mean almost) bought the GTX 570, but then decided on an HD 6950. Why? Four reasons: it's cheaper; with the new hack out (it seems to almost always work) I can make it reach 6970 speeds, which is arguably slightly faster/slower than the 570; I can CrossFire it in the near future without blowing my PSU (two 570s require at least an 850W PSU, and I use a 750W); and I can run Eyefinity without needing another card, something I really do want to try.
  13. Then it sounds to me like you made the right choice! I had the same dilemma with CrossFire/SLI coming up for me as well, and the PSU issue too. But I know that I plan to put together a Sandy Bridge or Bulldozer system when they are available, so I can make sure to get a better PSU and an SLI-capable mobo. I also have no desire to use Eyefinity or multiple displays, but if I were in your shoes, I would have gotten the 69xx also. Sounds to me like you made the logical choice; I hope it doesn't disappoint you!
  14. phenom90 said:
    No. If you buy the EVGA GTX 570, it should come with the adapter, as shown in the picture...


    So instead of using their mini-HDMI adapter, I can go with a mini-HDMI to HDMI cable and connect it directly to the video card?
  15. djphaze said:
    So instead of using their mini-HDMI adapter, I can go with a mini-HDMI to HDMI cable and connect it directly to the video card?

    Whether with an adapter or not, you will need a mini-HDMI to HDMI connection. DVI will work just as well, if you don't need to transmit audio over the cable.

    In fact, using the HDMI cable might cause your monitor to function like a TV instead of a PC monitor. What does that mean? Your PC monitor will no longer automatically go into Standby mode when the PC shuts off. You will always need to physically turn the monitor off and on when using the computer. With DVI, the monitor will go into Standby mode, turning off automatically, and turning on by itself when the PC is powered up. There is no difference in image quality between HDMI and DVI.
  16. 17seconds said:
    Whether with an adapter or not, you will need a mini-HDMI to HDMI connection. DVI will work just as well, if you don't need to transmit audio over the cable.

    In fact, using the HDMI cable might cause your monitor to function like a TV instead of a PC monitor. What does that mean? Your PC monitor will no longer automatically go into Standby mode when the PC shuts off. You will always need to physically turn the monitor off and on when using the computer. With DVI, the monitor will go into Standby mode, turning off automatically, and turning on by itself when the PC is powered up. There is no difference in image quality between HDMI and DVI.


    I thought HDMI gave you better quality. Hmm.
  17. djphaze said:
    I thought HDMI gave you better quality. Hmm.

    I spent a lot of time researching that, and apparently there is no difference whatsoever. In fact, DVI was used as the basis for HDMI. The only difference is that HDMI carries sound.
  18. Yes... I'm currently using a DVI to HDMI adapter for my PC, and there is no difference in picture quality whatsoever... so you don't need to worry.
  19. Don't get the GTX 570; get the HD 6950, flash its BIOS, and basically turn it into an HD 6970.

    http://www.techpowerup.com/articles/overclocking/vidcard/159
  20. So I'm going with the GTX 570. The resolution I will be playing at is 1920x1080; will this work well?
  21. It will be excellent in 99% of games and adequate in highly demanding games like Metro 2033.
  22. andy5174 said:
    It will be excellent in 99% of games and adequate in highly demanding games like Metro 2033.


    Thanks!
  23. So I'm now debating between the HD 6950 and the GTX 570. It's about a $70 difference, and that's a lot considering these cards are close in performance. Can anyone help me finalize my decision? My monitor resolution is 1920x1080, and the games I would be playing are Crysis and Metro 2033.
  24. Does anyone know where I can find the best price for the GTX 570?
  25. I just ordered my card: a GTX 570 Superclocked. I can't wait!
  26. Good to see you stayed with the logic for the GTX570. Metro 2033 is a PhysX title, so be sure to enable hardware physics on that one when you fire it up.
  27. 17seconds said:
    Good to see you stayed with the logic for the GTX570. Metro 2033 is a PhysX title, so be sure to enable hardware physics on that one when you fire it up.


    Nice. Will do. Thanks!
  28. Just installed the PSU and the GTX 570. I was playing and testing it out for about an hour and noticed my card got hot. I want to check the temperature and see how hot it's getting; can someone help me out and tell me how to do this? When I'm not gaming the temp reads a steady 45C; when playing it goes up to 84C. Is this normal?
  29. Yes, if your case does not have proper airflow or the ambient temperature is high, that temperature is OK, though a bit on the higher side for a GTX 570.
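To actually read those numbers, tools like GPU-Z or MSI Afterburner will display and log GPU temperature and fan speed in a GUI; on Nvidia cards you can also query the driver's `nvidia-smi` command-line utility from a script. A minimal sketch, assuming `nvidia-smi` is installed and on the PATH (the parsing helper and sample values are just for illustration):

```python
import subprocess

def parse_stats(csv_line):
    """Parse 'temperature, fan%' from nvidia-smi CSV output, e.g. '84, 72'."""
    temp, fan = (int(x.strip()) for x in csv_line.split(","))
    return temp, fan

def gpu_stats():
    """Ask the Nvidia driver for the current GPU temperature (C) and fan speed (%)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,fan.speed",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_stats(out.strip())
```

Running `gpu_stats()` in a loop while gaming would show the idle-to-load swing described above (45C idle, 84C under load).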
  30. Thanks for the reply. Is there a way I can keep the card cooler?
  31. As long as you are in the mid-80's max, then you are fine. People shouldn't get too caught up in their GPU temperature as long as it's within the normal range (which yours is) and the exhaust is being vented outside the case (which yours is). The problem would come if your fan speeds were being forced too high to maintain that load temperature (above 75%).
  32. 17seconds said:
    As long as you are in the mid-80's max, then you are fine. People shouldn't get too caught up in their GPU temperature as long as it's within the normal range (which yours is) and the exhaust is being vented outside the case (which yours is). The problem would come if your fan speeds were being forced too high to maintain that load temperature (above 75%).


    OK, well that sounds a little more comforting. I was playing Just Cause 2 at max settings last night for 2 hours; my GPU temp at its highest was 84C. I noticed the fan getting louder toward the end of those two hours, so I stopped playing. I got a little worried; I touched my card and it was blazing hot.
  33. Keep an eye on your fan speeds and make sure you are under, really, 72% or less. If so, game on!
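The rule of thumb in these replies (load temperatures into the mid-80s C are fine as long as the fan isn't forced above roughly 72-75% to hold them there) can be sketched as a quick check. The threshold defaults below are just the numbers quoted in this thread, not official limits:

```python
def cooling_ok(load_temp_c, fan_percent, max_temp_c=85, max_fan_percent=75):
    """Thread rule of thumb: the load temperature stays within the normal
    range AND the fan isn't being pushed too hard to maintain it."""
    return load_temp_c <= max_temp_c and fan_percent <= max_fan_percent
```

With the load temperature reported above (84C) and a fan at, say, 65%, this check passes; a fan pinned at 90% to hold the same temperature would fail it.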
  34. 17seconds said:
    Good to see you stayed with the logic for the GTX570. Metro 2033 is a PhysX title, so be sure to enable hardware physics on that one when you fire it up.


    I've tried it with and without PhysX, with a dedicated card, and I believe you get the best visuals for the performance with it off. Metro 2033 doesn't add a lot with its PhysX, and the game is super demanding already. Feel free to try it both ways; I found I had better overall performance for the visual level without PhysX.

    Anyways, have fun with your new setup!
  35. bystander said:
    I've tried it with and without PhysX, with a dedicated card, and I believe you get the best visuals for the performance with it off. Metro 2033 doesn't add a lot with its PhysX, and the game is super demanding already. Feel free to try it both ways; I found I had better overall performance for the visual level without PhysX.

    Anyways, have fun with your new setup!

    That's the whole point of getting a high-end card. Turn up the details and have no fear.
  36. 17seconds said:
    That's the whole point of getting a high-end card. Turn up the details and have no fear.


    Maybe, but Metro 2033 would require two GTX 570s to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You don't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without PhysX. It's just that demanding.
  37. bystander said:
    Maybe, but Metro 2033 would require two GTX 570s to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You don't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without PhysX. It's just that demanding.

    It is extremely demanding. I played the demo with my old GTX 480, with everything turned up to max. The only slowdown was during the early outdoor scene when snow is falling and creatures are jumping out and, basically, it's chaos. It was still very playable. I have a lot of respect for the demands of that game. I wouldn't hesitate to play it maxed out with my system, or a good system running a GTX 570.
  38. 17seconds said:
    It is extremely demanding. I played the demo with my old GTX480, with everything turned up to max. The only slowdown was during the early outdoor scene when snow is falling and creatures are jumping out and basically, its chaos. It was still very playable. I have a lot of respect for the demands of that game. I wouldn't hesitate to play it maxed out with my system, or a good system running a GTX570.


    I don't know what demo you played, but you certainly didn't play the official game at max settings on a GTX 480 unless you have a tiny resolution.

    http://www.pcgameshardware.com/aid,743498/Geforce-GTX-480-and-GTX-470-reviewed-Fermi-performance-benchmarks/Reviews/?page=13

    The GTX 480 has 26 FPS without PhysX at 1680x1050.

    EDIT: I was wrong about one thing: with a dedicated PhysX card, I can run it just fine with PhysX on. I don't lose any performance with a dedicated card. I still don't notice any difference in the actual visuals, but I'm sure there is something I'd see with side-by-side images.
  39. bystander said:
    I don't know what demo you played, but you certainly didn't play the official game at max settings on a gtx480 unless you have a tiny resolution.

    http://www.pcgameshardware.com/aid,743498/Geforce-GTX-480-and-GTX-470-reviewed-Fermi-performance-benchmarks/Reviews/?page=13

    The GTX 480 has 26 FPS without PhysX at 1680x1050.

    EDIT: I was wrong about one thing, with a dedicated physX card, I can run it just fine with physX on. I don't lose any performance with a dedicated card. I still don't notice any difference in actual visual, but I'm sure there is something I'd see with side by side images.

    Are you German? This is a little more like what I was looking at:
    http://www.legitreviews.com/article/1476/7/
    (Alright, yes I had to search a little for that one, but it is possible and Legit.)

    BTW, have you checked out this thread:
    http://www.tomshardware.com/forum/307727-33-pics-leaked-benchmarks#t2300547
  40. 17seconds said:
    Are you German? This is a little more like what I was looking at:
    http://www.legitreviews.com/article/1476/7/
    (Alright, yes I had to search a little for that one, but it is possible and Legit.)

    BTW, have you checked out this thread:
    http://www.tomshardware.com/forum/307727-33-pics-leaked-benchmarks#t2300547


    I am German, but I don't speak it. The reason I linked that benchmark is that it's the only site that used maxed settings. What you provided used High instead of Very High, and it didn't include DoF either. It was also questionably playable at 39 average FPS.

    When I get 54 average FPS, it spends most of the benchmark at 40. There is a section at the end of the benchmark with a stretch of extremely high FPS that boosts the average. A 39 FPS average likely means a long stretch just below 30, and that's still not maxed settings.
  41. Thanks for the tips. So I zip-tied the cables from the PSU to get them out of the way of the GPU so it can breathe a little. Looks a lot neater too! Then I opened up some metal slots on the back of the computer so it can be better ventilated. I bought a $15 Honeywell fan, about 10 inches in diameter, and pointed it at the GPU with the side panel of the computer open, and tadaaaaaa! It doesn't get hot anymore. I was playing Just Cause 2 for about 1.5 hours and the hottest it got was 66C. It's a little ghetto, but what a difference! Oh, and my idle temp is now 40C!
  42. I just finished Crysis. Played it on max settings, with 8x anti-aliasing for most of the game! I turned it off on the last boss because I noticed a bit of a drop in FPS; nevertheless, I'm sure it could have handled it. I was pretty amazed at how beautiful and exciting that game was, and I'm really happy my computer handled it so well. I probably had 2 crashes from beginning to end. I'm a little late, but better late than never!
  43. bystander said:
    Maybe, but Metro 2033 would require two gtx 570's to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You don't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without physX. It's just that demanding.



    I'm currently playing Metro 2033 :) and I'm only using an ASUS 6850, albeit clocked to 950/1150, but it still doesn't compare to the cards you're talking about :) On Very High settings @ 1920x1080 I'm getting around 30-40 FPS, no AA of course, but you could easily max this game out with a GTX 570, especially considering this game prefers Nvidia cards :) With a good CPU and a GTX 570 this game will give you 40-50 FPS at max. Run it in DX10 too, not DX11 :)
  44. Did you turn on all the DX11 features?

    EDIT: sorry, from your last sentence I understand you can max the game in DX10, no?

    If that's so, you did not max the game's settings... I haven't played Metro 2033 for a while and forget whether after Very High it has Extreme or something like that. If you play in DX10 then the game is not maxed.
  45. What is your resolution, OP?
  46. djphaze said:
    I just finished Crysis. Played it on max settings, with 8x anti-aliasing for most of the game! I turned it off on the last boss because I noticed a bit of a drop in FPS; nevertheless, I'm sure it could have handled it. I was pretty amazed at how beautiful and exciting that game was, and I'm really happy my computer handled it so well. I probably had 2 crashes from beginning to end. I'm a little late, but better late than never!

    Did your PC crash, or just the game?
  47. ionut19 said:
    Did you turn on all the DX11 features?

    EDIT: sorry, from your last sentence I understand you can max the game in DX10, no?

    If that's so, you did not max the game's settings... I haven't played Metro 2033 for a while and forget whether after Very High it has Extreme or something like that. If you play in DX10 then the game is not maxed.


    I know what you mean, but there is VIRTUALLY no difference between DX10 and DX11 in Metro 2033; research it for yourself. If maxing a game means a drastic drop in FPS for no visual gain, then what's the point? It's the same in Crysis: it's virtually unplayable in DX10 unless you have a monster card, but in DX9 you get much better performance for no visual loss. But yes, I understand where you're coming from; as of now DX11 has little to no benefit over DX10, though it will in the future :)
  48. Pc Guru_07 said:
    Did your PC crash, or just the game?


    Just the game, Crysis, crashed.