GTX 570 or ATI 6900 series
Tags: Graphics Cards, Graphics, Product
djphaze
December 17, 2010 9:55:40 PM
Just wanted to know which video card would be better for playing graphically demanding games such as Crysis. I was deciding between the Nvidia GTX 570 and the ATI 6950 or 6970. Any suggestions?
Here are my specs...
Genuine Windows 7 Home Premium 64-bit
Intel Core i7-860 quad-core processor [2.8GHz, 1MB L2 + 8MB shared L3 cache]
8GB DDR3-1333MHz SDRAM [4 DIMMs] (upgraded from 6GB)
1TB 7200 rpm SATA 3Gb/s hard drive (upgraded from 640GB)
Corsair 650W
kg2010
December 17, 2010 11:39:59 PM
Go for the 570; it's the better card overall, with more overclocking headroom and stable drivers. Just see the reviews for yourself.
You may also want to check out these threads:
http://www.overclock.net/ati/889655-6970-50-dissapointi...
Real-world user benchmarks of the 6970 vs. the 570 and 580 (even an overclocked 470 outperformed a 6970):
http://www.overclock.net/ati/890792-my-6970-benchmarks....
It's funny how some people are going around saying that the 6970 is slightly faster than a 570, when from everything I have read, the 570 has a slight edge, is $20 cheaper, and has specific benefits like CUDA, PhysX, and 3D Surround.
Plus, they are comparing AMD's fastest SINGLE GPU to Nvidia's second fastest. It's like comparing a GTX 580 to the 6950; let's look those up in the benchmarks, LOL.
Just clarifying this for you AHEAD of time.
djphaze
December 17, 2010 11:42:53 PM
Thanks. Will this PSU work with the 570?
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
kg2010
December 17, 2010 11:49:23 PM
djphaze
December 17, 2010 11:51:49 PM
djphaze
December 26, 2010 5:57:48 PM
What kind of HDMI cable do I need if I get the GTX 570?
Here is what comes with it: http://www.newegg.com/Product/Product.aspx?Item=N82E168...
djphaze
December 28, 2010 8:59:04 AM
phenom90
December 28, 2010 9:09:31 AM
djphaze
December 28, 2010 9:24:05 AM
phenom90
December 28, 2010 9:36:26 AM
willmalcom
December 28, 2010 12:43:01 PM
I had to choose between these also. I have always used ATI cards in the past, but I really wanted to try something new, so I went with the 570. I'm sure glad I did, because I love it. The 570 is everything I always expected my other graphics cards to be. I bought the Gigabyte card, and I certainly recommend it.
Yes, you need mini-HDMI. My Gigabyte card came with one, but I bought a 6 ft version that I use instead.
Have fun!
Digital Dissent
December 28, 2010 12:51:15 PM
I was in this same situation quite recently. I ALMOST (and I mean almost) bought the GTX 570, but then decided on an HD 6950. Why? Four reasons: it's cheaper; with the new hack out (it seems to almost always work) I can make it reach 6970 speeds, which is arguably a hair faster or slower than the 570; I can CrossFire it in the near future without blowing my PSU (two 570s require at least an 850W PSU, and I use a 750W); and I can run Eyefinity without needing another card, which is something I really want to try.
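For reference, the PSU argument above can be sanity-checked with back-of-the-envelope math. This is a rough sketch, not a measurement: the wattage figures below are approximate rated board/CPU powers (219W is NVIDIA's published TDP for the GTX 570; the 100W "rest of system" allowance is an assumption), and the 30% headroom factor is a common rule of thumb, not a standard.

```python
# Rough PSU sizing sketch: sum approximate component power draws
# and apply headroom for load spikes and capacitor aging.
GTX_570_TDP_W = 219      # NVIDIA's rated TDP for a single GTX 570
CPU_TDP_W = 95           # e.g. an i7-860
REST_OF_SYSTEM_W = 100   # board, RAM, drives, fans (rough allowance)

def min_psu_watts(num_gpus, headroom=1.3):
    """Recommended PSU capacity in watts, with ~30% headroom."""
    draw = num_gpus * GTX_570_TDP_W + CPU_TDP_W + REST_OF_SYSTEM_W
    return round(draw * headroom)
```

Under these assumptions, one GTX 570 comes in comfortably under a 650W unit, while two cards push the recommendation past 750W, which lines up with the "at least 850W for SLI" advice in the thread.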
willmalcom
December 28, 2010 3:17:35 PM
Then it sounds to me like you made the right choice! I had the same dilemma with CrossFire/SLI coming up, and the PSU issue too. But I know I plan to slap together a Sandy Bridge or Bulldozer system when it's available, so I can make sure to get a better PSU and an SLI-capable mobo. I also don't have any desire to use Eyefinity or multiple displays, but if I were in your shoes, I would have gotten the 69xx as well. Sounds to me like you made the logical choice; I hope it doesn't disappoint you!
17seconds
December 28, 2010 5:22:26 PM
djphaze said:
So instead of using their mini-HDMI adapter, I can go with a mini-HDMI to HDMI cable and connect directly to the video card?
Whether with an adapter or not, you will need a mini-HDMI to HDMI connection. DVI will work just as well if you don't need to transmit audio over the cable.
In fact, using the HDMI cable might cause your monitor to function like a TV instead of a PC monitor. What does that mean? Your PC monitor will no longer automatically go into standby mode when the PC shuts off; you will always need to physically turn the monitor off and on when using the computer. With DVI, the monitor will go into standby mode, turning off automatically and turning back on by itself when the PC is powered up. There is no difference in image quality between HDMI and DVI.
djphaze
December 29, 2010 12:40:00 AM
17seconds said:
Whether with an adapter or not, you will need a mini-HDMI to HDMI connection. DVI will work just as well, if you don't need to transmit audio over the cable. In fact, using the HDMI cable might cause your monitor to function like a TV instead of a PC monitor. What does that mean? Your PC monitor will no longer automatically go into Standby mode when the PC shuts off. You will always need to physically turn the monitor off and on when using the computer. With DVI, the monitor will go into Standby mode, turning off automatically, and turning on by itself when the PC is powered up. There is no difference in image quality between HDMI and DVI.
I thought HDMI gave you better quality. Hmm.
phenom90
December 29, 2010 8:48:55 AM
exhail
December 29, 2010 10:24:15 AM
Don't get the GTX 570. Get the HD 6950, BIOS-flash it, and basically turn it into an HD 6970:
http://www.techpowerup.com/articles/overclocking/vidcar...
djphaze
January 12, 2011 6:47:12 AM
andy5174
January 12, 2011 8:54:54 AM
djphaze
January 12, 2011 6:48:03 PM
djphaze
January 14, 2011 9:08:42 PM
So I'm now debating between the HD 6950 and the GTX 570. It's about a $70 difference, and that's a lot considering these cards are close in performance. Can anyone help me finalize my decision? My monitor resolution is 1920x1080, and the games I'd be playing are Crysis and Metro 2033.
djphaze
January 14, 2011 11:39:18 PM
djphaze
January 17, 2011 10:11:30 PM
djphaze
January 17, 2011 11:25:48 PM
djphaze
January 20, 2011 6:11:43 AM
Just installed the PSU and the GTX 570. I was playing and testing it out for about an hour and noticed the card got hot. I want to monitor the temperature and see how hot it's getting; can someone tell me how to do this? When I'm not gaming, the temp reads a steady 45C. When playing, it goes up to 84C. Is this normal?
17seconds
January 20, 2011 6:16:28 PM
As long as you are in the mid-80s max, you are fine. People shouldn't get too caught up in their GPU temperature as long as it's within the normal range (which yours is) and the exhaust is being vented outside the case (which yours is). The problem would come if your fan speed were being forced too high (above 75%) to maintain that load temperature.
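For reference, GUI tools like GPU-Z or MSI Afterburner will show the temperature, and on a machine with the NVIDIA driver installed, the `nvidia-smi` command-line utility reports it directly. Here's a minimal sketch that parses its CSV output and applies the rule of thumb above; the thresholds (85C/95C) are illustrative assumptions, not NVIDIA specifications, and `nvidia-smi` is assumed to be on the PATH.

```python
import subprocess

def read_gpu_temp_c(raw_csv=None):
    """Return GPU temperature in Celsius.

    If raw_csv is None, query nvidia-smi (assumed installed with the
    driver); otherwise parse the given string, e.g. "84\n", which is
    the shape of the output from:
      nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
    """
    if raw_csv is None:
        raw_csv = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"],
            text=True)
    # First line, stripped of whitespace, is the integer temperature.
    return int(raw_csv.strip().splitlines()[0])

def temp_status(temp_c, warn_c=85, crit_c=95):
    """Classify a load temperature; mid-80s under load is normal here."""
    if temp_c >= crit_c:
        return "critical"
    if temp_c >= warn_c:
        return "warm"
    return "ok"
```

With the numbers reported in the thread, an 84C load reading classifies as "ok" under these assumed thresholds.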
djphaze
January 20, 2011 9:04:54 PM
17seconds said:
As long as you are in the mid-80s max, you are fine. People shouldn't get too caught up in their GPU temperature as long as it's within the normal range (which yours is) and the exhaust is being vented outside the case (which yours is). The problem would come if your fan speed were being forced too high (above 75%) to maintain that load temperature.
OK, well that sounds a little more comforting. I was playing Just Cause 2 on max settings last night for two hours, and my GPU temp peaked at 84C. I noticed the fan starting to get louder toward the end of those two hours, so I stopped playing; I got a little worried. I touched my card and it was blazing hot.
17seconds said:
Good to see you stayed with the logic for the GTX 570. Metro 2033 is a PhysX title, so be sure to enable hardware physics when you fire it up.
I've tried it with and without PhysX, with a dedicated card, and I believe you get the best visuals-per-performance value with it off. Metro 2033 doesn't add a lot with its PhysX, and the game is super demanding already. Feel free to try it both ways; I found I had better overall performance per visual level without PhysX.
Anyways, have fun with your new setup!
bystander said:
I've tried it with and without PhysX, with a dedicated card, and I believe you get the best visuals-per-performance value with it off. Metro 2033 doesn't add a lot with its PhysX, and the game is super demanding already. Feel free to try it both ways; I found I had better overall performance per visual level without PhysX. Anyways, have fun with your new setup!
That's the whole point of getting a high-end card. Turn up the details and have no fear.
17seconds said:
That's the whole point of getting a high-end card. Turn up the details and have no fear.
Maybe, but Metro 2033 would require two GTX 570s to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You can't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without PhysX. It's just that demanding.
bystander said:
Maybe, but Metro 2033 would require two GTX 570s to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You can't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without PhysX. It's just that demanding.
It is extremely demanding. I played the demo with my old GTX 480, with everything turned up to max. The only slowdown was during the early outdoor scene when snow is falling and creatures are jumping out and, basically, it's chaos. It was still very playable. I have a lot of respect for the demands of that game. I wouldn't hesitate to play it maxed out with my system, or a good system running a GTX 570.
17seconds said:
It is extremely demanding. I played the demo with my old GTX 480, with everything turned up to max. The only slowdown was during the early outdoor scene when snow is falling and creatures are jumping out and, basically, it's chaos. It was still very playable. I have a lot of respect for the demands of that game. I wouldn't hesitate to play it maxed out with my system, or a good system running a GTX 570.
I don't know what demo you played, but you certainly didn't play the official game at max settings on a GTX 480 unless you have a tiny resolution.
http://www.pcgameshardware.com/aid,743498/Geforce-GTX-4...
The GTX 480 has 26 FPS without PhysX at 1680x1050.
EDIT: I was wrong about one thing: with a dedicated PhysX card, I can run it just fine with PhysX on. I don't lose any performance with a dedicated card. I still don't notice any difference in the actual visuals, but I'm sure there's something I'd see in side-by-side images.
bystander said:
I don't know what demo you played, but you certainly didn't play the official game at max settings on a GTX 480 unless you have a tiny resolution.
http://www.pcgameshardware.com/aid,743498/Geforce-GTX-4...
The GTX 480 has 26 FPS without PhysX at 1680x1050.
EDIT: I was wrong about one thing, with a dedicated physX card, I can run it just fine with physX on. I don't lose any performance with a dedicated card. I still don't notice any difference in actual visual, but I'm sure there is something I'd see with side by side images.
Are you German? This is a little more like what I was looking at:
http://www.legitreviews.com/article/1476/7/
(Alright, yes I had to search a little for that one, but it is possible and Legit.)
BTW, have you checked out this thread:
http://www.tomshardware.com/forum/307727-33-pics-leaked...
17seconds said:
Are you German? This is a little more like what I was looking at:
http://www.legitreviews.com/article/1476/7/
(Alright, yes I had to search a little for that one, but it is possible and Legit.)
BTW, have you checked out this thread:
http://www.tomshardware.com/forum/307727-33-pics-leaked...
I am German, but I don't speak it. The reason I linked that benchmark is that it's the only site that used maxed settings. What you provided used High instead of Very High, and it didn't include DoF either. It was also questionably playable at 39 average FPS.
When I get 54 average FPS, it spends most of the benchmark at 40. There is a section at the end of the benchmark with a stretch of extremely high FPS that boosts the average; 39 average FPS likely means a long stretch just below 30, and that's still not maxed settings.
djphaze
January 21, 2011 1:17:50 AM
Thanks for the tips. So I zip-tied the cables from the PSU to get them out of the way of the GPU so it can breathe a little; it looks a lot neater too! Then I opened up some metal slots on the back of the computer so it can be better ventilated. I bought a $15 Honeywell fan, about 10 inches in diameter, and pointed it at the GPU with the side panel of the computer open, and tadaaaaaa! It doesn't get hot anymore. I was playing Just Cause 2 for about 1.5 hours and the hottest it got was 66C. It's a little ghetto, but what a difference! Oh, and my idle temp is now 40C!
djphaze
February 22, 2011 7:37:52 PM
I just finished Crysis. Played it on max settings, with 8x anti-aliasing for most of the game. I turned it off on the last boss because I noticed a bit of a drop in FPS; nevertheless, I'm sure it could have handled it. I was pretty amazed at how beautiful and exciting that game was, and I'm really happy my computer handled it so well. I probably had two crashes from beginning to end. I'm a little late, but better late than never!
Brian_Coffey
February 22, 2011 8:39:22 PM
bystander said:
Maybe, but Metro 2033 would require two GTX 570s to get close to max details. A single card leaves you having to find what gives you the most performance with the best visuals. You can't max that one with any single card. Ever look at the detail levels the benchmark sites use for that game? They use medium settings without PhysX. It's just that demanding.
I'm currently playing Metro 2033, and I'm only using an ASUS 6850 (albeit clocked to 950/1150), which still doesn't compare to the cards you're talking about. On Very High settings @ 1920x1080 I'm getting around 30-40 FPS (no AA, of course), but you could easily max this game out with a GTX 570, especially considering this game prefers Nvidia cards. With a good CPU and a GTX 570 this game will give you 40-50 FPS at max; run it in DX10 too, not DX11.
ionut19
Did you turn on all the DX11 features?
EDIT: Sorry, from your last sentence I understand you can max the game in DX10, no? If that's so, you did not max the game settings. I haven't played Metro 2033 for a while and forget whether, after Very High, it has Extreme or something like that. If you play in DX10, then the game is not maxed.
Pc Guru_07
February 23, 2011 5:26:03 AM
Pc Guru_07
February 23, 2011 5:28:31 AM
djphaze said:
I just finished Crysis. Played it on max settings, with 8x anti-aliasing for most of the game. I turned it off on the last boss because I noticed a bit of a drop in FPS; nevertheless, I'm sure it could have handled it. I was pretty amazed at how beautiful and exciting that game was, and I'm really happy my computer handled it so well. I probably had two crashes from beginning to end. I'm a little late, but better late than never!
Did your PC crash, or just the game?
Brian_Coffey
February 23, 2011 11:51:00 AM
ionut19 said:
Did you turned on all DX11 features? EDIT: sorry, from your last sentence i understand you can max the game in DX10, no?
If that's so you did not maxed the game settings ..haven't played metro 2033 for a wile and forgot if after very high it has extreme or something like that. If you play in dx10 then the game is not maxed.
I know what you mean, but there is VIRTUALLY no difference between DX10 and DX11 in Metro 2033; research it for yourself. If maxing a game means a drastic drop in FPS for no visual gain, then what's the point? It's the same in Crysis: it's virtually unplayable in DX10 unless you have a monster card, but in DX9 you get much better performance for no visual loss. I understand where you're coming from, but as of now DX11 has little to no benefit over DX10. It will in the future.
djphaze
March 1, 2011 11:27:52 PM