
ATI vs. Nvidia

June 22, 2011 7:20:22 PM

Which is better, Nvidia or ATI cards? I'm looking at a GeForce GTX 560 or an HD 6870. I have a Corsair 650 W PSU, 4 GB of DDR3-1600 RAM, and an AMD Athlon II X4 at 2.9 GHz.


June 22, 2011 7:28:17 PM

It makes no difference; there are some who prefer one to the other, but for the most part there are no actual reasons for it. Kind of like football supporters, really.
There are some features that might sway the decision if they matter to you, like PhysX on Nvidia cards; or, if you were looking for a really good video playback experience, an AMD card would serve you better.
For general use there really is nothing in it.
If you let us know what you want it for (games, movies) and your screen resolution, we can advise better. Telling people which Athlon you have would also be helpful, as not everyone knows or can work out which one you mean just from the family and GHz ;)

Mactronix :) 
June 22, 2011 7:30:40 PM


Games mainly.
June 22, 2011 7:36:23 PM

I'm not sure what resolution I will be running it at, but it will be a lower resolution.
June 22, 2011 7:37:31 PM

I'd go with the 6870, or failing that a 560 Ti, which is a bit more expensive. 6870 CrossFire kills everything in sight for the price.
June 22, 2011 7:41:57 PM

Just so we're clear, you're looking at a GTX 560 and not a 560 Ti? Looking at reviews, it would appear that the 560 is the better performer of the two.
This review should give you an idea of how the cards perform. The review is actually of the Asus TOP version, but it includes the info on both cards you're interested in, as well as plenty of others.

Mactronix :) 
June 22, 2011 7:43:46 PM

mactronix said:
Just so we're clear, you're looking at a GTX 560 and not a 560 Ti? Looking at reviews, it would appear that the 560 is the better performer of the two.
This review should give you an idea of how the cards perform. The review is actually of the Asus TOP version, but it includes the info on both cards you're interested in, as well as plenty of others.

Mactronix :) 

Yes, the 560, not the Ti.
June 22, 2011 7:52:49 PM

cronos177 said:
I'd go with the 6870, or failing that a 560 Ti, which is a bit more expensive. 6870 CrossFire kills everything in sight for the price.


Yeah, but there's no way the OP's CPU will be able to handle CrossFire. Furthermore, 6870 CrossFire offers poor minimum FPS and a whole host of driver problems consistent across several driver releases; I know from experience and have since ditched that setup for a single GTX 570. Just saying. I think either a single 6870 or a single 560 would be a good choice for the OP; it just comes down to which models you are considering.
June 22, 2011 7:54:57 PM

OP: also, I have a backup rig running an Athlon II 240 (stock 2.8 GHz) that I have clocked at 3.5 GHz, and I've seen a big jump in performance in most games from doing so. I would consider doing the same; mind you, this was all done easily with a stock cooler from my 955 that I never used. That system uses a 5770, so I am sure that your CPU at stock will hold back a 6870/560, since either is almost as fast as two 5770s in CrossFire.
June 22, 2011 7:58:25 PM

jjb8675309 said:
Yeah, but there's no way the OP's CPU will be able to handle CrossFire. Furthermore, 6870 CrossFire offers poor minimum FPS and a whole host of driver problems consistent across several driver releases; I know from experience and have since ditched that setup for a single GTX 570. Just saying. I think either a single 6870 or a single 560 would be a good choice for the OP; it just comes down to which models you are considering.

Sure, but he's only getting one card now. The 6870 is better and cheaper than the 560 non-Ti, and if he wants to get another card in the future it will be even cheaper by then, with driver updates of course. So the best choice for now and the future is the 6870, considering there is the possibility of getting a Phenom II X4 eventually, as prices will drop because of Bulldozer. Better that than getting a $350+ card; CrossFire 6870s will last years.
June 22, 2011 8:03:00 PM

Yeah, if you can put up with spending $400+ on GPUs that don't work properly all the time. The micro-stutter was consistent across several games and drivers. I don't know about you, but when I spend $400+ on any GPU configuration, I want the overall experience to be a satisfying one. That's my two cents, but I think a 6870 for now is a great choice regardless; it is a good card by itself.
June 22, 2011 8:15:43 PM

This is for the stock 560. Of course it's well known to overclock like mad, so who would just stick with the stock clocks?
http://media.bestofmicro.com/G/4/292180/original/Avg%201920.png
June 22, 2011 8:24:02 PM

17seconds said:
This is for the stock 560. Of course it's well known to overclock like mad, so who would just stick with the stock clocks?
http://media.bestofmicro.com/G/4/292180/original/Avg%201920.png


Yup, the 6870, being cheaper, beats the GTX 560 non-Ti when looking at minimum frames (for me the most important aspect in gaming).
June 22, 2011 9:22:32 PM

OK, so which overclocks better?
June 22, 2011 9:27:59 PM

jjb8675309 said:
Yeah, if you can put up with spending $400+ on GPUs that don't work properly all the time. The micro-stutter was consistent across several games and drivers. I don't know about you, but when I spend $400+ on any GPU configuration, I want the overall experience to be a satisfying one. That's my two cents, but I think a 6870 for now is a great choice regardless; it is a good card by itself.


Just keep in mind that while you had issues, that is probably not the norm. The review sites seem to be getting good results.

Best solution

June 22, 2011 10:12:46 PM

logox said:
OK, so which overclocks better?

The 560 can get above 1 GHz. The 6870 is known to have little headroom for overclocking.

Add in the fact that Nvidia clocks its shaders at 2x the core speed, versus AMD shaders clocked at 1x the core speed. What that means is twice as much boost to shader speed for Nvidia cards when overclocking, and an explanation for the fact that Nvidia cards scale better than AMD cards when overclocking. Headroom plus scalability makes the 560 the better option for overclocking enthusiasts.


From the overclocking section of the Guru3d review of the 6870:
"As you can see the increased clock did not offer that much extra performance, we see this quite a bit with high-end ATI products."
http://www.guru3d.com/article/radeon-hd-6850-6870-revie...

From the conclusion of the Guru3d review of the ECS 560:
"Fact remains though that in it's current design this product really offers nice baseline performance for the money. And the tweakability e.g. overclock potential is nothing to be ashamed of either. Apply a tiny bit of voltage tweaking and you will get close to the 1 GHz threshold on the GPU. memory wise this product allowed to be overclocked the most, we got the memory from 4000 MHz (effective) towards 5126 MHz (effective) and with products in the low and mid-range segment, memory bandwidth matters alright."
http://www.guru3d.com/article/ecs-geforce-gtx-560-revie...
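
That memory overclock is worth quantifying; a quick back-of-the-envelope sketch (assuming the GTX 560's 256-bit memory bus):

```python
# Memory bandwidth = effective GDDR5 data rate * bus width in bytes.
bus_bytes = 256 // 8                  # 256-bit bus -> 32 bytes per transfer
for effective_mhz in (4000, 5126):    # stock vs. Guru3d's overclocked figure
    gb_per_s = effective_mhz * 1e6 * bus_bytes / 1e9
    print(f"{effective_mhz} MHz effective -> {gb_per_s:.0f} GB/s")
# ~128 GB/s stock vs. ~164 GB/s overclocked, a ~28% bandwidth increase.
```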
June 22, 2011 10:44:21 PM

Thanks to all who posted!!!
June 22, 2011 10:51:22 PM

I say: ATI, unless...

ATI usually gives you the same performance with lower power consumption and a lower retail price. So go for ATI, unless:

1) you're using Linux, for which Nvidia has better driver support;
2) you have some weird motherboard incompatibility that prevents you from running your ATI card of choice;
3) you want the most powerful graphics card available, money being no issue, and that card happens to be from Nvidia at the time you're reading this; or
4) through a fluke of the free market you can get an Nvidia card really cheap, or through a fluke of bad engineering ATI puts out a card in your preferred segment that turns out to be an unacceptable power hog (e.g. the HD 5830).
June 22, 2011 10:51:43 PM

17seconds said:

Add in the fact that Nvidia clocks its shaders at 2x the core speed, versus AMD shaders clocked at 1x the core speed. What that means is twice as much boost to shader speed for Nvidia cards when overclocking, and an explanation for the fact that Nvidia cards scale better than AMD cards when overclocking. Headroom plus scalability makes the 560 the better option for overclocking enthusiasts.


I don't want to go searching for the good reviews on the AMD side for balance at this point, but I do find this to be completely misinterpreted.

For starters, the two companies' shaders are completely different. AMD has tons more shaders than Nvidia, and they perform differently, but the end product, regardless of how they are clocked, is very similar in performance. If Nvidia's shaders are clocked at 2x the base and AMD's are clocked at 1x, then increasing the clock of either by the same percentage still results in the same percentage of performance gained.
June 22, 2011 10:56:23 PM

Best answer selected by Logox.
June 22, 2011 11:01:40 PM

I have owned several ATI boards (from long before AMD bought them) and one last year. I abandoned them for many years because of buggy drivers. I reluctantly decided to try a 5750 last year and found the drivers to be as buggy as ever. I switched back to Nvidia and the difference is quite obvious to me. Others here will tell you I'm nuts and that everybody has driver issues at times. They are right, but ATI/AMD issues are the rule rather than the exception. Believe what you want.
June 23, 2011 5:21:13 AM

bystander said:
I don't want to go searching for the good reviews on the AMD side for balance at this point, but I do find this to be completely misinterpreted.

For starters, the two companies' shaders are completely different. AMD has tons more shaders than Nvidia, and they perform differently, but the end product, regardless of how they are clocked, is very similar in performance. If Nvidia's shaders are clocked at 2x the base and AMD's are clocked at 1x, then increasing the clock of either by the same percentage still results in the same percentage of performance gained.


Agreed, it just doesn't stand up to scrutiny. Still, never mind, the OP seems happy.

Mactronix :) 
June 23, 2011 5:45:42 AM

I guess you guys failed basic arithmetic.
June 23, 2011 6:50:16 AM

17seconds said:
I guess you guys failed basic arithmetic.


This is algebra.

a * x = a * 2 * y

a = core clock speed.
x = the total per-clock power of the ATI shaders.
y = the total per-clock power of the Nvidia shaders.

If a is increased by any percentage, both sides remain equal.

If at 800 MHz they equal each other in performance, then they always equal each other, assuming no other factors, even as a approaches infinity.

Of course, the equation could be made a little more complicated, due to the fact that the cards sit at different clock speeds for their similar performance. The same principle applies.

It might look more like this for a better representation:

a = percentage of stock clock.
b = ATI's stock clock.
c = Nvidia's stock clock.

then a * b * x = a * 2 * c * y

And once again, they remain equal no matter what percentage the clocks are increased by.
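
A quick numeric sanity check of that algebra (hypothetical clocks and shader power values, chosen only so the two cards tie at stock):

```python
# Hypothetical numbers: pick per-clock shader powers x and y so that
# both cards perform identically at their stock clocks.
ati_stock, nvidia_stock = 900.0, 810.0     # MHz, example stock clocks
x = 2.0                                    # ATI per-clock shader power (arbitrary units)
y = ati_stock * x / (2 * nvidia_stock)     # chosen so performance ties at stock

def ati_perf(oc):                          # oc = overclock fraction, e.g. 0.10 for +10%
    return (1 + oc) * ati_stock * x        # ATI shaders run at 1x the core clock

def nvidia_perf(oc):
    return (1 + oc) * nvidia_stock * 2 * y # Nvidia shaders run at 2x the core clock

for oc in (0.0, 0.10, 0.25):               # same percentage bump on both cards...
    print(oc, ati_perf(oc), nvidia_perf(oc))  # ...and the two stay exactly equal
```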

I tried to spell it out, because it's hard to know how much math knowledge people on the forums have. While I was a music major, I did take up to Calculus 2 in high school.
June 23, 2011 3:09:51 PM

Your attempt is admirable and confusing, but you still haven't given an explanation for the fact that Nvidia cards do scale better when overclocking. I have, and it's pretty simple: increasing the shader speed 2x gives that much more boost than increasing the shader speed 1x. Really, in all the probably hundreds of posts I have mentioned this in, this is the only time anyone has chosen to question that basic logic.
June 23, 2011 3:18:09 PM

That's because people rarely understand math very well.

Look at it this way; maybe you can wrap your head around it more easily.

Say that at 800 MHz both cards are equal. While Nvidia multiplies the clock by 2 to get its result, and ATI multiplies the clock by 1 to get the same result, logically, that means ATI's total shaders are twice as fast per clock as Nvidia's.

800 * 2 * NvidiaShaders = 800 * 1 * ATIshaders

If you simplify that, it breaks down like so:

800/800 * 2 * NvidiaShaders = 800/800 * 1 * ATIshaders
2 * NvidiaShaders = 1 * ATIshaders

The only reason the Nvidia card would overclock better is if it had more headroom to increase. If you increase the clocks by the same percentage, they should increase performance by the same percentage.
June 23, 2011 3:35:58 PM

I agree with bystander, looking at this chart:

[chart of core overclocking results from the Xbit Labs review linked below]

Based on core frequency, overclocking an Nvidia card does not translate into 2x the performance; I think it is rather that their cores can overclock higher relative to stock than AMD's. Based on the raw numbers from the core clocks, there is not a 2x performance gain, nor a relation where each 1 MHz translates into that much more performance.

Of course, this is just an example from only one game; different games show slight differences in the performance gained from overclocking an AMD or Nvidia GPU.
Here is the link to all the charts: http://www.xbitlabs.com/articles/graphics/display/radeo...
June 23, 2011 3:47:28 PM

Until someone explains the fact that Nvidia cards scale better when overclocking, it's all just wasted words. You're putting forth a theory on why they shouldn't scale better, but they do.

Cronos, we're talking about scaling percentage, i.e. when you increase a core clock by 10%, a performance increase of 10% would equal 100% scaling. Perfect scaling doesn't happen, but Nvidia cards have a higher scaling percentage than AMD cards. Core speeds, overclocking headroom, and the absolute size of the overclock are irrelevant; the relation between the percentage overclocked and the percentage of performance gained is the issue.
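
To pin down that definition, a minimal sketch with made-up FPS numbers:

```python
def scaling_pct(clock_gain_pct, perf_gain_pct):
    """Fraction of a core overclock that shows up as real performance."""
    return 100 * perf_gain_pct / clock_gain_pct

# Made-up example: a +10% core clock that yields +7% average FPS
# would be 70% scaling; +10% clock for +10% FPS would be 100%.
print(scaling_pct(10, 7))    # 70.0
print(scaling_pct(10, 10))   # 100.0
```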
June 23, 2011 4:23:55 PM

@matto17secs,
That's all well and good for shader-heavy games, but what happens in games that are not so shader-heavy? The GPU card market works thus:
irrespective of how the architecture works, AMD and Nvidia cards tend to perform the same at a given price/performance point.
We are, after all, talking about totally different architectures here, so it stands to reason that one would do better overclocking than the other. But let's be clear: Nvidia cards do scale better, that's a fact that can't be disputed on the face of it, but it's only slight; it's not night and day. Or is it? Please post proof if it is.
You are aware that you can't just work a mathematical formula based on the number of what we shall call shaders (for ease) times clocks, right? The way each architecture utilizes its shaders makes this kind of math totally invalid.
I'm open to being educated here, as despite my status I'm always looking to learn; unless I'm 100% sure I'm right I keep an open mind, and even if I am sure I'm right I won't ignore counter-facts that prove otherwise.
I think that's a good attitude that many (no one posting here) could and should adopt.

Mactronix :) 
June 23, 2011 5:00:35 PM

A couple of notes I also wanted to mention on my example above.

1) If the stock clocks of the two cards being compared are the same or close (6950 vs. 560 Ti), they should scale about the same point for point. But if the stock clocks are vastly different, like 600 vs. 900 MHz, and the cards are equal in performance, then it takes 1.5 clock points on the 900 MHz card to equal the same increase as 1 point on the 600 MHz card. This is why I kept talking about raising the clock by a percentage and not point per point. (E.g. 560 vs. 6870: the 560 starts at 810 MHz vs. 900 MHz on the 6870, so point per point the Nvidia card gains slightly more, assuming the same performance at stock; see the sketch below.)

2) There are definitely other factors involved. I just simplified things to show how far your example was from reality.
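
A minimal sketch of that point-per-point arithmetic (assuming, as above, equal performance at stock):

```python
# With equal stock performance, each card spreads the same total
# performance across its stock clock, so a lower stock clock means
# each individual MHz carries slightly more performance.
stock_560, stock_6870 = 810.0, 900.0   # MHz
perf_at_stock = 100.0                  # arbitrary equal baseline

per_mhz_560  = perf_at_stock / stock_560    # ~0.123 performance points per MHz
per_mhz_6870 = perf_at_stock / stock_6870   # ~0.111 performance points per MHz

# The same flat +50 MHz bump gains slightly more on the 560:
print(50 * per_mhz_560)      # ~6.2 points
print(50 * per_mhz_6870)     # ~5.6 points

# But the same *percentage* bump gains the same on both:
print(0.10 * perf_at_stock)  # +10% clock -> +10 points on either card
```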
June 23, 2011 5:05:56 PM

Yes, of course, the base clocks vary wildly, which also undermines the assertion that the clock multiplier can be mathematically utilized to prove anything, along with other issues such as how AMD utilizes its shaders.

Mactronix :) 
June 23, 2011 6:48:20 PM

I still think it's pretty simple: if you increase the shader clocks twice as much, you're going to get twice as much performance increase out of the shader component. If the starting performance is the same, the card that gets the bigger clock boost is the one that gains the most performance.

Thanks for the discussion, guys; I'm moving on now!
June 23, 2011 6:51:59 PM

I love it when someone is wrong but tries to take the high ground :lol: 

Mactronix :) 
June 23, 2011 6:59:33 PM

Clearly our educational system failed here, or pride got in the way.
June 23, 2011 7:03:04 PM

I would assume the latter. Anyway, the OP seems happy and matt has moved on, so c'est la vie.
I just hate misinformation, though.

Mactronix :) 
June 23, 2011 8:33:36 PM

This topic has been closed by Mousemonkey