
Do AMD cards have "terrible OpenGL performance" compared to Nvidia?

October 8, 2012 4:09:12 PM

So I have some friends at university who all say that there is no point in getting an AMD card at all because Nvidia are better in every way.

Yes, I know, please keep reading.

They seem to think that PhysX is a huge selling point (because of some incredibly specific demos they saw on Nvidia's YouTube channel) and that CUDA is a big selling point to a consumer.

But one thing that particularly interested me is that one of them said "yeah, and you should see Radeon's utterly *** open gl performance compared to Nvidia". This piqued my interest, and I did some research, but there don't seem to be any definitive answers, and I don't know of any benchmarks that could accurately substantiate such a claim.

So I'm wondering if anyone with a bit of knowledge on the matter could help out. How does AMD's OpenGL performance stack up against Nvidia's?
October 8, 2012 4:16:29 PM

OpenGL is a wishy-washy topic; it's dependent on the program.

For instance, in Minecraft, an OpenGL title, AMD performs better. But all programs basically have to utilize the cards, so which card does better will generally depend on how the programmer optimized it. Both ends of the spectrum have left OpenGL alone at the moment and are focusing on other innovations. The link above was to give you insight that Nvidia wouldn't technically be better in all categories.


The reason OpenGL dropped off is that most developers use DirectX. If more users used Linux, you would see a sharp rise in OpenGL apps.
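
Since driver quality varies per program, one thing worth checking is whose OpenGL stack a program actually lands on. Here's a minimal sketch of my own (assuming freeglut and a C compiler are available; build with something like gcc gl_info.c -lglut -lGL) that just asks the driver to identify itself:

#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    /* glutCreateWindow gives us a current OpenGL context to query */
    glutInit(&argc, argv);
    glutCreateWindow("gl-info");

    /* These strings identify whose driver stack the app is really using */
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}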
October 8, 2012 4:19:51 PM

BigMack, yeah, I know exactly what you're on about, and I take what they said not so much with a grain of salt as with a truckload.

My main motivation for this question was literally to see if there is any credence whatsoever to the claim. Sure, PhysX and CUDA actually exist and MAY be a marketing point for some people. But is this OpenGL comparison even a thing in the first place?

I understand that FurMark's software is OpenGL-based; would researching posted benches based on that be of any use?
October 8, 2012 4:20:35 PM

Dudewitbow, thanks for the link and the insight.
October 8, 2012 4:20:41 PM

There was a time that may have been true. Now both have crappy OpenGL support. If you run an OpenGL benchmark, you never know what you'll get. The last one being run on here a few months ago had AMD cards utterly destroying Nvidia cards (Cinebench).

The reality is, very few games use OpenGL. Engines John Carmack plays a part in are about the only ones that use it. The card manufacturers come up with special driver support for those specific engines and pretty much ignore everything else OpenGL-related.

There are only a few games that even use GPU-accelerated PhysX, and a few of them don't actually gain much of anything from using the GPU. It can be nice when you have one of those games; I own two such games (Batman: AA and AC). Many people aren't aware that most PhysX games don't use GPU acceleration.
October 8, 2012 4:21:00 PM

It's hard to take synthetic benchmarks as truth, as they only show relative strength. They don't show realistic results.
October 8, 2012 4:26:30 PM

Griffolion said:

I understand that FurMark's software is OpenGL-based; would researching posted benches based on that be of any use?

Well, I didn't know FurMark was OpenGL, but that won't work either. Modern cards and their drivers artificially limit power consumption and performance when they detect a 'power virus' like FurMark... (so their cards won't go up in smoke).
October 8, 2012 4:30:14 PM

BigMack, you're right about that, but I would love nothing more than to come back at them with something solid and prove them wrong.

As a bit of a funny anecdote, we were once having a general conversation about just about anything, and the concept of groupthink came up. It's basically a psychological concept that a group of like-minded people are more likely to blindly agree with each other because they are an "in-group", rather than rationally exploring a new idea put forward by a member.

Anyway, one of my Nvidia fan friends said "yeah, it's just terrible that people would fall victim to groupthink". The irony nearly killed me as it flooded the room.

Dudewitbow, you're probably right.

Bystander, I don't think I've ever played a game with GPU-accelerated PhysX, nor have I seen one on YouTube. I played Metro with my 5970 and it ran fine, plus the graphics were brilliant. I don't see what PhysX added to it, let alone GPU-accelerated PhysX.
October 8, 2012 4:39:45 PM

Griffolion said:
Bystander, I don't think I've ever played a game with GPU-accelerated PhysX, nor have I seen one on YouTube. I played Metro with my 5970 and it ran fine, plus the graphics were brilliant. I don't see what PhysX added to it, let alone GPU-accelerated PhysX.


Metro 2033 is one of two games I'm aware of that uses GPU-accelerated PhysX but doesn't really benefit from it. With or without the GPU to help, it runs similarly hard on systems. From what I can gather from the tooltips in their benchmark and in game, the advanced PhysX uses real-time calculations to display the smoke and physics effects. Without it on, the game uses precalculated values, which seem pretty good, so I don't turn it on.

Borderlands 2 doesn't seem to need an Nvidia PhysX card either, but most others will, if you want to use their advanced PhysX effects.

Here is a list of GPU-accelerated PhysX games. If someone shows you one with 100-200 games, they are looking at games built on the PhysX engine, not ones that use Nvidia cards for added effects.
http://www.geforce.com/hardware/technology/PhysX/pc-gam...
October 8, 2012 4:42:43 PM

Haha yeah, I guess you're right. They did listen to reason, once. We were looking at some benches of the 7970 and the 680. The 680 appeared to edge out the 7970, and it would be the overall winner (though with overclocking, the results would definitely have been different).

So anyway, they took the roughly 2 FPS average difference in performance across many game benches as definitive proof that Nvidia are the shiznit.

I quickly went on Scan (the standard UK PC parts supplier, in case you're outside the UK) and took a look at the prices.

Cheapest 7970: £310
Cheapest 680: £375

The best they came up with was "hmm, yeah, okay.".
October 8, 2012 4:43:03 PM

All the above said, if you play games that benefit from PhysX, it is added value when buying an Nvidia card. A lot of people call 3D a gimmick too, but I love it, and it is the reason I have Nvidia cards: I have a 3D Vision monitor, which had the best 3D option at the time I bought it, and still does with 3D Vision 2.
October 8, 2012 4:43:46 PM

PS - Thanks for that link Bystander.
October 8, 2012 4:46:30 PM

I personally don't put a great deal of stock into the whole 3D thing. I tried Nvidia 3D and it really made me dizzy (I'm normally fine with 3D). Plus, I'm not a fan of having to use their proprietary glasses and approved monitors just to get the effect. Same with AMD. I just don't buy into 3D in general. But I fully accept that some people consider Nvidia 3D a big thing, and more power to them.
October 8, 2012 4:47:29 PM

Basically, all the additions Nvidia has in their cards are preferential choices. I can't say 3D is bad, but I can definitely say it isn't for everyone. (I had a neutral experience with it in my local Fry's; my cousin, who doesn't have a high-end rig, had a negative experience.)


Basically, purchase a card for what you need and at the right price. If you want PhysX and such, then purchase an Nvidia card, but if you really don't care, don't let the additive options convince you to go that way. It's sort of like what I recently saw with batteries: you have one pack of batteries, and another pack of a different brand with a free glue stick. Don't let the glue stick decide your purchase unless you really need that glue stick.
October 8, 2012 4:49:57 PM

There is one other thing to consider in all this, and that is multi-GPU setups and micro-stuttering. This is one area where very little testing has been done. I'm only aware of one test, and it was done with last generation's cards.

http://www.tomshardware.com/reviews/radeon-geforce-stut...

Looking at the charts, it is clear that SLI delivers smoother performance. Whether that has improved is not known, as no one has done any tests since this one that I've been able to find. It has been commonly said that the higher-end cards didn't suffer from this, but clearly the 6990 did, as you can see in the chart there.
October 8, 2012 4:52:10 PM

Griffolion said:
I personally don't put a great deal of stock into the whole 3D thing. I tried Nvidia 3D and it really made me dizzy (I'm normally good with 3D). Plus, I'm not a fan of having to use their proprietary glasses and approved monitors just to get the effect. Same with AMD. I just don't buy into 3D in general. But I'm fully accepting that some people consider Nvidia 3D to be a big thing for them, and for that, more power to them that they get it.


What game had you tried? I have run into a few games where the depth contrast of the UI and objects caused me a little disorientation. That said, in most games it's pretty much like what you see in real life. Oh, and one other thing: if you get used to playing a game in 2D or 3D and switch to the other, it can be disorienting, because your mind gets accustomed to one or the other.
October 8, 2012 4:52:54 PM

bystander said:
There is one other thing to consider in all this, and that is multi-GPU setups and micro-stuttering. This is one area where very little testing has been done. I'm only aware of one test, and it was done with last generation's cards.
snip

Looking at the charts, it is clear that SLI delivers smoother performance. Whether that has improved is not known, as no one has done any tests since this one that I've been able to find. It has been commonly said that the higher-end cards didn't suffer from this, but clearly the 6990 did, as you can see in the chart there.
snip



If you trust TechReport, there was a small test done in BF3 with the newer-gen cards.

http://techreport.com/review/22890/nvidia-geforce-gtx-6...

It's much smaller than last generation, but it's still sort of there.
October 8, 2012 4:56:17 PM

dudewitbow said:
If you trust TechReport, there was a small test done in BF3 with the newer-gen cards.

http://techreport.com/review/22890/nvidia-geforce-gtx-6...

It's much smaller than last generation, but it's still sort of there.


The chart was done differently than THG's, but yeah, the AMD card has more variance for sure.
October 8, 2012 4:56:38 PM

bystander said:
What game had you tried? I have run into a few games where the depth contrast of the UI and objects caused me a little disorientation. That said, in most games it's pretty much like what you see in real life. Oh, and one other thing: if you get used to playing a game in 2D or 3D and switch to the other, it can be disorienting, because your mind gets accustomed to one or the other.


The last one I tried was the remake of Doom 3, done specially for 3D, at the Eurogamer Expo recently. The whole thing was just really bad. The time before that was at the 580 release event at Scan a couple of years ago. Black Ops (not yet out at that point) was being demoed in 3D. Same again. It didn't disorientate me; it just didn't impress me. I thought, "why would I invest in a 3D ecosystem just for this?".
October 8, 2012 4:59:07 PM

BigMack70 said:
Yup, TechReport takes a look at micro-stutter, and it looks like AMD and Nvidia are similarly good/bad on this now, and both are much better than they were in the previous generation.


Looking at that link, the AMD cards showed more micro-stuttering than the Nvidia cards, but they did show improvement over the 6000 series.

October 8, 2012 5:02:52 PM

Note the graph is in frame time, not frames per second. Frame time is how long a frame takes to be displayed; lower is better (like it says on the side).
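
For anyone reading along, the conversion is just fps = 1000 / frame_time_ms. A quick sketch of my own (with made-up frame times) showing why lower frame times mean higher FPS:

#include <stdio.h>

int main(void)
{
    /* Hypothetical frame times in milliseconds */
    double frame_times_ms[] = { 16.7, 20.0, 33.3, 50.0 };
    int n = sizeof frame_times_ms / sizeof frame_times_ms[0];
    int i;

    /* 1000 ms per second divided by ms per frame = frames per second */
    for (i = 0; i < n; i++)
        printf("%5.1f ms/frame -> %5.1f FPS\n",
               frame_times_ms[i], 1000.0 / frame_times_ms[i]);
    return 0;
}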
October 8, 2012 5:05:29 PM

Griffolion said:
The last one I tried was the remake of Doom 3 specially for 3D at the Eurogamer Expo recently. The whole thing was just really bad. The time before that was at the 580 release event at Scan a couple of years ago. Black Ops (not yet out at that point) was being demo'd in 3D. Same again. It didn't disorientate me, it just didn't impress me. I thought "why would I invest in a 3D ecosystem just for this?".


I couldn't say what you saw. It's possible it was not set up well, the game wasn't well suited to 3D, or maybe you just don't like it. Two occurrences don't necessarily mean it's something you wouldn't like when done right.

I did look up the compatibility of the MW series, and it's listed with "Good" support. From my experience with 3D Vision, "Good" is not good enough to play in 3D. I only play games listed as Excellent or 3D Vision Ready; anything else has too many problems for 3D. So that Black Ops demo was likely just a bad representation. Doom 3 is listed as "Fair", which is worse yet.

That said, there are some games listed as Good or Fair that have had mods to fix their issues, which often make them excellent for 3D. Here is the mod site I like for fixing games:
http://helixmod.wikispot.org/gamelist
October 8, 2012 5:06:37 PM

dudewitbow said:
Note the graph is in frame time, not frames per second. Frame time is how long a frame takes to be displayed; lower is better (like it says on the side).

From what I can tell, the width of the bars is really the most important part. Maybe I misunderstood the chart.
October 8, 2012 5:09:51 PM

bystander said:
I couldn't say what you saw. It's possible it was not set up well, the game wasn't well suited to 3D, or maybe you just don't like it. Two occurrences don't necessarily mean it's something you wouldn't like when done right.

I did look up the compatibility of the MW series, and it's listed with "Good" support. From my experience with 3D Vision, "Good" is not good enough to play in 3D. I only play games listed as Excellent or 3D Vision Ready; anything else has too many problems for 3D. So that Black Ops demo was likely just a bad representation. Doom 3 is listed as "Fair", which is worse yet.

That said, there are some games listed as Good or Fair that have had mods to fix their issues, which often make them excellent for 3D. Here is the mod site I like for fixing games:
http://helixmod.wikispot.org/gamelist


Perhaps you're right. But my problem with 3D is a more general one. I just don't see it as a huge advantage, at least not enough that I would spend money on the specialised hardware to implement it. I'm the same with 3D TVs. I'll take my measly two dimensions. :na:
October 8, 2012 5:13:51 PM

Looking at the charts, you may be looking at the wrong ones. They did a few different types of tests across several games, and they weren't all about micro-stutter. BF3 did show the best results for AMD, and it was close, but in some of the games the difference was bigger.

http://techreport.com/review/22890/nvidia-geforce-gtx-6...
October 8, 2012 5:18:37 PM

Another thing I'm seeing, and realizing, is that they are testing it in many different ways, not just as a study of micro-stutter. The charts you have shown, where the cards look almost the same, are about latency; the ones I initially showed were about micro-stutter. As best I can see, the AMD cards have more stutter, but spend no more time at high latency. I guess that means Nvidia was producing more consistent frames, but almost always with a bit of latency.
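
To make the distinction concrete, here's a rough sketch (entirely hypothetical numbers and an arbitrary 25 ms cutoff of my choosing, not TechReport's actual methodology) of how the two kinds of measurements can disagree: two runs with the same average frame time can differ in consistency (stutter) and in time spent beyond a threshold (latency). Compile with -lm:

#include <stdio.h>
#include <math.h>

/* Print mean, standard deviation (a stutter measure), and total time
   spent beyond a cutoff (a latency measure) for a run of frame times. */
static void analyze(const char *label, const double *ft, int n)
{
    const double cutoff_ms = 25.0; /* arbitrary threshold for this sketch */
    double sum = 0.0, var = 0.0, beyond = 0.0;
    int i;

    for (i = 0; i < n; i++)
        sum += ft[i];
    double mean = sum / n;

    for (i = 0; i < n; i++) {
        var += (ft[i] - mean) * (ft[i] - mean);
        if (ft[i] > cutoff_ms)
            beyond += ft[i] - cutoff_ms;
    }
    printf("%s: mean %.1f ms, stddev %.1f ms, %.1f ms beyond %.0f ms\n",
           label, mean, sqrt(var / n), beyond, cutoff_ms);
}

int main(void)
{
    /* Two made-up runs with the same 22 ms average frame time */
    double steady[]  = { 22, 22, 22, 22, 22, 22 }; /* consistent, no stutter */
    double jittery[] = { 12, 32, 12, 32, 12, 32 }; /* same mean, stuttery    */

    analyze("steady ", steady, 6);
    analyze("jittery", jittery, 6);
    return 0;
}

The steady run shows zero stutter and zero time beyond the cutoff; the jittery run shows a 10 ms standard deviation and 21 ms beyond the cutoff, despite an identical average.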
October 8, 2012 6:14:24 PM

Basic OpenGL? Put simply, they are about the same; not a lot of difference if the cards are a pair of competitively priced parts. CUDA and PhysX are beneficial if the game or program you are using is optimized for these added features. Some are, not many, but some are. So you need to look at WHAT you will be doing to decide whether these features are worthwhile for you. For most of us, they don't mean squat, really. They are more a "look at the cool things we have built into our products" type of thing to catch your eye and make you spend your money.

Now, before anyone says I am biased for knocking Nvidia, please note my setup!