Do AMD cards have "terrible OpenGL performance" compared to Nvidia?


Griffolion

So I have some friends at university who all say that there is no point in getting an AMD card at all because Nvidia are better in every way.

Yes, I know, please keep reading.

They seem to think that PhysX is a huge selling point (because of some incredibly specific demos they saw on Nvidia's YouTube channel) and that CUDA is a big selling point to a consumer.

But one thing that particularly interested me is that one of them said "yeah, and you should see Radeon's utterly *** OpenGL performance compared to Nvidia". This piqued my interest and I did some research, but there don't seem to be any definitive answers, and I don't know of any benchmarks that could accurately test such a claim.

So I'm wondering if anyone with a bit of knowledge on the matter could help out. How does AMD's OpenGL performance stack up against Nvidia's?
 
OpenGL is a wishy-washy topic because performance depends on the program.

For instance, AMD performs better in Minecraft, an OpenGL title. But every program has to make use of the card somehow, so which card does better will generally depend on how the programmer optimized for it. Both vendors have largely left OpenGL behind at the moment and are focusing on other innovations. The link above was to give you some insight that Nvidia wouldn't technically be better in all categories.


The reason OpenGL dropped off is that most developers use DirectX. If more users ran Linux, you would see a sharp rise in OpenGL apps.
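Side note: since which vendor "wins" in OpenGL really comes down to the driver each program ends up talking to, you can ask the context directly which driver that is. A minimal sketch, assuming GLFW for context creation (my choice here, nothing from this thread):

```cpp
// Hedged sketch: print which OpenGL driver a program actually got.
// GLFW is an assumption for context creation; the thread doesn't
// mention any particular library.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    // A hidden window is enough to get a real context from the installed driver.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "gl-probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    // These strings come straight from the vendor's driver, so they tell you
    // whose OpenGL stack (AMD, Nvidia, Intel, Mesa...) the app is really on.
    std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```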
 

Griffolion

BigMack, yeah, I know exactly what you're on about, and I take what they said with not so much a grain of salt as a truckload.

My main motivation for this question was literally to see if there is any credence whatsoever to the claim. Sure, PhysX and CUDA actually exist and MAY be a selling point to some people. But is this OpenGL comparison even a thing in the first place?

I understand that FurMark's software is OpenGL-based; would researching posted benches based on it be of any use?
 
There was a time when that may have been true. Now both have crappy OpenGL support. If you run an OpenGL benchmark, you never know what you'll get. The last one being run on here a few months ago had AMD cards utterly destroying Nvidia cards (Cinebench).

The reality is, very few games use OpenGL. The engines John Carmack plays a part in are about the only ones that use it. The card manufacturers come up with special driver support for those specific engines and pretty much ignore everything else OpenGL-related.

There are only a few games that even use GPU-accelerated PhysX, and a few of them don't actually gain much of anything from using the GPU. It can be nice when you have one of those games; I own two such games (Batman: AA and AC). Many people aren't aware that most PhysX games don't use GPU acceleration.
 

Kari

Well, I didn't know FurMark was OpenGL, but that won't work either. Modern cards and their drivers artificially limit power consumption and performance when they detect a 'power virus' like FurMark... (so their cards won't go up in smoke).
 

Griffolion

BigMack, you're right about that, but I would love nothing more than to come back at them with something solid and prove them wrong.

As a bit of a funny anecdote, we were once having a general conversation about just about anything, and the concept of groupthink came up. It's basically the psychological concept that a group of like-minded people are more likely to blindly agree with each other because they are an "in group", rather than rationally exploring a new idea put forward by a member.

Anyway, one of my Nvidia fan friends said "yeah, it's just terrible that people would fall victim to groupthink". The irony nearly killed me as it flooded the room.

Dudewitbow, you're probably right.

Bystander, I don't think I've ever played a game with GPU-accelerated PhysX, nor have I seen one on YouTube. I played Metro with my 5970 and it ran fine, plus the graphics were brilliant. I don't see what PhysX added to it, let alone GPU-accelerated PhysX.
 


Metro 2033 is one of two games I'm aware of that uses GPU-accelerated PhysX but doesn't really benefit from it; with or without the GPU helping, it runs similarly hard on systems. From what I can gather from the tooltips in their benchmark and in-game, with advanced PhysX on it uses real-time calculations to display the smoke and particle effects. With it off, it falls back on precalculated values, which look pretty good, so I don't turn it on.
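Just to illustrate what that tooltip is describing, here's a toy sketch of the two approaches. This is an illustration only, not Metro's or PhysX's actual code:

```cpp
// Toy sketch of the trade-off described above; NOT Metro's or PhysX's
// actual code. "Precalculated" effects replay baked data, while
// "real-time" physics integrates motion every frame, which is where
// the extra cost (and the GPU acceleration) comes in.
#include <cstddef>
#include <vector>

struct Particle { float x, y, vx, vy; };

// Precalculated: cheap constant-time playback, identical every run.
float bakedSmokeHeight(const std::vector<float>& bakedFrames, std::size_t frame) {
    return bakedFrames[frame % bakedFrames.size()];
}

// Real-time: integrate every particle each frame, so effects can react
// to the scene, at the cost of per-frame simulation work.
void simulateStep(std::vector<Particle>& particles, float dt) {
    const float gravity = -9.8f;
    for (Particle& p : particles) {
        p.vy += gravity * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
    }
}

int main() {
    std::vector<Particle> smoke = {{0, 0, 0.5f, 2.0f}, {1, 0, -0.3f, 1.5f}};
    for (int frame = 0; frame < 60; ++frame)
        simulateStep(smoke, 1.0f / 60.0f);  // one second of simulated smoke
    return 0;
}
```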

Borderlands 2 doesn't seem to need an Nvidia PhysX card either, but most others will if you want to use their advanced PhysX effects.

Here is a list of GPU-accelerated PhysX games. If someone shows you one with 100-200 games, they are looking at games that use the PhysX engine, not ones that use Nvidia cards for added effects.
http://www.geforce.com/hardware/technology/PhysX/pc-games
 

Griffolion

Haha yeah, I guess you're right. They did listen to reason, once. We were looking at some benches of the 7970 and the 680. The 680 appeared to edge out the 7970 and came out the overall winner (though with overclocking, the results would definitely have been different).

So anyway, they took the, on average, ~2 FPS difference in performance across many game benches as definitive proof that Nvidia are the shiznit.

I quickly went on Scan (the standard UK PC parts supplier, in case you're outside the UK) and took a look at the prices.

Cheapest 7970: £310
Cheapest 680: £375

That's a £65 premium for an average of ~2 FPS, or over £30 per extra frame.

The best they came up with was "hmm, yeah, okay.".
 
All the above said, if you play games that benefit from PhysX, it is added value when buying an Nvidia card. A lot of people call 3D a gimmick too, but I love it, and it's the reason I have Nvidia cards: I have a 3D Vision monitor, which was the best 3D option at the time I bought it, and still is with 3D Vision 2.
 

Griffolion

I personally don't put a great deal of stock in the whole 3D thing. I tried Nvidia 3D and it really made me dizzy (I'm normally good with 3D). Plus, I'm not a fan of having to use their proprietary glasses and approved monitors just to get the effect. Same with AMD. I just don't buy into 3D in general. But I fully accept that some people consider Nvidia 3D a big deal, and more power to them.
 
Basically, all the extras Nvidia puts in its cards come down to personal preference. I can't say 3D is bad, but I can definitely say it isn't for everyone. (I had a neutral experience with it at my local Fry's; my cousin, who doesn't have a high-end rig, had a negative experience.)


Basically, purchase a card for what you need at the right price. If you want PhysX and such, then purchase an Nvidia card, but if you really don't care, don't let the add-ons convince you to go that way. It's sort of like something I recently saw with batteries: you have one pack of batteries, and another pack from a different brand with a free glue stick. Don't let the glue stick decide your purchase unless you really need that glue stick.
 
There is one other thing to consider in all this, and that is multi-GPU setups and micro-stuttering. This is one area that has had very little testing done; I'm only aware of one test, and it was done with last generation's cards.

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html

When looking at the charts, it is clear that SLI delivers smoother performance. Whether that has improved is not known, as no one has done any tests since this one that I've been able to find. It has commonly been said that the higher-end cards didn't suffer from this, but clearly the 6990 did, as you can see in this chart:
[Chart: CrossFire vs. SLI stuttering]
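For anyone who wants to see why those charts matter: micro-stutter is frame-to-frame variance, which an average FPS figure completely hides. A rough sketch of the idea, with made-up frame times rather than data from the article:

```cpp
// Sketch: why frame-time charts matter. Micro-stutter is variance in
// consecutive frame times, which average FPS hides. The numbers below
// are hypothetical, not from either linked review.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame-time log (ms): ~60 FPS on average, but with
    // alternating fast/slow frames, the classic multi-GPU stutter pattern.
    std::vector<double> ms = {10, 23, 11, 22, 10, 24, 11, 22, 10, 23};

    double sum = 0, swing = 0;
    for (std::size_t i = 0; i < ms.size(); ++i) {
        sum += ms[i];
        if (i > 0) swing += std::fabs(ms[i] - ms[i - 1]);
    }
    double avg = sum / ms.size();
    double avgSwing = swing / (ms.size() - 1);

    std::printf("Average frame time: %.1f ms (~%.0f FPS)\n", avg, 1000.0 / avg);
    std::printf("Average frame-to-frame swing: %.1f ms\n", avgSwing);
    // A big swing relative to the average means visible stutter, even
    // though the FPS counter looks fine.
    return 0;
}
```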
 


Which game had you tried? I have run into a few games where the depth contrast between the UI and objects caused me a little disorientation. That said, in most games it's pretty much like what you see in real life. Oh, and one other thing: if you get used to playing a game in 2D or 3D and then switch to the other, it can be disorienting, because your mind gets accustomed to one or the other.
 

If you trust TechReport, there was a small test done in BF3 with the newer-generation cards:

http://techreport.com/review/22890/nvidia-geforce-gtx-690-graphics-card/7

It's much smaller than last generation, but it's still there, sort of.
 

Griffolion



The last one I tried was the remake of Doom 3 made specially for 3D, at the Eurogamer Expo recently. The whole thing was just really bad. The time before that was at the 580 release event at Scan a couple of years ago, where Black Ops (not yet out at that point) was being demoed in 3D. Same again. It didn't disorient me; it just didn't impress me. I thought, "why would I invest in a 3D ecosystem just for this?".
 


Looking at that link, the AMD cards showed more micro-stuttering than the Nvidia cards, but they did show improvement over the 6000 series.

[Chart: BF3 frame times on AMD cards]
 


I couldn't say what you saw. It's possible it was not set up well, the game wasn't well suited for 3D, or maybe you just don't like it. Two bad experiences don't necessarily mean you wouldn't like it when done right.

I did look up the compatibility of the MW series, and it's listed as having "Good" support. From my experience with 3D Vision, "Good" is not good enough to play in 3D; I only play games listed as Excellent or 3D Vision Ready. Anything else has too many problems in 3D. So that Black Ops demo was likely just a bad representation. Doom 3 is listed as "Fair", which is worse yet.

That said, there are some games listed as Good or Fair that have had mods to fix their issues, which often make them excellent in 3D. Here is the mod site I like for fixing games:
http://helixmod.wikispot.org/gamelist
 

From what I can tell, the width of the bars is really the most important part. Maybe I misunderstood the chart.
 

Griffolion



You're perhaps right. But my problem with 3D is a more general one: I just don't see it as a huge advantage, at least not enough that I would spend money on the specialised hardware to implement it. I'm the same with 3D TVs. I'll take my measly two dimensions. :na:
 