How accurate is the Graphics Card Hierarchy Chart?

February 13, 2010 4:50:07 PM

Some of you might be familiar with this card: http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Gigabyte's HD 4550, $20 after rebates. I was tempted to pull the trigger on this one to upgrade my failing system, which has Nvidia's 7600 GS.

Yet behold: on the graphics hierarchy chart, the 7600 GS is a tier higher than the HD 4550! So even though the 4550 has features the 7600 GS lacks (HD playback, HDMI, etc.), I decided not to sacrifice gaming performance.

Except this is untrue. I don't care what benchmark you look at, the HD 4550 consistently outperforms the 7600 GS, even in gaming. Now, if they were in the same tier, things would make sense and all would be well... but it makes no sense that the 7600 GS is ranked higher on the chart.

Which leads me to wonder how this chart was created in the first place, and whether any reviews are being done at all...
February 13, 2010 5:09:55 PM

Hi jyjjy. I'm reasonably knowledgeable about mainstream and high-end gaming cards, but this was the first time I'd looked at an extremely cheap, home-theater-oriented graphics card, and I was curious how well it would double as a gaming card (for potential LAN parties at my place). I've been using the graphics card hierarchy chart as a crutch, but I now see that it can lead to erroneous decisions, and I was just wondering how the rest of the community perceives its veracity.
February 17, 2010 12:42:51 PM

I suspect it's more accurate when comparing ATI with ATI and Nvidia with Nvidia. I've just gone from an x1950pro to an HD3850, and the difference is about what the chart suggests: an extra 10fps in Doom 3 with a few extra quality settings and more AA. It's a four-tier upgrade according to the chart, so I wasn't expecting miracles. What I really wanted was full Windows 7 compatibility, since the x1950 is on legacy support and TV output was disabled. In your case, as was mentioned, I wouldn't try to use an HD4550 for gaming; honestly, you'd be better off with the HD5xxx series.
February 18, 2010 4:28:33 PM

Sorry to double post, but one thing I noticed today: my brother's HD4670 scores 6.2/6.2 in the Windows 7 Experience Index and my HD3850 scores 6.9/6.9, yet they are both in the same tier. Make of this what you will.
February 18, 2010 6:46:17 PM

I wouldn't put much weight on WEI scores, though. My pair of 4870x2s scores a 7.1, for example, only a bit higher than your HD3850 (and I'm pretty sure a pair of 4870x2s is faster than a single 3850 by a pretty sizable margin).
February 21, 2010 3:49:18 AM

chevallier, that seems to be the idea here. The Nvidia and ATI charts don't accurately represent each graphics card's strength once you drop below the ~9600 models from Nvidia and the ~4650/70 from ATI, but from what I can see, they make sense if we compare green apples to green, and red apples to red.

And yeah, relying on Windoze benchmarks probably isn't the best of ideas =P.