Tom's Hardware Graphics Charts: Performance In 2014
Tags: Graphics Cards, AMD, Nvidia
FormatC
May 11, 2014 11:00:03 PM
Two years and two graphics card generations have passed since the last major update to our famous graphics card performance charts. It's time to get them back up to speed. We introduce modern benchmarks, new measurement equipment, and fresh methodology.
Tom's Hardware Graphics Charts: Performance In 2014 : Read more
blackmagnum
May 11, 2014 11:29:53 PM
tomfreak
May 12, 2014 1:11:48 AM
cypeq
May 12, 2014 1:42:57 AM
First it's great to see new charts.
I was never a fan of this style of benchmarking, though it does give a clean graph of GPU capabilities, which we always needed. I would love to see a new bottleneck analysis, or at least a parallel test done on a midrange PC.
Everyone should keep in mind that these charts represent the performance of <1% of the PC builds out there.
If I recall correctly, we are at this moment at the edge of PCIe 2.0 x8, which equals PCIe 1.0 x16. The next generation or the one after will finally outdate PCIe 1.0 in single and PCIe 2.0 in dual-GPU configs, as there will finally be noticeable bottlenecks.
tomfreak said:
First thing Tom's needs to do is benchmark how PCIe 2.0 x8 vs. x16 performs on a modern top-end GPU. Since the 290X passes CrossFire traffic over PCIe instead of a bridge, maybe it's time to check this again? As I recall, AMD does not recommend running 290X XDMA CrossFire on PCIe 2.0 x8. Please check this out.
mitcoes16
May 12, 2014 3:32:54 AM
Any SteamOS or GNU/Linux benchmarks?
It would be nice to add an OpenGL cross-platform game, such as one based on ioquake or something more modern, and test it under MS Windows and under GNU/Linux.
Even better if it runs on the future SteamOS, to let us see the performance of the same game under MS Windows and under GNU/Linux.
It would also be nice to test under MS Windows with and without an antivirus, perhaps Avast, which is free, or any other of your preference.
Last but not least, OpenGL and DirectX go through version changes, so being able to split card generations by OpenGL/DirectX version support would help, as would a current price/performance index based on your sponsored-link prices.
mitcoes16
May 12, 2014 3:50:01 AM
No 720p tests?
720p (1280x720 = 921,600 pixels) is roughly half of 1080p (1920x1080 = 2,073,600 pixels).
When a game is very demanding, or you prefer better graphics settings, playing at 720p is a great option.
Of course the latest top GPUs can play at 4K with full graphics, but when we read the benchmarks we also want to know whether our current card CAN play at 720p (1K), or what the best cards can do at 1K, to be able to compare.
Also, even if it is not standard or strictly accurate, for benchmarking purposes calling them 720p (1K), 1080p (2K) and 2160p (4K) would be easier to understand at a glance than UHD, FHD and HD Ready (which could also be labeled UHD = 4K, FHD = 2K, HD Ready = 1K).
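For what it's worth, the pixel arithmetic in the post above checks out; here's a quick sketch (the 1K/2K/4K shorthand is just the post's own labeling, not a standard):

```python
# Pixel-count arithmetic from the post above: 720p has a bit less than
# half the pixels of 1080p (0.444x), and 2160p has exactly 4x 1080p.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "2160p": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["720p"], pixels["1080p"], pixels["2160p"])
# 921600 2073600 8294400
print(round(pixels["720p"] / pixels["1080p"], 3))   # 0.444
print(pixels["2160p"] // pixels["1080p"])           # 4
```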
mitcoes16 said:
No 720p tests?

720p does not stress most reasonably decent GPUs much, and how many people would drop resolution to 720p these days, with all the re-scaling artifacts that might add? In most cases it would make more sense to stick with native resolution and tweak some of the more GPU/memory-intensive settings down a notch or two - at least I know I greatly prefer cleaner images over "details" that get blurred by the lower resolution and further distorted by re-scaling.
Considering how you can get 1080p displays for $100, I would call standardizing the GPU chart on 1080p fair enough: the people who can only afford a $100 display won't care much about enabling every bell and whistle and the people who want to max everything out likely won't be playing on $100 displays and $100 GPUs either.
Score
0
2Be_or_Not2Be
May 12, 2014 7:06:22 AM
I'd really like to see charts on how much noise a video card's cooling fans make. That matters more to me: limiting something distracting that I hear every time I game, versus getting a louder card with 10 fps more.
I also like seeing how current cards stack up performance-wise to previous generations. That really helps when you're deciding whether to upgrade or not.
Zeh
May 12, 2014 7:46:22 AM
WHAT?
So you're not (directly) controlling the relative humidity of the air you're testing the GPUs in? You do know that it affects air's thermal capacity, huh?
(I'm just joking; I'm glad you normalize temperature. Besides, by using an AC unit you're already putting a ceiling on RH%, thus controlling it indirectly.)
FormatC
May 12, 2014 8:41:31 AM
@Zeh:
The air conditioner is only a last resort. I live in Central Europe on the first floor of a very old, historic building with very thick walls (up to 1 meter!). Even in the hottest summer it is impossible to reach more than 25 or 26°C inside (with closed doors and windows), and that can be cooled down quickly and easily. Mostly I have to heat my room up instead.
For the 720p lovers:
After summer I'll start the entry-level charts with smaller cards and the same benchmarks, but at lower resolution and settings for a better comparison. The performance difference across all cards is too large to put into one database; that would be bound to fail.
voltagetoe
May 12, 2014 11:38:40 AM
Tomtompiper
May 12, 2014 12:37:41 PM
I know that at this moment we only make up a few percent of your target audience, but it is an increasing number, and a sea change is on the way. Could you please add just a few Linux benchmarks to give us nerds an idea of the potential for gaming on Linux? I was lucky to be given an R9 290 (Gigabyte OC) for my latest build and I am more than pleased with the performance; however, if I had to invest in a card with my own hard-earned cash, then a little information would be appreciated.
However I appreciate the effort that has been put into trying to give some sort of comprehensive chart which can be of some use.
FormatC
May 12, 2014 2:01:50 PM
Linux is a big problem because it depends heavily on the drivers. It is difficult to stay up to date with all these cards, and I'm not able to re-bench everything again and again due to driver changes over the year.
I understand your interest, but in the end this is a big time problem. It would be a good idea for a separate review with the most common cards, though.
1. Anything to address the new wrinkle of cards coming with two settings? I see the 290X, for example, listed with Quiet Mode and Uber Mode, but both are running at the same 1000 MHz? If so, what's the point of having the option? It would be nice to see just what the speed improvement is.
AMD Radeon R9 290X Reference
4GB Uber Mode / R9 290, 4GB GDDR5, 1000 MHz
4GB Quiet Mode / R9 290, 4GB GDDR5, 1000 MHz
2. Anything to address driver date? We all know that both teams make driver improvements, but if a card is tested with version X.01 in May and other cards are added in September, how do we compare the current performance of the May-tested card with the current driver ZZ.01 and the September card with the current driver? Will the tests be updated with driver revisions for apples-to-apples (current vs. current, or release date vs. release date) comparisons? Of course this is asking a lot, but it would make the data more relevant.
3. Any chance of getting a bar extension on those charts so that, for example, we can see just what a non-reference card adds to the equation, either outta the box or when OC'd "Bawlz to the Wall"?
4. Any chance of getting a specs chart for the "variations" showing stock clocks (base and boost), PCB, VRM phases, warranty, and dimensions, like the one here:
http://www.tomshardware.com/reviews/geforce-gtx-560-ti-...
Yes, again asking a lot, but it would make everything more relevant... I haven't installed a reference card in as far back as I can remember.
So out of 8 gaming benchmarks, you're going to go ahead and have 7 of them come straight out of the AMD Gaming Evolved program? Several of these games are well-known to favor AMD cards disproportionately and feature AMD exclusive technology. At least one, Dirt 3, is played by no one, yet features a type of lighting that relies heavily on OpenCL to render, a well-known advantage for AMD cards. I'm not sure that's exactly defined as covering all the bases.
I have always wondered why reviews don't list the ASIC quality score of the GPUs they are testing. It's so easy to get using GPU-Z, and it would help to create a nice database of ASIC scores that would allow us to draw conclusions about its significance.
Do reviewers get cherry-picked golden sample GPUs for testing?
Does company X bin their superclock/OC model chips higher?
Does ASIC quality consistently mean better overclocking potential?
Does ASIC quality have any significance at all in real world gaming?
Etc.
FormatC
May 12, 2014 10:31:40 PM
@17 seconds:
- 90% of all cards are pure retail cards, no golden samples. All of this was verified and proven.
- The ASIC quality is more or less voodoo. GPU-Z makes a lot of errors, and it's not clear which GPU-Z version gives you which result.
- I've tested a handful of 290X cards, for example, and the benchmark results were mostly similar, but the power consumption was not (up to 5% difference).
As we wrote in the article, the selection of benchmarks is the result of a long selection process, and if you take a look at the normalized results (index), you can see that these results are very close to the averages of other sites. For all of these benchmarks the driver war is more or less over, so we get stable results over a longer time. No exclusive features were used, such as TressFX or PhysX, or certain anti-aliasing options or lights/shadows.
If I see significant driver improvements from company A or N, I'm now able to partially re-bench everything. This was done once with the latest wonder driver from Nvidia a few weeks ago. And Dirt 3? OpenCL is public, not AMD-exclusive. It is Nvidia's job to finally improve its OpenCL performance, because it is 100% a driver issue.
The difference between Quiet and Uber Mode, with fully heated cards (and normalized over all benchmarks), is below 2%. In theory you can hold the clock rates a little bit longer, but after heating up and reaching the target temperature above 90°C, all of these reference cards reduce the clock speed to hold it. This "Uber Mode" is only used to disguise the weakness of this really horrible cooling solution for a few minutes longer.
And finally:
I will bench all reference cards first to build an overview, but I'll also add the results of custom cards later, periodically, each month. I'm not able to write reviews and benchmark more than 20 cards per month at the same time. The current charts content was produced within 2 months, and I'm sure this is a good base to extend step by step.
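For what it's worth, a normalized index like the one FormatC describes is usually built by normalizing each card's per-game result to a baseline card and averaging with a geometric mean. A minimal sketch with invented cards and fps numbers (nothing here comes from the actual charts):

```python
from math import prod

# Hypothetical per-benchmark average fps; card and game names are made up.
fps = {
    "Card A": {"Game 1": 60.0, "Game 2": 45.0, "Game 3": 90.0},
    "Card B": {"Game 1": 48.0, "Game 2": 30.0, "Game 3": 72.0},
}
BASELINE = "Card B"

def perf_index(card):
    """Geometric mean of per-game ratios vs. the baseline, scaled to 100."""
    ratios = [fps[card][g] / fps[BASELINE][g] for g in fps[BASELINE]]
    return 100.0 * prod(ratios) ** (1.0 / len(ratios))

for card in fps:
    print(f"{card}: {perf_index(card):.1f}")   # Card A: 132.8, Card B: 100.0
```

The geometric mean keeps one outlier title from dominating the index the way an arithmetic mean of ratios would.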
walshlg
May 13, 2014 6:46:51 AM
How about a better test of graphics card performance: 3D. Sure, there's not much 3D these days due to outrageous monitor pricing, but next year, with the Oculus and Sony 3D head-mounted displays, we will need something to compare cards with.
In addition, most games don't really stress a good card. Try 3D on a 4K monitor and then we can really talk about a stress test and performance gains that make a difference in gameplay.
T-Bag
May 13, 2014 7:56:35 PM
tourist
May 14, 2014 10:26:32 AM
kudorgyozo
May 14, 2014 10:28:21 PM
mordorrson
May 18, 2014 6:28:10 PM
tomfreak said:
First thing Tom's needs to do is benchmark how PCIe 2.0 x8 vs. x16 performs on a modern top-end GPU. Since the 290X passes CrossFire traffic over PCIe instead of a bridge, maybe it's time to check this again? As I recall, AMD does not recommend running 290X XDMA CrossFire on PCIe 2.0 x8. Please check this out. If I recall correctly, we are at this moment at the edge of PCIe 2.0 x8, which equals PCIe 1.0 x16. The next generation or the one after will finally outdate PCIe 1.0 in single and PCIe 2.0 in dual-GPU configs, as there will finally be noticeable bottlenecks.
Unless my eyes are deceiving me, or I've been reading information very incorrectly for the last few months, we're currently at PCIe x16 3.0
FormatC
May 19, 2014 6:50:23 AM
kudorgyozo said:
Where are the charts?

Take a look at the top and you will see a menu labeled "Charts". Please move your mouse over this text and left-click on the link. A window will appear! Read the content of the new page carefully! You'll see a link to the current VGA charts...
Or take the cookie:
http://www.tomshardware.co.uk/charts/2014-vga-charts/be...
mordorrson said:
Unless my eyes are deceiving me, or I've been reading information very incorrectly for the last few months, we're currently at PCIe x16 3.0.

He was writing about the point where bandwidth starts causing significant performance degradation. Right now most GPUs are still mostly fine with 1.0 x16 / 2.0 x8 / 3.0 x4, but at the higher end of the spectrum 2.0 x16 / 3.0 x8 are starting to become necessary - the penalties are becoming steeper than the 3-5% they used to be.
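The equivalences InvalidError cites fall out of the link math: per-lane throughput roughly doubles each generation (PCIe 1.0/2.0 use 8b/10b encoding; 3.0 moved to 8 GT/s with 128b/130b). A rough back-of-the-envelope sketch, ignoring protocol overhead beyond line encoding:

```python
# Approximate usable one-direction bandwidth of a PCIe link, in GB/s.
GT_PER_S = {"1.0": 2.5, "2.0": 5.0, "3.0": 8.0}      # line rate per lane
ENCODING = {"1.0": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

def link_gbps(gen, lanes):
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8  # bits -> bytes

# 1.0 x16, 2.0 x8 and 3.0 x4 all land around the same ~4 GB/s.
for gen, lanes in [("1.0", 16), ("2.0", 8), ("3.0", 4), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_gbps(gen, lanes):.1f} GB/s")
```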
bin1127
May 19, 2014 3:24:34 PM
Someone Somewhere said:
QHD = 1440p...

When NTT originally coined and demonstrated QHD many years ago, it meant 7680x4320.
I just hate it when acronyms get redefined a bunch of times between when they are first used and when people settle on one specific meaning. HD was much the same, with many outlets slapping the HD label on anything capable of more than 480i, and some trying to be more transparent by labeling those ED or whatever else, so we ended up with the FHD label for displays genuinely capable of 1080 lines or better... just like QHD got bastardized from 1280x720 "HD", so now we also have QFHD for the real thing and a shift to 2k/4k/8k naming to disambiguate the bunch-of-letters-nobody-agrees-on-a-standard-definition-of mess, which is still plagued by tons of variants of its own.
As far as this sort of article goes, people who complain about the lack of different resolution testing need to keep a couple of things in mind:
1- this is only a general performance ranking thing
2- testing at extra resolution takes tons of extra time - particularly when re-testing all the cards due to OS, software and driver changes to update rankings
3- testing at lower resolutions adds little value since the GPU will be limited by fill rate so lower-resolution results can be extrapolated relatively easily
4- testing at uber resolutions also adds little value, since different GPUs' performance varies drastically between titles, multi-GPU setups, settings and other variables; people interested in beyond-HD resolutions and uber details would never be happy with the results from a ranking chart, no matter how thorough the underlying testing, since that weighted-average chart result will include tons of results that are of no importance to them
More testing may produce a more "accurate" overall value but a more accurate performance index does not tell you anything particularly useful: the moment it includes results from games, resolution and quality settings you do not care about, the ranking becomes worthless beyond giving you an at-a-glance relative performance comparison and you have to hunt down individual benchmark results to get any more details than that.
So, for the purposes of generating an overall performance chart, only doing 1080p sounds good enough to me.
Liam Campbell
May 29, 2014 7:21:46 PM
Anonymous
May 30, 2014 7:40:45 PM
Wouldn't GRID 2 be a better choice than Dirt 3? It is from the same company, but is much newer, is better coded/optimized, and comes with a built-in benchmark (as does Dirt 3, as far as I remember). I mean, Dirt 3 is old news, and even when it was current, being a rally game, it didn't have as much mass appeal as a regular GT racing game.
Leah-Alaine
July 11, 2014 1:14:40 PM
I'm looking to upgrade my computer. I am not great with computers, but I'm willing to pay a fair amount to get it working for games like BF3 and MW3.
My specs (eMachines):
Windows 7
- AMD Athlon II X2 235e dual-core processor
- NVIDIA GeForce 6150SE integrated graphics
- DVD super multi drive
- 750 GB hard drive
- 6 GB DDR2 memory
... Thanks
Leah-Alaine said:
I'm looking to upgrade my computer. I am not great with computers, but I'm willing to pay a fair amount to get it working for games like BF3 and MW3.
My specs (eMachines):
Windows 7
- AMD Athlon II X2 235e dual-core processor
- NVIDIA GeForce 6150SE integrated graphics
- DVD super multi drive
- 750 GB hard drive
- 6 GB DDR2 memory
... Thanks
You're probably better off posting a thread.