Gaming CPU Hierarchy Chart: Time for Another Tier?

clutchc

Titan
Ambassador
I wonder if it isn't time for Mr. Angelini and Mr. Woligroski to add another tier to their chart. I realize the chart groups CPUs for gaming only, and that there comes a point of diminishing returns, but surely the high-end i7-5xxx (Haswell-E) and i7-4xxx (Haswell) CPUs at the top of the tier deserve to sit in a different tier than the older i5-23xx Sandy Bridge CPUs, even for gaming... no?
 
Do you have benchmarks showing them getting significantly better performance? For comparison, the GTX 980 Ti sits a single tier above the GTX 780 Ti and GTX 980, and it provides 20%-30% better performance in most games. Some of the other GPU tiers seem to show about a 10%-15% improvement.

In Dragon Age: Inquisition, one of the newest and most demanding games, the difference between an i5-3470 and an i7-5960X is about 6%. Even that gap is impossible to see in practice, because to expose it at all you have to turn the settings down to the point where the game runs far faster than any monitor can display.
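
To put rough numbers on that (the fps figures below are made up purely for illustration, not taken from any benchmark): if a low-settings, CPU-bound test runs at around 200 fps on the i5 and ~6% faster on the i7, both results are well past a 60 Hz or even 144 Hz refresh rate, so the extra frames never reach the screen. A quick sketch of the arithmetic:

```python
# Hypothetical illustration of why a ~6% CPU-bound fps gap can't be seen on screen.
# The fps numbers are assumptions for the example, not benchmark results.

def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    """A display can only show frames up to its refresh rate."""
    return min(rendered_fps, refresh_hz)

i5_fps = 200.0              # assumed low-settings, CPU-bound result
i7_fps = i5_fps * 1.06      # ~6% faster, per the gap cited above

for refresh in (60.0, 144.0):
    print(f"{refresh:.0f} Hz monitor: i5 shows {visible_fps(i5_fps, refresh):.0f} fps, "
          f"i7 shows {visible_fps(i7_fps, refresh):.0f} fps")
```

On either monitor the two CPUs display exactly the same number of frames, which is the point: the difference only exists in a counter, not in what you see.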

The newer/better CPUs are faster, but they just don't make much difference for gaming except in a few edge cases. I think the current chart encourages people to be sensible about where they spend their money. Jumping a tier or two in GPUs will make a real difference; I think the same standard should apply to CPUs.
 

clutchc

Titan
Ambassador
I guess I'm basing my thoughts on (for example) the fact that the AMD Phenom II X4 980 sits a tier above the Phenom II X4 965. I own or have owned both CPUs. If the tiny difference in gaming performance between them warrants a change of tiers, I would think the low-end Sandy Bridge and high-end Haswell/Haswell-E CPUs should be separated as well.

For example... there seems to be a noticeable improvement in gaming performance between my older i5-2500K and the i5-4690K I have now. Same gfx card.
Smoother gameplay, higher fps.
 
The Phenom II 980 was tiered at a time when it could actually improve performance in some games by 10%. It's true that in some games it made no difference, but where it did, it could be the difference between smooth 60 fps operation and not. Back then many more games were CPU-limited, so a speed bump genuinely mattered; designers were pushing the envelope of what was possible. Tom's doesn't even include games in its CPU benchmark comparisons anymore.

Newer systems do usually perform better, but controlled tests just don't show much difference between generations of Intel Core CPUs, especially once you consider that the older generations often overclocked better than the newer ones.