Something I've always had a hard time understanding is how to read GPU specs. For CPUs, it's relatively simple: look at the clock speed, the level 3 cache, and the socket type (e.g., LGA 1155). I know what a good clock speed is, and I know that 15MB of level 3 cache is really good. However, I don't know how to read a GPU's specs the same way. Can someone help me out here?
An explanation of what the different stats mean and how they factor into the overall performance of the card would be ideal.
Thanks for making an awesome community where I can freely ask questions such as this!
Even in the case of CPUs, you still have to look at benchmarks to get a clear picture. Never forget the lesson of NetBurst (the Pentium 4 architecture): clock speed is the worst single factor to use when comparing processors. Pipeline depth and scheduling matter far more than raw clock speed. Intel seems to have learned that lesson well; efficiency has improved impressively with each generation of the Core series.