I feel like there is a lot of misinformation regarding console GPUs vs. PC GPUs, and I figured this would be a good place to discuss it in more detail. Take this article as an example:
http://www.gamechup.com/nvidia-ps4s-gpu-is-more-powerful-than-xbox-720s-gpu-but-3x-slower-than-titan/
Why are GPU FLOPS often the only specification given for console GPUs? This seems silly, considering that when I compare PC GPUs, FLOPS are rarely what the comparisons focus on. Is it naive to think the PS4's GPU is only 3x slower than a Titan? Console games typically run at 720p with no antialiasing, dumbed-down graphical settings, and no DX11 features. Can the two really be compared, when rendering at those settings wouldn't even be much of a challenge for the lowest-tier PC GPUs?
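For what it's worth, the headline FLOPS number in these articles is just a theoretical peak, and you can sanity-check it yourself. Here's a rough sketch (assuming the commonly reported spec-sheet numbers: 1152 shaders at 800 MHz for the PS4's GPU, 2688 CUDA cores at an 837 MHz base clock for the Titan):

```python
# Back-of-the-envelope check of the "3x slower" claim using peak
# single-precision FLOPS, the figure these articles quote. Theoretical
# peak = shader cores x clock x 2 (one fused multiply-add per cycle).
# The specs below are the widely reported ones; treat them as assumptions.

def peak_tflops(cores, clock_ghz):
    """Theoretical single-precision TFLOPS: cores * clock * 2 ops/cycle."""
    return cores * clock_ghz * 2 / 1000.0

ps4 = peak_tflops(1152, 0.8)      # 18 CUs x 64 shaders @ 800 MHz
titan = peak_tflops(2688, 0.837)  # GTX Titan @ 837 MHz base clock

print(f"PS4:   {ps4:.2f} TFLOPS")    # ~1.84
print(f"Titan: {titan:.2f} TFLOPS")  # ~4.50
print(f"Ratio: {titan / ps4:.1f}x")  # ~2.4x on paper, not 3x
```

Even on this purely theoretical measure the gap comes out closer to 2.4x than 3x, and it says nothing about memory bandwidth, driver overhead, or the settings games actually run at, which is exactly why FLOPS alone make for a poor comparison.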