I'm not giving any personal remarks.
I'm stating facts.
PC and console do not work the same way; they are vastly different. 4 TFLOPS on PC by no means equals 4 TFLOPS on console, because consoles run on cut-down mobile-class chips, not full desktop graphics cards. Here is an example, and I'm afraid that if you can't accept this, then I don't know what to do.
Let's use the previous generation of consoles as an example.
Take the PS3 at 1.6 TFLOPS, as you say.
A GTX 750 Ti has about 1.3 TFLOPS, so the PS3 should easily beat it in terms of performance, correct?
Incorrect. Even though the PS3's graphics chip is more heavily optimized for, it simply doesn't deliver the performance of a full-fat desktop GPU, so you can't possibly expect the new consoles to have the same power as a GTX 970.
Here is a comparison against the desktop card the PS3's chip is BASED ON, meaning the card listed is what the full version of the PS3's graphics chip would have been had it not been cut down. That makes this a very generous comparison, and the 750 Ti actually has the lower TFLOPS count at 1.3 TFLOPS. See the performance difference.
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-750-Ti-vs-Nvidia-GeForce-7800-GTX/2187vsm12348
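For context, the TFLOPS figures being thrown around are theoretical peaks, computed as shader cores × 2 ops per cycle (one fused multiply-add) × clock. A minimal sketch using the 750 Ti's published specs (640 CUDA cores, ~1.02 GHz base clock); the specific numbers are the card's spec-sheet values, not measured performance:

```python
# Peak single-precision throughput: cores * 2 ops/cycle (FMA) * clock.
# These are spec-sheet numbers; real-game performance depends on
# architecture, drivers and memory bandwidth, not on this figure alone.

def peak_tflops(shader_cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS for a modern unified-shader GPU."""
    return shader_cores * 2 * clock_ghz / 1000.0

gtx_750_ti = peak_tflops(640, 1.02)
print(f"GTX 750 Ti peak: {gtx_750_ti:.2f} TFLOPS")  # ~1.31 TFLOPS
```

That's exactly why the comparison above is so lopsided: peak TFLOPS is a back-of-the-envelope ceiling, not a measure of what a chip actually delivers.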
Also, here is the CPU comparison between the Intel i5 4690 and AMD's FX 9590. The 4690 has 4 cores to the FX 9590's 8, and is clocked significantly lower. As you can see, the Intel beats it by a mile: a 55% higher score in Cinebench R10, and a 75% higher Geekbench score with a 50% larger L2 cache. Clock speeds and cores are not everything. Brush up on your facts; I can help you if you want, just ask.
http://cpuboss.com/cpus/Intel-Core-i5-4690-vs-AMD-FX-9590
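To make "clock speeds and cores are not everything" concrete, you can normalize a benchmark score by cores × clock. A rough sketch, assuming the chips' published base clocks (3.5 GHz for the 4690, 4.7 GHz for the FX 9590) and the 55% Cinebench gap quoted above; the FX's score of 100 is an arbitrary baseline:

```python
# Per-(core * GHz) throughput: base clocks are published specs,
# the 1.55 score ratio is the quoted Cinebench R10 gap, and the
# FX baseline score of 100 is arbitrary (only the ratio matters).

def per_core_ghz(score: float, cores: int, clock_ghz: float) -> float:
    """Benchmark score normalized by core count and clock speed."""
    return score / (cores * clock_ghz)

fx_eff = per_core_ghz(100.0, 8, 4.7)   # FX 9590: 8 cores @ 4.7 GHz
i5_eff = per_core_ghz(155.0, 4, 3.5)   # i5 4690: 4 cores @ 3.5 GHz
print(f"Intel does {i5_eff / fx_eff:.1f}x the work per core-GHz")
```

In other words, under these assumptions the Intel is doing roughly four times the work per core per GHz, which is the whole point: raw core counts and clocks tell you very little on their own.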