Personally, I consider cards like the GeForce 6600GT and 7600GT to be anomalies here, probably more a testament to what nVidia did wrong with the previous generation's high-end cards than anything else, ESPECIALLY in the case of the 6600GT. The FX 5600 series was utterly laughable compared to the GeForce 4 Ti, and the 8600 series, as noted, was easily beaten by the 7900 series. The same thing happened on the ATi side: the X600XT was, for all intents and purposes, just a 9600XT made for PCI Express, yet vendors sometimes had the gall to call it a mid-range card. The X700pro might've been better, but it still only performed like a 9800SE, or a 9800 at best.
And does anyone remember the X1600 series? It would've qualified as solidly mid-range even within the X series (and still didn't show any marked improvement over, say, the 9800XT). And again, once we headed into DirectX 10, the HD 2600 series doesn't fare well even against the best DirectX 9.0c cards, like the Radeon X1950pro.
"after 70m your brain doesn't process information that fast and your eyes can't see that many FPS"
That's actually incorrect; the human eye can probably perceive somewhere around 300-500fps, not the figure of around 60 that most people claim. Television signals were only set at 60Hz because, back in the 1950's, that was about the best trade-off engineers could find between resolution and refresh rate, and about all that would fit through a coaxial composite cable.