I realize that, but you're missing the point.
Am I?
The point, I would think, is that you would not recommend that a person without a high-end dual-core CPU get an 8800 GTX, because the CPU would bottleneck it.
My point is that if they can afford the best card available, their CPU won't bottleneck it to any significant degree.
Let's consider the results from the article at Tom's Hardware:
http://www.tomshardware.com/2006/11/29/geforce_8800_needs_the_fastest_cpu/
Concentrating on 1600x1200 - the minimum resolution you'd want to play at if you went out and purchased an expensive GTX, right? If you only want to play at 1280x1024, you probably don't need an 8800 in the first place...
Doom 3 (4xAA/8xAF, 1600x1200):
  Core 2 Extreme X6800 + 8800 GTX: 123 fps
  Core 2 Extreme X6800 + X1950 XT: 80 fps
  Athlon 64 FX-60 + 8800 GTX: 108 fps
FEAR (4xAA/8xAF, 1600x1200):
  Core 2 Extreme X6800 + 8800 GTX: 83 fps
  Core 2 Extreme X6800 + X1950 XT: 57 fps
  Athlon 64 FX-60 + 8800 GTX: 79 fps
Oblivion (outdoors, 1600x1200):
  Core 2 Extreme X6800 + 8800 GTX: 39 fps
  Core 2 Extreme X6800 + X1950 XT: 22 fps
  Athlon 64 FX-60 + 8800 GTX: 35 fps
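To put numbers on the comparison, here's a quick sketch that computes the percentage performance hit from downgrading the GPU versus downgrading the CPU, using the frame rates quoted above (the dictionary layout and names are just my own framing of those figures):

```python
# FPS figures from the Tom's Hardware results quoted above:
# baseline  = top CPU + 8800 GTX
# slower_gpu = top CPU + X1950 XT
# slower_cpu = FX-60  + 8800 GTX
results = {
    "Doom 3":   {"baseline": 123, "slower_gpu": 80, "slower_cpu": 108},
    "FEAR":     {"baseline": 83,  "slower_gpu": 57, "slower_cpu": 79},
    "Oblivion": {"baseline": 39,  "slower_gpu": 22, "slower_cpu": 35},
}

def pct_drop(baseline, value):
    """Percentage of performance lost relative to the baseline setup."""
    return 100.0 * (baseline - value) / baseline

for game, fps in results.items():
    gpu_hit = pct_drop(fps["baseline"], fps["slower_gpu"])
    cpu_hit = pct_drop(fps["baseline"], fps["slower_cpu"])
    print(f"{game}: GPU downgrade costs {gpu_hit:.0f}%, "
          f"CPU downgrade costs {cpu_hit:.0f}%")
```

Running this shows the GPU downgrade costing roughly three times as much performance as the CPU downgrade in every game, which is the whole argument in one loop.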
Scenario by scenario, what's the bottleneck - the video card or the processor? It seems pretty obvious that the card is a much bigger bottleneck than the processor, and at the higher resolutions in that article, the CPU matters even less.
Seems like if you're going to be playing at high resolutions with eye candy, the processor isn't that much of a bottleneck at all...
...And like I said, if you're playing at 1280x1024 with no AA, why are you paying hundreds of dollars extra for the 8800 GTX in the first place? At decent resolutions, the processor bottleneck is all but removed.