From the link of the OP:
[...] Nvidia pushes the message that if you have a low-end CPU and a high-end graphics card you will play better than with high-end quad-core CPU and cheap graphics.
That's a perfectly fine statement, even if it's rather general. "Cheap graphics" means "onboard" or "average laptop" graphics to me, possibly with no dedicated video RAM. With such cheap graphics hardware, even the fastest CPU cannot achieve good framerates in Nvidia's target market (> 1280x1024, full screen, high polygon counts, heavy shader usage) if it has to do most of the work (even RAM transfers) itself.
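To put rough numbers on that, here is a back-of-envelope sketch (the 50-cycles-per-pixel shading cost is my own assumption, not a measured figure):

```python
# Rough estimate of the CPU throughput needed just to shade pixels
# in software at Nvidia's stated target settings.
pixels_per_frame = 1280 * 1024   # ~1.3 million pixels
target_fps = 60
cycles_per_pixel = 50            # assumed cost of even a simple shader on the CPU

cycles_needed = pixels_per_frame * target_fps * cycles_per_pixel
print(f"{cycles_needed / 1e9:.1f} GHz just for pixel shading")  # -> 3.9 GHz
# The entire CPU budget is spent before any game logic runs, while a
# GPU spreads the same work across hundreds of parallel shader units.
```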
On the other hand, a slow(ish) CPU can still get quite good frame rates if the polygon count is not too high and most of the actual processing therefore happens on a fast, modern GPU.
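As a minimal sketch of that division of labour (all function and field names below are made up for illustration, not a real engine API): the CPU's per-frame cost scales with the number of entities and draw calls, while the per-vertex and per-pixel work lands on the GPU.

```python
# Hypothetical per-frame split between CPU and GPU work.
def update_game_logic(entities):
    # CPU side: AI, physics, input. Cost scales with entity count,
    # not with resolution or shader complexity.
    for e in entities:
        e["x"] += e["vx"]

def submit_draw_calls(entities):
    # CPU side: merely tells the GPU what to draw (cheap per object).
    return [("draw", e["mesh"]) for e in entities]

entities = [{"x": 0.0, "vx": 0.1, "mesh": "crate"} for _ in range(1000)]
update_game_logic(entities)            # a slow(ish) CPU copes with this fine
commands = submit_draw_calls(entities)
print(len(commands), "draw calls submitted")
# Rasterizing and shading millions of pixels per frame then happens on
# the GPU, which is why the GPU side dominates the frame time.
```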
These days, game companies invest a huge amount of effort in graphics (meaning lots of polygons, shaders, etc.). The other parts of a game where a CPU may be heavily used (AI, etc.) seem disproportionately underdeveloped. So, naturally, hardware support for the fastest-growing part of games matters more than CPU speed.
Plus, some settings (anti-aliasing, interpolation, resolution) barely affect the CPU at all: the CPU hardly cares whether the resolution is 800x600 or 1920x1200.
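A quick ratio makes the point; the numbers are just the two resolutions from the previous paragraph:

```python
# Pixel counts for the two resolutions mentioned above.
low  = 800 * 600      # 480,000 pixels
high = 1920 * 1200    # 2,304,000 pixels
print(f"GPU pixel load grows {high / low:.1f}x")  # -> 4.8x
# The CPU's per-frame work (game logic, draw-call submission) stays
# the same, which is why resolution is almost purely a GPU setting.
```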
There simply are not many options in the average game that you can tune for a slow or fast CPU, but every game is stuffed to the brim with options to adjust for a weak or strong GPU.
Nvidia is publicly claiming that the GPU is better and smarter than the dull CPU. CPUs are boring and [...]
Duh. Target age of this statement? Maybe 13-16 years?
Nvidia claims GPU matters more than a CPU
This title is, of course, grossly misleading; it is far too general to carry any real meaning.