To be specific, I have a Micro Center ad in front of me displaying an EVGA GeForce GTX 650 Superclocked 2GB GPU with a 1202 MHz clock speed. It's going for $159.
They're also advertising an EVGA GeForce GTX 680 2GB GPU (not even Superclocked, FTW, or anything special) with a 1006 MHz clock speed for $469! So for $310 more you drop ~200 MHz of clock speed? Seems like a bad deal...
So this brings me to the assumption that GPU clock speed doesn't matter.
If it doesn't, then what makes the 680 so much more expensive? And what dictates GPU performance?
Video cards, just like ANY other component, are extremely complex.
You can't look at a Phenom II running at 4.0 GHz and say it's faster than an i5 at 3.8 GHz - even though they have the same number of cores, the architectural difference between the two chips means the i5 is roughly twice as fast.
There IS no single number you can point at on a card and say "that one's better." You have to compare real-world benchmarks. That said, the 680 is way faster than the 650 because they use entirely different chips. (And the 680 itself is poor value compared to the 670 - the 680 is about 5% faster for 25-30% more money.)
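As a back-of-the-envelope illustration (not a real benchmark), peak single-precision throughput scales roughly with shader count × clock × 2 FLOPs per shader per clock (one fused multiply-add). Using the published CUDA core counts (384 on the GTX 650, 1536 on the GTX 680), the lower-clocked 680 still comes out far ahead:

```python
def peak_gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Rough peak FP32 throughput: cores * clock * 2 FLOPs (FMA) per clock."""
    return cuda_cores * clock_mhz * 2 / 1000.0  # MHz -> GFLOPS

gtx_650_sc = peak_gflops(384, 1202)   # GTX 650 Superclocked
gtx_680    = peak_gflops(1536, 1006)  # stock GTX 680

print(f"GTX 650 SC: {gtx_650_sc:.0f} GFLOPS")
print(f"GTX 680:    {gtx_680:.0f} GFLOPS")
print(f"680 has ~{gtx_680 / gtx_650_sc:.1f}x the raw throughput")
```

This is only a theoretical ceiling; real performance also depends on memory bandwidth, ROPs, drivers, and the particular game, which is exactly why benchmarks beat spec sheets.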
EVERYTHING matters. There is no single "what matters" like you're looking for.