Just a few thoughts. If your principal use is gaming, I doubt you'll see much benefit from a Core i7 vs. a Core i5. I have a "lowly" Core i5 2300, and in real-life, real-world (synthetics be damned) performance it is 87-95% of the CPU the Core i5 2500 is, even with a significant MHz penalty at stock speeds. In gaming, that number is 95%+ (because most games aren't CPU bound!). Of course, overclockers see some benefits I can't realize, but that's a separate issue.
Even with this lowly CPU, if I set my process priorities and processor affinities right, I can video encode, play DiRT 3, run Folding@home, run AutoDock, and occasionally break for an IRC/IM session if needed, all while driving 3 displays on a simple Core i5 2300 + a 6870. My graphics card gets a workout, and yes, those apps run a bit slower doing all of this at once -- but it does it without affecting game performance.
The days of being CPU bound are pretty much over. Intel is giving the mainstream server/workstation-level CPUs for dirt cheap (considering how well they handle heavy multitasking).
If it's just gaming, I don't see much need beyond a Core i5 2500K. It's got some overclocking headroom if you need/want it, and it's quite a capable chip. If you do feel the need, the 2600K is all of that and a bag of chips -- overkill for your application. For a heavy multitasker like myself it might be worth it, but for the typical user, no.
So get the 2600K/2500K, throw the major $$$ at your high-end GPU (the 7970 looks nice, but early adopters pay a heavy penalty, both in cost and in missing the better-performing revision that comes out down the road), and make sure you get a large SSD for the speed there. The one thing I really love about the 7970 is that it promises to actually make Eyefinity work. I run it on the Sapphire Flex 6870 1GB, and that card just doesn't have enough memory or horsepower to make it worthwhile.
AMD can compete in graphics cards, but as far as CPUs are concerned -- that battle is over.
---
All of that being said, there are some cool consoles coming out -- PC games are getting fewer and fewer. My decision to game on my PC only became viable because my graphics card could also handle some of the scientific computing I do, and more scientific apps are starting to utilize the graphics card (but change is slow -- I did need it for OpenGL performance, period, but I could have gotten that from a much lower-end card). For the kind of money you're throwing around on this, you could build a nice home-use desktop and buy several consoles, with a longer lifespan between mega-$$ upgrades.
OTOH, good emulators come out all the time. Wii emulators are pretty good on the PC, and you get PC games too. Maybe it pays off, maybe it doesn't.