I've just built a new rig using an i5-3570K and a GTX 680 with a reflashed BIOS, so effectively a GTX 770. Playing Crysis 3 at 1080p, max settings, SMAA 2x, I get 55-60fps. The CPU (clocked at 3.6GHz) sits at only 50-60% across all 4 cores during gameplay, whilst the GPU is at 100% most of the time.
I don't think even a faster GPU, e.g. a GTX 780, would load the CPU much more, as it's the GPU that limits the frame rate here.
Therefore, I don't think you need anything more than an i5 @ 3.6GHz (Sandy/Ivy/Haswell) or the AMD equivalent for this game.
Looking at the alpha benchmarks of BF4, even with a Titan it doesn't fully utilise all 4 cores of an FX-4300 or the Hyper-Threading of an i7-2600K. The older 4-core i5-2500K @ 3.3GHz gets a marginally better framerate than the 8-core FX-8350 @ 4GHz.
Although Ivy doesn't o/c as well as Sandy, and Haswell not as well as Ivy, each generation has more IPC than its predecessor, so you can o/c 100-200MHz less and still get the same performance.
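To put rough numbers on that trade-off, here's a back-of-the-envelope sketch treating performance as IPC x clock. The ~5% per-generation IPC gain is my own illustrative assumption, not a measured figure:

```python
# Rough model: performance ~ IPC * clock (ignores memory, cache, turbo, etc.)
sandy_ipc = 1.00               # baseline
ivy_ipc = sandy_ipc * 1.05     # ASSUMED ~5% IPC gain per generation
sandy_clock = 4.8              # GHz, a typical high Sandy Bridge o/c

# Clock an Ivy chip needs to match the overclocked Sandy:
ivy_clock = sandy_ipc * sandy_clock / ivy_ipc
print(round((sandy_clock - ivy_clock) * 1000))  # ≈ 229 MHz lower
```

With a 5% IPC uplift the newer chip can run a couple of hundred MHz slower for the same throughput, which is roughly the 100-200MHz headroom difference people see between these generations.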
My 3570K will o/c to 4.2GHz without increasing the voltage. It's stable at 4.8GHz with 1.3v, which gives a ~33% increase in clock speed, but there's no point as no game needs that yet. I reckon games over the next 12-24 months will require a better GPU, or maybe 2x 680s in SLI, so I'll buy another next year when they're cheaper second hand. If the CPU needs more than a 30% performance increase in that time and games start taking advantage of Hyper-Threading, then I'll get an i7 and o/c it.
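The overclock gain quoted above is just a clock ratio; a quick sanity check (this treats performance as scaling linearly with clock, which real games won't quite do):

```python
# Percentage clock gain from a 3.6GHz stock clock to a 4.8GHz overclock.
stock = 3.6  # GHz
oc = 4.8     # GHz
gain = (oc - stock) / stock * 100
print(f"{gain:.0f}% higher clock")  # 33% higher clock
```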