Yes, it depends on the individual game, but it's much like how a modern i5 is "good enough" to hit 60fps in most games while you'll want a modern i7 if you're aiming higher: an older Sandy Bridge i7 is good enough for 60fps in most modern games, with some exceptions, but it's going to fall short if you're running a 120 or 144Hz screen. I'm speaking in generalities here - I'm not saying it's only good for 60fps in every game.
Take Watch Dogs 2 as an example:
A stock 2600K (nearly the same CPU as OP has) showed an average of 57fps and minimums in the low 40s, whereas a stock i7 6700 has minimums almost 50% higher and usually stays above 60fps. If we assume framerates scale linearly with clockspeed (which is rarely strictly true), an i7 7700K overclocked to 5GHz would be getting ~110fps averages, with minimums in the 80s - almost twice as fast as the Sandy Bridge CPU. If OP is running a 120Hz screen, the difference would be huge, but if OP is only running a 60Hz screen, the only difference would be the elimination of those dips into the 40s.
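For anyone curious, here's the back-of-the-envelope math behind that estimate - a minimal sketch assuming perfectly linear clock scaling, with the i7 6700's baseline numbers guessed to line up with the figures above (they're not from an actual benchmark):

```python
def scaled_fps(base_fps: float, base_clock_ghz: float, target_clock_ghz: float) -> float:
    """Naive best case: fps scales 1:1 with CPU clock speed.
    Real games rarely scale this cleanly."""
    return base_fps * (target_clock_ghz / base_clock_ghz)

# Assumed baselines for a stock i7 6700 (~4.0GHz turbo) - these are
# illustrative guesses picked to match the figures above, not measured data.
avg = scaled_fps(88, 4.0, 5.0)  # ~110fps average on a 5GHz 7700K
low = scaled_fps(64, 4.0, 5.0)  # ~80fps minimums

print(f"estimated average: {avg:.0f}fps, estimated minimums: {low:.0f}fps")
```

Treat that as an upper bound: IPC improvements between Sandy Lake and Kaby Lake do a lot of the work here too, so pure clock scaling understates the architectural gap and overstates what an overclock alone buys you.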
In The Witcher 3, OP's i7 should definitely keep things above 60fps, but it falls well short of 120 or 144, which an overclocked Kaby Lake chip could easily achieve: