I don't think people really hate the i5 6400. At least I hope people don't hate it. It's just in a bad place in the general lineup.
Also, if a program or game hasn't been recompiled or otherwise optimized to take advantage of the new instruction sets that allow for the greater IPC, IPC means f#ck all and clock speed reigns supreme. The first example off the top of my head is Planetside 2 running on the 64-bit client, because it can bottleneck every CPU in existence, from a lowly Celeron to an overclocked 5960X. For a game that is F2P and needs to run on a lot of variations of older hardware, not much of the underlying code has been optimized for the newer stuff, likely for fear of breaking compatibility with legacy hardware. Thus, every extra MHz counts directly towards FPS gains. For a proper example with numbers: a G3258, 4690K, 4790K, i7 920, i7 870 (Xeon W3540), Phenom 965BE, 955BE, 1066T, and 5960X all running at 4 GHz yield the same FPS, within margin of error, with the same stock 980 Ti and 16 GB of RAM. FX CPUs are garbage tier for this game and engine: an FX-8320 at 5 GHz+ is still less than half as good as my i5 4460, and a third as good as a 4790K. Overclock any K-SKU Intel CPU and any FX gets instantly sunk. I'm not saying every game makes FX CPUs this miserable, just this one. And H1Z1, Arma 2, Arma 3, and every Asian action fantasy MMO ever (like TERA and BNS). Older, less optimized stuff.
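If you want the argument above as back-of-the-napkin math, here's a toy sketch. All numbers are made up for illustration, not benchmarks; the point is just that when a legacy binary can't use newer instruction sets, effective IPC ends up roughly equal across CPUs, so clock speed alone decides the ranking:

```python
def relative_perf(ipc, clock_ghz):
    """Crude single-thread score: work per second ~ IPC * clock."""
    return ipc * clock_ghz

# Hypothetical legacy code path: every CPU runs the same old
# instruction mix, so effective IPC is about equal everywhere.
legacy_ipc = 1.0
cpus = {
    "old quad @ 4.0 GHz": 4.0,
    "new quad @ 4.0 GHz": 4.0,
    "new quad @ 4.5 GHz": 4.5,  # overclocked
}
scores = {name: relative_perf(legacy_ipc, clk) for name, clk in cpus.items()}

# Equal clocks tie; the overclocked chip wins purely on MHz.
assert scores["old quad @ 4.0 GHz"] == scores["new quad @ 4.0 GHz"]
assert scores["new quad @ 4.5 GHz"] > scores["new quad @ 4.0 GHz"]
```

That matches the 4 GHz-everything benchmark story: if the game never touches the new instructions, architectural IPC gains never show up, and FPS tracks clock.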
/rant
TL;DR there is an upper limit to "IPC gains". With the way things are going, non-turbo clock speed still matters quite a bit for single-core performance, which is still important to many, many people. Like me.