The vast majority of games only use 2 cores. Maybe the games you are interested in playing can make use of more than 2 cores, but games capable of using more than 2 cores still represent a small fraction of all games released every year.
Generally speaking, Intel CPUs perform better in games than AMD CPUs. There are professionally written reviews / benchmarks all over the web that prove it. AMD may perform better in a couple of games, but the number of games in which Intel CPUs perform better is vastly larger.
Benchmarks aside, will you notice a difference in games? That depends on the game, the graphics card and the type of monitor. If you have a powerful graphics card that gets 80+ FPS with an FX-8350 versus 90+ FPS with a Core i5-3570K, will you notice? Maybe.
Getting 90+ FPS could mean the action looks smoother, but you will need a 120Hz monitor to actually see more than 60 FPS on the screen. If you are only using a 60Hz monitor, then it does not matter whether you are getting 80+ FPS, 90+ FPS or even 120+ FPS; the monitor will only display 60 FPS.
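The refresh-rate cap above boils down to a simple minimum. Here is a tiny sketch (the helper name is made up for illustration):

```python
def displayed_fps(render_fps, refresh_hz):
    """A monitor refreshing at refresh_hz can show at most refresh_hz
    distinct frames per second, no matter how fast the GPU renders."""
    return min(render_fps, refresh_hz)

# On a 60Hz monitor, 80, 90, and 120 rendered FPS all display the same:
print(displayed_fps(80, 60))   # 60
print(displayed_fps(120, 60))  # 60
# A 120Hz monitor lets the faster CPU/GPU combo actually show up:
print(displayed_fps(90, 120))  # 90
```

So the FX-8350 vs i5-3570K gap only becomes visible once the monitor stops being the bottleneck.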
If you are going to be video recording game play, then the FX-8350 should be the better option over a Core i5-3570K. When you start recording, the video capture / compression program starts to steal processing power from the game, which decreases game performance, especially if the game can effectively use 4 cores. The FX-8350 is not a "true 8 core CPU" in the sense that each pair of cores shares a single FPU (Floating Point Unit), so there are times when both cores in a pair fight over the same resource. Nevertheless, there are 8 integer cores, and the 4 cores not used by the game can be used to encode the video.