The question is fundamentally flawed because it assumes a relationship between the CPU and GPU that does not exist.
The FX-6300 can bottleneck ANY GPU in a compute-intensive game under some conditions. So can an i7-4790K...
Pick a CPU that can play the games you want to play, at the FPS you want to play them at.
There is nothing you can do by adjusting the "size" of the GPU to "fix" a CPU performance problem, unless the change in GPU significantly alters the compute overhead per frame drawn to screen (which happens when switching between architectures and APIs, but for now it is best to assume that your compute workload for a given FPS is the same no matter which GPU you use).
The FX-6300 will have 20-40 FPS minimums in popular compute-intensive multiplayer games no matter what GPU you pair it with. An i5-4590 will maintain 30-60 FPS minimums in those same compute-bound conditions. The GPU selection will only affect these minimums insofar as it affects the compute overhead of the API (AMD DX11 vs. NV DX11 vs. Mantle, for example).
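The reasoning above boils down to a simple frame-time model: every frame needs both CPU work and GPU work, and the slower of the two caps the frame rate. Here is a toy sketch of that idea; all the millisecond figures are made-up illustrative numbers, not measurements of any real CPU or GPU.

```python
# Toy frame-time model: each frame requires CPU work and GPU work,
# and the slower (larger) per-frame time sets the frame rate.
# All numbers are illustrative, not measured from real hardware.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frame rate is limited by whichever stage takes longer."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 25.0  # hypothetical CPU-bound frame time (a ~40 FPS ceiling)

# Swapping in progressively faster GPUs shrinks the GPU time,
# but leaves the CPU time untouched:
for gpu_ms in (33.0, 16.0, 8.0):
    print(f"gpu={gpu_ms}ms -> {fps(cpu_ms, gpu_ms):.1f} FPS")

# Once gpu_ms drops below cpu_ms, FPS stops improving: the CPU
# sets the hard ceiling no matter how fast the GPU gets.
```

This is why the answer keeps saying the GPU can't "fix" a CPU problem: the faster GPU only moves the second argument of that `max()`, and the minimums stay pinned to the CPU term.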
-------------
Pick a GPU that can play the games you want to play, at the VISUAL QUALITY settings and FPS you want to play at.
Any modern GPU can play any modern game at any FPS you want (within reason). Differences in GPU performance will always manifest as differences in visual quality once you adjust settings to achieve your FPS goals. Example: the GTX650Ti and GTX680 can both produce similar FPS in a given game and conditions, as long as the GTX650Ti is running at 1080P resolution while the GTX680 is running at 1440/1600P resolution (or 720P vs. 1080P, respectively).
Here's a good example of this:
http://www.bit-tech.net/hardware/graphics/2012/10/09/nv...
Note the GTX650Ti @ 1080P produces roughly the same FPS as the GTX680 @ 1600P. In both cases, the settings have been adjusted to plant the bottleneck firmly on the GPU at less-than-ideal FPS. This is typical of GPU benchmarks: the visual quality settings are cranked very high to ensure a GPU-bound result across the board, so nearly every GPU in the lineup delivers a less-than-ideal result. In the real world, these benchmarks create more confusion about where performance originates than anything else, because people who read the charts come to associate FPS with the GPU, even though FPS can be traded against visual quality on any GPU. The real hard limits on FPS are always dictated by the CPU.
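The "faster GPU at higher resolution ≈ slower GPU at lower resolution" observation can be sketched with back-of-the-envelope math: GPU frame time scales roughly with pixel count divided by GPU throughput. The 2x throughput ratio and the scale constant below are assumptions picked for illustration, not measured specs of the GTX650Ti or GTX680.

```python
# Rough sketch: GPU frame time ~ pixels rendered / GPU throughput.
# The throughput ratio and scale constant are made-up assumptions,
# not measured specs of any real card.

PIXELS = {"1080p": 1920 * 1080, "1600p": 2560 * 1600}

def gpu_ms(resolution, relative_throughput):
    # Arbitrary scale factor chosen so 1080p at throughput 1.0
    # lands around a ~33 ms (~30 FPS) frame time.
    return PIXELS[resolution] / (relative_throughput * 62_200)

slow_card = gpu_ms("1080p", 1.0)  # slower GPU, lower resolution
fast_card = gpu_ms("1600p", 2.0)  # hypothetical ~2x-faster GPU, higher resolution

# Both land at a similar frame time: 1600p has ~2x the pixels of
# 1080p, so the extra GPU power was spent on pixels (visual
# quality), not on FPS.
```

Under these assumptions, the two cards produce nearly identical frame times, which is exactly the pattern the linked benchmark shows.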
------------
Knowing that an FX-6300 is going to suffer from lower FPS minimums in compute intensive games than say, an i5 would, shouldn't necessarily be a deterrent in and of itself to buying a high end GPU. You just have to be prepared for a reality that you can't expect to play compute intensive games at a perfect 60FPS+ all the time on the FX-6300. Once you accept this reality, you should have no problem coming up with some creative ways to leverage the GTX970 for something other than higher FPS (choose higher visual quality settings instead).
These amateurs who start balking about system balance without even mentioning the monitor resolution, the game titles, or the conditions in those games have no business being on the answer committee here. Ignore any and all advice from these pseudo-hardware enthusiasts.
The FX-6300+GTX970 is a fantastic combination if it meets the FPS and visual quality goals without a lot of unused overhead on the more expensive GPU. Such a combination would be an ideal match to gaming on a 1440P monitor with high visual quality and medium (~45) FPS goals in many games. If the goal is to play competitive first person shooters at 120FPS on a 1080P monitor, then the FX-6300 is a poor CPU choice regardless of which GPU is used. Always match the CPU to the compute workload created by the goal FPS in the games you play, not to the GPU.