AMD FX-6300 six-core and GeForce GTX 770 bottlenecking issues?

Whovian98

Honorable
May 6, 2013

shotgunz

Distinguished
Dec 17, 2011
This was done with a GTX 680 OC, so it's roughly the same as the 770. Yes, you will get bottlenecks in some games. http://www.tomshardware.com/reviews/gaming-processor-frame-rate-performance,3427-9.html

[Chart: average gaming frame rates across CPUs, from the linked review]


If you played Metro 2033 or Far Cry 3, there wouldn't be too much of a difference:
[Chart: Metro 2033 frame rates across CPUs]


However, if you played Starcraft 2 or Skyrim...
[Chart: StarCraft 2 frame rates across CPUs]


We should be getting games with better core usage later on, since the Xbox One and PS4 are basically x86 machines with AMD CPUs and GPUs.
 

Deus Gladiorum

Distinguished

I am probably WAYYYYY too late for this, but assuming you haven't done so yet: DO NOT buy a GTX 770 for an FX-6300. I have that very build, and the bottlenecking is killing me. By the way, overclocking does NOT help; I OC'ed my CPU to 4.2 GHz and gained maybe 2 frames at best in the bottlenecked games. Such games include Borderlands 2, Crysis 1, Metro: Last Light, and many others you'd assume aren't too CPU intensive. Honestly, I should have saved $150 - $200 and gone with the Radeon HD 7870; it's a pretty fair card for avoiding a bottleneck. In most of these games I lose around 10-20 frames off my minimum fps compared to rigs pairing a 770 with a superior CPU, usually an i5-3570(K) or an i5-2500(K).
 

Deus Gladiorum

Distinguished


The frame rates are usually pretty good, actually. But most of these games have sections with a ton of draw distance, which is apparently CPU-calculated and causes some bad slowdowns even at 4.2 GHz. Keep in mind that the only game that is limited by my GPU is Metro: Last Light. While the other games only changed by 2-5 fps when I lowered my settings and resolution, Metro: Last Light is the only one of the three games I'm about to list that saw MAJOR frame rate changes when I moved from my 720p television to my 1080p monitor. That implies that in the other games I'm CPU limited, while in Metro: Last Light I'm GPU limited.

In Borderlands 2 with max settings at 1080p, including Ultra High draw distance, I'm typically well over 60. However, I do get the occasional slowdown to around 46-50 in some sections, and in sections that render a large distance I get between 30 and 40. Those frame rates occur when a lot of particle effects have been rendered through Medium PhysX. This can be alleviated a bit by changing the game's draw distance and then setting it back to Ultra, so the game re-renders the area without the accumulated particle effects, but even then the new minimums land somewhere between 38-50. I also had some bad slowdowns in firefights, though that seems to have been alleviated by OC'ing my CPU. By comparison, I've seen videos of the game played with a GTX 770 and an i5-2500K or i5-3570K, and the minimum frame rates never dropped below 59 fps.

In Crysis 1 on ultra settings at 1080p, from the opening section where the protagonist first lands in the water at night up until the first large section at daybreak, I had between 55-60 fps. However, even then and throughout the game, I'd get pretty major stuttering because the game has to recalculate its draw distance every time you look down your sights or enter a new area with a lot of objects. Obviously this hit hardest in larger open areas where there was a lot to render. The stuttering only lasted maybe 50-200 milliseconds each time, but it pushed my frame latency to around 30-100 ms before returning to normal values, and considering this is a shooter where you aim down sights a lot, it gets annoying. At the first major open area at daybreak, I saw a pretty consistent 30-35 fps when I looked out into the vast expanse; lowering the resolution and other settings only gave me an extra 2 fps at most. As I moved onto the beaches and through other, more enclosed sections of the game, I was back to 48-57 fps, not counting the stuttering when looking down sights or entering new chunks of the game where the engine rendered new objects. At another point with a large rendering distance I was at around 25-30 fps. During firefights... I don't even know. It was so random: sometimes there'd be a mountain 100 feet away from me, so aiming down sights didn't require the game to recalculate much and my fps didn't drop badly; other times that wasn't the case and I'd get major stuttering. Overall, the fights themselves didn't do much to the frame rate, I suppose. By comparison, I've seen videos of the game played with a GTX 770 and an i7-3770K, and the frame rates in the exact same sections I just described never went under 55 fps.
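As a side note, the reason those latency spikes feel so bad even when the average fps looks fine is simple arithmetic. Here's a quick back-of-the-envelope sketch (just my own illustration in Python, not output from any benchmark tool):

# Rough illustration: converting fps to per-frame time and back.
# A steady 60 fps means each frame takes about 16.7 ms, so a single
# 100 ms stutter frame is the same as momentarily dropping to 10 fps.

def frame_time_ms(fps):
    """Milliseconds spent on one frame at a given frame rate."""
    return 1000.0 / fps

def effective_fps(frame_ms):
    """Frame rate implied by a single frame's render time."""
    return 1000.0 / frame_ms

print(frame_time_ms(60))   # ~16.7 ms per frame at a steady 60 fps
print(effective_fps(100))  # a 100 ms stutter frame = 10 fps for that instant
print(effective_fps(30))   # a 30 ms hitch = ~33 fps for that instant

That's why a counter that reads "55 fps average" can still feel awful when individual frames spike toward 100 ms.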

Lastly, Metro: Last Light. It isn't very CPU intensive and doesn't have large rendering distances, but it'll put that 770 to work. Because of that, my frame rates stayed mostly between 50-60 with everything on ultra except SSAA and Tessellation, which I turn off, though quite a few sections dropped me to around 35-ish. Lowering the resolution from 1920x1080 to 1280x720 ensured I never went below 49 fps in those sections.

So now that you've read my book report on this, I suggest you get either a GTX 660 or a Radeon HD 7870 for an FX-6300 at stock clocks. If you're willing to overclock, I'd say you could even push it to the respective Ti or GHz editions, but I wouldn't recommend going higher than that. Also, a good way to keep your hardware balanced is to use online build guides; I really like http://www.logicalincrements.com/, which is a good guide for budgeting and balancing parts. As for me, I'm just hoping AMD's new Steamroller cores come out soon and have noticeable efficiency improvements over Piledriver.
 

Osoclocker69

Honorable
Jan 12, 2014




Some of the games you play don't use all 6 cores of your CPU. With that said, I like to call it a 'game bottleneck' rather than a 'CPU bottleneck': the game isn't using enough of the CPU, so the game bottlenecks the CPU and forces the CPU to bottleneck the GPU. That's my theory, anyway. But just upgrade to an FX-8350 or something. If you don't have the money for it, then return/sell your GTX 770 and get a lower-end gaming graphics card like a GTX 660 or AMD R9 270, or maybe even the new GTX 750 (Ti), which is pretty low-end but might interest you.
 


Deus Gladiorum

Distinguished


What you just described as a "game bottleneck" is exactly what a CPU bottleneck is for most AMD users: the game isn't designed to use all 6 cores. While yes, I suppose you can separate that term from a "CPU bottleneck" -- where no matter how well optimized a game is for your CPU, it just can't keep pace -- for most purposes there's no distinction, since the end result is the same. Regardless, even a "game bottleneck" on an FX-6300 still mostly stems from the fact that the FX-6300 doesn't have good single-core performance compared to, say, an i5-4670K.

In addition, you should reconsider your suggestion to "just [upgrade] to an FX-8350". The problem stems exactly from the fact that the game isn't designed to utilize enough cores, so switching to a CPU with the exact same architecture and a higher core count won't fix anything.
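To put rough numbers behind that: if a game only spawns, say, four heavy threads, cores five through eight have nothing to do, so frame time depends on how fast each core is rather than how many you have. Here's a toy model in Python (my own simplification with made-up numbers, not a real engine profile):

# Toy model of a frame where the work lives on a fixed number of game
# threads. Cores beyond the game's thread count sit idle, so only
# per-core speed (clock x IPC) moves the frame time.

def frame_time_ms(work_ms, game_threads, cores, per_core_speed=1.0):
    """Very rough frame time: work split across however many of the
    game's threads can actually run at once."""
    usable = min(game_threads, cores)
    return work_ms / (usable * per_core_speed)

# A hypothetical 40 ms-per-frame workload split over 4 game threads:
print(frame_time_ms(40, game_threads=4, cores=6))                      # 10.0 ms on a 6-core
print(frame_time_ms(40, game_threads=4, cores=8))                      # 10.0 ms on an 8-core: no change
print(frame_time_ms(40, game_threads=4, cores=6, per_core_speed=1.3))  # ~7.7 ms with faster cores

The only point of the toy model is that the core-count term stops mattering once it exceeds the game's thread count; after that, only per-core speed helps.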

UPDATE: Also, why would I sell my GTX 770 to get a lower-end graphics card? That would be absolutely pointless. Performance can only get worse by downgrading the graphics card; merely downgrading in the hope of 'equalizing' the GPU-to-CPU balance doesn't gain you anything. Such a thing would be completely counter-intuitive. Rather than having a frame rate that shifts between 40 - 60 fps in one game, I'd now have a game limited to only 40 fps, and upgrading my CPU in the future would be pointless because my GPU would then be bottlenecking my CPU. I'm not sure why you'd recommend such a thing, but it makes absolutely no sense.

Lastly, this is a seven-month-old thread. I've noticed so many others doing this with older threads I've posted on, so may I ask what the point of gravedigging is?
 

Osoclocker69

Honorable
Jan 12, 2014


Well, the only reason I suggested upgrading to an FX-8350 is that it's the only option that doesn't require buying a new motherboard and cooler and everything else just to switch to Intel. And I'm pretty sure the FX-8350 wouldn't bottleneck the GTX 770. Even in games that don't utilize all 8 cores it will still perform well, especially if it's overclocked.
 

Deus Gladiorum

Distinguished


That would not be the case at all, so let me clear up the confusion. Core count only matters to the extent that the program is designed to use it, like you were getting at. The FX-8350 and the FX-6300 are essentially the same CPU; the only differences are that the FX-6300 is clocked 500 MHz lower and has two cores disabled. Other than that, their architecture is identical. Now, if you were to overclock an FX-6300 to 4.0 GHz, then in a game that can only utilize six or fewer cores an FX-8350 and an FX-6300 would perform exactly the same, because the only real advantages the FX-8350 has over the FX-6300 are the extra cores (and the higher clock, which you can set yourself). And those two extra cores don't mean anything if the program doesn't use them.

So, like you said, a "game bottleneck" occurs because the game isn't using all your cores. The bottlenecking you get with an FX-6300 occurs almost exclusively in games that aren't using all your cores, so getting an FX-8350 would be a waste of money: I might go from 75 fps to 90 fps in something like Battlefield 3, but it would leave my 40 fps in something like the Morrowind Overhaul at 40 fps. There's very little point in buying an FX-8350 for gaming when you already have an FX-6300.
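If it helps, you can think of single-core performance as roughly clock speed times IPC. A quick sketch of that relationship (the IPC figures below are made up purely to show the shape of the argument, not real measurements):

def single_core_score(clock_ghz, relative_ipc):
    # Per-core throughput is roughly clock speed times instructions-per-clock.
    return clock_ghz * relative_ipc

piledriver_ipc = 1.0  # assumed baseline; both FX chips use the same Piledriver core
print(single_core_score(4.0, piledriver_ipc))  # FX-6300 overclocked to 4.0 GHz
print(single_core_score(4.0, piledriver_ipc))  # FX-8350 at stock 4.0 GHz: identical per core
print(single_core_score(3.4, 1.4))             # a higher-IPC chip pulls ahead even at a lower clock

Same core design plus same clock equals same single-core performance; that's the whole argument.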
 

Osoclocker69

Honorable
Jan 12, 2014


I don't know what you're getting at, because what I meant to say is that the 8350 will not bottleneck the GTX 770 in games like BF4 that utilize all 8 cores. Oh, and by the way, the FX-8350 is not exactly the same as the FX-6300: apart from the 2 extra cores it also has better single-core performance than the FX-6300, making the 8350 better than the 6300 in both single-core and multi-core tasks.
 

Deus Gladiorum

Distinguished


You're completely ignoring everything I wrote. The issue is clearly not with programs/games that utilize six or more cores; we've established that. The issue is with games that utilize fewer than six cores. Games that use six or more cores aren't even a problem, because the majority of them won't bottleneck a GTX 770. Battlefield 3 and 4 aren't even relevant here because they don't bottleneck with an FX-6300: I've never once fallen below 60 fps in BF3 with my GTX 770 and FX-6300 on a sixty-four-player map, and according to benchmarks the average and minimum for BF4 are well above 60 fps. But again, that's not the issue. Please actually read what I'm writing. The issue is with games that use fewer than six cores, i.e. games like Borderlands 2, Crysis, Skyrim, and so many others. Those games only use somewhere between two and four cores, hence why upgrading to an FX-8350 would be pointless.

Also, again, please actually read what I've written, because it's obvious from your response that you've done nothing more than glance over it. Why do you think the FX-8350 sports slightly higher single-core performance than the FX-6300? Reread this:

The FX-8350 and the FX-6300 are essentially the same CPU; the only differences are that the FX-6300 is clocked 500 MHz lower and has two cores disabled.
[...]
Now, if you were to overclock an FX-6300 to 4.0 GHz, then in a game that can only utilize six or fewer cores an FX-8350 and an FX-6300 would perform exactly the same, because the only real advantages the FX-8350 has over the FX-6300 are the extra cores (and the higher clock, which you can set yourself).

The FX-8350 is the same chip as the FX-6300, just with two more active cores and a 500 MHz higher clock. That slight clock bump is the only reason the FX-8350 has better single-core performance than the FX-6300, and as I said in my examples, if you overclock the FX-6300 to match, there's essentially zero difference in performance in programs/games that use fewer than six cores.
 

Osoclocker69

Honorable
Jan 12, 2014
Well, it seems you know more about CPUs than I do.

I apologize for my poor knowledge. But hey, now I know more about CPUs and I won't make the same mistake again. So basically it comes down to this: 'upgrade to Intel', because that seems to be the only way to fix your problem unless you get something like an FX-9590.

I'm just too young to know everything ;)
 

Whovian98

Honorable
May 6, 2013
I thought the 9590 was the same as the 6300 and the 8350, just with a crazy high power rating? Anyway, this thread is unbelievably old. I ended up going with 2 x 7870s, and they fit just right with my 6300 overclocked. But thanks for the help anyway.
 

Osoclocker69

Honorable
Jan 12, 2014


The 9590 is the same as the 8350, just clocked way higher and with a higher power requirement.
 
