Will the AMD FX-6120 bottleneck a GTX 770?

Vortechs

Honorable
Dec 24, 2013
10
0
10,510
Hello, I was wondering if an AMD FX-6120 would bottleneck a GTX 770 2gb? If so, how bad would the bottleneck be? My current processor is the 6120. I will only have to use the 770 and 6120 together for a couple months.
 

Deus Gladiorum

Distinguished
It depends on the game, but anything CPU-bound will cause a huge amount of bottlenecking, and overclocking will help very, very little. I have a GTX 770 and an FX-6300, which is considerably better than your FX-6120, and even overclocked to a very stable 4.2 GHz the bottlenecking in CPU-bound games is extreme.

For the most part, in these games I can manage 60 fps, but depending on the level/location I'm in, the bottlenecking can be fierce. Just as an example, I've been playing Far Cry 2 recently. While I maintain 60 fps in a lot of areas (V-Sync is always on), in a lot of others my frame rate drops to 43 - 45 fps. Not terrible, and still very playable, but the OSD I keep on to monitor GPU usage through MSI Afterburner shows a measly 30 - 40% GPU usage during these 43 - 45 fps stretches. Quite clearly, if it weren't for this terrible bottleneck, the game would be running at a constant 60 fps.
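If you'd rather log that than eyeball the OSD, here's a minimal sketch of the same idea (purely illustrative; it assumes Python 3 and the nvidia-smi tool that ships with NVIDIA's drivers, and it's not something Afterburner needs): poll GPU utilization once a second and watch whether it stays low while your frame rate is pinned below target.

    import subprocess
    import time

    # Poll GPU utilization once a second. If it sits around 30-40% while the
    # frame rate is stuck below your target, the GPU is waiting on the CPU,
    # i.e. a CPU bottleneck rather than a graphics-settings problem.
    while True:
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
        print("GPU utilization: " + result.stdout.strip() + "%")
        time.sleep(1)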

Other games where similar things happen are basically anything expansive or open world. That includes Borderlands 2 (min 30, average 60), Skyrim (min 45, average 60), Assassin's Creed IV (min 37, average 50?), Arkham City (min 30, average 50?), and Crysis 1 (min 20, average 36?). As you can see, all of these games suffer from really bad bottlenecking, and it's quite annoying. Crysis 1 is the worst, but the thing they all have in common is that every one of them should be able to hold a constant 60 fps without any drops if we were just looking at the capability of the GTX 770 on its own. It's quite unfortunate, but that's what I get for not doing my research and blindly picking AMD because the sound of "6 'cores' at a whopping 3.5 GHz" was enough to get me to throw down $120 for it. Bleh.

But in GPU-intensive games, or CPU-lenient games, it's always 60 fps or close to it: Battlefield 3 (60), Crysis 2 (50), etc.
 

Deus Gladiorum

Distinguished
I don't trust FX at all, and I don't think AMD does either. AMD's current marketing strategy seems to be to just abandon its dedicated CPU base and instead try to secure the APU market, because Intel doesn't yet have a pseudo-monopoly there (Intel's on-board graphics are awful). We haven't had a new FX architecture in over a year, while Intel releases new architectures annually. That wouldn't be a big deal on its own, but there are also rumors that the next core architecture, Steamroller, due to arrive in Q3 2014, won't even be offered in dedicated CPUs and will instead supposedly go into their APUs exclusively. Again, it's just a rumor, and AMD has said they're not going to discontinue FX, but even so it's just an unreliable series to me.

Because of that, I recommend you save up some cash, about $200 - $240, and when you get the chance buy a reliable Z87 motherboard ($100 - $120) and an i3-4130 ($100 - $120). The i3-4130 is much more powerful than your current CPU, and the advantage of buying it is that in the future you can upgrade to a higher-end Intel CPU such as the i5-4670K or the i5-5xxx, or whatever the next iteration for LGA 1150 turns out to be. Until then, if you desperately need a beautiful new case, go for it and just wait until you've saved up another $200 - $240.

On the other hand, it's not like there won't be a performance difference between your FX-6120 and an FX-8320; it just won't be as good or anywhere near as impressive as waiting for that Intel build.
 

Vortechs

Honorable
Dec 24, 2013
10
0
10,510
Oh, and another thing: I would imagine you were running all of those games at 1080p on ultra, correct? I can honestly deal with less than ultra settings for the time being.
 

Deus Gladiorum

Distinguished


If only it were a matter of ultra. The problem with CPU bottlenecking is that it rarely has anything to do with your graphical settings; unless I'm changing something like the rendering distance, settings have no effect on FPS. I have one specific, awful memory of this: playing Crysis 1, I reached a point where my frame rate stuck at 33 fps in one area at 1080p on Very High settings. To confirm it was a bottleneck, I turned off all AA and dropped the quality from Very High to Medium, and I was still at 33 fps. A similar incident occurred in Borderlands 2. In one area where I was getting a constant 38 fps, I turned off FXAA, turned down other detail settings, and was still at 38 fps. Only when I changed my view distance from "very far" to "near" did my fps shoot back up to 60.
 
Well, I don't know what people are on about. I'm using an FX-8350 at stock clocks and I can play any game without lag, including the so-called "CPU-bound games" like Civilization V and Rome: Total War. I have an FX and I love it. Now, having said that, the Piledriver core is considerably superior to the original Bulldozer core, and that will make a difference. As Deus pointed out, a drop from 60 fps to 43 fps is nothing you'd really notice. It doesn't impede gameplay, and for $110 you really can't go wrong. The number of CPU-bound games out there I can count on one hand; usually they're just badly coded with sloppy programming. The number of games that are not CPU-bound is in the hundreds, so the odds of you playing a CPU-bound game are relatively small, and even then the difference is not something that would ruin the experience.

The best route to take is to try it out. If it's not something you find bothersome, then great. If it does bother you, look at upgrade options then. It's no use spending money because it MIGHT be a problem, is it? LOL
 
Solution

Deus Gladiorum

Distinguished
Please don't put words in my mouth. I never said that a drop from 60 to 43 is "nothing you'd really notice". I said it's "not terrible" and that it's "still playable", but it's far from "nothing you'd really notice". It certainly does impede gameplay and break immersion, especially when you play a fast-paced FPS with lots of camera movement, and that holds doubly true for games which lack motion blur and make 45 fps extremely choppy, such as Borderlands 2 and Skyrim. Also, believe it or not, Civ V and Rome: Total War are far from CPU-bound. In general, RTS's are so well optimized for the CPU that it's rare not to get over 60 fps in them. Civ V is a terrible example considering it's a turn-based game where individual units move one at a time rather than many units moving simultaneously, and the Total War series is known for having a fantastic engine which uses a lot of tricks and programming optimization to do its job. Even if you do experience a slowdown in those games, there isn't much sensitive camera movement and rotation, unlike a fast-paced FPS where changes in frame rate are very, very visible. Also, CPU-bound games are very plentiful. They include, but aren't limited to:

Metro: Last Light, Far Cry 2, Far Cry 3, Borderlands 2, Batman: Arkham City, Batman: Arkham Origins, The Elder Scrolls V: Skyrim, Crysis, Crysis: Warhead, Metro 2033, ARMA II, ARMA III, DayZ, Assassin's Creed IV: Black Flag, and Saints Row 2.

That's 15 CPU-bound games off the top of my head. But anyway, your last point is right. Like I said previously, you should get the case first and then save up to replace your CPU and motherboard. The FX-8350 would make a difference, just not a massive one compared to Intel. And if you decide you can live with the lower performance, or if the poor performance isn't even perceivable, then there's little point in replacing it and you can save your money for something else.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
I played all games maxed out on an old Core 2 Quad Q6600 @ 3 GHz. Now that I have a 4770K, I can only notice a speed improvement in StarCraft II, and only because it's a poorly written game that uses just one CPU core. In other games I get more than 60 fps on average, but that matters very little for actual gameplay. The only nice thing is the higher minimum fps; it's much smoother now. With that said, it was very far from unplayable on the C2Q. I don't think it's worth upgrading just yet for gaming alone.
 

Deus Gladiorum

Distinguished


I'm curious, what games were you playing and what minimum frame rates were you getting with that Q6600? Because as I pointed out in my earlier posts, it's not like my averages are bad at all (except in Crysis 1 and Warhead), but my minimums kill games for me, and averages are only half the story.
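And if you do log frame times, the minimums and percentile lows are easy to pull out. Here's a rough sketch (Python; the frametimes.csv name and the one-millisecond-value-per-line format are just assumptions, so adapt it to whatever your logging tool actually writes) of why averages hide the dips:

    # Read per-frame render times in milliseconds, one value per line.
    with open("frametimes.csv") as f:
        times_ms = [float(line) for line in f if line.strip()]

    fps_sorted = sorted(1000.0 / t for t in times_ms)       # slowest frames first
    average_fps = len(times_ms) / (sum(times_ms) / 1000.0)  # total frames / total seconds

    print("average fps: %.1f" % average_fps)
    print("minimum fps: %.1f" % fps_sorted[0])
    print("1%% low fps: %.1f" % fps_sorted[len(fps_sorted) // 100])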
 

eodeo

Distinguished
May 29, 2007
717
0
19,010


First, I had a GTX 460 on Win 7 64-bit. I played Crysis 1/Warhead/2/3, Far Cry 3, Assassin's Creed 1 and 2, StarCraft II, Oblivion, Skyrim, Deus Ex: HR, LoL, DotA 1/2, and I'm sure I'm missing many more.

The only game I could not play maxed out @ 1080p was Crysis 3. StarCraft II was only laggy in huge (and rare) 3v3+ battles; when I played those I only had to lower physics from ultra to medium and the problem was gone (everything else was left @ ultra).

I did not spend much time benchmarking games, as I was busy playing :) They all set themselves to max settings by default, and I didn't need to tweak anything; only Crysis 3 made me set it to high (it defaulted to medium). Also, Crysis 2 with the DX11 patch made me lower shaders from ultra to very high, or it would often dip below 20 fps.

When I did use Fraps, it would show ~30 fps in the top games (the Crysis titles) and over 60 in the simpler ones (Skyrim, Assassin's Creed...). I did not note minimum fps, but I could tell that when a game was loading, or in some extreme instances in the more complex games, fps would drop below 20. Those dips were barely noticeable and lasted no longer than a second.

For the vast majority of games and the vast majority of the time, I did not feel like I "needed to upgrade". I did so now because I felt it was time, and because of school work and the need for faster (3D) rendering.
 

VenBaja

Distinguished
Nov 8, 2008
343
0
18,810


That's relying a lot on your "feel" and personal preference, though. You mentioned ~30 fps in several games, which I would consider unplayable. I can tell you that on some BF3 multiplayer maps I would dip into the 40s with my FX-6300, and since I upgraded to the 4770K I never dip below 70 with the same settings. That's an enormous difference. And I think a Q6600 would explode if it tried to run BF4's multiplayer. It all depends on what you play and what you think is an acceptable fps, but there will definitely be a sizable difference.

 

Deus Gladiorum

Distinguished


That's really strange, because with an FX-6300 I've sunk at least 17 hours into BF3 and I can literally recall only one time I went under 60 fps o_O

And yes, I played constantly on 64 player servers.
 

VenBaja

Distinguished
Nov 8, 2008
343
0
18,810


Here's me playing a 64-player TDM match with the FX-6300 and a 7870 LE:
https://www.youtube.com/watch?v=sXtmeIdcPDM

As you can see, it's very often in the 40s.

And then in BF4, which is even more CPU intensive:
https://www.youtube.com/watch?v=WaBwgTTKKWM

The lowest dip I had was to 61, and that was brief and only in one part of the map.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010


When you turn on your TV you get 30 fps in the US and 25 in the EU, and when you go to the cinema you get 24 fps. Only The Hobbit (2012) had an option for 48 fps, and most people thought it looked "bad".

You want 60 fps as an extreme gamer, and only because it can vertically sync with your monitor's refresh rate so it won't tear when you're over 60 fps. If you don't use V-Sync, more fps is useless to most people.

With that said, you definitely don't want to go below 20 fps; that will look choppy. Smart people figured out that about 25 frames a second is all the human eye needs to see fluid animation. Below that, it can tell it's a slideshow; above that, you get diminishing returns very fast.

60 frames is an arbitrary number in use since the 1940s, when the US had to pick a number for TV and wanted it to be better than existing cinema (24). Since you couldn't carry a full image over long distances back then, every image was split in two, sending first every odd line and then every even line. This method is called interlacing, and it effectively gives you 60 half-frames a second from 30 images a second.
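If the odd/even line business sounds abstract, here's a toy sketch (Python with numpy, purely illustrative, nothing to do with real TV hardware) of splitting one frame into its two interlaced fields and weaving them back together:

    import numpy as np

    frame = np.arange(12).reshape(6, 2)   # a tiny fake image with 6 lines
    odd_field = frame[0::2]               # 1st, 3rd, 5th lines, sent first
    even_field = frame[1::2]              # 2nd, 4th, 6th lines, sent second

    # The receiver weaves the two half-frames back into one full image,
    # so 30 full images a second arrive as 60 half-frames a second.
    rebuilt = np.empty_like(frame)
    rebuilt[0::2] = odd_field
    rebuilt[1::2] = even_field
    assert (rebuilt == frame).all()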

Even though technology has evolved a lot since then, we still use 60 Hz as the default refresh rate for all monitors, now with a true 60 images a second, called progressive. This is denoted by the little "p" next to the number you see in YouTube videos, for example: 1080p means 1920x1080 pixels in progressive mode, as opposed to interlaced, "i". We still use the vertical pixel count to declare an image resolution just because when TVs started, scan lines were the only thing counted.

A brief history of frames. Merry Christmas :)
 

VenBaja

Distinguished
Nov 8, 2008
343
0
18,810


I never use V-Sync in any game and I definitely notice a difference. 40 fps looks choppy as all get out, and over 60 or 70 looks pretty smooth. The big thing about the 4770K is that it now lets me never drop below 60. So my average fps is much higher, but more importantly, I don't get those drops into the 40s or 50s and miss the snap shot while defending a flag.

It's all about what you plan to do and what you're comfortable with, though. I like to be competitive in the newest online FPSes, so having the highest possible frame rate is ideal. If I were just playing Skyrim, I'd be happy with medium settings at 35 fps on my Core 2 Duo laptop.
 

Hand754

Honorable
Jan 25, 2014
3
0
10,510