
FX-6300: 4.8 GHz or 5.0 GHz OC?

August 27, 2013 2:31:07 PM

I was wondering if I would see any performance increase in gaming from overclocking from 4.8 GHz to 5.0 GHz?

August 27, 2013 2:42:26 PM

Warjb34 said:
I was wondering if I would see any performance increase in OC from 4.8ghz to 5.0ghz for gaming?


Depends on a lot of things, namely the game and whether you're CPU-bottlenecked in said game. I'll tell you now from personal experience overclocking my FX-6300 to 4.6 GHz: it does very little in terms of a gaming performance increase for any of my CPU-bottlenecked games. I've only seen a 6-8 fps increase in minimum framerates for my most bottlenecked games, so I'm willing to bet a 200 MHz OC won't do much for you either. You could always try anyway though, just for the fun of it, because regardless of effectiveness it does generate a pretty prideful feeling to be strutting a 5.0 GHz CPU.

Best solution

August 27, 2013 2:51:16 PM

Going from 4.8GHz to 5.0GHz represents a 4.17% increase in clock speed. Programs and games do not scale 100% with an overclock because of potential bottlenecks inside the CPU, such as delays while waiting for results from other calculations; there are many others that are rather technical and lengthy to explain. In a very CPU-bound game you might see maybe half that increase in FPS, or about 2%. So if you get 50 FPS when playing Skyrim at 4.8GHz, you may get 51 FPS when clocked at 5.0GHz.
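As a rough sketch of that arithmetic (the one-half scaling factor is just the assumption used above, and the FPS numbers are illustrative, not measured):

    # Back-of-the-envelope estimate; the 0.5 scaling factor is the assumption
    # above (CPU-bound games realize roughly half of a clock-speed gain).
    old_clock, new_clock = 4.8, 5.0                      # GHz
    old_fps = 50                                         # measured at 4.8 GHz
    clock_gain = (new_clock - old_clock) / old_clock     # ~0.0417 -> 4.17%
    est_fps = old_fps * (1 + 0.5 * clock_gain)           # ~51 FPS
    print(f"clock gain {clock_gain:.2%}, estimated FPS {est_fps:.1f}")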

BioShock Infinite does not really gain any performance since it does not care too much about the CPU, as long as the CPU does not bottleneck the graphics card.

If you run some benchmarks you'll see a small difference in performance. Otherwise in "real life" the performance increase is marginal at best.


*** Edit ***

Fixed typo. 5.0GHz, not 5.9GHz as originally stated.

August 27, 2013 3:34:44 PM

Bragging rights? No really, 5 GHz is pretty awesome. What vcore do you need to get that high?
August 27, 2013 3:44:56 PM

cmi86 said:
Bragging rights ? No really 5 ghz is pretty awesome. what vcore do you need to get that high ?


I believe for my application with the FX-6300 I need at least a 1.48 vcore minimum. But like what was said previously, I'm not going to push it that hard for so little gain in gaming performance.
August 27, 2013 3:56:08 PM

jaguarskx said:
Going from 4.8GHz to 5.0GHz represent a 4.17% increase in clockspeed. Programs / games are not 100% efficient when overclocking due to potential bottlenecks in a CPU like delays when waiting for results from other calculations, there are many other that are rather technical and lengthy to explain. In a very CPU bound game you might see maybe half that increase in FPS or 2%. So if you get 50 FPS when playing Skyrim at 4.8GHz, you may get 51 FPS when clocked at 5.9GHz.

BioShock Infinite does not really gain any performance since it does care too much about the CPU as long as it does not bottleneck the graphics card.

If you run some benchmarks you'll see a small difference in performance. Otherwise in "real life" the performance increase is marginal at best.


This needs to be stickied or something. I've always wondered why my 1.1 GHz OC didn't do jack for my CPU-bound, CPU-bottlenecked games like Skyrim, Crysis 1, or Borderlands 2. After all, to an overclocking beginner it seems logical that increasing a CPU's frequency would make the most difference in games where the CPU is bottlenecking the GPU. Thanks for clarifying.
August 27, 2013 4:34:31 PM

Warjb34 said:

I believe for my application with the FX-6300 I need at least 1.48vcore minimum.. But its like what was said previous im not going to risk it that much for not so much gain in gaming performance


Right on, that makes good sense. I was just curious about the voltage needed in those ranges, as I was fortunate enough to hit 4.5 GHz on stock voltage (1.36 V) and never really tried for more. Nice job getting up there though!
August 27, 2013 4:43:01 PM

jaguarskx said:
Going from 4.8GHz to 5.0GHz represent a 4.17% increase in clockspeed. Programs / games are not 100% efficient when overclocking due to potential bottlenecks in a CPU like delays when waiting for results from other calculations, there are many other that are rather technical and lengthy to explain. In a very CPU bound game you might see maybe half that increase in FPS or 2%. So if you get 50 FPS when playing Skyrim at 4.8GHz, you may get 51 FPS when clocked at 5.9GHz.

BioShock Infinite does not really gain any performance since it does care too much about the CPU as long as it does not bottleneck the graphics card.

If you run some benchmarks you'll see a small difference in performance. Otherwise in "real life" the performance increase is marginal at best.


I know it's just a typo... but you put that it was overclocked to 5.9 GHz, which should have read 5.0.

A 1.1 GHz bump in clock speed should theoretically make closer to a 10-15% difference for most modern CPUs; in some instances, perhaps a tad more.

@Deus:

Your 1.1 GHz OC falls about in line with what I would expect, 10-15% performance increase from an OC that high is about right. I would bet if you did the math on it, you would come out right around that much improvement over your FPS before.

In some cases, if you're getting 40 FPS and then get 48 FPS after your overclock, that's a 20% increase in FPS and you're really getting a good scaling for your CPU performance in that game. In other situations you might only go from 40-44/45 FPS or so if the game is poorly optimized, or there is some other issue also "bottlenecking" your system. For example, some games respond really well to more RAM bandwidth, so if you upped your RAM speed and got a benefit from it, you may actually see some benefit there as well depending on the game in question.
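To put numbers on that kind of scaling, here is a small illustrative calculation (3.5 GHz is the FX-6300's stock clock implied by the 1.1 GHz OC discussed above; the 40 to 48 FPS figures are just the hypothetical numbers from this post):

    # Illustrative only: compare the realized FPS gain to the raw clock gain.
    base_clock, oc_clock = 3.5, 4.6
    fps_before, fps_after = 40, 48
    clock_gain = oc_clock / base_clock - 1      # ~31% more clock
    fps_gain = fps_after / fps_before - 1       # 20% more FPS
    scaling = fps_gain / clock_gain             # fraction of the clock gain realized
    print(f"clock +{clock_gain:.0%}, fps +{fps_gain:.0%}, scaling {scaling:.0%}")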
August 27, 2013 5:11:18 PM

8350rocks said:

I know it's just a typo...but you put if it was overclocked to 5.9 GHz which should have read 5.0.

A 1.1 GHz bump on clock speed should theoretically make closer to 10-15% difference for most modern CPUs, in some instances perhaps just a tad more.

@Deus:

Your 1.1 GHz OC falls about in line with what I would expect, 10-15% performance increase from an OC that high is about right. I would bet if you did the math on it, you would come out right around that much improvement over your FPS before.

In some cases, if you're getting 40 FPS and then get 48 FPS after your overclock, that's a 20% increase in FPS and you're really getting a good scaling for your CPU performance in that game. In other situations you might only go from 40-44/45 FPS or so if the game is poorly optimized, or there is some other issue also "bottlenecking" your system. For example, some games respond really well to more RAM bandwidth, so if you upped your RAM speed and got a benefit from it, you may actually see some benefit there as well depending on the game in question.


Yea, Borderlands 2 is the only game that got me that 8 fps increase. There's one specific area in the game that caused me to drop from 60 to 36-38 before I overclocked. After overclocking, I get 44-46 fps. Still quite horrible for that game, and it's not a GPU issue because I have a GTX 770, and in the aforementioned area my GPU usage percentage just drops. But I heard it was a game that scales well with overclocking. Too bad no one told me what "well" actually meant relative to my expectations. I was thinking of an increase to a solid low-50s range. Ah well. :/

The other game is Crysis 1. I wish I could play it, but unfortunately it's ridiculously CPU-bound and my overclock never made a performance increase I could readily measure beyond, I suppose, 3 or 4 fps. The majority of the game for me is just a mess, jumping anywhere from 60 to 45 to 30 to 24 to 45 to 34 to 60 again, and it's not fun in the slightest. Boy, do I wish I could play Crysis 1. My GPU is more than capable of handling it at Ultra 1080p, and the first 5 minutes of the game ran on Ultra at a consistent 60 fps. Truly, even for a 2007 game it's freaking gorgeous, which makes it all the more heartbreaking that I'm one crappy component away from playing it.
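For what it's worth, one quick way to sanity-check that the GPU really is sitting idle waiting on the CPU is to log GPU utilization while the frame rate dips. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH:

    import subprocess, time

    # Poll GPU utilization once per second; if the frame rate dips while this
    # number also drops well below ~95%, the GPU is waiting on the CPU.
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True)
        print(f"GPU utilization: {out.stdout.strip()}%")
        time.sleep(1)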
August 27, 2013 5:16:41 PM

Deus Gladiorum said:

Yea, Borderlands 2 is the only game that got me that 8 fps increase. There's one specific area in the game that causes me to drop from 60 to 36-38 before I overclocked. After Overclocking, I get 44-46 fps. Still quite horrible for that game, and it's not a GPU issue because I have a GTX 770 and in aforementioned area my GPU usage percentage just drops. But I heard it was a game that scales well to Overclocking. Too bad no one told me what "well" actually was relative to my expectations. I was thinking of an increase to a solid low 50 range. Ah well. :/ 

The other game is Crysis 1. I wish I could play it but unfortunately, it's ridiculously CPU bound and my overclock never made a performance increase I could readily measure beyond, I suppose, 3 or 4 fps? The majority of the game for me is just a mess, jumping anywhere from 60 to 45 to 30 to 24 to 45 to 34 to 60 again. It's a whole mess and it's not fun in the slightest. Boy, do I wish I could play Crysis 1. My GPU is more than capable to handle it at Ultra 1080p, and the first 5 minutes of the game ran on Ultra at a consistent 60 fps. Truly, even for a 2007 game it's freaking gorgeous which makes it all the more heartbreaking that I'm one crappy component away from playing it.


That's odd, because the 6300 runs Crysis 3 quite well, and it's even more demanding than the original Crysis!

Have you contacted Crytek to see if there were any patches or hotfixes that updated optimizations? It's weird that something that should run very well on high-end six-year-old technology is having issues with your FX. I think there is an underlying issue elsewhere...

Have you turned on Vsync on your GPU? If you haven't you might try it...
August 27, 2013 5:38:54 PM

8350rocks said:
That's odd, because the 6300 runs Crysis 3 quite well, and it's even more demanding than original Crysis!

Have you contacted Crytek to see if there were any patches or hotfixes to the game to see if they updated optimizations? It's weird that something that should run very well on high end 6 year old technology, is having issues with your FX. I think there is an underlying issue elsewhere...

Have you turned on Vsync on your GPU? If you haven't you might try it...


I know for sure that Crysis 1 is FAR more CPU-bound than Crysis 2 or 3. Crysis 2 and 3 are designed completely differently from Crysis 1: they're built around small, linear level environments and are nowhere near as open. Plus Crysis 1 is designed to use 2 cores, and for AMD users that's not good. AMD's typical approach is to make CPUs with far less efficiency per clock, but to somewhat compensate by offering budget CPUs with a large number of cores. Bad news for us, because a lot of games can't make use of the extra cores. Additionally, you can tell how CPU-bound Crysis 1 is by how dynamic the environments are, with the constant rendering and de-rendering of anything that passes through your 10 m rendering radius. It's pretty ridiculous, because world objects as big as barrels literally 50 feet away aren't rendered until you walk ahead or look down your sights, at which point you have massive numbers of objects that need to be rendered all at once, all the time.
August 27, 2013 5:49:35 PM

Deus Gladiorum said:
I know for sure that Crysis 1 is FAR more CPU bound than Crysis 2 or 3. Crysis 2 and 3 are designed completely differently than Crysis 1. They're designed with small, linear level environments in mind and are nowhere near as open. Plus Crysis 1 is designed to use 2 cores, and for AMD users that's not good. AMD's typical thing is to make CPUs with far less efficiency per clock, but somewhat compensate for this by offering budget CPUs with a large number of cores. Bad news for us, because a lot of games can't make use of the extra cores. Additionally, you can tell how CPU bound Crysis 1 is based on how dynamic the environments are with its constant rendering and derendering of anything that's passes through your 10m rendering/derendering radius. It's pretty ridiculous, because world objects as huge as barrels literally 50 feet away aren't rendered until you walk ahead or look down your sights in which case you have massive amounts of objects that need to be rendered all at once all the time.


The issue with that is...6 years ago, when they made that engine for dual cores...the CPUs were still less effective than the new ones are now.

Think about this:

You had Athlons in 2007, then Phenom, and Phenom II followed; from Athlon II in '07 to Phenom II was about a 15% jump in efficiency. Bulldozer was a somewhat lateral move, but Piledriver was about a 15% improvement again...

That means that, worst-case scenario, your FX-6300 is at minimum 20% more efficient per core than machines from back then, and at maximum closer to 35%.
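Rough math on that, treating the two roughly 15% steps as multiplicative (an approximation, not an exact benchmark figure):

    # Compounding the two ~15% per-clock steps (Athlon II -> Phenom II, then
    # Bulldozer -> Piledriver) as multiplicative gains:
    step = 0.15
    compounded = (1 + step) * (1 + step) - 1   # ~0.32, i.e. roughly 32% per core
    print(f"compounded per-core gain: {compounded:.0%}")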

Now, knowing this information, do you understand why I am quite a bit perplexed as to why your PC is bogging down? It would be like your PC choking on Diablo 2 or something; it's a bit absurd.

However, I am beginning to wonder if your settings for 3D acceleration are set to software rendering or hardware rendering. I would check those settings and ensure they're set to hardware rendering. Check the settings in the game as well.

I would also contact CryTek directly and ask about any patches they may have made to the game to improve performance. Something doesn't sound right there...

Check Vsync too on your GPU settings...
August 27, 2013 5:52:08 PM

Deus Gladiorum said:
I know for sure that Crysis 1 is FAR more CPU bound than Crysis 2 or 3. Crysis 2 and 3 are designed completely differently than Crysis 1. They're designed with small, linear level environments in mind and are nowhere near as open. Plus Crysis 1 is designed to use 2 cores, and for AMD users that's not good. AMD's typical thing is to make CPUs with far less efficiency per clock, but somewhat compensate for this by offering budget CPUs with a large number of cores. Bad news for us, because a lot of games can't make use of the extra cores. Additionally, you can tell how CPU bound Crysis 1 is based on how dynamic the environments are with its constant rendering and derendering of anything that's passes through your 10m rendering/derendering radius. It's pretty ridiculous, because world objects as huge as barrels literally 50 feet away aren't rendered until you walk ahead or look down your sights in which case you have massive amounts of objects that need to be rendered all at once all the time.


Idk man, I usually get anywhere from 45-60 (Vsync) on C3 with my 6300 and 7870 LE, and your 770 is a lot faster than my card.
August 27, 2013 6:47:07 PM

cmi86 said:
Right on that makes good sense. I was just curious about the voltage needed in those ranges as I was fortunate enough to be able to hit 4.5 on stock voltage (1.36v) and never really tried for more. Nice job getting up there though !


I'm betting if you got that on stock volts you can surely go higher with a little vcore bump and get even more performance, if you have sufficient cooling... And if you get it to 4.7-5.0 GHz you're knocking on the door of i5-3570K performance.
August 27, 2013 6:57:03 PM

8350rocks said:
Idk man I usually get anywhere from 45-60(v-sync) on C3 with my 6300 and 7870 LE, and your 770 a lot faster than my card.


We're talking about Crysis 1. Not Crysis 3. I get Crysis 3 in the mid 50s at least.

8350rocks said:
The issue with that is...6 years ago, when they made that engine for dual cores...the CPUs were still less effective than the new ones are now.

Think about this:

You had Athlons in 2007, then came Phenom and Phenom II followed, from Athlon II in '07 to Phenom II was a 15% jump in efficiency. Bulldozer was a somewhat lateral move, but Piledriver was about a 15% improvement again...

That means that worse case scenario, your FX 6300 is minimum 20% more efficient than machines from back then, and at maximum closer to 35% more efficient per core.

Now, knowing this information, do you understand why I am quite a bit perplexed as to why your PC is bogging down? It would be like your PC choking on Diablo 2 or something, it's a bit absurd.

However, I am beginning to wonder if your settings for 3D acceleration are set to software rendering or hardware rendering. I would check those settings and ensure they're set to hardware rendering. Check the settings in the game as well.

I would also contact CryTek directly and ask about any patches they may have made to the game to improve performance. Something doesn't sound right there...

Check Vsync too on your GPU settings...


Do you remember what the benchmarks for Crysis 1 were back then? The most powerful non-dual-GPU setups never got past an average of 30 fps on High (not even Very High) settings in DirectX 9 at 1080p with 0x AA and 0x AF. Then consider that across a benchmark run from 1280x1024 to 1920x1080, while averages changed quite drastically, the sustained minimum was 24-25 fps with one outlier of 21 fps, the sign of a potential bottleneck:
http://www.bit-tech.net/hardware/graphics/2008/04/01/nv...

Now consider that it's a game built for Windows Vista and is now forcibly running on Windows 7, with some very problematic issues just getting it to run, while I'm also playing at Very High settings at 1080p with 16x AA and 16x AF on DirectX 11, and I have moderately long stretches where I'll get 60 fps followed by ridiculous drops, mostly down to the 30 range but in one area down to 24. I'm not running it in software mode, because those settings would cripple my CPU. Additionally, I have a GPU-monitoring OSD and it changes dynamically, so it's a pretty safe bet I'm in hardware mode; in fact, I'm not even sure you're allowed to enable AA and AF in software mode for games. VSync is enabled through the Nvidia Control Panel. So yes, I'm positive it's because Crysis 1 is so CPU-hungry that it's hurting my fps.
August 27, 2013 7:42:13 PM

8350rocks said:

I know it's just a typo...but you put if it was overclocked to 5.9 GHz which should have read 5.0.


Thanks. Fixed it.

At 5.9GHz the FX-6300 would be smokin'... and not in a good way either... unless liquid nitrogen is involved...

August 28, 2013 6:40:05 AM

Deus Gladiorum said:
We're talking about Crysis 1. Not Crysis 3. I get Crysis 3 in the mid 50s at least.

Do you remember what the benchmarks for Crysis 1 was back then? The most powerful non-dual GPU setups never got past an average of 30 fps on high (not even very high) settings on Direct X 9 at 1080p with 0x AA and 0xAF. Then consider that across a benchmark running from 1280x1024 to 1920x1080, while averages changed quite drastically, the maintained minimum was 24-25 fps with 1 outlier of 21 fps, the sign of a potential bottleneck:
http://www.bit-tech.net/hardware/graphics/2008/04/01/nv...

Now consider that it's a game built on Windows Vista and is now forcibly running on Windows 7 with some of the most problematic issues just getting it to run, while I'm also playing at Very High settings at 1080p with 16x AA, 16x AF, on Direct X 11 and I have moderately long stretches of time where I'll get 60 fps followed by ridiculous changes in fps mostly down to the 30 range but in one area down to 24. I'm not running it in software mode, because those settings would cripple my CPU. But additionally, I have a GPU monitoring OSD and it changes dynamically, so it's a pretty safe bet I'm in hardware mode. In fact, I'm not even sure you're allowed to enable AA and AF in software mode for games. VSync is enabled through the Nvidia Control Panel. So yes, I'm positive it's because Crysis 1 is so CPU hungry that it's causing a detriment to my fps.


Try backing off on the AA and AF to 4x each and see if your frame rates pick back up... I don't think it's necessarily the CPU that's crushing you unless you're monitoring core usage and see spikes over 95%, with the core usage staying there while the FPS dips. It could simply be that CryEngine 2 was not as well optimized as CryEngine 3 is for certain hardware.
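If you want to do that core-usage check, here's a minimal sketch (assuming Python with the third-party psutil package installed) that logs per-core load once a second while the game runs:

    import psutil

    # Log per-core CPU usage once per second while the game runs. If one or two
    # cores sit near 100% whenever the frame rate dips, the game is CPU-bound on
    # those cores; if every core stays well below that, look elsewhere.
    while True:
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print("  ".join(f"core{i}: {p:.0f}%" for i, p in enumerate(per_core)))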
August 28, 2013 6:40:57 AM

jaguarskx said:
Thanks. Fixed it.

At 5.9GHz the FX-6300 would be smokin'... and not in a good way either... unless liquid nitrogen is involved...



Yes...5.9 would be utterly impressive with any cooling solution outside of LN2 or DICE or what have you.

EDIT: Though I have seen 5.556 GHz on the 8350... however, his cooling solution was a massive evaporative cooler designed to dissipate 450 W of heat.
August 28, 2013 7:19:39 AM

Warjb34 said:
I'm betting if u got that on stock volts u can surely go higher with a little vcore bump and get even more performance if u have sufficient cooling... And if u get it to 4.7-5.0ghz us knocking on the door of the i5 3570k performance.


I plan on at least trying it this winter. I can't now because my rig is located in a very warm room of the house, and when under load it makes the room even warmer, like 30 C!!!! With those ambient temps she already gets up into the high 50s even with the 212 EVO and AS5. I can't wait to move to a place with good AC lol.
December 27, 2013 12:37:49 PM

I know it's a bit old, but just to help anyone who might see this post: I managed to hit 4.8 GHz stable with a vcore of 1.5 and VDDA of 2.7 (on a Hyper 212 EVO with Noctua thermal paste and 2x fans, system in/out), max temps of 29 C idle / 59 C load, and that's the socket temp of course. I still didn't manage to get 5.0 GHz stable, not even at a vcore of 1.55.