Power per core 3770 vs 8350?

Tags:
  • CPUs
  • Core
Last response: February 12, 2013 4:51:48 AM in CPUs

I looked up the specs: per core (PassMark score divided by threads), the i7 rates 1184 across 8 threads, and the 8350 rates 1144 across 8 threads. The extra per-core difference seems useless because most games today will use only 4 cores at most, well, save for Crysis 3. Are there any other factors I'm not considering here? I do a fair amount of video recording (of games) and editing, plus gaming. Would there be any noticeable difference, or just a nice hole in my pocket? (The difference in CPU + mobo + RAM is roughly $200.)
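
For what it's worth, here is the arithmetic behind those per-core numbers as a quick Python sketch. The scores are the ones quoted in this post, not fresh benchmarks:

```python
# Rough per-thread PassMark comparison using the scores quoted above.
# The overall scores are reconstructed as per-thread score x threads.
cpus = {
    "i7-3770": {"passmark": 1184 * 8, "threads": 8},   # ~9472 overall
    "FX-8350": {"passmark": 1144 * 8, "threads": 8},   # ~9152 overall
}

for name, spec in cpus.items():
    per_thread = spec["passmark"] / spec["threads"]
    print(f"{name}: {per_thread:.0f} per thread")

# The per-thread gap is ~40 points, i.e. roughly 3.5% in the i7's favour.
gap_pct = (1184 - 1144) / 1144 * 100
print(f"gap: {gap_pct:.1f}%")
```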


February 12, 2013 4:54:41 AM

I would go with the AMD, mainly because of the price difference; the AMD is $80 cheaper.
February 12, 2013 6:02:10 AM

The vast majority of games only use 2 cores. Maybe the games you are interested in playing can make use of more than 2 cores, but the fact is that games capable of using more than 2 cores represent a small fraction of all games released every year.

Generally speaking, Intel CPUs perform better in games than AMD CPUs. There are professionally written reviews/benchmarks all over the web that prove it. Maybe AMD can perform better in a couple of games, but the number of games Intel CPUs perform better in is like an avalanche by comparison.

Benchmarks aside, will you notice a difference in games? That depends on the game, the graphics card and the type of monitor. If you have a powerful graphics card that gets 80+ FPS with an FX-8350 vs. 90+ FPS with a Core i5-3570K, will you notice? Maybe.

Getting 90+ FPS could mean that the action looks smoother. But you will need a 120Hz monitor to get more than 60 FPS on the screen. If you are only using a 60Hz monitor then it does not matter whether you are getting 80+ FPS, 90+ FPS or even 120+ FPS. The monitor will only display 60 FPS.
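
To put it as a quick rule of thumb (a simplified model that ignores tearing, frame pacing and adaptive sync):

```python
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Unique frames per second the monitor can actually show.

    Simplified model: a display never shows more unique frames
    than its refresh rate allows.
    """
    return min(rendered_fps, refresh_hz)

# On a 60Hz monitor, 80, 90 or 120 rendered FPS all hit the same cap:
print(displayed_fps(80, 60))    # 60
print(displayed_fps(120, 60))   # 60
# On a 120Hz monitor the extra frames actually reach the screen:
print(displayed_fps(90, 120))   # 90
```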

If you are going to be recording game play, then the FX-8350 should be the better option than a Core i5-3570K. If you are playing a game that can actually use 4 cores effectively, that means when you start recording, the video capture/compression programs start to steal processing power from the game, which would decrease game performance. Granted, the FX-8350 is not a "true 8-core CPU" in the sense that each pair of cores shares a single FPU (Floating Point Unit), so there would be times when both cores in a pair fight over the same resource. Nevertheless, there are 8 cores, and the remaining 4 could be used to encode the video.
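
That core-budget argument can be sketched crudely (a toy model of my own, ignoring the shared-FPU caveat and OS scheduling):

```python
# Toy core-budget sketch: if the game effectively pegs 4 cores,
# how many are left over for the video encoder?

def cores_left_for_encoding(total_cores: int, game_cores: int) -> int:
    """Cores the encoder can use without stealing from the game."""
    return max(total_cores - game_cores, 0)

print(cores_left_for_encoding(8, 4))  # FX-8350: 4 cores free for encoding
print(cores_left_for_encoding(4, 4))  # i5-3570K: 0, encoder competes with the game
```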
February 12, 2013 6:17:03 AM

jaguarskx said:
... you will need a 120Hz monitor to get more than 60 FPS on the screen. If you are only using a 60Hz monitor then it does not matter whether you are getting 80+ FPS, 90+ FPS or even 120+ FPS. The monitor will only display 60 FPS. ...


About the display, you are actually wrong. Input to displays is ALWAYS 60Hz max. 120Hz and 240Hz TVs interpolate frames, but you have to be careful because some have input lag. I have a 120Hz display and it is super smooth even though games will only run at 60fps.
February 12, 2013 6:31:19 AM

1zacster said:
About the display, you are actually wrong. Input to displays is ALWAYS 60Hz max. 120Hz and 240Hz TVs interpolate frames, but you have to be careful because some have input lag. I have a 120Hz display and it is super smooth even though games will only run at 60fps.


He said 120Hz monitor, not TV. TVs are a completely different tech with different rules for determining performance. For the most part, though, a TV's "Hz" can be almost completely ignored as irrelevant to a PC.
February 12, 2013 6:37:37 AM

1zacster said:
About the display, you are actually wrong. Input to displays is ALWAYS 60Hz max. 120Hz and 240Hz TVs interpolate frames, but you have to be careful because some have input lag. I have a 120Hz display and it is super smooth even though games will only run at 60fps.



Who said anything about an HDTV? Unless the OP specifically states he is using an HDTV, the natural assumption is that he is using a monitor.

If you read my post, you should note that I specifically mentioned a 120Hz monitor, not a 120Hz HDTV. They have different inputs: a 120Hz monitor uses a dual-link DVI-D cable (or two single-link DVI-D cables) for 120Hz input.

I would never play games on an HDTV in 120Hz mode, only 60Hz mode. Interpolation creates artificial input lag.
February 12, 2013 6:44:53 AM

1zacster said:
I looked up the specs and the i7 rates per core ...


I'll revise my response somewhat, since I had originally thought you were interested in a Core i5-3570K.

The Core i7-3770K should perform better than the Core i5 when it comes to video recording and playing games at the same time. While the Core i7-3770K can process up to 8 threads, there are only 4 physical cores plus 4 virtual cores. Hyper-Threading does nothing for games, but it can come in handy for video recording: benchmarks have shown that Hyper-Threading can make a difference in encoding video, as long as the codec used supports it.
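
To make the physical-vs-virtual distinction concrete, here is a tiny sketch; the counts are the commonly listed specs for these two chips, worth double-checking on the vendor pages:

```python
# Physical vs logical core counts for the two chips being compared.
cpus = {
    "Core i7-3770K": {"physical_cores": 4, "threads": 8},  # 4 real + 4 HT
    "FX-8350":       {"physical_cores": 8, "threads": 8},  # 8 integer cores
}

for name, c in cpus.items():
    smt = c["threads"] > c["physical_cores"]  # True when threads outnumber cores
    print(f"{name}: {c['physical_cores']} cores / {c['threads']} threads, SMT={smt}")
```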

However, I would say that the FX-8350 would still provide better overall performance because of the 8 physical cores.
February 12, 2013 8:49:09 AM

For video recording and editing, the FX-8350 will hold its own against the 3770K, but in gaming it will always trail slightly.
The difference won't be noticeable, but it will be there.
In gaming + recording, they will be on par.
February 15, 2013 8:20:20 AM

jaguarskx said:
I'll revise my response somewhat, since I had originally thought you were interested in a Core i5-3570K. ...


What if I turned off HT in the BIOS before booting? Wouldn't that increase gaming performance?