Intel vs AMD -- specific case

dbooksta

Honorable
Dec 28, 2012
17
0
10,510
Not trolling; just haven't assembled a computer in a few years and the CPU manufacturers have obfuscated things so much I hardly know what to look for anymore!

I'm building a primary home machine, not looking to overclock or tune anything. Just want to grab a Gigabyte MB, Intel 330 SSD, 16GB RAM, and build a quiet machine.

Given my typical usage and priorities I thought I'd narrow the search to low-wattage 4-core 3.4GHz CPUs with integrated GPUs. So I come up with:
1. $125 AMD A10-5700 Trinity FM2 running 65W vs.
2. $220 Intel i5-3570K Ivy Bridge LGA 1155 running 77W

Is this a fair comparison? Am I missing something, or does AMD really offer the better value in this market segment?
 

sterlin22

Honorable
May 17, 2012
74
0
10,630
The A10-5700 is built on a 32nm process, while the i5-3570K is built on 22nm. The i5-3570K would perform better than the A10-5700 even if both were run at the same frequency. Pretty sure cache has something to do with it too.
 

payturr

Honorable
Dec 3, 2012
819
0
11,060
Not fair, because the i5 absolutely destroys the A10. If you want good integrated graphics, go with the A10, but it's gonna have a junky processor. Personally I'd get the 3570K, replace the stock cooler with a Hyper 212 Plus, and pair it up with a 7870 for gaming. But since you are building a home machine, I'm guessing this doesn't include gaming, so a Core i3-3220 is perfect for your use - just remember to replace the cooler.
EDIT: That's incorrect - the A10 runs at 95W.
 


The AMD chip is better for your needs. The Intel chip is faster, but that's reflected in the price. The AMD chip has the better integrated GPU.
 


That's exactly what he just said he DIDN'T want to do. Do you guys even read the posts?
 

dbooksta

Honorable
Dec 28, 2012


First, thanks for the straight answer!

Second, on what basis do we evaluate CPU performance these days? Last time I shopped CPUs they all employed the same instruction sets so first you made sure it was native 64-bit, then counted the number of cores, gave it up to 50% credit for hyperthreading, and then assumed that throughput per processor thread would be proportional to clock speed, +/- a little bit for cache. I forgot that we also need to adjust for feature size, since the same chip layout at half the size could (barring any other bottleneck) run up to twice (or maybe sqrt[2]) the speed. Is this still a good heuristic for judging pure processor speed? Or have they jammed so many features onto the CPU that you can't judge speed without "real-world" testing anymore?
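The old rule of thumb described above can be sketched in a few lines. This is purely illustrative: `naive_cpu_score` and its 50% hyperthreading credit are the poster's heuristic, not a validated model, and as later replies point out it fails precisely because it ignores per-clock differences between microarchitectures.

```python
# Illustrative sketch of the "spec sheet" heuristic from the post above.
# The function name and the 0.5 hyperthreading credit are assumptions
# taken from the poster's own rule of thumb, not a real performance model.

def naive_cpu_score(cores, clock_ghz, hyperthreading=False, ht_credit=0.5):
    """Estimate relative throughput as cores x clock, giving partial
    credit for hyperthreaded (SMT) logical cores."""
    effective_cores = cores * (1 + ht_credit) if hyperthreading else cores
    return effective_cores * clock_ghz

# The two chips from the original post: both 4 cores, ~3.4 GHz, no HT.
a10_5700 = naive_cpu_score(cores=4, clock_ghz=3.4)
i5_3570k = naive_cpu_score(cores=4, clock_ghz=3.4)

# The heuristic rates them identically, which is exactly where it breaks
# down: real benchmarks show the i5 well ahead at the same core count
# and clock.
print(a10_5700, i5_3570k)
```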
 

bentremblay

Distinguished
Jan 2, 2012
138
0
18,680
I'm in a similar situation: looking at upgrading from Athlon II 630 (2.8 X4 OC 3.4GHz), seemed that moving to Intel was the thing to do. (MoBo is AM3, so it and the 630 will become a separate box, for work.)

But just now I read this:
"No Surprise: Intel Takes The Performance Crown, AMD Represents Value : CPU Charts 2012: 86 Processors From AMD And Intel, Tested"; "If you're looking for a sweet spot in Intel's line-up, we're still fans of the Core i5-3570K at $215. The Core i5-3470 at $200 isn't bad either, though an unlocked multiplier ratio on the K-series part is easily worth $15 on its own."
"In AMD's line-up, an FX-8320 might be your best bet for a desktop PC with discrete graphics. That's one of the models we plan to add shortly. At $180, it looks like a decent alternative to the FX-8350 at $220 or FX-8150 at $190."

So I'm back to scratching my head ... looking at the i5-3570K, or low-ball with the i5-3450.

p.s. not meaning to suggest that these are appropriate for your application. Just saying that what had seemed clear for a while got blurred again when I looked into it more.
 


Clock speeds really don't matter much anymore, since Intel processors are faster and more efficient than AMD processors. If you have a stock Intel CPU at 3.2 GHz and an AMD CPU at 3.7 GHz, the Intel CPU would still outperform it because its microarchitecture is better. For general use I would go with a Pentium G or i3 over the APU. I really see no point in getting an APU. Sure, you get better graphics, but you aren't doing anything GPU-demanding, so Intel's HD Graphics would be fine. With graphics out of the equation, you are looking at either the APU with a weak CPU core, or a Pentium G/i3 with a more powerful CPU core that will be faster. I don't know about you, but I would take the faster CPU.
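The arithmetic behind this point is that performance is roughly instructions-per-clock (IPC) times clock speed, so a lower-clocked chip with higher IPC can still win. A minimal sketch, with made-up placeholder IPC figures (not measured values for either chip):

```python
# Hypothetical IPC arithmetic: perf ~ IPC x clock. The IPC numbers below
# are illustrative placeholders, not benchmark results.

def throughput(ipc, clock_ghz):
    # Roughly billions of instructions retired per second per core.
    return ipc * clock_ghz

intel = throughput(ipc=2.0, clock_ghz=3.2)  # hypothetical higher-IPC core
amd = throughput(ipc=1.5, clock_ghz=3.7)    # hypothetical lower-IPC core

# The slower-clocked chip comes out ahead on this toy model.
print(intel, amd)
```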
 

InvalidError

Titan
Moderator

The only fair metric is actual benchmarks in whatever applications/games you are interested in because the way each CPU behaves is highly dependent on each program's instruction mix and how well threaded (if at all) the program is. It has always been that way and always will be. The best CPU for one thing is not necessarily the same for everything else, although Intel has a large lead across most of the board right now.

Trying to gauge performance based on transistor count, feature size, clock speed, cache size, etc. is pointless since CPUs optimized for single-threaded performance will behave drastically differently from CPUs optimized for simultaneous multi-threading (Intel Xeon Phi, IBM Power-7, Oracle UltraSparc T1/T2/T3/T4, GPGPUs, etc.) even with the same transistor count and feature size simply because the architecture and optimization goals are completely different.

As for feature size vs clock speeds: CPUs have been stuck around the same 3.5GHz they were at nearly 10 years ago, so major clock bumps from process improvements are effectively history. We aren't going much faster on 22nm today than we were 10 years ago at 90nm, because desktop CPUs sank all the transistor budget into keeping TDP in check and improving single-threaded ILP.
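The "benchmark your own workload" advice above can be sketched with the standard library. This is a minimal illustration: the stand-in loop is a hypothetical placeholder for whatever you actually run (video encoding, image processing, etc.), since no real workload is reproducible here.

```python
# Minimal sketch of benchmarking a representative task instead of
# comparing spec sheets. The workload below is a stand-in CPU-bound
# loop; replace it with something that resembles your real use.
import timeit

def workload():
    # Placeholder CPU-bound task (e.g. swap in a short encode job).
    return sum(i * i for i in range(100_000))

# Repeat the timing and take the best run to reduce scheduler noise.
best = min(timeit.repeat(workload, number=10, repeat=5))
print(f"best of 5 runs: {best:.3f}s for 10 iterations")
```

Run the same script on each candidate machine (or look up published runs of the same real application) and compare the numbers directly; that sidesteps arguing about cores, cache, and feature size entirely.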
 


He doesn't say, but it's a weird choice. He says he doesn't want to overclock and doesn't mention gaming, so you would think it's for everyday use, but he picks two CPUs that a gamer would be looking at.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860

No reason to look at an Intel K processor, as you're paying for the overclocking ability.

Typical usage priorities = what? 16GB of RAM for web browsing is overkill.
 

InvalidError

Titan
Moderator

The A10 isn't much of a gaming CPU. It does offer a much better CPU-GPU balance, which is great for people who may do a little bit of gaming but not enough to justify buying a discrete GPU.

If OP picked a 3570K as his 2nd choice with mild IGP gaming in mind while hoping to avoid buying a discrete GPU, I think he would have a much better chance of getting what he wants with the A10.
 

dbooksta

Honorable
Dec 28, 2012


Besides web browsing/apps and file sharing the biggest processing demands are occasionally encoding video and image processing. Though I guess it doesn't matter if the video work takes a while to churn on "background" threads.

Believe it or not just holding open a typical 40+tab browsing session (Chrome, of course) and background processes can use up 4GB and keep my current dual-core Athlon running warm.

So I'm getting the sense I might be barking up the wrong tree on CPUs for a replacement system? Like I should just look at i3s or something?
 

bentremblay

Distinguished
Jan 2, 2012
Maybe we can come up with plain English distinctions here.

Unless it involves something like video editing (I've done that, and sound editing too; video is special, and for me that's where the i7 comes into the picture), I think "work" and "gaming" are sufficiently different.
I can typically have between 30 and 50 tabs open in Firefox. Sometimes (don't shoot!) as many as 75. But that's basically text-based, with Flash/YouTube thrown in here and there. That ain't gaming.

For my work box I want quiet. Which, for me, means cool. Very diff requirement.
 


I agree, and you should know by now what I think of the APU for gaming. I would never buy one or use one for gaming or everyday use. Either way, I would go with a Pentium G or i3 over the APU.
 
As others have pointed out, this is a rather lopsided comparison...

Cost - Obviously going with the A10-5800k is the least expensive option of the two. The difference between the two is performance. The Intel Core i5-3570k has a more powerful CPU core, but weaker graphics core compared to the A10-5800k.

iGPU - Integrated Graphics - From the iGPU perspective, the A10-5800K's Radeon HD 7660D is more powerful than the Intel HD 4000. Below are some benchmarks from the linked review. The i5-3570K is not part of the comparison because of the price difference; the Core i3-3225 is used instead because at least the price is comparable. Also, the games only use 2 CPU cores, so the performance difference between an i3 and an i5 is small. There are more benchmarks in the link; below are just a few.

http://www.xbitlabs.com/articles/graphics/display/amd-trinity-graphics_6.html#sect4


batman-2.png


battlefield-2.png


borderlands-2.png


CPU Performance - While AMD wins on integrated graphics, Intel wins in CPU performance. The link below is related to the review in the above link; it is part 2 of a 2-part review of AMD's Trinity APU. Again, the Core i5-3570K is not part of the review, but it does have the weaker Core i5-3330, so it is safe to assume that the Core i5-3570K will provide slightly better performance results. I've only listed some of the benchmarks; click the link to review the others. The Trinity APU does not win any of the benchmarks.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-5800k_8.html#sect0


winrar.png


photoshop.png


x264-1.png


3dsmax-2.png


x264-2.png



 
I decided to split my post because not all of the charts were being displayed when I was previewing my post...


Dedicated Graphics - If you are going to be installing a dedicated graphics card, then it depends on what card you install and how powerful it is. Trinity APUs are capable of Dual Graphics (formerly known as Hybrid CrossFire, I think). Basically, if you install an AMD graphics card no more powerful than the Radeon HD 6670 (a.k.a. Radeon HD 7670), then the dedicated graphics card and the iGPU can work together to provide better graphics performance than a Radeon HD 6670 alone - probably somewhat close to a Radeon HD 6750 in performance. Not sure if Trinity's Dual Graphics capability will be compatible with the upcoming Radeon HD 8xxx series next year. This feature is not compatible with any nVidia GPU.

Most gamers prefer to install a powerful graphics card, meaning the Radeon HD 6670 / HD 7670 is much too weak for them. In this case Dual Graphics is disabled and the Trinity APU will only use the dedicated (a.k.a. discrete) graphics card. Below are benchmarks using the nVidia GTX 680 graphics card. The benchmarks are from the same review as above (the 2nd part of Xbitlabs' 2-part review of Trinity). Again, not all benchmarks are shown below; click the link to see all of them. The A10-5800K does not win any of the benchmarks.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-5800k_7.html#sect0

batman.png


dirt.png


metro.png



Power Consumption - Below are power consumption charts from both parts 1 and 2 of Xbitlabs' Trinity APU review. The first set of charts is from the 1st part of the review. Since that part did not include any Core i5 CPU, just take the information as is. Since the A10-5800K uses less power when idling than the Core i3-3225, you can safely assume that it also uses less power than the Core i5-3570K when idling as well.

http://www.xbitlabs.com/articles/graphics/display/amd-trinity-graphics_11.html#sect0

power-1.png


power-2.png


power-3.png



The following set of power consumption charts is from the 2nd review, which includes an nVidia GTX 680; that explains why the power consumption when idling is higher. It also includes the Core i5-3330, which should provide similar power consumption results compared to the slightly faster Core i5-3570K. The A10-5800K consumes the least amount of power when the system is idling, but that changes when the CPU needs to do something... Note that the charts do not include any power consumption from playing games; if they did, you would see power consumption jump, because the nVidia GTX 680 uses around 185W of power.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-5800k_9.html#sect0

power-1.png


power-2.png


power-3.png
 

bentremblay

Distinguished
Jan 2, 2012
LOL, you inject your opinion AFTER the fact, after the poster says what he is going to be using the computer for. That's easy to do; at the time of posting I didn't know what he wanted to do with the computer.
And at the time of my post I was replying to this:
So really, you're just looking to score points by making it personal.
Now, back to the topic.

p.s. for "work" he used the same case I did: "Believe it or not just holding open a typical 40+tab browsing session". I cited between 30 and 50, sometimes 70+.
 

bentremblay

Distinguished
Jan 2, 2012
You have no idea what you are talking about, and your post is scrambled and makes no sense. Here's a tip since you are new to the forums: if you want to be respected here, don't attack people and act like a prick. It is a surefire way to get flamed.
I see you attacking me. I don't recall having attacked anybody. You're entirely fixated on me, while I'm dealing with CPU for work / game.

You let me know what you don't understand with this and maybe we can work through it:
"p.s. for "work" he used the same case I did: "Believe it or not just holding open a typical 40+tab browsing session". I cited between 30 and 50, sometimes 70+."
 
Lastly, here are some additional power consumption charts from Techpowerup's review of the A10-5800K. While, again, the Core i5-3570K is not represented, at least the more powerful i7-3770K CPU is. Therefore, it is generally safe to assume that the i5-3570K will not consume more power than the i7-3770K.

http://www.techpowerup.com/reviews/AMD/FM2_APU_Review/5.html

Basically, the two charts below try to measure only the CPU power consumption, not the entire system.

power_eps.gif


power_eps_load.gif


Full system...

power_full.gif


power_full_load.gif



And lastly, some gaming performance benchmarks in the same Techpowerup review which includes the i7-3770k using only the Intel HD 4000, the i7-3770k + Radeon HD 6670, the A10-5800k using only the Radeon HD 7660D iGPU, and the A10-5800k + Radeon HD 6670 (dual graphics mode):

f1_2010.gif


shogun.gif


sniperv2.gif
 

dbooksta

Honorable
Dec 28, 2012
To confirm a few points: Yes, I want a machine that is cool and quiet. I won't run a separate GPU/graphics card, and I won't play games on this.

JaguarSKX, thanks for the benchmarks. The first set sort of suggested I might be on the right track with the A10. But the second and third sets were baffling, perhaps because they emphasize gaming and incorporate systems with external GPUs?
 

bentremblay

Distinguished
Jan 2, 2012
That economy will cover an SSD for your OS and regular applications.
(No need to get a huge one; they're wickedly fast, so I'm going that way.)

BTW dbooksta: If I didn't have my now-redundant CPU (Athlon II 630) and GPU (Radeon 5770), I'd be in precisely your position. But I'm not, so I haven't researched this.

FWIW, I see a newly updated Benchmarks CPU Charts 2012, but I see no updated version of Benchmarks Desktop CPU Charts 2010.