Intel vs AMD -- specific case

December 29, 2012 2:26:02 AM

Not trolling; just haven't assembled a computer in a few years and the CPU manufacturers have obfuscated things so much I hardly know what to look for anymore!

I'm building a primary home machine, not looking to overclock or tune anything. Just want to grab a Gigabyte MB, Intel 330 SSD, 16GB RAM, and build a quiet machine.

Given my typical usage and priorities I thought I'd narrow the search to low-wattage 4-core 3.4GHz CPUs with integrated GPUs. So I come up with:
1. $125 AMD A10-5700 Trinity FM2 running 65W vs.
2. $220 Intel i5-3570K Ivy Bridge LGA 1155 running 77W

Is this a fair comparison? Am I missing something, or does AMD really offer the better value in this market segment?
December 29, 2012 2:30:02 AM

The A10-5700 is built on a 32nm process, while the i5-3570K is built on a 22nm process. The i5-3570K would perform better than the A10-5700 even if both were set to the same frequency. Pretty sure cache has something to do with it too.
December 29, 2012 2:30:04 AM

Not fair, because the i5 absolutely destroys the A10. If you want good integrated graphics, go with the A10, but it's gonna have a junky processor. Personally I'd get the 3570K, replace the stock cooler with a Hyper 212 Plus, and pair it with a 7870 for gaming. But since you are building a home machine, I'm guessing this doesn't include gaming, so a Core i3 3220 is perfect for your use - just remember to replace the cooler.
EDIT: That's incorrect - the A10 runs at 95W.
December 29, 2012 2:35:36 AM

dbooksta said:
Not trolling; just haven't assembled a computer in a few years ... does AMD really offer the better value in this market segment?


The AMD chip is better for your needs. The Intel chip is faster, but that's reflected in the price. The AMD chip has the better integrated GPU.
a b à CPUs
December 29, 2012 2:37:32 AM

payturr said:
Not fair, because the i5 absolutely destroys the A10. ... Personally I'd get the 3570K, replace the stock cooler with a Hyper 212 Plus, and pair it with a 7870 for gaming. ...
EDIT: That's incorrect - the A10 runs at 95W.


That's exactly what he just said he DIDN'T want to do. Do you guys even read the posts?
December 29, 2012 5:56:33 PM

FALC0N said:
The AMD chip is better for your needs. The Intel chip is faster, but that's reflected in the price. The AMD chip has the better integrated GPU.


First, thanks for the straight answer!

Second, on what basis do we evaluate CPU performance these days? Last time I shopped CPUs they all employed the same instruction sets so first you made sure it was native 64-bit, then counted the number of cores, gave it up to 50% credit for hyperthreading, and then assumed that throughput per processor thread would be proportional to clock speed, +/- a little bit for cache. I forgot that we also need to adjust for feature size, since the same chip layout at half the size could (barring any other bottleneck) run up to twice (or maybe sqrt[2]) the speed. Is this still a good heuristic for judging pure processor speed? Or have they jammed so many features onto the CPU that you can't judge speed without "real-world" testing anymore?
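To make that old rule of thumb concrete, here's a rough sketch in code. The 50% hyperthreading credit and the linear clock scaling are just the heuristic described above, and the inputs are the chips from this thread; none of it is a measured result:

```python
# Rough sketch of the old CPU-throughput heuristic described above.
# All numbers are illustrative; real performance needs real benchmarks.
def heuristic_score(cores, clock_ghz, hyperthreading=False, cache_factor=1.0):
    ht_credit = 1.5 if hyperthreading else 1.0   # "up to 50% credit" for HT
    return cores * clock_ghz * ht_credit * cache_factor

a10_5700 = heuristic_score(cores=4, clock_ghz=3.4)                       # 13.6
i5_3570k = heuristic_score(cores=4, clock_ghz=3.4)                       # 13.6
i3_3225  = heuristic_score(cores=2, clock_ghz=3.3, hyperthreading=True)  # 9.9
print(a10_5700, i5_3570k, i3_3225)
```

By this rule the A10 and the i5 tie exactly, which is precisely why the replies below argue the heuristic has broken down.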
December 29, 2012 6:09:13 PM

I'm in a similar situation: looking at upgrading from an Athlon II 630 (2.8GHz X4, OC'd to 3.4GHz), and it seemed that moving to Intel was the thing to do. (The mobo is AM3, so it and the 630 will become a separate box, for work.)

But just now I read this:
"No Surprise: Intel Takes The Performance Crown, AMD Represents Value : CPU Charts 2012: 86 Processors From AMD And Intel, Tested"; "If you're looking for a sweet spot in Intel's line-up, we're still fans of the Core i5-3570K at $215. The Core i5-3470 at $200 isn't bad either, though an unlocked multiplier ratio on the K-series part is easily worth $15 on its own."
"In AMD's line-up, an FX-8320 might be your best bet for a desktop PC with discrete graphics. That's one of the models we plan to add shortly. At $180, it looks like a decent alternative to the FX-8350 at $220 or FX-8150 at $190."

So I'm back to scratching my head ... looking at the i5-3570K, or low-ball with the i5-3450.

p.s. not meaning to suggest that these are appropriate for your application; just saying that what had seemed clear for a while got blurred again when I looked into it more.
December 29, 2012 6:28:18 PM

dbooksta said:
First, thanks for the straight answer!

Second, on what basis do we evaluate CPU performance these days? ... Or have they jammed so many features onto the CPU that you can't judge speed without "real-world" testing anymore?


Clock speeds really don't matter much anymore, since Intel processors are faster and more efficient than AMD processors. If you have a stock Intel CPU at 3.2GHz and an AMD CPU at 3.7GHz, the Intel CPU would still outperform it because its microarchitecture is better. For general use I would go with a Pentium G or i3 over the APU. I really see no point in getting an APU: sure, you get better graphics, but you aren't doing anything GPU-demanding, so Intel's HD graphics would be fine. With graphics out of the equation, you are looking at an APU with a weak CPU core versus a Pentium G/i3 with a more powerful CPU core that will be faster. I don't know about you, but I would take the faster CPU.
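A back-of-the-envelope version of that comparison (the IPC values here are invented purely to illustrate the point, not measured for any real chip):

```python
# Effective throughput ~ IPC x clock; hypothetical IPC values for illustration.
intel_ipc, intel_ghz = 1.6, 3.2
amd_ipc, amd_ghz = 1.2, 3.7
print(intel_ipc * intel_ghz)  # 5.12
print(amd_ipc * amd_ghz)      # 4.44 -> slower despite the 500MHz clock advantage
```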
December 29, 2012 6:35:19 PM

What's this machine going to do?
December 29, 2012 6:39:27 PM

dbooksta said:
Second, on what basis do we evaluate CPU performance these days?

The only fair metric is actual benchmarks in whatever applications/games you are interested in because the way each CPU behaves is highly dependent on each program's instruction mix and how well threaded (if at all) the program is. It has always been that way and always will be. The best CPU for one thing is not necessarily the same for everything else, although Intel has a large lead across most of the board right now.

Trying to gauge performance based on transistor count, feature size, clock speed, cache size, etc. is pointless since CPUs optimized for single-threaded performance will behave drastically differently from CPUs optimized for simultaneous multi-threading (Intel Xeon Phi, IBM Power-7, Oracle UltraSparc T1/T2/T3/T4, GPGPUs, etc.) even with the same transistor count and feature size simply because the architecture and optimization goals are completely different.

As for feature size vs clock speeds, CPUs have been stuck around the same 3.5GHz they were at nearly 10 years ago, so major clock bumps from process improvements are effectively history - we're not going much faster on 22nm today than we were 10 years ago at 90nm, because desktop CPUs have sunk all the transistor budget into keeping TDP in check and improving single-threaded ILP.
December 29, 2012 6:44:15 PM

amuffin said:
What's this machine going to do?


He doesn't say, but it's a weird choice. He says he doesn't want to overclock and doesn't mention gaming, so you would think it's for everyday use, but he picks two CPUs that a gamer would be looking at.
December 29, 2012 6:54:30 PM

dbooksta said:
Not trolling; just haven't assembled a computer in a few years ... Just want to grab a Gigabyte MB, Intel 330 SSD, 16GB RAM, and build a quiet machine. ...

1. $125 AMD A10-5700 Trinity FM2 running 65W vs.
2. $220 Intel i5-3570K Ivy Bridge LGA 1155 running 77W

Is this a fair comparison? ...

No reason to look at an Intel K processor, as you're paying for the overclocking ability.

Typical usage and priorities = what? 16GB of RAM for web browsing is overkill.
December 29, 2012 6:59:24 PM

rds1220 said:
He doesn't say, but it's a weird choice. He says he doesn't want to overclock and doesn't mention gaming, so you would think it's for everyday use, but he picks two CPUs that a gamer would be looking at.

A10 isn't much of a gaming CPU. It does offer a much better CPU-GPU balance, which is great for people who may do a little bit of gaming but not enough to justify buying a discrete GPU.

If the OP picked a 3570k as his 2nd choice with mild IGP gaming in mind while hoping to avoid buying a discrete GPU, I think he would have a much better chance of getting what he wants with the A10.
December 29, 2012 7:02:48 PM

amuffin said:
What's this machine going to do?


Besides web browsing/apps and file sharing the biggest processing demands are occasionally encoding video and image processing. Though I guess it doesn't matter if the video work takes a while to churn on "background" threads.

Believe it or not just holding open a typical 40+tab browsing session (Chrome, of course) and background processes can use up 4GB and keep my current dual-core Athlon running warm.

So I'm getting the sense I might be barking up the wrong tree on CPUs for a replacement system? Like I should just look at i3's or something?
December 29, 2012 7:07:05 PM

Video encoding and image processing would be the i5. Though you don't need to get the K unlocked version; you can opt for a 3470 instead.
December 29, 2012 7:07:52 PM

rds1220 said:
He doesn't say but it's a weird choice. He says he doesn't want to overclock and doesn't mention gaming so ...
Maybe we can come up with plain English distinctions here.

Unless it involves something like video editing (I've done that, and sound editing too; video is special - for me that's where the i7 comes into the picture), I think "work" and "gaming" are sufficiently different.
I can typically have between 30 and 50 tabs open in FireFox. Sometimes (Don't shoot!) as many as 75. But that's basically text-based ... with Flash/YouTube thrown in here and there. But that ain't gaming.

For my work box I want quiet. Which, for me, means cool. Very diff requirement.
December 29, 2012 7:21:46 PM

InvalidError said:
A10 isn't much of a gaming CPU. It does offer a much better CPU-GPU balance, which is great for people who may do a little bit of gaming but not enough to justify buying a discrete GPU. ...


I agree, and you should know by now what I think of the APU for gaming. I would never buy one or use one for gaming or everyday use. Either way, I would go with a Pentium G or i3 over the APU.
December 29, 2012 7:28:20 PM

As others have pointed out, this is a rather lopsided comparison...

Cost - Obviously the A10-5800k is the less expensive option of the two. The difference between the two is performance: the Intel Core i5-3570k has a more powerful CPU core but a weaker graphics core compared to the A10-5800k.

iGPU - Integrated Graphics - From the iGPU perspective, the A10-5800k's Radeon HD 7660D is more powerful than the Intel HD 4000. Below are some benchmarks from the linked review. The i5-3570k is not part of the comparison because of the price difference; the Core i3-3225 is used instead because at least the price is comparable. Also, the games only use 2 CPU cores, so the performance difference between an i3 and an i5 is small. There are more benchmarks in the link; below are just a few.

http://www.xbitlabs.com/articles/graphics/display/amd-t...

[iGPU gaming benchmark charts omitted; see link]
CPU Performance - While AMD wins on integrated graphics, Intel wins on CPU performance. The link below is related to the review in the above link; it is part 2 of a 2-part review of AMD's Trinity APU. Again, the Core i5-3570k is not part of the review, but it does include the weaker Core i5-3330, so it is safe to assume that the Core i5-3570k would provide slightly better results. I've only listed some of the benchmarks; click the link to review the others. The Trinity APU does not win in any of the benchmarks.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-58...

[CPU benchmark charts omitted; see link]
December 29, 2012 7:30:14 PM

I decided to split my post because not all of the charts were being displayed when I was previewing my post...


Dedicated Graphics - If you are going to be installing a dedicated graphics card, then it depends on what card you install and how powerful it is. Trinity APUs are capable of dual graphics (formerly known as Hybrid CrossFire, I think). Basically, if you install an AMD graphics card no more powerful than the Radeon HD 6670 (a.k.a. Radeon HD 7670), then the dedicated graphics card and the iGPU can work together to provide better graphics performance than a Radeon HD 6670 alone - probably somewhat close to a Radeon HD 6750 in performance. Not sure if Trinity's dual graphics capability will be compatible with the upcoming Radeon HD 8xxx series next year. This feature is not compatible with any nVidia GPU.

Most gamers prefer to install a powerful graphics card, meaning the Radeon HD 6670 / HD 7670 is much too weak for them. In this case dual graphics is disabled and the Trinity APU will only use the dedicated (a.k.a. discrete) graphics card. Below are benchmarks using the nVidia GTX 680 graphics card. The benchmarks are from the same review as above (the 2nd part of Xbitlabs' 2-part review of Trinity). Again, not all benchmarks are shown below; click the link to see all of them. The A10-5800k does not win any of the benchmarks.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-58...

[discrete-GPU gaming benchmark charts omitted; see link]
Power Consumption - Below are power consumption charts from both parts 1 and 2 of Xbitlabs' Trinity APU review. The first set of charts is from the 1st part of the review. Since the review did not include any Core i5 CPU, just take the information as is. Since the A10-5800k uses less power when idling than the Core i3-3225, you can safely assume that it also uses less power than the Core i5-3570k when idling as well.

http://www.xbitlabs.com/articles/graphics/display/amd-t...

[idle and load power consumption charts omitted; see link]
The following set of power consumption charts is from the 2nd review, which includes an nVidia GTX 680; that explains why the power consumption when idling is higher. It also includes the Core i5-3330, which should provide similar power consumption results to the slightly faster Core i5-3570k. The A10-5800k consumes the least amount of power when the system is idling, but that changes when the CPU needs to do something... Note that the charts do not include any power consumption from playing games; if they did, you would see power consumption jump up, because the nVidia GTX 680 uses around 185W of power.

http://www.xbitlabs.com/articles/cpu/display/amd-a10-58...

[power consumption charts omitted; see link]
December 29, 2012 7:34:33 PM

Quote:
LOL, you inject your opinion AFTER the fact, after the poster says what he is going to be using the computer for. That's easy to do; at the time of posting I didn't know what he wanted to do with the computer.
And at the time of my post I was replying to this:
rds1220 said:
amuffin said:
What's this machine going to do?

He doesn't say but it's a weird choice. He says he doesn't want to overclock and doesn't mention gaming so you would think ...
So really, you're just looking to score points by making it personal.
Now, back to the topic.

p.s. for "work" he used the same case I did: "Believe it or not just holding open a typical 40+tab browsing session". I cited between 30 and 50, sometimes 70+.
December 29, 2012 7:43:08 PM

Quote:
You have no idea what you are talking about, and your post is scrambled and makes no sense. Here's a tip since you are new to the forums: if you want to be respected here, don't attack people and act like a prick. It is a surefire way to get flamed.
I see you attacking me. I don't recall having attacked anybody. You're entirely fixated on me, while I'm dealing with CPUs for work / gaming.

You let me know what you don't understand with this and maybe we can work through it:
"p.s. for "work" he used the same case I did: "Believe it or not just holding open a typical 40+tab browsing session". I cited between 30 and 50, sometimes 70+."
December 29, 2012 7:43:43 PM

Lastly, here are some additional power consumption charts from Techpowerup's review of the A10-5800k. While, again, the Core i5-3570k is not represented, the more powerful i7-3770k CPU is; therefore, it is generally safe to assume that the i5-3570k will not consume more power than the i7-3770k.

http://www.techpowerup.com/reviews/AMD/FM2_APU_Review/5...

Basically, these first two charts try to measure only the CPU's power consumption, not the entire system's.

[CPU-only power consumption charts omitted; see link]
Full system...

[full-system power consumption charts omitted; see link]
And lastly, some gaming performance benchmarks in the same Techpowerup review, which includes the i7-3770k using only the Intel HD 4000, the i7-3770k + Radeon HD 6670, the A10-5800k using only the Radeon HD 7660D iGPU, and the A10-5800k + Radeon HD 6670 (dual graphics mode):

[gaming benchmark charts omitted; see link]
December 29, 2012 11:29:08 PM

To confirm a few points: Yes, I want a machine that is cool and quiet. I won't run a separate GPU/graphics card, and I won't play games on this.

JaguarSKX, thanks for the benchmarks. The first set sort of suggested I might be on the right track with the A10. But the second and third set were baffling, perhaps because they emphasize gaming and incorporate systems with external GPUs?
December 30, 2012 12:07:01 AM

dbooksta said:
To confirm a few points: Yes, I want a machine that is cool and quiet. I won't run a separate GPU/graphics card ...
That economy will cover an SSD for your OS and regular applications. (No need to get a huge one, and they're wickedly fast, so I'm going that way.)

BTW dbooksta: if I didn't have my now-redundant CPU (Athlon II 630) and GPU (Radeon 5770), I'd be in precisely your position. But I'm not, so I haven't researched this.

FWIW, I see a newly updated Benchmarks CPU Charts 2012, but I see no updated version of Benchmarks Desktop CPU Charts 2010.
December 30, 2012 12:16:32 AM

Generally speaking, when idling the A10-5800k uses less power than the i5-3570k. But most people do not simply turn on their PC and walk away. When under some type of load (the CPU working on something), the opposite occurs: the A10 uses more power than the i5, even when just watching a movie. Forget about power consumption while playing games.

In essence, the A10 uses more power for the following basic reasons.

1. 32nm vs 22nm die process - All the tiny transistors and other parts that make up a CPU take up space. The bigger they are, the more electricity it takes to drive them, and electrical resistance causes some of that electricity to be wasted as heat. As an example, while a CPU may consume 50W of power, 20W might be wasted as heat because of electrical resistance.

AMD's Trinity is built on a 32nm die process, and Intel's Core i3/i5/i7 CPUs use a 22nm die process. This means all the tiny parts in the Intel CPUs are smaller than in Trinity, and less electricity is needed because less electricity is wasted as heat (a rough sketch with numbers follows at the end of this post).

2. IPC - Instructions Per Clock (cycle) - The primary purpose of the CPU is to execute instructions so that programs can do their thing... IPC is the number of instructions the CPU can execute per clock cycle. Intel CPUs can execute more instructions per cycle than AMD CPUs / APUs can.

For example, let's just say that the Core i5 can execute 10 instructions every 100MHz, and the A10 can only execute 7 instructions every 100MHz. At 3.0GHz the i5 can execute 300 instructions, while the A10 can only execute 210. That is a difference of 42.86%. Theoretically, that means the A10 has to be clocked 42.86% higher than the i5 for both to perform equally; that is a clock speed of about 4.286GHz (see the worked check at the end of this post).

The faster the CPU runs the more power it takes. Kinda like a car. Say at 50 MPH the car is getting 30 MPG (miles per gallon). If you step on the gas to go 100 MPH, you are no longer getting 30 MPG, it might drop down to 22 MPG. You get to your destination sooner, but you also use more gas.

The faster the CPU is clocked, the more electricity it needs. However, pumping too much electricity into a CPU to make it go faster can eventually burn out the CPU, much like driving your car at ever higher speeds: eventually the engine overheats and stalls, leaving you stranded until it cools down and maybe gets more coolant.
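To put rough numbers on point 1, the usual first-order approximation for CPU switching power is P ~ C x V^2 x f. A minimal sketch, with capacitance and voltage values invented purely for illustration (they are not measured figures for Trinity or Ivy Bridge):

```python
# Dynamic switching power approximation: P ~ C * V^2 * f.
# Capacitance and voltage figures below are invented for illustration.
def dynamic_power_watts(switched_cap_nf, volts, freq_ghz):
    return (switched_cap_nf * 1e-9) * volts**2 * (freq_ghz * 1e9)

p_32nm = dynamic_power_watts(20.0, 1.25, 3.4)  # older, larger process
p_22nm = dynamic_power_watts(14.0, 1.05, 3.4)  # smaller transistors: lower C and V
print(f"{p_32nm:.0f}W vs {p_22nm:.0f}W ({p_22nm / p_32nm:.0%} of the 32nm figure)")
```

And checking the arithmetic in point 2's example in code (the same invented instructions-per-100MHz figures as in the text):

```python
# Worked version of the IPC example in point 2 (invented figures from the text).
i5_per_100mhz, a10_per_100mhz = 10, 7
clock_mhz = 3000                               # 3.0 GHz
i5_total = i5_per_100mhz * clock_mhz // 100    # 300 instructions
a10_total = a10_per_100mhz * clock_mhz // 100  # 210 instructions
gap = (i5_total - a10_total) / a10_total       # ~0.4286
print(f"{gap:.2%}")                            # 42.86%
print(f"{3.0 * (1 + gap):.3f} GHz")            # ~4.286 GHz for the A10 to match
```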
December 30, 2012 12:48:10 AM

dbooksta said:
Besides web browsing/apps and file sharing the biggest processing demands are occasionally encoding video and image processing. ... Like I should just look at i3's or something?


Unless you are doing a lot of HD video encoding, I would say that the Core i5-3570k is not worth the additional cost over the A10-5800k. Yes, when under a load the i5 does use less electricity than the A10, but not enough over time to justify the additional cost.

The i3-3220 or i3-3225 would likely be a better alternative. The difference between the two is the Intel HD 2500 (i3-3220) versus the Intel HD 4000 (i3-3225). Both are capable of playing back video, but I think the Intel HD 4000 also includes some additional algorithms to provide slightly better video playback quality. The difference is only $15 though. The i3 generally matches up pretty well against the A10-5800k, based on the following benchmarks.

http://www.anandtech.com/bench/Product/675?vs=677

If you encode video using the x264 codec with a program like Handbrake, then the A10 is going to give better performance, as shown in the benchmarks. However, the Intel HD 2500 and HD 4000 have something called QuickSync, and its specialty is video encoding. Using a video converting program that can make use of QuickSync, the amount of time it takes to encode dramatically decreases.

http://benchmarkreviews.com/index.php?option=com_conten...

These are slightly older benchmarks using the Intel HD 3000 in the 2nd gen "Sandy Bridge" CPUs. Performance has increased slightly since then, and video quality has greatly improved.

http://www.tomshardware.com/reviews/video-transcoding-a...
December 30, 2012 2:06:01 AM

jaguarskx said:
2. IPC - Instructions Per Clock (cycle) - ... Intel CPUs can execute more instructions per cycle than AMD CPUs / APUs can.


Last time I studied CPU architecture you basically had an instruction set and then some number of pipelines (now "cores") that could advance one instruction per cycle. I thought AMD and Intel were essentially keeping up with each others' instruction innovations (including caching and branch-prediction heuristics), but based on what you're saying that's not the case: Intel CPUs can process some instructions that have to be decomposed into multiple instructions on AMDs? And these are in fact exploited by common software. Is that right, or do I need to go back to school?
December 30, 2012 3:11:41 AM

dbooksta said:
I thought AMD and Intel were essentially keeping up with each others' instruction innovations

Instruction sets are one thing but how they journey through the CPU's pipelines is another.

AMD's decoder is completely different from Intel's: AMD uses a conventional L1 instruction cache with one set of decoders shared between a pair of integer cores and one floating-point core per module, while Intel reused the Netburst trace cache to bypass the decode penalty on cache hits (good stuff is good, even if borrowed from Intel's worst architecture ever). AMD's re-order buffer is shallower than Intel's, and Intel's has extra hardware in it for HT. The mix of issue ports and the distribution of ALU/FPU resources between ports differ between each one's various product lines, etc. All those factors and many more make each architecture behave quite differently from the others.

Block diagrams may give you a general idea of what to expect but the only way to know for sure how well a particular CPU architecture behaves/scales for a given application/game is to benchmark it because different code/data will exercise decode/execution units in a different way that may work better/worse on some architectures than others.

If you want fully predictable performance, you need to get non-superscalar non-multi-core in-order CPUs and even microcontrollers are evolving beyond that point these days. Once you get into superscalar CPUs with deep out-of-order queues, simultaneous multi-threading (HT in intelspeak), speculative execution, etc., things become wildly variable.
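In that spirit, a minimal benchmarking sketch: time the actual workload you care about on each machine instead of inferring from specs. The workload function below is just a stand-in for whatever you really run:

```python
# Measure the real workload instead of inferring from block diagrams or specs.
import timeit

def workload():
    # Stand-in task; substitute the code you actually care about.
    return sum(i * i for i in range(200_000))

# Five repeats of 20 calls each; take the best to filter out background noise.
best = min(timeit.repeat(workload, number=20, repeat=5))
print(f"best of 5: {best:.3f}s for 20 calls")
```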
December 30, 2012 4:11:59 AM

When the arguing began I skipped to the bottom to post, because it comes down to processor power and the iGPU difference versus cost. The APU wins on the iGPU by almost double over Intel HD Graphics; as always, Intel wins on CPU performance. But in your case the i5 is double the cost. An i3 is more suitable in that price range, though it doesn't have the Intel HD 4000 graphics that the i5/i7s have, if the iGPU is an issue in your purchase.

You're using a dual-core Athlon, so any CPU upgrade is more than enough for what you do, and if you're using the built-in motherboard graphics then Intel HD or APU graphics is an upgrade as well.
December 30, 2012 2:50:54 PM

lazyboy947 said:
An i3 is more suitable in that price range, though it doesn't have the Intel HD 4000 graphics that the i5/i7s have ...

Actually, the i3-3225 does have HD4000 IGP.

And among the desktop i5, only the i5-3570k does.

So HD4000 on anything lower than i7-37xx desktop-wise is the exception rather than the norm.
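For quick reference, here is that desktop Ivy Bridge IGP breakdown encoded as a small lookup, limited to the SKUs discussed in this thread and their near siblings (worth double-checking against Intel ARK before relying on it):

```python
# Which desktop Ivy Bridge parts carry which IGP, per the discussion above.
igp_by_sku = {
    "i3-3220": "HD 2500",
    "i3-3225": "HD 4000",   # the exception in the i3 range
    "i5-3450": "HD 2500",
    "i5-3470": "HD 2500",
    "i5-3570": "HD 2500",
    "i5-3570K": "HD 4000",  # the only desktop i5 with HD 4000
    "i7-3770": "HD 4000",
    "i7-3770K": "HD 4000",
}
hd4000 = [sku for sku, igp in igp_by_sku.items() if igp == "HD 4000"]
print(hd4000)  # ['i3-3225', 'i5-3570K', 'i7-3770', 'i7-3770K']
```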
December 30, 2012 4:57:59 PM

InvalidError said:
Actually, the i3-3225 does have HD4000 IGP.
And among the desktop i5, only the i5-3570k does.
I'm looking at i3-3225 on NewEgg.ca ... $145. Pretty attractive! Seems to me it's come down to these 2.

Any advantage to 3570K that fits your use case?
December 30, 2012 5:45:27 PM

Yeah, looks like the i3-3225 is going to fit the bill. With two fast hyperthreading cores and good built-in graphics that should have me amply covered for the foreseeable future. Since I'm not gaming it looks like the only reason to step up is if I find a highly-parallel, processor-intensive program that I'm not content to leave churning in the background.
December 30, 2012 6:17:27 PM

dbooksta said:
Yeah, looks like the i3-3225 is going to fit the bill. ...


The A10-5700 is just as fast, has a better GPU, and it's cheaper. AMD motherboards tend to be a bit less expensive also.
December 30, 2012 6:58:55 PM

The A10 isn't as fast, but it is cheaper & has a better GPU. I wouldn't buy it though, because you buy a processor to have a smart, fast piece that does many calculations. AMD processors, in my opinion, aren't worth their low price unless you're building a budget system, but even then I'd rather use a Pentium.
December 30, 2012 7:23:08 PM

payturr said:
The A10 isn't as fast, but it is cheaper & has a better GPU. ... but even then I'd rather use a Pentium.


Yes, it is just as fast.

So it's not worth it to buy a cheaper, better unit just because AMD makes it? In your opinion, of course. That's exactly what you just said.
December 31, 2012 1:26:54 AM

dbooksta said:
Yeah, looks like the i3-3225 is going to fit the bill. ... Since I'm not gaming it looks like the only reason to step up is if I find a highly-parallel, processor-intensive program that I'm not content to leave churning in the background.
Just to reiterate, I truly am riding along with you on this. (Many dozen tabs in either Chrome or FF seem symptomatic!)

Now Payturr and Falcon open the door to an alternative to i3-3225: something by AMD. (Here's where you and I part company; I'm going to be using the Athlon II 630 that's presently in my game box where it was OC'd to 3.4GHz.)
December 31, 2012 4:08:20 AM

The Athlon was a good CPU - everything that AMD does today is god-awful compared to what it once was. They let clock speeds & core counts go to their heads, and now they can't make an architecture like Thuban or Deneb anymore. Those were real processors. I miss those days.

To Falcon: I'm not hating on AMD; their processors just aren't the same anymore. I know the Core i3 does better than the A10 in certain fields, which is making me lean towards it. I've just become an Intel guy since AMD let themselves go.
December 31, 2012 5:08:31 AM

payturr said:

To Falcon: I'm not hating on AMD; their processors just aren't the same anymore. ...


It is a little frustrating. But they do still have some good products at certain price points, and I still go AMD as often as possible. I know too well what will happen if Intel reclaims a monopoly: no more K series, much higher prices. In fact, there wouldn't be a K series without AMD and the Black Edition Phenoms.

And the A10-5700 really is a better part than the 3225 for the OP. The CPU is about as fast, the GPU is faster, and it has a lower price.
December 31, 2012 9:43:11 AM

Price-wise, AMD performs equally well against its Intel counterparts, more so when we take the integrated graphics into account.

But AMDs tend to be a little more power-hungry than the Intels. Then again, they are better for backward compatibility.
December 31, 2012 12:05:51 PM

One thing that most forget: the A10-5700 was binned for better power efficiency. It's inherent in its design that some chips will draw considerably less power than others; the resonant clock mesh sometimes hits peak efficiency at the chip's peak CPU speed. The outcome is considerable.

[power consumption chart omitted]

Very few reviews out there even bothered to look at the non-K CPUs. That's 38W less at full CPU/GPU load. That's right: a 22nm CPU running only 11 watts less than "power-hungry AMD." What does that 11 watts get you, though?

[power consumption chart omitted]