
Wait for Intel's Haswell, or get AMD's Piledriver?

September 15, 2012 8:11:32 PM

Hi, I'm looking to build a fast computer that does everyday tasks and A LITTLE gaming.

The gaming part isn't important, but I want to overclock a very tiny bit.
I'm going to be getting everything around the holidays, so I wouldn't have to wait too long for Haswell,

but Intel is very expensive. My budget is $600-$1000, or 457-761 Euros.
September 15, 2012 8:18:18 PM

Wait for the PD.
September 15, 2012 8:39:34 PM

There is no real point in waiting for Piledriver.

If PD can achieve a 10% performance improvement over Phenom II / FX, then PD will be about as powerful as the 1st generation of Core i3/i5/i7 CPUs (Clarkdale / Nehalem).

If you build a PC around an Intel Sandy Bridge / Ivy Bridge Core i3/i5/i7 it will be, by default, faster than PD.
September 15, 2012 8:45:14 PM

Wait for piledriver and see what AMD has in store. It always amazes me how people can see into the future and predict CPU benchmarks.
September 15, 2012 8:58:28 PM

AMD has stated that their goal is to achieve a 5% to 15% performance improvement with the release of every new CPU.

5% is a bit weak; it's only slightly better than a "side-grade," in which case PD will be like another FX (but without all the hype).

15% is a bit extreme; not even Intel achieved that feat, although Sandy Bridge came close with about a 12% performance increase over Clarkdale / Nehalem Core i3/i5/i7 CPUs. Needless to say, Intel has a larger R&D budget. Can AMD achieve a 15% performance increase? Yes, it is possible (anything is possible). But how probable is it?

10% represents a reasonable performance increase expectation for AMD. Given AMD's more limited R&D budget (about 1/6 of Intel's) it should be achievable. Having said that, Ivy Bridge CPUs are about 29% more powerful than Core 2 Duo / Quad, Phenom II and FX CPUs. Therefore, in order for AMD's Piledriver to become somewhat competitive with Intel's Ivy Bridge CPUs, Piledriver would need at least a 22% performance increase, which would place it in the realm of Sandy Bridge's performance. Close enough to be considered competitive against Ivy Bridge.
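The arithmetic above can be sanity-checked in a few lines. This is a minimal Python sketch; the ~29% Ivy Bridge lead and the 5-22% Piledriver uplifts are the poster's estimates, not measured benchmark data:

```python
# Relative-performance arithmetic from the post above.
# Baseline 1.00 = Phenom II / FX class performance (assumed).
phenom_ii = 1.00
ivy_bridge = phenom_ii * 1.29   # "~29% more powerful", per the post

for uplift in (0.05, 0.10, 0.15, 0.22):
    pd = phenom_ii * (1 + uplift)          # hypothetical Piledriver
    gap = (ivy_bridge / pd - 1) * 100      # how far ahead Ivy stays
    print(f"PD at +{uplift:.0%}: Ivy Bridge still ~{gap:.0f}% ahead")
```

At a +22% uplift the remaining gap works out to about 6%, which is why the post calls that "close enough to be considered competitive."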
September 15, 2012 9:24:05 PM

There's no such thing as a "bad" processor or a "bad" videocard.
It all comes down to price.

If the AMD CPU is 20% slower than Intel's but it costs 30% less, then it's a better buy. Not everyone wants to be on the bleeding edge of performance. I, for example, can use the already old Athlon X2 in my laptop very well for most common tasks (browsing, music/video, a little web programming, sorting and processing photos, and so on).
My desktop is a highly overclocked quad core (even if it's not the latest generation): a Q6600 running at 3.4GHz, and to be honest, while its benchmark scores are at least 3 times higher than the laptop's, I do stuff at pretty much the same speed as on the laptop.

Unless you do HD encoding, rendering or hardcore gaming, those top-end CPUs are severely underused.
So yea, it's down to price...
September 15, 2012 9:34:15 PM

If you don't care about gaming very much, then you would have no problem piecing together a setup with a 2500k and a 6770. A Z68 or Z77 board would have all the UEFI, SATA 6Gbps, USB3, and whatever else you might want.

That way you don't have to place bets on any untested future products.
September 15, 2012 9:34:53 PM

The disparity between AMD and Intel CPU performance allows for a wide range of prices for current-generation CPUs based on overall performance. Should everyone on this planet buy an Intel CPU? No. Sure, overall their CPUs have better performance, but they also cost more. There are less expensive Intel CPUs, but lower-end AMD CPUs that cost just as much or less can provide similar performance levels.
September 15, 2012 9:49:40 PM

Like 80% of all desktop/laptop processors sold are Intel.

Sure, it's not 100%, but it's close.

Sure they aren't for everybody, but their value proposition is good enough to take in the lion's share of the total.

TBH, AMD kinda really just sucks in performance per watt and that makes a pretty large difference in terms of the overall ownership cost of a given CPU.

The gap is set to widen a whole lot more soon, too. Rumor is that, comparing Ivy Bridge to Haswell at the same performance level, Haswell will use about half the wattage that Ivy Bridge does.

That will put serious gamer CPUs at under 50w easily, maybe even under 40w. If history is any guide Piledriver will probably be more like 150w.

That is a lot of unnecessary heat and power bills avoided.

AMD chips have a steep markdown at the retailer, but when you add back all the indirect costs, the Intel chips start to look better.

AMD needs to get their processor wattage down by like 2/3 to even compete.
September 16, 2012 2:07:30 AM

luckylachance said:
Hi, I'm looking to build a fast computer that does everyday tasks and A LITTLE gaming.

The gaming part isn't important, but I want to overclock a very tiny bit.
I'm going to be getting everything around the holidays, so I wouldn't have to wait too long for Haswell,

but Intel is very expensive. My budget is $600-$1000, or 457-761 Euros.


Piledriver (Vishera) should be out by then, it sounds like a very good fit for you. If not, get an existing Llano (K10/Stars-based) APU. You want an inexpensive CPU for pretty untaxing work, you want to be able to overclock a little, and you want decent graphics. That is exactly what AMD's APUs offer. Intel's CPUs excel at CPU-heavy tasks but their graphics are significantly behind AMD's, most can't really overclock at all, and they are pricey as you mention. Intel is a company that prides itself on high CPU performance and a highly integrated laptop platform and charges large premiums for those. The preliminary data on Haswell pretty much says that it will be very much in line with this.
September 16, 2012 1:55:20 PM

I would wait on PD. Trinity tests showed about a 15% increase in performance clock-for-clock. If this holds true in the full PD, then you're looking at some very competitive chips if priced right. A 4-core PD could show up around $130 or less. It might not be as good as Intel's i5s, but it will be much cheaper.
September 16, 2012 10:43:13 PM

If you are referring to Trinity vs. Llano performance, then just be aware that Llano is not based on the Phenom II or FX CPUs. They are more or less modified Athlon II CPUs that are slower than Phenom II. Therefore, it might actually only be around 10% faster than Phenom II / FX CPUs.
September 17, 2012 1:41:41 AM

jaguarskx said:
If you are referring to Trinity vs. Llano performance, then just be aware that Llano is not based on the Phenom II or FX CPUs. They are more or less modified Athlon II CPUs that are slower than Phenom II. Therefore, it might actually only be around 10% faster than Phenom II / FX CPUs.


Llano is based on the Family 16 (10h)/Stars/K10 architecture, which is the same one used by the Athlon II and Phenom II. The Athlon II is generally slower than the Phenom II because the Phenom II is an Athlon II plus 6 MB of L3 cache. Llano has no L3 cache but is a little faster than the Athlon II because the memory controller and a couple of other things in Llano got tweaked relative to the Athlon II/Phenom II.
September 17, 2012 1:52:02 AM

Raiddinn said:
AMD needs to get their processor wattage down by like 2/3 to even compete.

I agree that needing a 125W AMD chip to compete with an 80W (or sub-50W with Haswell) Intel chip is a little sad. With such a large performance-per-watt gap, the Intel chips can pay for themselves via lower power bills over as little as a single year.
September 17, 2012 2:47:51 AM

I would go with Ivy Bridge if you want to build a computer now. Haswell isn't going to be out until next year, and it doesn't seem like you want to wait that long. I wouldn't bother with AMD, and I wouldn't wait for Piledriver either. Even IF it does have the performance increase AMD is claiming, that will only put it on the level of the first-generation Core i processors, still way behind Intel. Even a low-end SB or IB can beat out most of the AMD Bulldozers, and it will probably be the same for Piledriver once it's released.
September 17, 2012 6:20:16 AM

I have an i5 2500K (4.1GHz) and I'm going to get Haswell; I never want an AMD CPU again.
September 17, 2012 12:57:51 PM

InvalidError said:
I agree that needing a 125W AMD chip to compete with an 80W (or sub-50W with Haswell) Intel chip is a little sad. With such a large performance-per-watt gap, the Intel chips can pay for themselves via lower power bills over as little as a single year.


A 45-watt difference is enough to pay for itself? I'm not sure what you are paying for electricity, but it would take over 10 years at that rate to pay for itself.
September 17, 2012 1:58:24 PM

egilbe said:
A 45-watt difference is enough to pay for itself? I'm not sure what you are paying for electricity, but it would take over 10 years at that rate to pay for itself.


If you are going to make assumptions, you should say what they are. So should the person you quoted.

10 years of how much per day usage?

1 hour? If so then it would be possible to get 10 years shrunk down into 1 year just by using the PC 10 hours a day.

Also, it's not important to recover the entire cost of the processor. It is more important to recover the gap between worse processor A and better processor B. Once that gap is recovered, you are in the black purely on the numbers. Thus, to "pay for itself" you really only need to cover 1/2 to 1/3 of the total cost of the processor, i.e., the difference between most Intel chips and their cheaper, higher-wattage "equivalents".

Case in point: the FX-8150 (at $190 on Newegg) usually comes up compared with a 2500k ($220), and comes up lacking, btw. The difference in pricing is $30 out of $220, or about 1/7 of the total cost. It is only that $30 that needs to be recovered for there to be zero downside to choosing the Intel chip, even in terms of the up-front purchase price.

Rounding some numbers off for simplicity (and ignoring that OCing is heavily done with both processors, which skews the data even more towards the Intel chip), the usage of the Intel chip saves you about 1/20 of a kWh, and at 10 cents per kWh, every hour the computer is used saves about half a cent. 2 hours makes 1 cent. 200 hours makes 1 dollar. 6000 hours makes $30.

Where I live, electricity varies from 7.2 to 13.2 cents per kWh using plans from about 100 different providers with widely varying terms of service, so I went with a value in the middle, both for ease of calculation and because lower values often have service terms that make up the difference anyway. Plenty of plans at 10 cents here are 100% renewable energy as well, for those environmentally aware.

Anyway, at 6 hours a day that is 1000 days to recover the gap, which is about 3 years. At 12 hours a day the time shrinks by half, down to a year and a half. Most hardcore gamers will probably fall somewhere between those. Some may even push the break-even point down pretty close to 1 year.

It may really take 10 whole years to recover the entire $220, but that is an entirely flawed way to look at cost-benefit analyses anyway, regardless what you assume "pay for itself" means.

Going back to where I said I was ignoring OCing: it's pretty common for gamers not to ignore it, and at an average rate of +2% power for +1% performance, the higher-wattage FX would scale up in wattage much faster than the lower-wattage SB chip would.

Say the OCer wanted +50% performance in both cases (taking the 2500k to 5GHz vs. the 8150's 5.4GHz), which is pretty aggressive; that would double the gap between the high and low and halve the numbers above, making it quite easy to recover the difference in one year for hardcore gamers. Even if you tamed the OCs down to a more normal +1/3 (the 2500k goes to about 4.4GHz and the 8150 to more like 4.8GHz), it would still require less than 12 hours a day to recover the whole $30 in one year.

Anyway, that is all just a serious effort to look at the numbers for direct cost. It doesn't even begin to factor in indirect costs like the fact that the CPU fan runs at a lower RPM with the SB compared to the FX, the cost of added strain on the PSU, and the effects of higher temperatures on other hardware in the computer. Those figures slant the results even further towards Intel instead of AMD.
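The break-even arithmetic in this post fits in a few lines of Python. All the inputs are the thread's assumptions, not measurements: a ~50 W load-power gap, a $30 price gap, and $0.10/kWh:

```python
def breakeven_hours(price_gap_usd, watt_gap, usd_per_kwh=0.10):
    """Hours of use before the pricier, lower-wattage chip has
    recovered its up-front premium through the power bill."""
    saved_per_hour = (watt_gap / 1000) * usd_per_kwh  # kWh saved x rate
    return price_gap_usd / saved_per_hour

hours = breakeven_hours(price_gap_usd=30, watt_gap=50)
print(round(hours))               # ~6000 hours, matching the post
print(round(hours / 6 / 365, 1))  # ~2.7 years at 6 h/day
```

Doubling the wattage gap (as the overclocking scenario above suggests) halves the break-even time, which is where the "one year for hardcore gamers" figure comes from.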

- edit - clarity
September 17, 2012 4:50:52 PM

egilbe said:
A 45-watt difference is enough to pay for itself? I'm not sure what you are paying for electricity, but it would take over 10 years at that rate to pay for itself.

As Raiddinn said, I was referring to the price gap between the AMD and the nearest indisputably faster Intel CPU. If going with Intel costs $60 extra for the CPU+board, $60 is the up-front cost to recover over time.

Since I leave my PC on 24/7 most of the year (except summer), my break-even point would be around two years.
September 17, 2012 4:56:46 PM

InvalidError said:
As Raiddinn said, I was referring to the price gap between the AMD and the nearest indisputably faster Intel CPU. If going with Intel costs $60 extra for the CPU+board, $60 is the up-front cost to recover over time.

Since I leave my PC on 24/7 most of the year (except summer), my break-even point would be around two years.


45 watts is less than $5 a year. Are you really going to keep a PC that long?
September 17, 2012 5:20:11 PM

egilbe said:
45 watts is less than $5 a year. Are you really going to keep a PC that long?

Huh? I think you put the decimal point at the wrong place somewhere.

24h/day x 365days/year x 0.045kW x $0.10/kWh = $39/year.
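The formula above generalizes to any duty cycle, which is really what the two sides of this exchange are arguing about. A small sketch (the 45 W gap and $0.10/kWh rate are the thread's assumed numbers):

```python
def annual_cost_usd(watt_gap, hours_per_day, usd_per_kwh=0.10):
    """Yearly cost of drawing an extra watt_gap watts for
    hours_per_day hours every day."""
    return hours_per_day * 365 * (watt_gap / 1000) * usd_per_kwh

for h in (24, 16, 6):
    print(f"{h:2d} h/day: ${annual_cost_usd(45, h):.2f}/year")
```

At 24 h/day this reproduces the ~$39/year figure; at 6 h/day of full load it drops under $10/year, which is roughly the usage pattern the skeptics in this thread have in mind.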
September 17, 2012 5:26:06 PM

egilbe said:
45 watts is less than $5 a year. Are you really going to keep a PC that long?


This $5 figure means nothing. It could be randomly picked out of the air for all we know. Indeed, without any backing it appears so.

Two people have shown their calculations backing the opposing side. Feel free to show your calculations backing your own side.
September 17, 2012 6:32:12 PM

InvalidError said:
Huh? I think you put the decimal point at the wrong place somewhere.

24h/day x 365days/year x 0.045kW x $0.10/kWh = $39/year.


That's full power, 24/7/365. And that is not going to happen.
September 17, 2012 9:56:16 PM

egilbe said:
That's full power, 24/7/365. And that is not going to happen.

Depends on who you ask.

My main computer probably averages 16h/day. And for people running SETI, Folding or other things like that, it is indeed 100% load 100% of the time the computer is turned on.

And Intel's chips have lower idle power too, so there are savings to be made there as well. With Haswell allegedly having 20X better idle power, the idle savings may become quite significant.
September 17, 2012 10:54:09 PM

Bulldozer doesn't idle lower than equivalent Intel generations on that link.

2500k = 100W
8150 = 112W

Those are non-OCd figures and Intel wins by a nice margin.

OCd, the numbers change, but it's noteworthy that they aren't both OCd to the same extent. The 8150 has about a 22% OC whereas the 2500k has about a 32% OC.

Pushing the 8150 to a similar 32% OC (5.3GHz), it's not at all clear it would still idle lower than the 2500k. It may, but it also may not.
September 17, 2012 11:50:09 PM

egilbe said:
Seti @ home and folding @ home are exceptions, not the rule.

They may be "exceptions" but there are still almost 2.5 million active people on BOINC projects.
September 18, 2012 12:01:44 AM

Also, enterprise servers too! Companies don't want to shell out tons of money for electricity; they want the most efficient performance at the lowest cost! That's what Intel is best at!
September 20, 2012 10:04:38 AM

Just want to add that most of the time the 2500k also finishes tasks faster and drops back to its idle state sooner, which saves even more electricity compared to the 8150, which would still be in its fully active mode, obviously sucking more power. That makes the real difference bigger than just 45 watts.
September 20, 2012 10:38:14 AM

I have often thought that when comparing wattage between AMD and Intel, you should take into account the core count. The i5 idles at around 100W with two cores, while the FX-8150 idles at around 100W with 4 modules (or eight cores, if you use AMD's definition). An AMD module is only using 25W per module, compared to about 50W for the i5 2500K. In non-multithreaded scenarios the AMD chip is not power-efficient (partly because of Win7 scheduling); in threaded scenarios (where it beats the i5 on performance) it is.
September 20, 2012 10:48:01 AM

Consider that AMD intended the Zambezis to run 30% faster than the previous generation, compensating for the deep pipelines with clock speed (as Intel did with the P4), thus not sacrificing that much. If you run it at that clock you see the intended IPC, and it's very good IPC. PD will run at that clock, so yes, the IPCs will be vastly improved.

I can assure you a 1100T beats the i7 920 and 930 and is edged out by the 950s... don't tell me PD is first-gen... if anything, maybe first-gen in power performance.

If Intel really were 30% faster, or 50% to some, then AMD would have stopped making CPUs a long time ago. You can dramatise it all you want, but the margins are nowhere near that wide.
September 20, 2012 12:07:05 PM

sarinaide said:
If Intel really were 30% faster, or 50% to some, then AMD would have stopped making CPUs a long time ago. You can dramatise it all you want, but the margins are nowhere near that wide.

http://www.anandtech.com/bench/Product/288?vs=203

In 2500k vs 1100T-BE, Intel is 30-50% faster in about half the benchmarks and loses only 3-4 out of 40+.
September 20, 2012 12:32:28 PM


Eh... huh, what? "Like 80% of all desktop/laptop processors sold are Intel"? No, it's more like 60-65%. Lots of people are using AMD processors, and AMD is not giving bad performance for the price. About the 8-core Bulldozer chip: most people say the FX 4100 is not better than the i3 2120, but they don't know that the overall performance of the FX 4100 is better than the i3 2120.

http://www.legitreviews.com/article/1766/1/

The FX 4100 can overclock to 4.6GHz with the stock heatsink, without increasing any voltages, and will then give far better performance than the i3. People should actually use something before posting rubbish about it.

@op: please wait and see what the Piledriver (Vishera) based chips have got, then make your move.
September 20, 2012 1:22:48 PM

InvalidError said:
http://www.anandtech.com/bench/Product/288?vs=203

In 2500k vs 1100T-BE, Intel is 30-50% faster in about half the benchmarks and loses only 3-4 out of 40+.


I don't know what 171 vs 139 means; it's basically a thumb-sucked number trying to represent something. That is about the worst graph I have ever come across, basically a lot of meaningless numbers... reeking of benchmarketing.

Anyway, considering BD is basically a watered-down version of what it was intended to be, on a rather poor GF process: at 4.3GHz, where it was intended to run stock, it has similar IPC to Intel... repeat, at stock speed.

September 20, 2012 1:31:07 PM

Ags1 said:
I have often thought that when comparing wattage between AMD and Intel, you should take into account the core count. The i5 idles at around 100W with two cores, while the FX-8150 idles at around 100W with 4 modules (or eight cores, if you use AMD's definition). An AMD module is only using 25W per module, compared to about 50W for the i5 2500K. In non-multithreaded scenarios the AMD chip is not power-efficient (partly because of Win7 scheduling); in threaded scenarios (where it beats the i5 on performance) it is.

2nd-gen i5s = 95W
3rd-gen i5s = 77W
8150 = 125W

and those are the quad-core i5s
September 20, 2012 1:44:41 PM

8 vs. 4 cores; you need to feed them, and this is not a massive hoo-hah. The FX-81xx maintained 125W, the same TDP as the Thubans, while actually improving on it.

Another factor is that Intel desktop chips are really beefed-up notebook processors, so that helps.

And lastly... who cares.
September 20, 2012 1:53:58 PM

@sunnk - If experience is what matters, you might not want to go up against all the people with a much higher rank than you in discussions, just sayin'.

The FX-8150 is indeed 30-50% slower than the 2500k when both are at stock while using up almost 2x as much power for that performance. Practically every real world benchmark has shown that.

The FX-8150 has to have a healthy OC just to equal a stock 2500k. The FX-8150 cannot even attempt to equal a 2500k when both are OCd.

Even in your example with the FX-4100, which can be OCd about 1GHz more for about 25% more performance on a high-quality air cooler, most programs that people want to use are still locked at two cores.

So yes, it might have some small advantage in the programs that use 4 cores but don't use the HT of the i3-2120, and maybe even in some that do use HT, with a max air OC. But that small advantage doesn't make up for the huge disadvantage in anything that locks you down to just 2 cores.

It also doesn't make up for the fact that the FX-4100 is going to use like 50W more at stock, when it can't even equal the 2120 on these 2-core things, and another 50W more when you try to get that 25% OC. That is 3x the juice sucked down just to hope to match.

Just for fun here is a review comparing the i3-2100 vs the FX-8150. Sure there are about half the benchmarks where the 8150 comes out ahead, but for less than half the RL $, the 2120 matches or beats it in the other half of things. The half that people care about the most (games) is the half that leans towards the 2100.

Dragon Age Origins = 2120 wins
Dawn of War 2 = 2120 wins
World of Warcraft = 2120 wins
Starcraft 2 = 8150 wins by a narrow margin

Mind you that stuff is stock for the 8150 and maybe the 8150 can win on some of those with massive OCs, but it does set you back a whole lot more in RL $.

On the other hand, the 2100 doesn't beat the 2500k in pretty much anything except wattage if you change to 2100 vs 2500k on that same screen.

At least if you pay for a better processor with Intel, then you actually get a better processor. One that is $200 shouldn't lose to one that is $120 or $100, but that's exactly what BD is famous for.

The FX-4170 may be a diamond-in-the-rough sort of processor, but the scaling from there is horrible because of poor adoption of more cores in a lot of things people really use, notably games. It is only in synthetic benchmarks that the FX-8150 can really shine.

It is also pretty sad when your processor has to have a high OC just to match a competitor's equivalent processor. That is, however, par for the course with all of AMD's processors in the last 5 years or so.

If you aren't going to be doing any gaming, or you don't mind spending more and getting less and having to OC just to hope to match the competitor's product, then go for the 4100 or 4170. The excess power usage of the FX chip will make sure you net-lose on price over the long term, by a long shot, in order to maybe gain a tiny bit in the things that really matter.

@sarinaide - Power usage does very much matter. Maybe it doesn't matter to the people whose daddies buy them everything and pay their power bills, but it matters to people who pay all their own bills. The lower cost of ownership is on Intel's side.

That is talking purely purchase cost + power bill. If you consider that 100W more system power also means more heat inside the case from the FX vs. the i3/i5s, and much higher heat inside the PSU, the computer that is much more likely to have a catastrophic hardware failure is the one with the FX chip in it. That point just can't be ignored.

Sure, pay $20 less to get an FX-4100 instead of an i3-2120, and then pay that same amount more for a better PSU to handle the extra 100W, putting your parts at a whole lot more risk while you are net-losing on power bills.

- Edit - responded to people posting while I was writing this.
September 20, 2012 2:00:25 PM

sarinaide said:
At 4.3GHz, where it was intended to run stock, it has similar IPC to Intel... repeat, at stock speed.

Higher clock speed does not magically increase IPC. In all likelihood, IPC would decrease with clock speed due to more cache misses and heavier branch mispredict penalties from the increased latency discrepancy between core and RAM.

As for "intended" clock speeds, Intel intended Netburst to clock in at over 10GHz by now. The real-world does not always agree with the companies' engineers. What matters is the stuff that ends up on store shelves, not the design goals companies failed to meet.
September 20, 2012 3:00:08 PM

InvalidError said:
Higher clock speed does not magically increase IPC. In all likelihood, IPC would decrease with clock speed due to more cache misses and heavier branch mispredict penalties from the increased latency discrepancy between core and RAM.

As for "intended" clock speeds, Intel intended Netburst to clock in at over 10GHz by now. The real-world does not always agree with the companies' engineers. What matters is the stuff that ends up on store shelves, not the design goals companies failed to meet.


Higher clocks mitigate the loss of IPC, so yes, there is a loss, but a lot less... the best example was the Pentium 4. (The easy test: a 4.1GHz 8150 beats an i5 2500K, its direct price-point competitor.) When Zambezi leaked, it was intended to be 8-, 6-, and 4-core chips clocked at 4GHz with a 4.2GHz turbo, with 2 billion transistors scaling down to 1.5 billion; the process was not ready. Piledriver now hits that goal, but then again it's on a more mature 32nm process. Steamroller will be the big leap forward; the 28nm process should be perfected by that stage.
September 20, 2012 4:49:17 PM

sarinaide said:
Higher clocks mitigate the loss of IPC, so yes, there is a loss, but a lot less... the best example was the Pentium 4. (The easy test: a 4.1GHz 8150 beats an i5 2500K, its direct price-point competitor.) When Zambezi leaked, it was intended to be 8-, 6-, and 4-core chips clocked at 4GHz with a 4.2GHz turbo, with 2 billion transistors scaling down to 1.5 billion; the process was not ready. Piledriver now hits that goal, but then again it's on a more mature 32nm process. Steamroller will be the big leap forward; the 28nm process should be perfected by that stage.


He was pointing out your misunderstanding of what IPC (Instructions per cycle) is. By increasing the CPU frequency, there is no inherent change to IPC, but as he pointed out there is a strong potential for a decrease in IPC.

Sure, if you overclock, performance will go up, but IPC won't.
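The distinction being made here can be shown with a toy calculation. The IPC figure below is purely hypothetical, not a measurement of any real chip:

```python
def instructions_per_second(ipc, clock_ghz):
    """Throughput in billions of instructions/s: IPC x clock."""
    return ipc * clock_ghz

stock = instructions_per_second(ipc=0.8, clock_ghz=3.6)
oc    = instructions_per_second(ipc=0.8, clock_ghz=4.3)
print(oc / stock)   # ~1.19: ~19% more throughput, yet IPC is unchanged
```

Overclocking raises the clock term only; IPC is the other factor, and in practice it tends to drop slightly as clocks rise, since fixed RAM latency costs more core cycles per miss.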
September 27, 2012 1:38:54 PM

@raiddinn: First of all, when it comes to experience and rank, you may think you are more experienced than me, but let me tell you: I am a student in 9th standard, 14 years old. I have a silver medal here in Graphics & Displays, and within a month I got the first rank in the Graphics & Displays section (a monthly record) by getting the most best answers. Most people are happy with my posts and their hardware upgrades. And I didn't just arrive at the Tom's Hardware forums a year ago; I was reading the articles before that.

I have said there that the AMD 8-core Bulldozer chips just suck; I never said they are on par with the i5 2500k.

I have two friends: one has an i3 2120, one has an FX 4100. After comparing them I got this result; it is a true result, not fake or wishful thinking.

The i3 2120 gets beaten by the FX 4100 in multitasking most of the time. My friend scans his computer with antivirus while playing a game (mostly Crysis 2, Far Cry 2, BF3) and running a game installer at the same time, and believe me, CPU usage doesn't go over 60%. My other friend with the i3 2120 does similar multitasking, but his CPU usage is about 85-90%, sometimes 100% too. Both friends have a Gigabyte HD 5770 (UDV series, I think). Running the Far Cry 2 benchmark, the FX 4100 with the HD 5770 gets about 40-60 fps, while the i3 gets about 35-55 fps. You may think I'm lying, but it's true, personal experience. With Crysis 2 there is not a big difference; both get about 35-50 fps on hardcore settings at 1366x768. In BF3 the FX 4100 mostly gets about 35-40 fps, where the i3 2120 is a bit ahead at about 42 fps most of the time, but its CPU usage is about 60-85% too, which is a bit high for me compared with the FX 4100, which doesn't go over 60% usage in BF3. And the FX 4100 has some advantages over the i3 2120: price (the i3 is $135 where the FX 4100 is $100), two extra cores, the ability to overclock, and better performance in multitasking and video encoding. After overclocking it gives even better performance in everything. Power consumption / TDP doesn't matter to most people these days.
September 27, 2012 2:17:46 PM

TDP doesn't matter to enough people. It should matter to more of them.

Proper power and cooling also doesn't matter to enough people and it should matter to more of them. A large percentage of the problems I see in the wild trace back to this area.

It would make sense, though, that if you aren't paying for your own hardware or power bills, you wouldn't care about TDP. Might as well offload the drawbacks of going AMD onto some third party, unlike what 95% of TH users actually do.

If you need a quad because that is just what is recommended, then yes, putting a quad against a dual-core processor is going to favor the quad. The numbers I posted above aren't lying, though. Few games need quads, whereas all benefit from high IPC, which AMD doesn't have.
September 27, 2012 5:37:23 PM

MU_Engineer said:
Piledriver (Vishera) should be out by then, it sounds like a very good fit for you. If not, get an existing Llano (K10/Stars-based) APU. You want an inexpensive CPU for pretty untaxing work, you want to be able to overclock a little, and you want decent graphics. That is exactly what AMD's APUs offer. Intel's CPUs excel at CPU-heavy tasks but their graphics are significantly behind AMD's, most can't really overclock at all, and they are pricey as you mention. Intel is a company that prides itself on high CPU performance and a highly integrated laptop platform and charges large premiums for those. The preliminary data on Haswell pretty much says that it will be very much in line with this.

Piledriver sounds like a very good fit for him, but an i5 3570K isn't?

I love how you go with "they are pricey as you mention".
September 28, 2012 8:19:49 AM

Just HAD to add my 2 cents on the Intel vs. AMD debate:

you get what you pay for.

Quality and lower power consumption vs. price is the real debate, in which case this is a no-brainer.
September 28, 2012 12:58:57 PM

It isn't quality + power vs price.

When you factor in the cost of additional cooling, additional power bill costs, and everything else involved, the price swings to the other side, as such:

quality + power + price vs ...something...