If Bulldozer is a fail Is Ivy Bridge also a fail

Status
Not open for further replies.

vishalaestro

Distinguished
Jun 29, 2011
1,446
0
19,310
Don't know, but I'm sure Ivy Bridge will run cooler than Sandy Bridge due to its die shrink, and it's also supposed to give around 20% faster CPU performance and more than 50% better graphics performance. I'm pretty sure that except for the 8-core model, all the Bulldozer processors justify their performance at their price, so it's a matter of cost. If you take an i5-2400 and an FX-6100, the FX-6100 is cheaper and its performance is in line with its price. AMD didn't really do anything wrong; they just need to make some adjustments.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
We will see. Most rumors as of late point to little or no drop in power consumption. Trying to cool a smaller chip that uses the same power is not good.

Even if it is a disaster, Intel will never be labeled a flop. People back then and even now still try to defend the P4.

AMD, on the other hand, is examined with an electron microscope to find any and every small detail and blow it out of proportion.
 


Ivy Bridge runs hotter than Sandy Bridge. Shrinking the die helps reduce power consumption, which Ivy Bridge achieved. However, shrinking the surface area of the CPU also reduces the efficiency of heat transfer from the CPU to the heatsink. It's a matter of physics: while Ivy Bridge uses less power than Sandy Bridge, it also transfers heat less effectively, so the CPU retains more heat and runs a little hotter.
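The heat-density argument can be put in rough numbers. This is an illustrative sketch only: the die sizes are approximate published figures for the quad-core parts, and the power numbers are simply the rated TDPs, not measured dissipation.

```python
# Rough power-density comparison: a smaller die dissipating similar power
# concentrates heat into less area, making it harder to cool.
# Die sizes are approximate published figures; power figures are rated TDPs.

def power_density(watts, die_mm2):
    """Watts dissipated per square millimetre of die."""
    return watts / die_mm2

sandy = power_density(95, 216)   # Sandy Bridge quad-core, ~216 mm^2, 95 W TDP
ivy   = power_density(77, 160)   # Ivy Bridge quad-core, ~160 mm^2, 77 W TDP

print(f"Sandy Bridge: {sandy:.2f} W/mm^2")
print(f"Ivy Bridge:   {ivy:.2f} W/mm^2")
# Even with a lower TDP, Ivy Bridge's smaller die gives a higher W/mm^2,
# so the heat has to leave through less surface area.
```

By these rough numbers Ivy Bridge comes out around 0.48 W/mm² versus roughly 0.44 W/mm² for Sandy Bridge, despite the lower TDP.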

Not sure why people are hoping for a 20% improvement in performance. If it were going to be that high, I'm positive Intel would have mentioned they expect Ivy Bridge to perform better. They never mentioned any improvement in CPU performance.

The integrated graphics core was something that Intel did say would be an improvement. Anandtech estimated a 60% improvement sometime last year, and some recent preliminary benchmarks show the Intel HD 4000 is around a 45% improvement over the Intel HD 3000, if I am not mistaken.

See the following link for some synthetic benchmarks. Actual benchmarks should appear once Intel's NDA has elapsed. Note the small increase in CPU performance. The review also reveals only a small 2 W drop in power consumption compared to the i5-2500K Sandy Bridge, in both load and idle. While the idle temps of the i5-3570K and i5-2500K are the same, at full load the i5-3570K runs 7C hotter.

http://www.tweaktown.com/articles/4618/ivy_bridge_preview_with_gigabyte_z77x_ud5h_intel_z77_and_core_i5_3570k/index6.html
 


Only if you had absurdly high expectations of Ivy Bridge, like a 20% increase in CPU performance, which are based on people's "hopeful wishes" rather than on anything Intel has published, to the best of my knowledge. Intel did mention that Ivy Bridge includes an updated AVX instruction set, which can boost the performance of programs that use AVX by a decent amount; however, very few programs use AVX instructions. To the best of my knowledge, only financial and scientific modelling software uses AVX.

If you are an overclocker, then I suppose Ivy Bridge is more of a disappointment than a failure, due to the higher operating temps that result from less efficient heat transfer. That is basically down to the laws of physics; specifically thermodynamics.


Bulldozer was a "fail" because it failed to live up to the hype created by AMD's marketing department's promises. I believe that after the actual results of Bulldozer's performance (or lack thereof) were revealed, AMD downsized the majority of their marketing department.
 
Hanging around here makes it easy to forget that ~95% of the market won't overclock.

Business, consumer, casual gaming and notebook customers will determine if Ivy Bridge (and Bulldozer) is a success.

If the enthusiast customers don't find favor with one series or another, it's not going to be an indicator of 'success' or 'fail'.
 
Bulldozer will be a disappointment on the basis that it fell short of the expectations AMD naively created. A radical architecture, with no time or prior knowledge to work from, was always going to be a gamble. So in that sense it is/was.

On the other hand, Zambezi is not an abject failure, in that enough positives came out of it. IPC aside, the multithreaded performance is promising considering it's AMD's first run at CMT, and the results overall are impressive: the FX-8120/8150, priced to compete at the 2500K level, beat the 2500K in practically every benchmark requiring high thread counts, and compete with Intel's 2700K and Extreme processors in the high-thread-count game. Considering they're priced at $240 max, that's a good showing.

Real-world and synthetic benchmarks tell completely different stories. Overall I would rate Zambezi 7/10, while SB is at around 9/10.
 


+1 to this
 

Chip in a box

Honorable
Mar 2, 2012
40
0
10,540
In that case Ivy will still be a failure compared to Trinity, because Trinity is going to improve much more, with some leaks claiming a 30% CPU and 50% graphics improvement.

The A10-5800K looks to be quite close to the i7-970 in single-threaded CPU performance, with a very good graphics portion included. It looks like the gap is closing quite fast. http://citavia.blog.de/2012/04/08/trinity-piledriver-performance-13460109/
 

vitornob

Distinguished
Jun 15, 2008
988
1
19,060
Intel's intent was to bring more iGPU power than ever. (checked)
Also a lower TDP. (checked)
But it runs hotter when overclocked. (sadly checked)

AMD's intent was to bring the house down with the new BD architecture. (not checked)
And yet they failed to make people move from their own Phenom II architecture...

This is why AMD failed with Bulldozer, and why Intel didn't fail, but did disappoint me, and everyone else, with the higher temps.
 

vitornob

Distinguished
Jun 15, 2008
988
1
19,060


Well... if Trinity is going to be this powerful, AMD should throw away the FX line and go only with this one.

BTW, I looked at the chart in your post, and it looks weird; there are two i5-2500Ks.
But that isn't my major concern. If the chart you posted is correct, it means this new A10-5800K can't touch the i5-2500K, and has worse IPC than the A8-3870 in INT and FP calculations... will the A10-5800K be a Llano downgrade?
 
If you are only looking at a 6% clock-for-clock gain over a comparable SB chip, and factoring in the need to go to Z77 to get all the features of IB, that may not be sufficient to make an SB owner move. As for the HD 4000, it will still look hopelessly redundant compared to an APU. PCIe 3.0 may only be a factor by the time Haswell arrives, making Lucid and faster memory speeds the only reasons to move to IB.

Tri-gate and its efficiency are still to be determined; while IB will be the most sophisticated x86 processor, it may not be enough to make people move from SB.
 

Chip in a box

Honorable
Mar 2, 2012
40
0
10,540


Obviously it's not going to be as fast as the 2500K when it has a 6670-level GPU attached. Trinity/Piledriver has lower IPC but higher clocks, which make it faster overall. Why is this a downgrade?
 

vitornob

Distinguished
Jun 15, 2008
988
1
19,060


When I wrote "Llano downgrade" I was talking about an "architecture" downgrade. Why lower the IPC? Higher GHz brings more current leakage and less room for overclocking, and given how heat dissipation works, more GHz means a more-than-linear TDP increase.
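The "more than linear" point follows from the classic dynamic-power relation P ≈ C·V²·f: chasing a higher clock usually also requires a higher voltage, so power grows faster than frequency alone would suggest. A hypothetical sketch (the capacitance and voltage figures below are invented for illustration, not measured):

```python
# Dynamic CPU switching power scales roughly as P = C * V^2 * f.
# Raising frequency typically also requires raising voltage, so power
# rises super-linearly with clock speed. All numbers here are invented.

def dynamic_power(cap, volts, freq_ghz):
    """Simplified switching power in arbitrary units: C * V^2 * f."""
    return cap * volts ** 2 * freq_ghz

base   = dynamic_power(1.0, 1.20, 3.0)   # hypothetical baseline clock/voltage
faster = dynamic_power(1.0, 1.35, 3.9)   # +30% clock, needing a voltage bump

print(f"+30% frequency -> {faster / base:.2f}x the power")
```

With these made-up figures, a 30% clock increase costs about 1.65x the power, not 1.3x, which is the sense in which TDP climbs faster than clock speed.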
 

Chip in a box

Honorable
Mar 2, 2012
40
0
10,540
AMD says Trinity doubles performance per watt over Llano - http://blogs.amd.com/fusion/2012/04/19/amd-%E2%80%9Ctrinity%E2%80%9D-and-%E2%80%9Cbrazos-2-0%E2%80%9D-heading-your-way/

This will be at low TDP; rumours say that the 17 W Trinity performs about the same as the 35 W Llano, and that's an amazing increase. Things won't be so great at 100 W because of what you mentioned, but that's the same with Intel too: Intel's low-TDP Ivy Bridge chips will have better performance per watt than the high-TDP chips.
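The rumoured 17 W vs 35 W comparison lines up with AMD's "double the performance per watt" claim via simple arithmetic. Assuming, as the rumour goes (not as measured fact), equal performance at both TDPs:

```python
# If a 17 W Trinity matched a 35 W Llano at the same performance level,
# perf/W would improve by 35/17 ~= 2.06x, consistent with AMD's
# "double the performance per watt" claim. Rumoured figures only.

def perf_per_watt_gain(old_watts, new_watts, perf_ratio=1.0):
    """Relative perf/W gain when the new part delivers perf_ratio x old perf."""
    return (perf_ratio / new_watts) / (1.0 / old_watts)

print(f"{perf_per_watt_gain(35, 17):.2f}x performance per watt")
```

The same function shows why the 100 W parts look worse: if the high-TDP part only gains, say, 1.2x performance for the same 100 W, perf/W improves by only 1.2x.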
 

ebalong

Distinguished
Sep 11, 2011
422
0
18,790



Where did that "20% improvement" figure come from anyhow? Sandy Bridge wasn't 20% faster in raw CPU performance than Westmere, was it? The pattern suggests that the more significant overhaul of the processor (the "tock") is what produces the greater gains over the previous generation.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860

So your logic is that people upgraded from the i7-920 to the i5-2500K? Not everyone upgrades from one generation to the next; logically your CPU should last at least two. The i7-920 is still a very viable CPU, as is the Phenom II. For a current owner of either, it wouldn't be worth upgrading yet.

When I wrote "Llano downgrade" I was talking about an "architecture" downgrade. Why lower the IPC? Higher GHz brings more current leakage and less room for overclocking, and given how heat dissipation works, more GHz means a more-than-linear TDP increase.

Kinda like the i7-920 at 2.66 GHz being 130 W and the 2500K at 3.3 GHz being 95 W? Maybe your logic can only be applied toward your hatred for AMD.

As for that chart, they listed all the speeds as their turbo speed, so the per-GHz figures are off, since they didn't lock the clocks. Turbo doesn't automatically max out, especially in a benchmark.
 

ebalong

Distinguished
Sep 11, 2011
422
0
18,790



Are you comparing the 8120 to the 2500K? True, the former is ~$40 cheaper than the latter (if both are bought off Newegg, for example), but the 2500K's performance superiority is probably worth more than that $40 difference.

The FX-6100 is more expensive right now than the i3-2120, and I think you could make a case that the i3 performs better on average in games and other tasks.

I'm pretty sure I saw some gaming benches somewhere that put the G860 above the FX-4100, which is, of course, $10 more than the Pentium.
 

ebalong

Distinguished
Sep 11, 2011
422
0
18,790



Yeah, but isn't that the 8150 vs. the 2500K? The Bulldozer that is more expensive than the 2500K (and was initially priced much closer to the 2600K)?
 