AMD 45nm Deneb benchmarks

Tags:
  • CPUs
  • Nehalem
  • AMD
  • Phenom

What do you think of Deneb's performance?

Total: 23 votes (3 blank votes)

  • Good: 60%
  • Poor: 40%
November 13, 2008 9:00:14 PM


November 13, 2008 9:12:13 PM

The 3.2 GHz one seems pretty nice, but it doesn't look like the 2.3 GHz one is even worth buying if you own a current Phenom.
November 13, 2008 9:21:18 PM

Makes me want to get a Deneb and an AM2+ board to play around with. :( 
November 13, 2008 9:22:17 PM

So just like others said, we'll see a small but decent clock-for-clock improvement, while most of the advantages come from the smaller node.
November 13, 2008 9:43:56 PM

Well, through a little work, I've found out that the 3.2 GHz version is on par with Intel's QX6800. It also performs the same in Fritz 11 as the Q9450.
Hopefully the performance improves; I was kinda hoping for a little more.
November 13, 2008 10:00:59 PM

Eh? Those are the same old benchies from August; it.anand had some new ones:
http://it.anandtech.com/IT/showdoc.aspx?i=3456
Server stuff, but hey, it's a server chip. And at least in perf/watt it beats the 'new' Xeon quite clearly.
November 13, 2008 10:26:58 PM

Kari said:
Eh? Those are the same old benchies from August; it.anand had some new ones:
http://it.anandtech.com/IT/showdoc.aspx?i=3456
Server stuff, but hey, it's a server chip. And at least in perf/watt it beats the 'new' Xeon quite clearly.


It'd be nice if they had the Xeon at a lower speed grade, for comparison's sake.

Word, Playa.
November 13, 2008 10:33:45 PM

Lol, sorry guys, I just had to do it after seeing a Shanghai benchmark that makes it look good under server conditions. It's already #1 on Google for "Deneb benchmarks." *dies* :na: 
November 13, 2008 10:47:43 PM

I was kinda wondering why the benchmarks were in Chinese?
November 13, 2008 10:59:27 PM

the last resort said:
I was kinda wondering why the benchmarks were in Chinese?


It's a Chinese site. :p 
November 13, 2008 11:04:09 PM

the last resort said:
I was kinda wondering why the benchmarks were in Chinese?


Because they usually get the ES chips and boards first, given that they're the manufacturing base, after all. :bounce: 
November 14, 2008 4:08:52 AM

I edited to place the Roadmap in its own thread. It's worth its own discussion.

I'll wait to see Deneb, Propus, Heka and Rana benchmarks before I make a final decision on how they compare to Agena. Right now, my Toliman's good enough. What I really need this year is a 24" LCD. I'm getting tired of a 17" CRT.

I don't expect Deneb to do as well on the desktop as Nehalem, but at best I expect it to catch up to the more recent Core 2 quads and overtake Kentsfield.
November 14, 2008 6:10:11 PM

I'd really like to see some gaming benchies with multi-GPU setups, like what we saw recently with Ci7. In fact, I'd like to see a side-by-side comparison of the two across a 'wide variety of (gaming) workloads' :) .
November 14, 2008 7:09:17 PM

chaohsiangchen said:
http://www.techradar.com/reviews/computing/components/p...

In German
http://www.tecchannel.de/server/prozessoren/1776972/tes...

From the look of it, the Opteron 2384 still performs somewhat worse than Penryn in desktop applications. However, the CFD calculation result is intriguing.


As I said before, server results generally never translate to desktop results. So we are still in the dark until a full official review arrives, and according to Yips it will be about January when Deneb's NDA lifts.
November 15, 2008 1:49:10 AM

Why can't they just run 'normal' apps on it???

ppff

I remember... K10 Barcy in the server space beat Kenty.
November 15, 2008 10:10:26 AM

amdfangirl said:
Why can't they just run 'normal' apps on it???

ppff

I remember... K10 Barcy in the server space beat Kenty.


Because that would make sense?

Have you not noticed the trend lately? Everyone does what makes no sense.

Just like people who vote for the wrong reasons, or without knowing a damn thing, just because they think the person is cool!!!!

It's just the way it is. Nothing makes sense anymore.

You, my dear fangirl, live in Senseville. We are all stuck in Nonsenseville.
November 15, 2008 3:02:02 PM

fazers_on_stun said:
I'd really like to see some gaming benchies with multi-GPU setups, like what we saw recently with Ci7. In fact, I'd like to see a side-by-side comparison of the two across a 'wide variety of (gaming) workloads' :) .



So AMD is going to have mobos with CF and SLI support on the same platform, like the i7? I wasn't aware of this. GJ AMD! :love: 
November 15, 2008 3:24:27 PM

roofus said:
So AMD is going to have mobos with CF and SLI support on the same platform, like the i7? I wasn't aware of this. GJ AMD! :love: 


I dunno. But since NVidia is letting Intel's X58 boards use SLI, why wouldn't they let AMD do the same with their boards? After all, AMD could withhold some key technology from NVidia, like Intel threatened to do with QPI, unless NVidia cooperates. NVidia already had one can of "whoopass" explode in its face; it doesn't need another. :kaola: 

November 15, 2008 4:03:16 PM

jimmysmitty said:
Because that would make sense?

Have you not noticed the trend lately? Everyone does what makes no sense.

Just like people who vote for the wrong reasons, or without knowing a damn thing, just because they think the person is cool!!!!

It's just the way it is. Nothing makes sense anymore.

You, my dear fangirl, live in Senseville. We are all stuck in Nonsenseville.


OBAMA!!!!

Word, Playa.
November 15, 2008 4:19:04 PM

fazers_on_stun said:
But since NVidia is letting Intel's X58 boards use SLI, why wouldn't they let AMD do the same with their boards?


I seriously doubt this will ever happen, as it would most likely kill Nvidia's own chipset division. Other than SLI, is there really any compelling reason to buy an Nvidia chipset?
November 15, 2008 5:12:30 PM

Just_An_Engineer said:
I seriously doubt this will ever happen, as it would most likely kill Nvidia's own chipset division. Other than SLI, is there really any compelling reason to buy an Nvidia chipset?


They won't. I knew it when I responded to the post I did. I think it would be a smart move, and one that would earn some respect, but they won't do it. They are way too "in tune" with what everyone wants, lol.
November 15, 2008 5:19:48 PM

Too late for that; Nvidia chipsets are already dead. If Nvidia does put out boards/chipsets for the i7, only an idiot would buy them. There's literally no reason to buy one if you can get an Intel board with SLI support.
November 15, 2008 5:28:47 PM

Very, VERY true. Intel makes the best chipsets for Intel: they run cooler, are much more stable, and have more overclocking headroom. I've never had problems with my 780i, but I know that's the exception, not the rule.
November 15, 2008 6:46:10 PM

Buying SLI right now would make no sense, or Nvidia for that matter.

AMD/ATI 4xxx GPUs with the 8.12 drivers and beyond will have stream computing built into the drivers. This means that every app/game capable of using a GPGPU will automatically use those new features of ATI's drivers/cards.

On top of that, Nvidia's CUDA is a high-level driver versus ATI's low-level one. This means that if developers write something to make use of CUDA, they have to rely on Nvidia's drivers to make it work; in other words, something may break in future driver releases.

ATI's low-level (or CTM) approach will let developers write their program and forget about it. It will ALWAYS work the way it was designed in all future Catalyst updates. Nvidia would have to freeze their driver to get that kind of reliability out of CUDA, or make it much more bloated and slower. AMD/ATI's approach is much better for developers.

November 15, 2008 6:55:33 PM

AMD has the same problem it has had for years.

Intel sells a 3 GHz CPU that runs at 4-5 GHz, while AMD sells a 3 GHz CPU that might barely run just over 3 GHz.

Perception is everything. This die shrink and the updates are good, but not good enough to keep market share from slipping.

Hard-core AMD fans, hang in there!
November 15, 2008 7:32:21 PM

grndzro said:
Buying SLI right now would make no sense, or Nvidia for that matter.

AMD/ATI 4xxx GPUs with the 8.12 drivers and beyond will have stream computing built into the drivers. This means that every app/game capable of using a GPGPU will automatically use those new features of ATI's drivers/cards.

On top of that, Nvidia's CUDA is a high-level driver versus ATI's low-level one. This means that if developers write something to make use of CUDA, they have to rely on Nvidia's drivers to make it work; in other words, something may break in future driver releases.

ATI's low-level (or CTM) approach will let developers write their program and forget about it. It will ALWAYS work the way it was designed in all future Catalyst updates. Nvidia would have to freeze their driver to get that kind of reliability out of CUDA, or make it much more bloated and slower. AMD/ATI's approach is much better for developers.


No thanks. I may be game to ditch the Nvidia chipset in favor of the X58, but I won't consider ATI cards for at least a year, year and a half (provided they keep up good offerings like they currently do). I do try to maintain discipline with my investment in video cards, because I used to spend way too much money playing that game.
November 15, 2008 10:11:11 PM

roofus said:
They won't. I knew it when I responded to the post I did. I think it would be a smart move, and one that would earn some respect, but they won't do it. They are way too "in tune" with what everyone wants, lol.


I guess if AMD wanted to force the issue, they would have to figure out which would be the lesser loss: fewer Deneb sales vs. increased 48XX GPU sales... However, if they did nothing, it would be a bit of an embarrassing admission that they don't swing the same weight Intel does. I wonder how much of an ego Dirk Meyer has, compared, say, to "Mr. Whoopass" Jen-Hsun Huang?
November 29, 2008 4:58:54 AM

Those benchmarks don't show the true performance of the Deneb CPUs, as the tests were run on a bottlenecking AM2+ board with DDR2.
November 29, 2008 5:02:13 AM

sonar610 said:
Those benchmarks don't show the true performance of the Deneb CPUs, as the tests were run on a bottlenecking AM2+ board with DDR2.


Um, so having a faster link and DDR3 should make it super amazingly incredibly ultra faster?

Meh. The last time AMD switched memory types (DDR to DDR2), we saw either worse performance or maybe 1-2% gains. I doubt DDR3 will make it much faster, considering that current DDR3 latencies are pretty high.
November 29, 2008 5:08:31 AM

The latencies aren't bad at all on DDR3. Sure, CL8 at 1600 MHz sounds high, but it works out to the same absolute latency as CL4 DDR2-800.
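
That equivalence is easy to check. Here's a quick back-of-the-envelope sketch (my own illustration, not from any review; the helper name is made up). The number in a DDR speed grade is the data rate in MT/s, so the bus clock is half of it:

```python
# First-word latency in ns = CAS latency (bus cycles) / bus clock (MHz) * 1000.
# DDR transfers data on both clock edges, so the bus clock is half the data rate.
def cas_latency_ns(cas_cycles, data_rate_mts):
    bus_clock_mhz = data_rate_mts / 2
    return cas_cycles / bus_clock_mhz * 1000.0

print(cas_latency_ns(4, 800))    # DDR2-800 CL4  -> 10.0 ns
print(cas_latency_ns(8, 1600))   # DDR3-1600 CL8 -> 10.0 ns (the same)
```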
November 29, 2008 5:08:39 AM

I believe AMD has claimed a 5% increase on AM3.
November 29, 2008 6:08:58 AM

jimmysmitty said:
Um, so having a faster link and DDR3 should make it super amazingly incredibly ultra faster?

Meh. The last time AMD switched memory types (DDR to DDR2), we saw either worse performance or maybe 1-2% gains. I doubt DDR3 will make it much faster, considering that current DDR3 latencies are pretty high.


There is a reason why Intel opted for DDR3 on their Core i7. Just like DDR2 when it first came out, there are almost no performance gains now, but over time DDR3 will show some noticeable gains, especially once DDR3's bandwidth can keep up with the cache and other components' bandwidths. One day we will have RAM, CPU, and GPU (probably via PCI-E) bandwidth all at the same speed, I guess ;) 
November 29, 2008 7:38:26 AM

cjl said:
The latencies aren't bad at all on DDR3. Sure, CL8 at 1600 MHz sounds high, but it works out to the same absolute latency as CL4 DDR2-800.


I understand all that. But it really depends on how well AM3 works with DDR3, and at what voltages their CPUs will run with such high-speed memory.

I was just pointing out that a switch mainly of memory type has never shown real performance improvements for AMD or Intel, except in Intel's case, where they also added another memory channel, which does help performance a bit.
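
To put a rough number on what that extra channel buys, here's a back-of-the-envelope sketch of peak theoretical bandwidth (my own illustration; the helper name is made up, and real workloads see far less than these peaks):

```python
# Peak theoretical bandwidth = channels x data rate (MT/s) x bus width in bytes.
def peak_bw_gb_s(channels, data_rate_mts, bus_bits=64):
    return channels * data_rate_mts * 1e6 * (bus_bits // 8) / 1e9

print(peak_bw_gb_s(2, 1066))  # dual-channel DDR3-1066        -> ~17.1 GB/s
print(peak_bw_gb_s(3, 1066))  # triple-channel DDR3-1066 (i7) -> ~25.6 GB/s
```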

lashton said:
There is a reason why Intel opted for DDR3 on their Core i7. Just like DDR2 when it first came out, there are almost no performance gains now, but over time DDR3 will show some noticeable gains, especially once DDR3's bandwidth can keep up with the cache and other components' bandwidths. One day we will have RAM, CPU, and GPU (probably via PCI-E) bandwidth all at the same speed, I guess ;) 


Someday we will have everything on one chip. Soon it will be GPU and CPU; then it will be memory, CPU, and GPU. One step in that direction is Fusion and Intel's equivalent. Another is Intel's Terascale chip, which is modular to the point that each core can be a GPU, PPU, or CPU.

Of course it's still a bit away, but I am sure we will have it some day.
November 29, 2008 11:41:26 AM

sighQ2 said:
4 GHz on air
5 GHz on ice
6 GHz on LN2

http://www.legitreviews.com/article/836/1/


As I have said before, there is no way to tell whether this is just a cherry-picked chip being used for that purpose.

Considering the last time AMD showed off a Phenom, you might want to take this with a grain of salt. Or dive right into the hype and end up getting let down. It's your choice.
November 29, 2008 12:18:36 PM

sighQ2 said:
4 on air
5 on ice
6 on LN2

http://www.legitreviews.com/article/836/1/


Of course, when you posted this, you did realize this is the same exact event that keeps getting re-posted? Read it carefully: this was done by AMD, not an unbiased third party. As encouraging as it is, I will reserve judgment for now.