Deneb won't clock as high as we'd hoped in 2008

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Tom's has this to say:

AMD is planning to launch two 45 nm Phenom X4 CPUs (Deneb core) in 2008 with core frequencies of 2.5-2.8 GHz and 2.4-2.7 GHz, with both adopting 6 MB L3 cache and having a TDP rating of 95W, according to sources at motherboard makers. AMD will announce the final order date for its 125 watt Phenom X4 9750 by the end of the second quarter and the CPU will be replaced by a 95 watt version - which is available this quarter - while the 95 watt Phenom X4 9850 will appear before January of 2009.

http://www.tomshardware.com/news/amd-cpu-phenom,5248.html

As a generally loyal AMD fan, who built 3 Athlon X2 systems this past year for home use (and who is still debating getting an 8750 as a holdover), I have to say I'm mildly disappointed that we won't see 3.0 and 3.2 Denebs on launch, as the rumor mill claimed.

Also, I can't see AMD keeping the B3 9750 and 9850 around in 95 watt versions if Deneb's going to have much higher performance. If Deneb's die shrink with SOI can't change things, then I guess we'll all have to hope AMD's marketing for Bulldozer finally pans out in two years.

AMD: the budget CPU maker of choice for those of us so old school we still like our pins on the CPU and not the motherboard :lol: !

Seriously, I love ATI/AMD chipsets and ATI GPUs, but AMD's fallen into the same market they had with the K6-2. It's as if the Athlon XP, the Athlon 64 and the Athlon X2 never happened.

This makes me think I should go triple core 8750 and overclock it to 2.8 on a new Gigabyte board rather than wait for Deneb. The way things are going, an 8750 at 95 watts just might be viable against AMD's best until Bulldozer changes things.

Yes, I know Intel will have better, but I don't like Intel chipsets any more than I like Nvidia's (which is not at all).

 

jj463rd

Distinguished
Apr 9, 2008
1,510
0
19,860
Are you sure about the L3 cache for the early Denebs? That doesn't seem like a correct figure. I thought it was the L2 cache that was increased. A TDP rating of 95 watts is good, though.
Oh wait, I do see another link here. From there it does say 6 MB L3 cache.


http://www.hardware.info/en-US/news/#gQlK

On the story Wednesday 11:15 a.m. 4/30/08 - First 45nm Phenom in November. It says they are going to introduce in the meantime a 2.6 GHz Phenom X4 9950 and energy-efficient 9150e and 9350e processors.
 


Sounds like the same deal as a 90nm A64 vs 65nm - slightly lower TDP, no real benefits...

Why don't you like Intel chipsets? They're THE best chipsets; Intel has a reliable platform overall, while AMD has a shaky one at best... for now, but improving rapidly now that they make their own chipsets.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


Well, I don't like the premium cost of Intel motherboards, but the main reason I don't like Intel chipsets is that they fall short on both IGP performance and video playback quality.

I usually get a good motherboard with a decent IGP, just in case I have to fall back on it while RMA'ing a failed card. Though I play games, I also watch DVDs (will watch Blu-ray soon too), watch unlicensed fansubbed anime, and do some graphics and video editing.

Intel falls behind both ATI and Nvidia in that regard. I've had compatibility problems with Nvidia chipsets, so I've basically preferred ATI/AMD. My only P4 has an ATI X200 board, and I have 690V, 690G and 780G boards for our Athlon X2 builds.

Yes, I don't think Deneb will be a loser, but it won't be Bulldozer, so I see no problem with considering a budget 8750 until Bulldozer arrives in two years' time. I am disappointed not to see at least 3.0 at stock, but it could be due to SOI.

AMD is only shaky with regard to the Phenom; it's an average processor. AMD chipsets are great, ATI GPUs are quite good, and if we have to fall back on an IGP, we don't have to worry that it can't handle video playback. The G35 is horrible in that department.


 


http://www.anandtech.com/mb/showdoc.aspx?i=3299 - AMD boards with issues running the high-end AMD chips. I'd prefer the price premium of an Intel motherboard, knowing that it will work correctly, and at least Intel learnt from the heat and voltage regulation cooling issues. Besides, the premium goes towards a better-performing system, and what gamers use integrated graphics?
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


There's nothing wrong with the 780G chipset. Some motherboard manufacturers chose not to provide support for the 125 watt 9750 and 9850 on their 780G boards.

Does a Q6600 work in every Intel motherboard? Does a Wolfie or a Penryn? Whenever I check Newegg, whether Intel or AMD, I see boards listed by CPU supported. Then, when I buy a motherboard and CPU, I check the manufacturer's CPU support list.

Sometimes it's the lack of a BIOS update that means a motherboard doesn't support a particular CPU. Other times, it's voltage issues. The 780G is getting a bad rap here because it's an HTPC, SOHO and light gaming IGP, not designed for the higher-wattage quad cores.

The boards do support the wattage and voltage of the 95 watt quad and triple cores. Quite good budget boards.



Is every PC built for gamers, or for gaming? We have one gaming PC with a 3870x2, one mainstream PC with a 3850, a kid's PC with the 780G integrated graphics and an HTPC with an X200 IGP (eventually will be replaced with another 780G or higher).

I would not trust Intel integrated graphics on our HTPC, or for our kid's light gaming (games like Fate, which is E 10+) and school work. In every generation since IGP arrived, Intel's had lousy graphics. When a discrete GPU is not needed, at least I want to know the IGP supports H.264 and can handle everything we throw at it video wise.

I'd consider a mainstream Intel CPU if ATI still made Intel chipsets. I wasn't upset and didn't blame ATI when the X200 board with a P4 630 couldn't support a C2D upgrade, but when I built new systems to replace that and the old i865 Northwood, I went AMD because the X2s were inexpensive and the 690 chipsets were truly phenomenal products from AMD.

 

jj463rd

Distinguished
Apr 9, 2008
1,510
0
19,860
Look at the bright side of things. The Deneb won't be revolutionary, but AMD's upcoming Denebs will have more cache, similar to what Intel's Core 2 Duos and Quads have (hopefully performance will improve a good deal over the current Phenoms because of it), the process has changed so the chips will use less power, and they will run a little faster. They might even overclock much better than the current K10 Phenoms.
No one here (I think) has seen benchmarks of the upcoming Deneb compared to Intel's higher end current Quads or for that matter the upcoming Nehalem.
So we will just have to wait and see.

From what I read, AMD's Istanbul was supposed to be expanded into a 12-core beast to compete with a future Intel 8-core processor.

 

epsilon84

Distinguished
Oct 24, 2006
1,689
0
19,780
Hey yipsl, why does every thread involving you somehow always get sidetracked into ANOTHER discussion about your upgrade plans? Yes dude, we know you love AMD/ATI and won't contemplate evil Intel, but please, if you make a thread about Deneb, at least stick to the topic!

My thoughts on this:
I'm surprised AMD isn't going to push the TDP envelope; 2.8GHz at 95W means 3GHz+ at 125W should be possible. Unless something else is holding the clockspeeds back, I don't see why AMD isn't going all out. I mean, they can hardly claim they're going the 'energy efficient' route, since 95W is hardly 'green' by any stretch. Why not just release a faster 125W part and not get ridiculed as much by the press and reviewers when it inevitably gets compared to 3GHz+ Nehalems?
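For what it's worth, a back-of-envelope check supports that 3GHz+ figure. Dynamic power scales roughly with frequency times voltage squared, so (using the rumoured numbers above and my own illustrative scaling assumptions, nothing AMD has published) a 125W budget leaves decent frequency headroom over a 95W part:

```python
# Rough frequency-headroom estimate from TDP scaling -- illustrative only.
# Dynamic power: P ~ f * V^2. Figures below are the rumoured 95 W / 2.8 GHz
# Deneb and the 125 W bracket mentioned in the article.

base_freq_ghz = 2.8
base_tdp_w = 95.0
target_tdp_w = 125.0

# If voltage stays constant, power scales roughly linearly with frequency.
freq_same_volts = base_freq_ghz * (target_tdp_w / base_tdp_w)
print(f"Same voltage: ~{freq_same_volts:.1f} GHz in 125 W")    # ~3.7 GHz

# More realistically, voltage rises with frequency, so take P ~ f^3
# (f * V^2 with V roughly proportional to f).
freq_scaled_volts = base_freq_ghz * (target_tdp_w / base_tdp_w) ** (1 / 3)
print(f"Scaled voltage: ~{freq_scaled_volts:.1f} GHz in 125 W")  # ~3.1 GHz
```

Even the pessimistic cubic case lands just above 3GHz, which is why the 95W ceiling looks like a deliberate choice rather than a hard limit.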
 

epsilon84

Distinguished
Oct 24, 2006
1,689
0
19,780


Well I'd certainly hope so, since this is basically all AMD will have to compete against Nehalem for the next 1.5 - 2 years until Bulldozer hits.

It appears AMD is just conceding the high-end CPU market for now, and is shifting the focus towards the mainstream level, as well as trying to sell the 'platform' instead of just the CPU. In their current financial state, it probably makes the most sense. Trying to stay on the bleeding edge is costly; they simply don't have the resources to battle Intel on all fronts and are prioritising the markets that can generate the most revenue.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


Not every thread, but perhaps too many. Sorry about that. This thread is sort of about my disappointment that Deneb won't be 3.0 or 3.2, and won't be 65 watts. 95 watts at 45nm? Is this an SOI issue?

Frankly, I wasn't going to upgrade until Deneb. When people asked for upgrade advice, I said 'wait for Deneb or get an 8750' (which actually outshines the 9850BE for overclocking).

The comments on motherboards were because Apache_lives seems to think, like too many people here, that every PC needs to be a gaming rig. Maybe if there's only one in the household and gamers dwell there, but my experience shows there are reasons to go IGP, when the IGP is stellar.

Intel's held gaming back with their IGPs. If developers only had to code for ATI and Nvidia IGPs at the low end, then we'd at least see more PC gaming over consoles. When Intel IGPs are the low end, there is no hope for improvement.

Plus, Intel's really lousy at H.264 decoding. Maybe G45 will change that, but I won't hold my breath.



3.0 seems to be the sweet spot that would please enthusiasts and the mainstream OEMs. They're not getting it at stock. Whether these chips will overclock as well as the 8750s and some 9850BEs remains to be seen, but not everyone overclocks. I don't.

If they do overclock, then I don't expect much beyond 3.2 out of 45nm with SOI. I guess they aren't lengthening the pipeline and pushing the thermals Prescott-style as rumored.

If AMD wants to go past the OEM market and its loyal fan base, they'll need a better CPU, and that won't happen for a couple of years. At least ATI makes good chipsets and both IGPs and discrete GPUs.
 


Wait, what? Since when do you plan to use an IGP for your gaming rig? I understand your loyalty, but dude, Intel's chipsets (yes, even the IGPs) are the best I have ever used. They pack all the same features, allow easier and better OCs, and are more stable than Nvidia's. Nvidia's are only really good for SLI.

Now for an HTPC I understand, but who would use an IGP over a low-to-mid range GPU, like an HD 2600 or 8600, especially since it will help with the encoding and such? And they won't cost much more.

Intel's mobos are not that bad in price. The top-tier Intel mobo is roughly the same price as an equivalently equipped AMD AM2+ mobo. But the best value is normally the one step down, as the P35 was next to the X38.

But if you use the IGP just for "in case of", it wouldn't matter which one you get, since you will be getting a new GPU within a week or two.

As for the OP, I think it is SOI. I was thinking it seemed harder to do, but we will see how well they OC compared to the Phenoms. If they are crap at OCing, that could be SOI.

Oh, and I like having no pins on the CPU. Less chance for them to get bent. Plus, the BGA (ball grid array) gives more of a contact surface for the CPU.
 

epsilon84

Distinguished
Oct 24, 2006
1,689
0
19,780

That depends entirely on your expectations. Personally, I think all IGPs suck for gaming (some worse than others *cough* Intel *cough*). They are useful for many other things, but a 6600GT (a mid-range GPU from 2004) still beats the crap out of a 780G, which I think puts the 'uber' gaming performance of that platform into some perspective. If you don't mind playing older games, it's fine, but it struggles to run newer games at playable framerates.

Why buy a 2600 when you don't have to?
To get playable framerates on newer games? I think that's a pretty good reason. ;)
http://www.tomshardware.com/reviews/amd-780g-chipset,1785-11.html

As I said earlier, it's sufficient for older games though:
http://www.tomshardware.com/reviews/amd-780g-chipset,1785-10.html
 
Yeah, the 780G is too memory-limited for modern games. When I get mine up and running I'll test it, but my guess is that as I OC higher and higher, the returns will diminish rapidly because of memory bandwidth. I wish I could have gotten the 790G with the integrated DDR3.

However, for new games I'll use my 2900, so this isn't an issue. The 780G is only for TV and older games like RTW.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
I don't game on my IGP. I do like having an IGP on a motherboard so I can get it booting before I even install a GPU. My 7 year old games on his 780G.

My whole point is that, in each generation, AMD and Nvidia IGPs beat Intel hands down. They're neck and neck while Intel's eating their dust. Just imagine how many people buy Dells, Gateways, HPs and eMachines hoping to play an occasional game on the Intel IGP, only to find that they cannot.

The HTPC proves my point. When you're doing anything more than surfing the net or running Word or Excel, you need more than an Intel IGP. That's true whether you game or not.

Perhaps I didn't make my argument clear. I don't like Intel IGPs, and if I were forced to build an Intel system for myself, I'd be in a quandary, as I prefer to have an IGP on my boards.

As for that turnaround time, Jimmy, I only had to send a GPU back once, and I didn't have an IGP on that i865 chipset board with the P4 Northwood. Back then, we only had one PC in the house, and doing without the net, anime and older games was a nuisance.

That's why I decided last year to only buy motherboards with decent IGPs along with PCIe x16 slots for decent GPUs. It never hurts to have a backup. One thing Nvidia's doing right that I wish AMD would follow is having the IGP enabled on motherboards at all prices, because future power-saving features rely on an IGP.
 


The general consensus on Intel chipsets seems to be three things:

1. They are very stable.
2. The IGPs in the IGP versions are well behind NVIDIA's and ATi's.
3. They frequently are the reason why you will not be able to run a new chip in your board, even though the socket is the same. (The VRM is also to blame in many cases.)

Now for an HTPC I understand, but who would use an IGP over a low-to-mid range GPU, like an HD 2600 or 8600, especially since it will help with the encoding and such? And they won't cost much more.

Most people would use an IGP over a midrange GPU like the HD 2600 or GF 8600 because most people don't game. Putting a midrange GPU that throws off 60 watts or so into the now-very-common SFF cases that many business desktops and some home desktops ship in would be tough to cool, as well as adding unnecessary cost. The most stress most people put on a computer is playing back video, and a decent IGP like the Radeon HD 3200 or GF 8200 handles that very well; the G35, not so well, particularly HD.

Intel's mobos are not that bad in price. The top-tier Intel mobo is roughly the same price as an equivalently equipped AMD AM2+ mobo. But the best value is normally the one step down, as the P35 was next to the X38.

Intel's chipsets are more expensive. NVIDIA makes the GeForce 7000 series, which are nicer IGPs than the G35. The top 7000-series unit, the 7150, is on boards costing $75-90. AMD's 780G chipset, which is also a much nicer IGP than the G35, is on boards costing $70 to $105. Intel G35 boards start at $95 and go up to over $120 (prices from Newegg). The AMD chipset may be less expensive due to the northbridge not having the memory controller onboard, but the GeForce 7150 *does* have the memory controller in the northbridge and still costs less than the G35 boards. I don't think you can look at this and say anything except that Intel charges a premium for its chipsets.

But if you use the IGP just for "in case of", it wouldn't matter which one you get, since you will be getting a new GPU within a week or two.

That's true, so you wouldn't want to pay a bunch more for an IGP that may be a little better. Except that you're paying more for the Intel IGP and it's *not* better than the competition.

As for the OP, I think it is SOI. I was thinking it seemed harder to do, but we will see how well they OC compared to the Phenoms. If they are crap at OCing, that could be SOI.

It could be any number of things, from process maturity to tweaking production for greater yields or lower voltages at the expense of a bunch of highest-bin chips.

Oh, and I like having no pins on the CPU. Less chance for them to get bent. Plus, the BGA (ball grid array) gives more of a contact surface for the CPU.

First of all, socket 775 is a LAND grid array, not a ball grid array. Otherwise you would be complaining VERY loudly about not being able to upgrade your CPU as it would be soldered to the board. The only Intel desktop chips that I know of that are BGA are the Celeron 200 series soldered to the D201GLY/2 series mini-ITX boards. BGA is pretty much limited to chipsets and mobile/embedded CPUs for the most part.

Secondly, you can bend the contacts in the LGA socket just the same as you can bend the pins in a pin grid array (PGA) chip like most Core 2 Duo mobile CPUs and AMD desktop/laptop CPUs. The pins in an LGA socket are spring-loaded which helps in preventing bending from a force pushing straight down on them (which would cause a kink in a PGA pin) but lateral forces can bend the LGA socket pins too. The key point is that you need to be careful handling all of the different types of processors and ICs, be they PGA, LGA, or even DIP or edge-pin chips.

Third, your contact surface argument makes no sense. Yes, LGA mounting arrangements allow for a higher density of contacts between the CPU package and the motherboard. (BGA arrangements allow for the highest contact density per unit area.) But it wasn't really needed, as AMD has managed to successfully put 940 pins under its chips since 2003 and has done just fine with it. Intel putting 775 pins underneath the P4/PD/Core 2 packages didn't require an LGA setup to accomplish. You can see that if you flip over a socket 775 chip, as the underside of the package isn't even completely covered with lands; there is a square in the middle covered with capacitors. A chip that probably did need to go LGA to get the required number of contacts would be a socket 1207 Opteron, but those have 432 more lands than socket 775 chips and actually do use the entire inferior surface of the CPU package for lands. The reason Intel went LGA in the first place was that they thought it would lead to better cooling for the P4 Prescott, but the socket 775 Prescotts ran about 10% hotter than the socket 478 units IIRC.
 

MarkG

Distinguished
Oct 13, 2004
841
0
19,010


Intel chips need a big cache because the memory latency is much worse than a CPU with an integrated memory controller. I doubt that a big cache size increase would make anywhere near as much difference to an AMD CPU.
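To put rough numbers on that (a toy model with made-up but plausible latencies, not measurements): the farther away main memory is, the more each avoided cache miss saves, so a bigger cache buys an FSB-based Intel design more than it would buy a chip with an on-die controller.

```python
# Toy average-memory-access-time (AMAT) model. The latencies and miss
# rates below are illustrative guesses, not measured figures.

def amat_ns(hit_ns: float, miss_rate: float, memory_ns: float) -> float:
    """Cache hit cost plus the expected miss penalty."""
    return hit_ns + miss_rate * memory_ns

# Suppose a larger L3 cuts the miss rate from 5% to 3%.
cases = [("FSB + northbridge controller", 100.0),
         ("integrated memory controller", 60.0)]
for label, memory_ns in cases:
    before = amat_ns(5.0, 0.05, memory_ns)
    after = amat_ns(5.0, 0.03, memory_ns)
    print(f"{label}: {before:.1f} ns -> {after:.1f} ns "
          f"(saves {before - after:.1f} ns per access)")
```

Same cache improvement in both cases, but the high-latency design saves substantially more per access, which is MarkG's point in miniature.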
 
First of all, the 790G is more powerful, and can be OC'ed as well. So it's getting close to acceptable gaming, and combined with a dirt-cheap 3470, it'll outdo any 2600 easily, and you still save money and power.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Thanks, MU_Engineer, for the LGA details. I guess I'm just old fashioned. I kvetched when I built my first LIF socket 486DLC-40, as I was used to 386SXs soldered onto the motherboard.

I don't like Intel socket T heatsinks. AMD heatsinks are more like what I was used to for years with socket 478, socket 7 and earlier. Just attach on one side and clip down on the other. Then again, I've only built one LGA 775 system. If I hadn't switched to AMD, I'd have gotten more used to it.

Amdfangirl, I've found the X1200 and X1250 IGPs of the 690V and 690G adequate for many games, and the 780G even more so. I usually like to test out the system before installing the GPU. Morrowind, KOTOR, Baldur's Gate 1 and 2, Icewind Dale, Fate, and HOMM 2 through 4 all play well on AMD IGPs. They played well on the Nvidia 405 chipset board, and on the ATI X200. Oblivion is actually playable on a 780G with reduced settings, but only barely.

Sure, no one will get Crysis to run well, except at low resolutions with hybrid Crossfire, but I'm sure LOTR Online would run okay. WoW is very popular and I'm sure a 780G or Nvidia equivalent would give decent framerates.

I'm all for discrete GPUs, but when I want an IGP on my board (and I do), I want it to perform well. We keep hearing how G45 will change things. We keep hearing about Larrabee too. I won't hold my breath.

How I wish that AMD had not bought ATI and I had the choice of a 780G board with a Wolfie! I'd still add the 3870x2, but it would be great having the best in each category: CPU, chipset and GPU.