
Deneb won't clock as high as we'd hoped in 2008

May 1, 2008 6:07:37 AM

Tom's has this to say:

Quote:

AMD is planning to launch two 45 nm Phenom X4 CPUs (Deneb core) in 2008 with core frequencies of 2.5-2.8 GHz and 2.4-2.7 GHz, with both adopting 6 MB of L3 cache and having a TDP rating of 95 W, according to sources at motherboard makers. AMD will announce the final order date for its 125 watt Phenom X4 9750 by the end of the second quarter and the CPU will be replaced by a 95 watt version - which is available this quarter - while the 95 watt Phenom X4 9850 will appear before January of 2009.


http://www.tomshardware.com/news/amd-cpu-phenom,5248.ht...

As a generally loyal AMD fan, who built 3 Athlon X2 systems this past year for home use (and who is still debating getting an 8750 as a holdover), I have to say I'm mildly disappointed that we won't see 3.0 and 3.2 Denebs on launch, as the rumor mill claimed.

Also, I can't see AMD keeping the B3 9750 and 9850 around in 95 watt versions if Deneb's going to have much higher performance. If Deneb's die shrink with SOI can't change things, then I guess we'll all have to hope AMD's marketing for Bulldozer finally pans out in two years.

AMD: the budget CPU maker of choice for those of us so old school we still like our pins on the CPU and not on the motherboard :lol:  !

Seriously, I love ATI/AMD chipsets and ATI GPUs, but AMD's fallen into the same market they had with the K6-2. It's as if the Athlon XP, the Athlon 64 and the Athlon X2 never happened.

This makes me think I should go triple core 8750 and overclock it to 2.8 on a new Gigabyte board rather than wait for Deneb. The way things are going, an 8750 at 95 watts just might be viable against AMD's best until Bulldozer changes things.

Yes, I know Intel will have better, but I don't like Intel chipsets any more than I like Nvidia's (which is not at all).

May 1, 2008 6:37:44 AM

Hm, I smell smoke...
May 1, 2008 8:35:01 AM

Are you sure about the L3 cache for the early Denebs? That doesn't seem like a correct figure. I thought that it was the L2 cache that was increased. A TDP rating of 95 watts is good, though.
Oh wait, I do see another link here. From there it does say 6 MB L3 cache.


http://www.hardware.info/en-US/news/#gQlK

In the story from Wednesday 11:15 a.m. 4/30/08 - "First 45nm Phenom in November" - it says they are going to introduce, in the meantime, a 2.6 GHz Phenom X4 9950 and energy-efficient 9150e and 9350e processors.
May 1, 2008 9:39:10 AM

yipsl said:
Tom's has this to say:

Quote:

AMD is planning to launch two 45 nm Phenom X4 CPUs (Deneb core) in 2008 with core frequencies of 2.5-2.8 GHz and 2.4-2.7 GHz, with both adopting 6 MB of L3 cache and having a TDP rating of 95 W, according to sources at motherboard makers. AMD will announce the final order date for its 125 watt Phenom X4 9750 by the end of the second quarter and the CPU will be replaced by a 95 watt version - which is available this quarter - while the 95 watt Phenom X4 9850 will appear before January of 2009.


http://www.tomshardware.com/news/amd-cpu-phenom,5248.ht...

As a generally loyal AMD fan, who built 3 Athlon X2 systems this past year for home use (and who is still debating getting an 8750 as a holdover), I have to say I'm mildly disappointed that we won't see 3.0 and 3.2 Denebs on launch, as the rumor mill claimed.

Also, I can't see AMD keeping the B3 9750 and 9850 around in 95 watt versions if Deneb's going to have much higher performance. If Deneb's die shrink with SOI can't change things, then I guess we'll all have to hope AMD's marketing for Bulldozer finally pans out in two years.

AMD: the budget CPU maker of choice for those of us so old school we still like our pins on the CPU and not on the motherboard :lol:  !

Seriously, I love ATI/AMD chipsets and ATI GPUs, but AMD's fallen into the same market they had with the K6-2. It's as if the Athlon XP, the Athlon 64 and the Athlon X2 never happened.

This makes me think I should go triple core 8750 and overclock it to 2.8 on a new Gigabyte board rather than wait for Deneb. The way things are going, an 8750 at 95 watts just might be viable against AMD's best until Bulldozer changes things.

Yes, I know Intel will have better, but I don't like Intel chipsets any more than I like Nvidia's (which is not at all).


sounds like the same deal as a 90nm A64 vs 65nm - slightly lower TDP, no real benefits...

Why don't you like Intel chipsets? They're THE best chipsets. Intel has a reliable platform overall, AMD has a shaky one at best... for now, but it's improving rapidly now that they make their own chipsets.
May 2, 2008 4:54:03 AM

apache_lives said:
sounds like the same deal as a 90nm A64 vs 65nm - slightly lower TDP, no real benefits...

Why don't you like Intel chipsets? They're THE best chipsets. Intel has a reliable platform overall, AMD has a shaky one at best... for now, but it's improving rapidly now that they make their own chipsets.


Well, I don't like the premium cost of Intel motherboards, but the main reason I don't like Intel chipsets is that they fall short on both IGP performance and video playback quality.

I usually get a good motherboard with a decent IGP, just in case I have to fall back to it if a card fails and I'm RMA'ing it. Though I play games, I also watch DVDs (will watch Blu-ray soon too), watch unlicensed fansub anime and also do some graphics and video editing.

Intel falls behind both ATI and Nvidia in that regard. I've had compatibility problems with Nvidia chipsets, so I've basically preferred ATI/AMD. My only P4 has an ATI X200 board, and I have 690V, 690G and 780G boards for our Athlon X2 builds.

Yes, I don't think Deneb will be a loser, but it won't be Bulldozer, so I don't see any problems considering a budget 8750 until Bulldozer arrives in two years' time. I am disappointed in not seeing at least 3.0 at stock, but it could be due to SOI.

AMD is only shaky in regards to the Phenom. It's an average processor. AMD chipsets are great, ATI GPUs are quite good, and if we have to use an IGP, we don't have to worry that it can't do the job as far as video playback is concerned. G35 is horrible in that regard.


May 2, 2008 6:21:15 AM

yipsl said:
Well, I don't like the premium cost of Intel motherboards, but the main reason I don't like Intel chipsets is that they fall short on both IGP performance and video playback quality.

I usually get a good motherboard with a decent IGP, just in case I have to fall back to it if a card fails and I'm RMA'ing it. Though I play games, I also watch DVDs (will watch Blu-ray soon too), watch unlicensed fansub anime and also do some graphics and video editing.

Intel falls behind both ATI and Nvidia in that regard. I've had compatibility problems with Nvidia chipsets, so I've basically preferred ATI/AMD. My only P4 has an ATI X200 board, and I have 690V, 690G and 780G boards for our Athlon X2 builds.

Yes, I don't think Deneb will be a loser, but it won't be Bulldozer, so I don't see any problems considering a budget 8750 until Bulldozer arrives in two years' time. I am disappointed in not seeing at least 3.0 at stock, but it could be due to SOI.

AMD is only shaky in regards to the Phenom. It's an average processor. AMD chipsets are great, ATI GPUs are quite good, and if we have to use an IGP, we don't have to worry that it can't do the job as far as video playback is concerned. G35 is horrible in that regard.


http://www.anandtech.com/mb/showdoc.aspx?i=3299 - AMD boards with issues using the high end AMD chips. I'd prefer the price premium of an Intel motherboard knowing that it will work correctly, and at least Intel learned from the heat issues and regulation cooling. Besides, the premium goes towards a better performing system, and what gamers use integrated graphics?
May 2, 2008 8:18:44 AM

apache_lives said:
http://www.anandtech.com/mb/showdoc.aspx?i=3299 - AMD boards with issues using the high end AMD chips. I'd prefer the price premium of an Intel motherboard knowing that it will work correctly, and at least Intel learned from the heat issues and regulation cooling. Besides, the premium goes towards a better performing system,


There's nothing wrong with the 780G chipset. Some motherboard manufacturers chose not to provide support for the 125 watt 9750 and 9850 on their 780G boards.

Does a Q6600 work in every Intel motherboard? Does a Wolfie or a Penryn? Whenever I check Newegg, whether Intel or AMD, I see boards listed by CPU supported. Then, when I buy a motherboard and CPU, I check the manufacturer's CPU support list.

Sometimes it's the lack of a BIOS update that means a motherboard doesn't support a particular CPU. Other times, it's voltage issues. The 780G is getting a bad rap here because it's an HTPC, SOHO and light gaming IGP not designed for the higher wattage quad cores.

They do support the wattage and voltage of the 95 watt quad and triple cores. They're quite good budget boards.

apache_lives said:
and what gamers use integrated graphics?


Is every PC built for gamers, or for gaming? We have one gaming PC with a 3870x2, one mainstream PC with a 3850, a kid's PC with the 780G integrated graphics and an HTPC with an X200 IGP (eventually will be replaced with another 780G or higher).

I would not trust Intel integrated graphics on our HTPC, or for our kid's light gaming (games like Fate, which is rated E 10+) and school work. In every generation since IGPs arrived, Intel's had lousy graphics. When a discrete GPU is not needed, I at least want to know the IGP supports H.264 and can handle everything we throw at it video-wise.

I'd consider a mainstream Intel CPU if ATI still made Intel chipsets. I wasn't upset and didn't blame ATI when the X200 board with a P4 630 couldn't support a C2D upgrade, but when I built new systems to replace that and the old i865 Northwood, I went AMD because the X2s were inexpensive and the 690 chipsets were truly phenomenal products from AMD.

May 2, 2008 8:30:45 AM

Look at the bright side of things. The Deneb won't be revolutionary, but AMD's upcoming Denebs will have more cache, similar to what Intel's Core 2 Duos and Quads have (hopefully there will be much greater performance than the current Phenoms with this), they have changed the process so the chips will use less power, and they will run a little faster. They might even overclock much better than the current K10 Phenoms.
No one here (I think) has seen benchmarks of the upcoming Deneb compared to Intel's higher end current quads or, for that matter, the upcoming Nehalem.
So we will just have to wait and see.

From what I read, AMD's Istanbul was supposed to be expanded into a 12-core beast to compete with a future Intel 8-core processor.

May 2, 2008 8:45:08 AM

A 5% tweak, a higher overall clock and yes, hopefully a better OC would be nice to see
May 2, 2008 8:48:29 AM

Hey yipsl, why does every thread involving you somehow always get sidetracked into ANOTHER discussion about your upgrade plans? Yes dude, we know you love AMD/ATI and won't contemplate evil Intel, but please, if you make a thread about Deneb, at least stick to the topic!

My thoughts on this:
I'm surprised AMD isn't going to push the TDP envelope, 2.8GHz at 95W means 3GHz+ at 125W should be possible. Unless something else is holding the clockspeeds back, I don't see why AMD isn't going all out. I mean, they can hardly claim they are going the 'energy efficient' route since 95W is hardly 'green' by any stretch, why not just release a faster 125W and not get ridiculed as much by the press and reviewers when it inevitably gets compared to 3GHz+ Nehalems?
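
To put rough numbers on that intuition: dynamic CPU power scales roughly with frequency times voltage squared, so a 2.8GHz/95W part should fit 3GHz inside a 125W envelope even with a small voltage bump. Here's a back-of-the-envelope Python sketch; the voltages are made-up illustrative values, not AMD's actual specs:

def projected_power(p_base, f_base, f_new, v_base, v_new):
    # Dynamic power scales roughly as C * V^2 * f, so scale a known
    # TDP point (p_base at f_base and v_base) to a new frequency/voltage.
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Known point from the article: 2.8 GHz at a 95 W TDP.
print(projected_power(95.0, 2.8, 3.0, 1.25, 1.25))   # same voltage: ~102 W
print(projected_power(95.0, 2.8, 3.0, 1.25, 1.30))   # +0.05 V bump: ~110 W

Both land well under 125W, which is why the absence of a faster 125W part is so puzzling unless yields or leakage are the real limiter.
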
May 2, 2008 8:57:15 AM

Will there be enough for them to bin? That'd be my question as to why they wouldn't do it.
May 2, 2008 9:20:05 AM

JAYDEEJOHN said:
Will there be enough for them to bin? That'd be my question as to why they wouldn't do it.


Well I'd certainly hope so, since this is basically all AMD will have to compete against Nehalem for the next 1.5 - 2 years until Bulldozer hits.

It appears AMD is just conceding the higher end CPU market for now, and is shifting the focus towards the mainstream level, as well as trying to sell the 'platform' instead of just the CPU. In their current financial state, it probably makes the most sense. Trying to stay on the bleeding edge is costly, they simply don't have the resources to battle Intel on all fronts and are prioritising in markets that can generate the most revenue.
May 2, 2008 11:41:17 AM

epsilon84 said:
Hey yipsl, why does every thread involving you somehow always get sidetracked into ANOTHER discussion about your upgrade plans? Yes dude, we know you love AMD/ATI and won't contemplate evil Intel, but please, if you make a thread about Deneb, at least stick to the topic!


Not every, but perhaps too many. Sorry about that. This thread is sort of my disappointment that Deneb won't be 3.0 or 3.2, and won't be 65 watts. 95 watts at 45nm? Is this an SOI issue?

Frankly, I wasn't going to upgrade until Deneb. When people ask for upgrade advice I've said 'wait for Deneb or get an 8750' (which actually outshines the 9850BE for overclocking).

The comments on the motherboards were because apache_lives seems to think, like too many people here, that every PC needs to be a gaming rig. Maybe if there's only one in the household and gamers dwell there, but my experience shows that there are reasons to go IGP, when the IGP is stellar.

Intel's held gaming back with their IGP's. If developers only had to code for ATI and Nvidia IGP's at the low end, then we'd at least see more PC gaming over consoles. When Intel IGP's are the low end, there is no hope for improvement.

Plus, Intel's really lousy at H.264 decoding. Maybe G45 will change that, but I won't hold my breath.

epsilon84 said:

My thoughts on this:
I'm surprised AMD isn't going to push the TDP envelope, 2.8GHz at 95W means 3GHz+ at 125W should be possible. Unless something else is holding the clockspeeds back, I don't see why AMD isn't going all out. I mean, they can hardly claim they are going the 'energy efficient' route since 95W is hardly 'green' by any stretch, why not just release a faster 125W and not get ridiculed as much by the press and reviewers when it inevitably gets compared to 3GHz+ Nehalems?


3.0 seems to be the sweet spot that will please enthusiasts and the mainstream OEM's. They're not getting it at stock. Whether they will overclock as well as the 8750's and some 9850BE's remains to be seen, but not everyone overclocks. I don't.

If they do overclock, then I don't expect much beyond 3.2 out of 45nm with SOI. I guess they aren't lengthening the pipeline and pushing the thermals Prescott-style as rumored.

If AMD wants to go past the OEM market and loyal fan base, they'll need a better CPU and that won't happen for a couple of years. At least ATI makes good chipsets and both IGP and discrete GPU's.
May 2, 2008 11:51:22 AM

I think AMD would be best served if they named their next product 'Quiet Surprise'.
May 2, 2008 2:04:26 PM

yipsl said:
Well, I don't like the premium cost of Intel motherboards, but the main reason I don't like Intel chipsets is that they fall short on both IGP performance and video playback quality.

I usually get a good motherboard with a decent IGP, just in case I have to fall back to it if a card fails and I'm RMA'ing it. Though I play games, I also watch DVDs (will watch Blu-ray soon too), watch unlicensed fansub anime and also do some graphics and video editing.

Intel falls behind both ATI and Nvidia in that regard. I've had compatibility problems with Nvidia chipsets, so I've basically preferred ATI/AMD. My only P4 has an ATI X200 board, and I have 690V, 690G and 780G boards for our Athlon X2 builds.

Yes, I don't think Deneb will be a loser, but it won't be Bulldozer, so I don't see any problems considering a budget 8750 until Bulldozer arrives in two years' time. I am disappointed in not seeing at least 3.0 at stock, but it could be due to SOI.

AMD is only shaky in regards to the Phenom. It's an average processor. AMD chipsets are great, ATI GPUs are quite good, and if we have to use an IGP, we don't have to worry that it can't do the job as far as video playback is concerned. G35 is horrible in that regard.


Wait, what? Since when do you plan to use an IGP for your gaming rig? I understand your loyalty, but dude, Intel's chipsets (yes, even the IGPs) are the best I have ever used. They pack all the same features and allow easier and better OCs and are more stable than nVidia's. nVidia's are only good for SLI really.

Now for an HTPC I understand, but who would use an IGP over a low-mid range GPU, like an HD2600 or 8600, especially since it will help with the encoding and such? And they won't cost much more.

Intel's mobos are not that bad in price. The top tier Intel mobo is roughly the same price as an equivalently equipped AMD AM2+ mobo. But the best value is normally the one step down, such as the P35 was over the X38.

But if you use the IGP just for "in case of" it wouldn't matter which one you get, since you will be getting a new GPU within a week or two.

As for the OP, I think it is SOI. I was thinking it seemed a harder process to work with, but we will see how well they OC compared to the Phenoms. If they are crap at OC, that could be SOI.

Oh, and I like having no pins on the CPU. Less chance for them to get bent. Plus the BGA (ball grid array) gives more of a contact surface for the CPU.
May 2, 2008 11:01:16 PM

They game better than you think. Why buy a 2600 when you don't have to?
May 3, 2008 1:42:08 AM

JAYDEEJOHN said:
They game better than you think.

That depends entirely on your expectations. Personally, I think all IGPs suck for gaming (some worse than others *cough* Intel *cough*). They are useful for many other things, but a 6600GT (a mid range GPU from 2004) still beats the crap out of a 780G, which I think puts things into some perspective about the 'uber' gaming performance of that platform. If you don't mind playing older games, it's fine, but it struggles to run newer games at playable framerates.

Quote:
Why buy a 2600 when you don't have to?

To get playable framerates on newer games? I think that's a pretty good reason. ;) 
http://www.tomshardware.com/reviews/amd-780g-chipset,17...

As I said earlier, it's sufficient for older games though:
http://www.tomshardware.com/reviews/amd-780g-chipset,17...
May 3, 2008 2:00:38 AM

Yeah, the 780G is too memory limited for modern games. When I get mine up and running I'll test it, but my guess is as I OC higher and higher the returns will decrease rapidly because of memory bandwidth. I wish I could have gotten the 790G with the integrated DDR3.

However, for new games I'll use my 2900, so this isn't an issue. The 780G is only for TV and older games like RTW.
May 3, 2008 3:55:15 AM

Oh dear, 640 x 480 minimum details, THE HUMANITY!

If you value your eyesight, please don't play at such settings. ;) 
May 3, 2008 4:05:30 AM

I don't game on my IGP. I do like having an IGP on a motherboard so I can get it booting before I even install a GPU. My 7 year old games on his 780G.

My whole point is that, in each generation, AMD and Nvidia IGP's beat Intel hands down. They're neck and neck while Intel's eating their dust. Just imagine how many people buy Dells, Gateways, HPs and eMachines and hope to play an occasional game on the Intel IGP, only to find that they cannot.

The HTPC proves my point. When you are doing anything more than simply surfing the net or using Word or Excel, you need more than an Intel IGP. That is true whether you game or not.

Perhaps I didn't make my argument clear. I don't like Intel IGP's and if I were forced to build an Intel system for myself, I'd be in a quandary as I prefer to have an IGP on my boards.

As for that turnaround time, Jimmy, I only had to send a GPU back once, and I didn't have an IGP on that i865 chipset board with a P4 Northwood. Back then, we only had one PC in the house and doing without the net, anime and older games was a nuisance.

That's why I made my choice last year to only buy motherboards with decent IGP's along with PCIe x16 slots for decent GPU's. It never hurts to have a backup. One thing Nvidia's doing right that I wish AMD would follow is to have the IGP enabled on motherboards of all prices, because future power saving features rely on an IGP.
May 3, 2008 4:06:57 AM

jimmysmitty said:
Wait, what? Since when do you plan to use an IGP for your gaming rig? I understand your loyalty, but dude, Intel's chipsets (yes, even the IGPs) are the best I have ever used. They pack all the same features and allow easier and better OCs and are more stable than nVidia's. nVidia's are only good for SLI really.


The general consensus on Intel chipsets seems to be three things:

1. They are very stable.
2. The IGPs in the IGP versions are well behind NVIDIA's and ATi's.
3. They frequently are the reason why you will not be able to run a new chip in your board, even though the socket is the same. (The VRM is also to blame in many cases.)

Quote:
Now for an HTPC I understand, but who would use an IGP over a low-mid range GPU, like an HD2600 or 8600, especially since it will help with the encoding and such? And they won't cost much more.


Most people would use an IGP over a midrange GPU like the HD 2600 or GF 8600 because most people do not game. Putting a midrange GPU that throws off 60 watts or so into the becoming-very-common SFF cases that many business desktops and some home desktops ship in would be tough to cool, as well as an unnecessary cost. The most stress that most people would put on a computer is playing back video, and a decent IGP like the Radeon HD 3200 or GF 8200 can handle that very well; the G35, not so well, particularly HD.

Quote:
Intel's mobos are not that bad in price. The top tier Intel mobo is roughly the same price as an equivalently equipped AMD AM2+ mobo. But the best value is normally the one step down, such as the P35 was over the X38.


Intel's chipsets are more expensive. NVIDIA makes the GeForce 7000 series, which are nicer IGPs than the G35. The top 7000-series unit, the 7150, is on boards costing $75-90. AMD's 780G chipset, which is also a much nicer IGP than the G35, is on units costing $70 to $105. Intel G35 boards start at $95 and go up to over $120 (prices from Newegg). The AMD chipset may be less expensive due to the northbridge not having the memory controller onboard, but the GeForce 7150 *does* have the memory controller in the northbridge and still costs less than the G35 boards. I don't think you can look at this and say anything except that Intel charges a premium for its chipsets.

Quote:
But if you use the IGP just for "in case of" it wouldn't matter which one you get, since you will be getting a new GPU within a week or two.


That's true, so you wouldn't want to pay a bunch more for an IGP that may be a little better. Except that you're paying more for the Intel IGP and it's *not* better than the competition.

Quote:
As for the OP, I think it is SOI. I was thinking it seemed a harder process to work with, but we will see how well they OC compared to the Phenoms. If they are crap at OC, that could be SOI.


It could be any number of things, from process maturity to the tweaking of the production for greater yields or lower voltages at the expense of a bunch of highest-bin chips.

Quote:
Oh, and I like having no pins on the CPU. Less chance for them to get bent. Plus the BGA (ball grid array) gives more of a contact surface for the CPU.

First of all, socket 775 is a LAND grid array, not a ball grid array. Otherwise you would be complaining VERY loudly about not being able to upgrade your CPU as it would be soldered to the board. The only Intel desktop chips that I know of that are BGA are the Celeron 200 series soldered to the D201GLY/2 series mini-ITX boards. BGA is pretty much limited to chipsets and mobile/embedded CPUs for the most part.

Secondly, you can bend the contacts in the LGA socket just the same as you can bend the pins in a pin grid array (PGA) chip like most Core 2 Duo mobile CPUs and AMD desktop/laptop CPUs. The pins in an LGA socket are spring-loaded which helps in preventing bending from a force pushing straight down on them (which would cause a kink in a PGA pin) but lateral forces can bend the LGA socket pins too. The key point is that you need to be careful handling all of the different types of processors and ICs, be they PGA, LGA, or even DIP or edge-pin chips.

Third, your contact surface argument makes no sense. Yes, LGA mounting arrangements allow for a higher density of contacts between the CPU package and the motherboard. (BGA arrangements allow for the highest contact density per unit area.) But it wasn't really needed as AMD has managed to successfully put 940 pins under their chips since 2003 and have done just fine with it- Intel putting 775 pins underneath the P4/PD/Core 2 packages didn't require an LGA setup to accomplish. You can see that if you flip over a socket 775 chip as the underneath of the package isn't even completely covered with lands as there is a square in the middle covered with capacitors. A chip that probably did need to go to LGA to get the required amount of contacts would be a socket 1207 Opteron, but those have 432 more lands than socket 775 chips and actually do use the entire inferior surface of the CPU package for lands. The reason that Intel went LGA in the first place was because they thought it would lead to better cooling for the P4 Prescott, but the socket 775 Prescotts ran about 10% hotter than the socket 478 units IIRC.
May 3, 2008 4:21:25 AM

jj463rd said:
The Deneb won't be revolutionary, but AMD's upcoming Denebs will have more cache, similar to what Intel's Core 2 Duos and Quads have (hopefully there will be much greater performance than the current Phenoms with this)


Intel chips need a big cache because the memory latency is much worse than a CPU with an integrated memory controller. I doubt that a big cache size increase would make anywhere near as much difference to an AMD CPU.
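
A toy average-memory-access-time model makes that concrete: AMAT = hit time + miss rate x miss penalty, so cutting the miss rate with a bigger cache saves more when the trip to main memory is slower. The latencies and miss rates below are made-up illustrative figures, not measured Phenom or Core 2 numbers:

def amat(hit_ns, miss_rate, mem_ns):
    # Average memory access time = hit time + miss rate * miss penalty.
    return hit_ns + miss_rate * mem_ns

# Suppose a bigger L3 cuts the effective miss rate from 10% to 6%.
for label, mem_ns in [("integrated memory controller", 60.0),
                      ("FSB-attached memory", 100.0)]:
    before = amat(5.0, 0.10, mem_ns)
    after = amat(5.0, 0.06, mem_ns)
    print(f"{label}: {before:.1f} ns -> {after:.1f} ns")

With the slower memory path the same cache improvement saves 4 ns per access instead of 2.4 ns, which is the gist of why Core 2 leans on its big cache more than a chip with an on-die memory controller would.
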
May 3, 2008 6:44:45 AM

First of all, the 790G is more powerful, and can be OCed as well. So it's getting close to acceptable gaming, and combined with a dirt cheap 3470, it'll outdo any 2600 easily, and you still save money and power.
May 3, 2008 6:45:31 AM

Thanks, MU_Engineer for the LGA details. I guess I'm just old fashioned. I kvetched when I built my first LIF socket 486DLC-40, as I was used to the 386SX's soldered on the motherboard.

I don't like Intel socket T heatsinks. AMD heatsinks are more like what I was used to for years with socket 478, socket 7 and earlier. Just attach on one side and clip down on the other. Then again, I've only built one LGA 775 system. If I hadn't switched to AMD, I'd have gotten more used to it.

Amdfangirl, I've found the X1200 and X1250 IGPs of the 690V and 690G adequate for many games, and the 780G even more so. I usually like to test out the system before installing the GPU. Morrowind, KOTOR, Baldur's Gate 1 and 2, Icewind Dale, Fate, and HOMM 2 through 4 all play well on AMD IGPs. They played well on the Nvidia 405 chipset board, and on the ATI X200. Oblivion actually is playable on a 780G with reduced settings, but not all that much.

Sure, no one will get Crysis to run well, except at low resolutions with hybrid Crossfire, but I'm sure LOTR Online would run okay. WoW is very popular and I'm sure a 780G or Nvidia equivalent would give decent framerates.

I'm all for discrete GPU's, but when I want an IGP on my board (and I do), I want it to perform well. We keep hearing how G45 will change things. We keep hearing about Larrabee too. I won't hold my breath.

How I wish that AMD had not bought ATI and I had a choice of a 780G board with a Wolfie! I'd still add the 3870x2, but it would be great having the best in each category: CPU, chipset and GPU.
May 3, 2008 7:43:48 AM

That'd come out something like nVelidia, or nvilid for short.
May 3, 2008 8:24:10 AM

JAYDEEJOHN said:
First of all, the 790G is more powerful, and can be OCed as well. So it's getting close to acceptable gaming, and combined with a dirt cheap 3470, it'll outdo any 2600 easily, and you still save money and power.


790GX is about 20% faster than 780G. Overclocking is irrelevant since you can overclock a 2600XT as well.

Overall, a Radeon 2600 XT is about 50% faster than a 780G/HD3450 CF setup. 790GX will bring it closer, but it still won't match it.

I really question the worth of Hybrid CF when you can get a 512MB HD 3650 for $40 after MIR at Newegg. An HD3450 card already costs $25, plus there is the added premium of a 780G mobo, and you STILL can't get anywhere near the performance of a $40 discrete GPU. Why bother?
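
Running the numbers from this post as a quick Python sketch (the 1.5x figure is the 2600 XT's from above; treating the $40 HD 3650 as roughly that class, and the $20 board premium, are my assumptions):

options = {
    # name: (performance relative to 780G + HD 3450 Hybrid CF, extra cost in $)
    "780G + HD 3450 Hybrid CF": (1.0, 25 + 20),  # $25 card + assumed $20 mobo premium
    "HD 3650 512MB discrete":   (1.5, 40),       # assumed roughly 2600 XT class
}
for name, (perf, cost) in options.items():
    print(f"{name}: {perf:.1f}x performance for ${cost}"
          f" -> {perf / cost * 100:.1f} per $100")

Either way you slice it, the discrete card wins on both absolute performance and performance per dollar.
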
May 3, 2008 5:59:55 PM

yipsl said:
Well, I don't like the premium cost of Intel motherboards, but the main reason I don't like Intel chipsets is that they fall short on both IGP performance and video playback quality.

I usually get a good motherboard with a decent IGP, just in case I have to fall back to it if a card fails and I'm RMA'ing it. Though I play games, I also watch DVDs (will watch Blu-ray soon too), watch unlicensed fansub anime and also do some graphics and video editing.

Intel falls behind both ATI and Nvidia in that regard. I've had compatibility problems with Nvidia chipsets, so I've basically preferred ATI/AMD. My only P4 has an ATI X200 board, and I have 690V, 690G and 780G boards for our Athlon X2 builds.

Yes, I don't think Deneb will be a loser, but it won't be Bulldozer, so I don't see any problems considering a budget 8750 until Bulldozer arrives in two years' time. I am disappointed in not seeing at least 3.0 at stock, but it could be due to SOI.

AMD is only shaky in regards to the Phenom. It's an average processor. AMD chipsets are great, ATI GPUs are quite good, and if we have to use an IGP, we don't have to worry that it can't do the job as far as video playback is concerned. G35 is horrible in that regard.


This is a fairly recent argument to make. It has been no more than two years that ATI/AMD has had a modern IGP chipset. You do not like Nvidia chipsets? I shudder to think of the VIA 266 you must have been running in the Athlon XP days.

May 3, 2008 6:16:36 PM

MU_Engineer said:

NVIDIA makes the GeForce 7000 series, which are nicer IGPs than the G35.

While everything else you said about Intel chipsets and their IGPs is true, this one is not (not that I can fault you, as Intel seems to be trying hard to confuse the hell out of the consumer with all of these very similar sounding IGP and chipset names). From all the tests I have seen, the GeForce 7150 IGP for Intel processors trades blows with the GMA 3100 found on the G31, G33, Q33, and Q35 chipsets, an IGP which shares its lineage with the GMA 900 and 950 (and to this day still lacks hardware TnL). The X3500, as found on the G35, is the successor to the X3000 found on the G965 chipsets, being a much more advanced, much higher performing IGP than the GMA 3100 that the GF 7150 manages to beat half the time. The biggest thing holding back the GF 7150 IGP is the single channel memory controller (it wouldn't surprise me if Intel had a hand in that). However, the best aspect of the GF 7000 series IGPs for Intel processors is the much better driver support and compatibility with graphics intensive programs.

It should be noted, though, that the GF 7000 series IGP for Intel processors COMPLETELY LACKS PUREVIDEO. From what I'm able to gather, that even means a lack of MPEG-2 (DVD) hardware acceleration, which is unfathomable in today's market. I have one of these IGPs, and when I can I'll get around to installing an up to date DVD player program to see if hardware acceleration is recognized through the DirectX layer. You can completely forget about any hardware acceleration for HD content. That makes the GeForce 7000 series IGP for Intel processors a very poor multimedia IGP, especially so for HTPC use. It's unfortunate to say the least.

Here's a quick link to a review article pitting the MCP73 (the nvidia IGP) up against the Intel G33 IGP. It's probably not the best of reviews, but it gets the point across, and it's not in Chinese:
http://www.vr-zone.com/print/5333
May 3, 2008 6:55:59 PM

epsilon84 said:
790GX is about 20% faster than 780G. Overclocking is irrelevant since you can overclock a 2600XT as well.

Overall, a Radeon 2600 XT is about 50% faster than a 780G/HD3450 CF setup. 790GX will bring it closer, but it still won't match it.

I really question the worth of Hybrid CF when you can get a 512MB HD 3650 for $40 after MIR at Newegg. An HD3450 card already costs $25, plus there is the added premium of a 780G mobo, and you STILL can't get anywhere near the performance of a $40 discrete GPU. Why bother?

With a 3450, you're looking at a 70% performance boost, which is better than a 2600. So it is worth it.
May 4, 2008 2:38:42 AM

JAYDEEJOHN said:
With a 3450, you're looking at a 70% performance boost, which is better than a 2600. So it is worth it.


No, it's not better than a 2600XT, not even close.

http://www.tomshardware.com/reviews/amd-780g-chipset,17...


Now, will you please let it go about how great 780G/H-CF is for gaming? It's not, it sucks, a $40 GPU can easily put it to shame.
May 4, 2008 3:18:41 AM

True.

Tho it looks like a super office machine.

Unfortunately here in Australia no Govt departments are allowed to buy AMD products ... the tenders all specify Intel.

That is a monopoly if I ever saw one ... disgusting eh?

Getting back to Deneb ... on topic ... for a change ... how is a bigger cache going to fix fundamental issues like the IPC ??

The core needs some work ... and even then the Core2 is a wider issue design.

Did I just ... Oh ... never mind ... <sob>
May 6, 2008 1:18:21 AM

Reynod said:
True.

Tho it looks like a super office machine.

Unfortunately here in Australia no Govt departments are allowed to buy AMD products ... the tenders all specify Intel.

That is a monopoly if I ever saw one ... disgusting eh?

Getting back to Deneb ... on topic ... for a change ... how is a bigger cache going to fix fundamental issues like the IPC ??

The core needs some work ... and even then the Core2 is a wider issue design.

Did I just ... Oh ... never mind ... <sob>


That should be challenged in court. The Intel monopoly might have begun because OEM's didn't carry many AMD boxes back during the disreputable OEM Rebate period (challenged by regulators on 3 continents), but AMD is relying on OEM's nowadays. So, the after purchase support for government and business needs is there.

At any rate, I don't think AMD will have a winner until Bulldozer. Deneb will solve some issues, but not bring the improvements we'd hoped for. Since Intel got past Prescott and Smithfield intact, I expect AMD to get past the Phenom.

What that leaves us AMD fans with is a dilemma. Do we stay with aging Athlon X2's, get the cheapest Phenom out there and wait for Bulldozer, wait for Deneb, or just go Intel in the interim?

I'm making up my mind by not doing anything. I'm tired of trying to figure out if an AMD CPU upgrade is worth it.


May 6, 2008 12:34:17 PM

joefriday said:
This is a fairly recent argument to make. It has been no more than two years that ATI/AMD has had a modern IGP chipset. You do not like Nvidia chipsets? I shudder to think of the VIA 266 you must have been running in the Athlon XP days.


You would do well to shudder. I had a VIA chipset in an old Athlon XP system; it was an unstable POS... then after 30 minutes of 'warming up' time at the desktop after starting, resetting because Windows had frozen, then restarting again, yes, then it decided to work.
May 6, 2008 12:39:43 PM

yipsl said:
Tom's has this to say:

Quote:

AMD is planning to launch two 45 nm Phenom X4 CPUs (Deneb core) in 2008 with core frequencies of 2.5-2.8 GHz and 2.4-2.7 GHz, with both adopting 6 MB of L3 cache and having a TDP rating of 95 W, according to sources at motherboard makers. AMD will announce the final order date for its 125 watt Phenom X4 9750 by the end of the second quarter and the CPU will be replaced by a 95 watt version - which is available this quarter - while the 95 watt Phenom X4 9850 will appear before January of 2009.


http://www.tomshardware.com/news/amd-cpu-phenom,5248.ht...

As a generally loyal AMD fan, who built 3 Athlon X2 systems this past year for home use (and who is still debating getting an 8750 as a holdover), I have to say I'm mildly disappointed that we won't see 3.0 and 3.2 Denebs on launch, as the rumor mill claimed.

Also, I can't see AMD keeping the B3 9750 and 9850 around in 95 watt versions if Deneb's going to have much higher performance. If Deneb's die shrink with SOI can't change things, then I guess we'll all have to hope AMD's marketing for Bulldozer finally pans out in two years.

AMD: the budget CPU maker of choice for those of us so old school we still like our pins on the CPU and not on the motherboard :lol:  !

Seriously, I love ATI/AMD chipsets and ATI GPUs, but AMD's fallen into the same market they had with the K6-2. It's as if the Athlon XP, the Athlon 64 and the Athlon X2 never happened.

This makes me think I should go triple core 8750 and overclock it to 2.8 on a new Gigabyte board rather than wait for Deneb. The way things are going, an 8750 at 95 watts just might be viable against AMD's best until Bulldozer changes things.

Yes, I know Intel will have better, but I don't like Intel chipsets any more than I like Nvidia's (which is not at all).


Didn't AMD say they would be using processes as they become available, and refining them on a rolling basis? Initially the 45nms won't be coming at much higher clocks, but over time faster ones will be introduced on improved steppings. That's what I read somewhere.
May 6, 2008 3:30:16 PM

Quote:
Yup, mine stopped working for half a year...


I personally haven't had problems with VIA chipsets, but then again the only ones I used were the MVP4 on a Super 7 board with a K6-2/500 and the KT266A with an Athlon XP 2400+. The "A" revisions of the VIA chipsets weren't bad, but yeah, the guys with the first-version ones had a lot of trouble.
May 6, 2008 7:24:13 PM

How did a thread about Deneb devolve into which low-end card or integrated card performs "better"? All are horrible in any modern game.

I've had a VIA chipset before with an AMD Duron. Great times.
May 7, 2008 12:23:59 AM

MU_Engineer said:
I personally haven't had problems with VIA chipsets, but then again the only ones I used were the MVP4 on a Super 7 board with a K6-2/500 and the KT266A with an Athlon XP 2400+. The "A" revisions of the VIA chipsets weren't bad, but yeah, the guys with the first-version ones had a lot of trouble.

My family's first PC, a Compaq Deskpro 4000N, used a VIA chipset for socket 7 (not an MVP3/4). It was pretty stable, as far as things went in the Windows 95 era. The VIA P4M800 chipset I used when I built my first computer was also decent. The IGP was terrible, but the actual chipset functions were okay. Not an 865/875 in terms of performance or stability, but still admirable nonetheless. I wouldn't have much reservation about using a modern VIA chipset in a new low end office-type build, as they seem stable enough and perform well in undemanding tasks. Even SiS is okay, although the only SiS experience I've had, a 661FX found in my Foxconn Ebot, left me with the impression that SiS chipsets are little furnaces. Nice and stable, but hot running and abysmal at overclocking. I could be wrong.
May 7, 2008 1:21:38 AM

joefriday said:
This is a fairly recent argument to make. It has been no more than two years that ATI/AMD has had a modern IGP chipset. You do not like Nvidia chipsets? I shudder to think of the VIA 266 you must have been running in the Athlon XP days.


Never had an Athlon XP. I had two Northwoods.

I became an AMD fanboy when I built my first Athlon X2 AM2 system. Now I have two AM2 systems and one AM2+ system: 3 Athlon X2s and one P4 socket T.

Never bought a Via chipset board. I had one given to me, along with a couple of sticks of DDR333, and I moved the 2.4 Northwood from a Soyo i845 board, because the Soyo board only supported SDRAM.

Guess what? The chipset was so bad, I moved the CPU back, SDRAM and all. It just wouldn't play Morrowind. Not at all, and that was the game I was playing most at that time.

epsilon84 said:
No, it's not better than a 2600XT, not even close.

Now, will you please let it go about how great 780G/H-CF is for gaming? It's not, it sucks, a $40 GPU can easily put it to shame.


I wonder how a 3470 would do compared to a 2600? It's okay for light gaming. As far as that goes, it works okay for MMORPGs like WoW and LOTR Online, though I don't think it can handle the DX10 patch. My friend's 8600 crashes every once in a while when he has it enabled, but doesn't in DX9 mode.

The way validation and anti-piracy measures are making single player PC games a nuisance (i.e. HOMM 5 does not like one of our PCs' DVD-RW drives because of SecuROM; Mass Effect and Spore will be worse, with only 3 validations), I think these "light" gaming IGPs in hybrid Crossfire or SLI will handle the most popular MMORPGs.

Right now, I'd rather go back and play Daggerfall under DOSBox than play a StarForce, SecuROM or heavy validation single player game. As far as it goes, I spend most of my time with LOTR Online, and MMORPGs don't need the kind of GPU I have right now.

The death of PC gaming might mean the death of high end GPUs. Maybe Intel, AMD and Nvidia should work with the game companies to perfect a scheme that doesn't harm legal customers who can't get stupid things like SecuROM to recognize their DVDs, or even worse dreck like StarForce that wreaks havoc if there's a legal burning program, even an old version that comes free with a GPU from ATI or Nvidia.

There's got to be a better way, or we'll all just be playing MMORPGs.
May 7, 2008 11:55:34 AM

Very interesting story at Fudzilla, amdfangirl. Thanks for the link.
May 8, 2008 6:11:49 AM



Well, what's to believe or not believe about a lawsuit? It's a matter for the trier of fact. We'll see how that goes.

As for rev. D, I'm all for it. People pointed out here that Intel ditched SOI for good reason and that 45nm won't help AMD all that much until they ditch SOI too.

We've recently heard that Deneb won't clock higher than 2.5-2.8 in 4th quarter 2008, and that's probably due to SOI. Whether the new process that follows will allow clocks above 3.0 remains to be seen, but I'm hopeful.

Considering SOI Deneb is reported as 95 watts and they may not get many chips to 2.8, we may just be looking at no major AMD improvement before Bulldozer.

An 8750, which does overclock to 2.8, looks better all the time. Otherwise, AMD upgrade choices just tend to put one to sleep. :sleep: 


May 9, 2008 2:17:35 AM

Yep, if they ever get 6 cores on the desktop, then I'll go for it. I just hope the game developers go for it too! They need the cash in the server area, as Nehalem should erase their memory advantage there.

I did make an error in one post above. I'd forgotten that the K6-2 450 I had must have had a VIA chipset. That long ago, I don't remember. I didn't know much about chipsets back then and just bought decent name brand motherboards. I didn't start paying attention to chipset differences till the P4 days.

Let's all spend our stimulus checks buying AMD and ATI kit!
May 9, 2008 3:51:56 PM

I'm sick of the stories.

When something decent actually arrives tell me ... after ... and when it works better.

I am over it ...

I am just going to soup up all my X2 rigs to the fastest AM2 X2 silicon I can find ... the reliable 90nm 2 x 1MB cache dual cores around 3 GHz (6000+) will do ... or the 6400+ ... hopefully they drop a bit more in price.

The best they have.

Kinda like Detroit Iron ...

Don't talk to me about top end or fuel ... I just want a nice ride and something that starts first time and runs all day.


May 9, 2008 7:16:32 PM

Quote:


AMD Athlon X2 BE-2400 @1.7GHZ @ 0.9V + AMD RS690G Chipset + X1250 graphics


Wow, it looks like your PC is so low-watt green that it puts electricity back in the wall socket, filters CO2 from the air and stores it underground, while spitting out tiny little puppies to cheer up all the kids in the neighborhood... :wahoo: 

Do you have a matching power usage for your screen?
May 10, 2008 5:28:02 AM

Reynod said:
I'm sick of the stories.

When something decent actually arrives tell me ... after ... and when it works better.

I am over it ...

I just want a nice ride and something that starts first time and runs all day.


Well, I don't mind the stories, but then again, I'm not on a low sodium diet. I can handle a shaker of salt every time I click on The Inquirer. FUD with salt is the breakfast of geeks. Why do you think we're all so fat? :) 

Instead of spending my stimulus money on an 8750 and a 780G board with DDR2 1066, I just decided to spend it on two lifetime subscriptions to LOTR Online for $199 each (the offer's open till June 1 in honor of the one year anniversary). So, rather than $30 a month for years so my wife and I can adventure together, we can fight the minions of Sauron on the cheap.

For that, I don't even need to upgrade my CPU or GPU. My GPU's overkill and a 65 watt Windsor's just fine. Sorry AMD, you've given this fanboy no reason to jump on the Phenom bandwagon after months of defending the CPU as the budget choice.

All I want is something that works the way it was intended, and though the 4850's and 4870's will work as intended, nothing in the quad or triple core lineup from AMD right now works as intended, or at least as promised, even though they work basically okay compared to midrange Athlon X2's.

May 10, 2008 2:28:03 PM

yeah ... 42 Kg for a chick is kinda on the heavy side .... LOL.

Seriously ... step AWAY from the Girlie magazines and the Coke ads ... just don't look at the TV ok !!!

42 kg ... <mumble> ...
May 10, 2008 2:46:31 PM

Well the whole image thing is just sad.

May 13, 2008 10:18:13 AM

I call my wife my Botticelli babe, meaning she's as beautiful as if she came from La Primavera:

http://www.ibiblio.org/wm/paint/auth/botticelli/bottice...

They knew beauty back then, they did!

Too thin began with Twiggy in the '60's, but she'd be fat compared to some of the supermodels today, who look like they're one meal away from being involuntarily committed to an anorexia clinic.

I don't know kilograms in my head; this is America, where we have most of a continent and keep our own archaic measurements just fine. But I weigh in at 190 and need to lose 20-30 lbs, and my wife weighs in at 160 even, but doesn't need to lose more than 10 lbs. She's taller than me by a bit too.

May 13, 2008 11:39:16 AM

yipsl you get my vote ... tho I quite like the Greek statues ... what was with the missing arms .... anyone know ?

Perhaps that was the Celeron Artist of the day ?

heh heh
May 14, 2008 4:09:04 AM

Quote:
^ And 42kg is considered fat now?


My copy of the latest CDC pediatrics growth charts say 42 kg is very normal for a 13-year-old girl- it's just a little under the 50th percentile. That's an acceptable weight as long as you're taller than about 150 cm and ideal if you're between 155 and 160 cm.

EDIT: For yipsl, 42 kg is 92 pounds. 150 cm is 4' 11", 155 cm is 5' 1", and 160 cm is 5' 3". Your 190 lb weight by comparison is 86 kg and your 160-pound wife weighs 73 kg. There are 2.54 centimeters per inch and 0.454 kg per pound (or 2.2 lb per kg.) If you go to the doctor's office, you can guarantee that your measurements are recorded in metric as just about everything in medicine is now metric. I suppose that is because BMI is kg/m^2 and drug dosages, particularly IV drugs are all mg/kg. Your foods are measured in metric as the Calorie and grams are metric- SAE/English would be BTU and grains (1/7000 pound) respectively. Even the growth charts are metric, although there is an SAE/English scale on them as well.
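
For anyone who wants to check the arithmetic, here it is as a few lines of Python (plain unit conversion plus the BMI formula; pairing 42 kg with 155 cm is just the example height from above):

def lb_to_kg(lb):
    # 0.454 kg per pound
    return lb * 0.454

def bmi(kg, cm):
    # BMI = weight in kg divided by height in meters, squared
    return kg / (cm / 100.0) ** 2

print(f"42 kg  = {42 / 0.454:.0f} lb")                 # ~92 lb
print(f"190 lb = {lb_to_kg(190):.0f} kg")              # ~86 kg
print(f"160 lb = {lb_to_kg(160):.0f} kg")              # ~73 kg
print(f"BMI of 42 kg at 155 cm: {bmi(42, 155):.1f}")   # ~17.5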

And about the women... I find healthy women attractive. I like ones that are not overly skinny nor overweight. I'm particularly partial to the athletic type, as they're more prone to go with me to the gym and the trails, and I like to go to the gym and the trails. It's nice to have somebody to talk to there, as well as getting spotted on the bench press. The staff will oblige grudgingly with the latter, but they aren't real prone to help if you are not one of the frat boys trying to bench your absolute max, since you might do more than one rep and distract them from their job of talking on their cell phone. It's also funny to see big guys who are very out of shape coming in and lifting less than a girl that's a foot shorter and an easy 100-120 pounds lighter. I never say anything to them, as it's great they're coming in to lift; maybe they'll stick with it, as it's good for them, but it's still a bit funny.

Quote:
I actually could do well enough with a Celeron and GMA 950 graphics for the kind of stuff I do all the time with my computer (which is web surfing and tablet drawing). No point in upgrading yet. In fact I just have a way overpowered computer...........meh.........


The 945G/GM aka GMA 950 is a very poor IGP. I should know as my laptop has the 945GM chipset. It works passably with strict 2D work and struggles with any sort of 3D load, even something as small as a composited desktop. It also does not accelerate video playback one iota, meaning that everything is done via CPU. The system with its 1.06 GHz C2D ULV can't even play 640x480 H.264 video smoothly- it has to be MPEG-2 or Xvid to play smoothly, and then it takes up 80% of the CPU cycles to play. My desktop's X2 4200+ and x1900 play back the same video with 15% CPU usage. I tried to play a relatively simple FPS game at 640x480 with all settings on low and got 7-14 FPS...terrible. My 6-year-old 1.8 P4-M laptop with a Radeon M9000 (RV250LF) and 1/6 the RAM could handle the game at 1280x1024 and good quality and get about 40 FPS. You were very wise to get the RS690 as it's a ton better than the GMA 950, a ton better.