
Penryn pull-in for real after all?

February 21, 2007 5:53:08 PM

50W quad-core? Crikey. 8O

I wish they'd be clearer about how they rate their wattage, though.
February 21, 2007 6:02:39 PM

Caneland....16-core mobo....cray-zee
February 21, 2007 6:22:56 PM

I know, I know, I know, I know, I know...

IT'S THE INQ!

But...

Clovertown LV within 6 weeks??? Caneland by Oct. 1??? Can this be anywhere near accurate? If so this is a freakin' Blitzkrieg by Intel!
February 21, 2007 6:46:23 PM



Whether AMD lives or dies remains to be seen, but at the very least we can thank them for stealing enough bananas from the 800 lb. gorilla to make the gorilla very, very, very mad.

The other rumor starting to bubble up is Intel moving to half-speed multipliers on Core 2 Duo with the 1333 FSB, i.e. 167 MHz increments instead of 333 MHz.

This would project to the following lineup:

3.33 GHz Extreme Edition
3.0 GHz upper performance
2.83 GHz performance
2.67 GHz mainstream
2.5 GHz mainstream
2.33 GHz entry-level dual core

Imagine - a 2.33 GHz Core 2 Duo with 4 MB of cache as an "entry level" $163 part :) 

Scary, but in a nice way :) 
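
For what it's worth, here's a quick back-of-the-envelope sketch of where those numbers would come from, assuming a ~333 MHz base clock for the 1333 FSB and half-step multipliers (the multiplier values are my own guesses, not anything confirmed):

Code:
# Rumored lineup sketch: half-step multipliers on a ~333 MHz base clock
# (1333 MT/s quad-pumped FSB) give roughly 167 MHz frequency increments.
# The multipliers below are hypothetical, chosen to reproduce the list above.
BASE_CLOCK_MHZ = 1333 / 4  # ~333.3 MHz

for mult in (7.0, 7.5, 8.0, 8.5, 9.0, 10.0):
    print(f"x{mult:<4} -> {mult * BASE_CLOCK_MHZ / 1000:.2f} GHz")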
February 21, 2007 7:04:50 PM

If I can get my greedy mitts onto this by Autumn of this year, I wouldn't care if AMD got torn to shreds by the 800 lb. gorilla.



I WANT IT!!!!!!!!!!!!!
February 21, 2007 7:28:10 PM

Intel doesn't seem to be playing around.. Kudos to AMD for giving them a solid kick in the ass. Here's to Barcelona, Kuma, and Agena... hopefully they can pull something off. I know Wusy is worried..

(o__O)
February 21, 2007 7:29:25 PM

You'll never have it if AMD gets torn to shreds because if that happens, they will put such a price tag on it that you'd be tempted to sell your house to get it :lol: 
February 21, 2007 7:44:12 PM

now this could be fun... once AMD/ATI get back on their feet.
February 21, 2007 8:21:39 PM

Geez, Intel is really starting to dominate the CPU market. I think AMD had better get a move on with Barcelona, otherwise they're going to fall even further behind.
February 21, 2007 9:07:21 PM

I seriously agree that if they don't start getting their act together and getting some fresh new hardware out, they are going to be in serious trouble. The whole future of their CPU business, or even the survival of the company, rides on this Barcelona chip.

I think we can recommend AMD chips at the low end, because AMD's mid-range CPUs are priced at low-end levels and compete with Intel's low-end E43xx parts.
February 21, 2007 9:29:27 PM

Well, in the end, I think they know what they're doing. Don't forget that a depression phase is something totally natural in business (Intel just went through one with Netburst, but its deep roots helped it get over that pretty well). Plus they have made a huge investment in merging with ATI.
The facts alone that AMD has strong support from IBM and an escalating partnership with Dell are guarantees of a healthy future. The whole PC industry has realized AMD is a MUST for its own health; the future for AMD is tough but not at all dark, I'd say.
February 21, 2007 9:48:48 PM

...guess it might be too late for K8. S939 is now out of production; with AM2 supporting K10, and K10 almost in sight, it would be a step backwards to use eDRAM on K8, because, whatever the gap from Core 2 becomes, if the numbers are what AMD promised, K8 will still be too far behind K10, too low-performing to throw money at.
February 21, 2007 10:38:17 PM

I'm willing to adopt a pro-AMD bias as long as they're competitive. That means I would pick AMD for low end or "mainstream" machines and AMD GPUs with Intel CPUs for high-end desktops and notebooks (until K10 becomes widely available). AMD needs to allow Crossfire on all platforms, including Bensley 2S workstations like the Mac Pro. They already have a (rumored) deal with Apple for R600 in Crossfire for the next Mac Pro refresh. And bring out those external GPUs ASAP. Take on Graphzilla first and then go after Chipzilla once K10 is fully ramped.
February 21, 2007 11:58:28 PM

Quote:
They have time, they should use it to step a bit more.


In the past didn't Intel wait to ramp up a new process until it was fairly mature and thus offered at least the performance of the prior generation? I realize there is heightened competition at this point, but it is not as if Intel's transition to 65nm wasn't under a great deal of stress, with Netburst underperforming K8, and still they didn't release newer-process CPUs at lower bins than before.
February 22, 2007 12:12:25 AM

You are going to tell Intel to halt advancement on CPU's so AMD can catch up? Baka denshi, think please.
February 22, 2007 12:17:01 AM

AMD could have kept K8 prices reasonable when they had K8 on top to capture as much market share as possible. But no .... they had to screw us for every last penny. Now they are paying the price.

I paid £750 each for Opteron 275s in Jan 2006 ... they are £150 now.
February 22, 2007 12:19:30 AM

Quote:
AMD could have kept K8 prices reasonable when they had K8 on top to capture as much market share as possible. But no .... they had to screw us for every last penny. Now they are paying the price.

I paid £750 each for Opteron 275s in Jan 2006 ... they are £150 now.


The major problem for AMD was capacity. I don't think you realize this: AMD only started building up CPU inventory in the last quarter.
February 22, 2007 12:20:19 AM

The major ramp for 45nm products will be in 2008; I don't think the situation changes much. :wink:
February 22, 2007 12:26:15 AM

Trust me, they'll be on level playing fields soon enough.
February 22, 2007 12:28:02 AM

What exactly does Intel interpret 50W TDP as? Is that normal usage or max usage? Because if it is normal usage, then 50W is not so great. :? If this chip comes out as the article states, I bet it is just an underclocked Core 2 Quad -- nothing really exciting.
February 22, 2007 12:33:04 AM

Quote:
What exactly does Intel interpret 50W TDP as? Is that normal usage or max usage? Because if it is normal usage, then 50W is not so great. :? If this chip comes out as the article states, I bet it is just an underclocked Core 2 Quad -- nothing really exciting.

That's an excellent question; Intel doesn't list it the same as AMD, or so I have read.
I have read that too... I just can't remember.
February 22, 2007 12:38:27 AM

So guys, while we're on this topic, I just want to know: is AMD in the process of designing a new architecture?
February 22, 2007 12:42:26 AM

Quote:
That's what Barcelona is, the name of the core of AMD's new arch, K8L.


Ohhh, I thought Barcelona was just a K8 arch upgrade?
Because I don't think Barcelona is a whole new arch?
February 22, 2007 12:59:52 AM

lol, yeah, I've definitely heard of K10, but never knew anything about it or whether it was actually being designed at the moment.

I know this has nothing to do with CPUs, but when do you guys reckon flash hard drives will go into mass production and reach the cost of today's magnetic hard drives?

Because I've noticed the only thing that really slows down a computer is the hard drive.
February 22, 2007 1:02:29 AM

Quote:
I have read that too... I just can't remember.
Best if we don't consider either company's TDP claims, and simply wait for benches.

I would like to see both companies advertise the true "maximum" TDP and "normal" TDP.

Then we can use the benchies to see who is telling the truth. :wink:
February 22, 2007 2:04:16 AM

Quote:
What exactly does Intel interpret 50W TDP as? Is that normal usage or max usage? Because if it is normal usage, then 50W is not so great. :? If this chip comes out as the article states, I bet it is just an underclocked Core 2 Quad -- nothing really exciting.


TDP from both companies refers to necessary cooling at maximum load. As long as you run at stock voltage and frequency on a reference motherboard, any software that you throw at the chip shouldn't ever cause the chip to exceed power consumption defined by TDP. What makes Intel and AMD TDPs incomparable may have to do with different platform designs. I'm aware that on Intel processors the voltage supplied to the core purposely "droops" slightly as the core tries to draw more power, whereas I don't see the same happening on AMD CPUs. TDP of course does not apply if you run at voltages and frequencies out of specification.

LV/ULV parts from Intel have traditionally just been underclocked devices running at lower voltages. Peak power consumption is roughly proportional to core frequency and to the square of voltage. A higher-binned part can attain a higher frequency at a given voltage, thus such a part could reach a given frequency with less voltage, at considerable power savings relative to a lower-binned part.
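
To make that concrete, here's a rough sketch of how the frequency and voltage scaling interact. The reference point and voltages are my own placeholders, not Intel's binning data:

Code:
# Rough illustration of the dynamic-power rule of thumb P ~ f * V^2.
# Reference point and voltages are made-up placeholders, not real bin data.
def relative_power(freq_ghz, vcore, ref_freq=2.66, ref_v=1.30):
    """Power relative to a hypothetical 2.66 GHz part running at 1.30 V."""
    return (freq_ghz / ref_freq) * (vcore / ref_v) ** 2

# A higher-binned die holding 2.0 GHz at 1.10 V vs. a lower bin needing 1.25 V:
print(f"lower bin at 2.0 GHz, 1.25 V: {relative_power(2.0, 1.25):.2f}x reference")
print(f"higher bin at 2.0 GHz, 1.10 V: {relative_power(2.0, 1.10):.2f}x reference")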
February 22, 2007 3:48:44 AM

Quote:
You'll never have it if AMD gets torn to shreds because if that happens, they will put such a price tag on it that you'd be tempted to sell your house to get it :lol: 


Nah. There's plenty of computer retailers around my house. I'll just break in and steal one.

Yes, I am well aware that the worst thing that could happen to the PC biz is if either part of the duopoly were to go POOF. Whoever would be left would put the screws on the consumer so bad our eyes would pop out. But I am just simply not impressed with almost anything AMD has done in the past few quarters. I mean, Vista was delayed and delayed and delayed... What if Vista had come out on time? Would we still be sitting here waiting for a DX10 card from them? The 65nm shrinks had all the overwhelming market impact of a wet squib. You could jump into a cold pool and get a better shrink than they did. And Hector's new mantra of performance/watt is a crock! It's obviously intended for the enterprise IT purchasing directors' ears and it's probably 15th or 20th on their list of requirements. In the vast majority of cases of common non-24/7-crunching applications (and we're still arguing that with Baron in another thread) the energy savings are so tiny as to be insignificant. Even at the enterprise level!

Quote:
Well cappin, I have sad news: the reason for the delay is that I spilled my coffee on the documentation and E samples, and in the process of trying to clean it up and power down the PC before it fried, I mistakenly grabbed acetone instead of water to clean up the coffee, and when the motherboard sparked I jumped because it was loud, and I spilled acetone everywhere and the fab went up in flames :oops:  :oops: 

I am really really really embarrassed. :oops:  Dielectrics really are pretty when they burn; the green fire from the copper connects is cool too. The whole fab burning was like a rainbow fire. The only bad part was when the etching chemicals and metal deposition chemicals went. That was the weirdest cloud of smoke in the history of man, it was dissolving trees as it passed through them.

Anyway, this is just as valid a story as any from the Inquirer.


A couple of questions.

1) Have you been posting on AMZ lately? :wink:

2) How much did the competition pay you to burn the fab?

3) Unfortunately (I'm just as big a skeptic about anything the Inq writes) there has been confirmation elsewhere.

4) Can you imagine what kind of ecodisaster a fab burn would really be? There's chemicals in there that make nerve gas look tasty!

Quote:

I agree, if this is true, what's Intel trying to do, destroy AMD? Now that just isn't fair business. When AMD had the chance, they probably could have done a lot more damage than just having lower-priced CPUs that outperformed the Intel ones and gaining tons of market share, but destroying the competition is just disgusting to me. It's the same reason why we're all stuck with only three major OSes, and Microsoft has all the software written for it even though it has the most bugs.


If a company has a technology that has an edge on the competition and introduces it onto the market, even at an accelerated schedule, that is far from being unfair business. It's smart business. It's how companies like Sony built up their global market presence in the 70s and 80s: Quality and technology that put the competition to shame.

Was it unfair way back when AMD was whipping Intel? (Edited to suit sensibilities) :wink:
February 22, 2007 4:08:36 AM

Quote:
Was it unfair way back .... ?


A little bit too colorful .... deliverance was not one of my favorite movies anyway. :) 

Duly Edited.

Saw it when I was a kid. Had nightmares about it for weeks. :cry: 

Your discourse on TDP brings up the point that I've been bugging Baron about on the other thread.

Let's take a hypothetical situation:

I'm an Enterprise IT Purchasing Manager. I've got to buy a few zillion PCs to replace the ones I have now. These PCs are used for all sorts of different apps, but they are relatively common office type apps. No folding or 24/7 number crunching.

Performance/watt has to be a concern, but in real time, bottom-line situations, what is the difference between a, say, 65W and 45W CPU in how fast it makes the electric meter spin? Baron maintains 20W is the difference. I cannot possibly believe that, because by my calculation then the 45W CPU would be putting electricity back into the grid at idle!!
February 22, 2007 4:40:12 AM

Quote:
Was it unfair way back .... ?


A little bit too colorful .... deliverance was not one of my favorite movies anyway. :) 

Duly Edited.

Saw it when I was a kid. Had nightmares about it for weeks. :cry: 

Your discourse on TDP brings up the point that I've been bugging Baron about on the other thread.

Let's take a hypothetical situation:

I'm an Enterprise IT Purchasing Manager. I've got to buy a few zillion PCs to replace the ones I have now. These PCs are used for all sorts of different apps, but they are relatively common office type apps. No folding or 24/7 number crunching.

Performance/watt has to be a concern, but in real time, bottom-line situations, what is the difference between a, say, 65W and 45W CPU in how fast it makes the electric meter spin? Baron maintains 20W is the difference. I cannot possibly believe that, because by my calculation then the 45W CPU would be putting electricity back into the grid at idle!!

Well then I'd buy the 45W processors.

Wouldn't you?

:D 

In all seriousness, I agree with you. Depending on how much the systems are actually used, though, and also taking into consideration that the 45W versions are typically (but not always) lower speeds and therefore lower prices than the 65W versions, and taking into account that you're buying a zillion systems and saving $10 times a zillion is a helluva lot of money, I WOULD go with the 45W versions.

But not because it's a 20W difference at idle. Because it isn't. Hell, it's not even a 20W difference at load, since we all know AMD's TDP is an envelope that is (theoretically-and more importantly, especially so in your hypothetical situation) NEVER exceeded-or even equaled.
February 22, 2007 4:52:00 AM

Thank you Jack. :) 
February 22, 2007 4:58:16 AM

Quote:
But not because it's a 20W difference at idle. Because it isn't. Hell, it's not even a 20W difference at load, since we all know AMD's TDP is an envelope that is (theoretically-and more importantly, especially so in your hypothetical situation) NEVER exceeded-or even equaled.


Maybe AMD should rethink how they calculate TDP -- more similar to Intel. This would drop their TDP ratings and provide more ammo for advertising.

It seems kinda misleading to have multiple ways of calculating this value.
February 22, 2007 5:01:41 AM

Quote:

Well then I'd buy the 45W processors.

Wouldn't you?

:D 

In all seriousness, I agree with you. Depending on how much the systems are actually used, though, and also taking into consideration that the 45W versions are typically (but not always) lower speeds and therefore lower prices than the 65W versions, and taking into account that you're buying a zillion systems and saving $10 times a zillion is a helluva lot of money, I WOULD go with the 45W versions.

But not because it's a 20W difference at idle. Because it isn't. Hell, it's not even a 20W difference at load, since we all know AMD's TDP is an envelope that is (theoretically-and more importantly, especially so in your hypothetical situation) NEVER exceeded-or even equaled.


Definitely! If the TDP is unreachable anyway, what are the real savings in a more efficient processor? Next to nothing.

Let's throw some figures around and see if we can come up with a formula.

Enterprise needs 1,000 PCs.

90% are M-F 9-5 general office apps, so let's be generous and say 20% of the time at 80% CPU usage, 40% of the time at 20% CPU usage, 40% idle.

10% are serious crunchers. Give them M-F 9-9 @ 100%.

My formula is:

251 business days p.a.
Total idle hours 251 x 8 x 900 x .4 = 722880
Total 20% hours 251 x 8 x 900 x .4 = 722880
Total 80% hours 251 x 8 x 900 x .2 = 361440
Total 100% hours 251 x 12 x 100 = 301200

Now this should be a very simple amount to factor in IF we know what the actual W draw is at each CPU usage rate and we multiply that by the cost of electricity.

Therefore, this simplified formula should be correct for each CPU. We can use 65W vs. 45W or whatever (the terms below are the actual watt draw at each load level, converted to kWh and multiplied by the electricity rate):

[(722880 x IdleWatts) + (722880 x 20%Watts) + (361440 x 80%Watts) + (301200 x 100%Watts)] / 1000 x ElectricCostPerKWh

OK, so let's solve this. I really wanna find out if performance/watt is as important as being bandied around or if it's wholly insignificant!
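
Here's that formula sketched in code. The per-load-level watt draws and the electricity rate are placeholder guesses just to show the mechanics; plug in measured numbers to get a real answer:

Code:
# Sketch of the formula above. Watt draws per load level and the electricity
# rate are made-up placeholders; the machine-hours come from the post above.
BUSINESS_DAYS = 251
HOURS = {  # machine-hours per year at each load level
    "idle": BUSINESS_DAYS * 8 * 900 * 0.4,   # 722,880
    "20%":  BUSINESS_DAYS * 8 * 900 * 0.4,   # 722,880
    "80%":  BUSINESS_DAYS * 8 * 900 * 0.2,   # 361,440
    "100%": BUSINESS_DAYS * 12 * 100,        # 301,200
}
KWH_COST = 0.10  # assumed $/kWh

def annual_cost(draw_watts):
    """draw_watts maps load level -> CPU power draw in watts."""
    kwh = sum(HOURS[level] * draw_watts[level] / 1000 for level in HOURS)
    return kwh * KWH_COST

cpu_65w = {"idle": 15, "20%": 30, "80%": 55, "100%": 65}  # guessed draws
cpu_45w = {"idle": 12, "20%": 22, "80%": 38, "100%": 45}  # guessed draws
print(f"65W TDP fleet: ${annual_cost(cpu_65w):,.0f} per year")
print(f"45W TDP fleet: ${annual_cost(cpu_45w):,.0f} per year")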


Quote:
AMDzone? AMD Zone? Are you high? The fab was an accident, I thought I had grabbed water.... AMD Zone?

Give me some credit here, dewd :lol:  :lol:  I'd get banned there in 2 seconds with my take on AMD. :wink:


Hey, I seem to remember that you got banned here too! :wink: :lol: 
February 22, 2007 5:29:30 AM

Quote:
Interestingly, Scientia has come into the forum and added his two cents --- and he has been very pragmatic as well --- hats off to him. I was impressed.


I don't like Scientia. He's terrible and awful and mean to poor helpless kittens. :cry: 

OK, Scientia, J/K... :lol: 

I just stole from another thread a link to an idle vs. full-load chart which might help fill in some of the question marks in my formula above, but it still doesn't contain any info on intermediate load usage. Can anyone help?
February 22, 2007 5:29:42 AM

Quote:
Thank you Jack. :) 


You're welcome --- there has always been this contention over what each company means by TDP.... Frankly, LostCircuits does the best job of measuring CPU power CONSUMPTION (sorry for shouting ;)  ), as they have modified their test boards to put a power meter inline with the VRM. This gives a direct measure of the power draw of the processor, which is half the battle.

For you, me, and many enthusiasts --- TDP is important to us only for shopping around for an HSF, and the power draw is important when shopping for a PSU (fortunately, TDP is a good enough estimate to make sound PSU decisions). For the ultra-enthusiast, OCing fool --- such as Wusy --- TDP is pretty much meaningless, as they go out of their way to figure out how to dissipate enough power to heat your house; TDP is simply irrelevant to them. :) 

jack

That's easy!

Get a Pentium D. Or a QFX system.

And don't forget to overclock. :wink:
February 23, 2007 1:46:57 AM

Quote:
You'll never have it if AMD gets torn to shreds because if that happens, they will put such a price tag on it that you'd be tempted to sell your house to get it :lol: 


That's why God put AMD morons (oops, sorry, fanboys :wink: ) on Earth... so we can have the best of both worlds, performance and a good price.

OH, and by the way, AMD would do exactly the same 20 years from now should they ever manage to get ahead of Intel. That's the name of the game, baby :roll:
February 23, 2007 2:09:18 AM

Quote:
That's why God put AMD morons (oops, sorry, fanboys :wink: ) on Earth... so we can have the best of both worlds, performance and a good price.


Such a classy post.

Quote:
OH, and by the way, AMD would do exactly the same 20 years from now should they ever manage to get ahead of Intel. That's the name of the game, baby :roll:


Do you have severe short memory loss? Or, do you selectively forget that AMD had the better processor for years (until Core/Core 2)? They've done it before, it's possible they could do it again. Grow up!
February 23, 2007 3:04:09 AM

Quote:
They've done it before, it's possible they could do it again. Grow up!


30 years ago I ran in a marathon. I've done it before, it's possible I could do it again... NOT! 8O
February 23, 2007 3:16:14 AM

Quote:
They've done it before, it's possible they could do it again. Grow up!


30 years ago I ran in a marathon. I've done it before, it's possible I could do it again... NOT! 8O

Grow up! :p 
February 23, 2007 3:38:51 AM

Quote:
Grow up! :p 


Please choose one:

Can I borrow your face for a few days while my a$$ is on vacation?

Are you so dense that light bends around you?

Did the mental hospital test too many drugs on you today?

Converse with any plankton lately?

Don't you have a terribly empty feeling in your skull?

And/Or

Learn from your parents' mistakes. Use birth control!

(Don't flame, just jokin'!!!) :lol: 

Now to prevent the crime of having a content-free post:

I have no doubt that AMD could reclaim its performance crown. It would not only be good for competition, but even better for the enthusiast. As I have no doubt that K10 could theoretically be a monster C2Q killer. The problem I have is that from my observations, the rudder fell off AMD almost a year ago. First of all, allow me to state the obvious, they could not afford ATI. A merger would have been a far better solution for both companies than a poison-pill debt-fueled buyout which, at $14.48/share has effectively wiped out any value either company had. Then the 65nm shrink, necessary as it may have been, simply replaced uncompetitive 90nm product with uncompetitive 65nm product. Yes, the sound you hear is the enthusiast market snoring. And if we have to take the recent roadmaps as gospel, the fastest K10 will be a 2.5GHz for the next 14 months at which time a 2.6GHz will be introduced. More snoring.

Also please note this post was written on a San Diego 3700 by a guy who's had AMD only for many years. 8)
February 23, 2007 3:56:19 AM

Quote:
I have no doubt that AMD could reclaim its performance crown. It would not only be good for competition, but even better for the enthusiast. As I have no doubt that K10 could theoretically be a monster C2Q killer. The problem I have is that from my observations, the rudder fell off AMD almost a year ago. First of all, allow me to state the obvious, they could not afford ATI. A merger would have been a far better solution for both companies than a poison-pill debt-fueled buyout which, at $14.48/share has effectively wiped out any value either company had. Then the 65nm shrink, necessary as it may have been, simply replaced uncompetitive 90nm product with uncompetitive 65nm product. Yes, the sound you hear is the enthusiast market snoring. And if we have to take the recent roadmaps as gospel, the fastest K10 will be a 2.5GHz for the next 14 months at which time a 2.6GHz will be introduced. More snoring.

Also please note this post was written on a San Diego 3700 by a guy who's had AMD only for many years. 8)


I completely agree with you... AMD better get a grip and get it soon, or they are in for a rude awakening.

As I type on my 1.8GHz P4-M that I have thoroughly enjoyed for nearly 5 years now. :wink:
February 23, 2007 4:02:35 AM

Quote:

I completely agree with you... AMD better get a grip and get it soon, or they are in for a rude awakening.

As I type on my 1.8GHz P4-M that I have thoroughly enjoyed for nearly 5 years now. :wink:


Thank you sir. Yes, the decisions made in the AMD boardroom lately have been rather puzzling. I've stated in the past that the most likely scenario to allow for their market missteps is one of sheer unmitigated panic by AMD Execs, and I don't know if that is too far from the truth!

Hey, the 1.8 P4-M wasn't a bad CPU. I know a guy who has had one for years, just like you, and swears by it!
February 23, 2007 4:14:58 AM

Quote:
Thank you sir. Yes, the decisions made in the AMD boardroom lately have been rather puzzling. I've stated in the past that the most likely scenario to allow for their market missteps is one of sheer unmitigated panic by AMD Execs, and I don't know if that is too far from the truth!

Hey, the 1.8 P4-M wasn't a bad CPU. I know a guy who has had one for years, just like you, and swears by it!


Did you see the R600 was delayed again? After setting up all of the fanfare in the coming weeks -- yet another misstep. It would have been better for them to just keep quiet. They are really starting to look bad.
February 23, 2007 4:51:16 AM

I have known about this since yesterday and now I'm pissed, because currently I have no graphics card in my rig. I was willing to wait a month and a little more, but it has gotten to the point where I don't want to waste my time, so I'm buying an 8800 GTX. Why do you let me down when I need you most, AMD? But with Intel trying to push ahead once again, it will force AMD to pull something out of their asses, so I am now waiting to see how AMD will react to this assault. I hope they turn to multi-core processors with multi-core streaming processors integrated into them. Just think of the possibilities: it would be like the first dual-core processors all over again, something new and so ridiculously powerful that it will take programmers years to start writing code for it.
February 23, 2007 6:31:13 AM

Ranman, it might seem like I'm making this up, but as soon as I read about a week ago about the inexplicable last-minute change of name from X28xx to X29xx, I figured that something had just gone BOOM at AMD central. I think that they may have found out that they have a very serious problem and have had to go back to the drawing board and basically come out with a completely different card. Well, sometimes you get the bear and sometimes the bear gets you.

P8ntslinger, I can assure you that you are not alone in the enthusiast market as a consumer who would pull out his credit card right now for an ATI DX10 card but is going with Nvidia. Each one is a cut in the death of a thousand cuts that AMD has self-inflicted.

Jack, first of all, not only would I not disagree with you, but I would place you on my highest pantheon alongside the other great Deities who have guided my life, Buddha, Bacchus and John Holmes. :twisted: Therefore, I can certainly understand the R600 delay as well. My basic point is this:

Vista was supposed to be launched a few centuries ago. It missed that date, and then the next and then the next, so on and so on. We all know the story. However, let us assume that the bozos at MS were not as dense as the bozos at AMD and Vista had come out, say in Q1 06. Now we would have a situation where AMD-ATI would have absolutely nothing on the market for that OS until almost a year and a half later, which is more than halfway to the next OS release according to MS's calendar. That's not just missing the wave, that's missing the whole damn tsunami! :lol: 
February 23, 2007 10:55:01 AM

Quote:


OH, and by the way, AMD would do exactly the same in 20 years from now should they ever manage to get ahead of Intel. That's the name of the game baby :roll:


Do you have severe short memory loss? Or, do you selectively forget that AMD had the better processor for years (until Core/Core 2)? They've done it before, it's possible they could do it again. Grow up!

Through all those years, they still couldn't overthrow Intel, who had an inferior (in every sense of the word) product.

AMD may have a good future product in the works, but their capacity to deliver and appeal is simply restricting their potential growth. They only have what...three fabs?

That's nowhere near enough to pump out K8, K8L, GPU, northbridge, and southbridge chips.

Advertising is another problem for AMD...
February 23, 2007 12:13:20 PM

Richard is known for arrogance, the dumb kind though; open mouth, insert foot type :lol: 
February 23, 2007 12:15:28 PM

true
!