AMD K10 benches starting to leak: K10 20% slower than Clovertown c-f-c

August 30, 2007 3:51:59 PM

How come CPU-Z says the Phenom only has 1 core?
August 30, 2007 4:05:57 PM

jwlangs said:
How come CPU-Z says the Phenom only has 1 core?


It's not a Phenom ... it's two quad-core K10s, i.e. a dual-socket server rig.
August 30, 2007 4:20:33 PM

How do you link and quote and do all that fancy forum crap?
August 30, 2007 4:24:22 PM

So, CPU-Z sees 8 single cores in the dual Opteron setup, and 2 quad cores in the dual Xeon setup. Is this just some sort of nuance within CPU-Z?
August 30, 2007 4:27:20 PM

The second page of that link shows the Barcelona completing a SuperPI test in 39 seconds. My 4400+ X2 does that SuperPI test in a similar time. I'm not completely sure these Barcelona benchmarks are trustworthy, but we'll find out when it's actually released. Such benchmarks are not to be taken seriously anyway, because game benchmarks and real-world benchmarks will give a far more accurate indication of performance.
August 30, 2007 4:35:15 PM

jwlangs said:
So, CPU-Z sees 8 single cores in the dual Opteron setup, and 2 quad cores in the dual Xeon setup. Is this just some sort of nuance within CPU-Z?

It's the type of nuance that occurs with every new chip the CPU-Z author hasn't updated CPU-Z for yet.
August 30, 2007 5:09:38 PM

Ouch

If this is true, at least on these benchmarks, AMD is in deep sh*t

Eight cores of Deep Fritz scores around 8000 on Barcelona and around 13000 on a Clovertown.

bad.
August 30, 2007 5:46:02 PM

Whoa, bad news!!! No wonder Henri Richard bailed on AMD... EVERYONE'S sinking-ship rat analogy was correct! DAMN THOSE RATS!!! Somehow I think the architecture will only really start to shine over 2.6GHz, so I hope they are big power savers in the AM3 boards, or are priced really low. We'll see how their stock performs when this news is mainstream...
August 30, 2007 5:51:52 PM

BaronMatrix said:
Not to be a stickler but the AMD system is running Generic GDI while the Intel system is running a 7950GX.


Cinebench has nothing to do with GFX cards :lol: 

Prediction: CPUs are going to get a lot more expensive in the next 18 months.
August 30, 2007 6:21:37 PM

Wombat2 said:
Cinebench has nothing to do with GFX cards :lol: 

Prediction: CPUs are going to get a lot more expensive in the next 18 months.



The point was that when comparing systems, everything should be the same. Also, all the Intel chips are clocked higher. The real SPEC numbers will come out on the 10th along with any other official benches they did.

I can wait.
August 30, 2007 6:55:01 PM

BaronMatrix said:
The point was that when comparing systems, everything should be the same. Also, all the Intel chips are clocked higher. The real SPEC numbers will come out on the 10th along with any other official benches they did.

I can wait.

That was my initial thought as well; mostly, though, what mobos and chipsets were these "tests" run on? I'm always suspicious of benches like this, especially when CPU-Z hasn't been updated to fully recognize the proc. What else in the test isn't fully recognized?!

I can wait too. Especially for more "reliable" sites and benches.

August 30, 2007 7:30:22 PM

Hopefully this isn't true, but we will know soon enough. I may even get a chance to test one out myself around launch, because even if these things are terrible I'm sure my roommate will be getting one! woot! lol

Best,

3Ball
August 30, 2007 8:52:05 PM

For those who say Phenom is nowhere to be found and is slower, here is a quote from Rahul Sood of VoodooPC.



So, AMD decided to unveil Phenom running at 3.0GHz without showing actual benchmarks. What it showed was a game running smoothly with all details enabled, which makes perfect sense to me. And for the record, if you were to benchmark Phenom at 3GHz you would see that it kicks the living crap out of any current AMD or Intel processor—it is a stone cold killer (at 3GHz, now imagine how it would perform if they could squeeze some more juice out of it?).


Now here it is from a reputable OEM who has no reason to lie and definitely has access to the chips.

http://www.rahulsood.com/2007/08/benchmarks-are-wiggedy-wiggedy-whack.html


He also reiterates what I said last year, BEFORE AMD did: benchmarks aren't the be-all and end-all of a computing experience.

ALL HAIL THE DUOPOLY!!!

August 30, 2007 8:58:51 PM

I would see that as being more of a GPU demonstration than a CPU demo.....
August 30, 2007 9:06:16 PM

Quote:
it kicks the living crap out of any current AMD or Intel processor - it is a stone cold killer


Still don't buy it. I don't care, even if the pope were to say "Phenom is great! Have faith!".

And about the 3GHz "kicking the living crap" out of any current AMD or Intel processor, who the hell cares? Current processors will always get kicked around by future processors.

Quote:
I'm guessing that AMD will be able to launch some parts at higher clocks than it is currently showing in its roadmaps


Yeah sure, you're guessing. It's all like a poker game, sure. They're bluffing. They're not betting big money, but they actually have a royal flush in their hands. They're going to surprise everyone by actually winning! Those treacherous bastards!

Quote:
It's interesting to note that AMD isn't showing benchmarks on a part that delivers the goods - perhaps it too is seeing that performance benchmarks are only a small piece of the overall experience puzzle.


OK, the whole "Experience" thing stands to reason, but the big question here is: if, like, 90% of IT people base their decisions on performance benchmarks and 10% on the "Experience", why wouldn't you show the vast majority of potential customers what they want to see? I don't care if this guy owns Voodoo or even is Bill Gates himself. He missed a very important point: show the customers what they want to see, not what you think they should want to see.

Simple enough, eh? It's not rocket science.

Also, gamer experience in terms of image quality is much more video-card-dependent. Heck, even system responsiveness depends a lot on video cards nowadays, with Vista's interface. My conclusion from this article is much more along the lines of "get a better video card, it'll give you a better experience" than "Phenom rocks".
August 30, 2007 11:30:09 PM

Why speculate? We will either see AMD get kicked in the nads again or we won't. Everything out there now is likely a lie, and thus it is hard to tell which bench is actually real. So just wait until it comes out.
August 31, 2007 10:43:40 AM

Like the baron said ... I can wait.

In the meantime Intel will go on a massive campaign of FUD.

AMD will stay silent ... lawyers re-reading the NDAs with the partners ... trigger fingers twitching silently.

Anand will have rewritten his top shelf analysis of the state of play about 30 times by then.

I'll read what he has to say first ...
August 31, 2007 7:29:18 PM

Quote:
Like the baron said ... I can wait.
If by waiting you mean waiting a couple hours so you can find a post that backs up your point of view, no matter how obscure it is, then ok. :) 

Quote:
In the meantime Intel will go on a massive campaign of FUD.
Yes, because it has been them all along that's been releasing slides about their next chip being the second coming and pointing out every obvious shortcoming of competitors' chips, all the while not releasing a single bit of hard data. Yes, Intel are such FUD-mongers.
August 31, 2007 7:33:51 PM

BaronMatrix said:
For those who say Phenom is nowhere to be found and is slower, here is a quote from Rahul Sood of VoodooPC.



So, AMD decided to unveil Phenom running at 3.0GHz without showing actual benchmarks. What it showed was a game running smoothly with all details enabled, which makes perfect sense to me. And for the record, if you were to benchmark Phenom at 3GHz you would see that it kicks the living crap out of any current AMD or Intel processor—it is a stone cold killer (at 3GHz, now imagine how it would perform if they could squeeze some more juice out of it?).


Now here it is from a reputable OEM who has no reason to lie and definitely has access to the chips.

http://www.rahulsood.com/2007/08/benchmarks-are-wiggedy-wiggedy-whack.html


He also reiterates what I said last year, BEFORE AMD did: benchmarks aren't the be-all and end-all of a computing experience.

ALL HAIL THE DUOPOLY!!!
Yes because salesmen are some of the most trustworthy people on the face of the earth. :sarcastic: 
August 31, 2007 11:38:13 PM

BaronMatrix said:
Not to be a stickler but the AMD system is running Generic GDI while the Intel system is running a 7950GX. Also, if you notice, the 2.4GHz Intel chip scores THE SAME as the 3.0GHz Intel, around 17000.

Hmmmm.


Not to be a stickler, but AMD has been basing benchmarks off simulated CPUs at estimated clock speeds against Intel processors using integrated graphics.





And AMD has been caught and called on it by so many sites now that not only have they removed the graphs from their site, but also any mention of comparative performance.

http://multicore.amd.com/us-en/AMD-Multi-Core/Products/Barcelona/Performance.aspx

A little more than one month ago, George Ou of ZDNet posted this story:

http://blogs.zdnet.com/Ou/?p=567

AMD posts blatantly deceptive benchmarks on Barcelona

George Ou of ZDnet wrote:
Quote:


* Not real product. Fastest Barcelona being released in September is 2.0 GHz

As you can see from above, AMD’s claim that they have a 20% clock-for-clock advantage with Barcelona is simply wrong. Based on the latest certified SPEC.org results, AMD has a little more than a 1% clock-for-clock performance advantage in a dual-socket 8-core Server configuration but they have 50% clock speed deficit when the Barcelona finally launches in September. That means Barcelona will not be the Intel quad-core killer that AMD has been promising for most of this year and it won’t even be close.

The deception doesn’t end with the quad-cores; AMD is also claiming to have an advantage on dual-core processors when in fact they have a major performance deficit. AMD claims to have a 2.5% advantage when Intel actually has a 14.7% advantage when you’re looking at the certified SPEC.org scores.
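
For anyone wondering where a "clock-for-clock" number like that comes from, here's a rough sketch in Python of the arithmetic - you normalize each published score by the chip's clock speed and compare the ratios. The scores below are made-up placeholders, not real SPEC.org results:

# Illustrative only: how a clock-for-clock comparison is computed.
# These scores are placeholders, NOT actual SPEC.org results.
def per_ghz(score, clock_ghz):
    # normalize a throughput score by clock speed
    return score / clock_ghz

barcelona_per_ghz = per_ghz(score=100.0, clock_ghz=2.0)   # hypothetical 2.0GHz Barcelona result
clovertown_per_ghz = per_ghz(score=148.5, clock_ghz=3.0)  # hypothetical 3.0GHz Clovertown result

advantage = (barcelona_per_ghz / clovertown_per_ghz - 1) * 100
print("Clock-for-clock advantage for Barcelona: %+.1f%%" % advantage)
# Roughly a +1% per-clock edge here, which a 2.0GHz vs 3.0GHz launch-clock gap
# more than wipes out in absolute performance - the point the article is making.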




September 1, 2007 3:12:10 AM

Hey Baron ... I'm with you ... hoping AMD deliver something competitive.

I just hope I can get a quad core cheap and some decent performance improvement over my dual cores.

I'd also like the possibility of doing that by just swapping a chip ... not a mobo transplant. Hence ... fingers crossed.

Is that such an unrealistic ask?

And if AMD does a bit of creative marketing to boost their shaky financial position, it isn't as if Intel hasn't done that before.

We remember the last of the NetBurst rubbish they put out to compete with the A64s.

We remember the massive FUD campaign there.

We hope the marketing practices Intel has "allegedly" been engaging in over past years earn them their just penalties in court.

AMD could do with a few billion to wipe their debt.

All hail the duopoly, as previously said.

I think the server world is going to benefit the most ... performance per watt ... from the new AMD quads.

I think the high-end gaming world is probably going to continue to see Core 2 keeping most benchies ...

Happy with that.

Looks like they have optimised the K10 with a key priority on multithreaded applications and inter-core communication, and tried to improve IPC without a drastic redesign. Plus they also put a huge emphasis on power savings ... for servers.

Let's see what pops up then.
September 1, 2007 4:50:29 AM

function9 said:
Yes because salesmen are some of the most trustworthy people on the face of the earth. :sarcastic: 

Hmm, doesn't he sell BOTH CPUs? If so, it's the same to him which sells best, and your sarcasm is out of place here. HP most likely has a few babies up and running while preparing their new lines of PCs for the autumn/winter season.
September 1, 2007 5:57:11 AM

So hyping a product you sell isn't good business? OK then. It had nothing to do with Intel or AMD, but with the fact that the guy is a salesman first and foremost. I fail to see how my statement about salesmen is out of place. I would love to know, though, what one has to do with the other. Please inform me.
September 1, 2007 6:07:43 AM

LOL, what's the debate about? Sood sells neither Barcelona nor Phenom yet, as neither exists in the retail market. Hyping that which does not exist cannot impact sales of that which one cannot sell.
September 1, 2007 11:23:38 AM

Thanks for that link to Extreme Systems, Wombat2.

There is a Jack over there ... lol.
September 1, 2007 4:09:13 PM

BaronMatrix said:
For those who say Phenom is nowhere to be found and is slower, here is a quote from Rahul Sood of VoodooPC.



So, AMD decided to unveil Phenom running at 3.0GHz without showing actual benchmarks. What it showed was a game running smoothly with all details enabled, which makes perfect sense to me. And for the record, if you were to benchmark Phenom at 3GHz you would see that it kicks the living crap out of any current AMD or Intel processor—it is a stone cold killer (at 3GHz, now imagine how it would perform if they could squeeze some more juice out of it?).


Now here it is from a reputable OEM who has no reason to lie and definitely has access to the chips.

http://www.rahulsood.com/2007/08/benchmarks-are-wiggedy-wiggedy-whack.html


He also reiterates what I said last year, BEFORE AMD did: benchmarks aren't the be-all and end-all of a computing experience.

ALL HAIL THE DUOPOLY!!!


What a load of crap. This is a computer enthusiast site and we should all just forget about benchmarks? Instead let's just judge our system performance by the warm fuzzy feeling we get. THE EXPERIENCE? Could there be anything more subjective? Basically Rahul is saying that AMD's new chips aren't going to steal the benches, but they'll be fast enough. There's no such thing as fast enough on an enthusiast site.
September 2, 2007 2:47:32 AM

function9 said:
So hyping a product you sell isn't good business? OK then. It had nothing to do with Intel or AMD, but with the fact that the guy is a salesman first and foremost. I fail to see how my statement about salesmen is out of place. I would love to know, though, what one has to do with the other. Please inform me.

Hyping a product and lying about it are different things. You question Rahul's credibility on the basis that he is also a salesman; that's BS, and since he sells both AMD and Intel CPUs, he doesn't have a clear reason to be biased toward either manufacturer. If you don't trust him, bring up something more than "he is a salesman".
September 2, 2007 2:53:50 AM

Scarchunk said:
What a load of crap. This is a computer enthusiast site and we should all just forget about benchmarks? Instead let's just judge our system performance by the warm fuzzy feeling we get. THE EXPERIENCE? Could there be anything more subjective? Basically Rahul is saying that AMD's new chips aren't going to steal the benches, but they'll be fast enough. There's no such thing as fast enough on an enthusiast site.

Actually it was a smartly written article, while you and some others are extremists. An enthusiast doesn't HAVE to be an extremist; he may have a balanced view of the subject without going to extremes, like Rahul's article.
September 2, 2007 3:31:20 AM

http://www.rahulsood.com/2007/08/benchmarks-are-wiggedy-wiggedy-whack.html

Prelude to disappointment.

While Mr Sood is correct that there is more to computers than benchmarks:
Quote:
Users care that when they turn on their PC it boots reasonably quickly and works with all of their devices. If one of their components fails, they are looking for a simple way of replacing the component with the best access to customer support.....

-
......There are many more factors to consider when building and/or buying your next PC. You’ll want to consider operating noise, image quality, ease of access, ease of upgradeability, ease of replacing components, how “quick” it feels when you’re booting it up, storage space, stability, operating system usability, style and design...


I am surprised he would take this tack in minimizing their importance.

Noise, storage capacity, ease of upgradeability, etc. are all important; however, many of Mr Sood's points are moot with regard to benchmarks.
For example:
-The ease of replacing components: components are as easy to replace on an Asus 775 mobo as on an Asus AM2 mobo.
-Quickness of booting has more to do with the BIOS, peripheral device drivers and software than the CPU.....an XP 1900 with nothing but Word will boot to operability faster than an AM2 with a digitizer pad, plotter, external HDDs, firewall, antivirus software, constant broadband connection and graphics suite.
-Noise of a system: dependent on the case (and AFAIK there are no cases specifically designed for only AM2 or 775) and the cooling options/hardware the user chooses.
-Storage space: absolutely nothing to do with the CPU or CPUs; completely mobo/chipset/user bound.

Nothing Mr Sood mentions is exclusive to either CPU manufacturer's systems. Not a single thing. When all is said and done, with the exception of the CPU and mobo, a buyer can configure systems that are identical for either AMD or Intel. This leaves the only choice as the CPU and mobo. And what is a consumer to base their CPU/mobo choice on? HDD capacity? Peripheral device drivers? The antivirus or word processing suite they will use? Their internet connection?


Based on the fact that both AMD and Mr Sood touted AMD's benchmarks when AMD held the lead in performance, and that Mr Sood points out considerations that impact both Intel and AMD equally, I can only conclude that Mr Sood is not impressed with Barcelona's performance and is attempting to minimize the importance of benchmarks to minimize disappointment with Barcelona.


September 2, 2007 5:12:58 AM

I think you missed his point, turpit. Rahul speaks of "differences between overall experience and performance benchmarks." For some people benchmarks = everything, while that definitely isn't the case, at least not for the majority of users. He doesn't deny benchmarks, he just says they're not the be-all and end-all.

For example, C2D-based Xeons are in most cases faster than Opterons, but they also use more power per system, since it's not only CPU watts that count. Why is that important? Because by choosing a CPU you choose the whole platform with all its positive and negative sides, and a CPU benchmark alone won't tell you (at least not always) which platform is better for your specific needs.

Another example - if you want SLI, you are out of luck with Intel motherboards, even if we presume Yorkfield would work faster in X38 than in the 680i/whatever.

Upgradability? AMD made a very smart move in the past when they kept Socket A for ages, instead of the rapid change of sockets and CPU compatibility à la Intel recently.

Cooling? Hotter CPUs need better cooling, which in most cases means louder systems. Again, a CPU-specific thing which affects user experience.

Anyway, again, Rahul doesn't deny benchmarks; I just agree with him that they aren't everything.

Oh, and although it may seem he tries to minimize the importance of benchmarks, he also says:
"And for the record, if you were to benchmark Phenom at 3GHz you would see that it kicks the living crap out of any current AMD or Intel processor—it is a stone cold killer (at 3GHz, now imagine how it would perform if they could squeek some more juice out of it?)."

Now, I don't have a Phenom to test whether he is right, and neither do you, but since he works for HP, I'm pretty sure they have some samples and he knows what he is talking about. If not, we will know pretty soon.
September 2, 2007 7:38:57 AM

Harrisson said:
I think you missed his point, turpit. Rahul speaks of "differences between overall experience and performance benchmarks." For some people benchmarks = everything, while that definitely isn't the case, at least not for the majority of users. He doesn't deny benchmarks, he just says they're not the be-all and end-all.

For example, C2D-based Xeons are in most cases faster than Opterons, but they also use more power per system, since it's not only CPU watts that count. Why is that important? Because by choosing a CPU you choose the whole platform with all its positive and negative sides, and a CPU benchmark alone won't tell you (at least not always) which platform is better for your specific needs.


I disagree. In both AMD's and Intel's cases, when they hold the position of performance leader, they will tout benchmarks. When they follow, they will imply that 'benchmarks aren't that important'. Not only will the manufacturers do that, but the vendors who sell those products will also use, or deliberately avoid using, performance figures in their advertising.

As for benchmarks not being "everything" for the majority of users, I would agree, not because people don't care about performance, but because the typical person who walks into Best Buy, Walmart, or what have you doesn't know how performance in computing is measured. But I do believe that those people want the most for their money, and typically that person is not going to be persuaded by a few watts of power savings, or a few decibels less of SPL, vs the ability to run or not run an application smoothly, or to have longevity.

In terms of Opteron vs Xeon, I won't separate those from DT systems, but the server world is a different place than the DT. There are commercial concerns whose priority may be low power usage, while there are others whose priority is performance at any cost, and still others that look for a balance of performance vs purchase/operating cost. So I don't think using the Opteron or Xeon is particularly apropos when talking about people in general; however, since the Phenom is still not due out for some time, the Barcelona-cored Opterons will be the only products available for comparison for some time.

But in regards to overall experience, for example, I can take my A64 3200 and load Oblivion on it. It will run the game, and it is a quieter system and uses less electricity than my other systems, but the "experience" leaves much to be desired. Conversely, my E6600 runs the game at a level that makes the experience pleasant, regardless of the fact that it uses a little more power and creates a few dB higher SPL than the A64 3200. Now, Oblivion is a GPU-bound game, but as is true of most modern games, it can be bound by the CPU as well, when the performance of the CPU descends below a critical value. These results hold true when I run 3DSMax, CorelDraw, Photopaint, MSOffice, MyFlix or any of the other apps I use.

On a different note, I have been seeing people claim Barcelona will consume considerably less power than Xeon/C2D. I would have to disagree with that. We already know that Barcelona, aka K10, is an update of K8. We also know it will be manufactured on the 65nm node and move to wider 128-bit pathways. If K10 is to outperform K8 on the same node, with wider pathways, logic dictates it must use more power than current K8s. Now, power usage gamesmanship aside (and both AMD and Intel have been "playing games" to reduce power usage), since current Xeons and Opterons of comparable performance display an insignificant power consumption advantage at 100% usage, logic would dictate that a more powerful Barcelona will use more power than a Xeon. Contrary to what some people want to believe, even the mighty AMD can't get something for nothing.

Harrisson said:

Another example - if you want SLI, you are out of luck with Intel motherboards, even if we presume Yorkfield would work faster in X38 than in the 680i/whatever.


LOL, c'mon now ;) the same is true if you want Crossfire. In my personal case, I want neither, since the few-FPS advantage is greatly outweighed by the cost, but to each their own when it comes to that.

Harrisson said:

Upgradability? AMD made a very smart move in the past when they kept Socket A for ages, instead of the rapid change of sockets and CPU compatibility à la Intel recently.

I disagree. While Socket A had exceptional longevity, that was the physical socket only, just as has happened with Intel's long-lived Socket 775. As AMD continued to improve its Athlon series of processors, successive generations of Athlons required new chipsets, meaning new motherboards. Socket A existed during the transition to AGP, which required a new chipset and thus motherboard. The transition from EIDE to SATA required compatible chipsets and thus new motherboards. 775 has demonstrated the same characteristics in terms of longevity as Socket A has, requiring new chipsets and thus motherboards for the advance from P4 to C2D, AGP to PCIe, etc.

Harrisson said:

Cooling? Hotter CPUs need better cooling, which in most cases means louder systems. Again, a CPU-specific thing which affects user experience.

And both AMD and Intel moved to automatic thermal protection, as well as automatic throttling, and the smaller, lower-consumption, cooler 65nm node. People seem to forget that the high-end versions of the last series of Athlon XPs ran as hot, if not hotter, than the high-end Netbursts. Net sum zero, since both current high-end AMD and Intel offerings run cooler than their predecessors, have better factory HSFs than previously, and can make use of high-end aftermarket silent coolers. Furthermore, tests of power usage and TDP fall into the category of benchmarks. So if an individual is looking for the most power-efficient system, or the coolest system, how do they find it? Power usage and TDP benchmarks.


Harrisson said:

Anyway, again, Rahul doesn't deny benchmarks; I just agree with him that they aren't everything.

I don't disagree; however, when people start pulling out the 'benchmarks aren't everything' argument, again, it is usually because their product does not outperform the competition. Now, Mr Sood is not directly affiliated with either AMD or Intel, and VoodooPC produces both Intel- and AMD-based systems; however, there was a time when Mr Sood was quite publicly enamoured with AMD, so it is quite odd that someone who once made a living selling ultra-performance machines, and did favor one particular manufacturer's products, would say benchmarks aren't everything. Now, it is also true that some benchmarks, on an individual basis, are fairly worthless. The synthetic benchmarks specifically. People run Word, 3DSMax, Doom etc., not simu-Word, simu-3DSMax or simu-Doom. So certainly there are benchmarks that are meaningless, but how a machine runs an actual application is something that is important enough to be a question that everyone asks. Benchmarks provide that information.

Harrisson said:

Oh, and although it may seem he tries to minimize the importance of benchmarks, he also says:
"And for the record, if you were to benchmark Phenom at 3GHz you would see that it kicks the living crap out of any current AMD or Intel processor—it is a stone cold killer (at 3GHz, now imagine how it would perform if they could squeek some more juice out of it?).
And how does he know that without benchmarking the system? And if "Phenom at 3GHz" "kicks the living crap out of any current AMD or Intel processor", then why belittle the importance of benchmarks?

Harrisson said:

Now, I don't have a Phenom to test whether he is right, and neither do you, but since he works for HP, I'm pretty sure they have some samples and he knows what he is talking about. If not, we will know pretty soon.


How do we know? Because Mr Sood said so? Where is the proof? If we are to believe what people say, then AMD doesn't really need Barcelona or Phenom, since there are people who still claim AM2 outperforms C2D, just as Intel claimed NetBurst could outperform AM2. Without performance tests, how can any performance claims be verified as valid?

But has this been the case with Barcelona? Do we really lack benchmarks? That depends on how the individual chooses to judge AMD's marketing and press releases of the past 6 months. Remember, it wasn't that long ago that AMD executives were claiming that "based on simulated applications, K10 would outperform C2D by up to 70%" clock for clock. As the story floated and the quotes from the different execs varied, that rapidly changed from clock for clock to FP, then to "...up to 70% over AM2 Opteron", then to 40%, then 20%. There are now guesstimates floating around of 1~2% clock for clock. Which is true? Does it matter? According to AMD's Henri Richard, benchmarks don't matter.

If benchmarks don't matter, then why did AMD post graphs on their website and slides at PR events showing "simulations" of Barcelona running at "estimated" clock speeds outperforming Xeon? If AMD's simulations showed their product outperforming the competition, but benchmarks don't matter, then why bother posting those benchmarks at all? And since those benchmarks were only of simulations, now that AMD actually has the 3.0 Barcelona, why not post actual benchmarks to support their earlier claims? The theory that AMD has been hiding Barcelona's performance to deny Intel the chance to respond by 'tweaking' Penryn still persists, but if AMD was really trying to hide its performance from Intel, then why go through the trouble of displaying the performance of "simulated" CPUs? Isn't that letting the cat out of the bag? And worse, why allow Anandtech access to the Barcelona uarch for the purpose of a public critique? If AMD really wanted to keep Intel off balance, why not create and 'leak' benchmarks showing Barcelona underperforming C2D, in order to lull Intel into a false sense of security? Making bold claims of a 70% performance improvement over the competition, or 40%, or 20%, does not sound like AMD has been trying to hide anything, nor does it support Henri Richard's and Rahul Sood's philosophy that 'benchmarks aren't everything'.

So, while Mr Sood and Henri Richard can say "benchmarks aren't everything", the actions of AMD and the past actions of Mr Sood indicate that they operate under a different philosophy.
September 2, 2007 12:06:22 PM

Cripes turpit ... you got anything left to say??

I think with a post like that you probably need a break ...

Not that I believed much of what you said.

Without SLI poor old Intel is up the creek without a paddle.

A single card won't cut it soon enough ... AMD will catch up (well, they already have in 3DMark) and because Crossfire works and they now have the chipset, CPU, and GPU (read: platform) advantage they will move forward.

NVidia will release a CPU soon ... that's why Intel is madly trying to destroy AMD so they can get ready.

Why else is Intel hiring so many new GPU people?

Read between the lines on Sood's old posts ... clever guy there.

We could do with three all competing ... I like that !!

Have any more benchies surfaced yet?
September 2, 2007 2:56:04 PM

LMAO!

You really, really think SLI is such a wonderful idea? You really think it matters that much, eh?

I have never, ever, seriously considered going SLI. And I bet that 95+% of all gamers do not consider SLI at all. I think that "bad support for SLI" is hardly an excuse for saying
Quote:

Without SLI poor old Intel is up the creek without a paddle.


I think turpit has some very good points there.
a b à CPUs
September 2, 2007 3:11:46 PM

What?

DELETED

THIS GOES FOR EVERYONE AS WELL AS REYNOD.

LEAVE REFERENCES TO OTHERS' FAMILIES OUT OF YOUR POSTS, NO MATTER HOW INNOCENT YOU MAY THINK THEY ARE.


You have something to say to someone, you say it to them and leave their mothers, fathers, daughters, sons etc out of it.


Turpit
September 2, 2007 8:50:55 PM

I don't have hours to respond to your every claim, turpit, but again, you miss the big picture. CPU benchmarks are a part of the picture, not the whole thing. You don't disagree, but then again, judging by how you go on, benchies are pretty much everything to you ;) Although some of the things you say are correct, some are out of place and exaggerated. Rahul's article is pretty good, and much more balanced than your posts IMO.

Btw, "And how does he know that without benchmarking the system?". Why do you think he didnt benched it or at least havent got hard data before saying it? Do you honestly think OEM's still havent got samples to work on? I dont know if Rahool is right or wrong, but I dont have the reason to doubt him and I'm sure he have way more inside info than we do. If that said, he still didnt had the access to benchmarks info, then his wording was pretty bad, same as in other thread nmdante was saying one thing but it appeared he meant something else ;) 
September 2, 2007 9:32:11 PM

Reynod said:
Cripes turpit ... you got anything left to say??

I think with a post like that you probably need a break ...

Not that I believed much of what you said.

Without SLI poor old Intel is up the creek without a paddle.

A single card won't cut it soon enough ... AMD will catch up (well, they already have in 3DMark) and because Crossfire works and they now have the chipset, CPU, and GPU (read: platform) advantage they will move forward.

NVidia will release a CPU soon ... that's why Intel is madly trying to destroy AMD so they can get ready.

Why else is Intel hiring so many new GPU people?

Read between the lines on Sood's old posts ... clever guy there.

We could do with three all competing ... I like that !!

Have any more benchies surfaced yet?


And what don't you believe? Please be specific so I can provide you with the links.

I wouldn't put much stock in either Crossfire or SLI. A handful of enthusiasts have bought them, but they are marketing tools for bragging rights/PR & advertising, not mainstream revenue producers. The old sales trick of win on Sunday, sell on Monday. Even here, where enthusiasts come to chat, most people don't have SLI/Crossfire or even want it.

In fact, there's a poll for you to run....how many people have SLI vs how many people don't.
-or-
How many people want SLI/Crossfire vs how many people aren't interested.

That GPUs will eventually go multicore is inevitable; however, unlike AMD or Intel, both Nvidia and ATI have a few nodes left to shrink before they have no other options left. As for now, it's just the cheaper route for the manufacturers to take so they can market higher-performing cards, but at the heavy cost of increased power usage and greater waste heat. In small quantities these things probably don't deter most people, but at the rate at which multiplying cores will increase these byproducts, they'll become unattractive enough for people to take note. Just look over in the GPU section to see the complaints about heat and power usage.

As for Nvidia marketing CPUs, if you're talking about the rumour started by the Inquirer, well, to start, it's the Inquirer. Second, you do realize that it is a rumour, not a fact. Nvidia has made no statements to the effect that they will be producing a CPU. Third, Nvidia would have to skip an entire node to market a competitive product, not to mention the small details of spending a significant amount of money on tooling, licensing, R&D etc., just to start. If you are talking about the other rumours of Nvidia including an on-die processor in its GPUs for physics handling, that is a much more realistic possibility and not a threat to either AMD or Intel, and I doubt either is worried about it.

As for AMD catching up in 3DMark, if you're referring to the rumored 30000+ bench, that is so much rumor that I would counter with Coolaler's benchmarks. Coolaler has proven to be much more reliable than the Inquirer's hacks whose laptops mysteriously get stolen.
September 2, 2007 9:51:36 PM

Harrisson said:
I don't have hours to respond to your every claim, turpit, but again, you miss the big picture. CPU benchmarks are a part of the picture, not the whole thing. You don't disagree, but then again, judging by how you go on, benchies are pretty much everything to you ;) Although some of the things you say are correct, some are out of place and exaggerated. Rahul's article is pretty good, and much more balanced than your posts IMO.


No, my point is that an individual can build nearly identical systems from either manufacturer, focusing on power, or waste heat, or performance, or value. The only things that change are the mobo and CPU. If a person wants the best value, what do they look at? The cost/performance benchmarks. If a person wants low heat, what do they look at? Thermal benchmarks. If they want a silent HTPC, what do they look at? Power consumption and thermal benchmarks.

So, what else is there?
Style? Yes, people do spend thousands of dollars having their cases custom airbrushed, and yes, for them that is part of the "overall experience", but realistically, do most people place the aesthetics of a case ahead of the system's performance? I doubt it. A computer is, after all, only a tool. It's unlikely you'll ever find one sitting in the Louvre next to the Mona Lisa.
Footprint? Absolutely, and again mobo manufacturers offer comparable mATX/ITX solutions for both AMD and Intel. So how do you choose which to go with?

Harrisson said:

Btw, "And how does he know that without benchmarking the system?". Why do you think he didnt benched it or at least havent got hard data before saying it? Do you honestly think OEM's still havent got samples to work on? I dont know if Rahool is right or wrong, but I dont have the reason to doubt him and I'm sure he have way more inside info than we do. If that said, he still didnt had the access to benchmarks info, then his wording was pretty bad, same as in other thread nmdante was saying one thing but it appeared he meant something else ;) 


I know OEMs have samples. But you do know that they don't need 3.0s, 2.0s, or even fully functional silicon to create a mobo, don't you?

I know he has more inside info than any of us. I also know that when a CPU performs well, the manufacturers, OEMs and vendors fall over themselves quoting comparative benchmarks. I also know that when a product doesn't perform that well, the manufacturers, OEMs and vendors revert to smoke and mirrors, obscure benchmarks, and the 'benchmarks aren't everything' tactic. Surely you know that?

Also note that in the post to which you are responding, I ask a lot of rhetorical questions. And I tell you what...if Barcelona outperforms C2D clock for clock by a significant margin, as one would hope it does, we'll see how much 'benchmarks don't matter' when AMD and the OEMs start the post-K10-release marketing blitz.


BTW, to what claims and 'exaggerations' are you referring? Please point them out so I can post the supporting links. Thanx
September 3, 2007 9:50:45 AM

One minute you tell me this is an enthusiast site ... benchmarks matter ... now you tell me enthusiasts don't go for SLI / CF.

Cmon turpit ... make your mind up.

High end gamers buy SLI / CF rigs and run them.

Your logic is in the toilet bud.

Or are you just trying to create a reality suited to Intel's current market position?

RE: MY LAST POST

Sure looks like it the way you're editing my posts ...

The reference was made to make a point regarding whether the user could afford a SLI rig ... who bought it?? Nothing more.

Often people will try to bend and twist things to suit their current circumstances. Look up Cognitive Dissonance Theory.





September 3, 2007 1:33:33 PM

I'll be happy with a good price, an innovative architecture (no glued-together die jobs like Intel), decent performance per watt and per clock, and if it RUNS COOL LIKE AN OPTERON SHOULD. This is most likely how AMD is planning to fight Intel, and they will have me as a customer (again).
September 3, 2007 5:35:26 PM

Reynod said:


RE: MY LAST POST

Sure looks like it the way you're editing my posts ...

The reference was made to make a point regarding whether the user could afford a SLI rig ... who bought it?? Nothing more.

Often people will try to bend and twist things to suit their current circumstances. Look up Cognitive Dissonance Theory.



I'm only going to say this once, and I'm going to do it publicly. I have edited 2 of your posts: one for use of an unacceptable word, which at the time was not yet included in the filters, and the second for a comment referencing someone's family member.
Now:
1) There are other mods who would have just banned you without discussion, because what you did was in violation of the TOS, without question.
2) Cognitive Dissonance Theory: I'm glad you know what that means. It means you'll understand this. When engaged in social situations, it is not how you interpret what you say that matters, but how other people interpret what you are saying to them. While you may have seen your 'mother' comment as innocent, other people would not, and I can guarantee that, because I have to read through hundreds of posts a day, watch the most idiotically insignificant remark turn into a flame war, and then waste hours going through the thread selectively editing and deleting. On that point, in reference to point #1, the reason other mods would have just banned you and deleted all your posts is that it would be easier and less time-consuming for them to do so. I haven't quite got to that point yet.
3) THERE IS NO REASON, ABSOLUTELY NO REASON WHATSOEVER, UNDER ANY CIRCUMSTANCES, TO BRING ANYONE'S FAMILY MEMBER INTO A DISCUSSION.
4) Since you were intelligent enough to look up Cognitive Dissonance Theory, I presume you are intelligent enough to realize what I've said is true, and that you are wrong, and that you owe me an apology.
September 3, 2007 6:49:22 PM

Reynod said:
One minute you tell me this is an enthusiast site ... benchmarks matter ... now you tell me enthusiasts don't go for SLI / CF.

Cmon turpit ... make your mind up.

High end gamers buy SLI / CF rigs and run them.

Your logic is in the toilet bud.

Or are you just trying to create a reality suited to Intel's current market position?

RE: MY LAST POST

Sure looks like it the way you're editing my posts ...

The reference was made to make a point regarding whether the user could afford a SLI rig ... who bought it?? Nothing more.

Often people will try to bend and twist things to suit their current circumstances. Look up Cognitive Dissonance Theory.


So, in your world, the only people allowed to talk about computers here are those who are enthusiasts with SLI/Crossfire setups, then?

That's completely wrong. I'm sorry, but not all enthusiasts 1) need SLI/Crossfire or 2) want SLI/Crossfire. The extra FPS aren't all that needed in today's gaming. Sure, you can crank the eye candy and run a game at insane resolutions, but how many people have 26+ inch monitors that run at 1920x1200, 2048x1080, or higher?

Quote:

4:3 Aspect Primary Displays (913966 of 1092675 Total Users (83.64% of Total) )
6" 75 0.01 %
7" 9 0.00 %
8" 32 0.00 %
9" 2 0.00 %
10" 625 0.07 %
11" 36 0.00 %
12" 890 0.10 %
13" 13,281 1.45 %
14" 28,391 3.11 %
15" 96,562 10.57 %
16" 249,428 27.29 %
17" 214,392 23.46 %
18" 83,309 9.12 %
19" 161,837 17.71 %
20" 28,943 3.17 %
21" 1,187 0.13 %
Over 21" 34,761 3.80 %

16:9 Aspect Primary Displays (173919 of 1092675 Total Users (15.92% of Total) )
7" 246 0.14 %
11" 1 0.00 %
12" 201 0.12 %
13" 369 0.21 %
14" 2,816 1.62 %
15" 22,324 12.84 %
16" 8,466 4.87 %
17" 11,031 6.34 %
18" 8,496 4.89 %
19" 30,451 17.51 %
20" 17,484 10.05 %
21" 11,159 6.42 %
22" 18,899 10.87 %
23" 1,494 0.86 %
24" 18,131 10.42 %
Over 24" 22,351 12.85 %

Primary Display Resolution (1092679 Users)
800 x 600 23,152 2.12 %
1024 x 768 400,597 36.66 %
1152 x 864 65,831 6.02 %
1280 x 800 446,804 40.89 %
1440 x 900 55,387 5.07 %
1600 x 1200 19,078 1.75 %
1680 x 1050 53,623 4.91 %
1920 x 1200 14,970 1.37 %
Other 13,237 1.21 %

From the Steam Stat page
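
To put those resolution shares in perspective, here's a quick tally in Python using just the percentages quoted above - the resolutions where SLI/Crossfire arguably starts to earn its keep add up to only about 8% of surveyed users:

# Rough tally of the Steam survey shares quoted above (percent of primary displays).
high_res_shares = {
    "1600 x 1200": 1.75,
    "1680 x 1050": 4.91,
    "1920 x 1200": 1.37,
}
total = sum(high_res_shares.values())
print("Share of users at SLI-relevant resolutions: %.2f%%" % total)  # prints 8.03%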

You are making all enthusiasts = high-end gamers, and that's not true. Some enthusiasts just want the raw CPU power to run 24/7 FoH, or even encoding/decoding.

I'd like to see how many people are actually running SLI/Crossfire setups compared to those gamers who aren't. I don't consider myself a high-end gamer, because I don't need the top-of-the-line stuff to run my games, but I am a computer enthusiast, no matter what your point of view of an enthusiast is.

Oh, and I'm running a simple E6700 water cooled with a Swiftech H20-Premium setup, 8800 GTS 320MB also water cooled with the Swiftech MCW60, 550W Mushkin PSU, 2 500GB HDs, 2 320GB HDs, X-FI Platinum, Antec P160 (soon to move to the Antec 900). Does that just make me a non-enthusiast?
September 4, 2007 6:00:37 AM

Quote:
LOL, what's the debate about? Sood sells neither Barcelona nor Phenom yet, as neither exists in the retail market. Hyping that which does not exist cannot impact sales of that which one cannot sell.
Sorry, I should've worded it differently, as a future product.
Quote:
Hyping a product and lying about it are different things.
Depends if he knows what the product is truly capable of or not.
Quote:
You question Rahul's credibility on the basis that he is also a salesman; that's BS
Why wouldn't I? There will be many other sources that have nothing to gain by sharing the info about the processor when reviews can be released. You would be a fool to rely on a salesman's word about any product they're selling.
Quote:
...and since he sells both AMD and Intel CPUs, he doesn't have a clear reason to be biased toward either manufacturer.
He has a bias towards any product he sells.
Quote:
If you don't trust him, bring up something more than "he is a salesman".
No, I think him being a salesman is enough. :) 
September 4, 2007 9:09:41 AM

Quote:
No, I think him being a salesman is enough.


Hate to butt in here (sorry), but there's a difference between a salesman who knows what he's talking about and one who will blind you with numbers but not know what they mean. Yeah, he's biased, but that doesn't mean he's a worthless source.

Some salespeople can be very knowledgeable about their subject and should be listened to (e.g. a record shop owner pointing you in the direction of music that you might like); others clearly should be ignored (like the idiot that always directs you to the top 40 whatever your musical preference... "this is selling very well to the target demographic").

From what I've read of Rahul Sood's blog, he's quite an intelligent person, and does know what he's talking about. The question is how much it relates to this particular discussion, and, given the context Sood is coming from, how much weight you need to give his conclusions.

They're interesting ideas - but not gospel.
September 4, 2007 10:54:28 AM

The point about SLI is that you can't call it an enthusiast site and then discount it ... is that simple enough???

If the best scores are posted for gaming by SLI / CF rigs ... then acknowledge them ... irrespective of cpu / mobo combination.

If Intel doesn't get an SLI / CF platform going then they will lose the "halo" effect generated by this.

I still think Nvidia must be painfully aware that though they have the highest-end single graphics card and a great chipset ... they are reliant on AMD and Intel for their continued support. Fine when you're in front ... not so fine when the competition catches up.

September 4, 2007 12:48:48 PM

Intel loyalists have the bragging rights at THIS point in time. It is undeniable and not even a question. I must add, however, that although most of the Intel loyalists seem to be the loudest and longest-winded in this forum, they do not seem to follow conventional wisdom, i.e. the most BANG for the least BUCK.

Surely any monkey can build a machine for bragging rights, and that one will gain his two minutes of glory and all the gibberish that accompanies such an endeavor. Arguments and discussions which are based on conjecture are pointless.

September 4, 2007 3:30:50 PM

So when most games today don't tax a single 8800, even at incredibly high resolutions, and Intel is the #1 seller of graphics cards (those are all integrated), there are actually people out there that think SLI is something that's necessary for a processor's success? Wow...

Intel's taking GPU people because they have space. With multiple cores (i.e. 8 or higher), since normal OS processing is pretty much taken care of by 1 or 2 cores, the logical next step is to make the extra cores specialized. Now, since Intel's chipset-to-processor attach rate is incredibly high, they don't need to rush GPUs into the processors, but they could do it at the chipset level if they'd like (they're already doing it). If you have a dual-core NB chip, one entire die of the chipset could be dedicated to graphics - which in today's chips would probably run as well as something from the Nvidia 5000 or 6000 series. Not bad for a ten-dollar chip. There's no reason to have a behemoth 4-inch-by-4-inch 80-core processor that is completely dedicated to normal CPU operations. Intel knows this and is anticipating the future.

Nvidia's pretty happy to have a larger chunk of the stand-alone graphics card market, since AMD's acquisition of ATI pretty much killed the high-end ATI market (way to go AMD - people who love AMD for not allowing Intel to have a monopoly should be pissed at what they did to the high-end graphics market by destroying ATI).
September 4, 2007 3:31:16 PM

BaronMatrix said:
Not to be a stickler but the AMD system is running Generic GDI while the Intel system is running a 7950GX. Also, if you notice, the 2.4GHz Intel chip scores THE SAME as the 3.0GHz Intel, around 17000.

Hmmmm.


Hmmmmm...

Xeon 2.4GHz scored 17295 in Cinebench...
Xeon 3.2GHz scored 22479...

Where does your "2.4GHz chip scores the same as the 3.0GHz chip" come from? DELETED

EDIT: my apologies.