
Barton 2500+ Preview

Tags:
  • CPUs
February 6, 2003 12:08:09 AM

<A HREF="http://www.thetechboard.com/reviews/barton.php" target="_new"> Click to read </A>

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>


February 6, 2003 1:12:38 AM

hmmm. bit too preliminary to draw any conclusions... a damn fine overclocker though :smile:

methinks the XP2500+ will be a hot item.

<b>My Computer is so powerful Sauron Desires it and mortal men Covet it, <i>My Precioussssssss</i></b>
February 6, 2003 2:23:57 AM

not enough details yet, but looks promising. Scaling? When are we going to see that!??! We all want a 2500+ Barton to go from 1.83GHz to 2.8GHz ^^

Instead of Rdram, why not just merge 4 Sdram channels...
February 6, 2003 2:56:38 AM

I want one... If anyone who has one reads this I will be your love slave for it

:evil:  Wow, if he's here who's running hell? :evil: 
February 6, 2003 7:59:44 AM

Now that's asking for trouble :smile:

:eek:  The rumours are true. <b>Eden</b> likes to help <b>Svol</b> make snowballs :eek: 
February 6, 2003 8:05:21 AM

you only wish you had one so I could be your lover :p 

:evil:  Wow, if he's here who's running hell? :evil: 
February 6, 2003 1:59:19 PM

This is not the "Other" forum

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>
February 6, 2003 2:29:02 PM

Quote:
hmmm. bit too preliminary to draw any conclusions... a damn fine overclocker though :smile:

I disagree vehemently. The article in no way indicates good OCing potential.

It <i>seems</i> like a good overclocker, until you look at those Comanche4 benchmark results. The overclock got a whole 0.05% performance increase in frames/sec and a whole 0.06% performance increase in triangles/sec. Yet it's clocked 20.84% faster <i>and</i> that's with a raised FSB (meaning that it has more bandwidth). And no, I didn't forget to multiply that 0.06% by 100. It really is six hundredths of a percent.
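
For anyone who wants to check the math: percentage gain is just (new - old) / old * 100. A quick sketch in Python (the frame rates here are placeholders for illustration, not the review's actual scores, and the overclocked clock is simply the value implied by the 20.84% figure):

    # Quick sketch of how the percentages above are computed.
    # These frame rates are placeholders, NOT the review's actual scores.
    stock_fps = 40.00      # hypothetical stock Barton 2500+ result
    oc_fps = 40.02         # hypothetical overclocked result
    fps_gain = (oc_fps - stock_fps) / stock_fps * 100.0
    print("FPS gain: %.2f%%" % fps_gain)         # ~0.05%

    # Barton 2500+ runs at 1833MHz; 2215MHz is just the clock implied by 20.84%.
    clock_gain = (2215.0 - 1833.0) / 1833.0 * 100.0
    print("Clock gain: %.2f%%" % clock_gain)     # ~20.8%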

Obviously something is wrong. A 20.84% clock speed increase should give more than a 0.06% performance gain. As far as I can figure, there are only three possible answers:

1) The dolt who did the benchmarking was using insufficient cooling, and while running real-world apps where the CPU actually warms up, it was already being throttled. (Based on the fact that the non-OCed Comanche benchmark scores were virtually identical to the OCed scores, meaning both scores were so similar because the CPU was being throttled down to a 'safe' level both times.)

2) The Barton was right at the very edge of its performance, so much so that when OCed it was throttled right back down to the same performance as when not OCed.

3) AMD has implemented a new way to prevent OCing by ensuring that the performance is the same no matter how you externally clock the chip. (With the possible exception of performance gains from a higher FSB.)

These are, of course, in the order that I believe likely, meaning that my impression of the reviewer is awfully low. Either way though, no matter how you cut it, that review makes its own validity extremely questionable. I personally don't think that the review in any way indicates what the Barton is or isn't capable of.


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
February 6, 2003 2:49:44 PM

Maybe it is a stupid point, but could the GeForce <b>2</b> Ti be the reason for the bad scaling? I wonder what the result would have been if an ATI 9700 Pro had been used.

With a config like that, I can only give some credit to pure CPU benchmarks, not whole-system benchmarks.


Still looking for a <b>good online retailer</b> in Spain :frown:
February 6, 2003 3:21:09 PM

Uh, the Comanche benchmark is a 3D test that pushes the video card and the system. It is not a pure measure of the CPU. The video card is the limiting factor in this case, not the platform.

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
<i>Edited by paulj on 02/06/03 09:30 AM.</i>
February 6, 2003 7:49:01 PM

baldurga and paulj, I do not believe that the graphics card put up an impenetrable wall. Yes, indeed it does hinder the performance gain possible. However, I have <b>never</b> seen a point where any game cannot be sped up by a faster processor, regardless of how crappy the graphics card is.

Think of all of those systems with dinky onboard graphics. Even <i>they</i> see a gain in performance from faster processors, and you <i>know</i> that their graphics system is maxed out.

Yet in this case, the game sped up a whole 0.05%, or in effect, it didn't improve <i>at all</i>. Even with a maxed out graphics card, it should have seen <i>some</i> improvement. So it didn't have anything to do with the graphics card. The Barton was simply hitting some sort of a wall. Now, whether that wall was the fault of the reviewer or of AMD is all that really remains to be answered.


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
February 6, 2003 11:10:53 PM

So clock for clock the Barton is better, but XP rating for XP rating it's worse? Yet AMD's PRICES are based on XP ratings, not clock speed! So unless they can produce one with a higher clock speed (say, 2200MHz) they still aren't progressing!

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 6, 2003 11:24:11 PM

Quote:
It seems like a good overclocker, until you look at those Comanche4 benchmark results. The overclock got a whole 0.05% performance increase in frames/sec and a whole 0.06% performance increase in triangles/sec. Yet it's clocked 20.84% faster and that's with a raised FSB (meaning that it has more bandwidth). And no, I didn't forget to multiply that 0.06% by 100. It really is six hundredths of a percent.

Comanche4 sucks in terms of pulling faster fps. It's one of the worst optimized games of today. Even with the fastest video card, you might expect very little fps gain in this game. In this case, the reviewer is using a GF2 Ti, and Comanche4 makes full use of DX8. So it's certain the video card is not letting the CPU scale properly.

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>
February 6, 2003 11:52:00 PM

Right now on pricewatch.com you can get an XP 3000+ (2.16GHz) 333FSB Barton for <font color=red>$624</font color=red>!

Now I can get my 2100+ to 2.1GHz... and it cost me $98. Don't see people jumping on this unless they are numbers freaks...
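
Quick back-of-the-envelope on the value angle (a rough sketch only; the $624, $98, and 2.1GHz figures are the ones quoted above, and comparing an overclocked chip to a stock part obviously isn't apples to apples):

    # Dollars per GHz, using the numbers quoted above (very rough comparison).
    print("Barton 3000+: $%.0f per GHz" % (624.0 / 2.16))   # ~$289/GHz
    print("OC'd XP2100+: $%.0f per GHz" % (98.0 / 2.1))     # ~$47/GHz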

<A HREF="http://tekkoshocon.com/" target="_new">http://tekkoshocon.com/&lt;/A> Southeast Pennsylvania gets an Anime Convention!
February 7, 2003 12:22:59 AM

Sorry Slvr, but I think it can be entirely the card here. Look at the THG VGA Charts 2, and look at the disparity between the XP2700 and the 3.06GHz for one card at the low end.
<A HREF="http://www6.tomshardware.com/graphic/20021218/vgacharts..." target="_new">http://www6.tomshardware.com/graphic/20021218/vgacharts...;/A>
Here we see the Radeon VE getting a very small 0.1% boost at most when going to the XP2700 from the 3.06GHz. Even though the 3.06 was badly fitted with the RAM and all, it still shows that even with such a significant clock speed difference, and all the bandwidth over the AthlonXP, it still does little. Jedi Knight II also only starts to bottleneck CPUs later on.

Therefore these results, if truly on a cheap GF2, are in fact normal. I would've been surprised if it were on an R300.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 7, 2003 1:19:35 AM

Yes, pricing based on XP numbers means that AMD processors at the higher speeds are not the value people claim them to be, because the XP rating excludes any performance advantage.

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 7, 2003 1:29:45 AM

lol what gay site used a geforce 2 to test Barton
February 7, 2003 2:07:45 AM

Additionally, I looked up an even more telling article, the mainstream comparison. Notice here <A HREF="http://www6.tomshardware.com/graphic/20030120/vgacharts..." target="_new">http://www6.tomshardware.com/graphic/20030120/vgacharts...</A>
how moving from a 1GHz T-Bird to an AthlonXP 2700, more than twice the clock speed and, theoretically, the performance, on a GeForce 2 MX yielded no more than a 7th of a frame.
Therefore I believe you are wrong in saying the Barton has bad scaling based on this very negligible increase in performance in Comanche, and I believe these scores are more than normal when the video card was long ago the bottleneck. And here we are talking about a small ~25% clock increase for the Barton overclock.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
<i>Edited by Eden on 02/06/03 11:09 PM.</i>
February 7, 2003 2:41:45 AM

Look, it's been said before: when you're testing for CPU scaling, you have to take the vid card factor out, so the games and benchies aren't vid-reliant. Hence, yes, the GF2 Ti was holding it back somewhat here. Although, when you look at a lot of benches like Winstone and SiSoft, all that theoretical and synthetic crap, the vid card won't have any say in it. Also, I never believed for once that these would have a perfect scaling ratio of 100 percent performance to MHz for each increase. Nice overclock nonetheless.

Instead of Rdram, why not just merge 4 Sdram channels...
February 7, 2003 10:51:09 AM

Yes, but I was replying to Slvr's comments about the Comanche benchmark. I believe it is obvious the card was the limiting factor there, and the reviewer shows a very poor grasp of reviewing if he ran a benchmark that scales like this on an old card.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 7, 2003 1:15:04 PM

Sorry Eden, but I still think you're mistaken.

Quote:
Notice here http://www6.tomshardware.com/graphic/20030120/vgacharts...
how moving from a 1GHz T-Bird to an AthlonXP 2700, more than twice the clock speed and, theoretically, the performance, on a GeForce 2 MX yielded no more than a 7th of a frame.

Only a 0.7FPS difference, yet that's still a 4.07% increase in performance, which is a <b>lot</b> larger than a 0.06% increase. Further, you're talking about an MX card. If you use that same graph to look at a GF2Ti, you're getting an 18.15% increase, which is a <i>hell</i> of a lot better than 0.06%.

Now, to be fair, let's factor in the differences. An AXP 2700+ is <i>theoretically</i> equivalent to a 2.7GHz T-Bird, meaning that it is about 170% faster. The Barton OC was only 20.84% faster. So if we multiply the 18.15% increase for the GF2Ti listed above by (20.84 / 170.0) to normalize the result to the Barton OC, we still get an expected performance increase of about 2.2%, which is a heck of a lot larger than the actual 0.06% recorded.

Even if we take your GF2MX difference of 4.07% and normalize it in the same way, we get a 0.5% increase on the GF2MX, which is still nearly ten times more than 0.06%.
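
Here's that arithmetic spelled out in Python, in case anyone wants to rerun it (only the numbers already quoted above, nothing new):

    # Normalizing the THG chart gains down to the Barton's 20.84% overclock.
    gf2ti_gain = 18.15            # % FPS gain, 1GHz T-Bird -> AXP 2700+, GF2 Ti
    gf2mx_gain = 4.07             # % FPS gain for the GF2 MX from the same chart
    clock_ratio = 20.84 / 170.0   # Barton OC increase vs the ~170% T-Bird jump
    print("GF2 Ti, scaled: %.2f%%" % (gf2ti_gain * clock_ratio))   # ~2.2%
    print("GF2 MX, scaled: %.2f%%" % (gf2mx_gain * clock_ratio))   # ~0.5%
    # Both are still far above the 0.06% the review recorded.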

So sorry Eden, but 0.06% is just <i>way</i> too low for me to believe, no matter what the graphics card. Even the GF2MX should have done nearly ten times better than that.

I'm not saying that the 20.84% OC should have given a direct 20.84% FPS increase. Heck, I'm not even saying that it should have given a 1% FPS increase. I am, however, saying that the recorded 0.06% increase was <i>way</i> too low. It indicates that <i>something</i> wasn't quite right.

Especially when you consider that all of the other benchmarks were <i>synthetic</i> benchmarks. The <i>only</i> real-world benchmark is the one that seems off. To me, it just indicates that we can't really put any stock in the validity of that review.

That's all I'm saying: I trust that review about as far as I can throw it. If you want to trust it, fine, go ahead. Just remember that even the author admits his benchmarks are lame.
Quote:
<font color=blue>Update 02/05/03: There was some discussion about my "lame" benchmarks on Anandtech. Do I think they're lame? Compared to others I've done and seen, yes.</font color=blue>

After an endorsement like that from the very author of the article, do I even have to say any more?


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
February 7, 2003 2:14:52 PM

Quote:
I am however saying that the recorded 0.06% increase was way too low. It indicates that something wasn't quite right......

To me, it just indicates that we can't really put any stock in the validity of that review.

Well I think we can all agree that something was wrong with this example of benchmarking. Why use a GF2 anyway? I feel guilty for having even discussed this benchmark. :smile:

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
February 7, 2003 2:17:39 PM

I think it's ironic that probably everyone in this thread would do the same thing if they owned this system: get a new video card.

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
February 7, 2003 3:08:04 PM

Quote:
So clock for clock the Barton is better, but XP rating for XP rating it's worse?

Quote:
Yes, pricing based on XP numbers means that AMD processors at the higher speeds are not the value people claim them to be, because the XP rating excludes any performance advantage.

It seems that way in the synthetic benchmarks but not in the application benchmarks (unfortunately there is only the Comanche benchmark).

You still have to compare benchmarks relevant to your individual applications regardless of the XP rating or P4 speed and then compare price. But you may be right.

The game will change again when we go to 400MHz (AMD) and 800MHz (Intel) FSB. :smile:
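
For reference, the peak theoretical bandwidth those FSB speeds work out to, assuming the usual 64-bit (8-byte) front-side bus (a sketch, not a benchmark):

    # Peak FSB bandwidth = effective FSB clock x bus width (8 bytes).
    bus_bytes = 8
    for name, mhz in (("AthlonXP, 400MHz FSB", 400), ("P4, 800MHz FSB", 800)):
        print("%s: %.1f GB/s" % (name, mhz * 1e6 * bus_bytes / 1e9))
    # ~3.2 GB/s vs ~6.4 GB/s.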

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
February 7, 2003 6:52:25 PM

I just wish they wouldn't always lower the clock speed so much whenever they increase the IPC, in order to pad the XP rating even further. Would it hurt if they had the XP2800+ at 2.25GHz become a Barton and stay at 2.25GHz? It'd sure as hell compete better!

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 7, 2003 7:10:24 PM

I preferred the old days when AMD simply said "our product is faster". I even liked the bullet train commercial. Then they went from faster to cheaper with the XP rating idea. Then they jacked up the prices so that they are no longer faster nor cheaper? I thought they wanted to INCREASE market share?

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 7, 2003 7:23:46 PM

Crash, while you have an idealistic view, the reality is different: the XP2700, despite its price, is selling very well! Many here even have the XP2800!
It seems to me people are still ready to pay for AMD CPUs despite the higher prices. I am eagerly awaiting the reaction to the XP3000+, with its P4 top-of-the-line price. If it sells, I'll really be surprised; it'll show quite well how many AMD fools there are.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 7, 2003 7:34:06 PM

Quote:
Then they jacked up the prices so that they are no longer faster nor cheaper?

Are you kidding? The <i>only</i> reason that they came up with that silly XP rating system was to jack up the prices in the first place. It's hard convincing a customer that they should pay more for a 1.33GHz CPU than for a 1.4GHz CPU just because it performs better. And that was just within their own product line. Then try convincing the customer to pay more for their 2GHz CPU than for Intel's 2GHz CPU.

Sure, AMD <i>could</i> have stuck with the "our product is faster" mentality and remained honest. It was too much work though. So instead of trying to explain why you should pay more for their slower-clocked CPUs, they just made up some numbering system that looked so much like MHz that they could sell their slower-clocked processors for a LOT more than an equivalently clocked P4 (or T-Bird if you want to believe that...).

It was always about jacking up the prices. AMD didn't want customers thinking that they could pay considerably less for an AMD CPU with the same performance as an Intel CPU, because then AMD would be losing out on a lot of <font color=green>$$$</font color=green>.

I'm not saying that AMD didn't deserve the money, but I'm not convinced that the means were justified by the end.

And now this whole Barton thing is really stretching the limits of what is and isn't acceptable. It was bad enough that AMD offers two 2600+s (one at 2083MHz with a 333MHz FSB, one at 2133MHz with a 266MHz FSB). Now they offer the Barton at 2500+ with a much lower clock speed, and yet have it outperform a 2600+. Is AMD even paying attention to its own rating system?
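
For the curious, those two 2600+ clocks fall out of the multiplier times the actual bus clock (the "266/333" FSB labels are DDR figures, so the real bus clocks are 133MHz and 166MHz). The multipliers below are the commonly quoted ones, not something from the review, so treat this as a sketch:

    # Core clock = multiplier x actual bus clock.
    print("2600+ / 266FSB: %.0f MHz" % (16.0 * 133.33))   # ~2133 MHz
    print("2600+ / 333FSB: %.0f MHz" % (12.5 * 166.67))   # ~2083 MHz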


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
February 7, 2003 7:39:09 PM

I wouldn't necessarily say fools. If everything is equal in terms of processor value, there's still the motherboard to consider, and the nForce2 is a really great chipset. Of course there are people who would pay P4 prices for an XP, slap it on a VIA chipset board that costs as much as an i850E board, and use memory that costs more than PC1066, but we'll leave those people out of it.

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 7, 2003 7:51:41 PM

Hmmm, I don't know. I've always liked "underdog" companies because they help keep everyone "honest", so to speak (they stop price gouging). And AMD has had some nice technology for a while now.

You don't increase market share by going for max profit, you do it by offering the best value. It's increasingly obvious that AMD is no longer in "growth mode". If you don't grow, you die... AMD had better get busy or this'll be another K6 fiasco by fall.

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 7, 2003 9:28:22 PM

I understand AMD creating the rating system. If I had a slower-clocked processor that performed like my competitor's faster-clocked one, I would want to get the same price as my competitor.

As I mentioned before, you take your money and see how much power you can buy. Who cares if they call it an XP2700+ or a 2.17GHz processor? The rating is a marketing thing that allows them to get more sales at a higher price from people who know no better. I don't buy an Asus motherboard because the model number is bigger than an Abit motherboard's. I buy because I get the performance/price I want.

You might even say that AMD is helping ignorant people by creating a rating system. People who are ignorant of AMD performance might buy an XP2000+ vs. a P4 2.0GHz. But they probably wouldn't buy a 1.67GHz Athlon vs. a P4 2.0GHz. So without the rating system they would pay more money unnecessarily.

It doesn't frustrate me at all that a Barton XP2500+ may be slower in some respects than a Thoroughbred XP2400+. There is no perfect way to rate a processor that makes sense in every benchmark.

In fact, I don't know why anyone in here would complain that the Barton XP2500+ is clocked slower than the Thoroughbred XP2400+. It makes no more sense than complaining about the P4's inflated clock speed, which performs much worse clock for clock than any AMD chip of late. Also, the Athlons below 2400+ are still cheaper than their P4 counterparts. So most people can get an Athlon cheaper than a P4.

Ideally there would be a standard test method that would indicate the raw performance of any CPU. Unfortunately it wouldn't be accurate for every mobo, every application, every RAM, every HD, etc. And Intel and AMD would never agree on the motherboards to use for testing. Would it be an RDRAM system or DDR, IDE or SCSI, RAID, etc.?

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
February 7, 2003 9:34:59 PM

You said a mouthful there. Congrats!!!!

I'm still learning & having fun doing it!! Trouble comes with the things you forget or overlook along the way that make it not so fun!!
February 8, 2003 9:53:31 PM

I just think that the guy can't benchmark. :smile:

And I wonder about all that stuff about how the mobo must be 45A certified...
I know my 8K3A+ isn't, but as I can quite happily run an overclocked processor putting out close to 100W, I'm sure I will be able to use one :)
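
Rough numbers on that 45A figure (a sketch only, assuming a Vcore somewhere around 1.65-1.75V and ignoring VRM losses):

    # Current drawn at a given CPU power and core voltage: I = P / V.
    power_w = 100.0                  # the ~100W overclocked figure above
    for vcore in (1.65, 1.75):
        print("%.2fV: about %.0f A" % (vcore, power_w / vcore))
    # Roughly 57-61 amps, which is what that 45A certification is about.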

<b>My Computer is so powerful Sauron Desires it and mortal men Covet it, <i>My Precioussssssss</i></b>
December 30, 2009 2:07:46 PM

For the love of god, WHY is this thread linked in the latest "5 gaming cases reviewed" article here: http://www.tomshardware.com/reviews/gaming-case-review,... as one of the "topics being discussed in the forums"?

Honestly, site/forum integration is fine in theory, but if you're a tech website you have no excuse for this kind of failure.
December 31, 2009 12:10:02 AM

^I see why you brought this up, and yes, this site has some problems. I blame it on the French admins that run this site. :lol:

anyways:
THIS THREAD IS DEAD DO NOT POST HERE
December 31, 2009 12:14:09 AM

.
December 31, 2009 1:02:44 AM

Wow this brings back memories.

I remember my AMD AthlonXP 3200+. I had her paired with an nForce2 board. The Asus A7N8X Deluxe.

Fun times.
December 31, 2009 5:45:19 AM

I'm still using my XP2400+, primarily for internet.
December 31, 2009 9:48:39 AM

Crashman said:
So clock for clock the Barton is better, but XP rating for XP rating it's worse? Yet AMD's PRICES are based on XP ratings, not clock speed! So unless they can produce one with a higher clock speed (say, 2200MHz) they still aren't progressing!

I agree, AMD are foolish if they don't. But I have high hopes for K8 when it comes out later this year. We'll see how that pans out.
December 31, 2009 9:56:23 AM

barton was revolutionary.
!