ATI's Radeon 2600 XT Remixed

September 13, 2007 11:20:55 AM

http://www.tomshardware.com/2007/09/13/the_radeon_2600xt_remixed/index.html

On release the 2600 XT cost too much, given its performance. Newer models cost around $100, but they're equipped with slower memory. How do they perform?
September 13, 2007 11:45:59 AM

Slower
September 13, 2007 1:33:20 PM

Unknown because you did not do any tests!
I presumed the whole article was to see how these cards did compared to the cards with the faster memory.

Would it not make sense to test these cards against one of those?
This was a "slow" 2600XT vs 8600GT.
There was no attempt to address this question.

And no, you can't reuse old benches since the system is different, the drivers are different, among other factors.

It was nice to see how well these cards did for the price vs the 8600GT. However, I would really like to know what I would get if I bought a less crippled 2600XT.
September 13, 2007 2:49:26 PM

I was more concerned with seeing whether the 2600 XT is a good buy than with the impact of a roughly 10% memory clock speed difference.

All of the 2600 XTs I can find under $130 have 700 MHz memory. The 800 MHz models - as well as the GDDR4 models - are priced too high to matter, because they're butting heads with the X1950 PRO, which will kill them. Because of this, the 800 MHz models are irrelevant as a purchase option.
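
As a rough back-of-the-envelope check on what that clock gap means for bandwidth (a sketch only, assuming the usual 128-bit bus on these cards, double-pumped GDDR3, and that the listed 700/800 MHz figures are the base memory clock):

    # Rough memory bandwidth estimate for a 128-bit GDDR3 card
    # (assumption: the listed MHz is the base memory clock)
    bus_bytes = 128 // 8                 # 128-bit bus -> 16 bytes per transfer
    for clock_mhz in (700, 800):
        # GDDR3 transfers data on both clock edges, hence the factor of 2
        bandwidth_gb_s = clock_mhz * 1e6 * 2 * bus_bytes / 1e9
        print(f"{clock_mhz} MHz memory -> ~{bandwidth_gb_s:.1f} GB/s")
    # 700 MHz -> ~22.4 GB/s, 800 MHz -> ~25.6 GB/s; the 700 MHz cards give up
    # roughly an eighth of the 800 MHz cards' bandwidth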

Having said all that, if I had one lying around the lab I would have included benches for it, but I didn't consider it a priority. If this article gets a large response, though, I'll try to do a follow-up article about the DDR2 8600 GTs, and I'll include both 8600 GTs as well as the different flavors of 2600 XTs (even the GDDR4 version) out there, if I can get my hands on 'em.
September 13, 2007 2:50:15 PM

Agreed. I really wanted to see how they perform against the reference clock versions.
September 13, 2007 2:52:38 PM

Cleeve beat me by a minute. I didn't realize the 800MHz models were all priced that high. Good to know.
September 13, 2007 2:54:56 PM

On a side note, I just scoured Newegg for 2600XTs with 800 MHz memory... I didn't find a single one. All 700 MHz memory.

The only exceptions are the GDDR4 versions (which are all above $130), and an overclocked version by HIS that, according to the listed specs, has 960 MHz DDR2 (also above $130).

I don't think anyone's manufacturing 2600 XTs with 800 MHz memory anymore...
September 13, 2007 3:32:03 PM

I know it's not really the point of the article, but while you seem to be taking requests :) I would have liked to see a 7600 GT and an X1650 XT in there for comparison.
Mactronix
September 13, 2007 3:40:51 PM

I might include a 7600 GT I have around next time. I don't have any X1650 XTs though.
September 13, 2007 3:51:39 PM

After playing with the older DDR3 2600XT (from Sapphire) for a while, I don't really have much in the way of complaints...

...apart from no ATITool support...

...and no one knowing what to mod for volts...

...and a buncha other stuff...

...although it's a good card, I feel...

...could still be better.
September 13, 2007 3:53:47 PM

I suspect the ATITool guys will fix those concerns. I think the new beta allows for 2900 XT voltmodding...
September 13, 2007 4:38:57 PM

I'm sorry to say:
using incorrect aspect ratios leads one to question the validity of the entire article.

To clarify:
1280x1024 is 5:4 not "(standard 4:3 ratio)"
1400x1050 is 4:3 not "(widescreen ratio)"
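
(A quick sanity check of those ratios - reducing each resolution to lowest terms - for anyone who wants to verify:)

    from math import gcd

    # Reduce each resolution to its simplest width:height ratio
    for w, h in [(1280, 1024), (1400, 1050)]:
        g = gcd(w, h)
        print(f"{w}x{h} -> {w // g}:{h // g}")
    # 1280x1024 -> 5:4, 1400x1050 -> 4:3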

Mistakes like these throw a shadow of doubt over the writer's technical knowledge, and therefore over the entire article.
September 13, 2007 4:47:04 PM

Snyper, if you think it's sensible to sweepingly judge the validity of solid testing and reasonable conclusions because of an insignificant (and, for all intents and purposes, irrelevant) resolution misprint that has absolutely no bearing on test results... well, you be my guest.

Sorry folks, I meant to say 1440x900. Time to chuck the article in the garbage! :ange: 
September 13, 2007 5:09:57 PM

Again - sorry
It was an unfortunate misuse of terms - perhaps bordering on standard BS techniques - that I was constructively pointing out.
I suppose that not knowing monitor resolutions may be considered irrelevant in some circles - but this is supposed to be a technical hardware review. There are people out there who will read an article like this, adopt the information as fact, and propagate the errors.

Sorry Cleeve, I was just trying to help for future articles.
September 13, 2007 5:13:47 PM

snyper said:
I'm sorry to say:
using incorrect aspect ratios leads one to question the validity of the entire article.

To clarify:
1280x1024 is 5:4 not "(standard 4:3 ratio)"
1400x1050 is 4:3 not "(widescreen ratio)"

Mistakes like these throw a shadow of doubt over the writer's technical knowledge, and therefore over the entire article.


No offense, but those are strong words coming from a person that has 2 posts.

The newbies show no respect any more. :D  j/k.
September 13, 2007 5:19:47 PM

Yes indeed - a newbie, as you say - to this forum.

Perhaps more valid for it, as I have a fresh viewpoint to present?

I do however find it strange that misinformation is somehow considered irrelevant

and that it is unacceptable to have someone attempt to correct obvious errors

How many posts are required before it is acceptable to be correct?
September 13, 2007 5:24:35 PM

It has nothing to do with you correcting it. What matters is that you tried to call the entire article into question over a TYPO. If you think Cleeve is an idiot, then say it. Otherwise, just let him know about the mistake and don't say the whole article is in doubt.
September 13, 2007 5:25:52 PM

You didn't see the just kidding at the end of that post, did you?

I think Cleeve's point was not that the error was irrelevant, but that it was negligible. Call me stupid, but I am more interested in the resolution of the monitor than the aspect ratio.

The only problem I had was with how you said what you did. Instead of questioning his tech knowledge, you could have simply asked if it was a typo (which it was) and Cleeve would have fixed it.

Anyway, don't take what I said personally. I realized after I posted it that it might ruffle your feathers a bit. My bad.
September 13, 2007 5:38:40 PM

Just to clarify - a standard technique I use when scanning articles is to skip ahead a number of pages to the conclusions (to determine whether something worth reading may be contained) and then go back into the article. I do this because a large number of articles are simply rehashes with very little new content. I suspect I am not abnormal in using this technique.
Most 20" monitors will be 1600x1200 or 1680x1050 widescreen, with only a few at 1400x1050.
Most 19" monitors are 1280x1024 or 1440x900 widescreen.

I appreciate the effort involved in testing/reporting etc., and simply do not want to see that effort undermined by a few easily corrected inaccurate statements.
September 13, 2007 6:49:20 PM

I'd consider it pretty irrelevant because neither the 1440x900 nor the 1400x1050 resolution was used in any of the tests. This oddball resolution was only mentioned in a side comment about what monitors might be paired with video cards in this price category; in that case I was talking about a 19" widescreen (1440x900) and a 19" standard (1280x1024), which push a very similar number of pixels. This tidbit really has no impact on the review whatsoever.
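
(To put numbers on "a very similar number of pixels" - a quick check:)

    # Total pixels pushed at each of the two 19" resolutions mentioned above
    wide = 1440 * 900         # 19" widescreen
    std = 1280 * 1024         # 19" standard 5:4
    print(wide, std)                                   # 1296000 vs 1310720
    print(f"difference: {abs(std - wide) / std:.1%}")  # ~1.1%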

Is it a mistake? Sure! I'm human. I'll make the odd mistake, and if you point it out I'll happily take it to heart.

But suggesting that it throws doubt over the validity of the entire article... I mean, come on. It's an insignificant typo that has absolutely nothing to do with the testing and conclusions.

Of course I'm not going to respond happily to that kind of insinuation. All you had to do was point the mistake out, that's constructive. I don't think the exaggerated accusations were called for though.
September 13, 2007 7:26:25 PM

sigh - the human language is so fickle - it requires both parties to participate.

I will try once again.

I consider myself to be one of a great many (a typical reader)
I consider you to be one of few (the ones who take the time to provide information through your writings)

I have outlined my (and I suspect, many other's) reading techniques

I was guessing that your intent was to have as many people as possible "GET" your message

When a message is clouded - as I pointed out - it is not as well received as it could/should be.

This forum is probably read by a tiny minority of all those who might read the article, and imho they are not representative of the 'general public'.

It would certainly be possible to make allowances for typos and human error - in this cut-and-paste world it is so easy to paste the wrong info.

My point again?
Just trying to help the writer perfect his trade
so that it provides the most benefit to those I'm guessing he/she is trying to reach - the general reader.

Accusations? Insinuations? Once again the human language has failed me - I thought I had presented FACTS
with a possible conclusion as to the consequences of misrepresentation

At no point did I claim your works were inaccurate as I myself have not gone to the effort of testing. I simply stated that I experienced doubt.

For example
when seeking advice about a sore knee
I will be less inclined to believe someone who is examining my elbow while asking where it hurts

Now, that is an exaggeration and a bad example, but it DOES hopefully get my message across.

I don't have a dog in this race so will ignore your attempts to inflame the intent of my posts.
Just trying to help for the future
September 13, 2007 7:49:29 PM

snyper said:
sigh - the human language is so fickle - it requires both parties to participate.
I thought I had presented FACTS


You did, but they were irrelevant.
September 13, 2007 8:30:47 PM

Well, I just bought a "regular" Sapphire 2600XT and I think this one has 700 MHz memory as well. It was much cheaper than the 8600GT though, and I bought it for BioShock.
Also, may I ask why the 7.9 drivers were not used? Or was the article put together right before their release?

Good article, though.
September 13, 2007 8:33:46 PM

lol - thanks spuddyt

perfect example of a failing of communication
:) 

Now that you all have determined that I was way off base - can we move on to 'relevant' discussion?

An article that I would like to see would be written from the point of view of a typical user. Let's pick, for example, that big community playing World of Warcraft, and try to determine what would be the best-value hardware at different price points - all based on what monitor the end user wants to run. Since the monitor would be the determining factor for required video/computing power, and there are fixed steps in required performance, I would think it very relevant,
i.e. if you have a 19" monitor you need X video card/CPU.

And before I'm flamed for the WoW example - it could be argued that it IS a popular title, as the ongoing subscriptions prove.

Since we are talking "value" cards for the masses
September 13, 2007 8:58:18 PM

Why would AMD release a new part that costs more than the old one and does not perform as well? lol
September 13, 2007 9:53:14 PM

I really don't wanna take sides, but... the little typos and editing mistakes that run rampant on the net take away some of the net's credibility. I still have people say things to me like "where did you hear that? on the internet?", meaning they don't give the net any real respect (like the television is some bastion of truthfulness :sarcastic: ). Attacking someone because they are pointing out the lack of editing and oversight on the net seems kinda counterproductive, IMHO. Snyper may have written something that didn't sit well with the writer, but I believe he was truly being altruistic in his desire to point out a mistake.

I recently read an article somewhere on Tom's where the writer called 1366x768 720p, so I took the rest of the article with a grain of salt, because the writer must not have ACTUALLY tried using an HDMI cable. Was my rush to judgement called for? Probably not, but that was my gut feeling about the article after seeing that. So I have to agree - little mistakes tend to flush the writer's credibility down the tubes with the more educated readers. And I'm not looking for a flame war - just adding my 2 cents.
September 13, 2007 10:13:23 PM

Once again - I certainly don't have a problem with having my mistakes pointed out. And yes, I will make them. That's the kind of guy I am; I'm not perfect.

I like to think my conclusions are for the most part pretty solid, though. The gist of it - the conclusion, what a person takes away from an article - is what I'm going to concentrate on. To be honest, I'm not sweating the resolution typo all that much. It has very little bearing on the point of the article. Now, if it turned out that I was wrong on pricing and the competing 8600 GT card was $20 cheaper than I thought, hell, I'd be damned ashamed of myself. That would have a profound effect on the conclusion.

Now Mr. Snyper felt the need to point out my mistake, and I'll reiterate: that's a good thing. If mistakes aren't pointed out, they can't get fixed, and might be repeated in the future. I think we can all agree on that.

I just don't think pointing out the mistake necessarily had to be accompanied by what I see as a derogatory statement like that. What kind of value did that statement add? Did it assist me in learning of the mistake? No.

We can argue it up and down, but I can guarantee this: that if his post simply and respectfully mentioned the mistake - instead of suggesting that the rest of the article was suspect because of it - all of this extra stuff wouldn't have happened.

I'm sorry, but I don't see the value of the derogatory comment. If you guys do, great.
September 13, 2007 10:33:20 PM

cleeve: [:turpit:2]
September 14, 2007 12:00:27 PM

Absolutely spot on, cleeve. If people took the time to sit and think about how they worded posts, I reckon about half of these types of threads could be avoided. Now, I'm not saying I'm perfect - I have worded posts badly myself before.
I don't think snyper intended to be insulting, as he pointed out himself, and if he had been trying to be confrontational he would have seized on lostandwandering's joking post.
So I for one would like to welcome him and the input he will bring to the forum.
Oh, and Cleeve, keep up the good work :)
Mactronix
September 14, 2007 12:29:22 PM

Anyway, back to the point at hand: I have just come off the Overclockers site and they have some of their 2600s at 600 MHz :ouch: yes, and lots of RAM to make them seem good :ouch:
Mactronix
September 14, 2007 12:39:42 PM

Mugz said:

...and no one knowing what to mod for volts...

There are volt mods out there that will get you to a 1 GHz core with air cooling (albeit with some funky custom air cooler).
I think they could have waited for the 7.9 Cats to be released before they did this article.
September 14, 2007 12:49:13 PM

Volt mods are nice for the enthusiasts, but I think I will stick to stock :) OCing has me worried, never mind modding :(
I guess they could have waited for the drivers, but then we would be slating them for the review taking too long.
Rock and hard place, I guess :)
Mactronix
September 14, 2007 1:33:09 PM

cleeve said:
I was more concerned with seeing if the 2600XT was a good buy, rather than concentrating on the difference a 10% memory clockspeed difference makes.


A good buy is a relative thing, as you have pointed out yourself. And, as your article shows, the 2600 XT seems to be a decent buy at that price point. The question is, who buys it? For those who may be interested in a 2600 XT, it usually represents either an upgrade or a component for a new build.
For upgraders, a comparison is important to see whether it is a worthy upgrade or not. A comparison with a 7600 GT and similar cards would be a good idea.
For those building a new computer, the budget can easily shift. Buying a smaller processor might free up enough money to turn a 2600 XT (700 MHz) into a 2600 XT (1100 MHz) if needed, or even into an entirely different card.

The big problem I see here doesn't actually lie with the article, but with the card itself. The expectations for the 2600 series were high and AMD didn't meet them. At first the prices were off and the drivers were bad; now they change the memory clock speed while drivers and pricing have improved. Overall that creates a situation of high reader expectations. Some want to see how the new drivers perform or to compare the 800 MHz version to the 700 MHz version. Some just want to know how it performs at all, or whether it is a worthy upgrade, or whether it will work with their media PC, etc.

I think a comparison between the old model and the new model would be a good addition. Comparing them with the older 7600 GT and/or X1650 XT would add a lot too - but doing it all would certainly blow it out of proportion. It's easy to criticize, so I hope I haven't worded my reply too harshly.
September 14, 2007 2:26:15 PM

Again, as far as I know the old model also has 700 MHz memory.
Maybe not the reference design, but the regular 2600XT from Sapphire does.
September 14, 2007 3:10:45 PM

I hear you, Slobogob.

I'll try hard to get a GDDR4 version (and a 700 MHz mem version, if I can get one) for an upcoming review for comparison.

My gut tells me the GDDR4 version won't be worth consideration, since its price puts it in X1950 PRO territory...
September 14, 2007 4:50:51 PM

Wait too long and the refresh cards will be out. They might just be much more worthwhile. Then again, we don't know the pricing of the Gemini or the new cards.
September 14, 2007 5:15:01 PM

http://www.xbitlabs.com/news/video/display/200708310741...

GeCube's Gemini 3 Radeon HD 2600XT X2 512MB graphics card with 512MB of memory has a manufacturer suggested retail price of $259, whereas the 1GB version will cost $279.

So the Sapphire card will be in the same ballpark. Regardless, that's pretty cheap for basically 2 cards.
$130~$140 per core / card... however you want to look at it.
September 14, 2007 5:22:57 PM

Wow, I really don't think the performance is going to be worth that price. You're getting into 8800 GTS territory there.
September 14, 2007 5:26:31 PM

Except that the card still comes with the CrossFire bridge, so you're talking quadfire. Also, I don't think the cards will maintain that price. We'll see, since the actual list price hasn't been posted yet.
September 14, 2007 5:43:31 PM

Quadfire, that sounds interesting. And you do have a good point about the price. They could be similar to the 2600xt, not good at the original price point, but finding a new home at a lower price later.
September 15, 2007 4:30:54 AM

Did anyone notice it's only GDDR2?
September 15, 2007 4:10:33 PM

What's only DDR2?

The 2600 comes in DDR2, 3 and 4.

X2s will be DDR3.

EDIT:
I'll rephrase that.
The GeCUBE X2s are DDR2.
The Sapphire X2s are DDR3.
September 15, 2007 6:03:19 PM

A dual 2600 XT in CrossFire doesn't seem to me like it will be a good value. Might as well just get an 8800 or HD 2900, but one 2600 XT for $100 seems to be decent value.
September 15, 2007 10:29:28 PM

I agree, I'd like to see other cards in the comparison, but for what the goal was, I liked it, and I love the addition of the min fps.

I think it's pretty obvious that the lack of ROPs and dedicated AA resolve hardware just kills them, and unlike the HD 2900, the shader power difference isn't enough to help with AA when the GF8600 has 8 ROPs to do the work and the HD 2600 has to go back to the shaders to do the AA. I did find it interesting, though, that in D3 the performance change went the other way around, and I can't figure that out based on the hardware AA issue. So it's obviously not a cut-and-dried case; maybe the ROPs of the HD 2600 were holding it back in the pixels it could render, and thus the shader power wasn't being maximized in the older title. But it was an interesting change from the usual 'HD series loses because of AA' position we've all tended to see and come to accept.
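
To put rough numbers behind the ROP argument, here's a back-of-the-envelope pixel fillrate sketch. The ROP counts and core clocks below are my assumptions from memory (roughly 4 ROPs at ~800 MHz for the HD 2600 XT versus 8 ROPs at ~540 MHz for the GF 8600 GT), not figures from the article:

    # Peak theoretical pixel fillrate = ROPs * core clock
    # (ROP counts and clocks are assumptions from memory, not from the article)
    cards = {
        "HD 2600 XT": {"rops": 4, "core_mhz": 800},
        "GF 8600 GT": {"rops": 8, "core_mhz": 540},
    }
    for name, c in cards.items():
        gpixels = c["rops"] * c["core_mhz"] / 1000
        print(f"{name}: ~{gpixels:.2f} Gpixels/s")
    # ~3.2 vs ~4.3 Gpixels/s before AA even enters the picture; once the HD 2600
    # has to loop AA resolve back through the shaders, the gap with AA on only widens.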

I would've liked to see a low but minimum level of AF applied to the no-AA setting, like 0xAA and 4xAF, as I've seen the performance impact is almost nil but the visual effect is noticeable. Of course, as a baseline, I think more people would (IMO wrongly) complain that they need a 0/0 baseline.

I think, Cleeve, if you do re-run it, how about checking a mid-grade IQ level, where 2xAA and 4xAF seem to offer a good balance of improved IQ and maintained performance? Just a personal desire based on my past experiences with the midrange and the sweet spots I've found for those 128-bit solutions.

I like the new style, though. I think more info (like min fps) with slightly fewer cards is a good thing, but adding 2 more reference points, like the GF 7600 GT and X1950 Pro, would understandably give people additional upgrade information on whether each solution is worth it (worth it to move from a GF 7600 GT, or to choose those instead of it, or to spend a few more and go to the X1950 Pro?).

Great work though!
September 16, 2007 4:48:41 AM

San Pedro said:
A dual 2600 XT in CrossFire doesn't seem to me like it will be a good value. Might as well just get an 8800 or HD 2900, but one 2600 XT for $100 seems to be decent value.


The only problem with that idea is this:
2900 = $400~$500
2600 X2 = $250~$270

I'm sure with an X2 I'll be able to play damn near anything I want, with very few exceptions or problems. Not to mention that I honestly don't see the point of blowing $400+ on a video card that will be obsolete in 6 months and cost 25%~50% less by then. I'm more than happy to bide my time and wait for a price drop on anything and everything. I'm not disputing that the 2900 is a better card; it's just that not everyone is willing to pay so much for a video card.
September 16, 2007 8:18:10 PM

I'm pretty sure an 8800 GTS 320 will beat down a single 2600 X2 for about the same price.
September 16, 2007 8:29:34 PM

edit: I misread
September 17, 2007 3:52:46 AM

reddozen said:
The only problem with that idea is this:
2900 = $400~$500
2600 X2 = $250~$270

I'm sure with an X2 I'll be able to play damn near anything I want, with very few exceptions or problems. Not to mention that I honestly don't see the point of blowing $400+ on a video card that will be obsolete in 6 months and cost 25%~50% less by then. I'm more than happy to bide my time and wait for a price drop on anything and everything. I'm not disputing that the 2900 is a better card; it's just that not everyone is willing to pay so much for a video card.


Here is my problem with your logic:

A higher-end card on release = $400-$500.
After 6 months, the card is hardly obsolete in the sense of functionality or market demand... and THAT determines when the price drops just as much as a newer model release does. Take the 8800 GTS right now: released almost a YEAR ago and it is still only ~$50 under your magic $400 mark. It is also hardly experiencing functional obsolescence...

Take my current 1900 XT 512 that was $500 right when it came out (which is when I bought it). My logic has always been to buy the most you can afford at the time, always reading reviews and watching the tech directions... So at the moment I was upgrading, the 1900s came out and I jumped. I gamed for 6 months and still saw no drop in price. (I always look back on what I bought to modify my thinking for the next upgrade.)

I am (obviously) still gaming on this card very well, without being forced to make any sacrifice in quality. I have not seen a game force me to lower settings... yet... (Crysis may do that, or UT3... but we will see.) It is now well over 1.5 years old and still very viable. If I had waited until the price came down I would have saved (maybe) $100, but I would have been gaming on my finally overtaxed 9700 Pro that was over 3 years old (which I also bought at $400, I believe).

Now, my 9700 Pro is still living in a secondary system... but it was my primary gamer for over 3 years and rocked it. I expect to get at least 2 years out of this 1900, if not more (can't expect it to last like the workhorse 9700, but it is possible). So basically, over 5 years I have bought 2 cards and been gaming at or near the top end for all that time.

Every friend I had who bought the 9600/FX 5700 mainstream cards after I got my 9700 was replacing them with an X700 or 6600 (or a now-cheaper 9800) as soon as those were released. My 9700 (OC'd by then) was still viable but showing its age by the time of the X1000/7000 generation, so I went to the 1900. They then jumped to a 7600 (or a now-cheaper 6800/X850), and/or then jumped to an 8600.

Figure that out: even setting the baseline price for those cards at $220 (generous, IMO), you have 4 different card purchases for ~$880. In that same time, mine was 2 purchases at $900. For my extra $20 I only turned down my settings towards the end of the life of my 9700, which brought me to the level my friends were gaming at (about 6 months). For the rest of those 5 or so years I was (and still am) gaming at the top end of visual quality. If you upgrade less, then you are REALLY scraping the bottom of the visual barrel, like some who are still using a 6600 on games like Oblivion.
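
(The arithmetic from that last paragraph in one place, using the figures above:)

    # Rough multi-year spend comparison from the post above
    midrange_route = 4 * 220       # four ~$220 midrange cards over the same span
    highend_route = 400 + 500      # 9700 Pro at launch + 1900 XT 512 at launch
    print(midrange_route, highend_route)   # 880 vs 900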

What I am saying is that you end up spending the same but getting a better gaming experience by buying a high-end card when you upgrade. They last longer, so you upgrade less often (as long as you didn't buy the high-end FX 5800 ;) )

Of course, if you just plain don't have the scratch... then my argument is moot and you have to do what you gotta do...

...rock on, man!