HD 2600 compared to HD 3850

I was wondering just how much difference there is between these two cards, not as per a review but in the real world. Has anyone run both of these cards on the same system? I'm talking about an XP system, not Vista.
The reason I'm asking is that I recently upgraded to a 3850 from an X1650XT HIS IceQ Turbo, and to be honest the difference is night and day visually speaking. I installed GRID and it's really good. The thing is, we repeatedly tell people on these forums not to bother upgrading unless the card they have is struggling. Well, my old card wasn't struggling at all, but visually the difference is well worth the outlay. I don't want to steer people wrong, so I'm wondering: even though the 2600 wouldn't have been that much faster, would it have made enough of a visual difference to be a viable upgrade, or did I do right to wait for the 3850?

Mactronix
 

sarwar_r87

Distinguished
Mar 28, 2008
837
0
19,060
You did the right thing, considering the 2600XT performs on par with the X1650 Pro, and the 3850 is almost 70% faster.

As for visual difference, I don't think there will be much when you're playing a movie; they both have the same engine to decode. But when you're playing games, you'll have the option to turn on more eye candy, so your visual experience will improve.
 

sarwar_r87



I owned a 2600. My friend with an X1650 Pro boasted better fps in virtually all titles, and he had a Pentium D 2.6GHz while I had an X2 2.8GHz.

Check your link: the 2600XT performs at almost half the level of the X1950 Pro, and the X1650XT also performed at almost half the X1950XT. So you can say they are of the same class, and in fact the 2600XT provides less than the X1650XT.
 

pauldh

Illustrious

First, you really can't compare two different systems. Either you are not comparing apples to apples, or your system is screwed up if an X1650 Pro provides more performance. Your card should destroy his.

Second, the 2600XT can equal the X1950 Pro, but only in some games and only without FSAA. With FSAA (like in my FiringSquad link) it falls way behind the X1950 Pro. Trust me, I have researched this a lot and tested all these cards myself. I own both a GDDR3 X1650 Pro and an X1650XT AGP, and a PCI-e 2600XT. The 2600XT is typically better than the X1650XT and way better than even the best X1650 Pro. If the Pro is GDDR2, it would be even worse. But results vary by the game and the level of FSAA. In new games the 2600XT is often very good, just like the X1650XT left the 7600GT behind in some new titles. Look at UT3 as an example: the 2600XT equals the X1950 Pro, the X1650XT is about like the 2600 Pro, and the 7600GT is far behind the X1650XT.

http://www.anandtech.com/video/showdoc.aspx?i=3128&p=4

Also, look at the HD2600XT in these links (in some games, without FSAA, it can equal the X1950 Pro):
http://www.legionhardware.com/document.php?id=698&p=2
http://www.legionhardware.com/document.php?id=693&p=5
http://www.legionhardware.com/document.php?id=691&p=2

edit: One more link. Without FSAA, the 2600XT beats the X1650XT in every game in this review:
http://www.anandtech.com/video/showdoc.aspx?i=3023&p=7

If I had to generalize, the HD2600XT sits right between the X1950 Pro and the X1650XT. It can at times match the X1950 Pro; other times the X1650XT matches it. Without FSAA it's probably closer to the X1950; with FSAA, closer to the X1650.
 

sarwar_r87



I did some more digging. It appears that the 2600XT is not on par with the X1650 Pro but with the X1650XT. My mistake.
http://www.tomshardware.com/charts/desktop-vga-charts/overall-all-games-fps,572.html?p=1642%2C1640%2C1632%2C1633%2C1610%2C1627%2C1607%2C1625%2C1639

But the 2600XT is definitely not twice as powerful as the X1650 Pro, maybe 40%. And you can't judge a card by one game.
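The percentages being thrown around in this thread ("70% faster", "half the performance", "maybe 40%") are easy to mix up, so here's a quick sketch with made-up fps numbers (not taken from any review) showing how they relate:

```python
# Illustrative fps numbers only -- hypothetical, not from any benchmark.
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

x1650pro, hd2600xt, x1950pro = 25.0, 35.0, 50.0  # pretend averages

print(percent_faster(hd2600xt, x1650pro))  # 40.0 -> "40% faster"
print(percent_faster(x1950pro, hd2600xt))  # ~43% faster
# A card with HALF the fps of another means the other is 100% faster,
# i.e. "half the performance" and "twice as powerful" are the same claim:
print(percent_faster(x1950pro, x1950pro / 2))  # 100.0
```

So "destroys it by over 100%" and "more than double the fps" describe the same gap, while "40% faster" is a much smaller one.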
 

pauldh

Illustrious

I provided a bunch of games showing it matching an X1950 Pro or beating an X1650XT, not one game. I could provide more if you like. Believe what you want about the 2600XT, but I think I painted an accurate picture of it bouncing between an X1650XT and an X1950 Pro. And you should know something is wrong if yours loses to your bud's X1650 Pro, as those are quite a step below even the XT.

Often, especially in old reviews (old games/old drivers) or with FSAA, it does about equal the X1650XT. But try Crysis, UT3, COD4, HL2 EP2, TF2, etc., and without FSAA (how you would expect to play on a card in this class) it will pull away from the X1650XT and even sometimes match the X1950 Pro.

Unfortunately, you are being misled by those Tom's charts. They are a total of all games at all settings, most of which are unplayable on both cards. 4X FSAA is on in the vast majority of those scores, which lowers the 2600XT's chance of dominating since, as I mentioned above, that is its weak point. Plus, they are pretty old games, not really ones the 2600XT shines in anyway. Oblivion is the best shader-heavy example there, and the 2600XT destroys the X1650 Pro by over 100%, yet it doesn't earn a huge point advantage because the test runs max details, too much for either card to be playable or put up good numbers.

Look over all the individual tests that make up the total score and you will see the 2600XT is often over double the X1650 Pro. Better yet, look over all the links I provided. Also, keep in mind it's the highest-clocked X1650 Pro against the lowest-clocked 2600XT in those Tom's charts. The GDDR3 X1650 Pro like they used (and I have) is rare compared to the slower, common GDDR2 version. Their HD2600XT is about like the average AGP 2600XT, though.

Here are a few tests that include the GDDR4 and GDDR3 2600XT, as well as the GDDR3 X1650 Pro:
http://www.digit-life.com/articles2/digest3d/0907/itogi-video-ch-wxp-1280-pcie.html
http://www.digit-life.com/articles2/digest3d/0907/itogi-video-pr-wxp-1280-pcie.html
http://www.digit-life.com/articles2/digest3d/0907/itogi-video-sc2-wxp-1280-pcie.html
 
@ pauldh,
So where do you stand on the visual performance between the two cards? From what I know of them, raw-performance-wise the 2600XT was released on a par with an X1650XT and, after a few driver improvements, drew away from it a bit. As you say, towards the end of its development, before the 3-series cards, it was equalling an X1950 Pro in some titles. However, it's the actual real-world visual difference I'm interested in.

Thanks
Mactronix :)
 
UT3 looks terrible on my 2600. The framerates are bad unless I include 2xAA, in any game, and (not to go against you, pauldh) my COD4 framerates improve with FSAA. They are only around 26fps at medium settings, but still.

Haven't gotten around to crysis yet.

Now, this is all at 1280x1024.

To be honest, I hate that 2600. I liked what my X1650XT could do much better. Perhaps it's just my card.
 

pauldh

Illustrious
Frozen, if I recall, you are on a single-core Athlon XP 2800+, right? That is why you don't see an FPS hit with FSAA: you aren't GPU-limited. It really makes no sense that increasing the load on the GPU by turning on FSAA would increase framerates. Unfortunately, if you like FSAA, you have the wrong card with the HD2600XT. Too bad you don't like the card, but I think if you had more CPU power, you would be happier.

One quick example: look how in TF2 the 2600XT leads even the 8600GTS.
http://www.legionhardware.com/document.php?id=691&p=2
But with 4xAA, it drops to under half the frames of the 8600GTS (and also the X1950 Pro):
http://www.legionhardware.com/document.php?id=691&p=3

 

pauldh

Illustrious

Yeah, I agree with that. The 2600XT blew me away when, with updated drivers and newer games, it turned out better than it originally seemed. The GF7s, on the other hand, looked pitiful in those same new games. But compared to other cards I have, the 2600XT is just not up to FSAA, IMO; turn it on and it tanks. You can see just how poorly it does in reviews that exclusively use FSAA, like the FiringSquad link I mentioned. (It's nowhere near the X1950 Pro or 8600GTS with FSAA.)

I can't tell you I see a real increase in image quality, though. The X1650XT and X1950XT were really pretty decent and better than the GF7s, IMO. The only increase you may see is from higher details; with everything at the same settings, I would guess no difference. But the HD3850 just blows these cards away and lets you max out many games the others can't. In that regard the IQ would be much better.
 
I know the FSAA performance improvement doesn't make any sense either; it stumped me, but at this point I don't care. Faster is faster, and I won't argue with it.

The GPU has more time sitting idle that it can use to do work, which I guess makes a little sense, but not much. This old Athlon has done its job well through the years I've had it. Hey, I'm proud of the damn thing. Still playing good games 5 years later.

The biggest reason I don't like the card is that getting it to run correctly took almost two days. Just benchmark after benchmark, try this driver, try that driver... blah. I slapped that X1650XT in, downloaded drivers, and it ran real nice.