
HD2400/HD2600 VS 8600GTS

June 6, 2007 11:26:10 PM

I saw on Sapphire's page that the HD2400XT has 40 stream processors, unlike the 32 in the 8600GTS. Sapphire HD2400XT Specs

And the HD2600XT has 120! That's almost quadruple the 8600GTS! But I know that ATi used different pixel pipes on older series as well... So are these the same, or are they different?


June 7, 2007 12:09:33 AM

Well, the GeForce 8600 shouldn't be too hard to stump; even some of ATi's last-gen stuff can outpace it. That is, of course, ATi's high-end last-gen stuff, but still...
June 7, 2007 12:16:47 AM

Quote:
Well, the GeForce 8600 shouldn't be too hard to stump; even some of ATi's last-gen stuff can outpace it. That is, of course, ATi's high-end last-gen stuff, but still...


Not really; ATi's mid-range X1950Pro pretty much trumps the 8600s, and it's cheaper too.
June 7, 2007 12:39:58 AM

Quote:
Well, the GeForce 8600 shouldn't be too hard to stump; even some of ATi's last-gen stuff can outpace it. That is, of course, ATi's high-end last-gen stuff, but still...


Not really; ATi's mid-range X1950Pro pretty much trumps the 8600s, and it's cheaper too.

And there you have it!
June 7, 2007 12:57:13 AM

Yeah, well, the 2900XT has 320 stream processors and is just as good as the 8800GTS, which only has 96.
June 7, 2007 1:10:43 AM

Quote:
Yeah, well, the 2900XT has 320 stream processors and is just as good as the 8800GTS, which only has 96.


Oh yeah :? that's right. Well, the drivers are still horrid from what I've read, though.
June 7, 2007 1:11:30 AM

Quote:
64-bit DDR2/GDDR3 memory interface


not very impressive :oops: 
June 7, 2007 1:19:35 AM

That would probably be on the very low-end HD 2400. The HD 2600XT has a 128-bit bus (I know, still not very impressive).
June 7, 2007 1:21:02 AM

Yeah, I wouldn't buy anything under 256-bit at this point.
June 7, 2007 1:34:42 AM

Also, my 8600GT gets the same benchmark results as the HD2900XT in DX10 (Lost Planet demo) and Company of Heroes DX10. The HD2900XT is shit; imagine the 2400XT and 2600XT. Maybe they will put the HD acceleration in the mainstream... oh well, can we say the HD2900XT is not mainstream also?
June 7, 2007 2:10:12 AM

Neither nVidia nor ATi has video acceleration on their high-end cards like they have on their mid-range. The only reason people are all freaking out about it is because it was made known, actually, because of ATi's mishap. If you have an HD 2900XT you don't need the video acceleration, and if you thought you needed an HD 2900XT in an HTPC, well, then some more research needs to be done.

And finally, no, the HD 2900XT is not mainstream, and it accounts for very little of the graphics cards sold by ATi.
June 7, 2007 2:28:12 AM

Quote:
Also, my 8600GT gets the same benchmark results as the HD2900XT in DX10 (Lost Planet demo) and Company of Heroes DX10. The HD2900XT is ****; imagine the 2400XT and 2600XT. Maybe they will put the HD acceleration in the mainstream... oh well, can we say the HD2900XT is not mainstream also?
My 8800GTX doesn't have HD acceleration.
June 7, 2007 4:42:52 AM

Quote:

And the HD2600XT has 120! That's almost quadruple the 8600GTS! But I know that ATi used different pixel pipes on older series as well... So are these the same, or are they different?


They are different, as mentioned, and the XT's 120 will pretty much act like 64 traditional stream shaders. But while they did improve the texture unit ratio for the XT, it's still crippled by two things (rough numbers in the sketch at the end of this post):

a) the 128bit memory they both have

and

b) the lowly single ROP cluster (like 4 traditional ROPs).

Which means to me that while it may or may not be 'better' than the GF8600GTS, it will likely still suffer when compared to the value of the previous generation's higher-end cards, especially the X1950Pro and X1950XT.

The late summer 256bit models should offer what most people hoped for in these cards.
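
To put rough numbers on the shader comparison (a back-of-the-envelope sketch only; the clocks are the commonly quoted reference speeds, and counting one MADD as 2 flops per shader per clock is a deliberate simplification):

# nVidia counts 32 scalar shaders running on a fast shader clock; ATi counts
# all 120 ALUs in its 24 five-wide units, so the raw counts aren't comparable.
def gflops(shaders, clock_ghz, flops_per_clock=2):  # 2 = one MADD per clock
    return shaders * clock_ghz * flops_per_clock

print("GF8600GTS:", gflops(32, 1.45))    # ~93 GFLOPS @ ~1.45GHz shader clock
print("HD2600XT :", gflops(120, 0.80))   # ~192 GFLOPS @ ~800MHz, best case
print("HD2600XT, poorly packed:", gflops(24, 0.80))  # ~38 GFLOPS if only 1 of 5 ALUs is fed

So the XT's 120 can be worth anywhere from a quarter of its paper rating to all of it, depending on how well the 5-wide units get filled, which is why raw stream processor counts alone tell you so little.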
June 7, 2007 4:55:59 AM

So I should go with an 8600GTS then? I'm not getting an 8800 because I don't need that much power; I just want HDCP and to play Crysis at whatever settings. Plus, anything is better than my current EVGA 7600GS...
June 7, 2007 5:02:37 AM

I would wait until the HD2600 is tested better before deciding.

As for crysis performance, I wouldn't expect the GF8600 to be all that great in Crysis, but I wouldn't expect the HD2600 to be much better.

IMO the X1950XT is much more attractive for the money. Of course it consumes more power.

Overall unless you NEED to upgrade now, then I'd say wait for those summer refreshes like the RV670.
June 7, 2007 5:09:56 AM

The X1950 does not have HDCP, I don't think. Plus, no DX10 games are out... we don't know if the 8600GTS performs better in DX10... plus, the only "DX10" demo is a crappy port of a 360 game, BLEH! Well, I'm not able to get this till about November, so I will wait to see what happens.
June 7, 2007 5:30:15 AM

Quote:
The X1950 does not have HDCP, I don't think.


Yes the X1950XT and X1950Pro both have support for HDCP over DVI, plus VIVO.

http://www.newegg.com/Product/Product.asp?Item=N82E1681...

Quote:
Plus, no DX10 games are out... we don't know if the 8600GTS performs better in DX10... plus, the only "DX10" demo is a crappy port of a 360 game, BLEH!


Actually, no: the Company of Heroes patch adds to the DX10 game title list. IMO it's a different style of implementing some DX10 features, and it appears to have different workload focuses than Crysis. But really, don't expect miracles out of the GF8600 in Crysis.

Quote:
Well, I'm not able to get this till about November, so I will wait to see what happens.


Well, then there's no rush, so IMO the GTS won't be on your radar screen by that time. Expect the RV670 and its nV counterpart to be your top candidates.
June 7, 2007 6:16:16 AM

Quote:

The late summer 256bit models should offer what most people hoped for in these cards.


Have you got any sources, or are you just speculating? I'm not attacking you or anything, I'm just wondering, because I'm about to sell my current system and get a new one with an 86XX or 26XX, and the last thing I want to do is buy a 128-bit bus card a week or two before the 256-bit cards are announced. But the longer I wait, the more value my current PC loses.
June 7, 2007 7:51:03 AM

Only rumours...
There has been "information" floating around that the next version of the 8600, due in July, will have a 256-bit memory interface... The problem may be that it's almost as slow as the 128-bit 8600, and the price is so near the 8800GTS that it's not worth it.

The most important thing is that the "leaked" specs of the 256-bit version are otherwise the same as the 128-bit one (only 512MB of memory vs 256MB on the old cards). So we should see some really interesting direct comparisons between the old and new versions...
What I expect, though, is that they pump up the core speed a little bit to make the new version more "attractive".
June 7, 2007 8:18:29 AM

ATi's 2400 and 2600 are really good for video use. Very low TDP, so excellent in a silent multimedia PC.

Don't expect them to be graphics monsters. Most new-generation graphics cards are not.

When DX9 was introduced, the first DX9 low- and mid-range cards were really slow compared to the old DX8 stuff. The second generation, though, was good enough.

I expect these to be very popular in small form factor PC cases!
June 7, 2007 10:40:10 AM

Quote:
My 8800GTX doesn't have HD acceleration.


I know, but it isn't supposed to have it either, unlike the ATi HD2900XT, which they say has it but in reality doesn't.
June 8, 2007 5:51:28 AM

Quote:

The late summer 256bit models should offer what most people hoped for in these cards.


Have you got any sources, or are you just speculating? I'm not attacking you or anything, I'm just wondering, because I'm about to sell my current system and get a new one with an 86XX or 26XX, and the last thing I want to do is buy a 128-bit bus card a week or two before the 256-bit cards are announced. But the longer I wait, the more value my current PC loses.

Well, it's rumours from places like The Inquirer and Fudzilla and the usual suspects. But they're attractive-sounding and logical steps, similar to the X1800GTO and later the X1900GT and X1950Pro.

So while they won't just tack 256-bit memory onto the GF8600/HD2600 like originally thought, but will instead be actual balanced upper mid-range solutions on 65nm, they should be perfect for the 'efficient gamer'.

Here are some of the stories:

http://www.vr-zone.com/?i=4188
http://www.fudzilla.com/index.php?option=com_content&ta...
http://www.fudzilla.com/index.php?option=com_content&ta...

Now VR-Zone is reporting a revised schedule, but it's not supported by other info out there:
http://www.vr-zone.com/index.php?i=5044

IMO, look and see if the HD2600 offers you what you need, because the GF8600 series is definitely not a good long-term solution; it's underpowered and not attractive compared to an X1950XT. The HD2600 may be equally unattractive, in which case look for a good GTS-320 deal, or maybe wait and see what the end of the summer brings.

A 65nm X2900XL or GF8800GS style card with fewer shaders than the GF8800/HD2900, but more than the GF8600/HD2600, plus 256-bit memory, would be a nice option for those who like X1950Pro/GF7900GS type values.
June 8, 2007 5:56:48 AM

Quote:
Only rumours...
It has been "information" floating around that the next version of 8600 in july will have 256 memory interface... The problem can be that it's allmost as slow as 128 bit 8600 and the price is so near 8800 gts that it's not worth of it.


Nah, the GF8600Ultra is kinda dead; it's still too weak in the shader department for games like COH and Lost Planet in DX10, so the value of just adding more memory bandwidth is pretty weak.

nV needs more stream processors (64 would be nice), and AMD needs at least another ROP cluster (equivalent to 4 more ROPs) in addition to the additional bandwidth.

Cutting down the G80 and R600 doesn't work, because they're expensive chips and too hot, and both have architectural inefficiencies and downsides (you'd want to add the video processing parts that are missing from the higher end to these cards).
June 8, 2007 6:06:23 AM

Quote:
My 8800GTX doesn't have HD acceleration.


I know, but it isn't supposed to have it either, unlike the ATi HD2900XT, which they say has it but in reality doesn't.

Do you even know what he or you are talking about?

The R600 and G80 DO have HD acceleration, but it's not much different than that found on the GF7900/X1600; it's not the same as the GF8600's, which is also not the same as that on the HD2400/2600.

The GF8600 has only minor differences from the G80, and the HD acceleration of H.264 content is similar in speed to the X1K and G7; it uses software assist to do better with unencrypted content.

Also, I have yet to see anything from AMD that lists the HD2900 as having UVD; mainly only AIB partners had that in their media. The confusion is with the HD2900 supporting AVIVO HD in the shaders (which is still hardware acceleration, BTW) while the HD2400/2600 have dedicated Xilleon-style hardware to do it, called the UVD.
June 8, 2007 10:47:44 AM

Oh, my bad, Miss "You Know Everything". I don't see anywhere that shows the 8800 has dedicated HD acceleration. And I must be blind, seeing everyone talking about how AMD doesn't have UVD and was supposed to have it. You must have better sources than I do.
June 8, 2007 11:19:23 AM

Quote:
Oh, my bad, Miss "You Know Everything". I don't see anywhere that shows the 8800 has dedicated HD acceleration.


Guess you didn't see their launch information:
http://www.nvidia.com/page/8800_features.html

"The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity."

"Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies. "


And oh look, when not using specialized software, the GF8600 performs just like the GF7 and X1K series:
http://www.elitebastards.com/cms/index.php?option=com_c...

Quote:
And I must be blind, seeing everyone talking about how AMD doesn't have UVD and was supposed to have it.


Yeah, you, like all the knobs who missed it the first time around, missed that AMD said it supported AVIVO HD, not UVD; they read in the latter because of descriptions of UVD and AVIVO HD together. But some reviewers like EliteBastards got it right at launch; others didn't figure out their own mistake until weeks later:
http://www.elitebastards.com/cms/index.php?option=com_c...
"Functionality offered for both the mid-range and low-end on their new range of boards, with the processing power of R600 left to do much of the work on the Radeon HD 2900."

We've covered this already.

Quote:
You must have better sources than I do.


Seems pretty obvious now doesn't it ?!? :roll:
June 8, 2007 11:34:32 AM

Quote:

And oh look, when not using specialized software, the GF8600 performs just like the GF7 and X1K series:
http://www.elitebastards.com/cms/index.php?option=com_c...


EliteBastards must be blind too, because I don't get past 15% CPU utilization with a 1080p movie. I guess Windows Media Player is specialized software?
June 8, 2007 11:55:06 AM

Does it do VC-1? It'll do H.264... the charts are there.
June 8, 2007 8:18:54 PM

Quote:

EliteBastards must be blind too, because I don't get past 15% CPU utilization with a 1080p movie. I guess Windows Media Player is specialized software?


It doesn't matter what you claim to get; what do you get on your G80, X1950, HD2900, and GF7900GS, and how do they compare?

So far you've been wrong.

The point is, there is HD acceleration of some kind on the G80, although like I said it may be slightly different from the GF8400/8600's; but then again, the HD2400/2600's differs from the HD2900's as well as the GF8600's.

And like the HD2900 & G80, the GF8600 needs software to make the most of its features, obviously, since it won't act better than the others in every situation, only in apps tweaked for it. So it's all still software dependent to some extent, but whether the shader-based HD2900 method, the dedicated UVD method, or the G80's method shows any difference to the driver is another story. Who knows, maybe the X1950 and GF7600GT are running at max themselves to reduce the load while the GF8600 is running at idle, but the impact on the CPU is about the same. And no one's tested the rest of the situations extensively enough, other than to pop in encrypted discs, where it's not so much the HD acceleration making the difference as the DRM decode.

Either way I don't think it's EB who's blind considering they bothered to check it out, whereas you missed the HD info in the G80 launch info.

Personally, I'm waiting for the good AVIVO and PureVideo reviews from people like Cleeve, Crashman, or A E D at Firingsquad; I respect their attention to detail for a larger picture than HQV tests, even the new BR version.
June 9, 2007 1:37:05 PM

http://www.tomshardware.com/2007/06/08/avivo_vs_purevid...

This is a better source. I can see the X1900XT, 8800, and HD2900XT in the same range, compared to the 8600 and 8500 (in Vista). Quote from the review for XP: "The Nvidia representatives said the 8500/8600 drivers are not yet decoding H.264 video in Windows XP." But they are better than the others.
June 11, 2007 11:36:57 PM

Once again, that shows that the GF8600 is also software dependent. So while we wait for drivers to turn the GF8500 from a video decelerator in XP into an accelerator, we'll have to wait for similar drivers for the HD2900 to offer its benefits too.

Thus confirming my previous statement:
"So it's all still software dependent to some extent, but whether the shader-based HD2900 method, the dedicated UVD method, or the G80's method shows any difference to the driver is another story."
June 12, 2007 12:59:42 AM

What do you mean by software dependent? Drivers? Well, the 8800 has been out for a long time, and you're going to tell me they didn't figure out a driver to benefit from the HD acceleration? Same for ATi: the card may have been out only a few days, but the card itself was in development for years, and that's all they can do? They can't do sh**.
June 12, 2007 1:19:14 AM

Quote:
What do you mean by software dependent? Drivers? Well, the 8800 has been out for a long time, and you're going to tell me they didn't figure out a driver to benefit from the HD acceleration?


How long did it take ATi and nV to get AVIVO and PureVideo to work near their potential? They leapfrog each other every 6-9 months due to upgrades. So considering how much the software is involved, there's a long way to go before this story even begins to be explored, given their need for optimized apps to support them as well as better drivers.

Quote:
Same for ATi: the card may have been out only a few days, but the card itself was in development for years, and that's all they can do? They can't do sh**.


Did you even read Cleeve's review?

Based on the above, how do you explain the GF8500 being a video decelerator in XP, with the GF8600 doing the same in H.264? XP has been out for as long as either the G80 or R600 has been in development on paper, yet the GF8500 and 8600 perform worse than software decoding in XP. So despite being out longer than the HD2900, having a sibling in the market, and having forever to work on drivers for XP, which is a fixed standard, not a moving target like Vista, they still couldn't get it to work.

So how does that link help your argument against the HD2900 and AVIVO-HD, which AMD says needs the software update to allow the shaders to perform the functions as described?

Also, explain to me how your analogy favours the GF8500/8600 over the other two options. Right now the G80 beats the GF8500 in 3 out of 4 of the situations explored by Cleeve, and versus the GF8600 it's 1 win, 2 ties, and 1 loss for either, and that's despite the supposed benefit of new hardware acceleration.

Seems like they all have a long way to go when it comes to software integration/compatibility. And I expect that to come rather quickly now that HTPCs are the target, but it's still very early.

Once we have both solutions running without those glaring issues, then we can start comparing which one does what better/worse.
June 12, 2007 1:36:14 AM

Quote:

Once we have both solutions running without those glaring issues, then we can start comparing which one does what better/worse.


That's right, but I doubt AMD will find a solution soon; they're just too busy finishing Barcelona. Since you always find answers (this is not a graphics question): how does the BE-2350 processor (45W TDP) consume more power than an X2 5000 at idle, and nearly as much as an X2 3800 (65nm) at load, and perform worse than an X2 3800? I saw that here: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=300...
June 12, 2007 1:55:44 PM

Save your CPU garbage for the CPU forum please - don't pollute this place with further seeds for fanboisms.
June 12, 2007 3:48:05 PM

Quote:
What do you mean by software dependent? Drivers? Well, the 8800 has been out for a long time, and you're going to tell me they didn't figure out a driver to benefit from the HD acceleration?


How long did it take ATi and nV to get AVIVO and PureVideo to work near their potential? They leapfrog each other every 6-9 months due to upgrades. So considering how much the software is involved, there's a long way to go before this story even begins to be explored, given their need for optimized apps to support them as well as better drivers.

Quote:
Same for ATi: the card may have been out only a few days, but the card itself was in development for years, and that's all they can do? They can't do sh**.


Did you even read Cleeve's review?

Based on the above, how do you explain the GF8500 being a video decelerator in XP, with the GF8600 doing the same in H.264? XP has been out for as long as either the G80 or R600 has been in development on paper, yet the GF8500 and 8600 perform worse than software decoding in XP. So despite being out longer than the HD2900, having a sibling in the market, and having forever to work on drivers for XP, which is a fixed standard, not a moving target like Vista, they still couldn't get it to work.

So how does that link help your argument against the HD2900 and AVIVO-HD, which AMD says needs the software update to allow the shaders to perform the functions as described?

Also, explain to me how your analogy favours the GF8500/8600 over the other two options. Right now the G80 beats the GF8500 in 3 out of 4 of the situations explored by Cleeve, and versus the GF8600 it's 1 win, 2 ties, and 1 loss for either, and that's despite the supposed benefit of new hardware acceleration.

Seems like they all have a long way to go when it comes to software integration/compatibility. And I expect that to come rather quickly now that HTPCs are the target, but it's still very early.

Once we have both solutions running without those glaring issues, then we can start comparing which one does what better/worse.

Okay, knock it off.

Are you going to use a 1.4 mobile Celeron to play HD movies? Unless you have a laptop, who really gives a damn about H.264 acceleration on GPUs?

Does it bother anyone that their CPU runs at 80% instead of 20% utilization while watching HD video? If it does, you need your head examined. Watch your movies in the living room with the gf, not on your PC with a towel....

By the time BD-ROMs, HD-DVD drives, and most importantly recorders reach the magic sub-$100 range, Mickey Mouse video cards will have it.

Maybe watching a Bluray flick on a second monitor while you surf the net and encode video..... nm gimme a break. This is stupid.

Argue about something meaningful please :roll:
June 12, 2007 6:05:30 PM

Quote:

Okay, knock it off.

Are you going to use a 1.4 mobile Celeron to play HD movies? Unless you have a laptop, who really gives a damn about H.264 acceleration on GPUs?


Uh, yeah, those are exactly the ones I'm considering it for; to be specific, two candidates, one for each VPU: the HP Dragon HDX or the new Apple 17" LED-lit MacBook Pro.

Quote:
Does it bother anyone that their CPU runs at 80% instead of 20% utilization while watching HD video? If it does, you need your head examined. Watch your movies in the living room with the gf, not on your PC with a towel....


Well, remember, a lot of people are considering these cards for HTPCs, which is the ONLY benefit of the GF8600 currently. So a lot of people are taking their old CPUs and putting them in these rigs, or using cool underclocked CPUs, so being able to remove the burden is nice. And anything that approaches 80+% means that if something else uses system resources, you will get stutters, freezes, or crashes. So it is an issue even if running just under 100% on a new Core2Duo desktop.

I still prefer the idea of the HD2600 for the laptops, especially HDMI 1.2 over DVI; but I'm open to the GF8600GT (not GS), as it's pretty competent for a laptop too, which is why I want the info first rather than people deciding the fate of yet-to-be-released products.

Quote:
By the time BD-ROMs, HD-DVD drives, and most importantly recorders reach the magic sub-$100 range, Mickey Mouse video cards will have it.


I already have an Xbox 360 HD-DVD drive, and I'm waiting for the LaCie to come down or LG to bring out an external BR+HD-DVD recorder.

Quote:
Maybe watching a Bluray flick on a second monitor while you surf the net and encode video..... nm gimme a break. This is stupid.

Argue about something meaningful please :roll:


If you don't see the benefits, that's fine, but dismissing the benefits is ignorant, just as much so as pretending that any of the solutions are finalized yet. However, considering this is very relevant to some of us, and is a graphics card/chip issue, I don't understand your gripe.
June 13, 2007 4:17:19 PM

Quote:

Okay, knock it off.

Are you going to use a 1.4 mobile Celeron to play HD movies? Unless you have a laptop, who really gives a damn about H.264 acceleration on GPUs?


Uh, yeah, those are exactly the ones I'm considering it for; to be specific, two candidates, one for each VPU: the HP Dragon HDX or the new Apple 17" LED-lit MacBook Pro.

Quote:
Does it bother anyone that their CPU runs at 80% instead of 20% utilization while watching HD video? If it does, you need your head examined. Watch your movies in the living room with the gf, not on your PC with a towel....


Well, remember, a lot of people are considering these cards for HTPCs, which is the ONLY benefit of the GF8600 currently. So a lot of people are taking their old CPUs and putting them in these rigs, or using cool underclocked CPUs, so being able to remove the burden is nice. And anything that approaches 80+% means that if something else uses system resources, you will get stutters, freezes, or crashes. So it is an issue even if running just under 100% on a new Core2Duo desktop.

I still prefer the idea of the HD2600 for the laptops, especially HDMI 1.2 over DVI; but I'm open to the GF8600GT (not GS), as it's pretty competent for a laptop too, which is why I want the info first rather than people deciding the fate of yet-to-be-released products.

Quote:
By the time BD-ROMs, HD-DVD drives, and most importantly recorders reach the magic sub-$100 range, Mickey Mouse video cards will have it.


I already have an Xbox 360 HD-DVD drive, and I'm waiting for the LaCie to come down or LG to bring out an external BR+HD-DVD recorder.

Quote:
Maybe watching a Bluray flick on a second monitor while you surf the net and encode video..... nm gimme a break. This is stupid.

Argue about something meaningful please :roll:


If you don't see the benefits, that's fine, but dismissing the benefits is ignorant, just as much so as pretending that any of the solutions are finalized yet. However, considering this is very relevant to some of us, and is a graphics card/chip issue, I don't understand your gripe.

My gripe is you are being way too anal about something that won't be an issue in a short period of time.

Reminds me of when DVD-ROMs came out. You needed GPU acceleration or a dedicated decoder with a sub-500MHz CPU. Once CPUs got into the 600+ MHz range, you could easily play DVDs with no assisted acceleration at all. Today you can play multiple 480p videos simultaneously with an old PCI 2D card if you wanted to.

We are already looking at cheap quads next month. CPUs will soon make HD acceleration via GPUs a non-issue. History repeats itself. Companies like Cyberlink will ensure it.

Background tasks causing stuttering in HD on a recent CPU? Now I know you are full of it...



My old P4 tech bench machine. Yes that is HD video playing in Nero.
June 13, 2007 7:24:29 PM

Quote:

My gripe is you are being way too anal about something that won't be an issue in a short period of time.


It will be an issue for laptops for a long period of time; even 20% versus 60% CPU load means a huge difference in battery life, as well as heat.
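
To put rough numbers on that (a sketch with assumed figures: a 60Wh battery, ~15W platform baseline, and a CPU that scales linearly to ~25W at full load; none of these are measurements):

battery_wh, base_w, cpu_max_w = 60.0, 15.0, 25.0
for load in (0.20, 0.60):
    draw_w = base_w + cpu_max_w * load  # total system draw at this CPU load
    print(f"{load:.0%} CPU load: {draw_w:.0f}W -> {battery_wh / draw_w:.1f}h of playback")

That works out to roughly 3.0 hours of playback at 20% load versus 2.0 hours at 60%, a full hour of movie, before even counting the extra heat.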

Quote:
Reminds me of when DVD-ROMs came out. You needed GPU acceleration or a dedicated decoder with a sub-500MHz CPU. Once CPUs got into the 600+ MHz range, you could easily play DVDs with no assisted acceleration at all. Today you can play multiple 480p videos simultaneously with an old PCI 2D card if you wanted to.


Yeah; however, even those better CPUs were nowhere near as good as dedicated solutions like my Creative DVD drive that came with a daughtercard decoder.

Quote:
We are already looking at cheap quads next month. CPUs will soon make HD acceleration via GPUs a non-issue. History repeats itself. Companies like Cyberlink will ensure it.


Yes, it does, but most people are moving their older rigs to an HTPC role, so for the time being it will be necessary. Sure, it's not a big deal for future cores, but then again it won't be a big deal for them because they will have the VPU+CPU option anyway.

Quote:
Background tasks causing stuttering in HD on a recent CPU?


Sure, it depends on the tasks and what's being watched and done. Your image is really pretty, but it doesn't show anything about the stability of the video, nor its quality or compression method. And it doesn't show your tasks' priority levels; putting Nero and Norton in low-priority mode means little as a stress test. And considering your two threads top out simultaneously, it's far from efficient, nor even a good comparison of what would happen in a single-thread situation, which is still the majority of secondary HTPC PCs for now, using old Athlons and P4s which couldn't take advantage of HT if they're running above 50% all the time. Also, what's the bit rate of your content? I doubt it's anywhere near the future 40+Mb/s, so it's not like we're even at the end of the challenge from future titles yet, so your current rig's ability to handle them is in question as well.

Quote:
Now I know you are full of it...


Yeah, unlike you, who keeps talking about quad-core uber rigs when we're talking about the VPU assist for the midrange level. Like I said, the features of the HD2900 and GF8800 will matter little since they'll be slapped on big CPUs, and even at idle they consume more than the GF8600 and GF7900GS at full tilt. Why don't you post the results of a huge compressor-equipped phase-change rig? I'm sure HTPC owners would love that loud rig just so they can avoid upgrading their X300SE. :roll:

Once again, your statements are ignorant of and irrelevant to the market that looks at these features we're talking about, and the original criticism is pointless.
June 14, 2007 1:54:44 AM

You're an @ss. My statements aren't ignorant, they are sarcastic but realistic. Yours are assuming, anal retentive and arrogant, and not just to me. You have a serious attitude problem. Which is why I posted in the first place.

Who is going to buy an 8600 and put it in their Athlon XP HTPC? Oh wait.....you can't....

So maybe those with HTPC rigs built out of old spare parts (i.e., an old P4 2.4) will buy ATi HD 2600 AGP cards if and when they come out. Then go out and spend another $200 on a 360 HD-DVD or $500 on a Bluray writer.

How many users is that? Come on. People slap together old stuff to have fun for free or little money. These are machines playing below-standard-def downloaded videos and music, like a modded Xbox user's.

Seriously. Almost everyone serious about HD and using a PCI-e board capable of an 8600 in their HTPC would already have a halfway decent CPU. Pentium E2160s are what? Under a hundred bucks?

I don't know if you've noticed, but CPUs are getting cooler, quieter, consuming less power while getting cheaper and more powerful. This is what I was getting at. I'm sure you know that and are just being a pr1ck to make yourself look better than me.

Sorry I didn't use a really crappy chip, a high-priority background process, and a 40Mbit HD video stream in my cap. I really didn't think someone would want to, or attempt, something so stupid.
June 14, 2007 2:55:28 AM

Quote:


the lowly single ROP cluster (like 4 traditional ROPs).

Which means to me that while it may or may not be 'better' than the GF8600GTS, it will likely still suffer when compared to the value of the previous generation's higher-end cards, especially the X1950Pro and X1950XT.

That's what I'm afraid of. Somehow AMD can't get the midrange right. When the X1xxx midrange showed up, it was too expensive. And even now the X1650XT is, at least in Europe, almost as expensive as the X1950Pro. :?


Quote:
I would wait until the HD2600 is tested better before deciding.

As for crysis performance, I wouldn't expect the GF8600 to be all that great in Crysis, but I wouldn't expect the HD2600 to be much better.

IMO the X1950XT is much more attractive for the money. Of course it consumes more power.

Overall unless you NEED to upgrade now, then I'd say wait for those summer refreshes like the RV670.


By building the X1950Pro, AMD created a price/performance monster that keeps killing and maiming everything close by. I hope they make a nice profit with those things, yet I believe they will cut into the 2600 market - at least until the 256-bit variants are released.
And while AMD stated late summer, I honestly don't believe they will show up then. Given AMD's history, I wouldn't even be surprised if we don't see them this year.
June 14, 2007 3:15:51 AM

Nice find, turpit. Thanks for posting it.
June 14, 2007 7:23:09 AM

Quote:
You're an @ss. My statements aren't ignorant, they are sarcastic but realistic. Yours are assuming, anal retentive and arrogant, and not just to me. You have a serious attitude problem. Which is why I posted in the first place.


And you're acting like a Fuktard; your statements are ignorant of the HTPC and laptop markets, which is obvious to anyone interested in either.

Quote:
Who is going to buy an 8600 and put it in their Athlon XP HTPC? Oh wait.....you can't....


Who said anything about an Athlon XP? I said OLD Athlons, which include the Athlon 64 2800-3500+, which would be perfect candidates for these benefits and often find themselves in HTPCs built from old parts. As for the older Athlons, they are candidates for the AGP HD2400-2600.

Quote:
How many users is that? Come on. People slap together old stuff to have fun for free or little money. These are machines playing below-standard-def downloaded videos and music, like a modded Xbox user's.


You can try to characterize them like that, but other than HD, these rigs are usually quite competent at everything else; they were OC'ed gamers before the launch of the C2D, and now they're back to stock for cooler operation. And there are a lot of them out there; I've parted three of them in the last year for people, and I was planning on moving my old parts into one before I went all-laptop. There are a lot of members here who have such rigs.

Quote:
Seriously. Almost everyone serious about HD and using a PCI-e board capable of an 8600 in their HTPC would already have a halfway decent CPU. Pentium E2160s are what? Under a hundred bucks?


Anyone 'serious about HD' already had standalone solutions before Xmas. :roll:
The market for HTPC makers is people who know the value of low CPU utilization and being able to do many things in hardware, like audio.

Quote:
I don't know if you've noticed, but CPUs are getting cooler, quieter, consuming less power while getting cheaper and more powerful.


CPUs don't get quieter, unless you're hearing the gates switching. :tongue:

Quote:
This is what I was getting at. I'm sure you know that and are just being a pr1ck to make yourself look better than me.


I don't need that to make me look better than you; all I need is the fact that an old CPU running @ idle still consumes less power, and thus generates less heat, than a newer 'efficient' CPU running @ load. That's obvious to anyone who has HTPCs or laptops, but it seems so foreign to you.

Like I said, the GF8800 and HD2900 don't matter because they both consume a ton of energy at idle, so for them the difference between hardware assist and software assist will be minor, and their benefit in CPU utilization will also be minor; but in a low-power/heat mid-range card/chip, the overall effect will be big. If you can't see that, then nothing will enlighten you.
June 14, 2007 7:35:26 AM

Quote:

That's what I'm afraid of. Somehow AMD can't get the midrange right. When the X1xxx midrange showed up, it was too expensive. And even now the X1650XT is, at least in Europe, almost as expensive as the X1950Pro. :?


Yeah, I think we have to wait for the RV670 and a GF8800GS-class product to get a good replacement, and yeah, it's very similar to the X1600 situation last generation, which I've been saying since the GF8600 launched.

Quote:
By building the X1950Pro, AMD created a price/performance monster that keeps killing and maiming everything close by.


Yeah, and the cheap X1900XTs in the US didn't help either, and to a lesser extent the GF7900GS is also killing the value of the GF8600, and likely the HD2600, in that anyone who already owns any of those three, or even a GF7600GT/X1650XT, will have a difficult time justifying a GF8600GTS or HD2600XT, and the same goes for the GS and GF8500.

Quote:
I hope they make a nice profit with those things, yet I believe they will cut into the 2600 market - at least until the 256-bit variants are released.
And while AMD stated late summer, I honestly don't believe they will show up then. Given AMD's history, I wouldn't even be surprised if we don't see them this year.


Yeah, that's the big question; no solid dates and a lot of speculation from both extremes of the timeframe. I suspect the fall is a realistic timeframe, but we'll have to wait and see. Also, the idea of a Gemini/dual part seems to be getting a lot more traction, which personally I think is the wrong direction for that, but I'll reserve final judgement until we know whether it's more effective than the lackluster X1950Pro DUAL card was.

I think whoever gets that upper mid-range product to market first for a reasonable price will win a lot of contracts and a lot of value-conscious, power-conscious, and performance upgraders. The GTS-320 is a great value performer, but it's not much less power-consuming than the other 8800s, or even the HD2900, to be considered by people with PSU constraints. A GF8800GS or HD2900Pro would be sweet.
June 14, 2007 8:23:17 AM

Quote:

Yeah, and the cheap X1900XTs in the US didn't help either, and to a lesser extent the GF7900GS is also killing the value of the GF8600, and likely the HD2600, in that anyone who already owns any of those three, or even a GF7600GT/X1650XT, will have a difficult time justifying a GF8600GTS or HD2600XT, and the same goes for the GS and GF8500.

The 7900 is another good example. Last week I noticed a 99€ offer at some online shop. That's 20€ less than the already cheap X1950Pro. It looks like some people are clearing their inventory thanks to the 8600. 8O

Quote:

Yeah, that's the big question; no solid dates and a lot of speculation from both extremes of the timeframe. I suspect the fall is a realistic timeframe, but we'll have to wait and see. Also, the idea of a Gemini/dual part seems to be getting a lot more traction, which personally I think is the wrong direction for that, but I'll reserve final judgement until we know whether it's more effective than the lackluster X1950Pro DUAL card was.

I liked the idea of that dual card. It's not really the same as nVidia's GX2, since ATi only took a mid-range chip. The pricing with these offers is critical, though, and until now I haven't seen a single dual-chip solution that is worth its money.

Quote:

I think whoever gets that upper mid-range product to market first for a reasonable price will win a lot of contracts and a lot of value-conscious, power-conscious, and performance upgraders. The GTS-320 is a great value performer, but it's not much less power-consuming than the other 8800s, or even the HD2900, to be considered by people with PSU constraints. A GF8800GS or HD2900Pro would be sweet.

Indeed! Looking at the hardware specs, the gap between the 8800GTS and the 8600GTS is too big. The 2600XT might or might not close it, but the refresh chips will - I'm quite sure of that!
June 14, 2007 5:38:26 PM

Quote:


I don't need that to make me look better than you; all I need is the fact that an old CPU running @ idle still consumes less power, and thus generates less heat, than a newer 'efficient' CPU running @ load. That's obvious to anyone who has HTPCs or laptops, but it seems so foreign to you.

Like I said, the GF8800 and HD2900 don't matter because they both consume a ton of energy at idle, so for them the difference between hardware assist and software assist will be minor, and their benefit in CPU utilization will also be minor; but in a low-power/heat mid-range card/chip, the overall effect will be big. If you can't see that, then nothing will enlighten you.


Yeah. I'm sure an old AMD64 3000+ with an 8600/2600 video card would consume less power at idle (and run cooler/quieter) than a board with onboard component video and an EE chip at load.... :roll:

http://www.hardwaresecrets.com/fullimage.php?image=6714
June 14, 2007 9:00:20 PM

Well, unless you show something a little more impressive, I'll go with the cool-running card and CPU outstripping the hard-running CPU, especially since the 690G you link to isn't that good with 1080p:
http://www.techreport.com/reviews/2007q1/amd-690g/index...

And that's a movie trailer, not HD-DVD or BluRay, so you can expect the CPU to be working much harder on that integrated solution, if it can even handle the content at all without skipping, since most mobos carry the warning about not being able to play 1080p smoothly. So you can focus on individual portions, but your solutions don't combine all the benefits of the GF8600 and HD2600 for HTPC owners. And if people already have the GF8800 and HD2900 anyway, then it's a nice free bonus to them. So you're still either missing the issue or being obtuse.

Also, now you've got them buying a new processor, new mobo, and new memory to replace that AMD64 3000+; that's a little much compared to a $90 graphics card, don't you think? C'mon!
June 14, 2007 10:04:02 PM

Quote:
Well, unless you show something a little more impressive, I'll go with the cool-running card and CPU outstripping the hard-running CPU, especially since the 690G you link to isn't that good with 1080p:
http://www.techreport.com/reviews/2007q1/amd-690g/index...

And that's a movie trailer, not HD-DVD or BluRay, so you can expect the CPU to be working much harder on that integrated solution, if it can even handle the content at all without skipping, since most mobos carry the warning about not being able to play 1080p smoothly. So you can focus on individual portions, but your solutions don't combine all the benefits of the GF8600 and HD2600 for HTPC owners. And if people already have the GF8800 and HD2900 anyway, then it's a nice free bonus to them. So you're still either missing the issue or being obtuse.

Also, now you've got them buying a new processor, new mobo, and new memory to replace that AMD64 3000+; that's a little much compared to a $90 graphics card, don't you think? C'mon!


I don't disagree. I've been playing devil's advocate because you love to argue, never admit you might be wrong about something, and love to dismiss those around you.

My system would consume less power and produce less heat, even at full load vs. your 3000+ and 8600/2600 system at idle.

Which is what I was commenting on and what you avoided.

Anyway, I'm getting bored of this.
June 14, 2007 11:18:39 PM

Quote:

My system would consume less power and produce less heat, even at full load vs. your 3000+ and 8600/2600 system at idle.


Got any numbers for that?

Quote:
Which is what I was commenting on and what you avoided.


I didn't avoid anything; I said at the start of the section you quoted that you need to show me something more impressive to back up your statement.

Because your P4 isn't that great.

http://www.xbitlabs.com/articles/cpu/display/sempron-30...
http://www.xbitlabs.com/articles/cpu/display/core2duo-s...
http://www.xbitlabs.com/articles/cpu/display/amd-energy...

A minimum of +50W difference on those CPUs (and usually more, but let's say less, like 40W, for a 20% vs 100% comparison), while the worst-case-scenario power-hungry GF8600GTS (not GF8500) at its 2D max is ~30W, and the HD2400 will be much, much less thanks to 65nm:

http://www.xbitlabs.com/articles/video/display/geforce8...

So really, like I said, show me something more compelling to support your statement. Your 60-80C CPU is going to kick out more heat, period (laws of physics), and it will also run into temperature issues (different from heat, of course) because it's going to be taxing the thermal capacity of the HSF and the air-cooling capacity inside the rig, while the other system won't be taxed at all and will easily dissipate the heat, leading to lower temperatures all around.
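
Or, run the same ballpark figures from above through a quick sanity check (assumed numbers, not measurements):

cpu_delta_w = 40.0   # conservative CPU power swing for the 20% vs 100% comparison
gpu_2d_max_w = 30.0  # worst case: GF8600GTS at its 2D max, per the xbitlabs numbers
print(f"Minimum power saved by GPU assist: {cpu_delta_w - gpu_2d_max_w:.0f}W")

Even in that worst case the GPU-assisted system comes out ~10W ahead, and with a 65nm HD2400 doing the decoding instead, the gap widens considerably.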

Quote:
Anyway, I'm getting bored of this.


I got bored with your first post, because it's obvious you don't understand the need yet wanted to jump in and criticise for personal reasons.