Question about HD 3870

Last response: in Graphics & Displays
November 17, 2007 5:55:36 AM

I have a question on the ATI 3870 lineup. I noticed that all the ones listed have a 256-bit memory bus. Are they going to release any 512-bit ones? Sorry if this is an obvious question; I just haven't been keeping up with this stuff.


November 17, 2007 6:43:00 AM

No plans that I know of. They did some tweaking to get about the same performance out of the new, modified 256-bit bus as they did from the 512-bit bus. I would love to test out the differences, but based on the performance in games I don't see the need for the 512-bit bus.
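For context, the arithmetic behind the bus-width debate is just bus width times effective data rate. A minimal sketch (the memory clocks below are approximate reference specs for these cards, not figures from this thread):

```python
# Back-of-envelope peak memory bandwidth: bus width (bytes) x data rate.
# Clock figures below are approximate reference specs, assumed for illustration.

def bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Peak bandwidth in GB/s given bus width in bits and effective MT/s."""
    return bus_width_bits / 8 * effective_mtps / 1000

# HD 2900 XT: 512-bit bus, GDDR3 at roughly 1656 MT/s effective
hd2900xt = bandwidth_gbs(512, 1656)   # ~106 GB/s

# HD 3870: 256-bit bus, but faster GDDR4 at roughly 2250 MT/s effective
hd3870 = bandwidth_gbs(256, 2250)     # ~72 GB/s

print(f"HD 2900 XT: {hd2900xt:.0f} GB/s vs HD 3870: {hd3870:.0f} GB/s")
```

So the faster GDDR4 claws back a fair chunk, though not all, of the bandwidth lost by halving the bus; the point in the post above is that games of the time rarely needed the full 512-bit figure anyway.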

November 17, 2007 8:02:09 AM

The 512-bit bus was needed on the 2900XT because only big numbers would help it sell.
November 17, 2007 10:35:15 AM

nightscope said:
I have a question on the ATI 3870 lineup. I noticed that all the ones listed have a 256-bit memory bus. Are they going to release any 512-bit ones? Sorry if this is an obvious question; I just haven't been keeping up with this stuff.

Yes, in a way: they plan to put two 256-bit GPUs on a single card. The quote below is from the second-to-last paragraph.
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=12
Quote:
AMD is talking about sticking two 3800 GPUs on a single card
November 17, 2007 10:54:12 AM

This doesn't mean they're going to use a 512-bit interface for that card, though they could. It would be interesting if they did; I believe the GTX barely uses all of its 384-bit bus, but with two 3870s on one card, maybe there could be a slight improvement in doing this. I haven't checked the potential for something like this, so if anyone can reflect more on this, I'd love to hear it.
November 17, 2007 11:00:44 AM

512-bit isn't really needed until very high resolutions, and I'm not sure AMD will use it for R680, since it's expensive and not that much needed. I would prefer them beefing up the TMUs/ROPs instead :) Which probably won't happen either; R680 will be launched very soon, and as far as we know nothing has changed in this area yet.
November 17, 2007 11:07:58 AM

I've read that a few problems from the 29xx series were dealt with in the 38xx series. The ROPs/TMUs will be dealt with in the next (7xx) series. I can't remember where I read it, but it's being dealt with.
November 17, 2007 3:04:46 PM

Do the 320 unified stream processors help much? I mean, on paper, comparing the 8800 GT to the HD 3870, the HD 3870 should have the upper hand. ATI is not fully utilizing all the hardware it put into it...

When is the 700 series coming out? I heard beginning of 2008...is that right?
November 17, 2007 3:45:02 PM

I guess a lot of it has to do with games not taking advantage of 320 shaders. But it doesn't help that Nvidia's cards' shaders are running twice as fast as ATI's.
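To put rough numbers on that: peak shader throughput is unit count times clock times FLOPs per unit per clock, and on paper the two cards come out surprisingly close. A sketch, where the clocks and per-unit FLOP counts are approximate published specs assumed for illustration, not figures from this thread:

```python
# Rough peak shader throughput in GFLOPS; clocks and per-unit FLOP
# counts are approximate published specs, assumed for illustration.

def peak_gflops(units: int, clock_mhz: float, flops_per_clock: int) -> float:
    """Theoretical peak in GFLOPS: units x clock (MHz) x FLOPs per clock."""
    return units * clock_mhz * flops_per_clock / 1000

# HD 3870: 320 stream processors at the ~775 MHz core clock, MAD = 2 FLOPs
ati = peak_gflops(320, 775, 2)      # ~496 GFLOPS

# 8800 GT: 112 SPs at a ~1500 MHz shader clock, MAD + MUL = 3 FLOPs
nvidia = peak_gflops(112, 1500, 3)  # ~504 GFLOPS

print(f"HD 3870: {ati:.0f} GFLOPS vs 8800 GT: {nvidia:.0f} GFLOPS")
```

The raw peaks are nearly identical; the difference the posts describe is that ATI's 320 units only reach that peak when games keep all of them busy, while Nvidia's fewer units run at roughly double the clock and are easier to keep fed.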
November 18, 2007 12:37:30 AM

Exactly! It works like a single-threaded game. You could have a 1.8 GHz Opteron 165 getting its butt whooped by a 2.8 GHz 4000+ (sure, you'd never leave the Opty at 1.8, but it's a scenario). You've got more cores, but having fewer, higher-speed cores is more useful in a single-threaded game.
November 18, 2007 5:01:25 AM

Can you overclock the shader speed?
November 18, 2007 5:23:03 AM

Yes.
November 18, 2007 3:21:56 PM

With ATITool? And does it really make a difference?
November 18, 2007 9:33:09 PM

Wait, I must have thought this was a different thread. I dunno whether you can in ATITool; I haven't got a card with separate shader and core clocks to try it.
November 18, 2007 9:48:21 PM

You can't OC ATI's shaders separately, only the GPU, which in turn OCs the shaders; GPU clock = shader clock for ATI. With the 88xx series you can separately OC the shaders with RivaTuner, and eventually nVidia is going to release a capable OC tool in their ForceWare drivers.
November 18, 2007 10:13:39 PM

There you go ;) 

I should have known; RaBiT doesn't have any options for changing the shader clock. Oh well, I always miss the obvious.