Radeon 3870X2 - My First Impressions

January 29, 2008 11:53:49 AM

Well, yesterday I ordered the HIS Radeon 3870X2 from Overclockers, and at about 9:30 this morning it arrived! (Thank you OCUK!)

First, upon opening the box, you see the card... and man, is it big! I knew I had the measurements on me when I checked it would fit, but the size of it still shocked me, and so did the weight! This thing is a beast! Then, hidden under the card, is the usual stuff: CD, plug adaptors etc. A nice touch is the screwdriver/torch/level combo you get, with a choice of 4 heads (large and small cross and flat).

OK, after a fair amount of trouble installing it (I had to relocate my two HDs and remove my RAM to do so) I was ready to start. When I first powered up I heard this whooshing noise (best word for it, really) as it did the POST check (the beep at the start, right?). This, I presume, is the fan at full speed, and it's not that quiet.

After installing the drivers I tried the 2 games I have installed. TOCA 3, everything on full settings, played fine; as soon as I closed the game I checked CCC and the temp was in the mid 70s, so I'm guessing it wasn't much higher during the game. I then tried Crysis (oooh), and at 1680x1050 with no AA and everything except post-processing effects on high I got around 45fps (high 30s at lowest). I'm guessing my Q6600 CPU (at only 2.4GHz) would bring this down a little, so those of you with faster CPUs, enjoy! After a few minutes of playing Crysis the temp in CCC was high 70s. It is currently idling at 58. My temps appear to be only slightly higher than the ones here, http://enthusiast.hardocp.com/article.html?art=MTQ1NCw3LCxoZW50aHVzaWFzdA==, and that's probably because it's in a fairly tight space (if temps become a problem I'll move it).

I have also not yet found a way to monitor fan speed, but based on the fact that during the POST check it goes up to full speed (I'm almost certain it's the graphics fan) and my other 3870s didn't, I think it's working!

Edit: Does anyone know if all X2s have PCI-E 2.0? For some reason the parts that mentioned it on the box of mine were covered in black tape, and in GPU-Z it says the interface is PCI-E x16 @ x16, with no mention of PCI-E 2.0 anywhere. I also can't see any mention of it on the Overclockers product page. http://www.overclockers.co.uk/showproduct.php?prodid=GX-091-HT&groupid=701&catid=56&subcat=416
January 29, 2008 12:12:25 PM

OK, now I can't wait for mine to come and for me to get home and install it. Mine is sitting on a FedEx truck.

Which drivers did you use - the CD or a download?
January 29, 2008 12:39:36 PM

CD; I'm going to download the latest now.

Also, playing Crysis just now with my case side fan (36cm) turned up full, whenever I minimised Crysis to CCC (which was already open) the highest I saw the core temp was 80c, and in the second before the game minimised I doubt it could cool much! So I'm guessing a max load temp of 85c in a badly ventilated spot; when I can be arsed to move it up on top of my desk it will be cooler.
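For what it's worth, the back-of-envelope maths behind that guess looks like this (a rough Python sketch; the ambient temp and cooling rate are made-up numbers, purely illustrative):

import math

# Guessing the in-game temp from a reading taken a couple of seconds
# after minimising, assuming simple Newton cooling:
#   T(t) = ambient + (T0 - ambient) * exp(-t / tau)
ambient_c = 35.0   # assumed air temp around the card (made up)
reading_c = 80.0   # temp seen in CCC just after minimising
elapsed_s = 2.0    # rough delay before the reading
tau_s = 30.0       # assumed cooling time constant (made up)

# Solve for T0, the temp at the moment the game was minimised.
t0 = ambient_c + (reading_c - ambient_c) * math.exp(elapsed_s / tau_s)
print(f"estimated in-game core temp: ~{t0:.0f}c")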

Update: It scored 141 3DMarks less than my old Crossfire 3870 system, with 13647... still beats 97% of all systems they have results for :D
January 29, 2008 1:05:37 PM

Quote:
with my case side fan (36cm) turned up full

Off-Topic: To my knowledge, there are only two cases with 36cm fans. (Retail; who knows what those crazy-yet-lovable modders do...)
Are you using the APlusCase/XClio Diablo or the XClio 2?


On-Topic: Have you tried the "Crysis Ultra Quality" mod? I think you'll enjoy it!
http://www.crymod.com/thread.php?threadid=13790&page=1

Also, what's your 3D-Mark06 graphics score?
I'm dying to find out if I should go with this card or the G92 8800GTS... :D 
January 29, 2008 1:07:03 PM

Just said my 3DMark score in the previous post (13647), and I'm using an XClio Propellor case, but I currently only have the 12cm exhaust fan mounted; can't figure out how to get the front 12cm intake on yet!

Also, I might try that mod! Not sure if my CPU will hold back the X2, though.

Edit: Also, does anyone know why in CCC, under Current Clock Settings, it says my GPU clock is 300MHz, but in the Requested Clock Settings section above it (in the Overdrive tab) it says the GPU clock is at 825MHz? GPU-Z also says the GPU clock is 300MHz. I'm thinking that's either wrong or means something else, as with a clock that low there is no way it could nearly equal my old 3DMark score with twin 3870s.
January 29, 2008 1:13:57 PM

What's the point of this post? To state that you bought the card? j/k

How do you like the card?
January 29, 2008 1:23:00 PM

I had this same issue on my 3850s. I couldn't find anything on it.
January 29, 2008 1:26:26 PM

Heh, guess I'm going blind.
Kids, don't play with overpowered lasers!


Whoa, that's a nice case. I have the XClio WindTunnel; your case looks like a mix between mine and the XClio 2.
http://www.newegg.com/Product/Product.aspx?Item=N82E16811103011

Just wondering: did your case come with yellow rails for the 5.25" bays and green for the 3.5"?
I'm not that into fashion, but I know that the colours of aliens and eggs don't go with black... ^^
Well, maybe aliens... "Men in Black".

Do you really need a puny little 12cm intake fan? ^^
To install mine, I had to remove the 5.25" drives and the front cover, which had "flaps" on both sides.
Then I installed a 12cm fan as an exhaust.

13647... Eeep.
You will probably go even higher with newer drivers... which will deserve an even bigger "Eeep".
January 29, 2008 1:29:35 PM

What power supply are you running?
January 29, 2008 1:42:09 PM

Well, I love the card. My case didn't come with any tool-less stuff, so I had to use screws :( I'm also not totally sure if I need the intake fan; might leave it, as it's a pain to install!

I'm using a Hiper 680W one (the box recommends a 550W minimum for one X2, or 750W for two). And I'll get the new drivers some time this evening and try 3DMark again.
January 29, 2008 1:46:46 PM

haydox said:
Just said my 3DMark score in the previous post (13647), and I'm using an XClio Propellor case, but I currently only have the 12cm exhaust fan mounted; can't figure out how to get the front 12cm intake on yet!

Also, I might try that mod! Not sure if my CPU will hold back the X2, though.

Edit: Also, does anyone know why in CCC, under Current Clock Settings, it says my GPU clock is 300MHz, but in the Requested Clock Settings section above it (in the Overdrive tab) it says the GPU clock is at 825MHz? GPU-Z also says the GPU clock is 300MHz. I'm thinking that's either wrong or means something else, as with a clock that low there is no way it could nearly equal my old 3DMark score with twin 3870s.


I believe it says 300 because ATi clocks its cards down when you're in a simple 2D desktop environment (i.e. when you're looking at CCC or GPU-Z) in order to save power, but they clock back up to 825 under full-screen 3D and the like.
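In other words, the behaviour amounts to something like this (a toy Python sketch of the idea, not ATi's actual logic):

def reported_core_clock(running_3d_app: bool) -> int:
    """What a monitoring tool like CCC or GPU-Z would show."""
    IDLE_2D_MHZ = 300   # desktop / 2D power-saving clock
    FULL_3D_MHZ = 825   # the "Requested" clock, applied under 3D load
    return FULL_3D_MHZ if running_3d_app else IDLE_2D_MHZ

assert reported_core_clock(False) == 300  # reading CCC at the desktop
assert reported_core_clock(True) == 825   # mid-game / mid-benchmark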
January 29, 2008 2:22:58 PM

bravo29 said:
What power supply are you running?


This is what I want to know. AMD lists the 3870X2 under two-card power supplies:

http://ati.amd.com/technology/crossfire/buildyourown2.h...

What I'd like to know is: is this for a Crossfire setup of two 3870X2s, or for a single 3870X2?

The power requirements in the reviews say something along the lines of 225W at idle and up to 384W at full load.

I have an Antec Neo 550, and I was going to get a Gigabyte 3870 for $229, but I am so tempted to get the 3870X2 and just defer something else from my tax return. The only thing is that I might have to get a higher-wattage power supply (around 700W or 850W, which would cover the listed Seasonic and Antec respectively). That adds to the cost.

When is R770 supposed to be out? Last I heard it was 6 months. If ATI's on time, then a 3870X2 is expensive for a 6-month card to be passed down to a relative or friend, but if R770 is delayed till 2009, then the 3870X2 is the best choice for the foreseeable future.

Should I do a poll?

What the heck, buy a 3870X2 and a new PSU (probably around $600)

or

Wait for R770, get a mere 3870 and buy that $500 loveseat from Ikea

Okay, Newegg's PSU calculator estimated 607 watts for my system, choosing the 2900 XT, which is on the AMD PSU list alongside the 3870X2. So I guess AMD doesn't have a true Crossfire X4 recommendation for the 3870X2 yet.

The Antec TruePower 850 is $179, and the Seasonic M12 700 is also $179. Is this card worth having to replace a power supply that I originally bought for a regular 3870? For this, I don't just need an income tax rebate, I need the stimulus package too. Maybe R770 will be out by then. LOL
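For what it's worth, my rough headroom maths (a Python sketch; it assumes the review's 384W is whole-system draw at the wall and guesses ~80% PSU efficiency, both assumptions on my part):

# Rough PSU headroom check. Assumes the 384W review figure is
# whole-system draw at the wall, and guesses PSU efficiency at ~80%.
wall_load_w = 384          # reported full-system load at the wall
psu_efficiency = 0.80      # assumed, not measured
psu_rating_w = 550         # my Antec Neo 550

dc_load_w = wall_load_w * psu_efficiency   # ~307W actually drawn from the PSU
headroom_w = psu_rating_w - dc_load_w      # ~243W to spare
print(f"~{dc_load_w:.0f}W DC load, ~{headroom_w:.0f}W headroom on {psu_rating_w}W")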




January 29, 2008 2:30:44 PM

yipsl said:
This is what I want to know. AMD lists the 3870X2 under two-card power supplies:

http://ati.amd.com/technology/crossfire/buildyourown2.h...

What I'd like to know is: is this for a Crossfire setup of two 3870X2s, or for a single 3870X2?

The power requirements in the reviews say something along the lines of 225W at idle and up to 384W at full load.

I have an Antec Neo 550, and I was going to get a Gigabyte 3870 for $229, but I am so tempted to get the 3870X2 and just defer something else from my tax return. The only thing is that I might have to get a higher-wattage power supply (around 700W or 850W, which would cover the listed Seasonic and Antec respectively). That adds to the cost.

When is R770 supposed to be out? Last I heard it was 6 months. If ATI's on time, then a 3870X2 is expensive for a 6-month card to be passed down to a relative or friend, but if R770 is delayed till 2009, then the 3870X2 is the best choice for the foreseeable future.

Should I do a poll?

What the heck, buy a 3870X2 and a new PSU (probably around $600)

or

Wait for R770, get a mere 3870 and buy that $500 couch from Ikea


Are you sure the "225 at idle and up to 384" isn't for the full test system of motherboard, CPU, and 3870X2? Or does the card really use nearly 400 watts on its own?
January 29, 2008 2:33:56 PM

Ah, thanks fudge, I wondered if it was something like that.

Yipsl, it's really up to you; the box of my X2 says 550W minimum for one card and 750W minimum for twin X2s (which have no driver support yet!). So it sounds like your current PSU could cope (especially as it's a good brand name). I have also owned both a twin 3870 setup and this 3870X2 setup, and the X2 is far easier. To start with, 3870s have an annoying problem on nearly all brands where the fan speed control is dodgy and can often lead to huge temps (mine hit 100c). This can't be controlled easily, as programs like Riva hate Crossfire! However, the X2 has a good working cooling system (mine hits 80c-ish under load) and is generally a lot less hassle!

Oh, and seraphic, those wattages are full system. Look at the link in my first post; they did the wattages and temps of an 8800 and an X2 and compared them.
January 29, 2008 2:55:59 PM

haydox said:
CD; I'm going to download the latest now.

Also, playing Crysis just now with my case side fan (36cm) turned up full, whenever I minimised Crysis to CCC (which was already open) the highest I saw the core temp was 80c, and in the second before the game minimised I doubt it could cool much! So I'm guessing a max load temp of 85c in a badly ventilated spot; when I can be arsed to move it up on top of my desk it will be cooler.

Update: It scored 141 3DMarks less than my old Crossfire 3870 system, with 13647... still beats 97% of all systems they have results for :D


Nice! My own PC has a stock Q6600 just like yours, and a BFG 8800 GTX OC2, and I only get 12000. Wow! :o 
January 29, 2008 3:13:28 PM

Woot! Mine just came!

Now I just have to wait to get home to install it.
January 29, 2008 3:17:54 PM

Nice, bravo29! Good luck, and have fun :D

Aevm, I was looking at buying the exact same card before I was told the X2 was out on Monday! And even though mine got 1647 more 3DMarks, there isn't that much difference in games (yours will beat mine in a lot of games). The only reason mine got more marks (I read this next bit somewhere else) is that 3DMark is a synthetic testing program that just stresses your graphics card and its capabilities; mine simply won as it has two cores! Yours will excel over mine in games with no multi-GPU support.
January 29, 2008 3:25:46 PM

Once 3DMark tunes the app to the drivers, I can see the scores going up.
January 29, 2008 3:29:21 PM

Thanks, feeling 5% less envious now :)  Still, congratulations!!!

Post some screenshots from Crysis on "very high" one of these days... I keep hearing how ATI cards make games look better than nVidia cards. I wonder if there's any truth in that.
January 29, 2008 3:46:36 PM

Yeah, I can't wait till ATI actually releases some new drivers specifically for the X2.

Also, I sadly can't play Crysis on Very High, only High (using good old XP). I might try the mod snillet suggested in post 4, and I'll try to get some screenshots of it on High; it still looks awesome!

Oh, and I just installed CoD4. It autodetected settings at 1024x768 res; needless to say I felt insulted, so I put it on 1680x1050, everything else up full (including AA), all textures on Extra, and it ran perfectly and looked incredible! Sadly, as my room has got warmer thanks to the central heating, the graphics card has got warmer too, so it's taking longer to cool, and I can clearly hear the fan on it; during gaming it's not bad at all, but with no sound it's definitely there!
January 29, 2008 4:04:33 PM

You guys really ought to think about at least a light OC on those quads... it really can just be too easy... and free.
January 29, 2008 4:17:04 PM

I actually manage to run Crysis with my single 8800GTS 640MB, with all settings on Very High and at a respectable framerate (high 20s) - with 2xAA! (High 10s with 4xAA.) And on Vista.

My secret:
800x600 - The pixels are as big as the Koreans, and the whole game feels a bit "Playstationy"


The Ultra mod "optimises" the settings to get a higher graphical "feel" to the game. So far, I'm loving it.
I'm running it at 1024x768, without AA.



6000+ @ 3GHz, ASRock K8SLI-eSATA2 + AM2CPU Upgrade, 4GB Corsair 800MHz Twin-X @750MHz (AM2 Multiplier issue.)
Windows Vista Ultimate x86, with Memory Hole turned off.
Defective Leadtek 8800GTS 640MB @ Stock speeds. (RMA soon.)
21" >30kg Geriatric Monitor (PanaSync/Pro 7G) - 1600x1200@75Hz for Stereo 3D Gaming.
eDimensional Shutterglasses - which I can't get to work with my 8800 :( 
January 29, 2008 4:18:08 PM

I know how to OC mine: just up the FSB and, once it's stable, lower the voltages as far as I can. I even got someone's exact voltages from their Q6600, so I have a base to go on. I'm just making sure the temp on my graphics card doesn't get too high currently (Update: just got my brother to make the valve on my radiator actually turn, so I can turn it right down if it gets too hot!). If that is all OK then I'll OC my CPU to 3GHz.
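The arithmetic for that target is simple enough (core clock = FSB x multiplier, and the Q6600's multiplier is locked at 9x); a quick Python sketch:

# Q6600 overclock arithmetic: core clock = FSB x multiplier (locked at 9x).
MULTIPLIER = 9
stock_fsb_mhz = 266    # 9 x 266 ~ 2.4GHz, the stock speed
target_fsb_mhz = 333   # 9 x 333 ~ 3.0GHz, the goal

print(f"stock:  {MULTIPLIER * stock_fsb_mhz / 1000:.1f}GHz")
print(f"target: {MULTIPLIER * target_fsb_mhz / 1000:.1f}GHz")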

Ah, OK snillet, I get what the mod does now. So I can, in effect, have the settings higher and take a far smaller framerate hit. How will it work on XP with all settings already on High (except no AA and post-processing on medium)?
January 29, 2008 4:33:26 PM

haydox said:
Ah, thanks fudge, I wondered if it was something like that.

Yipsl, it's really up to you; the box of my X2 says 550W minimum for one card and 750W minimum for twin X2s (which have no driver support yet!). So it sounds like your current PSU could cope (especially as it's a good brand name). I have also owned both a twin 3870 setup and this 3870X2 setup, and the X2 is far easier. To start with, 3870s have an annoying problem on nearly all brands where the fan speed control is dodgy and can often lead to huge temps (mine hit 100c). This can't be controlled easily, as programs like Riva hate Crossfire! However, the X2 has a good working cooling system (mine hits 80c-ish under load) and is generally a lot less hassle!

Oh, and seraphic, those wattages are full system. Look at the link in my first post; they did the wattages and temps of an 8800 and an X2 and compared them.


Okay, the listing under Crossfire with the 2900 XT was confusing me. The 3870X2 is virtually Crossfire in itself. I have a 690G board, but I was thinking of switching to a 770 board for future decent Phenoms (yes, I'm charitable towards AMD right now).

As long as I don't have to buy a new PSU, I'll go for it, and afford the loveseat if I don't get the couch right away. We really are sitting on the floor and computer chairs, but this is gaming. How can mundane things like furniture compare to that?

Thanks for the clarification on the total system watts. The main reason I'd planned on getting the Gigabyte with a Zalman fan was the fan issues I'd heard about. It just had GDDR3 instead of GDDR4, and so does the 3870X2. Any idea which brand is rated best at this early date?
January 29, 2008 4:57:14 PM

Think I must be missing something - you have just forked out £300 to replace your 'old' system running two 3870 cards in Crossfire, and you've replaced them with a 3870 with two GPUs which in fact runs a tad slower!
January 29, 2008 5:03:26 PM

haydox said:
Just said my 3DMark score in the previous post (13647), and I'm using an XClio Propellor case, but I currently only have the 12cm exhaust fan mounted; can't figure out how to get the front 12cm intake on yet!

Also, I might try that mod! Not sure if my CPU will hold back the X2, though.

Edit: Also, does anyone know why in CCC, under Current Clock Settings, it says my GPU clock is 300MHz, but in the Requested Clock Settings section above it (in the Overdrive tab) it says the GPU clock is at 825MHz? GPU-Z also says the GPU clock is 300MHz. I'm thinking that's either wrong or means something else, as with a clock that low there is no way it could nearly equal my old 3DMark score with twin 3870s.

Well duuh, the power feature, dude.

The 300MHz is the 2D frequency; the Overdrive tab shows what it kicks up to at 3D speeds.

That means it will OC automatically to the desired overclocked frequency when it goes into 3D mode; thus you can still have a low 300MHz in 2D and get some OC in 3D mode ;)
January 29, 2008 5:04:51 PM

I didn't know what the power feature was! I just trusted that it would work... to be honest I was happy to have a card which could keep itself cool and not hit 100c after 2 minutes of gaming!

Nice to know it's meant to happen though :D
January 29, 2008 5:25:04 PM

Crysis will be my test, as will playing EQ2 on Extreme High quality with shadows.

I will post my screenshot once I get home and get mine running.
January 29, 2008 5:30:14 PM

Hmm, glad I didn't wait for the 3870X2 then. My E8400 and 8800GTS (512MB) play Crysis with everything on Very High (and I mean everything) but no AA - about 20-30fps (playable for me) (no mods), and I haven't even overclocked my GTS yet! This on Vista Ultimate 32-bit of all things.

So it's not very impressive to me, but yeah, it's probably the CPU that is holding your card back in Crysis. You should overclock your CPU by at least a few hundred MHz, and you might see a big performance gain. No friggin' way my 8800GTS should be outperforming your 3870X2 lol =)

January 29, 2008 6:58:23 PM

I'm almost certain it's my CPU; it's the only thing that could be holding it back. I have the X2, 4GB of RAM and about 400GB of free HD space! I sadly can't test Very High as I only have XP!
January 29, 2008 7:42:42 PM

Just got mine in, and so far only one issue. Upon loading the drivers, Vista said there was no compatible card installed. It killed the video signal but installed the drivers. I had to do a hard reboot and it came back up.

Have not run 3DMark yet, but I got into EQ2, and running under Extreme Quality graphics is a go with no lag or stutter.

Next up: 3DMark and Crysis.

Note: I had to relocate my HDs for this RMS Titanic of a card.
January 29, 2008 7:45:09 PM

haydox said:
I'm almost certain it's my CPU; it's the only thing that could be holding it back. I have the X2, 4GB of RAM and about 400GB of free HD space! I sadly can't test Very High as I only have XP!


Nah, it's Crysis itself: it's not optimized properly for Crossfire, and that hurts the X2's performance. The X2 will win in lots of other games. There's hope; some Crysis patch or driver update could fix this eventually.

http://www.gamespot.com/pages/forums/show_msgs.php?topic_id=26171950
January 29, 2008 7:54:13 PM

Yeah, bravo, mine actually came with a note saying that with Vista the screen can go black and it will still install the drivers. They just said leave it for half an hour or so, then reboot and it will be fine. Odd that they released it knowing about that problem, but nice that HIS put in a little note saying what to do!

And, aevm, I thought it could be something to do with the game itself. I know patch 1.1 improved fps for dual-GPU setups, but I'm guessing if it were more like CoD4 it would be a lot better; in CoD4 there is an option to select multi-GPU setups.
January 29, 2008 7:59:08 PM

12986 is the score mine came back with.
January 29, 2008 8:02:19 PM

Nice. I reckon my extra few points came from my CPU being a quad core... seems to be the only area I beat you in.

Also, a little update on this card's cooling: after 1 hour of playing Crysis, as soon as I quit I checked CCC and the temp was 76c. Taking into account the short time it had to cool and the speed it cools after 100% usage, I'm guessing it was at around 85c.

One little question: does anyone know if all X2s have PCI-E 2.0? For some reason the parts that mentioned it on the box of mine were covered in black tape, and in GPU-Z it says the interface is PCI-E x16 @ x16, with no mention of PCI-E 2.0 anywhere. I also can't see any mention of it on the Overclockers product page. http://www.overclockers.co.uk/showproduct.php?prodid=GX-091-HT&groupid=701&catid=56&subcat=416
January 29, 2008 8:33:48 PM

So are 8 GPUs going to happen then? They say 4 cards, not 4 GPUs...

Quote (discussing the X2 card):

"The card will support CrossfireX technology by the end of this quarter, which will allow up to four graphics cards to work together to scale multimedia performance, AMD said in a news release."

http://news.xinhuanet.com/english/2008-01/29/content_75...

Ryan
January 29, 2008 8:54:16 PM

I have to admit the card is a risky move for ATI. ATI has been hurting since the absolute destruction of the 3870 by the G92 series. Why am I concerned? Because this follows similar flaws from other video companies. Everyone remembers the horrible Voodoo card which tried to solve its GPU issues by stacking them up. Also, everyone remembers the more recent 7950 GX2 which, besides allowing an SLI setup on a single-slot mobo, was a horrible waste of money when the 8800 series absolutely killed it. I see the same happening with this card. Two is not always better than a newer, updated chip. Now, it woulda been badas* if ATI could have made a dual-core GPU. That would make a huge difference, and the technology is already there from CPUs. I don't know what Nvidia has planned, but I would not be surprised to see an Nvidia 9800 dual-core GPU.
January 29, 2008 9:01:17 PM

One question: did yours just have two 6-pin plugs, or one 6-pin and one 8-pin?

My PSU doesn't have a 2x4 (8-pin) plug for VGA, and the box says for Overdrive use you need one 2x4 PCI-E 8-pin.

Any harm in just running the 6-pin in it, or does anybody know somewhere to get a 6-to-8-pin adapter?
January 30, 2008 1:14:38 AM

bravo29 said:
Any harm in just running the 6-pin in it, or does anybody know somewhere to get a 6-to-8-pin adapter?

I was under the impression that you needed TWO power cables plugged into the card in order for it to work properly. If you don't have an 8-pin, two 6-pins are supposed to work fine; you just won't be able to overclock it.
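The budget arithmetic behind that, going by the PCI-E spec limits (a rough Python sketch; exactly what threshold CCC checks before unlocking Overdrive is an assumption here, not documented):

# PCI-E power budget arithmetic, per the spec limits:
SLOT_W = 75        # an x16 slot itself supplies up to 75W
SIX_PIN_W = 75     # one 6-pin PCI-E connector
EIGHT_PIN_W = 150  # one 8-pin PCI-E connector

two_six_pins = SLOT_W + 2 * SIX_PIN_W              # 225W: enough at stock clocks
six_plus_eight = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W: the headroom Overdrive wants
print(two_six_pins, six_plus_eight)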
January 30, 2008 5:01:13 AM

lankiller said:
I have to admit the card is a risky move for ATI. ATI has been hurting since the absolute destruction of the 3870 by the G92 series. Why am I concerned? Because this follows similar flaws from other video companies. Everyone remembers the horrible Voodoo card which tried to solve its GPU issues by stacking them up. Also, everyone remembers the more recent 7950 GX2 which, besides allowing an SLI setup on a single-slot mobo, was a horrible waste of money when the 8800 series absolutely killed it. I see the same happening with this card. Two is not always better than a newer, updated chip. Now, it woulda been badas* if ATI could have made a dual-core GPU. That would make a huge difference, and the technology is already there from CPUs. I don't know what Nvidia has planned, but I would not be surprised to see an Nvidia 9800 dual-core GPU.


Horrible? I had a Voodoo 2 card that spanked some serious ass (Unreal etc.)
Ryan
January 30, 2008 1:03:48 PM

Ironnads said:
Horrible? I had a Voodoo 2 card that spanked some serious ass (Unreal etc.)
Ryan


I had a Diamond Monster (the first Voodoo, with 4 megs) and then I had a Voodoo 2 with 12 megs, but I never had two of the second card. Did it work better than Crossfire and SLI?

If I wanted to spend twice as much on a 790 board, and almost $200 for a new PSU, then I could get a second 3870X2 or an R700 next fall, but that's expensive for a few extra fps. Buying one 3870X2 this week is enough GPU for a year.

All the people kvetching about Crysis, send a message to the developers who push the envelope too far past current tech: wait till it's been out for four years, buy it for $10, and play it on that generation's IGP or Swift.

January 30, 2008 1:08:22 PM

lankiller said:
I have to admit the card is a risky move for ATI. ATI has been hurting since the absolute destruction of the 3870 by the G92 series. Why am I concerned? Because this follows similar flaws from other video companies. Everyone remembers the horrible Voodoo card which tried to solve its GPU issues by stacking them up. Also, everyone remembers the more recent 7950 GX2 which, besides allowing an SLI setup on a single-slot mobo, was a horrible waste of money when the 8800 series absolutely killed it. I see the same happening with this card. Two is not always better than a newer, updated chip. Now, it woulda been badas* if ATI could have made a dual-core GPU. That would make a huge difference, and the technology is already there from CPUs. I don't know what Nvidia has planned, but I would not be surprised to see an Nvidia 9800 dual-core GPU.



You are a bit late to the game. ATi has been hurting since AMD bought them... but as far as this generation goes, it's been since the 2900 XT failed against the 8800 series.

The 7950 GX2, 9800 GX2, and HD3870X2 are all bridge cards: end-of-generation parts just trying to squeeze out a bit more performance. If ATi hadn't released this card, we wouldn't even be talking about them being on par with an Ultra.

nVidia isn't releasing a dual-core GPU; it's a dual-GPU card, the 9800 GX2.

ATI has tape-outs of a dual-core GPU, the RV770. The high-end solution plans quad core (2 dual-core GPUs). This makes the cards modular and extremely profitable, since you only have to create one core to satisfy all ranges of cards.
January 30, 2008 1:31:36 PM

bravo29 said:
One question: did yours just have two 6-pin plugs, or one 6-pin and one 8-pin?

My PSU doesn't have a 2x4 (8-pin) plug for VGA, and the box says for Overdrive use you need one 2x4 PCI-E 8-pin.

Any harm in just running the 6-pin in it, or does anybody know somewhere to get a 6-to-8-pin adapter?


The 8-pin, if you want to use the OC software.
January 30, 2008 1:37:36 PM

yipsl said:
I had a Diamond Monster (the first Voodoo, with 4 megs) and then I had a Voodoo 2 with 12 megs, but I never had two of the second card. Did it work better than Crossfire and SLI?

If I wanted to spend twice as much on a 790 board, and almost $200 for a new PSU, then I could get a second 3870X2 or an R700 next fall, but that's expensive for a few extra fps. Buying one 3870X2 this week is enough GPU for a year.

All the people kvetching about Crysis, send a message to the developers who push the envelope too far past current tech: wait till it's been out for four years, buy it for $10, and play it on that generation's IGP or Swift.


Just had the one... fantastic card at the time though. I remember drooling over Unreal... (shame Unreal 2 was crap!)
Ryan
January 30, 2008 3:17:24 PM

Bravo, mine came with a 6-pin and an 8-pin plug, but in the box it had a 6-to-8-pin adaptor; if yours didn't come with one, maybe it was forgotten. Try contacting where you got it from.
January 30, 2008 3:42:38 PM

Haydox: the reason GPU-Z is showing the card running at only 300 is that that's the speed of the GPU in 2D mode. If you were to run RivaTuner and show the history, you would see it go from 300 to 875 or whatever it runs at while doing a 3D app, and it will promptly go back to 300 when it's done "working". I'm a little confused about the PCI-E 2.0 thing as well. I ordered a pair of them from NCIX (Diamond Vipers) and am waiting for them to come in, but NCIX shows them as PCI-E 2.0 x16, and when I open GPU-Z it shows my current 3870s both as PCI-E 2.0 x16 running at 2.0 x16.

As for the power plugs: the 6-pin will work, but you will not be able to overclock; the 8-pin is required before CCC will allow the clocks to be moved higher. As for your fan: yes, it's working, and those temps are normal for that type of GPU. I'm just glad I have mine all on water; my load temps after a couple of hours of Crysis are only 46 for the top card and 47 for the bottom card.

Can't wait for the two 3870X2s to come in. I'll find out soon enough if I'm able to Crossfire them yet!! LOL Wish me luck, boys!
January 30, 2008 4:10:04 PM

The latest version of Everest Ultimate will let you know if your PCI-E bus is running 2.0 or not.

-mcg
January 30, 2008 4:52:13 PM

lankiller said:
I have to admit the card is a risky move for ATI. ATI has been hurting since the absolute destruction of the 3870 by the G92 series. Why am I concerned? Because this follows similar flaws from other video companies. Everyone remembers the horrible Voodoo card which tried to solve its GPU issues by stacking them up. Also, everyone remembers the more recent 7950 GX2 which, besides allowing an SLI setup on a single-slot mobo, was a horrible waste of money when the 8800 series absolutely killed it. I see the same happening with this card. Two is not always better than a newer, updated chip. Now, it woulda been badas* if ATI could have made a dual-core GPU. That would make a huge difference, and the technology is already there from CPUs. I don't know what Nvidia has planned, but I would not be surprised to see an Nvidia 9800 dual-core GPU.


Yeah, but modern driver technology and OSes do a much better job of handling more than one GPU than back in the days of the Voodoo; plus the X2 has been released with working drivers, which the Voodoo didn't have; plus, plus, plus: dual-GPU support is infinitely more common now than in the Voodoo's day, and has come a long way since the 7950 GX2's day. If both ATI and Nvidia are launching cards with multiple GPUs, then that must be a big testament to the direction graphics cards and games are taking.