AMD 3870 X2 official specs!

http://en.hardspell.com/doc/showcont.asp?news_id=2134

Official specs of 3870 X2 revealed. Check it out
  1. looks neat. tho i guess we have to wait for the benchies.
  2. "666 million transistors"
    lol it's evil...
  3. So when can we expect to see this in stores?
  4. Kari said:
    "666 million transistors"
    lol it's evil...

    :O. ati, three letters, times 3 by every letter and you get 3 3 3 or nine...divide nine by three and you get three...3 numbers for each letter..if you times them by two...you get,,,6 6 6, dear god.

    other than that,
    hawtness, (no pun)
    can't wait.

    edit:the image links aren't working fo meh.
  5. Ok, I haven't posted in a long time now. I was reading this and with my 8800 640 I feel like I was living under a rock.

    640 Stream Processors??? That is totally insane.

    Well, I give up. I am going to only buy new systems at this point with each "revolutionary" game that comes out, that marks a new Game Engine generation, ex. Doom 3, Crysis, etc.

    My next new system will be when the next engine AFTER Crysis is out. Then I will WAIT until an x4 or x8 videocard of that generation can get 60 FRAMES PER SECOND at 1080P, or else I ain't buying nothing.

    There. That tactic seems foolproof. I will never buy for a game again only to be burned at sub-30 FPS gameplay. Lesson learned. Just blew $400+ a few months ago for 96 stream processors, and now we are up to 640????

    Live and learn.
  6. Looks hawt. Although I'm sure my 8800GT will be fine for me until the generation that follows the 9800GTX. No thank you to buying a new card every generation.

    But this looks exciting for anyone about to build.
  7. Aaaaand, NVidia's response is...
  8. Sure, 640 stream processors... yay.

    Too bad it's crippled by a lack of ROPs and TMUs (only 16 of each per GPU, so a total of 32 each for the X2, which severely bottlenecks all your pretty stream processors).

    EDIT: Don't get me wrong, I'd really love one of these cards. I'm just disappointed in ATI for not improving on a blatant shortcoming in their flagship product. Would 8 more of each (ROPs and TMUs) really have increased cost that much? I'm willing to bet the cost would have been offset by the reduction in memory lanes.
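    For a rough sense of what those unit counts mean, here's a back-of-envelope fill-rate sketch. The 16 ROPs/TMUs per GPU come from the post above; the 775 MHz clock is the single HD 3870's reference core clock, assumed here for each GPU on the X2 as well.

```python
# Rough peak fill rate = units * core clock. Gives gigapixels/s for ROPs
# or gigatexels/s for TMUs. 775 MHz is an assumed (single-3870) clock.
def fill_rate_gps(units, clock_mhz):
    return units * clock_mhz / 1000.0

current_per_gpu = fill_rate_gps(16, 775)  # 16 ROPs -> 12.4 Gpixel/s per GPU
with_8_more     = fill_rate_gps(24, 775)  # the "8 more" suggested above -> 18.6
print(current_per_gpu, with_8_more)
```

    So the suggested 8 extra units per GPU would be roughly a 50% bump in peak fill rate, which is the gap being complained about.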
  9. You can't Quad-crossfire these, can you?
  10. So that's Octo-crossfire then.
  11. Evilonigiri said:
    So that's Octo-crossfire then.


    That calls for getting out the marshmallows or hot dogs to roast on the heat output.

    Now where was that ad for a 1500W PSU?
  12. No octo, it doesn't have enough connectors; 2 of these, or 4 of the single ones, for quad xfire.


    Idk why people say "will this beat dual 3850???" etc...


    Obviously if it's a 3870 X2 then it is 2x 3870s, which means it will beat 2x 3850s!!


    Common sense people!


    At least xfire scales extremely well compared to SLI, so this will likely match 2x 8800GT in most things.


    It isn't really crippled by ROPs; you can just use the shaders to do AA instead.
  13. Actually Hatman, I just read the last link posted by qwertycoptor. It says it can be quad-crossfired. PSU support is another thing...Jesus, where's that 2k PSU?
  14. i think i need a change of boxers :T the ati fanboy inside of me is crying tears of happiness T_T...and here i was bout to go for a 8800GTS. but we'll see how the 9xxx series stands eh?
  15. we'll see when it comes out, ati never has power at the start then they pull that ace outta nowhere, usually too little too late!
  16. Can't wait to see benchies for this thing. If it performs as well as is speculated, it may very well be the card for my in progress build.
  17. I said it CAN be quad-crossfired, since it has 2 GPUs, but 4 of them cannot be used to make octo; just 2 of those cards for quad.

    Np strangestranger sry 4 being so aggressive btw :P


    Should perform just as well as 2x 3870's in xfire so have a look to see how well they do.


    Tbh it shouldn't need a stupid amount of power... a single 3870 is rated at 100 watts, so if it's like the GX2 power-wise it should be less than double; it would be around 150 watts for a single 3870 X2. So dual ones of these will likely use less power than dual GTXs.
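    Putting the post's own estimates side by side (100 W per 3870 is the rated figure quoted above; 150 W per X2 is the GX2-style guess, not a measured number):

```python
# Quick power-draw comparison using only the estimates from the post above.
single_3870_w   = 100                  # rated board power per HD 3870
x2_estimate_w   = 150                  # guess: two GPUs on one board share components
quad_via_two_x2 = 2 * x2_estimate_w    # quad-GPU from two X2 cards  -> 300 W
quad_via_four   = 4 * single_3870_w    # quad-GPU from four 3870s    -> 400 W
print(quad_via_two_x2, quad_via_four)
```

    On those numbers, two X2s would save about 100 W over four single cards for the same GPU count.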
  18. It's a rumor...

    - The AMD logo in the upper right hand corner of every image is not centered. In an official PowerPoint presentation, all the logos should be even with each other; it's professionalism. Look at every official thing AMD has released in the past: it's even, clean-cut, professional. These are sloppy and don't look anything close to professional.

    - If these were official specs, DailyTech or TG Daily would be all over it like a nude celebrity and the tabloids. But, some no-name site is covering the OFFICIAL specs? ...right.

    Until this comes from a serious and legit source, I'm calling this one BS. Nothing seems legit about this article.
  19. dam you just shot down all my hopes and dreams.... guess it's back to cutting myself
  20. Well, if those aren't the real stats, don't ya think it'll still be close to that? 1 teraflop seems realistic when you think of 2 3870s working together, right?
  21. It should be under, since the 8800GT gets only 504 Gigaflops, and the 3870 is weaker.
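    Those figures line up with the usual shaders × shader-clock × ops-per-clock back-of-envelope math. The clocks and ops-per-clock counts below are the reference figures as I remember them, so treat them as assumptions:

```python
# Theoretical GFLOPS = shader count * shader clock (GHz) * FLOPs per clock.
def gflops(shaders, clock_ghz, flops_per_clock):
    return shaders * clock_ghz * flops_per_clock

print(gflops(112, 1.5, 3))        # 8800GT: 112 scalar SPs, MAD+MUL -> 504.0
print(gflops(320, 0.775, 2))      # HD 3870: 320 SPs, MAD only -> ~496
print(2 * gflops(320, 0.775, 2))  # 3870 X2: two 3870 GPUs -> ~992, i.e. the ~1 teraflop claim
```

    So a single 3870 lands just under the 8800GT's 504 gigaflops on paper, and doubling it gets you the roughly-1-teraflop headline number.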
  22. Even if it's not legit, the stats should be right; it IS what AMD is aiming for, the dual 3870 that has been in the works for months.
  23. thx good info

    one thing guys: it looks like it's 13-15 inches long, so it will fit in only a few cases
  24. Idk, maybe they will shrink it; remember the 7900GX2, that was huge when it came out.
  25. dragonsprayer said:
    thx good info

    one thing guys: it looks like it's 13-15 inches long, so it will fit in only a few cases


    This card does look real to me, but more like an engineering card and not retail.

    As for its size, doesn't anyone know how to use a hacksaw? My hacksaw sits right next to my hammer, both helping to fit stubborn parts into places they otherwise wouldn't.
  26. Benchies benchies pleeeeeeze! This means I'll be waiting even longer before I buy my new comp.. (I must have the longest lived 5900 card!).
    Wonder if this thing'l get 40-50 frames per sec in crysis with eye candy turned up at 1000 by 1600???
    Yipeee! (It's all good!)
    Ryan.
  27. I have a new question

    anyone know if you can combine a 3870 with a 3870x2 in crossfire?

    also anyone have the release date?
  28. As far as I know, crossfire-x supports "up to" 4 GPUs, so I would imagine it would work just fine.
  29. Ironnads said:
    Benchies benchies pleeeeeeze! This means I'll be waiting even longer before I buy my new comp.. (I must have the longest lived 5900 card!).
    Wonder if this thing'l get 40-50 frames per sec in crysis with eye candy turned up at 1000 by 1600???
    Yipeee! (It's all good!)
    Ryan.



    LOL! You poor poor man.... And I thought my 6600GT 128mb agp was bad.....

    I'd say, based on what one 3870 does in Crysis, that your expectations are fairly reasonable. Hopefully it works even better than 2 cards in crossfire. Everything maxed and 50 frames, no, but it should still pack a nasty punch compared to even an 8800GT.

    Don't get too excited now, people... Yeah it looks great, but how much will it cost? What is Nvidia's answer? How much will the NV 9800GTX/GTS cost when released in FEB (hopefully)? How much power will they have? What about an NV 8800GTSx2?

    Lot of questions to be answered, before anyone should say "I am definitely buying this card". :sweat:
  30. EVEN IF these specs were real, I would really like to know if it even WORKS?!

    my 3870s look sweet in my rig..... but the crap doesn't work (in most instances).
  31. Er..... So the R680 might not even work because you've had bad luck with your 3870?
    Kind of a silly thing to say, cause (maybe I am wrong) most people seem to have no problems with their 3870s =\
  32. Ok, I haven't posted in a long time now. I was reading this and with my 8800 640 I feel like I was living under a rock.

    640 Stream Processors??? That is totally insane.

    640 stream processors is indeed insane, but if we look back at the SPs of the 8800GTX and HD2900XT, the 2900 has more than twice as many: 320 compared to 128 on the 8800. When the results came in, did all those stream processors kill the 8800? Nope. IMHO, when comparing today's DirectX 10 graphics cards, bigger specs may not guarantee faster or better performance. It's all about the architecture.


    BTW: I may be wrong on this, but if I remember correctly, the SPs on Nvidia's and ATi's cards are totally different.
    Nvidia's SPs are simple scalar units that get scheduled independently, so each can work on a different task at the same time, leaving no SP idle.

    For ATi, the SPs come in groups of 5 that have to issue together; if the work can't be spread across all five, one ends up doing the whole job while the other four stay idle.

    Now if that is the case, then in the worst case only 1 of every group of 5 SPs in an AMD card does the job while the other 4 are drunk. Then 640 SPs may not seem as much.

    And am I a nv fanboi?
    Nah, own a HD2900XT here. :)
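    A toy sketch of why raw SP counts aren't directly comparable across the two designs. The 5-wide grouping for ATi and independent scalar scheduling for Nvidia are my recollection of the R600/G80 architectures, so take the model as an assumption:

```python
# If SPs issue in groups of `group_width` and the compiler can only pack
# `packed_ops` independent operations per group, the rest of each group idles.
def busy_sps(total_sps, group_width, packed_ops):
    return (total_sps // group_width) * packed_ops

best  = busy_sps(640, 5, 5)  # perfectly packed code: all 640 SPs busy
worst = busy_sps(640, 5, 1)  # fully dependent code: only 128 SPs busy
print(best, worst)
```

    So depending on how well the shader compiler packs the work, the X2's 640 SPs could behave like anywhere from 128 to 640 scalar units.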
  33. I wish somebody would hurry up and release a dual gpu card...
  34. kellytm3 said:
    I wish somebody would hurry up and release a dual gpu card...


    I wish Heidi Klum was my room mate, and perhaps a couple million dollars to keep us company. I guess we all have to wait.
  35. LOL! sure ya do..... Until you get to know her better and discover what a total spoiled B!tch she is! And this is one chick you DON'T wana try overclocking as she will overheat quickly in a bad bad way! =0

    Sorry I just don't think much of them supermodels or hollywood types. But I like your millions of dollars idea =D
  36. gamebro said:
    LOL! sure ya do..... Until you get to know her better and discover what a total spoiled B!tch she is!


    Ok, maybe for just a night or two. Got to have my fantasies, you know. Real life can be a drag at times.
  37. LOL! =)
  38. this should work a little bit better than 2 3870s in crossfire, as there is no crossfire bridge connecting the two, which can cause slowdowns; it's also why scaling is so bad on SLI, as it only uses one bridge.

    in a PCI-e 2.0 slot it should scale extremely well in crossfire, as it could theoretically allow each GPU to access x16 1.1 bandwidth in the same slot, meaning there would be no bottlenecks caused by a crossfire or SLI bridge.

    this should also help with the problem crossfire has in some games actually slowing down the FPS dramatically.

    hopefully nV can come out with a similar solution to force ATi prices downward, as a 2xG92 would beat a 2xRV670 in just about every app and game there is FPS-wise, making the ATi solution the price/performance king and nV the raw performance king, letting people choose between a price-efficient or purely powerful system...

    ahhh the joys of competition, how lovely they are for the consumer
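    The bandwidth claim in the second paragraph checks out on paper; the per-lane rates below are the standard PCIe figures after 8b/10b encoding overhead:

```python
# Per-direction PCIe bandwidth = lanes * per-lane rate (MB/s).
# PCIe 1.x carries 250 MB/s per lane; PCIe 2.0 doubles that to 500 MB/s.
def pcie_gbs(lanes, mb_per_lane):
    return lanes * mb_per_lane / 1000.0

gen1_x16 = pcie_gbs(16, 250)  # 4.0 GB/s
gen2_x16 = pcie_gbs(16, 500)  # 8.0 GB/s
per_gpu  = gen2_x16 / 2       # two GPUs sharing one 2.0 x16 slot
assert per_gpu == gen1_x16    # each GPU effectively sees x16 1.1 bandwidth
```

    Whether the on-board bridge chip actually exposes the full lane count to each GPU is a separate question, as the next reply points out.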
  39. sailer said:
    Ok, maybe for just a night or two. Got to have my fantasies, you know. Real life can be a drag at times.


    after a couple of days when you get bored feel free to give me a ring.. I'll take her off your hands for a bit. Even if she is a "hand full" hmm. :ouch:
    Ryan Adds
  40. Recently Used cars= ok
    Recently Used girls= not ok

    ewwwwwwww! =D
    Unless you are Quagmire from family guy, then you would say "Giggidy!"
  41. What can I say. I'm a Sailer, and isn't it known that a sailer has a different girl in every port?

    Heidi one port, someone else the next, and on down the line.
  42. bstep1989 said:
    this should work a little bit better than 2 3870s in crossfire, as there is no crossfire bridge connecting the two, which can cause slowdowns; it's also why scaling is so bad on SLI, as it only uses one bridge.


    GTX/Ultra can use 2 bridges, but they are different, and it's not a bridge bottleneck.

    Quote:
    in a PCI-e 2.0 slot it should scale extremely well in crossfire, as it could theoretically allow each GPU to access x16 1.1 bandwidth in the same slot, meaning there would be no bottlenecks caused by a crossfire or SLI bridge.


    No you'd still be stuck when you put 2 together, so while potentially faster than on a 16x(1.x) mobo, it wouldn't have any advantage over a 4 slot mobo for speed as each VPU still needs to make a 4 tap stop and then share communal chipset resources.

    Quote:
    this should also help with the problem crossfire has in some games actually slowing down the FPS dramatically.


    Not really, any slowdowns now are more driver related than hardware, which is similar to SLi.

    Quote:
    hopefully nV can come out with a similar solution to force ATi prices downward, as a 2xG92 would beat a 2xRV670 in just about every app and game there is FPS-wise, making the ATi solution the price/performance king and nV the raw performance king, letting people choose between a price-efficient or purely powerful system...


    Depends on the settings. As already seen, the HD3870 does much better in Xfire than a single-card comparison would suggest, and can beat dual G92s in enough situations to make it interesting. The scaling runs out for both, though; some structural limitations may limit SLI sooner, and may remove one of its biggest advantages, the hardware-based AA, once you move beyond the 2-card model.

    Which leads me to believe that the main bravado 'e-p3n1s' fight will be over who can get quad or more working smoothly with the highest efficiency. This would be an area where getting supertiling to finally work as expected would offer a greater advantage than the more restrictive traditional workload sharing.
  43. justinmcg67 said:

    - The AMD logo in the upper right hand corner of every image is not centered. In an official PowerPoint presentation, all the logos should be even with each other; it's professionalism. Look at every official thing AMD has released in the past: it's even, clean-cut, professional. These are sloppy and don't look anything close to professional.


    What are you talking about? Your statements are pretty funny since you're basing it on PowerPoint, and obviously with little experience with it or ATi/AMD's presentations.
    If it involved something spectacular, fine, but the anomalies you mention are just the same as those currently found on their site and in previous presentations. It's not about professionalism, it's about the artistic merit of things in presentation software, not desktop publishing software. A 1-line title gets one type of positioning, 2-line titles get another.

    Look at their FireGL presentation and it looks exactly the same with regard to this spacing you're so focused on:
    http://ati.amd.com/products/brochures/FireGL_2007_Series_Presentation_v3.1.pdf

    Quote:
    - If these were official specs, DailyTech or TG Daily would be all over it like a nude celebrity and the tabloids. But, some no-name site is covering the OFFICIAL specs? ...right.


    Unless these are things we already know. Like "what are the specs you get when you put 2 VPUs on the same PCB"... um, 2x the RV670 specs plus 1x the PCB output specs? Why would that be news, since the existence of such a card and its obvious specs have been a well covered thing since the early fall, and photos started trickling in shortly after that.

    Quote:
    Until this comes from a serious and legit source, I'm calling this one BS. Nothing seems legit about this article.


    Ok, to me I call it BORING... nothing new here, and like even current Xfire/SLI (let alone the exotic ones), it will depend a lot on drivers and usage as to whether there is any value in these solutions.
  44. Not really, any slowdowns now are more driver related than hardware, which is similar to SLi.


    actually, when I switched between my 1950XT + CF Edition and the Sapphire 1950Pro Dual, I increased my FPS in FEAR by about 12, in Oblivion by 6, and in HL2 by ~25... so yeah, the hardware does have something to do with it: the more metal the electricity is forced through, the more it slows down, and lower bandwidth is the result, since we can never get 100% efficiency out of any device made by man (or machines made by man's ideas).
  45. These are supposed to scale pretty well, a lot better than Crossfire or SLI. The next best thing to having 2 GPU cores on one die. But that is coming down the road as soon as the heat thing can be taken care of, I presume. Aren't these 55nm? Seems like more room is needed for graphics than CPUs, so it will probably take getting down to a 35nm process before that can occur? Just guessing here.