4870x2

I found pictures of the monster :D. It doesn't give many details, but I thought it might be worth mentioning.
http://www.techpowerup.com/index.php?64318
  1. The R700. No one's sure of the clocks, only guesses at this point. Also, will we see somewhat guaranteed 1.8x scaling? Lots of interest in this card.
  2. black is beautiful eh
  3. beautiful, simply beautiful is all I can say.

    I must be in love. :D

    I am interested in this rumored 4890, which is supposed to be an even higher-clocked 4870, so a core clock of at least 900-ish, right?

    Edit: I also cannot wait for that fabled RV740 w/ a 45 nm chip w/ 480 SP! The price point on that has to make it a killer mainstream card, simply killer!
  4. Eh, I don't know how you can get more mainstream than the 4850.

    I mean, $200 is mainstream, really.
  5. I think the 4850 will go against the GT and cream it, at a better price point. ATI needs to get these cards out soon, as nVidia is dumping all their old 8xxx cards on the cheap.
  6. JAYDEEJOHN said:
    I think the 4850 will go against the GT and cream it, at a better price point. ATI needs to get these cards out soon, as nVidia is dumping all their old 8xxx cards on the cheap.


    You're right about that... Isn't competition beautiful?
  7. I am kinda on the fence about the X2... Each graphics core only has 512 MB of memory, and if I am running a CrossFire setup I want each graphics core to have 1 GB of memory (like two 4870s), but I want to build a comp with tons of calculating power (like the X2). I wonder if an X2 with 2 GB of memory will come out?
  8. Plain Old Me said:
    I am kinda on the fence about the X2... Each graphics core only has 512 MB of memory, and if I am running a CrossFire setup I want each graphics core to have 1 GB of memory (like two 4870s), but I want to build a comp with tons of calculating power (like the X2). I wonder if an X2 with 2 GB of memory will come out?


    There is a rumour that the 4870x2 will share the memory between the two cores, i.e. giving 1 GB effective rather than 2x 512 MB.

    It's just a rumour at this point, though... I will be very surprised if it actually comes true.
  9. http://anandtech.com/video/showdoc.aspx?i=3341&p=9 — at the bottom they have a section on the CrossFire architecture for the R700. Seems like a bag of lucky charms :pt1cable:
  10. There have been a few pics out, not sure if they're photoshopped or not, but one shows 1 GB of memory. Might this mean a reading of 1 GB shared, or are there actually 2 GB on board? Still speculating....
  11. JAYDEEJOHN said:
    There have been a few pics out, not sure if they're photoshopped or not, but one shows 1 GB of memory. Might this mean a reading of 1 GB shared, or are there actually 2 GB on board? Still speculating....

    Yeah, it could very well have been photoshopped, but one thing is certain: ATI has something up their sleeve with this card and CrossFire. It only makes sense given how ATI is going after the $200/$300 price market first and working their way up, using CrossFire to win against NVIDIA's high-end card. It would benefit them the most in the long term.

    "Call of Duty 4
    AMD's architecture did very well under Call of Duty 4 in the single-card tests, with a single Radeon HD 4870 performing better than a GeForce GTX 260. The scaling from one to two cards is beyond perfect in CoD4, the reason being that we test on two different platforms (Intel X48 for CrossFire, NVIDIA 790i for all single-cards), the end result is a rare case where two of AMD's $300 cards actually outperform two of NVIDIA's $650 cards. By no means is it the norm, but it is a testament to the strength of AMD's RV770 GPU." found that bit scary :lol:
  12. kelfen said:
    found that bit scary :lol:


    AMD/ATI's back in the game!
  13. Part of the problem, which will continue to show, is that nVidia's chipsets sometimes have issues and sometimes can't perform at their peak. They just aren't as consistent as Intel's. There are going to be individual games where the ATI cards are just plain fastest, due to the demands of the game and the ability of the card in question. That also means there are going to be games where nVidia cards show a bigger lead over their competing ATI cards. Add in the possibility of evening out even those nVidia-favored games with whatever it is ATI is doing, which may turn out to be a fairly consistent 1.8x scaling in this new setup, and ATI will look good even in the games that favor nVidia.
  14. Thanks for the info!
  15. omg, i found my one true love

    until the next gen
  16. Plain Old Me said:
    I am kinda on the fence about the X2... Each graphics core only has 512 MB of memory, and if I am running a CrossFire setup I want each graphics core to have 1 GB of memory (like two 4870s), but I want to build a comp with tons of calculating power (like the X2). I wonder if an X2 with 2 GB of memory will come out?


    1 GB 4870 versions will probably come later, so I don't see a problem with having 2 GB on the 4870x2, except that maybe they will be limited and a little more expensive, but I really don't care much about paying a little extra to get a better card.

    Until now I was a happy owner of an 8800 GTX, but this card is really tempting me to upgrade :)
  17. great cards... but umm.... anyone got some spare change? i'm saving up... :-p
  18. lol @ the nude shots... should put NSFW in the title.
  19. xrodney said:
    1 GB 4870 versions will probably come later, so I don't see a problem with having 2 GB on the 4870x2, except that maybe they will be limited and a little more expensive, but I really don't care much about paying a little extra to get a better card.

    Until now I was a happy owner of an 8800 GTX, but this card is really tempting me to upgrade :)

    Well, I'm not sure how much more it will cost, since it's GDDR5 and double the memory.
  20. kelfen said:
    http://anandtech.com/video/showdoc.aspx?i=3341&p=9 — at the bottom they have a section on the CrossFire architecture for the R700. Seems like a bag of lucky charms :pt1cable:


    What the hell is that supposed to mean? I wonder if you will be able to CrossFire a 4870 and a 4870 X2 like the 3xxx series....

    Edit: I really hope so, because tri-CrossFire with a 3870 and a 3870 X2 produced some pretty nice scaling.
  21. What that may mean is that they're trying to get intercommunication between the dies BEFORE CrossFire, for better scalability, at a hoped-for 1.8x. Supposedly priced at 499 USD and coming the last week of July: http://translate.google.com/translate?sourceid=navclient&hl=en&u=http%3a%2f%2fwww%2ehkepc%2ecom%2f%3fid%3d1410%26fs%3dc1n
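    For a rough sense of what a 1.8x scaling factor would mean in practice, here is a minimal sketch; the single-card frame rates are hypothetical placeholders, not benchmark results.

    # Rough illustration of a hoped-for 1.8x dual-GPU scaling factor.
    # The single-card fps figures below are hypothetical placeholders, not benchmarks.

    SCALING = 1.8  # claimed/hoped-for X2 scaling; 2.0 would be perfect doubling

    single_card_fps = {"Game A": 45.0, "Game B": 60.0}

    for game, fps in single_card_fps.items():
        x2_fps = fps * SCALING
        print(f"{game}: {fps:.0f} fps on one GPU -> ~{x2_fps:.0f} fps on the X2 "
              f"({SCALING / 2:.0%} of perfect 2x scaling)")

    At 1.8x you give up about 10% relative to ideal doubling, which is why it reads as "somewhat guaranteed" rather than perfect scaling.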
  22. 4870x2 in 1 month....?

    Can I wait.....? I honestly don't know........
  23. JAYDEEJOHN said:
    Part of the problem, which will continue to show, is that nVidia's chipsets sometimes have issues and sometimes can't perform at their peak. They just aren't as consistent as Intel's. There are going to be individual games where the ATI cards are just plain fastest, due to the demands of the game and the ability of the card in question. That also means there are going to be games where nVidia cards show a bigger lead over their competing ATI cards. Add in the possibility of evening out even those nVidia-favored games with whatever it is ATI is doing, which may turn out to be a fairly consistent 1.8x scaling in this new setup, and ATI will look good even in the games that favor nVidia.



    And then those games will have a patch, claiming that there was an "error" when played with Nvidia cards :lol:
  24. Oh I see, thanks jaydeejohn.
  25. Good to see the deals coming out. Maybe more on the horizon too, as long as ATI can keep making competitive cards. It'd be nice to see a huge price drop on the G2xx series if these X2 cards rock at $499 USD.
  26. JAYDEEJOHN said:
    It'd be nice to see a huge price drop on the G2xx series if these X2 cards rock at $499 USD.


    Hah, only after the revision, my Antec 900 and PC P&C 750w are screaming at the thought of those things...
  27. I thought the price point was $450? When did they up it?
  28. When they saw how bad the GTX280 was.
  29. Are any of the partners going to make a 4850x2?


    Would it find a market niche between 4870 and 4870x2?


    Would it beat the GTX 280? (for less money)
  30. The 4870 already beats the GTX 280 or is on par in all cases where frame rates are actually playable (the GTX 280 will probably beat a 4870 in Crysis at the highest resolution, where both are unplayable in any case), and it's half the price.

    So the 4870x2 will beat the 280 for sure. No contest.
  31. i'd love to see a monstrosity like the asus trinity where they put (3) hd3850's on like MXM modules [gpu's for laptops]

    that would be truly an amazing thing, although the trinity did occupy three slots, 2 on bottom and one on top of the card....
  32. lightzy said:
    The 4870 already beats the GTX 280 or is on par in all cases where frame rates are actually playable (the GTX 280 will probably beat a 4870 in Crysis at the highest resolution, where both are unplayable in any case), and it's half the price.

    So the 4870x2 will beat the 280 for sure. No contest.

    There will never be a 4850x2.

    Putting two of those very quick chips together (the 4850 chip is the same as the 4870's) and then limiting all that power with DDR3 memory would be pointless... it wouldn't really be able to stretch its legs.

    If they did give it GDDR5 memory, then it would basically just be a lower-clocked 4870x2, which people would then just buy and overclock to 4870 speeds.
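    To put numbers on the bandwidth gap, here is a minimal sketch using commonly quoted launch-era figures (256-bit bus, roughly 2.0 Gbps effective GDDR3 versus 3.6 Gbps effective GDDR5); treat them as approximations, not exact board specs.

    # Rough memory-bandwidth comparison: why GDDR3 would hold back an RV770-based X2.
    # Per-pin data rates are approximate launch-era figures, not exact board specs.

    BUS_WIDTH_BITS = 256

    def bandwidth_gb_s(effective_gbps_per_pin):
        """Peak bandwidth in GB/s for a 256-bit bus at the given per-pin data rate."""
        return effective_gbps_per_pin * BUS_WIDTH_BITS / 8

    print(f"GDDR3 @ ~2.0 Gbps/pin: {bandwidth_gb_s(2.0):.1f} GB/s")  # roughly 4850-class
    print(f"GDDR5 @ ~3.6 Gbps/pin: {bandwidth_gb_s(3.6):.1f} GB/s")  # roughly 4870-class

    That works out to roughly 64 GB/s versus 115 GB/s, which is the gap the post above is getting at.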
  33. OK, I have a quick question. I am looking at getting a 30" monitor, and what I want to know is whether two 4870s in CrossFire will work. It will have to use two DVI cables, and this made me unsure. I presume the 4870x2 will be able to do this with no problem... but if I can't wait for one, will two 4870s in CrossFire be able to?
  34. spaztic7 said:
    OK, I have a quick question. I am looking at getting a 30" monitor, and what I want to know is whether two 4870s in CrossFire will work. It will have to use two DVI cables, and this made me unsure. I presume the 4870x2 will be able to do this with no problem... but if I can't wait for one, will two 4870s in CrossFire be able to?


    Yeah, lesser configs will work, in fact, but it all depends on what you are actually using it for, you know?
  35. It will be mainly for gaming at native resolution (I am looking at the Gateway XHD3000 monitor, so a resolution of 2560x1600). This is an upgrade that will happen in about a month or so, so I have time to shop around.

    Oh, in case anyone hasn't seen the other thread that has the review from [H], here it is!
  36. For a 30" monitor, I thought you needed two DVI input cables to go into the monitor so it is HDCP compliant or something... for protected media so you can watch it or something... I am not quite sure, but I thought that is what it was for.
  37. spaztic7 said:
    For a 30" monitor, I thought you needed two DVI input cables to go into the monitor so it is DHCP compliant or something.... for protected media so you can watch it or something.... I am not quite sure, but I though that is what it was for.


    I would do a little research on that, because I have seen two DVI ports on 26" monitors, so I wonder if that is true or not...?

    My 24" doesn't have two, so I assume if what you are saying is true, then a 1920x1200 res doesn't need an extra DVI cable...

    I do know this for a fact though: " - And on the third day, God created the Remington bolt-action rifle, so that Man could fight the dinosaurs. And the homosexuals." -some dude on this forum.

    :lol: :lol: :lol:
  38. You guys must have read the 2x 4870X2 CFX (FOUR 4870s!!!) previews lately... 2x GTX 280 SLI actually beats it in Crysis and many other games!

    I think it's really risky that AMD bet on a CFX configuration for their top models instead of a single high-end chip like the GTX 280.
  39. concrum said:
    You guys must have read the 2x 4870X2 CFX (FOUR 4870s!!!) previews lately... 2x GTX 280 SLI actually beats it in Crysis and many other games!

    I think it's really risky that AMD bet on a CFX configuration for their top models instead of a single high-end chip like the GTX 280.


    CrossFire scales horribly in Crysis. That said, quadfire/quad SLI usually also sucks, so I'm not surprised that they didn't do too well. Hopefully a 4870 1 GB + 4870X2 combo will hit the sweet spot. And did you not see some of the benchmarks where the 4870X2 beats out GTX 280 SLI? Pointing out one instance where quadfire loses to SLI and not acknowledging when the X2 beats SLI is a little biased, isn't it?

    In the AnandTech review, the 4870X2 beats GTX 280 SLI in every instance except Crysis and Oblivion at 2560x1600. It completely trounces GTX 280 SLI in Conan and GRID, and beats it out at every resolution in Oblivion up until 2560x1600.
  40. Quote:
    No, you need a dual-link DVI connector, not dual DVI connectors. HDCP compliance is slightly different and has to do with the I/O connectors on the equipment themselves rather than the cables, AFAIK.


    Dual link DVI connector? I am not sure I understand.
  41. spaztic7 said:
    Dual link DVI connector? I am not sure I understand.

    For resolutions over 1920x1200 you need dual-link DVI (one screen driven over the two TMDS links of a single DVI connector). If the card doesn't support dual-link DVI, then even if you have four DVI outputs you still can't use the higher resolution.
    For example, all 30" LCDs with a 2560x1600 resolution require it.
  42. Quote:
    A dual-link DVI socket is one that has two signal transmitters instead of one, allowing for twice the bandwidth. A single-link DVI can only really manage about a 1920x1200 res, so you need the extra bandwidth to handle the higher res of the 30".

    Look up the Wikipedia article for a better explanation.


    OK...

    You guys and Wikipedia explained some of it, but there is something that I still don't understand. Wiki says that dual-link DVI can do double the bandwidth; OK, that's fine. Is dual-link DVI just one cable that has the correct pins, or do you need two cables?

    I just want to make sure that I understand this. It makes sense that one cable would work, because if the cable is dual-link, it can do 4 megapixels. I just thought you needed two cables to do this.
  43. Dual link DVI is a single cable, but it has more pins.
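    To put rough numbers on why the 30" panel needs dual-link, here is a minimal sketch of the bandwidth math, assuming roughly 9% blanking overhead and the 165 MHz single-link DVI pixel-clock ceiling; exact CVT timings differ slightly.

    # Back-of-the-envelope check of why a 2560x1600 30" panel needs dual-link DVI.
    # Assumptions: ~9% reduced-blanking overhead, 60 Hz refresh, 165 MHz single-link
    # pixel-clock ceiling, and roughly double that for dual-link.

    SINGLE_LINK_MHZ = 165.0
    DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ
    BLANKING_OVERHEAD = 1.09  # rough factor for reduced-blanking timings

    def pixel_clock_mhz(width, height, refresh_hz=60):
        """Approximate pixel clock in MHz, including blanking overhead."""
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for w, h in [(1920, 1200), (2560, 1600)]:
        clk = pixel_clock_mhz(w, h)
        if clk <= SINGLE_LINK_MHZ:
            verdict = "single-link DVI is enough"
        elif clk <= DUAL_LINK_MHZ:
            verdict = "needs dual-link DVI"
        else:
            verdict = "beyond dual-link DVI"
        print(f"{w}x{h} @ 60 Hz ~ {clk:.0f} MHz pixel clock -> {verdict}")

    Run as written, this prints that 1920x1200 fits within single-link while 2560x1600 needs the second link, which matches the explanations above.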
  44. concrum said:
    You guys must have read the 2x 4870X2 CFX (FOUR 4870s!!!) previews lately... 2x GTX 280 SLI actually beats it in Crysis and many other games!

    I think it's really risky that AMD bet on a CFX configuration for their top models instead of a single high-end chip like the GTX 280.


    See, the problem with whatever concrum says is that I have to actually fact-check it, because he has trolled so many posts in this forum and spreads way too much misinformation...
  45. Quote:
    x2


    What?!
  46. one of those instances where x2 = +1
  47. ^+1 :lol:
  48. Ahhh... ok.

    Wow, I feel enlightened now. I always thought that you needed two cables, but you just need one dual-link DVI cable. Nice.

    And I assume that the 4870x2, with all the benchmarks showing and all, will be more than able to play back high-def crap while in CrossFire mode.

    What is even better is ATI is realizing that we want these things to make us food, so they are making them hot enough to warm a can of soup if you place it on the video card. After 3 minutes, stir, then heat to a simmer!