OMG!! Real pics of the X2800!!! - page 2

  1. Interesting: 2 amps for the fan alone. :twisted:

    Now that should move some CFM.

    I'd be interested in finding out whether it's much quieter, because a lot of people who buy Dells and such worry about the noise of their pre-built rig. Most enthusiasts will just change the HSF if they don't like it.

    I agree with MPjesse about the mid-range; that's my point of interest, because that's likely the target for laptops.

    EDIT: Another thing of note: this card draws air from inside the case and exhausts it outside, and the configuration also means you can put something right beside it without needing to leave the next slot free, unlike the X1950 and GF7900/GF8800 cards.
  2. For everyone saying that this card is huge: L2READ

    Quote:
    There will be 2 versions of R600XTX; one is for OEM/SI and the other for retail. Both feature 1GB DDR4 memories on board but the OEM version is 12.4" long to be exact and the retail is 9.5" long. The above picture shows a 12.4" OEM version.


    The R600s for the consumer market are only 9.5" long.

    Just to give you an idea, the 8800GTX is 10.5" long.

    The current X1950XTs are 9" long.

    The R600 is only 1/2" longer than the X1950XT, and 1" shorter than the 8800GTX.

    Hope this helps
    ~3lfk1ng
  3. From VR-Zone, who I think we can all agree has one.

    "R600 DX10 performance is 5 to 10% faster than NVIDIA G80"

    This is rumoured, mind you.
  4. +1 on being a good thread.

    I plan on buying the R600XTX when it comes out, and I'm a little concerned about whether my mid-tower case will be able to fit it. It's already a squeeze with my X1950XTX. :?
  5. Quote:
    LOL, hope they don't come with vibration? Powering one of those up... that gets me thinking... 4x4, 1080 watts for the GPUs... damn, I'll stick (pun intended) to my blow-up doll and collect the static. Anyway, this card should be killa


    I just can't see how:

    1) This card is going to appeal to anyone but a very small fraction of hypergamers with $100 bills hanging out of their jeans.

    2) They can possibly expect internal power to take care of this monster. By all means, just run a 110/220V cable out to the mains. Why bother going through the PSU and forcing the buyer to shell out another $300 or so?

    As for the vibes, dude: for my gf, if it doesn't register on the Richter scale, she ain't interested... :lol:
  6. what case you got?
  7. Still isn't gonna be a better card than the 8800 GTX 8)
  8. LMAO, please... 20 bucks says it's going to be able to take on the 8900GTX without overclocking!

    l8er :)
  9. Do you think that ATI will be able to do a hard launch this time?

    I just read over at xtreview that a Chinese paper reported they entered production at the end of January.
  10. Quote:
    what case you got?


    Thermaltake Armor Jr.
  11. Hot damn! Half the recent posters are Canucks! Don't you guys have some snow to shovel, a square to drink, or some moosi to hunt? :lol:
  12. Quote:
    Do you think that ATI will be able to do a hard launch this time?

    I just read over at xtreview that a Chinese paper reported they entered production at the end of January.


    Actually, it was the circuit board that has been in production, not the GPU.

    Quote:
    Don't you guys have some snow to shovel
    Yeah, I've got to go start my car and brush off all the snow.
  13. LOL, yeah, here in Ottawa it snowed all night, and it's a day off school so I'm just taking it easy. Plus, would you go out in -24°C temps?

    l8er :)
  14. Well, here in Montreal you can get a killer overclock for free: just open the window. Can't match that, hey rednecks? :lol: :wink:
  15. Nice find. I have been waiting for this... something to compare the 8800GTX against. I'm with greatape: this thing had better put the 8800GTX to shame given its rumored power draw. 160W is still a lot for the 8 series, but 240W? Good grief. At 240W this thing will consume more than an entire OC'd system... OK, maybe not, but it gets close.

    This doesn't inspire my confidence in AMD/ATI. If the benchies are close, then it takes ATI's engineers 240W to get done what Nvidia got done in 160W... not to mention Nvidia had theirs done first by a good 3 months. There is a lot hinging on this card... GL to ATI, and I hope it turns out, but I am very skeptical.
  16. Quote:
    Anyone read the article?

    Says the retail will be 9 inches :roll:


    Can't find a reference to a full article. Or do you just mean the short paragraph under the picture?
  17. Quote:
    Anyone read the article?

    Says the retail will be 9 inches :roll:


    Can't find a reference to a full article. Or do you just mean the short paragraph under the picture?

    Touché...
  18. Quote:
    I WANT THAT CARD


    Want that card? I'm salivating over that card like Pavlov's dog! Get out the MasterCard for when it appears. It may be huge, but if it performs as well as its size suggests, I'll probably have to upgrade my whole computer to take advantage of its power.

    I just hope this may turn around the dive AMD's stock has been taking recently. Then maybe I can sell and let AMD buy me a whole new computer.
  19. Quote:
    Hey man, leave off the Canadians. We've got one temping with us at work, a very nice person I have to say. I know you're only joking, but still, it could be worse, they could be Yanks :P


    Hey, my first wife was a Canuck from the wilds of Owen Sound and I spent many a year watching Peter Mansbridge's hairline recede. I have to admit (albeit half-heartedly) that the vast majority of Canadians I have had the privilege of meeting are some of the nicest and best people on the planet, and that large swaths of Canada are heaven on earth.

    Having said that, CANUX SUX! NA NA NAAAAA NAAAAAAAA NAAAAAAAAAH! :P
  20. Yeah, just the short article.
  21. OK, a serious question for a change. Are there any more high end DX9 cards in the pipeline? Anything in sight to exceed, say, an X1950XT?
  22. Quote:
    OK, a serious question for a change. Are there any more high end DX9 cards in the pipeline? Anything in sight to exceed, say, an X1950XT?


    I wouldn't expect so, since the DX10 cards run DX9 just as well, although they're obviously not optimized for it. So while I say no, that doesn't mean they won't come out; it just doesn't make a whole lot of sense.
  23. Quote:
    OK, a serious question for a change. Are there any more high end DX9 cards in the pipeline? Anything in sight to exceed, say, an X1950XT?


    I'll go out on a limb and disagree with Superfly on this. After all, cards have kept coming out for AGP machines even though AGP has been largely replaced by PCI-E, so why shouldn't new cards come out for XP/DX9 machines, and for all those who use Linux as well, since they aren't affected by DX10 anyway?

    True enough, on one hand it doesn't make sense, but on the other, it does. After all, there are a lot of people who won't be upgrading to Vista for a long time, and they will need cards for their machines. Even though a DX10 card is backwards compatible with DX9, such a card won't seem that great if its power isn't used. OK, I allow that some DX10 cards may be downgraded to work with DX9, similar to the way PCI-E cards get downgraded a bit to work with AGP. And who's to say Nvidia's 8800 series won't be among the cards that become a main upgrade for XP users, for instance?
  24. Just one question, guys:
    they say this video card will draw 240 watts.
    That's around 20 amperes at 12 volts, right? (Sanity-checked in the sketch at the end of this post.)

    How many watts does the G80 consume?


    Quote:
    OK, from what I found in the most recent data on this card, I made a little comparison between the 8800GTX and the X2800XTX. Obviously I could be wrong, so feel free to comment;

    8800 GTX
    Stream Processors: 128 (I dunno if it's unified too)
    Core Clock: (MHz) 575
    Shader Clock: (MHz) 1350
    Memory Clock: (MHz) 900
    Memory Amount: 768MB GDDR3
    Memory Interface: 384-bit
    Memory Bandwidth: (GB/sec) 86.4
    Texture Fill Rate: (billion/sec) 36.8


    X2800XTX
    Stream Processors: 64 unified
    Core Clock: (MHz) 700
    Shader Clock: (MHz) couldn’t find
    Memory Clock: (MHz) 1000
    Memory Amount: 1024MB GDDR4
    Memory Interface: 512-bit
    Memory Bandwidth: (GB/sec) 153.6
    Texture Fill Rate: (billion/sec) I dunno

    l8er :)


    You forgot something important:


    G80 = DirectX 10
    R600 = DirectX 10.1

    G80 = 128 simple 1-way shaders (max 128 shader ops per cycle)
    R600 = 64 complex 4-way shaders (max 256 shader ops per cycle)

    Quote:
    What I find somewhat interesting is that Nvidia's 8800 GTX needs at least a C2D X6800 to make full use of the card. With anything below that CPU, the card is bottlenecked by the CPU.

    So the R600 will experience the same thing. Very few people are actually going to see the full potential of the card unless they have a $1000 CPU (and high-end RAM) to go with it.

    It's great that ATI/AMD and Nvidia are releasing new technology like they are and pushing our visuals to the point we're at now, but the rest of the world can't keep up yet. I find it almost humorous that ATI/AMD is releasing a card that their own current high-end CPU won't be able to keep up with (I know they have Barcelona coming out soon too)... could be a little funny.


    Didn't they already say that DAMMIT used new technology to avoid being CPU-bound?

    Quote:
    Also, the R600 does not have stream processors. They are 64 unified shaders that can do 128 shader operations per cycle.

    Is it my imagination, or do you not read much?
    VR-Zone claimed they will do 256 shader ops per cycle (64 x 4) at most,
    and if they use the same R600 core to run stream acceleration, they obviously have a similar way to process the "streams," just like the X1950s had...
    VR-Zone also said there will be R600 variants with water cooling, especially for workstations doing stream processing:
    http://www.vr-zone.com/index.php?i=4627
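
    A quick sanity check of the arithmetic running through this post; a minimal sketch, where the card figures are the thread's rumoured numbers and the 1.2GHz memory clock is inferred from the 153.6 GB/s figure rather than confirmed:

    ```python
    # Sanity-check the figures thrown around in this post.
    # All card numbers are rumours from this thread, not confirmed specs.

    def amps_from_watts(watts, volts=12.0):
        """Current drawn at a given rail voltage (P = V * I)."""
        return watts / volts

    def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, data_rate=2):
        """Peak memory bandwidth in GB/s: bytes per transfer x effective transfer rate."""
        bytes_per_transfer = bus_width_bits / 8
        transfers_per_sec = mem_clock_mhz * 1e6 * data_rate  # DDR-style RAM moves data twice per clock
        return bytes_per_transfer * transfers_per_sec / 1e9

    print(amps_from_watts(240))        # 20.0 A -- matches the "20 amperes at 12 volts" guess
    print(bandwidth_gb_s(384, 900))    # 86.4 GB/s -- the 8800 GTX figure quoted above
    print(bandwidth_gb_s(512, 1200))   # 153.6 GB/s -- implies ~1.2GHz GDDR4 rather than the quoted 1000MHz
    print(64 * 4, "vs", 128 * 1)       # 256 vs 128 max shader ops per cycle, per the 4-way vs 1-way claim
    ```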
  25. :tongue: :tongue: :tongue:

    I'm starting to drool right about now.. :P
  26. At 9.5 inches, the retail version is still a very large card!
  27. Quote:


    You forgot something important:


    G80 = DirectX 10
    R600 = DirectX 10.1


    So does this mean the X2800 won't work with DX10.2 and above? Is a new card needed for every upgraded version of DX10?

    l8er :)
  28. Quote:


    You forgot something important:


    G80 = DirectX 10
    R600 = DirectX 10.1


    So does this mean the X2800 won't work with DX10.2 and above? Is a new card needed for every upgraded version of DX10?

    l8er :)
    Tell that to those who own the DirectX 9.0b (but not 9.0c) X500, X800, and X850 series of ATI video cards and now can't run games like Rainbow Six Vegas ¬_¬
  29. You make a valid point and I do see your argument. I agree, many people won't be upgrading to Vista for a long time and there are other people who will never use it (you point out Linux). However, I am not sure that it is a financially sound decision to continue to make two entirely different architectures.

    DX10 is obviously very different from DX9 (we all agree here), and thus the card is a very different make. If I had a breakdown of the cost of materials and plant capacity, I could do a nice analysis and figure out what Nvidia was most likely to do. It all depends on the cost. If it costs the same to make a DX10 card such as an 8300GS or 8600GS as it does to make a 7350GS (made that up as a refresh of the 7300GS), I would expect Nvidia to stick to their DX10 guns, because they can hit the same price points with more features.

    On the other hand, if DX9 cards come in at $20 cheaper to build, I would agree with you and expect to see refreshes for a few months, because in the budget/mainstream markets $20 is a huge difference. Think about it... a DX10 card for $100 or a DX9 card for $75 when you don't really game... that's a no-brainer, because DX10 isn't a requirement to run Vista.

    So without more financial information it is possible that either of us could be right :D
  30. Doh! Edited due to bonehead mistake.
  31. You won't get a card without a cooler, right? So even if the card itself is just 2cm long and the cooler is 40cm, the cooler is what really counts anyway.
  32. Quote:
    You won't get a card without a cooler, right? So even if the card itself is just 2cm long and the cooler is 40cm, the cooler is what really counts anyway.


    I will if I get it WC'd :wink:
  33. Quote:

    I think you mean those who don't get the full effect of the game. You imply they can't run it at all, which is a misstatement. :)


    Actually, it's not playable (yet), because Epic decided to go SM3.0-only, just like Ubi did with Splinter Cell. IMO this had more to do with the SM3.0 nature of the feature sets in the new consoles than anything else, since Splinter Cell is nV and R6V is ATi, IIRC.

    Oh yeah, BTW, I think 240W is still the high side. Like I said, the G80 was rumoured at 220W, so the rumour-to-retail difference was 60W; who knows, the R600 at retail could be floating around the 200W mark.

    I also wonder if the one VR-Zone has, or has been told about, isn't one of the early GDDR3 samples, which would also consume more power than the GDDR4 versions (although by a small factor).
  34. How much will the 1GB version cost again? Gawd, I already have enough trouble trying to fit an 8800GTS in my mid-tower case. Whatever happened to MICROtechnology? Geezus.

    Now why can't they just do a quad-core GPU with the ability to add memory and call it a year...
  35. Quote:

    I think you mean those who don't get the full effect of the game. You imply they can't run it at all, which is a misstatement. :)


    Actually, it's not playable (yet), because Epic decided to go SM3.0-only, just like Ubi did with Splinter Cell. IMO this had more to do with the SM3.0 nature of the feature sets in the new consoles than anything else, since Splinter Cell is nV and R6V is ATi, IIRC.

    Oh yeah, BTW, I think 240W is still the high side. Like I said, the G80 was rumoured at 220W, so the rumour-to-retail difference was 60W; who knows, the R600 at retail could be floating around the 200W mark.

    I also wonder if the one VR-Zone has, or has been told about, isn't one of the early GDDR3 samples, which would also consume more power than the GDDR4 versions (although by a small factor).

    Well damn... I am wrong. Ignore my post, lol.

    Thanks grape.

    Quad FX: $700
    Crossfire R600: $1100
    Heating your home without a separate heater: Priceless.

    For everything else there is AMD, which can't get its ducks in a row despite using enough power to run NYC.
  36. Quote:
    How much will the 1GB version cost again? Gawd, I already have enough trouble trying to fit an 8800GTS in my mid-tower case. Whatever happened to MICROtechnology? Geezus.

    Now why can't they just do a quad-core GPU with the ability to add memory and call it a year...


    And I'm still surprised that... AROUND 80% OF THE PEOPLE WHO POSTED HERE DID NOT FULLY READ THE LINK.

    A lot of them didn't notice the big difference between the OEM (12.4") and the RETAIL (9.5") versions.
  37. I just measured my X1950 Pro and she's 9" long, so an extra 0.5" doesn't really make much of a difference, other than that cable management will be more of a challenge.

    l8er :)
  38. Quote:
    I just measured my X1950 Pro and she's 9" long, so an extra 0.5" doesn't really make much of a difference, other than that cable management will be more of a challenge.

    l8er :)


    Well, crap doesn't come easy... at least the stuff that is worth anything :wink:
  39. I hear ya, but even around 200W, sheesh. Now, I'm not a "save the planet and don't buy this" kind of nut, but the one thing I can't stand is a jet engine going anytime the computer is on. I might have to break down and finally do a WC setup.

    Though Thermalright has a new revision of their HR-03 coming for 8800 cards, so maybe there's some hope for a quiet ATI card.
  40. Quote:
    And I'm still surprised that... AROUND 80% OF THE PEOPLE WHO POSTED HERE DID NOT FULLY READ THE LINK.


    That's what I said on the first page :lol:
  41. Quote:

    I got a link from a source to the Samsung memory being used, and it specs between 1GHz and 1.4GHz depending on timing/voltage. The current target seems to be 1.2GHz.


    Well, it looks like the memory I had been told about has now been confirmed by the InQ:
    http://www.theinquirer.net/default.aspx?article=37559

    http://www.samsung.com/Products/Semiconductor/GraphicsMemory/GDDR4SDRAM/512Mbit/K4U52324QE/K4U52324QE.htm
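
    For reference, that 1GHz-1.4GHz spec range maps onto the rumoured 512-bit bus like this; a minimal sketch reusing the bandwidth formula from the sketch earlier in the thread (assuming double data rate and decimal GB):

    ```python
    # Peak bandwidth over the Samsung GDDR4 spec range (1.0-1.4GHz) on a rumoured 512-bit bus.
    for clock_mhz in (1000, 1200, 1400):
        gb_s = (512 / 8) * clock_mhz * 1e6 * 2 / 1e9  # bytes per transfer x effective transfer rate
        print(f"{clock_mhz} MHz -> {gb_s:.1f} GB/s")
    # 1000 MHz -> 128.0 GB/s
    # 1200 MHz -> 153.6 GB/s  (matches the 153.6 figure quoted earlier in the thread)
    # 1400 MHz -> 179.2 GB/s
    ```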
  42. Quote:
    I hear ya, but even around 200W, sheesh. Now, I'm not a "save the planet and don't buy this" kind of nut, but the one thing I can't stand is a jet engine going anytime the computer is on. I might have to break down and finally do a WC setup.


    Well, we don't know the noise level of that OEM cooler; even the reviewers are saying it's likely a quiet blower, not the whiny fans we're used to. However, I doubt the same fan will appear on the shorter cards, so who knows.

    Quote:
    Though Thermalright has a new revision of their HR-03 coming for 8800 cards, so maybe there's some hope for a quiet ATI card.


    IMO, a combo heatpipe/Peltier solution similar to some GF8800s' is a great way to go about it; it takes a few extra watts of power to run the Peltier portion, but it could be extremely quiet while performing very well.
  43. Quote:
    From VR-Zone, who I think we can all agree has one.

    "R600 DX10 performance is 5 to 10% faster than NVIDIA G80"

    This is rumoured, mind you.


    Because we have DX10 games to bench them on, right? :roll:

    Card looks good... not as sexy as the X1950XTX though :/
  44. Quote:
    Wow, these cards get uglier and uglier these days. I wonder if the card comes with a computer case to fit it?


    I was thinking it looked pretty industrial. As did the 270 watt rating.

    Ah, if your case doesn't have a window, nobody will know that your GPU looks like the bottom side of your lawn mower.
  45. I think this new card will get an extra 190% performance boost just because it has a handle (or what looks like a handle) on it :evil:
  46. That card will have to either add 40% performance or be half the price, or AMD/ATI are in big trouble.
  47. This won't be much faster than the 8800 GTX. Look at the insanely high clock speeds they're using. Look at the size of that cooler. It's totally clear they had to go brute force with high clock speeds to match Nvidia.
  48. Quote:
    20 bucks says it will take on an 8800GTX in SLI :D


    I'll take that bet. 8)

    Edit: Really hope the power reqs are exaggerated... 20A just for the card...?! 8O
  49. Thanks for the info on the DX9 thing, guys. Now let's look at the situation from my own standpoint. Gaming: Zero. High Quality Video: Maximum. Is there any reason, no matter how small, that DX10 does anything for me?

    Furthermore, I am rather surprised that DX10 has become an MS monopoly. I understand that implementing DX10 in an OS is not just a small patch job, but would it not be anti-competitive and anti-trust to relegate all other OSes to 2006 video status for eternity?
  50. Holy crap, this thing is huge!! I get butterflies just thinking about its performance, lol. Can't wait for my mate to eat his own words about the 8800GTX kicking the R600's A$$. With all those heat pipes on it, do you think it will be rather heavy and possibly cause problems with the PCI-E slot? Maybe bow the card? But congrats to AMD/ATI, and best of luck for their launch. (Crysis now has a worthy beast to slay it.) Haha