RADEON HD 2600XT, 256MB GDDR4, HDMI, Dual DVI (2 x Dual-Link)

  1. Look at the box, it says 512MB GDDR4 on it. No extra power connector for the card either. I wonder how well it does in the gaming benches.
  2. Nice find. Now I just hope they don't go nVidia's way, and instead release it with a 256-bit bus.

    @goldragon: Yeah, the box does say 512MB, but the image is "reference only", so it'll probably be 256MB (at least initially).

    EDIT: I read somewhere (I think it was an HD 2900XT review) that the HD 2600XT used very little power (~40W), so it may not need an external power connector. I really hope this card performs well; if it does, AMD/ATi can really grab the lucrative middle market.
  3. Quote:
    @goldragon: Yeah, the box does say 512MB, but the image is "reference only", so it'll probably be 256MB (at least initially).

    Well, everything else is right on the box, so there will probably be a 512MB version of the card.
    Quote:

    EDIT: I read somewhere (I think it was an HD 2900XT review) that the HD 2600XT used very little power (~40W), so it may not need an external power connector. I really hope this card performs well; if it does, AMD/ATi can really grab the lucrative middle market.


    It would be great to see. Maybe then they can come out with powerful cards that use existing power supplies.
  4. hope ati doesn't blow it
  5. Just decided to throw this out there.

    Well, I decided to do a search on the HD 2600 XT and found a few things. One was that the 2900XT would be built on the 65nm process (either this is old info, or the 2900XT being reviewed now isn't the true 2900XT), and I found this link from HardOCP.
  6. I think they'll be moving it to 65nm later (possibly with a revision or the XTX).

    If you look at that link you gave me, it says the 2600XT uses 45W of power and the 2400XT uses 25W. Hopefully this doesn't come at the expense of performance, but it will be nice for those people with 350-400W PSUs.
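
    For anyone wondering whether ~45W really fits in a 350-400W build, here's a rough back-of-the-envelope sketch in Python. Every component figure below is my own assumed ballpark number, not a measurement, so treat it as illustrative only:

    # Rough power-budget sketch: does a ~45W HD 2600XT fit a 350W PSU?
    # All component draws are assumed ballpark figures, not measurements.
    ASSUMED_DRAW_W = {
        "cpu": 90,            # mid-range dual-core under load (assumed)
        "motherboard": 30,    # assumed
        "ram": 10,            # assumed
        "drives": 25,         # HDD + optical (assumed)
        "fans_misc": 15,      # assumed
        "hd2600xt": 45,       # figure quoted in this thread
    }

    total = sum(ASSUMED_DRAW_W.values())
    psu_rating = 350
    print(f"Estimated system draw: {total} W")
    print(f"Headroom on a {psu_rating} W PSU: {psu_rating - total} W")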
  7. Quote:
    I think they'll be moving it to 65nm later (possibly with a revision or the XTX).

    If you look at that link you gave me, it says the 2600XT uses 45W of power and the 2400XT uses 25W. Hopefully this doesn't come at the expense of performance, but it will be nice for those people with 350-400W PSUs.


    Well, it would be great to finally see how it does in benches, like I said before. I think, though, that ATI is waiting for their drivers to be a bit more acceptable, and may start getting their 65nm cores ready before the 2600's launch.
  8. Ahhh... can't wait till late June :x
  9. How do we know these cards will be available in June? Also, does anybody know if you can have an "unbalanced" CrossFire setup? In other words, if I get a 2900XT now, would it be possible to add a 2600 later to make it CrossFire? Thanks.

    Rob (obviously not of SLI fame)
  10. No, you can't have that. The best you can do is get similar (not identical) cards and flash their firmware to be identical... it may work then. Never between entirely different chips though.
  11. :lol: :lol:
    Model Name: GC-RX26XTG4-D3 (ROHS)
    ASIC: RV630XT (65nm)
    Core Frequency: 800MHz
    Memory Size: 256MB
    Memory Type: GDDR4
    Memory Bus: 128-bit
    Interface: PCI-Express x16
    D-Sub: Yes (DVI to CRT dongle)
    TV-Out: S-Video / HDTV-out
    DVI: Dual DVI (2 x Dual Link, HDCP)
    Video-In: N/A
    Cooling: GECUBE Exclusive X-Turbo2 Silent Fan
    Windows VISTA Ready: Certified for Windows VISTA Premium requirement
    Shader Model: 4.0
    DirectX: DirectX® 10
    HDMI TV: ATI exclusive DVI to HDMI dongle (HD Video + 5.1 channel surround audio)



    It's doomed :evil: :evil: 128-bit piece of crap :lol:
  12. Quote:
    Nice find. Now I just hope they don't go nVidia's way, and instead release it with a 256-bit bus.

    @goldragon: Yeah, the box does say 512MB, but the image is "reference only", so it'll probably be 256MB (at least initially).

    EDIT: I read somewhere (I think it was an HD 2900XT review) that the HD 2600XT used very little power (~40W), so it may not need an external power connector. I really hope this card performs well; if it does, AMD/ATi can really grab the lucrative middle market.

    The 8600 has a 128 bit bus. It will be REALLY SAD if the 2600XT also has a 128 bit bus!
  13. I posted this yesterday. If anyone wants it, the people at gpureview have figured out derived specs like shader operations per second, pixel fill rate, etc. (see the quick sketch at the end of this post). From these specs it looks like the 2600XT won't perform as well as the 8600GTS in DX9 games, but when it comes to shader-heavy games (a la DX10) it will eat the 8600GTS for lunch.

    http://www.gpureview.com/show_cards.php?card1=520&card2=512

    Quote:
    It will be REALLY SAD if the 2600XT also has a 128 bit bus!

    It is very sad that the 2600XT will use a 128-bit bus; why couldn't they just go that little bit further and make it 192-bit or something? What's even sadder is that the 24XX series looks to have a 64-bit bus. 8O :? :( :cry:
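
    As promised above, here's a minimal Python sketch of how those gpureview-style derived numbers are computed: fill rates are just clock speed times unit count. The ROP/TMU counts below are the rumored pre-launch figures and may not match the shipping parts:

    # Derived specs are just clock * unit count.
    # Unit counts are rumored pre-launch figures (assumed), not confirmed.
    cards = {
        "HD 2600XT": {"core_mhz": 800, "rops": 4, "tmus": 8},
        "8600GTS":   {"core_mhz": 675, "rops": 8, "tmus": 16},
    }

    for name, c in cards.items():
        pixel_fill = c["core_mhz"] * c["rops"] / 1000    # Gpixels/s
        texel_fill = c["core_mhz"] * c["tmus"] / 1000    # Gtexels/s
        print(f"{name}: {pixel_fill:.1f} Gpixel/s, {texel_fill:.1f} Gtexel/s")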
  14. This is from the AMD/ATI site:
    Quote:
    # 128-bit DDR2/GDDR3/GDDR4 memory interface
    # Ring Bus Memory Controller
    * Fully distributed design with 256-bit internal ring bus for memory reads and writes

    Link

    EDIT: It also says
    Quote:
    390 million transistors on 65nm fabrication process

    So it is going to be 65nm
  15. If it's 65nm then it could be almost as good as the 7950 GT.
  16. Sorry guys but it's been known for a while that the X2400/X2600 cards were going to be 128-Bit, 65nm parts.

    Until there are some solid benchmarks, there's no point in guessing the performance. It could very well be that ATI's architecture deals with the 128-Bit bus much more efficiently than NV's. Then again it could deal with the 128-Bit bus far worse than the NV parts. One thing that is a good plus is power efficiency. ATI's mid-range parts often lag behind NV's on power efficiency. That doesn't appear to be the case with this generation.
  17. There are cards that have performed very well with a 128-bit bus, like the 7600GT. The memory bandwidth of the HD 2600XT is 35.2 GB/s (same as the 6800 Ultra), compared to the 32 GB/s of the 8600GTS, and up from the 22.4 GB/s of the 7600GT (which ran neck-and-neck with the previous generation champs). Hopefully, ATi can utilize the 128-bit bus the same way nVidia did with their 7600GT, and we'll have a decent card on our hands.
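
    Those bandwidth figures fall straight out of bus width times effective memory clock. A quick Python sketch of the arithmetic (the effective clocks are the commonly quoted ones and could change before launch):

    # Memory bandwidth (GB/s) = bus width in bytes * effective memory clock in GHz.
    # Effective clocks below are the commonly quoted figures (assumed, not confirmed).
    def bandwidth_gb_s(bus_bits, effective_mhz):
        return (bus_bits / 8) * effective_mhz / 1000

    print(f"HD 2600XT (128-bit, 2200MHz GDDR4): {bandwidth_gb_s(128, 2200):.1f} GB/s")  # ~35.2
    print(f"8600GTS   (128-bit, 2000MHz GDDR3): {bandwidth_gb_s(128, 2000):.1f} GB/s")  # ~32.0
    print(f"7600GT    (128-bit, 1400MHz GDDR3): {bandwidth_gb_s(128, 1400):.1f} GB/s")  # ~22.4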
  18. Quote:
    64bit needs to die completely and the low range should be 128 bit with the midrange being 256bit. Aren't the high end cards 512bit or something now???


    QFT :trophy: :trophy: :trophy:
  19. :?: I have a question:
    Why the hell don't they (AMD/ATI & nVIDIA) make their mid-range cards with a 256-bit memory bus? Since AMD/ATI's high-end card has a 512-bit memory bus, I think they should release their mid-level cards with a 256-bit memory bus!
    Remember the Radeon X1800GTO? It was an awesome mid-level card when it was released, all because of its 256-bit memory interface; it easily outperformed the 7600GT and was neck-and-neck with the 7800GT. (I know it was more expensive than the 7600GT, but it was worth the extra money since it was faster and supported HDR+AA.)
  20. You might see a 2900 Pro with a 256-bit bus, a crippled version of the XT, aimed at the upper mid-range.
  21. I don't think anyone would call an HD2900-series card AMD/ATI's mid-range part.
    Oh, come on! The 2900 Pro will be a mid-to-high-end card (like the X1950Pro/GT/X1900GT), not mid-range!
    Something like an HD2700XT or HD2800GTO would be a better fit for that title!
  22. There'll more than likely be three cards higher/better than this card, so it'll be upper mid-range or lower high-end.
  23. Quote:
    Yea hopefully the DDR4 and the 65nm process can offset some of the performance hit by going with 128bit instead of 256bit. 64bit needs to die completely and the low range should be 128 bit with the midrange being 256bit. Aren't the high end cards 512bit or something now???


    A 128-bit interface is considerably more expensive to produce than a 64-bit one, and the same goes for 256-bit vs. 128-bit, so it comes down to money. I think we will see some 256-bit cards made from the chips where not all of the 512-bit lines work.

    Those who know better may correct me, please, but I think it affects the pin count, so the price doesn't come down with the process technology; a 256-bit interface is about as expensive to produce on 65nm as it was on 110nm...
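
    On the pin-count point, here's a tiny Python sketch of why a wide bus stays expensive: the data pins scale one-for-one with bus width and they don't shrink with the fab process. The non-data overhead per 32-bit channel is an assumed round number, not a real datasheet value:

    # Data pins scale 1:1 with bus width and don't shrink with the process node,
    # which is roughly why a 256-bit card costs about as much to wire up at 65nm
    # as it did at 110nm. Overhead per 32-bit channel is an assumed figure.
    ASSUMED_OVERHEAD_PER_CHANNEL = 15  # address/command/clock pins per 32-bit channel

    def memory_interface_pins(bus_bits):
        channels = bus_bits // 32
        return bus_bits + channels * ASSUMED_OVERHEAD_PER_CHANNEL

    for bus in (64, 128, 256, 512):
        print(f"{bus:>3}-bit bus: ~{memory_interface_pins(bus)} memory-interface pins")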