Gigabyte's Radeon R9 290X WindForce Spotted

Yes! Another custom Radeon R9-290X solution.

  1. ahh, these finally begin to show up :D
  2. Hopefully I can pick one up before the miners clean 'em out.
  3. Can't wait to see some benchmarks of these cards, built the way they SHOULD have been made.
  4. And built in its best form, this will be an excellent solution that won't cost much more than stock-based ones.
    It will probably be the best solution on the market!
  5. Instead of "the way it's meant to be played" it will be a case of "the way it should have been built"
  6. yea they also have a 4GB version of the R9 270x coming too. :P
  7. Am I the only one who dislikes the new I/O panel compared to the old one?

    I always thought a single slot I/O with a full width vent was a cleaner look. I also am a big fan of miniDP and HDMI over DVI or DP.

    Yes, I do realize it has an HDMI out; I just wish it had only that and 5 miniDP connectors for Eyefinity 6 connectivity.
  8. Hando567,
    DP1.2 supports up to four 1920x1200 monitors via a single DP connection using either Daisy Chain (supported monitors) or a DP hub.
  9. UPDATE:
    There is also an R9-290 being prepared.
    http://www.gigabyte.com/products/product-page.aspx?pid=4884#ov
    http://www.hardware.fr/news/13494/r9-290-290x-windforce-3x-gigabyte.html
    "Ces cartes devraient arriver dans le commerce fin décembre / début janvier à un tarif non précisé."
    "These cards should arrive at the end of December / beginning of January at unspecified prices"

    @Hando567.
    You are partially right, but I use Eyefinity with an active DP->DVI adapter, and I don't see why I should get a new adapter with my new card when it does exactly the same thing. I'm not saying there should be no cards with nothing but miniDP connections, but there DEFINITELY have to be cards with DVI plus full-size DPs.
  10. Grrrrr ... no release date info? I have my trigger finger on a Titan (if the price doesn't put me off), a 780 Ti or one of these. I want the world ... I want the whole world ... etc.
  11. Forget the Titan. It's for supercomputing, not consumer computing. You'll be paying for precision calculations that aren't used in real-world gaming.
  12. @ 9-Ball Why would you spend a grand on a Titan when the 780ti outperforms it for 730 dollars or less?
  14. Devoteicon said:
    Hopefully I can pick one up before the miners clean 'em out.

    I was under the impression that serious miners went ASIC these days.
  15. Some bitminers may invest in ASIC hardware, but many will still get a GCN AMD card especially if they want a practical gaming PC at the same time.

    Bitmining will fail eventually anyway as it's basically a pyramid scheme. At some point the electricity cost alone will be more than any returns.
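    The claim that electricity costs eventually exceed returns can be sanity-checked with a back-of-envelope calculation. A rough sketch in Python; every number below is an assumption chosen for illustration, not a measured figure:

    ```python
    # Back-of-envelope GPU mining profitability (all numbers hypothetical).
    hashrate_khs = 900.0             # assumed scrypt hashrate for one card, kh/s
    power_watts = 300.0              # assumed card power draw under load
    electricity_usd_per_kwh = 0.12   # assumed electricity price
    revenue_usd_per_khs_day = 0.001  # assumed payout per kh/s per day
                                     # (this falls as network difficulty rises)

    daily_revenue = hashrate_khs * revenue_usd_per_khs_day
    daily_power_cost = power_watts / 1000.0 * 24.0 * electricity_usd_per_kwh

    print(f"revenue ${daily_revenue:.2f}/day, electricity ${daily_power_cost:.2f}/day")
    # The electricity cost is fixed while the payout rate keeps falling with
    # difficulty, so at some point the card mines at a loss.
    ```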
  16. I am so glad I picked up my 290's when I did, the price hike is a shame...damn miners!
  17. Morbus said:
    Forget the Titan. It's for supercomputing, not consumer computing. You'll be paying for precision calculations that aren't used in real-world gaming.


    Well, I'm looking at 3x 27" 1440p monitors for gaming - that 6GB VRAM is mighty attractive.
  18. Swordkd said:
    @ 9-Ball Why would you spend a grand on a Titan when the 780ti outperforms it for 730 dollars or less?


    It doesn't outperform it at 1440p with three 27" screens from the benches I've seen. The 6GB VRAM becomes more important. I was considering a single 780Ti but not for all 3 monitors. Am I wrong? Open to hearing why. My plan B is 2x 290x cards but I'm tired of waiting for non-ref cooler versions. Budget isn't too big a concern.
  19. Shut up and take my god da*n money!
  20. Stevemeister said:
    Instead of "the way its meant to be played" it will be a case of "the way it should have been built"


    Isn't "the way it's meant to be played" an Nvidia promotion?
  21. Start the PC and warm your entire room... good for Alaska and some places in Russia. Here the temperature is about 40°C; start this thing and maybe the room will reach 50°C.
  22. Really looking forward to seeing how the 290X is truly capable of performing, without heat problems causing it to clock down.
  23. Now the question is: is that a custom PCB, or did they just slap a WindForce cooler on a reference card to get it out in time for Christmas (if they make it)?
  24. I hope the OEMs have a 'gentleman's agreement' to launch at a common date ...

    Otherwise, it will be madness x insanity.
  25. Wisecracker said:

    I hope the OEMs have a 'gentleman's agreement' to launch at a common date ...

    Otherwise, it will be madness x insanity.



    I doubt it, they're most likely racing right now.
  26. I definitely want to see 3-5 different brand 290Xs tested with their versions of coolers.

    However, it seems removing the GPU cooling bottleneck is not all it will take to stop the down-clocking:
    http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/4
  27. YES! The specs include an analogue resolution, and it appears both DVI connectors support a DVI-VGA adapter. I really hope it's not a typo or a stock image. 100 Hz lag-free gaming for life, thank you!
  28. Is it a product launch when I can't buy it yet? It should be a PR release, not an article called a LAUNCH. Maybe I could have swallowed this with the word PAPER in the title, as in PAPER LAUNCH. I don't see any in stock. Can you launch anything without selling ONE? Without selling a single card, it's an announcement. When you don't even give a date, it's nowhere NEAR a LAUNCH. I'm confused. No price, either.
  29. Want some cheese with that whine?

    :lol:
  30. I think the water/fan combo cooling solutions are the way to go. I'll wait for that.
  31. The pixel fill rate is what's important when running multiple monitors, and the 290X is hands down the best. Get two in CrossFire if you expect 120+ fps on multiple monitors.