
Gigabyte Intros R9 270X OC 4GB with WindForce 3X Cooler

Source: TechPowerUp | 14 comments

Gigabyte has announced its R9 270X OC 4 GB, which carries a WindForce 3X cooler.

It seems that many manufacturers are releasing R9 270X graphics cards with 4 GB of graphics memory rather than 2 GB, and Gigabyte is the latest to join the crowd. Its newest graphics card, the R9 270X OC, with model number GV-R927XOC-4GD, features something the competitors' cards don't: a triple-fan cooler.

The card is built on a non-reference PCB design. The triple-fan cooler, known as the WindForce 3X, also features on various high-end graphics cards from Gigabyte. It combines three fans, two aluminum fin stacks, a number of heat pipes, and a plastic shroud.

Gigabyte has clocked its R9 270X OC 4 GB at 1050 MHz base with a 1100 MHz GPU boost frequency. The memory on the card is clocked at an effective speed of 5.6 GHz.
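To put that memory clock in context, here is a quick back-of-the-envelope bandwidth estimate. The 5.6 GHz effective data rate comes from the article; the 256-bit memory bus width is the standard R9 270X specification and is assumed here:

```python
# Peak theoretical memory bandwidth = effective data rate x bus width in bytes.
# 5.6 GHz effective is from the article; the 256-bit bus is the stock
# R9 270X spec (an assumption, not stated in the article itself).

def memory_bandwidth_gbs(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return effective_clock_ghz * (bus_width_bits / 8)

print(memory_bandwidth_gbs(5.6, 256))  # 179.2 GB/s peak theoretical
```

This matches the stock R9 270X, since the extra 2 GB of memory changes capacity, not bandwidth.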

There is no word on pricing or availability yet.

Top Comments
  • 11 Hide
    jd_w98 , December 27, 2013 12:53 PM
    Before anyone complains about the 4 GB being unnecessary, it does have a practical use. Think about people wanting to do CrossFire and end up with more performance than an R9 290X; CrossFire wouldn't be much use without all that VRAM, since otherwise you would end up with only 2 GB of VRAM.
Other Comments
  • 3 Hide
    Cryio , December 27, 2013 1:42 PM
    As next-gen is on our doorstep, I think the more future-proofed you are right now, the better. 1-2 GB of video memory today will be TINY in the following 1-2 years, as texture quality explodes.
  • 6 Hide
    Morbus , December 27, 2013 2:00 PM
    Don't fool yourselves. Consoles won't be pushing VRAM in the next 10 years, just like the last generation of consoles didn't. By the time we're playing games that require more than 2 GB of VRAM, the current-gen 2 GB cards will be so underpowered you'll need to turn the settings way down anyway. And chances are, if you're buying gaming video cards now, by that time you'll already have bought something new. ESPECIALLY if you're going with the "low end" of gaming cards.

    I bought a 2 GB GTX 760 and I'm very happy with it; it performs great at 1080p in all the games I own, including Crysis 3, Battlefield 4 and Company of Heroes 2. But in two years? I'll have bought something new and sold my current 760 for the change... It's the only sensible thing to do.

    YOU CANNOT FUTURE-PROOF IN THE GRAPHICS DEPARTMENT!
  • -7 Hide
    iam2thecrowe , December 27, 2013 4:50 PM
    Quote:
    Before anyone complains about the 4 GB being unnecessary, it does have a practical use. Think about people wanting to do CrossFire and end up with more performance than an R9 290X; CrossFire wouldn't be much use without all that VRAM, since otherwise you would end up with only 2 GB of VRAM.


    Too bad CrossFire is completely useless for any triple-monitor setup or 4K monitor requiring 4 GB of VRAM. It may have a use for 1440p DX11 games, I guess...
  • 0 Hide
    anatomyofme , December 27, 2013 7:44 PM
    If the price is right, then why not? I guess this would just add the 270X to the list of cards out of stock due to mining, though...
  • -3 Hide
    s3anister , December 27, 2013 8:33 PM
    This card has been available on Newegg for at least a week or two now; I didn't even know this was news.
  • -2 Hide
    rantoc , December 27, 2013 9:04 PM
    4 GB sounds like overkill, at least until the day you try triple-monitor gaming beyond the old and outdated 1080p, or high-res 1600p+ gaming with all features on, especially triple buffering if you like vsync and still don't want bad input lag. Yes, it's overkill for most, but so is a card of this caliber anyway... Now where is that 1600p+ monitor with G-Sync...
  • 2 Hide
    Cryio , December 28, 2013 5:36 AM
    Quote:
    Don't fool yourselves. Consoles won't be pushing VRAM in the next 10 years, just like the last generation of consoles didn't. By the time we're playing games that require more than 2 GB of VRAM, the current-gen 2 GB cards will be so underpowered you'll need to turn the settings way down anyway. And chances are, if you're buying gaming video cards now, by that time you'll already have bought something new. ESPECIALLY if you're going with the "low end" of gaming cards.


    Consoles won't be pushing VRAM? When the PS3/X360 consoles launched, the usual game required 256 MB of VRAM. Now games running at 1080p, maxed out WITH FXAA, require 1.5 GB or so. Not all games, of course, but still. Seeing how proper PC games like Battlefield 3/4, Crysis 2 with MaldoHD and the Metro games push VRAM easily with their amazing textures (and that's now), think how it will be in a few years' time.

    You said PCs built now will be underpowered? If you bought a new PC at the end of 2006, with 3-4 GB of RAM, an 8800 GTX 512 MB-1 GB (SLI or not) and an Intel Q6600, you could play ALL games today (save for the 2-3 or so DX11-only games) no problem, even in 1080p, on medium-high.

    If you had bought something with 256 MB, by the end of 2007 you were already out of luck. Even 512 MB by the end of 2010 was pushing it.

    See where I'm going?
  • 0 Hide
    photonboy , December 28, 2013 11:43 AM
    While VRAM usage in general has gone up over the years, much of the reason is inefficient coding. Game developers have discussed reasons why VRAM usage may stay the same or even be REDUCED over the coming years.

    Better anti-aliasing methods, tessellation, and paged streaming from the System RAM are all reasons why VRAM usage could be reduced.

    Mantle in particular, once optimized in several years, could get away with a FRACTION of the video memory for an experience similar to today's.

    The new consoles will influence things, of course, but keep in mind a game might have access to a SHARED 5 GB (say, 2 GB of "VRAM" and 3 GB of "SYSRAM"), which on PC can easily be done already with a 2 GB card and 8 GB of system RAM.
  • 0 Hide
    none12345 , December 28, 2013 6:27 PM
    All things considered, RAM is cheap; I'd much rather have too much RAM than too little.

    That said, if things really do go to a unified memory architecture, then the discussion is moot. I'd rather just have one large pool of RAM for the system, rather than the split between system and video RAM we have now.

    Things are just straight-up inefficient when you have to copy textures from system RAM to video RAM, use them, then copy them back. Or vice versa.
  • -1 Hide
    xiinc37 , December 28, 2013 6:37 PM
    Quote:
    Before anyone complains about the 4 GB being unnecessary, it does have a practical use. Think about people wanting to do CrossFire and end up with more performance than an R9 290X; CrossFire wouldn't be much use without all that VRAM, since otherwise you would end up with only 2 GB of VRAM.

    If you compare a pair of 4 GB 270Xs in CrossFire to one 290, they are exactly the same in every spec but clock speed (it should be easy enough to OC a third-party-cooled 290 to 270X clocks). On top of that, a 290 would not have losses from CrossFire, giving it greater performance than the 270Xs. An OC'd and properly cooled 290X would probably sit even higher. The only way this card makes sense is if you want to CrossFire three of them; one wouldn't be able to use all that VRAM, and two or four of them would be better replaced with one or two Hawaii cards.
  • 0 Hide
    anbello262 , December 29, 2013 6:06 PM
    Well, what you said might be true, but that's exactly the point. If two of these perform almost exactly like an R9 290, then two of these would need the 4 GB of RAM...
    I know you will probably say "then you're better off just buying the 290," but many people prefer to (or are only able to) buy one cheaper card now and upgrade via CrossFire in the future.
    For all of that group, it's a lot better to buy a 4 GB cheaper card, if available, as it opens you up to better CrossFire in the future...
    And that stuff about CrossFire being broken at higher resolutions/multiple screens, that's only temporary until they give us newer drivers.
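The back-and-forth above hinges on one point worth spelling out: in alternate-frame-rendering CrossFire, each GPU keeps a full copy of the frame's assets, so VRAM is mirrored rather than pooled. A minimal sketch of that accounting (illustrative only, not AMD's API):

```python
# Under AFR CrossFire, textures and buffers are duplicated on every GPU,
# so usable VRAM equals the per-card amount regardless of how many cards
# are installed. This is why 4 GB per card matters for a CrossFire pair.

def usable_vram_gb(per_card_gb: float, num_cards: int) -> float:
    """Effective VRAM budget for a mirrored (AFR) multi-GPU setup."""
    assert num_cards >= 1
    return per_card_gb  # capacity does not add up across mirrored GPUs

print(usable_vram_gb(2.0, 2))  # two 2 GB cards still expose only 2 GB
print(usable_vram_gb(4.0, 2))  # two 4 GB cards expose 4 GB
```

In other words, two 2 GB 270Xs in CrossFire still hit a 2 GB texture budget, while the 4 GB model keeps the pair's budget at 4 GB.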
  • 0 Hide
    16bit , January 1, 2014 10:25 PM
    AMD will eventually fix the problems regarding multi-monitor setups with crossfire.

    Hopefully sooner than later.