
Galaxy 56NGH6DH4TTX GeForce GTX 560 MDT x4

Five Overclocked GeForce GTX 560 Cards, Rounded Up

Galaxy’s entry is unique; it is the only GeForce GTX 560 that supports four monitors from a single dual-slot board, and it doesn’t require a second card in SLI to enable three-screen gaming.

Priced at $230 on Newegg, it is the most expensive card in our round-up, but not by much. It’s only $10 more than the Asus and Zotac offerings with their own factory overclocks.

Despite its ability to accommodate additional display connectivity, Galaxy’s MDT x4 is actually the smallest card, with a PCB that measures 8.25” x 4.5” and overall dimensions of 8.75” x 5”, including the bezel.

In an apparent compromise for the unique output configuration, Galaxy's card sports the lowest operating frequencies of our five tested boards. Although an 830 MHz core still counts as overclocked, it's only 20 MHz higher than Nvidia's reference. A 1002 MHz memory clock matches the first GeForce GTX 560 we received from Nvidia exactly. Fortunately, this card's twin six-pin power inputs are up on top of the PCB, where we prefer them.

Galaxy’s small, unique cooler employs three 6 mm heat pipes to transfer thermal energy away from the GPU and into an array of aluminum fins. A single 3.5” radial fan facilitates heat dissipation from there.

As a multi-display card, Galaxy’s MDT x4 boasts the most interesting I/O panel on our bench. Four DVI connectors and a single mini-HDMI output leave no room back there for additional ventilation. This really isn't a problem, though, because none of the other GeForce GTX 560s we've tested force air down a closed shroud and out the back of the card. Zotac seems to be the only company designing 560s that exhaust heated air from your PC.

Let’s talk a little more about the card’s unique multi-display functionality. Galaxy taps the IDT VMM1403 multi-monitor controller to translate one dual-link DVI signal into three single-link DVI outputs. Unfortunately, bandwidth limitations prevent you from running the three screens attached to the IDT chip at 1080p/60. Instead, the card maxes out at 1080p and 50 Hz, yielding one 5760x1080 surface. 
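To put numbers on that limitation: dual-link DVI tops out around a 330 MHz pixel clock, and even before any blanking overhead, 5760x1080 at 60 Hz needs more than that. The quick Python sanity check below is our own back-of-the-envelope sketch, not anything from Galaxy or IDT, and it deliberately ignores blanking (which only makes the 60 Hz case worse).

```python
# Rough check: raw (blanking-free) pixel rates for a 5760x1080 surface
# versus the ~330 MHz dual-link DVI pixel-clock ceiling.

DUAL_LINK_DVI_MHZ = 330.0  # approximate dual-link DVI limit

def active_pixel_rate_mhz(width, height, refresh_hz):
    """Pixel rate in MHz for the active area only (no blanking)."""
    return width * height * refresh_hz / 1e6

for hz in (60, 50):
    rate = active_pixel_rate_mhz(5760, 1080, hz)
    verdict = "exceeds" if rate > DUAL_LINK_DVI_MHZ else "fits under"
    print(f"5760x1080 @ {hz} Hz: ~{rate:.0f} MHz, {verdict} the "
          f"~{DUAL_LINK_DVI_MHZ:.0f} MHz dual-link ceiling")
```

At 60 Hz the active pixels alone demand roughly 373 MHz, already past the link's budget, while 50 Hz needs about 311 MHz and can squeeze in with reduced blanking.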

You could encounter issues with screens that don't appreciate 50 Hz refresh rates. In that case, you'd need to back down to 5040x1050 (using three 1680x1050 displays) to enable 60 Hz. This happened to us with the 285.62 driver from Nvidia's site. The problem was fixed, however, by reverting to driver 285.54 from Galaxy.

We need to reiterate, though: you're still limited to two independent display pipelines from Nvidia's GF114 graphics processor. IDT's ViewXpand technology simply allows you to turn one of them into a single larger surface. If you use three or more 2560x1600 displays requiring dual-link DVI connectors or are not willing to compromise on lower refresh rates, you'd need two Nvidia cards in SLI or any number of AMD-based products with Eyefinity support instead.

Keep in mind that even though the card supports four monitors, the fourth cannot be made a part of the three-screen setup coming from IDT's chip.

 

Included with the card are two dual Molex-to-six-pin power adapters, a mini-HDMI-to-HDMI adapter, a DVI-to-VGA adapter, a driver disk, a software disk, and instruction pamphlets.

As we already know, this card is designed for three or four screens. However, Nvidia's driver isn't designed with that many displays in mind. As a result, Galaxy includes the WinSplit Revolution software, which lets you assign an application to a preset screen position using Ctrl+Alt and the number pad. Alternatively, the company also offers a download for Galaxy MDT EZY Display, a little app that allows you to choose the display configuration you want and automatically maximizes windows within the display on which they appear. Both pieces of software do a good job of putting windows where you want them to appear, but MDT EZY Display is simpler and more elegant.
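Conceptually, tools like these just map hotkeys to preset window rectangles. The toy sketch below is our own illustration of that idea, not WinSplit's or Galaxy's actual code; the key-to-rectangle table and the screen dimensions are assumptions, and a real utility would call a platform window-management API instead of printing coordinates.

```python
# Toy illustration of hotkey-driven window placement (our own sketch,
# not WinSplit Revolution's implementation). Each number-pad key maps
# to a fraction of the desktop surface.

SCREEN_W, SCREEN_H = 5760, 1080  # the MDT x4's three-panel surface

# Assumed layout: 4/5/6 snap a window to the left/middle/right third.
PRESETS = {
    4: (0.0, 0.0, 1 / 3, 1.0),
    5: (1 / 3, 0.0, 1 / 3, 1.0),
    6: (2 / 3, 0.0, 1 / 3, 1.0),
}

def rect_for_key(key):
    """Translate a number-pad key into pixel coordinates (x, y, w, h)."""
    fx, fy, fw, fh = PRESETS[key]
    return (int(fx * SCREEN_W), int(fy * SCREEN_H),
            int(fw * SCREEN_W), int(fh * SCREEN_H))

# Ctrl+Alt+4 would snap the active window onto the left 1920x1080 panel:
print(rect_for_key(4))  # -> (0, 0, 1920, 1080)
```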

Overclocking

Galaxy supports voltage manipulation in its Xtreme Tuner HD utility. Keep in mind that you have to use the version bundled with the card, or wait until the version on Galaxy's website is updated to release 3016 (Update: v3016 is now available for download from Galaxy's website). Although the software lets you specify core voltages as high as 1.3 V, it drops down to 1.15 V when you try to apply the setting. That's not entirely bad news; we wouldn’t want to push voltages much higher than 1.15 V on air cooling anyway.

With a peak 1.15 V setting and fan duty cycle dialed in to 100%, we managed to hit 1000 MHz core and 1250 MHz memory frequencies. That's an impressive overclock given Galaxy's more moderate shipping clocks.
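For perspective, a little arithmetic (ours, using the 810/1002 MHz reference clocks cited earlier) puts those results at roughly 20% over Galaxy's shipping core clock and nearly 25% over reference memory:

```python
# Quick arithmetic on the overclocking headroom we measured.

def gain_pct(achieved, baseline):
    """Percentage gain of achieved over baseline clock."""
    return (achieved / baseline - 1) * 100

CORE_OC, MEM_OC = 1000, 1250  # MHz, our best stable result

for label, core, mem in (("shipping clocks", 830, 1002),
                         ("Nvidia reference", 810, 1002)):
    print(f"vs. {label}: core +{gain_pct(CORE_OC, core):.1f}%, "
          f"memory +{gain_pct(MEM_OC, mem):.1f}%")
```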

Comments
  • pensivevulcan, January 19, 2012 3:47 AM
    Kepler is around the corner, and so are lower-end AMD 7000-series parts. This was interesting, but wouldn't one want to wait, for a plethora of reasons?
  • payneg1, January 19, 2012 4:19 AM
    The Galaxy model comes closest with its 830/1002 MHz clocks, but Zotac's AMP! edition goes all the way to 950/1100 MHz.

    This doesn't match the above chart.
  • salad10203, January 19, 2012 4:43 AM
    Are those temps for real? My 280 gtx has never idled under 40C.
  • crisan_tiberiu, January 19, 2012 4:51 AM
    So, basically, if someone plays on a single monitor, there is no point going beyond a GTX 560 or a 6950 in today's games (it's like the "best gaming CPU chart": no point going beyond an i5-2500K for gaming).
  • giovanni86, January 19, 2012 4:52 AM
    salad10203: Are those temps for real? My 280 gtx has never idled under 40C.

    You're kidding, right? My overclocked GTX 580 at 60% fan speed idles at 32°C. Cards downclock themselves, which allows them to run cooler at idle; even if it were clocked upwards, I don't think a card would get hot unless it was being used.
  • crisan_tiberiu, January 19, 2012 4:52 AM
    Sorry, I meant single monitor @ 1080p :p
  • Anonymous, January 19, 2012 6:37 AM
    I'm surprised they got 5 OCed GPUs to run BF3 without crashing.
  • justme1977, January 19, 2012 6:53 AM
    crisan_tiberiu: ... (it's like the "best gaming CPU chart": no point going beyond an i5-2500K for gaming.)


    I have the feeling that even an i5-2500K @ 4 GHz bottlenecks a 7970 at 1080p in most newer games.
    If the GPU market keeps going the way it is, it won't take long before even midrange cards are bottlenecked at 1080p by the CPU.


  • wizloa, January 19, 2012 7:04 AM
    Heh, my 4870 runs at 80°C regardless of idle or load.
  • FunSurfer, January 19, 2012 8:22 AM
    I think there is an error in the Asus idle voltage: instead of "0.192 V Idle" it should be 0.912 V.
  • Memnarchon, January 19, 2012 8:49 AM
    justme1977: I have the feeling that even an i5-2500K @ 4 GHz bottlenecks a 7970 at 1080p in most newer games. If the GPU market keeps going the way it is, it won't take long before even midrange cards are bottlenecked at 1080p by the CPU.


    Not really. This is mostly game dependent; it depends on how much stress each graphics engine puts on the CPU and GPU.
    Games like Dragon Age 2 and SWTOR are GPU intensive, so a GTX 570 (which I have) runs at 99% usage at 1080p in SWTOR even with a Q6600, which is low-performance by today's standards (I used MSI Afterburner to monitor it).
    But with games such as Skyrim, where the CPU is more important than in other games, a highly clocked Sandy Bridge is required in order to play smoothly at 1080p.
    One thing is certain: the higher the resolution, the more GPU power and the less CPU power a game requires.
  • nevertell, January 19, 2012 11:00 AM
    Hey, I have a GTX 460 and I play with tessellation and DX11 enabled in Metro 2033 @ 1080p. It has some drops to 25 FPS and lower, but that's never in the middle of a firefight.
  • silverblue, January 19, 2012 11:14 AM
    giovanni86: You're kidding, right? My overclocked GTX 580 at 60% fan speed idles at 32°C. Cards downclock themselves, which allows them to run cooler at idle; even if it were clocked upwards, I don't think a card would get hot unless it was being used.

    The 280 idles higher than the 580 to the best of my knowledge, plus it's a 65nm part and the largest gaming GPU ever created.
  • GhosT94, January 19, 2012 11:44 AM
    Would you please add Crysis 2 to all GPU benchmarks?
  • stingstang, January 19, 2012 12:41 PM
    silverblue: The 280 idles higher than the 580 to the best of my knowledge, plus it's a 65nm part and the largest gaming GPU ever created.

    That's an enormous amount of fan speed for an idle GPU. Hope you're happy having a nice loud fan at idle. I can't imagine how loud it gets under a light load.

    To the article: I don't think these comparisons are really necessary. All the cards are going to have different overclocking capabilities, which is what anyone from Tom's is going to check. Hell, the worst card you guys test according to this comparison might overclock the most, and be the best card for the money in someone else's comparison.
  • Onus, January 19, 2012 12:46 PM
    Any subjective comments about the Asus cooler's noise? I'm wondering if the different fans reduce harmonic whine, or some similar effect of having two identical fans in close proximity. I have this cooler on my GTX560Ti, and I never hear it.
  • jaquith, January 19, 2012 12:55 PM
    Hmm...Missing something here...Where's any EVGA??? See -> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=50001402%2040000048%20600094002%20600158457&IsNodeId=1&bop=And&Order=REVIEWS&PageSize=20 EVGA on GTX 570/580 (-AR lines) also includes Lifetime Warranties. IMO EVGA and ASUS are the best choices for nVidia GPUs.

    For a $30 savings the ASUS ENGTX560 DCII OC/2DI is worth a look. Sure if you run the fans at 100% a higher CFM fan is going to be very loud, but no one runs their fans @ 100% either.

    With Apps like MSI Afterburner and others it's incredibly easy to OC any GPU. It's a balancing act between performance, temperatures, and dDA (noise). One of the big reasons for water blocks on higher end cards, etc.
  • jaquith, January 19, 2012 1:12 PM
    Duh me, typo dBA...coffee hasn't kicked in yet.

    BTW - I appreciate the article; it's enlightening and offers good info. Thanks! :)
  • cleeve, January 19, 2012 1:42 PM
    payneg1: The Galaxy model comes closest with its 830/1002 MHz clocks, but Zotac's AMP! edition goes all the way to 950/1100 MHz. This doesn't match the above chart.


    Quite right! Fixed.
  • wolfram23, January 19, 2012 1:42 PM
    Quote:
    So, basically, if someone plays on a single monitor, there is no point going beyond a GTX 560 or a 6950 in today's games (it's like the "best gaming CPU chart": no point going beyond an i5-2500K for gaming).


    The GTX 560 is comparable to the 6870, though generally thought to be a little slower but with better OC headroom. The 6950 is much faster, and is comparable to the GTX 560 Ti.