External Graphics Over PCIe 3.0? Netstor's NA255A, Reviewed

Setup And Overcoming Issues

In theory, populating the NA255A should be as easy as dropping in graphics cards, connecting their power leads, and hooking the external enclosure up to the host adapter card installed in the PC's PCI Express slot. The TurboBox is designed to extend standardized interfaces, so no software driver should be necessary. In the real world, though, setup isn't quite that simple.

We encountered a couple of snags along the way. First, I initially didn't realize that the PCIe-based interface cards have dedicated I/O ports. If you look closely, one port on each card is etched with an x16 and the other with an x8. I accidentally hooked the x16 up to the x8 and vice versa. The mistake was easy to reverse, but it doesn't appear to be mentioned anywhere in Netstor's documentation.

The second hang-up was a little more worrisome: I couldn't get the TurboBox working at PCI Express 3.0 signaling rates. First- and second-gen PCI Express worked fine, but when the jumper was set to PCIe 3.0, the enclosure stopped recognizing the graphics cards I was plugging in. Netstor helped us work through the issue, which turned out to require reconfiguring switches on the interface cards.
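On a Linux host, you can sanity-check which link rate the enclosure actually negotiated. Below is a minimal sketch, assuming a Linux box with `lspci` available; the helper function and the example bus address are our own illustration, not anything shipped by Netstor. PCI Express 1.x signals at 2.5 GT/s, 2.0 at 5 GT/s, and 3.0 at 8 GT/s, so the `LnkSta` line tells you which generation you're really getting.

```shell
#!/bin/sh
# Map the speed reported on a "LnkSta" line from `lspci -vv` to a
# PCIe generation. Signaling rates: 2.5GT/s = 1.x, 5GT/s = 2.0, 8GT/s = 3.0.
pcie_gen() {
    case "$1" in
        *2.5GT/s*) echo "PCIe 1.x" ;;   # must come before the 5GT/s case
        *5GT/s*)   echo "PCIe 2.0" ;;
        *8GT/s*)   echo "PCIe 3.0" ;;
        *)         echo "unknown" ;;
    esac
}

# Example usage -- 01:00.0 is a hypothetical GPU address; find yours
# with `lspci | grep VGA`, then inspect the negotiated link with:
#   sudo lspci -vv -s 01:00.0 | grep LnkSta
pcie_gen "LnkSta: Speed 8GT/s, Width x16"   # prints "PCIe 3.0"
```

If the reported speed drops back to 5 GT/s after you set the enclosure's jumper to third-gen mode, you're hitting the same kind of link-training fallback we saw before reconfiguring the switches.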

Our third issue wasn't the TurboBox's fault at all. During our first round of benchmarks, we saw odd performance drops with three Radeon HD 7970s installed. Much troubleshooting revealed that some of our Tahiti-based boards weren't working together the way they should have. It turned out that boards from different vendors shipped with incompatible firmware, which hampered multi-card configurations even though cards based on the same GPU should interoperate. Mixing and matching products, even those from the same family, is asking for trouble. Fortunately, we worked around the problem with a different card combination.

Finally, we weren't able to test four Radeon HD 7970s at the same time. This, too, wasn't Netstor's fault. The TurboBox is absolutely able to accommodate a quartet of dual-slot boards. But because some of the 7970s in our lab are a little wider than two expansion slots, they simply don't fit. As a result, we're testing with three Radeon HD 7970s. It all works out, though: the ASRock X79 Extreme9 motherboard I'm using only has room for three 7970s anyway, so that's our hard limit for comparing native on-board connectivity to the performance of Netstor's device.

  • ohyouknow
    Interesting
    Reply
  • MasterMace
    This is a nice article. I wonder if Tom's can do a multi-cpu article as well.
    Reply
  • dagamer34
    Now if only we could get external GPUs via Thunderbolt (or it's future iterations) so that laptops wouldn't be forever gimped, we'd be in business!
    Reply
  • Whooo whoo, if i had the money to burn, i would get this NA255A, remove the PSU bundle, replace it with a Seasonic 1000 Platinum, slap four GTX Titans, add a custom water-cooling loop, connect it to my main PC and have (if it works) three more NA255A's for each of the PCIe for the main PC with a grand total of 16 GTX Titans for massive GPU computation. All for a grand total of $13,800. Massive electric bill, here i come!
    Reply
  • mayankleoboy1
    PCIE signals scale poorly to long wires. So it is a technical achievement to have these signals travel over a meter of wire.
    Reply
  • A Bad Day
dagamer34 said: Now if only we could get external GPUs via Thunderbolt (or it's future iterations) so that laptops wouldn't be forever gimped, we'd be in business!
    There are some external GPU cases.

    The only issue is that the cheapest is somewhere slightly less than $400.

    Please explain to me how an aluminum box, a micro-PSU, and a Thunderbolt-to-PCIE adapter adds up to even $200...
    Reply
  • A Bad Day
    EDIT: And when I meant the cheapest, I meant the ones that are only sufficient for a 7750. Want to pair a 7970 with a ultrabook?

    $400-$500 for a slightly longer box with a slightly more capable PSU.
    Reply
  • slomo4sho
    An expensive solution to inferior Mac hardware...
    Reply
  • Vatharian
    Good X79 workstation mobo with 7 PCI-e slots, and 4 K20x-s on each of them. That's a TON of computing power, and if you don't want to deal with high-speed networking multiple boxes, that's nice. Of course only if this thing can actuallty work in pairs or more and in some way circumvent the 15 gpu limit in bios memory mapping. Can this thing be turned on with working machine?
    Reply
  • adgjlsfhk
    But what about someone working on a Mac Pro? Apple's more limited ecosystem means there is no such thing as a three- or four-way graphics array. This could be one of the only options for enabling multiple GPUs. If massive compute potential is important, you might need to swallow hard and consider Netstor's solution the cost of doing business in Apple's world.
    Or you could use the $2000 to ditch your mac pro that is years out of date and use the money to buy a pc that is better in pretty much every way.
    Reply