Asus Ready For 2nd-Gen Ryzen With Five X470 Motherboards

Asus launched five motherboards based on the X470 chipset to accompany AMD’s new 2nd-Gen Ryzen processors. Asus has an ATX model for each of its motherboard product lines: ROG, Strix, TUF, and Prime. The company also released one ITX motherboard, which falls into the Strix series.

ROG Crosshair VII Hero

The star of Asus’ X470 lineup is the latest iteration of the long-running Crosshair series, the ROG Crosshair VII Hero. When it comes to component support, the board has everything you’d expect, including four DDR4 DIMM slots, dual reinforced PCIe 3.0 x16 slots, dual PCIe 3.0 x4 M.2 slots (one with a heatsink), and six SATA 3.0 ports. On the networking front, the Crosshair has integrated 802.11ac WiFi and Bluetooth 4.2 (based on an Intel wireless controller) and gigabit Ethernet through a single RJ-45 jack.

What differentiates the Crosshair VII from its predecessors, beyond its chipset, are some interesting new Asus-exclusive features. Like other ROG boards, the Crosshair VII supports Asus’ Aura lighting ecosystem and has both four-pin 12V RGB and three-pin digital RGB headers. Asus mentioned that Aura is now even compatible with the Philips Hue wireless home-lighting ecosystem. Details on the integration are scarce at the moment, but Asus said Hue lights can be controlled from the Aura app.

Two other features new to the Crosshair, and to the entire ROG motherboard line, are redesigned board silkscreen labels, which better highlight connectors, and what Asus calls Truvolt USB connectors. These USB ports are guaranteed to supply a steady 5V, so port-powered devices, such as portable hard drives, won’t be at risk. Of course, other ROG stalwart features, such as an advanced integrated audio solution with a dedicated ESS Technology DAC chip, are present. Because the ROG Crosshair is expected to be used with the highest-end Ryzen processors, which don’t have integrated graphics, the motherboard forgoes onboard video connectors.

Strix X470-F Gaming and Prime X470-Pro

Slotting in below the flagship Crosshair are the enthusiast gamer’s Strix X470-F Gaming and the upper-midrange Prime X470-Pro. These two boards differ mainly in their aesthetics: the all-black Strix comes with RGB-lit heatsink covers and an integrated rear I/O shield, while the Prime follows Asus’ classic black-and-white color scheme and has less integrated RGB lighting. Both motherboards still include onboard RGB headers, however.

When it comes to hardware, the Strix and Prime lose the Crosshair’s integrated wireless connectivity and have differently configured M.2 slots. Whereas the Crosshair has dual PCIe 3.0 x4 M.2 slots, the second M.2 slot on the Strix and Prime is only PCIe 3.0 x2. Between the two, the Strix has a slightly upgraded integrated audio solution, though it isn’t as advanced as the Crosshair’s. Both boards have onboard HDMI and DisplayPort connectors for use with Ryzen processors that have integrated graphics.

TUF X470-Plus and Strix X470-I Gaming

Rounding out Asus’ announcement are the TUF X470-Plus for mainstream computing and the Strix X470-I Gaming for ITX gaming systems. The TUF series motherboards are the most basic of Asus’ offerings. Rather than simply trading features for a lower price, they forgo features in favor of guaranteed reliability. Like others in the TUF series, the TUF X470 features power regulation circuitry that won’t enable the best overclocks but is hardened against electrical faults. Other areas of the board also feature extensive surge protection. The board has the same configuration of PCIe and M.2 slots as the Strix and Prime boards.

Although it’s the tiniest of the bunch, the Strix X470-I Gaming isn’t considered by Asus to be a mainstream product like the TUF. The X470-I is still targeted at enthusiasts, so it packs advanced integrated audio, RGB lighting capabilities equivalent to the X470-F’s, and an integrated wireless solution equivalent to the Crosshair’s. Being smaller means it loses two DIMM slots and all but one PCIe slot. The board has one PCIe 3.0 x4 M.2 slot on the front and a second on the back. Strangely and unfortunately, Asus stated that despite the number of PCIe lanes that should be available to the board, the rear M.2 slot shares its bandwidth with the main PCIe slot.

Most of Asus’ new motherboards can already be found on Newegg. The Crosshair retails for $300, the X470-F Gaming for $215, the Prime for $185, and the TUF for $160.

  • takeshi7
    Electrolytic caps don't belong on TUF series boards. I don't care if they're Nichicon gold caps for the audio. If I wanted that I'd buy a board from one of the other series. The entire reason I bought my TUF board was for the 100% all solid-state caps (and the 5 year warranty).
    Reply
  • 10tacle
    Looking forward to all the reviews! Let's just hope there are no bugs this time, especially the memory issues the first generation suffered from.
    Reply
  • spdragoo
    20882841 said:
    Electrolytic caps don't belong on TUF series boards. I don't care if they're Nichicon gold caps for the audio. If I wanted that I'd buy a board from one of the other series. The entire reason I bought my TUF board was for the 100% all solid-state caps (and the 5 year warranty).

    ????

    Not sure where you're getting that from. On the TH page, your comment is the only time that the word "electrolytic" even appears. And on Asus's page (https://www.asus.com/us/Motherboards/TUF-X470-PLUS-GAMING/), there's no mention of "electrolytic" caps at all -- in fact, they specifically state

    TUF CAPACITORS
    +20% temperature tolerance and 5X-longer lifespan.

    Which seems to indicate their standard TUF caps are being used (https://www.asus.com/Microsite/mb/Tuf/why-tuf.htm).
    Reply
  • buzznut47
    LOL, ASUS website description of the TUF X470:
    "Designed exclusively for 8th generation Intel® Core™ processors to maximize connectivity and speed with Dual M.2, Gigabit LAN and on-board WiFi, USB 3.1 Gen2, and Intel® Optane™ Memory compatibility"
    Clearly AM4...

    It's a bit disappointing that the Crosshair board has five PCI Express slots whereas the other boards have six. And why can't they put the CMOS battery in a better place? If I have two video cards I'll never be able to get to it. If they're water cooled, I'd need to take apart my whole loop just to get access to the battery.

    They look nice, though; ASUS does make sharp-looking components.
    Reply
  • 10tacle
    20883586 said:
    It's a bit disappointing that the Crosshair board has five PCI Express slots whereas the other boards have six. And why can't they put the CMOS battery in a better place? If I have two video cards I'll never be able to get to it. If they're water cooled, I'd need to take apart my whole loop just to get access to the battery.

    Well, where else could they put it? Do you see any free real estate they could move it to? Every motherboard maker puts the CMOS battery down there for a reason. Regarding the TUF's five PCIe slots, did you miss the "TUF series has always been ASUS's most basic mainstream enthusiast offering" comment? But besides that, game developers are killing off any advantages of multi-GPU support anyway, making it a waste of money to get two slower GPUs instead of a single faster one. Case in point, the poor scaling of a second GPU in Far Cry 5 with either SLI or CrossFire:

    https://cdn.mos.cms.futurecdn.net/b8eRjFF6gbTXTZTzfgXD9A-650-80.png

    ^^Note the whopping 27% improvement with a second Vega 56 or the barely better 38% improvement in SLI with two GTX 1080s. And that's a game that actually supports multiple GPUs well these days. Historically, good multiple GPU scaling for the money meant a 75% or higher improvement in frame rates. That number has been declining for years.
    Reply
  • N-ninja
    Personally, I liked the old Strix design and was hoping for the same this generation; I'm not a fan of the new Strix look on these motherboards. I liked the clean PCB design with the accent color, but this gen they seem to be hitting hard on promoting their boards by name rather than by looks (the branding is everywhere and not subtle). Maybe we will get the other designs later on after the release; I was really looking forward to those.
    Reply
  • cryoburner
    20883656 said:
    Case in point, the poor scaling of a second GPU in Far Cry 5 with either SLI or CrossFire:
    https://cdn.mos.cms.futurecdn.net/b8eRjFF6gbTXTZTzfgXD9A-650-80.png
    ^^Note the whopping 27% improvement with a second Vega 56 or the barely better 38% improvement in SLI with two GTX 1080s.
    Are we looking at the same graph? >_>
    Vega 56 gets about 75 FPS average, while Vega 56 CF gets 126 FPS, a 67% improvement. The minimum frame rates only show a 28% improvement, but that is likely due in part to the CPU limiting performance in those cases. Likewise, an RX 580 averages around 50 FPS, while RX 580 CF gets 96 FPS, a 91% improvement, and minimums show about a 54% improvement for that card.
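
    As a minimal sketch, assuming the rounded average FPS figures cited above, the percentage gains work out as (dual-GPU FPS - single-GPU FPS) / single-GPU FPS:

        # Rough check of the CrossFire scaling math discussed in this thread.
        # The FPS values are the approximate averages quoted above, not new measurements.
        results = {
            "Vega 56 -> Vega 56 CF (avg FPS)": (75, 126),
            "RX 580 -> RX 580 CF (avg FPS)": (50, 96),
        }

        for label, (single_fps, dual_fps) in results.items():
            gain = (dual_fps - single_fps) / single_fps * 100  # percent improvement from adding a second GPU
            print(f"{label}: {gain:.0f}% improvement")
        # Prints 68% and 92%, in line with the ~67% and ~91% figures above
        # (small differences come from the rounded FPS inputs)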

    In fact, the article that image comes from includes the line...
    If you have a second matched GPU available, both CrossFire and SLI work, with CrossFire in particular putting up some impressive scaling results.
    And at 4K, those two cards see 92% and 98% average frame rate increases respectively, with their minimums showing 71% and 86% improvements. So that's not a particularly good "case in point", at least for CrossFire. The SLI scaling in that game is certainly less impressive, but it makes at least some notable difference, which is more than can be said for many other games. In general, I agree that multi-card setups are typically not the best option, unless perhaps someone isn't satisfied with the performance of the fastest cards and is fine with spending a lot of money on something that will only work in a limited number of games. But you could have probably found a better example of a game that doesn't support multi-card setups well. : P

    20883586 said:
    And why can't they put the CMOS battery in a better place? If I have two video cards I'll never be able to get to it. If they're water cooled, I'd need to take apart my whole loop just to get access to the battery.
    Would you even need to remove the battery though? If it's for resetting motherboard settings, I imagine that the board includes a button or jumper to perform the same task. And I really doubt that the battery would be likely to fail for a number of years, during which time anyone with liquid cooled graphics cards would probably need to tear their loop apart for maintenance more than once anyway. And of course, cards with standard air coolers should take only a matter of seconds to remove.
    Reply
  • 10tacle
    20886426 said:
    Are we looking at the same graph? >_>
    Vega 56 gets about 75 FPS average, while Vega 56 CF gets 126 FPS, a 67% improvement. The minimum frame rates only show a 28% improvement, but that is likely due in part to the CPU limiting performance in those cases. Likewise, an RX 580 averages around 50 FPS, while RX 580 CF gets 96 FPS, a 91% improvement, and minimums show about a 54% improvement for that card.

    I chose the latest example game because it's relatively easy on hardware, so it is a BEST-case scenario for multiple GPUs rather than the norm. I'm looking at 97th percentile, not average FPS:

    Vega 56 single: 65
    Vega 56 CF: 83.2

    GTX 1080 single: 68.7
    GTX 1080 SLI: 84.6

    I'm looking at 97th percentile FPS because it shows minimum performance rather than max/average. It's like the old horsepower vs. torque curve comparisons: horsepower matters at high RPM for top speed, but torque matters at low RPM to get launched, and the two curves eventually cross to strike the balance.

    The PC Gamer tester (Jarred Walton, former AnandTech editor) was using an i7-8700K paired with an MSI Z370 Gaming Pro Carbon AC and 16GB of DDR4-3200 CL14 G.SKILL memory, so I doubt the chipset was inhibiting any GPU performance.

    But that's neither here nor there. I had SLI 970s for two years and watched scaling get worse with each successive new game, if they even supported SLI out of the gate. It's no big secret that multi-GPU support is a dying platform for PC gamers. I don't say this lightly: I'm a 20-year PC building veteran who was an early SLI adopter with a pair of Voodoo2's in 1999 and has had five SLI setups in that time!
    Reply