Sorry, VR: Asus Drops VirtualLink Port With Latest RTX 2070

Asus has launched a new version of its GeForce RTX 2070 Turbo graphics card this week. Dubbed the Evo edition, it's nearly identical to its sibling except for one change: the USB-C port for Nvidia VirtualLink (which connects VR headsets with a single cable) is gone, and Asus has placed an extra HDMI 2.0b port in its place.

The Asus GeForce RTX 2070 Turbo Evo is one of the few graphics cards that still clings to a blower-style cooler. It measures 26.8cm long, features a dual-slot design and sports a discreet black exterior with RGB lighting kept to a minimum; the only RGB present on the card is a slim, illuminated strip on the side of the shroud. Asus has redesigned the shroud so that cool air is funneled into the single 80mm cooling fan, a dual-ball-bearing unit that carries IP5X dust-resistance certification.

As with all modern Asus graphics cards, the GeForce RTX 2070 Turbo Evo is a product of the brand's Auto-Extreme Technology, a fully automated manufacturing process. The card also goes through Asus' 144-hour validation program, which throws various workloads at it, from 3DMark benchmark runs to popular titles such as Fortnite, League of Legends, Overwatch and PlayerUnknown's Battlegrounds.

The GeForce RTX 2070 Turbo Evo comes with 2,304 CUDA cores, 288 Tensor cores and 36 RT cores. The card also sports 8GB of GDDR6 memory clocked at 1,750MHz (14,000MHz effective) across a 256-bit memory bus. It has a 1,410MHz base clock and two modes of operation: Gaming mode, the default, runs the card with a 1,620MHz boost clock, while OC Mode cranks it up to 1,650MHz.
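
For context, the quoted memory clock and bus width pin down the card's peak memory bandwidth. Here's a quick back-of-the-envelope sketch of the arithmetic in Python (purely illustrative; the input figures come from the spec sheet above, not from Asus):

# GDDR6 transfers data 8 times per memory clock cycle, so a 1,750MHz
# clock yields the 14,000MHz "effective" rate quoted above.
memory_clock_mhz = 1_750
effective_rate_mtps = memory_clock_mhz * 8  # 14,000 MT/s
bus_width_bits = 256

# Bandwidth = transfers/s * bits per transfer / 8 bits per byte
bandwidth_gb_s = effective_rate_mtps * 1e6 * bus_width_bits / 8 / 1e9
print(f"~{bandwidth_gb_s:.0f} GB/s peak")  # ~448 GB/s, the RTX 2070's rated figure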

The graphics card draws power through a single 8-pin PCIe connector. In total, the card offers two HDMI 2.0b ports and two DisplayPort 1.4 outputs.

Asus hasn't revealed the pricing or availability for the GeForce RTX 2070 Turbo Evo.

Zhiye Liu
RAM Reviewer and News Editor

Zhiye Liu is a Freelance News Writer at Tom’s Hardware US. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • TechyInAZ
    VR seems to be very stagnant in terms of adoption, so not seeing a VirtualLink connection on this card makes sense. I'm sure there are more people out there who would rather have another HDMI port for productivity on more than one monitor.
  • hannibal
    Most likely for VR you would want a beefier card anyway... Something like a 2080 Ti or faster...
  • Brian_R170
    21702803 said:
    Most likely for VR you would want a beefier card anyway... Something like a 2080 Ti or faster...

    Faster than a 2080 Ti?
  • Brian28
    Do any headsets even have the USB-C VirtualLink connection yet? Most I've seen require both an HDMI (or maybe DisplayPort) and a USB port. Maybe it just didn't make sense to have a port that only supports one or two models of headset, when the other models would just use HDMI.
  • cryoburner
    21702775 said:
    I'm sure there are more people out there who would rather have another HDMI port for productivity on more than one monitor.
    The card also has two DisplayPort outputs for anyone wanting to use more than one monitor. How many people are running four monitors? I suppose it might be useful for those with older screens that lack DisplayPort though.

    Not including a VirtualLink Type-C port on this card will probably not be that big of a deal for anyone wanting to use a VR headset with it though. They'll just need to connect an adapter with a few cables, like anyone without one of these higher-end 20-series cards. I can't see headsets requiring the port any time soon, as it's mainly just there to simplify connections.

    In any case, if the card weren't using a blower-style cooler, there would be more room for ports. Most other RTX 2070s, the Founders Edition included, support four displays in addition to USB-C. Unless you're putting the card in a tiny case with limited airflow, where exhausting heat outside the case might be more beneficial, most people would probably be better off with a more typical multi-fan cooler.

    21702981 said:
    Do any headsets even have the USB-C VirtualLink connection yet? Most I've seen require both an HDMI (or maybe DisplayPort) and a USB port. Maybe it just didn't make sense to have a port that only supports one or two models of headset, when the other models would just use HDMI.
    As far as I know, no commercially available headsets support it yet, though I suspect some will begin to support the connection this year, seeing as all the major headset manufacturers, along with AMD and Nvidia, worked together to develop the standard. When they do start supporting it, I expect it will simply let you skip the included adapter box and go straight to USB-C instead of splitting off into separate cables for video, data and power.

    21702803 said:
    Most likely for VR you would want a beefier card anyway... Something like a 2080 Ti or faster...
    I'm pretty sure that the vast majority of those running VR today don't have anywhere remotely near that level of graphics performance. >_>
  • joevt1
    The USB-C port is not just for VR.
    1) The card has a USB controller, which Asus probably did not remove; now it simply has no connector.
    2) You can connect USB 3.1 gen 2 devices (10 Gbps), USB-C displays, display adapters, and docks to the USB-C port.
    3) It's easier to convert USB-C to HDMI than it is to convert HDMI or DisplayPort to USB-C. If they want to give us HDMI, then they should have just provided an adapter.
    4) It's easier to convert DisplayPort to HDMI than it is to convert HDMI to DisplayPort. They should have left it as DisplayPort if they didn't want to provide USB-C.
  • fireaza
    I suspect the reason they dropped it is that it will probably be a while before a VR headset is released that uses it. The connector was only recently unveiled, and many of the newest VR headsets, like the Pimax, were too far along in production to adopt it. By the time one comes out that uses it, we'll probably be on the next generation of GPUs, which would make the connector entirely useless here. Replacing it with a connector you can actually use today makes more sense.
  • eye4bear
    Isn't PCIe4 just around the corner? I am not going to run out now and spend on a new card that will be out-of-date in 6 months. My 1080ti is just fine until then.
  • cryoburner
    21708862 said:
    Isn't PCIe4 just around the corner? I am not going to run out now and spend on a new card that will be out-of-date in 6 months. My 1080ti is just fine until then.
    Current graphics cards are not even close to running into the bandwidth limitations of PCIe 3.0 x16. In fact, your 1080 Ti should perform nearly the same if you connect it to a PCIe 2.0 x16 or PCIe 3.0 x8 slot with half the bandwidth (rough numbers sketched below). PCIe 4.0 will likely do nothing to improve the performance of graphics cards for many years to come, and since the cards and slots will be backward and forward compatible, it's a non-issue.
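
    To put rough numbers on that, here's a quick Python sketch of per-direction PCIe bandwidth, using the standard per-lane rates and line-encoding overhead for each generation (illustrative only):

    # Usable one-way PCIe bandwidth by generation and lane count.
    # Per-lane raw rate (GT/s) and line-encoding efficiency.
    GENS = {
        "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
        "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
        "PCIe 4.0": (16.0, 128 / 130),
    }

    def pcie_bw_gb_s(gen, lanes):
        rate_gt, efficiency = GENS[gen]
        return rate_gt * efficiency * lanes / 8  # 8 bits per byte

    for gen, lanes in [("PCIe 2.0", 16), ("PCIe 3.0", 8), ("PCIe 3.0", 16), ("PCIe 4.0", 16)]:
        print(f"{gen} x{lanes}: ~{pcie_bw_gb_s(gen, lanes):.1f} GB/s")
    # PCIe 2.0 x16: ~8.0 GB/s
    # PCIe 3.0 x8:  ~7.9 GB/s  (about half of 3.0 x16, as noted above)
    # PCIe 3.0 x16: ~15.8 GB/s
    # PCIe 4.0 x16: ~31.5 GB/s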