
Nvidia GeForce GTX 1080 Doesn’t Boost In VR With The Latest Game Ready Driver (Update)

Update, 7/11/16, 9:50pm PT: We received a reply from Nvidia's Senior PR Manager, Brian Del Rizzo, about the issue. Nvidia was aware of the problem and already has a solution. Del Rizzo said that the new driver due later this week will fix the Boost clock issue as well as the DisplayPort-to-Vive incompatibility.

Original story

Last week we reported that there is a compatibility issue between the HTC Vive and Nvidia’s Pascal-based graphics cards. Nvidia said that a solution is coming this week, but we’ve now discovered another problem.

Nvidia released GeForce Game Ready Driver 368.69 on July 6 in preparation for the impending release of Codemasters’ Dirt Rally VR update. Though this driver is meant to prepare your system for a VR game, the update (rather ironically) broke the GTX 1080’s ability to put its impressive boost clock to work. We tested the driver to see if the Vive DisplayPort issue was corrected, but we had no reason to monitor boost clocks at the time.

1080 - Boost Not Working

Over the weekend, people started reporting on the GeForce support forums and the Oculus community support forums that the boost clock of GTX 1080 cards wasn’t activating in VR games. This morning, a similar report surfaced on the r/Oculus subreddit. We performed a few tests to verify the issue ourselves, and we can confirm that the problem is real.

When both SteamVR and the Oculus Rift display are activated, our GTX 1080 constantly runs the GPU at the base clock. Even with just the compositor open, the GPU never fluctuates from the base clock speed while SteamVR is running.

We can’t say for sure if this will happen with the GTX 1070, because we don’t have one at the lab where our VR equipment is. However, we can verify that this does not happen with Maxwell-based cards.

We ran a couple of spot check tests on two SteamVR games, Space Pirate Trainer and The Brookhaven Experiment, and one Rift game, Lucky's Tale. We also tested Unigine Heaven to verify that regular content isn’t affected by the locked boost clock. We used a Gigabyte GeForce GTX 1080 G1 Gaming and a Gigabyte GeForce GTX 980 Ti Xtreme Gaming for comparison.

980 Ti - Boost Working

The Heaven benchmark test demonstrated that the boost clock of both cards worked as it should, but as soon as we started SteamVR, things changed. The GTX 1080’s clock speed immediately jumped from its idle clock (close to 250 MHz) to the base clock of 1,696 MHz and stayed there. The Oculus Home software had a similar effect, though not while the Rift’s displays were off: once the proximity sensor detects something and the displays turn on, the clock jumps to 1,696 MHz and remains pinned at the base clock. Our GTX 980 Ti was not subject to either of these concerns. It boosts happily with the latest driver installed.
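If you want to check for the same behavior on your own card, a short script can poll the shader clock while SteamVR is open. The article doesn't say which monitoring tool we used, so this is just a minimal sketch using Nvidia's own `nvidia-smi` utility, which ships with the driver:

```python
import subprocess

def parse_clocks(csv_output):
    """Parse nvidia-smi CSV output: one clock value (MHz) per GPU, per line."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def query_sm_clocks():
    """Query the current SM (shader) clock of every installed GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.sm", "--format=csv,noheader,nounits"],
        text=True)
    return parse_clocks(out)

if __name__ == "__main__":
    import time
    # With driver 368.69, an affected GTX 1080 should report roughly
    # 1696 MHz (its base clock) the whole time SteamVR is running,
    # instead of boosting higher under load.
    for _ in range(10):
        print(query_sm_clocks())
        time.sleep(1)
```

Run it with SteamVR closed, then open, and compare the readings against the table below.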

We also tested the previous version of the GeForce display drivers. Version 368.39, which was released with the GTX 1070 in June, doesn’t suffer from any of the aforementioned issues, so it may be a viable solution until Nvidia sorts this out. Nvidia said it will be releasing a new Game Ready driver this week. Perhaps the company already has a solution in the pipeline. In the meantime, don't forget to close SteamVR when you aren't using the Vive, unless you decide to roll back to last month's GeForce release. 

We’ve reached out to Nvidia for comment. 

| Graphics card | GTX 1080 G1 Gaming - Driver 368.39 | GTX 1080 G1 Gaming - Driver 368.69 | GTX 980 Ti Xtreme Gaming - Driver 368.69 |
|---|---|---|---|
| Heaven GPU Clock | 1,938 MHz | 1,938 MHz | 1,443 MHz |
| Space Pirate Trainer GPU Clock | 1,823 MHz | 1,696 MHz | 1,443 MHz |
| Brookhaven Experiment GPU Clock | 1,936 MHz | 1,696 MHz | 1,443 MHz |
| SteamVR GPU Clock | 1,696 MHz | 1,696 MHz | 1,215 MHz |
  • Jeff Fx
    Thanks for the ongoing VR coverage. I haven't upgraded to a 1080 yet, and probably won't until they get the kinks out.
    Reply
  • jasonelmore
    Finally!! a way to disable GPU Boost! now we can overclock and know what our max speed will be. even if precision or afterburner tells you what your boost will be, my card still goes above the rated boost.
    Reply
  • CaptainTom
    Yes Nvidia GPU's are the ultimate VR solution - if they could just successfully plug into a VR device.

    **Shakes Head**
    Reply
  • alextheblue
    I bet they're going to add one more thing to the driver testing checklist now.
    Reply
  • Brandon_29
    Whoever wrote their regression test scenarios should be getting fired right about now.
    Reply
  • somebodyspecial
    18263586 said:
    Yes Nvidia GPU's are the ultimate VR solution - if they could just successfully plug into a VR device.

    **Shakes Head**

    Always has worked, with HDMI ;)
    "We are aware of user reports of an incompatibility when plugging HTC Vive into GeForce GTX 1080/1070 using the DisplayPort connector. We have identified a solution which is planned to be released in NVIDIA’s next Game Ready Driver targeted for later this week. In the interim, we recommend users connect HTC Vive to the HDMI port on GeForce GTX 1080/1070," read the statement from PeterS@Nvidia on the support forums.

    http://www.tomshardware.com/news/nvidia-vive-displayport-incompatible,32204.html

    So just like AMD power issues with 480, everyone has issues with new products these days.
    **Shakes Head** @whiners - as long as they fix things that are broke within a few weeks I'm ok with it. There are far too many configs for these guys (AMD or NV) to get everything right. You simply can't test for EVERY scenario and VR isn't making it any easier by having a dozen devices. Not surprising there are compatibility issues with something so new (and practically useless...LOL). :)
    Reply
  • jkteddy77
    After all of the bellyaching and shaming people go on about AMD hardware every single year... You have glitches like this from Nvidia that go unfixed. You STILL can't use a DisplayPort connection with the 1070 or 1080... a full 2 months later. Now this??? AMD fixed their power issue in a full 6 days, and properly acknowledged that it happened, unlike Nvidia hiding all of these ugly glitches.
    On top of dropping 3- and 4-way SLI for their diehard consumers, as well as entirely disabling SLI for the 1060 just to force people into paying more for a single 1080 for performance instead of choosing to SLI 1060s for less: How is Nvidia still the most favored brand? They've gone so far downhill, I don't want to give them my money, even less so than I felt last year...
    Vega, please hurry
    Reply
  • mattcrow
    "Over the weekend, people started reporting"

    Wait... what? Peo... People? People actually own those cards?!
    Weird... I've been trying to buy one since 1st of June with no luck... or, maybe I am actually lucky...
    Reply
  • falchard
    DisplayPort connections tend to be a little more finicky than HDMI, so I wouldn't really put too much weight into it. For instance, take the AMD R9 290X driving a 28" 4K TN panel monitor at 60 Hz over DisplayPort: the unique timing of the panel is not fully supported over its DisplayPort and it will flicker.
    Reply
  • jkteddy77
    18265001 said:
    DisplayPort connections tend to be a little more finicky compared to HDMI, so I wouldn't really put too much weight into it. For instance take the AMD R9 290x using DisplayPort to a 28" 4k TN panel monitor at 60hz. The unique timing of the panel is not fully supported over their displayPort and it will flicker.

    I think that might have just been some preliminary less advanced 4k Panels, the Samsung ones maybe?
    I used the AOC U2879vf 28" 4k Freesync monitor through my R9 290's Display Port, and had absolutely zero issues, no flashing or flickering. Note, this actually had the same panel as the Samsung too, no problems
    If this is what you mean: https://youtu.be/JfreP0AESQI?t=1m11s
    or with samsung's: https://www.youtube.com/watch?v=DAASlcrA5Xc&ab_channel=%E5%B0%9A%E6%98%8C%E5%AE%81
    I've never seen that before, and I have an original stock and BIOS R9 290
    I upgraded to an LG 27UD68-W 4k IPS Freesync monitor, no flickering issues with that monitor now either.
    Reply