Basemark’s long-awaited virtual reality benchmarking tool, VRScore, is finally available. It's built upon Crytek’s Cryengine, and it's the first test we've used that properly evaluates your PC's ability to present VR content.
VRScore was announced in March 2016 at GDC. At the time, Basemark said the test would include online results comparisons and performance rankings for the best GPU, CPU, and HMD, and it showed off a few screenshots of the Sky Harbor demo scene. During that presentation, Basemark also teased plans to release VRScore by June 2016. For reasons unknown, the launch slipped by more than six months. But VRScore is finally ready, and so are we.
There are several versions of the VRScore suite. There’s a free build, which is meant for consumers; a Professional version that offers customizable features for power users; a Media edition for the press; a Corporate edition capable of exporting results data; and a Corporate Plus version that accommodates test automation.
Basemark's free tool is a synthetic metric designed to measure your PC's performance. It consists of a scripted scene that simulates a virtual reality experience, complete with movements to simulate looking around. At the end of the test, Basemark issues a score and compares your system to its online records.
The System Test module, included with all versions of VRScore, evaluates your PC's performance. This benchmark works with or without a VR HMD plugged in. Should you choose to attach a headset and have the requisite drivers installed, VRScore launches your platform's runtime and assigns a score at the end of the test. If not, the metric runs on your monitor, spitting out a frame rate instead.
The System Test records the average frame rate of a 4K baseline scenario, followed by a series of Feature Tests that run modified versions of the System Test. While the main test is rendered at 4K for all HMDs, the runtime of each platform determines the resolution of these Feature Tests. If you run the benchmark without an HMD installed, it'll default to your display's native resolution.
If you download the free version of VRScore, you'll find that it's limited to the Official System Test and the VR Experience demo. Every other version of VRScore also includes the VRTrek test, which measures minimum, maximum, and average frame rates. VRTrek also records the time between submitting a draw call and the image appearing in your headset, along with the time each frame takes to submit. At the end, VRScore presents a result and summary.
Basemark also provides a “VRReady Meter” that's similar to the indicator Valve provides in the SteamVR evaluation test. If the marker lands in the green, your computer is ready for a “Great VR Experience.” Yellow suggests you're “VR Ready.” A result in the red suggests you need a hardware upgrade before trying to enjoy VR.
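A three-tier meter like this boils down to mapping a score onto cutoff values. The sketch below illustrates the idea; the function name and the numeric thresholds are invented for illustration, since Basemark doesn't publish its exact cutoffs here.

```python
# Hypothetical sketch of a VRReady-Meter-style tier mapping.
# The cutoff values below are invented for illustration only;
# Basemark does not disclose its actual thresholds.
def vr_ready_tier(score, great_cutoff=5000, ready_cutoff=3000):
    """Map a benchmark score onto one of three readiness tiers."""
    if score >= great_cutoff:
        return "green: Great VR Experience"
    if score >= ready_cutoff:
        return "yellow: VR Ready"
    return "red: hardware upgrade recommended"

print(vr_ready_tier(6200))  # a score well above the top cutoff lands in green
```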
The VRTrek Test comes with the Professional, Media, and Corporate editions of VRScore. This benchmark measures the persistence and latency of the panels inside of your HMD, along with the GPU's ability to maintain ample performance. To run the VRTrek Test, you need a VRTrek latency recording tool, which features two photodiode sensors spread 2.5" apart. These record when the HMD receives a signal, and send data back to your PC through a mic jack. Basemark says Realtek's audio controller is the only device approved for this feature.
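The kind of statistics VRTrek reports can be illustrated with a short sketch: given per-frame submission timestamps and the times the photodiodes detect the image on the panel, you can derive min/max/average frame rates and draw-call-to-photon latency. This is purely illustrative under assumed inputs, not Basemark's actual implementation.

```python
# Illustrative only: derive frame-rate statistics and submit-to-display
# latency from two timestamp streams, roughly the quantities VRTrek reports.
def frame_stats(submit_times, display_times):
    """Both arguments are lists of timestamps in seconds, one per frame."""
    # Time between consecutive draw-call submissions -> instantaneous FPS.
    frame_deltas = [b - a for a, b in zip(submit_times, submit_times[1:])]
    fps = [1.0 / d for d in frame_deltas if d > 0]
    # Delay from submitting a frame to the photodiode seeing it, in ms.
    latencies_ms = [(d - s) * 1000.0 for s, d in zip(submit_times, display_times)]
    return {
        "min_fps": min(fps),
        "max_fps": max(fps),
        "avg_fps": sum(fps) / len(fps),
        "avg_latency_ms": sum(latencies_ms) / len(latencies_ms),
    }

# Example: frames submitted every ~11.1 ms (a 90 Hz cadence),
# each appearing on the panel 20 ms after submission.
submits = [i * 0.0111 for i in range(10)]
displays = [t + 0.020 for t in submits]
stats = frame_stats(submits, displays)
print(stats["avg_latency_ms"])  # ~20 ms
```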
This isn't a GPU comparison. The article's scope is to show what VRScore is all about.
I would have liked to, but there are two problems with that.
I didn't have time to run more benchmarks. Each pass has to be run five times per HMD, so more GPUs would have meant fewer HMDs. I cover VR, so the HMDs were my priority.
The other problem, and this one's the kicker, is that I don't have a GTX 1060. In fact, the two GPUs that I used are the only two current generation cards that I have access to.
Our GPU reviewers receive the graphics card samples, most of which go to Igor in Germany.
"Gigabyte's GTX 1080 G1 Gaming has no trouble maintaining roughly 90 FPS with the Rift and Vive, but PowerColor's RX 480 struggled to keep up."
It should be noted prominently in this article that the RX 480 is not AMD's top-end offering and that the GTX 1080 and RX 480 are in different price classes. Yet here you are giving AMD the business for something that should have been obvious from the outset. While you do note that the RX 480 and GTX 1080 are in different classes, you do so only in regular font, buried in a swamp of text. Something that important should at least be in bold.
I could also nitpick that the Nvidia card always sits above the AMD one, or that its charts come before the AMD ones in the second half, which affects the reader's frame of mind. You're likely projecting your own preference here, perhaps without even knowing it.
To those annoyed by the RX 480 vs 1080 comparison, I think you're completely missing the point. The article clearly states in the title that it's testing HMDs and trialling a new benchmarking suite. On the methodology page there's a short and very clear paragraph stating that a future article comparing GPUs is in the works (looking forward to that, btw). And it's very clearly and explicitly stated that the goal of this article is NOT testing GPUs. Then, in exploring the results of the benchmark (necessary to see how the new benchmark suite works), we see a GTX 1080 performing better than an RX 480. So what!? 480s start at less than 1/3rd the price. Maybe Nvidia fans should get angry because the 480 achieves ~70% of the 1080's performance at 30% of the price? Or perhaps you could blame the author for perpetuating the perception that Nvidia cards are overpriced by only demonstrating the high-end (and arguably overpriced) model?
No, this is an interesting introduction to a new benchmark, clearly labelled as such, and IMHO a worthwhile read. To be blunt, if someone just looks at a few of the charts without reading the article and concludes that Nvidia > AMD, then that's entirely their own fault.
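The price/performance arithmetic in the comment above is easy to check. The sketch below uses approximate 2016 launch prices (RX 480 around $199, GTX 1080 Founders Edition around $699) and the ~70% relative-performance figure quoted in the comment; all of these numbers are assumptions for illustration, not measured results from the article.

```python
# Rough check of the comment's price/performance claim.
# Prices are approximate 2016 launch prices; the 0.70 relative-performance
# figure is the comment's own estimate, not a measured result.
rx480_price, gtx1080_price = 199.0, 699.0
rx480_rel_perf = 0.70  # RX 480 performance relative to GTX 1080 (assumed)

price_ratio = rx480_price / gtx1080_price           # ~0.28
perf_per_dollar_ratio = rx480_rel_perf / price_ratio

print(f"RX 480 costs {price_ratio:.0%} of a GTX 1080")
print(f"and delivers {perf_per_dollar_ratio:.1f}x the performance per dollar")
```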
Also, this tool seems to miss the mark for what I was hoping for. It doesn't appear very helpful for comparing HMDs, which, in my opinion, is what we need more of. Figuring out which hardware works best is important, but that can already be approximated by looking at, e.g., 4K game tests.
As for the difference between render and display resolution, it would be nice if you could force rendering at the display's native resolution and/or at the resolution of the lowest-resolution HMD in the test round-up.
Until they address this, it seems the tool is useless at characterizing what a user of that HMD would actually experience.
Apparently, six months' delay wasn't enough. It's still not ready for use.
That said, I look forward to downloading it for the eye candy.