Intel NUC 8 VR (NUC8i7HVK) Review: Core i7, AMD Vega Meet in Hades Canyon

Overclocking & Test Setup

Overclocking

Thermal boundaries limit the NUC VR's performance. Thus, we kicked the fans up to 100% for all of our overclocking experiments, even though that resulted in much more noise than the pleasant-sounding Balanced profile.

The power supply enforces a 230W limit, though Intel's NUC also throttles performance in response to over-temperature conditions on the motherboard's voltage regulation circuitry or the Kaby Lake-G package itself. That throttling stopped us well short of the 230W barrier; in fact, we topped out at 197W at the wall, as measured with our Kill-A-Watt, during our overclocking attempts.

Well-supported utilities like HWiNFO and AIDA do not report the package power consumption and temperatures of the Hades Canyon components correctly, making it difficult to log thermal data accurately. We will follow up with this data once the developers update their tools. We also monitored CPU and GPU frequencies during our testing to detect throttling.
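
That kind of monitoring doesn't require anything exotic. As a rough illustration of the approach (a minimal sketch, not Intel's tooling or our exact harness; psutil, the 4.2 GHz target, the one-second interval, and the output file are all assumptions, and whether the reported clock is live depends on the OS), a short script can poll the CPU clock and flag samples that dip below the expected all-core frequency:

```python
# Minimal throttling-watch sketch (hypothetical values, not our test harness).
# Polls the reported CPU clock at a fixed interval and flags samples that fall
# below the expected all-core frequency, which hints at thermal or power throttling.
# GPU and HBM2 clocks still need vendor tools (WattMan, GPU-Z, etc.).
import csv
import time

import psutil  # third-party: pip install psutil

TARGET_MHZ = 4200   # the all-core overclock we expect the CPU to hold
INTERVAL_S = 1.0    # polling period
DURATION_S = 300    # length of the logging run, in seconds

with open("clock_log.csv", "w", newline="") as log:
    writer = csv.writer(log)
    writer.writerow(["elapsed_s", "cpu_mhz", "throttled"])
    start = time.time()
    while time.time() - start < DURATION_S:
        freq = psutil.cpu_freq()             # may be None or static on some platforms
        mhz = freq.current if freq else 0
        throttled = mhz < TARGET_MHZ * 0.97  # small tolerance for sampling jitter
        writer.writerow([round(time.time() - start, 1), round(mhz), throttled])
        time.sleep(INTERVAL_S)
```

Run a stress or gaming workload alongside it; a sustained string of flagged samples points to throttling rather than a one-off dip.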

Overclocking the NUC is all about balance, so Intel suggests tuning it based on your workload. While you can overclock the CPU, GPU, and HBM2, limited cooling capacity prevents you from tuning all three components to their maximum potential at the same time. You can focus on ultimate CPU performance or optimized GPU performance, but not both.

We used AMD's WattMan pane to overclock the GPU and HBM2. Interestingly, Intel's management utility looks remarkably like AMD's Radeon Settings window outfitted with Intel's logo and blue trim (hint: it is).

| NUC 8 VR NUC8i7HVK | NUC 8 VR Stock | GPU 1350 MHz (HBM 800) | GPU 1350 MHz (HBM 900) | CPU 4.2 GHz (GPU Stock) | CPU 4.2 GHz (1250/900) |
| Core i7-8809G | Stock | Stock | Stock | 4.2 GHz All-Core (1.05V) | 4.2 GHz All-Core (1.05V) |
| Vega Graphics | Stock | 1350 MHz (1.1V) | 1350 MHz (1.1V) | Stock | 1250 MHz (1V) |
| HBM2 | Stock | 800 MHz (1.1V) | 900 MHz (1.1V) | Stock | 900 MHz (1V) |
| System Memory | Stock | Stock | Stock | DDR4-3200 | DDR4-3200 |

We began by overclocking the GPU to 1350 MHz. Those settings are listed as GPU 1350 MHz (HBM 800) in the table above. This proved stable, and provided a solid boost to gaming performance. Then, we added HBM2 overclocking to the mix, listed as GPU 1350 MHz (HBM 900). Again, this gave us a few extra frames per second in our gaming benchmarks.

With solid graphics performance established, we turned our attention to the CPU. To start, we overclocked it with AMD's Vega GPU and HBM2 at stock settings, which gave us a 4.5 GHz ceiling at 1.05V with memory at DDR4-3200. Initially, we observed stable operation through our application suite (minus AVX-optimized tests). But even with the GPU in its default state, gaming workloads caused the platform to blue-screen.

At the point where we stopped seeing crashes, the CPU was running an all-core 4.2 GHz overclock, listed as CPU 4.2 GHz (GPU Stock). Intel's Core i7-8809G already sports a single-core 4.2 GHz Turbo Boost frequency, but forcing all four cores to that clock rate yielded impressive results in our threaded application benchmarks. We also found that we couldn't drop the CPU voltage below 1.05V and still POST.

Finally, we slowly bumped our GPU and HBM2 frequencies up until all three overclockable components were running stably. This is listed as CPU 4.2 GHz (1250/900). As you can see in the table, we weren't able to apply as much voltage to the GPU and HBM2 with an overclocked CPU, so we settled for a slightly lower 1250 MHz GPU overclock.

You'll find all four configurations in our game testing to illustrate the software's response to various hardware settings. We also include the 4.5 GHz CPU results in our application testing, if only to highlight peak performance with a tuned host processor. We doubt that enthusiasts interested in overclocking would kick game performance to the curb in favor of slightly faster desktop apps.

Intel's AVX offset reduces frequency and voltage when it detects an AVX-optimized workload. We use this helpful feature to attain higher effective CPU overclocks on our desktops. But it's an even more important capability in a small form factor platform. For now, the NUC's AVX offset doesn't appear to be working correctly, which we reported to Intel. As a result, we do not include any AVX-based testing with HandBrake or y-cruncher for our overclocked configurations.
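
For reference, here is how the offset normally works (the ratios below are hypothetical, not our measured settings): it simply subtracts multiplier bins whenever AVX code is executing.

```python
# Hypothetical example of AVX-offset arithmetic (illustrative ratios, not our settings).
BCLK_MHZ = 100            # base clock
ALL_CORE_RATIO = 42       # 4.2 GHz all-core overclock
AVX_OFFSET = 2            # multiplier bins dropped under AVX load

normal_mhz = ALL_CORE_RATIO * BCLK_MHZ               # 4200 MHz for ordinary code
avx_mhz = (ALL_CORE_RATIO - AVX_OFFSET) * BCLK_MHZ   # 4000 MHz when AVX kicks in
print(f"non-AVX: {normal_mhz} MHz, AVX: {avx_mhz} MHz")
```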

The BIOS' overclocking options are fairly Spartan. A few staples of modern tuning, such as Load Line Calibration control, are missing. This is likely due to Kaby Lake-G's power balancing capabilities. Intel expects enthusiasts to overclock primarily through its XTU, though we didn't see any equivalent controls in the software. Hopefully Intel refines its firmware to expose more knobs and dials. The NUC also needs an exterior BIOS reset button. There were several occasions during testing when holding down the power button did not work. Each time we had to remove the enclosure's cover and a secondary panel inside the NUC to access the BIOS reset jumper. 

Test Setup

Kaby Lake-G is truly unique, so it's hard to find configurations to benchmark against. In this first round of testing, we're looking to determine whether the NUC 8 VR can replace low-end desktops as we weather a debilitating shortage of add-in graphics cards. As such, it goes up against several PCs featuring an Nvidia GeForce GTX 1060. On paper, that puts Intel's NUC at a theoretical disadvantage. But it also gives us some context. Meanwhile, we're amassing comparable small form factor systems with MXM-based 1060s for follow-up testing.


Paul Alcorn
Managing Editor: News and Emerging Tech

Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.

  • rgd1101
    Should the dedicated gpu be a 1030 or 1050, not a 1060.
    Reply
  • AgentLozen
    I would have liked to see it compared to the 1050 Ti also. I imagine the Core i7-8809G performs right between the two of them, but I would like to see the benchmarks just to be sure.

    For $1000, how much do you get with this box? A case, power supply, motherboard, GPU, and CPU. You need to supply several hundred dollars in components to finish it off. The performance is decent, but not great compared to an i3-8350K + GTX 1060. So what advantage does this offer over building your own MicroATX computer?

    You could argue that graphics cards are overpriced right now, but what happens when they come down in price? The NUC8i7HVK would be REALLY cool if the final price of a complete system was $1000, but I feel like it doesn't offer enough as it is right now.
    Reply
  • Gigahertz20
    I've built two Intel NUCs for family members in the past couple of years and they love them. Fast, quiet, and so far reliable. They don't game at all, which is why I convinced them to buy them. I'm not sure if this NUC is going to be popular at all though at $1,000 barebones. Who is going to buy it? The gaming performance of this NUC is nothing special; gamers and enthusiasts are going to stick with desktops, and a lot of people are just waiting for the cryptocurrency craze to die down so we can get video cards at decent prices again. If that takes another year or two, so be it.

    Your average person who just needs an office computer won't buy this at $1k; you can get a much cheaper NUC, throw in an SSD, and that will work fine. Why pay a premium for a cute little powerful box? If you want small and portable, you can get a laptop for cheaper. If they had priced this at $600 barebones, it would have been much more appealing to the average user who might want to play the occasional game at 1080p.
    Reply
  • Eximo
    There are i7-7700HQ laptops with a GTX 1060 at comparable prices.

    Great product, but the pricing is just too far off to make sense. For this money, I would look at ASRock's STX form factor.
    Reply
  • bit_user
    I wish they sold standalone Vega 24 dGPUs as a replacement for the RX 560.
    Reply
  • bit_user
    20839276 said:
    Should the dedicated gpu be a 1030 or 1050, not a 1060.
    Except:
    Intel claims its new chips should serve up similar graphics performance as Nvidia's GeForce GTX 1060 Max-Q.

    However:
    Test System & Configuration: Gigabyte GeForce GTX 1060 G1 Gaming 6G
    So, it seems the legit complaint is that they used a standard GTX 1060, instead of something closer to the Max-Q model. Here's how they compare:

    http://gpuboss.com/gpus/GeForce-GTX-1060-Max-Q-vs-GIGABYTE-GeForce-GTX-1060-G1-Gaming


    I think the Quadro P2000 would be pretty close to the GTX 1060 Max-Q:

    https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/documents/Quadro-P2000-US-03Feb17.pdf

    But, it's not a perfect match, and it would make for a slightly awkward comparison, probably raising more fuss than the card they chose. Still, they should've at least used a slower GTX 1060, like one of the ITX-friendly single-fan cards.
    Reply
  • bit_user
    20839877 said:
    You can get a laptop with a 1060 for around $1000 all-in. It costs even less for a 1050 Ti, which I agree seems closer in performance.
    ... something about equating desktop and laptop GPUs.
    Reply
  • zodiacfml
    Glad to see products like this, though I want to see even more integration, such as integrated RAM.

    I wonder how AMD APUs would fare with HBM memory available to their CPU and GPU.
    Reply
  • FD2Raptor
    20839276 said:
    Should the dedicated gpu be a 1030 or 1050, not a 1060.

    The dedicated comparison should have been the RX 570/580 4GB GDDR5 to remove any nvidia vs amd optimizations difference from the equation.
    Reply
  • redgarl
    The GPU is really similar to an RX 570. Well, it is impressive for an iGPU. I am surprised AMD is not doing anything for that market on mobile.

    Probably next year with Zen 2.

    20841016 said:
    20839276 said:
    Should the dedicated gpu be a 1030 or 1050, not a 1060.

    The dedicated comparison should have been the RX 570/580 4GB GDDR5 to remove any nvidia vs amd optimizations difference from the equation.

    I totally agree. As of now, it is impossible to know what this Vega chip really is in comparison to AMD's APUs.

    Reply