Et Tu, GPU? Researchers Publish Side-Channel Attacks on Nvidia Graphics

Researchers at the University of California, Riverside have identified three practical side-channel attacks that work on Nvidia GPUs. These attacks can be used to steal someone's passwords, snoop on their web activity, and uncover the structure of a secret neural network. The paper focuses on Nvidia graphics cards, but the researchers have also informed AMD and Intel of the vulnerabilities and plan to keep investigating them.

Most people don't think of their graphics cards as high-value targets. What could attackers want with the chip that squeezes as many frames as possible out of a game, movie, or other form of media? But tech companies have become increasingly reliant on GPUs to ease computational workloads as well as to render graphics. That expanded role, combined with the GPU's near-ubiquity, makes Nvidia's products appealing targets.

The UC Riverside authors explained the vulnerabilities further in their announcement of the side-channel attacks:

"Web browsers use GPUs to render graphics on desktops, laptops, and smart phones. GPUs are also used to accelerate applications on the cloud and data centers. Web graphics can expose user information and activity. Computational workloads enhanced by the GPU include applications with sensitive data or algorithms that might be exposed by the new attacks. [...] GPUs are usually programmed using application programming interfaces, or APIs, such as OpenGL. OpenGL is accessible by any application on a desktop with user-level privileges, making all attacks practical on a desktop. Since desktop or laptop machines by default come with the graphics libraries and drivers installed, the attack can be implemented easily using graphics APIs."

All three attacks require a malicious program to be installed on the target system, and none of them directly exposes private information. Instead, the researchers had to get creative: they monitored GPU memory allocations or performance counters and used machine learning to interpret those measurements. That was enough to track a victim's web activity or infer their passwords.
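
As a rough sketch of what the memory-allocation half of that technique could look like, the C program below polls free video memory from an ordinary user-level process through OpenGL's Nvidia-specific NVX_gpu_memory_info extension. This is an illustration rather than code from the paper; the use of GLFW to obtain a hidden context, the sampling interval, and the sample count are all assumptions.

```c
/*
 * Illustrative sketch (not from the paper): a user-level process watching
 * GPU memory availability through OpenGL's NVX_gpu_memory_info extension.
 * GLFW is used only to obtain a hidden GL context.
 * Build on Linux with: gcc probe.c -o probe -lglfw -lGL
 */
#include <stdio.h>
#include <unistd.h>
#include <GLFW/glfw3.h>

/* Token from the GL_NVX_gpu_memory_info extension specification. */
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049

int main(void)
{
    if (!glfwInit())
        return 1;

    /* An invisible 64x64 window is enough to make a GL context current. */
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    /* Sample free video memory over time; dips in the numbers reveal when
     * another process (a browser rendering a page, for example) allocates
     * GPU buffers. */
    for (int i = 0; i < 100; i++) {
        GLint free_kib = 0;
        glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX,
                      &free_kib);
        printf("sample %3d: %d KiB of video memory free\n", i, free_kib);
        usleep(100000); /* wait 100 ms between samples */
    }

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

In the real attacks, a series of measurements like this is fed to a classifier rather than printed, but the probe itself needs nothing more than ordinary user privileges and the graphics libraries that ship with the system.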

The third attack was used against cloud applications. UC Riverside explained:

"The attacker launches a malicious computational workload on the GPU which operates alongside the victim’s application. Depending on neural network parameters, the intensity and pattern of contention on the cache, memory and functional units differ over time, creating measurable leakage. The attacker uses machine learning-based classification on performance counter traces to extract the victim’s secret neural network structure, such as number of neurons in a specific layer of a deep neural network."

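The announcement doesn't include the researchers' classifier, but the final matching step can be pictured as comparing a freshly captured counter trace against traces previously recorded while profiling known network structures. The toy sketch below uses a simple nearest-centroid rule on made-up numbers purely to show that step; the trace length, counter values, and candidate classes are all hypothetical.

```c
/*
 * Toy illustration only: the paper describes machine-learning classification
 * of performance counter traces. This sketch matches one made-up trace
 * against two made-up reference traces using a nearest-centroid rule.
 * Build with: gcc classify.c -o classify -lm
 */
#include <stdio.h>
#include <math.h>

#define TRACE_LEN 4   /* samples per counter trace (hypothetical) */
#define CLASSES   2   /* candidate network structures (hypothetical) */

/* Hypothetical average counter traces recorded while profiling two
 * known neural-network structures (the "training" data). */
static const double centroid[CLASSES][TRACE_LEN] = {
    { 0.20, 0.35, 0.30, 0.25 },   /* class 0: smaller layer */
    { 0.60, 0.80, 0.75, 0.70 },   /* class 1: larger layer  */
};

/* Euclidean distance between a measured trace and a stored centroid. */
static double distance(const double *a, const double *b)
{
    double sum = 0.0;
    for (int i = 0; i < TRACE_LEN; i++)
        sum += (a[i] - b[i]) * (a[i] - b[i]);
    return sqrt(sum);
}

int main(void)
{
    /* A trace the attacker just measured while contending with the victim. */
    const double observed[TRACE_LEN] = { 0.58, 0.77, 0.72, 0.69 };

    int best = 0;
    double best_d = distance(observed, centroid[0]);
    for (int c = 1; c < CLASSES; c++) {
        double d = distance(observed, centroid[c]);
        if (d < best_d) {
            best_d = d;
            best = c;
        }
    }
    printf("closest known structure: class %d (distance %.3f)\n", best, best_d);
    return 0;
}
```
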
You can read the full paper detailing these attacks via UC Riverside's website. The school said it contacted Nvidia about the attacks and was told the company planned to "publish a patch that offers system administrators the option to disable access to performance counters from user-level processes." After informing AMD and Intel about the attacks, the researchers turned to investigating whether the same attacks work on Android smartphones.

Nathaniel Mott
Freelance News & Features Writer

Nathaniel Mott is a freelance news and features writer for Tom's Hardware US, covering breaking news, security, and the silliest aspects of the tech industry.

  • vapour
    I guess nothing is secure, huh?
  • bit_user
    vapour said:
    I guess nothing is secure, huh?
    Well, the performance counters are easily locked down. Contrary to how the article is worded, the driver controls access to those - it's not merely a matter of being able to compile and run an OpenGL program (and I'm also puzzled at the singling-out of OpenGL, here).

    As for the 3rd attack, that's rather more disturbing. It's a bit more like the hyperthreading-based CPU exploits. You could protect against it by imposing more stringent limits on how the GPU is shared, but user experience could suffer.
  • irfbhatt
    Better use videocards for mining bitcoin )