AMD’s Epyc Potential Win: Google May Ditch Intel

(Image credit: AMD)

Google, once an exclusive partner for some of Intel’s early server chip launches, is now rumored to be considering a switch to AMD’s Epyc server platform, according to Lynx Equity Research.

An Epyc Loss for Intel

As reported by Benzinga, analyst KC Rajkumar said Google has grown increasingly dissatisfied with Intel’s server CPUs lately. And with Google’s Project Zero security team being among the first researchers to uncover the Meltdown and Spectre CPU flaws, who can blame it? 

Google’s engineers have had issues with Intel’s Management Engine (ME), a computer subsystem in Intel CPU chipsets, to the point where they started disabling it on some of their servers. Many privacy activists have warned for years that Intel’s ME could effectively be used as a backdoor by malicious parties, and the ME has also been found to contain numerous vulnerabilities over the past few years.

Additionally, Google was among the first companies to disable Intel’s Hyper-Threading (HT) technology on its Chromebooks after MDS flaws were uncovered. 

Lynx Equity Research’s report also noted that Google is building its own server boards around AMD Epyc CPUs, which would be a drastic change from Google’s almost exclusive use of Intel server CPUs. Plus, Google recently started using AMD GPUs for its Stadia game streaming service.

Google/Intel Romance Over?

The split between Google and Intel isn’t official yet, and even if Google announces a partnership with AMD, it doesn’t necessarily mean Google will use AMD CPUs exclusively. It’s also unlikely that Google will eliminate every single Intel CPU from its large data centers. Chances are the chips will be replaced gradually as they reach the end of their lifecycles. 

However, Google even considering AMD is a big threat to Intel's dominance in the server market. If Google can consider using AMD server chips, other companies may do the same. 

Much like the “Wintel” romance of yesteryear between Microsoft and Intel, Google has had a tight relationship with Intel over the past decade or so. Beyond using Intel CPUs almost exclusively in its servers, Google had Paul Otellini, Intel’s former CEO, on its board of directors for many years.

Furthermore, Google has mostly steered clear of using Arm or AMD chips for the majority of Chromebooks, despite Chrome OS being the only mainstream operating system that could claim to be architecture-agnostic, given that it primarily runs web apps.

Even Android isn’t truly architecture-agnostic, as a good portion of the platform’s applications are written in native code for Arm. This is also why Intel’s Atom CPUs would drain smartphone batteries quickly when people played games (which are normally written in native code, as opposed to Java) on their Intel-powered devices.

Over the past few years, Google has started branching out to other chip providers, such as Nvidia for machine learning and AMD for its new game streaming platform. The company has also built its own specialized machine learning chips for both training neural network models and running inference. 

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.
  • bigdragon
    I think this is a smart move. In the post-Meltdown and Spectre world, hardware diversity is looking like an appealing option. Having everything be the same architecture does help keep costs down, but bites you hard when major, multi-generational vulnerabilities come out.
    Reply
  • bit_user
    Even Android isn’t truly architecture-agnostic, as a good portion of the platform’s applications are written in native code for Arm. This is also why Intel’s Atom CPUs would drain smartphone batteries quickly when people would played games (normally written with native code, as opposed to being written in Java) on their Intel-powered devices.
    Android is ISA-agnostic, but apps can have natively-compiled portions. I don't know how common that is, but it's reasonable to expect that the popular game engines have optimized, native code for both ARM and x86.

    I honestly doubt that has much to do with why games are such power-hogs, though. The issue is really that smart phones spend most of their time idling, and when you really crank up the CPU and especially the GPU, it just depletes battery power very quickly.
    Reply