Rivals in Arms: Nvidia's $199,000 Ampere System Taps AMD Epyc CPUs

Nvidia DGX A100 (Image credit: Nvidia)

The DGX A100 is an ultra-powerful system that has a lot of Nvidia markings on the outside, but there's some AMD inside as well. A pair of core-heavy AMD Epyc 7742 (codenamed Rome) processors are at the heart of Nvidia's new $199,000 creation.

The DGX A100 employs up to eight Ampere-powered A100 data center GPUs, offering up to 320GB of total GPU memory and delivering around 5 petaflops of AI performance. The A100 might be doing most of the heavy lifting, but the team still needs a leader. However, Intel doesn't fit the bill. 
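The 320GB figure follows directly from the GPU count. As a quick sanity check, here's a minimal sketch, assuming each A100 carries 40GB of HBM2 memory (the launch configuration; the per-card capacity isn't stated in this article):

```python
# Sanity check of the DGX A100's headline GPU memory figure.
# Assumption: each A100 ships with 40GB of HBM2 (launch configuration).
NUM_GPUS = 8
MEMORY_PER_GPU_GB = 40

total_gpu_memory_gb = NUM_GPUS * MEMORY_PER_GPU_GB
print(total_gpu_memory_gb)  # 320
```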

The A100 leverages PCIe 4.0, but Intel currently has no processors that support the interface. AMD, on the other hand, has openly embraced the PCIe 4.0 standard across the majority of its modern CPUs. Nvidia ultimately found comfort in AMD's arms, more specifically the Red Team's second-generation Epyc offerings.

The DGX A100 features two 7nm Epyc 7742 processors. Each Zen 2 processor comes with 64 cores and 128 threads that run with a 2.25 GHz base clock and 3.4 GHz boost clock. The Epyc 7742 duo accounts for 128 cores and 256 threads on the DGX A100. 
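The core and thread totals above are straightforward multiplication, sketched here for clarity (SMT giving two threads per core is the Zen 2 default):

```python
# Totals for the dual-socket Epyc 7742 configuration.
SOCKETS = 2
CORES_PER_CPU = 64
THREADS_PER_CORE = 2  # simultaneous multithreading (SMT)

total_cores = SOCKETS * CORES_PER_CPU           # 128
total_threads = total_cores * THREADS_PER_CORE  # 256
print(total_cores, total_threads)  # 128 256
```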

The Epyc 7742 isn't just generous with cores; it's also heavy on cache, supplying up to 256MB of L3. More importantly, the 64-core part puts 128 high-speed PCIe 4.0 lanes at Nvidia's disposal.

A typical DGX A100 system also comes with 1TB of system memory (upgradeable to 2TB), two 1.92TB NVMe M.2 SSDs in a RAID 1 array for the operating system (Ubuntu Linux), and up to four 3.84TB PCIe 4.0 NVMe U.2 drives in a RAID 0 array for secondary storage. Nvidia also offers the option to add four more SSDs to bump the RAID 0 volume from 15TB to 30TB.
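The 15TB and 30TB figures check out against the drive count: RAID 0 stripes data across drives, so capacities simply add, and Nvidia rounds down from the raw totals. A quick sketch:

```python
# Usable capacity of the RAID 0 data volume. RAID 0 stripes across
# drives, so the volume size is just the sum of drive capacities.
DRIVE_TB = 3.84

base_volume_tb = 4 * DRIVE_TB      # 15.36, marketed as 15TB
upgraded_volume_tb = 8 * DRIVE_TB  # 30.72, marketed as 30TB
print(round(base_volume_tb, 2), round(upgraded_volume_tb, 2))
```

The RAID 1 boot array, by contrast, mirrors its two 1.92TB drives, so its usable capacity stays at 1.92TB.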

The DGX A100 is designed with state-of-the-art networking. Nvidia recently acquired Mellanox Technologies in a whopping $6.9 billion deal, and it's already paying off. The DGX A100 packs eight single-port Mellanox ConnectX-6 VPI HDR InfiniBand adapters for clustering and one dual-port ConnectX-6 VPI Ethernet adapter for storage and networking duties. Each adapter can deliver up to 200 Gbps of throughput.