
AMD Instinct MI210 GPU Specs Surface: Aldebaran with 64GB of HBM2E

(Image credit: AMD)

According to a tweet from an engineer working on the LUMI supercomputer, AMD has quietly begun shipments of its Instinct MI210 PCIe card for high-performance computing (HPC). The accelerator can be used in both servers and workstations and will compete against Nvidia's A100 PCIe as AMD's new flagship accelerator card. Perhaps most interesting, however, is that although AMD has yet to formally confirm the product's specifications, the engineer shared a few of the unreleased specs a bit early.


From the tweet, we can surmise that AMD has already started shipping Instinct MI210 PCIe cards to interested parties. The tweet comes from George Markomanolis, an engineer at the CSC – IT Center for Science in Finland, where the 0.55 ExaFLOPS LUMI supercomputer is being built. The MI210 was announced at a live event, but AMD did not release its official specifications.


As it turns out, the Instinct MI210 has little in common with the dual-GCD Instinct MI250X. The compute GPU card has 104 compute units, which translates to 6,656 stream processors, and 64GB of HBM2e memory. Given those specifications, the card appears to use only one GCD and is therefore not aimed at extreme performance. Still, with 64GB of memory and the CDNA 2 architecture, the MI210 can offer a formidable combination of performance and capabilities compared to AMD's own Instinct MI100 or Nvidia's A100 PCIe cards.

By contrast, the flagship Instinct MI250X accelerator with 128GB of HBM2e memory features 14,080 stream processors and 47.9 FP64 TFLOPS performance, comes in an open accelerator module (OAM) form-factor, and consumes up to 550W, according to rumors.

At present, it is impossible to tell how fast AMD's Instinct MI210 is. Meanwhile, it will certainly be interesting to see how AMD's MI210 stacks up against Nvidia's A100 PCIe GPU.

Anton Shilov

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • dalek1234
    "...AMD has quietly begun shipments of its Instinct MI210 PCIe card for high-performance computing (HPC) ..."

    Does anybody know what the reasons could be for AMD releasing such a product "Quietly"? Or is "quietly" just an interpretation coming from the author?
  • HPC Master
    It's poor AMD vaporware – not publicly available till Q2 2022, like the AMD Instinct MI250X (very limited quantities), and without native Nvidia CUDA support (the GPGPU standard)!
    All HPC industry leaders don't count on AMD GPUs – no demand in this market segment. AMD has been well known for bad driver support and bad ROCm open software for years.

    Check the latest real-world AI/ML HPC 'MLPerf Training v1.1 Benchmark' results from AMD here > MLPerf Training v1.1
    AMD = nonexistent!
  • gc9
    The article (currently) does NOT say AMD has released it, quietly or otherwise. Maybe AMD just shipped a sample.

    GPU/Accelerator companies may ship early engineering samples to major customers (supercomputer buyers optimizing supercomputer software) or to companies whose products drive GPU sales (major design software, major videogame engines), so they can evaluate the drivers and help optimize the performance for their products before it is released. (They might also find reasons to adjust voltages, cooling, etc. parameters for some workloads.)

    One expects an engineering sample to be under non-disclosure agreements, so it is unusual that this tweet was allowed. Maybe that is why you thought it was "released".
