Microsoft Says It's Getting Easier to Buy Nvidia's AI GPUs

(Image credit: Nvidia)

Kevin Scott, Microsoft's chief technology officer, said recently at the Code Conference in Dana Point, California, that procuring Nvidia's compute GPUs for artificial intelligence and high-performance computing applications is now less challenging than it was a few months ago. 

"Demand was far exceeding the supply of GPU capacity that the whole ecosystem could produce," Scott said in an interview with The Verge. "That is resolving. It is still tight, but it is getting better every week, and we have got more good news ahead of us than bad on that front, which is great. […] It is easier now than when we talked last time."

Scott, who manages GPU allocations at Microsoft, described the role as daunting over the past few quarters. But the job has been easing as the supply of Nvidia's chips steadily improves, even as the AI technology landscape continues to evolve. 

On a recent earnings call, Nvidia's chief financial officer, Colette Kress, said the company intends to increase supply throughout the coming year. Meanwhile, ChatGPT's user traffic has declined for three straight months. Microsoft's Azure platform provides the cloud-computing infrastructure for OpenAI, the company behind ChatGPT.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • hannibal
    Nvidia has been moving production capacity away from gaming GPUs towards AI GPUs... That is starting to show here!
  • brandonjclark
    Are we headed towards a lottery system for purchasing GPUs for gaming?