Chinese startup founded by Google engineer claims to have developed its own TPU chip for AI — custom ASIC reportedly 1.5 times faster than Nvidia's A100 GPU from 2020, 42% more efficient
It's allegedly 1.5 times the speed of Nvidia's A100 GPU from 2020.
Chinese startup Zhonghao Xinying is offering a home-grown General Purpose Tensor Processing Unit (GPTPU) as an alternative to international AI training and inference hardware, such as Nvidia's GPUs and Google's TPUs, as reported by the South China Morning Post. The ASIC is said to be up to 1.5 times faster than Nvidia's A100, released in 2020 on the Ampere architecture.
Although that puts it several years and several generations behind the latest hardware from its international competition, it signals growing competition for global compute power and suggests China may have a path to silicon independence as it explores both traditional GPU and ASIC designs as alternatives.
The "Ghana" chip was developed at the company by Yanggong Yifan, who previously attended Stanford and the University of Michigan to learn electrical engineering. He also worked on chip architectures at Google and Oracle, with specific design work on several generations of Google's TPUs. Co-founder Zheng Hanxun previously worked at Oracle and Samsung's Electronics research and development facility in Texas.
They claim the new TPU uses only self-controlled intellectual property for the core design, with no reliance on Western companies, software stacks, or components for development, design, or fabrication.
“Our chips rely on no foreign technology licences, ensuring security and long-term sustainability from the architectural level,” the SCMP quoted the company as saying earlier this year, underlining its view that national security is now closely intertwined with access to semiconductors.
The company claims the Ghana chip delivers 1.5 times the performance of Nvidia's A100 while "reducing power consumption to 75 per cent using a manufacturing process that is an order of magnitude lower than that of leading overseas GPU chips."
If true, that would be an impressive achievement, but such gains are not unheard of for an ASIC: a purpose-built chip that excels at specific functions by stripping out the unnecessary compute elements found in more general-purpose silicon, like GPUs.
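Taken at face value, the two numbers in the claim are easy to sanity-check as a perf-per-watt figure. A minimal back-of-the-envelope sketch, assuming (the report doesn't specify) that "1.5 times the performance" and "75 per cent" power draw are both measured against the same A100 baseline:

```python
# Rough perf-per-watt comparison, everything normalized to the A100.
# Assumption (not confirmed by the report): the 1.5x speed and 75% power
# figures share the same A100 baseline and workload.

A100_PERF = 1.0    # A100 throughput (relative units)
A100_POWER = 1.0   # A100 power draw (relative units)

ghana_perf = 1.5 * A100_PERF      # claimed 1.5x the A100's speed
ghana_power = 0.75 * A100_POWER   # claimed 75% of the A100's power

# Performance per watt, relative to the A100.
efficiency_gain = (ghana_perf / ghana_power) / (A100_PERF / A100_POWER)
print(f"Relative perf/watt: {efficiency_gain:.2f}x")  # prints 2.00x
```

Under that reading the chip would be twice as efficient as the A100; the more conservative "42% more efficient" figure implies a different baseline or measurement that the report does not detail.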
Still, if this Chinese TPU design is even close to what the company claims, it should be quite powerful. The A100 was cutting-edge hardware five years ago, though even 1.5 times its performance would still leave Ghana well behind the Hopper designs from 2022, and far, far behind the latest Blackwell Ultra hardware.
But for a Chinese market that is still smuggling in these older GPUs? That might be plenty.
All of this comes at an intriguing time for the AI chip industry. Nvidia has been the dominant force and the face of the industry for the past year, but Google's recent move to rent, and eventually sell, its own TPU silicon to Meta has opened the door to direct competition. It's a small-fry deal, despite being worth billions, but as alternatives appear in the West, so they do in the East, where China is pushing for more domestic chip production and adoption via the carrot of energy subsidies and the stick of mandated quotas.
GPUs like those developed by Nvidia and, to a far lesser extent, AMD, will likely remain the most versatile option for training AI for some time to come, but ASICs like Google's TPUs, and perhaps even those from firms like Zhonghao Xinying, could offer an intriguing alternative for companies looking to break free from the Nvidia near-monopoly.
Or simply for companies looking to get hardware at all. Memory prices, silicon shortages, and trade barriers can all stand between a company and the GPUs it needs. In their absence, unproven ASICs may be a viable alternative.

Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.