China doubles US research output on next-gen chips amid export bans — trade war fuels a research wave


The U.S.-China chip trade war is entering its fifth year, and the United States' intervention seems to be coming back to bite it: a recent study by the Emerging Technology Observatory (ETO) found that China has produced more than double the United States' research output on next-generation chipmaking technologies.

The ETO notes that 475,000 articles about chip design and fabrication were published worldwide between 2018 and 2023. Of that body of work, 34% came from Chinese institutions, dwarfing the 15% from the United States and the 18% from Europe. While chipmaking is not as popular a research topic as AI and large language models, China appears to be going all-in on studying the future of fabrication.

The quality of China's research is also at a high point. Among the top 10% most-cited articles in the field, 50% come from China; the United States and Europe sit far below at 22% and 17%, respectively. India, Japan, and South Korea also contribute on both measures, but all fall well short of China's output and citation share.

These numbers don't mean China is more advanced than the U.S., but the study's authors believe it soon could be. In a comment to Nature, Zachary Arnold of the ETO said, "I don't know if we've seen a field where there is quite this difference ... When you see so much activity, it's hard to imagine that [won't] have an effect on China's technological capability and ultimately manufacturing capability in the coming years."

As for what China is studying, neuromorphic computing (processors structured like networks of neurons) and optoelectronic computing (using light to move data within chips) take up the lion's share of the country's current research. These are post-Moore's Law technologies, pursued outside the traditional framework of chasing ever-smaller process nodes and therefore outside the regulations currently leveled against the Chinese industry. Because these fields are still nascent, the standard chip-war M.O. of banning the export of tools will be useless against these next-gen chips unless the U.S. manages to secure patents on them before China gets there.
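For readers wondering what "processors structured like neurons" actually compute, the snippet below is a minimal sketch, in plain Python, of a leaky integrate-and-fire neuron, the basic unit most spiking-chip designs emulate in hardware. The parameter values are arbitrary and illustrative, drawn neither from the ETO study nor from any specific chip: the point is only that such a neuron accumulates incoming signal, slowly leaks it away, and emits a discrete spike when a threshold is crossed, rather than executing clocked arithmetic.

```python
# Minimal leaky integrate-and-fire (LIF) neuron simulation.
# Illustrative only: parameters are arbitrary, not taken from any real neuromorphic chip.

def simulate_lif(inputs, leak=0.9, threshold=1.0, reset=0.0):
    """Return the list of timesteps at which the neuron spikes.

    inputs:    per-timestep input current (arbitrary units)
    leak:      fraction of membrane potential retained each step
    threshold: potential at which the neuron fires
    reset:     potential immediately after a spike
    """
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # integrate the input, with leak
        if potential >= threshold:              # fire once the threshold is crossed
            spikes.append(t)
            potential = reset                   # reset after the spike
    return spikes


if __name__ == "__main__":
    # A weak constant drive: the neuron charges up over several steps, then fires.
    drive = [0.3] * 20
    print("spike times:", simulate_lif(drive))
```

In a neuromorphic chip, that accumulate-leak-fire loop is implemented directly in circuitry and run massively in parallel rather than simulated in software, which is what makes the approach a candidate for sidestepping traditional node scaling.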

The United States' offensive against China's chip industry has primarily focused on limiting the country's ability to make leading-edge chips, accomplished by sanctioning its imports of modern chipmaking equipment; since 2022, that has covered any tools for fabricating chips below 14nm. International suppliers, including ASML, have been specifically blocked from selling to Chinese-linked entities, effectively keeping the country limited to legacy chips for "national security reasons."

Alas, just as the industry braces for a flood of mature chips from China into the world market thanks to these regulations, China may also eventually develop chipmaking tech beyond the knowledge and capability of the West. Chinese research organizations make up all eight of the most-cited groups worldwide in the chipmaking sphere and show no signs of slowing down. This considerable body of highly cited work also undercuts the common theory that China only profits from stolen tech and research.

The U.S.-China trade war will not end soon, especially as both sides fan the flames with TSMC's $100 billion U.S. investment, China's bullish moves toward the RISC-V architecture, and a new wave of tariffs launching today. What the long term looks like for either nation is unknowable, but China's quiet strides in research may pay dividends tomorrow.

Dallin Grimm
Contributing Writer

Dallin Grimm is a contributing writer for Tom's Hardware. He has been building and breaking computers since 2017, serving as the resident youngster at Tom's. From APUs to RGB, Dallin has a handle on all the latest tech news. 

  • 2Be_or_Not2Be
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
    Reply
  • mitch074
    2Be_or_Not2Be said:
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
    Actually, the USA are playing chess expecting China to play checkers... But China is playing mahjong.
    Reply
  • JRStern
    Neuromorphic will be the thang, just as soon as anybody figures it out.
    But, don't get too excited about citations, if China writes all the papers and they all cite each other, well, there it is..
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
    Reply
  • rluker5
    JRStern said:
    Neuromorphic will be the thang, just as soon as anybody figures it out.
    But, don't get too excited about citations, if China writes all the papers and they all cite each other, well, there it is..
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
    I read a while back that Intel already has neuromorphic computing running.
    Reply
  • gg83
    LMao! This is a very funny headline.
    Reply
  • gg83
    JRStern said:
    Neuromorphic will be the thang, just as soon as anybody figures it out.
    But, don't get too excited about citations, if China writes all the papers and they all cite each other, well, there it is..
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
    Nailed it. I couldn't say it as well as you did.
    Reply
  • DS426
    2Be_or_Not2Be said:
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
    China's economy is going to be getting a lot less business from the U.S., so they'll have to pull that back at some point (or won't be able to increase spend as much as they otherwise would). Pulling back on regulations can spur investment and innovation as well.
    Gotta love the hyper-focused cause-and-effect talking points. ;)
    Reply
  • smogfactory
    The US was warned that all its restrictions on China with regard to AI and chips would do is turbocharge Chinese investment and development. Shock/horror, that is exactly what has happened.
    Reply
  • JRStern
    rluker5 said:
    I read a while back that Intel already has neuromorphic computing running.
    There have been neuromorphic projects for fifty years, probably longer.
    It's probably some flavor of in-memory computing.
    Has to be mapped to existing problems and programs, like LLM.
    Even then it might turn out to be done better with software on more generic hardware, where generic includes something like a B100/B200, but not exactly like.
    Something like Intel processors with CXL, and with FPGA, that they spent all that money on eight years ago, might be a winner.
    Or intelligent storage drives, relatively small ones used in parallel.

    etc.
    Reply
  • Pierce2623
    JRStern said:
    There have been neuromorphic projects for fifty years, probably longer.
    It's probably some flavor of in-memory computing.
    Has to be mapped to existing problems and programs, like LLM.
    Even then it might turn out to be done better with software on more generic hardware, where generic includes something like a B100/B200, but not exactly like.
    Something like Intel processors with CXL, and with FPGA, that they spent all that money on eight years ago, might be a winner.
    Or intelligent storage drives, relatively small ones used in parallel.

    etc.
    Turns out AMD seemingly did much better than Intel at gobbling up an effective FPGA player to pad out their lineup.
    Reply