China doubles US research output on next-gen chips amid export bans — trade war fuels a research wave

The United States-China chip trade war is entering its fifth year, and the U.S.'s intervention appears to be coming back to bite it: a recent study by the Emerging Technology Observatory (ETO) found that China has produced more than double the United States' research output on next-generation chipmaking technologies.

The ETO notes that 475,000 articles about chip design and fabrication were published worldwide between 2018 and 2023. Of this body of work, 34% was produced by Chinese institutions, dwarfing the 15% from the United States and the 18% from Europe. While chipmaking is not as popular a research subject as hot topics like AI and LLMs, China appears to be going all-in on studying the future of fabrication.
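For scale, here is a quick back-of-the-envelope check in Python. It is a minimal sketch that uses only the figures quoted above; the region labels and the rounding are illustrative rather than the ETO's own tabulation.

```python
# Rough paper counts implied by the ETO figures cited in the article:
# 475,000 chip design/fabrication papers worldwide, 2018-2023, with
# shares of 34% (China), 15% (United States), and 18% (Europe).
TOTAL_PAPERS = 475_000

shares = {
    "China": 0.34,
    "United States": 0.15,
    "Europe": 0.18,
}

# Convert each share into an approximate paper count.
for region, share in shares.items():
    print(f"{region}: ~{round(TOTAL_PAPERS * share):,} papers")

# The ratio behind the "more than double" claim:
ratio = shares["China"] / shares["United States"]
print(f"China/US output ratio: ~{ratio:.2f}x")  # prints ~2.27x
```

Run as written, this puts China at roughly 161,500 papers against roughly 71,250 from the United States, a ratio of about 2.27x, which matches the study's "more than double" framing.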

Sunny Grimm
Contributing Writer

Sunny Grimm is a contributing writer for Tom's Hardware. He has been building and breaking computers since 2017, serving as the resident youngster at Tom's. From APUs to RGB, Sunny has a handle on all the latest tech news.

  • 2Be_or_Not2Be
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
  • mitch074
    2Be_or_Not2Be said:
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
Actually, the USA is playing chess expecting China to play checkers... but China is playing mahjong.
  • JRStern
    Neuromorphic will be the thang, just as soon as anybody figures it out.
But don't get too excited about citations: if China writes all the papers and they all cite each other, well, there it is...
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
  • rluker5
    JRStern said:
    Neuromorphic will be the thang, just as soon as anybody figures it out.
But don't get too excited about citations: if China writes all the papers and they all cite each other, well, there it is...
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
I read a while back that Intel already has neuromorphic computing running.
  • gg83
LMAO! This is a very funny headline.
  • gg83
    JRStern said:
    Neuromorphic will be the thang, just as soon as anybody figures it out.
But don't get too excited about citations: if China writes all the papers and they all cite each other, well, there it is...
    Lots of published papers say nothing, or are just wrong.
    US commercial companies have almost stopped R&D publications over the last 15-20 years.
    Even stopped patenting anything important because patents reveal technology.
    Nailed it. I couldn't say it as well as you did.
  • DS426
    2Be_or_Not2Be said:
    China increases research spending, and the US pulls back funding for science research & even local chipmakers. So who's playing checkers while the other is playing chess?
China's economy is going to be getting a lot less business from the U.S., so they'll have to pull that back at some point (or won't be able to increase spending as much as they otherwise would). Pulling back on regulations can spur investment and innovation as well.
    Gotta love the hyper-focused cause-and-effect talking points. ;)
  • smogfactory
The US was warned that all its restrictions on China with regard to AI and chips would do is turbocharge Chinese investment and development. Shock/horror: that is exactly what has happened.
  • JRStern
    rluker5 said:
I read a while back that Intel already has neuromorphic computing running.
    There have been neuromorphic projects for fifty years, probably longer.
    It's probably some flavor of in-memory computing.
Has to be mapped to existing problems and programs, like LLMs.
Even then it might turn out to be done better with software on more generic hardware, where "generic" includes something like a B100/B200, but not exactly like it.
Something like Intel processors with CXL and the FPGAs they spent all that money on eight years ago might be a winner.
    Or intelligent storage drives, relatively small ones used in parallel.

    etc.
  • Pierce2623
    JRStern said:
    There have been neuromorphic projects for fifty years, probably longer.
    It's probably some flavor of in-memory computing.
Has to be mapped to existing problems and programs, like LLMs.
Even then it might turn out to be done better with software on more generic hardware, where "generic" includes something like a B100/B200, but not exactly like it.
Something like Intel processors with CXL and the FPGAs they spent all that money on eight years ago might be a winner.
    Or intelligent storage drives, relatively small ones used in parallel.

    etc.
    Turns out AMD seemingly did much better than Intel at gobbling up an effective FPGA player to pad out their lineup.