US Restricts Export of AI Related to Geospatial Imagery

The U.S. Bureau of Industry and Security announced yesterday that it would restrict the export of artificial intelligence-related technologies beginning January 6. That might seem like bad news for the American tech industry, but it's actually not as bad as it could've been, because right now the restrictions only apply to geospatial imagery.

Those restrictions won't outright prohibit U.S. tech companies from exporting AI products related to geospatial imagery. The rules allow for the export of such tech to Canada, for example, and companies can apply for licenses to export their wares to other countries. There's just no guarantee those licenses will be granted.

James Lewis, from the Center for Strategic and International Studies think tank, told Reuters that the Bureau of Industry and Security essentially wants "to keep American companies from helping the Chinese make better AI products that can help their military." He said the U.S. fears the possibility of AI-controlled targeting systems.

The restrictions essentially just give the U.S. government more control over certain technologies that could give other countries a military advantage. While some companies might chafe under those restrictions, especially if their shareholders aren't pleased, it's not uncommon for governments to enforce these kinds of rules.

Things could have been much worse. AI has become a central part of many services, and it's possible to run AI on nearly any kind of hardware if you're patient enough, so broader restrictions could have caused problems for much of the industry. Instead, the U.S. government introduced a narrow rule that applies to specific tech.

But that might not always be the case. Reuters reported in December that the U.S. was considering other rules that would also limit the export of technologies related to quantum computing, gate-all-around field-effect transistors, 3D printing, and chemical weapons. (Which, again, isn't that surprising.)