ACLU Warns About Government-Mandated Malicious Software Updates

The ACLU issued a warning to software developers that if they help the U.S. government send malicious software updates to their own users, many people could lose faith in software updates altogether. This wouldn't be unheard of: The U.S. government previously considered requiring companies to allow it to send malicious updates to targeted users.

Malicious Automatic Updates

The ACLU noted in a post that companies such as Google and Apple make public only a fraction of data demands from the U.S. government. We know from Microsoft’s now-cancelled lawsuit against the Department of Justice that the government was sending almost half of its data requests to large companies as “secret orders.”

The ACLU worries that some of these requests could include forcing developers to send users malicious updates that could steal data or bypass users’ encryption, track their location, or enable their cameras and microphones.

Automatic software updates are generally a good idea, and they make software ecosystems much more resilient against attacks. Without automatic updates, users who don’t know an update exists for their applications, or who simply don’t want to update, remain exposed to attacks that exploit known vulnerabilities.

Normally, users trust the vendors whose applications they install; otherwise, they might not install them in the first place. That trust extends to any future updates the developer sends. If developers were to push malicious updates that steal user data, that trust would be broken.

Similarly, if people learned that the U.S. government was forcing multiple software vendors to send malicious updates to certain targets, many more people might turn off automatic updates, and delay manual updates too, until they were certain an update wasn’t intended to harm them.

ACLU’s Recommendations To Developers

To help developers “plan ahead” in case the U.S. government comes knocking with a secret order to send users malicious software updates, the ACLU and law students in the NYU Technology Law & Policy Clinic prepared a guide.

The ACLU recommended that developers design their software so that even if the government tried to force them to send malicious updates, compromising a user’s application or communications wouldn’t be possible. Apple notably fought a court order on similar grounds, arguing that the FBI was imposing an undue burden on the company by demanding it create a means to bypass an iPhone user’s storage encryption; the FBI ultimately withdrew its request.

One such tactic is implementing “mirrorable distribution” for software updates, so users can get updates from each other rather than straight from the developer. This would make it much harder for law enforcement to predict exactly which copy of an update will be delivered to a target.
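A client fetching an update from an untrusted mirror or peer still needs a way to confirm it received exactly what the developer published. As a minimal sketch (assuming the developer publishes a digest for each release; real systems would sign the manifest, which is omitted here), the client can verify the mirrored bytes locally:

```python
import hashlib

# Hypothetical sketch: a client accepts an update from any mirror or peer,
# but only installs it if its digest matches the one the developer published.
# (Verifying a signature on the digest manifest is omitted for brevity.)

def verify_mirrored_update(update_bytes: bytes, expected_sha256: str) -> bool:
    """Return True only if the peer-supplied bytes match the published digest."""
    return hashlib.sha256(update_bytes).hexdigest() == expected_sha256

# Example: the developer publishes the digest of release 1.2.0; the client
# fetches the binary from an arbitrary mirror and checks it locally.
release = b"pretend this is the update binary for v1.2.0"
published_digest = hashlib.sha256(release).hexdigest()

assert verify_mirrored_update(release, published_digest)             # clean copy
assert not verify_mirrored_update(release + b"x", published_digest)  # tampered copy
```

Because every mirror serves the same verified bytes, an attacker can’t quietly hand one target a different build without the digest check failing.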

Another new type of update mechanism, called “binary transparency,” could be used to ensure that every update has been verifiably logged in a global, irrevocable, auditable log. Therefore, if a government tried to send someone a malicious update, the attack would be seen in this log. The binary transparency system is similar to Google’s open Certificate Transparency system for certificate issuance, and Mozilla is leading the way in implementing it for Firefox.
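The core idea behind binary transparency can be illustrated with a toy Merkle log, loosely modeled on Certificate Transparency. This is an illustrative sketch, not Mozilla’s actual implementation; all names here are invented:

```python
import hashlib

# Hypothetical sketch of a binary-transparency-style log: each released
# update's digest is appended as a leaf of a Merkle tree, and auditors can
# recompute the root to detect any update that was never publicly logged.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Root hash of a list of leaf hashes (odd nodes are promoted unchanged)."""
    level = leaves[:]
    while len(level) > 1:
        nxt = [h(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:
            nxt.append(level[-1])
        level = nxt
    return level[0]

# Every public release goes into the log before it is shipped.
releases = [h(b"update-1.0"), h(b"update-1.1"), h(b"update-1.2")]
root = merkle_root(releases)

# A targeted malicious build would have a leaf hash absent from the log,
# so a client refusing unlogged updates would reject it.
assert h(b"update-1.1") in releases
assert h(b"malicious-update") not in releases
```

In a real deployment, clients would additionally check a cryptographic inclusion proof against the signed log root, so a lie about the log’s contents is itself detectable.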

Other suggestions from the ACLU and the law students included making the software open source, so that others can see and audit the entire codebase, and implementing “reproducible builds.” Reproducible builds rely on “deterministic compilation,” meaning a given version of a program should compile to byte-identical output for different users.
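The value of reproducible builds is that anyone can rebuild the source and compare digests with the shipped binary. A minimal sketch, with a stand-in function playing the role of a deterministic compiler (all names here are illustrative):

```python
import hashlib

# Hypothetical sketch: with a deterministic build, two independent parties
# compiling the same source get byte-identical artifacts, so comparing
# digests is enough to expose a covertly modified binary.

def build(source: bytes) -> bytes:
    """Stand-in for a deterministic compiler: output depends only on input."""
    return b"compiled:" + hashlib.sha256(source).digest()

def digests_match(a: bytes, b: bytes) -> bool:
    return hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest()

source = b"print('hello')"
artifact_dev  = build(source)   # built and shipped by the developer
artifact_user = build(source)   # rebuilt independently by an auditor

assert digests_match(artifact_dev, artifact_user)
# If the shipped binary had been tampered with, the mismatch would expose it:
assert not digests_match(artifact_dev + b"!", artifact_user)
```

Real compilers only achieve this when sources of nondeterminism (embedded timestamps, build paths, randomized ordering) are eliminated, which is what the Reproducible Builds effort works toward.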

Beyond these technical measures, the ACLU recommended that developers plan how they would respond if the government asked them to send malicious updates to users, and that they retain a lawyer.

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.