Facebook is currently embroiled in two privacy-related controversies. The first involves the ease with which third parties can collect information about Facebook users without their knowledge or consent, as shown by Cambridge Analytica's use of data from 50 million Americans for political messaging. The second revolves around Messenger's disclosed-but-surreptitious collection of call and SMS history on Android smartphones. The company is addressing the first controversy by changing its platform and the way organizations use its data.
In a blog post, Facebook said it plans to change several aspects of its platform in the near future. Some changes, like the decision to halt app review last week as it prepared to better protect user data, address the problems in the near term. The decision to investigate all developers who had access to "large amounts of information" before Facebook changed its platform in 2014, meanwhile, is supposed to suss out any problems from the past. Facebook's other planned changes are aimed more at making sure controversies like this one are less likely to spring up in the future.
Here are the relevant changes, as outlined in Facebook's blog post:
Inform people if an app is removed for data misuse: If we find developers that misused personally identifiable information, we will ban them from our platform. Moving forward, if we remove an app for misusing data, we will notify everyone who used it.

Encourage people to manage the apps they use: We already show people what apps their accounts are connected to and let them control what data they've permitted those apps to use. In the coming month, we're going to make these choices more prominent and easier to manage.
These changes address the same core problem: people who use Facebook don't think about how their information is handled. Many apps request access to Facebook's data, sometimes for login purposes, sometimes to enable core features. Chances are good that many Facebook users don't consider what will happen to that data; they simply grant developers everything they request and assume Facebook will keep them safe.
The Cambridge Analytica saga--and the subsequent revelation from a former Facebook employee that the company did little to protect user data once it left its servers--showed how faulty those assumptions were. But that doesn't mean every Facebook user will suddenly know how to manage which apps have access to their information, or which developers can't be trusted. Most people simply don't know how to protect their own data. These changes, however, should help raise awareness of these problems and make it easier for people to figure out how to respond to them.
Facebook also said it plans to expand its bug bounty program to reward people who discover that developers are misusing data and disclose that misuse to the company. It will also increase scrutiny of apps that request the "user_friends" permission, which lets developers show people which of their Facebook friends use an app, by requiring a Login Review for every app that attempts to access that information.
It's clear that Facebook is reeling from these privacy controversies. Lawmakers around the world have requested more information about the Cambridge Analytica saga, disgruntled users started the #deletefacebook campaign, and the company's share price keeps falling. These changes are unlikely to reverse all that momentum, but they should provide a modicum of comfort to the many people who will continue to use Facebook.