After five days of complete silence on the issue, Facebook CEO Mark Zuckerberg came out with an update on the company's latest controversy: a data leak that affected 50 million Americans and allowed their data to be used for political purposes. Zuckerberg also offered yet another apology for a “breach of trust” between Facebook and its users, the kind of apology that now seems to come roughly once a year.
Zuckerberg Apologizes (Again)
In a letter to the public, Zuckerberg admitted that because of the way Facebook’s data sharing with third parties worked before 2014, companies such as Strategic Communication Laboratories (SCL) and Cambridge Analytica were able to obtain not just your own data, for which you had to give permission, but also your friends’ data. Sharing your friends’ data without their consent is probably never a good idea, as the European Union (EU) realized not too long ago when it wrote the General Data Protection Regulation (going into effect this May).
After Facebook learned how Cambridge University researcher Aleksandr Kogan was harvesting its users’ data (again, through rules and APIs that Facebook itself wrote and sanctioned), the company seems to have reached the same conclusion. It made changes in 2014 that prevented third parties from getting friends’ data unless those friends also gave permission to share their data with the same third-party developer.
However, even after the change, people may not have fully understood what they were signing up for when, say, taking a quiz on Facebook. If a quiz app asks for all of your timeline data, including photos, comments, likes, shares, and profile information, most people will just click "OK" because they want to take the quiz, without realizing the implications of their agreement. Any app, whether a simple quiz or a game, could have obtained all of their and their friends' data this easily.
Users may be partly at fault here, too, but most people ignore privacy policies for a good reason: they’re made difficult to read and understand on purpose. Companies also need to take responsibility for how easy or hard their options and settings are for users to understand.
Zuckerberg said that he learned from the media in 2015 that Kogan was sharing the data with Cambridge Analytica, after which Facebook asked Kogan and Cambridge Analytica to certify that they had deleted all improperly obtained data.
However, at the time, Facebook didn’t conduct any audit. Only after the recent Cambridge Analytica story came out did the company send auditors to Cambridge Analytica’s offices. The auditors didn’t have time to investigate, because the UK Information Commissioner’s Office (ICO) arrived with a warrant and told them to stand down so they wouldn’t impede the government’s investigation.
A “Breach Of Trust”
Zuckerberg also admitted that the Cambridge Analytica story showed a “breach of trust” between Facebook and its users:
This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.
Zuckerberg committed to investigating all the apps that requested large amounts of information before the 2014 rule change. A full audit of any app with suspicious activity will be conducted, and any developer who refuses the audit will be immediately banned from the platform. If the audits find apps that misused personally identifiable information, the company will ban those apps and tell everyone about its findings.
Facebook’s CEO also said that further API restrictions will be added to the platform: developers will lose access to a person’s data if that person hasn’t used their app in the last three months. Presumably, Facebook will also check whether developers are simply storing the data somewhere else before losing access, as Cambridge Analytica did.
Another restriction, which should probably have existed from day one, will limit third-party developers to obtaining only your name, profile photo, and email address when you sign in with Facebook Login. To request additional data, developers will have to sign contracts with Facebook.
Finally, Zuckerberg also promised a new tool in Facebook that will allow users to more easily revoke app permissions. This feature will be available next month.
Are The Tighter Rules Permanent?
Facebook has a long track record of changing its mind about its privacy policies. In the past, the company would often nullify users’ more privacy-oriented settings and make their data more public by default. It took many outcries and many years for Facebook to finally start allowing its users to actually make their data more private and limit who got to see it.
It also took companies such as SCL and Cambridge Analytica, and perhaps others, grossly abusing the lax data-sharing rules Facebook put in place for third-party applications - the same kind of rules Facebook planned to use to obtain your WhatsApp data after purchasing the messaging company. Only after the EU Data Protection Authorities started intervening and demanding strict limits on how Facebook can obtain data and what it can do with it did the company take a more privacy-oriented approach.
In the EU, Facebook won’t be given much of a choice once the GDPR goes into effect, especially now that privacy enforcers will be watching the company much more closely after what happened with Cambridge Analytica. However, it remains to be seen whether this is just another apology from Zuckerberg in a long string of apologies, meant to get people to put down their pitchforks and forget about deleting their accounts.
Without strong privacy regulations in the U.S. as well, the company won’t have much incentive to resist changing its mind a couple of years from now and relaxing the rules again to make more money and keep its shareholders happy.