According to a CNBC report, Facebook sent a doctor on a secret mission to ask hospitals to share patient data with the company. The aim was to match patient records to Facebook profiles, a potential violation of the Health Insurance Portability and Accountability Act of 1996 (HIPAA).
Facebook’s Spy Doctor
The report said that Facebook had asked major hospitals and health organizations, such as Stanford Medical School and the American College of Cardiology, to share “anonymized” data about their patients for a “research project.”
The proposal never went beyond the planning phase, and it was put on hold once the Cambridge Analytica scandal exposed other privacy problems with the Facebook platform.
The effort to share patient data was led by interventional cardiologist Freddy Abnousi, who describes his role on LinkedIn as "leading top-secret projects." The project was supervised by Regina Dugan, the head of Facebook's "Building 8" experimental projects group, before she left in October 2017.
Deanonymizing “Anonymized Data”
We already know from previous studies that the so-called anonymized data that advertising and data-tracking companies promote as a way to encourage people to give up their data isn’t actually that anonymous. In fact, several of these studies found that more than 90% of people could be easily re-identified from the anonymized data.
It seems Facebook already knew this: according to the CNBC report, the company was planning to match the patient data against its own user profiles.
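To make the risk concrete, here is a toy linkage attack in Python. The records and names below are invented for illustration, but the pattern, joining “anonymized” records to a named data set on quasi-identifiers such as ZIP code, birth date, and sex, is the one those re-identification studies rely on:

```python
# Toy linkage attack: names have been stripped from the medical records,
# but quasi-identifiers (ZIP code, birth date, sex) survive -- and those
# are often enough to re-identify a person by joining against a data set
# that still carries names.

medical_records = [  # "anonymized" hospital data
    {"zip": "94305", "birth_date": "1984-02-17", "sex": "F", "diagnosis": "arrhythmia"},
    {"zip": "10001", "birth_date": "1990-07-02", "sex": "M", "diagnosis": "hypertension"},
]

profiles = [  # named profile data the attacker already holds
    {"name": "Alice Example", "zip": "94305", "birth_date": "1984-02-17", "sex": "F"},
    {"name": "Bob Example", "zip": "10001", "birth_date": "1990-07-02", "sex": "M"},
]

def reidentify(records, profiles):
    """Join the two data sets on the quasi-identifiers they share."""
    by_key = {(p["zip"], p["birth_date"], p["sex"]): p["name"] for p in profiles}
    for record in records:
        key = (record["zip"], record["birth_date"], record["sex"])
        if key in by_key:
            # The "anonymous" record has a name attached to it again.
            yield by_key[key], record["diagnosis"]

for name, diagnosis in reidentify(medical_records, profiles):
    print(f"{name} -> {diagnosis}")
```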
To comply with federal and state medical privacy laws, Facebook planned to use cryptographic hashes to match the medical data set against its user base while blurring the names of the patients in the medical data.
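The report doesn’t describe the exact scheme, so the following is only a minimal sketch of how such hash-based matching could work; every name, date, and identifier here is invented for illustration:

```python
import hashlib

def identity_hash(name: str, birth_date: str) -> str:
    """Hash identifying fields so raw names never leave either party,
    while the same person still produces the same token on both sides."""
    return hashlib.sha256(f"{name.lower()}|{birth_date}".encode()).hexdigest()

# Hospital side: the patient's name is "blurred" into a hash before sharing.
shared_medical_rows = [
    {"token": identity_hash("Alice Example", "1984-02-17"), "diagnosis": "arrhythmia"},
]

# Facebook side: the same hash is computed from its own profile data.
profile_tokens = {
    identity_hash("Alice Example", "1984-02-17"): "user_12345",
}

# Matching on the shared token links each medical row to a specific account.
# No name was ever exchanged in the clear, yet the record is now tied to one
# identifiable user -- which is why the result is de facto deanonymized.
for row in shared_medical_rows:
    user_id = profile_tokens.get(row["token"])
    if user_id is not None:
        print(user_id, "->", row["diagnosis"])
```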
However, the end result of this scheme still seems to be deanonymization: if, at the end of the whole process, the company can match individual users to specific medical data about them, that data is no longer anonymous, even if it may have been in the earlier stages of the process.
Facebook seems to have gotten into the habit of using its users' data in whole new ways without asking for consent, and only apologizing later, when discovered. The company had previously run psychological experiments on its users, for which it later apologized.
Aneesh Chopra, president of CareJourney, a health software company specializing in patient data, seems to agree that Facebook is not approaching this issue the right way:
“Consumers wouldn't have assumed their data would be used in this way. If Facebook moves ahead (with its plans), I would be wary of efforts that repurpose user data without explicit consent.”
The new EU GDPR rules already require explicit consent for data collection in most cases, and Facebook has already promised to extend the same privacy controls to all of its users. It now remains to be seen whether Facebook has actually learned anything from the Cambridge Analytica scandal, and whether the company will be more careful about not using its users' data in ways to which they never consented.