The lesson of Cambridge Analytica: Facebook shares your data by design.
Cambridge Analytica, a company funded by Republican backer Robert Mercer, used data from 50 million Facebook users to model the electorate. But this isn’t a data breach. Facebook is a machine designed to encourage people to share their preferences, then exploit that information for profit. Despite Facebook’s protests, this is a feature, not a bug.
According to the New York Times article “How Trump Consultants Exploited the Facebook Data of Millions,” here’s what happened:
[Cambridge Analytica founder Christopher Wylie’s] team had a bigger problem. Building psychographic profiles on a national scale required data the company could not gather without huge expense. Traditional analytics firms used voting records and consumer purchase histories to try to predict political beliefs and voting behavior. . . .
Mr. Wylie found a solution at Cambridge University’s Psychometrics Centre. Researchers there had developed a technique to map personality traits based on what people had liked on Facebook. The researchers paid users small sums to take a personality quiz and download an app, which would scrape some private information from their profiles and those of their friends, activity that Facebook permitted at the time. The approach, the scientists said, could reveal more about a person than their parents or romantic partners knew . . .
When the Psychometrics Centre declined to work with the firm, Mr. Wylie found someone who would: Dr. [Aleksandr] Kogan, who was then a psychology professor at the university and knew of the techniques. Dr. Kogan built his own app and in June 2014 began harvesting data for Cambridge Analytica. The business covered the costs — more than $800,000 — and allowed him to keep a copy for his own research, according to company emails and financial records.
All he divulged to Facebook, and to users in fine print, was that he was collecting information for academic purposes, the social network said. It did not verify his claim. Dr. Kogan declined to provide details of what happened, citing nondisclosure agreements with Facebook and Cambridge Analytica, though he maintained that his program was “a very standard vanilla Facebook app.”
He ultimately provided over 50 million raw profiles to the firm, Mr. Wylie said, a number confirmed by a company email and a former colleague. Of those, roughly 30 million — a number previously reported by The Intercept — contained enough information, including places of residence, that the company could match users to other records and build psychographic profiles. Only about 270,000 users — those who participated in the survey — had consented to having their data harvested.
So a Republican-funded firm worked with an academic to persuade Facebook users to take a personality quiz, then “harvested” their information and that of their friends to build a psychographic model.
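If you’re curious what modeling personality from likes actually looks like, here’s a toy sketch in the spirit of the Psychometrics Centre’s published approach: reduce a sparse user-by-page likes matrix, then regress the components against a personality trait. The data, dimensions, and model choice here are all illustrative assumptions; this is not Cambridge Analytica’s actual pipeline.

```python
# Toy sketch of like-based trait prediction: SVD on a sparse
# user-x-page "likes" matrix, then a linear model from the
# components to a personality score. Synthetic data throughout.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 5_000, 20_000

# Binary likes matrix: rows are users, columns are Facebook pages.
likes = sparse_random(n_users, n_pages, density=0.002, random_state=0)
likes.data[:] = 1.0

# A made-up "openness" score per user, purely for demonstration.
openness = rng.normal(size=n_users)

# Compress the sparse likes into dense components, then regress.
components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
X_train, X_test, y_train, y_test = train_test_split(
    components, openness, random_state=0)
model = Ridge().fit(X_train, y_train)
print("R^2 on held-out users:", model.score(X_test, y_test))
# On this random data R^2 is ~0; the published research reported that
# real like-based predictions rivaled judgments by friends and family.
```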
A lot of ink has been spilled analyzing these activities, including whether entities affiliated with Russia helped fund this activity. But what about Facebook? Did it do anything wrong here?
Facebook defends its behavior
Here’s the Facebook statement from its blog, including a comment added after its original publication. I’ve added my own translation of what they’re actually saying.
Suspending Cambridge Analytica and SCL Group from Facebook
By Paul Grewal, VP & Deputy General Counsel
Update on March 17, 2018, 9:50 AM: The claim that this is a data breach is completely false. Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.
Translation: It wasn’t hackers. Advertisers suck people into sharing data; that’s how Facebook works.
Originally published on March 16, 2018:
We are suspending Strategic Communication Laboratories (SCL), including their political data analytics firm, Cambridge Analytica, from Facebook. Given the public prominence of this organization, we want to take a moment to explain how we came to this decision and why.
Translation: An advertiser lied to us. When we found out, we kicked them off Facebook. This is the ultimate sanction.
We Maintain Strict Standards and Policies
Protecting people’s information is at the heart of everything we do, and we require the same from people who operate apps on Facebook.
Translation: We let people share everything about themselves in exchange for a momentary thrill. Sometimes app developers deliver that momentary thrill. They’re supposed to be nice about it.
In 2015, we learned that a psychology professor at the University of Cambridge named Dr. Aleksandr Kogan lied to us and violated our Platform Policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica, a firm that does political, government and military work around the globe. He also passed that data to Christopher Wylie of Eunoia Technologies, Inc.
Translation: We don’t control data once people get it off Facebook. Sometimes people share it. That’s not nice.
Like all app developers, Kogan requested and gained access to information from people after they chose to download his app. His app, “thisisyourdigitallife,” offered a personality prediction, and billed itself on Facebook as “a research app used by psychologists.” Approximately 270,000 people downloaded the app. In so doing, they gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it.
Translation: Apps get access to our members’ data. That’s their business model.
Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules. By passing information on to a third party, including SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, he violated our platform policies. When we learned of this violation in 2015, we removed his app from Facebook and demanded certifications from Kogan and all parties he had given data to that the information had been destroyed. Cambridge Analytica, Kogan and Wylie all certified to us that they destroyed the data.
Translation: Once the data is off of Facebook we can’t control it. If we find out people are sharing it, that’s very upsetting. Sometimes they lie about destroying the data. This is bad.
Breaking the Rules Leads to Suspension
Several days ago, we received reports that, contrary to the certifications we were given, not all data was deleted. We are moving aggressively to determine the accuracy of these claims. If true, this is another unacceptable violation of trust and the commitments they made. We are suspending SCL/Cambridge Analytica, Wylie and Kogan from Facebook, pending further information.
We are committed to vigorously enforcing our policies to protect people’s information. We will take whatever steps are required to see that this happens. We will take legal action if necessary to hold them responsible and accountable for any unlawful behavior.
Translation: After they persuaded you to give up your data, they did stuff with it. That’s not nice. We’ll be closing the barn door now that 50 million horses have fled this particular barn.
How Things Have Changed
We are constantly working to improve the safety and experience of everyone on Facebook. In the past five years, we have made significant improvements in our ability to detect and prevent violations by app developers. Now all apps requesting detailed user information go through our App Review process, which requires developers to justify the data they’re looking to collect and how they’re going to use it – before they’re allowed to even ask people for it.
Translation: We’ve changed. Now we’ll only work with nice people.
In 2014, after hearing feedback from the Facebook community, we made an update to ensure that each person decides what information they want to share about themselves, including their friend list. This is just one of the many ways we give people the tools to control their experience. Before you decide to use an app, you can review the permissions the developer is requesting and choose which information to share. You can manage or revoke those permissions at any time.
On an ongoing basis, we also do a variety of manual and automated checks to ensure compliance with our policies and a positive experience for users. These include steps such as random audits of existing apps along with the regular and proactive monitoring of the fastest growing apps.
We enforce our policies in a variety of ways — from working with developers to fix the problem, to suspending developers from our platform, to pursuing litigation.
Translation: We’re running a candy factory. We’ve told people that candy is bad for them. We’ve put a fence around the factory, although the fence is full of holes. People still keep coming in and taking candy. We will keep studying why that happens and patching the holes from time to time.
Facebook shares people’s data. That’s how it works.
Here are some facts.
Facebook’s developers designed it to be seductive. It uses your relationships with friends — and your biases around news — to suck you into interacting. This is why it exists.
Facebook makes money by allowing advertisers to target you based on your information. This is its business model.
Third parties with Facebook apps seduce people into sharing their Facebook information. Those “quizzes” you’re taking on Facebook? This is what they do; a sketch of the mechanics follows this list. It’s part of the business model.
Once your data is in the hands of those third parties, Facebook’s contracts do not allow them to pass it along to anyone else. Facebook’s ability (and desire) to enforce this is weak.
For that reason, Facebook is a perfect machine for harvesting and exploiting data for commercial and political purposes. This is why it is susceptible to trolls. It is why Cambridge Analytica used it. And it is why every commercial or partisan group that needs to operate on data will use it in the future.
Facebook is very good at encouraging sharing. That was a hard problem, but Facebook has solved it.
Facebook is not very good at protecting data or preventing “bad actors” from exploiting how it works. This is also a hard problem, and there is no easy solution: the simple fixes that come to mind would either slow down the sharing on Facebook or require people to act intelligently in their own interest, two things that are very unlikely to happen.
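To make the harvesting mechanics concrete, here is a rough sketch of the kind of Graph API calls a quiz app could make in 2014. It assumes the long-retired Graph API v1.0 (which let an app walk a user’s full friend list), a placeholder access token, and illustrative field names; none of this works against today’s API.

```python
# Hypothetical sketch of 2014-era Graph API v1.0 calls by a quiz app.
# ACCESS_TOKEN is a placeholder granted when a user authorizes the app;
# v1.0 and its friends-related permissions were retired in 2015.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "REPLACE_WITH_USER_TOKEN"

def get(path, **params):
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{GRAPH}/{path}", params=params)
    resp.raise_for_status()
    return resp.json()

# The consenting user's own profile fields and page likes.
me = get("me", fields="id,name,location,likes")

# Under v1.0, an app with friends-related permissions could also walk
# the user's entire friend list and pull similar fields for each
# friend -- the mechanism that turned ~270,000 consenting users into
# tens of millions of profiles.
for friend in get("me/friends").get("data", []):
    profile = get(friend["id"], fields="id,name,location,likes")
    # ...store the profile off-platform for later modeling...
```

Today’s API returns only friends who also installed the app and gates detailed fields behind App Review — exactly the change Facebook describes above.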
You can blame Facebook, Russians, Republicans, Steve Bannon, or anyone else you feel like. You can wring your hands and say “this shouldn’t happen, we need to stop it.”
But the people who built Facebook designed it to work this way. Don’t expect it to be any different in the future.
Edit: the third section above was updated to reflect a nuance in how and with whom Facebook shares information.
This problem is not just with Facebook, although, as you pointed out, they need to be accountable. Every time you register a product warranty, buy a house, download a free ebook, or take any number of other digital actions, the result is the same: people sell your data. I don’t know which is worse: somebody having my data or someone using my data to dupe people into dumb decisions. As a marketer, this is a tough one. We want people to give us their data so we can better target their needs, which in turn helps them buy something they probably don’t need. In my pea-brain thinking, this is like trying to stop a tsunami by building a brick retaining wall. The tsunami is already on its way. Facebook is not the enemy. The people who mess with the data to mess with your minds are the culprits. I believe this is a human problem. [coming down off the soapbox now]
As many wise folks have said, if you don’t pay anything, then you are the product.
Some technicalities in the article are wrong. SCL was not an advertiser, but an app maker.
So, sadly but at least a point in their favor, Facebook did not get any benefit from SCL; it was solely SCL that benefited. And it took Facebook years to realize this.
My personal reading is that Facebook sinned out of naivete and irresponsibility, not malice: the evil parties were SCL and CA.