Facebook and the age of data enlightenment

Published on 27/03/2018 | Written by Jonathan Cotton



With #DeleteFacebook trending, the creepiness of the social giant’s data-enabled reach is beginning to hit home…

The 2016 US Presidential election left everybody feeling a little grubby. From new lows of divisive populism, to the endless cries of ‘fake news’, to the uniquely sleazy vibes coming from many a Trump soundbite, the nastiness and puerile nature of modern politics seemed on display like never before.

Now, there’s another bit of sleaze to add to the list: The revelation that British political consulting firm Cambridge Analytica had a little more Facebook user data at its disposal than some would like, and was using that data to target users with an unsettling degree of specificity.

It’s a big deal: An estimated 100 million registered US voters were targeted, and the whistleblower behind the scandal, former Cambridge Analytica employee Christopher Wylie, has now accepted invitations to testify before US and UK lawmakers.

Targeted advertising on Facebook is nothing new – Hillary Clinton did it, as did Obama before her. Where Cambridge Analytica seems to have separated itself from the pack is through its use of ‘psychographics’ – behavioural analysis that segments not by age, race or gender, but by personality itself.

Here’s what happened: Cambridge Analytica purchased the results of some 270,000 personality tests (generated via a Facebook personality quiz app) from a Cambridge University researcher named Aleksandr Kogan. Purchasing those second-hand test results is a violation of Facebook’s terms and conditions as it is, but to make matters worse, Cambridge Analytica also received a shady little bonus: information on the test-takers’ friends, which it then used to reverse-engineer personality profiles based on those users’ Facebook activity.
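
To make the mechanics concrete, here is a minimal, purely illustrative sketch of how a pre-2014 quiz app could have harvested friend data through Facebook’s Graph API v1.0, which at the time exposed a user’s full friends list and, with the since-removed friends_* permissions, data about those friends. The token, API version and field choices below are assumptions for illustration, not Kogan’s actual code.

    # Illustrative sketch only (Python + requests): pre-2014 Graph API v1.0
    # friend-data harvesting. Token and app details are hypothetical.
    import requests

    GRAPH = "https://graph.facebook.com/v1.0"   # v1.0 was retired in 2015
    TOKEN = "QUIZ_TAKER_ACCESS_TOKEN"           # hypothetical user access token

    # 1. The quiz-taker's own profile (consent granted when taking the quiz).
    me = requests.get(f"{GRAPH}/me", params={"access_token": TOKEN}).json()

    # 2. Their full friends list -- v1.0 returned every friend, not just
    #    friends who had also installed the app (the later v2.0 behaviour).
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": TOKEN}).json().get("data", [])

    # 3. Each friend's page likes (the since-removed friends_likes permission):
    #    the raw material for inferring a personality profile from activity.
    for friend in friends:
        likes = requests.get(f"{GRAPH}/{friend['id']}/likes",
                             params={"access_token": TOKEN}).json()
        # ...feed the likes into a psychographic model (not shown)

One quiz-taker’s consent, in other words, opened a window onto hundreds of friends who never consented at all – which is how 270,000 tests could balloon into profiles covering a large slice of the electorate.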

The result? Cambridge Analytica found itself with the personality profiles of 100 million registered US voters. Then, based on those profiles, it created dozens of campaign ad variations around election talking points – immigration, gun control and the economy – perhaps, in the process, altering the course of the 2016 US presidential election.
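
The ‘psychographics’ at work here are generally described in terms of the Big Five (OCEAN) personality traits that Kogan’s quiz measured. The toy sketch below – with traits, thresholds and ad copy invented purely for illustration, and no relation to Cambridge Analytica’s actual models – shows the general shape of personality-keyed ad selection.

    # Toy illustration only: psychographic ad selection keyed to Big Five
    # (OCEAN) scores. The traits, thresholds and ad copy are invented here
    # to show the shape of the idea, not Cambridge Analytica's actual models.
    from dataclasses import dataclass

    @dataclass
    class Profile:
        openness: float            # all traits scored 0.0 - 1.0
        conscientiousness: float
        extraversion: float
        agreeableness: float
        neuroticism: float

    def pick_gun_control_ad(p: Profile) -> str:
        """Return the ad variant a hypothetical targeter might serve."""
        if p.neuroticism > 0.7:
            return "fear-framed variant"        # e.g. home-invasion imagery
        if p.agreeableness > 0.7:
            return "community-framed variant"   # e.g. family-tradition imagery
        return "policy-framed variant"          # e.g. statistics and quotes

    print(pick_gun_control_ad(Profile(0.4, 0.6, 0.3, 0.8, 0.2)))
    # -> community-framed variant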

That’s the scandal, and as of now, political commentators are clutching their pearls, #DeleteFacebook is (ironically) doing the rounds on Twitter and some are looking for ways to get tough with Facebook.

The EU is one such party. They’ve been gunning for the social media giant – and its data-coveting bird-of-a-feather Google too – for some time, proposing last year that authorities be given the power to levy fines of up to four percent of annual turnover for customer data infringements. (They’re quite keen on extending such a policy to cloud storage services and email providers too.)

As for The Zuck, he’s finally responded – in print, unusually enough – but is making fairly predictable noises: “We have a responsibility to protect your data,” he says in a full-page ad published in several UK and US newspapers, “and if we can’t then we don’t deserve to serve you.”

Zuckerberg first points out that the data from the personality quiz was historical, and that changes made to Facebook’s platform in 2014 drastically limit what data apps like this one can access.

“It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data.”

While those certifications were provided, it’s thought that Cambridge Analytica may not have deleted the data as it claimed.

“This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.” (Cambridge Analytica has now agreed to a forensic audit by a firm hired by Facebook.)

Zuckerberg says that Facebook will now investigate all apps that had access to large amounts of information before the 2014 platform changes, and will conduct a full audit of any app with suspicious activity.

“We will ban any developer from our platform that does not agree to a thorough audit,” he says.

The company will restrict developers’ data access “even further”, removing developers’ access to user data if those users haven’t used the app in the last three months, and reducing the data that apps can request at sign-in to just name, profile photo and email address.
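
In Graph API terms, that pared-back sign-in bundle corresponds roughly to the basic public_profile and email login scopes. Here is a minimal sketch, with the API version and access token as placeholder assumptions, of what such a reduced profile request might look like:

    # Illustrative sketch only (Python + requests): fetching just the name,
    # profile photo and email described above. The token and Graph API
    # version are placeholders, not a definitive integration.
    import requests

    GRAPH = "https://graph.facebook.com/v2.12"
    TOKEN = "USER_ACCESS_TOKEN"  # granted via Facebook Login, scope "public_profile,email"

    profile = requests.get(
        f"{GRAPH}/me",
        params={"fields": "name,picture,email", "access_token": TOKEN},
    ).json()

    # Anything beyond these fields would need extra permissions (and review),
    # and per the announcement, access lapses if the person hasn't used the
    # app in the last three months.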

Finally, Facebook will give users a more conspicuous heads-up as to what is on the line when engaging with these sorts of apps: “In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.”

It’s corporate grovelling 101 and to be expected, but it’s unlikely that Zuckerberg, in his heart of hearts, has much concern for anything other than his company’s liability and bad PR right now.


One party doubting his sincerity is Carol Davidsen, director of integration and analytics on the 2012 Obama campaign. She’s been stirring the pot on Twitter, commenting that, back in 2012, when she was manipulating Facebook user data for the Democratic campaign, “Facebook was surprised we were able to suck out the whole social graph, but they didn’t stop us once they realised that was what we were doing.”

“They came to the office in the days following election recruiting and were very candid that they allowed us to do things they wouldn’t have allowed someone else to do because they were on our side.”

Facebook, it seems, is now in a position where it can play a part in some very large outcomes. That should give us pause for thought.

And as for the actual data violations, we need to figure out how best to address them. What should be expected of Facebook? And what outcome do we actually want here? Because taking Facebook to task for not protecting our user data – the very thing they sell – is likely to be an exercise in futility.

Sure, what Cambridge Analytica did was a violation of Facebook’s policies, but it’s also closely related to how the company makes money. No amount of scolding, regulation or threats will ever convince Facebook to turn away from its actual business model – the collection and sale of your unique user data. In 2018, data – the bigger the better – is the new oil, and empires like Facebook are being built on it.

Don’t like your data being bought, sold, shared, manipulated, added to other data about you, spat out and fed back to you in the form of targeted advertising? Then it’s time, dear friend, to get off social media because that is the name of the game, the only one Facebook cares to play.

“The Cambridge Analytica scandal highlights that social media companies such as Facebook are faced with often conflicting privacy-related demands from users and advertisers, as well as from civil society, academia and government,” says Robert Ackland, associate professor at the School of Sociology, Australian National University.

“It is a quickly changing environment and what was considered ethical and appropriate five or ten years ago (such as the savvy use of social media by the Obama presidential campaign) may be regarded as unacceptable in the future.”

“This is just the natural process of technology evolving over time in response to public scrutiny. But some of Facebook’s privacy missteps have appeared to be wilful, with the platform testing the water (and then apologising) in terms of what it could get away with to make itself more valuable to advertisers.”

While the temptation to demand a pound of flesh from Facebook is surely great, perhaps we’d be better off thinking about the bigger picture here: what we are willing to trade for these so-called ‘free platforms’, whether the benefit justifies the trade, and just how we can prepare ourselves for the profound impacts that the large-scale data economy is going to bring.

To punish Facebook specifically is to miss the point. Bring on the investigations, try to regulate away the risk, but protect yourself at all times. Because right now, your data, whatever that might be, is the cost of entry.

To feign naivety about that fact serves no one.
