Opinion: Welcome to the age of ‘surveillance capitalism,’ where highly personalized information and psychological models are leveraged to change consumers’ behaviour and sway their opinions
Did Facebook hack the 2016 U.S. election? No. Did it swing the result? Probably. Did it do it on purpose? No, not likely. But this weekend, news reports bolster the view that its data, which were not supposed to be shared, were central to the strategy of political consultants Cambridge Analytica to sway voters in favour of Donald Trump. What Facebook has actually done here is confirm the death of privacy.
You could make the case that Facebook is actually the victim here, after Cambridge Analytica weaponized someone else’s academic survey data by blending it with other available consumer data to create rich profiles on the behaviours and activities of 50 million American Facebook users and potential voters. But right now, many fingers are pointing at Facebook and its founder and CEO, Mark Zuckerberg.
The news broke hard over the weekend and the fallout has been fevered and far-reaching, extending far beyond a slide in Facebook shares that took as much as a US$5-billion bite out of Zuckerberg’s personal wealth in a matter of hours. The backlash includes calls for government inquiries or investigations in the U.S., EU and Britain, words of concern from the British prime minister and speculation that users could sign out of the social media giant’s site for good due to concerns over data breaches and privacy threats.
This may be the precise moment in time when we all acknowledge that privacy is officially gone. No one buys the Google mantra “Don’t be evil” anymore; even if social media companies aren’t actively conspiring to eliminate privacy, they are complicit in its demise. Facebook may not have hacked an election, but nobody really knows where our data live any more, or who has access to them.
Who watches the watchers?
The question of oversight or regulation is one that consumers and brands must face when it comes to marketing and advertising on platforms like Facebook. It’s one thing to help a company sell you its goods, but it’s a whole new world to have the power to help anyone buy an election.
About 50 million profiles were allegedly harvested to create content and target individuals with specific political messages. Using personal profile data, analytics, predictive software and algorithms, Cambridge Analytica, backed by U.S. hedge fund billionaire Robert Mercer and advised by Steve Bannon (of Breitbart and Donald Trump’s election team), allegedly used our personal information, without explicit authorization, starting in 2014 to build a database that could target personalized political advertisements down to specific individuals.
It wasn’t a data breach. Or was it?
It’s easy to shake your fist in the air and scream, “How could this happen?” The roots of this story seem innocuous enough. Back in 2014, Global Science Research’s Aleksandr Kogan launched a personality quiz app on Facebook called thisisyourdigitallife to study human behaviour based on information that could be gleaned from Facebook. The app was downloaded by 270,000 Facebook users, who were paid a small sum to take a personality test (all good and fun).
Depending on users’ privacy settings, the app gathered not just information about the individuals who downloaded it, but also gave GSR access to their “friends.” That quickly scaled to the 50 million profiles that GSR then shared with the data analytics firm Cambridge Analytica. Facebook was made aware of this in 2015, removed the app and demanded proof from GSR (and all other parties) that the data had been destroyed.
Several days ago, Facebook was told that the data had not been deleted. Now, it’s all one big hot mess. Facebook is currently attempting to confirm whether or not the data were deleted. Cambridge Analytica claims that it did, in fact, delete them when it became clear that the data were not obtained in line with Facebook’s terms of service, and it denies using the data in relation to the Trump campaign. Meanwhile, a former Cambridge Analytica employee, Christopher Wylie (a Vancouver native), is doing the media circuit as a whistleblower. Both the New York Times and The Guardian have been leading this past weekend’s revelations.
Welcome to “surveillance capitalism”
The idea of “surveillance capitalism” was first introduced by John Bellamy Foster and Robert McChesney in Monthly Review, and later popularized by academic Shoshana Zuboff (according to Wikipedia). In short: Facebook’s business model is not based on content, marketing or advertising. You, the consumer, are the product, and the money Facebook generates depends on how well it can monetize your data and target you on behalf of its brand partners.
None of that should come as a surprise, but the Cambridge Analytica twist on this business model is not about knowing individuals and figuring out how to better position a brand in front of a consumer. This new model is about leveraging highly personalized information and psychological models to change consumers’ behaviour by showing them hyper-personalized content to sway their opinions. Of course, this is all being done without the consumer’s consent or knowledge. And the weapon of choice was fake news, served up to appeal to unique profiles.
Don’t be evil. Do no harm.
While that is often the mantra of these Silicon Valley social media and Internet platforms, it’s getting harder and harder for consumers to believe it. It’s not just Facebook. Google, Twitter, LinkedIn, Amazon and others hold a treasure trove of personal information about consumers. If third-party players are able to run apps and programs across multiple platforms, seeing 50 million accounts being harvested on Facebook may wind up being the tip of the iceberg. Everyone is vulnerable.
Facebook’s VP & Deputy General Counsel, Paul Grewal, took to Facebook to update his post, Suspending Cambridge Analytica and SCL Group from Facebook, with this on March 17th: “The claim that this is a data breach is completely false. Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”
You will need to re-read that last paragraph a few times. It’s not about whether or not users knowingly provided their information. Users were almost certainly clueless that this app was taking much more from them than a personality quiz: it scraped their profiles and the profiles of their friends. So, while SCL may have “legally” used an app to access this information, that does not mean it did nothing morally wrong, or that Facebook’s “legal” terms of service at the time were not vulnerable.
Where do we go from here?
The true business challenge is that consumers’ data are being used in myriad ways they are not aware of. Consumers have little insight into, and little recourse over, what data are held, how they are being used and where they may end up. In all of the news about Cambridge Analytica, it is still unclear how much advertising revenue that company drove to Facebook. Even if consumers did read the terms of service when signing up for a platform like Facebook, it is more than likely that they didn’t keep abreast as the platform changed and adjusted those terms.
Is it strictly the user’s fault if they’re not staying apprised? Facebook and other social networks often allow third parties to connect with the consumer, and while this makes for a better user experience (you get ads for things you’re interested in), we can’t be certain that users understand how those data are sold and resold. What happens when many of these third-party apps and companies go bankrupt? Are the consumer data assets being reused and resold? Do consumers have a right to know if that has happened, and to stop it?
Facebook, like you, is trying to stay ahead of this
This Cambridge Analytica fiasco highlights some of the dynamic challenges that businesses face today. How can Facebook protect a business’s (and a user’s) best interests? The truth is that Facebook knows far more about its users than it needs to know in order to target them with relevant content and advertising.
This data imbroglio started playing out in 2015, so why are most people only finding out about it now? Should every misuse of Facebook data be made readily available to the people whose data have been misused? These are all monstrously large ethical, legal and business questions that will need to be answered and accounted for.
The better business question might be: just how much data does a brand really need to target a consumer with a relevant message? And should every byte of information beyond that point be kept out of places where it can be passed along or analyzed? Until Facebook and the other digital marketing data giants can prove that they can police themselves, brands have an incredible opportunity to step up, move forward and properly protect their consumers and the people they want as new consumers.
Mitch Joel is President of Mirum, a global digital marketing agency with offices in Toronto and Montreal and author of Six Pixels of Separation and CTRL ALT Delete.