The 2010s have been marked by unprecedented surveillance, the scale of which was revealed by former National Security Agency (NSA) contractor and whistleblower Edward Snowden. Granted permanent residency in Russia in October 2020, Snowden was responsible for leaking more than 200,000 documents to Guardian journalists, documenting US, UK and Australian government surveillance of citizens around the world. These documents revealed several government programs, such as PRISM, which directly incorporated big tech companies in the mass surveillance of citizens, including Microsoft, Facebook, Apple, Google, YouTube, Skype, Yahoo and many telecommunications companies (Greenwald 2013, 2014). Other international mass surveillance programs included XKeyscore and Tempora. The NSA selects about 22.4 terabytes of data for review every minute, roughly the equivalent of 5,724 two-hour HD movies (MacAskill and Dance 2013). The NSA established a global precedent for how government agencies monitor and surveil citizens through social media.
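As a rough sanity check of the movies comparison, the quoted figure is consistent with a two-hour HD movie of roughly 3.9 GB (the per-movie size is an assumption for illustration, not a figure from the source):

```python
# Sanity check of "22.4 TB per minute is about 5,724 two-hour HD movies".
# The ~3.9 GB per-movie size is an assumption, not from the source.
tb_per_minute = 22.4
gb_per_movie = 3.9  # assumed size of a two-hour HD movie

movies_per_minute = tb_per_minute * 1000 / gb_per_movie
print(round(movies_per_minute))  # ~5,744, close to the quoted 5,724
```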
At around the same time, the "Apple-Foxconn alliance", which Jack Qiu calls "Appconn", became notorious for mistreating its staff so badly that a wave of Chinese Apple workers committed suicide in 2010, highlighting a global IT business model based on severe worker exploitation (e.g. via resource extraction for smartphones as well as IT manufacture) (Qiu 2017: 6–8). Indeed, Qiu argues that Appconn best represents a "new hegemon", and that it is not just Apple or Foxconn, or China's low wages, that contribute to re-creating digital slavery, but the exploitative nature of the global platform economy (2017; cf. Zuboff 2015, 2019; China Labour Watch 2018).
These are important points. In our current landscape, datafication comes with mass surveillance, gross exploitation and the tremendous power of big tech, all of which set the background for two of the most important social media cases of the late 2010s: Cambridge Analytica and China's social credit systems.
Cambridge Analytica (CA)
Cambridge Analytica, henceforth referred to as CA, is a strategic communications firm, self-described as "a data driven political consulting and commercial marketing firm". CA is part of SCL (formerly Strategic Communications Laboratory, later the "SCL Group"), and came to public attention for using personal data, primarily demographic information as well as "likes", from 87 million Facebook accounts in 2015. These data were collected, with consent, from approximately 300,000 Amazon Mechanical Turk workers and Qualtrics panels in 2015 through an app that claimed to test people's personalities. This app was designed by Aleksandr Kogan, a University of Cambridge researcher working with Facebook. Kogan's app collected additional data from each user's Facebook friends (approximately 267 friends each) without consent (Venturini and Rogers 2019: 533). The app collected demographic data and likes in order to make personality predictions about users, which were then used for both the Leave.EU and Trump political campaigns in 2016, amongst many other uses (Cadwalladr 2018; Kogan 2018; UK Parliament 2019).
Widely reported in the media as another major "social media scandal" (e.g. Chang 2018; Confessore 2018; Wong 2019b), the Cambridge Analytica events revealed that Facebook's business model is about collecting user data and profiting from those data, rather than about connecting people. Based on Facebook's apparent data harvesting, the UK Parliament conducted an 18-month investigation into CA and social media. Its final report, Disinformation and Fake News, found that Facebook offered "friend permissions" to app developers, allowing those developers to access personal information from users' Facebook "friends". From 2015, this also involved "whitelisting", or priority access to these kinds of "friend" data, for only the highest-paying developers. Finally, developers were granted full access to "lookalike audiences", i.e. audiences mirroring certain demographics or interests shared with the developers' target audience (UK Parliament 2019, based on court documents for Six4Three vs Facebook on anti-competition and privacy violations).
The parliamentary report found that, contrary to Facebook's claims, these kinds of data-selling practices were not only commonplace, but also integral to Facebook's business model:
We consider that data transfer for value is Facebook's business model and that Mark Zuckerberg's statement that "we've never sold anyone's data" is simply untrue. (UK Parliament 2019: 134)
Although directly related to Facebook, the UK Parliament's findings apply to many social media platforms and the "complex web" of actors involved not only in the Cambridge Analytica events, but also in datafication processes across the web, social and mobile ecosystems. Dr Emma Briant (2018), a propaganda researcher and expert on the SCL Group, argues that this case is about much more than one company, and about much more than privacy invasions or data harvesting:
It's a story about how a network of companies was developed which enabled wide deployment of propaganda tools – based on propaganda techniques that were researched and designed for use as weapons in war zones – on citizens in democratic elections. (Emphasis added)
Bearing these two points about CA's broader implications in mind, a very brief background to the case is in order.
In terms of politics, CA was founded in deeply political circumstances. For example, Steve Bannon, well known for his far-right and extremist politics, served as Vice President of Cambridge Analytica (2014–2016), as chief executive and then chief White House strategist for Donald Trump (2016–2017), and as a founding member and former executive chairman of Breitbart News ("the platform of the alt-right", 2012–2018) (Cadwalladr 2018, 2019; Osborne 2018). Christopher Wylie, one of the key whistleblowers in the Cambridge Analytica revelations, reported directly to Bannon during his employment at CA, and claims that he built Bannon's "psychological warfare machine" (Cadwalladr 2018).
Briant (2018) has also documented the extremist tactics used by CA and its multiple affiliates in political campaigns, such as "racist and violent video content designed to drive fear and intimidate voters in fragile states" as well as in the US and UK. Cambridge Analytica was not unique in building these kinds of campaigns. It was one company in a complex web of companies and actors, where it is clear that harvested data – from Facebook, from Republican databases, and from many other sources – were used by CA (and others) in both Donald Trump's election campaign and the Leave.EU campaign (UK Parliament 2019: 140–47).
While some claim that the quality of Facebook data was too low to be meaningfully analysed (e.g. Venturini and Rogers 2019), Kogan himself notes that not much data are needed to make personality and voting predictions:
[Former colleagues had] already developed a model to make predictions from likes, and I had experience using the Facebook login app from my previous [work]. This idea of predicting personality from page likes became the foundation of the project that I did with SCL. (Kogan 2018)
In addition, the quality of the information has little bearing on the fact that these kinds of data are collected without consent, sold, and used to manipulate political votes and sentiment.
Notably, the public and political attention focused on Cambridge Analytica has many important implications. The first of these is that Facebook has been under investigation by both the US Congress and the UK Parliament, and has been fined £500,000 by the Information Commissioner's Office (ICO) in the UK (ICO 2018) and US$5 billion by the Federal Trade Commission (FTC) in the US (FTC 2019). While this may seem a hefty sum, when the exchange rate is accounted for and the two fines are combined, they amount to roughly 7% of Facebook's 2019 total revenue of about $70.7 billion, or less than a month's worth (Facebook fourth-quarter 2019 report).
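The combined-fines calculation can be checked directly against Facebook's reported full-year 2019 revenue; the GBP-to-USD rate of 1.28 used here is an assumed approximate 2019 average, for illustration only:

```python
# Combined ICO and FTC fines as a share of Facebook's 2019 revenue.
# GBP->USD rate of 1.28 is an assumed approximate 2019 average.
ico_fine_gbp = 500_000
gbp_to_usd = 1.28
ftc_fine_usd = 5_000_000_000
revenue_2019_usd = 70_700_000_000  # full-year figure, Facebook Q4 2019 report

total_fines_usd = ico_fine_gbp * gbp_to_usd + ftc_fine_usd
share_of_revenue = total_fines_usd / revenue_2019_usd
print(f"{share_of_revenue:.1%}")  # prints "7.1%"
```

At 7.1% of annual revenue, the combined fines amount to less than a month of Facebook's 2019 income.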
The ICO has also found that current UK electoral law is not fit for a digital age and has called for radical reform of "electoral communications laws and rules on overseas involvement in UK elections" (UK Parliament 2019). It has further called for the creation of a powerful new independent regulator to oversee a new "Compulsory Code of Ethics for tech companies" (UK Parliament 2019). These proposals have been further explored in the Online Harms white paper, which "sets out the government's plans for a world-leading package of measures to keep UK users safe online" (UK Government 2019). In what seems like a final development, CA and SCL filed for insolvency and shut down in May 2018.
Thus, while it seems the CA revelations have sparked serious changes to Facebook's practices and to the regulation of tech, data and political practice, many questions remain about the effectiveness of such proposals. For starters, both CA and the SCL Group have a parent company, Emerdata, which continues to conduct business and is run by the same players as CA and the SCL Group. Indeed, the UK Parliament's final report, Disinformation and Fake News, states:
Senior individuals involved in SCL and Cambridge Analytica appear to have moved onto new corporate vehicles … We recommended that the National Crime Agency, if it is not already, should investigate the connections between the company SCL Elections Ltd and Eme...