In the Phaedrus, Socrates warns against the then-new invention of writing, fearing that this tool could spell the destruction of the oral tradition of learning and communication. Socrates certainly understood the utility of writing in recording information and making it distributable; however, he feared that humanity would become too reliant on the invention and would retreat into engagement with the page rather than with one another. Perhaps Socrates’ forecast has not aged well with regard to writing, but his overarching concern about the social impact of a new apparatus that changes how we acquire knowledge seems more relevant than ever. Modern technological advances were intended to bring us closer together, giving us access to more information than ever before. Yet in today’s internet age, many of us cannot help but feel disconnected, polarised and isolated. How is it that social media and our online presence have engulfed us in a sea of misunderstanding, with islands of truth few and far between? The answer seems to lie in the monetising of our digital presence and the manipulation of the consumer to profitable ends.
Advertising is a central pillar of the capitalist model of consumerism. Motivating people to buy your product has been an essential part of business strategy since Edward Bernays pioneered public relations by integrating glamour into promotions. Billboards and television commercials have always been an arena for corporations to persuade large audiences to choose their commodity, but with handheld mobiles and personal social media accounts, ads can now be tailored to the individual consumer. Search for a camera on Google, and you might see ads for a camera on Instagram the following day. In-built microphones can pick up conversations and background noise to build a profile of the consumer which social media platforms like Facebook have access to; a writer at Vice ran an experiment to test this phenomenon in mid-2018 and reported that ads appeared to respond to spoken conversations. Webpage cookies, which hold a significant amount of data specific to an individual user, are also undoubtedly important in tailoring ads to the individual. Such targeted ads might come across as innocent and even convenient, but the algorithm behind this mechanism can be used for more sinister ends than one might immediately anticipate.
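To make the targeting mechanism concrete, here is a deliberately simplified sketch of how a browsing profile of the kind a cookie or tracker might accumulate could be matched against an ad inventory. Every identifier in it (browsing_history, ad_inventory, score_ad) is hypothetical; real ad platforms rely on far richer signals and machine-learned models.

```python
# A deliberately simplified, hypothetical sketch of interest-based ad targeting.
# All names here are invented for illustration, not drawn from any real platform.

from collections import Counter

# Signals a tracker might accumulate via cookies: pages the user visited.
browsing_history = [
    "dslr-camera-reviews", "camera-lenses", "travel-blog",
    "camera-lenses", "hiking-boots",
]

# A toy ad inventory, each ad tagged with the interests it targets.
ad_inventory = {
    "Mirrorless Camera Sale": {"dslr-camera-reviews", "camera-lenses"},
    "Budget Flights":         {"travel-blog"},
    "Office Chairs":          {"ergonomics"},
}

def score_ad(targets: set[str], history: Counter) -> int:
    """Score an ad by how often the user visited pages matching its targets."""
    return sum(history[topic] for topic in targets)

history_counts = Counter(browsing_history)

# Rank ads by overlap with the inferred interest profile.
ranked = sorted(ad_inventory.items(),
                key=lambda item: score_ad(item[1], history_counts),
                reverse=True)

for ad, targets in ranked:
    print(ad, "->", score_ad(targets, history_counts))
# The camera ad wins: search for a camera today, see camera ads tomorrow.
```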
Our online presence exists within an echo chamber. The algorithm used by Facebook and other platforms tailors a news feed to each account. By analysing our previous attention patterns, it predicts what material we are most likely to engage with, in order to maximise the time we spend on the platform. For example, if you have already engaged with politically left-wing content, the algorithm will keep serving you similar material to consume. These companies are not primarily concerned with the accuracy of the information being disseminated, but with monetising the time you spend on their service. This positive feedback loop, in which you see only content you are likely to engage with, creates an echo chamber where outside opinions and views are blocked out. Not only does this harden one’s views without any effective counterbalance or credible fact-checking, it also misrepresents the views of others, breeding hostility and distrust of those with differing opinions. What users fail to realise is that no two people see the same feed: the media we each consume on these applications is different. As a result, political leanings have become increasingly polarised, with the middle ground slowly dissipating and opposed groups less willing to engage each other in debate.
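The feedback loop can be illustrated with a minimal, hypothetical simulation: a ranker that scores topics purely by the user’s own click history, so that each session’s clicks narrow the next session’s feed. The names used below (predict_engagement, build_feed, TOPICS) are invented for illustration and do not reflect any platform’s actual code.

```python
# A minimal sketch of an engagement-maximising feed and the echo-chamber
# feedback loop it produces. All names are hypothetical.

import random
from collections import Counter

random.seed(0)

TOPICS = ["left-politics", "right-politics", "sport", "cooking"]

def predict_engagement(topic: str, past_clicks: Counter) -> float:
    """Estimate click probability from the user's own click history."""
    total = sum(past_clicks.values()) or 1
    base = 0.1                      # small chance of engaging with anything
    return base + 0.9 * past_clicks[topic] / total

def build_feed(past_clicks: Counter, size: int = 3) -> list[str]:
    """Rank candidate topics purely by predicted engagement."""
    ranked = sorted(TOPICS, key=lambda t: predict_engagement(t, past_clicks),
                    reverse=True)
    return ranked[:size]

# Start with a single left-leaning click and simulate a few sessions.
clicks = Counter({"left-politics": 1})
for session in range(5):
    feed = build_feed(clicks)
    for topic in feed:
        if random.random() < predict_engagement(topic, clicks):
            clicks[topic] += 1      # engagement feeds straight back into the ranker
    print(f"session {session}: feed={feed}")

print("final click profile:", clicks)
# The feed converges on whatever the user already clicked: the echo chamber.
```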
Until relatively recently, social media companies operated with little regulation or transparency. They have capitalised on a new marketplace that trades on our personal and private experiences, made possible by the mass surveillance of people’s internet usage facilitated by cookies. Shoshana Zuboff coined the term ‘surveillance capitalism’, defining it as “…parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labour, but with an unexpected turn. Instead of labour, surveillance capitalism feeds on every aspect of every human’s experience.” Often without our informed consent, our data has become a commodity that can be sold to the highest bidder.
What happens when our data is not protected, or even seen as private or personal, is exemplified by the Cambridge Analytica scandal surrounding the 2016 Brexit vote and the US election of the same year. The Vote Leave campaign allegedly funnelled around £750,000 through AggregateIQ, a Canadian data firm closely linked to Cambridge Analytica. This was done, via other pro-Leave groups such as BeLeave and Leave.EU, to release a barrage of targeted ads at voters whom Cambridge Analytica had identified as “persuadable”. These ads spread misinformation, sowing fear and hatred of migrants and refugees to manipulate the electorate into favouring stricter border control. And it worked: the targeted ads swayed the small fraction of people needed to swing the vote towards leaving the EU. Similar tactics were used in the 2016 US election: Cambridge Analytica illegally harvested data from up to 87 million Facebook profiles, most of them in the US, under the guise of academic research, and used it to target ads designed to sway the vote in Donald Trump’s favour.
Cambridge Analytica took advantage of the fact that these social media platforms had no fact-checking service (Twitter has since introduced one, and Facebook is in the process of launching its own): the algorithm does not care what users consume as long as they spend a long time consuming. This mechanism was exploited to spread fake, sensationalist information to a consumerist population that rarely bothers to fact-check for itself. Whether a fact-checking service will be of any use is highly disputed: a study at Yale found that alerting users to potentially fake news did little to help them identify it, improving accuracy by only 3.7%. Regardless, platforms should be held accountable for providing such a service, and consumers should take responsibility for verifying the news they read.
It is not only political parties that have used social media to manipulate elections to their own benefit; states have used it to destabilise other countries. The best-known example is Putin’s Russia: during the 2016 US election and the Brexit campaign, the Russian government exploited existing political divides, paying particular attention to racial divisions, in order to cause chaos and spread further misinformation. In the run-up to the US election, Russian-linked content directly reached 30 million accounts. By hacking emails from Hillary Clinton’s campaign and releasing them into the public domain via WikiLeaks, the Russian government directly aided Trump in smearing her campaign. Russia’s Internet Research Agency (IRA) also had some influence on the EU referendum: studies estimate that between 15,000 and 150,000 Russia-affiliated Twitter accounts sent tweets about the Brexit vote in “an effort to spread disinformation and discord”, and the National Bureau of Economic Research (NBER) calculated that these bots were responsible for 1.76% of the Leave vote share. Although the effectiveness of Russian interference in the 2016 US election and Brexit is highly contested, it nevertheless demonstrates both the intent and the ease with which a population’s voting patterns can be manipulated through unregulated social media platforms.
The future looks bleak: if we continue down this path, where personal data and privacy are valuable commodities, the already faltering notion of democracy will cease to exist. Society has become fractured and divided, to the extent that the middle ground is scarce and compromise seems off the table. What can be gleaned from the abuse of social media and surveillance capitalism in manipulating our mindsets is that we must raise awareness of targeted advertising, encourage consumers to question the accuracy of the content they receive, and engage in the discussions that need to be held. Governments must construct and implement strategies to challenge the misinformation pandemic and restore some sense of trust in our democratic institutions.
You can find out more about the OFQE’s events and activity at www.OFQE.co.uk.