By Hannah Comiskey
The number of people relying on social media platforms as their main source of news is on the rise. With this shift come drastic changes in how voters inform their decisions. And, dangerously, more and more people are basing their perception of the world on news stories that a callous social media algorithm picks for them to see.
Viewing two social media news feeds as the same is as reductive as declaring two strangers identical because they are both human. The thoughts, genetics and personal histories that make each individual unique are reflected in the variation in content seen across two news feeds. The algorithms sculpting our social media content mirror our preferences and beliefs in the same way the posters on our bedroom walls once did. But unlike the posters, which we laboriously put up and outgrew years before we made the effort to rip them down, our social media news feeds are curated without our intention, our knowledge or even our permission. And the jury remains out on whether our views guide the content, or the content guides our views.
When Twitter or Facebook is used as a source of news, the consequences of these personalised newsfeeds go much deeper than how many funny cat videos you see in a day. Tailored news embedded in our social media paints a very different picture of the world for each individual user. In doing so, it has a profound impact on how we perceive and behave in the real world when we look up from our screens.
Through selectively exposing each user to targeted information, our personalised newsfeeds create echo chambers that reaffirm and intensify individual beliefs. Suggested followers, targeted advertising and the personalised results of a Google search all align to provide a constant stream of evidence for the pre-existing views these algorithms know we hold. The building blocks of our virtual realities are thus carefully curated from our individual Likes, Dislikes, engagement ‘clicks’ and Friends.
Targeted advertising and suggested follows are the newest gateway drugs into the world of radicalised political views. Attention-hungry algorithms are burning the bridges of political middle ground in both the virtual and the real world. And while a Twitter blow-up can be ended with a delete button or the closing of a laptop screen, the consequences of growing polarization in the real world are far less reversible.
Moscovini & Zavolloni’s (1969) Group Polarization theory posits that discussing issues with others of similar views leads to an increase in extremist positions.This is the deeper harm caused by our social media echo chambers. Recent findings supporting the relevance of group polarisation in online Twitter communities further strengthens this case. And with Twitter suggestions of who to follow next being based on your own views, online spaces initially intended to connect people are instead creating more divisions than bridges in both virtual and physical communities.
Instead of inspiring intellectual debate, the encouragement of digital conversation with those at the extreme end of the spectrum on any issue sets the stage to either repulse or radicalise users. And for the 49% of the UK population who rely on social media as a primary news source, there is a fundamental lack of unbiased, or even accurate, information from which to make informed decisions.
Acting as the ultimate partisan media source, social media platforms offer uninterrupted opportunities for confirmation bias. Behind the protection of a screen, personalised algorithms shelter us from ever having to confront the storm of information that might genuinely challenge our own beliefs. It has never been easier to censor content you disagree with when an unfollow or mute button is at hand, or to spread misinformation in an entrenched online community that will not challenge it.
The shock result of the Brexit referendum may not have been so shocking had you not been a user virtually absorbed in the Remain campaign. If every news story you read and every influencer adding to your feed had not been echoing back your own position, you might have refrained from preemptively taking your personal opinion as universally accepted. Had Leave voters not been bombarded with misinformation and uncontested propaganda, their views might not have been concretised in a way that manifested in a Leave vote. And had the algorithms driving so many voters’ views prioritised constructive discussion, a level of understanding or merely fact over intolerant rhetoric, the hostile divide between Leavers and Remainers might have been avoided altogether.
The problem, however, goes deeper than what can be fixed by manually diversifying our newsfeeds and following or reading opposing viewpoints. Instead, systematic changes to the inner workings of social media algorithms are required if these platforms are to adapt adequately to their evolving role as a news source. Currently, these platforms place emphasis on engagement rather than factual accuracy, so posts with emotive language and radical viewpoints are prioritised on newsfeeds.
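To make that incentive concrete, below is a minimal, purely illustrative sketch of what engagement-first ranking can look like. The field names and weights are invented for the example and do not describe any platform’s actual formula; the point is simply that when the score rewards clicks, shares and comments while ignoring accuracy, the most inflammatory post rises to the top of the feed.

```python
# Hypothetical illustration of engagement-first ranking.
# All fields and weights are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    comments: int
    accuracy: float  # 0.0 (false) to 1.0 (verified) - never consulted below

def engagement_score(post: Post) -> float:
    # Reward every form of interaction; factual accuracy plays no part.
    return 1.0 * post.clicks + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts):
    # The feed is ordered purely by predicted engagement.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured fact-check of the claim", clicks=120, shares=4, comments=6, accuracy=0.95),
    Post("OUTRAGEOUS: they are lying to you!", clicks=300, shares=90, comments=150, accuracy=0.10),
])
print([p.text for p in feed])  # the inflammatory post is served first
```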
The resulting environment, where outrage is prioritised over accuracy, becomes a dangerous place to consume news. Inflammatory language and radical viewpoints become commonplace as a means of reaching wider audiences. One only has to glance at President Trump’s or Nigel Farage’s most recent Twitter posts to see such inflammatory language being broadcast for political ends. Generating controversy on these platforms has become the cheapest, and arguably most effective, way to generate attention and mass advertisement for a political campaign or figure.
And while this strategy is also used by traditional news sources, what makes social media’s use of it particularly sinister is its ability to target these radicalised posts at the users it knows will be influenced by them. It is because these platforms know whose buttons to press, and when to press them, that radical news circulating on social media is so influential, and so dangerous.
What should frighten us even more is that misinformation on online platforms spreads faster than truth, and that the number of conspiracy theories has been rising hand in hand with the use of social media. The surge in conspiracy theories surrounding the rollout of COVID-19 vaccines provides a stark reminder of the potential for catastrophic consequences if enough people are hooked in. And while all focus remains on the virus, the rapid spread of misinformation has the potential to be just as deadly.
In connecting compatible users, social media has provided the ideal launchpad for radical individuals to spread doubt and distrust in government at a time when that trust is needed most. And with the law lagging far behind the pace of change in both technology and social media, these platforms threaten to jeopardize stability in the political system and democracy as a whole.
The current lack of regulation of social media establishes these platforms as an aggressive arena for political debate, one with a growing influence on voting behaviour. But more than a site for debate, these platforms have become a key battleground for capturing voters. Accordingly, politicians are capitalizing on these new forms of media to reach and, as the Cambridge Analytica scandal revealed, manipulate voters.
The shock outcomes of both the 2016 Brexit referendum and the Trump presidential election can be linked to the exploitation of voters’ social media data for political ends. Facebook data harvested by Cambridge Analytica was used to construct a psychological profile of each user, and in the hands of the Leave and Trump campaigns these insights were used to tailor online political campaigning to the psychology of each voter. In a world where our Facebook clicks reveal more to algorithms than they do to us, this method of targeted campaigning manipulated voters, and political outcomes, on an unprecedented scale.
The fundamental priorities of social media platforms, as they stand, are incompatible with their growing role as a news source. While the companies pulling the strings of these manipulative algorithms may consider users mere pawns in their profit game, in the real world those users are consumers, future leaders and, ultimately, voters. In the current pandemic climate, we rely on technology more than ever before, and the increased exposure to targeted advertising and cherry-picked news stories makes this an issue that has never been more relevant or more threatening.
The growing influence these invisible systems have on real-world voting outcomes and on mounting political hysteria can therefore no longer be overlooked. Systematic changes and increased regulation are required to address the imbalance between what these invisible algorithms know about us and what we know about them.
Fundamentally, algorithms should work for us, rather than on us. The incentives and ethics of online platforms must be realigned in accordance with their growing influence in the world. To address growing polarization, social media must be regulated and held to the same standards as any other news source. The role these platforms play must ultimately shift from manipulator to educator. And so long as we are viewed as the product rather than the customer of our online platforms, the prospect of that shift remains remote.
“The views expressed in this article are the author’s own, and may not reflect the opinions of The St Andrews Economist.”