Commentary

Trending on Fakebook

Source: Justin Sullivan / Getty Images
02 Dec 2016, published in GPPi

On October 30, 1938, chaotic scenes took place in parts of the United States. Thousands of radio listeners who had tuned in to an adaptation of H. G. Wells’s novel The War of the Worlds had mistaken the fictional account of an impending alien invasion on the East Coast for an actual emergency broadcast and were seized by panic. Although it may seem like an obscure footnote of modern history, the episode offers a glimpse of things to come in today’s ‘post-truth’ era.

Before stunned pundits scrambled to point to false stories as one of the likely reasons for the unexpected outcome of this year’s US presidential election, researchers had already identified massive digital misinformation as a major threat to democratic societies in the World Economic Forum’s 2013 Global Risks Report. Such false information, they argued, has the potential to spread so quickly that any attempt to rectify it comes too late to avert serious damage. Moreover, misinformation within bubbles of like-minded people proves especially resilient to correction. Indeed, versions of these scenarios materialized in the build-up to the US election as well as the Brexit referendum. Amplified by algorithmic echo chambers, news stories such as the Pope’s endorsement of Donald Trump went viral. Needless to say, the story lacked any factual basis.

Unsurprisingly, politicians and civil society actors have strongly voiced their concerns. During his recent visit to Europe, President Obama sharply criticized the spread of fake news online, arguing that “if we can’t discriminate between serious arguments and propaganda, then we have problems.” After initially denying any responsibility, Facebook has now pledged to take concrete measures to tackle the issue.

Unfortunately, pressing social media companies to fix their algorithms alone will not solve the problem. Framing this issue as a mere technological mishap is a worrisome oversimplification. Instead, democratic societies on both sides of the Atlantic need to face up to the tough questions raised by the phenomenon of fake news.

First, leaving social media companies to clean up their own mess reinforces a highly problematic development. Today’s public sphere, which effectively means the dominant social media platforms, is controlled by a small set of private companies. The parameters of public discourse used to be set by a multitude of actors, with major newspapers being the most prominent players; now, just a handful of tech companies organize through their algorithms how people exchange arguments and form public consensus. This grants them unprecedented influence over a vital aspect of democracy. Such a state of affairs is troublesome, given that these companies do not operate under a logic centered on the public good, but primarily pursue their business interests. Beholden to shareholder interests, they see their users as consumers, not as citizens.

Therefore, democratic institutions need to reassert at least partial control over the public sphere. This could be achieved by imposing public interest principles on social media companies. Such codes of conduct have traditionally bound newspaper journalists in democracies, but have not yet been extended to social media platforms and search engines. Carrying far too much social responsibility to hide behind the label of a neutral platform, Facebook and others should not be able to continue evading accountability for their content. Regulation, as the EU’s “Right to Be Forgotten” concept has shown, can still be a viable instrument for reining in the reach of tech giants.

Second, weeding out fabrications from objective facts is easier said than done. Not all dubious content can be easily distinguished from factual reporting; most falls into a gray area somewhere at the outer fringes of truth. Calling on social media companies to curb false stories effectively means that they become arbiters of truth, despite Mark Zuckerberg’s claims to the contrary. Leaving aside the constitutional concerns this implies, it is not even certain that exposing bogus stories would help. A study shows that debunking fake news actually tends to backfire with users interested in conspiracy theories, reinforcing consumption patterns within their individual echo chambers.

This leads to the last point: focusing solely on the spread of misinformation means mixing up cause and effect. Fake news circulates within a bubble of people because they are all susceptible, for instance, to a similar brand of xenophobic and nationalist claims. Social media’s built-in confirmation bias exacerbates this further, but it is the people themselves, readily believing in highly subjective ‘truths’, who feed their own biases into the algorithms in the first place. Fixing this implies fixing a divided society. This is a monumental task. It requires reaching out to those swaths of society that feel economically marginalized, alienated by mainstream political correctness and threatened by globalization, among other things.

To this end, establishment politicians are well advised to step outside their own filter bubbles. Otherwise, it is not inconceivable that someday people will again frantically run for their lives because they read online that Martians are about to invade the planet.