After Covid-19 was first identified in early January, the tools and techniques of science and medicine were engaged with unprecedented urgency to tackle the biology of the pathogenic coronavirus, the epidemiology of its spread, and possible treatments and an eventual cure. But in parallel with this energetic search for reliable yet elusive facts and remedies, we’ve also seen the lightning spread of Covid-19-related falsehoods—a phenomenon the WHO has called the coronavirus “infodemic.”
There has been a boom in conspiracy theories—the idea, for example, that the illness is in fact caused by the 5G network weakening the immune system with “radiation.” In defiance of the evidence, the US President and his circle have implied that the virus originated in a Chinese laboratory, and the Chinese in turn have encouraged rumours that the Americans brought it to Wuhan. Prominent commentators pursuing political agendas (or merely attention) have rounded on scientific results as if these were just one more opinion they didn’t like. The media and politicians have shown themselves pitifully vulnerable to falsehoods that pander to their agendas—or even actively willing to create them. Even after several years in which “fake news” has set the rhythm of insurgencies, elections and referendums, it is remarkable to witness just how contagious the Covid-19 infodemic is proving.
The pandemic underlines—again—the growing problems in our information ecosystem, this time in a field where falsehood can be (literally) lethal. It is acting as a lens that brings into focus one of the most urgent challenges of our times. We (most of us) will survive the virus, but it is far from clear that democracies can survive the longer-term destabilisation of objective truth. If we want a world where major events can be discussed and debated on a basis of agreed and reliable facts, then we have no choice but to grapple with the epidemiology of misinformation.
Slander on speed

Misinformation is nothing new. Foreigners and minorities have been slandered throughout history—often in the context of disease, and sometimes with murderous consequences. But “the difference now,” Sylvie Briand, director of Infectious Hazards Management at WHO’s Health Emergencies Programme, told The Lancet in late February, is that “with social media… this phenomenon is amplified, it goes faster and further.” The “huge change in the infrastructure of information,” says Amil Khan, a former government specialist who has studied misinformation in conflict zones, affects “not just the mechanics but the fundamental principles.” While “for most of the modern era, information was filtered,” he says, today the filters are seriously eroded. The costs of spreading information through Facebook and Twitter are virtually zero, and the form discourages considered evaluation: on Twitter, no one is interested in your view 24 hours after the event, when you’ve had time to think it over or check a few facts.
The feedback loops between social and more conventional forms of media further darken the picture. Many media outlets have always ground their axes without much regard for truth, and continue to do so. As the pandemic was building, Fox News insisted that the coronavirus threat was nothing but a Democrat plot to unseat Trump. Closer to home, the pandemic has widened the fault lines already opened up by previous arguments: on scientific topics such as climate, for sure, but also on purely political issues, such as Brexit. Some sections of the political commentariat, mostly but not exclusively on the libertarian right, have turned the politicised rejection of science and expert opinion into a career gambit. The Spectator’s Toby Young, the Mail on Sunday’s Peter Hitchens and James Delingpole, who has made the leap from the pages of the Spectator and Telegraph to Breitbart (the far-right American falsehood factory made notorious by Trump’s one-time right-hand man Steve Bannon), have all polemicised against the scientific recommendations for social distancing and lockdown, just as many of them had previously railed against experts on climate change. There have always been media contrarians, but editors in search of social media hits—or of pieces which, in the telling parlance, “go viral”—are ever less inclined to rein in their excesses.
Meanwhile, conspiracies that grow up in dark corners of the web make it through to the mainstream. When even the affable Eamonn Holmes, presenter of ITV’s This Morning programme, starts challenging dismissals of the bogus link between the Covid-19 pandemic and the rollout of the 5G network by saying that “it’s very easy to say it is not true because it suits the state narrative,” it’s fair to suspect that something has gone badly wrong.
The pandemic is a disturbingly good vehicle for misinformation. The topic, says data scientist Walter Quattrociocchi of the Ca’ Foscari University of Venice, “is polarising, scary, captivating. People are anxious and fearful. And it’s really easy for everyone to get information that is consistent with their system of belief.” Untruths can jump across national borders just as easily as the coronavirus, and they are heedless of ideological distance too: alliances around coronavirus misinformation can be forged very rapidly, and sometimes unwittingly, between people who otherwise have very different worldviews. Opposition to vaccines, for example, has led to the racism and hate messages of the far right leaking into groups more normally found discussing yoga and wholefoods.
Falsehood, falsehood everywhere!

To start making sense of this sprawling ecosystem, it can be helpful to distinguish misinformation—untruths typically spread by credulous individuals—from disinformation, which is designed intentionally to mislead.
State-sanctioned activists are among those creating and spreading the latter. A briefing for the European parliament in April described Moscow and Beijing as “driving parallel information campaigns” to encourage the idea that “European citizens cannot trust their health systems, whereas their authoritarian systems can save the world.” Experts suspect that China has adopted the tactics that were honed by the Kremlin during the conflicts in the Crimea and Ukraine from 2014. Other state-on-state fronts in the information war have included the efforts of the American and Chinese states to accuse each other of creating the virus: as well as Trump’s accusations about Wuhan labs, we have had Chinese Foreign Ministry spokesperson Zhao Lijian encouraging the social-media rumour mill by tweeting on 12th March that “It might be the US army who brought the epidemic to Wuhan” and adding that “US owe us an explanation!”
“The coronavirus pandemic,” Khan has written, “is becoming a watershed moment for how states tussle on the international stage. Spies, special operations and high-stakes negotiations are no longer the tools of choice. Instead, it has become clear that disinformation, an information manipulation technique born in conflict zones, is becoming increasingly normalised as a political weapon—particularly in times of crisis.”
The biggest mistake would be to imagine that there are simply a few malicious or ignorant agents concocting lies or fantasies and feeding them to the unsuspecting public. Political scientists Darren Linvill and Patrick Warren of Clemson University in South Carolina have traced the idea of a Chinese bioweapon to a message from an anonymous conservative American woman. But it seems that this “theory,” like many others, was received by the Russian state disinformation apparatus, repackaged and fed back into the ecosystem. “Foreign actors aren’t the biggest danger,” Linvill and Warren wrote in the Washington Post. “We Americans are.”
Mis- as distinct from dis-information arises spontaneously and propagates in an information ecosystem where sources and channels of transmission are hard to identify and almost impossible to suppress. Yes, the ecosystem contains lonely trolls and bored conspiracy theorists on laptops in their bedrooms, as well as organised alt-right extremists and anti-vaxxers selling merchandise, and Elon Musk parading bizarre Silicon Valley fantasies. But it also embraces Mumsnet and the local school parents’ association. It is all of us, retweeting comments and links that pique our interest and flatter our preconceptions and prejudices without bothering to check them out.
Grain of truth

Often, Covid-19 misinformation feeds off existing delusions and conspiracies. Bill Gates wants to control us all and he’s funding vaccine research? Well then, he is obviously planning to load the vaccines with microchips to create a worldwide personal surveillance system. Likewise, the delusion that 5G weakens the immune system was already circulating in anti-vaccination communities before the coronavirus outbreak began, but it was a decidedly fringe idea. Now the 5G delusion has taken wide hold, motivating attacks on the network’s masts in the UK. At a Hyde Park protest against the lockdown in May, one group chanted both “no 5G” and “no vaccines.” As the Russian state misinformation apparatus soon ascertained, fake news works best if it contains a seed of truth: for after all, no one denies that the 5G network was developed in China. Once the crisis took hold, it all seemed to fit together for anyone instinctively inclined to see dark plots lurking behind health problems.
For anti-vaxxers, coronavirus is catnip. Neil Johnson, a complexity theorist at George Washington University in Washington DC, explains: “They are right onto this topic, because they think there are going to be mandatory vaccines. For a lot of them, it’s all about Covid-19 now. It’s almost like they’ve been waiting for this. It crystallises everything they’ve been saying—it’s almost like a rallying cry.”
Johnson, who has previously worked on the structure of terrorist and insurgent networks, has mapped out the detailed network structure of the anti-vax debate online, identifying discussion sites that are pro and anti as well as those that are uncommitted either way. The picture is dispiriting (unless you are an anti-vaxxer): the anti-vaccination message could grow to dominate online discussion in the next decade. Unlike its pro-vaccine contrary, it tends to spread not through careful presentation of the facts but through more engaging, emotive content (“Do you love your children? Then why would you hurt them?”). And it is more diverse: in contrast to the stark medical facts, there are a host of anti-vax “narratives” to draw you in. “It’s like going into an ice cream shop,” says Johnson. “You’ll always find a flavour that appeals to you, and online it’s only one step away.” Once a Covid-19 vaccine becomes available (hopefully by around the summer of 2021), widespread opposition to it could significantly depress take-up—with dangers not just for individuals but also for public health.
The eagerness to find support for one’s narrative leads to some eccentric and disturbing alliances. There was a glimpse of that at the Hyde Park rally, where CND signs mixed with anti-vax messages, and Jeremy Corbyn’s brother Piers, a climate-change denier, rubbed shoulders with the hard right. In contrast to the apolitical (or even eco-left) leanings of many British anti-vaxxers, in the US opposition to vaccines is now strong among Trump supporters, and opposition to the lockdowns is flourishing among rightwing libertarians everywhere. Neil Johnson and his colleagues have shown that the online “hate network” of the alt-right has become a conduit for Covid-19 misinformation, where “wormholes” can suck anti-vaxxers into a universe of anti-immigration, anti-semitic and anti-Asian sentiment. Once a generalised distrust of government and “mainstream science” becomes part of your mix, it’s likely that it will become infected with traits such as climate-change denialism too.
The disparate links between pleasant-enough cranks, won’t-be-told petrol-heads and racist thugs might appear strange, but they emerge organically in our information ecosystem. No one arranges for that to happen, but there are plenty who are happy to exploit it—for example to spread ideas about the decadent incompetence of the EU and western democracies. What could be a better example of how the dark forces of “internationalism” (the WHO and UN) aim to control us all than a pandemic—perhaps intentionally sown by an engineered virus?—that will lead to enforced lockdowns followed by mandatory mass vaccination and surveillance? So state-run misinformation efforts like that of the Kremlin might, Khan says, try to thread a narrative between several communities. “If you can get anti-vaxxers and libertarians and alt-right nationalists together,” he says, “that’s when you start getting real heft.”
The infodemic devours its children

This, then, is the big story in which the coronavirus infodemic is one more chapter. Pull it all together, and you risk coming across as a conspiracy theorist yourself. But the point is precisely the opposite: with misinformation, no fiendish master-plan is required. This ecosystem is populated with many interacting species, each with its own agenda, which combine in webs of twisted creativity to produce untruths with the power to endure. Stories get refined by consensus, the rough edges smoothed off until they become more credible and, well, contagious. Where the Russian and Chinese agencies (say) have a role, it is simply to amplify the dissent and discontent that is already rife within this ecology.
The manipulations of the Vote Leave campaign are also put in their proper context by this complex picture. Dominic Cummings and his computer whizzes were not agents in some dark Russian-funded plot to undermine British democracy. Rather, they were merely creatures slithering through this dangerous and baffling jungle like all the others, tweaking a few strands in the web of the misinformation ecology for short-term gain, and in the process enmeshing it with mainstream politics. It has now become unremarkable for government ministers to retweet political misinformation from the far-right; faked videos, faux fact-checking websites and misleading political adverts are just a part of electioneering. There is a collective “what can we do” shrug when Cummings and his Vote Leave computer modeller turn up at the supposedly independent meetings of the Covid-19 scientific advisory body.
But the truth is that this is a play in which our politicians have become bit-part actors with no control of the plot. Some of the awkwardness of the government’s fumbling response to the coronavirus crisis—its missteps about how to spin the debacle over the EU ventilator scheme, say, or its clumsy manipulation of figures—might stem from the dawning realisation that, when you actually have to govern, and most of all in a crisis, the growth of misinformation and the erosion of trust in expert advice that they have abetted are now their enemies too.
What we have unwittingly created—the “we” here including malign state propagandists, manipulative politicians, unscrupulous media, careerist contrarians, and all the rest of us who ever succumb to the temptation of a reflexive retweet—is an infosphere within which what passes for reality itself is determined by market mechanisms. Contrarians and Silicon Valley speculators (who have weighed in on the coronavirus, sometimes with fancy-looking graphs but no epidemiological expertise) might have got famous or rich by betting on an “undervalued” outlier theory, and see no reason not to do so again. You can fail often, they have learnt, if you move fast enough—to do so is almost a badge of honour. But today these reckless mavericks are gambling not with their own reputation or other people’s money, but with other people’s lives.
Climate speeded-up

So how can we fight misinformation? Providing easy access to reliable information clearly has a role: the WHO’s department of communications is working in collaboration with Google, Facebook and Twitter to ensure that people searching for “coronavirus” or “Covid-19” are directed first to authoritative public health sites. These social-media platforms already have rules that aim to remove malicious or fallacious content, and some have introduced policies that ban exploitative or misleading Covid-19 advertisements, for example to sell miracle cures or overpriced facemasks. Twitter recently took the unprecedented step of deleting tweets from a head of state—Brazilian president Jair Bolsonaro—for promoting unproven cures for Covid-19. (There’s not much that can be done, however, about Trump’s surreal suggestions for “cures” in his White House press conferences.)
But such interventions can only go so far. For one thing, bad information doesn’t just arise from a deficit of good information: in the Darwinian information ecosystem, they compete. Or, as Johnson’s analysis of the anti-vax networks has shown, they might simply colonise different networks, with rather few connections between them, creating a false sense of security. Besides, Quattrociocchi says that when misinformation is kicked off one platform, it often just migrates to one where regulation is more lax. (WhatsApp and Gab, he says, are the current hotspots of coronavirus misinformation—although the former platform took a small corrective step by making it harder to forward messages many times over.)
Closing off misinformation streams, says Johnson, will require high-tech mapping of “the battlefield”—the structures of the various networks on which untruths spread. Once you have identified the “wormholes” that hook up, say, the anti-vax network to the “hate networks” of the alt-right, then you can target them for disruption. Manlio De Domenico of the Fondazione Bruno Kessler, a research institute for artificial intelligence in Trento, has set up a Covid-19 “infodemic observatory” that uses AI to analyse 4.7 million tweets a day referring to the pandemic—about 30 per cent of which they identify as “unreliable.”
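To make the idea of “wormhole” hunting concrete: in network terms it amounts to finding the cross-community links that carry a disproportionate share of the traffic between otherwise separate clusters. The following Python sketch is only an illustration of that principle, not Johnson’s or De Domenico’s actual pipeline; the graph, the node names and the choice of the standard networkx library are all assumptions made for the example. It ranks edges by betweenness centrality, which surfaces exactly such bridging links:

```python
# Toy illustration of "wormhole" detection between two online communities.
# The graph and names are invented; real analyses work on millions of
# scraped pages, groups and links rather than six hand-made nodes.
import networkx as nx

G = nx.Graph()

# Dense links inside each hypothetical community...
G.add_edges_from([("antivax_1", "antivax_2"), ("antivax_2", "antivax_3"),
                  ("antivax_1", "antivax_3")])
G.add_edges_from([("altright_1", "altright_2"), ("altright_2", "altright_3"),
                  ("altright_1", "altright_3")])

# ...and a single cross-community link: the "wormhole".
G.add_edge("antivax_3", "altright_1")

# Edges lying on many shortest paths between nodes score highest, so the
# bridge between the two clusters is the obvious target for disruption.
centrality = nx.edge_betweenness_centrality(G)
for edge in sorted(centrality, key=centrality.get, reverse=True)[:3]:
    print(edge, round(centrality[edge], 2))
# The ("antivax_3", "altright_1") edge comes out on top.
```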
Some of the onus to clean up the Covid-19 infosphere is, however, on us all: we need to become more critical consumers (and producers) of information. That may be easier said than done. After all, conspiracy theorists and contrarians (it is not always easy to tell them apart) think that they are already doing this: witness the screeds of analysis some of them produce to “disprove” the conclusions of the Intergovernmental Panel on Climate Change. Eamonn Holmes defended his comments about the 5G conspiracy on the grounds that he had “an inquiring mind.” And advice to read sceptically and critically all too easily morphs into the claim of the former Supreme Court judge and lockdown sceptic, Lord Sumption, that “it is the right and duty of every citizen to look and see what the scientists have said and to analyse it for themselves and to draw common sense conclusions.” As he exemplifies himself, the likely result of that is cherry-picking to suit one’s prejudices. The idea that this complex science can be adjudicated by “common sense” is ludicrous, even dangerous. Real wisdom, in contrast, lies in identifying and heeding the most trustworthy opinions, while recognising too that even these carry uncertainties and errors, and that experts won’t always agree when “the science” itself is still evolving.
A free society must of course make room for rightwing libertarianism and leftwing Luddite paranoia. The problem today is that their distorting messages are now apt to become amplified out of proportion. They have just what it takes to become viral memes: simplistic and emotive messages (“You’re going to die! You’re living in a police state!”) that require no context to do their work. Canny agents of misinformation know how to tailor it to their advantage.
Containing a misinformation epidemic is then also partly a matter of finding the right medicine. Some have suggested “inoculating” populations in advance with reliable information, so that false ideas can never get a foothold (although that is surely much harder now there is such widespread distrust of “elites”). We need agreed and enforceable standards and regulations for social media. We need diagnostic tools to rapidly identify and isolate “super-spreaders,” and “virologists” of misinformation who can find and attack its weak spots. And we need to understand why different people have different levels of immunity and susceptibility to bad ideas—and to recognise that understanding misinformation, like understanding disease, is in many respects an inescapably sociopolitical affair. As with the Covid-19 pandemic itself, the infodemic depends on how we all behave collectively as well as individually, and demands that we think about the consequences of our own actions on others. “We need to understand human collective behaviour in a crisis,” says biologist Carl Bergstrom of the University of Washington in Seattle.
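The epidemiological language here can be made literal. As a minimal sketch, with every number invented rather than fitted to any real data, the spread of a false claim can be caricatured with the same susceptible-infected-recovered (SIR) model used for disease, in which “inoculation” simply means starting more people in the immune class:

```python
# A deliberately crude SIR caricature of an infodemic: S = susceptible to
# a false claim, I = actively spreading it, R = immune (debunked, or
# "inoculated" in advance). All parameters are invented for illustration.
BETA, GAMMA = 0.3, 0.1   # hypothetical daily spread and drop-off rates
DT, STEPS = 1.0, 120     # time step (days) and simulation horizon

def peak_spreaders(inoculated: float) -> float:
    """Largest fraction of the population spreading the claim at once."""
    i = 0.001                        # a tiny seed of active spreaders
    s = 1.0 - inoculated - i         # everyone else starts susceptible
    peak = i
    for _ in range(STEPS):
        new_inf = BETA * s * i * DT  # susceptible people who catch the claim
        new_rec = GAMMA * i * DT     # spreaders who lose interest
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

for frac in (0.0, 0.3, 0.6):
    print(f"pre-inoculated {frac:.0%}: peak spreaders {peak_spreaders(frac):.1%}")
```

Run it and the point of pre-emptive inoculation drops out directly: the larger the fraction of the population immunised with reliable information at the outset, the smaller the peak of active spreaders, just as with herd immunity against a virus.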
It is not only our bodily but also our societal health that is at stake. Democracy struggles, as recent years have shown us, when citizens lack any shared set of facts. What we are witnessing with the coronavirus infodemic has implications way beyond strategies for managing disease pandemics (of which there will be others). The problem has been dimly acknowledged for years now with climate change, but it was happening on too “flat a curve” to be fully recognised. For Covid-19 those same features hove into view within a matter of weeks, and the deaths, which are expected to occur gradually over many years as climate change worsens, were in this case piling up daily. “This is like climate change run very, very fast,” says Bergstrom. Let’s hope we can learn something from it.