Sarah Haque explores the gap between expert and public understandings of science and why this poses such challenges in times of public health crises such as the COVID-19 pandemic.
There’s a viral WhatsApp chain claiming your stomach acid will kill the novel coronavirus if you drink enough water to wash it down. You dutifully drink four litres of water per day. A photo circulating on Facebook warns you that military helicopters will be flying overhead and spraying disinfectant onto houses. You shut your windows and latch your doors. The President of the United States insists that hydroxychloroquine, a drug often used to treat malaria, is a real “game changer” for COVID-19 and has already garnered FDA approval. You take it. Potentially, you die.
Now, more than ever, misinformation is legion. These half-truths and glaring falsities spread with untouchable speed and vigour, arguably more contagious than the virus itself. And they follow a distinct template. Someone knows someone who spoke to someone who works in the Guangdong Province of China and/or is the current Secretary of State. There’s precise, emphatic language that evokes fear or feeds into political aversions. There is never sufficient supporting evidence.
What is it that makes misinformation about the COVID-19 pandemic so compelling?
The answer may lie, not in the contemporary age of rampant ‘fake news’, but in a deeper-rooted, longstanding undercurrent of tension between the British public and science itself.
What is referred to in academic circles as the ‘Public Understanding of Science’ is the acknowledgement of, and attempt to somewhat bridge, the gap between expert and lay understandings of science. In 1985, the Royal Society of London published an influential report recognising “improving public understanding of science” as one of its major concerns. Fifteen years later, the House of Lords diagnosed a “crisis of confidence” between the public and scientific institutions. A 2005 European Commission report found that many people could not answer basic scientific ‘true or false’ questions – with 43% of those quizzed believing that antibiotics kill viruses as well as bacteria, and 29% believing that the Sun orbits the Earth. It appears that a sizable proportion of the public lack an adequate understanding of science.
This is presumably why we are so susceptible to misinformation regarding scientific concepts. It is, in part, why the MMR hoax – which claimed a fabricated link between the measles, mumps and rubella vaccine and autism in children – resulted in a severe drop in vaccinations and the entirely avoidable 2005 mumps epidemic in the UK. Before that, there was the pertussis (or whooping-cough) vaccine scare in the mid-1970s, and further back, the 1930s’ anti-smallpox vaccine movement in Leicester. It is why homeopathy and eugenics continue to pervade today’s society, despite being debunked as pseudosciences. Why flat-earthers, climate change deniers and anti-vaxxers persist.
Scientists are partly to blame. Experts often have difficulty conveying succinct, relevant scientific information that is both accessible and timely for laypeople. That’s what science communicators are for. Misinformation, however, is propagated most commonly between sectors of the public. By definition, the ‘public’ in the Public Understanding of Science refers to all non-experts, including governmental officials and mass media organisations. The UK Government’s approach to the outbreak has evolved rapidly, drawing widespread criticism along the way. The controversial ‘herd immunity’ tactic has given way to a nationwide lockdown. Prime Minister Boris Johnson has tested positive for the virus, just weeks after announcing that he’d been shaking hands with Coronavirus patients.
For NHS workers, it is not the unfounded wariness of ibuprofen or the idea that leaving raw onions in the corner of a room will prevent COVID-19 that is most unsettling. It’s the downplaying.
“The idea that young people can’t get it,” a frontline worker at the Luton and Dunstable Hospital told me. “Or, about underlying conditions. These underlying health issues could be something you didn’t know about or even antibiotics or medication you might be on. This virus is new and being studied as we go on. We don’t know how it reacts to anything. We don’t have that information.”
In Ben Goldacre’s book Bad Science, there is a chapter titled ‘Why Clever People Believe Stupid Things’ in which the British physician cites the phenomenon of “communal reinforcement” – when a claim becomes a strong belief through repeated assertion by multiple members of a community. The supposed realities talked about in every broadcast, press conference or social media chain mail settle in a stagnant state in the air around you, polluting it. All we can do is add the truth and hope that, at some point, it offsets the lies.