Welcome to The Turnstone. Here, I share my perspective on science, society and the environment. I send my articles out every Sunday – if you’d like them emailed to you directly, you can sign up to my mailing list.
This article follows on from last week, where I looked at why people believe conspiracy theories.
The human brain is a remarkable thing. As an example of how well it can work, take the identification of plants and fungi. Most of us recognise a number of vegetables and garden flowers. Some of us know much more than that, and can identify hundreds, if not thousands, of different species. I belong to a group on Facebook where we identify plants and fungi which people fear may be poisonous. In most cases, three or four people will chime in with the name of the plant within a couple of minutes of a photograph appearing. Sometimes it’s tricky, as all we have are some chewed up pieces of leaf or mushroom. Even in difficult cases, we can usually help, at least narrowing the options.
Every now and again, someone posts a picture and mentions that they’ve tried to get an identification using an app on their phone. Sometimes the name the app suggests is correct, but it is wrong more often than it is right. Even looking at a poor-quality photo, the human brain’s ability to identify plants and fungi far exceeds that of a computer. For most of our evolution, identifying our food was a crucial survival skill. It’s hardly surprising that we are good at it.
My observation that humans outperform computers at plant and fungus identification is backed up by evidence. Identification is a type of pattern matching – we are matching an image in our head with the image we see in front of us. It’s a skill at which computers can’t yet beat us. And we love doing it – recent evidence suggests that identifying patterns triggers the part of our brain associated with reward.
But our love of identifying patterns can get us into trouble as well. Take this problem:
If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?
I’ll admit it, the first number that popped into my brain was – you guessed it – 100. That’s obviously wrong, because it takes 1 machine 5 minutes to make 1 widget, which means the answer is 5 minutes. I knew that 100 was wrong as soon as it popped into my head, but that didn’t stop my brain from helpfully nudging me in that direction.
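If you like to see the arithmetic spelled out, here is a minimal sketch of the reasoning in code (the variable names are mine, purely for illustration). The trick is to count the work in machine-minutes rather than letting the matching fives do the thinking for you.

```python
# Count the work in machine-minutes instead of trusting the pattern of matching fives.
# 5 machines running for 5 minutes is 25 machine-minutes of work, which yields 5 widgets.
machine_minutes_per_widget = (5 * 5) / 5       # 5 machine-minutes per widget

# 100 widgets therefore need 500 machine-minutes of work.
total_work = 100 * machine_minutes_per_widget  # 500 machine-minutes

# Spread across 100 machines working in parallel, that takes...
time_needed = total_work / 100
print(time_needed)                             # 5.0 minutes
```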
Image credit: Getty Images
The little nudge from my brain telling me the wrong answer is an example of intuition. We’ve all got it, and we can all think of times it has given us the right answer. But intuition is also prone to error. These errors are not random, they are predictable and measurable. Psychologists call them cognitive biases.
The study of cognitive biases began in the early 1970s, with work from Amos Tversky and Daniel Kahneman. They found that people used predictable mental shortcuts, called “heuristics”, a type of thinking that Kahneman would later describe as “system 1” thinking. The opposite of system 1 thinking was “system 2” thinking. This type of thinking is considered and deliberate, what we do when we multiply 17 by 24. It’s also hard work, and the human brain avoids it when it can.
When we are going through our posts on social media, we’re probably not using much system 2 thinking. We aren’t consciously analysing everything we are reading – that would be exhausting. So when a statement pops up about Covid-19 or vaccination, we aren’t really giving it much thought. We run it past our mental shortcuts, and most of the time we probably believe what our brain tells us. If we read something that fits with what we already believe, our brain will tell us that it’s true. This shortcut is called the confirmation bias. If we read something that suggests a connection between events, our brain will nod and say “okay” – that shortcut is called the clustering illusion and is linked to our tendency to see patterns.
Confirmation bias is one of the best known mental shortcuts. A great example of it showed up in the US media in September, when Rolling Stone magazine published an article claiming that emergency departments in the state of Oklahoma were clogged up with patients who had overdosed on Ivermectin, a deworming medication. Ivermectin is a drug that has been tested to see whether it could help treat Covid-19. Despite extremely limited evidence, it has proved popular with those who don’t want to believe official advice. The story was widely shared, even being picked up by reputable news organisations such as The Guardian. Maybe you saw it yourself – I know I did, and I assumed it was true.
Unfortunately, the story was wrong. If hospital emergency departments were full, it was because of Covid-19 patients, a fact that could have been easily verified by contacting the hospitals concerned. The Ivermectin story, though, matched reporters’ beliefs and judgements about the kinds of people who would take an unproven medication, and it was accepted without verification – in other words, a case of confirmation bias.
The clustering illusion – our bias towards seeing patterns even where there are none – can be illustrated with the two images below.
Image credit: TierneyLab
Do you see anything? Many people, myself included, see pictures in both (I saw a pair of shoes in each). In fact, the right-hand one shows Saturn with its rings and the left-hand one is random. Not only that, but this particular mental shortcut seems to be heightened when we feel less in control. And it’s not just about seeing shoes in an image made up of random blobs. This same mental shortcut is linked to superstition and belief in conspiracies.
If we want to beat misinformation, we really need to think again, quite literally, about what we are reading. Research has shown that those who use more reflective (system 2) thinking are less likely to believe fabricated news headlines. Not only that, but showing people a news headline twice, and then asking them to take their time thinking about it, improved their ability to spot a fake.
Controlled psychology experiments are all very well, but how does this work in the real world? Researchers have investigated the link between results in a test of “cognitive reflection” (that is, system 2 thinking) and belief in conspiracy theories about Covid-19. Sure enough, people with higher scores in the cognitive reflection test (indicating more system 2 thinking) were less likely to believe Covid-19 conspiracy theories. Those with lower scores, indicating that they were more likely to use mental shortcuts or system 1 thinking, were more likely to believe in conspiracies.
But it’s not all bad news. We all take mental shortcuts, but we can also be encouraged to switch our thinking to system 2. The study I mentioned above – the one that showed people a news headline twice – illustrated that prompting people to engage in system 2 thinking could help. But we don’t always have someone helpfully reminding us to think carefully before we share a social media post. Are there other ways to get us to engage in more system 2 thinking?
One intriguing piece of research comes from Cambridge University’s Social Decision Making Research lab. There, researchers have been working on the idea of “vaccinating” people against disinformation. The idea of vaccinating people against attempts to persuade them has been around for nearly fifty years, but the method that the Cambridge researchers are using is relatively new. They have developed a simple computer game called “Go Viral” that shows people some of the techniques used to spread disinformation. Their research showed that playing the game reduced the chances of people believing misinformation. More importantly, similar research showed that those who were most susceptible to misinformation had the greatest improvement in the ability to distinguish true from false. You can try the game yourself here.
There are other ways to improve or encourage system 2 thinking as well, although they aren’t as simple as a five-minute video game. One approach that I’ve personally used in my public service career is “argument mapping” – a technique for visually representing logical reasoning. Argument mapping doesn’t, in itself, tell you whether your reasoning is good or not. Instead, it makes the structure of reasoning clear, and, with a clear structure, it is easier to spot a weak argument. Argument mapping takes time and plenty of practice to learn, but the evidence does suggest that it improves critical thinking. If you’d like to try it out, there’s a basic free course available here.
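To make the idea a little more concrete, here is a rough sketch of the structure an argument map captures: a claim, the reasons offered in support, and any objections, arranged as a tree. The example argument, the field names and the little printing function are my own illustration, not the notation of any particular mapping tool or course.

```python
# A rough sketch of the tree structure behind an argument map.
# The example argument and field names are illustrative only.
argument_map = {
    "claim": "We should pause before sharing this headline.",
    "because": [
        {"claim": "No independent source has confirmed the story.", "because": [], "but": []},
        {"claim": "It fits what we already believe, which is when confirmation bias bites hardest.",
         "because": [], "but": []},
    ],
    "but": [
        {"claim": "It was published by an outlet we normally trust.", "because": [], "but": []},
    ],
}

def print_map(node, indent=0):
    """Print the map as an indented outline, so weak links in the reasoning stand out."""
    print("  " * indent + node["claim"])
    for reason in node["because"]:
        print("  " * (indent + 1) + "+ because:")
        print_map(reason, indent + 2)
    for objection in node["but"]:
        print("  " * (indent + 1) + "- but:")
        print_map(objection, indent + 2)

print_map(argument_map)
```

Even laid out this crudely, you can see what the technique buys you: the objection sits right next to the reasons for the claim, where it is much harder to quietly ignore.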
There’s one problem with these approaches, though. They can reduce the chances of the average person believing misinformation, but there’s no evidence that they work on someone who has fallen deeply into the world of conspiracy thinking. Once that has happened, different tactics are needed. And that is what I’m going to look at next week.
Let me know what you think in the comment box below. And if you know someone who might find this article interesting, please share it with them.