Confirmed errors
Can we prevent our intuition from leading us astray? (10 minute read)
Before I start – I have some news that I’d like to share. This week, an anthology of essays called Otherhood was published. It covers the perspective of those who are childless, child-free or “child-adjacent”. What does “child-adjacent” mean? It refers to those who, for a range of reasons, ask themselves: am I a mother or not?
I’m among those who fit within that last category. I was a foster mother for seven years, although it’s not called that these days; the term now is “caregiver”. I wrote an essay about some of my experiences, and it’s in the anthology. It was tricky to write, because I needed to protect the confidentiality of the children and young people I cared for. But I managed.
If you are interested, here is a link to the book. For those outside New Zealand, it’s also available as an e-book. If you are one of those who finds Mother’s Day difficult, I can particularly recommend this book. It gives voice to so many different perspectives.
Otherhood | Massey University Press (masseypress.ac.nz)
In May 1995, I handed in my Master’s thesis, packed my belongings into my Honda Civic and drove south from Auckland, heading for Christchurch. I don’t remember much about the trip, except that my car was crammed so full its acceleration was sluggish. I know that the trip took two days, with an overnight stop in Palmerston North. I do remember that I was very lucky with my crossing on the Cook Strait ferry. I suffer badly from seasickness, but the water was as calm as a lake, and I was fine.
A couple of days after I reached Christchurch, I headed to the Allan Herbarium in Lincoln to start my new job. The Allan Herbarium is New Zealand’s largest collection of dried plant specimens, more than half a million of them, dating back to specimens collected by Joseph Banks and Daniel Solander on Cook’s first visit to New Zealand in 1769. It’s a critical reference collection for botanists – around the world, this type of collection is used to help classify and identify plants. There, it was my job to change the blotting paper in the plant presses, so that the drying plants didn’t go mouldy. I also glued the dried plants to pieces of card and entered the details into the database. It was fairly menial work for someone with a Master’s degree, but it was a remarkable learning opportunity.
My job meant that I looked closely at thousands of plants. But, more than that, it meant that I sat next to the legendary botanist Bill Sykes, known as Botany Bill because his knowledge of plants was so extensive.
Bill had retired some years before I started my job, so he was no longer assigned an office, which was why he was sitting in the lab next to a junior technician. He was an authority on the plants of the Pacific Islands and Nepal, as well as the cultivated plants of New Zealand. He had a personal collection of hundreds of succulents, which he kept in pots on the deck of his house. He knew both tropical and temperate plants, from rainforest to desert. Once, someone phoned the herbarium asking about a vine with blue flowers they had seen in Vietnam. We put Bill on the line, and he knew what they were talking about. I’m not even sure whether he’d been to Vietnam, but I know he passed through Thailand on his way to Nepal, so he’d have known the plants of south-east Asia.
Even though I was aware of his vast expertise and I was the most junior employee, I never felt ignorant in Bill’s company. He was humble, patient and generous in his acknowledgement of others. In future years, when sometimes I found myself in the position of expert, I would remember Bill’s ways. But, back then, I was simply a novice hungry to learn.
He noticed my interest, so he began bringing me plants to identify. He would open his backpack, hand me a specimen and ask me if I knew what it was. If I didn’t, he would ask me what it reminded me of. I’d have to think about it and make a guess. I’d name a plant that I thought resembled the specimen he’d brought me. “Well then,” he would say. “Why don’t you take a look at it?”
This was where I used my access to New Zealand’s best collection of pressed plant specimens. The specimens were arranged by how they were related to one another, which meant that if my guess was wrong, I could also check related plants, to see whether I was on the right track.
If I was completely wrong, I’d return to Bill, who would ask me to make another guess. He’d never give me the answer. Had I been an impatient person, it might have frustrated me, but I loved the challenge. Even better was the triumph when I finally found the pressed specimen which matched the plant Bill had brought me.
I sat next to Bill for three and a half years, and it was one of the greatest privileges of my life. What he did, I now realise, was to train my intuition. To this day, when I see a plant I’m not sure about, I don’t think about what features it has or use formal identification tools, at least not at first. I get a feeling about it. I know if I’ve seen a plant before, and if I haven’t, my guess on what it’s related to is right more often than not.
We all use our intuition at times. Most of us can give examples of where intuition has led us to a good decision. But if we are honest with ourselves, we can probably think of situations where our intuition has been completely wrong too.
I became interested in intuition when I read about the work of Gary Klein. He studied decision-making in certain types of experts, such as experienced firefighters and military platoon commanders, and looked at how they were using their intuition. Faced with a situation, they would have an intuitive idea of what to do – they didn’t consider a range of options; they went straight to the one that seemed right. Then they would mentally simulate their response to see if there was any reason to discard this intuitive idea. If there was, they would move on to the next intuitive idea; if there wasn’t, they would go ahead.
The point of Klein’s research was that these experts were using trained intuition. Crucial to the development of their intuition was immediate, clear feedback on whether they were right or wrong. It’s easy to see how this would work for a firefighter or platoon commander: if they were wrong, they could end up dead. The consequences of incorrect plant identification weren’t so extreme, but I was still getting immediate feedback on my guesses. Without that kind of feedback, intuition is much less reliable.
While Gary Klein was studying the good intuition of experts, another researcher, Daniel Kahneman, was studying why our intuition is often wrong. Kahneman, along with Amos Tversky, pioneered the study of cognitive bias. He also developed the idea that we have two different modes of thinking, which he calls system one and system two – his bestselling book on the subject, Thinking, Fast and Slow, is well worth reading. Kahneman’s argument is that system one thinking is automatic, fast, intuitive and emotional. System two is slow, effortful and analytical.
When I sit down to calculate my taxes or write an article, my brain is certainly working in the analytical mode of thinking, or system two. It’s a slow process and takes effort. Sometimes my mind wanders and I have to work to keep it on track. Sometimes I get thoroughly stuck and have to redo calculations or end up rewriting the same sentence over and over.
On the other hand, when I’m identifying plants, I’m mostly using system one. I also use it when doing a familiar task like getting dressed or driving a car. These activities are so familiar, I don’t need to think about how to do them. It’s also system one thinking which would make me jam on the brakes or swerve to avoid a child or animal who ran onto the road. If we had to rely on system two in that kind of emergency, the consequences don’t bear thinking about.
The problem with system one thinking is that we also use it when perhaps we shouldn’t, such as when we are scrolling through a feed on social media. System one thinking is easier than system two because it uses mental shortcuts, but these shortcuts are prone to errors, or cognitive biases. Many of these errors are consistent and repeatable. You may have heard of some, such as confirmation bias or, my personal favourite, the Dunning-Kruger effect (described not by Kahneman but by Dunning and Kruger). This is the tendency of people who know a small amount about a topic, or who have only a novice level of skill, to overestimate their competence.
Psychology research is difficult, and there are doubts about how consistent and prevalent some of these biases are. Not all of the biases that Kahneman and Tversky identified apply in all circumstances. And while we’ve probably all encountered people whose confidence exceeded their competence, I’m sad to say that some researchers have argued that the Dunning-Kruger effect may not be real, or at least not as widespread as originally suggested.
However, these criticisms don’t alter the central point that we are prone to using mental shortcuts which lead us to jump to incorrect conclusions.
One bias that doesn’t seem to be in dispute is also one of the most important and pernicious: confirmation bias. This is our tendency to notice evidence which supports our viewpoint and discount evidence which contradicts it. It’s easy to see how this can affect our susceptibility to misinformation and disinformation. If we see something that confirms what we already believe, it feels intuitively true to us, and we are more likely to accept it.
Even if social media gave us a balanced selection of information, with different perspectives on an issue (which it doesn’t), our brains would still give more credence to information which fitted our existing beliefs.
Are there ways we can get better at overcoming confirmation bias? Are there ways to encourage more system two thinking? Can we train our intuition to get better at spotting mis- and disinformation?
There’s actually some good news here. Studies have found that it’s relatively easy to get people to switch to their system two thinking and make a better judgement about whether a headline is true. In one study, people were asked how likely they were to share a selection of headlines. Some were simply shown the headline and then asked if they would share it. Others were shown the headline, asked why they thought it was true or false, and then asked if they would share it. The second group were much less likely to share fake headlines, but just as likely to share true ones.
Another study looked at providing ratings on news sources for articles shared on social media. The study found that people took ratings for the sources into account when judging whether to believe a headline and whether to share a news item. Even more encouraging, people gave more credence to ratings they were told came from expert fact checkers as opposed to ratings from other social media users. People paid particular attention when a news source was rated as very low for accuracy.
The authors of this study wrote an article for The Conversation, which is a lot easier to read than the original paper. It’s well worth checking out.
Rating news sources can help limit the spread of misinformation (4 minute read)
But these researchers found something else interesting too. Asking people to rate articles themselves also improved their ability to spot inaccurate information. Not only that, the effect persisted: once people had spent some time rating articles, they remained more critical of what they read, even after they were no longer required to rate anything.
There are other studies too. Even something as simple as a time delay before posting can make people less likely to post aggressive and threatening messages on social media. There’s a lot of research going on, and it’s showing us that we don’t need to put up with social media being a toxic swamp of mis- and disinformation. There are ways to nudge our brains into being more discerning about what we are reading. Whether social media companies will be willing to put these measures in place is another matter – but knowing that the tools exist is a crucial first step.