Beliefs and lies
What is false information anyway? (9 minute read)
On the 26th of March, the world was horrified by news that a gargantuan container ship had crashed into a bridge in Baltimore, causing it to collapse. I heard about it from radio news while I was walking my dog. Although it sounded awful, the full impact of the news was blunted because I was listening to the radio. Only later would I see the pictures and grasp the scale of what had happened.
The video is almost incomprehensible, as if it were something generated by Hollywood. If it hadn’t happened at 1.30am, many more than six people would have died. Here, I’ve linked to the version shared by The Guardian, a media source I consider largely reliable. The same video was shared on many platforms, including Facebook and X, formerly Twitter.
Some of those who saw the video via social media were then directed to another video, which said that it showed the collapse from a different angle. For 35 seconds, it just shows a few vehicles moving over a bridge at night, then there’s a cataclysmic explosion. It’s horrifying. How did a container ship cause such an explosion?
That’s easy to answer, with a bit of basic fact checking. It didn’t. The video showing the horrific explosion came from the Kerch Bridge, which links the Crimean Peninsula with Russia. Commissioned after Russia annexed Crimea in 2014 and completed in 2018, it has been repeatedly targeted by Ukraine since Russia’s full-scale invasion of the country in 2022. Despite this, thousands of people shared the video as coming from Baltimore.
Misinformation and disinformation are widely-used terms meaning false information. The precise definitions, though, vary. The distinction usually hinges on the intention of the originator of that false information. Misinformation is a mistake – information shared by those who believe what they are saying, or without the intention of harm. Disinformation is deliberate deception. It’s often linked to harm, or to a hidden agenda, like power or profit.
Harm, however, is a matter of perspective, something that I learned in my work on invasive species. For example, I’m concerned about harm to, and the likely extinction of, birds found nowhere but New Zealand if pest species such as rats, stoats and possums aren’t controlled. An animal rights campaigner might be concerned about the suffering of rats, stoats and possums. Rats, stoats and possums don’t die a quick, painless death from some of the toxins used to control them – I’d be lying if I claimed they did. A New Zealand conservationist might overlook my lie, but the SPCA would label it disinformation.
There’s a good, nuanced discussion of misinformation and disinformation in the article below. It gives a few different definitions, as well as getting into some related topics.
Misinformation vs. Disinformation - Taylor & Francis Insights (taylorandfrancis.com)
One of the points which came up on the disinformation course that I’m doing is the idea that mis- and disinformation are on a spectrum. I find this idea immensely helpful, because it’s never seemed an entirely clear distinction to me. Whoever originally labelled a video from the Kerch Bridge as Baltimore must have known they were being dishonest. The thousands who shared it afterwards probably didn’t.
Sometimes, intentions seem fairly clear-cut. There are, for example, claims which originated in Russia that Volodymyr Zelenskyy has bought a couple of luxury yachts – claims which have ended up being repeated in debates about US funding to support Ukraine, despite the fact that the two yachts in question are still on the market and haven’t been sold to anyone. Mislabelling one bridge as another, on the other hand, is ambiguous.
There’s another spectrum in misinformation and disinformation, and although it’s mentioned in the article I’ve referred to above, it doesn’t usually get much attention. What, exactly, is false?
That question is harder to answer than it first appears. In the case of the Kerch Bridge, it is as clear-cut as it could be – there’s no version of events where that bridge is actually in Baltimore. But when we start to get into questions of complex science, as we do in climate change, we start to stray into more difficult questions.
First, there is scientific uncertainty. It’s a different kind of uncertainty from me saying that I’m uncertain what job I’m going to be doing in six months, or I’m uncertain who is the current coach of the All Blacks. The latter is simply ignorance. I could find out easily enough, if I cared. The former is something that nobody knows right now. There’s no point in me trying to predict it – I’m better to put my energy into finding the right opportunity.
Scientific uncertainty is about the degree to which something is known. Scientists have a range of mathematical ways of describing uncertainty, and often it is framed as a level of confidence rather than uncertainty. Some scientific questions, such as the link between specific microbes and specific diseases, have high confidence. Other scientific questions, such as how life first developed, are highly uncertain or have low confidence.
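As a simple illustration of the kind of mathematics involved (a generic statistical sketch, not the IPCC’s own method, which uses qualitative confidence levels), a measured quantity is often reported with a confidence interval around it:

\[ \bar{x} \pm 1.96 \, \frac{s}{\sqrt{n}} \]

Here \bar{x} is the average of the measurements, s is their spread (the standard deviation) and n is the number of measurements; an interval like this covers the true value about 95% of the time. Because n sits under a square root, quadrupling the amount of data only halves the width of the interval, so confidence grows with data, but slowly.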
As an aside, one of the best explanations I’ve seen comes from the organisation Sense About Science – it’s well worth a look: Making Sense of Uncertainty - Sense about Science
Degrees of confidence or uncertainty are important in understanding climate change. In the science of climate change, scientists aren’t equally confident about everything. The latest report from the Intergovernmental Panel on Climate Change (IPCC), for example, states that there is high confidence that:
Human activities, principally through emissions of greenhouse gases, have unequivocally caused global warming, with global surface temperature reaching 1.1°C above 1850-1900 in 2011-2020. [page 4, Summary for Policy Makers]
Many climate change impacts are also reported with high or very high confidence. So too is the statement that we require rapid and deep reductions in greenhouse gas emissions to limit warming to below 2°C. On the other hand, estimates of sea level rise vary in confidence. There is high confidence that sea levels will continue to rise for hundreds or even thousands of years. Specific estimates of how much the sea level will rise over the next 75 years are reported with medium confidence. Estimates of the total sea level rise over the next couple of thousand years are reported with low confidence.
But most of us don’t get our information directly from IPCC reports. Once science is translated into less technical sources, uncertainty may be left out. I know that I don’t often get into discussions about how confident scientists are on topics I’m writing about. Many articles about science in the media, not just climate change, gloss over uncertainty. It isn’t only uncertainty which is glossed over. Science is often reinterpreted and its meaning lost in translation.
On the other hand, uncertainty could be presented in a way which is misleading. A 2017 article from The Conversation gives an example of this. A group of scientists published an article which gave a more optimistic view of how long it would be before the world passed the 1.5°C goal of the Paris climate agreement. It wasn’t vastly more optimistic – it gave us another ten years. But some media headlines suggested that their research had shown previous modelling was wrong, and that it signalled some kind of revolution in understanding.
In the case of climate change, as well as other environmental issues, uncertainty has frequently been used as an argument for delaying action. For years, reference to uncertainty was central to the public statements of fossil fuel companies – ExxonMobil in particular – about climate change. If there is uncertainty, the argument goes, the problem might not be as bad as the scientists are saying. Why take action which might be costly? Some of the statements arguing against climate action have misrepresented the level of confidence that scientists have in their evidence. But some of the conversation about climate change is not about science and uncertainty, it’s about opinion and values.
It's not a matter of opinion whether burning fossil fuels has contributed to increased levels of carbon dioxide, and whether these increased levels of carbon dioxide have resulted in warming. Nor is it a matter of opinion that the world is going to suffer certain consequences as a result, and that we need rapid and deep reductions in greenhouse gas emissions if we want to reduce those consequences. There are levels of uncertainty in those statements, but as far as I can tell, it’s not a meaningful level of uncertainty when it comes to understanding how the natural world works.
It's also not a matter of opinion that governments of the world have agreed to limit warming to well below 2°C and preferably 1.5°C, and that they have failed to implement policies which will achieve those goals. Some of the more extreme scenarios, however, are uncertain enough that we should be careful not to misrepresent the evidence.
But many questions about how we respond are matters of opinion and values, or have worryingly high uncertainty. Our ability to scale up carbon capture and storage, for example, remains uncertain. The extent to which different countries should contribute to emission reductions is a matter of opinion. How we balance the contributions of countries like New Zealand, who each contribute a small proportion of global emissions, against those of countries like the USA and China is a matter of opinion. How we balance the contributions of wealthy countries, who got that way by burning fossil fuels, with less wealthy countries is also a matter of opinion. Whether, and how much, wealthy countries should compensate less wealthy countries who are suffering the consequences is a matter of opinion.
However, we aren’t going to be able to have constructive discussions on these topics if we exist in different worlds about the science. The problem is, we do. I exist in a world where I think that science basically works. I consider it a valid way to understand the natural world. I recognise that there are issues with publication and peer review, that scientists make mistakes, that some science is fraudulent, and that sometimes things that we think we know turn out to be wrong. But, on the whole, I trust the system. It means that when I look at a report from the IPCC, I see a compelling reason for action.
Some people, though, see a different reality. There are valid reasons that people may not trust science, and it’s something that we need to consider in our discussions about disinformation and climate change. I’ll look more at that next week.
Thanks for taking up this subject, Melanie. Misinformation and disinformation have been on my mind a lot lately. It seems that so much of our time and effort—for those of us working in the environment—is taken up with fighting misinformation and mendacity. I may touch on the subject myself sometime soon.
Regarding: "On the 26th of March, the world was horrified by news that a gargantuan container ship had crashed into a bridge in Baltimore, causing it to collapse." Information is nuanced, sometimes in an extreme way (your example), sometimes not so extreme. But there is also something about beliefs and lies that is nuanced. Sometimes we are led astray intentionally ... sometimes unintentionally.
I guess that "certainty" has nuance between science and something like trans-science. Citizens in general don't think in the same way scientists do, which sometimes causes conflicts between most citizens' policy advocacy and that of scientists and engineers. Engineers and scientists create equations that require coefficients, and those coefficients are derived from observation of relevant experiments. One often hears engineers saying "Get me more data." The more data one has that applies to specific situations, material types and connected systems of devices, the more certain one becomes about outcomes. Insurance companies make money by having actuaries collect large amounts of data on specific kinds of losses so that premiums can be set at values that make the company money. These principles of risk and uncertainty have been understood since antiquity and started to become formalized more recently in, for example, Gerolamo Cardano's "Liber de Ludo Aleae" and, later still, Jakob Bernoulli's "Ars Conjectandi". It seems the Enlightenment brought together Newton's differential equations with the belief that, given large enough data sets, the future could be predicted with certainty. In modern times, Sven Ove Hansson's "From the casino to the jungle: Dealing with uncertainty in technological risk management", although coming later, lays a good foundation for Ilya Prigogine's "The End of Certainty".
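To make the actuarial point concrete (a minimal sketch with generic terms, not any particular insurer's formula), the premium for a class of policies is built from quantities that can only be estimated from loss data:

\[ \text{premium} \approx (\text{claim frequency} \times \text{average claim size}) \times (1 + \text{loading}) \]

The claim frequency and the average claim size come entirely from past observations, so the more relevant data the actuary has for that specific kind of loss, the more confident the company can be that the premiums collected will cover the losses and still leave a margin.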
Thinking about bridges, for example, it is well understood what the terms "ultimate strength" and "yield strength" imply to design engineers about the behavior of a structural member. When the yield strength is exceeded, the member begins to deform permanently (deflect, stretch, ...); when the ultimate strength is exceeded, the member fails (breaks in two). However, the actual values depend on the material, for example a particular steel alloy, that is being subjected to a force inducing stress and strain. The only way to know where a material will yield or fail (its yield strength and ultimate strength) is to collect a lot of data by subjecting samples of the material to forces in a specific way, so that the equation coefficients can be set. Depending on how much of the collected data applies to a SPECIFIC SET OF CIRCUMSTANCES, predictions of behavior can be expressed in mathematical formulae.
Uncertainty creeps in through not knowing whether things will actually obey the assumed specific set of circumstances. Engineers are very familiar with the danger of that unknown part, and they use "margins of safety" to compensate for what they don't know. Engineers understand that the data collected on materials, for example, may not include effects such as corrosion, higher loads than assumed, or effects that belong to physics yet to be discovered or that are only revealed in operation. For example, the engineers who designed the first full-scale plutonium production reactor were asked to design it using the data scientists had collected on fission up to that time. However, the engineers who designed the reactor added extra fuel tubes to "increase the margin of safety" in the design. It is well known that this added "margin of safety" was the only reason the reactor could keep operating, because the effects of xenon poisoning had not been observed in the data taken by scientists using low-power fission reactions (for example, https://b-reactor.org/wp-content/uploads/2017/03/Lost_In_The_Telling-Rev_3.pdf).
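To put the margin-of-safety idea in numbers (an invented illustration, not the Key Bridge's or the reactor's actual design figures): suppose a steel member has a yield strength of 350 MPa and the largest load the designers expect produces a stress of 175 MPa. Then

\[ \text{factor of safety} = \frac{\sigma_{\text{yield}}}{\sigma_{\text{applied}}} = \frac{350\ \text{MPa}}{175\ \text{MPa}} = 2, \qquad \text{margin of safety} = \frac{\sigma_{\text{yield}}}{\sigma_{\text{applied}}} - 1 = 1 \]

The member can carry twice the assumed load before it even starts to yield. That spare capacity is what absorbs corrosion, loads heavier than assumed, and the physics nobody has measured yet.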
Trans-science is a term coined by Alvin Weinberg in the article "The limits of science and trans-science". From his abstract:
"Many different limits to science have been identified, the most common being those between science and religion, or more generally between fact and value; between science and art; as well as the sociological limits imposed on science because it is becoming too large and unwieldy to be encompassed by a single mind. Here another realm is explored, lying beyond science: we call trans-science those questions which epistemologically are matters of fact, yet are beyond the proficiency of science. Trans-scientific questions consist of very rare occurrences and 'catastrophes' in the Thomian sense. It has been pointed out that unanswerable, trans-scientific questions are usually asked of science by policy makers. Consequently the scientist must concede that his[sic] proficiency is limited by this trans-scientific limit to science." (Weinberg 2013, online version)
Climate change and bridge building have trans-science in common. Early work by Joseph Fourier and Claude Pouillet pointed to the atmosphere, carbon dioxide included, retaining longer-wavelength (infrared) radiation rather than letting all of the energy that arrives as sunlight escape back to space. That is a measurable fact, like the scientific fact that an atom that fissions will release a known amount of kinetic energy. Einstein's well-known equation, E = mc² (the energy released equals the mass times the square of the speed of light), required data from measurements made by, for example, Ole Roemer, and the mass of fissioning atoms must also be determined from experiments. Yet that scientific fact, when extrapolated to scale in the Hanford production reactor, required more fissions than scientists expected, because their experiments had failed to include fission products that "steal" neutrons. Changing conditions do the same thing: really, Really, REALLY big ships started going under the Francis Scott Key Bridge, which had been designed (with a safety margin) for much smaller ships.
Well, I guess what I am trying to say in a roundabout way is that there is great uncertainty where we have little or no data. What are thought to be "scientific facts" may not account for uncertainty in either direction. And scientists, engineers, and particularly policy-makers should be humble enough to take all views into account when deciding policies. This should include understanding how much relevant data are available that can be applied to accurately predicting future outcomes.
Background on some of the points:
Hansson, Sven Ove. "From the casino to the jungle: Dealing with uncertainty in technological risk management." Synthese 168, no. 3 (2009): 423-432.
Weinberg, Alvin M. "The limits of science and trans-science." Interdisciplinary Science Reviews 2, no. 4 (1977): 337-342.
Cardano, Gerolamo. The Book on Games of Chance: The 16th-Century Treatise on Probability. Courier Dover Publications, 2015.
Schneider, Ivo. "Jakob Bernoulli, Ars Conjectandi (1713)." In Landmark Writings in Western Mathematics 1640-1940, pp. 88-104. Elsevier Science, 2005.
Lewens, Tim, ed. Risk: Philosophical Perspectives. Routledge, 2007.