12 Comments
Apr 14 · Liked by Melanie Newfield

Thanks Melanie. You might be interested in a book I read a while back - "On Immunity" by Eula Biss. It's a history along similar lines, about the social history and autonomy implications of inoculation - you might enjoy it, if you haven't already 😉

author

Thanks, I'll check it out.

Apr 14 · Liked by Melanie Newfield

Excellent piece. What I wonder about is why those responsible - the white-collar criminals responsible for the behaviour of companies - seem to walk away scot-free. Certainty of substantial and debilitating punishment would change this, as they're risk-averse and will make those calculations. But it rarely happens, and even if they're caught the punishment is very light.

author

In this case, one of the lawyers acting for the pharmaceutical company staff was appointed Minister of Justice for the state where the trial was taking place. And he basically shut the trial down. It's incredibly depressing and almost incomprehensible in how obvious the conflict of interest was.

https://www.theguardian.com/society/2014/nov/14/-sp-thalidomide-pill-how-evaded-justice

Apr 14 · Liked by Melanie Newfield

Thanks Melanie, this is excellent. It's heartbreaking to see how corporations have deliberately destroyed the credibility of science so that they can continue sacrificing public health in the name of profit. It's the crime of the century for the second century in a row.

author

Thanks John. Yes, it's horrific. There are few examples worse than this, but terrible people keep doing it again and again.

Thank you for this. I was glued to it until the last word. I will definitely check out the book "Dark Remedy".

author

Thank you, I'm glad you found this interesting.

Apr 14 · Liked by Melanie Newfield

"At their core, they relate to how people perceive risk – what seems dangerous and what seems safe. It’s a topic which has long fascinated me – I worked in biosecurity risk assessment for 16 years, and even before that I was involved in both risk assessment and risk communication."

There are some interesting nuances related to risk and risk perception that are often overlooked. One is that risk lives in the future in citizens' minds:

“Risk is a concept we utilize for coping with the myriad of logically possible future states of affairs. This means that risk does not have an objective existence per se, and that all risk assessments are subjective or relative.” (Solberg, 2012)

When considering whether or not to engage in an activity, citizens like me might ask themselves some basic "risk perception" questions:

1. Will consequences be realized in a short time frame?

2. Will I have control over engaging in the activity?

3. Will everyone exposed be in the same place?

That is, in my opinion, risk perception is governed by three things: time, control, and proximity.

It should also be understood that science can mislead if the extent of the data behind it is not well understood. For example, when low-probability, high-consequence scenarios are in play, the amount of data required is often well beyond anyone's reach. Nevertheless, "science" is often cited as giving certainty about a consequence when in fact the certainty should be very low indeed.
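
To give a rough sense of scale, here is a small Python sketch based on the classical "rule of three": with zero failures observed in n independent trials, 3/n is an approximate 95% upper confidence bound on the per-trial failure probability. The target rates below are hypothetical, chosen only to illustrate the point:

```python
def trials_needed(target_rate: float) -> float:
    """Failure-free trials needed to bound the failure rate below
    `target_rate` at roughly 95% confidence (rule of three: n ~ 3/p)."""
    return 3.0 / target_rate

# Hypothetical target rates, purely for illustration.
for rate in (1e-3, 1e-5, 1e-7):
    print(f"to show p < {rate:g}: ~{trials_needed(rate):,.0f} failure-free trials")
```

Even the modest 1-in-1,000 target already demands thousands of failure-free trials; the rarer the event, the more hopeless direct observation becomes.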

Adding to the confusion, risk has many meanings, which can confound even sophisticated analyses. In my opinion, a particularly informative treatise on the subject is: Lewens, Tim, ed. Risk: Philosophical Perspectives. Routledge, 2007. Again in my opinion, presuming that science can "correctly" inform citizens on risk is unreasonable at best and dangerous at worst. Science is best used to inform policy makers about the data that have been accumulated on the scenarios that have been studied. Scientific data are only a small part of the "risk" decision-making that must be accomplished to set public policy. Good decision-making is -- should be? -- centered on respectful debate among the citizenry.

Looking forward to your next post on the subject of risk, risk perception, and decision-making.

author

Science is incredibly useful in helping us understand risks, but it's only one piece of the puzzle. You're absolutely right that it's particularly limited in what it can tell us about low-probability, high-consequence events.

I'm entirely in agreement that science is only an input into decision-making. I've literally been researching good decision-making for the last couple of years, and you are quite right that respectful debate among citizens is crucial. I can't write about my specific research on my Substack yet, because it hasn't been published, but I'm happy to have an offline discussion. My research is specifically relevant to biosecurity/invasive species management in New Zealand, but it might be relevant to you. Feel free to email me if you'd like to discuss it (i.e. just reply to any of my newsletters and it will get to me).

Hi Melanie, I have absolutely no background in biosecurity or invasive species, beyond being careful with plant selection at the nursery!

I am retired, and my rather limited background is in nuclear power, primarily commercial nuclear energy production. My experience also extends to the operation and maintenance of nuclear-powered warships, and to experimental nuclear accident studies at a U.S. national laboratory. In my last ten or so years in commercial nuclear power, I worked in the quantitative risk assessment domain and have become somewhat skeptical about its application in regulation as a decision-making tool on protections. In particular, the "tool" called Probabilistic Risk Assessment (PRA) has been adopted by many who hope it can be used to understand the "risk" of low-probability, high-consequence events.

Sven Hansson has pointed out some problems with quantitative analyses; see, for example, Hansson, Sven Ove. "From the casino to the jungle: Dealing with uncertainty in technological risk management." Synthese 168, no. 3 (2009): 423-432. We have tried to add specific context in, for example, Kee, Ernie, and Martin Wortman. "Unanticipated protection failure scenarios optimistically bias reactor safety metrics." Nuclear Engineering and Design 403 (2023): 112151. We believe that, unfortunately, quantitative risk analyses are sometimes used to silence public criticism of protection failures in regulated industries, or in some cases even to justify regulatory overreach. The well-known Reactor Safety Study presented a view that would lead citizens to believe reactor accidents would not happen within a reactor's lifetime. Shortly after the study was published, the Three Mile Island Unit 2 accident occurred, about a year after the unit began operation. The FAA used its quantitative analysis, the Transport Airplane Risk Assessment Methodology (TARAM), to justify continued operation of the Boeing MAX fleet AFTER the second deadly accident; the Executive (the Presidential office) finally intervened to stop further operation of the planes.

In my own Substack posts and contributions to the BeyondHarm Substack, I try to stay focused on protection failures in technological systems that could have been avoided if citizens had been aware that high-consequence scenarios lacked proper protections. My main idea is simply to raise the awareness of citizens (those in a democratic political state) that they are the ones who must elect representatives who will develop proper a priori protections against consequential events, protections that come only from regulatory authorities acting to inspect and enforce (proper) legislation (on protection).

An aspect of rare events is that they often follow a Poisson arrival characteristic. A well-known consequence of such arrivals is that they are memoryless: the probability of an arrival in the next instant is exactly the same no matter how long it has been since the last one, so a long quiet interval buys no safety at all, while the cumulative probability of at least one arrival keeps climbing as the horizon lengthens. On the other hand, citizens' propensity to advocate for protections against the next arrival shrinks as the time since the last arrival grows. A good example is Hurricane Katrina in the U.S., where the New Orleans government was allowed to spend money intended for hurricane protections in unrelated ways. In fact, extreme hurricane arrivals are not that rare, but they are obviously rare enough to breed complacency.
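
To make that memoryless property concrete, here is a minimal Python sketch; the one-event-per-50-years rate is an assumption chosen purely for illustration, not a figure from the discussion:

```python
import math

RATE = 1.0 / 50.0  # assumed Poisson arrival rate: one event per 50 years

def p_at_least_one(years: float) -> float:
    """Probability of at least one Poisson arrival within `years`."""
    return 1.0 - math.exp(-RATE * years)

# Memorylessness: the chance of an event in the next decade is identical
# whether the last event was yesterday or forty quiet years ago.
print(f"P(event in the next 10 yr): {p_at_least_one(10):.3f}")

# Yet the cumulative probability keeps climbing with the horizon, which
# is exactly what complacency after a long quiet spell ignores.
for horizon in (10, 30, 50, 100):
    print(f"P(at least one event within {horizon:>3} yr): {p_at_least_one(horizon):.3f}")
```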

Even greater complacency comes with events that are obviously possible (can be easily foreseen) but have never been realized (observed). A good example is the consequential Maui Lahaina fire, which came about because invasive grass species were allowed to spread right up to the city limits. This was allowed even though it was well known that such grasses, once ignited, burn rapidly and fiercely. The reactor meltdowns at Fukushima followed the same kind of complacency (a rare tsunami event), reinforced by quantitative risk analyses (PRA).

I guess risk analysis is a complicated topic!

It is a scary subject. Thanks Melanie. I am grateful to have avoided any such disasters so far (born in 1961!!). The TV series 'Call the Midwife' touched on the polio and thalidomide issues.

The guilt felt by mothers must be dreadful.
