Welcome to The Turnstone. Here, I help people understand important issues such as Covid-19, climate change and conservation. I send my articles out every Sunday - if you’d like them emailed to you directly, you can sign up to my mailing list.
On one of my many bookshelves, there’s a small, tatty book with the unlikely title “Pomp and Pestilence”. The dust jacket is plain, no illustration, just words and blocks of black, white and red. The font of the main title reminds me of the Nevil Shute novels I enjoyed reading some years ago – a clue to the age of the book, which was first published in 1954. But what really sets the book apart as a perfect product of its time is not the condition of the dust jacket or the cover design. It is the subtitle: Infectious Disease, Its Origin and Conquest.
Today, as we follow the ever-increasing Covid-19 death tolls around the world, and hang our hopes on the latest reports of progress in developing a vaccine, it is hard to imagine a time when a Professor of Bacteriology from St Thomas’s Hospital Medical School could have thought such a subtitle was a good idea. St Thomas’s, you might remember, was the hospital where Britain’s Prime Minister spent several days in the Intensive Care Unit as a result of a disease that didn’t exist six months previously. Nobody today would think of saying that infectious disease is conquered. But in the decades following the end of World War Two, there was an extraordinary era of optimism, when science seemed to be the answer to the world’s problems. Nowhere was this more apparent than in the field of medicine, where the combination of antibiotics, vaccination and DDT was so effective in controlling infectious disease that people could write books about its conquest.
Perhaps the greatest achievement of that era came right at the end, in 1977, with the victory over smallpox, which remains the only human disease to be eradicated. By then, the tone of the language had shifted, from the conquest of disease, to the theory of the epidemiologic transition. That’s a bit of a mouthful, but in simpler terms it means that patterns of disease had changed – from high mortality due to infectious disease, war and famine, balanced by an equally high birth rate, to mortality later in life, from conditions such as heart disease and cancer.
This pattern has largely held true in wealthier nations. In New Zealand, such a transition had certainly occurred by the 1970s. In 1900, infectious disease was a significant cause of death. It’s difficult to get exact proportions due to the arcane terminology – 137 people died of convulsions, for example, and 122 from “syncope”, which simply means fainting or passing out – but at least 25% of deaths were ascribed to infectious diseases, of which tuberculosis was the most prevalent. By 1977, infectious disease caused about 1% of deaths.
By this time, however, it was clear that infectious disease was far from just a problem of the past. Antibiotic resistance was a well-recognised problem (even if there had been little action to manage it), fears about a repeat of the 1918 flu pandemic had led to the hasty vaccination of millions of people, and an outbreak of disease among Legionnaires attending a convention in Pennsylvania had captured world headlines. But with smallpox, at least, we have a true success story. Millions of people have been saved from death, disfigurement or blindness by the use of one medical approach – vaccination.
Smallpox was the very first disease to be controlled by vaccination. As I mentioned in the first part of this article, a forerunner of smallpox vaccination, called variolation, was reported from Asia about 1000 years ago. Variolation involved inoculating people with small amounts of the smallpox virus, usually from scabs of someone who had recently recovered. It was a dangerous practice – one in every hundred who were inoculated contracted a serious case of smallpox and died from it – but for a disease which normally killed three out of every ten infected, many considered it worth the risk. A safer alternative was created by Edward Jenner, using the closely-related but mild disease of cowpox. In 1796, in an experiment which would never pass a modern ethics committee, he infected an eight-year-old boy with cowpox. A couple of months later, he exposed the boy to smallpox and demonstrated that he didn’t become ill.
Although not the first to realise that cowpox conferred immunity to smallpox, Jenner was the first to study it in a scientific, if not particularly ethical, way and to promote it effectively among the medical profession. By 1800, the use of cowpox to vaccinate against smallpox was established in Britain and had spread to Europe and North America. The first steps towards eradication had been taken.
From the very beginning, just how to vaccinate enough people to control epidemics was the subject of debate. Parish records from early 19th century England showed a range of strategies, from vaccinating only the willing, to withholding support from the poor who refused vaccination for themselves and their children. The first compulsory vaccination laws appeared within a decade or so of Jenner’s first experiments. The state of Massachusetts made the smallpox vaccine compulsory in 1809, and Sweden followed in 1816. Britain was much later, making vaccination compulsory in 1853, and New Zealand made the vaccine compulsory for children in 1863.
However, making a law about something doesn’t magically make it happen. In 1893, the first year for which I’ve been able to find data, declining rates of smallpox vaccination in New Zealand were already causing concern. While in 1889 nearly 50% of babies had been successfully vaccinated, by 1892 the number was just over 30%. Vaccination rates for older children showed a similar decline. A reason for the declining vaccination rates was not given, but there’s a startling statement made about the vaccine just after the tables giving the numbers and percentages for New Zealand:
The deaths in England from smallpox for the year 1891 were only 49 in number; and the deaths caused by the effects of vaccination were 43.
The prospect of dying from a vaccine is a powerful disincentive. When a disease is prevalent and deadly, as smallpox had been in eighteenth and early nineteenth century England, people may well accept the risk, as many did with variolation. But once a vaccine has reduced the disease enough that it is no longer feared, many will weigh up the risks of the disease and the risks of dying from the vaccine, and decide to take their chances. It seems likely that this was the case in New Zealand, where smallpox was a disease which appeared infrequently, arriving with infected passengers aboard ships. Then, as now, the preferred method for dealing with potentially infected arrivals was to place them in quarantine. Harbourmasters had the power to order ships into quarantine as early as 1842, initially by directing them to anchor in a designated quarantine area. Matiu/Somes Island was first used for quarantine in 1872, when a ship arrived carrying passengers infected with smallpox. As a result, there were only occasional smallpox outbreaks in New Zealand.
With smallpox an infrequent arrival, it’s little wonder that many people in New Zealand did not get their children vaccinated with a potentially lethal vaccine. But, even in eighteenth and nineteenth century England, when smallpox was still a feared disease, some people were reluctant to be vaccinated. Although many parishes recognised the benefit, and paid for the vaccinations of those who could not afford it, there was also opposition. One parish warned an overzealous Overseer of the Poor against inoculating people against their will, while another threatened to prosecute someone if they inoculated their child. In the latter case, the point was made that smallpox wasn’t currently present in that parish. Since variolation could certainly cause a serious case of smallpox and set off an outbreak, the concern was understandable, if not entirely rational – at the time, smallpox was still responsible for approximately 10% of deaths in London, just a few miles away.
By the time Jenner’s cowpox-based vaccine was introduced, inoculation with small amounts of smallpox was already being widely practised in England. There were risks, but people were familiar with it and, when offered the choice, many preferred the more dangerous smallpox inoculation. The practice persisted until it was eventually banned in Britain in 1840.
The preference for the much more dangerous variolation over cowpox-based vaccination is a demonstration of how risk perception works. The human mind comprehends risk in a way which has little to do with the chance of dying. What matters, when we make a decision about whether to walk across a busy road, drink alcohol, or vaccinate our children, are factors such as how familiar something is, how fairly the risks and benefits are distributed, and whether we can remember any alarming stories on the topic. This perception of risk is often described as irrational, but it’s real and can’t be ignored if we want to understand why it can sometimes be difficult to convince people to get vaccinated.
One of the consistent factors which influences our perception of risk is how much control we have over it. Peter Sandman, the man who, more than anyone else, has influenced my understanding of risk perception, borrows a metaphor from one of his colleagues: carving a rib roast. It’s an informal event and you don’t have a fork, so one hand is on the roast and one on the knife. Picture, he says, really picture in your mind, just how close your hand is to the knife. Now, give someone else the knife. What happens to your hand?
The same factor applies when we compare driving a car to flying in an aeroplane. Many more people fear flying, not because it is dangerous but, at least in part, because they are not in control (although memorable news reports of plane crashes also have something to do with it). However, people still control the choice of whether or not they get on a plane. The situation is quite different when considering risks such as being exposed to chemicals from an accident at a factory, radiation from nuclear tests or, for that matter, 5G cellphone towers. There, we have no control and little choice, and we perceive those risks as much higher than the statistics suggest are justified.
From the very start, vaccination campaigns provoked opposition. There were those who were afraid it would harm their children, those who considered vaccination “unchristian” and those who simply distrusted medicine and didn’t believe that smallpox was an infectious disease at all. Understandably, when we consider the role of choice and control in risk perception, the opposition only became stronger when vaccination was made mandatory in Britain. The situation was similar in the USA. Opposition to mandatory and sometimes forcible vaccination went all the way to the Supreme Court, which ruled in 1905 that states could make vaccination compulsory to protect people during an epidemic, but could not forcibly vaccinate people.
When the global campaign to eradicate smallpox began, the disease was still killing two million people a year. In terms of infectious disease, only tuberculosis was killing more. So it is understandable, if not entirely acceptable, that medical staff resorted to coercive and sometimes forcible vaccination to reach their goal. Any qualms they may have had tended to be overcome by their perception of the greater good. But they were also taking a grave risk. Their tactics, however well-intentioned, threatened to lose the trust of the people they were trying to help. Had that happened, eradication could easily have failed.
The catastrophic impact of a loss of trust in vaccination can be seen in the attempt to eradicate polio, which began in 1988. By the year 2000, the original target date for eradication, there were fewer than 3500 cases, down from an estimated 350,000 in 1988, and the programme was on target to be completed in 2005. The annual report on polio eradication for the year 2000 mentions three main challenges – ensuring adequate funding, accessing conflict-affected areas and maintaining political commitment – but there is no indication that the programme was about to go badly off-track.
By late 2003, reports from Nigeria indicated a resurgence of polio, the result of religious and political leaders from northern states calling on parents to refuse vaccination for their children. The polio vaccine, they said, could be contaminated with anti-fertility agents and HIV. This contamination was deliberate, they said, part of an evil anti-Islam plot by America and its allies in the wake of September 11 and the invasion of Iraq. The governments of three states suspended their vaccination programmes.
Such ideas may seem laughable, and are easy to dismiss as the ignorant imaginings of religious extremists, but they had a real impact. While some countries continued to make good progress towards eradication, the number of cases in northern Nigeria began to rise, and fourteen countries which had previously been free from polio had cases imported from Nigeria. In six of those countries, polio re-established itself. It took months, and millions of dollars, to bring the outbreaks back under control.
Northern Nigeria’s vaccine boycott didn’t come from nowhere. A number of different factors contributed, including long-standing tensions between the predominantly Muslim north and Christian south, and limited access to basic healthcare, which made people suspicious of health workers turning up at their door offering something for nothing. These suspicions were not entirely unjustified. In 1996, the drug manufacturer Pfizer tested a new antibiotic on children in Kano state during a meningococcal meningitis outbreak, without informing their families. When the Washington Post reported on the trial, the company responded with an apology which sounded awfully like they weren’t sorry at all: “Pfizer is confident that no one associated with the Trovan clinical study…—conducted in Kano, Nigeria during a meningitis epidemic in 1996—ever put a patient's health at risk and that the company acted in the best interests of the children involved in the study, using the best medical knowledge available.”
None of the reasons above, of course, had any bearing on the safety of the polio vaccination programme in northern Nigeria. However, they give important context to the vaccine boycott, making it easier to understand why some states, which were responsible for implementing vaccination, would be unwilling to carry out a programme ordered by the federal government.
Vaccines, in general, are safe, otherwise they would never be approved for use, but safety is relative. When smallpox was prevalent, a death rate of one in a hundred from variolation was an acceptable risk for many. The smallpox vaccine which is being stockpiled in the USA in case of a bioterrorist attack is much safer than that, but it is still a less safe vaccine than many others in use today. In the 1960s, the USA reported that one in a million of those vaccinated against smallpox died, while ten times that number suffered life-threatening complications. Some forms of the rabies vaccine cause serious, potentially life-threatening, neurological illness in as many as 7 in 1000 people. Serious allergic reactions to the MMR vaccine (used to vaccinate children against measles, mumps and rubella) are reported in up to one in 100,000 of those vaccinated.
When we compare the safety of vaccines to the dangers of the diseases they prevent, the contrast is stark. Rubella, while a mild disease in most people – even asymptomatic in up to 50% of cases – can be devastating in pregnant women. When infection occurs during the first trimester of pregnancy, the developing baby is harmed in up to 85% of cases. Babies who survive their mothers being infected with rubella often suffer problems such as cataracts, deafness, heart defects and intellectual disabilities. Measles can lead to a number of life-threatening complications, including pneumonia and encephalitis (inflammation of the brain), especially in children under five. It kills around one in a thousand of those infected in developed countries, and can cause permanent brain damage. And rabies makes all the other diseases seem mild. Following infection, it can remain latent for months, even up to a year, but once symptoms appear, the fatality rate is virtually 100%. While it starts with flu-like symptoms, in the majority of cases these soon progress to what is known as “furious” rabies, symptoms of which include anxiety, confusion, hyperactivity, excessive salivation and difficulty swallowing, followed by coma and death. About 20% of cases are of the “paralytic” form, where the victim is overcome by a gradual paralysis – again followed by coma and death.
But the risk of dying from a disease if you catch it doesn’t tell the whole story. Rabies might be horrific, but when have you ever seen a rabid dog? Most of the diseases which can be prevented by vaccines have declined to such an extent, at least in wealthier countries, that many of us find it difficult to take them seriously. Most of us didn’t have the experience of losing siblings to infectious disease when we were growing up. Most of us haven’t helplessly watched a baby sicken and die. It’s not just a matter of statistics either. As I mentioned earlier, one of the major factors which influences our perception of risk is whether or not we can recall frightening images or stories about that risk. Until the arrival of Covid-19 most of us simply didn’t have life experiences which made us take infectious disease seriously.
At the end of the first part of this article, I quoted statistics from the World Health Organisation indicating that vaccination rates had fallen in a number of countries over the last five years. Could this simply be due to the way we perceive the risk of infectious disease? Or, as was the case in Nigeria, are there more complex issues around trust involved?
For some of the countries which have seen significant declines in vaccination rates, the answers are fairly obvious. In Samoa, two babies died in July 2018, only hours apart and just minutes after receiving their MMR vaccination. The cause was not the vaccine itself, but a grave error made by two nurses when preparing the vaccine. The nurses were sentenced to prison, and Samoa’s already low rate of measles vaccination suffered a precipitous drop. Just over a year after the deaths of the babies, an outbreak of measles killed 83 people, most of them children under four. Libya, another country where vaccination rates have declined by more than 20% in the last five years, has suffered a worsening civil war since 2014. Venezuela, which has had a 23% drop in vaccination rates, has been suffering a worsening political crisis and economic collapse.
But for other countries, the answer is less clear. Why, for example, has Brazil seen such a decline in vaccination rates? What about Haiti and Laos, both of which had 13% declines in the last five years? And what about New Zealand?
In places like New Zealand, it is easy to blame misinformation spread on social media, such as the supposed link between vaccines and autism, now thoroughly discredited. But a closer examination of the data suggests that this is only a minor consideration – the proportion of parents declining vaccines has risen only slightly, from 1% to 1.5%, over the last few years.
But something else stands out in the data too. The children missing out on vaccines are disproportionately Māori and Pasifika. And there’s a clear and consistent relationship between the level of deprivation and the likelihood of being vaccinated. The more deprived a child, the less likely they are to receive their vaccines.
The World Health Organisation, in attempting to understand barriers to vaccination, has come up with a model called the “3Cs”, which stands for confidence, complacency and convenience. Confidence, or the lack of it, in vaccines and in the health system which delivers them was the crucial factor in Nigeria and Samoa. Complacency is a factor whenever rates of infectious disease are falling. But convenience isn’t something I have written about so far. It relates to how easily people can access vaccination, for themselves and for their children.
In New Zealand, vaccinations are primarily delivered by doctors or nurses associated with general practice. But access to general practice is an increasing problem. There are issues with an aging GP workforce and the funding model. New Zealand’s vaccination schedule now recommends children are vaccinated at 6 weeks, 3 months, 5 months, 15 months, 4 years and 11 years, with a further vaccination for girls at 12 years. Vaccination might be free, but for some families just getting to the appointments is a barrier.
Right now, the world is watching and waiting, desperately hoping that a vaccine for Covid-19 will be developed which will allow a return to relative normality. The Guardian’s Coronavirus vaccine tracker shows that five vaccines have so far reached the point of large-scale efficacy trials. But the end point shown on the tracker is “vaccines approved for general use”. In reality, developing a safe and effective vaccine, and gaining regulatory approval to use it, is just the first step. The real challenge will be getting the vaccine to the people who need it.
Most articles in The Turnstone are free, but you can support my work and receive additional material, including more in-depth interviews, with a monthly or annual subscription. Click the “Subscribe now” button below for options.
If you would like to support The Turnstone with a one-off contribution, click the “Buy me a coffee” button below.