Superbugs

An Arms Race against Bacteria

William Hall, Anthony McDonnell, Jim O'Neill

About This Book

Antibiotics are powerful drugs that can prevent and treat infections, but they are becoming less effective as a result of drug resistance. Resistance develops because the bacteria that antibiotics target can evolve ways to defend themselves against these drugs. When antibiotics fail, there is very little else to prevent an infection from spreading.

Unnecessary use of antibiotics in both humans and animals accelerates the evolution of drug-resistant bacteria, with potentially catastrophic personal and global consequences. Our best defenses against infectious disease could cease to work, surgical procedures would become deadly, and we might return to a world where even small cuts are life-threatening. The problem of drug resistance already kills over one million people across the world every year and has huge economic costs. Without action, this problem will become significantly worse.

Following from their work on the Review on Antimicrobial Resistance, William Hall, Anthony McDonnell, and Jim O'Neill outline the major systematic failures that have led to this growing crisis. They also provide a set of solutions to tackle these global issues that governments, industry, and public health specialists can adopt. In addition to personal behavioral modifications, such as better handwashing regimens, Superbugs argues for mounting an offense against this threat through agricultural policy changes, an industrial research stimulus, and other broad-scale economic and social incentives.


I

The Problem of Drug Resistance

One

When a Scratch Could Kill

For many readers of this book, particularly those who live in high-income countries and were born after the Second World War, infectious diseases may not be a source of anxiety. Diseases like Ebola or Zika, both caused by viruses, still capture headlines. But in high-income countries, people’s experiences of infectious illness consist mostly of occasional bouts of the common cold and other self-limiting or easily treatable infections. The perceived security—some would say complacency—that we enjoy is the product of growing up in an era of ready access to effective antibiotics, the protection afforded by good health care, and high standards of hygiene in public and in the home.
Of course, infectious diseases—both long-established ones, such as tuberculosis, and more modern ones, such as AIDS—remain real and pervasive threats in many parts of the world, particularly in low- and middle-income countries that lack a reliable public health infrastructure and access to essential medicines. But for most of the developed world, patterns of disease have shifted profoundly over the course of the past century.
In this chapter, we explore the extent to which the treatment of infectious diseases in Europe and North America has changed during the past two centuries. We consider how a series of breakthroughs in scientific research and understanding—beginning with the development of the “germ theory” in the late nineteenth century and culminating in the discovery and development of effective modern antibiotics during the first half of the twentieth century—have shaped the world we live in today and laid the foundations for the crisis we face, as we find ourselves on the cusp of what Margaret Chan, former director general of the World Health Organization, has described as a “postantibiotic era.”

The Pre-Antibiotic Era: Experiences and Understanding of Infection

In developed, high-income countries today, the burden of infectious disease—in terms of both disease incidence and mortality—is low. A typical adult in the United States is more likely to die of accidental or violent causes than of any type of infectious disease. In the world as a whole, noncommunicable conditions (such as cancers and heart disease) cause four times as many deaths as infectious diseases. An American born in 2015 can expect to live to the age of nearly eighty, while a typical sixty-five-year-old baby boomer now approaching retirement age can expect to enjoy another nineteen years of life.
These statistics are in startling contrast to the life expectancy, patterns of illness, and causes of mortality in western Europe and North America during the nineteenth and early twentieth centuries. For example, in 1841, average life expectancy in England (then one of the most prosperous countries on earth) was just forty-one years. Within that country’s major cities life expectancy was even lower—a child born in Manchester faced a meagre life expectancy of twenty-five years. By 1900, life expectancy at birth in the United States was still only forty-seven years, and approximately a third of all deaths there were from tuberculosis, pneumonia, or gastroenteritis.1 At the beginning of the twentieth century, one American infant in ten died before their first birthday. For every thousand live births, between six and nine mothers died during or shortly after childbirth. Sepsis—bacterial infection of the bloodstream—was responsible for forty percent of these deaths.
Following the industrial revolution that took place in the United Kingdom and many parts of western Europe and North America in the nineteenth century, urban populations grew quickly, often resulting in squalid and overcrowded living conditions. In dense and unsanitary living quarters, with limited sewerage and access only to shared—and often polluted—drinking water sources, air- and waterborne diseases spread easily. The poor suffered most.
However, affluence and status could not completely protect people against infectious disease. An example from American presidential history concerns President Calvin Coolidge’s family. On a hot afternoon in 1924, just a few days before the annual July Fourth celebrations, Coolidge’s two sons, John and Calvin Jr., spent the afternoon playing tennis on the White House grounds. The younger of the two, sixteen-year-old Calvin Jr., began to suffer pain from a blister on his toe, probably as a result of wearing his tennis shoes without socks. The blister became infected. Calvin Jr. developed a fever, and his condition rapidly deteriorated over the next few days. He was transferred to the Walter Reed Medical Center, suffering from Staphylococcus aureus blood poisoning. The president wrote in a letter to his father on July 4, “Calvin is very sick. Of course he has all that medical science can give but he may have a long sickness with ulcers, then again he may be better in a few days.” Sadly, the teenager died on July 7, just a week after his game of tennis. Even privilege and access to the very best medical care of the day could offer no defense against death from an injury and subsequent infection.
This poignant story is just one of many notable examples that illustrate the reality of disease in the pre-antibiotic era: infectious diseases were a blight across all sections of society, even affecting the affluent and otherwise young and healthy. A seemingly innocuous scratch or cut truly could kill.
During the nineteenth century, tuberculosis (TB), also called consumption, or the “white plague,” was a pervasive affliction. TB was common across all sections of society, especially in dense urban areas. In major European cities such as London, Paris, and Stockholm, annual mortality rates from TB were 800–1,000 deaths per 100,000. The disease accounted for 40 percent of deaths among the urban poor, with latent TB infection rates estimated at between 70 and 100 percent in some urban areas. The death rate among those who developed active TB infections was 80 percent.
The best efforts to treat TB were offered by the so-called sanatorium movement, whose proponents contended that the illness could be cured through relocation to places where the air was cleaner and circulated more freely. An extended stay at a sanatorium—typically located in a mountainous, rural, or seaside location—was often prescribed for more affluent TB patients, but in reality it offered little more than symptomatic relief from the disease. Nonetheless, considerable faith was placed in the curative ability of sanatoria, even after the discovery of the bacterium that causes TB (described later in this chapter). Although experimental treatments based on new discoveries emerged in the early 1900s, the recovery rates from TB remained persistently low, and the majority of patients “treated” in sanatoria were dead within five years of their discharge.
One of the authors’ own family members, Joyce Pickard, spent three and a half years of her childhood in a British hospital that specialized in the treatment of children afflicted with nonpulmonary TB (see Figure 1.1). She lost the ability to walk and became permanently disabled by a severe TB infection in her hip bones. From late 1937 to the summer of 1941, she was almost completely confined to her bed in a hospital on the south coast of England. More than two hundred miles from her home in Yorkshire, she had little exposure to the outside world. Without antibiotics there was no way to treat the underlying infection effectively; her treatment consisted of isolation and fresh air. Her bed would be wheeled outdoors even in the depths of winter, when she recalled that she would develop chilblains on her hands from the cold, and doctors would do their ward rounds while shivering in heavy overcoats. Such a protracted period of treatment, and the confinement it involved, seems extraordinary to us today, particularly since the disease is now both preventable and treatable.
Fig. 1.1. A British hospital for children with tuberculosis in the 1930s. Credit: “Sun Therapy at Alton Hospital,” Wellcome Collection (CC BY 4.0).
This picture of infectious disease has changed almost entirely, thanks to a series of breakthroughs in our understanding of these diseases and, subsequently, in the discovery of antibiotics to treat them. Since the mid-nineteenth century, average life expectancy has increased by a greater amount, and more rapidly, than in any other period in human history, and this increase is attributable to the decline in infectious disease. But with the rise in antibiotic resistance potentially rendering vital treatments ineffective, are we facing a postantibiotic era that resembles this pre-antibiotic past?

The Development of the Germ Theory

We take it for granted today that—with a few exceptions—the causes of infectious diseases are well understood. These illnesses are spread from person to person, and through the environment around us, by a pathogen of one sort or another, of which bacteria and viruses are the two most common. Bacteria are single-celled organisms that live within us and around us in vast quantities, usually coexisting with us benignly (and even helping to keep us healthy), but in certain circumstances colonizing our bodies, multiplying uncontrollably, and causing illness. Viruses, meanwhile, are particles around a hundred times smaller than bacteria; they penetrate our own cells and multiply within them to cause diseases ranging from the common cold and influenza to rabies and Ebola.
But these facts have not always been known, and it was only a series of significant breakthroughs during the late nineteenth and early twentieth centuries that allowed scientists to properly understand the causes of many infectious diseases. Before the middle of the nineteenth century, people recognized that diseases like cholera were contagious—that is, they spread within a population—but the cause of the contagion was unknown. Many theories emphasized the role of “miasmas,” foul-smelling vapors that were thought to transmit disease, exacerbated by environmental conditions and proximity to “filth.” Such theories did account for certain key characteristics of the diseases—in the case of cholera, for example, areas hit by outbreaks would be dogged by foul stenches and conditions, and disease outbreaks were more likely to occur during hot summer months than in the cold of winter. However, the source of the contagion was often misidentified. For instance, conventional wisdom blamed typhoid on fecal contamination and poor personal hygiene, but little attention was paid to the role of contaminated drinking water or food. Another common hypothesis stated that some kind of predisposition—whether by birth or acquired through an individual’s environment and circumstances—was crucial to the development of the disease, downplaying the importance of exposure to the infection and how it occurs.
All of these ideas changed with the emergence of the germ theory, which posited that living microorganisms cause infectious diseases.
One of the founders of the germ theory was Louis Pasteur, a chemist who taught in Lille, in northern France, in the mid-nineteenth century. Lille was in an agricultural region and was the center for the industrial fermentation of sugar beets into alcohol. Pasteur became interested in the process of fermentation, and by means of a high-resolution microscope he was able to observe and identify the different microorganisms responsible for fermentation and putrefaction in food. He realized that these microorganisms got into food through contamination from the environment rather than arising spontaneously within it, as had previously been thought: they came from the outside, not the inside. Could the same process—exposure to microorganisms—be responsible for certain diseases?
The other founder of the germ theory was Robert Koch, a German physician and biologist. In the late 1870s, Koch published his investigations into bacteria as disease-causing microorganisms. He demonstrated the process by isolating pathogens from diseased animals and inoculating them into healthy animals, showing how diseases could be transmitted. In 1882 Koch successfully isolated the tubercle bacillus, which he identified as the cause of all tuberculosis illness. This marked a significant step forward in the understanding of the cause and mechanism of transmission of TB.
In 1854, around the same time as Pasteur’s early work on microorganisms, a London-based physician named John Snow did pioneering work on cholera that elucidated the source of a major cholera outbreak in London’s Soho district. Snow rigorously tracke...
