
10% Human

How Your Body's Microbes Hold the Key to Health and Happiness

Alanna Collen


About This Book

Obesity, autism, mental health problems, IBS, allergies, auto-immunity, cancer. Does the answer to the modern epidemic of ‘Western’ diseases lie in our gut?


Information

Year: 2015
ISBN: 9780007584048

ONE

Twenty-First-Century Sickness

In September 1978, Janet Parker became the last person on Earth to die of smallpox. Just 70 miles from the place where Edward Jenner had first vaccinated a young boy against the disease with cowpox pus from a milkmaid, 180 years earlier, Parker’s body played host to the virus in its final outing in human flesh. Her job as a medical photographer at the University of Birmingham in the UK would not have put her in direct jeopardy were it not for the proximity of her dark room to the laboratory beneath. As she sat ordering photographic equipment over the telephone one afternoon that August, smallpox viruses travelled up the air ducts from the Medical School’s ‘pox’ room on the floor below, and brought on her fatal infection.
The World Health Organisation (WHO) had spent a decade vaccinating against smallpox around the world, and that summer they were on the brink of announcing its complete eradication. It had been nearly a year since the final naturally occurring case of the disease had been recorded. A young hospital cook had recovered from a mild form of the virus in its final stronghold of Somalia. Such a victory over disease was unprecedented. Vaccination had backed smallpox into a corner, ultimately leaving it with no vulnerable humans to infect, and nowhere to go.
But the virus did have one tiny pocket to retreat to – the Petri dishes filled with human cells that researchers used to grow and study the disease. The Medical School of Birmingham University was one such viral sanctuary, where one Professor Henry Bedson and his team were hoping to develop the means to quickly identify any pox viruses that might emerge from animal populations now that smallpox was gone from humans. It was a noble aim, and they had the blessing of the WHO, despite inspectors’ concerns about the pox room’s safety protocols. With just a few months left before Birmingham’s lab was due to close anyway, the inspectors’ worries did not justify an early closure, or an expensive refit of the facilities.
Janet Parker’s illness, at first dismissed as a mild bug, caught the attention of infectious disease doctors a fortnight after it had begun. By now she was covered in pustules, and the possible diagnosis turned to smallpox. Parker was moved into isolation, and samples of fluid were extracted for analysis. In an irony not lost on Professor Bedson, his team’s expertise in identifying pox viruses was called upon for verification of the diagnosis. Bedson’s fears were confirmed, and Parker was moved to a specialist isolation hospital nearby. Two weeks later on 6 September, with Parker still critically ill in hospital, Professor Bedson was found dead at his home by his wife, having slit his own throat. On 11 September 1978, Janet Parker died of her disease.
Janet Parker’s fate was that of many hundreds of millions before her. She had been infected by a strain of smallpox known as ‘Abid’, named after a three-year-old Pakistani boy who had succumbed to the disease eight years previously, shortly after the WHO’s intensive smallpox eradication campaign had got under way in Pakistan. Smallpox had become a significant killer across most of the world by the sixteenth century, in large part due to the tendency of Europeans to explore and colonise other regions of the world. In the eighteenth century, as human populations grew and became increasingly mobile, smallpox spread to become one of the major causes of death around the world, killing as many as 400,000 Europeans each year, including roughly one in ten infants. With the uptake of variolation – a crude and risky predecessor of vaccination, involving intentional infection of the healthy with the smallpox fluids of sufferers – the death toll was reduced in the latter half of the eighteenth century. Jenner’s discovery of vaccination using cowpox in 1796 brought further relief. By the 1950s, smallpox had been all but eliminated from industrialised countries, but there were still 50 million cases annually worldwide resulting in over 2 million deaths each year.
Though smallpox had released its grip on countries in the industrialised world, the tyrannical reign of many other microbes continued in the opening decade of the twentieth century. Infectious disease was by far the dominant form of illness, its spread aided by our human habits of socialising and exploring. The exponentially rising human population, and with that, ever-greater population densities, only eased the person-to-person leap that microbes needed to make in order to continue their life cycle. In the United States, the top three causes of death in 1900 were not heart disease, cancer and stroke, as they are today, but infectious diseases, caused by microbes passed between people. Between them, pneumonia, tuberculosis and infectious diarrhoea ended the lives of one-third of people.
Once regarded as ‘the captain of the men of death’, pneumonia begins as a cough. It creeps down into the lungs, stifling breathing and bringing on a fever. More a description of symptoms than a disease with a sole cause, pneumonia owes its existence to the full spectrum of microbes, from tiny viruses, through bacteria and fungi, to protozoan (‘earliest-animal’) parasites. Infectious diarrhoea, too, can be blamed on each variety of microbe. Its incarnations include the ‘blue death’ – cholera – which is caused by a bacterium; the ‘bloody flux’ – dysentery – which is usually thanks to parasitic amoebae; and ‘beaver fever’ – giardiasis, again from a parasite. The third great killer, tuberculosis, affects the lungs like pneumonia, but its source is more specific: an infection by a small selection of bacteria belonging to the genus Mycobacterium.
A whole host of other infectious diseases have also left their mark, both literally and figuratively, on our species: polio, typhoid, measles, syphilis, diphtheria, scarlet fever, whooping cough and various forms of flu, among many others. Polio, caused by a virus that can infect the central nervous system and destroy nerves controlling movements, paralysed hundreds of thousands of children each year in industrialised countries at the beginning of the twentieth century. Syphilis – the sexually transmitted bacterial disease – is said to have affected 15 per cent of the population of Europe at some point in their lifetime. Measles killed around a million people a year. Diphtheria – who remembers this heart-breaker? – used to kill 15,000 children each year in the United States alone. The flu killed between five and ten times as many people in the two years following the First World War as were killed fighting in the war itself.
Not surprisingly these scourges had a major influence on human life expectancy. Back then, in 1900, the average life expectancy across the whole planet was just thirty-one years. Living in a developed country improved the outlook, but only to just shy of fifty years. For most of our evolutionary history, we humans have managed to live to only twenty or thirty years old, though the average life expectancy would have been much lower. In one single century, and in no small part because of developments in one single decade – the antibiotic revolution of the 1940s – our average time on Earth was doubled. In 2005, the average human could expect to live to sixty-six, with those in the richest countries reaching, again on average, the grand old age of eighty.
These figures are highly influenced by the chances of surviving infancy. In 1900, when up to three in ten children died before the age of five, average life expectancy was dramatically lower. If, at the turn of the next century, rates of infant mortality had remained at the level they were in 1900, over half a million children would have died before their first birthday in the United States each year. Instead, around 28,000 did. Getting the vast majority of children through their first five years unscathed allows most of them to go on and live to ‘old age’ and brings the average life expectancy up accordingly.
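To make the arithmetic concrete, here is a minimal Python sketch using hypothetical round numbers (illustrative assumptions, not figures from the book): when a large fraction of a cohort dies in early childhood, the average is dragged down sharply even though survivors routinely reach old age.

```python
# Minimal sketch with hypothetical round numbers (not the book's data):
# how early-childhood deaths drag down *average* life expectancy even
# when survivors routinely reach old age.

def average_life_expectancy(child_mortality, age_at_child_death, adult_lifespan):
    """Cohort average: a fraction dies in early childhood, the rest reach adulthood."""
    return (child_mortality * age_at_child_death
            + (1 - child_mortality) * adult_lifespan)

# Roughly 1900-like assumptions: 30% die before age five, survivors reach ~65.
print(average_life_expectancy(0.30, 2, 65))    # ~46 years
# Roughly modern assumptions: 0.4% die in infancy, survivors reach ~80.
print(average_life_expectancy(0.004, 1, 80))   # ~79.7 years
```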
Though the effects are far from fully felt in much of the developing world, we have, as a species, gone a long way towards conquering our oldest and greatest enemy: the pathogen. Pathogens – disease-causing microbes – thrive in the unsanitary conditions created by humans living en masse. The more of us we cram onto our planet, the easier it becomes for pathogens to make a living. By migrating, we give them access to yet more humans, and in turn, more opportunity to breed, mutate and evolve. Many of the infectious diseases we have contended with in the last few centuries originated in the period after early humans had left Africa and set up home across the rest of the world. Pathogens’ world domination mirrored our own; few species have as loyal a pathogenic following as us.
For many of us living in more developed countries, the reign of infectious diseases is confined to the past. Just about all that remain of thousands of years of mortal combat with microbes are memories of the sharp prick of our childhood immunisations followed by the ‘reward’ of a polio-vaccine-infused sugar lump, and perhaps more clearly, the melodramatic queues outside the dinner hall as we waited with our school friends for a teenage booster shot. For many children and teenagers growing up now, the burden of history is even lighter, as not only the diseases themselves, but once-routine vaccinations, such as the dreaded ‘BCG’ for tuberculosis, are no longer necessary.
Medical innovations and public health measures – largely those of the late nineteenth and early twentieth centuries – have made a profound difference to life as a human. Four developments in particular have taken us from a two-generation society to a four-, or even five-generation society in just one, long, lifetime. The first and earliest of these, courtesy of Edward Jenner and a cow named Blossom, is, of course, vaccination. Jenner knew that milkmaids were protected from developing smallpox by virtue of having been infected by the much milder cowpox. He thought it possible that the pus from a milkmaid’s pustules might, if injected into another person, transfer that protection. His first guinea pig was an eight-year-old boy named James Phipps – the son of Jenner’s gardener. Having inoculated Phipps, Jenner went on to attempt to infect the brave lad, twice injecting pus from a true smallpox infection. The young boy was utterly immune.
Beginning with smallpox in 1796, and progressing to rabies, typhoid, cholera and plague in the nineteenth century, and dozens of other infectious diseases since 1900, vaccination has not only protected millions from suffering and death, but has even led to countrywide elimination or complete global eradication of some pathogens. Thanks to vaccination, we no longer have to rely solely on our immune systems’ experiences of full-blown disease to defend us against pathogens. Instead of acquiring natural defences against diseases, we have circumvented this process using our intellect to provide the immune system with forewarning of what it might encounter.
Without vaccination, the invasion of a new pathogen prompts sickness and possibly death. The immune system, as well as tackling the invading microbe, produces molecules called antibodies. If the person survives, these antibodies form a specialist team of spies that patrol the body looking out specifically for that microbe. They linger long after the disease has been conquered, primed to let the immune system know the moment there is a reinvasion of the same pathogen. The next time it is encountered, the immune system is ready, and the disease can be prevented from taking hold.
Vaccination mimics this natural process, teaching the immune system to recognise a particular pathogen. Instead of suffering the disease to achieve immunity, now we suffer only the injection, or oral administration, of a killed, weakened or partial version of the pathogen. We are spared illness but our immune systems still respond to the introduction of the vaccine, and produce antibodies that help the body to resist disease if the same pathogen invades for real.
Society-wide vaccination programmes are designed to bring about ‘herd immunity’ by vaccinating a large enough proportion of the population that contagious diseases cannot continue their spread. They have meant that many infectious diseases are almost completely eliminated in developed countries, and one, smallpox, has been totally eradicated. Smallpox eradication, as well as dropping the incidence of the disease from 50 million cases a year worldwide to absolutely none in little more than a decade, has saved governments billions in both the direct cost of vaccination and medical care, and the indirect societal costs of illness. The United States, which contributed a disproportionately large amount of money to the global eradication effort, recoups its investment every twenty-six days in unspent costs. Governmental vaccination schemes for a dozen or so other infectious diseases have dramatically reduced the number of cases, reducing suffering and saving lives and money.
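What counts as a ‘large enough proportion’ can be made concrete with the standard herd-immunity threshold from epidemiology textbooks (a general formula, not one given in the book): if one case infects R0 others in a fully susceptible population, sustained spread stops once more than 1 - 1/R0 of people are immune. A brief Python sketch, with rough illustrative R0 values assumed for the comparison:

```python
# Minimal sketch of the standard herd-immunity threshold, 1 - 1/R0.
# R0 = average number of people one case infects in a fully susceptible
# population; the R0 values below are rough illustrative assumptions.

def herd_immunity_threshold(r0):
    """Immune fraction above which each case infects fewer than one new person."""
    return 1 - 1 / r0

for disease, r0 in [("influenza-like (R0 ~ 2)", 2),
                    ("smallpox-like (R0 ~ 6)", 6),
                    ("measles-like (R0 ~ 15)", 15)]:
    print(f"{disease}: roughly {herd_immunity_threshold(r0):.0%} of people must be immune")
```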
Today, most countries in the developed world run vaccination programmes against ten or so infectious diseases, and half a dozen are marked for regionwide elimination or global eradication by the World Health Organisation. These programmes have had a dramatic effect on the incidence of these diseases. Before the worldwide eradication programme for polio began in 1988, the virus affected 350,000 people a year. In 2012, the disease was confined to just 223 cases in only three countries. In just twenty-five years, around half a million deaths have been prevented and 10 million children who would have been paralysed are free to walk and run. Likewise for measles and rubella: in a single decade, vaccination against these once-common diseases has prevented 10 million deaths worldwide. In the United States, as in most of the developed world, the incidence of nine major childhood diseases has been reduced by 99 per cent by vaccination. In developed countries, for every 1,000 babies born alive in 1950, around forty would die before their first birthday. By 2005, that figure had been reduced by an order of magnitude, to about four. Vaccination is so successful that only the oldest members of Western society can remember the horrendous fear and pain of these deadly diseases. Now, we are free.
After the development of the earliest vaccines came a second major health innovation: hygienic medical practice. Hospital hygiene is something we are still under pressure to improve today, but in comparison with the standards of the late nineteenth century, modern hospitals are temples of cleanliness. Imagine, instead, wards crammed full with the sick and dying, wounds left open and rotting, and doctors’ coats covered in the blood and gore of years of surgeries. There was little point in cleaning – infections were thought to be the result of ‘bad air’, or miasma, not germs. This toxic mist was thought to rise from decomposing matter or filthy water – an intangible force beyond the control of doctors and nurses. Microbes had been discovered 150 years previously, but the connection had not been made between them and disease. It was believed that miasma could not be transferred by physical contact, so infections were spread by the very people charged with curing them. Hospitals were a new invention, born of a drive towards public health care and a desire to bring ‘modern’ medicine to the masses. Despite the good intentions, they were filthy incubators for disease, and those attending them risked their lives for the treatment they needed.
Women suffered most as a result of the proliferation of hospitals, as the risks of labour and giving birth, rather than falling, actually rose. By the 1840s, up to 32 per cent of women giving birth in hospital would subsequently die. Doctors – all male at that time – blamed their deaths on anything from emotional trauma to uncleanliness of the bowel. The true cause of this horrifyingly high death rate would at last be unravelled by a young Hungarian obstetrician by the name of Ignaz Semmelweis.
At the hospital where Semmelweis worked, the Vienna General, women in labour were admitted on alternate days into two different clinics. One was run by doctors, and the other by midwives. Every second day, as Semmelweis walked to work, he’d see women giving birth on the street outside the hospital doors. On those days, it was the turn of the clinic run by doctors to admit labouring women. But the women knew the odds for their survival would not be good if they could not hold on until the following day. Childbed fever – the cause of most of the deaths – lurked in the doctors’ clinic. So they waited, cold and in pain, in the hope that their baby would delay its entrance to the world until after midnight had struck.
Getting admitted to the midwife-run clinic was, relatively speaking, a far safer proposition. Between 2 and 8 per cent of new mothers would die of childbed fever in the care of midwives – far fewer than succumbed in the doctors’ clinic.
Despite his junior status, Semmelweis began to look for differences between the two clinics that might explain the death rates. He thought overcrowding and the climate of the ward might be to blame, but found no evidence of any difference. Then, in 1847, a close friend and fellow doctor, Jakob Kolletschka, died after being accidentally cut by a student’s scalpel during an autopsy. The cause of death: childbed fever.
After Kolletschka’s death, Semmelweis had a realisation. It was the doctors who were spreading death among the women in their ward. Midwives, on the other hand, were not to blame. And he knew why. Whilst their patients laboured, the doctors would pass the time in the morgue, teaching medical students using human cadavers. Somehow, he thought, they were carrying death from the autopsy room to the maternity ward. The midwives never touched a corpse, and the patients dying on their ward were probably those whose post-natal bleeding meant a visit from the doctor.
Semmelweis had no clear idea of the form that death was taking on its passage from the morgue to the maternity ward, but he had an idea of how to stop it. To rid themselves of the stench of rotting flesh, doctors often washed with a solution of chlorinated lime. Semmelweis reasoned that if it could remove the smell, perhaps it could remove the vector of death as well. He instituted a policy that doctors must wash their hands in chlorinated lime between conducting autopsies and examining their patients. Within a month, the death rate in his clinic had dropped to match that of the midwives’ clinic.
Despite the dramatic results Semmelweis achieved in Vienna and later in two hospitals in Hungary, he was ridiculed and ignored by his contemporaries. The stiffness and stench of a surgeon’s scrubs were said to be a mark of his experience and expertise. ‘Doctors are gentlemen, and gentlemen’s hands are clean,’ said one leading obstetrician at the time, all the while infecting and killing dozens of women each month. The mere notion that doctors could be responsible for bringing death, not life, to their patients caused huge offence, and Semmelweis was cast out of the establishment. Women continued to risk their lives giving birth for decades, as they paid the price of the doctors’ arrogance.
Twenty years later, the great Frenchman Louis Pasteur developed the germ theory of disease, which attributed infection and illness to microbes, not miasma. In 1884, Pasteur’s theory was proved by the elegant experiments of the German Nobel prize-winning doctor Robert Koch. By this time, Semmelweis was long dead. He had become obsessed by childbed fever, and had gone mad with rage and desperation. He railed against the establishment, pushing his theories and accusing his contemporaries of being irresponsible murderers. He was lured by a colleague to an insane asylum, under the pretence of a visit, then forced to drink castor oil and beaten by the guards. Two weeks later, he died of a fever, probably from his infected wounds.
Nonetheless, germ theory was the breakthrough that gave Semmelweis’s observations and policies a truly scientific explanation. Steadily, antiseptic hand-washing was adopted by surgeons across Europe. Hygienic practices became common after the work of the British surgeon Joseph Lister. In the 1860s, Lister read of Pasteur’s work on microbes and food, and decided to experiment with chemical solutions on wounds to reduce the risk of gangrene and septicaemia. He used carbolic acid, which was known to stop wood from rotting, to wash his instruments, soak dressings and even to clean wounds during surgery. Just as Semmelweis had ach...
