1 PATIENT ZERO
"Get the patient into isolation now," the doctor ordered. The patient was a woman in her mid-seventies from nearby Reno, Nevada. A few days earlier, she had arrived at the ER disoriented and running a fever. Temperatures that day had topped out at a stifling 100 degrees. The 2016 summer had been brutal even by the standards of a county that included a torrid stretch of Black Rock Desert.
When doctors learned she had recently returned from a trip to India they suspected that the rigors of a twenty-hour flight, coupled with the heat, had left her severely dehydrated.1 A couple of days of intravenous fluids should make her as good as new.
They became increasingly alarmed, however, the following day. Her temperature spiked to 102, her pulse raced at nearly 100 beats a minute, and her breathing became labored. Blood tests revealed an abnormally high white blood cell count. That prompted a new diagnosis: systemic inflammatory response syndrome. Although her physicians could not identify an underlying infection, they thought it likely that her body's extreme immune response had somehow poisoned her own blood. They administered intravenous antibiotics to prevent irreversible organ damage.
There was no improvement. After another thirty-six hours, the doctors ordered more testing to hunt for the culprit they had missed in their initial blood and fluid screens. That test result startled her physicians. The infection was carbapenem-resistant Enterobacteriaceae (CRE), a typically benign intestinal bacterium that becomes a treacherous supergerm after it enters a patient's bloodstream or lungs. It then overwhelms the body's immune system.2
This CRE diagnosis was particularly alarming. Discovered only in 2008 in New Delhi, it had established itself in under a decade as the most lethal supergerm, killing half of the patients it infects.3 The head of the Centers for Disease Control and Prevention described it as a "nightmare bacteria." CRE has mutated into newer strains, some of which have enhanced resistance to the class of antibiotics that were traditionally the last line of defense.4
Doctors had known about superbugs for decades, but before the CRE variant most had dismissed them as a serious threat. No longer does anyone in medicine or pharmaceuticals underestimate supergerms. A series of sobering reports in medical journals laid bare the extent to which they have spread and wreaked havoc.5 In 2010, the first year of reliable U.S. statistics, supergerms infected more than two million Americans and killed 23,000. Three years later the Centers for Disease Control and Prevention (CDC) issued an alert that the infection rate was accelerating much faster than epidemiologists had forecast. The president of the Infectious Diseases Society of America labeled CRE "an urgent threat" to America's health care system.6 As bad as the crisis was in the United States, it was far grimmer in many poor countries where superbugs thrive in unsanitary conditions.7
The Nevada doctors understood that hospitals and nursing homes are ground zero for rapacious germs such as CRE. They are the ideal breeding environment for superbugs to infect patients at high risk, those with immune systems weakened by other illnesses or drug therapies. The supergerms also spread easily through breathing machines, IV needles, catheters, even blood pressure cuffs. CRE thrives on everything from light switches, doorknobs, and toilets to the unwashed hands of health care workers.8
The typical five-to-seven-day course of oral antibiotics usually prescribed for bacterial infections has no effect on CRE. Instead, doctors must overwhelm and eliminate all traces of it with a small class of ultra-powerful antibiotics dispensed through an intravenous drip. The Nevada physicians went from concern to alarm when additional tests showed the strain of CRE ravaging her body was resistant to all fourteen antibiotics the hospital stocked.
The state's senior epidemiologists dispatched a sample of the bug to CDC headquarters in Atlanta. There, scientists watched with dread as further testing demonstrated the Nevada strain was resistant to an additional twelve antibiotics, including some that had never before failed to stop a superbug.9 The Nevada doctors were helpless as their patient deteriorated, went into septic shock, and died two weeks to the day after her ER admission.
Public health officials delayed reporting the death of Patient Zero until January 2017. The news of a superbug resistant to every available antibiotic kicked off sensational tabloid coverage with "end of the world" headlines. The resistant superbug overshadowed another pharma-related story that broke that same month. In January, states had begun reporting their drug overdose statistics for the previous year. They confirmed that America's multiyear opioid addiction crisis had worsened. Over 63,000 had died in 2016, a 20 percent spike from the previous year, which itself had been a record. More people died of drugs in 2016 than had in car accidents, gun violence, or AIDS during their peak years.10 The state of emergency that two dozen governors had declared seemed to make little difference. Their overdose rates were up by double digits.11 Opioid-based prescription painkillers were involved in two thirds of the deaths.
Addiction did not discriminate between rich and poor, black and white, men and women. It affected big cities (Philadelphia's medical examiner reported a grim record of thirty-five dead in three days) as well as some of the poorest stretches of Appalachia.
Just before the media was transfixed by Patient Zero and the invincible superbug, The Washington Post had run the final installment of a series on the opioid crisis. It was about Chillicothe, a historic Ohio town of 21,000 nestled along the Scioto River.12 Residents used to call it "Mayberry," boasting that it was a postcard for the best of small-town American life. Easy access to prescription opioids changed it. A doctor who had run a local pill mill, where painkillers are dispensed for cash without any questions or exams, had been sentenced to four consecutive life terms for the deaths of "patients" killed by his reckless overprescribing. Chillicothe's forty drug deaths in 2016 were a record and triple the number of a couple of years earlier.
The city coroner said he dreaded talking "to one more parent who's lost a child." Firefighters, EMS workers, doctors, police, hospital workers, victims' families, all were succumbing to "opiate fatigue." The coroner had almost quit on a day he described as "the Zombie Apocalypse." Chillicothe's police and paramedics had responded to thirteen overdoses. A 911 call from a gas station attendant reported a woman passed out in the driver's seat of an idling car. When the police arrived, they found an infant girl in the backseat. That child was one of seven under the age of ten handed over that day to Child Welfare.
Chillicothe is part of the collateral damage for an industry that not only created the nation's most lethal drug crisis but allowed it to flourish mostly unchecked for twenty years. Its origins are in the addictive drugs that 150 years earlier were the core DNA of the pharmaceutical industry. Some of today's largest drug companies got their start selling then-legal heroin, morphine, cannabis, and cocaine-based medications that returned staggering profits.
Money, however, is only part of the answer. The pharmaceutical industry's relationship with its federal regulators at the FDA alternates between contentious and too cozy. And for sixty years it has prevailed in a bitter battle over whether laboratory discoveries should be rewarded with exclusive patents and long monopolies. Most important, the drug business has turned America into a medicated society. Successive waves over the decades of so-called wonder drugs, some real and some hyped, have resulted in huge profits while also creating tens of millions of dependent patients waiting for the next pill to solve an ever-expanding range of illnesses and disorders.
Big Pharma likes to portray itself as a quasi-public trust focused on curing illnesses and saving lives; its profits, while large, are justified, it contends, by the great cost of research and development. Its critics cast pharma as a veritable evil empire in which money trumps health. Wild conspiracy theories have flourished: that the industry has developed and hidden a cancer cure, or that it pushes autism-causing vaccines, all to make more money.
The truth about today's pharmaceutical companies and what truly motivates them is found in part through the history of their origins and growth. Understanding how today's dominant mega-companies developed explains why the creative science that was the industry's early hallmark is under assault.13 Big Pharma resides at the intersection of public health and free enterprise. Only by knowing its history is it possible to fully appreciate how the battle between noble ambitions and greed is a permanent conflict.
2 THE POISON SQUAD
The American pharmaceutical industry emerged in the mid-nineteenth century in response to an unprecedented surge in demand for antiseptics and painkillers for combat troops. The Mexican-American War that ended in 1848 had taught the United States a painful lesson. Adulterated medications meant that frontline soldiers died needlessly; the failure to treat dysentery, yellow fever, infections, and cholera resulted in 87 percent of the fatalities.1 Many who survived also suffered unnecessarily since the painkillers sent to treat battlefield wounds were often defective. No American company was then capable of large-scale manufacturing of morphine, the eraâs most powerful painkiller.
It had only been forty years since a twenty-one-year-old German pharmacist's apprentice had isolated the morphine alkaloid from the opium poppy. He named it morphium, after Morpheus, the Greek god of dreams, but his findings were mostly ignored after he published them in a little-read medical journal.2 It was a decade before a French chemist realized its importance and not until 1827 that Heinrich Emanuel Merck sold a standardized dose of morphine at his Engel-Apotheke (Angel Pharmacy) in Darmstadt, Germany. Morphine was inexpensive to produce and it became a key product at several new family-run German companies, including Ernst Christian Friedrich Schering's eponymously named Berlin company and Friedrich Bayer's chemical factory in Wuppertal.3
A year after the Mexican-American War, two German American cousins used $2,500 in savings and a $1,000 mortgage to launch Charles Pfizer and Company. It was a chemicals business in a two-story brick building on Brooklyn's Bartlett Street.4 Their timing was good. Once the Civil War began, Pfizer had trouble keeping up with the demand for morphine.
Pfizer's competition came from Edward Robinson Squibb, who had opened E. R. Squibb & Sons, a pharmaceutical manufacturing plant, also in Brooklyn. Squibb was personally aware of the importance of quality and consistency in drug production. As a wartime naval surgeon he had tossed overboard crates of substandard medications sent to the front.5 A year before the Civil War, pharmacist brothers John and Frank Wyeth opened a Philadelphia pharmacy and drug distributorship. The contract they got to supply medicine to the Union Army was so lucrative that after the war they sold their pharmacy and focused on mass-manufacturing drugs.6
Morphine was the most effective painkiller but not the only one. Dr. Samuel Pearce Duffield, chief of Detroit's health department, sold an ether and alcohol solution to Union troops. When he retired in 1871, an ex-copper miner turned investor, Hervey Coke Parke, and the company's twenty-six-year-old salesman, George Solomon Davis, incorporated Parke-Davis.
Eli Lilly, a chemist, missed the opportunity to cash in on the Civil War demand for morphine, but as a Union Army colonel he learned how critical medications were to the war effort. He left the military convinced that his future lay in the eponymously named laboratory that started manufacturing drugs in 1876.7 Two other American pharmacists, Silas Mainville Burroughs and Henry Solomon Wellcome, also saw opportunity in the drug business. Deciding there was less competition in Britain than in the United States, they launched Burroughs Wellcome in London in 1880. It manufactured everything from cod liver oil to malt preparations to face creams to opiate-based pain compounds.8
Those pioneers entered a drug industry in its infancy. The highly addictive nature of their products, coupled with no government oversight and regulation, was good for sales. They also benefited from the era's ignorance about what caused illnesses and chronic diseases or how to treat them. It had been only a few decades since French chemist Louis Pasteur had proven with a series of experiments on spoiled meat and sour milk the existence of microbes too small for the human eye to see. The emergence of "ger...