Meltdown

Financial Times' best business books of the year, 2018

Chris Clearfield, András Tilcsik


About This Book

Financial Times' best business books of the year, 2018

'Endlessly fascinating, brimming with insight, and more fun than a book about failure has any right to be.' - Charles Duhigg, author of The Power of Habit

A groundbreaking exploration of how complexity causes failure in business and life - and how to prevent it.

An accidental overdose in a state-of-the-art hospital. The Post Office software that led to a multimillion-pound lawsuit. The mix-up at the 2017 Oscars ceremony. An overcooked meal on holiday. At first glance, these events have little in common. But surprising new research shows that many modern failures share similar causes. In Meltdown, world-leading experts in disaster prevention, Chris Clearfield and András Tilcsik, use real-life examples to reveal the errors in thinking, perception, and system design that lie behind both our everyday errors and disasters like the Fukushima nuclear accident.

But most crucially, Meltdown is about finding solutions. It reveals why ugly designs make us safer, how a five-minute exercise can prevent billion-dollar catastrophes, why teams with fewer experts are better at managing risk, and why diversity is one of our best safeguards against failure. The result is an eye-opening and empowering book - one that will change the way you see our complex world and your own place within it.


Information

Year
2018
ISBN
9781786492258

Part One

FAILURE ALL AROUND US

Chapter One

THE DANGER ZONE

“Oh this will be fun.”

I.

The Ventana Nuclear Power Plant lies in the foothills of the majestic San Gabriel Mountains, just forty miles east of Los Angeles. One day in the late 1970s, a tremor rattled through the plant. As alarms rang and warning lights flashed, panic broke out in the control room. On a panel crowded with gauges, an indicator showed that coolant water in the reactor core had reached a dangerously high level. The control room crew, employees of California Gas and Electric, opened relief valves to get rid of the excess water. But in reality, the water level wasn’t high. In fact, it was so low that the reactor core was inches away from being exposed. A supervisor finally realized that the water level indicator was wrong—all because of a stuck needle. The crew scrambled to close the valves to prevent a meltdown of the reactor core. For several terrifying minutes, the plant was on the brink of nuclear disaster.
“I may be wrong, but I would say you’re probably lucky to be alive,” a nuclear expert told a couple of journalists who happened to be at the plant during the accident. “For that matter, I think we might say the same for the rest of Southern California.”
Fortunately, this incident never actually occurred. It’s from the plot of The China Syndrome, a 1979 thriller starring Jack Lemmon, Jane Fonda, and Michael Douglas. It was sheer fiction, at least according to nuclear industry executives, who lambasted the film even before it was released. They said the story had no scientific credibility; one executive called it a “character assassination of an entire industry.”
Michael Douglas, who both coproduced and starred in the film, disagreed: “I have a premonition that a lot of what’s in this picture will be reenacted in life in the next two or three years.”
It didn’t take that long. Twelve days after The China Syndrome opened in theaters, Tom Kauffman, a handsome twenty-six-year-old with long red hair, arrived for work at the Three Mile Island Nuclear Generating Station, a concrete fortress built on a sandbar in the middle of Pennsylvania’s Susquehanna River. It was 6:30 on a Wednesday morning, and Kauffman could tell that something was wrong. The vapor plumes coming from the giant cooling towers were much smaller than normal. And as he was receiving his security pat down, he could hear an emergency alarm. “Oh, they’re having some problem down there in Unit Two,” the guard told him.
Inside, the control room was crowded with operators, and hundreds of lights were flashing on the mammoth console. Radiation alarms went off all over the facility. Shortly before 7:00 a.m., a supervisor declared a site emergency. This meant there was a possibility of an “uncontrolled release of radioactivity” in the plant. By 8:00 a.m., half of the nuclear fuel in one of the plant’s two reactors had melted, and by 10:30 a.m., radioactive gas had leaked into the control room.
It was the worst nuclear accident in American history. Engineers struggled to stabilize the overheated reactor for days, and some officials feared the worst. Scientists debated whether the hydrogen bubble that had formed in the reactor could explode, and it was clear that radiation would kill anyone who got close enough to manually open a valve to remove the buildup of volatile gas.
After a tense meeting in the White House Situation Room, President Carter’s science aide took aside Victor Gilinsky, the commissioner of the Nuclear Regulatory Commission, and quietly suggested that they send in terminal cancer patients to release the valve. Gilinsky looked him over and could tell he wasn’t joking.
The communities around the plant turned into ghost towns as 140,000 people fled the area. Five days into the crisis, President Carter and the First Lady traveled to the site to quell the panic. Wearing bright yellow booties over their shoes to protect themselves from traces of radiation on the ground, they toured the plant and reassured the nation. The same day, engineers figured out that the hydrogen bubble posed no immediate threat. And once coolant was restored, the core temperature began to fall, though it took a whole month before the hottest parts of the core started to cool. Eventually, all public advisories were lifted. But many came to think of Three Mile Island as a place where our worst fears almost came to pass.
The Three Mile Island meltdown began as a simple plumbing problem. A work crew was performing routine maintenance on the nonnuclear part of the plant. For reasons that we still don’t totally understand, the set of pumps that normally sent water to the steam generator shut down. One theory is that, during the maintenance, moisture accidentally got into the air system that controlled the plant’s instruments and regulated the pumps. Without water flowing to the steam generator, it couldn’t remove heat from the reactor core, so the temperature increased and pressure built up in the reactor. In response, a small pressure-relief valve automatically opened, as designed. But then came another glitch. When pressure returned to normal, the relief valve didn’t close. It stuck open. The water that was supposed to cover and cool the core started to escape.
An indicator light in the control room led operators to believe that the valve was closed. But in reality, the light showed only that the valve had been told to close, not that it had closed. And there were no instruments directly showing the water level in the core, so operators relied on a different measurement: the water level in a part of the system called the pressurizer. But as water escaped through the stuck-open valve, water in the pressurizer appeared to be rising even as it was falling in the core. So the operators assumed that there was too much water, when in fact they had the opposite problem. When an emergency cooling system turned on automatically and forced water into the core, they all but shut it off. The core began to melt.
The operators knew something was wrong, but they didn’t know what, and it took them hours to figure out that water was being lost. The avalanche of alarms was unnerving. With all the sirens, klaxon horns, and flashing lights, it was hard to tell trivial warnings from vital alarms. Communication became even more difficult when high radiation readings forced everyone in the control room to wear respirators.
And it was unclear just how hot the core had become. Some temperature readings were high. Others were low. For a while, the computer monitoring the reactor temperature tapped out nothing but lines like these:
[illustration: sample of the computer's unreadable temperature printout]
The situation was nearly as bad at the Nuclear Regulatory Commission. “It was difficult to process the uncertain and often contradictory information,” Gilinsky recalled. “I got lots of useless advice from all sides. No one seemed to have a reliable grip on what was going on, or what to do.”
It was a puzzling, unprecedented crisis. And it changed everything we know about failure in modern systems.

II.

Four months after the Three Mile Island accident, a mail truck climbed a winding mountain road up to a secluded cabin in Hillsdale, New York, in the foothills of the Berkshires. It was a hot August day, and it took the driver a few tries to find the place. When the truck stopped, a lean, curly-haired man in his mid-fifties emerged from the cabin and eagerly signed for a package—a large box filled with books and articles about industrial accidents.
The man was Charles Perrow, or Chick, as his friends called him. Perrow was an unlikely person to revolutionize the science of catastrophic failure. He wasn’t an engineer but a sociology professor. He had done no previous research on accidents, nuclear power, or safety. He was an expert on organizations rather than catastrophes. His most recent article was titled “Insurgency of the Powerless: Farm Worker Movements, 1946–1972.” When Three Mile Island happened, he was studying the organization of textile mills in nineteenth-century New England.
Sociologists rarely have a big impact on life-and-death matters like nuclear safety. A New Yorker cartoonist once lampooned the discipline with the image of a man reading the newspaper headline “Sociologists on Strike!!! Nation in Peril!!” But just five years after that box was delivered to Perrow’s cabin, his book Normal Accidents—a study of catastrophes in high-risk industries—became a sort of academic cult classic. Experts in a range of fields—from nuclear engineers to software experts and medical researchers—read and debated the book. Perrow accepted a professorship at Yale, and by the time his second book on catastrophes was published, the American Prospect magazine declared that his work had “achieved iconic status.” One endorsement for the book called him “the undisputed ‘master of disaster.’”
Perrow first got interested in meltdowns when the presidential commission on the Three Mile Island accident asked him to study the event. The commission was initially planning to hear only from engineers and lawyers, but its sole sociologist member suggested they also consult Perrow. She had a hunch that there was something to be learned from a social scientist, someone who had thought about how organizations actually operate in the real world.
When Perrow received the transcripts of the commission’s hearings, he read all the materials in an afternoon. That night he tossed and turned for hours, and when he finally got to sleep, he had his worst nightmares since his army days in World War II. “The testimony of the operators made a profound impression on me,” he recalled years later. “Here was an enormously, catastrophically risky technology, and they had no idea what was going on for some hours. . . . I suddenly realized that I was in the thick of it, in the very middle of it, because this was an organizational problem more than anything else.”
He had three weeks to write a ten-page report, but—with the help of graduate students who sent boxes of materials to his cabin—he wound up cranking out a forty-page paper by the deadline. He then put together what he would later describe as “a toxic and corrosive group of graduate research assistants who argued with me and each other.” It was, Perrow recalled, “the gloomiest group on campus, known for our gallows humor. At our Monday meetings, one of us would say, ‘It was a great weekend for the project,’ and rattle off the latest disasters.”
This group reflected Perrow’s personality. One scholar described him as a curmudgeon but called his research a “beacon.” Students said he was a demanding teacher, but they loved his classes because they learned so much. Among academics, he had a reputation for giving unusually intense but constructive criticism. “Chick’s critical appraisals of my work have been the yardstick by which I’ve judged my success,” wrote one author. “He has never failed to produce pages and pages of sometimes scathing remarks, usually well reasoned, and always ending with something like ‘Love, Chick’ or ‘My Usual Sensitive Self.’”

III.

The more Perrow learned about Three Mile Island, the more fascinated he became. It was a major accident, but its causes were trivial: not a massive earthquake or a big engineering mistake, but a combination of small failures—a plumbing problem, a stuck valve, and an ambiguous indicator light.
And it was an accident that happened incredibly quickly. Consider the initial plumbing glitch, the resulting failure of the pumps to send water to the steam generator, the increasing pressure in the reactor, the opening of the pressure relief valve and its failure to close, and then the misleading indication of the valve’s position—all this happened in just thirteen seconds. In less than ten minutes, the damage to the core was already done.
To Perrow, it was clear that blaming the operators was a cheap shot. The official investigation portrayed the plant staff as the main culprits, but Perrow realized that their mistakes were mistakes only in hindsight—“retrospective errors,” he called them.
Take, for example, the greatest blunder—the assumption that the problem was too much water rather than too little. When the operators made this assumption, the readings available to them didn’t show that the coolant level was too low. To the best of their knowledge, there was no danger of uncovering the core, so they focused on another serious problem: the risk of overfilling the system. Though there were indications that might have helped reveal the true nature of the problem, the operators thoug...
