Once upon a time, in faraway lands as well as in our own communities, heroes and villains and the rest of us used, abused, and protected the environment. In big or small ways, each had a story to tell and each made an impact. As each story played out, some wondered if heroes would rise to the occasion, avoiding the kryptonite of disinterest to protect the environment and human life. Others thought about getting involved, but wondered what to do. Some stories ended badly. Great damage was done. Sometimes with villainous intent and sometimes simply with little regard for the consequences of their actions, people left in their wake spoiled landscapes, public health risks, endangered species, and even death. It was up to the rest of us to rebuild and recover, if we could.
Other environmental tales have ended with hope, having come to a more sustainable, environmentally sensitive conclusion. What makes the difference? It is persistence, integrity, wisdom, and courage that help shape happier endings to environmental stories.
The environmental stories that influence our environmental laws, regulations, and policies often have all the dramatic elements that make for a good tale. As with any good story, we are captivated by the high drama, the suspense, the danger—but most of all, by the cast of characters. Some of these stories are well known—who among us will soon forget the BP oil spill disaster in the Gulf of Mexico? Other, less familiar stories involved small communities and ordinary people doing extraordinary things. All of these stories, however, help illustrate how our environmental policies have come to be, how they protect public health and the environment, or why they have failed to work as intended.
This book takes us on a journey. It tells stories of actions taken by organizations, government agencies, and individuals. By no means do these represent the only such stories, and determining whether these are tales of villainous or heroic acts is ultimately up to the reader. But there are remarkable elements of these fascinating tales that ensure that they will be remembered for decades to come. By examining high-profile incidents, limning portraits of individuals and groups working to protect the environment, and detailing what we can do in our own part of the world, the book presents a way of seeing how environmental politics, policies, and laws work and how we can work within them to tell new stories.
In its consideration of extraordinarily tragic events, this book also focuses on extraordinarily courageous actions taken by environmental heroes (Chapter 6) and “the rest of us” (Chapter 7) for a simple reason: ultimately, earth’s various environmental stories have to end with us. Environmental laws and regulations can go only so far in protecting the environment and people. Laws and regulations are the engines that drive environmental protection, but citizens are the spark plugs. We are the ones who decide whether or not to participate, whether to observe and support the actions of federal and state agencies, and, ultimately, whether to act to enforce the law or to lobby for additional protections. We can also encourage companies to adopt sustainable practices and even to go above and beyond regulatory requirements to help protect the environment. By being part of any environmental story, we might just influence decision-makers to heed the warning signs of an industrial disaster in the making. We can join in or stand by and do nothing, watching the story unfold.
The heroes are among us, and the villains are both lurking about in the shadows and standing in full view. Will we join the heroes, or become heroes ourselves? Will we put up with the villains? In the end, it’s really up to us. One way to begin is to tell the stories of four high-profile disasters (detailed in Chapters 2 to 5) that exacted a heavy toll in human life and ongoing injuries to the environment. Learning the lessons from these stories will potentially help us avoid similar industrial disasters in the future.
UNDERSTANDING ORGANIZATIONS AND WHY INDUSTRIAL DISASTERS HAPPEN
Industrial disasters are distinguished from natural disasters, such as hurricanes, wildfires, and floods, in that they are caused by organizations engaged in high-risk activities. These human-made disasters are of such a magnitude that they prompt major disruptions in the organization itself and sometimes spawn new policies, regulations, laws, and even agencies to govern that industry. To put it simply, these disasters are of sufficient size and scope to force changes in the way we view not only the organizations involved but the entire industry and the way it operates.
Astute readers may wonder why we should consider past accidents at all. Accidents happen, one might argue, and tragedies occur and always will, especially in our high-tech, fossil fuel- and chemical-driven society. Could it not be the case that some industrial disasters capture our attention because they are simply larger events than others? The answer to this is: sometimes … and sometimes not.
Distinguishing Normal Accidents from Avoidable Accidents
Scholars have long attempted to explain why organizations act in ways that discount safety or environmental issues. They attempt to draw distinctions between “simple” mistakes, on the one hand, and misconduct or criminal actions that give rise to the “dark side” of organizations, on the other.1 Whether or not an event represents behavior that could be characterized as “dark” or reprehensible depends in part on how the issues are interpreted and defined. Making sense of these disasters has long captured the attention of scholars in the fields of public policy, political science, psychology, business, sociology, criminology, and organizational behavior, among others.
Charles Perrow argues that complex enterprises engaged in high-risk activities will always have catastrophic potential.2 Accidents in these complicated systems should be expected and are therefore “normal.” According to “normal accident” theory, two organizational characteristics, “interactive complexity” and “tight coupling,” make these systems susceptible to accidents. Normal accidents may also be called “system accidents,” where the system includes not only the equipment and other components but also the humans who operate them.
Any system, by definition, has many parts, any one of which may fail. In a simple system, single malfunctions, or “discrete failures,” may be spotted and corrected. However, a complicated system may allow two or more discrete failures to interact in unexpected ways, thus creating what Perrow defines as “interactive complexity.” In turn, these unexpected interactions can affect supposedly redundant or backup systems, creating a series of malfunctions that may lead to catastrophe in the blink of an eye. Perrow focused on the 1979 accident at the Three Mile Island nuclear plant in Pennsylvania; later, his “normal accidents” framework would be used to examine other incidents, such as the 1986 Challenger disaster, when the space shuttle broke apart shortly after launch, leading to the deaths of its crew members. We can assume that a sufficiently complex system, like a nuclear power plant or a space shuttle, is susceptible to normal accidents because it can be expected to have many such unanticipated interaction failures. However, normal accidents in these complex systems have been minimized by vigilant staff in high-reliability organizations, described later in this chapter.
Normal accidents are also triggered by tightly coupled system components. “Tight coupling” exists when system components are linked closely in time or space. If the system allows sufficient time after the discrete failure occurs, operators of the system are able to respond. In contrast, tightly coupled systems create a rapid chain of events, so that components have major impacts on each other in a short time frame. Because of these tight linkages, system operators have almost no time to react. Tight coupling raises the odds that the responses of decision-makers trying to correct the failure will be wrong, since they do not correctly understand the true nature of the problem. Because system components are interacting in unexpected ways, the problem is incomprehensible, at least for a short period of time. As a result, a cascade of decisions may amplify the tragedy, and each single decision may have a deleterious effect on the outcome. Sadly, the failure that initiates a catastrophic event often seems, taken by itself, quite trivial. Because of the system’s complexity and tight coupling, however, events quickly surge out of control, creating a cataclysmic outcome.
By contrast, some accidents in complex systems are not “normal.” These accidents result instead from an organization’s poor leadership, misdirected values, and a culture of complacency about safety and environmental regulations. Organizational executives, driven by an all-encompassing desire to maximize profits or cut costs, pursue their goals regardless of the consequences.
Unintended Consequences, Organizational Culture, and Power
Why do organizations depart from their own goals, act in unethical ways that harm the public, or engage in criminal conduct? Answers to these questions can be found in research focused on unintended consequences, organizational culture, and the power and intent of the individuals responsible for an organization.
Let’s begin by looking at unintended consequences. BP, Union Carbide, Massey Energy, and W. R. Grace, the companies whose stories are told in Chapters 2 to 5, undoubtedly never wanted the environmental tragedies for which they were responsible to happen. Those disasters were more likely the unintended consequences of poor decision-making driven by overriding values, such as maximizing profitability. Robert Merton was one of the first scholars to point to the unanticipated consequences of individual conduct in organizations.3 He observed that any purposive action inevitably generates unintended consequences, which may be positive or negative. He theorized that certain conditions make bad outcomes more likely and often even exacerbate negative unintended consequences. These conditions include failure to fully understand the problem; ignorance; attention to satisfying immediate interests rather than long-term goals; and the unethical values that organizational actors may bring to decision-making.
Merton also suggested that a common fallacy is the “too-ready assumption that actions which have in the past led to the desired outcome will continue to do so.”4 That is, individuals commit errors in judgment because they fail to recognize that what has been successful in certain circumstances may not work under future conditions that are different. Error often involves either neglect or failure to thoroughly examine the situation, perhaps owing to what Merton called “pathological obsession”: a determined refusal to consider all aspects of a problem. Merton also felt that emotional involvement could distort our construction of a situation and the probable consequences. Perhaps most vexing is the influence of what Merton referred to as the “immediacy of interest” on basic values. He suggested that we may go against fundamental values and fail to consider further consequences because of our intense interest in satisfying immediate desires.
Over sixty years later, Diane Vaughan arrived at similar conclusions about values, ignorance, and errors due to complacency. She has sought to explain why organizations go over to the “dark side”—why they behave in ways that deviate from their formal organizational goals and from standards and expectations for behavior.5 She notes that people within an organization can become accustomed to a deviant behavior (such as ignoring safety rules) to the point where they don’t consider it deviant, a phenomenon she calls “normalization of deviance.” Then, as people inside an organization continue to regularly depart from accepted behavior, they grow more comfortable breaking the rules or relaxing safety standards.
If left unchecked long enough, organizational deviance results in uncorrected mistakes, misconduct, and, sometimes, even industrial disaster. Misconduct suggests intent: individuals or groups within an organization, acting in their organizational roles, intentionally violate internal rules, laws, or administrative regulations in pursuit of organization goals. They go against established procedures. The question is: why?
Part of the answer can be found in the culture of an organization. Organizational culture creates the “rules of t...