Critical Steps

Managing What Must Go Right in High-Risk Operations

Tony Muschara, Ron Farris, Jim Marinus

  1. 169 pages
  2. English
  3. ePub (mobile friendly)
  4. Available on iOS and Android

About This Book

Critical Steps happen every day at work and at home, purposefully. Work does not happen otherwise. If an operation has the capacity to do work, then it has the capacity to do harm. Work is energy directed by human beings to create value. But people are imperfect—we make mistakes, and sometimes we lose control of the work. Therefore, work is the use of force under conditions of uncertainty. A Critical Step is a human action that will trigger immediate, irreversible, and intolerable harm to an asset, if that action or a preceding action is performed improperly. Whether the human action involves clicking on a link attached to an e-mail message, walking down a flight of stairs with a newborn baby in arms, engaging the clutch on a gasoline-driven chain saw, or administering a medication to a patient in a hospital, these all satisfy the definition of what constitutes critical risks in our daily lives, professionally or personally. The overarching goal of managing Critical Steps is to maximize the success (safety, reliability, productivity, quality, profitability, etc.) of people's performance in the workplace, to create value for the organization without losing control of built-in hazards necessary to create that value.


1

What Is a CRITICAL STEP?

DOI: 10.1201/9781003220213-1
If an operation has the capacity to do work, then it has the capacity to do harm.*
—Dorian Conger
* This statement is attributed to Dorian Conger, who used it with students during the introduction to a MORT cause analysis class. (MORT: Management Oversight and Risk Tree)
Roger, I was afraid of that. I was really afraid of that.
—Battalion Commander of a U.S. Army Apache helicopter flying in the gunner’s seat after a “friendly fire” tragedy
FATAL FRIENDLY FIRE 1
On February 17, 1991, at approximately 1:00 a.m., a U.S. Bradley Fighting Vehicle and an armored personnel carrier were destroyed by two missiles fired from a U.S. Apache helicopter. Two U.S. soldiers were killed, and six others were wounded. This friendly fire tragedy took place in the Persian Gulf during Operation Desert Storm. The incident occurred after U.S. ground forces, which were deployed along an east-west line about 3 miles north of the Saudi-Iraqi border, reported several enemy sightings north of their positions. In response, ground commanders called for Apache reconnaissance of the area.
Apache cockpits have two sections: the front seat is reserved for the gunner and the back seat for the pilot. The pilot controls the flight pattern, and the gunner engages the target with the helicopter’s weapon systems. Both sections of the cockpit have flight and weapons controls, so either crew member can take over from the other if necessary.
Every night for the first couple of weeks of February, battalion helicopters responded to reports from U.S. ground forces of apparent movements of Iraqi vehicles, all false alarms. A U.S. Army Lieutenant Colonel was the Battalion Commander of a U.S. Army Apache helicopter strike force. Just days before, helicopters from the Colonel’s battalion misidentified and fired on a U.S. Army scout vehicle, missing it without damage or injury—a near hit.
U.S. armored forces operating in the area reported possible enemy sightings—suspected Iraqi armored vehicles moving toward a U.S. tank squadron. Ground commanders asked the Apache battalion, based about 60 miles south of the border, to explore the area and to engage the vehicles if an enemy presence was confirmed. The Colonel, his copilot, and two other Apache helicopters responded quickly, urgently directed to patrol an area north of the line of U.S. tanks. Because of an imminent sandstorm with intense winds and low visibility, the Colonel decided to command the lead Apache himself, in the gunner’s seat, even though he had had only 3 hours of sleep in the previous 24 hours. They launched at 12:22 a.m. Because of the urgency of the request, the normal, detailed premission briefing was not done.
Upon arriving on station at 12:50 a.m., the crew detected the vehicles with the helicopter’s target acquisition system. Two suspicious vehicles appeared near the eastern end of the line of U.S. ground forces; the crew noted the targets’ locations by measuring their distance from the aircraft with a laser beam, and the positions were entered automatically into the weapons fire control computer. The Colonel estimated the suspicious vehicles were about a quarter mile in front, the first mistake. He misread the grid coordinates of the alleged targets on the helicopter navigation system, reading instead the search coordinates that he had manually entered into the navigation system while en route early in the flight. As a result, he misidentified the target vehicles’ location as being north of the line of friendly vehicles, a spot that coincidentally matched the location of previously reported enemy sightings.
A discussion ensued among the three Apache pilots and the ground commander to authenticate the vehicles’ identity. Apache helicopters were not equipped with IFF—an automated system referred to as “Identification Friend or Foe.” In the darkness, the vehicles could not otherwise be identified.
The ground commander insisted that no U.S. forces were ahead of the line and that the vehicles must be enemy, twice replying to the Colonel, “Those are enemy. Go ahead and take them out.” Pilots of the other two Apaches also thought the vehicles were enemy. More ominously, persistent search-radar alerts were being received in the cockpit, adding to the stress of the moment. These alerts, triggered by radar emitted by friendly forces, were misidentified by the Apache computers as an enemy system. Even the Colonel’s copilot prompted him, “Do em!” more than once. Yet the Colonel felt uneasy about the identity of the vehicles. He is recorded to have said, “Boy, I’m going to tell you, it’s hard to pull this trigger,” asking for help to verify the helicopter’s current heading, the bearing to the targets, and the targets’ grid coordinates. He then stated the targets’ grid coordinates aloud, again misreading them, the second mistake. No one recognized the error. His copilot stated, “Ready in the back.”
The Colonel decided to fire on the vehicles with the Apache’s 30-millimeter cannon, which would inflict less damage than a missile in case the vehicles turned out to be friendly. The gun fired only a few rounds before jamming (from sand). He then fired two Hellfire missiles* at the suspected vehicles—the third, but deadly, mistake. Shortly thereafter, the Apaches received a cease-fire order, but the missiles had already been fired. Both vehicles, a Bradley Fighting Vehicle and an armored personnel carrier, were destroyed, killing two U.S. soldiers inside. The Colonel softly said, “I was afraid of that, I was really afraid of that.”
* The laser-guided Hellfire missile is the main armament on the Apache helicopter, designed for the destruction of armor and other hardened targets.
The Colonel knew the point of no return: pulling the trigger! He said it. But human fallibility entered the decision-making process, which was hampered by sleep deprivation, a fierce desert dust storm, inadequate human factors in the cockpit, poor teamwork, and the stress of combat, all of which worked against him, his team, and even the ground commanders. He did his best under the circumstances. Would you have done anything differently? Be honest. The system and the battlefield worked against him. Sometimes doing your best isn’t good enough.

WORK = RISK

When you do work, something changes.* Physical work is the application of force over a distance (W = f · d). Work is necessary to create value. Except where automation is used, work requires people to touch things—to oversee, manipulate, record, or alter things. Jobs and tasks comprise a series of human actions designed to change the state of material or information to create outputs—assets that have value in the marketplace. The risk of harm to those assets emerges when people do work, without which nothing of value is created. Work is energy directed by human beings to create value.2
* Work is the application of physical strength or mental effort to achieve a desired result, whether a force over a distance or careful reasoning (still a force over distance, though at a microscopic level).
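As a purely illustrative calculation (the numbers are an assumption, not from the book), the physics definition above can be made concrete. Lifting a 20 kg motor through 1.5 m requires

W = f \cdot d = (20\,\mathrm{kg} \times 9.81\,\mathrm{m/s^2}) \times 1.5\,\mathrm{m} \approx 294\,\mathrm{J}

The same energy that positions the motor, if released unintentionally as a dropped load, is what can injure the person beneath it: the capacity to do work is the capacity to do harm.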
Because the use of force, f, requires energy from a built-in hazard to create the d in work, W, all work involves some level of risk. Occasionally, people lose control of these hazards. Human fallibility is an inherent characteristic of the human condition—it’s in our genes. Error is normal—a fact of life, a natural part of being human. The human tendency to err is not a problem until it occurs in sync with significant transfers of energy, movements of matter, or transmissions of information.3 In an operational environment, human error is better characterized as a loss of control—a human action that triggers an unintended and unwanted transfer of energy (ΔE), movement of matter (ΔM), or transmission of information (ΔI).4 Human performance (Hu) is the greatest source of variation in any operation, and the uncertainty in human performance can never be eliminated. If work is not performed under control, the change (d) may not be what you want; work can inflict harm. Work involves the use of force under the condition of uncertainty.5
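As a minimal, hypothetical sketch in Python (not from the book, and not the authors’ method), the chapter’s vocabulary can be expressed in code: a loss of control is an unintended ΔE, ΔM, or ΔI, and an action counts as a Critical Step only when improper performance would trigger harm that is immediate, irreversible, and intolerable. All class and function names below are invented for illustration.

from dataclasses import dataclass
from enum import Enum

class Transfer(Enum):
    """The three pathways through which a loss of control does harm."""
    ENERGY = "delta-E"        # unintended transfer of energy
    MATTER = "delta-M"        # unintended movement of matter
    INFORMATION = "delta-I"   # unintended transmission of information

@dataclass
class Action:
    """A single human action within a task (hypothetical structure)."""
    description: str
    transfer: Transfer        # what is released or moved if control is lost
    immediate: bool           # harm occurs faster than anyone can respond
    irreversible: bool        # harm cannot be undone once triggered
    intolerable: bool         # harm exceeds what the organization will accept

def is_critical_step(action: Action) -> bool:
    """An action is a Critical Step only if improper performance would
    trigger harm that is immediate AND irreversible AND intolerable."""
    return action.immediate and action.irreversible and action.intolerable

# Example: administering a medication, one of the book's everyday examples.
dose = Action(
    description="Administer medication to patient",
    transfer=Transfer.MATTER,
    immediate=True,
    irreversible=True,
    intolerable=True,
)
print(is_critical_step(dose))  # True -> warrants extra controls before acting

The only point of the sketch is that the three criteria are conjunctive: if any one of them is absent, the action may still deserve attention, but it is not a Critical Step in the authors’ sense.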
When performing work, people usually concentrate on accomplishing their immediate production goal, not necessarily on safety.6 Most of the time, people’s attention is on the work. If people cannot fully concentrate on being safe, thoroughly convinced there will be no unintended consequences 100 percent of the time, then when should they fully focus on safe outcomes?

VALUE ADDED VERSUS VALUE EXTRACTED

Recall Dorian Conger’s quote at the beginning of this chapter, “If an operation has the capacity to do work, it has the capacity to do harm.” All human-directed work intends to accomplish something that meets customer requirements, to add value. However, when people manipulate the controls of built-in hazards, there is a corresponding risk to do harm that can extract value instead of adding value. The greater the amount of energy transferred, matter transported, or sensitive information transm...
