Critical Steps
eBook - ePub

Critical Steps

Managing What Must Go Right in High-Risk Operations

Tony Muschara, Ron Farris, Jim Marinus

  1. 169 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

About This Book

Critical Steps happen every day at work and at home, purposefully. Work does not happen otherwise. If an operation has the capacity to do work, then it has the capacity to do harm. Work is energy directed by human beings to create value. But people are imperfect—we make mistakes, and sometimes we lose control of the work. Therefore, work is the use of force under conditions of uncertainty. A Critical Step is a human action that will trigger immediate, irreversible, and intolerable harm to an asset, if that action or a preceding action is performed improperly. Whether the human action involves clicking on a link attached to an e-mail message, walking down a flight of stairs with a newborn baby in arms, engaging the clutch on a gasoline-driven chain saw, or administering a medication to a patient in a hospital, these all satisfy the definition of what constitutes critical risks in our daily lives, professionally or personally. The overarching goal of managing Critical Steps is to maximize the success (safety, reliability, productivity, quality, profitability, etc.) of people's performance in the workplace, to create value for the organization without losing control of built-in hazards necessary to create that value.


Information

Publisher
CRC Press
Year
2021
ISBN
9781000476835

1

What Is a CRITICAL STEP?

DOI: 10.1201/9781003220213-1
If an operation has the capacity to do work, then it has the capacity to do harm.*
* Attributed to Dorian Conger, who said this to students during the introduction to a MORT (Management Oversight and Risk Tree) cause analysis class.
—Dorian Conger
Roger, I was afraid of that. I was really afraid of that.
—Battalion Commander of a U.S. Army Apache helicopter flying in the gunner’s seat after a “friendly fire” tragedy
FATAL FRIENDLY FIRE 1
On February 17, 1991, at approximately 1:00 a.m., a U.S. Bradley Fighting Vehicle and an armored personnel carrier were destroyed by two missiles fired from a U.S. Apache helicopter. Two U.S. soldiers were killed, and six others were wounded. This friendly fire tragedy took place in the Persian Gulf during Operation Desert Storm. The incident occurred after U.S. ground forces, which were deployed along an east-west line about 3 miles north of the Saudi-Iraqi border, reported several enemy sightings north of their positions. In response, ground commanders called for Apache reconnaissance of the area.
Apache cockpits have two sections: the front seat is reserved for the gunner and the back seat for the pilot. The pilot controls the flight pattern, and the gunner engages targets with the helicopter’s weapon systems. Both sections of the cockpit have flight and weapons controls in case one crew member must take over for the other.
Every night for the first couple of weeks of February, battalion helicopters responded to reports from U.S. ground forces of apparent movements of Iraqi vehicles, all false alarms. A U.S. Army Lieutenant Colonel was the Battalion Commander of a U.S. Army Apache helicopter strike force. Just days before, helicopters from the Colonel’s battalion misidentified and fired on a U.S. Army scout vehicle, missing it without damage or injury—a near hit.
U.S. armored forces operating in the area reported possible enemy sightings—suspected Iraqi armored vehicles moving toward a U.S. tank squadron. Commanders of the ground forces asked the Apache battalion, based about 60 miles south of the border, to explore the area and to engage any enemy found. The Colonel, his copilot, and two other Apache helicopters responded quickly, urgently directed to patrol an area north of the line of U.S. tanks. Because of an imminent sandstorm with intense winds and low visibility, the Colonel decided to command the lead Apache himself, in the gunner’s seat, even though he had had only 3 hours of sleep in the previous 24 hours. They launched at 12:22 a.m. Because of the urgency of the request, the normal, detailed premission briefing was not done.
Upon arriving on station at 12:50 a.m., the helicopter’s target acquisition system detected two suspicious vehicles near the eastern end of the line of U.S. ground forces. The crew noted the targets’ locations by measuring their distance from the aircraft with a laser beam, the readings automatically entered into the weapons fire control computer. The Colonel estimated the suspicious vehicles were about a quarter mile in front, the first mistake. He misread the grid coordinates of the alleged targets on the helicopter navigation system, reading instead the search coordinates that he had manually entered into the navigation system while en route earlier in the flight. As a result, he misidentified the target vehicles’ location as being north of the line of friendly vehicles, which coincidentally was the exact location of previously reported enemy sightings.
A discussion ensued between the three Apache pilots and the ground commander to authenticate their identity. Apache helicopters were not equipped with IFF—an automated system referred to as “Identification Friend or Foe.” In the darkness, the vehicles could not otherwise be identified.
The ground commander insisted that no U.S. forces were ahead of the line, that the vehicles must be enemy, and twice replied to the Colonel, “Those are enemy. Go ahead and take them out.” Pilots of the other two Apaches also thought the vehicles were enemy. More ominously, persistent search-radar alerts were being received in the cockpit, adding to the stress of the moment. These alerts, triggered by radar emitted by friendly forces, were misidentified by the Apache computers as an enemy system. Even the Colonel’s copilot prompted him, “Do em!” more than once. Yet he felt uneasy about the identity of the vehicles. The Colonel is recorded as saying, “Boy, I’m going to tell you, it’s hard to pull this trigger,” asking for help to verify the helicopter’s current heading, the bearing to the targets, and their grid coordinates. He stated the targets’ grid coordinates aloud, again misreading them, the second mistake. No one recognized the error. His copilot stated, “Ready in the back.”
The Colonel decided to fire on the vehicles with the Apache’s 30-millimeter cannon (machine gun), which would inflict less damage than a missile in case the vehicles were friendly. The gun fired only a few rounds before jamming (sand). He then fired two Hellfire missiles* at the suspected vehicles—the third, but deadly, mistake. Shortly thereafter, the Apaches received a cease-fire order, but the missiles had already been fired. Both vehicles, a Bradley Fighting Vehicle and an armored personnel carrier, were destroyed, killing two U.S. soldiers inside. The Colonel softly said, “I was afraid of that, I was really afraid of that.”
* The laser-guided Hellfire missile is the main armament on the Apache helicopter, designed for the destruction of armor and other hardened targets.
The Colonel knew the point of no return: pulling the trigger! He said it. But human fallibility entered the decision-making process, hampered by sleep deprivation, a fierce desert dust storm, inadequate human factors in the cockpit, inferior teamwork, and the stress of combat, all of which worked against him, his team, and even the ground commanders. He did his best under the circumstances. Would you have done anything differently? Be honest. The system and the battlefield worked against him. Sometimes doing your best isn’t good enough.

WORK = RISK

When you do work, something changes.* Physical work is the application of force over a distance (W = f ‱ d). Work is necessary to create value. Except where automation is used, work requires people to touch things—to oversee, manipulate, record, or alter things. Jobs and tasks comprise a series of human actions designed to change the state of material or information to create outputs—assets that have value in the marketplace. The risk of harm to those assets emerges when people do work, without which nothing of value is created. Work is energy directed by human beings to create value.2
* Work is the application of physical strength or mental effort to achieve a desired result, whether a force over a distance or careful reasoning (still a force over distance, though at a microscopic level).
Because the use of force, f, requires energy from a built-in hazard to create the d in work, W, all work involves some level of risk. Occasionally, people lose control of these hazards. Human fallibility is an inherent characteristic of the human condition—it’s in our genes. Error is normal—a fact of life, a natural part of being human. The human tendency to err is not a problem until it occurs in sync with significant transfers of energy, movements of matter, or transmissions of information.3 In an operational environment, human error is better characterized as a loss of control—a human action that triggers an unintended and unwanted transfer of energy (ΔE), movement of matter (ΔM), or transmission of information (ΔI).4 Human performance (Hu) is the greatest source of variation in any operation, and the uncertainty in human performance can never be eliminated. If work is not performed under control, the change (d) may not be what you want; work can inflict harm. Work involves the use of force under the condition of uncertainty.5
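The physical-work relation the chapter builds on, W = f ‱ d, can be sketched in a few lines. This is a minimal illustration only; the function name and the example values are ours, not the book’s.

```python
def work(force_newtons: float, distance_meters: float) -> float:
    """Work in joules: force applied over a distance (W = f * d)."""
    return force_newtons * distance_meters

# Lifting a 10 kg load (about 98.1 N of force) through 2 m
# transfers roughly 196 J of energy -- energy that creates value
# when controlled, and harm when control is lost.
print(work(98.1, 2.0))  # 196.2
```

The point of the sketch is the chapter’s argument in miniature: the same f and d that produce useful work are the quantities that do damage when the transfer of energy is unintended.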
When performing work, people usually concentrate on accomplishing their immediate production goal, not necessarily on safety.6 Most of the time, people’s attention is on the work. If people cannot fully concentrate on being safe, thoroughly convinced there will be no unintended consequences 100 percent of the time, then when should they fully focus on safe outcomes?

VALUE ADDED VERSUS VALUE EXTRACTED

Recall Dorian Conger’s quote at the beginning of this chapter, “If an operation has the capacity to do work, it has the capacity to do harm.” All human-directed work intends to accomplish something that meets customer requirements, to add value. However, when people manipulate the controls of built-in hazards, there is a corresponding risk to do harm that can extract value instead of adding value. The greater the amount of energy transferred, matter transported, or sensitive information transm...
