
Critical Steps

Managing What Must Go Right in High-Risk Operations

Tony Muschara, Ron Farris, Jim Marinus

169 pages · English · ePUB (mobile friendly)

About This Book

Critical Steps happen every day at work and at home, purposefully. Work does not happen otherwise. If an operation has the capacity to do work, then it has the capacity to do harm. Work is energy directed by human beings to create value. But people are imperfect—we make mistakes, and sometimes we lose control of the work. Therefore, work is the use of force under conditions of uncertainty. A Critical Step is a human action that will trigger immediate, irreversible, and intolerable harm to an asset, if that action or a preceding action is performed improperly. Whether the human action involves clicking on a link attached to an e-mail message, walking down a flight of stairs with a newborn baby in arms, engaging the clutch on a gasoline-driven chain saw, or administering a medication to a patient in a hospital, these all satisfy the definition of what constitutes critical risks in our daily lives, professionally or personally. The overarching goal of managing Critical Steps is to maximize the success (safety, reliability, productivity, quality, profitability, etc.) of people's performance in the workplace, to create value for the organization without losing control of built-in hazards necessary to create that value.


Information

Publisher
CRC Press
Year
2021
ISBN
9781000476835

1

What Is a CRITICAL STEP?

DOI: 10.1201/9781003220213-1
If an operation has the capacity to do work, then it has the capacity to do harm.*
—Dorian Conger
* This statement is attributed to Dorian Conger, who offered it to students during the introduction to a MORT cause-analysis class (MORT: Management Oversight and Risk Tree).
Roger, I was afraid of that. I was really afraid of that.
—Battalion Commander of a U.S. Army Apache helicopter flying in the gunner’s seat after a “friendly fire” tragedy
FATAL FRIENDLY FIRE 1
On February 17, 1991, at approximately 1:00 a.m., a U.S. Bradley Fighting Vehicle and an armored personnel carrier were destroyed by two missiles fired from a U.S. Apache helicopter. Two U.S. soldiers were killed, and six others were wounded. This friendly fire tragedy took place in the Persian Gulf during Operation Desert Storm. The incident occurred after U.S. ground forces, which were deployed along an east-west line about 3 miles north of the Saudi-Iraqi border, reported several enemy sightings north of their positions. In response, ground commanders called for Apache reconnaissance of the area.
Apache cockpits have two sections: the front seat is reserved for the gunner and the back seat for the pilot. The pilot controls the flight pattern, and the gunner engages targets with the helicopter’s weapon systems. Both sections of the cockpit have flight and weapons controls in case one crew member must take control from the other.
Every night for the first couple of weeks of February, battalion helicopters responded to reports from U.S. ground forces of apparent movements of Iraqi vehicles, all false alarms. A U.S. Army Lieutenant Colonel was the Battalion Commander of a U.S. Army Apache helicopter strike force. Just days before, helicopters from the Colonel’s battalion misidentified and fired on a U.S. Army scout vehicle, missing it without damage or injury—a near hit.
U.S. armored forces operating on the ground in the area reported possible enemy sightings—suspected Iraqi armored vehicles moving toward a U.S. tank squadron. Commanders of the ground forces asked the Apache battalion, based about 60 miles south of the border, to explore the area and to engage any enemy forces found. The Colonel, his copilot, and two other Apache helicopters responded quickly, urgently directed to patrol an area north of the line of U.S. tanks. Because of an imminent sandstorm with intense winds and low visibility, the Colonel decided to command the lead Apache himself, in the gunner’s seat, even though he had had only 3 hours of sleep in the previous 24. They launched at 12:22 a.m. Due to the urgency of the request, the normal, detailed premission briefing was not done.
Upon arriving on station at 12:50 a.m., the helicopter’s target acquisition system detected two suspicious vehicles near the eastern end of the line of U.S. ground forces. The crew noted the targets’ locations by measuring their distance from the aircraft with a laser beam, and the positions were entered automatically into the weapons fire control computer. The Colonel estimated the suspicious vehicles were about a quarter mile in front, the first mistake. He misread the grid coordinates of the alleged targets on the helicopter navigation system, reading instead the search coordinates that he had manually entered into the navigation system while en route early in the flight. As a result, he misidentified the target vehicles’ location as being north of the line of friendly vehicles, coincidentally in the exact location of previously reported enemy sightings.
A discussion ensued between the three Apache pilots and the ground commander to authenticate their identity. Apache helicopters were not equipped with IFF—an automated system referred to as “Identification Friend or Foe.” In the darkness, the vehicles could not otherwise be identified.
The ground commander insisted that no U.S. forces were ahead of the line, that the vehicles must be enemy, and twice replied to the Colonel, “Those are enemy. Go ahead and take them out.” Pilots of the other two Apaches also thought the vehicles were enemy. More ominously, there were persistent search-radar alerts being received in the cockpit, adding to the stress of the moment. These alerts, responding to radar emitted by friendly forces, were misidentified by the Apache computers as an enemy system. Even the Colonel’s copilot prompted him, “Do em!” more than once. Yet he felt uneasy as to the identity of the vehicles. The Colonel is recorded to have said, “Boy, I’m going to tell you, it’s hard to pull this trigger,” asking for help to verify current helicopter heading and bearing to and grid coordinates of targets. He states the targets’ grid coordinates aloud, again misreading them, the second mistake. No one recognizes the error. His copilot states, “Ready in the back.”
The Colonel decided to fire on the vehicles with the Apache’s 30-millimeter cannon, which would inflict less damage than a missile in case the vehicles were friendly. The gun fired only a few rounds before jamming, fouled by sand. He then fired two Hellfire missiles* at the suspected vehicles—the third, but deadly, mistake. Shortly thereafter, the Apaches received a cease-fire order. The missiles had already been fired, and both vehicles, a Bradley Fighting Vehicle and an armored personnel carrier, were destroyed, killing two U.S. soldiers inside. The Colonel softly said, “I was afraid of that, I was really afraid of that.”
* The laser-guided Hellfire missile is the main armament on the Apache helicopter, designed for the destruction of armor and other hardened targets.
The Colonel knew the point of no return: pulling the trigger! He said so. But human fallibility entered the decision-making process, hampered by sleep deprivation, a fierce desert dust storm, inadequate human factors in the cockpit, inferior teamwork, and the stress of combat, all of which worked against him, his team, and even the ground commanders. He did his best under the circumstances. Would you have done anything differently? Be honest. The system and the battlefield worked against him. Sometimes doing your best isn’t good enough.

WORK = RISK

When you do work, something changes.* Physical work is the application of force over a distance (W = fd). Work is necessary to create value. Except where automation is used, work requires people to touch things—to oversee, manipulate, record, or alter them. Jobs and tasks comprise a series of human actions designed to change the state of material or information to create outputs—assets that have value in the marketplace. The risk of harm to those assets emerges when people do work, without which nothing of value is created. Work is energy directed by human beings to create value.2
* Work is the application of physical strength or mental effort to achieve a desired result, whether a force over a distance or careful reasoning (still a force over distance, though at a microscopic level).
Because the use of force, f, requires energy from a built-in hazard to create the d in work, W, all work involves some level of risk. Occasionally, people lose control of these hazards. Human fallibility is an inherent characteristic of the human condition—it’s in our genes. Error is normal—a fact of life, a natural part of being human. The human tendency to err is not a problem until it occurs in sync with significant transfers of energy, movements of matter, or transmissions of information.3 In an operational environment, human error is better characterized as a loss of control—a human action that triggers an unintended and unwanted transfer of energy (ΔE), movement of matter (ΔM), or transmission of information (ΔI).4 Human performance (Hu) is the greatest source of variation in any operation, and the uncertainty in human performance can never be eliminated. If work is not performed under control, the change (d) may not be what you want; work can inflict harm. Work involves the use of force under the condition of uncertainty.5
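The definitions above are precise enough to express in code. The following is a minimal, illustrative sketch only; the class and function names (`Action`, `is_critical_step`) and the numeric fields are hypothetical conveniences, not anything defined by the authors. It encodes the book's two definitions: a loss of control as an unintended transfer of energy (ΔE), movement of matter (ΔM), or transmission of information (ΔI), and a Critical Step as an action whose improper performance triggers immediate, irreversible, and intolerable harm.

```python
# Illustrative sketch of the book's definitions. Names and fields are
# hypothetical, chosen for clarity; they do not come from the book.
from dataclasses import dataclass

@dataclass
class Action:
    """One human action in a task, with its worst-case unintended transfers."""
    name: str
    delta_energy: float       # unintended transfer of energy if control is lost
    delta_matter: float       # unintended movement of matter
    delta_information: float  # unintended transmission of information
    reversible: bool          # can the outcome be undone after the action?
    tolerable: bool           # is the worst-case harm tolerable?

def is_critical_step(a: Action) -> bool:
    """A Critical Step is an action that, performed improperly, triggers
    immediate, irreversible, and intolerable harm to an asset."""
    could_do_harm = (a.delta_energy > 0
                     or a.delta_matter > 0
                     or a.delta_information > 0)
    return could_do_harm and not a.reversible and not a.tolerable

# Pulling the trigger in the friendly-fire account is the archetype:
# a large, unwanted energy transfer that cannot be called back.
fire_missile = Action("press missile trigger", delta_energy=1.0,
                      delta_matter=0.0, delta_information=0.0,
                      reversible=False, tolerable=False)
print(is_critical_step(fire_missile))  # True
```

The sketch makes one point visible: criticality is a property of the action and its consequences (irreversible, intolerable), not of whether an error actually occurs. Reviewing a checklist, by contrast, transfers nothing and is fully recoverable, so it would not qualify.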
When performing work, people usually concentrate on accomplishing their immediate production goal, not necessarily on safety.6 Most of the time, people’s attention is on the work. If people cannot fully concentrate on being safe, thoroughly convinced there will be no unintended consequences 100 percent of the time, then when should they fully focus on safe outcomes?

VALUE ADDED VERSUS VALUE EXTRACTED

Recall Dorian Conger’s quote at the beginning of this chapter, “If an operation has the capacity to do work, it has the capacity to do harm.” All human-directed work intends to accomplish something that meets customer requirements, to add value. However, when people manipulate the controls of built-in hazards, there is a corresponding risk of harm that can extract value instead of adding it. The greater the amount of energy transferred, matter transported, or sensitive information transmitted, the greater the potential harm if control is lost.
