The Multitasking Myth

About this book

Despite growing concern with the effects of concurrent task demands on human performance, and research demonstrating that these demands are associated with vulnerability to error, so far there has been only limited research into the nature and range of concurrent task demands in real-world settings. This book presents a set of NASA studies that characterize the nature of concurrent task demands confronting airline flight crews in routine operations, as opposed to emergency situations. The authors analyze these demands in light of what is known about cognitive processes, particularly those of attention and memory, with the focus upon inadvertent omissions of intended actions by skilled pilots. The studies reported within the book employed several distinct but complementary methods: ethnographic observations, analysis of incident reports submitted by pilots, and cognitive task analysis. They showed that concurrent task management comprises a set of issues distinct from (though related to) mental workload, an area that has been studied extensively by human factors researchers for more than 30 years. This book will be of direct relevance to aviation psychologists and to those involved in aviation training and operations. It will also interest individuals in any domain that involves concurrent task demands, for example the work of emergency room medical teams. Furthermore, the countermeasures presented in the final chapter to reduce vulnerability to errors associated with concurrent task demands can readily be adapted to work in diverse domains.


Information

Authors
Loukia D. Loukopoulos, R. Key Dismukes, Immanuel Barshi
Publisher
Routledge
Year
2016
eBook ISBN
9781317023555

Chapter 1
Introduction

On the evening of March 2, 1994, the crew of Continental Airlines flight 795 prepared for takeoff from La Guardia’s runway 13. The taxiways and runways were covered with a thin layer of slush, conditions quite common for this airport at this time of year, and certainly nothing new for this crew of very experienced pilots. The captain had more than 6,000 hours in the particular aircraft type (MD-82) and 23,000 total flight hours; the first officer had 2,400 hours in type and 16,000 total flight hours. Beginning with their preflight inspection, and all the way up until taking the active runway, the pilots took appropriate precautions to prepare the aircraft for the prevailing icing conditions. Before departing the gate, they requested that the aircraft be sprayed with de-icing fluid and they visually inspected the wings from the cabin windows to ensure that the de-icing was effective. They elected to taxi on one engine to save fuel, anticipating departure delays because of the weather, and kept the flaps retracted to prevent freezing slush from being thrown up onto the flaps during taxi. Before starting the second engine on the taxiway, they again visually inspected the wings from the cabin windows. Their takeoff briefing included a review of procedures for a rejected takeoff. At 1758 local time the flight was cleared for takeoff with the first officer at the controls. One minute later the captain commanded the first officer to abort the takeoff after noticing erratic readings on both airspeed indicators. But by this time the aircraft had reached 145 knots, considerably faster than shown on the airspeed indicators, and could no longer be brought to a safe stop within the confines of the runway. It was substantially damaged as it slid into a mud flat in Flushing Bay.
The National Transportation Safety Board (NTSB) investigation determined that the crew had failed to turn on the pitot-static heat system, which is normally done while the airplane is at the gate. Lacking heat, the pitot tubes1 were blocked by ice or snow, which in turn caused the erroneous airspeed indications (NTSB, 1995). Also, just before takeoff the crew had failed to note an amber warning light on the annunciation panel indicating that the pitot-static heat was off.
Turning the pitot-static heat system on and verifying its status were actions highly familiar to the pilots of flight 795. Execution and timing of both steps were explicitly prescribed in the airline’s standard operating procedures (SOP). Airline carriers use carefully scripted instructions, in the form of written procedures and checklists, to describe the exact actions that must be performed within each phase of flight as well as the precise sequence in which those actions are to be executed. Pilots are trained to follow these instructions in a consistent manner. At the time of the accident, Continental Airlines used a Before Pushback/Before Start checklist that was to be performed with the airplane parked at the gate before starting the engines. The airline’s procedure was for the captain to call for this checklist and for the first officer then to read out loud each item on the checklist, so that the captain could respond by stating the configuration or status of that item.2 One item on the checklist called for the pitot-static system heat to be turned on and checked. To accomplish this in the MD-82, the captain was to rotate a switch on the overhead panel from the “OFF” to the “CAPT” position and verify that current was flowing by noting the indication on an ammeter next to the switch.
Beyond the checklist, which was the first layer of defense against inadvertent failure to turn on the pitot-static heat, crews in the MD-82 were provided further protection: an amber warning light illuminated on the annunciation panel to indicate that pitot-static heat was off. Checking the annunciation panel was an item on another checklist, the Before Takeoff checklist, yet another layer of defense, which the crew was expected to perform just before takeoff.

Even Skilled Experts Make Mistakes

Why, then, would a crew with so much experience and skill, who had performed the checklists to prepare this type of aircraft for flight thousands of times previously, make the errors that caused this accident? The answer to this question is complex. We have argued elsewhere that attributing potentially fatal errors made by highly motivated expert pilots to “carelessness” or “complacency” is trivializing and misleading (Dismukes, Berman, and Loukopoulos, 2007; Dismukes and Tullo, 2000). Finding meaningful answers requires careful analysis of the nature of cockpit tasks and the operational environment in which they are performed, the demands those tasks place on human cognitive processes, and the inherent vulnerability of those processes to characteristic forms of error in particular situations.
Our review of airline accidents attributed to crew error (Dismukes et al., 2007) reveals that flight 795 shared critical features with other accidents in which crews inadvertently omitted procedural steps or failed to notice warning indicators. In another review, inadvertent omission of a normal procedural step by pilots played a central role in five of 27 major airline accidents that occurred in the United States between 1987 and 2001 and in which crew error was found to be a causal or contributing factor (Dismukes, 2007).
Flight crews are by no means the only human operators vulnerable to such errors. In 1991, an Air Traffic Control (ATC) Tower controller at Los Angeles International Airport cleared an airliner to land on runway 24L, not realizing that she had forgotten to release a commuter aircraft waiting on that runway to take off (NTSB, 1991). The resulting collision in twilight haze destroyed both aircraft and killed 34 people. Forgetting to perform a procedural step, such as removing all surgical instruments at the conclusion of an operation (Gawande, Studdert, Orav, Brennan, and Zinner, 2003), failing to remove a tourniquet after starting an IV line or drawing blood (Patient Safety Authority, 2005), or failing to accomplish medication reconciliation procedures in place to prevent administration of the wrong medication to patients (Joint Commission, 2006) are common forms of omissions in the health care setting. Indeed, omission of procedural steps is a form of human error with serious consequences in many complex work settings, such as aviation operations (Reason, 2002), aviation maintenance (Boeing, 1993; Hobbs and Williamson, 2003; Reason, 1997), air traffic control (Eurocontrol, 2004), the pipeline industry (American Petroleum Institute, 2005), and nuclear power plant operations (Davey, 2003; Kastchiev, Kromp, Kurth, Lochbaum, Lyman, Sailer, et al., 2006; Rasmussen, 1980). According to Reason (2002), inadvertent omissions have been “… shown to constitute the largest class of human performance problems in various hazardous operations …”.
The ubiquity of omissions during task performance in so many professional settings points to common underlying cognitive and contextual factors. Crews accomplish many tasks in the short time from the moment of preparing the aircraft for flight to the moment of takeoff, and most of these tasks involve multiple procedural steps. The high degree of familiarity with these tasks and the standardization of operating procedures usually keep the workload within human capabilities. This phase of flight is, however, often replete with interruptions, distractions, and unexpected task demands, even on routine flights (Loukopoulos, Dismukes, and Barshi, 2001, 2003). The crew of flight 795 experienced additional demands because of the weather conditions. They had to arrange for de-icing, and to check the wings for ice before and during taxi. They also had to defer setting the flaps to takeoff position and had to postpone starting the second engine until much later than usual, when approaching the departure runway. These demands disrupted the usual flow of procedures and delayed completion of checklists that included items associated with the flaps and engines. In fact, the first officer was still performing checklists when the captain positioned the aircraft on the runway and turned control over to the first officer to execute the takeoff.3
These additional demands, however, were not extraordinary, and experienced pilots would not have considered the workload on this flight to be excessive. Yet for reasons that will become apparent in this book, the nature of competing task demands in the cockpit contributes directly to inadvertent errors of omission, such as those made by this crew and by other crews. The issue lies not so much in the total volume of work required as in the concurrent nature of task demands. Like other operators in complex environments, both pilots in the cockpit of a modern airliner must often manage multiple tasks concurrently, interleaving performance of several tasks, deferring or suspending some tasks while performing others, responding to unexpected interruptions and delays and unpredictable demands imposed by external agents, and keeping track of the status of all tasks during these events.
The issue of concurrent task management in the cockpit has only recently begun to receive attention from scientists and the operational community. Reviews of accident reports from the NTSB database and incident reports from the Aviation Safety Reporting System (ASRS) database (e.g., Chou, Madhavan, and Funk, 1996; Dismukes, Young, and Sumwalt, 1998; Sarter and Alexander, 2000) and simulation studies (e.g., Latorella, 1999; Raby and Wickens, 1994) reveal that pilots are prone to error when managing concurrent tasks—especially forgetting to perform an intended task (Dismukes, 2007). (Also see Damos, 1991, for an earlier review of multiple task performance, couched mainly in terms of workload.) Problems stemming from concurrent task management demands arise in many workplace settings, including the health care industry (e.g., nurses: Institute of Medicine, 2004, 2007; Tucker and Spear, 2006; anesthesiologists: Cook and Woods, 1994; operating room staff: Rogers, Cook, Bower, Molloy, and Render, 2004; and emergency department personnel: Gray-Eurom, 2006) and the nuclear power industry (Theureau, Filippi, Saliou, Le Guilcher, and Vermersch, 2002).
A special case of forgetting to perform tasks is to become so preoccupied with one task that the individual inadvertently stops concurrent monitoring of the status of other tasks (Dismukes et al., 1998). Even when monitoring does not drop out completely, the quality of monitoring may suffer (Wickens, 2005), and pilots may fail to notice changes in the status of the monitored system (Bellenkes, Wickens, and Kramer, 1997; Mumaw, Sarter, Wickens, Kimball, Nikolic, Marsh, et al., 2000). Preoccupation also makes pilots and operators of other complex systems vulnerable to “habit capture” errors in which they inadvertently take a highly practiced action instead of an intended action that is less common (Reason, 1990).
Another form of error occurs when pilots are interrupted and forget to resume the interrupted task (Degani and Wiener, 1990; Dismukes and Nowinski, 2006; Dodhia and Dismukes, 2008). Interruptions have been found to impair the performance of both individuals and teams engaged in diverse tasks involving detailed procedures: nurses administering medication (Hickam, Severance, Feldstein, et al., 2003; O’Shea, 1999; Pape, 2003); biomedical engineers providing space mission support (Rukab, Johnson-Throop, Malin, and Zhang, 2004); police dispatchers routing emergency calls (Kirmeyer, 1988a); physicians coordinating and communicating with nurses in the emergency department (Alvarez and Coiera, 2005; Chisholm, Collison, Nelson, and Cordell, 2000); and operators monitoring nuclear power plant operations (De Carvalho, Rixey, Shepley, Gomes, and Guerlain, 2006). Interruptions are also commonplace in routine office work. Not only are workers vulnerable to forgetting to resume interrupted tasks in a timely manner, but when they do resume they must often struggle to mentally reconstruct the status of the interrupted task (implying a delay, aptly labeled “recovery time” by Tucker and Spear, 2006), and they are vulnerable to increased error rates (Gillie and Broadbent, 1989; Latorella, 1999; Monk, Boehm-Davis and Trafton, 2004; Speier, Valacich and Vessey, 2003; Trafton, Altmann, Brock and Mintz, 2003). Similarly, when pilots are forced by circumstances to defer a task that is normally performed at a certain point in a standard procedure to a later point in time, they are vulnerable to forgetting to execute the deferred task, especially when busy with other task demands (Dismukes, 2007; Dismukes and Nowinski, 2006).
One might argue that some of the situations described here do not involve concurrent task management because the tasks are performed sequentially rather than simultaneously. However, we use the term concurrent task management because the pilot is responsible for scheduling and executing multiple tasks whose status must at least be monitored concurrently. This is a dynamic process that must be continuously updated. The term concurrent task management does not imply that multiple tasks must be performed simultaneously, just managed concurrently. Another concern is that concurrent task management might be taken to imply deliberate, strategic efforts by pilots or other individuals to manage competing task demands, whereas in fact individuals sometimes react to competing task demands without an explicit overall strategy. In this book we use the term concurrent task management broadly to refer both to the challenge imposed by multiple task demands occurring within the same time frame, and to the ways in which individuals respond to those challenges, whether or not those responses are deliberate or well thought out. This use of the term is consistent with how it is already being used in the research literature.
Aviation accident and incident report studies reveal that concurrent task management is challenging and vulnerable to error, but do not explain why these errors occur. That explanation requires a rigorous experimental investigation of the cognitive processes, particularly attention and memory, involved in responding to the demands of specific cockpit situations. Furthermore, the nature of concurrent task demands in the operational context, particularly in airline cockpit operations, has never before been analyzed in detail. To understand the issues underlying concurrent task management, we first need a thorough description of the types of tasks that must be performed concurrently, the demands posed by these tasks separately and together, the temporal structure of these demands, and the characteristic forms of error associated with typical combinations of tasks. This description can inform the operational community about cockpit situations vulnerable to error.
Another reason a detailed description of the real-world task demands of cockpit operations is required can be found in the FAA Advisory Circular (FAA, 2006a) on Safety Management Systems. This Advisory Circular recommends that systems and task analysis, hazard identification, and risk analysis and assessment be performed as the first three steps of risk management. Further, to become eligible to participate in the FAA’s Advanced Qualification Program (AQP) (FAA, 2006b), airlines must formally analyze the tasks performed by pilots and the skills required in the airline’s particular operations. However, this analysis is typically performed on the basis of a formal description of pilot duties that in the next chapter we refer to as the “Ideal”. As will become apparent in this book, this ideal description fails to capture crucial aspects of cockpit task demands and their complexity. A more realistic characterization of the full scope of cockpit task demands in actual flights (i.e., in “line operations”) would provide better guidance for effective training under AQP. It would also provide a foundation for designing flight operating procedures to minimize vulnerability to error.
Still another benefit of characterizing concurrent task demands in line operations is that it can help the scientific community design experimental research to elucidate the cognitive processes that both enable concurrent task performance and make it vulnerable to error. This research can provide a rational basis for designing a wide range of countermeasures to reduce vulnerability to error. Our work aims to address these needs and to provide these benefits. Although we use cockpit operations as our study domain, our analysis applies to many other domains in which humans must deal with concurrent task demands, ranging from hospital emergency rooms, to control rooms in nuclear power plants, to office work in the computer age.

Our Study

In the research presented in this book, we extend previous studies of concurrent task management in two important ways: (1) describing in some detail the various situations involving concurrent task management that challenge aircraft crews and (2) characterizing one of the most common forms of error associated with concurrent task demands—inadvertent omissions of intended actions. These inadvertent omissions constitute several forms of prospective memory error. Prospective memory refers to the cognitive processes involved in remembering or forgetting to perform actions intended to be performed at a later time (i.e., delayed intentions). As the example of flight 795 illustrates, human operators often do not recognize they have forgotten to perform a crucial action until it is too late to recover. Throughout this book, we link the real-world aspects of concurrent task management and prospective memory to the underlying cognitive processes. This linkage should provide a foundation for developing better ways of managing concurrent task demands and avoiding prospective memory errors.
We chose to study concurrent task management in the cockpit for several reasons: It is an unavoidable aspect of flight operations; the challenges of managing multiple tasks concurrently have contributed to many aviation accidents; and only recently have scientists begun to analyze the nature of these challenges. Further, the cockpit is a structured environment well-suited for observation, and our observations can serve as a model for analyzing concurrent task management in diverse other settings.
Our focus on inadvertent omission of intended actions as a consequence of difficulties with managing tasks concurrently was driven by four factors:
1. Omission of intended actions can have grave consequences, as illustrated by the example of flight 795, discussed at the beginning of this chapter.
2. For the purposes of research, these omissions are easier to identify in accident and incident records than many other forms of error. Flight operating manuals (FOMs) specify in considerable detail the actions crews must perform during flight; thus, omissions are fairly conspicuous. In contrast, crews are given a fair amount of latitude in judgment and decision-making; consequently, evaluating these aspects of crew performance tends to be more subjective.
3. A burgeoning new field of research on prospective memory is providing ways to understand the cognitive processes underlying vulnerability to inadvertent errors of omission (see, for example, McDaniel and Einstein, 2007).
4. We also suspect that inadvertent omissions of intended actions may underlie and contribute to more subtle forms of error. For example, preoccupied with one task, a pilot may inadvertently neglect to monitor the status of other tasks, thus undermining situation awareness, which in turn is likely to impair decision-making. Further, Dismukes et al. (2007) found that when crews become overloaded with multiple task demands they sometimes respond in a maladaptive fashion, inadvertently dropping strategic management in favor of a less demanding, but far less effective, style of reacting to events only as they occur, rather than anticipating and planning for events. This too is a kind of error of omission—omission of strategy.

Our Approach

We begin our analysis with a detailed description of the cockpit tasks of the two-person crew of a large airliner and then characterize the diverse types of situations in which tasks must be managed concurrently. This characterization provides a foundation for our analysis of the cognitive processes that enable humans to manage tasks concurrently, and the ways in which this management is vulnerable to error.
The study at the core of this book required a multi-faceted approach (described in detail in Appendix A). One component was an ethnographic approach in which we participated in airline flight training, analyzed FOMs and flight reference manuals (FRMs), and observed normal flight operations from the cockpit jumpseat at two major U.S. airlines. Another compone...

Table of contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Contents
  5. Acknowledgments
  6. Reviews for The Multitasking Myth
  7. Preface
  8. 1 Introduction
  9. 2 What is Multitasking and How is it Accomplished?
  10. 3 The Ideal: Flight Operations as Depicted by Flight Operations Manuals
  11. 4 The Real: Flight Operations Add Complexity and Variability
  12. 5 Analysis of Concurrent Task Demands and Crew Responses
  13. 6 The Research Applied
  14. Appendix A Methods
  15. Appendix B Human Agents
  16. Appendix C Observed Perturbations
  17. Appendix D Errors Cited in ASRS Incident Reports
  18. Glossary
  19. References
  20. Index