Human Performance in Automated and Autonomous Systems

Emerging Issues and Practical Perspectives

eBook - ePub

  1. 314 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About this book

This book is devoted to the examination of emerging practical issues related to automated and autonomous systems. It highlights the significance of these emerging technologies, which increasingly shape the course of our daily lives. Each chapter addresses human factors and engineering concerns across real-world applications, including aviation and healthcare, human-robot interaction, transportation systems, cybersecurity, and cyber defense. The book also examines the boundaries that separate humans from machines as we become ever more immersed in, and symbiotic with, these fast-emerging technologies.

Across many occupations, automation has shifted the human to a role of monitoring machines, presenting challenges related to vigilance and workload. This book emphasizes the importance of an approach to automated technology that places the "human user" at the center of the design process.

Features

  • Provides perspectives on the role of the individual and teams in complex technical systems such as aviation, healthcare, and medicine
  • Presents the development of highly autonomous systems related to human safety and performance
  • Examines solutions to human factors challenges presented by modern threats to data privacy and cybersecurity
  • Discusses human perceptual and cognitive capabilities underpinning the design of automated and autonomous systems
  • Provides in-depth, expert reviews of context-related developments in automation and human-robot teaming

Human Performance in Automated and Autonomous Systems: Emerging Issues and Practical Perspectives applies scientific theory directly to real-world systems where automated and autonomous technologies are implemented.

1
Human Monitoring of Automated Systems
Mustapha Mouloua, James C. Ferraro, Raja Parasuraman, Robert Molloy, & Brian Hilburn

Introduction

It has already been nearly 25 years since our original chapter was published in the first edition of this book (Parasuraman & Mouloua, 1996). That previous chapter covered many of the critical human performance issues related to highly automated systems and, in particular, emphasized the aviation system. The same problems examined in the original book chapter remain pertinent today, albeit across other domains and applications (e.g., health care and medicine, industrial process control, and nuclear power). The proliferation of automated systems and devices continues to increase at a remarkable rate due to the evident benefits to human performance and safety, as documented in previous publications (see Billings, 1997; Bogner, Mouloua, & Parasuraman, 1994; Garland & Wise, 2010; Mouloua & Koonce, 1997; Mouloua & Parasuraman, 1994; Parasuraman & Mouloua, 1996; Scerbo & Mouloua, 1999; Sheridan, 2002; Vincenzi, Mouloua, & Hancock, 2004a, 2004b; Wiener & Nagel, 1988). Collectively, these texts include a wide array of chapters addressing problems often encountered in highly automated systems. For example, several of these problems were attributed to automation-induced complacency and/or automation-induced monitoring inefficiency. Together with the associated concerns about de-skilling of human operators, such problems have also been documented in accident and incident reports (e.g., the National Aeronautics and Space Administration's Aviation Safety Reporting System, NASA ASRS). Similarly, a line of programmatic work by Endsley and her associates has covered a variety of problems related to loss of situation awareness in highly automated systems (Endsley & Garland, 2000; Endsley & Jones, 2004; Endsley & Strauch, 1997). With the advent of even more fully autonomous and semiautonomous systems, in both military and civilian airspace, as well as in military surface environments, entertainment, learning, and medical systems, it seems inevitable that the same automation problems will persist as long as machines and intelligent agents replace the active role of human system operators. Such replacements place the human in a more passive or supervisory role, a role that is often not well suited to humans (Hancock, 2013; Parasuraman & Mouloua, 1996). The present chapter is an evolution of our earlier work published in the original book; here, we provide an update based on developments in the literature, centered on human capabilities in automation monitoring.
The revolution ushered in by the digital computer in the latter half of the last century transformed many of the characteristics of work, leisure, and travel for most people throughout the world. Even more radical changes have occurred during this century, as computers have increased in power, speed, availability, flexibility, and in that elusive concept known as “intelligence.” Only a neo-Luddite would want to operate in the 21st century without the capabilities that the new computer tools provide; and perhaps even a latter-day Thoreau would not wish to trade in his word processor for pen and paper. And yet, although we have become accustomed to the rise of computers and, as consumers, have demanded that they perform ever greater feats, many have felt a sense of unease at the growth of computerization and automation in the workplace and in the home. Although there are several aspects to this disquiet, there is one overriding concern: Who will watch the computers?
The concern is not just the raw material for science-fiction writers or germane only to the paranoid mind but something much more mundane. Computers have taken over more of human work—ostensibly leaving humans less to do, to do more in less time, to be more creative in what they do, or to be free to follow other pursuits. For the most part, computers have led to these positive outcomes—they have freed us from the hard labor of repetitive computation and allowed us to engage in more creative pursuits. But in some other cases, the outcomes have not been so sanguine; in these instances, human operators of automated systems may have to work as hard or even harder, for they must now watch over the computers that do their work. This may be particularly true in complex human-machine systems in which several automated subsystems are embedded, such as the commercial aircraft cockpit, the nuclear power station, and the advanced manufacturing plant. Such complex, high-risk systems, in which different system subcomponents are tightly “coupled,” are vulnerable to system monitoring failures that can escalate into large-scale catastrophes (Perrow, 1984; Weick, 1988). Editorial writers have rightly called for better understanding and management of these low-probability, high-consequence accidents (Koshland, 1989).
One of the original reasons for the introduction of automation into these systems was to assist humans in dealing with complexity and to relieve them of the burden of repetitive work. The irony (Bainbridge, 1983) is that one source of workload may be replaced by another: Monitoring computers to make sure they are doing their job properly can be as burdensome as doing the same job manually and can impose considerable mental workload on the human operator. Sheridan (1970) first discussed how advanced automation in modern human-machine systems changes the nature of the task demands imposed on the human operator of such systems. He characterized the role of the human operator in highly automated systems as altered from that of an active, manual controller to a supervisor engaged in monitoring, diagnosis, and planning. Each of these activities can contribute to increased mental workload.
Many of the changes brought about by automation have led to significant system benefits, and it would be difficult to operate many complex modern systems such as nuclear power plants or military aircraft without automation (Sheridan, 1992). Although users of automated systems often express concerns about the trend of “automation for automation's sake” (Peterson, 1984), many automated systems have been readily accepted and found invaluable by users (e.g., the horizontal situation indicator map display used by pilots). At the same time, some other changes associated with automation have reduced safety and user satisfaction, and a deeper understanding of these changes is necessary for successful implementation and operation of automation in many different systems (Mouloua & Parasuraman, 1994; Wickens, 1994; Wiener, 1988). Among the major areas of concern is the impact of automation on human monitoring. Automation of a task for long periods of time increases the demand on the operator to monitor the performance of the automation, given that the operator is expected to intervene appropriately if the automation fails. Because human monitoring can be subject to error under certain conditions, understanding how automation affects monitoring is of considerable importance for the design of automated systems. This chapter discusses the interrelationships of automation and monitoring and the corresponding implications for the design of automated systems.
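The monitoring demand described above can be made concrete with a small simulation. The Python sketch below is purely illustrative and is not drawn from the chapter or from any cited study: it assumes an automated subsystem that fails on a small fraction of trials and an operator whose probability of catching a failure declines across successive blocks of a long session, and it reports the proportion of failures detected in each block. All names and parameter values are assumptions chosen for illustration.

import random

# Hypothetical sketch of a long-duration monitoring session (illustration only;
# not data or a method from this chapter). An operator supervises an automated
# subsystem that occasionally fails; we track how many failures are caught per block.
random.seed(1)

FAILURE_PROB = 0.05        # assumed chance that the automation fails on a given trial
BASE_DETECTION = 0.95      # assumed detection probability when fully attentive
VIGILANCE_DECLINE = 0.08   # assumed per-block drop in detection probability

def run_block(block_index, trials=200):
    """Return the proportion of automation failures detected in one block."""
    p_detect = max(0.5, BASE_DETECTION - VIGILANCE_DECLINE * block_index)
    failures = detected = 0
    for _ in range(trials):
        if random.random() < FAILURE_PROB:      # the automation fails on this trial
            failures += 1
            if random.random() < p_detect:      # the operator notices and intervenes
                detected += 1
    return detected / failures if failures else 1.0

for block in range(4):
    print("block", block + 1, "detection rate =", round(run_block(block), 2))

Such toy models are no substitute for the empirical complacency and vigilance findings reviewed in this chapter; they serve only to illustrate why sustained monitoring of rarely failing automation is itself a demanding task.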

Examples of Operational Monitoring: Normal Performance and Incidents

It has become commonplace to point out that human monitoring can be subject to errors. Although this is sometimes the case, in many instances operational monitoring can be quite efficient. In general, human operators perform well in the diverse working environments in which monitoring is required. These include air traffic control, surveillance operations, power plants, intensive-care units, and quality control in manufacturing. In large part, this probably stems from general improvements over the years in working conditions and, in some cases (although not generally), from increased attention to ergonomic principles. In one sense, when the number of opportunities for failure is considered—virtually every minute for these continuous, 24-hour systems—the relatively low frequency of human monitoring errors is quite striking.
This is not to say that errors do not occur. But often, when human monitoring is imperfect, it occurs under working conditions that are less than ideal. Consider the monitoring performance of personnel who conduct X-ray screening for weapons at airport security checkpoints. These operators are trained to detect several types of weapons and explosives, yet they may rarely encounter them in their daily duty periods. To evaluate the efficiency of security screening, Federal Aviation Administration (FAA) inspectors conduct random checks of particular airline screening points using several test objects corresponding to contraband items, including guns, pipe bombs, grenades, dynamite, and opaque objects. The detection rate of these test objects by airport X-ray screening personnel is typically good, although not perfect, as shown in Figure 1.1 (Air Transport Association, 1989). Founded in 2001, the Transportation Security Administration (TSA) was tasked with administering airport security screening and safety procedures throughout U.S. airports. This has led to much improved safety standards related to TSA personnel selection and training. However, there still exist some human factors challenges that are readily understandable given even a cursory evaluation...
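Monitoring performance of this kind is commonly summarized with signal detection measures. The short Python sketch below uses entirely hypothetical counts (not the Figure 1.1 data) to show how a hit rate, a false-alarm rate, and the sensitivity index d' would be computed for a screening task.

from statistics import NormalDist

# Hypothetical screening-check counts, for illustration only; these are NOT the
# Figure 1.1 (Air Transport Association, 1989) data.
hits, misses = 46, 4                        # planted test objects flagged vs. missed
false_alarms, correct_rejections = 30, 970  # benign bags flagged vs. passed

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rejections)

# Sensitivity (d') separates detection ability from the screener's response bias.
z = NormalDist().inv_cdf
d_prime = z(hit_rate) - z(fa_rate)

print("hit rate =", round(hit_rate, 2))            # 0.92 with these assumed counts
print("false-alarm rate =", round(fa_rate, 3))     # 0.03 with these assumed counts
print("d' =", round(d_prime, 2))

A high hit rate alone can mask a lenient response criterion, which is why sensitivity and bias are usually reported separately in the vigilance and inspection literature.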

Table of contents

  1. Cover
  2. Half Title
  3. Title
  4. Copyright
  5. Contents
  6. Preface
  7. Remembering Fallen Heroes: A Tribute to Raja Parasuraman, Joel Warm, & Neville Moray
  8. Acknowledgments
  9. About the Editors
  10. List of Contributors
  11. Chapter 1 Human Monitoring of Automated Systems
  12. Chapter 2 Motor Performance Assessment and Its Implication for Display and Control Systems
  13. Chapter 3 The Role of Automation in Aviation Weather: Product Development and General Aviation Pilot Performance
  14. Chapter 4 A Playbook-Based Interface for Human Control of Swarms
  15. Chapter 5 Human-Machine System Performance in Spaceflight: A Guide for Measurement
  16. Chapter 6 Decision Support in Medical Systems
  17. Chapter 7 Creating and Evaluating Human-Machine Teams in Context
  18. Chapter 8 20 Years of Automation in Team Performance
  19. Chapter 9 Automation Trust and Situational Experience: Theoretical Assumptions and Experimental Evidence
  20. Chapter 10 Human Performance with Autonomous Robotic Teammates: Research Methodologies and Simulations
  21. Chapter 11 Organizational and Safety Factors in Automated Oil and Gas Pipeline Systems
  22. Chapter 12 Cybersecurity in Organizations: A Sociotechnical Systems Approach
  23. Chapter 13 Evolution of Phishing Attacks: Challenges and Opportunities for Humans to Adapt to the Ubiquitous Connected World
  24. Chapter 14 Teleology for Technology
  25. Chapter 15 The Axial Age of Artificial Autonomy
  26. Author Index
  27. Subject Index