Cognitive Task Analysis
  1. 547 pages
  2. English
  3. ePUB (mobile friendly)

About this book

Cognitive task analysis is a broad area consisting of tools and techniques for describing the knowledge and strategies required for task performance. Cognitive task analysis has implications for the development of expert systems, training and instructional design, expert decision making, and policymaking. It has been applied in a wide range of settings and for different purposes, for instance, specifying user requirements in system design or specifying training requirements in a training needs analysis. The topics covered by this work include general approaches to cognitive task analysis, system design, instruction, and cognitive task analysis for teams. The work settings to which the tools and techniques described in this work have been applied include 911 dispatching, faultfinding on board naval ships, aircraft design, and various support systems.

The editors' goal in this book is to present in a single source a comprehensive, in-depth introduction to the field of cognitive task analysis. They have attempted to include as many examples as possible in the book, making it highly suitable for those wishing to undertake a cognitive task analysis themselves. The book also contains a historical introduction to the field and an annotated bibliography, making it an excellent guide to additional resources.

I
Introduction and History

1
Introduction to Cognitive Task Analysis

Susan F. Chipman
Office of Naval Research

Jan Maarten Schraagen
TNO Human Factors

Valerie L. Shalin
Wright State University

Modern work, with its increasing reliance on automation to support human action, is focusing attention on the cognitive aspects of work that are not accessible to direct observation. For example, it is obvious that the physical acts of button pushing that occur in the command center of a modern ship are of less intrinsic importance than the mental decision processes executed via those actions. The mental processes organize and give meaning to the observable physical actions. Attempts to analyze a task like air traffic control with traditional behavioral task analysis techniques made the shortcomings of those techniques strikingly clear (Means, 1993). Starting in the 1960s, the cognitive revolution in academic psychology both increased our awareness of the extensive cognitive activity underlying even apparently simple tasks and provided research techniques and theories for characterizing covert cognition. Hence, the term cognitive task analysis is coming into use to describe a new branch of applied psychology. The relative newness of this enterprise is evidenced by the fact that, as of this writing, a search of the entire PsycINFO database with the term yielded only 28 items, some irrelevant, and a search in the Science Citation Index yielded 30 items. The high current interest in cognitive task analysis is evidenced by recent literature review efforts undertaken by a British aerospace company (confidential) and by the French military (Doireau, Grau, & Poisson, 1996), as well as by the NATO study group effort reported here.
Cognitive task analysis is the extension of traditional task analysis techniques to yield information about the knowledge, thought processes, and goal structures that underlie observable task performance. Some would confine the term exclusively to the methods that focus on the cognitive aspects of tasks, but this seems counterproductive. Overt observable behavior and the covert cognitive functions behind it form an integrated whole. Artificially separating and focusing on the cognitive alone is likely to produce information that is not very useful in understanding, aiding, or training job performance. The tension between traditional behavioral task analysis techniques and newer cognitive task analysis is largely a U.S. phenomenon. Elsewhere, behaviorism never took hold as it did in the U.S., where military regulations governing training development have forbidden talk of processes that go on inside the head almost until the present day. Annett, Duncan, Stammers, and Gray’s (1971) hierarchical task analyses, for example, often segued smoothly from the domain of observable behavior to the internal world of perception and cognition (see also Duncan, 1974). The changing nature of work, however, is universal throughout the developed world. Even those who did not eschew analysis of the cognitive aspects of work now need more powerful tools and techniques to address the large role of cognition in modern work. Chapter 2 (this volume) reviews the history of task analysis more fully.
Analyses of jobs and their component tasks may be undertaken for a wide variety of purposes, including the design of computer systems to support human work, the development of training, or the development of tests to certify job competence. An emerging frontier of modern task analysis is the analysis of entire working teams’ activities. This is done for purposes such as the allocation of responsibilities to individual humans and cooperating computer systems, often with the goal of reducing the number of humans who must be employed to accomplish the team function. Given the purposes and constraints of particular projects, several (cognitive) task analysis approaches merit consideration. Savvy customers and practitioners of cognitive task analysis must know that one approach will not fit all circumstances. On the other hand, a thoroughgoing cognitive task analysis may repay the substantial investment required by proving applicable to purposes beyond the original intent. For example, Zachary, Ryder, and Hicinbothom (Chap. 22, this volume) analyzed the tasks of the AEGIS antiair warfare team in order to build an artificially intelligent training system, but these same analyses are being used to guide the design of advanced workstations and new teams with fewer members.
This book is the ultimate product of a NATO study group aiming to capture the state of the art of cognitive task analysis. The intent is to advance it toward a more routine engineering discipline—one that could be applied reliably by practitioners not necessarily educated at the doctoral level in cognitive psychology or cognitive science. To that end, two major activities were undertaken. One was a review of the state of the art of cognitive task analysis, focusing on recent articles and chapters claiming to review cognitive task analysis techniques. This effort produced a bibliographic resource appearing as chapter 28 in this book. We hope that this chapter gives sufficient information to help students and other readers decide which of these earlier contributions to the field they should read for their particular purposes. The second major activity of the NATO study group was an international workshop intended to provide an up-to-date snapshot of cognitive task analyses, emphasizing new developments. Invitations were extended to known important contributors to the field. The opportunity to participate was also advertised widely through electronic mailing lists to capture new developments and ongoing projects that might not be known to the study group members organizing the workshop. This book is largely the product of that workshop, sharing its insights into the state of the art of this new field. This introduction provides an overview of these two activities. First, we sketch a prototypic cognitive task analysis, based on results from the NATO study group. Next, we describe the organization of the chapters in this book that resulted from the international workshop.

The Prototypic Cognitive Task Analysis Process as Seen in Prior Literature

Ironically, the cognitive analysis of tasks is itself a field of expertise like those it attempts to describe. Reviewing recent discussions of cognitive task analysis reveals that the explicitly stated state of the art lacks precisely the kinds of knowledge most characteristic of expertise. A large number of particular, limited methods are described repeatedly. However, little is said about how these can be effectively orchestrated into an approach that will yield a complete analysis of a task or job. Little is said about the conditions under which an approach or method is appropriate. Clearly, the relevant conditions that need to be considered include at least the type of task being analyzed, the purpose for which the analysis is being done (human-computer interaction design, training, testing, expert system development), and the resources available for the analysis, particularly the type of personnel available to do the analysis (cognitive scientists, cognitive psychologists, educational specialists, subject-matter experts). The literature is also weak in specifying the way in which the products of task analysis should be used in designing either training or systems with which humans will interact. The prior literature on cognitive task analysis is also limited by a focus on the tasks of individuals, almost exclusively existing tasks for which there are existing task experts.
Nevertheless, the literature review effort did, within these limits, provide the image of a prototypic ideal case of the cognitive task analysis process, as it might be when unhampered by resource limitations. What emerges as the ideal case, assuming that resource limitations are not a problem? Although the answer to this question may vary somewhat, depending on the purpose for which the analysis is being done, we set aside that consideration for the moment and assume that the purpose is training and associated proficiency measurement. Several of the articles we reviewed are strong in their presentation of an inclusive recommended approach to cognitive task analysis (e.g., DuBois & Shalin, 1995; Hall, Gott, & Pokorny, 1995; Hoffman, Shadbolt, Burton, & Klein, 1995; Means, 1993). In the present volume, the following chapters also present reasonably inclusive descriptions of the process: chapter 3 by DuBois and Shalin, chapter 6 by Flach, and chapter 9 by Seamster, Redding, and Kaempf.

Preliminary Phase

One should begin a cognitive task analysis with a study of the job or jobs involved to determine what tasks merit the detailed attention of a cognitive task analysis. Standard approaches from personnel psychology are appropriate for this phase of the effort, using unstructured interviews and/or questionnaires to determine the importance, typicality, and frequency of tasks within job performance. Hall et al. (1995) discussed this preliminary phase, as did DuBois and Shalin (1995) with somewhat more methodological detail. DuBois and Shalin also pointed out the importance of focusing on the tasks or problems within general tasks that discriminate more expert performance from routine performance, even though these may not be high-frequency events. Klein Associates’ approach seems to embody the same view, with an emphasis on gathering data about past critical incidents in experts’ experience.
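To make this prioritization concrete, the sketch below (in Python) ranks tasks by combining the kinds of survey ratings just described with a boost for expertise-discriminating tasks. The scoring rule, weights, and task names are invented for illustration; none of the reviewed sources prescribes a particular formula.

```python
# Hypothetical survey results: mean incumbent ratings (1-5) per task, plus a
# flag for tasks that discriminate expert from routine performance.
tasks = [
    {"name": "Route incoming calls",    "importance": 3, "frequency": 5, "discriminating": False},
    {"name": "Triage ambiguous report", "importance": 5, "frequency": 2, "discriminating": True},
    {"name": "Log shift summary",       "importance": 2, "frequency": 4, "discriminating": False},
]

def priority(task, expertise_bonus=6.0):
    """Invented scoring rule: weight importance by frequency, then boost tasks
    that separate experts from routine performers, so rare but discriminating
    tasks can outrank frequent routine ones."""
    score = task["importance"] * task["frequency"]
    if task["discriminating"]:
        score += expertise_bonus
    return score

# Resulting ranking: triage (16.0) > routing (15.0) > logging (8.0).
for task in sorted(tasks, key=priority, reverse=True):
    print(f'{priority(task):5.1f}  {task["name"]}')
```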
Depending on the availability of written materials about the job or task, such as existing training materials, the first step for those responsible for the analysis probably should be to read those materials to gain a general familiarity with the job or task and a knowledge of the specialized vocabulary (this is referred to as bootstrapping by Hoffman et al. [1995], and table-top analysis by Flach [chap. 6, this volume]). The major alternative is to begin with informal, unstructured interviews with persons who have been identified as experts. In the ideal case, the task analysis becomes a team effort among one or more experts in cognitive task analysis and several subject-matter experts. Of course, it is important to obtain the time, effort, and cooperation of experts who are in fact expert. Hall et al. (1995) discussed the issue of the scarcity of true experts and the selection of appropriate experts in moderate detail. Hoffman et al. (1995) were also concerned with the gradations of expertise. Articulate experts with recent experience in both performing and teaching the skill are particularly useful. For example, the MYCIN (Buchanan & Shortliffe, 1984) expert was renowned for his ability to teach medical diagnosis.
It is also true that not just anyone is suitable for acting as a cognitive task analyst—not even just anyone who is educated in cognitive psychology and cognitive science. Analysts must have the social skills to establish rapport with the subject-matter experts (SMEs), sometimes across the barriers of different social, cultural, and economic backgrounds. If doing unstructured or even structured interviews, they must be verbally adept to adapt to the changing circumstances of the interview. They must be intelligent, quick learners because they have to learn a great deal about the task to analyze it effectively. Hoffman et al. (1995) and Crandall, Klein, Militello, and Wolf (1994) discussed some of these issues about the requirements for cognitive task analysts. Forsythe and Buchanan (1993) also appears to be a reference of interest on these points. There is also a good deal of literature from the expert systems community dealing with the practicalities of interviewing and with requirements that both the knowledge engineer and the expert must meet (e.g., Firlej & Hellens, 1991; McGraw & Harbison-Briggs, 1989; Meyer & Booker, 1991; Waterman, 1986).

Identifying Knowledge Representations

A major goal for the initial unstructured interviews with the SMEs should be to identify the abstract nature of the knowledge involved in the task, that is, the type of knowledge representations that need to be used. This can order the rest of the task analytic effort. This point is not explicit in the literature, but the more impressive, convincing approaches are organized around a knowledge representation or set of knowledge representations appropriate for the job or task. For example, DuBois and Shalin (1995, chap. 3, this volume) use a goal/method graph annotated with additional information about the basis for method selection and the explanation of the rationale or principles behind the method. Less explicitly, the PARI method (Hall et al., 1995) gathers essentially the same information supplemented by information about the experts’ mental organization of device structure and function. Crandall et al. (1994) advocated collecting mental models of the task and of the team context of work, as well as of the equipment. For eliciting knowledge about how a device or system works, Williams and Kotnour (1993) described Miyake’s (1986) constructive interaction. Benysh, Koubek, and Calvez (1993) proposed a knowledge representation that combines procedural information with conceptual information. Similarly, in ongoing work, Williams, Hultman, and Graesser (1998) have collaborated on ways to combine the representations of declarative and procedural knowledge.
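To make the idea concrete, here is a minimal Python sketch of what an annotated goal/method representation might look like as a data structure. The class names, fields, and the troubleshooting fragment are hypothetical illustrations, not DuBois and Shalin's actual notation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Method:
    """One way of achieving a goal, annotated with the extra information a
    goal/method graph can carry: selection basis and underlying rationale."""
    name: str
    selection_conditions: List[str]  # when an expert would choose this method
    rationale: str                   # principle explaining why the method works
    subgoals: List["Goal"] = field(default_factory=list)  # decomposition into steps

@dataclass
class Goal:
    """A task goal together with the alternative methods that achieve it."""
    name: str
    methods: List[Method] = field(default_factory=list)

# A fragment of an invented troubleshooting task:
isolate_fault = Goal("Isolate the faulty component")
isolate_fault.methods.append(Method(
    name="Split-half testing",
    selection_conditions=["signal path is linear", "test points are accessible"],
    rationale="Halving the candidate set on each test minimizes expected tests.",
))
```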
Semantic networks are probably overrepresented in reviews of knowledge acquisition methods relative to their actual utility. Although measures of conceptual relatedness or organization are sensitive to growth in expertise, they may actually be derived from more complex knowledge organizations in the experts’ minds, such as those mentioned earlier that integrate procedural and declarative knowledge. For example, it might be a mistake to attempt to directly train the conceptual organizations one deduces from studies of experts. However, semantic networking or clustering techniques have been successfully used to structure more effective computer interfaces (Patel, Drury, & Shalin, 1998; Roske-Hofstrand & Paap, 1986; Vora, Helander, & Shalin, 1994). As we gain experience with cognitive task analysis, it may become possible to define a taxonomy of tasks that, in effect, would classify tasks into types for which the same abstract knowledge representations and the same associated knowledge-elicitation methods are appropriate. However, we should always keep in mind the possibility that the particular task of concern may involve some type of knowledge not in the stereotype for its assigned position in the classification scheme.
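Returning to the clustering techniques just mentioned: the toy sketch below, in Python, groups interface functions whose expert-rated relatedness exceeds a threshold into candidate menu clusters. The function names, ratings, threshold, and the crude connected-components method are all invented for illustration; the studies cited above used more sophisticated scaling and network techniques.

```python
# Hypothetical expert ratings (0-10) of pairwise relatedness among
# interface functions, of the kind gathered in menu-structuring studies.
functions = ["open", "save", "print", "zoom", "rotate"]
relatedness = {
    ("open", "save"): 9, ("open", "print"): 6, ("save", "print"): 7,
    ("zoom", "rotate"): 8, ("open", "zoom"): 2, ("open", "rotate"): 2,
    ("save", "zoom"): 1, ("save", "rotate"): 1, ("print", "zoom"): 1,
    ("print", "rotate"): 1,
}

def cluster(items, ratings, threshold):
    """Group items into connected components whose pairwise relatedness
    meets the threshold -- a crude stand-in for hierarchical clustering."""
    parent = {i: i for i in items}
    def find(x):  # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (a, b), rating in ratings.items():
        if rating >= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for i in items:
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Candidate menu groupings: [['open', 'save', 'print'], ['zoom', 'rotate']]
print(cluster(functions, relatedness, threshold=5))
```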

Knowledge-Elicitation Techniques

Having identified the general framework for the knowledge that has to be obtained, the analysts can then proceed to employ the knowledge-elicitation techniques or methods discussed in the articles reviewed. Structured interviews can be used to obtain information—an approach that is well discussed in Hoffman et al. (1995), Randel, Pugh, and Reed (1996), and Crandall et al. (1994). The extreme of the structured interview is the computer-aided knowledge-elicitation approach, discussed in reviews by Williams and Kotnour (1993) and Cooke (1994) and exemplified by Shute’s (chap. 5, this volume) DNA cognitive task analysis software and Williams’ (chap. 11, this volume) CAT and CAT-HCI tools. The latter structure and support a generalized version of a GOMS-style analysis, generating much the same sort of goal/method representation recommended by DuBois and Shalin. Of course these interviews and other methods must be focused on an appropriate representative set of problems or cases previously identified, as alluded to earlier. The PARI method (Hall et al., 1995) featu...

Table of contents

  1. Cover Page
  2. Half Title page
  3. Series page
  4. Title Page
  5. Copyright Page
  6. Contents
  7. Contributors
  8. Foreword: References
  9. Preface
  10. Part I Introduction and History
  11. Part II Cognitive Task Analysis for Individual Training, Performance Assessment, and Selection
  12. Part III Cognitive Task Analysis for Applications to the Design of Human-System Interaction
  13. Part IV Cognitive Task Analysis for Teamwork Situations
  14. Part V Discussion
  15. Author Index
  16. Subject Index