Errors in Organizations
eBook - ePub

  1. 383 pages
  2. English

About this book

Despite the significance and prevalence of errors in organizations, there has been no attempt within the field of Industrial and Organizational Psychology to create a single source summarizing what we know regarding errors in organizations and providing a focused effort toward identifying future directions of research. This volume answers that need and provides contributions by researchers who have conducted a considerable amount of research on errors occurring in the work context. Students, academics, and practitioners in a wide range of disciplines (e.g., industrial-organizational psychology, medicine, aviation, human factors, and systems engineering) will find this book of interest.

1
Errors, Error Taxonomies, Error Prevention, and Error Management: Laying the Groundwork for Discussing Errors in Organizations
David A. Hofmann and Michael Frese
Every organization is confronted with errors; these errors can result in either positive (e.g., learning, innovation) or negative (e.g., loss of time, poor-quality products) consequences. On the positive side, errors can lay the foundation for outcomes such as innovation and learning (e.g., Sitkin, 1992). For example, both Edmondson (1996) and van Dyck, Frese, Baer, and Sonnentag (2005) found that a positive and constructive approach to errors is associated with organizational outcomes such as learning and performance. With regard to the negative aspects of errors, the majority of the attention within the organizational sciences has focused on the investigation of highly salient and visible organizational failures (e.g., Challenger, Columbia, Chernobyl; Perrow, 1984; Reason, 1987; Starbuck & Farjoun, 2005; Starbuck & Milliken, 1988a; Vaughan, 1996). These investigations have taught us a great deal about how many seemingly independent decisions, actions, and organizational conditions can become interconnected and create extreme failure.
These extreme examples, however, do not really capture the lion’s share of errors occurring within organizations. Individuals working in organizations make errors every day and every hour and (sometimes) make multiple errors in the span of a minute. Researchers, for example, have estimated that for some computer tasks, up to 50% of work time is spent on error recovery (Hanson, Kraut, & Farber, 1984; Kraut, Hanson, & Farber, 1983; Shneiderman, 1987), and Brodbeck, Zapf, PrĂŒmper, and Frese (1993) found that 10% of computer work time is spent handling and recovering from errors. Other computer-based research suggested that individuals average 18 unnecessary cursor movements per hour (Floyd & Pyun, 1987).
Research investigating the use of spreadsheets within organizations also provides evidence regarding the large number of errors contained in these applications. For example, it has been suggested that between 20% and 40% of all spreadsheets in use within organizations contain errors (Panko, 1988, 2005). As a case in point, Davies and Ikin (1987) found—after inspecting 19 spreadsheets used in 10 different firms (dealing with issues such as project cost, payroll, loan schedules, and short-term money market investment analysis)—that 25% contained serious errors. Two such errors were a mistaken transfer of $7 million in funds between divisions and inconsistent currency conversions. Along similar lines, Lawrence and Lee (2004) audited 30 spreadsheets used to justify the financing of projects. They found, on average, that 7% of the spreadsheets contained errors, and that it took an average of six iterations before they were fully error free (see Panko, 2005).
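To make the "inconsistent currency conversions" category concrete, the following is a minimal sketch (written in Python rather than spreadsheet formulas, with invented figures that are not taken from Davies and Ikin, 1987) of how a single stale, hard-coded exchange rate in one line item distorts a total:

```python
# Hypothetical illustration of an inconsistent currency conversion, one of the
# spreadsheet error types mentioned above. All figures and rates are invented.

USD_PER_EUR = 1.10  # the single, current rate every formula should reference

line_items_eur = {"project_cost": 2_000_000, "payroll": 500_000}

# Consistent approach: every conversion references the same rate
total_usd_consistent = sum(v * USD_PER_EUR for v in line_items_eur.values())

# Erroneous approach: one formula still hard-codes a stale rate copied months ago
total_usd_inconsistent = (
    line_items_eur["project_cost"] * 1.25  # stale, hard-coded conversion
    + line_items_eur["payroll"] * USD_PER_EUR
)

print(f"consistent total:   ${total_usd_consistent:,.0f}")
print(f"inconsistent total: ${total_usd_inconsistent:,.0f} "
      f"(overstated by ${total_usd_inconsistent - total_usd_consistent:,.0f})")
```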
Although many of these “smaller” errors occurring within organizations are quickly handled and rectified, sometimes they can create significant negative consequences. For example, Smelcer (1989) estimated that an error in the command code of the Structured Query Language resulted in a loss of $58 million per year (based on the estimated time for error recovery). The loss of the $125 million Mars Climate Orbiter spacecraft provides another example. In this case, the postincident investigation board determined the underlying reason for the loss was a failure to convert several calculations from English measures of force to newtons.
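At its core, the Orbiter loss was a unit-conversion error. The following is a minimal sketch, with hypothetical values rather than the actual flight software, of how a thrust figure left in pound-force and consumed by a calculation expecting newtons is off by a factor of roughly 4.45:

```python
# Illustrative sketch only: how an omitted pound-force -> newton conversion
# propagates. Values and function names are hypothetical, not the actual software.

LBF_TO_NEWTON = 4.448222  # one pound-force expressed in newtons

def impulse_newton_seconds(thrust_newtons: float, burn_seconds: float) -> float:
    """Downstream calculation that assumes SI units (newtons)."""
    return thrust_newtons * burn_seconds

thrust_lbf = 100.0   # value produced upstream in English units (pound-force)
burn_time_s = 10.0

correct = impulse_newton_seconds(thrust_lbf * LBF_TO_NEWTON, burn_time_s)
erroneous = impulse_newton_seconds(thrust_lbf, burn_time_s)  # conversion skipped

print(f"correct impulse:   {correct:.1f} N*s")
print(f"erroneous impulse: {erroneous:.1f} N*s (off by a factor of {correct / erroneous:.2f})")
```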
Despite the importance and prevalence of errors within organizations, there has been no attempt within the field of industrial and organizational psychology or organizational behavior to create a single source that both summarizes what we know regarding errors in organizations and provides a focused effort toward identifying future directions for research. The goal of this volume is to address this gap by providing a forum for researchers who have conducted a considerable amount of research in the error domain to discuss how to extend this research, and to give researchers who have not yet considered the implications of errors for their own domain of organizational research an outlet to do so. Our goal in this first chapter is to provide those who are not familiar with the error literature with an overview of it. We begin by defining errors and differentiating errors from other related terms. We then describe a goal-directed view of behavior within organizations. Next, we turn to a discussion of an error taxonomy specifying which types of errors might be expected at the different stages of goal-directed behavior. After discussing the different types of errors, we consider the challenges involved in error detection. Following this, we transition into a discussion of collective errors. We conclude by discussing the distinction between error prevention and error management, along with some thoughts regarding how to implement each of these activities.
Errors, Inefficiencies, Violations, and Risk
The Merriam-Webster Online Dictionary (2006) defines an error as an act or condition of ignorant or imprudent deviation from a code of behavior; an act involving an unintentional deviation from truth or accuracy; and an act that through ignorance, deficiency, or accident departs from or fails to achieve what should be done. There are several key ideas nested within this definition worth highlighting. First, an error only occurs when there is a deviation from something else. In other words, classifying something as an error implies “an error compared to what”; the “what” in this case is some external goal, standard of behavior, or truth. Second, an error is an unintended deviation. Third, an error can come about through different mechanisms. For example, an individual may not know the standard (ignorance) or may fail to enact his or her intention successfully (e.g., I intended to hit the nail with the hammer but erroneously hit my finger).
As described in more detail in this chapter, we assume that behavior in organizations consists of goal-oriented action. Thus, errors imply a nonachievement of goals; the successful accomplishment of these goals would be the intention. Reason (1990) noted that actions should not be classified as errors if they are brought about by some chance agency. For example, if a person is prevented from achieving a goal due to a lightning strike that results in a temporary power outage, this should not be classified as an error. If errors are not brought about by some chance agency, then the individual was—at least theoretically—in control of his or her action; therefore, errors are potentially avoidable. Taking into consideration these various aspects of errors, we have chosen to define actions as erroneous when they unintentionally fail to achieve their goal if this failure was potentially avoidable (i.e., did not arise from some unforeseeable chance agency; Reason, 1990; Zapf, Brodbeck, Frese, Peters, & PrĂŒmper, 1992).
In light of this definition of errors, we can now discuss several related concepts, such as inefficiencies, violations, and risk. Errors can be differentiated from inefficiencies because inefficient pursuits do in the end reach the goal. Thus, inefficient actions seem not to meet our definition of an error. However, if we assume that actions occurring in organizations typically include efficiency as part of the broader goal, then inefficiencies would be errors. In other words, if the goal of an action is to achieve some end result efficiently, then inefficient routes do reflect deviations from this standard and therefore would be classified as erroneous.
Viewing errors as unintentional deviations differentiates them from intentional deviations from standards or goals. Specifically, we operationalize violations as intentional deviations from task goals, rules, or some standard (Reason, 1990). Although when viewed in isolation it is difficult to imagine purposely deviating from a standard or goal, when one considers the pursuit of various goals simultaneously it is easier to imagine such intentional deviations. This is particularly the case with actions in organizational settings in which any given action often involves the pursuit of multiple (and often contradictory) goals simultaneously. A chemical plant worker, for example, may seek to repair a faulty electrical system with, at its most basic level, the goal to get the system working. Yet, this overall goal actually involves the pursuit of multiple goals, such as performance (get it working), quality (so that it works not only in the short term but also in the long term), efficiency, and safety. We believe that most violations within organizations occur when a lower-priority goal is sacrificed to pursue more vigorously a higher-priority goal. For example, the maintenance worker may maximize the efficiency goal by intentionally violating safety standards (e.g., by not following accepted protocol regarding lockout and tag-out procedures). Similarly, an individual late for a meeting across town may intentionally violate the highway speed limit to arrive at the meeting on time (a higher-priority goal). Of course, it is possible for individuals to engage in violations for other reasons as well, with this deviance having a more malicious intent (Griffin, O’Leary-Kelly, & Pritchard, 2004) or sensation seeking being one of the contributing factors. That said, however, we believe that intentional violations based on differential goal priority (e.g., performance receiving a higher priority than safety) will be the most frequent cause of violations.
It is also worth mentioning that simple observation of an action often does not allow one to differentiate errors from violations. For example, the observation of cars speeding on the highway in violation of the speed limit might be indicative of either a violation or an error. The drivers may, in fact, be late for an important meeting and be intentionally violating the speed limit, which they view as a lower-priority goal when compared to arriving on time. Or, they may simply be unaware that they are violating the speed limit because they are engaged in conversation with another passenger (i.e., an error due to inattention).
A number of investigations at the individual level have reinforced this distinction between errors and violations (e.g., Kontogiannis, Kossiavelou, & Marmaras, 2002; Lawton, 1998; Reason, Manstead, Stradling, Baxter, & Campbell, 1990). Yet, even though errors and violations are distinct, they can and often do interact with each other. A number of investigations of large-scale organizational accidents—for example, investigations of British Rail accidents, Chernobyl, among others—have revealed that many of these incidents involved a combination of both errors and violations (Reason, 1987, 1990).
Finally, we mention here the concept of risk. One could ask whether a well-thought-out, calculated risk that subsequently turns out to result in harm is an error. Suppose a broker does the normal due diligence research and, based on this research, invests in a particular stock with the goal of making a positive return on the investment. Now, suppose the stock subsequently goes down in price such that the observed outcome (negative return on investment) deviates from the original goal. Would this be classified as an error? There is clearly a deviation from the original goal that is unintended, so in these two respects, this outcome fulfills our definition of error. But, the third aspect of our error definition—that it should have been potentially avoidable—seems to be something that differentiates errors from risks. Errors are things that after they occur give us the feeling of “we should have known better,” whereas risks that turn out to be harmful are more likely to give us the feeling of “given the same information at that same time and in the same context, I would make the same decision.”
In light of this definition of risk, we believe it is possible to draw distinctions between errors and risks as well as discuss the relationship between violations and risks. Errors occur when there is an unintended deviation from a goal or standard; the factors causing this deviation were potentially avoidable (i.e., under the control of the individual). Risks seem to reside in the objective situation. In principle, risks can be analyzed before an action is started or a decision is made. Thus, individuals engage in actions that involve risk knowing that the situation has the potential to result in harm, but they believe the probability of this harm actually occurring is low relative to the potential gain. However, it is possible that people can miscalculate the risks inherent in the situation. This may be the case as technology and other improvements reduce the objective risk of the situation. In particular, risk homeostasis theory suggests that individuals often increase their risky behavior as technology and other system improvements reduce the objective risk (e.g., as technological improvements increase the safety of automobiles, individuals drive faster; Pfafferott & Huguenin, 1991; Stetzer & Hofmann, 1996; Wilde, 1982, 1988).
This seems to be the point at which violations and risk begin to interrelate. In other words, engaging in intentional violations often seems to involve assessments of risk. For example, individuals may intentionally violate traffic laws because they believe that the risk of being caught is low. In other words, they recognize that factors outside their control (a police officer) might result in a negative outcome (a fine), but they view the likelihood of this factor occurring as relatively small in light of the potential benefits brought about by achieving the higher-priority goal (arriving on time). This highlights the fact that individuals often engage in violations when they perceive the risks of negative outcomes to be minimal (Reason, 1990). Of course, nonviolation behavior can carry certain risks as well. The decision to invest in stocks carries with it the assumption of risk; sometimes bad outcomes (deviations from goals) occur due to the inherent risk in the chosen alternative. Negative outcomes resulting from the assumption of risk are not errors. Of course, if the risks are calculated incorrectly—due to, say, an error in a spreadsheet—and more risk is assumed than the actor believes is the case, then an error has occurred. We, like others, assume that incorrect risk calculations (which are errors) coupled with violations can produce catastrophes more easily. The Chernobyl disaster is a case in point. In this disaster, highly skilled operators conducted an experiment in the middle of the night (a high-risk situation) that involved several other decisions that violated accepted safety protocol. This risky experiment, coupled with several violations and other errors of judgment, resulted in the most significant accident in nuclear power plant history (Dörner, 1996; Reason, 1987, 1990).
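As a compact summary of the distinctions drawn in this section, the following is a minimal sketch that encodes the classification logic; the Boolean flags and labels are illustrative assumptions rather than constructs formally defined in this chapter:

```python
# A minimal sketch of the error/violation/risk distinctions discussed above.
# The flags and string labels are illustrative assumptions, not constructs
# formally defined in the chapter.

def classify_deviation(intended: bool,
                       caused_by_chance: bool,
                       risk_correctly_assessed: bool) -> str:
    """Classify an observed deviation from a goal or standard."""
    if intended:
        return "violation"       # deliberate departure from a rule, goal, or standard
    if caused_by_chance:
        return "chance event"    # e.g., lightning-induced outage; not an error (Reason, 1990)
    if risk_correctly_assessed:
        return "realized risk"   # "I would make the same decision again"; not an error
    return "error"               # unintended, potentially avoidable deviation

# Examples echoing the chapter's illustrations
print(classify_deviation(True, False, False))    # speeding to arrive on time -> violation
print(classify_deviation(False, True, False))    # power outage from lightning -> chance event
print(classify_deviation(False, False, True))    # diligent stock pick that loses money -> realized risk
print(classify_deviation(False, False, False))   # hitting a finger with the hammer -> error
```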
Action Processes as the Foundation of Individual Behavior in Organizations
Now that we have defined errors and related terms, we turn our attention to an integrative error taxonomy. As noted, actions can only be defined as erroneous when there is some referent goal against which to compare the outcome (i.e., an error compared to what?). This brings us to what we believe is the defining feature of behavior occurring within organizations: goal-directed action (Dörner & Schaub, 1994; Frese & Za...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright
  5. Contents
  6. Series Foreword
  7. Preface
  8. About the Editors
  9. Contributors
  10. 1. Errors, Error Taxonomies, Error Prevention, and Error Management: Laying the Groundwork for Discussing Errors in Organizations
  11. 2. Learning Through Errors in Training
  12. 3. The Role of Errors in the Creative and Innovative Process
  13. 4. Revisiting the “Error” in Studies of Cognitive Errors
  14. 5. Collective Failure: The Emergence, Consequences, and Management of Errors in Teams
  15. 6. Team Training as an Instructional Mechanism to Enhance Reliability and Manage Errors
  16. 7. Learning Domains: The Importance of Work Context in Organizational Learning From Error
  17. 8. Errors at the Top of the Hierarchy
  18. 9. When Things Go Wrong: Failures as the Flip Side of Successes
  19. 10. The Link Between Organizational Errors and Adverse Consequences: The Role of Error-Correcting and Error-Amplifying Feedback Processes
  20. 11. Cultural Influences on Errors: Prevention, Detection, and Management
  21. 12. A New Look at Errors: On Errors, Error Prevention, and Error Management in Organizations
  22. Author Index
  23. Subject Index