Business

Decision Making Biases

Decision-making biases are systematic errors in judgment that can affect business decisions, leading to irrational choices and suboptimal outcomes. Common examples include confirmation bias, anchoring bias, and overconfidence bias; being aware of these biases can help decision makers mitigate their impact.

Written by Perlego with AI-assistance

10 Key excerpts on "Decision Making Biases"

  • Behavioral Strategy
    Heuristics and biases are often valuable and indispensable for effective decision making. This may be particularly relevant for strategic decisions, which are highly uncertain and need to be made in a timely fashion. Clearly, in order to avoid systematic errors arising from biases, managers need to be keenly aware of the assumptions, heuristics and biases employed in their decision making. Thus, they ought to examine their own cognitive biases, which may be more easily identified and appreciated than one might think possible. For example, managers could check if they have a tendency to reject alternatives without carefully weighing them. They could also review whether they make decisions based on rigorous estimates of probability. Such procedures would enable managers to reveal for themselves any cognitive biases inherent in their decision making, and thereby be in a position to make appropriate adjustments. ACKNOWLEDGMENT: This chapter, save some minor changes, was earlier published as Das, T. K., & Teng, B. (1999). Cognitive biases and strategic decision processes: An integrative perspective. Journal of Management Studies, 36, 757–778.
  • Real-time Strategy and Business Intelligence: Digitizing Practices and Systems

    Whereas earlier studies highlight the role of rationality in strategic decision making, recent studies have emphasized the role of cognitive biases. The roles of the best-known cognitive biases are well illustrated in previous literature (Johnson et al. 2008; Lovallo and Sibony 2006). Over-optimism and loss aversion are seen as universal human biases affecting all types of situations, including those of everyday life. For example, when we think of our future lives, we tend to underestimate the potential for negative events (over-optimism). In addition, we prefer avoiding losses to making gains (loss aversion). The following biases are more specific and tend to arise in decision-making situations: the principal–agent problem, champions’ bias, and the sunflower syndrome. Principal–agent bias is a particular concern among decision makers, especially in strategic decision-making situations: “when the incentives of certain employees are misaligned with the interests of their companies, they tend to look out for themselves in a deceptive way” (Lovallo and Sibony 2006, p. 20). In addition, champions’ bias indicates the likelihood of managers placing too much faith in the opinions of trusted persons (usually an experienced manager) in decision-making situations. Finally, the sunflower syndrome is the tendency to align with and follow senior managers’ opinions in decision-making processes. As the potential for bias in decision-making situations is well documented (Kahneman et al. 2011; Johnson et al. 2008; Lovallo and Sibony 2006), the ways used to address bias in those situations become all the more interesting. If decision makers were to become more aware of how biases affect strategic decision making, there would be more opportunities to prevent those effects.
  • The Blackwell Handbook of Personnel Selection
    • Arne Evers, Neil Anderson, Olga Smit-Voskuijl (Authors)
    • 2009(Publication Date)
    • Wiley-Blackwell
      (Publisher)
    In this model, Simon recognizes the cognitive, social, and organizational barriers to maximization. Judgmental heuristics outlook. Cognitive psychological research into judgmental heuristics and biases has led to more insight into which biases may influence human decision making. Kahneman and Tversky demonstrated three major and well-known biases that are at work when humans are making decisions and that will hinder a fully rational decision-making process (e.g., Kahneman & Tversky, 1982; Tversky & Kahneman, 1974): the availability heuristic (the assessed probability of an event depends on how readily it is remembered); the representativeness heuristic (the assessed probability of an event depends on its resemblance to similar events that have occurred); and the anchoring and adjustment heuristic, where a judgment is made by starting from an initial value (this may be some accidental information, some historical precedent, etc.) and the decision subsequently is the result of adjustments to that initial value. Social influences outlook. One of the most important findings to emerge from studies focusing on social influences during decision making is the phenomenon of escalation of commitment (Staw & Ross, 1989; Hantula & DeNicolis Bragger, 1999). This phenomenon refers to decision makers sticking with faulty decisions. Social forces such as the need to save face, but also defensively ignoring information, are some of the reasons thought to be behind the occurrence of commitment escalation. Several exemplary empirical studies on selection decisions demonstrate how such cognitive and affective influences impact upon the decisions made. A study by Shafir (1993) showed that decision-making behavior may vary according to whether the type of decision involves choosing the very best candidates or rejecting the worst candidates.
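    The anchoring and adjustment heuristic described above lends itself to a small numeric sketch (our own illustration, not from the handbook; the 0.5 adjustment factor is a hypothetical stand-in for the "insufficient adjustment" the research describes):

```python
# Anchoring and adjustment: a judgment starts from an initial value (the
# anchor) and is adjusted toward the truth, but typically not far enough.
# The adjustment factor of 0.5 is a hypothetical illustration only.
def anchored_estimate(anchor, true_value, adjustment=0.5):
    """Move partway from the anchor toward the true value."""
    return anchor + adjustment * (true_value - anchor)

TRUE_VALUE = 100
print(anchored_estimate(anchor=20, true_value=TRUE_VALUE))   # 60.0: pulled low by a low anchor
print(anchored_estimate(anchor=180, true_value=TRUE_VALUE))  # 140.0: pulled high by a high anchor
```

    Even though both estimators aim at the same true value, their answers are dragged toward whatever initial value they happened to start from, which is the bias the excerpt describes.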
  • Language and Thought
    • Nick Lund (Author)
    • 2014(Publication Date)
    • Routledge
      (Publisher)
    7. Decision-making. Human lives tend to be complicated and we are faced with a multitude of choices daily. Many of these are trivial, such as whether to have another cup of coffee; others can be life changing or life threatening, such as whether to end a relationship or where to cross a road. Sometimes decision-making requires a simple choice between two possible actions, such as whether to take a taxi or a bus. Even such a simple choice involves weighing the costs and benefits. However, making decisions often involves a more complex mix of weighing probabilities, personal desires and personal beliefs. Thus, given the same choices, one person may risk a great deal because of political or religious beliefs while another may be unwilling to take any risk. Sometimes the ‘logical’ decision may be resisted because of factors such as tradition or emotional ties. For example, it might make financial sense to trade in an old car that is costing a lot in repairs, but a person may decide not to do so because they have become attached to it. Garnham and Oakhill (1994) point out that decisions involving probability judgements fall into two categories: decision-making under risk and decision-making under uncertainty. Decision-making under risk occurs when the probabilities of outcomes are known. For example, if you roll a die there is a 1 in 6 chance of getting any number. Decision-making under uncertainty occurs when the probabilities of outcomes are not known. Most of the examples in this chapter are concerned with decision-making under risk, and these depend on probability judgements of risk and rewards. We have to decide, given a possible reward, whether a risk is worth taking.
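    The risk-versus-reward weighing described here can be sketched as a simple expected-value calculation (a minimal illustration of decision-making under risk; the gamble and its payoffs are hypothetical, not from the book):

```python
# Decision under risk: the outcome probabilities are known in advance.
# Hypothetical gamble: roll a fair six-sided die; a 6 wins 10, anything else loses 1.
def expected_value(outcomes):
    """Sum of probability-weighted payoffs over all possible outcomes."""
    return sum(p * payoff for p, payoff in outcomes)

gamble = [(1 / 6, 10.0), (5 / 6, -1.0)]  # (probability, payoff) pairs
ev = expected_value(gamble)              # 10/6 - 5/6 = 5/6 ≈ 0.83
print(f"Expected value: {ev:.2f}")       # positive, so a risk-neutral agent would accept
```

    Under uncertainty, by contrast, the probabilities themselves are unknown, so no such calculation is available and the judgement heuristics discussed in this chapter take over.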
  • Advanced Introduction to Scenario Planning
    9. Biases and practical pitfalls. This chapter will first address some general biases that may inadvertently hinder the development of high-quality scenarios about the future. Research has revealed numerous biases that affect judgment and choice in many settings, well beyond scenario planning. My treatment therefore must be selective, since the fields of cognitive, emotional and social biases are broad, with a depressingly long list developed to date. I shall first address the various challenges that uncertainty presents to individuals, groups and organizations when making judgments or choices. Thereafter, I shall highlight some practical challenges scenario teams will likely encounter by identifying pitfalls related to both the process and content side of scenario planning. They are presented here as warnings that may need further empirical validation, since they mostly reflect my own consulting experiences, in contrast to the more scientific underpinnings of the section on biases below. 9.1 Biases to guard against. In common parlance, the term bias refers to a prejudice against people, things or ideas in ways that may not be correct or fair. In decision theory, it connotes systematic error (as opposed to mere random noise) in people’s subjective judgments and choices, due to mental shortcuts taken to overcome information processing limitations. Box 9.1 provides a partial list which I have taken from a catalogue of forty-six cognitive biases based on research in behavioral decision-making. Each bias is briefly defined below and can significantly distort the scenario building process, although it need not do so in every case. These biases often operate unconsciously, like optical illusions, and may require training, group discussion or special interventions to reduce them (Russo & Schoemaker, 2002; Ariely & Jones, 2008). A promising recent study by Maymin & Langer (2021) showed that
  • The SAGE Handbook of Organization Studies
    • Stewart R Clegg, Cynthia Hardy, Tom Lawrence, Walter R Nord (Authors)
    • 2006(Publication Date)
    A central question in the area of behavioural decision research, then, is how decision makers actually go about making decisions, using as a comparison the benchmark of optimal (i.e. rational) performance. Juxtaposing the standard of rationality against actual behaviour, several researchers in the decision arena began mapping the systematic deviations from rationality that they observed. Behavioural decision researchers focus on these systematic inconsistencies in the decision making process that prevent humans from making fully rational decisions. Kahneman and Tversky (1979; Tversky and Kahneman 1974) have provided critical information about specific systematic biases that influence judgement. This work has elucidated our modern understanding of judgement. The importance of this perspective has, with the awarding of the 2002 Nobel Prize in Economics to Danny Kahneman, received considerable validation among both economics and psychology scholars. When making decisions, people rely on a number of simplifying strategies, or rules of thumb, called heuristics. Although heuristics often prevent us from finding the optimal decision by eliminating the best choice, they do have some benefits: the expected time saved by using them could outweigh any potential loss resulting from a full search strategy. By providing people with a simple way of dealing with a complex world, heuristics produce correct or partially correct judgements more often than not. In addition, it may be inevitable that humans will adopt some way of simplifying decisions. The only drawback is that individuals frequently adopt these heuristics without being aware of them. The misapplication of heuristics to inappropriate situations, unfortunately, often leads people astray. The three most important heuristics are the availability heuristic, the representativeness heuristic, and anchoring and adjustment.
  • Maximizing Project Value: A Project Manager's Guide

    5. JUDGMENT AND DECISION-MAKING AS VALUE DRIVERS

    Put your feet in the right place, and then stand firm.
    —Abraham Lincoln
    If it were only true that stakeholders, customers, sponsors, and project teams were completely rational and efficient in their judgments and decision-making:
    Rational in the sense that outcomes are predictable consequences of facts and other information applied consistently to an analytic decision-making process governed by a transparent policy, and
    Efficient in the sense that all relevant information is available and objectively evaluated so that unbiased decisions are possible.
    Of course we know from experiments that the exclusively rational mind often can’t make up its mind and make a decision. The truly rational mind gets into a “do loop” of “if, then, else—but …” that cycles endlessly. We call this “paralysis of analysis.” It takes a bit of emotion, belief, or passion—the not-rational reflex—to break the loop and actually make the decision. And we know by experiment and observation that subliminal intuition and mood—largely influenced by experience and exposure to ideas—greatly affect decision-making. Research has shown that mood affects intuition and how the creative juices flow. Mood also affects our vigilance and our inclination to avoid logical errors.
    When we are in a good mood, our intuition is heightened and our creativity may be at its peak, but we may be less vigilant about small errors.
    In fact, mood influences energy, which in turn influences receptivity to understanding available information. (So just because this book is hard to read, don’t dismiss it out of hand!)
    Consequently, the judgments and decision-making of executives, stakeholders, and sponsors, as well as customers/users, are in part a consequence of many biases, moods, and experiences. We call this cognitive bias. Cognitive bias is a departure from the objectivity that a neutral third party would have, given access to all relevant facts. It introduces loss of objectivity, inaccuracies, prejudice, logic errors, and distorted perceptions. Countermeasures are needed to maintain a sense of objectivity about facts and estimates, and to guard optimization in the face of such weaknesses and threats.
  • Criminal Investigative Failures
    • D. Kim Rossmo (Author)
    • 2008(Publication Date)
    • Routledge
      (Publisher)
    On the contrary, many times the motivation to “help” and solve a case, to “do justice,” clouds our judgments and our ability to reach objective conclusions. Confirmation Bias. The tendency to confirm an initial theory or preconception and to avoid disconfirming information is known as confirmation bias. An example of this is demonstrated by Wason’s (1960) selection task. Participants were given a three-number sequence that followed a certain rule. They were required to deduce this rule by proposing potential sequences. They were then given feedback as to whether their proposed sequences followed the rule. The rule was simply “any ascending sequence,” yet the rules suggested by participants were generally far more complex. Participants appeared to formulate a potential rule and then only generate sequences that conformed to their rule. If enough sequences were accepted, then the theory would be accepted. Surprisingly, participants tended not to try to falsify their theories. This phenomenon has also been observed in other areas. We often appear to prefer information that is biased toward previously held beliefs, desired outcomes, or expectations (Jonas, Schulz-Hardt, Frey, & Thelen, 2001), or that appears to support our expectations in negotiations (Pinkley, Griffith, & Northcraft, 1995), our outlooks and attitudes (Lundgren & Prislin, 1998), our self-serving conclusions (Frey, 1981), or our social stereotypes (Johnston, 1996). Our mind does not seem to be designed to optimize and find the perfect solution to any given problem. Instead, it merely aims to feel sufficiently satisfied with a solution (Simon, 1956, 1982). Therefore, decision makers have a criterion level, a threshold that must be met before a conclusion can be reached. Once this threshold has been reached, it is a winner-takes-all process in which a final and decisive decision is reached (Dror et al., 1999).
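    Wason's selection task can be made concrete in a short sketch (our own illustration, not from the book; function and variable names are ours):

```python
# Wason's (1960) task: the experimenter's hidden rule is "any ascending sequence".
def hidden_rule(seq):
    """True if every element is strictly less than the next."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# A participant hypothesizes "numbers increasing by 2" and, exhibiting
# confirmation bias, proposes only sequences that fit that hypothesis.
confirmatory_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Every confirmatory test is accepted, so the (wrong) hypothesis seems confirmed.
print(all(hidden_rule(seq) for seq in confirmatory_tests))  # True

# Only a disconfirming test -- one that violates the hypothesis -- can
# distinguish the participant's rule from the actual rule:
print(hidden_rule((1, 2, 9)))  # True: ascending, yet not "increasing by 2"
```

    The sketch shows why testing only confirming cases is uninformative: the feedback can never separate the participant's narrow hypothesis from the broader true rule.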
  • Judgment in Managerial Decision Making
    • Max H. Bazerman, Don A. Moore (Authors)
    • 2012(Publication Date)
    • Wiley
      (Publisher)
    You can practice spotting others’ biases while reading the newspaper or watching a sporting event on television. Reporters, sportscasters, politicians, and other information providers and public servants constantly make statements that exemplify the biased decision-making processes outlined in this book. We hope that this book has dispelled some of your assumptions about decision making. We also hope to have raised your awareness of the importance of the decision-making process itself, rather than just the results of this process. We are disturbed by the fact that most managers reward results rather than good decisions. As we have seen, managers make many decisions for the wrong reasons. Nevertheless, because so many important decisions involve uncertainty, plenty of good decisions turn out badly, and some bad decisions turn out well. To the extent that a manager rewards results and not sound decision making, the manager is likely to be rewarding behaviors that may not work in the future. Davis (1971) argues that “interesting” writing leads readers to question issues that they never thought about before. Thus, identifying new issues may be more important than providing new answers to old questions. In this sense, we hope this book has succeeded at being interesting by making you aware of aspects of your decision-making process that inspire new questions and solutions.
  • Value-Added Decision Making for Managers
    In this chapter, we explore these decision biases and discuss ways of overcoming each of them:
    • Sunk cost, escalation, and de-escalation of commitment
    • Framing
    • Status quo and omissions
    • Regret
    • Fairness
    • Mood
    • Groupthink, optimism, and miscellaneous biases
    Several of the biases overlap or are closely linked to one another. To isolate the effect of a specific bias, researchers develop creative scenarios for their experimental subjects. In exploring these biases, we discuss some of their underlying psychological explanations. However, it is impossible to present all the nuances within the extensive literature on each bias. We have chosen to avoid many of these subtleties so as to simplify the presentation of the core concepts. Racial, ethnic, or sexual biases, as well as conflicts of interest, are discussed in a later chapter on ethical decision making. One may not completely control the decision environment in any given situation, but it is nevertheless critical to increase awareness of one’s personal tendencies that can affect decisions via bias. Such awareness can also help one develop a better sense of how bias affects others in their decisions. Toward these ends, we identify actions that can be taken to overcome specific biases. 14.2 Sunk Cost and Escalation of Commitment. According to basic economic principles, future investment should be judged by estimates of future returns, without regard to how much has already been invested. The sunk cost bias undermines this principle, because people have a tendency to factor into future investment decisions how much has already been spent—even if it seems that this would be like throwing good money after bad. A psychological explanation for this cognitive bias is rooted in an unwillingness to admit that an investment has been wasted.
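    The economic principle stated above, that only future costs and returns should drive the decision, can be sketched in a few lines (a hypothetical illustration; the figures are invented, not from the book):

```python
# Sunk cost principle: only incremental (future) cash flows matter.
# All figures below are hypothetical, for illustration only.
def should_continue(future_cost, future_return):
    """Rational rule: invest further only if future returns exceed future costs."""
    return future_return > future_cost

sunk = 800_000           # already spent; irrelevant to the decision
future_cost = 300_000    # remaining spend needed to finish the project
future_return = 250_000  # best estimate of what finishing will earn

# The rational decision ignores `sunk` entirely:
print(should_continue(future_cost, future_return))  # False: abandon, despite the 800k spent
```

    The sunk cost bias is precisely the temptation to let `sunk` into this comparison: the larger the amount already spent, the harder people find it to accept the "abandon" answer the future-only calculation gives.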
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.