Cognitive Bias
Cognitive bias refers to systematic patterns of deviation from rationality in judgment and decision-making. These biases are often shaped by past experiences, emotions, and social influences, leading individuals to make subjective and sometimes irrational judgments. Understanding cognitive biases matters in psychology because they affect many aspects of human behavior and decision-making.
9 Key excerpts on "Cognitive Bias"
Psychology of Intelligence Analysis
- Richards J. Heuer (Author)
- 2020 (Publication Date)
- Burtyrki Books (Publisher)
PART III—COGNITIVE BIASES
Chapter 9 — What Are Cognitive Biases?
This mini-chapter discusses the nature of cognitive biases in general. The four chapters that follow it describe specific cognitive biases in the evaluation of evidence, perception of cause and effect, estimation of probabilities, and evaluation of intelligence reporting.

Fundamental limitations in human mental processes were identified in Chapters 2 and 3. A substantial body of research in cognitive psychology and decision-making is based on the premise that these cognitive limitations cause people to employ various simplifying strategies and rules of thumb to ease the burden of mentally processing information to make judgments and decisions.{89} These simple rules of thumb are often useful in helping us deal with complexity and ambiguity. Under many circumstances, however, they lead to predictably faulty judgments known as cognitive biases.

Cognitive biases are mental errors caused by our simplified information processing strategies. It is important to distinguish cognitive biases from other forms of bias, such as cultural bias, organizational bias, or bias that results from one’s own self-interest. In other words, a cognitive bias does not result from any emotional or intellectual predisposition toward a certain judgment, but rather from subconscious mental procedures for processing information. A cognitive bias is a mental error that is consistent and predictable. For example:

The apparent distance of an object is determined in part by its clarity. The more sharply the object is seen, the closer it appears to be. This rule has some validity, because in any given scene the more distant objects are seen less sharply than nearer objects. However, the reliance on this rule leads to systematic errors in estimation of distance. Specifically, distances are often overestimated when visibility is poor because the contours of objects are blurred. On the other hand, distances are often underestimated when visibility is good because the objects are seen sharply. Thus the reliance on clarity as an indication of distance leads to common biases.
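Heuer's distance example can be made concrete with a short simulation. The sketch below is not from the book: the decay model and every number in it are invented for illustration. What it reproduces is the property Heuer emphasizes, namely that an observer who reads clarity as distance, while implicitly assuming good visibility, makes errors that are consistent and predictable rather than random.

```python
# Minimal sketch (model and numbers invented) of the clarity/distance rule.
# Clarity falls with distance, but also with visibility, so inverting
# clarity to get distance while assuming clear conditions errs systematically.

def perceived_clarity(distance, visibility):
    """Clarity decays with distance; visibility in (0, 1] scales it."""
    return visibility / (1.0 + distance)

def judged_distance(clarity, assumed_visibility=1.0):
    """Invert the clarity model, (wrongly) assuming clear conditions."""
    return assumed_visibility / clarity - 1.0

TRUE_DISTANCE = 100.0
for visibility, conditions in [(1.0, "clear day"), (0.5, "haze"), (0.25, "fog")]:
    clarity = perceived_clarity(TRUE_DISTANCE, visibility)
    estimate = judged_distance(clarity)
    print(f"{conditions:9}: true {TRUE_DISTANCE:.0f} m, judged {estimate:.0f} m")

# clear day: true 100 m, judged 100 m
# haze     : true 100 m, judged 201 m   (poor visibility -> overestimated)
# fog      : true 100 m, judged 403 m   (worse visibility -> larger error)
```

The error always points the same way for a given condition, which is what makes it a bias rather than noise; an observer calibrated to hazy conditions would instead underestimate distances on a clear day, matching the second half of Heuer's example.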
Argumentation
The Art of Persuasion
- Raymond S. Nickerson (Author)
- 2020 (Publication Date)
- Cambridge University Press (Publisher)
Generally speaking, people who do better on the various tests of cognitive ability tend to be less likely to display the more common biases. And people who perform normatively, or nearly so, with one task are likely to perform normatively, or nearly so, on other tasks as well. Evans (1989) argues that all the kinds of biases that have been found in reasoning tasks stem primarily from selective processing of problem information: “the major cause of bias in human reasoning and judgment lies in factors which induce people to process the problem information in a selective manner. Such selection may arise either in the process of forming mental representations of the information presented in the problem or else in the actual manner in which it is subsequently processed” (p. 19). The impossibility of processing all the sensory data with which we are continually bombarded makes selection essential; the challenge to the student of cognition, Evans argues, is to discover why the selection process sometimes yields systematic mistakes in reasoning.

The Functionality or Dysfunctionality of Biases

A tacit assumption underlying much of the relevant psychological literature is that biases of the sort described are bad: irrational aspects of human cognition. They cause people to reason illogically, to draw unwarranted conclusions, and to form and retain unfounded beliefs. Jonathan Baron (1985) summarizes his view of the situation the following way: “People will in general think too little, search for evidence to confirm what they already favor, and use evidence the same way – unless corrective action has been provided. We may thus expect people (without corrective education) to exhibit biases predominantly in one direction, and therefore to be generally irrational. This is the main empirical prediction of the present theory” (p. 129). This is a fairly negative general assessment of the role of biases in thinking and of the prospects for changing things much for the better.
- D. Kim Rossmo (Author)
- 2008 (Publication Date)
- Routledge (Publisher)
Finally, data that do not fit the bias or context and cannot easily be ignored are dismissed and explained away, and weighting of disconfirming data is low. These and other manifestations of bias and cognitive influences can make perception, judgment, and decision making unreliable. They are well researched and documented by many scientific studies (e.g., Balcetis & Dunning, 2006; Cordelia, 2006; Ditto & Lopez, 1992; Edwards & Smith, 1996; Evans, 1989; Gilovich et al., 2002; Haselton, Nettle, & Andrews, 2005; Hogarth, 1980; Kahneman et al., 1982; Koriat, Lichtenstein, & Fischhoff, 1980; Kunda, 1990; Nickerson, 1998; Nisbett & Ross, 1980; Tversky & Kahneman, 1974; Zhaoping & Guyader, 2007). The criminal justice system, for example, has in many ways adopted and taken on board these and other cognitive and psychological findings to improve investigations (e.g., Ask & Granhag, 2005; Risinger & Loop, 2002; Stelfox & Pease, 2005). A clear case is the way in which line-ups are conducted. Rather than biasing eyewitnesses by presenting them with the suspect (the target), eyewitnesses are presented with a range of targets that include the suspect as well as numerous decoys. The line-up procedures have been drastically improved by taking into account issues of bias and other cognitive and psychological influences (e.g., Charman & Wells, 2006; Turtle, Lindsay, & Wells, 2003; Wells & Olson, 2003). In this chapter we present cognitive theory and bridge it to practical situations in the real world of investigations. Of course, within the scope of this chapter we can only bring examples, as illustrations, to convey the complex issues at hand.

Initial Impressions and Accountability

Research indicates that early impressions have considerable influence on our final evaluations. Indeed, it is common for people to maintain preexisting beliefs despite dissonant or even contradictory evidence.
A Brain for Business – A Brain for Life
How insights from behavioural and brain science can change business and business practice for the better
- Shane O’Mara (Author)
- 2017 (Publication Date)
- Palgrave Macmillan (Publisher)
Biases in decision-making may serve to reinforce affiliation or strength of the bonds within a tightly defined group, for example. Biases may also serve to punish behaviour that is seen to be transgressive—taking too much reward for too little effort (free-loading), for instance. In this case, there may even be a willingness to engage in costly punishment—to punish a member of one’s own group, or to punish an opposing group, even if there is an economic cost to oneself and to one’s group (Heinrich and colleagues 2006). The question posed by the group is simple: ‘what sort of a hit are we willing to take in order to teach the other lot a lesson?’ Both history and experiments show that humans are willing to take punishments in order to teach the other side a lesson—costly punishments that would not be prescribed by a purely rational calculus. The other side in the dispute may assume that there will be no costly punishment, as in believing that ‘they won’t impose trade sanctions on us. After all, they sell us a lot of cheese (or cars, or wine, or shoes—or whatever)’. But they can, and they will. A refusal to recognise in a negotiation that one side will act contrary to their own narrowly defined economic self-interest to ensure a broader political, legal and social lesson is learned is a very common mistake.

Cognitive biases are a pervasive and universal aspect of human thinking. In essence, they are systematic biases in gathering information and in thinking that lead to a deviation from rational calculation, or even simply from what is demonstrably ‘good and fair judgement’. The case study in Chapter 1 of this book provides examples of many cognitive biases, and we will discuss a few of them below. There are huge numbers of biases: the Wikipedia entry for ‘cognitive bias’ lists more than 175 of them.
The Savvy Investor's Guide to Avoiding Pitfalls, Frauds, and Scams
- H. Kent Baker, John R. Nofsinger, Vesa Puttonen (Authors)
- 2020 (Publication Date)
- Emerald Publishing Limited (Publisher)
4 SELF-INFLICTED PITFALLS: THE DANGERS OF PSYCHOLOGICAL BIASES

Investing isn’t about beating others at their game. It’s about controlling yourself at your own game. — Jason Zweig

Why do generally sensible people sometimes make foolish decisions involving money and investing? A major reason is that they have psychological biases and make predictable cognitive errors. These problems are inherent in everyone’s thinking process, but some people are more susceptible than others to their influence. Knowledge and experience help to reduce the influence of behavioral biases on financial decisions. This chapter is intended to help give you that knowledge by informing you about these self-inflicted pitfalls and how to avoid them.

Behavioral biases occur for several reasons. These problems come from limitations in such areas as memory, attention, knowledge, and time. Various cognitive errors occur because the human brain is subject to these limitations. When the thinking process attempts to simplify its information processing, it often creates predictable biases in the judgments. Investing decisions are particularly vulnerable to behavioral biases because they involve much uncertainty. Since investors don’t know what will happen in the future, they lack all of the information needed when analyzing investment choices. Without having the necessary information, the cognitive process must “fill in the gaps” to reach a conclusion. In other words, your brain takes a shortcut. This mental shortcut, also called heuristic simplification, helps in forming judgments, but also has inherent biases that are problematic for financial decision making. Another source of behavioral bias is your emotions. The media love to frame investors in terms of fear and greed. You might hear, “The market fell because of panicked investors.” But emotions are much more complicated than that simple characterization.
- Gregory J. Feist (Author)
- 2008 (Publication Date)
- Yale University Press (Publisher)
Confirmation Bias. One of the more pervasive topics in the cognitive psychology of science has been the tendency to selectively look for and latch onto evidence that confirms our theory and to deny, distort, or dismiss evidence that contradicts it. One of the first to put confirmation bias on the front burner of the cognitive psychology of science was Peter Wason. Wason and many others who have followed his lead have consistently demonstrated that when students test their hypotheses about how something works, they are very much disposed toward positive tests; that is, they only propose tests that support rather than refute the theory. The best scientists do this relatively infrequently and non-scientists or novices do it more frequently, but scientists are not immune to such biases. Michael J. Mahoney, in fact, compared a small sample of scientists working on the 2–4–6 task to a sample of Protestant ministers and found that the former were more prone to confirmation bias than the latter! Arie Kruglanski has argued that scientists are subject to some of the same cognitive biases as nonscientists, including confirmation.[40]

Falsification, however, is not impossible, and confirmation bias is not inevitable in nonscientists. Michael Gorman and his colleagues, for example, found that instructions to falsify significantly improved performance on the 2–4–6 and similar tasks in various samples of college students.[41] So it appeared that confirmation was a bias that can be combated with education, and indeed training to be a scientist provides just such education.

Confirm Early–Disconfirm Late Heuristic. It is somewhat misleading to argue that confirmation bias is the only strategy involved through the hypothesis-testing process. Scientists, and especially the best scientists, appear to use a more complex heuristic: early in theory formation they look for confirming evidence, but once the theory is well developed they look for disconfirming (falsifying) evidence.
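Feist's description of positive testing maps directly onto Wason's 2–4–6 task, and a few lines of code show why the strategy fails. The sketch below is my construction, not Feist's: the hidden rule and the narrow "add 2" hypothesis are the standard ones from the task, while the specific triples are illustrative choices.

```python
# Positive tests vs. deliberate falsification on Wason's 2-4-6 task.
# Hidden rule: "strictly increasing". Subject's hypothesis: "each number
# goes up by 2", which is strictly narrower than the hidden rule.

def hidden_rule(a, b, c):
    return a < b < c                      # experimenter's actual rule

def my_hypothesis(a, b, c):
    return b == a + 2 and c == b + 2      # subject's narrower guess

confirm_only = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]    # positive tests
try_to_falsify = [(1, 2, 3), (5, 10, 100), (6, 4, 2)]  # counterexample hunting

for triple in confirm_only + try_to_falsify:
    predicted = my_hypothesis(*triple)
    actual = hidden_rule(*triple)
    verdict = "consistent" if predicted == actual else "hypothesis refuted"
    print(triple, verdict)
```

Every confirm-only triple comes back "consistent", because any triple satisfying the narrow hypothesis also satisfies the broader hidden rule; only triples like (1, 2, 3) or (5, 10, 100), which fit the rule but not the hypothesis, expose the mismatch, and a confirm-only strategy never generates them.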
Escaping Paternalism
Rationality, Behavioral Economics, and Public Policy
- Mario J. Rizzo, Glen Whitman (Authors)
- 2019 (Publication Date)
- Cambridge University Press (Publisher)
In a broader sense, it refers to “the person’s construal of the examiner, the task administered, [and] the expectation of what might happen” (Kagan 2012, 22).[2]

[1] “Three factors influence the probability that a particular behavior will be expressed. The properties of the brain and the individual’s prior experiences are the most obvious. . . . The local settings, the third influence, selects one behavior from an envelope that usually contains more than one possibility” (Kagan 2012, 9).

[2] One of the reasons Kahneman overestimates the generality of the heuristics-and-biases research findings is that the context is rarely specified, which gives the false impression that the biases are very general human traits. But not explicitly specifying the context is not

Contextuality of the Effect of Moods and Emotions

The problem of context can be illustrated by research on the effect of moods or emotions on risk-taking and other decision variables (Andrade and Chen 2007, 52–53; Winkielman and Trujillo 2007, 77, 82–85). Negative moods are often associated with individuals assigning higher subjective probabilities to unfavorable events. However, this does not necessarily mean that they engage in excessive risk avoidance.[3] In fact, they often appear to engage in risk-seeking behavior. Those who respond to negative affect in a risk-seeking way are those who perceive or expect a real chance to repair their mood by taking a risk. Those who don’t expect such a likelihood, on the other hand, will be risk-averse so as to reduce their feelings of uncertainty. Obviously, this is already very context-sensitive, depending as it does on the subjects’ perceptions about the efficacy of their actions.
Contributions To Information Integration Theory
Volume 1: Cognition
- Norman H. Anderson (Author)
- 2014 (Publication Date)
- Psychology Press (Publisher)
This point arose in the conference discussion, in the example of the high failure rate of small businesses in the U.S. This overall base rate has limited relevance, for it neglects individuating information. A relevant base rate would take account of the kind of business: different base rates would apply to starting up liquor stores, travel bureaus, and fast food or gourmet restaurants in San Diego. Also important in determining a relevant base rate are physical location, both regional and in relation to traffic patterns, the experience and capability of the operators, the financial reserves, and so forth. Many base rates are thus possible, but the more relevant are also more uncertain and difficult to determine.

These considerations reinforce the theme that the study of biases and errors of judgment is too narrow for cognitive analysis. If the normative standard is uncertain or ill-defined, the same applies to the “errors” of judgment, for these exist only by reference to the normative standard.

The cognitive-normative antinomy reappears in this base rate issue. In a practical way, a rough or approximate base rate may suffice, either because a more accurate estimate is unobtainable or because it would not be cost-effective. In relation to outcomes, therefore, the question is not whether the base rate is accurate, but whether it works well enough.

Cognitive theory requires a different perspective, because it is concerned with process rather than outcome. To define “errors” of judgment relative to a practical criterion is to stand on shifting sand; the same response might be considered underestimation or overestimation, depending on which criterion is used. It is such dependence on criteria that are hypothetical, arbitrary, and often ill-defined that makes “errors” of judgment and decision fundamentally different from perceptual errors, for example, or even from memory errors.
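Anderson's contrast between the overall base rate and the more relevant conditional ones is simple arithmetic, sketched below with wholly hypothetical figures (none of these numbers come from the chapter).

```python
# Hypothetical figures: the overall failure base rate is a share-weighted
# average of subgroup rates, so it can be far off for any particular
# kind of business.

subgroups = {
    # kind of business: (share of all start-ups, failure rate)
    "liquor store":       (0.20, 0.30),
    "travel bureau":      (0.10, 0.70),
    "fast-food outlet":   (0.40, 0.45),
    "gourmet restaurant": (0.30, 0.65),
}

overall = sum(share * rate for share, rate in subgroups.values())
print(f"overall base rate: {overall:.2f}")          # ~0.51
for kind, (share, rate) in subgroups.items():
    print(f"  {kind:18}: failure rate {rate:.2f}")
```

Judging a particular gourmet restaurant by the aggregate understates its risk, and judging a liquor store by it overstates the risk; the relevant base rate is the conditional one, and, as Anderson notes, it becomes harder to estimate as the conditioning gets finer (location, operator experience, financial reserves, and so forth).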
Biosocial Heuristic

The nature of human thought depends on its biological foundation and on its social construction. Thus, man’s phylogenetic inheritance contains an array of motivational systems, ranging from biological and affectional to perceptual and intellectual, that is efficient and adaptable. This adaptability may be seen in the evolution of civilization and in its continuous reconstruction from new-born organisms. The biological base is thus transformed into social motivation.

Biology and society are both concerned with survival. Both have done remarkably well, on the whole, considering the unpromising origins and small size of the human brain. One key to success lies in general-purpose processes that can determine rough-and-ready actions in diverse situations. Flexibility, however, is generally achieved at the expense of optimality. Especially in the artificial worlds of many judgment-decision studies, defined in abstract terms and hedged by arbitrary, often implicit normative assumptions, suboptimal behavior is not unexpected.
- John F. Tomer (Author)
- 2017 (Publication Date)
- Edward Elgar Publishing (Publisher)
Heuristics and biases

During the 1970s and 1980s, Kahneman and Tversky, along with a number of their research collaborators, identified quite a few human judgment and decision biases. Let’s start with the three biases that are the subject of their very important 1974 Science article entitled “Judgment under Uncertainty: Heuristics and Biases.” These three biases are availability, anchoring and adjustment, and representativeness. Because of these biases, humans regularly make systematic and predictable errors in their judgment and decision making in certain kinds of situations. In other words, humans are predictably irrational.

Heuristics are rules of thumb, involving simplifications of reality, that people use to make judgments. Because our lives are often busy and complicated, we do not have time to carefully think about and analyse all the different situations we encounter. Therefore, we use rules of thumb that are quick, useful, and simple (Thaler and Sunstein 2009, p. 22). Particular rules of thumb are used automatically in certain situations to make intuitive judgments and decisions. According to Kahneman and Tversky, these heuristics belong to the “human informal processing machinery that cannot be changed” (Heukelom 2014, p. 118). In other words, they are part of the “unchanging biological makeup of the individual.”[2] A good heuristic can be used quickly and can enable close to optimal decision making. Nevertheless, these rules of thumb violate logical principles and contribute to errors in some situations (Camerer and Loewenstein 2004, p. 11). Thus, according to Kahneman and Tversky, humans are imperfect statisticians, logicians, and optimizers of utility (Heukelom 2014, p. 118).

Availability

The availability bias relates to “situations in which people judge the frequency . . . or probability of an event by the ease with which instances . . . can be brought to mind” (Tversky and Kahneman 1974, p. 1127).
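As a rough illustration of the definition just quoted, the sketch below (not from the book; all rates are invented assumptions) simulates a judge who estimates frequency from the instances it can recall, with vivid events more recallable than mundane ones.

```python
import random

# Availability sketch: vivid events are easier to recall, so frequency
# judged from recalled instances overweights them. All rates invented.

random.seed(1)

TRUE_SHARE  = {"plane crash": 0.01, "car crash": 0.99}  # true relative frequency
RECALL_PROB = {"plane crash": 0.90, "car crash": 0.10}  # vividness -> recallability

events = random.choices(list(TRUE_SHARE), weights=list(TRUE_SHARE.values()), k=10_000)
recalled = [e for e in events if random.random() < RECALL_PROB[e]]

for kind in TRUE_SHARE:
    judged = recalled.count(kind) / len(recalled)
    print(f"{kind:11}: true share {TRUE_SHARE[kind]:.2f}, judged from recall {judged:.2f}")

# plane crash: true share 0.01, judged from recall roughly 0.08
# car crash  : true share 0.99, judged from recall roughly 0.92
```

The judged shares are pulled several-fold toward the vivid event even though the underlying frequencies never changed, which is the availability pattern Tversky and Kahneman describe.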