Biases
Biases refer to systematic patterns of deviation from rationality or objectivity in judgment. These deviations can occur in various forms, such as cognitive biases, which are errors in thinking, or social biases, which involve prejudices and stereotypes. Biases can influence decision-making, perception, and behavior, often leading to inaccurate or unfair outcomes. Understanding biases is crucial in psychology for addressing their impact on individuals and society.
Key excerpts on “Biases”
- Phil Banyard (Author)
- 2003 (Publication Date)
- Routledge (Publisher)
4 Bias in psychology
Introduction
It is interesting that we have a title such as ‘bias in psychology’. It seems obvious to me that everything in psychology contains some sort of bias or other, but what is remarkable is that this bias is often not acknowledged. Perhaps this is because one feature of the scientific approach in psychology is the attempt to be objective. To be objective is usually taken to mean standing apart from the subject that is being studied, and being free from bias. This might be possible if we are studying chemicals or micro-organisms, but is it possible to be objective when we are studying the behaviour and experience of people? It is difficult, if not impossible, to stand apart from the subject that is being studied when the subject is human behaviour and experience and you are a human being. In this chapter we will look at some examples of bias in psychology, and pay particular attention to issues around cultural diversity and gender. We will start by looking at the concept of ethnocentrism.

Ethnocentrism
One source of bias in psychology comes from the fact that we tend to see things from our own viewpoint and the viewpoint of people like us. In our everyday lives we are asked to make judgments about people and events. We have a range of opinions that we are prepared to offer to other people when asked, and sometimes when not asked. In our judgments we are often inclined to show a little egocentrism (seeing things from our own particular viewpoint to the exclusion of others). Another bias that can affect our judgments is ethnocentrism.
- Richards J. Heuer (Author)
- 2020 (Publication Date)
- Burtyrki Books (Publisher)
PART III—COGNITIVE BIASES
Chapter 9 — What Are Cognitive Biases?
This mini-chapter discusses the nature of cognitive biases in general. The four chapters that follow it describe specific cognitive biases in the evaluation of evidence, perception of cause and effect, estimation of probabilities, and evaluation of intelligence reporting.

Fundamental limitations in human mental processes were identified in Chapters 2 and 3. A substantial body of research in cognitive psychology and decision-making is based on the premise that these cognitive limitations cause people to employ various simplifying strategies and rules of thumb to ease the burden of mentally processing information to make judgments and decisions.{89} These simple rules of thumb are often useful in helping us deal with complexity and ambiguity. Under many circumstances, however, they lead to predictably faulty judgments known as cognitive biases.

Cognitive biases are mental errors caused by our simplified information processing strategies. It is important to distinguish cognitive biases from other forms of bias, such as cultural bias, organizational bias, or bias that results from one’s own self-interest. In other words, a cognitive bias does not result from any emotional or intellectual predisposition toward a certain judgment, but rather from subconscious mental procedures for processing information. A cognitive bias is a mental error that is consistent and predictable. For example:

The apparent distance of an object is determined in part by its clarity. The more sharply the object is seen, the closer it appears to be. This rule has some validity, because in any given scene the more distant objects are seen less sharply than nearer objects. However, the reliance on this rule leads to systematic errors in estimation of distance. Specifically, distances are often overestimated when visibility is poor because the contours of objects are blurred. On the other hand, distances are often underestimated when visibility is good because the objects are seen sharply. Thus the reliance on clarity as an indication of distance leads to common biases.
- D. Kim Rossmo (Author)
- 2008 (Publication Date)
- Routledge (Publisher)
In this regard, it is important to distinguish between bottom-up data-driven processes versus top-down processes that are guided and driven by factors distinct from the actual data provided by the external world. The existence and power of such top-down processes in shaping the identification of visual and other patterns has been demonstrated time and again in a number of different studies using a variety of different scientific methodologies, all confirming subjective effects on perception and judgment (e.g., Balcetis & Dunning, 2006; Humphreys, Riddoch, & Price, 1997; McClelland & Rumelhart, 1981; Zhaoping & Guyader, 2007). Top-down influences include, among other things, contextual information, expectation, what we already know (or think we know), hope, motivation, and state of mind. Although top-down processing is essential for human cognition and is a sign of expertise, it can also interfere with and contaminate our perception, judgment, and decision-making processes. These biases and distortions arise from a long and well-studied list of cognitive and psychological phenomena (e.g., Evans, 1989; Gilovich, Griffin, & Kahneman, 2002; Hogarth, 1980; Kahneman, Slovic, & Tversky, 1982; Nickerson, 1998; Nisbett & Ross, 1980). These well-established cognitive and psychological phenomena (e.g., confirmation bias, cognitive dissonance, self-fulfilling prophecies, motivated reasoning, hindsight bias, escalation of commitment, etc.) cause people to lose objectivity.

Subjectivity arises when we no longer examine data purely by itself, evaluating it on its own merit without cognitive influences. When we examine information in light of such influences, we unavoidably and unconsciously perceive and judge it differently. When cognitive biases exist, we interact differently and subjectively with the information. This is manifested in a variety of ways. For example, during our examination of the data we are more likely to notice and focus on characteristics that validate and conform to extraneous information or context, a belief or a hope. Thus, the way we search and allocate attention to the data is selective and biased. Confirming data are emphasized and weighted highly, and when data quality is low (and therefore ambiguous and open to different interpretations), the existence of an extraneous influence will make people interpret the data in ways that are consistent with it. We tend to avoid and ignore data that conflict with and contradict such biases, and disconfirming data that we do notice are ignored. Finally, data that do not fit the bias or context and cannot easily be ignored are dismissed and explained away, and the weighting of disconfirming data is low.

These and other manifestations of bias and cognitive influences can make perception, judgment, and decision making unreliable. They are well researched and documented by many scientific studies (e.g., Balcetis & Dunning, 2006; Cordelia, 2006; Ditto & Lopez, 1992; Edwards & Smith, 1996; Evans, 1989; Gilovich et al., 2002; Haselton, Nettle, & Andrews, 2005; Hogarth, 1980; Kahneman et al., 1982; Koriat, Lichtenstein, & Fischhoff, 1980; Kunda, 1990; Nickerson, 1998; Nisbett & Ross, 1980; Tversky & Kahneman, 1974; Zhaoping & Guyader, 2007). The criminal justice system, for example, has in many ways adopted and taken on board these and other cognitive and psychological findings to improve investigations (e.g., Ask & Granhag, 2005; Risinger & Loop, 2002; Stelfox & Pease, 2005). A clear case is the way in which line-ups are conducted.
Rather than biasing eyewitnesses by presenting them with the suspect (the target), eyewitnesses are presented with a range of targets that include the suspect as well as numerous decoys. The line-up procedures have been drastically improved by taking into account issues of bias and other cognitive and psychological influences (e.g., Charman & Wells, 2006; Turtle, Lindsay, & Wells, 2003; Wells & Olson, 2003). In this chapter we present cognitive theory and bridge it to practical situations in the real world of investigations. Of course, within the scope of this chapter we can only bring examples, as illustrations, to convey the complex issues at hand.
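As a rough illustration of the selective-weighting mechanism described in the excerpt above, the short sketch below (an invented example, not drawn from Rossmo’s text) simulates how over-weighting confirming evidence and discounting disconfirming evidence can pull a judgment toward certainty even when the evidence is evenly balanced. The additive update rule and the specific weights are assumptions chosen only to make the effect visible.

```python
import random

def update_belief(belief, evidence, w_confirm=1.0, w_disconfirm=1.0):
    """Nudge a belief (0..1) toward a piece of evidence: +1 supports it, -1 contradicts it."""
    step = 0.05
    if evidence > 0:
        return min(1.0, belief + step * w_confirm)
    return max(0.0, belief - step * w_disconfirm)

random.seed(1)
# Evenly balanced evidence: half supports the hypothesis, half contradicts it.
evidence_stream = [1, -1] * 50
random.shuffle(evidence_stream)

unbiased = biased = 0.5
for e in evidence_stream:
    unbiased = update_belief(unbiased, e)                               # equal weighting
    biased = update_belief(biased, e, w_confirm=1.0, w_disconfirm=0.3)  # disconfirming data discounted

print(f"Unbiased final belief: {unbiased:.2f}")  # ends close to the 0.5 it started from
print(f"Biased final belief:   {biased:.2f}")    # drifts toward 1.0 despite balanced evidence
```

The exaggeration is deliberate: in this toy setup it is the unequal weighting, not the evidence itself, that drives the conclusion.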
Rethinking Psychology
Good Science, Bad Science, Pseudoscience
- Brian Hughes (Author)
- 2017 (Publication Date)
- Red Globe Press (Publisher)
Some biases are held explicitly – psychologists will have preferences about what outcomes they would like to see in the research that they conduct. The problem is that, while psychologists will be conscious of such views, they typically underestimate their effects on their scientific objectivity. The example we will consider concerns politics, and the way psychologists hold either liberal or conservative views. Other biases are held explicitly but arrived at inadvertently – psychologists will hold unquestioned assumptions about the nature of human behaviour. The problem here is that, while such assumptions might appear superficially banal, they are often quite profound. The example we will consider is when psychology attempts to distinguish normality from abnormality. A third set of biases are those which are held implicitly rather than explicitly. These biases undermine objectivity in a way the psychologist is not aware of. The views underlying such biases might be so implicit as to reflect contentious social values – even prejudices – that the psychologist would otherwise disavow. We will begin this chapter by considering one such example, which refers again to the masculinist bias historically embedded in wider society. We have already seen how poorly scrutinized biological tropes regularly lead to erroneous views about gender differences in cognition and behaviour. In this chapter, we will consider whether the weight of social gender bias might hamper the entire enterprise of science, the resulting implications for psychological research, and whether this bias extends beyond professional interactions to the very logic of psychology, thereby warranting the adoption of a new, feminist, form of epistemology better suited to the field.

Sociopolitical biases in psychology: Masculinism

The masculinist society

It is an uncomfortable historical fact that just about all human cultures were founded on a presumption of sexual inequality.
Argumentation
The Art of Persuasion
- Raymond S. Nickerson (Author)
- 2020 (Publication Date)
- Cambridge University Press (Publisher)
Tom Pyszczynski and Jeff Greenberg (1987) summarize the situation this way:

Although virtually all psychologists agree that cognitions are often subject to bias, there is very little agreement concerning the mechanisms responsible for such bias. One group of theorists, those espousing a motivational position, argue that cognitions are biased to meet the needs or desires of the individual. Influenced by the psychodynamic theories of Freud and others, these theorists maintain that cognitive biases result from powerful drives, internal conflicts, and affective states. They posit a variety of motives, such as needs for self-esteem, cognitive consistency, and a belief in a just world, that lead to inferences other than those which would result from a purely logical consideration of evidence. The other group of theorists, those espousing a purely cognitive perspective, view cognitive biases as the result of rational, albeit imperfect, inferential processes. Influenced by recent developments in cognitive psychology and information processing, these theorists focus on the way people encode, organize, and retrieve information and on the knowledge structures, transformation rules, and heuristics that are used to make inferences of various kinds. Rather than viewing cognitive bias as a result of the affective consequences of various cognitive configurations, they view it as a consequence of the dispassionate workings of the cognitive system (p. 297).

Sometimes the two types of explanation are said to invoke notions of hot and cold cognition respectively. The motivational accounts of biases predominated until relatively recently; explanations emphasizing the role of cognitive factors began to emerge and gain some prominence as part of what is sometimes called the “cognitive revolution” in psychology.
- Jeffry A. Simpson, Douglas T. Kenrick (Authors)
- 2014 (Publication Date)
- Psychology Press (Publisher)
- The tendency for people to take credit for their successes by attributing them to internal dispositions (e.g., intelligence) and to deny responsibility for their failures by attributing them to external circumstances (e.g., unfair test conditions). “I am a capable person, able to succeed.” “I failed because the situation was unfair.”
- Self-centered bias: The tendency to take more than one’s share of the credit for outcomes (both successful and unsuccessful) that involved a joint or group effort. “I have contributed more than others.”
- Egocentricity bias: The tendency to recall one’s role in past events as positive and causally significant. “I was important; I had an impact on others.”
- False consensus effect: The tendency for people to see their own attitudes, values, and behavioral choices as relatively common and appropriate. “Most people would agree with me; most people would have done as I did.”
- False uniqueness effect/Assumption of uniqueness: The tendency for people to view their identity-defining traits and abilities as relatively rare/distinctive/unique. “I have rare qualities; I am special.”
- Illusion of control: The tendency for people to believe they can influence events beyond their control to produce desirable outcomes (e.g., winning a lottery) or to avoid undesirable outcomes (e.g., becoming a victim of crime). “I am in control.”
- Hindsight bias: The tendency to find outcomes inevitable in retrospect. “I knew this would happen.”
- Self-righteous bias: The tendency to view oneself as possessing more moral integrity than others. “I am more likely to abide by moral principles than others.”

Psychological Explanations for Biases in Social Perception
Social psychologists have offered several explanations for biases in social cognition, which they have divided into those that involve “hot” and “cold” mechanisms. According to hot explanations, people’s needs and motives interfere with and slant the ways in which they process information. For example, people process information in ways that enhance and protect self-esteem or increase their sense of control. We demean out-group members and view ourselves and our friends through rose-colored glasses to allay our anxieties and to make ourselves feel good.
- H. Kent Baker, John R. Nofsinger, Vesa Puttonen (Authors)
- 2020 (Publication Date)
- Emerald Publishing Limited (Publisher)
4 SELF-INFLICTED PITFALLS: THE DANGERS OF PSYCHOLOGICAL BIASES
Investing isn’t about beating others at their game. It’s about controlling yourself at your own game. —Jason Zweig

Why do generally sensible people sometimes make foolish decisions involving money and investing? A major reason is that they have psychological biases and make predictable cognitive errors. These problems are inherent in everyone’s thinking process, but some people are more susceptible than others to their influence. Knowledge and experience help to reduce the influence of behavioral biases on financial decisions. This chapter is intended to help give you that knowledge by informing you about these self-inflicted pitfalls and how to avoid them.

Behavioral biases occur for several reasons. These problems come from limitations in such areas as memory, attention, knowledge, and time. Various cognitive errors occur because the human brain is subject to these limitations. When the thinking process attempts to simplify its information processing, it often creates predictable biases in the judgments. Investing decisions are particularly vulnerable to behavioral biases because they involve much uncertainty. Since investors don’t know what will happen in the future, they lack all of the information needed when analyzing investment choices. Without having the necessary information, the cognitive process must “fill in the gaps” to reach a conclusion. In other words, your brain takes a shortcut. This mental shortcut, also called heuristic simplification, helps in forming judgments, but also has inherent biases that are problematic for financial decision making.

The investor’s chief problem, and even his worst enemy, is likely to be himself. —Benjamin Graham

Another source of behavioral bias is your emotions. The media love to frame investors in terms of fear and greed. You might hear, “The market fell because of panicked investors.” But emotions are much more complicated than that simple characterization. For example, other factors influence investment decisions such as regret, pride, optimism, and self-image. Nevertheless, you’re better off making non-emotional, rational decisions involving your money. Lastly, other people may influence your decisions. Different groups develop norms that provide social pressure to inform and conform. The company you keep partly determines your investment decisions and thus your wealth.
An Introduction to Implicit Bias
Knowledge, Justice, and the Social Mind
- Erin Beeghly, Alex Madva (Authors)
- 2020 (Publication Date)
- Routledge (Publisher)
merely pushed the entire explanation back a level to equally intelligent homunculus-like states?

3 Empirical Data of Social Bias
At the onset of our investigation, we’re faced with several questions. What are the data surrounding social bias? In what ways do methods of testing for social bias differ from one another? What patterns emerge from these data?

3.1 Direct and Indirect Measures
Before the early 1970s, tests for social bias took a direct route: if a psychologist wanted to know if someone had a bias against a particular social group, she would ask her subjects directly. Such tests are called direct measures. Let’s focus on the case of racial attitudes in the United States. One of the earliest examples of a direct measure was a test created by Katz and Braly (1933) that asked 100 Princeton students to read through a list of 84 adjectives and write down those that they think best characterized a particular race or ethnicity. Characteristic of the time, the results indicated pervasive negative racial biases. The majority of participants in the study paired African Americans with traits like superstitious and lazy, while pairing Germans with traits like scientifically-minded and industrious.

Over time, the social landscape of the United States changed dramatically. The Civil Rights Movement of the 1950s and 60s strived to establish racial equality across the country, and ushered in a new public standard that discriminatory opinions about African Americans were socially unacceptable. During this time, direct measures began to show a decline in negative racial bias. However, although overt expressions of racist ideology were curbed, the pervasive and destructive effects of racism were still painfully evident. It seemed that people still harbored racist opinions, opinions that influenced their beliefs about and actions toward people of color; it’s just that either those individuals stopped wanting to admit those opinions to others or, more curiously, those opinions were not obvious even to them.
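As a concrete, entirely hypothetical illustration of how responses from a checklist-style direct measure like Katz and Braly’s might be summarized, the sketch below tallies which adjectives participants assign most often to a target group. The participants and trait choices are invented for illustration and are not results from the 1933 study.

```python
from collections import Counter

# Invented illustrative data: each participant lists the adjectives they feel
# best characterize the target group (a direct, self-report measure).
responses = {
    "participant_1": ["industrious", "scientifically-minded"],
    "participant_2": ["industrious", "practical"],
    "participant_3": ["scientifically-minded", "stolid"],
}

# Tally how often each trait was chosen across participants.
trait_counts = Counter(trait for traits in responses.values() for trait in traits)

# The most frequently assigned traits approximate the group stereotype
# as captured by this kind of direct measure.
for trait, count in trait_counts.most_common(3):
    print(f"{trait}: chosen by {count} of {len(responses)} participants")
```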
- Paolo Diego Bubbio, Jeff Malpas (Authors)
- 2019 (Publication Date)
- De Gruyter (Publisher)
By furthering our understanding of these and similar unconscious or unintended forms of bias and prejudice, recent philosophical research on implicit bias illustrates the substantial contribution that philosophy can make to understanding the nature of human thought and how it influences interpersonal interactions. In the domain of implicit bias research philosophy is also at its most practical: providing insights about potential ways to reduce the implicit stereotyping involved with implicit bias.

2.1 The Psychology of Implicit Bias

One strand of philosophical research into implicit bias aims to identify the psychological underpinnings of implicit bias. It aims to answer the question, what, precisely, are implicit biases? How do implicit biases relate to better-recognized psychological states? For example, much recent philosophical discussion has aimed to answer the question: “How do implicit biases relate to beliefs?” For a significant number of the years during which implicit biases have been studied, it has been assumed in the psychological literature that they are merely associations that people make in their thinking — for example, one might associate social groups (e.g., women) and their members with concepts (weakness) or feelings (aversion) — and that they can only be changed via retraining. They have been distinguished from other mental states on the basis that the believer is often unaware of or unable to control the operation of implicit biases.