DEFINING AND MEASURING IMPLICIT BIAS
The study of implicit social cognition (ISC) "examines those mental processes that operate without conscious awareness or conscious control but nevertheless influence fundamental evaluations of individuals and groups."1 It is grounded primarily in the field of psychology but has been taken up by many other fields, including neuroscience.2 Ralph Adolphs, professor of psychology and neuroscience at the California Institute of Technology, has complained that "social cognition is a domain with fuzzy boundaries and vaguely specified components." Nonetheless, he recognizes that it can be understood as guiding "both automatic and volitional behavior by participating in a variety of processes that modulate behavioral response: memory, decision-making, attention, motivation and emotion are all prominently recruited when socially relevant stimuli elicit behavior."3
ISC's potential for political applications is made abundantly clear in an article titled "Political Neuroscience: The Beginning of a Beautiful Friendship," in which an interdisciplinary group of coauthors assert that "the application of neuroscience to political topics offers a powerful set of research methods that promises to integrate multiple levels of analysis. As E. O. Wilson (1998) wrote in Consilience: The Unity of Knowledge: 'the social sciences are intrinsically compatible with the natural sciences. The two great branches of learning will benefit to the extent that their modes of causal explanation are made consistent.'"4
Psychologists Curtis Hardin and Mahzarin Banaji assert that "implicit prejudice" itself "(a) operates unintentionally and outside awareness, (b) is empirically distinct from explicit prejudice, and (c) uniquely predicts consequential social judgment and behavior."5 Thus, as other authors assert, "people can have implicit prejudices—feelings, favorable or unfavorable, toward persons or groups that they did not endorse or even realize that they possessed."6
ISC emerged from a history of psychological studies of prejudice that John Dovidio has broadly characterized as happening in three waves. The first wave, from the 1920s through the 1950s, cast prejudice as a form of psychopathology involving "not simply a disruption in rational processes, but as a dangerous aberration from normal thinking." The second wave, lasting until the early 1990s, "began with an opposite assumption: Prejudice is rooted in normal rather than abnormal processes." This approach conceived of prejudice, stereotyping, and bias as "outcomes of normal cognitive processes associated with simplifying and storing information of overwhelming quantity and complexity that people encounter in daily life." Beginning in the mid-1990s, a third wave (the current wave) emerged that "emphasizes the multidimensional aspect of prejudice and takes advantage of new technologies to study processes that were earlier hypothesized but not directly measurable." During this wave, psychologists developed the IAT (Implicit Association Test) and, more recently, fMRI studies of neuropsychological processes to produce "a more comprehensive, interdisciplinary, and multidimensional understanding of prejudice."7 This current interdisciplinary approach is notable for its focus on quantifiable, measurable, and (in the case of fMRI) visualizable metrics of prejudice.
Despite this rather sequential characterization, it bears noting that each "wave" did not simply supersede and render prior research irrelevant. Rather, these waves are best understood as building upon, interweaving with, and influencing one another, more like marbled layers of research than distinct historical strata. Thus, for example, many scholars remain deeply concerned with what they see to be the pathologies of racism, particularly in its most extreme forms, though they may also embrace current work on ISC.
HEURISTICS AND BIASES
Amos Tversky and Daniel Kahneman's profoundly influential article "Judgment Under Uncertainty: Heuristics and Biases" was published in 1974, during the "second wave," but it remains of fundamental significance to the work of ISC theorists in general and of behavioral realists in the law in particular.8 Among scholars during the "third wave," Mahzarin Banaji and Anthony Greenwald, writing from within the discipline of psychology, as well as Richard Thaler and Cass Sunstein, writing from within economics and law, draw directly and heavily on this work. Banaji and Greenwald refer to the "heuristics and biases" identified by Tversky and Kahneman as "mind bugs," which are "ingrained habits of thought that lead to errors in how we perceive, remember, reason, and make decisions."9 Thaler and Sunstein simply call them "rules of thumb."10 They identify these mind bugs or rules of thumb with three key heuristics analyzed by Tversky and Kahneman as centrally shaping the way people use shortcuts to make sense of the complicated array of information that we encounter in everyday life: anchoring, availability, and representativeness.
Banaji and Greenwald observe that "the mind does not search for information in a vacuum. Rather, it starts by using whatever information is immediately available as a reference point or 'anchor' and then adjusting."11 Thaler and Sunstein illustrate the concept of anchoring by considering how they, living in Chicago, might respond to a request to guess the population of Milwaukee, about two hours away. They know little about Milwaukee other than that it is the largest city in Wisconsin. So they start with something they do know, the population of Chicago, which is roughly three million. This is their anchor. Working from this number, they consider that Milwaukee is a major city but clearly isn't as big as Chicago, perhaps one-third its size, so they estimate its population at one million. Then, they compare this process to that of a hypothetical resident of Green Bay, Wisconsin, who uses Green Bay's population of one hundred thousand as his anchor and guesses that Milwaukee is three times as big: three hundred thousand. Like Banaji and Greenwald, Thaler and Sunstein refer to this process as "anchoring and adjustment": in conditions of uncertainty, you start with the anchor you know and adjust from there. The problem is that the adjustment is often insufficient, creating a bias toward the anchor. In their example, they note that the population of Milwaukee is actually about 580,000 people.12
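The arithmetic of anchoring and adjustment can be made concrete. The following sketch, in Python, uses only the population figures from Thaler and Sunstein's example; the adjustment factors (one-third, three times) are the rough guesses described above:

```python
# Anchoring and adjustment, using the figures from Thaler and Sunstein's
# Milwaukee example. Each estimator starts from a known anchor and
# adjusts by a rough factor; both adjustments prove insufficient.
ACTUAL_MILWAUKEE = 580_000

chicago_anchor = 3_000_000               # a Chicagoan anchors on Chicago
chicago_guess = chicago_anchor // 3      # "about a third the size of Chicago"

green_bay_anchor = 100_000               # a Green Bay resident anchors on Green Bay
green_bay_guess = green_bay_anchor * 3   # "about three times the size of Green Bay"

# Both guesses err toward their respective anchors: the Chicagoan
# overshoots the actual population, the Green Bay resident undershoots it.
print(chicago_guess)    # 1000000 -- too high
print(green_bay_guess)  # 300000  -- too low
```

The bias is visible in the signs of the errors: each guess lands between its anchor and the true figure, pulled toward the anchor.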
Banaji and Greenwald illustrate the availability heuristic by asking the reader, "Pick the correct answer in each of the following three pairs: Each year, do more people in the United States die from cause (a) or cause (b)?"
- (a) murder (b) diabetes
- (a) murder (b) suicide
- (a) car accidents (b) abdominal cancer13
They note that most people chose (b) for question 1 and (a) for questions 2 and 3. In fact, the correct answer to each question is (b). The availability heuristic means that "when instances of one type of event (such as murder rather than suicide) come more easily to mind than those of another type, we tend to assume that the first event also must occur more frequently in the world."14
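The quiz and its reported answer pattern can be tabulated directly. A minimal Python sketch, using only the typical and correct answers that Banaji and Greenwald report:

```python
# The availability quiz, encoded as data: for each pair, the answer most
# people give (driven by which cause of death comes to mind more easily)
# versus the answer reported as correct.
quiz = [
    {"pair": ("murder", "diabetes"),                "typical": "b", "correct": "b"},
    {"pair": ("murder", "suicide"),                 "typical": "a", "correct": "b"},
    {"pair": ("car accidents", "abdominal cancer"), "typical": "a", "correct": "b"},
]

# Availability leads the typical answer astray on two of the three pairs:
# the more vivid, more reported cause feels more frequent than it is.
wrong = sum(1 for q in quiz if q["typical"] != q["correct"])
print(f"{wrong} of {len(quiz)} typical answers are wrong")  # 2 of 3
```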
Thaler and Sunstein characterize "representativeness" simply as the idea "that when asked to judge how likely it is that A belongs to category B, people … answer by asking themselves how similar A is to their image or stereotype of B (that is, how 'representative' A is of B)."15 Given the direct connection of representativeness to stereotyping, one can readily appreciate its implications for understanding implicit prejudice.
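One way to see why representativeness misleads is that a similarity judgment ignores base rates, which a probability calculation must take into account. The sketch below applies Bayes' rule to purely hypothetical numbers, chosen for illustration rather than drawn from Tversky and Kahneman:

```python
# Why representativeness misleads: judging P(category | description) by
# similarity alone ignores how rare the category is. All numbers here
# are hypothetical, chosen only to illustrate the gap.

def posterior(base_rate, hit_rate, false_alarm_rate):
    """Bayes' rule: P(B | A) from P(B), P(A | B), and P(A | not B)."""
    numerator = base_rate * hit_rate
    denominator = numerator + (1 - base_rate) * false_alarm_rate
    return numerator / denominator

# Suppose a description sounds highly "representative" of category B:
# it fits 90 percent of B's members but also 10 percent of everyone else.
# If only 1 person in 100 belongs to B, the true probability is far
# below what the similarity suggests.
p = posterior(base_rate=0.01, hit_rate=0.90, false_alarm_rate=0.10)
print(round(p, 3))  # 0.083 -- similarity suggests ~0.9; Bayes gives ~0.08
```

The gap between the intuitive answer and the computed one is the base-rate neglect that the representativeness heuristic produces.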
EXPLICIT AND IMPLICIT BIAS
Such cognitive heuristics are understood as operating largely at an unconscious or implicit level in contrast to realms of more conscious, explicit deliberation and awareness. Daniel Kahneman popularized this "dual-system" model in his best-selling book Thinking, Fast and Slow, where he describes it as follows:
- System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
- System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.16
This dual-system model applies across a wide range of cognitive tasks but has been particularly significant in framing discussions of ISC and drawing distinctions between implicit and explicit bias. It suffuses the work of Thaler and Sunstein, who state baldly that it is "how we think." They describe the dual-system model as involving "a distinction between two types of thinking, one that is intuitive and automatic and another that is reflective and rational."17
Among psychologists, David Amodio and Saaid Mendoza use the terms implicit and explicit to refer to a subject's own awareness of a particular psychological process, such as bias: "an explicit process can be consciously detected and reported (regardless of whether it was triggered spontaneously). Any process that is not explicit is referred to as implicit. Hence, 'implicit' describes a process that cannot be directly inferred through introspective awareness."18
Psychologists Brian Nosek and Rachel Siskind offer a slightly different definition, asserting that "implicit social cognition" is not a specific psychological process but rather "is a descriptive term encompassing thoughts and feelings that occur independently of conscious intention, awareness, or control."19 Anthony Greenwald and Linda Hamilton Krieger also emphasize awareness and control, noting that "a belief is explicit if it is consciously endorsed," whereas implicit cognition involves "processes of social perception, impression formation, and judgment" over which a person "may not always have conscious, intentional control."20
Implicit bias involves three basic steps: the mental recognition or construction of a social group; the association of a stereotype with that group; and the layering of a positive or negative association or attitude on top of the stereotype. Social psychologists define a social stereotype as "a mental association between a social group and a category or trait."21 Stereotypes in themselves are not necessarily normative. In contrast, an attitude is "an evaluative disposition, that is, the tendency to like or dislike, or to act favorably or unfavorably toward someone or something."22 Implicit biases, therefore, "are discriminatory biases based on implicit attitudes or implicit stereotypes."23 Implicit attitudes may be related to explicit attitudes, "but [the two] are distinct in that neither is robustly predictive of the other."24 Of particular interest are situations in which explicit and implicit attitudes toward the same object differ. These dissociations are most commonly observed with respect to stigmatized groups, such as racial minorities.25
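The three steps can be rendered as a minimal data sketch. The group, trait, and valence values below are placeholders rather than claims about any actual group; the point is only that the stereotype (step two) and the evaluative attitude (step three) are separate layers, and that a stereotype alone, without an evaluative charge, is not yet a bias:

```python
# A minimal sketch of the three-step structure described above: a social
# group (step 1), a stereotype associated with it (step 2), and an
# attitude layered on top (step 3). All values are placeholders.
from dataclasses import dataclass

@dataclass
class ImplicitAssociation:
    group: str        # step 1: the perceived social group
    stereotype: str   # step 2: trait mentally associated with the group
    valence: float    # step 3: attitude, from -1 (negative) to +1 (positive)

    def is_bias(self) -> bool:
        # A stereotype by itself is not necessarily normative; bias
        # requires an evaluative (nonzero) attitude layered on top.
        return self.valence != 0.0

neutral = ImplicitAssociation("group X", "trait Y", 0.0)
biased = ImplicitAssociation("group X", "trait Y", -0.6)
print(neutral.is_bias(), biased.is_bias())  # False True
```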
MEASURING IMPLICIT BIAS
As Amodio notes, "Many of the central components of intergroup bias (e.g., the construct of implicit bias) are exceedingly difficult to study using the traditional methods of social psychology, as they appear to be impervious to introspection, and thus to self-report."26 The science of ISC began to gain traction in the 1990s as psychologists developed new techniques for measuring and quantifying it. At a larger social level, bias (both implicit and explicit) may often be inferred from significant statistical disparities in the treatment of racial groups with respect to a particular practice. Thus, for example, the phenomenon of differential traffic stops by police that has come to be known as "driving while black" became the basis for a number of successful lawsuits challenging racial profiling.27 One early study, done in 1993, found that in a particular place where more than 98 percent of the cars on the New Jersey Turnpike were speeding, the police essentially had discretion to pull over anyone they chose. Although African Americans made up only 15 percent of the speeders, not statistically different from their proportion of the driving population, 35 percent of the drivers pulled over were black. The average black driver was almost four times more likely to be pulled over than a nonblack driver.28 Such s...