Failed Evidence

Why Law Enforcement Resists Science

David A. Harris
About This Book

With the popularity of crime dramas like CSI focusing on forensic science, and increasing numbers of police and prosecutors making widespread use of DNA, high-tech science seems to have become the handmaiden of law enforcement. But this is a myth, asserts law professor and nationally known expert on police profiling David A. Harris. In fact, most of law enforcement does not embrace science—it rejects it instead, resisting it vigorously. The question at the heart of this book is why.

»» Eyewitness identification procedures using simultaneous lineups—showing the witness six persons together, as police have traditionally done—produce a significant number of incorrect identifications.

»» Interrogations that include threats of harsh penalties and untruths about the existence of evidence proving the suspect’s guilt significantly increase the prospect of an innocent person confessing falsely.

»» Fingerprint matching does not use probability calculations based on collected and standardized data to generate conclusions, but rather human interpretation and judgment. Examiners generally claim a zero rate of error—an untenable claim in the face of publicly known errors by the best examiners in the U.S.

Failed Evidence explores the real reasons that police and prosecutors resist scientific change, and it lays out a concrete plan to bring law enforcement into the scientific present. Written in a crisp and engaging style, free of legal and scientific jargon, Failed Evidence will explain to police and prosecutors, political leaders and policy makers, as well as other experts and anyone else who cares about how law enforcement does its job, where we should go from here. Because only if we understand why law enforcement resists science will we be able to break through this resistance and convince police and prosecutors to rely on the best that science has to offer. Justice demands no less.

Information

Publisher: NYU Press
Year: 2012
ISBN: 9780814744666
Topic: Law

1

Introduction: Science-Driven Policing, or Police Indifference to Science?

In 2010, and for the previous nine years running, CSI: Crime Scene Investigation ranked among the most popular shows on television in the United States.1 The program became a hit so quickly after its premiere in 2000 that the original series, set in Las Vegas, spawned two clones: CSI: Miami and CSI: New York. These shows put a new twist on the old police procedural drama. The CSI officers solved crimes with high-tech forensics: gathering DNA, lifting fingerprints with revolutionary new techniques, and using science to reconstruct the paths of bullets. Watching these programs, the viewer knows that policing has changed. For every member of the CSI team using a gun, more wield test tubes, DNA sampling equipment, and all manner of futuristic gizmos designed to track down witnesses and catch the bad guys.2 The show signals a break with the past, because it revolves around the way police use modern science to find the guilty and bring them to justice.
CSI reflects the emergence of DNA evidence as a powerful tool since it first appeared in American criminal courts in the late 1980s. With DNA and other formidable forensic techniques on our side, little could escape our scientific police work. In this new world, in which science could tell us definitively that the police had the right guy, with a probability of millions or even billions to one, the game had changed for good. The “just the facts, ma’am” approach of Sergeant Joe Friday, and the slow and inexact old-school ways that might or might not turn up evidence, began to seem like quaint relics of a bygone era. Sure, some real-world police protested that CSI raised unrealistic public expectations of both forensic science and the police,3 but CSI simply put a drama-worthy sheen on the way that police departments liked to portray themselves in the age of DNA: using the best of what science had to offer to construct air-tight criminal cases. Police frequently announced that they had used DNA to catch guilty people, sometimes for crimes far in the past, attracting wide public notice and bolstering law enforcement’s science-based image. With headlines like “State, City Police Laud Increase in Arrests Using DNA”4 in Baltimore, “Georgia DNA Solves 1,500 Cases”5 in Atlanta, “DNA Databanks Allow Police to Solve at Least Four Murders”6 in Memphis, and “With Added Lab Staff, DNA Tests Resolve String of Old Killings”7 in Milwaukee, the direction and approach of police work now seem woven together with the latest scientific advancements. Science has given police and prosecutors an enormous, unbeatable advantage.
But this all-too-common view of modern police work using science to move into a gleaming, high-tech future turns out to be a myth. When we strip away the veneer of television drama and the news stories about how DNA has helped police catch another killer or rapist, the real picture concerning law enforcement and science actually looks much different. With the exception of DNA (and then, only sometimes), most of our police and prosecutorial agencies do not welcome the findings of science; they do not rush to incorporate the latest scientific advances into their work. On the contrary, most police departments and prosecutors’ offices resist what science has to say about how police investigate crimes. The best, most rigorous scientific findings do not form the foundation for the way most police departments collect evidence, the way they test it, or the way they draw conclusions from it. Similarly, most prosecutors have not insisted upon evidence collected by methods that comply with the latest scientific findings in order to assure that they have the most accurate evidence to use in court. Like police, most prosecutors have resisted. And this resistance comes despite a nearly twenty-year drumbeat of exonerations: people wrongly convicted based on standard police practices, but proven irrefutably innocent based on DNA evidence. These DNA exonerations, now numbering more than 250 nationwide, prove that traditional techniques of eyewitness identification, suspect interrogation, and forensic testing contain fundamental flaws that have resulted in miscarriages of justice.
Yet the resistance continues. At best, police and prosecutors have used advances in science selectively, when it helps their cases. At worst, they have actively opposed replacing questionable investigative methods with better, empirically proven techniques, sometimes even insisting on retaining flawed methods. As a matter of principle and logic, this indifference to improved practices that will avoid miscarriages of justice seems puzzling and irresponsible, since we know for certain that we can do better than we used to. As a matter of concrete cases, when we see that the failure to use our best methods sometimes leads to both the punishment of the innocent and the escape of the guilty, indifference can become a catastrophe for our system of justice. It is this resistance to sound, science-based police investigative methods that forms the heart of this book.

Brandon Mayfield and the Infallible Science of Fingerprinting

Brandon Mayfield’s case makes a striking example. In March of 2004, terrorists bombed four commuter trains in Madrid, killing 191 people and wounding approximately eighteen hundred. Spanish police soon found a partial fingerprint on a plastic bag in a car containing materials from the attack. Using a digital copy of the fingerprint sent by the Spanish police, a senior FBI fingerprint examiner made “a 100% identification” of Brandon Mayfield, an Oregon attorney, whose prints appeared in government databases because of his military service and an arrest years earlier.8 Three other fingerprint experts confirmed the match of Mayfield to the print found on the bag: FBI supervisory fingerprint specialist Michael Wieners, who headed the FBI’s Latent Print Unit; examiner John Massey, a retired FBI fingerprint specialist with thirty years of experience; and Kenneth Moses, a leading independent fingerprint examiner.9 The FBI arrested Mayfield, and at the Bureau’s request, a court incarcerated him for two weeks, despite the fact that he did not have a valid passport on which he could have traveled to Spain; he claimed he had not left the United States in ten years.10 When the FBI showed the Spanish police the match between the latent print from the bag and Mayfield’s prints, the Spanish police expressed grave doubts. The FBI refused to back down, even declining the request of the Spanish police to come to Madrid and examine the original print.11 Only when the Spanish authorities matched the print with an Algerian man living in Spain did the FBI admit its mistake. The Bureau issued an apology to Mayfield12—an action almost unprecedented in the history of the FBI—and later paid him millions of dollars in damages in an out-of-court settlement.13
The extraordinary apology and the payment of damages may help to rectify the injustice done to Mayfield and his family. But for our purposes, what happened after the FBI admitted its mistakes and asked the court to release Mayfield shows us something perhaps more important. The Mayfield disaster occurred because, among other things, the verification of the original FBI match of Mayfield’s print—a procedure performed by three well-regarded fingerprint experts—ignored one of the most basic principles of scientific testing: the verification was not a “blind” test. The three verifying examiners knew that an identification had already been made in the case, and they were simply being asked to confirm it.14 No scientific investigation or basic research in any other field—a test of the effectiveness of a new over-the-counter medicine, for example—would ever use a nonblind testing procedure; yet nonblind verification is still routine in fingerprint identification. Further, the FBI conducted proficiency testing of all of the examiners involved in the Mayfield case—but only after revelation of the errors, not before. At the time of Brandon Mayfield’s arrest, the FBI did no regular proficiency testing of its examiners to determine their competence, even though such testing routinely occurs in almost any commercial laboratory using quality-control procedures. Further, and perhaps most shocking of all, the fingerprint comparison in the Mayfield case relied not on rigorously researched data and a comparison made under a well-accepted set of protocols and standards, but on the unregulated interpretations of the examiners.
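The blind-testing principle at issue here is easy to state in concrete terms. Below is a minimal sketch, in Python, of what a blind verification assignment could look like: the verifier receives the latent print and several candidate prints, but never learns which candidate the first examiner matched. Every name and identifier in it is a hypothetical illustration, not any agency's actual workflow.

```python
import random

def build_blind_verification_packet(latent_print_id, candidate_ids, prior_match_id):
    """Assemble a verification packet that hides the earlier conclusion.

    The verifier sees the latent print and several candidates, including
    the one the first examiner matched, but is never told which candidate
    that was, so agreement cannot come from expectation alone.
    (All identifiers here are hypothetical placeholders.)
    """
    if prior_match_id not in candidate_ids:
        raise ValueError("the prior match must be among the candidates")
    shuffled = list(candidate_ids)
    random.shuffle(shuffled)  # remove any ordering cue about the prior match
    # The packet deliberately omits prior_match_id and any case context.
    return {"latent": latent_print_id, "candidates": shuffled}

packet = build_blind_verification_packet(
    "latent-001", ["cand-A", "cand-B", "cand-C", "cand-D"], prior_match_id="cand-B")
print(packet)
```

The point of the shuffle and the omitted conclusion is exactly the safeguard the Mayfield verifications lacked: the second opinion is only independent if the second examiner cannot know what answer is expected.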
Yet, confronted by an undeniable, publicly embarrassing error that highlighted the crying need for fingerprint analysts to adopt standard practices used in every scientific discipline, the experts refused to yield. Their answer was resistance and denial: resistance to change, and denial of the existence of a problem. Months after the humiliating exposure of the Mayfield debacle, some of those involved continued to insist that the matching of prints to identify unknown perpetrators could not produce mistakes—ever. In an article on the Mayfield case and other instances of mistaken forensic identification, Agent Massey, who had verified the print as belonging to Mayfield, told the Chicago Tribune that he and his fellow analysts had just done their jobs—nothing more. He acknowledged that when he verified Mayfield’s print, he knew that another examiner had already declared the print a match; in other words, he had not performed a blind verification test. Nevertheless, he said, “I’ll preach fingerprints till I die. They’re infallible.”15 Another examiner interviewed about the Mayfield case made an almost identical, unequivocal statement: “Fingerprints are absolute and infallible.”16 When another false fingerprint match led to the two-year incarceration of a man named Rick Jackson, CBS News correspondent Lesley Stahl confronted another FBI agent on the news program 60 Minutes. The agent’s words eerily echoed Agent Massey’s declarations of fingerprint infallibility. After a demonstration of fingerprint identification by the agent, Stahl asked, “What are the chances that it’s still not the right person?” Without hesitation, the agent replied, “zero,” because “[i]t’s a positive identification.”17
As an institution, the FBI did no better at accepting its error and changing its practices. The Bureau announced that it would conduct an investigation of the practices of its Latent Fingerprint Unit, with an eye to “adopting new guidelines.” (The Latent Fingerprint Unit conducted this investigation itself.) As these words are written, more than six years after a mistaken fingerprint match almost sent Brandon Mayfield to prison for the rest of his life, the FBI laboratory’s fingerprint identification division does not use standard blind testing in every case. The laboratory widely considered to have the best fingerprint identification operation in the country continues to resist change and remains in denial, and has refused to move toward practices and safeguards that the scientific world has long considered standard.

How We Got Here

To understand how we got to this point, we must start with DNA. DNA analysis did not develop in the context of police-driven forensic investigation, but rather as a wholly scientific endeavor. This helps explain why DNA testing has always included fully developed standard protocols for its use and the ability to calculate the probability of its accuracy based on rigorously analyzed data.18 This made courts willing to allow its use as proof. Despite its obvious complexity, DNA analysis had been thoroughly tested and was well grounded in scientific principles. As long as forensic scientists followed proper protocols for handling and testing the evidence, DNA could “individualize”—indicate whether a particular person had or had not supplied a tiny piece of tissue or fluid—with a degree of precision unimaginable before. The potential for solving crimes, particularly murders and rapes by strangers in which police might find some fragment of the assailant’s DNA left behind, seemed limitless. Defendants who might have escaped detection and conviction got the punishment they deserved. Even decades-old “cold cases” would yield to this marvelous new tool, provided that enough testable biological material still existed, and advances in testing rapidly made accurate analysis of ever smaller samples possible.19
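The probabilities “of millions or even billions to one” come from a simple calculation. Under the standard product rule, the chance that a random, unrelated person shares an entire DNA profile is the product of the genotype frequencies at each independently inherited locus. Here is a minimal sketch using made-up frequencies, not real population data:

```python
from math import prod

# Hypothetical genotype frequencies at seven independent loci.
# Illustrative numbers only, not real population data.
genotype_frequencies = [0.10, 0.05, 0.08, 0.12, 0.07, 0.09, 0.06]

# Product rule: the chance a random, unrelated person shares the whole
# profile is the product of the per-locus frequencies.
rmp = prod(genotype_frequencies)
print(f"Random match probability: 1 in {1 / rmp:,.0f}")
```

Even these seven invented loci push the odds past one in fifty million; testing more loci multiplies in more small frequencies, which is how full multi-locus profiles reach the billions described above.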
Soon enough, though, police and prosecutors found that the great sword of DNA had two edges: it could confirm guilt like nothing else, but it could also exclude a suspect that the authorities believed had perpetrated the crime. Sometimes the prosecution had already tried the suspect and obtained a guilty verdict. DNA could convict, but it could also throw police investigations, charges, and even convictions into the gravest doubt. A pattern emerged: many of the cases upended by DNA rested on well-accepted types of evidence, like identifications by eyewitnesses, confessions from suspects, or forensic science producing a “match” with a perpetrator. Thus DNA began to demonstrate that these traditional methods actually did not have anything like the rock-solid basis everyone in law enforcement had always imagined. The very basis for trusting these standard types of evidence began to erode.
By early 2010, DNA had resulted in the exoneration of more than 250 previously convicted people, some of whom had spent years on death row.20 By far, the single most common factor, found in 75 percent of these cases, was incorrect eyewitness identifications;21 the second most common type of error was inaccurate (or sometimes downright fraudulent) forensic testing.22 Perhaps most surprisingly, DNA also proved that some suspects did something most people considered unimaginable: they confessed to serious crimes that they had not committed.23 All in all, the DNA exoneration cases showed, beyond any doubt, that we simply had to rethink some of our fundamental assumptions about the most basic and common types of evidence used in criminal cases. An eyewitness who expressed absolute certainty when identifying the perpetrator could actually be wrong. A person who confessed to a crime might not actually have done it. And forensic analysis, including fingerprint matching, was not invariably correct.
With DNA exonerations continuing every year in the 1990s and 2000s, more and more research on traditional police investigative methods began to come to prominence. The research had earned acceptance in the scientific community, sometimes decades before, through peer review, publication, and replication by other scientists, but most of it had remained obscure except to a small circle of researchers. With the advent of DNA exonerations, the science became important to anyone interested in the integrity of the criminal justice system. Decades of these studies, it turned out, pointed out flaws in the ways that police conducted eyewitness identifications. Other research showed that the most widely used method of interrogating suspects rested upon assumptions shown to be deeply flawed, and that common interrogation methods created real risks of false confessions.
DNA’s precision and scientifically sound foundation effectively raised the bar for every other forensic technique and investigative method. Experts and researchers began to call traditional (i.e., non-DNA) investigative methods into question.24 The full scope of damage to the credibility of police investigative tactics became visible in 2009, with the National Research Council’s report on forensic sciences, Strengthening Forensic Science in the United States: A Path Forward.25 In this landmark report, discussed in detail in chapter 2, a large group of the most knowledgeable people in forensic science and related fields declared that, aside from DNA and a few other solidly scientific disciplines such as toxicology, almost none of the forensic science disciplines could claim any real scientific basis for their results. Most of the forensic work done in the United States did not follow the standard scientific precautions against human cognitive biases. Striking at the core of forensic science, particularly fingerprint analysis, the report stated that (again with the exception of DNA and some other disciplines based firmly in the hard sciences) none of the common forensic disciplines could proclaim themselves rigorously reliable, let alone infallible.26
But this sudden exposure of the shortcomings of traditional police investigation tactics also had another, more positive side. The same bodies of research that demonstrated the failings of traditional eyewitness identification testimony, interrogation methods, and forensics also revealed better, more accurate methods to solve crimes—or, at the very least, improved ways to investigate that would greatly reduce the risks of incorrect charges and convictions. These new methods could help guard against mistakes, both by producing more reliable evidence and by eliminating common human cognitive biases from police and forensic investigation. Many of these improved methods would cost very little—sometimes nothing. Thus, the research on traditional investigative methods did not just point out flaws; it pointed the way to better, more reliable tactics. A few examples make this plain.
• Research by cognitive psychologist Gary Wells and others demonstrated that eyewitness identification procedures using simultaneous lineups—showing the witness six persons together, as police have traditionally done—produce a significant number of incorrect identifications. This is the case because showing the six persons to the witness simultaneously encourages witnesses to engage in relative judgment: they make a selection by asking themselves, “Which of the people in the lineup looks most like the perpetrator, even if I can’t say for sure that the perpetrator is there?” Wells discovered that if he showed the persons in the lineup to the witnesses sequentially—one at a time, instead of all six together—a direct comparison of each individual person in the lineup to the witness’s memory of the perpetrator replaces the flawed relative judgment process, reducing the number of false identifications significantly. (A toy simulation of this relative-judgment mechanism appears after this list.)
• Research has demonstrated that interrogations that include threats of harsh penalties (“Talk, or we’ll ask for the death penalty.”) and untruths about the existence of evidence proving the suspect’s guilt (a false statement by police asserting that they found the suspect’s DNA at the scene) significantly increase the prospect of an innocent person confessing falsely. By eliminating these tactics, police can reduce false confessions.27
• Fingerprint matching does not use probability calculations based on collected and standardized data to generate conclusions, but rather human interpretation and judgment. Examiners generally claim a zero rate of error—an untenable claim in the face of publicly known errors by the best examiners in the United States. To preserve the credibility of fingerprint examination, forensic labs could use exactly the kinds of proficiency testing and quality assurance standards scientists have crafted for other fields. These methods have become widely available; scientists, engineers, and researchers all use them for work that requires high levels of reliability.28 (See the sketch after this list for why observed zero errors never establish a zero error rate.)
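Here is the toy simulation promised in the first bullet above: a Monte Carlo sketch of the relative-judgment idea under crude, invented assumptions (uniform resemblance scores, arbitrary decision thresholds), not a model drawn from Wells’s studies. In a lineup from which the real perpetrator is absent, a witness who picks whoever looks most like their memory errs far more often than one who judges each face on its own against a strict criterion.

```python
import random

random.seed(42)
TRIALS = 100_000
LINEUP_SIZE = 6
STRICT_CRITERION = 0.9   # assumed "this IS the person" bar (absolute judgment)
LOOSE_CRITERION = 0.5    # assumed "closest resemblance is good enough" bar

relative_false_ids = 0   # simultaneous lineup, relative judgment
absolute_false_ids = 0   # sequential lineup, absolute judgment

for _ in range(TRIALS):
    # Target-absent lineup: six innocent fillers, each with a random
    # resemblance to the witness's memory of the perpetrator.
    scores = [random.random() for _ in range(LINEUP_SIZE)]
    # Relative judgment: choose the best-resembling filler as long as the
    # best one clears a loose bar ("who looks MOST like him?").
    if max(scores) > LOOSE_CRITERION:
        relative_false_ids += 1
    # Absolute judgment: identify only a filler that individually clears
    # a strict bar ("is this him, yes or no?").
    if any(s > STRICT_CRITERION for s in scores):
        absolute_false_ids += 1

print(f"relative judgment false IDs: {relative_false_ids / TRIALS:.1%}")  # ~98%
print(f"absolute judgment false IDs: {absolute_false_ids / TRIALS:.1%}")  # ~47%
```

The exact percentages mean nothing; the gap between them is the point, and it flows directly from asking a comparative question instead of an absolute one.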
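And for the zero-error claim in the last bullet, a standard statistical result (sometimes called the rule of three) shows why even a spotless proficiency record cannot establish a zero error rate: if zero errors are observed in n independent tests, the 95% upper confidence bound on the true error rate is still roughly 3/n. A short sketch with hypothetical test counts:

```python
def rule_of_three_upper_bound(n_tests: int) -> float:
    """Approximate 95% upper confidence bound on an error rate
    when zero errors are observed in n_tests independent trials."""
    return 3.0 / n_tests

# Hypothetical proficiency-test counts: even a flawless record leaves
# a nontrivial error rate statistically possible.
for n in (100, 1_000, 10_000):
    print(f"0 errors in {n:>6,} tests -> true rate could still be "
          f"~{rule_of_three_upper_bound(n):.2%}")
```

So an examiner who has passed a hundred proficiency tests without error can still, consistent with that record, have an error rate of several percent; “infallible” is a claim no finite test history can support.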

The Reaction: From Indifference to Hostility

In light of all of the challenges that science n...
