An Anecdote…
Related by Evan Hodge, former chief of the FBI Firearms and Toolmark Unit: A detective goes to a ballistics expert with a .45 pistol and a bullet recovered from a murder victim. “We know this guy shot the victim and this is the gun he used. All we want you to do is confirm what we already know so we can get a warrant to get the scumbag off the street. We will wait. How quick can you do it?”
The analyst conducted tests and provided a finding that linked the slug to the gun. The suspect was confronted with this damning forensic evidence in an interrogation that ended with his confession.
The defendant then led the police to a different .45 pistol, which tests later showed was the true murder weapon.
-- Evan Hodge, Guarding Against Error, 20 Ass’n Firearms & Toolmark Examiners’ J. 290, 292 (1988).
Overview
- Discovered wrongful convictions
- Prevalence of forensic flaws in those cases
- Ways these cases are and are not representative
- A provisional taxonomy of forensic flaws
  - Many different ways things can go wrong
  - Different responses required to address them
- A closer look at DNA exoneration cases
  - Flawed forensic testimony
- Relevance of these errors today, and lessons
Wrongful Convictions
- To date, 259 people have been exonerated in the United States by post-conviction DNA testing.
- Texas leads the nation with 40 (42) DNA exonerations.
DNA Exonerations and Forensic Problems
In more than 50% of DNA exonerations, unvalidated or improper forensic science contributed to the wrongful conviction, making it the second most prevalent factor contributing to wrongful convictions.
-- The Innocence Project
DNA Exonerations and Forensic Problems
In 60% of DNA exoneration cases, “forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data.”
-- Garrett/Neufeld Study: Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009.
The Limits of DNA
- DNA demonstrates the power of forensic science and exposes previously unrecognized limitations of forensic science.
- The overwhelming majority of criminal cases are not amenable to DNA testing.
- Over-representation of certain kinds of cases: rape and, to a lesser degree, murder.
Discovered Wrongful Convictions and the Magnitude of the Problem
- Discovered wrongful convictions represent only a tiny fraction of criminal convictions, BUT the discovered error is not all of the error.
- In non-DNA cases, the innocent face a very difficult, virtually insurmountable burden.
- While the true rate of wrongful conviction is difficult to know, we should be focusing on the fact that much (not all) of this error is preventable. We have identified patterns of preventable error.
- The costs of wrongful conviction are so grave and profound – loss of liberty, public safety, collateral damage, financial – that we must take all reasonable steps to reduce the risk.
Taxonomy of Forensic Flaws
- Deliberate Misconduct
- Technical Incompetence
- Unvalidated Methodologies
- Communication Errors
- Interpretive Errors
Taxonomy of Forensic Flaws: Deliberate Misconduct
- Dry-labbing: lying about what tests were done
- Intentional withholding of exculpatory findings
- Intentional falsification of reports
- Faked autopsies
- Cheating on proficiency exams
- E.g., Fred Zain (WV, TX), Joyce Gilchrist (OK), Ralph Erdmann (TX)
Taxonomy of Forensic Flaws: Incompetence
- Inadequate training
- Contamination
- Poor quality control
- Facilities and equipment problems
- Chain-of-custody problems
- Collection problems
- Storage
Taxonomy of Forensic Flaws: Unvalidated Methodologies
- Dog scent lineups
- Bullet lead analysis
- Voice print analysis
- Lip prints
- Microscopic hair comparison
- Predictions of future dangerousness
Taxonomy of Forensic Flaws: Communication Errors
- Disclosing available evidence
- Ordering tests
- Follow-up on reference samples
- Changes in scientific standards
- Testimony problems
  - Mistaken elaboration (false conclusions)
  - Exaggerating evidentiary significance
    - Explicitly
    - By omission
Taxonomy of Forensic Flaws: Interpretation Errors
- Inadvertent bias
- Confirmation bias (investigative tunnel vision)
- Domain-extraneous information (manipulation of decision threshold)
- Group-think / role bias
Many Solutions for Many Problems
- Deliberate Misconduct: closer monitoring, full documentation, redundancy, audits, full discovery (incl. bench notes)
- Incompetence: training, certification, monitoring, audits, blind proficiency testing
- Unvalidated Methodologies: research, documentation, standardization
- Communication Errors: closer review of testimony, standardization of reporting and relevant terminology
- Interpretive Errors: regulating flow of information, independence of labs
The Garrett/Neufeld Study
- Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009.
- Took documented DNA exoneration cases (232 at the time of the study last year, now up to 259), identified those that included forensic science testimony (156), and reviewed all of the transcripts they could obtain (137).
- “In 82 of these cases, or 60%, forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data.”
A few bad apples?
- A total of 72 different analysts presented invalid forensic testimony against a person who would later be proven demonstrably innocent.
- They worked for 52 different labs, in 25 states.
The Adversarial Process
“The adversarial process largely failed to police this invalid testimony. Defense counsel rarely cross-examined analysts concerning invalid testimony, and rarely obtained experts of their own.”
The Garrett/Neufeld Study
- The study looked only at problems in how evidence was presented in testimony, not at underlying errors in the actual analysis or at whether the methodologies are sound.
- The authors call for new oversight mechanisms for reviewing forensic testimony and the development of clear scientific standards for written reports and testimony.
What kinds of flawed forensic testimony occurred?
Most fell into two categories:
1. Serology – analysis of bodily fluids to determine blood type characteristics: type A/B/O, plus “secretor status,” i.e., whether or not a person secretes blood type substances into bodily fluids (e.g., saliva or semen).
2. Microscopic Hair Comparison (MHC) – the visual comparison of questioned hairs from a crime scene with known exemplars to determine the presence of shared characteristics.
What kinds of flawed forensic testimony occurred?
Other kinds of forensic testimony that clearly went beyond empirical standards included:
- forensic odontology (bite mark)
- shoe print
- fingerprint
Three additional cases involved the withholding of exculpatory forensic evidence (discovered through post-conviction proceedings).
Incidence of Types of Analysis
- 100 cases involved serology; 57 had flawed testimony (57%).
- 65 involved MHC; 25 had invalid testimony (38%).
- 13 fingerprint cases; 1 with flawed testimony.
- 11 DNA cases; 3 with flawed testimony.
- 6 forensic odontology cases; 4 with invalid testimony.
- 4 shoe print comparisons; 1 with flawed testimony.
- 1 voice comparison case, with flawed testimony.
Two basic types of invalid testimony identified
1. Misuse of empirical population data.
   Example: Gary Dotson – testimony that Dotson was among the 11% who could have been the donor, when in fact it was 100%.
2. Unsupported conclusions about the probative value of evidence (e.g., offering opinions on the significance of evidence without any empirical support).
   Example: the Durham case – testimony that a particular reddish-yellow hair color occurs in about 5% of the population. No empirical data exist on the frequency of hair characteristics.
“Causation” is Unclear
- We cannot say that in all the cases where bad forensic testimony occurred it “caused” the wrongful conviction, since other kinds of evidence, such as an eyewitness identification, were usually presented as well.
- Could the presence of other evidence have contributed to the faulty interpretation and testimony? (We tend to see/find what we expect or desire to find, or what makes the most sense given other things that we know.)
Kevin Byrd, TX
Byrd is a non-secretor. No antigens were detected on a stain at the crime scene, so the analyst assumed that the victim was also a non-secretor. The analyst testified that 15-20% of the population are non-secretors. In fact, no donor could be eliminated, because no determination had been made about the victim's secretor status (so it is impossible to know whether her blood group markers masked the perpetrator's) and because the sample could have lacked antigens due to degradation. (Garrett/Neufeld, March 2009)
Alejandro Dominguez, IL
The victim was a B secretor and Dominguez was an O secretor. Two of the tested stains had B and H antigens, which were consistent with the victim. However, the analyst testified that Dominguez could not be excluded and that O secretors comprise 36% of the population. In fact, nobody in the population could be excluded, because the victim's blood group markers could have masked the perpetrator's. (Garrett/Neufeld, March 2009)
Gary Dotson, IL
The victim and Dotson were both B secretors. B substances were found on the victim's underwear, and the analyst testified that the donor was a B secretor. Those substances could have been entirely from the victim, so any male could have been the donor. Another stain had A antigens that were foreign to both Dotson and the victim, but the analyst failed to exclude Dotson as the source, telling the court it could be a mixture of blood and sweat, wood, leather, detergents, or other substances. (Garrett/Neufeld, March 2009)
Dennis Fritz, OK
An analyst did not detect blood group substances in fluids from the crime. The analyst testified that this meant the perpetrator was a non-secretor. In fact, if the victim was a non-secretor, nobody could be excluded, because her blood group markers could mask the perpetrator's, or the lack of blood group substances could have been the result of degradation. (Garrett/Neufeld, March 2009)
Michael Anthony Green, OH
The victim and Green were both B secretors, and the stain showed both B and H antigens. The analyst testified that B secretors were 16% of the population; the analyst conclusively ruled out 84% of the population as the source. The testimony failed to account for the possibility that the victim's blood group markers could mask the perpetrator's. (Garrett/Neufeld, March 2009)
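The recurring serology error in the Byrd, Dominguez, Fritz, and Green cases is, at bottom, a simple set-logic mistake: if every antigen in the stain could have come from the victim herself, then no donor can be excluded, and the "included" fraction of the population is 100%, not the frequency of the matching blood group. A minimal sketch of that logic follows; it is illustrative only, it simplifies away secretor status and the H antigen, and the `ABO_FREQ` numbers are placeholders rather than the figures cited in any of these cases:

```python
# Sketch of the "masking" logic that the serology testimony ignored.
# ABO_FREQ values are illustrative placeholders, not case figures.
ABO_FREQ = {"A": 0.40, "B": 0.11, "AB": 0.04, "O": 0.45}

def included_fraction(stain_antigens, victim_antigens):
    """Fraction of the donor population that cannot be excluded.

    If the stain shows no antigens foreign to the victim, her own
    fluids could account for everything in it, so nobody can be
    excluded (1.0). Otherwise, only the blood groups carrying all
    of the foreign antigens remain in the included pool.
    """
    foreign = set(stain_antigens) - set(victim_antigens)
    if not foreign:
        return 1.0  # victim's markers could mask any donor
    return sum(freq for group, freq in ABO_FREQ.items()
               if foreign <= set(group))

# Dominguez-type case: victim is type B, stain shows only B antigens,
# so nothing is foreign to her and nobody can be excluded.
print(included_fraction({"B"}, {"B"}))       # 1.0, not 36%

# Contrast: an A antigen foreign to a type-B victim genuinely narrows
# the donor pool (to the A and AB groups in this toy model).
print(included_fraction({"A", "B"}, {"B"}))
```

The point of the sketch is that the correct inclusion statistic depends on what is foreign to the victim, not on what blood group the stain happens to match.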
Edward Honaker, VA
An analyst testified that the tested hair was "consistent" with Honaker and concluded that it came from Honaker or someone of the same race, coloring, and microscopic makeup: "It is unlikely that the hair would match anyone other than the defendant; but it is possible." (Garrett/Neufeld, March 2009)
Ray Krone, AZ
An analyst testified that he was "certain" that Krone's teeth caused bites on the victim, and that it was "a very good match." He went on to say that bite mark comparison "has all the veracity, all the strength that a fingerprint would have." The prosecution also failed to disclose that an FBI expert had examined the bite marks and said they weren't from Krone. (Garrett/Neufeld, March 2009)
Barry Laughman, PA
The victim was an A secretor and Laughman was a B secretor. No B substances were detected in the evidence, but the analyst said bacteria could have "worked on these antigens" or they could have broken down. The analyst also testified that medications could have interfered with the antigens. The analyst then claimed that bacteria could actually convert one blood group substance to another: "Given sufficient time for those bacteria to act, it would be possible to convert a group A substance to a B, or a B substance to an A." (Garrett/Neufeld, March 2009)
Wilton Dedge, FL
Incorrect hair analysis. Comparing hairs from the crime with Dedge's hair, an analyst testified that "it would not be a million white people" who would possess such hairs, and that "out of all the pubic hairs I have examined in the laboratory, I have never found two samples, two known samples to match in their microscopic characteristics." (Garrett/Neufeld, March 2009)
What went wrong with serology?
- Unlike MHC, the underlying science of serology is sound. There are very large blood type databases that allow us to say with confidence how prevalent various blood types are in the population, and among members of different ethnic groups.
- But serology is a limited forensic methodology. It cannot individuate; it can only include (or exclude) a person in a category of people who may have contributed biological evidence found at a crime scene.
Lessons of MHC and Serology Error
- The temptation: “We don’t use these anymore; we have more accurate methods, so the problems have been fixed or eliminated by the new science.”
- The temptation, in other words, is to say that MHC and serology are old-fashioned, and that our problem has been solved by DNA testing.
- It is fair to say that we won’t see as many of the same kinds of errors anymore in cases where biological evidence is available, BUT...
Lessons of MHC and Serology Error
- It would be a mistake to think that DNA has solved the problem of invalid forensic testimony.
- “DNA has replaced some, but not most, traditional forensic methods.”
- Some estimates indicate that DNA testing accounts for only 2% of police requests to crime laboratories.
What can/should we learn from MHC and Serology Errors?
- Biological evidence is not available in the vast majority of cases.
- Many other forensic methodologies are susceptible to the problems that undermined accuracy in the MHC and serology areas.
- It is an accident that we have been able to establish conclusive error in so many MHC and serology cases – only because of the presence of biological evidence.
- If document examination and ballistics cases, for example, typically had biological material where DNA testing could be dispositive, we may well have exposed more errors in those other forensic disciplines.
The Importance of Independence
- The vast majority of crime labs are operated by state and local law enforcement agencies.
- The National Academy of Sciences urges that all crime labs be made independent of law enforcement agencies.
Notable Non-DNA Cases
A number of other discredited methodologies have been used repeatedly in court despite lacking any scientific validity:
- Dog scent lineups
- Lip prints
- Bullet lead analysis
Bullet Lead Analysis
For years, FBI lab experts would analyze the chemical or elemental composition of a slug recovered from a crime scene and purport to match it to a particular batch of bullets, such that if a defendant had a box of .38 caliber ammunition, an analyst could testify that the box was a likely source of the bullet recovered from a shooting victim. It was high-tech, with a scanning electron microscope, and as dazzling as CSI, but it turns out that the testimony was scientifically invalid, and they don’t do it anymore.
Dog Scent Lineups
- A high-profile non-DNA reversal in a dog scent lineup case.
- IPOT did a must-read report on unscientific dog scent lineups, focusing on a particular handler from Fort Bend County who testified in many cases.
What could explain the errors? Knaves or fools? Or something else?
- Very often the evidence was interpreted in such a way that it was “made to fit” the prosecution’s theory of the case.
- The usual presence of other evidence of guilt, most often eyewitness evidence, creates expectations for what the forensic evidence will show.
Inadvertent contextual bias is a plausible explanation for some
Either:
- these many analysts in many labs committed intentional misconduct, or
- they lacked a basic understanding of their work, or
- their interpretations were influenced by features of normal human psychology that resulted in inadvertent bias.
Not merely theoretical
- We know that exposing analysts to “domain-extraneous information” can undermine the objective interpretation of evidence.
- We know that perception and interpretation of evidence can be affected by our expectations and desires.
- We know that decision thresholds can be affected by extraneous information and expectations.
How do we know?
- Experience in other contexts, like clinical drug trials.
- Real-life cases like the Brandon Mayfield case.
- Experimental research documenting the influence of scientifically irrelevant information on the conclusions of real forensic experts.
Admitting the risk of contextual bias is a crucial step
- Traditionally, the forensic science establishment has dismissed concerns about contextual bias and observer effects, appealing to the professionalism and training of analysts as a sufficient counterweight to any potential influence.
- Since we are dealing with aspects of human nature and normal human psychology, there is no basis to assume that these tendencies can be “trained away.”
- Only VERY recently have the risks begun to be taken seriously.
Where the risk exists
- There are many forensic disciplines, some of which inherently carry a more significant risk of contextual bias than others.
- DNA analysis, drug testing, etc., are rooted in well-established scientific research and leave little room for subjective interpretation.
- Other kinds of analysis have an interpretive element that is necessarily subjective, involving visual comparisons, for example.
Validity challenge vs. Risk Management
This is not to suggest that there are no valid and reliable methods that have a subjective interpretive element. Rather, where there is a subjective interpretive element, there is a real risk of inadvertent contextual bias that must be addressed.
Itiel Dror Experiments
- Dror, I.E. & Rosenthal, R. (2008). Meta-analytically quantifying the reliability and biasability of forensic experts. Journal of Forensic Sciences, 53(4) (citing the underlying studies).
- Experiments that look at “within-expert comparisons,” in which the same expert unknowingly makes judgments on the same data at different times and with different contextual information – e.g., that the suspect had an alibi, that the suspect confessed, or, in one study, that the prints were the mistaken matches from the Mayfield case.
- Experts, as humans, perceive and judge information based on circumstances, such as context, emotional states, expectations, and hopes. This is not a problem if the circumstances are relevant to their decision making, because by being relevant the new circumstances may actually change the decision problem itself. But what happens when experts are faced with extraneous circumstances which are not relevant and do not modify the decision problem?
The Biasing Effects of Contextual Information
In Dror’s initial study, experienced analysts were asked to evaluate a series of fingerprints to determine whether they matched. Though the analysts believed the prints were from an actual, open case, they were actually reexamining prints they had correctly evaluated in the past, this time accompanied by artificial contextual information, such as that the suspect had confessed. The results were striking: where analysts were given contextual information about the fingerprints, they were wrong in almost seventeen percent of the cases. These errors were particularly notable because the same analysts had previously evaluated the prints correctly.
Managing risk
- Regulate the flow of information between analysts and investigators.
- Evidence intake and control
- Blind testing
- Evidence lineups?
- Labs independent of law enforcement agencies are less susceptible to the risk of inadvertent bias related to contextual information or perceived role.