Forensic Flaws and Wrongful Convictions

1 Forensic Flaws and Wrongful Convictions
Edwin Colfax, The Justice Project, October 8, 2010

2 An Anecdote. . . Related by Evan Hodge, former chief of the FBI Firearms and Toolmark Unit: A detective goes to a ballistics expert along with a .45 pistol and a bullet recovered from a murder victim. “We know this guy shot the victim and this is the gun he used. All we want you to do is confirm what we already know so we can get a warrant to get the scumbag off the street. We will wait. How quick can you do it?”

3 An Anecdote. . . The analyst conducted tests and provided a finding that linked the slug to the gun. The suspect was confronted with this damning forensic evidence in an interrogation that ended with his confession.

4 An Anecdote. . . The defendant then led the police to a different .45 pistol, which tests later showed was the true murder weapon. -- Evan Hodge, Guarding Against Error, 20 Ass’n Firearms & Toolmark Examiners’ J. 290, 292 (1988).

5 Overview
Discovered wrongful convictions
Prevalence of forensic flaws in those cases
Ways these cases are and are not representative
A provisional taxonomy of forensic flaws
  Many different ways things can go wrong
  Different responses required to address them
A closer look at DNA exoneration cases
Flawed forensic testimony
Relevancy of these errors today, and lessons

6 Wrongful Convictions To date, 259 people have been exonerated in the United States by post-conviction DNA testing. Texas leads the nation with 40 (42) DNA exonerations.

7 DNA Exonerations and Forensic Problems
In more than 50% of DNA exonerations, unvalidated or improper forensic science contributed to the wrongful conviction, making it the second most prevalent factor contributing to wrongful convictions. --The Innocence Project

8 DNA Exonerations and Forensic Problems
In 60% of DNA exoneration cases, “forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data.” -Garrett/Neufeld Study: Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009.

9 The Limits of DNA
DNA demonstrates the power of forensic science and exposes its previously unrecognized limitations. The overwhelming majority of criminal cases are not amenable to DNA testing. Certain kinds of cases are over-represented: rape and, to a lesser degree, murder.

10 Discovered Wrongful Convictions and the Magnitude of the Problem
Discovered wrongful convictions represent only a tiny fraction of criminal convictions, BUT the discovered error is not all of the error. In non-DNA cases, the innocent face a very difficult, virtually insurmountable burden. While the true rate of wrongful conviction is difficult to know, we should be focusing on the fact that much (not all) of this error is preventable. We have identified patterns of preventable error. The costs of wrongful conviction are so grave and profound (loss of liberty, public safety, collateral damage, financial costs) that we must take all reasonable steps to reduce the risk.

11 Taxonomy of Forensic Flaws
Deliberate Misconduct
Technical Incompetence
Unvalidated Methodologies
Communication Errors
Interpretive Errors

12 Taxonomy of Forensic Flaws
Deliberate Misconduct
  Dry-labbing: lying about what tests were done
  Intentional withholding of exculpatory findings
  Intentional falsification of reports
  Faked autopsies
  Cheating on proficiency exams
E.g.: Fred Zain (WV, TX), Joyce Gilchrist (OK), Ralph Erdmann (TX)

13 Taxonomy of Forensic Flaws
Incompetence
  Inadequate training
  Contamination
  Poor quality control
  Facilities and equipment problems
  Chain of custody problems
    Collection problems
    Storage

14 Taxonomy of Forensic Flaws
Unvalidated Methodologies
  Dog Scent Lineups
  Bullet Lead Analysis
  Voice Print Analysis
  Lip Prints
  Microscopic Hair Comparison
  Predictions of future dangerousness

15 Taxonomy of Forensic Flaws
Communication Errors
  Disclosing available evidence
  Ordering tests
  Follow-up on reference samples
  Changes in scientific standards
  Testimony problems
    Mistaken elaboration (false conclusions)
    Exaggerating evidentiary significance
      Explicitly
      By omission

16 Taxonomy of Forensic Flaws
Interpretation Errors
  Inadvertent bias
  Confirmation bias (investigative tunnel vision)
  Domain-extraneous information (manipulation of decision threshold)
  Group-think/role bias

17 Many Solutions for Many Problems
Deliberate Misconduct: closer monitoring, full documentation, redundancy, audits, full discovery (incl. bench notes)
Incompetence: training, certification, monitoring, audits, blind proficiency testing
Unvalidated Methodologies: research, documentation, standardization
Communication Errors: closer review of testimony, standardization of reporting and relevant terminology
Interpretive Errors: regulating flow of information, independence of labs

18 The Garrett/Neufeld Study
Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009. The study took the documented DNA exoneration cases (232 at the time of the study; now up to 259), identified those that included forensic science testimony (156), and reviewed all of the transcripts the authors could obtain (137). "In 82 of these cases, or 60%, forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data."

19 A few bad apples?
A total of 72 different analysts presented invalid forensic testimony against defendants who would later be proven innocent. They worked for 52 different labs, in 25 states.

20 The Adversarial Process
“The adversarial process largely failed to police this invalid testimony. Defense counsel rarely cross-examined analysts concerning invalid testimony, and rarely obtained experts of their own.”

21 The Garrett/Neufeld Study
The study looked only at problems in how evidence was presented in testimony, not at underlying errors in the actual analysis or at whether the methodologies themselves are sound. The authors call for new oversight mechanisms for reviewing forensic testimony and the development of clear scientific standards for written reports and testimony.

22 What kinds of flawed forensic testimony occurred?
Most fell into two categories:
1. Serology: analysis of bodily fluids to determine blood type characteristics. Type A/B/O, plus "secretor status" (whether or not a person secretes blood type substances into bodily fluids, e.g. saliva or semen).
2. Microscopic Hair Comparison (MHC): the visual comparison of questioned hairs from a crime scene and known exemplars to determine the presence of shared characteristics.

23 What kinds of flawed forensic testimony occurred?
Other kinds of forensic testimony that clearly went beyond empirical standards included:
  forensic odontology (bite mark)
  shoe print
  fingerprint
Three additional cases involved withholding of exculpatory forensic evidence (discovered through post-conviction proceedings).

24 Incidence of Types of Analysis
100 cases involved serology; 57 had flawed testimony (57%).
65 MHC cases; 25 with invalid testimony (38%).
13 fingerprint cases; 1 with flawed testimony.
11 DNA cases; 3 with flawed testimony.
6 forensic odontology cases; 4 with invalid testimony.
4 shoe print comparisons; 1 with flawed testimony.
1 voice comparison case, with flawed testimony.

25 Two basic types of invalid testimony identified
1. Misuse of empirical population data. Example: Gary Dotson. Testimony that Dotson was among the 11% that could have been the donor, when in fact it was 100%.
2. Unsupported conclusions about the probative value of evidence (e.g. offering opinions on the significance of evidence without any empirical support). Example: the Durham case. Testimony that a particular reddish-yellow hair color occurs in about 5% of the population; no empirical data exist on the frequency of hair characteristics.
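The Dotson arithmetic can be made concrete with a short sketch (my illustration, not from the study; the function name and values are hypothetical): once the victim shares the detected blood group, the population frequency of that group has no probative value, so the fraction of possible donors is 100%, not 11%.

```python
# Illustrative sketch of the valid inference from a secretor stain.
def included_fraction(detected_type, victim_type, type_freq):
    """Fraction of the population that cannot be excluded as the donor.

    detected_type: blood group substance found in the stain
    victim_type:   the victim's own blood group (victim is a secretor)
    type_freq:     population frequency of detected_type (0.0 to 1.0)
    """
    if detected_type == victim_type:
        # The victim alone could account for the substance found,
        # so the stain excludes no one: every donor remains possible.
        return 1.0
    # Only a substance foreign to the victim makes the population
    # frequency of that group probative.
    return type_freq

# Dotson: victim and defendant both B secretors, B substance detected.
included_fraction("B", "B", 0.11)  # -> 1.0, not the ~11% testified to
```

The sketch is deliberately simple; it ignores mixtures and degradation, which in the real cases made even this weak inference unavailable.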

26 “Causation” is Unclear
We cannot say that in all the cases where bad forensic testimony occurred it "caused" the wrongful conviction, since other kinds of evidence, such as an eyewitness identification, were usually also presented. Could the presence of other evidence have contributed to the faulty interpretation and testimony? (We tend to see and find what we expect or desire to find, or what makes the most sense given the other things we know.)

27 Kevin Byrd, TX
Byrd is a non-secretor. No antigens were detected on a stain at the crime scene, so the analyst assumed that the victim was also a non-secretor. The analyst testified that 15-20% of the population are non-secretors. In fact, no donor could be eliminated, because no determination had been made about the victim's secretor status (so it is impossible to know whether her blood group markers masked the perpetrator's) and because the sample could have lacked antigens due to degradation. (Garrett/Neufeld, March 2009)

28 Alejandro Dominguez, IL
The victim was a B secretor and Dominguez was an O secretor. Two of the tested stains had B and H antigens, which were consistent with the victim. However, the analyst testified that Dominguez could not be excluded and that O secretors comprise 36% of the population. In fact, nobody in the population could be excluded because the victim's blood group markers could have masked the perpetrator's. (Garrett/Neufeld, March 2009)

29 Gary Dotson, IL
The victim and Dotson were both B secretors. B substances were found on the victim's underwear, and the analyst testified that the donor was a B secretor. Those substances could have been entirely from the victim, so any male could have been the donor. Another stain had A antigens that were foreign to both Dotson and the victim, but the analyst failed to exclude Dotson as the source, telling the court it could be a mixture of blood and sweat, wood, leather, detergents or other substances. (Garrett/Neufeld, March 2009)

30 Dennis Fritz, OK
An analyst did not detect blood group substances in fluids from the crime. The analyst testified that this meant the perpetrator was a non-secretor. In fact, if the victim was a non-secretor, nobody could be excluded because her blood group markers could mask the perpetrator's, or the lack of blood group substances could have been the result of degradation. (Garrett/Neufeld, March 2009)

31 Michael Anthony Green, OH
The victim and Green were both B secretors, and the stain showed both B and H antigens. The analyst testified that B secretors were 16% of the population; the analyst conclusively ruled out 84% of the population as the source. The testimony failed to account for the possibility that the victim's blood group markers could mask the perpetrator's. (Garrett/Neufeld, March 2009)

32 Edward Honaker, VA
An analyst testified that the tested hair was "consistent" with Honaker and concluded that it came from Honaker or someone of the same race, coloring and microscopic makeup: "It is unlikely that the hair would match anyone other than the defendant; but it is possible." (Garrett/Neufeld, March 2009)

33 Ray Krone, AZ
An analyst testified that he was "certain" that Krone's teeth caused bites on the victim, and that it was "a very good match." He went on to say that bite mark comparison "has all the veracity, all the strength that a fingerprint would have." The prosecution also failed to disclose that an FBI expert had examined the bite marks and said they weren't from Krone. (Garrett/Neufeld, March 2009)

34 Barry Laughman, PA
The victim was an A secretor and Laughman was a B secretor. No B substances were detected in the evidence, but the analyst said bacteria could have "worked on these antigens" or they could have broken down. The analyst also testified that medications could have interfered with the antigens. The analyst then claimed that bacteria could actually convert one blood group substance to another: "Given sufficient time for those bacteria to act, it would be possible to convert a group A substance to a B, or a B substance to an A." (Garrett/Neufeld, March 2009)

35 Wilton Dedge, FL
Incorrect hair analysis. Comparing hairs from the crime with Dedge's hair, an analyst testified that "it would not be a million white people" who would possess such hairs, and that "out of all the pubic hairs I have examined in the laboratory, I have never found two samples, two known samples to match in their microscopic characteristics." (Garrett/Neufeld, March 2009)

36 What went wrong with serology?
Unlike MHC, the underlying science of serology is sound. There are very large blood type databases that allow us to say with confidence how prevalent various blood types are in the population, and among members of different ethnic groups. But serology is a limited forensic methodology. It cannot individuate, but rather can only include (or exclude) a person in a category of people who may have contributed biological evidence found at a crime scene.
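The include/exclude logic, and the masking problem that recurs in the Byrd, Dominguez, Dotson, Fritz, and Green cases, can be sketched as set arithmetic. This is my hedged illustration, not material from the talk; the function and antigen sets are hypothetical.

```python
def can_exclude(suspect_antigens, victim_antigens, stain_antigens):
    """True only if the stain genuinely proves the suspect was not a donor.

    Each argument is a set of blood group antigens, e.g. {"B", "H"}.
    Assumes the victim is a known secretor; with unknown secretor
    status (as in the Byrd case) even this weak test is unavailable.
    """
    foreign = stain_antigens - victim_antigens
    if not foreign:
        # Masking: the victim's own markers could account for
        # everything found, so nobody in the population is excluded.
        return False
    # Excluded only if some foreign substance is not the suspect's.
    return not foreign <= suspect_antigens

# Dominguez: victim a B secretor ({B, H}); the stain showed B and H only,
# so an O secretor suspect ({H}) cannot validly be singled out.
can_exclude({"H"}, {"B", "H"}, {"B", "H"})  # False: excludes no one
# Dotson's second stain: A antigens foreign to both victim and Dotson.
can_exclude({"B", "H"}, {"B", "H"}, {"A", "B", "H"})  # True: Dotson excluded
```

The point of the sketch is that serology's only valid moves are category-level inclusion and exclusion; testimony that goes beyond this, such as quoting a population frequency for a masked stain, overstates the evidence.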

37 Lessons of MHC and Serology Error
There is a temptation to say that MHC and serology are old-fashioned, that we don't use these methods anymore, and that the problems have therefore been fixed or eliminated by DNA testing. It is fair to say that we won't see as many of the same kinds of errors in cases where biological evidence is available, BUT

38 Lessons of MHC and Serology Error
It would be a mistake to think that DNA has solved the problem of invalid forensic testimony. “DNA has replaced some, but not most, traditional forensic methods.” Some estimates indicate that DNA testing accounts for only 2% of police requests to crime laboratories.

39 What can/should we learn from MHC and Serology Errors?
Biological evidence is not available in the vast majority of cases. Many other forensic methodologies are susceptible to the problems that undermined accuracy in the MHC and serology areas. It is an accident that we have happened to be able to establish conclusive error in so many MHC and serology cases; it is only because of the presence of biological evidence. If document examination and ballistics cases, for example, typically had biological material where DNA testing could be dispositive, we might well have exposed more errors in those other forensic disciplines.

40 The Importance of Independence
The vast majority of crime labs are operated by state and local law enforcement agencies. The National Academy of Sciences urges that all crime labs be made independent of law enforcement agencies.

41 Notable Non-DNA Cases
We have a number of other discredited methodologies that have been used repeatedly in court despite lacking any scientific validity:
  Dog scent line-ups
  Lip prints
  Bullet lead analysis

42 Bullet Lead Analysis For years FBI lab experts would conduct an analysis of the chemical or elemental composition of a slug recovered from a crime scene, and purport to match it to a particular batch of bullets, such that if a defendant had a box of .38 calliber ammunition , an analyst could testify that the box was a likely source of the bullet recovered from a shooting victim. It was high-tech, with a scanning electron microscope, and as dazzling as CSI, but it turns out that the testimony was scientifically invalid, and they don’t do it anymore.

43 Dog Scent Lineups High profile non-dna reversal in a Dog Scent Lineup case. IPOT did a must-read report on unscientific dog scent lineups focusing on a particular handler from Fort Bend County who testified in many cases.

44 What could explain errors?
Knaves or fools? Or something else? Very often the evidence was interpreted in such a way that it was "made to fit" the prosecution's theory of the case. The usual presence of other evidence of guilt, most often eyewitness evidence, creates expectations for what the forensic evidence will show.

45 Inadvertent contextual bias is a plausible explanation for some
Either these many analysts in many labs committed intentional misconduct, or
they lacked a basic understanding of their work, or
their interpretations were influenced by features of normal human psychology that resulted in inadvertent bias.

46 Not merely theoretical
We know that exposing analysts to "domain-extraneous information" can undermine the objective interpretation of evidence. We know that perception and interpretation of evidence can be affected by our expectations and desires. We know that decision thresholds can be affected by extraneous information and expectations.

47 How do we know? Experience in other contexts, like clinical drug trials. Real life cases like the Brandon Mayfield case. Experimental research documenting influence of scientifically irrelevant information on the conclusions of real forensic experts.

48 Admitting the risk of contextual bias is a crucial step
Traditionally, the forensic science establishment has dismissed concerns about contextual bias and observer effects, appealing to the professionalism and training of analysts as a sufficient counterweight to any potential influence. Since we are dealing with aspects of human nature and normal human psychology, there is no basis to assume that these tendencies can be "trained away." It is only VERY recently that the risks have begun to be taken seriously.

49 Where the risk exists
There are many forensic disciplines, some of which inherently carry a more significant risk of contextual bias than others. DNA analysis, drug testing, etc., are rooted in well-established scientific research and leave little room for subjective interpretation. Other kinds of analysis have an interpretive element that is necessarily subjective, involving visual comparisons, for example.

50 Validity challenge vs. Risk Management
This is not to suggest that there are not valid and reliable methods that have a subjective interpretive element. Rather, where there is a subjective interpretive element, there is real risk of inadvertent contextual bias that must be addressed.

51 Itiel Dror Experiments
Dror, I.E. and Rosenthal, R. (2008). Meta-analytically quantifying the reliability and biasability of forensic experts. Journal of Forensic Sciences, 53(4) (citing the underlying studies). Experiments that look at "within-expert comparisons," whereby the same expert unknowingly makes judgments on the same data at different times and with different contextual information, e.g. that the suspect had an alibi, that the suspect confessed, or, in one study, that the prints were the mistaken matches from the Mayfield case. Experts, as humans, perceive and judge information based on circumstances, such as context, emotional states, expectations, and hopes. This is not a problem if the circumstances are relevant to their decision making, because by being relevant the new circumstances may actually change the decision problem itself. But what happens when experts are faced with extraneous circumstances which are not relevant and do not modify the decision problem?

52 The Biasing Effects of Contextual Information
In Dror’s initial study, experienced analysts were asked to evaluate a series of fingerprints to determine if they matched. Though the analysts believed the prints were for an actual, open case, they were actually reexamining prints they had correctly evaluated in the past, this time accompanied by artificial contextual information, such as that the suspect had confessed. The results were striking. In cases where analysts were given contextual information about the fingerprints, they were wrong in almost seventeen percent of the cases. These errors were particularly notable because the same analysts had previously evaluated the prints correctly.

53 Managing risk
Need to regulate the flow of information between analysts and investigators:
  Evidence intake and control
  Blind testing
  Evidence lineups?
Labs independent of law enforcement agencies are less susceptible to the risk of inadvertent bias related to contextual information or perceived role.
