
Forensic science — FBI Bullet-Lead Technique Dead Wrong

“For over thirty years, FBI experts testified about comparative bullet lead analysis (CBLA), a technique that was first used in the investigation into President Kennedy’s assassination. CBLA compares trace chemicals found in bullets at crime scenes with ammunition found in the possession of a suspect. (…) Although the FBI eventually ceased using CBLA, the Bureau’s conduct in first employing the technique and then defending it after it was challenged provides an insight into how forensic science sometimes works.”

Paul C. Giannelli

“We cannot afford to be misleading to a jury. We plan to discourage prosecutors from using our previous results in future prosecutions.”

Letter from Dwight E. Adams — then FBI lab Director — to FBI Director Robert S. Mueller III

Since the 1960s, testimony by representatives of the Federal Bureau of Investigation in thousands of criminal cases has relied on evidence from Compositional Analysis of Bullet Lead (CABL), a forensic technique that compares the elemental composition of bullets found at a crime scene with that of bullets found in a suspect’s possession. Unlike ballistics techniques that compare the striations a gun’s barrel leaves on a recovered bullet, CABL is used when no gun is recovered or when the bullets are too small or too mangled for striations to be observed.


A True Story — In 1995, former Baltimore police Sgt. James A. Kulbicki was convicted of first-degree murder. The prosecutor convinced the jury that, in 1993, Kulbicki had killed his mistress — 22-year-old Gina Nueslein — with his off-duty .38-caliber revolver.

The scientific evidence was “irrefutable”. The bullets recovered from the victim’s body and from the crime scene had been fired by his gun.

“I wonder what it felt like, Mr. Kulbicki, to have taken this gun, pressed it to the skull of that young woman and pulled the trigger, that cold steel,” the prosecutor asked rhetorically during closing arguments.

Forensic Science — In order to follow a stable, straight trajectory, a bullet must spin about its axis. To impart that spin, spiralling “grooves” are machined into the inside of the weapon’s barrel.

The width of these “grooves” and of the “lands” between them, the angle of the grooves, their number, the rate of twist, and the direction of rotation — clockwise or anticlockwise — generally make it possible to identify the type of weapon. For instance, Colt traditionally uses a left-hand twist while Smith & Wesson uses a right-hand twist.

Moreover, specific imperfections of a barrel may in some cases allow examiners to match a bullet to a particular weapon. Even in the best case, two bullets fired by the same gun will not look identical, but they are likely to show areas of resemblance.
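
As a rough illustration of this kind of class-characteristics check, the sketch below compares a fragment’s measured land width, groove width, and twist direction against a reference weapon, using the rifling figures reported later in this article for the Kulbicki case. The RiflingProfile structure and the 0.005-inch tolerance are hypothetical conveniences for the sketch, not a forensic standard; real examinations rest on microscopic comparison by a trained examiner.

```python
# Illustrative sketch only: comparing rifling measurements from a recovered
# fragment against a reference firearm.

from dataclasses import dataclass

@dataclass
class RiflingProfile:
    land_width_in: float    # width of the raised "lands" (inches)
    groove_width_in: float  # width of the machined grooves (inches)
    twist: str              # "left" or "right"

def consistent_with(fragment: RiflingProfile,
                    weapon: RiflingProfile,
                    tolerance_in: float = 0.005) -> bool:
    """Return True if the fragment's class characteristics are consistent
    with the weapon's. False excludes the weapon; True only means the weapon
    cannot be ruled out on these measurements alone."""
    if fragment.twist != weapon.twist:
        return False
    return (abs(fragment.land_width_in - weapon.land_width_in) <= tolerance_in
            and abs(fragment.groove_width_in - weapon.groove_width_in) <= tolerance_in)

# Figures reported later in this article for the Kulbicki case; the 0.005 in
# tolerance is a made-up value used purely for illustration.
fragment = RiflingProfile(land_width_in=0.072, groove_width_in=0.083, twist="left")
revolver = RiflingProfile(land_width_in=0.100, groove_width_in=0.113, twist="right")
print(consistent_with(fragment, revolver))  # False: the revolver is excluded
```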

When such a test is not conclusive or not possible — because the bullet fragments are too small or because the gun is not recovered — it is still possible to analyze the lead content of the fragments and compare it with that of bullets known to belong to a suspect.

The Scientific Evidence Against Kulbicki

Maryland’s top firearms expert told the jury that the size of the bullet was compatible with Kulbicki’s gun and that the gun had been cleaned.

He added that he had not been able to identify the rifling marks left by the barrel.

Finally, he testified that the lead content of the bullet that killed Nueslein was identical to that of bullets from a box belonging to Kulbicki.

“Out of the billions of bullets in the world, is this just a coincidence that this bullet ended up in the defendant’s off-duty weapon,” a prosecutor asked.

A prosecutor told the jury that the forensic experts’ findings were “a significant piece of evidence” and a “major link” establishing Kulbicki’s guilt.

The jurors agreed. Kulbicki was sentenced to life in prison without the possibility of parole.

False Testimony

Joseph Kopera, one of the forensic experts who testified at the trial, provided his formal reports to the defense.

But his working notes were never turned over to the defense, neither at the trial nor at the appeal, which Kulbicki lost.

Those notes contradict the reports on every key point.

Kopera testified that the fragments were consistent with a large-caliber bullet, probably a .38.

His notes state that the first fragment came from a medium-caliber bullet and that the origin of the second fragment could not be determined.

Kopera testified that the gun had been cleaned. His notes read, “Residue in barrel: Yes. Bore condition: Dirty.”

Kopera testified that he could not identify the grooves and lands on the fragments. His notes reveal that the fragment’s land width was 0.072 inches and its groove width was 0.083 inches.

Bullets fired from Kulbicki’s Smith & Wesson revolver had a land width of 0.100 inches and a groove width of 0.113 inches.

The difference is large enough to establish beyond doubt that Kulbicki’s gun did not fire the fatal bullet.

Kopera testified that he could not identify the twist. His notes indicate that he had detected a “slight left twist” while Kulbicki’s off-duty weapon makes right-twist markings.

Kopera testified that the lead content of the bullets was identical. It was not.

The amount of arsenic in the fragments differed significantly from that in the bullets belonging to Kulbicki.

No Degree — At the trial, Kopera testified that he had an engineering degree from the Rochester Institute of Technology and a mechanical engineering degree from the University of Maryland. Neither institution has any record of him.

A Widely Used Technique

“Every critical part of Kopera’s testimony was false, misleading, based on improper assumptions or ignored exculpatory information,” Suzanne K. Drouet, a former Justice Department lawyer, told the judge in her recent motion seeking a new trial for Kulbicki.

“If this could happen to my client, who was a cop who worked within this justice system, what does it say about defendants who know far less about the process and may have far fewer resources to uncover evidence of their innocence that may have been withheld by the prosecution or their scientific experts?”

Following a 2004 National Academy of Sciences report that sharply criticized the FBI’s bullet-lead technique, the agency no longer relies on this method.

After retiring from the firearms section of the Maryland State Police, Kopera committed suicide.

For more than 30 years, his expertise had helped secure countless convictions.

Nationwide, the bullet-lead method is estimated to have been used in more than 2,000 cases over four decades.

Several former FBI employees believe that a review of all cases where the CBLA method was used in testimony should be urgently conducted.

“It troubles me that anyone would be in prison for any reason that wasn’t justified. And that’s why these reviews should be done in order to determine whether or not our testimony led to the conviction of a wrongly accused individual,” said Adams, the former FBI lab director.

The lab’s second in command agreed.

“I don’t believe that we can testify about how many bullets may have come from the same melt and our estimate may be totally misleading,” declared deputy lab director Marc LeBeau in a May 12, 2005, e-mail.

So far, the FBI has rejected such reviews on the grounds that they would be too expensive; a figure of US$70,000 has been mentioned.

Since 2005, the nonpartisan Forensic Justice Project, run by former FBI lab whistle-blower Frederic Whitehurst, has tried to force the bureau to release a list of bullet-lead cases under the Freedom of Information Act.

In academic circles, some experts have not hidden their anger at the program and at what looks like an attempt to cover up decades of flawed forensic science.

Clifford Spiegelman, a statistician at Texas A&M University, reviewed the FBI’s statistical methods for the science academy. He put it bluntly:

“They said the FBI agents who went after Al Capone were the untouchables, and I say the FBI experts who gave this bullet-lead testimony are the unbelievables.”

Conclusion

Several lessons can be gleaned from the CBLA experience. In the conclusion of his excellent paper on the subject, Paul Giannelli wrote:

First, the failure to publish the empirical data that supports scientific conclusions is unacceptable. Scientists “are generally expected to exchange research data as well as unique research materials that are essential to the replication or extension of reported findings.”

Second, defense attorneys were unable to successfully challenge the evidence until William Tobin, the retired FBI expert, became a defense witness. This is not surprising because no defendant, no matter how rich, can conduct extensive empirical studies. A defense expert in a particular case can critique the bases of a prosecution expert’s opinion but can rarely replicate the research upon which that opinion rests.

Forensic Science: Last Week Tonight with John Oliver (HBO)

Forensic science used in criminal trials can be surprisingly unscientific. Maybe a new television procedural could help change the public perception.

REFERENCES

Comparative Bullet Lead Analysis: A Retrospective — Paul C. Giannelli

Comparative bullet-lead analysis – Wikipedia


Ethical Dilemmas in Forensics

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.
via: Salem Press Encyclopedia of Science

Definition: Principles of conduct, moral duty, and obligation that guide individuals in their decisions and actions.

Significance: As scientists, forensic scientists have a professional obligation to seek and to speak the truth about matters within their purview. As participants in a forensic process, they are subject to additional, sometimes conflicting, duties. This tension generates many ethical dilemmas.

Although witnesses in American courtrooms are called upon to tell the truth, the whole truth, and nothing but the truth, they may be enjoined from volunteering information. A witness’s individual sense of relevance must often bow to a court’s judgment. The legal system seeks truth, yet it sometimes defers to other values, such as fairness and confidentiality, and in general demands acceptance of formalized rules of procedure. In their capacity as experts, forensic scientists typically enjoy greater latitude than ordinary witnesses in expressing opinions and making judgments in the courtroom, but they too must operate within the often cumbersome and sometimes counterintuitive requirements of the “system” of “justice.”

Entrance sign at the Society of Forensic Toxicologists (SOFT) Meeting, July 5, 2012. By Monika M. Wahi (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons

Forensic scientists are measured against a standard of professional integrity, although the professionalization of the scientific study of crime is far from complete. Professions are substantially self-regulating, usually through agreed-upon standards and codes of ethics, and this creates the need for them to articulate appropriate expectations and the responsibility of members of professions both to act correctly themselves and to provide appropriate correction for their errant colleagues. A case in point is William Tobin’s campaign against the chemical analysis of bullet lead, also known as comparative bullet-lead analysis (CBLA).

Tobin’s Exposure of CBLA

CBLA is a technique that the Federal Bureau of Investigation (FBI) used for four decades—the investigation of the assassination of President John F. Kennedy in 1963 was an early use—to make cases against defendants when traditional firearms analysis (that is, examination of barrel rifling on bullets) was not possible. By measuring the proportions of seven trace elements (antimony, arsenic, bismuth, cadmium, copper, silver, and tin) found in the lead of a bullet in evidence, forensic scientists sought to establish the probability that the bullet came from the same provenance as a bullet in the suspect’s possession. The belief that the comparison of the chemical composition of bullets could connect two bullets rested on unexamined assumptions about the similarities and differences of the source lead from which the bullets were cast. FBI experts testified in thousands of cases that the facts ascertainable through CBLA established likely identity and therefore pointed toward the probable guilt of the accused. Sometimes, as in the case of Michael Behm, who was convicted of murder in 1997, CBLA provided essentially the only evidence of guilt.
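
To see what such a comparison amounts to in practice, here is a minimal sketch in Python. It assumes the often-described “2-SD overlap” criterion — declaring two bullets “analytically indistinguishable” only if, for each of the seven elements, intervals of the mean plus or minus two standard deviations built from replicate measurements overlap. The function names, data layout, and sample numbers are invented for illustration, and the criterion as coded is a simplification of the FBI’s actual protocols.

```python
# Minimal sketch of a CBLA-style comparison under an assumed "2-SD overlap"
# match criterion; illustrative only.

import statistics

ELEMENTS = ["antimony", "arsenic", "bismuth", "cadmium", "copper", "silver", "tin"]

def interval(measurements):
    """Mean +/- 2 sample standard deviations of replicate measurements."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    return mean - 2 * sd, mean + 2 * sd

def analytically_indistinguishable(bullet_a, bullet_b):
    """bullet_a / bullet_b: dicts mapping each element to a list of replicate
    concentration measurements (e.g., in parts per million). Bullets 'match'
    only if the 2-SD intervals overlap for every one of the seven elements."""
    for element in ELEMENTS:
        lo_a, hi_a = interval(bullet_a[element])
        lo_b, hi_b = interval(bullet_b[element])
        if hi_a < lo_b or hi_b < lo_a:  # intervals do not overlap
            return False
    return True

# Invented replicate measurements (ppm), three per bullet, for illustration only.
crime_scene = {e: [100.0 + i, 101.0 + i, 99.5 + i] for i, e in enumerate(ELEMENTS)}
suspect_box = {e: [100.4 + i, 100.9 + i, 99.8 + i] for i, e in enumerate(ELEMENTS)}
print(analytically_indistinguishable(crime_scene, suspect_box))  # True on these numbers
```

Even a perfect seven-element overlap, of course, says nothing by itself about how many unrelated bullets in circulation would “match” equally well — precisely the inferential gap the NRC report identified.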

In the 1990s, FBI metallurgist William Tobin began to question the validity of the technique. He felt strongly enough about the issue to research the matter, after his retirement in 1998, with Lawrence Livermore National Laboratory metallurgist Erik Randich. They analyzed data from two lead smelters in Alabama and Minnesota and discovered that the FBI techniques could not distinguish batches of lead produced months apart. They also discovered that differences existed within single batches. Their research was published in Forensic Science International in July 2002.
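
The statistical heart of that finding is easy to picture: if the spread of compositions within a single batch of lead is comparable to the differences between batches poured months apart, a compositional “match” hardly narrows anything down. The toy calculation below, on invented antimony numbers, only illustrates that comparison; it is not Tobin and Randich’s data or analysis.

```python
# Toy illustration on invented data: when within-batch spread is as large as
# the difference between batch means, composition cannot tell batches apart.

import statistics

# Hypothetical antimony concentrations (ppm) in bullets from two lead batches
batch_1 = [730, 748, 755, 762, 741]
batch_2 = [745, 758, 736, 751, 760]

mean_1, sd_1 = statistics.mean(batch_1), statistics.stdev(batch_1)
mean_2, sd_2 = statistics.mean(batch_2), statistics.stdev(batch_2)

print(f"batch 1: mean {mean_1:.1f} ppm, sd {sd_1:.1f} ppm")
print(f"batch 2: mean {mean_2:.1f} ppm, sd {sd_2:.1f} ppm")
print(f"gap between batch means: {abs(mean_1 - mean_2):.1f} ppm")

# Here the gap between batch means (a few ppm) is far smaller than the
# within-batch spread (roughly ten ppm), so a bullet from one batch can look
# "indistinguishable" from a bullet poured in a different batch.
```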

Although he still defended the technique, the director of the FBI Laboratory requested that the National Research Council (NRC) of the National Academy of Sciences review CBLA. In February 2004, the NRC report, titled Forensic Analysis: Weighing Bullet Lead Evidence, confirmed that only extremely limited claims could be made about the relationship between bullets based on CBLA. Given the NRC findings, a New Jersey appeals court overturned Behm’s conviction in March 2005. The results of the NRC study have obvious implications for many other cases as well.

In an article titled “Forensic Significance of Bullet Lead Compositions,” which appeared in the Journal of Forensic Sciences in March 2005, FBI research chemists Robert D. Koons and JoAnn Buscaglia argued that “compositional comparison of bullet lead provides a reliable, highly significant point of evidentiary comparison of potential sources of crime-related bullets.” In September of that year, however, the FBI announced that it would no longer use CBLA. (In a curious subsequent development, Tobin and a member of the NRC committee, Clifford Spiegelman, suggested that a reanalysis of the bullet fragments from the Kennedy assassination might be in order.)

An article published in New Scientist in April 2002 quoted Tobin as saying of the interpretation of bullet data based on CBLA, “It offended me as a scientist.” In fact, Tobin has a long record as a critic of FBI procedures he regards as bad science and of testimonial practices he regards as unwarranted by the scientific data. To complain about testimony that unreasonably goes beyond what the data can support is to respond equally to the demands of science and the demands of ethics. It is a feature of commonsense justice that the punishment should fit the crime, and a basic requirement of that, in turn, is that the people who are punished should be guilty. Violating that requirement is both bad science and bad ethics.

Joyce Gilchrist’s Tainted Evidence

Is it enough that the accused be guilty of some crime, or does it have to be the one in question? If the accused is guilty of the crime in question, does it matter whether the evidence actually shows that? The belief that one can convict the guilty by tweaking the evidence a little, or shading one’s testimony a bit, is among the most common sources of unethical (and, often enough, criminal) behavior on the part of forensic scientists. The cautionary tale of former Oklahoma City Police Department forensic scientist Joyce Gilchrist probably falls into this category.

In May 2007, Curtis Edward McCarty, who was facing his third trial for a 1982 murder, was freed as the result of the improper handling and representation of hair evidence by Gilchrist, who apparently had tried to frame McCarty. The judge dismissed the charge despite her belief that McCarty was probably not completely innocent. This was merely the latest in a series of episodes involving Gilchrist.

Questions about the integrity of Gilchrist’s work began as early as January 1987, when a Kansas City colleague, John Wilson, complained about her to the Southwestern Association of Forensic Scientists, without result. In 1998, Robert Miller was exonerated after he had been convicted a decade earlier based in part on Gilchrist’s testimony regarding blood, semen, and hair evidence. In 1999, Gilchrist was criticized by a judge for having given false testimony (regarding semen evidence) in the rape/murder trial of Alfred Brian Mitchell in 1992. In the spring of 2000, Jeffrey Todd Pierce was ordered released after he had served a decade and a half for a rape he did not commit; he had been convicted based on Gilchrist’s testimony. In January 2001, Gilchrist was criticized for the various judicial reprimands and professional critiques her work had received. In August 2001, doubts were raised about the guilt of Malcolm Rent Johnson, who had been executed for a 1981 rape and murder; Johnson was convicted based on Gilchrist’s testimony.

A month later, in September 2001, Gilchrist was finally fired, after years of reputedly shoddy forensics work, including both mishandling and misrepresentation of evidence, on many cases in addition to those noted above. The world of criminal justice contains innumerable isolated instances of perverse idealism, self-serving cynicism, and simple incompetence, but Gilchrist is one of the most striking cases of flagrant disregard for ethics in the forensics community. Was she genuinely convinced of the guilt of those against whom she testified? (She was certainly persuasive to juries.) Was she cynically distorting her testimony, and the evidence, to help prosecutors gain convictions, or was she just incompetent?

Ethics of Competence

One may well agree with forensics ethicist Peter D. Barnett’s remark that “there is a certain baseline level of competence that every criminalist is expected to understand, and there are certain procedures and protocols that are so fundamental to the practice of criminalistics that failure to follow them is evidence of gross incompetence or malfeasance, which is unethical.” As Barnett himself notes, however, “in the practice of forensic science, the disparate educational and experiential backgrounds of workers in the field make determination of a baseline level of competence relatively difficult.”

This is a problem throughout the American criminal justice system. In June 2007, all sergeants in the New Orleans Police Department were required to attend a four-day seminar to learn how to improve their (and their subordinates’) writing of police reports. This was part of an attempt to smooth out conflicts between the department and the New Orleans district attorney’s office, which claimed that part of its difficulty in prosecuting criminals stemmed from “incomplete or vague reports” by officers. More generally, criminalists frequently lament that frontline officers are not more skilled in observing, protecting, collecting, and preserving crime scene evidence.

One certainly can (in theory) impose reasonable expectations about competence and development in forensic science. However, that is not made easy by the variety of educational backgrounds and practical experience of the people who actually work in the field. In an unflattering assessment published in 2005, Jane Campbell Moriarty and Michael J. Saks bluntly asserted that “in the forensic sciences . . . 96 percent of practitioners hold bachelor’s degrees or less.” They went on to note:

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.

Moreover, as with the difficulty in defining the qualifications for expert testimony, the fact that crime fighting is not a natural kind of expertise has an impact. Almost any expert might be relevant to a criminal case, depending on circumstances. Given the diverse forms of knowledge relevant to the application of science to crime solving, and to the providing of suitable expert testimony, it may be that the only truly unifying factor is the application of the so-called scientific method, broadly understood as intellectual integrity—the determined effort, as physicist Richard P. Feynman put it, not to fool oneself (or others).

What is impressive about the case of William Tobin is his determination to ensure that his colleagues (or former colleagues) not testify to more than the data warrant, both out of scientific integrity and out of fairness to those whose lives are affected by what scientists say. What is appalling about the case of Joyce Gilchrist is the stubbornness of her effort to resist correction by colleagues or even by the seemingly obvious limits of the evidence itself. Sometimes the individual needs to correct the group, by exposing a bogus or complacent consensus; sometimes the group needs to correct the individual, by identifying willful deception or self-centered fantasy. Unfortunately, no formula exists to guarantee the right result, and that is why ethics remains a constant challenge to conscientious souls.


Related Information

  • American Academy of Forensic Sciences (AAFS)
  • American Society of Crime Laboratory Directors (ASCLD)
  • Brain-wave scanners
  • Criminal personality profiling
  • DNA database controversies
  • Ethics of DNA analysis
  • Expert witnesses in trials
  • Forensic journalism
  • Innocence Project
  • Interrogation in criminal investigations
  • Training and licensing of forensic professionals
  • Truth serum in interrogation

Last reviewed: October 2016

Bibliography

Barnett, Peter D. Ethics in Forensic Science: Professional Standards for the Practice of Criminalistics. Boca Raton: CRC, 2001. Print.

Inman, Keith, and Norah Rudin. Principles and Practice of Criminalistics: The Profession of Forensic Science. Boca Raton: CRC, 2001. Print.

Lucas, Douglas M. “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits.” Journal of Forensic Sciences 34 (1989): 719–29. Print.

Macklin, Ruth. “Ethics and Value Bias in the Forensic Sciences.” Journal of Forensic Sciences 42 (1997): 1203–206. Print.

Moriarty, Jane Campbell, and Michael J. Saks. “Forensic Science: Grand Goals, Tragic Flaws, and Judicial Gatekeeping.” Judges’ Journal 44.4 (2005): 16–33. Print.

Peterson, Joseph L., and John E. Murdock. “Forensic Science Ethics: Developing an Integrated System of Support and Enforcement.” Journal of Forensic Sciences 34 (1989): 749–62. Print.

Derived from: “Ethics.” Forensic Science. Salem Press. 2009.