Forensic Files

$7.5M Award paid for BIASED evidence analysis via FOX6Now.com

MILWAUKEE —  Robert Lee Stinson spent 23 years in prison for a crime he did not commit. Now, he is finally getting justice for a conviction based on flawed evidence. His long-awaited day in court came amid a national effort to put forensic science on trial.


For decades, television shows have conditioned viewers to believe that investigators can pinpoint a criminal suspect with a shoe print, a tire mark, or a single strand of hair, and that they can do it with absolute certainty. However, the advent of DNA technology has proven that other forensic disciplines, once thought to be bulletproof, are susceptible to error. Those errors have put hundreds, if not thousands, of innocent people in prison.

The conviction and exoneration of Robert Lee Stinson

When Robert Lee Stinson walked out of New Lisbon Correctional Center at the age of 44, his smile revealed a full set of teeth.

“It’s been a long time. Twenty-three years. I was accused of something I didn’t do,” Stinson said when he was released in 2009.

More than two decades earlier, one of those teeth was missing, and that’s all it took to convict him of murder.

“That was essentially the case. The whole case against Mr. Stinson,” said Keith Findley, co-founder of the Center for Integrity in Forensic Sciences.

In fall 1984, the body of 63-year-old Ione Cychosz was discovered in the backyard of a home near 7th and Center. She’d been raped and beaten to death. There were bite marks all over her skin.

“Whoever left these bite marks had some irregular dentition,” Findley explained.

Milwaukee police brought in a dental expert from Marquette University to examine the marks. Dr. L. Thomas Johnson helped police develop a sketch, which showed the killer would likely have a cracked or missing upper right tooth.

“It’s a difficult job,” Dr. Johnson said during a 2007 interview with FOX6 about forensic odontology.

Stinson lived just steps from the crime scene, and had a missing upper right tooth.

“The detectives closed this case after seeing Mr. Stinson,” said Heather Lewis Donnell, Stinson’s attorney since 2009.

The jury never saw the sketch, which showed a different tooth missing than the one in Stinson’s mouth, but they did hear Dr. Johnson say that the bite marks “had to have come” from Stinson. There was no margin for error. A second expert agreed.

“So they were saying, ‘It has to be him,'” Lewis Donnell explained.

She said the level of certainty the dental experts relayed to the jury in 1985 was never supported by the science.

“That they had the ability, their science had the ability to say, ‘It was this person, and only this person,'” Lewis Donnell said.

“It’s really kind of preposterous,” Findley said.

Twenty-three years would pass before Findley and the Wisconsin Innocence Project would prove the doctors were wrong.

“Did you ever think this would come?” a reporter asked Stinson after his 2009 release.

“No, I didn’t. No, I didn’t, but with the help of the Innocence Project — came through,” Stinson responded.

DNA technology would eventually identify the real killer as Moses Price, but Findley said the bite mark analysis that put Stinson away in his place was flawed from the start, and more recent research proves it.

“It’s essentially junk,” Findley said.

Questioning bite mark analysis

For more than 50 years, Dr. Johnson was a pioneer in the field of forensic odontology. He led a team of dentists that identified victims of the 1985 Midwest Express Airlines crash, and he helped police identify the remains of victims dismembered by serial killer Jeffrey Dahmer.

However, Findley said using bite marks to solve crimes is an entirely different process.

“Matching human remains is not the problem. Matching bite marks to a particular individual is a huge problem,” Findley said.

Matching actual teeth to actual dental records is precise, but a growing body of research finds that bite marks left on the skin are unreliable, because skin is a terrible medium for retaining bite mark indentations.

“Because skin is malleable,” Lewis Donnell explained.

“It stretches. It bloats. You bruise in funny patterns,” Findley explained further. “And that’s where the science has completely fallen apart.”

Study after study now questions the validity of bite mark analysis, with one expert calling it “the poster child for bad forensic science.” A 2009 report by The National Academy of Sciences went further, citing “serious problems” across the entire “forensic science system,” from fingerprints to firearms, and footwear to hair comparison.

Changing the face of forensic science

It was that government report and another that followed in 2016 that ultimately prompted Findley to join some of the nation’s leading criminal defense experts in launching The Center for Integrity in Forensic Sciences. The first symposium was held in June at Northwestern University.

“This is one of our inaugural events,” Findley said during the symposium. “We can’t wait for the federal government to fix this.”

“We want to ensure that the science we’re producing is reliable and defendable,” said Jennifer Naugle, deputy administrator of the Wisconsin State Crime Lab.

Naugle said she’s on board with improving the science behind forensic science.

“The only thing we’re trying to do is seek the truth through science. That’s it. That’s really all it is,” Naugle said.

She said a 2016 report by the Obama Administration unfairly lumped more reliable techniques used every day, like fingerprint and firearms analysis, together with things like hair and bite mark analysis, which have been largely discredited.

“That’s not something we would ever do at the Wisconsin State Crime Lab,” Naugle said.

“We’re not suggesting that all of the forensic disciplines are useless. They’re not, but what we are suggesting is that they need to be improved,” Findley said.

Dr. Johnson retired in 2013, but the following year, he published his final study on bite mark analysis. It concluded that it is sometimes possible to narrow the source of a human bite mark to about 5% of the population. In a metro area the size of Milwaukee, that still leaves tens of thousands of possible sources; in other words, nowhere near a precise individual match. The FOX6 Investigators contacted Dr. Johnson by telephone, but he is 93 years old and unable to hear well. His wife declined an interview on his behalf.

Now that Dr. Johnson is retired, there is only one board-certified forensic odontologist in Wisconsin — Dr. Donald Simley in Madison. He declined an interview for this story because Dr. Johnson is a close personal friend and mentor. Dr. Simley has not testified in a bite mark case since 2003. While he believes there is still value in this type of evidence, he said police are better off swabbing a bite mark for DNA than trying to match a suspect’s teeth.

Across the country, the Innocence Project has exonerated more than 160 people who were convicted with flawed forensic evidence, including 10 because of bite marks.

“This evidence is dreadful,” said Jennifer Mnookin, UCLA School of Law, during the symposium.

Yet, bite mark evidence is still admissible in most states, including Wisconsin, where, ironically, Stinson’s case still serves as the legal precedent.

“Even though Stinson has now been conclusively exonerated, and the bite mark evidence in his case has been shown to be false,” Findley said.

Robert Lee Stinson seeks justice in federal court

Ten years after Stinson’s release, his federal civil rights case against the dentists and the City of Milwaukee finally went to trial.

“There was a lot of powerful and moving testimony,” Lewis Donnell said.

Just before the case went to the jury, the parties settled out of court. The City of Milwaukee will pay Stinson $7.5 million. Stinson’s attorney said the remaining terms of the settlement — including any amount other defendants have agreed to pay — will remain confidential.

“We’re just really grateful that this is how it ended, and that Mr. Stinson got some measure of justice after all he’s been through,” said Lewis Donnell.

Thirty-four years later, Stinson can finally move on, but the injustice he endured is sure to leave a mark.


via ‘It’s essentially junk:’ $7.5M bite mark settlement underscores national call for better forensic evidence — FOX6Now.com


Global Standardization of Forensics will Decrease the Bias Factor of Evidence Collection Procedures and Court Rulings

Interviews – 2018

Angus Marshall, Digital Forensic Scientist

Angus, tell us a bit about yourself. What is your role, and how long have you been working in digital forensics?

Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.

I’m registered as an expert adviser, and most of my recent casework seems to deal with difficult-to-explain or difficult-to-analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence, as a member of BSI’s IST/033 information security group, and as the UK’s digital evidence representative on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042 and contributed to the other investigative and eDiscovery standards.

You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?

It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.

The effect is that there appears to be a gap in the accreditation process. Rather than having a selection of methods that are known to work correctly (as we see in calibration houses and metallurgical and chemical labs, where the ISO 17025 standard originated) and that can be chosen to meet a specific customer requirement, we have methods which satisfy much fuzzier customer requirements that are almost always non-technical in nature, because the customers are CJS (criminal justice system) practitioners who simply don’t express things in a technical way.

We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.
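To make that verification/validation distinction concrete, here is a minimal sketch in Python. The hashing check, the acquisition method, and the acceptance criteria are hypothetical examples chosen for illustration; they are not drawn from ISO/IEC 27041 or from any lab's actual documentation.

```python
# Tool *verification*: does the tool conform to its specification?
# Method *validation*: does the lab's end-to-end method, tool included,
# demonstrably satisfy a documented requirement?
import hashlib
import tempfile
from pathlib import Path


def verify_hash_tool(tool_output: str, data: bytes) -> bool:
    """Verification: the tool's SHA-256 output matches what the algorithm's spec requires."""
    return tool_output == hashlib.sha256(data).hexdigest()


def validate_acquisition_method(image: Path, reference_digest: str, requirements: dict) -> bool:
    """Validation: the whole acquisition method meets the lab's stated requirements,
    e.g. the acquired image matches a known reference and a write blocker was recorded."""
    observed = hashlib.sha256(image.read_bytes()).hexdigest()
    return observed == reference_digest and requirements.get("write_blocker_used", False)


if __name__ == "__main__":
    sample = b"reference evidence image"
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(sample)
    print(verify_hash_tool(hashlib.sha256(sample).hexdigest(), sample))   # tool-level check
    print(validate_acquisition_method(Path(tmp.name),
                                      hashlib.sha256(sample).hexdigest(),
                                      {"write_blocker_used": True}))      # method-level check
```

The point of the sketch is simply that the two checks answer different questions: the first could be supplied by a vendor, while the second only makes sense once the lab has written down what its method is required to achieve.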

Where do we stand with standardisation in the UK at the moment?

Standardisation is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.

ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?

Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.

A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.

In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?

Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.
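As a concrete illustration of that last point, here is a minimal sketch assuming a hypothetical tool with a known, documented defect (timestamps reported one hour early). The offset and names are invented for illustration only.

```python
# A flawed tool can still sit inside a valid process if the method
# compensates for the documented error and records that it has done so.
from datetime import datetime, timedelta

KNOWN_TOOL_OFFSET = timedelta(hours=1)  # established and documented during validation testing


def corrected_timestamp(raw_from_tool: datetime) -> datetime:
    """Apply the compensation documented for this tool version."""
    return raw_from_tool + KNOWN_TOOL_OFFSET


if __name__ == "__main__":
    raw = datetime(2018, 6, 1, 11, 30)   # what the flawed tool reports
    print(corrected_timestamp(raw))      # what the validated method records
```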

I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty: when we use a word, it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?

It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.
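As a sketch of what such community-defined requirements might look like in a testable form, the snippet below expresses two hypothetical requirements for a file-carving method and checks a tool's results against them. The requirement IDs, thresholds, and metrics are invented for illustration.

```python
# Shared, machine-readable requirements would let vendors and labs
# run the same objective pass/fail checks against the same criteria.
REQUIREMENTS = [
    {"id": "REQ-CARVE-01",
     "description": "Recover all intact JPEG files from an unfragmented test image",
     "minimum_recall": 1.0},
    {"id": "REQ-CARVE-02",
     "description": "Report no more than 1% false positives on the same image",
     "maximum_false_positive_rate": 0.01},
]


def meets_requirement(result: dict, requirement: dict) -> bool:
    """Objective pass/fail against a shared requirement, not a vendor's own claim."""
    if "minimum_recall" in requirement:
        return result["recall"] >= requirement["minimum_recall"]
    return result["false_positive_rate"] <= requirement["maximum_false_positive_rate"]


if __name__ == "__main__":
    tool_result = {"recall": 1.0, "false_positive_rate": 0.0}
    print([meets_requirement(tool_result, r) for r in REQUIREMENTS])  # [True, True]
```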

Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?

NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).

Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.

Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.

Is there anything else you’d like to share about the results?

We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.

Things move pretty fast in DF. If we don’t stop and look at what we’re doing once in a while, we’ll miss something important.

Read Angus Marshall’s paper on requirements in digital forensics method definition here.

The hottest topic in digital forensics at the moment, standardisation is on the tip of everyone’s tongues. Following various think pieces on the subject and a plethora of meetings at conferences, I spoke to Angus Marshall about his latest paper and what he thinks the future holds for this area of the industry. You can […]

via Angus Marshall talks about standardisation — scar

Wrongfully convicted former Death Row inmate gets second bite at $18 million case

Posted: 04/07/2015, 10:45am | via Chicago Sun-Times

Nathson Fields, former Death Row inmate who was cleared of a double murder, will get another chance to win substantial damages from the city and police. A lawsuit last year led to a judgment of $80,000; he had sought $18 million. | Rummana Hussain/Sun-Times

A former Death Row inmate who was wrongfully convicted of a double murder after a Chicago cop withheld or fabricated evidence against him, then was left fuming when a federal jury awarded him just $80,000 in damages, is getting another chance to win the $18 million he says he deserves.