7 Countries with the Best Forensic Technologies via iTHINK

Civilizations that have invested thousands of years in perfecting a field tend to NAIL IT with greater accuracy…

Many research and development solutions can be obtained through information sharing with countries that have had centuries of trial-and-error experimentation. Seek to learn from mentors in the field, and save yourself from overcomplicating your analysis.

7 Countries with the Best Forensic Technology

7. China


China has been investing time, energy and resources into forensic science since the 1980s, and globally renowned forensic scientist Henry Chang-yu Lee believes it’s about to pay off, tipping China to become a world leader in high-tech evidence collection.

“I believe the technology in China will be more advanced than ever in the United States within five years,” the Chinese-American expert said in a recent interview with China Daily.

Lee, who has racked up more than five decades of experience in forensic science, has worked on a number of high-profile criminal cases in the US, but has also shared his wealth of knowledge with students, lawyers, judges and law enforcement in China over the years.

“The apparatus and devices used to identify fingerprints or footprints, for example, were very simple when I first visited Chinese forensic laboratories,” he said.

However, he has seen the technology improve over the years and there have been many advances, particularly in electronic evidence collection and fraud prevention by means of real-time monitoring.

In 2016, Lee and several other experts established the Silk Road Forensic Consortium in Xi’an, Shaanxi province, to fight crime and safeguard security by boosting scientific exchanges among countries involved in the Belt and Road Initiative.

The consortium, which has 150 members from 30 countries and regions, provides an open platform for forensic specialists, police officers and judges to share ideas and difficulties as well as experiences in DNA identification studies.

Lee, who acts as chairman, said, “Although we speak different languages in our daily lives, we all speak the same ‘language’ at work, and that’s the language of the criminal investigation. We share the same goal – to speak for the dead using forensic science.”

In September, at the organisation’s third annual conference in Yantai, Shandong province, Lee announced plans to unify DNA identification standards among its members to try and build a mutual DNA database that can better solve criminal cases.

Unified standards are essential to the world of forensic science, he told China Daily.

“If we can achieve unification in China, it can be extended across Asia, to the consortium and finally the world,” he added. “It would mean a brighter future for forensic science.”

6. European Network of Forensic Institutes

Although not a country, the European Network of Forensic Institutes (ENFSI) is recognized as a pre-eminent voice in forensic science worldwide. It is a network of forensic specialists covering a broad range of fields of expertise, from 38 countries geographically spread across Europe:

Austria, Armenia, Azerbaijan, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Georgia, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Former Yugoslav Republic of Macedonia, Malta, Montenegro, The Netherlands, Norway, Poland, Portugal, Romania, Russia, Serbia, Slovenia, Slovakia, Spain, Sweden, Switzerland, Turkey, Ukraine and the United Kingdom.

The ENFSI has seventeen Expert Working Groups working on a diverse range of forensic specialisms, from textiles and hair to explosives and firearms. It also provides invaluable training to police officers and crime scene investigators.

5. Germany


Police in the German state of Bavaria have the power to use forensic DNA profiling after a controversial law passed in 2018 in the Landtag, the state parliament in Munich. The law was the first in Germany to allow authorities to use DNA to help determine the physical characteristics, such as eye colour, of an unknown culprit.

The new DNA rules are part of a broader law which has drawn criticism of the wide surveillance powers it gives the state’s police to investigate people they deem an “imminent danger,” people who haven’t necessarily committed any crimes but might be planning to do so.

The move was prompted, in part, by the rape and murder of a medical student in Freiburg, Germany, in late 2016. An asylum seeker, originally from Afghanistan, was convicted of the murder and sentenced to life in prison.

But some authorities complained that they could have narrowed their search more quickly if they had been able to use trace DNA to predict what the suspect would look like.

Federal and state laws previously only allowed investigators to use DNA to look for an exact match between crime scene evidence and a potential culprit, either in a database of known criminals or from a suspect.

Germany also forms part of the aforementioned ENFSI.

4. South Korea

To say that smartphones have changed the digital forensic landscape is an understatement. The device has become central to almost every criminal investigation and has helped establish digital forensics as a serious, scientific investigative tool.

South Korea is leading the way in digital forensics, with its largest digital forensic firm, Hancom GMD, playing a crucial role in prosecuting some of the country’s most powerful politicians.

In late 2016, South Korea was rocked by one of the biggest political corruption scandals in its history – President Park Geun-hye was accused of bribery and, by law, investigators had only 60 days to investigate and prosecute.

They had confiscated over 300 smartphones from suspects and needed to analyse tens of thousands of phone records and chat messages within a tight deadline. Hancom GMD successfully analysed all of the data on the 300 smartphones and extracted crucial evidence that led to several convictions.

With 5G set to be rolled out globally this year, forensic teams in South Korea are already preparing for this further growth in the collection of digital evidence.

Hancom GMD is planning to launch a service that recovers data from the cloud, though privacy regulations in each country are expected to be a challenge to overcome.

3. United Kingdom


Prior to its closure in 2012, the UK Forensic Science Service (FSS) was a world leader in forensic technology. It pioneered the use of the handheld roadside breath alcohol tester, and the national DNA database was first developed there, initially tested on all of its staff and on police forces to ensure its reliability.

The organisation later pioneered the use of large-scale DNA profiling for forensic identification and crime detection when it moved its facilities to Birmingham.

This enabled the launch of the world’s first DNA database on 10 April 1995. The FSS’s innovative and highly sensitive DNA profiling technique, LCN (low copy number), was used in convicting Antoni Imiela (the M25 rapist) and Ronald Castree (for the 1975 murder of Lesley Molseed). However, the organisation came under attack when it failed to recover blood stains from a shoe in the investigation of the murder of Damilola Taylor.

Forensic laboratories in the UK are now privately owned but are experiencing financial difficulties similar to those that preceded the FSS’s closure, a recent inquiry by the House of Lords heard.

Mark Pearse, the commercial director in the forensics division of Eurofins, one of the three major providers in the UK, described an “unsustainable toxic set of conditions” when he appeared before the inquiry.

Representatives from the two other largest providers – Key Forensics, which had to be bailed out by police last year after going into administration, and Cellmark – raised similar concerns.

However, that’s not to say that the UK is not involved in researching and implementing new forensic technologies.

Forensic scientists are currently working with the British military to open the United Kingdom’s first body farm — a site where researchers will be able to study the decomposition of human remains.

Details are not yet finalized, but the plans are at an advanced stage: project leaders hope this year to open the farm, also known as a forensic cemetery or taphonomy facility, after the discipline devoted to the study of decay and fossilization.

Such sites generate data on tissue and bone degradation under controlled conditions, along with chemical changes in the soil, air and water around a corpse, to help criminal and forensic investigators.

2. The Netherlands

The Netherlands Forensic Institute (NFI) is one of the world’s leading forensic laboratories. From its state-of-the-art, purpose-built premises in The Hague, the NFI provides products and services to a wide range of national and international clients.

To ensure that their work remains at the forefront of developments, the Netherlands Forensic Institute invests heavily in Research and Development. In this way, it lays the foundations for innovative forensic methods and technologies that will play an important part in the coming decades.

Amongst these innovative forensic technologies is Hansken, a system that can easily store large quantities of diverse data from different sources. All data is stored, indexed, enriched and made rapidly searchable, cutting down the turnaround time of forensic evidence.
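
As a rough illustration of what “indexed and made rapidly searchable” can mean in practice, here is a minimal inverted-index sketch in Python. It is only a toy built on assumptions of my own (the item names and the query are invented); Hansken’s actual architecture is far more sophisticated and is not documented here.

    # Toy inverted index: map each keyword to the evidence items that contain it,
    # so a search touches only the index rather than rescanning every item.
    # Illustrative only - this is not Hansken's actual design.
    from collections import defaultdict

    evidence = {
        "phone_dump_01": "meeting at warehouse friday cash transfer",
        "email_17":      "cash transfer confirmed send location",
        "chat_log_03":   "warehouse location changed call me",
    }

    index = defaultdict(set)
    for item_id, text in evidence.items():
        for word in text.split():
            index[word].add(item_id)

    # Query: which items mention both "warehouse" and "location"?
    print(sorted(index["warehouse"] & index["location"]))  # ['chat_log_03']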

The NFI also maintains a forensic glass database, which now contains over 150 samples of glass from a large number of cases. On several occasions, this glass database has linked suspects to multiple crimes.

Offenders who carry out robberies, smash-and-grab raids or ATM gas attacks may have splinters of glass on their clothes or in the soles of their shoes. These splinters can remain in place for months, even though they are barely visible to the naked eye, if at all.

These splinters can be of great value. The composition of each piece of glass is unique because of minuscule contaminants in the raw materials for making glass.

By comparing the unique composition of splinters of glass found on a suspect to glass from the database, it is possible to check whether that glass originates from a crime committed earlier.

The glass analysts of the NFI measure the concentration of twenty elements in each piece of glass. This produces a kind of ‘chemical fingerprint’ of the material.
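
To make the idea of a “chemical fingerprint” concrete, here is a minimal sketch in Python of how two glass samples could be compared as vectors of element concentrations. The element names, the example values and the simple relative-tolerance rule are assumptions for illustration; the NFI’s actual measurement and statistical procedure is not described here.

    # Hypothetical comparison of elemental "chemical fingerprints" of glass.
    # Element names, concentrations and the matching rule are illustrative only.

    def relative_difference(a, b):
        """Relative difference between two concentration measurements."""
        return abs(a - b) / max(a, b)

    def indistinguishable(sample, reference, tolerance=0.05):
        """Treat two samples as indistinguishable if every element's
        concentration agrees within the given relative tolerance."""
        return all(
            relative_difference(sample[element], reference[element]) <= tolerance
            for element in reference
        )

    # A few of the ~20 elements measured (values in arbitrary units).
    crime_scene_glass = {"Ca": 86000, "Na": 98000, "Mg": 22000, "Fe": 3100, "Sr": 410}
    suspect_splinter  = {"Ca": 86500, "Na": 97600, "Mg": 21900, "Fe": 3150, "Sr": 405}

    print(indistinguishable(suspect_splinter, crime_scene_glass))  # True in this toy example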

1. United States of America


It will come as no surprise that at the forefront of cutting-edge forensic technology is the USA, home to over 400 crime labs and the biggest crime lab in the world, the FBI Laboratory.

 

To help train government and industry organisations on cyberattack prevention, scientists at The University of Texas at San Antonio have developed the first framework to score the agility of cyber attackers and defenders, as part of a research project for the U.S. Army.

“The DOD and U.S. Army recognize that the cyber domain is as important a battlefront as ground, air and sea,” said Dr. Purush Iyer, division chief, network sciences at Army Research Office, an element of the Army Futures Command’s Army Research Laboratory.

“Being able to predict what the adversaries will likely do provides opportunities to protect and to launch countermeasures. This work is a testament to successful collaboration between academia and government.”

The framework developed by the researchers will help government and industry organizations visualize how well they out-maneuver attacks.

Their work is published in IEEE Transactions on Information Forensics and Security, a top journal for cybersecurity.

Education and training programs in the field of forensics are also on the rise, supported by organisations such as The Forensic Sciences Foundation and the American Academy of Forensic Sciences.

In fact, there are 485 forensic science schools in the US, so it’s no wonder that it is home to some of the most influential forensic scientists, such as Dr. Michael M. Baden and Ellis R. Kerley, and is sure to produce a great deal more talent in the future.

This is certainly an exciting time to be working in forensic science, with the challenges presented by the world of AI, smartphones and cloud data calling for rapid improvements to existing technology.

With these challenges comes the need for those countries with more developed forensic facilities to provide training and education opportunities to those in less developed areas so that science can play its rightful part in the criminal justice system.

For now, these are the seven countries with the most advanced forensic technology, but this is not the end. As the world continues to evolve, so will technology and the forensic industry itself.

Hi! I’m Isabella and I’m an Italian living in the UK studying for a Masters in Crime & Justice. I currently work in the prison education sector and have a background in teaching, having completed a PGCE after reading languages at the University of Durham. I love travelling, cooking, reading and playing the piano.

Technology is at its peak moment and has brought about some of the finest forensic tech. Here are 7 countries with the best forensic technology.

via 7 Countries with the Best Forensic Technology — iTHINK


How to Evaluate a Statistic and Avoid Bias / False Presumptions via Mathematical Software

A counting statistic is simply a numerical count of some item, such as “one million missing children”, “three million homeless”, or “3.5 million STEM jobs by 2025.” Counting statistics are frequently deployed in public policy debates, the marketing of goods and services, and other contexts. Particularly when paired with an emotionally engaging story, counting statistics can be powerful and persuasive. However, they can also be highly misleading or even completely false. This article discusses how to evaluate counting statistics and includes a detailed checklist of steps to follow.

Checklist for Counting Statistics

  1. Find the original primary source of the statistic. Ideally you should determine the organization or individual who produced the statistic. If the source is an organization you should find out who specifically produced the statistic within the organization. If possible find out the name and role of each member involved in the production of the statistic. Ideally you should have a full citation to the original source that could be used in a high quality scholarly peer-reviewed publication.
  2. What is the background, agenda, and possible biases of the individual or organization that produced the statistic? What are their sources of funding? What is their track record, both in general and in the specific field of the statistic? Many statistics are produced by “think tanks” with various ideological and financial biases and commitments.
  3. How is the item being counted defined? This is very important. Many questionable statistics use a broad, often vague definition of the item, paired with personal stories of an extreme or shocking nature, to persuade. For example, the widely quoted “one million missing children” in the United States, used in the 1980s — and even today — was rounded up from an official FBI number of about seven hundred thousand missing children, the vast majority of whom returned home safely within a short time, and was paired with rare cases of horrific stranger abductions and murders such as the 1981 murder of six-year-old Adam Walsh.
  4. If the statistic is paired with specific examples or personal stories, how representative are these examples and stories of the aggregate data used in the statistic? As with the missing children statistics in the 1980’s it is common for broad definitions giving large numbers to be paired with rare, extreme examples.
  5. How was the statistic measured and/or computed? At one extreme, some statistics are wild guesses by interested parties. In the early stages of the recognition of a social problem, there may be no solid reliable measurements; activists are prone to providing an educated guess. The statistic may be the product of an opinion survey. Some statistics are based on detailed, high quality measurements.
  6. What is the appropriate scale to evaluate the counting statistic? For example, the United States Census estimates the total population of the United States as of July 1, 2018 at 328 million. The US Bureau of Labor Statistics estimates that about 156 million people were employed full time in May 2019. Thus “3.5 million STEM jobs” represents slightly more than one percent of the United States population and slightly more than two percent of full-time employees (see the short arithmetic sketch after this list).
  7. Are there independent estimates of the same or a reasonably similar statistic? If yes, what are they? Are the independent estimates consistent? If not, why not? If there are no independent estimates, why not? Why is there only one source? For example, estimates of unemployment based on the Bureau of Labor Statistics Current Population Survey (the source of the headline unemployment number reported in the news) and the Bureau’s payroll survey have a history of inconsistency.
  8. Is the statistic consistent with other data and statistics that are expected to be related? If not, why doesn’t the expected relationship hold? For example, we expect low unemployment to be associated with rising wages. This is not always the case, raising questions about the reliability of the official unemployment rate from the Current Population Survey.
  9. Is the statistic consistent with your personal experience or that of your social circle? If not, why not? For example, I have seen high unemployment rates among my social circle at times when the official unemployment rate was quite low.
  10. Does the statistic feel right? Sometimes, even though the statistic survives detailed scrutiny — following the above steps — it still doesn’t seem right. There is considerable controversy over the reliability of intuition and “feelings.” Nonetheless, many people believe a strong intuition often proves more accurate than a contradictory “rational analysis.” Often if you meditate on an intuition or feeling, more concrete reasons for the intuition will surface.
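
The scale check in step 6 is plain arithmetic. As a worked example, here is a short Python sketch using the figures quoted in that step:

    # Putting "3.5 million STEM jobs" in context, using the figures from step 6.
    stem_jobs = 3.5e6            # claimed STEM jobs by 2025
    us_population = 328e6        # US Census estimate, July 1, 2018
    full_time_employed = 156e6   # BLS estimate, May 2019

    print(f"Share of total population:  {stem_jobs / us_population:.1%}")       # ~1.1%
    print(f"Share of full-time workers: {stem_jobs / full_time_employed:.1%}")  # ~2.2%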

(C) 2019 by John F. McGowan, Ph.D.

About Me

John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing gesture recognition for touch devices, video compression and speech recognition technologies. He has extensive experience developing software in C, C++, MATLAB, Python, Visual Basic and many other programming languages. He has been a Visiting Scholar at HP Labs developing computer vision algorithms and software for mobile devices. He has worked as a contractor at NASA Ames Research Center, involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech).

 

 


via How to Evaluate a Counting Statistic — Mathematical Software

8 Reasons Why I Love Forensics via Emily Leong

I get to talk about topics that matter to me

In forensics, I get to choose the topics for all my speeches. This means that they’re very personal and matter a lot to me. Issues like racism, homophobia, and gun control are all topics I’ve done speeches on. Being able to have a space to express my opinions, and to hear other opinions has been really empowering for me.

I can express myself

Forensics isn’t just all formal speeches. Most of the events I do are in the interpretation genre, meaning that my speeches are more like acting than formal informative or persuasive speeches. Being able to express myself through my speeches has been a great stress reliever for me.

Creativity

Being in interpretation events, I get to have a bigger creative licence with my speeches. This allows me to get creative with what I do with my movements and gestures. This creative side of forensics turns words into art.

Life Skills

Through this activity, I’ve developed the important life skill of public speaking. Many people who have not done forensics fear public speaking, but because I have grown such an affinity for it, speeches and class presentations are much easier and come quite naturally to me.

Community

I have met so many wonderful people through this activity. Most of my closest friends are on my team, and I’ve met so many people from other teams who are inspiring, beautiful, and kind.


I stay informed

Most topics in forensics are about either politics or current events. This means that I am not only informed during tournaments, but also motivated to read and watch more news and stay current outside of tournaments. Staying informed on current events has become important to my daily life.

Travel

Especially in college, forensics is a great way to travel to new places. In the spring of 2018, my team traveled to Nashville, Tennessee, for the Pi Kappa Delta national tournament. It was great to explore the city, and I can’t wait to have the opportunity to travel more.

Confidence Booster

Forensics has helped me learn to accept constructive criticism, which is something a lot of people need to work on. It also has allowed me to feel good about the work I put into each and every one of my speeches. Forensics has given me a space where I feel confident about myself.

Suits

Of course, a proper team is not complete without the proper attire. Suits are the norm at tournaments, and if you dig around enough on eBay or at Goodwill, you’re able to find some great pieces. Suits can be a great conversation starter with another competitor. A good suit is essential for forensics.


 


via 8 Reasons Why I love Forensics — Emily Leong

Forensic Failures Described via Law in Focus @ CSIDDS

Faulty Forensics: Explained

By Jessica Brand

(West Midlands Police / Flickr [CC])

In our Explainer series, Fair Punishment Project lawyers help unpack some of the most complicated issues in the criminal justice system. We break down the problems behind the headlines — like bail, civil asset forfeiture, or the Brady doctrine — so that everyone can understand them. Wherever possible, we try to utilize the stories of those affected by the criminal justice system to show how these laws and principles should work, and how they often fail. We will update our Explainers quarterly to keep them current.

In 1992, three homemade bombs exploded in seemingly random locations around Colorado. When police later learned that sometime after the bombs went off, Jimmy Genrich had requested a copy of The Anarchist Cookbook from a bookstore, he became their top suspect. In a search of his house, they found no gunpowder or bomb-making materials, just some common household tools — pliers and wire cutters. They then sent those tools to their lab to see if they made markings or toolmarks similar to those found on the bombs.

At trial, forensic examiner John O’Neil matched the tools to all three bombs and, incredibly, to an earlier bomb from 1989 that analysts believed the same person had made — a bomb Genrich could not have made because he had an ironclad alibi. No research existed showing that tools such as wire cutters or pliers could leave unique markings, nor did studies show that examiners such as O’Neil could accurately match markings left by a known tool to those found in crime scene evidence. And yet O’Neil told the jury it was no problem, and that the marks “matched … to the exclusion of any other tool” in the world. Based on little other evidence, the jury convicted Genrich.

Twenty-five years later, the Innocence Project is challenging Genrich’s conviction and the scientific basis of this type of toolmark testimony, calling it “indefensible.” [Meehan Crist and Tim Requarth / The Nation]

There are literally hundreds of cases like this, where faulty forensic testimony has led to a wrongful conviction. And yet as scientists have questioned the reliability and validity of “pattern-matching” evidence — such as fingerprints, bite marks, and hair — prosecutors are digging in their heels and continuing to rely on it. In this explainer, we explore the state of pattern-matching evidence in criminal trials.

What is pattern-matching evidence?

In a pattern-matching, or “feature-comparison,” field of study, an examiner evaluates characteristics visible on evidence found at the crime scene — e.g., a fingerprint, a marking on a fired bullet (“toolmark”), handwriting on a note — and compares those features to a sample collected from a suspect. If the characteristics, or patterns, look the same, the examiner declares a match. [Jennifer Friedman & Jessica Brand / Santa Clara Law Review]

Typical pattern-matching fields include the analysis of latent fingerprints, microscopic hair, shoe prints and footwear, bite marks, firearms, and handwriting. [“A Path Forward” / National Academy of Sciences] Examiners in almost every pattern-matching field follow a method of analysis called “ACE-V” (Analyze a sample, Compare, Evaluate — Verify). [Jamie Walvisch / Phys.org]

Here are two common types of pattern-matching evidence:

Fingerprints: Fingerprint analysts try to match a print found at the crime scene (a “latent” print) to a suspect’s print. They look at features on the latent print — the way ridges start, stop, and flow, for example — and note those they believe are “significant.” Analysts then compare those features to ones identified on the suspect print and determine whether there is sufficient similarity between the two. (Notably, some analysts will deviate from this method and look at the latent print alongside the suspect’s print before deciding which characteristics are important.) [President’s Council of Advisors on Science and Technology]

Firearms: Firearm examiners try to determine whether shell casings or bullets found at a crime scene were fired from a particular gun. They examine the collected bullets through a microscope, mark down characteristics, and compare these to characteristics on bullets test-fired from a known gun. If there is sufficient similarity, they declare a match. [“A Path Forward” / National Academy of Sciences]
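
Reduced to software terms, both procedures amount to counting shared characteristics and declaring a match once the count crosses some threshold. The sketch below is a deliberately oversimplified illustration in Python (the feature labels and thresholds are invented, not drawn from any real protocol); as the article discusses below, the real difficulty is that the threshold is left to each examiner’s judgment.

    # Toy illustration of pattern matching: count characteristics shared between
    # crime scene evidence and a suspect sample, then apply a threshold.
    # Feature labels and thresholds are hypothetical; real examinations are
    # far less structured than this.

    crime_scene_features = {"feature_A", "feature_B", "feature_C", "feature_D", "feature_E"}
    suspect_features     = {"feature_A", "feature_B", "feature_D", "feature_F", "feature_G"}

    shared = crime_scene_features & suspect_features
    print(f"Shared characteristics: {len(shared)}")  # 3

    for threshold in (3, 4, 5):
        verdict = "match" if len(shared) >= threshold else "inconclusive"
        print(f"threshold {threshold}: {verdict}")
    # The verdict flips from "match" to "inconclusive" purely as a function
    # of which threshold the examiner happens to choose.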

What’s wrong with pattern-matching evidence?

There are a number of reasons pattern-matching evidence is deeply flawed, experts have found. Here are just a few:

These conclusions are based on widely held, but unproven, assumptions.

The idea that handwriting, fingerprints, shoeprints, hair, or even markings left by a particular gun are unique is fundamental to forensic science. The finding of a conclusive match, between two fingerprints for example, is known as “individualization.” [Kelly Servick / Science Mag]

However, despite this common assumption, examiners actually have no credible evidence or proof that hair, bullet markings, or things like partial fingerprints are unique — in any of these pattern matching fields.

In February 2018, The Nation conducted a comprehensive study of forensic pattern-matching analysis (referenced earlier in this explainer, in relation to Jimmy Genrich). The study revealed “a startling lack of scientific support for forensic pattern-matching techniques.” Disturbingly, the authors also described “a legal system that failed to separate nonsense from science in capital cases; and consensus among prosecutors all the way up to the attorney general that scientifically dubious forensic techniques should not only be protected, but expanded.” [Meehan Crist and Tim Requarth / The Nation]

Similarly, no studies show that one person’s bite mark is unique and therefore different from everyone else’s bite mark in the world. [Radley Balko / Washington Post] No studies show that all markings left on bullets by guns are unique. [Stephen Cooper / HuffPost] And no studies show that one person’s fingerprints — unless perhaps a completely perfect, fully rolled print — are completely different than everyone else’s fingerprints. It’s just assumed. [Sarah Knapton / The Telegraph]

Examiners often don’t actually know whether certain features they rely upon to declare a “match” are unique or even rare.

On any given Air Jordan sneaker, there are a certain number of shared characteristics: a swoosh mark, a tread put into the soles. That may also be true of handwriting. Many of us were taught to write cursive by tracing over letters, after all, so it stands to reason that some of us may write in similar ways. But examiners do not know how rare certain features are, like a high arch in a cursive “r” or crossing one’s sevens. They therefore can’t tell you how important, or discriminating, it is when they see shared characteristics between handwriting samples. The same may be true of characteristics on fingerprints, marks left by teeth, and the like. [Jonathan Jones / Frontline]

There are no objective standards to guide how examiners reach their conclusions.

How many characteristics must be shared before an examiner can definitively declare “a match”? It is entirely up to the discretion of the individual examiner, based on what the examiner usually chalks up to “training and experience.” Think Goldilocks. Once she determines the number that is “just right,” she can pick. “In some ways, the process is no more complicated than a child’s picture-matching game,” wrote the authors of one recent article. [Liliana Segura & Jordan Smith / The Intercept] This is true for every pattern-matching field — it’s almost entirely subjective. [“A Path Forward” / National Academy of Sciences]

Unsurprisingly, this can lead to inconsistent and incompatible conclusions.

In Davenport, Iowa, police searching a murder crime scene found a fingerprint on a blood-soaked cigarette box. That print formed the evidence against 29-year-old Chad Enderle. At trial, prosecutors pointed to seven points of similarity between the crime scene print and Enderle’s print to declare a match. But was that enough? Several experts hired by the newspaper to cover the case said they could not draw any conclusions about whether it matched Enderle. But the defense lawyer didn’t call an expert and the jury convicted Enderle. [Susan Du, Stephanie Haines, Gideon Resnick & Tori Simkovic / The Quad-City Times]

Why faulty forensics persist

Despite countless errors like these, experts continue to use these flawed methods and prosecutors still rely on their results. Here’s why:

Experts are often overconfident in their abilities to declare a match.

These fields have not established an “error rate” — an estimate of how often examiners erroneously declare a “match,” or how often they find something inconclusive or a non-match when the items are from the same source. Even if your hair or fingerprints are “unique,” if experts can’t accurately declare a match, that matters. [Brandon L. Garrett / The Baffler]

Analysts nonetheless give very confident-sounding conclusions — and juries often believe them wholesale. “To a reasonable degree of scientific certainty” — that’s what analysts usually say when they declare a match, and it sounds good. But it actually has no real meaning. As John Oliver explained on his HBO show: “It’s one of those terms like basic or trill that has no commonly understood definition.” [John Oliver / Last Week Tonight] Yet, in trial after trial, jurors find these questionable conclusions extremely persuasive. [Radley Balko / Washington Post]

Why did jurors wrongfully convict Santae Tribble of murdering a Washington, D.C., taxi driver, despite his rock-solid alibi supported by witness testimony? “The main evidence was the hair in the stocking cap,” a juror told reporters. “That’s what the jury based everything on.” [Henry Gass / Christian Science Monitor]

But it was someone else’s hair. Twenty-eight years later, after Tribble had served his entire sentence, DNA evidence excluded him as the source of the hair. Incredibly, DNA analysis established that one of the crime scene hairs, initially identified by an examiner as a human hair, belonged to a dog. [Spencer S. Hsu / Washington Post]

Labs are not independent — and that can lead to biased decision-making.

Crime labs are often embedded in police departments, with the head of the lab reporting to the head of the police department. [“A Path Forward” / National Academy of Sciences] In some places, prosecutors write lab workers’ performance reviews. [Radley Balko / HuffPost] This gives lab workers an incentive to produce results favorable to the government. Research has also shown that lab technicians can be influenced by details of the case and what they expect to find, a phenomenon known as “cognitive bias.” [Sue Russell / Pacific Standard]

Lab workers may also have a financial motive. According to a 2013 study, many crime labs across the country received money for each conviction they helped obtain. At the time, statutes in Florida and North Carolina provided remuneration only “upon conviction”; Alabama, Arizona, California, Missouri, Wisconsin, Tennessee, New Mexico, Kentucky, New Jersey, and Virginia had similar fee-based systems. [Jordan Michael Smith / Business Insider]

In North Carolina, a state-run crime lab produced a training manual that instructed analysts to consider defendants and their attorneys as enemies and warned of “defense whores” — experts hired by defense attorneys. [Radley Balko / Reason]

Courts are complicit

Despite its flaws, judges regularly allow prosecutors to admit forensic evidence. In place of hearings, many take “judicial notice” of the field’s reliability, accepting as fact that the field is accurate without requiring the government to prove it. As Radley Balko from the Washington Post writes: “Judges continue to allow practitioners of these other fields to testify even after the scientific community has discredited them, and even after DNA testing has exonerated people who were convicted, because practitioners from those fields told jurors that the defendant and only the defendant could have committed the crime.” [Radley Balko / Washington Post]

In Blair County, Pennsylvania, in 2017, Judge Jolene G. Kopriva ruled that prosecutors could present bite mark testimony in a murder trial. Kopriva didn’t even hold an evidentiary hearing to examine whether it’s a reliable science, notwithstanding the mounting criticism of the field. Why? Because courts have always admitted it. [Kay Stephens / Altoona Mirror]

Getting it wrong

Not surprisingly, flawed evidence leads to flawed outcomes. According to the Innocence Project, faulty forensic testimony has contributed to 46 percent of all wrongful convictions in cases with subsequent DNA exonerations. [Innocence Project] Similarly, UVA Law Professor Brandon Garrett examined legal documents and trial transcripts for the first 250 DNA exonerees, and discovered that more than half had cases tainted by “invalid, unreliable, concealed, or erroneous forensic evidence.” [Beth Schwartzapfel / Newsweek]

Hair analysis

In 2015, the FBI admitted that its own examiners presented flawed microscopic hair comparison testimony in over 95 percent of cases over a two-decade span. Thirty-three people had received the death penalty in those cases, and nine were executed. [Pema Levy / Mother Jones] Kirk Odom, for example, was wrongfully imprisoned for 22 years because of hair evidence. Convicted of a 1981 rape and robbery, he served his entire term in prison before DNA evidence exonerated him in 2012. [Spencer S. Hsu / Washington Post]

In 1985, in Springfield, Massachusetts, testimony from a hair matching “expert” put George Perrot in prison — where he stayed for 30 years — for a rape he did not commit. The 78-year-old victim said Perrot was not the assailant, because, unlike the rapist, he had a beard. Nonetheless, the prosecution moved forward on the basis of a single hair found at the scene that the examiner claimed could only match Perrot. Three decades later, a court reversed the conviction after finding no scientific basis for a claim that a specific person is the only possible source of a hair. Prosecutors have dropped the charges. [Danny McDonald / Boston Globe]

In 1982, police in Nampa, Idaho, charged Charles Fain with the rape and murder of a 9-year-old girl. The government claimed Fain’s hair matched hair discovered at the crime scene. A jury convicted him and sentenced him to death. DNA testing later exonerated him, and, in 2001, after he’d spent two decades in prison, a judge overturned his conviction. [Raymond Bonner / New York Times]

Bite mark analysis

In 1999, 26 members of the American Board of Forensic Odontology participated in an informal proficiency test regarding their work on bite marks. They were given seven sets of dental molds and asked to match them to four bite marks from real cases. They reached erroneous results 63 percent of the time. [60 Minutes] One bite mark study has shown that forensic dentists can’t even determine if a bite mark is caused by human teeth. [Pema Levy / Mother Jones]

That didn’t keep bite mark “expert” Michael West from testifying in trial after trial. In 1994, West testified that the bite mark pattern found on an 84-year-old victim’s body matched Eddie Lee Howard’s teeth. Based largely on West’s testimony, the jury convicted Howard and sentenced him to death. Experts have since called bite mark testimony “scientifically unreliable.” And sure enough, 14 years later, DNA testing on the knife believed to be the murder weapon excluded Howard as a contributor. Yet the state continues to argue that Howard’s conviction should be upheld on the basis of West’s testimony. [Radley Balko / Washington Post]

West, who in 1994 was suspended from the American Board of Forensic Odontology and basically forced to resign in 2006, is at least partially responsible for several other wrongful convictions as well. [Radley Balko / Washington Post]

West himself has even discredited his own testimony, now stating that he “no longer believe[s] in bite mark analysis. I don’t think it should be used in court.” [Innocence Project]

Fingerprint analysis

The FBI has found that fingerprint examiners could have an error rate, or false match call, as high as 1 in 306 cases, with another study indicating examiners get it wrong as often as 1 in every 18 cases. [Jordan Smith / The Intercept] A third study of 169 fingerprint examiners found a 7.5 percent false negative rate (where examiners erroneously found prints came from two different people), and a 0.1 percent false positive rate. [Kelly Servick / Science Mag]

In 2004, police accused American attorney Brandon Mayfield of the notorious Madrid train bombing after experts claimed his fingerprint matched one found on a bag of detonators. Eventually, four experts agreed with this finding. Police arrested him and detained him for two weeks until the police realized their mistake and were forced to release him. [Steve Pokin / Springfield News-Leader]

In Boston, Stephan Cowans was convicted, in part on fingerprint evidence, in the 1997 shooting of a police officer. But seven years later, DNA evidence exonerated him and an examiner stated that the match was faulty. [Innocence Project]

A 2012 review of the St. Paul, Minnesota, crime lab found that over 40 percent of fingerprint cases had “seriously deficient work.” And “[d]ue to the complete lack of annotation of actions taken during the original examination process, it is difficult to determine the examination processes, including what work was attempted or accomplished.” [Madeleine Baran / MPR News]

Firearm analysis

According to one study, firearm examiners may have a false positive rate as high as 2.2 percent, meaning analysts may erroneously declare a match as frequently as 1 in 46 times. This is a far cry from the “near perfect” accuracy that examiners often claim. [President’s Council of Advisors on Science and Technology]
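
The “1 in 46” figure is essentially the reciprocal of the quoted 2.2 percent rate. A short Python sketch converts the error rates quoted in this section between percentages and “1 in N” frequencies (the rates come from the studies cited above; the labels and grouping are mine):

    # Converting quoted error rates between percentages and "1 in N" frequencies.
    # Note: 2.2% works out to roughly 1 in 45-46; the text rounds this to 1 in 46.
    rates = {
        "firearms false positive (2.2%)":       0.022,
        "fingerprints, study quoting 1 in 306": 1 / 306,
        "fingerprints, study quoting 1 in 18":  1 / 18,
    }
    for label, rate in rates.items():
        print(f"{label}: {rate:.2%}  ~ 1 in {1 / rate:.0f}")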

In 1993, a jury convicted Patrick Pursley of murder on the basis of firearms testimony. The experts declared that casings and bullets found on the scene matched a gun linked to Pursley “to the exclusion of all other firearms.” Years later, an expert for the state agreed that the examiner should never have made such a definitive statement. Instead, he should have stated that Pursley’s gun “couldn’t be eliminated.” In addition, the defense’s experts found that Pursley’s gun was not the source of the crime scene evidence. Digital imaging supported the defense. [Waiting for Justice / Northwestern Law Bluhm Legal Clinic] In 2017, a court granted Pursley a new trial. [Georgette Braun / Rockford Register Star]

Rethinking faulty forensics

Scientists from across the country are calling for the justice system to rethink its willingness to admit pattern-matching evidence.

In 2009, the National Research Council of the National Academy of Science released a groundbreaking report concluding that forensic science methods “typically lack mandatory and enforceable standards, founded on rigorous research and testing, certification requirements, and accreditation programs.” [Peter Neufeld / New York Times]

In 2016, the President’s Council of Advisors on Science and Technology (PCAST), a group of pre-eminent scientists, issued a scathing report on pattern-matching evidence. The report concluded that most of the field lacked “scientific validity” — i.e., research showing examiners could accurately and reliably do their jobs. [Jordan Smith / The Intercept] Until the field conducted better research proving its accuracy, the Council stated that forensic science had no place in the American courtroom. The study found that, regarding bite mark analysis, the error rate was so high that resources shouldn’t be wasted to attempt to show it can be used accurately. [Radley Balko / Washington Post]

After the PCAST report came out, then-Attorney General Loretta Lynch, citing no studies, stated emphatically that “when used properly, forensic science evidence helps juries identify the guilty and clear the innocent.” [Jordan Smith / The Intercept] “We appreciate [PCAST’s] contribution to the field of scientific inquiry,” Lynch said, “[but] the department will not be adopting the recommendations related to the admissibility of forensic science evidence.” [Radley Balko / Washington Post]

The National District Attorneys Association (NDAA) called the PCAST report “scientifically irresponsible.” [Jessica Pishko / The Nation] “Adopting any of their recommendations would have a devastating effect on the ability of law enforcement, prosecutors and the defense bar to fully investigate their cases, exclude innocent suspects, implicate the guilty, and achieve true justice at trial,” the association noted. [Rebecca McCray / Take Part]

The NDAA also wrote that PCAST “clearly and obviously disregard[ed] large bodies of scientific evidence … and rel[ied], at times, on unreliable and discredited research.” But when PCAST sent out a subsequent request for additional studies, neither the NDAA nor the Department of Justice identified any. [PCAST Addendum]

This problem is getting worse under the current administration. Attorney General Jeff Sessions has disbanded the National Commission on Forensic Science, formed to improve both the study and use of forensic science, and which had issued over 40 consensus recommendation documents to improve forensic science. [Suzanne Bell / Slate] He then developed a DOJ Task Force on Crime Reduction and Public Safety, tasked with “support[ing] law enforcement” and “restor[ing] public safety.” [Pema Levy / Mother Jones]

But there are also new attempts to rein in the use of disproven forensic methods. In Texas, the Forensic Science Commission has called for a ban on bite mark evidence. “I think pretty much everybody agrees that there is no scientific basis for a statistical probability associated with a bite mark,” said Dr. Henry Kessler, chair of the subcommittee on bite mark analysis. [Meagan Flynn / Houston Press]

A bill before the Virginia General Assembly, now carried over until 2019, would provide individuals convicted on now-discredited forensic science a legal avenue to contest their convictions. The bill is modeled after similar legislation enacted in Texas and California. The Virginia Commonwealth’s Attorneys Association opposes the legislation, arguing: “It allows all sorts of opportunities to ‘game’ the system.” [Frank Green / Richmond Times-Dispatch]

Meanwhile, at least one judge has recognized the danger of forensic expert testimony. In a 2016 concurrence, Judge Catherine Easterly of the D.C. Court of Appeals lambasted expert testimony about toolmark matching: “As matters currently stand, a certainty statement regarding toolmark pattern matching has the same probative value as the vision of a psychic: it reflects nothing more than the individual’s foundationless faith in what he believes to be true. This is not evidence on which we can in good conscience rely, particularly in criminal cases … [T]he District of Columbia courts must bar the admission of these certainty statements, whether or not the government has a policy that prohibits their elicitation. We cannot be complicit in their use.” [Spencer S. Hsu / Washington Post]

Do you wonder how witchcraft and satanic child-eating coven stories survive in this era of lies, misdemeanors and wrongful convictions? This article pushes back against the spiel about forensic reliability coming out of the US White House and DOJ (and some DAs). https://injusticetoday.com/faulty-forensics-explained-fe4d41157452

via #Forensics: The usual forensic failures described : DAs don’t give much of a damn. — FORENSICS and LAW in FOCUS @ CSIDDS | News and Trends

Run, hide, fight: how to survive an active shooter situation

Who is Stephen Paddock? Police say suspect responsible for deadliest shooting in US history

via Fox2 News:

At least 50 people were killed late Sunday night in a mass shooting during a concert on the Las Vegas Strip. Here’s what we know about Stephen Paddock, the man police identified as the shooter:

Paddock, 64, is from Mesquite, Nevada, authorities said. Police stormed his hotel room on the 32nd floor of the Mandalay Bay Resort and Casino, from which they believe he fired the shots. They say he had over 10 rifles when they found his body, and they believe he killed himself before officers entered. He had checked into the hotel on September 28th.

Paddock was apparently targeting a crowd of 30,000 people attending the Route 91 Harvest music festival below.

Police are also at his home in Mesquite, Nevada, methodically investigating the property. The Mesquite Police Department had no prior contact with him, according to spokesman Quinn Averett, nor had there been any calls to Paddock’s home in Mesquite, which is about 80 miles northeast of Las Vegas. Officials did not know how long he had been living in the area.

This mass shooting is not yet considered an act of terrorism. Police are looking into a motive for the shooting before they can call it an act of terror.

So far, the massacre has no known link to overseas terrorism or terror groups, a US official with knowledge of the case said.

And a woman described as a “person of interest” after the attack is not believed to be involved in the shooting, police said in a statement.

“Marilou Danley is no longer being sought out as a person of interest,” the Las Vegas Metropolitan Police Department said. “LVMPD detectives have made contact with her and do not believe she is involved with the shooting on the strip.”

The suspect:

• Police have identified the shooter as Stephen Paddock, a 64-year-old man. He was earlier described as a “local resident.”

• Officers said they engaged the suspect at the Mandalay Bay hotel, and he was killed.

• Police do not believe there are additional shooters.

Casualties:

• Police report there are now at least 50 dead and some 200 injured, making this the deadliest shooting in modern US history. The 2016 Pulse nightclub shooting in Orlando, Florida, was previously the deadliest, with 49 killed.

• Two off-duty Las Vegas police officers were killed in the shooting, Sheriff Joseph Lombardo said at a news conference.

• Two other officers responding to the scene were wounded, the sheriff said. One of them is in critical condition.

Investigation:

• Police said they are confident they have located Marilou Danley, who was traveling with the suspect. She had been sought for questioning in connection with the shooting. She is not named as a suspect.

• Police also said they have located two vehicles they were searching for: a Hyundai Tucson with a Nevada plate, and a Chrysler Pacifica Touring with a Nevada plate.

• Police are urging patience: “I think it’s very important that you understand this investigation is going to be long and contracted before we get to the bottom of everything associated with it,” Sheriff Lombardo said.

How shooting unfolded:

• Just after 10 p.m. Sunday local time (1 a.m. ET Monday) the Route 91 Harvest Festival — a country music concert — was interrupted by the sound of gunfire, witnesses said.

• Police said the gunman fired on the crowd of about 30,000 people from the 32nd floor of the Mandalay Bay hotel, several hundred feet southwest of the concert grounds.

• Country music singer Jason Aldean was performing when the gunshots began, according to eyewitness cell phone video.

• “The gunshots lasted for 10 to 15 minutes. It didn’t stop,” said eyewitness Rachel de Kerf.

Reaction:

• President Trump tweeted, “My warmest condolences and sympathies to the victims and families of the terrible Las Vegas shooting. God bless you!”

• Las Vegas Airport diverts two dozen flights due to the shooting. Travelers are being urged to check with their airlines for flight status.

• Performer Jason Aldean released a statement that said: “Tonight has been beyond horrific.” On Instagram, Aldean said that he and his crew were safe.

 

After a deadly mass shooting in Las Vegas, local security experts are sharing advice on how to survive an active shooter situation.

A gunman opened fire on a crowd on the Las Vegas strip Sunday night, killing at least 58 and injuring hundreds more.

The best advice from law enforcement in an active shooter situation: run, hide, or fight.

If possible, experts say the best way is to run away from the threat. Always try to escape and evacuate, even if others are insisting on staying.

However, in certain situations it may be better to remain in place and try to avoid detection.

The last option, though the least preferable, is to fight the shooter with whatever means necessary.

In August, the Virginia Beach Police Department’s Crime Prevention Unit provided its Active Threat Citizen Defense training in a public setting for the first time.

Master Police Officer David Nieves talked to News 3 after the training about surviving an active threat situation.

“You’ve got three choices. You either need to run, hide or fight. You just need to keep your head about yourself and understand that you only have a few seconds to understand what you’re going to do. You better have a plan before you need that plan because there are no seconds, you aren’t going to have time to make a plan,” he explained.

Experts say it’s also important to remain aware of your surroundings when in public.

“Things can happen anywhere. We don’t want [people] to have the mindset that it will never happen here. It will never happen to me. If you have that mindset then you’ve already lost,” MPO Nieves stated.

Police are searching for a woman named Marilou Danley who was traveling with the suspect, Lombardo said. He described Danley as an Asian woman, 4 ft 11 inches tall and 111 pounds. “We have not located her at this time and we are interested in talking to her,” he said.

Related: 

Class teaches how to survive an active shooting situation

Mass casualty drill takes place in Norfolk in preparation for the worst

In an active shooter situation, don’t just stand there…

This is how the FBI says civilians should react in an active shooter situation

via Latest updates: 58 killed, 515 hurt in Las Vegas Strip massacre — WTKR.com

Aaron Hernandez sues over hacked jailhouse phone calls — WTNH Connecticut News

BOSTON (AP) — Former New England Patriots player Aaron Hernandez has sued a company after some of his jailhouse phone calls were apparently listened to by an unknown person. Hernandez’s lawsuit filed in federal court in Boston charges Dallas-based Securus Technologies Inc. with negligence, breach of contract, and invasion of privacy. It demands that Securus…

via Aaron Hernandez sues over hacked jailhouse phone calls — WTNH Connecticut News