7 Countries with the Best Forensic Technologies via iTHINK

Civilizations that have invested thousands of years in perfecting a field tend to NAIL IT with greater accuracy….

Many research and development solutions can be obtained through information sharing with countries that have had centuries of trial-and-error experimentation. Seek to learn from mentors in the field, and save yourself from complicating your analysis.

7 Countries with the Best Forensic Technology

7. China

Technology

China has been investing time, energy and resources into forensic science since the 1980s, and globally renowned forensic scientist Henry Chang-yu Lee believes it’s about to pay off, tipping China to become a world leader in high-tech evidence collection.

“I believe the technology in China will be more advanced than that in the United States within five years,” the Chinese-American expert said in a recent interview with China Daily.

Lee, who has racked up more than five decades of experience in forensic science, has worked on a number of high-profile criminal cases in the US, but has also shared his wealth of knowledge with students, lawyers, judges and law enforcement in China over the years.

“The apparatus and devices used to identify fingerprints or footprints, for example, were very simple when I first visited Chinese forensic laboratories,” he said.

However, he has seen the technology improve over the years and there have been many advances, particularly in electronic evidence collection and fraud prevention by means of real-time monitoring.

In 2016, Lee and several other experts established the Silk Road Forensic Consortium in Xi’an, Shaanxi province, to fight crime and safeguard security by boosting scientific exchanges among countries involved in the Belt and Road Initiative.

The consortium, which has 150 members from 30 countries and regions, provides an open platform for forensic specialists, police officers and judges to share ideas and difficulties as well as experiences in DNA identification studies.

Lee, who acts as chairman, said, “Although we speak different languages in our daily lives, we all speak the same ‘language’ at work, and that’s the language of the criminal investigation. We share the same goal – to speak for the dead using forensic science.”

In September, at the organisation’s third annual conference in Yantai, Shandong province, Lee announced plans to unify DNA identification standards among its members to try to build a shared DNA database that can help solve criminal cases more effectively.

Unified standards are essential to the world of forensic science, he told China Daily.

“If we can achieve unification in China, it can be extended across Asia, to the consortium and finally the world,” he added. “It would mean a brighter future for forensic science.”

6. European Network of Forensic Science Institutes

Although not a country, the European Network of Forensic Science Institutes (ENFSI) is recognized as a pre-eminent voice in forensic science worldwide. It is a network of forensic specialists covering a broad range of fields of expertise, from 38 countries geographically spread across Europe:

Austria, Armenia, Azerbaijan, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Georgia, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Former Yugoslav Republic of Macedonia, Malta, Montenegro, The Netherlands, Norway, Poland, Portugal, Romania, Russia, Serbia, Slovenia, Slovakia, Spain, Sweden, Switzerland, Turkey, Ukraine and the United Kingdom.

The ENFSI has seventeen Expert Working Groups working on a diverse range of forensic specialisms, from textiles and hair to explosives and firearms. It also provides invaluable training to police officers and crime scene investigators.

5. Germany

Technology

Police in the German state of Bavaria have the power to use forensic DNA profiling after a controversial law passed in 2018 in the Landtag, the state parliament in Munich. The law was the first in Germany to allow authorities to use DNA to help determine the physical characteristics, such as eye colour, of an unknown culprit.

The new DNA rules are part of a broader law which has drawn criticism for the wide surveillance powers it gives the state’s police to investigate people they deem an “imminent danger”, people who haven’t necessarily committed any crimes but might be planning to do so.

The move was prompted, in part, by the rape and murder of a medical student in Freiburg, Germany, in late 2016. An asylum seeker, originally from Afghanistan, was convicted of the murder and sentenced to life in prison.

But some authorities complained that they could have narrowed their search more quickly if they had been able to use trace DNA to predict what the suspect would look like.

Federal and state laws previously only allowed investigators to use DNA to look for an exact match between crime scene evidence and a potential culprit, either in a database of known criminals or from a suspect.
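For illustration, here is a minimal sketch of what that kind of exact-match comparison amounts to in code. The STR marker names are real loci, but the profiles, database and matching rule are invented for the example and do not represent any actual police system.

```python
# Toy illustration of exact-match DNA comparison: a crime-scene STR
# profile either matches a database entry exactly or it does not.
# Profiles are hypothetical: marker name -> pair of repeat counts.

known_profiles = {
    "sample_A": {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)},
    "sample_B": {"D3S1358": (14, 15), "vWA": (17, 17), "FGA": (20, 22)},
}

def exact_matches(trace_profile, database):
    """Return the names of database entries that match the trace exactly."""
    return [name for name, profile in database.items()
            if profile == trace_profile]

trace = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)}
print(exact_matches(trace, known_profiles))  # ['sample_A']
```

Phenotype prediction, by contrast, tries to infer externally visible traits from the DNA itself rather than looking a profile up, which is why it required a separate legal basis.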

Germany also forms part of the aforementioned ENFSI.

4. South Korea

To say that smartphones have changed the digital forensic landscape is an understatement. The device has become central to almost every criminal investigation and has helped establish digital forensics as a serious, scientific investigative tool.

South Korea is leading the way in digital forensics, with its largest digital forensic firm, Hancom GMD, playing a crucial role in prosecuting some of the country’s most powerful politicians.

In late 2016, South Korea was rocked by one of the biggest political corruption scandals in its history: President Park Geun-hye was accused of bribery, and by law investigators had only 60 days to investigate and prosecute.

They had confiscated over 300 smartphones from suspects and needed to analyse tens of thousands of phone records and chat messages within a tight deadline. Hancom GMD successfully analysed all of the data in the 300 smartphones and extracted crucial evidence that led to several convictions.

With 5G set to be rolled out globally this year, forensic teams in South Korea are already preparing for this further growth in the collection of digital evidence.

Hancom GMD is planning to launch a service that recovers data from the cloud, though privacy regulations in each country are expected to be a challenge to overcome.

3. United Kingdom

Technology

Prior to its closure in 2012, the UK Forensic Science Service (FSS) was a world leader in forensic technology. It pioneered the use of the handheld roadside breath alcohol tester, and the national DNA database was first developed there, initially tested on all staff and police forces to ensure its reliability.

The organisation later pioneered the use of large scale DNA profiling for forensic identification and crime detection when it moved the facilities to Birmingham.

This enabled the launch of the world’s first DNA database on 10 April 1995. The FSS’s innovative and highly sensitive DNA profiling technique, LCN (low copy number), was used in convicting Antoni Imiela (the M25 rapist) and Ronald Castree (for the murder of Lesley Molseed in 1975). However, the organisation came under attack when it failed to recover blood stains from a shoe in the murder of Damilola Taylor.

Forensic laboratories in the UK are now privately owned but are experiencing financial difficulties similar to those that led to the FSS’s closure, a recent inquiry by the House of Lords heard.

Mark Pearse, the commercial director in the forensics division of Eurofins, one of the three major providers in the UK, described an “unsustainable toxic set of conditions” when he appeared before the inquiry.

Representatives from the two other largest providers – Key Forensics, which had to be bailed out by police last year after going into administration, and Cellmark – raised similar concerns.

However, that’s not to say that the UK is not involved in researching and implementing new forensic technologies.

Forensic scientists are currently working with the British military to open the United Kingdom’s first body farm — a site where researchers will be able to study the decomposition of human remains.

Details are not yet finalized, but the plans are at an advanced stage: project leaders hope this year to open the farm, also known as a forensic cemetery or taphonomy facility, after the discipline devoted to the study of decay and fossilization.

Such sites generate data on tissue and bone degradation under controlled conditions, along with chemical changes in the soil, air and water around a corpse, to help criminal and forensic investigators.

2. The Netherlands

The Netherlands Forensic Institute (NFI) is one of the world’s leading forensic laboratories. From its state-of-the-art, purpose-built premises in The Hague, the NFI provides products and services to a wide range of national and international clients.

To ensure that their work remains at the forefront of developments, the Netherlands Forensic Institute invests heavily in Research and Development. In this way, it lays the foundations for innovative forensic methods and technologies that will play an important part in the coming decades.

Amongst these innovative forensic technologies is Hansken, a system that can easily store large quantities of diverse data from different sources. All data is stored, indexed, enriched and made rapidly searchable, cutting down the turnaround time of forensic evidence.
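The article does not describe Hansken’s internals, but the “indexed and made rapidly searchable” idea can be illustrated with a minimal inverted index. Everything below (the artefact records, fields and queries) is invented for the example; it is a sketch of the general technique, not Hansken itself.

```python
from collections import defaultdict

# Hypothetical artefacts extracted from different sources in one case.
artifacts = [
    {"id": 1, "source": "phone",  "text": "meeting at the warehouse tonight"},
    {"id": 2, "source": "laptop", "text": "invoice for warehouse rental"},
    {"id": 3, "source": "phone",  "text": "delete the invoice"},
]

# Build an inverted index once: word -> set of artefact ids containing it.
index = defaultdict(set)
for artifact in artifacts:
    for word in artifact["text"].lower().split():
        index[word].add(artifact["id"])

def search(word):
    """Look a term up in the index instead of rescanning every artefact."""
    return sorted(index.get(word.lower(), set()))

print(search("warehouse"))  # [1, 2]
print(search("invoice"))    # [2, 3]
```

Indexing up front is what turns repeated investigative queries from full rescans into fast lookups, which is the turnaround-time gain described above.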

The NFI also maintains a forensic glass database, which now contains over 150 samples of glass from a large number of cases. In several cases, this database has linked suspects to multiple crimes.

Offenders who carry out robberies, smash-and-grab raids or ATM gas attacks may have splinters of glass on their clothes or in the soles of their shoes. These splinters can remain in place for months, even though they are barely visible to the naked eye, if at all.

These splinters can be of great value. The composition of each piece of glass is unique because of minuscule contaminants in the raw materials for making glass.

By comparing the unique composition of splinters of glass found on a suspect to glass from the database, it is possible to check whether that glass originates from a crime committed earlier.

The glass analysts of the NFI measure the concentration of twenty elements in each piece of glass. This produces a kind of ‘chemical fingerprint’ of the material.
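As a rough illustration of how such a “chemical fingerprint” comparison might look, here is a toy sketch. Only three elements are shown rather than twenty, the concentrations are invented, and the matching criterion (every element within 10% relative difference) is an assumption for the example; the NFI’s actual statistical method is not described here.

```python
# Toy comparison of two glass fragments by elemental composition.
# Hypothetical concentrations (ppm) for three trace elements.
fragment_from_suspect = {"Sr": 52.1, "Zr": 38.4, "Ba": 14.9}
fragment_from_scene   = {"Sr": 51.7, "Zr": 39.0, "Ba": 15.1}

def could_share_source(a, b, tolerance=0.10):
    """True if every element's concentration agrees within `tolerance`."""
    return all(abs(a[el] - b[el]) / b[el] <= tolerance for el in b)

print(could_share_source(fragment_from_suspect, fragment_from_scene))  # True
```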

1. United States of America

Technology

It will come as no surprise that at the forefront of cutting-edge forensic technology is the USA, home to over 400 crime labs and the biggest crime lab in the world, the FBI Laboratory.

 

As part of a research project for the U.S. Army, scientists at The University of Texas at San Antonio have developed the first framework to score the agility of cyber attackers and defenders, to help train government and industry organisations on cyberattack prevention.

“The DOD and U.S. Army recognize that the cyber domain is as important a battlefront as ground, air and sea,” said Dr. Purush Iyer, division chief, network sciences at Army Research Office, an element of the Army Futures Command’s Army Research Laboratory.

“Being able to predict what the adversaries will likely do provides opportunities to protect and to launch countermeasures. This work is a testament to successful collaboration between academia and government.”

The framework developed by the researchers will help government and industry organizations visualize how well they out-maneuver attacks.

Their work is published in IEEE Transactions on Information Forensics and Security, a top journal for cybersecurity.

Education and training programs in the field of forensics are also on the rise, supported by organisations such as The Forensic Sciences Foundation and the American Academy of Forensic Sciences.

In fact, there are 485 forensic science schools in the US, so it is no wonder that it is home to some of the most influential forensic scientists, such as Dr. Michael M. Baden and Ellis R. Kerley, and is sure to produce a great deal more talent in the future.

This is certainly an exciting time to be working in forensic science, with the challenges presented by AI, smartphones and cloud data calling for rapid improvements to existing technology.

With these challenges comes the need for those countries with more developed forensic facilities to provide training and education opportunities to those in less developed areas so that science can play its rightful part in the criminal justice system.

For now, these are the seven countries with the most advanced forensic technology, but this is not the end. As the world continues to evolve, so will technology and the forensic industry itself.

Hi! I’m Isabella and I’m an Italian living in the UK studying for a Masters in Crime & Justice. I currently work in the prison education sector and have a background in teaching, having completed a PGCE after reading languages at the University of Durham. I love travelling, cooking, reading and playing the piano.

Technology is at its peak and with it has brought about some of the finest forensic tech. Here are 7 countries with the best forensic technology.

via 7 Countries with the Best Forensic Technology — iTHINK


Factors Affecting The Intensity Of Poisoning via Forensic’s blog

By @forensicfield

Introduction

The outcome of poisoning depends on many factors.

A number of factors can affect the intensity of poisoning; these are explained further below:

  • Dose.
  • Time of intake
  • Way of taking
  • Environmental factors, etc.

Dose

The amount of poison determines its effect on the body: the smaller the dose, the lighter the effect; the larger the dose, the more severe the effect.

Resistance

With continuous use of some drugs, such as opiates, tobacco and alcohol, a person develops a resistance towards them.

Incompatible Combination of Drugs

Ingestion of some incompatible combinations of medicines may be fatal, such as Prozac and tramadol; thyroid medication and proton pump inhibitors; and nonsteroidal anti-inflammatory drugs and antihypertensives.

Hypersensitivity

Some persons show an abnormal response (idiosyncrasy) to a drug like morphine, quinine or aspirin due to inherent personal hypersensitivity.

Allergy

Some persons are allergic (acquired hypersensitivity) to certain drugs like penicillin, sulpha drugs, etc.

Incompatible Combinations

Ingestion of certain medications together, like anti-ulcer gels with aspirin, may lead to fatal effects.

Tolerance

People develop a marked tolerance to opium, alcohol, strychnine, tobacco, arsenic and some other narcotic drugs through repeated and continued use.

Synergism

Some drugs that are tolerable alone can be toxic when taken together and may cause a lethal effect, such as alcohol and benzodiazepines, heroin and cocaine, benzodiazepines and opioids, and alcohol and opioids.

Slow Poisons

Continuous ingestion of small amounts of a poison like arsenic, strychnine or lead allows it to accumulate in the body and may cause death.

Conditions of The Body

  • Conditions of the body, i.e. age, health, etc., also affect the action of a poison.
  • Generally, old persons, weaker persons and children are more severely affected by a low dose of poison than young and healthy persons.

Cumulative Action

The repeated small doses of cumulative poisons like arsenic, lead, mercury, strychnine, digitalis etc. may cause death or chronic poisoning by cumulative action.

Shock

Sometimes a large dose of a poison acts differently from a small dose; for example, a large dose of arsenic may cause death by shock while a small dose results in diarrhoea.

Forms of Poison

  • Gases/Vapours Poisons
  • Liquid Poisons
  • Powder Poisons
  • Chemical Combination
  • Mechanical Combination

Gases / Vapours Poisons

These types of poison are absorbed immediately and act quickly.

Liquid Poisons

These act faster than solids.

Powder Poisons

Finely powdered poisons act faster than coarsely powdered ones.

Chemical Combination

Some substances act differently in chemical combination, such as acids and alkalis, or strychnine and tannic acid.

Mechanical Combination

The action of a poison is altered when it is combined mechanically with inert substances; for example, an alkaloid taken with charcoal does not act, because the charcoal adsorbs it.

Methods Of Administration

A poison acts more rapidly when inhaled in gaseous form or when injected intravenously.

It acts next most rapidly when injected intramuscularly or subcutaneously.

A poison acts slowly when swallowed or applied on skin.





via Factors Affecting The Intensity Of Poisoning — Forensic’s blog

Biometrics — Forensic’s blog

INTRODUCTION

✴ The term “biometrics” is derived from the Greek words “bio” (life) and “metrics” (to measure).
✴ Biometrics is the technical term for body measurements and calculations.
✴ Biometrics is the measurement and statistical analysis of people’s unique physical and behavioral characteristics.
✴ Biometrics allows a person to be identified and authenticated based on a set of recognizable and verifiable data, which are unique and specific to them.
✴ Biometric authentication is the process of comparing a person’s characteristic data against a stored record in order to determine resemblance.
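As a toy sketch of that comparison step, the snippet below checks a freshly captured feature vector against a stored template and accepts it if the two are close enough. The feature values and threshold are invented for the example; real systems use modality-specific features and matching algorithms.

```python
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(template, sample, threshold=0.5):
    """Accept the sample if it resembles the enrolled template."""
    return distance(template, sample) <= threshold

enrolled = [0.12, 0.80, 0.45, 0.33]                # stored at enrolment
print(verify(enrolled, [0.15, 0.78, 0.47, 0.30]))  # True: accepted
print(verify(enrolled, [0.90, 0.10, 0.20, 0.80]))  # False: rejected
```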

CHARACTERISTICS

1. UNIVERSAL
2. UNIQUE
3. INVARIABLE
4. RECORDABLE
5. MEASURABLE

HISTORY OF BIOMETRICS

🔘1858 – First systematic capture of hand images for identification purposes is recorded.
🔘1870 – Bertillon develops anthropometry to identify individuals.
🔘1892 – Galton develops a classification system for fingerprints.
🔘1896 – Henry develops a fingerprint classification system
🔘1903 – NY State Prisons begin using fingerprints.
🔘1960s – Face recognition becomes semi-automated.
🔘1960 – First model of acoustic speech production is created.
🔘1963 – Hughes research paper on fingerprint automation is published.
🔘1974 – First commercial hand geometry systems become available.
🔘1976 – First prototype system for speaker recognition is developed.
🔘1986 – Exchange of fingerprint minutiae data standard is published.
🔘1988 – First semi-automated facial recognition system is deployed.
🔘1991 – Face detection is pioneered, making real time face recognition possible.
🔘1992 – Biometric Consortium is established within US Government.
🔘1994 – Palm System is benchmarked.
🔘1996 – Hand geometry is implemented at the Olympic Games.
🔘1996 – NIST begins hosting annual speaker recognition evaluations.
🔘1997 – First commercial, generic biometric interoperability standard is published.
🔘1998 – FBI launches CODIS (DNA forensic database).
🔘1999 – FBI’s IAFIS major components become operational.
🔘2001 – Face recognition is used at the Super Bowl in Tampa, Florida.
🔘2002 – ISO/IEC standards committee on biometrics is established.
🔘2004 – First statewide automated palm print databases are deployed in the US.
🔘2008 – U.S. Government begins coordinating biometric database use.
🔘2010 – U.S. national security apparatus utilizes biometrics for terrorist identification.
🔘2011 – Biometric identification used to identify body of Osama bin Laden.

TYPES OF BIOMETRICS

Biometrics Can Be Divided Into Three Main Categories Of Characteristics:
1. BIOLOGICAL
2. MORPHOLOGICAL
3. BEHAVIORAL

⏩DNA MATCHING

The identification of an individual using the analysis of segments from DNA.

👂EAR

The identification of an individual using the shape of the ear.

👀EYES – IRIS RECOGNITION & RETINA RECOGNITION

👁IRIS RECOGNITION- The use of the features found in the iris to identify an individual.
👁RETINA RECOGNITION- The use of patterns of veins in the back of the eye to accomplish recognition.

👱‍♂️FACE RECOGNITION

The analysis of facial features or patterns for the authentication or recognition of an individual’s identity.

🤘FINGERPRINT RECOGNITION

The use of the ridges and valleys (minutiae) found on the surface tips of a human finger to identify an individual.

👋FINGER GEOMETRY RECOGNITION

The use of 3D geometry of the finger to determine identity.

🤚HAND GEOMETRY RECOGNITION

The use of the geometric features of the hand such as the lengths of fingers and the width of the hand to identify an individual.

🙌VEIN RECOGNITION

Vein recognition is a type of biometrics that can be used to identify individuals based on the vein patterns in the human finger or palm.

👃ODOUR

The use of an individual’s odour to determine identity.

✍SIGNATURE RECOGNITION

The authentication of an individual by the analysis of handwriting style, specifically the signature. Technology is available to check two scanned signatures using advanced algorithms.

👩‍💻TYPING RECOGNITION

The use of the unique characteristics of a person’s typing to establish identity.

🗣VOICE / SPEAKER RECOGNITION

There are two major applications of speaker recognition:
🙊Voice – Speaker Verification / Authentication
🙊Voice – Speaker Identification
✔In forensic applications, it is common to first perform a speaker identification process to create a list of “best matches” and then perform a series of verification processes to determine a conclusive match (a toy sketch of this two-stage approach follows after the gait entry below).
✔Voice recognition analyzes audio input for specific patterns in speech or sound. Each voice, or common noise, has a recognizable wavelength pattern that can aid in the identification of a specific individual.

🚶‍♀️👣GAIT

The use of an individual’s walking style or gait to determine identity.
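As promised under the voice/speaker entry above, here is a toy sketch of that two-stage identify-then-verify flow. The three-number “voiceprints”, similarity function and threshold are all invented for the example; real systems compare acoustic models, not tiny vectors.

```python
def similarity(a, b):
    """A crude inverse-distance score: higher means more similar."""
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))

enrolled = {
    "speaker_1": [0.20, 0.90, 0.40],
    "speaker_2": [0.80, 0.10, 0.50],
    "speaker_3": [0.25, 0.85, 0.35],
}

def identify(sample, database, top_k=2):
    """Stage 1: rank enrolled speakers to build a 'best matches' list."""
    ranked = sorted(database, key=lambda s: similarity(sample, database[s]),
                    reverse=True)
    return ranked[:top_k]

def verify(sample, candidate, database, threshold=0.9):
    """Stage 2: accept a candidate only above a similarity threshold."""
    return similarity(sample, database[candidate]) >= threshold

questioned = [0.22, 0.88, 0.38]
best = identify(questioned, enrolled)   # ['speaker_1', 'speaker_3']
print([c for c in best if verify(questioned, c, enrolled)])
```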


via Biometrics — Forensic’s blog

Intro to Fingerprints via Forensic’s blog

Fingerprints- An Introduction

By @forensicfield

INTRODUCTION

Everything we touch, we leave behind our unique impression on it: our fingerprints.

No two people have exactly the same fingerprints. Even identical twins, with identical DNA, have different fingerprints.

Fingerprint identification is also known as “dactyloscopy”.

Fingerprints are the tiny ridge, whorl and valley patterns on the tip of each finger. They develop from pressure on a baby’s tiny, developing fingers in the womb.

CLASSIFICATION OF FINGERPRINTS

🧤By FRANCIS GALTON
The well-known British scientist Sir Francis Galton published his first book on fingerprints in 1892. His important work includes a method of fingerprint classification which divides prints into three groups:
a. Arch
b. Loop
c. Whorl

🧤 By WILLIAM J. HERSCHEL
While working for the East India Company in Bengal, India, Sir William James Herschel first used fingerprints on native contracts. After a decade, he had accumulated a file of fingerprints.

🧤 By EDWARD HENRY
The Henry classification of fingerprints was accepted as common practice throughout England, its territorial holdings and the United States.
Under the Henry system, fingerprints are divided into two classes:
•Those which are given a numerical value (whorls and composites).
•Those which are not given a numerical value (loops and arches).

🖐 All patterns are divided as follows:

The Henry classification system assigns each finger a number according to the order in which it is located in the hand, beginning with the right thumb as number 1 and ending with the left pinky as number 10.
• The system also assigns a numerical value to fingers that contain a whorl pattern; fingers 1 and 2 each have a value of 16,
• Fingers 3 and 4 = 8,
• Fingers 5 and 6 = 4,
• Fingers 7 and 8 = 2,
• Final two fingers = 1.
• Fingers with a non-whorl pattern, such as an arch or loop pattern, have a value of zero.
• The sum of the even-finger values is then calculated and placed in the numerator of a fraction.
• The sum of the odd-finger values is placed in the denominator.
• The value of 1 is added to each sum of the whorls, with the maximum obtainable on either side of the fraction being 32.
• Thus, the primary classification is a fraction between 1/1 and 32/32, where 1/1 would indicate no whorl patterns and 32/32 would mean that all fingers had whorl patterns (see the sketch below).
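Here is a short computational sketch of the primary classification, directly encoding the finger values listed above (the function name and input format are illustrative choices):

```python
# Whorl values by finger number (right thumb = 1 ... left pinky = 10),
# as given in the list above; non-whorl fingers contribute zero.
WHORL_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

def henry_primary(whorl_fingers):
    """Return the primary classification as (numerator, denominator)."""
    even = sum(WHORL_VALUES[f] for f in whorl_fingers if f % 2 == 0)
    odd = sum(WHORL_VALUES[f] for f in whorl_fingers if f % 2 == 1)
    # 1 is added to each sum, so the fraction runs from 1/1 to 32/32.
    return even + 1, odd + 1

print(henry_primary(set()))               # (1, 1): no whorls at all
print(henry_primary({1, 2}))              # (17, 17): whorls on fingers 1 and 2
print(henry_primary(set(range(1, 11))))   # (32, 32): whorls on every finger
```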

🧤 By JUAN VUCETICH
Vucetich is credited with the first positive criminal identification as, in 1892, he was able to extract a set of prints off a door and thus identify a woman as the culprit in a double homicide.

✴ CHARACTERISTICS OF FINGERPRINT

🖐 Class Characteristics
Class characteristics are the characteristics that narrow the print down to a group but not an individual.

The Three Fingerprint Class Types Are;
1. Arches:
✔Arches are the simplest type of fingerprints that are formed by ridges that enter on one side of the print and exit on the other. No deltas are present.
✔About 5% of the world’s population has arch patterns.

Arch pattern

2. Loops:
✔Loops must have one delta and one or more ridges that enter and leave on the same side. These patterns are named for their positions related to the radius and ulna bones.
✔About 60-65% of the world’s population has loop patterns.

Loop pattern

3. Whorls:
✔Whorls have at least one ridge that makes (or tends to make) a complete circuit. They also have at least two deltas.
✔About 30-35% of the world’s population has whorl patterns.

Whorl Pattern

👍 Individual characteristics:
👉Individual characteristics are those characteristics that are unique to an individual.
👉They are tiny irregularities that appear within the friction ridges and are referred to as Galton’s details.

Individual Characteristics of Fingerprint



via Fingerprints- An Introduction — Forensic’s blog

Global Standardization of Forensics will Decrease the Bias Factor of Evidence Collection Procedures and Court Rulings

Interviews – 2018

Angus Marshall, Digital Forensic Scientist

via Angus Marshall
Angus, tell us a bit about yourself. What is your role, and how long have you been working in digital forensics?

Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.

I’m registered as an expert adviser and most of my recent casework seems to deal with difficult to explain or analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence and as a member of BSI’s IST/033 information security group and the UK’s digital evidence rep. on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042, and contributed to the other investigative and eDiscovery standards.

You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?

It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.

The effect of it is that it looks like there may be a gap in the accreditation process. Rather than having a selection of methods that are known to work correctly (as we see in calibration houses, metallurgical and chemical labs etc. – where the ISO 17025 standard originated) which can be chosen to meet a specific customer requirement, we have methods which satisfy much fuzzier customer requirements which are almost always non-technical in nature because the customers are CJS practitioners who simply don’t express things in a technical way.

We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.

Where do we stand with standardisation in the UK at the moment?

Standardization is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.

ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?

Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.

A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.

In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?

Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.

I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty. When we use a word it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?

It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.

Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?

NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).

Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.

Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.

Is there anything else you’d like to share about the results?

We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.

Things move pretty fast in DF; if we don’t stop and look at what we’re doing once in a while, we’ll miss something important.

Read Angus Marshall’s paper on requirements in digital forensics method definition here.

The hottest topic in digital forensics at the moment, standardisation is on the tip of everyone’s tongues. Following various think pieces on the subject and a plethora of meetings at conferences, I spoke to Angus Marshall about his latest paper and what he thinks the future holds for this area of the industry. You can […]

via Angus Marshall talks about standardisation — scar

Computer Forensics: Data Recovery & Extraction from Scratched Hard Drives via Salvation DATA Blog

[Case Study] Computer Forensics: Data Recovery & Extraction From Platter Scratched Hard Drives
Editor’s note: As a forensic data recovery expert, SalvationDATA receives different data recovery cases every day. Our forensic customers usually turn to us for help when they run into a case they are not able to handle. And among all the data loss situations, platter scratch is one of the most difficult kinds of problem to deal with. So in this issue, let’s see what the correct forensic process is for a platter scratched hard drive.

What is platter scratch?

When platters are damaged, it is usually in the form of scratching caused by debris and/or the read/write heads when they come into contact during the reading-writing process.

This is also known commonly as a head crash, although that term is often mistakenly used by inexperienced individuals to relate to clicking drives or hard drives that need a read/write head replacement.

Once the platters are scratched to a certain degree, this will in turn damage the read/write heads and render the drive unreadable. Oftentimes this results in clicking, scratching, chirping or screeching sounds. However, these sounds don’t automatically mean the platters are scratched.

When the platters are scratched severely in this manner, the drive cannot be recovered and the files and data contained on it are lost forever. This is known as a catastrophic head crash, and most hard drive failure recovery cannot fix it.


How to work with a hard drive with platter scratch?

Is platter scratch truly unrecoverable? Actually, if the scratches on the platter surface are not too severe, it is sometimes still possible to recover and extract the data, as long as we strictly follow operating procedures:

1. Stop attempting to read data immediately, to avoid further unrecoverable damage.
2. Open the hard drive in a dust-free environment and inspect it for damage.
3. Remove the damaged read/write head and replace it with a healthy head. The donor head must be selected by strict matching rules; for example, for a Western Digital head replacement, the donor drive must match the model number, batch ID, FW version and PN.
4. After repairing the physical damage, we can continue to forensically recover and extract the data from the hard drive with SalvationDATA’s DRS (Data Recovery System).
What tools do you need for this process?

HPE Pro (Head Platter Exchange) is a hard drive repair tool, and the only equipment built to handle head stack and drive motor issues in cases where the drive corruption is caused not by firmware but by the head stack or drive spin motor. With its pioneering platter exchanger, it can prevent the head from suffering further damage or misalignment due to incorrect operations, keeping the user data intact.

DRS (Data Recovery System) is our next generation intelligent all-in-one forensic data recovery tool that can acquire and recover data from both good and damaged storage media like HDD simply and easily.

Forensic Diagnostics

How do we know if the hard drive is fixed and we can continue to the next step? DRS’s disk diagnostics feature helps to solve this problem: DRS is able to scan the source disk in advance. With its fairly new FastCheck technology, it allows a rapid check within 5 seconds, avoiding the risk of secondary damage to an important evidentiary storage device.

Insert the hard disk in DRS, and simply click the one-key Diagnose function to complete the process. DRS will tell you the detailed disk health status in no time!


Forensic Imaging

After repairing the physical damage, the hard drive could still be fragile and prone to failing again. If it is not handled with care, we may permanently lose the opportunity to recover and extract the data. Therefore, it is crucial to first secure the data stored on the hard drive. DRS provides the solution here too: its forensic imaging function secures the evidentiary digital data by creating a physical-level, sector-by-sector duplication of the damaged hard drive. Once finished, the forensic image will be exactly the same as the source data and can be stored safely and analyzed at any time.
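For illustration only, here is a minimal Python sketch of the general idea behind sector-by-sector imaging with bad-sector handling and hash verification. It is not DRS: the sector size, error handling and zero-fill policy are assumptions for the example (real imagers read the sector geometry from the drive and log far more).

```python
import hashlib

SECTOR_SIZE = 512  # assumed; real imagers detect the drive's sector size

def image_disk(source_path, image_path, total_sectors):
    """Copy a drive sector by sector, skipping unreadable sectors.

    Unreadable sectors are written as zero-filled blocks so offsets in the
    image still line up with the source; their numbers are returned along
    with a SHA-256 of the image for later integrity checks.
    """
    bad_sectors = []
    sha256 = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        for sector in range(total_sectors):
            try:
                src.seek(sector * SECTOR_SIZE)
                data = src.read(SECTOR_SIZE)
            except OSError:                 # unreadable (bad) sector
                data = b"\x00" * SECTOR_SIZE
                bad_sectors.append(sector)
            dst.write(data)
            sha256.update(data)
    return sha256.hexdigest(), bad_sectors
```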


Important notes:

• When dealing with a defective hard drive as in this case, it is recommended to use the Advanced Imaging mode in DRS to help bypass bad sectors and extract as much data as possible. Also, remember to set the transmission mode to PIO (low speed) to safely extract the data from such a damaged storage device.
• Before imaging, we can also check the raw hexadecimal data view in DRS Sector View to make sure data on the damaged hard drive is accessible. Professional data recovery engineers can acquire even more information from this sector view.

Forensic Recovery

Now, with all the problems dealt with, we have one final step: recover and extract valuable evidentiary data. Use DRS’s File Recovery & File Carving function to locate and extract important digital files, and generate a forensic report at the end of the process. With DRS’s intelligent recovery technology, investigators can deal with deleted files, formatted partitions, corrupted file systems and many other data loss situations without any professional skill requirements!


Conclusions

Platter scratch is a nightmare for data recovery engineers. However, it is not impossible to recover data from scratched platters. In this issue, we discussed the standard operating procedure for dealing with a hard drive with platter scratch, to maximize the possibility of recovering and extracting valuable evidentiary data. We hope these instructions can help you with your work!
You can also visit our official YouTube channel for more videos: https://www.youtube.com/user/SalvationDataOfficia/featured



via [Case Study] Computer Forensics: Data Recovery & Extraction From Platter Scratched Hard Drives — SalvationDATA Blog

Ethical Dilemmas in Forensics

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.
via: Salem Press Encyclopedia of Science

Definition: Principles of conduct, moral duty, and obligation that guide individuals in their decisions and actions.

Significance: As scientists, forensic scientists have a professional obligation to seek and to speak the truth about matters within their purview. As participants in a forensic process, they are subject to additional, sometimes conflicting, duties. This tension generates many ethical dilemmas.

Although witnesses in American courtrooms are called upon to tell the truth, the whole truth, and nothing but the truth, they may be enjoined from volunteering information. A witness’s individual sense of relevance must often bow to a court’s judgment. The legal system seeks truth, yet it sometimes defers to other values, such as fairness and confidentiality, and in general demands acceptance of formalized rules of procedure. In their capacity as experts, forensic scientists typically enjoy greater latitude than ordinary witnesses in expressing opinions and making judgments in the courtroom, but they too must operate within the often cumbersome and sometimes counterintuitive requirements of the “system” of “justice.”

Entrance sign at the Society of Forensic Toxicologists (SOFT) meeting, July 5, 2012. By Monika M. Wahi (own work), CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0), via Wikimedia Commons

Forensic scientists are measured against a standard of professional integrity, although the professionalization of the scientific study of crime is far from complete. Professions are substantially self-regulating, usually through agreed-upon standards and codes of ethics, and this creates the need for them to articulate appropriate expectations and the responsibility of members of professions both to act correctly themselves and to provide appropriate correction for their errant colleagues. A case in point is William Tobin’s campaign against the chemical analysis of bullet lead, also known as comparative bullet-lead analysis (CBLA).

Tobin’s Exposure of CBLA

CBLA is a technique that the Federal Bureau of Investigation (FBI) used for four decades—the investigation of the assassination of President John F. Kennedy in 1963 was an early use—to make cases against defendants when traditional firearms analysis (that is, examination of barrel rifling on bullets) was not possible. By measuring the proportions of seven trace elements (antimony, arsenic, bismuth, cadmium, copper, silver, and tin) found in the lead of a bullet in evidence, forensic scientists sought to establish the probability that the bullet came from the same provenance as a bullet in the suspect’s possession. The belief that the comparison of the chemical composition of bullets could connect two bullets rested on unexamined assumptions about the similarities and differences of the source lead from which the bullets were cast. FBI experts testified in thousands of cases that the facts ascertainable through CBLA established likely identity and therefore pointed toward the probable guilt of the accused. Sometimes, as in the case of Michael Behm, who was convicted of murder in 1997, CBLA provided essentially the only evidence of guilt.

In the 1990s, FBI metallurgist William Tobin began to question the validity of the technique. He felt strongly enough about the issue to research the matter, after his retirement in 1998, with Lawrence Livermore National Laboratory metallurgist Erik Randich. They analyzed data from two lead smelters in Alabama and Minnesota and discovered that the FBI techniques could not distinguish batches of lead produced months apart. They also discovered that differences existed within single batches. Their research was published in Forensic Science International in July 2002.

Although he still defended the technique, the director of the FBI Laboratory requested that the National Research Council (NRC) of the National Academy of Sciences review CBLA. In February 2004, the NRC report, titled Forensic Analysis: Weighing Bullet Lead Evidence, confirmed that only extremely limited claims could be made about the relationship between bullets based on CBLA. Given the NRC findings, a New Jersey appeals court overturned Behm’s conviction in March 2005. The results of the NRC study have obvious implications for many other cases as well.

In an article titled “Forensic Significance of Bullet Lead Compositions,” which appeared in the Journal of Forensic Sciences in March 2005, FBI research chemists Robert D. Koons and JoAnn Buscaglia argued that “compositional comparison of bullet lead provides a reliable, highly significant point of evidentiary comparison of potential sources of crime-related bullets.” In September of that year, however, the FBI announced that it would no longer use CBLA. (In a curious subsequent development, Tobin and a member of the NRC committee, Clifford Spiegelman, suggested that a reanalysis of the bullet fragments from the Kennedy assassination might be in order.)

An article published in New Scientist in April 2002, quoted Tobin as saying of the interpretation of bullet data based on CBLA, “It offended me as a scientist.” In fact, Tobin has a long record as a critic of FBI procedures he regards as bad science and of testimonial practices he regards as unwarranted by the scientific data. To complain about testimony that unreasonably goes beyond what the data can support is to respond equally to the demands of science and the demands of ethics. It is a feature of commonsense justice that the punishment should fit the crime, and a basic requirement of that, in turn, is that the people who are punished should be guilty. Violating that requirement is both bad science and bad ethics.

Joyce Gilchrist’s Tainted Evidence

Is it enough that the accused be guilty of some crime, or does it have to be the one in question? If the accused is guilty of the crime in question, does it matter whether the evidence actually shows that? The belief that one can convict the guilty by tweaking the evidence a little, or shading one’s testimony a bit, is among the most common sources of unethical (and, often enough, criminal) behavior on the part of forensic scientists. The cautionary tale of former Oklahoma City Police Department forensic scientist Joyce Gilchrist probably falls into this category.

In May 2007, Curtis Edward McCarty, who was facing his third trial for a 1982 murder, was freed as the result of the improper handling and representation of hair evidence by Gilchrist, who apparently had tried to frame McCarty. The judge dismissed the charge despite her belief that McCarty was probably not completely innocent. This was merely the latest in a series of episodes involving Gilchrist.

Questions about the integrity of Gilchrist’s work began as early as January 1987, when a Kansas City colleague, John Wilson, complained about her to the Southwestern Association of Forensic Scientists, without result. In 1998, Robert Miller was exonerated after he had been convicted a decade earlier based in part on Gilchrist’s testimony regarding blood, semen, and hair evidence. In 1999, Gilchrist was criticized by a judge for having given false testimony (regarding semen evidence) in the rape/murder trial of Alfred Brian Mitchell in 1992. In the spring of 2000, Jeffrey Todd Pierce was ordered released after he had served a decade and a half for a rape he did not commit; he had been convicted based on Gilchrist’s testimony. In January 2001, Gilchrist was criticized for the various judicial reprimands and professional critiques her work had received. In August 2001, doubts were raised about the guilt of Malcolm Rent Johnson, who had been executed for a 1981 rape and murder; Johnson was convicted based on Gilchrist’s testimony.

A month later, in September 2001, Gilchrist was finally fired, after years of reputedly shoddy forensics work, including both mishandling and misrepresentation of evidence, on many cases in addition to those noted above. The world of criminal justice contains innumerable isolated instances of perverse idealism, self-serving cynicism, and simple incompetence, but Gilchrist is one of the most striking cases of flagrant disregard for ethics in the forensics community. Was she genuinely convinced of the guilt of those against whom she testified? (She was certainly persuasive to juries.) Was she cynically distorting her testimony, and the evidence, to help prosecutors gain convictions, or was she just incompetent?

Ethics of Competence

One may well agree with forensics ethicist Peter D. Barnett’s remark that “there is a certain baseline level of competence that every criminalist is expected to understand, and there are certain procedures and protocols that are so fundamental to the practice of criminalistics that failure to follow them is evidence of gross incompetence or malfeasance, which is unethical.” As Barnett himself notes, however, “in the practice of forensic science, the disparate educational and experiential backgrounds of workers in the field make determination of a baseline level of competence relatively difficult.”

This is a problem throughout the American criminal justice system. In June 2007, all sergeants in the New Orleans Police Department were required to attend a four-day seminar to learn how to improve their (and their subordinates’) writing of police reports. This was part of an attempt to smooth out conflicts between the department and the New Orleans district attorney’s office, which claimed that part of its difficulty in prosecuting criminals stemmed from “incomplete or vague reports” by officers. More generally, criminalists frequently lament that frontline officers are not more skilled in observing, protecting, collecting, and preserving crime scene evidence.

One certainly can (in theory) impose reasonable expectations about competence and development in forensic science. However, that is not made easy by the variety of educational backgrounds and practical experience of the people who actually work in the field. In an unflattering assessment published in 2005, Jane Campbell Moriarty and Michael J. Saks bluntly asserted that “in the forensic sciences . . . 96 percent of practitioners hold bachelor’s degrees or less.” They went on to note:

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.

Moreover, as with the difficulty in defining the qualifications for expert testimony, the fact that crime fighting is not a natural kind of expertise has an impact. Almost any expert might be relevant to a criminal case, depending on circumstances. Given the diverse forms of knowledge relevant to the application of science to crime solving, and to the providing of suitable expert testimony, it may be that the only truly unifying factor is the application of the so-called scientific method, broadly understood as intellectual integrity—the determined effort, as physicist Richard P. Feynman put it, not to fool oneself (or others).

What is impressive about the case of William Tobin is his determination to ensure that his colleagues (or former colleagues) not testify to more than the data warrant, both out of scientific integrity and out of fairness to those whose lives are affected by what scientists say. What is appalling about the case of Joyce Gilchrist is the stubbornness of her effort to resist correction by colleagues or even by the seemingly obvious limits of the evidence itself. Sometimes the individual needs to correct the group, by exposing a bogus or complacent consensus; sometimes the group needs to correct the individual, by identifying willful deception or self-centered fantasy. Unfortunately, no formula exists to guarantee the right result, and that is why ethics remains a constant challenge to conscientious souls.


Related Information

  • American Academy of Forensic Sciences (AAFS)
  • American Society of Crime Laboratory Directors (ASCLD)
  • Brain-wave scanners
  • Criminal personality profiling
  • DNA database controversies
  • Ethics of DNA analysis
  • Expert witnesses in trials
  • Forensic journalism
  • Innocence Project
  • Interrogation in criminal investigations
  • Training and licensing of forensic professionals
  • Truth serum in interrogation

Last reviewed: October 2016

Bibliography

Barnett, Peter D. Ethics in Forensic Science: Professional Standards for the Practice of Criminalistics. Boca Raton: CRC, 2001. Print.

Inman, Keith, and Norah Rudin. Principles and Practice of Criminalistics: The Profession of Forensic Science. Boca Raton: CRC, 2001. Print.

Lucas, Douglas M. “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits.” Journal of Forensic Sciences 34 (1989): 719–29. Print.

Macklin, Ruth. “Ethics and Value Bias in the Forensic Sciences.” Journal of Forensic Sciences 42 (1997): 1203–206. Print.

Moriarty, Jane Campbell, and Michael J. Saks. “Forensic Science: Grand Goals, Tragic Flaws, and Judicial Gatekeeping.” Judges’ Journal 44.4 (2005): 16–33. Print.

Peterson, Joseph L., and John E. Murdock. “Forensic Science Ethics: Developing an Integrated System of Support and Enforcement.” Journal of Forensic Sciences 34 (1989): 749–62. Print.

Derived from: “Ethics.” Forensic Science. Salem Press. 2009.