Repost of Day 58/67: Five Month GED, Graphing via Slope-Intercept Form, and Forensic Science Continuous Learning: Project Do Better

We use rate of change every day, for instance in transporting ourselves and the things we need, perhaps without even recognizing it, but what else …
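For one concrete illustration (my own example, not from the lesson itself): driving at a steady 30 miles per hour, starting 5 miles from home, puts your distance in slope-intercept form, where the slope is exactly the rate of change:

```latex
y = mx + b \qquad\Longrightarrow\qquad d = 30t + 5
% m = 30 miles per hour is the rate of change (the slope);
% b = 5 miles is the starting value (the intercept).
% After t = 2 hours: d = 30(2) + 5 = 65 miles.
```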


Forensics: The Research Tools Every Writer Needs via Danielle Adams

A staple location in any mystery is the crime scene. You know, the one where cops and forensic scientists take photos and collect evidence. It’s that evidence that helps our detective crack the case. In other words, forensics is integral to any mystery or crime novel.

The only problem is that the task of collecting and analyzing evidence is usually grossly misrepresented by Hollywood. It is also detailed work, which makes getting it right in your novel paramount.

To make things more interesting, there are numerous sub-fields within the discipline, so you’ll need to craft different characters for each role. Thankfully, there are writers out there who have blazed a trail for you and amassed numerous resources for us to use.

What is Forensics?

Before we go any further into the resources and how to use forensics in our writing, let’s define what this scientific discipline is:

Forensics: The application of the methods of the natural and physical sciences to matters of criminal and civil law. 

According to Britannica, forensic science is involved in criminal investigation and prosecution and civil wrong cases, such as willful pollution or industrial injuries.

The Different Types

And if you think about it, there are many different types of sciences used in forensics. For instance, here are 14 types of forensic science:

  • Trace Evidence Analysis
  • Toxicology
  • Podiatry
  • Odontology
  • Linguistics
  • Geology
  • Entomology
  • Engineering
  • DNA Analysis
  • Botany
  • Archaeology
  • Anthropology
  • Digital Forensics
  • Ballistics

And that’s not all of them. There are more, and what you use in your story will depend on the type of crime your character is investigating.

This can feel a bit daunting when you think about the research you’ll have to do into each branch. There are ways around this, but you should know the basics about evidence collection, who collects it, and how it’s analyzed.

Forensics Research Resources

When writing our detective story or police procedural, our focus is on the investigator of the crime. We aren’t looking closely at our secondary characters, which means we may not be doing our research.

And if we aren’t doing proper research, we might be assigning our detective duties that fall outside their role. Before we go into anything else, I want you to look at this infographic from Rasmussen College on who is present at a crime scene.

Who's who on a crime scene infographic. You need to know which forensics experts are present at the beginning.

Aside from the police officers on-site, you also have a forensic photographer taking pictures of the evidence. Your medical examiner and forensic pathologist are there to figure out how and when the victim died and to gather any evidence to help confirm these details later. And finally, you have a forensic science technician who collects all the evidence from the crime scene and classifies it so it goes to the right people back at the lab.

Just because several people are on the crime scene doesn’t mean it’s a free-for-all. There is a strict hierarchy of who is allowed on the scene at any given time.

Also, make sure that you’re up on the latest technology to lift fingerprints, collect DNA, etc. Here’s an infographic from eLocal Lawyers to get you started:

Get to Know Your Forensics Field

Each branch of forensics deals with a particular part of an investigation. For example, ballistics deals with guns, odontology deals with dental work, etc. If you are using a specific type of evidence, like dental work, look into the following things:

  • What evidence does that branch examine?
  • How is that evidence collected?
  • How long does it take to analyze the evidence?
  • How does the evidence support the investigation and prosecution of the crime?
  • What do negative results and successful results look like?
  • What’s real, and what’s made up by Hollywood?

Here’s an infographic that explains the myths surrounding DNA evidence by Criminologia:

Some other authors have also created in-depth articles about specific forensic fields, like this one from the Creative Penn.

Other Resources

Besides knowing who does what, you need to look into several other avenues to make your fiction as realistic as possible. Here’s how you can do that:

Read some books.

The best way to find some things to look into is by reading within your genre. And if you want a thoroughly entertaining read from a forensic scientist, then you can’t go wrong with Kathy Reichs, the author behind the Bones TV show.

I’ve linked you to her about page. It tells you about all of her real-life experience in the field, so she knows what she’s talking about. The about page also links to numerous institutions that contain valuable, credible research information that you can use in your story.

You can also turn to nonfiction books and articles written for writers, as well as textbooks for forensics students, biographies, true crime stories, etc.

Check the Internet or Your Local Library

Libraries, especially your local university or college libraries, are fantastic resources for finding information. These libraries house academic texts and often have specialized collections and rare materials.

The college library is also the place to research articles from professional, scientific, and academic journals. These journals are a great way to find information about the latest news, breakthroughs, theories, and research within the various forensic specialties.

Additionally, your friendly search engine is an excellent way for you to start amassing your library of links on forensic science sites. You want to look for websites affiliated with educational institutions, media outlets that investigate and cover forensic issues, professional forensics organizations, law enforcement agencies, and experienced forensic investigators.

Talk to the Experts

The best place for you to find the information you need is to go straight to the source: the forensic experts. That means you’ll need to leverage your networking skills to find someone. You can also contact professors from your local college or university.

And if you don’t want to talk to someone, you can flip through some forensic science magazines to read interviews and articles on the subject. Or to get contact information so you can speak to them. (It’s still your best bet to get the information you need.)

Forensic specialists are busy people, so it may be worth your time to learn how to conduct an interview (AKA, ask pointed questions to get what you need). Here are some resources for helping you hone your interviewing skills:

Remember to thank your expert for their time and send a follow-up thank-you note.

Legal Ramifications

Your story might not cover the case’s prosecution, but it’s good to know how forensic evidence is used in court and what types of evidence are seen as “more concrete” than others.

What can affect your story is the collection of evidence from a suspect’s home or person. You may want to look into the legal documents or statements that police and investigators need before they begin taking objects into evidence.

How to Write a Forensics Novel

Now that we have the research portion of our writing process out of the way, we need to focus on putting it into action. So without further ado, here is what you need to think about when adding forensics to your story:

Setting up your crime scene.

You need to give this some thought on two fronts. First, you need to provide clues that will help your protagonist solve the crime. You also need to think about what your antagonist will do to cover their tracks.

Here’s an infographic on some ways your murderer may cover his tracks:

10 Ways to Cover Up a Murder Infographic with forensics

Why does this matter?

Because whatever your criminal does will leave behind certain types of evidence – your red herrings. These are a must-have in a mystery or thriller because they help create suspense and keep your readers guessing until the end.

Put it in writing.

Earlier in this post, I told you there was a way around creating a new character for every forensic expert your protagonist interacts with. And the best way you can do that is by giving your protagonist reports to read.

And this is where you can have some fun. You can have your protagonist talk to their partner about the results, or you can format your page to look like a report. (It could look a little like this.)

Focus on one aspect of forensics.

I’m not saying that you need to focus on only one type of forensics; still use common evidence types, like fingerprinting and DNA, but focus on an aspect you want to explore. Once you do that, exploit it for all the drama you can get out of it.

A common thing to focus on in true crime accounts or other crime stories is the killer or criminal’s psyche. People find this fascinating, and you can play into that to make your story more dramatic and suspenseful. Whatever you choose, ensure it is a central theme or piece of evidence for your novel.

Don’t commit to a specific time of death.

As author C.S. Lakin puts it:

Many mortis factors are considered when estimating time of death. Temperature is the biggie, followed by body mass.

A dead body will naturally adjust temperature (algor) to achieve equilibrium with its surroundings and will display time-telling factors, such as muscle stiffening (rigor), blood settling (livor), color (pallor), and tissue breakdown (decomp). The presence of toxins also effects body changes. Cocaine amplifies the mortis process, while carbon monoxide retards it. Be careful in getting your forensic guru to commit to a specific time.
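As a rough illustration of why temperature is “the biggie,” here is a classic rule of thumb (the Glaister equation, which assumes a normal starting temperature and average conditions; the factors Lakin lists all shift it, which is exactly why the estimate should stay vague):

```latex
\text{hours since death} \approx \frac{98.4^{\circ}\text{F} - \text{measured body temperature}}{1.5}
% Example: a body at 89.4 F suggests (98.4 - 89.4) / 1.5 = 6 hours,
% but only under assumed average conditions.
```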

The answer is in forensics.

Or the smallest of details. Author Sue Coletta highlighted this beautifully in her post about writing realistic crime scenes. She provides her readers with two cases and alters the crime scene in a small way.

When the officers find a small changed detail, they can solve the crime and arrest the appropriate person. That’s why many TV shows and novels have the detective go through all the evidence again to find that one small thing they missed.

And you can do this for your novel as well. If you want to, that is. You can always try to come up with a new way to bring the perpetrator to justice.


Forensics is an essential part of any investigation. It helps our detective find out who the killer is and bring them to justice.

Forensic details are also nasty little things that can make or break your reader’s suspension of disbelief if they’re not well-researched. Hopefully, I’ve made your research easier by providing you with a list of resources for you to use.

And don’t forget to pay attention to how your criminal will cover their tracks. It could be a small detail that’s their undoing.

Why are we fascinated with forensics and crime? Any Bones fans out there? Did I miss anything, or do you have any more tips? Please let me know in the comments below!

Stay safe, everyone.

Until next time.

Cheers,

Danielle


Forensics: The Research Tools Every Writer Needs — Danielle Adams

Intro to Fingerprints via Forensic’s blog

Fingerprints- An Introduction

By @forensicfield

INTRODUCTION

Everything we touch, we leave behind our unique impression on it: our fingerprints.

No two people have exactly the same fingerprints. Even identical twins, with identical DNA, have different fingerprints.

Fingerprint identification is also known as “dactyloscopy.”

Fingerprints are the tiny ridge, whorl, and valley patterns on the tip of each finger. They develop from pressure on a baby’s tiny, developing fingers in the womb.

CLASSIFICATION OF FINGERPRINTS

🧤By FRANCIS GALTON
The well-known British scientist Sir Francis Galton published his first book on fingerprints in 1892. His important work includes a method of classification that divides fingerprints into three groups:
a. Arch
b. Loop
c. Whorl

🧤 By WILLIAM J. HERSCHEL
While working for the East India Company in Bengal, India, Sir William James Herschel first used fingerprints on native contracts. After a decade, he had accumulated a file of fingerprints.

🧤 By EDWARD HENRY
The Henry Classification System was accepted as common practice throughout England and its territorial holdings, and in the United States.
Under the Henry system, fingerprints are divided into two classes:
•Those which are given a numerical value (whorls and composites).
•Those which are not given a numerical value (loops and arches).

🖐 All patterns are divided as follows (a small code sketch follows this list):

The Henry classification system assigns each finger a number according to its position on the hands, beginning with the right thumb as number 1 and ending with the left little finger as number 10.
• The system also assigns a numerical value to fingers that contain a whorl pattern; fingers 1 and 2 each have a value of 16,
• Fingers 3 and 4 = 8,
• Fingers 5 and 6 = 4,
• Fingers 7 and 8 = 2,
• Final two fingers = 1.
• Fingers with a non-whorl pattern, such as an arch or loop pattern, have a value of zero.
• The sum of the even finger values is then calculated and placed in the numerator of a fraction.
• The sum of the odd finger values is placed in the denominator.
• A value of 1 is added to each sum, with the maximum obtainable on either side of the fraction being 32.
• Thus, the primary classification is a fraction between 1/1 and 32/32, where 1/1 indicates no whorl patterns and 32/32 means that all fingers have whorl patterns.
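To make the arithmetic concrete, here is a minimal sketch of the primary classification in Python (my own illustration, not from the original post; the input format is an assumption):

```python
# A minimal sketch of the Henry primary classification described above.
# Fingers are numbered 1-10 (right thumb = 1 ... left little finger = 10);
# representing the hand as a dict of pattern names is an assumption.

WHORL_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

def primary_classification(patterns):
    """patterns maps finger number (1-10) to 'whorl', 'composite', 'loop', or 'arch'."""
    whorled = {f for f, p in patterns.items() if p in ("whorl", "composite")}
    even = sum(WHORL_VALUES[f] for f in whorled if f % 2 == 0)
    odd = sum(WHORL_VALUES[f] for f in whorled if f % 2 == 1)
    # 1 is added to each sum, so the ratio runs from 1/1 to 32/32.
    return f"{even + 1}/{odd + 1}"

# No whorls at all -> "1/1"; whorls on every finger -> "32/32".
print(primary_classification({f: "loop" for f in range(1, 11)}))   # 1/1
print(primary_classification({f: "whorl" for f in range(1, 11)}))  # 32/32
```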

🧤 By JUAN VUCETICH
Vucetich is credited with the first positive criminal identification: in 1892, he was able to extract a set of prints off a door and thus identify a woman as the culprit in a double homicide.

✴ CHARACTERISTICS OF FINGERPRINT

🖐 Class Characteristics
Class characteristics are the characteristics that narrow the print down to a group but not an individual.

The Three Fingerprint Class Types Are;
1. Arches:
✔Arches are the simplest type of fingerprint, formed by ridges that enter on one side of the print and exit on the other. No deltas are present.
✔About 5% of the world’s population has arch patterns.

Arch pattern

2. Loops:
✔Loops must have one delta and one or more ridges that enter and leave on the same side. These patterns are named for their positions relative to the radius and ulna bones.
✔About 60–65% of the world’s population has loop patterns.

Loop pattern

3. Whorls:
✔Whorls have at least one ridge that makes (or tends to make) a complete circuit. They also have at least two deltas.
✔About 30–35% of the world’s population has whorl patterns.

Whorl Pattern

👍 Individual characteristics:
👉Individual characteristics are those characteristics that are unique to an individual.
👉They are tiny irregularities that appear within the friction ridges and are referred to as Galton’s details.

Individual Characteristics of Fingerprint



via Fingerprints- An Introduction — Forensic’s blog

Global Standardization of Forensics will Decrease the Bias Factor of Evidence Collection Procedures and Court Rulings

Interviews – 2018

Angus Marshall, Digital Forensic Scientist

via Angus Marshall
Angus, tell us a bit about yourself. What is your role, and how long have you been working in digital forensics?

Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.

I’m registered as an expert adviser and most of my recent casework seems to deal with difficult to explain or analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence and as a member of BSI’s IST/033 information security group and the UK’s digital evidence rep. on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042, and contributed to the other investigative and eDiscovery standards.

You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?

It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.

The effect of it is that it looks like there may be a gap in the accreditation process. Rather than having a selection of methods that are known to work correctly (as we see in calibration houses, metallurgical and chemical labs etc. – where the ISO 17025 standard originated) which can be chosen to meet a specific customer requirement, we have methods which satisfy much fuzzier customer requirements which are almost always non-technical in nature because the customers are CJS practitioners who simply don’t express things in a technical way.

We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.

Where do we stand with standardisation in the UK at the moment?

Standardization is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.

ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?

Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.

A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.

In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?

Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.

I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty. When we use a word it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?

It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.

Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?

NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).

Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.

Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.

Is there anything else you’d like to share about the results?

We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.

Things move pretty fast in DF; if we don’t stop and look at what we’re doing once in a while, we’ll miss something important.

Read Angus Marshall’s paper on requirements in digital forensics method definition here.


via Angus Marshall talks about standardisation — scar

Computer Forensics: Data Recovery & Extraction from Scratched Hard Drives via SalvationDATA Blog

[Case Study] Computer Forensics: Data Recovery & Extraction From Platter Scratched Hard Drives

Editor’s note: As a forensic data recovery expert, SalvationDATA receives different data recovery cases every day. Our forensic customers usually turn to us for help when they run into a case they are not able to handle. And among all the data loss situations, platter scratch is one of the most difficult kinds of problem to deal with. So in this issue, let’s see what the correct forensic process is for a platter-scratched hard drive.

What is platter scratch?

When platters are damaged, it is usually in the form of scratching caused by debris and/or the read/write heads when they come into contact with the platters during the reading-writing process.

This is also known commonly as a head crash, although that term is often mistakenly used by inexperienced individuals to refer to clicking drives or hard drives that merely need a read/write head replacement.

Once the platters are scratched to a certain degree, this will, in turn, damage the read/write heads and render the drive unreadable. Oftentimes this results in clicking, scratching, chirping, or screeching sounds. However, these sounds don’t automatically mean the platters are scratched.

When the platters are scratched beyond that degree, the drive cannot be recovered; the files and data contained on the drive are lost forever. This is known as a catastrophic head crash, and most hard drive failure recovery cannot fix it.

SalvationDATA Computer Forensics Scratched Platters

How to work with a hard drive with platter scratch?

Is platter scratch truly unrecoverable? Actually, if the scratches on the platter surface are not too severe, there is still a possibility of recovering and extracting the data, as long as we strictly follow operating procedures:

1. Stop attempting to read data immediately, to avoid further unrecoverable damage.
2. Open the hard drive in a dust-free environment and inspect it for damage.
3. Remove the damaged read/write head and replace it with a healthy head. The donor head must be selected by a strict matching rule; for example, for a Western Digital head replacement, the donor drive must match the model number, batch ID, firmware version, and PN (a small sketch of this check follows the list).
4. After repairing the physical damage, we can continue to forensically recover and extract the data from this hard drive with SalvationDATA’s DRS (Data Recovery System).
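As a toy illustration of the strict matching rule in step 3 (the field names here are my assumptions, not SalvationDATA’s actual schema):

```python
# A toy sketch of the donor-matching rule for head replacement: every
# identifier must agree between the patient and donor drives.
# Field names are assumptions for illustration.
REQUIRED_MATCHES = ("model_number", "batch_id", "fw_version", "pn")

def compatible_donor(patient: dict, donor: dict) -> bool:
    return all(patient[key] == donor[key] for key in REQUIRED_MATCHES)
```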
What tools do you need for this process?

HPE Pro (Head Platter Exchange) is a hard drive repair tool, the only equipment built to handle head stack and drive motor issues in cases where the drive corruption is caused not by firmware but by the head stack or drive spin motor. With its pioneering platter exchanger, it can prevent the head from further damage or misalignment due to incorrect operations, keeping the user data intact.

DRS (Data Recovery System) is our next generation intelligent all-in-one forensic data recovery tool that can acquire and recover data from both good and damaged storage media like HDD simply and easily.

Forensic Diagnostics

How do we know if the hard drive is fixed and we can continue to the next step? DRS’s disk diagnostics feature helps solve this problem: DRS is able to scan the source disk in advance. With its fairly new FastCheck technology, it allows a rapid check within 5 seconds, avoiding the risk of secondary damage to an important evidentiary storage device.

Insert the hard disk in DRS, and simply click the one-key Diagnose function to complete the process. DRS will tell you the detailed disk health status in no time!


Forensic Imaging

After repairing the physical damage, the hard drive can still be fragile and prone to failing again. If it is not handled with care, we may permanently lose the opportunity to recover and extract the data. Therefore, it is crucial to first secure the data stored on the hard drive. DRS also provides the solution: the forensic imaging function of DRS secures the evidentiary digital data by creating a physical-level, sector-by-sector duplicate of the damaged hard drive. Once finished, the forensic image is exactly the same as the source data and can be stored safely and analyzed at any appropriate time.
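To illustrate what sector-by-sector imaging with bad-sector handling involves in general terms, here is a minimal sketch; this is not DRS itself, and the device path, block size, and zero-fill policy are my assumptions:

```python
# A minimal sketch of sector-by-sector imaging: read the source device one
# sector at a time, zero-fill unreadable sectors, and hash the image so it
# can later be verified. Not DRS; block size and paths are assumptions.
import hashlib

SECTOR = 512

def image_device(src_path, dst_path):
    sha256 = hashlib.sha256()
    with open(src_path, "rb", buffering=0) as src, open(dst_path, "wb") as dst:
        offset = 0
        while True:
            try:
                block = src.read(SECTOR)
            except OSError:
                # Bad sector: record it, write zeros, and skip past it.
                print(f"bad sector at offset {offset}")
                block = b"\x00" * SECTOR
                src.seek(offset + SECTOR)
            if not block:
                break
            dst.write(block)
            sha256.update(block)
            offset += len(block)
    return sha256.hexdigest()  # store this alongside the image

# Usage (hypothetical device path): image_device("/dev/sdb", "evidence.dd")
```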


Important notes:

• When dealing with a defective hard drive, as in this case, it is recommended to use the Advanced Imaging mode in DRS to help bypass bad sectors and extract as much data as possible. Also, remember to set the transmission mode to PIO (low speed) to safely extract the data from such a damaged storage device.
• Before imaging, we can also check the raw hexadecimal data view in DRS Sector View to make sure the data on the damaged hard drive is accessible. Professional data recovery engineers can acquire even more information from this sector view.

Forensic Recovery

Now, with all the problems dealt with, we have one final step: recover and extract the valuable evidentiary data. Use DRS’s File Recovery & File Carving function to locate and extract important digital files, and generate a forensic report at the end of the process. With DRS’s intelligent recovery technology, investigators can deal with deleted files, formatted partitions, corrupted file systems, and many other data loss situations without any professional skill requirements!
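As a general illustration of what file carving does (a minimal sketch, not DRS’s implementation: it scans a raw image for JPEG start and end signatures; real carvers handle fragmentation, validation, and many more formats):

```python
# A minimal sketch of file carving: find JPEG signatures in a raw image
# and extract the bytes between the start and end markers.
# File names here are assumptions.
JPEG_START = b"\xff\xd8\xff"
JPEG_END = b"\xff\xd9"

def carve_jpegs(image_path):
    with open(image_path, "rb") as f:
        data = f.read()
    carved, pos = [], 0
    while (start := data.find(JPEG_START, pos)) != -1:
        end = data.find(JPEG_END, start)
        if end == -1:
            break
        carved.append(data[start:end + 2])
        pos = end + 2
    return carved

for i, blob in enumerate(carve_jpegs("evidence.dd")):
    with open(f"carved_{i}.jpg", "wb") as out:
        out.write(blob)
```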


Conclusions

Platter scratch is a nightmare for data recovery engineers. However, it is not impossible to recover data from scratched platters. In this issue, we discussed the standard operating procedure for dealing with a platter-scratched hard drive, to maximize the possibility of recovering and extracting valuable evidentiary data. We hope the instructions we provide can help you with your work!
You can also visit our official YouTube channel for more videos: https://www.youtube.com/user/SalvationDataOfficia/featured



via [Case Study] Computer Forensics: Data Recovery & Extraction From Platter Scratched Hard Drives — SalvationDATA Blog

Autopsy of a Dill Pickle-Introductory Lab for Anatomy or Forensics!

A Pickle Autopsy? YES!

If you teach Anatomy & Physiology, you know the struggle of the first unit…. it’s HUGE!! … and jam-packed with things that are absolutely essential for students to know in order to be successful in the course.  I usually struggle with finding activities to review the body cavities and directional terms.  This year, someone suggested using the pickle autopsy and I’m so glad I did!

The lab I used was published in The Forensic Teacher and would be appropriate for either discipline (I teach both this year). Here is the link to the lab I used: http://www.theforensicteacher.com/Labs_files/picklelabsheets.pdf A clever fellow teacher friend came up with the storyline that there was a gang war between the Claussens and the Vlasics in the fridge that resulted in no survivors. I loved it, so I also used that storyline to frame my lab.

Set Up– The Basics

Now that I had my lab picked out and my story to tell, I had to figure out the logistics of how to get everything set up.

First, the pickles….


I found the big jars of dills at Walmart for $5.97 each. The smaller pickles I got because I wanted some of my “victims” to be pregnant (or they could also be small children pickles lol).  I had a hard time estimating how many pickles were in the big jars, but these 2 had a total of 33 pickles– more than enough for my classes. The picture below shows them separated by “male” and “female” victims (my “male” pickles are the ones with the stems lol).

Here are all the supplies I used for the lab.

How to make them look like victims….

I glued wiggly eyes onto thumbtacks for their eyes (so I can reuse them).

I also used pellets that go in pellet guns for bullet wounds (I smashed them a little with the hammer first and dipped them into gel food coloring before I stuck them in the “victims”).

I made their heads from an olive stuck on a toothpick– some I even squished so their “brains” fell out a little lol.  I also gave all of them a “spine” (a toothpick on the dorsal side just under the skin).  I also broke several of the toothpicks so this “injury” might be discovered and included in the story of their “victim”.

All the “victims” had a bead implanted in the vicinity of their heart.  If the bead was red, they had a normal heart.  If it was black or dark purple, it represented a heart attack.  I found that if you make a slit on the side of the pickle (choose a wrinkle), it will often be completely unnoticeable and students will wonder how in the world you got those beads in there!  I also slipped in a small green bead in the neck region of a few of the “victims” and told my students I heard that some of the gang members involved in the war were caught raiding the grapes from the fridge and several choked on them when their leader caught them.

I also told them that the gang members were not healthy and many had various diseases and disorders because they didn’t take care of themselves.  Many had white beads implanted in various areas.  These beads represented a tumor in the particular area.  Knotted pieces of rubber bands in the abdominal region represented parasites.  Many had broken toothpick “limbs”.  I also had several who were pregnant.

This is the sheet of “Helpful Hints” I gave my students with their lab.

A Snapshot of My “Victims”

I separated my “victims” into 4 general types based on their cause of death:

  1. Trauma or internal bleeding (Stabbed or gunshot, injected with red food coloring)
  2. Poisoning/ Drug Overdose (I soaked them in baking soda but didn’t get a very good result)
  3. Heart Attack (black bead instead of red bead in chest)
  4. Drowning (blue food coloring injected in chest area)

 

My “victims” had multiple things that could have resulted in their deaths, but having 4 major things just helped me keep it organized. I also put them in separate dishes while I plotted their demise 🙂

I also kept them separate in labeled gallon ziplock bags to transport them to school.

The Lab Set Up

I set my lab up as a mini crime scene.  I had some fake vampire blood from my forensics class that I also added to help set the scene.  I also added in some extra plastic swords and pellets around the “victims”.  (I let my students pick their own “victim” from the scene).

Group Jobs

Students were in a lab group of 3 per “victim”.  In my lab, every student in the group has a specific job and job description.  It just helps my lab groups run more smoothly and tends to decrease the possibility that one student does the lion’s share of work.  These are the jobs I gave my groups for this lab.

My Take on the Pickle Autopsy Lab

Would I use it again? Absolutely!  My students became very proficient at actually using the directional terminology and identifying the body cavities that we talked about in class.  I heard many meaningful conversations within the groups… “That’s a break in his arm that’s intermediate between the shoulder and the elbow” “I think this sword went through the abdominal cavity and not the thoracic cavity”…. This was so much better than hearing them try to memorize a diagram or a chart of the directional terms!

They loved getting into our “gang warfare” story.  I had them fill out a Coroner’s Report detailing the abnormalities they found both in, and on their “victim”, as well as the location of these abnormalities.  Then, they had to determine the cause of death for their victim, supporting their opinion with specific details from their autopsy.  At all times within their report, they had to incorporate correct anatomical terminology.  Finally, they had to create a narrative of what happened to their “victim” based on the findings from their autopsy.  Several groups shared with the class.  It was lots of fun!

 

 


via Autopsy of a Dill Pickle- A Great Introductory Lab for Anatomy or Forensics! — Edgy Instruction

Ethical Dilemmas in Forensics

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.
via: Salem Press Encyclopedia of Science

Definition: Principles of conduct, moral duty, and obligation that guide individuals in their decisions and actions.

Significance: As scientists, forensic scientists have a professional obligation to seek and to speak the truth about matters within their purview. As participants in a forensic process, they are subject to additional, sometimes conflicting, duties. This tension generates many ethical dilemmas.

Although witnesses in American courtrooms are called upon to tell the truth, the whole truth, and nothing but the truth, they may be enjoined from volunteering information. A witness’s individual sense of relevance must often bow to a court’s judgment. The legal system seeks truth, yet it sometimes defers to other values, such as fairness and confidentiality, and in general demands acceptance of formalized rules of procedure. In their capacity as experts, forensic scientists typically enjoy greater latitude than ordinary witnesses in expressing opinions and making judgments in the courtroom, but they too must operate within the often cumbersome and sometimes counterintuitive requirements of the “system” of “justice.”

Entrance sign at the Society of Forensic Toxicologists (SOFT) Meeting, July 5, 2012. By Monika M. Wahi (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons

Forensic scientists are measured against a standard of professional integrity, although the professionalization of the scientific study of crime is far from complete. Professions are substantially self-regulating, usually through agreed-upon standards and codes of ethics, and this creates the need for them to articulate appropriate expectations and the responsibility of members of professions both to act correctly themselves and to provide appropriate correction for their errant colleagues. A case in point is William Tobin’s campaign against the chemical analysis of bullet lead, also known as comparative bullet-lead analysis (CBLA).

Tobin’s Exposure of CBLA

CBLA is a technique that the Federal Bureau of Investigation (FBI) used for four decades—the investigation of the assassination of President John F. Kennedy in 1963 was an early use—to make cases against defendants when traditional firearms analysis (that is, examination of barrel rifling on bullets) was not possible. By measuring the proportions of seven trace elements (antimony, arsenic, bismuth, cadmium, copper, silver, and tin) found in the lead of a bullet in evidence, forensic scientists sought to establish the probability that the bullet came from the same provenance as a bullet in the suspect’s possession. The belief that the comparison of the chemical composition of bullets could connect two bullets rested on unexamined assumptions about the similarities and differences of the source lead from which the bullets were cast. FBI experts testified in thousands of cases that the facts ascertainable through CBLA established likely identity and therefore pointed toward the probable guilt of the accused. Sometimes, as in the case of Michael Behm, who was convicted of murder in 1997, CBLA provided essentially the only evidence of guilt.
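To illustrate the kind of comparison CBLA rested on (a deliberately simplified sketch of my own; the FBI’s actual statistical protocol was more elaborate, and the tolerance below is invented):

```python
# A simplified sketch of a CBLA-style comparison: do two bullets'
# trace-element concentrations agree within a relative tolerance?
# Illustration only; the 10% tolerance and data format are invented.
ELEMENTS = ["antimony", "arsenic", "bismuth", "cadmium", "copper", "silver", "tin"]

def elements_match(bullet_a, bullet_b, rel_tol=0.10):
    return all(
        abs(bullet_a[e] - bullet_b[e]) <= rel_tol * max(bullet_a[e], bullet_b[e])
        for e in ELEMENTS
    )
```

As the next paragraphs describe, Tobin and Randich’s research undercut the inference from such a “match”: lead produced months apart could agree, while lead from a single batch could differ.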

In the 1990s, FBI metallurgist William Tobin began to question the validity of the technique. He felt strongly enough about the issue to research the matter, after his retirement in 1998, with Lawrence Livermore National Laboratory metallurgist Erik Randich. They analyzed data from two lead smelters in Alabama and Minnesota and discovered that the FBI techniques could not distinguish batches of lead produced months apart. They also discovered that differences existed within single batches. Their research was published in Forensic Science International in July 2002.

Although he still defended the technique, the director of the FBI Laboratory requested that the National Research Council (NRC) of the National Academy of Sciences review CBLA. In February 2004, the NRC report, titled Forensic Analysis: Weighing Bullet Lead Evidence, confirmed that only extremely limited claims could be made about the relationship between bullets based on CBLA. Given the NRC findings, a New Jersey appeals court overturned Behm’s conviction in March 2005. The results of the NRC study have obvious implications for many other cases as well.

In an article titled “Forensic Significance of Bullet Lead Compositions,” which appeared in the Journal of Forensic Sciences in March 2005, FBI research chemists Robert D. Koons and JoAnn Buscaglia argued that “compositional comparison of bullet lead provides a reliable, highly significant point of evidentiary comparison of potential sources of crime-related bullets.” In September of that year, however, the FBI announced that it would no longer use CBLA. (In a curious subsequent development, Tobin and a member of the NRC committee, Clifford Spiegelman, suggested that a reanalysis of the bullet fragments from the Kennedy assassination might be in order.)

An article published in New Scientist in April 2002 quoted Tobin as saying of the interpretation of bullet data based on CBLA, “It offended me as a scientist.” In fact, Tobin has a long record as a critic of FBI procedures he regards as bad science and of testimonial practices he regards as unwarranted by the scientific data. To complain about testimony that unreasonably goes beyond what the data can support is to respond equally to the demands of science and the demands of ethics. It is a feature of commonsense justice that the punishment should fit the crime, and a basic requirement of that, in turn, is that the people who are punished should be guilty. Violating that requirement is both bad science and bad ethics.

Joyce Gilchrist’s Tainted Evidence

Is it enough that the accused be guilty of some crime, or does it have to be the one in question? If the accused is guilty of the crime in question, does it matter whether the evidence actually shows that? The belief that one can convict the guilty by tweaking the evidence a little, or shading one’s testimony a bit, is among the most common sources of unethical (and, often enough, criminal) behavior on the part of forensic scientists. The cautionary tale of former Oklahoma City Police Department forensic scientist Joyce Gilchrist probably falls into this category.

In May 2007, Curtis Edward McCarty, who was facing his third trial for a 1982 murder, was freed as the result of the improper handling and representation of hair evidence by Gilchrist, who apparently had tried to frame McCarty. The judge dismissed the charge despite her belief that McCarty was probably not completely innocent. This was merely the latest in a series of episodes involving Gilchrist.

Questions about the integrity of Gilchrist’s work began as early as January 1987, when a Kansas City colleague, John Wilson, complained about her to the Southwestern Association of Forensic Scientists, without result. In 1998, Robert Miller was exonerated after he had been convicted a decade earlier based in part on Gilchrist’s testimony regarding blood, semen, and hair evidence. In 1999, Gilchrist was criticized by a judge for having given false testimony (regarding semen evidence) in the rape/murder trial of Alfred Brian Mitchell in 1992. In the spring of 2000, Jeffrey Todd Pierce was ordered released after he had served a decade and a half for a rape he did not commit; he had been convicted based on Gilchrist’s testimony. In January 2001, Gilchrist was criticized for the various judicial reprimands and professional critiques her work had received. In August 2001, doubts were raised about the guilt of Malcolm Rent Johnson, who had been executed for a 1981 rape and murder; Johnson was convicted based on Gilchrist’s testimony.

A month later, in September 2001, Gilchrist was finally fired, after years of reputedly shoddy forensics work, including both mishandling and misrepresentation of evidence, on many cases in addition to those noted above. The world of criminal justice contains innumerable isolated instances of perverse idealism, self-serving cynicism, and simple incompetence, but Gilchrist is one of the most striking cases of flagrant disregard for ethics in the forensics community. Was she genuinely convinced of the guilt of those against whom she testified? (She was certainly persuasive to juries.) Was she cynically distorting her testimony, and the evidence, to help prosecutors gain convictions, or was she just incompetent?

Ethics of Competence

One may well agree with forensics ethicist Peter D. Barnett’s remark that “there is a certain baseline level of competence that every criminalist is expected to understand, and there are certain procedures and protocols that are so fundamental to the practice of criminalistics that failure to follow them is evidence of gross incompetence or malfeasance, which is unethical.” As Barnett himself notes, however, “in the practice of forensic science, the disparate educational and experiential backgrounds of workers in the field make determination of a baseline level of competence relatively difficult.”

This is a problem throughout the American criminal justice system. In June 2007, all sergeants in the New Orleans Police Department were required to attend a four-day seminar to learn how to improve their (and their subordinates’) writing of police reports. This was part of an attempt to smooth out conflicts between the department and the New Orleans district attorney’s office, which claimed that part of its difficulty in prosecuting criminals stemmed from “incomplete or vague reports” by officers. More generally, criminalists frequently lament that frontline officers are not more skilled in observing, protecting, collecting, and preserving crime scene evidence.

One certainly can (in theory) impose reasonable expectations about competence and development in forensic science. However, that is not made easy by the variety of educational backgrounds and practical experience of the people who actually work in the field. In an unflattering assessment published in 2005, Jane Campbell Moriarty and Michael J. Saks bluntly asserted that “in the forensic sciences . . . 96 percent of practitioners hold bachelor’s degrees or less.” They went on to note:

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.

Moreover, as with the difficulty in defining the qualifications for expert testimony, the fact that crime fighting is not a natural kind of expertise has an impact. Almost any expert might be relevant to a criminal case, depending on circumstances. Given the diverse forms of knowledge relevant to the application of science to crime solving, and to the providing of suitable expert testimony, it may be that the only truly unifying factor is the application of the so-called scientific method, broadly understood as intellectual integrity—the determined effort, as physicist Richard P. Feynman put it, not to fool oneself (or others).

What is impressive about the case of William Tobin is his determination to ensure that his colleagues (or former colleagues) not testify to more than the data warrant, both out of scientific integrity and out of fairness to those whose lives are affected by what scientists say. What is appalling about the case of Joyce Gilchrist is the stubbornness of her effort to resist correction by colleagues or even by the seemingly obvious limits of the evidence itself. Sometimes the individual needs to correct the group, by exposing a bogus or complacent consensus; sometimes the group needs to correct the individual, by identifying willful deception or self-centered fantasy. Unfortunately, no formula exists to guarantee the right result, and that is why ethics remains a constant challenge to conscientious souls.


Related Information

  • American Academy of Forensic Sciences (AAFS)
  • American Society of Crime Laboratory Directors (ASCLD)
  • Brain-wave scanners
  • Criminal personality profiling
  • DNA database controversies
  • Ethics of DNA analysis
  • Expert witnesses in trials
  • Forensic journalism
  • Innocence Project
  • Interrogation in criminal investigations
  • Training and licensing of forensic professionals
  • Truth serum in interrogation

Last reviewed: October 2016

Bibliography

Barnett, Peter D. Ethics in Forensic Science: Professional Standards for the Practice of Criminalistics. Boca Raton: CRC, 2001. Print.

Inman, Keith, and Norah Rudin. Principles and Practice of Criminalistics: The Profession of Forensic Science. Boca Raton: CRC, 2001. Print.

Lucas, Douglas M. “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits.” Journal of Forensic Sciences 34 (1989): 719–29. Print.

Macklin, Ruth. “Ethics and Value Bias in the Forensic Sciences.” Journal of Forensic Sciences 42 (1997): 1203–206. Print.

Moriarty, Jane Campbell, and Michael J. Saks. “Forensic Science: Grand Goals, Tragic Flaws, and Judicial Gatekeeping.” Judges’ Journal 44.4 (2005): 16–33. Print.

Peterson, Joseph L., and John E. Murdock. “Forensic Science Ethics: Developing an Integrated System of Support and Enforcement.” Journal of Forensic Sciences 34 (1989): 749–62. Print.

Derived from: “Ethics.” Forensic Science. Salem Press. 2009.

 

Focused Digital Forensic Methodology — Forensic Focus – Articles

Abstract

Since the end of the 20th century, law enforcement has been facing a rapid increase in computer-related crimes. In the present time, digital forensics has become an important aspect not only of law enforcement investigations, but also of counter-terrorism investigations, civil litigation, and investigating cyber-incidents. Due to rapidly developing and evolving technology, these types of forensic investigations can become complex and intricate. However, creating a general framework for digital forensic professionals to follow during those investigations would lead to successful retrieval of relevant digital evidence. A digital forensic framework or methodology should be able to highlight all the proper phases that a digital forensic investigation would endure to ensure accurate and reliable results. One of the challenges that digital forensic professionals have been facing in recent years is the volume of data submitted for analysis. A few digital forensic methodologies have been developed to provide a framework for the entire process and also offer techniques that would assist digital forensic professionals in reducing the amount of analyzed data. This paper proposes a methodology that focuses mostly on fulfilling the forensic aspect of digital forensic investigations, while also including techniques that can assist digital forensic practitioners in solving the data volume issue.

Focused Digital Forensic Methodology

Modern society has become very dependent on computers and technology to run all aspects of life. Technology has had a very positive impact on humanity, which can be easily proven with a short visit to any hospital, witnessing how computers and technology have become tools used to treat and save lives. However, computers have the indisputable disadvantage of being usable as tools to facilitate criminal activities. For instance, the sexual exploitation of children can be performed using the Internet, which allows criminals to remain anonymous while preying on innocent children. The number of digital-related crimes is increasing, which keeps law enforcement agencies engaged in a constant battle against criminals who use this technology to commit crimes. As a result, digital forensics has become an important part of law enforcement investigations. Digital forensics is not only performed during law enforcement investigations but can also be conducted during the course of civil matters.

The fact that the information obtained from digital forensic investigations can and will be used as evidence during legal proceedings means that the entire process must be performed according to legal standards. Perumal (2009) explained that the legal system requires digital forensic processes to be standardized and consistent. One of the issues that Perumal highlighted was the fact that digital evidence is very fragile and the use of improper methods could potentially alter or eliminate that evidence. A huge number of methodologies have been developed all over the world, and many of them were designed to target a specific type of technology (Selamat, Yusof, and Sahib, 2008). Also, many methodologies were developed to address requirements imposed by the legal system in certain jurisdictions. One of the methodologies that did not base its theory on technology or the law is the Integrated Digital Investigation Process (IDIP) Model. Carrier and Spafford (2003) explained that the IDIP Model is based on the Locard Exchange Principle, which is used to retrieve evidence from physical crime scenes. The IDIP Model uses the idea that when software programs are executed in an electronic environment, electronic artifacts will most likely be created on the underlying device. Those artifacts can be retrieved and analyzed to obtain information about a certain incident or event.

This paper examines different literatures that present different types of digital forensic methodologies. Some of these methodologies have taken the focus away from the forensic aspect of digital forensic investigations. Instead, these methodologies have addressed crime scenes and other processes that are not related to the digital forensic field. Also, much research has been focused on creating solutions to the challenges that digital forensic practitioners are facing when conducting digital forensic investigations. One of the main challenges that multiple literatures have addressed is the constant increase in the volume of data that practitioners are acquiring and examining during investigations. This paper proposes a digital forensic methodology that would allow forensic practitioners to overcome the data volume issue and eliminate the lack of focus found in many methodologies.

Previous Work

As mentioned above, a large number of digital forensic methodologies have been developed around the world. One of the first serious attempts to develop a standardized methodology for digital forensic investigations came in 1995. Pollitt (1995) used processes originally developed for handling physical evidence as the framework for a digital forensic methodology. The author's approach to defining the phases of an investigation was inspired by the factors the legal system considers when evaluating any type of evidence: whether the seizure was properly conducted, whether the evidence was altered in any way, and what methods were used to examine it. The proper performance of all these steps allows any type of evidence to be admitted by the court. Pollitt (1995) developed a methodology consisting of four phases: acquisition, identification, evaluation, and admission as evidence. This methodology focused mostly on the forensic aspect of the investigation and did not extend to processes that other researchers have included in their methodologies, such as preparation, planning, and the search of the physical crime scene.

Many digital forensic methodologies were developed after Pollitt's. Some use different terminology and different sequencing of the phases a forensic practitioner performs during an investigation, but most agree on certain processes related to the forensic aspect of digital forensic investigations (Selamat et al., 2008). Selamat et al. studied 11 different digital forensic methodologies and concluded that all of them share common phases: preservation, collection, examination, analysis, and reporting. In other words, these researchers have agreed on this specific set of phases, and any newly developed methodology can be expected to include similar ones.

According to Ruibin, Yun, and Gaertner (2005), the proper completion of any digital forensic investigation depends directly on conducting processes similar to the ones highlighted by Selamat et al. (2008). Ruibin, Yun, and Gaertner also explained that proper techniques must be used to perform these phases correctly. Such techniques ensure that the authenticity and reliability of the evidence meet the legal standards of the jurisdiction where the methodology is implemented. The term legal is not foreign to digital forensic methodologies, as many researchers have addressed the importance of performing digital forensic investigations under proper legal authorization. For instance, the Integrated Digital Investigation Process (IDIP) Model includes obtaining legal authorization to perform a digital forensic investigation as one of the sub-phases of its Deployment Phase (Baryamureeba and Tushabe, 2004).

The IDIP Model was revised by Baryamureeba and Tushabe (2004) after they noticed some problems with its practicality and clarity. They named the revised version the Enhanced Integrated Digital Investigation Process (EIDIP) Model. In the EIDIP Model, the authors focused on two aspects: the sequence of the phases and clarity regarding the processing of multiple crime scenes. The EIDIP proposed a modification to the deployment phase of the IDIP Model, including investigation of the physical and digital scenes as the path to the confirmation phase. Baryamureeba and Tushabe also added the Traceback phase, which uses information obtained from the victim's machine to trace back to the other machines used to initiate the incident.

Challenges Facing Current Methodologies

Technology has been developing along two axes: hardware and software. Mobile device hardware, for instance, has seen great advances that allow these devices to perform complex operations efficiently. At the same time, the software that runs on mobile devices has undergone equally great advances to support consumer demand. These enormous changes have affected digital forensic methodologies, especially those built around a specific set of technologies. Selamat et al. (2008) explained that many digital forensic methodologies have been developed to target certain devices or a specific type of technology. The main problem with this type of methodology is that rapid changes in the underlying technology render it obsolete.

The legal systems of different jurisdictions view digital forensics differently, and this is reflected in the methodologies used by forensic practitioners in those jurisdictions. According to Perumal (2009), "As computer forensic is a new regulation in Malaysia, there is a little consistency and standardization in the court and industry sector. As a result, it is not yet recognized as a formal 'Scientific' discipline in Malaysia" (p. 40). The jurisdictions most affected are those that have not developed a full understanding of computer forensics and the processes used to preserve and analyze electronic evidence. This lack of understanding also reduces the influence the legal system can have in shaping methodologies and making them acceptable by legal standards.

Another challenge related to digital forensic methodologies has been created by the researchers who develop them. Some researchers have named their methodologies, or the phases within them, using terminology unrelated to digital forensics. For instance, Baryamureeba and Tushabe (2004) named the fourth stage of the EIDIP Model the Dynamite Phase, with Reconstruction and Communication as sub-phases. This lack of clarity in terminology undermines any effort to help groups outside the digital forensic arena better understand the processes used in the field. As explained above, some jurisdictions are still unsure about digital forensics and the evidence obtained from digital forensic investigations. Researchers would therefore benefit from using more modest, familiar terminology when describing digital forensic methodologies; such simplicity would help the legal system gain a better understanding of digital forensic processes.

The challenges currently facing the digital forensic field as a whole can also be seen as challenges to digital forensic methodologies. Among them are the volume of data, encryption, anti-forensic techniques, and lack of standards. These factors are perceived as challenging because they slow the progress of digital forensic investigations. For instance, a fully encrypted computer system with an undisclosed encryption key prevents practitioners from gaining access to the data, which imposes a challenge on the entire digital forensic process. One challenge addressed by many researchers is the volume of data examined by forensic practitioners during an investigation: examining a large amount of information slows down the entire forensic process and disrupts the flow of operations in forensic laboratories.

Volume of Data

The increase in the volume of data has been driven by the growing capacity of electronic storage devices together with the significant decrease in their prices. According to Quick and Choo (2014), digital forensic practitioners have seen a significant increase in the amount of data analyzed in every digital forensic examination. The authors identified three factors contributing to the data volume issue: an increase in the number of electronic devices per investigation, an increase in the storage capacity of those devices, and an increase in the number of investigations requiring digital forensic examination.

The significant increase in the volume of data has placed a great burden on forensic laboratories. Digital forensic practitioners are forced to spend longer examining and analyzing larger data sets, which causes huge delays in completing digital forensic investigations. Quick and Choo (2014) explained that forensic laboratories have backlogs of work caused by the amount of time spent on each examination. Some may argue that the increased amount of data can be an advantage, since consumers are no longer forced to delete anything. However, Quick and Choo (2014) highlighted the seriousness of the data volume issue: "Serious implications relating to increasing backlogs include; reduced sentences for convicted defendants due to the length of time waiting for results of digital forensic analysis, suspects committing suicide whilst waiting for analysis, and suspects denied access to family and children whilst waiting for analysis" (p. 274).

Current Solutions to the Data Volume Issue

Ruibin et al. (2005) offered a way to reduce the amount of data for each case by determining the relevant data from the preliminary investigation. The relevant data for a case can be determined from two sources: the case information and the person investigating the matter. Combining information from both sources helps build a case profile, which can then be used to determine what type of information is relevant to each forensic examination. The solution proposed by Ruibin et al. (2005) did not rely on technology to reduce the amount of data; other proposed solutions, however, suggested using technology to eliminate the data volume issue.

Neuner, Mulazzani, Schrittwieser, and Weippl (2015) proposed a technique that relies on technology to reduce data volume. Their technique is based on file deduplication and file whitelisting, which allow digital forensic practitioners to reduce the number of files that must be reviewed and, in turn, the time needed to complete the entire process. The authors defined the whitelisting technique as comparing a known list of hash values to the evidence and excluding any matches, since the contents of those files have already been determined. The list of hash values can represent either known-safe files or known contraband. The method offered by Neuner et al. (2015) is completely automated and requires no human intervention, meaning no manpower is needed to perform this task. Neither of the solutions explained above is controversial, as both use standard methods to locate evidence. That is not the case, however, with all solutions proposed to solve the data volume issue.
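The mechanics of deduplication and whitelisting are straightforward to illustrate. Below is a minimal Python sketch, not the authors' implementation: it assumes a mounted copy of a forensic image under some directory, and a hypothetical KNOWN_HASHES set of SHA-256 digests that would, in practice, come from a reference collection such as the NIST NSRL hash sets.

    import hashlib
    from pathlib import Path

    # Hypothetical whitelist of SHA-256 digests for known files; a real list
    # would be compiled from a reference source such as the NIST NSRL.
    KNOWN_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # empty file
    }

    def sha256(path: Path) -> str:
        """Hash a file in chunks so large evidence files fit in memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def reduce_file_set(root: Path) -> list[Path]:
        """Return only files worth reviewing: drop whitelisted files and duplicates."""
        seen: set[str] = set()
        to_review = []
        for path in root.rglob("*"):
            if not path.is_file():
                continue
            digest = sha256(path)
            if digest in KNOWN_HASHES:   # whitelisted: contents already known
                continue
            if digest in seen:           # duplicate of a file already queued
                continue
            seen.add(digest)
            to_review.append(path)
        return to_review

Because matching is by cryptographic digest rather than file name, renamed copies of known files are still excluded, which is what makes the approach safe to automate.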

Quick and Choo (2014) explained that many researchers have addressed the data volume issue using the term sufficiency of examination. Those researchers did not focus on reducing the amount of data to be examined; instead, they proposed limiting the search for electronic information to the amount needed to answer the main questions of the investigation. In other words, if a digital forensic practitioner is examining a computer system for evidence in a child pornography investigation, the examination can be stopped once enough evidence has been found to charge the suspect. Not all of the computer system is examined, and practitioners can complete more examinations in a shorter period of time.

Some may argue that a partial examination of the data could lead to missing valuable digital evidence. In the child pornography example above, a full examination might uncover pictures and videos of minors being sexually exploited; those victims could potentially be identified and rescued, and a partial examination could prevent that rescue. Still, the assumption that a full examination might identify other victims does not stand up against the facts presented by Quick and Choo (2014), who highlighted the grievous negative effects of delayed forensic results. These effects fall on suspects who wait a long time for practitioners to conclude their examinations, and equally on victims, who cannot get closure until the examination is concluded.

The Proposed Methodology

Digital forensic practitioners continue to face a great deal of pressure from the challenges explained earlier. These challenges place obstacles in the way of examinations, in turn causing practitioners to spend extensive amounts of time on each investigation. Yet researchers have not invested much effort in creating methodologies that support practitioners by easing this pressure and eliminating these challenges. Some may make the compelling argument that not all challenges can be resolved within a methodology. Data encryption, for instance, prevents practitioners from accessing evidence and is best categorized as a technical challenge; researchers cannot address it in the context of a digital forensic methodology, since methodologies are supposed to address the higher-level processes of an investigation. Even so, researchers can still create methodologies that at least reduce the workload pressure on practitioners.

Selamat et al. (2008) described the Extended Model presented by Ciardhuain (2004) as a complete framework providing clear stages for digital forensic investigations. Ciardhuain (2004) suggested multiple steps that must be taken throughout a digital forensic investigation: awareness, authorization, planning, notification, search and identification of evidence, collection, transport, storage, examination, hypotheses, presentation, proof/defense, and dissemination. Selamat et al. (2008) also noted that the Extended Model includes all five phases agreed upon by many researchers. This means the remaining phases of the Extended Model are not related to digital forensics, and removing them would not affect the authenticity or reliability of the electronic evidence. Some may argue, however, that the Extended Model was not intended to be conducted solely by digital forensic practitioners, as many of its phases can be performed by others. In a criminal investigation, for instance, the authorization phase can be completed by law enforcement personnel to grant forensic practitioners lawful access to the evidence. In that sense the Extended Model is a framework not only for practitioners, but for all investigations that involve digital evidence.

This paper proposes a new methodology, the Focused Digital Forensic Methodology (FDFM), which is capable of eliminating both the data volume issue and the lack of focus found in current digital forensic methodologies. The FDFM is designed to reflect the current workflow of law enforcement and civil investigations. It focuses on ensuring that digital forensic investigations are conducted properly without overloading practitioners with unrelated activities. Further, the FDFM proposes techniques to reduce the volume of data and cut the time required to complete examinations of electronic evidence, especially on larger data sets.

A quick review of the Extended Model leads to the conclusion that it focuses on evidence-related activities that are considered common knowledge in the law enforcement and digital forensic fields. For instance, the transportation and storage activities of the Extended Model are two parts of evidence handling in which law enforcement personnel and digital forensic practitioners are already trained. There is little value in emphasizing that digital forensic practitioners must transport and store the evidence after it has been collected. For that reason, the FDFM excludes aspects that can be considered common knowledge in the digital forensic field, namely those related to the proper handling of evidence.

Phases of the FDFM

The FDFM is designed in a way similar to the IDIP and EIDIP: it is broken down into multiple phases, which are further broken down into sub-phases.

Preparation Phases. The preparation phase applies in two situations: when creating a new digital forensic team and after completing a digital forensic investigation. This phase is mainly aimed at ensuring that digital forensic teams, whatever their mission, are capable of initiating and completing an investigation properly and without problems. As with the EIDIP, the focus in this phase is ensuring that the forensic team is trained and equipped for its assigned mission.

Training. To execute a forensic methodology properly, the forensic team must receive all the training needed to equip every member with the required knowledge. This training covers not only how to collect and search evidence properly but also how to use the tools needed during those processes. For instance, the team should understand the value of preserving data, and should be able to use any software or hardware tools capable of preserving and acquiring data from different platforms.

Equipment. Digital forensic professionals use many tools during investigations, serving different purposes. One function these tools can accomplish is automating different processes during the investigation and completing those jobs in a timely manner. Depending on the main mission of the forensic team, tools and equipment must be available to ensure the proper and fast completion of any digital investigation.

On-Scene Phases. These phases cover the processes carried out when a digital forensic team is called in to assist with executing search warrants or with preserving data for civil litigation.

Physical Preservation. During this phase, digital forensic practitioners ensure that any item that may contain electronic information of value is protected. This protection ensures that no damage is inflicted on any electronic device that could cause the loss of the information within it. For instance, if a digital forensic practitioner finds one of the items sought during the search of a residence, that item must be kept in a location under the control of the searching team, so that no occupant of the residence can damage the device, as they might attempt to do if they believed it contained incriminating evidence.

Electronic Isolation. The other part of preservation is ensuring that the information within the collected device is preserved by eliminating any electronic interference. An item such as a mobile device must be isolated from the cellular network to ensure that no one can damage or change its data remotely. The same applies to computers and other items with network connectivity, as suspects could connect remotely to those devices and attempt to destroy any information that is potentially damaging to their case.

Electronic Preservation. The last part of preservation is creating a complete or partial duplicate of the targeted electronic information. The created copy ensures that the electronic evidence is preserved in a forensic image. If devices are not to be removed from the scene, forensic images must be created on scene and then taken back to the laboratory for examination. A partial collection can also be conducted during this phase, which is especially advantageous in civil litigation, where the collection of an entire computer system is not recommended and practitioners must search for and collect only the relevant data.
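To make the imaging step concrete, here is a minimal Python sketch of a raw duplicate with an acquisition digest computed on the fly. It is an illustration only, not a prescribed tool: real imaging is done with hardware write blockers and validated software, and the source path and output name used here are hypothetical.

    import hashlib

    def image_device(source: str, dest: str) -> str:
        """Copy a raw source (e.g. a block device) to an image file,
        hashing on the fly so the acquisition digest is recorded at once.
        Sketch only: real acquisitions use write blockers and verified tools."""
        h = hashlib.sha256()
        with open(source, "rb") as src, open(dest, "wb") as dst:
            for chunk in iter(lambda: src.read(1 << 20), b""):
                dst.write(chunk)
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical paths; on scene the digest goes straight into the notes.
    print(image_device("/dev/sdb", "case042_item03.dd"))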

Laboratory Phases. These phases are conducted once the team is at the laboratory with all the evidence or forensic images collected from the scene. Before interacting with the evidence, an attempt is made to reduce the volume of data the examiner will have to process and analyze. This reduction is crucial for cases involving a large number of electronic devices and a large volume of stored electronic information. As explained earlier, reducing the volume of data expedites the examination of the evidence and allows a better workflow. Once the reduction is complete, examiners can begin processing and analyzing the data.

Building Case Profile. This phase is similar to the creation of a case profile presented by Ruibin et al. (2005), who suggested using input from the investigation to identify the type of information a digital forensic practitioner should focus on finding during the examination. For instance, if the case turns on pictures of evidentiary value, the practitioner should focus on reviewing all the pictures found on a device, without needing to examine other types of electronic information. Under the FDFM, building a case profile begins, as in Ruibin et al. (2005), with obtaining information from the investigating agency about its investigation. The information obtained must focus specifically on the relationship between the electronic evidence and the incident under investigation; in other words, the investigating agency must explain what role the devices played in the incident. Suppose, for instance, that a mobile device is submitted to the digital forensic team as part of a rape investigation. The investigating agency must show how the mobile device is related to the incident: the phone may have been used by the suspect to communicate with the victim or to lure the victim to a certain location.

Once all the information about the investigation and the evidence has been obtained, the type of data relevant to the investigation can be determined. In the rape example above, the suspect may have used text messages to communicate with the victim, which makes any text-based communication between the two parties relevant; it must be reviewed and analyzed. In some cases, determining the approximate size of the information sought is also useful and allows the examination to be even more focused. Suppose an investigating agency submits one 500-gigabyte external drive and one 8-gigabyte thumb drive for examination, and the investigation is seeking documents approximately 30 gigabytes in size relevant to a fraud case. The forensic practitioner would focus on the external drive, whose capacity comfortably exceeds the 30 gigabytes sought, while the 8-gigabyte thumb drive is simply too small to hold the documents.

The next step of the case profile is determining the most relevant types of devices based on the information obtained so far. If the examination is seeking text-based messages sent to the victim, then a mobile device was likely used to send those messages. This does not mean a computer could not have been used, but a mobile device is the most likely source in this situation, so the profile can place further emphasis on the possibility that a mobile device is the source of the relevant messages. It is apparent from the above that building a case profile requires an experienced digital forensic practitioner: applying prior experience and knowledge while building the profile leads to a more accurate and realistic conclusion.

The final step of the case profile is setting the goals that the investigation hopes to accomplish by the end of the examination. These goals are based on all the information gathered so far and on how the evidence is expected to prove or disprove an incident or event. In the rape investigation example, retrieving the text messages from the mobile device is the first goal, to show that the suspect lured the victim to the crime scene. The timeline of those messages is another piece of information that would corroborate the victim's statement regarding events prior to the incident. While examining the mobile device, these goals make clear to the digital forensic practitioner what results are expected from the examination.

In civil litigation and even criminal cases, it is beneficial during this phase to generate a list of keyword searches to be used during the examination process. The keywords must relate directly to the evidence and be based on the information obtained about it. In civil litigation, the keyword list becomes the main method used to locate responsive and relevant electronic information. The words added to the list must also relate to the goals set for the investigation. For instance, if the civil case concerns patent infringement, the keywords should surface documents or data that could prove or disprove the allegations submitted by one party against the other. A minimal sketch of such a keyword sweep appears below.
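The following Python sketch shows the idea in its simplest form; it is not a substitute for a forensic review platform. It assumes text has already been exported from the forensic image into plain-text files under a hypothetical ./export directory, and the keyword list is illustrative; real tooling would also handle encodings, containers, and deleted content.

    from pathlib import Path

    # Hypothetical keyword list built during the case-profile phase of a
    # patent-infringement matter; a real list would come from the legal team.
    KEYWORDS = ["widget patent", "licensing agreement", "prior art"]

    def keyword_hits(root: Path, keywords: list[str]) -> dict[Path, list[str]]:
        """Report which keywords appear in each exported text file under root."""
        hits: dict[Path, list[str]] = {}
        for path in root.rglob("*.txt"):
            text = path.read_text(errors="ignore").lower()
            found = [kw for kw in keywords if kw.lower() in text]
            if found:
                hits[path] = found
        return hits

    if __name__ == "__main__":
        for path, found in keyword_hits(Path("./export"), KEYWORDS).items():
            print(path, "->", ", ".join(found))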

Evaluation of the Evidence. This evaluation takes place against the case profile created during the previous phase. The main goal of this phase is to exclude any device that does not match the case profile and most likely would not hold the information sought. Excluding items without relevant information reduces the amount of information a digital forensic practitioner must examine in each case, which ultimately cuts the time needed to complete examinations and gives investigators faster results. As explained above, the case profile determines the type of electronic information sought, the timeline related to it, its approximate size, and other aspects of the evidence. Using the profile, the items of evidence can be ranked so that the device most likely to contain the information sits at the top of the list, with the remaining devices ordered by the likelihood that they contain the targeted data; the last item on the list is the least likely to hold relevant information. The sketch below illustrates one way such a ranking could be computed.
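This is a minimal sketch under stated assumptions, not part of the FDFM itself: the device types, the profile fields, and the weights are all hypothetical stand-ins for an examiner's judgment, picking up the 500 GB / 8 GB example above.

    from dataclasses import dataclass

    @dataclass
    class Device:
        label: str
        kind: str          # e.g. "mobile", "external_drive", "thumb_drive"
        capacity_gb: float

    # Hypothetical case profile: ~30 GB of documents, most likely on an
    # external drive. The weights below are illustrative only.
    PROFILE = {"preferred_kind": "external_drive", "target_size_gb": 30.0}

    def relevance_score(device: Device, profile: dict) -> float:
        """Crude likelihood score: the device must fit the data sought,
        and the device type named in the profile ranks higher."""
        if device.capacity_gb < profile["target_size_gb"]:
            return 0.0                  # cannot physically hold the data sought
        score = 1.0
        if device.kind == profile["preferred_kind"]:
            score += 1.0                # matches the device type in the profile
        return score

    devices = [
        Device("EXT-01", "external_drive", 500),
        Device("USB-01", "thumb_drive", 8),
        Device("PHONE-01", "mobile", 64),
    ]

    # Examination order: highest score (most likely to be relevant) first.
    for d in sorted(devices, key=lambda d: relevance_score(d, PROFILE), reverse=True):
        print(f"{d.label}: score {relevance_score(d, PROFILE)}")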

Forensic Acquisition. During this phase, any data not already acquired or preserved on scene is imaged. The imaging process focuses on the items at the top of the list generated during the Evaluation of the Evidence phase: only items believed to hold relevant data are acquired, not the entire list. This saves the storage space required for the forensic images and also reduces imaging time.

Partial Examination/Analysis. Many digital forensic models separate the examination phase from the analysis phase, as in the Abstract Digital Forensic Model (Reith, Carr, and Gunsch, 2002). Those models assume a practitioner searches the evidence for relevant data during the examination phase, then determines the significance of what was found during a distinct analysis phase. In the FDFM, the two phases are combined, because they cannot be separated: they occur at the same moment during the search of the evidence. The FDFM calls for creating a case profile before examining the evidence, so a practitioner knows beforehand what to search for and what significance it carries; while searching for the evidence, examiners can therefore judge the significance of information as it is located.

To continue the rape example from earlier: the practitioner knows beforehand that the targeted information, relating to the luring of the victim to the incident location, sits in the text messages. While reviewing the messages, the practitioner finds a message showing how the suspect and victim met, and then, from the day of the incident, other messages showing how the victim was lured to the location. As the practitioner reviews the messages, they evaluate the information simultaneously, weighing the significance of each piece as it is found, with each message adding another piece of information about the incident. At the end of this phase, the practitioner has a full picture of what happened on that day and of any other information related to the incident. It would not be practical to go back and re-analyze information that was already analyzed as it was found. It is also worth noting that practitioners usually mark relevant data as it is found, to ensure it is included in the final report generated by the tool used to review the evidence.

The FDFM adopts the approach referenced by Quick and Choo (2014): a partial examination and analysis of the evidence as a solution to the data volume issue. In their words, the digital forensic practitioner would be "doing enough examination to answer the required questions, and no more" (p. 282). Under the FDFM, practitioners examine only until the goals set during the Building Case Profile phase are accomplished, and in the order of relevance determined during the Evaluation of the Evidence phase: the examination begins with the most relevant items and continues down the list until all goals are met. One of the greatest side benefits of partial examination is preserving the privacy of the data's owners. Mobile devices, for instance, contain vast amounts of private information; the more information reviewed, the greater the loss of privacy. Limiting the examination to the needs of the investigation helps maintain a degree of privacy for the owner of the data. A sketch of this early-stopping loop, building on the ranking above, follows.
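Again a sketch only: examine() is a hypothetical stand-in for the actual forensic review, with canned findings so the control flow can be seen end to end.

    # Minimal sketch of "sufficiency of examination": work down the ranked
    # device list and stop as soon as every case-profile goal is satisfied.

    GOALS = {"luring_messages", "message_timeline"}   # set during case profiling

    def examine(device_label: str, outstanding: set[str]) -> set[str]:
        """Pretend review: returns which outstanding goals this device satisfied."""
        findings = {"PHONE-01": {"luring_messages", "message_timeline"},
                    "EXT-01": set()}
        return findings.get(device_label, set()) & outstanding

    ranked = ["PHONE-01", "EXT-01", "USB-01"]         # from Evaluation of the Evidence
    outstanding = set(GOALS)
    for label in ranked:
        outstanding -= examine(label, outstanding)
        if not outstanding:
            print(f"All goals met after {label}; remaining devices left unexamined.")
            break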

Reporting. During this phase, all the information found during the examination is placed in a report to inform the investigating agency of the findings. Different agencies use many different report formats, but all formats have one thing in common: they are driven by the findings of the examination, not by opinion. Reports supported by solid evidence are hard to dispute, as the evidence behind the information suppresses any argument.

Conclusions and Future Research

The literature proposes many types of methodologies with different focuses, but many of them include phases unrelated to the forensic aspect of the work. Many researchers have also addressed the volume of data that is causing huge delays in digital forensic examinations. The proposed methodology, the FDFM, allows digital forensic professionals to focus more closely on forensics during any digital forensic investigation. It excludes phases that appear in other methodologies but are considered common knowledge within the digital forensic field. It also addresses one of the biggest challenges for digital forensic investigations, the volume of data, by proposing two reduction methods: excluding devices that do not contain relevant information, and conducting partial examinations. Both techniques can be applied only after a case profile has been generated from the information obtained from the investigating agency. Future research can focus on integrating further techniques into the FDFM to eliminate other challenges of digital forensic investigations.

References

Agarwal, A., Gupta, M., & Gupta, S. (2011). Systematic digital forensic investigation model. International Journal of Computer Science and Security (IJCSS), 5(1), 118-131.

Baryamureeba, V., & Tushabe, F. (2004). The Enhanced Digital Investigation Process Model. Proceedings of the Digital Forensic Research Conference. Baltimore, MD.

Carrier, B., & Spafford, E. H. (2003). Getting physical with the Digital Investigation Process. International Journal of Digital Evidence, 2(2), 1-20.

Ciardhuain, S. O. (2004). An extended model of cybercrime investigations. International Journal of Digital Evidence, 3(1), 1-22.

Neuner, S., Mulazzani, M., Schrittwieser, S., & Weippl, E. (2015). Gradually improving the forensic process. In the 10th International Conference on Availability, Reliability and Security, 404-410. IEEE.

Perumal, S. (2009). Digital forensic model based on Malaysian investigation process. International Journal of Computer Science and Security (IJCSS), 9(8), 38-44.

Pollitt, M. M. (1995). Computer forensics: An approach to evidence in cyberspace. In the 18th National Information Systems Security Conference, 487-491. Baltimore, MD.

Quick, D., & Choo, K. R. (2014). Impact of increasing volume of digital forensic data: A survey and future research challenges. Digital Investigation, 11(4), 273-294.

Reith, M., Carr, C., & Gunsch, G. (2002). An examination of the digital forensic models. International Journal of Digital Evidence, 1(3), 1-12.

Ruibin, G., Yun, T., & Gaertner, M. (2005). Case-relevance information investigation: binding computer intelligence to the current computer forensic framework. International Journal of Digital Evidence, 4(1), 147-67.

Selamat, S. R., Yusof, R., & Sahib, S. (2008). Mapping process of digital forensic investigation framework. International Journal of Computer Science and Network Security, 8(10), 163-169.

About the Author

Haider Khaleel is a Digital Forensics Examiner with the US Army, previously a field agent with Army CID. Haider received a Master's Degree in Digital Forensic Science from Champlain College. The ideas presented in this article do not reflect the policies, procedures, and regulations of the author's agency.

Correspondence concerning this article should be addressed to Haider H. Khaleel, Champlain College, Burlington, VT 05402. haider.khaleel@mymail.champlain.edu


via Focused Digital Forensic Methodology — Forensic Focus – Articles

Digital Forensics as a Big Data Challenge — Forensic Focus – Articles

Digital Forensics as a Big Data Challenge

 

Abstract

Digital Forensics, as a science and part of the forensic sciences, is facing new challenges that may well render established models and practices obsolete. The size of potential digital evidence sources has grown exponentially, be it hard disks in desktops and laptops or solid-state memories in mobile devices like smartphones and tablets, even while latency times lag behind. Cloud services are now sources of potential evidence in a vast range of investigations, and network traffic follows the same growing trend; in cyber security, the necessity of sifting quickly through vast amounts of data is now paramount. At a higher level, investigations – and intelligence analysis – can profit from sophisticated analysis of datasets such as social network structures or corpora of text to be analysed for authorship and attribution. All of the above highlights the convergence between so-called data science and digital forensics, taking up the fundamental challenge of analysing vast amounts of data ("big data") in actionable time while preserving forensic principles so that the results can be presented in a court of law. The paper, after introducing digital forensics and data science, explores the challenges above and proceeds to propose how techniques and algorithms used in big data analysis can be adapted to the unique context of digital forensics, ranging from the management of evidence via Map-Reduce to machine learning techniques for triage and analysis of big forensic disk images and network traffic dumps. In conclusion, the paper proposes a model to integrate this new paradigm into established forensic standards and best practices, and tries to foresee future trends.

1 Introduction

1.1 Digital Forensics

What is digital forensics? We report here one of the most useful definitions of digital forensics yet formulated. It was developed during the first Digital Forensics Research Workshop (DFRWS) in 2001 and is still very much relevant today:

Digital Forensics is the use of scientifically derived and proven methods toward the preservation, collection, validation, identification, analysis, interpretation, documentation and presentation of digital evidence derived from digital sources for the purpose of facilitating or furthering the reconstruction of events found to be criminal, or helping to anticipate unauthorized actions shown to be disruptive to planned operations. [Pear01]

This formulation stresses first and foremost the scientific nature of digital forensics methods, at a point in time when the discipline was transitioning from being a "craft" to an established field and a rightful part of the forensic sciences. At that point digital forensics was also transitioning from being practised mainly in separate environments, such as law enforcement bodies and enterprise audit offices, to a unified field. Nowadays this process is far advanced, and it can be said that digital forensics principles, procedures and methods are shared by a large part of its practitioners, whatever their backgrounds (criminal prosecution, defence consultants, corporate investigators and compliance officers). Applying scientifically valid methods implies important concepts and principles to be respected when dealing with digital evidence. Among others we can cite:

  • Previous validation of tools and procedures. Tools and procedures should be validated by experiment prior to their application on actual evidence.
  • Reliability. Processes should yield consistent results and tools should present consistent behaviour over time.
  • Repeatability. Processes should generate the same results when applied to the same test environment.
  • Documentation. Forensic activities should be well documented, from the inception to the end of the evidence life-cycle. On one hand, strict chain-of-custody procedures should be enforced to assure evidence integrity; on the other hand, complete documentation of every activity is necessary to ensure repeatability by other analysts.
  • Preservation of evidence. Digital evidence is easily altered and its integrity must be preserved at all times, from the very first stages of operations, to avoid spoliation and degradation. Both technical (e.g. hashing; see the sketch after this list) and organizational (e.g. clear accountability for operators) measures are to be taken.
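To make the technical measure concrete, here is a minimal Python sketch of integrity hashing as commonly practised: a digest recorded at acquisition time and re-verified before analysis. The evidence path is hypothetical, and this is an illustration rather than a prescribed tool.

    import hashlib

    def image_digest(image_path: str) -> str:
        """SHA-256 of a forensic image, read in chunks to handle large files."""
        h = hashlib.sha256()
        with open(image_path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Recorded in the chain-of-custody documentation at acquisition time...
    acquisition_digest = image_digest("evidence/disk01.dd")

    # ...and re-computed before analysis: any mismatch means the image changed.
    assert image_digest("evidence/disk01.dd") == acquisition_digest, "integrity check failed"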

These basic tenets are currently being challenged in many ways by the shifting technological and legal landscape practitioners have to contend with. While this paper shall not dwell much on the legal side of things, it is obviously something that must always be considered in forensics.

Regarding the phases that usually make up the forensic workflow, we refer here again to the only international standard available [ISO12] and describe them as follows:

  • Identification. This process includes the search, recognition and documentation of the physical devices on the scene potentially containing digital evidence. [ISO12]
  • Collection. Devices identified in the previous phase can be collected and transferred to an analysis facility, or acquired (next step) on site.
  • Acquisition. This process involves producing an image of a source of potential evidence, ideally identical to the original.
  • Preservation. Evidence integrity, both physical and logical, must be ensured at all times.
  • Analysis. Interpretation of the data from the evidence acquired. It usually depends on the context, the aims or the focus of the investigation, and can range from malware analysis to image forensics, database forensics and many more application-specific areas. On a higher level, analysis could include content analysis via, for instance, forensic linguistics or sentiment analysis techniques.
  • Reporting. Communication and/or dissemination of the results of the digital investigation to the parties concerned.

1.2 Data Science

Data Science is an emerging field growing at the intersection of statistical techniques and machine learning, completing this toolbox with domain-specific knowledge, and fuelled by big datasets. Hal Varian gave a concise definition of the field:

[Data science is] the ability to take data – to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it. [Vari09]

We can see here the complete cycle of data management, and understand that data science in general is concerned with the collection, preparation, analysis, visualization, communication and preservation of large sets of information; this paraphrases another insightful definition by Jeffrey Stanton of Syracuse University's School of Information Studies. The parallels with the digital forensics workflow are clear, but the mention of visualization in both definitions deserves to be stressed. Visualization is almost never mentioned in digital forensics guidelines and standards, yet as the object of analysis moves towards "Big Data" it will necessarily become one of the most useful tools in the analyst's box – in the prioritization phase, for instance, but also for dissemination and reporting. Visual communication is probably the most efficient way into a human's brain, and this channel is underused by most of today's forensic practitioners.

If data science is concerned with "Big Data", what is Big Data anyway? After all, big is a relative concept, prone to change with time. A working definition: any data that is difficult to manage and work with – datasets so big that conventional tools, e.g. relational databases, are not practical or useful [ISAC13]. From the point of view of data science, the challenges of managing big data can be summarized as three Vs: Volume (size), Velocity (needed for interactivity) and Variety (different sources of data). In the next paragraph we shall see how these three challenges dovetail with the digital forensics context.

2 Challenges

“Golden Age” is a common name for the period in the history of digital forensics that ran roughly from the 1990s to the first decade of the twenty-first century. During that period the technological landscape was dominated by the personal computer – mostly a single architecture, x86 plus Windows – and data stored in hard drives represented the vast majority of evidence, so much so that "Computer Forensics" was the accepted term for the discipline. Storage sizes also allowed complete bitwise forensic copies of the evidence for subsequent analysis in the lab. The relative uniformity of the evidence facilitated the development of the digital forensic principles outlined above, enshrined in several guidelines and eventually in the ISO/IEC 27037 standard. Inevitably, though, they lagged behind real-world developments: recent years have brought many challenges to the "standard model", first among them the explosion in the average size of the evidence examined in a single case. Historical motivations for this include:

  • A dramatic drop in hard drive and solid state storage cost (currently estimated at $80 per Terabyte) and consequently an increase in storage size per computer or device;
  • Substantial increase in magnetic storage density and diffusion of solid-state removable media (USB sticks, SD and other memory cards etc) in smartphones, notebooks, cameras and many other kinds of devices;
  • Worldwide huge penetration of personal mobile devices like smartphones and tablets, not only in Europe and America, but also in Africa – where they constitute the main communication mode in many areas – and obviously in Asia;
  • Introduction and increasing adoption by individuals and businesses of cloud services – infrastructure services (IAAS), platform services (PAAS) and applications (SAAS) – made possible in part by virtualization technology enabled in turn by the modern multi-core processors;
  • Network traffic is ever more part of the evidence in cases and the sheer size of it has – again – obviously increased in the last decade, both on the Internet and on 3G-4G mobile networks, with practical but also ethical and political implications;
  • Connectivity is rapidly becoming ubiquitous and the “Internet of things” is near, especially considering the transition to IPv6 in the near future. Even when not networked, sensors are everywhere, from appliances to security cameras, from GPS receivers to embedded systems in cars, from smart meters to Industrial Control Systems.

To give a few quantitative examples of the trend: in 2008 the FBI Regional Computer Forensics Laboratories (RCFLs) Annual Report [FBI08] explained that the agency's RCFLs processed 27 percent more data than they did during the preceding year; the 2010 report gave an average case size of 0.4 Terabytes. According to a recent (2013) informal survey among forensic professionals on Forensic Focus, half of cases involve more than one Terabyte of data, with one in five over five Terabytes in size.

The simple quantity of evidence associated with a case is not the only measure of its complexity, and the growth in size is not the only challenge digital forensics is facing: evidence is becoming more and more heterogeneous in nature and provenance, following the evolving trends in computing. The workflow phase affected by this new aspect is clearly analysis, where, even when proper prioritization is applied, it is necessary to sort through diverse categories and sources of evidence, structured and unstructured. Data sources themselves are much more differentiated than in the past: it is common now for a case to include evidence originating from personal computers, servers, cloud services, phones and other mobile devices, digital cameras, even embedded systems and industrial control systems.

3 Rethinking Digital Forensics

In order to face the many challenges, but also to leverage the opportunities it is encountering, the discipline of digital forensics will have to rethink established principles in some ways, reorganize well-known workflows, and even adopt tools not previously considered viable for forensic use – concerns regarding the security of some machine learning algorithms have been voiced, for instance in [BBC+08]. On the other hand, forensic analysts' skills need to be rounded out to make better use of these new tools in the first place, but also to help integrate them into forensic best practices and validate them. The dissemination of "big data" skills will have to include all actors in the evidence lifecycle, starting with Digital Evidence First Responders (DEFRs), as identification and prioritization will grow in importance and skilled operators will be needed from the very first steps of the investigation.

3.1 Principles

Well-established principles will need to undergo at least a partial extension and rethinking in light of the challenges of Big Data.

  • Validation and reliability of tools and methods gain even more relevance in big data scenarios because of the size and variety of datasets, coupled with the use of cutting-edge algorithms that still need validation efforts – a body of test work, first on methods and then on tools, in controlled environments and on test datasets, before their use in court.
  • Repeatability has long been a basic tenet of digital forensics, but most probably we will be forced to abandon it, at least in its strictest sense, for a significant part of evidence acquisition and analysis. Repeatability stricto sensu is already impossible to achieve in nearly all instances of forensic acquisition of mobile devices, and the same applies to cloud forensics. When machine learning tools and methods become widespread, reliance on previous validation will be paramount. As an aside, this stresses once more the importance of using open methods and tools that can be independently and scientifically validated, as opposed to black-box tools or – worse – LE-reserved ones.
  • As for documentation, its importance for a sound investigation is even greater when non-repeatable operations and live analysis are routinely part of the investigation process. Published data about validation results of the tools and methods used – or at least pointers to it – should be an integral part of the investigation report.

3.2 Workflow

Keeping in mind how the forensic principles may need to evolve, we present here a brief summary of the forensic workflow and how each phase may have to adapt to big data scenarios. The ISO/IEC 27037 International Standard covers the identification, collection, acquisition and preservation of digital evidence (or, literally, "potential" evidence). Analysis and disposal are not covered by this standard, but will be by future guidelines, currently in development, in the 27xxx series.

Identification and collection

Here the challenge is selecting evidence in a timely manner, right at the scene. Guidelines for proper prioritization of evidence should be further developed, abandoning the copy-all paradigm and strict evidence integrity in favour of appropriate triage procedures: this implies skimming through all the (potential) evidence right at the beginning and selecting the relevant parts. First responders’ skills will be even more critical than they currently are, as will preparation procedures in corporate environments.

Acquisition

When classic bitwise imaging is not feasible due to the size of the evidence, prioritization procedures, or “triage”, can be conducted; these must be properly justified and documented, because integrity is no longer absolute once the original source has been modified, if only by selecting what to acquire. Visualization can be a very useful tool, both for low-level filesystem analysis and for higher-level content analysis. Volume makes acquisition a challenge because dedicated hardware is now required – whether for storage or for online traffic – while in the not-so-distant past an acquisition machine could be built with off-the-shelf hardware and software. Variety poses a challenge of a slightly different kind, especially when acquiring mobile devices, due to the huge number of physical connectors and platforms.
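
To make the documentation requirement concrete, here is a minimal Python sketch of how a triage-based, partial acquisition could be logged defensibly: every selected item is hashed at collection time so the selection can be verified later. The log format and field names are our own illustrative assumptions, not something prescribed by ISO/IEC 27037 or any particular tool.

```python
# Illustrative sketch only: hash each item selected during triage so the
# partial acquisition can be verified afterwards. Field names and the
# JSON log format are hypothetical choices for this example.
import hashlib
import json
import pathlib
from datetime import datetime, timezone

def log_triage_acquisition(selected_paths, log_file="acquisition_log.json"):
    entries = []
    for path in map(pathlib.Path, selected_paths):
        entries.append({
            "path": str(path),
            "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
            "acquired_at": datetime.now(timezone.utc).isoformat(),
        })
    pathlib.Path(log_file).write_text(json.dumps(entries, indent=2))
    return entries
```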

Preservation

Again, preserving all evidence securely and in compliance with legal requirements calls for quite a substantial investment for forensic labs working on a significant number of cases.

Analysis

Integrating methods and tools from data science implies moving beyond the “sausage factory” forensics still widespread today, in which under-skilled operators rely heavily on point-and-click all-in-one tools to perform the analysis. Analysts will need not only to include a plurality of tools in their panoply, but also to understand and evaluate the algorithms and implementations those tools are based upon. The need for highly skilled analysts and operators is clear, and suitable professional qualifications will develop to certify this.

Reporting

The final report for an analysis conducted using data science concepts should contain accurate evaluations of the tools and methods used, including data from the validation process; accurate documentation becomes even more fundamental as strict repeatability becomes very hard to uphold.

3.3 Some tools for tackling the Big Data Challenge

At this stage, due in part to the fast-changing landscape of data science, it is hard to systematically categorize its tools and techniques. We review some of them here.

Map-Reduce is a framework for massively parallel tasks. It works well when datasets do not involve much internal correlation. That does not seem to be the case for digital evidence in general, but a task like file fragment classification is well suited to the Map-Reduce paradigm. Attributing file fragments – coming from a filesystem image or from unallocated space – to specific file types is a common task in forensics: machine learning classification algorithms – e.g. logistic regression, support vector machines – can be adapted to Map-Reduce if the analyst forgoes the possible correlations among single fragments. A combined approach, where a classification algorithm is paired, for instance, with a decision tree method, would probably yield higher accuracy.
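
A minimal sketch of that idea follows, using Python’s multiprocessing pool as a stand-in for a real Map-Reduce cluster and scikit-learn for the classifier. The byte-histogram features and the logistic regression model are illustrative assumptions on our part, not tools named in the article.

```python
# Sketch: map-style file-fragment classification. Each fragment is
# classified independently (ignoring inter-fragment correlation, as the
# text notes); a reduce step then tallies the predicted file types.
from collections import Counter
from multiprocessing import Pool

import numpy as np
from sklearn.linear_model import LogisticRegression

def byte_histogram(fragment: bytes) -> np.ndarray:
    # Feature vector: normalized frequency of each byte value 0..255.
    counts = np.bincount(np.frombuffer(fragment, dtype=np.uint8),
                         minlength=256)
    return counts / max(len(fragment), 1)

def classify_fragment(args):
    clf, fragment = args  # "map" step: embarrassingly parallel
    return clf.predict([byte_histogram(fragment)])[0]

def classify_image(clf: LogisticRegression, fragments: list) -> Counter:
    with Pool() as pool:
        labels = pool.map(classify_fragment,
                          [(clf, f) for f in fragments])
    return Counter(labels)  # "reduce" step: tally per file type

# clf would be trained beforehand on labeled fragments, e.g.:
# clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
```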

Decision trees and random forests are fruitfully brought to bear in fraud detection software, where the objective is to find the statistical outliers in a vast dataset – anomalous transactions in this case or, in another application, anomalous browsing behaviour.
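
As a hedged illustration, the sketch below uses scikit-learn’s IsolationForest – a tree-ensemble relative of random forests – to surface outlying transactions. The feature columns and values are invented for demonstration only.

```python
# Sketch: tree-based outlier detection over transaction records.
# Features [amount, hour_of_day, risk_score] are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50, 14, 0.2], scale=[20, 4, 0.1], size=(1000, 3))
anomalous = np.array([[9000.0, 3.0, 0.9]])   # one suspicious transaction
X = np.vstack([normal, anomalous])

model = IsolationForest(contamination=0.001, random_state=0).fit(X)
flags = model.predict(X)          # -1 marks statistical outliers
print(X[flags == -1])             # the anomalous row should surface
```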

In audio forensics, unsupervised learning techniques grouped under the general heading of “blind signal separation” give good results in separating two superimposed speakers, or a voice from background noise. They rely on mathematical underpinnings to find, among the possible solutions, the least correlated signals.
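
A minimal sketch of blind signal separation using FastICA, an independent component analysis implementation in scikit-learn; the two synthetic waveforms below merely stand in for real recorded speakers.

```python
# Sketch: recover two superimposed sources from their mixtures via ICA.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 4000)
s1 = np.sin(2 * t)                        # source 1, e.g. one speaker
s2 = np.sign(np.sin(3 * t))               # source 2, e.g. a second speaker
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5],                 # unknown mixing matrix:
              [0.5, 1.0]])                # two "microphones"
X = S @ A.T                               # observed, superimposed signals

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)          # least-correlated source estimates
```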

In image forensics, classification techniques are again useful for automatically reviewing large sets of hundreds or thousands of image files, for instance to separate suspect images from the rest.
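
Purely for illustration, a support vector machine could be trained on simple colour-histogram features to flag images for human review. The feature choice here is ours and far cruder than what real tools use; the file paths and labels are placeholders, and the Pillow library is assumed for image loading.

```python
# Sketch: SVM-based bulk review of image files on crude colour features.
import numpy as np
from PIL import Image                      # assumes the Pillow library
from sklearn.svm import SVC

def color_histogram(path: str, bins: int = 16) -> np.ndarray:
    # Crude feature: concatenated per-channel histograms, normalized.
    img = np.asarray(Image.open(path).convert("RGB"))
    hist = [np.histogram(img[..., c], bins=bins, range=(0, 255))[0]
            for c in range(3)]
    h = np.concatenate(hist).astype(float)
    return h / h.sum()

# With labeled examples (paths and labels are placeholders):
# X = np.array([color_histogram(p) for p in labeled_paths])
# clf = SVC().fit(X, labels)
# suspect = [p for p in case_paths
#            if clf.predict([color_histogram(p)])[0] == 1]
```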

Neural networks are suited to complex pattern recognition tasks in forensics. A supervised approach is used, in which successive snapshots of the file system are used to train the network to recognize the normal behaviour of an application. After the event, the system can be used to automatically build an execution timeline from a forensic image of a filesystem [KhCY07]. Neural networks have also been used to analyse network traffic, but in this case the results do not yet show high levels of accuracy.
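
The sketch below loosely illustrates that supervised idea – it is not the actual method of [KhCY07] – using a small scikit-learn neural network on invented snapshot-diff features.

```python
# Sketch: classify time intervals by application activity from
# filesystem snapshot diffs, then string the labels into a timeline.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Invented features per interval:
# [files_created, files_modified, bytes_written_to_temp]
browser = rng.normal([2, 10, 5e5], [1, 3, 1e5], size=(200, 3))
installer = rng.normal([50, 5, 5e6], [10, 2, 1e6], size=(200, 3))
X = np.vstack([browser, installer])
y = ["browser"] * 200 + ["installer"] * 200

net = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,),
                                  max_iter=2000, random_state=0)).fit(X, y)

# Label intervals extracted from a forensic image to sketch a timeline:
new_intervals = rng.normal([48, 6, 4.8e6], [10, 2, 1e6], size=(5, 3))
timeline = net.predict(new_intervals)     # e.g. five "installer" intervals
```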

Natural Language Processing (NLP) techniques, including Bayesian classifiers and unsupervised clustering algorithms like k-means, have been successfully employed for authorship verification and for the classification of large bodies of unstructured text, emails in particular.
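
A compact sketch of both approaches on a toy corpus, using scikit-learn; the emails and author labels are invented, and a real case would involve thousands of messages.

```python
# Sketch: naive Bayes for authorship (supervised) and k-means for
# clustering unstructured emails (unsupervised), on TF-IDF features.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = ["please wire the funds today",
          "meeting moved to friday afternoon",
          "send the invoice to accounting",
          "wire transfer confirmation attached"]
authors = ["alice", "bob", "bob", "alice"]

X = TfidfVectorizer().fit_transform(emails)

nb = MultinomialNB().fit(X, authors)                 # supervised: authorship
clusters = KMeans(n_clusters=2, n_init=10,
                  random_state=0).fit_predict(X)     # unsupervised: k-means
```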

4 Conclusion

The challenges of big data evidence already highlight the necessity of revising tenets and procedures firmly established in digital forensics. New validation procedures, analyst training, and analysis workflows will be needed to confront this changed landscape. Furthermore, few forensic tools implement machine learning algorithms and, conversely, most machine learning tools and libraries are not suitable and/or validated for forensic work, so there remains wide scope for the development of innovative tools leveraging machine learning methods.

References

[BBC+08] Barreno, M. et al.: “Open Problems in the Security of Learning”. In: D. Balfanz and J. Staddon, eds., AISec, ACM, 2008, pp. 19-26.
[FBI08] FBI: “RCFL Program Annual Report for Fiscal Year 2008”, FBI, 2008. http://www.fbi.gov/news/stories/2009/august/rcfls_081809
[FBI10] FBI: “RCFL Program Annual Report for Fiscal Year 2010”, FBI, 2010.
[ISAC13] ISACA: “What Is Big Data and What Does It Have to Do with IT Audit?”, ISACA Journal, 2013, pp. 23-25.
[ISO12] ISO/IEC 27037:2012: “Information technology – Security techniques – Guidelines for identification, collection, acquisition and preservation of digital evidence”.
[KhCY07] Khan, M., Chatwin, C. and Young, R.: “A framework for post-event timeline reconstruction using neural networks”, Digital Investigation 4, 2007.
[Pear01] Pearson, G.: “A Road Map for Digital Forensic Research”. In: Report from DFRWS 2001, First Digital Forensic Research Workshop, 2001.
[Vari09] Varian, H., in: “The McKinsey Quarterly”, Jan 2009.

About the Author

Alessandro Guarino is a senior information security professional and independent researcher. He is the founder and principal consultant of StudioAG, a consultancy firm based in Italy and active since 2000, serving clients in both the private and public sectors and providing cybersecurity, data protection and compliance consulting services. He is also a digital forensics analyst and consultant, as well as an expert witness in court. He holds an M.Sc. in Industrial Engineering and a B.Sc. in Economics, with a focus on information security economics. He is an active expert in ISO JTC 1/SC 27 (the IT Security Techniques committee) and has contributed in particular to the development of cybersecurity and digital investigation standards. He represents Italy in the CEN-CENELEC Cybersecurity Focus Group and ETSI TC CYBER, and chairs the recently formed CEN/CENELEC TC 8 “Privacy management in products and services”. As an independent researcher, he has delivered presentations at international conferences and published several peer-reviewed papers.

Find out more and get in touch with the author at StudioAG.


via Digital Forensics as a Big Data Challenge — Forensic Focus – Articles