On everything we touch, we leave behind our unique impression: our fingerprints.
No two people have exactly the same fingerprints. Even identical twins, with identical DNA, have different fingerprints.
Fingerprint identification is also known as “dactyloscopy”.
Fingerprints are the tiny ridge, whorl, and valley patterns on the tip of each finger. They develop from pressure on a baby’s tiny, developing fingers in the womb.
CLASSIFICATION OF FINGERPRINTS
By FRANCIS GALTON
The well-known British scientist Sir Francis Galton published his first book on fingerprints in 1892. His most important work included a method of fingerprint classification that divided prints into three groups: arches, loops, and whorls.
By WILLIAM J. HERSHEL
While working for the East India Company in Bengal, India, Sir William James Herschel first used fingerprints on native contracts. After a decade, he had accumulated a file of fingerprints.
By EDWARD HENRY
The Henry Classification System of fingerprinting was accepted as common practice throughout England and its territorial holdings, and in the United States.
Under the Henry system, fingerprints are divided into two classes:
•Those which are given a numerical value (whorls and composites).
•Those which are not given a numerical value (loops and arches).
All patterns are divided as follows:
The Henry classification system assigns each finger a number according to its position on the hands, beginning with the right thumb as number 1 and ending with the left pinky as number 10.
• The system also assigns a numerical value to fingers that contain a whorl pattern; fingers 1 and 2 each have a value of 16,
• Fingers 3 and 4 = 8,
• Fingers 5 and 6 = 4,
• Fingers 7 and 8 = 2,
• Fingers 9 and 10 = 1.
• Fingers with a non-whorl pattern, such as an arch or loop pattern, have a value of zero.
• The sum of the even-finger values is then calculated and placed in the numerator of a fraction.
• The sum of the odd-finger values is placed in the denominator.
• A value of 1 is added to each sum, so the maximum obtainable value on either side of the fraction is 32.
• Thus, the primary classification is a fraction between 1/1 and 32/32, where 1/1 indicates no whorl patterns and 32/32 means that all fingers have whorl patterns.
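The arithmetic above can be sketched as a short Python function. The function name, the `'whorl'`/`'loop'` labels, and the list layout are illustrative assumptions for this sketch, not part of any standard library:

```python
def henry_primary_classification(patterns):
    """Compute the Henry primary classification fraction.

    `patterns` lists the pattern type for fingers 1..10
    (right thumb = finger 1, ..., left pinky = finger 10).
    Only 'whorl' carries a value; arches and loops count as zero.
    Returns (numerator, denominator) of the primary classification.
    """
    # Whorl values per finger pair: (1,2)=16, (3,4)=8, (5,6)=4, (7,8)=2, (9,10)=1
    value = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}
    even_sum = sum(value[f] for f in range(2, 11, 2) if patterns[f - 1] == 'whorl')
    odd_sum = sum(value[f] for f in range(1, 11, 2) if patterns[f - 1] == 'whorl')
    # Add 1 to each side, so the fraction runs from 1/1 up to 32/32
    return (even_sum + 1, odd_sum + 1)
```

For example, an all-loop hand set gives 1/1 and an all-whorl set gives 32/32, matching the range described above.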
By JUAN VUCETICH
Vucetich is credited with the first positive criminal identification as, in 1892, he was able to extract a set of prints off a door and thus identify a woman as the culprit in a double homicide.
CHARACTERISTICS OF FINGERPRINT
Class characteristics are the characteristics that narrow the print down to a group but not an individual.
The three fingerprint class types are:
Arches are the simplest type of fingerprints that are formed by ridges that enter on one side of the print and exit on the other. No deltas are present.
About 5% of the world’s population has arch patterns.
Loops must have one delta and one or more ridges that enter and leave on the same side. These patterns are named for their positions related to the radius and ulna bones.
About 60-65% of the world’s population has loop patterns.
Whorls have at least one ridge that makes (or tends to make) a complete circuit. They also have at least two deltas.
About 30-35% of the world’s population has whorl patterns.
Individual characteristics are those characteristics that are unique to an individual.
They are tiny irregularities that appear within the friction ridges and are referred to as Galton’s details.
Angus Marshall, Digital Forensic Scientist
Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.
I’m registered as an expert adviser and most of my recent casework seems to deal with difficult-to-explain or difficult-to-analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence, as a member of BSI’s IST/033 information security group, and as the UK’s digital evidence representative on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042 and contributed to the other investigative and eDiscovery standards.
You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?
It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.
The effect is that there may be a gap in the accreditation process. In calibration houses and metallurgical and chemical labs (where the ISO 17025 standard originated), there is a selection of methods known to work correctly, which can be chosen to meet a specific customer requirement. In digital forensics, by contrast, we have methods which satisfy much fuzzier customer requirements, and those requirements are almost always non-technical in nature because the customers are CJS practitioners who simply don’t express things in a technical way.
We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.
Where do we stand with standardisation in the UK at the moment?
Standardization is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.
ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?
Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.
A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.
In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?
Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.
I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty: when we use a word, it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?
It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.
Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?
NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).
Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.
Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.
Is there anything else you’d like to share about the results?
We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.
Things move pretty fast in DF, if we don’t stop and look at what we’re doing once in a while, we’ll miss something important.
Read Angus Marshall’s paper on requirements in digital forensics method definition here.
[Case Study] Computer Forensics: Data Recovery & Extraction From Platter-Scratched Hard Drives
Editor’s note: As a forensic data recovery expert, SalvationDATA receives different data recovery cases every day. Our forensic customers usually turn to us for help when they run into a case they are not able to handle. And among all data loss situations, platter scratch is one of the most difficult problems to deal with. So in this issue, let’s see what the correct forensic process for a platter-scratched hard drive looks like.
What is platter scratch?
When platters are damaged, it is usually in the form of scratching caused by debris and/or by the read/write heads coming into contact with the platters during the reading-writing process.
This is also commonly known as a head crash, although that term is often mistakenly used by inexperienced individuals to refer to clicking drives or hard drives that simply need a read/write head replacement.
Once the platters are scratched to a certain degree, this will in turn damage the read/write heads and render the drive unreadable. Oftentimes this results in clicking, scratching, chirping, or screeching sounds. However, these sounds don’t automatically mean the platters are scratched.
When the platters are severely scratched in this manner, the drive cannot be recovered; the files and data contained on the drive are lost forever. This is known as a catastrophic head crash, and most hard drive failure recovery cannot fix it.
How to work with a hard drive with platter scratch?
Is platter scratch truly unrecoverable? Actually, if the scratches to the platter surface are not too severe, there is still a possibility of recovering and extracting the data, as long as we strictly follow operating procedures:
1. Stop attempting to read data immediately, to avoid further, unrecoverable damage.
2. Open the hard drive in a dust-free environment and inspect it for damage.
3. Remove the damaged read/write head and replace it with a healthy one. The donor head must be selected by strict matching rules; for example, for a Western Digital head replacement, the donor drive must match the model number, batch ID, firmware version, and PN.
After repairing physical damages, we can continue to forensically recover and extract the data from this hard drive with SalvationDATA’s DRS (Data Recovery System).
What tools do you need for this process?
HPE Pro (Head Platter Exchange) is a hard drive repair tool built to handle head stack and drive motor issues, for cases where the drive corruption is caused not by firmware but by the head stack or drive spin motor. With its platter exchanger, it can protect the heads from further damage or misalignment due to incorrect operations and keep the user data intact.
DRS (Data Recovery System) is our next generation intelligent all-in-one forensic data recovery tool that can acquire and recover data from both good and damaged storage media like HDD simply and easily.
How do we know if the hard drive is fixed and we can continue to the next step? DRS’s disk diagnostics feature helps solve this problem: DRS can scan the source disk in advance, and its FastCheck technology allows a rapid check within five seconds, avoiding the risk of secondary damage to an important evidentiary storage device.
Insert the hard disk in DRS, and simply click the one-key Diagnose function to complete the process. DRS will tell you the detailed disk health status in no time!
After repairing the physical damage, the hard drive can still be fragile and prone to failing again. If not handled with care, we may permanently lose the opportunity to recover and extract the data. Therefore, it is crucial to first secure the data stored on the hard drive. DRS also provides the solution: its forensic imaging function secures the evidentiary digital data by creating a physical-level, sector-by-sector duplicate of the damaged hard drive. Once finished, the forensic image is exactly the same as the source data and can be stored safely and analyzed at any appropriate time.
When dealing with a defective hard drive as in this case, it is recommended to use the Advanced Imaging mode in DRS to help bypass bad sectors and extract as much data as possible. Also, remember to set the transmission mode to PIO (low speed) to safely extract the data from such a damaged storage device.
Before imaging, we can also check the raw hexadecimal data view in DRS Sector View to make sure data on this damaged hard drive is accessible. Professional data recovery engineers can even acquire more information from this sector view.
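The bad-sector handling described here (image sector by sector, bypass unreadable areas, keep the image the same size as the source) can be sketched in generic Python. This is a simplified illustration of the technique only, not DRS’s actual implementation; the function name and constants are invented for the sketch:

```python
import os

SECTOR = 512           # classic logical sector size; many modern drives use 4096
CHUNK = SECTOR * 256   # read in larger chunks for speed on healthy regions

def image_with_bad_sector_skip(src_path, dst_path):
    """Sector-by-sector image of a (possibly damaged) source.

    Reads large chunks first; if a chunk fails, retries it one
    sector at a time and writes a zero-filled sector for each
    unreadable one, so the image keeps the source's exact size.
    Returns the number of sectors skipped.
    """
    skipped = 0
    with open(src_path, 'rb', buffering=0) as src, open(dst_path, 'wb') as dst:
        size = src.seek(0, os.SEEK_END)   # total bytes to copy
        offset = 0
        while offset < size:
            length = min(CHUNK, size - offset)
            try:
                src.seek(offset)
                dst.write(src.read(length))
            except OSError:
                # Fall back to single sectors inside the failing chunk
                for s in range(offset, offset + length, SECTOR):
                    n = min(SECTOR, offset + length - s)
                    try:
                        src.seek(s)
                        dst.write(src.read(n))
                    except OSError:
                        dst.write(b'\x00' * n)   # placeholder for a bad sector
                        skipped += 1
            offset += length
    return skipped
```

Dedicated hardware imagers (and tools like GNU ddrescue) refine this idea with retry passes, reverse reads, and per-region logging, but the core fallback loop is the same.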
Now with all the problems dealt with, we have one final step: recover and extract valuable evidentiary data. Use DRS’s File Recovery & File Carving function to locate and extract important digital files, and generate a forensic report at the end of the process. With DRS’s intelligent recovery technology, investigators can deal with deleted files, formatted partitions, corrupted file systems, and many other data loss situations without any professional skill requirements!
Platter scratch is the nightmare for data recovery engineers. However, it is not impossible to recover data from scratched platters. In this issue, we discussed the standard operating procedure to deal with a hard drive with platter scratch to maximize the possibility to recover and extract valuable evidentiary data. We hope the instructions we provide can help you with your work!
You can also visit our official YouTube channel for more videos: https://www.youtube.com/user/SalvationDataOfficia/featured
A Pickle Autopsy? YES!
If you teach Anatomy & Physiology, you know the struggle of the first unit…. it’s HUGE!! … and jam-packed with things that are absolutely essential for students to know in order to be successful in the course. I usually struggle with finding activities to review the body cavities and directional terms. This year, someone suggested using the pickle autopsy and I’m so glad I did!
The lab I used was published in The Forensic Teacher and would be appropriate for either discipline (I teach both this year). Here is the link to the lab I used http://www.theforensicteacher.com/Labs_files/picklelabsheets.pdf A clever fellow teacher friend came up with the storyline that there was a gang war between the Claussens and the Vlasics in the fridge that resulted in no survivors. I loved it so I also used that storyline to frame my lab.
Set Up– The Basics
Now that I had my lab picked out and my story to tell, I had to figure the logistics of how to get everything set up.
First, the pickles….
I found the big jars of dills at Walmart for $5.97 each. The smaller pickles I got because I wanted some of my “victims” to be pregnant (or they could also be small children pickles lol). I had a hard time estimating how many pickles were in the big jars, but these 2 had a total of 33 pickles– more than enough for my classes. The picture below shows them separated by “male” and “female” victims (my “male” pickles are the ones with the stems lol).
Here are all the supplies I used for the lab:
How to make them look like victims….
I glued wiggly eyes onto thumbtacks for their eyes (so I can reuse them)
I also used pellets that go in pellet guns for bullet wounds (I smashed them a little with the hammer first and dipped them into gel food coloring before I stuck them in the “victims”)
I made their heads from an olive stuck on a toothpick– some I even squished so their “brains” fell out a little lol. I also gave all of them a “spine” (a toothpick on the dorsal side just under the skin). I also broke several of the toothpicks so this “injury” might be discovered and included in the story of their “victim”.
All the “victims” had a bead implanted in the vicinity of their heart. If the bead was red, they had a normal heart. If it was black or dark purple, it represented a heart attack. I found that if you make a slit on the side of the pickle (choose a wrinkle), it will often be completely unnoticeable and students will wonder how in the world you got those beads in there! I also slipped in a small green bead in the neck region of a few of the “victims” and told my students I heard that some of the gang members involved in the war were caught raiding the grapes from the fridge and several choked on them when their leader caught them.
I also told them that the gang members were not healthy and many had various diseases and disorders because they didn’t take care of themselves. Many had white beads implanted in various areas. These beads represented a tumor in the particular area. Knotted pieces of rubber bands in the abdominal region represented parasites. Many had broken toothpick “limbs”. I also had several who were pregnant.
This is the sheet of “Helpful Hints” I gave my students with their lab:
A Snapshot of My “Victims”
I separated my “victims” into 4 general types based on their cause of death:
- Trauma or internal bleeding (Stabbed or gunshot, injected with red food coloring)
- Poisoning/ Drug Overdose (I soaked them in baking soda but didn’t get a very good result)
- Heart Attack (black bead instead of red bead in chest)
- Drowning (blue food coloring injected in chest area)
My “victims” had multiple things that could have resulted in their deaths, but having 4 major things just helped me keep it organized. I also put them in separate dishes while I plotted their demise 🙂
I also kept them separate in labeled gallon ziplock bags to transport them to school.
The Lab Set Up
I set my lab up as a mini crime scene. I had some fake vampire blood from my forensics class that I also added to help set the scene. I also added in some extra plastic swords and pellets around the “victims”. (I let my students pick their own “victim” from the scene).
Students were in a lab group of 3 per “victim”. In my lab, every student in the group has a specific job and job description. It just helps my lab groups run more smoothly and tends to decrease the possibility that one student does the lion’s share of work. These are the jobs I gave my groups for this lab:
My Take on the Pickle Autopsy Lab
Would I use it again? Absolutely! My students became very proficient at actually using the directional terminology and identifying the body cavities that we talked about in class. I heard many meaningful conversations within the groups… “That’s a break in his arm that’s intermediate between the shoulder and the elbow” “I think this sword went through the abdominal cavity and not the thoracic cavity”…. This was so much better than hearing them try to memorize a diagram or a chart of the directional terms!
They loved getting into our “gang warfare” story. I had them fill out a Coroner’s Report detailing the abnormalities they found both in, and on their “victim”, as well as the location of these abnormalities. Then, they had to determine the cause of death for their victim, supporting their opinion with specific details from their autopsy. At all times within their report, they had to incorporate correct anatomical terminology. Finally, they had to create a narrative of what happened to their “victim” based on the findings from their autopsy. Several groups shared with the class. It was lots of fun!
- Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.
- via: Salem Press Encyclopedia of Science
Definition: Principles of conduct, moral duty, and obligation that guide individuals in their decisions and actions.
Significance: As scientists, forensic scientists have a professional obligation to seek and to speak the truth about matters within their purview. As participants in a forensic process, they are subject to additional, sometimes conflicting, duties. This tension generates many ethical dilemmas.
Although witnesses in American courtrooms are called upon to tell the truth, the whole truth, and nothing but the truth, they may be enjoined from volunteering information. A witness’s individual sense of relevance must often bow to a court’s judgment. The legal system seeks truth, yet it sometimes defers to other values, such as fairness and confidentiality, and in general demands acceptance of formalized rules of procedure. In their capacity as experts, forensic scientists typically enjoy greater latitude than ordinary witnesses in expressing opinions and making judgments in the courtroom, but they too must operate within the often cumbersome and sometimes counterintuitive requirements of the “system” of “justice.”
Forensic scientists are measured against a standard of professional integrity, although the professionalization of the scientific study of crime is far from complete. Professions are substantially self-regulating, usually through agreed-upon standards and codes of ethics. This creates the need for professions to articulate appropriate expectations, and for their members both to act correctly themselves and to provide appropriate correction for their errant colleagues. A case in point is William Tobin’s campaign against the chemical analysis of bullet lead, also known as comparative bullet-lead analysis (CBLA).
Tobin’s Exposure of CBLA
CBLA is a technique that the Federal Bureau of Investigation (FBI) used for four decades—the investigation of the assassination of President John F. Kennedy in 1963 was an early use—to make cases against defendants when traditional firearms analysis (that is, examination of barrel rifling on bullets) was not possible. By measuring the proportions of seven trace elements (antimony, arsenic, bismuth, cadmium, copper, silver, and tin) found in the lead of a bullet in evidence, forensic scientists sought to establish the probability that the bullet came from the same provenance as a bullet in the suspect’s possession. The belief that the comparison of the chemical composition of bullets could connect two bullets rested on unexamined assumptions about the similarities and differences of the source lead from which the bullets were cast. FBI experts testified in thousands of cases that the facts ascertainable through CBLA established likely identity and therefore pointed toward the probable guilt of the accused. Sometimes, as in the case of Michael Behm, who was convicted of murder in 1997, CBLA provided essentially the only evidence of guilt.
In the 1990s, FBI metallurgist William Tobin began to question the validity of the technique. He felt strongly enough about the issue to research the matter, after his retirement in 1998, with Lawrence Livermore National Laboratory metallurgist Erik Randich. They analyzed data from two lead smelters in Alabama and Minnesota and discovered that the FBI techniques could not distinguish batches of lead produced months apart. They also discovered that differences existed within single batches. Their research was published in Forensic Science International in July 2002.
Although he still defended the technique, the director of the FBI Laboratory requested that the National Research Council (NRC) of the National Academy of Sciences review CBLA. In February 2004, the NRC report, titled Forensic Analysis: Weighing Bullet Lead Evidence, confirmed that only extremely limited claims could be made about the relationship between bullets based on CBLA. Given the NRC findings, a New Jersey appeals court overturned Behm’s conviction in March 2005. The results of the NRC study have obvious implications for many other cases as well.
In an article titled “Forensic Significance of Bullet Lead Compositions,” which appeared in the Journal of Forensic Sciences in March 2005, FBI research chemists Robert D. Koons and JoAnn Buscaglia argued that “compositional comparison of bullet lead provides a reliable, highly significant point of evidentiary comparison of potential sources of crime-related bullets.” In September of that year, however, the FBI announced that it would no longer use CBLA. (In a curious subsequent development, Tobin and a member of the NRC committee, Clifford Spiegelman, suggested that a reanalysis of the bullet fragments from the Kennedy assassination might be in order.)
An article published in New Scientist in April 2002 quoted Tobin as saying of the interpretation of bullet data based on CBLA, “It offended me as a scientist.” In fact, Tobin has a long record as a critic of FBI procedures he regards as bad science and of testimonial practices he regards as unwarranted by the scientific data. To complain about testimony that unreasonably goes beyond what the data can support is to respond equally to the demands of science and the demands of ethics. It is a feature of commonsense justice that the punishment should fit the crime, and a basic requirement of that, in turn, is that the people who are punished should be guilty. Violating that requirement is both bad science and bad ethics.
Joyce Gilchrist’s Tainted Evidence
Is it enough that the accused be guilty of some crime, or does it have to be the one in question? If the accused is guilty of the crime in question, does it matter whether the evidence actually shows that? The belief that one can convict the guilty by tweaking the evidence a little, or shading one’s testimony a bit, is among the most common sources of unethical (and, often enough, criminal) behavior on the part of forensic scientists. The cautionary tale of former Oklahoma City Police Department forensic scientist Joyce Gilchrist probably falls into this category.
In May 2007, Curtis Edward McCarty, who was facing his third trial for a 1982 murder, was freed as the result of the improper handling and representation of hair evidence by Gilchrist, who apparently had tried to frame McCarty. The judge dismissed the charge despite her belief that McCarty was probably not completely innocent. This was merely the latest in a series of episodes involving Gilchrist.
Questions about the integrity of Gilchrist’s work began as early as January 1987, when a Kansas City colleague, John Wilson, complained about her to the Southwestern Association of Forensic Scientists, without result. In 1998, Robert Miller was exonerated after he had been convicted a decade earlier based in part on Gilchrist’s testimony regarding blood, semen, and hair evidence. In 1999, Gilchrist was criticized by a judge for having given false testimony (regarding semen evidence) in the rape/murder trial of Alfred Brian Mitchell in 1992. In the spring of 2000, Jeffrey Todd Pierce was ordered released after he had served a decade and a half for a rape he did not commit; he had been convicted based on Gilchrist’s testimony. In January 2001, Gilchrist was criticized for the various judicial reprimands and professional critiques her work had received. In August 2001, doubts were raised about the guilt of Malcolm Rent Johnson, who had been executed for a 1981 rape and murder; Johnson was convicted based on Gilchrist’s testimony.
A month later, in September 2001, Gilchrist was finally fired, after years of reputedly shoddy forensics work, including both mishandling and misrepresentation of evidence, on many cases in addition to those noted above. The world of criminal justice contains innumerable isolated instances of perverse idealism, self-serving cynicism, and simple incompetence, but Gilchrist is one of the most striking cases of flagrant disregard for ethics in the forensics community. Was she genuinely convinced of the guilt of those against whom she testified? (She was certainly persuasive to juries.) Was she cynically distorting her testimony, and the evidence, to help prosecutors gain convictions, or was she just incompetent?
Ethics of Competence
One may well agree with forensics ethicist Peter D. Barnett’s remark that “there is a certain baseline level of competence that every criminalist is expected to understand, and there are certain procedures and protocols that are so fundamental to the practice of criminalistics that failure to follow them is evidence of gross incompetence or malfeasance, which is unethical.” As Barnett himself notes, however, “in the practice of forensic science, the disparate educational and experiential backgrounds of workers in the field make determination of a baseline level of competence relatively difficult.”
This is a problem throughout the American criminal justice system. In June 2007, all sergeants in the New Orleans Police Department were required to attend a four-day seminar to learn how to improve their (and their subordinates’) writing of police reports. This was part of an attempt to smooth out conflicts between the department and the New Orleans district attorney’s office, which claimed that part of its difficulty in prosecuting criminals stemmed from “incomplete or vague reports” by officers. More generally, criminalists frequently lament that frontline officers are not more skilled in observing, protecting, collecting, and preserving crime scene evidence.
One certainly can (in theory) impose reasonable expectations about competence and development in forensic science. However, that is not made easy by the variety of educational backgrounds and practical experience of the people who actually work in the field. In an unflattering assessment published in 2005, Jane Campbell Moriarty and Michael J. Saks bluntly asserted that “in the forensic sciences . . . 96 percent of practitioners hold bachelor’s degrees or less.” They went on to note:
Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.
Moreover, as with the difficulty in defining the qualifications for expert testimony, the fact that crime fighting is not a natural kind of expertise has an impact. Almost any expert might be relevant to a criminal case, depending on circumstances. Given the diverse forms of knowledge relevant to the application of science to crime solving, and to the providing of suitable expert testimony, it may be that the only truly unifying factor is the application of the so-called scientific method, broadly understood as intellectual integrity—the determined effort, as physicist Richard P. Feynman put it, not to fool oneself (or others).
What is impressive about the case of William Tobin is his determination to ensure that his colleagues (or former colleagues) not testify to more than the data warrant, both out of scientific integrity and out of fairness to those whose lives are affected by what scientists say. What is appalling about the case of Joyce Gilchrist is the stubbornness of her effort to resist correction by colleagues or even by the seemingly obvious limits of the evidence itself. Sometimes the individual needs to correct the group, by exposing a bogus or complacent consensus; sometimes the group needs to correct the individual, by identifying willful deception or self-centered fantasy. Unfortunately, no formula exists to guarantee the right result, and that is why ethics remains a constant challenge to conscientious souls.
Ethical dilemmas in forensics
- American Academy of Forensic Sciences (AAFS)
- American Society of Crime Laboratory Directors (ASCLD)
- Brain-wave scanners
- Criminal personality profiling
- DNA database controversies
- Ethics of DNA analysis
- Expert witnesses in trials
- Forensic journalism
- Innocence Project
- Interrogation in criminal investigations
- Training and licensing of forensic professionals
- Truth serum in interrogation
Last reviewed: October 2016
Barnett, Peter D. Ethics in Forensic Science: Professional Standards for the Practice of Criminalistics. Boca Raton: CRC, 2001. Print.
Inman, Keith, and Norah Rudin. Principles and Practice of Criminalistics: The Profession of Forensic Science. Boca Raton: CRC, 2001. Print.
Lucas, Douglas M. “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits.” Journal of Forensic Sciences 34 (1989): 719–29. Print.
Macklin, Ruth. “Ethics and Value Bias in the Forensic Sciences.” Journal of Forensic Sciences 42 (1997): 1203–206. Print.
Moriarty, Jane Campbell, and Michael J. Saks. “Forensic Science: Grand Goals, Tragic Flaws, and Judicial Gatekeeping.” Judges’ Journal 44.4 (2005): 16–33. Print.
Peterson, Joseph L., and John E. Murdock. “Forensic Science Ethics: Developing an Integrated System of Support and Enforcement.” Journal of Forensic Sciences 34 (1989): 749–62. Print.
Derived from: “Ethics.” Forensic Science. Salem Press. 2009.
Since the end of the 20th century, law enforcement has faced a rapid increase in computer-related crimes. Digital forensics has become an important aspect not only of law enforcement investigations but also of counter-terrorism investigations, civil litigation, and the investigation of cyber-incidents. Because technology develops and evolves rapidly, these forensic investigations can become complex and intricate. A general framework for digital forensic professionals to follow during such investigations, however, would support the successful retrieval of relevant digital evidence. A digital forensic framework or methodology should highlight all the phases a digital forensic investigation must go through to ensure accurate and reliable results. One of the challenges digital forensic professionals have faced in recent years is the volume of data submitted for analysis. A few digital forensic methodologies have been developed that provide a framework for the entire process and also offer techniques to help practitioners reduce the amount of data analyzed. This paper proposes a methodology that focuses on fulfilling the forensic aspect of digital forensic investigations while also including techniques that can assist practitioners in addressing the data volume issue.
Focused Digital Forensic Methodology
Modern society has become heavily dependent on computers and technology to run all aspects of life. Technology has had a very positive impact on humanity, which a short visit to any hospital easily demonstrates: computers and technology have become tools used to treat and save lives. However, computers have the indisputable disadvantage of also serving as tools that facilitate criminal activity. For instance, the sexual exploitation of children can be carried out over the Internet, which allows criminals to remain anonymous while preying on innocent children. The number of digital crimes is increasing, leaving law enforcement agencies engaged in a constant battle against criminals who use this technology to commit crimes. As a result, digital forensics has become an important part of law enforcement investigations. Digital forensics is not performed only during law enforcement investigations; it can also be conducted in the course of civil matters.
Because the information obtained from digital forensic investigations can and will be used as evidence in legal proceedings, the entire process must be performed according to legal standards. Perumal (2009) explained that the legal system requires digital forensic processes to be standardized and consistent. One of the issues Perumal highlighted is that digital evidence is very fragile, and the use of improper methods can alter or destroy that evidence. A huge number of methodologies have been developed all over the world, many of them designed to target a specific type of technology (Selamat, Yusof, and Sahib, 2008). Many methodologies were also developed to address requirements imposed by the legal system in certain jurisdictions. One methodology that based its theory on neither technology nor the law is the Integrated Digital Investigation Process (IDIP) Model. Carrier and Spafford (2003) explained that the IDIP Model is based on the Locard Exchange Principle, which is used to retrieve evidence from physical crime scenes. The IDIP Model builds on the idea that when software programs are executed in an electronic environment, electronic artifacts are most likely created on the underlying device. Those artifacts can be retrieved and analyzed to obtain information about a certain incident or event.
This paper examines literature presenting different types of digital forensic methodologies. Some of these methodologies have shifted the focus away from the forensic aspect of digital forensic investigations; instead, they address crime scenes and other processes unrelated to the digital forensic field. Much research has also focused on creating solutions to the challenges that digital forensic practitioners face when conducting investigations. One of the main challenges multiple studies have addressed is the constant increase in the volume of data that practitioners acquire and examine during investigations. This paper proposes a digital forensic methodology that would allow forensic practitioners to overcome the data volume issue and eliminate the lack of focus found in many methodologies.
As mentioned above, a large number of digital forensic methodologies have been developed all over the world. One of the first serious attempts to develop a standardized methodology for digital forensic investigations came in 1995. Pollitt (1995) used processes originally developed for handling physical evidence as the framework for a digital forensic methodology. The author’s approach to defining the phases of an investigation was inspired by the factors the legal system considers when evaluating any type of evidence: the court evaluates whether the seizure was properly conducted, whether any alteration of the evidence occurred, and what methods were used to examine the evidence. Proper performance of all these steps allows any type of evidence to be admitted by the court. Pollitt (1995) developed a methodology consisting of four phases: acquisition, identification, evaluation, and admission as evidence. This methodology focused mostly on the forensic aspect of the investigation and did not extend to processes that other researchers have included in their methodologies, such as preparation, planning, and the search of the physical crime scene.
Many digital forensic methodologies were developed after Pollitt’s. Some use different terminology and different sequencing for the phases a forensic practitioner performs during an investigation, but most agree on certain processes related to the forensic aspect of digital forensic investigations (Selamat et al., 2008). Selamat et al. studied 11 different digital forensic methodologies and concluded that all of them share common phases: preservation, collection, examination, analysis, and reporting. This means that these researchers have converged on these specific phases, and any newly developed methodology would have to include similar ones.
According to Ruibin, Yun, and Gaertner (2005), the proper completion of any digital forensic investigation depends directly on conducting processes similar to those highlighted by Selamat et al. (2008). Ruibin, Yun, and Gaertner also explained that performing these phases properly requires proper technique. These techniques ensure that the authenticity and reliability of the evidence meet the legal standards of the jurisdiction where the methodologies are implemented. The term legal is not foreign to digital forensic methodologies; many researchers have addressed the importance of performing digital forensic investigations under the proper legal authorizations. For instance, the Integrated Digital Investigation Process (IDIP) Model includes obtaining legal authorization to perform a digital forensic investigation as one of the sub-phases of its Deployment Phase (Baryamureeba and Tushabe, 2004).
The IDIP Model was revised by Baryamureeba and Tushabe (2004) after they noticed some problems with its practicality and clarity. They named the revised version the Enhanced Integrated Digital Investigation Process (EIDIP) Model. In the EIDIP Model, the authors focused on two aspects: the sequence of the phases and clarity regarding the processing of multiple crime scenes. The EIDIP proposed a modification to the deployment phase of the IDIP Model and included investigation of the physical and digital scenes as the way to arrive at the confirmation phase. Baryamureeba and Tushabe also added the Traceback phase, which aims at using information obtained from the victim’s machine to trace back the other machines used to initiate the incident.
Challenges Facing Current Methodologies
Technology has been developing along two axes: hardware and software. For instance, mobile device hardware has seen great advances that allow these devices to perform complex operations efficiently. At the same time, the software that operates on mobile devices has undergone great advancements to support consumer demand. These enormous changes in technology have affected digital forensic methodologies, especially those built around a certain set of technologies. Selamat et al. (2008) explained that many digital forensic methodologies have been developed to target certain devices or a specific type of technology. The main problem with this type of methodology is the rapid change of the underlying technology, which renders such methodologies obsolete.
Legal systems in different jurisdictions view digital forensics differently, and this is reflected in the methodologies used by forensic practitioners in those jurisdictions. According to Perumal (2009), “As computer forensic is a new regulation in Malaysia, there is a little consistency and standardization in the court and industry sector. As a result, it is not yet recognized as a formal ‘Scientific’ discipline in Malaysia” (p. 40). The jurisdictions most affected are therefore those that have not developed a full understanding of computer forensics and the processes used to preserve and analyze electronic evidence. This lack of understanding also reduces the influence the legal system can have on shaping methodologies and making them acceptable under legal standards.
Another challenge related to digital forensic methodologies has been created by the researchers who develop them. Some researchers have named their methodologies, or the phases within them, using terminology unrelated to digital forensics. For instance, Baryamureeba and Tushabe (2004) named the fourth stage of the EIDIP Model the Dynamite Phase, which includes Reconstruction and Communication as sub-phases. This lack of clarity in terminology hampers any effort to help groups outside the digital forensic arena better understand the processes used in the field. As explained above, some jurisdictions across the globe are still unsure about digital forensics and the evidence obtained from a digital forensic investigation. There would therefore be advantages in researchers using more modest and familiar terminology when describing digital forensic methodologies; such simplicity would help the legal system better understand digital forensic processes.
The challenges the digital forensic field currently faces can also be seen as challenges to digital forensic methodologies. Among them are the volume of data, encryption, anti-forensic techniques, and the lack of standards. These factors are perceived as challenging because they slow the progress of digital forensic investigations. For instance, a fully encrypted computer system with an undisclosed encryption key prevents practitioners from gaining access to the data, which imposes a challenge on the entire digital forensic process. One challenge addressed by many researchers is the volume of data examined by forensic practitioners during an investigation: examining a large amount of information slows down the entire forensic process and affects the flow of operations in forensic laboratories.
Volume of Data
The increase in the volume of data was caused by the growing capacity of electronic storage devices combined with the significant decrease in their prices. According to Quick and Choo (2014), digital forensic practitioners have seen a significant increase in the amount of data analyzed in every digital forensic examination. The authors identified three contributing factors: the increase in the number of electronic devices per investigation, the increase in the storage capacity of those devices, and the growing number of investigations that require digital forensic examinations.
This significant increase in the volume of data has placed a great burden on forensic laboratories. Digital forensic practitioners are forced to spend longer examining and analyzing larger data sets, causing long delays in completing digital forensic investigations. Quick and Choo (2014) explained that forensic laboratories have backlogs of work caused by the amount of time spent on each examination. Some may argue that the increased amount of data can be an advantage, as consumers are no longer forced to delete data. However, Quick and Choo (2014) highlighted the seriousness of the data volume issue: “Serious implications relating to increasing backlogs include; reduced sentences for convicted defendants due to the length of time waiting for results of digital forensic analysis, suspects committing suicide whilst waiting for analysis, and suspects denied access to family and children whilst waiting for analysis” (p. 274).
Current Solutions to the Data Volume Issue
Ruibin et al. (2005) offered a way to reduce the amount of data for each case by determining the relevant data from the preliminary investigation. The relevant data for a case can be determined through two sources: the case information and the person investigating the matter. Combining information from both sources helps build a case profile, which can then be used to determine what type of information would be relevant to each forensic examination. The solution proposed by Ruibin et al. (2005) did not rely on technology to reduce the amount of data; other proposed solutions, however, suggested using technology to eliminate the data volume issue.
Neuner, Mulazzani, Schrittwieser, and Weippl (2015) proposed a technique that depends on technology to reduce the data volume. Their technique is based on file deduplication and file whitelisting, which allow digital forensic practitioners to reduce the number of files that must be reviewed and thus the time needed to complete the entire process. The authors described the whitelisting technique as comparing a known list of hash values to the evidence and excluding any matches, since the contents of those files have already been determined; the list of hash values can represent either known-safe files or known contraband. The method offered by Neuner et al. (2015) is completely automated and requires no human intervention, meaning no manpower is needed to perform this task. Both solutions explained above are uncontroversial, as they use standard methods to locate the evidence. That is not the case, however, with all solutions proposed to solve the data volume issue.
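The combination of deduplication and whitelisting described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, and a real whitelist would come from a curated hash set such as a reference library of known operating-system files.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in fixed-size chunks so large files do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def reduce_file_set(paths, known_hashes):
    """Return only files whose hashes are not on the whitelist,
    deduplicating byte-identical files along the way."""
    seen = set()
    remaining = []
    for p in paths:
        digest = sha256_of(p)
        if digest in known_hashes:   # whitelisted: contents already known
            continue
        if digest in seen:           # duplicate of a file already kept
            continue
        seen.add(digest)
        remaining.append(p)
    return remaining
```

Every excluded file is one less item an examiner must open, which is exactly the time saving Neuner et al. aim at.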
Quick and Choo (2014) explained that many researchers have addressed the data volume issue through the notion of sufficiency of examination. These researchers did not focus on reducing the amount of data to be examined; instead, they proposed limiting the search for electronic information to the amount that would answer the main questions of the investigation. In other words, if a digital forensic practitioner is examining a computer system for evidence related to a child pornography investigation, the examination can be stopped once enough evidence has been found to charge the suspect. Not all of the computer system would be examined, and practitioners would be able to complete more examinations in a shorter period of time.
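The sufficiency-of-examination idea amounts to an early-terminating scan. The sketch below is a simplified illustration; `is_evidence` and `threshold` are placeholders for whatever evidentiary test and charging standard apply in a given case.

```python
def sufficient_examination(items, is_evidence, threshold):
    """Scan items in order and stop as soon as `threshold` evidentiary
    hits are found. Returns the hits and how many items were actually
    inspected, so the saving over a full scan is visible."""
    hits = []
    inspected = 0
    for item in items:
        inspected += 1
        if is_evidence(item):
            hits.append(item)
            if len(hits) >= threshold:
                break  # enough evidence found; remaining items go unexamined
    return hits, inspected
```

The trade-off the surrounding text debates is visible in the return value: everything after `inspected` items is never looked at.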
Some may argue that conducting a partial examination of the data could lead to missing valuable digital evidence. In the child pornography example above, a full examination might uncover pictures and videos of minors being sexually exploited; those victims could potentially be identified and rescued, so a partial examination could prevent their rescue. Still, the assumption that a full examination could identify other victims does not stand against the facts presented by Quick and Choo (2014), who highlighted the grievous negative effects of delaying the results of a forensic exam. These effects fall on suspects who wait a long time for practitioners to conclude their exams, and equally on victims, who cannot get closure during the investigation while they await its conclusion.
The Proposed Methodology
Digital forensic practitioners continue to face a great deal of pressure from the challenges explained earlier. These challenges place obstacles in the way of examinations, causing practitioners to spend extensive amounts of time on each investigation. Researchers, however, do not appear to have invested much effort in creating methodologies that would ease this pressure and eliminate the challenges. Some may pose a compelling counterargument: not all challenges can be resolved within a methodology. Data encryption, for instance, prevents practitioners from accessing evidence and is a technical challenge; researchers cannot address it within a digital forensic methodology, since methodologies are supposed to address the higher-level processes of an investigation. Even so, researchers can still create methodologies that at least reduce the workload pressure on practitioners.
Selamat et al. (2008) described the Extended Model presented by Ciardhuain (2004) as a complete framework providing clear stages for digital forensic investigations. Ciardhuain (2004) suggested that multiple steps must be taken throughout a digital forensic investigation: awareness, authorization, planning, notification, search and identification of evidence, collection, transport, storage, examination, hypotheses, presentation, proof/defense, and dissemination. Selamat et al. (2008) also noted that the Extended Model includes all five phases agreed upon by many researchers. This means the remaining phases of the Extended Model are not specifically forensic, and removing them would not affect the authenticity or reliability of the electronic evidence. However, some may argue that the Extended Model was not intended to be carried out only by digital forensic practitioners, as many of its phases can be performed by others. In a criminal investigation, for instance, the authorization phase can be completed by law enforcement personnel to grant forensic practitioners lawful access to the evidence. In that sense, the Extended Model is a framework not only for practitioners but for all investigations that involve digital evidence.
This paper proposes a new methodology, the Focused Digital Forensic Methodology (FDFM), which is capable of eliminating the data volume issue and the lack of focus found in current digital forensic methodologies. The FDFM is designed to reflect the current workflow of law enforcement and civil investigations, and it focuses on ensuring that digital forensic investigations are conducted properly without overloading practitioners with unrelated activities. Further, the FDFM proposes techniques to reduce the volume of data and so cut the time required to complete examinations of electronic evidence, especially on larger data sets.
A quick review of the Extended Model leads to the conclusion that it focuses on evidence-related activities that are considered common knowledge in the law enforcement and digital forensic fields. For instance, the transportation and storage activities of the Extended Model are two parts of evidence handling in which law enforcement personnel and digital forensic practitioners are already trained, so there is little value in emphasizing that practitioners must transport and store the evidence after it is collected. For that reason, the FDFM excludes aspects of proper evidence handling that can be considered common knowledge in the digital forensic field.
Phases of the FDFM
The FDFM is designed in a way similar to the IDIP and EIDIP, as it is broken down into multiple phases, which are further broken down into sub-phases.
Preparation Phases. The preparation phase applies in two situations: when creating a new digital forensic team and after completing a digital forensic investigation. It is mainly aimed at ensuring that digital forensic teams, whatever their mission, are capable of initiating and completing an investigation properly and without problems. As with the EIDIP, the focus of this phase is ensuring that the forensic team is trained and equipped for its assigned mission.
Training. To execute a forensic methodology properly, the forensic team must have training that equips every member with the needed knowledge. This training covers not only how to collect and search evidence properly but also how to use the tools needed in those processes. For instance, the team should understand the value of preserving data and should be able to use any software or hardware tools capable of preserving and acquiring data from different platforms.
Equipment. Digital forensic professionals use many tools during investigations, each serving different purposes. One function these tools can accomplish is automating different processes during the investigation and completing those jobs in a timely manner. Depending on the forensic team’s main mission, tools and equipment must be available to ensure the proper and fast completion of any digital investigation.
On-Scene Phases. These phases cover the processes accomplished when a digital forensic team is called in to assist with executing search warrants or with preserving data for civil litigation.
Physical Preservation. During this phase, digital forensic practitioners ensure that any item that may contain electronic information of value is protected, so that no damage is inflicted on any electronic device that could cause the loss of the information within. For instance, if a digital forensic practitioner finds one of the items sought during the search of a residence, that item must be kept in a location under the control of the searching team. Occupants of the residence are then unable to damage the device, which they might attempt if they believed it contained incriminating evidence.
Electronic Isolation. The other part of the preservation is ensuring that the information within the collected device is preserved by eliminating any electronic interference. An item like a mobile device would have to be isolated from the cellular network to ensure that no one is capable of damaging or changing the data remotely. The same case applies to computers or other items with network connectivity, as suspects could connect remotely to those devices and attempt to eliminate any information that is potentially damaging to their case.
Electronic Preservation. The last part of preservation is creating a complete or partial duplicate of the targeted electronic information. The created copy ensures that the electronic evidence is preserved in a forensic image. If devices are not to be removed from the scene, forensic images must be created on scene and then taken back to the laboratory for examination. A partial collection can also be conducted during this phase, which is especially advantageous in civil litigation; in those cases, collecting an entire computer system is not recommended, and practitioners search for and collect only the relevant data.
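The duplication step above can be sketched as a single-pass copy with integrity hashing. This is a simplified illustration only: real acquisitions use write-blockers and established imaging tools, and the function name here is hypothetical. The hash recorded at acquisition is what later demonstrates the image still matches the original evidence.

```python
import hashlib

def acquire_image(source_path, image_path, chunk=1 << 20):
    """Copy a source file/device into a raw image while computing a
    SHA-256 digest in the same pass, then re-hash the written image
    to verify the copy before the digest is recorded."""
    src_hash = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(chunk)
            if not block:
                break
            src_hash.update(block)
            dst.write(block)

    # Re-read the image and confirm it hashes identically to the source.
    img_hash = hashlib.sha256()
    with open(image_path, "rb") as img:
        for block in iter(lambda: img.read(chunk), b""):
            img_hash.update(block)

    if src_hash.hexdigest() != img_hash.hexdigest():
        raise IOError("image verification failed: hashes differ")
    return src_hash.hexdigest()
```

The same recorded digest supports the laboratory phases later in the methodology, since any examiner can re-hash the image to show it is unchanged.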
Laboratory Phases. These phases are conducted when the team is at the laboratory and has all the evidence or forensic images collected from the scene. Prior to interacting with the evidence, an attempt is made to reduce the volume of data that an examiner has to process and analyze. Reducing the data volume is crucial for cases that involve a large number of electronic devices and a large volume of electronic storage. As explained earlier, reducing the volume of data expedites the examination of the evidence and allows for a better workflow. Once the reduction of data has been completed, examiners can begin processing and analyzing data.
Building Case Profile. This phase is somewhat similar to the creation of a case profile presented by Ruibin et al. (2005). These authors suggested using input from the investigation to identify the type of information that a digital forensic practitioner would need to focus on finding during the examination. For instance, if the case is related to pictures of evidentiary value, then a practitioner would focus on reviewing all the pictures found on a device, without the need to review any other type of electronic information. For the FDFM, building a case profile would begin, as in Ruibin et al. (2005), with obtaining information from the investigating agency regarding their investigation. The obtained information must focus specifically on the relationship between the electronic evidence and the incident under investigation. In other words, the investigating agency would have to explain what role the devices played in the incident. For instance, a mobile device might be submitted to the digital forensic team as part of a rape investigation. The investigating agency would have to explain how the mobile device was related to the incident; in this situation, the phone could have been used by the suspect to communicate with or lure the victim to a certain location.
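The investigative input described above can be thought of as a small structured record. A hedged sketch of what such a case profile might look like in Python; the field names are illustrative assumptions, not terminology from the FDFM itself:

```python
from dataclasses import dataclass, field

@dataclass
class CaseProfile:
    """Container for the investigative input that drives the examination.
    The fields are hypothetical; an agency's actual profile form may differ."""
    incident_type: str
    relevant_data_types: list = field(default_factory=list)
    likely_device_types: list = field(default_factory=list)
    goals: list = field(default_factory=list)

# Hypothetical profile for the rape-investigation example in the text:
profile = CaseProfile(
    incident_type="luring/assault",
    relevant_data_types=["text messages"],
    likely_device_types=["mobile device"],
    goals=["recover suspect-victim messages", "establish message timeline"],
)
```

Capturing the profile explicitly makes the later phases (evaluation, acquisition, partial examination) traceable back to the investigating agency's stated needs.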
Once all the information has been obtained regarding the investigation and the evidence, the type of data relevant to the investigation can be determined. In the rape case above, the suspect may have used text messages to communicate with the victim, which makes any text-based communication between the two parties relevant; it must be reviewed and analyzed. In some cases, determining the approximate size of the information sought is also relevant and allows the examination to be even more focused. For instance, suppose an investigating agency submitted one 500-gigabyte external drive and one 8-gigabyte thumb drive to the digital forensic team for examination, and the investigation was seeking documents approximately 30 gigabytes in size relevant to a fraud case. In this situation, a forensic practitioner would focus on the external drive, as it most likely contains the documents because its capacity exceeds the 30 gigabytes sought. In the same sense, the 8-gigabyte thumb drive most likely does not hold the documents because of its small capacity.
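The size-based reasoning above amounts to a simple capacity filter. A minimal sketch, using the hypothetical 500 GB / 8 GB submission from the text:

```python
def triage_by_capacity(devices, target_gb):
    """Split submitted devices into those large enough to hold the data
    sought and those that can be deprioritized for their small size."""
    likely = [d for d in devices if d["capacity_gb"] >= target_gb]
    unlikely = [d for d in devices if d["capacity_gb"] < target_gb]
    return likely, unlikely

# Hypothetical submission: investigation seeks roughly 30 GB of documents.
devices = [
    {"name": "external_drive", "capacity_gb": 500},
    {"name": "thumb_drive", "capacity_gb": 8},
]
likely, unlikely = triage_by_capacity(devices, target_gb=30)
```

Here the 500 GB external drive lands in the "likely" group and the 8 GB thumb drive is deprioritized, mirroring the practitioner's reasoning; capacity is only a heuristic, so deprioritized devices are set aside, not discarded.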
The next step of the case profile is determining the most relevant type of devices based on the information obtained thus far about the investigation. If the examination is seeking text-based messages sent to the victim, then a mobile device was likely used to send those messages. This does not mean that a computer could not have been used to send such messages, but a mobile device is the most likely source in this situation. The profile can then place further emphasis on the possibility of a mobile device being the source of the relevant messages. It is apparent from the above that building a case profile requires an experienced digital forensic practitioner; applying prior experience and knowledge while building the case profile results in a more accurate and realistic conclusion.
The final step of the case profile is setting goals that the investigation hopes to accomplish by the end of the examination. These goals are based on all the information gathered thus far and on how the evidence is going to prove or disprove an incident or event. For instance, in the rape investigation mentioned above, retrieving the text messages from the mobile device is the first goal, to show that the suspect had lured the victim to the crime scene. The timeline of those messages is another piece of information that would corroborate the victim's statement regarding different events prior to the incident. While examining the mobile device, these goals make it clear to the digital forensic practitioner what results are expected from the examination of the device.
In civil litigation cases, and even criminal cases, it would be beneficial during this phase to generate a list of keyword searches that could be used during the examination process. Those keywords must be related directly to the evidence and based on the information obtained about it. When dealing with civil litigation cases, the keyword list becomes the main method used to locate any responsive and relevant electronic information. The words added to the list must also relate to the goals set to be accomplished at the end of the investigation. For instance, if the civil case is related to patent infringement, then the keywords would reveal documents or data that could prove or disprove the allegations submitted by one party against the other.
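A keyword list of this kind is ultimately applied as a search over the collected material. A simplified sketch of that step, scanning plain-text files for case keywords; real forensic tools search parsed document formats and unallocated space, which this toy example does not attempt:

```python
import re
from pathlib import Path

def keyword_hits(root, keywords):
    """Scan .txt files under `root` and report which case keywords
    each file contains (case-insensitive)."""
    pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
    hits = {}
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        matches = {m.group(0).lower() for m in pattern.finditer(text)}
        if matches:
            hits[str(path)] = sorted(matches)
    return hits
```

For the patent-infringement example, the keyword list might contain terms like "patent" or specific product names; only files producing hits would move forward for relevance review.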
Evaluation of the Evidence. This evaluation occurs based on the case profile created during the previous phase. The main goal of this phase is to exclude any device that does not match the case profile and most likely would not hold any information sought during the examination of the evidence. Excluding items that do not hold relevant information reduces the amount of information that a digital forensic practitioner examines in each case, which ultimately cuts down on the time needed to complete the examinations and gives investigators faster results. As explained above, the case profile determines the type of electronic information, the timeline related to that information, the size of the electronic information, and other aspects of the evidence. Using the generated profile, the items of evidence can be listed so that the device most likely to contain the information is placed at the top of the list. The rest of the devices are ordered in the same way, based on the likelihood that they contain the targeted data; the last item on the list is the least likely to contain information relevant to the investigation.
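The ordering described above can be sketched as a simple scoring sort: each item earns points for matching the profile, and items are examined highest score first. The scoring weights and field names here are illustrative assumptions, not part of the FDFM:

```python
def rank_evidence(items, profile):
    """Order evidence items so the device most likely to hold the
    targeted data is examined first. Scoring is a toy heuristic:
    +2 for a profile-matching device type, +1 for sufficient capacity."""
    def score(item):
        s = 0
        if item["type"] in profile["likely_device_types"]:
            s += 2
        if item["capacity_gb"] >= profile["target_size_gb"]:
            s += 1
        return s
    return sorted(items, key=score, reverse=True)

# Hypothetical submission ranked against a messages-focused profile:
items = [
    {"type": "laptop", "capacity_gb": 500},
    {"type": "mobile", "capacity_gb": 64},
    {"type": "thumb drive", "capacity_gb": 8},
]
profile = {"likely_device_types": {"mobile"}, "target_size_gb": 30}
ranked = rank_evidence(items, profile)
```

In this toy run the mobile device ranks first and the thumb drive last, matching the list-building logic in the text; a real evaluation would weigh many more profile factors, including timelines.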
Forensic Acquisition. During this phase, any data that was not acquired or preserved on scene would be imaged. The imaging process focuses on the items at the top of the list generated during the Evaluation of the Evidence phase. This means that only items believed to hold the relevant data would be acquired, not the entire list of items. This reduces the storage space required for the forensic images and also reduces the time required for imaging.
Partial Examination/Analysis. Many digital forensic models separate the examination phase from the analysis phase, as is the case for the Abstract Digital Forensic Model (Reith, Carr, and Gunsch, 2002). Those models assume that a digital forensic practitioner would search the evidence for any relevant data during the examination phase; an analysis would then be conducted during the analysis phase to determine the significance of the information found. In the FDFM, those two phases are combined into one phase, as they cannot be separated: they occur at the exact same moment during the search of the evidence. The FDFM proposes creating a case profile prior to examining the evidence, which means that a digital forensic practitioner would know what to search for, and the significance of that information, beforehand. So, while searching the evidence, examiners are able to determine the significance of the information as it is being located.
An example of the situation explained above is the rape incident mentioned earlier and the mobile device that holds the information related to the incident. The practitioner had prior knowledge that the targeted information, related to luring the victim to the incident location, was in the text messages. While reviewing the text messages, the digital forensic practitioner found a message that showed how the suspect and victim had met. Then, on the day of the incident, other messages were found showing how the victim was lured to the location of the incident. As the forensic practitioner reviews the messages, they evaluate the information simultaneously, weighing the significance of each piece of information as it is found; each text message adds another piece of information related to the incident. At the end of this phase, the forensic practitioner has a full picture of what happened on that day and any other information related to the incident. It would not be practical for the forensic practitioner to go back and reanalyze information that was already analyzed as it was being found. It is also worth mentioning that digital forensic practitioners usually mark the relevant data as it is found to ensure that it is included in the final report generated by the tool used to review the evidence.
The FDFM suggests the use of a method referenced by Quick and Choo (2014), which aims at conducting a partial examination and analysis of the evidence as a solution to the data volume issue. According to Quick and Choo (2014), the digital forensic practitioner would be "doing enough examination to answer the required questions, and no more" (p. 282). The same concept can be applied using the FDFM, as digital forensic practitioners conduct examinations only to accomplish the goals set during the Building Case Profile phase. The examination would be performed in the same sequence of relevance as determined during the Evaluation of the Evidence phase. This means that the examination would begin with the most relevant items and continue down the list until all the goals are accomplished. One of the greatest benefits of conducting partial examinations of the evidence is maintaining the privacy of the owners. For instance, mobile devices contain a vast amount of private information about their owners, which means that the more information is reviewed, the more privacy is lost. Limiting the examination to the needs of the investigation helps maintain a certain degree of privacy for the owner of the data.
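The "enough and no more" stopping rule can be sketched as a loop over the relevance-ordered device list that halts once every investigative goal is satisfied. The callback and device names below are hypothetical:

```python
def partial_examination(ranked_devices, goals, examine):
    """Work down the relevance-ordered list, stopping once every goal is
    satisfied. `examine(device, remaining)` stands in for the actual
    forensic review and returns the subset of goals that device answered."""
    remaining = set(goals)
    examined = []
    for device in ranked_devices:
        if not remaining:          # all goals met: stop, no further review
            break
        examined.append(device)
        remaining -= examine(device, remaining)
    return examined, remaining

# Hypothetical run: the phone alone satisfies both goals, so the
# laptop and thumb drive are never opened.
findings = {"phone": {"messages", "timeline"}, "laptop": {"messages"}}
examined, unmet = partial_examination(
    ["phone", "laptop", "thumb drive"],
    {"messages", "timeline"},
    lambda device, remaining: findings.get(device, set()) & remaining,
)
```

Devices left unexamined stay preserved and available should new goals emerge, which is also where the privacy benefit described above comes from.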
Reporting. During this phase, all the information found during the examination is placed in a report to inform the investigating agency of the findings. There are many report formats used by different agencies. However, all formats have one thing in common: they are driven by the findings of the examination, not opinions. Reports supported by solid evidence are hard to dispute, as the evidence behind the information in those reports would counter any arguments against it.
Conclusions and Future Research
Many works in the literature propose different types of methodologies with different focuses. However, many of those methodologies include phases that are not related to the forensic aspect of the work. Many researchers have also addressed the volume of data that is causing huge delays in digital forensic exams. The proposed methodology, the FDFM, allows digital forensic professionals to focus more on forensics during any digital forensic investigation. This methodology excludes phases that were included in other methodologies but are considered common knowledge within the digital forensic field. The proposed methodology also addresses one of the biggest challenges for digital forensic investigations, which is the volume of data. The FDFM proposes two methods that allow for a reduction in the volume of data: excluding devices that do not contain relevant information and conducting partial examinations. Both techniques can be applied only after a case profile has been generated based on the information obtained from the investigating agency. Future research can focus on integrating other techniques into the FDFM that would address other challenges of digital forensic investigations.
Agarwal, A., Gupta, M., & Gupta, S. (2011). Systematic digital forensic investigation model. International Journal of Computer Science and Security (IJCSS), 5(1), 118-131.
Baryamureeba, V., & Tushabe, F. (2004). The Enhanced Digital Investigation Process Model. Proceedings of the Digital Forensic Research Conference. Baltimore, MD.
Carrier, B., & Spafford, E. H. (2003). Getting physical with the Digital Investigation Process. International Journal of Digital Evidence, 2(2), 1-20.
Ciardhuain, S. O. (2004). An extended model of cybercrime investigations. International Journal of Digital Evidence, 3(1), 1-22.
Neuner, S., Mulazzani, M., Schrittwieser, S., & Weippl, E. (2015). Gradually improving the forensic process. In the 10th International Conference on Availability, Reliability and Security, 404-410. IEEE.
Perumal, S. (2009). Digital forensic model based on Malaysian investigation process. International Journal of Computer Science and Network Security (IJCSNS), 9(8), 38-44.
Pollitt, M. M. (1995). Computer forensics: An approach to evidence in cyberspace. In the 18th National Information Systems Security Conference, 487-491. Baltimore, MD.
Quick, D., & Choo, K. R. (2014). Impact of increasing volume of digital forensic data: A survey and future research challenges. Digital Investigation, 11(4), 273-294.
Reith, M., Carr, C., & Gunsch, G. (2002). An examination of the digital forensic models. International Journal of Digital Evidence, 1(3), 1-12.
Ruibin, G., Yun, T., & Gaertner, M. (2005). Case-relevance information investigation: binding computer intelligence to the current computer forensic framework. International Journal of Digital Evidence, 4(1), 147-67.
Selamat, S. R., Yusof, R., & Sahib, S. (2008). Mapping process of digital forensic investigation framework. International Journal of Computer Science and Network Security, 8(10), 163-169.
About the Author
Haider Khaleel is a Digital Forensics Examiner with the US Army, and previously a field agent with Army CID. Haider received a Master's Degree in Digital Forensic Science from Champlain College. The ideas presented in this article do not reflect the policies, procedures, and regulations of the author's agency.
Correspondence concerning this article should be addressed to Haider H. Khaleel, Champlain College, Burlington, VT 05402. firstname.lastname@example.org