

Charles Manson Dead at 83

CALIFORNIA — Charles Manson, the wild-eyed 1960s cult leader whose followers committed heinous murders that terrorized Los Angeles and shocked the nation, died Sunday of natural causes, according to the California Department of Corrections. He was 83. 

The diminutive and charismatic Manson orchestrated a wave of violence in August 1969 that took the lives of seven people, spawned headlines worldwide and landed him and his “Manson Family” of followers in prison for most of the remainder of their lives.

Manson served nine life terms in California prisons and was denied parole 12 times. His notoriety, boosted by popular books and films, made him a cult figure to those fascinated by his dark apocalyptic visions.

“He was the dictatorial ruler of the (Manson) family, the king, the Maharaja. And the members of the family were slavishly obedient to him,” former prosecutor Vincent Bugliosi told CNN in 2015.

To the point that they would kill for him.

The brutal killings began on August 9, 1969, at the home of actress Sharon Tate and her husband, famed movie director Roman Polanski, who was out of the country at the time. The first set of victims were Tate, who was eight months pregnant; celebrity hairstylist Jay Sebring; coffee fortune heiress Abigail Folger; writer Wojciech Frykowski; and Steven Parent, a friend of the family’s caretaker.

The next evening, another set of murders took place. Supermarket executive Leno LaBianca and his wife, Rosemary, were killed at their home.

Although Manson ordered the killings, he didn’t participate.

Over the course of two nights, the killers took the lives of seven people, inflicting 169 stab wounds and seven .22-caliber gunshot wounds. Both crime scenes revealed horrifying details. And a few details linked the two crime scenes.

The word “pig” was written in the victims’ blood on the walls of one home and on the front door of the other. Another phrase was apparently scrawled in blood: “Helter Skelter” (misspelled as “Healter”). The reason for the disturbing writings, the prosecutor argued, was that Manson wanted to start a race war and had hoped the Black Panthers would be blamed for the killings.

On June 16, 1970, Manson and three of his followers — Susan Atkins, Patricia Krenwinkel and Leslie Van Houten — went on trial in Los Angeles.

All of those details came tumbling out in the trial that both mesmerized and horrified the nation. During the trial, Manson and his followers created a circus-like atmosphere in the court with singing, giggling, angry outbursts and even carving X’s in their foreheads.

The charges came after a major break in the case when Atkins, who was already in jail on another charge, bragged to a fellow inmate about the Tate murders. She said they did it “because we wanted to do a crime that would shock the world. …”

Manson was originally sentenced to death, but the death penalty was briefly abolished in California, and his concurrent sentences were commuted to life in prison.

He also was convicted in connection with the killings of Gary Hinman, a musician, and stuntman Donald “Shorty” Shea in 1969.

Early life

Charles Manson was born Charles Maddox in Cincinnati in 1934 to an unmarried 16-year-old mother. He later took his then-stepfather William Manson’s last name.

At age 12, Charles Manson was sent to Gibault School for Boys in Terre Haute, Indiana, for stealing. Over the next 20 years, he was in and out of reform schools and prison for various crimes.

In a 1987 prison interview with CNN, he said, “I spent the best part of my life in boys’ schools, prisons, and reform school because I had nobody.”

After marrying twice and spending half his life in prison, 32-year-old Manson made his way to Berkeley, California, by way of San Francisco in 1967. He established himself as a guru during the Summer of Love and soon shared a home with 18 women.

By 1968, race riots, the Black Panther movement and anti-war violence convinced Manson that Armageddon was coming. He called it “Helter Skelter,” after the famous Beatles song.

Charismatic figure

The so-called Manson Family made a dilapidated old movie set called Spahn’s Ranch near Los Angeles their home.

“I was mesmerized by his mind and the things he professed,” Manson Family member Leslie Van Houten once said.

At the ranch, Manson, who stood 5 feet 2 inches, hosted LSD-fueled orgies and gave sermons. His followers were in thrall to Manson, who told them he was Jesus Christ and the devil rolled into one.

“They worshipped Charlie like a god,” former Manson Family member Barbara Hoyt told CNN.

Music a part of his life

While in prison as a young man, Manson would listen to the radio. Inspired by the Beatles, he started writing songs and performing in prison shows.

Manson believed that the Beatles were speaking to him through the lyrics of the White Album, which was released in late 1968. The apocalyptic message, as Manson interpreted it: Blacks would “rise up” and overthrow the white establishment in a race war. Manson and his Family would be spared by hiding out in a “bottomless pit” near Death Valley until he could emerge to assume leadership of the post-revolutionary order.

After moving to California, Manson met Hinman, a music teacher who introduced him to Dennis Wilson of the Beach Boys.

Wilson took one of Manson’s songs, “Cease to Exist,” and turned it into the Beach Boys’ “Never Learn Not to Love.” Manson was furious when he didn’t get a songwriting credit.

Wilson had introduced Manson to record producer Terry Melcher, the son of actress Doris Day. After initially showing interest in Manson’s music, Melcher declined to work with him further.

Melcher later moved out of his house, which was then leased to Polanski and Tate.

Cult figure

Manson got people everywhere to pay attention to him.

With their brew of violence, music and anti-establishment youth counterculture, the 1969 murders and ensuing trials established Manson as a perverse cultural icon that endured until his death. Along the way, the mastermind transcended his victims, and the Tate-LaBianca murders became known as the Manson murders.

Laurie Levenson, a professor at Loyola Law School who follows high-profile cases, described Manson in 2009 as the worst of the worst, evil incarnate.

“If you’re going to be evil, you have to be off-the-charts evil, and Charlie Manson was off-the-charts evil,” Levenson told CNN.

Manson’s image can still be found on posters and T-shirts. In 1998, the animated television series “South Park” featured Manson in a Christmas special. There have been books, a play, an opera and television movies about Manson and his followers.





Ethical Dilemmas in Forensics


Definition: Principles of conduct, moral duty, and obligation that guide individuals in their decisions and actions.

Significance: As scientists, forensic scientists have a professional obligation to seek and to speak the truth about matters within their purview. As participants in a forensic process, they are subject to additional, sometimes conflicting, duties. This tension generates many ethical dilemmas.

Although witnesses in American courtrooms are called upon to tell the truth, the whole truth, and nothing but the truth, they may be enjoined from volunteering information. A witness’s individual sense of relevance must often bow to a court’s judgment. The legal system seeks truth, yet it sometimes defers to other values, such as fairness and confidentiality, and in general demands acceptance of formalized rules of procedure. In their capacity as experts, forensic scientists typically enjoy greater latitude than ordinary witnesses in expressing opinions and making judgments in the courtroom, but they too must operate within the often cumbersome and sometimes counterintuitive requirements of the “system” of “justice.”

[Image: Entrance sign at the Society of Forensic Toxicologists (SOFT) meeting, July 5, 2012. Photo by Monika M. Wahi (own work), CC BY-SA 3.0, via Wikimedia Commons.]

Forensic scientists are measured against a standard of professional integrity, although the professionalization of the scientific study of crime is far from complete. Professions are substantially self-regulating, usually through agreed-upon standards and codes of ethics. This creates the need for professions to articulate appropriate expectations, and the responsibility of their members both to act correctly themselves and to provide appropriate correction for errant colleagues. A case in point is William Tobin’s campaign against the chemical analysis of bullet lead, also known as comparative bullet-lead analysis (CBLA).

Tobin’s Exposure of CBLA

CBLA is a technique that the Federal Bureau of Investigation (FBI) used for four decades—the investigation of the assassination of President John F. Kennedy in 1963 was an early use—to make cases against defendants when traditional firearms analysis (that is, examination of barrel rifling on bullets) was not possible. By measuring the proportions of seven trace elements (antimony, arsenic, bismuth, cadmium, copper, silver, and tin) found in the lead of a bullet in evidence, forensic scientists sought to establish the probability that the bullet came from the same provenance as a bullet in the suspect’s possession. The belief that the comparison of the chemical composition of bullets could connect two bullets rested on unexamined assumptions about the similarities and differences of the source lead from which the bullets were cast. FBI experts testified in thousands of cases that the facts ascertainable through CBLA established likely identity and therefore pointed toward the probable guilt of the accused. Sometimes, as in the case of Michael Behm, who was convicted of murder in 1997, CBLA provided essentially the only evidence of guilt.
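The matching logic at issue can be sketched in a few lines. This is a toy illustration, not the FBI's actual protocol: the element list is the seven measured in CBLA, but the concentration values, the tolerance, and the match rule are all invented for illustration.

```python
# Toy sketch of the comparison logic behind comparative bullet-lead analysis.
# Element names are the seven trace elements measured in CBLA; every numeric
# value and the match rule itself are hypothetical.

ELEMENTS = ("Sb", "As", "Bi", "Cd", "Cu", "Ag", "Sn")

def is_analytically_indistinguishable(bullet_a, bullet_b, rel_tolerance=0.10):
    """Declare a 'match' when every element concentration agrees within a
    relative tolerance (a made-up 10% here). The flaw Tobin and Randich
    exposed: lead poured months apart can also pass such a test, so a
    'match' says little about common provenance."""
    for elem in ELEMENTS:
        a, b = bullet_a[elem], bullet_b[elem]
        if abs(a - b) > rel_tolerance * max(a, b):
            return False
    return True

# Two hypothetical bullets with similar compositions (values in ppm).
evidence = {"Sb": 750, "As": 22, "Bi": 14, "Cd": 0.5, "Cu": 60, "Ag": 35, "Sn": 110}
suspect  = {"Sb": 770, "As": 21, "Bi": 15, "Cd": 0.5, "Cu": 58, "Ag": 33, "Sn": 104}

print(is_analytically_indistinguishable(evidence, suspect))  # True, yet proves little
```

As Tobin and Randich showed, bullets cast from lead produced months apart can also satisfy such a criterion, so "analytically indistinguishable" does not establish that two bullets share a source.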

In the 1990s, FBI metallurgist William Tobin began to question the validity of the technique. He felt strongly enough about the issue to research the matter, after his retirement in 1998, with Lawrence Livermore National Laboratory metallurgist Erik Randich. They analyzed data from two lead smelters in Alabama and Minnesota and discovered that the FBI techniques could not distinguish batches of lead produced months apart. They also discovered that differences existed within single batches. Their research was published in Forensic Science International in July 2002.

Although he still defended the technique, the director of the FBI Laboratory requested that the National Research Council (NRC) of the National Academy of Sciences review CBLA. In February 2004, the NRC report, titled Forensic Analysis: Weighing Bullet Lead Evidence, confirmed that only extremely limited claims could be made about the relationship between bullets based on CBLA. Given the NRC findings, a New Jersey appeals court overturned Behm’s conviction in March 2005. The results of the NRC study have obvious implications for many other cases as well.

In an article titled “Forensic Significance of Bullet Lead Compositions,” which appeared in the Journal of Forensic Sciences in March 2005, FBI research chemists Robert D. Koons and JoAnn Buscaglia argued that “compositional comparison of bullet lead provides a reliable, highly significant point of evidentiary comparison of potential sources of crime-related bullets.” In September of that year, however, the FBI announced that it would no longer use CBLA. (In a curious subsequent development, Tobin and a member of the NRC committee, Clifford Spiegelman, suggested that a reanalysis of the bullet fragments from the Kennedy assassination might be in order.)

An article published in New Scientist in April 2002 quoted Tobin as saying of the interpretation of bullet data based on CBLA, “It offended me as a scientist.” In fact, Tobin has a long record as a critic of FBI procedures he regards as bad science and of testimonial practices he regards as unwarranted by the scientific data. To complain about testimony that unreasonably goes beyond what the data can support is to respond equally to the demands of science and the demands of ethics. It is a feature of commonsense justice that the punishment should fit the crime, and a basic requirement of that, in turn, is that the people who are punished should be guilty. Violating that requirement is both bad science and bad ethics.

Joyce Gilchrist’s Tainted Evidence

Is it enough that the accused be guilty of some crime, or does it have to be the one in question? If the accused is guilty of the crime in question, does it matter whether the evidence actually shows that? The belief that one can convict the guilty by tweaking the evidence a little, or shading one’s testimony a bit, is among the most common sources of unethical (and, often enough, criminal) behavior on the part of forensic scientists. The cautionary tale of former Oklahoma City Police Department forensic scientist Joyce Gilchrist probably falls into this category.

In May 2007, Curtis Edward McCarty, who was facing his third trial for a 1982 murder, was freed as the result of the improper handling and representation of hair evidence by Gilchrist, who apparently had tried to frame McCarty. The judge dismissed the charge despite her belief that McCarty was probably not completely innocent. This was merely the latest in a series of episodes involving Gilchrist.

Questions about the integrity of Gilchrist’s work began as early as January 1987, when a Kansas City colleague, John Wilson, complained about her to the Southwestern Association of Forensic Scientists, without result. In 1998, Robert Miller was exonerated after he had been convicted a decade earlier based in part on Gilchrist’s testimony regarding blood, semen, and hair evidence. In 1999, Gilchrist was criticized by a judge for having given false testimony (regarding semen evidence) in the rape/murder trial of Alfred Brian Mitchell in 1992. In the spring of 2000, Jeffrey Todd Pierce was ordered released after he had served a decade and a half for a rape he did not commit; he had been convicted based on Gilchrist’s testimony. In January 2001, Gilchrist came under renewed scrutiny over the judicial reprimands and professional critiques her work had received. In August 2001, doubts were raised about the guilt of Malcolm Rent Johnson, who had been executed for a 1981 rape and murder; Johnson was convicted based on Gilchrist’s testimony.

A month later, in September 2001, Gilchrist was finally fired, after years of reputedly shoddy forensics work, including both mishandling and misrepresentation of evidence, on many cases in addition to those noted above. The world of criminal justice contains innumerable isolated instances of perverse idealism, self-serving cynicism, and simple incompetence, but Gilchrist is one of the most striking cases of flagrant disregard for ethics in the forensics community. Was she genuinely convinced of the guilt of those against whom she testified? (She was certainly persuasive to juries.) Was she cynically distorting her testimony, and the evidence, to help prosecutors gain convictions, or was she just incompetent?

Ethics of Competence

One may well agree with forensics ethicist Peter D. Barnett’s remark that “there is a certain baseline level of competence that every criminalist is expected to understand, and there are certain procedures and protocols that are so fundamental to the practice of criminalistics that failure to follow them is evidence of gross incompetence or malfeasance, which is unethical.” As Barnett himself notes, however, “in the practice of forensic science, the disparate educational and experiential backgrounds of workers in the field make determination of a baseline level of competence relatively difficult.”

This is a problem throughout the American criminal justice system. In June 2007, all sergeants in the New Orleans Police Department were required to attend a four-day seminar to learn how to improve their (and their subordinates’) writing of police reports. This was part of an attempt to smooth out conflicts between the department and the New Orleans district attorney’s office, which claimed that part of its difficulty in prosecuting criminals stemmed from “incomplete or vague reports” by officers. More generally, criminalists frequently lament that frontline officers are not more skilled in observing, protecting, collecting, and preserving crime scene evidence.

One certainly can (in theory) impose reasonable expectations about competence and development in forensic science. However, that is not made easy by the variety of educational backgrounds and practical experience of the people who actually work in the field. In an unflattering assessment published in 2005, Jane Campbell Moriarty and Michael J. Saks bluntly asserted that “in the forensic sciences . . . 96 percent of practitioners hold bachelor’s degrees or less.” They went on to note:

Most forensic “scientists” have little understanding of scientific methodology, do not design or conduct research (and do not know how to), often have not read the serious scientific literature beginning to emerge in their fields. . . . Scientific findings relevant to a given forensic science often are ignored in the conduct of everyday casework.

Moreover, as with the difficulty in defining the qualifications for expert testimony, the fact that crime fighting is not a natural kind of expertise has an impact. Almost any expert might be relevant to a criminal case, depending on circumstances. Given the diverse forms of knowledge relevant to the application of science to crime solving, and to the providing of suitable expert testimony, it may be that the only truly unifying factor is the application of the so-called scientific method, broadly understood as intellectual integrity—the determined effort, as physicist Richard P. Feynman put it, not to fool oneself (or others).

What is impressive about the case of William Tobin is his determination to ensure that his colleagues (or former colleagues) not testify to more than the data warrant, both out of scientific integrity and out of fairness to those whose lives are affected by what scientists say. What is appalling about the case of Joyce Gilchrist is the stubbornness of her effort to resist correction by colleagues or even by the seemingly obvious limits of the evidence itself. Sometimes the individual needs to correct the group, by exposing a bogus or complacent consensus; sometimes the group needs to correct the individual, by identifying willful deception or self-centered fantasy. Unfortunately, no formula exists to guarantee the right result, and that is why ethics remains a constant challenge to conscientious souls.


Related Information

  • American Academy of Forensic Sciences (AAFS)
  • American Society of Crime Laboratory Directors (ASCLD)
  • Brain-wave scanners
  • Criminal personality profiling
  • DNA database controversies
  • Ethics of DNA analysis
  • Expert witnesses in trials
  • Forensic journalism
  • Innocence Project
  • Interrogation in criminal investigations
  • Training and licensing of forensic professionals
  • Truth serum in interrogation

Last reviewed: October 2016


Barnett, Peter D. Ethics in Forensic Science: Professional Standards for the Practice of Criminalistics. Boca Raton: CRC, 2001. Print.

Inman, Keith, and Norah Rudin. Principles and Practice of Criminalistics: The Profession of Forensic Science. Boca Raton: CRC, 2001. Print.

Lucas, Douglas M. “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits.” Journal of Forensic Sciences 34 (1989): 719–29. Print.

Macklin, Ruth. “Ethics and Value Bias in the Forensic Sciences.” Journal of Forensic Sciences 42 (1997): 1203–206. Print.

Moriarty, Jane Campbell, and Michael J. Saks. “Forensic Science: Grand Goals, Tragic Flaws, and Judicial Gatekeeping.” Judges’ Journal 44.4 (2005): 16–33. Print.

Peterson, Joseph L., and John E. Murdock. “Forensic Science Ethics: Developing an Integrated System of Support and Enforcement.” Journal of Forensic Sciences 34 (1989): 749–62. Print.

Derived from: “Ethics.” Forensic Science. Salem Press. 2009.


Digital Forensics as a Big Data Challenge — Forensic Focus – Articles




Digital Forensics, as a science and part of the forensic sciences, is facing new challenges that may well render established models and practices obsolete. The size of potential digital evidence sources has grown exponentially, be it hard disks in desktops and laptops or solid-state memories in mobile devices like smartphones and tablets, even while latency times lag behind. Cloud services are now sources of potential evidence in a vast range of investigations, network traffic also follows a growing trend, and in cyber security the necessity of sifting through vast amounts of data quickly is now paramount. On a higher level, investigations – and intelligence analysis – can profit from sophisticated analysis of such datasets as social network structures or corpora of text to be analysed for authorship and attribution. All of the above highlights the convergence between so-called data science and digital forensics: the fundamental challenge is analysing vast amounts of data (“big data”) in actionable time while at the same time preserving forensic principles, in order for the results to be presented in a court of law. The paper, after introducing digital forensics and data science, explores the challenges above and proceeds to propose how techniques and algorithms used in big data analysis can be adapted to the unique context of digital forensics, ranging from the management of evidence via Map-Reduce to machine learning techniques for the triage and analysis of big forensic disk images and network traffic dumps. In the conclusion the paper proposes a model to integrate this new paradigm into established forensic standards and best practices and tries to foresee future trends.

1 Introduction

1.1 Digital Forensics

What is digital forensics? We report here one of the most useful definitions of digital forensics yet formulated. It was developed during the first Digital Forensics Research Workshop (DFRWS) in 2001 and is still very much relevant today:

Digital Forensics is the use of scientifically derived and proven methods toward the preservation, collection, validation, identification, analysis, interpretation, documentation and presentation of digital evidence derived from digital sources for the purpose of facilitating or furthering the reconstruction of events found to be criminal, or helping to anticipate unauthorized actions shown to be disruptive to planned operations. [Pear01]

This formulation stresses first and foremost the scientific nature of digital forensics methods, at a point in time when the discipline was transitioning from being a “craft” to an established field and a rightful part of the forensic sciences. At that point digital forensics was also transitioning from being practised mainly in separate environments, such as law enforcement bodies and enterprise audit offices, to a unified field. Nowadays this process is very advanced, and it can be said that digital forensics principles, procedures and methods are shared by a large part of its practitioners, who come from different backgrounds (criminal prosecution, defence consultants, corporate investigators and compliance officers). Applying scientifically valid methods implies important concepts and principles to be respected when dealing with digital evidence. Among others we can cite:

  • Previous validation of tools and procedures. Tools and procedures should be validated by experiment prior to their application on actual evidence.
  • Reliability. Processes should yield consistent results and tools should present consistent behaviour over time.
  • Repeatability. Processes should generate the same results when applied to the same test environment.
  • Documentation. Forensic activities should be well-documented, from the inception to the end of the evidence life-cycle. On the one hand, strict chain-of-custody procedures should be enforced to assure evidence integrity; on the other hand, complete documentation of every activity is necessary to ensure repeatability by other analysts.
  • Preservation of evidence. Digital evidence is easily altered and its integrity must be preserved at all times, from the very first stages of operations, to avoid spoliation and degradation. Both technical (e.g. hashing) and organizational (e.g. clear accountability for operators) measures are to be taken.
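The technical measure mentioned in the last point, hashing, can be sketched as follows. This is a minimal illustration of integrity verification, not a complete chain-of-custody system; file paths and the chunk size are incidental choices.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so images larger than memory can be processed."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_integrity(path, recorded_digest):
    """True only if the evidence file is bit-for-bit identical to what was
    originally acquired; any alteration changes the digest."""
    return sha256_of(path) == recorded_digest
```

In practice the digest computed at acquisition time is recorded in the chain-of-custody documentation, and any later analyst can recompute it to demonstrate that the evidence has not been altered.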

These basic tenets are currently being challenged in many ways by the shifting technological and legal landscape practitioners have to contend with. While this paper shall not dwell much on the legal side of things, it is obviously something that must always be considered in forensics.

Regarding the phases that usually make up the forensic workflow, we refer here again to the only international standard available [ISO12] and describe them as follows:

  • Identification. This process includes the search, recognition and documentation of the physical devices on the scene potentially containing digital evidence. [ISO12]
  • Collection. Devices identified in the previous phase can be collected and transferred to an analysis facility, or acquired (next step) on site.
  • Acquisition. This process involves producing an image of a source of potential evidence, ideally identical to the original.
  • Preservation. Evidence integrity, both physical and logical, must be ensured at all times.
  • Analysis. Interpretation of the data from the evidence acquired. It usually depends on the context, the aims or the focus of the investigation, and can range from malware analysis to image forensics, database forensics and many more application-specific areas. On a higher level, analysis could include content analysis via, for instance, forensic linguistics or sentiment analysis techniques.
  • Reporting. Communication and/or dissemination of the results of the digital investigation to the parties concerned.

1.2 Data Science

Data Science is an emerging field growing at the intersection of statistical techniques and machine learning, completing this toolbox with domain-specific knowledge, and having big datasets as its fuel. Hal Varian gave a concise definition of the field:

[Data science is] the ability to take data – to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it. [Vari09]

We can see here the complete cycle of data management, and understand that data science in general is concerned with the collection, preparation, analysis, visualization, communication and preservation of large sets of information; this is a paraphrase of another insightful definition by Jeffrey Stanton of Syracuse University’s School of Information Studies. The parallels with the digital forensics workflow are clear, but the mention in both definitions of visualization deserves to be stressed. Visualization is almost never mentioned in digital forensics guidelines and standards, but as the object of analysis moves towards “Big Data” it will necessarily become one of the most useful tools in the analyst’s box, for instance in the prioritization phase but also for dissemination and reporting: visual communication is probably the most efficient way into a human’s brain, yet this channel is underused by most of today’s forensic practitioners.
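Even a rudimentary visualization can aid prioritization and reporting. A minimal sketch, using hypothetical file-type counts from a triage pass rendered as a text bar chart:

```python
def bar_chart(counts, width=40):
    """Render {label: count} as proportional text bars, largest first."""
    top = max(counts.values())
    lines = []
    for label, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        bar = "#" * max(1, round(width * n / top))  # at least one mark per label
        lines.append(f"{label:>8} | {bar} {n}")
    return "\n".join(lines)

# Hypothetical counts of file types carved from a disk image.
print(bar_chart({"jpeg": 5120, "pdf": 1310, "docx": 840, "exe": 97, "sqlite": 42}))
```

A real deployment would use a proper plotting or dashboard tool, but even this makes outliers, such as a handful of executables among thousands of documents, visible at a glance.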

If Data Science is concerned with “Big Data”, what is Big Data anyway? After all, big is a relative concept, prone to change with time. Big Data is any data that is difficult to manage and work with; in other words, datasets so big that conventional tools, e.g. relational databases, are not practical or useful. [ISAC13] From the point of view of data science, the challenges of managing big data can be summarized as three Vs: Volume (size), Velocity (needed for interactivity) and Variety (different sources of data). In the next paragraph we shall see how these three challenges dovetail nicely with the digital forensics context.

2 Challenges

“Golden Age” is a common name for the period in the history of digital forensics that ran roughly from the 1990s to the first decade of the twenty-first century. During that period the technological landscape was dominated by the personal computer, and mostly by a single architecture – x86 plus Windows – and data stored in hard drives represented the vast majority of evidence, so much so that “Computer Forensics” was the accepted term for the discipline. Storage sizes also allowed for complete bitwise forensic copies of the evidence for subsequent analysis in the lab. The relative uniformity of the evidence facilitated the development of the digital forensic principles outlined above, enshrined in several guidelines and eventually in the ISO/IEC 27037 standard. Inevitably, though, these lagged behind real-world developments: recent years have brought many challenges to the “standard model”, first among them the explosion in the average size of the evidence examined in a single case. Historical motivations for this include:

  • A dramatic drop in hard drive and solid state storage cost (currently estimated at $80 per Terabyte) and consequently an increase in storage size per computer or device;
  • Substantial increase in magnetic storage density and diffusion of solid-state removable media (USB sticks, SD and other memory cards etc) in smartphones, notebooks, cameras and many other kinds of devices;
  • Worldwide huge penetration of personal mobile devices like smartphones and tablets, not only in Europe and America, but also in Africa – where they constitute the main communication mode in many areas – and obviously in Asia;
  • Introduction and increasing adoption by individuals and businesses of cloud services – infrastructure services (IaaS), platform services (PaaS) and applications (SaaS) – made possible in part by virtualization technology, enabled in turn by modern multi-core processors;
  • Network traffic is ever more part of the evidence in cases and the sheer size of it has – again – obviously increased in the last decade, both on the Internet and on 3G-4G mobile networks, with practical but also ethical and political implications;
  • Connectivity is rapidly becoming ubiquitous and the “Internet of things” is near, especially considering the transition to IPv6 in the near future. Even when not networked, sensors are everywhere, from appliances to security cameras, from GPS receivers to embedded systems in cars, from smart meters to Industrial Control Systems.

To give a few quantitative examples of the trend: in 2008 the FBI Regional Computer Forensics Laboratories (RCFLs) Annual Report [FBI08] explained that the agency’s RCFLs processed 27 percent more data than they did during the preceding year; the 2010 Report [FBI10] gave an average case size of 0.4 Terabytes. According to a recent (2013) informal survey among forensic professionals on Forensic Focus, half of all cases involve more than one Terabyte of data, with one in five over five Terabytes in size.

The sheer quantity of evidence associated with a case is not the only measure of its complexity, and the growth in size is not the only challenge digital forensics is facing: evidence is becoming more and more heterogeneous in nature and provenance, following the evolving trends in computing. The workflow phase most affected by this is analysis where, even when proper prioritization is applied, it is necessary to sort through diverse categories and sources of evidence, structured and unstructured. Data sources themselves are much more differentiated than in the past: it is now common for a case to include evidence originating from personal computers, servers, cloud services, phones and other mobile devices, digital cameras, even embedded systems and industrial control systems.

3 Rethinking Digital Forensics

In order to face the many challenges, but also to leverage the opportunities it is encountering, the discipline of digital forensics will have to rethink established principles in some ways, reorganize well-known workflows, and even adopt tools not previously considered viable for forensic use – concerns regarding the security of some machine learning algorithms have been voiced, for instance in [BBC+08]. On the other hand, forensic analysts’ skills need to be rounded out, in the first place to make better use of these new tools, but also to help integrate them into forensic best practices and validate them. The dissemination of “big data” skills will have to include all actors in the evidence lifecycle, starting with Digital Evidence First Responders (DEFRs), as identification and prioritization will grow in importance and skilled operators will be needed from the very first steps of an investigation.

3.1 Principles

Well-established principles will need to undergo at least a partial extension and rethinking because of the challenges of Big Data.

  • Validation and reliability of tools and methods gain even more relevance in big data scenarios because of the size and variety of datasets, coupled with the use of cutting-edge algorithms that still require validation efforts: a body of test work, first on methods and then on tools, in controlled environments and on test datasets, before their use in court.
  • Repeatability has long been a basic tenet of digital forensics, but most probably we will be forced to abandon it, at least in its strictest sense, for a significant part of evidence acquisition and analysis. Repeatability stricto sensu is already impossible to achieve in nearly all instances of forensic acquisition of mobile devices, and the same applies to cloud forensics. When Machine Learning tools and methods become widespread, reliance on prior validation will be paramount. As an aside, this stresses once more the importance of using open methods and tools that can be independently and scientifically validated, as opposed to black-box tools or – worse – LE-reserved ones.
  • As for documentation, its importance for a sound investigation is even greater when non-repeatable operations and live analysis routinely become part of the investigation process. Published data about validation results for the tools and methods used – or at least pointers to it – should be an integral part of the investigation report.
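The documentation principle above can be supported directly in tooling. The following is a minimal sketch – using only the Python standard library, with illustrative names of my choosing – of how a non-repeatable acquisition can still leave a verifiable, timestamped record: each acquired item is hashed at acquisition time, so a third party can later confirm that the analysed copy matches what was collected.

```python
import hashlib
from datetime import datetime, timezone

def log_acquisition(label, data, log):
    """Append a timestamped, hash-anchored record of an acquired item.

    Hashing does not make the acquisition repeatable, but it anchors the
    evidence: anyone can later re-hash the analysed copy and compare it
    against this record.
    """
    record = {
        "item": label,
        "sha256": hashlib.sha256(data).hexdigest(),
        "acquired_at": datetime.now(timezone.utc).isoformat(),
    }
    log.append(record)
    return record

# Usage: the audit log accumulates one record per acquired item.
audit_log = []
rec = log_acquisition("memory-dump-01", b"\x00\x01\x02\x03", audit_log)
print(rec["sha256"])
```

In a real workflow the log itself would also be integrity-protected (e.g. signed or written to append-only media), which is omitted here.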

3.2 Workflow

Keeping in mind how the forensic principles may need to evolve, we present here a brief summary of the forensic workflow and how each phase may have to adapt to big data scenarios. The ISO/IEC 27037 International Standard covers the identification, collection, acquisition and preservation of digital evidence (literally, “potential” digital evidence). Analysis and disposal are not covered by this standard, but will be by future guidelines in the 27xxx series, currently in development.

Identification and collection

Here the challenge is selecting evidence in a timely manner, right at the scene. Guidelines for proper prioritization of evidence should be further developed, abandoning the copy-all paradigm and strict evidence integrity in favour of appropriate triage procedures: this implies skimming through all the (potential) evidence right at the beginning and selecting the relevant parts. First responders’ skills will be even more critical than they currently are and, in corporate environments, so will preparation procedures.
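Triage of this kind ultimately amounts to ranking items by expected relevance before acquisition. As a minimal illustration – the weights and patterns below are invented for the example, not taken from any published guideline – a first responder’s tool could score file paths and process the highest-scoring items first:

```python
# Hypothetical scoring rules: real triage criteria would be case-specific
# and set out in documented procedures; these weights are illustrative only.
PRIORITY_EXTENSIONS = {".sqlite": 3, ".db": 3, ".docx": 2, ".pdf": 2, ".jpg": 1}
PRIORITY_PATHS = ("users/", "documents/", "appdata/")

def triage_score(path):
    """Score a file path: higher means acquire/examine earlier."""
    path_l = path.lower()
    score = 0
    for ext, weight in PRIORITY_EXTENSIONS.items():
        if path_l.endswith(ext):
            score += weight
    if any(segment in path_l for segment in PRIORITY_PATHS):
        score += 2
    return score

files = [
    "Windows/System32/kernel32.dll",
    "Users/alice/Documents/accounts.sqlite",
    "Users/alice/Pictures/holiday.jpg",
]
ranked = sorted(files, key=triage_score, reverse=True)
print(ranked[0])  # the user database ranks first
```

The point of such a sketch is that the selection step is explicit and reproducible from the documented rules, which is exactly what the justification requirement demands once the copy-all paradigm is abandoned.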


Acquisition

When classic bitwise imaging is not feasible due to evidence size, prioritization or “triage” procedures can be conducted, properly justified and documented, because integrity is no longer absolute and the original source has been modified, if only by selecting what to acquire. Visualization can be a very useful tool, both for low-level filesystem analysis and for higher-level content analysis. The volume of evidence is a challenge because dedicated hardware is required for acquisition – be it of storage or online traffic – while in the not so distant past an acquisition machine could be built with off-the-shelf hardware and software. Variety poses a challenge of a slightly different kind, especially when acquiring mobile devices, due to the huge number of physical connectors and platforms.


Preservation

Again, preserving all evidence securely and in compliance with legal requirements calls for quite a substantial investment for forensic labs working on a significant number of cases.


Analysis

Integrating methods and tools from data science implies moving past the “sausage factory” forensics still widespread today, where under-skilled operators rely heavily on point-and-click all-in-one tools to perform the analysis. Analysts will need not only to include a plurality of tools in their panoply, but also to understand and evaluate the algorithms and implementations they are based upon. The absolute need for highly skilled analysts and operators is clear, and suitable professional qualifications will develop to certify this.


Reporting

The final report for an analysis conducted using data science concepts should contain accurate evaluations of the tools and methods used, including data from the validation process; accurate documentation is even more fundamental as strict repeatability becomes very hard to uphold.

3.3 Some tools for tackling the Big Data Challenge

At this stage, due also to the fast-changing landscape in data science, it is hard to categorize its tools and techniques systematically. We review some of them here.

Map-Reduce is a framework for massively parallel tasks. It works well when the datasets do not involve a lot of internal correlation. This does not seem to be the case for digital evidence in general, but a task like file fragment classification is well suited to being modelled in a Map-Reduce paradigm. Attributing file fragments – coming from a filesystem image or from unallocated space – to specific file types is a common task in forensics: machine learning classification algorithms – e.g. logistic regression, support vector machines – can be adapted to Map-Reduce if the analyst forgoes the possible correlations among single fragments. A combined approach, where a classification algorithm is paired for instance with a decision tree method, would probably yield higher accuracy.
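The structure of the task can be sketched in a few lines. In this toy example – which uses a crude nearest-centroid rule on a single feature, not the logistic regression or SVM classifiers mentioned above, and centroid values invented for illustration – each fragment is classified independently in the map step, and the reduce step merely merges the per-label counts; it is precisely this independence that makes the problem Map-Reduce friendly:

```python
from collections import Counter
from functools import reduce

# Toy per-type "centroids": expected fraction of printable-ASCII bytes.
# Real classifiers would use much richer byte-level features; these
# values are illustrative assumptions.
CENTROIDS = {"text": 0.95, "binary": 0.30}

def printable_fraction(fragment):
    return sum(32 <= b < 127 for b in fragment) / len(fragment)

def map_fragment(fragment):
    """Map step: classify one fragment on its own, ignoring any
    correlation with neighbouring fragments."""
    frac = printable_fraction(fragment)
    label = min(CENTROIDS, key=lambda t: abs(CENTROIDS[t] - frac))
    return Counter({label: 1})

def reduce_counts(a, b):
    """Reduce step: merge per-fragment label counts."""
    return a + b

fragments = [b"hello world, plain text", bytes([0, 255, 17, 3, 200, 9] * 4)]
totals = reduce(reduce_counts, map(map_fragment, fragments), Counter())
print(dict(totals))  # one fragment classified as text, one as binary
```

In a real deployment the map function would run distributed over millions of fragments; the reduce step stays trivial because the per-fragment results do not interact.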

Decision trees and random forests are fruitfully brought to bear in fraud detection software, where the objective is to find the statistical outliers in a vast dataset – in this case anomalous transactions or, in another application, anomalous browsing behaviour.
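The underlying idea – isolating the few records that deviate sharply from the bulk of the data – can be illustrated without the tree machinery. The sketch below is a deliberately simple z-score filter standing in for the random-forest and isolation-tree approaches used in practice; the transaction figures are invented:

```python
import statistics

def flag_outliers(amounts, threshold=2.0):
    """Flag values far from the bulk of the data.

    Production fraud-detection systems use tree ensembles (random
    forests, isolation forests); this z-score filter only illustrates
    the shared goal of surfacing statistical outliers. The threshold
    is an illustrative assumption.
    """
    mean = statistics.fmean(amounts)
    stdev = statistics.stdev(amounts)
    return [x for x in amounts if abs(x - mean) / stdev > threshold]

# Six ordinary transactions and one anomalous one.
transactions = [12.0, 15.5, 14.2, 13.8, 12.9, 15.1, 980.0]
print(flag_outliers(transactions))
```

Tree-based methods improve on this by handling many correlated features at once and by not assuming any particular distribution, which is why they dominate in real fraud-detection workloads.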

In audio forensics, unsupervised learning techniques under the general heading of “blind signal separation” give good results in separating two superimposed speakers, or a voice from background noise. They rely on mathematical underpinnings to find, among the possible solutions, the least correlated signals.

In image forensics, classification techniques are again useful for automatically reviewing big sets of hundreds or thousands of image files, for instance to separate suspect images from the rest.

Neural Networks are suited to complex pattern recognition tasks in forensics. In [KhCY07] a supervised approach is used, where successive snapshots of the file system train the network to recognize the normal behaviour of an application; after the event, the system can automatically build an execution timeline on a forensic image of a filesystem. Neural Networks have also been used to analyse network traffic, but in this case the results do not yet show high levels of accuracy.

Natural Language Processing (NLP) techniques, including Bayesian classifiers and unsupervised clustering algorithms like k-means, have been successfully employed for authorship verification and the classification of large bodies of unstructured text, emails in particular.
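To make the clustering idea concrete, here is a drastically simplified sketch: documents are turned into word-count vectors and each is assigned to the nearest of a set of fixed seed documents by cosine similarity. This is a single assignment pass, not full k-means (which would iterate centroid updates), and both the emails and the seeds are invented for the example:

```python
from collections import Counter

def tokens(text):
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (sum(v * v for v in a.values()) ** 0.5) * \
          (sum(v * v for v in b.values()) ** 0.5)
    return num / den if den else 0.0

def cluster_by_seed(docs, seeds):
    """Assign each document to its most similar seed.

    A stand-in for k-means: fixed seeds and one assignment pass instead
    of iterated centroid updates. Illustrative only.
    """
    seed_vecs = [tokens(s) for s in seeds]
    groups = {i: [] for i in range(len(seeds))}
    for doc in docs:
        vec = tokens(doc)
        best = max(range(len(seeds)), key=lambda i: cosine(vec, seed_vecs[i]))
        groups[best].append(doc)
    return groups

emails = [
    "meeting agenda for the quarterly budget review",
    "budget figures attached for the review meeting",
    "cheap pills discount offer click now",
    "discount offer limited time click here now",
]
groups = cluster_by_seed(emails, ["budget meeting review", "discount offer click"])
print(groups)
```

Real pipelines replace raw counts with TF-IDF weighting and iterate the centroids, but the grouping principle – emails with similar vocabulary end up together – is the same.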

4 Conclusion

The challenges of big data evidence already highlight the necessity of revising tenets and procedures firmly established in digital forensics. New validation procedures, analysts’ training, and analysis workflows will be needed in order to confront the changed landscape. Furthermore, few forensic tools currently implement machine learning algorithms and, conversely, most machine learning tools and libraries are not suitable or validated for forensic work, so a wide space remains for the development of innovative tools leveraging machine learning methods.


[BBC+08] Barreno, M. et al.: “Open Problems in the Security of Learning”. In: D. Balfanz and J. Staddon, eds., AISec, ACM, 2008, p. 19-26
[FBI08] FBI: “RCFL Program Annual Report for Fiscal Year 2008”, FBI, 2008
[FBI10] FBI: “RCFL Program Annual Report for Fiscal Year 2010”, FBI, 2010
[ISAC13] ISACA: “What Is Big Data and What Does It Have to Do with IT Audit?”, ISACA Journal, 2013, p. 23-25
[ISO12] ISO/IEC 27037:2012 International Standard
[KhCY07] Khan, M., Chatwin, C. and Young, R.: “A framework for post-event timeline reconstruction using neural networks”, Digital Investigation 4, 2007
[Pear01] Pearson, G.: “A Road Map for Digital Forensic Research”. In: Report from DFRWS 2001, First Digital Forensic Research Workshop, 2001
[Vari09] Varian, Hal in: “The McKinsey Quarterly”, Jan 2009

About the Author

Alessandro Guarino is a senior Information Security professional and independent researcher. He is the founder and principal consultant of StudioAG, a consultancy firm based in Italy and active since 2000, serving clients in both the private and public sector and providing cybersecurity, data protection and compliance consulting services. He is also a digital forensics analyst and consultant, as well as an expert witness in court. He holds an M.Sc. in Industrial Engineering and a B.Sc. in Economics, with a focus on Information Security Economics. He is an active ISO expert in JTC 1/SC 27 (IT Security Techniques committee) and contributed in particular to the development of cybersecurity and digital investigation standards. He represents Italy in the CEN-CENELEC Cybersecurity Focus Group and ETSI TC CYBER. He is the chair of the recently formed CEN/CENELEC TC 8 “Privacy management in products and services”. As an independent researcher, he has delivered presentations at international conferences and published several peer-reviewed papers.

Find out more and get in touch with the author at StudioAG.

by Alessandro Guarino, StudioAG

Abstract: Digital Forensics, as a science and part of the forensic sciences, is facing new challenges that may well render established models and practices obsolete. The dimensions of potential digital evidence supports have grown exponentially, be it hard disks in desktops and laptops or solid state memories in mobile devices like smartphones […]

via Digital Forensics as a Big Data Challenge — Forensic Focus – Articles

Improper Evidence Gathering

“[…] Improperly Photographed Impressions: If the examination involves a photographed tire impression, many things can affect the dimensional accuracy of that photograph. If the camera’s film plane (back) is not perfectly parallel to the impression, then the photograph will have a perspective problem that can affect the ability to accurately enlarge the photograph of the […]

via “Improperly Photographed Impressions”. — Mantracking

Good Samaritan chases down man accused of taking pictures up woman’s dress —

SIMPSONVILLE, S.C. — “Oh my God, could this be happening? Cause there’s kids and there’s people everywhere!” That was Mark Morgan’s initial thought, last week, when he saw something unusual inside the Goodwill on South Street in Simpsonville. “I came across this aisle and I noticed this gentleman with his arm stretched out taking a picture, up…

via Good Samaritan chases down man accused of taking pictures up woman’s dress —

Patty Hearst crime history: September 18, 1975


Nobody Move!


On this date in 1975, heiress-turned-hostage-turned-revolutionary Patty Hearst was arrested by the FBI in San Francisco. And so began her transformation from radical chic to jailhouse geek. She was tried, convicted, and given a seven-year sentence that was commuted in 1979 (thanks, Jimmy!). She was pardoned in January 2001 (thanks, Bill!).

Further reading:

Wikipedia entry on Patty Hearst

Patty Hearst, actress


On this day: Mary, Queen of Scots married.

In Times Gone By...

Mary, Queen of Scots married her second husband, Henry Stuart, Lord Darnley at Holyrood Abbey on the 29th of July 1565. The marriage was not a happy one, and Darnley was murdered in February 1567. He was twenty-one when he died.

Mary, Queen of Scots, and her second husband Henry Stuart, Lord Darnley, parents of King James VI of Scotland, later King James I of England.
