Factors Affecting The Intensity Of Poisoning via Forensic’s blog

By @forensicfield

Introduction

The outcome of poisoning depends on many factors.

The main factors that can affect the intensity of poisoning are explained below, such as:

  • Dose
  • Time of intake
  • Way of taking
  • Environmental factors, etc.

Dose

The amount of the poison determines its effect on the body: the smaller the dose, the lighter the effect; the larger the dose, the more severe the effect.

Resistance

With continuous use of some drugs, such as opiates, tobacco, alcohol, etc., a person develops resistance to them.

Incompatible Combination of Drugs

Ingestion of certain incompatible combinations of medicines may be fatal, such as Prozac and tramadol, thyroid medication and proton pump inhibitors, or nonsteroidal anti-inflammatory drugs and antihypertensives.

Hypersensitivity

Some persons show an abnormal response (idiosyncrasy) to a drug like morphine, quinine, or aspirin due to inherent personal hypersensitivity.

Allergy

Some persons are allergic (acquired hypersensitivity) to certain drugs like penicillin, sulpha drugs, etc.

Incompatible Combinations

Ingestion of certain medications in combination, such as anti-ulcer gels with aspirin, may lead to fatal effects.

Tolerance

People develop a marked tolerance to opium, alcohol, strychnine, tobacco, arsenic, and some other narcotic drugs through repeated and continued use.

Synergism

Some drugs that are not lethal on their own can be lethal when taken together, such as alcohol and benzodiazepines, heroin and cocaine, benzodiazepines and opioids, or alcohol and opioids.

Slow Poisons

Continuous ingestion of small amounts of a poison like arsenic, strychnine, lead, etc. allows it to accumulate in the body and may cause death.

Conditions of The Body

  • Conditions of the body, such as age and health, also affect the action of the poison.
  • Generally, elderly persons, weaker persons, and children are more severely affected by a low dose of poison than young and healthy persons.

Cumulative Action

Repeated small doses of cumulative poisons like arsenic, lead, mercury, strychnine, digitalis, etc. may cause death or chronic poisoning through cumulative action.

Shock

Sometimes, a large dose of a poison acts differently from a small dose; for example, a large dose of arsenic may cause death by shock, while a small dose results in diarrhoea.

Forms of Poison

  • Gases/Vapours Poisons
  • Liquid Poisons
  • Powder Poisons
  • Chemical Combination
  • Mechanical Combination

Gases / Vapours Poisons

Poisons in gaseous or vapour form are absorbed immediately and act quickly.

Liquid Poisons

Liquid poisons act faster than solids.

Powder Poisons

Finely powdered poisons act faster than coarsely powdered ones.

Chemical Combination

Some substances become lethal in combination, such as acids and alkalis, or strychnine and tannic acid.

Mechanical Combination

The action of a poison is altered when it is combined mechanically with inert substances; for example, an alkaloid taken with charcoal does not act.

Methods Of Administration

A poison acts more rapidly when inhaled in gaseous form or when injected intravenously.

It acts next most rapidly when injected intramuscularly or subcutaneously.

A poison acts slowly when swallowed or applied on skin.


via Factors Affecting The Intensity Of Poisoning — Forensic’s blog

Biometrics — Forensic’s blog

INTRODUCTION

✴ The term “biometrics” is derived from the Greek words “bio” (life) and “metrics” (to measure).
✴ Biometrics is the technical term for body measurements and calculations.
✴ Biometrics is the measurement and statistical analysis of people’s unique physical and behavioral characteristics.
✴ Biometrics allows a person to be identified and authenticated based on a set of recognizable and verifiable data, which are unique and specific to them.
✴ Biometrics authentication is the process of comparing data for the person’s characteristics in order to determine resemblance.

CHARACTERISTICS

1. UNIVERSAL
2. UNIQUE
3. INVARIABLE
4. RECORDABLE
5. MEASURABLE

HISTORY OF BIOMETRICS

🔘1858 – First systematic capture of hand images for identification purposes is recorded.
🔘1870 – Bertillon develops anthropometry to identify individuals.
🔘1892 – Galton develops a classification system for fingerprints.
🔘1896 – Henry develops a fingerprint classification system
🔘1903 – NY State Prisons begin using fingerprints.
🔘1960s – Face recognition becomes semi-automated.
🔘1960 – First model of acoustic speech production is created.
🔘1963 – Hughes research paper on fingerprint automation is published.
🔘1974- First commercial hand geometry systems become available.
🔘1976 – First prototype system for speaker recognition is developed.
🔘1986 – Exchange of fingerprint minutiae data standard is published.
🔘1988 – First semi-automated facial recognition system is deployed.
🔘1991 – Face detection is pioneered, making real time face recognition possible.
🔘1992 – Biometric Consortium is established within US Government.
🔘1994 – Palm System is benchmarked.
🔘1996 – Hand geometry is implemented at the Olympic Games.
🔘1996 – NIST begins hosting annual speaker recognition evaluations.
🔘1997 – First commercial, generic biometric interoperability standard is published.
🔘1998 – FBI launches CODIS (DNA forensic database).
🔘1999 – FBI’s IAFIS major components become operational.
🔘2001 – Face recognition is used at the Super Bowl in Tampa, Florida.
🔘2002 – ISO/IEC standards committee on biometrics is established.
🔘2004 – First statewide automated palm print databases are deployed in the US.
🔘2008 – U.S. Government begins coordinating biometric database use.
🔘2010 – U.S. national security apparatus utilizes biometrics for terrorist identification.
🔘2011 – Biometric identification used to identify body of Osama bin Laden.

TYPES OF BIOMETRICS

Biometrics Can Be Divided Into Three Main Categories Of Characteristics:
1. BIOLOGICAL
2. MORPHOLOGICAL
3. BEHAVIORAL

⏩DNA MATCHING

The identification of an individual using the analysis of segments from DNA.

👂EAR

The identification of an individual using the shape of the ear.

👀EYES – IRIS RECOGNITION & RETINA RECOGNITION

👁IRIS RECOGNITION- The use of the features found in the iris to identify an individual.
👁RETINA RECOGNITION- The use of patterns of veins in the back of the eye to accomplish recognition.

👱‍♂️FACE RECOGNITION

The analysis of facial features or patterns for the authentication or recognition of an individual's identity.

🤘FINGERPRINT RECOGNITION

The use of the ridges and valleys (minutiae) found on the surface tips of a human finger to identify an individual.

👋FINGER GEOMETRY RECOGNITION

The use of 3D geometry of the finger to determine identity.

🤚HAND GEOMETRY RECOGNITION

The use of the geometric features of the hand such as the lengths of fingers and the width of the hand to identify an individual.

🙌VEIN RECOGNITION

Vein recognition is a type of biometrics that can be used to identify individuals based on the vein patterns in the human finger or palm.

👃ODOUR

The use of an individual's odour to determine identity.

✍SIGNATURE RECOGNITION

The authentication of an individual by the analysis of handwriting style, specifically the signature. Technology is available to compare two scanned signatures using advanced algorithms.

👩‍💻TYPING RECOGNITION

The use of the unique characteristics of a person's typing to establish identity.

🗣VOICE / SPEAKER RECOGNITION

There are two major applications of speaker recognition:
🙊Voice – Speaker Verification / Authentication
🙊Voice – Speaker Identification
✔In forensic applications, it is common to first perform a speaker identification process to create a list of “best matches” and then perform a series of verification processes to determine a conclusive match.
✔Voice recognition analyzes audio input for specific patterns in speech or sound. Each voice, or common noise, has a recognizable wavelength pattern that can aid in identification of a specific individual.
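The identification-then-verification pipeline described above can be sketched in a few lines of Python. This is a minimal illustration, not a real speaker-recognition system: the enrolled speakers, feature vectors, and the 0.8 acceptance threshold are all made-up assumptions, and cosine similarity stands in for a proper acoustic scoring model.

```python
import math

def cosine(a, b):
    # Cosine similarity between two voice-feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(probe, enrolled, k=3):
    # 1:N identification - rank all enrolled speakers and return a
    # shortlist of "best matches".
    scores = [(name, cosine(probe, vec)) for name, vec in enrolled.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

def verify(probe, claimed_vec, threshold=0.8):
    # 1:1 verification - accept or reject one claimed identity.
    return cosine(probe, claimed_vec) >= threshold

# Illustrative enrolled database (names and vectors are invented).
enrolled = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
    "carol": [0.4, 0.4, 0.9],
}
probe = [0.85, 0.15, 0.35]   # features from an unknown recording

best_matches = identify(probe, enrolled)      # step 1: shortlist
top_name, _ = best_matches[0]
accepted = verify(probe, enrolled[top_name])  # step 2: confirm top candidate
```

The two-step structure mirrors the forensic workflow above: identification narrows the field, and verification then tests each shortlisted candidate against a fixed acceptance criterion.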

🚶‍♀️👣GAIT

The use of an individual's walking style, or gait, to determine identity.


via Biometrics — Forensic’s blog

Intro to Fingerprints via Forensic’s blog

Fingerprints- An Introduction

By @forensicfield

INTRODUCTION

We leave our unique impression on everything we touch: our fingerprints.

No two people have exactly the same fingerprints. Even identical twins, with identical DNA, have different fingerprints.

Fingerprint identification is also known as “Dactyloscopy”.

Fingerprints are the tiny ridge, whorl, and valley patterns on the tip of each finger. They develop from pressure on a baby’s tiny, developing fingers in the womb.

CLASSIFICATION OF FINGERPRINTS

🧤By FRANCIS GALTON
The well-known British scientist Sir Francis Galton published his first book on fingerprints in 1892. His important work includes a method of fingerprint classification that divides prints into three groups:
a. Arch
b. Loop
c. Whorl

🧤 By WILLIAM J. HERSHEL
While working for the East India Company in Bengal, India, Sir William James Herschel first used fingerprints on native contracts. After a decade, he had accumulated a file of fingerprints.

🧤 By EDWARD HENRY
Henry Classification of Fingerprinting was accepted as common practice throughout England and its territorial holdings and in the United States.
Under the Henry system, fingerprints are divided into two classes:
•Those which are given numerical value (whorls and composites).
•Those which are not given numerical value (loops and arches).

🖐 All patterns are divided as follows:

The Henry classification system assigns each finger a number according to the order in which it is located on the hand, beginning with the right thumb as number 1 and ending with the left pinky as number 10.
• The system also assigns a numerical value to fingers that contain a whorl pattern; fingers 1 and 2 each have a value of 16,
• Fingers 3 and 4 = 8,
• Fingers 5 and 6 = 4,
• Fingers 7 and 8 = 2,
• Final two fingers = 1.
• Fingers with a non-whorl pattern, such as an arch or loop pattern, have a value of zero.
• The sum of the values for the even-numbered fingers is then calculated and placed in the numerator of a fraction.
• The sum of the odd-numbered finger values is placed in the denominator.
• A value of 1 is added to each sum, with the maximum obtainable on either side of the fraction being 32.
• Thus, the primary classification is a fraction from 1/1 to 32/32, where 1/1 indicates no whorl patterns and 32/32 means that all fingers have whorl patterns.
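As a worked illustration of the arithmetic above, the following sketch computes the Henry primary classification from a list of ten finger patterns (finger 1 = right thumb through finger 10 = left pinky). The function name and pattern labels are just illustrative choices:

```python
# Whorl value by finger position (1 = right thumb ... 10 = left pinky).
WHORL_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

def henry_primary(patterns):
    """patterns: list of 10 strings, each 'whorl', 'loop', or 'arch'."""
    assert len(patterns) == 10
    # Non-whorl patterns (arches, loops) contribute zero.
    even = sum(WHORL_VALUES[i] for i in range(1, 11)
               if i % 2 == 0 and patterns[i - 1] == "whorl")
    odd = sum(WHORL_VALUES[i] for i in range(1, 11)
              if i % 2 == 1 and patterns[i - 1] == "whorl")
    # 1 is added to each sum, giving a fraction from 1/1 to 32/32.
    return f"{even + 1}/{odd + 1}"

henry_primary(["loop"] * 10)    # no whorls -> "1/1"
henry_primary(["whorl"] * 10)   # all whorls -> "32/32"
```

Note that the primary classification sorts prints into one of 1,024 bins (32 × 32); it narrows the search but does not by itself identify an individual.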

🧤 By JUAN VUCETICH
Vucetich is credited with the first positive criminal identification as, in 1892, he was able to extract a set of prints off a door and thus identify a woman as the culprit in a double homicide.

✴ CHARACTERISTICS OF FINGERPRINT

🖐 Class Characteristics
Class characteristics are the characteristics that narrow the print down to a group but not an individual.

The Three Fingerprint Class Types Are;
1. Arches:
✔Arches are the simplest type of fingerprints that are formed by ridges that enter on one side of the print and exit on the other. No deltas are present.
✔About 5% of the world’s population has arch patterns.

Arch pattern

2. Loops:
✔Loops must have one delta and one or more ridges that enter and leave on the same side. These patterns are named for their positions related to the radius and ulna bones.
✔About 60–65% of the world’s population has loop patterns.

Loop pattern

3. Whorls:
✔Whorls have at least one ridge that makes (or tends to make) a complete circuit. They also have at least two deltas.
✔About 30–35% of the world’s population has whorl patterns.

Whorl Pattern

👍 Individual characteristics:
👉Individual characteristics are those characteristics that are unique to an individual.
👉They are tiny irregularities that appear within the friction ridges and are referred to as Galton’s details.

Individual Characteristics of Fingerprint


via Fingerprints- An Introduction — Forensic’s blog

Forensic Structural Engineering via Australian Business Network

Structural engineering is a specialty within the field of civil engineering which focuses on the framework of structures and on designing those structures to withstand the stresses and pressures of their environment and remain safe, stable and secure throughout their use. To explain a little differently, it can be said that structural and consulting engineers […]

via Understanding The Concept Of Structural Engineering — Australian Business Network

Variables to consider when Determining Post Mortem Blood Alcohol Levels via True Crime Rocket Science / #tcrs

Immediately following the release of the autopsy reports on November 19th, I contacted Thomas Mollett, a forensic investigator, fellow true crime author and friend, and asked him his opinion on Shan’ann’s Blood Alcohol Levels. They were found to be three times the legal limit for driving. How likely was it, I asked, that these apparently high levels were from “normal” decomposition?

SUPPLEMENTAL

Autopsy reports show Shanann Watts, daughters were asphyxiated – TimesCall


Pathology is an extremely complex science, and many factors play into the biological processes that occur after death.


The three basic pillars one uses to calculate whether the BAC is “normal” or not are related to:

  1. the time the body is exposed to the elements [here time of death is a factor, unknown in this case, but with a relatively short window either way]
  2. the ambient conditions of the body [temperature, humidity etc.]
  3. circumstantial evidence is also a vital tool to gauge alcohol content, including eye witnesses, Shan’ann’s drinking habits, and her appearance in the Ring camera footage when she arrived home [described but not released thus far]

During our first communication I miscommunicated to Mollett that Shan’ann’s corpse was recovered after only 48 hours, which I guessed wasn’t enough time to reflect the high alcohol levels found. This was an initial error on my part; it took closer to 70 hours for Shan’ann’s corpse to be discovered and exhumed.

Based on this initial miscommunication, Mollett also believed the BAC level was likely higher than a natural rate [which as I say, was also what I suspected].

I asked Mollett to investigate the BAC levels and I’m grateful to him for doing so in detail. Obviously part of his thorough investigation corrected the original 48 hour error.

Below is Mollett’s unabridged report on the BAC levels.

[Screenshots of Thomas Mollett’s full report on the BAC levels]

9 COMMENTS
  • Helen

    I think Chris tied Shanann to the bed after she fell asleep, put a pillow case over her mouth to prevent her from screaming, made sure she watched through the monitor how he smothered Bella and Celeste, and then came back to the bedroom to strangle her.

  • BAMS13 (replying to Helen)

    You’re going to get in trouble from Nick now… lol.

  • nickvdl (replying to BAMS13)

    Bams, can I let you take it from here? I can’t always be the one cracking the whip 😉

  • BAMS13 (replying to nickvdl)

    Haha! Always happy to try and exert my low ranking power anytime. You’d think those virtual whip cracks can be heard loud and clear though. 😉

  • Syzia

    Helen took it to the next level here

  • Marie (replying to Syzia)

    Oh yes Syzia, I agree

  • Karen

    Well, that report certainly cleared up so many things. Now we know. The body certainly is a fascinating animal in death as much as life. I do know that when officer Coonrod was in the kitchen he didn’t have a peek in the sink to see if there were breakfast dishes in there to find out if the kids had eaten, so we couldn’t see if there was a wine glass. Nor did I see any at all throughout his whole walk through the house. Thorough report.

  • Sylvester

    “Important moments at Watts’ well site” is really stunning. I hope everyone can blow it up on a computer monitor rather than a cell phone. You really get the sense of vastness of that site – miles and miles in every direction of land dotted with wildflowers. The tank battery site even seems dwarfed in proportion to the land. As the drone makes its lazy pass from the air you then see the sheet, hugging the scrub. Look a little closer and you see the black garbage bags. It was rather stupid of him to discard the sheet on top of the land after it had fulfilled its purpose to conceal and drag. Same with the garbage bags. Maybe he thought in the vastness of the land those items, like his family, would simply vanish.

  • Karen (replying to Sylvester)

    Sylvester, do you know if they sent the drone out before Chris said anything or after? For the life of me, I can’t remember. Thank you kindly


via Thomas Mollett’s Forensic Report on Shan’ann Watts’ Post Mortem Blood Alcohol Level — True Crime Rocket Science / #tcrs

Global Standardization of Forensics will Decrease the Bias Factor of Evidence Collection Procedures and Court Rulings

Interviews – 2018

Angus Marshall, Digital Forensic Scientist

via Angus Marshall
Angus, tell us a bit about yourself. What is your role, and how long have you been working in digital forensics?

Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.

I’m registered as an expert adviser and most of my recent casework seems to deal with difficult to explain or analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence and as a member of BSI’s IST/033 information security group and the UK’s digital evidence rep. on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042, and contributed to the other investigative and eDiscovery standards.

You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?

It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.

The effect of it is that it looks like there may be a gap in the accreditation process. Rather than having a selection of methods that are known to work correctly (as we see in calibration houses, metallurgical and chemical labs etc. – where the ISO 17025 standard originated) which can be chosen to meet a specific customer requirement, we have methods which satisfy much fuzzier customer requirements which are almost always non-technical in nature because the customers are CJS practitioners who simply don’t express things in a technical way.

We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.

Where do we stand with standardisation in the UK at the moment?

Standardization is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.

ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?

Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.

A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.

In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?

Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.

I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty. When we use a word it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?

It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.

Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?

NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).

Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.

Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.

Is there anything else you’d like to share about the results?

We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.

Things move pretty fast in DF; if we don’t stop and look at what we’re doing once in a while, we’ll miss something important.

Read Angus Marshall’s paper on requirements in digital forensics method definition here.

The hottest topic in digital forensics at the moment, standardisation is on the tip of everyone’s tongues. Following various think pieces on the subject and a plethora of meetings at conferences, I spoke to Angus Marshall about his latest paper and what he thinks the future holds for this area of the industry. You can […]

via Angus Marshall talks about standardisation — scar