Forensic Collection from a Smartphone Using GenAI for Analysis: eDiscovery Webinars

Great topic! Friday’s webinar from Lexbe will discuss the anatomy of a forensic collection from a smartphone using GenAI for analysis!

This Friday, Lexbe will host the encore webinar titled (wait for it!) Anatomy of a Forensic Collection from a Smartphone and How to Use GenAI to Analyze the Evidence (available here) at 2pm ET (1pm CT, 11am PT). Join Lexbe for an informative session on forensic collection from smartphones and analysis of the evidence using generative AI. During the webinar, you’ll learn:

  • Forensically sound collection procedures and documentation
  • The scope of ESI that can be collected
  • Forensic collection methods and workflow for mobile devices
  • How to assess exfiltrated and deleted ESI
  • Best practices to avoid mistakes and corruption of ESI
  • How to perform remote iPhone collections
  • How to leverage GenAI to analyze smartphone forensic data
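
Documenting integrity is central to the first bullet above. As a minimal, hedged sketch of that step (the file names and log format here are invented for illustration, not taken from the webinar), a collection workflow might hash the acquired image and append a chain-of-custody entry:

```python
# Minimal sketch of one "forensically sound" step: hash an extracted
# image and record it in a chain-of-custody log. File names and the
# log format are hypothetical.
import hashlib
import json
from datetime import datetime, timezone

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large mobile images don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def log_acquisition(image_path: str, examiner: str, log_path: str) -> dict:
    """Append a chain-of-custody entry (one JSON object per line) and return it."""
    entry = {
        "file": image_path,
        "sha256": sha256_of(image_path),
        "examiner": examiner,
        "acquired_utc": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Re-hashing the image later and comparing against the logged digest is one common way to demonstrate the evidence has not changed since acquisition.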

The webinar will be conducted by Nicholas Marrero, who is a computer forensic analyst with extensive experience in digital forensics, consulting and eDiscovery.


Increased reliance on smartphones due to the digital transformation of corporations, anytime and anywhere access to data, the proliferation of integrated apps, and the increased popularity of BYOD (bring your own device) programs make smartphones highly relevant sources of ESI (electronically stored information) for most cases. And once the smartphone data has been collected, GenAI can help identify patterns, detect anomalies, correlate data points, summarize the data, and provide unparalleled insights. This includes analyzing text messages, call logs, app usage, location data, and multimedia files. Register here to learn more about the anatomy of a forensic collection from a smartphone using GenAI for analysis!
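
As a toy illustration of the kind of pattern-finding such a workflow often starts with (the call-log layout and the after-hours window are assumptions for this sketch, not any vendor's actual pipeline), one might flag after-hours calls locally before handing the subset to a GenAI model for summarization:

```python
# Toy sketch: flag after-hours calls in a collected call log, then build
# a prompt from the flagged subset for a GenAI model to summarize.
# The record layout and 11pm-6am window are invented for illustration.
from datetime import datetime

call_log = [
    {"number": "+1-555-0101", "when": "2024-08-01T02:14:00", "secs": 340},
    {"number": "+1-555-0102", "when": "2024-08-01T10:05:00", "secs": 61},
    {"number": "+1-555-0101", "when": "2024-08-02T03:02:00", "secs": 910},
]

def after_hours(calls, start=23, end=6):
    """Return calls placed between `start` (11pm) and `end` (6am)."""
    flagged = []
    for c in calls:
        hour = datetime.fromisoformat(c["when"]).hour
        if hour >= start or hour < end:
            flagged.append(c)
    return flagged

anomalies = after_hours(call_log)
# The flagged records could then be included in a GenAI prompt, e.g.:
prompt = "Summarize the pattern in these after-hours calls:\n" + "\n".join(
    f'{c["when"]} {c["number"]} ({c["secs"]}s)' for c in anomalies
)
```

Pre-filtering like this keeps the model's context focused on the records that matter, and the same idea extends to text messages, location data, and app usage.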

So, what do you think? Are you finding discovery from smartphones to be necessary but challenging? If so, consider attending tomorrow’s webinar! And please share any comments you might have, or let me know if you’d like to hear more about a particular topic.

Image created using GPT-4o’s Image Creator Powered by DALL-E, using the term “robot doctor examining a small smartphone with a stethoscope where the other end of the stethoscope is placed on the smartphone”.

Disclosure: Lexbe is an Educational Partner and sponsor of eDiscovery Today


Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.




Published by Doug Austin

Doug is an established eDiscovery thought leader with over 30 years of experience providing eDiscovery best practices, legal technology consulting and technical project management services to numerous commercial and government clients. Doug has published a daily blog since 2010 and has written numerous articles and white papers. He has received the JD Supra Readers Choice Award as the Top eDiscovery Author for 2017 and 2018 and a JD Supra Readers Choice Award as a Top Cybersecurity Author for 2019. Doug has presented at numerous events and conferences, including Legaltech New York, ILTACON, Relativity Fest, University of Florida E-Discovery Conference, Masters Conference and many local and regional conferences. Doug has also presented numerous CLE-accredited webcasts.






Global Standardization of Forensics will Decrease the Bias Factor of Evidence Collection Procedures and Court Rulings

Interviews – 2018

Angus Marshall, Digital Forensic Scientist

Angus, tell us a bit about yourself. What is your role, and how long have you been working in digital forensics?

Where to begin? I have a lot of different roles these days, but by day I’m a Lecturer in Cybersecurity – currently at the University of York, and also run my own digital forensic consultancy business. I drifted into the forensic world almost by accident back in 2001 when a server I managed was hacked. I presented a paper on the investigation of that incident at a forensic science conference and a few weeks later found myself asked to help investigate a missing person case that turned out to be a murder. There’s been a steady stream of casework ever since.

I’m registered as an expert adviser and most of my recent casework seems to deal with difficult to explain or analyse material. Alongside that, I’ve spent a lot of time (some might say too much) working on standards during my time on the Forensic Science Regulator’s working group on digital evidence and as a member of BSI’s IST/033 information security group and the UK’s digital evidence rep. on ISO/IEC JTC1 SC27 WG4, where I led the work to develop ISO/IEC 27041 and 27042, and contributed to the other investigative and eDiscovery standards.

You’ve recently published some research into verification and validation in digital forensics. What was the goal of the study?

It grew out of a proposition in ISO/IEC 27041 that tool verification (i.e. evidence that a tool conforms to its specification) can be used to support method validation (i.e. showing that a particular method can be made to work in a lab). The idea of the 27041 proposal is that if tool vendors can provide evidence from their own development processes and testing, the tool users shouldn’t need to repeat that. We wanted to explore the reality of that by looking at accredited lab processes and real tools. In practice, we found that it currently won’t work because the requirement definitions for the methods don’t seem to exist and the tool vendors either can’t or won’t disclose data about their internal quality assurance.
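
The 27041 proposition can be pictured as an executable harness: each method requirement becomes a test vector, and a tool "verifies" if it reproduces the known-good output. This is only an illustrative sketch; the requirement names and the stand-in string-extraction tool below are invented, not drawn from the standard or the study:

```python
# Hedged sketch of the ISO/IEC 27041 idea: requirements expressed as
# (name, input, expected-output) test vectors, run against a tool.
# Requirement names and the stand-in tool are invented for illustration.
def verify_tool(tool, requirements):
    """Run each (name, data, expected) requirement; report pass/fail per name."""
    return {name: tool(data) == expected for name, data, expected in requirements}

# A trivial stand-in "tool": extract printable-ASCII runs of length >= 4.
def extract_strings(blob: bytes, min_len: int = 4):
    out, cur = [], []
    for b in blob:
        if 32 <= b < 127:          # printable ASCII range
            cur.append(chr(b))
        else:
            if len(cur) >= min_len:
                out.append("".join(cur))
            cur = []
    if len(cur) >= min_len:
        out.append("".join(cur))
    return out

requirements = [
    ("REQ-1 recovers embedded text", b"\x00\x01hello world\x02", ["hello world"]),
    ("REQ-2 ignores short runs", b"ab\x00cd\x00", []),
]
report = verify_tool(extract_strings, requirements)
```

The gap the study identifies maps directly onto this sketch: without agreed requirement definitions there is nothing to put in the `requirements` list, and without vendor quality-assurance data the lab cannot rely on the vendor having run an equivalent harness.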

The effect of it is that it looks like there may be a gap in the accreditation process. Rather than having a selection of methods that are known to work correctly (as we see in calibration houses, metallurgical and chemical labs etc. – where the ISO 17025 standard originated) which can be chosen to meet a specific customer requirement, we have methods which satisfy much fuzzier customer requirements which are almost always non-technical in nature because the customers are CJS practitioners who simply don’t express things in a technical way.

We’re not saying that anyone is necessarily doing anything wrong, by the way, just that we think they’ll struggle to provide evidence that they’re doing the right things in the right way.

Where do we stand with standardisation in the UK at the moment?

Standardization is a tricky word. It can mean that we all do things the same way, but I think you’re asking about progress towards compliance with the regulations. In that respect, it looks like we’re on the way. It’s slower than the regulator would like. However, our research at York suggests that even the accreditations awarded so far may not be quite as good as they could be. They probably satisfy the letter of the regulator’s documents, but not the spirit of the underlying standard. The technical correctness evidence is missing.

ISO 17025 has faced a lot of controversy since it has been rolled out as the standard for digital forensics in the UK. Could you briefly outline the main reasons why?

Most of the controversy is around cost and complexity. With accreditation costing upwards of £10k for even a small lab, it makes big holes in budgets. For the private sector, where turnover for a small lab can be under £100k per annum, that’s a huge issue. The cost has to be passed on. Then there’s the time and disruption involved in producing the necessary documents, and then maintaining them and providing evidence that they’re being followed for each and every examination.

A lot of that criticism is justified, but adoption of any standard also creates an opportunity to take a step back and review what’s going on in the lab. It’s a chance to find a better way to do things and improve confidence in what you’re doing.

In your opinion, what is the biggest stumbling block either for ISO 17025 specifically, or for standardizing digital forensics in general?

Two things – as our research suggests, the lack of requirements makes the whole verification and validation process harder, and there’s the confusion about exactly what validation means. In ISO terms, it’s proof that you can make a process work for you and your customers. People still seem to think it’s about proving that tools are correct. Even a broken tool can be used in a valid process, if the process accounts for the errors the tool makes.
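
That last point can be made concrete with a deliberately artificial example: suppose validation testing established that a tool reports timestamps exactly one hour early. The tool is "broken", yet a process that documents and corrects the known error can still be valid. The tool and the offset below are invented purely for illustration:

```python
# Illustration of "a broken tool in a valid process": the tool's output
# is consistently one hour early, and the documented method corrects it.
# Both the tool and the offset are hypothetical.
from datetime import datetime, timedelta

KNOWN_TOOL_OFFSET = timedelta(hours=1)  # error established during validation

def buggy_tool_timestamp(real_ts: datetime) -> datetime:
    """Stand-in for a tool whose reported timestamps are one hour early."""
    return real_ts - KNOWN_TOOL_OFFSET

def validated_method(real_ts: datetime) -> datetime:
    """The documented process: run the tool, then apply the known correction."""
    return buggy_tool_timestamp(real_ts) + KNOWN_TOOL_OFFSET
```

The validity lives in the method, not the tool: the process characterizes the error and compensates for it, which is exactly the distinction between tool correctness and process validation.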

I guess I’ve had the benefit of seeing how standards are produced and learning how to use the ISO online browsing platform to find the definitions that apply. Standards writers are a lot like Humpty Dumpty. When we use a word it means exactly what we choose it to mean.

Is there a way to properly standardise tools and methods in digital forensics?

It’s not just a UK problem – it’s global. There’s an opportunity for the industry to review the situation, now, and create its own set of standard requirements for methods. If these are used correctly, we can tell the tool makers what we need from them and enable proper objective testing to show that the tools are doing what we need them to. They’ll also allow us to devise proper tests for methods to show that they really are valid, and to learn where the boundaries of those methods are.

Your study also looked at some existing projects in the area: can you tell us about some of these? Do any of them present a potential solution?

NIST and SWGDE both have projects in this space, but specifically looking at tool testing. The guidance and methods look sound, but they have some limitations. Firstly, because they’re only testing tools, they don’t address some of the wider non-technical requirements that we need to satisfy in methods (things like legal considerations, specific local operational constraints etc.).

Secondly, the NIST project in particular lacks a bit of transparency about how they’re establishing requirements and choosing which functions to test. If the industry worked together we could provide some more guidance to help them deal with the most common or highest priority functions.

Both projects, however, could serve as a good foundation for further work and I’d love to see them participating in a community project around requirements definition, test development and sharing of validation information.

Is there anything else you’d like to share about the results?

We need to get away from thinking solely in terms of customer requirements and method scope. These concepts work in other disciplines because there’s a solid base of fundamental science behind the methods. Digital forensics relies on reverse-engineering and trying to understand the mind of a developer in order to work out how to extract and interpret data. That means we have a potentially higher burden of proof for any method we develop. We also need to remember that we deal with a rate of change caused by human ingenuity and marketing, instead of evolution.

Things move pretty fast in DF; if we don’t stop and look at what we’re doing once in a while, we’ll miss something important.

Read Angus Marshall’s paper on requirements in digital forensics method definition here.

The hottest topic in digital forensics at the moment, standardisation is on the tip of everyone’s tongues. Following various think pieces on the subject and a plethora of meetings at conferences, I spoke to Angus Marshall about his latest paper and what he thinks the future holds for this area of the industry. You can […]

via Angus Marshall talks about standardisation — scar