
Experimental Study of the Validity and Reliability of Digital Forensics Tools

NCJ Number
311530
Date Published
December 2011
Length
53 pages
Abstract

Digital forensic techniques and tools, as with all other forensic disciplines, must meet basic evidentiary and scientific standards to be allowed as evidence in legal proceedings. To be admissible, evidence or opinion derived from scientific or technical activities must come from methods proven to be “scientifically valid.” Scientifically valid techniques are capable of being proven correct through empirical testing. In practice, this means that the tools and techniques used in digital forensics must be validated, and that crime laboratories, including digital forensic labs, should be accredited or otherwise shown to meet such scientific standards. This task is overwhelming for governmental agencies that lack the funding to perform full-scale validation of all forensic tools. Validation is often left to individual examiners, who may lack the expertise and resources to conduct a scientific validation.

Our project directly addresses the National Academies’ concerns related to measurement validity in the digital evidence domain. Researchers from the National Center for Forensic Science (University of Central Florida), Purdue University, and law enforcement digital forensic experts conducted tests to identify issues with the reliability and accuracy of the most widely accepted software and hardware in use by law enforcement forensic examiners. We conducted approximately 250 validation tests of hardware (write blockers) and software for both the Windows and Mac OS X operating systems. Our research design employed these tools to perform common forensic tasks across varying operating system and file system conditions, following the Scientific Working Group on Digital Evidence (SWGDE) and National Institute of Standards and Technology (NIST) Computer Forensics Tool Testing guidelines. We selected the commercial forensic suites most commonly employed by law enforcement for our test bed, based on feedback collected from a survey of over fifty members of the International Association of Computer Investigative Specialists (IACIS). Our research design covers the most frequently encountered file systems, including several file systems each for Windows and Mac OS X. We also included select hardware write blockers, as they are crucial to the forensic examiner’s ability to duplicate media without changing the original evidence.
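To illustrate the core check behind write-blocker and duplication testing, the following is a minimal sketch in Python (not one of the study’s actual test scripts) that verifies a forensic duplicate against its source by comparing cryptographic hashes; the device path and image name are hypothetical.

    import hashlib

    def sha256_of(path, chunk_size=1 << 20):
        # Hash a device or image file in chunks so large media fit in memory.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical paths: a source device attached behind a write blocker
    # and the forensic image produced from it.
    original = sha256_of("/dev/sdb")
    duplicate = sha256_of("evidence.dd")
    print("MATCH" if original == duplicate else "MISMATCH: copy is not faithful")

If the hashes match, the duplicate is bit-for-bit identical to the source; any mismatch indicates either a faulty copy or that the original media was altered during acquisition.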

We used black box testing to identify issues with the accuracy and reliability of our selected hardware and software. In black box testing, the software is treated essentially as a “black box”: its internals are not inspected, and the performance of the application is evaluated against its functional requirements. We created detailed test plans, scripts for installation of operating systems, scripts for forensic tool suite installation, scripts for evidence creation, and descriptions of the hardware specifications used for testing. Due to the number and size of these documents, we are unable to include all documentation and reports here. They are publicly available for download from www.ncfs.org.
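As an illustration of the black box approach, the sketch below feeds a tool a known input and compares its observable output to truth data recorded during evidence creation; the tool command, image, and file names are hypothetical stand-ins, not the suites tested in the study.

    import hashlib
    import subprocess

    def sha256_file(path):
        # Hash a file so its contents can be compared to known truth data.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Truth data: a hash recorded when the evidence file was created.
    expected = sha256_file("truth/known_document.txt")

    # Run the tool as a black box: only its inputs and outputs are observed.
    subprocess.run(
        ["forensic_tool", "recover", "test_image.dd", "--out", "recovered/"],
        check=True,
    )

    actual = sha256_file("recovered/known_document.txt")
    print("PASS" if actual == expected else "FAIL: output deviates from requirement")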

The extrapolation of software validation results is inherently limited for several reasons. First and foremost, new versions of software may ‘fix’ previously detected faults (‘bugs’) or even introduce new ones. In addition, these bugs may interact with the testing environment (other software, hardware, the media under examination, etc.), so that a bug may be apparent only under certain circumstances. It is therefore crucial that examiners test their own combination of software version and hardware to identify any discrepancies between expected and actual results before conducting a forensic examination or relying upon its results. Accordingly, the results of our validation tests may be extrapolated only to the specific software versions used in our study.
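One way an examiner might act on this recommendation is to record the exact tool version and host environment alongside every validation result, so each result stays tied to the configuration that produced it. The following sketch assumes hypothetical tool and test names:

    import datetime
    import json
    import platform

    # Hypothetical record: the tool name, version, and test are placeholders.
    record = {
        "tool": "ExampleSuite",
        "tool_version": "7.4.2",
        "host_os": platform.platform(),
        "test": "deleted-file recovery on NTFS",
        "result": "PASS",
        "tested_on": datetime.date.today().isoformat(),
    }

    # Append to a running log so results remain tied to the versions tested.
    with open("validation_log.jsonl", "a") as log:
        log.write(json.dumps(record) + "\n")

A log like this makes it straightforward to tell, before an examination, whether the exact software and hardware combination at hand has been validated, or whether a version change requires retesting.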
