Digital Forensic Science. Vassil Roussev


Digital Forensic Science. Vassil Roussev. Synthesis Lectures on Information Security, Privacy, and Trust.


forensic tools for analyzing the two main mobile OS environments—Android and iOS.

      On the other hand, we see exponential growth in the volume of forensic data in need of processing (Section 6.1) and the accelerating transition to cloud-centric IT (Section 4.6). As our discussion will show, the latter presents a qualitatively new target and requires a new set of tools to be developed. Separately, we are also seeing a maturing use of security and privacy techniques, such as encryption and media sanitization, that eliminate some traditional sources of evidence, and make access to others problematic.

      It is difficult to predict which event(s) will mark the logical beginning of the next period, but one early candidate is the announcement of the AWS Lambda platform [9]. There is a broad consensus that the next major technology shift (over the next 10–15 years) will be the widespread adoption of the Internet of Things (IoT) [7], which is expected to bring online between 10 and 100 times more Internet-connected devices of all kinds. The fast-growing adoption of AWS Lambda as a means of working with these devices suggests that it could have an impact on the IT landscape similar to that of the original introduction of AWS.

      Lambda provides a platform in which customers write event-handling functions that require no explicit provisioning of resources. In a typical workflow, a device uploads a piece of data to a storage service, such as AWS S3. This triggers an event, which is automatically dispatched to an instance of a user-defined handler; the result may be the generation of a series of subsequent events in a processing pipeline. From a forensic perspective, such an IT model renders existing techniques obsolete, as there is no meaningful data to be extracted from the embedded device itself.
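The event-driven workflow described above can be sketched as a minimal Python Lambda handler. This is an illustrative sketch, not production code: the bucket and key names in the sample event are hypothetical, and a real deployment would forward the result to another service rather than return it.

```python
import json

def handler(event, context):
    """Minimal sketch of an S3-triggered Lambda handler.

    When a device uploads an object to S3, the service invokes this
    function with an event describing the upload. The handler extracts
    the object's location; in a real pipeline, the result would feed
    the next stage of processing (e.g., another event).
    """
    processed = []
    for rec in event.get("Records", []):
        s3 = rec["s3"]
        processed.append({
            "bucket": s3["bucket"]["name"],
            "key": s3["object"]["key"],
        })
    return {"processed": processed}

# Example invocation with a synthetic S3 PUT event (names are hypothetical)
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "device-uploads"},
                "object": {"key": "sensor-42/reading.json"}}}
    ]
}
result = handler(sample_event, None)
print(json.dumps(result))
```

Note that no server, storage volume, or network address is ever provisioned by the customer: the function instance exists only for the duration of the invocation, which is precisely why such computations leave none of the customary forensic traces.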

      The main point of this brief walk through the history of digital forensics is to link the predominant forensic methods to the predominant IT environment. Almost all techniques in widespread use today are predicated on access to the full environment in which the relevant computations were performed. This started with standalone personal computers, which first became connected to the network, then became mobile, and eventually became portable and universally connected. Although each step introduced incremental challenges, the overall approach continued to work well.

      However, IT is undergoing a rapid and dramatic shift from using software products to employing software services. Unlike prior developments, this one has major forensic implications; in simple terms, tools no longer have access to the full compute environment of the forensic target, which is a service hosted somewhere in a shared data center. Complicating things further is the fact that most computations are ephemeral (and do not leave the customary traces) and storage devices are routinely sanitized.

      We will return to this discussion in several places throughout the text, especially in Chapter 6. For now, the main takeaway is that the future of forensics is likely to be different than its past and present. That being said, the bulk of the content will naturally focus on systematizing what we already know, but we will also point out the new challenges that may require completely new solutions.

      CHAPTER 3

       Definitions and Models

      Forensic science is the application of scientific methods to collect, preserve, and analyze evidence related to legal cases. Historically, this involved the systematic analysis of (samples of) physical material in order to establish causal relationships among various events, as well as to address issues of provenance and authenticity.1 The rationale behind it—Locard’s exchange principle—is that physical contact between objects inevitably results in the exchange of matter leaving traces that can be analyzed to (partially) reconstruct the event.

      With the introduction of digital computing and communication, the same general assumptions were taken to the cyber world, largely unchallenged. Although a detailed conceptual discussion is outside the intent of this text, we should note that the presence of persistent “digital traces” (broadly defined) is neither inevitable nor a “natural” consequence of the processing and communication of digital information. Such records of cyber interactions are the result of conscious engineering decisions, ones not usually taken specifically for forensic purposes. This is a point we will return to shortly, as we work toward a definition that is more directly applicable to digital forensics.

      Any discussion on forensic evidence must inevitably begin with the Daubert standard—a reference to three landmark decisions by the Supreme Court of the United States: Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993); General Electric Co. v. Joiner, 522 U.S. 136 (1997); and Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999).

      In the words of Goodstein [78]: “The presentation of scientific evidence in a court of law is a kind of shotgun marriage between the two disciplines.… The Daubert decision is an attempt (not the first, of course) to regulate that encounter.”

      These cases set a new standard for expert testimony [11], overhauling the previous Frye standard of 1923 (Frye v. United States, 293 F. 1013, D.C. Cir. 1923). In brief, the Supreme Court instructed trial judges to become gatekeepers of expert testimony, and gave four basic criteria to evaluate the admissibility of forensic evidence:

      1. The theoretical underpinnings of the methods must yield testable predictions by means of which the theory could be falsified.

      2. The methods should preferably be published in a peer-reviewed journal.

      3. There should be a known rate of error that can be used in evaluating the results.

      4. The methods should be generally accepted within the relevant scientific community.

      The court also emphasized that these standards are flexible and that the trial judge has considerable leeway in determining the admissibility of forensic evidence and expert witness testimony. During legal proceedings, special Daubert hearings are often held in which the judge rules on the admissibility of expert witness testimony requested by the two sides.

      In other words, scientific evidence becomes forensic only if the court deems it admissible. It is a somewhat paradoxical situation that an evaluation of the scientific merits of a specific method is rendered by a judge, not scientists. There is no guarantee that the legal decision, especially in the short term, will be in agreement with the ultimate scientific consensus on the subject. The courts have a tendency to be conservative and skeptical with respect to new types of forensic evidence. The admissibility decision also depends on the specific case, the skill of the lawyers on both sides, the communication skills of the expert witnesses, and a host of other factors that have nothing to do with scientific merit.

      The focus of this book is on the scientific aspect of the analytical methods and, therefore, we develop a more technical definition of digital forensic science.

      Early applications of digital forensic science emerged out of law enforcement agencies, and were initiated by investigators with some technical background, but no formal training as computer scientists. Through the 1990s, with the introduction and mass adoption of the Internet, the amount of data and the complexity of the systems investigated grew quickly. In response, digital forensic methods developed in an ad hoc, on-demand fashion, with no overarching methodology or peer-reviewed venues. By the late 1990s, coordinated efforts emerged to formally define and organize the discipline, and to spell out best field practices.
