INTRODUCTION
In this chapter, I introduce the science of computer forensics: both what it is and what it is not. Although the term is heard these days with increasing frequency, the discipline itself has existed for only a short time. Only within the past 20 years has the admission of digital evidence gained consistent recognition in our court systems. In fact, some court systems around the world may not admit certain types of digital evidence. As you might imagine, given the relative youth of this area of study, some mistakes were made early on. It is to be hoped that those mistakes have long since been corrected and, while completely smooth sailing is doubtful, at least calmer seas are on the horizon. As we begin this journey, it is only fitting to lay some groundwork. This groundwork, in the form of history, will help provide a clearer picture of the role that computer forensics plays in our legal system. It begins with a discussion of forensics itself.
FORENSIC SCIENCE
The term "forensics" is often misunderstood and is frequently misused. Whether in popular television or the news media, the term is often thrown around without regard to what it actually means. From the Latin forensis, the term means "belonging to, used in or suitable to courts of judicature or to public discussion and debate."1 Forensics exists independently of any particular field of study.
For example, forensic entomology, while grouped with other forensic fields, is really nothing more than the scientific study of insects, with the added qualifier that it is done with a goal of introduction into court. In fact, Gil Grissom of television's CSI is probably America's most famous forensic entomologist, albeit a fictional one, and has single-handedly made bug lovers sexy.
In reality, although there is a small group of fields in which forensic analysis is common, practically any field of study is amenable to such work. For example, most people have heard of forensic psychologists, who offer evidence in court regarding mental states and conditions; few people know that there are forensic engineers, who offer scientific evidence within their subspecialty. A forensic electrical engineer, for instance, might offer testimony regarding the cause of a fire related to faulty wiring. Fewer people still have heard of the field of forensic linguistics. Since linguistics is the study of language, a forensic linguist might be asked to compare the language used in a suicide note with miscellaneous writings the deceased produced before death, to try to determine whether the note was in fact written by the deceased.
Therefore, by extension of this logic, computer forensics is the scientific study of computers, conducted in a manner consistent with the rules of evidence and the rules of court procedure. That is exactly what the field of computer forensics is. It is equally important to understand what it is not.
Even among those knowledgeable in the field, some confusion exists over what particular areas of computer science should actually be included under the umbrella of computer forensics. In order to better illustrate what is, and what is not, traditionally considered computer forensics, a brief history of the evolution of computer science into the study of computer forensics is helpful.
HISTORY OF COMPUTER FORENSICS
The most influential aspect of computer history is the history of the machines themselves. The evolution of the computer from a mysterious black box of interest only to academics and technical types, to a ubiquitous fixture in nearly every home, is a unique and interesting story.
One of the biggest changes to occur is the sheer size of the computer. In the early 1950s, the first computers were housed in buildings dedicated solely to their operation. These behemoths, less sophisticated than today's three-dollar calculator, were unbelievably costly and amazingly temperamental. Designed and built using conventional vacuum tubes, many of the circuits were large enough for computer scientists to actually walk among the components, removing debris and small bugs that were causing malfunctions; hence the term "bug," which in computer lingo signifies an operating glitch. Their size and cost made the first computers little more than curiosities for the average American. In fact, until 1981, when IBM released its first personal computer (PC), personal home computers were a rarity.2
Or perhaps the mystique that shrouds the computer results from the fact that computers speak their own language. Originally, computers were not programmable in the sense we think of today. Eventually, as they evolved, the ability to change their configuration emerged, and, while difficult under the best of conditions, changes could be made to their functionality. As the computational power of computers expanded during the late 1940s and 1950s, interacting with them became a greater focus.
In 1954 John Backus, an employee of IBM, developed the first high-level programming language.3 This language, FORTRAN, short for formula translation, was subsequently released commercially, and thus began the computer revolution. Prior to FORTRAN and the other high-level languages that would follow, such as COBOL and C++, the only way to communicate with the computer was through machine language: a series of 0s and 1s. Machine language eventually gave rise to a second layer of language known as assembly language, which turned the 0s and 1s of machine code into human-readable words, such as PUSH, POP, and MOVE.
From this highly complicated language system emerged FORTRAN and COBOL, and later C++. These high-level languages, while much simpler than machine language, were still well beyond the capabilities and comprehension of the average citizen, which contributed to the mystique of computers. Unlike the telephone, itself an unprecedented scientific advancement, the computer required its users to learn an entirely new language to communicate with it.
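To make the layering concrete, the sketch below (not part of the original text) shows the same trivial operation, adding two numbers, written in a high-level language, with comments suggesting roughly what the equivalent assembly mnemonics and raw machine code might look like. The example uses C rather than FORTRAN simply for familiarity, and the assembly and binary renderings are illustrative approximations, not the output of any particular compiler or processor.

    /* A minimal, hypothetical illustration of the language layers described above. */
    #include <stdio.h>

    int main(void) {
        int a = 2;
        int b = 3;
        int sum = a + b;   /* High-level language: one readable statement.        */
                           /* Assembly (illustrative):  MOV  eax, a               */
                           /*                           ADD  eax, b               */
                           /*                           MOV  sum, eax             */
                           /* Machine code (illustrative): 10001011 00000011 ...  */
        printf("%d\n", sum);
        return 0;
    }

Each layer expresses the same instruction; the higher the layer, the closer it reads to ordinary human language.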
Whatever the reason, whether cost or communication barriers, computers remained an academic and military phenomenon for much of their early lives. However, as computers began to gain a foothold, a cottage industry of home computer kits emerged. These kits, ranging in cost from $1,500 to $4,000, were targeted at computer and electronics hobbyists who wanted to own their own computer (some assembly required).4
Historically, many of the advances in the home computer, later rebranded the personal computer thanks to IBM's marketing of the IBM-PC, occurred in a hobbyist, garage-tinkering way. Industry leaders such as Bill Gates, Steve Jobs, and Steve Wozniak began their careers by building home-brewed versions of commercial products. Were it not for the innovations of these early pioneers, the PC would not have evolved in the fashion it did.5 This characteristic is much more than an interesting footnote to history. On the contrary, I believe it is the single most important factor influencing the nature of computer forensics.
The modality through which early home computers evolved promoted an environment of innovation and tinkering, the heart and soul of which is exploration and adaptation. I liken the environment of the 1970s and early 1980s, during which some of the greatest advancements in home computers were made, to a young child disassembling a parent's transistor radio to figure out how it works. This spirit of exploration, while at the heart of almost all innovations and inventions, would have been no different from the exploration of our ancestors such as Guglielmo Marconi and Enrico Fermi, but for the influence of one phenomenon: the Internet.
There is some disagreement over the actual origin of the Internet. Some claim that it was built in cooperation with the Department of Defense as a vast nationwide "communications bomb shelter." Others argue that it was more about linking research institutions together than providing for the common defense.6 Regardless of which side you believe, the Internet was in fact originally a small network of computers known as the ARPANET. The ARPANET originally consisted of four computers located at research facilities at the University of California at Los Angeles, the Stanford Research Institute, the University of California at Santa Barbara, and the University of Utah. From those humble beginnings arose the phenomenon we know today.7
Much like the PC, the environment in which the ARPANET began to grow greatly influenced its development. From its early days, the Internet evolved as a space for the exchange of information, a commons, if you will, where both ideas and academic materials could flow freely. Information flowed so freely, in fact, that as the network began to grow, so did military concerns about security. After more and more nonmilitary institutions began joining the network, the Department of Defense decided to abandon it in favor of its own network. In 1983 MILNET was formed using the same basic backbone as the original system.8
It was from this original academic mind-set that the Internet as we know it emerged. Understanding the academic background of the Internet is important because of the type of community it promoted among its users. This community was formed in the spirit of cooperation and the free sharing of information. Academic pursuit thrives on knowledge, information, the free flow of ideas, and unfettered access. In the early days, the concepts of ownership and regulation of this "cyber" space were the last things on the minds of the newly emerging netizens. In this almost "Wild West frontier" environment, the rules, such as they were, were loose, highly fluid, and designed more as honor codes than as traditional rules. Information and free access were king and queen, and the citizens of this new domain were short on regulation and long on enthusiasm.9
This attitude, coupled with the developments in the PC world, created the beginnings of our computer forensic industry. Computer icons like Bill Gates, Steve Jobs, and Steve Wozniak built their fortunes on more than merely the spirit of competition. They built them on innovation born of the spirit of exploration and tinkering and a how-can-I-make-it-better attitude. The Internet, in its early days of nonregulation, was an environment tailor-made for this entrepreneurial spirit. Additionally, the average computer user during the early days of the Internet was more like Bill Gates than today's black-box user.
Computers were more a phenomenon of the hobbyist and electronics buff than a fixture in every home. As a result, these users shared much more closely the personality traits of the early adopters like Gates, Wozniak, and Jobs. All these traits (openness, innovation, and exploration) combined to create a freewheeling world in which the only rules were that ...