Being Watched

Legal Challenges to Government Surveillance

eBook - ePub

  • 208 pages
  • English
  • ePUB (mobile friendly)
  • Available on iOS & Android

About this book

A riveting history of the Supreme Court decision that set the legal precedent for citizen challenges to government surveillance

The tension between national security and civil rights is nowhere more evident than in the fight over government domestic surveillance. Governments must be able to collect information at some level, but surveillance has become increasingly controversial due to its more egregious uses and abuses, which tip the balance toward increased—and sometimes total—government control. This struggle came to the forefront in the early 1970s, after decades of abuses by U.S. law enforcement and intelligence agencies were revealed to the public, prompting both legislation and lawsuits challenging the constitutionality of these programs. As the plaintiffs in these lawsuits discovered, however, bringing legal challenges to secret government surveillance programs in federal courts faces a formidable obstacle: the principle that limits court access only to those who have standing, meaning they can show actual or imminent injury—a significant problem when evidence of the challenged program is secret.

In Being Watched, Jeffrey L. Vagle draws on the legacy of the 1972 Supreme Court decision in Laird v. Tatum to tell the fascinating and disturbing story of jurisprudence surrounding standing in citizen challenges to government surveillance in the United States. The book examines the facts of surveillance cases and the reasoning of the courts that heard them, and considers whether the obstacle that standing poses to surveillance challenges in U.S. courts can ever be overcome.

Vagle journeys through a history of military domestic surveillance, tensions between the three branches of government, the powers of the presidency in times of war, and the power of individual citizens in the ongoing quest for the elusive freedom-organization balance. This history brings to light the remarkable number of similarities among the contexts in which government surveillance thrives, including overzealous military and intelligence agencies and an ideologically fractured Supreme Court. More broadly, Being Watched looks at our democratic system of government and its ability to remain healthy and intact during times of national crisis.

A compelling history of a Supreme Court decision and its far-reaching consequences, Being Watched is essential reading for anyone seeking to understand the legal justifications for—and objections to—surveillance.

Information

Publisher: NYU Press
Year: 2017
Print ISBN: 9781479809271
eBook ISBN: 9781479841530
Topic: Law
Index: Law

1

You Are Being Watched

It was not long after the events of September 11, 2001, and the subsequent actions of the United States and its allies, that Joanne Mariner began to suspect that her government was spying on her.1 Her concern was not without foundation. As an attorney and deputy director of the Americas Division at Human Rights Watch (HRW), she had been doing research on the legal issues surrounding the U.S. government’s covert policies adopted as part of its “global war on terrorism,” including the CIA practice of “extraordinary renditions” and the decision to house some prisoners at a detention camp at the Guantanamo Bay Naval Base, located on the southeastern coast of Cuba. In 2004 Mariner was asked to chair a working group within HRW to come up with a strategic plan to address the growing list of human rights issues emerging from the global war on terrorism. As a result of the working group’s research, HRW formed a new Terrorism and Counterterrorism Center in 2005. In 2006 Mariner was appointed the center’s director.
Among the first findings of HRW’s new Terrorism Center was the revelation that the CIA was operating secret prisons at “black sites” in countries such as Afghanistan, Poland, and Romania, where access and oversight of prison conditions could more easily be limited and controlled. As its research into these secret prisons expanded, HRW began to discover instances of CIA prisoner abuses, and this became a focal point for Mariner’s work. From 2006 through 2009, she spent much of her time traveling around the world tracking down former CIA prisoners and trying to convince them to speak with her about the abuses they witnessed and suffered while they were held at these black sites. During this time, she spoke with many former detainees, who related some of the shocking stories of torture and abuse that were later confirmed in the 2014 U.S. Senate Select Committee on Intelligence (SSCI) report, Committee Study of the Central Intelligence Agency’s Detention and Interrogation Program, commonly known as the Torture Report.2
Mariner and her team traveled extensively during this period, speaking not only with former detainees, but also with witnesses, experts, scholars, political activists, foreign government officials, and other HRW staff, discussing highly sensitive information, sometimes related to terrorism and counterterrorism, and often relating to U.S. foreign affairs. As this work progressed, it became increasingly obvious to Mariner and others at HRW that the CIA—and others within the U.S. government—would be highly interested in knowing the details of these communications, and she harbored few doubts that their communications would be monitored at every opportunity.
This was not a surprising conclusion for Mariner to reach. By the mid-2000s, stories began to surface telling of the mass interception of telecommunications by the U.S. intelligence community and, by extension, its partners in the “Five Eyes” signals intelligence–sharing alliance, comprising the United Kingdom, Canada, Australia, New Zealand, and the United States. Rumors about the breadth and depth of the ultra-secretive NSA’s data collection activities had long existed, but without evidence of these programs, these rumors were largely dismissed as the paranoid fantasies of the tinfoil hat community. The 1998 film Enemy of the State, a high-tech thriller starring Will Smith, depicted the NSA as an omniscient, power-hungry government body that is willing to kill in order to enhance and protect its power, and has at its instantaneous command nearly limitless technological resources with which to track and control its enemies (one of which happens to be Will Smith’s character). The premise of the film, while entertaining, was discounted by intelligence and technology experts as far-fetched, as it was generally agreed that the feats described in Enemy of the State would require technological capabilities far beyond anything the real-world NSA could muster. And besides, there were laws preventing foreign intelligence agencies like the NSA and CIA from spying on U.S. citizens, right? The rumors of a technologically powerful NSA with the ability to surveil the communications of every U.S. citizen seemed utterly implausible. To understand how these once fanciful or paranoid stories began to take hold among rational thinkers like Joanne Mariner, however, we turn to the story of Room 641A.
Just about the time Mariner and her colleagues were beginning their research on terrorism and counterterrorism for HRW, a technician named Mark Klein, then working for the global telecommunications giant AT&T, noted the construction of a new room in AT&T’s Folsom Street networking facility in San Francisco.3 This room, numbered 641A, was being built immediately adjacent to the room containing a 4ESS switch, a powerful computer switching system that was used to direct long-distance telephone calls through the Folsom Street facility. The facility also held the infrastructure that provided AT&T’s WorldNet Internet service, international and domestic Voice Over IP (VoIP) telephony services, and data transport service to Asia and the Pacific Rim. The information flow moving through the Folsom Street facility was not a mere trickling headwaters—it was an Amazon of data.
Room 641A, known in AT&T documents as the SG3 Secure Room, was secretly built and equipped in 2002 for the NSA. NSA agents made repeated visits to the Folsom Street facility in 2002 and 2003 to supervise the work. Unlike access to other rooms in the facility, access to Room 641A was limited to a handful of AT&T employees who had been personally cleared by the NSA. The AT&T technicians who were expected to service and maintain the equipment in the facility had master keys to every room but Room 641A. The extremely limited access to 641A began to cause problems at the facility. In 2003 a large industrial air conditioner in the secure room began to leak, and water eventually made its way onto the equipment housed on the floor beneath it. Sensitive electronic equipment does not react well to water, and there was a risk of interrupting customer services and causing untold damage to AT&T's expensive communications facility. But because access to 641A was limited to only a few AT&T employees—who did not necessarily work at or near the Folsom Street facility—it took days before a cleared technician arrived to enter the secure room and repair the faulty air conditioner. With the attendant inconvenience, risk, and expense of such a secretive endeavor, why was the NSA so interested in the AT&T Folsom Street facility? The answer can easily be seen in the facility's function as a choke point through which vast amounts of communications information were forced to flow at scales unimaginable until it was discovered that light could be harnessed to transmit information over ultrathin strands of glass.
Copper wire had long been the favorite medium of telecommunications companies for transmitting electric signals. Due to its high electrical conductivity, strength, and ease of handling, copper was (and is) an excellent material for this purpose. One of the limitations of copper wire for telecommunications, especially transoceanic communication, was its inability to cleanly propagate electrical signals across long distances without the use of multiple repeaters—essentially, signal boosters—along the way. Relatedly, copper was also prone to “leak” electromagnetic energy, and was vulnerable to interference from other sources of electromagnetic radiation unless the copper was properly shielded from this interference. In the 1970s, advances in the manufacture of optical fiber—clear, flexible strands of extruded glass or plastic that used light to transmit information—made this medium an attractive alternative to copper. Optical fiber was electrically nonconductive, and was therefore not vulnerable to electromagnetic interference issues. Signal loss over long distances was orders of magnitude lower with optical fiber versus copper. Telecommunications companies such as AT&T quickly began replacing their long-haul copper wiring with optical fiber.
The switch from copper to optical fiber expanded not only the reach of land-based telecommunications but also its capacity. The vast amount of data transmitted globally in our current information economy would not be possible without the enormous bandwidth that optical fiber technology makes available. A single optical fiber can carry three million full-duplex telephone calls or more than ninety thousand television channels. The fiber bundles used at the Folsom Street facility typically consisted of ninety-six of these fibers, and multiple sets of these fiber bundles created a high-speed "common backbone" for data transmission, making the Folsom Street facility a major information hub that linked other such facilities across the world. The facility handled all AT&T data going to and from the Pacific Rim, and there was a lot of it. Anyone who could tap into this vast river of information at this point would have an extremely efficient means of eavesdropping on a significant portion of U.S. domestic and international communications. But while the Folsom Street facility offered an attractive opportunity, a potential eavesdropper first faced two technical hurdles and one legal one. First, "tapping" optical fiber was not the straightforward process it had been for copper wire. Second, doing any sort of useful analysis on the vast amounts of data flowing through the facility would require unheard-of amounts of computing horsepower and algorithmic sophistication. Third, the third-party tapping of private telephone conversations or electronic communications is generally illegal for private entities, and even government agencies had to jump through legal hoops to intercept domestic communications. The NSA, however, had developed its own answers to all three problems.
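To give a rough sense of the scale those figures imply, the following back-of-envelope sketch converts the per-fiber call count into throughput. The 64 kbit/s-per-call figure is a standard digital-voice assumption rather than a number from the book, so treat the result as illustrative only.

```python
# Back-of-envelope throughput estimate for the Folsom Street fiber bundles.
# Assumption (not from the book): each full-duplex call is a standard
# 64 kbit/s digital voice channel.
CALLS_PER_FIBER = 3_000_000   # full-duplex calls per fiber (figure from the text)
BITS_PER_CALL = 64_000        # assumed bits per second per voice channel
FIBERS_PER_BUNDLE = 96        # fibers per bundle at the facility (from the text)

per_fiber_bps = CALLS_PER_FIBER * BITS_PER_CALL
per_bundle_bps = per_fiber_bps * FIBERS_PER_BUNDLE

print(f"Per fiber:  ~{per_fiber_bps / 1e9:.0f} Gbit/s")    # ~192 Gbit/s
print(f"Per bundle: ~{per_bundle_bps / 1e12:.1f} Tbit/s")  # ~18.4 Tbit/s
```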
First, how does one go about tapping fiber optic cable? In the old days, when copper cables were the only method of transmitting wireline communication, the job was relatively simple. Since copper used electrical signals to transmit information, it conducted small amounts of electrical energy along its length, some of which "leaked" or radiated outward from the cable as electromagnetic energy. Eavesdroppers who wanted to listen in on the information passing over a copper wire needed only to wrap some sort of passive device around the cable to absorb this leaked energy, gaining access to the information passing over the copper cable without actually disturbing the original electrical circuit. Tapping fiber optic cable is not nearly this straightforward, since optical fiber does not use electricity to transmit information, but instead uses light.4 An optical fiber is constructed by wrapping its transparent core with a cladding material that has a lower refractive index than the core material. This construction causes light to propagate through the fiber by reflecting off the cladding, so the light is kept completely within the fiber's core. Any information passed through the fiber via light pulses is therefore visible only at the two ends of the unbroken fiber—no information is "leaked" through the cladding. As a result, any would-be fiber optic eavesdropper must be able to physically cut and splice the fiber in order to listen in on the transmitted information.
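The refractive-index relationship described above is what traps the light in the core: rays striking the core-cladding boundary beyond a critical angle undergo total internal reflection. A minimal numerical sketch follows; the index values are typical for silica fiber and are assumed for illustration, not taken from the book.

```python
import math

# Minimal sketch of total internal reflection in an optical fiber.
# Assumed, illustrative refractive indices (typical for silica fiber).
n_core = 1.48      # refractive index of the transparent core
n_cladding = 1.46  # lower-index cladding wrapped around the core

# Light hitting the core/cladding boundary at an angle (from the normal)
# greater than the critical angle is reflected back into the core, so the
# signal is visible only at the fiber's two ends.
critical_angle_deg = math.degrees(math.asin(n_cladding / n_core))
print(f"Critical angle: {critical_angle_deg:.1f} degrees")  # ~80.6 degrees
```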
Splicing optical fiber requires a level of expertise far beyond merely cutting two cables and twisting their ends together, as one might do with copper wiring. It is an expensive process that requires specialized equipment and can be quite intrusive in a busy hub like the Folsom Street facility, where vast amounts of customer data were already flowing through its optical fiber bundles. Further, the NSA did not wish simply to intercept the information being transmitted over these fibers, but to “split” the signals into two identical copies, one of which would go to its original destination, and the other would be routed to NSA equipment. Just as dividing the flow of water through plumbing reduces the downstream capacity of the original pipes, this “splitting” of light pulses through AT&T’s optical fibers also reduced their downstream signal strength. Such an operation required careful planning, and a small group of cleared AT&T technicians—one of whom was Mark Klein—was brought in to assist the NSA in this complex task. As an expert in these matters, Klein was asked to review the NSA’s “Cut-In and Test Procedure” documents that articulated its plan to split the Folsom Street facility’s fiber optic signals, with one branch of the resulting split going directly to Room 641A. While reviewing these documents, Klein noticed that the plan allowed the NSA to capture the optical signals from the facility’s peering links, which meant that the NSA would receive communications information not only from AT&T customers but also from all non-AT&T customers who used these links. Through his review of the NSA’s cut-in and splicing plans for the Folsom Street facility, Klein recognized that the NSA was giving itself a direct tap into AT&T’s communications backbone.
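The downstream signal loss that splitting introduces can be expressed with simple decibel arithmetic. The 50/50 split ratio below is an assumption for illustration; the actual ratio used in the cut-in is not stated in the text.

```python
import math

# Illustrative decibel arithmetic for an optical splitter.
# Assumption: a 50/50 split, sending half the optical power down each branch.
split_fraction = 0.5

loss_db = -10 * math.log10(split_fraction)
print(f"Loss per branch: ~{loss_db:.1f} dB")  # ~3.0 dB

# Each branch carries half the original optical power, which is why the
# cut-in had to be planned carefully so both copies remained usable.
```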
The second technical hurdle to eavesdropping on an information hub like the Folsom Street facility was the challenge of making sense of the enormous flow of data in real time. That is, if the NSA wanted to conduct even the most superficial of analyses on this fast-flowing river of communications information, it would need computational capacities and analytical algorithms generally unknown to the commercial world at the time. But this problem fell right into the NSA's sweet spot. It had long been known to deploy "acres" of the fastest and most powerful computers in the world, and as Moore's Law continued to hold true, the NSA wielded untold depths of computing power within smaller and smaller footprints. Further, the NSA was known to be one of the biggest employers of mathematics, physics, and computer science PhDs in the world. This much concentrated brainpower was likely to yield computational techniques and algorithms that would put commercially available software and firmware to shame. The exact details behind the NSA's analytical capabilities are highly classified, of course. Mark Klein was given a glimpse into the NSA's design for the Folsom Street facility, however, through his review of its "cut-in" plans. Some of the equipment being loaded into Room 641A included high-end Sun Microsystems servers, fast Juniper "backbone" traffic routers, a Narus STA 6400 "Semantic Traffic Analyzer," and a Narus Logic Server. In other words, Room 641A was being packed with some of the fastest, most powerful, and most sophisticated telecommunications processing and analytical gear then available. As we now know, the NSA's technology has given it the power to effect just the sort of deep-and-wide dragnet surveillance once dismissed as science fiction when depicted by Hollywood.
Before the NSA could move forward with its plan to tap all communications going through the AT&T Folsom Street facility, however, it had one final obstacle to overcome—the law. Since its creation by a secret presidential memorandum in 1952, the NSA’s mission was to (a) strengthen U.S. signals intelligence capabilities, (b) support the nation’s ability to wage war, and (c) generate information central to the conduct of foreign affairs. The NSA was meant to be an outward-facing agency, obtaining foreign intelligence through the intercept of foreign communications. Following the revelation of multiple serious illegal activities and other abuses by the U.S. intelligence community (IC) in the late 1960s and early 1970s (which will be explored in greater depth in later chapters), Congress enacted a new law—the Foreign Intelligence Surveillance Act (FISA)—to prevent future IC abuses by forbidding intelligence agencies from using foreign intelligence gathering as an excuse to conduct domestic surveillance. This law also placed four crucial limits on the nature of foreign intelligence gathering by the IC. First, the new law required all targets of foreign intelligence surveillance to be either a foreign power or an agent of a foreign power. Second, the law required that intelligence agencies show probable cause—a standard drawn from criminal law—that would demonstrate, with particularity, that the target to be placed under surveillance was a foreign power or an agent of a foreign power. Third, the law limited the basis of this probable cause to exclude activity otherwise protected by the First Amendment. Finally, the new law demanded that intelligence agencies draft and adhere to “minimization procedures” that would find and destroy all inadvertently collected data not related to foreign intelligence. After FISA was enacted, the intelligence agencies reluctantly accepted the restrictions as the new cost of doing business.
The terrorist attacks of September 11, 2001, however, made some members of the executive branch question the FISA restrictions, reasoning that, in times of national crisis, the intelligence community had to "take the gloves off" in order to do its job of helping to protect the nation effectively. FISA restrictions would slow down the intelligence-gathering process and would hamper efforts to identify and locate potential threats. Further, this new kind of asymmetrical warfare meant that the concepts of discrete battlefields and easily identifiable enemies were now antiquated. It was no longer practical to assume that threats would come from outside our nation's borders, and the intelligence community therefore needed to expand its brief to include all communication activity, foreign and domestic. In other words, the Bush administration reasoned that military and intelligence agencies now needed a free hand to conduct their business, and the peacetime rules no longer applied.
None of this reasoning was aired in public, however. For the intelligence agencies to conduct their business on the "dark side" (made necessary by this new kind of warfare), the administration also needed to operate in this fashion, and thus elected to treat all such discussions as privileged under the president's constitutional wartime powers. This philosophy meant that the unnecessary involvement of Congress or the courts would only serve to further damage the U.S. defense posture—and the Bush administration at this critical point considered nearly all communication with Congress unnecessary. The challenge for the administration, therefore, was how to allow the NSA unfettered access to foreign and domestic communication without either applying for a court order (as required by FISA) or going to Congress for special authorization. Both options were likely to take more time than the Bush administration wanted to spend, and, given the extremely broad scope of information being sought, both stood a better than fair chance of denial. The only viable option the Bush administration recognized at this point was to plow ahead unilaterally under what it saw as its wartime powers.
Following this reasoning, President George W. Bush secretly authorized a number of covert intelligence collection activities in the wake of the September 11 attacks, collectively known as the President's Surveillance Program (PSP). All information gathered under PSP authorization was specially classified and compartmented to prevent outside knowledge of these activities. The administration knew that the courts and Congress would likely view the PSP as illegal without additional congressional authority, so secrecy was the order of the day. The secrecy of programs like the PSP is difficult enough to maintain when the only actors are themselves agents of the government. The problem is exacerbated when outsiders—like AT&T—are brought into the equation. U.S. corporations have a long history of providing covert assistance to the government in times of crisis. Some of those activities have been legal, but many have occupied a legal gray area that the companies have been less than eager to make public or include in their corporate histories.
The U.S. legal system distinguishes between the government (along with its agents) and everyone else. Specifically, the constitutional rights and privileges found in the Bill of Rights apply only to government actors. Generally speaking, it is therefore not possible to hold corporations liable for violating the prohibition against unreasonable searches and seizures found in the Fourth Amendment. More plainly, unlike the government, Google can (and does) read your email sent through Google Mail without fir...

Table of contents

  1. Cover
  2. Title Page
  3. Dedication
  4. Contents
  5. 1. You Are Being Watched
  6. 2. A History of Government Surveillance
  7. 3. Getting through the Courthouse Door
  8. 4. The Doctrine of Article III Standing
  9. 5. Before the Supreme Court
  10. 6. Government Surveillance and the Law
  11. 7. The Legacy of Laird v. Tatum
  12. 8. Technology, National Security, and Surveillance
  13. 9. The Future of Citizen Challenges to Government Surveillance
  14. Notes
  15. Index
  16. About the Author