Internet for the People

The Fight for Our Digital Future

Ben Tarnoff
About This Book

In Internet for the People, leading tech writer Ben Tarnoff offers an answer. The internet is broken, he argues, because it is owned by private firms and run for profit. Google annihilates your privacy and Facebook amplifies right-wing propaganda because it is profitable to do so. But the internet wasn't always like this: it had to be remade for the purposes of profit maximization, through a years-long process of privatization that turned a small research network into a powerhouse of global capitalism. Tarnoff tells the story of the privatization that made the modern internet, and which set in motion the crises that consume it today.

The solution to those crises is straightforward: deprivatize the internet. Deprivatization aims at creating an internet where people, and not profit, rule. It calls for shrinking the space of the market and diminishing the power of the profit motive. It calls for abolishing the walled gardens of Google, Facebook, and the other giants that dominate our digital lives, and for developing publicly and cooperatively owned alternatives that encode real democratic control. To build a better internet, we need to change how it is owned and organized: not with an eye towards making markets work better, but towards making them less dominant; not in order to create a more competitive or more rule-bound version of privatization, but to overturn it. Otherwise, a small number of executives and investors will continue to make choices on everyone's behalf, and those choices will remain tightly bound by the demands of the market. It's time to demand an internet by, and for, the people.

PART I
THE PIPES
1
A People’s History of the Internet
On November 22, 1977, a van rolled down the freeway between San Francisco and San Jose. It was boxy and gray, the kind used for deliveries. At a distance it would’ve looked unremarkable: one of countless cars crawling up and down the peninsula in the rain. But if you’d gotten a closer look, you would’ve seen something a bit unusual: two large antennas stuck to the roof. This was the first clue. Drawing closer, you might’ve seen something else through the rear windows: a person typing at a computer terminal. In fact, the whole back of the van was filled with electronics. It looked like the inside of a research lab, the sort you might find in the meticulously landscaped office parks of the surrounding region, a place so crowded with semiconductor companies that it had recently become known as Silicon Valley.
But what made this van special wouldn’t be visible no matter how close you came. The van was a node in a network. Not a single network, but a network of networks—an inter-network. This internetwork was immense. It spanned land, sea, sky, and space, and stitched together computers from all over the world.
The first computer sat in the back of the van. It transformed the words being typed on the terminal into discrete slices of data called “packets.” These packets were encoded as radio waves and transmitted from the van’s antennas to repeaters on nearby mountaintops, which amplified them. With this extra boost, they could make it all the way to Menlo Park, where an office building received them.
In Menlo Park, the packets underwent a subtle metamorphosis. They shed their ethereal shape as radio waves and acquired a new form: electrical signals in copper telephone lines. Then they embarked on a long journey, riding those lines all the way to the East Coast before sailing via satellite over the Atlantic Ocean.
The packets touched down in a facility on the outskirts of Oslo. From there they ran to London, then over to England’s southwestern edge. Goonhilly Satellite Earth Station was the largest facility of its kind in the world at the time: a cluster of satellite dishes encircled by the bogs and marshes of Cornwall. Here, the packets took flight once again. A dish hurled them some twenty thousand miles up into space, where they bounced off an orbiting satellite and dove back down to earth, landing on the other side of the Atlantic, in a narrow valley that cut through the densely forested foothills of the Alleghenies: Etam Earth Station, West Virginia.
Etam wasn’t far from the site of the first skirmish of the Civil War. The young Ambrose Bierce had fought in that battle; he would later remember the region as “an enchanted land,” thick with fragrant spruce and fir, and populated with wild pigs that had once feasted on the corpses of his fellow Union soldiers.
Here, the packets returned to fixed lines. They continued northeast, to an office in an old warehouse on the western end of Cambridge, Massachusetts, before bending back across the country toward Los Angeles. This was their final destination: a complex overlooking the palm trees and pleasure craft of Marina del Rey, only four hundred miles south of the van where they had originated.
The packets didn’t know their route in advance. But they did know their destination. This was written into each packet, like the address on an envelope. Whenever a packet crossed from one network to the next, a computer called a gateway examined the address and forwarded the packet on to its next stop. This routine was repeated until the data reached Marina del Rey. There, the destination computer sent a note to the computer in the van saying the packet had arrived safely; if this acknowledgment wasn’t made, the packet was resent. Eventually, when all of the packets had completed the trip, the machine in Marina del Rey pieced them together and displayed the message they contained.
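To make this delivery routine concrete, here is a minimal Python sketch of the process just described. The topology, loss rate, and node names are illustrative inventions, not details of the 1977 demonstration: each gateway reads a packet's destination address, forwards it one stop, and the sender resends until an acknowledgment comes back.

```python
import random

# Illustrative topology only: each gateway maps a destination address
# to the next stop, the way an envelope is passed between post offices.
NEXT_HOP = {
    "van":        {"marina_del_rey": "menlo_park"},
    "menlo_park": {"marina_del_rey": "cambridge"},
    "cambridge":  {"marina_del_rey": "marina_del_rey"},
}

def attempt_delivery(packet):
    """Forward the packet gateway by gateway until it reaches its address."""
    node = packet["src"]
    while node != packet["dst"]:
        node = NEXT_HOP[node][packet["dst"]]
        if random.random() < 0.3:        # simulate loss somewhere en route
            return False                 # no acknowledgment will come back
    return True                          # destination acknowledges receipt

def send_reliably(packet):
    """Resend whenever the acknowledgment fails to arrive."""
    resends = 0
    while not attempt_delivery(packet):
        resends += 1
    return resends

packet = {"src": "van", "dst": "marina_del_rey", "data": "hello"}
print(f"delivered after {send_reliably(packet)} resend(s)")
```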
What was in the packets? What did the message say? Nobody remembers. It doesn’t matter. What matters isn’t what was said but how. The packets had traveled nearly a hundred thousand miles in about two seconds. They had crossed multiple networks and multiple mediums—radio, satellite, fixed-line—while arriving at their destination completely intact. Computers from across the world had talked to one another, and heard each other perfectly, speaking the new universal language of the internet.
Birth of a Network
The internet is fundamentally a language—a set of rules for how computers should communicate. These rules have to strike a very delicate balance. On the one hand, they have to be strict enough to ensure the reliable transmission of data. On the other, they have to be loose enough to accommodate all the different ways that data might be transmitted. Together, these qualities ensure that data can not only go anywhere, but also get there in one piece.
Think about water: it can be vapor, liquid, or ice, but its chemical composition remains the same. This flexibility is a feature of our natural universe. The language of the internet instills a similar flexibility into our digital universe, turning data into something that can flow across any device, network, and medium—which is the reason a smartphone in São Paulo can download a song from a server in Singapore.
That day in 1977 offered the first real evidence that this language could work at scale. There had been earlier experiments, but never one of such complexity. Pulling it off—and getting to the point where it could even be attempted—took a colossal, and collective, effort. The internet wasn’t invented by a lone genius tinkering in a garage. Rather, it involved thousands of individuals engaged in a decades-long act of co-creation. It took collaboration, cross-pollination, and the slow, accretive work of building on earlier breakthroughs to generate new ones. It also took a lot of public money.
Most of the innovation on which Silicon Valley depends comes from government-funded research, for the simple reason that the public sector can afford to take risks that the private sector can’t. It’s precisely the insulation from market forces that enables the government to finance the long-term scientific labor that ends up producing many of the most profitable inventions. This is particularly true of the internet.
The internet was such an unlikely idea that only decades of public funding and planning could bring it into existence. Not only did the basic technology have to be invented, but the infrastructure had to be built, specialists had to be trained, and contractors had to be staffed, funded, and, in some cases, directly spun off from government agencies. The internet is sometimes compared to the interstate highway system, another major public project. But as the activist Nathan Newman points out, the comparison only makes sense if the government “had first imagined the possibility of cars, subsidized the invention of the auto industry, funded the technology of concrete and tar, and built the whole initial system.”
The Cold War provided the pretext for this ambitious undertaking. Nothing loosened the purse strings of American politicians quite like the fear of falling behind the Soviet Union. This fear spiked sharply in 1957, when the Soviets put the first satellite into space. The Sputnik launch produced a genuine sense of crisis in the American establishment, and led to a substantial increase in federal research funding.
One consequence was the creation of the Advanced Research Projects Agency (ARPA), which would later change its name to the Defense Advanced Research Projects Agency (DARPA). DARPA became the R&D arm of the Defense Department.
In the early 1960s, DARPA began investing heavily in computing, installing mainframes at universities and the other research sites where its community of contractors worked. But even for an agency as generously funded as DARPA, this spending spree wasn’t sustainable. In those days, a computer cost millions of dollars. So DARPA came up with a way to share its computing resources more efficiently among its contractors: it built a network.
This network was ARPANET, and it laid the foundation for the internet. First connected in 1969, ARPANET linked computers through an experimental technology called packet switching, which involved breaking messages down into small chunks, routing them through a maze of switches, and reassembling them on the other end. Today, this is the mechanism that moves data across the internet. At the time, the telecom industry considered it absurdly impractical. Years earlier, the Air Force had tried to persuade AT&T to build such a network, ultimately without success. DARPA even offered ARPANET to AT&T after it was up and running. The agency would’ve preferred to buy time on the network instead of managing it. Given the chance to acquire the most sophisticated computer network in the world, however, AT&T refused. The executives simply couldn’t see the money in it.
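As a toy illustration of that mechanism (the message and chunk size here are arbitrary), a few lines of Python can mimic what packet switching does: split a message into numbered chunks, let them arrive in any order, and rebuild the original from the sequence numbers.

```python
import random

def packetize(message, size=8):
    """Break a message into small, numbered chunks."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Sequence numbers restore the original order on the other end."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "Small chunks, routed independently, rebuilt on arrival."
packets = packetize(message)
random.shuffle(packets)   # a maze of switches implies arbitrary arrival order
assert reassemble(packets) == message
```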
Luckily, as it turned out. Under public ownership, ARPANET flourished. Government control gave the network two major advantages. The first was money. DARPA could pour cash into the system without having to worry about profitability. The agency commissioned research from the country’s most talented computer scientists at a scale that would’ve been suicidal for a private corporation. And, just as crucially, DARPA enforced an open-source ethic that encouraged collaboration and experimentation. The contractors who contributed to ARPANET had to share the source code of their creations. This catalyzed scientific creativity, as researchers from a range of different institutions could refine and expand on each other’s work without living in fear of intellectual property law.
The most important innovation that resulted was the internet protocol, which first emerged in the mid-1970s. Initially, the protocol was a proposal for how computers should communicate. The proposal was subsequently implemented in software and refined through multiple experiments. This made it possible for ARPANET to evolve into the internet, by providing a common language that let very different networks talk to one another. The language would be open and non-proprietary—a free and universal medium, rather than a patchwork of incompatible commercial dialects.
Under private ownership, such a language could never have been created. Not only would the expense have been too great, but the very idea of a free and universal medium cut against the grain of the commercial impulse to lock users into a proprietary ecosystem. It was the absence of the profit motive and the presence of public management that made the invention of the internet possible. Yet the internet would also reflect the institutional imperatives of the particular arm of the government that oversaw its creation: the military.
The Mainframe and the Battlefield
The internet was created to win wars, although not right away. As a “blue sky” research outfit, DARPA had wide latitude in picking its projects, but it still had to develop technologies that might someday be useful for military ends. The internet was no exception. Its champions within the agency made the case that the internet was worth pursuing because it could give American forces an edge. This edge would come from taking computing power out of the lab and into the field.
Picture a Jeep in the jungles of Zaire, or a B-52 miles above North Vietnam. Then imagine these as nodes in a wireless network linked to another network of powerful computers thousands of miles away. This is the dream of a networked military using computing to project American power. This is the dream that produced the internet.
ARPANET had been a major breakthrough. But it had a limitation: it wasn’t mobile. The computers on ARPANET were gigantic by today’s standards. That might work for DARPA researchers, who could sit at a terminal in Cambridge or Menlo Park—but it did little for soldiers deployed deep in enemy territory. For ARPANET to be useful to forces in the field, it had to be accessible anywhere in the world.
This required doing two things. The first was building a wireless network that could relay packets of data among the widely dispersed cogs of the US war machine by radio or satellite. The second was connecting those wireless networks to ARPANET, so that multimillion-dollar mainframes could serve soldiers in combat. “Internetworking,” the scientists called it.
Internetworking was hard. Getting computers to talk to one another—networking—had been challenging enough. But getting networks to talk to one another—inter-networking—posed a whole new set of difficulties, because the networks spoke different idioms. Trying to move data from one to another was like writing a letter in Mandarin to someone who only knows Hungarian and hoping to be understood.
In response, the architects of the internet developed a kind of digital Esperanto: a common language that enabled data to travel across any network. In 1974, two researchers named Robert Kahn and Vinton Cerf published an early blueprint. Drawing on conversations happening throughout the international networking community, they sketched a design for “a simple but very powerful and flexible protocol”: a universal set of rules for how computers should communicate.
These rules would make it possible to weave together a network of networks so versatile and so robust that a soldier in the field could connect to a mainframe halfway across the world. Indeed, the experiments that DARPA conducted to test the new idiom of the internet were designed with exactly this scenario in mind. The first major experiment took place in 1976, linking two networks. The second took place in 1977, featuring the van driving down the Bay Area freeway, flinging packets across the Atlantic, linking three networks.
The design of these experiments reflected a specific military scenario. “What we were simulating was a situation where somebody was in a mobile unit in the field, let’s say in Europe, in the middle of some kind of action,” Cerf later recalled. The soldiers would be trying to access “some strategic computing asset that was in the United States,” possibly while engaging or evading the enemy. The goal, in other words, was to bring the mainframe to the battlefield. The van played the role of the mobile unit. The Bay Area freeway was the battlefield. And it worked: the smaller computer in the mobile unit established a link to a bigger computer many miles away.
The protocol developed by Cerf and Kahn had fulfilled its promise. Eventually, it would evolve into a whole suite of protocols called TCP/IP. Today, TCP/IP is the lingua franca of the internet. It is no exaggeration to say that TCP/IP is the internet: without its rules, the world’s networks would be a Babel of mutually unintelligible tongues.
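For a sense of how ubiquitous those rules are now, here is a minimal sketch using Python's standard socket module; the loopback address and port number are arbitrary choices for illustration. Two programs exchange bytes over TCP/IP without either one knowing or caring what physical networks lie between them.

```python
import socket
import threading

# Bind a TCP/IP listener on the local machine; the port is arbitrary.
server = socket.create_server(("127.0.0.1", 9099))

def serve_once():
    conn, _ = server.accept()
    with conn:
        data = conn.recv(1024)        # TCP hands over the bytes intact, in order
        conn.sendall(b"ACK: " + data)

threading.Thread(target=serve_once, daemon=True).start()

# The client speaks the same universal rules; whether the medium is radio,
# satellite, or fixed line, the protocol does not change.
with socket.create_connection(("127.0.0.1", 9099)) as client:
    client.sendall(b"hello from another network")
    print(client.recv(1024).decode())  # -> ACK: hello from another network

server.close()
```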
This universality was created with a particular end in mind. The internet was designed to run anywhere because the US military is everywhere. Today, it maintains around eight hundred bases in some eighty-five countries around the world. It has hundreds of ships, thousands of planes, and thousands of tanks. The reason the internet can work across any device, network, and medium is because it needed to be as ubiquitous as the military that financed its creation. It needed to be able to knit together a heterogeneous collection of people and machines into a single network of networks, so that a soldier in a Jeep or a pilot in a B-52 could use a computer thousands of miles away.
From Protocol to Place
The internet may have been created as a weapon of empire, but the empire wasn’t interested—at least not in the original idea. When the US military did embrace TCP/IP, it wasn’t because it wanted to link Jeeps to mainframes. Rather, it had a more mundane requirement: it needed to get the Pentagon’s growing assortment of fixed-line networks to start talking to one another.
TCP/IP was the answer. Indeed, when it came to reliable internetwork communication, the protocol “was really the only game in town,” Kahn later remembered. In 1983, ARPANET switched over to TCP/IP, which let it interconnect with other military and experimental networks. This new system became known as the internet, with ARPANET at its center. The internet was born as a protocol; it would now become a place, one increasingly populated by civilian researchers—trading emails, accessing high-performance computers, collaborating, arguing. While the government created the internet, it was users who made it useful, who made it a place worth visiting.
The internet’s usefulness soon led scientists from outside DARPA’s select circle of contractors to demand access. In response, the National Science Foundation (NSF)—a US government agency tasked with supporting basic research—undertook a series of initiatives aimed at bringing more people online. These culminated in NSFNET, a program that oversaw the creation of a new national network. This network, which first became operational in 1986, would be the new “backbone” of the internet, an assemblage of cables and computers forming the internet’s main artery. It resembled a river: data flowed from one end to another, feeding tributaries, which themselves branched into smaller and smaller streams. These streams served individual users, people who themselves never touched the backbone directly. If they sent data to another part of the internet, it would travel up the chain of tributaries to the backbone, then down another chain, until it reached the stream that served the recipient.
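The river model can be stated precisely. In the sketch below, the node names and tree shape are invented for illustration: every stream knows only the larger network it feeds into, so a message climbs the tributaries to the backbone, then descends another chain to its recipient.

```python
# Invented hierarchy: each network knows only the larger one it feeds into.
PARENT = {
    "lab_a": "campus_a", "campus_a": "regional_west", "regional_west": "backbone",
    "lab_b": "campus_b", "campus_b": "regional_east", "regional_east": "backbone",
}

def chain_to_backbone(node):
    """Follow the tributaries upstream until the backbone is reached."""
    chain = [node]
    while chain[-1] != "backbone":
        chain.append(PARENT[chain[-1]])
    return chain

def route(sender, recipient):
    """Up one chain of tributaries to the backbone, then down another."""
    up = chain_to_backbone(sender)
    down = chain_to_backbone(recipient)
    return up + down[::-1][1:]        # descend the other chain past the shared backbone

print(" -> ".join(route("lab_a", "lab_b")))
# lab_a -> campus_a -> regional_west -> backbone -> regional_east -> campus_b -> lab_b
```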
In this model, the river is useless without the tributaries that extend its reach. This is why the NSF, to ensure the broadest possible connectivity, subsidized a number of regional networks that linked universities and other participating institutions to the NSFNET backbone. All this wasn’t cheap, but it worked. Scholars Jay P. Kesan and Rajiv C. Shah have estimated that the subsidies to the regional networks, together with the cost of running the NSFNET backbone, came to approximately $160 million. Other public sources, such as state governments and state-supported universities, likely contributed more than $1.6 billion to the development of the internet during this period.
In the 1970s, the government invented the universal language of the internet. In the 1980s, it made this language the basis of a cutting-edge communications system and spent heavily on plugging more people into it. Thanks to this avalanche of public cash, the internet became widely available to American researchers by the end of the 1980s. Then, in the following decade, this internet abruptly died, and a different one appeared—one we would recognize today. The 1990s is when the internet became a business. The government ceded the pipes to a handful of corporations while asking nothing in return.
Open for Business
Privatization didn’t come out of nowhere. It was the plan all along. The government reports that guided the creation of NSFNET called for the backbone to eventually pass into private hands, but the internet’s surprising popularity pushed the NSF to make the transition sooner than expected.
By the early 1990s, the internet was becoming a victim of its own success. Congestion plagued the backbone, a...
