Electric Dreams

Computers in American Culture

Ted Friedman

About This Book

Electric Dreams turns to the past to trace the cultural history of computers. Ted Friedman charts the struggles to define the meanings of these powerful machines over more than a century, from the failure of Charles Babbage’s “difference engine” in the nineteenth century to contemporary struggles over file swapping, open source software, and the future of online journalism. To reveal the hopes and fears inspired by computers, Electric Dreams examines a wide range of texts, including films, advertisements, novels, magazines, computer games, blogs, and even operating systems.

Electric Dreams argues that the debates over computers are critically important because they are how Americans talk about the future. In a society that in so many ways has given up on imagining anything better than multinational capitalism, cyberculture offers room to dream of different kinds of tomorrow.

Information

Publisher: NYU Press
Year: 2005
ISBN: 9780814728420

PART I

Mainframe Culture

1

Charles Babbage and the Politics of Computer Memory

As we have seen, the dialectic of technological determinism is both enabling and disempowering. It clears space to imagine wild visions of the future. But it closes off our ability to question our present options, since the future is presumed to be the inevitable result of inexorable technological progress. And it impoverishes our understanding of the past, robbing it of any sense of contingency. What happened had to happen, since it did happen.
This bargain has been good enough for technotopians from Edward Bellamy to Isaac Asimov (author of the Foundation series, which introduced the fictional predictive science of “psychohistory”) to Gene Roddenberry (creator of Star Trek) to Louis Rossetto (founder of Wired magazine). But for some thinkers, the trade-off isn’t worth it. Looking more closely at the history of computing, these skeptics notice the odd turns and dead ends that give the lie to the grand narrative of technological determinism.
This chapter will look at the struggle between the determinist mainstream and the critical margins to define the historical memory of computing. It will focus on the contested legacy of the so-called “father of computing,” Charles Babbage. Before we get to Babbage, though, we’ll need a little background. Let’s start with a question: what is a “computer”?
The answer depends on how you define “computer.” The term was originally used to label not machines, but people. For most of the past three centuries, a computer meant “one who computes,” according to the Oxford English Dictionary, which traces this usage as far back as 1646.1 Scientists engaged in large-scale projects involving many calculations, such as the computation of navigation tables, would hire rooms full of human “computers”—usually women—to crunch their numbers.2 It was not until the 1940s, when new kinds of flexible calculating machines began to replace people for these large-scale projects, that the connotations of the word began to shift, as engineers labeled their new devices “computers.” Even so, through the 1940s and 1950s, popular discourse more often referred to the machines as “giant brains,” “electronic brains,” or “mechanical brains.” It wasn’t until the 1960s that “computer” became standard usage. While a term such as “giant brains” may strike us today as a rather garish anthropomorphism, note that the seemingly more neutral term “computer” itself has its origins in anthropomorphism.
One version of the history of computing, then, is the story of computing as a process for the large-scale production and organization of information—a process performed sometimes by people, sometimes by machines. A second, more familiar version is the story of the computer as a mechanical calculating device. This chronology takes us from the abacus and other counting devices of the ancient world to the mechanical adding machines first developed in the seventeenth century, which used gears and levers to perform arithmetic. These two strands of computing history—computer as large-scale information processor, and computer as mechanical device—first came together in the work of a nineteenth-century British inventor named Charles Babbage.

Babbage’s Engines

Babbage began his first project, the “difference engine,” in the 1820s. A massive, steam-powered calculating machine and printer, it was designed to mechanize the process of computation and table-making, just as other inventions of the Industrial Revolution were mechanizing other labor processes. The British government invested a total of 17,000 pounds in his research; Babbage is estimated to have spent an equal amount of his own money. In 1833 Babbage produced a small-scale prototype that clearly demonstrated that the completed machine could work. But before Babbage could finish his machine, he was distracted by a new, more complex project. He never completed his difference engine.
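A word on the mechanism, since the passage takes it for granted: the difference engine was organized around the method of finite differences, which reduces the tabulation of any polynomial to repeated addition, an operation well suited to gears and ratchets. Here is a minimal sketch of the principle; the polynomial and the table length are invented for illustration.

```python
# Sketch of the method of finite differences, the principle behind the
# difference engine: seed the machine with a polynomial's initial
# differences, and every further table entry needs only addition.

def f(x):
    # Example polynomial, chosen purely for illustration.
    return 2 * x**2 + 3 * x + 5

# For a degree-2 polynomial the second difference is constant, so three
# seed values fully determine the rest of the table.
diffs = [f(0), f(1) - f(0), (f(2) - f(1)) - (f(1) - f(0))]

table = []
for _ in range(10):
    table.append(diffs[0])
    # One "turn of the crank": cascade each difference into the next.
    diffs[0] += diffs[1]
    diffs[1] += diffs[2]

print(table)                      # [5, 10, 19, 32, 49, 70, 95, 124, 157, 194]
print([f(x) for x in range(10)])  # same values, but needs multiplication
```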
Babbage’s new idea was an “analytical engine.” Rather than being hard-wired to perform specific tasks, it was designed to be “a machine of the most general nature.” Inspired by the Jacquard loom, Babbage came up with the idea of using a series of punched cards to input information into his machine. The cards would contain not only the raw numbers to be processed, but also logically coded instructions on how to process them. Input numbers could be held in the “store,” a series of 1,000 registers, each capable of storing one 50-digit number. Calculations of input numbers, or numbers taken from the store, would be performed by the “mill.” The results would be displayed by the “output,” an automated typesetter.
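To make that division of labor concrete, here is a toy model of the organization just described: a store of registers, a mill that does the arithmetic, and a deck of cards carrying both numbers and coded operations. The (op, a, b, dest) card format and the tiny instruction set are modern inventions for illustration, not Babbage’s actual card encoding.

```python
# Toy model of the analytical engine's organization: a "store" of
# registers, a "mill" that performs the arithmetic, and a deck of
# punched "cards" carrying both data and coded instructions.
# The (op, a, b, dest) card format is a modern invention for
# illustration, not Babbage's actual card encoding.

store = [0] * 1000  # Babbage planned 1,000 registers of 50-digit numbers


def mill(op, x, y):
    """The mill: the engine's arithmetic unit."""
    return {"ADD": x + y, "SUB": x - y, "MUL": x * y}[op]


def run(cards):
    output = []  # stands in for the automated typesetter
    for op, a, b, dest in cards:
        if op == "SET":        # a number card: load a constant into the store
            store[dest] = a
        elif op == "PRINT":    # send a store register to the output
            output.append(store[a])
        else:                  # an operation card: mill two store registers
            store[dest] = mill(op, store[a], store[b])
    return output


# A small "program": compute (7 + 5) * 3 and print the result.
deck = [
    ("SET", 7, None, 0),
    ("SET", 5, None, 1),
    ("SET", 3, None, 2),
    ("ADD", 0, 1, 3),
    ("MUL", 3, 2, 4),
    ("PRINT", 4, None, None),
]
print(run(deck))  # [36]
```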
To contemporary computer users, Babbage’s design sounds strikingly familiar. One can see in the punch-card “instructions” the equivalent of a contemporary computer program, in the “store” an analogue to computer memory, and in the “mill” a parallel to a central processing unit. It’s worth being careful, however, not to take these parallels too far. As one historian of computing warns,
at first sight Babbage’s ideas are close to ours and one can almost imagine that he had invented the modern computer. However, it is too easy to read into his writings a modern interpretation. We assume that his thought ran along the same lines as ours would if we were faced with similar problems. In fact, it may have been running along quite different lines, lines that are hard for us to follow because we know the modern solutions.3
Similarly, R. Anthony Hyman explains,
discussions of the Analytical Engines pose semantic problems because of the many features they have in common with modern computers. These abstract features of logical organization are almost impossible to discuss without using concepts which have been developed in modern computing. Such concepts as the stored program, programming, or array processing carry many implications to a modern computer person which may or may not in some measure have been clear to Babbage.4
Babbage worked on his Analytical Engine from 1834 to his death in 1871, but never completed the machine. He died a bitter, unappreciated man.
Babbage is a problematic figure in the historical memory of computing culture. He’s been dubbed the “father of computing,” or sometimes “the grandfather of computing.” Today the leading United States archive for the history of computing is named the Charles Babbage Institute, and a national software chain calls itself Babbage’s. In England he’s hailed as one of the country’s great inventors. The bicentennial of his 1791 birth was celebrated with museum exhibitions, commemorative postage stamps, and, as we shall see, a successful attempt to build a working difference engine.5 In a sense, Babbage invented the computer, since he developed the set of ideas that would ultimately come to fruition in the twentieth century as the machine we now know as the computer.
But Babbage never successfully built his own machines, and no inventors followed in his footsteps. In fact, his work, now so widely hailed, was rediscovered too late to influence the development of the computer in the twentieth century. His writing was still obscure when the first digital computers were built in the 1940s. Most pioneers of the era, such as J. Presper Eckert and John Mauchly, have confirmed that they’d never heard of Babbage when they worked on ENIAC and UNIVAC, the machines widely considered to be the first true computers.6
Harvard’s Howard Aiken, who developed another early computer, the Harvard Mark I, was one of the few scientists of the time who did know something of Babbage’s work. As a result, whiggish historians of technology have been quick to detect a direct lineage. The introduction to one collection on computing in 1953 claimed, “Babbage’s ideas have only been properly appreciated in the last ten years, but we now realize that he understood clearly all the fundamental principles which are embodied in modern digital computers.”7 More recent scholarship, however, has complicated this story. I. Bernard Cohen’s detailed study of the relationship between the work of Babbage and Aiken concludes that “Babbage did not play any seminal role in the development of Aiken’s own ideas about machine architecture.”8 In fact, the Mark I “suffered a severe limitation which might have been avoided if Aiken had actually known Babbage’s work more thoroughly.”9 As Stan Augarten concludes,
Unlike most of the early computer pioneers, Aiken had heard of Babbage, and his proposal contained a brief, if rather inaccurate, summary of the Englishman’s work. Aiken saw himself as Babbage’s spiritual heir, yet his machine, the Automatic Sequence-Controlled Calculator (ASCC), or Harvard Mark I, had little in common with the Analytical Engine.10
So the figure of Babbage raises nagging questions for the technological determinism that operates as computer culture’s common-sense theory of history. The logic of technological determinism presumes that if a technology can be developed, inevitably it will be—there’s no sense in pausing to weigh a technology’s promised benefits against its possible consequences, because technological momentum will keep pushing history forward, regardless. Raymond Williams identifies this perspective in Television: Technology and Cultural Form:
[T]echnological determinism … is an immensely powerful and now largely orthodox view of the nature of social change. New technologies are discovered, by an essentially internal process of research and development, which “created the modern world.” The effects of these technologies, whether direct or indirect, foreseen or unforeseen, are as it were the rest of history. The steam engine, the automobile, television, the atomic bomb, have made modern man and the modern condition.11
Technological determinism is a kind of essentialism, likening technological development to an unstoppable natural force. Cyber-pundit John Perry Barlow writes in his influential “A Declaration of the Independence of Cyberspace,” “Cyberspace … is an act of nature and it grows itself through our collective actions.”12 Likewise, Nicholas Negroponte warns in Being Digital, “like a force of nature, the digital age cannot be denied or stopped.”13 But if technology is an unstoppable force, why didn’t Babbage successfully build the first computer? Why did it take another century for Babbage’s ideas to be realized? What happened to that force of nature for 100 years?
The technological determinist response is the contrapositive of the assumption that “if it can be built, it will be built”: “if it wasn’t built, it must be because it couldn’t have been built.” For decades, writers looking back on Babbage’s work concluded that while Babbage’s engineering plans may have been sound, the Britain of his day just didn’t have the technical capability to bring his design to fruition. In that era, the argument goes, rods, gears, and bearings couldn’t be made to the precise specifications Babbage’s design demanded. And so, Babbage’s machines were doomed to failure: they simply couldn’t be built with the existing technology.14
But more recent work has called this argument into question. In 1991 a team of historians succeeded in constructing a working difference engine from Babbage’s designs, using parts no more precisely tooled than those available to Babbage.15 As the project’s leader, Doron Swade, puts it,
In every history book which cites the reason Babbage failed, they say it was due to limitations in the technology of the time.… Our findings will oblige historians to look at more subtle reasons for the failure of the project, like the way governments are advised and the relationship between applied and pure science.16
Swade’s The Difference Engine examines these subtle reasons, tracing Babbage’s struggles with expensive machinists, skeptical government officials, and a conservative scientific establishment.

Charles Babbage, Icon of Contingency

Babbage’s story, then, is a prime example of the contingency of history. Under slightly different circumstances, his invention could very well have been realized. As a result, for contemporary critics of the rhetoric of technological determinism, Babbage is a charged and compelling historical figure. When technological determinists point to examples such as the failures of Britain’s Luddites to argue, “you can’t stop history,” critics of technological determinism can point to Babbage to demonstrate, “history doesn’t always turn out the way you expected it to.”
This critique is most fully elaborated in a science fiction genre known as “steampunk.” A play on the label “cyberpunk,” steampunk is a similarly revisionist science fiction genre. The difference is that it is set not in the future, but in the Steam Age. Steampunk imagines an alternate nineteenth century in which the Industrial Revolution and the Information Age happened simultaneously. The most influential steampunk novel is 1991’s The Difference Engine, cowritten by William Gibson and Bruce Sterling, two of cyberpunk’s most prominent novelists.
The Difference Engine proposes an alternate history in which Babbage succeeds in building both his difference engine and analytical engine. The result is a steam-powered Information Age, in which nineteenth-century “clackers” work on massive steam-driven computing machines. The effect of juxtaposing these two eras of technological change is to put each in relief, satirizing contemporary culture by garbing it in nineteenth-century gear while forcing the reader to rethink clichés about the past by investing it with the urgency of the present. And by positing an alternative history in which Babbage did succeed, Gibson and Sterling force us to think of our own era as only one of many possible outcomes of history—and perhaps not the optimal one at that. As Gibson put it in one interview, “One of the things that Difference Engine does is to disagree rather violently with the Whig concept of history, which is that history is a process that leads to us, the crown of creation.… [I]t’s about contingency leading to us.”17 Likewise, literary critic Herbert Sussman writes,
The primary rhetorical effect of alternative history lies in the shock of defamiliarization. As much as contemporary theory has quite properly emphasized that we employ our own political agendas to organize the past, most readers … tend to naturalize past events, assuming that since events did happen they somehow had to happen, that social change and, given the emphasis of this novel, technological change are determined. One major effect of alternate history is to dramatize that what we accept as inevitable is only contingent, only one among an infinite number of possibilities, of forking paths.18
But if Gibson and Sterling successfully debunk technological determinism on one level, in another sense they nonetheless seem to succumb to it, demonstrating the tenacity of the logic of technological determinism. For, having granted Babbage the success of his difference engine, they proceed to extrapolate a nineteenth century vastl...
