PART I
Mainframe Culture
1
Charles Babbage and the Politics of Computer Memory
As we have seen, the dialectic of technological determinism is both enabling and disempowering. It clears space to imagine wild visions of the future. But it closes off our ability to question our present options, since the future is presumed to be the inevitable result of inexorable technological progress. And it impoverishes our understanding of the past, robbing it of any sense of contingency. What happened had to happen, since it did happen.
This bargain has been good enough for technotopians from Edward Bellamy to Isaac Asimov (author of the Foundation series, which introduced the fictional predictive science of "psychohistory") to Gene Roddenberry (creator of Star Trek) to Louis Rossetto (founder of Wired magazine). But for some thinkers, the trade-off isn't worth it. Looking more closely at the history of computing, these skeptics notice the odd turns and dead ends that give the lie to the grand narrative of technological determinism.
This chapter will look at the struggle between the determinist mainstream and the critical margins to define the historical memory of computing. It will focus on the contested legacy of the so-called "father of computing," Charles Babbage. Before we get to Babbage, though, we'll need a little background. Let's start with a question: what is a "computer"?
The answer depends on how you define "computer." The term was originally used to label not machines, but people. For most of the past three centuries, a computer meant "one who computes," according to the Oxford English Dictionary, which traces this usage as far back as 1646.1 Scientists engaged in large-scale projects involving many calculations, such as the computation of navigation tables, would hire rooms full of human "computers"—usually women—to crunch their numbers.2 It was not until the 1940s, when new kinds of flexible calculating machines began to replace people for these large-scale projects, that the connotations of the word began to shift, as engineers labeled their new devices "computers." Even so, through the 1940s and 1950s, popular discourse more often referred to the machines as "giant brains," "electronic brains," or "mechanical brains." It wasn't until the 1960s that "computer" became standard usage. While a term such as "giant brains" may strike us today as a rather garish anthropomorphism, note that the seemingly more neutral term "computer" itself has its origins in anthropomorphism.
One version of the history of computing, then, is the story of computing as a process for the large-scale production and organization of information—a process performed sometimes by people, sometimes by machines. A second, more familiar version is the story of the computer as a mechanical calculating device. This chronology takes us from the abacus and other counting devices of the ancient world to the mechanical adding machines first developed in the seventeenth century, which used gears and levers to perform arithmetic. These two strands of computing history—computer as large-scale information processor, and computer as mechanical device—first came together in the work of a nineteenth-century British inventor named Charles Babbage.
Babbage's Engines
Babbage began his first project, the "difference engine," in the 1820s. A massive, steam-powered calculating machine and printer, it was designed to mechanize the process of computation and table-making, just as other inventions of the Industrial Revolution were mechanizing other labor processes. The British government invested a total of 17,000 pounds in his research; Babbage is estimated to have spent an equal amount of his own money. In 1833 Babbage produced a small-scale prototype that clearly demonstrated that the completed machine could work. But before Babbage could finish his machine, he was distracted by a new, more complex project. He never completed his difference engine.
Babbage's new idea was an "analytical engine." Rather than being hardwired to perform specific tasks, it was designed to be "a machine of the most general nature." Inspired by the Jacquard loom, Babbage came up with the idea of using a series of punched cards to input information into his machine. The cards would contain not only the raw numbers to be processed, but also logically coded instructions on how to process them. Input numbers could be held in the "store," a series of 1000 registers, each capable of storing one 50-digit number. Calculations on input numbers, or on numbers taken from the store, would be performed by the "mill." The results would be displayed by the "output," an automated typesetter.
To contemporary computer users, Babbage's design sounds strikingly familiar. One can see in the punch-card "instructions" the equivalent of a contemporary computer program, in the "store" an analogue to computer memory, and in the "mill" a parallel to a central processing unit. It's worth being careful, however, not to take these parallels too far. As one historian of computing warns,
at first sight Babbage's ideas are close to ours and one can almost imagine that he had invented the modern computer. However, it is too easy to read into his writings a modern interpretation. We assume that his thought ran along the same lines as ours would if we were faced with similar problems. In fact, it may have been running along quite different lines, lines that are hard for us to follow because we know the modern solutions.3
Similarly, R. Anthony Hyman explains,
discussions of the Analytical Engines pose semantic problems because of the many features they have in common with modern computers. These abstract features of logical organization are almost impossible to discuss without using concepts which have been developed in modern computing. Such concepts as the stored program, programming, or array processing carry many implications to a modern computer person which may or may not in some measure have been clear to Babbage.4
Babbage worked on his analytical engine from 1834 until his death in 1871, but never completed the machine. He died a bitter, unappreciated man.
Babbage is a problematic figure in the historical memory of computing culture. He's been dubbed the "father of computing," or sometimes "the grandfather of computing." Today a leading American center for the study of computing history is named the Charles Babbage Institute, and a national software chain calls itself Babbage's. In England he's hailed as one of the country's great inventors. The bicentennial of his 1791 birth was celebrated with museum exhibitions, commemorative postage stamps, and, as we shall see, a successful attempt to build a working difference engine.5 In a sense, Babbage invented the computer, since he developed a set of ideas that would ultimately see fruition in the twentieth century as what we now know as the computer.
But Babbage never successfully built his own machines, and no inventors followed in his footsteps. In fact, his work, now so widely hailed, was rediscovered too late to influence the development of the computer in the twentieth century. His writing was still obscure when the first digital computers were built in the 1940s. Most pioneers of the era, such as J. Presper Eckert and John Mauchly, have confirmed that they'd never heard of Babbage when they worked on ENIAC and UNIVAC, the machines widely considered to be the first true computers.6
Harvard's Howard Aiken, who developed another early computer, the Harvard Mark I, was one of the few scientists of the time who did know something of Babbage's work. As a result, whiggish historians of technology have been quick to detect a direct lineage. The introduction to one collection on computing in 1953 claimed, "Babbage's ideas have only been properly appreciated in the last ten years, but we now realize that he understood clearly all the fundamental principles which are embodied in modern digital computers."7 More recent scholarship, however, has complicated this story. I. Bernard Cohen's detailed study of the relationship between the work of Babbage and Aiken concludes that "Babbage did not play any seminal role in the development of Aiken's own ideas about machine architecture."8 In fact, the Mark I "suffered a severe limitation which might have been avoided if Aiken had actually known Babbage's work more thoroughly."9 As Stan Augarten concludes,
Unlike most of the early computer pioneers, Aiken had heard of Babbage, and his proposal contained a brief, if rather inaccurate, summary of the Englishman's work. Aiken saw himself as Babbage's spiritual heir, yet his machines, the Automatic Sequence-Controlled Calculator (ASCC), or Harvard Mark I, had little in common with the Analytical Engine.10
So the figure of Babbage raises nagging questions for the technological determinism that operates as computer culture's common-sense theory of history. The logic of technological determinism presumes that if a technology can be developed, inevitably it will be—there's no sense in pausing to weigh a technology's promised benefits against its possible consequences, because technological momentum will keep pushing history forward, regardless. Raymond Williams identifies this perspective in Television: Technology and Cultural Form:
[T]echnological determinism … is an immensely powerful and now largely orthodox view of the nature of social change. New technologies are discovered, by an essentially internal process of research and development, which "created the modern world." The effects of these technologies, whether direct or indirect, foreseen or unforeseen, are as it were the rest of history. The steam engine, the automobile, television, the atomic bomb, have made modern man and the modern condition.11
Technological determinism is a kind of essentialism, likening technological development to an unstoppable natural force. Cyber-pundit John Perry Barlow writes in his influential "A Declaration of the Independence of Cyberspace," "Cyberspace … is an act of nature and it grows itself through our collective actions."12 Likewise, Nicholas Negroponte warns in Being Digital, "like a force of nature, the digital age cannot be denied or stopped."13 But if technology is an unstoppable force, why didn't Babbage successfully build the first computer? Why did it take another century for Babbage's ideas to be realized? What happened to that force of nature for 100 years?
The technological determinist response is the converse of the assumption that "if it can be built, it will be built": "if it wasn't built, it must be because it couldn't have been built." For decades, writers looking back on Babbage's work concluded that while Babbage's engineering plans may have been sound, the Britain of his day just didn't have the technical capability to bring his design to fruition. In that era, the argument goes, rods, gears, and bearings couldn't be made to the precise specifications Babbage's design demanded. And so, Babbage's machines were doomed to failure: they simply couldn't be built with the existing technology.14
But more recent work has called this argument into question. In 1991 a team of historians succeeded in constructing a working difference engine from Babbage's designs, using parts no more precisely tooled than those available to Babbage.15 As the project's leader, Doron Swade, puts it,
In every history book which cites the reason Babbage failed, they say it was due to limitations in the technology of the time.… Our findings will oblige historians to look at more subtle reasons for the failure of the project, like the way governments are advised and the relationship between applied and pure science.16
Swade's The Difference Engine examines these subtle reasons, tracing Babbage's struggles with expensive machinists, skeptical government officials, and a conservative scientific establishment.
Charles Babbage, Icon of Contingency
Babbage's story, then, is a prime example of the contingency of history. Under slightly different circumstances, his invention could very well have been realized. As a result, for contemporary critics of the rhetoric of technological determinism, Babbage is a charged and compelling historical figure. When technological determinists point to examples such as the failures of Britain's Luddites to argue, "you can't stop history," critics of technological determinism can point to Babbage to demonstrate, "history doesn't always turn out the way you expected it to."
This critique is most fully elaborated in a science fiction genre known as "steampunk." A play on the label "cyberpunk," steampunk is a similarly revisionist science fiction genre. The difference is that it is set not in the future, but in the Steam Age. Steampunk imagines an alternate nineteenth century in which the Industrial Revolution and the Information Age happened simultaneously. The most influential steampunk novel is 1991's The Difference Engine, cowritten by William Gibson and Bruce Sterling, two of cyberpunk's most prominent novelists.
The Difference Engine proposes an alternate history in which Babbage succeeds in building both his difference engine and his analytical engine. The result is a steam-powered Information Age, in which nineteenth-century "clackers" work on massive steam-driven computing machines. The effect of juxtaposing these two eras of technological change is to put each in relief, satirizing contemporary culture by garbing it in nineteenth-century gear while forcing the reader to rethink clichés about the past by investing it with the urgency of the present. And by positing an alternative history in which Babbage did succeed, Gibson and Sterling force us to think of our own era as only one of many possible outcomes of history—and perhaps not the optimal one at that. As Gibson put it in one interview, "One of the things that Difference Engine does is to disagree rather violently with the Whig concept of history, which is that history is a process that leads to us, the crown of creation.… [I]t's about contingency leading to us."17 Likewise, literary critic Herbert Sussman writes,
The primary rhetorical effect of alternative history lies in the shock of defamiliarization. As much as contemporary theory has quite properly emphasized that we employ our own political agendas to organize the past, most readers … tend to naturalize past events, assuming that since events did happen they somehow had to happen, that social change and, given the emphasis of this novel, technological change are determined. One major effect of alternate history is to dramatize that what we accept as inevitable is only contingent, only one among an infinite number of possibilities, of forking paths.18
But if Gibson and Sterling successfully debunk technological determinism on one level, in another sense they nonetheless seem to succumb to it, demonstrating the tenacity of the logic of technological determinism. For, having granted Babbage the success of his difference engine, they proceed to extrapolate a nineteenth century vastl...