PART I THE SEDUCTIONS OF THE INTERFACE
CHAPTER 1 A TALE OF TWO AESTHETICS
As I write these words, I keep shuffling the text on my computer screen. Once I would literally have had to cut and paste. Now I call it cut and paste. Once I would have thought of it as editing. Now with computer software, moving sentences and paragraphs about is just part of writing. This is one reason I now remain much longer at my computer than I used to at my paper writing tablet or typewriter. When I want to write and don't have a computer around, I tend to wait until I do. In fact, I feel that I must wait until I do.
Why is it so hard for me to turn away from the screen? The windows on my computer desktop offer me layers of material to which I have simultaneous access: field notes; previous drafts of this book; a list of ideas not yet elaborated but which I want to include; transcripts of interviews with computer users; and verbatim logs of sessions on computer networks, on bulletin boards, and in virtual communities. When I write at the computer, all of these are present and my thinking space seems somehow enlarged. The dynamic, layered display gives me the comforting sense that I write in conversation with my computer. After years of such encounters, a blank piece of paper can make me feel strangely alone.
There is something else that keeps me at the screen. I feel pressure from a machine that seems itself to be perfect and leaves no one and no other thing but me to blame. It is hard for me to walk away from a not-yet-proofread text on the computer screen. In the electronic writing environment in which making a correction is as simple as striking a delete key, I experience a typographical error not as a mere slip of attention, but as a moral carelessness, for who could be so slovenly as not to take the one or two seconds to make it right? The computer tantalizes me with its holding power: in my case, the promise that if I do it right, it will do it right, and right away.
COMPUTER HOLDING POWER
The computer's holding power is a phenomenon frequently referred to in terms associated with drug addiction. It is striking that the word "user" is associated mainly with computers and drugs. The trouble with that analogy, however, is that it puts the focus on what is external (the drug). I prefer the metaphor of seduction because it emphasizes the relationship between person and machine. Love, passion, infatuation: what we feel for another person teaches us about ourselves. If we explore these feelings, we can learn what we are drawn to, what we are missing, and what we need. The analysis of computational seductions offers similar promise if we drop the cliché of addiction and turn to the forces, or more precisely, the diversity of forces that keep us engrossed in computational media.
What attracts me to the computer are the possibilities of "conversation" among the multiple windows on my screen and the way an instantly responsive machine allays my anxieties about perfection. But other people are drawn by other sirens. Some are captured by virtual worlds that appear to be unsullied by the messiness of the real. Some are enthralled by the sense of mind building mind or merging with the mind of the computer. If one is afraid of intimacy yet afraid of being alone, even a stand-alone (not networked) computer offers an apparent solution. Interactive and reactive, the computer offers the illusion of companionship without the demands of friendship. One can be a loner yet never be alone.
Just as musical instruments can be extensions of the mind's construction of sound, computers can be extensions of the mind's construction of thought. A novelist refers to "my ESP with the machine. The words float out. I share the screen with my words." An architect who uses the computer to design goes further: "I don't see the building in my mind until I start to play with shapes and forms on the machine. It comes to life in the space between my eyes and the screen." Musicians often hear the music in their minds before they play it, experiencing the music from within before they experience it from without. The computer can be similarly experienced as an object on the border between self and not-self.1 Or, in a new variant on the story of Narcissus, people are able to fall in love with the artificial worlds that they have created or that have been built for them by others. People are able to see themselves in the computer. The machine can seem a second self, a metaphor first suggested to me by a thirteen-year-old girl who said, "When you program a computer there is a little piece of your mind, and now it's a little piece of the computer's mind. And now you can see it." An investment counselor in her mid-forties echoes the child's sentiment when she says of her laptop computer: "I love the way it has my whole life on it."
The computer, of course, is not unique as an extension of self. At each point in our lives, we seek to project ourselves into the world. The youngest child will eagerly pick up crayons and modeling clay. We paint, we work, we keep journals, we start companies, we build things that express the diversity of our personal and intellectual sensibilities. Yet the computer offers us new opportunities as a medium that embodies our ideas and expresses our diversity.
In the early years of the computer culture, the most dramatic instances of such projections of self into computers occurred in the esoteric domain of programming. Now, as in the case of the novelist and the architect, it is quite common for people to project themselves into the simulations that play on their screens, into the screen images and their actions. Computer holding power, once closely tied to the seductions of programming, today is tied to the seductions of the interface. When video games were very new, I found that the holding power of their screens often went along with a fantasy of a meeting of minds between the player and the program behind the game. Today, the program has disappeared; one enters the screen world as Alice stepped through the looking glass. In today's game simulations, people experience themselves in a new, often exotic setting. The minds they meet are their own.
Our minds, of course, are very different, one from the other, so it is not surprising that different people make the computer their own in their own way.2 People choose to personalize and customize their computers. And they have very different styles both of using computers and of interpreting their meaning. In this, the computer resembles the psychologist's Rorschach test, whose inkblots suggest many shapes but commit themselves to none. It is up to individuals to make out what the legacy of personality, history, and culture causes them to see. Just as different people take up the computer in different ways, so do different cultures. Indeed, from the very beginning of its mass deployment, computer technology encouraged a variety of cultures in which a wide range of social, artistic, and political values found expression.
For example, by the late 1970s, the computer culture included well-developed "hacker" and "hobbyist" subcultures that could be described in terms of differing computational aesthetics.3 What most distinguished the subcultures' members from one another was not how much they knew, but what they valued in the machine. The hacker subculture was made up of programmer-virtuosos who were interested in taking large, complex computer systems and pushing them to their limits. Hackers could revel in the imperfectly understood. As they programmed, things did not always get clearer, but they became workable, at least for the master hacker with the "right stuff." Hacking offered a certain thrill-seeking, a certain danger. It provided the sense, as one hacker put it, "of walking on the edge of a cliff." He explained further, "You could never really know that your next little 'local fix' wouldn't send the whole system crashing down on you." The hacker style made an art form of navigating the complexity of opaque computer microworlds.
In contrast, the hobbyist subculture, the world of early personal computer owners, had an altogether different computational aesthetic. For hobbyists, the goal was to reduce a machine to its simplest elements in order to understand it as fully as possible. Hobbyists preferred to work close to the computer hardware; they enjoyed the sense of nothing standing between themselves and the "body" of the machine. Hackers enjoyed working on large, complex machines and large, almost-out-of-control projects; hobbyists enjoyed working on small machines and very narrowly defined projects. Hackers enjoyed working on a level where they could ignore "the bare machine"; hobbyists took pleasure in reducing high-level commands to the details of machine code. Many hobbyists used the kind of control they felt able to achieve with their home computers to relieve a sense that they had lost control at work and in political life. In a typical remark about the compensatory pleasures of personal computing, one said, "At work I'm just a cog; at home with my computer I get to see how all of my thinking fits together." For another, "I love the feeling of control when I work in a safe environment of my own creation." In the early days of the personal computer culture, a satisfying understanding of the central processing unit (CPU) of home computers was turned into an ideal for how to understand society; the rules of the community should be transparent to all its members.4
Missing from this description of the computer culture of the late 1970s is the perspective of those who have come to be called "users." A user is involved with the machine in a hands-on way, but is not interested in the technology except as it enables an application. Hackers are the antithesis of users. They are passionately involved in mastery of the machine itself. The hobbyists in their own way were equally enthralled. Those who wanted to use computers for strictly instrumental purposes (to run data for a business analysis, for example) had to either learn how to program the machine or hand their data over to someone who did. Only in the late 1970s and early 1980s did the contemporary notion of "user" appear. It first came up in connection with small personal computers that could be used for writing and financial analysis by means of application programs (such as WordStar and VisiCalc). These were programs that people could use without getting involved with the "guts" of the machine. Although I have introduced the terms hacker, hobbyist, and user to refer to specific people, they are best understood as different modes of relationship that one can have with a computer.
When I got my own personal computer in 1979, I saw the hobbyist and the user modes come together in myself. My first personal computer was an Apple II. It ran Scribble, an early word-processing program. When I used Scribble, I gave commands to the machine: Mark this text, copy that text, paste this text, delete that text. I didn't know and I didn't care how Scribble communicated with the bare machine. I delegated that problem to the program. I was a user. Yet, there was something about working on that Apple II that reminded me of the thrill I had first felt the year before, when a home computer owner I interviewed, a hobbyist, let me work alongside him as he built his computer from a kit and talked about "the pleasure of understanding a complex system down to its simplest level."
My 1979 Apple II computer began its service as my word processor by being stripped naked. Its plastic cover had been removed so that the Apple processor (and associated chips) could be replaced with another, which could run the operating system called CP/M. Thus altered, the Apple II offered itself to me as a potentially transparent technology; that is, it offered the promise that ultimately it could be understood by being reduced to its constituent elements. So even though Scribble gave me the opportunity to relate to the machine as a user, as someone who was only interested in the machine's performance, the Apple II communicated a vision of how one could understand the world.
Computers support different styles and cultures because they can be approached in different ways. The execution of the simplest program can be described on many levels: in terms of electronic events, machine language instructions, high-level language instructions, or through a structured diagram that represents the functioning of the program as a flow through a complex information system. There is no necessary one-to-one relationship between the elements on these different levels of description, a feature of computation which has led philosophers of mind to see the computer's hardware-software interplay as evocative of the irreducible relationship of brain and mind.
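This idea of layered description can be made concrete with a small sketch (in Python, chosen only for illustration; the function name `increment` is my own). The same one-line operation exists simultaneously as a high-level statement and as a sequence of lower-level interpreter instructions, which Python's standard `dis` module will print:

```python
import dis

def increment(x):
    # High-level description: "add one to x."
    return x + 1

# Lower-level description: the bytecode instructions the
# interpreter actually executes for that same line.
dis.dis(increment)
```

Running this prints a disassembly listing; the exact instruction names vary by Python version, which itself illustrates the point that no one-to-one mapping between levels is guaranteed.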
This irreducibility stands behind the diversity of possible styles of relating to the computer. But this natural pluralism on an individual level is in tension with other forces. Changes in both technology and culture encourage certain styles of technology and of representing technology to dominate others.
As I see it now, objects such as that Apple II support a modernist interpretation of understanding, according to which understanding proceeds by reducing complex things to simpler elements. My stripped-down Apple II both embodied and symbolized a theory that it was possible to understand by discovering the hidden mechanisms that made things work. Of course, this kind of theory, particularly in its utopian form (analyze and you shall know), has always presented itself as more than a model for understanding objects. It also promised understanding of the self and the social world. A modernist morality animates the writings of Karl Marx as well as Adam Smith, Sigmund Freud as well as Charles Darwin.
THE MACINTOSH MYSTIQUE
Five years after I got my Apple II, the Macintosh computer was introduced. The Macintosh suggested a radically different way of understanding. Unlike the personal computers that had come before, the "Mac" encouraged users to stay at a surface level of visual representation and gave no hint of inner mechanisms. The power of the Macintosh was how its attractive simulations and screen icons helped organize an unambiguous access to programs and data. The user was presented with a scintillating surface on which to float, skim, and play. There was nowhere visible to dive.
Yet strictly speaking, a Macintosh, like all computers, remained a collection of on/off switches, of bits and bytes, of traveling electrons, just like those embedded in the chips of my "transparent" Apple II. But the Macintosh strove to make these "irrelevant" to the user. In this way, the tools of the modernist culture of calculation became layered underneath the experience of the culture of simulation.
The Macintosh interface (its screen, really) simulated a real desk. Not a logical interface, manipulated with logical commands, as my CP/M system on the Apple II had been, but a virtual reality, albeit in two dimensions. This was a world in which you navigated your way through information as you would through space. In fact, when you held a mouse and moved it around by hand on a flat surface, you saw your physical movements mirrored on the screen by an indicator icon, usually an arrow or a pointing finger. When I used the Scribble program on my Apple II, I typed such things as "@center[@b(The Macintosh Mystique)]" to indicate that I wanted a centered subheading, "The Macintosh Mystique," printed in bold type. Although I didn't analyze the Scribble program any further, such requirements kept me in touch with the idea that I was giving commands to a machine. I felt that I needed to use symbols and a formal language of nested delimiters (parentheses and brackets) because my machine needed to reduce my commands to something that could be translated into electrical impulses. The fact that my machine's printed circuits were physically exposed to view reinforced this notion.
Writing on the Macintosh was an altogether different experience. It did not feel like commanding a machine. A simulated piece of paper appeared. A flashing pointer told me where I could start typing. If I wanted the words "The Macintosh Mystique" to appear centered and in bold type, I typed them and moved the mouse to manipulate their position and form. If I did this correctly, they appeared as I had desired, right there on the screen. I saw no reference to anything beyond the magic.
The simulated desktop that the Macintosh presented came to be far more than a user-friendly gimmick for marketing computers to the inexperienced. It also introduced a way of thinking that put a premium on surface manipulation and working in ignorance of the underlying mechanism. Even the fact that a Macintosh came in a case that users could not open without a special tool (a tool which I was told was only available to authorized dealers) communicated the message. The desktop's interactive objects, its anthropomorphized dialogue boxes in which the computer "spoke" to its user: these developments all pointed to a new kind of experience in which people do not so much command machines as enter into conversations with them. People were encouraged to interact with technology in something resembling the way they interact with other people. We project complexity onto people; the Macintosh design encouraged the projection of complexity onto the machine. In relationships with people we often have to get things done without necessarily understanding what is going on within the o...