The Human Factor

Revolutionizing the Way People Live with Technology

Kim J. Vicente


About This Book

In this incessantly readable, groundbreaking work, Vicente makes vividly clear how we can bridge the widening gap between people and technology. He investigates every level of human activity, from simple matters such as our hand-eye coordination to complex human systems such as government regulatory agencies, and explains why businesses would benefit from making consumer goods easier to use. He shows us why we all have a vital stake in reforming the aviation industry, the health industry, and the way we live day-to-day with technology.


Information

Publisher: Routledge
Year: 2013
ISBN: 9781135877255


TECHNOLOGY WREAKING HAVOC

A Threat to Our Quality of Life:
Technology Beyond Our Control

Just before midnight on Friday, April 25, 1986, Leonid Toptunov was about to begin the graveyard shift in the control room of the Vladimir Ilyich Lenin nuclear power station located near Chernobyl, just 130 kilometres northeast of Kiev and 600 kilometres southwest of Moscow.1 The weather had been unseasonably warm that week, but the joyous May Day holiday celebrations were less than a week away. As Toptunov took off his street clothes and donned his pristine white overalls and white beret for the last time, he had no idea that in less than two hours he would become an unwitting participant in a catastrophic event of historic proportions.
Earlier that day, the Chernobyl operators had begun an experimental test. It required that two important conditions be satisfied: the power being produced by the nuclear reactor had to be reduced to about 25 per cent of its full capacity, and the primary safety system that was designed to protect the plant during an emergency had to be turned off for the entire period of the test. At 1:00 P.M. the operators had begun to reduce the amount of power produced by the nuclear reactor, closely monitoring the relevant meters on the immense technological consoles in front of them. One hour later, they deliberately disabled the safety system, stripping the plant of one of its primary defences — all as required by the test plan. A nine-hour delay ensued. The continuation of the test was put off until a later shift.
Nuclear reactors have very complex dynamics, and Chernobyl was no exception. As a result of this complexity, Toptunov — the senior reactor control engineer on his crew — had trouble stopping the power level at 25 per cent and actually bottomed out at a power level of 7 per cent. But the Soviet RBMK-1000 reactor design is very unstable at low power, making it exceedingly difficult for operators to maintain control of the plant. This, combined with the fact that one of the main safety systems was turned off, made the situation extremely dangerous, but Toptunov and his comrades didn't realize it because they weren't used to running the reactor at such a low power level and because they didn't fully understand the complex principles governing the reactor's behaviour. To make matters worse, the thousands of indicators on the wall-sized consoles in front of Toptunov presented a bewildering array of data, but not enough information, and so the gravity of the situation wasn't obvious to him. And besides, young Toptunov had been told that technical experts had estimated the likelihood of a severe accident to be one in ten million — a virtual impossibility. So he and his co-workers persisted with the test.
To do so they improvised — with the plant in an unfamiliar and increasingly dangerous state — eventually stripping the reactor of its remaining safety systems. By 1:22 A.M. that fateful night, the nuclear reactor was almost out of control. Yet the temperature in the control room didn't skyrocket, there was no growing vibration, no loud noises — nothing comparable to what was shortly to come. The only thing that changed was the set of indications on the displays embedded in the bewildering consoles. Just two minutes later, at 1:24 A.M., Toptunov finally realized that the readings staring him in the face meant that a terrible event was about to occur: in a last-ditch effort to avert disaster, he tried to turn off the reactor. But his well-intentioned effort came too late; by that point, Chernobyl's fate was sealed. A critical nuclear reaction — the type that occurs by design in an atomic bomb but is never supposed to happen in a nuclear power plant — was inevitable. And immediate.
The first violent explosion unleashed a power spike one hundred times greater than anything the reactor was designed to produce under normal operating conditions. It hoisted the thousand-ton steel and concrete plate covering the reactor, exposing the 1,680 nuclear fuel rods in the reactor core and spewing deadly radioactivity into the atmosphere. The force of the explosion was so mighty that it sent radioactive particles flying as high as one kilometre into the air. A second furious explosion caused the graphite in the reactor core to burst into flames. The fire continued to burn for nine days, releasing an invisible, constant stream of radioactive particles into the environment. The reactor itself was destroyed.
Until that instant, when the first explosion ripped the reactor apart, the nuclear technology had functioned precisely as intended. The designers had done everything they were supposed to do from a technical perspective: all the hardware and software worked flawlessly. And Toptunov and his colleagues were carrying out the test plan as well as they knew how. The problem was that the plant designers hadn't paid enough attention to the human factor — the operators were trained but the complexity of the reactor and the control panels nevertheless outstripped their ability to grasp what they were seeing.2 Toptunov didn't completely understand the effects his actions were going to have until it was too late — with devastating consequences. When the graphite reactor core exploded into flames, the awesome impact that a nuclear power plant can have on both humankind and the environment was vividly realized.3
The six hundred people unlucky enough to be working at the plant that evening received very high doses of radiation, and many later suffered lingering or fatal diseases. The 116,000 people who were evacuated from the neighbouring farms and towns received lower but still significant doses of radiation. The 600,000 military and civilian workers who heroically helped put out the fires, evacuate the public, and clean up the disaster were also exposed to high levels of radiation. About 140 people experienced various degrees of injury, including convulsive radiation sickness and burns that caused blistered skin to slide off the flesh. A total of thirty-one people died as a result of the accident, including Toptunov.
He was twenty-six years old.
One of the further horrors of a nuclear catastrophe is that its impact travels widely across time and space. The number of cases of thyroid cancer among children in the area has increased, with almost 1,800 diagnosed between 1990 and 1998. Harder to measure, but just as real, is the psychological impact of such a disaster: one of the most significant health effects of the Chernobyl accident was the mental anguish and trauma experienced by the local population. People continue to be terrified of the unknown effects of radiation; they don't trust the government or scientific experts, and their way of life has been severely disrupted. These health effects will persist for generations.
But the environmental contamination is equally lasting because there is no “undo” command for a nuclear accident. To this day, large areas of land can no longer be used for agricultural purposes, and food is still monitored for radiation over an even larger area. And the impact of a nuclear accident on this scale transcends geographical borders. Chernobyl released radioactive material all across the northern hemisphere, although Europe was hardest hit. The degree of contamination outside the Soviet Union was relatively low, but radioactive fallout was detected and measured in England, Scandinavia, Southern Europe, Canada, the United States, and as far away as Japan, with the exact amount depending on the weather — if there was rainfall in a particular area when the radioactive cloud passed over, it received a greater amount of radioactivity. The lesson became clear with Chernobyl — a nuclear catastrophe anywhere can be a nuclear catastrophe everywhere.
Step back a moment to 1936. In the final days of black-and-white silent movies, Charlie Chaplin created a masterful satire of industrialization, Modern Times, which drew attention to the human and social costs of technology. In one memorable sequence, Chaplin is shown working on an assembly line. His job is to perform a few motions over and over again; he uses two wrenches, one in each hand, to tighten two bolts on each component rolling by on a conveyor belt. The speed of the belt increases; Chaplin tries desperately to keep up but eventually he's carried away by the conveyor belt and fed into a chute. In the next scene, we see several gigantic mechanical wheels with intertwined geared teeth grinding the bemused Little Tramp through a rigidly defined S-shaped path, first forward, then back, then forward again. He has been forced to adapt to technology — literally: he has become a veritable cog in the wheel.
Chaplin, however, had to adapt only to mechanical gears moving at terrestrial speed. We who inhabit the modern times of the twenty-first century have to adapt to digital technology moving at light speed. More and more technology is being foisted upon us at a faster and faster pace. We walk around with electronic leashes — pagers, cell phones, personal digital assistants and pocket PCs — that tie us to our work. At home, we have the latest electronic consumer products — each with its own remote control and hefty user's manual. All these gadgets are supposed to make life easier, but they often make it more difficult instead. And before we learn to use the latest technological “convenience,” there's a new one on the market with more “advanced” features. No matter how many user's manuals we read, we just can't seem to keep up.
The challenges facing us have never been more daunting, despite the fact that our knowledge of the physical world and the technological possibilities we possess are vastly more sophisticated than they were even fifty years ago. Never before in the history of human civilization have we so quickly amassed so much knowledge of science, mathematics and engineering, and never before have we seen such tremendous advances in technology. The number, diversity and sophistication of the options available to us allow us to conceive and construct increasingly intricate products and systems. Given this abundant knowledge of both the physical world and of technological possibilities, we might expect our problems with technology to decrease, not increase. Granted, many technical innovations have undoubtedly improved our quality of life. One well-known example is the PalmPilot personal digital assistant. This handheld electronic device has been a marketplace success because many people find it both useful and easy to use. In later chapters, I'll describe how the PalmPilot and several other successful everyday products were designed. But devices that are easy for people to use and that serve a significant human or societal need seem to be the exception. As a result, there's a growing realization that all is not well in the world of technology.4
Here's an everyday example. A few years ago, Mercedes-Benz started offering a feature on their E320 model that lets drivers check their oil electronically, from the driver's seat.5 It seems like a clever use of technology. You don't have to leave the cosy confines of your climate-controlled automobile. Smart. You no longer have to pop the hood, find a rag to wipe the dipstick, or figure out which of the several dipstick-looking things under the hood is really the dipstick. And you don't have to go through the tedious and messy manual process of lifting the dipstick, wiping it, reinserting it, taking a reading and reinserting it again — exactly the kind of innovation you'd expect from legendary German engineering.
This electronic oil-checking feature couldn't have been designed decades ago, before the transistor was invented. At that time, our knowledge of electronics and our available technological options were too impoverished to permit such a potentially useful feature. I say “potentially,” because I haven't yet described what you actually have to do to check your oil from the driver's seat in this car. There are only five steps. Step number 1: turn the car off. Step number 2: wait for the oil to settle. Fair enough. It doesn't make sense to check the oil with the engine on. You have to let things settle to get a reliable reading of the level. Step number 3: turn the ignition two notches to the right. Hmmm. That's a little less obvious. It's easy enough to do, but there's no intuitive relationship between the action and the effect of the action. Step number 4: wait five seconds. What? Wait five seconds? You've already waited for the oil to settle. Why do you have to wait another five seconds? But you're not done yet. There's one more step. Step number 5: within one second, press the odometer reset button twice. This step makes no sense whatsoever. It seems completely arbitrary. What does the odometer reset button have to do with checking the oil? As far as I can tell, there's no logical answer to this question — and I have a Ph.D. in mechanical engineering. The average driver will be baffled, even though the electronic components have been painstakingly designed, with a sophisticated understanding of the laws of electricity. In the end, most people will just get out of the car and check the oil the old-fashioned way because they can't remember the steps and can't be bothered to read the counterintuitive instructions again. So much for that legendary German engineering.
I once tried to describe the work my students and I do to a journalist, who turned my long-winded explanation into a succinct sound bite: “Oh, so you're technological anthropologists!” I had never thought of our work in that way, but I suppose that's one way of describing it. We have indeed done a number of field studies of people using technology in situ — or in their local habitat, my journalist friend might say. I once spent an entire week during spring break on the twelve-hour night shift in a nuclear power plant's control room trying to figure out how the operators performed what looked like an impossible job, incredibly reliably, day in and day out. I've also spent time in hospitals, just talking to doctors and nurses about how technology helps or hinders their jobs, and watching surgeries in the O.R. One of the first operations I saw was an amputation below the knee (my medical colleagues didn't inform me ahead of time — if they had, I might not have showed up). And more recently, I've spent days in 911 call centres listening in on phone operators and ambulance dispatchers trying to deal with life-and-death medical emergencies. (In one call, the 911 operator was trying to guide the caller to perform mouth-to-mouth resuscitation on an old man who had started to turn blue, but the caller was reluctant to get that close to the ailing patient, giving as an excuse not to follow the instructions: “I think he's dead, I think he's dead!”) My graduate students have spent countless hours talking to operators in petrochemical plant control rooms, watching computer network managers monitor and troubleshoot telecommunication webs, sitting in with flight engineers in aircraft simulators and in real cockpits on long flights, observing nurses programming computer-based medical devices in hospital recovery rooms, and observing an engineering design firm at work, over a period of months. 
At the same time, we've conducted research on how to design better technological systems — generating new design ideas, building prototypes and running controlled scientific experiments to see if our creations really do help peop...
