Project Plowshare

The Peaceful Use of Nuclear Explosives in Cold War America

  • 296 pages
  • English
  • ePub

About this book

Inspired by President Dwight D. Eisenhower's "Atoms for Peace" speech, scientists at the Atomic Energy Commission and the University of California's Radiation Laboratory began in 1957 a program they called Plowshare. Joined by like-minded government officials, scientists, and business leaders, champions of "peaceful nuclear explosions" maintained that they could create new elements and isotopes for general use, build storage facilities for water or fuel, mine ores, increase oil and natural gas production, generate heat for power production, and construct roads, harbors, and canals. By harnessing the power of the atom for nonmilitary purposes, Plowshare backers expected to protect American security, defend U.S. legitimacy and prestige, and ensure access to energy resources.

Scott Kaufman's extensive research in nearly two dozen archives in three nations shows how science, politics, and environmentalism converged to shape the lasting conflict over the use of nuclear technology. Indeed, despite technological and strategic promise, Plowshare's early champions soon found themselves facing a vocal and powerful coalition of federal and state officials, scientists, industrialists, environmentalists, and average citizens. Skeptical politicians, domestic and international pressure to stop nuclear testing, and a lack of government funding severely restricted the program. By the mid-1970s, Plowshare was, in the words of one government official, "dead as a doornail." However, the thought of using the atom for peaceful purposes remains alive.


1


A Plan of Biblical Proportions

At 10 a.m. on September 19, 1957, a nuclear blast shook a mesa at the Nevada Test Site (NTS), located about sixty-five miles northwest of Las Vegas. Willard Libby, a member of the AEC, recalled that he and other observers who had positioned themselves about two and a half miles away heard “a muffled explosion” and felt “a weak ground wave.” The entire “mountain jumped about six inches,” a “ripple…spread over [its] face,” and some rocks rolled down the formation’s slopes. The explosion generated shock waves equivalent to those of an earthquake of approximately 4.6 on the Richter scale, and seismographs as distant as Alaska registered vibrations.1
What Libby witnessed and the seismographs recorded was Project Rainier, the first underground nuclear explosion conducted on U.S. soil. Rainier’s significance, however, went beyond where it took place. Since the late 1940s American scientists had given thought to using atomic explosives for peaceful purposes, and a few months before Rainier they had assigned the idea the name “Plowshare.” Yet throughout, proponents of using the atom in civilian projects faced an increasingly vocal and influential movement to ban nuclear testing because of the dangerous radioactive fallout such explosions generated. For those who believed in Plowshare’s potential, Rainier provided clear evidence that atomic blasts could take place while posing little, if any, danger to humans.

Genesis

The idea for putting nuclear explosives to nonmilitary use developed from a number of sources. One was the creation of the U.S. Atomic Energy Commission in 1946. Prior to the establishment of the AEC, control over America’s nuclear technology was in the hands of the U.S. Army’s Manhattan Engineering District, better known as the Manhattan Project. An example of the “military-industrial complex” about which President Eisenhower famously warned in his farewell address, the Manhattan Project brought together government officials, scientists, the armed forces, and industry in an effort to develop the atomic bomb. Within a day of the dropping of the first A-bomb on Japan in August 1945, President Harry Truman called on Congress to create a “commission to control the production and use of atomic power.” Although at first lawmakers considered giving the military much of the control over the new body and having the commission focus primarily on weapons development, the outcry from both the public and the scientific community prompted Senator Brien McMahon (D-Connecticut) in 1946 to sponsor legislation that would shift responsibility for overseeing America’s military and civilian atomic energy programs from the armed forces to civilian officials. Passed by Congress and signed into law later that year, the McMahon Act, also known as the Atomic Energy Act, established the AEC. The Commission consisted of five civilians, all of whom required Senate confirmation to take their posts. Its job was to give priority to the development of nuclear weapons, but it also had the task of encouraging peaceful uses of the atom. To afford the armed forces, scientists, and lawmakers all a say in the new agency’s decision making, the McMahon Act divided the AEC into various divisions, among them the Division of Military Applications, headed by an officer in the armed forces,2 and established two committees. The first was a General Advisory Committee, made up of engineers and scientists, which met at least four times a year and advised the AEC “on scientific and technical matters relating to materials, production, and research and development.” Unlike the members of the AEC, those individuals who sat on this committee did not require Senate confirmation. The second was Congress’s Joint Committee on Atomic Energy (JCAE). Its job was to oversee the civilian and military nuclear programs. Unlike other agencies of government that sent budget requests directly to the House or Senate appropriations committees, the AEC first had to receive authorization from the JCAE for any budget request; only then would that request move on to the appropriate appropriations committee.3
The superpower arms race also had an impact. With the advent of the cold war, the United States took steps to contain the spread of Soviet-inspired communism. Maintaining a monopoly on atomic weaponry was to U.S. officials an integral component of containment. But the Soviet test of an atomic bomb in August 1949 threatened containment doctrine. It now became vital for the United States to stay ahead of its superpower rival militarily. As part of that effort, President Truman in January 1950 authorized construction of the “super,” or hydrogen bomb. At the end of the year he established the Nevada Proving Grounds as a location for secret tests of new weaponry. Renamed in 1951 the NTS, it was located inside the Tonopah Bombing and Gunnery Range, which encompassed more than five thousand square miles of land in southeastern Nevada.4
Figure 1. Ivy Mike, the first test of a thermonuclear explosive, 1952. Photo courtesy of National Nuclear Security Administration, Nevada Site Office.
In 1952 the “super” went from theory to reality when the “Ivy Mike” test occurred at Eniwetok Atoll, located in the Pacific Ocean. Weighing sixty-five tons, Ivy Mike generated a blast equivalent to ten megatons (or 10 million tons of TNT) and left behind a crater over a mile wide and 160 feet deep. What made Ivy Mike unique was that unlike the fission-based atomic bombs dropped on Japan, it employed fusion. In fission an atomic nucleus is split apart, thereby releasing energy. Fusion takes place by combining atomic nuclei; again, the blast releases energy, but in this case, the amount is much greater. Ivy Mike was evidence of fusion’s potency, having unleashed an explosion equivalent to some 650 Hiroshima-style bombs.
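As a rough check of that equivalence, assuming the commonly cited yield of about fifteen kilotons for the Hiroshima bomb (a figure not given in the text):

\[ \frac{10\ \text{Mt}}{15\ \text{kt per bomb}} = \frac{10{,}000\ \text{kt}}{15\ \text{kt}} \approx 670\ \text{bombs}, \]

which is consistent with the “some 650” figure above.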
In the superpower arms race, fusion offered advantages over fission. First, there were the matters of production and cost. Building a fission explosive requires uranium. Uranium-238, the isotope that makes up nearly all naturally occurring uranium, cannot by itself sustain the chain reaction a nuclear device needs. Natural uranium must instead be enriched to concentrate the fissile isotope uranium-235, or uranium-238 must be bombarded with neutrons to generate plutonium-239. Uranium, however, is not a common element, and producing weapon material from it is expensive.
A fusion device is different, relying primarily on deuterium and tritium. Deuterium, which is found in nature, is a form of hydrogen, the most common element on the planet and hence much cheaper to obtain than uranium-235. Tritium is more problematic. It too is a form of hydrogen but is not naturally occurring; this makes it costly to produce. However, by using lithium, a more readily available and less expensive element, and bombarding it with neutrons, one can create the necessary tritium.
There was still a problem. Tritium requires a temperature of 80 million degrees to fuse. This was lower than the temperature required to fuse deuterium but still a challenge in itself. Here fission came into play. In a process still used in today’s thermonuclear weapons, a small amount of enriched uranium provides the fission process that, within milliseconds, generates the heat necessary to create and fuse the tritium; that in turn almost instantaneously raises the temperature high enough to fuse the deuterium, and it is the fusion of tritium and deuterium that releases the force of a hydrogen blast. Hence a fusion device requires some uranium. However, since it needs less uranium than one relying solely on fission, a fusion weapon, despite the cost of the tritium, delivers far more bang for the buck than one that employs solely fission. According to U.S. scientists in 1955, a single pound of hydrogen for use in fusion cost only $140, as compared to $11,000 for an analogous amount of uranium-235.5
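Taking those 1955 figures at face value, and assuming a pound-for-pound comparison of the two fuels (an assumption the text does not spell out), the cost advantage works out to roughly

\[ \frac{\$11{,}000}{\$140} \approx 80, \]

that is, fusion fuel on the order of eighty times cheaper per pound than uranium-235.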
President Eisenhower and his secretary of state, John Foster Dulles, found fusion highly attractive. It was now possible to build nuclear weapons in larger quantity and at less cost than it took to maintain sizable conventional forces in the field, thereby allowing for cuts in defense spending. Fusion also offered greater deterrence value. With ever more nuclear weapons at its disposal, the Eisenhower administration could use the threat of a nuclear Armageddon to convince the Soviets not to try to spread their influence beyond where it already existed. As Dulles put it in January 1954, “Local defenses must be reinforced by the further deterrent of massive retaliatory power.”6 Adding to that deterrent effect was the fact that the United States (like the Soviet Union) was within a few years of developing intercontinental ballistic missiles (ICBMs). The ability to place very powerful yet small hydrogen warheads on those missiles would permit the U.S. military to destroy numerous Soviet targets without having to fly aircraft over or physically place launchers near those sites.7 Ivy Mike, in short, proffered numerous military benefits to the United States.
Yet Ivy Mike also held promise for nonmilitary projects. The atom need not lead the world toward obliteration. Rather, if properly harnessed, it could direct humankind toward a better, brighter future. Scientists had known since the early 1900s that radiation could kill bacteria in food and treat human cancers. In February 1946 Edward Teller, a world-renowned physicist who worked on the Manhattan Project and later became known as the “father of the hydrogen bomb,” contended that it was possible to find other civilian uses for the atom, such as producing power. He declared, “Use of radio-elements which are by-products of atomic power plants will have an extremely great influence in science, particularly in medical science.”8 Indeed from Ivy Mike, scientists had discovered two new elements, fermium and einsteinium. Maybe it was possible to generate still others that could be put toward human benefit.
Construction was another possible application. A fission device is very “dirty,” meaning it releases a large amount of radioactivity, a considerable portion of which gets into the atmosphere and returns to Earth in the form of fallout. Fusion generates very little radiation, and what radioactivity is released comes from the fission process used to fuse the deuterium and tritium. Thus, in addition to its greater explosive power, a fusion device produced a “cleaner” blast than a fission device, because it employed a smaller amount of uranium.9 If they could further reduce the radioactivity generated in a fusion reaction—in essence, create a totally clean explosive—wondered some scientists, might it not be possible to use the atom to, say, create a hole similar to that at Eniwetok and use it for a harbor? Could they, in short, find creative rather than destructive uses for fusion?
One of the first American scientists to ask such questions was Fred Reines, a physicist at Los Alamos National Laboratory. Shortly after President Truman announced confirmation of the Soviet atomic test, Moscow’s foreign minister, Andrei Vishinsky, told the United Nations in November 1949 that his country did indeed possess the power of the atom, but he insisted that the Kremlin had no intention of adopting it for military applications. Rather, he claimed, his nation had used, and would continue to use, nuclear explosives solely for nonmilitary projects, including mining, hydroelectric power, and the construction of canals.10 While many Americans doubted the Kremlin’s professions of benevolence, Reines found the possibility of putting nuclear devices to use in civilian projects intriguing. Writing in 1950 in the Bulletin of the Atomic Scientists, Reines admitted that nuclear blasts released dangerous radiation, yet he asked whether it might be possible to use “the bomb in such activities as mining, where the fission products would be confined to relatively small regions into which men would not be required to go,” or to “divert a river by blasting a large volume of solid rock.” Reines was not alone. One of his colleagues, the mathematician John von Neumann, and scientists at the University of California’s Radiation Laboratory (UCRL) shared Reines’s concept of applying nuclear explosives to civilian undertakings.11
So did President Eisenhower. During World War II he had served as commander of Allied forces in Europe. Afterward he led the U.S. forces occupying Germany, became president of Columbia University, and then commanded the military forces of the North Atlantic Treaty Organization (NATO) in Europe before running for the presidency in 1952. His time at Columbia had given him an opportunity to meet with atomic scientists, from whom he had learned of the possibilities of using the atom for the benefit of humankind. At the same time, he had come to understand, especially with the advent of the fusion weapon, that as long as the arms race continued, the planet faced the risk of a war far more catastrophic than anything it had yet seen. “The world,” he told his recently appointed head of the AEC, Lewis Strauss, “simply must not go on living in the fear of the terrible consequence of nuclear war.” Reinforcing that realization was news in August 1953 that the Soviets had detonated their own hydrogen bomb.12
For Eisenhower, the question was how to curtail the arms race while permitting the United States to maintain its nuclear supremacy. An earlier attempt, called the Baruch Plan, had failed. A half-hearted proposal presented by the United States in 1946, the Baruch Plan called for an international organization to control “all atomic energy activities potentially dangerous to world security” through on-site inspections and other measures. The U.S. government insisted, however, that it would not relinquish its nuclear stockpile until after the Soviets halted their atomic research and permitted inspections. Correctly viewing Washington’s proposal as an effort to maintain America’s nuclear monopoly, and charging that it infringed on their sovereignty, the Soviets rejected the Baruch Plan.13
Looking at that history, Eisenhower and his aides came up with an ingenious idea: promote the peaceful use of the atom. They saw several advantages to such a strategy. It might curb the arms race by having the superpowers assign a portion of their nuclear stockpiles to civilian rather than military use. It would shift the emphasis from the danger posed by the atom to the possible benefits that might accrue. More ominously—and left unsaid—an emphasis on the atom’s potential for good would make Americans more receptive to an increase in the overall size of their nation’s nuclear arsenal, which, in the event of war, could be unleashed against an enemy.14
It was with these considerations in mind that in December 1953 Eisenhower proposed in a speech before the United Nations General Assembly what became known as “Atoms for Peace.” “Atomic bombs today are more than 25 times as powerful as the weapons with which the atomic age dawned,” he explained, “while hydrogen weapons are in the ranges of millions of tons of TNT equivalent.” Even so, history had demonstrated “mankind’s never-ending quest for peace, and mankind’s God-given capacity to build.” The United States sought to join that effort for peace. It wanted, the president insisted, “to be constructive, not destructive. It wants agreement, not wars, among nations.” Hence he proposed that those countries with fissionable material contribute some of it to an international atomic energy agency, overseen by t...

Table of contents

  1. Preface
  2. List of Abbreviations
  3. Introduction: Promoting the Peaceful Atom
  4. 1. A Plan of Biblical Proportions
  5. 2. Just Drop Us a Card
  6. 3. A Program on Hold
  7. 4. From Moratorium to Test Ban
  8. 5. The Complexities of Canal Construction
  9. 6. Nuclear Testing, Nonproliferation, and Plowshare
  10. 7. Making Headway?
  11. 8. Plowshare Goes Down Under
  12. 9. Dead as a Doornail
  13. Conclusion: Back from the Dead?
  14. Notes
  15. Bibliography