1
A Plan of Biblical Proportions
At 10 a.m. on September 19, 1957, a nuclear blast shook a mesa at the Nevada Test Site (NTS), located about sixty-five miles northwest of Las Vegas. Willard Libby, a member of the AEC, recalled that he and other observers who had positioned themselves about two and a half miles away heard “a muffled explosion” and felt “a weak ground wave.” The entire “mountain jumped about six inches,” a “ripple…spread over [its] face,” and some rocks rolled down the formation’s slopes. The explosion generated shock waves equivalent to those of an earthquake measuring approximately 4.6 on the Richter scale, and seismographs as distant as Alaska registered vibrations.1
What Libby witnessed and the seismographs recorded was Project Rainier, the first underground nuclear explosion conducted on U.S. soil. Rainier’s significance, however, went beyond where it took place. Since the late 1940s American scientists had given thought to using atomic explosives for peaceful purposes, and a few months before Rainier they had assigned the idea the name “Plowshare.” Yet throughout, proponents of using the atom in civilian projects faced an increasingly vocal and influential movement to ban nuclear testing because of the dangerous radioactive fallout such explosions generated. For those who believed in Plowshare’s potential, Rainier provided clear evidence that atomic blasts could take place while posing little if any danger to humans.
Genesis
The idea for putting nuclear explosives to nonmilitary use developed from a number of sources. One was the creation of the U.S. Atomic Energy Commission in 1946. Prior to the establishment of the AEC, control over America’s nuclear technology was in the hands of the U.S. Army’s Manhattan Engineer District, better known as the Manhattan Project. An example of the “military-industrial complex” about which President Eisenhower famously warned in his farewell address, the Manhattan Project brought together government officials, scientists, the armed forces, and industry in an effort to develop the atomic bomb. Within a day of the dropping of the first A-bomb on Japan in August 1945, President Harry Truman called on Congress to create a “commission to control the production and use of atomic power.” Although at first lawmakers considered giving the military much of the control over the new body and having the commission focus primarily on weapons development, the outcry from both the public and the scientific community prompted Senator Brien McMahon (D-Connecticut) in 1946 to sponsor legislation that would shift responsibility for overseeing America’s military and civilian atomic energy programs from the armed forces to civilian officials. Passed by Congress and signed into law later that year, the McMahon Act, also known as the Atomic Energy Act, established the AEC. The Commission consisted of five civilians, all of whom required Senate confirmation to take their posts. Its job was to give priority to the development of nuclear weapons, but it also had the task of encouraging peaceful uses of the atom. To afford the armed forces, scientists, and lawmakers all a say in the new agency’s decision making, the McMahon Act divided the AEC into various divisions, among them the Division of Military Applications, headed by an officer in the armed forces,2 and established two committees. The first was a General Advisory Committee, made up of engineers and scientists, which met at least four times a year and advised the AEC “on scientific and technical matters relating to materials, production, and research and development.” Unlike the members of the AEC, those individuals who sat on this committee did not require Senate confirmation. The second was Congress’s Joint Committee on Atomic Energy (JCAE). Its job was to oversee the civilian and military nuclear programs. Unlike other agencies of government that sent budget requests directly to the House or Senate appropriations committees, the AEC first had to receive authorization from the JCAE for any budget request; only then would that request move on to the appropriate appropriations committee.3
The superpower arms race also had an impact. With the advent of the cold war, the United States took steps to contain the spread of Soviet-inspired communism. Maintaining a monopoly on atomic weaponry was to U.S. officials an integral component of containment. But the Soviet test of an atomic bomb in August 1949 threatened containment doctrine. It now became vital for the United States to stay ahead of its superpower rival militarily. As part of that effort, President Truman in January 1950 authorized construction of the “super,” or hydrogen bomb. At the end of the year he established the Nevada Proving Grounds as a location for secret tests of new weaponry. Renamed in 1951 the NTS, it was located inside the Tonopah Bombing and Gunnery Range, which encompassed more than five thousand square miles of land in southeastern Nevada.4
Figure 1. Ivy Mike, the first test of a thermonuclear explosive, 1952. Photo courtesy of National Nuclear Security Administration, Nevada Site Office.
In 1952 the “super” went from theory to reality when the “Ivy Mike” test occurred at Eniwetok Atoll in the Pacific Ocean. Weighing sixty-five tons, Ivy Mike generated a blast equivalent to ten megatons (or 10 million tons of TNT) and left behind a crater over a mile wide and 160 feet deep. What made Ivy Mike unique was that, unlike the fission-based atomic bombs dropped on Japan, it employed fusion. In fission a heavy atomic nucleus is split apart, thereby releasing energy. Fusion instead combines light atomic nuclei; the reaction likewise releases energy, but far more of it for a given amount of fuel. Ivy Mike was evidence of fusion’s potency, having unleashed an explosion equivalent to some 650 Hiroshima-style bombs.
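That equivalence can be checked with a rough calculation; assuming a Hiroshima yield of about 15 kilotons (a commonly cited value, not stated in the text), it works out as

\[
\frac{10\ \text{megatons}}{15\ \text{kilotons per bomb}} = \frac{10{,}000\ \text{kilotons}}{15\ \text{kilotons per bomb}} \approx 670\ \text{bombs},
\]

which is broadly consistent with the roughly 650 Hiroshima-style explosions cited above.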
In the superpower arms race, fusion offered advantages over fission. First, there were the matters of production and cost. A fission explosive requires fissile material derived from uranium. Natural uranium consists almost entirely of uranium-238, a radioactive isotope that by itself cannot be employed in a nuclear device. It must either be enriched to concentrate the much scarcer uranium-235 or bombarded with neutrons to produce plutonium-239. Uranium ore, moreover, is not plentiful, and processing it is costly; the fissile material for a bomb is therefore expensive.
A fusion device is different, relying primarily on deuterium and tritium. Deuterium is a naturally occurring form of hydrogen that can be separated from ordinary water, which makes it far cheaper to obtain than uranium-235. Tritium is more problematic. It too is a form of hydrogen, but it does not occur in nature in useful quantities, which makes it costly to produce. However, by bombarding lithium, a more readily available and less expensive element, with neutrons, one can create the necessary tritium.
There was still a problem. Tritium requires a temperature of 80 million degrees to fuse. This was lower than the temperature required to fuse deuterium but still a challenge in itself. Here fission came into play. In a process still used in today’s thermonuclear weapons, a small amount of enriched uranium provides the fission reaction that, within milliseconds, generates the heat necessary to create and fuse the tritium; that in turn almost instantaneously raises the temperature high enough to fuse the deuterium, and it is the fusion of tritium and deuterium that releases the force of a hydrogen blast. Hence a fusion device requires some uranium. But because it needs less uranium than a device relying solely on fission, a fusion weapon, despite the cost of the tritium, delivers far more bang for the buck. According to U.S. scientists in 1955, a single pound of hydrogen for use in fusion cost only $140, as compared to $11,000 for an analogous amount of uranium-235.5
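Those 1955 figures imply a simple per-pound cost ratio, a back-of-the-envelope comparison not spelled out in the source:

\[
\frac{\$11{,}000\ \text{per pound of uranium-235}}{\$140\ \text{per pound of fusion hydrogen}} \approx 79,
\]

so on a per-pound basis the hydrogen fuel cost roughly one-eightieth as much as the enriched uranium.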
President Eisenhower and his secretary of state, John Foster Dulles, found fusion highly attractive. It was now possible to build nuclear weapons in larger quantities and at lower cost than it took to maintain sizable conventional forces in the field, thereby allowing for cuts in defense spending. Fusion also offered greater deterrence value. With ever more nuclear weapons at its disposal, the Eisenhower administration could use the threat of a nuclear Armageddon to convince the Soviets not to try to spread their influence beyond where it already existed. As Dulles put it in January 1954, “Local defenses must be reinforced by the further deterrent of massive retaliatory power.”6 Adding to that deterrent effect was the fact that the United States, like the Soviet Union, was within a few years of developing intercontinental ballistic missiles (ICBMs). The ability to place very powerful yet small hydrogen warheads on those missiles would permit the U.S. military to destroy numerous Soviet targets without having to fly aircraft over or physically place launchers near those sites.7 Ivy Mike, in short, offered numerous military benefits to the United States.
Yet Ivy Mike also held promise for nonmilitary projects. The atom need not lead the world toward obliteration. Rather, if properly harnessed, it could direct humankind toward a better, brighter future. Scientists had known since the early 1900s that radiation could kill bacteria in food and treat human cancers. In February 1946 Edward Teller, a world-renowned physicist who worked on the Manhattan Project and later became known as the “father of the hydrogen bomb,” contended that it was possible to find other civilian uses for the atom, such as producing power. He declared, “Use of radio-elements which are by-products of atomic power plants will have an extremely great influence in science, particularly in medical science.”8 Indeed, from the debris of Ivy Mike scientists had discovered two new elements, fermium and einsteinium. Perhaps it was possible to generate still others that could be put toward human benefit.
Construction was another possible application. A fission device is very “dirty,” meaning it releases a large amount of radioactivity, a considerable portion of which gets into the atmosphere and returns to Earth in the form of fallout. Fusion generates very little radiation, and what radioactivity is released comes largely from the fission process used to fuse the deuterium and tritium. Thus, in addition to delivering a more powerful explosion, a fusion device, because it employed less uranium than one relying on fission alone, produced a “cleaner” blast.9 Some scientists wondered: if they could further reduce the radioactivity generated in a fusion reaction and, in essence, create a totally clean explosive, might it not be possible to use the atom to, say, create a hole similar to that at Eniwetok and use it for a harbor? Could they, in short, find creative rather than destructive uses for fusion?
One of the first American scientists to ask such questions was Fred Reines, a physicist at Los Alamos Scientific Laboratory. Shortly after President Truman announced confirmation of the Soviet atomic test, Moscow’s foreign minister, Andrei Vishinsky, told the United Nations in November 1949 that his country did indeed possess the power of the atom, but he insisted that the Kremlin had no intention of adopting it for military applications. Rather, he claimed, his nation had used, and would continue to use, nuclear explosives solely for nonmilitary projects, including mining, hydroelectric power, and the construction of canals.10 While many Americans doubted the Kremlin’s professions of benevolence, Reines found the possibility of putting nuclear devices to use in civilian projects intriguing. Writing in 1950 in the Bulletin of the Atomic Scientists, Reines admitted that nuclear blasts released dangerous radiation, yet he asked whether it might be possible to use “the bomb in such activities as mining, where the fission products would be confined to relatively small regions into which men would be required to go,” or to “divert a river by blasting a large volume of solid rock.” Reines was not alone. One of his colleagues, the mathematician John von Neumann, and scientists at the University of California’s Radiation Laboratory (UCRL) shared Reines’s interest in applying nuclear explosives to civilian undertakings.11
So did President Eisenhower. During World War II he had served as commander of Allied forces in Europe. Afterward he led the U.S. troops occupying Germany, became president of Columbia University, and then commanded the military forces of the North Atlantic Treaty Organization (NATO) in Europe before running for the presidency in 1952. His time at Columbia had given him an opportunity to meet with atomic scientists, from whom he had learned of the possibilities of using the atom for the benefit of humankind. At the same time, he had come to understand, especially with the advent of the fusion weapon, that as long as the arms race continued, the planet faced the risk of a war far more catastrophic than anything it had yet seen. “The world,” he told his recently appointed head of the AEC, Lewis Strauss, “simply must not go on living in the fear of the terrible consequence of nuclear war.” Reinforcing that realization was news in August 1953 that the Soviets had detonated their own hydrogen bomb.12
For Eisenhower, the question was how to curtail the arms race while permitting the United States to maintain its nuclear supremacy. An earlier attempt, called the Baruch Plan, had failed. A half-hearted proposal presented by the United States in 1946, the Baruch Plan called for an international organization to control “all atomic energy activities potentially dangerous to world security” through on-site inspections and other measures. The U.S. government insisted, however, that it would not relinquish its nuclear stockpile until after the Soviets halted their atomic research and permitted inspections. Correctly viewing Washington’s proposal as an effort to maintain America’s nuclear monopoly, and charging that it infringed on their sovereignty, the Soviets rejected the Baruch Plan.13
Looking at that history, Eisenhower and his aides came up with an ingenious idea: promote the peaceful use of the atom. They saw several advantages to such a strategy. It might curb the arms race by having the superpowers assign a portion of their nuclear stockpiles to civilian rather than military use. It would shift the emphasis from the danger posed by the atom to the possible benefits that might accrue. More ominously—and left unsaid—an emphasis on the atom’s potential for good would make Americans more receptive to an increase in the overall size of their nation’s nuclear arsenal, which, in the event of war, could be unleashed against an enemy.14
It was with these considerations in mind that in December 1953 Eisenhower proposed in a speech before the United Nations General Assembly what became known as “Atoms for Peace.” “Atomic bombs today are more than 25 times as powerful as the weapons with which the atomic age dawned,” he explained, “while hydrogen weapons are in the ranges of millions of tons of TNT equivalent.” Even so, history had demonstrated “mankind’s never-ending quest for peace, and mankind’s God-given capacity to build.” The United States sought to join that effort for peace. It wanted, the president insisted, “to be constructive, not destructive. It wants agreement, not wars, among nations.” Hence he proposed that those countries with fissionable material contribute some of it to an international atomic energy agency, overseen by t...