1
"At the Time People Hadn't Been Asking Those Sorts of Questions"
Army Mental Health Research between Vietnam and Iraq
On September 3, 1981, twenty-five Army psychiatrists and psychologists settled into a classroom at Fort Sam Houston's Academy of Health Sciences in San Antonio, Texas, to chart the future of Army mental health. Most of the seven majors and eight captains there had likely not been in the Army long enough to have served in the war in Vietnam, from which the United States had withdrawn combat forces in 1973.1 The ten colonels and lieutenant colonels there might have been among the relatively few psychiatrists who deployed to Southeast Asia, but even if they had, they likely spent the bulk of their time treating the rampant substance abuse issues that troops faced.2 Very few had significant experience with combat stress, much less post-traumatic stress disorder, a condition that had been included for the first time in the previous year's revision of the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM-III).3
They were all, however, part of an institution still recuperating from the war. In the late 1970s, Americans worried about a "hollow Army" populated by soldiers who were unprepared for modern warfare.4 That summer, the New York Times had reported that the Army was struggling to recruit qualified soldiers and that officers in the U.S. Seventh Army in West Germany were finding that "many of the soldiers coming . . . from training in the United States cannot read or write or do simple arithmetic."5 For those ill-prepared soldiers, a tour of duty in Germany was hardly an adventure.6 During the day, they trained on outdated equipment; at night, they slept in dilapidated barracks. "In most American prisons," the Times wrote in May, "such working and living conditions would probably lead to riots."7 Morale, unsurprisingly, was low.8
These issues mattered greatly. After Vietnam, the U.S. military turned away from fighting insurgencies in the developing world and focused on building capacity to fight a conventional war to counter potential Soviet aggression in Europe, and the Seventh Army was the bellwether of the United States's ability to meet that challenge.9 For Ronald Reagan, who would oversee a massive increase in defense spending and the recuperation of the U.S. armed forces, that threat was not notional. Telling the West German Bundestag in 1982 that "we're menaced by a power that openly condemns our values and answers our restraint with a relentless military buildup," Reagan called for "the presence of well-equipped and trained forces in Europe" and promised "a national effort . . . to make long-overdue improvements in our military posture."10
This anxiety and these promises benefited the U.S. Army. As Andrew Bacevich explains, "Preparing to fight Russians was . . . the ready-made answer to every question essential to institutional recovery and continual health."11 Reagan's increased defense spending led to massive modernization of equipment, and the Army finally figured out how to recruit quality soldiers.12 The Seventh Army alone saw an increase of $239 million in its budget in 1983, and new weapons like the M-1 Abrams tank and the Bradley fighting vehicle poured into its arsenal.13 By the fall of 1984, soldiers in the Seventh Army were boasting of the unit's morale and capabilities, and simulations showed that the Army was capable of defeating a Soviet advance.14 An entire generation of Army armor and infantry officers, those soldiers who became senior leaders in the twenty-first century, made their careers by demonstrating their ability to lead troops in an imagined battle on the plains of Central Europe.15
So it was in the Army Medical Department, or AMEDD, as well. Medical Department researchers contemplated the likely psychological consequences of the brutal, brief war with the Soviet Union that the larger Army imagined and studied how they could be prevented, mitigated, and treated. Two major schools of thought approached this question. At the Walter Reed Army Institute of Research (WRAIR) outside of Washington, DC, a group of researchers led by Dave Marlowe investigated how soldiers could be made more resilient, focusing on issues including the creation of cohesive units, the impact of strong leadership, and the effects of sleep deprivation. Marlowe was a World War II infantry veteran and Harvard-trained anthropologist who was, one colleague explained, the kind of man who "could spend hours watching you make soup, just to understand something about you."16 The WRAIR faction thus embraced his anthropological approach and viewed combat stress somewhat differently from a group at the AMEDD Center and School in San Antonio. That group was led by A. David Mangelsdorff, a reservist and civilian psychologist for the Army's Medical School who had taken an interest in how soldiers could become better able to cope with adversity, and Col. James Stokes, an Army psychiatrist. It focused primarily on combat stress and battle fatigue (a temporary but treatable ailment brought about by the overwhelming violence and chaos of combat) and took a traditional psychomedical approach.17 According to Col. James Martin, who served under Marlowe and led WRAIR's research unit in Heidelberg, Germany, the two groups were akin to "people with different religious beliefs," which meant that "there were profound differences in their approach" because "we each had a different view of the elephant."18
The research conducted by each group and its implementation in the 1980s and 1990s offers an important window into the assumptions and capabilities regarding mental health that the Army brought into the twenty-first century. In particular, the belief that the Army would henceforth fight short, intense conflicts led to a focus on combat stress and battle fatigue that, while clinically valid, did not account for the possibility of protracted irregular warfare that would require individual soldiers to deploy multiple times. Perhaps more important, when the Army did fight in the 1980s and 1990s, it persistently struggled to implement its Combat Stress Control program effectively. That failure to prepare for lengthy conflicts and to apply the doctrine it was developing was compounded by both a shrinking of the AMEDD's mental health capabilities after the Cold War and a wider Army culture that disdained mental health care. Despite important successes, like the creation of a combat stress control doctrine and the development of a capacity to survey the well-being of deployed troops, these factors left the AMEDD largely unprepared for, and having to learn in real time how to address, many of the challenges that the twenty-first-century wars would present.
***
Four decades later, it is perhaps difficult to recall the seriousness that U.S. and NATO military planners accorded a potential Warsaw Pact invasion of Western Europe. In 1981, the RAND Corporation, the most prominent defense think tank, issued a white paper that tallied the number of Soviet divisions in East Germany, Poland, and Czechoslovakia at thirty-one, with another three dozen in the western Soviet Union.19 Although not all of these were combat ready, the balance of forces between the two sides was grim; the Warsaw Pact had more troops, more tanks, more artillery, and more aircraft.20 The report argued that, "in an attack against NATO, 36 divisions could be used in the first wave with the remaining 18 in a second echelon about 72 hours later."21 That meant that about half a million Soviet troops would pour into Western Europe in an invasion's first days.22
Given these figures, U.S. Army leaders envisioned a nightmare scenario if war came. At the first Users' Workshop on Combat Stress, Maj. Raymond Keller told the assembled psychiatrists and psychologists that such a war would be "continuous battle against a seemingly unending stream of fresh enemy forces . . . [that will] provide for no respite from combat" and that "tactical units will be walled off from their support base . . . by conventional and nuclear munitions as well as chemical agents, surrounded and then destroyed in detail."23 A year later, Maj. William H. Thornton offered a similarly bleak assessment: a Soviet invasion of Central Europe promised "perhaps the most difficult conditions ever faced by American soldiers" and that "there will be neither rest nor respite from the terrors of battle."24
However terrible they assumed the war would be, though, these presenters also assumed that it would be brief. Assessments of Soviet doctrine held that "the Soviets hope to produce a swift and sudden collapse of NATO, conquer Europe, and sue for peace" and that they would use nuclear weapons at most "after 1–2 days."25 Another prediction, based on the assumed similarities between the 1973 Arab-Israeli War and a potential war against the Soviet Union, estimated that the conflict would last three or four days.26
These assumptions shaped the Army's approach to mental health. Army psychologists and psychiatrists maintained that combat inevitably produces psychological changes and that "combat stress reactions" could be either beneficial or debilitating. A soldier, for example, might do something uncharacteristically heroic; by the same token, he might suffer a psychotic break.27 Of course, it was combat's "uncomfortable or performance degrading" effects, which they grouped under the term "battle fatigue," that worried them.28 Combating and treating battle fatigue thus became Army mental health's primary preoccupation in the Cold War's final years. Foremost, the Army sought to minimize battle fatigue casualties. Under the leadership of Gen. Maxwell Thurman, the demanding workaholic who as Army Vice Chief of Staff played a defining role in the organization's post-Vietnam recuperation, the Army focused on building healthy units, ones in which quality, well-trained soldiers trusted their leaders and colleagues, had stable relationships outside of their units, and felt prepared for their mission.29 Throughout the 1980s, Military Review: The Professional Journal of the US Army, which the Command and General Staff College at Fort Leavenworth published, featured articles with titles like "Soldiers: They Deserve Good Leadership" and "Leadership Challenges on the Nuclear Battlefield."30 Imagining a battlefield in which the Soviets would deploy tactical nuclear weapons, Lt. Col. Jeffrey L. House, the author of the latter article, argued that "an additional decrease in combat effectiveness can be anticipated from the psychological changes in soldiers" and thus that "we will need leaders who have the thinking skills, moral courage and initiative to form disciplined, cohesive units that are capable of fighting and winning in such an environment."31 Another piece argued for training under realistically stressful conditions.32 In the field, ensuring that soldiers were well rested was critical because studies showed that soldiers quickly became unable to perform their missions if they got less than four hours of sleep.33 A 1986 "Leader Actions for Battle Fatigue" graphical training aid counseled, "Never waste a chance for sleep."34
Army researchers knew, however, that battle fatigue was not entirely preventable. Rather, given the war that they expected to fight, one presenter anticipated that over half of the casualties "will be psychiatric casualties within the FIRST 48 hours," while another predicted that "high stress conditions will prevail and make every soldier susceptible of becoming a psychological casualty," to the point that "medical facilities will be overwhelmed and engaged forces depleted."35
These predictions were not, however, as dire as they might have seemed, for the prevailing medical opinion was that the condition was both short-lived and easily treatable. Central to the Army's treatment protocol were the principles of proximity, immediacy, and expectation, or PIE, which held that battle fatigue casualties stood the best chance of recovery if they were treated promptly near the front lines, rather than being evacuated to the rear, and were told that they would return to their units shortly.36 This concept, of course, was hardly original to the 1980s; it was, as Col. James Stokes, who was deeply involved in creating the Combat Stress Control doctrine, stated, "an understood and well-proved treatment" that had been combat psychiatry's driving assumption throughout the twentieth century.37 The Army's protocols in the 1980s thus followed six decades of thinking on matters of military psychiatry. The correct prescription for a soldier "staring into space . . . and unable to carry out his duties or sleep [because] one of his friends had been killed" was "sleep (sedate if necessary). Food and liquid, shower and shave, stress normalcy not illness, and do not evacuate to rear, and return to full duty."38
The Army sought to communicate this view beyond mental health providers and to train soldiers that battle fatigue was inevitable but not permanently debilitating. In 1985, for example, the First Cavalry Division's Combat Stress Course was premised on the idea that "any 'normal' soldier could be expected to become a battle fatigue casualty . . . if exposed to enough stresses in combat," but the course explained tha...