Foreign Policy

eBook - ePub

Robert J. Lieber

  1. 506 pages
  2. English
  3. ePUB (mobile-friendly)
  4. Available on iOS and Android

About This Book

The best writing on foreign policy integrates theory and policy in ways that address the principal questions about a country's place in the world and encourage the reader to think about contemporary questions from a long-term perspective. Accordingly, the essays in this volume have been chosen with an eye to whether they represent important and original thinking and are likely to remain relevant. The authors included here represent diverse views about foreign policy and the international context in which it takes place. While two dozen pieces chosen from a vast literature can never be definitive, each of these articles offers a thoughtful, reasoned, and often eloquent assessment that is likely to remain a reference point for those seriously interested in the subject. The work is organized into five sections: how to think about foreign policy, the domestic context, foreign policy and unipolarity, foreign policy after 9/11, and foreign policy and the future.



Part I
How to Think about Foreign Policy

[1]
HYPOTHESES ON MISPERCEPTION

By ROBERT JERVIS*
In determining how he will behave, an actor must try to predict how others will act and how their actions will affect his values. The actor must therefore develop an image of others and of their intentions. This image may, however, turn out to be an inaccurate one; the actor may, for a number of reasons, misperceive both others' actions and their intentions. In this research note I wish to discuss the types of misperceptions of other states' intentions which states tend to make. The concept of intention is complex, but here we can consider it to comprise the ways in which the state feels it will act in a wide range of future contingencies. These ways of acting usually are not specific and well-developed plans. For many reasons a national or individual actor may not know how he will act under given conditions, but this problem cannot be dealt with here.

I. PREVIOUS TREATMENTS OF PERCEPTION IN INTERNATIONAL RELATIONS

Although diplomatic historians have discussed misperception in their treatments of specific events, students of international relations have generally ignored this topic. However, two sets of scholars have applied content analysis to the documents that flowed within and between governments in the six weeks preceding World War I. But the data have been put into quantitative form in a way that does not produce accurate measures of perceptions and intentions and that makes it impossible to gather useful evidence on misperception.1
The second group of theorists who have explicitly dealt with general questions of misperception in international relations consists of those, like Charles Osgood, Amitai Etzioni, and, to a lesser extent, Kenneth Boulding and J. David Singer, who have analyzed the cold war in terms of a spiral of misperception.2 This approach grows partly out of the mathematical theories of L. F. Richardson3 and partly out of findings of social and cognitive psychology, many of which will be discussed in this research note.
These authors state their case in general, if not universal, terms, but do not provide many historical cases that are satisfactorily explained by their theories. Furthermore, they do not deal with any of the numerous instances that contradict their notion of the self-defeating aspects of the use of power. They ignore the fact that states are not individuals and that the findings of psychology can be applied to organizations only with great care. Most important, their theoretical analysis is for the most part of reduced value because it seems largely to be a product of their assumption that the Soviet Union is a basically status-quo power whose apparently aggressive behavior is a product of fear of the West. Yet they supply little or no evidence to support this view. Indeed, the explanation for the differences of opinion between the spiral theorists and the proponents of deterrence lies not in differing general views of international relations, differing values and morality,4 or differing methods of analysis,5 but in differing perceptions of Soviet intentions.

II. THEORIES—NECESSARY AND DANGEROUS

Despite the limitations of their approach, these writers have touched on a vital problem that has not been given systematic treatment by theorists of international relations. The evidence from both psychology and history overwhelmingly supports the view (which may be labeled Hypothesis 1) that decision-makers tend to fit incoming information into their existing theories and images. Indeed, their theories and images play a large part in determining what they notice. In other words, actors tend to perceive what they expect. Furthermore (Hypothesis 1a), a theory will have greater impact on an actor's interpretation of data (a) the greater the ambiguity of the data and (b) the higher the degree of confidence with which the actor holds the theory.6
For many purposes we can use the concept of differing levels of perceptual thresholds to deal with the fact that it takes more, and more unambiguous, information for an actor to recognize an unexpected phenomenon than an expected one. An experiment by Bruner and Postman determined "that the recognition threshold for ... incongruous playing cards (those with suits and color reversed) is significantly higher than the threshold for normal cards."7 Not only are people able to identify normal (and therefore expected) cards more quickly and easily than incongruous (and therefore unexpected) ones, but also they may at first take incongruous cards for normal ones.
However, we should not assume, as the spiral theorists often do, that it is necessarily irrational for actors to adjust incoming information to fit more closely their existing beliefs and images. ("Irrational" here describes acting under pressures that the actor would not admit as legitimate if he were conscious of them.) Abelson and Rosenberg label as "psycho-logic" the pressure to create a "balanced" cognitive structure—i.e., one in which "all relations among 'good elements' [in one's attitude structure] are positive (or null), all relations among 'bad elements' are positive (or null), and all relations between good and bad elements are negative (or null)." They correctly show that the "reasoning [this involves] would mortify a logician."8 But those who have tried to apply this and similar cognitive theories to international relations have usually overlooked the fact that in many cases there are important logical links between the elements and the processes they describe which cannot be called "psycho-logic." (I am here using the term "logical" not in the narrow sense of drawing only those conclusions that follow necessarily from the premises, but rather in the sense of conforming to generally agreed-upon rules for the treating of evidence.) For example, Osgood claims that psycho-logic is displayed when the Soviets praise a man or a proposal and people in the West react by distrusting the object of this praise.9 But if a person believes that the Russians are aggressive, it is logical for him to be suspicious of their moves. When we say that a decision-maker "dislikes" another state this usually means that he believes that that other state has policies conflicting with those of his nation. Reasoning and experience indicate to the decision-maker that the "disliked" state is apt to harm his state's interests. Thus in these cases there is no need to invoke "psycho-logic," and it cannot be claimed that the cases demonstrate the substitution of "emotional consistency for rational consistency."10
The question of the relations among particular beliefs and cognitions can often be seen as part of the general topic of the relation of incoming bits of information to the receivers' already established images. The need to fit data into a wider framework of beliefs, even if doing so does not seem to do justice to individual facts, is not, or at least is not only, a psychological drive that decreases the accuracy of our perceptions of the world, but is "essential to the logic of inquiry."11 Facts can be interpreted, and indeed identified, only with the aid of hypotheses and theories. Pure empiricism is impossible, and it would be unwise to revise theories in the light of every bit of information that does not easily conform to them.12 No hypothesis can be expected to account for all the evidence, and if a prevailing view is supported by many theories and by a large pool of findings it should not be quickly altered. Too little rigidity can be as bad as too much.13
This is as true in the building of social and physical science as it is in policy-making.14 While it is terribly difficult to know when a finding throws serious doubt on accepted theories and should be followed up and when instead it was caused by experimental mistakes or minor errors in the theory, it is clear that scientists would make no progress if they followed Thomas Huxley's injunction to "sit down before fact as a mere child, be prepared to give up every preconceived notion, follow humbly wherever nature leads, or you will learn nothing."15
As Michael Polanyi explains, "It is true enough that the scientist must be prepared to submit at any moment to the adverse verdict of observational evidence. But not blindly.... There is always the possibility that, as in [the cases of the periodic system of elements and the quantum theory of light], a deviation may not affect the essential correctness of a proposition.... The process of explaining away deviations is in fact quite indispensable to the daily routine of research," even though this may lead to the missing of a great discovery.16 For example, in 1795, the astronomer Lalande did not follow up observations that contradicted the prevailing hypotheses and could have led him to discover the planet Neptune.17
Yet we should not be too quick to condemn such behavior. As Thomas Kuhn has noted, "There is no such thing as research without counter-instances."18 If a set of basic theories—what Kuhn calls a paradigm—has been able to account for a mass of data, it should not be lightly trifled with. As Kuhn puts it: "Lifelong resistance, particularly from those whose productive careers have committed them to an older tradition of normal science [i.e., science within the accepted paradigm], is not a violation of scientific standards but an index to the nature of scientific research itself. The source of resistance is the assurance that the older paradigm will ultimately solve all its problems, that nature can be shoved into the box the paradigm provides. Inevitably, at times of revolution, that assurance seems stubborn and pig-headed as indeed it sometimes becomes. But it is also something more. That same assurance is what makes normal science or puzzle-solving science possible."19
Thus it is important to see that the dilemma of how "open" to be to new information is one that inevitably plagues any attempt at understanding in any field. Instances in which evidence seems to be ignored or twisted to fit the existing theory can often be explained by this dilemma instead of by illogical or nonlogical psychological pressures toward consistency. This is especially true of decision-makers' attempts to estimate the intentions of other states, since they must constantly take account of the danger that the other state is trying to deceive them.
The theoretical framework discussed thus far, together with an examination of many cases, suggests Hypothesis 2: scholars and decision-makers are apt to err by being too wedded to the established view and too closed to new information, as opposed to being too willing to alter their theories.20 Another way of making this point is to argue that actors tend to establish their theories and expectations prematurely. In politics, of course, this is often necessary because of the need for action. But experimental evidence indicates that the same tendency also occurs on the unconscious level. Bruner and Postman found that "perhaps the greatest single barrier to the recognition of incongruous stimuli is the tendency for perceptual hypotheses to fixate after receiving a minimum of confirmation.... Once there had occurred in these cases a partial confirmation of the hypothesis ... it seemed that nothing could change the subject's report."21
However, when we apply these and other findings to politics and discuss kinds of misperception, we should not quickly apply the label of cognitive distortion. We should proceed cautiously for two related reasons. The first is that the evidence available to decision-makers almost always permits several interpretations. It should be noted that there are cases of visual perception in which different stimuli can produce exactly the same pattern on an observer's retina. Thus, for an observer using one eye the same pattern would be produced by a sphere the size of a golf ball which was quite close to the observer, by a baseball-sized sphere that was further away, or by a basketball-sized sphere still further away. Without other clues, the observer cannot possibly determine which of these stimuli he is presented with, and we would not want to call his incorrect perceptions examples of distortion. Such cases, relatively rare in visual perception, are frequent in international relations. The evidence available to decision-makers is almost always very ambiguous since accurate clues to others' intentions are surrounded by noise22 and deception. In most cases, no matter how long, deeply, and "objectively" the evidence is analyzed, people can differ in their interpretations, and there are no general rules to indicate who is correct.
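The retinal-image point rests on a simple visual-angle relation. The following is only an illustrative calculation with assumed, approximate diameters and distances (golf ball about 4.3 cm, baseball about 7.4 cm, basketball about 24 cm), not figures taken from the text: spheres whose diameter-to-distance ratios are equal subtend roughly the same visual angle and so project the same pattern on a single retina.

\[
\theta \approx \frac{d}{r}, \qquad
\frac{4.3\ \text{cm}}{1.0\ \text{m}} \approx \frac{7.4\ \text{cm}}{1.7\ \text{m}} \approx \frac{24\ \text{cm}}{5.6\ \text{m}} \approx 0.043\ \text{rad} \approx 2.5^{\circ}
\]

On these assumed figures, a nearby golf-ball-sized sphere, a more distant baseball-sized sphere, and a still more distant basketball-sized sphere are, without further cues, indistinguishable to one eye.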
The second reason to avoid the label of cognitive distortion is that the distinction between perception and judgment, obscure enough in individual psychology, is almost absent in the making of inferences in international politics. Decision-makers who reject information that contradicts their views—or who develop complex interpretations of it—often do so consciously and explicitly. Since the evidence available contains contradictory information, to make any inferences requires that much information be ignored or given interpretations that will seem tortuous to those who hold a different position.
Indeed, if we consider only the evidence available to a decision-maker at the time of decision, the view later proved incorrect may be supported by as much...
