— 8 —
Tom Sjöblom
On Relevance1
One of the classical problems in the study of religion is the question of what makes a belief true for someone. For a long time two basic strategies have been used to answer this question. The first strategy, employed mostly by philosophers, has been to search for the logical conditions that make religious statements true. The problem with this approach is that in ordinary circumstances, people do not have religious beliefs because rational argumentation leads them to think that way. Instead, some individuals use rational argumentation to justify religious beliefs that they hold for other reasons (see, e.g. Shermer 2000, 59–88). The second strategy has been to argue, as scholars of culture tend to do, that we hold something to be true because we have been taught to do so by the cultural authorities of our social community (see Geertz 1973; Boyer 1994, ix). Thus, most Europeans believe in the existence of the Christian God but not in the existence of Mickey Mouse because they are coached to do so by the patterns established in their cultural environment. The problem with this approach is that people do not seem to believe in something simply because they are told to do so. Additional and more fundamental cognitive factors seem to be involved (Sperber 1975, 1996; Boyer 1990, 1994).
Cognitive anthropology and psychology have recently come up with a third solution. According to this solution, whether something is conceived as true depends on how much relevance it possesses. A highly relevant belief is more likely to be accepted as true than one with little or no relevance (Sperber and Wilson 1995). In other words, people believe in God instead of Mickey Mouse because the latter is not relevant to their lives. This approach has proven to possess more explanatory power than the two classical approaches. It has created a new problem, however, known as the “Mickey Mouse Problem,” which concerns how representations attain relevance (e.g. Atran 2002, 260).
Three possible answers have been proposed to this question. First, it can be argued that most relevance attaches to beliefs that derive from our “deepest desires” (Nichols 2004, 347–371). This solution has its roots in Freud and classical psychology of religion. The cognitive version differs from earlier ones by describing the cognitive processes involved. Thus, the most relevant beliefs appear to be those that in one way or another deal with the survival of the individual, either directly by promoting social cohesion or indirectly by offering alternative scenarios to existential questions such as what happens to us after death (see Nichols 2004, 357; Sosis and Alcorta 2003, 264–274; Bulbulia 2004, 1–32). Proponents of this approach usually think of religious representations as cognitive adaptations.
Two other approaches view religious representations rather as by-products, existing because they are possible, not because they serve some essential function (see Gould 1991, 43–6; Pinker 1997, 554–558; Boyer 2001).2 The first of these by-product approaches argues that once a belief comes into existence, it survives if it contains sufficient strategic information. Beliefs invested with strategic information are those that can be used as frames of reference in our everyday interactions. Naturally, the belief-representations with the widest frameworks contain more strategic information, which gives them greater relevance than other representations (Morton 2000, 218–237; Boyer 2001, 150–155; Barrett 2004, 46–51). The second by-product approach argues that instead of, or in addition to, cognitive relevance, what really matters is emotional commitment to some beliefs rather than others. According to this view, people have emotional attitudes towards some representations, and it is this emotional commitment that makes those representations relevant for people (Atran 2002, 260–261; Pyysiäinen 2002, 110–132; 2004, 46, 130–134). The emotional commitment does not have to be due to some inherent connection with our “deepest desires,” but can be produced by emotional stimulation (Pyysiäinen 2001, 140).
This chapter contributes to this discussion by introducing a new body of data, namely, religious narratives. Narrative intelligence and narrative discourse have been mostly neglected in the cognitive study of religion. This might simply be because narratives are complex representations that are not easily approached on the level of cognition. Or the reason might be that narratives are often connected with written texts and stories, and such extra-somatic evidence is often interpreted as somehow outside the competence area of cognitive science. The study of narrative intelligence and narrative discourse, however, has a long-standing place in cognitive science, ranging from the ground-breaking work of cognitive scientist Roger Schank to the more recent, but even more influential, theories of psychologist Jerome Bruner (see, e.g. Bruner 1986; Schank 1990; Mateas and Sengers 2002, 1–25). It is the findings and approaches of these research programs that I want to link to the theories of relevance mentioned above.
In the Beginning
The importance of narratives for religious traditions is obvious. One can go so far as to argue that outside the field of experts, religions communicate mainly through narratives. It appears to be a fundamental feature of religious systems that structural beliefs and theological doctrines are supported by narratives and that narratives also integrate different aspects of the religious system into each other and into the self-systems of the believers (Hinde 1999, 101–102).3 Narrative representation is not, of course, restricted to religious systems and belief environments. On the contrary, it appears to be fundamental for human cognition in general. Our cognition seems to be built for seeking patterns, rhythm and causal connections in our environment (see, e.g. Dennett 1991, 344–356; Guthrie 1993; Thagard 2000). Recent research on these rhythmic patterns of causal connections in human cognition—what I will refer to as the narrative drive—suggests that they are vital for our ability to plan actions and solve problems (Damasio 1994; Greenspan and Shanker 2004, 218).
Given the importance of narratives in human experience and communication, it is not surprising that story-creation and narrative intelligence have been of interest in cognitive science from the beginning. Narrativity has, however, proven to be a very elusive object of study, and our understanding of the narrative mode of thought (in contrast to the forms of narrative discourse) has begun to accumulate only during the past decade or two (see Mateas and Sengers 2002, 1–25). One of the leading scholars behind this development has been psychologist Jerome Bruner, who argues that there are two modes of cognitive functioning basic to human thought. One of the modes, the paradigmatic mode, employs categorization or conceptualization as its basic devices for ordering human experience, while the second mode, the narrative mode, works by creating mental landscapes in which present experiences, actions, and conditions are placed into relationship with what has gone on before (and sometimes with what is thought will happen in the future)—something I call the principle of analogy (Bruner 1986, 11–43; Tulving and Lepage 2000, 208–228; Gentner, Holyoak and Kokinov 2001).
In the original formulation of his argument, Bruner saw these two modes as complementary but irreducible to one another, implying a more or less equal status and independent origins for both of them (Bruner 1986, 11). More recently, however, he has stressed that it is the narrative mode that seems to be cognitively preferred, and it might even turn out to be the fundamental, obligatory basis for the paradigmatic mode to exist in the first place (Bruner 2002, 89). There are several lines of evidence that support this argument. For example, the development of narrative intelligence seems to precede that of the paradigmatic mode both historically and psychologically. On the basis of her experiments, Katherine Nelson argues that rudiments of narrative thinking—what she calls event sequences—are present in the child’s mind around the age of one. However, it takes an additional year or two for full-blown narrative thinking to be in place (Nelson 2003, 25–26; see also Holyoak and Thagard 1997, 35–44). The evolutionary origins of narrativity are much more difficult to investigate, but Michael Carrithers has presented convincing arguments for narrativity as the essential explanatory mechanism for the birth of culture, and Merlin Donald’s much more detailed and refined hypothesis of the birth of mythic culture explains the actual evolutionary dynamics involved (Carrithers 1991, 305–317; Donald 1991, 201–268). Furthermore, Donald argues that the birth of the narrative mind can be equated with the appearance of so-called archaic Homo sapiens 160,000 years ago, while the first signs of paradigmatic thought do not appear before the emergence of modern culture starting around 40,000 years ago (Donald 2001, 260–262).
Second, as Leda Cosmides and John Tooby have shown through their experiments with the Wason Selection Task, problem solving appears to be easier for us when we operate in a narrative mode (Cosmides and Tooby 1992, 163–228). Similarly, our memory also tends to prefer the narrative mode: a well-known mnemonic technique used by professional mnemonists is to create a mental narrative of the memorized entities, which greatly enhances their capacity to remember even abstract and highly paradigmatic material (see Yates 1966). Experiments by Andrea S. Heberlein and associates seem to indicate that we also prefer to describe what we see in terms of social narratives, even when we describe changes in the relationships of purely abstract entities (Heberlein et al. 1998, 1176).4 The same preference for narrativity is also evident in such prototypical examples of paradigmatic thought as scientific writing, as discussed by Misia Landau and others (Lewin 1987, 30–38; Landau 1991; Stockzkowski 2000, 187–192).
Third and finally, all the above-mentioned studies also suggest that our interactions in the social world are controlled by the narrative mode. We also know that our minds are ultimately adapted to deal with this same social world, with its challenges and interactions (Chance and Mead 1953, 395–439; Humphrey 1976, 303–317; Byrne and Whiten 1988; Whiten and Byrne 1997; Dunbar 1998, 178–190). We are fundamentally a social species with social minds. Therefore, simply in terms of the evolution of cognition, it would make good sense to think of narrativity as the basic mode of thinking. Indeed, this is basically what Robin Dunbar suggests with his version of the so-called Social Brain Hypothesis, where the origins of language and human communication are understood in terms of emotional communication (Dunbar 1996; 1998, 178–190; 2004; Dautenhahn 1999a, 59–66; 1999b, 63–90), to which I will turn in what follows.
Narratives and Emotional Communication
Narrativity as a cognitive tool should be distinguished from narratives as discourse. If, however, the narrative mode is the cognitive tool used for dealing with social situations, it is not surprising that narratives are the preferred mode of social communication. Dunbar’s starting point is his finding that people use language mainly for gossip, i.e. exchanging information on social matters. Indeed, speaking seems ...