Working Memory in Perspective
About this book

The Baddeley and Hitch (1974) Working Memory model holds a central place in experimental psychology and continues to be extremely successful in guiding and stimulating research in applied and theoretical domains. Yet the model now faces challenges from conflicting data and competing theories. In this book, experienced researchers in the field address the question: Will the model survive these challenges? They explain why it is so successful, evaluate its weaknesses with respect to opposing data and theories and present their vision of the future of the model in their particular area of research. The book includes a discussion of the "Episodic Buffer" component which has recently been added to the working memory model.
The result is a comprehensive and critical assessment of the working memory model and its contribution to current research in human cognition, cognitive development, neuroscience and computational modelling. Furthermore, this collection serves as a case study to illustrate the range of factors that determine the success or failure of a theory and as a forum for discussing what researchers want from scientific theories. The book begins with an accessible introduction to the model for those new to the field and explains the empirical methods used in working memory research. It concludes by highlighting areas of consensus and suggesting a programme of research to address issues of continuing controversy. Working Memory in Perspective will be a valuable resource to students and researchers alike in the fields of human memory, language, thought and cognitive development.


Part I
Introduction

1 An introduction to working memory

Jackie Andrade

Ulric Neisser defined cognition as ‘all the processes by which the sensory input is transformed, reduced, elaborated, stored, recovered, and used’ (Neisser, 1967, p. 4). The concept of ‘working memory’ refers to a set of structures or processes intimately involved in many of these operations, making it a cornerstone of cognitive psychology. Understanding how we temporarily store and process information is fundamental to understanding almost all other aspects of cognition. In 1974, Baddeley and Hitch proposed a model of working memory that comprised separate, limited-capacity storage and processing components. Many different models have been put forward since, but Baddeley and Hitch’s model remains extremely influential not only in cognitive psychology but also in neuroscience and developmental psychology. It has been particularly successful as a tool for exploring cognition outside the laboratory, helping explain data and generate new hypotheses in fields as diverse as mental imagery, language acquisition, and learning disability.
This book offers a case study of research driven by the working memory model proposed by Baddeley and Hitch (1974) and updated by Baddeley in 1986 (hereafter referred to as the WM model). It focuses on a single theory because I was impressed by the variety of research for which the WM model is used, and by the fact that researchers are using this model despite dissatisfaction with some aspects of it. With the exception of Jon May and Geoff Ward, who provide contrasting theoretical perspectives, the contributing authors were selected because they use the WM model in their day-to-day research, but are not the original authors of the model. I felt that they were thus the best people to give an objective critique of the model, a sense of what it is like to use the model to guide psychological research, and an evaluation of the likely future of the model. In the hands of a new generation of researchers, will the WM model gradually run out of steam and be superseded by competing theories or will it continue to go from strength to strength?
I asked contributors to assess their satisfaction with the WM model by considering the extent to which its success in generating new data or explanations outweighed the challenge posed by contradictory data or competing theories. Each author answered four questions:
  1. What are the strengths of the WM model for your research?
  2. What are its weaknesses?
  3. How does it compare with competing models in your field?
  4. What is the future of the model in your line of research?
Research using the WM model has been so productive that it is becoming increasingly difficult for a single researcher to keep track of all the new findings in the field and thereby to assess the balance between the usefulness of the model and the contradictory evidence. Collectively, the answers to questions 1 and 2 provide an up-to-date assessment of the weight of evidence for and against the model. Question 3 sets the WM model in a broader theoretical context. Some of its competitors are other models of working memory, some are models with quite a different focus, for example Kosslyn’s (1994) model of imagery, or a different level of explanation, for example Barnard’s (1999) Interacting Cognitive Subsystems. Question 4, about the future of the WM model, asks authors to assess the relative importance of the strengths, weaknesses, and competing models for future WM research. Overall, the questions provide a coherent thread through the book and their answers constitute an in-depth analysis of the role of the WM model in current psychological research.
The book is organised in four parts. This introductory chapter in the first part explains the historical context of the WM model and summarises the current evidence supporting it. It also describes the various methodologies employed in working memory research, which form the basis for the research discussed in subsequent chapters. For readers new to the working memory field, this introduction sets the scene for the subsequent chapters; the background reading sections at the ends of the chapters suggest starting points for additional reading. The main body of the book is broadly divided into applied and theoretical approaches. Part II on Applied Perspectives illustrates the use of the WM model as a conceptual tool for guiding research into other aspects of cognition. Chapters in the Applied Perspectives part evaluate the model’s contribution to research in mental imagery, consciousness, neuroimaging, language acquisition, and individual differences in cognition across typical and atypical lifespan development. The third part, Theoretical Perspectives, assesses some new data and alternative theoretical approaches which challenge the WM model and the assumptions on which it was built. Chapters in this part evaluate explanations of verbal short-term memory phenomena in terms of working memory, and provide a commentary on the concept of the central executive. The division into applied and theoretical research is somewhat arbitrary because the two are mutually informative. Applied research has contributed to the evolution of WM theory and theoretical developments have influenced applied research. The concluding part uses the authors’ answers to the four questions to conclude that the WM model remains a viable framework for applied and theoretical research, but that several weaknesses in the model must be addressed if it is to remain useful in the future.
A programme is outlined for future research to address these weaknesses while retaining the strengths of the WM model. The book concludes with a discussion of the implications of the newly proposed ‘episodic buffer’ component of the WM model.

THE HISTORY OF WORKING MEMORY

Working memory refers to a system that enables temporary storage of the intermediate products of cognition and supports transformations of those products. Reviewing competing contemporary theories of working memory, Richardson concludes that they share the assumption that ‘there is some mechanism responsible for the temporary storage and processing of information and that the resources available to this mechanism are limited’ (1996, p. 23). Miyake and Shah suggest there is consensus among working memory researchers that ‘Working memory is those mechanisms or processes that are involved in the control, regulation, and active maintenance of task-relevant information in the service of complex cognition’ (1999, p. 450).
The roots of working memory are in theories of short-term memory that focused on the temporary storage of information, rather than on the role that temporary storage or transformation played in general cognition. They aimed to explain phenomena such as the particularly good recall for the last items in a list (the recency effect) and the difficulty of verbatim immediate recall of more than a few items (the limited capacity of short-term memory). Nonetheless, some early authors discussed short-term memory as a system for holding information that was currently in use by other cognitive processes. For example, Atkinson and Shiffrin (1968) argued that ‘The short-term store is the subject’s working memory’ (p. 90) and that transfer of information from long-term storage into the short-term store occurred ‘in problem solving, hypothesis testing, and “thinking” in general’ (p. 94). This section explains the development of the Baddeley and Hitch WM model from earlier theories of short-term memory.

Short-term memory

William James (1918) distinguished between primary memory, i.e., our continued awareness of what has just happened or the ‘feeling of the specious present’ (1918, p. 647), and secondary memory, i.e., ‘knowledge of a former state of mind after it has already once dropped from consciousness’ (p. 648). Hebb (1949) suggested a neural mechanism for this binary memory system, primary memory being the result of temporarily reverberating electrical circuits in the brain and secondary memory reflecting permanent synaptic changes. Burgeoning interest in computers influenced memory research in two ways, by providing a new language for describing memory structures and functions (‘hardware’ versus ‘software’ or ‘processing’) and by raising new questions. In particular, interest in information theory (Shannon & Weaver, 1949) encouraged people to think about how incoming sensory information is processed, what limits how much information can be processed at any time, and what determines the chance of information being retained in long-term memory. For example, Broadbent (1958) explained the difficulty of attending to more than one stream of information at once by proposing a tripartite information-processing system. The S system temporarily stored parallel streams of sensory information, feeding them via a selective filter into the limited-capacity P system where they were stored briefly or transferred again to output mechanisms or a much larger capacity long-term store. Information could be maintained in immediate memory by a rehearsal loop which repeatedly transferred information between the P and S systems. The new computer models made explicit the need to buffer information being used in current computations, and the likelihood that the capacity to do this was limited. The computer analogy thus reinforced James’ assumption of separate primary and secondary memory systems.
Broadbent’s model stimulated research into the structure of memory, with subsequent models typically including a limited capacity short-term memory system as the route into permanent memory (Murdock, 1974). Two of the most influential accounts will be mentioned briefly here. Waugh and Norman (1965) developed a quantitative model of the function of James’ primary memory, in which recall probability was a function of the number of intervening items. A recently perceived stimulus could be represented in the primary and secondary stores simultaneously, could be transferred into secondary memory by rehearsal, and would be displaced from primary memory by subsequent stimuli if not rehearsed. Whereas James described primary memory in terms of temporal duration, Waugh and Norman described it in terms of the limited number of events it could store. Atkinson and Shiffrin (1968) included a similar short-term store in their model of memory. Incoming sensory information entered the short-term store, a limited-capacity, temporary storage system, via sensory registers. Rehearsal processes copied, or ‘transferred’, information from the short-term store into a long-term store which was relatively permanent and unlimited in capacity. Atkinson and Shiffrin explicitly assumed that the short-term store functioned as a working memory, a buffer for information being used in complex cognitive activities, but their paper did not address this aspect of memory.
Atkinson and Shiffrin cited two reasons for distinguishing between short-term and long-term storage structures. First, Milner (1966) reported amnesic patients with intact short-term memory function despite having impaired long-term memory, suggesting that the two types of memory depended on different anatomical structures. Second, they argued that it was more parsimonious to explain the recency effect in terms of two memory stores than one. The recency effect refers to the preferential recall of the last items in free recall of a supra-span list (e.g., Murdock, 1962). Atkinson and Shiffrin attributed it to the persistence of the last list items in short-term storage, from which they could be rapidly and accurately retrieved. An interpolated task, such as mental arithmetic, abolished the recency effect (Glanzer & Cunitz, 1966; Postman & Phillips, 1965) because it used representations that displaced the recency items from their slots in the short-term store.
Atkinson and Shiffrin’s model was consistent with contemporary neuropsychological and experimental data, and also with the introspection data, eloquently described by James, that our sense of ‘the specious present’ (1918, p. 647) or ‘just past’ (1918, pp. 646–647) is different from our long-term store of knowledge and memories. However, although the two-store view of memory was popular and offered a relatively parsimonious explanation of the data, there were also strong arguments against the view. Melton (1963) argued that the apparent dissociation between short-term and long-term storage systems could be explained in terms of interference within a single memory system. Norman (1968) argued that short-term and long-term memory phenomena could arise from a single storage system, and indeed that they must arise from a single system because perceptual identification of familiar stimuli could not be so rapid if sensory information only gained access to representations in long-term storage after being processed in a short-term store. In Norman’s scheme, short-term memory results from temporary excitation of stored representations, which can be accessed or triggered by incoming sensory information, whereas long-term memory results from permanent excitation of stored information.

Problems with the short-term store → long-term store model of memory

Empirical problems for Atkinson and Shiffrin’s model came from two of the sources that originally supported their distinction between short-term and long-term storage. In neuropsychology, Warrington and Shallice (1969; Shallice & Warrington, 1970) reported a patient, KF, with the converse pattern of memory impairment to that shown by Milner’s amnesic patients. KF’s ability to learn word lists and remember stories was normal, but his digit span was only two or three items, well below the norm of seven plus or minus two items (Miller, 1956). This neat dissociation supports the hypothesis of separate memory systems underlying short-term and long-term memory, but is completely inconsistent with Atkinson and Shiffrin’s claim that a unitary short-term store is the route into long-term storage. Free recall also proved problematic when Tzeng (1973; see also Bjork & Whitten, 1974) demonstrated that, although the recency effect was abolished by a period of counting backwards between the end of the list and the start of recall, it could be reinstated by interpolating backward counting between every list item. Tzeng’s finding was not compatible with an account that assumed recency items were well recalled because they had not yet been displaced from a limited-capacity short-term store.
Craik and Watkins (1973) challenged the assumption that each rehearsal of an item increased its probability of transfer into long-term storage. They asked participants to listen to a list of words and remember the last word beginning with a specified letter. They manipulated the length of time for which a word had to be remembered before being superseded by another word starting with the same letter. For example, if participants were asked to remember words beginning with ‘g’, they would rehearse grain for longer than garden in the following list: daughter, oil, garden, grain, table, football, anchor, giraffe, pillow, thunder. On a surprise recall test, participants remembered the most rehearsed words no better than they remembered the least rehearsed words, suggesting that mere maintenance rehearsal does little to promote long-term retention of information.
There was little empirical support for Atkinson and Shiffrin’s assumption that the short-term store functioned as a working memory. Atkinson and Shiffrin did not address this issue experimentally and those who did found evidence that learning and recall tasks require general processing resources (e.g., Murdock, 1965) but no evidence that they loaded short-term memory. For example, Patterson (1971) found no disruption of recall when participants counted backwards between each item recalled, a task that should have disrupted any retrieval plans held in the short-term store.

Working memory

Baddeley and Hitch (1974) directly tested the assumption that the limited-capacity short-term store functioned as a working memory, i.e., that it supported general cognition by processing as well as storing information. They noted that short-term memory research typically used two tasks, immediate serial recall and free recall, with the recency portion of the free recall curve being assumed to reflect retrieval from short-term storage. Although there were discrepancies in the data from these two tasks, they consistently pointed to the short-term store having limited capacity. Baddeley and Hitch therefore took limited capacity to be the defining characteristic of short-term memory, and test...

Table of contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Figures
  5. Tables
  6. Contributors
  7. Preface
  8. Foreword
  9. Acknowledgements
  10. Part I: Introduction
  11. Part II: Applied perspectives
  12. Part III: Theoretical perspectives
  13. Part IV: Conclusion