A Journey into Open Science and Research Transparency in Psychology
  1. 200 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android
About this book

A Journey into Open Science and Research Transparency in Psychology introduces the open science movement in psychology through a narrative that integrates song lyrics, national parks, and concerns about diversity, social justice, and sustainability. Along the way, readers receive practical guidance on how to plan and share their research in keeping with the ideals of scientific transparency.

This book considers all the fundamental topics related to the open science movement, including: (a) causes of and responses to the Replication Crisis, (b) crowdsourcing and meta-science research, (c) preregistration, (d) statistical approaches, (e) questionable research practices, (f) research and publication ethics, (g) connections to career topics, (h) finding open science resources, (i) how open science initiatives promote diverse, just, and sustainable outcomes, and (j) the path moving forward. Each topic is introduced using terminology and language aimed at intermediate-level college students who have completed research methods courses. But the book invites all readers to reconsider their research approach and join the Scientific Revolution 2.0. Each chapter describes the associated content and includes exercises intended to help readers plan, conduct, and share their research.

This short book is intended as a supplemental text for research methods courses or just a fun and informative exploration of the fundamental topics associated with the Replication Crisis in psychology and the resulting movement to increase scientific transparency in methods.

A Journey into Open Science and Research Transparency in Psychology by Jon Grahe is available in PDF and ePUB formats, catalogued under Psychology & History & Theory in Psychology.

1
A Replication Crisis

Responses Benefit Personal Workflow

Chapter 1 Objectives

  • Define the Replication Crisis
  • Introduce causes of the Replication Crisis
  • Conceptualize a diverse, just, and sustainable lens for science
  • Explain why national parks are useful contextual examples
  • Describe the open science movement
  • Introduce tools and topics described later
  • Introduce the book's research question

Music in a Box: About the Song "Replication Crisis"

The first song contains background information about the Replication Crisis, a series of events leading to major questions about the reproducibility of scientific findings. The lyrics introduce the setting of the album and consider some of the issues and problems that led to the crisis, ending with a nod to some early responses to the situation. This is the only song that names specific people and cheers them on for their part in initiating some changes to increase scientific transparency. The song offers a good list of scientists who pioneered open science for anyone who wants to look up their work. This song sounds like classic rock, but the minor key reminds the listener of the conflict of the crisis.

What Was the Replication Crisis?

This research methods journey toward scientific transparency began for many with the Replication Crisis, or "crisis of confidence," that emerged in the 2010s. Because many have been traveling this path for more than a decade, there are multiple retellings of the causes and consequences of this crisis (see Shrout & Rodgers, 2018). This book is personal in nature; the story is shared from my own experience within the crisis. This limited scope yields a briefer description but is not meant to privilege my singular narrative. Rather, the hope is that readers will face these questions from their own perspective and that this narrative will spark that interest.
In short, the Replication Crisis reflected concerns that published findings in peer-reviewed journals could not be replicated. Think on that problem for a moment. Textbooks, mental health treatments, educational interventions, and even public policies are drawn from research that is published using peer review. If the published findings cannot be trusted, then all the conclusions are suspect. The many causes, some of which will be explored in more detail later in the book, are examined in depth in a series of Special Sections in Perspectives on Psychological Science (v. 7, #6, November 2012; v. 8, #4, July 2013; v. 9, #1, January 2014; v. 9, #3, May 2014). The first, entitled "Replicability in Psychological Science: A Crisis of Confidence," introduces the problem (Pashler & Wagenmakers, 2012; Pashler & Harris, 2012), potential explanations for why the problem existed (Makel, Plucker, & Hegarty, 2012; Bakker, van Dijk, & Wicherts, 2012; Ferguson & Heene, 2012; Giner-Sorolla, 2012; Klein et al., 2012; Neuroskeptic, 2012; Ioannidis, 2012), and recommendations for solutions (Frank & Saxe, 2012; Grahe et al., 2012; Koole & Lakens, 2012; Nosek, Spies, & Motyl, 2012; Wagenmakers, Wetzels, Borsboom, van der Maas, & Kievit, 2012). Across these manuscripts, one might draw a short list of causes: (a) publication bias that favors novel and unusual findings over replication and confirmatory research, (b) reward structures that favor long publication records, and (c) poor reporting standards. Each of these is itself complex, with multifaceted causes, but the outcome is that research reports with flashy findings receive the greatest attention from both readers and researchers. The problem is that striving for those findings led to particularly inadequate practices in science.

Potential Causes of the Crisis

These bad practices were highlighted in major events that occurred in 2011; researchers often refer to 2011 as "the year of the crisis." Before these events, there was little concern about these problems in psychology, though some were voicing alarms more generally (Ioannidis, 2005). I myself had been pushing for reform for two years before these events, but no one really cared. After the year of crisis, I finally had an audience willing to help with "Harnessing the Undiscovered Resource of Student Research Projects" (see Grahe et al., 2012). Here is a brief description of two events that illuminated the Replication Crisis.
The most egregious affront to psychological science that alarmed the field in 2011 was the discovery that Diederik Stapel had falsified data in more than 40 published papers (Stroebe, Postmes, & Spears, 2012). This researcher was extremely influential, and his work is cited in many papers and textbooks. Over time, the lure of publication overwhelmed his ethics, and he began writing results sections with fabricated numbers. The papers were well written and interesting, but the findings were fiction.
Certainly, this man is not the only one who made up data or committed other forms of academic dishonesty. More critically, this example highlights a few problems with scientific reporting that need fixing. First, science reporting is built on trust. When manuscripts are submitted for peer review, reviewers are tasked with challenging the authors’ rationale and methodology. They are expected to review and consider the results, but they are not expected to rerun analyses or review the quality of the data. While a reviewer might disagree with an author, authors’ intentions are rarely questioned. This event highlights that in some circumstances, bad data and conclusions are due to willful disregard for scientific ethics.
However, another crisis event illustrates how bad science can emerge from good intentions. Daryl Bem published a paper in 2011 purporting to demonstrate precognition (a form of parapsychological activity). Though many papers report the existence of parapsychological activity, this one appeared in the Journal of Personality and Social Psychology, one of the most prominent journals in social psychology. Further, Daryl Bem is a prominent social psychologist, which lent the findings credibility. Readers who believe in ghosts, goblins, astrology, tarot cards, and mind reading might be surprised to learn that this publication led to an uproar. Researchers demanded to see the data as they began to highlight many reporting issues evident in the manuscript. To his credit, Bem shared the data and did not argue strongly with the criticisms.
This second event introduces a number of related publication bias problems. Beyond having a sensational topic and a prominent author whose work editors might be inclined to publish, the research was not maliciously reported. Bem did not intend to mislead or lie. Instead, his error was that he engaged in a series of questionable research practices, most commonly described as hypothesizing after the results are known, or HARKing (Kerr, 1998), and p-hacking (Simmons, Nelson, & Simonsohn, 2011).
More critically, Bem was one of the researchers who taught the field how to use these practices effectively. In a book chapter about publishing an empirical article, Bem (2000) explains to future authors that "There are two possible articles you can write: (a) the article you planned to write when you designed the study or (b) the article that makes the most sense now that you have seen the results" (p. 4). Bem argues that the correct answer is (b). Among otherwise good writing advice, Bem posits that the author should not bother a reader with the many pitfalls of the research process. He suggests that rather than keeping a failed hypothesis in an introduction after conducting analyses, authors should rewrite the manuscript with new hypotheses and background literature that justify the findings that did emerge in the data. This is the definition of HARKing, but Bem argued that it was preferable to present a clear and straightforward story rather than distract the reader with errors made by the researcher. Later, in Chapters 3 and 5, this topic of massaging data to find effects, or p-hacking, and how to avoid it will be explored in more detail. For now, these events highlight that the challenges facing science were complex, while others would demonstrate that questionable research practices were both pervasive and systematic (Simmons et al., 2011; Bakker & Wicherts, 2011).
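The core statistical point behind p-hacking can be demonstrated in a few lines. The following simulation is not from the book; it is a minimal sketch of the argument made by Simmons et al. (2011): under the null hypothesis a p-value is uniformly distributed, so an analyst who runs several analyses and reports whichever one "worked" inflates the false-positive rate well beyond the nominal 5%.

```python
import random

random.seed(1)

def run_study(n_tests: int, alpha: float = 0.05) -> bool:
    # Under the null hypothesis, each test's p-value is Uniform(0, 1).
    # A "flexible" analyst declares success if ANY of the tests comes out
    # significant -- e.g., trying several outcome measures or covariates.
    return any(random.random() < alpha for _ in range(n_tests))

n_studies = 100_000
honest = sum(run_study(1) for _ in range(n_studies)) / n_studies
hacked = sum(run_study(5) for _ in range(n_studies)) / n_studies

print(f"False-positive rate, one planned test:      {honest:.3f}")  # ~0.05
print(f"False-positive rate, best of five analyses: {hacked:.3f}")  # ~0.23
```

With five undisclosed analytic attempts, the chance of a spurious "finding" is 1 - 0.95^5 ≈ 0.23, more than four times the advertised error rate; this is why preregistering analyses (Chapter 3) matters.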
To understand these events, it is useful to remember that the tools that make science easy to share are fairly recent. At the beginning of the new millennium, scientific manuscripts were still being submitted as hard copies, and journals published all materials in print; there were no online journals or supplemental materials. Given the cost of mailing documents and printing pages, asking authors to also share data and materials was prohibitively expensive. Further, the drive toward shorter reports and, consequently, less stringent reporting standards was made in part to offer more publication opportunities to more authors and to help disseminate findings more broadly.
Regardless of the causes, this is a good moment to remind the reader that though the crisis was publicly noted in social psychology, and many of the solutions were tested in social psychological research, the problem of publication bias and poor replicability pervades all fields of science, as suggested by Ioannidis (2005), who argued that most published research findings are false. In the decade that followed, many others recognized the need to change our approach to science, both in other psychological disciplines and across the social and natural sciences.
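Ioannidis's claim follows from a simple calculation, sketched below (the formula is his; the specific numbers plugged in are illustrative assumptions, not figures from this book). The positive predictive value (PPV) of a significant result depends on the pre-study odds R that a tested effect is real and on statistical power:

```python
def ppv(prior_odds: float, power: float, alpha: float = 0.05) -> float:
    # Positive predictive value of a "significant" finding
    # (Ioannidis, 2005): PPV = R(1 - beta) / (R(1 - beta) + alpha),
    # where R is the pre-study odds that a tested effect is real
    # and (1 - beta) is statistical power.
    true_positives = prior_odds * power
    false_positives = alpha
    return true_positives / (true_positives + false_positives)

# Illustrative values: an exploratory field where roughly 1 in 11 tested
# effects is real (R = 0.1) and power is a typical ~0.35.
print(f"{ppv(prior_odds=0.1, power=0.35):.2f}")  # 0.41
```

Under these assumptions, only about 41% of significant findings reflect real effects, and publication bias in favor of significant results then fills journals disproportionately with the false ones. Higher prior odds and higher power push the PPV back up, which is part of the case for confirmatory, well-powered replication research.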

Why the Replication Crisis Does Not Matter

During 2017–2018, I completed a Crisis Schmeisis Tour to Increase Scientific Transparency. In almost 50 speaking engagements and meetings, I began my persuasive arguments with the position that it does not matter whether there is a replication crisis in the field. Finding errors in methodology and correcting them is the purpose of the scientific method. A good scientist avoids treating any claim as settled truth, because the basic assumption is that our knowledge is only the best available representation of truth, not the truth itself. From that perspective, one would expect publication errors, and our job is not to debate why they exist but, rather, to do better science. This debate yielded tools and calls for change that will improve science and benefit the researcher at the same time.
While others continue to debate which effects may or may not be generalizable, or whether replication efforts are appropriate or sufficient, my position has been and continues to be that there is greater benefit in learning new ways to be more transparent than in debating. Future scientific efforts will demonstrate which effects generalize, but only if we move forward. This book focuses on this goal by introducing the reader to new tools and methods for conducting more transparent science. These tools include (a) new, free computer programs and software that make it easy to share plans, materials, and data; (b) research opportunities that collate resources and researchers to conduct more powerful research; and (c) reward structures that offer different paths to success. These tools are introduced through lyrics intended for both amusement and deep learning. Where possible, the examples consider the context of diversity, social justice, and sustainability while considering national parks. The objective of the examples is to connect the research methods content to ongoing social struggles with meaningful impact for the reader.

Diversity, Social Justice, Sustainability, and Science

Recently, I considered psychological research through a diversity, social justice, and sustainability lens (Grahe, 2019). In that editorial, I used the metaphor of the "ideal" US national identity to describe and characterize how perspectives considering diversity, social justice, and sustainability differ from each other and yet need to be considered as a whole when conducting and advancing science. By comparing the "US as a melting pot," "US as a salad bowl," and "US as a stewpot" approaches to the ideal integration of diverse cultures, I showed how diversity represents the various approaches and responses to those approaches, social justice represents who gets to determine policies that encourage these identities and how they are applied to people, and sustainability is the consideration of human and material resources in developing a...

Table of contents

  1. Cover
  2. Half Title
  3. Title
  4. Copyright
  5. Contents
  6. Preface
  7. Chapter 1 A Replication Crisis: Responses Benefit Personal Workflow
  8. Chapter 2 Go Forth and Replicate: Making Methods for Others
  9. Chapter 3 Preregistered: Determining Answers to Decisions Before They Happen
  10. Chapter 4 Decision Heavyweights: Drawing Inference With Confidence
  11. Chapter 5 Ode to p-Hacking: Making Decisions Before They Happen
  12. Chapter 6 You Can’t Plagiarize Yourself: Avoiding Errors With Ethical Writing
  13. Chapter 7 Becoming a Second Stringer: Why Good People Do Replication Science
  14. Chapter 8 Open Science Alphabet: Learning to Read
  15. Chapter 9 Progress: Open Science Promotes Diverse, Just, and Sustainable Outcomes
  16. Chapter 10 Scientific Transparency: A Theme for a Movement
  17. Index