Practical Approaches to Applied Research and Program Evaluation for Helping Professionals
eBook - ePub

Casey A. Barrio Minton, A. Stephen Lenz

260 pages · English · ePUB (mobile friendly)

About This Book

Practical Approaches to Applied Research and Program Evaluation for Helping Professionals is a comprehensive textbook that presents master's-level counseling students with the skills and knowledge they need to successfully evaluate the effectiveness of mental health services and programs.

Each chapter, aligned with 2016 Council for Accreditation of Counseling and Related Educational Programs (CACREP) standards, guides counseling students through study design and evaluation fundamentals that will help them understand existing research and develop studies to best assess their own applied research questions. Readers will learn the basics of research concepts as applied to evaluative tasks, the art of matching evaluative methods to questions, specific considerations for practice-based evaluative tasks, and practical statistical options matched to practice-based tasks.

Readers can also turn to the book's companion website to access worksheets for practitioner and student planning exercises, spreadsheets with formulas for basic data analysis, a sample database, PowerPoint outlines, and discussion questions and activities aligned to each chapter.


Information

Publisher: Routledge
Year: 2019
ISBN: 9781351611534
Edition: 1

Part I
Embarking on an Inquiry

Chapter 1

Where Science Meets Practice

This chapter introduces the importance of applied research and program evaluation within counseling and related helping professions, with special attention to the importance of evidence-informed practice and scientist-practitioner models in professions often focused on abstract human relationships. We discuss development of the accountability era and introduce evidence-based practice and hierarchies of evidence in an attempt to illustrate how research is important to practice and how practice is important to research. The chapter concludes with an overview of this book, including strategies for optimizing learning.

Box 1.1 CACREP Standards and Chapter Learning Outcomes

CACREP 2016 Standards

  • 2.F.8.a the importance of research in advancing the counseling profession, including how to critique research to inform counseling practice
  • 2.F.8.b identification of evidence-based counseling practices
  • 6.B.1.d evidence-based counseling practices

Chapter Learning Outcomes

  • Deepen understanding regarding the important role of research and program evaluation in ensuring ethical and effective practice
  • Conceptualize hierarchies of evidence used to support evidence-based practices

Professional Identity and Foundations

Professional counselors and related mental health professionals are charged with facilitating healing and optimal functioning for the most vulnerable of populations. Although the degree of focus may vary across professions and professionals, there is widespread agreement that such relationships and processes must attend to both art and science. Certainly, integration of science in practice involving complex human experiences and relationships is an art.
Research and program evaluation have been of long-standing interest in mental health professions. The earliest and most famous counseling theorists—players such as Freud, Skinner, and Rogers—were scientists who developed and modified theories based on rigorous inquiry. As theorists began building silos of research, other scholars began considering common factors that cut across approaches in hopes they could help us better understand how counseling works and, in turn, how counselors best engage the science of counseling. Consider this excerpt from over 50 years ago:
The more we learn from our research, the more acutely aware we become of the limitations and inadequacies of our current theoretical formulations. Individual schools of counseling and psychotherapy cannot account for the multitude of variables which in all probability ultimately will constitute the process of psychotherapeutic personality change. However, these varied approaches have proved themselves to be of research value in the past. Any attempted formulations in the future should, therefore, leave themselves open to their potential contributions to research in the therapeutic process. Indeed, common elements, stemming from, yet cutting across these various theoretical approaches, have already demonstrated significant heuristic meaning.
(Truax & Carkhuff, 1964, p. 860)
More recently, scholars concluded:
Despite the field’s love affair with technique, nearly a half century of empirical investigation has revealed that the effectiveness of psychotherapy resides not in the many variables that ostensibly distinguish one approach from another. Instead, the success of treatment is principally found in the factors that all approaches share in common.
(Duncan, Miller, Wampold, & Hubble, 2010, p. xxvii)
Research and program evaluation are foundational to the work of mental health professionals. In professional counseling, the Council for Accreditation of Counseling and Related Educational Programs (CACREP) includes research and program evaluation as one of eight long-standing core curricular areas for master’s-level preparation. The American Counseling Association (2014) Code of Ethics requires counselors to work with clients to develop “counseling plans that offer reasonable promise of success” (p. 4) and counselor educators to “promote the use of techniques/procedures/modalities that are grounded in theory and/or have an empirical or scientific foundation” (p. 14).
Counseling psychologists have long held the scientist-practitioner or “Boulder Model” of integrating science and practice as a critical component of their professional identity (Vespia & Sauer, 2006), and American Psychological Association (www.apa.org) accreditation standards require substantial attention to research. APA’s Ethical Principles of Psychologists and Code of Conduct (2017) requires that “psychologists’ work is based on established scientific and professional knowledge of the discipline” (Standard 2.04).
Finally, the Council on Social Work Education (www.cswe.org) highlights the importance of research as one of nine professional competencies for social workers, specifically Competency 4: Engage in Practice-informed Research and Research-informed Practice. According to the 2015 standards, bachelor’s- and master’s-level
[s]ocial workers understand quantitative and qualitative research methods and their respective roles in advancing a science of social work and in evaluating their practice. Social workers know the principles of logic, scientific inquiry, and culturally informed and ethical approaches to building knowledge. Social workers understand that evidence that informs practice derives from multi-disciplinary sources and multiple ways of knowing. They also understand the processes for translating research findings into effective practice.
(Council on Social Work Education, 2015, p. 8)
Certainly, ethical codes and training standards highlight the use of research as a critical foundation for practice. These requirements reflect both the internal development of mental health professions and the external realities of practice in what some call the era of accountability.

Box 1.2 Ethical Responsibility to Ensure Effective Practice

Non-maleficence: Do no harm. Medical and mental health professionals alike recognize their responsibility to protect those they serve from intentional and unintentional harm. We are also guided by a number of other ethical principles. Beneficence refers to doing good or making a positive impact. It is not enough to exist in a neutral state of not doing harm. We must also do good.
From a philosophical and practical standpoint, we might argue that failure to do good may be harmful. At a most basic level, failure to do good wastes precious time, energy, and financial resources. More philosophically, failure to do good could send an unsavory message regarding a consumer’s hope for the future and the value of the professions we hold dear.
If we agree that doing no harm is a necessary, but not sufficient, bar for practice, we must turn attention to how we know we are doing good. How do we know our programs and practices are having their intended effects? How do we understand opportunities to minimize potential for harm among our most vulnerable of clients? Whatever your theoretical or philosophical foundations, we hope you agree that the answer lies in developing an ability to evaluate our programs and practices in an effort to continually grow, improve, and reach toward our ethical striving to do good.

The Era of Accountability

Mental health professionals are under unprecedented expectations to work efficiently and effectively and to be transparent about both process and outcome with a variety of stakeholders. Our most important stakeholders are those we serve through prevention and early intervention programs, as well as through individual, group, or family counseling services. Engaging in counseling or psychotherapy requires considerable resources, and most people do not engage in the process when things are going well. Consumers need to be able to trust that they are investing their time, money, and emotional reserves in ways that are likely to facilitate positive outcomes.
Consumers are not our only stakeholders. Professionals who work in agencies, schools, and other settings are accountable to supervisors, coworkers, and interdisciplinary collaborators. Those stakeholders need to know that our “parts” of the greater package or program are working. A number of other stakeholders often influence budget decisions regarding service authorization; just as consumers want to know they are making a wise investment, payors want to know that their money is being well spent.
Payors vary across settings and may include public funding sources (e.g., educational systems), public or private insurance (e.g., third-party payors), grant funders, and fourth-party payors such as employee assistance programs. Although a full discussion of mental health reimbursement, accountability, and management systems is beyond the scope of this book (see Brown and Minami (2010) for a complete history), it is important to understand that our current focus on outcomes management and accountability is rooted in a long history of an exponentially expanding utilization of mental health services in which there was not always a focus on cost control or demonstrated effectiveness. This led to utilization review services in which payors authorized the type and number of services based on pre-established sets of treatment guidelines and criteria. More recently, payors have shifted to requiring measurement and reporting of treatment outcomes as a condition of payment. The take-home messages are clear: Mental health professionals must be able to (a) articulate how we provide services with a strong probability of success and (b) demonstrate outcomes for individual clients and programs. To inform the services offered, many will turn to evidence-based practices and hierarchies of evidence.

Evidence-Based Practice and Hierarchies of Evidence

As you’ve heard already in this brief chapter, mental health professionals are expected to integrate research into our practice. This has given rise to an alphabet soup of acronyms designed to help practitioners and researchers determine just what that looks like. Evidence-based practice (EBP) is the most recent focus of attention in mental health professions. EBP features “integration of the best available research within clinical expertise in the context of patient characteristics, culture, and preferences” (APA Presidential Task Force on Evidence-based Practice, 2006, p. 273).
By definition, EBP features acceptance of a broad array of research evidence and recognition that the strongest evidence bases are built upon a variety of different designs. In particular, EBP recognizes that “different research designs are better suited to address different types of questions” (APA Presidential Task Force on Evidence-based Practice, 2006, p. 274). Designs can include clinical observation, qualitative research, systematic case studies, single-case experimental designs, public health and ethnographic research, process-outcome studies, naturalistic studies of interventions, randomized controlled trials (RCTs), and meta-analyses. Certain designs are superior to other designs when answering specific questions. For example, validity controls (i.e., randomization, use of control group) inherent in a well-designed RCT offer superior evidence to a case study or quasi-experimental investigation. However, if one wants to understand experiences that lead a high percentage of clients to discontinue participation in an RCT, a qualitative design is superior in gathering the types of data that can answer the question. We would likely also agree to put more weight on a meta-analysis of a series of well-designed studies than on any one individual study. In recognition that evidence is complicated, the Centers for Disease Control and Prevention (CDC) created a Continuum of Evidence of Effectiveness (Pud...
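The observation that a meta-analysis of well-designed studies merits more weight than any single study can be made concrete. Meta-analyses typically pool effect sizes using inverse-variance weights, so precise (usually larger, better-controlled) studies count more toward the pooled estimate. The sketch below is not from the book: it shows a minimal fixed-effect, inverse-variance pooling of three hypothetical RCTs, with made-up effect sizes and standard errors used purely for illustration.

```python
# Illustrative sketch (not from the textbook): fixed-effect, inverse-variance
# meta-analysis. Each study's effect size is weighted by 1 / SE^2, so more
# precise studies contribute more to the pooled estimate.

def fixed_effect_meta(effects, std_errors):
    """Pool study effect sizes, weighting each by the inverse of its variance."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Three hypothetical RCTs: standardized mean difference (Cohen's d) and its SE.
# These numbers are invented for demonstration only.
effects = [0.40, 0.55, 0.30]
std_errors = [0.10, 0.20, 0.15]

pooled_d, pooled_se = fixed_effect_meta(effects, std_errors)
print(f"pooled d = {pooled_d:.2f} (SE = {pooled_se:.2f})")
```

Note how the most precise study (SE = 0.10) pulls the pooled estimate toward its own effect size, and how the pooled standard error is smaller than any single study’s: this is the statistical sense in which an aggregate of well-designed studies provides stronger evidence than any one of them.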
