
Practical Program Evaluation for Criminal Justice
About this book
Practical Program Evaluation for Criminal Justice shows readers how to apply the principles of fiscal responsibility, accountability, and evidence-based practice to criminal justice reform plans. Unlike other policy-based texts, which tend to focus more on implementation than assessment, this book provides applicable, step-by-step instruction on determining an initiative's necessity prior to its adoption (reducing the risk of wasting resources), as well as how to accurately gauge its effectiveness during initial roll-out stages. The book gradually introduces basic data analysis procedures and statistical techniques, which, once mastered, can be used to prove or disprove a program's worth. Lastly, the book introduces the types of stakeholders who should review evaluation results for quick action, as well as how to best structure reports to ensure their buy-in.
1 Getting Started with Program Evaluation
Keywords
Maryland Report
meta-analysis
Campbell Collaboration (Crime and Justice Group)
Chapter Outline
- Introduction
- Administrator and Evaluator
- Strengths and Weaknesses of Program Evaluation
- Evidence-Based Practices
- Maryland Report
- "What Works"
- "What's Promising"
- "What Doesn't Work"
- Meta-Analysis
- Campbell Collaboration (Crime and Justice Group)
- Summary
- Discussion Questions
- References
Introduction
Administrator and Evaluator
- The social problem (in our case, crime).
- The service agencies (the components of the criminal justice system).
- The public (who seek protection from crime).
Strengths and Weaknesses of Program Evaluation
- Use for decision making: The results of evaluation research are designed with use in mind. The evaluation should provide a basis for future decision making and supply the information needed to determine whether a program should be continued, expanded, or terminated.
- Program-derived questions: The research questions are derived from the goals of the program and its operations, not defined by the evaluator alone. The core of the study is administrative and operational: Is the program accomplishing what it is designed to do? Is it reaching its "target population", that is, the clients the program was supposed to serve? Does the program make effective and efficient use of its resources, both physical and financial?
- Judgmental quality: Objectivity requires that the evaluator focus on whether the program is achieving its desired goals. It is imperative that these goals be stated in a clear and measurable fashion that accurately documents effective performance.
- Action setting: The most important thing going on is the program, not the research. The program administrator and staff control access to information, records, and their clients. The research must deal with this reality and construct research designs that are feasible in the real-world setting.
- Role conflicts: The administrator's priority is providing program services, which often makes him or her unresponsive to the needs of the evaluation. Typically, the administrator believes strongly in the value and worth of the program services. The judgmental nature of the findings and the establishment of accountability are often viewed as a threat to both the program and the administrator personally. Friction between the program and the research, and between the administrator and the evaluator, is almost inevitable. Programs are often tied to both the ego and professional reputation of the administrator and staff. Some programs (e.g., Drug Abuse Resistance Education, or D.A.R.E.) are politically attractive and, as a result, have lives of their own that defy objectivity and rational assessment. Negative outcomes are not always accepted in a rational manner. In fact, one of the great ironies of evaluation research is that negative findings often fail to kill a program and positive results seldom save one. This is because so many crime prevention programs are tied to the availability of grant funding; the presence or absence of funding often determines program survival regardless of the evaluation findings.
- Publication: Publication of evaluation research is vital to the establishment of a base of information on effective crime prevention programs. To be published, the research must be carefully designed and executed and the statistical analysis must be valid and accurate.
- Allegiance: The evaluator is clearly conflicted on this aspect. He or she has obligations to the organization that funds the study, to the scientific requirements of research objectivity, and to work for the betterment of society through the determination of program effectiveness. These obligations can be contradictory and the researcher must face this reality. For example, program officials often need real-time assessments of tactics as they unfold. If the evaluator discovers problems during the process evaluation of program implementation that might jeopardize its success, then the evaluator has an obligation to alert program officials without compromising ethical concerns (Joyce & Ramsey, 2013, p. 361).
Evidence-Based Practices
Maryland Report
Table of contents
- Cover
- Half Title
- Title
- Copyright
- Dedication
- Contents
- Digital Assets
- Preface
- Chapter 1 Getting Started with Program Evaluation
- Chapter 2 Planning a Program Evaluation
- Chapter 3 Needs Assessment Evaluation
- Chapter 4 Theory-Driven Evaluation
- Chapter 5 Process Evaluation
- Chapter 6 Outcome Evaluation
- Chapter 7 Cost-Efficiency Evaluation
- Chapter 8 Measurement and Data Analysis
- Chapter 9 Reporting and Using Evaluations
- Chapter 10 Looking Ahead: A Call to Action in Evaluation Research
- Glossary
- Index