ARTICLES
Methodological and Measurement Issues in School Violence Research: Moving Beyond the Social Problem Era
Michael J. Furlong
Gale M. Morrison
Dewey G. Cornell
Russell Skiba
Michael J. Furlong is Professor, University of California, Santa Barbara, Gevirtz Graduate School of Education, Santa Barbara, CA 93106 (E-mail: [email protected]). He is also affiliated with the Center for School-Based Youth Development.
Gale M. Morrison is Professor, University of California, Santa Barbara, Gevirtz Graduate School of Education, Santa Barbara, CA 93106 (E-mail: [email protected]). She is also affiliated with the Center for School-Based Youth Development.
Dewey G. Cornell is Professor, Curry School of Education, University of Virginia, 405 Emmet Street, Charlottesville, VA 22903-2495 (E-mail: [email protected]). He is also Director of the Youth Violence Project.
Russell Skiba is Professor, Counseling and Educational Psychology, Indiana University, Center for Evaluation and Education Policy, 509 E. Third Street, Bloomington, IN 47401 (E-mail: [email protected]).
Address correspondence to Michael J. Furlong.
SUMMARY. School violence became a topic of broad national concern in the United States in reaction to a series of tragic school shootings during the 1990s. Efforts to understand and prevent school shootings have stimulated the rapid development of a broader interest in school safety with an emerging multidisciplinary research agenda. The maturation and fulfillment of this research agenda require that researchers critically examine their research methods and measurement strategies. This article introduces a volume that examines fundamental methodological and measurement issues in the rapidly expanding body of research on school safety and violence. The authors hope to stimulate greater attention to methodological pitfalls and critical measurement issues that hinder research progress in several related areas, including the uncertain reliability and validity of self-report surveys used to measure high-risk behavior and bullying, the limitations of discipline referral databases as a source of information on school climate, and the overly narrow focus on relatively infrequent critical incidents of violence, often at the expense of a more comprehensive and multifactorial examination of the school environment.
[Article copies available for a fee from The Haworth Document Delivery Service: 1-800-HAWORTH. E-mail address: <[email protected]> Website: <http://www.HaworthPress.com> ©2004 by The Haworth Press, Inc. All rights reserved.]
KEYWORDS. Measurement, school violence, methodological, school safety, school environment
School violence was a largely unacknowledged social problem prior to the 1990s (Furlong & Morrison, 2000). Between 1979 and 1992, there were 210 references in the National Newspaper Index under the keyword "school violence," but since 1993 there have been 1,291 reports focusing on school violence, clearly identifying it as an important social problem. The professional literature mirrors the mass media's pattern: only 21 publications prior to 1993 are listed in the PsycINFO database under the school violence keyword, compared with 296 publications during the past decade. From a historical perspective, school violence publications were driven more by public events (particularly school shootings) than by a well-considered research agenda. As a result, the rapidly developing professional literature often focused on the obvious need to prevent school violence, with little attention given to methodological and measurement issues (Furlong, Morrison, & Pavelski, 2000). It quickly became apparent, however, that knowledge about the nature and scope of school violence, as well as trends over time, was incomplete at best. Despite an inadequate knowledge base, the pressing concern with school violence as a social problem stimulated numerous calls to take action. In their meta-analysis of school violence prevention programs, Derzon and Wilson (1999) noted that two school violence review articles were published for each empirical study of school violence, suggesting that empirically derived knowledge about school violence was not keeping pace with public interest and the demand for information to inform public policy.
The U.S. federal government supported the development and widespread dissemination of three school violence prevention documents (Dwyer, Osher, & Warger, 1999; Osher & Dwyer, 2000; Osher, Dwyer, & Jackson, 2003). These documents contain highly plausible and thoughtful advice for school authorities based on a limited knowledge base, but they have not been accompanied by a systematic research effort to validate specific recommendations and practices.
The demand for immediate information about school violence precluded the possibility of careful development of new research measures. As a result, the first national report on school safety (U.S. Departments of Education and Justice, 1999) drew upon existing national studies such as the Youth Risk Behavior Surveillance Survey (e.g., Brener, Simon, Krug, & Lowry, 1999), the National Crime Victimization Survey (e.g., Hawkins, Herrenkohl, Farrington, Brewer, Catalano, Harachi, & Cothern, 2000), and the Monitoring the Future Study (e.g., Johnston, O'Malley, & Bachman, 1996). These well-regarded, periodic surveys were adapted to bootstrap information about student experiences with violence on school campuses. Such efforts might be regarded as a hallmark of the social problem era of research on school violence.
Perhaps the most significant effort to assess school violence came through adaptations of the Youth Risk Behavior Survey (YRBS; Kann et al., 1996) developed by the Centers for Disease Control and Prevention (CDC). The CDC incorporated violence items into its ongoing surveillance surveys. CDC findings were reported in the first National Safe School Report before any peer-reviewed research article on the reliability and validity of the YRBS school safety items had been published. Moreover, the early results from YRBS surveys were widely publicized without qualification or careful explanation. For example, the CDC announced on the basis of early YRBS surveys that more than 1 in 5 "high school students" carried a weapon each month. The early YRBS questions, however, did not specify school as the location of the weapon possession. News media statements such as "20% in high schools found to carry weapons" (The New York Times, 1991, October 11) could easily be misinterpreted to mean that students were carrying weapons to school. Furthermore, survey questions about weapon carrying did not distinguish possession of a weapon for activities such as hunting or camping from weapon possession intended for interpersonal violence.
VOLUME ORGANIZATION AND OVERVIEW
With the publication of the Journal of School Violence it can be argued that "school violence" as a topic of scientific inquiry has matured into a recognized field of study. We contend that research on school violence is ready to move beyond the social problem era, when researchers responded to immediate needs for information and lacked the time and opportunity to consider methodological and measurement issues. We believe that it is timely to examine the methodological challenges associated with this new, multidisciplinary field of study and to begin to articulate standards for the scientific credibility of its research methods and findings. The purpose of this volume is to advance that process.
Osher, VanAcker, Morrison, Gable, Dwyer, and Quinn (2004) provide a discussion of school-level warning signs for school aggression and violence. These authors present this information as a complement and extension to the well-known documents Early Warning, Timely Response: A Guide to Safe Schools (Dwyer, Osher, & Warger, 1998) and Safeguarding Our Children: An Action Guide (Dwyer & Osher, 2003). While the 1998 Warning Signs document focused on individual student warning signs, the article in this issue is intended to focus attention and future measurement work on school-level contributions to aggressive and violent student behavior. In doing so, the authors present an analysis of school, classroom, family, and individual contexts for aggressive behavior and suggest tools for assessing these important ecological indicators.
Morrison, Peterson, O'Farrell, and Redding (2004) explore the utility of school discipline data, perhaps the most "naturally occurring" data on school misbehavior and aggression. They note that there is very little information available in the professional or research literature about the reliability and validity of these data. Their article examines the sources of error that enter into the collection and use of these data and focuses particularly on office referral data, as these constitute the most common information available on school campuses. The authors highlight monitoring misbehavior as one purpose of counting and categorizing office referrals but note that another important purpose, from a school safety perspective, is to analyze and characterize the school discipline process and the resulting climate and context for student behavior. They provide guidelines for how best to utilize information about behavior and discipline systems for school safety research.
Cornell and Brockenbrough (2004) examine measurement issues related to bullying, perhaps the most pervasive and widely recognized form of violence that occurs in schools. Their results show that there is only modest correspondence among three methods for identifying bullies and victims of bullying, and that peers identify far more students as bullies and bully victims than do either teacher nomination or self-report methods. Furthermore, identification as a bully by peers and teachers, but not self-identification, was predictive of school discipline referrals, detentions, and suspensions over the subsequent six months. Their results raise questions about the widespread reliance on student self-reports of bullying, and point to the need for careful validation of existing methods for identifying bullies and victims of bullying.
Cross and Newman-Gonchar (2004) use data gathered as part of one site's implementation of the Safe School/Healthy Student Initiative (Furlong, Osher, & Paige, 2003). They examine standard validity checks across three different school violence and safety surveys and find that fewer students gave contradictory responses to surveys administered routinely or in conjunction with a specific educational unit (treatment groups) than to surveys administered on short notice or apart from any instructional unit (control groups). In addition to effects related to who administered the surveys, they found that incidence rates changed when inconsistent and extreme responses were controlled for.
Furlong, Sharkey, Bates, and Smith (2004) use the 2001 YRBS database to examine the presence and influence of cases in which students give the most extreme response options for school violence, safety, and risk-related items. They provide an overview of issues related to the reliability of the YRBS and further explore the response patterns of a subset of 414 youths who gave the most extreme response (6 or more times in the past month) to the item asking about the frequency of school weapon possession. They found that these extreme weapon-item responders were more likely to give extreme responses to other school risk behavior items as well as to positive health behavior items. Their findings suggest that a subset of the YRBS cases may reflect a type of extreme response pattern that is not controlled for in virtually all research using the YRBS.
In addition to exploring the reliability of core school violence and safety instruments, Mayer (2004) examines the challenges and limitations faced when using structural equation modeling (SEM) and structured means modeling (SMM) to analyze processes associated with school violence and disruption. As researchers increasingly examine more complex, multilevel models of school violence and disruption, the appropriate use of complex statistical modeling techniques will be a growing concern.
Finally, Skiba, Simmons, Peterson, McKelvey, Forde, and Gallini (2004) note that extant national surveys are based on an understanding of school violence driven primarily by critical incidents or criminal violations. Yet most current theoretical models of school violence and its prevention emphasize a comprehensive perspective that encompasses both serious incidents and day-to-day disruption and climate issues. Further, very few reports on existing school safety surveys use empirical procedures, such as factor analysis, to derive their dimensions or subscales. Skiba et al. report on the development and technical characteristics of a comprehensive self-report survey for secondary students, the Safe and Responsive Schools Safe Schools Survey. Survey items were drawn from both school safety and school climate surveys in order to represent a more comprehensive model of school violence. Their results suggest that both major safety items and day-to-day discipline/climate issues shape student perceptions of school safety. Indeed, regression analyses suggest that student perceptions of climate may in some cases be a better predictor of perceptions of overall school safety than serious violence.
CONCLUSION
This volume is possible only because many researchers and educators have recognized the need to better understand school violence as a social problem. Their openness and sensitivity focused attention on this issue, and their interest began the development of knowledge about its occurrence. The various national studies, task forces, and research efforts to date have provided core information and moved the field of school violence research forward in essential ways.
At this time, however, the results presented in this volume present a challenge to the school violence research community. School violence and safety research will move forward and make unique scientific contributions only if it develops a core literature that critically examines its measurement, methods, and data analysis techniques. Such analysis becomes increasingly necessary as the field moves be...