Assessment in Student Affairs

John H. Schuh, J. Patrick Biddix, Laura A. Dean, Jillian Kinzie
About This Book

A practical, comprehensive manual for assessment design and implementation

Assessment in Student Affairs, Second Edition offers a contemporary look at the foundational elements and practical application of assessment in student affairs. Higher education administration is increasingly called upon to demonstrate organizational effectiveness and engage in continuous improvement based on information generated through systematic inquiry. This book provides a thorough primer on all stages of the assessment process. From planning to reporting and beyond, you'll find valuable assessment strategies to help you produce meaningful information and improve your program. Combining and updating the thoroughness and practicality of Assessment in Student Affairs and Assessment Practice in Student Affairs, this new edition covers design of assessment projects, ethical practice, student learning outcomes, data collection and analysis methods, report writing, and strategies to implement change based on assessment results. Case studies demonstrate real-world application to help you clearly see how these ideas are used effectively every day, and end-of-chapter discussion questions stimulate deeper investigation and further thinking about the ideas discussed. The instructor resources will help you seamlessly integrate this new resource into existing graduate-level courses.

Student affairs administrators understand the importance of assessment, but many can benefit from additional direction when it comes to designing and implementing evaluations that produce truly useful information. This book provides field-tested approaches to assessment, giving you a comprehensive how-to manual for demonstrating—and improving—the work you do every day.

  • Build your own assessment to demonstrate organizational effectiveness
  • Utilize quantitative and qualitative techniques and data
  • Identify metrics and methods for measuring student learning
  • Report and implement assessment findings effectively

Accountability and effectiveness are the hallmarks of higher education administration today, and they are becoming the metrics by which programs and services are evaluated. Strong assessment skills have never been more important. Assessment in Student Affairs gives you the knowledge base and skill set you need to shine a spotlight on what you and your organization are able to achieve.


Information

Publisher: Wiley
Year: 2016
ISBN: 9781119051169
Edition: 2
Subtopic: Student Life

Chapter 1
UNDERSTANDING THE CONTEMPORARY ASSESSMENT ENVIRONMENT

While assessment has not always been a central activity in student affairs practice in higher education, it is becoming an institutional imperative. As Kinzie (2009) points out, “Every college or university must decide how to most effectively assess student learning outcomes for institutional improvement and accountability” (p. 4). Livingston and Zerulik (2013) add an observation about the centrality of assessment to student affairs practice: “Assessment is an essential element in any successful student affairs division” (p. 15).
This chapter begins with a case study related to the potential role of assessment as part of implementing a new program. Then, we provide definitions of assessment, evaluation, and research, terms that are important to understand in the development of projects designed to determine the effectiveness of programs, activities, and experiences developed by student affairs educators. We follow that with a brief discussion of the historical development of assessment in student affairs practice and the centrality of assessment in contemporary institutional accreditation, student affairs practice, and the education of student affairs educators. We conclude with questions to consider in the development of an assessment plan to address the dynamics identified in the case study.

Learning Communities at Mid-Central University

Sean is an area coordinator at Mid-Central University (MCU) in the residence hall system. As such, Sean has responsibility for four buildings (each housing about 240 students), four graduate assistants (one per building), and 16 resident assistants. MCU is a regional institution, with most of its students majoring in education, business, or liberal arts. Predominantly, the students are the first in their families to attend college, and many have significant amounts of federal financial aid.
Sean is in her second year of service at MCU and noted that, unlike other institutions with which she was familiar, MCU did not have any learning communities (LCs). Sean had served as a graduate assistant in the residence halls at State University while pursuing her master's degree and was used to having many learning communities in residence halls. She was surprised that MCU did not have any learning communities when she interviewed for her position but decided to accept the position with the hope that learning communities could be established, though no promises were made that LC units would be created at MCU. She spent her first year investigating why MCU did not have any of these special residential units and found that a variety of reasons contributed to their absence, among them the philosophy of the residence department, lack of funding, and, potentially, lack of student interest.
From Sean's point of view, the purpose of the learning communities was to improve retention. In the pilot project she was developing, two learning communities would be implemented on a trial basis. Twenty students majoring in business would be assigned to one learning community and another 20 education majors would be assigned to the other. The students in each learning community would be assigned to three courses in the curriculum, and a community advisor (CA) would be hired to provide support and enrichment, such as organizing study groups, arranging for tutoring as necessary, and organizing a field trip for the student participants once per month in the fall semester.
Sean briefed her staff at the end of the first academic year about wanting to implement two trial learning communities the next academic year. The concept was foreign to many of the staff, and several asked this question: How did Sean know that the students needed this experience? Sean indicated that determining this would be part of the pilot project being planned.
She managed to convince the assistant director of student housing for residential programs that implementing two learning communities on a trial basis was worth undertaking, but she was cautioned by Sami, the assistant director, that she would run into a series of hard questions as she had conversations with other members of the central office staff. And Sami was clear about one central concern that was paramount in his mind: Whenever programs were implemented, senior staff would want to know how the program could be improved from one year to the next.
Sean also met with the fiscal officer of the residence life department, who wanted to know what the cost of the program would be. Sean proposed that each CA receive a free room plus a monthly stipend of $100, along with an operations budget of $2,000 per LC for modest programming efforts. The fiscal officer left Sean with this question: How would Sean demonstrate that the resources were used wisely?
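The fiscal officer's question begins with the arithmetic. As a minimal sketch of the annual cash outlay (assuming a nine-month academic year for the stipends, which the text does not specify, and excluding the free rooms because their dollar value is not given):

```python
# Rough annual cash outlay for the pilot, using the figures Sean proposed.
# ASSUMPTIONS not stated in the case study: a 9-month academic year for
# stipends; the free room is omitted because its dollar value is not given.
num_cas = 2              # one community advisor per learning community
stipend_per_month = 100  # dollars per CA per month
months = 9               # assumed academic-year length
ops_budget_per_lc = 2_000
num_lcs = 2

stipends = num_cas * stipend_per_month * months   # 1,800
operations = num_lcs * ops_budget_per_lc          # 4,000
total_cash = stipends + operations
print(total_cash)  # 5800, plus the (unquantified) cost of two free rooms
```

Even this back-of-the-envelope figure would let Sean frame the "resources used wisely" conversation in per-participant terms.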
The final discussion Sean had was with the director of the residence life department, Casey. While Casey was generally supportive of the program, there were some doubts about the effort required to implement learning communities. Would the establishment of the learning communities be worth Sean's time? Are the outcomes Sean has identified consistent with the purposes of residence halls at the university? What about staff time in organizing room assignments for the participants? Wouldn't working with the Registrar's office and the two academic programs, business and elementary education, take a lot of time? How would the benefits of the program be communicated to senior administrators? Wouldn't recruitment of participants take a tremendous effort? And, most important, how would Sean determine if the program made a difference?
Sean is faced with a daunting number of questions related to assessment, because without data she really can't answer the questions posed by the various administrators who will influence whether the learning communities will be implemented on a trial basis and what the future of these new units might be. We cannot be certain that Sean was ready for all of the questions raised by these administrators, even though learning communities are common on many campuses (see Benjamin, 2015).

Defining Assessment, Evaluation, and Research

Before we move further into this chapter, it is important that we are clear about what we mean by assessment. We'll also compare and contrast the term assessment with evaluation and research, since the terms often are used interchangeably; however, to our way of thinking, each represents a very different purpose.

Assessment

We think the definition of the term assessment that we introduced in the first edition of this book is still relevant in contemporary student affairs practice. We defined assessment this way:
“Assessment is any effort to gather, analyze, and interpret evidence which describes institutional, departmental, divisional, or agency effectiveness” (Upcraft & Schuh, 1996, p. 18).
To this definition we would add program or initiative effectiveness. In the case of our example, an assessment of the learning community initiative at MCU would be conducted to determine the extent to which the program achieved its goals. It is also important to note that for the purposes of this book, we are interested in students in the aggregate. We will be addressing individual student learning to the extent described in Chapter 4. We would, in the context of this volume, be interested in the aggregate scores of students who might have taken the College Senior Survey (http://www.heri.ucla.edu/cssoverview.php) or the National Survey of Student Engagement (http://nsse.iub.edu/) if the instrument measured an aspect of the student experience pertinent to the study being conducted.
Effectiveness, for the purpose of this definition, can take on many dimensions. Most important, we think of effectiveness as a measure of the extent to which an intervention, program, activity, or learning experience accomplishes its goals, frequently linked to how student learning is advanced. Goals will vary from program to program but typically they are linked to the goals of a unit, the division in which it is located, or the goals of the institution. So, for example, at a commuter institution with no residence halls, the development of community as an institutional goal might have a different definition than the development of community at a baccalaureate college where nearly all students live on campus.

Evaluation

We also defined the term evaluation in the first edition of this book, but we think evaluation needs a bit of updating, and for that we rely on the work of Suskie (2009). We defined evaluation, in effect, as the use of assessment data to determine organizational effectiveness. Suskie provides a more nuanced definition of evaluation by asserting “…that assessment results alone only guide us; they do not dictate decisions to us” (p. 12). She adds that a second concept of evaluation is that “…it determines the match between intended outcomes…and actual outcomes” (p. 12). In our LC example, we might learn that participation in the learning community programs does not result in increased retention, but we might find out that students who participate earn a higher grade point average at a statistically significant level. If the LCs were established with a goal of improving retention and that did not occur, the higher GPAs may or may not be sufficient evidence to determine that the LCs should continue.
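The GPA comparison above turns on whether the observed difference is statistically significant. As a minimal sketch (using hypothetical GPA values, not data from the case study), Welch's two-sample t statistic could be computed like this:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom
    (does not assume equal group variances)."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)       # sample variances
    se2 = va / na + vb / nb                 # squared standard error of the difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# HYPOTHETICAL GPAs: LC participants vs. a comparison group
lc_gpas  = [3.1, 3.4, 2.9, 3.6, 3.2, 3.5, 3.0, 3.3]
non_gpas = [2.8, 3.0, 2.7, 3.1, 2.9, 2.6, 3.0, 2.8]

t, df = welch_t(lc_gpas, non_gpas)
print(round(t, 2), round(df, 1))
```

In practice an assessment would compare the resulting t value against a t distribution (or simply use a statistics package) to obtain a p-value; the point here is only that "statistically significant" refers to a specific, computable comparison, not a judgment call.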
Suskie (2009) adds that evaluation also “…investigates and judges the quality or worth of a program, project, or other entity rather than student learning” (p. 12). We might find, for example, that participation in the LCs resulted in improved retention for the participants. But suppose that, when all the costs are tallied in our case study, the program cost $8,990 per student. It is important to note that the resources of MCU are modest, and with 40 students proposed to participate in the programs (20 in the education LC and 20 in the business LC), the aggregate cost of $359,600 is likely to be far more than could be sustained by the university's budget. So, while the goal of the program (increased retention) was met, the costs were prohibitive. Strictly speaking, the data suggested that the program was a success (retention was improved), so from an assessment point of view it should be continued; from an evaluation perspective, it should not (the program was cost prohibitive).
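The aggregate figure follows directly from the per-student cost, which is the kind of quick check an evaluation should always include:

```python
# Cost arithmetic from the case study (figures as given in the text).
per_student_cost = 8_990   # dollars per participant, all costs tallied
participants = 40          # 20 business + 20 education majors
total_cost = per_student_cost * participants
print(total_cost)  # 359600 — the aggregate cost cited in the chapter
```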

Research

Our experience is that student affairs educators can be worried by the thought of undertaking assessments because they think what they are contemplating is conducting a research study, similar to writing a dissertation as part of completing a doctoral degree or conducting a study that would form the basis for a manuscript submitted to an international journal with a low acceptance rate. We submit that such is not the case with assessment. Rather, we assert that while research methods are used in the process of conducting an assessment, we are not advocating a level of rigor that would be required to complete a doctoral dissertation. Suskie (2009), again, is informative on this point: “Assessment…is disciplined and systematic and uses many of the methodologies of traditional research” (p. 14). She adds, “If you take the time and effort to design assessments reasonably carefully and collect corroborating evidence, your assessment results may be imperfect but will nevertheless give you information that you will be able to use with confidence to make decisions…” (p. 15).
We would like to identify several distinctions between assessment and research that further illustrate th...
