Chapter 1
How to Assess Students' Social Media Preferences
A Comparison at Two Academic Institutions
Dan Sich1 and Mark Aaron Polger2, 1Western Libraries, University of Western Ontario, London, ON, Canada, 2Department of the Library, College of Staten Island, City University of New York (CUNY), New York, NY, United States
Abstract
The authors will discuss a questionnaire they administered at their academic institutions: the University of Western Ontario in London, Ontario, Canada and the College of Staten Island, City University of New York. Their goal was to identify students' preference for various social media (SM) tools, what kind of information students wanted to see from their academic library via SM, and similarities and differences in use of SM tools by the two student populations. Findings will be discussed briefly. They will provide an overview of the current use of SM tools by each library and compare this use to what students actually want from their library via SM. They will also include some specific successes and failures, and will provide recommendations to other libraries wishing to run a similar questionnaire.
Keywords
Canada; United States of America; social media; students; academic libraries; assessment; market research; promotion; marketing
1.1 Introduction: Reason for Study
This chapter will outline our experiences conducting an assessment of a specific patron group's social media (SM) preferences. The main goal is to provide a framework that can be adapted and utilized by other academic librarians. We will touch on the findings of our study, but focus mostly on our experience conducting it.
We ran a study of the SM preferences of students in relation to their libraries, at two separate institutions. At both libraries we thought it necessary to conduct the study because we noted the potential for investing too much time in the libraries' SM presence. We also saw potential for tool creep: multiple SM accounts (including Facebook, Twitter, and Snapchat, to name a few) for various libraries. Tool creep could lead to a situation where there are too many SM accounts for users to choose from, leading to confusion. Multiple accounts are also more difficult for the library to manage. We wanted to make sure that the time and effort our libraries were investing in their SM presences was worthwhile. Were we posting the right messages in the right places? Our study was intended to help address this concern.
The results we obtained provide a snapshot of our students' use of SM tools vis-à-vis the library. Because the SM use landscape changes from year to year, this type of research should ideally be run on a continual basis. Regular "snapshots" of student use of SM tools can then be used to tweak a library's SM strategy.
Understanding our students' SM preferences is important. Evaluation of our data will help improve our libraries' SM tool choices, posting behaviors, and thus our efforts to inform and educate our users. For example, we might consider changing the emphasis on which SM tools we are using, reconsider the type of content to post, and customize posts to target specific audiences. Throughout, we must assess whether we are successfully engaging with students, or are merely engaged in marketing.
1.2 Some Background
The literature provides further context for a study of this sort. Gerolimos (2011) suggests that rather than overloading Facebook with posts, we attempt to engage "fans" by carefully crafting posts of interest. He concludes that while librarians put much effort into developing Facebook pages, they fail to assess whether their users are seeking information via these pages. Surveying users, as we have done, will help libraries understand how their users access information about the library.
There are many factors that influence students' SM preferences. Brookbank (2015) notes that SM tool preferences can vary from campus to campus. We wanted to learn something about how students' SM use varies between institutions. Our institutions are quite different (see later). We were interested in whether or not these differences might lead to differences in our data. If we could highlight how our respective students' SM use differs from those at the other institution, we could then focus our SM campaigns on tools and topics that appealed to our users in particular.
Our results show clear preferences at both institutions, which will be used to help inform our libraries' choice of SM tools. We now know more about the type of information students would like to see from us via SM, and a bit about what they do not want to see. These findings regarding what makes our student populations different will be used in tailoring future SM posts.
1.3 Our Institutions
Our two institutions differ in a number of ways.
The University of Western Ontario (UWO) in London, Ontario, is a Canadian public university with three affiliated University Colleges. UWO has a total of 38,335 Full Time Equivalents (FTE), with 32,529 FTE on our main campus (Office of Institutional Planning & Budgeting, Western University, n.d.). At the time of this research, Western Libraries had several Facebook and Twitter accounts, administered by various teams and individuals. UWO has SM guidelines covering appropriate perspective, verbiage, and hashtags, as well as how to reply to user comments.
The College of Staten Island (CSI), City University of New York (CUNY), is a public, comprehensive college offering associate, bachelor's, master's, and doctoral degrees. CSI is part of CUNY, which comprises 24 campuses. CSI has 14,000 students, mostly first- and second-year undergraduates. The CSI Library has been actively using SM tools to promote library services and resources since 2010. Currently, the CSI Library manages SM accounts for Facebook, Twitter, Instagram, and Google+ (Google Plus). All SM accounts are administered by one librarian (Polger), assisted by the Web Services Librarian. There is an SM policy document in place and an SM scheduling calendar that is populated with content through consultations with the Library's Marketing and Outreach committee (chaired by Polger).
1.4 Survey Questions
The survey included six questions (see Fig. 1.1).
We felt that the questionnaire was effective at providing the data we were seeking. See our Recommendations and Limitations section for advice on composing survey questions.
In order to be able to compare the results from two institutions, our institutional versions of the questionnaire had to be as close to identical as possible. Our questions varied only slightly to account for institutional differences (e.g., terminology differences like "1st year undergraduate student" vs. "freshman") and practices (e.g., Office of Research Ethics requirements). It was also necessary to coordinate the timing of the questionnaire to avoid the effects of a changing SM landscape.
While preparing to run the study, we recognized that it would have been simpler to conduct it independently at each of our two institutions. Anyone wishing to run a similar questionnaire at their own institution could do so without partnering with another institution, and still obtain valuable data. We do not think that our questionnaire was any less effective for having been run at two institutions.
1.5 Ethics Approval
Because of our intent to publish our research, we both sought and obtained approval from our institutions' offices of human research ethics (the Office of Research Ethics at UWO, the Human Research Protection Program at CSI). Libraries wanting to conduct a similar study without publishing likely need not seek such approval. Obtaining this approval can require anywhere from four to six weeks (CSI) to three to four months (UWO) or more. As such, we recommend that readers give themselves adequate time for this important step.
1.6 Timeline and Response
We ran the study at both institutions for four months, from September 1 to December 31, 2016. This timing was ideal given the amount of Information Literacy instruction that occurs in the Fall term. Four months turned out to be adequate for obtaining enough responses (N=602 at UWO, N=637 at CSI). Readers using our approach (see later) might want to expand this time frame to two terms in order to obtain more responses.
1.7 Sampling and Distribution
We initially both thought it would be easiest to obtain data by distributing the questionnaire to students via email. However, there are issues with such an approach. For instance, survey emails are easy to ignore, lose, or forget. In addition, the email approach proved impossible at both UWO and CSI; the "all s...