Reference Consultations and Student Success Outcomes

Robin E. Miller is Associate Professor and Assessment and Instruction Librarian at the University of Wisconsin-Eau Claire.

Correspondence concerning this column should be addressed to Esther Grassian and Sarah LeMire; email: esthergrassian@gmail.com and slemire@umich.edu.

Librarians have offered personal help in the form of reference for well over a century,1 increasingly using technology of one sort or another. During much of that period, reference service was often just that—a “service” where librarians would serve up information and answers to questions from users. Thanks to the proliferation of powerful technologies, however, many individuals now seek information on their own first, in a vast morass of websites, social media, apps, blogs, wikis (including Wikipedia), videos, podcasts, and more, all vying for eyeballs. When overwhelmed by the sheer amount of information available, and unable to sort through it all to find valid, reliable information, some turn to librarians for help. More and more, that assistance takes the form of helping people learn how to learn for themselves, rather than simply providing answers.

These technologies offer great educational opportunities but also have extraordinary, almost invisible, monitoring capabilities, which are only now seeing the light of day. For much of the history of reference service, librarians have zealously guarded the confidentiality of users and the content of their reference interactions, similarly to lawyer-client discussions, though not protected by law. Technologies used for reference changed those ethical/moral standards to some degree. With all good intentions, librarians have sought data from chat reference transcripts,2 database use, and use of software such as LibGuides, in order to understand what their users want and how best to help them succeed, at times without prior user consent. Use of such technologies in libraries has resulted in vast troves of user-generated data that remain invisible to users, as do the processes of user data collection and data retention. In the essay below, Miller discusses the ethical implications of studying and analyzing user reference data, even with admirable goals in mind, like student success. Underlying the discussion in this column is the broader and very timely issue of opting in vs. opting out of user data collection and retention in the world around us, for confidentiality and privacy protection, and its implications for libraries’ user data collection.—Esther Grassian, Co-editor

Reference librarians engage deeply with patrons and their research. Professionally, our orientation is to help at the point of need; while some satisfied and dissatisfied patrons may follow up to tell us how we helped, or did not help, we rarely hear the end of the student’s research “story.” Overall, however, systematic methods for assessing the outcome of reference interactions conducted in academic libraries have not emerged from the voluminous literature of library assessment. This is a curious phenomenon because the contemporary higher education landscape is dominated by conversations about assessment and accountability, and academic libraries are actively working to demonstrate value and alignment with institutional goals.

Reference consultations do not account for the majority of the library’s contacts with students enrolled at any institution of higher education, even though librarians and staff who provide reference services are a substantial, and highly visible, portion of the library workforce. Using systems like Gimlet, Desk Tracker, and similar software/applications, we have long quantified service-level data. Some academic libraries have even implemented systems like the Reference Effort Assessment Data (READ) Scale in order to collect qualitative data about reference service quality, staffing levels, question complexity, and other useful information.3 When it comes to reference, our profession is well aware of what we do and how we do it, even as our professional standards have evolved over time.

What Does the Literature Indicate?

The library literature makes a strong case for a pedagogical approach to reference work. James Elmborg wrote persuasively of the need for a reference desk pedagogy that models research as inquiry, rather than as accumulation of information.4 Casting librarians as “discourse mediators,” Michelle Holschuh Simmons argued, “Reference work needs to be more about helping students ask questions about information and less about our delivering answers to questions.”5 A decade later, the ACRL Framework for Information Literacy for Higher Education introduced a set of core concepts that directly link reference and information literacy practice to student research and inquiry.6 While students consult with reference librarians at any point in the research process, our professional practice is to use the reference consultation as a site of student transformation in which the act of consultation inspires revision and facilitates growth of expertise. Librarians can also look to allied fields, like composition studies and writing centers, to learn more about teaching and sharing feedback through individual conferences.7 As academic libraries have transitioned away from reference desks and towards consultation models, embedded librarianship, peer learning, and learning commons, the physical and virtual structure of reference practice has evolved to support student learning and inquiry.

In recent years, a few authors have made strong arguments for assessing the outcomes of one-on-one reference or reference consultations. Most recently, Krieb demonstrated that community college students who visited a reference desk remained enrolled at a higher rate than students who did not.8 In a review of literature published 2001–2010, McLaughlin observed that the library assessment literature lacked a “universally accepted set of standard approaches, study methodologies, and reporting formats for comparison and analysis” of the outcome of reference transactions.9 At the 2015 ACRL Conference, Savage made a similar observation, suggesting our profession is “inattentive” to assessment of librarian-patron interactions at the reference desk or in other consultative environments.10 In various studies at large and small institutions, students who have completed a reference consultation have responded to surveys or interview questions about the consultation experience. Self-reported data collected through these studies suggest that patrons believe that non-directional reference consultations are valuable experiences.11 In recent exploratory research conducted at the University of Wisconsin-Milwaukee, Kopecky and Bowes surveyed students who had consulted with a librarian in the previous semester (in person, via email, or by phone, video conference, or chat/instant messenger), finding that respondents looked back at the consultation experience and overwhelmingly believed it had contributed to their academic success.12 At the University of Illinois Libraries, researchers analyzed the perspectives of students, instructors, and librarians who reviewed anonymized chat transcripts. Triangulating these points of view demonstrated that library stakeholders believed that chat reference is a valuable intervention that improves student learning.13 Studies that elicit self-reported assessments of satisfaction or learning can offer libraries insight about the impact of a reference interaction. Based on these studies, a library may be able to gauge patron good will; however, these studies do not help us to infer the impact of reference interactions on a student’s long-term learning, and self-reported assessments of learning do not establish a clear link to a student’s long-term academic performance.

While Savage’s characterization of “inattentiveness” may be an overstatement, it seems clear that much of the literature about the impact of reference consultations draws conclusions from small sets of self-reported data. A logical reason for this gap is that a reference consultation is essentially an intermediate intervention. Collectively, libraries responding to the 2016 ACRL Academic Library Trends and Statistics survey reported providing more than 5.3 million information services transactions that year. Of those, 575,000 transactions, roughly 11 percent, were characterized as “consultations.”14 We consult with students at the point of need—or at the point an instructor requires students to consult. Because librarians are rarely a student’s instructor of record, we assist without expectation that we will ever see, or evaluate, the completed research. Librarians are also not typically a student’s academic or departmental advisor, limiting our ability to follow up about the student’s long-term progress.

At Webster University, Watts collaborated with special education faculty to design a study of student outcomes following a reference consultation. They examined research journals and conducted a focus group to assess ten graduate students’ learning outcomes, finding that the graduate students who had participated in reference consultations not only believed that they had learned something from the research consultation experience but also produced higher-quality research.15 The library literature is replete with citation analysis studies. While this method is more frequently applied to studies of library collections, Reinsfelder used citation analysis to assess the quality of research conducted by students who had consulted with a librarian.16 The inquiries by Watts and Reinsfelder are well designed, and the results may confirm a reference librarian’s perception that our individual consultations are important. While research like this is difficult to scale, reference librarians have other options for exploring our impact on student learning and success.

Academic librarians are analyzing student data in order to document relationships between a student’s use of the library and key elements of student success, particularly persistence and retention, time to graduation, and overall academic achievement. Several investigations have demonstrated that students who use elements of a library’s services and spaces perform better than peers who use the library less or not at all.17 As Steven Bell succinctly notes, “From an assessment perspective [studies like] this can help justify library expenditures by demonstrating how academic libraries contribute to students’ retention and persistence to graduation.”18 The rise of this research in libraries has followed the growth of “learning analytics” or “education data science” in higher education. Colleges and universities collect student identification information (usernames or identification numbers, for example) during the provision of almost all campus services and programs, from swiping a student ID to enter a sporting event, to providing one’s username before a tutoring session or to schedule a career services consultation. This aggregated data enables institutional researchers to analyze and draw conclusions about the academic performance outcomes of students who have participated in a program or who have used a service. In some sectors of higher education, learning analytics systems go beyond retrospective analysis; academic advising units increasingly use learning analytics for “proactive” or “intrusive” advising,19 drawing on student performance data to identify struggling students. Libraries, too, collect data about users, and the library analytics niche is increasingly recognized by institutional researchers.

How Does Reference Fit Into This Landscape?

Several libraries that participated in the Association of College and Research Libraries’ Assessment in Action (AiA) program sought to demonstrate a positive connection between student learning and use of reference.20 While not necessarily generalizable, the AiA projects suggest that reference consultations make a positive impact on student learning. In the well-known studies conducted at the University of Minnesota Libraries, user data has been collected from two different kinds of reference patrons—chat users and students who had appointments with peer research consultants—as part of a much larger set of library usage variables, including use of collections, facilities, and instructional services. In a partnership with the University of Minnesota’s Office of Institutional Research, all of this library user data is analyzed in order to “examine the association between a variety of library interactions, student retention, and student academic achievement,” finding positive correlations between library use and indicators of undergraduate student success, including GPA and retention.21 To date, the largest study of the impact of information literacy instruction on student success outcomes is the multi-institutional, longitudinal research conducted by the Great Western Library Alliance, which clearly demonstrates that classroom-based information literacy instruction makes a positive impact on student success.22 Studies at other institutions have drawn similar conclusions, although most outcomes assessment research focuses on student use of services like materials circulation, interlibrary loan, systems authentication, computer logins, and use of physical spaces. With a few notable exceptions, reference interactions are rarely included in research of this kind.

University of Wisconsin-Eau Claire—Reference Consultation Impact

Like many academic libraries, McIntyre Library at the University of Wisconsin-Eau Claire (UW-Eau Claire) partners with our Office of Institutional Research in order to understand the relationship between library use and student performance. In an ongoing study approved by our Institutional Review Board (IRB), we automatically collect identifying information (username or student ID number) when a patron borrows material, logs in to the proxy server, requests an interlibrary loan, reserves a study room, or enters the library’s annual orientation event. Based on course numbers, the Office of Institutional Research constructs lists of students who attend course-integrated information literacy sessions. And finally, when a student consults with a librarian about research—typically in person, but also via email, chat/IM, or on the phone—we collect the student’s username. This information is stored in an encrypted database, separately from reference statistics tracking, for which we use Gimlet. Annually, user data is supplied to UW-Eau Claire’s institutional researchers, who analyze the data and create data visualizations in order to represent the relationship between library use and student performance.
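To make the mechanics of a pipeline like this concrete, the sketch below shows one way usage events might be captured, assuming a SQLite store, an externally managed Fernet encryption key, and the illustrative event vocabulary shown. It is a hypothetical model, not McIntyre Library's production system.

```python
# Minimal sketch of library usage-event capture. The table, field, and
# event names are illustrative assumptions, not an actual library schema.
import sqlite3
from datetime import datetime, timezone

from cryptography.fernet import Fernet  # pip install cryptography

EVENT_TYPES = {"circulation", "proxy_login", "interlibrary_loan",
               "study_room", "orientation", "instruction_session",
               "reference_consultation"}

def record_event(db, fernet, username, event_type):
    """Store an encrypted identifier, an event type, and a timestamp --
    nothing about what the patron asked, read, or borrowed."""
    if event_type not in EVENT_TYPES:
        raise ValueError(f"unknown event type: {event_type}")
    db.execute(
        "INSERT INTO usage_events (user_token, event_type, occurred_at) "
        "VALUES (?, ?, ?)",
        (fernet.encrypt(username.encode()).decode(),
         event_type,
         datetime.now(timezone.utc).isoformat()),
    )
    db.commit()

db = sqlite3.connect("library_usage.db")
db.execute("CREATE TABLE IF NOT EXISTS usage_events "
           "(user_token TEXT, event_type TEXT, occurred_at TEXT)")
fernet = Fernet(Fernet.generate_key())  # in practice, load a managed key
record_event(db, fernet, "student123", "reference_consultation")
```

Because only an encrypted token, an event type, and a timestamp are stored, the analytic record stays separate from the content of any individual transaction.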

Results

UW-Eau Claire’s Office of Institutional Research reports that in 2015 and 2016, students who utilized library services, including reference consultations, earned higher grade point averages than nonlibrary users. Undergraduate students who consulted with a librarian earned average GPAs of 3.26 and 3.20 in 2016 and 2015, respectively. Students who used the library but did not ask a reference question earned average GPAs of 3.20 and 3.19 in 2016 and 2015, respectively. In contrast, students who did not use the library at all in 2016 and 2015 had average GPAs of 3.13 and 3.15, respectively.
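Once institutional researchers have merged de-identified library events with student records, comparisons like these reduce to a simple grouped aggregation. The sketch below shows the shape of that calculation using invented data and hypothetical column names, not UW-Eau Claire's actual dataset.

```python
# Illustrative grouped-GPA comparison with invented data; real analyses
# would run on de-identified records supplied by institutional research.
import pandas as pd

students = pd.DataFrame({
    "gpa":              [3.4, 3.1, 3.3, 2.9, 3.2, 3.0],
    "used_library":     [True, True, True, False, True, False],
    "had_consultation": [True, False, True, False, False, False],
})

def usage_group(row):
    if row["had_consultation"]:
        return "consulted with a librarian"
    if row["used_library"]:
        return "library user, no consultation"
    return "did not use the library"

print(students.assign(group=students.apply(usage_group, axis=1))
              .groupby("group")["gpa"]
              .agg(["mean", "count"])
              .round(2))
```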

Retention and time to graduation are also important indicators of student success. Our pilot research examined the entering class of 2012–2013. While 37 percent of that entering class graduated in four years, 45 percent of students who used the library for any reason that year graduated in four years. While our Office of Institutional Research reports that first-to-second year retention rates average 82–83 percent, students who use the library in their first year retain to the second year at rates of 85–90 percent, depending on the year. A future agenda for this research is to identify patterns of retention among students who used specific library services, including reference consultations.

Going beyond grade point average, learning analytics offers libraries opportunities to understand our relationship to other measures of student success, like participation in high-impact practices. For example, in 2016, undergraduates involved in student-faculty mentored research placed 80 percent of the interlibrary loan requests submitted by undergraduate students. The interlibrary loan request form does not ask students if they are placing a request for themselves or for a faculty member; however, local surveys of UW-Eau Claire faculty who mentor undergraduate researchers seem to indicate that faculty are directing students to conduct independent literature reviews as part of the mentored learning process. Our study has also shown us that undergraduates engaged in mentored research accounted for only 5 percent of the reference consultations conducted by librarians in 2016. Data like this offers a library helpful insight into areas of program strength—and weakness—and points to opportunities for strategic alignment with institutional priorities. Future plans for this research include exploring connections between student use of library reference, writing center consultations, and other student success services. While the patterns observed at UW-Eau Claire are not necessarily replicable at other institutions, research like this helps us learn how library services enrich the undergraduate experience and where we can do more to support student learning and growth.

How Do We Collect This Data?

At UW-Eau Claire, reference consultations may take place at the reference desk, in a librarian’s office, or at another location on campus. The interactions may be conducted face-to-face, via email, by phone, or through a chat client. Reference consultations yield a much smaller data set than other forms of library use, and unlike data collected from integrated library systems or central authentication services, this user data cannot be collected automatically. In other words, librarians must ask users to identify themselves. In setting up this pilot, my colleagues and I discussed the ethics and value of this work extensively. We ultimately decided to conclude any reference interaction conducted in person or via chat/IM by saying, “We’re conducting a research study about students who use reference services. Would you be willing to share your username for that project?” If the student consents to sharing the username, the information is entered into an encrypted database and analyzed by the University’s Office of Institutional Research. This user data is stored separately from information about the content, length, and location of the reference transaction.
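A minimal sketch of that consent-first flow appears below; the prompt wording comes from the script quoted above, while the function names and the contents of the statistics record are hypothetical.

```python
# Consent-first identifier collection, kept apart from the reference-
# statistics record. Names here are hypothetical, not production code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsultationStats:
    """What goes to the statistics system (e.g., Gimlet): no identifiers."""
    location: str
    duration_minutes: int
    question_type: str

def collect_username_with_consent(ask=input) -> Optional[str]:
    """Ask for consent before recording anything; return None on refusal."""
    answer = ask("We're conducting a research study about students who use "
                 "reference services. Would you be willing to share your "
                 "username for that project? (yes/no) ")
    if answer.strip().lower() not in {"yes", "y"}:
        return None  # consultation is still counted, just anonymously
    return ask("Username: ").strip()
```

Returning None on refusal lets the consultation still be counted in the statistics system, with no identifier recorded anywhere.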

While the emphasis in a reference consultation should be on the content of the student’s inquiry, our local experience is that asking the patron to provide their user information is not a distraction, although, quite understandably, every librarian involved has forgotten to request the information at times. Mining user data from chat reference transcripts seems like an unobtrusive and efficient way of bringing learning analytics into the domain of reference services. Minnesota researchers were able to collect student usernames because a patron email address is required in order to initiate a chat session with that library. Another way to collect such information is to require authentication before beginning a chat session, though a considerable disadvantage of this method is that users who are unaffiliated with the institution would be barred from asking a question. However, according to the 2016 ACRL Academic Library Trends and Statistics survey,23 chat and SMS do not account for the majority of information services transactions received by academic libraries. This means that chat transcripts alone cannot be the source of meaningful data about the total impact of reference consultations. Use of chat and SMS to collect identifying information for research without obtaining a user’s consent has also raised concern among some library professionals. To ethically integrate the research consultation into student success research, libraries must be transparent about their collection and use of reference patron data.

Yes, We Can. But Should We?

The outcomes of student analytics research are fascinating, but do the ends justify the means? Librarians have joined other higher education professionals in sounding alarms about the practice of collecting library user data for learning analytics projects and similar research. In a 2015 column, Fister declared, “I shudder at incorporating learning analytics into library assessment. Even though I’m very curious about how students learn, I don’t want to track them electronically, even if it’s good for them.”24 Jones and Salo argue that mining library user data for analysis conflicts with core professional principles, including the ALA Code of Ethics.25

Ethical considerations should not be minimized, but user data can be collected and aggregated for analysis without compromising individual privacy. In the same 2015 column, Fister described libraries, and the vendors we rely upon, as “leaky” when it comes to protecting patron information. While no library hopes to be implicated in a breach of patron information à la Cambridge Analytica, Uber, or Equifax, Hinchliffe and Asher present a succinct primer of best practices for data collection by libraries that undertake learning analytics projects.26 If libraries—and their parent institutions—are not prepared to adopt best practices to secure user data, learning analytics research for any purpose should be out of the question. Key among these practices is that transaction-level data, like the content of a research question or the material accessed, should not be collected with identifiable user data. User information should be analyzed and reported in the aggregate and should never be linked to an individual user; libraries should hold vendors, institutional IT units, and offices of institutional research accountable for protecting patrons’ privacy with secure technology. All data collected should be encrypted, and libraries should comply with internal and institutional codes of practice for data preservation and destruction. Research like this cannot be conducted in a vacuum: libraries must collaborate with institutional research offices, which have access to student achievement data and an interest in complying with regulations and procedures that protect student privacy. Working together, we can ensure that library user data is secure at every point in the process of analysis.
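Two of those safeguards, separating identity from analysis data and reporting only in aggregate, can be sketched in a few lines. The keyed hash and the minimum cell size of ten below are illustrative choices, not prescriptions drawn from Hinchliffe and Asher.

```python
# Sketch of two safeguards: one-way pseudonymization so raw identifiers
# never enter the analysis dataset, and suppression of small groups in
# aggregate reports. Salt handling and the threshold are assumptions.
import hashlib
import hmac

INSTITUTION_SALT = b"load-from-a-secrets-manager"  # never hard-code in practice

def pseudonymize(username: str) -> str:
    """Keyed hash: stable for joining datasets, not reversible without the salt."""
    return hmac.new(INSTITUTION_SALT, username.encode(),
                    hashlib.sha256).hexdigest()

def report_group_means(groups: dict, min_cell: int = 10) -> dict:
    """Report a mean only for groups too large for anyone to be identifiable."""
    return {name: round(sum(vals) / len(vals), 2)
            for name, vals in groups.items()
            if len(vals) >= min_cell}
```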

While the 2010 Value of Academic Libraries report inspired many libraries to “track library influences on increased student achievement,”27 Seale critiques the “logic of the market” that has driven libraries to attempt to demonstrate value.28 In agreement with Seale, Beilin argues that attempts to track student success metrics, like GPA and retention, are emblematic of neoliberalism in academic libraries.29 These reasonable concerns reflect growing alarm about the transformation of higher education from “public good” to an enterprise with customers who consume educational products and services. In that sense, research about whether use of the library makes an impact on student learning and performance may be the academic library’s attempt to demonstrate relevance in an environment focused on accountability. An alternative view is that academic librarians know anecdotally that our work is meaningful to students, and research that elicits reflection and self-reported outcomes demonstrates that students believe academic libraries make a difference in their learning. If we can pair self-reported outcomes with quantitative evidence that library users experience greater academic success, we can demonstrate our active participation in the academic mission of higher education. Internally, evidence gathered in learning analytics research has great promise for helping libraries understand whom we help and how we can dismantle barriers to any kind of library use.

Academic librarianship, and higher education in general, has reached an inflection point. We have invested heavily in software and systems that enable us to collect and analyze large quantities of data about students and we are already using these systems to go beyond small-scale assessments of student success. These inquiries can be extended into the domain of reference, helping us to relate use of our services to the mission of our academic institutions, and to identify opportunities for engagement with students across our campus communities. At the University of Wisconsin-Eau Claire, reference consultations will be integrated more fully into research about cocurricular support services and undergraduate student performance. In addition to continuing our partnership with institutional researchers to collect—and responsibly manage—library user data, we are exploring ways to more consistently, efficiently, and transparently collect user information during reference consultations. We also hope to develop a secondary method to collect additional reflective information from a sample of reference patrons, in order to understand why they chose a reference consultation and how they believe the experience aided their success. We will use our findings to help us understand who we reach with services we believe are high impact and to consider strategic improvements to our pedagogical models in order to reach a broad cross-section of emerging researchers. Learning analytics methods cannot help us to read the end of each student researcher’s story, but these methods can help us to learn more about how consulting with students enriches the academic experience.

References

  1. David A. Tyckoson, “Issues and Trends in the Management of Reference Services: A Historical Perspective,” Journal of Library Administration 51, no. 3 (2011): 259, https://doi.org/10.1080/01930826.2011.556936.
  2. Paul Neuhaus, “Privacy and Confidentiality in Digital Reference,” Reference & User Services Quarterly 43, no. 1 (Fall 2003): 26–36, https://search.proquest.com/docview/217873564?accountid=14512.
  3. “What is the READ Scale?” READ Scale Research, accessed May 9, 2018, http://readscale.org.
  4. James Elmborg, “Teaching at the Desk: Toward a Reference Pedagogy,” portal: Libraries and the Academy 2, no. 3 (2002): 455–64.
  5. Michelle Holschuh Simmons, “Librarians as Disciplinary Discourse Mediators: Using Genre Theory to Move toward Critical Information Literacy,” portal: Libraries and the Academy 5, no. 3 (2005): 308.
  6. “Framework for Information Literacy for Higher Education,” American Library Association, last modified February 9, 2015, http://www.ala.org/acrl/standards/ilframework.
  7. Jo Mackiewicz and Isabelle Thompson, “Instruction, Cognitive Scaffolding, and Motivational Scaffolding in Writing Center Tutoring,” Composition Studies 42, no. 1 (2014): 54–78; Muriel Harris, Teaching One-to-One: The Writing Conference (Urbana, IL: National Council of Teachers of English, 1986).
  8. Dennis Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” Evidence Based Library and Information Practice 13, no. 2 (2018): 2–12, https://doi.org/10.18438/eblip29402.
  9. Jean McLaughlin, “Reference Transaction Assessment,” Reference Services Review 39, no. 4 (2011): 536–50.
  10. Devin Savage, “Not Counting What Counts: The Perplexing Inattention to Research Consultations in Library Assessment Activities” (paper, Association of College and Research Libraries Conference, Portland, OR, March 25–28, 2015).
  11. Gillian Gremmels and Karen Lehmann, “Assessment of Student Learning from Reference Service,” College & Research Libraries 68, no. 6 (2007): 488–502; Joann Jacoby and Nancy O’Brien, “Assessing the Impact of Reference Services Provided to Undergraduate Students,” College & Research Libraries 66, no. 4 (2005): 324–40; Bonnie Swoger and Kimberly Hoffman, “Taking Notes at the Reference Desk: Assessing and Improving Student Learning,” Reference Services Review 43, no. 2 (2015): 199–214.
  12. Linda Kopecky and Katherine Bowes, “Student Success 1:1: An Assessment of the UWM Libraries’ Research Consultation Service, AY2016-17” (presentation, Wisconsin Association of Academic Libraries Conference, Oshkosh, WI, April 26–27, 2018).
  13. Joann Jacoby, David Ward, Susan Avery, and Emilia Marcyk, “The Value of Chat Reference Services: A Pilot Study,” portal: Libraries and the Academy 16, no. 1 (2016): 109–29.
  14. “ACRL Trends and Statistics Survey Data,” Association of College and Research Libraries, accessed April 2, 2018, https://www.acrlmetrics.com.
  15. John Watts and Stephanie Mahfood, “Collaborating With Faculty to Assess Research Consultations for Graduate Students,” Behavioral & Social Sciences Librarian 34, no. 2 (2015): 70–87.
  16. Thomas Reinsfelder, “Citation Analysis as a Tool to Measure the Impact of Individual Research Consultations,” College & Research Libraries 73, no. 3 (2012): 263–77.
  17. Krista Soria, Jan Fransen, and Shane Nackerud, “Library Use and Undergraduate Student Outcomes: New Evidence for Students’ Retention and Academic Success,” portal: Libraries and the Academy 13, no. 2 (2013): 147–64; Shun Han Rebekah Wong and T. D. Webb, “Uncovering Meaningful Correlation Between Student Academic Performance and Library Material Usage,” College & Research Libraries 72, no. 4 (2011): 361–70; Adam Murray, Ashley Ireland, and Jana Hackathorn, “The Value of Academic Libraries: Library Services as a Predictor of Student Retention,” College & Research Libraries, 77, no. 5 (2016): 631–42.
  18. Steven Bell, “Learning Analytics,” Keeping Up With . . . , October 2014, http://www.ala.org/acrl/publications/keeping_up_with/learning_analytics.
  19. Alison Herget, “Intrusive Academic Advising: A Proactive Approach to Student Success,” Higher Ed Jobs, November 9, 2017, https://www.higheredjobs.com/articles/articleDisplay.cfm?ID=1153.
  20. Association of College and Research Libraries, Academic Library Impact on Student Learning and Success: Findings from Assessment in Action Team Projects, prepared by Karen Brown with contributions by Kara J. Malenfant (Chicago: Association of College and Research Libraries, 2017), http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/findings_y3.pdf.
  21. Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 150.
  22. Great Western Library Alliance, The Impact of Information Literacy Instruction on Student Success: A Multi-Institutional Investigation and Analysis, report (Kansas City, MO: GWLA, 2017), http://www.arl.org/storage/documents/publications/The_Impact_of_Information_Literacy_Instruction_on_Student_Success_October_2017.pdf.
  23. “ACRL Trends and Statistics Survey Data.”
  24. Barbara Fister, “Not In the Clear: Libraries and Privacy,” Inside Higher Ed, February 12, 2015, https://www.insidehighered.com/blogs/library-babel-fish/not-clear-libraries-and-privacy.
  25. Kyle Jones and Dorothea Salo, “Learning Analytics and the Academic Library: Professional Ethics Commitments at a Crossroads,” College & Research Libraries 79, no. 3 (2018): 304–23.
  26. Lisa Janicke Hinchliffe and Andrew Asher, “All the Data: Privacy, Service Quality and Analytics” (presentation, American Library Association Annual Conference, San Francisco, June 25–30, 2015), http://alaac15.ala.org/files/alaac15/AlltheDataHandout-HinchliffeAsher-ALAAC15.pdf.
  27. Association of College and Research Libraries, Value of Academic Libraries: A Comprehensive Research Review and Report, report (Chicago: ACRL, 2010).
  28. Maura Seale, “The Neoliberal Library,” in Information Literacy and Social Justice: Radical Professional Praxis, eds. Lua Gregory and Shana Higgins (Sacramento, CA: Library Juice Press, 2013), 39–61.
  29. Ian Beilin, “Student Success and the Neoliberal Academic Library,” Canadian Journal of Academic Librarianship 1, no. 1 (2016): 10–23.
