Extending Our Reach: Automatic Integration of Course and Subject Guides

Britt Fagerheim (britt.fagerheim@usu.edu) is Interim Associate Dean of Public Services, Kacy Lundstrom (kacy.lundstrom@usu.edu) is Interim Head of Reference & Instruction, Erin Davis (erin.davis@usu.edu) is Library Coordinator of Regional Campuses, and Dory Cochran (dory.cochran@usu.edu) is Reference & Instruction Librarian, Utah State University, Logan, Utah.

Librarians at the Utah State University (USU) Merrill-Cazier Library started working with LibGuides in 2007, and USU subject librarians quickly adopted the system. USU is a land-grant institution with a main campus of 14,000 students and several smaller regional campuses and centers throughout the state, many of which rely heavily on online resources. After seven years of working with LibGuides, a Springshare product, we had published approximately seven hundred research guides. The guides varied in purpose and design, and we did not have a consistent or clear view of how students found or used them. We also did not have a template or a structured design, beyond some general best practices. Over time, we started to consider questions around the visibility of LibGuides, more effective ways to integrate LibGuides into courses, and possibilities of using emerging technologies to reach students where they study and conduct research. While the library had already begun manually integrating guides into Canvas, USU’s learning management system (LMS), as a way to extend our online presence, we sought a more automated integration with course and subject guides.

Several factors influenced our decision to integrate LibGuides with the LMS. In particular, we learned of other libraries’ successful integrations with their LMSs, we read about librarians’ thought-provoking usability testing of their LibGuides and subsequent recommendations, and we benefitted from the USU Library’s strong relationship with the University’s Center for Innovative Design and Instruction (CIDI), the campus group managing the LMS. Transitioning to version 2 of LibGuides also provided an ideal moment to address the lack of structure and best practices for design across all guides. We also recognized Springshare’s responsiveness to institution-specific requests and the existence of an open Application Programming Interface (API).

Given these factors, we investigated the usability and design of the library’s subject guides. We focused on the following research questions:

  • How can we maximize the effectiveness of LibGuides, both in design and reach?
  • How can we assess the design and the reach of LibGuides?
  • What role should the subject librarians play in the redesign and automation process, and how can we develop effective workflows?

Literature Review

In exploring our research questions, we relied on literature in various areas, including LibGuides usability studies and resulting best practices, the purpose and use of subject guides in general, and the integration of LibGuides within LMSs.

Numerous examples of LibGuide usability studies exist in the literature. Most often, they include focus groups or student surveys, followed by a redesign of LibGuides at the institution and a recommendation for best practices. Librarians at Metropolitan State University asked students to complete tasks and observed confusion related to search boxes, inconsistent or confusing language, multiple or complicated tabs, and contact information.1 Their recommendations include organizing guides around users’ needs rather than around types of information whenever possible, clarifying jargon relating to databases, journals, and articles, adding a table of contents, using specific names for tabs, and sorting sources by usefulness or relevance. Similarly, Gonzalez and Westbrock recommended best practices such as collaborating with faculty on the design and content of the guide, monitoring use, soliciting feedback, and creating a consistent look and feel.2 Hintz et al. piloted a project in which students compiled a list of top ten recommendations that was shared with faculty and subject liaisons.3 Likewise, Little created a list of ten practical suggestions for LibGuide design aimed at decreasing cognitive overload.4 Similar issues and recommendations arose in another study,5 which mainly emphasized the importance of using clear language, employing navigational signals that place the most important content at the top center or top left of the web page, and tailoring content to the assignment or intended need as much as possible.

Librarians continue to debate the use and purpose of subject guides. Hintz et al. concluded that “students come to subject guides expecting to be firmly guided towards the materials and conventions of accepted scholarly practice.”6 A similar usability study was conducted on subject guides at the University of Alberta and Grant MacEwan University,7 again focusing on issues such as search box visibility, search box confusion, language inconsistency, and poor navigation; the authors compiled best practices and revisions. Another study analyzing twenty-one libraries’ subject guides determined they were not using Web 2.0 technologies to their full capability, noting that once created, guides were typically left for users to find on their own and were infrequently updated, if ever.8 A more recent pilot study explored the impact of a redesign of an institution’s subject guides, emphasizing more standardized design, creation, and management elements across guides, which they found “were generally viewed favorably by both staff and students.”9 This finding is particularly relevant because many institutions find themselves in need of a more holistic approach after many variations of guides have proliferated.

The most current research on embedding libraries within learning management systems includes integrating eReserves and using LibGuides as an instructional tool.10 Bowen and Miller found no pedagogical advantage to putting instructional content in LibGuides rather than on other web platforms.11 Despite these concerns, LibGuides continue to be a preferred platform for many libraries for pedagogical content and curated access to resources. Their continued prevalence has led to more discussion and research in the literature regarding extending the reach of LibGuides, including integrating them into learning management systems.

Some research exists about how to integrate LibGuides effectively into LMSs and how to assess the value of automatically including them in course pages. A few small-scale studies explored uses of manual linking to LibGuides.12 In 2012, Duke University began manually linking relevant guides, but librarians found this unsustainable and worked with a library programmer and a Blackboard support team to automate the integration.13 Similarly, USU librarians have manually linked to LibGuides within Canvas and course syllabi for some time, but not in any structured or automated way. The transition to version 2, combined with our connections with CIDI and its programmers, provided an opportune time to develop better-designed, more unified subject and course LibGuides while simultaneously building an automated integration within Canvas.

Automatic LibGuide Integration within Canvas

Over the past five years, the USU Library developed a collaborative relationship with the University Center for Innovative Design and Instruction. Librarians attend CIDI’s monthly meetings and the groups keep each other abreast of potential collaborative opportunities. The automatic integration of LibGuides into Canvas became a team effort, with multiple librarians and CIDI staff working on solutions.

These collaborations resulted in a CIDI programmer building a Learning Tools Interoperability (LTI) tool within Canvas to pull in the most relevant LibGuide for each course or subject. The tool is activated when someone selects the “Research Help” link within a course page’s left-hand navigation menu, which is now a default navigation option in Canvas (see figure 1).

The LTI tool looks at the Canvas data about that course and retrieves the subject or course guide that appears to be most relevant. For Canvas to identify which guide to pull in for each course, librarians coded the description field of each published, instruction-related LibGuide. Within this field we provided the relevant department or course prefix, the course number, and the instructor name if applicable. We used the pipe character (a vertical bar) to separate elements and placed the complete code within square brackets. Because we had many different instructors teaching first- and second-year English composition courses, it was necessary to add a first and last name component to differentiate guides for particular sections and instructors. Most course LibGuides only required coding at the course prefix and course number level. All subject guides are coded with the most relevant academic department, and we are also able to apply multiple subject codes to guides that apply to several departments:

  • Example 1: [Course Prefix | Course Number | First Name Last Name]
  • Example 2: [ENGL | 2010 | Russell Beck]
  • Example 3: [FCHD | 4830]
  • Example 4: [HIST]

Once the guide is coded with the correct information (which librarians do each time they create a new guide), the LTI tool scans the coding in the description field of available guides using Springshare’s open API and compares it to Canvas course information. It then loads the most specific match and opens the guide within an inline frame element (iframe) in Canvas, rather than linking out to another window. If the LTI tool cannot find a LibGuide coded for the instructor or course, it loads the subject-specific LibGuide most relevant to the course (the LibGuide coded to the department in which the course is offered). For most Canvas courses, the Research Help link opens a subject guide because there are far fewer course-specific guides. Finally, if neither a course nor a subject guide matches the Canvas course, a general library research guide is loaded.
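
To make that selection order concrete, the following sketch shows one way the bracketed description-field codes could be parsed and matched against Canvas course information. It is only an illustration of the fallback logic described above, not CIDI’s actual LTI code; the dictionary fields used for courses and guides are assumptions standing in for what a real LTI launch and the Springshare API would supply.

    import re

    # One "[Prefix | Number | Instructor]" code, with the trailing parts optional,
    # for example "[ENGL | 2010 | Russell Beck]", "[FCHD | 4830]", or "[HIST]".
    CODE_PATTERN = re.compile(r"\[([^\]]+)\]")

    def parse_codes(description):
        """Extract every bracketed, pipe-delimited code from a guide description."""
        return [
            [part.strip() for part in raw.split("|")]
            for raw in CODE_PATTERN.findall(description or "")
        ]

    def select_guide(course, guides, general_guide):
        """Return the most specific published guide for a Canvas course.

        `course` is a dict with "prefix", "number", and "instructor" keys; each
        guide is a dict with "description" and "url" keys. These fields are
        placeholders for what an LTI launch and the Springshare API would supply.
        """
        course_level = subject_level = None
        for guide in guides:
            for code in parse_codes(guide["description"]):
                if code[0] != course["prefix"]:
                    continue
                if len(code) >= 3 and code[1:3] == [course["number"], course["instructor"]]:
                    return guide  # instructor-specific guide: most specific match
                if len(code) == 2 and code[1] == course["number"]:
                    course_level = course_level or guide
                if len(code) == 1:
                    subject_level = subject_level or guide
        # Fall back from a course guide to a subject guide to the general guide.
        return course_level or subject_level or general_guide

    if __name__ == "__main__":
        guides = [
            {"description": "[ENGL | 2010 | Russell Beck]", "url": "/engl-2010-beck"},
            {"description": "[ENGL]", "url": "/english"},
        ]
        course = {"prefix": "ENGL", "number": "2010", "instructor": "Jane Doe"}
        print(select_guide(course, guides, {"url": "/general"})["url"])  # prints /english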

LibGuides Redesign

As we worked with CIDI to develop coding options for integrating LibGuides automatically with Canvas pages, an upgrade to “LibGuides Version 2” became available. We capitalized on this as an opportune moment to improve LibGuides’ design and to address the need to create subject guides for all disciplines. Until this point, only about half of the subject areas had a relevant subject guide, and many of those had not been revised or maintained for some time.

Subject librarians initially had full control over the design and general layout of their subject and course LibGuides. Periodically, subject librarians devoted time during meetings to edit guides and review best practices, but adherence to standards was never enforced. Given this informal process, we created a more structured and informed approach to the library’s use and creation of guides. Since LibGuides were now going to be automatically linked in each Canvas page, we felt it was important to provide a consistent appearance to facilitate students’ use of the guides. In May 2014, we initiated a LibGuides redesign project initially focused on subject guides. Drawing on data from a study of LibGuides and web design best practices conducted at the University of Texas at Arlington, the Coordinator of Library Instruction, a member of the research group, created a template that emphasized graphics over text, with four central boxes on the front page leading to the major content of the guide (see figure 2).

Usability Test Method

In an effort to bring student input into the project, gain feedback on the usability of the new subject guide template, and identify students’ familiarity with the Research Help link in Canvas, the librarian team conducted two focus groups with USU undergraduate students. Focus group participants responded to an advertisement posted on the library website with the understanding that participation was voluntary but that incentives would be provided. The first sixteen students who responded received an invitation to one of two focus groups, which were conducted in mid-February 2015 in the Merrill-Cazier Library. The goal was to interview five to eight students in each focus group, and we met this goal in both sessions. The sessions were held during lunchtime, and participants received pizza and a $15 gift certificate to the Utah State University bookstore.

The same librarian moderated each hour-long session, and an additional team member observed the session, took notes, and recorded the discussion on a handheld recorder. Participants were given consent forms and nametags, and the moderator reviewed the purpose of the discussion and emphasized that there were no wrong answers. We asked each group of students the same questions, except for a few prompts requesting further detail or explanation of an answer. All interview participants were asked to evaluate specific aspects of a revised subject guide as well as the guide as a whole. Using a sociology subject guide, the moderator asked a series of questions relating to the automation and design of the guide (see appendix).

Since the Canvas LibGuide automation was new, the team wanted to find out if students knew what it was, and if they didn’t, what they thought it might include. We first asked the students what they would expect to find under the “Research Help” link in Canvas. All participants were then asked how they typically work through a research project and what they considered to be the most difficult part of such a project.

Following this discussion, the moderator showed the subject guide homepage on the projector and asked all participants a series of questions about the guides, specifically drawing attention to elements that had been redesigned. The first question asked, “What stands out as most important on this page?” The moderator asked additional open-ended questions to explore what students saw as the focus of the LibGuide homepage, whether students knew where to get help from a librarian, and what kinds of information students would expect to find in the specific boxes on the guide homepage.

Question five was a two-part question about what kind of information they hoped to find within the first two tabs of the LibGuide titled “Getting Started” and “Gathering Information” (see figure 3). The moderator then clicked on each of the tabs and showed them what each of the pages contained. The sixth and seventh questions then asked students if the information presented on each of the pages made sense and if too much information was present. The final interview question focused on the LibGuide and asked students what was missing from the guides that would be useful to them. The moderator finished the interview by opening up the Canvas page and showing them the “Research Help” link again, asking whether the name of the tab accurately reflected the LibGuide content.

Focus Group Analysis

The recordings from both focus groups were transcribed by undergraduate student library employees, and two librarians involved with the study then read through each transcript multiple times. In coding the transcriptions, we individually grouped the content by theme for each session’s transcript and later combined our themes to represent a more cohesive understanding of the content. The themes identified were LibGuide features, categories, sections, and suggestions for improvement. Because we analyzed the data with the intention of providing feedback to subject librarians, we completed the analysis with the goal of identifying information that would help subject librarians improve the design and implementation of library guides.

When asked about what they saw as the purpose of the Research Help link, students’ responses were grouped into three codes:

  • research information for a specific class
  • search tools for research
  • help options

Students’ responses to how they typically start a research project were divided into two categories: library resources and nonlibrary resources. Many students mentioned specific sources for research that they use in the library such as books or databases; some used specific terminology like JSTOR and Academic Search Premier. Despite these specific instances, the majority of students said that they start with Google or the references listed in Wikipedia. The primary struggles with the research process that both groups described were the following:

  • topic exploration
  • searching
  • time
  • citations

Focus group participants offered many specific ideas about what they would change and not change about the LibGuides. Students commented that they liked the headings such as “getting started” and “gathering information” and appreciated the clean and simplified layout. Students also praised the multiple ways they could get help, including contact information for the subject librarian, chat features, and self-directed learning opportunities. The strongest points of criticism were the following:

  • consistency in design
  • guide personalization
  • visual creativity in the guide

Students noted that similar types of information were not consistently placed on every tab or page and suggested that librarians create guides with more conformity in content location across pages. They also asked for more guide personalization, focusing on two major aspects: first, students argued that they needed an incentive to explore and use the guide for a class; second, they commented that the language used should be less scholarly and include more second-person pronouns and witty language. Finally, students wanted a more visually creative layout that included non-academic style graphics.

Developing Quantitative Assessment Tools

In addition to the redesign of the subject guides, the other major aspect of the integration was developing mechanisms to gather ongoing metrics on guide usage. We anticipated being able to gather statistics from a range of electronic sources to address some of our major research questions. The specific questions we wanted to address with the quantitative data, developed from our research questions, were the following:

  • How many students are discovering a LibGuide through the automated Research Help link in Canvas?
  • Are students finding the subject guide using the Research Help link in classes that are unmediated by librarians?
  • Are instructors choosing to hide the Research Help link?
  • Which resources within the guides are being used most often? Are any trends developing?

To gather LibGuide data from within Canvas that addressed our research questions, we ultimately needed custom programming. CIDI initially was able to provide a very high-level overview through Canvalytics, a statistics system we accessed within our password-protected Canvas site. These data included the number of times a guide was used within a particular course and the number of unique users for an individual course or subject guide (see figure 4).

These data were organized by college and then department. We could sort the data by the course, but there were still aspects of our research questions that remained unanswered (see figure 5).

Fortunately, a programmer in CIDI developed a tool to record and track the information we needed. The web-based interface enabled us to filter by a range of criteria and export the data to Excel. We could filter by the following (a rough sketch of this kind of filtering appears after the list):

  • term (semester)
  • guide level (course, subject, or the general guide)
  • statistics for a specific guide, organized alphabetically by title
  • delivery method of the course (online, traditional, broadcast)
  • department
  • college
  • campus where the course originated (main campus in Logan or a regional campus or center)
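
As a rough illustration, the short script below reproduces a few of these filters against a CSV export of the usage data. The file name and column names (term, guide_level, delivery, campus, college, department, unique_users) are assumptions made for the sketch, not the actual schema of CIDI’s tool.

    import pandas as pd

    # Hypothetical export from the CIDI reporting tool; the file name and column
    # names used here are assumptions for illustration, not the tool's schema.
    usage = pd.read_csv("research_help_usage.csv")

    # Example: spring-term launches of subject-level guides in online courses
    # originating somewhere other than the Logan main campus.
    subset = usage[
        (usage["term"] == "Spring 2016")
        & (usage["guide_level"] == "subject")
        & (usage["delivery"] == "online")
        & (usage["campus"] != "Logan Main")
    ]

    # Summarize unique users by college and department, echoing the breakdowns
    # available in Canvalytics, and save the result for sharing.
    summary = (
        subset.groupby(["college", "department"])["unique_users"]
        .sum()
        .sort_values(ascending=False)
    )
    summary.to_excel("regional_online_subject_guide_use.xlsx")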

This site was a key breakthrough for gathering usable data, enabling us to begin assessing use of the Research Help link. At the moment, we do not have a way of tracking “Number of Times Used” within the tool created by CIDI, but we are able to pull this information from the more general data within Canvalytics. To learn more about student usage of the guides, we also added Google Analytics to our LibGuides site to measure the amount of time students spend on the LibGuide homepage.

Data Sharing

Our main audience for the data was the many subject librarians who develop course and subject guides and teach library instruction classes in the disciplines. We decided what data would be useful for subject librarians and then commissioned an undergraduate student library employee to help organize the data for each academic department. We then provided a spreadsheet to subject librarians for each of their subject areas.
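
A minimal sketch of that reorganization step, assuming the combined usage data were available as a single CSV with a department column, might split the export into one workbook per department so that each subject librarian receives only their own areas. The file and column names here are illustrative, not the actual layout we used.

    import pandas as pd

    # Combined guide-usage export; the file and column names are illustrative.
    usage = pd.read_csv("research_help_usage.csv")

    for department, rows in usage.groupby("department"):
        # One workbook per academic department so each subject librarian
        # receives only the courses and guides in their own areas.
        with pd.ExcelWriter(f"{department}_libguide_usage.xlsx") as writer:
            # Classes that had at least one unique user of a guide in Canvas.
            rows[rows["unique_users"] > 0].to_excel(
                writer, sheet_name="Courses with use", index=False
            )
            # Total views per guide, sorted from most to least used.
            (
                rows.groupby("guide_title")["times_used"]
                .sum()
                .sort_values(ascending=False)
                .to_excel(writer, sheet_name="Views per guide")
            )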

When we provided the spreadsheets to subject librarians, we included both the raw data and tips about what to look for and how the data might be useful. Each spreadsheet had four tabs. On the first tab, we listed which classes had at least one unique user for the guide in Canvas (see figure 6). We were interested in the use of subject guides because these would most likely be accessed within classes that were not interacting with a librarian. In the past, if a librarian met with a face-to-face or broadcast class or worked directly with an online class, the librarian usually created a course-specific guide and used it in class or otherwise actively marketed it to students. Since students would find and use subject guides on their own, we were particularly interested in finding out in which of these courses students in fact used LibGuides. Courses with high subject guide usage might represent opportunities for subject librarians to target for library instruction.

This page of the spreadsheet also indicated the delivery method of the course. With the rapid evolution and expansion of the online and broadcast courses Utah State University offers, we are still developing methods and practices for integrating into online and interactive broadcast courses, particularly those originating at regional campuses and centers across the state of Utah. We were therefore interested in usage and trends for the online and broadcast classes, hoping to pinpoint which classes might include research projects, identify instructors to collaborate with, rank by usage the guides we might want to focus on, and look for trends particular to the regional campuses and centers.

The second page of the spreadsheet provided to subject librarians showed how many times each guide had been used for a class, for those guides with at least one use (see figure 7). We sorted the spreadsheet by highest use and asked librarians to note which classes had the greatest number of views. Subject librarians could then gauge which course guides were heavily used, perhaps even after a class session, and which subject guides were used often enough to suggest an opportunity for integrating with a class.

For the forty-one subject-based guides within our LibGuide system, we collected a range of statistics for each disciplinary guide from Springshare’s statistics (see figure 8). We collected the total views for the subject guide, whether these views came from within Canvas or not. This would help the librarians gauge overall usage of their subject guides. We also noted the number of views each individual page had received, data that we could not collect from any other sources. Since we had recently redesigned the subject guides with a four-box layout on the homepage, leading to four main pages (several guides have additional pages, depending on the needs of the specific subject guide), it was important to try to assess which pages of the guide were being used the most and which were not used at all. We also highlighted the three most-used links within each guide, to help librarians assess which links students were using most often.

Finally, we wanted to know how many instructors were hiding the Research Help link (see figure 9). The link appears by default in the Canvas menu, but faculty members or instructors can choose to hide it. We suspected that some might hide it accidentally; anecdotal evidence of this came from classes librarians had met with, where instructors had hidden the link because they did not realize it led to the library research guide for the class. If we knew in which courses the link was hidden, we could investigate each course and see whether it appeared to have few or no research assignments, or whether the course did require research and the instructor had perhaps hidden the link without knowing its purpose. We also wanted to check the USU campus or center where each class originated, to monitor for trends among the regional campuses and to explore ways we might focus our marketing.
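
One way such a report could be generated is with the Canvas REST API’s course tabs endpoint, which marks hidden navigation items. The sketch below is an assumption about how the data could be gathered, not necessarily how CIDI’s tool works; the host, token, tab label, and course IDs are placeholders.

    import requests

    CANVAS_BASE = "https://<canvas-host>/api/v1"   # placeholder Canvas hostname
    TOKEN = "<canvas-api-token>"                   # placeholder API token
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    def research_help_hidden(course_id, label="Research Help"):
        """Return True if the course's Research Help navigation tab is hidden.

        Uses the Canvas course tabs endpoint; tabs an instructor has hidden
        carry a "hidden": true attribute in the response.
        """
        resp = requests.get(f"{CANVAS_BASE}/courses/{course_id}/tabs", headers=HEADERS)
        resp.raise_for_status()
        for tab in resp.json():
            if tab.get("label") == label:
                return bool(tab.get("hidden", False))
        return False  # tab not present at all (e.g., the tool is not installed)

    if __name__ == "__main__":
        course_ids = [1234, 5678]  # placeholder course IDs pulled from elsewhere
        hidden = [cid for cid in course_ids if research_help_hidden(cid)]
        print("Courses hiding the Research Help link:", hidden)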

Discussion

Guide Redesign

The qualitative data we gathered from the subject guide usability studies highlighted problems students encountered with research projects in general, as well as suggestions for improving the proposed subject guide layout. Students raised issues similar to those reported in studies conducted by librarians at other institutions, with their comments focusing on topic exploration, searching, time, and citations. Students also noted a lack of consistency in the placement of information within each page and expressed a desire for stronger ties between the guide and their course research, more informal and inviting language, and an increased graphical presence.

Subject librarian feedback was essential throughout the design and revision process. After the initial design changes and the creation of the subject guide template, subject librarians requested that similar changes be made to course guides based on the focus group feedback from students. The result was a new template for course guides, in addition to the subject guide template, which subject librarians could easily adapt. While these findings and discussions did lead to a new template with suggestions for design and best practices, we stopped short of requiring that all course guides be moved to the new design. The template does not meet every need, and librarians are encouraged to consider the goals of a guide in relation to the assignment and course when deciding whether it fits the new template. Even if they opt out of using the template, librarians are still encouraged to follow the suggested guidelines: placing the most important boxes at the top left and center, limiting the number of databases listed to the three or four most important, and including graphics instead of large amounts of text.

Assessing Use

The early quantitative data on LibGuide usage within Canvas will illuminate our research questions over the long term as we gather data across multiple semesters. Preliminary data show that the heaviest use of research guides overall corresponds to the colleges with the greatest library instruction activity, including the College of Education and Human Services and the College of Humanities and Social Sciences. The College of Education and Human Services accounts for almost half of all online usage of research guides, which corresponds to the high number of online courses in that college. As more colleges expand their online offerings, we will track LibGuide use within courses taught through less-traditional delivery methods.

Because subject LibGuide coverage was not comprehensive across academic departments before the guide redesign, we do not have a baseline for comparison with current subject guide usage. We will, however, continue to track usage within academic departments to identify courses with particular research needs and to target courses that could benefit from more specific information literacy instruction.

Current data related to most-used pages within the subject guides show that after the homepage, the page with the highest use among almost all subject guides is the “Gathering Information” page, followed by a range of usage among the “Getting Started,” “Tutorials & Guides,” and “Organize and Cite” pages.

We will continue to collect and analyze the usage data for the guides in Canvas, in particular analyzing usage trends within academic departments, course versus subject guides, and usage by delivery method (face-to-face, online, or interactive broadcast). We also plan to identify classes where the Research Help link is hidden and determine whether those classes have a research component. We will also work with subject librarians to interpret and use these data to improve their ability to reach students and to identify unmediated classes that might benefit from closer collaboration.

Both the focus group assessment and redesign and the quantitative data on the reach and use of LibGuides have provided welcome opportunities for us to consider our practices carefully, to discuss our goals for these guides, and to reevaluate whether we are meeting students’ research needs in these online spaces.

Conclusion

This study contributes important considerations and ideas for improving the way librarians use and think about LibGuides or other research guides, including expanding their overall reach, creating effective workflows, improving design, and collecting and using assessment data.

One limitation of our study is that it required a relatively high level of computer programming expertise, which we were fortunate to be able to outsource to CIDI. However, the recent release of Springshare’s new LTI tool for LibGuides CMS should make automating LibGuides within a campus LMS much more accessible to libraries with smaller staffs or limited resources. LibGuides CMS customers can directly embed a guide, a specific content box within a guide, or a page into a course.14 Libraries will still need an LMS administrator, but this new tool will not require as much coding, if any.

Future directions include a marketing campaign targeted at faculty and instructors to promote LibGuides. Initially, we chose not to market the Research Help link in Canvas actively until we were certain the program was stable and the coding functioned consistently. In winter 2016, the library’s graphic designer created a print postcard to send to all teaching faculty and instructors at the USU main campus as well as the University’s regional campuses. Before we mailed the cards, we gathered feedback from a small group of faculty members and edited the postcards based on their responses. This marketing will help with general awareness and will hopefully help faculty identify LibGuides as a resource. We will encourage subject librarians to follow up with their faculty members, answer any questions about customizing guides, and continue to collect statistics on LibGuide usage and requests for specific course guides from faculty members.

Understanding the impact of our online presence and reach with students via LibGuides provides a more comprehensive picture of how libraries support student research. A combination of usability testing with students, robust data gathering on research guide usage, and a list of best practices can make LibGuide design easier and more intuitive for subject librarians and more accessible to students. Collaborations with subject or disciplinary librarians who design guides can help libraries implement and assess an automated integration of LibGuides into the LMS, putting guides in students’ hands at their point of need.

References

  1. Alec Sonsteby and Jennifer Dejonghe, “Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study,” Journal of Web Librarianship 7, no. 1 (January 2013): 83–94, https://doi.org/10.1080/19322909.2013.747366.
  2. Alisa C. Gonzalez and Theresa Westbrock, “Reaching Out with LibGuides: Establishing a Working Set of Best Practices,” Journal of Library Administration 50, no. 5/6 (July 2010): 638–56, https://doi.org/10.1080/01930826.2010.488941.
  3. Kimberley Hintz et al., “Letting Students Take the Lead: A User-Centred Approach to Evaluating Subject Guides,” Evidence Based Library & Information Practice 5, no. 4 (December 2010): 39–52.
  4. Jennifer J. Little, “Cognitive Load Theory and Library Research Guides,” Internet Reference Services Quarterly 15, no. 1 (March 1, 2010): 53–63, https://doi.org/10.1080/10875300903530199.
  5. Gabriela Castro Gessner, Adam Chandler, and Wendy Sue Wilcox, “Are You Reaching Your Audience?,” Reference Services Review 43, no. 3 (July 2015): 491–508, https://doi.org/10.1108/RSR-02-2015-0010.
  6. Hintz et al., “Letting Students Take the Lead,” 47.
  7. Dana Ouellette, “Subject Guides in Academic Libraries: A User-Centred Study of Uses and Perceptions / Les guides par sujets dans les bibliothèques académiques : une étude des utilisations et des perceptions centrée sur l’utilisateur,” Canadian Journal of Information and Library Science 35, no. 4 (December 2011): 436–51.
  8. Sara E. Morris and Darcy Del Bosque, “Forgotten Resources: Subject Guides in the Era of Web 2.0,” Technical Services Quarterly 27, no. 2 (April 2010): 178–93, https://doi.org/10.1080/07317130903547592.
  9. Sharon Ince and John Irwin, “LibGuides CMS eReserves: Simplify Delivering Course Reserves through Blackboard,” Interlending & Document Supply 43, no. 3 (July 2015): 145–47, https://doi.org/10.1108/ILDS-05-2015-0014.
  10. Ruth L. Baker, “Designing LibGuides as Instructional Tools for Critical Thinking and Effective Online Learning,” Journal of Library & Information Services in Distance Learning 8, no. 3/4 (July 2014): 107–17, https://doi.org/10.1080/1533290X.2014.944423.
  11. Aaron Bowen, “A LibGuides Presence in a Blackboard Environment,” Reference Services Review 40, no. 3 (July 2012): 449–68, https://doi.org/10.1108/00907321211254698; Kimberly Miller, “No Pedagogical Advantage Found Between LibGuides and Other Web Page Information Literacy Tutorials,” Evidence Based Library & Information Practice 10, no. 1 (January 2015): 75–78.
  12. Bowen, “A LibGuides Presence in a Blackboard Environment”; Steven Shapiro, “Marketing the Library with Content Management Systems: A Case Study of Blackboard,” Library Hi Tech News 29, no. 3 (April 2012): 10–11, https://doi.org/10.1108/07419051211241859.
  13. Emily Daly, “Embedding Library Resources into Learning Management Systems: A Way to Reach Duke Undergrads at Their Points of Need,” College & Research Libraries News 71, no. 4 (April 1, 2010): 208–12.
  14. Talia Richards, “Why Upgrade to LibGuides CMS?: Why LibGuides CMS?,” Springshare Buzz, accessed March 16, 2016, http://buzz.springshare.com/producthighlights/whylgcms.

Appendix. Focus Group Questions

  1. [Moderator opens Canvas to show “research help” link] What do you think you’ll find in this link?
  2. How do you typically work through a research project?
  3. What is typically the most difficult part of a research project for you?
  4. What stands out as most important on this page? (subject guide homepage)
    1. What do students see as the focus of the homepage?
    2. Do they know where to get help from a librarian?
    3. Do students differentiate the four specific boxes that would lead to specific information?
  5. What kind of information (resources or help or information) would you hope to find within this box: Getting started? What kind of information (resources or help or information) would you hope to find within this box: Gathering information?
  6. Getting started page: Does the information presented here make sense? Why or why not?
  7. Gathering information page: Does the information presented here make sense? Why or why not? Is there too much information?
  8. What is missing from this guide that would be useful to you?
  9. [Moderator opens Canvas to show “research help” link] Does the name of this link reflect the LibGuide content?

Figure 1. Research help link in Canvas


Figure 2. Subject guide redesign


Figure 3. Getting started and gathering information


Figure 4. Unique Users and Times Accessed


Figure 5. Breakdown by College


Figure 6. Times Viewed


Figure 7. Guides Used and Course Delivery Method


Figure 8. LibGuides Usage Statistics


Figure 9. Hidden Guides
