Assessment in Action: A Journey through Campus Collaboration, a Learning Community, and Research Design

Members of the first cohort (2014) of the Association of College and Research Libraries’ (ACRL) Assessment in Action (AiA) learning community share the impact of the AiA program on library and university assessment initiatives. This article offers brief examples of effective and challenging cross-campus collaborative assessment projects and the five best practices the authors developed through the year-long experience of examining student success in three different academic library environments.

Introduction

Members of the first cohort (2014) of the Association of College and Research Libraries’ (ACRL) Assessment in Action (AiA) learning community found that their experiences significantly influenced their library and university assessment initiatives. By participating in a structured group assessment process, each team leader learned best practices and practical lessons for coordinating a large-scale assessment project with non-library campus collaborators. The projects ranged from a focused examination of a specific learning outcome in a first-year experience program to the broad initiation of a multi-service program evaluation project that included the campus writing center and speech lab. The individual library assessment projects summarized here were valuable to their library communities on their own, yet the team leaders realized that the most memorable lessons learned focused not on outcomes but on the process itself: the process of planning, leading, and communicating a team-based assessment.

Grand Valley State University

In 2012 Grand Valley State University Libraries piloted a new academic support service based on best practices in peer learning. The service offers one-on-one or small-group consultations with a well-trained student employee in a highly visible yet comfortable location in the center of the library. The twist is that these “peer research consultants” provide consultations in the same space and at the same time as consultants from the campus writing center and speech lab. All three services are independently administered, with different colleges, different deans, different supervisors, and different budgets, yet they work collaboratively to provide comprehensive academic support; that collaborative service is called the Knowledge Market.

The assessment challenge began with the first-year pilot. Not only did administrators need to monitor and adjust regular operations through a formative program evaluation process, but they also needed to begin a more rigorous long-term assessment of the Knowledge Market. The expectation is that the whole service is greater than the sum of its parts. We set out to determine whether positive learning outcomes are greater when a student interacts with multiple services than with only one service at a time. In other words, the assessment should test whether collocating the three services, so that students have research, writing, and public speaking assistance in one low-threshold service, correlates with positive student learning outcomes.

The long-term assessment (still underway) began with a few simple and achievable incremental steps. The library asked each research consultant to end the consultation with an evaluation form that asked three questions about student perception: how comfortable the student was with the consultant, how helpful the consultation was, and how confident the student felt about completing the assignment.
That initial test of measuring basic student perceptions demonstrated that high return rates are possible (we achieved a 98.5% return rate), that student perceptions are positive, and that measuring student learning outcomes in such a variable service will be exceptionally challenging. How can we measure student learning outcomes in a half-hour reference interview? Perhaps the more important question is this: do they need to be measured? And how can we correlate data from the writing center and speech lab with our library data?

One of the most important outcomes of this project was the development of a relationship with the Institutional Analysis (IA) department (called Institutional Research at some other institutions). The library now uses ScheduleIt (a custom appointment-scheduling application used in the Knowledge Market) and LibAnalytics to collect consultation, reference, and instruction data, which is regularly sent to IA along with lists of questions that can only be answered with student-level data (e.g., Which majors are represented by the students who use the consulting service? How many freshmen use the consulting service? How many students at each grade level do we reach through direct library instruction?). The analyst returns the results in aggregate, which preserves the privacy of the students yet still provides the library with rich descriptive and correlative data that would otherwise be inaccessible.

AiA brought assessment to the forefront of conversations about learning assessment and program evaluation in the university libraries. Data collection procedures are now routine, the head of instruction is in regular communication with Institutional Analysis, and early analysis using these new collaborative assessment processes is revealing exciting trends in student self-efficacy, retention, and skill development among library student employees.

Pacific Lutheran University

At Pacific Lutheran University (PLU), the primary focus of the AiA project was to investigate whether the number of information literacy (IL) instruction sessions a student participated in during a First Year Experience Program (FYEP) course positively influenced his or her development of the University’s Critical Reflection integrative learning objective (ILO). This campus priority was selected because of the direct relationship between the ILO and IL. The ILO was operationalized into the following components: demonstrated use of the library, use of a variety of sources, and use of credible/reliable sources. Classroom faculty at PLU, as at other institutions, have expressed concerns about the type and quality of sources students cite in research projects.

Evidence from our research indicates that a series of shorter IL instruction sessions is more beneficial for student learning than one long session. Data gathered from citation analysis of final projects and content analysis of student reflection surveys showed that students receiving multiple IL sessions used library resources at a rate of 80%, compared to 53% for students in the one-shot sessions. The multi-session students also reported employing a greater variety of search strategies to find a broader range of sources.

The PLU project team included representatives from the FYEP and the Office of University Assessment, Accreditation, and Research. The FYEP has been a leader in assessment efforts on campus, as well as a strong supporter and user of library instruction.
Building on these established strengths and relationships gave the AiA project a solid foundation and increased faculty buy-in. Through this project, librarians have developed a greater understanding of the complexity of assessment and the need to have a clear assessment plan in place. Collaborating with more experienced colleagues and attending assessment scoring sessions in other units on campus also increased the visibility of the library’s initiatives. Participation in AiA has helped situate the PLU library as an active participant in assessment efforts on campus. Evidence of this can be found in the recent accreditation report, in which the FYEP/AiA project received a commendation from the Northwest Commission on Colleges and Universities reviewers.

Rockhurst University

Rockhurst University sought to explore its one-shot instruction sessions with English Composition students. Was this approach the best one, and what impact were these sessions having on students’ understanding of the research assignment, the library’s materials and electronic resources, and the library’s services (chat, face-to-face reference assistance, etc.)? Data were gathered using a pre/post survey as well as a citation analysis. Results suggest that 87% of those surveyed were satisfied to very satisfied with their interactions with the library, and 64% reported that they had asked for help at the reference desk. Ninety percent reported that they were very satisfied with library instruction, and 87% noted that they used the library’s website and databases to research course papers and projects.

The RU team consisted of the librarian team leader, who was the department head for research, learning, and assessment services; the assessment coordinator; the English Department’s faculty chair; a faculty member from the Education Department; and another faculty member from the Business School. Membership was later extended to the library director and additional members of the English Department, since the project’s focus was on English composition students. Rockhurst’s participation in the AiA program greatly improved collaboration and communication between the participating departments and the library, as well as among the English and Education departments and the Business School. One positive outcome is that the project will soon be conducted with students in the Business School. Faculty and staff across campus learned that the library was involved in assessment projects and became advocates for our programs, initiatives, and library staff members.

Key Takeaways

Despite the differing nature of the individual projects described above, the librarian team leaders shared a sense that the true value of the AiA experience was to be found in the development of a best-practices approach to the process of assessment. While the specifics of the process will, of necessity, change from one institution to the next and from one project to the next, these core principles can serve as a strong foundation.