Teaching scholarly communication skills to graduate students

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

Part 1: 7 May 2019

I am writing this first part of the blog post from my comfortable hotel room in downtown Minneapolis, where I have arrived ahead of the LOEX Annual Conference that will start at the end of the week. Navigating US Immigration at Chicago after the longest flight I have ever taken (15 hours!!) has taken its toll, but I am hoping that arriving a few days early will give me a chance to more or less recover from jet lag ahead of the event starting on the 9th.

The time will also allow me to put the finishing touches to my breakout session presentation. I'll be talking about the efforts we have been making at HKBU Library to ensure our graduate students are equipped with the scholarly communication knowledge and skills that they will need to be successful researchers. For several years we have had required library workshops for our research students that covered the basics of scholarly publishing. These sessions also sought to raise awareness of current issues in scholarly communication, such as open access and altmetrics. Although student feedback has generally been positive, we found it challenging to design sessions that were suitable both for novice researchers and for graduate students who already had publication experience. We also wanted to better assess the extent to which students were achieving the learning outcomes of the session, as the results of relatively simple in-class exercises could tell us only so much.

Our new approach, launched this year, has been to adapt our workshop content into a modular online course. The course is designed so that students can skip content that they are already familiar with. To fulfill the new course requirement, students need to achieve a passing grade on a short online quiz assessing their knowledge of course content. In my presentation, I’ll be sharing the results from our first year of implementation. I’m also hoping to find out what approaches other institutions are taking, and to this end I’ll be using Mentimeter for the entire presentation. I’m a little nervous about having to rely on an online service, but fingers crossed that it runs smoothly. Another benefit is that I will be able to share the results in the second part of this blog post.

Part 2: 11 May 2019

All done! The conference was excellent – there were so many things that I will be bringing back to my own institution. As for my own presentation, everything went smoothly in technological terms. Mentimeter worked as advertised, and the interactive segments seemed to help keep things interesting for the audience. Their responses were incorporated into the presentation material in real time. For example, the results for the question "How is support for graduate student scholarly communication skill development assessed at your institution?" supported a point that I had seen in the literature: that this type of support for graduate students is often not formally assessed.

I also used the built-in quiz function in Mentimeter to showcase some of the questions that we use to assess student learning. Shout-out to Marcela for winning!

You can view the full presentation (including the results of the audience voting) here: https://www.mentimeter.com/s/e1451a492dd1d3a21747448a6ff3ce70

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Did They Learn Anything? Strategies for Evaluating Discussion Posts

By Tasha Maddison, Saskatchewan Polytechnic

Transitioning library instruction to an online platform allows for increased flexibility in content provision, as well as further opportunities for assessment, provided that learners access and fully engage with the content. Saskatchewan Polytechnic has a new online library module for faculty intended to complement face-to-face information literacy sessions. The course covers the entire research process, from identifying an information need to writing and formatting academic papers. Opportunities to assess students' learning are built into each learning outcome through discussion posts and quizzes.

For Arora, Evans, Gardner, Gulbrandsen, and Riley (2015), the key to a successful blended learning project was having "purposefully integrated assessment, both formative and summative, into the actual design of the course" (p. 249), evaluating student learning through quizzes and discussions. Their goal was to "create an online community of active and engaged learners", and through instructor prompts, "students were directed to state what they found valuable in the peers' response, what questions they had … and also what support they would add to make their peers' argument stronger" (Arora et al., 2015, p. 239). The researchers noted that the assessment activities "layered the learning experience – helped to generate and sustain student interest and enthusiasm for the course material to fashion a vibrant community of active learners in the process of meaning- and knowledge-making" (p. 241). Perhaps their most compelling statement is that the "students saw the online classroom as an active learning space, not just a repository for materials or a place to take quizzes" (p. 248).

Students in our online course were informally evaluated on their responses to three discussion posts. Being present and available to the online audience is vital for both student success and student engagement with the learning materials. Students are more likely to participate in a conversation if they feel that someone is reviewing their posts and is prepared to respond when necessary or appropriate. Discussion posts were scheduled at the mid-way point of each learning outcome so that students could reflect on the information that had just been covered, as well as build a foundation for the materials that came later in the outcome. A librarian reviewed all content and responded to most of the discussion posts, providing feedback and suggestions. Anecdotally, responses were thoughtful and thorough, demonstrating a high level of comprehension and a strong connection with the content.

Participation in discussion posts allows students to synthesise their thoughts about a particular topic and, more importantly, to learn from their peer group. Students can review the sentiments shared by their colleagues and learn from their experiences. According to Arora et al. (2015), discussion posts help students develop individual skills such as "critical reading and analysis through close reading and textual analysis" (p. 243) and build community by "encouraging students to work together on their interpretive skills and in negotiating knowledge" (p. 244). This form of assessment is meaningful to the instructor as well, who can learn about each student and their understanding of the materials.

This course was piloted in July 2018. Discussions were included in the pilot, but were not formally assessed at the time. A rubric was then developed to evaluate students' output in the next iteration of the course this spring. The rubric will also help students determine how they are doing in the course and identify any areas for growth.

In order to perfect formative assessment of online library instruction, librarians should include a variety of measurement instruments that fully address student learning. As Lockhart (2015) recognizes, "it is very important that the academic programme continually tests and builds the information literacy skills of students" (p. 20), which can be accomplished by developing strong partnerships with instructors so that librarians can adequately analyse student learning and success.

The module discussed here was developed by Diane Zerr and Tasha Maddison, librarians at Saskatchewan Polytechnic, and is used by students in the Adult Teaching and Learning program.

References
Arora, A., Evans, S., Gardner, C., Gulbrandsen, K., & Riley, J. E. (2015). Strategies for success: Using formative assessment to build skills and community in the blended classroom. In S. Koç, X. Liu & P. Wachira (Eds.), Assessment in online and blended learning environments (pp. 235-251), [EBSCOhost eBook Collection]. Retrieved from https://ezproxy.saskpolytech.ca/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=971840&site=ehost-live&scope=site

Lockhart, J. (2015). Measuring the application of information literacy skills after completion of a certificate in information literacy. South African Journal of Libraries and Information Science, 81(2), 19-25. doi:10.7553/81-2-1567

[Table: the rubric used in the assessment]

This article gives the views of the author and not necessarily the views the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Future of Brain-Work

Brain-Work has been publishing weekly for the past four years and the blog advisory team is using this anniversary as an opportunity to review how the blog is working for you, our readers. We are looking at all aspects of the blog and everything is up for discussion – the topics we cover, the type and length of posts, publishing frequency, the name – everything!

We want to hear what you like about Brain-Work, what you would like to see changed, and how the blog can support your research and professional practice. Let us know what you think in the short survey below. We will share the results on the blog and use your feedback to help guide the future of Brain-Work.

Take the Brain-Work reader survey now.

Gathering Evidence by Asking Library Users about Memorable Experiences

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

For this week’s blog, I thought I’d share a specific question to ask library users that’s proving itself highly useful, but that I haven’t seen used much before in library assessment:

“Tell me about a memorable time in the library.”

Working with colleagues Cameron Hoffman-McGaw and Meg Ecclestone, I first used this question during the in-person interview phase of an ongoing study on information literacy (IL) practices in academic library spaces. In response, participants gave detailed accounts of studying with friends, moments that increased or decreased their stress levels, and insight into the 24/7 Learning Commons environment – a world that librarians at my place of work see very infrequently, as the library proper is closed after 10 p.m. The main theme of the answers was the importance of supportive social networks that form and are maintained in the library.

The question was so successful in the qualitative phase of our IL study that I was curious how it might translate to another project – an upcoming major library survey that was to be sent to all campus library users in March 2016. Here's the text of the survey question that we used:

“Tell us about a memorable time in the library. It might be something that you were involved in, or that you witnessed. It might be a positive or negative experience.”

It wasn't a required question; people were free to skip it. But 47% (404/851) of survey takers answered it, and the answers ranged in length from a sentence to several paragraphs. While analysis of the data generated by this question isn't complete, some obvious themes jump out. Library users wrote about how both library services and spaces can ease or cause anxiety and stress, the importance of social connections and the accompanying support received in our spaces, the role of the physical environment, and the value placed on the library as a space where diverse people can be encountered, among many other topics.

To what end are we using data from this question? First, we're doing the usual analysis – looking at the negative experiences and emotions users expressed and evaluating whether changes need to be made, policies created, etc. Second, the question helped surface some of the intangible benefits of the library, which we hadn't spent a lot of time considering (emotional support networks, the library's importance as a central place on campus where diverse groups interact). Now librarians are able to articulate a wider range of these benefits – backed up with evidence in the form of answers to the "memorable time" question – which helps when advocating for the library on campus and when connecting to key points in our Academic Plan document.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Assessment and evidence based library and information practice

by Lorie Kloda
Assessment Librarian, McGill University

I have held the position of Assessment Librarian for almost three years, and have been involved in the evidence-based library and information practice (EBLIP) movement for over a decade. Since taking on this position, I have been trying to make sense of EBLIP in my job – trying to understand how these two concepts complement each other, overlap, or even contradict one another.

In a 2006 article, "EBL and Library Assessment: Two Solitudes?", Pam Ryan, then the Assessment Librarian at the University of Alberta, asked of assessment and EBLIP: "Are these separate movements within librarianship forming theoretical bridges? Is some sort of merger, fusion, or takeover in the future?" It's almost 10 years later, and I think this question remains unanswered. I think that part of the answer lies in the way in which assessment and EBLIP relate to one another, not just on a theoretical level, but on a practical level.

In my work, I see assessment as having two (not mutually exclusive) goals: one, to inform decision-making for quality improvement in order to anticipate and meet users' needs, and two, to demonstrate impact or value. There are, however, some occasions (OK, there are a lot of occasions) when one cannot conduct assessment. Hurdles to assessment include a lack of time, data, resources, experience, and skills. In cases where one cannot conduct assessment, whatever the reason, one can make use of evidence (credible, transferable findings from published research) to inform decision-making.

One of the roles of an assessment librarian, or really, any librarian working in assessment and evaluation, is to foster a culture of assessment in the organization in which they work. According to Lakos and Phipps,

“A culture of assessment is an organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes for customers and stakeholders.”

I understand the above quote to mean that librarians need research, analysis of local data, and facts in order to plan and make decisions to best serve library users. A culture of assessment, then, is also one that is evidence-based. I find this idea encouraging and I plan to spend some time thinking more about how the steps in EBLIP and assessment overlap. While I think the realm of library and information practice is still far from a takeover or merger when it comes to assessment and EBLIP, I think the two will continue to mingle and hopefully foster a culture which leads to increasingly improved services.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Library Assessment Data Management ≠ Post-It Notes

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

Over the last few years, data librarians have become increasingly focused on data management planning as major funders and journals insist that researchers have data management plans (DMPs) in place. A DMP is a document that outlines how data will be taken care of during its life cycle. Lately I've spent a lot of time thinking about how my data service portfolio dovetails nicely with library assessment activities. A lot of discussion in the library evidence-based practice community is about obtaining and analyzing data and stats, with little emphasis on the stewardship of that information. Recently, I gave a talk at an assessment workshop put on by the Council of Post-Secondary Library Directors where I reflected on data management for assessment activities. This post introduces some of the steps in working out a DMP. Please note that while DMPs usually refer to data, not statistics, in my world the 'D' stands for both.

Step 1: Take an Inventory
You can’t manage what you don’t know about! Spend some time identifying all your sources of evidence and what format they’re in. I like to group by themes – reference, instruction, e-resources, etc. While you’re doing this, it’s also helpful to reflect on whether the data you’re collecting is meeting your needs. Collecting something that you don’t need? Ditch it. Not getting the evidence you need? Figure out how to collect it. Also think about the format in which the data are coming in. Are you downloading a PDF that’s making your life miserable when what you need is a .csv file? See if that option is available.
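To make Step 1 concrete, here is a minimal sketch of how an inventory could be generated automatically rather than by hand. Everything in it is illustrative: the shared-drive path, the assumption that files are grouped into one folder per theme, and the CSV columns are placeholders, not a description of our actual setup.

    # inventory.py – an illustrative sketch of building an assessment-data inventory.
    # Assumes (hypothetically) that evidence files live under a shared folder with
    # one subfolder per theme, e.g. .../reference/, .../instruction/, .../e-resources/
    import csv
    import datetime
    from pathlib import Path

    EVIDENCE_ROOT = Path("S:/library-assessment")  # placeholder shared-drive path

    def build_inventory(root: Path, out_file: str = "data_inventory.csv") -> None:
        """Record each file's theme, name, format, and last-modified date in a CSV."""
        with open(out_file, "w", newline="", encoding="utf-8") as fh:
            writer = csv.writer(fh)
            writer.writerow(["theme", "file", "format", "last_modified"])
            for path in sorted(root.rglob("*")):
                if not path.is_file():
                    continue
                rel = path.relative_to(root).parts
                theme = rel[0] if len(rel) > 1 else "(uncategorized)"
                modified = datetime.date.fromtimestamp(path.stat().st_mtime)
                writer.writerow([theme, path.name, path.suffix.lstrip("."), modified])

    if __name__ == "__main__":
        build_inventory(EVIDENCE_ROOT)

Skimming the resulting spreadsheet makes it easier to spot sources you no longer need, formats that are awkward to work with, and gaps in what you're collecting.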

Step 2: The ‘Hit by the Bus’ Rule (aka Documentation)
If you’re going to archive assessment data, you need to give the data some context. I like to think of this as the ‘hit by a bus’ rule. If a bus hits me tomorrow, will one of my colleagues be able to step into my job and carry on with minimal problems? What documentation is required by someone else to understand, collect, and use your data? Every single year when I’m working on stats for external bodies, I have to pull out my notes and see how I calculated various numbers in previous years. This is documentation that should be stored in a safe, yet accessible place.
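One low-cost way to satisfy the 'hit by a bus' rule is to write the calculation itself down as a small, commented script rather than as a note on paper, so the rule you used survives from year to year along with the number. The example below is purely hypothetical – the file, column names, and inclusion rule are invented for illustration:

    # annual_stats.py – a hypothetical sketch of documenting how a reported figure is derived.
    # The column names and the inclusion rule are invented; the point is that the
    # calculation itself, not just the result, is recorded where a colleague can find it.
    import csv

    def count_reference_transactions(log_file: str) -> int:
        """Figure reported annually to an external body.

        Rule used (per our notes): count in-person reference and chat questions,
        but exclude directional questions, which are tallied separately.
        """
        total = 0
        with open(log_file, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                if row["question_type"] in ("reference", "chat"):
                    total += 1
        return total

    if __name__ == "__main__":
        print(count_reference_transactions("reference_log_2015-16.csv"))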

[Photo: Post-its don't count as 'documentation.' Neither does a random sheet of paper in a towering stack on your desk. Photo by Wade Morgan]

Step 3: Storage and Backup
You've figured out what evidence and accompanying documentation you need to archive. Now what are you actually going to do with it? For routine data and stats, around my institution we use a shared drive. Within the shared drive I've built a simple website that sits on top of all the individual data files; instead of having to scroll through hundreds of file names, users can just click links that are nicely divided by themes, years, vendors, etc. IT backs up the shared drive, as do I on an external hard drive. If your institution has access to a Dataverse hosted on a Canadian server, that is also a good option.
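For anyone wondering what a "simple website that sits on top of the data files" might involve, here is a rough sketch of one approach: a script that regenerates a single HTML page of links whenever files are added. The folder layout and paths are assumptions for illustration, not a description of our actual shared drive.

    # make_index.py – a rough sketch of generating an HTML index over a shared drive.
    # Assumes one subfolder per theme (e.g. e-resources, instruction, reference).
    from pathlib import Path
    import html

    EVIDENCE_ROOT = Path("S:/library-assessment")  # placeholder shared-drive path

    def build_index(root: Path, out_name: str = "index.html") -> None:
        parts = ["<html><body><h1>Library assessment data</h1>"]
        for theme_dir in sorted(p for p in root.iterdir() if p.is_dir()):
            parts.append(f"<h2>{html.escape(theme_dir.name)}</h2><ul>")
            for f in sorted(theme_dir.rglob("*")):
                if f.is_file():
                    rel = f.relative_to(root).as_posix()
                    parts.append(f'<li><a href="{rel}">{html.escape(f.name)}</a></li>')
            parts.append("</ul>")
        parts.append("</body></html>")
        (root / out_name).write_text("\n".join(parts), encoding="utf-8")

    if __name__ == "__main__":
        build_index(EVIDENCE_ROOT)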

Step 4: Preservation
For key documents, you might consider archiving them in a larger university archive. LibQUAL+ results, documentation, and raw data are currently being archived via my institution’s instance of DSpace.

Step 5: Sharing
I always felt like a hypocrite, imploring researchers to make their data open while I squirreled library data away in closed-access shared drives. Starting with LibQUAL+, this past year I've tried to make as much library data open as possible. This wasn't just a matter of uploading the files to our DSpace; it also involved anonymizing the data to ensure that no one was identifiable.
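Part of that anonymization can be scripted, though it still needs a human eye at the end. The sketch below is only an illustration of the idea – the column names are invented, real LibQUAL+ or local survey exports will look different, and free-text comments in particular need a manual read-through before anything is shared:

    # anonymize_survey.py – an illustrative sketch of stripping identifying fields
    # from a survey export before depositing it. Column names are hypothetical.
    import pandas as pd

    DIRECT_IDENTIFIERS = ["name", "email", "student_id", "ip_address"]
    FREE_TEXT = ["comments"]  # review manually before sharing any excerpts

    def anonymize(in_file: str, out_file: str) -> None:
        df = pd.read_csv(in_file)
        drop_cols = [c for c in DIRECT_IDENTIFIERS + FREE_TEXT if c in df.columns]
        df = df.drop(columns=drop_cols)
        # Small demographic categories can also re-identify people; collapse them.
        if "department" in df.columns:
            counts = df["department"].value_counts()
            rare = counts[counts < 5].index
            df.loc[df["department"].isin(rare), "department"] = "Other"
        df.to_csv(out_file, index=False)

    if __name__ == "__main__":
        anonymize("libqual_raw.csv", "libqual_open.csv")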

If you're going to share data with your university community and/or the general public, keep in mind that you'll need to identify this right away when you're designing your evidence-collection strategies. For example, if you're doing a survey, participants need to be informed that their responses will be made available. Ethics boards will also want to know if you're doing research with humans (monkeys too, but if you've got monkeys in your library you've got bigger problems than figuring out a DMP…).

Perhaps the most important aspect of sharing is setting context around stats and data that go out into the wild. If you're going to post information, make sure there's a story around it to explain what viewers are seeing. For example, there's a very good reason that most institutions score below expectations in the "Information Control" category on LibQUAL+: there isn't a library search tool that's as good as Google, which is what our users expect. Adding some context that explains that the poor scores are part of a wider trend in libraries, and why this trend is happening, will help people understand that it's not that your library is necessarily doing a bad job compared to other libraries.

Want more info on data management planning? Here are a few good resources:

DMP Builder by the California Digital Library

VIU’s Data Management Guide

Research Data Management @ UBC

What are your thoughts about library assessment data and DMPs? Does your institution have a DMP for assessment data? If so, how does your institution keep assessment data and stats safe? Let's keep the conversation going below in the comments, or contact me at kathleen.reed@viu.ca or on Twitter @kathleenreed.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Are Students Succeeding with a Library Credit Course? C-EBLIP Journal Club, October 6, 2014

by Rachel Sarjeant-Jenkins
Client Services, University Library, University of Saskatchewan

I recently had the opportunity to lead our C-EBLIP Journal Club in a discussion of Jean Marie Cook's article "A library credit course and student success rates: A longitudinal study" in College & Research Libraries 75, no. 3 (2014) (available at http://crl.acrl.org/content/75/3/272.full.pdf+html). This article had been sitting on my desk for a few months waiting for that magical moment when I found the time to read it thoroughly. Then came my turn to host journal club. What a perfect opportunity to finally delve into Cook's article! And it couldn't have come at a better time in light of our library's focus on developing a programmatic approach to library instruction and the broader teaching and learning environment in which academic libraries currently find themselves.

Following some 'proper' journal club discussion about the article's methodology and findings, Cook's article proved a wonderful catalyst for a conversation about library instruction at our institution. Initially we were simply envious of Cook's situation, where a library-focused course is among her institution's priorities. But then the questions started.

• Is there value in having a stand-alone library course or is it better to have instruction firmly embedded or integrated into academic program courses? (Of course, this question did not mean we ever stopped desiring that institutional commitment to information literacy — who would!?)
• How do you assess student learning? And, more importantly, how do you gauge the actual ongoing use of that learning by students?

We also talked about library value. The impetus for Cook’s work was institutional interest in ROI; the result was her quantitative research project.
• How, we asked, can qualitative data be used to support (and enhance) quantitative data when demonstrating library value to the parent institution?
So many questions, and only a lunchtime to discuss.

Not surprisingly, our hour just wasn’t enough. What that hour did do, however, was get us thinking. We talked about the known information literacy courses on campus and learned about pockets of embedded instruction by our librarians that we were completely unaware of. We had a lively debate about quantitative and qualitative research and the benefits of each. And of course we talked about assessment, not only that we need to do more of it and do it more consistently, but also the importance of knowing what we are trying to assess and therefore when we want to assess it.

Our journal club hour got me excited and primed for the next steps in developing our library’s programmatic approach to instruction. Cook’s article, and the energetic conversation it inspired, was an excellent beginning.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.