Did They Learn Anything? Strategies for Evaluating Discussion Posts

By Tasha Maddison, Saskatchewan Polytechnic

Transitioning library instruction into an online platform allows for increased flexibility in content provision, as well as further opportunities for assessment, provided that learners access and fully engage with the content. Saskatchewan Polytechnic has a new online library module for faculty intended to complement face-to-face information literacy sessions. The course covers the entire research process, from identifying an information need to writing and formatting academic papers. Opportunities to assess students' learning are built into each learning outcome through discussion posts and quizzes.

The key to a successful blended learning project is to have "purposefully integrated assessment, both formative and summative, into the actual design of the course" (Arora, Evans, Gardner, Gulbrandsen & Riley, 2015, p. 249); the authors accomplished this by evaluating student learning through quizzes and discussions. Their goal was to "create an online community of active and engaged learners", and through instructor prompts, "students were directed to state what they found valuable in the peers' response, what questions they had … and also what support they would add to make their peers' argument stronger" (Arora et al., 2015, p. 239). The researchers noted the assessment activities "layered the learning experience – helped to generate and sustain student interest and enthusiasm for the course material to fashion a vibrant community of active learners in the process of meaning- and knowledge-making" (p. 241). Perhaps their most compelling statement is that "students saw the online classroom as an active learning space, not just a repository for materials or a place to take quizzes" (p. 248).

Students in our online course were informally evaluated on their responses to three discussion posts. Being present and available to the online audience is vital for both student success and their corresponding engagement with the learning materials. Students are more likely to participate in a conversation if they feel that someone is reviewing their posts and is prepared to respond when necessary or appropriate. Discussion posts were scheduled at the mid-way point of each learning outcome so that students could reflect on the information just covered, as well as build a foundation for the materials that came later in the outcome. A librarian reviewed all content and responded to most of the discussion posts, providing feedback and suggestions. Anecdotally, responses were thoughtful and thorough, demonstrating a high level of comprehension and a strong connection with the content.

Participation in discussion posts allows students to synthesise their thoughts about a particular topic and, more importantly, to learn from their peer group. Students can review the sentiments shared by their colleagues and learn from their experiences. According to Arora et al. (2015), discussion posts help students to develop individual skills such as "critical reading and analysis through close reading and textual analysis" (p. 243) and build community by "encouraging students to work together on their interpretive skills and in negotiating knowledge" (p. 244). This form of assessment is meaningful to the instructor as well, who can learn about each student and their understanding of the materials.

This course was piloted in July 2018. Discussions were included in the pilot but were not formally assessed at the time. A rubric was subsequently developed to evaluate student output for the next iteration of the course this spring. The rubric will assist students in determining how they are doing in the course and in identifying growth areas.

In order to perfect formative assessment of online library instruction, librarians should include a variety of measurement instruments that fully address student learning. As Lockhart (2015) recognized, "it is very important that the academic programme continually tests and builds the information literacy skills of students" (p. 20). This can be accomplished by developing strong partnerships with instructors so that librarians can adequately analyse student learning and success.

The module discussed here was developed by Diane Zerr and Tasha Maddison, librarians at Saskatchewan Polytechnic, and is used by students in the Adult Teaching and Learning program.

References
Arora, A., Evans, S., Gardner, C., Gulbrandsen, K., & Riley, J. E. (2015). Strategies for success: Using formative assessment to build skills and community in the blended classroom. In S. Koç, X. Liu & P. Wachira (Eds.), Assessment in online and blended learning environments (pp. 235-251), [EBSCOhost eBook Collection]. Retrieved from https://ezproxy.saskpolytech.ca/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=971840&site=ehost-live&scope=site

Lockhart, J. (2015). Measuring the application of information literacy skills after completion of a certificate in information literacy. South African Journal of Libraries and Information Science, 81(2), 19-25. doi:10.7553/81-2-1567

[Figure: table of the rubric used in the assessment]

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

KISS and Julie Andrews: My (unlikely) muses for effective information literacy instruction

by Megan Kennedy
Leslie and Irene Dubé Health Sciences Library
University of Saskatchewan

Perhaps strange bedfellows, but Julie Andrews and the rock band KISS are my muses for effective information literacy instruction.

In the classic film The Sound of Music, Julie Andrews sings "let's start at the very beginning, a very good place to start" – this little ditty has been a guiding principle for all my information literacy instruction thus far in my academic career.

Starting at the beginning is so simple, and yet, at least for myself, I often assume that I can jump ahead. When I began delivering information literacy instruction to students earlier this year, I assumed a few things:

1. Students would be familiar with and understand some of the jargon I would be using (search engine, catalogue, index, database, metadata, indexing fields/record fields, Boolean operators, controlled vocabulary/subject terms and many other librarian-y terms)
2. Students would have some familiarity with the databases I was going to be talking about because they had used them in the past
3. Students would be familiar with the library website enough that they could comfortably navigate to things I was talking about

After just one session – which admittedly ended with a group of very confused-looking students – I realized that this was not going to work; my students needed to know do-re-mi before they could sing! So how did I fix these issues going forward? By always starting at the very beginning and never underestimating the importance of providing simple navigational guidance – it doesn't do students any good to know the ins and outs of searching CINAHL if they can never find the database on the library website. I've also tried to incorporate informal polls/assessments into my teaching to gauge current understanding of the topic at hand, helping me see where more attention needs to be paid and what can simply be a refresher. Something that still needed to be addressed was the language I was using when talking with students, notably my use of unexplained library jargon.

KISS stands for Keep It Simple, Stupid – a wonderful little phrase I picked up from my high school history teacher. The premise is not novel, but I find that this plucky acronym helps to centre my focus when explaining particularly "librarian-y" concepts. For example, did you know that the only people genuinely excited to talk about metadata are librarians and other information folks? Students, for the most part, are not interested in the specific details of data about data, discovery, findability, indexing, etc. I learned this the hard way when talking to a student about how citation managers get the information needed to generate a complete citation. Unfortunately for this student, my librarian brain took over and I talked for a good ten minutes about the intricacies and importance of record metadata. The wide-eyed look I got at the end of my speech told me everything I needed to know: I had not kept it simple and had now confused this poor student. I tried again, slowed down, and thought about it from their perspective: what are the essential bits of information they need to know to understand this concept (no more and no less)? I then gave the student a much simpler explanation, something along the lines of "metadata is the behind-the-scenes information of an item that makes it possible for you to find it. Citation managers can read this information from the item to compile what's needed to make a citation". I could practically hear the light switch flip on in their head – they got it.
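To make that explanation a little more concrete, here is roughly what a citation manager "reads". This is a minimal sketch of a record in RIS format, one common metadata format that citation managers such as Zotero and EndNote can import; the article details below are entirely made up for illustration:

TY  - JOUR
AU  - Doe, Jane
TI  - An example article title
JO  - Journal of Illustrative Examples
PY  - 2018
VL  - 12
IS  - 3
SP  - 45
EP  - 52
ER  -

Each two-letter tag maps to a citation element (author, title, journal, year, volume, issue, pages), which is how the manager can assemble a complete citation in whatever style is needed – no magic, just behind-the-scenes labels.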

When it comes to information literacy instruction, our tacit knowledge as librarians can be a double-edged sword. It makes us excellent “knowers of things”, “information wizards”, “database Yodas” and other delightful monikers, but it can be a somewhat unnatural and awkward process for us to actively stop and think about what we know, how we know it, and how we can explain it in simple and relatable terms. I let Julie Andrews and KISS lead the way for me* – start at the beginning and keep it simple.

*I also like to imagine Julie Andrews as Maria von Trapp teaching the band KISS to sing using the do-re-mi song so that also keeps things interesting.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

A brief and biased comparison of two live polling tools: Poll Everywhere and Kahoot

By Joanna Hare
Run Run Shaw Library, City University of Hong Kong

For this post, I would like to compare the live-polling tools Kahoot (https://create.kahoot.it/) and Poll Everywhere (https://www.polleverywhere.com/). The comparison is based on my experience of using the tools in information literacy workshops to gather formal and informal evidence of learning. I must stress that I have used these for very specific purposes, and have not fully explored the entire functionality of either tool – hence my 'brief and biased' comparison!

The Basics
Both Poll Everywhere and Kahoot are online platforms for creating interactive activities – quizzes, discussions, surveys and more – that can be conducted live in a classroom setting, with participants responding from any web-enabled device. Each tool offers a different selection of unique activities, which will be discussed in more detail below.

Cost
Kahoot is entirely free with no limitations, while Poll Everywhere offers a limited free account alongside a number of different subscription options. The free version of Poll Everywhere is limited to 25 users per poll and does not support grading. The smallest class I teach is around 30 students, and more typically class sizes are 50+, meaning the free version isn't suitable for me.

I have paid to subscribe to Poll Everywhere to get access to the full platform, and their pricing model makes it really easy to sign up for just one month or one semester – they don’t lock you into any annoying contracts. With a paid account you also get excellent personalised customer service (no trawling the online forums to troubleshoot whatever problem you might be having).

Winner: Kahoot (but I find the paid version of Poll Everywhere worthwhile!)

Types of Activity
Kahoot has four types of activity: quizzes, discussions, surveys and ‘jumbles’, where users have to put items in the correct order (see Figure 1). All the activities require you to define pre-determined answers – Kahoot does not support open ended questions.

Poll Everywhere provides almost two dozen different options, including a number of activities supporting open-ended questions and the input of free text (see Figure 2).


Figure 1: Activity options in Kahoot


Figure 2: Activity options in Poll Everywhere

Both platforms allow you to add images or videos either for the purpose of instruction or just to make your presentation a little more visually exciting.

Winner: Poll Everywhere – I have yet to explore all the options, but I hope to try some more next semester!

Fun
If you spend half a minute visiting the Poll Everywhere and Kahoot homepages, you will immediately see that Kahoot has a more colourful, fun interface (see Figure 3), whereas Poll Everywhere has a more 'austere' look (see Figure 4).

Kahoot is the more "fun" platform of the two, with its casual use of language, tense "game show" music and bright graphics. It does have one annoying feature that cannot be turned off: before students can participate they are required to choose a nickname, which then appears on screen – providing the opportunity for students to choose naughty names. Kahoot does allow you to identify and 'kick out' cheeky participants, but in my experience the problem is not so much naughty names as students having too much fun choosing their nicknames! However, I don't begrudge a room full of giggling undergraduates, and it increases the likelihood that students are paying attention to the activity and participating.


Figure 3: A preview of a Kahoot Quiz demonstrating the bright and colourful interface.


Figure 4: The more ‘austere’ interface of Poll Everywhere.

A word on the "game show" music: you can of course just turn the volume off, which I usually do, but I have had one professor ask me to turn it up because she likes the way it grabs the students' attention!

Winner: Kahoot

Ease of use
Both tools take some getting used to if you have never used live polling before, and with either platform I would recommend practicing your quizzes or polls a few times with yourself as a participant.

If I were to recommend one tool over the other for ease of use, it would have to be Kahoot, based on its overall simplicity. The sheer number of options and the granularity in creating live polls in Poll Everywhere may seem a little overwhelming to someone who has never created a live poll before. Starting with Kahoot you will learn the basics of live polling, and from there you can 'graduate' to Poll Everywhere when you are ready for the more advanced features.

Winner: Kahoot

Which live-polling tool reigns supreme?

In my experience: it depends!

My personal preference is Poll Everywhere, thanks to the variety of activities, the granular control it offers over your activities, and its excellent customer service. I feel paying for an account (and the fact that it is easy to 'turn off' your subscription) is good value for money. The free version of Poll Everywhere might work for you if you have small class sizes and no need to grade incoming answers.

I do find Kahoot works quite well with undergraduate students, especially in an English as a Second Language (ESL) context. If you are looking for something light-hearted and easy to use, and you have no need for students to answer open-ended questions, Kahoot might be the tool for you.

What I can say in conclusion is that if you haven’t already tried these tools (or other live-polling platforms) I would highly recommend you give them a go. I have received positive feedback from both students and professors, and they have improved my ability to do both informal and formal assessment even in a short time frame. Ultimately I have found live polling tools energise my teaching, making instruction more engaging for students – and more fun for me!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reframing Instruction: Get Messy

By Tasha Maddison
Librarian, Saskatchewan Polytechnic

As an instructor, I endeavour to incorporate active learning techniques and rely on constructivist theory for my pedagogy; yet inevitably I find myself devoting the first 5-10 minutes of any class period to live demonstrations. Before class, I spend a significant amount of time carefully developing search strategies. These perfect search strings are then used to illustrate how databases work or the features and benefits of a particular tool. I am careful to include points from the students' upcoming class assignment, along with the advanced search techniques that I feel are appropriate for their level. Developing and trialing these searches provides me with an opportunity to prep, as I develop knowledge of my subject area and discover how the tools for that discipline work. Yet this seamless, rehearsed demonstration of search tools fails to acknowledge to students that library research is an interactive process, one that involves many stops and starts, and which may result in unsuccessful searches. Sure, I always throw in a joke about librarians being magicians, but is this enough?

I recently came across a powerful article (Burgess, 2015) that has challenged the way I think about information literacy instruction and, in particular, demonstration versus the development of critical thinking skills. The article suggests that instead of performing a perfectly crafted search, librarians should demonstrate the "messy process of research as exploration", which reveals to the student "some of the key dispositions required of novice (and experienced) researchers: resilience, curiosity, and persistence" (p. 4). This idea builds on the new ACRL Framework, which reinforces the principles of lifelong learning and reiterates that successful information literacy instruction cannot be accomplished in a single transaction with students. Burgess' 'messy research process' and the Framework tie together well, but what happens when librarians have only one opportunity to work with a class and feel pressured to 'cover it all'?

Recently, I had an opportunity to test out this process of exploration. I had a structured lesson plan with a search example in mind, but I let my students decide what we were going to search for and how we were going to conduct the search. Together we built a search string with multiple concepts and a variety of potential synonyms. We decided on a database and which limiters we would apply. The result: success through failure? We found nothing. This failure provided me with a great opportunity to discuss the research process, as well as the necessity of tweaking ideas and concepts to ensure that search results are meaningful and relevant. I even mentioned Burgess' theory. One of my students piped up, "nice save"—and indeed it was! Better still, it helped me to communicate that everyone—even advanced searchers—needs to reassess information and adjust their methodology to accomplish a successful search.
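To give a flavour of what that in-class tweaking looks like, here is a hypothetical before-and-after pair of search strings; the topic my class actually chose that day isn't recorded here, so the subject matter below is invented purely for illustration:

Draft: ("problem-based learning" OR PBL) AND "nursing students" AND rural AND Saskatchewan – limited to peer-reviewed articles from the last five years (zero results)
Revised: ("problem-based learning" OR PBL OR "case-based learning") AND (nursing OR "health sciences") – narrowest concepts dropped, synonyms added, date limit relaxed

Dropping the most restrictive concepts and broadening each remaining one with synonyms is typically the first adjustment to try when a carefully built string returns nothing.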

When discussing the Framework, Burgess (2015) encourages librarians to "evolv[e] instruction from a point-and-click database demo style to an engaged and interactive IL discussion with students. The instructor occupies the role of coach, animator, or advisor leading the discussion, while encouraging students to become active agents in their own learning" (p. 2). Librarians can integrate the principles of the Framework into their teaching by fostering open dialogue within their classrooms. We can create safe environments that allow for questions but also nurture peer learning. One great example that comes to mind is when a student presents a challenging question to the class. The instructor can respond with, "that is an interesting situation, has anyone else come across that?" and "how did you resolve that issue?" Rather than inspiring panic in the library instructor, such questions can offer rich opportunities for experiential learning!

Reference

Burgess, C. (2015). Teaching students, not standards: The new ACRL information literacy framework and threshold crossings for instructors. Partnership: The Canadian Journal of Library and Information Practice and Research, 10(1), 1-6.  doi:10.21083/partnership.v10i1.3440


This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Library Resources (AND, OR, NOT) Google: C-EBLIP Journal Club, March 20, 2017

By Elizabeth Stregger, Discovery & Access Librarian, University of Saskatchewan

Perruso, C. (2016). Undergraduates’ use of Google vs. library resources: A four-year cohort study. College & Research Libraries, 77(5), 614-630. doi:10.5860/crl.77.5.614

There are so many ways to get to our library content. Students can start their research with the catalogue, discovery layer, research guides, library and archives websites, Google Scholar, and of course, Google. In consultation sessions with librarians and library staff over the past year, I’ve learned a lot about how these tools are perceived and taught in different areas of the University of Saskatchewan Library. I chose this article for the C-EBLIP journal club because it includes the question: does library instruction impact students’ initial choice of search tool?

A whole bunch of "value of libraries / librarianship" questions lurk around this topic, like pesky book reviews retrieved in a low-relevancy search. Does library instruction make a difference? Why bother with library discovery systems if students will use Google anyway? Why do students even need libraries if they can write a passable paper using open web resources? One of the journal club members put a stop to these questions with the following zinger: "Is it valuable for kids to go to kindergarten?"

Getting back to the article, we wouldn’t have chosen “Google vs. Library Resources” as the options in the student survey described by the author. Google is a search tool, not a resource. We thought that more appropriate comparisons would include “Google vs. library systems” or “open resources vs. subscribed content.”

The data collection for this longitudinal study began in 2008 when we might have thought differently. So much has changed since then. It is easier to access subscribed resources from Google Scholar or even Google, depending on authentication workarounds. Librarians and archivists put effort into making special collections and other OA resources more discoverable and accessible to all. In a systematic review context, Google searching for grey literature is a recognized expertise. To sum up our conversation: the emphasis is less on what the student is using and more on how they are using it.

These changes in how we think about Google and student searching prompted a discussion of the challenges in conducting longitudinal research in libraries. The survey used in this study was administered a total of eight times over four years. This is a lot of sustained effort for everyone involved. Longitudinal research is subject to significant practical challenges including attrition, which was a factor in this study. It does not have the flexibility to allow for the reframing of survey questions in response to change. These changes become limitations of the research.

The discussion section of this paper included many interesting questions and observations. One of the ideas from the discussion section that we found intriguing was the maturation effect. A lot happens in the years that students spend at university. Library instruction and faculty requirements (the two variables in this study) may have a cumulative effect on how students approach research, but there are many other influences in student life. We discussed several of the other influences that might have an impact, such as interactions with peer mentors or student library workers.

In the end, that is what I will take away from our discussion. A student’s experience is made up of lots of interactions with the library, our people, and our systems. We have less control over the variables than we think. And I’ll hang onto that zinger about kindergarten as I continue learning, experimenting, and making things better.

Conducting Research on Instructional Practices: C-EBLIP Journal Club, May 14, 2015

by Tasha Maddison
University Library, University of Saskatchewan

Journal club article:
Dhawan, A., & Chen, C.J.J. (2014). Library instruction for first-year students. Reference Services Review, 42(3), 414-432.

I was sent this article through an active Scopus alert that I have running on the topic of flipped classrooms. I had not received an article that met my particular search criteria in a long while, so I was excited to read this offering. The authors had included flipped classrooms as one of their recommended keywords for the article, yet the term does not make an appearance until page 426, in a section entitled 'thoughts for improving library instruction', and makes up just over a paragraph of content. It was interesting to witness firsthand how the use of an inappropriate keyword exposed me to research that I probably would not have read otherwise, which is both good and bad. I chose this article for C-EBLIP Journal Club for this reason, as I believed it would generate a spirited debate on the use of keywords, and that it did. Immediately there was a strong reaction from the membership on how dangerous it can be to use deceptive descriptions and/or keywords in the promotion of your work, as you will likely end up frustrating your audience.

I found the scope of the literature review in this article to be overly ambitious, as it covers librarian/faculty collaboration and best practices for instruction in addition to information on the first-year college experience (p. 415). I wondered if the reader would have been better served with a more specific review of the literature on 'for-credit' first-year library instruction. Another point worth noting is the significant examination of the assessment process throughout the article, including information about the rubric that was used as well as evidence from the ACRL framework and the work of Megan Oakleaf; yet the only quantitative data provided in the case study was briefly summarized on page 423.

The group had a lively discussion on the worth of communicating research on instructional practices in scholarly literature. Members questioned whether or not there is value in the 'how we done it good' type of article, and the validity of reporting observations and details of your approach without providing assessment findings or quantitative data. I would argue that there is a need for this type of information within library literature. Librarians with teaching as part of their assigned duties require practical information about course content, samples of rubrics, and details of innovative pedagogy, as well as best practices when using a certain methodology, ideally outlining both successes and failures. Despite the advantages to the practitioner in the field, we speculated about how such information could be used within evidence based practice, as the findings from these types of articles are typically not generalizable and often suffer from inconsistent use of research methodology.

We wondered if there is a need to create a new category for scholarly output. If so, do these articles need to be peer reviewed or should they be simply presented as a commentary? There is merit in practitioner journals that describe knowledge and advice from individuals in the field, detailing what they do. This type of scholarly output has the potential to validate professional practice and help librarians in these types of positions develop a reputation by publishing the results of their integration of innovative teaching practices into their information literacy instruction.

In spite of the fact that this article had little to do with flipped classrooms, I did find a lot of interesting takeaways, including details of student learning services within the library and learning communities on their campus, as well as the merit of providing for-credit mandatory information literacy courses.

Suggested further reading: http://blogs.lse.ac.uk/impactofsocialsciences/2015/05/13/reincarnating-the-research-article-into-a-living-document/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reflections from the C-EBLIP Journal Club, Feb 23, 2015

by Carolyn Doi
Education and Music Library, University of Saskatchewan

For this iteration of the C-EBLIP Journal Club, I decided to feature an article from outside the LIS literature that deals with the topic of reflection, creative processes and digital technologies in the classroom:

Kirk, C., & Pitches, J. (2013). Digital reflection: Using digital technologies to enhance and embed creative processes. Technology, Pedagogy and Education, 22(2), 213-230. doi:10.1080/1475939X.2013.768390

This paper caught my attention for several reasons. The discussion of creative processes and the incorporation of technology in the classroom are particularly interesting to me, and these are topics that often come up when I discuss teaching strategies with other librarians. I was also looking forward to exploring the idea of reflection, both in the classroom and as part of the research process. This is something we have discussed at our institution, in particular through our own Library Leadership Development Program.

The authors of this paper are both scholars at the School of Performance and Cultural Industries at the University of Leeds who have shared the results from a teaching and learning project called Digitalis (http://www.digitalis.leeds.ac.uk/), which “investigates ways in which digital technologies can be used by teaching staff to facilitate reflection on creative practices within performing and creative arts disciplines” (p. 213). The study used action research methodology led by members of a cooperative inquiry group who incorporated reflection and digital technologies into their own teaching practice. They took this a step further and also incorporated reflection as one of the four project phases (planning, action, collection and reflection).

The study featured modules in five areas of study: performance design, dance, music, theatre & performance, and museum studies. In each module, students were asked to reflect on their learning and experience, assisted by different types of digital technology. In one example, students in a compulsory second-year Dance Choreography course were asked to use a flip camera to capture thoughts, ideas and observations, which were used in combination with written reflection and posted to a private blog. The other modules used varying types of reflective processes. Methods of digital capture included flip cameras, audio recorders and digital still cameras. Digital reflection mechanisms included blogs (on Blackboard), PowerPoint and Photo Story 3.

In some cases, the technology may have interfered with the process of critical reflection as some students ended up “concentrating too much on slick production values to the detriment of critical thinking” (p. 224). The paper mentioned that ease of use was an important factor in getting students to feel engaged in the reflection activities. One recommendation that came out of the paper was that digital reflection technologies should be introduced incrementally, as opposed to all at once.

We discussed the value of incorporating technology into the classroom, and also the importance of not letting the technology 'get in the way' of the learning process. Some in our group remarked that they were surprised the incorporation of technology in the classroom might still be a barrier for some students.

The paper reports that students found digital reflection to be advantageous when ‘looking again’ at material which would otherwise have been lost in the creative practice. The digital capturing acted as a way they could benchmark their own impressions of the event, and allowed the performer to experience being an audience member of their own performance.

We discussed the benefits of reflection in two veins: 1) for integration into the classroom and 2) for integration into our own practice. Some questioned the viability of incorporating reflection (especially non-written reflection) into library instruction as we are often faced with the challenge of limited classroom time where it would be difficult to follow up with students. Librarians who teach in disciplines outside of the arts felt that they might just not be able to get their students to try a less conventional reflection method such as video capture. The article prompted some to think about video capture as a means to document and reflect on one’s own teaching practice. Others were thinking about incorporating reflection into other aspects of practice or research, or are currently embarking on projects that do incorporate an element of planned reflection.

The journal club is always an engaging gathering and it’s been interesting to see the various opinions and perspectives that emerge out of the group discussions. I look forward to many more discussions around the journal club table in the coming months!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Are Students Succeeding with a Library Credit Course? C-EBLIP Journal Club, October 6, 2014

by Rachel Sarjeant-Jenkins
Client Services, University Library, University of Saskatchewan

I recently had the opportunity to lead our C-EBLIP Journal Club in a discussion of Jean Marie Cook's article "A library credit course and student success rates: A longitudinal study" in College & Research Libraries 75, no. 3 (2014) (available at http://crl.acrl.org/content/75/3/272.full.pdf+html). This article had been sitting on my desk for a few months waiting for that magical moment when I found the time to read it thoroughly. Then came my turn to host journal club. What a perfect opportunity to finally delve into Cook's article! And it couldn't have come at a better time in light of our library's focus on developing a programmatic approach to library instruction and the broader teaching and learning environment in which academic libraries currently find themselves.

Following some ‘proper’ journal club discussion about the article’s methodology and findings, Cook’s article proved a wonderful catalyst for a conversation about library instruction at our institution. Initially we were simply envious of Cook’s situation, where a library-focused course is one of the areas within her institution’s priorities. But then the questions started.

• Is there value in having a stand-alone library course or is it better to have instruction firmly embedded or integrated into academic program courses? (Of course, this question did not mean we ever stopped desiring that institutional commitment to information literacy — who would!?)
• How do you assess student learning? And, more importantly, how do you gauge the actual ongoing use of that learning by students?

We also talked about library value. The impetus for Cook’s work was institutional interest in ROI; the result was her quantitative research project.
• How, we asked, can qualitative data be used to support (and enhance) quantitative data when demonstrating library value to the parent institution?
So many questions, and only a lunchtime to discuss.

Not surprisingly, our hour just wasn’t enough. What that hour did do, however, was get us thinking. We talked about the known information literacy courses on campus and learned about pockets of embedded instruction by our librarians that we were completely unaware of. We had a lively debate about quantitative and qualitative research and the benefits of each. And of course we talked about assessment, not only that we need to do more of it and do it more consistently, but also the importance of knowing what we are trying to assess and therefore when we want to assess it.

Our journal club hour got me excited and primed for the next steps in developing our library’s programmatic approach to instruction. Cook’s article, and the energetic conversation it inspired, was an excellent beginning.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Library Researcher Series: A Team Approach to Planning and Teaching

by Tasha Maddison
Engineering Library, University of Saskatchewan

During the summer of 2012, a chance meeting of two science liaison librarians led to the creation and development of the initial Library Workshop Series for Scientists and Engineers. DeDe Dawson was eager to address the needs of graduate students and faculty – often these two groups do not receive library instruction and could benefit from sessions on literature searching and research productivity skills. I had just started as a liaison librarian and was eager to begin providing instruction and expand my contacts within the College of Engineering. The idea of providing a series sounded like a perfect opportunity for both of us. Although we acknowledged that the initial course offerings might appeal to a broader audience, we focused our pilot project on our primary areas of liaison work and targeted these graduate students and faculty members specifically in all promotion and marketing initiatives of the series. The series was launched that fall with an initial offering of four classes. All sessions were taught collaboratively and the series was repeated with two additional classes in the winter semester.

Building upon the initial success of the fall semester, the Library Instruction Interest Group piloted a concurrent series offering RefWorks training in the winter semester of 2013. Out of the initial pilot project and the collaboration with the Library Instruction Interest Group, a planning team was formed and the Library Researcher Series was born. DeDe Dawson, Carolyn Doi, Vicky Duncan, Angie Gerrard, Maha Kumaran and Tasha Maddison are the founding and current members of the planning team. The team members represent five of the seven library branches, with discipline coverage in the Sciences, Social Sciences, Education and Fine Arts. Thanks to this interdisciplinary approach to planning, the team is able to offer a series with a broader scope and greater breadth and depth than the original pilot project. The team continues to take a collaborative approach to teaching, reaching out to librarians and other instructors in the university community to offer sessions as part of the series.

A core element of each series since the beginning has been the collection of statistics and feedback associated with each session. This evidence has shown us which sessions are popular and should be offered again, what additional sessions could be developed based on comments received, and how best to market our series. The data collected has also allowed us to document our successes! Since the fall semester of 2012, we have seen an increase in attendance each subsequent semester. Most recently, in the winter of 2014, we averaged 13 participants per session. We also worked hard to brand our series last year, creating a logo and consistent promotional materials such as posters and advertisements in On Campus News, the University of Saskatchewan’s newspaper. Our most successful marketing tool remains the direct emails which are sent to faculty and graduate students from liaison librarians.

Planning is currently underway for the fall of 2014, with a roster of approximately 21 classes on topics such as Comprehensive Literature Review (Part A – Subject Searching, Part B – Keyword Searching), Plagiarism, Scholarly Identity, Making the Most of Google, and the Managing Citations Series (RefWorks, EndNote, Mendeley and Zotero). We are also exploring live streaming and recording some sessions. Part of each planning meeting is dedicated to a review of existing classes, deciding which ones to keep and when is the most suitable time to offer them again. We have generated a list of new topics which are added to the series when appropriate. Expressions of interest are also requested from our colleagues and instructors within our university community. Some classes are favourites and are offered every term, while others come and go from the series.

For more information, please see: http://libguides.usask.ca/LibraryResearcherSeries

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.