Can you always do “just one more thing”?

by Jaclyn McLean, Electronic Resources Librarian
University of Saskatchewan

I grew up hearing the refrain “just one more thing” about my dad, usually around 6 p.m., as we were all sitting down to supper and his chair sat empty. One of us would say, “well, he probably had just one more thing to do.” And then we would sigh, or laugh, and eat. Now, this isn’t a post about nature/nurture, but I do find it curious that I often find myself trying to squeeze in just one more thing, at the end of the workday, or before going to sleep, and this attitude that I’ve always got time to squeeze something else in can get me into trouble.

Like now, as I am diving into not one, or two, but three new research-type endeavors (and wrapping up a fourth), all with specific and overlapping timelines, and different methodologies and topics. So how did I get here? It’s entirely my own fault, not that I feel bad about it. All of the projects are interesting, variously collaborative and solo, focused on publishing, presentation, and art curation. I am excited about all of them, and can’t wait to dig in and get past this beginning stage.

Planning how the projects will intersect and cohabitate in my brain for the next few months is key. To that end, I’ve been working out a detailed Gantt chart, and working on accepting that this chart will change on a weekly, if not daily, basis. I enjoy having lots on the go: different projects and ideas to divert my attention. I also like making lists and schedules, and organizing my time (and that of others; my collaborators should be warned). A key to my success will be paying attention to this careful planning and checking in regularly on the established timelines, shifting and nudging as circumstances change.

I need to accept that this will all feel overwhelming at some point down the road. Probably when the days get shorter, and the deadlines loom much closer than they do today. Because you see, this isn’t the first time I’ve found myself with a lot on my plate. And I’ve learned that if I can do all the pre-planning, and have an established plan to shift and flex with, I am more effective. Flexibility and rolling with the punches is not my nature, but I am optimistic, and excited about the opportunities coming my way with these projects (and those that might emerge out of them in the future).

But it’s also time to sit on my hands, and stop coming up with new ideas of things I would like to do. Because I need to make sure I don’t exceed my capacity, and switch my perspective from excitement to dread, from optimism to overwhelm. Stopping the flow of new ideas isn’t something I’ll be able to stick to (it’s good to recognize your own flaws, right?), but I am committing here, in this public forum, to write them down for later, or share them with someone else who might be able to take them and run. And I will keep reminding myself that my slate is full for this year. And as we head into a fresh new academic year, doesn’t that sound exciting?

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Research Groups and the Gift of Spaciousness

by Marjorie Mitchell
Research Librarian, UBC Okanagan Library

As I write this, it is early August. The days are long and hot, and a haze of smoke from wildfires tints the air. It’s a time of year I always find spacious. I have spent much of my life guided by the rhythms of the school/academic year and summer is that glorious time-out from regular duties and a period less scripted than most of the rest of the year. It is the time of the year for “projects” and “research” and “planning” and, my favorite, “reflection.” Traditionally, in the next few days, I would move from this feeling of spaciousness to one of increasing claustrophobia and borderline panic. Oh, it always started off as a mild discomfort. Niggling thoughts of “I should get this done before September” shifted to “I better get this data analysis done” to “OMG, I haven’t done nearly as much as I planned to do and now all my deadlines are getting pushed forward and now I have to plan for the classes I have to teach….” and so on into full panic mode.

This year is different. It’s not perfect, and yes, I still have a few “To Do” lists floating around, but I can see a big difference. This year I have seen evidence of increased research productivity and reduced stress, the real advantages of sincere, concerted teamwork, specifically a research team.

I have been actively participating in research investigating the research data management needs of faculty from all across Canada and specifically from my institution, the University of British Columbia. I was not the initiator (a big thank you to Eugene Barsky who did initiate these studies at UBC), nor do I do the bulk of any of the work that goes into this research, and that’s the beauty of these research teams – sharing the work really does make it seem more manageable.

The larger team is a group of Canadian librarians, the Canadian RDM Survey Consortium, who saw a situation developing (research data management plans being made mandatory by multiple international granting bodies) and decided to proactively prepare for the strong likelihood that Canadian granting bodies would follow suit. To prepare effectively, we needed to understand the research data management needs of our researchers across the disciplines. In other words, we needed to conduct original research about the actual practices and needs of researchers. We sought answers to questions ranging from the general (how many research projects did the respondent lead in the past year?) to the specific (how much data did a respondent’s research generate, and where was it stored?). We didn’t research all the disciplines at once. Instead, we started with engineering and the natural sciences, followed by the social sciences and humanities in the second round, and concluded with the health and allied sciences. This has taken over two years to complete.

The smaller team is a shifting group of librarians at UBC who have all participated in this research as we have worked our way through the disciplines. These surveys and their results form the basis of the national research, but they also provided significant insight into our local research landscape. If you have questions about what researchers are doing with respect to research data management, we have discovered some of the answers.

The spirit of collaboration, goodwill, and support that members of these groups exhibit every time we meet (virtually) is inspiring. We discuss the tasks that need to be done for the research, from the ethics applications, to analyzing the data, to writing the paper or poster, to colour schemes for graphics. As we decide on the tasks, we also volunteer for them. One of the biggest advantages of such groups is the depth and breadth of skill within the group. Each of us aspires to create the best paper or poster possible, and each of us contributes something of value.

The other benefit of these collaborations has been the scheduling of the research, analysis, and writing. When working with a group, I don’t always get to set the timeframes for when the work needs to be completed, and that is not a bad thing. Yes, there can be some long days or extra work on a weekend as I race to meet a deadline I agreed to, but, ultimately, not letting the members of this group down is strong motivation for me. I appreciate that all the members of the group are also putting in the time, one way or another. The scheduling is often driven by conference or journal proposal deadlines, and those all happen in the winter and spring, not so much over the summer. And so, this year, RDM research is not on my list of things yet to do before September. They really were right at the Librarians’ Research Institute when they suggested not being a solo researcher.

If your research practice is stalled, or hitting some speed bumps, or just not going the way you envisioned it, think about creating or joining a team/group/consortium. The benefits outweigh the costs significantly. And you might have some fun – I know I do.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Research that is (un-)related to librarianship

by Kristin Hoffmann
University of Western Ontario

I have noticed that conversations about librarians[i] doing research often lead to discussions about whether librarians can or should do research that isn’t related to librarianship or library and information science (LIS). Most often in those discussions, librarians express a desire to do research in any discipline or bemoan the fact that their institution’s policies or practices don’t permit or support them to do research that is unrelated to librarianship.

In a recent study that I did with two colleagues, Selinda Berg and Denise Koufogiannakis, we surveyed academic librarians who work at universities across Canada to explore how various factors are related to research productivity. As part of our survey, we asked participants to report their LIS-related research output over the past five years. A handful of participants remarked on the idea of LIS-related research with comments such as:

“What is LIS research? Is it only research that has been published in LIS journals? The research that I do is primarily focused on teaching and learning. I believe that this also informs LIS, but am unclear if it would be considered strictly LIS research?”

“My area of research is not LIS-related, but librarians [at my university] are restricted to ‘work-related’ projects when applying for sabbatical.”

“Peer-reviewed, published research in non-library fields raises the image and acceptance of librarians as faculty and participants in post-secondary activities in my opinion.”

I admit to having had a strong personal opinion on the matter: that librarians should do research related to librarianship. It has seemed like common sense to me that we research within our discipline. I also feel that “librarianship” is vast, extending far beyond “related to what I do as a librarian,” and so I haven’t perceived this boundary as a restriction.

But I find myself now wanting to be less fixed and more open to considering other ways of looking at this. I am curious to explore the issues around research that is and is not related to librarianship. Questions that interest me include:

What does “research related to librarianship” mean, and how might that meaning differ for librarians who are more or less interested in doing such research?

How does collective agreement language[ii] affect the kind of research that librarians do or the kind of research that they want to do?

How do subject expertise and other advanced degrees influence librarians’ research interests or confidence to carry out research, either related to librarianship or not?

I hope that this exploration will help me, and others, to better understand what is at the root of various perspectives about research that is or is not related to librarianship, so that we can better support and encourage each other as researchers.
__________________________________
[i] My experience is limited to conversations about academic librarians doing research.
[ii] In Canada, most academic librarians are members of faculty associations, and their responsibilities, including research or scholarly activity, are outlined in collective agreements or similar documents.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

“I miss math…”: Strengths & Comfort Zones When Choosing Research Methods

by Laura Newton Miller, on sabbatical from Carleton University

I have the great fortune to be on a one-year sabbatical. I love to learn, and I’ve moved out of my comfort zone by doing more qualitative research. I am interpreting a lot of open-ended comments from many interesting people, and have gone from being overwhelmed to kind of/sort-of comfortable in the mounds of data I’ve collected. I really do appreciate and love the learning.

So, a little story: In late spring, I was helping my 11-year-old son with his homework to find the surface area of triangular prisms. After watching some YouTube videos, we eventually started working through a practice sheet until he finally got the hang of it. While working on some problems myself in order to help him understand, I had a bit of an epiphany: I miss math.

You see, in “real life” I’m an assessment librarian. This started as mainly collections assessment, and eventually broadened to also include service and space. If anyone ever thought that they would like to become a librarian to avoid math, they had best not work in collections, administration, or assessment. I do math all the time in my job. Does it drive me crazy sometimes? Yep. But I like it; I’ve always been pretty good at it.

For the most part, my research so far this year does not include much math. And that’s ok; it doesn’t work for what I’m trying to do at the moment. I have been stretching out of my comfort zone, treading my way through to learn new skills. I guess this is nothing new: I get out of my comfort zone a lot in my regular job too (i.e., I never knew I’d use Excel so much). With learning any new skill, there are overwhelming moments, the “what have I gotten myself into” kinds of moments. They are happening less and less now, but I sometimes find myself comparing this sabbatical to my last one in 2010. At that time, I was just getting used to the idea of doing research at all. One of the things I did was a bibliographic study on graduate biology theses at Carleton University (shameless plug here: http://www.istl.org/11-winter/refereed3.html). There was a lot of math involved. It was a very new process for me and I’m sure I had my doubts at the time, but I also remember saying out loud “I LOVE this”. Not that I’m NOT loving what I’m doing now… I’ve certainly had my “ooh” moments… I just find it more… difficult, maybe?

I love Selinda Berg’s blog post (https://words.usask.ca/ceblipblog/2016/03/22/capacity-not-competencies/) focusing on capacities for research, not just research competencies. I have to keep reminding myself that this is a learning process. I’m definitely growing as a researcher. I remember being part of the Librarians’ Research Institute (2014) (http://www.carl-abrc.ca/strengthening-capacity/workshops-and-training/librarians-research-institute/). Although I can’t find it in my notes (and I still refer to them 🙂 ), I do remember us talking about choosing research methods to answer your questions, understanding the advantages and disadvantages of quantitative, qualitative, or critical/theoretical methods. In the end, though, someone said you do have to feel comfortable with your choice of research method. As an example, if you are a complete introvert, you have to ask yourself if you really want to conduct focus groups or interviews. Just how much do you want to get out of your comfort zone?

I’m happy to be out of my comfort zone, but I have also learned that when I’m looking at future ways to answer my research questions, I need to remember the strengths and skills that I do have. I purposely did not say “weaknesses” because those are the opportunities to learn. I do think that librarians can sometimes be a little “judgey” about some methods (i.e., “not another survey”), and this is not helpful.

Ultimately, choose the research method that is right for your research question, and when weighing the pros and cons of each method, remember your strengths and the learning curve that might be involved. Next time (if it makes sense to do so) I know that I won’t necessarily leave math out of the equation (bad pun intended).

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The right tool for the job: NVivo software for thematic analysis

by Carolyn Doi
Education and Music Library, University of Saskatchewan

This post builds on an earlier one by research assistant Veronica Kmiech, which outlines the process for searching and identifying literature on how practitioners in cultural heritage organizations manage local music collections.1 I have worked with Veronica on this project since summer 2016; it led to a thematic analysis of the literature seeking to better understand the professional practices implemented and the challenges faced in managing, preserving, and providing access to local music collections in libraries and archives.2

Using NVivo to facilitate the thematic analysis in this project proved extremely helpful in organizing and managing the data. With over fifty sources to analyze in this review, the thought of doing this work manually seemed daunting.

Thematic analysis typically encompasses steps that take the researcher from familiarization with the data, through development of codes and themes, and finally to tying those themes to the broader picture within the literature.3 NVivo becomes particularly useful at the coding and theme development stages.

During the coding phase, NVivo helps save descriptions and inclusion and exclusion criteria for each code. These are fairly easy to change as needed; being able to see an overview of the codes you are working with is definitely helpful, and it is easy to create hierarchies within the node sets. Once code labels are identified, coding the dataset involves a lot (!) of highlighting and decisions about which node(s) to assign to each piece of text. Adding new nodes is fairly simple, as there will likely be themes that come up throughout the coding process. Word to the wise: coding is made easier with NVivo, but the software doesn’t do all the work for you. Schedule extra time for this portion of the research.

During the theme development and organization phase, NVivo made it quite easy to sort nodes into broader themes. In practice, this process took a few revisions to fully think through how and why nodes should be sorted and organized. The software has features that assist with finding significance within the themes, including the ability to make mind maps, charts, and word frequency queries. After this process, I identified five broad themes within the literature, some with as few as three associated nodes, and some with as many as thirteen (fig. 1).

Figure 1: Themes and node hierarchy
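For readers curious about what is happening conceptually inside a word frequency query, here is a minimal Python sketch of the underlying idea. All excerpt, node, and theme names below are invented for illustration, and NVivo’s actual implementation is, of course, far more sophisticated: excerpts are tagged with nodes during coding, nodes are grouped into themes, and a query counts words across everything coded under a theme.

```python
from collections import Counter
import re

# Hypothetical coded excerpts: during the coding phase, each piece of
# highlighted text has been assigned one or more nodes (codes).
coded_excerpts = [
    ("Local bands donated demo tapes to the archive",
     ["community engagement"]),
    ("Staff digitized cassettes to preserve local recordings",
     ["preservation", "formats"]),
    ("The collection documents the city's punk scene",
     ["documentation", "community engagement"]),
]

# During theme development, nodes are sorted into broader themes
# (again, these names are hypothetical stand-ins).
themes = {
    "Goals and objectives": ["documentation", "community engagement"],
    "Challenges": ["preservation", "formats"],
}

def word_frequency(theme):
    """Count words across every excerpt coded with any node in the theme."""
    nodes = set(themes[theme])
    counts = Counter()
    for text, codes in coded_excerpts:
        if nodes & set(codes):  # excerpt carries at least one node in the theme
            counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

freq = word_frequency("Goals and objectives")
print(freq["the"])  # → 3, counted across the two matching excerpts
```

The point of the sketch is simply that a word frequency query is a count scoped by the node hierarchy, which is why getting the nodes sorted into sensible themes first makes the query results meaningful.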

Following the development of this hierarchy, I went back into the literature to find examples of how each theme was applied and referred to. When presenting the analysis, these examples were helpful in illustrating the underlying narrative.

This example (fig. 2) shows the nodes within the theme that brings together data on why practitioners choose to collect local music.


Figure 2: Goal and Objective theme

To better illustrate the significance or application of these concepts, I used quotes from the literature as examples. This excerpt works particularly well as an illustration of why heritage organizations might choose to collect local music, why it may present challenges, and why it can be considered unique:

The Louisville Underground Music Archives (LUMA) project was born of the need to document this particular, and important, slice of Louisville’s musical culture. …from a diverse community of bands and musicians, venue and store owners, recording studios and label managers, and fans to maintain the entire story from a broad range of perspectives.4

Pulling quotes such as this one helped me build a narrative around the themes I’d identified and served as a gateway into the literature being analyzed.

The process of analyzing the data this way provided me with a rich resource on which to build the literature review, and a unique map of what the literature represents. While NVivo has some flaws and drawbacks (price, switching between operating systems, and working collaboratively were notable obstacles), the benefits (a gentle learning curve, saving the time of the researcher, and considerable help with organizing data and synthesizing themes) outweighed them in the end. I highly recommend NVivo as a tool to keep in your back pocket for future qualitative analysis projects.

1 “Locating the local: A literature review and analysis of local music collections.” https://words.usask.ca/ceblipblog/2017/01/17/lit-review-local-music-collections/
2 Results from this analysis were recently presented during the 2017 annual meeting of the Canadian Association of Music Libraries (CAML) in Toronto, ON in a paper titled Regional music collection practices in libraries: A qualitative systematic review and thematic analysis of the literature.
3 “About Thematic Analysis.” University of Auckland. https://www.psych.auckland.ac.nz/en/about/our-research/research-groups/thematic-analysis/about-thematic-analysis.html
4 Caroline Daniels, Heather Fox, Sarah-Jane Poindexter, and Elizabeth Reilly. Saving All the Freaks on the Life Raft: Blending Documentation Strategy with Community Engagement to Build a Local Music Archives. The American Archivist, Vol. 78, No. 1 (2015): 238–261.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Grey areas in research

by Christine Neilson, Knowledge Synthesis Librarian
Neil John Maclean Health Sciences Library
Centre for Healthcare Innovation
University of Manitoba

Through the course of my day-to-day duties, I came across an interesting article by Adams et al. about searching for “grey information” for the purposes of evidence synthesis in public health. For anyone unfamiliar with it, evidence synthesis is much more than a literature review. It involves identifying and tracking down all relevant evidence, evaluating it, extracting data, contacting authors to request additional data that is not included in a published article (where applicable), and using that larger pool of data to answer the question at hand. A thorough evidence synthesis includes grey literature – literature published by an entity whose main business is something other than publishing – in an attempt to reduce bias. There can be some seriously heavy statistical analysis involved, and the entire process is a heck of a pile of work. Adams et al. took the idea of grey literature, extended it to other information that is difficult to get hold of, and provided a critical reflection on three separate projects where they relied heavily on “grey information”. When I read their article, I was struck by two things.

First, Adams and colleagues were interested in public health programs that came out of practice, rather than formal research. As they point out, “Interventions and evaluations that were primarily conducted as part of, or to inform, practice may be particularly unlikely to be described in peer-reviewed publications or even formally documented in reports available to others in electronic or hard copy. Information on these activities may, instead, be stored in more private or informal spaces such as meeting notes, emails, or even just in people’s memories.” To me, this statement applies as much to librarianship as it does to public health. I can’t imagine how many awesome library programs and practices we could learn from, except for the fact that few of us have heard about them.

The second thing that struck me as I read this article was that even though the authors conceded that their work was “verging” on primary research, they considered these projects to be evidence syntheses instead. But evidence synthesis relies on published information. Rather than ask for additional information to clarify the data they collected from a published source, the authors gathered new information by interviewing key informants, so to me, they were conducting primary research: full stop. The authors seemed to know what an evidence synthesis actually entails – not everyone can say the same – so I wonder: the work they did was a legitimate form of research, so why would they label it as evidence synthesis? Are the lines between different forms of research really that blurry? Were they trying to avoid going through the REB process? Or were they concerned their work wouldn’t have the status associated with an evidence synthesis, and so they named it to their liking?

I think that sometimes we don’t realize that library research has a lot in common with research in other fields. Like the field of public health, there is so much useful information about our practice that is not widely available or findable. I think we also have our go-to research methods, and opinions about what kinds of publications count… and what don’t. The “how we done it good” articles that simply describe a program or activity have gotten a bit of a bad rap in recent memory. I do agree with those who say that we need more rigorous, research-oriented library publications in general. But simply sharing what was done informs us of what is going on in library practice in a discoverable way. Perhaps we should not be so quick to discourage it.

References

Adams J et al. Searching and synthesising ‘grey literature’ and ‘grey information’ in public health: critical reflections on three case studies. Systematic Reviews. 2016;5(1):164.
Available online at: https://systematicreviewsjournal.biomedcentral.com/articles/10.1186/s13643-016-0337-y


This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

In The End, It All Starts With Really Good Questions!

by Angie Gerrard, Student Learning Services, University of Saskatchewan

While attending an experiential learning showcase on my campus a few weeks ago, I was struck by a common theme mentioned by several faculty presenters. Faculty who work with students undertaking original research projects noted that a common challenge for students was identifying a research question. One faculty member surveyed her students on their experiences with the course research project, and students reported that articulating a research question was the most difficult part of the entire project. An interesting side note: students reported that analyzing their data was the most valuable part of the process.

The challenge of formulating research questions piqued my interest as a librarian, since we are often on the front lines assisting students with the evolution of their topic as the research process unfolds. We often help students navigate the iterative processes of exploring a topic, brainstorming potential avenues of research, asking different questions, undertaking initial searches in the literature, and narrowing or broadening the scope of a question, all the while tweaking the research question and trying to avoid the dreaded ‘maybe I should just switch my topic’. I often wonder if there is an understanding of the time commitment and perseverance required for these initial, complex processes in the research cycle. Clearly, students are struggling with this; the challenge was echoed in Project Information Literacy’s findings: when students were asked what was most difficult about research, 84% reported that getting started was the most challenging (Project Information Literacy, n.d.).

We know that students struggle with these initial stages of the research process, so what can librarians and faculty do to help students get past the hurdle of formulating good research questions? Here are a few suggestions.

Be explicit about the process. Research is iterative, messy, and time-consuming, and students who are new to academic research often arrive with a more linear mental model of the process. To illustrate that research is a process, it is powerful to show students how to take broad course-related research topics, break them down into potential research questions, and discuss how the questions evolve once one gets a taste of the literature and how further refinement takes place as the process continues. When we are explicit about the process, students better understand that the broad topic they start with often evolves into something much more meaningful, unexpected, or interesting.

Encourage curiosity in the research process. At the campus event I alluded to, when I asked faculty how they dealt with students’ struggles to identify research questions, they all reported the importance of students picking something that interests them, something they are curious about. Anne-Marie Deitering and Hannah Gascho Rempel (2017), librarians at Oregon State University, recognized the overwhelming lack of curiosity expressed by students in their study when these students were asked to reflect on their own research process. In response, the authors recommend “that as instruction librarians we needed to enter the process earlier, at the topic selection stage, and that we needed to think more intentionally about how to create an environment that encourages curiosity” (p. 3). In their awesome paper, the authors discuss different strategies they used with first-year students to encourage curiosity-driven research.

Start with a juicy source or artifact! Chat with faculty and ask them to recommend a subject-specific editorial, news article, blog posting, etc. that is controversial and/or thought-provoking. These sources can be old or new; the point is that students start with intriguing sources, not a pre-determined list of research topics. Students examine the sources and then begin to develop various lines of inquiry, which evolve into research questions.

Use the Question Formulation Technique (QFT). Although this technique was developed for the K-12 environment, the approach can be adapted to higher education and beyond.  The QFT has six steps, as summarized in the Harvard Education Letter:

  • Step 1: Teachers Design a Question Focus. This question focus is a prompt in any form (visual, print, oral) that is meant to pique students’ interests and stimulate various questions.
  • Step 2: Students Produce Questions. Students note questions following a set of four rules: ask as many questions as you can; do not stop to discuss, judge, or answer any of the questions; write down every question exactly as it was stated; and change any statements into questions.
  • Step 3: Students Improve Their Questions. Students identify whether their questions are open- or closed-ended and flip them into the alternative form.
  • Step 4: Students Prioritize Their Questions. With the assistance of the teacher, students sort and identify their top questions. Students move from divergent thinking (brainstorming) to convergent thinking (categorizing and prioritizing).
  • Step 5: Students and Teachers Decide on Next Steps. This stage is context specific where students and teachers discuss how they are going to use the identified questions.
  • Step 6: Students Reflect on What They Have Learned. This final step allows students to develop their metacognitive/reflective thinking (Rothstein & Santana, 2011).

Rothstein and Santana (2011) note that “(w)hen students know how to ask their own questions, they take greater ownership of their learning, deepen comprehension, and make new connections and discoveries on their own. However, this skill is rarely, if ever, deliberately taught to students from kindergarten through high school. Typically, questions are seen as the province of teachers, who spend years figuring out how to craft questions and fine-tune them to stimulate students’ curiosity or engage them more effectively. We have found that teaching students to ask their own questions can accomplish these same goals while teaching a critical lifelong skill” (para. 3).

We know that formulating research questions can be a challenge for students. Being honest, explicit, and transparent about this process may help students tackle this challenge. I think we could all agree that encouraging curiosity in research and asking meaningful questions are not practices confined to academia but rather characteristics of lifelong learners.

In the end, it all starts with really good questions!

References

Deitering, A.-M., & Rempel, H. G. (2017, February 22). Sparking curiosity – librarians’ role in encouraging exploration. In the Library with the Lead Pipe. Retrieved from http://www.inthelibrarywiththeleadpipe.org/2017/sparking-curiosity/

Project Information Literacy. (n.d.). Project Information Literacy: A national study about college students’ research habits [Infographic]. Retrieved from http://www.projectinfolit.org/uploads/2/7/5/4/27541717/pilresearchiglarge.png

Rothstein, D., & Santana, L. (2011). Teaching students to ask their own questions. Harvard Education Letter, 27(5). Retrieved from http://hepg.org/hel-home/issues/27_5/helarticle/teaching-students-to-ask-their-own-questions_507#home


This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Impactful research

by Nicole Eva-Rice, Liaison Librarian for Management, Economics, Political Science, and Agriculture Studies, University of Lethbridge Library

Why do we do research? Is it simply to fulfill our obligations for tenure and promotion? Is it to satisfy our curiosity about some phenomenon? Or is it to help our fellow librarians (or researchers in another discipline) to do their jobs, or further the knowledge in our field?

I find myself grappling with these thoughts when embarking on a new research project. Sometimes it’s difficult to see the point of our research when we are stuck on the ‘publish or perish’ hamster wheel, and I suspect it’s all the more so for faculty outside of librarianship. It’s wonderful when we have an obvious course set out for us and can see the practical applications of our research – finding a cure for a disease, for example, or a way to improve school curriculum – but what if the nature of our research is more esoteric? Does the world need another article on the philosophy of librarianship, or the creative process in research methods? Or are these ‘make work’ projects for scholars who must research in order to survive in academe?

My most satisfying research experiences, and the ones I most appreciate from others, have to do with the practical aspects of my job. I love research that can directly inform my day-to-day work, knowing that any decisions I make based on that research are grounded in evidence. If someone has researched the effectiveness of flipping a one-shot and can show me whether it’s better or worse than the alternative, I appreciate their efforts both in performing the study and in publishing their results, because I can benefit directly from their experience. Likewise, if someone publishes an article on how they systematically analyzed their serials collections to make cuts, I can put their practices to use in my own library.

I may not cite those articles – in fact, most people won’t unless they do further research along that line – but they have a direct impact on the field of librarianship. Unfortunately, that impact is invisible to the author/researchers unless we make a point of contacting them and telling them how we were able to apply their research in our own institutions (and I don’t know about you, but I have never done that, nor had it occurred to me to do so until just this minute). So measuring ‘impact’ by citations, tweets, or downloads just doesn’t do justice to the true impact of an article. Even a philosophy-of-librarianship article could have serious ‘impact’ in the way it affects how someone approaches their job – but unless the reader goes on to write another article citing it, nothing proves the very real impact that original article has made.

In fact, the research doesn’t even have to result in a scholarly article – if I read a blog post on some of these topics, I might still be able to benefit from them and use the ideas in my own practice. Of course, this depends on exactly what the content is and how much rigor you need in replicating the procedure in your own institution, but sometimes I find blog posts more useful in my day-to-day practice than the actual scholarly articles. Even the philosophical-type posts are more easily digested and contemplated in the length and tone provided in a more informal publication.

This is all to say that I think the way we measure and value academic research is seriously flawed – something many librarians (and other academics) would agree with, but that others in academia still strongly adhere to. This is becoming almost a moral issue for me. Why does everything have to be measurable? Why can’t STP committees take the research project as described at face value, and accept other types of impact it could have on readers/policy makers/practitioners rather than assigning a numerical value based on where it was published and how many times it was cited?

When I hear other faculty members discussing their research, even if I don’t know anything about their subject area, I can often tell whether it will have ‘real’ impact or not. The health sciences researcher whose report to the government resulted in policy change obviously had a real impact – but she won’t have a peer-reviewed article to list on her CV (unless she goes out of her way to create one to satisfy the process), nor will she likely have citations (unless the aforementioned article is written). It also makes me think about my next idea for a research project, which is truly just something I’ve been curious about, but for which I can’t see many practical implications other than serving others’ curiosity. It’s a departure for me, because I am usually the most practical of people, and my research usually has to serve the dual purpose of having application in my current workplace and providing fodder for another line on my CV. As I have been thinking about the implications of impact more and more, I realize that, as publicly paid employees, perhaps we have an obligation to make our research have as wide a practical impact as possible. What do you think? Have we moved beyond the luxury of researching for research’s sake? As employees of public institutions, do we have a societal obligation to produce practical outcomes? I’m curious as to what others think and would love to continue the conversation.

For more on impact and what can count as evidence of it, please see Farah Friesen’s previous posts on this blog, What “counts” as evidence of impact? Part 1 and Part 2.



(Small) public libraries do research too!

By Meghan O’Leary, MLIS, Collections and Reader’s Advisory Librarian, John M. Cuelenaere Public Library

Last October I attended the Centre for Evidence Based Library and Information Practice Fall Symposium and quickly came to the realization that I was the only public librarian in attendance; the year before, there were only two of us. Almost all the presentations were geared towards special or academic libraries, which got me thinking, “Hey! Public librarians do this kind of research too!”

Of course, public libraries do research! Admittedly, research in the LIS discipline is dominated by academic librarians. Even research about public libraries tends to be done mostly by academic librarians. Why is that? Public librarians do not need to publish in the same way that academic librarians need to, but why don’t we publish more research? Do we not have the time or funding? Do we not consider what we do as research worth publishing? These are important questions, but not what I want to discuss today.

What I do want to talk about is what small public libraries, specifically the one I work at, do as far as research is concerned. But first, some background information. I live in Prince Albert, Saskatchewan, and work as the Collections and Reader’s Advisory Librarian at John M. Cuelenaere Public Library. The library has one full branch and one satellite branch out on the west side of the city, and Prince Albert has a population of roughly 40,000 people. Compared to Saskatoon, Regina, Edmonton, Calgary, etc., we are a rather small library.

Small public libraries, like mine, do engage in research. However, the research we do is generally not seen as “traditional” research because data collection is usually an ongoing process and we often do not share it with the LIS community. Matthews (2013) offers a model of “Try, Assess, and Reflect” for public libraries embracing evidence-based librarianship: “try something, gather some data about the effectiveness of the change, and then make some adjustments” (p. 28). Here’s an example of how we used this model: a couple of years ago, we looked at what other libraries were doing and made the decision to launch a small video game collection. After a few months, I gathered statistical information about the new collection, and based on that we tweaked how we were doing things. Some of the items were not being returned, so we limited checkouts to two games per patron. E-rated games were being used more than M-rated games, so I altered my buying habits accordingly. Each month I gather statistical data on the whole collection to see what is being used, what is not, and what the current trends are.

That is an example of how small public libraries use quantitative research methods to guide change; however, there has been a shift in research trends in the LIS community from quantitative to qualitative methodologies. Another project I want to talk about is our most recent strategic planning project. It has been ongoing for a few months now, and we have done various types of information gathering. We used statistical data like gate counts, usage stats, website metrics, etc. to guide us in creating a new strategic plan, but we also gathered qualitative data in three separate strategic planning sessions. Our first session was with the members of our board and library management, the second was with the rest of the library staff, and the third was held with the public. The major topics up for discussion were Facilities, Technology, Collections, Programs, and Community Outreach. The topics were written on large pieces of paper posted around the room; then everyone who attended the session was given a marker (and a cookie, because you have to lure them in somehow) and asked to go around the room and write their ideas under each heading. Each session built on the previous one. We analyzed the information gathered and have started developing a work plan that will target each of the major points. The information gathered has already helped us with the designs for our renovation project, as well as with our budget allocations.

I could write more about the various types of research small public libraries, such as John M. Cuelenaere Public Library, do but I do not want to turn this blog post into an essay! If there are any Brain-Works blog readers out there who are also from public libraries and conduct other forms of research please comment! I would love to hear what other public libraries (large or small) are doing.

Resources

Matthews, J. R. (2013). Research-based planning for public libraries: Increasing relevance in the digital age. Santa Barbara, CA: Libraries Unlimited.



The first few weeks of sabbatical – Time to focus!

by Laura Newton Miller, on sabbatical from Carleton University

I’m lucky to be in the beginning weeks of a one-year sabbatical. This is my second sabbatical, and I seem to be approaching this one a little differently than I did seven years ago.

Unlike my first sabbatical, I started this one with a once-in-a-lifetime family trip to New Zealand. Although it was mostly a holiday, I did have the opportunity to meet and discuss all-things-library with Janet Fletcher and some of the lovely staff at Victoria University of Wellington. I love seeing how other libraries do things, and our discussions really helped me to focus on my particular research. I was also able to discuss my research focus with family in Wellington, and I really appreciated being able to apply their non-library perspective to my own work.

Knowing that I was taking this holiday, I did a lot of initial research pre-sabbatical (ethics approval, survey implementation) so that when I returned, I’d be able to immediately sink my teeth into the analysis. This is different from my first sabbatical, when I started work right away.

So what have I learned so far in my second sabbatical? I will readily admit that I probably have more questions than answers at this point, but I do have some tidbits of what to watch out for….

Limit social media

  • I know, I know – we know this – but it’s tricky sometimes! I find it very easy to do while on vacation, but the combination of jet lag and arriving back just in time for a lot of turmoil south of the Canadian border made it very difficult to focus my first week back. I’m finding staying off social media a little more difficult this time around, but I’m aiming to check in less often.

Find the time to work

  • I have school-age kids. I’m not sure whether the winter weather was better during my last sabbatical, but my kids seem to be around more because of storm cancellations or catching some sickness or bug. That makes it difficult to work during perhaps more “traditional” hours. I’m happy to be there for them, but finding that quiet time can sometimes be a challenge.

I still love analysis

  • I’m reading through comments from my survey. It was overwhelming at first, just sort of “swimming” in all the data, trying to figure out the themes and ways to code things. I’ve finally reached a breakthrough, which is exciting in itself, but even when I’m floundering I still just love it. I’m so excited for all of the things I’m going to learn this year.

I MAY have taken on too much.

Take a vacation/significant break before sinking teeth into work

  • Since I’m really at the beginning of everything right now, I’m still on the fence about whether this has helped my productivity. But it has been wonderful to give myself space between my work life and my sabbatical life, to have a chance to “let go” of some of the work-related things and to really focus. Which leads me to….

Stay off work emails

  • I found this very easy to do for my first sabbatical. Because I’m at a different point in my career now, I find myself checking my email *sometimes* this time around. But I try to limit it to infrequently getting rid of junk mail and catching up on major work-related news.

Do you have any tips on staying focused? I would love to hear them. I’m excited and energized about what my sabbatical year holds!

