Action Research: Keeping Research Real and Relevant to Practice

by Karim Tharani
Library Systems & Information Technology, University of Saskatchewan

I recently took an online course on Research Methods as part of my postgraduate studies in educational technology. I was delighted to discover the title of the required text for the course: Research in Education: Evidence-Based Inquiry! At that moment I remember thinking that I might finally be able to make some headway in making evidence-based inquiry real and relevant to my profession as an academic librarian. Allow me to elaborate. As a member of C-EBLIP, I have been on an ongoing quest to internalize the notion of evidence-based research and practice to the extent that when asked, I should be able to explain it unhesitatingly based on my own experience, and not just by repeating someone else’s definition. Having now been a member of C-EBLIP for a couple of years, I feel a bit embarrassed to still be on this quest, but at moments like these, I typically seek comfort in aspirational and motivational sayings such as: “It is impossible to live without failing at something, unless you live so cautiously that you might as well not have lived at all, in which case you have failed by default” (J.K. Rowling).

Education and librarianship are both practice-based professions where the tradition of research to inform practice is relatively new. In my opinion, this dual responsibility of being practitioner-researchers makes us more open to finding new ways of researching and transcending the traditional, binary world of qualitative and quantitative research. And as I have learned in this course, the notion of evidence-based inquiry is a fundamental enabler for practitioner-researchers to continue to inform their practice through innovative research approaches:

The term evidence-based does not refer to ritualization and using narrow forms of investigation, nor does it necessarily refer to following formal procedures. A study is evidence-based when investigators have anticipated the traditional questions that are pertinent and instituted techniques to avoid bias at each step of data collection and reasoning. (McMillan & Schumacher, 2014)

It was not until I learned about the concept of action research in this course that I truly started connecting the dots and questioning why and how I do research. I also realized that my initial understanding of research (as an intellectual exercise that is undertaken to identify, investigate, and analyze issues that matter to as many people as possible) was very narrow. With action research, it seems possible for me to integrate my research and practice as a practitioner-researcher within the profession of academic librarianship.

[A]ction research is a participatory, democratic process concerned with developing practical knowing in the pursuit of worthwhile human purposes, grounded in a participatory worldview which we believe is emerging at this historical moment. It seeks to bring together action and reflection, theory and practice, in participation with others, in the pursuit of practical solutions to issues of pressing concern to people, and more generally the flourishing of individual persons and their communities. (Reason & Bradbury, 2001)

Personally, the notion of action research came as a great relief to me as a practitioner-researcher. Since I am more comfortable working on projects to resolve practical issues, I find the notion of action research to be more compatible with the principle of evidence-based inquiry. The action research paradigm also embraces local context as a perfectly valid and acceptable setting for research, as long as the underlying research design is valid. In other words, I can focus on the impact of my research on a single community, organization, or department without the burden of justifying my research’s applicability or extensibility to broader or more generic contexts. And last but not least, I find that action research welcomes research collaborations and partnerships, which I greatly appreciate.

While my quest may not be over yet, this course has helped me internalize the notion of evidence-based inquiry. I remain curious about how others have come to apply the notion of evidence-based inquiry in their research and practice. This is where you (the readers of this blog) come in. Yes, you! The Brain-Work blog is a way for us to learn from each other, so please drop a line or two and share your thoughts on this or other ideas with your fellow practitioner-researchers. 🙂

McMillan, J. H., & Schumacher, S. (2014). Research in education: Evidence-based inquiry. Boston: Pearson Higher Ed.

Reason, P., & Bradbury, H. (Eds.). (2001). Handbook of action research: Participative inquiry and practice. London: Sage Publications.

Rowling, J. K. (n.d.). In Famous quotes and quotations at BrainyQuote.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Assessment and Evidence Based Library and Information Practice

by Lorie Kloda
Assessment Librarian, McGill University

I have held the position of Assessment Librarian for almost three years, and have been involved in the evidence-based library and information practice (EBLIP) movement for over a decade. Since taking on this position, I have been trying to make sense of EBLIP in my job – trying to understand how these two concepts complement each other, overlap, or even contradict one another.

In a 2006 article, “EBL and Library Assessment: Two Solitudes?”, Pam Ryan, then the Assessment Librarian at the University of Alberta, asked of assessment and EBLIP: “Are these separate movements within librarianship forming theoretical bridges? Is some sort of merger, fusion, or takeover in the future?” Almost ten years later, this question remains unanswered. Part of the answer, I think, lies in the way in which assessment and EBLIP relate to one another, not just on a theoretical level, but on a practical level.

In my work, I see assessment as having two (not mutually exclusive) goals: one, to inform decision-making for quality improvement to anticipate and meet users’ needs, and two, to demonstrate impact or value. There are, however, some occasions (OK, there are a lot of occasions) when one cannot conduct assessment. Hurdles to assessment include a lack of time, data, resources, experience, and skills. In cases where one cannot conduct assessment, whatever the reason, one can make use of evidence (credible, transferable findings from published research) to inform decision-making.

One of the roles of an assessment librarian, or really, any librarian working in assessment and evaluation, is to foster a culture of assessment in the organization in which they work. According to Lakos and Phipps,

“A culture of assessment is an organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes for customers and stakeholders.”

I understand the above quote to mean that librarians need research, analysis of local data, and facts in order to plan and make decisions to best serve library users. A culture of assessment, then, is also one that is evidence-based. I find this idea encouraging and I plan to spend some time thinking more about how the steps in EBLIP and assessment overlap. While I think the realm of library and information practice is still far from a takeover or merger when it comes to assessment and EBLIP, I think the two will continue to mingle and hopefully foster a culture which leads to increasingly improved services.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.


by Margy MacMillan
Mount Royal University Library

I practiced SoTL for at least 5 years in blissful ignorance of its existence. You too may be a SoTList or have SoTList leanings and not even know it; it may well be time to explore the Scholarship of Teaching and Learning.

The research projects I’m currently involved in occur at the intersection of information literacy (IL) and SoTL, and like all intersections it’s an exciting, slightly unsettling place to be. There’s a lot of movement in many directions, a lot of choices on where to go next, and some things just have to wait until there’s a break in the traffic to get going. Standing at this intersection I’ve had some time to think about the links between SoTL and evidence based library and information practice (EBLIP)…
Hanoi, 2013. By D. MacMillan


SoTL might be described as evidence-based practice in teaching. It is focused, like EBLIP, on gathering evidence to understand different situations and/or the impact of different interventions. It uses a range of methodologies and works both within and across discipline boundaries. While it is most obviously akin to evidence-based research in IL, branches of SoTL concerned with technology or institutional cultures may resonate with other library researchers. Much like EBLIP conferences, where those who work with bioinformatics data discover common ground with public librarians working with citizen science initiatives, SoTL fosters conversations between people who might not otherwise meet. Academics working in SoTL don’t always get much support for their research at their own institutions (sound familiar?) or within their own disciplines, and they value conferences both for finding kindred spirits and for the interdisciplinarity that brings fresh ideas and approaches. Since arriving in this welcoming SoTLsphere, I have enjoyed exploring further – attending conferences, getting involved in SoTL on my campus, and currently supporting the SoTL work of colleagues through Mount Royal’s Institute for SoTL.

3 ways SoTL has helped me EBLIP

Methodologies – SoTL work rests on applying disciplinary research methods to understanding teaching and learning. I’ve encountered a really broad range of methods in SoTL work that also apply to EBLIP.

Understanding Threshold Concepts (TCs) – While I first heard of TCs at a library conference, this way of looking at learning is a major focus in SoTL, and I have been able to bring knowledge from SoTL folks into discussions around the new TC-informed Framework for IL.

Focus on building a community – Some SoTLers are involved with building communities on campuses by expanding relationships, providing support, and developing policy. There are many useful insights here for library initiatives and I have benefited from becoming part of a very supportive, cross disciplinary group of scholars.

3 ways EBLIP has helped me SoTL

Better understanding of diverse literatures and how to search them – This has helped me enter a new field, but also allows me to contribute back to the SoTL community on campus as I am aware of resources and tools for searching outside their disciplines.

Longer experience with evaluating the usefulness of small steps and interventions – IL is often assessed at micro levels: the use of a particular tool, or the effectiveness of a teaching strategy, often within a single class. We have developed a number of strategies for examining teaching and learning at this atomized level, which is useful for instructors accustomed to thinking in course-sized chunks.

Understanding how dissemination works – Work like Cara Bradley’s is informing my work with SoTLers in identifying venues for publication, as well as my next project, a study of dissemination patterns in SoTL.

Interest in SoTL among librarians is growing, as evidenced by increasing numbers at conferences and by a colleague in the UK who is writing a book about SoTL and librarians (many thanks to Emma Coonan for a great conversation that clarified many of these thoughts; and if you aren’t reading her blog, The Mongoose Librarian, on a regular basis… well, you should be!). Explore a little, dip into their literature, maybe go to a conference or talk to the teaching and learning folks on your campus… they can use our help, and we might be able to borrow a few things from them. Maybe we’re overdue for a change.

3 good reads about SoTL

Felten, P. (2013). Principles of good practice in SoTL. Teaching and Learning Inquiry: The ISSOTL Journal, 1(1), 121-125.

Huber, M. T., & Morreale, S. P. (Eds.). (2002). Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Menlo Park, CA: Carnegie Foundation.

Hutchings, P. (2010). The scholarship of teaching and learning: From idea to integration. New Directions for Teaching and Learning, 2010(123), 63-72.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Counting What You Cannot Count Or, A Literature Scholar Turned Librarian Ponders the Meaning of Evidence

by Heidi LM Jacobs
Leddy Library, University of Windsor

The first sentence I wrote for this blog post was this: “Perhaps this isn’t the best way to start a blog post for C-EBLIP but I have a confession to make: I am deeply suspicious of evidence.” The more I wrote, the more I realized that everyone is (or should be) deeply suspicious of evidence. We need to think about it carefully and critically and ask the difficult questions of it. The more I wrote about my distrust of evidence, the more I realized that it wasn’t evidence per se that gave me pause, but how evidence is generally defined and approached in LIS.

I think what I really meant when I said “I am deeply suspicious of evidence” is that I am deeply suspicious of numbers and the concept of “indisputable facts.” There are always, I think, more sides to the story than numbers can convey and when I see charts, figures, and graphs, I am always looking for the other parts of the story, the absent narratives, the “random dots,” the findings off the grid. What are the things these numbers aren’t telling us?

My feelings toward evidence could easily be explained by the fact that I am a literature scholar turned information literacy librarian. In both fields, we are trained to look at “evidence” in particular ways. My next blog post will consider what humanities-based modes of thinking could contribute to evidence based library and information practice (EBLIP) but in this post I’d like to pick up something that Denise Koufogiannakis (2013) raised in her EBLIP7 keynote. She observed that “EBLIP’s focus to date has been on research evidence and how to read and understand research better. This is a good thing (I certainly do not want to diminish the importance of the work that has been done in this respect)–but it is not the only thing–and we must begin to explore other types of evidence” (9).

Like Denise, I do not want to diminish the importance of the work being done in EBLIP and my questions about EBLIP are not to challenge or undermine the work done in this area. Rather, in this blog post, I want to respond to what I read as an invitation from Denise to “explore other types of evidence.” What other evidence types might there be? And what might these other types of evidence contribute to EBLIP, to research, and to librarianship? In this and my next blog post, I will be pondering some of these issues and invite others to join me in thinking through these ideas.

As I read in the field of EBLIP, I often wonder where librarians like me, with ties to the humanities, might fit in with evidence-based work. But as I write this, I pause because it’s not that my research, practice, teaching, and thinking aren’t informed by evidence—it’s just that the kind of evidence I summon, trust, and use is not easily translated into what usually constitutes “evidence,” formally and informally. Perhaps I am the kind of librarian who Denise (2012) describes here: “The current model may be alienating some librarians who feel that the forms of evidence they are using are not being recognized as important” (6). One cannot quantify theoretical thoughts or chart reflective practice; some researchers might view this kind of evidence as soft at best, inadmissible at worst.

For a while I thought it might make sense, as the song says, for me and EBLIP to call the whole thing off. However, the more I think about EBLIP the more I think there might be something worth considering. I realize that it’s not that I distrust evidence: I am skeptical of a certain kind of evidence. Or, phrased another way, I trust a different kind of evidence, a kind of evidence I don’t often see reflected in EBLIP.

My argument is not that EBLIP needs to change so that those of us with humanities backgrounds and humanities ways of thinking feel personally included or intellectually welcome in EBLIP endeavours. My argument is, instead, that humanities ways of thinking about evidence could offer EBLIP perspectives and approaches that might take us in new directions in our thinking, research, scholarship, and practice.

Those of us working with students and information literacy know that we can attempt to understand their experiences quantitatively or explore their thoughts and habits qualitatively. But again, to my mind, these kinds of studies are only part of the information literacy story: findings need to be contextualized, studies problematized, evidence questioned, “random dots” explored, premises and practices reflected upon and theorized. When we consider students—or any segment of user communities—in library scholarship, we need to remember that we are studying complex, changeable people who themselves cannot be reduced into charts and graphs. Any “answers” we may find may raise more questions, and arguably, they should.

EBLIP’s suitability for helping us make decisions has been well-explored and theorized. I’m wondering if we could also use evidence—writ large—to help us ask new questions of our practices, our selves, our research and our profession.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

What’s All This About Evidence? C-EBLIP Journal Club, June 9, 2014

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

On June 9 over lunch, the C-EBLIP Journal Club met for the first time with 9 librarians in attendance. The plan is to meet every 6 weeks or so, and dates have been set until May 2015 to get us going. The club will run with rotating convenors, the convenor’s role being to choose an article to discuss, book the meeting space, send out meeting announcements and reminder emails, lead the journal club discussion, and write up a discussion summary for the blog. I convened the first meeting and here’s how it went.

Koufogiannakis, D. (2012). Academic librarians’ conception and use of evidence sources in practice. Evidence Based Library and Information Practice, 7(4), 5-24.

1. Brief summary of article

This is a qualitative study using grounded theory methodology that addresses these research questions: What forms of evidence do academic librarians use when making professional decisions? Why do they use these types of evidence? How do academic librarians incorporate research into their professional decision making? Koufogiannakis recruited 19 academic librarian participants using purposeful sampling (in other words, she looked for a particular population: librarians who “had some interest in exploring the use of evidence in relation to their professional decision-making” (p. 8)). These librarians kept a private blog diary for a month, noting questions or problems related to their professional practice and how they came to resolve those issues. She followed up with semi-structured phone interviews, which enabled her to clarify and more deeply analyze specific aspects participants noted in their diaries. The results of this study identify two variations of evidence, hard and soft, and the librarians in the study used both kinds to inform decision-making. They often didn’t think of the soft types of evidence as real evidence, but were using them in conjunction with harder forms of evidence, such as the research literature, to inform their practice. The piece concludes with the idea that EBLIP needs to expand to include a wide variety of evidence types to “bring the EBLIP model closer to one that has truly considered the needs of librarians” (p. 21).

2. Discussion Summary
In retrospect, it is a difficult process to fully participate in a journal club discussion and to take notes for some kind of discussion summary after the fact. The following discussion points are highly contextual. The journal club discussion was informal and conversational, allowing us to take different roads.

• The discussion began with general impressions of the article and these impressions included praise for the clarity of the writing and an appreciation for the articulation and demonstration of material that at first glance seemed obvious but upon deeper reading revealed itself to be thought-provoking and meaningful.
• There was some discussion about the sample of librarians who participated in the study – how was that sample generated, were the librarians similar to the researcher, what would be the implications of that?
• The article takes a broad view of what is considered evidence. A comment was made that we are still so uncertain and that the article highlights an ongoing struggle. Other people may not accept the soft evidence. As librarians we can evaluate sources and all have the potential to have something to offer. Where it can get complex for us (as librarians) is the fact that we serve other research communities. We sometimes inherit those value systems through that work.
• There was a comment about the reflective nature of the research and someone commented that the humanities are doing the same kind of introspective things.
• The discussion moved to evidence based practice in health/medicine. In many models of EBP, the focus seems to be only on the research and not the other two aspects (professional knowledge/expertise; user/client/patient preference).
• Why aren’t librarians going to their associations for information/evidence (e.g., ACRL tools, checklists, etc.)?
• Is soft evidence used in the absence of hard evidence? After this question arose, there was a discussion around the designations of the terms “hard” and “soft.” Is the designation imposing some kind of bias? These are not neutral terms. Why have we as a culture or a society determined a hierarchy for these terms? What other terms might have been used to classify these two types of evidence? We couldn’t come up with anything that wasn’t hierarchical to some extent, and we put that down to the idea that we are organizers by nature.
• If the definition of evidence is expanding, does it not just dilute everything (i.e., describing our situation versus prescribing a direction)?
• Tacit knowledge: gut reactions help to pose the questions which ideally we go on to explore. The impetus to “prove it” helps to solidify thoughts. We often don’t believe our own knowledge is good or that it’s appropriate proof.
• We need to get away from the idea that right and wrong are forever. Need a new standard that allows for new knowledge and flexibility.

This was a great article with which to kick off the C-EBLIP Journal Club. Not only did we discuss the particulars of the research reported in the article; it also acted as a catalyst, sparking a professional discussion we may well not have had otherwise about how we practice as librarians and what it means to practice in an evidence-based way.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.