The appropriation of evidence based terminology by vendors

by Denise Koufogiannakis
University of Alberta Libraries

Over the past few years, I’ve noticed an increasing number of products being marketed to librarians as “evidence based” tools for improving our decision making. Vendors seem to be latching onto the growth and acceptance of evidence based practice within librarianship and are marketing their products accordingly. They want to appeal to those who see value in data as a driver for decision making.

I recently looked into this more formally (see my EBLIP8 presentation from July of this year) and found two different types of products being promoted as “evidence based”:

1. Data gathering tools for collections analysis – these products are aimed at both academic and public librarians, but there are different products for each. For public libraries, the products focus on information such as circulation and demographic data to aid with management of the collection and new acquisitions. Similar products being targeted to academic libraries focus on collections usage statistics for the purposes of making cancellation decisions, weeding, and showing return on investment. Examples include CollectionHQ for public libraries and Intota Assessment for academic libraries.

2. Evidence Based Acquisition approaches – aimed at academic librarians, “evidence based acquisition” (sometimes called usage-based acquisition) is a relatively new option being offered by publishers, similar to patron-driven or demand-driven approaches. In this model, a group of titles from a publisher (such as all the titles in a particular subject area) is enabled upon commitment from the library to spend an agreed-upon amount of money. Following the agreed-upon time period, the library chooses the titles it wishes to keep, based upon usage of those titles (for more detail see the overview included in the NISO Recommended Practice for Demand Driven Acquisition of Monographs). Examples of this approach can be found with many of the major academic publishers, including Elsevier, Cambridge, and SAGE. (A small sketch of the calculations underlying both product types follows below.)
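To ground this in something concrete, here is a minimal sketch, in Python, of the kind of calculation that underlies both product types: cost-per-use as a cancellation or weeding signal, and end-of-term title selection against an agreed spend in an evidence based acquisition deal. All titles, prices, and usage counts below are hypothetical, and the sketch illustrates the general idea only, not how any particular vendor product works.

    # Illustrative sketch only: hypothetical titles, prices, and usage counts.

    def cost_per_use(annual_cost, uses):
        # A common collections metric; lower is better. Unused items
        # return infinity so they sort to the bottom of any ranking.
        return annual_cost / uses if uses else float("inf")

    def eba_selection(pool, commitment):
        # Mimics the end-of-term step of an evidence based acquisition
        # deal: keep the most-used titles whose combined price fits
        # within the agreed spend commitment.
        ranked = sorted(pool, key=lambda t: t["uses"], reverse=True)
        kept, spent = [], 0.0
        for title in ranked:
            if spent + title["price"] <= commitment:
                kept.append(title["name"])
                spent += title["price"]
        return kept, spent

    pool = [
        {"name": "Title A", "price": 120.0, "uses": 48},
        {"name": "Title B", "price": 95.0, "uses": 3},
        {"name": "Title C", "price": 150.0, "uses": 27},
        {"name": "Title D", "price": 80.0, "uses": 0},
    ]

    for t in pool:
        print(t["name"], cost_per_use(t["price"], t["uses"]))

    print(eba_selection(pool, commitment=300.0))
    # -> (['Title A', 'Title C'], 270.0)

Even in this toy form, the limitation discussed below is visible: the output is a single usage-based signal, one piece of evidence, not the whole of the evidence a decision requires.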

The question I ask myself is whether these products are really evidence based. Can they deliver what they promise when they say they will improve collection management, make librarians’ jobs easier, help with decision making, save time, and provide dependable, high quality service? I guess it is the evidence based, critical side of me that is doubtful.

EBLIP is a process that asks us to consider the whole of the evidence when making a decision: to try to determine what the best evidence is, and to try to see a complete picture by bringing together different evidence sources. EBLIP is an approach to practice that is considered and reflective. Conversely, these products are meant to convince us that because they are called evidence based, they will magically take care of all this hard work for us!

None of this is to say that the products are bad. In fact, they seem to offer potentially useful ways of drawing together data for collections and acquisitions librarians to use, or a model for acquisition that may actually prove to be a good one for many libraries. In short, what I see in these products are individual pieces of evidence that may be useful to aid with decisions, but certainly will not be a complete answer.

What we should all consider is the appropriation of evidence based terminology. This appropriation probably means that the EBLIP movement has become sufficiently recognized as integral to librarianship that vendors now use the discourse of the movement to sell their products to librarians. Referring to a product as evidence based lends it credibility. If accepted as evidence based, the product’s profile is raised in comparison to other products, which may then be regarded as not evidence based, even though they may be just as evidence based as the products marketed as such. We have allowed this use of the term to be applied far too easily, without question.

EBLIP as a way of approaching practice is far more complex than what these products can offer. If they hold some piece of information that helps you with the process, great! But don’t think your job ends there. Just like all products, the types of products I’ve described above need to be assessed and tested. To state the obvious, do not rely on the evidence based terminology used by the vendor. If a product does something that makes your work easier, then by all means use it. But no product will be a magic solution. Above all, let’s test these products and determine how evidence based they actually are. How much will they help us advance the goals and mission of our library? Let’s make sure they live up to what they say they offer, and place whatever they do offer in the larger context of overall evidence based decision making within collections.

Let’s not rely on vendors to tell us what is evidence based – let’s figure it out ourselves. We need to do more testing, critically examine all these products, and document and share what we learn with one another. Here are a couple of examples that may help you with your own examination:
• Buying by the bucket: A comparative study of e-book acquisitions strategies.
• Evidence based acquisitions: Does the evidence support this hybrid model?

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Action Research: Keeping Research Real and Relevant to Practice

by Karim Tharani
Library Systems & Information Technology, University of Saskatchewan

I recently took an online course on Research Methods as part of my postgraduate studies in educational technology. I was delighted to discover the title of the required text for the course: Research in Education: Evidence-Based Inquiry! At that moment I remember thinking that I might finally be able to make some headway in making evidence-based inquiry real and relevant to my profession as an academic librarian. Allow me to elaborate. As a member of C-EBLIP, I have been on an ongoing quest to internalize the notion of evidence-based research and practice, to the extent that when asked, I should be able to explain it unhesitatingly based on my own experience, and not just by repeating someone else’s definition. Having now been a member of C-EBLIP for a couple of years, I feel a bit embarrassed to still be on this quest, but at moments like these I typically seek comfort in aspirational and motivational sayings such as: “It is impossible to live without failing at something, unless you live so cautiously that you might as well not have lived at all, in which case you have failed by default” (J.K. Rowling).

Education and librarianship are both practice-based professions where the tradition of research informing practice is relatively new. In my opinion, this dual responsibility of being practitioner-researchers makes us more open to finding new ways of researching and to transcending the traditional, binary world of qualitative and quantitative research. And as I learned in this course, the notion of evidence-based inquiry is a fundamental enabler for practitioner-researchers to continue to inform their practice through innovative research approaches:

The term evidence-based does not refer to ritualization and using narrow forms of investigation, nor does it necessarily refer to following formal procedures. A study is evidence-based when investigators have anticipated the traditional questions that are pertinent and instituted techniques to avoid bias at each step of data collection and reasoning. (McMillan & Schumacher, 2014)

It was not until I learned about the concept of action research in this course that I truly started connecting the dots and questioning why and how I do research. I also realized that my initial understanding of research (as an intellectual exercise that is undertaken to identify, investigate, and analyze issues that matter to as many people as possible) was very narrow. With action research, it seems possible for me to integrate my research and practice as a practitioner-researcher within the profession of academic librarianship.

[A]ction research is a participatory, democratic process concerned with developing practical knowing in the pursuit of worthwhile human purposes, grounded in a participatory worldview which we believe is emerging at this historical moment. It seeks to bring together action and reflection, theory and practice, in participation with others, in the pursuit of practical solutions to issues of pressing concern to people, and more generally the flourishing of individual persons and their communities. (Reason & Bradbury, 2001)

Personally, the notion of action research came as a great relief to me as a practitioner-researcher. Since I am more comfortable working on projects to resolve practical issues, I find the notion of action research to be more compatible with the principle of evidence-based inquiry. The action research paradigm also embraces local context as a perfectly valid and acceptable setting for research, as long as the underlying research design is sound. In other words, I can focus on the impact of my research on a single community, organization, or department without the burden of justifying my research’s applicability or extensibility to broader or more generic contexts. And last but not least, I find that action research welcomes research collaborations and partnerships, which I greatly appreciate.

While my quest may not be over yet, this course has helped me internalize the notion of evidence-based inquiry. I remain curious about how others have come to apply the notion of evidence-based inquiry in their research and practice. This is where you (the readers of this blog) come in. Yes, you! The Brain-Work blog is a way for us to learn from each other, so please drop a line or two and share your thoughts on this or other ideas with your fellow practitioner-researchers. 🙂

References
Rowling, J. K. (n.d.). In Famous quotes and quotations at BrainyQuote. Retrieved from http://www.brainyquote.com/quotes/authors/j/j_k_rowling.html

McMillan, J. H., & Schumacher, S. (2014). Research in education: Evidence-based inquiry. Boston: Pearson Higher Ed.

Reason, P., & Bradbury, H. (Eds.). (2001). Handbook of action research: Participative inquiry and practice. London: Sage Publications.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Assessment and evidence based library and information practice

by Lorie Kloda
Assessment Librarian, McGill University

I have held the position of Assessment Librarian for almost three years, and have been involved in the evidence-based library and information practice (EBLIP) movement for over a decade. Since taking on this position, I have been trying to make sense of EBLIP in my job – trying to understand how assessment and EBLIP complement each other, overlap, or even contradict one another.

In a 2006 article, “EBL and Library Assessment: Two Solitudes?”, Pam Ryan, then the Assessment Librarian at the University of Alberta, asked of assessment and EBLIP: “Are these separate movements within librarianship forming theoretical bridges? Is some sort of merger, fusion, or takeover in the future?” It’s almost 10 years later, and this question still remains unanswered. Part of the answer, I think, lies in the way assessment and EBLIP relate to one another, not just on a theoretical level but on a practical one.

In my work, I see assessment as having two (not mutually exclusive) goals: one, to inform decision making for quality improvement in order to anticipate and meet users’ needs, and two, to demonstrate impact or value. There are, however, some occasions (OK, there are a lot of occasions) when one cannot conduct assessment. Hurdles to assessment include a lack of time, data, resources, experience, and skills. In cases where one cannot conduct assessment, whatever the reason, one can make use of evidence: credible, transferable findings from published research that can inform decision making.

One of the roles of an assessment librarian, or really, any librarian working in assessment and evaluation, is to foster a culture of assessment in the organization in which they work. According to Lakos and Phipps,

“A culture of assessment is an organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes for customers and stakeholders.”

I understand the above quote to mean that librarians need research, analysis of local data, and facts in order to plan and make decisions to best serve library users. A culture of assessment, then, is also one that is evidence-based. I find this idea encouraging and I plan to spend some time thinking more about how the steps in EBLIP and assessment overlap. While I think the realm of library and information practice is still far from a takeover or merger when it comes to assessment and EBLIP, I think the two will continue to mingle and hopefully foster a culture which leads to increasingly improved services.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

EBLIP + IL = SoTL

by Margy MacMillan
Mount Royal University Library

I practiced SoTL for at least 5 years in blissful ignorance of its existence. You too may be a SoTList or have SoTList leanings and not even know it; it may well be time to explore the Scholarship of Teaching and Learning.

The research projects I’m currently involved in occur at the intersection of information literacy (IL) and SoTL, and like all intersections, it’s an exciting, slightly unsettling place to be. There’s a lot of movement in many directions, a lot of choices about where to go next, and some things just have to wait until there’s a break in the traffic. Standing at this intersection, I’ve had some time to think about the links between SoTL and evidence based library and information practice (EBLIP)…
[Photo: Hanoi, 2013. By D. MacMillan]

EBLIP and SoTL

SoTL might be described as evidence-based practice in teaching. It is focused, like EBLIP, on gathering evidence to understand different situations and/or the impact of different interventions. It uses a range of methodologies and works both within and across discipline boundaries. While it is most obviously akin to evidence-based research in IL, branches of SoTL concerned with technology or institutional cultures may resonate with other library researchers. Much like EBLIP conferences, where those who work with bioinformatics data discover common ground with public librarians working on citizen science initiatives, SoTL fosters conversations between people who might not otherwise meet. Academics working in SoTL don’t always get much support for their research at their own institutions (sound familiar?) or within their own disciplines, and they value conferences both for finding kindred spirits and for the interdisciplinarity that brings fresh ideas and approaches. Since arriving in this welcoming SoTLsphere, I have enjoyed exploring further – attending conferences, getting involved in SoTL on my campus, and currently supporting the SoTL work of colleagues through Mount Royal’s Institute for SoTL.

3 ways SoTL has helped me EBLIP

Methodologies – SoTL work rests on applying disciplinary research methods to understanding teaching and learning. I’ve encountered a really broad range of methods in SoTL work that also apply to EBLIP.

Understanding Threshold Concepts (TCs) – While I first heard of TCs at a library conference, this way of looking at learning is a major focus in SoTL, and I have been able to bring knowledge from SoTL folks into discussions around the new TC-informed Framework for IL.

Focus on building a community – Some SoTLers are involved with building communities on campuses by expanding relationships, providing support, and developing policy. There are many useful insights here for library initiatives and I have benefited from becoming part of a very supportive, cross disciplinary group of scholars.

3 ways EBLIP has helped me SoTL

Better understanding of diverse literatures and how to search them – This has helped me enter a new field, but also allows me to contribute back to the SoTL community on campus as I am aware of resources and tools for searching outside their disciplines.

Longer experience with evaluating the usefulness of small steps and interventions – IL is often assessed at micro levels: the use of a particular tool, or the effectiveness of a teaching strategy, often within a single class. We have developed a number of strategies for examining teaching and learning at this atomized level, which are useful for instructors accustomed to thinking in course-sized chunks.

Understanding how dissemination works – Work like Cara Bradley’s is informing my work with SoTLers in identifying venues for publication, as well as my next project, a study of dissemination patterns in SoTL.

Interest in SoTL among librarians is growing, as evidenced by increasing numbers at conferences and by a colleague in the UK who is writing a book about SoTL and librarians (many thanks to Emma Coonan for a great conversation that clarified many of these thoughts; and if you aren’t reading her blog, The Mongoose Librarian, on a regular basis… well, you should be!). Explore a little, dip into the literature, maybe go to a conference, or talk to the teaching and learning folks on your campus: they can use our help, and we might be able to borrow a few things from them. Maybe we’re overdue for a change.

3 good reads about SoTL

Felten, P. (2013). Principles of good practice in SoTL. Teaching and Learning Inquiry: The ISSOTL Journal, 1(1), 121-125. http://muse.jhu.edu/journals/teaching_and_learning_inquiry__the_issotl_journal/v001/1.1.felten.html

Huber, M. T., & Morreale, S. P. (Eds.). (2002). Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Menlo Park, CA: Carnegie Foundation.

Hutchings, P. (2010). The scholarship of teaching and learning: From idea to integration. New Directions for Teaching and Learning, 2010(123), 63-72. http://fresnostate.edu/academics/csalt/documents/Hutchings2010.pdf

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reflecting on Our Biases

by Denise Koufogiannakis
University of Alberta Libraries

Reflection is an important part of evidence based practice. The principle has been embraced by those who aim to practice in an evidence-based manner because being an evidence based practitioner is not just about the evidence itself, but about the process of how and why we use that evidence. To date, reflection has generally been inserted into the evidence based process towards the end of the cycle, prompting one to look reflectively back on what was done in order to reach a decision. One reflects on such questions as: what evidence did I find and use; what evidence was lacking; what happened during the decision making process; did the chosen implementation work; what did I personally learn; what would I change next time? Reflection has also largely been considered an individual and private act that a professional undertakes for self-improvement.

I’d like to propose that we begin reflection earlier in the process, specifically as it pertains to the biases we have in relation to a particular question or problem at hand. And, since library decisions are frequently made in groups, we should make reflection on our biases a shared act with colleagues who are also engaged in finding a solution to the problem. As we strive to incorporate evidence into our decision making, it is important to be aware of the biases that we all bring to finding, interpreting, weighing, and using evidence. We work in organizations, large or small, with others, and we all have different perspectives, motivations, and desires. Decision making as part of a group is not easy! We need to be conscious of how the biases of each group member and the collective dynamic might influence the process. Through reflection and openness, we may be able to limit our biases and therefore make better decisions.

In practical terms, this means being upfront with our colleagues: when a group has been tasked with making a decision or putting forward recommendations on a new initiative or a review of an existing area, we should have conversations about our biases from the very start. This requires that each person reflect on how they are considering and approaching the problem or question, what their initial reaction was, what they hope the outcome will be, and any other preconceived notions they have related to the issue. It also means that, collectively, the group discusses and acknowledges the various biases and consciously moves forward with the intent to address them so that they do not adversely affect the final decision. Doing this may be a bit risky for each individual, but it creates a climate in which trust can be built, and the group can proceed with an open and transparent approach to its decision making. It means that, in all likelihood, more sources of evidence will be sought and considered, potential solutions will not be dismissed out of hand, and a sound approach will be chosen.

Here are some common biases that, if we are not aware of them, may adversely affect our decision making:
• overconfidence bias – when people think they know more than they actually do
• confirmation bias – when people gather information selectively in order to confirm what they already think
• framing bias – when people make different decisions depending upon how information is presented
• representative bias – when people rely on stereotypes and predict outcomes based on past situations
• anchoring bias – when people rely too heavily on one piece of information
(Robbins, 2005; Greenberg and Baron, 2008)

For more on biases within the workplace, I recommend a brief overview by Rykrsmith (2013), who provides a list of 5 biases in decision making based on the research of Lovallo and Sibony (2010). While taken from business, the advice applies soundly to decision making within libraries and provides ways for us to spot these biases and overcome them.

Recognizing your own biases or those within your group is important. Here are some questions to ask yourself and your group, in order to identify possible biases and discuss them.
• What is my natural inclination with respect to this problem? Do I already think I know the answer for what is best?
• Am I picking and choosing evidence that only suits my predetermined notion?
• If I have passionate feelings about this topic, why is that? Is there an important ethical or professional principle that needs to be considered within the decision?
• Are there people with opposing views whom I find it difficult to discuss the problem with, and is that clouding my judgement?
• Am I reacting out of my own motivations or desires? Is a potential change going to impact me personally, and am I therefore afraid of it?
• Am I easily influenced by one particular piece of evidence? Why might that be? Why did that piece of evidence impress me?
• Do I stand to gain or lose based on the outcome of this decision? Is this potential change influencing me?
• Have I gathered the types of evidence that would help, or just what was easy? Have all possibilities been considered? Have all perspectives been represented?
• Is the evidence sound or just based on anecdote and sentiment?

Once a bias has been brought to light, it is much easier to deal with, and the group can proceed with a higher level of awareness. Such reflection is sure to bring us closer to better decision making.

References

Greenberg, J., & Baron, R. A. (2008). Behavior in organizations (9th ed.). Upper Saddle River, NJ: Pearson Prentice Hall.

Lovallo, D., & Sibony, O. (2010). The case for behavioral strategy. McKinsey Quarterly. Retrieved February 15, 2015, from http://www.i3-invest.com/uploads/pdf_file/c850526fa6f572915c3645199db25297.pdf

Robbins, S. P. (2005). Essentials of organisational behavior (8th ed.). Upper Saddle River, NJ: Pearson Education, Inc.

Rykrsmith, E. (2013). 5 biases in decision making – Part 2. The Fast Track Blog. Retrieved February 15, 2015, from http://quickbase.intuit.com/blog/2013/06/07/5-biases-in-decision-making-part-2/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

I can’t make bricks without clay: A Sherlock Holmesian approach to Research and EBLIP

by Marjorie Mitchell
Librarian, Learning and Research Services
UBC Okanagan Library

Sherlock Holmes was invoked in the inaugural post of Brain-Work, the C-EBLIP blog, and I would like to return to Conan Doyle for inspiration in this post. So, being a curious librarian, I searched “Sherlock Holmes and research” and came across the heading “What user researchers can learn from Sherlock Holmes.” The author, Dr. Philip Hodgson, took quotes from a variety of the Sherlock Holmes novels and laid out a five-step research process for investigators working in the user experience field. As I read Dr. Hodgson’s article, it struck me that there is a strong kinship between the user experience community and the library community, one that extends beyond the electronic world. I also believe Hodgson’s steps provide a reasonable starting point for novice evidence-based library and information practice (EBLIP) researchers to follow.

Step 1 – According to Hodgson, the first step is understanding the problem, or formulating the question. I would adapt it even further and suggest that being curious is the very first step. If we’re not curious, we can’t identify what we want to know, or the question we hope to answer. Perhaps a mindset of being open to the discomfort of not knowing is what motivates researchers to embark on the adventure of inquiry. Once my curiosity has been aroused, I move to formulating a question. My question often remains somewhat fluid as I begin, because at the outset I sometimes don’t have enough information to formulate an answerable one.

Step 2 – Collecting the facts, or as I prefer to call it, gathering the evidence, follows. This is one of the juicy, tingly, exciting parts of research. Once I have a question, I think about what information will answer it. Sometimes simply reading the literature gives me enough of an answer. At other times, I have to go further. Even just thinking about methods can send a shiver of excitement through me. Administering surveys, conducting interviews, or running reports from the ILS in the hope that they will illuminate some arcane or novel library user behavior are all ways of collecting juicy evidence; it is exciting to see initial results come in and begin to decipher what they are actually saying. Sometimes the results are too skimpy or inconclusive, and the evidence-gathering net needs to be cast again in a different spot for better results.

Step 3 – Hodgson suggests the next step should be developing a hypothesis to explain the facts you have gathered. This step, as much as or more than the others, requires our brain-work. Here we bring our prior knowledge to bear on the results and how they relate to the question. It is a time for acute critical thinking as we take the results of our evidence gathering and determine their meaning(s). Several possible meanings may arise at this stage. Hodgson implies it is important to remain open to these multiple meanings and to work to understand the evidence gathered in preparation for the next step.

Step 4 – In this step, Hodgson is especially Holmesian. He suggests it is now time to eliminate the weaker hypotheses in order to come closer to a solution. The focus on user experience research is especially strong here: specific, actionable solutions are being sought to the question identified in the first step. He recommends evaluating your evidence and eliminating the weaker evidence in favor of the stronger. He is also cognizant of the need for solutions that are affordable and can be implemented in a given situation. While the whole of this step may not apply to all research, much of it will.

Step 5 – Implementation or action now has its turn. Again, Hodgson is speaking directly to the user experience audience here. However, implementation or action based on research may also lead to a decision not to implement or act upon a suggestion. The strength lies in the process of reaching that decision: questions were asked; evidence was gathered; analysis took place; judgment was applied. As Hodgson points out, this is a much better process than proceeding by intuition.

Finally, I would like to add a Step 6 to Hodgson’s list. In order to really know whether the action implemented had the desired effect, or an unintended one, it is important to evaluate the results of the action or change. In the push to publish research results, timeliness is an issue, and researchers rarely have the luxury of the time it would take to measure an effect. Even in those cases, I am interested in what type of evaluation might take place at a later date. Sometimes researchers address their future evaluation plans; sometimes they don’t. Even if those plans aren’t being shared, I hope they are being considered.

This is a simple and elegant plan for research. In its simplicity, it glosses over many of the messy complications that arise when conducting research. That said, I hope this post encourages librarians to follow their curiosity down the path of research.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Open Data and EBLIP – How open are we?

by Pam Ryan
Director, Collections & Technology at Edmonton Public Library

When we talk about evidence based library and information practice (EBLIP), we’re most often talking about research – conducting our own or finding and integrating research and best-available evidence into our practice. Part of this continuum should also include working towards making our own library service and operations data openly available for analysis and re-use.

This isn’t out of line with current library initiatives. Academic libraries have long supported the open access movement, and for many, services around managing institutional research data are a current priority. Influenced by open government developments in their municipalities, public libraries are increasingly working to build open data literacy through programming, encouraging citizens to think critically about government services and to learn how to unlock the value of open data.

Why does open data matter for libraries? It aligns with our core values of access to information, sharing, openness, transparency, accountability, and stewardship. It supports our missions to foster information and data literacy, it can provide others with information to help us in our advocacy and value-of-libraries initiatives, and, maybe most importantly, it can fuel research and initiatives we ourselves haven’t yet thought of.
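As a concrete, minimal sketch of what making service data openly available can involve, the snippet below (Python, with entirely hypothetical records and column names, not EPL’s actual data) aggregates raw circulation transactions into de-identified counts before writing them out as an open CSV file:

    import csv
    from collections import Counter

    # Hypothetical raw transactions: (branch, item_class, patron_id).
    transactions = [
        ("Downtown", "adult-fiction", "p001"),
        ("Downtown", "adult-fiction", "p002"),
        ("Westside", "juvenile", "p003"),
        ("Westside", "juvenile", "p003"),
    ]

    # Aggregate away the patron identifier before release: counts by
    # branch and item class carry no personal information.
    counts = Counter((branch, cls) for branch, cls, _ in transactions)

    with open("open_circulation_counts.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["branch", "item_class", "checkouts"])
        for (branch, cls), n in sorted(counts.items()):
            writer.writerow([branch, cls, n])

Real releases take more care than this (very small counts can still re-identify individuals), but the shape of the work is the same: aggregate, document, publish.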

My own place of work has a current business plan goal: “Develop an open data policy that includes how we will use and share our own data; participate in Edmonton’s Open Data community and support data literacy initiatives.” We’ve begun to make progress in these areas by developing a statement on open data and collaborating with the City of Edmonton on public programs:

• Edmonton Public Library’s Statement on Open Data:
http://www.epl.ca/opendata

• EPL’s 2014 Open Data Day program:
http://www.epl.ca/odd2014

Has your library started a discussion about what your library’s approach to open data will be?

Further Reading:

Thompson, B. (2014). The open library and its enemies. Insights, 27(3), 229–232. DOI: http://dx.doi.org/10.1629/2048-7754.172

Data is Law. Civic Innovations: The Future is Open. http://civic.io/2014/12/28/data-is-law/

pryan@epl.ca / Twitter: @pamryan

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Oh, the Humanities: A Literature Scholar Turned Librarian Ponders the Art and Science of Librarianship

by Heidi LM Jacobs
Leddy Library, University of Windsor

When librarians with humanities backgrounds have an opportunity to meet and talk, I am struck by how we seem to cling to each other, relieved and grateful to have someone to talk with who understands us. It also strikes me as strange that while there are many, many academic librarians with backgrounds in the humanities, we feel so alone within LIS.

It seems unchallengeable that LIS is a social science: indeed it is, and there is nothing at all wrong with this. Yet I often wonder what would happen if we thought about librarianship as a humanities endeavor, bringing together, as Denise Koufogiannakis writes, “the art and science of our profession”:

“We need to embrace both the science and the art of evidence based practice — otherwise, we will overlook important elements of the whole situation that practitioners work within. Doing so is not neat and tidy, but does that really matter? LIS is a social science, and the “social” implies “messy” because people and real-life situations are not easily controlled. The art of our craft allows us to embrace the messy situation, find ways to be creative, put our professional judgments to use and find the best solutions to meet the needs of individual users by applying the best of what we find in the research literature together with the best of what we know is likely to help this person.” (49)

I’m not at all disagreeing with Denise and the countless others who have called LIS a social science, but I do want to raise this question: what would happen if we called LIS a humanities subject? How would our ideas of evidence change? What kinds of evidence would we use? What kinds of evidence would inform the questions that we ask? Our decisions? Our profession?

In 2013, John Horgan published an interesting blog post on the Scientific American site called “Why Study Humanities? What I Tell Engineering Freshmen.” The post raises ideas that pick up on points I made in my last blog post and helps me think more fully about what the humanities could offer librarianship in terms of both research and practice.

Horgan writes, “We live in a world increasingly dominated by science. And that’s fine. I became a science writer because I think science is the most exciting, dynamic, consequential part of human culture, and I wanted to be a part of that.”

“But,” he goes on to argue, “it is precisely because science is so powerful that we need the humanities now more than ever. In your science, mathematics and engineering classes, you’re given facts, answers, knowledge, truth. Your professors say, ‘This is how things are.’ They give you certainty. The humanities, at least the way I teach them, give you uncertainty, doubt, skepticism.”

“The humanities,” Horgan tells his students, “are subversive. They undermine the claims of all authorities, whether political, religious or scientific. . . The humanities are more about questions than answers, and we’re going to wrestle with some ridiculously big questions in this class.”

I’m particularly drawn to Horgan telling his students that they are going to “wrestle with some ridiculously big questions.” It seems to me that if we focus our research and our inquiry into things that we can count or quantify or that we can collect a particular kind of evidence about, then we’re scaling back the questions we can ask about our practice and our profession.

Literature students are trained to ask questions, look for fissures in logic, and notice how meaning can hang on a single word or the placement of a comma. We look for the silences and gaps that reveal significant things through omission. We are taught to question continuously. We are taught that never arriving at an answer is perfectly fine, as the journey of asking questions is not only valid, it’s vital. In short, we are taught that things you cannot count do count.

As an information literacy librarian, I continually bring questions to my students: where is this information coming from? Who wrote it? Who is presenting or sponsoring this information? Who benefits from having this particular kind of information published? What kinds of information aren’t we finding, and why might that be? While we cannot always answer these questions definitively, or without a qualifier like “it depends,” it is imperative that we ask them of our students and with them.

I want our profession to embody what we teach our students about information: doubt it, question it, be skeptical, be critical. It is not that I am against quantifiable evidence, but I want to ensure we are using it carefully and critically, not complacently, and that we don’t exclude other kinds of evidence that, though they cannot be counted, still count.

Horgan concludes his essay by arguing that the point of the humanities is that they “keep us from being trapped by our own desire for certainty.” Librarianship is at a point where we are bombarded by “some ridiculously big questions,” and if we limit our inquiries to what we can answer, prove, quantify, and chart, we’re doing ourselves and our profession a disservice.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Counting What You Cannot Count Or, A Literature Scholar Turned Librarian Ponders the Meaning of Evidence

by Heidi LM Jacobs
Leddy Library, University of Windsor

The first sentence I wrote for this blog post was this: “Perhaps this isn’t the best way to start a blog post for C-EBLIP but I have a confession to make: I am deeply suspicious of evidence.” The more I wrote, the more I realized that everyone is (or should be) deeply suspicious of evidence. We need to think about it carefully and critically and ask the difficult questions of it. The more I wrote about my distrust of evidence, the more I realized that it wasn’t evidence per se that gave me pause, but how evidence is generally defined and approached in LIS.

I think what I really meant when I said “I am deeply suspicious of evidence” is that I am deeply suspicious of numbers and the concept of “indisputable facts.” There are always, I think, more sides to the story than numbers can convey and when I see charts, figures, and graphs, I am always looking for the other parts of the story, the absent narratives, the “random dots,” the findings off the grid. What are the things these numbers aren’t telling us?

My feelings toward evidence could easily be explained by the fact that I am a literature scholar turned information literacy librarian. In both fields, we are trained to look at “evidence” in particular ways. My next blog post will consider what humanities-based modes of thinking could contribute to evidence based library and information practice (EBLIP), but in this post I’d like to pick up on something that Denise Koufogiannakis (2013) raised in her EBLIP7 keynote. She observed that “EBLIP’s focus to date has been on research evidence and how to read and understand research better. This is a good thing (I certainly do not want to diminish the importance of the work that has been done in this respect)–but it is not the only thing–and we must begin to explore other types of evidence” (9).

Like Denise, I do not want to diminish the importance of the work being done in EBLIP and my questions about EBLIP are not to challenge or undermine the work done in this area. Rather, in this blog post, I want to respond to what I read as an invitation from Denise to “explore other types of evidence.” What other evidence types might there be? And what might these other types of evidence contribute to EBLIP, to research, and to librarianship? In this and my next blog post, I will be pondering some of these issues and invite others to join me in thinking through these ideas.

As I read in the field of EBLIP, I often wonder where librarians like me, with ties to the humanities, might fit in with evidence-based work. But as I write this, I pause, because it’s not that my research, practice, teaching, and thinking aren’t informed by evidence; it’s just that the kind of evidence I summon, trust, and use does not translate easily into what usually constitutes “evidence,” formally or informally. Perhaps I am the kind of librarian whom Denise (2012) describes here: “The current model may be alienating some librarians who feel that the forms of evidence they are using are not being recognized as important” (6). One cannot quantify theoretical thoughts or chart reflective practice; some researchers might view this kind of evidence as soft at best, inadmissible at worst.

For a while I thought it might make sense, as the song says, for me and EBLIP to call the whole thing off. However, the more I think about EBLIP the more I think there might be something worth considering. I realize that it’s not that I distrust evidence: I am skeptical of a certain kind of evidence. Or, phrased another way, I trust a different kind of evidence, a kind of evidence I don’t often see reflected in EBLIP.

My argument is not that EBLIP needs to change so that those of us with humanities backgrounds and humanities ways of thinking feel personally included or intellectually welcome in EBLIP endeavours. My argument is, instead, that humanities ways of thinking about evidence could offer EBLIP perspectives and approaches that might take us in new directions in our thinking, research, scholarship, and practice.

Those of us working with students and information literacy know that we can attempt to understand their experiences quantitatively or explore their thoughts and habits qualitatively. But, to my mind, these kinds of studies are only part of the information literacy story: findings need to be contextualized, studies problematized, evidence questioned, “random dots” explored, premises and practices reflected upon and theorized. When we consider students, or any segment of our user communities, in library scholarship, we need to remember that we are studying complex, changeable people who cannot be reduced to charts and graphs. Any “answers” we find may raise more questions, and arguably they should.

EBLIP’s suitability for helping us make decisions has been well-explored and theorized. I’m wondering if we could also use evidence—writ large—to help us ask new questions of our practices, our selves, our research and our profession.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

What’s All This About Evidence? C-EBLIP Journal Club, June 9, 2014

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

On June 9 over lunch, the C-EBLIP Journal Club met for the first time, with 9 librarians in attendance. The plan is to meet every 6 weeks or so, and dates have been set until May 2015 to get us going. The club will run with rotating convenors; the convenor’s role is to choose an article to discuss, book the meeting space, send out meeting announcements and reminder emails, lead the journal club discussion, and write up a discussion summary for the blog. I convened the first meeting, and here’s how it went.

Article:
Koufogiannakis, D. (2012). Academic librarians’ conception and use of evidence sources in practice. Evidence Based Library and Information Practice, 7(4): 5-24. https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/18072

1. Brief summary of article

This is a qualitative study using grounded theory methodology that addresses these research questions: What forms of evidence do academic librarians use when making professional decisions? Why do they use these types of evidence? How do academic librarians incorporate research into their professional decision making? Koufogiannakis recruited 19 academic librarian participants using purposeful sampling (in other words, she looked for a particular population: librarians who “had some interest in exploring the use of evidence in relation to their professional decision-making” (p. 8)). These librarians kept a private blog diary for a month, noting questions or problems related to their professional practice and how they came to resolve them. She followed up with semi-structured phone interviews, which enabled her to clarify and more deeply analyze specific aspects participants noted in their diaries. The study identifies two categories of evidence, hard and soft, and the librarians in the study used both kinds to inform decision making. They often didn’t think of the soft types of evidence as real evidence, but were using them in conjunction with harder forms, such as the research literature, to inform their practice. The piece concludes with the idea that EBLIP needs to expand to include a wide variety of evidence types to “bring the EBLIP model closer to one that has truly considered the needs of librarians” (p. 21).

2. Discussion Summary
In retrospect, it is difficult to fully participate in a journal club discussion while also taking notes for a discussion summary after the fact, so the following discussion points are highly contextual. The journal club discussion was informal and conversational, allowing us to take different roads.

• The discussion began with general impressions of the article and these impressions included praise for the clarity of the writing and an appreciation for the articulation and demonstration of material that at first glance seemed obvious but upon deeper reading revealed itself to be thought-provoking and meaningful.
• There was some discussion about the sample of librarians who participated in the study – how was that sample generated, were the librarians similar to the researcher, what would be the implications of that?
• The article takes a broad view of what is considered evidence. A comment was made that we are still so uncertain and that the article highlights an ongoing struggle. Other people may not accept the soft evidence. As librarians we can evaluate sources and all have the potential to have something to offer. Where it can get complex for us (as librarians) is the fact that we serve other research communities. We sometimes inherit those value systems through that work.
• There was a comment about the reflective nature of the research and someone commented that the humanities are doing the same kind of introspective things.
• The discussion moved to evidence based practice in health/medicine. In many models of EBP, the focus seems to be only on the research and not the other two aspects (professional knowledge/expertise; user/client/patient preference).
• Why aren’t librarians going to their associations for information/evidence, i.e. ACRL – tools, checklists, etc.?
• Is soft evidence used in the absence of hard evidence? After this question arose, there was a discussion around the designations of the terms “hard” and “soft.” Is the designation imposing some kind of bias? These are not neutral terms. Why have we as a culture or a society determined a hierarchy for these terms? What other terms might have been used to classify these two types of evidence? We couldn’t come up with anything that wasn’t hierarchical to some extent, and we put that down to the idea that we are organizers by nature.
• If the definition of evidence is expanding, does it not just dilute everything? i.e. describing our situation versus prescribing a direction
• Tacit knowledge: gut reactions help to pose the questions which ideally we go on to explore. The impetus to “prove it” helps to solidify thoughts. We often don’t believe our own knowledge is good or that it’s appropriate proof.
• We need to get away from the idea that right and wrong are forever. Need a new standard that allows for new knowledge and flexibility.

This was a great article with which to kick off the C-EBLIP Journal Club. Not only did we discuss the particulars of the research reported in the article, but it also acted as a catalyst for a professional discussion, one we may well not have had without it, about how we practice as librarians and what it means to practice in an evidence-based way.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.