Organizational Innovation: C-EBLIP Journal Club, August 20, 2015

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

The first C-EBLIP journal club meeting of the 2015/16 academic year took place on Thursday, August 20, 2015. Despite many librarians being in full-on holiday mode at the time, six participants discussed the following article:

Jantz, R.C. (2015). The determinants of organizational innovation: An interpretation and implications for research libraries. College & Research Libraries, 76(4), 512-536. http://crl.acrl.org/content/76/4.toc

I chose this article with the idea that innovation was a light and exciting topic and that the article would be perfect for an August journal club meeting. While the article did have some redeeming qualities, the club meeting had a bit of a rocky start: this article is long and slightly unwieldy! Jantz's study focused on innovation as a specific type of change, and he identified five factors that had a significant impact on how well libraries innovate. The research method consisted of a survey distributed to 50 member libraries of the Association of Research Libraries.

The discussion opened with some problems around the methodology of the research. The details of how the research was conducted were sketchy: there was no word on how Jantz coded the data and no detailed discussion of collection or analysis methods. The survey instrument was not included in the paper, nor were the raw survey results, which, one journal club member commented, would have been the interesting part! It would have been helpful to see the instrument, as club members wondered whether the author defined the complex terms he used. How can we be sure that all the respondents were on the same page? The author also did not list the specific institutions surveyed, which made us wonder whether the sample was skewed toward the sciences. Would liberal arts universities/libraries have R&D departments?

Club members found it problematic that the surveys were administered only to senior leadership teams, as views could be quite different down the organizational hierarchy. And since the responses were said to have come from senior leadership teams, members were interested in how this might have happened logistically. Did the teams get together to fill out the surveys? Did each team member fill out a survey, with the results then collated? This is an example of insufficient detail around the methods employed for this research, and the problems could have been alleviated with more attention to detail. It was a good takeaway for my own research: be detail oriented, especially when it comes to methodology! If research is to be useful, it has to be seen as valid and reliable, and that won't happen if the reader is questioning the methods.

Another issue with the paper was that in several instances the author stated things as established facts without citing them. Perhaps the author was making assumptions, but as we are attuned to the idea of providing evidence for claims, we weren't buying it. Another takeaway: in terms of evidence, more is more! Or at the very least, some is vastly better than none. As well, in the conclusion of the paper, the author used the terms vision and mission interchangeably, which particularly irritated one journal club member and was another example of imprecision.

The discussion moved from the particulars of the article to innovation in general with some observations being made:
• Innovation is not dependent on age: people across the age spectrum can be innovative.
• We are overwhelmed by choice. The more choices we have, the more difficult it is to make one. Decision-making is difficult.
• Is there a type of innovation on the other side of radical, i.e. useless? Change for change's sake?
• One participant felt that innovation of all types can be useless. Innovation doesn’t necessarily create more viable choices.
• Libraries are always playing catch-up… it may be innovative to us but not elsewhere (depending, of course, on the library).
• Is there a difference between innovation and library innovation? Between innovation and implementation?

At the very least, C-EBLIP Journal Club members felt that reading about innovation could be valuable when heading into a state of change. And let's face it, we're dealing with change more often than not these days. To conclude, although the article had some methodological gaps, and members felt that the author could have been more selective when transforming his PhD dissertation into an article, it did give us the basis for a fruitful discussion on innovation, and the first meeting of 2015/16 was an hour well spent.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Conducting Research on Instructional Practices: C-EBLIP Journal Club, May 14, 2015

by Tasha Maddison
University Library, University of Saskatchewan

Journal club article:
Dhawan, A., & Chen, C.J.J. (2014). Library instruction for first-year students. Reference Services Review, 42(3), 414-432.

I was sent this article through an active Scopus alert that I have running on the topic of flipped classrooms. I had not received an article that met my particular search criteria in a long while, so I was excited to read this offering. The authors had included flipped classrooms as one of their recommended keywords for the article, yet the term does not make an appearance until page 426, in a section entitled 'thoughts for improving library instruction', and makes up just over a paragraph of content. It was interesting to witness firsthand how an inappropriate keyword exposed me to research that I probably would not have read otherwise, which is both good and bad. I chose this article for the C-EBLIP Journal Club for this reason: I believed it would generate a spirited debate on the use of keywords, and it did. Immediately there was a strong reaction from the membership on how dangerous it can be to use deceptive descriptions and/or keywords in the promotion of your work, as you will likely end up frustrating your audience.

I found the scope of the literature review in this article to be overly ambitious, as it covers librarian/faculty collaboration and best practices for instruction in addition to information on the first-year college experience (p. 415). I wondered if the reader would have been better served with a more specific review of the literature on 'for-credit' first-year library instruction. Another point worth noting is the significant examination of the assessment process throughout the article, including information about the rubric that was used as well as evidence from the ACRL framework and the work of Megan Oakleaf; yet the only quantitative data provided in the case study was briefly summarized on page 423.

The group had a lively discussion on the worth of communicating research on instructional practices in the scholarly literature. Members questioned whether there is value in the 'how we done it good' type of article and the validity of reporting observations and details of your approach without providing assessment findings or quantitative data. I would argue that there is a need for this type of information within library literature. Librarians with teaching as part of their assigned duties require practical information about course content, samples of rubrics, and details of innovative pedagogy, as well as best practices when using a certain methodology, ideally outlining both the successes and the failures. Despite the advantages to the practitioner in the field, we speculated about how such information could be used within evidence based practice, as the findings from these types of articles are typically not generalizable and often suffer from inconsistent use of research methodology.

We wondered if there is a need to create a new category for scholarly output. If so, do these articles need to be peer reviewed, or should they simply be presented as commentary? There is merit in practitioner journals that capture knowledge and advice from individuals in the field, detailing what they do. This type of scholarly output has the potential to validate professional practice and help librarians in these types of positions develop a reputation by publishing the results of integrating innovative teaching practices into their information literacy instruction.

Although this article had little to do with flipped classrooms, I did find a lot of interesting takeaways, including details of student learning services within the library, learning communities on their campus, and the merit of providing mandatory for-credit information literacy courses.

Suggested further reading: http://blogs.lse.ac.uk/impactofsocialsciences/2015/05/13/reincarnating-the-research-article-into-a-living-document/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Altmetrics: What Does It Measure? Is There a Role in Research Assessment? C-EBLIP Journal Club, April 6, 2015

by Li Zhang
Science and Engineering Libraries, University of Saskatchewan

Finally, I had the opportunity to lead the C-EBLIP Journal Club on April 6, 2015! This was originally scheduled for January, but was cancelled due to my injury. The article I chose was:

Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of 'alternative metrics' in scientific publications. Scientometrics, 101(2), 1491-1513.

There are several reasons why I chose this article on altmetrics. First, in the University of Saskatchewan Library, research is part of our assignment of duties, so how to evaluate librarians' research outputs has inevitably been a topic of discussion in the collegium. Citation indicators are probably the most widely used tool for evaluating publications, but with the advancement of technology and new modes of communication, how do we capture the impact of scholarly activity in these alternative venues? Altmetrics seems to be a timely addition to the discussion. Second, altmetrics is an area in which I am interested in developing my expertise. My research interests encompass bibliometrics and its application in research evaluation; it is therefore natural to extend my interests to this emerging field. Third, this paper not only presents detailed information on the methods used in the research but also provides a balanced view of altmetrics, helping us to understand how altmetric analysis is conducted and to be aware of the issues around these new metrics.

We briefly discussed the methodology and main findings of the article. Some of the interesting findings: Mendeley readership was probably the most useful source for altmetrics, while mentions of publications in other types of media (such as Twitter, Delicious, and Wikipedia) were very low; Mendeley readership counts also had a moderate positive correlation with citation counts; and in some fields of the social sciences and humanities, altmetric counts were actually higher than citation counts, suggesting that altmetrics could be a useful tool, alongside citation indicators, for capturing the impact of scholarly publications in these fields.
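For readers curious about how a finding like "a moderate positive correlation" is computed, here is a minimal sketch in Python. This is not the authors' actual pipeline, and the counts below are invented for illustration only.

```python
# A minimal sketch, not the authors' actual pipeline: correlating
# Mendeley readership counts with citation counts for a set of papers.
# The counts below are invented for illustration.
from scipy.stats import spearmanr

citation_counts = [12, 5, 0, 33, 8, 2, 19, 1]
mendeley_readers = [40, 10, 1, 60, 15, 3, 25, 0]

# Spearman's rank correlation is a common choice for skewed count data.
rho, p = spearmanr(citation_counts, mendeley_readers)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```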

Later in the session, we discussed a couple of issues related to altmetrics. Although measuring the impact of scholarly publications through alternative sources has gained notice, it is not yet clear why publications are mentioned in these sources. What kind of impact do altmetrics measure? With traditional citation indicators, at least we know that the cited articles stimulated or informed the current research in some way (whether positively or negatively). In contrast, a paper appearing in Mendeley does not necessarily mean it has been read. Similarly, a paper mentioned on Twitter could be simply self-promotion (there is nothing wrong with that!). From here, we extended our discussion to publishing behaviours and promotion strategies. Are social scientists more likely than natural scientists to use social media to promote their research and publications? The award criteria and merit systems in academia will also play a role: if altmetrics are counted as an indication of the quality of publications, we may see a sudden surge in social media use by researchers. Further, it is much easier to manipulate altmetrics than citation metrics. Care needs to be taken before we can confidently use altmetrics as a reliable tool to measure scholarly activity.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reflections from the C-EBLIP Journal Club, Feb 23, 2015

by Carolyn Doi
Education and Music Library, University of Saskatchewan

For this iteration of the C-EBLIP Journal Club, I decided to feature an article from outside the LIS literature that deals with the topic of reflection, creative processes and digital technologies in the classroom:

Kirk, Carole, and Jonathan Pitches. “Digital Reflection: Using Digital Technologies to Enhance and Embed Creative Processes.” Technology, Pedagogy and Education 22, no. 2 (July 1, 2013): 213–30. http://dx.doi.org/10.1080/1475939X.2013.768390

This paper caught my attention for several reasons. The discussion of creative processes and the incorporation of technology in the classroom is particularly interesting to me, and these are topics that often come up when I discuss teaching strategies with other librarians. I was also looking forward to exploring the idea of reflection, both in the classroom and as part of the research process. This is something we have discussed at our institution, in particular through our own Library Leadership Development Program.

The authors of this paper are both scholars at the School of Performance and Cultural Industries at the University of Leeds, and they share results from a teaching and learning project called Digitalis (http://www.digitalis.leeds.ac.uk/), which “investigates ways in which digital technologies can be used by teaching staff to facilitate reflection on creative practices within performing and creative arts disciplines” (p. 213). The study used an action research methodology led by members of a cooperative inquiry group who incorporated reflection and digital technologies into their own teaching practice. They took this a step further by also incorporating reflection as one of the four project phases (planning, action, collection, and reflection).

The study featured modules in five areas of study: performance design, dance, music, theatre & performance, and museum studies. In each module, students were asked to reflect on their learning and experience, assisted by different types of digital technology. In one example, students in a compulsory second-year Dance Choreography course were asked to use a flip camera to capture thoughts, ideas, and observations, which were used in combination with written reflection and posted to a private blog. The other modules used varying types of reflective processes. Methods of digital capture included flip cameras, audio recorders, and digital still cameras. Digital reflection mechanisms included blogs (on Blackboard), PowerPoint, and Photo Story 3.

In some cases, the technology may have interfered with the process of critical reflection as some students ended up “concentrating too much on slick production values to the detriment of critical thinking” (p. 224). The paper mentioned that ease of use was an important factor in getting students to feel engaged in the reflection activities. One recommendation that came out of the paper was that digital reflection technologies should be introduced incrementally, as opposed to all at once.

We discussed the value of incorporating technology into the classroom, and also the importance of not letting the technology ‘get in the way’ of the learning process. Some in our group remarked that they were surprised that the incorporation of technology in the classroom might still be a barrier for some students.

The paper reports that students found digital reflection advantageous for ‘looking again’ at material that would otherwise have been lost in the creative practice. The digital capture acted as a way to benchmark their own impressions of the event, and it allowed performers to experience being an audience member at their own performance.

We discussed the benefits of reflection in two veins: 1) for integration into the classroom and 2) for integration into our own practice. Some questioned the viability of incorporating reflection (especially non-written reflection) into library instruction as we are often faced with the challenge of limited classroom time where it would be difficult to follow up with students. Librarians who teach in disciplines outside of the arts felt that they might just not be able to get their students to try a less conventional reflection method such as video capture. The article prompted some to think about video capture as a means to document and reflect on one’s own teaching practice. Others were thinking about incorporating reflection into other aspects of practice or research, or are currently embarking on projects that do incorporate an element of planned reflection.

The journal club is always an engaging gathering and it’s been interesting to see the various opinions and perspectives that emerge out of the group discussions. I look forward to many more discussions around the journal club table in the coming months!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Oh, the Digital Humanities! C-EBLIP Journal Club, November 13, 2014

by Shannon Lucky
Library Systems & Information Technology, University of Saskatchewan

The most recent meeting of the C-EBLIP Journal Club was my chance to select an article and lead our discussion. I was intrigued by our previous conversations about strategic publishing choices, open access journals, the perception of different publishing outlets in our field, and alternatives to traditional scholarly publications (like blogs!), and I wanted to keep that discussion going. An announcement about the recent digital humanities project The Exceptional and the Everyday: 144 hours in Kiev landed in my inbox, and I thought it could ignite a great conversation about alternative research communication options – but it is decidedly not a traditional journal article. Although this goes against the basic organizing principle of a journal club, Virginia encouraged me to take some liberties with the definition of an article, and it led to a lively discussion that ranged from the audience(s) for research, to selfies, to peer review, to the anxiety of wanting to experiment with your research while also wanting to get tenure.

The Exceptional and the Everyday: 144 hours in Kiev comes from Lev Manovich’s Software Studies Initiative and uses “computational and data visualization techniques [to] explore 13,208 Instagram images shared by 6,165 people in the central area of Kyiv during 2014 Ukrainian revolution (February 17 – February 22, 2014).” I will state my bias up front – I love this kind of stuff. This is a very “DH” (digital humanities) project – I wondered how researchers from other disciplines (like libraries) would feel about putting their resources into creating a website where they release everything (findings, data, and research tools) rather than focusing on getting that highly regarded publication accepted first.

A recent theme running through our meetings is standards for promotion and tenure at our institution and how the collegium values research work and output. There are so many different ways we all engage with research and share the things we learn, but the gold standard remains the traditional peer-reviewed, high-impact journal. We had previously talked about the tension between publishing in the most appropriate or interesting place vs. traditional, highly regarded library journals. I wanted to talk about how a project like this breaks the expectation of how research is communicated, and whether this format is effective, persuasive, authoritative, or just a gimmick.

This publishing format is certainly out of the ordinary, but one of its biggest benefits is the ability to get information out FAST. The event this project studied happened less than eight months before the day I got that announcement in my inbox. That kind of production time is basically unheard of in traditional publishing, even for electronic journals. The downside is that there is no time for peer review. Post-publication peer review was mentioned as an option for keeping the timeliness of this publishing cycle while maintaining the important value of peer review. I am very curious what that would look like for this project and how that peer review would be communicated to readers.

Perhaps my favourite comment from our discussion was “This made me feel like the oldest fogey ever”. While a hysterical comment to come from a room full of people who love new research, it nicely described the feeling several of us had trying to read this project like a journal article. As we picked our way through the site, we acknowledged that most of the information we look for in an article was there (except an abstract!), but not having it in the familiar linear format was disorienting. The project checks all of the boxes in terms of citing sources and uses research methods we recognize, but nothing is where you would expect it to be. It is both easy to browse and difficult to skim for information. We need to develop new literacies to become more comfortable with this format, or at least check our assumption that the best way to communicate research findings is the way we do it now.

Although the project proved complicated to read in the way we understand journal articles, this format does have major benefits. This kind of project allows you to publish everything – multiple articles or essays, your dataset(s), huge full-colour graphics, interactive visualizations, the digital tools you used to do the research – and you can update all of it on the fly at no extra cost. This is only good if all of that information is useful (or at least beautiful to look at), but it does offer the opportunity to better understand the methodology and process by revealing multiple aspects of the research, particularly if the research subject exists online.

All of this analysis and comparison to traditional academic publishing kept coming back to the question of who the audience is. Who is this research for? We didn’t come to a consensus on this question. We did wonder what the altmetrics for something like this would look like and what the benefits are in pursuing this publication model. The project didn’t show up in Google Scholar at all, but it did have over 800 hits from a regular Google search (many from social media). In the end we posed the question: What is more valuable for your research, having a paper peer reviewed and read by your academic peers or seen by thousands of people outside your field and likely outside academia? I can’t imagine building an academic career based on web projects (without peer review) at the moment, but who can tell the future? Things are changing all of the time. Besides, I wouldn’t be surprised to see some peer reviewed articles about 144 Hours in Kiev pop up in Google Scholar in 2015.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Are Students Succeeding with a Library Credit Course? C-EBLIP Journal Club, October 6, 2014

by Rachel Sarjeant-Jenkins
Client Services, University Library, University of Saskatchewan

I recently had the opportunity to lead our C-EBLIP Journal Club in a discussion of Jean Marie Cook’s article “A library credit course and student success rates: A longitudinal study” in College & Research Libraries 75, no. 3 (2014) (available at http://crl.acrl.org/content/75/3/272.full.pdf+html). This article had been sitting on my desk for a few months, waiting for that magical moment when I found the time to read it thoroughly. Then came my turn to host journal club. What a perfect opportunity to finally delve into Cook’s article! And it couldn’t have come at a better time, in light of our library’s focus on developing a programmatic approach to library instruction and the broader teaching and learning environment in which academic libraries currently find themselves.

Following some ‘proper’ journal club discussion about the article’s methodology and findings, Cook’s article proved a wonderful catalyst for a conversation about library instruction at our institution. Initially we were simply envious of Cook’s situation, in which a library-focused course falls within her institution’s priorities. But then the questions started.

• Is there value in having a stand-alone library course or is it better to have instruction firmly embedded or integrated into academic program courses? (Of course, this question did not mean we ever stopped desiring that institutional commitment to information literacy — who would!?)
• How do you assess student learning? And, more importantly, how do you gauge the actual ongoing use of that learning by students?

We also talked about library value. The impetus for Cook’s work was institutional interest in ROI; the result was her quantitative research project.
• How, we asked, can qualitative data be used to support (and enhance) quantitative data when demonstrating library value to the parent institution?
So many questions, and only a lunchtime to discuss.

Not surprisingly, our hour just wasn’t enough. What that hour did do, however, was get us thinking. We talked about the known information literacy courses on campus and learned about pockets of embedded instruction by our librarians that we were completely unaware of. We had a lively debate about quantitative and qualitative research and the benefits of each. And of course we talked about assessment, not only that we need to do more of it and do it more consistently, but also the importance of knowing what we are trying to assess and therefore when we want to assess it.

Our journal club hour got me excited and primed for the next steps in developing our library’s programmatic approach to instruction. Cook’s article, and the energetic conversation it inspired, was an excellent beginning.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Perpetual Access, Perpetually Confusing? C-EBLIP Journal Club, August 25, 2014

by Charlene Sorensen
Services to Libraries, University of Saskatchewan

I’m really enjoying the C-EBLIP journal club, and I’ve been trying to figure out why, since I’ve never been one for book clubs. It certainly helps that journal articles are short, but that isn’t the whole reason. I find all areas of librarianship so interesting, but I don’t have enough time to explore areas outside of my own (technical services and collections). So the exposure to others’ article selections, combined with the small time commitment to read the articles and attend the meetings, is very exciting to me.

The third meeting of the C-EBLIP Journal Club was held on August 25, 2014, to discuss this journal article of my choosing:

Bulock, Chris. “Tracking Perpetual Access: A Survey of Librarian Practices.” Serials Review 40, no. 2 (2014): 97-104. http://doi.org/10.1080/00987913.2014.923369

I chose this article because it was from the area of the library literature that I typically follow but would probably be a topic unfamiliar to the journal club members. It was also relevant to a project I am involved in and it was short (do you see a theme here?). I also liked it because the research study was pretty straightforward and was an example of what any one of us might undertake.

The author undertook a survey that asked librarians about their practices with respect to tracking perpetual access to e-journals, e-books, and multimedia resources. That is, even if perpetual access is contained within a license agreement, the perpetual access entitlements must then be tracked, and holdings must be adjusted if changes occur. The author concludes that librarians seem committed to securing perpetual access rights but are less dedicated to maintaining that access, as evidenced by the fact that a great many weren’t actually tracking it.

The conversation started out innocently enough. We identified a couple of inconsistencies in the paper and yearned for better definitions of some of the concepts. But the discussion took off from there, and we wondered if the advent of electronic resources has changed our perspective on long-term access to any online resource. Libraries struggle with electronic resources every step of the way, from selection and acquisition, to description and discovery, right through to current and long-term access. We are very good at managing these processes for print materials but are nowhere near having the same control over e-resources. BUT maybe we just can’t have the same ‘control’ over these materials and should dial back our expectations. For example, I have a shoebox of letters I received throughout my life up until 1996, when email came along; now correspondence with friends and family is regularly deleted. Many of us have photos on our phones that will be deleted, accidentally or on purpose. Does history matter less now that it’s harder to preserve?

But are libraries supposed to stand up to these difficulties and take responsibility for long-term access to their resources for the benefit of the university community? The author of this paper isn’t very hopeful and concludes:

“It remains to be seen whether librarians will develop the tools necessary to bring their practices into alignment with their ideals, or whether the goal of perpetual access will simply fall by the wayside” (p. 103).

I personally believe that libraries do have the responsibility to ensure perpetual access, though the ideal may be different from that of print materials. I look forward to further discussions on this topic throughout the library.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Librarians Need to “Walk the Talk” on OA Publishing: C-EBLIP Journal Club, July 17, 2014

by DeDe Dawson
Science Library, University of Saskatchewan

This post discusses a recent article in In the Library with the Lead Pipe:

Librarian, Heal Thyself: A Scholarly Communication Analysis of LIS Journals by Micah Vandegrift & Chealsye Bowley

Librarians are at the forefront of many discussions and actions related to advancing the open access movement. We often talk about the need to change the culture of researchers in academia. Researchers need to understand the importance of the issues and their rights as authors – and then put this into action by changing their scholarly communication practices. It is the researchers who have the real power to create change in academic publishing. By “researchers”, though, we usually mean the disciplinary faculty we support… but what about researching librarians?

Increasingly, librarians are publishing researchers in their own right. Indeed, it is a job requirement for many academic librarians. So why isn’t there a stronger movement within our own community of scholars to make our scholarly communication systems, and culture, more open? Even though we preach to other researchers at our institutions about the benefits of publishing in gold OA journals, or archiving copies of manuscripts in repositories, we have a dismal track record of following through on this ourselves.

Vandegrift and Bowley review the literature in this area and conclude:
“Taken together, the research could lead one to think that academic librarians are invested in changes to the scholarly publishing system about as little as disciplinary faculty and are just as cautious about evolving their own publishing habits.”

So, there is a problem – but what is the solution? The authors of this paper hope to ignite this discussion among librarians with their analysis of the openness of the main Library and Information Science (LIS) journals in our field. They adapt the “How Open Is It?” scale produced by SPARC/PLOS to propose a new measure, the “Journal Openness Index” (J.O.I.), code 111 LIS journals according to these criteria, and then apply the J.O.I. Factor to 11 “prestige” LIS journals (as identified by Nixon, 2013).
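To give a sense of what this kind of coding exercise can look like, here is a minimal sketch in Python. The dimension names loosely echo the “How Open Is It?” spectrum, but the 0-4 scale, the journal names, and the scores are all invented for illustration; the authors’ actual rubric and coded data are in their shared files.

```python
# A hypothetical sketch of scoring journals on openness criteria and
# rolling the scores up into a single index. Not the authors' rubric:
# the scale, journals, and scores below are invented for illustration.

DIMENSIONS = ["reader rights", "reuse rights", "copyrights", "author posting rights"]
MAX_SCORE = 4  # hypothetical top score per dimension

journals = {
    "Hypothetical OA Journal": {"reader rights": 4, "reuse rights": 3,
                                "copyrights": 4, "author posting rights": 4},
    "Hypothetical Toll Journal": {"reader rights": 1, "reuse rights": 0,
                                  "copyrights": 1, "author posting rights": 2},
}

def openness_index(scores: dict) -> float:
    """Average the per-dimension scores and normalize to a 0-1 index."""
    return sum(scores[d] for d in DIMENSIONS) / (len(DIMENSIONS) * MAX_SCORE)

# Rank the journals from most to least open.
for name, scores in sorted(journals.items(),
                           key=lambda item: openness_index(item[1]),
                           reverse=True):
    print(f"{name}: openness index = {openness_index(scores):.2f}")
```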

Information Technology and Libraries, published by the Library and Information Technology Association/ALA, comes out on top, with another ALA publication, College & Research Libraries (C&RL), close behind (see Table 2). Unsurprisingly, commercial publishers land at the bottom of the list. An aside: In the Library with the Lead Pipe runs on a blog-style format that allows comments and discussion at the end of the article. There is an interesting back-and-forth in the Comments between the current editor of C&RL and Vandegrift.

The authors intend this application of the J.O.I. Factor to serve as a “proof of concept”, and they encourage others to use their coded data on the 111 journals (posted as a Google doc and in FigShare). They end the article with this:

“It is our hope that this article prompts furious and fair debate, but mostly that it produces real, substantive evolution within our profession, how we research, how we assign value to scholarship, and how we share the products of our intellectual work.”
The article did receive a flurry of attention when first posted back in April 2014 (see some of the trackbacks in the Comments section), but this has since died down. I share the authors’ desire for furious and fair debate in this arena. However, I am continually surprised, and disappointed, by the apparent apathy of many librarians on scholarly communications topics – especially as they relate to their own research output. How can we account for this?

Our C-EBLIP Journal Club met today to discuss this article, and also the topic of librarians’ values regarding their own research and publishing activities. We had a wide-ranging and compelling discussion… but kept arriving back at the distorted importance placed on metrics like the impact factor. We need to satisfy our tenure and promotion committees just as any other faculty member does, so long-standing traditional proxies for “quality” are slow to change.
We did not solve all the problems of the [academic] world at Journal Club today, but I think we came a little closer to understanding what some of those problems are.

Vandegrift, M., & Bowley, C. (2014). Librarian, heal thyself: A scholarly communication analysis of LIS journals. In the Library with the Lead Pipe. Retrieved from http://www.inthelibrarywiththeleadpipe.org/2014/healthyself/

This blog post was originally posted on the blog, Open Access @ UofS Library.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

What’s All This About Evidence? C-EBLIP Journal Club, June 9, 2014

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

On June 9 over lunch, the C-EBLIP Journal Club met for the first time, with nine librarians in attendance. The plan is to meet every six weeks or so, and dates have been set until May 2015 to get us going. The club runs on rotating convenors, with the convenor’s role being to choose an article to discuss, book the meeting space, send out meeting announcements and reminder emails, lead the journal club discussion, and write up a discussion summary for the blog. I convened the first meeting, and here’s how it went.

Article:
Koufogiannakis, D. (2012). Academic librarians’ conception and use of evidence sources in practice. Evidence Based Library and Information Practice, 7(4): 5-24. https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/18072

1. Brief summary of article

This is a qualitative study using grounded theory methodology that addresses these research questions: What forms of evidence do academic librarians use when making professional decisions? Why do they use these types of evidence? How do academic librarians incorporate research into their professional decision making? Koufogiannakis recruited 19 academic librarian participants using purposeful sampling (in other words, she looked for a particular population – librarians who “had some interest in exploring the use of evidence in relation to their professional decision-making” (p. 8)). These librarians kept a private blog diary for a month, noting questions or problems related to their professional practice and how they came to resolve those issues. She followed up with semi-structured phone interviews, which enabled her to clarify and more deeply analyze specific aspects participants noted in their diaries. The study identifies two varieties of evidence, hard and soft, and the librarians in the study used both kinds to inform decision-making. They often didn’t think of the soft types of evidence as real evidence but were using them in conjunction with harder forms, such as the research literature, to inform their practice. The piece concludes with the idea that EBLIP needs to expand to include a wide variety of evidence types to “bring the EBLIP model closer to one that has truly considered the needs of librarians” (p. 21).

2. Discussion Summary
In retrospect, it is difficult to fully participate in a journal club discussion while also taking notes for a discussion summary after the fact. The following discussion points are therefore highly contextual. The journal club discussion was informal and conversational, allowing us to take different roads.

• The discussion began with general impressions of the article and these impressions included praise for the clarity of the writing and an appreciation for the articulation and demonstration of material that at first glance seemed obvious but upon deeper reading revealed itself to be thought-provoking and meaningful.
• There was some discussion about the sample of librarians who participated in the study – how was that sample generated, were the librarians similar to the researcher, what would be the implications of that?
• The article takes a broad view of what is considered evidence. A comment was made that we are still so uncertain about what counts, and that the article highlights an ongoing struggle. Other people may not accept the soft evidence. As librarians we can evaluate sources, and all sources have the potential to offer something. Where it can get complex for us (as librarians) is the fact that we serve other research communities, and we sometimes inherit their value systems through that work.
• There was a comment about the reflective nature of the research, and someone noted that the humanities are doing the same kind of introspective work.
• The discussion moved to evidence based practice in health/medicine. In many models of EBP, the focus seems to be only on the research and not the other two aspects (professional knowledge/expertise; user/client/patient preference).
• Why aren’t librarians going to their associations for information/evidence (e.g., ACRL’s tools, checklists, etc.)?
• Is soft evidence used in the absence of hard evidence? After this question arose, there was a discussion around the designations of the terms “hard” and “soft.” Is the designation imposing some kind of bias? These are not neutral terms. Why have we as a culture or a society determined a hierarchy for these terms? What other terms might have been used to classify these two types of evidence? We couldn’t come up with anything that wasn’t hierarchical to some extent, and we put that down to the idea that we are organizers by nature.
• If the definition of evidence is expanding, does it not just dilute everything? i.e. describing our situation versus prescribing a direction
• Tacit knowledge: gut reactions help to pose the questions which ideally we go on to explore. The impetus to “prove it” helps to solidify thoughts. We often don’t believe our own knowledge is good or that it’s appropriate proof.
• We need to get away from the idea that right and wrong are forever. We need a new standard that allows for new knowledge and flexibility.

This was a great article with which to kick off the C-EBLIP Journal Club. Not only did we discuss the particulars of the research reported in the article, but it also acted as a catalyst for a professional discussion, one we may well not have had otherwise, about how we practice as librarians and what it means to practice in an evidence-based way.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.