The Many Benefits of OA and Open Peer Review: C-EBLIP Journal Club, May 10, 2016

by DeDe Dawson
Science Library, University of Saskatchewan

Article:
Tennant JP, Waldner F, Jacques DC et al. The academic, economic and societal impacts of Open Access: an evidence-based review [version 1; referees: 4 approved, 1 approved with reservations]. F1000Research 2016, 5:632 (doi: 10.12688/f1000research.8460.1)

This is arguably the perfect journal club article! A juicy topic with a few points of contention, a journal platform with many innovative features, and open post-publication peer review. Lots and lots to discuss, and indeed we ran out of time. Here I try to summarize our conversation:

I always gravitate towards review articles and highly recommend them to students. They are often the most efficient way to get up to speed on all the relevant literature in a complex area. Open Access (OA) is just such a complex area, with multiple overlapping layers of issues, all progressing so rapidly that it is the ideal topic for a review. This is ironic, because OA itself is such a simple concept. The complexity comes from the challenges of implementation and the multiple stakeholders (and vested interests) involved.

This review article summarizes the main evidence to date on the impact of OA from three perspectives: academic, economic, and societal. These are essentially three lines of reasoning in support of OA. We thought the strongest, most well-developed argument in favour of OA in this article was the academic one. It certainly had the most citations behind it, thanks to the highly productive research area documenting the OA citation effect. We also suspected the authors focused on the academic perspective because it is the most likely to persuade researchers reading the review.

So, was the point of the article to persuade researchers to support OA? The authors have an obvious bias as proponents of OA. But how important is it for authors to be neutral? We thought it was unrealistic to expect authors not to have a bias, and for most kinds of papers authors indeed argue a particular point. But should review papers be different? It was suggested that if authors are clear and upfront about their objectives and competing interests this shouldn’t be a problem.

This brought us to the question of what evidence against OA there might be anyway (we expose our own pro-OA bias here!). One of the online commenters on the article challenged the authors to provide a more balanced review, but he could not supply links to literature supporting these anti-OA perspectives. Some of the obvious counter-arguments were already dealt with in the article, such as the rise of deceptive (“predatory”) publishing and the challenges of paying article processing charges (APCs) for authors without funding or those from underdeveloped countries. Otherwise, it is pretty hard to argue against OA unless you are a commercial publisher (or shareholder) with a financial interest in sustaining the current system. The commenter argued that jobs will be lost in the transition, but this is a weak point. Are we to prop up an entire dysfunctional, and inequitable, system for the sake of some jobs? Besides, these jobs will likely morph into other, more relevant and useful functions. What seemed to emerge from this back-and-forth was that “sustainable” means something completely different to commercial publishers (and their allies) than it does to OA proponents! Publishers are from Mars; OA proponents are from Venus.

Beyond the article itself we had a lot to say about the platform and the open peer review model. The article is essentially still a preprint: it was posted on the F1000Research site before peer review. It was fascinating to see the reviewers’ reports as they were submitted, and to watch as others commented on the article and the authors responded. It gave the impression of a proper scholarly conversation taking place. This is ideally what journals should be facilitating. Technology allows this now, so why are so many journals still clinging to outdated formats from the print era?

The “open” nature of the reviews and comments also ensured an appropriate level of civility. Who has not received rude and unproductive comments from a reviewer who feels protected by their anonymity? (There is an entire Tumblr site devoted to such remarks!) However, if reviewers are obliged to reveal themselves, not just to the authors but to the whole readership, then they are more likely to behave diplomatically and provide constructive, substantiated critiques. This also works in the reviewer’s favour: readers (and evaluators) can plainly see the amount of work and time invested in the role. A reviewer who has spent considerable time providing a thoughtful review can justifiably link to it on their CV, and collegial committees can see for themselves the energy expended.

We also spoke of how we might use this kind of journal format in information literacy instruction with students. It would more clearly make the point that scholarship is a conversation with multiple points of view. It would also demystify the peer review process: we can see the issues raised by the reviewers and follow the paper into its next version to see how the authors address those concerns. This process is usually completely hidden from the average reader, so it is difficult for a student to imagine a paper in anything other than its final version.

These multiple versions do present citation challenges for readers, though! All the versions remain on the site, each with its own DOI, but the added complexity of citing a specific version persists. This is a relatively minor issue compared to the benefits of the open scholarly conversation that such a model of peer review allows.
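One upside of the per-version DOIs is that the exact version under discussion can always be resolved unambiguously. As a minimal sketch (assuming the versioned DOI is registered with an agency that supports DOI content negotiation, as Crossref and DataCite do; the helper name is mine), a few lines of Python can fetch a ready-formatted reference for a specific version:

import requests

def citation_for(doi, style="apa"):
    """Ask the DOI resolver for a formatted bibliography entry."""
    resp = requests.get(
        "https://doi.org/" + doi,
        headers={"Accept": "text/x-bibliography; style=" + style},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text.strip()

# Version 1 of the Tennant et al. review has its own versioned DOI:
print(citation_for("10.12688/f1000research.8460.1"))

This does not remove the burden of deciding which version to cite, but it does mean each versioned DOI is a stable, machine-resolvable handle for whichever version you choose.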

We look forward to seeing the next version of this article and continuing the conversation on the benefits of OA!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Carrots & Sticks: Encouraging self-archiving in an IR. C-EBLIP Journal Club, Mar 31, 2016

by Shannon Lucky
IT Librarian
University Library, University of Saskatchewan

Article: Betz, S., & Hall, R. (2015). Self-Archiving with Ease in an Institutional Repository: Microinteractions and the User Experience. Information Technology and Libraries, 34(3), 43–58. http://doi.org/10.6017/ital.v34i3.5900

One of the things I love about the C-EBLIP journal club is the ease of having one of my colleagues pick out an interesting article from their area of specialization so I can poke my head into their world for an hour and see what ideas they are wrestling with. As an IT librarian, I find that picking an article creates some anxiety because systems and technology aren’t always that accessible (or interesting) for a diverse audience. I was happy to see Sonya Betz and Robyn Hall’s article pop up on a library tech listserv as it was a great fit for our group.

The University Library currently doesn’t have an institutional repository (IR) for the entire campus, but we do have a DSpace eCommons repository for research by UofS librarians. Because we have all deposited our own work into eCommons, our conversation started with a unanimous (good-natured) rant about how hard self-archiving is. It is time-consuming, and the technology was deemed frustrating and unsatisfying. Like other tedious institutional reporting systems, we assumed this was the only way. As one member put it, “I didn’t know we could expect better.”

While we talked about how frustrating the process could be, we also wondered just how much effort, time, and money should be invested in improving a system that we all have to use but that our library users will never see. When do we make the call that something is good enough and we, or our fellow faculty, can suck it up and figure it out or ask for help? One of my favourite suggestions was that a “good enough” scenario would leave the user feeling “the absence of anger.” Apparently the bar is quite low. Betz and Hall talk about some of the barriers to self-archiving but don’t ask why, when contributing to IRs is so difficult, many academics voluntarily submit their work to sites like academia.edu and ResearchGate. What is it they are doing right that we could learn from?

This led to a discussion about what libraries could do to encourage faculty, both within and outside the library, to deposit in an IR. We saw two routes: the carrot and the stick.

Carrots:
• Link academic reporting systems together to cut down on the number of places this information needs to be input (e.g., have citations from the IR export to formatted CVs, link ORCID accounts with IR entries for authority control and better exposure, etc.) — see the sketch after this list
• Group scholarly output for colleges, departments, or research groups together in the IR to show the collective impact of their work
• Gamify the submission process with progress bars, badges, and the ability to level up your scholarly work
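To make the first carrot a little more concrete, here is a hypothetical sketch of the ORCID-to-IR linking idea: pulling a researcher’s works from ORCID’s public v3.0 API so a repository deposit form could be pre-populated rather than re-typed. The endpoint and JSON field names follow ORCID’s public API; the helper name is mine, and the actual deposit step is omitted because it depends entirely on the repository platform.

import requests

def orcid_works(orcid_id):
    """Return simplified (title, year, DOI) records for an ORCID iD."""
    resp = requests.get(
        "https://pub.orcid.org/v3.0/" + orcid_id + "/works",
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    records = []
    for group in resp.json().get("group", []):
        summary = group["work-summary"][0]  # first summary per work
        ext_ids = (summary.get("external-ids") or {}).get("external-id", [])
        doi = next((i["external-id-value"] for i in ext_ids
                    if i["external-id-type"] == "doi"), None)
        date = summary.get("publication-date") or {}
        records.append({
            "title": summary["title"]["title"]["value"],
            "year": (date.get("year") or {}).get("value"),
            "doi": doi,
        })
    return records

# e.g., pre-populate a deposit form using ORCID's fictitious example iD:
# for work in orcid_works("0000-0002-1825-0097"):
#     print(work["year"], work["title"], work["doi"])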

Sticks:
• Money. The Canada Council requires submission to an IR as part of its funding model
• Librarians armed with actual sticks going office to office “persuading” scholars to deposit their research

We agreed that libraries don’t wield an effective stick in this scenario. Research services, colleges, and departments have to be the ones to put on the pressure to deposit. Librarians can help make that happen and (hopefully) make it as pain-free as possible.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Describing a phenomenon through experience: C-EBLIP Journal Club, February 16, 2016

by Carolyn Pytlyk
Research Facilitator
University Library, University of Saskatchewan

Article:
Forster, M. (2015). Phenomenography: A methodology for information literacy research. Journal of Librarianship and Information Science, 1–10. doi:10.1177/0961000614566481

Way back in October at the C-EBLIP Fall Symposium, Margy MacMillan from Mount Royal University talked about phenomenography as a new methodology for conducting research on information literacy. Phenomenography is “the empirical study of the limited number of qualitatively different ways in which various phenomena in, and aspects of, the world around us are experienced” (Marton, quoted in Forster, p. 1). Margy’s enthusiasm and excitement for phenomenography certainly piqued my interest. In my conversations with library researchers, research methodology is often the topic of discussion when planning research projects, applying for grants, or developing research budgets, and it can sometimes be a stumbling block for researchers. As such, when my turn came to convene Journal Club, I thought Forster’s review article might be a good opportunity to explore phenomenography as a viable methodology for library researchers.

The majority of our conversation revolved around whether phenomenography is indeed a useful new methodology for conducting library research. For the most part, we agreed that, judging from the review article, it seemed a rather complex and involved methodology. In the end, though, we couldn’t really tell without actually following a researcher through the process. The review was a fairly good introduction to and overview of phenomenography, but we agreed that to really understand its complexity we would need to read research employing it as a methodology, to see how it works and whether it is really as complex as it seems at the outset.

While presenting an intriguing and possible methodological alternative, this article left us with many more questions than answers. Some questions stemming from this review article include:
1. Is this a useful methodology? Would library researchers use it?
2. Is it a methodology about how we think?
3. How do researchers unobtrusively interview people without priming the participants? Is it even possible?
4. Is it a complex methodology, or does it just seem like it?
5. What are the steps involved? How does someone actually do it?
6. Could it be appropriate for library research other than information literacy (like usability or librarians as researchers)?
7. What other methodologies are out there in other disciplines that are possible for library research?
8. What sorts of learning/training would researchers need before undertaking phenomenography?
9. Do researchers have to be experienced interviewers to use it?

Still, despite the numerous unanswered questions, we were not deterred and agreed that we are all keen to learn more about the methodology and its process.

Finally, we rounded out our conversation with the value of review articles, although not all of us are keen on them. (Don’t worry; I won’t name names.) Forster’s article not only opened our eyes to phenomenography as a new methodology; it also opened our eyes to the value of review articles in providing overviews of new methodologies, for us both as consumers and as producers of knowledge.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Publish or practice, never that simple: C-EBLIP Journal Club, November 17, 2015

by Selinda Berg
Schulich School of Medicine – Windsor Program
Leddy Library, University of Windsor
University Library Researcher in Residence, University of Saskatchewan

As the Researcher-in-Residence, I was very eager to convene the November gathering of the University of Saskatchewan Library’s C-EBLIP Journal Club. I think this initiative by the Centre (C-EBLIP) is incredibly valuable to librarians: it expands our understanding of the research landscape, increases our understanding of our colleagues’ research interests, and diversifies our perspectives and deepens our knowledge about research.

The article we discussed in November was:
Finlay, S. C., Ni, C., Tsou, A., & Sugimoto, C. R. (2013). Publish or practice? An examination of librarians’ contributions to research. portal: Libraries and the Academy, 13(4), 403–421.

In this article, the researchers share the results of their investigation into the authorship of LIS literature, with an emphasis on understanding the contributions and attributes of practitioner scholarship. The article intersects well with my own research interests and aligns with many of the ongoing conversations about the research outputs and productivity of academic librarians. The conversation was lively, informative, and thoughtful.

The article was well-received by those at journal club, with members highlighting its clear methods and style of writing. The discussion was diverse and led us in many different directions, but three themes did emerge.

Other possible interpretations and explanations:
The authors found a decrease in the proportion of articles published by practitioners between 2006 and 2011. They made a couple of suggestions as to why this may have occurred, including the increase in non-traditional publications and the decrease in expectations for research. In addition to these explanations, we discussed other possibilities, including a movement away from LIS journals as librarians’ research interests become more diverse; a decrease in tenure-track/tenured librarian positions (resulting in more contract positions without research opportunities, and perhaps more practice-heavy positions); and/or a change in the nature of articles, with a movement away from a focus on quantity of articles toward a focus on quality of research.

Application of method and findings to the development of institutional standards and a disciplinary research culture:
The discussion led to interesting conversation about how contributions to scholarship are measured, both in relation to our disciplinary research culture and to institutional standards. As scholarly communications evolve, is the counting of articles in respected journals the only (or best) way to evaluate research contributions? This led us to further consideration of how disciplinary differences in research culture shape the interpretation of contributions; in turn, the relatively young and immature research culture in academic libraries makes it difficult to name our disciplinary criteria and thus to develop institutional standards.

Related research questions:
The article was very well-received, and from good research come more questions. It raised some interesting discussion about related research questions that were not within the scope of the study. There was an interest in knowing more about the qualities and attributes of the librarians who have been publishing (including their positions, their length of service, their motivations for research, and the factors that determined where they publish). There were also questions as to whether the librarians who contribute to scholarship through “traditional” scholarly venues are also contributing to scholarly conversations through non-traditional formats (blogs, open publishing, etc.). Lastly, there was an underlying assumption that these two bodies of literature, written by two sets of authors (LIS scholars and practitioner-scholars), interact with and influence each other; there was an interest in knowing how they actually do interact. For example, are they citing each other, or do they cite only their own communities?

Great discussion ensued at the meeting, and some stimulating ideas were generated from the many interesting findings within the paper and beyond. Looking forward to January!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Data for Librarians: C-EBLIP Journal Club, October 1, 2015

by Kristin Bogdan
Engineering Library, University of Saskatchewan

At the second meeting of the C-EBLIP Journal club for 2015-2016, held on October 1, 2015, we discussed the article:

MacMillan, D. (2014). “Data Sharing and Discovery: What Librarians Need to Know”. The Journal of Academic Librarianship, 40, 541-549. http://dx.doi.org/10.1016/j.acalib.2014.06.011

I chose this article because it is a nice overview of the key things that librarians should be familiar with about data and data management. MacMillan does a great job of synthesizing the information out there and applying it in a Canadian context, where current data management trends are not as driven by granting agencies as they are in other jurisdictions (although that could be coming). There was general agreement that the article was a useful place to start when it comes to understanding where data management can fit into library services and systems.

The flow of the discussion changed as we looked at data sharing and discovery based on the roles that librarians and information scientists fulfill in this context. We recognized that the library is a possible home for research data and that we have a role as educators, curators, and stewards of data, but we are also researchers who consume and produce data. These points of view overlap and complement each other, but also offer different ways of looking at how the library can be involved.

When it comes to our role as curators and stewards of data, we discussed the kinds of things that could make data sharing difficult. The members of the Journal Club acknowledged that there is a difference between being able to find data and being able to provide those data to patrons in a way that is usable and sustainable. Infrastructure is required for data sharing and discovery, and there are many possible ways to make this happen. Should libraries have their own repositories or take advantage of existing repositories? What are the possible downsides of housing data in institutional repositories instead of those that are discipline-specific (highlighted by MacMillan on page 546)? How can we work together to make the most of our limited resources and provide the most comprehensive services for Canadian researchers? Resources are being collected by the Canadian Association of Research Libraries (CARL), including a list of institutional repositories and adoptive repositories (http://www.carl-abrc.ca/ir.html). We talked briefly about data journals as dissemination venues, but wondered about the implications of publishers owning this content.

Issues around data privacy also came up in the discussion. Concerns were raised about security and the measures in place to make sure individuals’ identities are protected. The Saskatchewan Research Data Centre (SKY-RDC) was identified as an example of how data can be distributed in a controlled way to protect research subjects (more about the SKY-RDC here: http://library.usask.ca/sky-rdc/index.html). We came to the conclusion that, for sensitive research data, privacy will trump sharing.

Our role as data producers and consumers brought up concerns about when it is appropriate to release data that are still being written about. The idea of being scooped came up as a possible deterrent to making data public; this applies to “small” data as much as to “big” data. There were also concerns about how data sets would be used after they were made public. What if they were not used in a way consistent with their intended purpose? Data documentation can help users understand the data and use it in a way that enriches their research while acknowledging the possible limitations of the original data set. Data citation is an important if still relatively new practice, and part of our role as stewards and creators will be to make citing data as easy and commonplace as citing other materials.
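Since data citation came up, a small illustration may help show how low the technical barrier already is. This is a hedged sketch against DataCite’s public REST API (the field names follow DataCite’s documented JSON response and should be verified against the current API; the helper name is mine); it assembles a bare-bones citation string for a dataset DOI:

import requests

def cite_dataset(doi):
    """Build a rough 'Creators (Year). Title. Publisher. DOI' string."""
    resp = requests.get("https://api.datacite.org/dois/" + doi, timeout=30)
    resp.raise_for_status()
    attrs = resp.json()["data"]["attributes"]
    creators = "; ".join(c["name"] for c in attrs.get("creators", []))
    title = attrs["titles"][0]["title"]
    return (f"{creators} ({attrs.get('publicationYear')}). {title}. "
            f"{attrs.get('publisher')}. https://doi.org/{doi}")

In practice one would rely on an established citation formatter (DOI content negotiation can return formatted references directly), but the point stands: the pieces needed to make citing data routine already exist.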

In the end, I think this article was a great place to begin the discussion of data sharing and discovery in the context of libraries for the C-EBLIP Journal Club. The discussion generated more questions than answers, which made it clear that this is a topic worthy of further investigation.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Organizational Innovation: C-EBLIP Journal Club, August 20, 2015

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

The first C-EBLIP journal club meeting of the 2015/16 academic year took place on Thursday, August 20, 2015. Despite many librarians being in full-on holiday mode at the time, six participants discussed the following article:

Jantz, R.C. (2015). The Determinants of Organizational Innovation: An Interpretation and Implications for Research Libraries. College & Research Libraries, 76(4), 512-536. http://crl.acrl.org/content/76/4.toc

I chose this article with the idea that innovation was a light and exciting topic and that the article would be perfect for an August journal club meeting. While the article did have some redeeming qualities, the club meeting had a bit of a rocky start: this article is long and slightly unwieldy! Jantz conducted a research study focused on innovation as a specific type of change and determined five factors that had a significant impact on how well libraries innovate. The research method consisted of a survey distributed to 50 member libraries of the Association of Research Libraries.

The discussion opened with some problems around the methodology of the research. The details of how the research was conducted were sketchy: there was no word on how Jantz coded the data and no detailed discussion of analysis or collection methods. The survey instrument was not included in the paper, nor were the raw survey results, which, one journal club member commented, would have been the interesting part! It would have been helpful to see the instrument, as club members wondered whether the author defined the complex terms he used. How can we be sure that all the respondents were on the same page? The author also did not list the specific institutions surveyed, causing us to wonder whether the sample was skewed toward the sciences. Would liberal arts universities/libraries have R&D departments?

Club members found it problematic that the surveys were administered only to senior leadership teams, as views could be quite different further down the organizational hierarchy. As well, since the responses were said to have come from senior leadership teams, members were interested in how this happened logistically. Did the teams get together to fill out the surveys? Did each team member fill out a survey, with the results then collated? This is an example of insufficient detail around the methods employed, and the problems could have been alleviated with more attention to detail. It was a good takeaway for my own research: be detail-oriented, especially when it comes to methodology! If research is to be useful, it has to be seen as valid and reliable. That won’t happen if the reader is questioning the methods.

Another issue with the paper was that in several instances the author stated things as established facts but did not cite them. Perhaps the author made assumptions, but as we are attuned to the idea of providing evidence for claims, we weren’t buying it. Another takeaway: in terms of evidence, more is more! Or at the very least, some is vastly better than none. As well, in the conclusion of the paper, the author used the terms vision and mission interchangeably, which particularly irritated one journal club member and was another example of imprecision.

The discussion moved from the particulars of the article to innovation in general with some observations being made:
• Innovation is not dependent on age: people across the age spectrum can be innovative.
• We are overwhelmed by choice: the more choices we have, the more difficult choosing becomes. Decision-making is hard.
• Is there a type of innovation on the other side of radical, i.e. useless? Change for change’s sake?
• One participant felt that innovation of all types can be useless. Innovation doesn’t necessarily create more viable choices.
• Libraries are always playing catch up…it may be innovative to us but not elsewhere [depending of course on the library].
• Difference between innovation and library innovation. Is there a difference between innovation and implementation?

At the very least, C-EBLIP Journal Club members felt that reading about innovation could be valuable when heading into a state of change. And let’s face it, we’re generally dealing with change more often than not these days. To conclude, although the article had some methodological gaps and members felt that the author could have been more selective when transforming his PhD dissertation into an article, the article did give us the basis for a fruitful discussion on innovation and the first meeting of 2015/16 was an hour well spent.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Conducting Research on Instructional Practices: C-EBLIP Journal Club, May 14, 2015

by Tasha Maddison
University Library, University of Saskatchewan

Journal club article:
Dhawan, A., & Chen, C.J.J. (2014). Library instruction for first-year students. Reference Services Review, 42(3), 414-432.

I was sent this article through an active Scopus alert I have running on the topic of flipped classrooms. I had not received an article that met my particular search criteria in a long while, so I was excited to read this offering. The authors had included flipped classrooms as one of their recommended keywords; yet the term does not appear until page 426, in a section entitled ‘thoughts for improving library instruction’, and makes up just over a paragraph of content. It was interesting to witness firsthand how an inappropriate keyword exposed me to research I probably would not have read otherwise, which is both good and bad. I chose this article for C-EBLIP Journal Club for this reason, as I believed it would generate a spirited debate on the use of keywords, and that it did. Immediately there was a strong reaction from the membership about how dangerous it can be to use deceptive descriptions and/or keywords in the promotion of your work, as you will likely end up frustrating your audience.

I found the scope of the literature review in this article overly ambitious: it covers librarian/faculty collaboration and best practices for instruction in addition to information on the first-year college experience (p. 415). I wondered if the reader would have been better served by a more specific review of the literature on ‘for-credit’ first-year library instruction. Another point worth noting is the significant examination of the assessment process throughout the article, including information about the rubric that was used as well as evidence from the ACRL framework and the work of Megan Oakleaf; yet the only quantitative data provided in the case study were briefly summarized on page 423.

The group had a lively discussion about the worth of communicating research on instructional practices in the scholarly literature. Members questioned whether there is value in the ‘how we done it good’ type of article, and the validity of reporting observations and details of your approach without providing assessment findings or quantitative data. I would argue that there is a need for this type of information within the library literature. Librarians with teaching as part of their assigned duties require practical information about course content, samples of rubrics, details of innovative pedagogy, and best practices for a given methodology, ideally outlining both successes and failures. Despite the advantages to the practitioner in the field, we speculated about how such information could be used within evidence based practice, as the findings from these types of articles are typically not generalizable and often suffer from inconsistent use of research methodology.

We wondered if there is a need to create a new category for scholarly output. If so, do these articles need to be peer reviewed or should they be simply presented as a commentary? There is merit in practitioner journals that describe knowledge and advice from individuals in the field, detailing what they do. This type of scholarly output has the potential to validate professional practice and help librarians in these types of positions develop a reputation by publishing the results of their integration of innovative teaching practices into their information literacy instruction.

In spite of the fact that this article had little to do with flipped classrooms, I did find a lot of interesting takeaways, including details of student learning services within the library and learning communities on their campus, as well as the merit of providing for-credit mandatory information literacy courses.

Suggested further reading: http://blogs.lse.ac.uk/impactofsocialsciences/2015/05/13/reincarnating-the-research-article-into-a-living-document/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reflections from the C-EBLIP Journal Club, Feb 23, 2015

by Carolyn Doi
Education and Music Library, University of Saskatchewan

For this iteration of the C-EBLIP Journal Club, I decided to feature an article from outside the LIS literature that deals with the topic of reflection, creative processes and digital technologies in the classroom:

Kirk, Carole, and Jonathan Pitches. “Digital Reflection: Using Digital Technologies to Enhance and Embed Creative Processes.” Technology, Pedagogy and Education 22, no. 2 (July 1, 2013): 213–30. http://dx.doi.org/10.1080/1475939X.2013.768390

This paper caught my attention for several reasons. The discussion of creative processes and the incorporation of technology in the classroom is particularly interesting to me, and these are topics that often come up when I discuss teaching strategies with other librarians. I was also looking forward to exploring the idea of reflection, both in the classroom and as part of the research process. This is something we have discussed at our institution, in particular through our own Library Leadership Development Program.

The authors of this paper are both scholars at the School of Performance and Cultural Industries at the University of Leeds who have shared the results from a teaching and learning project called Digitalis (http://www.digitalis.leeds.ac.uk/), which “investigates ways in which digital technologies can be used by teaching staff to facilitate reflection on creative practices within performing and creative arts disciplines” (p. 213). The study used action research methodology led by members of a cooperative inquiry group who incorporated reflection and digital technologies into their own teaching practice. They took this a step further and also incorporated reflection as one of the four project phases (planning, action, collection and reflection).

The study featured modules in five areas of study: performance design, dance, music, theatre & performance and museum studies. In each module, students were asked to reflect on their learning and experience, assisted by different types of digital technology. In one example, students in a second year compulsory Dance Choreography course were asked to use a flip camera to capture thoughts, ideas and observations, which were used in combination with written reflection and posted to a private blog. The other modules used varying types of reflective processes. Methods of digital capture included flip cameras, audio recorders and digital still cameras. Digital reflection mechanisms included blogs (on Blackboard), PowerPoint and Photo Story 3.

In some cases, the technology may have interfered with the process of critical reflection as some students ended up “concentrating too much on slick production values to the detriment of critical thinking” (p. 224). The paper mentioned that ease of use was an important factor in getting students to feel engaged in the reflection activities. One recommendation that came out of the paper was that digital reflection technologies should be introduced incrementally, as opposed to all at once.

We discussed the value of incorporating technology into the classroom, and also the importance of not letting the technology ‘get in the way’ of the learning process. Some in our group remarked that they were surprised the incorporation of technology in the classroom might still be a barrier for some students.

The paper reports that students found digital reflection advantageous for ‘looking again’ at material which would otherwise have been lost in the creative practice. The digital captures acted as a benchmark for their own impressions of an event, and allowed the performer to experience being an audience member of their own performance.

We discussed the benefits of reflection in two veins: 1) for integration into the classroom and 2) for integration into our own practice. Some questioned the viability of incorporating reflection (especially non-written reflection) into library instruction as we are often faced with the challenge of limited classroom time where it would be difficult to follow up with students. Librarians who teach in disciplines outside of the arts felt that they might just not be able to get their students to try a less conventional reflection method such as video capture. The article prompted some to think about video capture as a means to document and reflect on one’s own teaching practice. Others were thinking about incorporating reflection into other aspects of practice or research, or are currently embarking on projects that do incorporate an element of planned reflection.

The journal club is always an engaging gathering and it’s been interesting to see the various opinions and perspectives that emerge out of the group discussions. I look forward to many more discussions around the journal club table in the coming months!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Oh, the Digital Humanities! C-EBLIP Journal Club, November 13, 2014

by Shannon Lucky
Library Systems & Information Technology, University of Saskatchewan

The most recent meeting of the C-EBLIP Journal Club was my chance to select an article and lead our discussion. I was intrigued by our previous conversations about strategic publishing choices, open access journals, the perception of different publishing outlets in our field, and alternatives to traditional scholarly publications (like blogs!) and wanted to keep that discussion going. An announcement about the recent digital humanities project The Exceptional and the Everyday: 144 hours in Kiev landed in my inbox, and I thought it could ignite a great conversation about alternative research communication options – but it is decidedly not a traditional journal article. Although this goes against the basic organizing principle of a journal club, Virginia encouraged me to take some liberties with the definition of an article, and it led to a lively discussion that ranged from the audience(s) for research, to selfies, to peer review, and to the anxiety of wanting to experiment with your research while also wanting to get tenure.

The Exceptional and the Everyday: 144 hours in Kiev comes from Lev Manovich’s Software Studies Initiative and uses “computational and data visualization techniques [to] explore 13,208 Instagram images shared by 6,165 people in the central area of Kyiv during 2014 Ukrainian revolution (February 17 – February 22, 2014).” I will state my bias up front – I love this kind of stuff. This is a very “DH” (digital humanities) project – I wondered how researchers from other disciplines (like libraries) would feel about putting their resources into creating a website where they release everything (findings, data, and research tools) rather than focusing on getting that highly regarded publication accepted first.

A recent theme that has been running through our meetings is standards for promotion and tenure at our institution and how the collegium values research work and output. There are so many different ways we all engage with research and share the things we learn, but the gold standard remains the traditional peer reviewed, high-impact journals. We had previously talked about the tension between publishing in the most appropriate or interesting place vs. traditional, highly regarded library journals. I wanted to talk about how a project like this breaks the expectation of how research is communicated and if this format is effective, persuasive, authoritative, or just a gimmick.

This publishing format is certainly out of the ordinary, but one of its biggest benefits is the ability to get information out FAST. The event this project studied happened less than eight months before the day I got that announcement in my inbox. That kind of production time is basically unheard of in traditional publishing, even for electronic journals. The downside is that there is no time for peer review. Post-publication peer review was mentioned as an option for keeping the timely nature of the publishing cycle while maintaining the important value of peer review. I am very curious what that would look like for this project and how that peer review would be communicated to readers.

Perhaps my favourite comment from our discussion was “This made me feel like the oldest fogey ever.” While a hysterical comment coming from a room full of people who love new research, it nicely described the feeling several of us had trying to read this project like a journal article. As we picked our way through the site, we acknowledged that most of the information we look for in an article was there (except an abstract!), but not having it in the familiar linear format was disorienting. The project checks all of the boxes in terms of citing sources and uses research methods we recognize, but nothing is where you would expect it to be. It is both easy to browse and difficult to skim for information. We need to develop new literacies to become more comfortable with this format, or at least check our assumption that the best way to communicate research findings is the way we do it now.

Although the project proved complicated to read in the way we understand journal articles, the format does have major benefits. This kind of project allows you to publish everything: multiple articles or essays, your dataset(s), huge full-colour graphics, interactive visualizations, and the digital tools you used to do the research, and you can update all of it on the fly at no extra cost. This is only good if all of that information is useful (or at least beautiful to look at), but it does offer the opportunity to understand the methodology and process better by revealing multiple aspects of the research, particularly if the research subject exists online.

All of this analysis and comparison to traditional academic publishing kept coming back to the question of who the audience is. Who is this research for? We didn’t come to a consensus on this question. We did wonder what the altmetrics for something like this would look like and what the benefits are in pursuing this publication model. The project didn’t show up in Google Scholar at all, but it did have over 800 hits from a regular Google search (many from social media). In the end we posed the question: What is more valuable for your research, having a paper peer reviewed and read by your academic peers or seen by thousands of people outside your field and likely outside academia? I can’t imagine building an academic career based on web projects (without peer review) at the moment, but who can tell the future? Things are changing all of the time. Besides, I wouldn’t be surprised to see some peer reviewed articles about 144 Hours in Kiev pop up in Google Scholar in 2015.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Are Students Succeeding with a Library Credit Course? C-EBLIP Journal Club, October 6, 2014

by Rachel Sarjeant-Jenkins
Client Services, University Library, University of Saskatchewan

I recently had the opportunity to lead our C-EBLIP Journal Club in a discussion of Jean Marie Cook’s article “A library credit course and student success rates: A longitudinal study” in College & Research Libraries 75, no. 3 (2014) (available at http://crl.acrl.org/content/75/3/272.full.pdf+html). This article had been sitting on my desk for a few months waiting for that magical moment when I found the time to read it thoroughly. Then came my turn to host journal club. What a perfect opportunity to finally delve into Cook’s article! And it couldn’t have come at a better time in light of our library’s focus on developing a programmatic approach to library instruction and the broader teaching and learning environment in which academic libraries currently find themselves.

Following some ‘proper’ journal club discussion about the article’s methodology and findings, Cook’s article proved a wonderful catalyst for a conversation about library instruction at our institution. Initially we were simply envious of Cook’s situation, where a library-focused course falls squarely within her institution’s priorities. But then the questions started.

• Is there value in having a stand-alone library course or is it better to have instruction firmly embedded or integrated into academic program courses? (Of course, this question did not mean we ever stopped desiring that institutional commitment to information literacy — who would!?)
• How do you assess student learning? And, more importantly, how do you gauge the actual ongoing use of that learning by students?

We also talked about library value. The impetus for Cook’s work was institutional interest in ROI; the result was her quantitative research project.
• How, we asked, can qualitative data be used to support (and enhance) quantitative data when demonstrating library value to the parent institution?
So many questions, and only a lunchtime to discuss.

Not surprisingly, our hour just wasn’t enough. What that hour did do, however, was get us thinking. We talked about the known information literacy courses on campus and learned about pockets of embedded instruction by our librarians that we were completely unaware of. We had a lively debate about quantitative and qualitative research and the benefits of each. And of course we talked about assessment, not only that we need to do more of it and do it more consistently, but also the importance of knowing what we are trying to assess and therefore when we want to assess it.

Our journal club hour got me excited and primed for the next steps in developing our library’s programmatic approach to instruction. Cook’s article, and the energetic conversation it inspired, was an excellent beginning.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.