Getting to grips with NVivo

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

Despite having worked in academic libraries for almost ten years, and taken two graduate degrees that included research methods courses, I must confess that my level of comfort with research data collection and analysis is not terribly high. My lack of confidence manifests itself particularly in the use of specialised data analysis software packages.

Studies that make use of such tools (e.g. NVivo and ATLAS.ti) are becoming increasingly common. Woods, Paulus, Atkins, and Macklin (2016, p. 602) used the Scopus database to confirm that the number of published articles using one or the other of these qualitative data analysis packages has grown significantly in recent years. I have found this reflected at my institution as well, where our library has received an increasing number of enquiries about NVivo in particular. Apart from this contextual need to become more fluent with these packages, as a social sciences subject librarian I feel a strong professional desire to deepen my understanding in this area.

As every instruction librarian knows, teaching or learning specialist skills in a vacuum is difficult. If there is no practical application for what is being learned, staying motivated will be a challenge. I was therefore fortunate to have a recent opportunity to gain practical experience with NVivo as part of an action research project in partnership with a faculty member. This grew from an invitation by the faculty member to co-teach a postgraduate research methods course. We adopted an embedded librarian approach: in addition to leading a traditional instruction session, I attended several of the regular classes and was active in the course’s learning management site.

The depth of this collaboration between librarian and course instructor was new to both me and the faculty member, and this level of partnership is certainly uncommon at our university. We were thus naturally keen to assess how effective the approach was in enhancing student learning. We were inspired by the action research undertaken by Insua, Lantz, and Armstrong (2018), in which the researchers asked students to reflect upon their research process in structured research journals, which were subsequently coded and analysed using NVivo. The authors reported that these “qualitative data yielded valuable insights into the research process from the student’s point-of-view”. In asking our own students to complete a similar task, we intended to analyse the results to gain insights into their development of information literacy abilities and dispositions.

Apart from the obvious benefits this enterprise would have for my teaching practice, the idea appealed to me because it represented an authentic need to learn how to use NVivo. Prior to embarking on the project, I knew very little about NVivo beyond the name and the fact that it was used for the analysis of qualitative data. In response to feedback from our library’s users, I had been involved in making the software available on library PCs and laptops, but my own practical experience with NVivo was zero. My first step was therefore to work through a beginner’s introduction via an institutional subscription to Lynda.com. This was great for learning basic concepts and terminology, and required less than ninety minutes.

After learning the ropes, my next step was to import the student journals into NVivo. We had asked students to submit their journals using Google Sites, and getting this content into NVivo was surprisingly easy using the NCapture browser extension. With one click, the entire content of a web page can be saved as an .nvcx file; these files can then be imported in a batch into NVivo for organization and coding.

Once all of the data was imported, I experimented with coding a small proportion (10%) of the journal entries. I then sought feedback on the coding structure from the faculty member, and once this was incorporated, I began coding in earnest. This was a time-consuming and labour-intensive process. I completely agree with Carolyn Doi, who in her earlier Brain-Work post on NVivo states that “coding is made easier with NVivo, but the software doesn’t do all the work for you.” I found maintaining focus and attention during coding to be a challenge, and I had to split the work across multiple sessions.

The effort did pay off: once the coding was done, I was able to start exploring some of the analysis features of the software. Some are very intuitive, such as the treemap pictured in Figure 1 below, which visualizes your coding hierarchy and lets you see at a glance the more prominent themes based on coding frequency. Other features (such as the various queries) are more opaque, and I will need to dedicate more effort to understanding their purpose and whether they will be useful to the analysis for this project.


Figure 1 – Treemap produced using NVivo 12 Mac
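For readers curious about what drives a visualization like this, here is a minimal sketch of drawing a comparable treemap outside NVivo, in Python with matplotlib and the third-party squarify package (pip install matplotlib squarify). To be clear, this is not NVivo’s output or our project’s actual coding structure; the node names and counts below are invented for illustration.

```python
# Minimal treemap sketch: tile area is proportional to coding frequency,
# which is what makes prominent themes stand out at a glance.
import matplotlib.pyplot as plt
import squarify  # third-party treemap layout helper

# Hypothetical coding nodes and the number of excerpts coded to each
nodes = {
    "Search strategies": 42,
    "Evaluating sources": 35,
    "Citation practices": 21,
    "Frustration/anxiety": 18,
    "Topic refinement": 12,
}

labels = [f"{name}\n({count})" for name, count in nodes.items()]

fig, ax = plt.subplots(figsize=(8, 5))
squarify.plot(sizes=list(nodes.values()), label=labels, alpha=0.8, ax=ax)
ax.axis("off")  # a treemap needs no axes
ax.set_title("Coding frequency by node (hypothetical data)")
plt.show()
```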

So far my experience with NVivo has been good, but I clearly have a long way to go before I become a proficient user of this software package. My motivation remains high, as apart from using NVivo in research projects, I would like to be able to answer patron enquiries about the software and possibly even run workshops on its use.

References

Insua, G. M., Lantz, C., & Armstrong, A. (2018). In their own words: Using first-year student research journals to guide information literacy instruction. portal: Libraries and the Academy, 18(1), 141–161. https://doi.org/10.1353/pla.2018.0007

Woods, M., Paulus, T., Atkins, D. P., & Macklin, R. (2016). Advancing qualitative research using qualitative data analysis software (QDAS)? Reviewing potential versus practice in published studies using ATLAS.ti and NVivo, 1994–2013. Social Science Computer Review, 34(5), 597–617. https://doi.org/10.1177/0894439315596311

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Responsible Metrics Movement: Don’t Judge Research by the Package it Comes In!

by DeDe Dawson @dededawson
Science Library, University of Saskatchewan

I often rail against the unsustainability and inequity of the current subscription journal publishing system. We have the technology, the money (if we disinvest from the current system), and the ingenuity to completely re-imagine this system (see Jon Tennant’s recent article – it is short and worth your time!). A new system could be entirely open, inclusive, and democratic: enabling anyone in the world to read and build upon the research. This has the potential to dramatically increase the speed of progress in research as well as its uptake and real-world impact. The return on investment for universities and research funders would be considerable (this is exactly why many funders are adopting open access policies).

So, why is it so hard to get to this ScholComm paradise?

It is a complex system, with many moving parts and vested interests. And getting to my idealistic future is also a huge collective action problem. But I think there’s more going on that holds us back…

Have you ever heard of the analytical technique called The 5 Whys? It is designed to get at the underlying basis of a problem. Basically, you just keep asking “why?” until you reach the root of the issue (this may take more or fewer than five whys, obviously!). Addressing the root of the problem is more effective than pouring time and resources into fixing all the intermediary issues.

I’ve used The 5 Whys numerous times when I’m stewing over this dilemma of inertia in transitioning to a new model of scholarly publishing. I always arrive at the same conclusion. (Before reading on, why don’t you try this and see if you arrive where I always do?)

1st Why: Why is it so hard to transition to a new, more sustainable model of publishing?
Answer: Because the traditional subscription publishers are so powerful; they control so much!

2nd Why: Why are they so powerful?
Answer: Because many researchers insist on publishing in their journals.

3rd Why: Why do they insist on publishing in those journals?
Answer: Because they are addicted to the prestige titles and impact factors of those journals.

4th Why: Why are they addicted to these things?
Answer: Because they feel that their career depends on it.

5th Why: Why do they think that their careers depend on this?
Answer: Hiring & merit committees, tenure & promotion committees, and granting agencies often judge the quality of research based on the prestige (or impact factor) of the journal it is published in.

Of course there are many variations in how to ask and answer these questions. And there are associated problems that emerge as well. But the underlying problem I always arrive back at is the perverse incentive systems in higher education and the “Publish or Perish” mentality.

Ok, so now let’s ask a “How?” question…

If academia’s incentive systems are one of the major factors holding us back from transitioning to a more sustainable publishing system then… How do we change the incentives?

The Responsible Metrics Movement has been growing in recent years. Two statements are fueling this movement:
The San Francisco Declaration on Research Assessment (DORA)
Leiden Manifesto for Research Metrics

Each of these statements advocates for academia to critically examine how it assesses research, and encourages the adoption of responsible metrics (or methods) that judge research on its own merits and not the package it comes in (i.e. the prestige of the journal). DORA focuses primarily on combating the problem of journal-based metrics (the problems with the Journal Impact Factor are well known) and makes a number of suggestions for action by various stakeholders, while the Leiden Manifesto is more comprehensive, with ten principles. See this video for a nice overview of the Leiden Principles.

Evaluating researchers by actually reading their published outputs seems like an obvious solution… until you are on one of those hiring committees (or tenure/promotion/merit committees, or grant adjudication committees, etc.) and faced with a stack of applications, each with a long list of publications for you to read and assess! Instead, Stephen Curry (Chair of the DORA Steering Committee and a passionate advocate in this area) suggests candidates compile a one- or two-page “bio-sketch” highlighting their best outputs and community contributions. I recently came across a research centre that is using just such a method to assess candidates:

“…we prefer applicants to select which papers they feel are their most important and write a short statement explaining why.”


From the Centre for Mechanochemical Cell Biology (CMCB)

DORA is also collecting examples of “Good Practices” like this on their website.

In my experience, many researchers are aware of these problems with journal-level metrics and the over-emphasis on glamour journals. It has even been noted that Nobel Prize winners of the past would not likely succeed in today’s hyper-competitive publish or perish climate. But researchers often feel powerless to change this system. This is why I particularly like the last paragraph of the CMCB blurb above:

“As individuals within CMCB, we argue for its principles during our panel and committee work outside CMCB.”

Researchers are the ones making up these committees assessing candidates! Use your voice during those committee meetings to argue for responsible metrics. Use your voice when your committee is drawing up the criteria by which to assess a candidate. Use your voice during collegial meetings when you are revising your standards for tenure/promotion/merit. You have more power than you realize.

Ingrained traditions in academia don’t change overnight. This is a long game of culture change. Keep using your voice until other voices join you and you wear down those traditions and the culture changes. Maybe in the end we’ll not only have responsible metrics but sustainable, open publishing too!

Recommended Further Reading:

Lawrence, P. A. (2008). Lost in publication: How measurement harms science. Ethics in Science and Environmental Politics, 8(1), 9-11. https://doi.org/10.3354/esep00079

Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314, 498-502. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2126010/

Vanclay, J. K. (2012). Impact factor: Outdated artefact or stepping-stone to journal certification? Scientometrics, 92(2), 211-238. https://doi.org/10.1007/s11192-011-0561-0

P.S. Assessing the actual research instead of the outlet it is published in has implications for the “Predatory Publishing” problem too. Martin Eve and Ernesto Priego wrote a fantastic piece that touches on this:

Eve, M. P., & Priego, E. (2017). Who is Actually Harmed by Predatory Publishers? TripleC: Communication, Capitalism & Critique, 15(2), 755–770. http://www.triple-c.at/index.php/tripleC/article/view/867

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Updating a Collections Assessment Rubric

by Kathleen Reed, Assessment and Data Librarian, Instructor in the Department of Women’s Studies, and VP of the Faculty Association at Vancouver Island University

Last year, I wrote a blog post that mentioned the rubric my place of work (MPOW) uses to assess collections. The rubric is a collaborative document designed by colleagues Jean Blackburn, Dana McFarland, and me. Recently on Twitter a few people mentioned the rubric again, and made some suggestions for additional items to consider. For this blog post, I thought I’d go over some of these suggestions, and discuss the way the document has been used over five years at MPOW.

The 27-point rubric emerged from a recognition that generic data like cost-per-use weren’t sufficient for deciding whether to renew or cancel products. We needed a system for looking at products in a broader information context, and thus the rubric was born. In its first five years, it has proven very valuable to librarians. We use it when products come up for renewal, often grouping “like” databases together into baskets (e.g. the Big Deals basket, the videos basket) for easy comparison and a more holistic overview. Our liaisons also find it useful when talking to faculty about potential cancellations.

We haven’t adapted the rubric much, only adding a “required for program accreditation?” question as our institution’s programs expand. But the broader information context in which the rubric sits has shifted, and some new suggestions make sense. Ryan Reiger proposed that an open access lens would be helpful, considering “OA options for authors, OA percentage, Copyright Override in license, [and] Alternative routes for access.” DeDe Dawson suggested “other friendly license terms such as: support for text & data mining, and no non-disclosure agreements.” As increasing numbers of librarians critique vendors, emphasize open access, and demand transparency, it makes sense to add these suggestions to version 2.0 of the rubric.

Thanks to Ryan and DeDe for sharing their thoughts on the rubric. If you have ideas on how to improve this tool, feel free to leave them in the comments below.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Librarians helping librarians with research (aka, my experience at the CARL Librarians’ Research Institute)

by Jaclyn McLean
Electronic Resources Librarian, University of Saskatchewan

Attending the CARL Librarians’ Research Institute (LRI) at Concordia in June was an exhilarating, exhausting experience that helped me solidify my identity as a practitioner researcher and confirm my plan for this coming year, in which I will compile and submit my tenure case file.

I have many ideas and thoughts that I’m taking away from LRI, and it’s so recent that they’re not very organized yet. So here they are, in no particular order:

• Librarians are awesome and so supportive of each other’s work—this experience was about research, but I’ve seen the same thing when we get together to talk about our practice.
• Working in a beautiful, acoustically thoughtful, comfortable space makes everything else you’re doing easier. The chairs in the renovated Webster Library are the most comfortable and ergonomically thoughtful chairs I’ve encountered at any professional event. I wish I’d taken a picture of them, but I did not (luckily, Concordia has posted some great photos on their transformation website).
• We talked a lot about habits of mind, and I found that a very useful frame for talking about practitioner research, especially the concepts of responding with wonderment and awe and remaining open to continuous learning.
• While we talked about methods and such, I really appreciated the overall framework of the research lifecycle that guided our work (from developing research problems and questions to dissemination and research culture).
• We went beyond qualitative and quantitative and talked about a type of research I’m more familiar with from my background in studying history: conceptual/theoretical. It was pretty amazing to talk about this third kind of research as fitting in with LIS, which I had never thought of before. I had thought that I was more limited now to qualitative/quantitative research as a librarian, with data gathered from participants. It feels like now I’ve maybe got “permission” to go back to my roots and do other kinds of research too!
• Talking about your research with others can help you figure out what to do next, think of a new project idea, or change your direction with a current idea. The opportunity to discuss research with peer mentors and other attendees at LRI was really helpful as I developed a new project and reminded me of the value of talking about research with others.
• The morning keynote from Concordia’s Researcher in Residence, Claire Burrows, was inspiring. She reminded us to question both what surprises us in our research and what doesn’t, and to dig into those areas because they can be fertile ground for future research or a new direction for current work.

I’m sure that I will continue to reflect on my LRI experience over the summer as I tweak my research plan, work on my active projects, and continue to try to find the right balance of research with practice. Opportunities like LRI, which provide dedicated, safe space to think about research and to meet other librarians engaged in and excited about research, are few and far between. If you get the chance to attend, I’d highly recommend it.

Now, I’m off to read one of the books we talked about at LRI, How to Write a Lot by Paul Silvia. If you don’t find yourself with enough time to read it yourself, I stumbled across this excellent post about it on The Thesis Whisperer. I’m also going to think some more about my own research specifically, LIS research generally, and how I can continue to consciously build my habits of mind.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Beyond survey design: take survey data to the next level

by Carolyn Doi
Education and Music Library, University of Saskatchewan

You’ve designed a survey, found the right participants, and waited patiently while the responses came streaming in. The initial look at the responses can be thrilling, but what happens next? I’ve used questionnaires as a data collection technique, and made the mistake of thinking the work is over once the survey closes. Kelley, Clark, Brown, and Sitzia warn us against treating survey research as a method requiring little planning or time:

“Above all, survey research should not be seen as an easy, ‘quick and dirty’ option; such work may adequately fulfil local needs… but will not stand up to academic scrutiny and will not be regarded as having much value as a contribution to knowledge.”1

Let’s consider some steps to explore once data collection has been completed.

1) Data cleaning and analysis
Raw survey data is usually anything but readable, and it takes some work to transform results into meaningful and shareable research findings. Familiarize yourself with some of the relevant terminology before you start working with the data. Then, before touching the dataset, create four worksheets: one for raw data, one for cleaning in progress, one for cleaned data, and one for data analysis. Each worksheet captures a stage in the process, which allows you to backtrack or find errors. If you haven’t taken a stats class recently, I like this introductory Evaluation Toolkit, which clearly describes the processes of cleaning, tabulation, and analysis for both quantitative and qualitative data.
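To make the four-worksheet idea concrete, here is a minimal sketch in Python with pandas (pip install pandas openpyxl). The file name, the “role” column, and the cleaning steps are all invented for illustration; the point is simply that each stage lands on its own sheet so you can always backtrack.

```python
import pandas as pd

# Stage 1: raw data, exactly as exported from the survey tool (hypothetical file)
raw = pd.read_csv("survey_responses.csv")

# Stage 2: cleaning in progress — drop blank rows, normalize a text column
cleaning = raw.dropna(how="all").copy()
cleaning["role"] = cleaning["role"].str.strip().str.lower()

# Stage 3: cleaned data — deduplicate once cleaning is done
cleaned = cleaning.drop_duplicates()

# Stage 4: analysis — e.g. a simple tabulation of responses by role
analysis = cleaned["role"].value_counts().rename("count").to_frame()

# Write each stage to its own worksheet in one workbook
with pd.ExcelWriter("survey_workbook.xlsx") as writer:
    raw.to_excel(writer, sheet_name="1_raw_data", index=False)
    cleaning.to_excel(writer, sheet_name="2_cleaning_in_progress", index=False)
    cleaned.to_excel(writer, sheet_name="3_cleaned_data", index=False)
    analysis.to_excel(writer, sheet_name="4_analysis")
```

Keeping the raw sheet untouched is the safety net: if a cleaning step turns out to be wrong, you can rebuild everything from stage 1.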

2) Visualization and reporting
Consider data visualization to bring your survey data to life, but remember to choose a visualization that makes sense for the data you’re trying to represent. The Data Visualisation Catalogue is a handy tool for learning about the purpose, function, anatomy, and limitations of a wide range of visualizations, and it includes links to software and examples of each one. There are lots of free or inexpensive programs to help create visualizations, including Microsoft Excel, Google Sheets, and Tableau Public. If you’re looking for some inspiration, take a browse through the stunning work of Information is Beautiful for ideas.
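Continuing the hypothetical workbook from the sketch above, a few lines of matplotlib produce the kind of simple summary chart that Excel or Google Sheets would also give you. The sheet and column names are the invented ones from the previous sketch.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Read the analysis worksheet written in the earlier sketch
analysis = pd.read_excel("survey_workbook.xlsx", sheet_name="4_analysis", index_col=0)

# A plain bar chart suits a simple tabulation of counts by category
fig, ax = plt.subplots(figsize=(6, 4))
analysis["count"].plot.bar(ax=ax, color="steelblue")
ax.set_xlabel("Respondent role")
ax.set_ylabel("Number of responses")
ax.set_title("Survey responses by role (hypothetical data)")
fig.tight_layout()
plt.show()
```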

Likely you will want to share the outcomes of your research, either at your institution or in a paper or presentation. Kelley, Clark, Brown, and Sitzia provide a great checklist of information to include when reporting on any survey results, including research purpose, context, how the research was done, methods, results, interpretation, and recommendations.2 Clarity and transparency in the research process will help your audience to better understand and evaluate the research and its applicability to their context.

3) Data preservation and access
Consider an open data repository such as the Dataverse Project to make your data discoverable and accessible. Sharing your data comes with benefits such as “web visibility, academic credit, and increased citation counts.” You may also be required to archive your data to satisfy a data management plan or grant funding requirements, such as those from the Tri-Council. When archiving in a repository, remember to share your data in an accessible file format, and include accompanying files such as a codebook, project description, survey instrument, and outputs such as the associated report or paper. As a rule of thumb, aim to provide enough documentation that another researcher would be able to replicate your study. A dataset is a publication that you can cite in your CV, your ORCID profile, a paper, or a presentation. Doing so is a great way to encourage others to learn about your research or to build on your project.

Getting your hands dirty and working directly with survey data is where you’ll be able to explore and eventually tell a compelling story based on your research. Be curious, persistent, and enjoy the process of research discovery!

1 Kelley, K., Clark, B., Brown, V., & Sitzia, J. (2003). Good practice in the conduct and reporting of survey research. International Journal for Quality in Health Care, 15(3), 261–266. https://doi.org/10.1093/intqhc/mzg031

2 Ibid., p. 265.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Controlling Reuse in Archives: C-EBLIP Journal Club, May 22, 2018

by Stevie Horn
University Archives and Special Collections, University of Saskatchewan

Dryden, Jean. “Just Let It Go? Controlling Reuse of Online Holdings.” Archivaria 77 (Spring 2014): 43–71. https://archivaria.ca/index.php/archivaria/article/view/13486

For the last C-EBLIP journal club meeting before our summer hiatus, I brought to the table an article by copyright expert Jean Dryden discussing the tension between access and control in the online archival realm. My selection of a somewhat dated article stemmed from this very tension. Archivaria is Canada’s premier journal for archival professionals, and I was determined to choose a discussion piece from within its pages. Unfortunately, any issues more recent than 2014 are subscription-locked, so in order to find something that could easily be accessed by the rest of the group, it was necessary to jump back in time a bit.

So here, from the very selection of the article, we run into the issue at hand: the ongoing struggle archives and archivists face in simultaneously making materials available to their researchers, and maintaining some sort of control over those materials. I will note that although this article focuses on online holdings, the same access-control war is also being fought in the day-to-day physical work of the archives. When the job description involves making order out of chaos, control becomes a bit of an obsession.

Pursuing control over archival materials while providing access to them online has led to the creation of a number of unofficial best practices as well as some not-so-good practices that are common across institutions. Dryden highlights several, including the use of low-resolution or watermarked images which force users to contact the institution directly for images of a higher quality; the passive or aggressive application of terms and conditions on the website; and a practice Jason Mazzone termed “copyfraud”, or the claiming of copyright ownership over materials that are actually in the public domain, or whose copyright lies with another entity.

Following an extensive study based on viewing a number of archival websites, collecting surveys, and conducting one-on-one interviews with a number of participants, Dryden concludes that although most institutions employ some means of preventing the copying of online archival holdings, the repository is rarely, in fact, the rights holder. She suggests that archives should use caution in how they employ copyright, or the impression of copyright, to ensure that they do not “present a barrier to online documentary heritage” (Dryden, 43).

The discussion around this article was wide-ranging and enriched by the professional backgrounds of those present. We spent some time speaking about how the often one-of-a-kind nature of archival materials can affect the need for control. As an example, the citation requirements for an archival document may be much more stringent than those applied to a widely published text: a good citation may be the only path (and certainly the easiest path) to finding a given archival resource again. For many institutions, it is here that the pressure to assert some sort of control occurs, and where restrictions may be placed on the mass copying and redistribution of a high-quality image, regardless of whether the image is in the public domain. An improperly cited digital object floating around the internet can be a great source of future headaches. However, should we contradict our own mandates of providing access simply to avoid a headache (or, in the case of institutions that rely on a pay-for-copies model, to make a profit)? That is an ethical dilemma I will not tackle here.

Another interesting element of the discussion was the hypothesis that a better use of technology could more harmoniously provide both access and control over digital objects. For example, a step-by-step process could be applied, walking users through any questions of copyright involving the item they are interested in using. Clickable licenses related to that item could also be present for those looking for more detailed copyright information. Rather than providing a false sense of copyright being held over materials in the public domain in order to maintain control, a downloadable citation option could be offered alongside the image, with a statement as to why a reference back to the institution of origin is important.

Along these user-friendly lines, we also considered whether some of the language and symbolism around Creative Commons licenses could be applied to archival materials. Certainly, Creative Commons is a parlance that is becoming more familiar to many researchers, and applying that language to the materials that archives do hold copyright over may disambiguate some of the current restrictions on and requirements for use. It was suggested that applying a Creative Commons license to a donor’s materials could even be done at the point of acquisition as part of the deed of gift. Although this notion has not taken the archival world by storm, it does seem as though there is some discussion of this sort of synthesis going on from the Creative Commons end of things.

In the end, we came through our discussion of control and access with a few exciting new ideas and, I think, greater understanding all around. I appreciated hearing perspectives from voices outside of the archival world, as sometimes archivists can become so caught up in putting things tidily into boxes that we do not notice that we have boxed ourselves up as well.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

‘Divergent’ Funding Opportunities as Tools for Reflective Research

By Katya MacDonald
Library Research Facilitator, University of Saskatchewan

A few years ago, as a grad student feeling the financial pinch of multiple extended research trips, I stumbled across an informal, seemingly one-off blog post listing funding opportunities across diverse disciplines and regions. I read the whole list, but nothing seemed like a plausible fit. I clicked on one link anyway; it had the word “history” in it. I was working on a history PhD. Close enough?

Not really; the granting agency funded research that dealt with a very specific theme in a very different region that seemed to have little to no bearing on my work. But the society was offering money and it just so happened that I needed money. So, after some hours with coffee shop treats and pieces of paper with a lot of arrows and question marks drawn on them as I tried to articulate a connection to my work, I submitted an application.

To my surprise, I got the grant. But by explaining my work to a non-specialist audience, and by reframing it to suit the (oddly specific) requirements of the granting agency, I also got a revelation about my dissertation that allowed me to be more precise about my process and motivations. Being explicit about these components of my work led me to clearer, more accessible arguments and away from my own initial assumptions. The process of reframing was exactly what my research needed, and in the end I based my entire dissertation on the explanations I developed for the grant application.

I open with this anecdote not because I think anyone really hopes to replicate the experience of being an impoverished grad student! Instead, I want to expand on what this story suggests about serendipity and the broadening of perspective to see the grant as a potential fit. In the remainder of this post, I consider potential ways to “stack the deck” to take advantage of similar opportunities that, because they aren’t readily predictable, probably can’t form the core of a research project, but that can help to clarify or expand it in transformative ways.

To help focus my discussion, I informally canvassed grant announcements in the areas I list below, with librarian research networks in mind and an eye towards funding opportunities that seemed to sit outside the most common or apparent funding channels. I wanted to think more about the extent to which it’s possible and practical to cast a wider net for unexpected opportunities as part of the research process. Here, I’m calling these “divergent” opportunities to reflect the fact that finding them often requires taking a different path than usual.

Where to look for divergent opportunities?

– Research and grant communications in adjacent/cognate disciplines, or in disciplines asking similar methodological/ethical/theoretical/practical questions
– Grants and agencies based in other countries that may offer awards with broader eligibility
– Research listservs (e.g. university-specific, H-Net groups, multidisciplinary and discipline-specific)
– Prizes (e.g. for articles, professional activities, conference papers) – these are sometimes structured around broad themes or questions, rather than specific, discipline-defined topics
– Smaller-scale grants or other opportunities may have more flexible requirements and require less investment of time if they feel like a long shot

How do we know a grant opportunity when we see it? (Or, how to think about research to encompass a broad scope of grant opportunities?)

– Research as a story: main plot points probably support the larger/most relevant funding opportunities, but side plots or incidental moments can branch out to additional, supplemental funding
– Identifying themes, questions, or concerns in common is sometimes easier than identifying a topic in common
– Conceptualizing research in terms of its relevance or novelty to a new or unexplored audience
– Describing projects to new audiences often creates new ways of depicting the importance of the research

Why invest time and effort into applications that might seem random or unlikely?

– Doesn’t have to involve a large investment of time: reframing is a way of gaining access to expanded opportunities – not changing the project, just emphasizing different aspects of it to suit broader or more diverse audiences
– Kickstarting a project that lacks direction or momentum
– Building innovative research connections and conversations
– Impetus and support for engaging ideas that might not otherwise comprise an entire research project
– Can serve as catalysts for expanding awareness/impact/scope of existing research
– Small, divergent grant applications can also become conference papers and/or articles

But is it just too out there?

– Applying for divergent grant opportunities is an exercise in determining the difference between non-negotiable ineligibility vs. finding ways to fit within requirements using novel or unexpected framing
– Innovative thinking and strong, well-justified ideas are nearly always welcome even if they don’t end up being fundable in the context of a specific grant

The question above that especially stands out to me is whether or how to justify the time spent on these kinds of divergent opportunities. In my opening anecdote, the eventual benefits were well worth the investment. But particularly given the unpredictable nature of when these opportunities crop up, that may not always be the case. Is it best to consider them as single-dose antidotes to burnout (certainly also the case in my opening anecdote), rather than as regular features of the research process? Or is there room to keep an eye out for a broader swath of seemingly unrelated opportunities as a matter of habit and as a tool for thoughtful research, just in case they lead to new insights and activities?

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Planning a Library Escape Room

By Gina Brander & Ann Liang
Saskatchewan Polytechnic Library

Stress Better is a semiannual Saskatchewan Polytechnic Library event designed to help students combat stress as they prepare for exams. As part of our 2017 event, the Library piloted an escape room at the Regina campus. Escape rooms are physical adventure games in which active participants solve a series of clues and puzzles to escape a room before the allotted time elapses. Academic libraries have used escape rooms for staff development workshops (Marks, 2017), library orientations (Salisbury & Ung, 2016), and library instruction (Pun, 2017). While our Library considered incorporating an information literacy component into the event, we ultimately decided that a fun, escapist approach would generate more interest and better support the aims of Stress Better. Given that many of our students more or less lock themselves in the library during this period of the semester, the irony of a library escape room was particularly appealing.

Outreach
Since no one on staff knew the first thing about running an escape room, we decided to reach out to local businesses and ask if they would be interested in developing a mini version of one of their rooms. Emails were sent to three local escape rooms, emphasizing the promotional benefits of the partnership and making clear that we would not offer remuneration. To our delight, all three rooms expressed interest in assisting with the escape room. The Library chose to partner with a local, family-run business that had received favourable reviews online, and which had offered to design a new room based on our individual needs.

Planning
Over the next week, the escape room co-owner and a librarian selected an appropriate space (a small, windowless study room), determined the length of gameplay (15 minutes), and discussed potential storylines based on available library furniture and props (weeded books, filing cabinets, whiteboards, wall-hangings, etc.). The co-owner developed the design and flow of the room. Then, a week before the event, the room was closed to allow for setup and testing. A script and reset list were prepared for the Library, and a faculty group was invited to trial the room before the official launch.

Promotion
A variety of promotional methods were used. Due to the high foot traffic at the Regina campus, a chalkboard near the entrance and our frontline staff were the most effective channels of promotion. Participants returned to the library throughout the week to inquire about the best time, which inspired us to promote a ‘time to beat’ on print posters and social media. This added competitive element kept the momentum going, and as the week progressed, we opened additional time slots at the request of students and faculty/staff.

Run & Reset
A staff member greeted each group and led them to the escape room. After laying down ground rules and introducing the scenario, they closed the (unlocked) door and set the clock. Each group was granted three clues, which could be requested via walkie-talkie. After the allotted time had elapsed (or after the group had ‘escaped’), the staff member returned to the room to debrief and answer questions about unsolved puzzles. After a brief photo shoot, the group was led out and the room was reset.

Takeaways
The escape room was successful, with 25 students and 14 faculty/staff taking part over the course of five days. In our post-Stress Better student survey, escape rooms were among the top three requested offerings at our next event. Based on our experiences, we can offer the following takeaways to libraries considering hosting a similar event:

• Faculty/staff want to participate! Consider leveraging an escape room for professional development, or as a method of raising employee awareness about issues like copyright.
• Team up with a local escape room for your first event. Utilizing the design expertise and props of an established escape room eliminates material costs and significantly reduces the amount of time required to plan the event.
• Elect one staff member to coordinate groups, send email reminders, and reset the room. Expect that they will have their hands full throughout the event.
• Pilot a room with a team of library staff to ensure participants have enough time to find their bearings, break through at least half of the puzzles, and team build.
• Host a ‘just for fun’ escape room with a competitive element to bring new students into the library and keep them coming back.
• Keep the momentum going by posting group photos and a ‘time to beat’ on social media.
• Utilize an escape room to offer splashy, dynamic programming while also meeting the needs of students who use the library as a quiet study space.
• Expand your programming without expanding your budget by emphasizing the promotional benefits of partnering with your library.

Photo by Gina Brander

References

Marks, G. (2017). Escape Room! In the Library [PDF Document]. NJLA Annual Conference Poster Session. Retrieved from http://hdl.handle.net/20.500.12164/64

Pun, R. (2017). Hacking the research library: Wikipedia, Trump, and information literacy in the escape room at Fresno State. The Library Quarterly, 87(4), 330-336. Retrieved from https://www.journals.uchicago.edu/doi/abs/10.1086/693489

Salisbury, F., & Ung, E. (2016). Can you escape the library escape room? Incite, 37(5/6), 24-25. Retrieved from https://search.informit.org/browseJournalTitle;res=IELHSS;issn=0158-0876

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Student Research Assistants in Library and Information Studies Research

by Cara Bradley, Teaching and Learning Librarian
University of Regina Library

Student research assistants (RAs) play an important (and often unsung) role in the conduct of academic research. I imagine that many of you, like me, have both been a research assistant yourself (while completing a degree) and also hired student research assistants to help with your own projects.

I’ve been thinking a lot about student research assistants lately. This reflection has been prompted by my recent experience:

– applying for a Tri-Agency Grant*, an application process that emphasizes the “development of talent” and HQP (Highly Qualified Personnel)

and

– hiring and supervising a student research assistant

To be quite honest, I feel like I’ve done a “good-ish” job at these two endeavours, but definitely not a great job. I’ve been trying to figure out why, and to learn what I can do to improve in the future.

As I think this through, I’ve been struck by the somewhat unique position of librarians seeking to hire students to assist with their research. Faculty in the disciplines have access to a pool of potential applicants who have studied in their field, and can usually draw a clear line between the student research assistant’s experience and the development of HQP. Unless you work at one of the few Canadian universities with an MLIS (or equivalent) program, you do not have ready access to students with an interest and/or background in LIS, and the line between the student’s experience and HQP can seem more difficult to draw.

Further reading has led me to the conclusion (a conclusion unfortunately reached after I submitted my grant application) that I’ve been too limited in my thinking about “development of talent.” Rather than stressing about how to create mini-librarians out of those who have no desire to be such, I need to think more broadly about the experience and training that I can provide to student research assistants. Extensive navigation of the labyrinthine Tri-Agency web sites eventually led me to the (well-hidden) Guidelines for Effective Research Training, in which SSHRC asserts that research training should “build both academic (research and teaching) competencies and general professional skills, including knowledge mobilization, that would be transferable to a variety of settings.” The site goes on to list some of these “valuable skills”:

• research methods and theories;
• publication and research communication;
• knowledge mobilization and dissemination;
• teaching in diverse settings and with various technologies;
• digital literacy;
• data management and analysis;
• research ethics;
• interdisciplinary research;
• consultation and community engagement;
• project and human resources management;
• leadership and teamwork; and/or
• workshops and conferences.

Hey, wait a minute! Those are exactly the kinds of skills that my grant-funded student research assistant would develop. This was a light-bulb moment for me. Although the grant application necessarily focuses on the details and minutiae of the proposed research project, I need to take a step back from this when describing the kinds of transferable skills that students would gain through working on my project. This insight will also help me to better engage and communicate with my research assistants, supporting them to realize and articulate their experience in ways that will benefit them in future research and employment environments.

I’ve also benefited from reading some of the literature around faculty-student mentoring relationships, as I’ve found that this relationship more closely reflects what I hope to offer student research assistants. In particular, Lechuga’s conceptualization of faculty as “allies, ambassadors, and master-teachers” strikes a chord with me. He writes that the faculty he studied served as

allies to their students and took a supportive approach in working with them. Participants were apt to focus on the specific individual needs of their graduate students, either academically or otherwise. This finding is in line with other research on faculty-student relationships that has demonstrated the importance of providing personal support through formal and informal interactions

He goes on to describe another faculty role as that of “ambassador”:

In their role as agents of socialization, faculty served as ambassadors of the profession by imbuing students with a sense of professional responsibility and introducing them into the culture of academe.

Lechuga’s research on the faculty/student relationship has inspired me to expand my understanding of how I can support the growth and development of my student research assistants.

Now let’s hope that grant comes through!

* For those outside of Canada, the Tri-Agencies include the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council of Canada (NSERC), and the Social Sciences and Humanities Research Council of Canada (SSHRC); they are the major government funders of research in Canada.

Reference
Lechuga, V. M. (2011). Faculty-graduate student mentoring relationships: Mentors’ perceived roles and responsibilities. Higher Education, 62(6), 757-771.

Doing Research as a Procrastinator

by Kristin Hoffmann, University of Western Ontario

I procrastinate.

I procrastinate with my research, and in many other aspects of my work. For example, I started writing this post at 1:43pm the day before it was due. At the same time, I needed to work on a mostly-unfinished presentation for a 40-minute workshop that I was delivering the next morning, and I hadn’t yet started compiling the data for a report that was due to colleagues at the end of the week.

I get things done, but I often do them at the last minute.

I used to berate myself all the time and feel very, very bad about my tendency to procrastinate. Then I heard a podcast with Mary Lamia, author of What Motivates Getting Things Done: Procrastination, Emotions and Success, and now I’m starting to re-frame my procrastination in a way that is helpful, not shameful:

I’m motivated by deadlines.

Here is Mary Lamia’s definition of procrastinators:

“People who are primarily motivated to complete tasks when their emotions are activated by an imminent deadline. They are deadline driven.”

The idea is that our emotions are what motivate us to get things done. For some people, the emotions that motivate them come from having a task to do and wanting to complete it. For people who procrastinate, the emotions that motivate us come from deadlines.

Other characteristics of people who procrastinate include:

• Being energized and getting increased focus as a deadline gets closer,
• Feeling like they lack motivation and concentration when they try to get something done ahead of time,
• Having ideas percolating in the background, which come together as the deadline approaches.

I can see all of these in myself, and it’s been quite a revelation for me in thinking about my approach to research. In the last six months, a colleague and I have taken a research project from idea to ethics application to data gathering to analysis—thanks, in large part, to the motivation brought on by deadlines. I’ve had other research ideas and papers in various stages, but I haven’t touched any of them for months; they haven’t had deadlines.

I’ve often talked with other researchers about the benefit of having external deadlines, such as conference presentations or submission deadlines. I’m realizing that my particular challenge is to figure out how to reproduce the emotional motivation of deadlines when an external due date doesn’t exist. I don’t need to feel bad about procrastinating; I just need to accept that I’m motivated by deadlines.

References and Further Reading

Lamia, Mary. What Motivates Getting Things Done: Procrastination, Emotions and Success. Rowman & Littlefield, 2017.

Blog posts by Mary Lamia at Psychology Today:
• Getting Things Done, Procrastinating or Not, https://www.psychologytoday.com/us/blog/intense-emotions-and-strong-feelings/201703/getting-things-done-procrastinating-or-not
• The Secret Life of Procrastinators and the Stigma of Delay, https://www.psychologytoday.com/us/blog/intense-emotions-and-strong-feelings/201708/the-secret-life-procrastinators-and-the-stigma
• How Procrastinators Get Things Done, https://www.psychologytoday.com/us/blog/intense-emotions-and-strong-feelings/201709/how-procrastinators-get-things-done
• Why You Should Hire a Procrastinator https://www.psychologytoday.com/us/blog/intense-emotions-and-strong-feelings/201712/why-you-should-hire-procrastinator

Podcasts:
• CBC Tapestry, Procrastination 101, aired November 26, 2017, available at http://www.cbc.ca/radio/tapestry/procrastination-101-1.4416658
• Success.com, Ep. 85: What Type of Procrastinator Are You?, aired October 17, 2017, available at https://www.success.com/podcast/ep-85-what-type-of-procrastinator-are-you

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.