Leaning In or Leaning Back?

by Marjorie Mitchell
Research Librarian, UBC Okanagan Library

With apologies to Sheryl Sandberg.

I am going to admit it here first – I’m going through a bit of a dry spell with my research. Actually, it’s not the research – that feels like it’s making some forward movement for the time being. No, the dry spell I’m experiencing has more to do with disseminating my research results and getting my findings “out there” than with the “research” per se. I’ve recently had proposals for two conference presentations (one traditional presentation and one poster) rejected. Now I’m in a bit of a quandary. I’m trying to read the message of these two rejections to determine whether I should continue this line of research or not. Do I continue and hope the results of my research will be more convincing and compelling nearer completion (leaning in), or is this a good time to adjust the focus of my research (leaning back)?

Research is a funny thing. While many of us conduct research for reasons like “contributing to the profession” or “out of curiosity” or “because I’m required to do research for tenure and/or promotion”, few of us spend enough time determining whether our topic meets the criterion Hollister (2013) called “noteworthy”. Basically, he is pointing out the importance of saying something new, or of using an existing theory or method in a new and unique way.

I would take this a step further and say that a topic also needs to be timely. If your topic has already been written about and presented on many times, it might be that the topic has become stale, even if you have found something new to add to the knowledge about the topic. Another contribution to a topic that has occupied our professional attention for some time just isn’t as appealing as something newer. There is also the problem of being too new. There are some topics and ideas that are just a bit too far ahead of the crowd and won’t be accepted in the current round of conferences and upcoming journals.

Some ideas are just ahead of what the profession is ready to be discussing at any given time. No matter how well composed, researched and executed, an idea that is ahead of its time will fall on deaf ears. You may have had the experience of coming up with a topic and pitching it, only to see it presented by someone else two years later at your favorite conference. There is no quick or easy solution to this. You can only console yourself with a hot cup of tea, secure in the knowledge that you had that idea first.

I think one solution to the issue of timeliness is to develop a certain passionate detachment toward the research you’re doing. Research needs a certain amount of objectivity, but I truly believe it also needs passion and enthusiasm to carry it forward. I’ve come to recognize, however, that I also require a certain amount of detachment, particularly at the conclusion of my research, to allow me to withstand the rejections my ideas sometimes receive.

Sometimes it is worthwhile to step back from the research, particularly after a rejection, and honestly weigh whether the research is still worth pursuing and finishing. It may be your great idea is just a little too late. For now, I’m going to take my proposals to a colleague and get a second, less biased, look at them before I make any decisions. So, before I lean in any direction, I’m going to lean on a friend for advice. I don’t think Sheryl mentioned that kind of leaning.

References

Hollister, C. V. (2013). Handbook of academic writing for librarians. Chicago: Association of College & Research Libraries.


This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Locating the Local: A Literature Review and Analysis of Local Music Collections

by Veronica Kmiech, BMUHON, College of Education, University of Saskatchewan

This work is part of a larger research project titled “Local Music Collections” led by Music Librarian Carolyn Doi and funded by the University of Saskatchewan President’s SSHRC research fund. A post from Carolyn’s perspective on managing this project will be published in 2017 on the C-EBLIP Blog.

Introduction

Research is to see what everybody else has seen, and to think what nobody else has thought.
– Albert Szent-Gyorgyi1

At this point in my university career, I have written several research papers, most of which were for the musicology courses I took as part of my music degree. This research gave me familiarity with the library catalogue, online databases for musicological articles, interlibrary loan, and contacting European collections to request material (this last one involved an interesting 4 a.m. phone call). As a Research Assistant, my background was helpful, but I found the depth of searching needed for the literature review much greater than anything I had done before.

My role as a research assistant for music librarian Carolyn Doi involved searching for sources, screening those sources based on their relevance to the project, and using NVivo software to identify themes in the literature.

Aim

The aim of the literature review was to find sources that discuss local music collections, especially those held in libraries. Based on these results, a survey to gather information on current practices for managing local music collections is under development.

It was important to find as many sources as possible, across a wide range of geographic areas and collection types, although the majority came from North America. Reading sources from all over the world that discuss collections in a range of settings (e.g. libraries, churches, privately held collections, etc.) increased my understanding of the contexts that exist for local music collections.

Methodology

One of the most important parts of the literature review was to find as many items relating to local music collections as possible – or, in other words, FIND ALL THE SOURCES!
[“Find all the sources” meme2]
There were thirteen sources that became a jumping-off point, providing guidelines for how to focus the literature review. From here, I searched for literature in a variety of locations including USearch, Google Scholar, Library and Information Studies (LIS) databases, music databases, education databases, newspaper databases, humanities databases, and a database for dissertations and theses.

As a music student, I was familiar with the library catalogue and databases such as JSTOR. However, I was not familiar with the LIS or education databases. There were a variety of articles from journals, books, and newspapers that described different types and aspects of local music collections. One point of interest was the range of collection types, from academic and public libraries to private and government archives. Most of the sources were case studies, which discussed the challenges and successes of a particular collection.

Other sources of information were print works from the University of Saskatchewan Library and Interlibrary Loan, conference abstracts and listserv conversations from the International Association of Music Libraries, Archives and Documentation Centres (IAML), the Canadian Association of Music Libraries, Archives and Documentation Centres (CAML), and the Music Library Association (MLA).

After completing the search, 408 unique results were saved. Although many of the same sources appeared in different search locations, Figure 1 shows where documents were first located.

The majority of the sources came from North America and Europe. It is worth noting that this may be a result of the databases searched, rather than an indication of absence of local music collections and the study of such in other parts of the globe.

Figure 1: Pie chart showing all 408 saved documents based on search location.3

Challenges & Limitations

The common challenge, regardless of the database being searched, was identifying search terms that would surface relevant sources. It was important when searching in places like Google Scholar, JSTOR, and USearch to narrow the parameters considerably; otherwise one would obtain thousands of hits. Full-text searches, for example, were not helpful.
[“One does not simply…” meme4]
By comparison, some of the LIS databases and ERIC, an education database, required only a keyword or two to find all of the information relevant to local music collections that they contained.

Results

Figure 2: Geographical distribution of sources in the literature review.5

I saved 408 sources to Mendeley. These consisted primarily of journal articles describing case studies from a variety of international locations. Three hundred and sixty of the sources came from North America and Europe, with the complete breakdown by continent shown in Figure 2. Since we were more interested in research from North America, it is worth noting that 123 of the 201 North American sources are from the United States, 73 are Canadian, and 5 are from other countries such as Jamaica.
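The original charts were built with the meta-chart.com tool cited below, but the same breakdown can be scripted. The following is a minimal matplotlib sketch using only the counts quoted above (the Europe and “other continents” figures are derived by subtraction: 360 − 201 and 408 − 360); it is an illustration, not part of the project’s actual workflow.

import matplotlib.pyplot as plt

# Counts taken (or derived by subtraction) from the figures quoted above.
counts = {
    "North America": 201,      # 123 US + 73 Canada + 5 other (e.g. Jamaica)
    "Europe": 159,             # derived: 360 (NA + Europe) minus 201
    "Other continents": 48,    # derived: 408 total minus 360
}

plt.pie(list(counts.values()), labels=list(counts.keys()), autopct="%1.0f%%")
plt.title("Geographical distribution of the 408 saved sources")
plt.savefig("figure2_geographical_distribution.png", dpi=150, bbox_inches="tight")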

After screening, 59 documents were selected for NVivo content analysis. Documents were included if they spoke directly to the management of local music collections in public institutions. Documents were excluded if they were less relevant to the research topic (for instance, those describing private collections) or if they only provided general context (for example, a resource on developing sound collections in a library).

Conclusions

For me, completing this literature review was a little bit like a treasure hunt – what could I do to find more information? Where else can I look? This process took me to locations for research that I did not even know existed, like the IAML listserv. And, after accidentally emailing every music librarian on the planet while trying to figure out how to work the thing, I was able to add a new researching tool to my repertoire.

In conclusion, the literature review served as a means of finding sources to analyze. However, it provided more than just a list of articles. The completed review, although global in scope, created a picture centered on North America, which has been an enormous help in understanding the research topic. The search for documents also made it possible to see how best to approach the analysis, based on what work has already been accomplished and what work still needs to be done in this field.

1Szent-Gyorgyi, Albert. BrainyQuote. “Albert Szent-Gyorgyi Quotes.” Accessed July 22, 2016. http://www.brainyquote.com/quotes/authors/a/albert_szentgyorgyi.html
2Imgflip. “Meme Generator.” Accessed May 30, 2016. https://imgflip.com/memegenerator
3Meta-chart. “Create a Pie Chart.” Accessed September 17, 2016. https://www.meta-chart.com/pie
4“Meme Generator.”
5“Create a Pie Chart.”

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Building a positive culture around practitioner research, one symposium at a time

by Christine Neilson
Neil John MacLean Health Sciences Library
Centre for Healthcare Innovation
University of Manitoba

This fall I attended my first C-EBLIP symposium, and it was fantastic. The day was filled with interesting presentations; I had a chance to see old colleagues and meet new people who share an interest in library research; and they gave me bacon for breakfast, which is always a win as far as I’m concerned. Two recurring themes during the day were 1) leading by example, and 2) the personal aspects of doing research (such as dealing with research projects that go off the rails, professional vulnerability, and the dreaded “imposter syndrome”). Both of these themes are important: the first as a call to action, the second as an acknowledgement that research isn’t necessarily easy, but that none of us is truly alone and there are things we can do to cope.

Acknowledging and exploring the personal issues that come with conducting research is not something that we tend to talk about. I might tell a trusted colleague that sometimes I’m afraid others will see me as the researcher equivalent of the Allstate DIY-er – all of the enthusiasm and optimism, but none of the skill or ability – but generally, we limit our “official” professional discussion to less sensitive topics. Maybe that’s because we don’t want to admit that there might be any issues. Or maybe it’s because there’s a risk the discussion could degenerate into a pity-party that doesn’t move anyone or anything forward. Either way, I think that this is a topic area that needs to be explored in a constructive way.

The C-EBLIP Symposium was a venue that genuinely felt safe to talk about research and the experience of doing research, and I’m thankful I was able to attend. I’m particularly happy that this year’s presenters will have an opportunity to publish about their presentations in an upcoming issue of Evidence Based Library and Information Practice journal. It’s a great opportunity for presenters to share their research, ideas, and experiences with a wider audience, and it will help ensure that content from the day doesn’t disappear into the ether. Building a culture with certain desired qualities is extremely difficult. I’m encouraged that C-EBLIP is building a positive, supportive culture of practitioner research in librarianship and I hope the momentum continues!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The “Why?” behind the research

by Andrea Miller-Nesbitt
Liaison Librarian, Schulich Library of Physical Sciences, Life Sciences and Engineering, McGill University

Lorie Kloda
Associate University Librarian, Planning & Community Relations, Concordia University

Megan Fitzgibbons
Innovation Librarian, Centre for Education Futures, University of Western Australia

When reading journal articles reporting original research, the content usually follows the IMRAD format: introduction, methods, results, and discussion. Word count, author guidelines, and other conventions mean that the researchers’ motivation for conducting the study is often left out. In this post we present our motivations for conducting a research study on librarians’ participation in journal clubs:

Fitzgibbons, M., Kloda, L., & Miller-Nesbitt, A. (pre-print). Exploring the value of academic librarians’ participation in journal clubs. College & Research Libraries. http://crl.acrl.org/content/early/2016/08/22/crl16-965.abstract

Being an evidence-based practitioner can sometimes involve a bit of navel-gazing. Beyond using evidence in our professional work (e.g., for decision-making, evaluating initiatives, etc.), we may likewise ask questions about the outcomes of our own professional development choices.

After three years of facilitating the McGill Library Journal Club, we began to think about ways we could disseminate our experience and lessons learned, and most importantly, how we could determine librarians’ perceived outcomes of participating in a journal club. We felt anecdotally that participating in a journal club is worthwhile, but we wondered: Can we formally investigate the impacts of participation on librarians’ practice and knowledge? What evidence can we find to inform the decisions and approaches of librarians and administrators in supporting or managing a journal club? Is there a connection between journal clubs and evidence-based librarianship? We also wanted to learn more about approaches taken in a variety of journal clubs and how they define success for their group.

The McGill Library Journal Club was initially established in order to help foster evidence-based practice by reflecting on the library and information studies literature and using those reflections to inform practice. The journal club also provides a professional development opportunity for all those interested. Although the McGill Library Journal Club has experienced many of the same challenges as other journal clubs, it is still going strong after 6 years thanks to a core group of motivated facilitators. (For more information about the journal club’s activities, see the McGill Library Journal Club wiki.)

In order to answer these questions, we first had to agree on a definition of a journal club. After some reading and deliberation, we framed participation in a journal club as an informal learning activity: learning that occurs outside classrooms or training sessions, but still involves some coordination and structure. In this context, our research question was: “What do librarians perceive as the value of participating in a journal club?” We focused on academic librarians who participate in journal clubs to manage the scope of the study, but a similar approach could be taken in other library and information organizations as well.

Because we were interested in gaining insight into individuals’ experiences, we considered several methods and ultimately selected an in-depth qualitative method, the hermeneutic dialectic process (Guba & Lincoln, 1989). This is a method we have seen used in the social sciences for the purpose of evaluation and for reconciling diverse perspectives. At the time we were coming up with our research question, one of the authors (Lorie) was an assessment librarian and interested in qualitative methods. She brought Guba and Lincoln’s writing to the team for discussion. It seemed both appropriate for answering our research question and flexible enough to let us really capture study participants’ experiences – not just what we expected to hear. We believe that this is the first use of this method in LIS research, so an additional motivation for the study was to apply the approach in the field.

As per the method, we conducted semi-structured in-depth interviews with each participant. After the first interview, central themes, concepts, ideas, values, concerns and issues that arose in the discussion were written into an initial “construction” which captured the experiences and perceptions expressed by the interviewee. Then in the second interview, the participant was asked to react to some of the points brought up by the first interviewee, as expressed in the construction. The construction was added to after each interview, incorporating the perspectives of each successive interviewee and used to inform the subsequent interviews. At the end, all participants were given the opportunity to comment on the final construction and let us know whether their perspectives were accurately represented.

Ultimately, we believe that the findings of our published study are of interest to librarians who aim to create and sustain a journal club. In particular, the study could offer insight as they form goals for their group and justify the activity, including when seeking support and recognition for it.

More details about the impacts of academic librarians’ participation in journal clubs are of course presented in the article. In addition, in collaboration with the C-EBLIP Research Network, we hope to compile additional resources about journal club practices in librarianship and open communication channels in the future. Watch this space, and please get in touch if you have any ideas about promoting journal clubs for academic librarians.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Learning to Let Go: The Perfectionist’s Struggle

by Laura Thorne
UBC Okanagan Library

Striving for excellence motivates you; striving for perfection is demoralizing.
– Dr. Harriet Braiker

Last week, I attended C-EBLIP’s third annual Fall Symposium. There were so many great presentations, but there were two in particular that I kept thinking about in the days following – Angie Gerrard’s Changing Your Research Plan En-Route and The Elephant in the Room: Imposter Syndrome and Librarian Researchers by Jaclyn McLean. Both presentations tackled often-encountered, but rarely discussed topics that come up when conducting research – our emotions and personalities. Gerrard discussed the emotional challenge associated with research not going according to plan and the need for professional vulnerability, while McLean discussed imposter syndrome and feeling like you’re not good enough, even when you’ve accomplished so much. They led me to think about a related issue that I’ve struggled with in my career and while doing research – perfectionism.

Like many librarians I know, I am a perfectionist. Perfectionism can be an excellent trait. It can lead to high quality work and can motivate me to always strive to do my best. But it can also be challenging. While I wouldn’t diagnose myself with atelophobia, at times my perfectionism has been paralyzing and has prevented me from taking risks, trying new things, or even completing what I’ve started. There are drawbacks to thinking everything you do needs to be perfect or the best.

Studies show that perfectionism is rampant in academia (Charbonneau, 2011; Dunn, Whelton & Sharpe, 2006; Rockquemore, 2012; Sherry, Hewitt, Sherry, Flett & Graham, 2010; Shives, 2014) and is something many of our students also struggle with while at university (Çapan, 2010; Eum & Rice, 2011; Jiao & Onwuegbuzie, 1998). While knowing I’m not alone is of some comfort, one of the biggest professional struggles I’ve had to overcome is learning to let go of projects, especially my writing.

You could say my entire career thus far has been an experiment in letting go, in telling myself, “It’s good enough, just send it out,” but nowhere has this needed to be repeated as much as in my research. For the most part, my research is not something I have to do; it’s something I want to do. I do it largely outside of my regular everyday work, and it is truly a labour of love. Because of this, there tend not to be set deadlines (I attempt to set them for myself, but I’m not the strictest timekeeper), and I either a) procrastinate or b) agonize over tiny details instead of just getting it over with and letting it go.

Some of the tricks I’ve found useful in combatting my perfectionism and learning to let go:
• Embrace the mantra of good enough: This isn’t to say do the bare minimum, but accepting that perfection is unattainable and realizing that a finished project is a good project makes it easier to make progress on your research.
• Fight the urge to procrastinate: For me personally, it’s easy to procrastinate – it gives you an out for why something isn’t perfect. But this only exacerbates the problem.
• Set deadlines for yourself (and stick to them): This helps with the procrastination!
• Don’t go alone: Having a research partner or team has been incredibly helpful in learning to let go and can work as a support system when you’re obsessing about the details and unable to see the bigger picture.
• Love the draft: By completing drafts of my work, whether it be a research proposal or an article, I can slowly get used to the idea of letting go of my work in a staged process before sending it out into the world.
• Develop a network you trust: When you’re unsure of or fighting with a project, it’s useful to have a network of people you can talk to and receive feedback. It makes it easier to let go of a project when I know someone I respect thinks it’s good. And I do the same for them!
• Don’t re-read after you’ve submitted your work: This should go without saying, but unfortunately, I had an awful habit of re-reading an item right after hitting submit or send. As I read through, I would think “I should have changed this or that” and make myself feel dreadful instead of happy that I had finished. It’s an exercise in torture, and since I’ve stopped, I feel much less critical of the work I’ve done and can actually celebrate a job well done.

“Too many people spend too much time trying to perfect something before they actually do it. Instead of waiting for perfection, run with what you got, and fix it along the way.”
– Paul Arden

References (and further reading)

Çapan, B. E. (2010). Relationship among perfectionism, academic procrastination and life satisfaction of university students. Procedia-Social and Behavioral Sciences, 5, 1665-1671.

Charbonneau, L. (2011). Perfectionist professors have lower research productivity. University Affairs. Retrieved from http://www.universityaffairs.ca/news/news-article/perfectionist-professors-have-lower-research-productivity/

Dunn, J. C., Whelton, W. J., & Sharpe, D. (2006). Maladaptive perfectionism, hassles, coping, and psychological distress in university professors. Journal of Counselling Psychology, 53(4), 511.

Eum, K., & Rice, K. G. (2011). Test anxiety, perfectionism, goal orientation, and academic performance. Anxiety, Stress & Coping, 24(2), 167-178.

Hibner, H. (2016, Jan 19). Don’t overthink it: How librarians can conquer perfectionism with mindfulness. [Web log post]. Retrieved from https://librarylostfound.com/2016/01/19/dont-overthink-it-how-librarians-can-conquer-perfectionism-with-mindfulness/

Jiao, Q. G., & Onwuegbuzie, A. J. (1998). Perfectionism and library anxiety among graduate students. The Journal of Academic Librarianship, 24(5), 365-371.

Rockquemore, K. (2012). Overcoming academic perfectionism. [Web log series]. Retrieved from https://www.insidehighered.com/career-advice/overcoming-academic-perfectionism

Sherry, S. B., Hewitt, P. L., Sherry, D. L., Flett, G. L., & Graham, A. R. (2010). Perfectionism dimensions and research productivity in psychology professors: Implications for understanding the (mal)adaptiveness of perfectionism. Canadian Journal of Behavioural Science, 42(4), 273-283. doi:10.1037/a0020466

Shives, K. (2014, Nov 11). The battle between perfectionism and productivity. [Web log post]. Retrieved from https://www.insidehighered.com/blogs/gradhacker/battle-between-perfectionism-and-productivity

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Non-Attachment as an Antidote for Procrastination

by DeDe Dawson
Science Library, University of Saskatchewan

I have a weakness for popular psychology books and I’m a wee bit of a cynic, too – so when I came across Oliver Burkeman’s book The Antidote: Happiness for People Who Can’t Stand Positive Thinking I knew it would be good. And it is good, very good: entertainingly written and thought-provoking. After returning my borrowed library copy I actually went out and bought my own copy and re-read it! Now that is an endorsement.

But what does this have to do with research?

I wasn’t expecting to find advice to apply to my research in this book – but sometimes when you least expect it the most useful nugget of wisdom lands on your lap!

Each chapter in the book explores a different counter-intuitive route to happiness. In chapter three, “The Storm before the Calm”, Burkeman discusses the Buddhist philosophy of non-attachment. Essentially, Buddhists believe that the root of all suffering is attachment. It is a very human and understandable tendency to cling to things we like and avoid things we don’t. Both of these tendencies can be considered attachments though. The examples Burkeman uses are:

“Develop a strong attachment to your good looks – as opposed to merely enjoying them while they last – and you will suffer when they fade, as they inevitably will; develop a strong attachment to your luxurious lifestyle, and your life may become an unhappy, fearful struggle to keep things that way.” (Burkeman, 2012, p. 53)

So, the Buddhist approach to life is to practice non-attachment: to be non-judgmentally aware of these feelings and impulses but not get hung up on them. Once we stop struggling to be positive and happy then we might actually experience some peace! Counter-intuitive… but compelling.

And now the connection to research… Virginia Wilson wrote candidly in this blog a few weeks ago about her struggles with procrastination – a common curse of academics when they get to the “write-up” portion of a research project. We’ve all heard the inspirational quotes, the motivational tips, and other well-meaning advice. Burkeman states that most of these tips and tricks don’t work simply because they are more about putting you in the mood to get things done, instead of how to actually get things done.

It turns out that non-attachment can be a practical way to get things done.

If you wait until you’re in the right mood to get things done… then you’ll never get things done:

“Who says you need to wait until you ‘feel like’ doing something in order to start doing it? The problem, from this perspective, isn’t that you don’t feel motivated; it’s that you imagine you need to feel motivated. If you can regard your thoughts and emotions about whatever you’re procrastinating on as passing weather, you’ll realise that your reluctance about working isn’t something that needs to be eradicated or transformed into positivity. You can coexist with it. You can note the procrastinatory feelings and act anyway.” (Burkeman, 2012, p. 69)

The emphasis in the above quote is mine. These last two sentences are the ones I underlined and starred in the book (my copy – not the library copy!). After this passage Burkeman goes on to describe the daily rituals of some highly productive and famous writers – they rarely include techniques meant to inspire or motivate; instead, they are routines that provide structure whether or not the writer happens to feel motivated at the time.

An aside: My artist husband claims he needs to be inspired to paint – and guess what? He doesn’t get much painting done. I always tell him: “Just sit down and paint, the inspiration will come!”

So, this is my advice to myself and my fellow procrastinating writers: recognize that you don’t feel like it… then just sit down and write. It is very similar to Lorie’s advice to Virginia: “Just do it!”

Oh, and read this book:
Burkeman, O. (2012). The antidote: Happiness for people who can’t stand positive thinking. New York: Faber and Faber, Inc.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Considering collaborations

by Margy MacMillan
Mount Royal University Library

Most of my work in Library and Information Practice involves other people so it’s not surprising that working on building and using an evidence base for this work has brought me into close collaboration with people across the library, the campus, and global libraryland*. Reflecting on these experiences has illuminated some patterns and common factors in positive collaborations as well as some aspects that require attention at the beginning to ensure everyone stays friendly at the end.

One of the most important things is to align conceptions of realistic timelines, milestones and deadlines. In one group I worked with, this was evident even in small things – if we said we’d meet in the lobby at 7:30 to catch a shuttle to a conference, we were all there by 7:00. This congruence happened naturally among us, but is something that most groups I’ve been part of have had to work out. While the set dates of publications and presentations can be helpful motivators, developing a schedule that all collaborators are comfortable with should be part of the early planning stages.

Related to the question of time is motivation. Understanding why your collaborators are interested in the project and how it fits into their lives can help determine feasible timelines. If one partner needs to analyse data as part of planning for a new service and another sees the potential of this analysis to inform wider work through publication, the partners will have to accept different commitment and energy levels for different parts of the project. In situations like these, colleagues and I have often taken the lead at different stages: gathering, initial analysis, submission and write-up. While we all contributed to these stages, leading different parts was an effective way to align aspects of the projects with our skills and motivations, and ensured that no one felt overburdened.

A crucial aspect in both of these collaborations was that we trusted each other to do the work. That trust was built on frank discussions of available time and competing priorities, acknowledgements of each other’s expertise, and a shared understanding of tasks and expectations. Looking back, those have been key factors in all of the successful collaborations I’ve been a part of.

Nancy Chick, Caitlin McClurg, and the author, collaborating on a cross-disciplinary project.

Openness to others’ expertise is, of course, critical when you are working across disciplinary boundaries. Your partner may be more comfortable in a different research methodology, or simply a different citation style, and developing a shared language around the project is critical. Disciplines bring distinct terminologies and conventions around knowledge creation and dissemination (to see this in action, bring a table of mixed faculty together, open the discussion of author name order, and stand back). These differences affect the questions you ask, the evidence you value, the analysis you undertake, and the audience(s) for the final product. Just as you would when coding data, nothing works quite so well as writing down decisions once you find consensus. It’s easy (and occasionally disastrous for a project) to make assumptions about shared understandings when working with people in your own discipline, but I’ve found these groups can have just as divergent thinking as cross-disciplinary ones. The early communication stage is often skipped on the assumption that, as members of the ‘hive mind’ of librarianship, we have common conceptions of information literacy, of what term we should use for patron/user/client, or of how open a publication needs to be to count as OA.

Much of this – negotiating meaning across disciplines, negotiating time zones and spelling conventions across borders and oceans, or negotiating variations in motivation regardless of other differences or similarities – is a matter of making the tacit explicit, of learning how to say what we mean, what we need, and what we can do, clearly and without apology.

It turns out that this really is one of the great unsung benefits of collaboration. Working with others has taught me more about my professional self than any other activity. It has made me think about my values as a librarian, as a researcher, and as a teacher, and in articulating those values to others I have found a strengthened sense of purpose. Negotiating the meaning of information literacy, whether with library colleagues or with other faculty has given me a more nuanced personal definition, and helped me enact and communicate that definition in my teaching and scholarship. I have found that these meaning-making tasks have been far more productive and authentic when I have worked on them as a means to collaboration than when I have considered them as ends in themselves.

Try starting your next collaboration with the kind of conversation that engages participants in self-explanation, where tacit assumptions and definitions are brought into the light of others’ questions, probed for nuance, and made explicit. There is no guarantee this will lead to a trouble-free project of course, but according to the OED ‘explicit’ does derive from the classical Latin explicitus: free from difficulties… so it just might.

*A semi-mythical place where all information is well-organized, all colleagues are congenial and collegial, and timezones prove no barrier to productive conversations.

For a longer discussion of collaboration in research, I highly recommend the “Coda on Collaboration” chapter of Critical Reading in Higher Education: Academic Goals and Social Engagement by Karen Manarin, Miriam Carey, Melanie Rathburn, and Glen Ryland, 2015, Indiana University Press.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

A small experiment to improve Facebook engagement

By Joanna Hare
Run Run Shaw Library, City University of Hong Kong

As I am sure is the case at many academic libraries, I am the sole person responsible for maintaining the Library Facebook Page. This means that a lot of my time is spent planning and scheduling content, with not as much time as I would like spent collecting evidence for the purpose of improving content. I regularly check and download Facebook Insights reports to keep an eye on how our page is doing, and of course I always pay attention to how much interaction a particular post is getting through comments, likes, or shares. Recently, however, I trialed a small experiment to see if I could improve the performance of a particular type of post: a weekly link to the Library’s Books of the Week blog.

Books of the Week is a place to share recommended Library books, usually related to a current event such as the Olympics or the beginning of semester. In the past, a feed was created so all new blog posts would be automatically posted to the Facebook page. This was causing a number of problems, such as the timing and number of posts becoming unpredictable, and the posts being poorly formatted in Facebook. Most importantly, the Facebook posts coming automatically from the blog were getting zero engagement, and the Reach of the posts was very low. A change was clearly needed.

I decided to stop the blog posting automatically to Facebook and to post the item manually myself. I created a simple graphic to be used each week, and posting manually meant I could write the accompanying status to be more timely and unique. Even though manually posting the item each week only takes a few minutes, in terms of my job description and job performance I knew I would need to justify whether this increased manual work was worth the effort.

Based on an experiment described in this article, I started a log of the variables when posting Books of the Week each week. The log included a link to the post, a description of the post such as the image dimensions, length of the accompanying status, and the time and date of the post. Then, each week I recorded the basic units of measurements for the post provided by Facebook: the Reach and the Post Clicks. I was less interested in likes, comments, and shares in this instance – of course I kept a record of them in my log – but metrics like Reach and Post Clicks are sufficient to see if people are engaged with your content without taking the extra step to ‘like’ a post: “…just because someone didn’t click on your specific post, if that post encouraged them to click anywhere else on your page, you’ve done a good job!” (Cohen, 2014)
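To give a concrete (and entirely hypothetical) picture of how such a log can be put to work, here is a short Python sketch that reads a posting log and compares average Reach and Post Clicks by post type. The file name and column names are illustrative assumptions, not the actual spreadsheet used for this experiment.

import pandas as pd

# Hypothetical log of weekly Books of the Week posts; file and column names are assumptions.
log = pd.read_csv("books_of_the_week_log.csv", parse_dates=["posted_at"])

# Compare the two posting methods ('Link' vs 'Photo') on the metrics that
# matter here: Reach and Post Clicks.
summary = log.groupby("post_type")[["reach", "post_clicks"]].mean().round(1)
print(summary)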

For the first four weeks, I saw marked improvement in terms of the Reach, rising from 43 in the first week to 185 by the fourth week. At this point, I tweaked the method of posting. Rather than posting a link then adding the graphic as an attachment, I posted the graphic as a photo, with an html link in the description. Crucially, after digging into my Insights reports I found Facebook categorises the first type of post as a ‘Link’ and the second type as a ‘Photo’. The difference is very small in practice, and looks like this:

Fig 1: The image on the left shows a ‘Link’ post type; the second image shows a ‘Photo’ post type.

After making this change, the increase in the post’s Reach was remarkable – the figure jumped to over 500. Over the next six weeks I continued this method of posting, and the posts consistently reached over 800 users. Once during that period I reverted to the first method, and the Reach dropped to 166. I returned to the second method and the Reach increased again; it has remained at or above 800 since I stopped logging the weekly variations.

Much of the literature and the marketing material about using Facebook recommends that page managers use images to engage their audience, so I suppose these results are not surprising. I did not however expect there to be such a difference in Reach simply because my post originated as a ‘Photo’ rather than a ‘Link’, when the content is essentially the same.

The general visibility of the posts was much improved with this method, but the change in the actual click through rate to the blog was less dramatic. On average around 5 people each week clicked on the post. My Insight reports show 2-3 of the clicks were to expand the image or description, while on average 0-1 people clicked the link to visit the blog. Quite disappointing!

Despite this, I do not think the exercise was in vain. Firstly, seeing for myself that images truly do have a greater Reach according to Facebook’s algorithm is useful for all future posting practices. Secondly, I think it is valuable to have our posts become more visible on Facebook, increasing our presence on the platform in general. It seems that the manual effort (which really only takes around 10-15 minutes each week, especially now that my colleagues assist in drafting the text and modifying the image!) is worthwhile, given the marked increase in the posts’ Reach and the small increase in the number of people clicking on them. This is just a small-scale way of using Facebook Insights, and in future I hope to use Insights more strategically in designing and delivering the Library’s Facebook content. In the coming weeks I will be experimenting with a more coordinated approach to Facebook, including a paid advertising campaign, and I look forward to sharing some of the results with the C-EBLIP community.

References:

Busche, L. (2016, February 20). 10 Marketing Experiments You Can Run Yourself to Improve Your Reach on Social Media. Retrieved September 27, 2016, from https://designschool.canva.com/blog/marketing-experiments/

Cohen, D. (2014, August 6). Post Clicks, Other Clicks Are Important Metrics for Facebook Page Admins, Too. Retrieved September 27, 2016, from http://www.adweek.com/socialtimes/post-clicks-other-clicks-are-important-metrics-for-facebook-page-admins-too/300388

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Confessions of a Procrastinating (at times) Researcher

By Virginia Wilson, Director
Centre for Evidence Based Library and Information Practice (C-EBLIP)

When I sat down this morning to write out a comprehensive to-do list, I had to turn away from it for a moment. In my research section, there’s a bit too much going on. I’m in the middle of three research projects – one of which is a solo project and is hanging on far longer than I would have hoped. If it were a child, my data would be starting kindergarten this fall. My other two projects are collaborations. They are moving along, which I attribute to the accountability that comes from working with others. I sometimes look at co-workers and colleagues whom I admire and wonder “how do they get it all done?”

Regarding my solo project, I think my procrastination has been fueled by the feeling of not having a big enough chunk of time to really get into it. That’s merely an excuse, of course. I do have time, and I have had the time, and even if there are not great stretches of it, I should be able to be productive. But the longer I don’t do it, the easier it is to not do it. One of my collaborators, Lorie, said (and I paraphrase): You can get a lot done in a couple of hours or half a day. You just do it! Just do it. That’s it, really. Don’t think about it, don’t mull it over, don’t wonder, don’t ponder, and for heaven’s sake, don’t read any more literature…just do it. As Yoda says, “Do. Or do not. There is no try.” I’ve been doing a lot of “not doing” on this solo project. So, enough of that! I’m going to enlist all of you as my accountability buddies. I’m declaring here in print that I will write that paper by Spring 2017.

How am I going to do this, you ask? I’m going to take advantage of the C-EBLIP Writing Circle. Every two weeks, a group of us gets together, shares progress and goals, and then writes for a couple of hours. It’s surprisingly effective! I also did some looking around for other productivity techniques and came across a post on lifehacker (and who doesn’t want to hack their life, am I right?) where they outline the five best productivity methods based on “your” votes. The Pomodoro Technique looks pretty interesting. I just need a “simple timer and a little discipline.” Hmm, okay. I’ll set the timer for 25 minutes, start it, and get to work. After I’ve worked for 25 minutes, the timer goes off, and I get a 5 minute break. Apparently, that is one “Pomodoro.” And I go on from there. The key is “short, sustained bursts.” There are some other productivity techniques listed, including a secret from Jerry Seinfeld. I do fear, however, that I will end up procrastinating by exploring more and better productivity techniques!
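For anyone who likes to automate their discipline, the mechanics of the technique are simple enough to script. Below is a toy Python sketch of a few Pomodoro cycles; the 25/5 split comes from the technique as described above, and everything else (the cycle count, the messages) is just my own illustration.

import time

WORK_MINUTES = 25
BREAK_MINUTES = 5

def pomodoro(cycles=4):
    """Run a few work/break cycles, printing a prompt at each transition."""
    for i in range(1, cycles + 1):
        print(f"Pomodoro {i}: write for {WORK_MINUTES} minutes – go!")
        time.sleep(WORK_MINUTES * 60)
        print(f"Ding! Take a {BREAK_MINUTES}-minute break.")
        time.sleep(BREAK_MINUTES * 60)

if __name__ == "__main__":
    pomodoro()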

So, there you go. Probably more than you needed to know about my inner research psyche, but I surely cannot be alone when it comes to following through on research projects. I look to role models for inspiration, which is helpful. But probably the biggest drive for my solo research project right now is the age of the data. It’s still viable, I’m sure of that, but it really needs to get out there. If anything, I owe it to the folks who took the time to share their stories with me. So, that’s a good motivator, too. If you have similar stories to share, or some interesting productivity techniques, I’d love to hear about them.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

C-EBLIP Research Network: You’re Invited

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

For almost 3 years, the Centre for Evidence Based Library and Information Practice (C-EBLIP) has been supporting librarians at the University of Saskatchewan (U of S) as researchers and promoting evidence based library and information practice (EBLIP). This spring, we launched the C-EBLIP Research Network, an affiliation of institutions committed to librarians as researchers and/or interested in evidence based practice. The Network is conceived of as a supportive intellectual space supplemented by concrete activities. A 2-year pilot, granting institutional membership to the C-EBLIP Research Network, is kicking things off on a national and international level.

When I look at what we’ve achieved internally here at the U of S by getting librarians together for such things as the C-EBLIP Journal Club, Writing Circle, Code Club, the C-EBLIP Fall Symposium, and even this blog, I can’t help but wonder what we might achieve if we extend the participation, the collaboration, and the sharing. And so, the C-EBLIP Research Network is designed to facilitate all of those things within a global context (we go big or we go home).

An institutional membership in the C-EBLIP Research Network is primarily for the benefit of librarians within that institution who are actively engaged in research and/or evidence based library and information practice. Becoming an institutional affiliate member of the C-EBLIP Research Network and signing a Memorandum of Understanding (MOU) demonstrates that the larger institution supports the librarians’ growth in these areas. Member institutions can come from many sectors (e.g. universities, public libraries, schools, special libraries, research groups). A $250 CAD yearly membership fee will be reinvested back into C-EBLIP Research Network programming, e.g. webinars, research grants, etc. Librarian contacts from each institution will act in an advisory capacity to start with. The Network as it stands now is essentially a scaffold. Librarians at member institutions will have a chance to shape the Network in meaningful ways.

There are all kinds of networks out there: business networks, computer networks, telecommunication networks, television networks, even our nervous system is a network. One thing they all have in common is information exchange. The different nodes are all linked together to facilitate the movement and sharing of information. Just look at how many configurations there are!

Network topologies. By NetworkTopologies.png: Maksim; derivative work: Malyszkz (talk) – Public Domain, https://commons.wikimedia.org/w/index.php?curid=15006915

So why a network? Why not another term such as association, partnership, alliance, consortium, or syndicate? Well, the last sounds a bit too much like we’d be up to no good. And the rest didn’t seem to click. And why am I so fixated on these diagrams (I really am)? Novick and Hurley state that “a large body of research has shown that schematic diagrams […] are powerful tools for thinking”; however, “it is important to note that superior performance is only obtained when the display format and the structure of the environment are consistent” (2001, p. 160). The idea of a network, to me, speaks to a flat structure with no institution above the other. We are peers, practicing librarians involved in the research enterprise. And yes, there will be some librarians with more experience, or more experience in certain areas, but that’s what makes the network a beautiful idea. In terms of conducting research as practicing librarians and incorporating EBLIP into our daily work, getting information from a variety of sources and sharing information in turn can assist us in many different ways. In their research, Novick and Hurley studied three schematic diagrams: the matrix, the network, and the hierarchy. Their descriptions of the network diagram are what we envision for the C-EBLIP Research Network:

• Any node (in our case, institution) may be linked to any other node (i.e. there are no constraints) (p. 163).
• All of the nodes have identical status (i.e. are indistinguishable except by name) (p. 164).
• The links between nodes may be associative (p. 164).
• Any number of lines can enter and leave each node. Thus both one-to-many and many-to-one (i.e., many-to-many) relations can be represented simultaneously (p. 165).

(The above are just a few of the properties of networks, but they are the properties that speak the loudest to C-EBLIP Research Network configuration.)
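To make those properties concrete, here is a purely illustrative Python sketch of a network as a dictionary mapping each node to the set of nodes it links to; the institution names are invented, and nothing here describes the actual Network membership.

# Illustrative only: nodes are institutions, links are unconstrained and
# many-to-many. The names below are invented.
network = {
    "Institution A": {"Institution B", "Institution C"},
    "Institution B": {"Institution A", "Institution C", "Institution D"},
    "Institution C": {"Institution A", "Institution B"},
    "Institution D": {"Institution B"},
}

# Any number of links can enter and leave each node, and all nodes have
# identical status – they differ only by name.
for node, links in sorted(network.items()):
    print(f"{node} is linked to {len(links)} other member(s)")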

So far, the C-EBLIP Research Network is alive and well and host to several institutional members from Canada and the UK. A list will be coming soon and our numbers are growing (hello, Australia!). If you are interested in joining the C-EBLIP Research Network or would like to know more, please do not hesitate to be in touch with me: virginia.wilson@usask.ca

References
Novick, L. R., & Hurley, S. M. (2001). To matrix, network, or hierarchy: That is the question. Cognitive Psychology, 42, 158–216. doi:10.1006/cogp.2000.0746

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.