Changing roles and changing needs for academic librarians

by Dr Danny Kingsley, Head of Scholarly Communication, University of Cambridge
and
Claire Sewell, Research Skills Coordinator, Office of Scholarly Communication, University of Cambridge

The Office of Scholarly Communication (OSC) has joined the Centre for Evidence Based Library and Information Practice Research Network, and as part of this commitment has prepared the following blog post: a literature review of papers addressing the changing training needs of academic librarians. This work feeds into research currently being carried out by the OSC into the educational background of those working in scholarly communication. The piece concludes with a discussion of this research and potential next steps.

Changing roles

There is no doubt that libraries are experiencing another dramatic change as a result of developments in digital technologies. Twenty years ago, in their paper addressing the education of library and information science professionals, Van House and Sutton noted that “libraries are only one part of the information industry and for many segments of the society they are not the most important part”.

There is an argument that “as user habits take a digital turn, the library as place and public services in the form of reference, collection development and organisation of library resources for use, all have diminishing value to researchers”. Librarians need to adapt and move beyond these roles to ones in which they play a greater part in the research process.

To this end, scholarly communication is becoming an increasingly established area in many academic libraries. New roles are being created and advertised in order to better support researchers as they face increasing pressure to share their work. Indeed, a 2012 analysis of new activities and changing roles for health sciences librarians, based on job announcements, identified ‘scholarly communications librarian’ as one such new role. In their 2015 paper on scholarly communication coaching, Todd, Brantley and Duffin argue that: “To successfully address the current needs of a forward-thinking faculty, the academic library needs to place scholarly communication competencies in the toolkit of every librarian who has a role interacting with subject faculty.”

Which skill sets are needed?

Much of the literature agrees on the specific skill set librarians need in order to work in scholarly communication. “Reskilling for Research” identified nine skill areas of increasing importance, including knowledge of data management and curation. Familiarity with data is mentioned repeatedly, and is acknowledged as an area in which librarians will need to be confident. Mary Anne Kennan describes the concept as “the librarian with more” – traditional library skills with added knowledge of working with and manipulating data.

Many studies reported that generic skills were just as much in demand as discipline-specific skills, if not more so. A thorough knowledge of advocacy and outreach techniques is needed to spread the scholarly communication message to both library staff and researchers. Raju highlighted presentation skills for similar reasons in his 2014 paper.

The report “University Publishing in a Digital Age” further identified a need for library staff to better understand the publishing process and this is something that we have argued at the Office of Scholarly Communication (OSC) in the past.

There is also a need to be cautious when demanding new skills. Bresnahan and Johnson (article pay-walled) caution against trying to become the mythical “unicorn librarian” – an individual who possesses every skill an employer could ever wish for. This is not realistic and is ultimately doomed to fail.

In their 2013 paper Jaguszewski and Williams instead advocate a team approach with members drawn from different backgrounds and able to bring a range of different skills to their roles. This was also the argument put forward by Dr Sarah Pittaway at the recent UKSG Forum where her paper addressed the issue of current library qualifications and their narrow focus.

Training deficit

Existing library roles are being adapted to include explicit mention of areas such as Open Access, whilst other roles are being created from scratch. This work is a good fit for library staff, but it can be challenging to develop the skills needed. As far back as 2008 it was noted that the curricula of most library schools covered only the basics of digital library management, and little seems to have changed since: Van House and Sutton identify barriers to “the ability of LIS educational programs to respond” to changing needs, such as the need to produce well-rounded professionals.

Most people working in this area learn their skills on the job, often from more experienced colleagues. Kennan’s study notes that formal education could help to fill the knowledge gap whilst others look to more hands-on training as this helps to embed knowledge.

The question then becomes: should the profession as a whole be doing more to prepare its new recruits for the career path of the 21st-century academic librarian? This is something we have been asking ourselves in the OSC at Cambridge. Since the OSC was established at the start of 2015 it has made a concerted effort to educate staff at the one hundred plus libraries in Cambridge through both formal training programmes and targeted advocacy. However, we are aware that there is still more to be done. We have begun by distributing a survey to investigate the educational background of those who work in scholarly communications. The survey proved popular, attracting over five hundred responses and many offers of follow-up interviews, which suggests that we have touched on an area of real interest within the profession. We will be analysing the results of the survey in the New Year with a view to sharing them more widely and further participating in the scholarly communication process ourselves.

Conclusion

Wherever the skills gaps are there is no doubt that the training needs of academic librarians are changing. The OSC survey will provide insight into whether these needs are currently being met and give evidence for future developments but there is still work to be done. Hopefully this project will be the start of changes to the way academic library staff are trained which will benefit the future of the profession as a whole.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

This article was originally posted on Unlocking Research, the blog of the University of Cambridge Office of Scholarly Communication on November 29, 2016.

Building a positive culture around practitioner research, one symposium at a time

by Christine Neilson
Neil John MacLean Health Sciences Library
Centre for Healthcare Innovation
University of Manitoba

This fall I attended my first C-EBLIP symposium, and it was fantastic. The day was filled with interesting presentations; I had a chance to see old colleagues and meet new people who share an interest in library research; and they gave me bacon for breakfast, which is always a win as far as I’m concerned. Two recurring themes during the day were 1) leading by example, and 2) the personal aspects of doing research (such as dealing with research projects that go off the rails, professional vulnerability, and the dreaded “imposter syndrome”). Both of these themes are important. The first as a call to action. The second as an acknowledgement that research isn’t necessarily easy, but none of us are truly alone and there are things we can do to cope.

Acknowledging and exploring the personal issues that come with conducting research is not something that we tend to talk about. I might tell a trusted colleague that sometimes I’m afraid others will see me as the researcher equivalent of the Allstate DIY-er – all of the enthusiasm and optimism, but none of the skill or ability – but generally, we limit our “official” professional discussion to less sensitive topics. Maybe that’s because we don’t want to admit that there might be any issues. Or maybe it’s because there’s a risk the discussion could degenerate into a pity-party that doesn’t move anyone or anything forward. Either way, I think that this is a topic area that needs to be explored in a constructive way.

The C-EBLIP Symposium was a venue that genuinely felt safe to talk about research and the experience of doing research, and I’m thankful I was able to attend. I’m particularly happy that this year’s presenters will have an opportunity to publish about their presentations in an upcoming issue of Evidence Based Library and Information Practice journal. It’s a great opportunity for presenters to share their research, ideas, and experiences with a wider audience, and it will help ensure that content from the day doesn’t disappear into the ether. Building a culture with certain desired qualities is extremely difficult. I’m encouraged that C-EBLIP is building a positive, supportive culture of practitioner research in librarianship and I hope the momentum continues!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Is It Possible To Develop An Evidence-Based Complement Plan?

by Frank Winter
Librarian Emeritus, University of Saskatchewan

Although not typically phrased as such, librarian labour – what it is, how much of it a library has, how best to deploy it – underlies the ongoing discussion of how best to deliver the services needed by the host institution. Opinions abound on what is needed and what is desirable. Proposals for new or modified roles and services are often received as something that can only be achieved using incremental resources rather than by internal reallocation, a stance based on the oft-voiced assertion that each member of the current complement is already more than fully occupied with existing responsibilities.

The common unit of analysis in these discussions tends to be the position held by an individual librarian, considered as an indivisible worker. But any position and its associated responsibilities is made up of a bundle of tasks. Considering workload and complement in a position-based sense can forestall systematic discussion about options for assigning or reassigning tasks up, down, sideways, or out (through partnerships and collaborations, outsourcing, or assignment to a different group of employees in the library or elsewhere on campus), redefining the task, utilizing technological options, or not doing the task at all. These options are all part of the standard toolkit when dealing with short-term situations such as leaves and vacancies, and perhaps longer-term situations such as downsizing, but they seem not to figure in discussions of longer-term complement plans.

Complement plans are assembled from many component parts. Although there is typically a great deal of professional judgment that goes into complement planning, it is often individual, implicit, and fraught with the usual power dynamics of any group process and all the other pitfalls of planning and decision-making.

Is it possible to employ the processes and tools of evidence-based library and information practice (EBLIP) to develop a complement plan that would address some of these challenges and produce a robust planning document? A quick review of the relevant evidence-based literature suggests that such an approach has not yet been reported but might be productive.

What would such a process look like using the 5 As (Articulate, Assemble, Assess, Action, Adapt, and the iterative process of their use) outlined by Denise Koufogiannakis (2013), together with her description of the types of evidence typically considered in library research and the “institutional, group-driven decision making” framework typical of library organizations? Constraints of space make a full discussion of each A impracticable, but a quick sketch might be helpful as a starting point.

* Articulate

Koufogiannakis sketches out several points but it is important to recognize that a complement plan addresses the allocation of one subset of a library’s resources – librarian labour. As with every proposed resource allocation it is a political document incorporating budget choices that reflect values.

* Assemble

There is a wealth of potentially relevant evidence. Many of the sources would typically be included in the Environmental Scan section of any Strategic Plan. What EBLIP provides is clarity of purpose in the Articulation stage and focus in assembling evidence at this stage. If the assembled evidence does not, at the Assessment stage, reveal enough about the librarian labour involved, then the evidence-based approach requires an iteration of this stage.

* Assess

Assessing the evidence is the next step in EBLIP. The standard criteria of credibility and validity apply as well as issues of relevance and context. Ensuring that at the Assemble step there is as much depth, breadth, and context as possible in the assembled evidence will aid in assessment. Transparency and inclusivity during the discussions are also important elements at this stage.

For example, although evidence from comparator libraries is often considered, it is actually quite tricky to find true comparators. It is important to be aware of similarities and differences: which specific tasks and responsibilities are and are not included, and the extent to which they might be distributed among others in the library and on campus. It is not particularly helpful to assume what any library or librarian is doing based on home pages or position titles. The arbitrariness of organizational structure on campus and within libraries sometimes makes it challenging to map apples to apples. At a minimum, personal contact should be made to ensure that the full situation is known. On the other hand, if a comparator library with approximately the same complement of librarians and roughly the same organizational mission is responsible for services not supported by the local library, then further investigation is needed to discover how that other library distributes the responsibilities among its librarian complement. If a smaller university library delivers the same or even an expanded array of librarian-related services then that, too, merits further investigation and perhaps a further iteration of the Assemble stage.

It is necessary to assess the potential impact of the evidence on “the Library” and the librarians. Impacts range from measurable and substantial through to insubstantial and unmeasurable.

Evidence from existing librarians must be weighed to distinguish anecdotal empiricism and self-interest from credible evidence.

Another step to take at this point is to be clear about the appropriate unit of analysis when assessing evidence. It is not helpful to view “The Library” – either local or comparator – as an undifferentiated lump. It is more appropriate to disaggregate “The Library” into a bundle of things (work groups including librarians, physical locations, and so on) responding to differing user needs. This step will help in the assessment of what will work, what won’t, and why. What might work in one area of a library might not be appropriate in another. This avoids the trap of trying to find one size that fits all.

* Action

Getting to agreement is obviously another critical step in the development of a complement plan. Koufogiannakis describes a number of criteria but it is her articulation of the outcome of this step that is important: Determine a course of action and begin implementation of the decision. If no action results from the work above (and acknowledging that a considered conclusion that no changes are desirable is a possible outcome), then arguably the process has been pointless.

In this respect, it is interesting to read the recent blog posting by Roger Schonfeld entitled Shaping a Library by Linking Planning and Budgeting, and the associated comments (2016). Even for the largest libraries, librarian complement is typically a slowly evolving resource if viewed as being composed of positions. Alternatively, for smaller academic libraries changing just one position can be a major and rare action in the overall composition of the complement. The Schonfeld posting highlights librarian time – a more fungible resource than positions – as the productive unit of analysis.

* Adapt

Have the goals and outcomes of the process produced their anticipated effect – the allocation of librarian labour to most effectively meet the current and emerging information needs of library users? If not, why not? At least one possible outcome at this stage (very much institution-dependent) is a conclusion that there is a diminished need for librarian labour. If this is the case, it makes for a pretty gloomy complement plan going forward. And so, the planning cycle returns to the Articulation stage.

In conclusion, the 5 As of EBLIP, combined with the collegial decision-making style typical of libraries, seem quite suitable for the development of useful librarian complement plans.

References

Koufogiannakis, D. (2013). EBLIP7 Keynote: What we talk about when we talk about evidence. Evidence Based Library and Information Practice, 8(4), 6-17 doi:http://dx.doi.org/10.18438/B8659R.

Schonfeld, R. (2016, November 7). Shaping a library by linking planning and budgeting [Blog post]. Retrieved from http://www.sr.ithaka.org/blog/shaping-a-library-by-linking-planning-and-budgeting/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The “Why?” behind the research

by Andrea Miller-Nesbitt
Liaison Librarian, Schulich Library of Physical Sciences, Life Sciences and Engineering, McGill University

Lorie Kloda
Associate University Librarian, Planning & Community Relations, Concordia University

Megan Fitzgibbons
Innovation Librarian, Centre for Education Futures, University of Western Australia

When reading journal articles reporting original research, the content usually follows the IMRAD format: introduction, methods, results, and discussion. Word count limits, author guidelines, and other conventions mean that the researchers’ motivation for conducting the study is often left out. In this post we present our motivations for conducting a research study on librarians’ participation in journal clubs:

Fitzgibbons, M., Kloda, L., & Miller-Nesbitt, A. (pre-print). Exploring the value of academic librarians’ participation in journal clubs. College & Research Libraries. http://crl.acrl.org/content/early/2016/08/22/crl16-965.abstract

Being an evidence-based practitioner can sometimes involve a bit of navel-gazing. Beyond using evidence in our professional work (e.g., for decision-making, evaluating initiatives, etc.), we may likewise ask questions about the outcomes of our own professional development choices.

After three years of facilitating the McGill Library Journal Club, we began to think about ways we could disseminate our experience and lessons learned, and most importantly, how we could determine librarians’ perceived outcomes of participating in a journal club. We felt anecdotally that participating in a journal club is worthwhile, but we wondered: Can we formally investigate the impacts of participation on librarians’ practice and knowledge? What evidence can we find to inform the decisions and approaches of librarians and administrators in supporting or managing a journal club? Is there a connection between journal clubs and evidence-based librarianship? We also wanted to learn more about approaches taken in a variety of journal clubs and how they define success for their group.

The McGill Library Journal Club was initially established in order to help foster evidence-based practice by reflecting on the library and information studies literature and using those reflections to inform practice. The journal club also provides a professional development opportunity for all those interested. Although the McGill Library Journal Club has experienced many of the same challenges as other journal clubs, it is still going strong after 6 years thanks to a core group of motivated facilitators. (For more information about the journal club’s activities, see the McGill Library Journal Club wiki.)

In order to answer these questions, we first had to agree on a definition of a journal club. After some reading and deliberation, we framed participation in a journal club as an informal learning activity: learning that occurs outside classrooms or training sessions, but still involves some coordination and structure. In this context, our research question was: “What do librarians perceive as the value of participating in a journal club?” We focused on academic librarians who participate in journal clubs to manage the scope of the study, but a similar approach could be taken in other library and information organizations as well.

Because we were interested in gaining insight into individuals’ experiences, we considered several methods, and ultimately selected an in-depth qualitative method, the hermeneutic dialectic process (Guba & Lincoln, 1989). This is a method that we have seen used in the social sciences for the purposes of evaluation and reconciling diverse perspectives. At the time we were coming up with our research question, one of the authors (Lorie) was an assessment librarian with an interest in qualitative methods. She brought Guba and Lincoln’s writing to the team for discussion. It seemed both appropriate for answering our research question and flexible enough to let us really capture study participants’ experiences – not just what we expected to hear. We believe that this is the first use of this method in LIS research, so an additional motivation for the study was to apply the approach in the field.

As per the method, we conducted semi-structured in-depth interviews with each participant. After the first interview, central themes, concepts, ideas, values, concerns and issues that arose in the discussion were written into an initial “construction” which captured the experiences and perceptions expressed by the interviewee. Then in the second interview, the participant was asked to react to some of the points brought up by the first interviewee, as expressed in the construction. The construction was added to after each interview, incorporating the perspectives of each successive interviewee and used to inform the subsequent interviews. At the end, all participants were given the opportunity to comment on the final construction and let us know whether their perspectives were accurately represented.

Ultimately, we believe that the findings of our published study are of interest to librarians who aim to create and sustain a journal club. In particular, it could offer insight as they form goals for their group and justify the activity, including seeking support and recognition for their group.

More details about the impacts of academic librarians’ participation in journal clubs are of course presented in the article. In addition, in collaboration with the C-EBLIP Research Network, we hope to compile additional resources about journal club practices in librarianship and open communication channels in the future. Watch this space, and please get in touch if you have any ideas about promoting journal clubs for academic librarians.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Learning to Let Go: The Perfectionist’s Struggle

by Laura Thorne
UBC Okanagan Library

Striving for excellence motivates you; striving for perfection is demoralizing.
– Dr. Harriet Braiker

Last week, I attended C-EBLIP’s third annual Fall Symposium. There were so many great presentations, but there were two in particular that I kept thinking about in the days following – Angie Gerrard’s Changing Your Research Plan En-Route and The Elephant in the Room: Imposter Syndrome and Librarian Researchers by Jaclyn McLean. Both presentations tackled often-encountered, but rarely discussed topics that come up when conducting research – our emotions and personalities. Gerrard discussed the emotional challenge associated with research not going according to plan and the need for professional vulnerability, while McLean discussed imposter syndrome and feeling like you’re not good enough, even when you’ve accomplished so much. They led me to think about a related issue that I’ve struggled with in my career and while doing research – perfectionism.

Like many librarians I know, I am a perfectionist. Perfectionism can be an excellent trait. It can lead to high quality work and can motivate me to always strive to do my best. But it can also be challenging. While I wouldn’t diagnose myself with atelophobia, at times my perfectionism has been paralyzing and has prevented me from taking risks, trying new things, or even completing what I’ve started. There are drawbacks to thinking everything you do needs to be perfect or the best.

Studies show that perfectionism is rampant in academia (Charbonneau, 2011;
Dunn, Whelton & Sharpe, 2006; Sherry, Hewitt, Sherry, Flett & Graham, 2010; Rockquemore, 2012; Shives, 2014) and is something many of our students also struggle with while at university (Çapan, 2010; Eum & Rice, 2011; Jiao & Onwuegbuzie, 1998). While knowing I’m not alone is of some comfort, one of the biggest professional struggles I’ve had to overcome is learning to let go of projects, especially my writing.

You could say my entire career thus far has been an experiment in letting go, in telling myself, “It’s good enough, just send it out,” but nowhere has this needed to be repeated as much as in my research. For the most part, my research is not something I have to do; it’s something I want to do. I do it largely outside of my regular everyday work, and it is truly a labour of love. Because of this, there tend not to be set deadlines (I attempt to set them for myself, but I’m not the strictest timekeeper), and I either a) procrastinate or b) agonize over tiny details instead of just getting it done and letting it go.

Some of the tricks I’ve found useful in combatting my perfectionism and learning to let go:
• Embrace the mantra of good enough: This isn’t to say do the bare minimum, but accepting that perfection is unattainable and realizing that a finished project is a good project makes it easier to make progress on your research.
• Fight the urge to procrastinate: For me personally, it’s easy to procrastinate – it gives me an out for why something isn’t perfect. But this only exacerbates the problem.
• Set deadlines for yourself (and stick to them): This helps with the procrastination!
• Don’t go alone: Having a research partner or team has been incredibly helpful in learning to let go and can work as a support system when you’re obsessing about the details and unable to see the bigger picture.
• Love the draft: By completing drafts of my work, whether it be a research proposal or an article, I can slowly get used to the idea of letting go of my work in a staged process before sending it out into the world.
• Develop a network you trust: When you’re unsure of or fighting with a project, it’s useful to have a network of people you can talk to and receive feedback. It makes it easier to let go of a project when I know someone I respect thinks it’s good. And I do the same for them!
• Don’t re-read after you’ve submitted your work: This should go without saying, but unfortunately, I had an awful habit of re-reading an item right after hitting submit or send. As I read through, I would think “I should have changed this or that” and make myself feel dreadful instead of happy that I had finished. It’s an exercise in torture, and since I’ve stopped, I feel much less critical of the work I’ve done and can actually celebrate a job well done.

“Too many people spend too much time trying to perfect something before they actually do it. Instead of waiting for perfection, run with what you got, and fix it along the way.”
– Paul Arden

References (and further reading)

Çapan, B. E. (2010). Relationship among perfectionism, academic procrastination and life satisfaction of university students. Procedia-Social and Behavioral Sciences, 5, 1665-1671.

Charbonneau, L. (2011). Perfectionist professors have lower research productivity. University Affairs. Retrieved from http://www.universityaffairs.ca/news/news-article/perfectionist-professors-have-lower-research-productivity/

Dunn, J. C., Whelton, W. J., & Sharpe, D. (2006). Maladaptive perfectionism, hassles, coping, and psychological distress in university professors. Journal of Counselling Psychology, 53(4), 511.

Eum, K., & Rice, K. G. (2011). Test anxiety, perfectionism, goal orientation, and academic performance. Anxiety, Stress & Coping, 24(2), 167-178.

Hibner, H. (2016, Jan 19). Don’t overthink it: How librarians can conquer perfectionism with mindfulness. [Web log post]. Retrieved from https://librarylostfound.com/2016/01/19/dont-overthink-it-how-librarians-can-conquer-perfectionism-with-mindfulness/

Jiao, Q. G., & Onwuegbuzie, A. J. (1998). Perfectionism and library anxiety among graduate students. The Journal of Academic Librarianship, 24(5), 365-371.

Rockquemore, K. (2012). Overcoming academic perfectionism. [Web log series]. Retrieved from https://www.insidehighered.com/career-advice/overcoming-academic-perfectionism

Sherry, S. B., Hewitt, P. L., Sherry, D. L., Flett, G. L., & Graham, A. R. (2010). Perfectionism dimensions and research productivity in psychology professors: Implications for understanding the (mal)adaptiveness of perfectionism. Canadian Journal of Behavioural Science, 42(4), 273-283. doi:10.1037/a0020466

Shives, K. (2014, Nov 11). The battle between perfectionism and productivity. [Web log post]. Retrieved from https://www.insidehighered.com/blogs/gradhacker/battle-between-perfectionism-and-productivity

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

A small experiment to improve Facebook engagement

By Joanna Hare
Run Run Shaw Library, City University of Hong Kong

As I am sure is the case at many academic libraries, I am the sole person responsible for maintaining the Library Facebook Page. This means that a lot of my time is spent planning and scheduling content, with not as much time as I would like spent collecting evidence for the purpose of improving content. I regularly check and download Facebook Insights reports to keep an eye on how our page is doing, and of course I always pay attention to how much interaction a particular post is getting through comments, likes, or shares. Recently, however, I trialed a small experiment to see if I could improve the performance of a particular type of post: a weekly link to the Library’s Books of the Week blog.

Books of the Week is a place to share recommended Library books, usually related to a current event such as the Olympics or the beginning of semester. In the past, a feed was created so all new blog posts would be automatically posted to the Facebook page. This was causing a number of problems, such as the timing and number of posts becoming unpredictable, and the posts being poorly formatted in Facebook. Most importantly, the Facebook posts coming automatically from the blog were getting zero engagement, and the Reach of the posts was very low. A change was clearly needed.

I decided to stop the blog posting automatically to Facebook and to post the item manually myself. I created a simple graphic to be used each week, and posting manually meant I could write the accompanying status to be more timely and unique. Even though manually posting the item each week only takes a few minutes, in terms of my job description and job performance I knew I would need to justify whether this extra manual work was worth the effort.

Based on an experiment described in Busche (2016), I started a log of the variables for each week's Books of the Week post. The log included a link to the post; a description of the post, such as the image dimensions and the length of the accompanying status; and the time and date of the post. Then, each week I recorded the basic units of measurement for the post provided by Facebook: the Reach and the Post Clicks. I was less interested in likes, comments, and shares in this instance (though of course I kept a record of them in my log), because metrics like Reach and Post Clicks are sufficient to see whether people are engaged with your content without taking the extra step to 'like' a post: "…just because someone didn't click on your specific post, if that post encouraged them to click anywhere else on your page, you've done a good job!" (Cohen, 2014)
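For anyone who wants to keep a similar log, a minimal Python sketch might look like the following. This is a hypothetical illustration only: the file name and field names are my own choices, not anything prescribed by Facebook, and the metric values would be copied in by hand from Insights each week.

```python
import csv
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("books_of_the_week_log.csv")
FIELDS = ["posted_at", "post_url", "post_type", "image_size",
          "status_length", "reach", "post_clicks", "likes", "comments", "shares"]

def log_post(post_url, post_type, image_size, status_length,
             reach, post_clicks, likes=0, comments=0, shares=0):
    """Append one week's Books of the Week post and its metrics to the log."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()  # write the header row the first time only
        writer.writerow({
            "posted_at": datetime.now().isoformat(timespec="minutes"),
            "post_url": post_url,
            "post_type": post_type,          # 'Link' or 'Photo'
            "image_size": image_size,        # e.g. '940x788'
            "status_length": status_length,  # characters in the accompanying status
            "reach": reach,
            "post_clicks": post_clicks,
            "likes": likes,
            "comments": comments,
            "shares": shares,
        })
```

A spreadsheet does the same job, of course; the point is simply that each post gets one row with its variables and its metrics, so they can be compared later.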

For the first four weeks I saw marked improvement in the Reach, which rose from 43 in the first week to 185 by the fourth week. At this point I tweaked the method of posting: rather than posting a link and then adding the graphic as an attachment, I posted the graphic as a photo, with the link to the blog included in the description. Crucially, after digging into my Insights reports I found that Facebook categorises the first type of post as a 'Link' and the second type as a 'Photo'. The difference is very small in practice, and looks like this:

Fig 1: The image on the left shows a 'Link' post type; the image on the right shows a 'Photo' post type.

After making this change, the increase in the post's Reach was remarkable: the figure jumped to over 500. Over the next six weeks I continued this method of posting, and the posts consistently reached over 800 users. Once during the six-week period I reverted to the first method, and the Reach dropped to 166. When I returned to the second method the Reach increased again, and it has remained at or above 800 since I stopped keeping a weekly record of the variations.
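With a log of this kind, the comparison between the two posting methods can be summarised directly. A small sketch, assuming a CSV log with 'post_type' and 'reach' columns (my own hypothetical field names from the log described above):

```python
import csv
from collections import defaultdict

def average_reach_by_type(log_path):
    """Average the recorded Reach for each post type ('Link' vs 'Photo')."""
    totals = defaultdict(lambda: [0, 0])  # post_type -> [reach sum, post count]
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["post_type"]]
            t[0] += int(row["reach"])
            t[1] += 1
    return {ptype: total / count for ptype, (total, count) in totals.items()}
```

Running this over the weekly log would show at a glance how far the 'Photo' posts outperform the 'Link' posts on average, which is the evidence needed to justify the extra manual effort.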

Much of the literature and the marketing material about using Facebook recommends that page managers use images to engage their audience, so I suppose these results are not surprising. I did not however expect there to be such a difference in Reach simply because my post originated as a ‘Photo’ rather than a ‘Link’, when the content is essentially the same.

The general visibility of the posts was much improved with this method, but the change in the actual click-through rate to the blog was less dramatic. On average around 5 people each week clicked on the post. My Insights reports show that 2-3 of the clicks were to expand the image or description, while on average 0-1 people clicked the link to visit the blog. Quite disappointing!

Despite this, I do not think the exercise was in vain. Firstly, seeing for myself that images truly do have a greater Reach under Facebook's algorithm is useful for all future posting practices. Secondly, I think it is valuable to have our posts become more visible on Facebook, increasing our presence on the platform in general. The manual effort (really only around 10-15 minutes each week, especially now that my colleagues assist in drafting the text and modifying the image!) seems worthwhile given the marked increase in the posts' Reach and the small increase in the number of people clicking on them. This is just a small-scale way of using Facebook Insights, and in future I hope to use Insights more strategically in designing and delivering the Library's Facebook content. In the coming weeks I will be experimenting with a more coordinated approach to Facebook, including a paid advertising campaign, and I look forward to sharing some of the results with the C-EBLIP community.

References:

Busche, L. (2016, February 20). 10 Marketing Experiments You Can Run Yourself to Improve Your Reach on Social Media. Retrieved September 27, 2016, from https://designschool.canva.com/blog/marketing-experiments/

Cohen, D. (2014, August 6). Post Clicks, Other Clicks Are Important Metrics for Facebook Page Admins, Too. Retrieved September 27, 2016, from http://www.adweek.com/socialtimes/post-clicks-other-clicks-are-important-metrics-for-facebook-page-admins-too/300388

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

What is Transliteracy?

by Dr. Suzana Sukovic
Executive Director Educational Research and Evidence Based Practice, HETI
New South Wales, Australia

Transliteracy as a concept originated in the work of academics who were involved in digital building and tinkering, people who got their hands dirty with practical work while thinking theoretically (see the Transliteracies Project http://liu.english.ucsb.edu/transliteracies-research-in-the-technological-social-and-cultural-practices-of-online-reading/ and the Transliteracy Research Group Archive https://transliteracyresearch.wordpress.com/). And that is the essence of transliteracy: it is an abstract idea, but also an embodied practice and sensory experience. Transliteracy is not just an idea or just a practice; it is both. It is hardly surprising that librarians at the coalface of information and digital work embraced the concept, recognising it in their everyday work with information, knowledge and technology.

But what is it exactly? A short answer is that transliteracy is about a fluidity of movement across a range of technologies, media and contexts.

A longer answer is more layered as it is based on a careful analysis of research data:

Transliteracy is an ability to use diverse analogue and digital technologies, techniques, modes and protocols
• to search for and work with a variety of resources
• to collaborate and participate in social networks
• to communicate meanings and new knowledge by using different tones, genres, modalities and media.
Transliteracy consists of skills, knowledge, thinking and acting, which enable a fluid ‘movement across’ in a way that is defined by situational, social, cultural and technological contexts.

A study into transliteracy on which this definition is based provides plentiful examples of transliterate behaviours. A historian presents research data on a website for community use, responds to online queries about family connections, puts people from the community in touch with each other and notes their experience for research purposes. An academic studies parks as public spaces and uses GPS, digital and hard-copy maps, hand drawings made by park visitors, audio-recordings and then publishes reports, brochures, academic journal articles and a website with multimedia. A teenager explores a known fictional text by taking a perspective of an inanimate object or a minor character and expresses her creative reading in a digital story. High school students and scholars alike use resources in analogue and digital forms, create new content, and collaborate and communicate in a variety of modes.

As any educator would have noticed, there are a number of skill sets and capabilities packed in the definition and examples. We can represent transliteracy conceptually with different capabilities as its main components.

Figure 1: Conceptual model of transliteracy (Sukovic)

Transliteracy comes to the fore in information- and technology-rich environments, so it is based on information and ICT capabilities. It also encompasses creativity, critical thinking, and communication and collaboration. These are the main skill and knowledge components of transliteracy, but they are not situated wholly within the transliteracy framework, as each can be observed independently of it. Literacy and numeracy underpin transliteracy in the same way they enable any learning.

In order to understand and appreciate transliteracy, it is helpful to read 'ICT' as a label for analogue and digital information and communication technologies, and their many combinations. 'ICT' often refers to digital technologies alone, but the book, traditional radio and television are also technologies designed to carry information and facilitate communication. As the line between different types of technologies becomes increasingly blurry, familiar technologies become things of the past and old technologies undergo revivals, any technology that helps us to transmit information and communicate is relevant to transliteracy.

Transliteracy existed well before digital technologies, but contemporary ways of interacting with information and digital tools have sped up, broadened and changed our daily 'movement across' information and technological fields. As the technologies, skills and contexts in which we live and work are constantly changing, transliteracy becomes a literacy of the modern era. It is integrative in the sense that it does not seek to replace other useful ways of thinking about information and technology; rather, it provides a framework for bringing together modern literacies (e.g. information, digital and media literacy). It also provides a framework for connecting rational-emotional, analytical-creative and theoretical-practical ways of thinking and working, which enable us to live effectively and creatively with the abundant information around us.

The next post in this series will introduce transliteracy palettes and consider the development of transliteracy in formal and informal learning environments.

See the previous post: Transliteracy: the art and craft of ‘moving across’
http://scitechconnect.elsevier.com/transliteracy-art-of-moving-across/

This post was first published on the LARK blog http://lark-kollektive.blogspot.com.au/


Transliteracy in Complex Information Environments is scheduled to publish in November. If you would like to pre-order a copy, please visit the Elsevier Store. Apply discount code STC215 at checkout for 30% off the list price and free global shipping.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

What “counts” as evidence of impact? (Part 2 of 2)

by Farah Friesen
Centre for Faculty Development (CFD)
University of Toronto and St. Michael’s Hospital

In February’s post, I proposed a critical look at what counts as evidence of research impact, beyond traditional metrics (grants, publications, and peer-reviewed conference presentations). I work specifically in the medical/health professions education context and so wanted to find alternative indicators, beyond traditional and altmetrics. Below, I will share some of these resources with you.

Bernard Becker Medical Library Model for Assessment of Research Impact.1 The Becker Library Model advances 5 pathways of diffusion to track biomedical research impact:
1. Advancement of knowledge
2. Clinical Implementation
3. Community Benefit
4. Legislation and Policy
5. Economic Benefit
Each of these 5 pathways has indicators (some indicators appear in more than one pathway). While the Becker Library Model includes traditional indicators, it also suggests some novel impact indicators:2
• valuing collaborations as an indicator of research output/impact
• tracking data sharing, media releases, appearances, or interviews, mobile applications/websites, and research methodologies as evidence of impact
This model offers great indicators to consider for biomedical research impact, but many of them do not apply to medical/health professions education (e.g. patents, quality of life, clinical practice guidelines, medical devices, licenses, etc.).

Kuruvilla et al. (2006)3 developed the Research Impact Framework (RIF) to advance "coherent and comprehensive narratives of actual or potential research impacts", focusing on health services research. The RIF maps out 4 types of impact:
1. Research-related impacts
2. Policy impacts
3. Service impacts
4. Societal impacts
Each type of impact area has specific indicators associated with it. Novel indicators include definitions and concepts (e.g. the concept of equity in health care financing), ethical debates and guidelines, email/listserv discussions, and media coverage. The RIF suggests many indicators applicable to non-clinical/biomedicine disciplines.

Seeing collaborations (Becker) and email/listserv discussions (RIF) counted as research impact, I started to wonder what other types of research dissemination activities we might not have traditionally counted, but which are, in fact, demonstrative of impact. I have coined a term for this type of indicator: grey metrics.

Grey metrics denote indicators that are stumbled upon serendipitously and for which there is no systematic way to track them. These can include personal email requests or phone conversations that nonetheless denote impact. I call them "grey metrics" because finding them is a bit like grey literature searching. Grey metrics might include:
• slide sharing (not in repository, but when it’s a personal ask)
• informal consultations (e.g. through email, about a topic or your previous research. These email connections can sometimes inform other individuals’ research – sometimes even for them to develop projects that have won awards. So even if the consultations are informal via email, it shows how one’s research and guidance has an impact!)
• service as expert on panels, roundtables (shows that your research/practice expertise and knowledge are valued)
• curriculum changes based on your research (e.g. if your research informed curriculum change, or if your paper is included in a curriculum, which might lead to transformative education)
• citation in grey literature (e.g. mentions in keynote addresses, other conference presentations)

An example of grey metrics: my supervisor (Stella Ng, Director of Research, CFD) and colleague (Lindsay Baker, Research and Education Consultant, CFD) developed a talk on authorship ethics. One of the CFD Program directors heard a brief version of this talk and asked for the slides. That Program director (who also happens to be Vice-Chair of Education for Psychiatry at the University of Toronto) has shared those slides with her department and now they are using the content from those slides around authorship ethics to guide all their authorship conversations, to ensure ethical practice. In addition, Stella and Lindsay developed an authorship ethics simulation game to teach about authorship ethics issues. A colleague asked for this game to be shared and it has now been used in workshops at other institutions. These were personal asks from a colleague, but which demonstrate impact in terms of how Stella and Lindsay’s research is being applied in education to prepare health professionals for ethical practice in relation to authorship. Tracking these personal requests builds a strong case of impact beyond traditional metrics or altmetrics.
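Although grey metrics are by definition hard to capture systematically, even a lightweight personal log makes these serendipitous requests countable later. A minimal sketch in Python follows; the categories and fields are my own hypothetical choices, not part of any published framework:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class GreyMetric:
    """One serendipitous indicator of impact, recorded as it happens."""
    when: date
    kind: str     # e.g. 'slide request', 'informal consultation', 'panel invitation'
    source: str   # who asked, or where the work was mentioned
    note: str     # what the interaction shows about the research's reach

def summarise(entries):
    """Count grey-metric entries by kind, for an at-a-glance impact narrative."""
    counts = {}
    for entry in entries:
        counts[entry.kind] = counts.get(entry.kind, 0) + 1
    return counts
```

Over a year, a handful of such entries ("three departments adopted the slides", "two informal consultations informed funded projects") can be turned into exactly the kind of impact narrative described above.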

There is interesting work coming out of management learning and education4 and arts-based research5 examining different ways to think about impact. The National Information Standards Organization (NISO) is also working on identifying and defining alternative outputs in scholarly communications, and appropriate calculation methodologies.6 The NISO Phase 2 documents were open for public comment until April 20, 2016; although that period has now closed, check the website for their revised documents.

As we work on broadening our conceptions of what counts as research impact, we must try to resist the urge to further quantify our achievements (and worth) as researchers. These blog posts are not meant to be prescriptive about what types of indicators to track. I want to encourage researchers to think about what indicators are most appropriate and align best with their context and work.

We must always be cognizant and vigilant that the time we spend tracking impact could often be better spent doing work that has impact.

References:
1. Becker Medical Library. Assessing the Impact of Research. 2016. Available at: https://becker.wustl.edu/impact-assessment. Accessed July 20, 2016.
2. Becker Medical Library. The Becker List: Impact Indicators. February 04, 2014. Available at: https://becker.wustl.edu/sites/default/files/becker_model-reference.pdf. Accessed July 20, 2016.
3. Kuruvilla S, Mays N, Pleasant A, Walt G. Describing the impact of health research: a Research Impact Framework. BMC Health Services Research. 2006;6(1):1. doi:10.1186/1472-6963-6-134
4. Aguinis H, Shapiro DL, Antonacopoulou EP, Cummings TG. Scholarly impact: A pluralist conceptualization. Academy of Management Learning & Education. 2014;13(4):623-39. doi:10.5465/amle.2014.0121
5. Boydell KM, Hodgins M, Gladstone BM, Stasiulis E, Belliveau G, Cheu H, Kontos P, Parsons J. Arts-based health research and academic legitimacy: transcending hegemonic conventions. Qualitative Research. 2016 Mar 7 (published online before print). doi:10.1177/1468794116630040
6. National Information Standards Organization. Alternative Metrics Initiative. 2016. Available at: http://www.niso.org/topics/tl/altmetrics_initiative/#phase2. Accessed July 20, 2016.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Personality types and the 360° survey for professional development, Or “Being a Bulldozer and Still Winning Hearts”

by Tegan Darnell
Research Librarian
University of Southern Queensland, Australia

This short article is about how evidence-based practice applies on a personal/individual level, and how I’m using the outcomes of survey tools for reflective professional development.

As part of an ongoing leadership development program, I have completed a Myers-Briggs Type Indicator® (MBTI), and the slightly less well known 360° Life Styles Inventory™ (LSI). Both are evidence-based and meet rigorous academic and psychometric standards.

Although reluctant to be categorised, I have committed to ‘develop’ my practice and become a better and more effective leader. I endeavour to take what I have learned from this and use it in my practice.

The MBTI® told me I am an ENTJ type, or 'the commander', closely correlated with the 'Fieldmarshal' in the Keirsey Temperament Sorter (KTS). An ENTJ can be dictatorial, abrasive, and insensitive. Notable ENTJ types include Margaret Thatcher, Vladimir Putin, Steve Jobs, and Gordon Ramsay (the British chef who swears at people a lot).

It isn't all bad… Basically, an ENTJ is a natural-born leader with a 'forceful' personality. Also, Intuition (N) types have been shown to have significantly higher ego development (Vincent, Ward, & Denson, 2013) – apparently that is a good thing.

Last time I took the same test I came out as an INFJ, or ‘the advocate’, (the ‘counsellor’ according to the KTS) so my new result was somewhat of a shock. As a committed researcher-practitioner, however, I have to accept what the data is telling me. Quite a lot of time has passed since I last took the full questionnaire…

However, it was the 360° survey that was the most revealing.

In a 360° survey, not only do you complete a survey about your behaviours, but so do your boss, your direct reports, and your peers and colleagues. The differences between your self-evaluation and others' perceptions of you are revealing.

I am a human bulldozer.

My colleagues rated me as unhealthily competitive, approval-seeking, and lacking in actual performance. Apparently I disregard others' feelings and come across as cold and insensitive. On the positive side, my colleagues see me as an independent and unconventional colleague, and I am nowhere near as avoidant as I thought I was.

Sometimes the evidence does not say what you want it to say. When the data is about a beloved Library service or resource, this is hard to take; when the data is about your personal performance and behaviour, it can be particularly difficult to reconcile. But, as with all research, I have asked the question, I have collected the data, and I have the results. Now, with this information, I need to decide what action to take.

An ENTJ appreciates and welcomes objective and rational statements about what they do well and what could be done better. Criticisms, to me, mean: “Here is your challenge. Do whatever is in your power to make it happen”. So, I have accepted this challenge.

Being a bulldozer was never my intention, but if I am a bulldozer, I'll be a bulldozer with friends, thank you. I'll be working on my 'encouraging' and 'humanistic' behaviours, and doing lots of open communication (i.e. 'listening') over the next few weeks.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Member-Driven Decision Making

by Gwen Schmidt
Manager, Branches, Saskatoon Public Library

How many librarians does it take to change a lightbulb? If you can believe it, there are at least four different answers to this joke. My favourite is “Only one. But first you have to have a committee meeting.”

I have just finished a two-year term as the President of the Saskatchewan Library Association (SLA), following a period of previous involvement on the Board. It has been a fascinating time of change in our organization’s history, with a reshaping of our direction, our governance, and our style of decision making. I was glad to be a part of it.

One of the most interesting things about this time of change was the renewed SLA commitment to a member-driven philosophy. In 2010, the SLA membership determined that our Board structure needed an overhaul, and a Board Governance Task Force was struck. The Task Force took a look at our history, our values, our goals, and the challenges ahead, and set us on a new path with a new vision – central to which was the idea that our members would lead us.

We were always trying to be member-driven, but this renewed commitment came at a time when cheap or free consultative software tools existed in abundance, and when social media had given individuals the expectation that they could have their say easily. It was easier than ever before to make 'member-driven decision making' a reality.

My presidency fell during a time when strategic planning needed to be done. Instead of just doing it at the Board level, we did a broad survey of the members to find out what they think SLA should be doing, what they need from their library association, and what they think we do well. Once we had that data, we also took the opportunity at the annual conference to have conversations with people in person about it. We took broad themes out of the survey data, and had an in-person membership consultation where people could expand on those themes. All of that consultation helped us to build a robust strategic plan that is taking us forward.

During the same time, provincial, territorial, and national library associations across Canada were considering building a new national federation together, which ultimately became the Canadian Federation of Library Associations (CFLA). SLA was invited to participate. Our member-driven philosophy set a road map for us: before committing to participation, we took the question to our members. Did they want us to go forward as part of that federation? If yes, within what parameters? Our members gave us a resounding mandate and endorsed a set of parameters going forward. Consulting them throughout the process of building the CFLA also identified a problem to be solved: the CFLA Prairie Provinces Representative position would be shared between Saskatchewan and Manitoba. Knowing what our members wanted allowed us to set up a Saskatchewan-Manitoba working group to determine the structure of the Prairie Provinces rep and to ensure strong communication and representation.

In associations, 'member-driven decision-making' sounds a little – or a lot – like evidence-based decision making. Instead of doing what we think members want us to do, we ask them what they want us to do, and then do that. Those collaborative conversations take time, but they build trust and energy, and give better results in the end.

How many member perspectives does it take to make an association truly shine? A heckuvalotta them. But that makes the future bright.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.