Teaching scholarly communication skills to graduate students

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

Part 1: 7 May 2019

I am writing this first part of the blog post from my comfortable hotel room in downtown Minneapolis, where I have arrived ahead of the LOEX Annual Conference that starts at the end of the week. Navigating US Immigration in Chicago after the longest flight I have ever taken (15 hours!!) took its toll, but I am hoping that arriving a few days early will give me a chance to more or less recover from jet lag before the event begins on the 9th.

The time will also allow me to put the finishing touches to my breakout session presentation. I’ll be talking about the efforts we have been making at HKBU Library to ensure our graduate students are equipped with the scholarly communication knowledge and skills they will need to be successful researchers. For several years we have run required library workshops for our research students covering the basics of scholarly publishing. These sessions also sought to raise awareness of current issues in scholarly communication, such as open access and altmetrics. Although student feedback has generally been positive, we found it challenging to design sessions suitable both for novice researchers and for graduate students who already had publication experience. We also wanted to better assess the extent to which students were achieving the session’s learning outcomes, as the results of relatively simple in-class exercises could tell us only so much.

Our new approach, launched this year, has been to adapt our workshop content into a modular online course, designed so that students can skip content they are already familiar with. To fulfill the new course requirement, students need to achieve a passing grade on a short online quiz assessing their knowledge of the course content. In my presentation I’ll be sharing the results from our first year of implementation. I’m also hoping to find out what approaches other institutions are taking, and to that end I’ll be using Mentimeter for the entire presentation. I’m a little nervous about having to rely on an online service, but fingers crossed that it runs smoothly. A side benefit of collecting audience responses this way is that I will be able to share the results in the second part of this blog post.

Part 2: 11 May 2019

All done! The conference was excellent – there were so many things that I will be bringing back to my own institution. As for my own presentation, everything went smoothly on the technology front. Mentimeter worked as advertised, and the interactive segments seemed to help keep things interesting for the audience, with their responses incorporated into the presentation material in real time. For example, the results for the question “How is support for graduate student scholarly communication skill development assessed at your institution?” supported a point that I had seen in the literature: that this type of support for graduate students is often not formally assessed.

I also used the built-in quiz function in Mentimeter to showcase some of the questions that we use to assess student learning. Shout out to Marcela for winning!

You can view the full presentation (including the results of the audience voting) here: https://www.mentimeter.com/s/e1451a492dd1d3a21747448a6ff3ce70

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Ethical Publishing Choices and the Librarian Researcher

by DeDe Dawson @dededawson
Science Library, University of Saskatchewan

As librarians we have a unique vantage point on the scholarly publishing market – both as publishing researchers ourselves and as our institution’s agents in acquiring content from publishers. We are perfectly situated to appreciate the dysfunction and unsustainability of the current for-profit system, and I believe we have a professional obligation to raise our university colleagues’ awareness of this issue. Certainly many of our faculty colleagues already have some level of awareness, but the details and extent of the problem remain mostly hidden to the average person outside of libraries.

In the past month or so I have been riveted by the steady stream of news and analyses of the University of California (UC) system’s cancellation of all Elsevier journal titles. It is not that the UC system cannot afford the big deal subscription; UC is taking a principled stand, with the key goal of “securing universal open access to UC research while containing the rapidly escalating costs associated with for-profit journals.”

The UC libraries have worked for a decade or so to raise faculty awareness on their campuses of the problems with the current publishing system and the benefits of a transition to open access. As a result, the faculty are largely supportive of the stance the UC libraries took with Elsevier. Some have even started a petition to boycott Elsevier in support of open access: those who sign resolve to publish their work elsewhere and to refuse to donate their time as reviewers and editorial board members. The free content and labour provided by authors, reviewers, and editors is why commercial scholarly publishers are so extremely profitable. As Adriane MacDonald and Nicole Eva of the University of Lethbridge note: It’s time to stand up to the academic publishing industry.

This is not just a library problem, and solutions need to come with the active involvement of the whole community of authors, reviewers, editors, and readers. Authors, reviewers, and editors in particular have real power! As Lorcan Dempsey writes at the end of his recent blog post on the UC cancellations:

“The UC action has galvanized attention. For Elsevier, the financial impact may be less of an issue than the potential loss of participation in their journals of UC authors, editors, and reviewers. This is because of the scale of the UC research enterprise. For faculty elsewhere, it is potentially important as an exemplary event – the example of UC authors may have more of an influence than the exhortation of their library. For other consortia and libraries it is a call to action.”

What about us? As librarian-researchers, those most aware of the problems in the current system, do we have an ethical obligation to lead by example with our publishing, editorial, and reviewing choices?

Personally, I think so. For years I have chosen to publish my research only in open access journals, and I will not donate my time as a peer reviewer or editorial board member to closed-access, for-profit journals either. I consider this an ethical and values-driven decision. Having said that, I recognize that I am in a privileged position as a tenured librarian (though I made this decision well before I achieved tenure), so I will not judge those who feel they need to publish in certain titles for career advancement. I only note that this felt need is itself the underlying reason for this dysfunctional market: the incentive structures in academia are extremely problematic. If we could let go of our addiction to “high impact” and “prestige” journals, and instead judge research by its own merits (not the package it comes in), then we could free ourselves from the grip of the Elseviers of the world. But I have already written an entire blog post on that…

I’ll end with a reminder that the C-EBLIP website hosts a list of peer-reviewed LIS journals; those that are open access are identified by the orange open lock symbol!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Responsible Metrics Movement: Don’t Judge Research by the Package it Comes In!

by DeDe Dawson @dededawson
Science Library, University of Saskatchewan

I often rail against the unsustainability and inequity of the current subscription journal publishing system. We have the technology, the money (if we disinvest from the current system), and the ingenuity to completely re-imagine this system (see Jon Tennant’s recent article – it is short and worth your time!). A new system could be entirely open, inclusive, and democratic: enabling anyone in the world to read and build upon the research. This has the potential to dramatically increase the speed of progress in research as well as its uptake and real-world impact. The return on investment for universities and research funders would be considerable (this is exactly why many funders are adopting open access policies).

So, why is it so hard to get to this ScholComm paradise?

It is a complex system, with many moving parts and vested interests. And getting to my idealistic future is also a huge collective action problem. But I think there’s more going on that holds us back…

Have you ever heard of the analytical technique called The 5 Whys? It is designed to get at the root cause of a problem: you just keep asking “why?” until you reach it (this may take more or fewer than five whys, obviously!). Addressing the root cause is more effective than pouring loads of time and resources into fixing all the intermediary issues. (For readers who think in code, a small sketch of the idea follows the worked example below.)

I’ve used The 5 Whys numerous times when I’m stewing over this dilemma of inertia in transitioning to a new model of scholarly publishing. I always arrive at the same conclusion. (Before reading on, why don’t you try this and see if you arrive where I always do?)

1st Why: Why is it so hard to transition to a new, more sustainable model of publishing?
Answer: Because the traditional subscription publishers are so powerful; they control so much!

2nd Why: Why are they so powerful?
Answer: Because many researchers insist on publishing in their journals.

3rd Why: Why do they insist on publishing in those journals?
Answer: Because they are addicted to the prestige titles and impact factors of those journals.

4th Why: Why are they addicted to these things?
Answer: Because they feel that their career depends on it.

5th Why: Why do they think that their careers depend on this?
Answer: Hiring & merit committees, tenure & promotion committees, and granting agencies often judge the quality of research based on the prestige (or impact factor) of the journal it is published in.

Of course there are many variations in how to ask and answer these questions, and associated problems emerge along the way as well. But the underlying problem I always arrive back at is the perverse incentive system in higher education and the “Publish or Perish” mentality.

Ok, so now let’s ask a “How?” question…

If academia’s incentive systems are one of the major factors holding us back from transitioning to a more sustainable publishing system then… How do we change the incentives?

The Responsible Metrics Movement has been growing in recent years. Two statements are fueling this movement:
The San Francisco Declaration on Research Assessment (DORA)
Leiden Manifesto for Research Metrics

Each of these statements advocates for academia to critically examine how it assesses research, and encourages the adoption of responsible metrics (or methods) that judge research on its own merits and not the package it comes in (i.e. the prestige of the journal). DORA focuses primarily on combating the problem of journal-based metrics (the problems with the Journal Impact Factor are well known) and makes a number of suggestions for action by various stakeholders, while the Leiden Manifesto is more comprehensive, setting out ten principles. See this video for a nice overview of the Leiden Principles.

Evaluating researchers by actually reading their published outputs seems like an obvious solution… until you are on one of those hiring committees (or tenure/promotion/merit committees, or grant adjudication committees, etc.) and faced with a stack of applications – each with a long list of publications for you to read and assess! Instead, Stephen Curry (Chair of the DORA Steering Committee and a passionate advocate in this area) suggests that candidates compile a one- or two-page “bio-sketch” highlighting their best outputs and community contributions. I recently came across a research centre that is using just such a method to assess candidates:

“…we prefer applicants to select which papers they feel are their most important and write a short statement explaining why.”

From the Centre for Mechanochemical Cell Biology (CMCB)

DORA is also collecting examples of “Good Practices” like this on their website.

In my experience, many researchers are aware of these problems with journal-level metrics and the over-emphasis on glamour journals. It has even been noted that Nobel Prize winners of the past would likely not succeed in today’s hyper-competitive publish-or-perish climate. But researchers often feel powerless to change the system. This is why I particularly like the last paragraph of the CMCB blurb above:

“As individuals within CMCB, we argue for its principles during our panel and committee work outside CMCB.”

Researchers are the ones making up these committees assessing candidates! Use your voice during those committee meetings to argue for responsible metrics. Use your voice when your committee is drawing up the criteria by which to assess a candidate. Use your voice during collegial meetings when you are revising your standards for tenure/promotion/merit. You have more power than you realize.

Ingrained traditions in academia don’t change overnight. This is a long game of culture change. Keep using your voice until other voices join you and you wear down those traditions and the culture changes. Maybe in the end we’ll not only have responsible metrics but sustainable, open publishing too!

Recommended Further Reading:

Lawrence, P. A. (2008). Lost in publication: How measurement harms science. Ethics in Science and Environmental Politics, 8(1), 9-11. https://doi.org/10.3354/esep00079

Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314, 498-502. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2126010/

Vanclay, J. K. (2012). Impact factor: Outdated artefact or stepping-stone to journal certification? Scientometrics, 92(2), 211-238. https://doi.org/10.1007/s11192-011-0561-0

P.S. Assessing the actual research instead of the outlet it is published in has implications for the “Predatory Publishing” problem too. Martin Eve and Ernesto Priego wrote a fantastic piece that touches on this:

Eve, M. P., & Priego, E. (2017). Who is Actually Harmed by Predatory Publishers? TripleC: Communication, Capitalism & Critique, 15(2), 755–770. http://www.triple-c.at/index.php/tripleC/article/view/867

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.