Altmetrics: what does it measure? Is there a role in research assessment? C-EBLIP Journal Club April 6, 2015

by Li Zhang
Science and Engineering Libraries, University of Saskatchewan

Finally, I had the opportunity to lead the C-EBLIP Journal Club on April 6, 2015! This was originally scheduled for January, but was cancelled due to my injury. The article I chose was:

How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications, by Zohreh Zahedi, Rodrigo Costas, and Paul Wouters. Scientometrics, 2014, Vol. 101(2), pp. 1491–1513.

There are several reasons why I chose this article on altmetrics. First, in the University of Saskatchewan Library, research is part of our assignment of duties, and how to evaluate librarians’ research outputs has inevitably become a topic of discussion in the collegium. Citation indicators are probably the most widely used tools for evaluating publications, but with the advancement of technology and new modes of communication, how do we capture the impact of scholarly activities in those alternative venues? Altmetrics seems to be a timely addition to the discussion. Second, altmetrics is an area in which I am interested in developing my expertise. My research interests encompass bibliometrics and its application in research evaluation, so it is natural to extend them to this emerging field. Third, this paper not only presents detailed information on the methods used in the research but also provides a balanced view of altmetrics, helping us understand how an altmetric analysis is conducted and making us aware of the issues surrounding these new metrics.

We briefly discussed the methodology and main findings of the article. Some of the interesting findings include: Mendeley readership was probably the most useful source for altmetrics, while mentions of publications in other types of media (such as Twitter, Delicious, and Wikipedia) were very low; Mendeley readership counts also had a moderate positive correlation with citation counts; and in some fields of the social sciences and humanities, altmetric counts were actually higher than citation counts, suggesting that altmetrics could be a useful tool for capturing the impact of scholarly publications from different sources in these fields, in addition to citation indicators.

Later in the session, we discussed a couple of issues related to altmetrics. Although measuring the impact of scholarly publications in alternative sources has gained notice, it is not yet clear why publications are mentioned in these sources. What kind of impact do altmetrics measure? With traditional citation indicators, at least we know that the cited articles stimulated or informed the current research in some way (whether positively or negatively). In contrast, a paper appearing in Mendeley has not necessarily been read, and a paper mentioned on Twitter could be mere self-promotion (there is nothing wrong with that!). From here, we extended our discussion to publishing behaviours and promotion strategies. Are social scientists more likely than natural scientists to use social media to promote their research and publications? The award criteria and merit systems in academia will also play a role: if altmetrics are counted as an indication of publication quality, we may see a sudden surge in social media use by researchers. Further, altmetrics are much easier to manipulate than citation metrics. Care needs to be taken before we can confidently use altmetrics as a reliable tool for measuring scholarly activity.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.