Responses of universities to altmetrics

The 2015 Altmetrics Workshop
Amsterdam, 9 October 2015

Julie Birkholz
Jeroen Huisman


Arguably the hottest topic in discussions of scholarly impact is altmetrics: a complementary or contrasting set of tools to traditional research metrics for measuring and monitoring scholarly activity in the online domain (Neylon & Wu 2009; Priem & Hemminger 2010). An increasing number and variety of altmetrics have emerged (Torres-Salinas, Cabezas-Clavijo & Jiménez-Contreras 2013), along with tools for users (libraries, researchers, reviewers, funders) to evaluate altmetric activity (e.g. Altmetric, ReaderMeter, ImpactStory). Governments (education ministries), funding bodies and the like have remained relatively silent in these discussions, with few having (in)formally responded to how altmetrics should be considered in a science system. Among the responses have been white papers and reports, largely financed by science agencies and not-for-profit organisations, assessing not only altmetrics but the role of metrics in science in general (NISO 2014; Hicks et al. 2015).
To take stock of what is happening in response to the ongoing scientific and also practical role of altmetrics in research assessment and communication, we investigate what actions higher education institutions (HEIs) in particular are taking in relation to altmetrics. Knowledge about the responses of HEIs is important because HEIs serve as intermediaries between researchers, society, and steering bodies such as funding agencies and governments; they have the ability to stipulate the formal day-to-day policies governing researchers’ work, as well as to negotiate with these steering bodies.
Through Web research, we investigate how universities have responded and explore whether we can explain what drives universities’ engagement (or lack thereof) with altmetrics. We focus our exploratory research on HEIs in the UK, where this topic has received some attention from funding agencies (Wilsdon et al. 2015; Wouters 2014).
Findings show that, despite the lack of a formal place or role for altmetrics in research assessment on the part of funding agencies or governments, which serve as one steering mechanism within the science system, universities are responding in different ways: ranging from acknowledgements by university libraries of how to consider altmetrics, to the emergence of alliances and discussion platforms between universities, to official press releases on altmetrics, as well as to the complete absence of any digital trace of altmetrics. This allows us to identify questions for further research (e.g. examining the nature of university responses in relation to their institutional position in their fields of higher education), but also helps practitioners and advocates of altmetrics to identify which HEIs may be more or less receptive to their call.


Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431.
Neylon, C., & Wu, S. (2009). Article-level metrics and the evolution of scientific impact. PLoS Biology, 7(11), e1000242.
NISO (2014). Alternative Metrics Initiative, Phase 1, White Paper. Accessible:
Priem, J., & Hemminger, B. H. (2010). Scientometrics 2.0: New metrics of scholarly impact on the social Web. First Monday, 15(7).
Torres-Salinas, D., Cabezas-Clavijo, Á., & Jiménez-Contreras, E. (2013). Altmetrics: New indicators for scientific communication in web 2.0. arXiv preprint arXiv:1306.6595.
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. HEFCE.
Wouters, P. (2014). A key challenge: the evaluation gap. Accessible:

Supplementary materials