White paper on Next Generation Metrics
Report, 2020

We, the writers of this paper, summarise a methodological debate amongst experts from our Members on 'traditional' and 'next generation' metrics for science, education and innovation, in the light of developments and expectations towards greater 'openness' to realise long-term ecological, economic and social sustainability and benefit to citizens and to the world. A broad range of indicators from various sources was discussed in terms of feasibility in different contexts, as well as suitability to serve diverse purposes. Rather than presenting a formal position on behalf of CESAER, we present our synthesis of this debate. In chapter one, we provide the definitions, describe the methodology used and set the scope of this paper, thus setting the scene for the following chapters.
In chapter two, we report on our findings on metrics dealing with (open) science. Ever since E. Garfield's Journal Impact Factor (JIF) came into use in the mid-1970s, and certainly since the h-index was proposed by the physicist J. E. Hirsch in 2005, the rise of quantitative metrics in the assessment of research has seemed unstoppable, extending as far as counts of 'views', 'likes' and 'tweets' (both measures are defined below). While, in times of accountability and competition for visibility and funds, it is only reasonable to focus on measurable and comparable metrics as efficient means to display performance, the limitations of doing so are obvious. As a result, a countermovement criticising this practice and questioning the validity of the metrics and the reliability of the underlying data has grown stronger in recent years. Moreover, there are strong (political) expectations to make science more open.

Metrics for (open) education and training are the topic of chapter three. In many (global) rankings of higher education institutions, the indicators used reflect the model of traditional, established, wealthy and largely English-speaking research universities (Hazelkorn, 2015). They are therefore ill-suited to convey the quality or performance of higher education more broadly, and of limited help to universities in setting priorities. They do, however, reveal that there is still a lack of meaningful, internationally comparable information on these matters.

By covering (open) innovation in chapter four, we complete the discussion of the mission of our Members. Open innovation promotes approaches that favour disruptive rather than incremental innovation, stimulates inventions produced by outsiders and founders in start-ups, and rests on a view of the world in which knowledge is widely distributed.
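For reference, the two measures named above can be stated compactly. These are the standard textbook formulations, added here for illustration; they are not taken from the debate itself. The JIF of a journal for year $Y$ is the ratio

  $\mathrm{JIF}_Y = \dfrac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}$

where $C_Y(y)$ is the number of citations received in year $Y$ by items the journal published in year $y$, and $N_y$ is the number of citable items it published in year $y$. The h-index of a researcher whose publications are sorted by citation count $c_1 \ge c_2 \ge \dots \ge c_n$ is

  $h = \max\{\, i : c_i \ge i \,\}$

that is, the largest $h$ such that $h$ of the papers have at least $h$ citations each. For example, a researcher whose papers have 10, 6, 5, 3 and 1 citations has $h = 3$.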
We synthesised our findings on the confrontation between 'traditional' and 'next generation' metrics and present ten metrics each for science, education and innovation, for use mainly within our Members and to monitor the desired progress over time (see annexe I). While this might be interpreted as a sufficient response to external expectations, we advanced further: in chapter five we suggest that universities strive towards 'progressive metrics', highlighting the need to acknowledge knowledge as a common good, to promote a culture of quality, risk-taking and trust, and to measure the contribution to sustainability. That is why we conclude this paper with ideas for progressive indicators in annexe II, outlining an agenda for future work: to stay at the forefront of science, education and innovation; to benchmark against like-minded institutions; to pursue institutional development paths; and, ultimately, to optimise our contributions to society and the world.

Keywords

bibliometrics

ranking

altmetrics

open science

scientometrics

Authors

Ingrid Bauer

Vienna University of Technology

David Bohmert

CESAER

Alexandra Czarnecka

Delft University of Technology

Thomas Eichenberger

Swiss Federal Institute of Technology in Zürich (ETH)

Juan Garbajosa

Technical University of Madrid

Horia Iovu

Politehnica University of Bucharest (UPB)

Yvonne Kinnaird

University of Strathclyde

Ana Carla Madeira

University of Porto

Mads Nygård

Norwegian University of Science and Technology (NTNU)

Per-Anders Östling

Royal Institute of Technology (KTH)

Susanne Räder

Technische Universität Dresden

Mario Ravera

Polytechnic University of Turin

Per-Eric Thörnström

Chalmers University of Technology

Kurt De Wit

KU Leuven

Subject Categories

History of Ideas

Information Studies

DOI

10.5281/zenodo.3874801

Publisher

CESAER

Latest update

23 October 2023