White paper on Next Generation Metrics
Report, 2020

We, the writers of this paper, summarise a methodological debate amongst experts from our Members on 'traditional' and 'next generation' metrics for science, education and innovation, in the light of developments and expectations towards greater 'openness' to realise long-term ecological, economic and social sustainability and benefit to citizens and to the world. A broad range of indicators from various sources was discussed in terms of feasibility in different contexts, as well as suitability to serve diverse purposes. Rather than presenting a formal position on behalf of CESAER, we present our synthesis of this debate. In chapter one, we provide definitions, describe the methodology used and set the scope of this paper, thus setting the scene for the following chapters.
In chapter two, we report our findings on metrics dealing with (open) science. Ever since E. Garfield's Journal Impact Factor (JIF) came into use in the mid-1970s, and certainly since the h-index proposed by the physicist J. E. Hirsch in 2005, the rise of quantitative metrics in the assessment of research has seemed unstoppable, extending even to the use of 'views', 'likes' and 'tweets'. While in times of accountability and competition for visibility and funds it is only reasonable to focus on the measurability and comparability of metrics as efficient means to display performance, the limitations of doing so are obvious. As a result, in recent years a countermovement criticising this practice and questioning the validity of the metrics and the reliability of the underlying data has grown stronger. Moreover, there are strong (political) expectations to make science more open.

Metrics for (open) education and training are the topic of chapter three. In many (global) rankings of higher education institutions, the indicators used reflect the model of traditional, established, wealthy and largely English-speaking research universities (Hazelkorn, 2015). They are therefore ill-suited to convey the quality or performance of higher education more broadly, and they are of limited help to universities in setting priorities. They do, however, reveal that there is still a lack of meaningful, internationally comparable information on these matters.

By covering (open) innovation in chapter four, we complete the discussion of the mission of our Members. Open innovation promotes approaches that foster disruptive rather than incremental innovation, stimulates inventions produced by outsiders and founders in start-ups, and rests on a view of the world in which knowledge is widely distributed.
We synthesised our findings on the confrontation between 'traditional' and 'next generation' metrics and present ten of each for science, education and innovation, intended mainly for use within our Members and to monitor the desired progress over time (see annexe I). While this might be interpreted as sufficient responsiveness to external expectations on our part, we instead advanced further: in chapter five we suggest that universities strive towards 'progressive metrics' and highlight the need to acknowledge knowledge as a common good, to promote a culture of quality, risk-taking and trust, and to measure the contribution to sustainability. That is why we conclude this paper with ideas for progressive indicators in annexe II, outlining an agenda for future work to stay at the forefront of science, education and innovation; to benchmark against like-minded institutions; to pursue institutional development paths; and, ultimately, to optimise our contributions to society and the world.

bibliometrics

ranking

altmetrics

open science

scientometrics

Authors

Ingrid Bauer

Technische Universität Wien

David Bohmert

CESAER

Alexandra Czarnecka

TU Delft

Thomas Eichenberger

Eidgenössische Technische Hochschule Zürich (ETH)

Juan Garbajosa

Universidad Politecnica de Madrid

Horia Iovu

Universitatea Politehnica din Bucuresti (UPB)

Yvonne Kinnaird

University of Strathclyde

Ana Carla Madeira

Universidade do Porto

Mads Nygård

Norges teknisk-naturvitenskapelige universitet

Per-Anders Östling

Kungliga Tekniska Högskolan (KTH)

Susanne Räder

Technische Universität Dresden

Mario Ravera

Politecnico di Torino

Per-Eric Thörnström

Chalmers University of Technology, Communication and Learning in Science, Research Support, Bibliometrics and Ranking

Kurt De Wit

KU Leuven

Subject categories

History of Ideas

Library and Information Science

DOI

10.5281/zenodo.3874801

Publisher

CESAER

Last updated

2023-10-23