Quantifying the reproducibility of scientometric analyses: a case study
Paper in proceedings, 2018

Reproducibility of scientific articles and their findings has gained importance in recent years. Although most efforts have been made in biomedicine, health, and psychology, reproducibility is important and necessary in all research fields. Thus, in this contribution an empirical evaluation of the reproducibility of scientometric studies was carried out. To do so, 285 articles published in the journal Scientometrics in 2017 were examined in terms of the following reproducibility artifacts: workflow, search strategy, database, software, availability of the source code (where applicable), and availability of the dataset. Our findings showed that while workflow and search strategy were well described in the majority of articles, the dataset used was shared by very few studies. The data were usually retrieved from the WoS and Scopus databases. Finally, only a few articles shared the source code where ad-hoc software was used.

open data

scientometrics

reproducibility

Author

Tahereh Dehdarirad

Research support, bibliometrics and ranking

Manuel Jesus Cobo

Department of Computer Science and Engineering, University of Cádiz, Spain

Pablo García-Sánchez

Department of Computer Science and Engineering, University of Cádiz, Spain

Jose A. Moral-Munoz

Department of Nursing and Physiotherapy, University of Cádiz, Spain

23rd international conference on science and technology indicators (STI 2018)

Leiden, Netherlands

Subject Categories

Philosophy

Information Studies

Business Administration

Related datasets

Data for Quantifying the reproducibility of scientometric analyses [dataset]

DOI: 10.6084/m9.figshare.6137501

More information

Latest update

10/27/2023