Quantifying the reproducibility of scientometric analyses: a case study
Paper in proceedings, 2018

The reproducibility of scientific articles and their findings has gained importance in recent years. Although most efforts have been made in biomedicine, health, and psychology, reproducibility is important and necessary in all research fields. Thus, in this contribution an empirical evaluation of the reproducibility of scientometric studies was carried out. To do so, 285 articles published in the journal Scientometrics in 2017 were examined in terms of the following reproducibility artifacts: workflow, search strategy, database, software, availability of the source code (where applicable), and availability of the dataset. Our findings show that, whilst the workflow and search strategy were well described in the majority of articles, the dataset used was shared by very few studies. The data were usually retrieved from the Web of Science (WoS) and Scopus databases. Finally, only a few articles shared their source code in cases where ad hoc software was used.

open data

scientometrics

reproducibility

Authors

Tahereh Dehdarirad

Research Support, Bibliometrics and Ranking

Manuel Jesus Cobo

Department of Computer Science and Engineering, University of Cádiz, Spain

Pablo García-Sánchez

Department of Computer Science and Engineering, University of Cádiz, Spain

Jose A. Moral-Munoz

Department of Nursing and Physiotherapy, University of Cádiz, Spain

23rd International Conference on Science and Technology Indicators (STI 2018), Leiden, Netherlands

Subject categories

Philosophy

Library and Information Science

Business Administration

Related datasets

Data for Quantifying the reproducibility of scientometric analyses [dataset]

DOI: 10.6084/m9.figshare.6137501

More information

Last updated

2023-10-27