Quantifying the reproducibility of scientometric analyses: a case study
Paper in proceedings, 2018
The reproducibility of scientific articles and their findings has gained importance in recent years. Although most efforts have been made in biomedicine, health, and psychology, reproducibility is important and necessary in all research fields. In this contribution, we therefore carried out an empirical evaluation of the reproducibility of scientometric studies. To do so, 285 articles published in the journal Scientometrics in 2017 were examined in terms of the following reproducibility artifacts: workflow, search strategy, database, software, the availability of the source code (where applicable), and the availability of the dataset. Our findings showed that, whilst the workflow and search strategy were well described in the majority of articles, the dataset used was shared by very few studies. Data were usually retrieved from the WoS and Scopus databases. Finally, only a few articles shared the source code where ad-hoc software was used.