Irreproducibility in searches of scientific literature: A comparative analysis

Pozsgai, Gábor and Lövei, Gábor and Vasseur, Liette and Gurr, Geoff and Batáry, Péter and Korponai, János and Littlewood, Nick A. and Liu, Jian and Móra, Arnold and Obrycki, John and Reynolds, Olivia and Stockan, Jenni A. and VanVolkenburg, Heather and Zhang, Jie and Zhou, Wenwu and You, Minsheng (2021) Irreproducibility in searches of scientific literature: A comparative analysis. ECOLOGY AND EVOLUTION, 11 (21). pp. 14658-14668. ISSN 2045-7758

Full text: 32403761.pdf (Published Version, 3 MB), available under a Creative Commons Attribution License.

Abstract

Repeatability is the cornerstone of science, and it is particularly important for systematic reviews. However, little is known about how researchers' choice of database and search platform influences the repeatability of systematic reviews. Here, we aim to unveil how the computer environment and the location from which a search is initiated influence hit results. We present a comparative analysis of time-synchronized searches at different institutional locations in the world and evaluate the consistency of hits obtained for each of the search terms using different search platforms. We revealed a large variation among search platforms and showed that PubMed and Scopus returned consistent results to identical search strings from different locations. Google Scholar and Web of Science's Core Collection varied substantially both in the number of returned hits and in the list of individual articles, depending on the search location and computing environment. Inconsistency in Web of Science results most likely emerged from the different licensing packages at different institutions. To maintain scientific integrity and consistency, especially in systematic reviews, action is needed from both the scientific community and scientific search platforms to increase search consistency. Researchers are encouraged to report the search location and the databases used for systematic reviews, and database providers should make search algorithms transparent and revise access rules to titles behind paywalls. Additional options for increasing the repeatability and transparency of systematic reviews are storing both search metadata and hit results in open repositories and using Application Programming Interfaces (APIs) to retrieve standardized, machine-readable search metadata.
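As an illustration of the API-based approach recommended above, the following minimal sketch (in Python; not taken from the article itself) queries the NCBI E-utilities "esearch" endpoint for PubMed and stores the search term, database, timestamp, hit count, and hit list as machine-readable JSON that could be deposited in an open repository. The query string and output file name are illustrative placeholders, not the search strings used in the study.

# Minimal sketch: archive a PubMed search via the E-utilities API so it can be re-run later.
# Assumptions: the query string and output file name below are hypothetical examples.
import json
import urllib.parse
import urllib.request
from datetime import datetime, timezone

def pubmed_search(term, retmax=100):
    """Run a PubMed search through the E-utilities esearch endpoint and return search metadata."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "retmax": retmax,
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{base}?{params}") as resp:
        result = json.load(resp)["esearchresult"]
    # Keep both the search metadata and the hit list, so the record is machine-readable
    # and can be archived alongside the systematic review.
    return {
        "search_term": term,
        "database": "PubMed",
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "total_hits": int(result["count"]),
        "pmids": result["idlist"],
    }

if __name__ == "__main__":
    record = pubmed_search('"systematic review" AND reproducibility')  # example query only
    with open("search_record.json", "w") as fh:
        json.dump(record, fh, indent=2)
    print(record["total_hits"], "hits; first IDs:", record["pmids"][:5])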

Item Type: Article
Additional Information: Korponai János: 8 Department of Biology, Savaria Campus, Eötvös Loránd University, Szombathely, Hungary; 9 Department of Environmental Sciences, Sapientia Hungarian University of Transylvania, Cluj-Napoca, Romania; 10 Department of Water Supply and Sewerage, Faculty of Water Science, National University of Public Service, Baja, Hungary; 11 Aquatic Ecological Institute, Centre for Ecological Research, Budapest, Hungary. The publication was produced within the framework of the 2020 Thematic Excellence Programme of the National University of Public Service (Nemzeti Közszolgálati Egyetem), with the support of the project entitled "Sustainable Security and Social Environment", funded by the Ministry of Innovation and Technology from the National Research, Development and Innovation Fund, on the basis of a grant agreement issued by the National Research, Development and Innovation Office.
Uncontrolled Keywords: database; reproducibility; repeatability; information retrieval; search engine; evidence synthesis methods; search location
Subjects: Q Science / természettudomány > QA Mathematics / matematika > QA76.9.D343 Data mining and searching techniques / adatbányászati és keresési módszerek
SWORD Depositor: MTMT SWORD
Depositing User: MTMT SWORD
Date Deposited: 06 Aug 2024 14:24
Last Modified: 06 Aug 2024 14:24
URI: https://real.mtak.hu/id/eprint/201931
