Abstract
Background: Citation screening is time-consuming and inefficient. We sought to evaluate the performance of Abstrackr, a semi-automated online tool for predictive title and abstract screening.
Methods: Four systematic reviews (aHUS, dietary fibre, ECHO, rituximab) were used to evaluate Abstrackr. Citations from electronic searches of biomedical databases were imported into Abstrackr, and titles and abstracts were screened and included or excluded according to the entry criteria. This process continued until Abstrackr predicted and classified the remaining unscreened citations as relevant or irrelevant. These predictions were checked for accuracy against the original review decisions. Sensitivity analyses were performed to assess the effects of including case reports in the aHUS dataset during screening and the effects of using a larger, imbalanced dataset for the ECHO review. The performance of Abstrackr was calculated according to the number of relevant studies missed, the workload saving, the false negative rate, and the precision of the algorithm in correctly predicting relevant studies for inclusion, i.e. further full-text inspection.
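The abstract does not give the exact formulas used for these metrics, so the following is a minimal sketch of how they could be computed; the definitions (workload saving as the share of citations never screened manually, precision and false negative rate taken over Abstrackr's predictions for the unscreened citations) and the example figures are assumptions for illustration, not values from the paper.

```python
# A minimal sketch (not the authors' code) of the screening metrics named above,
# under assumed definitions.

def screening_metrics(n_total: int, n_screened: int,
                      tp: int, fp: int, fn: int) -> dict:
    """Summarise one semi-automated screening run.

    tp = relevant citations correctly predicted as relevant
    fp = irrelevant citations predicted as relevant
    fn = relevant citations predicted as irrelevant (i.e. missed)
    """
    workload_saving = 1 - n_screened / n_total                   # share of citations never screened manually
    precision = tp / (tp + fp) if (tp + fp) else 0.0             # correct "relevant" predictions
    false_negative_rate = fn / (tp + fn) if (tp + fn) else 0.0   # share of relevant citations missed
    return {
        "relevant_missed": fn,
        "workload_saving": workload_saving,
        "precision": precision,
        "false_negative_rate": false_negative_rate,
    }

# Hypothetical numbers for illustration only (not taken from the four reviews):
print(screening_metrics(n_total=2000, n_screened=800, tp=30, fp=150, fn=1))
```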
Results: Of the unscreened citations, Abstrackr's prediction algorithm correctly identified all relevant citations for the rituximab and dietary fibre reviews. However, one relevant citation in each of the aHUS and ECHO reviews was incorrectly predicted as not relevant. The workload saving achieved with Abstrackr varied with the complexity and size of the reviews (9% rituximab, 40% dietary fibre, 67% aHUS, and 57% ECHO). The proportion of citations predicted as relevant, and therefore warranting further full-text inspection (i.e. the precision of the prediction), ranged from 16% (aHUS) to 45% (rituximab) and was affected by the complexity of the reviews. The false negative rate ranged from 2.4% to 21.7%. Sensitivity analysis on the aHUS dataset increased the precision from 16% to 25% and increased the workload saving by 10%, but also increased the number of relevant studies missed. Sensitivity analysis with the larger ECHO dataset increased the workload saving (80%) but reduced the precision (6.8%) and increased the number of missed citations.
Conclusions: Semi-automated title and abstract screening with Abstrackr has the potential to save time and reduce research waste.
| Original language | English |
| --- | --- |
| Article number | 80 |
| Journal | Systematic Reviews |
| Volume | 4 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 15 Jun 2015 |
Student theses
Automating Systematic Reviews
Author: Rathbone, J., 7 Oct 2017. Supervisors: Glasziou, P. P., Hoffmann, T. C. & Beller, E. M.
Student thesis: Doctoral Thesis