Development and validation of the ACE tool: Assessing medical trainees' competency in evidence based medicine

Dragan Ilic, Rusli Bin Nordin, Paul Glasziou, Julie K. Tilson, Elmer Villanueva

Research output: Contribution to journal › Article › Research › peer-review

28 Citations (Scopus)
32 Downloads (Pure)

Abstract

Background: While a variety of instruments have been developed to assess knowledge and skills in evidence based medicine (EBM), few assess all aspects of EBM - including knowledge, skills, attitudes and behaviour - or have been psychometrically evaluated. The aim of this study was to develop and validate an instrument that evaluates medical trainees' competency in EBM across knowledge, skills and attitudes. Methods: The 'Assessing Competency in EBM' (ACE) tool was developed by the authors, with content and face validity assessed by expert opinion. A cross-sectional sample of 342 medical trainees representing 'novice', 'intermediate' and 'advanced' EBM trainees was recruited to complete the ACE tool. Construct validity, item difficulty, internal reliability and item discrimination were analysed. Results: We recruited 98 EBM-novice, 108 EBM-intermediate and 136 EBM-advanced participants. A statistically significant difference in the total ACE score was observed and corresponded to the level of training: on a 0-15-point test, the mean ACE scores were 8.6 for EBM-novice; 9.5 for EBM-intermediate; and 10.4 for EBM-advanced (p < 0.0001). Individual item discrimination was excellent (Item Discrimination Index ranging from 0.37 to 0.84), with internal reliability consistent across all but three items (Item Total Correlations were all positive, ranging from 0.14 to 0.20). Conclusion: The 15-item ACE tool is a reliable and valid instrument to assess medical trainees' competency in EBM. The ACE tool provides a novel assessment that measures user performance across the four main steps of EBM. To provide a complete suite of instruments to assess EBM competency across various patient scenarios, future refinement of the ACE instrument should include further scenarios across harm, diagnosis and prognosis.
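The study's analysis code is not published; as a rough illustration only, the sketch below shows how the classical item statistics named in the abstract (item difficulty, an Item Discrimination Index and corrected Item Total Correlations) are commonly computed from a binary response matrix. The function name, the upper/lower 27% grouping rule and the simulated data are assumptions for illustration and are not taken from the study.

# Minimal sketch of classical item analysis (not the authors' code).
# Assumes a 0/1 response matrix of shape (respondents, items).
import numpy as np

def item_statistics(responses, group_fraction=0.27):
    """Return item difficulty, discrimination index and corrected
    item-total correlation for each item of a 0/1 response matrix."""
    responses = np.asarray(responses, dtype=float)
    n_respondents, n_items = responses.shape
    total = responses.sum(axis=1)

    # Item difficulty: proportion of respondents answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Discrimination index: proportion correct in the top-scoring group
    # minus proportion correct in the bottom-scoring group.
    k = max(1, int(round(group_fraction * n_respondents)))
    order = np.argsort(total)
    low, high = responses[order[:k]], responses[order[-k:]]
    discrimination = high.mean(axis=0) - low.mean(axis=0)

    # Corrected item-total correlation: each item against the total
    # score of the remaining items.
    item_total = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]
        item_total[j] = np.corrcoef(responses[:, j], rest)[0, 1]

    return difficulty, discrimination, item_total

# Illustrative use with simulated data for a 15-item test and 342 respondents.
rng = np.random.default_rng(0)
simulated = (rng.random((342, 15)) < 0.65).astype(int)
diff, disc, itc = item_statistics(simulated)
print(np.round(disc, 2), np.round(itc, 2))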

Original language: English
Article number: 114
Journal: BMC Medical Education
Volume: 14
Issue number: 1
DOI: 10.1186/1472-6920-14-114
ISSN: 1472-6920
Publisher: BMC
Publication status: Published - 9 Jun 2014


Cite this

Ilic, D., Nordin, R. B., Glasziou, P., Tilson, J. K., & Villanueva, E. (2014). Development and validation of the ACE tool: Assessing medical trainees' competency in evidence based medicine. BMC Medical Education, 14(1), 114. https://doi.org/10.1186/1472-6920-14-114
