Evidence-based practice educational intervention studies: A systematic review of what is taught and how it is measured

Research output: Contribution to journal › Article › Research › peer-review

4 Citations (Scopus)
64 Downloads (Pure)

Abstract

BACKGROUND: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions.

METHODS: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique and tracked the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) using Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data of the EBP instruments used.

RESULTS: Of 1831 records identified, 302 full-text articles were screened, and 85 included. Of these, 46 (54%) studies were randomised trials, 51 (60%) included postgraduate level participants, and 63 (75%) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74%). Only 10 (12%) of the studies taught content which addressed all five EBP steps. Of the 85 studies, 52 (61%) evaluated EBP skills, 39 (46%) knowledge, 35 (41%) attitudes, 19 (22%) behaviours, 15 (18%) self-efficacy, and 7 (8%) measured reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high-quality (achieved ≥3 types of established validity evidence) and these were used in 14 (29%) of the 52 studies that measured EBP skills; 14 (41%) of the 39 studies that measured EBP knowledge; and 8 (26%) of the 35 studies that measured EBP attitude.

CONCLUSIONS: Most EBP educational interventions which have been evaluated in controlled studies focused on teaching only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments which address all EBP steps are needed to improve EBP teaching.

Original language: English
Article number: 177
Pages (from-to): 177
Journal: BMC Medical Education
Volume: 18
Issue number: 1
DOI: 10.1186/s12909-018-1284-1
Publication status: Published - 1 Aug 2018

Cite this

@article{ef9cd42cf61441d491ac6e7d4fc642a6,
title = "Evidence-based practice educational intervention studies: A systematic review of what is taught and how it is measured",
abstract = "BACKGROUND: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions. METHODS: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique and tracked the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) using Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data of the EBP instruments used. RESULTS: Of 1831 records identified, 302 full-text articles were screened, and 85 included. Of these, 46 (54{\%}) studies were randomised trials, 51 (60{\%}) included postgraduate level participants, and 63 (75{\%}) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74{\%}). Only 10 (12{\%}) of the studies taught content which addressed all five EBP steps. Of the 85 studies, 52 (61{\%}) evaluated EBP skills, 39 (46{\%}) knowledge, 35 (41{\%}) attitudes, 19 (22{\%}) behaviours, 15 (18{\%}) self-efficacy, and 7 (8{\%}) measured reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high-quality (achieved ≥3 types of established validity evidence) and these were used in 14 (29{\%}) of the 52 studies that measured EBP skills; 14 (41{\%}) of the 39 studies that measured EBP knowledge; and 8 (26{\%}) of the 35 studies that measured EBP attitude. CONCLUSIONS: Most EBP educational interventions which have been evaluated in controlled studies focused on teaching only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments which address all EBP steps are needed to improve EBP teaching.",
author = "Loai Albarqouni and Tammy Hoffmann and Paul Glasziou",
year = "2018",
month = "8",
day = "1",
doi = "10.1186/s12909-018-1284-1",
language = "English",
volume = "18",
pages = "177",
journal = "BMC Medical Education",
issn = "1472-6920",
publisher = "BMC",
number = "1",

}

Evidence-based practice educational intervention studies: A systematic review of what is taught and how it is measured. / Albarqouni, Loai; Hoffmann, Tammy; Glasziou, Paul.

In: BMC Medical Education, Vol. 18, No. 1, 177, 01.08.2018, p. 177.


TY - JOUR

T1 - Evidence-based practice educational intervention studies

T2 - A systematic review of what is taught and how it is measured

AU - Albarqouni, Loai

AU - Hoffmann, Tammy

AU - Glasziou, Paul

PY - 2018/8/1

Y1 - 2018/8/1

N2 - BACKGROUND: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions. METHODS: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique and tracked the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) using Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data of the EBP instruments used. RESULTS: Of 1831 records identified, 302 full-text articles were screened, and 85 included. Of these, 46 (54%) studies were randomised trials, 51 (60%) included postgraduate level participants, and 63 (75%) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74%). Only 10 (12%) of the studies taught content which addressed all five EBP steps. Of the 85 studies, 52 (61%) evaluated EBP skills, 39 (46%) knowledge, 35 (41%) attitudes, 19 (22%) behaviours, 15 (18%) self-efficacy, and 7 (8%) measured reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high-quality (achieved ≥3 types of established validity evidence) and these were used in 14 (29%) of the 52 studies that measured EBP skills; 14 (41%) of the 39 studies that measured EBP knowledge; and 8 (26%) of the 35 studies that measured EBP attitude. CONCLUSIONS: Most EBP educational interventions which have been evaluated in controlled studies focused on teaching only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments which address all EBP steps are needed to improve EBP teaching.

AB - BACKGROUND: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions. METHODS: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique and tracked the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) using Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data of the EBP instruments used. RESULTS: Of 1831 records identified, 302 full-text articles were screened, and 85 included. Of these, 46 (54%) studies were randomised trials, 51 (60%) included postgraduate level participants, and 63 (75%) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74%). Only 10 (12%) of the studies taught content which addressed all five EBP steps. Of the 85 studies, 52 (61%) evaluated EBP skills, 39 (46%) knowledge, 35 (41%) attitudes, 19 (22%) behaviours, 15 (18%) self-efficacy, and 7 (8%) measured reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high-quality (achieved ≥3 types of established validity evidence) and these were used in 14 (29%) of the 52 studies that measured EBP skills; 14 (41%) of the 39 studies that measured EBP knowledge; and 8 (26%) of the 35 studies that measured EBP attitude. CONCLUSIONS: Most EBP educational interventions which have been evaluated in controlled studies focused on teaching only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments which address all EBP steps are needed to improve EBP teaching.

U2 - 10.1186/s12909-018-1284-1

DO - 10.1186/s12909-018-1284-1

M3 - Article

VL - 18

SP - 177

JO - BMC Medical Education

JF - BMC Medical Education

SN - 1472-6920

IS - 1

M1 - 177

ER -