Abstract
Background: Objective structured clinical examinations (OSCEs) are commonly used to assess the clinical skills of health professional students. Examiner judgement is one acknowledged source of variation in candidate marks. This paper reports an exploration of examiner decision making to better characterise the cognitive processes and workload associated with making judgements of clinical performance in exit-level OSCEs.
Methods: Fifty-five examiners for exit-level OSCEs at five Australian medical schools completed a NASA Task Load Index (TLX) measure of cognitive load and participated in focus group interviews immediately after the OSCE session. Discussions focused on how decisions were made for borderline and clear pass candidates. Interviews were transcribed, coded and thematically analysed. NASA TLX results were quantitatively analysed.
Results: Examiners self-reported higher cognitive workload levels when assessing a borderline candidate in comparison with a clear pass candidate. Further analysis revealed five major themes considered by examiners when marking candidate performance in an OSCE: (a) use of marking criteria as a source of reassurance; (b) difficulty adhering to the marking sheet under certain conditions; (c) demeanour of candidates; (d) patient safety; and (e) calibration using a mental construct of the 'mythical [prototypical] intern'. Examiners reported particularly high mental demand when assessing borderline candidates compared with clear pass candidates.
Conclusions: Examiners demonstrate that judging candidate performance is a complex, cognitively difficult task, particularly when performance is of borderline or lower standard. At programme exit level, examiners intuitively want to rate candidates against a construct of a prototypical graduate when marking criteria appear not to describe both what a passing candidate should demonstrate and how they should demonstrate it when completing clinical tasks. This construct should be shared, agreed upon and aligned with marking criteria to best guide examiner training and calibration. Achieving this integration may improve the accuracy and consistency of examiner judgements and reduce cognitive workload.
Original language | English |
---|---|
Pages (from-to) | 344-353 |
Number of pages | 10 |
Journal | Medical Education |
Volume | 55 |
Issue number | 3 |
Early online date | 18 Aug 2020 |
DOIs | |
Publication status | Published - Mar 2021 |