Validating a competency examination



Evidence based medicine (EBM) is now well established as a discipline across a variety of medical, allied and health sciences curricula.


EBM integrates knowledge and skills from a variety of sub-disciplines, including clinical epidemiology, information literacy, knowledge management and biostatistics. Practicing EBM requires that users are skilled across a 5-step process: (i) construction of an answerable question from the clinical scenario, (ii) systematic retrieval of the best available evidence, (iii) critical appraisal of the evidence for validity, clinical relevance and applicability, (iv) application of results, and (v) evaluation of performance [].

The classification rubric for EBP assessment tools in education (CREATE) provides guidance when developing new EBM-related assessments by cross-classifying assessment categories (reaction to the educational experience, attitudes, self-efficacy, knowledge, skills, behaviours and benefit to patients) and assessment types (self-report, cognitive testing, performance assessment, activity monitoring and patient-orientated outcomes) against the five EBM steps []. The CREATE framework suggests that reaction to the educational experience, self-efficacy and attitudes towards EBM are best assessed via self-report.

While a variety of instruments have been developed to assess knowledge and skills in EBM, few assess all aspects of EBM - including knowledge, skills, attitudes and behaviour - or have been psychometrically evaluated. The aim of this study was to develop and validate an instrument that evaluates medical trainees' competency in EBM across knowledge, skills and attitude.

The ‘Assessing Competency in EBM’ (ACE) tool was developed by the authors, with content and face validity assessed by expert opinion.

A cross-sectional sample of 342 medical trainees, representing 'novice', 'intermediate' and 'advanced' levels of EBM training, was recruited to complete the ACE tool.

Construct validity, item difficulty, internal reliability and item discrimination were analysed.
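The abstract does not spell out how these statistics were computed; the sketch below is a minimal, hypothetical illustration of a classical item analysis on a binary (0/1) item-response matrix, covering item difficulty, corrected item-total discrimination and Cronbach's alpha as a measure of internal reliability. The function and variable names are assumptions for illustration, not part of the original study.

```python
import numpy as np

def item_analysis(responses: np.ndarray):
    """Classical-test-theory item statistics for a 0/1 score matrix.

    `responses` has one row per examinee and one column per item,
    e.g. 342 rows x 15 columns for a test like the ACE tool.
    """
    n_items = responses.shape[1]
    total = responses.sum(axis=1)

    # Item difficulty: proportion of examinees answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Item discrimination: point-biserial correlation between each item and
    # the total score with that item removed (corrected item-total r).
    discrimination = np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

    # Internal reliability: Cronbach's alpha.
    item_variances = responses.var(axis=0, ddof=1).sum()
    alpha = (n_items / (n_items - 1)) * (1 - item_variances / total.var(ddof=1))

    return difficulty, discrimination, alpha
```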

We recruited 98 EBM-novice, 108 EBM-intermediate and 136 EBM-advanced participants.

A statistically significant difference in the total ACE score was observed that corresponded to the level of training: on a 0-15-point test, the mean ACE scores were 8.6 for EBM-novice, 9.5 for EBM-intermediate, and 10.4 for EBM-advanced participants (p < …).
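The exact p-value and the statistical test used are not preserved in this abstract. As a hedged illustration only, a one-way ANOVA is one common way to compare mean scores across three independent groups; the scores below are made up and are not study data.

```python
from scipy import stats

# Hypothetical total ACE scores (out of 15) for each training level;
# the actual study recruited 98, 108 and 136 participants respectively.
novice = [8, 9, 7, 10, 9, 8]
intermediate = [9, 10, 9, 11, 10, 9]
advanced = [11, 10, 12, 9, 11, 10]

# One-way ANOVA: does the mean ACE score differ across the three groups?
f_stat, p_value = stats.f_oneway(novice, intermediate, advanced)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```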

The 15-item ACE tool is a reliable and valid instrument to assess medical trainees' competency in EBM. The ACE tool provides a novel assessment that measures user performance across the four main steps of EBM. To provide a complete suite of instruments to assess EBM competency across various patient scenarios, future refinement of the ACE instrument should include further scenarios across harm, diagnosis and prognosis.