Real-Time Inter-Rater Reliability of the Council of Emergency Medicine Residency Directors Standardized Direct Observation Assessment Tool
Academic Emergency Medicine: Official Journal of the Society for Academic Emergency Medicine
  • Joseph LaMantia, MD, Department of Emergency Medicine, North Shore University Hospital, Manhasset, NY (JL, AT, MWF);
  • Bryan G Kane, MD, Lehigh Valley Health Network
  • Lalena Yarris, MD
  • Anthony Tadros, BA
  • Mary Frances Ward, RN, ANP
  • Martin Lesser, PhD
  • Philip Shayne, MD
  • The SDOT Study Group II
Publication/Presentation Date
12-1-2009
Abstract

OBJECTIVES: Developed by the Council of Emergency Medicine Residency Directors (CORD), the standardized direct observation assessment tool (SDOT) is an evaluation instrument used to assess residents' clinical skills in the emergency department (ED). In a previous study examining the inter-rater agreement of the tool, faculty scored simulated resident-patient encounters. The objective of the present study was to evaluate the inter-rater agreement of the SDOT in real-time evaluations of residents in the ED.

METHODS: This was a multi-center, prospective, observational study in which faculty raters were paired to simultaneously observe and independently evaluate a resident's clinical performance using the SDOT. Data collected from eight emergency medicine (EM) residency programs produced 99 unique resident-patient encounters and reported on 26 individual behaviors related to specific core competencies, global evaluation scores for each core competency, and an overall clinical competency score. Inter-rater agreement was assessed using percentage agreement analyses with three constructs: exact agreement, liberal agreement, and binary (pass/fail) agreement.
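The abstract does not define how the three agreement constructs were operationalized. Purely as an illustration, the sketch below computes percentage agreement between two raters' scores under hypothetical definitions: exact agreement as identical scores, "liberal" agreement as scores within one point of each other, and binary agreement using an assumed pass/fail cut-off. The scale, tolerance, and cut-off are assumptions, not the study's actual scoring rules.

```python
# Minimal sketch of three percentage-agreement constructs for paired rater scores.
# Assumptions (not specified in the abstract): ordinal scores (e.g., 1-5),
# "liberal" agreement = scores differ by at most 1 point, pass/fail cut-off = 3.

from typing import Sequence


def exact_agreement(a: Sequence[int], b: Sequence[int]) -> float:
    """Proportion of encounters where both raters gave the identical score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)


def liberal_agreement(a: Sequence[int], b: Sequence[int], tol: int = 1) -> float:
    """Proportion of encounters where scores differ by at most `tol` points."""
    return sum(abs(x - y) <= tol for x, y in zip(a, b)) / len(a)


def binary_agreement(a: Sequence[int], b: Sequence[int], cutoff: int = 3) -> float:
    """Proportion of encounters where raters agree on pass (>= cutoff) vs. fail."""
    return sum((x >= cutoff) == (y >= cutoff) for x, y in zip(a, b)) / len(a)


# Example: paired ratings from two faculty observers of the same five encounters.
rater1 = [4, 3, 5, 2, 4]
rater2 = [4, 4, 5, 3, 3]
print(exact_agreement(rater1, rater2))    # 0.4
print(liberal_agreement(rater1, rater2))  # 1.0
print(binary_agreement(rater1, rater2))   # 0.8
```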

RESULTS: Inter-rater agreement between faculty raters varied according to the category of measure used. Exact agreement was good for the overall competency score, poor to good for the competency scores for the six core competencies, and fair to very good for the individual item scores. Liberal agreement and binary agreement were excellent for the overall competency score and for each of the six core competency scores, and very good to excellent for the individual item scores.

CONCLUSIONS: The SDOT demonstrated excellent inter-rater agreement when analyzed with liberal agreement and when dichotomized as a pass/fail measure and fair to good agreement for most measures with exact agreement. The SDOT can be useful and reliable when evaluating residents' clinical skills in the ED, particularly as it relates to marginal performance.

PubMedID
20053212
Document Type
Article
Citation Information

LaMantia, J., Kane, B., Yarris, L., Tadros, A., Ward, M. F., Lesser, M., & Shayne, P. (2009). Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors standardized direct observation assessment tool. Academic Emergency Medicine: Official Journal of the Society for Academic Emergency Medicine, 16(Suppl 2), S51-S57. doi:10.1111/j.1553-2712.2009.00593.x

Poster presented at: The 2009 New England Regional SAEM Meeting.

Poster presented at: The 2009 Annual CORD Academic Assembly, Las Vegas, NV.