• Critical care medicine · May 2016

    Validity and Feasibility Evidence of Objective Structured Clinical Examination to Assess Competencies of Pediatric Critical Care Trainees.

    • Briseida Mema, Yoon Soo Park, and Afrothite Kotsakis.
    • Department of Critical Care Medicine, Hospital for Sick Children, Toronto, ON, Canada; Department of Paediatrics, Faculty of Medicine, University of Toronto, Toronto, ON, Canada; Department of Medical Education, University of Illinois at Chicago, Chicago, IL.
    • Crit. Care Med. 2016 May 1; 44 (5): 948-53.

    Objective: The purpose of this study was to provide validity and feasibility evidence for the use of an objective structured clinical examination in the assessment of pediatric critical care medicine trainees.

    Design: This was a validation study. Validity evidence was based on Messick's framework.

    Setting: A tertiary, university-affiliated academic center.

    Subjects: Seventeen pediatric critical care medicine fellows were recruited in the 2012 and 2013 academic years.

    Interventions: None. All subjects completed an objective structured clinical examination assessment.

    Measurements and Main Results: Seventeen trainees were assessed. Simulation scenarios were developed for content validity by pediatric critical care medicine and education experts using CanMEDS competencies. Scenarios were piloted before the study. Each scenario was evaluated by two interprofessional raters. Inter-rater agreement, measured using intraclass correlations, was 0.91 (SE = 0.09) across stations. Generalizability theory was used to evaluate internal structure and reliability. Reliability was moderate (G-coefficient = 0.67, Φ-coefficient = 0.52). The greatest source of variability was the participant-by-station variance (40.6%). Pearson correlation coefficients were used to evaluate the relationship of the objective structured clinical examination with each of the traditional assessment instruments: multisource feedback, in-training evaluation report, short-answer questions, and the Multidisciplinary Critical Care Knowledge Assessment Program. Performance on the objective structured clinical examination correlated with performance on the Multidisciplinary Critical Care Knowledge Assessment Program (r = 0.52; p = 0.032) and multisource feedback (r = 0.59; p = 0.017), but not with overall performance on the in-training evaluation report (r = 0.37; p = 0.143) or short-answer questions (r = 0.08; p = 0.767). Consequences were not assessed.

    Conclusion: The validity and feasibility evidence in this study indicates that objective structured clinical examination scores can be a valid way to assess the CanMEDS competencies required for independent practice in pediatric critical care medicine.
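    As a rough illustration of the correlational analysis described above, Pearson's product-moment r between two sets of paired trainee scores can be computed directly. This is a minimal sketch; the score lists below are hypothetical placeholders, not data from the study.

    ```python
    import math

    def pearson_r(x, y):
        """Pearson product-moment correlation between two paired score lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical paired scores for illustration only (not study data):
    # one OSCE total and one knowledge-exam total per trainee.
    osce_scores = [62, 70, 75, 68, 80, 74, 66]
    exam_scores = [58, 72, 78, 65, 82, 70, 69]

    r = pearson_r(osce_scores, exam_scores)
    ```

    In the study itself, such coefficients were computed between the OSCE and each traditional instrument; a moderate positive r (as with the knowledge exam, r = 0.52) supports the relationship-to-other-variables strand of Messick's validity framework.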

