• Mt. Sinai J. Med. · Jan 2005

    Comparative Study

    Correlation between housestaff performance on the United States Medical Licensing Examination and standardized patient encounters.

    • William D Rifkin and Arthur Rifkin.
    • Department of Medicine, Yale Primary Care Residency Program, Waterbury Hospital, 64 Robbins Street, Waterbury, CT 06721, USA. wrifkin@wtbyhosp.chime.org
    • Mt. Sinai J. Med. 2005 Jan 1; 72 (1): 47-9.

    Background: There is interest in the use of "standardized patients" to assist in evaluating medical trainees' clinical skills, which may be difficult to evaluate with written exams alone. Previous studies of the validity of observed structured clinical exams have found low correlations with various written exams as well as with faculty evaluations. Since United States Medical Licensing Examination (USMLE) results are often used by training programs in the selection of applicants, we assessed the correlation between performance on an observed structured clinical exam and USMLE Steps 1 and 2 for internal medicine housestaff.

    Methods: We collected scores on USMLE Steps 1 and 2, along with the overall score from a required standardized patient encounter, for all PGY-1 trainees in a single urban teaching hospital. Pearson coefficients were used to compare USMLE and observed structured clinical exam performance.

    Results: The two steps of the USMLE correlated with each other to a large extent (r=0.65, df=30, p=0.0001). However, both steps of the USMLE correlated poorly with the observed structured clinical exam (Step 1: r=0.2, df=32, p=0.27; Step 2: r=0.09, df=30, p=0.61).

    Conclusions: The low correlation between the USMLE and performance on a structured clinical exam suggests that the written exam is a poor predictor of actual clinical performance, that the narrow range of clinical skills measured by the structured clinical exam is inadequate, or that the two methods evaluate different skill sets entirely. Our findings are consistent with previous work finding low correlations between structured clinical exams and commonly accepted means of evaluation, such as faculty evaluations, other written exams and program director assessments. The medical education community needs to develop an objective, valid method of measuring important, yet subjective, skill sets such as interpersonal communication, empathy and efficient data collection.
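    For readers who want to see how statistics of the kind reported above are obtained, the sketch below shows how a Pearson coefficient and its two-sided p-value can be computed with SciPy. The score arrays are hypothetical placeholders, not the study's data; the only detail taken from the abstract is that for Pearson's r the degrees of freedom equal n - 2, so df=30 corresponds to 32 paired scores.

```python
# Illustrative sketch only: the scores below are hypothetical placeholders,
# not data from the study. For Pearson's r, df = n - 2, so df = 30 in the
# abstract corresponds to n = 32 paired scores.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 32

# Hypothetical paired scores for n trainees (e.g., USMLE Step 1 vs. Step 2).
step1 = rng.normal(loc=215, scale=15, size=n)
step2 = 0.6 * (step1 - step1.mean()) + rng.normal(loc=220, scale=12, size=n)

r, p = pearsonr(step1, step2)   # returns the correlation and a two-sided p-value
df = n - 2                      # degrees of freedom for Pearson's r

print(f"r = {r:.2f}, df = {df}, p = {p:.4g}")
```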

