• Can J Emerg Med · Oct 2000

    Development, implementation and reliability assessment of an emergency physician performance evaluation tool.

    • J Etherington, G Innes, J Christenson, J Berkowitz, R Chamberlain, R Berringer, and C Leung.
    • Department of Emergency Medicine, Providence Health Care, St. Paul's Hospital Site, Vancouver, British Columbia, Canada.
    • Can J Emerg Med. 2000 Oct 1;2(4):237-45.

    Evaluation of physician practice is necessary, both to provide feedback for self-improvement and to guide department heads during yearly evaluations.

    OBJECTIVE: To develop and implement a peer-based performance evaluation tool and to measure reliability and physician satisfaction.

    METHODS: Each emergency physician in an urban emergency department evaluated their peers by completing a survey consisting of 21 questions on effectiveness in 4 categories: clinical practice, interaction with coworkers and the public, nonclinical departmental responsibilities, and academic activities. A sample of emergency nurses evaluated each emergency physician on a subset of 5 of the questions. Factor analysis was used to assess the reliability of the questions and categories. Intra-class correlation coefficients were calculated to determine inter-rater reliability. After receiving their peer evaluations, each physician rated the process's usefulness to the individual and the department.

    RESULTS: 225 surveys were completed on 16 physicians. Factor analysis did not distinguish the nonclinical and academic categories as distinct; therefore, the survey questions fell into 3 domains rather than the 4 hypothesized. The overall intra-class correlation coefficient was 0.43 for emergency physicians, indicating moderate, but far from perfect, agreement. This suggests that variability exists between physician evaluators and that multiple reviewers are probably required to provide a balanced physician evaluation. The intra-class correlation coefficient for emergency nurses was 0.11, suggesting poor reliability. Overall, 11 of 15 physicians reported the process valuable or mostly valuable, 3 of 15 were unsure, and 1 of 15 reported that the process was definitely not valuable.

    CONCLUSION: Physician evaluation by a single individual is probably unreliable. A useful physician peer evaluation tool can be developed. Most physicians view a personalized, broad-based, confidential peer review as valuable.
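    The abstract reports intra-class correlation coefficients (ICCs) of 0.43 for physician raters and 0.11 for nurse raters but does not state which ICC form or software the authors used. As a rough reference point only, the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single rater, per Shrout and Fleiss) with NumPy; the function name and the toy ratings matrix are illustrative assumptions and are not drawn from the study data.

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array with no missing values.
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject (rated physician) means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA decomposition of the total sum of squares
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()   # between raters
    ss_error = ss_total - ss_rows - ss_cols                # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: 5 physicians each rated by 4 peers on one item
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 3, 4],
], dtype=float)
print(f"ICC(2,1) = {icc2_1(scores):.2f}")
```

    On a scale such as this, a single-rater ICC near 0.43 corresponds to the "moderate, but far from perfect, agreement" described in the results, while a value near 0.11, as reported for the nurse subset, indicates poor single-rater reliability; this is the basis for the authors' suggestion that multiple reviewers are needed for a balanced evaluation.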
