- Richard S Klasco, Richard E Wolfe, Terrance Lee, Philip Anderson, Lee S Jacobson, Joshua Solano, Jonathan Edlow, and Shamai A Grossman.
- Beth Israel Deaconess Medical Center, Department of Emergency Medicine. Electronic address: rklasco@bidmc.harvard.edu.
- Am J Emerg Med. 2016 Jun 1; 34 (6): 1043-8.
Background: Chart review has been the mainstay of medical quality assurance practices since its introduction more than a century ago. The validity of chart review, however, has been vitiated by a lack of methodological rigor.

Objectives: By measuring the degree of interrater agreement among a 13-member review board of emergency physicians, we sought to validate the reliability of a chart review-based quality assurance process using computerized screening based on explicit case parameters.

Methods: All patients presenting to an urban, tertiary care academic medical center emergency department (annual volume of 57,000 patients) between November 2012 and November 2013 were screened electronically. Cases were programmatically flagged for review according to explicit criteria: return within 72 hours, procedural evaluation, floor-to-ICU transfer within 24 hours of admission, death within 24 hours of admission, physician complaints, and patient complaints. Each case was reviewed independently by a 13-member emergency department quality assurance committee, all of whom were board certified in emergency medicine and trained in the use of the tool. None of the reviewers was involved in the care of the specific patients they reviewed. Reviewers used a previously validated 8-point Likert scale to rate the (1) coordination of patient care, (2) presence and severity of adverse events, (3) degree of medical error, and (4) quality of medical judgment. Agreement among reviewers was assessed with the intraclass correlation coefficient (ICC) for each parameter.

Results: Agreement and the degree of significance for each parameter were as follows: coordination of patient care (ICC=0.67; P<.001), presence and severity of adverse events (ICC=0.52; P=.001), degree of medical error (ICC=0.72; P<.001), and quality of medical judgment (ICC=0.67; P<.001).

Conclusion: Agreement in the chart review process can be achieved among physician-reviewers.
The degree of agreement attainable is comparable to or superior to that of similar studies reported to date. These results highlight the potential for the use of computerized screening, explicit criteria, and training of expert reviewers to improve the reliability and validity of chart review-based quality assurance.

Copyright © 2016. Published by Elsevier Inc.
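The abstract reports interrater agreement as intraclass correlation coefficients but does not specify which ICC model was used. As a rough illustration only, the following sketch computes a one-way random-effects ICC(1,1) from the one-way ANOVA mean squares, for a matrix of cases (rows) by raters (columns); the example ratings matrix is hypothetical, not data from the study.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_cases, n_raters) matrix.

    ICC(1,1) = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW are
    the between-case and within-case mean squares from a one-way ANOVA.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    case_means = ratings.mean(axis=1)
    # Between-case and within-case sums of squares
    ss_between = k * np.sum((case_means - grand_mean) ** 2)
    ss_within = np.sum((ratings - case_means[:, None]) ** 2)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical example: 5 cases rated by 3 reviewers on an 8-point scale
example = [[6, 7, 6], [3, 4, 3], [8, 8, 7], [5, 5, 6], [2, 3, 2]]
print(round(icc_oneway(example), 2))  # → 0.93
```

Other ICC forms (e.g. two-way models treating raters as fixed or random effects) would give somewhat different values for the same ratings, which is why the choice of model matters when comparing agreement across studies.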