Comparative Study
Assessment of interrater and intrarater reliability in the evaluation of metered dose inhaler technique.
- S L Gray, A C Nance, D M Williams, and C C Pulliam.
- University of North Carolina, School of Pharmacy, Chapel Hill.
- Chest. 1994 Mar 1;105(3):710-4.
Study Objective: To determine whether a training session using videotaped metered dose inhaler (MDI) performances can produce high interrater and intrarater reliability among five evaluators assessing MDI technique.
Design: Five evaluators (three pharmacists, two pulmonary fellows) were trained to evaluate MDI technique during a 2-h training session. The session consisted of verbal instruction and practical experience in evaluating MDI technique using videotaped MDI performances of six nonstudy subjects. After the training session, the evaluators independently observed the same videotaped MDI demonstrations of 14 subjects on two occasions separated by a 7- to 10-day interval. Interrater and intrarater reliability were determined for individual steps by calculating percent agreement and the intraclass correlation coefficient (ICC).
Results: Interrater: The interrater reliability for individual steps ranged from 29 to 86 percent (ICC = 0.13 to 0.81). Steps on which evaluators agreed for fewer than 9 of the 14 subjects were shaking the inhaler before inhalation, exhaling, continuing to inhale slowly, and adequate breath hold. Intrarater: The overall percent agreement by step ranged from 74 to 97 percent. Exhaling to functional residual volume (76 percent) and continuing to inhale slowly and deeply (74 percent) had the lowest overall agreement between the first and second observation days. The consistency of evaluating a step between the two observation days varied considerably depending on the step and evaluator.
Conclusions: High interrater and intrarater reliability in MDI evaluation is difficult to obtain. Clinicians and researchers involved in MDI evaluation and education should be trained to achieve consistency. A single training session using videotaped MDI demonstrations was not adequate for achieving consistency among evaluators.
To improve the accuracy of research results, researchers should include at least two evaluators to assess MDI technique, or take other measures to demonstrate and report reliability.
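The two statistics named in the abstract can be illustrated in a short sketch. Note the assumptions: the abstract does not state which ICC form was used, so ICC(2,1) (two-way random effects, absolute agreement, single rater) is assumed here, and the 14-subject-by-5-rater ratings table below is invented for illustration only.

```python
def percent_agreement(ratings):
    """Fraction of subjects on whom every rater gave the same score."""
    agree = sum(1 for row in ratings if len(set(row)) == 1)
    return agree / len(ratings)

def icc_2_1(ratings):
    """ICC(2,1) from a subjects-by-raters table via two-way ANOVA
    mean squares (assumed form; the study may have used another)."""
    n = len(ratings)      # subjects
    k = len(ratings[0])   # raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((ratings[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented data: 14 subjects x 5 raters, 1 = step performed correctly.
ratings = [
    [1, 1, 1, 1, 1], [0, 0, 0, 0, 0], [1, 1, 1, 0, 1], [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0], [1, 1, 1, 1, 1], [0, 0, 0, 0, 1], [1, 1, 1, 1, 1],
    [1, 0, 1, 1, 1], [0, 0, 0, 0, 0], [1, 1, 1, 1, 1], [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1], [0, 0, 0, 0, 0],
]
print(f"percent agreement: {percent_agreement(ratings):.0%}")
print(f"ICC(2,1): {icc_2_1(ratings):.2f}")
```

With this invented table, evaluators agree unanimously on 9 of 14 subjects, matching the abstract's "agreement for fewer than 9 of the 14 subjects" criterion as a borderline case.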