• Sleep · Nov 2004

    Comparative Study

    Assessment of automated scoring of polysomnographic recordings in a population with suspected sleep-disordered breathing.

    • Stephen D Pittman, Mary M MacDonald, Robert B Fogel, Atul Malhotra, Koby Todros, Baruch Levy, Amir B Geva, and David P White.
    • Division of Sleep Medicine, Brigham and Women's Hospital, Boston, Mass 02115-5817, USA.
    • Sleep. 2004 Nov 1; 27 (7): 1394-403.

    Study Objectives: To assess the accuracy of an automated system (Morpheus I Sleep Scoring System) for analyzing and quantifying polysomnographic data from a population with sleep-disordered breathing.

    Setting: Sleep laboratory affiliated with a tertiary care academic medical center.

    Measurements and Results: Thirty-one diagnostic polysomnograms were prospectively and blindly analyzed with the investigational automated system (A) and manually by 2 registered polysomnography technologists (M1 and M2) from the same laboratory. Sleep stages, arousals, periodic limb movements, and respiratory events (apneas and hypopneas) were scored by all 3. Agreement, Cohen kappa, and intraclass correlation coefficients were tabulated for each variable and compared between scoring pairs (A-M1, A-M2, M1-M2). A total of 26,876 epochs (224 hours of recording time) were analyzed. For sleep staging, agreement/kappa were A-M1: 78%/0.67; A-M2: 73%/0.61; and M1-M2: 82%/0.73. The mean respiratory disturbance indexes were M1: 20.6 ± 23.0, M2: 22.5 ± 24.5, and A: 23.7 ± 23.4 events per hour of sleep. Respiratory disturbance index concordance between each scoring pair was excellent (intraclass correlation coefficients ≥ 0.95 for all pairs), although there was disagreement in the classification of moderate sleep-disordered breathing (percentage of positive agreement: A-M1, 37.5%; A-M2, 44.4%), defined as a respiratory disturbance index between 15 and 30 events per hour of sleep. For respiratory-event detection, agreement/kappa were A-M1 and A-M2: 90%/0.66 and M1-M2: 95%/0.82. Agreement/kappa for limb-movement detection were A-M1: 93%/0.68, A-M2: 92%/0.66, and M1-M2: 96%/0.77. The scoring of arousals was less reliable (agreement range: 76%-84%; kappa range: 0.28-0.57) for all pairs.

    Conclusions: Agreement between manual scorers in a population with moderate sleep-disordered breathing was close to the average pairwise agreement of 87% reported in the Sleep Heart Health Study. The automated classification of sleep stages was also close to this standard. The automated scoring system holds promise as a rapid method to score polysomnographic records, but expert verification of the automated scoring is required.
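    The study's agreement/kappa pairs compare raw percent agreement with chance-corrected agreement. As an illustrative sketch only (the label sequences below are synthetic, not study data), epoch-by-epoch percent agreement and Cohen's kappa between two scorers can be computed as:

    ```python
    # Illustrative example: percent agreement and Cohen's kappa for
    # epoch-by-epoch sleep-stage labels from two scorers.
    # The label sequences are made up for demonstration, not study data.
    from collections import Counter

    def percent_agreement(a, b):
        """Fraction of epochs on which the two scorers assign the same stage."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def cohens_kappa(a, b):
        """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
        n = len(a)
        p_o = percent_agreement(a, b)
        ca, cb = Counter(a), Counter(b)
        # Expected chance agreement from each scorer's marginal stage frequencies.
        p_e = sum(ca[s] * cb[s] for s in set(a) | set(b)) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    scorer1 = ["W", "N1", "N2", "N2", "N3", "REM", "N2", "W"]
    scorer2 = ["W", "N2", "N2", "N2", "N3", "REM", "N1", "W"]
    print(round(percent_agreement(scorer1, scorer2), 2))  # 0.75
    print(round(cohens_kappa(scorer1, scorer2), 2))       # 0.67
    ```

    Kappa is lower than raw agreement because it discounts the matches two scorers would produce by chance alone, which is why the study reports both figures for each scoring pair.
    
    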
