Computers in Biology and Medicine
-
Accurate segmentation of the left ventricle (LV) from cine magnetic resonance imaging (MRI) is an important step in the reliable assessment of cardiac function in cardiovascular disease patients. Several deep learning convolutional neural network (CNN) models have achieved state-of-the-art performance for LV segmentation from cine MRI. However, most published deep learning methods use individual cine frames as input and process each frame separately. ⋯ The results showed that OF-net achieves an average perpendicular distance (APD) of 0.90±0.08 pixels and a Dice index of 0.95±0.03 for LV segmentation in the middle slices, outperforming the classical U-net model (APD 0.92±0.04 pixels, Dice 0.94±0.16, p < 0.05). Specifically, the proposed method enhances the temporal continuity of segmentation at the apical and basal slices, which are typically more difficult to segment than middle slices. Our work exemplifies the ability of CNNs to "learn" from expert experience when applied to specific analysis tasks.
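The Dice index reported here is a standard overlap measure between two binary segmentation masks and can be computed directly; a minimal numpy sketch (the APD metric additionally requires contour extraction and is omitted):

```python
import numpy as np

def dice_index(pred, truth):
    """Dice similarity coefficient between two binary masks:
    2*|A intersect B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy example: two overlapping 16-pixel square masks on a 10x10 grid
a = np.zeros((10, 10), dtype=bool); a[2:6, 2:6] = True
b = np.zeros((10, 10), dtype=bool); b[3:7, 3:7] = True
print(round(dice_index(a, b), 4))  # overlap is 9 px -> 2*9/32 = 0.5625
```

A Dice of 1.0 means identical masks and 0.0 means no overlap, which is why values such as 0.95 indicate close agreement with the expert contour.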
-
Comparative Study Clinical Trial
Evaluation of a machine learning algorithm for up to 48-hour advance prediction of sepsis using six vital signs.
Sepsis remains a costly and prevalent syndrome in hospitals; however, machine learning systems can increase timely sepsis detection using electronic health records. This study validates a gradient-boosted ensemble machine learning algorithm (MLA) for sepsis detection and prediction, and compares its performance to existing methods. ⋯ The MLA predicts sepsis up to 48 h in advance and identifies sepsis onset more accurately than commonly used tools, maintaining high performance for sepsis detection when trained and tested on separate datasets.
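The abstract does not disclose the tool's implementation. As a purely illustrative sketch, a gradient-boosted classifier over six vital signs could be set up as below with scikit-learn; the feature list (heart rate, respiratory rate, temperature, systolic/diastolic blood pressure, SpO2) and the synthetic data are assumptions, not the paper's actual features or cohort:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical table: six vital signs per patient observation.
n = 2000
X = np.column_stack([
    rng.normal(85, 15, n),    # heart rate (bpm)
    rng.normal(18, 4, n),     # respiratory rate (breaths/min)
    rng.normal(37.0, 0.6, n), # temperature (deg C)
    rng.normal(120, 18, n),   # systolic BP (mmHg)
    rng.normal(75, 12, n),    # diastolic BP (mmHg)
    rng.normal(96, 3, n),     # SpO2 (%)
])
# Synthetic label loosely tied to tachycardia and tachypnea
logit = 0.05 * (X[:, 0] - 85) + 0.2 * (X[:, 1] - 18) - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUROC: {auc:.3f}")
```

In practice, predicting sepsis hours in advance requires time-shifted labels (features at time t, onset label at t + horizon), which this toy setup does not model.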
-
Continuous non-invasive measurement of arterial blood pressure (CNIBP) is possible via two methods: arterial tonometry and the arterial volume clamp. Arterial tonometry successfully measures continuous arterial pressure, but obtaining a direct pressure calibration requires large vessel deformation and a highly miniaturized pressure sensor. A properly designed tonometer can achieve pressure accuracy within 5% error at the radial artery. The volume clamp method achieves comparable errors but is generally restricted to the very peripheral arteries. Since the brachial and radial arteries are preferable sites for recording blood pressure, tonometry is generally preferred. However, its strict operating requirements demand a highly skilled operator. The greatest source of measurement error is slight deviation from the arterial wall applanation position. In this study, a method for correcting tonometry deflection error is introduced and evaluated in preliminary experiments. ⋯ A modeling method for tonometer deflection correction was derived and evaluated using a phantom vessel. Average error was significantly reduced relative to the non-corrected data, and the variability of error was reduced for all data points collected. The experiments show that blood pressure measurement error can be reduced to the levels obtained under near-ideal tonometry conditions without the need for precise position control. This relaxed precision requirement is expected to simplify both the use and the design of arterial tonometry in practice.
-
Wheezes in pulmonary sounds are anomalies often associated with obstructive lung diseases. Previous work on wheeze-type classification focused mainly on fixed time-frequency/scale resolutions based on Fourier and wavelet transforms. The main contribution of the proposed method, in which the time-scale resolution can be tuned to the signal of interest, is to discriminate monophonic and polyphonic wheezes with higher accuracy than previously suggested time and time-frequency/scale based methods. ⋯ It is concluded that the time and frequency domain characteristics of wheezes are not steady; hence, tunable time-scale representations are more successful at discriminating polyphonic from monophonic wheezes than conventional fixed-resolution representations.
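A tunable time-scale representation of the kind described can be illustrated with a naive complex-Morlet continuous wavelet transform, where the wavelet parameter w0 sets the time-scale trade-off. This is a generic numpy sketch, not the authors' method; a polyphonic wheeze would show multiple simultaneous ridges in the scalogram where the single tone below shows one:

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Naive continuous wavelet transform with a complex Morlet wavelet.
    `scales` are in samples; larger w0 sharpens frequency resolution at
    the cost of time localization (the tunable trade-off)."""
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        n = np.arange(-int(4 * s), int(4 * s) + 1)
        psi = np.exp(1j * w0 * n / s - 0.5 * (n / s) ** 2) / np.sqrt(s)
        # convolving with psi equals correlating with conj(psi),
        # since the Morlet wavelet satisfies psi(-n) = conj(psi(n))
        out[i] = np.convolve(x, psi, mode="same")
    return out

fs, w0 = 1000, 6.0
t = np.arange(0, 1.0, 1 / fs)
mono = np.sin(2 * np.pi * 200 * t)      # single tone: monophonic stand-in
freqs = np.arange(100, 401, 10)         # candidate pseudo-frequencies (Hz)
scales = w0 * fs / (2 * np.pi * freqs)  # scale <-> pseudo-frequency mapping
energy = np.abs(morlet_cwt(mono, scales, w0)).mean(axis=1)
best = int(freqs[np.argmax(energy)])
print("dominant pseudo-frequency:", best, "Hz")
```

Raising w0 narrows each scale's frequency band (helpful for resolving the closely spaced tones of a polyphonic wheeze) while smearing onsets in time; lowering it does the opposite.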
-
Heart rate complexity (HRC) is a proven metric for gaining insight into human stress and physiological deterioration. Calculating HRC requires detecting the exact instant of each heartbeat, the R-peak. Electrocardiogram (ECG) signals are often corrupted by environmental noise (e.g., electromagnetic interference, movement artifacts), which can alter the HRC measurement and produce erroneous inputs to decision-support models. ⋯ This raises many questions about how this fiducial point is altered by noise, the resulting impact on the measured HRC, and how noisy HRC measures can be accounted for as inputs to decision models. This work uses Monte Carlo simulations that systematically add white and pink noise across permutations of signal-to-noise ratio (SNR), time-segment length, sampling rate, and HRC measurement to characterize how noise influences HRC by shifting the R-peak fiducial point. The information generated by these simulations improves decision processes for system design, for example indicating that permutation entropy is a more precise, reliable, less biased, and more sensitive HRC measurement than sample and approximate entropy.
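Two ingredients of such an analysis can be sketched in plain numpy: Bandt-Pompe permutation entropy and injection of white Gaussian noise at a prescribed SNR. This is a generic sketch, not the authors' code; the pink-noise case, realistic ECG/RR data, and the full Monte Carlo loop over SNRs, segment lengths, and sampling rates are omitted:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D signal (bits)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - order + 1
    # Map each length-`order` window to its ordinal (rank) pattern
    patterns = np.array([tuple(np.argsort(x[i:i + order])) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log2(p))
    return h / np.log2(factorial(order)) if normalize else h

def add_noise_at_snr(x, snr_db, rng):
    """Add white Gaussian noise so the result has the target SNR in dB."""
    p_sig = np.mean(x ** 2)
    p_noise = p_sig / 10 ** (snr_db / 10)
    return x + rng.normal(0.0, np.sqrt(p_noise), x.size)

rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * np.arange(1000) / 50)  # periodic stand-in signal
pe_clean = permutation_entropy(clean)
pe_noisy = permutation_entropy(add_noise_at_snr(clean, 0, rng))
print(f"PE clean: {pe_clean:.2f}, PE at 0 dB SNR: {pe_noisy:.2f}")
```

A regular signal concentrates its windows in a few ordinal patterns (low normalized entropy), while heavy noise spreads them toward the uniform distribution (entropy near 1), which is the basic mechanism by which noise inflates complexity estimates.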