• Neuroscience · Nov 2022

    A novel end-to-end BG-Attention algorithm for denoising of EEG signals.

    • Wenlong Wang, Baojiang Li, and Haiyan Wang.
    • The School of Electrical Engineering, Shanghai DianJi University, Shanghai, China; Intelligent Decision and Control Technology Institute, Shanghai DianJi University, Shanghai, China.
    • Neuroscience. 2022 Nov 21; 505: 102010-20.

    Abstract

    Electroencephalography (EEG) signals are nonlinear, non-stationary sequences that carry a great deal of information. However, physiological signals from other body regions readily interfere with EEG acquisition, adversely affecting subsequent analysis. Signal denoising is therefore a crucial step in EEG signal processing. This paper proposes a bidirectional gated recurrent unit (GRU) network based on a self-attention mechanism (BG-Attention) for extracting clean EEG signals from noise-contaminated recordings. The bidirectional GRU network captures past and future information simultaneously while processing continuous time sequences, and by paying different levels of attention to content of varying importance, the model learns the most significant features of EEG signal sequences, highlighting the contribution of essential samples to denoising. The proposed model is evaluated on the EEGdenoiseNet data set and compared with a fully connected neural network (FCNN), a one-dimensional residual convolutional neural network (1D-ResCNN), and a recurrent neural network (RNN). The experimental results show that the proposed model reconstructs a clear EEG waveform with decent signal-to-noise ratio (SNR) and relative root mean squared error (RRMSE) values. This study demonstrates the potential of BG-Attention in the pre-processing phase of EEG experiments, which has significant implications for medical technology and brain-computer interface (BCI) applications.

    Copyright © 2022 IBRO. Published by Elsevier Ltd. All rights reserved.
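    The abstract evaluates denoising quality by SNR and RRMSE. As a minimal NumPy sketch of how these two metrics are conventionally computed against a clean ground-truth signal (the function names and the example signal are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def rms(x):
        """Root mean square of a 1-D signal."""
        return np.sqrt(np.mean(np.square(x)))

    def rrmse(denoised, clean):
        """Relative RMSE: RMS of the residual over RMS of the clean signal."""
        return rms(denoised - clean) / rms(clean)

    def snr_db(denoised, clean):
        """Signal-to-noise ratio of the reconstruction, in decibels."""
        return 10.0 * np.log10((rms(clean) / rms(denoised - clean)) ** 2)

    # Illustrative example: a reconstruction with a uniform 10% amplitude error.
    clean = np.ones(512)
    denoised = 1.1 * clean
    print(round(rrmse(denoised, clean), 3))   # 0.1
    print(round(snr_db(denoised, clean), 1))  # 20.0
    ```

    Lower RRMSE and higher SNR both indicate a reconstruction closer to the clean EEG waveform.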
