    Adaptation in P300 brain-computer interfaces: a two-classifier cotraining approach.

    • Rajesh C Panicker, Sadasivan Puthusserypady, and Ying Sun.
    • Department of Electrical and Computer Engineering, National University of Singapore, Singapore. rajesh.c@nus.edu.sg
    • IEEE Trans Biomed Eng. 2010 Dec;57(12):2927-35.

    Abstract

    A cotraining-based approach is introduced for constructing high-performance classifiers for P300-based brain-computer interfaces (BCIs) from very little training data. It uses two classifiers, Fisher's linear discriminant analysis (FLDA) and Bayesian linear discriminant analysis (BLDA), which progressively teach each other to build a final classifier that is robust and able to learn effectively from unlabeled data. A detailed performance analysis, carried out through extensive cross-validation, shows that the proposed approach builds high-performance classifiers from just a few minutes of labeled data by making efficient use of unlabeled data. An average bit rate of more than 37 bits/min was achieved with just one and a half minutes of training, an increase of about 17 bits/min over fully supervised classification in one of the configurations. The performance improvement is even more significant when both the amount of training data and the number of trials averaged to detect a character are low, which are desired operational characteristics of a practical BCI system. Moreover, the proposed method outperforms self-training approaches, in which the confident predictions of a single classifier are used to retrain itself.
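    The abstract describes the co-training loop only at a high level. Below is a minimal, hypothetical Python sketch of such a two-classifier co-training scheme, not the authors' implementation: scikit-learn's LinearDiscriminantAnalysis stands in for FLDA, and BayesianRidge regression on ±1 targets stands in for BLDA (a common stand-in, since BLDA is closely related to Bayesian ridge regression on class targets). The feature extraction, the confidence measures, and the labeling schedule (n_rounds, n_add) are all assumptions.

```python
# Minimal co-training sketch for a P300 binary (target vs. non-target) classifier.
# Assumptions: epochs are already feature vectors (rows of X), labels are in {0, 1},
# and absolute decision values serve as confidence scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import BayesianRidge


def cotrain(X_lab, y_lab, X_unlab, n_rounds=5, n_add=20):
    """Each round, both classifiers score the unlabeled pool; the most
    confident predictions of one classifier are added (with their predicted
    labels) to the training set of the other. Schedule is hypothetical."""
    flda = LinearDiscriminantAnalysis()
    blda = BayesianRidge()
    X1, y1 = X_lab.copy(), y_lab.copy()   # FLDA's growing training set
    X2, y2 = X_lab.copy(), y_lab.copy()   # BLDA's growing training set
    pool = X_unlab.copy()

    for _ in range(n_rounds):
        if len(pool) == 0:
            break
        flda.fit(X1, y1)
        blda.fit(X2, 2.0 * y2 - 1.0)      # regress on +/-1 targets
        # Confidence: |decision value| for FLDA, |predicted target| for BLDA.
        s1 = np.abs(flda.decision_function(pool))
        s2 = np.abs(blda.predict(pool))
        k = min(n_add, len(pool))
        top1 = np.argsort(-s1)[:k]        # FLDA's most confident samples
        top2 = np.argsort(-s2)[:k]        # BLDA's most confident samples
        # Cross-teaching: FLDA's confident labels augment BLDA's set, and vice versa.
        X2 = np.vstack([X2, pool[top1]])
        y2 = np.concatenate([y2, flda.predict(pool[top1])])
        X1 = np.vstack([X1, pool[top2]])
        y1 = np.concatenate([y1, (blda.predict(pool[top2]) > 0).astype(int)])
        pool = np.delete(pool, np.union1d(top1, top2), axis=0)

    # Final fit on the augmented training sets.
    flda.fit(X1, y1)
    blda.fit(X2, 2.0 * y2 - 1.0)
    return flda, blda
```

    With X_lab, y_lab holding a few labeled epochs and X_unlab the unlabeled pool, `flda, blda = cotrain(X_lab, y_lab, X_unlab)` yields the two mutually taught classifiers; the paper's actual confidence measure, sample-selection rule, and stopping criterion may differ.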
