• Brain research · Apr 2010

    Differences in the neural basis of automatic auditory and visual time perception: ERP evidence from an across-modal delayed response oddball task.

    • Youguo Chen, Xiting Huang, Yangmei Luo, Chunhua Peng, and Chunxiang Liu.
    • Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Southwest University, Chongqing, China.
    • Brain Res. 2010 Apr 14;1325:100-11.

    Abstract

    In our everyday lives, we need to process auditory and visual temporal information as efficiently as possible. Although automatic auditory time perception has been widely investigated using the mismatch negativity (MMN) as an index, the neural basis of automatic visual time perception has been largely ignored. The present study investigated the automatic processing of auditory and visual time perception using a cross-modal delayed response oddball paradigm. In the experimental condition, the standard stimulus lasted 200 ms and the deviant stimulus 120 ms; the durations were exchanged in the control condition. Reaction time, accuracy, and event-related potential (ERP) data were measured while participants performed the duration discrimination task. The ERP results showed that the MMN, N2b, and P3 were elicited by the auditory deviant stimulus under the attention condition, whereas only the MMN was elicited under the inattention condition. The MMN was largest over the frontal and central sites, and the difference in MMN amplitude between the attention and inattention conditions was not significant. In contrast, the change-related positivity (CRP) and the visual mismatch negativity (vMMN) were elicited by the visual deviant stimulus under both the attention and inattention conditions. The CRP was largest over the occipito-temporal sites under the attention condition and over the fronto-central sites under the inattention condition, and the difference in CRP amplitude between the attention and inattention conditions was significant. The vMMN was largest over the parieto-occipital sites under the attention condition and over the fronto-central sites under the inattention condition, and the difference in vMMN amplitude between the two conditions was likewise significant. The auditory MMN thus does not appear to be modulated by attention, whereas the visual CRP and vMMN are modulated by attention.
    Therefore, the present study provides electrophysiological evidence for the existence of automatic visual time perception and supports an "attentional switch" hypothesis for the modality effect on duration judgments: auditory temporal information is processed relatively automatically, whereas visual temporal information processing requires controlled attention.

    Copyright 2010 Elsevier B.V. All rights reserved.
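The oddball design described above (a frequent standard duration interleaved with a rare deviant, with the two durations exchanged in the control condition) can be sketched as a trial-list generator. This is a minimal illustration, not the authors' actual stimulus code; the 20% deviant probability and trial count are assumptions for the sake of the example, as the abstract does not state them.

```python
import random

def oddball_sequence(n_trials, standard_ms, deviant_ms, deviant_prob=0.2, seed=0):
    """Generate a duration-oddball trial list as (label, duration_ms) pairs.

    standard_ms and deviant_ms follow the paradigm in the abstract
    (200 ms standard, 120 ms deviant in the experimental condition,
    exchanged in the control condition). The deviant probability is an
    illustrative assumption, not a value reported in the abstract.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        if rng.random() < deviant_prob:
            trials.append(("deviant", deviant_ms))
        else:
            trials.append(("standard", standard_ms))
    return trials

# Experimental condition: 200 ms standard, 120 ms deviant.
experimental = oddball_sequence(500, standard_ms=200, deviant_ms=120)
# Control condition: the two durations are exchanged.
control = oddball_sequence(500, standard_ms=120, deviant_ms=200)
```

Exchanging the roles of the two durations between conditions lets deviance-related ERP effects (MMN, CRP, vMMN) be computed from physically identical stimuli, isolating the effect of rarity from the effect of duration itself.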



