Neuroscience
-
Editorial Comment
Potential of Mirror Rehabilitation Therapy in Stroke Outcome.
-
Perception deals with temporal sequences of events, such as series of phonemes in audition, dynamic changes in pressure in touch textures, or moving objects in vision. Memory processes are therefore needed to make sense of the temporal patterning of sensory information. Recently, we showed that auditory temporal patterns can be learned rapidly and incidentally through repeated exposure [Kang et al., 2017]. ⋯ Results showed that if a random temporal pattern re-occurred at random times during an experimental block, it was learned rapidly, regardless of the sensory modality. Moreover, patterns first learned in the auditory modality showed transfer of learning to either touch or vision. This suggests that sensory systems may be exquisitely tuned to incidentally learn re-occurring temporal patterns, with possible cross-talk between the senses.
-
Review
Automatic frequency-shift detection in the auditory system: A review of psychophysical findings.
The human brain has the task of binding successive sounds produced by the same acoustic source into a coherent perceptual stream, and binding must be selective when several sources are concurrently active. Binding appears to obey a principle of spectral proximity: pure tones close in frequency are more likely to be bound than pure tones with remote frequencies. It has been hypothesized that the binding process is realized by automatic "frequency-shift detectors" (FSDs), comparable to the detectors of spatial motion in the visual system. ⋯ A number of variants of this study have been performed since 2005, in order to confirm the existence of FSDs, to characterize their properties, and to clarify as far as possible their neural underpinnings. The results obtained so far suggest that the FSDs exploit an implicit sensory memory that is powerful with respect to both capacity and retention time. Tones within chords can be perceptually enhanced by small frequency shifts, in a manner suggesting that the FSDs can serve in auditory scene analysis not only as binding tools but also, to a limited extent, as segregation tools.
-
Comparative Study
Auditory and visual sequence learning in humans and monkeys using an artificial grammar learning paradigm.
Language flexibly supports the human ability to communicate using different sensory modalities, such as writing and reading in the visual modality and speaking and listening in the auditory modality. Although it has been argued that nonhuman primate communication abilities are inherently multisensory, direct behavioural comparisons between human and nonhuman primates are scant. Artificial grammar learning (AGL) tasks and statistical learning experiments can be used to emulate ordering relationships between words in a sentence. ⋯ Moreover, the humans and monkeys produced largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed in comparable ways across the sensory modalities. These results provide evidence that human sequence processing abilities stem from an evolutionarily conserved capacity that appears to operate comparably across the sensory modalities in both human and nonhuman primates. The findings set the stage for future neurobiological studies to investigate the multisensory nature of these sequencing operations in nonhuman primates and how they compare to related processes in humans.
-
Motor sequence learning involves predictive processing that results in the anticipation of each component of a sequence of actions. In smooth pursuit, this predictive processing is required to decrease tracking errors between the eye and the stimulus. Current models of motor sequence learning suggest parallel mechanisms in the brain for acquiring sequences of differing complexity. ⋯ In addition, distinct activation was found in working-memory-related brain regions for the shorter sequences (e.g. the middle frontal cortex and dorsolateral prefrontal cortex), and higher activation in the frontal eye fields, supplementary motor cortex, and motor cortex for the longer sequences, independent of the number of repetitions. These findings provide new evidence that parallel brain areas involve working memory circuitry for short sequences, and more motoric areas when the sequence is longer and more cognitively demanding. Additionally, our findings are the first to show that the parallel brain regions involved in sequence learning in pursuit are independent of the number of repetitions, but contingent on sequence complexity.