Neuroscience
-
Perception deals with temporal sequences of events, such as series of phonemes in audition, dynamic changes in pressure in touch textures, or moving objects in vision. Memory processes are thus needed to make sense of the temporal patterning of sensory information. Recently, we have shown that auditory temporal patterns can be learned rapidly and incidentally through repeated exposure [Kang et al., 2017]. ⋯ Results showed that, if a random temporal pattern re-occurred at random times during an experimental block, it was rapidly learned, regardless of the sensory modality. Moreover, patterns first learned in the auditory modality showed transfer of learning to either touch or vision. This suggests that sensory systems may be exquisitely tuned to incidentally learn re-occurring temporal patterns, with possible cross-talk between the senses.
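To make the stimulus design concrete, here is a minimal Python sketch of such a paradigm (all parameter values and function names are illustrative assumptions, not the authors' actual materials): a single random temporal pattern is "frozen" and re-embedded at random trials among freshly generated patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_pattern(n_events=12, duration=1.2):
    """Draw a random temporal pattern: sorted event times (s) within `duration`."""
    return np.sort(rng.uniform(0.0, duration, n_events))

# The "frozen" target pattern re-occurs unchanged across the block.
target = random_pattern()

def make_block(n_trials=60, p_reoccur=0.5):
    """Build a block in which the frozen pattern re-occurs on random trials."""
    trials = []
    for _ in range(n_trials):
        if rng.random() < p_reoccur:
            trials.append(("target", target))           # re-occurring pattern
        else:
            trials.append(("noise", random_pattern()))  # fresh random pattern
    return trials

block = make_block()
print(sum(label == "target" for label, _ in block), "re-occurrences in the block")
```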
-
Comparative Study
Multisensory integration in short-term memory: Musicians do rock.
Previously demonstrated interactions between seeing and hearing led us to assess the link between music training and short-term memory for auditory, visual, and audiovisual sequences of rapidly presented, quasi-random components. The components of visual sequences varied in luminance; the components of auditory sequences varied in frequency. Concurrent components in audiovisual sequences were either congruent (the frequency of an auditory item increased monotonically with the luminance of the visual item it accompanied) or incongruent (an item's frequency was uncorrelated with the luminance of the item it accompanied). ⋯ Subjects with prior instrumental training significantly outperformed their untrained counterparts on both auditory and visual sequences, as well as on sequences of correlated auditory and visual items. Reverse correlation showed that the presence of a correlated, concurrent auditory stream altered subjects' reliance on particular visual items in a sequence. Moreover, congruence between auditory and visual items produced performance above what would be predicted from simple summation of information from the two modalities, a result that may reflect a contribution from special-purpose, multimodal neural mechanisms.
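The reverse-correlation logic mentioned above can be sketched as follows (simulated data and hypothetical weights; not the study's code): comparing the average stimulus across the two response classes estimates how strongly each sequence position drove subjects' judgments.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_items = 2000, 8

# Hypothetical luminance values for each item of each visual sequence.
luminance = rng.normal(size=(n_trials, n_items))

# Simulated observer: relies increasingly on later items (a recency profile).
true_weights = np.linspace(0.2, 1.0, n_items)
responses = (luminance @ true_weights + rng.normal(scale=1.0, size=n_trials)) > 0

# Reverse correlation: the mean stimulus difference between the two response
# classes recovers the relative weight given to each sequence position.
kernel = luminance[responses].mean(axis=0) - luminance[~responses].mean(axis=0)
print(np.round(kernel, 2))  # should rise across positions, mirroring true_weights
```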
-
Repeating spatiotemporal spike patterns exist and carry information. How this information is extracted by downstream neurons is unclear. Here we theoretically investigate the extent to which a single cell can detect a given spike pattern and which parameters are optimal for doing so, in particular the membrane time constant τ. ⋯ Long sequences could be recognized thanks to coincidence detectors working at a much shorter timescale. This is consistent with the fact that recognition remains possible when a sound sequence is compressed, played backward, or scrambled using 10-ms bins. Coincidence detection is a simple yet powerful mechanism, which could be the main function of neurons in the brain.
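A minimal leaky integrate-and-fire sketch (illustrative parameters, not taken from the paper) shows why the membrane time constant τ sets the coincidence window: inputs summate to threshold only if they arrive within roughly τ of one another.

```python
import numpy as np

def lif_detects(spike_times, tau=0.01, w=0.6, threshold=1.0, dt=1e-4, t_max=0.1):
    """Leaky integrate-and-fire membrane driven by instantaneous EPSPs.

    Between inputs, V decays as dV/dt = -V / tau; each input spike adds w.
    Returns True if V ever crosses threshold (a coincidence is detected).
    """
    v, spikes, i = 0.0, sorted(spike_times), 0
    for t in np.arange(0.0, t_max, dt):
        v *= np.exp(-dt / tau)          # leaky decay with time constant tau
        while i < len(spikes) and spikes[i] <= t:
            v += w                      # each input spike adds a fixed EPSP
            i += 1
        if v >= threshold:
            return True
    return False

# Two spikes 2 ms apart summate within a 10-ms tau; 50 ms apart they do not.
print(lif_detects([0.010, 0.012]))  # True: near-coincident inputs reach threshold
print(lif_detects([0.010, 0.060]))  # False: the first EPSP has decayed away
```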
-
Predictive coding is possibly one of the most influential, comprehensive, and controversial theories of neural function. While proponents praise its explanatory potential, critics object that key tenets of the theory are untested or even untestable. The present article critically examines the existing evidence for predictive coding in the auditory modality. ⋯ More work exists on the proposed oscillatory signatures of predictive coding and on the relation between attention and precision. However, results on these latter two assumptions are mixed or contradictory. Looking to the future, more collaboration between human and animal studies, aided by model-based analyses, will be needed to test specific assumptions and implementations of predictive coding and, in doing so, to help determine whether this popular grand theory can fulfill its expectations.
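As a reminder of the core computation under test, here is a deliberately minimal, single-channel sketch of the update rule the theory posits (values are illustrative; real predictive-coding implementations are hierarchical and far richer): an internal prediction is nudged by the precision-weighted prediction error until it tracks the input statistics.

```python
import numpy as np

rng = np.random.default_rng(2)

mu = 0.0          # current prediction
precision = 4.0   # inverse variance assigned to the sensory signal
lr = 0.05         # learning rate for the prediction update

for _ in range(500):
    x = rng.normal(loc=1.5, scale=0.5)  # noisy sensory input
    error = x - mu                      # prediction error (the "surprise")
    mu += lr * precision * error        # precision-weighted update

print(round(mu, 2))  # settles near the true input mean of 1.5
```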
-
Comparative Study
Auditory and visual sequence learning in humans and monkeys using an artificial grammar learning paradigm.
Language flexibly supports the human ability to communicate using different sensory modalities, such as writing and reading in the visual modality and speaking and listening in the auditory domain. Although it has been argued that nonhuman primate communication abilities are inherently multisensory, direct behavioural comparisons between humans and nonhuman primates are scant. Artificial grammar learning (AGL) tasks and statistical learning experiments can be used to emulate the ordering relationships between words in a sentence. ⋯ Moreover, the humans and monkeys produced largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed in comparable ways across the sensory modalities. These results provide evidence that human sequence-processing abilities stem from an evolutionarily conserved capacity that appears to operate comparably across the sensory modalities in both human and nonhuman primates. The findings set the stage for future neurobiological studies to investigate the multisensory nature of these sequencing operations in nonhuman primates and how they compare to related processes in humans.
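The AGL paradigm can be illustrated with a toy finite-state grammar (the transitions below are hypothetical, not those used in the study): grammatical sequences are walks through the state machine, and test items are judged by whether such a walk exists.

```python
import random

random.seed(3)

# Toy finite-state grammar: from each state, pick an (element, next_state) arc.
# 'END' terminates the sequence. These transitions are illustrative only.
GRAMMAR = {
    "S0": [("A", "S1"), ("C", "S2")],
    "S1": [("D", "S2"), ("C", "S1")],
    "S2": [("G", "S3"), ("F", "END")],
    "S3": [("F", "END")],
}

def generate_sequence():
    """Produce one grammatical sequence by walking the state machine."""
    state, seq = "S0", []
    while state != "END":
        element, state = random.choice(GRAMMAR[state])
        seq.append(element)
    return seq

def is_grammatical(seq):
    """Check whether a sequence can be parsed by the grammar (nondeterministic)."""
    states = {"S0"}
    for element in seq:
        states = {nxt for s in states for (el, nxt) in GRAMMAR.get(s, [])
                  if el == element}
        if not states:
            return False
    return "END" in states

print(generate_sequence())              # e.g. ['A', 'C', 'D', 'G', 'F']
print(is_grammatical(["A", "G", "F"]))  # False: violates the transition table
```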