In a study published in Cell Reports, researchers from Dr. GU Yong's lab at the Institute of Neuroscience, Center for Excellence in Brain Science and Intelligent Technology of the Chinese Academy of Sciences, found that macaques discriminated self-motion directions more precisely when visual stimuli arrived about 250-500 ms earlier than vestibular stimuli than when the two stimuli were synchronous. This result suggests that in natural spatial navigation, the brain typically integrates multisensory signals whose temporal dynamics are incongruent.
Spatial navigation, especially vector-based navigation, requires inputs from multiple self-motion cues, including optic flow and vestibular signals. Numerous studies have shown that the brain can integrate these cues to improve heading perception, yet exactly how they are integrated remains unknown. For example, the visual channel mainly carries motion velocity information, whereas the peripheral vestibular organs detect acceleration, a faster signal. In the central nervous system (CNS), vestibular signals are temporally integrated as they propagate from the periphery to central areas, producing a broad distribution of dynamics ranging from acceleration to velocity. Hence, the temporal dynamics of the two sensory signals in the CNS could be either congruent (both velocity-like) or incongruent (visual velocity paired with vestibular acceleration).
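The acceleration-to-velocity conversion above is just temporal integration. A minimal numerical sketch (assuming a Gaussian velocity profile, a common choice in self-motion experiments, not necessarily the study's exact stimulus) shows that integrating the acceleration trace over time recovers the velocity trace:

```python
import numpy as np

dt = 0.001                    # time step (s)
t = np.arange(0.0, 2.0, dt)   # 2 s stimulus window
sigma = 0.25                  # width of the velocity profile (s)

# Gaussian velocity profile peaking at t = 1 s, and its time derivative
velocity = np.exp(-((t - 1.0) ** 2) / (2 * sigma**2))
acceleration = np.gradient(velocity, dt)

# Temporal integration of acceleration recovers the velocity signal,
# analogous to the acceleration-to-velocity conversion along the
# vestibular pathway described above
recovered = np.cumsum(acceleration) * dt

print(np.max(np.abs(recovered - velocity)) < 0.01)  # True
```

The residual difference comes only from discretization, illustrating that velocity and acceleration carry the same motion information in different temporal formats.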
Evidence supporting both models has been found in the brain. On one hand, neurons in the dorsal portion of the medial superior temporal area (MSTd) in the dorsal visual pathway mainly encode velocity for both visual and vestibular signals. On the other hand, neurons in the lateral intraparietal area (LIP) in the posterior parietal cortex predominantly process vestibular acceleration and visual velocity signals, with a time lag of up to 300-400 ms between them.
To test the two models, researchers from Dr. GU's lab trained macaques to perceive heading directions in a virtual-reality motion system, in which vestibular signals were generated by a motion platform and optic flow was presented on a large visual display mounted on the platform.
The researchers first produced a condition in which visual and vestibular stimuli were perfectly synchronized at the input level, simulating the natural situation. Under this condition, they verified the previous finding that the animals integrated the two stimuli and improved their heading performance, and found that the amount of improvement was close to the prediction of optimal Bayesian integration theory. Next, they introduced a temporal offset between the vestibular and visual inputs. In the asynchronous conditions where the visual cue led the vestibular cue by 250-500 ms, the macaques' heading discrimination improved beyond its level in the natural, synchronous condition. No such effect was observed when the visual stimulus led the vestibular stimulus by 750 ms, or when the visual stimulus was delayed relative to the vestibular input, indicating that the 250-500 ms offset window was specific for the further-improved heading performance.
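The optimal Bayesian prediction referred to above is the standard maximum-likelihood cue-combination rule: the variances of the single-cue estimates combine reciprocally, so the predicted combined discrimination threshold is lower than either single-cue threshold. A sketch with made-up thresholds (the numbers are illustrative, not taken from the study):

```python
import numpy as np

# Hypothetical single-cue heading thresholds (deg); illustrative values only
sigma_visual = 3.0
sigma_vestibular = 4.0

# Maximum-likelihood ("optimal Bayesian") prediction: variances combine
# reciprocally, like resistors in parallel
sigma_combined = np.sqrt(
    (sigma_visual**2 * sigma_vestibular**2)
    / (sigma_visual**2 + sigma_vestibular**2)
)
print(round(float(sigma_combined), 2))  # 2.4, lower than either single cue
```

Comparing the measured bimodal threshold against this prediction is the usual test of whether integration is statistically optimal.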
Why does a temporal offset of 250-500 ms further enhance the efficiency of multisensory integration? The researchers speculated that it is related to area LIP and the frontal eye field (FEF). They implanted electrodes in these two brain regions in macaques and performed in vivo single-unit electrophysiological recordings. The results showed that under naturally synchronous stimuli, neurons in both regions processed a relatively slow visual velocity signal and a relatively fast vestibular acceleration signal, with a temporal difference between them of exactly 250-500 ms. When the "lagged" visual cue was advanced ahead of the vestibular cue by 250-500 ms, the difference between the two signals' temporal dynamics (velocity peak versus acceleration peak) in FEF and LIP neurons was reduced, the signal peaks overlapped and superimposed, and the amount of encoded heading information increased, which explained the asynchronous-stimulus effect on the animals' behavior.
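The peak-alignment idea can be illustrated with a toy simulation (again assuming a Gaussian velocity profile, not the study's exact stimulus or neural data). For a Gaussian velocity trace of width sigma, the acceleration peak leads the velocity peak by sigma, so advancing the velocity signal by that lag superimposes the two peaks:

```python
import numpy as np

dt = 0.001
t = np.arange(0.0, 2.0, dt)
sigma = 0.35  # profile width (s); a 350 ms lag falls in the 250-500 ms window

# Velocity trace peaking at t = 1 s and its acceleration (time derivative)
velocity = np.exp(-((t - 1.0) ** 2) / (2 * sigma**2))
acceleration = np.gradient(velocity, dt)

# The acceleration peak leads the velocity peak by roughly sigma
lag = t[np.argmax(velocity)] - t[np.argmax(acceleration)]
print(round(float(lag), 3))  # close to sigma = 0.35

# Advancing the velocity trace by the lag superimposes the two peaks,
# analogous to presenting the visual cue 250-500 ms early
advanced = np.roll(velocity, -int(round(lag / dt)))
print(np.argmax(advanced) == np.argmax(acceleration))  # True
```

Under this sketch, the behavioral offset window simply compensates for the intrinsic velocity-versus-acceleration peak lag, letting the two signals peak together.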
By introducing a new paradigm with artificially asynchronous conditions, this study found enhanced heading perception in the animals under asynchronous stimulation and an increased capacity for heading information in the corresponding frontal and posterior parietal cortices. The findings support the model in which, under natural conditions, the brain integrates physical quantities with inconsistent temporal dynamics for multisensory heading perception. As to why the brain integrates visual velocity and vestibular acceleration signals under natural conditions, the researchers proposed that because vestibular acceleration is the faster signal, it could help the organism update its heading estimate quickly and respond in a timely way.