Abstract:
Numerous psychophysical experiments have shown that human subjects integrate information from different sensory modalities in a statistically optimal, or near-optimal, way to improve behavioral performance, yet the underlying neural mechanisms remain unclear. In our laboratory, we train rhesus monkeys to judge their direction of self-motion (heading) under three cue conditions: optic flow alone, inertial motion alone, and the combination of both. Once trained, the animals integrate the two cues optimally, in a Bayesian sense, to improve their heading estimates. With this behavioral model in hand, we record from neurons in multiple areas of the macaque brain, including both sensory cortices and sensorimotor transformation areas. In particular, we investigate how visual and vestibular heading information is represented in multiple sensory cortices, and how these different sources of sensory evidence are decoded and accumulated by downstream decision areas, across time and across modalities, to optimize behavioral output. Our studies may place important constraints on theories of the neural basis of multisensory integration and decision-making.
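For reference, the standard Bayesian prediction for optimal integration of two cues (a textbook result under the assumption of independent Gaussian likelihoods; the notation here, with single-cue estimates $\hat{s}_{vis}, \hat{s}_{vest}$ and standard deviations $\sigma_{vis}, \sigma_{vest}$, is ours, not the authors') is a reliability-weighted average:
$$\hat{s}_{comb} = w_{vis}\,\hat{s}_{vis} + w_{vest}\,\hat{s}_{vest}, \qquad w_{i} = \frac{1/\sigma_{i}^{2}}{1/\sigma_{vis}^{2} + 1/\sigma_{vest}^{2}},$$
with combined variance
$$\sigma_{comb}^{2} = \frac{\sigma_{vis}^{2}\,\sigma_{vest}^{2}}{\sigma_{vis}^{2} + \sigma_{vest}^{2}} \le \min\left(\sigma_{vis}^{2}, \sigma_{vest}^{2}\right),$$
so an optimal observer's discrimination threshold in the combined-cue condition should be lower than with either cue alone, which is the behavioral signature tested in this paradigm.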
Sponsored by the NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai