Audio-Visual Integration in a Redundant Target Paradigm: A Comparison between Rhesus Macaque and Man


Bibliographic Details
Published in: Frontiers in Systems Neuroscience, Vol. 11, p. 89
Main Authors: Bremen, Peter; Massoudi, Rooholla; Van Wanrooij, Marc M.; Van Opstal, A. J.
Format: Journal Article
Language: English
Published: Switzerland, Frontiers Media S.A., 29.11.2017
ISSN: 1662-5137
DOI: 10.3389/fnsys.2017.00089

More Information
Summary: The mechanisms underlying multi-sensory interactions are still poorly understood despite considerable progress made since the first neurophysiological recordings of multi-sensory neurons. While the majority of single-cell neurophysiology has been performed in anesthetized or passive-awake laboratory animals, the vast majority of behavioral data stems from studies with human subjects. Interpretation of neurophysiological data implicitly assumes that laboratory animals exhibit perceptual phenomena comparable or identical to those observed in human subjects. To explicitly test this underlying assumption, we here characterized how two rhesus macaques and four humans detect changes in intensity of auditory, visual, and audio-visual stimuli. These intensity changes consisted of a gradual envelope modulation for the sound, and a luminance step for the LED. Subjects had to detect any perceived intensity change as fast as possible. By comparing the monkeys' results with those obtained from the human subjects, we found that (1) unimodal reaction times differed across modality, acoustic modulation frequency, and species, (2) the largest facilitation of reaction times with the audio-visual stimuli was observed when stimulus onset asynchronies were such that the unimodal reactions would occur at the same time (response, rather than physical synchrony), and (3) the largest audio-visual reaction-time facilitation was observed when unimodal auditory stimuli were difficult to detect, i.e., at slow unimodal reaction times. We conclude that despite marked unimodal heterogeneity, similar multisensory rules applied to both species. Single-cell neurophysiology in the rhesus macaque may therefore yield valuable insights into the mechanisms governing audio-visual integration that may be informative of the processes taking place in the human brain.
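Finding (2) in the summary rests on simple arithmetic: if each unimodal stimulus takes a characteristic time to trigger a response, the stimulus onset asynchrony that aligns the expected unimodal responses is just the difference between the two mean reaction times. The sketch below (Python, with purely hypothetical reaction-time values that are not taken from the article) illustrates this "response synchrony" idea and one simple way to express reaction-time facilitation relative to the faster unimodal response.

```python
# Illustrative sketch only (not the authors' analysis code): how a
# "response synchrony" SOA can be derived from unimodal reaction times.
# All numeric values below are hypothetical placeholders.

def response_synchrony_soa(rt_auditory_ms: float, rt_visual_ms: float) -> float:
    """SOA (visual onset minus auditory onset, in ms) at which the two
    unimodal responses would be expected to occur at the same time."""
    return rt_auditory_ms - rt_visual_ms

def rt_facilitation(rt_bimodal_ms: float, rt_auditory_ms: float, rt_visual_ms: float) -> float:
    """How much faster the bimodal response is than the faster of the two
    unimodal responses (positive values indicate a speed-up)."""
    return min(rt_auditory_ms, rt_visual_ms) - rt_bimodal_ms

# Hypothetical example: a slow auditory RT (hard-to-detect modulation)
# and a faster visual RT.
rt_a, rt_v = 450.0, 300.0  # hypothetical mean unimodal RTs in ms
soa = response_synchrony_soa(rt_a, rt_v)
print(f"Delay the LED by {soa:.0f} ms to align the expected unimodal responses.")
print(f"Facilitation for a hypothetical bimodal RT of 260 ms: "
      f"{rt_facilitation(260.0, rt_a, rt_v):.0f} ms")
```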
Present Address: Peter Bremen, Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands
Reviewed by: Preston E. Garraghty, Indiana University Bloomington, United States; Benjamin A. Rowland, Wake Forest University, United States; Michael Brosch, Leibniz Institute for Neurobiology, Germany
Edited by: Mikhail Lebedev, Duke University, United States