John van Opstal
26-03-1957 in Zevenbergen
Man-years of research since PhD
Brief summary of my research over the last five years.
The main research topics in the lab are human sound localization and plasticity; sound processing in the behaving monkey; multisensory integration; and the role of the monkey Superior Colliculus in saccadic eye-head gaze shifts.
This research has been very successful, with many papers in highly rated journals such as Science, Nature Neuroscience, J. Neurophysiology, and J. Neuroscience (see below).
An important focus has been plasticity in human sound localization. Studies have been performed with congenitally blind subjects, with monaurally deaf subjects, and with visually and auditorily normal subjects who underwent specific auditory manipulations (swapped ears, ear moulds) or visual manipulations (distorting glasses). Our studies have clearly shown that the (adult!) human auditory system maintains a surprisingly high degree of plasticity, and that different non-acoustic signals play a role in calibrating sound localization. These signals include information about eye position and head orientation, as well as visual input. However, the relative weighting of these signals is still unknown, as is the underlying neural mechanism.
Our recordings in trained rhesus monkeys showed that the midbrain Inferior Colliculus (IC) receives a signal about eye position (and presumably also about head movements). Such signals are needed to form a stable representation of sound locations in space, irrespective of intervening eye- and/or head movements.
We have pursued this research line in trained, head-unrestrained monkeys making orienting eye-head movements to sounds, while recording from single cells in the midbrain Inferior and Superior Colliculus (VICI grant), as well as in auditory cortex (Marie Curie grant). The former structures are crucial for the generation of coordinated eye-, head- and body movements, but had so far been studied in head-restrained animals only. The latter structure is thought to be involved in the planning and selection of these responses, and may also incorporate the mechanisms for updating the different reference frames.
By (reversibly) interfering with the system (applying ear plugs, ear moulds, optical means, microstimulation, local inactivation in the IC and SC, etc.), we will study in detail the behavioural consequences, as well as the neural correlates, of adaptation.
A second research line has been devoted to unravelling the neural code underlying the generation of rapid eye movements by neurons in the midbrain SC. Based on recordings from over 150 single units, we have recently proposed a neural model that explains the generation of saccadic eye movements in great detail (‘from single spikes to full behaviour’). This research has recently been extended to eye-head coordination in the head-unrestrained monkey, in collaboration with prof. Edward Freedman (University of Rochester, New York).
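One way to picture such a ‘single spikes to full behaviour’ scheme is a toy linear spike-summation sketch: each SC cell contributes a small, fixed movement vector per spike, and the saccade is the sum of all spike contributions. This is a minimal illustration only; the population size, spike counts, and the single shared spike vector are hypothetical simplifications, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SC population recruited for one saccade.
n_cells = 50
target = np.array([10.0, 5.0])            # desired gaze shift (deg: horizontal, vertical)
n_spikes = rng.poisson(20, size=n_cells)  # spikes fired by each cell during its burst
total_spikes = int(n_spikes.sum())

# Toy simplification: every spike adds the same tiny fixed vector, scaled so
# that the whole population burst reproduces the target displacement.
spike_vector = target / total_spikes
displacement = (n_spikes[:, None] * spike_vector).sum(axis=0)

print(displacement)  # sums back to the 10 x 5 deg target: the movement is "built" spike by spike
```

The point of the sketch is that the movement trajectory emerges from counting spikes, so fewer spikes (e.g. after partial inactivation) would directly yield a smaller, slower movement.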
A third, more recent research line concerns the representation and processing of sounds at subcortical (IC) and cortical levels (primary auditory cortex and core). We are currently recording in the primary auditory cortex of a rhesus monkey trained in a signal-detection task. Our findings indicate that task-related factors strongly modulate the acoustic response properties of cortical neurons, without affecting their spectro-temporal receptive fields. The modulations can be modeled by assuming a low-frequency top-down signal that multiplicatively interacts with the high-frequency, phase-locked, stimulus-evoked (bottom-up) activity.
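The multiplicative interaction described above can be sketched in a few lines: a slow task-related gain scales a fast, phase-locked stimulus response, which rescales the response without shifting its dominant (carrier) frequency. The specific frequencies (2 Hz gain, 40 Hz carrier) and sinusoidal waveforms are hypothetical illustrations, not recorded data:

```python
import numpy as np

fs = 1000                      # sample rate (Hz)
t = np.arange(0, 1, 1 / fs)    # one second of "activity"

# High-frequency, phase-locked, stimulus-evoked (bottom-up) drive
bottom_up = 1 + np.sin(2 * np.pi * 40 * t)       # 40 Hz carrier (hypothetical)

# Slow task-related (top-down) gain signal
top_down = 1 + 0.5 * np.sin(2 * np.pi * 2 * t)   # 2 Hz modulation (hypothetical)

# Multiplicative interaction: the slow gain scales the fast response,
# leaving the carrier frequency (i.e. the tuning) unchanged.
response = top_down * bottom_up

# The dominant spectral component of the response is still the 40 Hz carrier.
spectrum = np.abs(np.fft.rfft(response - response.mean()))
peak_hz = int(np.argmax(spectrum))  # with a 1 s window, bin index = frequency in Hz
print(peak_hz)                      # → 40
```

Because the interaction is multiplicative rather than additive, the gain shows up in the response envelope (and as small sidebands) instead of replacing or masking the stimulus-driven component.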
A fourth line of research is clinical. In collaboration with the Department of Otolaryngology (profs. Snik and Mylanus) we have acquired substantial funds (from Oticon, DK, Advanced Bionics, CH, and Cochlear, B) to study the effects of bone-anchored hearing aids, air-transducing hearing aids and cochlear implants on binaural and bimodal processing. In future work we aim to explore the neural plasticity involved in adapting to these devices, and to develop optimal fitting procedures and prospective diagnostics for individual patients (NeuroCIMT project). We combine Near-Infrared Spectroscopy with EEG to perform neuroimaging in patients with hearing devices who cannot otherwise be tested in fMRI scanners.
I have also collaborated with prof. Jan Buitelaar (psychiatry), with whom we published a joint paper on the sound-localization behaviour of autistic individuals.
With my newly acquired ERC Advanced Grant (starting January 2017), I have begun a new, fifth research line in the exciting field of robotics, in which we aim to develop a humanoid audio-visual-motor eye-head system that is governed by the same principles as the primate gaze-control system. A close collaboration has been initiated with prof. Alexandre Bernardino and his Visual Robotics group at the Instituto Superior Técnico of the Universidade de Lisboa, where several master students, a PhD student and a postdoc work on our humanoid 3D eye-head motor-control system. See https://www.mbfys.ru.nl/~johnvo/OrientWeb/orient.html for more information.