I am an Experimental Psychologist whose research focuses on multisensory integration, perception-action coupling, self-motion perception, and locomotor rehabilitation. I have examined how the human brain integrates different sensory signals (visual, auditory, proprioceptive, vestibular) by studying several different populations (e.g., younger adults, older adults, and high-level athletes), and by using behavioural measures and computational models. My work makes extensive use of Virtual Reality and motion simulation technologies. I have a particular interest in understanding how multisensory processes are affected by locomotor challenges and how multisensory training tools can be used to improve performance.
Jennifer Campos, PhD
Senior Scientist, KITE (TRI)
- Characterizing interactions between hearing and balance
  There is now convincing evidence linking falls with hearing loss, yet the specific nature of this link is unclear. It is possible, for instance, that hearing loss causes problems with orienting because binaural cues are reduced, that hearing loss taxes cognitive resources in complex environments, or that there is a shared pathology of the auditory and vestibular systems. Therefore, the objective of this research is to carefully evaluate the relationship between hearing and balance in younger and older adults (with and without hearing loss) in realistic, multisensory, simulated environments.
- Multi-sensory feedback and driving performance in older vs. younger adults
  Driving simulators are now providing opportunities for the safe training and assessment of older drivers under realistic conditions. Despite the clear advantages of simulators, the vast majority only simulate visual inputs and neglect the range of important multisensory information associated with driving (sound, movement, and vibration). Therefore, the objective of this research is to use Toronto Rehab's state-of-the-art, multisensory driving simulator to evaluate the effects of introducing particular sensory inputs (visual, auditory, vestibular, vibrotactile) on driving performance in younger and older adults. This research will inform the development of training/assessment simulators and the design of on-road vehicles, and will provide novel insights into the general importance of particular sensory inputs for driving performance.
- Understanding and modelling the interactions between vision, proprioception and vestibular inputs during active and passive self-motion
  In order to track our self-motion through space, the human brain uses a combination of information from several different sensory systems, including our muscles and joints (proprioception), the acceleration detectors in our inner ear (vestibular), and dynamic visual information (optic flow). In a series of research projects we have precisely evaluated and modelled how these cues are integrated in the brain and the relative importance of each. We are now evaluating how multisensory self-motion perception is affected by specific sensory and cognitive deficits and as a function of training.
Associate Professor, Department of Psychology, University of Toronto
Adjunct Faculty, Centre for Vision Research, York University
Canada Research Chair in Multisensory Integration and Aging (Tier 2)