I am an Experimental Psychologist whose research focuses on multisensory integration, perception-action coupling, self-motion perception, and locomotor rehabilitation. I have examined how the human brain integrates different sensory signals (visual, auditory, proprioceptive, vestibular) by studying several different populations (e.g., younger adults, older adults, and high-level athletes) and by using behavioural measures and computational models. My work makes extensive use of Virtual Reality and motion simulation technologies. I have a particular interest in understanding how multisensory processes are affected by locomotor challenges and how multisensory training tools can be used to improve performance.
- Characterizing interactions between hearing and balance: There is now convincing evidence linking falls with hearing loss, yet the specific nature of this link is unclear. It is possible, for instance, that hearing loss causes problems with orienting because binaural cues are reduced, that hearing loss taxes cognitive resources in complex environments, or that there is a shared pathology of the auditory and vestibular systems. Therefore, the objective of this research is to carefully evaluate the relationship between hearing and balance in younger and older adults (with and without hearing loss) in realistic, multisensory, simulated environments.
- Multisensory feedback and driving performance in older vs. younger adults: Driving simulators now provide opportunities for the safe training and assessment of older drivers under realistic conditions. Despite the clear advantages of simulators, the vast majority simulate only visual inputs and neglect the range of important multisensory information associated with driving (sound, movement, and vibration). Therefore, the objective of this research is to use Toronto Rehab's state-of-the-art, multisensory driving simulator to evaluate the effects of introducing particular sensory inputs (visual, auditory, vestibular, vibrotactile) on driving performance in younger and older adults. This research will inform the development of training/assessment simulators and the design of on-road vehicles, and it will provide novel insights into the general importance of particular sensory inputs for driving performance.
- Understanding and modelling the interactions between vision, proprioception, and vestibular inputs during active and passive self-motion: To keep track of our self-motion through space, the human brain uses a combination of information from several different sensory systems, including our muscles and joints (proprioception), the acceleration detectors in our inner ear (vestibular), and dynamic visual information (optic flow). In a series of research projects we have precisely evaluated and modelled how these cues are integrated in the brain and the relative importance of each. We are now evaluating how multisensory self-motion perception is affected by specific sensory and cognitive deficits and as a function of training.
Perception. 2016 Oct 27.
Effects of Hearing Loss on Dual-Task Performance in an Audiovisual Virtual Reality Simulation of Listening While Walking.
J Am Acad Audiol. 2016 Jul;27(7):567-87
Front Psychol. 2016;7:595
Front Psychol. 2015;6:1581
Front Psychol. 2015;6:472
Exp Brain Res. 2014 Nov 2.
PLoS One. 2014;9(7):e101016
Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies.
Exp Brain Res. 2014 Oct;232(10):3277-89
Exp Brain Res. 2014 Mar;232(3):827-36
Exp Brain Res. 2012 May;218(4):551-65
Scientist, Toronto Rehabilitation Institute (TRI)
Assistant Professor, Department of Psychology, University of Toronto
Associate Member, Graduate Studies, University of Toronto
Adjunct Member, Centre for Vision Research, York University
Status Appointment, Department of Occupational Sciences and Occupational Health, University of Toronto