Human Sensori-Motor Computation
By Dinesh Pai
Humans experience life through active movement and the integration of information from several senses (including vision, touch, hearing, and proprioception). Computational models of this process could lead to fundamental advances in basic neuroscience, as well as medical interventions for conditions such as spinal cord injury. Human models will also provide insights into the design of better robots and multisensory human interfaces.
I will informally describe ongoing projects in my lab in two broad areas. (1) Human movement: computational models of the musculoskeletal system, the control of movement in the spinal cord, and human reaching and grasping. (2) Multisensory integration: modeling saccades to auditory targets in non-human primates, and the design of a novel multisensory environment called the HAVEN (Haptic, Auditory, and Visual Environment).