Sensory Perception & Interaction Research Group

University of British Columbia

Direct manual cues, such as touching an interlocutor's face while they talk, can enhance the intelligibility of their speech. We investigate the feasibility of using vibrotactile feedback to enhance speech intelligibility in acoustically noisy environments.
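One common starting point for tactile speech aids is mapping the speech amplitude envelope to actuator intensity. The sketch below (an illustrative assumption, not this project's actual signal chain; the function name and window size are made up) extracts a rectify-and-smooth envelope that could drive a vibrotactile actuator:

```python
from collections import deque

def amplitude_envelope(samples, fs, win_ms=20):
    """Rectify-and-smooth amplitude envelope of an audio signal.

    One plausible feature for driving a vibrotactile actuator
    (illustrative only). `samples` is a sequence of floats, `fs`
    the sample rate in Hz, `win_ms` the smoothing window.
    """
    win = max(1, int(fs * win_ms / 1000))
    q = deque(maxlen=win)   # sliding window of rectified samples
    acc = 0.0               # running sum over the window
    out = []
    for s in samples:
        r = abs(s)          # full-wave rectification
        if len(q) == q.maxlen:
            acc -= q[0]     # drop oldest sample from the sum
        q.append(r)
        acc += r
        out.append(acc / len(q))  # moving average = crude low-pass
    return out
```

The moving average here stands in for a proper low-pass filter; a real pipeline would likely band-limit the envelope to the actuator's usable frequency range.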

Devices with small screens, such as smartwatches and fitness trackers, offer only limited forms of input, relying on touch, physical buttons, or voice. Inspired by new materials such as Gelly, a mutual-capacitance sensor developed in the UBC Madden lab that can sense touch localization, proximity, pressure, and shear, this project explores the potential of in-air gestures performed above the surface to expand both the interaction space and the richness of input for such devices.
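As a toy illustration of what proximity sensing enables, a hover swipe can be classified from the motion of the hand's position estimate over the surface. Everything below is a hypothetical sketch (the function, threshold, and input representation are assumptions, not Gelly's actual API):

```python
def classify_swipe(centroids, min_travel=0.2):
    """Classify an in-air swipe from a sequence of normalized hover
    x-centroids (0.0 = left edge, 1.0 = right edge) captured while the
    hand is within proximity range. Returns "swipe_left", "swipe_right",
    or None if net travel is below `min_travel`.
    """
    if len(centroids) < 2:
        return None
    travel = centroids[-1] - centroids[0]  # net horizontal displacement
    if travel > min_travel:
        return "swipe_right"
    if travel < -min_travel:
        return "swipe_left"
    return None
```

A real recognizer would also use proximity and pressure channels to segment gestures and reject incidental hand motion; this sketch only shows the direction decision.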

A wearable haptic notification system for speakers and session chairs.

To guide walking rate, we detect cadence and cue individual steps with minimal attentional impact.
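The cadence-then-cue loop described above can be sketched in a few lines. The class below is a minimal illustration under assumed details (class name, window size, and cue-scheduling rule are all hypothetical, not the group's implementation): it estimates cadence from recent step timestamps and schedules the next haptic pulse at the target interval.

```python
from collections import deque

class CadenceCuer:
    """Estimate walking cadence from detected steps and schedule
    vibrotactile cues at a target rate (illustrative sketch)."""

    def __init__(self, target_cadence_spm, window=6):
        self.target_interval = 60.0 / target_cadence_spm  # seconds per step
        self.steps = deque(maxlen=window)  # recent step timestamps (s)

    def on_step(self, t):
        """Record a detected step (e.g. an accelerometer peak) at time t."""
        self.steps.append(t)

    def cadence_spm(self):
        """Current cadence in steps/minute from the mean inter-step interval."""
        if len(self.steps) < 2:
            return None
        times = list(self.steps)
        intervals = [b - a for a, b in zip(times, times[1:])]
        return 60.0 / (sum(intervals) / len(intervals))

    def next_cue_time(self):
        """Time to fire the next haptic pulse, one target interval
        after the most recent step."""
        if not self.steps:
            return None
        return self.steps[-1] + self.target_interval
```

For example, steps at 0.0, 0.6, and 1.2 s give a cadence of 100 steps/minute; with a 120 spm target the next cue fires 0.5 s after the last step.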

People differ widely in how they perceive and want to use tactile signals. What kinds of tools do they need to customize these signals for themselves?