Sensory Perception & Interaction Research Group

University of British Columbia

Collection of work on physical human-robot and human-computer interaction, spanning touch sensing, affective computing, and companion robots for health.

Devices with small screens, such as smartwatches and fitness trackers, are limited in their forms of input, relying on touch, physical buttons, or voice. Inspired by new materials such as Gelly, a mutual-capacitance sensor developed in the UBC Madden lab that can sense touch localization, proximity, pressure, and shear, this project explores the potential of in-air gestures performed above the surface to expand both the interaction space and the richness of input for such devices.
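To make the idea concrete, here is a minimal sketch of how hover gestures might be detected from a proximity-sensing grid. It is illustrative only: the electrode grid size, thresholds, and signal shapes are all assumptions, not details of Gelly or this project.

```python
import numpy as np

# Hypothetical thresholds; real values would come from sensor calibration.
TOUCH_THRESHOLD = 0.8   # normalized capacitance change indicating contact
HOVER_THRESHOLD = 0.2   # weaker change from a finger in the air above

def classify_frame(grid):
    """Label one frame of normalized mutual-capacitance readings."""
    peak = grid.max()
    if peak >= TOUCH_THRESHOLD:
        return "touch"
    if peak >= HOVER_THRESHOLD:
        return "hover"
    return "none"

def finger_centroid(grid):
    """Estimate finger position as the signal-weighted centroid."""
    ys, xs = np.indices(grid.shape)
    total = grid.sum() + 1e-9
    return (xs * grid).sum() / total, (ys * grid).sum() / total

# Simulated frames: a finger hovering left-to-right above a 4x4 grid.
frames = []
for cx in np.linspace(0.5, 2.5, 5):
    ys, xs = np.indices((4, 4))
    frames.append(0.4 * np.exp(-((xs - cx) ** 2 + (ys - 1.5) ** 2)))

hover = [finger_centroid(f) for f in frames if classify_frame(f) == "hover"]
dx = hover[-1][0] - hover[0][0]
if abs(dx) > 1.0:
    print("in-air swipe", "right" if dx > 0 else "left")
```

Tracking the centroid while the peak stays below the touch threshold is what distinguishes in-air gestures from on-surface input.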

The CuddleBot allows us to use an animal model of therapy to examine how touch interactions can mitigate stress and anxiety.

We develop machine learning models of affect using touch and biometrics (such as EEG, skin conductance, and heart rate) to support affective interaction with companion robots for health applications.
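As a rough sketch of what such a model can look like (placeholder data and features, not the group's actual pipeline), the example below trains a standard classifier on per-window feature vectors that combine touch statistics with biometric measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per interaction window. In practice each row
# would hold pre-extracted touch features (e.g., mean pressure, stroke
# rate) and biometric features (e.g., skin-conductance level, heart rate).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))      # 200 windows, 6 features
y = rng.integers(0, 3, size=200)   # affect labels: 0=calm, 1=neutral, 2=stressed

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With random placeholder data the accuracy hovers near chance; the point is the shape of the pipeline, not the number.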

Bits are one-degree-of-freedom actuated sketches that we use as a design tool to explore the rendering of emotion through physical and visual motion.
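One way to parameterize such a sketch is to map affect dimensions onto motion qualities. The mapping below (arousal to speed, valence to smoothness) is a hypothetical illustration, not the lab's design:

```python
import numpy as np

def motion_profile(arousal, valence, duration=2.0, rate=50):
    """One-degree-of-freedom position trajectory (0..1) driven by affect.

    Hypothetical mapping: arousal sets oscillation frequency;
    negative valence adds jitter, making the motion less smooth.
    """
    t = np.linspace(0, duration, int(duration * rate))
    freq = 0.5 + 2.5 * arousal                      # 0.5 to 3 Hz
    base = 0.5 + 0.4 * np.sin(2 * np.pi * freq * t)
    jitter = 0.1 * max(0.0, -valence)
    rng = np.random.default_rng(0)
    return np.clip(base + rng.normal(0, jitter, t.shape), 0, 1)

# An agitated motion: fast and jittery (high arousal, negative valence).
trajectory = motion_profile(arousal=0.9, valence=-0.6)
print(trajectory[:5])
```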

The smart fur prototype is a new type of touch sensor built with conductive fur, designed to aid gesture recognition in the Haptic Creature system.
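A toy example of the kind of gesture discrimination such a sensor enables: with simulated one-dimensional conductance signals (the signal model and threshold below are assumptions), smooth strokes and rapid scratches separate cleanly on a simple roughness feature.

```python
import numpy as np

def roughness(window):
    """Mean absolute sample-to-sample change: low for smooth strokes,
    high for rapid scratching."""
    return np.abs(np.diff(window)).mean()

# Simulated fur-conductance signals over one second at 200 Hz:
# a slow stroke varies smoothly; a scratch fluctuates rapidly.
t = np.linspace(0, 1, 200)
signals = {
    "stroke": 0.5 + 0.3 * np.sin(2 * np.pi * t),
    "scratch": 0.5 + 0.3 * np.sin(40 * np.pi * t),
}

for name, sig in signals.items():
    r = roughness(sig)
    label = "scratch" if r > 0.05 else "stroke"   # hypothetical threshold
    print(f"{name}: roughness={r:.3f} -> classified as {label}")
```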