Designed as a bland, unobtrusive appliance, Calmer is a bed for premature infants that replicates a mother's breathing motion and heart rate.
Bits are one-degree-of-freedom actuated sketches that we use as a design tool to explore the rendering of emotion through physical and visual motion.
Inspired by the need for haptic support of large motions on a surface (in embodied conceptual learning, commercial design, and 2D virtual/augmented reality), we present the ballpoint drive. This novel approach circumvents conventional constraints by imposing a new one: motion restricted to rolling on an arbitrary two-dimensional surface, with grounding forces generated through friction.
SPIN research in haptic notification and movement guidance spans a number of different project arcs, as seen through associated publications.
Educational haptic platforms can combine various modalities to create effective interactive environments that support embodied physical interaction. These platforms can leverage a student's physical intuition to make abstract topics in physics, math, and other sciences more concrete.
Haptipedia is a comprehensive library of haptic devices, annotated with designer-relevant metadata and accessed through an interactive visualization that helps designers find relevant design examples.
We develop machine learning models of affect using touch and biometrics (such as EEG signals, skin conductance, and heart rate) to support affective interaction with companion robots for health applications.
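As a minimal sketch of the idea, affect classification from biometric features can be illustrated with a nearest-centroid model over hand-crafted feature vectors. The feature layout, class labels, and classifier choice here are all illustrative assumptions, not the published models.

```python
import numpy as np

class NearestCentroidAffect:
    """Toy affect classifier: each sample is a feature vector
    (e.g., mean skin conductance, heart rate, touch pressure),
    and a sample is assigned the label of the nearest class centroid.
    Features and labels are hypothetical, for illustration only."""

    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        # One centroid per label: the mean of that label's samples.
        self.centroids_ = np.array(
            [np.mean([x for x, lab in zip(X, y) if lab == c], axis=0)
             for c in self.labels_])
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # Euclidean distance from each sample to each centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return [self.labels_[i] for i in np.argmin(d, axis=1)]
```

In practice the lab's models are far richer (temporal biosignals, learned features), but the pipeline shape is the same: featurize the signals, then map features to an affect label.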
Investigating the challenges of haptic experience design faced by hapticians, from industry practitioners to novices starting out in the field. We aim to create effective solutions, such as tools that support hapticians at different stages of the haptic design process.
People and robots communicating and collaborating in close proximity on manufacturing tasks.
The CuddleBot allows us to use an animal model of therapy to examine how touch interactions influence stress and anxiety mitigation.
Embodied, physical interaction can improve learning by making abstractions concrete. We use DIY haptic devices to move hands-on learning online.
A wearable haptic notification system for speakers and session chairs.
Direct manual cues, such as touching an interlocutor's face while they talk, can enhance the intelligibility of their speech. We aim to investigate the feasibility of using vibrotactile feedback to enhance speech intelligibility in acoustically noisy environments.
Crowdsourcing can gather rapid feedback at scale, but how do we crowdsource a haptic prototype? Haptic proxies are visualizations and simplified phone vibrations that can be used to represent high-fidelity haptics in studies conducted using Mechanical Turk.
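One way to picture a haptic proxy is as a reduction from a high-fidelity amplitude envelope to the coarse on/off vibration patterns consumer phones expose (alternating off/on durations in milliseconds). The frame size, threshold, and function name below are illustrative choices, not the method from the study.

```python
def to_pulse_pattern(envelope, frame_ms=50, threshold=0.3):
    """Reduce a per-frame amplitude envelope (values in 0..1) to a
    simple alternating [off_ms, on_ms, off_ms, ...] pulse pattern,
    the shape many phone vibration APIs expect.
    Frame size and threshold are illustrative, not from the paper."""
    pattern = [0]        # leading "off" duration (0 if signal starts on)
    state_on = False
    for a in envelope:
        on = a >= threshold
        if on == state_on:
            pattern[-1] += frame_ms   # extend the current segment
        else:
            pattern.append(frame_ms)  # start a new segment
            state_on = on
    return pattern
```

For example, a four-frame envelope `[0.5, 0.5, 0.1, 0.6]` collapses to `[0, 100, 50, 50]`: no leading silence, 100 ms on, 50 ms off, 50 ms on.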
A collection of work on physical human-robot interaction in industrial and manufacturing contexts.
Devices with small screens, such as smartwatches and fitness trackers, struggle with limited forms of input, relying on touch, physical buttons, or voice. Inspired by new materials such as Gelly (a mutual-capacitance-based sensor developed in the UBC Madden lab that can sense touch location, proximity, pressure, and shear), this project explores the potential of in-air gestures performed above the surface to expand both the interaction space and the richness of input for such devices.
Flexible smartphones can use life-like movement to communicate information to users.
Macaron is a free, open-source, online haptic editor. Designed to be an easy-to-use tool for anyone working with haptics, it is a platform for experimenting with new haptic design features and has helped us understand how examples aid the haptic design process.
Radiologists use 30-year-old technology to interact with cutting-edge diagnostic imagery. We studied their problem and looked at the ways haptic feedback could be used to improve it.
To guide walking rate, we detect cadence and cue individual steps with minimal attentional impact.
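Cadence detection of this kind can be sketched as peak counting on vertical acceleration: find local maxima above a threshold, enforce a refractory gap between steps, and convert the mean inter-step interval to steps per minute. The function name, threshold, and gap below are illustrative assumptions, not the published detection algorithm.

```python
import numpy as np

def estimate_cadence(accel, fs, min_step_gap=0.25, threshold=1.2):
    """Estimate walking cadence (steps/min) from vertical acceleration.

    accel: 1-D array of vertical acceleration (in g); fs: sample rate (Hz).
    A step is counted at each local peak above `threshold` that occurs at
    least `min_step_gap` seconds after the previous counted step.
    (All parameter values here are illustrative, not from the system.)"""
    min_gap = int(min_step_gap * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(accel) - 1):
        if (accel[i] > threshold
                and accel[i] >= accel[i - 1]
                and accel[i] >= accel[i + 1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return 0.0
    mean_interval = np.mean(np.diff(peaks)) / fs   # seconds per step
    return 60.0 / mean_interval                    # steps per minute
```

Once cadence is known, cueing is the easy half: schedule a brief vibrotactile pulse at the target step period, which is why the attentional footprint can stay small.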
The smart fur prototype is a new type of touch sensor built with conductive fur, with the goal of aiding in gesture recognition in the Haptic Creature system.
Our furry robots' breathing is innately calming. Can we use this therapeutically?
People describe vibrations in emotional terms; for example, some signals feel more "lively" than others. Can this language be used to stylize vibrations to feel more or less extreme along certain emotional dimensions?
People differ widely in how they perceive and want to use tactile signals. What kinds of tools are needed to support them in customization?
Voodle is an interface that uses vocal input to design haptic behaviour.