Sensory Perception & Interaction Research Group

University of British Columbia

Full citation: 
Jung, M.M., Cang, X.L., Poel, M., and MacLean, K.E. "Touch Challenge '15: Recognizing Social Touch Gestures." In Proceedings of the 2015 ACM International Conference on Multimodal Interaction (ICMI '15), pp. 387-390. ACM, Nov. 2015.
Abstract: 
Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We extended this challenge to the research community working on multimodal interaction, with the goal of sparking interest in the touch modality and promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure sensor data of social touch gestures, each performed by touching a touch-sensitive surface with the hand. Both sets were collected from similar sensor grids, but under conditions reflecting different application orientations: CoST (Corpus of Social Touch) and HAART (the Human-Animal Affective Robot Touch gesture set). In this paper we describe the challenge protocol and summarize the results of the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) techniques transferred from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the variety of data processing methods used.
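To make the recognition task concrete, the following is a minimal, hypothetical sketch (not the challenge's reference code) of how a participant might extract simple statistical features from a sequence of pressure frames and train a classifier. It assumes an 8x8 sensor grid sampled over time, roughly in the spirit of the CoST/HAART recordings; the gesture classes, the `gesture_features` helper, and the data here are synthetic stand-ins, and scikit-learn is used purely for illustration.

```python
# Illustrative sketch: per-gesture features from pressure-frame sequences,
# classified with a random forest. All data below is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def gesture_features(frames):
    """frames: array of shape (n_frames, 8, 8) holding pressure values."""
    per_frame_sum = frames.sum(axis=(1, 2))                     # overall pressure over time
    per_frame_max = frames.max(axis=(1, 2))                     # peak pressure over time
    contact_area = (frames > frames.mean()).mean(axis=(1, 2))   # rough contact-size proxy
    return np.array([
        per_frame_sum.mean(), per_frame_sum.std(),
        per_frame_max.mean(), per_frame_max.max(),
        contact_area.mean(), contact_area.std(),
        len(frames),                                             # gesture duration in frames
    ])

# Synthetic stand-in data: 200 gestures across 4 hypothetical gesture classes.
rng = np.random.default_rng(0)
X, y = [], []
for label in range(4):
    for _ in range(50):
        n_frames = int(rng.integers(20, 60))
        frames = rng.random((n_frames, 8, 8)) * (label + 1)      # class-dependent intensity
        X.append(gesture_features(frames))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In this sketch the feature vector plays the role that image-, speech-, or action-recognition inspired features played in the actual challenge entries; any such descriptor that summarizes the spatial and temporal structure of the pressure sequence can be dropped into the same pipeline.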
Year Published: 
2015