P. H. Bucci, X. L. Cang, H. Mah, L. Rodgers, and K. E. MacLean, “Real Emotions Don’t Stand Still: Toward Ecologically Viable Representation of Affective Interaction,” in Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction (ACII), 2019.
To create emotionally expressive robots, designers of human-robot interaction routinely translate emotion theories into instruments through which we estimate, quantify, and analyze human emotional responses to robot behaviour. Pragmatically, we often use straightforward models such as Russell’s circumplex, treating emotion as a single point in a two-dimensional space. However, this simple metaphor and the representations built on it omit many aspects of real emotional experience, can lead to erroneous data, and may undermine the computational models that rely on them. Problems with the emotion representations currently prevalent in human-robot interaction fall into three categories: (1) Representations are static and singular, whereas real emotions can be dynamic, multi-valued, uncertain, or conflicting. (2) The framing of an interaction is unspecified (i.e., in an affective rating task, which part of an interaction involving multiple parties and perspectives the participant is meant to consider). (3) Participant responses captured with instruments and methods that are not well understood by either experimenters or participants produce data that are hard to interpret. We propose alternative emotion representations that account for the dynamic emotions inherent in interactive contexts, scrutinize framing ambiguities in study tasks, and argue for mixed-methods approaches to achieve a shared understanding of emotion representations between participants and researchers.