M.Sc. student Shane D. Sims and his supervisor, Professor Cristina Conati, won Best Paper at the 22nd ACM International Conference on Multimodal Interaction in October 2020.
ABSTRACT: Encouraged by the success of deep learning in a variety of domains, we investigate the effectiveness of a novel application of such methods for detecting user confusion from eye-tracking data. We introduce an architecture that uses RNN and CNN sub-models in parallel to take advantage of the temporal and visuospatial aspects of our data. Experiments with a dataset of user interactions with the ValueChart visualization tool show that our model outperforms an existing model based on a Random Forest classifier, yielding a 22% improvement in combined confused and not-confused class accuracies.
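The parallel RNN + CNN design described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the input shapes, layer sizes, and the use of a GRU branch over a gaze-feature sequence alongside a small convolutional branch over a scanpath image are all assumptions made for the example.

```python
# Hypothetical sketch of a parallel RNN + CNN classifier for binary
# confusion detection from eye-tracking data (not the authors' code).
# Assumed inputs: a temporal sequence of gaze features per sample, plus
# a single-channel visuospatial image (e.g. a scanpath or heatmap).
import torch
import torch.nn as nn

class ConfusionNet(nn.Module):
    def __init__(self, seq_feats=6, hidden=32, img_channels=1):
        super().__init__()
        # RNN branch: captures the temporal aspect of the gaze data
        self.rnn = nn.GRU(seq_feats, hidden, batch_first=True)
        # CNN branch: captures the visuospatial aspect
        self.cnn = nn.Sequential(
            nn.Conv2d(img_channels, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fused head: concatenated features -> confused / not-confused logits
        self.head = nn.Linear(hidden + 16, 2)

    def forward(self, seq, img):
        _, h = self.rnn(seq)        # h: (num_layers=1, batch, hidden)
        rnn_feats = h.squeeze(0)    # (batch, hidden)
        cnn_feats = self.cnn(img)   # (batch, 16)
        return self.head(torch.cat([rnn_feats, cnn_feats], dim=1))

model = ConfusionNet()
# Batch of 4 samples: 50 timesteps x 6 gaze features, and a 64x64 image
logits = model(torch.randn(4, 50, 6), torch.randn(4, 1, 64, 64))
```

The two branches are run in parallel and fused by concatenation before a single linear layer, mirroring the abstract's description of exploiting temporal and visuospatial information jointly.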