Using Eye-Tracking Data for High-Level User Modeling in Adaptive Interfaces

By Kasia Muldner

In recent years, there has been substantial research exploring how AI can contribute to Human-Computer Interaction by enabling an interface to understand a user's needs and act accordingly. Understanding user needs is especially challenging when it involves assessing high-level mental states that are not easily reflected in interface actions. We present our results on using eye-tracking data to model such mental states during interaction with adaptive educational software. We then discuss the implications of our research for Intelligent User Interfaces.
