A Multimodal Approach Towards Robust Human-Machine Interaction - PDLS Talk by Junaed Sattar, UBC/CS

Location: DMP 110, 6245 Agronomy Rd.

Title: Towards Multimodal Human-Robot Interaction: Algorithms, Applications and Systems

Speaker:  Junaed Sattar, Postdoctoral Research Fellow, UBC Computer Science

Abstract:
In recent times, rapid advances in intelligent machines have begun to change the way we perceive and use technology. Groundbreaking progress in systems and algorithms is contributing to a higher quality of life, and is also making a significant impact in more exotic domains. Robots are increasingly being used in diverse environments for a variety of tasks, including but not limited to exploration, search-and-rescue and rehabilitation. In addition, with the advent of inexpensive, pervasive systems, robots are predicted to become common in indoor, personal applications. For safe, reliable robot operation, close collaboration between humans and robots is of utmost importance, as are robust, intuitive and natural communication methods. My research looks at algorithms and interfaces for human-robot interaction and control of autonomous robots in arbitrary environments. In particular, I have investigated vision-based approaches to human-robot interaction, human-motion detection and robust tracking for human-robot collaborative missions, especially in underwater exploration. Recent work has looked at a quantitative model of task cost and uncertainty in human-robot dialog, which prevents robots from carrying out potentially dangerous or unsafe tasks unless they are confirmed by a human partner. Using a network of smart devices, sensors and robots to provide a rich sensory representation of the environment, a distributed model of task cost assessment is created. Coupled with a cost-uncertainty dialog engine, we are thus able not only to detect unsafe commands, but also to accurately identify the parts of a command that contribute to the lack of safety. This human-robot dialog framework has been evaluated on board a number of different robotic platforms -- including the Aqua amphibious robot and the Willow Garage PR2.
Currently, under the umbrella of the CanWheel project, this mechanism is being evaluated for risk assessment in the collaborative control of robotic wheelchairs by older, cognitively impaired adults. This talk will offer insight into this research in sensory human-robot interaction, presenting findings from quantitative and qualitative studies as well as from various robot field trials.
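To make the cost-uncertainty idea concrete, here is a minimal sketch of how such a safety check might work. All names, the risk formula, and the threshold rule are illustrative assumptions for this announcement, not the speaker's actual model: the point is only that per-component cost and uncertainty estimates let the system both reject an unsafe command and name the component responsible.

```python
# Hypothetical sketch of a cost-uncertainty dialog check.
# The risk formula and threshold are assumptions, not the actual CanWheel model.

def assess_command(components, threshold=1.0):
    """Given per-component (expected_cost, uncertainty) estimates for a
    command, decide whether to execute it or ask the human partner for
    confirmation, and report which component drives the risk."""
    # Risk of each component: expected cost inflated by its uncertainty.
    risks = {name: cost * (1.0 + sigma)
             for name, (cost, sigma) in components.items()}
    total_risk = sum(risks.values())
    if total_risk <= threshold:
        return ("execute", None)
    # Flag the component contributing most to the unsafe total,
    # so the dialog can ask about that part specifically.
    worst = max(risks, key=risks.get)
    return ("confirm", worst)

# Example command: moving forward is cheap and certain, but passing
# through a doorway is costly and uncertain, so confirmation is requested.
command = {"drive_forward": (0.2, 0.1), "through_doorway": (0.9, 0.8)}
action, culprit = assess_command(command)
```

In this toy version, a distributed deployment would simply have each smart device or sensor contribute its own (cost, uncertainty) estimate to the `components` dictionary before the decision is made.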
Bio:
Junaed is an FRQNT post-doctoral fellow in the Laboratory for Computational Intelligence at the Department of Computer Science, University of British Columbia, working under the supervision of Dr. James J. Little. He completed his PhD at McGill University at the end of 2011, pursuing research in mobile robotics and human-robot interaction with Dr. Gregory Dudek at the McGill Mobile Robotics Lab in the School of Computer Science. He holds an FRQNT (Fonds de recherche du Québec - Nature et technologies) Post-Doctoral Scholarship, and was a recipient of an FRQNT Doctoral Scholarship and a PRECARN industrial scholarship during his doctoral studies. His research interests are in mobile robotics, computer vision, human-robot interaction, machine learning and software architectures for intelligent systems. He is also a member of the CanWheel project, investigating robotic applications to improve mobility and interaction for users of powered wheelchairs.