Haptic interfaces are a technology whereby interactive user-machine communication is established by way of mechanical signals, that is, gestures and touch sensations. We have found that progress in this area is impeded by several problems. In this project, we are working on a subset of these problems that are motivated by industrial needs and are also academically relevant:
1. The need for automatic determination of the parameters of haptic effects delivered to users. Novice users have trouble coping with the complexity of the settings. Given a device and an application context, only certain parameter combinations are useful, and these should be determined automatically.
2. Existing application programming interfaces (APIs) focus on niche applications and fall short of generality (this would be like having graphics APIs that could only render shiny surfaces). We propose to develop a richer set of primitives and efficient implementations; a sketch of what such an interface might look like follows this list.
3. Not all approaches to presenting haptic information admit intrinsic physical models. This entails
the development of an abstract "haptic language" able to
articulate haptic signals that users can intuitively interpret,
manipulate and create.
4. The industry is looking for new classes of devices that can go beyond point-like interaction paradigms, namely devices capable of distributed stimulation. The technology we have developed thus far is sufficiently practical to address the programming of such distributed tactile displays.
These four problems are interdependent and share the property that they are better researched in an academic environment. They all depend on knowledge of human performance, both from a cognitive viewpoint (e.g., what characterizes the intelligibility of haptic signals?) and from a sensory-motor performance perspective (e.g., what engineering shortcuts are acceptable or unnoticeable to a user?). Solving these problems will help haptic interfaces deliver far more effective results given the available hardware and computational resources.