Karon MacLean

This page gives a brief (and necessarily incomplete) background on haptic and physical user interfaces, followed by a paragraph on each of the aspects I'm most interested in. Students interested in projects in any of these areas should contact me.

Why Do We Need New Kinds of Physical Interfaces?

We've gotten used to interacting with the computer through a keyboard and mouse; sometimes a joystick. If we are heavy computer users, we use them all day. They are great tools for some things, but they constrain interaction in many ways:
  • The absence of tactile feedback reduces information, overloads the visual sense and increases motor strain.
  • Input is largely limited to discrete operations such as typing and selecting.
  • While a mouse affords some continuous control, without physical resistance or tactile response it is not very effective. Those who try to draw, sculpt or create continuous audio lines for long periods of time using a mouse are generally frustrated and in pain.
  • There are adverse health consequences to the fixed posture and small repetitive movements required by working at a desk.

It's more interesting to think of it from the other direction: consider the things we've given up in the physical world which might be nice to have back, but augmented with computation and connectivity. Paintbrush and pencils and musical instruments; a single personal key that lets you into home, car and work and has a distinct feel as you insert it in a lock depending on whether your spouse, a friend or a stranger has been by in your absence; a bank card that feels as heavy as your account balance when you swipe it in the ATM.

In short, there are unlimited new and old functions and information that computers make available to us, and we need to develop a correspondingly broad variety of handles in many contexts to exploit them fully. However, there are many constraints and requirements which make this a nontrivial task.

What is a Physical User Interface?

Several kinds of physical interfaces have emerged in the past decade as a new means of interacting with computers, on the desktop, in the operating room or embedded in objects and architecture.

One medium of physical interface is actuated haptic or force feedback, whereby a user feels forces and vibrations corresponding to a computer interaction model while exploring or manipulating a remote or virtual space; these are often gainfully integrated with multimodal sensory cues (auditory, visual).

Another category relies on the user's manipulation of passive computer-recognizable tokens to direct or explore the system represented in the computer. Many other possibilities and combinations for physical interfaces exist; this research employs them as part of an interface design palette.

Application-Driven Device Design

This is really an approach to all of my research, rather than a specific thread. The working principle is that the most immediately useful innovation comes out of solving real problems. This is not to say that the solutions we try will be immediately workable or even successful in principle: but they will exercise the technology and tend to push it in valid directions. Thus, we iterate in a continual [observe] -> [design] -> [build] -> [evaluate in context] cycle.

Integrated Multisensory Interaction

In the real world, we rarely feel without also hearing or seeing; haptic sensations are generally coupled inextricably with auditory and visual stimuli. We receive information from multiple senses at once, often without being able to discern which sense supplied what.

Reproducing this fine-grained coupling in synthetic sensory displays is a problem of both time and space: for example, to be perceived as part of the same event, an auditory and a visual sensation must occur with apparent simultaneity and appear to emanate from the same location. Human psychophysical acuity (still being determined) defines the specifications to which synthetic displays must render stimuli, and some of these values challenge state-of-the-art computer hardware.
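The timing side of this specification can be sketched as a simple check against a perceptual simultaneity window. Note that the 20 ms threshold below is purely a placeholder: the actual thresholds are exactly what the psychophysical work aims to determine.

```python
# Illustrative sketch: deciding whether a display's audio and visual
# event onsets fall within an assumed perceptual simultaneity window.
# The threshold value is a placeholder, not a measured quantity.

SIMULTANEITY_WINDOW_S = 0.020  # assumed detection threshold (placeholder)

def perceived_as_one_event(t_audio_s: float, t_visual_s: float,
                           window_s: float = SIMULTANEITY_WINDOW_S) -> bool:
    """True if the audio and visual onsets are close enough in time
    that, under the assumed threshold, a user would fuse them."""
    return abs(t_audio_s - t_visual_s) <= window_s

# A 60 Hz visual frame arriving one frame late (~16.7 ms) still fuses
# under this assumed window; two frames late (~33 ms) does not.
print(perceived_as_one_event(0.000, 0.0167))  # True
print(perceived_as_one_event(0.000, 0.0333))  # False
```

The same kind of bound applies spatially (apparent co-location of the sources), which is harder to express as a single number because it depends on display geometry.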

There are several branches to this problem. Continued psychophysical testing requires innovative experimental hardware and protocols relevant to real design problems. The physical design of co-located haptic / acoustic / visual displays is a mechatronics challenge and is often context-specific. Finally, driving both of these is the continual search for the best application, and integration of devices into appropriate environments and then testing them for performance or comfort enhancement.

Low-Power, Embedded and Parasitic Interfaces

Most haptic displays to date are situated on the desktop. However, distributed computation is already a reality in many environments, and this thread aims to enable distributed user control. I have focussed on simple, low-cost manual computer interfaces designed into the user's environment and customized to particular tasks - for instance, an automobile cockpit, a living room media control console or a mobile phone. Because these controls (knobs, buttons and sliders) have only a single degree of freedom, high-quality haptic feedback can be generated at relatively low cost. We will build on prior work in prototyping media remote controls, architectural features and auto interiors, and proceed to develop methods for functionally evaluating and iteratively designing such integrated, task-oriented displays.
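To make the single-degree-of-freedom case concrete, here is a minimal sketch (not taken from any particular device of ours) of a common way to render haptic detents on an actuated knob: a periodic restoring torque pulls the knob toward N evenly spaced "click" positions, with a damping term to steady the feel. All gains are made-up example values.

```python
import math

def detent_torque(angle_rad: float, velocity_rad_s: float,
                  n_detents: int = 12,
                  stiffness_nm: float = 0.02,
                  damping_nms: float = 0.001) -> float:
    """Commanded motor torque (N*m) for a virtual-detent knob.
    All gains here are illustrative example values."""
    # Sinusoidal profile: zero torque at each detent centre, with a
    # restoring torque pushing back toward the nearest centre.
    restoring = -stiffness_nm * math.sin(n_detents * angle_rad)
    damping = -damping_nms * velocity_rad_s
    return restoring + damping

# At a detent centre (angle = 0, at rest) no torque is commanded:
print(detent_torque(0.0, 0.0))  # 0.0
```

In a real device this function would run in a fast control loop, reading the knob's encoder and commanding the motor each cycle; changing the torque profile in software is what lets one knob feel like many different physical controls.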

Parasitic interfaces: Adding complexity and actuation to user interfaces increases their power consumption, with undesirable ecological consequences, and compromises both portability and embeddability. Since the basic effect of force feedback is to transmit power to the user, we envision haptic interfaces which unobtrusively collect mechanical power from the user as part of the haptic display cycle, rather than primarily transmitting it into the user, as is the norm. Preliminary investigation and prototyping (MIE440f) has suggested that power available in this form can be significant relative both to haptic display and embedded processor requirements.
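A back-of-envelope sketch shows why this can be plausible: when a user turns a control against a resistive torque, the mechanical power dissipated is P = torque × angular velocity, and some fraction of it could be converted. Every number below is an assumption chosen for illustration, not a measurement from the prototyping work mentioned above.

```python
# Back-of-envelope sketch of the parasitic-power idea. All values are
# assumed for illustration only.

def harvested_power_w(torque_nm: float, omega_rad_s: float,
                      efficiency: float) -> float:
    """Electrical power recovered from a braked single-DOF control:
    mechanical power (torque * angular velocity) times a conversion
    efficiency."""
    return torque_nm * omega_rad_s * efficiency

# Assumed example: 20 mN*m braking torque, a brisk 5 rad/s twist,
# 50% conversion efficiency -> 50 mW.
p = harvested_power_w(0.020, 5.0, 0.5)
print(round(p * 1000))  # 50  (milliwatts)
```

Whether such a figure is "significant" depends on the duty cycle of interaction and on what the embedded electronics actually draw, which is part of what the prototyping has to establish.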

Reading and Writing Haptic Language

As haptic display hardware and algorithmic techniques mature, the field has become proficient at geometric and dynamic rendering of virtual physical environments - e.g. a virtual surgical site. The next difficult step is to communicate more abstract information between user and computer, or between users mediated by a computer. Again, there are several potential branches to the problem, each centered in a different primary discipline.
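The "geometric and dynamic rendering" the field now handles routinely can be illustrated by the canonical one-dimensional virtual wall: in free space the device exerts no force, and when the user's probe penetrates the wall a spring-damper force pushes back. The gains below are example values, not taken from any particular system.

```python
# Minimal sketch of impedance-style haptic rendering: a 1-D "virtual
# wall" at x = 0, rendered as a spring-damper. Gains are example values.

def wall_force_n(x_m: float, v_m_s: float,
                 k_n_per_m: float = 500.0,
                 b_ns_per_m: float = 2.0) -> float:
    """Force displayed to the user's hand, given penetration depth x
    into the wall (x > 0 means inside the wall) and velocity v."""
    if x_m <= 0.0:
        # Free space: no force.
        return 0.0
    # Inside the wall: spring pushes the probe back out, damper
    # resists further inward motion.
    return -(k_n_per_m * x_m + b_ns_per_m * v_m_s)

print(wall_force_n(-0.01, 0.0))  # 0.0 (outside the wall)
print(wall_force_n(0.002, 0.1))  # -1.2 (pushed back out)
```

Communicating abstract information - a warning, an identity, an emotional tone - has no such physics to fall back on, which is what makes it the harder problem.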

Evaluation Techniques: Usability and Psychophysics

While haptic and other tangible interfaces are proliferating, we do not yet have good, much less standardized, means of assessing and comparing their functionality. One pragmatic reason for wanting to do so is to convince potential funders of an invention's value; another is to fine-tune our own research direction.

Development and application of in-context evaluation techniques are part of the iterative, application-based design practice alluded to above, and make for some interesting challenges: placing the device in context means building an appropriately realistic experimental setup. Stimuli must be supplied in controlled yet natural ways, and human responses measured, sometimes at resolutions that strain computational and sensory limits. Thus this type of research involves mechatronics, psychology and the most creative of experimental design. Performance-assessment projects in surgical and automobile-cockpit environments are anticipated.