Difference: BikramAdhikariProjectIter3 (4 vs. 5)

Revision 5 (2014-04-14) - TWikiGuest

Line: 1 to 1
 
META TOPICPARENT name="BikramAdhikari"

CPSC 543 Project - Third Iteration

Line: 54 to 54
 Wizard of Oz interfaces have drawn interest in the research community as a tool to train novice elderly PWC users. Problems associated with unintuitive guidance (in the speed control policy) and switching discontinuity (in the direction control policy) make it difficult for a novice elderly PWC user to comprehend PWC behavior. In this project, we intend to bridge this discontinuity in the direction control policy and the lack of direction guidance in the speed control policy. We integrate speed control with steering guidance rendered as shear force on a custom-designed, low-cost joystick handle. This shear force rendering does not require a bulky, high-torque joystick; instead, we position the shear force actuator at a location requiring minimal torque. This shear force form of guidance has not been tested before on PWCs. We hypothesize that our proposed shared control strategy will keep the user in control at all times and provide assistance when the user is unclear about a suitable driving direction.
Changed:
<
<
Our first and second design iterations focused on exploring the problem space and the possible technical solution space, respectively. We leave the discussion of findings from these iterations to our final report. In the next section, we describe an experiment we propose to test our joystick interface and the proposed control strategy.
>
>
Our first and second design iterations focused on exploring the problem space and the possible technical solution space, respectively. We leave the discussion of findings from these iterations to our final report. In the next section, we describe an experiment we propose to test our shear force joystick interface and the proposed control strategy.
 

Experimental Setup

Deleted:
<
<
Divided attention
 It is difficult to recruit a large number of wheelchair users from the target demographic, which makes it difficult to produce statistically significant results [11], [12]. As in [10] and [11], we will use able-bodied users to evaluate the performance of our system. Future work will include a case study with a user from the target population.
Changed:
<
<

Egocentric View

Powered wheelchairs are usually driven by a joystick interface modified to the user's needs. Our user study from the previous iteration suggests that users prefer a joystick with a larger surface area, as it provides a greater sense of control and comfort. We use this opportunity to design an embedded, navigation-assistive physical user interface that would fit onto the joysticks of these wheelchairs. We use an egocentric view representation on a dome-shaped physical interface as a visual indication of the collision-free direction. This visual guidance is supported by haptic rendering as a sub-additive sensory modality to stimulate user reaction. Similar work has been done in [1], where a haptic display blocks motion of the joystick in certain directions, an LED display around the joystick knob indicates directions free from obstacles, and an audio prompt steers the user towards a certain direction. An auditory percept requires the user to interpret and then act on it; it would be more useful to provide natural guidance that directs the user towards a suitable trajectory. Here we extend this work by using shear force and vibratory haptic guidance instead of audio prompts, with the hypothesis that they can provide a sense of natural guidance towards a safely projected trajectory.

Visual Display

We use an RGB LED ring comprising 24 serially controllable red, green and blue channels. We found this display suitable (see Fig. [1]) for the size of the user interface we intend to build.

Figure [1] Egocentric LED display around wheelchair joystick

Our first attempt was to use the egocentric LED display to point in the direction towards which the intelligent system wants to guide the user. Figure [2] shows the brightest green as the direction of possible heading.

Figure [2] LED display showing a green heading in the direction away from the obstacle

Another form of display is suggestive of the direction in which the joystick could turn. We simulated this using trailing LEDs. This video shows the green trail turning in a clockwise direction. We could use this form of display if we need direction-only visual guidance. We have not experimented with multiple colors and other patterns; however, the above two forms cover the position and gradient components of guidance using visual feedback.
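To make the mapping concrete, here is a minimal Arduino-style sketch of both display forms. It assumes a WS2812-compatible ring driven by the Adafruit_NeoPixel library on data pin 6; the library choice and pin are our assumptions for illustration, only the 24-LED ring itself is given above.

// Minimal sketch: two guidance patterns on a 24-pixel RGB ring.
// Assumes a WS2812-compatible ring and the Adafruit_NeoPixel library;
// the data pin (6) is a placeholder.
#include <Adafruit_NeoPixel.h>

const int NUM_LEDS = 24;
Adafruit_NeoPixel ring(NUM_LEDS, 6, NEO_GRB + NEO_KHZ800);

// Map a guidance heading in degrees (0..360) to the nearest LED index.
int headingToLed(float deg) {
  return ((int)(deg / 360.0 * NUM_LEDS)) % NUM_LEDS;
}

// Pattern 1: brightest green at the guidance heading, dimming with
// angular distance (the "position" component of the guidance).
void showHeading(float deg) {
  int center = headingToLed(deg);
  for (int i = 0; i < NUM_LEDS; i++) {
    int d = min(abs(i - center), NUM_LEDS - abs(i - center)); // ring distance
    ring.setPixelColor(i, ring.Color(0, max(0, 255 - 60 * d), 0));
  }
  ring.show();
}

// Pattern 2: a green trail rotating clockwise, suggesting which way
// the joystick could turn (the "gradient" component of the guidance).
int trailPos = 0;
void showTrail() {
  ring.clear();
  for (int t = 0; t < 5; t++) {                    // 5-pixel fading tail
    int idx = (trailPos - t + NUM_LEDS) % NUM_LEDS;
    ring.setPixelColor(idx, ring.Color(0, 255 >> t, 0));
  }
  ring.show();
  trailPos = (trailPos + 1) % NUM_LEDS;            // advance clockwise
}

void setup() { ring.begin(); }

void loop() {
  showHeading(90.0);   // e.g. guide towards the user's left
  delay(100);
}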

>
>
 For our experiment, we choose one scenario representative of the activities of daily living of our target population. The Power-Mobility Indoor Driving Assessment (PIDA) [17] is an assessment tool designed to describe an individual's indoor mobility status. This assessment is conducted in the individual's own environment rather than on an isolated obstacle course. Out of the thirty tasks specified in the PIDA, we are particularly interested in the back-in parking task, as it involves driving in all directions, under limited visibility, with apparently complicated/confusing joystick motion.
 
Changed:
<
<

Haptic Rendering

We use a similar approach to haptic rendering as for the visual display. We use a servo motor with an arrow-like shaft on top to point in the direction of guidance. We mounted this motor into a hemispherical surface in such a way that the palm surface would be in contact with the pointer. This video shows the servo motor turning to its final position, which was clockwise from the original position. The LED display turning clockwise showed which direction to turn, while the arrow pointed in the direction of guidance.
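A minimal sketch of this pointer behaviour, assuming a standard hobby servo on pin 9 driven by the stock Arduino Servo library; the pin and the direction-to-angle mapping are illustrative assumptions.

// Minimal sketch: sweep the arrow-like servo shaft to the guidance
// direction. Assumes a standard hobby servo on pin 9 and the Arduino
// Servo library; the angle mapping is illustrative.
#include <Servo.h>

Servo pointer;

void setup() {
  pointer.attach(9);
}

// Guidance direction in degrees, clipped to a standard servo's range.
void pointTo(int deg) {
  pointer.write(constrain(deg, 0, 180));
}

void loop() {
  pointTo(45);     // e.g. clockwise from the original position
  delay(1000);
  pointTo(135);
  delay(1000);
}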
>
>
We will test three shared control policies: speed control, direction control, and speed control with direction guidance. We aim to determine which policy is most effective from the user's point of view and how each control policy affects quantitative measures such as completion time and trajectory smoothness.
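As one concrete possibility for the smoothness measure, the sketch below sums the absolute heading changes along a recorded trajectory; the report does not fix a formula, so this particular metric is our illustrative assumption.

// Sketch: one possible trajectory-smoothness measure (sum of absolute
// heading changes between successive waypoints); lower means smoother.
#include <cmath>
#include <iostream>
#include <vector>

struct Pt { double x, y; };

double smoothness(const std::vector<Pt>& path) {
  const double kPi = std::acos(-1.0);
  double total = 0.0;
  for (size_t i = 2; i < path.size(); i++) {
    double h1 = std::atan2(path[i-1].y - path[i-2].y, path[i-1].x - path[i-2].x);
    double h2 = std::atan2(path[i].y - path[i-1].y, path[i].x - path[i-1].x);
    double d = std::fabs(h2 - h1);
    if (d > kPi) d = 2 * kPi - d;   // wrap angle difference to [0, pi]
    total += d;
  }
  return total;
}

int main() {
  std::vector<Pt> path = {{0, 0}, {1, 0}, {2, 0.1}, {3, 0.5}};  // sample data
  std::cout << "smoothness = " << smoothness(path) << "\n";
}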
 
Changed:
<
<
This method has its challenges, mainly because the motor we are using is not strong enough to produce any torque when the palm rests solidly on it. Using a bigger motor is another option, which we will explore in the next iteration.
>
>
Each control policy will be tested three times with three different randomized initial conditions. The order of the control policies will also be randomized.
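A small C++ sketch of how such a session plan could be generated per participant; the policy names come from above, while the condition labels and shuffling scheme are illustrative.

// Sketch: randomized session plan (3 policies x 3 initial conditions,
// policy order shuffled per participant, conditions shuffled per policy).
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
  std::vector<std::string> policies = {
      "speed control", "direction control", "speed + direction guidance"};
  std::vector<int> initialConditions = {0, 1, 2};  // placeholder labels

  std::random_device rd;
  std::mt19937 rng(rd());
  std::shuffle(policies.begin(), policies.end(), rng);

  for (const auto& p : policies) {
    std::shuffle(initialConditions.begin(), initialConditions.end(), rng);
    for (int c : initialConditions)
      std::cout << p << " | initial condition " << c << "\n";
  }
}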
 
Deleted:
<
<
For now, we could avoid bringing the motor in touch with the user's hand surface and instead generate a vibration to represent either position or direction. This video shows a vibration motor turning around inside the surface of the hemispherical ball. The vibration the motor produced was hard to localize spatially. This could be due to the hollow space and the light styrofoam surface material. To reduce the transfer of vibration from the vibration motor to the servo motor shaft, the vibration motor was attached orthogonally to a loosely coupled plastic surface. Attaching the vibration motor orthogonal to the rotary plane of the servo motor reduced unnecessary vibration significantly.
 
Changed:
<
<
To localize the vibration within a region, the vibration motor was made to touch the inner wall surrounding the joystick, transferring the vibration onto that surface. After removing some more material from the styrofoam on the sides, we were able to perceive a change in vibration as the servo shaft turned around the joystick. This video shows the mounting of the vibration motor and our test of whether the vibration was perceived any better. However, we could only perceive a slight change in the position of the vibration.
>
>

Iteration #3: A technical perspective

 
Changed:
<
<
As the vibration was hard to localize, we removed the vibration motor at the end of the servo motor shaft and replaced it with an eccentric wheel. This eccentric wheel protrudes just enough outside the surface of the joystick to give a sensation of guiding movement. This example video shows how the eccentric wheel produces shear force on the fingers of a user.
>
>
In this iteration, we planned to refine the haptic display using shear force. We also planned to integrate the visual and haptic displays into a single system. Our first version of the assembled visual and haptic display was presented last week during our meeting. We rendered the trajectory displayed on a Processing GUI onto the joystick handle through the Arduino serial interface. The rendered steering guidance was perceivable and hence offered some proof of concept. Since the Robot Operating System (ROS) also uses the serial interface to communicate with the Arduino, communicating between Processing, Arduino and ROS was not possible. As suggested during last week's meeting, we spent some time exploring shared memory to interface between these three nodes, but were not successful at achieving that within the time frame. We therefore focused on identifying an experimental setup that would be achievable, which led us to the setup described above. We are currently working on integrating the joystick handle with the PWC via ROS. We expect to have the experimental setup ready before the presentation.
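For the ROS integration we are working towards, a minimal rosserial-style Arduino sketch could look like the following; the topic name guidance_heading and the Float32 message type are our assumptions for illustration. Because rosserial gives ROS ownership of the single serial link, this also explains why Processing could not share the same port.

// Sketch: receive a guidance heading from ROS over rosserial and
// render it on the joystick handle. Topic name and message type are
// illustrative assumptions.
#include <ros.h>
#include <std_msgs/Float32.h>

ros::NodeHandle nh;

void headingCb(const std_msgs::Float32& msg) {
  // msg.data: guidance heading in degrees; here it would be passed to
  // the LED ring and shear force display (rendering code omitted).
}

ros::Subscriber<std_msgs::Float32> sub("guidance_heading", &headingCb);

void setup() {
  nh.initNode();
  nh.subscribe(sub);
}

void loop() {
  nh.spinOnce();   // process incoming messages from the ROS side
  delay(10);
}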
 
Deleted:
<
<

Egocentric Input

As input, we also calibrated the circular potentiometer against the LED display. This feature can be used to control the pan-tilt unit on which the wheelchair camera is mounted. In this video, we show how the center of the distribution on the LED display moves towards the point where the circular potentiometer is pressed.

Figure [3]: Touch sensors (circular and point pressure type touch sensors) used in this project
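As a sketch of the calibration just described, assuming the ring potentiometer is read on analog pin A0 over the Arduino's 0-1023 ADC range; the pin and range are illustrative.

// Sketch: map a circular (ring) potentiometer reading to an LED index
// so the display follows the touch point around the ring.
const int POT_PIN = A0;
const int NUM_LEDS = 24;

int touchToLed(int raw) {
  // Arduino ADC gives 0..1023; spread it around the 24-pixel ring.
  return map(raw, 0, 1023, 0, NUM_LEDS - 1);
}

void setup() { Serial.begin(9600); }

void loop() {
  int led = touchToLed(analogRead(POT_PIN));
  Serial.println(led);   // would drive setPixelColor(led, ...) on the ring
  delay(50);
}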

Reflections

In this iteration, we focused on generating basic visual and tactile patterns showing target position and target heading. We used an egocentric approach to represent the environment around the wheelchair. We demonstrated some of the patterns using a servo motor, a vibration motor and a circular LED display. We achieved position- and direction-guidance behaviour using the LED display. The vibration motor was challenging to control, and its vibration was hard to localize; still, our simple experiment showed some promise for displaying direction guidance using vibration feedback. We experimented with some simple shear force mechanisms to observe whether we can realize position and direction guidance. Our eccentric-wheel-based shear force display appears able to render both position and direction. We will explore this domain in our next iteration.

The input section explored in this iteration is merely an illustration of a working sensory system. We intend to use this sensory input to control the pan-tilt unit of the wheelchair camera so that the user can obtain the vision-based system's assistance. So far, we have only used the circular touch sensor. This sensor could be used to turn the camera around by mapping the sensor position to the camera position (similar to mapping the LED display to the sensor position). We intend to incorporate tilt control as well, using a combination of the circular and point touch sensors with gestures such as swiping up and down to tilt the camera up and down.

Next Iteration

In the next iteration, we plan to refine the haptic displays using shear force. We will integrate the visual and haptic displays into a single system. If the schedule aligns with my user interview, I will conduct a final interview with the user to get feedback on the designed interface. We also plan to integrate pan-tilt control using the touch interface.
  Reference:
Line: 141 to 107
 [16] I. M. Mitchell, P. Viswanathan, B. Adhikari, E. Rothfels and A. K. Mackworth, "Shared control policies for safe wheelchair navigation of elderly adults with cognitive and mobility impairments: designing a Wizard of Oz study", in American Control Conference, 2014.
Added:
>
>
[17] D. R. Dawson, R. Chan, and E. Kaiserman, "Development of the power-mobility indoor driving assessment for residents of long term care facilities", Canadian Journal of Occupational Therapy, vol. 61, no. 5, pp. 269-276, 1994.
  -- BikramAdhikari - 24 Mar 2014
 