An Architecture for Gesture-Based Control of Mobile Robots

Soshi Iba, J. Michael Vande Weghe, Christiaan J. J. Paredis, and Pradeep K. Khosla
Carnegie Mellon University



This paper presents a system for controlling mobile robots using hand gestures.

Previous work has explored various ways of controlling robots, but most of it has relied on a keyboard and mouse. The authors argue that this is inappropriate for novice or unfamiliar users, who need a more intuitive interface.

The goal of this project is to work toward an intuitive, multi-modal system for controlling mobile robots. It introduces hand gestures as the control mechanism: the user waves in the direction the robots should move, or points at the location they should drive to.

The system uses a CyberGlove to capture hand posture, a Polhemus 6DOF sensor to track the hand's position and orientation, and a GPS unit on the robot itself.

Six gestures were used to control the robot:

OPENING: Moving from a closed fist to a flat open hand
OPENED: Flat open hand
CLOSING: Moving from a flat open hand to a closed fist
POINTING: Moving from a flat open hand to index finger pointing, or from a closed fist to index finger pointing
WAVING LEFT: Fingers extended, waving to the left, as if directing someone to the left
WAVING RIGHT: Fingers extended, waving to the right

They also incorporated a "wait state," which is simply any hand configuration other than the gestures above.
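
To make the vocabulary concrete, here is a minimal sketch of how the gesture set plus the wait state might be represented (my own illustration in Python; the paper does not give an implementation):

```python
from enum import Enum, auto

class Gesture(Enum):
    """The six control gestures, plus the catch-all wait state."""
    OPENING = auto()       # closed fist -> flat open hand
    OPENED = auto()        # flat open hand, held
    CLOSING = auto()       # flat open hand -> closed fist
    POINTING = auto()      # transition to index-finger pointing
    WAVING_LEFT = auto()   # fingers extended, waving left
    WAVING_RIGHT = auto()  # fingers extended, waving right
    WAIT = auto()          # any other hand configuration: no command
```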

Robot control can occur in two modes: local and global. In local mode, gestures are interpreted from the robot's point of view; in global mode, they are interpreted in world coordinates, from the user's point of view. Local mode exists to support remote operation, where the user watches a video feed from the robot; global mode is used when the robot is within the user's sight.
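
As an illustration of what global mode implies computationally (my own sketch, not code from the paper): a direction measured in world coordinates, e.g. by the Polhemus sensor, has to be rotated into the robot's frame, using the robot's heading, before it can become a steering command.

```python
import math

def world_to_robot_frame(dx_world: float, dy_world: float,
                         robot_heading: float) -> tuple[float, float]:
    """Rotate a direction vector from world coordinates into the robot's
    local frame, given the robot's heading in radians (measured from the
    world x-axis)."""
    c, s = math.cos(-robot_heading), math.sin(-robot_heading)
    return (c * dx_world - s * dy_world,
            s * dx_world + c * dy_world)

# Example: the robot is heading "north" (pi/2 rad) and the user waves
# north in world coordinates; in the robot's own frame that direction
# is straight ahead.
print(world_to_robot_frame(0.0, 1.0, math.pi / 2))  # ~ (1.0, 0.0)
```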

The gestures work like this for Local Control (see the sketch after the Global Control list):

CLOSING: decelerates and eventually stops the robot
OPENING, OPENED: maintains the current state of the robot
POINTING: accelerates the robot
WAVING LEFT/RIGHT: increases the rotational velocity to turn left/right

The gestures work like this for Global Control:

CLOSING: decelerates and eventually stops the robot (cancels the destination if one exists)
OPENING, OPENED: maintains the current state of the robot
POINTING: "go there"
WAVING LEFT/RIGHT: directs the robot towards the direction in which the hand is waving.
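
Putting the two modes side by side, here is a minimal sketch of the command mapping, reusing the Gesture enum from above (my reconstruction; the paper's controller parameters and interfaces are not given, so the step sizes are assumed values):

```python
from dataclasses import dataclass

ACCEL_STEP = 0.1  # m/s change per recognized gesture (assumed value)
TURN_STEP = 0.2   # rad/s change per recognized wave (assumed value)

@dataclass
class RobotCommand:
    linear: float = 0.0         # forward velocity, m/s
    angular: float = 0.0        # rotational velocity, rad/s
    goal: tuple | None = None   # world-frame destination (global mode only)

def apply_gesture_local(cmd: RobotCommand, g: Gesture) -> RobotCommand:
    """Local mode: gestures nudge the robot's own velocities."""
    if g == Gesture.CLOSING:
        cmd.linear = max(0.0, cmd.linear - ACCEL_STEP)  # decelerate toward a stop
    elif g == Gesture.POINTING:
        cmd.linear += ACCEL_STEP                        # accelerate
    elif g == Gesture.WAVING_LEFT:
        cmd.angular += TURN_STEP                        # rotate left
    elif g == Gesture.WAVING_RIGHT:
        cmd.angular -= TURN_STEP                        # rotate right
    # OPENING / OPENED / WAIT: maintain the current state
    return cmd

def apply_gesture_global(cmd: RobotCommand, g: Gesture,
                         pointed_at: tuple | None = None) -> RobotCommand:
    """Global mode: POINTING sets a world-frame goal, CLOSING cancels it."""
    if g == Gesture.CLOSING:
        cmd.linear = max(0.0, cmd.linear - ACCEL_STEP)
        cmd.goal = None                # cancel the destination if one exists
    elif g == Gesture.POINTING and pointed_at is not None:
        cmd.goal = pointed_at          # "go there"
    # WAVING_LEFT/RIGHT would steer toward the waved world direction,
    # after transforming it into the robot's frame as sketched earlier
    return cmd
```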

A Hidden Markov Model (HMM) algorithm was used to detect and recognize the gestures, with an accuracy of 96%. Including the wait state helps recognition significantly compared to systems without one.
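
The paper does not spell out the implementation, but the standard recipe for this kind of recognizer is one HMM per gesture, trained on sequences of glove features, with classification by maximum likelihood. A sketch using the hmmlearn library (my assumption, not the authors' code) might look like this, with a per-frame log-likelihood threshold standing in for the wait state:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(training_data: dict[str, list[np.ndarray]]) -> dict:
    """Train one HMM per gesture. training_data maps a gesture name to a
    list of observation sequences, each of shape (T, n_features), e.g.
    CyberGlove joint angles sampled over time."""
    models = {}
    for name, seqs in training_data.items():
        X = np.concatenate(seqs)
        lengths = [len(s) for s in seqs]
        model = GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[name] = model
    return models

def classify(models: dict, seq: np.ndarray,
             wait_threshold: float = -50.0) -> str:
    """Score a new sequence under every gesture model and pick the best;
    if even the best per-frame log-likelihood is poor, report the wait
    state instead of forcing a gesture label. The threshold is an
    assumed value that would need tuning."""
    scores = {name: m.score(seq) for name, m in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] / len(seq) > wait_threshold else "WAIT"
```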

----------

I like that these researchers are trying to build a more intuitive interface for controlling robots so that novice users can use the system; I always appreciate this approach where it is appropriate. It is also interesting to see a use of the data glove that I had not thought of before.

I wonder how feedback is given from the robot/system to the user. It would be very important to know exactly how your actions are affecting the robot so that you don't over-steer or over-accelerate it, for example. This problem would be magnified if there were any perceptible delay between a gesture and the robot's response, or if the user were operating in local mode, which would certainly introduce some lag over the video link.

I would like to see a usability study, obviously, to sort out issues like the one I have described, especially if the research is aimed at the general public.

I am also interested in the high-level multi-robot control to come...

3 comments of glory:

manoj said...

Feedback for over-steering/over-acceleration would be difficult, but it is an interesting problem.

Franck Norman said...

I wonder if the gestures they chose are the most intuitive. More studies would have been helpful.

M Russell said...

They mentioned they were going to extend this to multi-robot systems. I am interested to know how they will identify which robot is being controlled.
