Reading #5: Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes (2007)


by Jacob Wobbrock, Andrew Wilson, and Yang Li (paper)

Comments: George

This paper describes the $1 gesture recognizer, a sketch/gesture recognition algorithm intended to be simple and easy to program so that it can be implemented almost anywhere. The hope is that this would allow gestures to be incorporated into rapidly prototyped interfaces that otherwise might not have been able to use gesture input, since most user interface designers and programmers don't have the knowledge or skills needed to implement complex recognition algorithms, and existing recognition toolkits are not available in every language or environment, especially the ones human-computer interaction researchers often prototype in.

The authors describe the algorithm in four parts: point resampling, rotation based on the indicative angle, scaling and translation, and finding the optimal angle for the best score. These transformations, applied to each input stroke, make it easy to match the stroke against a few stored template strokes per gesture. The recognition result is the template gesture with the smallest average Euclidean distance to the input stroke.
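To make the steps concrete, here is a minimal Python sketch of the preprocessing pipeline. The constants (N=64 resampled points, a 250-unit reference square) and helper names are illustrative choices, not taken verbatim from the paper's pseudocode.

```python
import math

N = 64             # number of resampled points (assumed value)
SQUARE_SIZE = 250  # side of the reference square used for scaling (assumed value)

def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N):
    """Step 1: resample the stroke into n evenly spaced points along its path."""
    interval = path_length(points) / (n - 1)
    dist_so_far = 0.0
    pts = list(points)
    new_points = [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if dist_so_far + d >= interval:
            t = (interval - dist_so_far) / d
            qx = pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0])
            qy = pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1])
            new_points.append((qx, qy))
            pts.insert(i, (qx, qy))   # the interpolated point starts the next segment
            dist_so_far = 0.0
        else:
            dist_so_far += d
        i += 1
    while len(new_points) < n:        # rounding error can leave us one point short
        new_points.append(points[-1])
    return new_points

def centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def rotate_by(points, angle):
    """Rotate all points around the stroke's centroid by the given angle (radians)."""
    cx, cy = centroid(points)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return [((x - cx) * cos_a - (y - cy) * sin_a + cx,
             (x - cx) * sin_a + (y - cy) * cos_a + cy) for x, y in points]

def rotate_to_zero(points):
    """Step 2: rotate so the indicative angle (centroid -> first point) is zero."""
    cx, cy = centroid(points)
    angle = math.atan2(points[0][1] - cy, points[0][0] - cx)
    return rotate_by(points, -angle)

def scale_and_translate(points, size=SQUARE_SIZE):
    """Step 3: scale non-uniformly to a reference square, then center on the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    scaled = [(x * size / w, y * size / h) for x, y in points]
    cx, cy = centroid(scaled)
    return [(x - cx, y - cy) for x, y in scaled]
```

Each stored template goes through the same preprocessing once, so at recognition time only the comparison step remains.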

The $1 algorithm is compared to the DTW and Rubine recognizers and is found to compete well against them, achieving high recognition rates at high speed. Pseudocode for the $1 algorithm is given as well to aid programmers.
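Continuing the sketch above (and reusing resample, rotate_to_zero, scale_and_translate, rotate_by, and SQUARE_SIZE from it), the matching step searches over candidate rotations with a golden-section search and picks the closest template. The angle range, precision, and scoring constant below are illustrative assumptions.

```python
import math

PHI = 0.5 * (-1 + math.sqrt(5))                       # golden ratio used by the search
ANGLE_RANGE = math.radians(45)                        # assumed search range
ANGLE_PRECISION = math.radians(2)                     # assumed stopping threshold
HALF_DIAGONAL = 0.5 * math.sqrt(2 * SQUARE_SIZE ** 2)

def path_distance(a, b):
    """Step 4a: mean Euclidean distance between corresponding points of two strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def distance_at_angle(points, template, angle):
    return path_distance(rotate_by(points, angle), template)

def distance_at_best_angle(points, template):
    """Step 4b: golden-section search for the rotation that minimizes path distance."""
    a, b = -ANGLE_RANGE, ANGLE_RANGE
    x1 = PHI * a + (1 - PHI) * b
    x2 = (1 - PHI) * a + PHI * b
    f1 = distance_at_angle(points, template, x1)
    f2 = distance_at_angle(points, template, x2)
    while abs(b - a) > ANGLE_PRECISION:
        if f1 < f2:
            b, x2, f2 = x2, x1, f1
            x1 = PHI * a + (1 - PHI) * b
            f1 = distance_at_angle(points, template, x1)
        else:
            a, x1, f1 = x1, x2, f2
            x2 = (1 - PHI) * a + PHI * b
            f2 = distance_at_angle(points, template, x2)
    return min(f1, f2)

def preprocess(points):
    return scale_and_translate(rotate_to_zero(resample(points)))

def recognize(points, templates):
    """templates: dict mapping gesture name -> already-preprocessed point list."""
    candidate = preprocess(points)
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        d = distance_at_best_angle(candidate, template)
        if d < best_dist:
            best_name, best_dist = name, d
    score = 1 - best_dist / HALF_DIAGONAL             # 1.0 would be a perfect match
    return best_name, score
```

A usage pattern would be to call preprocess() on each recorded example gesture, store the results in the templates dict under the gesture's name, and then call recognize() on each new input stroke.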

__________

This paper is very clearly written, and the $1 algorithm is indeed very simple. I find it interesting that such a simple, almost naive, approach can perform very well when executed intelligently. It is easy to imagine improvements and additional recognition capabilities for this algorithm, such as support for rotation-dependent or time-dependent gestures.
