This work uses eye trackers to provide a gaze interface that navigates a 3D-like space by panning and zooming. The system is tolerant of noise, and unlike other gaze systems it does not rely on dwell time to make selections.
An application called StarGazer provides the 3D interface used in the gaze tests. It presents a circular keyboard that the user pans and zooms through to select letters, typing with the eyes alone.
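The dwell-free idea can be sketched as follows. This is not the authors' actual implementation, just a minimal illustration of the principle: the view continuously pans toward the gaze point and zooms in, and a letter is selected once the zoom passes a threshold, so no dwell timer is ever consulted. All names, rates, and thresholds here are made up for the sketch.

```python
import math

def step(view, gaze, pan_rate=0.3, zoom_rate=1.05):
    """Pan the view center a fraction of the way toward the gaze
    point and increase the zoom multiplicatively each frame."""
    cx, cy, zoom = view
    gx, gy = gaze
    cx += pan_rate * (gx - cx)
    cy += pan_rate * (gy - cy)
    return (cx, cy, zoom * zoom_rate)

def select(view, targets, zoom_threshold=4.0, radius=0.5):
    """Once zoomed in far enough, return the letter (if any) lying
    under the view center; before that, return None."""
    cx, cy, zoom = view
    if zoom < zoom_threshold:
        return None
    for label, (tx, ty) in targets.items():
        if math.hypot(tx - cx, ty - cy) < radius:
            return label
    return None

# Three letters on a plane; the user looks steadily at "B".
targets = {"A": (0.0, 0.0), "B": (5.0, 0.0), "C": (0.0, 5.0)}
view = (2.5, 2.5, 1.0)   # start between targets, no zoom
gaze = (5.0, 0.0)

picked = None
while picked is None:
    view = step(view, gaze)
    picked = select(view, targets)
print(picked)  # prints "B": selection happens by zooming, not dwelling
```

Noise tolerance falls out of this design naturally: jittery gaze samples only slow the pan's convergence (hurting efficiency), while the selection itself still requires deliberately zooming onto a single letter, which matches the paper's finding that noise reduced speed but not accuracy.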
The system was found to work better than those based on dwell time. Users typed at a relatively high rate of 8.16 words per minute with a low error rate of 1.23%. Notably, users remained in control even when some noise was introduced to the system: only efficiency decreased, not accuracy or the number of errors produced. Users also needed only about 5 minutes to learn how to use the system.
------------
I think this work is interesting because it helps pave the way toward eliminating the mouse and keyboard, which I feel are barriers to natural human-computer interaction. The system still has a way to go before it can truly replace the keyboard in typing speed, though it does appear to be faster than the mouse. I would like to see it applied to areas other than typing. I think the quick learning curve and the use of cheaper, off-the-shelf gear will also help this system gain popularity.
3 comments of glory:
This is an interesting point. When this technique is used for interfaces other than keyboards, it will be interesting to see whether dwell-time activation can be used for click events, and whether that causes an increase in selection time.
I also feel that this approach would be better suited to applications other than typing. However, I would have to be persuaded that, in general, a visually guided system could improve on the efficiency of a keyboard where it is feasible to use one.