paper
Comments: Manoj and eventually others...
This paper presents the results of using a simple machine learning algorithm to see whether hand postures can help identify tasks. Specifically, the 1-nearest-neighbor algorithm was applied to a set of 12 office-related gestures.
Gestures were collected from 8 users with a CyberGlove. Each user performed each of the 12 tasks 5 times. The sensor readings were captured at 10 frames per second and averaged together across the whole gesture. The algorithm was trained both per user and across users. Per-user training yielded much higher accuracy across all gestures (94%). Only a few of the gestures were confused, and in those cases a simple examination shows that the gestures are very similar, such as holding a mug and stapling a paper.
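For concreteness, here is a minimal sketch of how I understand the pipeline: collapse each recording's sensor frames into their mean and classify that vector with 1-nearest-neighbor. The sensor count of 22, the random placeholder data, and the use of scikit-learn are my assumptions, not details taken from the paper.

```python
# Minimal sketch: average each glove recording into one feature vector,
# then classify with 1-nearest-neighbor (assumed setup, not the authors' code).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def average_gesture(frames):
    """Collapse a (num_frames, num_sensors) recording into one mean vector."""
    return np.asarray(frames, dtype=float).mean(axis=0)

# Hypothetical data: 12 tasks x 5 repetitions, 22 glove sensors, ~3 s at 10 fps.
rng = np.random.default_rng(0)
recordings = [rng.normal(size=(30, 22)) for _ in range(60)]
labels = np.repeat(np.arange(12), 5)

X = np.stack([average_gesture(r) for r in recordings])
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, labels)
print(knn.predict(X[:3]))  # predicted task labels for the first three samples
```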
------------
This work is similar to what we did with the RPS-15 data in this class. One of the class members reported 95% accuracy across all users with the 1-nearest-neighbor algorithm. Our RPS gestures were more rigidly defined, each specified by a picture of the gesture. In contrast, the users in this paper were simply told to perform an activity, such as dialing a phone, which can be done in many different ways.
I thought of some additions to this work that could help clear up some of the gesture confusions. Most simply, including more sensors up the arm could help disambiguate certain gestures. For example, the elbow and shoulder will probably be positioned differently when drinking from a mug than when stapling a paper, even though the hand posture is very similar (this is in the spirit of the paper's proposal to attach a 3D position sensor to the hand). That approach would be easier than adding a fourth dimension to the data, namely time. Still, analyzing gestures over time might be a very valuable addition, even if the analysis would be more complex; a rough sketch of one low-effort way to keep the time dimension is given below.
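As an illustration of the time idea, one simple option is to resample each recording to a fixed number of frames and concatenate them, so the classifier sees the ordering of frames rather than just their mean. The target length of 10 frames and the linear interpolation are my own choices, not anything proposed in the paper.

```python
# Sketch: keep temporal structure by resampling each recording to a fixed
# number of frames and flattening, instead of averaging the frames away.
import numpy as np

def resample_gesture(frames, target_len=10):
    """Linearly interpolate a (num_frames, num_sensors) recording to target_len frames."""
    frames = np.asarray(frames, dtype=float)
    old_t = np.linspace(0.0, 1.0, len(frames))
    new_t = np.linspace(0.0, 1.0, target_len)
    resampled = np.column_stack(
        [np.interp(new_t, old_t, frames[:, s]) for s in range(frames.shape[1])]
    )
    return resampled.ravel()  # shape: (target_len * num_sensors,)

# Usage with the earlier placeholder recordings:
# X_time = np.stack([resample_gesture(r) for r in recordings])
```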
I am also interested in seeing results for these same gestures with other, more sophisticated learning algorithms.
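Purely as a sketch of how such a comparison might be run, the snippet below cross-validates a few off-the-shelf classifiers; the candidate models and the placeholder data are my own picks, not anything evaluated in the paper.

```python
# Sketch: compare 1-NN against a couple of other standard classifiers
# on placeholder data shaped like the averaged glove features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 22))         # placeholder for averaged glove features
labels = np.repeat(np.arange(12), 5)  # 12 tasks x 5 repetitions

candidates = {
    "1-NN": KNeighborsClassifier(n_neighbors=1),
    "SVM (RBF)": SVC(),
    "Random forest": RandomForestClassifier(n_estimators=100),
}
for name, model in candidates.items():
    scores = cross_val_score(model, X, labels, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```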
2 comments of glory:
Interesting set of additional sensors for recognizing activities. But the whole system is invasive; it would be better to find alternative, non-invasive methods.
I think this would be more helpful if they could recognize gestures without the glove. Asking employees to wear a glove raises suspicion.