Recent Developments and Applications of Haptic Devices

paper

Comments: Manoj, Franck

This paper surveys the major haptic devices and technologies and compares them, categorizing input devices by degrees of freedom. It also presents some glove-based devices and discusses actuation methods such as vibration and hydraulics, among other things.

EyeDraw: enabling children with severe motor impairments to draw with their eyes

paper

Comments: Franck, Murat

EyeDraw is a system built on the Eyegaze eye tracker that allows drawing with the eyes. Many people with disabilities have used eye-tracking systems for years to operate a computer and to communicate. EyeDraw was developed and tested with four users; based on their feedback, a second version was developed with improvements to make it easier to use. Users gave positive feedback, though drawing was still difficult, and the paper mentions a third version under development.

----------

The EyeDraw system is the most successful application I have seen for drawing with the eyes. It seems to mitigate the Midas touch problem (the tracker treating every glance as a command) reasonably well, and users can actually draw with it.
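The dwell idea behind this is easy to make concrete. Below is a minimal dwell-filter sketch of my own (not EyeDraw's actual code; the fixation radius and dwell time are made-up values): a gaze point only becomes a drawing command after the eye lingers within a small radius for a set duration.

```python
import math

DWELL_S = 0.5      # assumed: dwell time before gaze counts as intent
RADIUS_PX = 30.0   # assumed: fixation radius in pixels

def dwell_events(gaze_samples):
    """gaze_samples: iterable of (timestamp_s, x, y) from an eye tracker.
    Yields one (x, y) per completed dwell."""
    anchor = None   # (t0, x0, y0) of the current candidate fixation
    fired = False
    for t, x, y in gaze_samples:
        if anchor is None or math.dist((x, y), anchor[1:]) > RADIUS_PX:
            anchor, fired = (t, x, y), False   # gaze moved: restart the clock
        elif not fired and t - anchor[0] >= DWELL_S:
            fired = True                       # fire once per fixation
            yield (x, y)
```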

Coming to Grips with the Objects We Grasp: Detecting Interactions with Efficient Wrist-Worn Sensors

paper

This paper aims to recognize object interactions by combining an accelerometer and an RFID reader in a wrist-worn device. The idea is that RFID tags placed on everyday objects identify which object is being handled, and the accelerometer data paired with the object ID can be used to classify the current gesture. A "box test" was performed in which various items were placed into and removed from a box, varying the RFID antenna type, the objects, and the subjects. A longer-term study was also conducted in which the bracelet was worn for an entire day to test real-world applicability.
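To make the pipeline concrete, here is a rough sketch of my own (the feature set and nearest-prototype classifier are my assumptions, not the paper's method): the RFID read tells us which object is in hand, and simple accelerometer features select among the gestures known for that object.

```python
import statistics

def features(window):
    """window: list of (ax, ay, az) accelerometer samples for one interaction."""
    mags = [(ax**2 + ay**2 + az**2) ** 0.5 for ax, ay, az in window]
    return (statistics.mean(mags), statistics.pstdev(mags), max(mags))

def classify(tag_id, window, models):
    """models: {tag_id: {gesture_name: prototype feature tuple}}.
    Nearest-prototype match among the gestures known for this object."""
    f = features(window)
    protos = models[tag_id]
    def dist(name):
        return sum((a - b) ** 2 for a, b in zip(f, protos[name]))
    return min(protos, key=dist)
```

Knowing the object first is what keeps this cheap: the accelerometer only has to discriminate among the few gestures that make sense for that particular object.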

----------

I like the idea of this paper: if RFID tags were present in all our everyday items, a great deal could be learned about how users interact with them on a day-to-day basis. However, I think it would only work well with a high number of tagged items, so if that doesn't happen, I don't know how useful this will be.

User-Defined Gestures for Surface Computing

paper

This paper is a study of user-defined gestures for multitouch surface interaction. Participants proposed gestures for 27 commands, and 1080 gestures were observed in all. The stated purpose is to "help designers create better gesture sets informed by user behavior." The resulting gesture set was entirely user defined, and the gestures the authors themselves had predicted covered only 43.5% of it. The authors also found that "users rarely care about the number of fingers they employ, that one hand is preferred to two, that desktop idioms strongly influence users’ mental models, and that some commands elicit little gestural agreement."

----------

This paper illustrates the necessity of user input when designing a system. Studying users this way often yields surprising results, and the step is crucial when designing a new type of system for which little is known about the input or little of it is formally defined.

Whack Gestures: Inexact and Inattentive Interaction with Mobile Devices

paper

This paper explores using simple striking gestures to interact with a device. For example, if the device is clipped to a belt, a simple smack or wiggle can trigger an action. A custom device was developed and tested for this project. Users found the whack gestures simple and easy to remember, and the authors experimented with multiple whacks and with combinations of whacks and wiggles.
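A minimal sketch of how the detection side could work on an accelerometer stream (my own guess at the signal processing; the thresholds are invented, and the paper's actual detector has to be far more robust against everyday jostling):

```python
import math

WHACK_G = 3.0        # assumed: magnitude spikes above ~3 g count as a whack
REFRACTORY_S = 0.25  # assumed: ignore samples this soon after a whack

def detect_whacks(samples):
    """samples: iterable of (timestamp_s, ax, ay, az) in g units.
    Yields the timestamp of each detected whack."""
    last = -math.inf
    for t, ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > WHACK_G and t - last > REFRACTORY_S:
            last = t
            yield t

# Two whacks within, say, a second could then be grouped into a
# "double whack" command, which is how combinations would build up.
```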

------

This type of interaction could be useful for cell phones, iPods, and other common pocket devices. For example, a ringing cell phone could be silenced with a whack. Some iPods also use a shake, or wiggle, to advance songs; I personally don't use that feature very often.

Device Agnostic 3D Gesture Recognition using Hidden Markov Models

paper

This paper attempts to determine how to effectively use HMMs to classify 3D gestures "regardless of the sensor device being used." The authors use a decomposition technique that works with any combination of sensors.

------

I am just beginning to learn HMMs, and this paper could be helpful as I design the gesture recognition algorithms for the glove.
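As a starting point, here is a minimal HMM gesture classifier using the hmmlearn library: train one GaussianHMM per gesture class, then label a new sequence by maximum log-likelihood. This is the generic recipe, not the paper's device-agnostic decomposition.

```python
import numpy as np
from hmmlearn import hmm

def train_models(training_data, n_states=5):
    """training_data: {gesture_name: list of (T_i, 3) acceleration arrays}."""
    models = {}
    for name, seqs in training_data.items():
        X = np.concatenate(seqs)          # stack all sequences for this class
        lengths = [len(s) for s in seqs]  # hmmlearn needs per-sequence lengths
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=50)
        m.fit(X, lengths)
        models[name] = m
    return models

def classify(models, seq):
    """Return the gesture whose HMM assigns seq the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(seq))
```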

Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes

paper

This is a well-known sketch recognition paper presenting a simple algorithm that can recognize 16 gesture shapes.

---------

This is a good paper for getting started with sketch recognition. The algorithm is easy to implement and recognizes a fair set of shapes, and higher-level systems can be built on top of it to create simple yet rich sketch interfaces.
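Since it is so implementable, here is a condensed Python sketch of the core $1 pipeline (resample, rotate to the indicative angle, scale, translate, then path-distance matching). I leave out the paper's golden-section rotation search, so candidates are compared at the indicative angle only:

```python
import math

N = 64  # resample count used in the paper

def resample(points, n=N):
    """Resample a stroke into n equidistantly spaced points."""
    pts = list(points)
    interval = sum(math.dist(pts[i - 1], pts[i])
                   for i in range(1, len(pts))) / (n - 1)
    new_pts, d, i = [pts[0]], 0.0, 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_pts.append(q)
            pts.insert(i, q)  # q becomes the start of the next segment
            d = 0.0
        else:
            d += seg
        i += 1
    while len(new_pts) < n:   # rounding can leave us one point short
        new_pts.append(pts[-1])
    return new_pts

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def normalize(points, size=250.0):
    """Rotate so the centroid-to-first-point angle is zero, scale to a
    reference square, and translate the centroid to the origin."""
    pts = resample(points)
    c = centroid(pts)
    theta = -math.atan2(pts[0][1] - c[1], pts[0][0] - c[0])
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    pts = [((p[0] - c[0]) * cos_t - (p[1] - c[1]) * sin_t,
            (p[0] - c[0]) * sin_t + (p[1] - c[1]) * cos_t) for p in pts]
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    pts = [(p[0] * size / w, p[1] * size / h) for p in pts]
    c = centroid(pts)
    return [(p[0] - c[0], p[1] - c[1]) for p in pts]

def recognize(stroke, templates):
    """templates: {name: normalized point list}. Returns the best match."""
    cand = normalize(stroke)
    def path_dist(name):
        return sum(math.dist(p, q)
                   for p, q in zip(cand, templates[name])) / len(cand)
    return min(templates, key=path_dist)
```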

The $3 Recognizer: Simple 3D Gesture Recognition on Mobile Devices


This is an extension of the $1 recognizer to 3D. It does not support all of the original $1 shapes, but it adds a few new gestures, such as a tennis serve and a hacksaw motion.
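Building on the $1 sketch above, the 3D extension mostly amounts to carrying a z coordinate through resampling and normalization and scoring with 3D distances; a minimal scoring helper (not the $3 paper's exact angular search and scoring heuristic) is just:

```python
import math

def path_distance_3d(a, b):
    """Mean point-to-point distance between two equal-length 3D traces."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
```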

--------

This is cool, and I would like to see a video or a user study showing how well it works for 3D gesture input.

An Empirical Evaluation of Touch and Tangible Interfaces for Tabletop Displays


This paper compares different methods of interacting with a tabletop display. The authors implemented both a touch-based interface and a tangible, model-based interface on a single surface, compared the two across a variety of computing tasks, and found that touch was better for some tasks while physical models were better for others.

-------

I have personally never played around with any model-based interfaces. I can see the usefulness of such an interface, though I am not sure that having many small physical pieces would be such a good thing.

FreeDrawer: a free-form sketching system on the responsive workbench


The authors use a tracked pen to sketch free-form curves in 3D on the Responsive Workbench, employing curve and surface smoothing to produce smooth drawings.
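For the smoothing step, a generic Laplacian pass over a 3D polyline gives the flavor of what is involved (this is my stand-in, not the paper's actual method, which as I understand it works on spline curves):

```python
def smooth(points, alpha=0.5, iterations=10):
    """points: list of (x, y, z) pen samples; endpoints stay fixed.
    Each pass moves interior points toward their neighbors' midpoint."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        pts = [pts[0]] + [
            [p[k] + alpha * ((a[k] + b[k]) / 2 - p[k]) for k in range(3)]
            for a, p, b in zip(pts, pts[1:], pts[2:])
        ] + [pts[-1]]
    return [tuple(p) for p in pts]
```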

----------

This paper was not that interesting. Drawing in 3D is nice, but I have seen better methods. Also, no user tests were conducted, so we don't know how well it actually works.