Lab Day!

So today we went into the lab to see some of the devices we have available for this class.


3D Display Glasses

These glasses have 2 cameras in front and 2 displays, one for each eye. A control box is attached that has video outputs and inputs so a computer can process and alter/augment the user's vision.

Overall, this device was pretty easy to use. I was able to function normally in an office or lab setting almost instantly, though I might be impaired in other settings.

I can immediately see the benefits of these glasses, and I have a few ideas for applications. The first thing I thought of when I put them on was 3D object manipulation using augmented reality techniques. I'm sure many of us have seen the demos where paper printed with fiducial markers, viewed through a webcam, shows a 3D model sitting on the page. These glasses could make that kind of thing feel more real, since the 3D model would appear to be in the user's hand instead of on a computer monitor.
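To make that idea concrete, here is a minimal sketch of the last step: once a marker's pose relative to the camera has been estimated, the corners of a virtual cube can be projected into the user's view. The pinhole camera parameters and the pose values below are made up for illustration, not taken from any real device.

```python
# Minimal sketch: projecting a virtual cube's corners into the camera image
# given a marker's estimated pose. Assumes a simple pinhole camera model;
# the focal length, image center, and pose values are invented.

def project_point(point, focal_length=800.0, cx=320.0, cy=240.0):
    """Project a 3D camera-space point (x, y, z) to pixel coordinates."""
    x, y, z = point
    return (focal_length * x / z + cx, focal_length * y / z + cy)

def transform(point, rotation, translation):
    """Apply a 3x3 rotation (row-major) and a translation to a 3D point."""
    x, y, z = point
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z + translation[i]
        for i in range(3)
    )

# Hypothetical pose: identity rotation, marker 0.5 m in front of the camera.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.5]

# Corners of a 4 cm virtual cube sitting on the marker.
cube = [(sx * 0.02, sy * 0.02, sz * 0.02)
        for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)]

pixels = [project_point(transform(corner, R, t)) for corner in cube]
```

A real system would get `R` and `t` from marker detection each frame and redraw the model at the projected corners, which is what would make it track the user's hand.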

On a more whimsical note, I think these glasses could be coupled with facial recognition software and Facebook to let the user identify people they know on Facebook. Information about a person could be displayed around their head or body to help the user remember things about them, such as their name or a favorite food or color. This could help people get to know each other.

I did notice three possible problems. First, depth perception is altered while wearing the device: everything seems farther away. While I was fine and adjusted to this quickly, other people may not be able to adapt or might become impaired by it. The long-term effects of this altered depth perception might be detrimental as well, but that would require more testing. Second, the resolution of the device limits the details the user can see; the biggest problem I noticed was the difficulty of reading small or faraway text. Third, peripheral vision is blocked by the glasses. This, coupled with the altered depth perception, gives the user a kind of tunnel vision that makes it difficult to walk around. The worst part is the difficulty of seeing your feet while wearing the glasses, so it is easy to trip over things. I also had problems going around corners, as my shoulder hit the corner a few times.

Josh had us use the 3D glasses and fill out a questionnaire. We went through some actions to get a feel for the device and help Josh understand the device's capabilities.



Eye tracking glasses

These glasses have a camera attached that feeds data to a software application that recognizes the human eye and can determine where the user is looking.

My impression of the device was mixed. I was impressed that the eye recognition software worked at all. However, the accuracy was not great, and I had trouble moving the cursor with my eyes. I also had some difficulty positioning the camera so my eye was completely in the viewing window, but I don't know whether that actually matters. The device also requires that the user's head remain still, which is nearly impossible in practice. This device would be awesome if we could calibrate it better and were able to move our heads.
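A rough sketch of what better calibration could look like: fit a simple per-axis linear model mapping pupil position to screen position from a few known calibration targets. The pupil and screen coordinates below are invented sample values, and a real tracker would need a richer model to tolerate head movement.

```python
# Sketch of per-axis gaze calibration: fit screen = a * pupil + b by
# ordinary least squares over a few calibration points. Data is invented.

def fit_linear(xs, ys):
    """Least-squares fit of ys = a * xs + b; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

# Hypothetical calibration data: pupil x-positions recorded while the user
# fixated targets at known screen x-positions.
pupil_x = [110, 140, 170, 200]
screen_x = [0, 426, 853, 1280]

a, b = fit_linear(pupil_x, screen_x)
estimate = a * 155 + b  # estimated gaze position for a new pupil reading
```

The same fit would be repeated for the y axis, and recalibrating after any head movement is exactly the nuisance a head-pose-aware model would remove.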

In its current state, I can see this device augmenting other input devices, such as a mouse or glove. I don't think it can currently be used as a standalone input device. For this class, I think we could implement some sort of eye gesture system similar to the EOG glasses we read about. The glasses seem to work fine for coarse input like that, though I had some difficulty with the up-left gesture.
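A coarse gesture system like that could start out very simple: classify the net gaze displacement between two samples into one of eight directions. The threshold value and axis conventions below are my own assumptions, not anything from the tracker's software.

```python
# Rough sketch of coarse eye-gesture classification from a gaze displacement.
# The 50-pixel threshold and axis conventions are assumptions.

def classify_gesture(dx, dy, threshold=50):
    """Map a gaze displacement (in pixels) to a coarse gesture label.

    dx > 0 is rightward, dy > 0 is upward; displacement under the
    threshold on an axis is ignored on that axis.
    """
    horizontal = "right" if dx > threshold else "left" if dx < -threshold else ""
    vertical = "up" if dy > threshold else "down" if dy < -threshold else ""
    if vertical and horizontal:
        return vertical + "-" + horizontal
    return vertical or horizontal or "none"
```

A diagonal like up-left only registers when both axes clear the threshold, which may be why that gesture felt harder to hit than the cardinal directions.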



CyberTouch gloves

I have been working with the gloves since last semester. My initial thoughts on the glove were very positive once I got one connected to a computer. The 3D hand in the configuration utility really illustrates what this glove can do, and the vibrotactile feedback adds a new layer of functionality and opens up a new realm of possibilities.

I have thought of a few ways this glove could control the mouse. First, it could be combined with a 3D location sensor, such as the Flock of Birds, to create a system for controlling the cursor on a large display (similar to Vogel et al.'s work). The vibrotactile actuators could add the feedback that Vogel's system was missing. We could also just use the angles of one or more fingers to control the mouse. Finally, we could combine this glove with Josh's 3D glasses and the Flock of Birds to create a 3D augmented reality interaction application, in which 3D objects can be grabbed and manipulated by the user.
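The finger-angle idea could be sketched as a mapping from bend angle to cursor speed. The dead zone, gain, and angle range below are invented for illustration; the real glove reports joint angles that would feed into something like this.

```python
# Sketch: driving cursor speed from a finger's bend angle. The dead zone,
# maximum angle, and maximum speed are assumed values, not the glove's.

def angle_to_velocity(angle_deg, dead_zone=10.0, max_angle=90.0, max_speed=20.0):
    """Map a bend angle (degrees) to cursor speed (pixels per frame).

    Angles inside the dead zone produce no motion, so a relaxed hand
    leaves the cursor still; beyond it, speed grows linearly up to
    max_speed and is clamped there.
    """
    if angle_deg <= dead_zone:
        return 0.0
    fraction = min((angle_deg - dead_zone) / (max_angle - dead_zone), 1.0)
    return fraction * max_speed
```

The dead zone matters because a hand at rest never reads exactly zero; without it, imperfect calibration (especially on the thumb) would make the cursor drift.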

I have noticed a few problems with the glove. First, the calibration is not perfect, especially with the thumb. Second, the vibrotactile actuators have 5 discrete levels of vibration intensity rather than a continuously adjustable range, and even the lowest level is still fairly intense. I would like gloves with finer-grained intensity, akin to a cell phone's vibration motor.
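The discrete-level limitation can be illustrated by the quantization it forces on any continuous feedback signal. The five evenly spaced levels here are an assumption about the hardware for the sake of the sketch.

```python
# Sketch of the limitation: a continuous desired intensity collapses onto
# a small number of discrete vibration levels. Evenly spaced levels are
# assumed; the real actuator's steps may differ.

def quantize_intensity(desired, levels=5):
    """Snap a desired intensity in [0, 1] to the nearest of `levels`
    evenly spaced steps (0 = off, levels - 1 = maximum)."""
    desired = max(0.0, min(1.0, desired))
    step = round(desired * (levels - 1))
    return step / (levels - 1)
```

With only five levels, anything subtler than 25% jumps, e.g. a smooth "getting closer" ramp, turns into stair steps the user can feel.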


These are my initial thoughts and ideas for the devices we played with in class today.

