
Hands-Free Your Mind

A few months ago, an item floated down the news stream about Japanese researchers creating an iPod controller operated by one’s teeth. Since then, news of other non-traditional inputs has followed.

Toshiba is experimenting with a gesture recognition engine that might allow a hand to be used as a DVD remote. Internationally renowned physicist Stephen Hawking controls his computer through blinking. Bernd Brügge collaborated on a paper describing Pinocchio, a virtual-conductor technology that lets a person control an orchestral recording through gestures. Just yesterday, I read about brain-computer interfaces developed by researchers at Keio University. Thoughts about arm movements are detected as brain waves, and a headpiece translates those electrical signals into commands Second Life can understand, moving a virtual avatar.

Brain waves!
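To make the idea concrete, here is a minimal sketch of what that kind of pipeline amounts to: classify a window of EEG readings, then map the result to a command a virtual world could act on. The classifier, thresholds, and command names below are my own placeholders, not the Keio team’s actual system.

```python
# Hypothetical brain-computer interface pipeline:
# classify an EEG sample, then map the result to an avatar command.
# Everything here is a stand-in for illustration only.

from enum import Enum


class Intent(Enum):
    WALK_FORWARD = "walk_forward"
    TURN_LEFT = "turn_left"
    REST = "rest"


def classify_eeg(sample: list[float]) -> Intent:
    """Stand-in for a real EEG classifier (e.g., one trained on motor imagery)."""
    # A real system would do signal processing and machine learning here;
    # this just thresholds the average signal energy.
    energy = sum(abs(x) for x in sample) / len(sample)
    if energy > 0.8:
        return Intent.WALK_FORWARD
    if energy > 0.5:
        return Intent.TURN_LEFT
    return Intent.REST


def to_avatar_command(intent: Intent) -> str:
    """Map a classified intent to a command string the virtual world accepts."""
    commands = {
        Intent.WALK_FORWARD: "avatar.move(forward=1)",
        Intent.TURN_LEFT: "avatar.turn(-15)",
        Intent.REST: "avatar.stop()",
    }
    return commands[intent]


if __name__ == "__main__":
    fake_eeg_window = [0.9, 0.7, 1.1, 0.85]  # invented data
    print(to_avatar_command(classify_eeg(fake_eeg_window)))
```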

Jonathan Korman of Cooper recently posted an interesting commentary on gesture and how it is both intuitive and pleasurable. Something is intuitive when it is “easy to explain, powerful in its implications, impossible to forget.” While this can be achieved in all sorts of interfaces, both digital and physical, there is something special about being able to use gestures to control our tools:

I do the gestures from muscle memory, rather than cognitive memory, just like I do with my typing on my computer keyboard. Most of the time tools that run on software tax our cognitive capacity but leave the intelligence that lives in our bodies relatively untapped, which makes us East African Plains Apes a little uncomfortable; using those gestures makes me a happier animal.

Korman uses the iPhone and his old trackpad devices as examples of how gesture input can be perceived as better than using a mouse.

At the IU School of Informatics, our training in human-computer interaction design doesn’t often go near input devices. We focus mostly on what happens after the input is received (and on the things that prompt the input in the first place). However, some people here are interested in pervasive computing, which deals not only with non-desktop computing systems (PDAs, mobile phones, etc.) but also with the components that bridge the digital and physical worlds.

Design materials are becoming more accessible. Phidgets are “physical widgets” that can detect temperature, position, orientation, pressure, and any number of other signs that something in the environment has changed. Nintendo’s Wii remote (a “wiimote”), a versatile game controller that uses an optical sensor and an accelerometer, has inspired hacker communities to turn the device into things like email clients. While these kinds of non-traditional inputs (and hey, the computer mouse was non-traditional when it first showed up in the 1960s and 70s) make for flashy headlines (Brain Waves!), the real value is in the design philosophy behind why they might be adoptable.
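That philosophy is easier to see in a sketch. Below is a toy version of what a wiimote hack boils down to: read an accelerometer stream, detect a gesture, and map it to an action. The sample data, thresholds, and “open the email client” action are invented for illustration; real projects lean on libraries that speak the Wii remote’s Bluetooth protocol.

```python
# Toy gesture-to-action mapping in the spirit of wiimote hacks.
# Readings, thresholds, and the resulting action are all made up.

from dataclasses import dataclass


@dataclass
class AccelSample:
    x: float
    y: float
    z: float  # in g's; roughly 1.0 on one axis when the remote is at rest


def detect_flick(samples: list[AccelSample], threshold: float = 2.0) -> bool:
    """Crude gesture detector: any reading over the threshold counts as a flick."""
    return any(
        abs(s.x) > threshold or abs(s.y) > threshold or abs(s.z) > threshold
        for s in samples
    )


def on_gesture() -> None:
    # Whatever action the hack maps the gesture to; a stand-in here.
    print("Gesture detected: opening the email client...")


if __name__ == "__main__":
    stream = [AccelSample(0.1, 0.0, 1.0), AccelSample(2.7, 0.4, 1.1)]  # fake data
    if detect_flick(stream):
        on_gesture()
```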

Focusing on gestures is similar to examining the normal routines of the user and placing the technology at natural moments of interaction. If someone is already making a motion without the technology, then fitting the interface to that existing motion lowers the barrier to use. The flip side, of course, is that there is some value in conscious action. Brain-wave controllers are way cool because of the notion that thinking = doing … which is also a reason to fear such devices.