Microsoft Research and Carnegie Mellon University publicly unveiled work at the Computer-Human Interaction (CHI) conference in Atlanta Monday that turns the entire body into a touch interface. Called Skinput, the system listens to the sounds made by tapping on parts of the body and pairs those sounds with actions that drive tasks on a computer or cell phone.
To make it work, a user straps on a prototype armband that includes 10 sensors, each smaller than a penny, that listen for taps made on the arm or hand.
"It can actually listen to these effects and impacts traveling up the bones and muscles and ligaments," said Chris Harrison, a Ph.D. student at Carnegie Mellon University who is now working with Microsoft Research to develop the project further. "It listens to what they sound like and it can classify them. If it learns what your middle finger sounds like then it can say, ‘Oh, I just heard the middle finger,’ and then you can bind interactive capabilities onto those."
It takes a minute or two to calibrate the system for each new user. That’s done by choosing one of six tasks that the prototype can now handle — up, down, left, right, enter or cancel — and pairing the choice with a tap on the arm or hand.
"And so it memorizes these mappings and attaches those to interactive things like pause your music player," he said.
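The calibration step Harrison describes can be pictured as a simple lookup table from a classified tap location to one of the six commands. The sketch below is purely illustrative: the tap-location labels, function names, and pairings are assumptions for the sake of the example, not Skinput's actual code.

```python
# Illustrative sketch only: pairs classified tap locations with commands,
# roughly the way Skinput's calibration step maps taps to actions.
# Labels and pairings here are assumed for illustration.

def calibrate(pairs):
    """Memorize user-chosen (tap location, command) pairings."""
    return dict(pairs)

# The six commands the prototype handles, each paired with a tap spot.
mapping = calibrate([
    ("middle_finger", "up"),
    ("index_finger", "down"),
    ("wrist", "left"),
    ("forearm", "right"),
    ("palm", "enter"),
    ("thumb", "cancel"),
])

def handle_tap(label):
    """Dispatch a classified tap to its command (e.g., pause the player)."""
    return mapping.get(label, "unrecognized")

print(handle_tap("palm"))  # enter
```

The real system's hard problem is the classification itself, not this dispatch step; once a tap is labeled, triggering the paired action is straightforward.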
The demonstration at the CHI conference used a computer monitor to show the interface, but in a previous demonstration off-site, Harrison had a pico, or mini, projector mounted on his arm that projected the display onto his hand or arm.
One of the examples Harrison showed was a game of Tetris where he controlled the blocks by tapping on his fingers and arm. In another demonstration he navigated a simple music player, playing, pausing and selecting music with a series of taps.
The system is bulky now, with an armband and 10 wires plugged into a large receiver that then plugs into a computer to process the sounds. Harrison hopes to scale it down.
"Maybe the size of a penny could encompass the entire armband. That could fit in the back of your audio player when you’re out for a jog," he said.
In a lab test, Harrison found that jogging didn’t interfere with the system.
As more task choices are added, the system will need to become more accurate.
"In the wild and especially when you have 10 inputs, your accuracy is going to suffer," he said. "We really need to push the accuracy up to the high 90s or even 100 percent if we want this to be palatable for consumer electronics."
According to Harrison, the accuracy of six inputs is between 90 and 100 percent, depending on the user.
Skinput was born from an earlier project that Harrison worked on called Scratch Input, which allowed users to write by simply dragging a finger over the surface of a textured material, like a desk. A sensor interpreted the sounds and turned them into lines on a display. Imagine scratching letters on a desk and having them show up on a display screen.
There’s no word on when or if the system will be commercialized, but Microsoft’s interest in what was once a student’s project suggests it may use the technology in the future.