TOUCH YOURSELF INTERFACE
According to researchers in the US, the Skinput system, an acoustic sensor capable of detecting the distinctive sounds made by tapping different parts of the skin, will turn bodies into touchscreens.
The unit is small and wearable, combining a bio-acoustic sensor with a pico projector. Demos have centred on using an arm or hand as an interface, but with roughly two square metres of skin available, there are numerous possibilities.
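The core idea, that each tap location produces a distinctive acoustic signature that a classifier can recognise, can be sketched in a few lines. The sketch below is purely illustrative and not the team's actual pipeline: it fakes three body locations as decaying sinusoids with different dominant frequencies (the frequencies, sample rate, and nearest-centroid classifier are all assumptions for the demo), then matches an unknown tap to the closest trained signature.

```python
import numpy as np

RATE = 4000      # sample rate in Hz (assumed for this toy example)
DURATION = 0.1   # 100 ms tap window (assumed)

def synth_tap(freq, rng):
    """Synthesise a decaying sinusoid standing in for a bio-acoustic tap."""
    t = np.arange(0, DURATION, 1 / RATE)
    return np.sin(2 * np.pi * freq * t) * np.exp(-30 * t) \
        + 0.05 * rng.standard_normal(t.size)

def features(signal):
    """Use the normalised FFT magnitude spectrum as the feature vector."""
    mag = np.abs(np.fft.rfft(signal))
    return mag / np.linalg.norm(mag)

# Hypothetical dominant frequencies per tap location -- invented for the demo.
LOCATION_FREQS = {"wrist": 180.0, "forearm": 320.0, "elbow": 540.0}

def train_centroids(rng, taps_per_location=10):
    """Average the feature vectors of several example taps per location."""
    return {
        loc: np.mean([features(synth_tap(f, rng))
                      for _ in range(taps_per_location)], axis=0)
        for loc, f in LOCATION_FREQS.items()
    }

def classify(signal, centroids):
    """Return the location whose trained centroid is closest to this tap."""
    f = features(signal)
    return min(centroids, key=lambda loc: np.linalg.norm(f - centroids[loc]))

rng = np.random.default_rng(0)
centroids = train_centroids(rng)
print(classify(synth_tap(320.0, rng), centroids))  # expect "forearm"
```

A real system would of course work from a microphone or vibration sensor on the arm rather than synthetic signals, and would need a more robust classifier, but the train-then-match structure is the same.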
The research team, comprising Carnegie Mellon University’s Chris Harrison and two Microsoft Research employees, Desney Tan and Dan Morris, will publish their findings in a paper titled Skinput: Appropriating the Body as an Input Surface. The paper will be released in April at CHI 2010, the ACM Conference on Human Factors in Computing Systems.
Tan explained that the technology had the potential to advance the way we interact with machines. He said:
“Just as cellphones and mobile computing have changed the way we operate in the world, so, too, will this vision of ‘intravenous computing’ revolutionise the way we use – and rely on – computers.”
“Proprioception (our sense of how our body is configured in three-dimensional space) allows us to accurately interact with our bodies in an eyes-free manner. For example, we can readily flick each of our fingers, touch the tip of our nose, and clap our hands together without visual assistance.”
One word of warning, however, comes from Pranav Mistry of the Media Lab at the Massachusetts Institute of Technology, who told the New Scientist that users would have to position the armband very precisely so that the projection appeared in the right place. After all, nobody wants to be caught tapping the wrong thing in public!