OmniTouch wearable multitouch system turns any surface into an interface
Researchers at Microsoft Research and Carnegie Mellon University are developing a wearable projection system that enables users to turn pads of paper, walls, or even their own hands, arms and legs into graphical, interactive surfaces. Dubbed OmniTouch, the technology projects information onto everyday surfaces and lets users interact with it through gestures or touch.
“It’s conceivable that anything you can do on today’s mobile devices, you will be able to do on your hand using OmniTouch”, said Chris Harrison, a Ph.D. student in Carnegie Mellon’s Human-Computer Interaction Institute.
Harrison previously worked with Microsoft Research to develop Skinput – a technology we previously described here. In a nutshell, that technology uses bio-acoustic sensors to detect finger taps on a person’s hands or forearm. Skinput thus enabled users to control smartphones or other compact computing devices. The optical sensing used in OmniTouch, by contrast, allows a wide range of interactions, similar to the capabilities of a computer mouse or touchscreen.
OmniTouch relies on a depth-sensing camera, similar to the Microsoft Kinect, to track the user’s fingers on everyday surfaces. This allows users to control interactive applications by tapping or dragging their fingers, much as they would with the touchscreens found on smartphones or tablet computers. A laser pico-projector superimposes keyboards, keypads and other 2D graphical interfaces onto any surface, automatically adjusting for the surface’s shape and orientation to minimize distortion of the projected images.
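The paper does not publish its warping code, but correcting a projection for an angled planar surface typically comes down to applying a planar homography, which can be estimated from four point correspondences. The sketch below, using the standard DLT method in NumPy, is purely illustrative; the function name and point values are assumptions, not OmniTouch's actual implementation:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points (DLT).

    src, dst: arrays of four (x, y) correspondences.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    # The homography is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so the bottom-right entry is 1

# Example: map the unit square onto a square shifted one unit to the right.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(1, 0), (2, 0), (2, 1), (1, 1)]
H = homography(src, dst)
p = H @ np.array([0.5, 0.5, 1.0])   # warp the square's center
print(p[0] / p[2], p[1] / p[2])     # homogeneous divide
```

In practice a library routine such as OpenCV's `cv2.getPerspectiveTransform` does the same estimation; the pre-warped image then appears undistorted on the tilted surface.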
The system can track three-dimensional motion on the hand or other commonplace surfaces, and can sense whether fingers are “clicked” or hovering. What’s more, OmniTouch does not require calibration — users can simply wear the device and immediately use its features.
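The click-versus-hover distinction can be illustrated with a simple depth test: a finger reads as “clicked” when the camera sees it merge with the surface behind it. The sketch below is only a toy illustration of that idea; the function name, thresholds, and three-way classification are assumptions, not the system's published logic:

```python
def classify_touch(finger_depth_mm, surface_depth_mm,
                   click_threshold_mm=10.0, hover_threshold_mm=100.0):
    """Classify a tracked fingertip relative to a surface using depth.

    Both depths are distances from the camera; a finger above the surface
    is closer to the camera, so the gap is surface minus finger depth.
    """
    gap = surface_depth_mm - finger_depth_mm
    if gap <= click_threshold_mm:
        return "clicked"    # finger has (nearly) merged with the surface
    if gap <= hover_threshold_mm:
        return "hovering"   # finger is near, but not touching
    return "away"           # finger is too far to count as interaction

print(classify_touch(995.0, 1000.0))  # finger 5 mm from the surface
print(classify_touch(950.0, 1000.0))  # finger 50 mm above the surface
```

A real pipeline would also segment fingers from the depth map and smooth the signal over time, but the thresholding step captures why a depth camera can distinguish a tap from a hover without any instrumentation of the surface.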
“With OmniTouch, we wanted to capitalize on the tremendous surface area the real world provides”, said Hrvoje Benko, a researcher in Microsoft Research’s Adaptive Systems and Interaction group. “We see this work as an evolutionary step in a larger effort at Microsoft Research to investigate the unconventional use of touch and gesture in devices to extend our vision of ubiquitous computing even further. Being able to collaborate openly with academics and researchers like Chris on such work is critical to our organization’s ability to do great research — and to advancing the state of the art of computer user interfaces in general.”
The current version of the device is mounted on a user’s shoulder, but its developers hope future versions will be the size of a deck of cards, or even a matchbox, so that it could fit in a pocket, be easily wearable, or be integrated into future handheld devices.
For more information, you can read the paper, “OmniTouch: Wearable Multitouch Interaction Everywhere” (PDF).
Harrison will present OmniTouch on Wednesday (Oct. 19) at the Association for Computing Machinery’s Symposium on User Interface Software and Technology (UIST) in Santa Barbara, Calif.
I was a finalist in the Microsoft Imagine Cup with this project in 2009, just before the 6th Sense came out. Ironically, it is called the 7th Sense. The concept is very similar to mine!
The montage looks interesting, and it shows how these technologies could be used in the future.
Once you make a functional prototype, we could cover your version of this interface.
Is this product available for sale?
If yes, may I know the overall price of this unit?
On which website is it available?