Sony enters body motion control market with Atracsys ICU
The latest games-console arms race, in which Microsoft’s Project Natal for the Xbox 360 had a slight head start, just got more interesting. Sony has unveiled its own hands-free, full-body game control system, called the Interactive Communication Unit or ICU, at the Vision 2009 trade fair in Stuttgart, Germany. Sony Europe’s image-sensing division created ICU in collaboration with Atracsys, a small firm in Lausanne, Switzerland, that specializes in optical tracking.
At the same E3 where Microsoft presented Project Natal, Sony demonstrated its own motion controller, probably hoping to expand into the market Nintendo had captured with the Wii remote. Sony’s controller offers true 1:1 tracking in a 3D environment, accuracy that is crucial for everything from drawing with variable pressure to flicking an enemy’s chin with the tip of your sword. And although we expected Microsoft to announce controllers to accompany Project Natal, it was Sony that surprised everyone with its own full-body game control concept.
ICU uses stereo cameras to watch a player and judge depth, much as we do with our two eyes. Like Natal, Sony’s system tracks a person’s whole body without requiring the body markers used in motion-capture studios. Its face-detection and movement-recognition technologies let it work out which part of the screen a user is looking at, and it can also infer information about the user, such as age, gender and the emotion currently on display, simply by watching them.
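To give a sense of how a stereo pair yields depth: a point in the scene appears shifted between the left and right images, and that shift (the disparity) is inversely proportional to distance. The sketch below is purely illustrative, with made-up focal length and baseline values, and is in no way Sony’s actual code.

```python
# Illustrative sketch of stereo depth via triangulation. The focal
# length (in pixels) and camera baseline (in metres) are invented
# example numbers, not ICU's real calibration.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Depth in metres from the pixel disparity between the two views."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or no match")
    return focal_px * baseline_m / disparity_px

# A feature shifted 21 pixels between the views sits 2 m from the
# cameras: 700 * 0.06 / 21 = 2.0
print(depth_from_disparity(21))  # → 2.0
```

The same relationship explains why accuracy falls off with distance: at long range the disparity shrinks toward zero, so a one-pixel matching error translates into a large depth error.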
Atracsys already sells Infinitrack, a system that gives medics hands-free control of computers in sterile environments. But its users have to wear small reflective markers like those used in movie-industry motion-capture studios; previous versions required users to wear particular colours.
“Casual users can’t be expected to do that”, says Gaëtan Marti, CEO of Atracsys, “which limits the system’s precision. We cannot at present detect ‘finger signs’ [but] we can detect where you are looking at on the screen – up, middle, down – and the raw position of your arms [or legs],” he says.
ICU’s stereo cameras can detect the position of specific points on the arms, legs and head to within 10 cubic centimetres, compared with the 0.2 cubic millimetre accuracy of Infinitrack. ICU ‘reads’ facial expressions using a pattern-matching algorithm that has been trained on pictures of people expressing different emotions. Using cues such as the position and shape of the lips, ICU spots five basic states: happiness, anger, surprise, sadness and neutral.
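One simple way to picture pattern matching of this kind is nearest-centroid classification: reduce a face to a small feature vector (for instance, measurements of lip position and shape) and pick the expression whose average training vector is closest. The features and centroid values below are invented for illustration and bear no relation to Sony’s trained model.

```python
# Hedged sketch of expression recognition by nearest centroid.
# Each expression is represented by a hypothetical mean feature
# vector, e.g. (mouth-corner lift, mouth openness), both in [0, 1].
import math

CENTROIDS = {  # invented example values, not real training data
    "happiness": (0.9, 0.2),
    "anger":     (0.1, 0.8),
    "surprise":  (0.5, 0.9),
    "sadness":   (0.2, 0.1),
    "neutral":   (0.5, 0.5),
}

def classify_expression(features):
    """Return the expression whose centroid lies nearest to `features`."""
    return min(CENTROIDS, key=lambda name: math.dist(CENTROIDS[name], features))

print(classify_expression((0.85, 0.25)))  # → happiness
```

A production system would of course use far richer features and a classifier trained on many labelled images, but the matching principle is the same.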
It also has the ability to tune out the visual clutter around a player that could otherwise distort its results. “Once it detects a face 2 metres in front of the cameras, the system can isolate the person by only keeping the information between 1.5 and 2.5 metres away,” Marti says.
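The depth-band trick Marti describes can be sketched in a few lines: once a face is located at roughly 2 metres, every pixel whose depth falls outside a band around that distance is discarded. This pure-Python stand-in operates on a toy depth map; a real system would do the same per-pixel masking on camera data.

```python
# Sketch of depth-band isolation: keep only scene points within
# +/- band_m of the detected face's depth, discarding background
# clutter that could otherwise distort tracking.

def isolate_player(depth_map, face_depth_m, band_m=0.5):
    """Replace depths outside [face_depth - band, face_depth + band] with None."""
    lo, hi = face_depth_m - band_m, face_depth_m + band_m
    return [[d if lo <= d <= hi else None for d in row] for row in depth_map]

scanline = [0.8, 1.9, 2.1, 3.4]        # depths in metres for one image row
print(isolate_player([scanline], 2.0))  # → [[None, 1.9, 2.1, None]]
```

With a face detected at 2 m and the default half-metre band, this keeps exactly the 1.5 m to 2.5 m slice of the scene that Marti mentions.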
Sophisticated as it is, however, ICU isn’t yet ready to be launched into the punishing domestic entertainment market, according to Arnaud Destruels, marketing manager at Sony’s image-sensing division.
Sony plans to test ICU in the world of advertising before pairing it with its gaming system. Placed in a shop window, the system could adapt its content to the people watching and provide them with targeted information. It could also catch passers-by’s attention, invite them to interact in a natural and easy way, and let them browse through information simply by moving their heads.