Digitizing yourself as Miku Hatsune or some other avatar
In this article you can read about two projects that can transfer your actions into a virtual world. Both projects work without special motion-tracking suits or markers, and they have another thing in common: both were presented using Miku Hatsune – the popular virtual character we described in our article about Vocaloid virtual singers.
Markerless face tracking with high speed and precision
A while ago, we wrote about the Emotiv Epoc – a neural headset able to read brain activity and translate it into actions in a virtual world. One way that equipment could be used is to turn those readings into virtual avatars which mimic our facial expressions – a feature useful in games, online conversations, and CG animation.
Led by Associate Professor Yasue Mitsukura, a group of researchers from Keio University came up with a different approach, one that relies on visual information gathered from a camera to achieve fast and precise face tracking. The system measures which way a person is facing and how their expression changes, and maps those expressions onto virtual characters.
“We think this system could be used by CG animation hobbyists, in Web dialog systems that show a character instead of the person’s face, and for making characters move in real time at events. Because the system uses just one PC and one camera, it can be applied in many situations very easily”, said Koichi Takahashi, from Keio University’s Graduate School of Science and Technology.
This system’s hardware consists of an ordinary PC and a USB camera; the ‘magic’ comes from the tracking software. It uses time-series signal processing to detect and track characteristic points such as the eyes, nose, and mouth, responding to changes in the positions of those points nearly instantly and with very high precision.
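To illustrate the general idea (a minimal sketch, not the Keio team's actual algorithm), the snippet below follows characteristic points from one frame to the next, assuming a detector already supplies candidate (x, y) positions each frame; the landmark names and all coordinates are invented for the example:

```python
import math

def match_landmarks(previous, detections):
    """Assign each previously tracked point to its nearest new detection.

    previous   -- dict mapping landmark name to (x, y) from the last frame
    detections -- list of (x, y) candidate points found in the current frame
    Returns an updated dict with the same landmark names.
    """
    updated = {}
    for name, (px, py) in previous.items():
        # Nearest-neighbour matching: a point moves very little between
        # consecutive frames, so the closest detection is the best guess.
        updated[name] = min(
            detections,
            key=lambda d: math.hypot(d[0] - px, d[1] - py),
        )
    return updated

def head_yaw(points):
    """Rough left/right facing estimate: the nose's horizontal offset
    from the midpoint between the eyes, normalised by eye spacing."""
    mid_x = (points["left_eye"][0] + points["right_eye"][0]) / 2
    eye_span = points["right_eye"][0] - points["left_eye"][0]
    return (points["nose"][0] - mid_x) / eye_span

prev = {"left_eye": (80, 100), "right_eye": (140, 100),
        "nose": (110, 130), "mouth": (110, 160)}
# Detections in the next frame: the whole face shifted slightly right.
frame = [(83, 101), (143, 99), (113, 131), (113, 161)]
tracked = match_landmarks(prev, frame)
print(tracked["nose"])               # (113, 131)
print(round(head_yaw(tracked), 2))   # 0.0 -> still facing the camera
```

A real system would of course add robust detection, filtering, and full 3-D pose estimation on top of this, but the frame-to-frame tracking of a handful of characteristic points is the core of the approach described above.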
Besides tracking characteristic points, the system can analyze other features, such as facial expression and the current shape of the user’s mouth. It can therefore reproduce emotions and speech, though with slightly less precision, which can make the avatar appear a bit more expressive than the user actually is.
This system is convenient for the gaming industry since it doesn’t require markers or headsets to be worn during play. The researchers aim to develop motion-generation software for standard PCs, opening the system to a wider range of applications.
Freeware 3D animation and markerless body tracking software
Since we wrote our article about Vocaloids and Miku Hatsune, many videos have appeared in which people used a software package named Miku Miku Dance (MMD) to create videos starring Miku. Besides various non-Vocaloid models, you can choose between Miku Hatsune, Rin and Len Kagamine, Kaito, Meiko (along with her younger version, Sakine Meiko), Haku Yowane, and Neru Akita.
Originally developed by Yu Higuchi, MMD is freeware created as part of the Vocaloid Promotion Video Project (VPVP). Higuchi stopped developing the software in May 2011, but because it is freeware, fans were inspired to continue improving it.
MMD allows users to import these 3D models onto a virtual stage and animate them. Animations can be set to music to create fan-made music videos featuring the imported characters. An animation sequence can be stored in a motion data file, making it easily reusable in other projects and shareable among fans.
The best feature of this software was saved for last: it can be combined with Microsoft’s Kinect. You can use Kinect’s body-tracking ability to transfer your moves into the virtual world, where avatars such as Miku repeat your movements in real time. With each release, MMD has offered better and more fluid movement detection.
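As a rough illustration of what driving an avatar from tracked joints involves (a toy sketch, not the Kinect SDK or MMD's actual code), the snippet below derives a joint angle from three tracked joint positions – the kind of value a model's bone could be rotated by each frame; all coordinates here are invented for the example:

```python
import math

def bone_angle(parent, joint, child):
    """Angle in degrees at `joint` (e.g. an elbow) formed by the two
    adjacent bones, computed from three 2-D joint positions such as
    those a depth sensor like Kinect reports for a tracked skeleton."""
    ax, ay = parent[0] - joint[0], parent[1] - joint[1]
    bx, by = child[0] - joint[0], child[1] - joint[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    return math.degrees(math.acos(dot / norm))

# Hypothetical frame: shoulder, elbow, and wrist of a bent arm.
shoulder, elbow, wrist = (0.0, 0.0), (0.3, 0.0), (0.3, 0.3)
print(round(bone_angle(shoulder, elbow, wrist)))  # 90
```

Recomputing such angles for every bone on every frame, and applying them to the corresponding bones of the 3D model, is essentially how a tracked skeleton makes an avatar mirror your movements.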
To make animations more realistic, the software offers shadow mapping; screenshot rendering and full movie rendering are also possible. It also relies on the Bullet physics engine – an open-source engine that provides soft- and rigid-body dynamics as well as 3D collision detection. Bullet was used to create effects for films such as Sherlock Holmes and Hancock, and for the animated movie Bolt.
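To give a flavour of what a physics engine like Bullet computes every frame (a deliberately simplified one-dimensional toy, not Bullet's actual solver), here is a basic rigid-body step with gravity and a crude ground bounce:

```python
# Toy illustration of the per-frame update a physics engine performs:
# semi-implicit Euler integration of a falling body, plus a simple
# ground-plane collision response.
GRAVITY = -9.81  # m/s^2

def step(pos, vel, dt):
    """Advance one simulation step; bounce off the ground at y = 0,
    keeping half the speed (a crude restitution model)."""
    vel += GRAVITY * dt
    pos += vel * dt
    if pos < 0.0:          # collision with the ground plane
        pos = 0.0
        vel = -vel * 0.5   # lose half the speed on impact
    return pos, vel

pos, vel = 1.0, 0.0        # dropped from one metre, at rest
for _ in range(100):       # simulate one second at 100 Hz
    pos, vel = step(pos, vel, 0.01)
print(0.0 <= pos < 1.0)    # True: the body fell and bounced
```

A full engine generalises this to thousands of bodies in three dimensions, with constraints, soft bodies, and proper collision detection – which is what lets MMD make hair and clothing move believably.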
For beginner tutorials, examples, and general information about MMD in English, visit the MMD Sekai website.