At performance time, the position and rotation of the performer’s face were calculated from the tracking data in real time, and the position and rotation of the 3D model were synchronized with them. An accurate perspective view of the 3D model’s animated textures could thus be projected onto the real face.
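The per-frame synchronization described above can be sketched in plain Python. This is a generic illustration, not WOW’s actual pipeline (which used custom software); the function names and the Euler rotation order are assumptions made for the example.

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    # 3x3 matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def sync_model_pose(tracked_position, tracked_rotation_deg):
    """Return (rotation_matrix, translation) for the virtual 3D model so it
    matches the performer's tracked head pose on this frame.

    tracked_rotation_deg: (heading, pitch, bank) Euler angles in degrees,
    applied in Y * X * Z order (an illustrative choice)."""
    h, p, b = (math.radians(a) for a in tracked_rotation_deg)
    rotation = matmul(matmul(rot_y(h), rot_x(p)), rot_z(b))
    return rotation, list(tracked_position)
```

Calling this once per captured frame and writing the result into the virtual model’s transform keeps the projected texture locked to the real face.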
Cinema 4D’s speed was important because this was the first face-mapping project WOW had done.
Abe said, “The whole project had to be completed in six months, but because it took us some time to get our ideas and approaches into shape, we only had one month to spend on production work. The speed of Cinema 4D really helped us.”
“One of the first and most important aspects of face mapping we discovered was how sensitive viewers are to even slight changes to the face that are incongruous or out of harmony with the performance. Compared to small alterations made to landscapes or product shots, for instance, audience reactions to faces are much stronger. So we had to do a lot of testing and work rapidly through many trials.”
Cinema 4D’s MoGraph tools were especially useful in three of the animations: the scanning sequence at the start of the video, a transition from the model’s skin to a black-and-yellow lizard’s skin, and, a few seconds later, a black-and-red frog’s skin that forms and slides off the face (at roughly 0:12, 0:50 and 1:03 in the video).
MoGraph’s Inheritance Effector was used in the scanning sequence to animate a composited wireframe as a 3D effect that moves over the face and then spreads out from the nose. Rendering this with the Cel Renderer as a post effect created a complex pattern.
For the lizard’s skin, the Shader Effector drove a gradual change across the texture of the face’s surface. The Random Effector produced the randomized, fractured black-and-red pieces of the frog skin, while the slide-off effect was created with Dynamics, whose parameters can be controlled through the Effectors.
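The core idea behind the Shader Effector transition — a per-point blend weight driven by an animated falloff — can be approximated in plain Python. This is a generic sketch, not Cinema 4D’s API; the function and parameter names are illustrative.

```python
def shader_effector_weights(vertex_positions, falloff_center, falloff_radius):
    """Per-vertex blend weights in [0, 1]: vertices inside the falloff take
    the new (e.g. lizard-skin) texture, with a smoothstep edge so the
    transition band is soft rather than a hard line."""
    weights = []
    for x, y, z in vertex_positions:
        d = ((x - falloff_center[0]) ** 2 +
             (y - falloff_center[1]) ** 2 +
             (z - falloff_center[2]) ** 2) ** 0.5
        # Linear falloff clamped to [0, 1], then smoothstep-shaped
        t = max(0.0, min(1.0, 1.0 - d / falloff_radius))
        weights.append(t * t * (3.0 - 2.0 * t))
    return weights
```

Animating `falloff_center` across the face frame by frame sweeps the new texture over the surface, which is essentially what the Shader Effector’s falloff does for the lizard-skin transition.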
Once all of these animations, looks and effects were completed to WOW’s satisfaction, precise projection data was generated using original software the team developed for the purpose. Interestingly, although the production gives the impression of interactivity and spontaneity, the projected movie itself is fixed; only the projection, driven by the motion capture, was interactive. This made it possible to storyboard the piece in a deliberate way.
At times, the model appears to blink and open her eyes to reveal an animal’s eyes, or to suddenly smile. In those cases, Abe explained, she did not actually blink in real time; viewers were watching only a projected animation. “We did a lot of research and created numerous simulations for eye expressions, adjusting them over and over,” he said. “People can immediately sense even the smallest incongruity around the eyes. In daily life, they watch other people’s eyes and, from there, read slight changes in feeling. To test those projections, we used a mockup of the scanned face model, Cinema 4D and other software until we could finally create very natural blinks and smiles.”
Asai commented, “Modifying details of the face is very sensitive work. It is fairly easy to express humor or fear, or to achieve grotesque results. Conversely, beautiful facial expressions are very difficult.”