Bungie's shared-world shooter Destiny is set to rock the gaming world when it launches later this year. In anticipation of the game's release, I took the time to chat with Peter Busch, vice president of business development at Faceware Technologies, the facial motion capture company whose technology is helping bring the characters of Activision's highly anticipated shooter to life.
GameRevolution: Faceware Technologies describes itself as being the “most experienced provider of markerless 3D facial motion capture solutions.” Can you talk a little bit about the process of (and what’s involved in) “markerless” motion capture?
Peter Busch: Faceware Technologies is a spin-off of the service provider Image Metrics, which has been providing facial motion capture and animation services to the video game industry for nearly 15 years. Rather than relying on markers placed on the face and skin, Faceware's technology uses video and software to extract and understand the features and motions of a person’s face.
Our experience has helped us create tools that provide a very practical approach to a complex challenge. Facial animation is a very difficult craft – we have learned from our years in the industry and created a simple process that embraces the artist and the way they work. Actors simply act on camera (without having markers all over their face) and animators simply animate with the aid of our Retargeter software, which has been called “an assistant animator” by Bungie.
GR: What sets Faceware apart from other motion capture companies out there?
PB: We have many unique features in our software, most of which provide the heightened level of flexibility required for a successful production tool. We couple that flexibility with a workflow that is straightforward and simple, allowing end users to work more effectively and get the most out of the software.
One unique feature is AutoTrack in Analyzer, which uses an exclusive library built from over 10 years of facial data to help the software understand how faces move. This feature allows any video of any actor to be tracked automatically by our software without any prior information about that actor’s face. The tracked movement can then be applied to a digital character, which helps artists produce highly realistic facial performances much faster than doing it all manually. It is truly amazing.
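To give a rough sense of how tracked facial motion can end up driving a character, here is a minimal, hypothetical sketch. Faceware has not published its implementation, and the landmark layout, example expressions, and solver below are all assumptions; the general idea is simply to compare each frame’s tracked landmarks against a neutral pose and fit the difference with a small set of example expressions.

```python
import numpy as np

def solve_expression_weights(neutral, frame, expression_deltas):
    """Fit per-frame expression weights from tracked 2D landmarks.

    neutral           -- (N, 2) landmark positions for the actor's neutral face
    frame             -- (N, 2) landmark positions tracked in the current frame
    expression_deltas -- dict name -> (N, 2) landmark offsets for example
                         expressions (e.g. "smile"); purely illustrative
    Returns a dict of weights a rig or blendshape setup could consume.
    """
    names = list(expression_deltas)
    # Stack each example expression's landmark offsets into one column.
    basis = np.stack([expression_deltas[n].ravel() for n in names], axis=1)
    observed = (frame - neutral).ravel()

    # Least-squares fit: how much of each example expression explains the
    # observed landmark motion in this frame.
    weights, *_ = np.linalg.lstsq(basis, observed, rcond=None)
    return {n: float(np.clip(w, 0.0, 1.0)) for n, w in zip(names, weights)}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.normal(size=(68, 2))                 # 68-point layout, invented here
    smile = np.zeros((68, 2)); smile[48:68, 0] = 0.2   # mouth landmarks spread
    jaw = np.zeros((68, 2));   jaw[5:12, 1] = 0.3      # jaw landmarks drop
    frame = neutral + 0.7 * smile + 0.2 * jaw
    print(solve_expression_weights(neutral, frame, {"smile": smile, "jaw_open": jaw}))
```

The resulting per-frame weights are the kind of data an animation tool can hand off to a character rig.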
GR: How does Destiny’s character customization affect how you approach motion capture? Will players be able to customize their character’s face?
PB: While we can’t speak about specifics within Destiny’s characters, we can say that our five-year relationship with Bungie has always had one goal: nothing but the best quality will do. They are truly driven to create believable performances and have embraced our best practices to raise the quality of facial animation on Destiny.
GR: Destiny boasts a number of different creatures that don’t necessarily have human-like facial features. How can one go about capturing facial animations for alien species?
PB: Because Bungie uses our Retargeter software as an “assistant animator”, they are able to fully utilize our unique pose-driven workflow. In our process, an animator plays the role of interpreter and defines how a smile or scowl will look on any of their characters, humanoid or not. This level of creative control allows the relative movement of a live human performance to be applied to any character—even one whose facial features don’t resemble a human’s! I can say that the Bungie team has taken some very innovative and exciting approaches to facial animation, and I’ve been truly impressed. Wish I could say more!
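Busch’s description of the animator as interpreter maps neatly onto the idea of a pose library: the animator decides what a human “smile” should look like on a creature’s rig, and the performance only supplies how much of it to apply at any moment. The sketch below is purely illustrative; the rig controls and pose names are invented, not Bungie’s or Faceware’s.

```python
# Minimal sketch of a pose-driven retarget, assuming expression weights from a
# human performance (e.g. the output of a solver like the earlier sketch) and
# an animator-authored pose library for a non-human character.

ALIEN_POSE_LIBRARY = {
    # The animator decides what a human "smile" means on this creature's rig:
    # maybe mandibles flare and brow plates lift instead of lip corners rising.
    "smile":    {"mandible_flare": 0.8, "brow_plate_lift": 0.4},
    "jaw_open": {"mandible_open": 1.0, "throat_sac_inflate": 0.3},
}

def retarget_frame(expression_weights, pose_library):
    """Blend animator-authored character poses by the human performance weights."""
    controls = {}
    for expression, weight in expression_weights.items():
        for control, value in pose_library.get(expression, {}).items():
            controls[control] = controls.get(control, 0.0) + weight * value
    return controls

# A frame where the actor smiles broadly while slightly opening the jaw becomes
# rig-control values the alien's face can actually use.
print(retarget_frame({"smile": 0.7, "jaw_open": 0.2}, ALIEN_POSE_LIBRARY))
```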
GR: You’ve worked with Bungie in the past on Halo: Reach, as well as Crytek on Crysis 2 and 3, but your partnerships extend beyond the sci-fi first-person shooter genre, as you’ve worked with 2K on its popular NBA 2K franchise. Is there a different approach involved in capturing the faces of recognizable professional athletes?
PB: Absolutely. In almost every genre of game we do, the way a developer uses our software is very different. This is why the flexibility of our software is so critical.
With the NBA 2K series, the constant challenge is to have all of those highly recognizable public figures look and act like their live-action counterparts. What’s really different is that a sports title does not have the level of dialogue that a title like Destiny or Grand Theft Auto V would have—2K is really building a library of emotions that must work on all of their athletes, combined with “Signature” performances.
What is really unique about our software is that because it can work with any video source, 2K can use live game footage of Kobe Bryant to drive the 3D Kobe Bryant in their game—this obviously is a perfect scenario to meet the demands of their audience.
GR: Destiny is being developed for PlayStation 4, PS3, Xbox One, and Xbox 360, spanning two different console generations. How is the disparity in technological capabilities handled? Does Faceware feature some sort of scalability?
PB: Most Autodesk 3D software packages have a means of adjusting animation quality to the technical limitations of the target platform. The animation portion of our process is done through our software product Retargeter, which is actually a plug-in to Autodesk 3D software products. This gives Bungie full flexibility in porting the animation to any console.
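As a concrete, and again hypothetical, illustration of what scaling animation to a weaker platform can look like in practice, a pipeline might simply thin the baked keyframes on each facial animation curve, keeping only the keys that interpolation alone cannot recover. Nothing below reflects Bungie’s or Faceware’s actual tools.

```python
# Rough sketch: decimate a dense, baked animation curve to fit a tighter
# memory budget on an older console. Curve data and tolerances are invented.

def decimate_curve(keys, tolerance=0.01):
    """Drop keys whose value can be recovered by interpolating kept neighbors.

    keys -- list of (frame, value) pairs, sorted by frame
    """
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (f0, v0), (f1, v1), (f2, v2) = kept[-1], keys[i], keys[i + 1]
        # Value predicted by linear interpolation between the neighboring keys.
        t = (f1 - f0) / (f2 - f0)
        predicted = v0 + t * (v2 - v0)
        if abs(predicted - v1) > tolerance:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept

# Example: a dense 60-frame "jaw_open" curve reduced for a constrained target.
dense = [(f, 0.5 + 0.5 * (f % 10) / 10.0) for f in range(60)]
print(len(dense), "keys ->", len(decimate_curve(dense, tolerance=0.05)), "keys")
```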
Our products are very scalable and can be used on a wide variety of titles. Camouflaj Games in Seattle recently released a few scenes featuring their character Hope in the iOS title Republique. Our technology also helped produce over 400 minutes of cutscenes in Grand Theft Auto V. We even have a motion capture service client who is processing 10 to 20 hours of animation per week for a wide variety of media. Our software has been used on over 9.5 million frames of animation since the beginning of 2012. Needless to say, scalability is not a challenge for our products, which is why our clients have trusted them over the years!