Not only is “Beauty and the Beast” a tale as old as time, it’s also the biggest hit of 2017. Out now on DVD, Blu-ray and digital formats, Disney’s live action remake of its classic animated film is also at the cutting edge of eye-popping visual effects — so be our guest to find out how it was created.
To update “Beauty and the Beast” in the age of CGI, Disney recruited Digital Domain, the Oscar-winning effects company founded by James Cameron and Stan Winston in 1993. Visual effects supervisors Kelly Port and Darren Hendler, who between them have worked on effects extravaganzas from “Titanic” to “Star Trek” and the next Marvel “Avengers” movie, brought the beloved characters of the original film to life in a new form.
Darren Hendler: We didn’t go in thinking we were going to animate this Beast to match the original. The actor and the director worked on his performance in a way that would invoke the original, and we matched the actor’s performance.
Digital Domain’s main task was creating the character of the Beast, a digital creation based on a performance by “Downton Abbey” star Dan Stevens.
DH: The Beast definitely was one of the most technically challenging characters we’ve ever had to do. This character is covered in body fur and is wearing multiple layers of clothing riding on top of the body fur. He’s wearing a cloak which is made up of hundreds of different shredded pieces of material and fabric. Each of them has hair and threads and stuff on top of them, that are riding on his body and on his hair. He’s got this long hairstyle that’s flowing back down onto this cloak, and onto his body hair. And at the same time we’re driving off this talking actor, trying to get this one-to-one transfer of the actor’s face onto the character. That’s a lot of things to get right in one shot if you want to have that one shot look believable and not take you out of the movie.
Digital Domain was also responsible for conjuring digital backdrops to the physical sets of the Beast’s castle.
Kelly Port: Our production designer, Sarah Greenwood, did an amazing job with her team in building these gorgeous sets, like the ballroom and the foyer where they walk down the stairs before they go into the main ballroom dance. Even though the stages were huge, we would take the work that Sarah and her team did and extend that out. Where the set ends we would begin, matching the architectural style and lighting.
Although visual effects are added after filming, the team also have a presence during the shoot, advising the filmmakers, using LIDAR to digitally scan the dimensions of the physical sets, and recording what’s going on in each shot ready for post-production.
KP: We take that information about what lens was used and roughly what the camera was doing. We take a record of how the set was lit so we have a record of how to light a CG character like the Beast in the same lighting environment when we put all that together.
These days it’s even possible to render a rough version of the effects in real-time to show the filmmakers on a monitor right there on set, but Port and Hendler have reservations.
DH: One thing we’ve noticed with real-time feedback is it’s not accurate. People tend to start to rely on it, actors start to change their performances, but if it’s not accurate that’s actually more harmful than helpful. So sometimes we’ll show them a quick preview of something, but we then let them actually focus on their performance instead of watching something.
Playing the hulking Beast, Stevens effectively performed his role twice. First, on set, he walked on stilts and wore a special body suit to bulk him up to the Beast’s imposing height and physique. The suit was made from sculpted latex foam and covered with a grey lycra outer layer covered in tracking markers, which were recorded by the cameras to track the movements of his body. This was his “plate” performance, capturing the template over which the digital character would be overlaid.
DH: In some initial tests we actually had Dan wearing some prosthetic teeth, to change the way his mouth moves so it would be easier for us to put the fangs in. But we noticed very quickly as Dan’s doing his performances, you could tell that he hadn’t lived his entire life with fangs. His fangs were a little uncomfortable and awkward, and that started to come through in the performance. We very quickly abandoned that idea.
Often actors wearing motion capture suits wear camera rigs on their heads and have tracking points marked on their faces too. But it was decided that Stevens and co-star Emma Watson should be able to see each other clearly in order to interact more realistically, so Stevens didn’t wear a head-mounted camera. Instead, after shooting a scene on set with the other actors, he would go into a studio at a later date to record the second part of his performance.
DH: We used a custom rig that has 25 cameras all filming Dan’s face, tracking thousands and thousands of points on his face. He’s seated, he can move around a bit, but not much. Some actors really struggle because you’re trying to give this emotive performance, but you’re sitting in this rig that’s capturing your face and you can’t really move your head too much.
Those thousands of tracking points were then mapped to the digitally-created face of the Beast, transferring the nuances of Stevens’ performance to the outlandish creature.
KP: It’s almost a new kind of acting. You don’t really know how he’s going to deal with it until the first time he gets in there and actually does the first facial performance. It’s a very abstract, bizarre-looking system completely removed from any of the beautiful sets. What was nice was that Emma sat beside him with a lot of the dialogue. And it turned out that Dan had an amazing ability to remember back to being on set one or two or three weeks prior, and being able to take his mind there and do that performance really well.
The team then used custom software to transfer Stevens’ facial movements onto the digitally-created creature. A data-intensive process that used to take months, this now takes weeks or even days.
KP: The detail that we’re getting into is beyond pore-level. It’s little things like eyelashes and eye water, and pores and the nuances of the beard hair and whether it has variations of colour and the root of the hairs. It’s pretty astounding.
Creating digital characters requires an army of experts in the human face. Specialist facial trackers, facial modelers and facial animators are employed to capture a performance and translate it into the finished character.
KP: Every single film has a unique set of requirements from the last one. So every generic job like a modeller or someone who’s creating the shaders or the textures or the painting, those jobs have always stayed the same but they just fine-tune into a specific area for this particular need. A lot of these people have a background in modelling or anatomy, facial anatomy, and then a lot of software engineering.
Most big effects-driven films have more than one effects vendor working their magic. For this film, Digital Domain worked alongside British VFX company Framestore, which was responsible for the household staff and for extending the town sets.
KP: There were a few shared shots where we had to deal with Mrs. Potts and Cogsworth and Lumière, and the Beast. We had pretty good communication with Framestore supervisor Kyle McCulloch. We’d have weekly calls to deal with shared shots and talk about scheduling and things like that, but typically the look of the shot has been established by the context of the shots around it. Where it gets a lot more complicated is when there’s interaction between two characters, like where they’re physically touching.
DH: There’s generally an understanding that for any shot, one vendor becomes the primary and the other vendor becomes the secondary. We have certain standards for handing off animation, geometry, textures, models. When we get into custom rendering software, generally we hand off custom lookDev turntables with lighting environments so we know they are matching correctly with their own custom software.
With films like “Beauty and the Beast” blurring the line between live action and animation, and other recent movies digitally re-creating actors who have passed away, today’s visual effects seem to offer almost unlimited possibilities.
KP: You can pretty much do anything — but that doesn’t mean it’s not expensive. Time and money is still the operating factor.
DH: The one difficulty is still human faces. You do see some human faces that are believable in a few shots here and there but we haven’t seen that consistent, all-digital CG person that’s believable. But with enough money it’s definitely doable.
KP: There’s things that we could do today that are just remarkable. Today we’re doing stuff on shows that literally last year or the year before we couldn’t have been doing. It’s moving incredibly fast, so it’s a very exciting time.
After the runaway success of “Beauty and the Beast” and Disney’s other live action reboots, there are new renditions of “Mulan”, “Aladdin” and more in the works. Doubtless they won’t be the only classic movies upgraded using today’s jaw-dropping effects innovations.
KP: If they want to give me all the Disney films, I’m cool with that!