Capturing Thanos: How the Ultimate ‘Avengers’ Adversary Was Brought to Life

VFX supervisors Dan DeLeeuw and Kelly Port discuss the challenges of capturing the nuanced performance of Josh Brolin as the philosophical antagonist in ‘Avengers: Infinity War.’

Avengers: Infinity War, the 19th film in the Marvel Cinematic Universe, brings together a vast ensemble of Avengers and Guardians of the Galaxy to save the universe from the powerful villain Thanos (played by Josh Brolin), who seeks to collect all six Infinity Stones and, with them, gain control over reality itself.

Directed by Anthony and Joe Russo and produced by Marvel Studios, the film grossed more than $2 billion worldwide following its release by Disney in April, making it the highest-grossing film of 2018. And now it’s one of the front-runners this awards season for its spectacular visual effects. The sequel, Avengers: Endgame, is scheduled for release in April 2019.

Brolin’s nuanced performance as the philosophical antagonist Thanos really drives the film.

Visual effects supervisor Dan DeLeeuw, who was nominated for a visual effects Oscar in 2015 for his work on Captain America: The Winter Soldier, explained that because of the character’s central role, the visual effects artists knew from the outset that creating a believable Thanos was the linchpin that would hold the movie together.

“We had done Thanos two or three times before in the other films, but we knew that we were going to have to push it to the next level to be able to carry what we saw in the script,” he says. “When we first started to do R&D we talked to Digital Domain and we talked to Weta and we started a parallel process on Thanos figuring we could learn something from both companies and build the character and make him better, but also make sure that we could get a version of Thanos as quickly as possible, which the filmmakers could see and Josh Brolin could see and then know and trust that we could pull it off.”

DeLeeuw recalled that working with two VFX facilities on one character was somewhat tricky. “You look at the different components of the film in terms of what you need to solve,” he says. “The subtle performances generally went to Digital Domain. The action performances went to Weta. You kind of split the character up and, hopefully, hide the differences. The key was always going back to Josh,” he adds.

“In terms of keeping it within the same palette, we’ve got a really great look dev department here at Marvel, so what they’ll do is sculpt a lot of our characters in ZBrush so that even though the two companies started working on the character, they both started from a common sculpt of his face,” DeLeeuw continues.

The VFX team started early with motion-capture tests of Brolin performing and rehearsing some of his lines as the Russo brothers coached him on the character. “We just left the motion-capture helmet camera running the entire day,” DeLeeuw says. “What we found is we got better, more interesting performances when Josh was just sitting there experimenting with Thanos’ character, than we did with some of the lines.”

While Thanos was always portrayed as a menacing, bombastic villain in previous films, Brolin was able to deliver a very subdued and much more thoughtful performance. “Basically, we went through the recording of the day and picked out key lines that we thought were very cool. Then Digital Domain did the first test and it came back and it was just amazing because you could see Thanos there for the first time and then as the story develops Thanos started taking on more and more of the antagonist/protagonist role that you see in the film,” DeLeeuw recounts.

Digital Domain’s visual effects supervisor Kelly Port noted that “a lot of times actors wearing a facial capture helmet will over-perform or exaggerate a performance, thinking that it will make it come through stronger. But it doesn’t necessarily make the subtle parts of the face come through any better. So, the fact that he saw the subtleties of his performance come through gave him the confidence, I think, to perform that character as he really intended.”

Port, who was nominated for VES Awards in 2008 and 2015 for his work on We Own the Night and Maleficent respectively, explained that Digital Domain developed a proprietary machine-learning system to help refine facial motion capture for the film.

“What made us go that way is that we were looking for techniques and technology to capture as much as possible of the subtleties of Josh Brolin’s performances — [something that] hasn’t been done before,” he elaborates. “So, we had a session with Josh — a seated session that had nothing to do with the lines in the film — to get a high-resolution capture of how Josh Brolin’s face moved. He would just talk with sample dialog, [giving us] facial expressions — extreme, subtle, you name it. We just tried to capture as much as possible of how this particular human being’s face moved,” he says.

“Then once we had that, we could take the low-resolution mesh that’s derived from just about 150 tracking dots on his face and feed that into a machine-learning system and over time it learned what sort of high-resolution face to generate from that,” Port continues. “It’s not like warping or blending or anything like that. It’s actually generating a high-resolution face, exactly how he’s moving.”
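Port doesn’t go into the model’s internals, but the shape of the problem he describes — tracked marker positions from the helmet camera going in, a dense, fully posed face mesh coming out — can be sketched as a learned regression. The sketch below is only illustrative: the layer sizes, the 5,000-vertex mesh resolution, and the simple multilayer perceptron are assumptions for demonstration, not Digital Domain’s actual system.

```python
# Illustrative sketch only, not Digital Domain's pipeline: module names, layer
# sizes, and the mesh resolution are assumptions. It shows the shape of the
# problem Port describes -- learn to generate a dense face mesh from the
# low-resolution data of ~150 tracked helmet-camera dots.
import torch
import torch.nn as nn

NUM_MARKERS = 150        # tracking dots on the actor's face (per Port)
NUM_VERTS = 5_000        # stand-in for a much denser production mesh

class MarkerToMesh(nn.Module):
    """Maps one frame of flattened marker positions to a full-resolution face."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_MARKERS * 3, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
            nn.Linear(1024, NUM_VERTS * 3),
        )

    def forward(self, markers):                       # (batch, 150, 3)
        return self.net(markers.flatten(1)).view(-1, NUM_VERTS, 3)

# Training loop over marker/mesh pairs; random tensors stand in for the
# high-resolution seated-session data so the sketch runs on its own.
model = MarkerToMesh()
optim = torch.optim.Adam(model.parameters(), lr=1e-4)
markers = torch.randn(256, NUM_MARKERS, 3)            # tracked dots per frame
meshes = torch.randn(256, NUM_VERTS, 3)               # matching dense scans
for _ in range(10):
    optim.zero_grad()
    loss = nn.functional.mse_loss(model(markers), meshes)
    loss.backward()
    optim.step()
```

In production, the training pairs would come from the high-resolution seated session Port describes, with the helmet markers matched to each scanned frame; the random tensors above merely keep the example self-contained.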

Port explained that as the artists evaluated the results side by side to make sure all of the nuances were carrying through, they would correct any errors and adjust the results to better match the performance, which gave the system the data it needed to learn from its mistakes and get better and better over time.

“Unfortunately, a lot of the places that ended up being a bit off are the areas where you don’t have a lot of data, which is often the most expressive parts of the face — the mouth and the eye especially,” he recalls. “So we ended up creating hundreds of handcrafted face shapes to tie into this library, and ultimately this feeds into the machine learning system.”
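The hand-sculpted library Port mentions acts as a set of corrective shapes for the data-poor regions such as the mouth and eyes. A minimal sketch of how such a correction loop might look — with all names and array shapes assumed for illustration — is a blendshape-style combination that produces the fixed frame, which is then appended to the training data so the learned model improves on exactly the frames it got wrong.

```python
# Hedged sketch with assumed names and shapes, not Digital Domain's pipeline:
# build an artist-corrected frame from hand-sculpted shapes, then feed it back
# into the training data so the system "learns from its mistakes."
import numpy as np

def compose_corrected_mesh(neutral, shape_deltas, weights):
    """neutral: (V, 3) base face; shape_deltas: (S, V, 3) hand-sculpted offsets
    (e.g. mouth and eye shapes, per Port); weights: (S,) artist-chosen values."""
    return neutral + np.tensordot(weights, shape_deltas, axes=1)

def add_correction(train_markers, train_meshes, frame_markers, corrected_mesh):
    """Append one corrected frame to the marker/mesh training pairs."""
    return (np.concatenate([train_markers, frame_markers[None]]),
            np.concatenate([train_meshes, corrected_mesh[None]]))
```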

Of course, in the end, Brolin had to perform on set. “We wanted him with the rest of the actors,” says DeLeeuw. “We wanted him to be able to perform with the other actors on set because when you get into this motion-capture volume it becomes this sterile place. You just kind of feel that you’re just acting with someone feeding you lines.”

DeLeeuw stressed that the overall complexity of the film was the biggest challenge. “There are 2,700 cuts in the film and there’s only 80 shots that we didn’t touch. So we knew it was going to be a giant film.”

“There is an enormous amount of visual effects [in this film],” adds Port. “Ninety-seven percent of this film was visual effects. Then you ask, ‘well do those visual effects support and move the story forward?’ Without the visual effects, this film wouldn’t go very far. Are there technical innovations in the film that pushed the industry forward? Yes. Very much so. Are the effects consistent? I think so. Are they complex? Well, they’re all really hard shots, too.”

DeLeeuw described 10- or 12-hour days reviewing dailies right up to the last possible minute. “There’s no big magical system,” he says. “[In the end] it’s just nose-to-the-grindstone, going through it again and again and again until you basically run out of time.”

“Somewhere around 2:30 in the morning on the last night, they walked in and said, ‘Okay, we’ve got to be done in half an hour because it takes us three hours to get everything rendered, and we need to send it to London so we can start the localization,’” DeLeeuw recounts. “I’m like, ‘No. Keep working. Keep going until someone says stop.’”

“Of all the visual effects in this film, Thanos is what I am most proud of,” DeLeeuw concludes. “I think he’s very impressive in terms of the range of emotions he’s able to project. It was great to watch Josh Brolin see himself as Thanos and see his eyes light up when he saw the performance. He said, ‘That’s really me.’ We said, ‘Yeah, that’s the point. That’s what we’ve been trying to do.’”
