The ‘Marvel’ Behind Real-Time On-Set Visualization of Digital Hulk and Thanos

Profile Studios’ Matt Madden shares how virtually integrating Mark Ruffalo and Josh Brolin with CG assets, live, on set, gave ‘Endgame’ and ‘Infinity War’ filmmakers the immediate viewing access they needed to continually adjust and refine the film’s key digital character performances.

For Profile Studios, working on Marvel’s back-to-back smash hits Avengers: Infinity War and Avengers: Endgame afforded the company a unique opportunity to fully employ their state-of-the-art, on-set, real-time visualization system on what were arguably two of the most sophisticated and challenging VFX-driven films ever produced.

Founded in 2014 by Matt Madden, president and creative director, and Connie Kennedy, executive producer, Profile Studios specializes in performance capture and virtual production services for film, games and TV. They’ve worked on a number of high-profile projects, including Spider-Man: Far From Home, Black Panther, Welcome to Marwen and Star Wars: The Force Awakens.

Brought on by Marvel prior to the start of Infinity War shooting, Madden and his team had an extremely important mission: to provide the filmmakers with an immediately viewable, real-time “virtual” version of what they were shooting live, replacing motion-captured performances with digital characters alongside other key CG assets. This virtually composited version of the scene being shot allowed for instant assessments of the quality and precision of acting performances and camera framing. Needed adjustments could be made on the spot.

Profile Studios’ primary work on the two Marvel films involved capturing and perfecting the performances of Mark Ruffalo, as Hulk, and Josh Brolin, as Thanos, though they were also involved with the Black Order and other characters that couldn’t be represented through physical makeup alone. After beginning prep work in late 2016, Madden and his team of 12-15 artists began working on set in January 2017, moving on and off the shoot over most of that year as well as a chunk of 2018, with on-set stints usually lasting 3-6 weeks at a time.


“During production, first, we would capture the actors’ performance on set,” Madden shares. “Then, we would re-target the actors’ motion onto the respective characters they were playing in real-time. The filmmakers also had the option to see the CG background composited in if there was a greenscreen replacement. But, at the very least, they could see the fully digital characters composited into the live-action frame in real-time right as they were shooting. We used models and other production assets, like the CG character skeleton setup for Hulk and Thanos, that had been previously agreed to by the different VFX vendors and approved by Marvel.”
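To make the workflow Madden outlines more concrete, here is a minimal Python sketch of that per-frame loop: capture the actor’s solve, retarget it onto the character rig, render the character, and composite it over the live-action plate. This is not Profile Studios’ code; every class and function name (Pose, capture_actor_pose, retarget, and so on) is a hypothetical placeholder, and the uniform scale stands in for far more sophisticated per-bone retargeting controls.

```python
# Minimal sketch (not Profile Studios' pipeline) of the per-frame loop the article
# describes: capture a mocap solve, retarget it onto a CG character rig, then
# composite the rendered character over the live-action plate.
# All names here are hypothetical placeholders.

from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class Pose:
    # joint name -> (x, y, z) world-space position; a real solve would also carry rotations
    joints: Dict[str, Tuple[float, float, float]]


def capture_actor_pose(frame: int) -> Pose:
    """Stand-in for the optical capture solve of the actor at a given frame."""
    return Pose(joints={"hips": (0.0, 1.0 + 0.01 * frame, 0.0),
                        "head": (0.0, 1.8 + 0.01 * frame, 0.0)})


def retarget(actor_pose: Pose, scale: float) -> Pose:
    """Map the actor's motion onto a character skeleton with different proportions.

    A naive uniform scale stands in for the retargeting controls the article
    mentions for characters like Hulk and Thanos.
    """
    return Pose(joints={name: (x * scale, y * scale, z * scale)
                        for name, (x, y, z) in actor_pose.joints.items()})


def render_character(pose: Pose) -> str:
    """Stand-in for a real-time render of the CG character at this pose."""
    return f"character_render({len(pose.joints)} joints)"


def composite(live_plate: str, cg_layer: str) -> str:
    """Stand-in for compositing the CG character over the tracked live-action frame."""
    return f"{live_plate} + {cg_layer}"


def realtime_loop(num_frames: int, character_scale: float = 1.6) -> None:
    for frame in range(num_frames):
        live_plate = f"plate_{frame:04d}"            # frame from the production camera
        actor_pose = capture_actor_pose(frame)       # on-set mocap solve
        char_pose = retarget(actor_pose, character_scale)
        cg_layer = render_character(char_pose)
        monitor_b = composite(live_plate, cg_layer)  # what the filmmakers see live
        print(frame, monitor_b)


if __name__ == "__main__":
    realtime_loop(3)
```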

When everything worked as planned, filmmakers viewed the live-action scene through two monitors: one with a clean view of just the actors, the other with a virtual CG composite of digital characters and assets. This real-time production system allowed instant evaluation of the relationship between the two “sets,” enabling immediate adjustments in actor blocking or framing based on a character’s final “size” and position within the scene. Changes critical to the look and flow of a scene, ones that depended on visualizing the final digital character performance, could be made quickly at the time of the original shoot.

In one example, Madden describes a Hulk scene they shot in a diner. “We shot a Smart Hulk scene at the Buckhead Diner in Atlanta. We installed a small capture volume on set with hidden cameras, then captured Ruffalo, tracked the camera, and did the live composite all on location.”

Real-time on-set visualization also helps actors decked out in motion-capture suits and cameras feel like they’re really tied into the physical set, interacting with other actors, in a way that makes the final action feel much more real and convincing. “Trying to create such complicated scenes involving many characters, using separate processes, would have been prohibitively time consuming, and the ‘disconnect’ the actors might have felt between their performance and the ‘world’ of the film could easily have come through in their performance,” Madden notes. “For so many acting subtleties, the timing cues that make or break a performance can only be fully captured when the actors are shot together. That connection is lost when everyone is filmed individually by a team on a motion-capture stage.”

Through their efforts, Profile Studios’ innovative marriage of real-time rendering, performance capture and CG compositing helped eliminate some of the problems many similarly CG-intensive films suffer from, where digital characters and elements feel arbitrarily “pushed” into the frame and bolted together with live-action elements; once composited into a final film, these disparate pieces often lack sufficient emotional or visual connection to the action at hand. With so many CG elements being worked on by various vendors, all integrated together in editorial for the first time during post-production, it’s always difficult for filmmakers to blend everything together into a coherent, believable story. Visualizing some of that final digital character integration early on, while shooting the underlying live-action performances, gives filmmakers an enormously valuable creative tool.

But their on-set efforts were only one part of the project. Their post-production pipeline team of 10 artists was equally busy, working on shot cleanup until early 2019. “During production, our primary goal was real-time visualization of characters and backgrounds that couldn’t be done physically,” Madden shares. “Of course, there’s a post editing session, and cleanup, that’s done on top of that.”

“Once the shooting was done, we would get a shot turnover from editorial,” Madden describes. “We would essentially clean that up for any minor flaws in the motion, basically re-process and re-track the performance, then re-target the motion from the actors back onto the characters, which allowed us to fine-tune the performance. We had re-targeting controls to make sure that the motion of Thanos and Hulk was being projected from the actor onto the character as accurately as possible, while still allowing for adjustments to account for physical differences. We turned those rendered shots back over to Marvel editorial; they distributed the updated motion of all the characters to the respective VFX vendors like Weta.”
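As a rough illustration of the post pass Madden describes, the hypothetical sketch below takes one animation channel from a shot turnover, filters out minor flaws, and re-targets the cleaned motion onto the character before handing it back. The moving-average filter and all function names are assumptions made for illustration, not the studio’s actual tools.

```python
# Hypothetical sketch of the post-production pass described above: take a channel
# from a shot turnover, clean minor noise out of the captured motion, then
# re-target it onto the character rig for return to editorial.
# The smoothing approach and function names are illustrative assumptions.

from typing import List


def smooth(samples: List[float], window: int = 5) -> List[float]:
    """Simple moving-average filter standing in for cleanup of one motion channel."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out


def retarget_channel(samples: List[float], scale: float) -> List[float]:
    """Placeholder for projecting the cleaned actor motion onto the character rig."""
    return [s * scale for s in samples]


def process_turnover(raw_channel: List[float], character_scale: float) -> List[float]:
    cleaned = smooth(raw_channel)                      # remove minor flaws / jitter
    return retarget_channel(cleaned, character_scale)  # fine-tuned character motion


if __name__ == "__main__":
    noisy = [0.0, 0.1, 0.05, 0.2, 0.15, 0.3]  # one animation channel from the turnover
    print(process_turnover(noisy, character_scale=1.6))
```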

With such an immense and complicated production, Madden’s biggest challenge was navigating the sheer enormity of the live-action on-set experience. “Many of the film sets were very large, and when you’re on first unit, of course, the filmmakers need to have the creative freedom to do whatever it is they want to do,” he explains. “So, you never want them to be bound by the limitations of any technology. Our job is to roll with the changes, and make sure that we have a team that, one, can foresee issues before they come up, and two, respond very quickly as soon as they sense that a change is required. For example, take something that would seem simple, like the blocking of an actor. When it’s a live capture, there’s no smoke and mirrors. You have to be able to see what the actor is doing so you can accurately re-create that motion in real-time. If the tracking or solving is causing problems, it can become a distraction to the point where they may ask to have it turned off, because they’re focused on the performance. And, if there’s noise in the data, then in some cases that can do more harm than good.”

“So, it was paramount that we didn’t have noise in the motion, that the motion came out relatively clean in real-time so that it allowed the creative team to focus on and evaluate the performance,” he continues. “You’ve got a dozen different departments on set, and each department has a goal, and sometimes those goals can conflict in terms of the demands, whether it’s the grip department, set decoration, or lighting. All those present challenges to other departments. But we all have to work together.”

Madden stresses that real-time visualization gives other departments a better appreciation of what the performance capture and digital VFX work is actually achieving. “One of the benefits of seeing everything in real-time is that the other departments can appreciate what we’re doing, and how their work affects our work. If it was just all bluescreen, greenscreen, dots and tracking markers, they can’t make that connection to this process, and our role on set. But, when you can see the result live, you immediately gain a sense of how our collective work contributes to the product on stage. It helps us establish a kind of non-verbal communication, where they see us doing our drill amidst the controlled chaos of it all, prepping for the next setup, and intuitively they understand what it is we’re trying to accomplish. So, they may move a flag or a light a little one way or another without us even having to ask, because they understand what our goal is, generally, and want to help create this on-set solution.”

From a technology standpoint, Profile Studios relies upon a production pipeline they’ve integrated using both off-the-shelf and proprietary software systems. According to Madden, “It’s certainly a mix for sure. We have core technology, like for capture and rendering, that’s not proprietary. But, we have proprietary layers on top of all that to make everything communicate in a certain way. So, there’s camera tracking hardware, and our own tracking technology we integrated with commercial tracking technology. The commercial software provides the ability to communicate with it, so that if we want to add a separate tracking layer, for example, we have that ability to make the process more robust.”
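One way to picture “adding a separate tracking layer” is a small arbitration step that prefers whichever camera-tracking source is currently more confident and falls back when one drops out. The sketch below is purely illustrative; the CameraSample structure, the confidence field, and the blending rule are assumptions, not Profile’s proprietary layer.

```python
# Illustrative only: layering a secondary tracking source on top of a primary
# (commercial) camera tracker so the solve stays usable if either source degrades.
# The interfaces and the selection rule are assumptions.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class CameraSample:
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float]  # e.g. Euler angles in degrees
    confidence: float                        # 0.0 (lost) to 1.0 (locked)


def blend(primary: Optional[CameraSample],
          secondary: Optional[CameraSample]) -> Optional[CameraSample]:
    """Prefer whichever source is more confident; fall back if one drops out."""
    if primary is None:
        return secondary
    if secondary is None:
        return primary
    return primary if primary.confidence >= secondary.confidence else secondary


if __name__ == "__main__":
    optical = CameraSample((1.0, 1.5, -3.0), (0.0, 45.0, 0.0), confidence=0.4)   # partially occluded
    inertial = CameraSample((1.02, 1.5, -2.98), (0.0, 44.5, 0.0), confidence=0.9)
    print(blend(optical, inertial))
```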

“We need to look at more than just one way to track,” he adds. “It means resorting to different sources of feedback to determine the position and orientation of the camera, because there’s no one on-set tool that works in every condition. It’s important for us to have multiple resources to determine where that camera is at all times. And there’s software we’ve written to work with other hardware on set. For example, the lens encoders that production and the camera operators use; we’ve written software to transfer that data to our system and integrate it with the camera position tracking data, so we have all the information coming off the camera in sync in real-time.”
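A hedged sketch of what keeping lens and tracking data “in sync in real-time” might look like: each tracked camera sample is paired with the lens encoder sample closest to it on a shared clock, so position, orientation, focus and zoom travel together per frame. The field names and the nearest-sample matching here are illustrative assumptions rather than the actual on-set software.

```python
# Hypothetical sketch of merging lens encoder data (focus/zoom) with camera
# position tracking by timestamp, so both arrive together each frame.
# Field names and nearest-sample matching are illustrative assumptions.

from dataclasses import dataclass
from typing import Iterator, List


@dataclass
class LensSample:
    timestamp: float   # seconds on a shared clock (e.g. derived from timecode)
    focus_mm: float
    zoom_mm: float


@dataclass
class TrackSample:
    timestamp: float
    position: tuple
    orientation: tuple


def nearest_lens(track: TrackSample, lens_stream: List[LensSample]) -> LensSample:
    """Pick the lens sample closest in time to the tracked camera sample."""
    return min(lens_stream, key=lambda s: abs(s.timestamp - track.timestamp))


def merge(track_stream: List[TrackSample], lens_stream: List[LensSample]) -> Iterator[dict]:
    """Yield per-frame packets carrying position, orientation and lens state together."""
    for track in track_stream:
        lens = nearest_lens(track, lens_stream)
        yield {
            "timestamp": track.timestamp,
            "position": track.position,
            "orientation": track.orientation,
            "focus_mm": lens.focus_mm,
            "zoom_mm": lens.zoom_mm,
        }


if __name__ == "__main__":
    tracks = [TrackSample(0.000, (0.0, 1.5, -3.0), (0, 45, 0)),
              TrackSample(0.042, (0.1, 1.5, -3.0), (0, 46, 0))]
    lenses = [LensSample(0.001, focus_mm=1200.0, zoom_mm=35.0),
              LensSample(0.043, focus_mm=1180.0, zoom_mm=35.0)]
    for packet in merge(tracks, lenses):
        print(packet)
```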

Looking back on the project, Madden reflects on how much he appreciated the experience. “Working with the Marvel staff is as good as it gets,” he concludes. “Everyone was a true professional. And overall, whether it was the Marvel staff or a vendor like us, we all had the same goal, and the continuity between everyone was very special. Marvel understood what was involved in everything they were asking of us, which helped with communication. They understood fundamentally what we were required to do, so that helped tremendously with planning, troubleshooting, and getting ahead of problems. You don’t usually get that type of support from production partners who aren’t experienced in these areas.”

By: Dan Sarto