Sony Pictures Imageworks Takes A Bite Out of ‘The Meg’
SPI visual effects supervisor Sue Rowe and her team use simulation tools and a lot of bubbles to tackle a dynamic, action-packed third act in Warner Bros.’ underwater thriller.
For Sony Pictures Imageworks VFX supervisor Sue Rowe, an industry veteran who moved to the Vancouver studio in 2016, The Meg represented an opportunity to work in the one visual medium she hadn't had much previous experience with: water.
“I had actually never done underwater,” she recounts. “I have done above water, I’ve blown up the moon in Independence Day: Resurgence. I’ve done CG soft characters, fat underbelly characters, but I was like, ‘Oh, I’ve never done underwater. Can I just do a show which is similar to what I’ve done before?’ But with the tech team at Sony, I couldn’t have felt more comfortable going to them and saying, ‘I’ve never done this before.’ They told me, ‘Well, we’ve never done exactly this before either. But, you know what? We’ve got some really smart people who will figure it out.’ And, they just hunkered down and showed me pass upon pass of CG water until we were satisfied.”
The project began with a call from VFX producer Steve Garrad about a film whose third act was still in flux and would require considerable shot work, much of which ultimately wouldn't make it into the final cut.
“I knew Steve from way back,” says Rowe. “We used to be at Cinesite together. So, he called me in late 2016 and said, ‘I’ve got this brilliant project for you.’ He knew it was going to be really big. He knew that decisions were going to be made fairly late. I really appreciated his honesty. He told me, ‘The third act is still being put together. I’m going to turn over 400 shots, of which maybe 200 are going to be in the movie. But I’m going to turn over a lot of shots, and I want Sony to do some very quick slap-togethers for me so we can get that to editorial and get it out,’ knowing the pivotal third act was going to change. It was a smart decision coming to a company that has the resources to churn through and turn around a large amount of work very quickly. Hearing this strategy upfront meant we could plan for the work volume properly.”
One of the film’s three main VFX vendors, Rowe’s team was able to share certain assets such as the gliders, made by DNEG, which was handling the first act, and the Meg itself, made by Scanline, which was handling the second act and had been on the project for a number of years. “Everyone shares really beautifully nowadays,” she notes. “We took assets made by DNEG and Scanline and made them our own.”
In addition to quickly turning around hundreds of shots, many of which she knew up front were bound for the cutting room floor, Rowe also faced the challenge of a third act taking place almost exclusively underwater.
“The third act was pretty much underwater,” she explains. “We had about 400 shots turned over to us and in the end, I think actually about 280 ended up in the movie. We used Maya and Viewport 2.0 and were able to turn stuff around very quickly by just putting our stuff through matchmove and some very basic rotoscoping. Very quick stuff where we can slot things into a background, color it blue, put some fog depth into it, some basic underwater views and a fast turnaround. It wasn’t going through huge teams of people.”
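The quick underwater treatment Rowe describes, slotting footage into a background, tinting it blue and adding depth fog, can be illustrated with a minimal sketch. The snippet below is not Imageworks' pipeline; the function name, fog color and extinction value are assumptions, and it simply blends a plate toward a murky blue using a matching depth pass.

```python
# A minimal sketch (not Imageworks' actual setup) of a quick underwater comp:
# tint a plate blue-green and add exponential depth fog from a depth pass.
import numpy as np

def underwater_comp(plate_rgb, depth_m,
                    fog_color=(0.05, 0.25, 0.35),   # murky blue-green, assumed
                    extinction=0.08):               # falloff per metre, assumed
    """Blend each pixel toward the fog colour based on scene depth."""
    fog = np.asarray(fog_color, dtype=np.float32)
    # Beer-Lambert style falloff: farther pixels take on more of the fog colour.
    transmittance = np.exp(-extinction * depth_m)[..., None]
    return plate_rgb * transmittance + fog * (1.0 - transmittance)

# Example: a 1080p plate with a matching depth pass in metres.
plate = np.random.rand(1080, 1920, 3).astype(np.float32)
depth = np.full((1080, 1920), 20.0, dtype=np.float32)
out = underwater_comp(plate, depth)
```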
In this case, the speedy schedule was crucial to the development of the film's final act rather than just the result of a VFX production crush that had to be accommodated. "Our final shot count doesn't reflect the amount of work that we did on the film, but there you go," she continues. "That was how we did it: quick and dirty, then from there, build up more and more until the editors and director got on board."
To compensate for the quick turnaround and lower-resolution output, Rowe decided to use as many live-action plates as possible in their early work, which gave the director a more accurate idea of what the finished shot might look like. Though they worked with a considerable amount of more polished previs, the director and editorial team had to rely on a large amount of Rowe's rougher-looking animation to get their story finalized.
"The previs gave us a really brilliant start," Rowe says, "But our problem on the VFX facility side is it takes a really long time for the director to get comfortable enough to say, 'OK, I'm not going to use the previs to tell my story.' They don't want to use the rough looking animation that we would normally do early on. So, the smart thing we did was use actual footage where they shot Jason [Statham] reacting and basic blocking of the Meg, where you could tell it was underwater. It wasn't quite as polished as the previs, but it worked extremely well."
The film's third act centers on Statham and his crew maneuvering underwater in a game of cat and mouse, trying to capture and kill the massive 75-foot shark. "They travel in this huge evac ship and then drop their gliders, which are little mini-submarines," Rowe describes. "We've got Suyin [played by Bingbing Li], who's the female protagonist, Jonas [played by Statham], the male protagonist, in their little gliders that look like UFOs that in many cases are completely CG."
She continues, “The actors were in a motion base for the gliders that was built on set, that could rotate and turn in five different directions. It was great for the needed physicality. Jason was throwing himself around, and if we wanted it to look like he was coming towards us, we flew the camera past him. All fairly straightforward visual effects things, but it took a little bit of planning.”
An errant missile aimed at the Meg misses its mark, setting off an underwater explosion created by Rowe’s team that is completely different from the way movies usually display such an effect. “They have a missile they’re going to fire at the shark to kill it,” she says. “That’s the plan, but something goes wrong! It fires, just misses, but produces this amazing underwater explosion. Normally in VFX movies you see an underwater explosion and then you cut to an explosion shooting up above the water, like from a depth charge. But, we did it all underwater because it had to be Jason and Suyin’s POV. We created a storm of water coming towards the camera like they’re getting hit by a huge wave. The frame fills with bubbles and particulate.”
In the film's climactic clash, the shark attacks the glider, slicing itself open on some sharp metal while Statham stabs it with a shard from the damaged craft. There's a lot of thrashing about, the water gets messy, and the Meg breaches. "The Meg chomps down on the glider — there are multiple hits of the teeth smashing this thick, six-inch glass and crushing the glider," Rowe describes. "And then the ultimate moment…there's Jason and the Meg, and he stabs the Meg, the Meg kind of rears up, and I mean it's classic, he grabs a sharp part of the glider, stabs the Meg in the eye, the Meg rears up out of the water, falls backwards. Meanwhile, the Meg has been gutted by sliding across the damaged side of the glider, there's blood flow, there's water everywhere. We used a CG digi-double at the scene's beginning and the end, but the moment where Jason actually does the stabbing, that's all him. We built it back on set so he could do the physicality of the action."
It was during a live shoot that director Jon Turteltaub came up with one of the movie's iconic scenes. According to Rowe, "While we were shooting in New Zealand, Jon came up with this great idea where the Meg actually breaches the water. I go over and talk to him. He shows me this footage of a whale that actually comes out of the water and breaches. If you've ever met Jon Turteltaub, he's hilarious. And all he says is, 'There you go, there's your shot.' Okay, yup, we can do that."
Rowe's team put together a test shot that looked so real, it had the director convinced it was actual shark footage. "In the beginning, to get Jon's confidence, we did a test," she adds. "I exported the breach footage he gave me, showed it to the guys in the office, who got a great white model and quickly mimicked the whale footage in a more stylized CG breach. On my next trip back to New Zealand, I showed it to him and he was just like, 'Oh, yeah…but that's real though, right?' I said, 'No, it's all CG.' I showed him both shots side by side. After that, he never questioned that we could get him what he needed."
One of the biggest challenges for Rowe’s team was something that at first seemed almost insignificant: bubbles. “Who knew that bubbles would be a thing?” Rowe asks. “Oh my god, I’ve learned way too many things about engineering, submersibles and sharks on this film. What propels the gliders is cavitation. That’s the propellers rotating at such a speed they actually boil the water and that’s what propels you forward. That became the word of the project, cavitation, cavitation, cavitation! The first time I heard it, I had to look it up. But I said, ‘Yes, yes we can do cavitation.’”
She continues, "We did a ton of tests about how cavitation should actually work and showed the director. Everyone was happy. The thing was, when we actually put footage together and you're looking at Jason inside this submersible's big glass dome and he's got to deliver lines, the cavitation shoots straight out the back. And it looks much better when the bubbles rise up. It felt like you were really inside a submersible. So, I had that typical conversation with my team, which was, 'I know what we've done is correct, I know that's what would have happened, but hey, we need to change it. It doesn't look cool enough.' And god bless them, they've done all this before and they're, like, 'Right, OK, if I change this parameter and that parameter…' and we showed the director five different types of cavitation. After a number of weeks, he said, 'This one,' and we re-output everything right towards the end with that kind of bubble look."
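The "change this parameter and that parameter" process Rowe describes amounts to running the same bubble emitter with a handful of different settings and letting the director compare the looks. The sketch below is a hedged, simplified stand-in for that workflow, not Imageworks' actual solver; the variant names, thrust and buoyancy values are all invented for illustration.

```python
# A hypothetical parameter sweep for a cavitation-style bubble burst:
# the same emitter integrated with different backward thrust vs. buoyancy.
import numpy as np

VARIANTS = {
    "straight_back": {"thrust": (-6.0, 0.0, 0.0), "buoyancy": 0.2, "spread": 0.05},
    "rising_plume":  {"thrust": (-2.0, 0.0, 0.0), "buoyancy": 1.5, "spread": 0.15},
    "boiling_dome":  {"thrust": (-0.5, 0.0, 0.0), "buoyancy": 0.8, "spread": 0.40},
}

def simulate_bubbles(params, count=500, dt=1.0 / 24.0, frames=48, seed=0):
    """Integrate a burst of bubbles under backward thrust plus upward buoyancy."""
    rng = np.random.default_rng(seed)
    pos = rng.normal(0.0, params["spread"], size=(count, 3))   # emission jitter
    vel = np.zeros((count, 3))
    accel = np.asarray(params["thrust"]) + np.array([0.0, params["buoyancy"], 0.0])
    for _ in range(frames):
        vel += accel * dt
        pos += vel * dt
    return pos

# One point cloud per look, ready to render and show side by side.
looks = {name: simulate_bubbles(p) for name, p in VARIANTS.items()}
```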
Bubbles were also crucial in showing the speed of the shark swimming through shots of nothing but water. “Though the shark is 75 feet long, there’s no sense of scale when it’s actually in a body of water,” Rowe explains.
"When you've got a shark coming towards you, in order to make it look like it's underwater, you have to fake in a little bit of water distance. Or you have to fake in the base of the ocean. Once you've done the usual, which is the particulate, the chromatic aberration, little bits in the water, the muscle simulation, all that kind of stuff, how do you get a sense of the shark's speed? The trick that we found is we did little streamers of bubbles that came out of the gills and the nose. It's correct physics when the shark is breaching, but once she'd been under the water for a period of time there was no reason to have those bubbles. But, we put it in every single shot because the bubbles came out and they followed the contour along the muscles of the Meg and it just made you feel like you could see the speed," she continues.
“All these silly little things that you just never think of. It’s how we tell the story in a filmic way. How do you show a body of water? Well turns out you put little tiny bubble streams in it, on top of the kelp and all the things that are moving about. I said, ‘That’s it, put it in every single shot.’ The guys were like, ‘How much?’ I said, ‘Go to 11! Take it to 11! I’ll tell you when there’s not enough!’”
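The speed-streamer trick reads on screen because bubbles emitted at the gills stay roughly where they were spawned (drifting gently upward) while the shark keeps moving, so relative to the body they trail backward along it. The sketch below is a hypothetical illustration of that idea, not production code; the emitter path is a stand-in for animation data and the rise rate is an assumed value.

```python
# A minimal sketch of gill "speed streamers": spawn bubbles at the emitters
# each frame, let older bubbles rise slowly, and the shark's forward motion
# stretches them into trails along the body.
import numpy as np

def streamer_trail(emitter_path, rise_per_frame=0.02):
    """emitter_path: sequence of (n_emitters, 3) world-space gill/nose positions,
    one entry per frame. Returns every live bubble position at the last frame."""
    bubbles = np.empty((0, 3))
    for emitters in emitter_path:
        bubbles[:, 1] += rise_per_frame              # older bubbles drift upward
        bubbles = np.vstack([bubbles, emitters])     # spawn new bubbles at the gills
    return bubbles

# Example: two emitters on a shark swimming forward along +x, one unit per frame.
path = [np.array([[x, 0.0, 0.5], [x, 0.0, -0.5]]) for x in np.linspace(0, 36, 37)]
trail = streamer_trail(path)
```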
Rowe singles out two tools in particular that her team relied upon quite heavily on the show: Sprout, their proprietary Maya-based system for creating large quantities of high-res assets like plants and trees, and Ziva VFX, from Ziva Dynamics, which both Imageworks and Scanline used to create the Meg itself.
"Sprout is an in-house tool that lets you instance objects to make hundreds of them and place them interactively," she says, "It worked really well with my underwater scenes. We built that underwater Sanya Bay environment very quickly and very lightly. We made 15 types of rock, seven types of coral, four types of kelp, sand and grit, layers of that dropped into an environment we built in Houdini. But, I also covered those rocks with different types of coral, so every piece of rock is different because there was a different coral layering in the kelp, large and small. All kinds of things that used to be a problem, now I can just say, 'Kill that, scale that, put one over there.'"
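The core idea behind that kind of set dressing is that each scattered copy is only a lightweight transform pointing at a shared source asset, so thousands of rocks, corals and kelp strands stay cheap to place and edit. The snippet below is a hedged sketch of that instancing pattern, not the actual Sprout API; the asset names, counts and bounds are illustrative.

```python
# A hypothetical instancing scatter: pick from a small asset library and
# record only a transform per copy; geometry is shared at render time.
import random

ASSET_LIBRARY = (["rock_%02d" % i for i in range(15)] +
                 ["coral_%02d" % i for i in range(7)] +
                 ["kelp_%02d" % i for i in range(4)])

def scatter_instances(count, bounds=((-50.0, 50.0), (-50.0, 50.0))):
    """Return lightweight instance records scattered over a ground plane."""
    instances = []
    for _ in range(count):
        instances.append({
            "asset": random.choice(ASSET_LIBRARY),
            "position": (random.uniform(*bounds[0]), 0.0, random.uniform(*bounds[1])),
            "rotation_y": random.uniform(0.0, 360.0),
            "scale": random.uniform(0.5, 2.0),
        })
    return instances

# Dress a seabed with ten thousand varied copies of 26 source assets.
seabed = scatter_instances(10_000)
```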
“Ziva allowed us to build an anatomically correct shark with the right musculature that will deform in the correct manner,” Rowe continues. “It means that the animators could concentrate on getting that character to move through the water, to get the right head shape, to get all that detail into how it was going to move, without having to worry about the secondary animation, like how the body was going to turn. I knew from using it on another show that if we put in the time and effort with the tool upfront, the result afterwards would be fantastic and fast.”
Rowe couldn't have been more complimentary about her talented Imageworks project team as well as the studio's commitment to the technical R&D needed to handle the changing demands of a show like The Meg. "The thing about Sony is that there's something great about working in a company that has no problem applying the needed technical resources to these projects. The enthusiasm you get from a team that is so buzzed about coming up with new ways of doing things is incredibly exciting."