Here’s what the future of previsualization looks like

Recently, we covered how virtual production and its real-time capabilities hold huge potential for filmmakers and directors everywhere, with Jon Favreau's 2019 feature film The Lion King serving as an undeniable success story.

It also shows how virtual production is turning traditional filmmaking on its head and encouraging wider creative collaboration earlier in the production pipeline. Previsualization, on-set direction and post-production no longer have to exist as mutually exclusive stages if the director of a film can access digital visuals and environments before and during shooting.

It’s this immediacy of feedback and holistic approach to cinematography and filmmaking which underpin previsualization tool Cine Tracer. Part game, part application, Cine Tracer is powered by Unreal Engine to provide a virtual 3D space in which users can block scenes and actors, work out where lights should go, place cameras, frame shots, and more.

It handles like a video game, complete with a user-controlled avatar which traverses and explores the 3D scene. Press ‘Y’, however, and your character will disappear, replaced by ‘edit’ mode, and here’s where the magic really happens. In this mode, users can play with virtual versions of the industry-standard equipment used in real-world studios and sets to visualize how a scene is going to look, before deftly capturing shots to storyboards. Color Jon Favreau impressed.

[Image: Screengrab of Cine Tracer showing avatar]

Setting the scene

Matt Workman, the mind behind Cine Tracer, is an established cinematographer on the one hand and an ‘accidental game developer’ on the other. Before embarking on the Cine Tracer project, Matt worked as a Director of Photography in New York, shooting commercials and music videos.

On realising that he enjoyed the planning and previs stage more than the shoots themselves, but frustrated by the steep learning curve of the tools that served those stages, Matt set about building his own for ‘non-3D people’. The seeds of Cine Tracer were sown; in partnership with Unreal Engine in 2018, Cine Tracer came to life, published by Cinematography Database and pitched as ‘Fortnite for filmmakers’.

[Image: Screengrab of Cine Tracer showing light placement]

It’s easy to see why: the principles are similar. Users build a set, place virtual cameras, and position character models for a realistic simulation of a real-world set, before taking a picture and sending it to a storyboard for tangible, immediate feedback.
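The build-place-capture workflow described above can be sketched in code. This is a minimal, hypothetical data model of that loop, not Cine Tracer's actual implementation; all class and field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the previs loop: place a camera and characters,
# then capture the current blocking as a storyboard frame.
# These names are illustrative and do not come from Cine Tracer itself.

@dataclass
class Camera:
    position: tuple            # (x, y, z) in scene units
    focal_length_mm: float = 35.0

@dataclass
class Character:
    name: str
    position: tuple

@dataclass
class Scene:
    cameras: list = field(default_factory=list)
    characters: list = field(default_factory=list)
    storyboard: list = field(default_factory=list)

    def capture(self, camera_index: int) -> dict:
        """Snapshot the current camera and blocking into the storyboard."""
        cam = self.cameras[camera_index]
        frame = {
            "camera": cam,
            "blocking": [(c.name, c.position) for c in self.characters],
        }
        self.storyboard.append(frame)
        return frame

scene = Scene()
scene.cameras.append(Camera(position=(0, 0, 1.6), focal_length_mm=50.0))
scene.characters.append(Character("Actor A", position=(2, 1, 0)))
frame = scene.capture(0)
```

Each captured frame records which camera was used and where the actors stood, which is the "tangible and immediate feedback" the storyboard provides.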

What sets Cine Tracer apart from video game status is the level of accuracy and pragmatism that it brings to its virtual sets. Users working in it can expect a sophisticated world and a high level of intelligence that simulates real life. Cameras and lights can’t just be placed anywhere; instead, they adhere to the same laws of physics that we mere mortals abide by, so as to simulate the traditional experience that filmmakers and cinematographers are used to. In contrast to Fortnite’s fantastical nature, Cine Tracer is necessarily grounded in reality.

The cherry on Cine Tracer’s cake is its inclusion of cutting-edge tech. Users exploring lighting and aperture can expect real-time ray tracing powered by Nvidia RTX, which, when turned on, ‘feels amazing’, as Matt notes in Epic Games’ Visual Disruptors podcast.

[Image: Gif of breakdown showing light placement in Cine Tracer]

Getting into game engines

Cine Tracer’s rising popularity, coupled with its promise to make previs simpler, comes amidst a high-level trend across production teams to make creative collaboration faster, easier and more efficient.

Its use of Unreal Engine isn’t a coincidence. Game engines, and the real-time tech they bring with them, are continually proving effective at driving a faster creative experience through immediate visual feedback during pre-production and on-set visualization. Cine Tracer provides a perfect example of where the benefits of real time can be felt wholesale.

The efficiency and ROI gains for artists and studios looking to invest in real-time technology are undeniable. Using tools like Cine Tracer to get closer to a final image on set means that these same studios can produce more content, with potentially less expense—especially important in the streaming content wars, where timings and budgets are tight and demands are high.

But whilst game engines and real-time tech allow detailed changes to be made instantaneously, there still needs to be a balance between immediate, real-time feedback and the quality of work—i.e. full, high-quality renders—achieved through post-production.

Tracing a path to the future

So—what do tools like Cine Tracer, and the exciting technology that makes up their DNA, mean for the VFX industry?

We predict a change to linear pipelines in VFX and animation, and see this as a wholly positive thing for the industry. It means that more VFX expertise will be required earlier on, in pre-planning, previs work and on set, to uphold post-production quality and standards.

Teams and departments will work in tandem earlier in the production pipeline, with a renewed appreciation and recognition of VFX roles in these productions and beyond.

To support this happy ideal, Foundry is investigating how the rich data generated on set, including in real-time virtual production tools like Cine Tracer, can be leveraged to accelerate post-production workflows. Our efforts in this area are part of our commitment to ensuring our tools offer the best possible real-time feedback, giving VFX artists the information they need to make efficient and consistent choices in sequence-based contexts.

Source: Foundry