How Real-Time Technology is Making Better Movies

by Marc Petit

Article by guest blogger Marc Petit. As general manager of Epic Games’ Unreal Engine, Marc oversees the growth of Unreal Engine into all markets. Between 2002 and 2013, Marc ran Autodesk’s Media and Entertainment business unit, steering the development and marketing of 3D animation and visual effects software products. During the 1990s he worked at Softimage, where he oversaw the design and development of Softimage XSI.

Virtual production can encompass many different aspects of filmmaking – from previsualization to techvis to full production via performance capture – all of which ultimately serve the same goal of enabling directors to achieve their vision for a film more effectively. By bringing digital content creation tools into the traditional filmmaking process much earlier, directors can leverage technology to better inform their storytelling.

Though virtual production has existed for years, it is now rapidly gaining traction for films both large and small – and even in new areas such as prototyping content to get a studio green light – thanks to advancements in real-time technology. Now, even on CG-heavy films, a director can realistically construct an entire scene and instantly iterate on different creative choices. A cinematographer can run through various lighting choices interactively. An entire film can be visualized and validated earlier in the production process, which ultimately saves time, money, and headaches on set and in post.

Halon Entertainment previs and postvis supervisor Ryan McCoy explores a virtual space using Nurulize. Photograph courtesy of Halon Entertainment.

Virtual production typically begins with previs, where artists work with directors to bring their vision to the screen prior to production. Previs can help immensely with creative decisions such as character animation, set design, and lighting, as well as storytelling beats. With real-time technology, directors can use virtual cameras to bring CG environments or characters to life and interactively iterate on colors, size, framing, lighting and more until they are happy with the result. They can then plan out entire shots and sequences with those approved elements in mind. Augmented and virtual reality have enhanced this process even further, letting directors visualize their CG elements within a 3D space and even conduct virtual location scouting with their teams.
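To make the virtual camera idea concrete, here is a minimal plain-Python sketch of the core mechanic: a tracked device pose driving a CG camera whose lens can be swapped instantly. Every name here is hypothetical and for illustration only; production tools build this inside an engine such as Unreal rather than in standalone code.

```python
"""A virtual camera in miniature: a tracked device pose drives a CG camera.
All names are hypothetical, for illustration only."""

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in meters, world space
    rotation: tuple  # (pitch, yaw, roll) in degrees

@dataclass
class VirtualCamera:
    focal_length_mm: float = 35.0
    pose: Pose = None

    def update_from_tracker(self, tracker_pose: Pose, world_scale: float = 1.0) -> None:
        """Map the handheld tracker into the CG world. A world_scale below 1.0
        lets the operator treat the stage as a scaled-down version of the set."""
        x, y, z = tracker_pose.position
        self.pose = Pose((x * world_scale, y * world_scale, z * world_scale),
                         tracker_pose.rotation)

# Per frame: read the tracker and update the CG camera; the director can
# reframe or swap lenses instantly, with no physical re-rig.
cam = VirtualCamera()
cam.update_from_tracker(Pose((1.2, 1.6, -3.0), (0.0, 15.0, 0.0)), world_scale=0.1)
cam.focal_length_mm = 50.0  # instant "lens swap"
print(cam)
```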

“Real-time tools like game engines are allowing us to do more up-front,” observed Ryan McCoy, previs and postvis supervisor at Halon Entertainment. “You can actually be in a VR space and scout your environment before you’ve even been there – an environment that doesn’t even exist. You can experience that and understand where your cameras go.”

Hand in hand with previs is techvis – the process of technically planning out how to achieve shots in production. The techvis process can determine which specific cameras, lenses, cranes and other equipment are needed for which shots, plan out camera movements and motion control for complex sequences, and even identify the most efficient order for shooting based on when and where different equipment will be used. By helping to nail down creative and technical decisions in advance, previs and techvis can ultimately drive significant cost and time savings.
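As a small example of the kind of math techvis automates, the sketch below computes a lens’s horizontal field of view and its coverage at a given subject distance. The field-of-view formula is standard optics; the camera package numbers are invented for illustration.

```python
"""Toy techvis lens check: does a given lens cover a set piece from a given
camera position? Formula is standard optics; the numbers are made up."""

import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view: 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def frame_width_at(distance_m: float, fov_deg: float) -> float:
    """Width of the area framed at a given subject distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Example: a 24 mm lens on a Super 35 sensor (~24.9 mm wide), subject at 5 m.
fov = horizontal_fov_deg(24.0, 24.9)
print(f"FOV {fov:.1f} deg, coverage at 5 m: {frame_width_at(5.0, fov):.2f} m")
# -> FOV 54.8 deg, coverage at 5 m: 5.19 m (too tight for a 6 m set piece)
```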

Welcome to Marwen

Real-time technology has been a particular game-changer for lighting during the previs stage. Until recently, it was impossible to work interactively with different lighting options. Now, artists are able to leverage incredible advancements in tools like Epic’s Unreal Engine to plan real-world lighting before they ever hit the set. This was crucial for the team led by visual effects supervisor Kevin Baillie at Atomic Fiction (now part of Method Studios) when they were working on Welcome to Marwen, directed by Robert Zemeckis. In order to bring live action performances into the CG world of Marwen, Baillie, Zemeckis, and director of photography C. Kim Miles relied on virtual production to ensure that lighting and camera composition decisions made during filming would translate to the final film.

“This was a very non-traditional motion capture process,” Kevin Baillie explained. “We were mocapping the actors and the cameras, and we also had to light the actors as if we were shooting a normal live action movie. The dolls in the movie are totally digital except for their faces – which we brought to life using projections of live action footage – so the lighting on the mocap stage had to be spot-on. That meant we had to design all of the lighting before we ever came to the mocap stage. To accomplish that, we had an iPad control system that allowed our director of photography to use very simple controls to dial sun height and direction, how much fill light there was, and tweak all kinds of custom ‘set lights.’ He was actually able to go through and pre-light the entire Marwen section of this movie before we ever filmed a single frame of it. Later, during production, that allowed us to walk away from the motion capture stage having everything we needed – we had cameras, we had performances, we had the actors’ faces, and everything was lit perfectly, and we knew that our compositions were going to work at the end of the day.”
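Conceptually, a control like the one Baillie describes boils down to mapping DP-friendly sliders onto a CG sun light. The sketch below, with hypothetical names, converts elevation and azimuth into a light direction vector; the actual Marwen iPad rig was a custom tool whose internals have not been published.

```python
"""From a DP's sliders to a CG sun: convert 'sun height' (elevation) and
'direction' (azimuth) into a direction vector for a directional light.
Conceptual sketch with hypothetical names, not the actual Marwen tool."""

import math

def sun_direction(elevation_deg: float, azimuth_deg: float) -> tuple:
    """Unit vector pointing from the sun toward the scene.
    elevation: 0 = horizon, 90 = overhead; azimuth: 0 = north, 90 = east."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    # Sun position on the sky dome...
    x = math.cos(el) * math.sin(az)
    y = math.cos(el) * math.cos(az)
    z = math.sin(el)
    # ...and the light travels the opposite way.
    return (-x, -y, -z)

# Two sliders on the tablet, one light update per frame in the engine.
print(sun_direction(elevation_deg=35.0, azimuth_deg=220.0))  # late-afternoon key
```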

Framestore used virtual production tools to bring Winnie the Pooh and other classic A.A. Milne characters to life in “Christopher Robin.” Photograph copyright © 2018 The Walt Disney Company and courtesy of Framestore.

On set during production, tools like Simulcam can come into play to integrate CG elements with live-action footage. In virtual production – just like in previs – real-time technology is allowing filmmakers to visualize increasingly complex sequences right there on set, with greater fidelity and speed than was previously possible. Recent feature films in which real-time tools drove on-set virtual production include Blade Runner 2049 and Christopher Robin, both of which drew on expertise from Framestore, which has been integrating Unreal Engine into its virtual production workflow for the past year.

Richard Graham, capture supervisor for Framestore’s motion capture stage, explained that to create crowd animation for a scene in the Blade Runner 2049 trash mesa sequence, his team in London shot mocap and rendered the data in real time in Unreal. The visual effects team, watching live from the review suite in Montreal, got a good representation of the final shot without having to fly across the Atlantic. On Christopher Robin, Graham’s team was asked to create CG cameras to match a scene that had already been shot on location. Tapping into Unreal’s Alembic support, Graham brought in animation from the film pipeline, allowing the director of photography and animation supervisor to shoot matching cameras on that animation.
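For readers curious how such a handoff can look in practice, here is a sketch of batch-importing an Alembic cache through Unreal’s editor Python scripting, using API names from the UE4-era `unreal` module (`AssetImportTask`, `AbcImportSettings`). The file path and destination folder are invented, and this is a generic illustration under those assumptions, not a description of Framestore’s actual pipeline.

```python
"""Batch-import an Alembic animation cache into Unreal via the editor's
Python API. Runs inside the Unreal Editor (UE4-era API names); the path and
destination are made up for illustration."""

import unreal

def import_alembic_cache(abc_path: str, destination: str = "/Game/Previs/Anim"):
    options = unreal.AbcImportSettings()
    # Geometry caches play back the baked vertex animation exactly as authored.
    options.import_type = unreal.AlembicImportType.GEOMETRY_CACHE

    task = unreal.AssetImportTask()
    task.filename = abc_path
    task.destination_path = destination
    task.options = options
    task.automated = True  # suppress import dialogs for batch use
    task.save = True

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.imported_object_paths

# Hypothetical shot cache exported from the film pipeline:
print(import_alembic_cache("D:/shots/seq010_anim.abc"))
```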

"Christopher Robin" photograph copyright © 2018 The Walt Disney Company and courtesy of Framestore.

“Christopher Robin” photograph copyright © 2018 The Walt Disney Company and courtesy of Framestore.

“The main benefit of virtual production is giving filmmakers a way to make smart creative decisions more quickly,” said Richard Graham. “For a director, it’s about them being able to see a version of their movie that’s closer to what they see in their head. I think the thing that’s most exciting in virtual production right now is just the improvement in rendering technology, so that we can get a picture that’s closer to what the final picture will look like much more quickly. Epic is really pushing the area of virtual production now, which is helping us enormously. There are so many new features in the engine now, and frequent releases, making our jobs much more straightforward.”

Beyond creative decisions, these improvements driven by real-time technology are also trickling down to impact visual effects artists and producers. With shots planned more thoroughly ahead of time, and CG elements already fully fleshed out during previs, visual effects producers can create much more detailed bids and budgets. Additionally, the ability to accurately visualize a scene’s CG elements during production allows creatives to make adjustments as needed on set, making sure things like composition, framing, and lighting are working as imagined. This all helps eliminate wasted time and bottlenecks for visual effects artists, who now have a clearer mandate earlier in the process.

Glenn Derry, vice president of visual effects at Fox VFX Lab, elaborated on the benefits for visual effects artists: “The idea is that whatever we’re looking at on the front end is very easily translatable to what the guys on the back end are doing. They don’t have to worry about figuring out what the shot is. We did all that work with the director up-front. Here’s the shot. This is it. Now you can spend all your focus on making it the best version of that shot possible, rather than trying to invent things and get approvals, because it’s already approved.”

Nvidia real-time technology captures human movement and transfers it to a fully rendered CG model. Photograph courtesy of Halon Entertainment.

The industry is now approaching a tipping point where real-time technology will produce final pixel quality on the front end. The creative, financial, and efficiency benefits will only continue to grow from there.

“Our quality of previs has evolved over the years so much to the point where some directors now are like, ‘Wow, this is looking like a final piece of production,’” observed Brad Alexander, previs supervisor at Halon. “It’s getting there; we’re starting to hit a bleeding edge of the quality matching to a final of an animated feature.”

Ryan McCoy added, “Often we’ll get assets from the finals vendors – these beautiful, high-quality assets with all these texture maps and things that we could never normally use in previs. But now we’re able to get those and put them straight into Unreal and make it almost look as good as it does in the finished piece. Previs is getting closer and closer to final-level quality every year with the new advancements in the technology and the software.”

Framestore’s chief creative officer Tim Webber excitedly calls virtual production “an opportunity to redesign the whole process of filmmaking.” Once final pixel quality can be achieved in real time, the effects on filmmaking are sure to be quite revolutionary.

Read the complete behind-the-scenes story on “Welcome to Marwen” in Cinefex 162.

Special thanks to Karen Raz.
