How Real-Time Technology is Making Better Movies

Article by guest blogger Marc Petit. As general manager of Epic Games’ Unreal Engine, Marc oversees the growth of Unreal Engine into all markets. Between 2002 and 2013, Marc ran Autodesk’s Media and Entertainment business unit, steering development and marketing of 3D animation and visual effects software products. During the 1990s he worked at Softimage, where he oversaw the design and development of Softimage XSI.

Virtual production can encompass many different aspects of filmmaking – from previsualization, to techvis, to full production via performance capture – which all ultimately serve the same goal of enabling directors to more effectively achieve their desired vision for a film. By bringing digital content creation tools into the traditional filmmaking process much earlier, directors can leverage technology to better inform their storytelling.

Though virtual production has existed for years, it is now rapidly gaining traction for films both large and small – and even in new areas such as prototyping content to get a studio green light – thanks to advancements in real-time technology. Now, even on CG-heavy films, a director can realistically construct an entire scene and instantly iterate on different creative choices. A cinematographer can run through various lighting choices interactively. An entire film can be visualized and validated earlier in the production process, which ultimately saves time, money, and headaches on set and in post.

Halon Entertainment previs and postvis supervisor Ryan McCoy explores a virtual space using Nurulize. Photograph courtesy of Halon Entertainment.

Virtual production typically begins with previs, where artists work with directors to bring their vision to the screen prior to production. Previs can help immensely with creative decisions such as character animation, set design, and lighting, as well as storytelling beats. With real-time technology, directors can use virtual cameras to bring CG environments or characters to life and interactively iterate on colors, size, framing, lighting and more until they are happy with the result. They can then plan out entire shots and sequences with those approved elements in mind. Augmented and virtual reality have enhanced this process even further, letting directors visualize their CG elements within a 3D space, and even conduct virtual location scouting with their teams.

“Real-time tools like game engines are allowing us to do more up-front,” observed Ryan McCoy, previs and postvis supervisor at Halon Entertainment. “You can actually be in a VR space and scout your environment before you’ve even been there – an environment that doesn’t even exist. You can experience that and understand where your cameras go.”

Hand in hand with previs is techvis – the process of technically planning out how to achieve shots in production. The techvis process can determine which specific cameras, lenses, cranes and other equipment are needed for which shots, plan out camera movements and motion control for complex sequences, and even identify the most efficient order for shooting based on when and where different equipment will be used. By helping to nail down creative and technical decisions in advance, previs and techvis can ultimately drive significant cost and time savings.

Welcome to Marwen

Real-time technology has been a particular game-changer for lighting during the previs stage. Until recently, it was impossible to work interactively with different lighting options. Now, artists are able to leverage incredible advancements in tools like Epic’s Unreal Engine to plan real-world lighting before they ever hit the set. This was crucial for the team led by visual effects supervisor Kevin Baillie at Atomic Fiction (now part of Method Studios) when they were working on Welcome to Marwen, directed by Robert Zemeckis. In order to bring live action performances into the CG world of Marwen, Baillie, Zemeckis, and director of photography C. Kim Miles relied on virtual production to ensure that lighting and camera composition decisions made during filming would translate to the final film.

“This was a very non-traditional motion capture process,” Kevin Baillie explained. “We were mocapping the actors and the cameras, and we also had to light the actors as if we were shooting a normal live action movie. The dolls in the movie are totally digital except for their faces – which we brought to life using projections of live action footage – so the lighting on the mocap stage had to be spot-on. That meant we had to design all of the lighting before we ever came to the mocap stage. To accomplish that, we had an iPad control system that allowed our director of photography to use very simple controls to dial sun height and direction, how much fill light there was, and tweak all kinds of custom ‘set lights.’ He was actually able to go through and pre-light the entire Marwen section of this movie before we ever filmed a single frame of it. Later, during production, that allowed us to walk away from the motion capture stage having everything we needed – we had cameras, we had performances, we had the actors’ faces, and everything was lit perfectly, and we knew that our compositions were going to work at the end of the day.”

Framestore used virtual production tools to bring Winnie the Pooh and other classic A.A. Milne characters to life in “Christopher Robin.” Photograph copyright © 2018 The Walt Disney Company and courtesy of Framestore.

On set during production, tools like a Simulcam can come into play to integrate CG elements with live-action footage. In virtual production – just like in previs – real-time technology is allowing filmmakers to visualize more and more complex sequences right there on set, with greater fidelity and more quickly than was previously possible. Recent feature films in which real-time tools drove on-set virtual production include Blade Runner 2049 and Christopher Robin, both of which drew on expertise from Framestore, which has been integrating Unreal Engine into its virtual production workflow for the past year.

Richard Graham, capture supervisor for Framestore’s motion capture stage, explained that to create crowd animation for a scene in the Blade Runner 2049 trash mesa sequence, his team in London shot mocap and rendered the data in real time in Unreal. The visual effects team, watching live from the review suite in Montreal, got a good representation of the final shot without having to fly across the Atlantic. On Christopher Robin, Graham’s team was asked to create CG cameras to match a scene that had already been shot on location. Tapping into Unreal’s Alembic support, Graham brought in animation from the film pipeline and allowed the director of photography and animation supervisor to shoot cameras on the animation to match.

“Christopher Robin” photograph copyright © 2018 The Walt Disney Company and courtesy of Framestore.

“The main benefit of virtual production is giving filmmakers a way to make smart creative decisions more quickly,” said Richard Graham. “For a director, it’s about them being able to see a version of their movie that’s closer to what they see in their head. I think the thing that’s most exciting in virtual production right now is just the improvement in rendering technology, so that we can get a picture that’s closer to what the final picture will look like much more quickly. Epic is really pushing the area of virtual production now, which is helping us enormously. There are so many new features in the engine now, and frequent releases, making our jobs much more straightforward.”

Beyond creative decisions, these improvements driven by real-time technology are also trickling down to impact visual effects artists and producers. With shots planned more thoroughly ahead of time, and CG elements already fully fleshed out during previs, visual effects producers can create much more detailed bids and budgets. Additionally, the ability to accurately visualize a scene’s CG elements during production allows creatives to make adjustments as needed on set, making sure things like composition, framing, and lighting are working as imagined. This all helps eliminate wasted time and bottlenecks for visual effects artists, who now have a clearer mandate earlier in the process.

Glenn Derry, vice president of visual effects at Fox VFX Lab, elaborated on the benefits for visual effects artists: “The idea is that whatever we’re looking at on the front end is very easily translatable to what the guys on the back end are doing. They don’t have to worry about figuring out what the shot is. We did all that work with the director up-front. Here’s the shot. This is it. Now you can spend all your focus on making it the best version of that shot possible, rather than trying to invent things and get approvals, because it’s already approved.”

Nvidia real-time technology captures human movement and transfers it to a fully rendered CG model. Photograph courtesy of Halon Entertainment.

The industry is now approaching a tipping point where real-time technology will produce final pixel quality on the front end. The creative, financial, and efficiency benefits will only continue to grow from there.

“Our quality of previs has evolved over the years so much to the point where some directors now are like, ‘Wow, this is looking like a final piece of production,’” observed Brad Alexander, previs supervisor at Halon. “It’s getting there; we’re starting to hit a bleeding edge of the quality matching to a final of an animated feature.”

Ryan McCoy added, “Often we’ll get assets from the finals vendors – these beautiful, high-quality assets with all these texture maps and things that we could never normally use in previs. But now we’re able to get those and put them straight into Unreal and make it almost look as good as it does in the finished piece. Previs is getting closer and closer to final-level quality every year with the new advancements in the technology and the software.”

Framestore’s chief creative officer Tim Webber excitedly calls virtual production “an opportunity to redesign the whole process of filmmaking.” Once final pixel quality can be achieved in real time, the effects on filmmaking are sure to be quite revolutionary.

Read the complete behind-the-scenes story on “Welcome to Marwen” in Cinefex 162.

Special thanks to Karen Raz.

Now Showing – Cinefex 162

One of the realities of the film industry today – something solidly in the “This Never Used to Happen” category – is the phenomenon of shifting movie release dates. In January, Film X is set to be released in March; by February, its release has been changed to May; in April, its release is set for July – of the following year.

This trend of jockeying release dates – whether to afford a film a more advantageous opening or to tinker with a film that isn’t quite ready or any number of other considerations – plays havoc with our editorial schedules. We have an article planned, sometimes even written, and then, at the last minute, the release date changes and our article must be postponed for publication in a later issue.

That is what happened with Cinefex 162, and we were suddenly left with an Alita-sized hole in our magazine!

Fortunately, we still have a few tricks up our sleeves, and a few treasures in our old file cabinets – such as an interview that Cinefex founder Don Shay conducted with Richard Fleischer, the director of 20,000 Leagues Under the Sea, back in 1977, as part of his research for a never-realized book project.

That interview, never before published, is a treasure, indeed, as the now late Fleischer recalled the making of that iconic film 23 years prior as if it had just wrapped. We present it to you in our issue 162, along with our cover story on the making of Aquaman, Joe Fordham’s coverage of Fantastic Beasts: The Crimes of Grindelwald, and Graham Edwards’ Welcome to Marwen story – supported by an in-depth Q&A with director Robert Zemeckis.

It’s a terrific issue – and you can look for Alita in February!

Cinefex 162 is on newsstands now, and available to order at our online store. If you’re a subscriber, your copy is already splashing its way to your mailbox. And don’t forget our iPad edition, out soon, featuring tons more photographs and exclusive video content.

Cinefex Quiz 2018

Yes, it’s time for this year’s grand quiz! We’ve posed one multiple choice question for every film or television show we covered during 2018, ranging from Star Wars to Game of Thrones, First Man to Aquaman. Test your knowledge now in the Cinefex Quiz 2018 … and good luck!

The Visual Effects of “Outlaw King”

Outlaw King - visual effects by Method Studios

When King Alexander III of Scotland died suddenly in the year 1286, King Edward I of England made a bid to seize the country for himself. During the bitter years of conflict that followed, Scottish freedom fighters rose up as heroes in a fight for independence, including William Wallace, portrayed by Mel Gibson in the 1995 film Braveheart.

In the feature-length Netflix drama Outlaw King, director David Mackenzie picks up the story with Robert the Bruce (Chris Pine), a claimant to the Scottish throne who led his forces to victory against the much larger English army at the Battle of Loudoun Hill in 1307. Under the direction of production visual effects supervisor Alex Bicknell, Method Studios delivered over 500 shots for Outlaw King amounting to roughly one hour of screen time, augmenting photography captured on location in Scotland and ramping up the action for the film’s battle scenes.

During the climactic Battle of Loudoun Hill, Bruce and his 600 soldiers use local knowledge to outwit a 5,000-strong English army. Method Studios augmented crowds of around 300 extras and 40 horses, and enhanced castle locations with historically correct architecture, weapons and props.

“This sequence was quite unusual for visual effects,” said Method Studios visual effects supervisor Dan Bethell. “Typically it’s more of a linear process, but here the sequence had about 150 shots that were all in play at the same time. We’d work on a large bulk of shots simultaneously, and with each story tweak we’d have to implement that across all the relevant shots. It was fun and collaborative, but definitely a different way of working. Our talented department leads were fabulous at keeping everything moving at a high level of quality and technical precision so that David and editor Jake Roberts could make the most informed decisions.”

To swell the ranks of both the Scottish and English armies, Method Studios populated the battlefield with various classes of CG soldier including archers, swordsmen and cavalry. Digital horses boasted muscle simulations, sliding skin, and authentic tack. Research ensured that clothing and armor were period-accurate, and that every faction was flying the correct flag. If plate photography contained historically inappropriate trees, artists mercilessly uprooted them.

Method Studios also augmented the film’s opening shot, an unbroken eight-minute take captured by director of photography Barry Ackroyd, during which King Edward’s forces attack Stirling Castle. Extending the English encampment, artists added crowds of soldiers and the gigantic Warwolf trebuchet used by the invaders to pulverize the fortress, which Method Studios constructed in digital form and then promptly demolished.

“David and Alex had a great understanding of how visual effects could enhance the historical accuracy,” said Bethell. “This helped create the believability for certain sequences when we couldn’t physically capture everything as it was in the 1300s. With so much captured on location as opposed to bluescreen, our tracking and roto departments really had their work cut out, and they did a phenomenal job giving us a foundation for the rest of the effects work.”

Outlaw King is now streaming worldwide on Netflix.

Spotlight – Shauna Bryan

To create cinematic illusions, you need conjurors. In this series of spotlight interviews, we ask movie magicians what makes them tick.

Shauna Bryan is vice president of new business and production executive at Sony Pictures Imageworks. Her personal filmography highlights include The Da Vinci Code, Blades of Glory and Spider-Man: Homecoming.

CINEFEX: How did you get started in the business, Shauna?

SHAUNA BRYAN: I specifically attempted to break into the Vancouver film industry, which was the closest I could get to Los Angeles, at a time when there was only a handful of television series being shot up here. Funnily enough, film people were known to work only in the summer, and ski in the winter. That was a scary prospect for me in terms of job security, but I wanted to work in film because I’m a storyteller at heart and movies have always been my main catharsis. I was determined, to say the least!

My first big break was getting a producer internship for the feature film Whale Music, directed by Richard J. Lewis, who’s now one of the main directors and a co-executive producer on Westworld. On Whale Music, I got to work closely with Richard and Raymond Massey, the producer, and be a part of the film from development right through to the end of post and the film festival launch. That was an invaluable experience, and the movie still holds up as a bit of a cult classic today.

CINEFEX: What aspect of your job makes you grin from ear to ear?

SHAUNA BRYAN: I love doing creative tests or pitches designed around a filmmaker’s vision, to prove the strength of our company and artists. I love turning perceptions on their ear and showing that a company that does animation can also do photoreal visual effects. It’s fun.

CINEFEX: And what makes you sob uncontrollably?

SHAUNA BRYAN: I don’t sob. When I was 23, I could stay up all night stressing out, but now I’m at a point in my career where I understand that things tend to change overnight and that there’s definitely always a solution.

CINEFEX: What’s the most challenging task you’ve ever faced?

SHAUNA BRYAN: I was working for Rainmaker and we were awarded the entire show of Blades of Glory. We had to work out how to do full CG environments and massive crowds – on top of that, we were tasked by DreamWorks to do full CG face replacements of Jon Heder and Will Ferrell that were good enough so that no one would know the actors hadn’t skated the performances themselves. If that weren’t tough enough, Jon Heder broke his leg during rehearsals, so all of the skating face replacement work was shot in August and audience previews started in October, with the same studio mandate that everyone had to believe it was really Jon and Will on the screen. Bear in mind that this was 2006, well before The Curious Case of Benjamin Button and just as complicated. It was a huge ask, but somehow we got it done and the movie is still one of my favorites today.

CINEFEX: And what’s the weirdest task?

SHAUNA BRYAN: I had to deliver green script revision pages to an executive producer who was already on a plane set to fly out of Vancouver. I literally had to run to the gate, talk my way past and get escorted to the plane to hand-deliver the revisions to the executive at his seat. I don’t think he ever even read them, but my job as an uber-executive assistant was secured!

CINEFEX: What changes have you observed in your field over the years?

SHAUNA BRYAN: I’ve seen an odd full circle forming. When I first started in visual effects, it was really only the big companies that could be trusted to do serious work. It took a long time and was expensive. Then, over the years, smaller companies came in like little fighter pilots, turning the tide and doing equally complex work, for cheaper costs. Now, with so many huge visual effects shows out there, containing complex work on fairly short schedules, quality and delivery are of utmost importance. There’s not enough render capacity, artists or production management to feed the worldwide demand, and smaller companies have had a harder time executing as expected. I’m now seeing studios checking more deeply into a company’s capacity, their robust pipeline for delivery, current artistic talent and so on, and being less focused on the lowest bid when it comes to awarding work. Not to say that costs don’t need to be competitive – because they absolutely do – but there seems to be a more holistic thoughtfulness when it comes to placing work at a facility.

CINEFEX: And what changes would you like to see?

SHAUNA BRYAN: I’d like to see more of the above, and to have stronger partnerships between a facility and a studio. There’s more than enough work to go around, so I’d love to see facilities taking on less work, but doing so with more purpose. I’d love to see studios engaging facilities earlier on, and awarding earlier so that facilities can plan and not feel like they have to take on everything that comes their way. More planning, less grabbing. I’m not sure if that’s possible, but I’d love to see it.

CINEFEX: What advice would you give to someone starting out in the business?

SHAUNA BRYAN: Absolutely go for it. The film and visual effects industry is awesome and, in an odd way, more recession-proof. People want content – there’s more and more of a demand for it. There are a lot of opportunities for growth, travel and learning. It’s a constantly evolving landscape technically, which is exciting. That said, this industry is hard work with long hours, so you need to be mindful of your career path and how that can form around a family or a desire to settle in one place.

CINEFEX: If you were to host a mini-festival of your three favorite effects movies, what would you put on the bill, and why?

SHAUNA BRYAN: The original Star Wars, The Empire Strikes Back and Forrest Gump. When I saw each of these films, I wondered, “How did they do that?” and was completely transported into the story. These films wouldn’t have been possible without visual effects and they changed my thinking as to the possibilities of storytelling. Standout sequences: the landspeeder and final battle from Star Wars; the Battle of Hoth and the duel from The Empire Strikes Back; the ping-pong tournament and Forrest in iconic history moments from Forrest Gump.

CINEFEX: What’s your favorite movie theater snack?

SHAUNA BRYAN: Popcorn and Snickers mini-bites, mixed together.

CINEFEX: Shauna, thanks for your time!

Florian Gellinger – Marvel, Barbecues, and the Future of German Cinema

Florian Gellinger

Florian Gellinger is executive visual effects producer at RISE, the company he co-founded in 2007. The chair of the German section of the Visual Effects Society and a member of the Academy of Motion Picture Arts and Sciences, he presented his company’s work on a range of Marvel Studios films including Ant-Man and the Wasp at VIEW Conference 2018. Cinefex spent half an hour in conversation with Florian, discussing a wide range of topics from managing a Marvel project to the benefits of holding company barbecues.

VIEW Conference 2018

CINEFEX: Your talk at VIEW Conference covered RISE’s work on Ant-Man and the Wasp, but that wasn’t the only Marvel Studios show you’ve worked on recently.

FLORIAN GELLINGER: No, we worked on all three Marvel films this year. Black Panther was by far the biggest one, and then on Avengers: Infinity War they threw this stuff at us for the tag sequence – the “blip” effect where people disintegrate. After that we jumped right onto Ant-Man and the Wasp. On that, you could say we were kind of a safety net, doing a lot of comp-heavy stuff.

CINEFEX: Did that mean dealing with the ongoing changes as the Marvel team continued to craft the storyline through post?

FLORIAN GELLINGER: Well, you just have to be aware that Marvel treats a live-action feature like Pixar treats an animated feature. I think that shows what exceptional storytellers they are, that they will not stop until the movie is out. That’s because Kevin Feige, the head of the studio, is still a geek at heart, who loves the source material so much and has his gang of people around him who feel the same. They treat the material with respect, and that’s why the fans feel treated with respect. Also, they are smart enough to make the whole thing appeal to a mass audience. I mean, my parents are watching Marvel films, and they’re in their early 70s!

CINEFEX: Some visual effects facilities have told us they adopt a slightly different workflow for Marvel shows, to deal with the unique demands. Is that how it works at RISE?

FLORIAN GELLINGER: I think it’s more the other way around. Because we’re used to the Marvel way of doing things, now we give all of our other clients the same treatment. We always have roughly 30 percent more crew available for the last two months of production on a show. That also helps us stick to relatively normal working hours. Of course, there are the standout shots which require more work and where we do spend long hours to make something exceptionally beautiful that we can be proud of.

Watch a RISE breakdown reel showcasing its work on Black Panther:

CINEFEX: We’ve always sensed that there’s quite a culture of fun and play at RISE.

FLORIAN GELLINGER: That’s the core thing, yeah. We believe that if you don’t enjoy what you’re doing, you shouldn’t be doing it. After all, we’re working in the entertainment industry. We’re not surgeons – this isn’t life or death.

CINEFEX: That spirit even comes through in your recruitment ads – which are often very amusing!

FLORIAN GELLINGER: Yes, and this is something that always puzzles me. You’re trying to appeal to young artists to work at your place, trying to lure in exceptional talents. Why would you do that with an overly corporate outer shell? I think you get the best talent when people get fair pay, and when they have fun doing their work. We think a company barbecue is always a good idea. Or a company party that maybe slightly escalates along the way!

CINEFEX: Is it hard to maintain that sense of unity across – what do you have, four offices now?

FLORIAN GELLINGER: We learned a lot by watching other companies grow. We always knew that if we were going to branch out and open other offices, then we needed to prepare all the tech that makes the work easy.

CINEFEX: For example?

FLORIAN GELLINGER: When we first opened our Cologne office, we made sure that you could instant-message anyone on their workstation, dial them up on their workstation or phone, share screens if you wanted to give advice or work together on a shot. When we opened in Stuttgart and Munich, we introduced the ability to synchronize all of the files for a specific show, so that all the files will be up to date in that other office all the time. So, if I’m closing my Nuke comp setup in Berlin, it takes about two seconds until someone in Munich can pick up and continue my work. The comp artist in Munich can then use the render farm in Berlin because all the files of his composite are already in Berlin – it’s only the Nuke setup that needs to be sent. It’s the same with assets, and so on. We need to have security clearance when we do this for certain shows, of course.

CINEFEX: It also helps to hold the team together even when everyone is miles apart.

FLORIAN GELLINGER: Yeah, it feels like everyone is working in the same building, but on a different floor. And they’re just too lazy to take the stairs! It’s like a big, slightly dysfunctional family.

Watch a RISE breakdown reel showcasing its work on Avengers: Infinity War:

CINEFEX: You work on the big Marvel movies, but you also do your share of German television and cinema. There’s a great heritage in German cinema, going way back – do you feel part of that continuum?

FLORIAN GELLINGER: Unfortunately, I think that spirit of innovation in cinema was somewhat lost over the last century. In Germany, we have a big divide between popular films and arthouse films. We do see really exceptional films like Jim Button and Luke the Engine Driver, the children’s film that was released this year, where a couple of German visual effects companies contributed amazing work, but these are just unicorn projects that just pop up from time to time. It seems like the audience has lost its trust in bigger-scale shows, and nobody’s expecting to see them any more.

CINEFEX: There was a bit of a boom in German filmmaking during the ‘80s.

FLORIAN GELLINGER: When Wolfgang Petersen made Das Boot and The NeverEnding Story. That was still creative moviemaking. Wolfgang Petersen said that he moved to Hollywood because it was too hard for him to realize his ideas in Germany. There were always so many naysayers who would say, “You can’t do this, you can’t do that.” Same thing with Roland Emmerich. He did his German sci-fi films but he just didn’t feel wanted or respected, so he moved to the States.

CINEFEX: Do you see any opportunities for change?

FLORIAN GELLINGER: I just wonder why so few German-produced films are being shot in English, because then you would access a much larger audience. It’s not hard to get an international cast in. So, at RISE, we’re going to try that. We’ll start to produce our own films in 2019 with a film production company called RISE Pictures. We’re hooking up with producers around the world to produce visually high level international content.

CINEFEX: That’s exciting. Do you think that will also benefit the German visual effects industry?

FLORIAN GELLINGER: I think that Germany was not at the forefront of the digital film revolution, because the tax breaks around the world focused the work elsewhere. That made it hard for an ecosystem to grow. Now, there is so much work out there, and the German rebates have caught up to the international competition, so it’s getting better and better, and you have companies like Scanline and Mackevision and Trixter competing in the global market.

Also, if you look around the world, there is a lot of German talent out there – people who might have moved abroad in their mid-20s, found a partner, got married and had kids, and now they want those kids to live close to their grandparents – so they’re looking for a place back home. So now you’re looking at really exceptional talents who have done everything from amazing creature sculpting to programming muscle systems and tissue solvers. For them, there’s a big benefit in seeing the industry grow in Germany, even though it hasn’t always had the support or the ecosystem to grow in.

CINEFEX: It’s important to get culturally unique filmmakers out there on the world stage – now more than ever, perhaps. A German filmmaker has a different voice than a French filmmaker, or a Swedish filmmaker.

FLORIAN GELLINGER: Yeah. I read that George Lucas and Francis Ford Coppola were always jealous of the French filmmakers with their little handheld cameras. They were so flexible and could just go anywhere with their actors, pick any location they wanted and just start shooting. I think you lose that when your production reaches a certain size and you’re just going the path most traveled, and you’re not inventing any more. It’s like Rob Bredow said in his talk at VIEW – having all the options is not necessarily a good thing. When somebody limits your possibilities, that’s when you start inventing.

Save the date for next year’s VIEW Conference, scheduled for 21-25 October, 2019.

“Black Panther” image copyright © 2018 by MARVEL.

Jay Worth – Sharing a Unique Vision

Jay Worth - VFX supervisor on "Westworld"

Emmy award-winner Jay Worth has worked with Bad Robot since 2005 on television projects including Fringe, Person Of Interest and Westworld. Following his presentation at VIEW Conference 2018, Cinefex caught up with Jay to talk about his life as an independent visual effects supervisor.

VIEW Conference 2018

CINEFEX: You began your VIEW Conference presentation talking about your background in acting, and how you’d had all these other jobs before kind of falling into doing visual effects. You also said that helped you to develop what you call your own “unique vision.”

JAY WORTH: When I started out in visual effects, I felt like I was always playing catch-up – wondering how much was down to me, how much down to the writers, the artists – because I don’t really have a technical background. Then, after I’d been doing it for a while, I realized that I did know what I could bring to the table, and what my own unique vision was. And I love the idea that actually everybody has their own unique vision. Because when you look at any one visual effects shot, that’s been done by a person with their own individual perspective. Even if it’s a fluid simulation, it’s still done by an artist.

CINEFEX: What are the benefits of working as an independent visual effects supervisor?

JAY WORTH: Well, first, I tell people all the time that there are not enough independent visual effects supervisors in television. We all turn down work, a lot, and I think more and more studios and showrunners want to be working with independent supervisors when they can. The great thing for me as an independent is I can work with any companies I want, and I like being able to find those people out there that do certain things really well. I love when vendors send me specific reels. Send me your smoke reel. Send me your fire reel. If you have an artist that you cannot wait to put in front of me, let me know. It’s okay for me if you specialize in something.

CINEFEX: Because every artist has their own vision.

JAY WORTH: Right. There was this one matte painter at CoSA Visual Effects, and I would literally just email them and say, “Send it to her, because I know she’s the one who’s doing the shot.” I love when you find those certain artists. I had this one vendor, and when they were reading their breakdown and found they weren’t assigned this one sequence, they were pissed. They said, “You don’t understand, we’ve got a smoke guy that has to do these shots!” I love it when vendors do that.

CINEFEX: So you’re acting a little bit like a casting director – appropriate, given your background in acting.

JAY WORTH: I never really thought about it that way, but yes. I’m always looking for the next partners, the next CoSA, the next Important Looking Pirates. But also, television shows morph and change. Sometimes you’ll have a pilot that’s heavy on hard surfaces or matte paintings, and then you get into episode four and it’s all fluid simulations. A show doesn’t always want to be stuck with just one vendor.

Watch a Westworld season two breakdown reel by Important Looking Pirates:

CINEFEX: Is it sometimes the case that the right vendor comes along at just the right time?

JAY WORTH: Sure, that happened with Almost Human. It was the big climactic shot that establishes the entire world at the end of the pilot. We had done the whole episode, and I got a note from J.J. Abrams that said, “I hate this shot.” I emailed back to ask what he hated about it, and got no answer. So I used it as an audition piece. I sent the same brief and the same notes, with my idea of what needed to be fixed, to nine different companies. I got back nine of the most different matte paintings you’re ever going to see in your life. A lot of them didn’t work, but there was this one vendor that was seventh on the list in terms of size, and the lowest cost – Artifex Studios up in Vancouver. What they came up with was perfect. Everyone looked at it and said, “That’s the entire show, right there.” They ended up doing every matte painting for the rest of the show.

CINEFEX: These days, in features, the visual effects supervisor is often involved from day one. Is the same true for television?

JAY WORTH: Thankfully, I’ve been blessed that I’ve always been involved from the get-go, but I think that’s a fairly unique position. Bad Robot will call me to say they have a pilot for me; I’ll ask who’s directing it and they’ll say, “We don’t know yet!” So I end up in meetings with the showrunner, maybe a production designer, and me, in a room full of executives, first out of the gate in figuring out what this show will look like.

CINEFEX: Further down the line, when you’re into production, do you end up working on multiple episodes simultaneously?

JAY WORTH: It totally depends on the show, but usually you have four episodes going at once. You’re prepping one, shooting one, editing one, and delivering one or two at a time. Then all of a sudden it backs up, and everything goes a little haywire. With Netflix shows you have a little bit more freedom. From my perspective, because I work on multiple shows, I can have up to 30 shows in the pipeline. Lots of plate-spinning!

This breakdown reel from CoSA VFX features the company’s work on Westworld and other shows:

CINEFEX: Turning to Westworld, did the workload change when you moved from season one into season two?

JAY WORTH: Oh, the volume of work on season two! It was crazy how much more there was. Take Shogun World, my goodness! Because of the look they wanted, they hung silks through the entire street. We had to do hardcore roto to remove all the silks and do sky replacement. At first, we didn’t think we needed Mount Fuji, but we ended up putting Mount Fuji in most shots, because if you saw the California mountains back there it didn’t look right. One of the nice things was that we had a two-day blood element shoot – knife cuts and splashes, so much blood! – which I’d never gotten to do in my career. That was great.

CINEFEX: You had some big moments to tackle, too, like the passage of the hosts into this strange alternate realm in the season finale.

JAY WORTH: Right, there’s this fissure in the universe that goes to what they called the Sublime. But what does that look like? Is it a portal? A doorway? How’s the light going to work? There was this idea that it should bend and kind of tear into itself, and we also talked a lot theoretically about how it’s a computer program that only the hosts can see – but we didn’t want it to look digital …

CINEFEX: You can debate and articulate as much as you like about why something is the way it is, but …

JAY WORTH: Oh, there was a whole lot of “Why?” but it still came down to “What’s it going to look like?” We put in black particles and light particles and vibration and distortion and bending … we just kept chipping away to find out what’s underneath. That’s what this job is – it’s like visual sculpting.

CINEFEX: Do you think things will ramp up even more for Westworld season three?

JAY WORTH: Well, now our hosts are out, so I know that we’re going to have to see them somewhere. So for me it’s about worldbuilding, and I’m fascinated to see how that creative process goes. We’ve all seen future cities, for example, but it always comes back to those classic questions like, “What are the cars going to be?” I know a skyscraper’s going to be steel and glass, but how am I going to make it unique? Worldbuilding is always daunting, and a little scary, but it’s super fun. I’m looking forward to it.

“Westworld” image copyright © 2016 by Home Box Office.

Matt Aitken – Face to Face with Thanos

Matt Aitken, VFX supervisor at Weta Digital - "Avengers: Infinity War" image copyright © 2018 by MARVEL.

A veteran of films including Avatar, District 9, and all three films in The Hobbit trilogy, Matt Aitken gave a presentation about his role as Weta Digital visual effects supervisor on Avengers: Infinity War at VIEW Conference 2018. Cinefex caught up with Matt at the event and quizzed him not only about the creation of Marvel’s tortured bad guy Thanos, but also the evolution of digital characters at Weta Digital over the years.

VIEW Conference 2018

CINEFEX: Matt, we spoke to you earlier this year for our article on Avengers: Infinity War. Just give us a quick recap on the work that Weta Digital did for the film.

MATT AITKEN: We did everything on planet Titan, where Thanos goes to get the Time Stone from Doctor Strange. Along with Digital Domain, we created Thanos, this digital character who is really the protagonist of the movie. For Marvel, I think this was a bit of a leap, having a lead character who was entirely digital. We were all very aware that if Thanos didn’t work, then the film was going to fail.

CINEFEX: Thanos was played by Josh Brolin, of course.

MATT AITKEN: That’s right, and Josh took to the digital performance space like a duck to water. He did this fantastic reference performance for Thanos, and really seemed to enjoy it. Because, you know, Thanos is a complex character. He’s not just a roaring, screaming baddie – he’s actually motivated by what he thinks are good intentions. We really needed to be able to get into his head for the storytelling to work.

CINEFEX: Weta Digital has this great heritage of doing digital characters. Did you break any new ground with Thanos?

MATT AITKEN: We did. In the past we’ve had concerns about these digital characters being physiologically different from the actors that are playing them – necessarily so. Caesar is different from Andy Serkis, the BFG is different from Mark Rylance, and Thanos is different from Josh Brolin. So, when we’re trying to make sure that we’ve captured all those nuances of performance, we’re kind of comparing apples to oranges.

We solved this for Infinity War by creating an intermediary step in the form of a digital facsimile of Josh. We called it the “actor puppet.” It was like a digi-double, but it was very geared towards facial performance, and had all the same range of motion as the Thanos digital puppet. We did a facial solve on the actor puppet, interpreting the tracking data from Josh’s actual face, then we iterated in that space until we were happy that we’d captured the full intention of his performance. So, at that point, we were in an apples-to-apples comparison space.

Once we’d done all that work, and were happy with it, then it was reasonably straightforward to transfer that motion to Thanos. We would take the animation curves, the timing and extent for each of the muscles on the face, and just apply it to the Thanos puppet, which we’d carefully calibrated to the Josh actor puppet. We feel this really helped us to capture all the subtleties of Josh’s performance.
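The transfer step Matt describes – matching named channels on two calibrated puppets, then copying the animation curves across – can be sketched in a few lines of Python. This is a toy illustration under our own assumptions, not Weta Digital’s actual tools; the channel names and calibration values are invented.

```python
# Toy sketch of curve transfer between two facial rigs that share channels.
# A curve is a list of (time, value) keys; a performance maps channel -> curve.
actor_performance = {
    "brow_raise": [(0, 0.0), (12, 0.9), (24, 0.4)],
    "lip_curl":   [(0, 0.1), (12, 0.1), (24, 0.7)],
}

# Hypothetical per-channel calibration: how strongly each solved actor
# channel should drive the corresponding Thanos channel.
calibration = {"brow_raise": 1.0, "lip_curl": 0.85}

def retarget(performance, calibration):
    """Copy each channel's animation curve across, scaled by its calibration."""
    return {
        channel: [(t, v * calibration.get(channel, 1.0)) for t, v in curve]
        for channel, curve in performance.items()
    }

thanos_performance = retarget(actor_performance, calibration)
```

The point of the intermediate actor puppet is that both rigs expose the same channels, so once the solve is approved, the transfer itself is nearly mechanical.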

CINEFEX: In that second stage, going from digital Josh to digital Thanos, did you inject a further level of finessing and performance?

MATT AITKEN: Yeah. We would get to the point where we were happy that we’d captured everything that Josh was doing in the technical sense, but then we would always have a keyframe animator – a craftsperson, if you like – sit down and do an extra pass to polish the performance. That’s something that we’ve done at Weta Digital since the very early days of Gollum and Kong because, as good as the pipeline is, the technology can knock the edges off a performance. I think that’s something that we will always cherish, that polishing pass. It’s part of our secret sauce.

Watch a breakdown reel showcasing Weta Digital’s work on Avengers: Infinity War:

CINEFEX: You mentioned Gollum and Kong. Weta Digital has this wonderful lineage of digital characters stretching from The Lord of the Rings through to Caesar in the Apes films, and now Thanos. Can you pick out a few of the key steps that you’ve made along the way?

MATT AITKEN: Kong was a key moment because, for the first time, we used facial motion capture. Some people may not realize that Gollum’s facial performance was entirely keyframe animated, but with Kong we stuck dots on Andy Serkis’ face, tracked them, and did a facial solve. This used a procedural approach to analyze what Andy’s face was doing, break it down into individual muscle components, and then apply that to Kong’s face.

Now, for Kong’s facial performance, Andy was restricted to a cube maybe three feet on each side. If he moved out of that space we would lose the track, so we used it mainly for the big drama beats. Avatar was the next big leap forward because, for the first time, we had head-mounted cameras filming dots painted onto the actors’ faces. That meant they could roam freely throughout the performance capture space.

CINEFEX: Alongside those things that have changed, is there anything that hasn’t changed?

MATT AITKEN: Well, the thing that anchors all our digital performance work is that it’s always based on a human performance. That’s because, going all the way back to the time of Greek theater, actors are the people that we go to for performances. They are the specialists in that particular task. Why should we change that?

There’s another thing that hasn’t changed since the beginning – and this emerged through the process of working out how to do Gollum’s facial performance. Back then, we started by looking at taking the movement of Andy’s face and dragging a digital Gollum face around the same way, but that really quickly gave the appearance of somebody wearing a rubber Gollum mask. So we discarded that. We also looked at the same approach that we use for our body work, with muscles under the skin that fire, and bulge, and drag the skin around. But that was too crude for the facial performance – it didn’t give us the microscopic level of sculptural control that we needed.

So, what we settled on for Gollum – and it has been the same ever since – was a sculptural approach. We have facial modelers who craft the individual component shapes of the performance to a very fine level of detail – a brow raise, a lip curl, an eyelid open or close, a cheek raise. We sculpt a set of 108 shapes for each facial performance, in a way that gives a full range of motion, but always on character. Then the facial animators create a performance from that by dialing those shapes in and out in a very complex way.
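The shape-dialing Matt describes is, in essence, classic blendshape animation: a pose is the neutral mesh plus a weighted sum of sculpted shape offsets. A minimal sketch with toy data (not Weta Digital’s rig – vertex positions and shape names here are invented):

```python
import numpy as np

# A facial pose as neutral mesh + weighted sum of sculpted shape offsets.
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])  # toy three-vertex "face"

# Each sculpted component shape is stored as an offset from the neutral.
deltas = {
    "brow_raise": np.array([[0.0, 0.1, 0.0],
                            [0.0, 0.0, 0.0],
                            [0.0, 0.1, 0.0]]),
    "lip_curl":   np.array([[0.0, 0.0, 0.0],
                            [0.1, 0.0, 0.0],
                            [0.0, 0.0, 0.0]]),
}

def pose(weights):
    """Dial shapes in and out: neutral + sum of w_i * delta_i."""
    out = neutral.copy()
    for name, w in weights.items():
        out += w * deltas[name]
    return out

frame = pose({"brow_raise": 0.8, "lip_curl": 0.25})
```

Because every shape is sculpted on character, any combination of weights stays on character too – which is the property the animators rely on.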

CINEFEX: Do you use that across the board? Take an extreme shot where, say, a character is thrown against a wall and their whole face distorts. Do you add some kind of dynamic simulation into the face rig, or is all that also controlled sculpturally?

MATT AITKEN: You mean like when somebody gets a massive punch to the head – which is the kind of shot that we’re often involved with! It would certainly be tempting to just add dynamics to it, but no, we want to have control over the shape of the face even at that moment, because we feel it’s so important to maintain character at all times. So we’ll sculpt those shapes as well.

CINEFEX: You’ve also made steady advances in the final physical appearance of these digital characters.

MATT AITKEN: Again, that started with Gollum, where we used subsurface light scattering, which makes the skin look very natural and not plasticky. Gollum wouldn’t have been nearly as successful if we hadn’t had access to that technology. Then there’s model weight – the amount of detail in the underlying geometry. Kong’s face had more geometric detail than there was in the whole of Gollum’s body. That’s been a constant progression, from Kong to Neytiri, Neytiri to Caesar, and that’s just about the tools getting better and the computers getting more powerful.

CINEFEX: How about lighting and rendering?

MATT AITKEN: We’re getting a lot better at hair and fur. We render now with our path tracer, Manuka, which is able to capture a global illumination model, so everything feels much more photographic and natural. We also have our PhysLight lighting pipeline, where we encapsulate all the physical characteristics of light, light transport, and cameras – we’re talking absolute values for light rather than relative stops, for example.
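The shift from relative stops to absolute light values can be illustrated with textbook exposure math. PhysLight’s internals are not public, so the function names below are our own and the formulas are just the standard photometric relationships, not its API:

```python
import math

# Standard photometric exposure relationships, for illustration only.
def ev100(aperture_n, shutter_s, iso):
    """Exposure value normalized to ISO 100, from camera settings."""
    return math.log2(aperture_n ** 2 / shutter_s) - math.log2(iso / 100.0)

def luminance_from_ev100(ev, k=12.5):
    """Average scene luminance in cd/m^2 that meters at this EV100.
    k is the conventional reflected-light meter calibration constant."""
    return 2.0 ** ev * k / 100.0

# f/8 at 1/125 s, ISO 100: a relative-stops workflow only knows "one stop
# brighter or darker"; here we recover an absolute luminance in cd/m^2.
ev = ev100(8.0, 1.0 / 125.0, 100)
lum = luminance_from_ev100(ev)
```

Carrying absolute quantities like these through the pipeline is what lets lights, cameras, and renders agree without per-shot eyeballing.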

Get comprehensive coverage of “Avengers: Infinity War” in Cinefex 159.

CINEFEX: We know you can’t divulge the recipe for Weta Digital’s secret sauce. But are there any ingredients that you’d like to see added to it?

MATT AITKEN: Oh, I just feel like there’s always more we can do. It’s like we don’t ever complete a project – we just run out of time and they snatch it from us! For me, the first time I see one of these films is like another dailies session, only I’m not able to give notes any more! It’s only on the second viewing that I can watch it as an audience member. But, it’s a very exciting space to be working in – the performance space, virtual production, working with actors on the set for what is ultimately going to be a digital performance – it’s just great fun. I just hope to be able to keep doing that.

“Avengers: Infinity War” image copyright © 2018 by MARVEL.

David Vickery – Capturing Reality with “Jurassic World: Fallen Kingdom”

David Vickery - VFX supervisor on Jurassic World: Fallen Kingdom

After joining Industrial Light & Magic in 2015, and with film credits including Sherlock Holmes, Harry Potter and the Deathly Hallows: Parts 1 and 2, and Jupiter Ascending, David Vickery took the role of production visual effects supervisor on Jurassic World: Fallen Kingdom. At VIEW Conference 2018, where he gave a presentation on this latest dinosaur epic, Cinefex chatted with David about the benefits of capturing reality on set, and the magic of enhancing it in post.

VIEW Conference 2018

CINEFEX: In your talk on Jurassic World: Fallen Kingdom at VIEW Conference, you stressed how important it was for you as visual effects supervisor to be on the show from day one.

DAVID VICKERY: Yes. In fact, I see ILM more and more not just as a postproduction vendor, a tool that is used by filmmakers to finish their movie, but as a real partner in the whole filmmaking process. On Jurassic World: Fallen Kingdom, we actually tried to find a lot of different ways to engage the shooting crew as part of the postproduction process.

CINEFEX: For example?

DAVID VICKERY: Take a shot where you’ve got a dinosaur standing up against a wall in a corridor, and there’s nothing else going on in that frame. A lot of the time, you would just take a camera operator and say, “We need a clean plate of that corridor. Just lock it off and shoot it, and we’ll add the dinosaur in post.” That’s fine, and we’ve got really talented people that can do all the dinosaur animation, but we don’t have a camera operator with 20 years’ worth of experience, or a director of photography who’s trained their entire life in how to light sets. So, what we actually did was put a dinosaur performer in the shot and invited the camera operator to shoot what they felt was right. Because the camera operator is a performer as well – they’re the one telling you what to look at.

CINEFEX: You used animatronic dinosaurs created by Neal Scanlan and his team, and you even had an inflatable indoraptor, right?

DAVID VICKERY: It sounds mad, doesn’t it? Just thinking about it seems slightly insane.

CINEFEX: It does, but when you see the behind-the-scenes footage, it totally makes sense – a super-lightweight puppet that looks a bit like something out of the War Horse stage show.

DAVID VICKERY: It’s exactly the same sort of idea. Even if you don’t get quite the performance you’ll get from something like War Horse, you get a lot of value by actually bringing a performance to set. Liam and Aiden Cook puppeteered our indoraptor – they’re a father and son team who work with Neal frequently. They actually became the character of the indoraptor and, as oddball as the footage looks, there are pieces in the sequence where we matched their performance.

CINEFEX: So did you keep the inflatable in shot for the actual takes?

DAVID VICKERY: Well, my original idea was to rehearse with the inflatable so that everyone would know their cues, and camera would know where to look. Then we would take the inflatable away and shoot without it, because it would be a lot of work to paint it out. The problem was, as soon as we did that, everyone slightly missed everything. Eyelines went in slightly different directions and we lost that energy. As director, J.A. Bayona was really aware of this, and so we ended up shooting with the inflatable in, all the time. That way, we got this amazing performance.

CINEFEX: But leaving you with all that paint work in post.

DAVID VICKERY: Well, getting the best shot on the set doesn’t necessarily make things quicker or easier in post, but it does guarantee that you get a really good shot. You believe the performances of Bryce Dallas Howard and Chris Pratt, because there’s a fairly lethal-looking inflatable jabbing towards their faces! Those inadvertent flinches they make are all real. Much better than a tennis ball – although we did have to resort to tennis balls every now and then!

CINEFEX: Sounds like a win-win.

DAVID VICKERY: It’s very beneficial for us, and it’s slightly selfish as well. Because what I get on set is a decision from a director as to what’s correct, in real time. Hans Zimmer summed it up beautifully in his talk at VIEW, when he said, “Real time is where shit happens.” He was talking about music, and I’m talking about dinosaurs, but it really is perfectly true, in that good shit happens on set. First, you get decisions. Second, you get people reacting and interacting. So, Neal and I worked hard to find as many ways as we could to bring dinosaurs to the set.

CINEFEX: It’s a theme that’s come up throughout the conference, not least in Dennis Muren’s talk on the critical five percent that makes a shot look great. His idea that you don’t add it in post – you capture it during production.

DAVID VICKERY: Dennis talks about it a lot, and sees it very clearly. If the first five percent is wrong, you’re never going to get the final five percent right. Get it right to start with, and then you’re not fixing it in post. You’re enhancing it in post.

CINEFEX: We’re sensing quite a swing to this way of thinking lately.

DAVID VICKERY: It’s a logical approach to filmmaking and visual effects. I think the slightly ironic thing about my job is that we actually spend most of our time in prep working out how not to do visual effects. It’s our job to help filmmakers understand how they can shoot things for real, not how we can just go do it in post.

CINEFEX: And it’s actually got nothing to do with the tiresome debate of whether you do it practically or digitally.

DAVID VICKERY: That’s right. I would say to Neal, “Look, I’m not precious, it doesn’t need to be digital.” And he would say to me, “Well, I’m not precious either. If you need to fix something, you fix it.” It’s all about making it as good as it can be on screen. We had shots with Blue where we got an amazing performance from the animatronic in camera. While we replaced most of Blue digitally in post to add subtle details, we kept the practical muzzle, because the small vibrations that we got in the leather straps were completely real, and not a detail that we’d have had time to add in post – if we’d even thought of it.

Watch a Jurassic World: Fallen Kingdom featurette:

CINEFEX: The original Jurassic Park exploded out of nowhere and took the world by storm. More than anything, it did something new with dinosaurs. Did you and J.A. and the rest of the team want to do that too?

DAVID VICKERY: We’re all creatives, so we come to every project wanting to bring a little bit of ourselves. We knew that J.A. as a director would bring his own visual aesthetic and sensibilities. There’s a very definite Spanish haunted house vibe going on in the third act of the film, which I love. In some of the concept art you can see a direct homage to frames from Jurassic Park, but J.A. would tweak it slightly, like putting an insane dinosaur in a girl’s bedroom rather than a velociraptor in a kitchen.

CINEFEX: Looking back at your recent portfolio of projects – Fast & Furious 6, Mission: Impossible – Rogue Nation, Jason Bourne, even Jurassic World: Fallen Kingdom – a lot of that work is based on what you might call “grounded reality.” Is that your thing, or is it just coincidence?

DAVID VICKERY: I think it’s just coincidence. But, I do feel very passionate about trying to make things as real as possible. It seems silly to say it, because everybody wants that, but even in Harry Potter and the Deathly Hallows, when we were working on incredibly fantastic things, I would try to look for inspiration in reality, because you can’t just invent that stuff. If you think you’ve invented a crazy animal, look in nature and you’ll find an even crazier one. I actually trained as an industrial designer, which is all about trying to understand how things are put together, and how they work. I think that gave me a really solid basis for visual effects. But really, I try and make films that I think I would enjoy, personally. I just love the idea of creating iconic, beautiful images. The first sight of the brachiosaur in Jurassic Park has to be one of those moments for me. I love that there’s a line in Fallen Kingdom where Bryce’s character says, “Do you remember the first time you saw a dinosaur?” Every time I hear that line, I think, “Yep, I remember,” and I get goosebumps.

Get comprehensive coverage of “Jurassic World: Fallen Kingdom” in Cinefex 160.

“Jurassic World: Fallen Kingdom” image copyright © 2018 by Universal Studios and Amblin Entertainment, Inc. and Legendary Pictures Productions, LLC.

VIEW Conference 2018 – Interview Roundup

VIEW Conference 2018 - Hans Zimmer, Dennis Muren, Rob Bredow, Geoffrey Baumann, John Gaeta, Dadi Einarsson

From 22-26 October, 2018, the Italian city of Turin hosted the 19th international VIEW Conference, celebrating visual effects, computer graphics, interactive techniques, digital cinema, animation, virtual and augmented reality, and gaming. Cinefex was there through the week, enjoying a host of presentations on some of the latest achievements and trends in visual effects and other disciplines.

The event took place in the heart of Turin at OGR (Officine Grandi Riparazioni), a spectacularly refurbished 19th century industrial complex. The former railroad workshops, with their high steel-framed ceilings, were the perfect backdrop for a stellar series of talks, workshops and masterclasses delivered by top talent, including keynote speeches by industry legends Hans Zimmer and Dennis Muren.

If you’re a regular reader of this blog, you’ll have seen some of the interviews we conducted at VIEW Conference popping up online through the week. In case you missed them, here are all the articles we’ve posted so far:

If you’re hungry for more, fear not. We’ll be posting more articles through this week, catching up with some more of the people we spoke to at VIEW Conference. Look out for interviews with:

  • David Vickery, visual effects supervisor, Jurassic World: Fallen Kingdom
  • Matt Aitken, visual effects supervisor, Weta Digital
  • Florian Gellinger, co-founder and executive visual effects producer, RISE
  • Jay Worth, visual effects supervisor, Westworld

David Vickery, Matt Aitken, Florian Gellinger, Jay Worth

We’d like to say a huge “thank you” to everyone at VIEW Conference who made us so welcome, especially Maria Elena Gutierrez, Steven Argula and Rick Rhoades. If you have a space in your 2019 calendar, make sure you save the date for next year’s event, scheduled for 21-25 October, 2019.