About Graham Edwards

I'm senior staff writer at Cinefex magazine. I also write novels. In a former life, I produced animated films for theme park rides and science centres. If you offer me a cold beer, I won't say no.

R is for Robot

In the VFX ABC, the letter “R” stands for “Robot”.

How do you put a robot up on the silver screen? It’s a question that has taxed filmmakers through the years – today more than ever, with science fiction as hot a Hollywood property as it’s ever been.

Here at Cinefex, we’ve written an awful lot about droid manufacture over the years. But let’s for a moment imagine that there’s such a thing as a definitive handbook called The Manual of How to Make a Movie Robot. Now wouldn’t that be a useful thing?

Imagine further – if such a handbook existed, what might you find if you started leafing through its pages?

Suiting Up

The first chapter of our imaginary manual would probably be called Stick Your Actor in a Shiny Suit. It’s an approach that worked well for Fritz Lang when he made his 1927 sci-fi classic Metropolis. Brigitte Helm played the Maschinenmensch automaton wearing a costume designed by sculptor and artist Walter Schulze-Mittendorff, who abandoned early plans to manufacture the suit from copper in favor of a pliable ‘plastic wood’ that hardened on exposure to air.

Fifty years later, George Lucas followed in Lang’s footsteps when he introduced the world to the bumbling protocol droid C-3PO in Star Wars. This time the actor in the suit was Anthony Daniels, and the artist who refined the robot’s features – under the supervision of production designer John Barry – was sculptor Liz Moore, who also modeled the Star Child seen at the end of 2001: A Space Odyssey.

Performer Cheryl Sparks, born without legs from the knees down, peers out of the open faceplate of her tiny robot costume beside Bruce Dern, in Douglas Trumbull’s sci-fi film “Silent Running.”

Robot suits come in all shapes and sizes. Accompanying C-3PO on his adventures is the diminutive astromech droid R2-D2, who was frequently portrayed by actor Kenny Baker while squeezed into a tight-fitting mechanical can.

Just as squashed were Larry Whisenhunt, Mark Persons, Cheryl Sparks and Steve Brown, all of whom had underdeveloped or missing legs, and who shared the roles of the three robots Huey, Dewey and Louie in Douglas Trumbull’s futuristic eco-fable Silent Running. Locked inside vacuum-formed shells, the agile quartet went through their robot routine whilst walking on their hands.

Rods and Cables

When James Cameron made The Terminator and its sequel Terminator 2: Judgment Day, he used every trick in the manual – and invented a few new ones to boot. At the purely practical level, he found countless ways of energizing his cyborg characters live on set, right in front of the camera lens.

For waist-up shots of the T-800 endoskeleton in “The Terminator,” Shane Mahan operated a backpack-mounted puppet created by Stan Winston Studios.

Artists at Stan Winston Studios applied metallic makeup effects to Arnold Schwarzenegger to support his role as the unstoppable T-800 assassin. The same team created animatronic replicas of the actor that revealed a chrome endoskeleton beneath the robot’s human flesh, and deployed a dazzling range of puppets and mobile rigs to bring the metal machine to life.

Here’s Stan Winston talking in Cinefex 21 about the design of the original T-800 endoskeleton:

“I wanted to retain Arnold’s form in building the robot. Not only is the robot the same height as Arnold, but all of its proportions are scaled down from and matched to fit his. The robot is anatomically correct, and could literally fit inside Arnold’s body. Even the robot’s skull was scaled down from a clay duplicate of Arnold’s head; and its teeth are duplicates of Arnold’s.”

Stop and Go

Our theoretical handbook wouldn’t be complete without a chapter on stop-motion animation – another technique used by James Cameron to bring his relentless robot to life in The Terminator. For wide shots of the T-800, animators moved a miniature cyborg constructed by Doug Beswick one frame at a time.

Stop-motion also features in RoboCop (1987), for which animators Phil Tippett and Randy Dutra shared duties activating the malevolent ED 209 enforcement droid. In one shot, the film’s hero is seen grappling at close quarters with the mechanical sentry – for this, a nine-inch stop-motion puppet of RoboCop stood in for actor Peter Weller.

Phil Tippett animates the ED 209 enforcement droid for a shot in Paul Verhoeven’s 1987 film “RoboCop.”

Less is More

Sometimes, creating a great movie robot isn’t about what you see. It’s about what you don’t see. Steven Spielberg explored this notion in A.I. Artificial Intelligence. For a startling shot of a broken-down FemMecha Nanny robot – whose face is intact but backed only by a mechanical underskull – Stan Winston Studios fashioned a puppet head with an articulated silicone face. For closeups, Industrial Light & Magic tracked the face of actor Clara Bellar onto a digital replica of the Winston animatronic.

To create Ava, the sophisticated android in Alex Garland’s Ex Machina, Double Negative composited CG robot components into live-action footage of Alicia Vikander’s on-set performance, artfully subtracting elements of her human form to create a delicately remodeled robot silhouette.

Here’s Ex Machina visual effects supervisor Andrew Whitehurst discussing the ‘less is more’ approach in Cinefex 145:

“We worked hard to make sure that we designed something which could work practically, which looked like it had the right weight distribution, and which still had ‘form follows function’ beauty. We continually removed pieces that seemed superfluous. The great industrial designer, Dieter Rams, has a motto: ‘Less, but better.’ We constantly kept that in mind. In fact, when the design was 3D-printed for the laboratory set, it did all fit together beautifully. That was a proud moment!”

Almost Human

To explore human performance in more depth, our handbook is going to need a whole section on motion capture. Neill Blomkamp used this technique to great effect in Chappie, capturing the on-set performance of Sharlto Copley and translating it onto the film’s titular robot character. Artists at Image Engine developed a highly realistic CG robot based on designs by Weta Workshop, and seamlessly replaced the human actor with his mechanical avatar.

Interviewed in Cinefex 141, Neill Blomkamp had this to say about the creation of his robot star:

“We modeled every twist of the wrist, and every movement of the ankle, all so that Chappie would be able to mimic a human’s motion. We kept refining the three-dimensional model, and then sending that back to the designers so they could tweak the design. Then it would go back to 3D. By going back and forth like that, we got to a place where every detail of the robot moved correctly.”


Super-Sized

Despite Yoda’s assertion that size matters not, there are some filmmakers who might disagree. They’ll be the ones poring over our imaginary handbook’s chapter on super-sized robots.

Few robots come bigger than the mechanical stars of Michael Bay’s Transformers movies. Industrial Light & Magic constructed fantastically elaborate digital rigs, modeling each metamorphosing robot character first in standing form, then working backward with animated movements that took it to a crouch, before devising folding actions that would slide limbs and other body parts into place on the vehicle that was its Earthly disguise.

Talking to us in Cinefex 111 about the very first Transformers film, visual effects supervisor Scott Farrar remarked on the levels of detail required to bring truly enormous robots to the screen:

“You’d think that hard-body surfaces, as compared to furry animals, would be easier in CG, but there is a basic rule in movies: if it isn’t complex, it doesn’t look complex. To make them look like real, complex characters and to give them the appropriate razzle-dazzle, every robot had to have thousands of articulated pieces and complex connections, plus layers of paint to look like car paint finishes. The swirl of the brush marks on the metal had to be there; the grease had to be there; the torch marks had to be there. Each of these robot characters needed layer upon layer of bump, texture, dirt, scratches and color.”

Strange Shapes

There are undoubtedly many other chapters to explore in The Manual of How to Make a Movie Robot, but we’ve got time for just one more. Having focused on robots based more or less on the human form, let’s briefly consider those that don’t look like people at all. Christopher Nolan’s Interstellar features a pair of blocky droids called TARS and CASE, represented on set by pneumatically assisted bunraku-style puppets manufactured by special effects supervisor Scott Fisher. For a handful of shots showing the robots unfolding themselves into distinctly inhuman forms, Double Negative took over with animated CG versions.

In “Interstellar,” the robots TARS and CASE were created on set as bunraku-style puppets, with animators at Double Negative taking over for shots that could not be achieved by practical means.

In Cinefex 140, animation supervisor David Lowry explained how practical experiments helped his team to devise the tricky CG robot rig required for Interstellar:

“I drilled holes into [wooden] blocks and used barbeque skewers as the joints. By playing with that, it became apparent how many different kinds of shapes you could create from just four basic blocks that have three joints each. Although it was simple, it became incredibly complicated very quickly.”

So, if there really were such a book as The Manual of How to Make a Movie Robot, which other cinematic cyborgs do you think it should include?

The Terminator photograph copyright © 1985 by Orion Pictures Corporation. RoboCop photograph copyright © 1987 by Orion Pictures Corporation. Ex Machina photographs copyright © 2015 by DNA Films and Universal Pictures. Chappie photographs copyright © 2015 by Sony Pictures Entertainment and Columbia Pictures Inc., LSC Film Corporation and MRC II Distribution Company LP. Interstellar photograph copyright © 2014 by Paramount Pictures.

Now Showing – Cinefex 154

Who’s your favourite king of the swingers? Is it Caesar or Spider-Man? To help you decide, the new edition of Cinefex comes with two spectacular cover options. Subscribers will get Weta Digital’s brooding portrait of the unforgettable ape hero of War for the Planet of the Apes, while newsstand editions feature a dynamic action shot from Spider-Man: Homecoming, courtesy of Digital Domain. Order online and you get to make the choice yourself!

Whichever cover you select, the contents of Cinefex 154 bring you our renowned in-depth coverage of five of this summer’s hottest movies: Spider-Man: Homecoming, War for the Planet of the Apes, Valerian and the City of a Thousand Planets, The Mummy and Pirates of the Caribbean: Dead Men Tell No Tales.

Here’s Cinefex editor-in-chief Jody Duncan to reflect on our August 2017 edition:

Jody Duncan – From the Editor’s Desk

We revisit a lot of old friends in this issue of Cinefex – characters and franchises we’ve covered many times before. But as I read through issue 154, I was struck by how significantly those characters and franchises have evolved through the years. Spider-Man: Homecoming is nothing less than a total reboot of the Spider-Man franchise, and the film boasts an entirely new look, tone and feeling.

We’ve followed the Planet of the Apes saga since Tim Burton’s 2001 remake; and yet, Joe Fordham’s coverage of War for the Planet of the Apes illustrates the leaps and bounds achieved in the creation of ape performances during those 16 years.

We weren’t around to cover the original Universal Mummy pictures, of course, but we were there for the 1999 Brendan Fraser version and its sequels. There, too, the advancements in technology and artistry have ensured that the most recent The Mummy, starring Tom Cruise, has a fresh behind-the-scenes story to tell.

And Graham Edwards’ coverage of Pirates of the Caribbean: Dead Men Tell No Tales celebrates the visual effects heights that have been reached since the franchise began in 2003.

This issue features an off-the-beaten-path newcomer as well – Valerian and the City of a Thousand Planets, from the visionary Luc Besson.

Happy end of summer, everyone. I hope your Spider-Man lunchboxes are filled with your favorite sandwiches, chips and Moon Pies as you head back into the school year!

Cinefex 154 is on newsstands now, and available to order at our online store. If you’re a subscriber, your copy will be swinging into your mailbox very soon. And don’t forget our enhanced iPad edition, featuring tons more photographs – many of them exclusive to Cinefex – and stunning video content.

SIGGRAPH 2017 Production Sessions


The SIGGRAPH 2017 Production Sessions program provides a platform for creative professionals to explain their processes and techniques in the fields of computer animation, visual effects, games, virtual reality, themed entertainment, and the software applications used by the artists who create them. Each presentation ends with a Q&A session that allows attendees to quiz the experts.

New in 2017 is the Production Gallery, featuring motion picture and games artifacts from major studios. Check out the pitch boards from Moana, concept art and maquettes from Cars 3, and costumes from Guardians of the Galaxy Vol. 2. The gallery will also showcase a 25th anniversary exhibition from Sony Pictures Imageworks, with items from films including Spider-Man, Ghost Rider, Men in Black, Ghostbusters and Stuart Little.

Watch the SIGGRAPH 2017 Conference Overview trailer:

Emily Hsu, SIGGRAPH 2017 Production Sessions Chair, commented:

“Since the Production Sessions program began, it has grown and evolved to become an attendee favorite with universal appeal. I am proud to be part of that tradition and to continue expanding the scope of Production Sessions content as well as initiating the all-new Production Gallery. While we have an amazing lineup that retains strong animation and VFX, we are also featuring a VR panel with Google Spotlight Stories and Oculus Story Studio and presenting a look at Blizzard’s trans-media approach in creating Overwatch. Plus, for the first time, we are bringing a live-action TV series to the SIGGRAPH stage.”

Highlights from the 11 SIGGRAPH 2017 Production Sessions include:

The Making of Marvel Studios’ “Spider-Man: Homecoming”

Victoria Alonso, executive vice president of physical production at Marvel Studios, swings into action to explore the visual effects of Peter Parker’s newest adventure. Edwin Rivera, additional visual effects supervisor at Marvel Studios, presents alongside Method Studios second unit supervisor and visual effects supervisor Matt Dessaro, Digital Domain visual effects supervisor Lou Pecora and Sony Pictures Imageworks visual effects supervisor Theo Bialek.

ILM Presents: Behind the Magic, The Visual Effects of “Rogue One: A Star Wars Story”

The visual effects team from Industrial Light & Magic discusses the effects work of the latest Star Wars epic centered on the theft of the Death Star plans by a small team of rebel infiltrators. John Knoll, ILM’s chief creative officer and senior visual effects supervisor, joins forces with lighting technical director supervisor Vick Schutz, CG supervisor Stephen Ellis, digital supervisor Russell Paul and layout supervisor John Levin.

The Making of Marvel Studios’ “Guardians of the Galaxy Vol. 2”

Victoria Alonso returns with a team of intergalactic effects experts to discuss the dazzling visuals of the second chapter in Peter Quill’s cosmic odyssey. Visual effects producer Damien Carr and visual effects supervisor Christopher Townsend explore a universe of wisecracking aliens and sentient worlds together with Weta Digital visual effects supervisor Guy Williams, Framestore visual effects supervisor Jonathan Fawkner and Method Studios visual effects supervisor Nordin Rahhali.

“Valerian and the City of a Thousand Planets”

Sophie Leclerc, visual effects producer of Luc Besson’s epic new space adventure, explores the technologies deployed to create the fabulous universe adapted from the comic books by Pierre Christin and Jean-Claude Mézières. Joining her are Weta Digital visual effects supervisor Martin Hill, ILM CG supervisor Jose Burgos, ILM visual effects art director Christian Alzmann, Rodeo FX associate visual effects supervisor Peter Nofz and Rodeo FX concept artist Olivier Martin.

The annual SIGGRAPH conference is a five-day interdisciplinary educational experience in the latest computer graphics and interactive techniques, including a three-day commercial exhibition that attracts hundreds of companies from around the world. The conference also hosts the international SIGGRAPH Computer Animation Festival, showcasing works from the world’s most innovative and accomplished digital film and video creators. Juried and curated content includes outstanding achievements in time-based art, scientific visualization, visual effects, real-time graphics, and narrative shorts. SIGGRAPH 2017 will take place from 30 July–3 August 2017 in Los Angeles. Visit the SIGGRAPH 2017 website or follow SIGGRAPH on Facebook, Twitter, YouTube, or Instagram for more detailed information.

The Visual Effects of Okja

In the genre-bending animal adventure Okja, a young girl called Mija (An Seo Hyun) helps her grandfather Hee Bong (Byun Heebong) to raise a gigantic super-pig in the wilds of South Korea. When Lucy Mirando (Tilda Swinton), head of the multinational Mirando Corporation, reclaims the genetically-engineered creature for her own self-serving ends, Mija embarks on an epic adventure to rescue her beloved Okja from the conglomerate’s sinister clutches.

Directed by Bong Joon Ho, Okja relies on visual effects to bring its super-sized animal star to life. Visual effects supervisor Erik-Jan de Boer led a team at Deluxe’s Method Studios in Vancouver to create Okja as a living, breathing presence in environments ranging from lush mountain forests to the sunlit streets of New York City and the grim bunkers of the Mirando Corporation’s meat-production line. South Korea’s 4th Creative Party delivered additional visual effects, overseen by visual effects supervisor Lee Jeon Hyoung.

Cinefex spoke with Erik-Jan de Boer – who in 2013 won an Academy Award for his work on Life of Pi – about what it took to put Okja on the screen, and how the team helped the cast to bond with a creature whose performance would only be completed months later in post-production.

An Seo Hyun stars as Mija in “Okja,” directed by Bong Joon Ho and with visual effects by Deluxe’s Method Studios and 4th Creative Party.

The design of Okja is central to the film. How did her appearance develop from the original concepts through to how she actually looks on screen?

When Bong and I first met at the end of 2014, he had some really detailed concept work of Okja, but he wasn’t ready to share his script yet. So I found myself looking at this weird creature, but not knowing what the story was going to be about. The script followed up pretty soon and so then we got a good idea of what she had to do.

Hee Chul Jang was the creature designer. He designed the initial concept and also sculpted a 3D model of Okja – it was a posed, asymmetrical maquette, only about 5 inches long. We used that as an initial object to scan, and then once she was in our virtual world we built in the symmetry and started to play a little bit more with her proportions and her features.

Deluxe’s Method Studios developed a fully digital Okja asset based on concepts by creature designer Hee Chul Jang.

What were some of the specific design details that you worked on?

We did some rounds on the size of the ears. We played around with the toes, so that we could have some good excuses to see shape changes and really sell the contact with the ground. In the original design she had some pretty well-defined lips, but we found that the way they would pick up the lighting made her too cartoonish. So we went to a more canine, jowly setup. Inside of the mouth we did quite a lot of work to make the teeth more appealing.

The main thing that we added was short fuzz and hair that made her softer and more feminine – not full fur, but we definitely wanted some excuse to break up the highlights and get some nice rim lighting on her.

You can see that in the closeups, especially when the various characters are touching Okja.

Exactly. We really needed that as an additional tool to sell that connection and that physicality. Also, if we didn’t have that hair, the renders that we did on the big closeups were plasticky, or too nude.

The final design of Okja referenced elephants, hippos, pigs and manatees. The visual effects team sculpted an animal that would be not only plausible as a genetically-engineered meat-producer, but also appealing to movie audiences.

In the film, Okja is described as a ‘super-pig.’ Did you use one particular animal for design reference?

She’s a hybrid animal in the movie, and a hybrid animal in our production process as well. We used elephants, pigs, hippos. Bong had a fascination with manatees for the skin qualities, so we used that in look development, and also in terms of the sadness in the face.

Okja is constantly interacting with the people around her – especially Mija. Was that close contact important to help audiences buy into her reality?

When I read the script, I realised that the only way for us to sell the relationship that Mija and Okja have was to not avoid the notoriously difficult work in CG, but to just embrace it. We had the normal ground contact – brushing against bushes and all that stuff – but in almost every shot we had someone put their hands on Okja, whether it was tender contact from Mija like a soft tug on her ear or the hand sliding over her skin, or really violent pushing and shoving by six team members at a time. All of that had to be choreographed and designed in such a way that we could put our CG pig underneath that.

Throughout the film, Okja enjoys close contact with the human characters around her, especially Mija. On set, a range of Okja ‘stuffies’ gave the actors a physical presence to perform against.

So how did you simulate Okja’s presence on set?

Before we started shooting, I realised that aspect of the project was something where we could really add to the storytelling. I wanted to make sure that you really believed that Mija and Okja were there in that same space. So we built a series of props or ‘stuffies’ – at the end we had about 25 of them. We designed them in Maya, and then the low resolution models were shipped to a company in Seoul called Cell – they unfolded them and laser cut them out of flat sheets of EVA foam. When they got these huge panels back to their workshop they glued them together to match our 3D models. That gave us a great fidelity between our model and the final props that we used on set, but also a way to quickly prototype things and make sure that we got them right. We started with a warehouse full of Okja pieces all looking pristine and pretty, and then by the end of the trip when these stuffies had traveled from Korea to New York to Vancouver, they were beaten up pieces of crap!

Did you have different stuffies to represent different parts of Okja’s body?

Some of the stuffies were very generic and were our go-to props for a lot of the work. One of our workhorses was a piece of chest connected with a metal spine to a piece of the butt. That allowed us to pull at the neck and push at the butt – we called it the ‘push-pull’ rig. Then we had several versions of the head – one was more built-out and had velcro ears that we could attach and detach, while another was heavier so we could apply proper forces banging into people and objects. Other stuffies were really light so we could run around with them for choreography. Some were one-offs, custom built, like for the shot where Mija is inside Okja’s mouth brushing her teeth, or sleeping on top of her.

Jake Gyllenhaal stars as television zoologist Doctor Johnny, sent by the Mirando Corporation to bring Okja from her South Korean home back to New York City.

Jake Gyllenhaal stars as television zoologist Doctor Johnny, sent by the Mirando Corporation to bring Okja from her South Korean home back to New York City.

Who puppeteered the stuffies on set?

I really wanted us to be there as acting partners for An Seo Hyun, and to help her deliver the best performance possible. So I brought on set my animation supervisor from Method Studios, Stephen Clee. He’s a great animator, and he’s also a kick-boxer and the sweetest guy you can meet. By putting him always in front of the camera with Mija, we built the relationship there between her and the stuffy and Steve that allowed them to get in the zone – really to get to a level that I have never seen before. Of course, she’s a great actress and did an amazing job, but I feel that by always explaining to her what we were trying to do and what Okja was feeling, and building a trust with her by always rehearsing the more tricky stuff, we built a strong workflow that really shows up in the final product.

All of that must have made for some memorable moments.

At one stage, we were rehearsing at the special effects company’s courtyard a few miles from the North Korean border. I was watching Stephen stick his hand up an Okja tongue glove that we had designed, and he was using it to lick Mija’s stunt double, while American F-16s were busting overhead because there was heightened tension with the North Koreans. It was probably the most surreal situation I’ve ever found myself in professionally!

Did you use a greenscreen stage for any of the Okja scenes?

We shot as much as possible in situ. We didn’t do any greenscreen work, except for one element that was shot for Mija riding on Okja in the traffic tunnel. That was shot on a huge pogo stick that we built. We didn’t go for a motion base because I really wanted the right hang time and momentum. I wanted the physicality and the percussiveness of that to be as strong as possible.

For scenes of Okja careering through an underground shopping mall, special effects rigged the set with breakaway gags. Erik-Jan de Boer’s visual effects team timed the animation of the digital Okja to synchronise seamlessly with the practical mayhem.

There are scenes where Okja has a big physical impact on the environment around her, like the chase through the underground shopping mall.

Yeah, that was spectacular filmmaking. It was a night shoot with hundreds of extras in this underground mall. We had to make sure that everybody was looking in the right direction, that whatever we broke was rigged to be reset and done again. Then we went to a stage where we rebuilt some of that shopping mall to do the final crash where Okja slides into one of the stores and comes to a halt.

How did that work? Did the special effects team smash up the set ready for you to add your CG Okja?

Well, the special effects department kept asking me what I needed for that shot of the crash. I said, “That’s up to you guys – it just needs to look like this six-ton animal is sliding into it.” We showed them previs of how we envisioned her rolling over and what angle she was coming into the store, but again they came back and asked what we needed. I said, “What you should probably do is just drive a minivan into the store and spin it round.” Two days later, they came to me and asked me what colour I wanted the minivan! So that’s what we did. We actually drove a small minivan into the store and replaced it with Okja. Stuff like that was just hilarious to do.

Animators at Method Studios Vancouver developed a range of behaviours for Okja, many based on the playful antics and soulful demeanour of dogs, especially beagles.

When the time came to add your CG Okja to the live-action, what sort of movement reference did the animators use?

From an emotional point of view – or even intellectually – dogs were our best source for the performance. Specifically, there was a beagle that we have around our house a lot – it’s the animal of a friend of ours – and for me beagles were the perfect translation of that huge pig, proportionally and for the ears and the big, droopy eyes. I had a lot of fun studying him, and applying on to Okja the small traits and animalistic behaviours that I picked up.

Technically speaking, how did the digital Okja asset function?

We took the usual approach where you get your skeleton, your muscles, you do your cloth simulations, and you build a hybrid of all these sims into a final skin product. Edy Susanto Lim, our creature supervisor, built a very efficient rig that had a lot of really advanced technology in it. If you look at Okja’s armpit and groin, for instance, those areas are better resolved than I’ve seen on any other CG animal. It’s really due to his work that it looks so sophisticated.

But the main difference – and this was a mandate from the start – was that we built in a level of art directability. It was really important for me to be able to look at the final animation, and then buy ourselves enough iterations in the tech animation stage to make the skin look as good as possible. Because we were dealing with an animal that was engineered to produce a lot of pork, I wanted to make the musculature as luxurious as possible. I’m very proud of the tech animation work – there’s a reality or an organic expressive quality to Okja’s skin that I do think is pushing the state of CG work further along.

How did the animators work with the Okja asset?

It’s all keyframe animation because you cannot motion capture an animal like that – well, I guess you could do some fun stuff with a hippo, but I don’t think anybody has ever done it, and with the scenes being so specific you wouldn’t get much mileage out of it anyway. We had a team of about 25 animators, and we created about 40 minutes of total Okja screen time. Our average shot length was well over nine seconds. We had shots that were close to a minute long, and a lot of them that were 30 seconds. With all the contact and physicality that we had in these shots, it was a tough nut to crack for our animators, and they did a great job.

Keyframe animation drove Okja's performance. Effects simulations built into the digital creature rig ensured the realistic movement of her musculature beneath a taut skin. The rig added an extra layer of control by allowing the animation team to art direct muscle movement to a fine degree.


On top of the character animation, you talked about art directing the actual muscle movements. How did you do that?

We used an approach that I have used before, where we run a script to do the initial flexing on all the muscles – like a procedural way of getting the initial firing going. Then, both in animation and tech animation, we had the ability to put multipliers on specific muscles or muscle groups – we could vary the timing and the flexing pattern of them as well. So we could go in and multiply and offset and tweak to get it the way that we really liked it.
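The multiply-and-offset idea translates naturally into a few lines of code. Here is a minimal, purely illustrative Python sketch of per-muscle multipliers and timing offsets layered over a procedural flex curve – all names are hypothetical, and this is not Method Studios' actual rig code:

```python
def flex_curve(t):
    """Procedural baseline flex for a muscle at normalized time t.
    A simple smoothstep ramp, clamped to the 0..1 range."""
    t = max(0.0, min(1.0, t))
    return t * t * (3 - 2 * t)

def muscle_flex(t, multiplier=1.0, time_offset=0.0):
    """Apply an artist-set multiplier and timing offset on top of the
    baseline, so a muscle can fire earlier, later, harder or softer
    than the procedural default."""
    return multiplier * flex_curve(t - time_offset)

# Per-muscle overrides: (multiplier, time_offset). A negative offset
# makes the muscle fire before the limb starts to move, as real
# muscles do; a multiplier above 1.0 exaggerates the flex.
overrides = {"biceps_L": (1.4, -0.1), "trapezius": (0.8, 0.0)}
pose = {name: muscle_flex(0.5, m, off) for name, (m, off) in overrides.items()}
```

In a production rig the curve would be driven by the simulation rather than a smoothstep, but the principle is the same: a procedural first pass, then sparse artist overrides rather than hand-animating every muscle.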

In real animals, muscles fire before the limbs actually start to move, don’t they?

Yes, and our flexing script took all of that into account in our more procedural initial pass. The main choreography would be dialled in pretty early in the animation stage, so very often we would start running some sims just to see how the rig would respond to a specific action. We had a very robust rig – the first sim passes were always pretty successful from the start – and that put us in a position to really fine-tune it. With shots that are on the screen for a minute, your eye just starts enjoying the flexing and the spasms and all the little accidents that happen underneath the skin.

There’s plenty of broad action in the film, but also a lot of extremely subtle character animation. Some of Okja’s movements are almost imperceptible, yet she always feels alive.

Well, very often subtler is better. The animation team went to the zoo in Vancouver and had an opportunity to touch real hippos. I asked them to focus on the tension that the ribcage and the organs put on the skin, and how little deformation and friction you get when you touch a tense skin like that. We did the same with pigs. It showed us that you get very little compression on the skin unless you push really hard. Most of that contact we had to sell with shadow and lighting integration. I think that’s why some of those more intimate moments worked – because we restrained ourselves.

An Seo Hyun stars as Mija in "Okja," directed by Bong Joon Ho and with visual effects by Deluxe's Method Studios and 4th Creative Party.

Later in the film, Mija faces the dark truth behind Okja’s genetically-engineered heritage, as the Mirando Corporation reveals its plans for Okja and thousands of super-pigs like her.

During the film’s final act, the tone gets quite dark as we see the facility where thousands of super-pigs like Okja are being corralled ready for slaughter. Did visual effects create that whole environment?

Yeah, I would say that most of those shots are 90 percent CG. Bong and Darius Khondji, the director of photography, found a location in the middle of Korea, but really it was just a sloping field. We built the ramp that led up to the slaughterhouse and used a piece of the road. We had six to eight fence pillars, but all the wires and signage and everything else was extended. That could have been built on any stage, really, but what we did get was a grittiness and a mood – an emotional connection to those shots that helped bring it to where it needed to be.

Bong gave me some reference frames of a herd of hippos standing in a wide river with caked mud on their backs. Some were wet, some had dried out, and the light was playing on their backs like a sort of hilly terrain. The graphic nature of that was something that really appealed to him. I told the animators that this was a Sophie’s Choice moment, and that we were really looking for that sort of concentration camp feel. I’m proud of the fact that the end result has an organic feel to it, and really feels painterly.


Mija stops at nothing in her quest to rescue her beloved Okja from the sinister clutches of the Mirando Corporation.

How did you go about filling all the pens with super-pigs?

We didn’t use any off-the-shelf crowd systems because I felt that would just overcomplicate things. We also needed to render at 4K. So, we built this little parallel pipeline that allowed us to use vrmesh files and leverage V-Ray’s ability to handle a lot of these objects efficiently. That worked out really successfully, but at 4K some of the render times were still crazy long and really filled our render farm.

How many pigs did you squeeze into those big wide shots?

I think the maximum was 16,000.

That’s a whole lotta pigs.

Yeah! We could have pushed that number technically, but the hilly slope helped us keep it down to that level. That was good!
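The proxy-based approach described above – one heavy mesh on disk, thousands of lightweight transforms referencing it – can be sketched in generic terms. The following Python is purely illustrative, with hypothetical function and file names, not the actual Method Studios pipeline:

```python
import random

def scatter_pigs(proxy_path, rows, cols, spacing=2.5, jitter=0.6, seed=1):
    """Lay out super-pig instances on a jittered grid. Every instance
    references the same proxy file, so only the transforms are unique --
    the renderer loads the heavy geometry once and reuses it."""
    rng = random.Random(seed)
    instances = []
    for r in range(rows):
        for c in range(cols):
            instances.append({
                "proxy": proxy_path,                 # shared .vrmesh-style file
                "translate": (c * spacing + rng.uniform(-jitter, jitter),
                              0.0,
                              r * spacing + rng.uniform(-jitter, jitter)),
                "rotate_y": rng.uniform(0.0, 360.0),  # random heading
            })
    return instances

# A 100 x 160 layout gives the 16,000-pig maximum mentioned above.
herd = scatter_pigs("superpig_proxy.vrmesh", rows=100, cols=160)
```

The fixed seed makes the scatter deterministic, which matters in production: the same herd layout has to come back frame after frame and render after render.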

What are your final thoughts on the film, now that the world has finally been introduced to Okja?

Well, it was just a blast, really. I had a great team at Method Studios in Vancouver – everybody’s heart was really into this project and I think that shows in the work. And working with Bong was just such a pleasure. He had great respect for us in visual effects – the trust that we had from the start created the perfect workflow in terms of building this creature. I’d love to do it again!

Watch a trailer for Okja:

Okja is a Plan B Entertainment, Lewis Pictures and Kate Street Picture Company production in association with Netflix. Okja is currently on release in selected US theaters, and available to stream on Netflix.

cineSync and the Global Roundtable

Giant pachyderms prowl ancient Britain in "King Arthur: Legend of the Sword." Photograph copyright © 2017 by Warner Brothers Pictures.


Like many industries in the 21st century, visual effects is a worldwide operation. Pick any populated spot on the globe and chances are you’ll find a visual effects facility somewhere at the same latitude. The biggest companies of all maintain offices right across the planet.

But while crossing timezones delivers some amazing efficiencies – and allows productions to access lucrative subsidies – the distances involved are daunting. How do directors critique shots with visual effects supervisors when they’re on opposite sides of the world? And how do the supervisors then communicate notes to multiple vendors scattered across many continents?

At its simplest, the answer is video conferencing. But the very specific demands of motion picture visual effects mean that it’s not enough just to dial up your colleagues on Skype. High on the list of preferred alternatives is Cospective cineSync, which allows remote participants to watch high-resolution video in perfect sync, and includes a range of drawing tools for making notes directly on individual film frames. Robust encryption ensures the whole process conforms to strict studio security rules.

Recent multi-vendor productions that used cineSync include Guardians of the Galaxy Vol. 2 and The Fate of the Furious, both of which are covered in-depth in our June issue, Cinefex 153. The same software facilitated the visual effects work on Warner Brothers Pictures’ King Arthur: Legend of the Sword. Directed by Guy Ritchie, the film stars Charlie Hunnam as the young Arthur, a streetwise monarch-on-the-rise fighting to save Dark Ages Britain from the evil Vortigern (Jude Law) and a host of fantastical foes.

Charlie Hunnam stars as a streetwise savior in "King Arthur: Legend of the Sword." Photograph copyright © 2017 by Warner Brothers Pictures.


“I came onto King Arthur in 2014, working with visual effects producer Alex Bicknell and visual effects supervisor Nick Davis, who I’d worked with on Edge of Tomorrow,” said visual effects production supervisor Gavin Round. “Thanks to that experience, we had an established, effective workflow in place. My duties involved managing vendors, and making sure that the shots came in on time and that the vendors had everything they needed. cineSync enabled us to review the material constantly, so we were always aware of the status of any given shot. We could see it in real-time to discuss with the vendors.”

The London and Montreal teams at Framestore led the visual effects charge on King Arthur: Legend of the Sword, supported by a horde of vendors that included MPC, Method Studios, Scanline VFX, Nvizible and One of Us. For sequences that required Framestore and MPC to share shots portraying a particularly menacing digital character, access to a global roundtable proved essential.

“It was a delicate process, as we had to maintain continuity between the two vendors, who were essentially building different parts of the same being,” Round noted. “We needed to constantly review and check the material back-to-back to ensure everything transitioned correctly, no matter which vendor it came from. Nick liked to do cineSync sessions because he could pull up a shot, make marks on it, draw on it and tell the artists exactly where he wanted a creature to walk.”

Watch the trailer for King Arthur: Legend of the Sword:

Adelaide-based Cospective – originally known as Rising Sun Research – was founded in 2000 as a spin-off from visual effects studio Rising Sun Pictures. Developers Tony Clark, Alan Rogers, Neil Wilson and Rory McGregor received a Technical Achievement Award for cineSync at the 2010 Academy of Motion Picture Arts and Sciences SciTech Awards.

Outpost VFX Dives 47 Meters Down

47 Meters Down - Outpost VFX

Steven Spielberg’s tense 1975 thriller Jaws has a lot to answer for. Not only did it help set the mold for the summer blockbuster, but it also secured the place of the great white shark as cinema’s greatest underwater nemesis.

Jaws featured a notoriously temperamental mechanical shark built by special effects expert Robert A. Mattey – plus a handful of real critters photographed by shark experts and filmmakers Ron and Valerie Taylor. Sharks splashed back onto the big screen in 1999 with Renny Harlin’s Deep Blue Sea, for which Walt Conti fashioned three full-size animatronic beasts and visual effects supervisor Jeffrey A. Okun and Hammerhead Productions led the push to realize the film’s genetically-enhanced predators in digital form.

More recently, Stockholm-based Important Looking Pirates created the voracious sharks seen in Joachim Rønning and Espen Sandberg’s Kon-Tiki, and worked alongside a host of other visual effects vendors to bring to life the relentless Carcharodon carcharias of Jaume Collet-Serra’s The Shallows.

The newest sharks on the block are those seen in 47 Meters Down, in which hordes of great whites attempt to wrap their teeth around sisters Lisa (Mandy Moore) and Kate (Claire Holt). Trapped in an underwater cage with only one hour of oxygen left, the siblings engage in a battle for survival as the sharks lay siege to their fragile aquatic fortress.

Watch a video of Outpost VFX’s work on 47 Meters Down:

Outpost VFX was sole visual effects vendor on 47 Meters Down. Director Johannes Roberts worked with Outpost’s on-set visual effects supervisor Sean Mathieson, and visual effects supervisor Mark Gregory oversaw 426 shots at the company’s seafront offices on the UK’s south coast. The work included simulating ocean environments in Side Effects Houdini and tracking them to live-action plates, deploying digi-doubles of the main actors and a CG cage, and – of course – creating the all-important school of digital great white sharks, sculpted in Pixologic ZBrush and animated in Autodesk Maya.

47 Meters Down - Outpost VFX

Outpost VFX owner Duncan McWilliam, who was also executive producer on the film, commented:

“47 Meters Down was a real breakthrough opportunity for Outpost VFX to show what we are capable of producing. Being the sole vendor on the show across such a broad range of VFX disciplines was fantastic for our team to really flex their creative and technical muscle. This show really accelerated the development of our in-house proprietary tools and pipeline and now gives us a great calling card for creature and environment work across VFX-heavy shows.”

Outpost VFX recently worked on shots for Daniel Espinosa’s Life, covered in-depth in Cinefex 153, out now.

Alien Memories

The xenomorph returns in Ridley Scott's "Alien: Covenant"

Back in 2014, I marked the 35th anniversary of the release of Alien by blogging about my love for the film. Three years on, I’ve just completed work on our upcoming magazine article covering Alien: Covenant, the latest film in the spine-tingling sci-fi franchise. The film hits US theaters today, and you’ll be able to read our in-depth behind-the-scenes story in Cinefex 153 — the new issue is out in June and available to preorder right now.

While writing the article, I spoke at length with the key supervisors who worked on Alien: Covenant in the visual effects, creature effects, and special effects departments. At the end of each interview, I asked everyone the same question: “What are your memories of seeing the original Alien for the first time?”

You see, I had a hunch that most people just can’t shake off the effects of early exposure to Ridley Scott’s classic horror flick. We never quite recover from what we see in the shadows as a kid, right? And facehuggers do have a tendency to cling.

Was my hunch right? I’ll let you judge for yourself …

Director Ridley Scott on the set of "Alien: Covenant"


“I remember being a small kid, watching Alien on a tiny TV in my room at night, and being totally overwhelmed by it. There was nothing else like it at that point. After watching lots of Star Trek, watching Alien I felt like this must be real. It’s so unique, and very powerful. It’s been amazing working with Ridley on another one.”
Charley Henley — production visual effects supervisor

“I did science at university, but I also always did sculpture. Until I was 25 or 26, it hadn’t dawned on me that there was this job out there which suited me. The only book about films I had from my childhood was about Alien — I had Giger’s book up on my shelf — and that was the only reason I got into this business. To actually end up doing Alien: Covenant was quite unique, quite special.”
Conor O’Sullivan – creature design supervisor

“Alien was just shocking. It was so out there, so new, and frightening — a proper horror movie. I don’t think there’s anybody who can do it better than Ridley. It’s his baby. He thinks the alien is beautiful, you know.”
Neil Corbould — special effects supervisor

“Alien was one of the reasons why I wanted to make creatures and makeup effects. The chestburster scene with John Hurt — when we first saw it, it was like nothing we’d ever seen before. It was brutally intense, and beautifully done. Right before we started Alien: Covenant I watched Alien again, and I remember coming into work and just going, ‘Shit, how are we ever going to compare to this?’”
Adam Johansen — creature effects supervisor

Tennessee (Danny McBride) and Daniels (Katherine Waterston) go up against the ultimate foe in "Alien: Covenant"


“I was quite young when I watched the first Alien. They put such care and attention into this futuristic environment that felt at the same time very lived-in. The camerawork moving through the corridors at the beginning — everything feels quite pristine, but there are touches like someone left something hanging on the door. And the chestburster scene, of course — that goes without saying!”
Ferran Domenech — visual effects supervisor, MPC Montreal

“My memories of Alien are getting my hands on it on VHS when I was too young to watch it. I was at my mate’s house. I was scared shitless!”
Ben Jones — visual effects supervisor, MPC London

“I rewatched Alien at the start of this project, just to familiarize myself again. I think what was so strong about the original Alien was the use of not seeing the alien. We wanted to make our creature work scary by not revealing too much too quickly.”
Christian Kaestner — visual effects supervisor, Framestore, Montreal

“Alien scared the living daylights out of me. I remember not long afterwards going to see Aliens. It was a midnight screening — probably not the best time to go, coming out at two o’clock in the morning! It’s been an honor to work on Alien: Covenant — kind of a dream come true. Our artists were literally queueing up to come and work on it. We had to turn away so many people.”
Stuart Penn — visual effects supervisor, Framestore, London

“I was over at my friend’s house and he had Alien on videotape. We weren’t allowed to watch it — we would have been pretty young — so we sneakily put it on when his mum and dad were out. I remember being totally freaked out by it — mainly the chestbursting scene. I’d never really watched a horror movie. I’d seen some black and white Quatermass stuff, but I hadn’t seen anything as graphic as that. It scared the crap out of us!”
Paul Butterworth — visual effects supervisor, Animal Logic

(L-R) Amy Seimetz (Faris), Benjamin Rigby (Private Ledward) and Carmen Ejogo (Karine) in ALIEN: COVENANT

Faris (Amy Seimetz) and Karine (Carmen Ejogo) try to save Ledward (Benjamin Rigby) in “Alien: Covenant”

“Alien is one of the reasons I got into visual effects. I loved the spaceship interior. It wasn’t clean. It had wear and tear. It felt lived-in. It felt like there were stories and experiences that you weren’t aware of, but that you could imagine. That’s definitely what inspired me, and what’s kept me in visual effects.”
Brendan Seals — visual effects supervisor, Luma Pictures

“When we were little, our babysitter took me and my brother to see Alien at the cinema. She covered our eyes for the horrible bits, but it still had a huge impact. It’s why I got into the industry — it’s one of those films that really influenced me. So to work alongside Ridley Scott has been a huge honor.”
Adam Paschke — visual effects supervisor, Rising Sun Pictures

“My dad let me watch a fair amount of movies that I wasn’t supposed to when I was growing up, but Alien was not one of them. One of my close friends in college found out that I hadn’t seen any of the films. He was a huge superfan and had the whole quadrilogy, so we started watching through all of them, one movie every night. It was kind of fun to watch them sequentially like that. It was awesome.”
Jim Gibbs — visual effects supervisor, Atomic Fiction

“I was at college in 1979. I went to see Alien on my own, in the evening. It was back in the day when you had B-movies, so there was a support feature about a woman alone in a house. She eventually got out and into her car, and then this person came up behind and cut her throat — so I was already kind of pumped! Then I watched the film, which was absolutely incredible. I was so scared that I ran all the way back to college.”
Paul Round — visual effects supervisor, Peerless Camera Company

What are your memories of seeing the original Alien for the first time?

Do you remember the moment you first saw the alien derelict looming out of the mist? How about the scene where Kane loses his lunch in the worst possible way? Or Ripley singing You Are My Lucky Star while the big fella bares his fangs? The comments box is open — now it’s your chance to reminisce.

Robert Stromberg Q&A — “Raising a Rukus”

Raising a Rukus by VRC

In the animated adventure Raising a Rukus, feuding twins Jonas and Amy receive an unexpected birthday surprise in the form of Rukus – a magic dog who transports them to a fantastic prehistoric realm. During their adventures, they encounter dinosaurs galore, learn some fascinating facts about bioluminescence, and emerge with a new appreciation for the value of getting along.

More than just a cartoon, Raising a Rukus is a series of virtual reality family adventures produced by The Virtual Reality Company. The first episode – which features an innovative branching narrative – will debut at the flagship IMAX VR Centre in Los Angeles on May 19, marking the first original VR production to premiere through IMAX VR. Raising a Rukus is available for the Samsung Gear VR on the Oculus Store and will roll out on major VR platforms, mobile and premium HMDs through 2017.

Watch the trailer for Raising a Rukus:

At IMAX VR centres, audiences are seated in a virtual reality motion chair that incorporates a premium virtual reality head-mounted display, providing an experience similar to a theme park attraction. IMAX Chief Business Development Officer Robert D. Lister said:

“We’re excited to partner with VRC – which brings an immense amount of creative talent and expertise – to premiere Raising a Rukus at our IMAX VR centres. This family-oriented fare is becoming increasingly important as we are seeing visitors of all ages come through our successful flagship centre in Los Angeles.”

Directed by Josh Wassung, co-founder of previsualisation studio The Third Floor, Raising a Rukus features an original score by composer James Newton Howard and an immersive soundtrack mixed at Skywalker Sound. Guy Primus, co-founder/chief executive officer of VRC commented:

“VRC has brought together the best artists, storytellers, filmmakers and technicians who are working to create impactful and immersive VR experiences that will bring people face-to-face with imagination. Raising a Rukus is one of many milestone VR experiences we will announce and release in 2017. This is the premium VR experience that people have been waiting for, and we know they will be thrilled.”

Twins Amy and Jonas just found a dog - or did he find them? Either way, Rukus has a magical secret – he’s about to take them to a whole new world. “Raising a Rukus” is a first of its kind animated VR experience from the Academy Award winner Robert Stromberg and The Virtual Reality Company. (PRNewsfoto/The Virtual Reality Company)


Cinefex spoke with Raising a Rukus producer Robert Stromberg, Academy Award-winning director and co-founder/chief creative officer of VRC.

Cinefex: Where did the idea for Raising a Rukus come from?

Stromberg: When we started VRC, we wanted to focus not just on interesting ways to tell stories, but on targeting the family audience. We wanted to make something that anyone from 10 to 80 years old would enjoy. We started compiling ideas for what that could be, and Raising a Rukus came out of that.

Cinefex: Steven Spielberg is on VRC’s board of advisors. What did he bring to the party?

Stromberg: There’s a magic that happens when creative people like Steven and our team talk, so having his words of wisdom was extremely valuable. Just having that pedigree of creativity to bounce off gives it a unique signature.

Cinefex: Did you go through the same creative development process that you would for an animated film – concept art, storyboards and so on – or did VR introduce any new steps?

Stromberg: Well, what we’re doing is cross-pollinating specialists from the gaming world, and specialists from film. So yes, the approach was to create artwork and storyboards, to define the look as we would in preproduction on any film. Then, we built assets and all of the elements that we needed to run at 90 frames per second in a game engine. The challenge really has been to make both of those worlds work together – cinema and gaming. I think we’ve accomplished that.

Cinefex: What about the director’s role?

Stromberg: The special challenge with VR is: how do you keep the focus where it should be? That’s more of a directing thing. You have to create not just visual cues but also audio cues that direct the viewer to where they should be looking. If the story and what’s happening in front of them is engaging enough, hopefully people won’t be compelled to look behind them for no reason.

Cinefex: I guess the design of the world is integral to that.

Stromberg: Yes, it’s like doing a matte painting, knowing how to direct the viewer’s eye into the composition. I think a lot of those techniques that I used to use for matte painting and visual effects in general apply here. The psychological tools that we use in traditional cinema are still valid.

Cinefex: Only this matte painting surrounds you completely.

Stromberg: Right. In that regard there were elements and techniques we learned from Avatar. When we were creating those 360-degree worlds, I would go into that environment with a virtual camera – almost like a virtual location scout – and compositionally place the assets and elements. I moved through the world before we actually added the animation and the characters, to make sure that it stayed visually appealing as you went down the path.

Cinefex: Did you use similar techniques to previsualise the world of Raising a Rukus?

Stromberg: We did previs with rudimentary assets, as you would on any film. Then we took that into the headsets, retextured everything and bumped it up to a higher resolution. Previs techniques work well in VR.

Cinefex: So all that movie experience comes in handy when you’re doing VR?

Stromberg: There are so many elements from traditional cinema – including visual effects – that apply themselves well to creating something in VR. I think the only unique difference is that we have to cover ourselves more, because it’s 360 degrees. Other than that, all of the old techniques work really well.

"Raising a Rukus" debuts at the Los Angeles IMAX VR Centre on May 19 - the first original VR production to premiere through IMAX VR.


Cinefex: Tell me about the branching narrative in Raising a Rukus.

Stromberg: At one point in the story, Jonas and Amy get split up. If you’re looking at Jonas, you will go down his path. If you’re looking at Amy, you will go down her path. So if you rewatch it, you can watch the other path and see the different problems that they face along the way, other dinosaurs that they run into and stuff like that. The branching narrative angle is very powerful, and this is the first of its kind. In future episodes or other projects that we do you’ll see many more branching narratives. The repeatability of experiencing the same story from a different angle is very important to us.
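Gaze-driven branching of this kind boils down to comparing the viewer's look direction against each character's position at the moment the story splits. Here is a minimal, hypothetical Python sketch of the idea – illustrative only, not VRC's actual engine code:

```python
import math

def pick_branch(view_dir, targets):
    """Return the name of the target whose direction best matches the
    viewer's gaze, i.e. the largest dot product of unit vectors."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    gaze = norm(view_dir)
    scores = {name: sum(a * b for a, b in zip(gaze, norm(d)))
              for name, d in targets.items()}
    return max(scores, key=scores.get)

# At the split, Jonas is off to the viewer's left, Amy to the right.
branch = pick_branch(view_dir=(0.7, 0.0, 0.7),
                     targets={"jonas": (-1.0, 0.0, 1.0),
                              "amy": (1.0, 0.0, 1.0)})
# branch == "amy": the viewer is looking to the right at the split.
```

In a real experience the engine would sample the headset's forward vector over a short window rather than a single frame, to avoid a stray glance deciding the whole branch.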

Cinefex: This first episode of Raising a Rukus runs for 12 minutes. What made you choose that length?

Stromberg: We wanted something that would be friendly to younger audiences, and we didn’t want to overburden people with an enormous amount of time in the headset. We did elaborate testing with families and kids, and this amount of time seemed to be the sweet spot. Everybody who watches it says the same thing: “Wow, that didn’t feel like 12 minutes – it felt like five minutes.” The subject matter is always moving and energetic, so you get swept up in the story and lose track of time.

Cinefex: The VR industry is evolving at a breakneck speed. Where do you see Raising a Rukus sitting in the context of everything else that’s going on?

Stromberg: In terms of what we’re trying to do at VRC, honestly, I kind of look at this as our Steamboat Willie – although I’m not for a minute trying to say that we’re Walt Disney. What’s interesting is that Disney showed that this art of animation could be a vehicle to tell stories, and a powerful one. Then it just grew from there, and the ability to tell stories in animation just went through the roof.

Cinefex: So it’s achieved what you wanted it to achieve?

Stromberg: I think Raising a Rukus is a milestone for VRC. It’s the first project where I feel like we’ve succeeded in doing all that we set out to do, which was to tell a story in a unique way, in a brand new medium. I grew up watching cartoons and animated films, and now for the first time I feel like I’ve actually been immersed in one. Hopefully we’ll start to see many things evolve out of this: longer VR experiences, cinematic events in VR … I think the door’s wide open.

Cinefex: Robert Stromberg – thank you.

IMAX will also roll out VRC’s Raising a Rukus to its IMAX VR centres set to open in New York City, the UK, and other locations worldwide in the coming months.

Images courtesy of PRNewsfoto/The Virtual Reality Company. Special thanks to Jeff Fishburn. Article updated 05/26/17.

The Future of VR — A Roundtable Discussion

Cinefex VR Roundtable

Earlier this year, Cinefex published The Dreamsmiths Unleashed, an in-depth look at the current state of play in the virtual reality industry. In the course of writing the article, we spoke to over 20 VR professionals and amassed around 80,000 words of interview transcript.

Looking back through the wealth of material that didn’t make the article, we thought how great it would have been to get all those people physically round a table together. An impossible task, of course. But maybe there was a virtual solution …

And here it is. The following conversation is made up of choice extracts from all those interviews, woven together to create something unique: a virtual roundtable discussion attended by our own special assembly of VR dreamsmiths.

CINEFEX: So what is virtual reality?

MATTHEW GRATZNER: Virtual reality is an immersive experience that you can’t get in any other form of media. I view it as sort of an amalgamation of cinema and theater.

PATRICK MEEGAN: I agree it’s like theater in that everyone on stage has to be active in the scene, there is no framing out someone and then cutting back. That is pretty critical to virtual reality capture, because you’re doing much less editorial and everything can be seen.

CINEFEX: Cinema … theater … what about games?

LOGAN BROWN: Video games provide huge insight into how to guide users to interact with the virtual world. However, the gap between gamers and non-gamers can pose problems for developers. How do we make an experience challenging enough for a gamer who is familiar with interactive input, and yet easy enough for a non-gamer to enjoy without frustration?

TOM VANCE: Of course we are going to bring things to the table from theater and film and television, and of course we are going to bring gaming into the picture. The exciting thing is how we turn those things into a new way to tell stories.

CINEFEX: So it’s all of these things combined?

ARUNA INVERSIN: Well, I actually think virtual reality is a new medium. It’s not cinema, it’s not television, it’s not the internet — and those are the three main consumption modes that people have right now. And within those modes there are so many avenues other than entertainment — there’s action, interactive, passive, live-stream, educational, medical … these are avenues that haven’t even been tested yet in virtual reality.

ROBERT STROMBERG: I agree that it’s a brand new medium, with an unwritten rulebook on how to tell stories. I’m fixated on that particular aspect of it — how to get real actors in real situations, with real emotional scenes.

CINEFEX: You talk about a rulebook. Do the rules of cinema apply to virtual reality?

MATTHEW GRATZNER: Yes, because you are still telling a story. Gamers who put on a headset are going to be very active, obviously, but for most people who watch content, yes, there’s that initial moment where they say, “Oh wow, this is really cool,” but you’ve got this diminishing curve where people then say, “I just want to be entertained.”

ALEX HESSLER: We do have a lot of techniques in cinema that we know how to use to steer the viewer, but unfortunately those techniques don’t generally work in virtual reality. But, there’s a whole batch of other techniques that people are only just discovering, and I do think working in film gives you that mindfulness and that curiosity about observation that is really important for designing virtual reality.

CINEFEX: Is there still a place for traditional job roles — do you still need a director of photography, for example?

BEN GROSSMANN: There is absolutely a role for a director of photography, but it is a completely different thing. It’s funny when we see a director of photography operate in virtual reality for the first time. They are focused on staging the scene in one direction, and they get frustrated when they put somebody else in the headset and the first thing that person does is start looking in every other direction! It’s like going onto a movie set and giving the camera to a 12-year-old child! The hard part is wrangling the audience into looking where you want them to look, in order to catch the part of the story that you want them to catch. You do that by lighting, by sound, by emotional interest — and that language has yet to be codified.

CINEFEX: How do you go about doing that?

SEBASTIAN MARINO: There’s all these rules of virtual reality that everyone seems to want to make, but as far as I can tell, they’re all things you’re not allowed to do. I personally reject that — it’s just horrible. It’s way too early to make a list of things you can’t do. You just have to build a world that is self-consistent. If you do that, I think you can use all sorts of techniques and make something that looks really interesting.

CINEFEX: So can it actually be a hindrance, holding onto the old ways?

ANDREW MCGOVERN: It’s always a learning curve when you move into a new format, especially with virtual reality. At the same time, people love the freedom that it gives.

PHIL TIPPETT: I think it links in somewhat with the creative process. Throughout the course of a career, you build up your skills as a craftsman, and you have a certain way of going about doing things. But then, once you have created something, you kind of need to forget it, so that there’s room for another idea. It’s like intentionally trying to figure out a way of not falling into the franchise trap. The next thing that you do has got to be something worth doing because it’s different.

ARTHUR VAN HOFF: If you get a Hollywood director to do virtual reality, they’re going to use all the same language, and it’s just movies in virtual reality. Then you have some people that go completely crazy — they put actors all around you, and you have to swing your head around like a madman to follow a conversation. Then there’s where I think most of the action is, which is the young, upcoming directors who have something to prove. There are tons of people still at film school, experimenting with GoPros, who are going to be the Steven Spielbergs of virtual reality.

SASCHKA UNSELD: I actually think the longer people have worked in film, the harder it is for them to switch away from the ways of thinking that they have. The younger people are, the easier it is for them just to embrace the newness of this medium. The wave of experiences that we’ll see from people who have virtual reality as their first way of expressing themselves, and not as their second or third … I think that wave will be enormous.

CINEFEX: What about visual effects artists — are they generally well adapted to working in virtual reality?

AMY SMALL: A lot of our people at Framestore have adapted super-well, but they all tend to resonate in different areas, depending on what their backgrounds might be. So we look at a project when it comes in, and then try to match the creative who makes the most sense with the project.

ARUNA INVERSIN: I think some visual effects artists don’t want to go into the 360-degree world. But then, every artist that I bring on is new to virtual reality in some form. Everybody’s learning it for the first time.

CHRIS HEALER: As our own toolset has matured, our artists have all pretty much consistently warmed up to virtual reality. They’ve gotten past the tech and math, and they are now into creative thought. But it took time, and the ones who were cold to the idea needed to warm up by seeing other people’s excitement and other people’s success. It’s cool to see that transition.

LOGAN BROWN: The person who adapts well to working in virtual reality is likely someone who embraces the unknown and is persistent enough to work through setbacks. Everything is so new, and there are so many major areas still unexplored. A strong creative willing to wade into the deep end of bleeding-edge technology can make a huge contribution to the medium.

SEBASTIAN MARINO: I really don’t know how you do virtual reality coming purely from a Silicon Valley background. If you’re not used to dealing with artistic criticism, you’re in for it.

CINEFEX: Is the technology anywhere near mature yet, or is there a lot more innovation to come?

SIMON ROBINSON: We’re trying to improve the number of products that we have that make previewing virtual reality very easy. That’s really just an engineering challenge to make sure that output into a headset is as simple as output to a flat screen. But, we are interested in how toolsets in the future might be completely immersed. I think that’s a fascinating challenge for us — if artists wore the headsets full-time, would that fundamentally change the way they did their work, and would you then have to engineer products for media in a completely different way to the way you do them now?

ROBERT STROMBERG: The equipment will get refined, smaller, sexier. Then there’s the social aspect — the option to experience something where you can look over and see your friend, like you do in a movie theater. I think that’s all going to become very standard.

OLLIE RANKIN: But it’s wrong to assume that the virtual reality headset is the final viewing platform. I think the headset is kind of like the LaserDisc — something that was, at the time, the best way of storing and playing back video but was superseded quite quickly. Just don’t ask me to look into the crystal ball and tell you what is going to replace it!

ARUNA INVERSIN: I think the next big revolution is the software experience. In the next year or two, we’re going to see some really great software that leverages touch controllers, hand controllers and motion controllers. We’re going to see people develop software that allows collaboration in the virtual reality space.

MICHAEL BREYMANN: There are all kinds of analytics and data that can be gathered and processed based on your eye movements, so there is both a scary and wonderful world coming for virtual reality. It’s almost like when you put on sunglasses — you feel a little bit protected because nobody can see where your eyes are, but you have the privilege of gazing wherever you want. In virtual reality you also have that feeling because you’re in your private little box, and you feel safe. But, with sensors and tracking technology you are very much not safe. The things that enables are kind of science fiction.

MATTHEW GRATZNER: Everybody on Wall Street is going to make money trading paper on the newest virtual reality tech, but that is all meaningless if we are not creating content that the general public is going to download and buy. We need to start investing much more heavily into content, otherwise virtual reality will be just a fad, destroyed because it was overly hyped.

CINEFEX: Is that a real risk — that virtual reality is going to crash and burn?

RAY TINTORI: You know, I think virtual reality is the future, but I actually don’t think it’s an entertainment medium. People think it’s going to have a trajectory like 3D movies, but I feel like virtual and augmented reality is more like the internet — there are so many practical applications that it’s just going to become a part of a lot of stuff that we do, in an invisible way.

CLINT KISKER: Yes, in the future, I think virtual reality will be invisible. It will just be a part of the way that we work, consume stories, learn, travel, buy homes. How all of that will work from a consumer interaction standpoint, I couldn’t say. But I do believe that my son and daughter will not see anything unusual about a persistent virtual world that exists attached to their Vive or their PSVR, that they can check in on at the end of the day.

PATRICK MEEGAN: Right. We will eventually be looking at the virtual reality and augmented reality spaces as a single continuum, creating experiences that can accommodate elements of the real world, but then completely obfuscate the real world and transport you into a virtual one.

CHRIS MORLEY: I think that’s a philosophical question for anybody who wants to put on those goggles! It’s a multi-faceted technology. I see use in medical training, in theme parks — you could have a ride but then have different content for different people, so that they can choose what they want to experience. I think it’s going to be everywhere in the end. It runs the gamut.

CINEFEX: So we’ll all soon be fully immersed in a totally fabricated world?

JOHN GAETA: We can talk all day long about super-complex futuristic scenarios, but we’re a long way away from being Neo running through the back streets in The Matrix. But, it’s not impossible to see a dot between now and then. The next step is to figure out what is the right first thing for us to try to do, so that virtual reality is a joyful, positive, enlightening experience. Being in a cinema is a very powerful form of immersion. People remember things that happen in film so strongly that they carry them throughout their whole lives. It’s natural for us to think that way now about virtual reality. We want people to be able to say, “I stood inside this moment, and I saw these characters up close, and it gave me an emotional reaction.” Hopefully, that will happen not too long from now.

CINEFEX: People say that working in virtual reality right now is like being in the Wild West. And here you all are, riding your wagons along the trail. How does it feel?

RAY TINTORI: You know, virtual reality is so funky right now. It’s like using leeches or medieval technology — everything is jaw-droppingly advanced and embarrassingly clunky at the same time. In five years, everything that we’re doing now is going to feel silly, because there’s all this stuff that just hasn’t been invented yet.

ROBERT STROMBERG: Sometimes it feels like we’re in the early days of Thomas Edison or Henry Ford. I know that’s dramatic, but it really is a very inventive time.

BEN GROSSMANN: Well, I have always enjoyed being an explorer, and the landscape of virtual reality is certainly unexplored. There are no rules written, no language that has yet been defined, and no leaders. There are no legendary filmmakers in whose shadow we all stand. So we’re getting back some of that passion that we had in the early days of visual effects, when we were figuring things out for the first time.

Thank you to all the dreamsmiths on our virtual reality roundtable:

  • Michael Breymann — co-founder, Kaleidoscope VR
  • Logan Brown — virtual reality producer, MPC
  • John Gaeta — executive creative director, ILMxLAB
  • Matthew Gratzner — creative director, New Deal Studios
  • Ben Grossmann — chief executive officer, Magnopus
  • Chris Healer — chief executive officer, The Molecule
  • Alex Hessler — virtual reality supervisor, Tippett Studio
  • Aruna Inversin — creative director, Digital Domain
  • Clint Kisker — co-founder, Reality One
  • Sebastian Marino — co-founder and chief technical officer, Evercoast
  • Andrew McGovern — vice president of augmented and virtual reality, Digital Domain
  • Patrick Meegan — creative director, Jaunt
  • Chris Morley — visual effects supervisor, Tippett Studio
  • Ollie Rankin — head of production, Uncorporeal Systems
  • Simon Robinson — chief scientist, The Foundry
  • Amy Small — global head of virtual reality, Framestore
  • Robert Stromberg — co-founder and chief creative officer, VRC
  • Ray Tintori — director and visual effects supervisor
  • Phil Tippett — founder, Tippett Studio
  • Saschka Unseld — creative director, Oculus Story Studio
  • Arthur Van Hoff — co-founder and chief technical officer, Jaunt
  • Tom Vance — head of content, Jaunt

Photograph courtesy of Magnopus.