About Graham Edwards

I'm senior staff writer at Cinefex magazine. I also write novels. In a former life, I produced animated films for theme park rides and science centres. If you offer me a cold beer, I won't say no.

Inspiring the Monster Makers

What drives people to work in the makeup and creature effects industry? The glamour? The technology? All those ravening monsters and exploding spaceships? Or is it just another job? In an ongoing series of articles, we ask a wide range of effects professionals the simple question: “Who or what inspired you to get into the effects business?”

Here are the responses from professionals working in the field of makeup and creature effects. Yes, it’s time to find out what inspires the people who make the monsters.

Frankenstein's Monster photograph by Universal Studios (Dr. Macro) [Public domain], via Wikimedia Commons

Boris Karloff as the Monster in “Bride of Frankenstein”.

A Universal Inspiration

It’s a given that, to be a successful monster maker, you have to love monsters. So it’s not surprising to learn that many of the makeup and creature effects experts working in the business today were inspired by all those classic movie monsters of old.

And what could be more classic than the iconic creatures that lumbered out of Universal Studios during the era of black-and-white?

“I used to watch the late-night double bill of Universal horror movies like Frankenstein and The Wolfman,” recalled Mark Coulier. “My dad would put me to bed at 8pm and then wake me up again later to watch with him – I was probably about ten years old, and loved it.”

Tom Woodruff Jr. also remembers relishing the antics of Dracula and his bloodthirsty buddies on the small screen, citing as one of his early inspirations “the monster movie craze of the mid ‘60s that put Universal Monsters on TV.”

Another lifelong fan is John Rosengrant: “The Universal Monsters had a huge impact on me when I was a kid, and sparked my interest in practical effects.”

Creature from the Black Lagoon photograph by Florida Memory, via Wikimedia Commons

The Creature rises out of the Black Lagoon.

For Howard Berger, there’s one Universal monster that stands head and shoulders above the rest. “What inspired me was seeing The Creature From the Black Lagoon when I was probably four years old,” Berger enthused. “The movie itself was amazing to me, but when I saw the Creature, I was blown away! My little brain could not process what I was looking at. It was truly a real monster! I still love this film, and the Creature suit is one of the best. The design, execution and performance still holds up. I love the Creature!”

As far as Mike McCarty is concerned, there’s only one person to blame for his childhood fascination with monsters. “It was all my mom’s fault. She loved classic horror, but wouldn’t watch it alone. When I was a kid, I was fascinated by monster movies, all the old classics: The Fly, The Blob, The Wolfman.”

One of the most consistently inspiring figures for many effects artists is filmmaker and animator Ray Harryhausen.

Hats off to Harryhausen

If there’s one effects artist who has inspired more people in the business than any other, it’s probably Ray Harryhausen. The fantasy epics he made through the second half of the 20th century – heavily populated with fantasy landscapes and stop-motion creatures – have motivated animators, VFX artists and practical effects professionals alike.

“At about the age of five, my father awakened me to watch a scene from the network TV premiere of Jason and the Argonauts,” Alec Gillis remarked. “At 9:30 pm it was late, and my little brain squirmed and then permanently twisted as I watched a man in a mini-skirt toss monster teeth on the ground. Who knew skeletons grow from monster teeth?! Had it not been for my dad and Ray Harryhausen, I might not have gotten hooked on effects.”

Sean Sansom was also seduced by stop-motion. When asked what had inspired him to get into the effects business, he replied, “Definitely Ray Harryhausen! Seeing the Cyclops from The 7th Voyage of Sinbad on TV as a little kid made me believe that monsters were alive and well, roaming the land.”

Ray Harryhausen Cyclops

For Phil Tippett, his first experience of a Harryhausen movie not only encouraged him to pursue animation as a career, but also to seek out the very man who had inspired him. “When I was seven, my parents took me to see The 7th Voyage of Sinbad, and that was the moment for me,” Tippett commented. “It was Ray Harryhausen who inspired me to become a stop-motion animator, and he was good enough and approachable enough to answer my calls all those years ago and become a mentor to me. We had a close relationship right up until his death in 2013. He was a truly wonderful guy – artist, friend, everything. Very inspiring.”

In his turn, Tippett has passed the baton down the line to people like David Duke, who said, “It was a combination of two elements for me. The first was seeing Phil Tippett’s stop-motion tests for Jurassic Park.” Duke then added, “The other was seeing all of the behind the scenes footage from Stan Winston Studio in creating the animatronic dinosaurs. Suddenly I became aware of this whole world of practical effects. Before, I had known that what I was seeing in films was an illusion, but it wasn’t until this moment that exactly how the illusion was created came into sharp focus.”

Too Many Movies to Mention

Ask anyone in the movie industry to list their favourite films, and you might as well pull up a chair and settle in for the night. Monster makers are no exception.

Steve Newburn reeled off a list including “Forbidden Planet, The Time Machine … all those great ‘50s and ‘60s genre films that got me wanting to do nothing more than figure out how they created those images and hopefully work on them myself some day. Throw in some Star Wars, Planet of the Apes, and Universal Monsters …”

Mark Coulier remarked, “As a teenager, I had so many influences: Jason and the Argonauts, The Elephant Man, The Godfather. I think the moment that I really decided was when I saw a trailer for The Howling, where they had this man turn into a werewolf before your eyes. I’d never seen anything like that before. It had always been done with lap dissolves, which even to a ten year old are pretty obvious. Now it was like, ‘Wow, how the hell did they do that? I want to make rubber monsters for a job. That seems like a good way to spend your working life!’ I never looked back.”

John Rosengrant noted, “Alien made me decide that this was going to be my career,” while Shaun Smith agreed that favourite films are indeed a major motivating factor: “Movies played a big factor, particularly John Carpenter’s The Thing – that blew me away.”

For this full-body transformation shot from “An American Werewolf in London”, actor David Naughton was positioned beneath the set with only his head and arms protruding up through a hole in the floor. The rest of his body was fabricated by Rick Baker’s EFX crew and articulated from below via concealed rods.

A shuddering Sean Sansom added to the list by commenting, “After being traumatised at my friend’s eighth birthday party by the recently released An American Werewolf in London, I was never the same!”

Rob Gillies completed the picture, remarking, “Growing up in the ‘80s on films like Star Wars, Robocop, E.T.: The Extra-Terrestrial and Labyrinth was hugely inspiring to me wanting to make practical effects. Getting into the business of effects – having the opportunity to be a part of making a plasma rifle, or a ten-foot-tall assault robot for a film – was a dream come true.”

Learning from the Experts

Seeing inspirational images on the silver screen is all very well, but how do you learn how they were made? For the inquisitive monster maker, there have always been books, magazines and videos ready to reveal the secrets of the masters.

Tom Woodruff Jr. identified one revered source: “Famous Monsters of Filmland magazine, which showed behind-the-scenes makeup and creature pictures.”

And there was no better place than Famous Monsters of Filmland in which to find adverts for another of Woodruff Jr.’s youthful inspirations: “Aurora monster models, which put monsters in my hands.”

Aurora Models advertisement from the 1965 Famous Monsters of Filmland Yearbook.

"Three-Dimensional Makup" by Lee BayganFor Mark Coulier, it was a chance discovery at a market stall that helped set him on his path: “Big thanks to Lee Baygan and his book on prosthetics, Techniques of Three-Dimensional Makeup, that I found on Cambridge market. That book showed me how to get started.”

Seeking similar kinds of source material was Mike McCarty. “Magazines like Starlog and Fangoria introduced me to a whole new group of names like John Chambers (Planet of the Apes), Tom Burman (Food of the Gods, Invasion of the Body Snatchers) and Rob Bottin (The Thing). I stumbled across two books: Making a Monster by Al Taylor and Sue Roy, and Tom Savini’s Grande Illusions. These books gave me step-by-step instructions on how to ruin carpets and wreck my mom’s mixing bowls!”

A young Will Furneaux of Weta Workshop performs one of his first makeup tests in 1988.

Will Furneaux also found inspiration in the work of a legendary monster maker. “When I was about twelve, I got a video out from the local video rental store called Scream Greats, Vol. 1: Tom Savini, Master of Horror Effects,” Furneaux recalled. “It introduced me to Tom Savini, makeup effects and Fangoria magazine, and got me hooked on horror makeup effects. I got some latex and clay and started making my own effects.”

Gruesome Games

There’s a certain playful quality to the whole business of makeup and prosthetics, as noted by Barney Burman: “I believe that what really inspired me – and still does – is my continuous need to play ‘dress up’. I love being part of a team that comes together to build an alternate world, put characters in it, and watch it all come to life. That’s the fun in making movies! And that’s where I feel at home.”

Hide-and-seek fantasies filled with secret disguises inspired Mark Coulier, who commented, “I read some Enid Blyton books where this kid solved mysteries by going round in disguise, making fake noses and moustaches and such. I loved that idea as a kid – that you could alter your face like that.”

Game-playing in its most literal form had a huge influence on Shaun Smith. “Like most young boys, I was fascinated by robots, aliens, ninjas and monsters,” Smith explained. “My path was really set in middle school, when I started playing the Gary Gygax and Dave Arneson Dungeons & Dragons game with my pals. Little did I know it would be a catalyst that would set me on the path to a dream job: breathing life into these fantastical images. I was often sidetracked from my scholastic obligations, instead spending time painting lead figures and studying game manuals.”

In the Blood

From what we’ve heard so far, it’s clear that there are many parents keen to share their love of monsters with their kids … however young they may be. The role of family in encouraging a monster-making career is just further proof of the importance of blood ties.

“My father, who worked for JPL/NASA, was my most direct influence,” stated Steve Newburn. “A huge fan of sci-fi, he exposed me to effects-based films from as early as I can recall.”

Barney Burman, as the son of award-winning special makeup effects artist Thomas R. Burman, and the grandson of makeup pioneer Ellis Burman Sr., puts family influences front and centre. “My father hired me and taught me some,” he remembered. “My mother added urgency when she told me to go out and make my own way and pay my own bills (for which I’ll always be grateful). My brother allowed me to set my inner artist free. And my son, simply by being born, helped to solidify my need for serious commitment to making a career of it.”

A Monstrous Meeting

Few things can compete with the thrill of discovering the true north of your own personal compass. But what happens when you finally get to meet one of the idols who inspired you to follow your chosen path? Will Furneaux found out, when the makeup effects hero of his youth visited his workplace.

“One day, Richard Taylor was giving Tom Savini a tour of Weta Workshop, and he brought him into the 3D department,” Furneaux related. “I was nervous. I stood up, shook his hand, and not being the most articulate person in the world said, ‘Hi! I was a big fan of yours!’

“Tom said, ‘Oh … what happened?’ Before I could explain, Richard hurried him off to see the milling machines, which are a lot more interesting to look at than nerds working on computers! I was so embarrassed! Of course what I’d meant to say was that when I was young I was a massive fan, but had since moved into a different area. Tom had a huge influence on me when I was young, and he’s part of the reason why I’m now at Weta Workshop.”

Read more articles in our Inspiring FX series:

Special thanks to Ri Streeter and Niketa Roman.

The Foundry Sold in New Growth Commitment

Brad Peebler, Bill Collis, Nic Humphries and Simon Robinson

Explore any business, and you’ll quickly find there are essential pieces of industry-standard software that keep its wheels turning. In the visual effects industry, one of those is the compositing, editorial and finishing tool Nuke.

Nuke is so widely used that recent rumours that its creators – UK-based The Foundry, which also produces the VFX tools Mari and Katana – might be taken over by software giant Adobe sent ripples of concern through the ranks of VFX artists around the world.

Today, May 20, 2015, those rumours were resolved when The Foundry announced a majority investment from private equity firm HgCapital, which – in a deal worth £200 million (US$312 million) – will take over ownership of the company from The Carlyle Group.

Commenting on the company’s new place within the Technology, Media & Telecommunications (‘TMT’) sector of HgCapital, The Foundry’s CEO Bill Collis said:

“Knowing the direction we plan to take The Foundry, we identified that HgCapital was the ideal partner to build on what The Carlyle Group have helped us achieve. Nic and his team have such deep software experience, take a long term view on investing and have an amazing track record in taking already solid companies to even greater levels of success. HgCapital achieve this through investment, both in R&D and people, with a deep respect for customer loyalty and satisfaction. With this deal, we remain one of the few independent companies solely focused on creative industries. This lets us pursue our best-in-class strategy, prioritizing research and innovation; and teaming with other companies to create powerful collective solutions.”

Nic Humphries, managing partner of HgCapital and the head of the TMT team, added:

“There are so many elements about The Foundry that we find attractive, not the least of which is the core management team. This is a company that constantly innovates, both in terms of their technology, as well as their business. Bill and his team love nothing more than running head on into the challenges facing creative industries, developing exciting disruptive technologies that have huge potential.”

In a tight-lipped but optimistic press release, The Foundry asserts that the HgCapital team shares its “vision to deliver disruptive technologies to creative industries”:

“This will leverage our world-class experience in visual effects, design and games. It continues our leadership in emerging trends around collaborative ideation, the automation of creativity, concurrent marketing and manufacturing, as well as new behaviours in storytelling, media collaboration, creation and consumption, such as VR and AR.”

Dream Landscapes – Outer Space

“The kinds of landscape I try to find in my films exist only in our dreams” – Werner Herzog

Space, as every good Trekkie knows, is the final frontier. It’s also the place where no one can hear you scream. Arthur C. Clarke, author of 2001: A Space Odyssey, claimed that space “can never be conquered”. And, in The Hitchhiker’s Guide to the Galaxy, writer Douglas Adams gleefully informed us that:

“Space is big. Really big. You just won’t believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it’s a long way down the road to the chemist’s, but that’s just peanuts to space.”

As for taking a trip through that great celestial void, well, for most mortals the promise of interstellar travel is just a dream – despite the recent surge in commercial space tourism projects like Virgin Galactic and Space Adventures. I mean, have you seen their ticket prices?

If you really do want to explore other worlds, however, the answer is really very simple: go to the movies. Not only are the tickets cheaper, but you also get popcorn.

But just how do filmmakers go about putting the wonders of the cosmos, big and small, on to the silver screen?

The sea of stars visible in “Destination Moon” comprised some 2,000 automobile headlight bulbs strung on 70,000 feet of wire.

Seeing Stars

Look up into the sky on a clear night and you’ll see a vast starfield composed of a gazillion tiny pinpricks of light. You might think it’s easy to re-create that view by using, well, tiny pinpricks of light. But, as discovered by the special effects team on George Pal’s 1950 production of Destination Moon, seeing stars isn’t quite as simple as that.

According to screenwriter and author Robert Heinlein in his essay Shooting Destination Moon, first published in the July 1950 edition of Astounding Science Fiction:

“The greatest single difficulty we encountered in trying to fake realistically the conditions of space flight was in producing the brilliant starry sky of empty space. We fiddled around with several dodges and finally settled on automobile headlight bulbs. They can be burned white, if you don’t mind burning out a few bulbs; they come in various brightnesses; and they give as near a point source of light as the emulsions can record – more so, in fact. We used nearly two thousand of them, strung on seventy thousand feet of wire.”

Watch a behind the scenes video that was broadcast live from the set of Destination Moon in 1950 as part of KTLA’s City at Night talk show:

Six years later, when creating starfields for the sci-fi classic Forbidden Planet, A. Arnold “Buddy” Gillespie rejected light bulbs in favour of tiny discs of reflective Scotchlite material, meticulously adhered to a slab of black masonite and illuminated by a ring of floodlights arrayed around the camera lens. Gillespie describes this low-budget solution in his book of collected memoirs, The Wizard of MGM:

“[A] jet void, bejewelled with diamond stars and suns, and billions of light-years-away other galaxies with their own myriad stars and suns … It worked beautifully and inexpensively.”

After early experiments with backlit holes drilled in sheet metal, the stars for “2001: A Space Odyssey” were created by airbrushing constellations on to black paper with white paint.

In 1968, Wally Gentleman, the original special effects supervisor on 2001: A Space Odyssey, planned to generate starfields for Stanley Kubrick’s seminal sci-fi film by drilling small holes into metal sheets and backlighting them. As Gentleman explained in Cinefex 85, the concept worked well enough … but only as long as the camera remained stationary:

“As the camera tracked by, the holes became elliptical with relation to the lens, and the light intensity changed and the stars either faded out or twinkled.”

Frustrated by the challenges of working with the film’s notoriously demanding director, Gentleman eventually walked, leaving the film’s credited special effects supervisor, Douglas Trumbull, to step up to the plate. In the same Cinefex article, matte artist Richard Yuricich revealed Trumbull’s homespun solution to the starfield problem:

“Douglas would take an airbrush, and he would turn it down to where the pressure was maybe a couple of pounds. He would take this black paper, stretch it and glue it to a piece of board; and he would just sit there and squirt out these stars. In a matter of seconds it would be done – ‘There it is, boys!’”

Trumbull’s airbrushed starfields were photographed on an animation stand, using polarised light to eliminate any glare coming from the black paper. This technique – and variations upon it – became the industry standard for many years to come.

Over time, the randomly spitting airbrush gave way to the modern VFX artist’s ever-growing box of digital tools. In 1998, Dream Quest created scientifically accurate starfields for Armageddon using a database of constellations as seen from Earth – a procedure refined further by the VFX team at Double Negative for the recent hit Interstellar, as explained by visual effects supervisor Andy Lockley in Cinefex 140:

“Rather than painting star fields or trying to build little bits of geometry to project them onto, Oliver [James] wrote this renderer that referenced NASA star field maps. We could tell the renderer where it was pointing into space, and it would reference that star map and place pixel dots representing each real star in that position.”
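
To make Lockley’s description a little more concrete – and purely as an illustrative sketch, not Double Negative’s actual renderer – placing catalogue stars as pixel dots boils down to converting each star’s right ascension and declination into a direction vector, then projecting that vector through whatever virtual camera is pointing into space. A minimal version, assuming a simple pinhole camera and a hypothetical catalogue of RA/Dec/brightness values, might look like this:

```python
import numpy as np

def radec_to_unit(ra_deg, dec_deg):
    """Convert right ascension / declination (degrees) into unit direction vectors."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.stack([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)], axis=-1)

def render_starfield(catalog, cam_forward, cam_up, fov_deg, width, height):
    """Place one pixel dot per catalogued star that falls inside the camera's view."""
    image = np.zeros((height, width))

    # Build an orthonormal camera basis from the viewing direction.
    f = np.asarray(cam_forward, dtype=float)
    f /= np.linalg.norm(f)
    r = np.cross(f, cam_up)
    r /= np.linalg.norm(r)
    u = np.cross(r, f)

    # Express every star direction in camera space.
    dirs = radec_to_unit(np.asarray(catalog["ra"]), np.asarray(catalog["dec"]))
    x, y, z = dirs @ r, dirs @ u, dirs @ f

    # Pinhole projection: focal length in pixels from the horizontal field of view.
    focal = 0.5 * width / np.tan(np.radians(fov_deg) / 2.0)
    front = z > 0  # ignore stars behind the camera
    px = (0.5 * width + focal * x[front] / z[front]).astype(int)
    py = (0.5 * height - focal * y[front] / z[front]).astype(int)
    mag = np.asarray(catalog["brightness"])[front]

    # Keep only the dots that land inside the frame.
    inside = (px >= 0) & (px < width) & (py >= 0) & (py < height)
    image[py[inside], px[inside]] = mag[inside]
    return image
```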

MPC created three categories of digital sun for Danny Boyle’s “Sunshine”.

Lighting Up the Sun

Seeing stars from a distance is all very well. But what happens when filmmakers want to get up close and personal with a blazing sun?

In the 1986 sci-fi adventure Star Trek IV: The Voyage Home, Captain Kirk and the crew of his captured Klingon “Bird of Prey” spacecraft enjoy a close encounter with Earth’s parent star. The sun’s flaming surface was created by ILM using – of all things – basic bathroom accessories, as described by effects cinematographer Don Dow in Cinefex 29:

“Pete [Kozachik] came up with the idea of taking two pieces of Flemish glass – which is plexiglass textured like shower doors – and motorizing one of them so that it would turn against the other and give us a moiré pattern. We backlit it with a gelled 10K to give it a yellowish color and we vaselined the areas around the edge where we wanted it to fall off.”

However, filmgoers had to wait until 2007 for a movie that put the sun front and centre. For Danny Boyle’s Sunshine – in which humanity’s guardian star was almost a character in its own right – MPC forged three categories of sun shots, using Autodesk Maya to model the basic geometry of the star, combined with particle and fluid systems to generate the complex motion of its blazing surface.

For "Gravity", Framestore created a hyper-detailed volumetric rendering of Earth capable of holding up during the film's long takes.

For “Gravity”, Framestore created a hyper-detailed volumetric rendering of Earth capable of holding up during the film’s long takes.

Planetscapes

Stars are too big and hot to stay around for any length of time, so let’s move on to something a little more manageable. Like planets.

One of the earliest cinematic representations of a planet appears in Walter R. Booth’s whimsical 1911 film The Automatic Motorist. However, if you’re looking for a realistic portrayal of this most essential of heavenly bodies, you’re in the wrong place. Just scan forward through this video to the 2:20min mark and you’ll see what I mean:

For Forbidden Planet, Buddy Gillespie created wide shots of the alien planet Altair-4 by hanging painted balls – the smallest only six inches in diameter – in front of his background of Scotchlite stars. Dimensional models were still being used in 1981, when Film Effects of Hollywood photographed a large, spherical miniature of Jupiter to create the background vistas seen in Outland. Atmospheric effects were added to the gas giant by superimposing airbrushed artwork over the original stage photography – not to mention the time-honoured trick of smearing Vaseline on the camera lens.

In this same era, all three films in the original Star Wars trilogy were proving that spectacular shots of distant planets can be created very effectively just by using flat artwork. Orbital visions of Tatooine, Hoth, Dagobah and the other exotic worlds in that galaxy far, far away came courtesy of matte artists Mike Pangrazio and Ralph McQuarrie.

When it comes to a planet as familiar as the Earth, the real-world imagery made available by NASA and other space agencies has in recent years given filmmakers a wealth of reference material to draw from. When Robert Zemeckis made Contact in 1997, senior VFX supervisor Ken Ralston’s multi-vendor team had access to hundreds of highly detailed satellite photographs, which they re-purposed to conjure up realistic views of the Earth as seen from space.

By the time Framestore constructed the Earth seen in Gravity in 2013, satellite imagery had been discarded for everything other than visual reference. In order to accommodate the film’s long takes – during which the camera typically passed over a huge swathe of the Earth’s surface – VFX supervisor Tim Webber’s team had no choice but to re-create the planet as a highly-detailed volumetric rendering that accurately reproduced not only landmasses and oceans, but also atmospheric and meteorological effects.

To create the asteroids used in “Star Wars Episode II: Attack of the Clones”, VFX artists at ILM scanned the miniature rocks used in “The Empire Strikes Back” and mapped the textures on to new digital models.

Rock and Roll

Space might be big, as Douglas Adams contended, but if sci-fi filmmakers are to be believed, it’s also full of rocks. For the breathtaking asteroid field sequence in The Empire Strikes Back, the visual effects team at ILM used flat painted artwork, stacked up in multiple layers to create the sea of rocks seen in the backgrounds of the spectacular chase scenes.

Asteroids passing closer to the camera were constructed in miniature and photographed spinning against bluescreen using motion control cameras. Also present, of course, were the speeding Millennium Falcon and a parade of hapless TIE fighters. The resulting chaos was described by visual effects supervisor Richard Edlund in Cinefex 2:

“By the time you’ve shot four ships, ten or twelve separate rock elements, three background paintings, a star field, plus miscellaneous explosions and lasers, you wind up with maybe twenty-five separately photographed pieces of film, each of which has to be broken down into color separations and … have all the intermediate bluescreen steps to extract mattes. So all together, you have maybe a hundred and twenty pieces of film involved.”

Twenty-two years later, ILM modelled a set of digital rocks for the asteroid chase in Attack of the Clones by resurrecting all the asteroid miniatures originally used for Empire. Having scanned these old-school rocks, they projected their textures – combined with procedural crater maps – on to new geometry, creating a randomised field of digital space debris. In a nod to verisimilitude (and just possibly a sideways swipe at the real world) they also had Jango Fett blow up a NASA computer model of Eros, one of the large asteroids in our own solar system.

A miniature asteroid also featured in Armageddon, for which Dream Quest built three practical versions of the giant rock that’s closing in on the Earth. The biggest of these – which was sculpted from foam and mounted on a steel armature in front of a greenscreen – measured a whopping twenty-five feet by fifteen feet. Not quite big enough to destroy a planet, but quite enough to keep a crew of thirty special effects artists busy.

Dream Quest constructed a number of miniature asteroids for “Armageddon”, the biggest of which measured twenty-five feet by fifteen feet.

Dreams of Outer Space

From big to small, from stars to asteroids, the endless expanse of outer space is full of wonders. However, you might be wondering why I’m talking about outer space at all, given that this series of blog articles is titled Dream Landscapes. After all, there’s no land in outer space, is there?

The answer’s simple. Landscapes are all about the view.

After I’ve bought my ticket to the movies, picked up my popcorn and taken my seat, what’s the one thing I want to see when the camera pulls back? The answer is a spectacular view. That might mean a burning sunset lighting up the ocean, or the craggy slopes of a precipitous mountain range. It might mean a sea of desert dunes, or a glistening Arctic wasteland.

Or it might mean a cosmic vista filled with fiery stars, spinning planets and tumbling rocks.

Whatever that wide shot may be, filmmaker Werner Herzog was right: the place such landscapes – or spacescapes – really come to life is in our dreams.

In other words, when we go to the movies.

All the Cinefex articles quoted can be read in their entirety as part of the Cinefex Classic Collection, available for iPad. Related blog articles:

“Destination Moon” photograph from the May 1950 edition of “Popular Mechanics”. “The Hitchhiker’s Guide to the Galaxy” photograph copyright © 2005 by Touchstone Pictures. “2001: A Space Odyssey” photograph copyright © 1968, 2001 by Turner Entertainment Company. “Sunshine” photograph copyright © 2007 by Twentieth Century Fox. “Gravity” photograph copyright © 2013 by Warner Bros. Entertainment. “Star Wars Episode II: Attack of the Clones” photograph copyright © 2002 by Lucasfilm, Ltd. “Armageddon” photograph copyright © 1998 by Touchstone Pictures and Jerry Bruckheimer, Inc.

Preorder Cinefex 142 Now

Cinefex 142 contents

Updated 18 June 2015 – Cinefex 142 is now on sale! Pick up your copy from our online store.

Summer is just around the corner, which can mean only one thing – the next issue of Cinefex is nearly here … and available for preorder right now!

Issue 142 of the premier magazine for visual effects professionals and enthusiasts goes deep on the VFX of four of this year’s biggest movies. First up is Jurassic World, which brings together a team including ILM, Phil Tippett and key Stan Winston Studio crewmembers who were there for the first Jurassic go-round. The reunion promises thrills that could only have been dreamed of for the original, made when computer animation and other relevant technologies were in their infancy.

Next up is Avengers: Age of Ultron, for which VFX supervisor Christopher Townsend assembles a team that includes ILM, Double Negative, Animal Logic, Luma Pictures and Framestore, with special effects supervisor Paul Corbould and the mechanical wizards at Legacy Effects lending practical effects support.

For Mad Max: Fury Road, special effects supervisors Andy Williams and Dan Oliver provided in-camera action with picture vehicle supervisor Geoff Naylor, while hair and makeup designer Lesley Vanderwalt, prosthetic supervisor Damien Marton and Tinsley Studios assisted with mutant characters. Visual effects producers Alex Bicknell and Fiona Crawford guided visual effects with vendors including Iloura and Method Studios, Dr. D Studios, The Third Floor and Stereo D.

Finally there’s San Andreas, which sees special effects supervisors Brian Cox and Matt Kutcher creating practical earthquake destruction and flood effects, while visual effects producer Randall Starr oversees digital enhancements at visual effects studios that include Hydraulx, Scanline VFX, Cinesite, Method Studios and Image Engine.

 

Only one question remains. Which of these effects-heavy blockbusters will feature on the cover of Cinefex 142? All will be revealed soon, but in the meantime why not vote for your favourite in our fun poll?

The VFX of “Bad Land: Road to Fury”

Nicholas Hoult and the Simulit Shadow in "Bad Land: Road to Fury"

Science fiction films aren’t always set in the future. Nor are they always packed full of exotic hardware. Sometimes, science fiction exists in the cracks, walking that fine line between the everyday world and a subtly different reality.

One such film is Bad Land: Road to Fury (originally titled Young Ones), in which homesteader Ernest Holm (Michael Shannon) and his teenage son, Jerome (Kodi Smit-McPhee), struggle for survival in a remote desert land where water is the most precious commodity imaginable.

While the desiccated world created for the film by writer/director Jake Paltrow looks much like our own, what we see on screen also gives us glimpses into an alternate reality. Foremost among these is the “Simulit Shadow”, a robotic mule purchased by the Holms when the harsh desert environment finally takes its toll on their flesh-and-blood beast of burden.

The filmmakers based the design of this lumbering automaton – known as the “Sim” – on Big Dog, an experimental rough-terrain robot developed by Google subsidiary Boston Dynamics. The Sim was created using a cunning blend of physical and digital techniques, combining the practical skills of Cape Town-based Cosmesis with visual effects by Windmill Lane VFX in Dublin, Ireland.

Watch a video breakdown of the Simulit Shadow effects created by Windmill Lane VFX and Cosmesis for Bad Land: Road to Fury:

According to Windmill’s visual effects supervisor, Ditch Doy, the director’s original intent was to use a fully-functioning robot. “Jake went to Boston Dynamics and tried for a couple of weeks to get a real robot for the film,” Doy revealed. “That just wasn’t realistic, but if Jake had had his way it would have been a completely robotic creation.”

Despite his initial plans being thwarted, Paltrow insisted that the Sim channel the spirit of the Boston Dynamics machines. “Jake wanted it to be clunky,” Doy commented. “He didn’t want it to be a sleek and sexy robot, but more like a piece of agricultural equipment. It was a kind of retro-futuristic look. And he was adamant that it had to look 100% believable – seeing it couldn’t be a visual effects ‘moment’.”

The robot’s robust, utilitarian design perfectly complemented the environment in which the film would be shot: a harsh stretch of desert terrain in Namaqualand, South Africa, just 50 miles south of the Namibian border.

“The locations were fantastic, but they were in the middle of nowhere,” said Doy, recalling his first meeting in the desert with Paltrow. “I flew to Cape Town on the red-eye, then had a seven-hour drive up to Springbok, which is a very small mining town. All that for a one-hour meeting with Jake and the cinematographer, Giles Nuttgens. A couple of weeks later I flew down again, and stayed out there for eight weeks on the shoot.”

The faithful Simulit Shadow robot was created by Windmill Lane VFX and Cosmesis using a combination of digital and practical effects techniques.

A Simulit in the Desert

Also making the trip to Springbok was a set of full-scale Sims, each one over two metres long, constructed by Cosmesis. “We built a lightweight version that was hollow, and a fully functional version with hydraulic legs, lenses and functional lights operated by remote control,” revealed prosthetics, puppet and makeup effects designer Clinton Aiden Smith. “The functional version was not used for walking, but only in scenes where the Simulit was stationary, or needed to stand up from a lying position ready for walking. We also had a version of the Sim that could be dented and take squibs for scenes where it was attacked.”

Cosmesis created three practical versions of the Simulit Shadow robot. Left to right: Adrian Smith, Gerald Clark Sutherland, Pierre Smith, Anja Rechholtz, Terri Nicole, Lisa-Marie Bothma. Photograph by Clinton Aiden Smith.

For scenes in which the Sim was required to walk, the lightweight version was used. Legless, it was supported and puppeteered by two performers, most frequently a pair of South African free runners hired for the task. Depending on how it was dressed, the mobile prop weighed in at anything from 23 to 55kg. “The weight was important, because the parkour performers needed to be able to operate and control its movement in very rocky terrain where temperatures could reach 110°.”

The Sim was initially developed as a 3D computer model using Autodesk Maya. “After we had basic approval from Jake Paltrow, we projected the front and side views on to cardboard to establish the final size, keeping in mind the weight and the needs of the two performers,” Smith explained. “Once the basic size was locked off, we created a cardboard and PVC pipe mock-up, which we took to the location to get an idea of what we were up against. We realised that the Sim had to be even lighter than the mock-up, as the conditions were so demanding.”

Adrian Smith, Jake Paltrow and others assess an early cardboard and PVC mockup of the Simulit Shadow. Photograph by Gerald Clark Sutherland.

The Cosmesis team created the final lightweight version of the Sim using vacuum-formed PVC components moulded from CNC-cut Superwood formers for the body and legs. The resulting hollow shell was reinforced with aluminium, which also provided anchor points for the puppeteers to grip. The cargo basket perched on top of the main superstructure was constructed using carbon fibre pipes.

While Paltrow was keen that the Sim should have an ungainly quality while walking, nevertheless it had to remain balanced and level regardless of the terrain – a considerable challenge for performers who were bent double with poor sightlines. A distinctly low-tech solution was devised to deal with this. “We hung four strings with weights from the bed of the Sim to remind the performers at what height they needed to operate,” Smith revealed. “This was crucial in showing them how much to bend their knees or backs while performing.”

Lightweight though the final rig was, operating the Sim was punishing. Desert conditions made dehydration and overheating a constant concern, despite the battery-operated cooling suits worn by the puppeteers. “The parkour performers were used to controlling their bodies, which was a plus for us,” remarked Smith. “But there were some days where I and Adrian Smith, our key Sim fabricator, had to stand in to relieve them – believe me, it is not as easy as it seems! Our main performers, Ryno Keet, Chris Jones and a local town boy from Springbok, went beyond expectations to breathe life into the Sim.”

Nicholas Hoult as Flem Lever in “Bad Land: Road to Fury”

A Digital Sim

Not only did having the practical Sim on location create a tangible robotic presence for the cast and crew, but for many shots it provided Windmill with a starting point for their visual effects. “A lot of the time we just painted out the puppeteers and stuck in our CG legs,” remarked Doy. “So what we got was a practical robot for the actors to interact with, that also had some nice organic movement. That was very important for Jake. He wanted the Sim to have a way of correcting itself gyroscopically. So whenever you see it, it’s always got a bit of life about it, as if it’s trying to self-adjust and stay upright. We could have done that with straight animation, but this way we got a more quirky performance.”

Despite Doy’s confidence in the approach, not everyone was convinced by what they saw. “It was a bit worrying on set, because it didn’t always look fantastic,” Doy admitted. “We were getting some very strange looks from the cast. In fact, Nicholas Hoult said to me, ‘You are going to make this look cool, aren’t you? Please?’ I had to tell everyone, ‘Honestly, I know it looks weird now, but we’ll make it look great!’”

Windmill also created a fully digital Sim for scenes in which it was not feasible to use the practical prop. By mixing and matching techniques, Doy and his team aimed to keep even the most attentive audience member guessing as to how the Sim had been created in any one shot: “Sometimes we used the puppet rig. Sometimes it was fully CG. We were always moving around the connection points, so hopefully you’ll never be able to spot which shots are purely digital, which ones are half-and-half, and where the joins are.”

Windmill’s fully digital Simulit Shadow peers into a treacherous pit on a desert trail.

The digital Sim was modelled using Autodesk Maya, and rendered in Solid Angle Arnold. “This was the first feature where we used our new Arnold renderer, which is very good at doing metals and hard surfaces,” Doy observed. “For every shot we would take HDRIs to use as lighting guides. Mostly we were in desert sun, but we had some night shoots, and also had to shoot in the rain. The shots were composited in Nuke, and our whole pipeline is built around Shotgun.”

In a key scene, a highly emotional Jerome vents his anger on the Sim, attacking it with a sledgehammer. The direct physical contact between boy and machine created significant challenges for the Windmill team.

“We had to use a fully digital robot for that scene,” Doy explained. “I don’t think the puppeteers would have liked us abusing them that much! Originally, Kodi was just going to be swinging the sledgehammer through thin air. I said, ‘No, we have to have a contact point, because otherwise he’s just going to be swinging through nothing.’ So we hastily rigged up some crates, and got Kodi to hammer away at those.”

For a scene in which Jerome attacks the Simulit Shadow with a sledgehammer, actor Kodi Smit-McPhee struck crates covered in green fabric. Windmill later replaced these with their digital robot, timing its animation to Smit-McPhee’s performance.

Even though the crates were covered in green fabric, the lighting conditions made it almost impossible for Windmill’s visual effects team to extract Smit-McPhee and the sledgehammer cleanly from the background so that the Sim could sit behind them in the frame. Much of the keying work was therefore done manually, with artists rotoscoping the action frame by frame, and manipulating the end of the sledgehammer digitally to ensure a good visual lock wherever it made contact with the CG robot. Windmill’s 2D supervisor for Bad Land: Road to Fury was Andy Clarke.

Most importantly, close attention was paid to the animation of the digital Sim, in order to maintain its indomitable spirit throughout the ordeal, despite the abuse being hurled at it. “Jake was adamant that he wanted the robot to have no emotions,” stated Doy. “It’s not stupid – it just doesn’t have any feelings. Jerome is going at it with the sledgehammer, but the robot is unflinching. It just keeps trying to re-stabilise itself and get back to a neutral position.”

The rough, agricultural design of the Sim is carried through to its navigation system – a single laser-equipped lens that constantly scans its surroundings. The Sim’s routine recording of everything it scans ultimately becomes an important plot point, when Jerome accesses its memory and discovers a dark secret. The filmmakers were therefore keen to get the look of the robot’s “Sim-vision” just right.

“Jake wanted to go for a retro, monochromatic display,” Doy explained, “and he didn’t want it to look like it was created through photography – no light or shade. But it still had to put across important narrative points, and you had to recognise certain characters. Jake also wanted a 3D aspect to the Sim-vision, so we projected footage on to crude geometry, and then animated it.”

Kodi Smit-McPhee and Michael Shannon as Jerome and Ernest Holm, accompanied by the Simulit Shadow.

Life Support and Future Tech

Visual effects were also used to enhance the appearance of one of the film’s human characters. Ernest Holm’s wife, Katherine (Aimee Mullins), resides at a medical facility, having suffered catastrophic injuries in a car accident. Paralysed, and missing her lower legs, she is able to move around using a complex ambulatory apparatus hooked up to a mobile life-support system.

Mullins, who underwent a double amputation as an infant, found fame as a successful para-athlete, model and actress. For her role in Bad Land: Road to Fury, Mullins wore specially-designed prosthetic legs, while the Rube Goldberg contraption that both keeps her alive and stimulates her spinal cord was added digitally by Windmill.

Windmill Lane VFX used digital effects to create a futuristic life-support system for Katherine Holm (Aimee Mullins) in scenes where she meets her son and husband (Kodi Smit-McPhee and Michael Shannon) in hospital.

“The rig is a weird back-brace thing that goes up to a life-support machine,” Doy explained. “Jake wanted something that was driven by pistons and steam, rather than an expensive piece of medical equipment. The spinal cord coming from her back joining her to the overhead machine is all CG. Some of the cables are real, but during post, Jake hit on the idea of making Aimee look almost like a marionette. So we were tasked with sticking on extra cables. We weren’t really anticipating that, and it made the tracking much harder, and the shots more involved. But it works.”

Watch a video of Aimee Mullins as Katherine Holm in Bad Land: Road to Fury:

In the film’s later stages, water finally comes to the desert, and crops begin to grow. Despite the arid conditions at the Springbok location, the production initially attempted to cultivate real plants. “We did try to grow some greenery,” Doy commented, “but the sun was so fierce that everything just shrivelled and died. So all of the greenery towards the end of the film is mainly matte painting work. We also did a time-lapse shot of a wheat field growing, which involved a lot of complicated work in Houdini.”

During a brief trip out of the desert to a city across the border, Jerome encounters Anna (Liah O’Prey), a young woman who provides him with safe passage through the high-security checkpoint. Illustrating the difference in technology between the down-at-heel desert world and the neighbouring urban sprawl is Anna’s cell phone, which opens up like a Chinese fan.

“The fan was Jake’s creation,” revealed Doy. “He wanted to show that, across the border, they have nice cars and flashy mobile technology. On the set, it literally was just a fan. Alex opened it up, and we added all the graphics inside. It’s one of those moments in the film where you realise that this isn’t our universe.”

Windmill created interactive graphics for Anna’s quirky futuristic fan-phone.

A View of the Desert

In total, Windmill Lane VFX delivered around 250 shots for Bad Land: Road to Fury. As well as the practical Sim, Cosmesis created special dentures for the main characters, while makeup artist Natasha Du Toit managed a progressive range of dry skin, dirt, sweat and tattoos. A number of scenes were enhanced by blood effects and prosthetics, including two broken legs – one human, one mule – while a range of prosthetic bellies simulated the various stages of pregnancy of Mary Holm (Elle Fanning).

Having worked on Bad Land: Road to Fury for about a year, Ditch Doy’s abiding memories are of his time on location in South Africa. “It was the hottest, driest place on the planet – and I think it shows on the screen,” he reflected. “That’s not visual effects – it really was as hot and dry as it looks. It was the toughest film experience I’ve ever had. We all went a bit stir-crazy. But we’d all fallen in love with the desert a little bit by the end. To watch the sun rise and set over that kind of landscape … it really is something you don’t get to experience every day.”

Clinton Aiden Smith craved only one thing while on location in Namaqualand: “Shade! At times, base camp was a long way from the set, so crew were hiking up steep, rocky hills with heavy bags and cases of gear. Man, if you could find some shade under a rock or a tree, you were lucky!”

Fittingly, Doy’s favourite “effects” shot in Bad Land: Road to Fury comes courtesy of Mother Nature. “It never quite rained where we were, but one day, in the distance, we saw a thunderstorm. Jake was smart enough to whirl the camera round and say, ‘Film that!’ So we did, and we caught this great lightning bolt, which is in the film. I’m hoping people don’t know it’s real, so I can say it’s one of our effects!”

Bad Land: Road to Fury is currently available from Signature Entertainment on Blu-Ray, DVD and Digital HD. Watch the trailer:

Special thanks to Signature Entertainment and Witchfinder. Photographs copyright © 2014 by Signature Entertainment and courtesy of Windmill Lane VFX and Cosmesis.

Maps to the Stars – VFX Q&A

"Maps to the Stars" - visual effects by Switch VFX

Veteran filmmaker David Cronenberg has a reputation for tackling the kind of subject matter that makes audiences squirm. His early feature work – which includes Scanners, Videodrome, and The Fly – exposed mainstream audiences to the shock effects of “body horror”, while at the same time getting under their skins with unsettling psychological subtexts.

While Cronenberg’s most recent feature, Maps to the Stars, explores similar themes to some of his earlier films, it dials down the grossness in favour of the psychological. Set against – and satirising – a Hollywood that’s simultaneously larger-than-life and desperately shallow, it follows the intertwined stories of scarred pyromaniac Agatha Weiss (Mia Wasikowska), fading movie star Havana Segrand (Julianne Moore), TV psychologist Dr. Stafford Weiss (John Cusack), his control-freak wife Cristina (Olivia Williams) and struggling actor Jerome Fontana (Robert Pattinson).

Visual effects for Maps to the Stars were provided by sole vendor Switch VFX, with the exception of a small number of minor shots undertaken by Deluxe. Under the supervision of its co-founder Jon Campfens, Switch delivered around 145 shots, including matte paintings, set extensions and greenscreen composites.

There was a little body horror, too – well, this was a Cronenberg film, after all. In addition to the “invisible” environment work, Switch also developed complex fire simulation effects for a key sequence in which Cristina Weiss sets light to herself beside a swimming pool, and enhanced other scenes involving blood and gore.

In this Q&A for Cinefex, Jon Campfens discusses the challenges involved in turning Toronto locations into the Hollywood Hills, and reveals just what David Cronenberg is looking for when it comes to setting a woman on fire.

Julianne Moore and Mia Wasikowska star in David Cronenberg’s “Maps to the Stars”

How did you get involved with Maps to the Stars?

The line producer, who I know quite well, introduced me to David Cronenberg. We were brought on board a bit late in the project, during prep, as the original VFX company had decided to pass on it. They had already gone far enough down the road on some of the VFX approaches that they could not be changed – especially concerning the fire sequence – so we were locked into doing it that way. We had a number of discussions with David and the creative team to ensure we were all on the same page before we went to camera.

Tell us more about the fire sequence.

During the sequence, Dr. Stafford Weiss discovers that his wife, Cristina, has set herself on fire. He proceeds to push her into their swimming pool with a deck-chair to try and save her. Beau Parsons, one of our on-set supervisors, attended the shoot along with Brandon Rogers, a member of our CG team and lead for the department that would be using Fume FX.

What direction did you receive from David Cronenberg regarding the action by the pool?

It was expressed to us by David during the prep stages that the performance comes before anything else. Despite our suggestions that shooting fire elements or using a stunt double would provide more realistic fire, David wanted to go the CG route in order to preserve Olivia’s performance. I am not a great lover of CG fire, but it was the only way to proceed in order to get what David wanted. It was important to him that the audience knew it was really her and not a stunt double using fire retardant – which has a very reflective quality to it.

For the immolation sequence, actress Olivia Williams was photographed on location, in costume and wearing minimal tracking markers.

Artists at Switch VFX, under the supervision of Jon Campfens, added CG fire effects simulated using Fume FX, and motion-tracked to Williams’ performance using PFTrack.

Given those parameters, how did you go about executing the fire?

Face replacement was out of the question, because she is moving around quite a bit – as you would be if you were trying to put out a fire. We knew motion capture would be key, to get the fire to interact with her body in a believable way; we also knew we would need to use a fairly minimal rig. We ended up using two DSLR witness cameras synched to the shooting camera to triangulate Olivia’s position. These provided motion capture data through PFTrack – not the most elegant way of doing it, but it’s fast, unobtrusive and cheap.

During the scene, the gown worn by the character Cristina burns away to reveal her burned flesh beneath. How did you achieve the transition?

We shot the scene twice, once with Olivia in her gown and once wearing a burn suit. The burn suit was actually very problematic. It took much of the night to prepare, and the tracking markers didn’t want to stay on. By the time it was ready, the sun was starting to come up and we only had time for one run-through. In any case, as soon as she fell in the pool the suit became unusable, so it was very important to get it right on the first try.

How did you integrate the CG fire with Olivia Williams’ performance?

The key to getting the CG fire to react realistically to Olivia’s flailing body was to get a good, useable track. For this we used PFTrack and its motion capture capabilities. Using a three-camera setup, we were able to generate data that allowed us to track her movements in 3D space.

For shots later in the sequence, CG fire and smoke were combined with live-action plates of Williams wearing a burn suit.

Using a rigged 3D model of Olivia, we attached specific joints to the corresponding locator data returned by PFTrack’s motion capture output. The model was then hand-animated to compensate for the joints that did not return valid tracking data. Once a camera track was achieved, and the witness cameras were lined up properly, we then did an object track for every tracking marker.
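
As a rough illustration of that joint-to-locator workflow, and not Switch VFX’s actual rig, the sketch below snaps a joint to its tracked locator wherever valid data exists and interpolates across the gaps, standing in for the hand-animation described. All names and values are hypothetical.

```python
# Sketch of driving rig joints from sparse motion capture locators.
# Where a locator has valid data for a frame, the joint follows it;
# gaps are filled by linear interpolation here, standing in for the
# hand-animation described above. All data is made up.
import numpy as np

def fill_joint_curve(frames, locator_positions):
    """locator_positions: dict of frame -> (x, y, z), with missing frames omitted."""
    tracked = sorted(locator_positions)
    xs = np.array(tracked, dtype=float)
    pts = np.array([locator_positions[f] for f in tracked], dtype=float)
    # interpolate each axis independently across the gaps
    return {f: tuple(np.interp(f, xs, pts[:, axis]) for axis in range(3))
            for f in frames}

# e.g. a wrist locator that dropped out between frames 102 and 105
wrist_track = {100: (0.0, 1.6, 0.2), 101: (0.1, 1.7, 0.2), 106: (0.6, 1.9, 0.1)}
curve = fill_joint_curve(range(100, 107), wrist_track)
for f, p in curve.items():
    print(f, [round(v, 3) for v in p])
```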

What was your approach to the fire and smoke simulation?

We used Fume FX, but we couldn’t get much information on, or examples of, using it with Maya – and we’re a Maya-based company. So we did a lot of R&D, working through all the kinks, errors, and complications.

We generated smoke and fire from the tracked geometry, and rendered countless simulations in order to get what David was looking for. To make adjustments and revisions less time-consuming, we broke down the simulations and renders into smoke and fire passes for individual body parts – arms, legs, torso, etc. This also allowed for more time-friendly simulations at a smaller voxel spacing, creating extra detail in the fire tendrils and smoke. Animated texture maps were created in Nuke, and piped into Fume’s emitter channels. This permitted the fire to start at her legs and spread upwards, completely engulfing her.
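
The spreading effect described here can be pictured as an animated map that gates emission further up the body as time passes. Below is a hedged Python sketch of that idea; the frame range, heights and falloff are illustrative rather than production values, and the real work happened inside Fume FX’s emitter channels.

```python
# Sketch of the "spreading fire" idea: an animated greyscale mask gates
# emission so flames start at the feet and climb the body over time.
# Heights, frame range and falloff are illustrative values only.
import numpy as np

def emission_mask(heights, frame, start_frame=1001, end_frame=1100, softness=0.15):
    """heights: per-vertex height on the body, normalised 0 (feet) to 1 (head).
    Returns per-vertex emission weights for this frame."""
    t = np.clip((frame - start_frame) / (end_frame - start_frame), 0.0, 1.0)
    burn_front = t * (1.0 + softness)              # the front sweeps past the head by the end
    # smooth step: fully burning below the front, fading out just above it
    return np.clip((burn_front - heights) / softness, 0.0, 1.0)

verts_height = np.linspace(0.0, 1.0, 8)            # a toy "body" sampled at 8 heights
for frame in (1001, 1030, 1060, 1100):
    print(frame, np.round(emission_mask(verts_height, frame), 2))
```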

Was it difficult keeping the flames locked to her body as she moved?

At times, the fire simulations would run into issues when Olivia’s arms were flailing around. For example, there might not be enough voxels being emitted because of the speed of her movements, or the fire might shoot out in wrong directions.

By setting keyframes on specific parameters within Fume, we were able to tame the simulation so that it behaved exactly as we needed. Using parameters like Wavelet Turbulence in Fume, we also achieved more time-friendly simulations, adding extra detail to low-resolution simulation caches, while preserving the overall motion. We added reflections and interactive light in compositing. So, lots of work was needed for this sequence!
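
Wavelet Turbulence, the Fume FX feature mentioned above, adds band-limited detail to a coarse simulation cache while keeping its overall motion. The sketch below is a heavily simplified stand-in for that idea in NumPy and SciPy, not the actual algorithm or the Fume FX implementation.

```python
# Heavily simplified sketch of the idea behind wavelet-style up-res:
# upsample a coarse density field, then add band-limited noise only
# where there is already smoke, preserving the coarse motion. This is
# an illustration, not the real Fume FX / wavelet turbulence algorithm.
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def up_res(coarse, factor=2, detail_strength=0.4, seed=7):
    fine = zoom(coarse, factor, order=1)                                # linear upsample of the cache
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(fine.shape)
    band = gaussian_filter(noise, 1.0) - gaussian_filter(noise, 3.0)    # band-limited detail
    band /= (np.abs(band).max() + 1e-8)
    return np.clip(fine + detail_strength * band * fine, 0.0, None)     # detail only where density exists

coarse_density = np.random.default_rng(1).random((16, 16, 16))
fine_density = up_res(coarse_density)
print(coarse_density.shape, "->", fine_density.shape)
```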

In another scene, Agatha attacks Havana with one of the acting awards she’s won. What did Switch VFX contribute to the scene?

Production had applied some blood to Julianne Moore’s forehead, face and chest, but it wasn’t enough, and the forehead prosthetic didn’t look real. Throughout the sequence of shots, we added more and more blood and damage to her forehead, and blood spray to the chair and lamp.

David wanted to have the forehead wound look concave, so David Alexander, head of our 3D department, made an appropriate element in Maya. The plates were tracked using PFTrack, with our lead tracker and coordinator, Beau Parsons, making sure the CG element moved properly, because Julianne did fling her head around quite a bit. The CG element was composited into the shot in Nuke, with colour correction and enhanced highlights to make it look wetter and bloodier, and we added extra dripping blood.

Viewing the sequence on a large screen, we realised that the chair Julianne was sitting on was moving subtly with her, so the added blood spray wasn’t tracking properly. The audience would probably never see it, but we went in anyway and fixed it with a gridwarp track in Nuke.
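
Nuke’s grid warp deforms an element with a lattice of control points. As a simplified stand-in for that kind of fix, the sketch below re-attaches a 2D element to four tracked points with a corner-pin warp in OpenCV; the coordinates and frame dimensions are made up for illustration.

```python
# Simplified stand-in for a grid-warp fix: re-attach a 2D element (such
# as blood spray) to four tracked points with a corner-pin warp. A real
# grid warp uses a denser lattice; the coordinates here are invented.
import numpy as np
import cv2

def pin_element(element, track_pts, frame_size):
    h, w = element.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(track_pts)                        # tracked chair corners for this frame
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(element, M, frame_size)

element = np.zeros((256, 256, 4), np.uint8)            # placeholder RGBA element
tracked_corners = [(812, 401), (1050, 396), (1061, 640), (818, 652)]
warped = pin_element(element, tracked_corners, (1920, 1080))
print(warped.shape)                                    # (1080, 1920, 4), ready to merge over the plate
```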

For a scene in which Havana Sagrand (Julianne Moore) is bludgeoned with a movie award, Switch VFX added extra blood and gore, and enhanced the appearance of the forehead prosthetic worn by the actress.

Many of David Cronenberg’s films contain iconic “body horror” scenes. Was there any sense that you were adding to that body of work?

There was at first, but after seeing the film twice, now I’m not so sure. Maps to the Stars deals a lot with apparitions seen by some very unstable people, many of which appear around that same pool area. The immolation scene seems to come out of left field. There is no gas can or anything that would cause her to ignite as she does. We clearly see the pool, but where does her body go after? So I’m not sure if that scene happened at all, or if it was in Stafford’s head – not an unusual thing for a Cronenberg movie, but it was definitely more subtle than his earlier work.

David is very good at getting under your skin and making you feel uncomfortable, whether it’s with graphic body mutilations, or with the way people communicate and interact with each other around topics that leave the audience feeling unsettled.

Switch VFX also created a number of set extensions and background replacements. Tell us about the scene in the coffee shop, where you replaced the Toronto exterior with a Los Angeles street scene.

In the script, the scene takes place inside a Denny’s on Hollywood Boulevard. The interior sequence was shot in Toronto at a coffee shop/diner, on one of the major roads here in the city. A greenscreen was placed outside the window at the curb to allow extras to walk back and forth along the sidewalk outside the diner.

"Maps to the Stars" background replacement by Switch VFX.

It was a pretty straightforward set-up. The foreground plates were shot, with camera information recorded, and then comped with a location plate of Hollywood Boulevard. In order to get the right perspective from inside the diner, we had to manipulate the background plate by resizing the image, creating a few matte painting building elements, and adding some vehicles driving by. Also, there were timing issues on the extras walking in the background – some of them had to be removed or replaced.

Another thing we did for this sequence was a continuity fix with a reverse angle on Jerome, which required us to remove a waitress and make her appear in the angle on Agatha. We also had to add some arm movement to the fry cook – in a lot of his takes he looked frozen, like a still image. We did this by rotoscoping out his arm from another take and compositing it into the hero plate. This sequence was mostly handled by Barb Benoit using Digital Fusion.

What else did you do to turn Toronto into Hollywood?

During the editing process, there were a number of shots where David wanted to add the Hollywood Hills to the original Toronto location. All of them required rotoscoping of the actors and the objects within the frame. Luckily, the shots were filmed outside, and the sky was quite uniform, so we were able to get a rough key for most of it. It did still require a lot of rotoscoping of any fine hair on Mia and John Cusack, as well as moving objects within the frame.
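
A rough key on a uniform sky can be pulled from simple colour distance, which is broadly the kind of first pass described here before the detailed rotoscoping begins. The Python sketch below is generic and hedged: the file name, sample colour and thresholds are all assumptions, not Switch VFX’s setup.

```python
# Sketch of a rough sky key: build a matte from colour distance to a
# sampled sky colour, then soften the edge. Thresholds are illustrative;
# fine hair detail would still need rotoscoping, as described above.
import numpy as np
import cv2

def rough_sky_key(plate_bgr, sky_sample_bgr, tol=28.0, soft=14.0):
    diff = plate_bgr.astype(np.float32) - np.float32(sky_sample_bgr)
    dist = np.linalg.norm(diff, axis=2)
    matte = np.clip((dist - tol) / soft, 0.0, 1.0)       # 0 = sky, 1 = foreground
    return cv2.GaussianBlur(matte, (5, 5), 0)

plate = cv2.imread("toronto_plate.png")                  # hypothetical file name
if plate is not None:
    matte = rough_sky_key(plate, sky_sample_bgr=(235, 210, 190))
    cv2.imwrite("sky_matte.png", (matte * 255).astype(np.uint8))
```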

All the Hollywood Hills shots were tracked using Nuke’s 3D and conventional trackers. The background plates were shot with a still camera in L.A. from a number of angles, because we were not sure which one would work best for the shots until we started putting them together. Also, we couldn’t actually shoot where the scenes took place in the script, so our matte painter took the stills and tiled a number of them together to create the correct view.

We had to remove a lot of foreground buildings, the Hollywood sign, electrical cables and so on, in order to get exactly what David was looking for. We replaced all the skies as part of the matte painting process as well. This was then all put together by our senior compositor, Gudrun Heinze, using Nuke.

How involved was David Cronenberg with the visual effects?

He’s a very technically-oriented guy, so you can have a conversation with him about what steps are needed to achieve these shots, and he gets it. He also knows when to take a step back. I feel he treats his departments the same way he treats his actors – he wants them to perform as best they can, and he knows to achieve that he doesn’t need to direct every single detail. He was very clear in describing what he was looking for, so after he had taken us through it, he pretty much left us to decide how we wanted to execute it.

Most of the shots in Maps to the Stars were straightforward and seamless, so not much was required creatively. However, the fire sequence was very much a back-and-forth process with David, because the outcome was very important to him. As everyone knows with fire and water simulations, you set one up and see what the outcome is, so we had to go through quite a number of simulations before he was happy. Overall, David was a pleasure to work with.

How do you feel about Switch VFX’s work on the show, looking back at it now?

We are very happy with the work. Most of what we did are seamless effects, and they are integrated into the film quite well. Some critics seemed to focus on the fire scene as taking you out of the film because of its clearly CG nature, but considering what was involved, and knowing we had to go this route – and because the scene is very surrealistic and unsettling – we feel that it works quite well.

In this industry, we all wish we had more time to perfect things. But if we were left to do that we would never deliver, because as artists we are never satisfied! We are very proud of the end result, and we look forward to getting an opportunity to work again with such a respected filmmaker as David.

Maps to the Stars was released on DVD/Blu-ray and Netflix on 14 April 2015.

Maps to the Stars photographs copyright © 2014 by Focus Features and courtesy of Switch VFX.

Some Very English VFX

Aidan Turner in "Poldark"

With its dramatic coastline, bleak moorlands and rolling hills, the picturesque West Country of England has long been popular with filmmakers seeking landscapes filled with both character and a sense of history.

The counties of Dorset, Devon and Cornwall have featured in films like Barry Lyndon, The French Lieutenant’s Woman and War Horse – not to mention almost every Jane Austen adaptation ever made. They’ve even made appearances in recent big-budget blockbusters including World War Z and Edge of Tomorrow. Coming right up to date, these very English landscapes play a significant part in two new productions – one made for the big screen, and one for the small.

First up is the feature Far from the Madding Crowd, adapted from the Thomas Hardy novel and directed by Thomas Vinterberg for DNA Films. Distributed by Fox Searchlight Pictures, the film tells the tempestuous story of Bathsheba Everdene (Carey Mulligan), and her attempts to juggle three different suitors, in rural 19th century England.

Second is Poldark, a new television adaptation of the classic novels by Winston Graham. The Mammoth production is directed by Edward Bazalgette and Will McGregor, and stars Aidan Turner as Ross Poldark, recently returned to England after fighting in the American Revolution to find his father dead and his inheritance in tatters.

Both productions feature a range of visual effects by two Soho-based companies, designed to enhance the natural landscape and help in the task of whisking viewers back through time to a bygone age.

Watch a Union VFX breakdown reel showing their work on Far from the Madding Crowd:

Far from the Madding Crowd

As sole vendor on Far from the Madding Crowd, Union VFX created visual effects for a number of key scenes, including a burning-barn sequence using a combination of CG and real fire elements, with embers and smoke created using Houdini.

Equally dramatic was a scene in which a flock of sheep is herded over a cliff by a sheepdog. In order to choreograph this complex action, a small group of sheep was photographed leaping over a small drop. Union took these live-action plates and multiplied them many times over to create the illusion of the entire flock in free-fall. Similar replication techniques were used to enhance additional harvest and snow scenes.

Tim Caplan, Union’s co-founder and lead VFX producer on the film, commented, “It was inspiring to work in Dorset – such a beautiful part of the world – and re-imagine it as it was at the turn of the 20th century. It was also a privilege to work with one of the film industry’s most exciting directors in Thomas Vinterberg. And the project itself was an exciting challenge. The big fire sequence, for example, came with a host of logistical and health-and-safety restrictions. We had to work out how to keep that sense of real jeopardy and authenticity in the picture – but without of course endangering any of the cast or crew. Far From The Madding Crowd has been a fascinating experience and we are hugely proud to have been a part of it.”

Watch a trailer for Far from the Madding Crowd:

Poldark

Visual effects for Poldark were supervised by Alexis Haggar, and include a number of set extensions and digital matte paintings created by Lexhag VFX.

One of the major challenges facing the production was the re-creation of the 18th century Cornish tin mining industry. While some mining architecture still survives in the region, much of it is in ruins. The task of restoring it to full working order therefore fell to the Lexhag team.

The first step towards building a digital tin mine involved a detailed 3D scan of the location. “All of the major set extensions were started with a LiDAR scan,” Haggar explained. “For Wheal Leisure – Ross Poldark’s mine – the art department built the lower areas around an existing mine on the Cornish coast and we took over for the higher elements, such as the roof structure and windows. Grambler, the large mine set into the hillside, was a combination of digital matte painting and 3D elements all composited in Fusion Studio.”

Visual effects artist Ken Turner elaborated, “All the 3D elements were rendered as .exr files and brought into Fusion to relight and grade. The .exr files handle multiple light passes, and masks for all of the separate elements, which gives you a lot of control for interactive adjustments in Fusion. Once the still frame was close to the finished article, I then took it into Photoshop for a final paint, breaking up the clean CG edges with grime and rock before taking it back into Fusion where I added people, smoke, grain, lens aberrations and lots of little tweaks to make the still matte painting come to life.”
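
The multi-pass idea Turner describes can be illustrated with a small Python sketch that reads individual light passes from an EXR and recombines them with per-pass gains. The file name and the pass.R/G/B channel naming are assumptions about how such a render might be laid out, and the actual relighting and grading happened in Fusion.

```python
# Sketch of the multi-pass idea: read individual light passes from a
# multi-channel EXR, apply a per-pass gain, and sum them back into a
# beauty image. The file name and channel layout are assumptions.
import numpy as np
import OpenEXR
import Imath

exr = OpenEXR.InputFile("grambler_mine_render.exr")
dw = exr.header()["dataWindow"]
w, h = dw.max.x - dw.min.x + 1, dw.max.y - dw.min.y + 1
FLOAT = Imath.PixelType(Imath.PixelType.FLOAT)

def read_pass(name):
    chans = [np.frombuffer(exr.channel(f"{name}.{c}", FLOAT), dtype=np.float32).reshape(h, w)
             for c in ("R", "G", "B")]
    return np.stack(chans, axis=-1)

gains = {"key": 1.0, "fill": 0.7, "rim": 1.3, "ambient": 0.5}   # interactive relighting controls
beauty = sum(gain * read_pass(name) for name, gain in gains.items())
print("recombined beauty:", beauty.shape, "max value", float(beauty.max()))
```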

Commenting on the demands of television production timescales, Haggar concluded, “Speed is the key for us. Keeping your creativity alive while compositing has always been a challenge. Waiting for elements to render or playback is always frustrating. Fusion provides the best of both worlds: fast compositing or high-accuracy ‘pixel pushing’ for absolute perfection.”

Special thanks to Stephanie Hueter and Cheryl Clarke.

The Making of Robot Overlords

Callan McAuliffe in "Robot Overlords"

In the low-budget British sci-fi adventure Robot Overlords, alien automatons rule the streets and ordinary citizens are locked under curfew in their homes. Just by stepping outside, you risk being vaporised by a hulking Sentry, or picked off by a lethal Sniper.

Amid the ruins of civilisation, Sean Flynn (Callan McAuliffe) leads his group of young friends on a quest to join the resistance forces that are standing against the robot invaders. Hot on their heels, however, is their old teacher – now a treacherous robot collaborator – Robin Smythe (Ben Kingsley), and his captive, Kate (Gillian Anderson). Who will prevail? Man or machine?

Directed by Jon Wright, Robot Overlords features some 265 visual effects shots delivered by Soho-based Nvizible, supervised by the company’s founder and co-owner, Paddy Eason. The film’s complex tracking requirements were fulfilled by Peanut FX, with additional compositing support being provided by Boundary VFX, under the supervision of Nick Lambert.

Watch a wide selection of Nvizible’s VFX shots in the official video for Robots Never Lie, written by Matt Zo for the Robot Overlords end credits:

Origins of the Overlords

The original story concept for Robot Overlords came to director and co-writer, Jon Wright, in a dream. “I dreamed that I was playing cowboys and Indians with a little boy,” Wright recalled. “A huge two-storey robot came marching up the street and swung its laser cannon arm towards him, and a voice boomed out, ‘Citizen, drop your weapon immediately!’ I assumed I was just recycling a movie that I’d already seen, but eventually, I came to the conclusion that maybe it was an original idea.”

After expanding the dream concept into a two-page treatment, Wright began developing a script with co-writer Mark Stay. “Jon and I are both big fans of those 1980s Amblin movies like E.T.: The Extra-Terrestrial and The Goonies,” said Stay. “But those films were always set in American suburbia – we wondered why nobody had ever set a film like that in the UK.”

“It came together very quickly, especially for a British film,” Stay continued. “Our elevator pitch was, ‘It’s like The Goonies, but with robots blowing stuff up!’ We took Jon’s treatment to Piers Tempest – one of the producers from Jon’s previous film, Grabbers. The BFI put us under their wing, and before we knew it we had some development money.”

VFX supervisor Paddy Eason was also consulted at this early stage. “I knew Jon because we did the effects for his first two films – Tormented, and an Irish monster movie called Grabbers,” Eason commented. “The first thing I received was the treatment, and the next throw of the dice for me was getting invited to be part of a group that critiqued the script – Jon calls it the ‘Tiny Think Tank’.”

Sentry – "Robot Overlords"

The Tiny Think Tank

The Tiny Think Tank is an initiative conceived by Wright as a means by which a group of filmmakers can gather together periodically, in order to review scripts in development. “It’s a gang of writers, actors, directors, visual effects people,” Wright explained. “There aren’t any producers or financiers involved – it’s purely creatives. We come together, read a script completely cold, then have a round-table discussion. It was inspired by Pixar. They have their team of creatives – who are also shareholders – who come together and critique each other’s films. The Tiny Think Tank is our low-budget UK version of that, and I think it’s one of the reasons Robot Overlords got made. Our script accelerated in terms of quality much more quickly than it would have done had we been writing it in isolation.”

Nvizible’s VFX breakdown for a key sequence in “Robot Overlords”

As part of the Tiny Think Tank, Paddy Eason was able to offer a level of opinion and criticism not commonly afforded to people in the visual effects profession. “I had some suggestions from my own visual effects point of view,” Eason remarked. “But I also put some story ideas out to Jon and Mark. As a visual effects person, that’s not normally part of your remit. In fact, on many shows, to put forward story ideas would be viewed as a breach of etiquette. So it was delightful to be invited to give that kind of feedback.”

Nvizible’s association with Robot Overlords developed further when the company became a production partner on the project. Nevertheless, they were still required to bid for the work. “Robot Overlords was a regular production with a bond company and other financiers, so it had to be seen not only that our bid was value for money, but also that our costs weren’t unrealistically low,” Eason pointed out. “Also that, if the unthinkable happened and we dropped the ball, there would be somewhere else to go to get the work done. Obviously it didn’t come to that!”

Robin Smythe (Ben Kingsley) and the Mediator (Craig Garner) visit the alien Cube.

How to Design a Robot

Key to the success of Robot Overlords was the convincing creation of the alien machines that have descended from outer space to occupy planet Earth. Design concepts for the mechanical marauders developed naturally alongside the story.

“The Robot Empire is stretched across the galaxy,” explained Stay. “If you compare it to the Roman Empire, the Earth is like Caledonia – it’s the last place any of them want to be. Resources are stretched thin. So the Sentries, for example, are quite sorry figures. They walk with a kind of stoop, and they’ve got chips and dents and dinks in their armour, because they’ve been here for three years and survived a war.”

According to the storyline, the Robot Empire studies the inhabitants of each planet they plan to subjugate, and designs an automated invasion force specifically to tap into the psychological profiles of that planet’s inhabitants. This bespoke engineering approach is evident in the design of the Sentries: two-storey, bipedal robots with small heads and enormous, hulking arms.

According to co-writer Mark Stay, the robot Sentries are designed to tap into our instinctive fear of “people with tiny brains and big muscles”.

“In studying us, the robots have discovered that we’re frightened of people with tiny brains and big muscles,” explained Stay. “That informed the design of the Sentry. Our other robots – like the Snipers and Air Drones – are very functional, each with a specific purpose.”

A different psychological insight led to the design of the Mediator – a kind of robot diplomat which acts as an interface between human beings and their automated oppressors. “The Mediator is made to resemble a child, because they’ve observed that, on the whole, adults don’t behave aggressively towards children,” Jon Wright remarked. “But it’s been designed by an alien species who aren’t familiar with the minutiae of how we react, how our culture is organised. So they’ve got it a bit wrong. They’ve attempted to make something likeable, but they’ve actually made something that’s extremely creepy.”

Early conceptual designs for the robots were developed by storyboard and concept artist, Gabriel Schucan. “Jon and I sat with Gabriel,” Stay recalled. “We scribbled ideas, and he would develop them and send drawings back to me and Jon. That was still very early on in the writing of the script.”

Also at this early stage, the team created the “Robot Compendium”, a manual containing a piece of key art for each robot, drawn by Jack Dudman, along with a description of the machine’s capabilities. “The Robot Compendium helped us to sell the script to investors,” remarked Stay. “So it was a really important document for us.”

During the story development process, visual effects constraints influenced what might be achievable, both practically and financially. “While we were still talking about the script, we started doing visual effects breakdowns in a very loose, back-of-an-envelope way,” said Eason. “Some scenes would turn out to be looking very expensive, and that was fed back into the writing process. For example, there was a scene in the woods with our heroes being chased by robots called Octobots. That was quite heavy-duty in terms of visual effects, and wasn’t particularly important in the story. So it was elbowed fairly early on.”

"Robot Overlords" concept art by Paul Catling.

“Robot Overlords” concept art by Paul Catling.

The Overlords Become Real

Once Robot Overlords was greenlit, the initial robot concepts were handed over to designer and illustrator Paul Catling, known for his work on films including Guardians of the Galaxy, Captain America: The First Avenger and the Harry Potter series.

“Paul is incredibly productive,” Eason remarked. “He did lots and lots of pen-and-ink designs, just exploring how the robots might look. Early on, some of them were much more organic-looking, and skeletal, and frightening in a slightly Satanic kind of way. Ultimately, we decided the robots should be slightly crude-looking. We wanted to avoid comparisons with Transformers – Jon wanted our robots to look much more industrial. We also wanted them to be slightly reminiscent of WWII Nazi technology.”

Although the design of the Sentry was inspired directly by Wright’s dream, it nevertheless went through a lengthy development process. “One of the original tests we did for the Sentry was a lot more like Robby the Robot,” Wright revealed. “Then he developed into more of an Iron Giant character – we just pushed that design to make it slightly more peculiar. He’s slightly lop-sided, and made up of different geometric shapes. That was partly driven by the plot device of having them fold up into cubes, which was our way of showing that they’ve been deactivated.”

The Mediator robot was played by Craig Garner, whose appearance was enhanced in post-production to give the actor’s skin an unsettling plastic sheen. For some scenes, frames were removed to create staccato movements.

The Mediator, in contrast, was conceived from the beginning as a machine that looked like a human – but not too much like a human. “Right from the beginning, we wanted deliberately to put the Mediator in the uncanny valley, rather than try to avoid it,” Wright noted. “We gave the actor, Craig Garner, perfect false teeth, trimmed his hairline to a perfectly straight line, gelled his hair rock-solid, and tried to remove all of his imperfections and blemishes with make-up. Then that was enhanced in post-production with clean-up, skin glows, making his eyes uniform. Sometimes we’d take out every other frame to give him a kind of clockwork, staccato movement.”

“The bulk of the Mediator work was done in the grade by the colourist, Gareth Spensley, on his Baselight,” Eason elaborated. “However, the software we use at Nvizible for skin work is a suite of NUKE tools that we use as part of our ‘Nhance’ service. These allow us to selectively filter out features of a certain size – for example, wrinkles or spots – while retaining everything else, like image grain, shadows and regular skin texture.”
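
As a generic illustration of size-selective filtering, and emphatically not Nvizible’s proprietary Nhance toolset, the sketch below isolates a frequency band corresponding to mid-sized features and subtracts a portion of it, leaving fine grain and broad shading alone. The blur radii and file names are illustrative.

```python
# Generic sketch of size-selective filtering: isolate a frequency band
# corresponding to mid-sized features such as wrinkles, and subtract a
# portion of it, leaving fine grain and broad shading untouched.
import cv2
import numpy as np

def suppress_band(img, small_sigma=2.0, large_sigma=8.0, amount=0.8):
    img = img.astype(np.float32)
    band = cv2.GaussianBlur(img, (0, 0), small_sigma) - cv2.GaussianBlur(img, (0, 0), large_sigma)
    return np.clip(img - amount * band, 0, 255).astype(np.uint8)

face = cv2.imread("mediator_closeup.png")            # hypothetical plate
if face is not None:
    cv2.imwrite("mediator_smoothed.png", suppress_band(face))
```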

Robots at Nvizible

With the exception of the Mediator, Nvizible further developed the robot designs in their CG workspace. This process was facilitated by initial 3D work done by Paul Catling. “As Paul progresses his designs, he starts working in 3D,” explained Eason. “He’ll do a render, then go on top of it in Photoshop and paint lighting effects and so on to turn it into an illustration. But the underlying geometry is there. He even rigs it a little so he can move the arms and legs around to come up with certain key poses. So he ended up with some pretty well-realised CG models, which he then gave to us. We had to rebuild them, but they gave us a head-start.”

Nvizible’s robot models were built, rigged and animated primarily using Maya, with additional sculpting work done using ZBrush. “The Sentry ended up being a very complicated rig, with hundreds of moving parts,” Eason noted. “But it does play the biggest part in the film.”

Texturing was applied in MARI, using photographic reference chosen to support the story point that the robots have been operating in a hostile environment for many years, and are therefore both weathered and battle-worn. “There was no decoration on the robots – no painted symbols or information; it was all slabby, raw, metal, moving parts,” Eason noted. “We took photographs of similar unadorned, mechanical things, like old steam engines, big industrial lifts, big heavy things made out of cast iron. Part of the fun was making them nice and weathered – chipped and broken – which also helped to give them scale.”

Sentries await deployment in "Robot Overlords"

Ncam and the Air Drone

Many key scenes featuring the robots were shot using a real-time camera-tracking system provided by Nvizible’s sister company, Ncam. The Ncam system allows a camera operator to view pre-visualised CG assets through the viewfinder, adjusted to conform to the parameters of the lens and superimposed in the correct position over the live-action.

“Ncam works without any tracking markers. It lets you see a previs version of your visual effects assets through the camera, or at video village,” Eason explained. “Ncam combines the live video feed from the camera with a render from a workstation, all in real-time, showing your VFX asset in situ. It could be anything from a set extension to a creature.”
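
Strip away the clever tracking and the picture on the monitor is a standard alpha-over composite of the rendered asset on the live feed. The sketch below shows just that blend in NumPy; the frames are placeholders, and the real-time camera solving that makes a system like Ncam useful is not modelled here.

```python
# The monitor image is, in essence, an "over" composite of the rendered
# asset (with its alpha) on the live camera frame. The real-time lens
# and position tracking is the hard part, and is not modelled here.
import numpy as np

def over(render_rgba, live_rgb):
    rgb = render_rgba[..., :3].astype(np.float32)
    alpha = render_rgba[..., 3:4].astype(np.float32) / 255.0
    return (rgb * alpha + live_rgb.astype(np.float32) * (1.0 - alpha)).astype(np.uint8)

live_frame = np.full((1080, 1920, 3), 90, np.uint8)      # stand-in for the camera feed
previs_asset = np.zeros((1080, 1920, 4), np.uint8)       # stand-in for the rendered previs asset
monitor_image = over(previs_asset, live_frame)
print(monitor_image.shape)
```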

Ncam was used during the film’s opening sequence, in which a crazed man runs down a suburban street shouting defiance at his robot oppressors. An Air Drone flies down, warning the man to return to his home; when the man ignores the robot, it vaporises him.

"My view with visual effects is to say, ‘Go ahead and do everything practically that you can, and we'll deal with it in post.’ Rain? No problem!" - Paddy Eason, Nvizible.

“My view with visual effects is to say, ‘Go ahead and do everything practically that you can, and we’ll deal with it in post.’ Rain? No problem!” – Paddy Eason, Nvizible.

The sequence was shot on location in Bangor, on the outskirts of Belfast, Northern Ireland, using primarily hand-held cameras. Following Wright and Schucan’s storyboards, Nvizible created previs for the sequence, using survey data of the location gathered by members of their Belfast office.

During the night shoot, interactive lighting was coordinated by Fraser Taggart, director of photography, to simulate the glow from the Air Drone’s blue hover-jets, while practical rain simulated a torrential downpour. “Rain gives you a huge amount of visual texture, so it’s a lovely thing to have,” observed Eason. “Generally speaking, my view with visual effects is to say, ‘Go ahead and do everything practically that you can, and we’ll deal with it in post.’ Hand-held cameras? No problem! Rain? No problem! We’ll add CG rain, and we’ll make sure it’s composited behind your rain. But there were a couple of shots where the camera was looking up into the sky, where it would have been too much of a nightmare to have rain falling into the lens, so I did ask them to shoot dry versions to give us the option.”

Practical pyrotechnics were used for the moment when the man is blown up by the robot, somewhat in the manner of a human firework. “We shot the plate with the actor, a clean plate with no actor, and a third plate with the pyro going off,” Eason revealed. “We used a similar camera move each time, but there was no motion control – which is fine as long as everyone’s on the same page. The practical pyro gave us real interactive lighting, and a good clue of how the rain is lit up. We enhanced it with additionally-shot bluescreen pyro elements afterwards.”

For the shots featuring the Air Drone, Ncam was used extensively to aid the shooting of the live-action plates. “It was exciting to see the reaction of the camera operator,” Eason recalled. “Like a lot of people, he was a bit sceptical about Ncam – sometimes people view it as a bit of an encumbrance. But when he realised he was actually able to see the hovering Drone through his viewfinder, and walk around it and choose angles, he was really excited and impressed.”

Robot gun barrel

In one close-up shot, the camera executes a 180° move around the muzzle of the Air Drone’s gun, racking focus on the barrel as it goes. “I don’t think you’d have shot that plate without Ncam,” Eason asserted.

While the live-action plates were used for most scenes featuring the flying robot, Nvizible created some backgrounds using 360° HDRI panoramas shot on location by VFX assistant Erran Lake, using a DSLR camera. “Sometimes the plate done with the movie camera might be nice, but technically very difficult,” Eason explained. “So we would create our own version of that using my stitched stills, and put the rain in ourselves.”

Ncam was used for a scene in which a giant robot Sentry pursues Connor (Milo Parker) down a suburban street.

As director, Jon Wright appreciated the freedom afforded by Ncam, notably during a sequence in which a gigantic Sentry chases a boy down a suburban street. “You’re able to shoot something as if it was actually there, so it unlocks your shooting style,” he commented. “We were able to imbue those shots with a hand-held quality, and be quite cavalier about the way we shot. Normally when you shoot a visual effect like that, and there’s nothing there, you’re worried about not framing it up properly.”

“It was quite difficult to set up, positioning the robot and getting good tracks on the camera,” Wright continued, “but it meant we were able to do a running hand-held shot backwards with the boy, framing up the robot above him. The Sentries are massive, so they tend to go out of frame. If there’s nothing there, it makes the operator nervous and they tend to err on the side of caution and give you a slightly peculiar, lop-sided frame. If they can see the thing, they frame it much more naturally, and you get a much more interesting shot.”

“I imagine in years to come this will be fairly standard practice,” Wright concluded. “The days of tennis balls on sticks are numbered.”

Watch a video of the live Ncam feed as seen by the camera operator:

Robots in Post

By the time post-production began, Robot Overlords already existed in rough-cut form, since editor Matt Platts-Mills had been present on location throughout production, cutting each day’s rushes together into a rough assembly. “It was very helpful to have Matt with us, cutting as we went,” Eason observed. “By the time the shoot was done, we were in a good position to hit the ground running on certain key visual effects scenes.”

Since much of the film had been shot hand-held using anamorphic lenses, one of the first challenges for the visual effects team was tracking. Peanut FX handled all the film’s tracking requirements, delivering 3D layouts for roughly 115 shots, mostly with 3D camera and/or object tracks for set extensions and robot integration.

“We mainly used 3DEqualizer,” commented Amelie Guyot, matchmove supervisor at Peanut. “We find it great for solving anamorphic camera tracks when lens distortion grids and camera info are available. We were provided with a lidar scan model for scenes involving a castle, which we used to perfectly match the plates – 3D scans are always very helpful, especially with anamorphic shots. We also used Autodesk’s Matchmover to calibrate sets; it’s still one of our favourite tools for image-based 3D reconstruction.”
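
One reason a lidar scan is so helpful is that, once a handful of scanned 3D points have been located in the plate, the camera pose can be solved directly from the correspondences. The OpenCV sketch below illustrates that step with invented points and intrinsics; it is not Peanut FX’s 3DEqualizer workflow.

```python
# Illustration of why a scan helps: with known 3D survey points and
# their 2D positions in the plate, the camera pose can be solved.
# Points, intrinsics and distortion here are hypothetical.
import numpy as np
import cv2

object_pts = np.float32([[0, 0, 0], [4.2, 0, 0], [4.2, 0, 3.1],
                         [0, 0, 3.1], [2.1, 5.0, 1.5], [1.0, 2.5, 2.0]])   # points from the scan
image_pts = np.float32([[612, 704], [1233, 719], [1241, 316],
                        [607, 301], [960, 520], [780, 455]])               # same points in the plate
K = np.float32([[2200, 0, 960], [0, 2200, 540], [0, 0, 1]])                # assumed intrinsics
dist = np.zeros(5, np.float32)                                             # distortion already removed

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    print("camera rotation:\n", np.round(R, 3), "\ncamera translation:", tvec.ravel())
```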

“Tracking was a concern early on,” Eason admitted, “but the guys at Peanut absolutely rocked it and did a brilliant job.”

Tracking for "Robot Overlords" was handled by Peanut FX, who worked meticulously to deal with director Wright's extensive use of hand-held cameras, and some challenging sequences in which the robots interact directly with humans and their technology.

Tracking for “Robot Overlords” was handled by Peanut FX, who worked meticulously to deal with director Wright’s extensive use of hand-held cameras, and some challenging sequences in which the robots interact directly with humans and their technology.

During post-production, Jon Wright made himself available to the Nvizible team on a regular basis. “I made a point of remaining on the film throughout the time that Nvizible were working on the visual effects, so I could go in and out of their offices at my leisure,” the director stated. “It saves a lot of work if you’re seeing updated shots regularly – you tend to have less waste. I don’t know how to run Nuke or anything, but I think I’m quite geeky, and have a good understanding of how effects work. I can tell if something’s going to be quick to do, or time-consuming. I think maybe some directors don’t have that.”

With regard to the animation of the robots, Wright paid particular attention to the way the movement of the metal invaders expressed their character – or rather, their lack of it. “There’s a tendency to anthropomorphise robots,” Wright observed. “Our robots are quite implacable, so it was important to strip out emotion.”

Even given this direction, Wright found that individual animators inevitably had their own take on how the robots should move. “After a bit I could guess quite accurately who’d animated what, because they bring their own personality to it,” he commented. “It got to a point where I would say, ‘I know this shot would be great for this animator,’ on the basis of what they’d done before.”

Lighting the CG robots brought its own set of challenges, due to the difficulty of making metallic objects look both highly reflective and convincingly three-dimensional. “It’s a bit of a double-edged sword,” Eason remarked. “If you’ve got good HDRIs for everything, you get good reflections, but because most of what you see with metal is based on reflections, it’s hard to make stuff that’s physically juicy, with a nice keylight and a nice fill light. You’re always fighting that chrome robot thing, where it all looks a bit floaty and unlit.”

Skyship concept art by Paul Catling.

Riding the Skyship

One of the most difficult visual effects sequences occurs at the film’s climax, when Sean finds himself riding through the skies on the front of the enormous robot Skyship.

“They were really hard shots, because they were extremely ambitious,” Wright admitted. “I’d imagine that even the likes of ILM working for J.J. Abrams would have found those shots tough to get right.”

Once again, Ncam was used to combine a previs version of the Skyship with the live-action as it was being shot on a small studio set. “The set had a slight bounce, which made things interesting,” Eason recalled. “We ended up replacing all the practical set with CG, although it was useful to have it there for reference, shadow catching and so on.”

The film’s ambitious climactic scenes, during which Sean (Callan McAuliffe) rides on the exterior of the massive Skyship, proved particularly challenging.

In post-production, the low-resolution Skyship model was replaced by its high-resolution counterpart. “The main challenges with the Skyship came in look-dev, working out how to make a big shiny collection of metal blocks look big and scary and cool,” Eason commented. “In reality, large steel things tend to look like concrete, so we had to cheat the physics of the renders quite a bit. We had a lot of fun with the Skyship engines – big jets of dirty orange and blue fire. Our FX lead, Wayde Duncan Smith, did a great job of simulating all that stuff in Houdini.”

Because the sequence featured multiple camera angles, with moves that were very specific to the animation of the various characters and craft, it was deemed unfeasible to shoot live-action background plates. Instead, generic aerial plates of landscapes were shot using an array of DSLR cameras.

“We flew the route in a chopper several times, shooting 160° panoramas using three Canon 5Ds capturing stills at 3fps,” Eason revealed. “We fed these image sequences through a pipeline of undistortion, stitching, stabilising, optical flow and re-projection, to make beautiful background plates of moving land for all the flying Skyship shots. It was hard work, but our Nuke artist, Antoine Jannic, absolutely nailed it.”
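
As a hedged outline of the stages Eason lists, the sketch below undistorts each still, stitches the three cameras into a panorama per time step, and computes optical flow against the previous panorama. The intrinsics, distortion values and file paths are invented, and the retiming and re-projection that produced the final plates are only indicated in comments.

```python
# Skeleton of the aerial-plate pipeline described above, with made-up
# camera parameters and file layout. Stabilisation, 3fps-to-24fps
# retiming and re-projection are only hinted at in comments.
import glob
import cv2
import numpy as np

K = np.float32([[3600, 0, 2880], [0, 3600, 1920], [0, 0, 1]])    # assumed still-camera intrinsics
dist = np.float32([-0.08, 0.02, 0, 0, 0])                        # assumed lens distortion coefficients

def panorama_at(step):
    paths = sorted(glob.glob(f"aerials/step_{step:04d}_cam?.jpg"))   # hypothetical file layout
    stills = [cv2.imread(p) for p in paths]
    stills = [cv2.undistort(s, K, dist) for s in stills if s is not None]
    if len(stills) < 2:
        return None
    status, pano = cv2.Stitcher.create().stitch(stills)
    return pano if status == 0 else None                         # 0 == Stitcher::OK

prev = None
for step in range(300):
    pano = panorama_at(step)
    if pano is None:
        continue
    if prev is not None and prev.shape == pano.shape:
        # flow between consecutive panoramas would drive stabilisation and
        # in-between frames downstream; re-projection happened in Nuke
        flow = cv2.calcOpticalFlowFarneback(
            cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY),
            cv2.cvtColor(pano, cv2.COLOR_BGR2GRAY),
            None, 0.5, 3, 31, 3, 5, 1.2, 0)
    cv2.imwrite(f"plates/pano_{step:04d}.jpg", pano)
    prev = pano
```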

Watch a video of a completed shot from the Skyship sequence:

Enter the Spitfire

Also featured during the film’s finale is that most British of icons – a Supermarine Spitfire. The old-school technology of the WWII fighter plane proves pivotal in the battle against the robots, whose jamming devices have grounded more sophisticated jet aircraft.

The production shot live-action of the historic plane on the ground using a mock-up provided by Gateguards UK. An authentic Spitfire was also made available for one day of the shoot by The Aircraft Restoration Company, Duxford. In addition, Nvizible built a CG Spitfire which they used for additional aerial shots.

Jon Wright (right) directs the action around the museum-piece Supermarine Spitfire, on location in the Isle of Man.

“We used the mock-up Spitfire in a remote, disused quarry location on the Isle of Man,” explained Eason. “Then we also had the real Spitfire fly out to us for the day. We positioned it on the airfield and dressed it similar to how the mock-up had been in the quarry. It was a huge privilege to have a real museum-piece there, and a dream for us in terms of creating our CG Spitfire for the flying shots, because we were able to get perfect reference, taking thousands of photographs, right down to every rivet.”

Eason was doubly delighted when he found himself booked in for a flight in a two-seat Spitfire. “The producer, Piers Tempest, knew I was a big fan of Spitfires, and he made a casual comment: ‘We’ll get you a flight in it.’ I didn’t believe it would ever happen, but a couple of months later it was arranged for me to go to the Imperial War Museum at Duxford. I got to fly around, and control the Spitfire, and everything. Ostensibly it was to record audio inside the plane – I was wired for sound – but also it was just a marvellous day out.”

Robot Reflections

Looking back at his experience of working on Robot Overlords, Mark Stay noted the benefits brought by the collaborative process: “I’ve been involved in the whole procedure in a way that many other writers aren’t usually. I was writing the novelisation at the same time, so I would email Paddy to ask questions like, ‘What colour are the engines?’ He would invite me over to Nvizible so I could sit in on VFX test screenings, and I would be there furiously scribbling all this stuff down for the novel. I’ve been really blessed with the access that I’ve had.”

“It was a delightful experience, because of our creative involvement from beginning to end,” Paddy Eason added. “I love the fact that it’s quite an unusual British film – very charming and quirky, starting at a very small personal scale and ending up quite epic and extraordinary.”

The film’s ambition is something that Jon Wright is proud of. However, he remains sanguine about the challenges involved in bringing such a film to the screen.

“It’s a bit of a shame that we aren’t really making these kinds of movies in Britain and Ireland, and that British cinemagoers almost prefer to go and see American films over British films. And we’ve got all our best filmmakers going across the water to do remakes,” Wright commented. “But the truth is, when you make a movie like this, you go up against massive Hollywood blockbusters, and it’s difficult to deliver in the way that they deliver. So we tried to focus on character, and give the film a different tone to a typical Hollywood movie at the moment – not dark and moody, but kind of light-hearted and optimistic. Our kids swear a lot and are playful, rather than being the earnest, soul-searching types you get in typical American YA movies.”

Echoing Wright’s thoughts, Eason recalled, “I remember one meeting we had early on, where we pitched the project to quite a well-known film producer. This producer said, ‘If you pull it off, I’ll eat my hat.’ So now we’re going to get a nice hat-shaped cake made for him!”

Having enjoyed limited theatrical release across the UK through Signature, Robot Overlords may also be gaining momentum as a TV series. “We have strong interest from two big broadcasters, in the US and the UK,” Wright revealed. “In 2015, television is probably the way to go with this sort of project. We could really explore the human politics of the situation – life under enemy occupation. Obviously, the occupation is going on across the entire planet, so there’s a million different directions you could expand the story into. And ten hours of television is probably more exciting than another 90-minute film. It could be very good – watch this space.”

Robot Overlords will soon be released on DVD. In May 2015, the production team will be hosting panels at MCM Comic Con Belfast and MCM Comic Con London.

Special thanks to Marek Steven, Alex Coxon, Piers Tempest and Phil Guest. “Robot Overlords” photographs and video copyright © 2015 and courtesy of Tempo Productions and Nvizible.

Inspiring MPC

What drives people to work in the visual effects industry? The glamour? The technology? All those ravening monsters and exploding spaceships? Or is it just another job? In an ongoing series of articles, we ask a wide range of VFX professionals the simple question: “Who or what inspired you to get into visual effects?”

Here are the responses from the staff at MPC.

Starting with Stop-Motion

Many of today’s visual effects professionals are suckers for a little old-fashioned stop-motion animation. Richard Stammers, production VFX supervisor at MPC, is no exception. “Growing up, I had a real love of Ray Harryhausen’s work like Jason and the Argonauts and Sinbad and the Eye of the Tiger,” he enthused. “But seeing Phil Tippett working behind the scenes on the AT-AT Walkers for the Hoth battle from The Empire Strikes Back was a turning point for me.”

For “The Empire Strikes Back”, Phil Tippett and Jon Berg employed the Lyon-Lamb animation system (right) to provide instant replay capability as they animated the AT-AT Walkers for the principal VistaVision cameras. Image copyright © by Industrial Light & Magic. All rights reserved.

Paul Chung, animation supervisor, is a fan of the old school too. “I came from that generation when VFX meant back-projection, stop-motion and optical,” he noted. “My inspiration was Ray Harryhausen and Disney films. That was all I knew.”

Harryhausen also inspired Dan Zelcs, lead rigger, who included the stop-motion legend in his round-up of early influences: “As a kid, my inspirations and interests ranged from Bugs Bunny, Star Wars and Ray Harryhausen’s stop-motion skeletons, to my own activities of re-creating film characters and spaceships in Lego, and playing computer games.”

Ray Harryhausen applies glycerine to skin of the Kraken puppet used in “Clash of the Titans” to make it appear wet.

Digital Delights

Ask any group of VFX professionals which film inspired them to get into the business, and a good percentage of them will come up with a movie from the 1990s – that fast-moving decade during which the digital revolution was beginning to sweep through the entire visual effects industry.

“Jurassic Park just blew my mind!” stated Joan Panis, head of FX. “The Tyrannosaurus Rex chase scene was incredible. I re-watched the movie recently, and although it wouldn’t get a PG-13 nowadays, the CG still holds up pretty well. Kudos to ILM for that. When I discovered that most of the dinosaurs in the movie were computer-generated, my interest in CG grew exponentially and I started becoming obsessed with VFX.”

Dinosaurs were also responsible for chasing Rob Pieke, software lead, into a career in VFX. “I was 14 years old when Jurassic Park came out. I didn’t even know how to appreciate what I was seeing, but it triggered inside me a strong sense of ‘I know what I want to do when I grow up’ – especially since I was crazy about dinosaurs as a kid.”

A turning point for Ferran Domenech, animation supervisor, was the moment when, as a teenager, he left the cinema after seeing Jurassic Park. “I clearly remember telling my father that they had almost made you believe the dinosaurs were real,” he recalled. “I researched Jurassic Park in specialist magazines, and learned that they’d changed from a computer-controlled miniature system called go-motion to fully-rendered CG for the wide shots of the dinosaurs. I was blown away by what could be achieved with computers. This truly sparked my passion for 3D and VFX.”

Sophie Marfleet, lead envirocam artist and compositor, found the 1990s just as inspiring as her colleagues. “I’ve been obsessed with movies since I watched Star Wars and Indiana Jones as a kid,” she commented, “but I was inspired to work in film by watching movies like Terminator 2: Judgment Day, Jurassic Park and Fight Club. The visuals blew me away, and I knew I wanted to be a part of that.”

The developing potential of computer graphics continued to inspire wannabe VFX professionals right up to the end of the decade. Reminiscing about a certain mind-bending classic from 1999, Damien Fagnou, global head of VFX operations, said, “Since I was ten years old, I’ve been fascinated with computers and their ability to produce graphics of all kinds. My path was set in that direction when The Matrix came out. That film was really the trigger that made me think, ‘This is what I want to do in life: contribute to making amazing movie experiences like the one I’ve just experienced.’”

But on the subject of the digital revolution, it was Marco Carboni, crowd supervisor, who picked out what many experts regard as the watershed moment when computer-generated characters truly came of age. “I had a blast when, as a kid, I saw the stained-glass knight in Young Sherlock Holmes,” he commented.

Stained glass knight - "Young Sherlock Holmes". Image copyright © by Industrial Light & Magic. All rights reserved.

ILM modelshop crew member Jeff Mann was photographed in costume against a grid to provide reference footage for the computer animation. Standing on the right are Pixar artistic supervisor John Lasseter and visual effects supervisor Dennis Muren. A clay and glass maquette of the knight was digitised using Pixar’s Polhemus three-space digitiser, with the resulting geometry rendered in vector form on an Evans and Sutherland monitor. Image copyright © by Industrial Light & Magic. All rights reserved.

Believing Anything Can Fly

Of course, there are plenty of people at MPC whose memories stretch back further than the 1990s. Take Tony Micilotta, R&D lead, who remembers a time when superheroes had to save the world without the help of digital doubles. “Superman (1978) really made me believe that a man could fly!” Micilotta remarked. “As I grew older, it was pioneering technologies such as the Zoptic front-projection system used in Superman that inspired me to join the VFX industry, where I could develop new techniques to create imagery that had never been seen before.”

Meanwhile, Scott Eade, head of layout, has fond memories of the 1980s: “As a kid I grew up watching imagination-building films like Blade Runner, E.T.: The Extra-Terrestrial, Tron, Ghostbusters and Star Wars. So I’ve always been drawn towards the magic in visual effects.”

But for Matt Packham, 2D supervisor, one movie stands head and shoulders above the rest: “Which film inspired me to get into VFX? Simple – Stanley Kubrick’s 2001: A Space Odyssey. Seeing this for the first time in the late 1980s was a sublime experience. And when I learned what it took to make a movie like this in 1968, it continued to amaze me!”

Starchild - "2001: A Space Odyssey"

Inspired by a series of intra-uterine photographs, the Starchild seen in “2001: A Space Odyssey” was sculpted by Liz Moore and mechanised so its eyes could move. The Starchild was filmed through multiple layers of gauze, with immense levels of backlight. As a finishing touch, visual effects supervisor Douglas Trumbull airbrushed an enveloping cocoon on to a piece of glossy black paper, which was aligned to the model and filmed on an animation stand.

Realising the Dream

Inspiration is all very well, but how did the staff at MPC develop their youthful enthusiasm into actual careers? For Richard Stammers, the practical exploration of VFX techniques started early. “My school art teacher saw my enthusiasm and lent me his Bolex camera to film my first stop-motion project,” he revealed. “I used the articulated joints of my camera flash to push my camera, caterpillar-style, on to my prone tripod, which then stood up, walked and bowed to camera. I was hooked! This enthusiasm fuelled education decisions and career aspirations.”

For Paul Chung, it was a passion for art that ultimately drew him into the VFX business. “I grew up drawing a lot,” he remembered, “but I was also into filmmaking, so I ended up at film school in London. After that, I got into hand-drawn animation, combining my two interests together. Some 20 years later, I went to Dreamworks, and that was the beginning of my life in digital.”

PCjr photograph by Rik Myslewski, via Wikimedia Commons (own work – CC0)

Visual effects relies as much on science as it does on art, as demonstrated by Rob Pieke. “My entry into the industry was really fostered by two influences,” Pieke reflected. “The first was my parents, who both worked for IBM and taught me how to write computer graphics programs in BASIC on the PCjr in the 1980s. I originally aspired to be an animator, but programming and problem-solving is clearly in my genes, so an R&D role was the natural fit for me.”

Dan Zelcs also got his start during the early days of home computers. “I think my journey into visual effects really started at age ten,” he commented, “with me programming my Sinclair ZX Spectrum, using BASIC to animate a pixelated version of the Teenage Mutant Ninja Turtles – complete with the cartoon’s theme tune, executed in 8-bit beeps! This formative experience, entering lines of code to produce a living piece of animation, was the ‘2001 monolith’ moment of my career. It made me realise I could use mathematics to make art, and turn my imagination into reality.”

Zelcs went on to recall the dizzying leap from his ZX Spectrum to a computing machine with far more number-crunching power. “Later, I’d play with software like Deluxe Paint III on my Amiga 500,” he explained. “I would animate simple cut-out characters, and render zooming camera moves through Mandelbrot fractal sets. Then I found Sculpt-Animate 4D – free modelling software that came on the cover of a magazine – which enabled me to model and ray-trace simple objects. This experience influenced my choices of classes at school – mixing mathematics and art – and then the degree that I chose.”

While Catherine Mullan, head of animation, acknowledges the art/science debate, she prefers not to take sides. “I was always drawing as a kid,” she remarked. “I loved art and drama but also maths and science. I wanted to pursue a job that was a mix of all these things, but the usual options didn’t fit the bill. When applying for university, I stumbled across a computer animation course and was immediately excited. I spent the next few years discovering the delights of computer graphics, and was especially drawn to animation.”

On even the best-planned career path, however, there’s always room for a little blind chance to play its part, as in the case of Sophie Marfleet: “It was actually a conversation with an old compositor friend called Tim, who I bumped into on a year abroad in New Zealand, that convinced me to take the visual effects route.”

End Results

So, regardless of what brought them into the field of visual effects, are these members of the MPC team happy with the path they’ve chosen? It’s clear that Marco Carboni wouldn’t want to be anywhere else: “I’ve always loved VFX – after seeing the “crossing the sea” scene from Prince of Egypt, I knew that I wanted to be part of that magic world.” And Scott Eade seems happy with the company he’s keeping: “I’ve had the opportunity to work with, and for, people who share the grand vision of making the unreal real. My first experience directly working as a visual effects artist was on James Cameron’s Avatar, and I’ve been lucky enough to continue to work on great films since.”

Catherine Mullan summed up this collective enthusiasm by concluding, “To this day I get a huge kick out of my job and my inspiration is continuously renewed by the amazing work happening around the world.”


Watch MPC’s 2015 film reel:

MPC is currently working on movies including Disney’s The Jungle Book, Batman V Superman: Dawn of Justice, Terminator: Genisys, Spectre, Goosebumps and The Martian. Thanks to all the staff from MPC who contributed to this article.

Special thanks to Jonny Vale.

Orphan Black and Twinning in the Movies

This article was first published in slightly different form on the Cinefex blog on 8 April 2014.

Orphan Black - Season 3

What’s the best visual effect of them all? Which camera trick brings everything together to make a perfect whole – conceptual elegance, technical expertise, editorial sleight of hand, dramatic performance? Which cinematic illusion wins the grand VFX prize? My answer may split opinion.

It’s the twinning effect.

I know. You’re scratching your head in puzzlement. How is creating twins more impressive than blowing up a planet? Does a pair of chatty clones really beat a ninety-foot robot grappling a multi-tentacled mutant from another dimension?

Yes. And yes. Let me tell you why. But first, let me explain what I’m talking about.

By “twinning”, I mean the process whereby a single actor plays two or more roles in the same film. For the performer, it’s a delicious challenge. For the visual effects artist, the challenge comes with the shots where two (or more) incarnations of said actor appear on screen at the same time.

"Orphan Black" clone strangling scene

One of the latest productions to use this time-honoured trick is the TV series Orphan Black, the second season of which begins its run on BBC America later this month. In the show, Tatiana Maslany delivers a remarkable, Golden Globe-nominated performance as a woman who encounters several cloned versions of herself and becomes caught up in a deadly conspiracy. Orphan Black’s visual effects are by Intelligent Creatures; according to visual effects producer Che Spencer, their mandate was “to push the effect and not settle for what was easy.”

Watch an Intelligent Creatures breakdown video of the extended “clone dance party” from the Orphan Black season 2 finale (including a surprise unaired ending):

We’ll hear more from Intelligent Creatures about Orphan Black in a moment, including a breakdown of one of Season One’s most daring multi-clone shots. Before then, let’s take a brief look at the history of the twinning effect.

Old-School Double Acts

A good early example of twinning is the 1944 Bing Crosby musical Here Come The Waves, in which Betty Hutton stars as identical twins Susan and Rosemary Allison. The film uses a fairly standard range of twinning tricks including a body double with her back to the camera, and judiciously-placed split screens.

"Here Come the Waves" trailer image

In many of the shots in Here Come The Waves, it’s easy to spot where the split line is (the binary Betties are generally positioned on opposite sides of the screen, with plenty of empty set gaping between). Some shots – such as the one where both characters leave the stage after a dance number (at 1:45 in the clip below) – make effective use of a moving split, allowing the twins to occupy the same physical space, albeit after a small but convenient time interval.
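The underlying logic of a split screen is simple enough to sketch in a few lines of modern code. The Python/NumPy fragment below is purely illustrative – in 1944 this was all done photochemically with mattes in an optical printer, not digitally – but it shows the idea: each plate contributes the pixels on its side of a split line, and animating that line gives the moving split described above.

# Illustrative modern sketch of a split-screen composite - not how the
# optical departments of 1944 did it. Frames are H x W x 3 float arrays.
import numpy as np

def split_screen(plate_a, plate_b, split_x, feather=1):
    """Pixels left of split_x come from plate_a, pixels to the right from
    plate_b, blended across a soft edge `feather` pixels wide."""
    h, w = plate_a.shape[:2]
    x = np.arange(w, dtype=np.float32)
    # 0.0 well left of the split, 1.0 well right of it, ramping in between.
    matte = np.clip((x - split_x) / max(feather, 1) + 0.5, 0.0, 1.0)
    matte = matte[np.newaxis, :, np.newaxis]   # broadcast to H x W x 1
    return plate_a * (1.0 - matte) + plate_b * matte

def moving_split(frames_a, frames_b, start_x, end_x):
    """A 'moving split': slide the join a little each frame so the two
    performances can trade screen space without ever overlapping."""
    n = max(len(frames_a) - 1, 1)
    for i, (a, b) in enumerate(zip(frames_a, frames_b)):
        yield split_screen(a, b, start_x + (end_x - start_x) * i / n)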

For some journalists of the time, such trick photography was akin to witchcraft, as evidenced in a contemporary article from May 29, 1944 by Frederick C. Othman of Associated Press – here’s an extract:

This piece is going to be complicated; it involves two Betty Huttons and how can anybody expect you to understand what’s going on, when the writer doesn’t exactly understand himself? … The boys are making with the double talk about split screens and synchronous recordings. … If one Miss Hutton is a squillionth of an inch off her marks when she gets out of her chair, the other Miss Hutton is a blur. And, of course, vice versa. That’s because of the split screen (says Othman, who has only the vaguest idea of what he’s talking about).

Olivia de Havilland faces herself in "The Dark Mirror"

If Here Come The Waves exemplifies the early frivolous use of twinning techniques, The Dark Mirror, released two years later in 1946, is its shadowy counterpart.

In the film, Olivia de Havilland plays twins Terry and Ruth Collins, both suspected of murder and both possessing an alibi for the night the crime was committed. While this psychological melodrama uses similar techniques to Here Come The Waves, director Robert Siodmak exploits its darker themes with shots like the one at 1:20 in the clip below, in which moody lighting is used to conceal the use of the ever-reliable body double.

Before we run forward in time, let’s quickly wind the clock even further back to 1937 and take a look at The Prisoner of Zenda, in which Ronald Colman plays both the king of Ruritania and his English lookalike.

The Prisoner of Zenda contains an early example of twins not only appearing side by side, but also physically interacting, in a shot where the two Ronald Colmans shake hands. This quote from David O. Selznick’s Hollywood by Ronald Haver* makes the intricate matte work used to pull off the shot sound deceptively straightforward:

The camera shot through a plate of sheet glass that had been taped to cover the area of the double’s head and shoulders. After exposing the action, the film was rewound in the camera, the plate glass was retaped to cover everything except the area of the double’s head and shoulders, and Colman changed costumes and stood in. Colman’s head and shoulders were then photographed in perfect register with the double’s body.

Attack of the Clones

Throughout the 20th century, there was a regular flow of twinning films, most of which relied on these familiar visual effects techniques – perhaps most famously when a young Hayley Mills played identical twins in The Parent Trap (1961). Then, in 1988, came a matched pair of twinning films that upped the ante and doubled the stakes.

The first was Big Business, which starred Bette Midler and Lily Tomlin as two sets of identical twins. The second was David Cronenberg’s Dead Ringers, in which Jeremy Irons played twin gynaecologists Elliot and Beverly Mantle. Both films made a bold leap by using motion control to introduce camera moves into their split-screen shots.

Jeremy Irons doubles up in "Dead Ringers" (1988)

Luckily for us, when dissecting the revolutionary visual effects of Dead Ringers in Cinefex #36, Don Shay demonstrated a little more understanding of the twinning process than Here Come The Waves reporter Fred Othman did back in 1944:

The most difficult of the motion control setups was a reverse tracking shot of the twins walking towards camera. To compensate for normal arm and body sway, Film Effects of Toronto had to develop matting sequences that constantly shifted the split from side to side. And since diffused splits of varying widths were required – depending on background light levels – different splits were dissolved in and out as the scene progressed. From start to finish, the shot required four separate split-screen mattes – each with an average of four dissolves.
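In modern compositing terms, what Shay is describing is a soft-edged split whose position and width change every frame, with several such splits cross-dissolved over the course of the shot. Building on the split_screen() sketch above – and remembering that the original work was done photochemically, not digitally – the idea reduces to something like this:

# Extending the earlier split_screen() sketch: the split position and the
# feather ("diffusion") width are keyframed per frame, approximating the
# constantly shifting, variable-width splits described above. Illustrative only.
def animated_split(frames_a, frames_b, split_positions, feather_widths):
    for a, b, split_x, feather in zip(frames_a, frames_b,
                                      split_positions, feather_widths):
        yield split_screen(a, b, split_x, feather=feather)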

Once the Pandora’s Box of motion control twinning effects had been opened, there was no going back. From Back to the Future Part II through Multiplicity to Adaptation and beyond, filmmakers have experimented with ever-more elaborate ways of duplicating the talent. In The Social Network, Lola raised the bar higher than ever when they created the Winklevoss twins by mapping Armie Hammer’s face on to that of fellow actor Josh Pence. Read all about how they did it in this excellent article at FXGuide.

These recent refinements mean filmmakers can now do proper justice to that staple of science fiction: the clone story. In The City of Lost Children, Pitof/Duboi presented us with more copies of Dominique Pinon than we knew what to do with. More recently, Moon pitted Sam Rockwell against, er, Sam Rockwell, in a stunning variety of clone scenes that showcased not only Rockwell’s acting chops, but Cinesite’s invisible digital effects.

In planning Moon’s judiciously used clone shots, director Duncan Jones studied both Dead Ringers and Spike Jonze’s Adaptation. “[Spike] told me that when you’re working through scenes, you need to choose which character really leads the scene, and shoot that one first,” Jones remarked in Estelle Shay’s article Moon Madness (Cinefex #118).

Sam Rockwell checks his counterpart's temperature in "Moon"

The same article has the following to say about the above Cinesite split-screen shot in which “Sam 1” feels the forehead of “Sam 2”:

In the hero pass, Rockwell as the ill Sam 1 performed to a stand-in serving as Sam 2, with a C-stand used to record the position of the double’s left shoulder. In the second pass, Rockwell performed as Sam 2, aligning his shoulder with the marker and using his body to occlude that of the double. A third pass allowed for the removal of extra lighting, cameras and floor markers, and the shadow cast by the C-stand and opposing action. In post, Cinesite attached the double’s arm to Rockwell’s Sam 2 through careful rotoscoping and warping of clothing.

Orphan Black

All this talk of clones brings us neatly back to Orphan Black and Intelligent Creatures. Early on, the show’s producers told visual effects supervisor Geoff Scott that the budget wouldn’t allow for motion-controlled camera moves, prompting Scott to explore other ways of taking away the curse of the locked-off twinning shot. In the end, however, motion control won the day, as described here by the Intelligent Creatures team:

Before production began we considered many different techniques, from simple handheld camera moves to repeatable slider rigs, but ultimately it came down to a full motion control system. In fact, we shot the scene from the pilot where Sarah meets Katja on two different motion control rigs before settling on what became the go-to rig for the series – the Super TechnoDolly. The first of its generation, the TechnoDolly is a robotic camera system, essentially a smart Technocrane. It allowed us to create movements of unlimited length and complexity, and more importantly, repeat those moves with incredible precision. We shot the entire scene with Tatiana playing Sarah alongside a stand-in actor to work out blocking and eyelines. Then we repeated the scene with Tatiana alone following carefully placed eyeline markers. Finally, Tatiana changed over to Katja and we did the whole thing over again. The passes were later combined in compositing using Digital Fusion to create the seamless effect.

The TechnoDolly proved adaptable enough to give the director flexibility on set and – crucially in a show where ADR needed to be kept to an absolute minimum – it was near-silent in operation.

Orphan Black uses every trick in the twinning book to help create Maslany’s various clone characters, from old-school over-the-shoulder shots to complex composites involving moving cameras and selected body parts from one actor stitched on to those of another.

With each episode the challenges grew. The one main request was that, once an episode, the clones would touch. Sometimes we had as many as three clones in the room all interacting with each other, delivering dialogue and making eye contact. We used the Super TechnoDolly for these really complex movements in order to maintain image integrity and repeatability. In the penultimate episode, we had one clone pour wine for two others, and another hug one in a deep embrace. As the episode continued we saw clones strangling each other, head-butting, and eventually one shooting another. In a single episode we had a season’s worth of visual effects.

The Intelligent Creatures team is adamant that the general lack of attention drawn to their work on Orphan Black is in fact a great compliment:

The truest testament to our skill is how little the audience notices it. If people can immerse themselves within the plot enough to forget that this shot was done with VFX, then our jobs are done. We used visual effects to help do what the show’s creators intended to do: tell a story. The rest might as well be magic.

Watch the Intelligent Creatures sizzle reel for their work on Orphan Black:

Two Are One

There’s one twinning technique I haven’t discussed here. That’s because it puts visual effects artists out of work. I’m talking about those rare occasions when the director needs to double up the lead actor … and that actor just happens to have a real twin.

The example that springs into my mind (and probably into the minds of most regular Cinefex readers) is Terminator 2: Judgment Day, in which the shape-shifting T-1000 makes a last-ditch attempt to fool John Connor by mimicking his mother’s physical form. Director James Cameron placed the two Sarah Connors on screen simultaneously not with visual effects, but by drafting in actress Linda Hamilton’s twin sister Leslie. (Cameron used the same trick with twins Don and Dan Stanton, who played Lewis the Guard and his deadly doppelganger respectively.)

Audiences who don’t realise that real twins were involved undoubtedly assume they’re seeing a camera trick, which only underlines just how tough it is for any visual effects artist to take on the twinning challenge. Why is it so hard? Because the audience knows.

They know the famous actor they’re seeing doesn’t have a twin. They know it’s a trick. When presented with a twinning effect, the average Joe Schmoe in the second row will put down his popcorn, sit forward in his seat and try his damnedest to spot the join, even if ordinarily he has no interest in VFX whatsoever. Nowhere are the creators of visual effects placed under greater scrutiny than when they’re giving birth to twins.

And that’s why, of all the illusions a filmmaker might choose to put on screen, the twinning effect is undoubtedly in the running for my all-time number one.

Season 3 of Orphan Black is currently airing on BBC America:

*Published by Bonanza Books, 1987; quote sourced via The Ronald Colman Appreciation Society.

Moon image copyright © 2009 Lunar Industries/Sony Pictures. Orphan Black images copyright © 2014 Intelligent Creatures/Temple Street Productions.