“X-Men: Days of Future Past” – Cinefex 139 Extract

We’re wrapping up our week of previews of Cinefex issue 139 with Mutant Destiny, Joe Fordham’s 21-page article about the visual effects of X-Men: Days of Future Past, in which Wolverine (Hugh Jackman) plunges back through time to save the world of the future, with the help of earlier incarnations of his familiar mutant allies.

In this extract, Digital Domain visual effects supervisor Lou Pecora explains how his team developed an organic approach to the skin-shifting abilities of the mutant Mystique (Jennifer Lawrence), avoiding visible wipes or dissolves.

“We thought about what would happen if a creature like this existed,” said Pecora. “We hit on the idea of how a magician flips a line of playing cards — he spreads them out in a line, flips the first one, and others follow suit, revealing patterns on their reverse. Each Mystique transformation began with the texture of the first actor in the plate projected onto geometry of Mystique’s ‘feathers,’ with blue skin textures on the reverse. When the feathers flipped, revealing the blue, we dissolved to Jennifer’s human texture on the hidden side, which is then revealed when the feathers flipped again. That gave us three sides to a two-sided card, with blue in between, and disguised the dissolve when the feathers were oriented away from camera.”

Digital Domain lead rigging developer David Corral created the Mystique transformation technique – dubbed the ‘potato chip rig’ – in Autodesk Maya, and CG supervisor Hanzhi Tang and animation lead Jack Kasprzak drove the feather-flipping motions by moving Boolean shapes across the model surface. Feathers were scalable, which allowed for more detail in closeups. Compositing supervisor Michael Maloney then integrated blends between feather flips.
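To picture how such a rig behaves, here is a deliberately simplified sketch in Python. It is illustrative only, not Digital Domain’s Maya setup: a trigger front sweeps along a row of feathers, each feather turns as the front passes, and the texture facing camera follows Pecora’s ‘three-sided card’ logic. The flip width and texture names are invented for the example.

```python
# Purely illustrative sketch of the card-flip idea described above (not
# Digital Domain's Maya rig). A trigger front sweeps along a row of
# feathers; each feather turns as the front passes, and the texture shown
# to camera follows the 'three-sided card' trick.

FLIP_WIDTH = 2.5   # assumed travel distance over which one 180-degree flip completes

def flip_angle(feather_x, trigger_x):
    """Rotation in degrees of a feather, given how far the trigger front has passed it."""
    travel = trigger_x - feather_x
    if travel <= 0.0:
        return 0.0                                  # the front hasn't reached this feather yet
    return min(travel / FLIP_WIDTH, 2.0) * 180.0    # cap at two half-turns (360 degrees)

def visible_texture(angle):
    """Texture on the camera-facing side at a given flip angle."""
    if angle < 90.0:
        return "source_actor_plate"    # front face: texture projected from the plate
    if angle < 270.0:
        return "mystique_blue_skin"    # reverse face points at camera mid-flip
    return "target_actor_texture"      # dissolved in while hidden, revealed by the second half-turn

if __name__ == "__main__":
    for feather_x in range(5):
        angle = flip_angle(feather_x, trigger_x=4.0)
        print(feather_x, round(angle), visible_texture(angle))
```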

Read the complete article in Cinefex 139, which also features Edge of Tomorrow, Dawn of the Planet of the Apes and Guardians of the Galaxy.

All content copyright © 2014 Cinefex LLC. All rights reserved.

“Dawn of Apes” – Cinefex 139 Extract

Our third taster from Cinefex 139 is Ape Apocalypse, Joe Fordham’s 22-page article on the visual effects of Dawn of the Planet of the Apes, the latest film in the rebooted franchise, which delivers not only spectacle but also high emotion, thanks to astonishing ape performances created by a cutting-edge blend of motion capture and animation.

In this extract, motion capture supervisor Dejan Momcilovic details the complexities of capturing multiple actors’ performances on location.

“We were often capturing performances in rain,” said Momcilovic. “We used underwater camera bags to protect every piece of gear, and applied hot packs. When we went to New Orleans, it was so hot and humid we had to keep the gear cool with ice packs.”

The compact nature of Standard Deviation’s camera housings allowed Weta to dress motion capture cameras discreetly into sets, using both wireless and cabled systems, depending on the terrain. Each ape scene required an average of 20 motion capture cameras positioned around groups of eight ape performers. To assist image tracking, Weta affixed additional LEDs to trees and sticks driven into the ground, and triangulated the terrain geometry to help integrate animated characters.

In addition to motion capture cameras, Weta surrounded ape scenes with Sony F3 witness cameras, which served as calibration devices. “We used a calibration wand with visible light and infrared markers that could be seen by the F3s, the motion picture camera, and our mocap cameras,” Momcilovic explained. “Synchronizing all those systems in one go greatly reduced our setup times on set.” The F3s gave Weta the option to optically generate motion data by tracking checkerboard markers on performers’ bodies.

Read the complete article in Cinefex 139, which also features Edge of Tomorrow, Guardians of the Galaxy and X-Men: Days of Future Past.

All content copyright © 2014 Cinefex LLC. All rights reserved.

“Edge of Tomorrow” – Cinefex 139 Extract

All this week, we’re serving up appetisers from issue 139 of Cinefex magazine. Today we turn to The Longest Day, Jody Duncan’s 21-page article on the making of Edge of Tomorrow, in which Major William Cage (Tom Cruise) is trapped in a seemingly endless cycle of life and death as he tries to unravel the mystery behind a devastating alien invasion.

In this extract, production visual effects supervisor Nick Davis discusses the futuristic exo-suits worn by Cage and his battle-weary comrades.

Early on, the filmmakers had considered putting performers in minimal armor pieces, and then tracking the majority of the suit as a digital construct, but they soon dismissed the idea.

“We knew that the suits would be able to do a lot more if they were mostly CG,” said Davis, “but both Tom Cruise and Doug Liman wanted to do as much as they could practically. Tom was absolutely adamant that he be in the suit, because he felt that it would help him as an actor.”

Suit modeler Pierre Bohanna built the practical exo-suits from art department designs, and the actors then trained in them for several months. “The actors worked really hard to learn to perform in these very heavy suits,” said Davis. “Pierre and his team made them as lightweight as possible, but they still weighed many, many pounds, and it was very hard work for the actors to wear them.”

The sheer weight of the suit meant that some digital parts had to be tracked to Emily Blunt, whose smaller frame prevented her from carrying the full load for extended periods of time. The only digital features tracked to all of the suits were the ‘angel wing’ guns mounted on the back of the arms. “They built practical ones for us to scan and photograph, but they never got used in the movie. They were always digital.”

Read the complete article in Cinefex 139, which also features X-Men: Days of Future Past, Dawn of the Planet of the Apes and Guardians of the Galaxy.

All content copyright © 2014 Cinefex LLC. All rights reserved.

“Guardians of the Galaxy” – Cinefex 139 Extract

Issue 139 of Cinefex magazine contains in-depth articles on the visual effects of four of this summer’s biggest movies. Through this week, we’re giving you daily tasters from each article in this, our latest issue.

First up is Guardians of the Galaxy, the smash hit space movie starring the funkiest fellowship of interstellar misfits ever to take to the heavens. In this extract from The Rocket Files, Jody Duncan’s 21-page article on the film’s VFX, Framestore visual effects supervisor Jonathan Fawkner talks about everybody’s favourite genetically-enhanced raccoon.

Framestore animators based Rocket animation on Sean Gunn’s on-set performance, as well as on Bradley Cooper’s vocal performance, captured with a six-camera system at a recording studio in London. “We were able to film Bradley Cooper as he performed Rocket’s lines,” said Fawkner. “Using the six-camera system was great for us because we had all the angles covered. We typically used that for our first pass at the animation, and then we’d show it to James and Stephane, with reference of Bradley doing the voice in the corner of the screen.”

Only in the final stages of animation did the animators incorporate raccoon behaviors into Rocket’s performance. “James wanted an actor more than he wanted a raccoon,” said Fawkner. “We had done some early animation studies with more raccoon behaviors, but Marvel and James never responded well to those tests. Later on, though, we started introducing the odd nose twitch and ear scratch. The one character trait James did emphasize was that he wanted Rocket to be very nimble with his hands. He often said, ‘If Rocket has a super-power, it is his super hands.’ And so we found moments in our sequences to highlight Rocket’s hand movements.”

Read the complete article in Cinefex 139, which also features Edge of Tomorrow, Dawn of the Planet of the Apes and X-Men: Days of Future Past.

All content copyright © 2014 Cinefex LLC. All rights reserved.

“The Maze Runner” – Visual Effects

In recent years, filmmakers have found fertile ground – and enjoyed impressive box office returns – in adapting young adult novels. First came Harry Potter, then Twilight and The Hunger Games. Now the latest fantasy adventure has hit the screens in the form of The Maze Runner.

Adapted from the best-selling novel by James Dashner and directed by Wes Ball, The Maze Runner begins when a boy called Thomas awakes mysteriously in a glade set at the heart of a giant labyrinth. Thomas has no memory of how he got there, but by exploring the maze along with the other people trapped there, he gradually pieces together the puzzle of his past … and begins to believe in the possibility of escape.

Method Studios was the major visual effects vendor for The Maze Runner, delivering around 530 shots for the film. VFX supervisor duties were shared by Sue Rowe and Eric Brevig. Rowe led the studio team, visiting the set in Baton Rouge for crucial VFX sequence photography, while Brevig remained on set for the duration of the shoot.

Watch a montage of clips from The Maze Runner:

“My daughter had read the book and loved it,” said Sue Rowe, recalling the early days of the project. “When we heard there was talk of a film, the Method Studios team hopped on a plane and met with Joe Conmy at Fox. Later we met Wes Ball, the director. Within minutes of our meeting him, he was describing the first scene from the movie – he was so full of energy, describing the elevator rising up, using sound effects and jumping around the room. I thought, ‘If he can do that on our first meeting, then wow! Dailies are going to be fun!’”

“Eric and I have worked in a similar way in the past,” said Rowe. “We have a great mutual respect and a shorthand that makes the split easy. Eric has a wealth of knowledge on set, having supervised many shows and directed his own feature. I come from an animation background. I’ve done a number of creature shows in the past, most notably The Golden Compass.”

Having been awarded the work, Rowe and Brevig grew the team by bringing in lead animator Eric de Boer – a Life of Pi veteran – and creature supervisor James Jacobs, formerly of Weta Digital. “With a kicking team like that,” said Rowe, “you know you have a great show on your hands!”

The Grievers

One of the key challenges for Rowe and Brevig was bringing to life the Grievers: menacing, part-mechanical creatures that emerge at night to patrol the labyrinth.

“We had some very cool concept designs from Wes illustrating how the Griever should look,” said Rowe. “We started developing it by looking at macro photos of bed bugs and fleas – I always find that whatever crazy creature you want to create, nature has already done it for you somewhere. We studied how ants move, and looked at slugs for the organic soft parts of the body. We used whatever grossed me out for the textures, like a slug’s skin with warts from a frog. An image that struck a chord with us was one of crabs crawling over each other in the sea; we kept this in mind for the finale when multiple Grievers attack the Gladers.”

Real-world research informed not only the way the Grievers looked, but also how they moved.

“The Griever concept had metal legs – like a purpose-built freak of nature – and we wanted to make these move in an unusual way,” remarked Rowe. “While we were researching, there was some major construction going on outside our office window. One morning, Eric and I laughed that the pneumatic crane and drills outside looked like the Griever’s legs. We realised we were on to something, so we took video reference and introduced percussive drills and hammers into the design. We liked the idea that the Griever’s metal legs could hyper-extend and contract like a crane arm. It added physical truth to the model, and allowed the character to reduce or extend his leg dimensions for added dramatic effect.”

Design development took the Griever concept through a number of stages. “At one point the Griever had red flickering eyes and a glowing chest cavity,” Rowe recalled. “These looked great, but with movement and motion blur they lost the demonic feel and began to look like a Halloween lantern. We dropped the glow and instead designed extra folds and cracks in his reptilian upper jaw. We also created an empty eye cavity with a demented scowl. The last iteration added nasty, compacted teeth replicated multiple times inside the mouth.”

Paulo Welter and Kyeyong Peck created the final Griever models using ZBrush, Mudbox and Mari. The creatures were rendered in V-Ray, which proved efficient at handling the many sub-dermal surfaces.

“We started the Griever build in May and had a working model nine weeks later, because Fox wanted to showcase the Griever on the last day of the shoot as a milestone,” said Rowe. “We showed it to the actors on set, and they were stoked to see the character come to life. Especially Dylan O’Brien – he spent much of the movie running from a guy in a blue suit, so he loved to see our works in progress.”

Watch Method Studios’ visual effects breakdown reel from The Maze Runner:

Digital Gardening

The Maze itself comprises a complex labyrinth built from 100-foot walls. Sets were constructed to include the first 16 feet, with Method filling in the rest with a range of digital extensions.

“Wes set the visual tone of the story, showing how important the lighting was on the walls,” said Rowe. “The Maze doors opened early in the morning and at night, so we worked on dawn and dusk colour palettes. My CG supervisor Andrew McPhillips and lead lighter Larry Weiss created long shafts of raking light across the walls. Not only did this look stunning, but it also added to the feeling of confinement as the kids are imprisoned each night. Building walls sounds simple, but creating photo-real scale and texture takes a good eye. I was really happy with the end result.”

Once the walls were built, they had to be covered with acres of ivy. “In Houdini, we created a custom procedural growth system for the thousands of tiny ivy leaves, which allowed the artists to draw on the walls where we wanted the ivy to grow. To make it look organic, it needed to wrap around the walls and find crevices to attach itself to. For this we wrote a script which could determine how many seeds the ivy branches would split into, and how thick the stems would be. Lastly, we added a really neat variable which allowed us to determine the direction of the sun, adding a random factor to the directions in which the leaves grew. Koen Vroeijenstijn, Harsh Mistry and Kuba Roth were fearsome digital gardeners!”
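To make those rules concrete, here is a deliberately simplified, standalone sketch. It is illustrative only, not Method’s Houdini system: a painted mask decides where ivy may grow, each branch splits into a random number of thinner children, and growth direction is pulled toward an assumed sun direction with a random factor. The mask shape, split counts and bias values are invented for the example.

```python
import random

# Standalone sketch of the ivy-growth rules described above (illustrative
# only). A painted mask says where ivy may grow, each branch splits into a
# random number of thinner children, and growth direction is biased toward
# the sun with an added random factor.

random.seed(7)              # repeatable demo
SUN_DIR = (0.4, 1.0)        # assumed 2D sun direction across the wall
SUN_BIAS = 0.35             # how strongly growth leans toward the sun
MAX_GENERATIONS = 4

def grow_branch(pos, direction, thickness, generation, mask, segments):
    """Recursively grow one ivy branch across a wall described by 'mask'."""
    if generation > MAX_GENERATIONS or not mask(pos):
        return
    # Blend the current direction toward the sun, then jitter it.
    dx = (1 - SUN_BIAS) * direction[0] + SUN_BIAS * SUN_DIR[0] + random.uniform(-0.3, 0.3)
    dy = (1 - SUN_BIAS) * direction[1] + SUN_BIAS * SUN_DIR[1] + random.uniform(-0.3, 0.3)
    end = (pos[0] + dx, pos[1] + dy)
    segments.append((pos, end, thickness))
    # Each branch splits into a variable number of thinner children.
    for _ in range(random.randint(1, 3)):
        grow_branch(end, (dx, dy), thickness * 0.7, generation + 1, mask, segments)

def painted_mask(pos):
    """Stand-in for the artist-painted growth map: ivy is allowed on the
    lower-left portion of a 10 x 10 wall section."""
    x, y = pos
    return 0.0 <= x <= 10.0 and 0.0 <= y <= 10.0 and y < 10.0 - x * 0.5

if __name__ == "__main__":
    segments = []
    for _ in range(5):      # artist-placed seed points along the wall base
        grow_branch((random.uniform(0.0, 4.0), 0.0), (0.0, 1.0), 0.2, 0, painted_mask, segments)
    print(f"grew {len(segments)} ivy segments")
```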

Griever Attack

Towards the end of the film, the maze runners confront the Grievers in their lair, in an attempt to escape the prison. Like the rest of the interior scenes, the sequence was shot in a Sam’s Warehouse, on a partial stage floor with a 360° bluescreen backing.

“The ceiling was only 24 feet high, so it was hard to fly in blues,” commented Rowe. “We had stunt guys dressed in blue suits, with poles to represent where the Grievers’ metal legs would be. The shoot was full of crazy energy. At one point Wes Ball jumped in there and was performing as the Griever, thrashing around with these huge blue legs. The actors really threw themselves into the fight.”

The Griever attack was shot on a blue screen stage, with environments and creatures added digitally by Method Studios.

Rowe’s team replaced the bluescreen with the cathedral-like environment of the Grievers’ lair, and the blue-suited performers with Griever animation.

“The ceiling was cathedral-high, with light bleeding in from overhead vents,” said Rowe. “We added dripping water and wall decay, steam coming out of the ventilation shafts … all the tricks in the book to keep the environment looking edgy and real. We removed the men in blue suits and choreographed a wicked fight sequence out of a tight cut. For the first half of the movie, you only see glimpses of the Grievers – but by the end they are out in the open fighting the kids, so there was nowhere to hide. We were able to enhance the cut by making a number of shots full 3D – like the shot where a Griever is hit in the face by Thomas, where you see a full mouth interior with multiple teeth. It wasn’t something we’d planned for, but it was so cool we did it anyway.”

Destroying the Maze

In a spectacular destruction sequence, the vertiginous walls of the Maze collapse around the fleeing characters. “The destruction was a big data challenge but visually a great success,” Rowe asserted. “Wes had done some of his own previz using Modo. The basic concept of ‘Maze Rearrange’ is that large concrete slabs were rigged to move when the Maze was originally designed. In the intervening years, they have accumulated layers of soil and dust, and the concrete itself has begun to degrade. When they finally begin their predetermined sequence of movements, they begin to crack and disintegrate.

“Lisa Nolan and Niall Finn made an awesome destruction team. Our weapon of choice was Houdini. Our Maya model supervisor Ian Sorensen spent ten weeks building walls, floors and rusted doors, then the Houdini team spent the next three months destroying them!”

Aerial view of the extensive digital Maze environment.

As the film progressed, the destruction sequence grew bigger and more elaborate. “At one point Thomas and Minho are running for their lives with a wave of concrete destruction just a few feet behind them. We tested for this with Houdini’s DOPs integration of the Bullet RBD solver, but found that for the kind of densely-packed fragments we needed for close-up destruction work, we couldn’t keep the sim stable.

“For that reason Niall switched to Houdini’s own, slower, higher quality RBD solver. For this application, we felt this produced consistently better-looking results. Layered on top was the finer rigid body debris, point particle sand and soil, and multiple layers of volumetric dust, generated with our in-house Studio Pyro toolset. We exported geometry with Alembic so we could render in V-Ray, but the dust we kept in Mantra.”

To assemble the complex shots, the Method team used deep compositing throughout. It was particularly valuable in the destruction sequence, which features characters running through clouds of falling dust and debris.

“Deep compositing put a strain on our data management, for sure,” Rowe admitted. “I’d like to research further, as we reached the limits when it came to disc space and render power. Arek Komorowski and Abel Milanes were our 2D supervisors, leading a team of 30 Nuke artists – very strong visual and technical guys. Combining the talents of the digital matte painting team into our Nuke pipeline allowed us to complete some great 2½D work in comp rather than full 3D build. Rasoul Shafeazadeh was key in some of the early concept work for this. Often I would have his team do ‘love stills’ or paint-overs to indicate to Wes where we were taking the shots. This visual shorthand saved us many wasted days.”

Why Did It Have To Be Snakes?

One unexpected issue with the Baton Rouge location was the preponderance of hostile wildlife. “We had a snake handler on location to clear all dangerous snakes and creepy crawlies away from the set,” said Rowe. “We all bought snake-proof boots. On the last day of the shoot, I put a plastic snake in the briefcase of my producer, Scott Puckett. He didn’t find it until he was back in his hotel. He freaked! He’ll get me back some day, but it was worth it!”

The ubiquitous clouds of insects did, however, provide some creative inspiration. “We saw flying bugs everywhere on location, so we added digital bugs to the final shots to add movement.”

Reflecting on the demands of the work, Rowe concluded, “Our on-set data and production team were awesome, and the whole team at Method never lost the fighting spirit to make every shot the best we could. Our team of artists may have had to work to a challenging budget, but we never let that appear on the screen. I hear Fox are very happy with the result, and I’m looking forward to the next collaboration with them!”

Thanks to Rita Cahill and Ellen Pasternack. “The Maze Runner” photographs TM & © 2014 Twentieth Century Fox Film Corporation. All Rights Reserved. Not for sale or duplication.

Visions of Mars

NASA's MAVEN probe approaches Mars

Image courtesy of NASA’s Goddard Space Flight Center.

MAVEN has reached Mars!

“Hold up!” I hear you cry. “What the heck is MAVEN?” Well, I’ll tell you. It’s the latest in a long line of spacecraft sent to gather data on the Red Planet. Its full title is the Mars Atmosphere and Volatile EvolutioN mission, and it’s the first of its kind, dedicated as it is to exploring in detail the upper atmosphere of Mars.

But all that’s a bit of a mouthful, so MAVEN it is.

One of the puzzles the MAVEN mission controllers are hoping to solve is the mystery of how the sun may have stripped Mars of its early atmosphere, creating a barren desert out of a world that may once have supported microbial life.

What they’re unlikely to find are the irradiated survivors of a doomed Martian race, a bat-headed spider, an abandoned atmosphere processing plant or a race of green, six-armed warriors.

All of the above have graced our cinema screens over the years, and little wonder. As one of Earth’s closest celestial neighbours, Mars has long fascinated filmmakers …

The Re-making of a Rocketship

One of the earliest movies to explore Mars was Rocketship X-M, in which a botched attempt to fix an engine glitch sends the crew of a moon rocket spinning so far off course that they eventually land on the Red Planet. Released in 1950, the film was shot in black and white, with the Martian sequences tinted red to evoke a suitably otherworldly atmosphere.

Bizarrely, the film was revisited – and to some extent remade – in the late 1970s, when lifelong fan Wade Williams acquired the rights, and set out to shoot new visual effects sequences. His aim? To introduce Rocketship X-M to a new audience hungry for interplanetary thrills to match the recently released Star Wars, Close Encounters of the Third Kind and Star Trek: The Motion Picture.

Williams managed to assemble a team comprising some of the top VFX experts of the day, including Dennis Muren, Bob Burns, Tom Scherman, Robert Skotak and Harry Walton. Between them, they re-created the original spaceship in the form of a two-foot-tall miniature.

The "Rocketship X-M" reshoot visual effects crew.

The “Rocketship X-M” reshoot visual effects crew. Image from Cinemagic Issue 1.

In this extract from David Hutchison’s article Re-making “Rocketship X-M”, published in 1979 in the first issue of Starlog spinoff magazine Cinemagic, Mike Minor describes the process of building and shooting a foreground miniature of the rocket on location at Trona Pinnacles, near Death Valley:

“It took about three hours to complete the miniature. We had just barely enough time to get the takes. It was a constant battle, because as the day went on, the shadows got longer and the colors changed, so there was constant repainting. The 40-mph winds moved the rocket ever so slightly, even with the brace Tom had built. The takes in which the rocket moved, of course, will not be used – it looks like an earthquake had started!”

More Mars

Ever since Rocketship X-M was first released, like a planet trapped in endless orbit Mars has periodically circled back into movie theatres. After Conquest of Space came the dubious thrills of The Angry Red Planet. In 1964, Mars even got a visit from Robinson Crusoe. Later, the planet’s desolate deserts popped up on the small screen, when Rock Hudson starred in a 1980 TV mini-series adapted from Ray Bradbury’s classic The Martian Chronicles.

In 1990, Arnold Schwarzenegger visited an especially lurid Martian landscape in Paul Verhoeven’s Total Recall. A decade later, a slew of less-than-successful Martian movies arrived – and swiftly departed. Among them were Mission to Mars, Red Planet and John Carpenter’s Ghosts of Mars.

Simmering beneath all these was the rumoured adaptation of Kim Stanley Robinson’s definitive science fiction trilogy – Red Mars, Green Mars and Blue Mars – a project originally pursued by James Cameron and now, by all accounts, mired somewhere in Hollywood development hell*.

Barsoom!

In 2012, Disney released John Carter, Andrew Stanton’s adaptation of the classic Edgar Rice Burroughs “Barsoom” books, the first of which was A Princess of Mars. As with previous Martian movies, depicting the Red Planet’s barren landscape was a fundamental requirement.

Just as with the reshoot of Rocketship X-M, a location in North America was chosen for key exteriors. In this extract from Joe Fordham’s in-depth article Under the Moons of Mars, published in Cinefex 129, Stanton describes why Utah was the perfect analogue of Mars for location shooting on Earth:

“There’s something about the northern part of the Grand Canyon going into Utah. You can just tell that the whole landscape was once underwater. That is pretty much the topography of Mars, and that is how it was described from a romantic standpoint in the fantasies of Edgar Rice Burroughs.”

Sue Rowe, who shared visual effects supervisor duties on John Carter with Peter Chiang, overseeing VFX vendors Double Negative, Cinesite, MPC and Nvizible, goes on to describe the peculiarly alien quality of the Utah light:

“Utah was a wonderful resource, with vast plains of red and ochre. And the light there was amazing. Back in the UK, I spent quite a lot of time explaining to my crew how the light was in Utah. Everything was so sharp and bright and highly contrasted, with huge over-exposures but still retaining details.”

Visions of Mars

Now that the MAVEN spacecraft has slipped smoothly into orbit – and with the Curiosity rover still romping across the Red Planet’s rocky terrain – Martian conditions no longer need to be simulated. Instead they can be observed at close quarters. Still, I hope the mission controllers take a moment or two to train their cameras on some of the forgotten corners of that dry, desert realm.

Who knows? Maybe they’ll spot the long shadow of a sleek silver rocketship caressing the side of a remote sand dune. Perhaps they’ll see Arnie tumbling down the mountain slopes of Olympus Mons, with his eyes bulging and his hands clamped to his throat. They might even spy the long trail of the walking city of Zodanga as it marches relentlessly across the dusty plains.

Watch the NASA MAVEN video Targeting Mars:

Whatever new visions MAVEN does bring us, few words can be more appropriate to celebrate its arrival at Mars than those spoken by science fiction author Ray Bradbury in his address The Search for Life in Our Solar System, which he delivered at JPL, Pasadena, California on 8 October 1976:

“Today we have touched Mars. There is life on Mars, and it is us — extensions of our eyes in all directions, extensions of our mind, extensions of our heart and soul have touched Mars today. That’s the message to look for there: ‘We are on Mars. We are the Martians!’”

*Within days of this blog post first appearing, Variety reported that Kim Stanley Robinson’s Mars novels have just been snapped up for development by Spike TV and Vince Gerardis, co-executive producer of HBO’s popular Game of Thrones … Read the Variety report here

“Total Recall” photograph copyright © 1990 by Tri-Star Pictures, Inc. “John Carter” photographs copyright © 2011 by Walt Disney Pictures. John Carter ERB, Inc. All rights reserved.

Now Showing – Cinefex 139

It’s a wrap! Actually, it’s more than a wrap – the new issue of Cinefex is now officially out on general release!

Issue 139 features in-depth articles on the visual effects of Edge of Tomorrow, in which Major William Cage (Tom Cruise) is trapped in a seemingly endless cycle of life and death as he tries to unravel the mystery behind a devastating alien invasion. Then there’s Marvel’s Guardians of the Galaxy, the smash hit space movie starring the funkiest fellowship of interstellar misfits ever to take to the heavens.

Next comes Dawn of the Planet of the Apes, the latest film in the rebooted franchise, which delivers not only spectacle but also high emotion, thanks to astonishing ape performances created by a cutting-edge blend of motion capture and animation. Last but not least is X-Men: Days of Future Past, in which Wolverine (Hugh Jackman) plunges back through time to save the world of the future, with the help of earlier incarnations of his familiar mutant allies.

Here’s Cinefex editor-in-chief Jody Duncan with her thoughts about the movies covered in our latest issue:

Jody Duncan – From The Editor’s Desk

What struck me when putting together Cinefex’s 139th issue is how much I enjoyed all of the films it covers.

Take Edge of Tomorrow. I recall sitting in a small screening room near the Warner Bros. lot to see an early screening, and thinking: “Oh, good! We have a winner here!” And I thought Dawn of the Planet of the Apes and X-Men: Days of Future Past were among the best offerings of those respective franchises.

The biggest surprise, though, was Guardians of the Galaxy. I’m not a comic book gal, generally, nor am I a fan of raccoons, genetically enhanced or not (having often encountered the au naturel type in the middle of the night in my own backyard, where they regularly eat my grapefruit and hiss at my dog). So, when I met Gregg Shay at the Marvel offices on the Disney lot to see an early screening, my expectations were not high.

But, oh, what a time we had! Gregg and I sat in that little screening room, alone except for the editor running the Avid, and we howled! The film had charm, and laughs, and a fresh perspective on the whole “galactic adventurer” genre.

I even loved the raccoon.

Thanks, Jody. Now, please take your seats, set your cell to silent and prepare yourselves for a high definition trip behind the scenes of some of your favourite films of the summer. There’s thrills! There’s spills! Most important of all, there’s the best visual effects coverage you’ll get on this or any planet!

On with the show!

“The Giver” – Visual Effects

"The Giver" - visual effects by Method Studios

Adapted from the 1993 children’s novel by Lois Lowry, The Giver is a coming-of-age story set in a futuristic utopian society known as the Community. Having turned its back on war and suffering, the Community idealises the concept of “Sameness”, a state in which emotion and memory – as well as differences in physical appearance – are suppressed. Elected to the honoured position of “Receiver of Memory”, a boy called Jonas has his mind opened to a past that has until now remained a mystery, thus triggering a journey into enlightenment and, perhaps, escape.

Method Studios was responsible for the majority of the visual effects work on The Giver, delivering around 300 shots. Their contribution included creating the main Community environment and a sequence in which Jonas is pursued by a remote drone, as well as a number of one-off shots.

Method also shared assets with Mr. X Inc. in New York for additional shots as the project progressed. Method’s VFX supervisor Mark Breakspear was on set for most of the production shoot, supporting overall VFX supervisor Robert Grasmere and VFX producer Pablo Molles. The Giver was directed by Philip Noyce.

Watch Method Studios’ VFX breakdown reel from The Giver:

Mark Breakspear Talks The Giver

Thanks for sharing your experiences on The Giver, Mark. Can you begin with how you first got involved with the show?

Method Studios has a long and excellent track record on environment-type visual effects. And we had previously worked with production visual effects supervisor Robert Grasmere on Salt, which was also directed by Philip Noyce.

Robert reached out to Method early on. We talked about the kind of work that would be required and the general approaches that would make most sense, given what we knew about the story at that time. It was early days and we didn’t yet have a production designer – Ed Verreaux joined soon after – but we did have some very early concept drawings for the Community. We focused on how we would create that world, based on the book and how Philip wanted to take the look.

The key concept we had to deal with was that of “Sameness”: the idea that no one person, and no one object, should stand out from the others. People wore similar clothes; homes all looked the same; even the geography of their world was mirrored along a central spine.

We met several times in LA to talk through the major VFX shots: how best to shoot them and, more importantly, how to afford them. VFX producer Pablo Molles joined the show and we began to plan the shoot, deciding how Robert and I would oversee the VFX shots, and how the shots would be turned over so as to be ready for any previews and trailers that might come up.

The futuristic Community settlement sits atop a 10km-wide mesa surrounded by cloud.

How did you go about designing the futuristic environment of the Community?

At the beginning of the project, the initial brief asked for the Community to be located on a 1km-wide mesa surrounded by impenetrable clouds. Ninety percent of the movie was to take place in this location, so its believability was essential. The production would shoot in many different South African locations, so the digital environment needed not only to be photographically convincing, but also able to work alongside multiple locations and blend in without trace.

The art department provided us with an initial layout of the Community, consisting of all the key landmarks and natural elements: trees, grassland, lakes and so on. While shooting was underway, the team built a basic previz model. On viewing this, the client quickly decided that a 1km-diameter mesa was not large enough. We tried several scales, going up to 20km, before finally settling on 10km.

What visual references did you use?

Philip sent us reference of “bits” of places – parks, homes, forests, fields, roads, views – and we used them to develop a bigger picture of what the Community would look like. It was the superlative of many locations all rolled into one, but always based on the “Sameness” concept.

That worked well, but often what makes one corner of a park look really great is its relationship to another section of the park that doesn’t look as good. The yin and yang of a place. When you strip away the yang, the yin just doesn’t have the same pizazz. CG supervisor Anthony Zwartouw and matte painting lead Jeremy Hoey built a pretty amazing system, not only for building the levels of detail in the Community, but also for allowing changes to be made without having to go back to the drawing board.

Also, the Community is artificially flat. This was something we fought over because perfect flatness didn’t look logical at this huge scale. Production was confident that this would play to the “Sameness” feel and, in the end, I believe it worked. But it required a huge amount of detail.

How did you go about building the assets?

The Community environment needed to be full CG, as it would be featured in over 100 shots, ranging from wide shots of the entire mesa to helicopter shots skimming over buildings. The design progressed over several months from a natural, random environment – scattered wooded areas and grasslands with buildings mingled between – to a highly structured, geometric landscape that was almost completely symmetrical.

This posed an interesting challenge, because one of the methods we normally use to create realism is to introduce natural randomness into our work. But the gardens of the Community were all very ordered, with trees gridded and equidistant. So we developed scales of randomness, giving the illusion of rigid symmetry from a distance, but up close revealing the uniqueness of the individual trees and pathways. This allowed the fully practical locations – which were anything but symmetrical – to blend in with our digital Community.
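To picture the ‘scales of randomness’ idea in code: instances sit on a strict grid, but each one gets a positional jitter and per-instance variation kept small relative to the grid spacing, so the layout reads as rigidly symmetrical from a distance while no two trees match up close. The Python below is illustrative only, not Method’s tools; the spacing, jitter and variation ranges are invented for the example.

```python
import random

# Illustrative sketch of 'scales of randomness' (not Method's actual tool):
# trees sit on a strict grid, with jitter and per-instance variation kept
# small relative to the spacing, so the layout reads as symmetrical from a
# distance while every tree is unique up close.

random.seed(42)         # repeatable layout between iterations
GRID_SPACING = 8.0      # assumed metres between trees
JITTER = 0.4            # well below the grid spacing

def community_trees(rows, cols):
    trees = []
    for r in range(rows):
        for c in range(cols):
            trees.append({
                "x": c * GRID_SPACING + random.uniform(-JITTER, JITTER),
                "y": r * GRID_SPACING + random.uniform(-JITTER, JITTER),
                "scale": random.uniform(0.9, 1.1),       # subtle size variation
                "rotation": random.uniform(0.0, 360.0),  # free spin is invisible at distance
                "variant": random.randrange(40),         # one of the 40 tree types built for the show
            })
    return trees

if __name__ == "__main__":
    layout = community_trees(rows=50, cols=50)
    print(len(layout), "tree placements")
```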

How big was the final Community model?

Given the range of shots we had to create, we needed a way of populating the environment with high-resolution assets on a huge scale. This amounted to half a trillion polygons overall. The assets team, led by Kyeyong Peck, created 40 types of trees (around a million polygons each), plants, street furniture, dozens of buildings, people and the landscape itself. The latter comprised cliffs, urban areas around the main structure called the Odeon, manicured parks and the main living areas.

Did that give you a rendering headache?

V-Ray is our main render engine at Method, so we used a plugin called V-Ray-Scatter, and Houdini, to meet our specific layout requirements. Scatter enabled us to multiply an asset thousands of times without hitting the usual memory limit. Scatter also had tools to vary the assets in scale, rotation and colour, so as not to create glaring repetition. The environment lighting team, led by Jon Reynolds, took on a lot of this work on top of their usual lighting duties.

For layout, we developed a system in which our DMP department created a graphical map to define asset placement on the environment. This was then fed into Houdini for processing, and placement geometry would be spat out into Maya, where Scatter would apply the asset geometry at render time. The cloud layer was simulated in Houdini by the FX team, led by Ian Farnsworth. It was set up to encircle the entire mesa. This enabled us to put the camera pretty much where we wanted.
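The map-to-placement step can be pictured as a simple lookup: each cell of the painted layout map names an asset class and a density, and the layout pass emits placement points with per-instance scale, rotation and colour variation for the scatter tool to instance at render time. The Python below is a hypothetical, heavily simplified stand-in for that DMP-to-Houdini-to-Maya pipeline; the map symbols, densities and cell size are all invented for the example.

```python
import random

# Hypothetical simplification of the map-driven layout described above.
# Each character in the painted map selects an asset class; the layout
# step emits placement points with scale, rotation and colour variation
# for a render-time scatter tool to populate with heavy geometry.

random.seed(1)
CELL_SIZE = 10.0   # assumed metres covered by one map cell

ASSET_CLASSES = {
    "T": ("tree", 6),         # (asset class, instances per cell)
    "G": ("grass_clump", 20),
    "B": ("building", 1),
    ".": (None, 0),           # empty ground
}

LAYOUT_MAP = [
    "TTGG.",
    "TTGGB",
    "..GGB",
]

def placements_from_map(layout_map):
    points = []
    for row, line in enumerate(layout_map):
        for col, cell in enumerate(line):
            asset, count = ASSET_CLASSES[cell]
            if asset is None:
                continue
            for _ in range(count):
                points.append({
                    "asset": asset,
                    "x": (col + random.random()) * CELL_SIZE,
                    "y": (row + random.random()) * CELL_SIZE,
                    "scale": random.uniform(0.85, 1.15),
                    "rotation": random.uniform(0.0, 360.0),
                    "tint": random.uniform(-0.05, 0.05),   # slight colour offset per instance
                })
    return points

if __name__ == "__main__":
    pts = placements_from_map(LAYOUT_MAP)
    print(len(pts), "placement points")
```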

It sounds like a monumental task.

The Community was huge, I mean huge! We had millions and millions of trees in that thing – all different and all modelled down to leaf detail – thousands of lamp posts, benches, people, houses, bikes, motorbikes … The details mattered. It wasn’t an over-built place, but our wide shots showed every little detail and, because we didn’t know which areas would be featured early enough, we had to build everything.

In the end, this allowed Philip to create new shots that the story needed, like close fly-overs of the houses with people going about their daily lives below. It was a daunting build, but Anthony’s CG team pulled it off and then some.

Our biggest shots of the Community weren’t even known about until a month before the deadline. It wasn’t bad planning – it’s just that the cut was changing subtly. There was a need to tell part of the story in more detail, and a picture is worth a thousand words. In just one month, we turned around five extra huge Community shots, all full CG with people, vehicles, clouds … everything.

A remote drone catches up with Jonas during the film’s climactic chase sequence.

Tell me about the drone chase that happens near the end of the film. Was the drone fully CG, or did you work with practical elements?

A drone was built on set, and initially it was decided that the practical model would just need “enhancing” and the addition of side wings – something both Robert and I had one perpetually raised eyebrow about throughout the shoot! It’s not that it was a bad design. It just restricted how we shot things because it was so big and heavy; you could only shoot it upside down for the most part and never move it around.

So Robert and I hatched a plan to completely replace it in post and, as we went along, started to shoot more and more shots entirely on greenscreen. The reason it wasn’t planned that way from the start was that the drone initially had a far smaller part to play. As things developed – and long after the budget to build it practically had been assigned and spent – the story grew, and the drone needed to do more than was originally thought. If nothing else, in some shots, it was great to have the correct shadows fall on the practical object, and it was a great tracking object for matchmove. But, in retrospect, it should have been just a simple box on a greenscreen.

FX simulations for the hovering drone.

Can you describe how the drone chase plays out on screen?

During the sequence, Jonas (Brenton Thwaites) is escaping from the Community, pursued by a drone piloted remotely by his former friend Asher (Cameron Monaghan). The drone catches up with Jonas and pulls him up into the air, ready to drop him from a great height.

Many of the shots done with Jonas and the practical drone were quite static, despite being “in the air”. We spent a lot of time replacing the practical drone with the fully CG one, adding in new movements such as pitch and yaw, air compression from the suction devices that lift Jonas, and engine rotation. The end result was to take a sequence that lacked visual energy and give it a huge boost.

We spent a couple of days shooting motion capture at The Capture Lab in Vancouver to add a CG version of Jonas into some shots. Our digital Jonas looked great – the digital double team, led by Chris Norpchen, did a fantastic job of getting all the hair and cloth detail in and working well. To top it all off, it actually looked like the actor. You assume the scanning and texture maps would take care of this, but it doesn’t always work out that way. When you look at the shots and wonder which ones are digital Jonas, you know it worked out well.

The drone chase – basic geometry laid over the original background plate.

The drone chase – final composite.

How did you shoot the scene in which the Chief Elder – played by Meryl Streep – appears as a hologram?

Early on in the production, Method used The Embassy in Vancouver to create previz for this key sequence. Due to schedule issues, we knew we would not have access to both the location and Ms Streep at the same time, so we planned to combine them together later on. That meant shooting motion control, so we decided to justify the extra cost by creating more visually complex shots.

The key shot occurs when the Chief Elder appears in hologram form for the first time. We designed a move that starts off in the same volume the Chief Elder will eventually occupy, and pulls up through her body. It’s a neat effect that shows her body building up towards us as the move progresses.

Our FX team, led by Ian Farnsworth, designed the hologram look, while our set layout team, led by lead matte painter Jeremy Hoey, created the complex set plans needed to allow motion control in the two different locations. Working with General Lift, CG artist Christian Emond created camera moves that could be imported into the motion control rig, building a hugely successful system for future motion control shots. General Lift provided the motion control system and, despite our not sleeping up to, during and for a few days after the shoot, the whole thing came off without a hitch!

How long were you working on “The Giver”?

I started work on the show in May 2013, and finished in July 2014. The shoot ran from September to December 2013, and turnover ran from just after Christmas 2013 through to February 2014. We had several previews and a couple of trailers, and the overall VFX team peaked at around 70 people. Matchmove and roto were done both in and out of house, and all shots were reviewed remotely over CineSync. Method Studios had regular calls with Robert Grasmere, who was in New York with Philip Noyce and the editorial team.

"The Giver" posterHow do you feel looking back at the production now?

The Giver reminded me that details matter. The whole team at Method Studios did an amazing job of putting this show together. Our industry spends a lot of time talking about the artistry and the crazy hours spent putting those images together, but this movie was also made with the help and dedication of all those producers, digital production managers and coordinators who, after the artist team has gone home, stay to upload, download, prep, and make sure the things we spend so much time doing, actually get seen by the clients. My biggest thanks goes out to them.

Also, the on-set team in South Africa was superb. I was worried that shooting so far away would present challenges in finding on-set wranglers and coordinators, but luckily I was very wrong. And South Africa is beautiful. I mean absolutely stunning. I was amazingly lucky to spend time on its northern border, flying drones up and down Augrabies Gorge. It felt like we were on another planet.

Special thanks to Anthony Zwartouw, Rita Cahill and Ellen Pasternack. “The Giver” photographs copyright © The Weinstein Company 2014.

The Future of Practical Creature Effects

What does the future hold for practical creature effects?

Ah, was there ever a question more likely to raise the hackles of fans of old-school special effects? Or to cause a look of blank bemusement to cross the face of the average moviegoer? Ever since Jurassic Park, argument has raged over which is better: a CG dinosaur or its mechanical equivalent? Whenever the issue is raised, emotions run high; discussing it at all is only marginally safer than poking a nest of snakes.

But is it an issue? Is it valid even to ask the question at all? Isn’t it time we rejected all that “either/or” nonsense and concentrated on simply getting the job done in the best possible way?

There’s only one way to get a decent answer: ask the experts. So that’s what I’ve done. I put my question to a group of top professionals from the field of practical effects, left them to ponder … and then stood well back.

You want to know what the future holds for practical creature effects? You’re about to find out.


Richard Taylor
Co-Founder & Creative Director, Weta Workshop

“At Weta Workshop, we very much believe there is still a dynamic place for physical creature effects in the entertainment industry. While there has been a huge shift towards CG creatures over the last ten years, there are still directors who are interested in utilising more traditional, real-world effect solutions for characters in their films.

“There is no doubt that films such as the latest Planet of the Apes, with work by Weta Digital, give a clear indication how extraordinary characters can be fully realised through CG, but there remains a very compelling reason to have real actors in prosthetics and creature costumes on set to create particular characters for the right project. A great example is the recent blockbuster Guardians of the Galaxy, which uses a huge ensemble cast of aliens of which, I believe, only three are digital. Here is a wonderful use of traditional makeup and creature effects (created by a truly superb and world class team) to do something absolutely extraordinary.

“Owning a physical effects company, I have to believe there will always be some demand by an audience and the directing community for the wonderful attributes of practical creature effects and makeup. It may become niche, but just look at the way stop-motion animation continues to have a presence, delivering some of the most engaging and beautiful films of the last ten years, such as Fantastic Mr. Fox, ParaNorman and Coraline. In the same way, those wonderful monster makeups that were pioneered by so many greats in the early days of filmmaking can still have power today.”

Richard Taylor and Mike Asquith of Weta Workshop study a Bilbo scale-double makeup design sculpture for “The Hobbit”.


Phil Tippett
Co-Founder, Tippett Studio

“The answer to your question can be easily determined with the use of a crystal ball …!

“The method of producing VFX and character work is usually determined by the studio, the VFX supervisor, and the director. A great deal depends on the director. Films like Hellboy 2, The Wolfman, and The Hobbit all employed a mix of technologies that was decided during the preproduction design process.

“The current spate of scripts we’re seeing requires an unbridled quotient of spectacle. Current production methodologies dictate that you spend as little time on the set as possible – it’s preferable to just push the issues downstream and “fix it in post”. Given all the cash these recent VFX-heavy films have scored – Godzilla, Planet of the Apes, Ninja Turtles, Transformers – I think the die is cast.

“Frankly I’m not so sure there are any significant demographics to indicate a real ground-swell movement toward practical. Currently, studios tend to just go with the “let’s do it all CG” bag because it’s the flavor of the day, and god forbid you’d want to go against the prevailing trends in a corporate climate. Ain’t gonna happen.

“There will continue to be the anomalies like Beasts of the Southern Wild, where budgetary issues force smart, low-budget approaches. In that case, VFX that drove the story in a very economical way were pulled off with a fine level of execution. For the same reasons, there’s still a good amount of practical stuff going on for TV. Hopefully that will continue.

“There are still skilled folks out there who know how to do this practical stuff. Let’s hope there can be enough work to keep their knowledge base from falling into atrophy, lest we go the way of those ancients of yore who built the Pyramids and the Mayan temples. We couldn’t pull that off today.

“SO! To answer your question: the future lies with extraterrestrials who tell us how to make stuff. And crystal balls. I think those are the way to go. Weird magic, no? (That’s what it’s all about anyhow, ain’t it?)”

Phil Tippett with the Rancor from "Return of the Jedi"

Phil Tippett supervised the creature shop on “Return of the Jedi”. He is currently supervising the dinosaurs of “Jurassic World”.


Alec Gillis
Co-Founder, Amalgamated Dynamics Inc.

“Fans seem to love practical creature FX, as do many directors, but studios tend to promote the near-exclusive use of digital imagery. This has led to an audience belief that the look of CG equates to a “big Hollywood” look.

“While I don’t like not being invited to the party, being regarded as a PFX pariah has an upside. PFX are becoming more unique, less corporate and edgier than mass-produced CG FX. While I miss the days when the aesthetics of PFX artists like Rob Bottin, Rick Baker and Stan Winston drove the look of the creature work, I can still appreciate the scope of today’s big pixel-fueled VFX orgies. Whether you think modern VFX films are hollow, overblown corporate sputum, or a dazzling nexus of art and technology, doesn’t really matter: they sell.

Harbinger Down teaser poster

“Of course, most fans don’t really care what technique is employed, as long as the movie is good and the effect looks cool. It doesn’t even need to look real, just cool. So there’s no need to argue which technique is better. They’re just tools used to build a film. If you’re constructing a house, is a nail gun better than a table saw?

“Here at ADI, we believe in the mixed bag approach – we try to promote the strengths of both techniques. The frustration arises when there’s an effort to suppress or eliminate techniques as a matter of corporate policy. It’d be nice if the new Star Wars movies had more PFX in them, or if Mr. Del Toro and Mr. Favreau swayed studios towards PFX, but I can’t wait around for that to happen. I plan to keep making modest movies like Harbinger Down that rely on PFX. If the audience responds positively, the PFX resurgence will grow. If not, we’ll join the theremin and become a faint echo from a quaint bygone era.”


John Rosengrant
Co-Founder, Legacy Effects

“Of course there is a future for Practical Effects. Hey, they provide a spontaneous, tactile quality, an actual in-the-scene feeling. They provide not only a visual reference for an actor but something to actually perform with. In Real Steel, for example, having actual robots for the young actor to work with was essential.”

Legacy Effects created partial and complete suits for the “Iron Man” films and “Avengers Assemble”, used both on camera and as reference for the visual effects teams.


Mike Elizalde
President & Creative Director, Spectral Motion Inc.

“I have never wavered from my position that practical effects will always be relevant. The huge pendulum swing that occurred over the past two decades in favor of digital effects – where practical effects would have been more effective – has lost its momentum and is beginning to swing the other way. This discussion has raged for a long time, but the proof is in the meetings I take weekly with young, upcoming filmmakers. The dialogue in those meetings is the same, time and time again. They want their creatures to be practical.

“They need the tactile accessibility for not only their audiences, but also for their cast members. Practical effects are emotionally engaging on a level that we can immediately relate to. When it comes to organic, living characters, pixels cannot stand in for molecules. Digital effects have their place and they are here to stay, but the prevailing sensibility among filmmakers is that practical effects should have never taken a back seat. I agree with them.

“I feel that when an artist touches clay, or paint, or any other tactile medium with their hands, they impart a measure of their soul into their creation. There is passion and life and an undeniable soulfulness that’s clear to the human mind. Their weight and presence are not an illusion, and that is something that we all recognize on both a primal and an intellectual level. I feel no greater joy, creatively, than when I stand back and look at something I have been sculpting or building with my own hands, and see the piece claim its place in the tactile universe.”

Russel Lukich of Spectral Motion works on a parasitic “roly-poly” for “Pacific Rim”.


Howard Berger
Co-Founder, KNB EFX

“One of the many questions I get asked frequently – aside from if I have fun on Halloween – is how VFX has affected the world of special makeup and creature effects. This sparks two feelings. One is how frustrating this often-asked question is. The second is how much I enjoy working with VFX, and utilizing their tools and art to enhance ours.

“It all started with Jurassic Park. When we saw that first shot of the dinosaur walking across the screen, we looked at one another and said, “We are extinct”. After numerous discussions among the shop owners, it seemed there were three ways to deal with this potential new threat:

  1. Get out of doing special makeup effects and join the digital revolution, which a few shops did. They are now extinct.
  2. Fight the digital revolution and convince ourselves that anything VFX can do, we can do practically.
  3. Find a way to integrate what we do best, and work with VFX to create a brand new magic trick for filmmakers and audiences.

“We at KNB EFX Group chose option 3.

“At first it was a bit of a struggle, as certain filmmakers wanted the new “state of the art” way of doing things, even if the practical solution was the better way to go. For example, when we worked on Little Nicky, the director went digital for a certain gag. If we’d been allowed to accomplish the effect via special makeup, it would have looked heaps better, and not been the strange, floating mess that ended up in the final film.

“There was grandstanding from both sides. We’d come to set with mechanical puppets, set them up and shoot a few takes, only to see the VFX supervisor whisper something in the director’s ear. The next moment we’d hear, “OK, let’s move the puppet and get a plate shot, just in case”. Well, you can guess what happened next …

“When we were awarded The Chronicles Of Narnia: The Lion, The Witch And The Wardrobe in 2004, we knew this would only work if we all came together and formed a team. Luckily our VFX supervisor Dean Wright and Bill Westenhofer from the late Rhythm & Hues felt the same way. Take Otmin, the Minotaur. We made a huge suit and mechanical head that was worn by Kiwi actor Shane Rhangi. For full-body shots, Otmin was ours from the waist up, and R&H’s from the waist down.

“Otmin was a perfect blend of techniques, and the beginning of a very successful collaboration between special makeup and digital VFX for KNB. We love making creatures, and now we can create anything by working hand in hand with VFX. It’s a great marriage of the two arts.”


Beth Hathaway and Clare Mulroy from KNB EFX prepare Otmin for a scene in “The Chronicles of Narnia: The Lion, The Witch and The Wardrobe”.


Tom Woodruff Jr.
Co-Founder, Amalgamated Dynamics Inc.

“Practical VFX has never completely disappeared, but it has been vastly overshadowed during the last two decades, as CG has become the primary tool for visual effects. Yet the audience knows the difference, and has grown steadily to be the strongest advocate for a balance onscreen.

“For the longest time, we couldn’t get producers and studios to consider that balance. Even studio VFX supervisors – for all their understanding of the value of real images – couldn’t turn the tide. In recent years, there has been some turnaround. The big studio ships aren’t exactly turning around and coming back to us, but more and more are dangling their dinghies.

“Our next challenge is re-educating people about what we do. The industry has a short memory. Today I sit in meetings where producers and directors ask if it’s possible to change the look of an actor’s face without rebuilding it as a digital asset. Alarming too, that the term “asset” has replaced the term “art”. We have to convince producers and supervisors that practical VFX can work – when they are designed with a plan of how they will be shot, and that plan is followed on stage.

“Is there a war between practical VFX and digital VFX? There is when you listen to the buzz of social media. Personally, I’ve always advocated balance. Our studioADI YouTube channel presents over a hundred clips of the work we’ve done over the past 30 years – a great deal of which overlaps with digital VFX. In fact, most of us working in the field understand we’re all practitioners of the single art of visual effects – I myself have an Academy Award for Special Visual Effects. That term can’t be usurped solely for digital effects simply because of a misconception that we’re at odds.

“Yes, I’ve got a dog in this fight. Some of our best recent work at ADI has been altered, ruined or simply replaced in post by those who don’t have a big-picture vision. I don’t like seeing our art being downgraded to a service. I don’t want to close down my studio and send the artists away for good.

“The future won’t restore what has been compromised, but we do seem to be at the brink of a renaissance, with smaller films choosing an alternative to digital-only VFX, and big films understanding they can use the support of practical VFX. We’re not going anywhere. We just want to have our cake and eat it too.”


Director Tom Woodruff, Jr. tests makeup with actress Danielle Chuchran for his crowdfunded feature film “Fire City”.


Steve Newburn
Creature FX Supervisor, Pacific Rim, The Strain

“It can be tough for a traditional creature shop to survive today. Profit margins are tight. Seven-figure paychecks are a thing of the past. Those who openly embraced change early on, incorporating 3D modeling, milling and printing techniques into their workflow, are generally still going. But CG has had 20 good years to dig in, and we are on the verge of a generation of viewers who don’t know what “real” looks like any more.

“I remember a practical effect from a few years back. The director allowed three takes – which took roughly 15 minutes – before saying, “This is taking too long. We’ll do it in post.” No explanation about what wasn’t liked, even though it was exactly what was asked for. Everyone on the crew sat there for another hour-and-a-half while they shot take after take of the setup, followed by clean plates and HDR for the VFX team – around 18 additional takes. The attitude of the director fueled the perceived failure of the practical effect. Self-fulfilling prophecy.

“Yet there are many good directors (generally veteran filmmakers) still willing to work with practical effects. And most of the digital artists who come to reference our work say nothing but positive things about it. Many say practical effects are what they wanted to do for a living but, because of their apparent decline, they chose digital instead.


Strigoi makeup created by Steve Newburn and Sean Sansom for “The Strain”

“So what does the future hold? Personally, I think the industry will continue, but in constantly evolving form. Creature shops are starting to look at smaller films, art projects and corporate work. This approach can keep smaller operations going, and is a good filler between other projects for the big shops.

“For both Pacific Rim and The Strain, we knew we couldn’t deliver the volume of work under the standard creature shop model. Our solution was to create a department under the production roof. This planted us firmly in the minds of the other departments – art, props, wardrobe, even construction – and resulted in additional work being sent our way. The producers saw that every dollar spent was going on camera and not into paying vendor overhead for months after wrap.

“With this model, we use the same crew we would as a vendor and, in most cases, those people are paid better … yet the ultimate price tag is cheaper. The producers have more control, and can see exactly where every dime is going. It eliminates our own need for clerical staff, as much of the financial work goes through the production’s accounting department.

“Further still, it encourages a more open-minded approach as to what tool is best for the job, and fosters a more fluid working relationship with the digital VFX team. As for the “practical vs digital” debate – it can finally take a back seat.”


Todd Masters
President, MastersFX

“Personally, I’m banking on a growing use of practical materials and FX. As more artists – both practical and CG – become more comfortable with integrating these tangible items, things should continue to open up. Of course, practical elements don’t help every effect, but I feel they can often help keep things grounded, honest and organic – even if only used as reference.


The alien character Cochise was created by MastersFX for “Falling Skies”

“There’s little debate that, when possible, filming physical elements on set can be good for the overall production. Actors perform better to something if they can see it, touch it, experience it. Directors work better with a real monster than a tennis ball. Scenes can be blocked out more successfully, and the overall belief in the experience is far more satisfying. The trick is figuring out which methodology is best suited for each case.

“As CG has evolved, so have practical FX materials. Now our “real” stuff looks and reacts like CG. Of course, CG has become more convincing as well. The art is in mixing them in a satisfying way. Ultimately, there are only cool shots and good work. There aren’t sides to FX until you choose to take them.”


Mark Coulier
Creative Director, Coulier Creatures FX

“Has there been a better werewolf transformation scene since An American Werewolf in London? Has there ever been a better or more popular creature character in a film than Chewbacca? Name an alien that has captured the audience’s imagination – especially children – as much as E.T. Has there been a better creature built since the T-Rex in Jurassic Park?

“The Jurassic Park T-Rex and raptors have never been bettered as real, believable characters and I think that is partly down to the mix between practical and VFX. It is, after all, about fooling the audience’s eye and creating characters that are real. Once you have a King Kong that does multiple somersaults, and a herd of stampeding Brontosauri with people running between their legs, it doesn’t matter how real they look; it looks impressive from a technical point of view, but it fools no-one.

“I see one technological Hollywood masterpiece after another and they are all starting to look the same. I think the lack of technology in the past forced filmmakers to be inventive about how the story was told. We should make sure that magic is not lost forever. Fortunately, there are filmmakers out there who still embrace all aspects of our craft. None of it should be discounted. We should use whichever effect tells the story best.”


Mark Coulier, working under creature effects designer Nick Dudman, sculpted and applied this makeup for Voldemort in the “Harry Potter” series of films. The makeup was further refined with digital enhancement by MPC.


Conclusion

Despite the challenges of the last couple of decades, it’s clear to me that the practical creature effects industry is very much alive and kicking. Sure, it’s been forced to move with the times – and not always painlessly. But isn’t change inevitable, in all walks of life?

The practical shops that are surviving – in some cases thriving – are doing so because they’ve embraced that change. Some have done this by seeking ways to integrate with their digital peers, others by entering new markets, or by exploring new business models. Some have embraced all these things and more. Meanwhile, a new generation of filmmakers has emerged that is keen to explore the possibilities of today’s sophisticated brand of practical effects.

Throughout – and without exception – the creature-makers and makeup effects artists have continued to do what they’ve always done: use extraordinary skills to create amazing visions of imaginary beings. And if the impassioned responses to my question above prove anything, it’s this: however great the challenge, there is a viable alternative to extinction.

Evolution.

Now it’s your turn. Which side of the fence do you stand on? Or are you ready to burn that fence down? Cast your vote below:


“Return of the Jedi” photograph copyright © 1983 by Lucasfilm Ltd. “Pacific Rim” photograph copyright © 2013 by Warner Bros. Pictures. “The Chronicles of Narnia: The Lion, the Witch, and the Wardrobe” photograph copyright © 2005 by Disney Entertainment and Walden Media. “The Hobbit: An Unexpected Journey” photograph copyright © 2012 by Warner Bros. Pictures. “Real Steel” photograph copyright © 2011 by Dreamworks II Distribution Company. “The Strain” photograph copyright © 2014 by FX Networks, LLC. “Harry Potter” photograph copyright © by Warner Bros. Pictures. All rights reserved.

Doug Trumbull’s “UFOTOG”

How many frames per second is enough? According to film director and visual effects pioneer Doug Trumbull, it’s 120fps. And he’s determined to prove it.

This week, on Thursday 11 September, the Toronto International Film Festival is staging the first public showing of UFOTOG, an experimental sci-fi adventure film written and directed by Trumbull. At the event – which will be moderated by Scott Feinberg – Trumbull will also deliver a keynote speech about the creation of the film.

UFOTOG is designed to showcase MAGI – a new combination of technologies that will, Trumbull believes, take motion pictures to the next level and compete with the various technologies threatening to draw audiences away from conventional theatrical presentations.

“Younger audiences are enjoying the benefits of low cost and convenience via downloading and streaming, causing tidal shifts in the entertainment industry, and particularly in theatrical exhibition. Theaters must offer an experience that is so powerful and overwhelming that people will see the reward of going out to a movie.” – Doug Trumbull

Trumbull is no stranger to stretching the boundaries of cinema. After supervising the groundbreaking visual effects of 2001: A Space Odyssey, he went on to develop Showscan, a 60fps film process which, had it taken off, would have brought high frame rates to theaters nearly thirty years before Peter Jackson managed it with The Hobbit.

UFOTOG was shot in a combined laboratory/stage/studio, with 120fps 4K 3D live action staged inside virtual environments and the results relayed to a large screen adjacent to the shooting space.

MAGI isn’t just about high frame rates. According to the Trumbull Studios press release, UFOTOG “explores a new cinematic language that invites the audience to experience a powerful sense of immersion”. In collaboration with Christie Digital and Dolby Laboratories, Trumbull aims to transform the theatrical experience by improving screen size, image brightness, atmospheric sound and the effectiveness of 3D, as well as optimising the colour saturation and dynamic range of the projected image.

“We are exploring and discovering a new landscape of audience excitement, and doing it inexpensively and quickly – we are pushing the envelope to condense movie production time, intending to make films at a fraction of current blockbuster costs, yet with a much more powerful result on the screen.” – Doug Trumbull

Watch the teaser trailer for UFOTOG: