About Graham Edwards

I'm senior staff writer at Cinefex magazine. I also write novels. In a former life, I produced animated films for theme park rides and science centres. If you offer me a cold beer, I won't say no.

Chappie – Cinefex 141 Extract

Cinefex 141 - "Chappie"

All this week, we’re featuring exclusive extracts from our brand new magazine issue, Cinefex 141, now available in both print and digital editions.

First up is Chappie. Starring Hugh Jackman, Sigourney Weaver and Sharlto Copley, this action/thriller from District 9 director Neill Blomkamp tells the story of a robot child prodigy kidnapped by criminals and raised within their dysfunctional family. Weta Workshop provided practical props and effects, along with special effects supervisor Max Poolman. Visual effects were created by Image Engine, Ollin VFX Studio and The Embassy VFX.

In this extract from Jody Duncan’s article, Rules of Robotics, Chris Harvey of Image Engine discusses the creation of the film’s robot characters.

Final robot designs went to visual effects supervisor Chris Harvey and the crew at Image Engine, the show’s primary visual effects provider, which would deliver close to 1,000 shots, including all of those involving digital robot characters. “We brought Weta Workshop’s two-dimensional designs into Image Engine,” recalled Harvey, “and began to flesh them out in three dimensions. That was a very long, detailed process – and it was quite different from what usually happens. Typically on a film, someone does concept design, and then that is built physically, and then the visual effects team has to replicate what was built. What we did was the opposite of that. The concepts came to us first, and then we developed those. We worked out how the robot would function, physically – how all the gears and mechanisms, joints and limbs would function in the real world. So it was a very physically-based design, which was important to Neill. He wanted it to look like something that could be conceivably built, even today.”

Ultimately, Image Engine modeled 16 different versions of Chappie to accommodate his evolution through the film, as well as 12 generic Scouts. “We had to create a whole police force of these guys,” explained Chris Harvey. “There was the prototype droid, called Robot Deon, and then droids for the end of the film when they go offline and they are vandalized – spray-painted, burned and beaten up. Counting all the different versions of Chappie and the Scouts, we created 28 unique robots for the film.”

Read the complete article in Cinefex 141, which also features The Hobbit: The Battle of the Five Armies, Jupiter Ascending and Unbroken.

All content copyright © 2015 Cinefex LLC. All rights reserved.

“Cinderella” – VFX Q&A

"Cinderella" - a Cinefex VFX Q&A with MPC

Like most fairy tales, the tale of Cinderella is part of an oral storytelling tradition stretching back hundreds, if not thousands, of years. Many versions exist of this classic story of the persecuted heroine, but the one most familiar to modern Western audiences is a French variant, Cendrillon, written in 1697 by Charles Perrault.

Perrault’s story – which introduced the now-familiar devices of the fairy godmother, the pumpkin turning into a carriage, and the glass slipper – was first adapted for the screen as Cendrillon by Georges Méliès in 1899. However, the movie remembered by most people is the 1950 Disney animated feature Cinderella.

Now, Disney have released a new version of the rags-to-riches tale: a live-action feature starring Lily James as Cinderella, Cate Blanchett as the Stepmother, Helena Bonham Carter as the Fairy Godmother and Richard Madden as the Prince.

Directed by Kenneth Branagh, the film eschews the recent trend for reimagined, edgy fairy tales. Instead, it tells the traditional story straight, and is unafraid to draw on the heritage of its animated predecessor.

The majority of the visual effects work for Cinderella – approximately 500 shots – was delivered by MPC, with Charley Henley in the role of production VFX supervisor, and Patrick Ledda supervising for MPC.

In this Q&A for Cinefex, Ledda discusses magic and mice, palaces and pumpkins, and that all-important glass slipper.

"Cinderella" - palace exterior shot by MPC

How did you first get involved with Cinderella?

I joined the show in Summer 2013, a couple of months before the start of the shoot, and was on the show for 15 months or so. I knew the production VFX supervisor, Charley Henley, so it was easy to get started. We had several meetings about the style of the movie, previs, sets, locations, and so on. A few weeks later, we commenced principal photography at Pinewood. I attended the shoot, and subsequently went on to supervise MPC’s work.

What was the scope of MPC’s work on the film?

It was varied. We did creature work, including the mice, lizards, goose, transformations, digi-doubles and stag. Also magical transformations: the carriage, the shoe, the dress. There was a considerable amount of environment work, including fully-CG wide establishers, the palace, the town and various set extensions. Our CG supervisor, Richard Clegg, did a tremendous job of managing such a variety of assets and shots, ensuring a consistent style and quality throughout. Supervisors Richard Little and Reuben Barkataki led the comp team.

How involved was the director, Kenneth Branagh, with the visual effects?

We were very fortunate to work closely with Kenneth. We had some conversations on set about certain VFX shots and how to shoot them, but his real involvement with us started at the end of principal photography. We met him several times, for everything from big-picture discussions – such as what the mice would look like – to more in-depth conversations going through the entire movie shot by shot.

Can you give us some examples?

We discussed questions concerning the personality of the mice, or ways in which we could transform the lizard into the coachman. It was clear that Ken understood the VFX process well, having worked on movies such as Thor in the past. That helped us tremendously. But what I found most useful was his amazing ability to act out scenes and characters. That gave us the clearest briefs of all. Just from his expressions, we could understand what he was after.

It sounds as if the process was quite collaborative.

He was interested in our ideas, so our sessions weren’t just briefs but more like creative conversations. He also came to visit the team in the Montreal office, which was great for everyone. Going through the film with him and listening to his ideas was inspiring for the entire team.

How closely did you study the original Disney animated film?

By the end of the movie, the entire crew was very intimate with the 1950 animated feature! We used it as reference and inspiration, but we were also keen to put our own stamp on the movie. We also worked with the production art department to ensure that our CG work would be in line with the practical sets.

What other visual cues did you use?

As usual, for human characters, we did photoshoots, photogrammetry and scans. For the animals, we used a ton of reference photography and videos. Additionally, we had our in-house real mice, which our animators looked after and used as reference on a daily basis. We also looked at many landscape paintings to get a mood and palette for Cinderella’s world.

The film has quite a saturated colour palette. Did that affect your approach to the visual effects?

That’s a good question. Firstly I should mention that Charley Henley and director of photography Haris Zambarloukos had several conversations about the look of the film as a whole. It was shot on Kodak film stock, as both Ken and Haris wanted a classic look, and then went to digital intermediate, which can make quite a dramatic visual change to a shot.

In order to deal with this, we obtained grade references early on, so we would know where the grading process would take our shots. For fully CG sequences, we delivered shots “neutral” or with a simple representative grade, which was used as a guide for the digital intermediate.

How much creative control were you able to use during the previs stage?

We did a considerable amount of previs early on, and continued to produce previs well after the shoot; some scenes are fully CG and were completely designed in post. We were given quite a lot of creative freedom – Ken was always interested to see our ideas and was happy to see rough work and proofs of concept instead of waiting for something more polished. Most of the big CG sequences had been prevised and/or postvised, and for the most part these were used as references during the shoot. I should mention that a lot of the previs/postvis work was done by the production VFX team.

"Cinderella" - bluescreen shoot for later digital enhancement by MPC

How much of the palace was built practically, and how much did MPC create?

The exterior of the palace was fully CG, apart from the door and balcony which are visible in some shots. A section of the stairs and gate was also practical. Together with the production art department, MPC designed and created the entire model.

The interior was mainly built practically. We did set extensions, CG chandeliers, digi-doubles and so on, but I won’t take any more credit than we deserve. The set was outstanding – beautifully designed and created – so it was really great to complement it with our work.

Building the digital palace sounds like a big task. How did you go about it?

We drew inspiration from a number of European palaces, as well as palaces from other Disney films. From there, under the supervision of our asset lead, Jung Yoon Choi, we created the design that we wanted. The whole process was fairly elaborate, mainly due to the sheer size of the palace, and not knowing to what extent we would have to build it, and to what level of detail.

What was the biggest challenge with the palace?

Finding a way to marry the palace and the landscape. Ken wanted it to feel grand, but at the same time immersed in the landscape. A lot of work went into the design and creation of the palace gardens, led by environment lead Hubert Zapalowicz. Size-wise, the model of the palace was fairly big, but still manageable in our pipeline. More complex were the trees and vegetation surrounding the palace, which for a large part are fully 3D.

How many elements does a typical shot of the palace contain?

To pick one shot is difficult, but I can briefly describe the shots where Cinderella is running away from the Prince down the stairs of the palace. In this scene, apart from the actors and a section of the stairs, the majority was CG; even the plate containing actors got re-projected to allow for a nicer camera move. We extended the stairs, created vegetation to the sides, added digi-double guards, the palace and sky. The carriage was practical in this sequence, although we applied a 2D treatment to make it look more magical.

Do you have a favourite shot of the palace?

The opening shot of the ball, where we fly through fireworks and have a first establisher of the palace at night.

"Cinderella" - CG mice by MPC

Let’s talk about the digital characters. How many did you create in total?

The main characters were four mice, lizards, the two coachmen, a goose, the footman, stag, bluebirds, and white horses. There are also many other lesser creatures, such as butterflies, birds and so on. I believe the total number of assets was in the region of 80.

Which were the most challenging?

The mice! The brief was to go for a photorealistic look, because they interacted with Cinderella quite often. But they needed enough character and personality to engage with Cinderella and the audience. It was a fine line, and our animation supervisor, Warren Leathem, and lookdev head, Thomas Stoelzle, did a great job in finding that balance.

We gave the mice a slightly anthropomorphised feel in order to differentiate them and give them personality, but all in all we were going for photoreal shading. Although we used a lot of reference material, the mice are not the digital reproduction of any real mice. We created our own version.

How were the creatures animated?

For the vast majority we used keyframe animation. The animals other than the mice – particularly the transforming characters – had a much broader animation style, to help the comedy and fairy-tale aspects of certain scenes. We created our own concepts of the various transformation stages from fully animal to fully human.

MPC's digital mice interact with Cinderella's dress - original plate

MPC's digital mice interact with Cinderella's dress - final composite

How did you rig the models for the transformations – the mice changing into horses, for example?

Building a system with enough flexibility was the biggest challenge that our rigging lead, Davide La Sala, faced on this project. We needed a system that would allow lots of creative freedom when animating the transformation shots.

Each character had three rigs: a horse rig, a mouse rig and a transformation rig. The animators could choose to animate the different parts of the character with either the horse or the mouse rig, depending on what suited. The horse and mouse rigs were constrained and linked to the third transformation rig, which was used to blend between horse and mouse shapes.
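As a loose illustration of the third rig's job, the blend can be thought of as a per-joint interpolation between the outputs of the two source rigs, with independent weights so different body parts can transform at different times. The function and data layout below are hypothetical stand-ins, not MPC's actual pipeline:

```python
# Hypothetical sketch of a "transformation rig" blend: interpolate per joint
# between poses produced by a mouse rig and a horse rig. Joint names, the
# (x, y, z) position layout and the weights dict are illustrative only.

def blend_poses(mouse_pose, horse_pose, weights):
    """Blend two skeleton poses.

    mouse_pose / horse_pose: dicts mapping joint name -> (x, y, z) position.
    weights: dict mapping joint name -> blend weight, where 0.0 means fully
             mouse and 1.0 means fully horse. Per-joint weights let the
             animators transform different body parts at different times.
    """
    blended = {}
    for joint, m in mouse_pose.items():
        h = horse_pose[joint]
        t = weights.get(joint, 0.0)  # default to the mouse shape
        blended[joint] = tuple(mc + (hc - mc) * t for mc, hc in zip(m, h))
    return blended

# Example: the head is already fully horse while the tail is still mouse.
pose = blend_poses(
    {"head": (0.0, 1.0, 0.0), "tail": (0.0, 0.5, -1.0)},
    {"head": (0.0, 2.5, 0.5), "tail": (0.0, 1.2, -2.0)},
    {"head": 1.0, "tail": 0.0},
)
```

In a production rig the blend would operate on full joint transforms and blendshapes rather than bare positions, but the per-part weighting idea is the same.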

Tell us more about the blending process.

The transformation rig calculated both scale changes and how “transformed” various parts of the body were. This information was baked into the geometry cache. MPC’s software team added features to our proprietary hair system, Furtility, to be able to read this data back in from the geometry cache and use it to drive changes in the hair.

For example, as the head grew massively in size from mouse to horse, so the mane would grow and the fluffy mouse hair would transition to short horse fur. This data was also used by the shaders to modulate between textures and different shading setups for the different modes of animal.
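As a rough sketch of how baked data like this can drive a groom, imagine a per-region "transformed" weight read back from the geometry cache and used both to interpolate hair length and to modulate between the mouse and horse texture sets. This is an illustrative stand-in, not Furtility's actual (proprietary) API, and the numeric values are arbitrary:

```python
# Illustrative sketch only: a baked "transformed" weight (0.0 = fully mouse,
# 1.0 = fully horse) drives hair length and shader texture blending.

def lerp(a, b, t):
    """Linear interpolation between a and b by weight t in [0, 1]."""
    return a + (b - a) * t

MOUSE_HAIR_LENGTH = 1.0   # long, fluffy mouse hair (arbitrary units)
HORSE_HAIR_LENGTH = 0.2   # short horse fur

def groom_params(transformed_weight):
    """Derive groom and shading parameters from the baked weight."""
    return {
        # hair shortens as the character becomes more horse-like
        "hair_length": lerp(MOUSE_HAIR_LENGTH, HORSE_HAIR_LENGTH,
                            transformed_weight),
        # the shader mixes mouse and horse texture sets by the same weight
        "horse_texture_blend": transformed_weight,
    }

# Midway through the transformation, the hair is partway shortened and the
# textures are an even mix of the two looks.
params = groom_params(0.5)
```

In the real system this data would be sampled per hair or per surface region from the cache, so the mane could grow while the body fur was still transitioning.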

Stylistically, how did you manage the animation during the transformations?

This was probably an even bigger challenge. The director was adamant that the transformation had to look enjoyable; he wanted to convey excitement as the mice become beautiful and powerful horses. We went through many iterations, experimenting with several ideas and edits. Interestingly, the transformation back from horse to mouse, although more challenging and of a higher shot count, was in a way easier as we had a clearer idea of how the scene was going to develop.

How did you approach the transformation of a humble pumpkin into a shining golden carriage?

This particular transformation sequence went through quite a few conceptual changes. In the end, the story that we wanted to tell was the greenhouse exploding into particles of dust, which would then collect to forge the carriage.

We destroyed the greenhouse and the pumpkin procedurally with our proprietary FEA (Finite Element Analysis) destruction tool, Kali. We then ran many particle simulations on top of the broken pieces to give the effect that the solid chunks were vaporised into magical golden dust before materialising to form the frame and shell of the carriage.

Was it hard to match to the practical carriage?

The practical golden carriage on set had a very ornate and complex design. We built an exact digital replica which our technical animation team stripped apart, allowing us to hand-animate the various parts so that it felt like the carriage was self-assembling in an organic and elegant way.

How much of what we see in the dress transformation is practical wardrobe, and how much digital effects?

We ran motion control shoots of Lily James spinning in the different pink and blue dresses. The dress transformation then involved stitching two different performances together. It was tricky to find a moment in both performances that blended perfectly and at the right time. We helped the 2D blend with a digital Cinderella for a few frames in the middle.

For the dress transformation, we ran lots of cloth simulations on our CG version. The dress needed to float up and feel light as it grew in size to fill the volume of the blue ballroom gown. The trick was to make the dress expand and move as if it was underwater, but at the same time stay coherent and feel part of Lily’s performance.

How did you integrate all those magical sparkles?

Once we had our cloth animation just right, we ran multiple layers of particle simulations on top of it. Butterflies fly into camera, then land on and form part of the dress. We emitted magic dust from the ground, air and butterflies as they flew in. It was important for all the dust to interact and feel like it was being influenced by the swooshing of the dress.

Digital set extension by MPC for "Cinderella" - original plate

Digital set extension by MPC for "Cinderella" - final composite

Finally, let’s talk about the story’s most memorable icon: the glass slipper. What did you do to enhance the practical shoe used on-set?

We had the challenge of matching the practical shoe, which was covered with a special coating to give it an iridescent effect. Our lookdev team did a fantastic job, firstly by developing a custom shader, and secondly by making sure that shader would give us enough artistic control when needed.

What was the most difficult slipper shot?

The moment when the Prince first puts the shoe on Cinderella at the ball. In this shot, we had to replace the practical shoe (which was a plastic prop) with our CG version, and re-create the Prince’s arm so that its movement was more coherent with the shoe.

On a side note, the practical shoe was so small that it wouldn’t fit even Cinderella! Therefore, for several shots, we had to find ways to alter the shape of the shoe and foot in an “invisible” way.

What are your feelings looking back on the show?

It was a great pleasure to work on such an iconic film. Kenneth Branagh and Charley Henley both gave us creative freedom, but at the same time challenged us with their ideas. Although the vast majority of the work was done by MPC Montreal, all our other facilities helped in different capacities. It was a great effort by everyone.

Special thanks to Darrell Borquez, Marshall Weinbaum, Riki Arnold and Jonny Vale. “Cinderella” photographs copyright © 2015 The Walt Disney Company.

Now Showing – Cinefex 141

Cinefex 141 - From the Editor's Desk

Meet Chappie, the robotic star not only of Neill Blomkamp’s new futuristic action/thriller, but also the cover star of Cinefex issue 141. Available now, this new edition of the premier magazine for visual effects professionals and enthusiasts features behind-the-scenes analysis of the latest films by leading moviemakers.

As its cover promises, our latest edition investigates the making of Chappie, in which a robot child prodigy is kidnapped by criminals and raised within their dysfunctional family. Weta Workshop provided practical props and effects, along with special effects supervisor Max Poolman. Visual effects were created by Image Engine, Ollin VFX Studio and The Embassy VFX.

Also featured in Cinefex 141 is The Hobbit: The Battle of the Five Armies, the epic conclusion to Peter Jackson’s 20-year filmmaking odyssey through J.R.R. Tolkien’s fantasy landscape. Assisting him on this final leg of the journey were visual effects supervisor Joe Letteri, special makeup and creature designer Richard Taylor, armies of artists at Weta Digital and Weta Workshop, and NZFX special effects supervisor Steve Ingram.

Next we have the exotic science fiction fantasy Jupiter Ascending, for which visual effects supervisor Dan Glass reunited with The Matrix filmmakers Andy and Lana Wachowski to realize science fiction action and spectacular cosmic realms, assisted by visual effects consultant John Gaeta and artisans at Method Studios, Double Negative, Framestore, One Of Us, BlueBolt, The Aaron Sims Company, Halon Entertainment, Mokko Studio, Rodeo FX, BUF and The Third Floor. Special effects supervisor Trevor Wood, makeup effects supervisor Jeremy Woodhead and Ironhead Studios supplied practical effects.

Wrapping up this issue of Cinefex is Unbroken, a true tale of boundless courage and survival, for which director Angelina Jolie engaged effects artists at Industrial Light & Magic, Rodeo FX, Animal Logic, Ghost VFX, Hybride and Lola VFX to replicate the 1936 Berlin Olympic venues and create intense sequences of aerial combat and sea and land ordeals.

We think it’s an amazing line-up. All the same, the list of contents isn’t quite what was originally planned for issue 141. How so? I’ll let Cinefex editor-in-chief Jody Duncan tell you more …

Jody Duncan – From The Editor’s Desk

After 36 years in business, Cinefex normally runs like a well-tuned Lamborghini. The writing team writes and the production team produces — usually — without a hitch. But, every once in a while, a wrench is thrown in, grinding the gears of our Lamborghini’s engine. Issue 141 was just such an issue.

Our plan was to do a big story on Ron Howard’s In the Heart of the Sea, which was scheduled to be released in early March — perfect timing for our mid-March issue. I saw a very early screening of the film, and I came out of it charged up and ready to go. The film was stunning, and it offered me a chance to write about something outside our typical sci-fi subject matter. Whales instead of aliens. 19th century whaling ships instead of spacecraft. I was exhilarated!

I interviewed all of the visual effects principals, as well as Ron Howard — the first time we’d managed to snag an interview with the director, despite our having covered eight of his previous films. (Cocoon was, in fact, my first writing assignment for Cinefex.) I then spent several weeks writing the article, and was within a day of sending it off to typesetting … when I got a call from Warner Bros. The studio had made a last-minute decision to change the film’s release date from March 2015 to December 2015. In the Heart of the Sea could not be featured in our March issue.

We were now looking at the prospect of about 30 empty pages in the magazine, which would only be filled if I could write a substitute article in record time. Fortunately, the loss of In the Heart of the Sea (which will be featured in our December issue) meant the gain of a story on Unbroken — a movie I had greatly admired, based on a book I had read and loved. Unbroken turned out to be a terrific story from a visual effects standpoint!

Cinefex 141 also contains coverage of the third Hobbit movie — Joe Fordham’s final foray into Middle-earth with Peter Jackson and Weta — as well as his story on the Wachowski siblings’ latest extravaganza, Jupiter Ascending. Our cover boy is Chappie, a charming fellow I came to know through interviews with director Neill Blomkamp, visual effects supervisor Chris Harvey and a host of other visual effects artisans. Fortunately, our engines were purring throughout the writing and production of all three articles. As for the fourth — Unbroken was worth losing a few hours’ sleep, and I can look forward to a light workload come the 2015 Christmas season, because one of my articles for our final issue of the year is in the bag. I think I’ll go shopping.

Thanks, Jody – enjoy that shopping spree when it comes!

As for issue 141, the time has come to stop talking and start reading. Use the links below to access the latest Cinefex in your favourite format.

Oh, and if you’re a robot, please note that Cinefex is optimised for human eyesight, so please make the appropriate adjustments to your optical sensors.

Blade Runner Returns

"Blade Runner" Returns

In the autumn of 1982, I seated myself in a darkened cinema and waited for Ridley Scott to transport me 37 years into the future. I was seventeen years old, a card-carrying movie geek, and almost beside myself with excitement. Would the film I’d been anticipating all year live up to my high expectations?

The film, of course, was Blade Runner, and it didn’t disappoint. Even before the titles had finished rolling, I’d been seduced by the haunting tones of Vangelis’s sultry score. When the opening shot faded up – a spectacular moving vista in which flying cars soared above a fiery, industrialised future Los Angeles – I gasped.

Come to think of it, I think that most of Blade Runner’s future city shots made me gasp. Through the course of the film, I felt myself swept bodily into that glistening, neon-lit metropolis. I could feel its rainfall stroking my face. I could feel its smoke choking my lungs. I’d never felt so immersed in an imaginary world.

I embraced the people of the future, too. Rick Deckard was Harrison Ford like I’d never seen him before: no dashing, romantic hero this, but a cynical, downtrodden gumshoe. I fell head-over-heels in love with Sean Young as the immaculate Rachel, and with Daryl Hannah as the primal Pris. Most arresting of all was the leader of those renegade replicants, Roy Batty, played by Rutger Hauer in a performance that moved effortlessly from chilling to heartbreaking and back again.

Yet it was the ground-breaking visual effects work that spoke to me most clearly. Created by Douglas Trumbull, David Dryer, Richard Yuricich and the rest of the team at Entertainment Effects Group, they were simply breathtaking. What’s more, they blended seamlessly with the live-action that had been shot on the Warner Brothers backlot in Burbank. Was that real rain I was seeing, or some kind of animated effect? Where was the join between the full-scale set and the matte painting? The futuristic environment conjured by Blade Runner was so soaked in atmosphere, and so unbelievable in its complexity, that I fell for it hook, line and sinker. I’d never seen anything like it before.

Frankly, I don’t think I’ve seen anything like it since.

This miniature cityscape for "Blade Runner" was constructed on its side, so as to be aligned correctly for the camera

“In order to get aerial views of some of the cityscapes, the miniature structures were tilted sideways and aligned individually at varying angles so as to appear correct to the barrel distortion of the camera’s wide-angle lens. Numerous in-camera passes were required to balance external and practical lighting. Separate multi-pass film elements were also created for the various billboard and spinner insertions. Like most of the other miniature work, the cityscapes were filmed in smoke and augmented optically with rain.” Original caption: “2020 Foresight” by Don Shay, Cinefex 9, July 1982.

Now, in the spring of 2015 – just four years before Blade Runner’s predicted future is due to arrive, and just weeks after Alcon Entertainment announced that Ford would return in a sequel to be directed by Denis Villeneuve – I find myself anticipating the film all over again. As part of its Sci-Fi: Days of Fear and Wonder season, the British Film Institute is bringing Blade Runner back to cinemas across the UK.

In advance of the release, the BFI has prepared a brand new trailer. Here’s what director Ridley Scott had to say about it:

“The Final Cut is my definitive version of Blade Runner, and I’m thrilled that audiences will have the opportunity to enjoy it in the way I intended – on the big screen. This new trailer captures the essence of the film and I hope will inspire a new generation to see Blade Runner when it is re-released across the UK on 3 April.”

The version that’s being released theatrically is the 2007 digitally remastered Blade Runner: The Final Cut, which is different to the 1982 original in a number of crucial respects. For example, it lacks both the tacked-on happy ending and the controversial Deckard voiceover (regarded by many as clumsy and unnecessary). Equally controversial is the most notable addition: a Deckard dream sequence featuring a unicorn. The unicorn’s appearance suggests – via Deckard’s uneasy relationship with his detective colleague, Gaff – that our hero may be a replicant himself …

Blade Runner: The Final Cut also features myriad other changes, including tweaks to both edit and soundtrack, a dusting of new shots, and a number of “fixes” and upgraded visual effects, executed primarily by The Orphanage, supervised by Jon Rothbart, with additional shots supplied by Lola VFX.

I asked Stu Maschwitz, co-founder of The Orphanage, what it was like treading on the hallowed ground of Los Angeles, 2019:

I’m very proud of The Orphanage’s work on Blade Runner: The Final Cut. We all truly felt a sense of reverence, working to preserve a film that meant a lot to us, and everyone involved was completely committed to doing the work at the highest possible quality. It’s a touchy thing, trying to tastefully update a classic and beloved film, but The Final Cut is, in my opinion, a perfect example of how to do it right.

One of the first questions I asked when I found out we were doing the work was, “Are we going to paint out the Steadicam shadow in the final chase through the Bradbury building?” Being a huge fan of Blade Runner, that camera shadow was something I’d seen, and wondered about, a hundred times. The answer was yes, and it was an incredibly difficult shot, replacing an entire wall behind layers of shadow and aerial haze, tracking through the complex warp of an anamorphic lens.

We did all that work in Flame, on a 4K scan of an interpositive that was the highest quality original they could find. We were almost done when the production managed to locate the original negative. We scanned that at 4K and started the work completely over from scratch! That’s how committed to doing it right everyone was on the production.

A police spinner comes in to land in Ridley Scott's "Blade Runner"

Are all the changes in The Final Cut necessary? The question is moot. This version exists, so live with it. Personally, I like The Final Cut best out of all the versions of this timeless classic. But I’d be just as happy to watch the original when Blade Runner appears on cinema screens again next month. I’m just glad of the chance to submerge myself once more in that dark and dazzling world of future noir.

The reason for my enthusiasm is simple. When you leave a showing of Blade Runner, the only possible thing you can say is to echo the words of Roy Batty during the film’s closing scenes, as he sits on the rooftop beneath the tears of that endless, future rainstorm:

“I’ve seen things you people wouldn’t believe.”

What are your memories of Blade Runner? Were you there in 1982, or are you one of the millions who discovered this sci-fi classic later on home video, DVD or Blu-ray? Which version do you prefer?

And here’s the biggie … IS Rick Deckard a replicant?

L is for Lidar

In the VFX ABC, the letter “L” stands for “Lidar”.

Making movies has always been about data capture. When the Lumière brothers first pointed their primitive camera equipment at a steam locomotive in 1895 to record Arrivée d’un train en gare de La Ciotat, what were they doing if not capturing data? In the 1927 movie The Jazz Singer – the first full-length feature to use synchronised sound – when Al Jolson informed an eager crowd, “You ain’t heard nothin’ yet!”, what was the Warner Bros. microphone doing? You guessed it: capturing data.

Nowadays, you can’t cross a movie set without tripping over any one of a dozen pieces of data capture equipment. Chances are you’ll even bump into someone with the job title of “data wrangler”, whose job it is to manage the gigabytes of information pouring out of the various pieces of digital recording equipment.

And in the dead of night, if you’re very lucky, you may even spy that most elusive of data capture specialists: the lidar operator.

Lidar has been around long enough to become commonplace. If you read behind-the-scenes articles about film production, you’ll probably know that lidar scanners are regularly used to make 3D digital models of sets or locations. The word has even become a verb, as in, “We lidared the castle exterior.” Like all the other forms of data capture, lidar is everywhere.

But what exactly is lidar? What does the word stand for, and how do those scanners work? And just how tough is it to scan a movie set when there’s a film crew swarming all over it?

To answer these questions and more, I spoke to Ron Bedard of Industrial Pixel, a Canada-based company with incorporated offices in the USA, which offers lidar, cyberscanning, HDR and survey services to the motion picture and television industries.

Ron Bedard lidar scanning in Toronto for “Robocop” (2014)

What’s your background, Ron, and how did you get into the lidar business?

I was a commercial helicopter pilot for 17 years, as well as an avid photographer. During my aviation career, I became certified as an aircraft accident investigator – I studied at Kirtland Air Force Base in New Mexico. I also got certified as a professional photographer, and following that as a forensic photographer.

At my aircraft accident investigation company, we utilised scanning technology to document debris fields. We used little hand-held laser scanners to document aircraft parts, and sent the data back to the manufacturers to assess tolerances.

How did you make the leap then into motion pictures?

The transition wasn’t quite that abrupt. Local businesses started to find out that I had scanners, and we began to get calls, saying, “Hey, we make automotive parts, and we have this old 1967 piston head, and we want to start machining them. Can you scan this one part and reverse engineer it for us?” Or there were these guys who made bathtubs, who said, “We don’t want to use fibreglass any more.” So we scanned their tubs to create a profile for their CNC machine.

Carl Bigelow lidar scans a Moroccan market set

Let’s talk about lidar. What is it, and how does it work?

Lidar means light detection and ranging. It works by putting out a pulse, or photon, of light. The light hits whatever it hits – whether it’s an atmospheric phenomenon or a physical surface – and bounces back to the sensor, which then records the amount of time that it’s taken for that photon to return.
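To put that time-of-flight principle in concrete terms, here’s a tiny Python sketch of the underlying arithmetic – purely illustrative, and in no way Industrial Pixel’s actual software:

```python
# Time-of-flight ranging: the distance to a surface follows from the
# round-trip travel time of a light pulse. Hypothetical illustration
# of the principle Ron Bedard describes.

C = 299_792_458.0  # speed of light, metres per second

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance to the surface that reflected the pulse.

    The pulse travels out and back, so the one-way distance
    is half the round trip.
    """
    return C * round_trip_seconds / 2.0

# A return after roughly 66.7 nanoseconds means a surface about 10 m away.
print(range_from_return_time(66.7e-9))  # ≈ 10 metres
```

The striking part is the timescale involved: centimetre accuracy demands timing resolution in the tens of picoseconds.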

Does a lidar scanner incorporate GPS? Does it need to know where it is in space?

Only if your lidar sensor is physically moving, or if it is incorporated into the scanner system, because the lidar is always going to give you the XYZ information relative to the sensor. Most terrestrial-based lidar systems are predicated on the sensor being in a single location. If you’re moving that sensor, you have to attribute where that sensor is in three-dimensional space so you can compensate the XYZ values of each measurement point. That’s commonly used in airborne lidar systems.
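The compensation Bedard describes can be sketched very simply. This is a hypothetical illustration only, and it deliberately ignores sensor orientation, which a real airborne system must also track:

```python
# Moving-sensor compensation: every lidar return is an XYZ value
# relative to the sensor, so a moving platform (airborne lidar, for
# instance) must add the sensor's own position to each point to place
# it in world space. Hypothetical sketch; rotation is ignored.

def to_world(point_sensor, sensor_position):
    """Translate a sensor-relative point into world coordinates."""
    return tuple(p + s for p, s in zip(point_sensor, sensor_position))

# A return 1 m ahead of a sensor that is itself at (10, 0, -1):
print(to_world((1.0, 0.0, 0.0), (10.0, 0.0, -1.0)))  # (11.0, 0.0, -1.0)
```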

What kind of data does the scanner output?

Every software suite does it a little differently, but they all start with a point cloud. We do offer a modelling service, but primarily what we end up providing our clients is an OBJ – a polygonal mesh created from the point cloud – as well as the raw point cloud file.
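The OBJ format Bedard mentions is plain text, which is why it travels so easily between software suites. As a minimal hypothetical sketch, a raw point cloud maps onto OBJ vertex records like this – the real deliverable would also carry the face records of the resurfaced polygonal mesh:

```python
# A point cloud expressed as OBJ text: each point becomes one
# "v x y z" vertex record. Minimal hypothetical sketch; an actual
# mesh deliverable adds "f" (face) records connecting the vertices.

def point_cloud_to_obj(points):
    """Return OBJ text containing one vertex line per point."""
    return "".join(f"v {x} {y} {z}\n" for x, y, z in points)

print(point_cloud_to_obj([(0.0, 0.0, 0.0), (1.5, 2.0, -0.5)]))
```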

It sounds like a lot of data. How do you manage it all?

Our scanner captures over 900,000 points per second. And a large movie set may require over 100 scans. That generates a massive amount of data – too much for a lot of people to work with. So we provide our clients with the individual point clouds from each of the scans, as well as a merged point cloud that has been resurfaced into a polygonal mesh. So, instead of making the entire model super-high resolution, we create a nice, clean scene. Then, if they want some part at higher resolution, they let us know and we create it from the original raw point cloud. If they have the point cloud themselves, they just highlight a certain area and work from that.
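To give a sense of just how massive that is, here is some back-of-envelope arithmetic using Bedard’s figures – the scan duration and bytes-per-point values are my own assumptions, purely for illustration:

```python
# Rough data volume for a large set, from the figures in the interview
# (900,000 points per second, over 100 scans). Dwell time per scan
# position and storage per point are assumed values, not quoted ones.

points_per_second = 900_000
minutes_per_scan = 5        # assumption: dwell time at each position
num_scans = 100
bytes_per_point = 16        # assumption: XYZ floats plus intensity

total_points = points_per_second * minutes_per_scan * 60 * num_scans
total_gb = total_points * bytes_per_point / 1e9
print(f"{total_points:,} points ≈ {total_gb:.0f} GB")
# 27,000,000,000 points ≈ 432 GB
```

Tens of billions of points from a single set: no wonder clients prefer a clean merged mesh, with the raw cloud held in reserve for close-up work.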

So you’re effectively giving them a long shot, together with a bunch of close-ups.


Is lidar affected by the weather?

Rain can create more noise, because anything that affects the quality of the light will affect the quality of the scan data. And wet surfaces have a layer of reflectivity on top. Then there are the effects of the weather on the technology itself. Our modern system has a laser beam that comes out of the sensor and hits a spinning mirror, bouncing the light off at 90°. So if you get a raindrop on that mirror, that can certainly affect where the photons are travelling to.

How do you get around that?

Well, here on the west coast, if you can’t scan in the rain, basically you’re not scanning from November until April! We’ve built a rain umbrella system for our scanners, so we can scan in the rain. We obviously can’t scan directly straight up, but we can point upwards at about a 60° or 70° angle, and all the way down to the ground.

Is cyberscanning an actor the same as lidar?

No, it’s completely different. You have to think of lidar as a sledgehammer – the point cloud generated is not of a high enough resolution to be able to capture all those subtle details of the human face. So when it comes to scanning people, there are other technologies out there, such as structured white light scanning or photogrammetry, which are better suited to the task.

Do you find actors are used to the process of being scanned now?

For the most part, I think they are. I think there’s still some caution. It’s not that the technology is new – it’s more about the ability to re-create somebody digitally. There are some people who have cautions about that, because they’re never sure how their likeness might be used in the future.

Do they worry about safety?

When laser-based systems first started being utilised on film, there was a lot more hesitation from a personal safety point of view. But the amount of ordinary white light that’s being emitted from our little hand-held scanners is less than a flashlight. I have had people say, “I can feel the scanner entering into me!” And I say, “No, you can’t!” So there is still a little bit of magic and mystery to it, but that’s only because people don’t know exactly what it’s doing.

A mailroom is lidar scanned to capture the conveyor system prior to creating digital set extensions

Tell us about photogrammetry.

With photogrammetry, you take enough photos of a subject that you have a lot of overlap. Then you use software to look for common points within each of the images – the software can tell where that pixel is in each image, and its relationship to each neighbouring pixel.

One of the challenges with photogrammetry is that there is no sense of scale. If you have one set of images of a full-scale building, and another of a miniature building, the software isn’t smart enough to figure out that one is smaller than the other. It just re-creates the three-dimensionality.

So you have to cross-refer that with survey data?

Yes. Or you try to place something in the images, like a strip or measuring tape, so that when you’re creating your photogrammetric model, you can say, “Hey, from this pixel to that pixel is one metre.” You can then attribute scale to the entire model. Lidar, on the other hand, is solely a measurement tool and accurately measures the scale.
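That trick with the measuring tape reduces to a single ratio. Here’s a hypothetical Python sketch of the idea – the point values are invented for illustration:

```python
# Attributing scale to a photogrammetric model: if two reconstructed
# points correspond to marks a known real-world distance apart (say,
# the ends of a one-metre strip placed in the scene), the whole model
# is scaled by known distance / reconstructed distance. Hypothetical sketch.
import math

def scale_factor(p_a, p_b, known_distance):
    """Ratio that maps model units onto real-world units."""
    return known_distance / math.dist(p_a, p_b)

def apply_scale(points, s):
    """Uniformly scale every point in the model."""
    return [tuple(c * s for c in p) for p in points]

# Tape ends reconstruct 0.5 model units apart, but are 1 m in reality:
s = scale_factor((0.0, 0.0, 0.0), (0.5, 0.0, 0.0), 1.0)   # 2.0
scaled = apply_scale([(1.0, 2.0, 0.0)], s)                 # [(2.0, 4.0, 0.0)]
```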

When you’re working on a feature film, would you typically be hired by the production, or by an individual VFX company?

Every job is a little different. It usually works out to be about fifty-fifty.

Is there such a thing as a typical day on set?

No. Every day is a new day, with new challenges, new scenes, new sets, new people. That’s part of the beauty of the job: the variety. You’re not showing up to work Monday to Friday, 9 to 5, sitting in a cubicle and pushing paper.

Do you get a slot on the call sheet, or do you just scurry around trying not to get in people’s way?

If we’re doing lidar, nine times out of ten we’re there when nobody else is there. If we’re trying to create our digital double of the set with people running around, that creates noisier data and possible scan registration issues. So we do a lot of night work, when they’ve finished filming.

If we’re on location, scanning an outdoor scene downtown for example, usually the night-time is best anyway, for a couple of reasons. First, you’re going to get a lot less interference from people and traffic. Second, if there are lots of skyscrapers with glass facades, you can get a lot of noise in the scanning data as the sun is reflecting off the buildings.

You must be constantly up against the clock, having to get the scans done before the sets are struck.

Yes. A lot of times, we’ll actually be in there scanning while they’re breaking the set down! We just try to be one step ahead. We’re used to it – it’s just the nature of the business. There’s such a rapid turnaround now as far as data collection is concerned. You’ve just got to get in and get out.

So it’s all about mobility and fast response?

Exactly. One of the things that our customers really appreciate is our ability to be very portable. All of our systems – whether it’s cyberscanning or lidar – pack up into no more than three Pelican cases. And we can be on a plane, flying anywhere in the world.

Ron Bedard and Ian Galley lidar scan the 300-foot naval destroyer HMCS “Haida” in Hamilton, Ontario

Is it hard to keep up with scanning technology as it develops?

Oh, absolutely. We’re dogs chasing our tails. With today’s rapid advancements, if you can get three years out of a technology, maybe four, you’re lucky.

Is there any future or near-future piece of technology you’ve got your eye on?

I think photogrammetry is really making a comeback. It’s been used ever since cameras were invented, and from an aerial survey point of view since long before World War II. But it’s made a real resurgence of late, and that really has to do with the resolution of the sensors that are now available. Now that you’re talking high numbers of megapixels, you’re able to get much finer detail than you were in days past.

As these high-density sensors come down in price, and get incorporated into things like smartphones, I think we’ll see 3D photography – in combination with a sonar- or a laser-based system to get the scale – really hitting the market hard.

And what does the future hold for lidar?

I think flash lidar will become much more prevalent. Instead of a single pulse of light, flash lidar sends out thousands of photons at once. It can fire at a really rapid rate. They use flash lidar on spacecraft for docking. They use it on fighter jets for aerial refuelling. You’re starting to see low-cost flash lidar systems being incorporated into bumpers on vehicles for collision avoidance.

So what are the benefits of flash lidar for the film business?

When you’re trying to do motion tracking, instead of putting balls on people and using infra-red sensors, you can use flash lidar instead. It is much more versatile in long-range situations. You can create an environment with flash lidar firing at 24 frames per second, and capture anyone who walks within that environment. That’s something I know we’re going to see a lot more of in the future.

Alex Shvartzman uses a handheld structured light device to scan a horse

What’s the weirdest thing you’ve ever had to scan?

Everything’s weird. We’ve scanned horses. We’ve scanned dogs. The beauty of working in film is that one day we can be scanning a Roman villa, and that evening be scanning the set of some futuristic robot movie.

Animals are tricky because each one is different, and you never know how they’re going to react to the light source. We scanned around thirty horses for one particular job, and some of them were happy and docile, and some of them reacted as soon as the scanner started.

Another challenging question we were asked was, “Can you scan a boat that’s floating out in the open sea?” I thought about it and said, “Sure you can. You’ve just got to have the scanner move the same way the boat’s moving.” We built a custom rig so that the scanner was constantly moving with the boat, and we hung it out over the edge of the boat and scanned the whole hull.

Lidar providers are among the many unsung heroes of movies. Do you ever crave the limelight?

No. In the end, our job is to provide solutions for our customers. For us, that’s the reward. When they’re happy, we’re happy.

Cinefex 141 Cover Reveal

Cinefex 141 Cover

The wait is almost over. The brand new issue of Cinefex hits the newsstands next week. On the cover is the cybernetic star of Chappie, Neill Blomkamp’s action/thriller about a robot child prodigy kidnapped by criminals and raised within their dysfunctional family. Weta Workshop provided practical props and effects, along with special effects supervisor Max Poolman. Visual effects were created by Image Engine, Ollin VFX Studio and The Embassy VFX.

The other big movies featured in Cinefex 141 are The Hobbit: The Battle of the Five Armies, Jupiter Ascending, and Unbroken. So what are you waiting for? Order your copy now!

“ATROPA” – Q&A with Eli Sasich

"ATROPA" - science fiction short by Eli Sasich

Lone off-world detective Cole Freeman stumbles on a giant vessel adrift in the depths of space. It’s the ATROPA, the very ship he’s been pursuing, but something is wrong. He shouldn’t have caught up with it for another 98 days. Waking the vessel’s crew from cryosleep only deepens the mystery, and when the ATROPA collides with something unimaginably strange, confusion turns to disbelief … and the real adventure begins.

This cliffhanger note marks the end of ATROPA, a sci-fi short directed by Eli Sasich. The suspenseful climax is deliberate, because the film is in fact merely a proof-of-concept for a feature film currently being pitched to studios.

In this Q&A for Cinefex, Sasich discusses the making of ATROPA, his first film project since festival favourite HENRi, in which a robot discovers there may be more to life than mere artificial intelligence.

Where did the story concept for ATROPA originate?

The story really came out of a thought experiment: what would happen if you (literally) crashed into yourself in space? How might something like that be possible, and how would you deal with it? What would be the physical and emotional toll? It’s like a Twilight Zone episode – one of those classic sci-fi genre tropes – but the writer, Clay Tolbert, and I found a unique way into it.

Taking that as a jumping-off point, we then formed the story around our main characters, Cole and Moira. Strip everything else away, and it’s a love story: how two people rekindle a dying relationship under the most extreme circumstances – kind of like The Abyss. We have these big sweeping ideas about time and fate, but it’s the character story that really excites me.

You made the short film as a way of advancing the feature-length project. Why go to all that trouble?

As a director without a feature credit, I felt like I needed something to show I could effectively work with actors and small budgets. HENRi was like a master’s program for me – I learned a ton, and it took two years to complete – but it was a very different type of film. We only had a few actors, and I was dealing mostly with miniatures and effects. It was more precision craft and less spontaneous problem-solving. It’s all the same process, but the challenges were different. I really wanted to get back and shoot something more conventional again.

It’s risky – you have to make it look and feel like a professional feature film, but you have a fraction of the time and money. Being able to show an executive or financier what you are talking about is a huge advantage, as long as it lives up to expectations. Luckily, we had an awesome crew who stepped up to the challenge, and we were able to make something that portrayed the tone and mood I was going for.

"ATROPA" concept art

“ATROPA” concept art

Can you tell us anything more about the feature?

I can say that the film explores the ideas of fate and free will, actions and consequences. That’s one of the things I love about science fiction – the ability to explore big philosophical ideas in an organic way.

What stage are you at with the feature development?

We are actively pitching the feature right now – I’m really excited to say that we are in discussions with Pukeko Pictures to develop and produce it. Pukeko Pictures is a sister company to Weta Workshop; it was founded in 2008 by Sir Richard Taylor, Tania Rodger, and Martin Baynton. I couldn’t be more excited by the possibility of collaborating with such an amazingly talented and creative team.

What budget and timescale did you work to for the short film?

There was very little money for the short. We had three weeks of pre-production, we shot it in two days, and finished the entire film for right around $10,000. Most of that money came from HENRi sales, which I’m very thankful for.

During the production of the ATROPA short, did you apply any lessons you’d learned while making HENRi?

You learn from every project – sometimes the hard way. Keeping calm, trusting my intuition, and adapting to unexpected problems were important lessons from HENRi that will stick with me for the rest of my career. In a more technical realm, the post-production effects workflow was something I really got a good handle on with HENRi, and that helped immensely when it came time to finish ATROPA quickly.

"ATROPA" concept art

“ATROPA” concept art

Let’s talk about the design of the film. Who did the concept art?

We had eight or nine pieces of concept art done by artists from around the world. Ioan Dumitrescu worked mostly on the designs for the ATROPA and Cole’s ship, the Morinda. Mike Sebalj and Roger Adams did some really nice character and environment designs for us. We also had storyboards done for all our exterior space sequences – artist Jean Claude De La Ronde provided those.

Did you use any specific design cues for the two spaceships?

For the Morinda, I had a vague idea of the shape I was after. The articulated dual engines or nacelles were a concept I had for a different type of propulsion/steering mechanism. I thought it would be a unique and intuitive way to depict roll and pitch. That was really born out of the idea that there is no “up” or “down” in space. When the Morinda approaches the ATROPA, it’s not on the same plane, meaning Cole has to reorient his ship to the proper course. I really wanted to depict that, because it seems that every time I see an approach sequence in sci-fi, the two vessels are always perfectly aligned with each other.

The ATROPA was more difficult. We explored many different shapes and sizes – some were pretty out there in terms of design. Ultimately, it came down to finding something that fit within our world. We needed the ATROPA to quickly read as a large industrial ship to the audience. Obviously we took some design cues from the Sulaco from Aliens – but that bold, elongated shape seemed to read the best in the short amount of screen time.

You shot the film on the standing spaceship set at Laurel Canyon Stages in LA. Why did you choose that particular location?

I had visited the Laurel Canyon stages years earlier as a possible location for HENRi, long before we decided to use quarter-scale miniatures. I had always wanted to shoot there – it has that wonderfully gritty and grimy look that I love. It also perfectly evokes ’70s and ’80s sci-fi, which was the ideal world for ATROPA.

Did you adapt the Laurel Canyon set to give it your own personal stamp?

That set has been used in thousands of projects, so it’s pretty easy to spot once you know what you’re looking for. Since we couldn’t change the set in any way, other than some rearranging of props, it was really important to use lighting and shot composition to make the space our own. Our director of photography, Greg Cotten, did an amazing job of capturing the set in a different way than it’s usually shot.

"ATROPA" was shot on the standing spacecraft set at Laurel Canyon

“ATROPA” was shot on the standing spacecraft set at Laurel Canyon

For example, we made the conscious choice not to light from the ceiling grates, which creates amazing texture and patterns, but it’s also how everyone seems to light that set. We weren’t afraid of letting things fall off into darkness either – it created mystery and tension and hid some of the less camera-friendly elements of the space.

In terms of shot composition, we tried to keep things wide and cinematic wherever possible, and we motivated camera moves when we could, to keep things interesting. Specific original set pieces were also built and utilised to give us our own unique identity.

Tell us about some of the set pieces you brought in.

Our production designer, Alec Contestabile, designed and built the hologram board in Cole’s ship, the cryosleep pods, and the table in the mess hall, matching the look and feel of what already existed at Laurel Canyon. The hologram board was a wooden box with a glossy tabletop, and practical LED lights hidden in the inner lip. The LEDs helped sell the effect of the chessboard by providing interactive lighting on Cole’s face.

Production designer Alec Contestabile created a number of bespoke set pieces to enhance the Laurel Canyon set, including the holographic table in the cockpit of the “Morinda”

The cryosleep pods were made out of foam-core board, cardboard, and various pieces of junk – tubes and wiring – all attached to a wooden frame. They only weighed about 15 pounds, so one crewmember could move them. The table in the mess hall was fashioned by bolting two plastic pallets together and covering them with scrap pieces of Plexiglass.

We lit the table practically from the inside. Alec carefully placed six iPads on a shot-by-shot basis, which we used to loop tech graphics. It was nice to get some in-camera data screens, and it gave the actors something to look at and interact with. Alec worked wonders with the production design, all with essentially no money. He used junk and whatever odds and ends we could find lying around to create a completely believable world.

Eli Sasich and Anthony Bonaventura on the set of “ATROPA”

The performances are uniformly calm and measured. Did the cast get to read the whole feature script before working on the short, to help them build their characters?

It was important to me that everything seemed fairly routine to these characters, until the big reveal at the end – which is certainly not routine. Because the short had to set so many things up, it’s a heavily truncated version of the actual first act of the script. The dialogue had to get more information across, and things certainly happen faster. I did speak with each actor about the journey their character takes in the feature, but they never read the full script. The cast was fantastic, and they were great to work with. We had to shoot fast, down and dirty, and they were always prepared and willing to do so.

Visual effects for "ATROPA" were created by The Light Works

Visual effects for “ATROPA” were created by The Light Works

What was the workflow for the visual effects?

Tobias Richter and his team at The Light Works did all the exterior spaceship effects. After working with Jean Claude to storyboard the space sequences, we handed those boards off to Tobias and his team. They would return with low-resolution animatics of each shot, which I would then give notes on. The Light Works is located in Germany, so all of post was coordinated via email and Skype. It was a truly seamless process, and they did an amazing job for us.

For the few shots that included live action elements – like the pull-back from Cole’s ship – we coordinated ahead of time, providing photo reference and measurements. In terms of direction, I provided examples of various shots I liked. I would reference lighting and compositing elements from different films, and we would work off of those ideas. Interestingly enough, the finished shots wouldn’t feel right until we added imperfections – slight camera shake, lens distortion, grain and tasteful flares.

That pull-back shot from the Morinda is quite complex. How did you put it together?

It was a difficult shot for several reasons. First and foremost, we didn’t have the space to dolly back from the cockpit set, let alone make the turn around the side. So we shot the plate of Cole in the cockpit as a wide lock-off. Tobias and his team then projected that live action footage onto a 2D card within a low-poly CG cockpit set which they had modelled.

The pull-back was done in the computer, but since we didn’t have a perspective shift on our live-action footage, wrapping around the side of the ship posed many challenges. We ended up doing a hand-off to a CG double of Cole, using the cockpit window strut as a natural wipe once we reached about a 45-degree angle. The effect works fairly seamlessly, and was a really crafty way to fake a very complicated move.

How did you track the 3D chessboard graphics into the live-action plate?

Our VFX supervisor, Ryan Wieber, did the 3D chessboard and proximity alert display, as well as all of our compositing. He was able to track and create these completely believable shots without greenscreen – because there wasn’t enough room to light the screen – and usually without any tracking markers.

The chessboard hologram was an incredible effect. We had a set piece with built-in practical lighting, but the LEDs were visible in many shots, and quite distracting. Ryan ended up replacing the top of the chessboard in every shot, so the lights were covered. He then built the hologram projection in Adobe After Effects, utilising Element 3D for the chess pieces, together with layer upon layer of compositing tricks. The opening shot, where we pull out of the holographic chess piece, was a completely digital camera move up until the tilt-up to Cole.

For the cockpit shots, Ryan rotoscoped Cole and added in glass and stars. He has an incredible eye, and an amazing design sense – he also created all of our display graphics and screens. Ryan was instrumental in helping create a believable sci-fi world – I call him the magician!

The music has a grand, epic quality. Was that a deliberate creative choice?

I love film music – especially big orchestral scores. I think we’ve lost a bit of the art of film scoring today. Music has become filler noise, and the use of strong themes has strangely gone out of style. Our composer, Kevin Riepl – who also scored HENRi – feels the same way. Kevin and I share a passion for the same types of film scores, as well as the philosophy that themes should develop just like characters over the course of a story.

We certainly didn’t have the budget to record a live orchestra for ATROPA, but I really wanted that epic feel. Fortunately (and somewhat ironically), Kevin did the score for the videogame Aliens: Colonial Marines, which was live orchestra. We had access to all the stems from that recording session, so he rearranged and remixed them, and added some electronic elements to create something new. The music is barely recognisable from the game, and it gave us the big, live, cinematic sound I was looking for. It’s another example of working within low-budget constraints and still finding ways to get what you want creatively. A bit of Aliens obviously snuck through, which is fine with me – it’s one of my favourite movies, and certainly an inspiration.

The film contains a reference to the Valley Forge, a cap that might have been worn by a member of the Nostromo’s crew … what other sci-fi in-jokes did you put in there?

Good eyes catching those Easter eggs! I like adding little in-jokes wherever possible. It’s a fun way to both acknowledge the projects that inspired you, and add little hidden elements for yourself and your friends. Most will never be seen, but there are a few more. For example, the hand-held case-file used by Cole displays “VL-426”, which is a reference to the planet “LV-426” from Alien and Aliens. In fact, all the crew ID numbers on the case-file bio pages are the original ID numbers of the Nostromo crew from Alien.

You mentioned Valley Forge – Cole’s last name, Freeman, also comes from Silent Running, in reference to the main character, Freeman Lowell, played by Bruce Dern. The logo for the ATROPA is actually the same logo for the Pythagoras ship from HENRi, only turned upside down with different colours. The sound of the Morinda’s engines was inspired by the speeder bikes from Return of the Jedi. Our sound designer, Michael Ault, created his own take on that classic sound by pitch-shifting elephant trumpets.

Holographic chess board - "ATROPA"

As well as opening the film, the chess game also appears during the end credits. Is that significant?

Chess is a battle of wits and a game of strategy; I liked the symbolism. I also wanted to foreshadow the ending in a subtle visual way: the chessboard is an exact mirror image of itself with the two sets of opposing pieces. The idea that Cole will be squaring off against himself becomes literal by the end of the short.

How do you feel about the short, now that you’re pitching the feature?

ATROPA was made possible by a very passionate and hardworking crew. The end result has definitely opened doors for us, and the response to the release online was overwhelming. We hope we can make the feature – the story goes to some really thought-provoking and unexpected places.

Do you have any other projects in development?

I have a few other projects at varying stages. I’m working with another writer on a really fun action/adventure film, which follows the oddball friendship of a couple of historical figures. It has the tone of Ghostbusters and Sherlock Holmes, with some steampunk design sense thrown in for good measure. It would be an absolute blast. I’ve also written a smaller indie film that deals with another historical figure, and a little-known fact about his death. It’s a passion-project for me, and something I’ve been kicking around for years. I’m interested in strong character stories, regardless of genre, time period, or setting.

ATROPA photographs and video copyright © Corridor Productions 2015.

Inspiring ILM

What drives people to work in the visual effects industry? The glamour? The technology? All those ravening monsters and exploding spaceships? Or is it just another job? In an ongoing series of articles, we ask a wide range of VFX professionals the simple question: “Who or what inspired you to get into visual effects?”

Here are the responses from the staff at Industrial Light & Magic.

Prehistoric Encounters

When asked what inspired him to get into visual effects, Michael DiComo, CG technology supervisor (ILM, San Francisco), was in no doubt, answering, “Simple: Jurassic Park. Seeing the effects in that movie made me realise that THAT was what I wanted to do for a living. The great thing is that, not only did I get hired by ILM only three years after Jurassic Park was released, but I also got to work on The Lost World: Jurassic Park as my second film project ever.”

Beverley Joy Ang, production engineer (ILM, Singapore), was also inspired by dinosaurs, but of a more cuddly kind.

“I remember stumbling upon this program called CorelMove during grade school,” Ang recalled. “I didn’t know how to do animations back then, but the CorelMove library had this pre-animated purple dinosaur that you could import into a scene. I had a lot of fun changing the backgrounds, adding shrubs and trees, and moving the dinosaur around. I think it’s that little purple dinosaur that kick-started my love for computer graphics.”

The prehistory of cinema has also had its influence on today’s VFX professionals.

The Wonderful World of Disney

“Colouring characters in all day – who wouldn’t love that job?” Betsy Mueller, ILM

“Growing up, my family and I used to watch The Wonderful World of Disney movies on Sunday evenings,” commented Betsy Mueller, lighting technical director (ILM, Vancouver).

“One episode was introduced with an old black-and-white clip in which Walt Disney explained what some of the traditional animation departments did. The Ink and Paint department fascinated me! Colouring characters in all day – who wouldn’t love that job? That was when I became hooked on the magic of making movies.”

Mad About the ’80s

Many people currently working in visual effects have a soft spot for the films of the 1980s, that golden age of fantasy and adventure movies that marks the high point of optical and photochemical effects.

One such person is Danielle O’Hare, senior manager of technical training (ILM, San Francisco), who remarked, “The reason I started working in VFX was because of movies like Indiana Jones, Back to the Future, and E.T.: The Extra-Terrestrial.”

Craig Hammack, visual effects supervisor (ILM, San Francisco), shared O’Hare’s love for this era, and also added a favourite from 1962 into the mix: “I loved the escapism and thrill of Star Trek II: The Wrath of Khan, and the absolute beauty of Lawrence of Arabia.”


“Gremlins” – actress Phoebe Cates serves her unusual clientele, while Chris Walas’s crew of puppeteers hunker down out of shot.

Kate Lee, layout artist (ILM, Vancouver), recalled a formative ‘80s moment with a little extra bite. “My earliest memory of a VFX movie scene was Gremlins – launching Mrs. Deagle out of the window in her wheelchair. I was too young to understand that the gremlins weren’t real, and so terrified that I couldn’t go anywhere on my own after dark for a while!”

You can’t ask a group of VFX professionals what inspires them without stumbling over at least one person who loves Star Wars.

“For me, it’s an insatiable appetite for watching movies, a love of the filmmaking process from conception to projection,” stated Daniel Cavey, production manager (ILM, San Francisco). “And of course my first movie theatre memory: my Mom taking me to see The Empire Strikes Back!”

Milestone Movies

Jurassic Park isn’t the only game-changing film that’s proved inspirational to the staff at ILM. Recalling a summer afternoon in 1992 in South Africa, Mike Jutan, animation/creature R&D engineer (ILM, San Francisco), said, “I was a 10-year-old, enthusiastically eating a snack and watching Terminator 2: Judgment Day. Somehow, I’d convinced my parents it was okay to watch it: ‘Don’t worry Mom, this version was edited for TV!’ The moment where the T-1000 melts up from the checkerboard floor cemented my life goals in a matter of seconds. I loved computers, I loved movies, and with this – the single most awesome visual effect of all time – I loved ILM.

“From there, my career and education goals revolved around combining math, movies and computer science. Fifteen years later, an ILM recruiter called and asked if I’d ever ‘considered working at ILM’. Not attempting to hold back any glee, I laughed, ‘Yes – only since I was 10 years old!’”


“The moment where the T-1000 melts up from the checkerboard floor cemented my life goals in a matter of seconds” – Mike Jutan, ILM

Colette Mullenhoff, R&D engineer (ILM, San Francisco), was also seduced by James Cameron’s metallic assassin from the future: “I always had my sights set on computer graphics for entertainment, but watching the T-1000 liquid metal cyborg in Terminator 2: Judgment Day sealed the deal.”

Another seminal movie for many VFX professionals is The Matrix. “When The Matrix came out, it inspired my high school class to re-create the 360-degree bullet-time scene,” explained Kate Lee, layout artist (ILM, Vancouver). “But we could only improvise with one person holding a camcorder in a car that was spinning around another person, who was leaning back and ever-so-slowly throwing his arms in the air. Clearly there was much to be learned about the technical side of the amazing images!”

Outside the VFX Box

It isn’t always a love of visual effects that draws people into visual effects. Craig Hammack might be a Star Trek fan, but he’s also inspired by architecture. “I have a love for the kind of experience that can be evoked by the light and form of architectural spaces. Architects like Louis Kahn, Le Corbusier and Carlo Scarpa capture my imagination through design. While searching for a way to create experiences myself – without the need for understanding civil engineering codes and electrical schematics – I discovered computer graphics, and then visual effects.”

For Johan Thorngren, CG supervisor (ILM, San Francisco), it was all triggered by a passion for military aircraft. “I grew up near a military airbase and loved watching the jets from afar,” he remembered. “I built the kit-models of military airplanes that were available at the time. Due to a random chance in my previous career, I got hold of a copy of 3DSMax and picked up model-building as a hobby again – this time digitally. This led me to explore rigging and the FX aspects related to the models, as well as propelling me into the rendering and shading side of things. So I very quickly started to get more interested in making synthetic things look real.”

An aptitude for computer science can help lead to a career in visual effects, but on its own it may not always be enough – as the experience of Wajid Raza, technical director (ILM, San Francisco), proves. “In the late ’90s, when I was in middle school, my parents got me a Pentium computer,” Raza recalled. “I was instantly hooked on the graphics packages – including an early version of Adobe Photoshop. A few years later, still fascinated with computers, I enrolled in an undergraduate computer science program.

“But during my second year of college, I started to get bored – it was not as creatively satisfying as I had hoped it to be. That was when Peter Jackson’s The Lord of The Rings came out. I was floored with the whole experience – the CG characters and epic battle scenes. I spent the next two years finding out everything about VFX, and eventually landed my dream job at Industrial Light & Magic.”


“The Lord of the Rings” – “I was floored with the whole experience – the CG characters and epic battle scenes” – Wajid Raza, ILM

However, the ultimate outside-the-box story comes from Jon Alexander, compositing supervisor (ILM, San Francisco), who puts it all down to, well, a higher power.

“I bounced around four universities, studying different sorts of engineering, but my heart was not in it,” commented Alexander. “My parents suggested I enrol at Mary Manse, the small Catholic college where my Mom was on the faculty, because tuition would be free. But I wanted to go to a film school. My Dad said, ‘Your Mom’s praying that you just get a degree.’ I snottily replied, ‘I’m praying to go to film school.’ But I enrolled anyway.

“A month or so later, the Ursuline nuns who ran the school announced that, after much prayer and because of the financial situation, it was God’s will that the school close after 50 years. My Dad called me up and said, ‘Okay, I guess it’s God’s will you go to film school, but you’ve really pissed off a bunch of nuns!’”

The Final Effect

Whatever inspired these ILM-ers to get into the business in the first place, it’s clear that, years later, their passion remains strong.

“To this day, whenever we get a new Jurassic movie in house, I get all jazzed up by the artwork, dinosaur maquettes, motion and lighting tests,” remarked Michael DiComo. “It makes me want to roll up my sleeves and be a shot-lighter again.”

“In 2003, ILM called me up and asked if I wanted to come and work on Star Wars: Revenge of the Sith,” recalled Johan Thorngren. “I was a bit hesitant, until I heard that ILM had a department that allowed people to work in many different areas at once. Now I get to jump into many different aspects of the work, and I find it really satisfying being part of a team responsible for a given shot or sequence of shots.”

Danielle O’Hare concluded, “The reason I’ve stayed at ILM is because of the incredibly talented and generous population of artists, producers, and engineers. These are people at the top of their game, who are more than happy to share what they know with their peers. It makes my job as a training manager very easy, and a whole lot of fun.”

For some, however, working in VFX poses one problem that can be insurmountable. As Kate Lee quipped: “The biggest challenge is to get my parents to understand what I do for a living.”

Industrial Light & Magic was founded by George Lucas in 1975. Since then, ILM has created visual effects for more than 250 feature films, notably the movie franchises Transformers, Iron Man, Harry Potter, Indiana Jones, Jurassic Park, Pirates of the Caribbean and, of course, Star Wars. ILM has offices in San Francisco, Singapore, Vancouver and London. Thanks to all the staff from ILM who contributed to this article.

Special thanks to Greg Grusby. “Terminator 2: Judgment Day” photograph copyright © 1991 by Carolco Pictures, Inc. “Gremlins” photograph copyright © 1984 by Warner Bros., Inc. “The Lord of the Rings: The Fellowship of the Ring” photograph copyright © 2001 by New Line Cinema.

“Interstellar” Wins VFX Oscar

Cinefex 140 "Interstellar" cover

Christopher Nolan’s space epic Interstellar has won the Academy Award for Best Visual Effects at the 87th Academy Awards, in a glittering ceremony held at the Dolby Theatre, Hollywood & Highland Center on February 22, 2015.

The award was collected by Paul Franklin, Andrew Lockley, Ian Hunter and Scott Fisher in recognition of the ground-breaking visual effects images created by Double Negative, with on-set special effects orchestrated by Scott Fisher, and other practical effects – notably a suite of large-scale spacecraft miniatures – by New Deal Studios.

Complete list of nominees:

  • Interstellar – Paul Franklin, Andrew Lockley, Ian Hunter and Scott Fisher
  • Guardians of the Galaxy – Stephane Ceretti, Nicolas Aithadi, Jonathan Fawkner and Paul Corbould
  • X-Men: Days of Future Past – Richard Stammers, Lou Pecora, Tim Crosbie and Cameron Waldbauer
  • Captain America: The Winter Soldier – Dan DeLeeuw, Russell Earl, Bryan Grill and Dan Sudick
  • Dawn of the Planet of the Apes – Joe Letteri, Dan Lemmon, Daniel Barrett and Erik Winquist

You can read the full story behind Interstellar’s award-winning visual effects in Cinefex 140. Oh, and while you’re at it, why not catch up on the rest of the nominated films in our previous two issues – links below.

K is for Kinematics

In the VFX ABC, the letter “K” stands for “Kinematics”.

If you’ve ever animated a creature using a piece of 3D software like Autodesk’s Maya or 3DSMax, you’ll know all about kinematics.

If you haven’t, here’s a quick primer …

Imagine you’re going to animate a troll. Just a regular troll: twelve feet tall with massive hands and a bad attitude. Let’s call him Tarquin.

The scene you’re going to animate requires Tarquin to reach out his hand and throttle a nearby dwarf. The dwarf’s name, by the way, is Doug.

Forward Kinematics

One way to perform the task is by using forward kinematics.

This is very tricky. Forward kinematics requires you to animate Tarquin’s arm starting from the shoulder and working your way out. In other words, you move his upper arm a bit, then adjust his forearm, then proceed to his hand, and finally manipulate those fat troll fingers.

The reason this is tricky is because what you really want to do is make sure Tarquin’s fingers connect with Doug’s neck in exactly the right place, at exactly the right time. That’s a tough call when those fingers are always the last appendage on the list of things you move.
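To make the idea concrete, here’s a rough sketch in Python – not anything from Maya or 3DSMax, just an illustration of the principle. With forward kinematics, you choose an angle for every joint from the shoulder out, and only then discover where the fingers have ended up:

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Compute the (x, y) position of each joint in a simple 2D chain.

    Each angle is relative to the previous link, so the hand's final
    position depends on every joint before it -- which is exactly why
    posing the fingertips last feels so fiddly.
    """
    x, y, total_angle = 0.0, 0.0, 0.0
    positions = [(x, y)]  # the shoulder sits at the origin
    for angle, length in zip(joint_angles, link_lengths):
        total_angle += angle
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
        positions.append((x, y))
    return positions

# Tarquin's arm: shoulder raised 90 degrees, elbow and wrist straight.
arm = forward_kinematics([math.pi / 2, 0.0, 0.0], [3.0, 2.0, 1.0])
```

Hitting Doug’s neck this way means tweaking the shoulder, re-checking the hand, tweaking the elbow, re-checking the hand – and so on, frame after frame.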

Inverse Kinematics

If all that sounds like too much hard work, you might prefer to fall back on inverse kinematics.

This is much more satisfactory. With inverse kinematics, you get to focus entirely on Tarquin’s hand, grabbing it with your cursor and moving it from its starting position to, you guessed it, the waiting neck of the poor, doomed Doug.

As for the rest of Tarquin’s arm, you simply rely on the software knowing how all the joints are interconnected, and trust it to move the entire limb accordingly. It’s like moving the hand of a jointed puppet and letting the laws of physics do the rest.
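Under the hood, an IK solver works backwards from the hand to find joint positions that reach the target. One of the simplest approaches – and only one of many; production rigs are far more sophisticated – is Cyclic Coordinate Descent, sketched here in Python as an illustration:

```python
import math

def ccd_ik(joints, target, iterations=50):
    """Tiny Cyclic Coordinate Descent IK solver for a 2D joint chain.

    `joints` is a list of (x, y) positions from shoulder to hand.
    Each pass rotates every joint in turn so the hand swings toward
    `target` -- the 'move the hand, let the software sort out the
    rest of the arm' behaviour described above.
    """
    pts = [list(p) for p in joints]
    for _ in range(iterations):
        # Work from the wrist back towards the shoulder.
        for i in range(len(pts) - 2, -1, -1):
            px, py = pts[i]
            hand = pts[-1]
            # Rotation about joint i that aims the hand at the target.
            rot = (math.atan2(target[1] - py, target[0] - px)
                   - math.atan2(hand[1] - py, hand[0] - px))
            cos_r, sin_r = math.cos(rot), math.sin(rot)
            # Rotate every downstream joint about joint i.
            for j in range(i + 1, len(pts)):
                dx, dy = pts[j][0] - px, pts[j][1] - py
                pts[j][0] = px + dx * cos_r - dy * sin_r
                pts[j][1] = py + dx * sin_r + dy * cos_r
    return [tuple(p) for p in pts]

# Pose Tarquin's outstretched arm so the hand lands on Doug's neck at (2, 3).
arm = [(0.0, 0.0), (3.0, 0.0), (5.0, 0.0), (6.0, 0.0)]
posed = ccd_ik(arm, (2.0, 3.0))
```

Because each step is a pure rotation, the link lengths stay fixed – the arm bends, it never stretches.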

Of course, it’s not quite as simple as that. If you want to get any kind of expression into the movement (and, being an animator, that’s precisely your aim) you have to go in and make adjustments to the gross movement that’s been determined by the software. You know, all those small accelerations and delays, not to mention the unexpected muscle twitches caused by Tarquin’s predilection for strong ale.

If you have a friendly rigging team, they might cause certain secondary movements to happen automatically. But it’s still up to you to coax a performance out of your troll.

If you want to learn more about kinematics, you’ll need to dig deep into a software tutorial, like the one in this Maya training video:

However, before you get lost down a rabbit hole filled with IK handles and pole vectors, let’s take a moment to consider the long and illustrious history of the word kinematics. Along the way, we might learn what it actually means.

Kinetographs and Kinetoscopes

One of the earliest devices developed for the presentation of moving pictures was the Kinetoscope. Both it and its counterpart, the Kinetograph, were created in the US at the Edison Lab during the 1890s.


Edison’s Kinetoscope, one of the earliest motion picture viewers, was developed during the final decade of the 19th century.

The Kinetoscope presented moving images to its viewers by causing a strip of perforated celluloid film to pass in front of a light source at high speed, with each successive image on the film being isolated by a moving shutter. If that sounds like a movie projector to you, you’re nearly right.

Although the Kinetoscope contained all the essential components of a typical film projector, it was actually a peephole device, and thus could be viewed by only one person at a time.

As for the Kinetograph, that was the camera used to create the celluloid images in the first place.

By the turn of the century, a number of other inventors had jumped on the motion picture bandwagon, including Louis and Auguste Lumière, whose Cinématographe machine was capable of displaying projected moving images to a large audience.

Kinematics in the Kinema

The root of all these words is the Greek word “kinema”, which means “motion”. As with many Greek words, during its journey across Europe and beyond, the “k” has been transformed into a “c”.

Which is why nobody goes to the kinema any more.

However, you can still find the word “kinema”, in all its derived forms, if you look hard enough. The technical and craft organisation that is the British Kinematograph, Sound and Television Society (BKSTS) – originally formed in 1931 as the British Kinematograph Society – is still going strong.

Then there’s the American Society of Cinematographers (ASC), a non-profit organisation dedicated to the art of filmmaking. Its magazine, American Cinematographer, was first published in 1920. It’s still going strong too.

Finally we have kinematics, that esoteric aspect of the modern art of animation, the mastery of which demands the application of both artistic sensibilities and technical smarts.

And the name of which contains a pleasing echo of the long history of the motion picture craft.