Kubrick’s Aries 1B

Sunday morning, Deadline Hollywood broke the news:

Academy Museum Buys Rare ‘2001: A Space Odyssey’ Model For $344,000

"2001" key art by Robert McCall © Metro-Goldwyn-Mayer.

“2001” key art by Robert McCall © Metro-Goldwyn-Mayer.

Fans were stunned. As any Stanley Kubrick aficionado will tell you, legend has long held that all the spaceship miniatures from Kubrick’s landmark science fiction film were destroyed after filming at the filmmaker’s request, to prevent their recycling in cheap imitations. Could this be the real McCoy?

Before the facts were known, a small studio in El Segundo, California, became a mecca for visual effects professionals, who made the pilgrimage to gaze in awe at the Aries 1B – the spherical trans-lunar spaceship from 2001: A Space Odyssey – that, miraculously, had been found after 47 years in obscurity.

The miniature was up for auction, and the auction house, Premiere Props, welcomed guests to verify the find. Facebook images began appearing of spectators posing with the ship — Dennis Muren, Greg Jein, Matthew Gratzner, Ian Hunter, Shannon Gans, Dave Jones, Bruce Logan, Pat McClung, Harrison Ellenshaw, Peter Anderson, Bill Taylor, André Bustanoby, Gene Kozicki, Rob McFarlane, Ted Rae, Dan Winters, John Goodson and Kim Smith (and guest appearances, by phone, from Douglas Trumbull and Steve Gawley). The general consensus: the miniature was real.

The Academy Museum of Motion Pictures eventually acquired the ship for a princely sum. Prior to finalizing the sale, event organizer Dan Levin allowed Visual Effects Society Archive Committee co-chair Gene Kozicki and VFX artist André Bustanoby to make a detailed photographic record of the ship. Gene shared the experience with Cinefex:

“I have to admit that when I first heard that the Aries-1B filming model was up for sale at a local auction house, I was skeptical. Model makers, prop makers, and costumers have been making replicas of key items for years, and quite often they show up in an auction as the authentic item, only to be ‘de-bunked’ later. At first I thought that would be the (unfortunate) case here. But the photos and story had a ring of authenticity to them, and I figured it was worth seeing in person.

“Upon first examination, the model appeared to be in pretty good shape. It was covered in a thick layer of dirt and dust, but structurally it was sound and, aside from the landing gear shrouds, was largely intact. It was what I expected a model made in 1966 to look like. The story that the auctioneer provided about how the consignor obtained it lined up somewhat with the story I had been told by Dave Larson (who’s researched the making of 2001 extensively and has worked with Doug Trumbull on presentations).


Aries 1B – photo courtesy Gene Kozicki

“Within about five minutes of looking it over, I was pretty much convinced it was the genuine filming model and not a replica. What clinched it for me was a phone conversation that I had with Doug Trumbull just after I left the auction house. He mentioned details about the nature of some of the materials – something that wasn’t readily apparent in photos. (Some of the panel detailing was done with thin metal foils cut to shape. The foils featured a subtle embossed texture that doesn’t really show up on film but can be seen up close with the naked eye. Doug said they used that material because it looked like quilted insulation.) The model was built as a one-off using some pretty heavy-duty industrial model making techniques. The body is thick acrylic and fiberglass. The landing gears are a combination of steel and brass. The model parts are individually glued on – not castings. (This lack of shortcuts played no small role in its survival. I’ve seen younger filming models in worse shape due to ‘modern’ materials and the need to get it done quickly and cheaply.)

“From a design aesthetic, all of the models in 2001 were so unlike anything we had seen before that it helped sell the concept of a believable future. Up until that time, space travel and lunar landings were done in ships that looked far more elegant than practical. Buck Rogers and Flash Gordon journeyed into space standing up. The public had seen a few real launches, and it was clear that the way we were going to get into space was not the way our fathers had envisioned it. In 2001, space travel wasn’t fantasy, it was plausible. In fact, it was mundane. Doctor Floyd (William Sylvester) sleeps most of his way to the space station and the moon. They have stewardesses and receptionists just sitting there, reading a magazine. In 2001, space travel was boring. That, I think, made it far more realistic than anything we had seen prior. And that’s why it still holds up.


Aries 1B detail – photo courtesy Gene Kozicki.

“So how did this thing get here? A lot of the stories surrounding the filming of 2001 have grown to legend or myth status, probably due to time and the fact that the director was Stanley Kubrick. The ‘myth’ states that Kubrick ordered everything destroyed in order to prevent cheap imitations from beating 2001 to the screen. The reality appears to be far more mundane. Like most movies, post-production ran longer than expected and during this period the models were just stored away — either on a stage or in a building at M-G-M’s Borehamwood studio, in England. And they stayed there even after the film was released in 1968. Why? Probably because it was easier to just leave the stuff there than figure out what to do with it. 

“Finally, at some point in the early-to-mid 1970s, M-G-M decided to clear the material out. Kubrick was offered the models but reportedly refused. (Obviously the logistics involved in moving and storing them were considerable. With the ‘small’ Discovery model being around 15 feet long and the ‘large’ Discovery being 54 feet long, it would require several trucks and a large shed to store them.) But I think there’s something else we should consider — by the time this reportedly happened (mid-’70s), Stanley Kubrick had tried to get Napoleon off the ground, directed A Clockwork Orange, and was working on Barry Lyndon. Given Kubrick’s tendency to go ‘all in’ on a given film, it seems reasonable to me that he simply wasn’t interested in anything from one of his (by then) old films.

“Over the years, there were rumors that some of the models survived. In fact, the Aries 1B, along with the Orion Clipper (the ‘Pan-Am’ ship) and the Moonbus, were thought to have been set aside for Kubrick to claim. But we’ve also ‘heard’ that the Moonbus was taken home by a crew member and eventually suffered an ignominious fate at the hands of that crew member’s son and some fireworks. And of course, there were the photos of Space Station V rotting in a field. The problem with all of this was that the only evidence we had as to the fate of the models was those photos – and those photos supported the ‘myth’ that everything had been destroyed. (Despite the fact that there was no mention of any of the other models being in that same field.) Now that the Aries has shown up, that lends at least some credence to the idea that some of the models were put to one side. (And if they were put to one side for Kubrick, how did the consignor get his hands on the Aries?) Hopefully, this discovery will allow new research, and we can get a better picture of what happened.

Aries 1B miniature construction, 1966.

Special effects technician Rodney Fuller (R) attends to the Aries 1B at Borehamwood, 1966. Image courtesy Matte Shot © Hawk Films / M-G-M.

“As this item was coming up for auction, we really didn’t know where it would end up, or if we would ever be granted access again. Premiere Props’ owner Dan Levin allowed us to come in and document the model several times. It was his hope that if enough industry people saw it and talked about it, it would remove any lingering doubts from people’s minds that this was the real thing. (He was skeptical at first, too.) Additionally, Dan recognized the significance of the prop and wanted to make sure it wound up in some institution that would not only restore it but display it. We all recognized that it had a unique story apart from the basic fact that it was used in a movie.

“Joining us in examining the model were John Goodson from ILM, model maker Greg Jein, the folks at New Deal Studios and visual effects designer Dennis Muren. Everyone who saw it was almost giddy that this thing survived. André Bustanoby and I set up tracking markers and took photos from all sides, and inside.

“With the removal of the hardware that controlled the landing gear, we noticed that the model was sitting a bit lower than what was seen on screen. The landing gears still move, albeit manually. The main part of the body appears to be made from a blown plexiglass dome about 30 inches in diameter. To see inside the guts of the model, I used an endoscope – a handy gadget I picked up at a hardware store that transmits imagery to your iPad or cell phone via an app. I was curious to see if any of the landing gear linkages were still inside the model. (Like Al Capone’s vault, it was empty.) I was also trying to see if the mount points for the model were still intact. The last thing we wanted to see happen was the model getting damaged as it was being moved back into its shipping crate. Thankfully, the mounts and inner structure appear to be in good condition. The Academy will have to come up with some way to mount it securely — I wouldn’t want to rely on a 50-year-old mount point as the sole method of support. But it seems sturdy enough to get it into a crate and to a facility where it can be examined further.

“Given the influence of 2001 and all the legends surrounding the making of the film, the significance of this model cannot be overstated. This is the equivalent of someone discovering Orson Welles’ cut of The Magnificent Ambersons or Dorothy’s ruby slippers from The Wizard of Oz. 2001 set the bar very high — without 2001 and the contributions of the artists who worked on the visual effects, we wouldn’t have Star Wars, Blade Runner, The Matrix, or Interstellar. I am very pleased that the Motion Picture Academy is the new custodian of this iconic artifact, and I hope that its eventual display will inspire future storytellers as much as it has inspired us.”

Gene Kozicki, André Bustanoby, Harrison Ellenshaw and the Aries 1B. Photo courtesy Dan Levin.


Special thanks to Brian Johnson and to Peter Cook at Matte Shot.

Kingsman: The Secret Service – VFX Dossier

Cinefex reports on "Kingsman: The Secret Service"

Based on a series of comic books by Mark Millar and Dave Gibbons, Kingsman: The Secret Service chronicles the recruitment of bad boy Gary “Eggsy” Unwin into a top secret spy organisation.

Recently declassified files reveal that the action comedy is directed by Matthew Vaughn (Kick-Ass, X-Men: First Class), with visual effects by a range of vendors delivered under the surveillance of visual effects producer Stephen Elson, and deployed by production visual effects supervisor Steve Begg and a team of supporting agents. (Begg ultimately handed over to veteran VFX operative John Bruno, in order to undertake another espionage mission, this time on Her Majesty’s secret service.)

Cinefex has obtained additional top secret documents from three of the visual effects vendors assigned to Kingsman: The Secret Service. They are presented here on an “eyes only” understanding. Once read, all trace of their existence must be destroyed.

Top Secret

Dossier 1 – Prime Focus World

Agent Name: Marc Jouveneau | Agent Role: VFX Supervisor

Mission Background

Agent Jouveneau – previous association with Agent Begg, on missions including Casino Royale and Tomb Raider: The Cradle of Life.

Prime Focus World – previous association with show VFX producer, Stephen Elson.

Roxy rises into the stratosphere in “Kingsman: The Secret Service” – visual effects by Prime Focus World.

Mission Scope

Prime Focus World was operational on several key sequences and one-off shots. Agent Jouveneau supervised from shoot to final delivery:

  • Fight in church
  • Individual shots in tunnel sequence
  • White House shot
  • Worldwide fights finale sequence, including shooting aerial plates and motion capture
  • Roxy (Sophie Cookson) in near-space

Additional resources deployed for:

  • Designing and building house belonging to Valentine (Samuel L. Jackson) for aerial establishing shots
  • Comped graphics/footage inserts into monitors, HUD, phones, etc.
  • Various additional enhancements

Shot Count

410 shots delivered by a team of approximately 160 Prime Focus World agents, in all departments.

Agent Jouveneau involved on shoot Oct 2013–Feb 2014. Post-production commenced Mar 2014. Most shots delivered Jun–Aug 2014. Retakes and graphic inserts addressed Sep–Oct 2014. Final shot delivered 10 Oct 2014.

For the near-space sequence, VFX supervisor Marc Jouveneau’s team studied weather balloons, skydiving, footage of Felix Baumgartner’s high-altitude jump and various NASA image sources.

Research & Development

Key reference sources:

  • Original Kingsman graphic novel
  • Previous films of director Matthew Vaughn – notably Kick-Ass, emblematic of the editing style: quick cuts and fast moves during action sequences
  • Style guidelines provided by Steve Begg, drawn from action/spy films including Casino Royale and Skyfall
  • Previs used on most sequences – either CG or stunt choreographies
  • Brad Allan, second unit director and stunt coordinator, planned all fights with his team
  • Near-space sequence: weather balloons – including explosion of same; skydiving through clouds; Felix Baumgartner’s jump from near-space; various NASA pictures

Interrogation Extracts – Marc Jouveneau

“The fight in the church was originally designed to be one long continuous shot lasting over six minutes. It was edited and cut in the end, but there are still three long chunks – each one over 1,000 frames. I supervised it on-set with the second unit, working with director of photography, Fraser Taggart, and assistant director, Joe Geary.”

“We took a lot of measurements, scans and reference pictures of actors, stunt performers, special actors and extras, as well as texture references and a laser scan of the church. During the shoot, we focused on continuity and the rhythm of the action, and the sequence was edited by Eddie Hamilton as we went along.”

“Much of the hard work in post came from match-moving the camera, which was mainly handheld, sometimes with zoom. The cameras were ARRI Alexas, used with a 45° shutter to enhance the energy of the action, but RED EPIC and Blackmagic cameras were also used for POV shots.”

“We also had to object-track for transitions like the flying gun, or the stick-in-the-throat at the end, and body-track on several shots in order to replace Colin Firth’s face. All the clean-up of tracking markers, wires and props had to be addressed, while at the same time applying re-time and re-frame to the shots.”

“Another thing we had to do was to add and define the look of the blood and wounds. Matthew Vaughn wanted a misty blood that spread and disappeared fast, in order to reduce the goriness of the sequence. This stylistic approach was used throughout the movie.”


Mission Summary

“When you start a show, there are always a lot of unknowns and surprises. This is part of the process of moviemaking, I guess. On Kingsman, it was good being involved on-set, and to have worked with Steve Begg before – both these things made us more involved in the creative discussions throughout the reviews with Matthew Vaughn, Eddie Hamilton, George Richmond and the producers from MARV. As you do the shots, you learn more and more about the director’s expectations; after a while, you know pretty much what he is after, and it becomes easier.”

“We were working on Kingsman when we heard the Prime Focus World London VFX facility was closing. You can imagine how difficult it is to focus in such a situation. I truly appreciated the team spirit adopted by the artists and production. I think they proved their real quality: creative talents who love making movies! You can’t beat that. I work freelance now, and I’ll miss these guys. But I would like to use this opportunity to thank the entire Prime Focus World crew for their great work, and say, ‘Well done!’”

Top Secret

Dossier 2 – Nvizible

Agent Name: Matt Kasmir | Agent Role: VFX Supervisor

Mission Background

Nvizible – previous association with director, Matthew Vaughn, on Kick-Ass 2.

Mission Scope

Nvizible created previs for the mission, under the supervision of Martin Chamney. Primary VFX duties concerned the design, shooting and execution of prosthetic leg blades belonging to the character Gazelle.

Shot Count

210 shots delivered, over a period of 14 months. Half of these involved Gazelle’s blades.

Research & Development

Key reference sources:

  • James Bond films, particularly from the 60s and 70s
  • Paralympics and the prosthetics worn by competing athletes

Additional design considerations:

  • “Feminisation” of prosthetic blades from original graphic novel design, to accommodate character change from male to female
  • Development of “designer” look and feel, reflective of Gazelle’s fashion-conscious character

Interrogation Extracts – Matt Kasmir

“For Gazelle’s death, we created a special stunt previs sequence involving four Gazelle stunt doubles and two Eggsy doubles. On the day, we shot high speed for the majority of shots, and used The VFX Company’s motion control system to create the multiple passes we needed. In all, we used four different Moco rigs: the high-speed Bolt, a conventional rig, a 360° rig and a high-speed rail cam. In some shots, up to six individual performers are combined into our two characters. This was done using CG body doubles to blend animation over a few frames, face replacements, and a projected 2½D environment for clean-up. We also inserted screens, legs and CG props.”

“All the shots were variable speed – a Phantom shot running at 600fps could ramp from real-time to 25 times slower. We worked very closely with editorial, as we were potentially working on 25 times the normal number of frames, yet starting from scratch each time a cut changed.”

“Overall, we allowed the performers to drive the animation, and concentrated on the subtleties and giving the blades a sense of intent.”
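As an aside, the frame-count arithmetic behind the variable-speed ramps Kasmir describes can be sketched in a few lines. This is an illustrative sketch only: the 24 fps playback base used here is an assumption chosen to match the quoted figure that 600 fps plays back 25 times slower than real time.

```python
# Illustrative sketch of high-speed retiming arithmetic (not production code).
# Assumes a 24 fps playback base rate, consistent with the quoted figure
# that a 600 fps Phantom shot plays back 25 times slower than real time.

def slowdown_factor(capture_fps: float, playback_fps: float = 24.0) -> float:
    """How many times slower than real time the footage plays back."""
    return capture_fps / playback_fps

def frames_recorded(real_seconds: float, capture_fps: float) -> int:
    """Frames captured during a given span of real-world action."""
    return round(real_seconds * capture_fps)

print(slowdown_factor(600))      # 25.0
print(frames_recorded(2, 600))   # 1200
```

Two seconds of real action captured at 600 fps yields 1,200 frames – which is why the team was "potentially working on 25 times the normal number of frames" every time a cut changed.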

Top Secret

Dossier 3 – Doc & A Soc

Agent Name: John Paul Docherty | Agent Role: VFX Supervisor

Mission Scope

Docherty worked with digital matte painter Jim Bowers to create 360° environments, including a huge hangar filled with aircraft and secret service staff. Docherty was assigned to the mission, together with John Bruno, when Steve Begg had to move on to another production.


Shot Count

100+ shots delivered.

Interrogation Extracts – John Paul Docherty

“The hangar shot was created by Jim; we added in various moving elements, including workmen, a plane being towed and a man arc-welding at the back, which is a little throwback to Lost in Space – there are about ten layers of environment in that shot. Then we ran it through Fusion Studio’s 3D environment, where the shot went through an awful lot of changes, including re-lighting the whole thing.”

“For a major explosion sequence, we had to deal with four shots filmed with high-speed cameras at Leavesden on a cold, rainy day. I rendered these together with glass-shattering effects I’d created. We had to deal with multiple image formats and lots of lens distortion, as well as some pretty dramatic colour space and resolution differences, before we could effectively comp in separate office and taxi elements.”

“In another scene, we move from the aftermath of an action sequence to a moving taxi with the ‘Kingsman’ logo flashing on a monitor. This looks like quite a simple shot, but the speeds on both sides had been adjusted by the editor. It worked really well; however, he threw in cut frames that made the re-speeds very complex. Fusion’s Optical Flow did very well handling all of that.”


Top Secret

Special thanks to Steve Begg, Stephen Elson, Tony Bradley, Alex Coxon and Stephanie Hueter. “Kingsman: The Secret Service” photographs copyright © 2014 Twentieth Century Fox Film Corporation.

Unbroken – Cinefex 141 Extract

Cinefex 141 - "Unbroken"

All this week, we’re featuring exclusive extracts from our brand new magazine issue, Cinefex 141, now available in both print and digital editions.

Unbroken tells the remarkable true story of Louis Zamperini, an American long-distance runner and Olympian who joined the Army Air Forces as a B-24 bombardier during World War II, was shot down over the Pacific, and survived on a raft for 47 days, only to be rescued, incarcerated and relentlessly tortured by his Japanese captors. In presenting this tale of boundless courage and survival on screen, director Angelina Jolie engaged effects artists at Industrial Light & Magic, Rodeo FX, Animal Logic, Ghost VFX, Hybride and Lola VFX to replicate the 1936 Berlin Olympic venues and create intense sequences of aerial combat and the sea and land ordeals that followed.

In this excerpt from Jody Duncan’s article, The Long Run, Bill George explains how Lola VFX used digital techniques to simulate the effects of extreme weight loss on actors Jack O’Connell and Domhnall Gleeson:

Though both actors managed to decrease their weight significantly through fasting, the severity of the characters’ emaciation required some digital effects enhancement by Lola VFX. Production visual effects supervisor Bill George shot clean plates, when possible, on the set, recorded camera data, and put tracking markers on the actors’ bodies as aids to visual effects supervisor Edson Williams and his team at Lola VFX.

At Lola, artists created a 2D mesh that was tracked to the actors, enabling them to shrink areas of their bodies to within the confines of the mesh. “It was very effective,” said George. “They also had to do a lot of roto and re-creating backgrounds using the clean plates – if they had them. For many of the emaciation shots, the background was a sea of other soldiers milling about, so no clean plates were possible.” For a closeup on Louie’s face, which Jolie wanted to look more sunken and hollow, the on-set visual effects team sent Lola a scan of Jack O’Connell’s head. Lola then tracked and lit the head scan to create highlights on the brows and shadows in the eye sockets and under the zygomatic arch. “Lola did an amazing job with all of these shots.”

Read the complete article in Cinefex 141, which also features The Hobbit: The Battle of the Five Armies, Jupiter Ascending and Chappie.

All content copyright © 2015 Cinefex LLC. All rights reserved.

Jupiter Ascending – Cinefex 141 Extract

Cinefex 141 - "Jupiter Ascending"

All this week, we’re featuring exclusive extracts from our brand new magazine issue, Cinefex 141, now available in both print and digital editions.

In the exotic science fiction fantasy Jupiter Ascending, a genetically engineered superman (Channing Tatum) arrives on Earth to help an unsuspecting young cleaning lady (Mila Kunis) realize her destiny as the leader of an intergalactic super race. Visual effects supervisor Dan Glass reunites with The Matrix filmmakers Andy and Lana Wachowski to realize science fiction action and spectacular cosmic realms assisted by visual effects consultant John Gaeta, and artisans at Method Studios, Double Negative, Framestore, One Of Us, BlueBolt, The Aaron Sims Company, Halon Entertainment, Mokko Studio, Rodeo FX, BUF and The Third Floor. Special effects supervisor Trevor Wood, makeup effects supervisor Jeremy Woodhead and Ironhead Studios supplied practical effects.

In this extract from Joe Fordham’s article, Imperial Earth, Jeremy Woodhead describes the practical techniques used to create a range of characters who have been genetically modified, or “spliced”.

Makeup effects and hair designer Jeremy Woodhead and prosthetic effects supervisor Nik Williams’ Animated Extras translated concepts into makeups, including those for Channing Tatum’s ex-military human/wolf hybrid. “I did 20 or 30 drawings of Channing in various guises as Caine,” recalled Jeremy Woodhead. “We went from quite extreme canine, to barely there, and ended up somewhere in the middle. I did quite a few makeup tests on him, with different wigs and ear placements.” Caine’s final makeup featured Polytek PlatSil silicone prosthetics of swept-back pointed ear tips, and Pro-bondo applications of bar code brands on the warrior’s skin.

“Lana and Andy Wachowski came up with the idea that splices went through factories, where they were fitted with bands on the backs of their necks. For Caine’s hair, we used Channing’s real beard and hair, and added color and paint to give it more of a fur quality. At one point during the film, Caine takes off his shirt to reveal that, as a legionnaire, he once had metallic wings that had been chopped off, leaving nubs on his back. We made organic/mechanical prosthetics for each stump.”

Read the complete article in Cinefex 141, which also features The Hobbit: The Battle of the Five Armies, Chappie and Unbroken.

All content copyright © 2015 Cinefex LLC. All rights reserved.

Hobbit 3 – Cinefex 141 Extract

Cinefex 141 - "The Battle of the Five Armies"

All this week, we’re featuring exclusive extracts from our brand new magazine issue, Cinefex 141, now available in both print and digital editions.

In The Hobbit: The Battle of the Five Armies, Bilbo Baggins (Martin Freeman) arrives at the climax of his adventure, leading a company of dwarves to the Lonely Mountain, where the dragon Smaug unleashes his wrath on the citizens of Lake-town, the dwarf leader Thorin (Richard Armitage) seeks the precious ancestral Arkenstone, and dark forces gather for an epic battle that threatens the future of Middle-earth. Director Peter Jackson concludes his 20-year filmmaking journey through J.R.R. Tolkien’s fantasy landscape, along with visual effects supervisor Joe Letteri, special makeup and creature designer Richard Taylor, armies of artists at Weta Digital and Weta Workshop, and NZFX special effects supervisor Steve Ingram.

In this excerpt from Joe Fordham’s article, King Under the Mountain, Christopher White describes how Weta Digital’s new renderer, Manuka, helped create the epic scenes in which the fire-breathing dragon Smaug destroys Lake-town.

Fire spraying from the dragon’s gullet required special attention to work with the creature’s aerial trajectory. “Smaug was racing over the city at around 400 kilometers an hour,” said White, “and he had to hit very specific buildings, so we engineered sims to control fire ejected at high speeds. Peter referred us to flame-throwers for how fuel was emitted and stuck to timbers. Those simulations caused other timbers to catch fire, and we ran destruction simulations on top, causing burning pieces to fall into the water, generating splashes and steam.”

Weta Digital used Pixar’s RenderMan to render fire and smoke elements, but relied on its new in-house physics-based renderer, Manuka, to streamline lighting and art direction for all other elements in these scenes. “Manuka helped minimize the amount of lights we had to set in pre-passes and caches,” said Christopher White. “That simplified the lighting process and made it easier for technical directors to work with lighting rigs. Once we set our fires, we got all the natural reflections in the water, smoke in the sky and indirect lighting effects on the geometry of the buildings in a very naturalistic response.”

Read the complete article in Cinefex 141, which also features Chappie, Jupiter Ascending and Unbroken.

All content copyright © 2015 Cinefex LLC. All rights reserved.

Chappie – Cinefex 141 Extract

Cinefex 141 - "Chappie"

All this week, we’re featuring exclusive extracts from our brand new magazine issue, Cinefex 141, now available in both print and digital editions.

First up is Chappie. Starring Hugh Jackman, Sigourney Weaver and Sharlto Copley, District 9 director Neill Blomkamp offers this action/thriller story about a robot child prodigy kidnapped by criminals and raised within their dysfunctional family. Weta Workshop provided practical props and effects, along with special effects supervisor Max Poolman. Visual effects were created by Image Engine, Ollin VFX Studio and The Embassy VFX.

In this extract from Jody Duncan’s article, Rules of Robotics, Chris Harvey of Image Engine discusses the creation of the film’s robot characters.

Final robot designs went to visual effects supervisor Chris Harvey and the crew at Image Engine, the show’s primary visual effects provider, which would deliver close to 1,000 shots, including all of those involving digital robot characters. “We brought Weta Workshop’s two-dimensional designs into Image Engine,” recalled Harvey, “and began to flesh them out in three dimensions. That was a very long, detailed process – and it was quite different from what usually happens. Typically on a film, someone does concept design, and then that is built physically, and then the visual effects team has to replicate what was built. What we did was the opposite of that. The concepts came to us first, and then we developed those. We worked out how the robot would function, physically – how all the gears and mechanisms, joints and limbs would function in the real world. So it was a very physically-based design, which was important to Neill. He wanted it to look like something that could be conceivably built, even today.”

Ultimately, Image Engine modeled 16 different versions of Chappie to accommodate his evolution through the film, as well as 12 generic Scouts. “We had to create a whole police force of these guys,” explained Chris Harvey. “There was the prototype droid, called Robot Deon, and then droids for the end of the film when they go offline and they are vandalized – spray-painted, burned and beat up. Counting all the different versions of Chappie and the Scouts, we created 28 unique robots for the film.”

Read the complete article in Cinefex 141, which also features The Hobbit: The Battle of the Five Armies, Jupiter Ascending and Unbroken.

All content copyright © 2015 Cinefex LLC. All rights reserved.

“Cinderella” – VFX Q&A

"Cinderella" - a Cinefex VFX Q&A with MPC

Like most fairy tales, the tale of Cinderella is part of an oral storytelling tradition stretching back hundreds, if not thousands, of years. Many versions exist of this classic story of the persecuted heroine, but the one most familiar to modern Western audiences is a French variant, Cendrillon, written in 1697 by Charles Perrault.

Perrault’s story – which introduced the now-familiar devices of the fairy godmother, the pumpkin turning into a carriage, and the glass slipper – was first adapted for the screen as Cendrillon by Georges Méliès in 1899. However, the movie remembered by most people is the 1950 Disney animated feature Cinderella.

Now, Disney has released a new version of the rags-to-riches tale: a live-action feature starring Lily James as Cinderella, Cate Blanchett as the Stepmother, Helena Bonham Carter as the Fairy Godmother and Richard Madden as the Prince.

Directed by Kenneth Branagh, the film eschews the recent trend for reimagined, edgy fairy tales. Instead, it tells the traditional story straight, and is unafraid to draw on the heritage of its animated predecessor.

The majority of the visual effects work for Cinderella – approximately 500 shots – was delivered by MPC, with Charley Henley in the role of production VFX supervisor, and Patrick Ledda supervising for MPC.

In this Q&A for Cinefex, Ledda discusses magic and mice, palaces and pumpkins, and that all-important glass slipper.

"Cinderella" - palace exterior shot by MPC

How did you first get involved with Cinderella?

I joined the show in Summer 2013, a couple of months before the start of the shoot, and was on the show for 15 months or so. I knew the production VFX supervisor, Charley Henley, so it was easy to get started. We had several meetings about the style of the movie, previs, sets, locations, and so on. A few weeks later, we commenced principal photography at Pinewood. I attended the shoot, and subsequently went on to supervise MPC’s work.

What was the scope of MPC’s work on the film?

It was varied. We did creature work, including the mice, lizards, goose, transformations, digi-doubles and stag. Also magical transformations: the carriage, the shoe, the dress. There was a considerable amount of environment work, including fully-CG wide establishers, the palace, the town and various set extensions. Our CG supervisor, Richard Clegg, did a tremendous job of managing such a variety of assets and shots, ensuring a consistent style and quality throughout. Supervisors Richard Little and Reuben Barkataki led the comp team.

How involved was the director, Kenneth Branagh, with the visual effects?

We were very fortunate to work closely with Kenneth. We had some conversations on set about certain VFX shots and how to shoot them, but his real involvement with us started at the end of principal photography. We met him several times, and our discussions ranged from big-picture questions – such as what the mice would look like – to in-depth sessions going through the entire movie shot by shot.

Can you give us some examples?

We discussed questions concerning the personality of the mice, or ways in which we could transform the lizard into the coachman. It was clear that Ken understood the VFX process well, having worked on movies such as Thor in the past. That helped us tremendously. But what I found most useful was his amazing ability to act out scenes and characters. That gave us the clearest briefs of all. Just from his expressions, we could understand what he was after.

It sounds as if the process was quite collaborative.

He was interested in our ideas, so our sessions weren’t just briefs but more like creative conversations. He also came to visit the team in the Montreal office, which was great for everyone. Going through the film with him and listening to his ideas was inspiring for the entire team.

How closely did you study the original Disney animated film?

By the end of the movie, the entire crew was very intimate with the 1950 animated feature! We used it as reference and inspiration; however, we were also keen to put our own stamp on the movie. We also worked with the production art department to ensure that our CG work would be in line with practical sets.

What other visual cues did you use?

As usual, for human characters, we did photoshoots, photogrammetry and scans. For the animals, we used a ton of reference photography and videos. Additionally, we had our in-house real mice, which our animators looked after and used as reference on a daily basis. We also looked at many landscape paintings to get a mood and palette for Cinderella’s world.

The film has quite a saturated colour palette. Did that affect your approach to the visual effects?

That’s a good question. Firstly I should mention that Charley Henley and director of photography Haris Zambarloukos had several conversations about the look of the film as a whole. It was shot on Kodak film stock, as both Ken and Haris wanted a classic look, and then went to digital intermediate, which can make quite a dramatic visual change to a shot.

In order to deal with this, we obtained grade references early on, so we would know where the grading process would take our shots. For fully CG sequences, we delivered shots “neutral” or with a simple representative grade, which was used as a guide for the digital intermediate.

How much creative control were you able to use during the previs stage?

We did a considerable amount of previs early on, and continued to produce previs way after the shoot; some scenes are fully CG and were completely designed in post. We were given quite a lot of creative freedom – Ken was always interested to see our ideas and was happy to see rough work and proof of concepts instead of waiting for something more polished. Most of the big CG sequences had been prevised and/or postvised, and for the most part were used as references during the shoot. I should mention that a lot of the previs/postvis work was done by the production VFX team.

"Cinderella" - bluescreen shoot for later digital enhancement by MPC

How much of the palace was built practically, and how much did MPC create?

The exterior of the palace was fully CG, apart from the door and balcony which are visible in some shots. A section of the stairs and gate was also practical. Together with the production art department, MPC designed and created the entire model.

The interior was mainly built practically. We did set extensions, CG chandeliers, digi-doubles and so on, but I won’t take any more credit than we deserve. The set was outstanding – beautifully designed and created – so it was really great to complement it with our work.

Building the digital palace sounds like a big task. How did you go about it?

We drew inspiration from a number of European palaces, as well as palaces from other Disney films. From there, under the supervision of our asset lead, Jung Yoon Choi, we created the design that we wanted. The whole process was fairly elaborate, mainly due to the sheer size of the palace, and not knowing to what extent we would have to build it, and to what level of detail.

What was the biggest challenge with the palace?

Finding a way to marry the palace and the landscape. Ken wanted it to feel grand, but at the same time immersed in the landscape. A lot of work went into the design and creation of the palace gardens, led by environment lead, Hubert Zapalowicz. Size-wise, the model of the palace was fairly big, but still manageable in our pipeline. What was more complex were the trees and vegetation surrounding the palace. For a large part they are fully 3D.

How many elements does a typical shot of the palace contain?

To pick one shot is difficult, but I can briefly describe the shots where Cinderella is running away from the Prince down the stairs of the palace. In this scene, apart from the actors and a section of the stairs, the majority was CG; even the plate containing actors got re-projected to allow for a nicer camera move. We extended the stairs, created vegetation to the sides, added digi-double guards, the palace and sky. The carriage was practical in this sequence, although we applied a 2D treatment to make it look more magical.

Do you have a favourite shot of the palace?

The opening shot of the ball, where we fly through fireworks and have a first establisher of the palace at night.

"Cinderella" - CG mice by MPC

Let’s talk about the digital characters. How many did you create in total?

The main characters were four mice, lizards, the two coachmen, a goose, the footman, stag, bluebirds, and white horses. There are also many other lesser creatures, such as butterflies, birds and so on. I believe the total number of assets was in the region of 80.

Which were the most challenging?

The mice! The brief was to go for a photorealistic look, because they interacted with Cinderella quite often. But they needed enough character and personality to engage with Cinderella and the audience. It was a fine line, and our animation supervisor, Warren Leathem, and lookdev head, Thomas Stoelzle, did a great job in finding that balance.

We gave the mice a slightly anthropomorphised feel to differentiate them and give them personality, but all in all we were going for photoreal shading. Although we used a lot of reference material, the mice are not a digital reproduction of any real mice. We created our own version.

How were the creatures animated?

For the vast majority we used keyframe animation. The animals other than the mice – particularly the transforming characters – had a much broader animated style to help the comedy and fairy tale aspect of certain scenes. We created our own concepts of the various transformation stages from fully animal to fully human.

MPC's digital mice interact with Cinderella's dress - original plate

MPC's digital mice interact with Cinderella's dress - final composite

How did you rig the models for the transformations – the mice changing into horses, for example?

Building a system with enough flexibility was the biggest challenge that our rigging lead, Davide La Sala, faced on this project. We needed a system that would allow lots of creative freedom when animating the transformation shots.

Each character had three rigs: a horse rig, a mouse rig and a transformation rig. The animators could choose to animate the different parts of the character with either the horse or the mouse rig, depending on what suited. The horse and mouse rigs were constrained and linked to the third transformation rig, which was used to blend between horse and mouse shapes.
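At its core, the three-rig approach Ledda describes interpolates each body part between two source shapes using an animator-driven weight. A hypothetical Python sketch of that idea (not MPC's actual rigging code; names and numbers are illustrative):

```python
# Hypothetical sketch of the three-rig blend: each body part carries a
# weight from the transformation rig -- 0.0 = fully mouse, 1.0 = fully
# horse -- and vertex positions are linearly interpolated between the
# two source rigs' shapes.

def blend_part(mouse_points, horse_points, weight):
    """Blend one body part between its mouse and horse shapes."""
    if not 0.0 <= weight <= 1.0:
        raise ValueError("weight must be in [0, 1]")
    return [
        tuple(m + weight * (h - m) for m, h in zip(mp, hp))
        for mp, hp in zip(mouse_points, horse_points)
    ]

# The animator keys only the per-part weight; at 0.5 the head is halfway
# through its mouse-to-horse growth.
mouse_head = [(0.0, 1.0, 0.0), (0.2, 1.1, 0.0)]
horse_head = [(0.0, 2.4, 0.0), (0.6, 2.7, 0.0)]
half = blend_part(mouse_head, horse_head, 0.5)
```

Because each part can carry its own weight, the animators were free to let, say, the legs lead the transformation while the head lagged behind.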

Tell us more about the blending process.

The transformation rig calculated both scale changes and how “transformed” various parts of the body were. This information was baked into the geometry cache. MPC’s software team added features to our proprietary hair system, Furtility, to be able to read this data back in from the geometry cache and use it to drive changes in the hair.

For example, as the head grew massively in size from mouse to horse, so the mane would grow and the fluffy mouse hair would transition to short horse fur. This data was also used by the shaders to modulate between textures and different shading setups for the different modes of animal.
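As a rough illustration of that idea – the constants and function names below are invented for the example, not Furtility's actual API – a baked per-region "transformed" value (0 = mouse, 1 = horse) read back from the geometry cache could drive hair length like this:

```python
# Illustrative only: fluffy mouse body fur shortens into a horse coat as
# the region becomes more "transformed", while the mane only grows in
# during the second half of the transition.

MOUSE_FUR_LEN = 0.8   # cm, assumed length of fluffy mouse body fur
HORSE_FUR_LEN = 0.2   # cm, assumed length of short horse coat
MANE_MAX_LEN = 40.0   # cm, assumed fully grown mane length

def body_fur_length(transformed):
    """Body fur shrinks linearly as the region becomes more horse-like."""
    return MOUSE_FUR_LEN + transformed * (HORSE_FUR_LEN - MOUSE_FUR_LEN)

def mane_length(transformed):
    """Mane starts growing only once the transition passes halfway."""
    grow = max(0.0, (transformed - 0.5) * 2.0)  # remap 0.5..1 -> 0..1
    return grow * MANE_MAX_LEN
```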

Stylistically, how did you manage the animation during the transformations?

This was probably an even bigger challenge. The director was adamant that the transformation had to look enjoyable; he wanted to convey excitement as the mice become beautiful and powerful horses. We went through many iterations, experimenting with several ideas and edits. Interestingly, the transformation back from horse to mouse, although more challenging and of a higher shot count, was in a way easier as we had a clearer idea of how the scene was going to develop.

How did you approach the transformation of a humble pumpkin into a shining golden carriage?

This particular transformation sequence went through quite a few conceptual changes. In the end, the story that we wanted to tell was the greenhouse exploding into particles of dust, which would then collect to forge the carriage.

We destroyed the greenhouse and the pumpkin procedurally with our proprietary FEA (Finite Element Analysis) destruction tool, Kali. We then ran many particle simulations on top of the broken pieces to give the effect that the solid chunks were vaporised into magical golden dust before materialising to form the frame and shell of the carriage.

Was it hard to match to the practical carriage?

The practical golden carriage on set had a very ornate and complex design. We built an exact digital replica which our technical animation team stripped apart, allowing us to hand-animate the various parts so that it felt like the carriage was self-assembling in an organic and elegant way.

How much of what we see in the dress transformation is practical wardrobe, and how much digital effects?

We ran motion control shoots of Lily James spinning in the different pink and blue dresses. The dress transformation then involved stitching two different performances together. It was tricky to find a moment in both performances that blended perfectly and at the right time. We helped the 2D blend with a digital Cinderella for a few frames in the middle.

For the dress transformation, we ran lots of cloth simulations on our CG version. The dress needed to float up and feel light as it grew in size to fill the volume of the blue ballroom gown. The trick was to make the dress expand and move as if it was underwater, but at the same time stay coherent and feel part of Lily’s performance.

How did you integrate all those magical sparkles?

Once we had our cloth animation just right, we ran multiple layers of particle simulations on top of it. Butterflies fly into camera, then land on and form part of the dress. We emitted magic dust from the ground, air and butterflies as they flew in. It was important for all the dust to interact and feel like it was being influenced by the swooshing of the dress.

Digital set extension by MPC for "Cinderella" - original plate

Digital set extension by MPC for “Cinderella” – original plate

Digital set extension by MPC for "Cinderella" - final composite

Digital set extension by MPC for “Cinderella” – final composite

Finally, let’s talk about the story’s most memorable icon: the glass slipper. What did you do to enhance the practical shoe used on-set?

We had the challenge of matching the practical shoe, which was covered with a special coating to give it an iridescent effect. Our lookdev team did a fantastic job, firstly by developing a custom shader, and secondly by making sure that shader would give us enough artistic control when needed.

What was the most difficult slipper shot?

The moment when the Prince first puts the shoe on Cinderella at the ball. In this shot, we had to replace the practical shoe (which was a plastic prop) with our CG version, and re-create the Prince’s arm so that its movement was more coherent with the shoe.

On a side note, the practical shoe was so small that it wouldn’t fit even Cinderella! Therefore, for several shots, we had to find ways to alter the shape of the shoe and foot in an “invisible” way.

What are your feelings looking back on the show?

It was a great pleasure to work on such an iconic film. Kenneth Branagh and Charley Henley both gave us creative freedom, but at the same time challenged us with their ideas. Although the vast majority of the work was done by MPC Montreal, all our other facilities helped in different capacities. It was a great effort by everyone.

Special thanks to Darrell Borquez, Marshall Weinbaum, Riki Arnold and Jonny Vale. “Cinderella” photographs copyright © 2015 The Walt Disney Company.

Now Showing – Cinefex 141

Cinefex 141 - From the Editor's Desk

Meet Chappie, the robotic star not only of Neill Blomkamp’s new futuristic action/thriller, but also the cover star of Cinefex issue 141. Available now, this new edition of the premier magazine for visual effects professionals and enthusiasts features behind-the-scenes analysis of the latest films by leading moviemakers.

As its cover promises, our latest edition investigates the making of Chappie, in which a robot child prodigy is kidnapped by criminals and raised within their dysfunctional family. Weta Workshop provided practical props and effects, along with special effects supervisor Max Poolman. Visual effects were created by Image Engine, Ollin VFX Studio and The Embassy VFX.

Also featured in Cinefex 141 is The Hobbit: The Battle of the Five Armies, the epic conclusion to Peter Jackson’s 20-year filmmaking odyssey through J.R.R. Tolkien’s fantasy landscape. Assisting him on this final leg of the journey were visual effects supervisor Joe Letteri, special makeup and creature designer Richard Taylor, armies of artists at Weta Digital and Weta Workshop, and NZFX special effects supervisor Steve Ingram.

Next we have the exotic science fiction fantasy Jupiter Ascending, for which visual effects supervisor Dan Glass reunited with The Matrix filmmakers Andy and Lana Wachowski to realize science fiction action and spectacular cosmic realms, assisted by visual effects consultant John Gaeta and artisans at Method Studios, Double Negative, Framestore, One Of Us, BlueBolt, The Aaron Sims Company, Halon Entertainment, Mokko Studio, Rodeo FX, BUF and The Third Floor. Special effects supervisor Trevor Wood, makeup effects supervisor Jeremy Woodhead and Ironhead Studios supplied practical effects.

Wrapping up this issue of Cinefex is Unbroken, a true tale of boundless courage and survival, for which director Angelina Jolie engaged effects artists at Industrial Light & Magic, Rodeo FX, Animal Logic, Ghost VFX, Hybride and Lola VFX to replicate the 1936 Berlin Olympic venues and create intense sequences of aerial combat and sea and land ordeals.

We think it’s an amazing line-up. All the same, the list of contents isn’t quite what was originally planned for issue 141. How so? I’ll let Cinefex editor-in-chief Jody Duncan tell you more …

Jody Duncan – From The Editor’s Desk

After 36 years in business, Cinefex normally runs like a well-tuned Lamborghini. The writing team writes and the production team produces — usually — without a hitch. But, every once in a while, a wrench is thrown in, grinding the gears of our Lamborghini’s engine. Issue 141 was just such an issue.

Our plan was to do a big story on Ron Howard’s In the Heart of the Sea, which was scheduled to be released in early March — perfect timing for our mid-March issue. I saw a very early screening of the film, and I came out of it charged up and ready to go. The film was stunning, and it offered me a chance to write about something outside our typical sci-fi subject matter. Whales instead of aliens. 19th century whaling ships instead of spacecraft. I was exhilarated!

I interviewed all of the visual effects principals, as well as Ron Howard — the first time we’d managed to snag an interview with the director, despite our having covered eight of his previous films. (Cocoon was, in fact, my first writing assignment for Cinefex.) I then spent several weeks writing the article, and was within a day of sending it off to typesetting … when I got a call from Warner Bros. The studio had made a last-minute decision to change the film’s release date from March 2015 to December 2015. In the Heart of the Sea could not be featured in our March issue.

We were now looking at the prospect of about 30 empty pages in the magazine, which would only be filled if I could write a substitute article in record time. Fortunately, the loss of In the Heart of the Sea (which will be featured in our December issue) meant the gain of a story on Unbroken — a movie I had greatly admired, based on a book I had read and loved. Unbroken turned out to be a terrific story from a visual effects standpoint!

Cinefex 141 also contains coverage of the third Hobbit movie — Joe Fordham’s final foray into Middle-Earth with Peter Jackson and Weta — as well as his story on the Wachowski siblings’ latest extravaganza, Jupiter Ascending. Our cover boy is Chappie, a charming fellow I came to know through interviews with director Neill Blomkamp, visual effects supervisor Chris Harvey and a host of other visual effects artisans. Fortunately, our engines were purring throughout the writing and production of all three articles. As for the fourth — Unbroken was worth losing a few hours’ sleep, and I can look forward to a light workload come the 2015 Christmas season, because one of my articles for our final issue of the year is in the bag. I think I’ll go shopping.

Thanks, Jody – enjoy that shopping spree when it comes!

As for issue 141, the time has come to stop talking and start reading. Use the links below to access the latest Cinefex in your favourite format.

Oh, and if you’re a robot, please note that Cinefex is optimised for human eyesight, so please make the appropriate adjustments to your optical sensors.

Blade Runner Returns

"Blade Runner" Returns

In the autumn of 1982, I seated myself in a darkened cinema and waited for Ridley Scott to transport me 37 years into the future. I was seventeen years old, a card-carrying movie geek, and almost beside myself with excitement. Would the film I’d been anticipating all year live up to my high expectations?

The film, of course, was Blade Runner, and it didn’t disappoint. Even before the titles had finished rolling, I’d been seduced by the haunting tones of Vangelis’s sultry score. When the opening shot faded up – a spectacular moving vista in which flying cars soared above a fiery, industrialised future Los Angeles – I gasped.

Come to think of it, I think that most of Blade Runner’s future city shots made me gasp. Through the course of the film, I felt myself swept bodily into that glistening, neon-lit metropolis. I could feel its rainfall stroking my face. I could feel its smoke choking my lungs. I’d never felt so immersed in an imaginary world.

I embraced the people of the future, too. Rick Deckard was Harrison Ford like I’d never seen him before: no dashing, romantic hero this, but a cynical, downtrodden gumshoe. I fell head-over-heels in love with Sean Young as the immaculate Rachel, and with Daryl Hannah as the primal Pris. Most arresting of all was the leader of those renegade replicants, Roy Batty, played by Rutger Hauer in a performance that moved effortlessly from chilling to heartbreaking and back again.

Yet it was the ground-breaking visual effects work that spoke to me most clearly. Created by Douglas Trumbull, David Dryer, Richard Yuricich and the rest of the team at Entertainment Effects Group, they were simply breathtaking. What’s more, they blended seamlessly with the live-action that had been shot on the Warner Brothers backlot in Burbank. Was that real rain I was seeing, or some kind of animated effect? Where was the join between the full-scale set and the matte painting? The futuristic environment conjured by Blade Runner was so soaked in atmosphere, and so unbelievable in its complexity, that I fell for it hook, line and sinker. I’d never seen anything like it before.

Frankly, I don’t think I’ve seen anything like it since.

This miniature cityscape for "Blade Runner" was constructed on its side, so as to be aligned correctly for the camera

“In order to get aerial views of some of the cityscapes, the miniature structures were tilted sideways and aligned individually at varying angles so as to appear correct to the barrel distortion of the camera’s wide-angle lens. Numerous in-camera passes were required to balance external and practical lighting. Separate multi-pass film elements were also created for the various billboard and spinner insertions. Like most of the other miniature work, the cityscapes were filmed in smoke and augmented optically with rain.” Original caption: “2020 Foresight” by Don Shay, Cinefex 9, July 1982.

Now, in the spring of 2015 – just four years before Blade Runner’s predicted future is due to arrive, and just weeks after Alcon Entertainment announced that Ford would return in a sequel to be directed by Denis Villeneuve – I find myself anticipating the film all over again. As part of its Sci-Fi: Days of Fear and Wonder season, the British Film Institute is bringing Blade Runner back to cinemas across the UK.

In advance of the release, the BFI has prepared a brand new trailer. Here’s what director Ridley Scott had to say about it:

“The Final Cut is my definitive version of Blade Runner, and I’m thrilled that audiences will have the opportunity to enjoy it in the way I intended – on the big screen. This new trailer captures the essence of the film and I hope will inspire a new generation to see Blade Runner when it is re-released across the UK on 3 April.”

The version that’s being released theatrically is the 2007 digitally remastered Blade Runner: The Final Cut, which is different to the 1982 original in a number of crucial respects. For example, it lacks both the tacked-on happy ending and the controversial Deckard voiceover (regarded by many as clumsy and unnecessary). Equally controversial is the most notable addition: a Deckard dream sequence featuring a unicorn. The unicorn’s appearance suggests – via Deckard’s uneasy relationship with his detective colleague, Gaff – that our hero may be a replicant himself …

Blade Runner: The Final Cut also features myriad other changes, including tweaks to both edit and soundtrack, a dusting of new shots, and a number of “fixes” and upgraded visual effects, executed primarily by The Orphanage, supervised by Jon Rothbart, with additional shots supplied by Lola VFX.

I asked Stu Maschwitz, co-founder of The Orphanage, what it was like treading on the hallowed ground of Los Angeles, 2019:

I’m very proud of The Orphanage’s work on Blade Runner: The Final Cut. We all truly felt a sense of reverence, working to preserve a film that meant a lot to us, and everyone involved was completely committed to doing the work at the highest possible quality. It’s a touchy thing, trying to tastefully update a classic and beloved film, but The Final Cut is, in my opinion, a perfect example of how to do it right.

One of the first questions I asked when I found out we were doing the work was, “Are we going to paint out the Steadicam shadow in the final chase through the Bradbury building?” Being a huge fan of Blade Runner, that camera shadow was something I’d seen, and wondered about, a hundred times. The answer was yes, and it was an incredibly difficult shot, replacing an entire wall behind layers of shadow and aerial haze, tracking through the complex warp of an anamorphic lens.

We did all that work in Flame, on a 4K scan of an interpositive that was the highest quality original they could find. We were almost done when the production managed to locate the original negative. We scanned that at 4K and started the work completely over from scratch! That’s how committed to doing it right everyone was on the production.

A police spinner comes in to land in Ridley Scott's "Blade Runner"

Are all the changes in The Final Cut necessary? The question is moot. This version exists, so live with it. Personally, I like The Final Cut best out of all the versions of this timeless classic. But I’d be just as happy to watch the original when Blade Runner appears on cinema screens again next month. I’m just glad of the chance to submerge myself once more in that dark and dazzling world of future noir.

The reason for my enthusiasm is simple. When you leave a showing of Blade Runner, the only possible thing you can say is to echo the words of Roy Batty during the film’s closing scenes, as he sits on the rooftop beneath the tears of that endless, future rainstorm:

“I’ve seen things you people wouldn’t believe.”

What are your memories of Blade Runner? Were you there in 1982, or are you one of the millions who discovered this sci-fi classic later on home video, DVD or Blu-ray? Which version do you prefer?

And here’s the biggie … IS Rick Deckard a replicant?

L is for Lidar

In the VFX ABC, the letter “L” stands for “Lidar”.

Making movies has always been about data capture. When the Lumière brothers first pointed their primitive camera equipment at a steam locomotive in 1895 to record Arrivée d’un train en gare de La Ciotat, what were they doing if not capturing data? In the 1927 movie The Jazz Singer – the first full-length feature to use synchronised sound – when Al Jolson informed an eager crowd, “You ain’t heard nothin’ yet!”, what was the Warner Bros. microphone doing? You guessed it: capturing data.

Nowadays, you can’t cross a movie set without tripping over any one of a dozen pieces of data capture equipment. Chances are you’ll even bump into someone with the job title of “data wrangler”, whose job it is to manage the gigabytes of information pouring out of the various pieces of digital recording equipment.

And in the dead of night, if you’re very lucky, you may even spy that most elusive of data capture specialists: the lidar operator.

Lidar has been around long enough to become commonplace. If you read behind-the-scenes articles about film production, you’ll probably know that lidar scanners are regularly used to make 3D digital models of sets or locations. The word has even become a verb, as in, “We lidared the castle exterior.” Like all the other forms of data capture, lidar is everywhere.

But what exactly is lidar? What does the word stand for, and how do those scanners work? And just how tough is it to scan a movie set when there’s a film crew swarming all over it?

To answer these questions and more, I spoke to Ron Bedard from Industrial Pixel, a Canadian company with incorporated offices in the USA that offers lidar, cyberscanning, HDR and survey services to the motion picture and television industries.

Ron Bedard lidar scanning in Toronto for "Robocop" (2014)

Ron Bedard lidar scanning in Toronto for “Robocop” (2014)

What’s your background, Ron, and how did you get into the lidar business?

I was a commercial helicopter pilot for 17 years, as well as an avid photographer. During my aviation career, I became certified as an aircraft accident investigator – I studied at Kirtland Air Force Base in New Mexico. I also got certified as a professional photographer, and following that as a forensic photographer.

At my aircraft accident investigation company, we utilised scanning technology to document debris fields. We used little hand-held laser scanners to document aircraft parts, and sent the data back to the manufacturers to assess tolerances.

How did you make the leap then into motion pictures?

The transition wasn’t quite that abrupt. Local businesses started to find out that I had scanners, and we began to get calls, saying, “Hey, we make automotive parts, and we have this old 1967 piston head, and we want to start machining them. Can you scan this one part and reverse engineer it for us?” Or there were these guys who made bathtubs, who said, “We don’t want to use fibreglass any more.” So we scanned their tubs to create a profile for their CNC machine.

Carl Bigelow lidar scans a Moroccan market set

Carl Bigelow lidar scans a Moroccan market set

Let’s talk about lidar. What is it, and how does it work?

Lidar means light detection and ranging. It works by putting out a pulse, or photon, of light. The light hits whatever it hits – whether it’s an atmospheric phenomenon or a physical surface – and bounces back to the sensor, which then records the amount of time that it’s taken for that photon to return.
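The time-of-flight principle Bedard describes reduces to simple arithmetic: range is half the round-trip time multiplied by the speed of light, since the pulse travels out and back. A back-of-the-envelope sketch:

```python
# Time-of-flight ranging: distance = (speed of light x return time) / 2.
# The division by two accounts for the pulse's round trip.

C = 299_792_458.0  # speed of light in a vacuum, metres per second

def range_from_return_time(seconds):
    """Distance to the surface that reflected the pulse, in metres."""
    return C * seconds / 2.0

# A return after roughly 66.7 nanoseconds puts the surface about 10 m away.
d = range_from_return_time(66.7e-9)
```

The nanosecond timescales involved are why lidar sensors need such precise clocks.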

Does a lidar scanner incorporate GPS? Does it need to know where it is in space?

Only if your lidar sensor is physically moving, or if it is incorporated into the scanner system, because the lidar is always going to give you the XYZ information relative to the sensor. Most terrestrial-based lidar systems are predicated on the sensor being in a single location. If you’re moving that sensor, you have to attribute where that sensor is in three-dimensional space so you can compensate the XYZ values of each measurement point. That’s commonly used in airborne lidar systems.
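The compensation Bedard mentions amounts to applying the sensor's pose to every measurement. A minimal sketch, showing only translation plus a yaw rotation for brevity (a real airborne system would use the full orientation from an inertial unit):

```python
# Each lidar point is XYZ relative to the sensor. If the sensor moves,
# its world position and heading at the moment of measurement must be
# applied to express the point in world space.

import math

def sensor_to_world(point, sensor_pos, yaw_rad):
    """Rotate a sensor-relative point by the sensor's heading, then
    translate by the sensor's world position."""
    x, y, z = point
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    wx = sensor_pos[0] + c * x - s * y
    wy = sensor_pos[1] + s * x + c * y
    wz = sensor_pos[2] + z
    return (wx, wy, wz)
```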

What kind of data does the scanner output?

Every software suite does it a little differently, but they all start with a point cloud. We do offer a modelling service, but primarily what we end up providing our clients is an OBJ – a polygonal mesh created from the point cloud – as well as the raw point cloud file.
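For readers unfamiliar with the deliverable Bedard mentions: Wavefront OBJ is a plain-text format, with one `v x y z` line per vertex and one `f i j k` line (1-indexed) per triangle of the reconstructed mesh. A toy writer makes the structure clear:

```python
# Minimal OBJ serialiser for a triangle mesh reconstructed from a point
# cloud. Real exports also carry normals, UVs and materials; this shows
# only the vertex and face records.

def write_obj(vertices, triangles):
    """Serialise a triangle mesh to Wavefront OBJ text."""
    lines = [f"v {x} {y} {z}" for x, y, z in vertices]
    # OBJ face indices are 1-based, hence the +1.
    lines += [f"f {a + 1} {b + 1} {c + 1}" for a, b, c in triangles]
    return "\n".join(lines) + "\n"

obj_text = write_obj(
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(0, 1, 2)],
)
```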

It sounds like a lot of data. How do you manage it all?

Our scanner captures over 900,000 points per second. And a large movie set may require over 100 scans. That generates a massive amount of data – too much for a lot of people to work with. So we provide our clients with the individual point clouds from each of the scans, as well as a merged point cloud that has been resurfaced into a polygonal mesh. Instead of making the entire model super-high resolution, we create a nice, clean scene. Then, if they want some part at higher resolution, they let us know and we create it from the original raw point cloud. If they have the point cloud themselves, they just highlight a certain area and work from that.

So you’re effectively giving them a long shot, together with a bunch of close-ups.


Is lidar affected by the weather?

Rain can create more noise, because anything that affects the quality of the light will affect the quality of the scan data. And wet surfaces have a layer of reflectivity on top. Then there are the effects of the weather on the technology itself. Our modern system has a laser beam that comes out of the sensor and hits a spinning mirror, bouncing the light off at 90°. So if you get a raindrop on that mirror, that can certainly affect where the photons are travelling to.

How do you get around that?

Well, here on the west coast, if you can’t scan in the rain, basically you’re not scanning from November until April! We’ve built a rain umbrella system for our scanners, so we can scan in the rain. We obviously can’t scan straight up, but we can point upwards at about a 60° or 70° angle, and all the way down to the ground.

Is cyberscanning an actor the same as lidar?

No, it’s completely different. You have to think of lidar as a sledgehammer – the point cloud generated is not of a high enough resolution to be able to capture all those subtle details of the human face. So when it comes to scanning people, there are other technologies out there, such as structured white light scanning or photogrammetry, which are better suited to the task.

Do you find actors are used to the process of being scanned now?

For the most part, I think they are. I think there’s still some caution. It’s not that the technology is new – it’s more about the ability to re-create somebody digitally. Some people are cautious about that, because they’re never sure how their likeness might be used in the future.

Do they worry about safety?

When laser-based systems first started being utilised on film, there was a lot more hesitation from a personal safety point of view. But the amount of ordinary white light that’s being emitted from our little hand-held scanners is less than a flashlight. I have had people say, “I can feel the scanner entering into me!” And I say, “No, you can’t!” So there is still a little bit of magic and mystery to it, but that’s only because people don’t know exactly what it’s doing.

A mailroom is lidar scanned to capture the conveyor system prior to creating digital set extensions

Tell us about photogrammetry.

With photogrammetry, you take enough photos of a subject that you have a lot of overlap. Then you use software to look for common points within each of the images – the software can tell where that pixel is in each image, and its relationship to each neighbouring pixel.

One of the challenges with photogrammetry is that there is no sense of scale. If you have one set of images of a full-scale building, and another of a miniature building, the software isn’t smart enough to figure out that one is smaller than the other. It just re-creates the three-dimensionality.

So you have to cross-refer that with survey data?

Yes. Or you try to place something in the images, like a scale strip or measuring tape, so that when you’re creating your photogrammetric model, you can say, “Hey, from this pixel to that pixel is one metre.” You can then attribute scale to the entire model. Lidar, on the other hand, is solely a measurement tool and captures scale accurately.
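The scale trick described above can be sketched simply: a known real-world length visible in the photos gives a factor between model units and metres, and every coordinate in the reconstruction is rescaled by it. All names and values here are illustrative, not part of any particular photogrammetry package.

```python
import math

def scale_factor(model_point_a, model_point_b, true_distance_metres):
    """Ratio between a known real-world distance and its model-space length."""
    model_length = math.dist(model_point_a, model_point_b)
    return true_distance_metres / model_length

# Suppose the two ends of a one-metre strip came out 0.25 units apart in the
# unscaled model: every coordinate then scales up by a factor of 4.
factor = scale_factor((0.0, 0.0, 0.0), (0.25, 0.0, 0.0), 1.0)
scaled_model = [(x * factor, y * factor, z * factor)
                for x, y, z in [(0.5, 0.5, 0.0), (1.0, 0.0, 0.25)]]
print(factor)  # prints 4.0
```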

When you’re working on a feature film, would you typically be hired by the production, or by an individual VFX company?

Every job is a little different. It usually works out to be about fifty-fifty.

Is there such a thing as a typical day on set?

No. Every day is a new day, with new challenges, new scenes, new sets, new people. That’s part of the beauty of the job: the variety. You’re not showing up to work Monday to Friday, 9 to 5, sitting in a cubicle and pushing paper.

Do you get a slot on the call sheet, or do you just scurry around trying not to get in people’s way?

If we’re doing lidar, nine times out of ten we’re there when nobody else is there. If we’re trying to create our digital double of the set with people running around, that creates noisier data and possible scan registration issues. So we do a lot of night work, when they’ve finished filming.

If we’re on location, scanning an outdoor scene downtown for example, usually the night-time is best anyway, for a couple of reasons. First, you’re going to get a lot less interference from people and traffic. Second, if there are lots of skyscrapers with glass facades, you can get a lot of noise in the scanning data as the sun is reflecting off the buildings.

You must be constantly up against the clock, having to get the scans done before the sets are struck.

Yes. A lot of times, we’ll actually be in there scanning while they’re breaking the set down! We just try to be one step ahead. We’re used to it – it’s just the nature of the business. There’s such a rapid turnaround now as far as data collection is concerned. You’ve just got to get in and get out.

So it’s all about mobility and fast response?

Exactly. One of the things that our customers really appreciate is our ability to be very portable. All of our systems – whether it’s cyberscanning or lidar – pack up into no more than three Pelican cases. And we can be on a plane, flying anywhere in the world.

Ron Bedard and Ian Galley lidar scan the 300-foot naval destroyer HMCS "Haida" in Hamilton, Ontario

Is it hard to keep up with scanning technology as it develops?

Oh, absolutely. We’re dogs chasing our tails. With today’s rapid advancements, if you can get three years out of a technology, maybe four, you’re lucky.

Is there any future or near-future piece of technology you’ve got your eye on?

I think photogrammetry is really making a comeback. It’s been used ever since cameras were invented, and from an aerial survey point of view since long before World War II. But it’s made a real resurgence of late, and that really has to do with the resolution of the sensors that are now available. Now that you’re talking high numbers of megapixels, you’re able to get much finer detail than you were in days past.

As these high-density sensors come down in price, and get incorporated into things like smartphones, I think we’ll see 3D photography – in combination with a sonar- or a laser-based system to get the scale – really hitting the market hard.

And what does the future hold for lidar?

I think flash lidar will become much more prevalent. Instead of a single pulse of light, flash lidar sends out thousands of photons at once. It can fire at a really rapid rate. They use flash lidar on spacecraft for docking. They use it on fighter jets for aerial refuelling. You’re starting to see low-cost flash lidar systems being incorporated into bumpers on vehicles for collision avoidance.

So what are the benefits of flash lidar for the film business?

When you’re trying to do motion tracking, instead of putting balls on people and using infra-red sensors, you can use flash lidar instead. It is much more versatile in long-range situations. You can create an environment with flash lidar firing at 24 frames per second, and capture anyone who walks within that environment. That’s something I know we’re going to see a lot more of in the future.

Alex Shvartzman uses a handheld structured light device to scan a horse

What’s the weirdest thing you’ve ever had to scan?

Everything’s weird. We’ve scanned horses. We’ve scanned dogs. The beauty of working in film is that one day we can be scanning a Roman villa, and that evening be scanning the set of some futuristic robot movie.

Animals are tricky because each one is different, and you never know how they’re going to react to the light source. We scanned around thirty horses for one particular job, and some of them were happy and docile, and some of them reacted as soon as the scanner started.

Another challenging question we were asked was, “Can you scan a boat that’s floating out in the open sea?” I thought about it and said, “Sure you can. You’ve just got to have the scanner move the same way the boat’s moving.” We built a custom rig so that the scanner was constantly moving with the boat, and we hung it out over the edge of the boat and scanned the whole hull.

Lidar providers are among the many unsung heroes of movies. Do you ever crave the limelight?

No. In the end, our job is to provide solutions for our customers. For us, that’s the reward. When they’re happy, we’re happy.