L is for Lidar

by Graham Edwards

In the VFX ABC, the letter “L” stands for “Lidar”.

Making movies has always been about data capture. When the Lumière brothers first pointed their primitive camera equipment at a steam locomotive in 1895 to record Arrivée d’un train en gare de La Ciotat, what were they doing if not capturing data? In the 1927 movie The Jazz Singer – the first full-length feature to use synchronised sound – when Al Jolson informed an eager crowd, “You ain’t heard nothin’ yet!”, what was the Warner Bros. microphone doing? You guessed it: capturing data.

Nowadays, you can’t cross a movie set without tripping over one of a dozen pieces of data capture equipment. Chances are you’ll even bump into someone with the job title of “data wrangler”, whose job it is to manage the gigabytes of information pouring out of the various pieces of digital recording equipment.

And in the dead of night, if you’re very lucky, you may even spy that most elusive of data capture specialists: the lidar operator.

Lidar has been around long enough to become commonplace. If you read behind-the-scenes articles about film production, you’ll probably know that lidar scanners are regularly used to make 3D digital models of sets or locations. The word has even become a verb, as in, “We lidared the castle exterior.” Like all the other forms of data capture, lidar is everywhere.

But what exactly is lidar? What does the word stand for, and how do those scanners work? And just how tough is it to scan a movie set when there’s a film crew swarming all over it?

To answer these questions and more, I spoke to Ron Bedard from Industrial Pixel, a Canada-based company with offices in the USA that offers lidar, cyberscanning, HDR and survey services to the motion picture and television industries.

Ron Bedard lidar scanning in Toronto for “Robocop” (2014)

What’s your background, Ron, and how did you get into the lidar business?

I was a commercial helicopter pilot for 17 years, as well as an avid photographer. During my aviation career, I became certified as an aircraft accident investigator – I studied at Kirtland Air Force Base in New Mexico. I also got certified as a professional photographer, and following that as a forensic photographer.

At my aircraft accident investigation company, we utilised scanning technology to document debris fields. We used little hand-held laser scanners to document aircraft parts, and sent the data back to the manufacturers to assess tolerances.

How did you make the leap then into motion pictures?

The transition wasn’t quite that abrupt. Local businesses started to find out that I had scanners, and we began to get calls, saying, “Hey, we make automotive parts, and we have this old 1967 piston head, and we want to start machining them. Can you scan this one part and reverse engineer it for us?” Or there were these guys who made bathtubs, who said, “We don’t want to use fibreglass any more.” So we scanned their tubs to create a profile for their CNC machine.

Carl Bigelow lidar scans a Moroccan market set

Let’s talk about lidar. What is it, and how does it work?

Lidar means light detection and ranging. It works by putting out a pulse, or photon, of light. The light hits whatever it hits – whether it’s an atmospheric phenomenon or a physical surface – and bounces back to the sensor, which then records the amount of time that it’s taken for that photon to return.
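
The arithmetic behind that round trip is simple to sketch. The snippet below is purely illustrative – the timing value is invented – but it shows how a return time becomes a distance:

```python
# Time-of-flight ranging: the scanner times how long a pulse takes to
# bounce back, then halves the round-trip path to get the distance.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the surface, given the round-trip travel time of a pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return after 200 nanoseconds puts the surface roughly 30 metres away.
print(range_from_round_trip(200e-9))  # ~29.98
```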

Does a lidar scanner incorporate GPS? Does it need to know where it is in space?

Only if your lidar sensor is physically moving, or if GPS is incorporated into the scanner system, because the lidar is always going to give you the XYZ information relative to the sensor. Most terrestrial-based lidar systems are predicated on the sensor being in a single location. If you’re moving that sensor, you have to know where that sensor is in three-dimensional space so you can compensate the XYZ values of each measurement point. That’s commonly used in airborne lidar systems.
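
To make that compensation concrete, here is a minimal sketch (using NumPy, with hypothetical names – not any particular scanner’s processing software) of how sensor-relative measurements are re-expressed in world coordinates once the sensor’s position and orientation are known:

```python
import numpy as np

def sensor_points_to_world(points_xyz: np.ndarray,
                           sensor_rotation: np.ndarray,
                           sensor_position: np.ndarray) -> np.ndarray:
    """Re-express sensor-relative XYZ measurements in world coordinates.

    points_xyz      : (N, 3) points as the scanner reports them, relative to the sensor
    sensor_rotation : (3, 3) orientation of the sensor in the world frame (e.g. from an IMU)
    sensor_position : (3,)   position of the sensor in the world frame (e.g. from GPS)
    """
    # Rotate each measurement into the world frame, then shift by the sensor position.
    return points_xyz @ sensor_rotation.T + sensor_position

# For a static tripod scan the rotation is the identity and the position the origin,
# so the points pass through unchanged, which is why a fixed scanner needs no GPS.
```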

What kind of data does the scanner output?

Every software suite does it a little differently, but they all start with a point cloud. We do offer a modelling service, but primarily what we end up providing our clients is an OBJ – a polygonal mesh created from the point cloud – as well as the raw point cloud file.
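
Industrial Pixel doesn’t describe its own toolchain here, but the general point-cloud-to-OBJ step can be sketched with the open-source Open3D library; the file names and parameters below are placeholders:

```python
import open3d as o3d

# Load a registered point cloud (file name is just an example).
pcd = o3d.io.read_point_cloud("merged_set_scan.ply")

# Surface reconstruction needs normals; estimate them from local neighbourhoods.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

# Poisson reconstruction turns the point cloud into a triangle mesh.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)

# Write the polygonal mesh out as an OBJ for the client.
o3d.io.write_triangle_mesh("merged_set_scan.obj", mesh)
```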

It sounds like a lot of data. How do you manage it all?

Our scanner captures over 900,000 points per second. And a large movie set may require over 100 scans. That generates a massive amount of data – too much for a lot of people to work with. So we provide our clients with the individual point clouds from each of the scans, as well as a merged point cloud that has been resurfaced into a polygonal mesh. So, instead of making the entire model super-high resolution, we create a nice, clean scene. Then, if they want some part at higher resolution, they let us know and we create it from the original raw point cloud. If they have the point cloud themselves, they just highlight a certain area and work from that.

So you’re effectively giving them a long shot, together with a bunch of close-ups.

Exactly.
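
To get a feel for the numbers, here is a rough back-of-envelope calculation. The points-per-second rate and the scan count come from the interview; the scan duration and bytes per point are assumptions made purely for illustration:

```python
points_per_second = 900_000   # from the interview
scans_per_set = 100           # "a large movie set may require over 100 scans"
seconds_per_scan = 5 * 60     # assumed: five minutes per scan position
bytes_per_point = 26          # assumed: XYZ as doubles plus a 16-bit intensity value

total_points = points_per_second * seconds_per_scan * scans_per_set
gigabytes = total_points * bytes_per_point / 1e9

print(f"{total_points / 1e9:.0f} billion points, roughly {gigabytes:.0f} GB of raw data")
# 27 billion points, roughly 702 GB of raw data
```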

Is lidar affected by the weather?

Rain can create more noise, because anything that affects the quality of the light will affect the quality of the scan data. And wet surfaces have a layer of reflectivity on top. Then there’s the effects of the weather on the technology itself. Our modern system has a laser beam that comes out of the sensor and hits a spinning mirror, bouncing the light off at 90°. So if you get a raindrop on that mirror, that can certainly affect where the photons are travelling to.

How do you get around that?

Well, here on the west coast, if you can’t scan in the rain, basically you’re not scanning from November until April! We’ve built a rain umbrella system for our scanners, so we can scan in the rain. We obviously can’t scan directly straight up, but we can point upwards at about a 60° or 70° angle, and all the way down to the ground.

Is cyberscanning an actor the same as lidar?

No, it’s completely different. You have to think of lidar as a sledgehammer – the point cloud generated is not of a high enough resolution to be able to capture all those subtle details of the human face. So when it comes to scanning people, there are other technologies out there, such as structured white light scanning or photogrammetry, which are better suited to the task.

Do you find actors are used to the process of being scanned now?

For the most part, I think they are. I think there’s still some caution. It’s not that the technology is new – it’s more about the ability to re-create somebody digitally. There are some people who have cautions about that, because they’re never sure how their likeness might be used in the future.

Do they worry about safety?

When laser-based systems first started being utilised on film, there was a lot more hesitation from a personal safety point of view. But the amount of ordinary white light that’s being emitted from our little hand-held scanners is less than that of a flashlight. I have had people say, “I can feel the scanner entering into me!” And I say, “No, you can’t!” So there is still a little bit of magic and mystery to it, but that’s only because people don’t know exactly what it’s doing.

A mailroom is lidar scanned to capture the conveyor system prior to creating digital set extensions

Tell us about photogrammetry.

With photogrammetry, you take enough photos of a subject that you have a lot of overlap. Then you use software to look for common points within each of the images – the software can tell where that pixel is in each image, and its relationship to each neighbouring pixel.
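
That matching step can be illustrated with the open-source OpenCV library – a stand-in for whatever a commercial photogrammetry package does internally; the image names are placeholders:

```python
import cv2

# Find feature points shared by two overlapping photos of the same subject.
img1 = cv2.imread("set_photo_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("set_photo_02.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Each match ties a pixel in one photo to the same physical point in the other;
# a full photogrammetry pipeline triangulates such correspondences into 3D.
print(f"{len(matches)} candidate correspondences between the two photos")
```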

One of the challenges with photogrammetry is that there is no sense of scale. If you have one set of images of a full-scale building, and another of a miniature building, the software isn’t smart enough to figure out that one is smaller than the other. It just re-creates the three-dimensionality.

So you have to cross-refer that with survey data?

Yes. Or you try to place something in the images, like a strip or measuring tape, so that when you’re creating your photogrammetric model, you can say, “Hey, from this pixel to that pixel is one metre.” You can then attribute scale to the entire model. Lidar, on the other hand, is solely a measurement tool and accurately measures the scale.
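
Applying that scale is a one-line correction once you have two reconstructed points with a known real-world separation. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def scale_model(vertices: np.ndarray,
                point_a: np.ndarray,
                point_b: np.ndarray,
                known_distance_m: float) -> np.ndarray:
    """Rescale an unscaled photogrammetric model from one known measurement.

    point_a, point_b : two reconstructed points whose real separation is known,
                       e.g. the ends of a measuring tape placed in the photos.
    known_distance_m : the real-world distance between them, in metres.
    """
    scale = known_distance_m / np.linalg.norm(point_b - point_a)
    # One uniform factor puts the whole model into metres.
    return vertices * scale
```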

When you’re working on a feature film, would you typically be hired by the production, or by an individual VFX company?

Every job is a little different. It usually works out to be about fifty-fifty.

Is there such a thing as a typical day on set?

No. Every day is a new day, with new challenges, new scenes, new sets, new people. That’s part of the beauty of the job: the variety. You’re not showing up to work Monday to Friday, 9 to 5, sitting in a cubicle and pushing paper.

Do you get a slot on the call sheet, or do you just scurry around trying not to get in people’s way?

If we’re doing lidar, nine times out of ten we’re there when nobody else is there. If we’re trying to create our digital double of the set with people running around, that creates noisier data and possible scan registration issues. So we do a lot of night work, when they’ve finished filming.

If we’re on location, scanning an outdoor scene downtown for example, usually the night-time is best anyway, for a couple of reasons. First, you’re going to get a lot less interference from people and traffic. Second, if there are lots of skyscrapers with glass facades, you can get a lot of noise in the scanning data as the sun is reflecting off the buildings.

You must be constantly up against the clock, having to get the scans done before the sets are struck.

Yes. A lot of times, we’ll actually be in there scanning while they’re breaking the set down! We just try to be one step ahead. We’re used to it – it’s just the nature of the business. There’s such a rapid turnaround now as far as data collection is concerned. You’ve just got to get in and get out.

So it’s all about mobility and fast response?

Exactly. One of the things that our customers really appreciate is our ability to be very portable. All of our systems – whether it’s cyberscanning or lidar – pack up into no more than three Pelican cases. And we can be on a plane, flying anywhere in the world.

Ron Bedard and Ian Galley lidar scan the 300-foot naval destroyer HMCS “Haida” in Hamilton, Ontario

Is it hard to keep up with scanning technology as it develops?

Oh, absolutely. We’re dogs chasing our tails. With today’s rapid advancements, if you can get three years out of a technology, maybe four, you’re lucky.

Is there any future or near-future piece of technology you’ve got your eye on?

I think photogrammetry is really making a comeback. It’s been used ever since cameras were invented, and from an aerial survey point of view since long before World War II. But it’s made a real resurgence of late, and that really has to do with the resolution of the sensors that are now available. Now that you’re talking high numbers of megapixels, you’re able to get much finer detail than you were in days past.

As these high-density sensors come down in price, and get incorporated into things like smartphones, I think we’ll see 3D photography – in combination with a sonar- or a laser-based system to get the scale – really hitting the market hard.

And what does the future hold for lidar?

I think flash lidar will become much more prevalent. Instead of a single pulse of light, flash lidar sends out thousands of photons at once. It can fire at a really rapid rate. They use flash lidar on spacecraft for docking. They use it on fighter jets for aerial refuelling. You’re starting to see low-cost flash lidar systems being incorporated into bumpers on vehicles for collision avoidance.

So what are the benefits of flash lidar for the film business?

When you’re trying to do motion tracking, instead of putting balls on people and using infra-red sensors, you can use flash lidar instead. It is much more versatile in long-range situations. You can create an environment with flash lidar firing at 24 frames per second, and capture anyone who walks within that environment. That’s something I know we’re going to see a lot more of in the future.

Alex Shvartzman uses a handheld structured light device to scan a horse

What’s the weirdest thing you’ve ever had to scan?

Everything’s weird. We’ve scanned horses. We’ve scanned dogs. The beauty of working in film is that one day we can be scanning a Roman villa, and that evening be scanning the set of some futuristic robot movie.

Animals are tricky because each one is different, and you never know how they’re going to react to the light source. We scanned around thirty horses for one particular job, and some of them were happy and docile, and some of them reacted as soon as the scanner started.

Another challenging question we were asked was, “Can you scan a boat that’s floating out in the open sea?” I thought about it and said, “Sure you can. You’ve just got to have the scanner move the same way the boat’s moving.” We built a custom rig so that the scanner was constantly moving with the boat, and we hung it out over the edge of the boat and scanned the whole hull.

Lidar providers are among the many unsung heroes of movies. Do you ever crave the limelight?

No. In the end, our job is to provide solutions for our customers. For us, that’s the reward. When they’re happy, we’re happy.
