Rewind to “Back to the Future”

"Back to the Future" bluescreen stage photography and final composite by Industrial Light & Magic

Great Scott! It’s nearly October! That means our lives are about to intersect with a monumental moment on one of cinema’s trickiest timelines – the day when Marty McFly arrives in the Hill Valley of the future, in Robert Zemeckis’s 1989 sequel Back to the Future Part II.

Yes folks, 21 October 2015 is “Back to the Future Day”.

Across the world, fans of the classic time-travel trilogy are looking forward to a whole slew of events dedicated to celebrating this fantasy watershed moment. Your local theatre may be running a double or triple feature of the films. Some of the big venues are showing Back to the Future with live orchestra playing Alan Silvestri’s memorable score. There are panels and charity galas galore. I’ll bet there are even a few high schools putting on their very own “Enchantment Under the Sea” dance.

ILM modelshop supervisor Steve Gawley checks the internal lighting systems of the one-fifth scale replica DeLorean constructed for “Back to the Future”.

But I’m not here to look forward. I’m here to look back – to 1985, when Back to the Future was first released.

In those halcyon days, I was an impoverished art student living in London. Like my fellow Brits, I frequently had to wait for what felt like a lifetime to see films that had been released in the States months earlier – films like Back to the Future, which hit North American screens in July but didn’t reach good old Blighty until just before Christmas.

The incredible thing is that, despite my film-geek credentials and voracious appetite for movie news, when the film finally landed I knew almost nothing about it. I’d seen a trailer, but it hadn’t stuck in my mind. Spielberg’s producer credit was a good sign, but who was this Zemeckis fellow? Oh yeah, the Romancing the Stone guy. Well, that was an okay film, I suppose …

It’s hard to imagine such ignorance now. Everywhere we look we’re bombarded by film publicity. To go into a film cold, you have to consciously engage in a total media blackout because, let’s face it, your average 21st century movie trailer does more than just tease – it lays out all three acts, and spoils at least six big action scenes and a dozen key reveals.

Sometimes I really do wish I had a time machine.

Michael Lantieri and his physical effects team created specialised rigging for the hoverboard chase in “Back to the Future Part II”, including this crane-suspended camera platform and wire-rigged hoverboard.

Another benefit of watching Back to the Future in 1985 was the venue. For those of you who remember the Empire, Leicester Square, the way it used to be, you’ll know exactly what I mean. No heartless multiplex this. The main auditorium was roughly the size of Texas. The screen, curved to geometric perfection, was concealed behind acres of ruched and ruffled curtains. Just walking in was like ascending to heaven.

When you’d taken your seat, the pre-show began.

If memory serves, the pre-show for Back to the Future involved a scanning laser being fired at those gorgeously draped curtains, creating a kaleidoscope of fire that gyrated in perfect synchrony with a presentation of Jean Michel Jarre’s Oxygène loud enough to make your whole body break out in gooseflesh. Once that was over, the curtains unpeeled and the dazzled audience was granted a brief respite courtesy of a few short reels of adverts and previews, during which they were able to recover their proper senses.

Then the movie began.

I’ll be honest – at first, I wasn’t sure what to make of Back to the Future. I didn’t recognise any of the actors, and I wasn’t at all sure where the story was going. The scenes in Marty’s house seemed oddly paced, dwelling curiously on the offbeat reminiscences of his dorky parents. Maybe they’d become relevant later in the movie … Still, the whole thing had a fun feel, just right for the holiday season. Maybe it would warm up.

Of course, it did. The instant Marty arrived at Twin Pines Mall and met Doc Brown, it started to win me over. The chemistry between Michael J. Fox and Christopher Lloyd made their every exchange a delight. I felt the smile growing on my face with each new turn of that pivotal scene – the DeLorean reveal, Einstein’s first trip through time (those iconic trails of fire!), the appearance of the terrorists, their shocking revenge on Doc Brown and the breakneck pursuit of Marty around the shopping mall car park leading ultimately to his escape into the past, all to the accompaniment of the most exuberant musical score I’d heard for years.

By the time Marty had been catapulted back to the year 1955, I realised that the Empire, Leicester Square, had indeed become its own little pocket of heaven. And heaven was where I remained for the rest of the film. It became an instant favourite, and is a favourite still – one of those rare films that captured lightning in a bottle, a film made with such confidence, such chutzpah, that you just knew its director must have been driven by a kind of visionary white heat. Zemeckis stated recently that he’ll resist any attempt to remake it. I hope he sticks to his guns. How can you improve on perfection?

ILM camera assistant Kate O’Neill programmes one of the computer-controlled functions on the miniature locomotive created for “Back to the Future Part III”. Seven feet long, the model boasted twenty-four mechanical gags and was controlled by two motion-control systems simultaneously.

Some time after seeing the movie for the first time (and second, and third …) I picked up a copy of Cinefex issue 24. To my delight, it included a feature on Back to the Future – quite an extensive one, considering the film contains fewer than thirty visual effects shots. I learned that Industrial Light & Magic turned those shots around in roughly eight weeks, an incredible accomplishment. The detailed article, written by Janine Pourroy, also confirmed my suspicions about Zemeckis: it seemed he was all over the visual effects, ensuring at all times that ILM’s wizardry was as true to his vision as the rest of his film.

Small though the number of effects shots may be, they still threw plenty of challenges at VFX supervisor Ken Ralston and his team. The “time-slice” effect was an incredibly complicated blend of practical lighting and optical trickery. The climactic lightning strike – described in Zemeckis’s and Bob Gale’s script as “the largest bolt of lightning in cinema history” – was created frame by frame using meticulously hand-drawn animation by Wes Takahashi.

Miniature composite shot by Industrial Light & Magic

And the film’s crowd-pleasing closing shot, in which the time-travelling DeLorean reveals its new flight mode shortly before bursting out of the movie screen, was a state-of-the-art optical composite boasting a manually-tracked match-move of a live-action plate, a meticulously-constructed miniature vehicle photographed under motion control, and some tricky hand-drawn rotoscoping to mask the airborne speedster as it swoops behind those distant trees.

Gazing back across the thirty years that lie between now and then, I’m filled with a fuzzy nostalgia. They say you can’t turn back the clock but, hey, this is Back to the Future we’re talking about.

So what should I do when “Back to the Future Day” finally comes around? Keep it simple and re-watch all three films in the comfort of my own home? Dress up in a life-preserver and gatecrash my local theatrical event? Rent a DeLorean and see if I can coax that sucker up to 88mph?

Never mind. I have a few weeks left to decide. If I run out of time, I can always fire up the flux capacitor and buy myself some extra breathing space. In the meantime, only one question remains:

What are you doing on “Back to the Future Day”?


Watch the trailer for the upcoming documentary Back in Time – due for release on 21 October 2015 – in which cast, crew, and fans explore the classic time-travel trilogy’s resonance throughout our culture:

Photographs copyright © 1985, 1989 and 1990 by Universal Studios Inc. and Industrial Light & Magic.

N is for New

In the VFX ABC, the letter “N” stands for “New”.

Writing on this blog a little over a year ago, I asked a panel of visual effects experts the following question: What cutting-edge technique or technology is getting you excited about the future of visual effects?

The question prompted some fascinating responses, which you can read in my article I is for Innovation. Collectively, they provide a snapshot of the world of VFX as seen by a range of industry professionals in July 2014.

It’s a snapshot that’s now a year out of date. That’s the thing with cutting edges – the darn things just keep on cutting.

That’s why I’ve decided to revisit the topic, one year on. Because innovation doesn’t go away. In fact, the desire to create something new appears to be hard-wired into the mind of the visual effects artist. And the industry itself, like so many others, is constantly evolving.

This time around, I wanted to hear not only about the latest techniques and technologies, but also the latest business trends. So I stripped my original question back to its simplest possible form:

What’s New in VFX?

How did our panel of experts respond? Let’s find out!


Virtual Reality

Michele Sciolette, Head of VFX Technology, Cinesite

There is a lot of expectation that 2016 will be the year when immersive technologies such as virtual and augmented reality will become mainstream. Current-generation devices have many limitations, but clearly show the potential for truly immersive experiences. This will inevitably drive demand for new types of entertainment. I expect that the ability to support and create content for immersive experiences will become a common task for visual effects houses in the relatively near future.

Aruna Inversin, CG/VR Supervisor, Digital Domain

With true virtual reality around the corner, content creators and studios are already building their teams and their pipelines to take advantage of this next wave of new immersive experiences, the likes of which people have never seen. Using positional tracking, high fidelity screens and haptic (touch-sensitive) inputs, we’ll see a surge in consumer consumption that hasn’t been matched since the invention of the television.

The virtual reality challenges awaiting visual effects artists are numerous – from multiple camera takes and multiple cameras, to 360° video and extremely long frame ranges. As visual effects artists, we’re at the beginning of this amazing ride, with technologies finally catching up to the visions in our heads. Not just on a screen in front of you, but wherever you may look.

Hubert Krzysztofik, Director of Interactive Technology, Pixomondo

The implications of VR, AR and interactive experiences mean that the VFX industry is undergoing historic change. The demand for talented game engine developers is as high as the demand for excellent VFX artists versed in the specifics of working within game engines. Game engines already have a nascent presence in the VFX industry and are increasingly being used in pre-visualisation and look development.

Currently, there are three major players in the game engine field: Epic Games’ Unreal Engine, Crytek’s CineBox and Unity Technologies’ Unity. From a business perspective, it’s important to be platform- and technology-agnostic. We identify the strengths of each engine and use them based on the project requirements.

An important part of VR development is the headset component. Currently, Oculus Rift, Sony Morpheus, Google Cardboard and HTC Vive are available in all the engines on the market. Ease of use and minimal bug issues are a major consideration in a VFX studio, which lives and dies by its pipeline.

It’s an exciting time for VR and visual effects, and we seem to be building toward an eventual merging of disciplines. I’m looking forward to seeing how VR continues to develop and improve, and how it can be beneficially integrated into the VFX industry and beyond.

Hannes Ricklefs, Head of Software (London, Bangalore), MPC Film

Virtual reality is definitely one of the areas with the most interest and investment being put into it. FMX and SIGGRAPH this year were loaded with talks, panels and demos around the topic. In general, VR is seen as the enabler for pushing the convergence of film and games technology. For fully immersive experiences, it needs the quality of VFX at the speed of a game engine.

One of the major excitements surrounding VR is that there are no established processes around the creation of VR content. Existing processes such as 2D stitching, stereo virtual cameras and camera tracking now need to work within a full 360° vision. Recognising the importance of this area, MPC has established MPC VR, a creative- and technology-led VR team focused on immersive exploration across multiple industries, and integrated with Technicolor’s R&I (Research and Innovation) group.

Karl Woolley, VR Lead, London, Framestore

Spend five seconds on the Oculus CV1 or HTC Vive, and you’ll immediately understand what a difference it brings to see your hands, grab objects and move around a space, as opposed to being sat in a locked-off, passive position.

Whether the VR experience is based on live-action, pre-rendered or generated in real-time, game engines are at its heart. They allow you to leverage the input devices, and to craft worlds and environments based on your VFX assets … after a bit of work! Game engines have come on leaps and bounds in terms of visual quality and performance in the last five years, with the folks at Epic (Unreal Engine) releasing dedicated tools to make the VFX-to-VR process even easier and more accessible for all in 2016.

With our roots in VFX, we traditionally focus on perfecting the pixel to tell a story. But VR is Virtual Reality, not just Visual Reality. 2016 will be the year we get natural, low-barrier methods of input, with Sony, HTC and Oculus all having consumer kit out making virtual reality available to the masses. That’s when we’ll truly see what the public’s appetite is for VR.


Videogame Engines

Michele Sciolette, Head of VFX Technology, Cinesite

The quality of real-time graphics has been improving at an incredible pace, and the output of modern game engines – supporting features such as physically based shading – is fast approaching the level of quality we need in high-end visual effects. The explosion of independent game publishers in recent years has led to new licensing options, making top quality engines accessible to everyone. Thus game engines could soon become a viable option for certain kinds of work – especially if you drop the constraint to run the engine in real-time.

In my opinion, the main obstacle still to overcome is pipeline integration. In order for visual effects houses to fully embrace video game engines, we need the ability to transparently move assets between a more traditional pipeline and one based on a game engine, to ensure consistency of the final look and minimise duplicated effort.

Watch the animated short A Boy and his Kite, created in Unreal Engine 4 and rendered in real-time at 30fps:


Integration into the Creative Process

Christian Manz, Creative Director, Film, Framestore

The biggest change I’ve observed in recent times is how VFX has been embraced as an important part of the filmmaking and storytelling process from start to finish, not just in post. Only the director and producers spend longer on a big movie than the VFX team, and in that time you get to collaborate with a lot of talented people across multiple departments to bring the director’s vision to the big screen. Being part of that creative process really excites me – it’s why I think there hasn’t been a better time to be involved in the world of VFX.

Chris MacLean, CG Supervisor, Mr.X

Over the last few years, there’s been a trend towards VFX houses having more creative input with respect to the storytelling process. Whether with a design element like a creature, or something as robust as editorial input, VFX artists are being given more creative responsibility. It’s exciting for us as it gives us an opportunity to be part of the action as opposed to simply facilitating it.

It used to be that you would do your budget, get your shots with your line-up sheets, and drop the shot into your pipeline. Now, we’re being asked to pitch previs or postvis ideas, help design camera moves, and collaborate with production design and art departments. In some cases, we’ve even helped redesign sequences in post. It’s nice to see growing respect for our contributions to the filmmaking process, and to be recognised as visual storytellers in our own right.


Perfect Integration of CG with Live-Action

Mark Wendell, CG Supervisor, Image Engine

One exciting industry advance that’s now becoming possible – with the proper setup – is the near-perfect integration of CG into live-action plates in a practical and efficient way.

A number of developments are contributing to this: improvements in path-tracing renderers, the adoption of physically plausible shading models, and the use of lookdev and lighting setups calibrated to reference photography. While this isn’t particularly “new”, the tools and techniques have reached the point where we’re finally seeing a huge payoff not only in terms of realism, but also in production efficiency.

Of course, getting the proper reference and building colour-calibrated setups requires a bit of up-front investment in time. But when it’s done properly, we’re seeing lighters achieve a nearly perfect plate-match on their first render, in any lighting environment. Amortised over multiple shots, that investment more than pays for itself, and that’s really exciting. Rather than spending endless iterations just getting their CG elements to match the plate, artists actually have the time to refine shots and give them that last five percent kiss of awesome.

Watch Image Engine’s VFX reel showcasing their work on Chappie:


Physically Plausible Shading

Howard Campbell, Lead Technical Director, Tippett Studio

Rendering of computer-generated images involves very complex calculations of light and material interactions – so complex, in fact, that we can only ever approximate the results. Traditionally, most render approaches have involved a high level of artist manipulation to arrive at a convincing render. But in recent years, as computer power has increased significantly, there’s been a shift towards more expensive – but much more accurate – physics-based rendering.

Physically plausible shading is a strategy whereby rays of light are traced through the scene according to the physical laws of energy conservation. It results in a more believable-looking result out of the box with much less artist tweaking. In Teenage Mutant Ninja Turtles (2014), for example, Tippett Studio used physically plausible shading to render the young turtles and sewer environments. This greatly reduced the need for the kinds of lighting and surface adjustments common under previous strategies, allowing us to do more shots in less time.

Watch Tippett Studio’s VFX reel showcasing their work on Teenage Mutant Ninja Turtles (2014):


Remote Resources and Cloud Computing

Rob Pieke, Software Lead, MPC Film

At the infrastructure level, one of the more interesting changes is a move towards on-demand remote resources for computing and storage. Platforms such as the recently re-released Zync Render offer opportunities to maintain a smaller “always busy” render farm on-site, but still have near-instant access to extra temporary resource.

Hannes Ricklefs, Head of Software (London, Bangalore), MPC Film

There have been great advances to enable cloud computing for large scale VFX productions. Once the remaining roadblocks – such as security concerns – are resolved, having this “infinite” resource available to more dynamically schedule and scale around the peaks and troughs of any VFX production will be a major game changer.

Kai Wolter, Software Lead, MPC Film

My list of what’s new in VFX includes:

  • Cloud computing and cloud rendering
  • Smarter ways to handle large data sets
  • Finite Element Method (FEM) for muscle simulation
  • VR in VFX (previs, postvis, final images)

Software for All

Damien Fagnou, Global Head of VFX Operations, MPC Film

In recent years, the rise of open source standards like USD, Alembic or OpenSubdiv has given studios really important foundation tools to create VFX. Alongside this, VFX-focused software platforms like Fabric Engine, authoring software like The Foundry’s Katana or Mari, and the move to fully raytraced renderers such as Pixar’s RenderMan RIS have dramatically changed the game for software development inside VFX studios, where the focus is now more on workflows and artist tools than on building an entire software stack for VFX from scratch. Furthermore, many of these software packages are now available for non-commercial use, giving students full access to the same toolset as that used by large VFX studios.

Manuel Huertas, CG Artist, Atomic Fiction

As a surfacing and lookdev artist, I’m very glad to see the most recent release of Mari (version 3) incorporating shading models from third-party vendors such as Chaos Group and Solid Angle. This will help artists get almost real-time feedback during the texturing workflow for certain materials, using shading models similar to the ones that will actually be constructed in Katana and used in production for final rendering.


Clarisse iFX

Ben VonZastrow, CG Painter, Tippett Studio

Clarisse iFX, by Isotropix, is part of a new breed of VFX tools that allow artists to work directly on the final image, instead of on a proxy image of variable quality and low accuracy. By forgoing OpenGL and focusing instead on a unified, lightning-fast renderer, Clarisse allows users to manipulate scenes containing trillions of polygons directly in the final render view.

City compositing in Clarisse iFX. Image copyright © 2015 by Isotropix SAS.


6k is the New 2k

Gresham Lochner, VFX Producer, Locktix

We’ve seen a trend recently with our clients increasing to a working format of 4k and 6k for VFX work. Traditionally, this would severely strain a small VFX facility. We’ve built out our infrastructure and pipeline with this work in mind – from the beginning, we’ve invested in a solid Open Drives infrastructure as our back end, as well as some proprietary code that sits on top of our hardware, allowing us to easily get around IO bottlenecks. Because of all of this, we don’t blink at 6k work – it’s become the norm.


Drones and Photogrammetry for Aerial Reference Gathering

Ray Sena, Environment Artist, Tippett Studio

With the variety of quadcopter drone rigs available to us, it’s now easy for artists to gather large amounts of reference cheaply and with limited time. For example, in a project we’re currently working on, we needed to obtain a huge number of reference photos over about a dozen different types of landscape in China. The VFX director and I had the ability to launch off-the-shelf quadcopters at each of our shoot locations and quickly capture a vast amount of material – the rigs were small enough to mount on to our hiking packs with all of our other camera gear. With the artists in control of the rigs, the director could fly his own flight path to envision the shots, and I could gather specific texture reference which I’m now using for look development.


Machine Learning and Artificial Intelligence

Michele Sciolette, Head of VFX Technology, Cinesite

Machine learning and artificial intelligence is an area that has the potential to disrupt many industries, including visual effects. It’s difficult to predict what the effect might be in the long term, but in areas such as computer vision, machine learning techniques are already performing on a par with, if not better than, the best algorithms that have been developed so far.

The next generation of advanced tools for visual effects artists may well involve running what a machine has learned over very large datasets, rather than implementing a specific image processing algorithm designed by a human.

In the future, machine intelligence may be smart enough to make VFX decisions. “Ex Machina” image copyright © 2015 by Universal Pictures and courtesy of Double Negative.


Back to Practical

Matthew Bramante, VFX Supervisor, Locktix

We use a lot of practical techniques with our clients. Although you can do everything in CG and VR these days, we like to go back to our roots, using traditional cinematography with CG augmentation.


Ambition Exceeds Budget

Dominic Parker, Director, One of Us

The biggest challenge for visual effects is that ambition is heading in one direction, and budgets are waving goodbye as they head in the other. While anyone can promise the moon on a stick, meeting creative challenges and surviving commercial realities means the work can often suffer. But for those who are able to deliver … the future is bright.


VFX Home Working

Webster Colcord, Animation Supervisor, Atomic Fiction

Like Uber, the taxi company that owns no vehicles, there’s a growing list of VFX and animation boutiques who outsource their work entirely to freelancers working from home. With just a few on-staff supervisors, they manage the workflow to self-employed private contractors who, like Uber’s drivers, use their own hardware and licenses and have flexibility in choosing the work they take on.


The Rise of the Boutique

Peter Rogers, Creative Producer, Bait Studio

In the UK, it feels like the playing field for VFX is levelling out a little. The huge names still dominate most of the blockbusters, but the rise of the boutique studio continues apace. Some of those companies who were seen as boutique a year or two ago have expanded so rapidly that they’ve almost created a middle tier between the big names and studios with under a hundred seats. As a result, there’s more choice than ever for producers.

As a studio in South Wales, we’ve also noticed a change in attitude towards non-London companies in the past year or so. We’ve found it easier to get producers to consider using us and are meeting more and more experienced people and recent graduates who don’t see London as the only option for working in the industry.


California Tax Incentives

Jules Roman, President, Tippett Studio

The downward pressure and stress on the industry in California has been at breaking point for years. California tax incentives are a great move in the right direction.

Will California’s film industry weather the storm of tax incentives? “San Andreas” photograph copyright © 2015 by Warner Bros. Entertainment.


Conclusion

So, what new VFX developments does the above snapshot reveal?

The one big thing on everybody’s mind is VR. After years of simmering, the pot of immersive technologies appears finally to be coming to the boil. As production companies old and new fall over themselves to jump on this latest media bandwagon, visual effects facilities are well-placed to stake out territory in the brave new world that is virtual reality.

Closely related to VR is the steady convergence of film and television with the gaming industry. Not only is there crossover in terms of content, but VFX studios are now seriously considering the integration of gaming engines into their pipelines.

Then there’s realism. Visual effects artists are using technologies and procedures that mimic the physics of the real world with ever-increasing verisimilitude. If you doubt this, take a moment to think about this year’s big films. Think about how stunningly real the visual effects looked. When you’ve done that, take another moment to reflect on the fact that, for every shot that impressed you, there were at least a dozen others that you didn’t even know were VFX shots.

Setting the technology aside, we see that senior visual effects professionals are becoming more closely involved with the wider creative process. And why not? These days, it’s not unusual for VFX artists to touch almost every shot in a film. For many feature directors, the methodology for visual effects is becoming as important as that for cinematography, production design or any of the other key disciplines.

There’s plenty more to see in our snapshot, from the rise of 3D scanning technology to the ongoing – and perpetually thorny – issue of tax incentives. Many artists are calling for more practical effects, smartly integrating with CG, while others are placing their faith in machine intelligence acquiring not only the skills, but also the judgement to undertake certain kinds of visual effects duties. And the software used by visual effects artists – not to mention the on-demand and cloud-based computing platforms on which it runs – continues to develop at a breathtaking pace.

As for the future … well, some of the new arrivals outlined above will undoubtedly gain strength, and perhaps even endure over time. Others will flower briefly, then fade. However, there’s one thing we can be sure of.

This time next year, everything will be new all over again.

What new VFX trends or technologies have got you all fired up? Share your thoughts below with a comment or two – we’d love to hear what’s on your mind!

Special thanks to Niketa Roman, Stephanie Bruning, Bronwyn Handling, Helen Pooler, Sophie Hunt, Joni Jacobson, Tiffany Tetrault, Jonny Vale, Alex Coxon, Geraldine Morales and Liam Thompson. This article was updated with additional material on 11 September 2015.

Flying by Wire

Ever since the dawn of cinema, people have been flying by wire.

In Fritz Lang’s 1927 classic Metropolis, for example, shots of flying machines soaring over the film’s iconic cityscapes were achieved by mounting miniature planes on taut wires. A similar technique was used in the original King Kong in 1933, for which a tiny squadron of biplanes was inched along its guide wires one painstaking frame at a time.

Creating the miniature effects of “Metropolis”. Illustration taken from “Science and Invention” magazine, June 1927, via Smithsonian.com

Then as now, there were plenty of amateur filmmakers keen to re-create the kinds of sequences they’d ogled in the blockbusters of the day. Luckily for fans of miniature aircraft shots, cinematographer Jerome H. Ash was on hand to offer advice.

Here’s an extract from Ash’s article Substandard Miniature Shots, published in the May 1936 edition of American Cinematographer:

“I think that by far the most satisfactory way to handle miniature plane shots is to hang the plane from wires, as the professionals do. To begin with, stretch three parallel wires well above the path you want the plane to take: these are strictly for support. From these, hang a little T-shaped wooden framework, on pulleys or eyelets; this supports and guides the plane. From the framework, three wires descend to the plane – one to each wing, and one to the tail.”

Ash is at pains to point out to his enthusiastic amateur readers that the wires mustn’t show up on camera. If only a little camouflage is required, he recommends a light application of blue vitriol. A more extreme solution involves painting the wires with alternating black and white stripes, each around half an inch in length – Ash likens this bold approach to the dazzle camouflage used on First World War battleships.

Model aircraft suspended thus are hardly going to be doing aerobatics, but they should at least be capable of running through a few basic manoeuvres:

“The three-point suspension prevents the plane from turning or flying sidewise. The supporting wires may be rigidly fixed to the frame for some types of action, but you’ll have more complete control of the model if the wires extend, like puppet-strings, to where someone standing beside the camera can manipulate them, altering the level and the inclination of the plane. With a little practice, you can make the plane land, take off, climb, glide, stall or sideslip, as well as “flying” level.”

Wire rigs remained in favour with visual effects artists throughout the twentieth century. Even during the 1980s, after Star Wars had kickstarted the trend of shooting miniatures under computer control in front of bluescreens, there were still people around who preferred to string things up the old-fashioned way – notably the effects team behind the gritty flying sequences seen in the 1983 film The Right Stuff.

Setting up a shot for “The Right Stuff”, model shop supervisor Earle Murphy fits a practical rocket motor into the engine port of a miniature wire-mounted X-1.

For scenes in The Right Stuff where USAF test pilots push various jet and rocket craft to the limit, director Philip Kaufman turned to the newly formed USFX, led by Gary Gutierrez. The first footage produced by the effects team was rejected by Kaufman specifically because it had been shot using motion control and so lacked the visceral feel he was after.

With the production on temporary hold, Gutierrez started experimenting with different ways of creating the desired hand-held look. One of his offbeat test shots was achieved using a wheelchair to ride the camera past the miniature aircraft. Another saw the model jets attached to helium balloons.

The wackiest of all involved hurling the miniature planes out of a top-floor window.

In the end, a variety of tricks were used to put the magnificent flying machines of The Right Stuff on the screen. Most involved wires and mechanical rigs, enhanced with fan-blown smoke, and photographed using telephoto lenses so that everything looked as though it had been shot from the hip.

Interviewed in Cinefex 14, in Adam Eisenberg’s article Low-Tech Effects, Gutierrez remarked:

“The amount of delight [Kaufman] got from the success of any shot was directly proportional to how funky the method was that accomplished it.”

ILM visual effects supervisor Bruce Nicholson directs head stage technician Joe Fulmer in his manoeuvring of the Krzanowski wire rig used to puppeteer the miniature flying saucers in “Batteries Not Included”.

Funky though they can be, wire rigs can also be finely crafted pieces of high-tolerance engineering. Take the beautiful flying rig created by model mechanical supervisor Tad Krzanowski to put the pint-sized flying saucers of Batteries Not Included through their paces.

Capable of operating both under computerised motion control and live on set, Krzanowski’s multi-wire rig proved so versatile that director Matthew Robbins was able to capture around 30% of his flying saucer scenes in-camera. And producer Steven Spielberg was impressed enough by the results to remark, “I can’t tell the wire work from the motion control work.”

There’s yet more impressive wire-work in the early films of James Cameron. The Hunter-Killers seen in the future war sequences of both The Terminator and Terminator 2: Judgment Day are strictly fly-by-wire machines. For the most part, so too is the Colonial Marines dropship from Aliens.

The crane and wire rig used for the Colonial Marines dropship in “Aliens” is just visible on the left of this wide-angle shot of the Acheron base miniature set.

As digital techniques have advanced, wires have blended more and more into the background. To paraphrase Ultron, digital characters got no strings. Nevertheless, wire-work remains an essential tool for effects artists … especially when it comes to flying a person around.

For many of the zero-g scenes in Gravity, Sandra Bullock was flown around using a custom harness created by special effects supervisor Neil Corbould and his team. Boasting no fewer than 12 wires under individual servomotor control, the sophisticated rig was capable of turning its astronaut payload on a dime.

Stuntwoman Juliette Cheveley stands in for Sandra Bullock to assist line-up of an ISS interior scene for “Gravity”, suspended from a custom wire harness created by Neil Corbould’s special effects team.

With Gravity, as with other modern movies, digital brings its own special benefit – you no longer have to worry about hiding those pesky wires with dazzle camouflage. The more prominent the cables, the easier it is for visual effects artists to paint them out.

However, the future of human flight – in the movies at least – may lie not in wires but in robots. Specialist companies like Robomoco – whose latest work can be seen in the upcoming Pan and In the Heart of the Sea – now offer a range of precision robots capable of flying artists, stunt performers and props through an extraordinary range of movement.

Watch Robomoco’s robot “Leia” in action:

So are the days of flying by wire numbered?

Perhaps not. Mad Max: Fury Road built an entire publicity campaign based on its use of practical stunts and effects. Even the new Star Wars movie has jumped on the old-school bandwagon, with director J.J. Abrams asserting at Star Wars Celebration 2015 that, “Building as much as we could [for real] was a mandate.”

Who’s to say the grand old tradition of wire-work can’t be part of this resurgence?

Who’s to say movie heroes won’t once more find themselves flying by wire?

“The Right Stuff” photograph copyright © 1983 by The Ladd Company. “Batteries Not Included” photograph copyright © 1987 by Universal City Studios Inc. “Aliens” photograph copyright © 1986 by Twentieth Century Fox Film Corporation. “Gravity” photograph copyright © 2013 by Warner Bros. Entertainment.