In the VFX ABC, the letter “I” stands for “Innovation”.
If you’re a VFX artist, you probably dream about showing movie audiences something new. A truly original visual effect. A cinematic illusion that nobody has ever seen before.
It’s a dream that crosses disciplines. You might be an artist or a technician; it doesn’t matter. Whether you’re brushing sunlight across a digital matte painting, or wrangling code to perfect the physics of the latest cloth simulation, the driving force that keeps you seeking that Holy Grail is the same.
I’m talking about the desire to innovate.
A love of innovation appears to be hard-wired into the brains of most VFX artists. If you’ve any interest in the history of the craft, you’ll know it’s littered with quotes from people saying: “We just wanted to do something new.”
But let’s not dwell on the past. Let’s look instead into the place where innovation takes us. The future. To find out what innovation means to VFX professionals today, I asked a panel of experts the following question:
What cutting-edge technique or technology is getting you excited about the future of visual effects?
What The Panel Thought
Andrew Whitehurst, Visual Effects Supervisor, Double Negative
“Advances in 3D printing and photogrammetry enable VFX, SFX and art departments to work even more closely together. We can now move between the physical and virtual worlds more easily than ever before, and the possibilities for creative collaboration are vast.”
Sam Hodge, CG Supervisor, Rising Sun Pictures
“I think the hot new thing is a suite of tools that allow us to collaborate more easily: more people, working on bigger shots, together!
First, tools such as Aspera, Cinesync and Shotgun allow us to easily transfer data between each other. We can communicate and collaborate all over the planet at a moment’s notice, passing work in progress through the chain of command to turn discussion into notes. The artists and technicians can iterate on these instructions, turning the vision in the creative’s head into pixels on the screen. We love working face to face but the internet allows us to reach across borders and time zones.
Then there are the tools that allow us to pass data between artists, departments and facilities in a standard common format. Open-source solutions such as OpenEXR and Alembic have succeeded where previous commercial attempts have failed. It is possible to receive whole complex setups from one facility and have them working in another’s pipeline in minutes. If you don’t like how the software works, the source code can be modified, after which others can benefit from the improvements made.
Finally, computer resources have become vast in this 64-bit and concurrent computing age, allowing us to reach for new levels of complexity. Tools such as Katana from The Foundry, and the developing Universal Scene Description from Pixar, allow massive permutations of data to be treated as simple database queries. Computing power is also available in the cloud, so resources can scale up and down on an as-needed basis. It’s a brave new world indeed.”
Geoff Scott, VFX Supervisor, Intelligent Creatures
“Live action deep compositing. I would love a future where our film cameras have the ability to create a per frame scan of the actors and environment and store it within the data of the frame. If (when) we can create a system like this, with a fidelity that is detailed enough to capture hair and differentiate between transparent objects, we can eliminate the need for blue screens or green screens. Artists will no longer have to do the painstaking task of rotoscoping. It would be amazing.”
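If every frame carried per-pixel depth as Scott imagines, pulling a matte could reduce to a simple depth comparison instead of hand-drawn rotoscoping. A minimal NumPy sketch of that idea, using a synthetic depth map and a hypothetical cut-off distance (a real deep-compositing pipeline would store multiple depth samples per pixel):

```python
import numpy as np

# Synthetic example: a 4x4 frame with a per-pixel depth channel (metres).
# Values near 2 m are the actor; 9.5 m is the background wall.
depth = np.array([
    [2.0, 2.0, 9.5, 9.5],
    [2.1, 1.9, 9.5, 9.5],
    [2.0, 2.0, 9.5, 9.5],
    [9.5, 9.5, 9.5, 9.5],
])

# Anything nearer than the (hypothetical) 5 m cut-off is foreground.
CUTOFF = 5.0
matte = (depth < CUTOFF).astype(np.float32)  # 1.0 = keep, 0.0 = discard

print(matte)
```

With depth of sufficient fidelity to resolve hair and transparency, the threshold above would be replaced by per-sample compositing, but the principle — the matte falls out of the data rather than an artist’s pen — is the same.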
Simon Stanley-Clamp, VFX Supervisor, Cinesite
“I read recently that Sony have developed a system which analyses textures to assess materials’ properties during the scanning process. These are then relayed to the final shader and lighting passes. Ultimately, techniques like this will help speed up the creation of realistic, believable CG that integrates with live-action environments.”
John Dietz, VFX Supervisor & Head of Production, Pixomondo
“Currently, editing technology feels archaic in that you are only able to ‘cut’ between multiple streams of ‘footage’. After the shoot is done, why can’t we change the camera moves? Why can’t we change the lighting? Why can’t we change the actor’s performance? With all of our VFX innovation we are still at the mercy of the live action plate.
“In recent years we have gotten much better at pre-production and planning for principal photography to supply us with better plates. But we have not been very good at changing the fact that once footage has been captured through a lens, we can’t change much without huge effort.
“The VFX industry is in a race to create the biggest, most realistic robot destruction on or off earth. But I’d prefer, in post, to simply change the actress’s little frown into a smile and swing the camera from her front around to her profile to better catch that playful twinkle in her eye.
“Innovation is not about the next algorithm or simulation that makes our CG output more impressive or realistic. Big innovation is about harnessing the many advances in data capture, from across many diverse industries, to acquire more and more highly accurate information from the real world shoot and give filmmakers a way to improve their story by editing everything up until the very last minute.”
Aladino Debert, VFX Supervisor (Commercials & Games), Digital Domain
“I’m always more interested in the creative aspects of a visual effect shot than the technical ones, but that is not to say I don’t pay attention to developments in the field. For me technology is a means to an end – everything depends on what we are trying to achieve visually. Although new technology is essential, I’m more excited about what our technological advances are allowing us to dream about than the techniques themselves.”
Christian Kaestner, VFX Supervisor, Framestore
“As a visual effects artist, I’ve had the chance to work on various innovative productions with some of the smartest people in the industry. Every time we start a new project, I wonder, ‘How do we push the bar to the next level?’ These days, from a purely technological point of view, the possibilities are endless. The biggest challenge for me personally is always: how do we combine our cutting edge technology with artistry that makes your jaw drop?
“I think the one area we have yet to conquer properly is the art of digital characters. Making a believable digital character has evolved enormously in the last decade: we have gone from tiny actor replacements to full-screen medium close-ups of digital characters. The industry keeps throwing new challenges at us with high-frame-rate cinema and 4K resolution, but I am certain that one day we will be able to create digital characters that even the most critical of experts won’t be able to distinguish from real life footage. Maybe even shown in 4K at 72 frames per second. One day.”
Blake Sweeney, Head of Software, Method Studios
“There’s a lot to be excited about. Although not the sexiest, I love the work that teams across the industry are doing to establish standards which transcend a single application and simplify interchange. It’s hard to imagine life before OpenEXR and Alembic; these and other projects already make supporting a multi-package VFX pipeline considerably easier. I’m excited to see how Pixar’s Universal Scene Description develops in the future.
“Real-time rendering is another exciting topic. While many renderers have supported progressive rendering on the CPU or GPU for years – and games have completely adopted it – performance often limps along when faced with the high-density scene data required for large-scale productions, especially on the CPU. With 6GB graphics cards becoming available and affordable, artists can really start to render full-scene geometry with raytracing and nothing pre-baked, and have a near real-time experience.
“The capability for real-time rendering without concessions in production will open possibilities for different workflows; it’ll be interesting to see how VFX studios react to this. For example, it’s not hard to imagine reviewing huge portions of dailies with a VFX supervisor becoming an interactive experience – much like a DP works with gaffers on set. In fact, some companies are already starting to do this.”
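The progressive rendering Sweeney mentions boils down to accumulating successive sample passes into a running average, so the image refines on screen instead of arriving as one long batch render. A toy Python sketch of that accumulation loop for a single pixel (all numbers illustrative; a real renderer does this per pixel on the GPU):

```python
import random

random.seed(42)

def shade_sample():
    """Stand-in for one noisy ray-traced sample of a pixel.
    The true value is 0.5; each sample is the truth plus noise."""
    return 0.5 + random.uniform(-0.2, 0.2)

# Progressive accumulation: after each pass, the displayed value is
# the running mean of all samples so far, so the viewport estimate
# converges towards the true pixel value over time.
accum, estimate = 0.0, 0.0
for n in range(1, 1001):
    accum += shade_sample()
    estimate = accum / n   # this is what a progressive viewport shows

print(estimate)  # close to the true value of 0.5 after 1000 passes
```

The appeal for review sessions is that early, noisy passes appear within milliseconds, and the supervisor can interrupt, move the camera, and restart accumulation at any moment.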
Jordan Soles, Chief Technical Officer, Rodeo FX
“I’ve been looking into Time-Of-Flight cameras and evaluating how best to pair them with simple off-the-shelf cameras like GoPros, to build a simple and cost-effective rig capable of producing an animated depth-map synced with the high-res footage. I like the idea of having a few of these low-profile reference cameras on-set. Ultimately the data might translate directly into a painted point-cloud, and eventually a virtual set.”
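The depth-map-to-point-cloud step Soles describes is essentially a back-projection of each pixel through the camera intrinsics. A minimal NumPy sketch, assuming a simple pinhole camera model (all intrinsics and the depth map are hypothetical stand-ins for real ToF output):

```python
import numpy as np

# Hypothetical pinhole intrinsics for a low-resolution ToF sensor.
FX, FY = 200.0, 200.0   # focal lengths in pixels
CX, CY = 2.0, 2.0       # principal point (centre of a 4x4 image here)

# A toy 4x4 depth map in metres, as a ToF camera might deliver per frame.
depth = np.full((4, 4), 3.0)

# Back-project every pixel (u, v, depth) into a 3D point:
#   X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth
v, u = np.indices(depth.shape)
X = (u - CX) * depth / FX
Y = (v - CY) * depth / FY
points = np.stack([X, Y, depth], axis=-1).reshape(-1, 3)

# Colour from a synced RGB camera (a GoPro, say) could be sampled at
# the same (u, v) positions to "paint" the cloud.
print(points.shape)  # one 3D point per pixel: (16, 3)
```

Accumulating these painted clouds frame by frame, and registering them against each other, is what would eventually build up the virtual set he mentions.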
Lou Pecora, VFX Supervisor, Digital Domain
“I am excited to see what opportunities VR will bring to the VFX party. It will be interesting to see what comes out of a technology that allows us to create an experience unencumbered by the confines of the physical reality we are so often charged to recreate. Visions of William Gibson’s Neuromancer, Strange Days, and The Matrix are the obvious examples that spring to mind, but I am looking forward to more extreme uses of this technology – applications that will allow content creators to bring experiences to audiences that are not just alternate versions of our same world, but ones that are totally different and as yet unimagined.”
So what conclusions can we draw from all this? In short, what does the future of visual effects really hold? Here are a few predictions based on the panel’s answers. These are not fanciful projections; they’re all beginning to happen right now.
Collaborative working will become more and more commonplace, not just between departments and neighbouring facilities but right across the globe. 24-hour work schedules will become the norm as digital assets leapfrog their way through the timezones, bouncing from one workstation to the next in an effort to conquer time. Open-source protocols will facilitate this. It will no longer matter where in the world you work; it will only matter how well connected you are. VFX Nation, anyone?
Real-world data capture will continue to improve. Cameras will no longer record just a two-dimensional moving image, but will capture complete spatial, textural and lighting information. These data will be used – probably in real-time – to generate environments and synthetic characters and integrate them with live action. Compositing will be instant, seamless and, in the current vernacular, deep.
Simulations will become ever more complex. Despite recent speculation that the effects of Moore’s Law appear to be slowing down, computer capacity will continue to find ways to grow. Cloud-based systems will dominate, supporting the trend towards remote working and international cooperation. Artists will no longer use render farms. They will use a render world.
Visual effects technologies will influence filmmaking as never before, as image manipulation becomes increasingly embedded in the filmmaking process. As everything in the motion picture frame becomes infinitely editable, directorial choices will open up that were previously unavailable (what directors do with those choices, of course, is entirely up to them).
Now it’s time for you to bend over your own crystal ball. Which innovations do you think have changed the world of visual effects? And what do you think the future holds?
Special thanks to Sarah Harries, Ian Cope, Che Spencer, Helen Moody, Melissa Knight, Stephanie Bruning, Rob Goodway, Joni Jacobson, Tiffany Tetrault and Anouk Devault. “Dawn of the Planet of the Apes” photograph copyright © 2014 by Twentieth Century Fox Film Corporation. “The Matrix” photograph copyright © 1999 by Warner Brothers and Village Roadshow Pictures.