N is for New

by Graham Edwards

In the VFX ABC, the letter “N” stands for “New”.

Writing on this blog a little over a year ago, I asked a panel of visual effects experts the following question: What cutting-edge technique or technology is getting you excited about the future of visual effects?

The question prompted some fascinating responses, which you can read in my article I is for Innovation. Collectively, they provide a snapshot of the world of VFX as seen by a range of industry professionals in July 2014.

It’s a snapshot that’s now a year out of date. That’s the thing with cutting edges – the darn things just keep on cutting.

That’s why I’ve decided to revisit the topic, one year on. Because innovation doesn’t go away. In fact, the desire to create something new appears to be hard-wired into the mind of the visual effects artist. And the industry itself, like so many others, is constantly evolving.

This time around, I wanted to hear not only about the latest techniques and technologies, but also the latest business trends. So I stripped my original question back to its simplest possible form:

What’s New in VFX?

How did our panel of experts respond? Let’s find out!


Virtual Reality

Michele Sciolette, Head of VFX Technology, Cinesite

There is a lot of expectation that 2016 will be the year when immersive technologies such as virtual and augmented reality will become mainstream. Current-generation devices have many limitations, but clearly show the potential for truly immersive experiences. This will inevitably drive demand for new types of entertainment. I expect that the ability to support and create content for immersive experiences will become a common task for visual effects houses in the relatively near future.

Aruna Inversin, CG/VR Supervisor, Digital Domain

With true virtual reality around the corner, content creators and studios are already building their teams and their pipelines to take advantage of this next wave of new immersive experiences, the likes of which people have never seen. Using positional tracking, high-fidelity screens and haptic (touch-sensitive) inputs, we’ll see a surge in consumer consumption that hasn’t been matched since the invention of the television.

The virtual reality challenges awaiting visual effects artists are numerous – from multiple cameras and multiple takes, to 360° video and extremely long frame ranges. As visual effects artists, we’re at the beginning of this amazing ride, with technologies finally catching up to the visions in our heads. Not just on a screen in front of you, but wherever you may look.

Hubert Krzysztofik, Director of Interactive Technology, Pixomondo

The implications of VR, AR and interactive experiences mean that the VFX industry is undergoing historic change. The demand for talented game engine developers is as high as the demand for excellent VFX artists versed in the specifics of working within game engines. Game engines already have a nascent presence in the VFX industry and are increasingly being used in pre-visualisation and look development.

Currently, there are three major players in the game engine field: Epic Games’ Unreal Engine, Crytek’s CineBox and Unity Technologies’ Unity. From a business perspective, it’s important to be platform- and technology-agnostic. We identify the strengths of each engine and use them based on the project requirements.

An important part of VR development is the headset component. Currently, the Oculus Rift, Sony’s Project Morpheus, Google Cardboard and the HTC Vive are supported by all the engines on the market. Ease of use and minimal bug issues are a major consideration in a VFX studio, which lives and dies by its pipeline.

It’s an exciting time for VR and visual effects, and we seem to be building toward an eventual merging of disciplines. I’m looking forward to seeing how VR continues to develop and improve, and how it can be beneficially integrated into the VFX industry and beyond.

Hannes Ricklefs, Head of Software (London, Bangalore), MPC Film

Virtual reality is definitely one of the areas with the most interest and investment being put into it. FMX and SIGGRAPH this year were loaded with talks, panels and demos around the topic. In general, VR is seen as the enabler for pushing the convergence of film and games technology. For fully immersive experiences, it needs the quality of VFX at the speed of a game engine.

One of the major excitements surrounding VR is that there are no established processes around the creation of VR content. Existing processes such as 2D stitching, stereo virtual cameras and camera tracking now need to work within a full 360° vision. Recognising the importance of this area, MPC has established MPC VR, a creative- and technology-led VR team focused on immersive exploration across multiple industries, and integrated with Technicolor’s R&I (Research and Innovation) group.
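
To make the 360° challenge concrete, here is a minimal Python sketch of the equirectangular (lat-long) mapping that underpins most 360° stitching and playback tools. The axis convention (y up, -z forward) is an assumption chosen for illustration, not a description of MPC’s pipeline.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit-length view direction to pixel coordinates in an
    equirectangular (lat-long) image, the standard layout for 360° video.
    Assumes y is up and -z is the forward view direction."""
    longitude = math.atan2(x, -z)   # -pi..pi, rotation about the up axis
    latitude = math.asin(y)         # -pi/2..pi/2, elevation above horizon
    u = (longitude / (2.0 * math.pi) + 0.5) * width
    v = (0.5 - latitude / math.pi) * height
    return u, v

# The forward direction lands dead centre of an 8K x 4K lat-long frame.
print(direction_to_equirect(0.0, 0.0, -1.0, 8192, 4096))  # (4096.0, 2048.0)
```

Every pixel in a stitched 360° frame corresponds to one such direction, which is why tools like camera tracking must now solve for a full sphere of view rather than a single frustum.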

Karl Woolley, VR Lead, London, Framestore

Spend five seconds on the Oculus CV1 or HTC Vive, and you’ll immediately understand what a difference it brings to see your hands, grab objects and move around a space, as opposed to being sat in a locked-off, passive position.

Whether the VR experience is based on live-action, pre-rendered or generated in real-time, game engines are at its heart. They allow you to leverage the input devices, and to craft worlds and environments based on your VFX assets … after a bit of work! Game engines have come on in leaps and bounds in terms of visual quality and performance in the last five years, with the folks at Epic (Unreal Engine) releasing dedicated tools to make the VFX-to-VR process even easier and more accessible for all in 2016.

With our roots in VFX, we traditionally focus on perfecting the pixel to tell a story. But VR is Virtual Reality, not just Visual Reality. 2016 will be the year we get natural, low-barrier methods of input, with Sony, HTC and Oculus all having consumer kit out, making virtual reality available to the masses. That’s when we’ll truly see what the public’s appetite is for VR.


Videogame Engines

Michele Sciolette, Head of VFX Technology, Cinesite

The quality of real-time graphics has been improving at an incredible pace, and the output of modern game engines – supporting features such as physically based shading – is fast approaching the level of quality we need in high-end visual effects. The explosion of independent game publishers in recent years has led to new licensing options, making top quality engines accessible to everyone. Thus game engines could soon become a viable option for certain kinds of work – especially if you drop the constraint of running the engine in real time.

In my opinion, the main obstacle still to overcome is pipeline integration. In order for visual effects houses to fully embrace video game engines, we need the ability to transparently move assets between a more traditional pipeline and one based on a game engine, to ensure consistency of the final look and minimise duplicated effort.
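
As a sketch of what that transparency might look like, here is a hypothetical, vendor-neutral asset manifest that both a traditional pipeline and a game-engine importer could resolve, so each side pulls identical geometry, textures and shading parameters. All file names and fields are invented for illustration.

```python
import json

# Hypothetical neutral manifest shared by the offline and real-time
# pipelines, so a look developed in one can be rebuilt faithfully in
# the other without duplicated effort.
manifest = {
    "asset": "hero_dragon",
    "version": 42,
    "geometry": "hero_dragon_v042.abc",       # e.g. an Alembic cache
    "textures": {
        "baseColor": "dragon_baseColor.tx",
        "roughness": "dragon_roughness.tx",
    },
    "shading_model": "physically_based",      # shared PBR parameterisation
}

with open("hero_dragon.manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```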

Watch the animated short A Boy and his Kite, created in Unreal Engine 4 and rendered in real-time at 30fps:


Integration into the Creative Process

Christian Manz, Creative Director, Film, Framestore

The biggest change I’ve observed in recent times is how VFX has been embraced as an important part of the filmmaking and storytelling process from start to finish, not just in post. Only the director and producers serve longer on a big movie than the VFX team, and in that time you get to collaborate with a lot of talented people across multiple departments to bring the director’s vision to the big screen. Being part of that creative process really excites me – it’s why I think there has never been a better time to be involved in the world of VFX.

Chris MacLean, CG Supervisor, Mr.X

Over the last few years, there’s been a trend towards VFX houses having more creative input with respect to the storytelling process. Whether with a design element like a creature, or something as robust as editorial input, VFX artists are being given more creative responsibility. It’s exciting for us as it gives us an opportunity to be part of the action as opposed to simply facilitating it.

It used to be that you would do your budget, get your shots with your line-up sheets, and drop the shot into your pipeline. Now, we’re being asked to pitch previs or postvis ideas, help design camera moves, and collaborate with production design and art departments. In some cases, we’ve even helped redesign sequences in post. It’s nice to see growing respect for our contributions to the filmmaking process, and to be recognised as visual storytellers in our own right.


Perfect Integration of CG with Live-Action

Mark Wendell, CG Supervisor, Image Engine

One exciting industry advance that’s now becoming possible – with the proper setup – is the near-perfect integration of CG into live-action plates in a practical and efficient way.

A number of developments are contributing to this: improvements in path-tracing renderers, the adoption of physically plausible shading models, and the use of lookdev and lighting setups calibrated to reference photography. While this isn’t particularly “new”, the tools and techniques have reached the point where we’re finally seeing a huge payoff not only in terms of realism, but also in production efficiency.

Of course, getting the proper reference and building colour-calibrated setups requires a bit of up-front investment in time. But when it’s done properly, we’re seeing lighters achieve a nearly perfect plate-match on their first render, in any lighting environment. Amortised over multiple shots, that investment more than pays for itself, and that’s really exciting. Rather than spending endless iterations just getting their CG elements to match the plate, artists actually have the time to refine shots and give them that last five percent kiss of awesome.
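
As a rough illustration of the calibration idea, with invented numbers rather than Image Engine’s actual workflow: photograph a grey card on set, render a matching CG card under the reconstructed lighting, and bake the per-channel ratio into the lighting setup so that first renders land on the plate.

```python
import numpy as np

# Hypothetical linear-light RGB values of an 18% grey card: one sampled
# from the photographed plate, one from a first render of a matching CG
# card under the reconstructed lighting.
plate_grey = np.array([0.171, 0.180, 0.190])
render_grey = np.array([0.150, 0.182, 0.210])

# Per-channel gain that pulls the render onto the plate's exposure and
# white balance; bake this into the light rig once, reuse across shots.
gain = plate_grey / render_grey

def calibrate(render_rgb):
    """Apply the grey-card gain to rendered linear-light RGB values."""
    return np.asarray(render_rgb) * gain

print(gain.round(3))  # e.g. [1.14  0.989 0.905]
```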

Watch Image Engine’s VFX reel showcasing their work on Chappie:


Physically Plausible Shading

Howard Campbell, Lead Technical Director, Tippett Studio

Rendering of computer-generated images involves very complex calculations of light and material interactions – so complex, in fact, that we can only ever approximate the results. Traditionally, most render approaches have involved a high level of artist manipulation to arrive at a convincing render. But in recent years, as computer power has increased significantly, there’s been a shift towards more expensive – but much more accurate – physics-based rendering.

Physically plausible shading is a strategy whereby rays of light are traced through the scene according to the physical laws of energy conservation. It produces a more believable result out of the box, with much less artist tweaking. In Teenage Mutant Ninja Turtles (2014), for example, Tippett Studio used physically plausible shading to render the young turtles and sewer environments. This greatly reduced the need for the kinds of lighting and surface adjustments common under previous strategies, allowing us to do more shots in less time.
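
For the curious, here is a tiny sketch of the energy-conservation constraint described above, using the simplest physically plausible BRDF of all, the Lambertian, whose 1/π factor guarantees a surface never reflects more light than it receives. It is a toy illustration, not Tippett Studio’s renderer.

```python
import math
import random

def lambert_brdf(albedo):
    """Energy-conserving Lambertian BRDF: albedo / pi. The 1/pi factor
    keeps total reflected energy <= incoming energy over the hemisphere."""
    return albedo / math.pi

def reflected_energy(albedo, samples=200_000):
    """Monte Carlo integral of brdf * cos(theta) over the hemisphere,
    using uniform hemisphere sampling (pdf = 1 / 2pi). The estimate
    converges to the albedo -- never above 1: no energy from nothing."""
    total = 0.0
    for _ in range(samples):
        cos_theta = random.random()  # cos(theta) is uniform on [0, 1]
        total += lambert_brdf(albedo) * cos_theta * (2.0 * math.pi)
    return total / samples

print(reflected_energy(0.8))  # ~0.8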

Watch Tippett Studio’s VFX reel showcasing their work on Teenage Mutant Ninja Turtles (2014):


Remote Resources and Cloud Computing

Rob Pieke, Software Lead, MPC Film

At the infrastructure level, one of the more interesting changes is a move towards on-demand remote resources for computing and storage. Platforms such as the recently re-released Zync Render offer opportunities to maintain a smaller “always busy” render farm on-site, but still have near-instant access to extra temporary resource.
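
In spirit, such a hybrid setup reduces to a scheduling policy like the toy sketch below: keep the small local farm saturated, and only pay for remote nodes when the backlog justifies it. The threshold and the two dispatch callbacks are invented stand-ins, not any particular vendor’s API.

```python
from collections import deque

LOCAL_SLOTS = 200       # size of the on-site "always busy" farm
BURST_THRESHOLD = 50    # backlog depth that justifies paying for cloud time

def dispatch(queue, local_busy, submit_local, rent_cloud_node):
    """Toy burst scheduler: fill the local farm first, then overflow the
    remainder of a deep backlog to on-demand cloud nodes."""
    while queue and local_busy < LOCAL_SLOTS:
        submit_local(queue.popleft())
        local_busy += 1
    while len(queue) > BURST_THRESHOLD:
        rent_cloud_node(queue.popleft())
    return local_busy

# Example: 300 queued frames, an idle farm -- 200 run locally, 50 burst out.
frames = deque(range(300))
dispatch(frames, 0,
         submit_local=lambda f: None,
         rent_cloud_node=lambda f: None)
print(len(frames), "frames wait for local slots")  # 50
```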

Hannes Ricklefs, Head of Software (London, Bangalore), MPC Film

There have been great advances to enable cloud computing for large scale VFX productions. Once the remaining roadblocks – such as security concerns – are resolved, having this “infinite” resource available to more dynamically schedule and scale around the peaks and troughs of any VFX production will be a major game changer.

Kai Wolter, Software Lead, MPC Film

My list of what’s new in VFX includes:

  • Cloud computing and cloud rendering
  • Smarter ways to handle large data sets
  • Finite Element Method (FEM) for muscle simulation
  • VR in VFX (previs, postvis, final images)

Software for All

Damien Fagnou, Global Head of VFX Operations, MPC Film

In recent years, the rise of open-source standards like USD, Alembic or OpenSubdiv has given studios really important foundation tools to create VFX. Alongside this, VFX-focused software platforms like Fabric Engine, authoring software like The Foundry’s Katana or Mari, and the move to fully raytraced renderers such as Pixar’s RenderMan RIS have dramatically changed the game for software development inside VFX studios, where the focus is now more on workflows and artist tools than on building an entire software stack for VFX from scratch. Furthermore, many of these software packages are now available for non-commercial use, giving students full access to the same toolset as that used by large VFX studios.

Manuel Huertas, CG Artist, Atomic Fiction

As a surfacing and lookdev artist, I’m very glad to see the most recent release of Mari (version 3) incorporating shading models from third-party vendors such as Chaos Group and Solid Angle. This will help artists get almost real-time feedback during the texturing workflow for certain materials, using shading models similar to the ones that will actually be constructed in Katana and used in production for final rendering.


Clarisse iFX

Ben VonZastrow, CG Painter, Tippett Studio

Clarisse iFX, by Isotropix, is part of a new breed of VFX tools that allow artists to work directly on the final image, instead of on a proxy image of variable quality and low accuracy. By foregoing OpenGL and focusing instead on a unified, lightning-fast renderer, Clarisse allows users to manipulate scenes containing trillions of polygons directly in the final render view.

City compositing in Clarisse iFX. Image copyright © 2015 by Isotropix SAS.


6k is the New 2k

Gresham Lochner, VFX Producer, Locktix

We’ve seen a trend recently with our clients increasing to a working format of 4k and 6k for VFX work. Traditionally, this would severely strain a small VFX facility. We’ve built out our infrastructure and pipeline with this work in mind – from the beginning, we’ve invested in a solid Open Drives infrastructure as our back end, as well as some proprietary code that sits on top of our hardware, allowing us to easily get around IO bottlenecks. Because of all of this, we don’t blink at 6k work – it’s become the norm.
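
Some back-of-envelope arithmetic shows why. Assuming uncompressed half-float RGBA EXR frames at one common 6k resolution (real files compress, so treat these figures as upper bounds):

```python
width, height = 6144, 3160          # one common "6k" working resolution
channels, bytes_per_channel = 4, 2  # RGBA at 16-bit half float

frame_bytes = width * height * channels * bytes_per_channel
print(f"{frame_bytes / 2**20:.0f} MB per frame")                  # ~148 MB
print(f"{frame_bytes * 24 / 2**30:.2f} GB/s at 24 fps playback")  # ~3.47 GB/s
```

At roughly nine times the footprint of a 2k frame, storage throughput, rather than CPU, is the first wall a facility hits.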


Drones and Photogrammetry for Aerial Reference Gathering

Ray Sena, Environment Artist, Tippett Studio

With the variety of quadcopter drone rigs available to us, it’s now easy for artists to gather large amounts of reference cheaply and quickly. For example, in a project we’re currently working on, we needed to obtain a huge number of reference photos over about a dozen different types of landscape in China. The VFX director and I were able to launch off-the-shelf quadcopters at each of our shoot locations and quickly capture a vast amount of material – the rigs were small enough to mount on our hiking packs with all of our other camera gear. With the artists in control of the rigs, the director could fly his own flight path to envision the shots, and I could gather specific texture reference which I’m now using for look development.


Machine Learning and Artificial Intelligence

Michele Sciolette, Head of VFX Technology, Cinesite

Machine learning and artificial intelligence is an area that has the potential to disrupt many industries, including visual effects. It’s difficult to predict what the effect might be in the long term, but in areas such as computer vision, machine learning techniques are already performing on par with, if not better than, the best algorithms that have been developed so far.

The next generation of advanced tools for visual effects artists may well involve applying what a machine has learned from very large datasets, rather than implementing a specific image processing algorithm designed by a human.
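
As a toy illustration of that shift, where filter weights are fitted from data rather than designed by a human, here is a sketch that learns a 3×3 denoising kernel by least squares. A production system would train a deep network on vastly more data; the principle is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hand-designed approach: a human picks the filter (a 3x3 box blur).
hand_designed = np.full((3, 3), 1.0 / 9.0)

# Learned approach: fit the 3x3 kernel from noisy/clean training pairs
# by least squares -- a tiny stand-in for training a denoising network.
clean = rng.random((64, 64))
noisy = clean + rng.normal(0.0, 0.05, clean.shape)

patches, targets = [], []
for y in range(1, 63):
    for x in range(1, 63):
        patches.append(noisy[y - 1:y + 2, x - 1:x + 2].ravel())
        targets.append(clean[y, x])

learned, *_ = np.linalg.lstsq(np.array(patches), np.array(targets),
                              rcond=None)
print(learned.reshape(3, 3).round(3))  # a kernel no human designed
```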

In the future, machine intelligence may be smart enough to make VFX decisions. “Ex Machina” image copyright © 2015 by Universal Pictures and courtesy of Double Negative.


Back to Practical

Matthew Bramante, VFX Supervisor, Locktix

We use a lot of practical techniques with our clients. Although you can do everything in CG and VR these days, we like to go back to our roots, using traditional cinematography with CG augmentation.


Ambition Exceeds Budget

Dominic Parker, Director, One of Us

The biggest challenge for visual effects is that ambition is heading in one direction, and budgets are waving goodbye as they head in the other. While anyone can promise the moon on a stick, meeting creative challenges and surviving commercial realities means the work can often suffer. But for those who are able to deliver … the future is bright.


VFX Home Working

Webster Colcord, Animation Supervisor, Atomic Fiction

Like Uber, the taxi company that owns no vehicles, there’s a growing list of VFX and animation boutiques who outsource their work entirely to freelancers working from home. With just a few on-staff supervisors, they manage the workflow of self-employed contractors who, like Uber’s drivers, use their own hardware and licenses and have flexibility in choosing the work they take on.


The Rise of the Boutique

Peter Rogers, Creative Producer, Bait Studio

In the UK, it feels like the playing field for VFX is levelling out a little. The huge names still dominate most of the blockbusters, but the rise of the boutique studio continues apace. Some of those companies who were seen as boutique a year or two ago have expanded so rapidly that they’ve almost created a middle tier between the big names and studios with under a hundred seats. As a result, there’s more choice than ever for producers.

As a studio in South Wales, we’ve also noticed a change in attitude towards non-London companies in the past year or so. We’ve found it easier to get producers to consider using us and are meeting more and more experienced people and recent graduates who don’t see London as the only option for working in the industry.


California Tax Incentives

Jules Roman, President, Tippett Studio

The downward pressure and stress on the industry in California have been at breaking point for years. California’s tax incentives are a great move in the right direction.

Will California’s film industry weather the storm of tax incentives? “San Andreas” photograph copyright © 2015 by Warner Bros. Entertainment.


Conclusion

So, what new VFX developments does the above snapshot reveal?

The one big thing on everybody’s mind is VR. After years of simmering, the pot of immersive technologies appears finally to be coming to the boil. As production companies old and new fall over themselves to jump on this latest media bandwagon, visual effects facilities are well-placed to stake out territory in the brave new world that is virtual reality.

Closely related to VR is the steady convergence of film and television with the gaming industry. Not only is there crossover in terms of content, but VFX studios are now seriously considering the integration of gaming engines into their pipelines.

Then there’s realism. Visual effects artists are using technologies and procedures that mimic the physics of the real world with ever-increasing verisimilitude. If you doubt this, take a moment to think about this year’s big films. Think about how stunningly real the visual effects looked. When you’ve done that, take another moment to reflect on the fact that, for every shot that impressed you, there were at least a dozen others that you didn’t even know were VFX shots.

Setting the technology aside, we see that senior visual effects professionals are becoming more closely involved with the wider creative process. And why not? These days, it’s not unusual for VFX artists to touch almost every shot in a film. For many feature directors, the methodology for visual effects is becoming as important as that for cinematography, production design or any of the other key disciplines.

There’s plenty more to see in our snapshot, from the rise of 3D scanning technology to the ongoing – and perpetually thorny – issue of tax incentives. Many artists are calling for more practical effects, smartly integrated with CG, while others are placing their faith in machine intelligence acquiring not only the skills, but also the judgement to undertake certain kinds of visual effects duties. And the software used by visual effects artists – not to mention the on-demand and cloud-based computing platforms on which it runs – continues to develop at a breathtaking pace.

As for the future … well, some of the new arrivals outlined above will undoubtedly gain strength, and perhaps even endure over time. Others will flower briefly, then fade. However, there’s one thing we can be sure of.

This time next year, everything will be new all over again.

What new VFX trends or technologies have got you all fired up? Share your thoughts below with a comment or two – we’d love to hear what’s on your mind!

Special thanks to Niketa Roman, Stephanie Bruning, Bronwyn Handling, Helen Pooler, Sophie Hunt, Joni Jacobson, Tiffany Tetrault, Jonny Vale, Alex Coxon, Geraldine Morales and Liam Thompson. This article was updated with additional material on 11 September 2015.
