About Graham Edwards

I'm senior staff writer at Cinefex magazine. I also write novels. In a former life, I produced animated films for theme park rides and science centres. If you offer me a cold beer, I won't say no.

Celebrating Cinefex – The Video

Celebrating Cinefex

If you weren’t at the Billy Wilder Theater in LA on May 16th, then you missed one heck of a celebration.

“Celebrating Cinefex” was a special evening of conversation with Cinefex founder and publisher Don Shay and Cinefex editor in chief Jody Duncan. The event was presented by the Visual Effects Society and the UCLA Film & Television Archive, and featured visual effects supervisor Craig Barron as host.

If you did miss the show, don’t worry – we have the whole thing right here on video. Now’s your chance to spend a couple of hours in the company of the team responsible for 35 years (and counting) of in-depth reporting on visual and special effects, covering films from Star Wars and Star Trek to Jurassic Park, Avatar, Gravity and beyond.

If you were there in the audience, why not watch again anyway? Everyone loves a re-release, right?

Here’s what Don Shay had to say about “Celebrating Cinefex”:

Jody and I have spent a good part of our professional lives asking other people questions, so it was fun having the tables turned for a change. We wondered what we’d have to say to fill an evening’s worth of questioning, but that turned out not to be an issue. When our allotted time was up, we felt we were just getting started. Craig Barron did a great job of directing the conversation at a breezy pace, and we enjoyed sharing our thoughts and recollections with him and with our fans who came out to spend the evening with us.

Although there were only three of us up on stage, many others contributed to the success of the evening. Inspiration for “Celebrating Cinefex” came from visual effects supervisor Joe Bauer, who, during a lunchtime chat while Jody was conducting her Game of Thrones interviews, asked if the Visual Effects Society had ever done something of that nature with us. No? Well, a few weeks later we heard from the VES proposing a 35th-anniversary celebration of Cinefex. So, thanks, Joe.

Ben Schneider of the VES and Paul Malcolm of the UCLA Film & Television Archive, which co-sponsored the program, worked tirelessly behind the scenes to organize and promote the event. Craig Barron stepped in at the last minute to host it, and was a relaxed yet focused moderator. Van Ling and Gene Kozicki gathered and presented a wealth of video and still imagery to enhance the evening’s visual appeal, and Jeff Casper recorded the event on video, edited it crisply, and made it available in its entirety for us to share with our fans. I’m sure there were others involved, as well. We’d like to thank them all.

Double Negative and Prime Focus World Unveil Merger

Double Negative merges with Prime Focus World

Europe’s largest independent visual effects company Double Negative and the international creative services provider Prime Focus World have just announced their decision to merge. According to an announcement on the Prime Focus World website, the result will be the world’s largest integrated VFX, 3D Conversion & Animation services company.

The news comes hot on the heels of Double Negative’s announcement in April of their new high-end CG feature animation studio, and anticipates the opening of their Vancouver branch later this year.

Alex Hope, Managing Director of Double Negative, said:

“Our new relationship with Prime Focus World combines fantastic opportunities to grow our business with the freedom to continue to manage Double Negative in the way we always have: providing a great creative environment for our artists and producing groundbreaking effects for our clients.”

Namit Malhotra, Prime Focus founder and CEO of Prime Focus World, had this to say:

“This is a transformational event – both for the companies involved and for the industry. Prime Focus has proven over the last five years the undeniable benefits of global collaboration: the flexibility of working in different time zones; the coming together of creative talent from across the globe; and the ability to leverage tax incentives. We can now bring this together with Double Negative’s unquestionable creative excellence to build a truly formidable offering.”

Double Negative’s management team of Matt Holben and Alex Hope will manage the global VFX business and will become Directors and Shareholders in Prime Focus World, while Namit Malhotra will become Executive Chairman of the Board. According to Holben and Hope:

“We have ambitious plans to build on what we’ve achieved in VFX over the last 15 years. This deal allows Double Negative to develop into a truly global operation that provides great work for our clients and great opportunities for our staff.”

The combined credits of the two companies – both recent and upcoming – include Godzilla, Maleficent, Edge of Tomorrow, Transformers: Age of Extinction, Sin City: A Dame To Kill For, Interstellar, Exodus: Gods and Kings and Jupiter Ascending.

 

Exceptional Minds

The students and staff of Exceptional Minds

It takes an exceptional mind to do great work in any field. That’s as true for visual effects and animation as it is for everything else. So what better place to look for talent than in a place called … Exceptional Minds?

Exceptional Minds is a non-profit vocational centre and animation studio that caters exclusively for young adults on the autism spectrum. During a three-year course, students learn the skills they need to earn a living in multimedia, computer animation and post production.

It doesn’t stop there. Not only does Exceptional Minds provide its students with experience working on major productions, but it also operates as a working studio in its own right. Its success is all the more remarkable when one considers that, for young adults on the autism spectrum, the average unemployment rate is around 90%.

I spoke to Susan Zwerman, who is both Job Developer at Exceptional Minds and an experienced VFX producer.

Yudi Bennett, Susan Zwerman, and Ernie Merlan of Exceptional Minds

What was the inspiration behind Exceptional Minds?

Exceptional Minds was started in September 2011 by a group of parents and professionals who were alarmed at the lack of jobs for young adults with autism. Most have a child or grandchild of their own on the autism spectrum, and so have a personal interest. They view this unique school as a way to provide opportunities for their kids where none existed before.

I became involved through my good friend Yudi Bennett, who helped co-found the school, which her son now attends. It’s been a real joy to be able to use my experience and contacts in the VFX industry to help set up the programme from the beginning and, more recently, to bring in work for the students.

Josh, Arielle, Eli, Patrick discuss a shot

What skills do you teach the students?

The school provides the bridge between high school and the working world, so our instructors teach technical, creative, and work skills needed for animation and visual effects. They have real working experience in the digital arts fields, so the knowledge they pass down to our students is hard-earned and practical.

Students gain proficiency in graphic arts, animation, web design, visual effects and rotoscoping. We train them to use Adobe software, gaining ACA Certification in Flash, Photoshop, Illustrator, Dreamweaver and Premiere Pro. We also teach Silhouette and are starting Maya classes.

Watch the Exceptional Minds sizzle reel for breakdowns of the students’ work:

Are there other ways in which the students benefit from being on the programme?

Our students work both individually and on group projects; the latter give them the real-world social skills they need to be successful working collaboratively. The more proficient students function as project leaders, sharing their knowledge with other members of the team.

They also learn the basic skills necessary for employment, as well as how to start a project, manage workflow, and understand deadlines. Students work on projects from the commercial world as well as internally generated content. They’re taught how to market themselves to prospective employers or clients and how to create and present their resumes, portfolios and proposals.

Graduating students display their diplomas

How much emphasis do you place on forging industry links?

Exceptional Minds is known for having close working relationships with the visual effects industry. We get a lot of support and encouragement from companies including:

Exceptional Minds field trip to Zoic Studios

These links have enabled us to provide our students with hands-on experience, including screen credit for student work on features including American Hustle, Lawless, and Dawn of the Planet of the Apes. We’ve also delivered graphic work for United Front, and animation work for Film Roman.

Throughout the first and second year of our three-year programme we plan many field trips to these facilities so that our students can explore the work environment and what they might be interested in pursuing as careers. This has led to real work for our studio as well as employment for our students once they graduate.

Exceptional Minds field trip to StereoD

We recently had a major industry networking event to celebrate our first graduating class – seven exceptional young men and one exceptional young woman – after three years of hard work.

We were humbled by all the industry executives who came out to share that special day with us and to offer their encouragement.

What are your long-term plans for Exceptional Minds?

We just moved into a new facility last year with a blue room and state-of-the-art computer workstations … and we’re already outgrowing it! We’ve grown three-fold since we started, and we have a waiting list, as you can imagine. There’s huge demand for programmes like ours as more and more young people on the spectrum enter adulthood seeking meaningful employment. Exceptional Minds isn’t the answer for everyone on the spectrum, but we’re told that we’re a pretty good model for what works in teaching and preparing these young adults for careers.

Long term, we expect to continue to expand. We have students applying from all over the US and Canada – we even had a student travel from his home in Singapore to participate in one of our summer workshops.

Summer activities will likely be a part of our growth as well – these include workshops for younger high school students. The sooner we can introduce these individuals to the field, the more prepared they’ll be. Recently we received scholarship funds from Autism Speaks and a grant from the Academy of Motion Picture Arts and Sciences to help out with this.

So do individuals on the autism spectrum have a particular aptitude for this kind of work?

Yes. They are very focused and pay close attention to detail, and many of them are visually gifted. Plus, they show up on time and rarely miss work!

We look for students who are talented and passionate about their work. Passion is key – and there’s no shortage of that at Exceptional Minds, as anyone who has been around someone on the autism spectrum will guess. Sam Nicholson, founder of Stargate Studios, told us this was one reason he was interested in one of our students for employment. In fact, Kevin Titcher from our first graduating class started his job at Stargate this week.

Sam Nicholson of Stargate Studios with new employee Kevin Titcher

Yudi Bennett, Exceptional Minds Director of Operations, is an award-winning Assistant Director with feature film credits including Kramer vs. Kramer, The Four Seasons and Pleasantville.

Ernie Merlan, Exceptional Minds Program Director, has a background in visual effects and a passion for mentoring and inspiring young minds.

Susan Zwerman, Exceptional Minds Job Developer, has worked in the film industry as a VFX Producer for over 20 years and is the author of The VFX Producer: Understanding the Art and Business of VFX.

Now Showing – Cinefex 138

Cinefex 138 - From the Editor's Desk

The doors are open on the latest issue of Cinefex!

Issue 138 features in-depth articles on the visual effects of Captain America: The Winter Soldier, in which Marvel’s all-American superhero (Chris Evans) battles against the threat of a powerful Soviet agent. Next up there’s Maleficent – a live-action fantasy starring Angelina Jolie as the villainess from Walt Disney’s 1959 animated feature, Sleeping Beauty – and The Amazing Spider-Man 2, a second outing for Andrew Garfield as the web-slinging crime-fighter, who this time goes up against Electro (Jamie Foxx), The Rhino (Paul Giamatti), and The Green Goblin (Dane DeHaan). Rounding out this issue is Godzilla, in which the iconic monster wreaks havoc once again in an adaptation directed by Gareth Edwards.

But wait. It doesn’t stop there. Also in Cinefex 138, we’re proud to publish O’Brien vs Dawley, a special article in which Stephen Czerkas presents startling new evidence about the legendary Willis O’Brien and his bitter rivalry with producer Herbert M. Dawley … and which effectively rewrites the history of visual effects. Cinefex editor-in-chief Jody Duncan calls it “The Da Vinci Code of visual effects.” And multi-Oscar-winner Dennis Muren says: “A shocking betrayal fit for Extra or TMZ finally gets told … in Cinefex. And it’s a doozie.”

Before you take your seats for the main feature, here’s Jody with a few thoughts about this new issue:

Jody Duncan – From The Editor’s Desk

As I was writing the Captain America: The Winter Soldier story for Cinefex issue 138, I was struck – yet again – by the extraordinary cooperation offered to us by Marvel Studios. Under the leadership of Victoria Alonso (one of the most dynamic, compelling people I have ever met), Marvel let me come in and see the film very early, when there were still a lot of unfinished effects shots – something most studios will never do. They fear we will think an unfinished shot is a finished shot, and then go out and spread the word that the effects in their film stink.

Marvel “gets it”. They get that we have been doing this for 30-plus years, that we DO know a finished effects shot from an unfinished effects shot, and that we aren’t in the business of going online to critique a film or blab about its plot-line. Our confidentiality record is squeaky-clean, yet most studios treat us as if we are newcomers with an “I Leak Film Plots” blog. (Do you detect some exasperation on my part when it comes to dealing with movie studios? You detect correctly.)

Getting to specifics, issue 138 features the first byline for FXGuide’s Mike Seymour, who has been a big supporter of Cinefex for a long time. Mike’s friendship with Godzilla director Gareth Edwards afforded him a front-seat view of the making of the film. Joe Fordham pulled off a particularly heroic feat this issue, meeting the deadlines for both of his articles – on Maleficent and The Amazing Spider-Man 2 – while also writing a soon-to-be-published book on the Planet of the Apes franchise. (I don’t think Joe got much sleep the past three months.) Finally, issue 138 is a nice “culmination” moment for me, because I first interviewed Maleficent director Rob Stromberg many years ago when he was a shy, 19-year-old matte painter. I love seeing the talent rise!

Thanks, Jody. Now, is everyone sitting comfortably? 3D glasses free of fingermarks? No rustling the candy wrappers when the lights go down. You two in the back row – stop that at once! Everybody ready? On with the show!

Firing Up The Machine

The Machine - birth scene

Pick a robot from the movies. Go on, I know you have a favourite. There are plenty to choose from, so I’ll give you a moment …

Maschinenmensch - Metropolis

Which did you choose? Was it the Maschinenmensch – the mechanical doppelganger of Maria, as seen in Fritz Lang’s seminal Metropolis? Or did you opt for Arnold Schwarzenegger as the relentless cyborg assassin from The Terminator? Or how about Robby, that benevolent Ariel-analogue from Forbidden Planet?

Perhaps you picked one of the replicants from Blade Runner, arguing that, for a filmmaker, a genetically-engineered organism serves the same narrative function as most fictional robots – namely to hold up a mirror to our own humanity. Bishop from Aliens would probably agree with your thesis – after all, he does prefer the term “artificial person”.

As an alternative to all these classic movie icons, I’m inviting you to consider Ava, android star of British indie film The Machine, released this week in the US on DVD/Blu-Ray. Like many of her illustrious android predecessors, Ava is a physical marvel. Intellectually and emotionally she’s no slouch either; indeed, she may very well possess the world’s first self-aware artificial intelligence. Make no mistake – Ava is a robot to be reckoned with.

Caity Lotz as Ava in “The Machine”

The Machine begins with computer scientist Vincent McCarthy (Toby Stephens) trying to create a true thinking machine. When a military agenda adds robot weaponisation into the mix, the result is Ava – an acrobatic, intelligent robot with plenty going on beneath her artificial skin.

Development of The Machine’s visual effects began early on, during the production of a three-minute proof-of-concept film by producer/director team John Giwa-Amu and Caradog James of Red and Black Films. Bait Studio created the promo’s VFX, in particular developing a look for the robot’s glowing skin. Once finance was secured, they worked alongside Minimo VFX and Tim Smit to deliver the visual effects for the feature.

“Pulling together such a talented team of VFX guys across Bait (Wales), Minimo (Barcelona) and Tim Smit (Holland) was a key factor in the film’s success,” said Giwa-Amu. “We knew that The Machine might live or die on the quality of the VFX and our three lead providers allowed us to compete with some of the best films on release.”

Director Caradog James on the set of “The Machine”

Christian Lett and Llyr Williams, Visual Effects Supervisors at Bait Studio, described the scope of their work on the film:

“We designed, animated and composited the computer user interface graphics, CG prosthetic enhancements, set extensions and matte paintings, gore and gunfire, skin glows, and all the eye glows. We worked very closely with the director, helping to shape the look he wanted and developing approaches that matched his vision. That gave us a real insight into how our work fitted into the narrative, as well as the look of the movie.”

As well as contributing to the near-future ambience, the film’s many and varied computer displays also communicate key story points.

“Caradog wanted the computer UI to look cool, but functional – and not too slick,” said Lett and Williams. “In particular, he told us not to get UI envy from films like Avengers Assemble or Minority Report. He wanted the system to look like it might have cost a lot of money to develop back in the day, but had since been hacked by many MOD engineers to make it do what they wanted. We researched some real-world military UI design – along with designs from other films and TV shows – to help us find the right balance for The Machine. Bait Studio has an award-winning motion graphics team alongside our VFX artists, so these shots in particular played to many of our strengths.”

Ava glowing skin effect – before and after digital enhancement

The Machine features an array of robot tech, from prosthetic limbs and brain implants to the star of the show, Ava, played by Caity Lotz. All the tech – Ava included – looks human on the outside, but its fleshy exterior conceals a powerful metal chassis that, under certain circumstances, emits light.

“We worked a lot with Nuke on the skin glows,” said Lett and Williams, “using projection mapping techniques to track the movement of the actor’s skin. This meant we could quickly achieve and fine-tune the subsurface ‘metal under skin’ look that Caradog was looking for. The compositing department had the heaviest lifting to do because of the volume of shots. Most of the work was 2D, with small amounts of CG being brought into Nuke’s 3D environment, rather than relying solely on render passes out of Maya.”

In one memorable sequence, created by Minimo VFX, Ava explores her physicality by dancing through a huge underground chamber. Shot in near darkness, the lyrical scene makes striking use of the robot’s glowing substructure. Minimo VFX Directors and Co-Founders Felix Balbas and Maurizio Giglioli explained their approach:

“Minimo VFX worked extensively with the director on the design of the glowing inner skeleton, in order to create not merely a believable effect, but something with a dreamy-evanescent feeling. Our design had to reflect the very intimate journey taken by the character in the sequence. The footage was already very elegant, so it was a matter of preserving it, adding a touch of “magic”. Because it was shot in darkness and silhouette, the dance sequence was a very difficult body-tracking challenge, for which Minimo turned to Peanut VFX for support.

“We went through several iterations of shading and comping tests, trying to successfully incorporate all the ingredients: internal layers of self-shading, sub-surface scattering and normal ambient effect – all quite challenging for objects that are inside a body, and almost completely in silhouette. After many tests, we realised that the sharper the contrast was, the better it worked. This way we maintained the original feeling of the photography, and kept the layers as subtle as possible.”

Prosthetic and digital effects create the illusion of Dawson’s missing skull

In the film’s dramatic opening sequence, McCarthy interviews a wounded soldier, Paul Dawson (John Paul MacLeod), whose catastrophic injury has cratered a large section of his head. The soldier’s startling appearance was the result of Bait Studio’s digital work, combined with prosthetics by Paul Hyett.

“The actor wore prosthetic make-up for the edges of the wound, with a green section that was removed and replaced with a CG prosthetic,” said Lett and Williams. “We used Autodesk 123D Catch to create the CG wound from the silicone prosthetic, then cleaned it up and rendered it in Maya. The actor’s head was tracked using SynthEyes, and the final composite was done using Nuke. The most challenging part of this shot was rebuilding Toby Stephens’s hair as it passed in front of the yellow tracking markers.

“The opening sequence has 55 VFX shots, ranging from glowing eyes, tabletop computers (later covered in blood) and gore to robotic tentacles lifting a girl into the air. We were really pleased and proud to have our biggest sequence at the very start of the film, and the feedback has been overwhelmingly positive. This scene establishes the film visually as a cut above most low budget science fiction films, and helps sell the world that you’re in. Being able to help with both these things was very rewarding.”

Watch the video for breakdowns of Bait’s visual effects from The Machine:

"The Machine" posterIn all, Bait Studio delivered 250 shots for The Machine over a period of about four months from concept to delivery. Reflecting on the company’s first feature experience, Lett and Williams concluded:

“We’re constantly learning on projects and applying that knowledge to the next, so we can streamline our workflow and develop as a studio. It also showed us how beneficial it is to have a design capability within the studio, so we can offer creative as well as technical solutions to filmmakers. The Machine was a great opportunity to showcase our work and also to show that we have the resources and talent here in Wales to tackle this type of project.”

Images copyright © 2014 Red and Black Films & Bait Studio.

Sony Pictures Imageworks Moves HQ to Vancouver

"The Amazing Spider-Man 2" with visual effects by Sony Pictures Imageworks

“The Amazing Spider-Man 2” features state-of-the-art visual effects by Sony Pictures Imageworks.

“The survival of Los Angeles has always depended upon the daring use of massive resources to create new opportunities in a difficult environment.” – Robert M. Fogelson, The Fragmented Metropolis, University of California Press, 1993.

At the end of the nineteenth century, the area where Los Angeles now sits was a semi-arid plain dotted with pueblos and prone to catastrophic flooding. Although it was close to the sea, it had no natural harbour, and transport links with the rest of the US were rudimentary. It’s hard to imagine a less likely spot for a major urban centre to rise up.

Yet rise it did, a sprawling metropolis conjured into reality by sheer force of will, a city pulling itself up by its own bootstraps. Little wonder Los Angeles is known as the City of Dreams: the place literally dreamed itself into reality.

It’s fitting that cinema – an industry built on dreams – should have played such an important part in the growth of the region. Culver City in particular has lain at the heart of filmmaking ever since 1915, when city developer Harry Culver encouraged film pioneer Thomas Ince to move his centre of operations to the growing urban centre. The historic tract of land on which Ince’s Triangle Studios was built is now part of the Sony Pictures Studios lot.

"Ince Culver City Studios Now Open" - from the February 5, 1916 edition of "Motography"

“Ince Culver City Studios Now Open” – from the February 5, 1916 edition of “Motography”

It’s ironic, therefore, that Sony Pictures Imageworks – the visual effects and animation unit of Sony Pictures Digital Productions – should be the latest filmmaking enterprise to move its centre of operations not only away from the Los Angeles region, but all the way across the US border and into British Columbia.

As announced in a press release of May 30, 2014, the company is moving into a state-of-the-art facility, leaving only a small presence in Culver City to interact with local filmmakers. With accommodation for up to 700 employees, the new headquarters will have the largest footprint of any visual effects company in Vancouver. The move follows the opening in 2010 of SPI’s Vancouver production office with a staff of 80 artists, and the subsequent growth of the workforce in 2013, during the production of The Amazing Spider-Man 2, to over 350 staff.

According to Randy Lake, Executive Vice President and General Manager, Digital Production Services, Sony Pictures Digital Productions:

“Vancouver has developed into a world-class center for visual effects and animation production. It offers an attractive lifestyle for artists in a robust business climate. Expanding our headquarters in Vancouver will allow us to deliver visual effects of the highest caliber and value to our clients.”

Sony Pictures Imageworks

Sony Pictures Imageworks recently delivered the visual effects for Edge of Tomorrow, and current and future projects include Guardians of the Galaxy, Pixels, Angry Birds, Hotel Transylvania 2 and the next as-yet-untitled Smurfs movie. It’s a line-up that will keep the Vancouver team busy for the foreseeable future, and further cement British Columbia as one of the world’s go-to places for visual effects.

The good news is that Sony Pictures Imageworks appears to be thriving, and that its artists continue to create jaw-dropping, award-winning work. It’s encouraging to see continued investment in, and commitment to, the visual effects industry. In Vancouver, VFX is on the up, just as it is in London and other centres around the world.

But there’s a cost, not only to the workers watching their jobs march across the horizon, but also to the regions that find themselves unable to attract industries in an increasingly competitive global marketplace. Plenty of commentators are warning of the potentially fragile state of the VFX industry, particularly when a company’s decisions about where to set up shop may be significantly influenced by government subsidies.

Life After Pi

Scott Leberecht, director of Life After Pi, a film documenting the demise in 2013 of Rhythm & Hues Studios, had this to say about SPI’s move to Vancouver:

“While making Life After Pi, we quickly realized that only movie studios and the corporations that own them benefit from the temporary nature of subsidized regions. When the handouts in New Mexico stopped affecting the bottom line, Sony pulled out. They will do this again when things change in Vancouver. Decision makers at the top don’t care about the consequences, because it doesn’t affect them in a negative way at all. They don’t have to move their lives to a different country or state every time a new deal is negotiated. As long as it is legal, they will continue to milk governments all over the world, regardless of how damaging it is to the lives of their workers and the future of the visual effects industry.”

So where does all this leave the visual effects industry in Los Angeles? Will VFX prove to be just another gold rush that brought brief prosperity to the American West, only to collapse, leaving behind abandoned facilities and ghost towns? Has the time come for the City of Dreams finally to wake up?

The romantic in me hopes the home of cinema has a few surprises left up its sleeve. Given the history of LA, I want to believe the dream isn’t over yet. After all, was the city not founded on the belief that anything is possible? If Robert Fogelson is to be believed, all it needs is for someone to come along who is daring enough to use “massive resources to create new opportunities in a difficult environment”.

Any takers?

“The Amazing Spider-Man 2” image copyright © 2014 Sony Pictures Digital Productions Inc. “Motography” page image from Media History Digital Library.

They’ve Been Fixing It In Post For Years

Say “visual effects” to the average person in the street, and what will they think about? Enormous monsters? Even more enormous spaceships? Superheroes performing enormously impossible stunts?

Sounds about right to me.

Say “visual effects” to a visual effects artist, and they’re just as likely to think about wire removal, background replacement and the tedious restoration of some tiny detail that should have been photographed in production, but wasn’t.

It’s tempting to imagine that, given the pixel-perfect possibilities of digital manipulation, fixing things in post is a relatively new thing. Not so. To prove it, I’ve unearthed an article from the September 1939 edition of International Photographer magazine, discussing the role of Paul Lerpae, who at the time was Paramount’s first cameraman in charge of optical printing and montages.

Paul Lerpae with a Paramount optical printer, circa 1939

I fell in love with this article right out of the gate. Here’s how it begins:

All camera wizardry is figurative. The special effects branch of motion picture photography includes a number of able gentlemen who combine science and ingenuity in a form of magic as baffling as any trickery concocted by Merlin for visiting Connecticut Yankees.

The notion that visual effects artists are “able gentlemen” might be considered sexist today, but in context it’s utterly charming. As for what these gentlemen did, well, some of it was as mundane as it gets:

It was discovered one day that in an important scene, an American flag had been photographed with the stars on the right instead of on the left, as they should be. To retake the scene with the flag hung correctly would have cost from $5000 upwards. The optical printer was called into service and the flag was “doubled in.” The job was photographically satisfactory and the saving was obvious.

And there you have it: fixing it in post, circa 1939. Here’s how the article sums up the revolutionary process of applying a photochemical Band-Aid:

The most spectacular aspect of the work of the optical printing cameraman is this sensational and wizard-like ability to save scenes, to solve problems and, in short, to play the role of “safety man” for the rest of the production team.

So, was being “safety man” the only thing a visual effects artist had to look forward to in those pioneering days? Not at all:

Under modern production conditions … the special effects and montage work is an integral part of the story preparation from the early stages of scripting. Method varies on different lots, but the general procedure calls for advance planning for unusual and dramatically effective photographic twists and stunts, rather than using special effects and optical printing merely as time and money saving mediums.

The way this is shaping up, I reckon the day-to-day toil of a 1930s VFX artist was similar to that of his modern counterpart. Could the same be said of his working conditions? Well …

Each studio organization likes to train its own men to its particular method of operation. Familiarity with the particular studio’s individual technical gags and devices, trade secrets and pet ways of accomplishing results, is absolutely essential. Consequently, there is less shifting from one lot to another than in any other branch of motion picture photography.

Hmm. In a week when Sony Imageworks announced it’s moving its head office from Culver City to Vancouver, I guess we can hardly assert the 21st century VFX industry enjoys “less shifting from one lot to another”.

Given all this emphasis on the technicalities of the profession, and the training of staff in the “studio way”, was there any place for artistry in those early days? Were visual effects artists just grunts obeying orders, or were they actually furthering the art of cinema? Here’s what the article has to say:

Special effects workers view their jobs as integral parts of the complete task of creating dramatically effective screen entertainment. They can’t see why special effects can not be regarded as equally legitimate phase of artistic contribution to screen entertainment with the dramatic tricks of a skilled playwright or the artistic license taken by poets and painters. In other words, special effects tricks and stunts are just another aspect of industry progress toward placing a richer and more effective type of entertainment upon the screen.

Finally, Lerpae has his own views on where visual effects might take filmmaking in the future. While he doesn’t cite an actual year, it’s tempting to imagine he might have been looking a nice round 75 years ahead … say to the unimaginably futuristic year of 2014.

Future of special effects progress holds great possibilities for the industry, Lerpae believes, dependent entirely upon the enthusiasm generated among production creators for more and trickier special effects work, plus ability of special effects technicians to satisfy this enthusiasm with practical results.

Enthusiasm. Ability. Great possibilities. Is there anyone working in the industry today who can’t get behind those three things? And is there anyone who wants to contradict the simple truth demonstrated by this article?

The truth that, for the most part, nothing changes so much as it stays the same.

G is for Greenscreen

G is for Greenscreen - The Cinefex VFX ABC

In the VFX ABC, the letter “G” stands for “Greenscreen”.

You’re standing on a film set. What do you see? Cameras? Lights? A craft service table laden with muffins? A hundred people standing around waiting for something to happen?

Look hard, and you may also see something else: a piece of visual effects technology so commonplace that the eye just skitters over it, barely even registering it’s there – strangely appropriate, because the object’s sole function is to appear completely invisible to the camera.

I’m talking, of course, about the humble greenscreen.

Everyone knows what a greenscreen does. When you point a camera at it, the flat primary colour creates a blank space into which those clever visual effects artists can put anything they like. The greenscreen is a blank canvas ready and waiting to be painted with a spectacular Himalayan panorama, a brooding alien cityscape, a speeding freeway … whatever the backdrop, green is queen.

But can its reign continue? To find out, I asked a panel of VFX professionals whether they thought greenscreens would still be around in ten years’ time. Before they offer their thoughts on the future of greenscreen, however, let’s take a moment to consider its past.

Before and after greenscreen composite from “Avengers Assemble” by ILM

Greenscreen Past

The history of greenscreen is really the history of compositing, which the Cinefex VFX ABC explored in C is for Composite. Still, it never hurts to refresh the memory.

A fundamental discipline of visual effects is the combining of one image with another in a sort of kinetic collage. Typically, this involves cutting the moving image of an actor out of one shot and pasting it into the background of another. To do this effectively, you need a foolproof way of making a moving mask that precisely matches the actor’s constantly-changing silhouette. This mask is known as a travelling matte.
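
For readers who like the nuts and bolts, here’s a minimal sketch of that combination step – the classic “over” blend – written in Python with NumPy. The function and variable names are my own, purely for illustration; a real compositing package does a great deal more than this.

```python
import numpy as np

def composite_over(foreground, background, matte):
    """Place a matted foreground element over a background plate.

    foreground, background: (H, W, 3) uint8 RGB frames of the same size.
    matte: (H, W) float array in [0, 1] -- white (1) where the actor is,
           black (0) where the background should show through.
    """
    alpha = matte[..., None]                 # broadcast the matte across R, G, B
    fg = foreground.astype(np.float32)
    bg = background.astype(np.float32)
    out = fg * alpha + bg * (1.0 - alpha)    # the classic "over" blend
    return out.astype(np.uint8)

# For a moving shot, the same blend runs on every frame with a fresh matte --
# that per-frame sequence of masks is the travelling matte.
```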

Ever since the early days of cinema, filmmakers have experimented with different ways of creating travelling mattes. One of the earliest solutions is still in use today: filming an actor in front of a coloured screen.

“The Thief of Bagdad” features some of the earliest Technicolor bluescreen composites

Developed in the 1930s, the Dunning Process used a blue screen, and required the actors to be illuminated with yellow light. Coloured filters were used to separate foreground from background, but the process only worked in black and white. The arrival of colour film led to more complicated systems of filters and optical printers being used to isolate the actors against the bright blue screens.

Why blue? Because the cool colour of the screen was at the opposite end of the spectrum to the warm skin tones of the actors standing in front of it; the contrast made it easier to create a good matte. You just had to make sure the wardrobe department didn’t dress your leading lady in a bright blue evening gown, or else she’d disappear before your eyes.

Preparing a bluescreen shot for “Ghostbusters”, in which the Stay-Puft Marshmallow Man was composited into live-action plates shot in New York

In the ‘60s and ‘70s, Disney had great success with its sodium vapour process – screens lit with yellow sodium vapour lamps – used in films such as Mary Poppins. But for the most part the colour of choice remained blue. Once digital techniques came on the scene, however, blue began giving way to green.

So why the colour shift? One reason is that many digital cameras are configured using a Bayer Pattern, in which there are twice as many green sensors as either red or blue; these cameras are naturally more sensitive to the green end of the spectrum. And greenscreens often perform better outdoors, in environments where a traditional bluescreen might blend with the sky.
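
To make the idea concrete, here’s a toy colour-difference keyer sketched in Python with NumPy. It simply measures how far the green channel rises above red and blue at each pixel – the function name and threshold are invented for illustration, and production keyers such as Keylight or Primatte are vastly more sophisticated.

```python
import numpy as np

def green_dominance_matte(frame, threshold=0.15):
    """Estimate a matte from a greenscreen plate by green dominance.

    frame: (H, W, 3) uint8 RGB image shot against a greenscreen.
    Returns a float matte in [0, 1]: 1.0 on the subject, 0.0 on the screen.
    """
    rgb = frame.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    dominance = g - np.maximum(r, b)         # how much green exceeds red/blue
    # Strong green dominance means "this pixel is the screen" (matte -> 0);
    # little or no dominance means "this pixel is the subject" (matte -> 1).
    matte = np.clip(1.0 - dominance / threshold, 0.0, 1.0)
    return matte
```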

In many situations, however, the bluescreen is still the filmmaker’s best option – it just depends on the demands of the individual shot.

"White House Down" bluescreen composite

This composite shot by Crazy Horse Effects from “White House Down” proves the traditional bluescreen is still alive and kicking

Greenscreen Present

In the old days, lighting a bluescreen was a big deal. Because the optical department was reliant on delicate photochemical processes, it was vital that the blue colour captured in the original photography was as flat and clean as possible. For that reason, most bluescreen shots were set up on the soundstage, under carefully controlled conditions.

The effectiveness of modern colour separation tools – and the trend towards smaller set builds augmented by digital extensions – has led to a more relaxed approach. You’ll find greenscreens of all shapes and sizes on many location shoots, filling in the gaps between buildings or blocking off the ends of streets. Entire sets might be built and covered in greenscreen material, allowing actors to clamber over blocky toytown structures which will be replaced in post-production by entire digital environments.

Smaller greenscreens are used within the sets, or even on the bodies of the actors. Wondering what to display on that bank of monitors in the spaceship’s control room? No problem – just set the screens to green and drop in the funky graphics later. Need to alter the anatomy of your lead actor’s head? Easy – just give him a greenscreen bald cap and get VFX to track in the tentacles.

With a greenscreen, you really can do anything.

In fact, greenscreens have become so familiar that even Joe Public – who’s more interested in popcorn than post-production – understands broadly what they do. Granted, the only key he knows is the one that fits his front door, and he might wonder why the rotoscope department is always griping that there might as well not be a greenscreen there at all – “Gee whiz, the thing doesn’t run to the edge of the set, and it’s not even lit properly, I mean, these things aren’t magic carpets, you know!” Nevertheless, the greenscreen has become a universal shorthand for “visual effects go here”. If there’s a single image that symbolises the visual effects industry for the outside world, the greenscreen is it.

It’s an icon for people within the industry too. Take a look at all the VFX professionals you follow on social media. How many of their online avatars are bright green squares? Quite a few, right?

The Go Green movement rose up over a year ago with an agenda to raise awareness of inequalities within the visual effects industry – in particular the effect of nationally-granted subsidies across an international marketplace. The movement is still going strong, and the symbolic power of the greenscreen remains at the heart of its campaign.

There’s just no escaping it: the greenscreen is a dominant force in visual effects. In fact, it’s hard to imagine what filmmaking would be like without it.

Greenscreen Future

Let’s fast-forward ten years to a movie set of the near future. Look – there’s the camera. Mind your head on the lights. Hmm, looks like we could do with a fresh batch of muffins on the craft service table.

Now, let’s look for the greenscreens. Ten years on, are they still around? If not, what new technology has come along to replace them?

Here’s what our panel of visual effects experts had to say:

Visual effects technology continues to progress and develop at a high rate. Even now our teams have had to become adept at working around lack of green screen when time constraints/filming schedule prohibit its use. Having said that, I think in ten years time, greenscreen or an equivalent will still be needed when actors are in frame. I can see a time when greenscreen could be replaced with live feeds that can still be keyed off, but have the massive advantage of providing actors with on-set feedback. It would be an interesting development that would be beneficial both for us and for the wider production. – Jeff Clifford, Head of R&D, Double Negative

We’ll probably be using more sophisticated systems for real-time keying on location in order to visualize complex visual effects shots, but the reality is that green (or blue) screens are still very useful, and will likely continue to be for the foreseeable future. We are still coming up with better ways to light actors on green screen to make the integration better. But there are techniques that will likely revolutionize this, ie real-time rendering and the motion capture of performances. I can imagine a not-too-distant future in which we can create 100% photo-real characters, captured in real-time and rendered on a 100% digital environment. – Aladino Debert, Creative Director and VFX Supervisor, Advertising & Games, Digital Domain

I’m pretty sure that in 10 years we won’t be using color difference matting with green or blue screens any more. Future VFX youngsters will feel about this technique much the way we feel about using miniatures today. Cameras which capture depth data are already available. When the resolution of these channels increases, we’ll place set extensions and digital creatures not just behind the plate, but within it. This will complete the deep compositing idea. Meanwhile, I guess, VFX artists will continue spending their time on rotoscoping plates, where it was not possible or too expensive to setup a green screen. – Sven Martin, VFX Supervisor, Pixomondo

Yes, I believe we will still be using greenscreens. Manual rotoscoping is an art form in itself, but even the best roto artist will never match the precision of a greenscreen key. It’s impossible to determine the exact colour and opacity of a hair at a given pixel using even the best rotoscoping system, and to be consistent and accurate over the entire image and a whole sequence of images. Other software solutions which have attempted to extract foregrounds from their backing have been promising, but thus far have proven to be either temporally inconsistent, or simply less precise than a greenscreen. Rear projection has recently been tried again with stunning success in Oblivion. With improvements in projectors (increased dynamic range) I can see this idea being used more often. It does have its disadvantages though; you need to know in advance exactly what you want in the background. Rather than an end to greenscreen use, I hope we will see a hybrid solution: the continued development of the technology and an amalgamation of ideas targeting the same problem. A more intelligent keyer might consider not only colour, but depth, focus, disparity and other image factors to compute whether a pixel is solid foreground, solid background, spill, or transparent foreground. But it seems like it will be a long time before there is a set of circumstances in which a greenscreen would not be at least part of the solution. – Charlie Tait, Head of Compositing, Weta Digital

We will definitely still be using green and blue screens in 10 years time. Technology and techniques are improving, but some classes of problem just require them, and will for the foreseeable future. – Ken McGaugh, VFX Supervisor, Double Negative

I anticipate still using greenscreen insofar as there will be a need to extract live performance from unwanted background. It will be more electronically procedural, with less burden on set-up and lighting to specifications. I think on-set needs will be more forgiving. – Joe Bauer, VFX Supervisor, HBO’s Game of Thrones

Yes, we will still be using greenscreens. There will be advances in technology that will simplify the process, but I don’t think enough of an advance to automate the cutting of mattes. I also don’t believe all advances in technology will be accessible to every filmmaker. However, I do feel this is where 3D stereo technology will come in handy, with further exploration of depth maps. This is probably the area that will bring about the eventual elimination of green screen. – Lon Molnar, Owner & VFX Supervisor, Intelligent Creatures

I would love to see a day when we could do “deep” filming: somehow map out depth and use this to help automate our composites. This is years away from being a reality. Often there are times we choose to not use a blue or green screen, opting to rotoscope instead, but blue and green screens are here for the next 10 years and beyond. – Geoff Scott, VFX Supervisor, Intelligent Creatures

Cameraman Don Dow attends to the miniature sail barge while assistant Patrick McArdle prepares the Vistarama motion control camera, in this bluescreen set-up from “Return of the Jedi”

Conclusion

Well, the consensus seems to be that greenscreens – and blue – aren’t going anywhere anytime soon. Still, given that I can already download an app to my smartphone that will scan an object, isolate it from its background and derive 3D geometry from the data, the dream of “deep filming” may be closer than we think.

Until it becomes a reality, however, the greenscreen seems likely to dominate as the VFX background of choice, and thus will continue to be what it’s always been: the original field of dreams.

Avengers Assemble photographs © 2012 by Marvel Entertainment. Ghostbusters photograph copyright © 1984 by Columbia Pictures Industries Inc. Prometheus photographs copyright © 2012 by Twentieth Century Fox. White House Down photographs © 2013 by Columbia Pictures. Return of the Jedi photograph copyright © 1983 by Lucasfilm Ltd.

The Five Laws of Movie Mutants

A scene from Warner Bros. Pictures’ and Legendary Pictures’ epic action adventure “Godzilla,” a Warner Bros. Pictures release.

With Godzilla stomping once more on to our screens, Marvel’s X-Men heading for those days of future past, and a fresh batch of teenage turtles preparing to clamber out of the sewers, there’s never been a better time to reflect on that most enduring of movie icons: the mutant.

But what exactly is a mutant? It isn’t as if they all look the same. In a police line-up, who’s going to mistake Michelangelo for Mystique? (And will the giant lizard even fit in the interrogation room?) Yet mutants they all undoubtedly are. So what are the rules?

Well, the Oxford English Dictionary describes a mutation as an “alteration or change in form”. In other words, a mutant is any critter that shouldn’t look the way it does. Sometimes a mutant’s abnormalities occur naturally. More often than not (in the movies at least) they happen because some idiot forgot to shut down the nuclear reactor.

Tod Browning's "Freaks" from 1932Mutants in the Movies

An early example of a movie that deals with mutants is the 1932 film Freaks, directed by Tod Browning, just a couple of years before the Hays Code irradiated Hollywood with its sanitising beam and effectively told filmmakers to play nice. With its cast of genuine carnival performers including conjoined twins, the Human Torso and the Stork Woman, Freaks makes for unsettling viewing even today, yet for the most part treats its subject matter with admirable respect.

No doubt experts will weigh in to tell me the cast of Freaks were not mutants in the strictest sense, any more than John Hurt was portraying a mutant when he played John Merrick in The Elephant Man. But chances are high that the average moviegoer, upon seeing any character whose anatomy deviates wildly from the norm, will shout: “Mutant!”

However, if you’re looking for the best way to put a mutant on the screen, look no further than The Elephant Man. Merrick’s startling physical appearance was achieved by encasing Hurt in elaborate prosthetics created by Christopher Tucker, making the film a perfect exponent of the First Law of Movie Mutants:

  • If it’s a deformity you want, it’s make-up effects you need

Of course, there are exceptions to this law – Freaks is one, The Hills Have Eyes (1977) is another. In the latter, actor Michael Berryman’s genetic disorder ectodermal dysplasia was behind the physiognomy of Pluto, one of horror cinema’s most iconic mutant characters.

But really, it’s all about rubber masks and foam latex appliances. The schlockier the better. In movies like The Toxic Avenger, Basket Case, and Slither, the audience is bombarded with ever-more gruesome make-up effects, all with a single purpose: to gross you out. Thus we discover our Second Law of Movie Mutants:

  • Horror is where it’s at

Science Goes Bad

The Third Law of Movie Mutants is a cautionary one:

  • Don’t leave the plutonium where someone could trip over it

During the 1950s, the atomic age spawned a whole new breed of movie mutants: ordinary animals given extraordinary powers by an unhealthy dose of gamma rays.

Radioactive mutants call for more elaborate visual effects than just a rubber mask. The giant ants of Them! were brought to life primarily by using full-scale mechanical props, while It Came from Beneath the Sea boasts a bridge-bashing octopus created by stop-motion maestro Ray Harryhausen.

For economy and ease of animation, the giant octopus in “It Came from Beneath the Sea” had only six tentacles

Replace radiation with toxic waste and you get either a bunch of Eight Legged Freaks, or those Teenage Mutant Ninja Turtles I mentioned earlier. In their 1990 screen outing, the avenging amphibians were brought to life by Jim Henson’s Creature Shop. In the upcoming 2014 reboot, the visual effects burden is shared between ILM, Image Engine and Legacy Effects.

Behind the scenes of "Godzilla" (1954) - image via Retronaut.com

Behind the scenes of “Godzilla” (1954) – image via Retronaut.com

The most famous product of misdirected radiation is, of course, Gojira – better known in the Western world as Godzilla. The huge mutant lizard – an unfortunate by-product of a nuclear test explosion – flattened his first city block in Ishirō Honda’s classic film of 1954. Now this legendary monster’s on the loose again, in Gareth Edwards’s Godzilla (2014). The special effects credits in the Japanese original include Teizô Toshimitsu with the wonderful job description: Monster Builder. The latest outing features visual effects from a host of vendors including:

(To find out exactly how Godzilla’s mega-monster was created, order your copy of Cinefex issue 138, out in mid-June and featuring the definitive behind-the-scenes story about the film’s visual effects. And visit Retronaut for more great behind-the-scenes shots from the original movies.)

Messing with Nature

The Fourth Law of Movie Mutants is a kissing cousin of the Third Law:

  • Man makes mutants

Yes, we’re talking about all those thrillers featuring genetic mutations, military experiments … in short, any scenario whereby man has deliberately messed with nature. Whatever the dastardly misdeed, you can be sure of one thing: it’s not going to end well.

Mutants in this category range from the super-fish of Piranha to the hyper-sharks of Deep Blue Sea. Factor in guilty pleasures like Jonathan King’s Black Sheep, and you’ll soon realise that this enduring theme has achieved the kind of mythical status best appreciated in the comfort of your own home, preferably with a jug of cold beer at your side and in the company of friends just as ready as you to throw pretzels at the screen.

Laughable though much of this particular brand of mutant mayhem might be, the “man-made mutants” sub-genre isn’t without its genuine chillers. The Mist conceals an extraordinary menagerie of half-seen monsters – not to mention a monster of an ending. The Fly – both the 1958 original and the 1986 remake – is a compelling cautionary tale about what happens when you fail to keep a can of Raid beside the teleporter. And I defy anyone to watch the classic “room of failed clones” scene from Alien: Resurrection without feeling queasy.

The walking Space Bug from the finale of David Cronenberg’s “The Fly” functioned off a counterbalanced slave system, with the puppet moving in a direction opposite to the movements of operator John Berg, who was harnessed behind the creature’s mobile support system

Mutants to the Max

In most of the above-mentioned films, the mutants are there to exploit our fears. Fear of nuclear tests, fear of deviancy, fear of the unknown. But there are a few films – just a few – that take a more thoughtful approach. These are the movies that fall under The Fifth and Final Law of Movie Mutants:

  • Mutants can be a serious business

First up in this category is John Carpenter’s The Thing, which explores ideas of identity and what it is to be human. How does it do this? By letting the mutations happen right in front of our eyes. Thanks to Rob Bottin’s amazing practical effects, men transform in outrageous fashion into the most eye-boggling array of movie monsters ever put on the screen. You could argue that the end result is just another monster movie – though an extraordinarily good one – but what sets The Thing apart is that the theme of mutation isn’t simply tacked on for the sake of cheap thrills; it actually drives the story.

Kuato and George puppet - "Total Recall" (1990)

Even with a computer to record and perform the lip-sync articulations, the “Total Recall” Kuato-and-George puppet required as many as twenty on-set operators

Similarly, the colony of downtrodden Martian mutants in Total Recall (1990) is not just there to titillate (with the possible exception of the triple-breasted woman), but is an essential plot element. Under Paul Verhoeven’s direction, Total Recall can hardly be called subtle, but its subtext of prejudice and exploitation is handled with surprising sensitivity.

Total Recall used all the tricks in the book to create its cast of mutants. According to special make-up effects supervisor Rob Bottin, “Mutants are a lot of fun in terms of design because there are really no rules – a mutant can be anything. It was Paul’s approach that as we revealed each mutant, the deformities would be progressively more shocking.”

Bottin’s biggest effects challenge in Total Recall was Kuato, the parasitic twin of George. To create the illusion of the tiny human emerging from his brother’s abdomen, actor Marshall Bell wore a full-body prosthetic. “His jawline was joined to the prosthetic,” explained Bottin, “which had a parachute harness to support Kuato’s mechanical head and all the cables necessary to operate the arms.” For close-ups and dialogue scenes, the prosthetic approach was abandoned in favour of a fully mechanical Kuato-and-George puppet. “The arms on both Kuato and George moved via a slave mechanism operated and performed by a single puppeteer.”

"X-Men: Days of Future Past" posterThe other popular example of mutants done right is, of course, the X-Men series. Professor Charles Xavier’s private academy first unleashed its class of misfits on to cinema screens in 2000, elaborating on Total Recall’s “mutants as an underclass” theme with its story of politics, prejudice and superpowers.

The popular Marvel franchise shows no signs of slowing as we anticipate this month’s release of X-Men: Days of Future Past. Visual effects vendors for the new film include Digital Domain, MPC, Cinesite, Hydraulx, Rising Sun Pictures, Legacy Effects and The Third Floor, and we’ll be taking a look at them shortly, right here on the Cinefex blog.

The movies are full of mutants – which one’s your favourite, and why? Are mutants a serious business, or is it all just shock treatment? And are there any Mutant Laws I’ve failed to include? It’s over to you!

Total Recall photographs copyright © 1990 by Tri-Star Pictures, Inc. Godzilla (2014) photograph courtesy of Warner Bros. Pictures and copyright © 2014 Warner Bros. Entertainment Inc. & Legendary Pictures Productions LLC

Fulldome – Films in the Round

National Space Centre, Leicester, UK

When was the last time you went to a planetarium? What did you see? An astronomer telling you all about the night sky? A bunch of stars projected on to the inside of a dome? Maybe a laser making psychedelic patterns to a prog-rock beat.

Well, things have moved on.

One of the places they’ve moved to is the UK’s National Space Centre in Leicester. As well as containing a pair of rockets, a real chunk of moon rock, and the Sir Patrick Moore Planetarium, this futuristic visitor attraction also houses NSC Creative, a leading creator of fulldome immersive experiences.

Fulldome shows started popping up at planetariums during the 1990s. Suites of nifty new projectors meant you could suddenly do more with your dome than just throw a few dots up on the ceiling. Instead, you could project a movie – but not any ordinary movie. A fulldome film extends beyond the confines of the traditional cinema screen, presenting moving images that both fly over your head and creep up on you from behind.

Fulldome really is film in the round.

I spoke to Paul Mowbray, Head of NSC Creative, and Aaron Bradbury, CG Supervisor, about what life is like under the dome.

The NSC Creative team

The NSC Creative team.

The Fulldome Format

In the early days, fulldome experiences were cobbled together using a range of technologies old and new. NSC Creative’s first show – Big, a documentary exploring the vast scale of the universe – employed a traditional opto-mechanical star ball for those all-important dots, but also featured cross-dissolving panoramic photography, computer animation, and even some stop-motion, courtesy of the team’s ex-Aardman staff.

“We had this crazy menagerie of different analogue technologies, plus a bit of early digital,” Mowbray recalled. “We had a partial dome system with three enormous CRT projectors running NTSC video, plus two banks of all-sky 35mm film projectors. We had a laser; we had lighting effects. It was a real hodge-podge.”

Fisheye image projected on to fulldome

Fulldome shows generally use fisheye images projected on to a hemispherical screen.

These days, the jumble of equipment has given way to a streamlined bank of six digital projectors. They’re fed a single 4K-diameter fisheye image covering 360° × 180°, rendered as a circle on a square frame and then sliced into segments, one for each projector. Each segment is projected on to the dome in perfect alignment with its neighbour.
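
If you’re wondering what that slicing looks like in practice, here’s a minimal sketch in Python. It assumes six equal pie wedges around the fisheye circle with a small overlap band where neighbouring slices cross-fade into each other – a real installation calibrates its slice geometry and blend ramps to the physical projectors, so treat this purely as an illustration.

    import numpy as np

    def wedge_masks(size, n_projectors=6, overlap_deg=5.0):
        """Cut a square dome-master frame into soft-edged pie wedges, one per projector.

        The equal-wedge layout and the overlap width are illustrative assumptions;
        real systems calibrate slice geometry and blend ramps to the installed hardware.
        """
        ys, xs = np.mgrid[0:size, 0:size]
        u = 2.0 * xs / (size - 1) - 1.0            # -1..1 across the frame
        v = 2.0 * ys / (size - 1) - 1.0
        inside = u * u + v * v <= 1.0              # the fisheye circle itself
        azimuth = np.degrees(np.arctan2(v, u)) % 360.0

        wedge = 360.0 / n_projectors
        masks = []
        for i in range(n_projectors):
            centre = i * wedge
            # Angular distance of each pixel from this wedge's centre line (0..180 degrees).
            delta = np.abs((azimuth - centre + 180.0) % 360.0 - 180.0)
            # 1.0 well inside the wedge, ramping to 0.0 across the overlap band;
            # neighbouring ramps sum to 1.0, so adjacent slices cross-fade at the seam.
            ramp = np.clip((wedge / 2.0 + overlap_deg - delta) / (2.0 * overlap_deg), 0.0, 1.0)
            masks.append(ramp * inside)
        return masks    # multiply each mask into the frame to get that projector's feed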

“We use software edge-blending, as well as hardware blends – that means physical combs around the edge of the projector lens,” explained Mowbray. “There’s a computer dedicated to each projector, each of which plays its slice of the pie as a high resolution MPEG.

“Other people play back uncompressed footage from one computer, off a fast SSD striped RAID with a 6-head graphics card. 4K digital cinema projectors are also quite common, because you get more pixels with fewer projectors. With that you have two projectors facing each other; each has a half-fisheye lens with a blend across the middle. 4K is the current accepted standard, but we’re starting to see 8K systems using six or more 4K projectors. We’ve been doing research to work out how far you need to go before you can’t actually perceive any difference in the resolution.”

Aaron Bradbury added: “Overall, we think 16K is a happy medium. You capture most of the audience at retinal resolution, and the people sat near the edge of the dome are still getting a reasonable experience. The image is distorted for them, but that’s no different to being sat to the side of the cinema when you’re watching a feature film.”
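
As a rough back-of-envelope check on those figures – assuming visual acuity of around one arcminute and a hemispherical dome, which are my numbers rather than NSC Creative’s – the arithmetic runs something like this:

    ACUITY_DEG = 1.0 / 60.0   # ~1 arcminute, a common estimate of foveal acuity (assumption)

    def pixels_for_centre_seat():
        """Dome-master width needed for a viewer at the exact centre of the dome.

        From the centre, every point on a hemispherical screen is one radius away
        and the screen spans 180 degrees, so the requirement is simply 180 / acuity.
        """
        return 180.0 / ACUITY_DEG                  # = 10,800 pixels across the fisheye

    def pixels_for_offset_seat(offset_fraction):
        """Width needed so the nearest patch of screen still looks 'retinal'.

        A viewer sitting offset_fraction of a radius from the centre is only
        (1 - offset_fraction) radii from the closest screen surface, so each
        pixel there subtends a proportionally larger angle.
        """
        return pixels_for_centre_seat() / (1.0 - offset_fraction)

    print(round(pixels_for_centre_seat()))         # 10800 -> an ~11K master for the centre seat
    print(round(pixels_for_offset_seat(0.33)))     # ~16100 -> ~16K covers seats up to a third off-centre

On this reckoning, an 11K master is “retinal” only for the centre seat; pushing towards 16K holds that threshold for viewers sitting up to roughly a third of the dome’s radius off-centre, which squares with Bradbury’s “happy medium”.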

Dome format comparison

Fulldome resolution compared to other common formats.

Fulldome Production

Google Lunar XPRIZE fulldome show

“Back to the Moon for Good” is a fulldome production promoting the Google Lunar XPRIZE.

NSC Creative specialises in producing quirky, family-friendly science documentaries like Astronaut (narrated by Ewan McGregor), We Are Astronomers (narrated by David Tennant) and We Are Aliens (narrated by Rupert Grint).

The in-house productions are interspersed with commissioned work for other dome centres or commercial clients. A recent commission was Back to the Moon for Good, promoting the Google Lunar XPRIZE, which offers a $30 million prize to private companies for developing a new lunar lander.

Typical turnaround for a 25-minute fulldome show is around a year. Under extreme circumstances that can be reduced to just four months. The key to success under such tight timescales is planning … and sticking to the plan.

“We’re brutally efficient,” said Bradbury. “Everything we do ends up on the dome. We do the storyboard, we do all the layout and previs, then the cameras are locked. We do that with our in-house shows, and not just the commercial ones. I think that surprises some producers.”

The business of asset creation, modelling and rendering is very much industry-standard, with Maya, 3DS Max and SoftImage being used to pump out the polygons. But producing that all-important fisheye image limits the choice of renderer.

“At the moment we’re tied to Mental Ray, because that’s the only one with the ray-traced spherical lens shader we need,” said Mowbray. “The other technique is to stitch five cameras together, but we prefer a one-click renderer.”
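
For the curious, the five-camera stitch Mowbray mentions is essentially a cube-map resample: render up, front, back, left and right views at 90° each, then work out, for every pixel of the fisheye dome master, which camera to sample and where. Here’s a rough per-pixel sketch in Python – the rig layout, the equidistant fisheye model and the sign conventions are illustrative assumptions on my part, not a description of NSC Creative’s pipeline.

    import math

    # Face axes for a hypothetical five-camera rig: up, front, back, right, left,
    # each a 90-degree pinhole camera looking along its axis.
    FACES = {
        "up":    (0.0, 0.0, 1.0),
        "front": (0.0, 1.0, 0.0),
        "back":  (0.0, -1.0, 0.0),
        "right": (1.0, 0.0, 0.0),
        "left":  (-1.0, 0.0, 0.0),
    }

    def fisheye_lookup(px, py, size):
        """Map one dome-master pixel to (face, u, v) on that face's -1..1 image plane.

        Assumes an equidistant ('angular') fisheye: distance from the frame centre
        is proportional to the zenith angle, with the rim of the circle at 90 degrees.
        """
        u = 2.0 * px / (size - 1) - 1.0
        v = 2.0 * py / (size - 1) - 1.0
        r = math.hypot(u, v)
        if r > 1.0:
            return None                            # outside the fisheye circle
        zenith = r * math.pi / 2.0                 # 0 at the dome pole, 90 degrees at the springline
        azimuth = math.atan2(v, u)
        d = (math.sin(zenith) * math.cos(azimuth),
             math.sin(zenith) * math.sin(azimuth),
             math.cos(zenith))

        # Pick the face whose axis points most nearly along this direction...
        face, axis = max(FACES.items(), key=lambda fa: sum(a * b for a, b in zip(fa[1], d)))
        depth = sum(a * b for a, b in zip(axis, d))
        # ...then project on to that face's image plane (exact signs depend on the camera rig).
        across = [c for c, a in zip(d, axis) if a == 0.0]
        return face, across[0] / depth, across[1] / depth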

Most members of the NSC Creative team are generalists, although recent growth has seen them assign more specific roles in areas like lighting, FX and environment. All face the very specific challenges that fulldome brings.

“Just pulling the data off the network drives is insane – it kills everything,” Mowbray reflected. “We’re constantly trying to figure out workarounds for the creative and technical challenges of the format, all while trying to manage an ever-growing team of CG artists. At the same time we’re developing a new cinematic language. We’re effectively growing a whole industry.”

The Goldilocks Zone - "We Are Aliens"

“We Are Aliens” presents scientific principles in a quirky, family-friendly way.

The Language of Fulldome

A debate about cinematic language might sound a bit esoteric for a commercial operation, but under the dome it’s a live issue. Rapid transitions in the 360° environment can be overwhelming, and so a typical fulldome show tends to feature long takes, favours dissolves over cuts, and pays particular attention both to where things are in space, and how fast they’re moving.

“You have to be a lot slower,” said Mowbray. “The rule of thumb is: if it feels right in flat screen, then it’s too fast for fulldome.”

Then there’s bounce. When images are projected on to the light grey curved ceiling of a typical dome environment, reflected light can wash things out. This must be taken into account early in the production process, and shots designed to mitigate its effects.

"We Are Aliens"

Many fulldome shots use vignettes to mitigate the effects of light bounce in the dome.

“For shots where the focus is towards the front of the dome, we darken off everything towards the back,” Bradbury explained. “Nobody notices there’s a big dark section behind them because they’re looking straight ahead. But you can also have shots that are more experiential, where the audience is free to explore the frame. That’s the beauty of fulldome. We don’t normally vignette those experiential shots; instead, we keep them dark overall to reduce the bounce.”
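
To make the idea concrete, here’s a minimal sketch of that kind of front-weighted falloff applied to a dome-master frame. Which edge of the fisheye corresponds to the front of the dome, and how hard the falloff bites, are illustrative assumptions – in practice both would be dialled in per shot.

    import numpy as np

    def front_weighted_vignette(frame, front_azimuth_deg=270.0, falloff=0.35):
        """Darken a square dome-master frame away from the 'front' of the dome.

        frame is an (N, N) or (N, N, 3) float array. The front direction and the
        falloff strength are illustrative assumptions, tuned per shot in practice.
        """
        size = frame.shape[0]
        ys, xs = np.mgrid[0:size, 0:size]
        u = 2.0 * xs / (size - 1) - 1.0
        v = 2.0 * ys / (size - 1) - 1.0
        azimuth = np.degrees(np.arctan2(v, u))
        # Angular distance of each pixel from the front direction, 0..180 degrees.
        delta = np.abs((azimuth - front_azimuth_deg + 180.0) % 360.0 - 180.0)
        # Full brightness towards the front, easing down towards the back of the dome.
        weight = 1.0 - falloff * np.clip((delta - 60.0) / 120.0, 0.0, 1.0)
        if frame.ndim == 3:
            weight = weight[..., None]
        return frame * weight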

Mowbray added: “The new projectors are getting better, with a good balance of brightness and contrast that mostly overcomes these issues. There are projection technologies out there derived from flight simulator tech that have extra panels to give true black. The ideal would be an OLED dome surface.”

There’s one thing that works particularly well in the dome: making people feel sick!

“In Astronaut, we have a centrifuge scene that first spins you around, then rolls the camera, inducing motion sickness,” said Bradbury. “It’s won several ‘most immersive shot’ awards. It’s a big ‘wow’ moment that leaves a lasting impression.”

Paul Mowbray stressed the need for restraint. “If someone actually feels ill when they’re watching one of our shows, then we’ve failed. It’s a really fine line. We have to use it to the advantage of our narrative, and not just for a cheap trick.”

Finally there’s the whole business of frame rates. Love it or hate it, there’s no denying that HFR is a hot topic. Especially under the dome.

“With The Hobbit, a lot of people think HFR detracts from the story,” Mowbray stated. “But we’re trying literally to transport people to other places. In the dome, that weird high frame rate thing actually works. 60fps is where we’re heading at the moment, in 8K.”

The 10,000-Frame Shot

One of NSC Creative’s recent commissions is Tomorrow Town – an architectural visualisation project for the Schindler Group. The centrepiece of Tomorrow Town is a 10,000-frame shot during which the camera moves through a futuristic cityscape.

“It’s very hard to cheat shots like this,” explained Aaron Bradbury. “With fulldome, everything’s in the shot, all the time. As you move past buildings you can’t progressively lose them behind you. That’s the biggest challenge we have: setting up scenes so that we can move constantly and seamlessly through them.”

Tomorrow Town architectural visualisation

The “Spiral Town” arcology from “Tomorrow Town”.

Mowbray elaborated: “The normal tricks you do in flat screen work to switch out scenes with a bit of judicious comp work are mostly not viable. We used Mental Ray proxies for the background city, as well as iToo Rail Clone – a parametric modelling tool perfectly suited to this kind of arch-viz work. The biggest killer was the trees. We couldn’t use cards because of the camera move so the geometry got quite dense. The total poly count was about 50 million for the big ‘Spiral Town’ model.”

The lengthy shot contains live action of actors walking around a futuristic concourse. They were shot at 4K against greenscreen with locked-off cameras, with the footage mapped on to cards in the 3D scenes.

“We were pushing the limits of the technique,” said Mowbray. “Parallax issues occurred if we pushed the angle of the camera too much. But we mixed in lots of CG doubles and controlled the area of interest in the dome, so it worked pretty well.”

Oculus Rift

It’s tempting to think of the dome as a closed environment – a specialist arena from which a show producer might never emerge. Not so. Emerging VR technologies like the Oculus Rift mean the immersive 360° experience is no longer trapped under the dome.

“You can’t have a dome at home, but now there’s an opportunity for anyone to have an amazing immersive experience,” said Mowbray. “The Oculus Rift doesn’t deliver the collective group experience, but we are looking at it as a secondary opportunity to monetise our shows and expertise.”

Oculus Rift DK2

The next generation Oculus Rift DK2 ships in July 2014.

The Oculus Rift is also great for the production process. For the first time, it’s possible for artists to preview content at their workstations in immersive stereo.

“Up to now we’ve had to use a five-camera rig – front camera, two side cameras, a back camera and an up camera,” Bradbury explained. “With that we can get a good feeling of what’s going on, but it’s never the same. We still have to render it out, go down and watch it in the dome.”

Mowbray added: “The Holy Grail for us has always been having a dome attached to your workstation, so you’re working directly in the dome space. The Oculus Rift has the potential to realise that dream. For us, it’s a game changer.”

But is a show designed for the Oculus Rift the same as a dome show? Or is it something new? If fulldome is different to cinema, is VR different again?

“The experience is not quite the same,” Bradbury observed. “In the dome, it’s a bit like you’re in a craft, being flown to different places. With the Oculus, it’s more dreamlike. We’ve done tests with it in 360° stereoscopic, and you really are floating in the world. But it’s weird in that you’re looking around, but your body isn’t there. I can already see the differences in making something for the Oculus compared to the dome. And that’s interesting.”

“I think this is an opportunity for real-time filmmaking to come of age,” Mowbray concluded. “There’s a convergence point that hasn’t quite happened yet where gaming, interactive storytelling, immersion and virtual presence collide and create something new. I think that’s why so many people are excited about the Oculus. It really could be a whole new way of telling stories. I can see a point where we’re making dome films in real-time, and we’re not pre-rendering anything.

“At the end of the day we’re technologists only because we have to be. We just want to create amazing experiences using cool stuff. The more great dome experiences people have, the more potential there is for work and the more viable it becomes as an art form. If there was a dome in every multiplex, a dome in every art gallery in every city, that would change everything.”

The Dome at Home

One member of the NSC Creative team has already managed to transport the dome experience out of the science centre and into the wider world. Specifically, his living room.

“I built my own 1.6m geodesic dome,” Aaron Bradbury confessed. “I get a cushion and crawl in on my back. I use it to test the work I do at home, and then I go back into the bedroom and work on it some more.”

While Bradbury is enthusiastic about his construction, his wife is not so sure.

“Sometimes she brings friends around hoping they’ll have a negative reaction and she can make me get rid of it. Instead, everyone loves it! But now the Oculus has come along, I’ve started to think that maybe I can throw all that cardboard away.”

Special thanks to Ruth Coalson