In Universal Pictures’ Jurassic World, advances in genetic engineering lead to the creation of a dinosaur theme park on the remote island of Isla Nublar. By the end of the movie, the genie is out of the bottle and the prehistoric beasts are roaming free.
The sequel, Jurassic World: Fallen Kingdom — covered in our August 2018 issue, Cinefex 160 — picks up the story three years later. Meanwhile, a new virtual reality experience neatly bridges the gap between the two films by inviting dino-devotees to explore the island and save as many of the wandering critters as they can.
Launched on June 14, 2018, at over 100 Dave & Buster’s venues, Jurassic World VR Expedition whisks four people at a time to Isla Nublar. Hopping aboard a proprietary two-axis motion base rigged with HTC Vive headset technology, each brave quartet navigates through jungle, valley and coastal environments in search of seven prehistoric species — gallimimus, brachiosaurus, stegosaurus, mosasaur, pteranodon and, of course, the iconic velociraptor and T-rex. Scanning devices allow them to tag each animal as they spot it. If interactivity is not their thing, they can set their scanners aside and simply enjoy the ride.
Jurassic World VR Expedition was created by The Virtual Reality Company under license from Universal. Cinefex caught up with writer and director James Lima, visual effects supervisor Carey Villegas, animation supervisor David Schaub, and VRC co-founder and chief creative officer Robert Stromberg, to discuss the creative and technical challenges involved, and the place of the experience within the ever-evolving world of virtual reality.
With “Jurassic World VR Expedition,” The Virtual Reality Company development team of animators, visual effects artists, and technicians took a novel approach to virtual reality, combining visual effects and gameplay to achieve the level of integrity expected of cinematic-quality VR. (PRNewsfoto/The Virtual Reality Company)
CINEFEX — Millions of people are already familiar with Jurassic World, whether through watching the films or playing the videogames. What does this virtual reality experience bring to the party?
JAMES LIMA — This is a portal into the Jurassic World franchise, another way to enter into the story universe. Because virtual reality is immersive and has some interactivity, there is this level of agency that you just don’t get in watching a film. At the same time, we didn’t want to abandon the emotionality that you get from a film. We wanted to add the sense of fear, the magnificence, all the things that inspire us and make us love the medium of film.
CINEFEX — Cinematic sensibilities are all very well, but virtual reality has a rather different visual language. You don’t have cuts, or establishing shots, or closeups. How did you go about choreographing the action, and directing the attention of viewers?
JAMES LIMA — I started looking at the choreography of old musicals, because people like Gene Kelly and Fred Astaire were geniuses. They would typically do a single shot with a 1,000-foot magazine, and it was brilliant in terms of how they moved from one idea to the next. The construction of the set focused your attention into a general area, then a sound or a movement narrowed that view to a specific area. Another thing we looked at was the first virtual reality experience ever created — Disneyland. When you go to Disneyland, you transition from one environment to another, and the sounds and smells are different, and suddenly you’re in a new location.
CINEFEX — In Jurassic World VR Expedition, you’re trying to direct the attention of four people at once. Does that complicate things?
ROBERT STROMBERG — Well, you’re being driven through the environment remotely, so the points of interest approach all four people at the same time — although you still have the choice of where to look. Using those points of interest — whether it’s something visually stimulating or a sudden shock — we purposefully draw the attention of all four people to certain areas.
CINEFEX — Does the experience change depending on where you sit on the motion base — left, right, in the middle?
DAVID SCHAUB — Yes, we’ve choreographed things that way. The person seated on the far left is going to get one experience, and the person on the right is going to get a different experience. We get velociraptor moments on the left that are very profound, but at the same time we don’t cheat the people on the right because later on they get their own close-up opportunities. It’s all balanced out.
CINEFEX — Are these the same dinosaurs that are in the movies?
DAVID SCHAUB — Absolutely. The models and textures came from Industrial Light & Magic. They were massive files, of course, but we very carefully decimated and retopologized them, while keeping the resolution high — somewhere around 100,000 polygons for each dinosaur. Then we took them through a typical visual effects pipeline, rigging them, building all the controls you would need for a film, and animating them. All that got piped through to Unreal Engine, so you are in fact seeing dinosaurs rendered in real time at 90 frames per second. If you were standing up, you could literally walk around them.
CINEFEX — How do you keep up the image quality when you’re rendering all these complex assets in real time?
JAMES LIMA — We put all of the dinosaurs through a texture evaluation stage — that helped us to match what was done in the films. Also, Carey Villegas did something that surprisingly is not common in virtual reality or videogames. Like a colorist, he went through and relit all the scenes, all the dinosaurs, with kind of a director of photography’s eye. It was cool. A lot of people would come in and watch what he was doing, and they were like, “I’ve never seen this before.”
CAREY VILLEGAS — I think that was probably one of the more complicated things we had to figure out. We had ILM’s full suite of textures, yet there was no way to utilize all of those great things because you’re limited with the color correction tools you have available in game engines right now. They don’t give you the kind of control that we’re used to in a digital intermediate, say, where we can pretty much have our way with any portion of the image. We figured out how to extract the best parts of each texture, and distilled them down into the two or three or four fundamental components that we needed.
CINEFEX — How important was this process to the overall look of the experience?
JAMES LIMA — Critically important. One of the challenges that I put to the team was that I wanted this experience to happen at first light — there’s this time of day before the sun rises that’s spectacular, but it’s not your classic orange sky picturesque thing. The first iterations they gave me of what that light would look like in the game engine were, to put it mildly, perfunctory. They looked like bad matte paintings. But then Carey got in there and suddenly we started finding the magic.
ROBERT STROMBERG — I would also say that the quality of the experience is only partly about image resolution, or how much we can render as fast as possible. That’s why we brought in seasoned visual effects animators, with their decades of accumulated experience. That’s what takes it to another level.
CINEFEX — Did the high frame rate affect the way the animators worked?
DAVID SCHAUB — We started animating at 24 frames per second because that’s our world. Then we transitioned over to 30, because that divides evenly into 90. But when we started actually looking at the stuff at 90, we found it tended to soften a lot of the animation. Those hard hits that you really need to sell big gravity impacts — we had to pump those up. That was one of those interesting discoveries.
CINEFEX — Did the artists work in front of monitors in the usual way, or were they able to put on headsets and work in the virtual world?
DAVID SCHAUB — Maybe 90 percent of the time they were working as normal, on the desktop. Occasionally we would put on a headset because, while something may look wonderful on the monitor, in the virtual world you pick up on things that you just don’t see in the 2D view. If a character is slightly off balance, for example, an animator might not pick up on that. When you put on a headset and walk around, you get a different perspective and see all those flaws.
In “Jurassic World VR Expedition,” up to four people at a time put on an HTC Vive headset and board a Dave & Buster’s motion simulator, where they will be transported into what remains of Jurassic World for a five-minute interactive expedition. (PRNewsfoto/The Virtual Reality Company)
CINEFEX — How did you integrate the animation with the movements of the motion base?
CAREY VILLEGAS — You know, we work with motion simulators all the time in the visual effects world — typically six-axis bases with rotators that we can pretty much move any way we choose. In this case, however, we were limited to a simple two-axis rig with just pitch and roll, because it has to be durable enough to run thousands of times all day long.
CINEFEX — So was it simply a case of piping the animation data into the motion base and getting it to respond accordingly?
DAVID SCHAUB — Not quite! If we used data straight out of the box, people got motion sickness. It was a little distressing, actually, because everything we tried in the beginning was just instant vomitosis!
CINEFEX — Sounds messy. How did you get around that?
DAVID SCHAUB — We stripped it down to bare bones and went one layer at a time. The first pass was basically just a glide through the experience. Then it was about deciding things like how much of a bank do we need to put into the turns? If we put in a little bit too much — guess what? — we started to feel nauseous. Once we got that first gliding layer, we started introducing bumps over every single rock — that really made you feel like you were grounded. It really was an artistic choice. All the time we were looking for that sweet spot.
CAREY VILLEGAS — Even then, we couldn’t just pump our data straight into the platform and expect the visuals to be in line. Instead, we came up with a technique that reads what the platform is actually doing physically, because the inertia is different depending on how many players are on it. That really reduced the possibility for motion sickness. It was something we had to push hard for, because that wasn’t built into the design of the platform.
ROBERT STROMBERG — We also had to get over latency issues. Our inner ear is so sensitive that, if things are off even a fraction, you feel it. And we learned that people are very sensitive to where the horizon is. That always has to be where it needs to be, because we’re aware when it’s not.
CINEFEX — It sounds like motion sickness is caused by lots of different stimuli, all interacting with each other. How did you isolate each individual thing that was causing a problem?
CAREY VILLEGAS — Every time we added a new physical component, we would bring in different people as test subjects. Then we would try and figure out what it was — latency in the frame rate, or some motion that we’d introduced that wasn’t sitting well with the inner ear. By building things up with this layered approach, we were able to quickly decipher where the problem lay.
CINEFEX — How important is the interactive dimension, using your scanner to tag dinosaurs?
JAMES LIMA — The idea was to keep it simple and fun so that even Aunt Pickle could do it, but it’s not necessary to the experience. It’s the icing on the cake. Colin Trevorrow, the executive producer and writer of Jurassic World: Fallen Kingdom, rode on it and he was like, “I couldn’t give a f*** about using the scanner!” But some people are kickass with this thing, and they’ll tag themselves a lot of dinosaurs.
CINEFEX — Towards the end of 2016, we published a long article in Cinefex issue 151 looking at the state of the virtual reality industry. One observation that kept coming up during interviews was that virtual reality was a bit like the Wild West — wide frontiers, no rules, everyone trying to stake their claim. You commented, Robert, that it was like trying to build an airplane in flight! How have things moved on since then?
ROBERT STROMBERG — Well, we’ve attached the wings!
CINEFEX — That’s progress.
ROBERT STROMBERG — Right! Actually, what I have seen over time is a big shift from working with people from the gaming community to a bigger presence of visual effects artists. That’s making a big difference to the quality. There’s a bar that visual effects reached a long time ago and we want to take advantage of that. It’s a kind of embedded knowledge about what we need in order to make things look real. That’s not to say that there isn’t some sort of fusion with elements of the gaming world. We just wanted to take it to a higher level, and work with people who wanted to be bold. Our biggest hurdles have been in trying to find that perfect balance.
CINEFEX — Creative advances are one thing, but how does the business model stack up?
ROBERT STROMBERG — This experience will be the widest-released virtual reality content to date. Our first rollout is with the Dave & Buster’s franchise, going out into 116 stores. It’s a little bit of a business experiment, but I think it’s positioned in a way that’s unique and for the first time has great potential for revenue, because we’re putting it into a place where people are accustomed to spending money. We’re going after other share partners and outlets, of course, and it’ll eventually be worldwide.
CINEFEX — So, has virtual reality’s time finally come?
JAMES LIMA — You know, I really think this is as exciting a time as 1993, when Jurassic Park ushered in the digital era of filmmaking. Here we are with that same franchise and we’re doing it again with virtual reality — this might even be considered a first step into what film might one day become. We’re at that baby stage right now of a whole new technology, a whole new medium. It gives me goosebumps thinking about it.
Special thanks to Jeff Fishburn.