VR is burgeoning, with projects being created across AR, promos, TV spin-offs, journalism and more. Jon Creamer looks at some real-world examples

Framestore 
Our Mars

Framestore created Our Mars for McCann and aerospace company Lockheed Martin.

Our Mars is billed as the first-ever headset-free group virtual reality vehicle experience. It involves a classic American yellow school bus that transports its passengers to the surface of Mars. Passengers get onto the bus and, as it begins to navigate the streets of Washington DC, transparent 4K monitors in place of windows suddenly switch on, blocking the view of the city street and replacing it with the surface of the red planet.

The imagery of Mars tracks the movement of the bus: as the bus turns in real life, it turns on Mars; as the bus goes over a bump, it does so on the Mars surface too. Sound design adds to the experience.

McCann brought the idea to Framestore and left it to them to discover how to achieve the result. “They had an idea bubbling in their minds of what it was going to be rather than having the foggiest idea how they were going to make it happen,” says Jonathan Shipman, Framestore’s head of integrated production. One major challenge was that passengers had to have no idea what was going to happen before it did. “The challenge was: how do you build a bus that looks and acts like a bus until it’s not a bus anymore?” The transparent screens made this a possibility.

Creative director Alexander Rea and CG supervisor Theo Jones created a system that would allow real bus speed, GPS and accelerometer data to be translated into the Unreal game engine, creating a real school bus that would exist inside the realm of a video game.
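
The sketch below illustrates the general idea of folding live vehicle telemetry into a virtual vehicle’s transform. It is a minimal, hypothetical Python example: the field names, scaling and integration scheme are assumptions for illustration, not Framestore’s actual Unreal pipeline.

```python
import math

# Hypothetical telemetry sample from the bus (field names are illustrative,
# not Framestore's actual data feed).
class TelemetrySample:
    def __init__(self, speed_mps, heading_deg, accel_z, dt):
        self.speed_mps = speed_mps      # forward speed derived from GPS
        self.heading_deg = heading_deg  # compass heading in degrees
        self.accel_z = accel_z          # vertical acceleration (bumps), m/s^2
        self.dt = dt                    # seconds since the last sample


class VirtualBus:
    """Integrates telemetry into a position a game engine could consume."""

    def __init__(self):
        self.x = 0.0      # metres east on the Mars terrain
        self.y = 0.0      # metres north on the Mars terrain
        self.bump = 0.0   # small vertical offset standing in for suspension bumps

    def update(self, s):
        heading = math.radians(s.heading_deg)
        # Advance along the current heading at the measured speed.
        self.x += math.sin(heading) * s.speed_mps * s.dt
        self.y += math.cos(heading) * s.speed_mps * s.dt
        # Turn vertical acceleration into a small bump offset (arbitrary scale).
        self.bump = 0.02 * s.accel_z
        return self.x, self.y, self.bump, s.heading_deg


# Each frame, an engine would read this transform to place the virtual bus
# (and the cameras behind each window screen) on the Mars terrain.
bus = VirtualBus()
sample = TelemetrySample(speed_mps=8.0, heading_deg=90.0, accel_z=0.5, dt=0.016)
print(bus.update(sample))
```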

The Mars terrain (just the interesting bits) was modelled by Framestore from satellite photography, along with additions that will come in the future: the Curiosity rover, a space colony and Lockheed’s Orion capsule for the upcoming Mars mission.

The biggest challenge was “linking all the different pieces of technology and using things in a way they’ve never been used before,” says Shipman. The monitors had only just come off the production line and weren’t commercially available, for instance.

“This kind of experience leads you to understand what the value of VR will be. It takes the leash off and opens up what the possibilities can be of VR in the real world,” says Shipman.

Nexus
Chapita promo

Director, Eran Amir: “The main theme of Mind Enterprises’ single Chapita is time. Time is ticking around and around, repeating itself ad infinitum. The goal was to take that theme to the extreme by creating a mesmerising world of loops, clones and repetitions. The premise is very simple: the viewer starts in the middle of an empty warehouse (our canvas). The warehouse is then slowly painted in vibrant colours with our dancer. As the song develops, our protagonist conquers more and more of the space (and the viewer’s attention). In the end she is everywhere and there is no possibility of looking away. From the outset, the main guideline was to keep everything as real as possible. Although the final cut is composed of hundreds of separate clips, it had to feel like it was shot in one take.”
 
Nexus technical director and VR artist, Elliott Kajdan: “The constraints of filming this were having one day to shoot multiple passes of the same dancer going around the viewer in 360 degrees, and it had to look like it was all one three-minute shot. With this in mind I started looking for places where we could control the lighting for consistency between the takes. Once we secured the warehouse location, we realised that the action cameras typically used to capture 360 videos were struggling with the limited amount of lighting on site.

Eran’s idea was to put the viewer in the middle of a colourful procession of dancers, so it made sense to film at 50 frames per second, allowing more fluidity and staying in line with the high refresh rates of current head-mounted displays. However, tests we did with consumer-grade 360 rigs proved blurry, grainy and not good enough.

So instead of working with multi-camera rigs, I realised we didn’t need to film everything around the warehouse: although we had to keep the dancer in frame, everything else would be discarded. We rigged a single RED Epic on a nodal head, fitted with an Arri 8mm wide-angle lens. After some preparation of the footage, we could layer multiple instances of the dancer on a still panoramic image of the warehouse. That made the editing process smoother and we could skip the stitching process completely.”
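
As a rough illustration of that layering approach, the Python sketch below composites several masked passes of a dancer onto a single still panorama. It is a simplified, hypothetical example, not Nexus’s actual toolchain: real footage would first need to be reprojected from the lens projection into the panorama’s space, and the arrays here are synthetic placeholders.

```python
import numpy as np

# Toy sketch of the layering idea: composite several masked passes of the
# dancer onto one still panoramic plate, instead of stitching a multi-camera
# rig. Shapes and masks are synthetic placeholders for illustration only.

def composite(panorama, passes):
    """panorama: H x W x 3 float image (the still warehouse plate).
    passes: list of (frame, mask) pairs, where mask is H x W with 1.0
    wherever the dancer is in that pass and 0.0 elsewhere."""
    out = panorama.copy()
    for frame, mask in passes:
        m = mask[..., None]                  # broadcast mask over colour channels
        out = out * (1.0 - m) + frame * m    # keep dancer pixels, drop the rest
    return out

# Tiny synthetic example: a 4 x 8 "panorama" and two fake passes.
h, w = 4, 8
plate = np.zeros((h, w, 3))
pass1, mask1 = np.full((h, w, 3), 0.8), np.zeros((h, w))
mask1[:, 1] = 1.0                            # dancer occupies column 1 in pass 1
pass2, mask2 = np.full((h, w, 3), 0.5), np.zeros((h, w))
mask2[:, 5] = 1.0                            # dancer occupies column 5 in pass 2
result = composite(plate, [(pass1, mask1), (pass2, mask2)])
print(result[:, :, 0])
```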

Atlantic
Great Barrier Reef

To coincide with the recent BBC1 and Atlantic Productions series, David Attenborough’s Great Barrier Reef, a VR experience was also created for an exhibition at the Natural History Museum.

The television show and the VR experience followed Attenborough under the waves in a Triton 3300/3 submersible.

The main underwater camera was a RED Dragon 6K and, inside the Triton, a Sony F55. There were various GoPros fixed inside and outside the sub.

For the VR, the production had a Jaunt rig inside the submarine “so you could sit with David and hear him talking to you,” says series director Mike Davis. Outside of the submersible, the production used the Kolor Abyss spherical rig with six GoPros in an underwater housing system. “That allowed the cameramen to capture scenes with marine life swimming all around you,” says Davis. “You could also see the submersible off in the distance. The audience was able to hop in and out of the sub and really feel immersed.”

The VR also lets the audience see behind the scenes. “With the VR we embraced that,” says Davis. “Because it’s David you expect it to be a filmed experience anyway. And it’s fun to be able to see the other divers and the sub and the boat above you. It’s set dressing in a way. We deliberately haven’t spent time painting out poles and divers; they’re part of the experience. It makes you feel like you’re one of the divers.”

The Mill
The Guardian 6×9

The Guardian brought in The Mill to help create 6×9: a Virtual Experience of Solitary Confinement.

The project is a VR piece that places the viewer in a small cell to start a discussion about the use of solitary confinement and the effects it can have. “The UN states that solitary confinement is a form of torture, yet it is really hard for the general public to rally around an issue like this when criminals are involved,” says Carl Addy, creative director at The Mill. “The act of using VR was tactical, so as to generate empathy and conversation around the topic by giving members of the public a way to experience a simulation of solitary confinement. Essentially this was a well-researched piece of journalistic filmmaking, a factual documentary that has been translated and directed in VR as an immersive experience.”

The interactive team from The Mill used game engine technology to create the film.

The Mill worked from first-person accounts and documentaries as references for both the cell design and spatial audio capture. The cell was designed in Maya and then further developed in Unity. Environmental binaural audio anchored the sound to the environment, enhancing the sense of space and ensuring it continually moved with the viewer.
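
As a much-simplified illustration of world-anchored audio, the sketch below keeps a sound source fixed in the cell while the listener’s head turns, computing simple stereo gains. This is a hypothetical example: The Mill’s piece used binaural rendering in Unity, whereas this only demonstrates the anchoring idea with constant-power panning.

```python
import math

# A sound source fixed in the cell keeps its world position as the listener
# turns; only its direction relative to the head (and so the panning) changes.
# Real binaural rendering uses HRTFs; this sketch is illustrative only.

def stereo_gains(source_xy, listener_yaw_deg):
    """Return (left, right) gains for a source fixed in world space."""
    sx, sy = source_xy
    world_bearing = math.atan2(sx, sy)                     # bearing of the source
    relative = world_bearing - math.radians(listener_yaw_deg)
    pan = math.sin(relative)                               # -1 hard left, +1 hard right
    left = math.sqrt(0.5 * (1.0 - pan))
    right = math.sqrt(0.5 * (1.0 + pan))
    return left, right

# A vent humming ahead and to the prisoner's right stays put as the head turns.
for yaw in (0, 45, 90):
    print(yaw, stereo_gains((1.0, 1.0), yaw))
```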

Effects typical of long-term sensory deprivation were recreated to mimic a prisoner’s experience of being locked away for 23 hours a day in solitary confinement. “Part of us trying to build empathy was to give a user agency; the ability to make choices and interact with the experience makes you invest emotionally in the narrative and outcome,” says Addy. “VR puts you in the cell without any of the safety one gets from the detachment of a screen. This is not like watching a documentary; you are in it.”

Guardian 6×9 was pre-launched at Sundance Film Festival on the Gear VR, with the public release for Google Cardboard taking place at Tribeca Film Festival.

Made in Chelsea
Monkey, Rewind, NBC Universal

Indie Monkey, along with parent company NBC Universal’s innovation unit and VR specialists Rewind, produced two VR specials for Made in Chelsea. “We make a lot of shows that target a young demographic, so we’ve always done a lot digitally and online over the years,” says Monkey MD David Granger. “And ever since Made in Chelsea started there’s always been a massive appetite for extra stuff. And I was keen we played with VR just to find out more about it, frankly.”

Granger says he was “interested from a producing point of view. What’s it like managing talent in that situation? Will they react well? What about narrative structure? It was really a pilot project.”

Granger says he found the VR show required “a different mindset” for a producer. “Essentially it’s live and everybody’s on all the time. You can’t go back and edit that frame. It is what it is.” But the viewer will accept that. “Where you might have to be tighter for an episode, on this thing you’re permitted for it to be a bit more relaxed.”

The restrictions took some getting used to though. “You can’t move around the room. In some ways it feels like it hems you in. And it’s expensive. It’s also quite laborious. Literally stitching it together is a pretty intense process. You can’t say ‘sod it, let’s shoot something and put it up online tomorrow.’ But those techniques are changing quickly. The more instant it becomes the more exciting it’ll be.”

Jon Creamer
