Virtual production is changing the way high-end TV and films are made, with actors directed on set within real-time computer-generated environments, but how is it likely to evolve? Michael Burns reports

Modern virtual production (VP) encompasses a wide field in which the physical mixes with the virtual, from virtual scouting, techvis and remote multi-user collaboration to in-camera visual effects (ICVFX) and ‘final pixels’ rendered with Unreal Engine or other real-time engines. Fully enclosed spaces lined with live LED walls are becoming increasingly common for virtual production, but they are not the only method.

“Large, 360° LED Volumes are certainly in vogue and talked about a lot these days but we still use projection, or smaller LED walls, or even green screen when appropriate,” says Zach Alexander, co-President of Lux Machina Consulting, a NEP Virtual Studios company, which has worked on The Mandalorian and other projects. “They all have their strengths and weaknesses, it’s more of a matter of what you are trying to ultimately do. The current pairing of LED volumes and real-time render engines exists because those two technology stacks feed very nicely into each other but at the end of the day it’s about using the right tool for the job.”

Scope for innovation

ICVFX is a main focus at Final Pixel, but the global creative studio is also exploring the use of LED walls for live motion capture, driving a CG creature via cluster rendering. It recently captured an in-camera composite with interaction from a real-world actor.

“We’ve only really scratched the surface of what can be achieved with all these tools together, so further integrations and smoothing out control will undoubtedly happen quickly,” says CEO and Co-Founder Michael McKenna. “The scope for innovation now that technology has firmly landed on the film set is colossal. Typically many people are using VP to project static backgrounds – but these are game engines. We’ve been working on AI-driven background characters, live motion capture, cluster rendering and DMX driven lighting in Unreal.”
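
To give a concrete sense of what ‘DMX driven lighting’ means at the protocol level, the short Python sketch below assembles a single Art-Net DMX packet and sends it over UDP to a lighting node; the node address, universe and channel level are assumed values for illustration, not details of Final Pixel’s pipeline.

    import socket
    import struct

    def send_artnet_dmx(node_ip, universe, channels, sequence=0):
        """Send one Art-Net ArtDMX packet carrying up to 512 DMX channel levels."""
        data = bytes(channels) + bytes(512 - len(channels))    # pad to a full DMX frame
        packet = (
            b"Art-Net\x00"                          # protocol ID
            + struct.pack("<H", 0x5000)             # OpCode: ArtDMX (little-endian)
            + struct.pack(">H", 14)                 # protocol version (big-endian)
            + struct.pack("BB", sequence, 0)        # sequence and physical port
            + struct.pack("<H", universe)           # port address: sub-uni then net
            + struct.pack(">H", len(data))          # payload length (big-endian)
            + data
        )
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(packet, (node_ip, 6454))        # Art-Net's registered UDP port
        sock.close()

    # Assumed example: set the fixture patched at channel 1 of universe 0 to half intensity.
    send_artnet_dmx("192.168.1.50", universe=0, channels=[128])

On a real stage a lighting console, or Unreal’s own DMX plugin, would generate this traffic from the scene’s virtual lights, but the wire format is the same.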

1899 provides a good example of a number of present and future trends in virtual production.

The forthcoming multilingual period mystery television series for Netflix was shot on a dedicated virtual production stage, one of Europe’s largest, built by DARK BAY at Studio Babelsberg, located outside of Berlin.

According to Philipp Klausing, Executive Producer / Managing Director at DARK WAYS & DARK BAY, the virtual production studio environment was built for the show, and a collective was formed to operate it. “We are content creators, so we formed a partnership with companies who are highly qualified world-class players in their area of expertise,” says Klausing.

The facility contains an LED volume using ROE Visual LED walls, Vicon and Trackmen tracking systems, ARRI SkyPanels and an adjustable ceiling for interactive lighting, eight workstations for processing the virtual production content (commonly known as a brain bar) and a rain rig for water effects. A turntable capable of taking up to 25 tonnes was integrated to offer a revolving stage. “You can rotate physical sets on the turntable and the changeover of any shooting setup is three minutes,” he says. “I think every VP stage should have one.”

The collective includes Framestore, which provides VFX supervision and works with the production designer to develop content for the virtual stage, along with Faber AV, a leading vendor for audiovisual installation in the entertainment industry. “You need to work a lot with colours and processing inside the displays, how they blend into the foreground, how you can manipulate the foreground to blend into the background,” says Klausing. “That was Faber’s expertise.”

Also on board is ARRI, chosen not just because the production is shooting on ARRI cameras but also for its lens expertise. “Our DOP Nik Summerer and ARRI’s lens engineers customised a set of the new ARRI Alpha anamorphic lenses for 1899,” says Klausing. “They are specially designed to push the bokeh so that it falls perfectly into the LED wall; the goal is to have more of the foreground in focus, while keeping the LED wall slightly out of focus. Having more space to direct the scene inside the volume is essential on scripted drama formats.”
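
The trade-off Klausing describes is depth-of-field arithmetic: for a given focal length, aperture and focus distance, the standard thin-lens formulas say how far the zone of acceptable focus extends before the LED wall begins to soften. The Python sketch below works through one hypothetical setup; the focal length, stop and distances are assumed for illustration and are not the 1899 lens data.

    def depth_of_field(focal_mm, f_stop, focus_m, coc_mm=0.025):
        """Near/far limits of acceptable focus from the standard hyperfocal formulas."""
        f = focal_mm
        s = focus_m * 1000.0                               # work in millimetres
        hyperfocal = f * f / (f_stop * coc_mm) + f
        near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
        far = float("inf") if s >= hyperfocal else s * (hyperfocal - f) / (hyperfocal - s)
        return near / 1000.0, far / 1000.0                 # back to metres

    # Assumed example: a 75mm lens at T2.8, actor in focus at 3m, LED wall at 6m.
    near, far = depth_of_field(75, 2.8, 3.0)
    print(f"Acceptably sharp from {near:.2f}m to {far:.2f}m")   # roughly 2.9m to 3.1m, so the wall at 6m stays soft

Opening the aperture or pulling the focus plane closer shrinks that sharp zone further, which is what keeps the wall’s pixels from ever resolving in camera.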

Klausing sees virtual production as a methodology that sits in the middle of the production process. It obviously requires investment, but the key factor is getting the fullest cooperation. “You need to really work with every department to see what the synergy is, what you can take out of that department in order to give it to virtual production.”

“We’re only at the end of the very first cycle of this way of working,” says James Whitlam, Managing Director – Episodic at Framestore. “Once the investment is made in a bespoke build there’s usually a desire to continue as an ongoing studio for hire. Those studios who invest in a library of high-quality vanilla content (for example, a castle courtyard, an airport, a city street) which can then be digitally dressed relatively quickly to make it unique to each production will have an advantage over those who just dry hire the hardware.”

Game on

A key driver behind the current VP wave has been the Unreal Engine from Epic Games. “Next year is going to be very exciting for virtual production across the board, with the release of Unreal Engine 5 and new features like Lumen and Nanite, which will help to further standardise workflows and bring additional realism into any project without as much processing overhead,” says Miles Perkins, Industry Manager, Film & Television for Epic Games. “I’m also really excited about the continued growth of content creation tools like Quixel Megascans and Capturing Reality for amazing photo-real assets and environments, as well as the 3Lateral and Cubic Motion teams who are reaching new heights with the MetaHuman Creator. All of these, put into the hands of creatives, are going to enable them to tell better stories with more efficiency and faster iteration.”

Content creators have big appetites for updates. “Performance capture is an area I am obsessed with, and love combining it with virtual production – so seeing the quality of the latest OptiTrack cameras and the ability to run live mocap smoothly into Unreal is really exciting,” says McKenna. “It would be great to see more integration with the whole pipeline on-set.”

“As good as UE4 is, we still have to pre-light,” says Whitlam. “As the software develops we should see great improvements in the realism of true real-time rendering, which will give us the ability to cue interactive lighting events. This will give directors the freedom to shoot more dynamic action scenes.”

Another key player in real-time 3D content (RT3D), Unity Technologies recently announced it was to acquire the tools, pipeline, technology, and engineering talent of Weta Digital. Bay Raitt, principal of UX design at Unity, says: “[The Weta Digital] tools have all been built to have as much memory as they need, and as much compute as they need, and they’re fast. The tools [work in parallel], you can take simulations and farm them out across multiple machines and artists can work interactively.”

Raitt, who helped devise the facial animation for Gollum in The Lord of the Rings while at Weta, is keen to share some of the possibilities that arise from combining Unity’s RT3D focus with the high-fidelity asset creation pipeline from Weta.

“If you have a real-time visualisation tool that allows people on the virtual production, using [Weta’s real-time renderer] Gazebo, to see the final lighting on an iPad in front of the mocap actors while they’re shooting, it means they can start lighting on set with physical objects. And that increases not only the speed that they can produce, but the quality of what they’re getting; they’re iterating in context, and making better quality choices,” he says. “You’re seeing it the way the audience is going to see it.

“Weta is going to continue to accumulate incredible generic assets when they do a show,” continues Raitt. “There’s an asset library that we’re working on, that’s not just rigid rocks and static objects. It’s actual skeletons and rigs, and those deeper assets that can be destroyed and can be dynamic.”

Enhancements to CG asset libraries would certainly make a difference. “We need to focus more on how to make the assets available faster,” says Klausing. “How can we streamline the process, how can we unify the technologies so that everybody can just plug in. Content is a really big thing and it needs a lot of artists to create an asset for you. [As] you shift your post work into prep, you have to pre-commit half a year before you go on camera to full backgrounds and colours, often when you’re simply not there yet in the creative process with scriptwriting, casting and closed financing. It’s a huge boundary to embrace this methodology, which only high-resolution asset libraries can help us overcome.”

“Pre-production schedules don’t always allow enough time for a single company to build all the assets for a big show, so we’re going to see a model develop where multiple vendors are feeding a single brain bar content,” says Whitlam. “For this to succeed we’re going to need agreed nomenclature and delivery specs, as well as a clear understanding of roles and responsibilities, titles and hierarchy between VFX, VP and traditional roles on set.

“Having a well-designed foreground set and the time in pre-light to properly integrate the virtual environment so the seams disappear between foreground and background is key to the success of a VP shoot,” adds Whitlam. “Giving production designers the tools to visualise how their physical sets will look in the volume before they build them would be of great benefit.”

“There is still a gap between final content processes to build worlds,” says Martin Taylor, Director and Co-Founder of Prox and Reverie, which has recently announced the opening of a purpose-built XR studio, The Forge, in Doncaster. “We’re exploring different options, including two-way capture into virtual worlds and collaboration across distance. Our main aim now is getting tools into the hands of directors much earlier in the process.”

“For us the bit that is missing is interoperability – being able to do everything in real-time virtual settings without leaving a creative session, to save session progress, then pick up in different locations and be able to invite in others seamlessly,” says Taylor. “The cloud needs to play a role; we need to be able to share across sessions and locations, and any way to have ‘always with you’ tech is helpful. The more cloud rendering the better, as this will help develop greater photorealism on smaller devices.”

Raitt hints this is a way in which the Unity-Weta deal could deliver, with the cloud giving access to the massive compute cluster of servers known as the Weta render wall. “You could put them into the cloud as virtual instances that anyone in the world who was within a low latency zone could spin up. You could summon the Weta art stations and render wall to do your simulation computes and your renders. It’s not just about everybody logging into the same document and editing at the same time, it’s about being able to hand off a lot of work so that if you have 1000 people working on a show, they can, in an organised way, collectively contribute to the final experience.”

Another company which has seen a huge uptake in virtual production is disguise. CTO Ed Plowman echoes Taylor’s observation: “Producers end up spending copious amounts of time constantly re-working an approach to fit their next shoot, especially as the current methods of synchronisation are not fit for this particular set of purposes. That’s where problems with signal chain latency and other limiting factors start to creep in.

“This lack of reproducibility is what we, at disguise, are aiming to resolve by synchronising the camera and camera tracking systems, LEDs, media servers and capture capabilities through our Extended Reality (xR) workflow and get them all to work together reliably, repeatedly and at scale,” he adds.

“Suitable cameras need to accept genlock, which can restrict choice, particularly for those filming on a budget,” says McKenna. “Having more automated communications between LED processing, Sync Gen and cameras could save time on set and give greater freedom to shoot at different frame rates on the fly.”
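
The frame-rate headache McKenna describes comes from a common rule of thumb on LED stages: the camera’s exposure should span a whole number of the wall’s refresh cycles, otherwise scan banding can appear. A quick check like the hypothetical sketch below (the refresh rate and camera settings are assumed values) shows why changing frame rates on the fly currently means re-verifying sync.

    def exposure_fits_refresh(fps, shutter_angle_deg, led_refresh_hz):
        """Return whether the exposure covers a whole number of LED refresh cycles."""
        exposure_s = (shutter_angle_deg / 360.0) / fps
        cycles = exposure_s * led_refresh_hz
        return abs(cycles - round(cycles)) < 1e-6, cycles

    # Assumed values: 24fps with a 180-degree shutter against a 3840Hz panel refresh.
    ok, cycles = exposure_fits_refresh(24, 180, 3840)
    print(ok, cycles)   # True, 80.0 cycles per exposure, so no banding from a mismatch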

“In the camera/capture space, an end-to-end production data pipeline is still something I think could be greatly improved, including camera tracking, sync with the display, lens information, and more standards in general in how we’re compiling important data,” says Perkins. “And some of this can also factor into latency issues, which we’ve been able to dramatically improve. We’re looking at ways to improve scalable workflows to give the larger content creators more confidence in scaling up both in complexity and shot numbers, knowing that the production data is appropriately captured and tracked. That said, I think there’s still plenty of room for more improvement to the data flow, both for production and to continue to shave frames off of latency.”
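
What such a per-frame data package might contain is easy to sketch: one record per captured frame that bundles timecode, tracked camera pose and lens state so every downstream tool reads the same fields. The schema below is purely illustrative, an assumption rather than any proposed standard.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class FrameRecord:
        """One frame of on-set production data, in an illustrative schema."""
        timecode: str                    # e.g. "01:02:03:12"
        camera_position_m: tuple         # tracked translation (x, y, z)
        camera_rotation_deg: tuple       # tracked rotation (pan, tilt, roll)
        focal_length_mm: float           # from the lens encoder
        focus_distance_m: float
        t_stop: float
        display_latency_frames: float    # measured camera-to-wall delay

    record = FrameRecord("01:02:03:12", (1.2, 1.6, 4.0), (12.0, -3.5, 0.0), 75.0, 3.0, 2.8, 0.25)
    print(json.dumps(asdict(record)))    # serialise for logging or hand-off to post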

Colour conundrum

“Every digital film camera in the world records colours slightly differently, depending on the sensor, hardware and software they use,” says Plowman. “Content creation tools – producing image and movie assets – all produce output in different colour spaces in a variety of file formats. The colours on the LED wall where the content is displayed can slightly change depending on your point of observation. Then you’re introducing physical set items as well, like people and objects in the real world, which are lit by physical lighting. Hence the need for standardisation.”

The disguise Designer software comes fully integrated with the ACES colour management pipeline, “allowing for full-colour control while unlocking the potential of all colour sources, whether it is pre-rendered, live camera or video streams from content engines”, says Plowman.

“You have to know that the content being created is colour accurate not only in the digital space but also that it’s colour accurate with the physical components of your set because the physical and virtual must be seamless,” says Perkins. “When we start getting into some of the limited colour ranges inherent in the current state of the art on LED stages, it becomes even more important. This even extends to clearly understanding signal flow, making sure you’re tracking exactly where any LUTs or colour profiles are being added or updated. At Epic we’re prioritising support for OpenColorIO in Unreal Engine, something critical for consistency from content creation all the way through to the final pixel.”
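
For readers who have not met OpenColorIO, a single colour-managed step looks something like the snippet below, which converts one pixel value between two colour spaces using the PyOpenColorIO bindings; the config path and colour-space names are assumptions standing in for whatever ACES config a production has agreed on.

    import PyOpenColorIO as OCIO

    # Load the production's agreed OCIO config (path assumed for illustration).
    config = OCIO.Config.CreateFromFile("/shows/example/aces_config.ocio")

    # Build a processor from the working space to the wall's output space.
    processor = config.getProcessor("ACES - ACEScg", "Output - Rec.709")
    cpu = processor.getDefaultCPUProcessor()

    # Convert a single mid-grey pixel; real pipelines apply this to whole image planes.
    print(cpu.applyRGB([0.18, 0.18, 0.18]))

Keeping every such conversion declared in one config, rather than baked invisibly into LUT boxes along the signal chain, is the kind of traceability Perkins is describing.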

Democratisation for the virtual nation

“We’re committed to breaking down the barrier of entry to this revolutionary workflow,” says Plowman. “We have just come out of a UK Government-funded research project that allowed us to develop our xR workflow into a more integrated, comprehensive solution for virtual production that is scalable to any production size and technical needs. This, paired with our commitment to offering free access to our software platform and our free eLearning platform covering all disguise workflows, will allow us to make virtual production even more accessible.”

The recently announced Final Pixel Academy will run courses at all levels across the virtual production workflow, including creating Unreal environments optimised for virtual production and teaching the skills needed for an on-set virtual production crew.

Perkins says Epic is already seeing results from a significant focus on training and education. “Currently most components of virtual production are readily available for anyone wanting to employ them,” he adds. “Unreal Engine is free, Quixel assets are free, and you can go into the Unreal Marketplace and buy things for very little, allowing filmmakers to pull from existing libraries to create the worlds they want to tell stories in.”

“These workflows should evolve to be accessible to as many filmmakers as possible to allow them to tell new stories and create new creative challenges for us to tackle,” says Alexander. “The hope is we are continuing to move towards a place where the time between creative ideation and execution is shorter than ever.”

 

1899 shooting at DARK BAY. Pictures copyright: Alex Forge/NETFLIX
