This 3D depth-mapping technology for broadcast passed us by at IBC this year, but the other week we had a one-to-one with Bruno Gyorgy about his virtual studio system, zLense.
Basically zLense looks like a huge matte box on the front of your camera, but inside is a box of tricks from Hungarian tech company Zinemath which promises to revolutionise the world of 3D virtual studio effects and graphics.
Gyorgy describes it as a 3D depth-mapping technology for broadcast cameras which will dramatically lower the cost of 3D effects for live and recorded TV.
zLense is a virtual production platform for film production, broadcast and gaming which provides the world’s first depth-mapping camera solution, capturing 3D data and scenery in real time and adding a 3D layer.
The zLense virtual production platform combines depth-sensing technology and image-processing in a standalone camera rig that works with most standard broadcast cameras.
The system processes spatial information and allows production teams to create 3D effects and utilise state-of-the-art CGI in live TV or pre-recorded transmissions – with no specialist studio set up.
The zLense depth-sensing technology allows for full 360-degree freedom of camera movement and gives presenters and anchormen greater freedom of performance.
Directors can combine dolly, jib arm and handheld shots as presenters move within, interact with and control the virtual environment – and, in the near future, they will be able to do so using only natural gestures and motions.
Gyorgy says the system is poised to shake up the world of virtual studios by putting affordable high-quality real-time CGI into the hands of broadcasters at a fraction of the cost of other virtual studio technologies.
The solution is quick to install, requires just a single operator, and is operable in almost any studio lighting.
“With minimal expense and no special studio modifications, local and regional TV channels can use this technology to enhance their news and weather graphics programmes – unleashing live augmented reality, interactive simulations and visualisations that make the delivery of infographics exciting, enticing and totally immersive for viewers,” says Gyorgy.
The ‘matte box’ sensor unit, which can be mounted on almost any camera rig, removes the need for external tracking devices or markers, while the platform’s built-in rendering engine cuts the cost and complexity of using visual effects in live and pre-recorded TV productions.
The zLense virtual production platform can be used alongside other pre-existing rendering engines, VR systems and tracking technologies.
The VFX real-time capabilities enabled by the zLense Virtual Production platform include:
• Volumetric effects
• Additional motion and depth blur
• Shadows and reflections to create convincing state-of-the-art visual appearances
• Dynamic relighting
• Realistic 3D distortions
• Creation of a fully interactive virtual environment with interactive physical particle simulation
• Wide shot and in-depth compositions with full body figures
• Real-time Z-map and 3D models of the picture
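The real-time Z-map is what makes much of the above possible: with a per-pixel depth value for the live image, a renderer can decide, pixel by pixel, whether the camera picture or a CG element is closer to the lens, so virtual objects can pass convincingly in front of and behind a presenter. A minimal sketch of that depth-keyed compositing step (the array names and the use of NumPy are illustrative assumptions, not zLense's actual pipeline):

```python
import numpy as np

def z_composite(cam_rgb, cam_depth, cg_rgb, cg_depth):
    """Composite a CG layer with a camera frame using per-pixel depth.

    cam_rgb, cg_rgb:     (H, W, 3) float arrays, values in [0, 1]
    cam_depth, cg_depth: (H, W) distances from the lens in metres
    Wherever the CG element is nearer than the real scene, it wins.
    """
    cg_in_front = cg_depth < cam_depth      # (H, W) boolean occlusion mask
    mask = cg_in_front[..., np.newaxis]     # broadcast mask over RGB channels
    return np.where(mask, cg_rgb, cam_rgb)

# Toy 2x2 frame: real scene (black) is 3 m away, CG layer (white) is 5 m away,
# except one CG pixel moved to 1 m, so it should occlude the real scene there.
cam_rgb = np.zeros((2, 2, 3))
cam_depth = np.full((2, 2), 3.0)
cg_rgb = np.ones((2, 2, 3))
cg_depth = np.full((2, 2), 5.0)
cg_depth[0, 0] = 1.0

out = z_composite(cam_rgb, cam_depth, cg_rgb, cg_depth)
```

In a marker-based virtual studio this per-pixel occlusion is impossible without extra hardware; a depth map from the sensor unit gives it for free, which is why wide shots with full-body figures moving through virtual scenery become practical.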