Real-Time On-Air Virtual Set Production Takes Off

Photorealistic rendering technology is changing the way broadcasters look at virtual sets. It is now possible to create scenes that are all but indistinguishable from reality, opening up powerful new possibilities for storytelling.

For live, on-air broadcast operation, photorealism is one of the toughest challenges. What really makes the difference for viewers is being unable to tell whether the images they are watching are live video or digital renders. Achieving that requires background scenes of extremely high render quality, which demands both hyper-realistic rendering technology and carefully designed sets.

Game engines such as Unity or Unreal Engine (the technology behind the popular online video game Fortnite) can render in real time, opening up some fantastic possibilities.
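
In this context "real-time" has a hard meaning: every composited frame must be delivered inside the broadcast frame interval, 20 ms at 50 fps or roughly 16.7 ms at 60 fps. Below is a minimal Python sketch of that pacing constraint; render_frame is a hypothetical stand-in for the engine's per-frame work, not any real engine API:

```python
import time

FPS = 50                    # common broadcast frame rate in Europe
FRAME_BUDGET = 1.0 / FPS    # 20 ms of wall-clock time per frame

def render_frame(frame_no: int) -> None:
    """Hypothetical stand-in for the engine's per-frame work:
    scene traversal, lighting, keying and compositing."""
    pass

next_deadline = time.perf_counter() + FRAME_BUDGET
for frame_no in range(250):                 # ~5 seconds of output
    start = time.perf_counter()
    render_frame(frame_no)
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET:
        # On air, a missed deadline is a dropped or repeated frame
        print(f"frame {frame_no}: over budget by {(elapsed - FRAME_BUDGET) * 1e3:.2f} ms")
    # Sleep off the rest of the interval so frames leave at a steady cadence
    time.sleep(max(0.0, next_deadline - time.perf_counter()))
    next_deadline += FRAME_BUDGET
```

Pre-rendered workflows can spend minutes per frame; a live virtual set gets the same budget every frame, forever, which is what makes games-engine rendering such a good fit.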

Dock10, based at MediaCityUK in Salford, for example, has invested in a virtual set system driven by Unreal Engine.

“It gave us the opportunity to get to grips with the technology and to build the team,” explains Andy Waters, Head of Studios. “It’s not just about buying the kit, sticking it in and turning it on. There’s a huge amount of knowledge needed to deploy the technology successfully.”

Waters points to the availability of inexpensive, ready-made graphics assets that can be bought online and downloaded, ready to turn a new production around quickly.

“We did a trial the other day where we could either have built a set of No. 10 Downing Street for ourselves from first principles or bought it online for $200.”

Where, until recently, broadcasters would work a virtual set system for all it was worth but end up tying up one facility for use on a single show, the flexibility of games-engine-driven systems now means any space can be turned into a virtual set in a day.

Dock10 plans to turn all of its studios into virtual-set-ready spaces in a bid to attract more entertainment productions alongside its existing news, sports and children's programming users.

3D graphics developer Brainstorm has created a unique approach, the Combined Render Engine, which pairs Unreal Engine with its own eStudio render engine. This allows its graphics toolset InfinitySet not only to show superbly rendered, realistic background scenes but also to integrate broadcast graphics elements into the final scene: data-driven 3D motion graphics, charts, lower thirds, tickers, CG and many other elements that are not part of a standard games engine.


“With the inclusion of these graphic elements, the scene can result in a highly complex composition, seamlessly integrating in real-time different render engines, virtual 3D backgrounds, real characters and synthetic graphics elements - no matter if we are working in HD, 4K or even higher resolutions,” explains Miguel Churruca, Marketing and Communications Director of Brainstorm. “By using Unreal Engine, InfinitySet users also have access to the extensive library of assets Unreal provides to its community.”
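
Conceptually, the final on-air picture is a stack of layers composited back-to-front: the engine-rendered background, the keyed talent, then the broadcast graphics on top. Here is a simplified numpy sketch of that layer stack using the standard Porter-Duff "over" operator on premultiplied-alpha RGBA layers; the layer names are illustrative, not Brainstorm's API:

```python
import numpy as np

def over(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Porter-Duff 'over': composite a premultiplied RGBA layer onto another."""
    return fg + bg * (1.0 - fg[..., 3:4])

H, W = 1080, 1920
background  = np.zeros((H, W, 4), np.float32)   # engine-rendered virtual set
background[..., 3] = 1.0                        # fully opaque backdrop
talent      = np.zeros((H, W, 4), np.float32)   # chroma-keyed presenter layer
lower_third = np.zeros((H, W, 4), np.float32)   # broadcast graphics overlay

# Composite back-to-front: set, then talent, then graphics on top
frame = over(lower_third, over(talent, background))
```

The point of a combined engine is that the layers can come from different renderers yet still land in one coherent frame, every frame, in real time.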

Augmented Reality allows real footage or live video to be mixed with virtual backgrounds or scenes, chroma-keyed talent and data-driven 3D graphics, and the talent can interact with the environment in view of the audience.
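
At the heart of placing keyed talent into a virtual scene is the chroma key itself: estimating, per pixel, how strongly the backdrop colour dominates and converting that into an alpha matte. A toy numpy sketch of a green-screen key follows, assuming linear RGB values in [0, 1]; production keyers add spill suppression, edge handling and colour-space management on top of this idea:

```python
import numpy as np

def green_key(rgb: np.ndarray, threshold: float = 0.15, softness: float = 0.1) -> np.ndarray:
    """Toy chroma key: alpha derived from how much green exceeds red and blue."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)   # > 0 where the backdrop colour dominates
    # Opaque below the threshold, soft linear rolloff to transparent above it
    return 1.0 - np.clip((greenness - threshold) / softness, 0.0, 1.0)

camera = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in camera frame
matte = green_key(camera)   # 1.0 = keep (talent), 0.0 = transparent (backdrop)
```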

Technology developer and producer The Future Group (TFG) recently claimed the first live broadcast containing real-time ray tracing and real-time facial animation. Ray tracing is an image rendering technique that can produce highly realistic CG lighting effects, but it consumes huge quantities of rendering power, which is why it has previously been confined to post-production for film and television. Using souped-up graphics cards and by blending CG with standard video frame rates, TFG animated and live-streamed an augmented reality character interviewed by a human presenter during Riot Games' League of Legends regional finals from Shanghai.
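
The expense is easy to see from the core of the technique: at minimum one ray is traced per pixel and tested against scene geometry, and every shadow, reflection or bounce multiplies the ray count. A minimal vectorized sketch, one primary ray per pixel against a single sphere with simple diffuse shading:

```python
import numpy as np

# One primary ray per pixel against one sphere: the minimal core of a ray
# tracer. Cost scales with rays x bounces x scene complexity, which is why
# real-time ray tracing had to wait for RTX-class GPU hardware.
W, H = 320, 180
xs = np.linspace(-1.0, 1.0, W)
ys = np.linspace(-H / W, H / W, H)
px, py = np.meshgrid(xs, ys)
dirs = np.stack([px, py, -np.ones_like(px)], axis=-1)   # camera looks down -z
dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

center = np.array([0.0, 0.0, -3.0])       # unit sphere 3 units from the camera
light = np.array([0.577, 0.577, 0.577])   # directional light

# Ray-sphere intersection: solve t^2 - 2t(d.c) + |c|^2 - 1 = 0 per pixel
b = -2.0 * (dirs @ center)
disc = b * b - 4.0 * (center @ center - 1.0)
hit = disc > 0.0
t = np.where(hit, (-b - np.sqrt(np.maximum(disc, 0.0))) / 2.0, 1.0)

# Lambertian shading at the hit point. This is ONE ray per pixel with no
# bounces; shadows, reflections and global illumination multiply the count.
normal = t[..., None] * dirs - center
normal /= np.linalg.norm(normal, axis=-1, keepdims=True)
image = np.where(hit, np.clip(normal @ light, 0.0, 1.0), 0.0)
```

Even this toy case traces 57,600 rays per frame; a 4K frame with several rays per pixel and multiple bounces runs into the hundreds of millions, which is what dedicated ray-tracing hardware accelerates.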

Brainstorm’s InfinitySet includes “industry-first” technologies such as 3D Presenter, which generates a real 3D object out of the talent’s live video feed, casting real shadows over the virtual set. Another feature, HandsTracking, allows presenters to trigger animations and graphics with the simple movement of their hands.

Interface for AR and Mixed Reality creative applications.
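
HandsTracking-style triggering can be pictured as testing a tracked hand position against invisible hotspots placed in the set; when the hand enters one, a graphics event fires. A hypothetical sketch of that logic follows; the hotspot names, coordinates and callback are invented for illustration and are not Brainstorm's actual API:

```python
from dataclasses import dataclass

@dataclass
class Hotspot:
    name: str
    x: float; y: float; radius: float   # normalized screen-space trigger zone

    def contains(self, hx: float, hy: float) -> bool:
        return (hx - self.x) ** 2 + (hy - self.y) ** 2 <= self.radius ** 2

hotspots = [Hotspot("next_chart", 0.8, 0.5, 0.05),
            Hotspot("play_animation", 0.2, 0.5, 0.05)]
armed = {h.name: True for h in hotspots}   # re-arm once the hand leaves

def on_hand_position(hx: float, hy: float) -> None:
    """Called each frame with normalized hand coordinates from the tracker."""
    for h in hotspots:
        inside = h.contains(hx, hy)
        if inside and armed[h.name]:
            armed[h.name] = False
            print(f"trigger: {h.name}")   # here: fire the graphics event
        elif not inside:
            armed[h.name] = True

# Example: the presenter's hand sweeps across the 'next_chart' hotspot
for x in (0.70, 0.78, 0.80, 0.82, 0.90):
    on_hand_position(x, 0.5)
```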

Its new TrackFree technology, a patented and “revolutionary approach” to virtual set production according to Churruca, is an independent camera-tracking technology that gives operators total freedom to use any tracking system, trackless or fixed cameras, or a combination of these at the same time.
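
One way to read a tracking-independent design like this is as a single pose interface in front of every camera source, so a mechanically tracked head, a fixed camera with a calibrated static pose and a "trackless" image-analysis source all deliver the same data to the renderer. A sketch of that abstraction, with invented class names rather than Brainstorm's implementation:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple       # metres, studio coordinates
    rotation: tuple       # pan/tilt/roll in degrees
    focal_length: float   # mm, for matching the virtual lens

class TrackingSource(ABC):
    @abstractmethod
    def pose(self) -> CameraPose: ...

class FixedCamera(TrackingSource):
    """Static camera: calibrated once, the pose never changes."""
    def __init__(self, pose: CameraPose): self._pose = pose
    def pose(self) -> CameraPose: return self._pose

class TrackedCamera(TrackingSource):
    """Pose streamed live from a hardware tracking system.
    'feed' is a hypothetical driver object for the tracker."""
    def __init__(self, feed): self.feed = feed
    def pose(self) -> CameraPose: return self.feed.latest()

# The renderer only ever sees TrackingSource, so tracked, trackless and
# fixed cameras can be mixed freely within the same production
def render_virtual_set(source: TrackingSource) -> None:
    p = source.pose()
    ...  # place the virtual camera at p and render the scene
```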

The latest version of InfinitySet takes advantage of recent developments in NVIDIA RTX GPU technology to deliver what Brainstorm claims is the best rendering quality available on the market.

“Using these technologies along with other advanced rendering capabilities like the Combined Render Engine with Unreal Engine, PBR materials, HDR I/O and the new 360° output, InfinitySet can create the most realistic content for virtual and mixed reality, virtual sets, real-time postproduction and film pre-visualization that can’t be distinguished from reality.”

InfinitySet also features a new module that enhances presentations by creating AR content from material produced in presentation tools, such as Microsoft PowerPoint or Adobe PDF, combined with keyed talent.

Most broadcasters now understand that virtual set technology offers huge cost savings in both broadcast and feature film production, allowing real-time virtual production with enough quality to go on air right out of the box.

Churruca says, “Our guess is that new technological advances like virtualization, on-demand cloud rendering, enhanced integration with AR and, of course, an even more developed photorealistic quality with Unreal Engine or other game engines will be the key drivers in this area for years to come.”
