Virtual Production For Broadcast: Principles, Terminology & Technology

A broad look at the technology and techniques of virtual production, from the camera back through the video wall, processors, and rendering servers.

No matter how successful the big effects movies of the last few decades have been, almost nobody becomes involved in the film industry because of a desire to spend hours in a room with bright green walls. That might be why virtual production has become so popular; it might also be that it has contributed so visibly to creatively and financially successful shows. In the end, the draw is that it allows people to shoot much as they always have, while capturing in-camera results that previously took months of visual effects work to create, limited only by imagination – although there are a few practicalities to bear in mind.

Beginnings And Fundamentals

Most new technologies change rapidly soon after introduction, and virtual production is fairly new. Properly displaying an image on a video wall in such a way that it looks right to the camera involves a lot of different disciplines. The camera’s position and lens configuration must be tracked, along with any geometric distortion created by that lens. The simulated environment must be prepared, which might mean a matte painting or a fully three-dimensional virtual world with all the considerations of art direction and lighting common to CGI. That environment must be rendered, colour-corrected, and displayed with geometric corrections based on where the camera is and where it’s looking.
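To make that last step concrete: the geometric correction at the heart of the process is an off-axis (asymmetric) projection frustum computed each frame from the tracked camera position relative to the wall. Here is a minimal sketch in Python, following the widely used generalized perspective projection approach; the function name and NumPy usage are illustrative, not taken from any particular tracking system or engine.

```python
import numpy as np

def off_axis_frustum(cam_pos, wall_ll, wall_lr, wall_ul, near=0.1):
    """Asymmetric ('off-axis') frustum for a tracked camera facing a
    flat wall, after the generalized perspective projection method.
    cam_pos: tracked camera position in world space.
    wall_ll, wall_lr, wall_ul: lower-left, lower-right and upper-left
    corners of the wall section, in the same space."""
    # Orthonormal basis of the wall plane
    vr = wall_lr - wall_ll                # wall "right" direction
    vu = wall_ul - wall_ll                # wall "up" direction
    vr = vr / np.linalg.norm(vr)
    vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu)                 # wall normal, toward camera
    # Vectors from the camera to each corner
    va = wall_ll - cam_pos
    vb = wall_lr - cam_pos
    vc = wall_ul - cam_pos
    d = -np.dot(va, vn)                   # camera-to-wall distance
    # Frustum extents projected onto the near plane
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top
```

When the camera sits dead centre in front of the wall the frustum comes out symmetric; as the camera moves off-axis, the extents skew, which is exactly the correction that keeps the rendered background looking right through the lens. Real lens distortion would be applied on top of this.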

The mechanical details may be handled by the facility rather than the production, and in many ways the workload is not dissimilar to conventional visual effects – it simply happens before shooting rather than after it, which greatly changes the planning process. From the camera crew’s perspective, it’s reasonable to think of virtual production as an evolution of back projection. That’s a technique which goes back at least as far as synchronised sound on film, because the synchronisation technologies which made The Jazz Singer sing also made it possible to synchronise a projector and a camera.

Since then, some spectacular-looking films have used back projection. Kubrick’s 2001: A Space Odyssey is a prime example. The back projection effects of Aliens’ dropship crash and elevator ride through a huge sci-fi structure mostly hold up to this day. When the Wachowskis chose to use back projection to simulate car journeys through The Matrix’s green-tinted faux cityscape, the intention was to create a sense of faint unreality, cluing the audience in to the fact that the characters lived in a faked world. In the end, the effect is so well done that very few people noticed.

Back Projection Becomes Virtual Production

More recent examples of projection cross over with virtual production. Oblivion used craftily front-projected video to create a backdrop around a science-fiction habitat atop an impossible spire. It also allowed that projection to light the scene, something that had only become possible in the modern world of cameras with huge sensitivity (the production was shot on the Sony F65). LED walls are much brighter than most projections, of course, but the benefits are similar.

Oblivion’s set was a green screen compositor’s nightmare, filled with gauzy, transparent textiles and specular reflections. The benefits of working with a backdrop that’s actually visible to the camera can hardly be overstated. Things which would usually be anathema to green screen compositing become desirable ways to sell the effect. Smoke, mist, hair, water, and transparent objects integrate perfectly. Reflections – with some caveats – are effortless. The current interest in vintage lenses for their softness, aberration and distortion can make green screen hard work, but all of those things actively help a virtual production by blending the real and the not-so-real.

An LED wall has more sheer power than almost any other video display technology which has ever existed, and the light it casts on the scene is another huge boon to convincing composites. Tron: Legacy was possibly among the first to use video wall panels, albeit kept out of shot, as lighting devices. Gravity used the same technique to depict the fall of light on George Clooney and Sandra Bullock. Neither of those productions put video wall panels in shot; the technology of the time lacked the resolution to create a convincing backdrop.

As well as being powerful, an LED wall is also far less subject to the problems suffered by either front or back projection when extraneous light falls on the projection screen. A projection screen is white, making stray light hard to deal with. Meanwhile, most of the surface of an LED wall is black, so contrast is much higher and problems harder to see.
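A rough back-of-envelope calculation shows why. Reflected stray light raises the black level of the display surface, and the reflectance of a white screen is vastly higher than that of a mostly black LED wall. All of the figures below are illustrative assumptions, not measurements of any real screen or panel:

```python
import math

def effective_contrast(peak_nits, ambient_lux, reflectance):
    """On-set contrast once stray light falls on the display surface.
    Reflected ambient light lifts the black level; for a roughly
    Lambertian surface, black_nits ~= ambient_lux * reflectance / pi.
    All inputs here are illustrative assumptions."""
    black = ambient_lux * reflectance / math.pi
    return (peak_nits + black) / black

# Illustrative comparison under 100 lux of stray set light:
screen = effective_contrast(peak_nits=50,   ambient_lux=100, reflectance=0.9)
wall   = effective_contrast(peak_nits=1500, ambient_lux=100, reflectance=0.05)
```

With these assumed numbers, the projection screen is left with a contrast ratio of only a few to one, while the LED wall retains several hundred to one – which is why spill that would wash out a projected backdrop is often barely visible on a wall.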

The Challenges

That’s a good point, though, at which to acknowledge some caveats. Despite the huge contrast and high brightness of an LED wall, it is not completely immune to the fall of light; a large proportion of the surface is made not of black paint but of reflective plastic LEDs. Other limitations include frame rate: nobody is shooting material at a thousand frames per second on a virtual production stage. Certain kinds of reflective surfaces, particularly hard mirrors, can also break the illusion by making it clear that the video wall is much closer than it realistically ought to be.

And while that much-vaunted interactive lighting is enormously effective, the light cast by an LED wall doesn’t have the colour quality we would demand from production lighting tools. That means LED wall light is useful only as a way of integrating foreground and background; we must still light conventionally. Image-based lighting techniques allow us to control production lighting devices, from individual units to arrays of pixel tubes, using the content itself. The result can be convincing, animated, interactive lighting effects with excellent colour quality.
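One common approach is to sample a region of the rendered environment and use it to drive a conventional fixture. The sketch below is a hypothetical, simplified version of that idea; a real rig would add fixture calibration, proper colour space conversion, temporal smoothing, and a DMX or sACN output stage:

```python
import numpy as np

def sample_fixture_colour(env_frame, region, gain=1.0):
    """Derive one fixture's colour from the rendered environment,
    in the spirit of image-based lighting.
    env_frame: HxWx3 float array, linear light, values 0..1.
    region: (y0, y1, x0, x1) patch of the frame the fixture mimics.
    Returns 8-bit RGB suitable for e.g. a DMX channel triple.
    The averaging and gain are placeholders, not a calibrated model."""
    y0, y1, x0, x1 = region
    patch = env_frame[y0:y1, x0:x1]
    rgb = patch.reshape(-1, 3).mean(axis=0) * gain   # average colour
    rgb = np.clip(rgb, 0.0, 1.0) * 255.0             # quantise to 8-bit
    return tuple(int(round(float(c))) for c in rgb)
```

An array of pixel tubes would simply run this over many small regions, one per emitter, so that off-screen lighting animates in step with the wall content.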

All of this means that, as of early 2023, taking a new camera department onto a virtual production stage will inevitably involve spending some time in test and configuration to ensure things look right on the day, although that’s getting easier as time goes by. Mostly, the technical solution on the day will involve a conversation between the camera crew and the video wall specialists. Creating content for the video wall, meanwhile, might have been going on for weeks. As we’ll see, collaboration of exactly that kind is key to successful virtual production.

It’s a mistake to think that virtual production involves less work per se. It does involve different work, crucially, at different times; a post production workload becomes a pre production workload. Again, that’s nothing new; the model unit on Aliens worked simultaneously with the rest of the production to create the background plates that the main unit would need. That’s not a complicated requirement, but it is outside the experience of much of the film industry in 2023. The upside is that, in an ideal world, virtual production requires absolutely no special consideration in post production whatsoever.

It does, however, impose an absolutely cast-iron deadline on the people who are preparing the material which will be displayed on the video wall, whether that’s footage or a 3D world or a hybrid of the two. Similar workloads exist; they simply exist in different disciplines, and at different times.

Scale And Budget

Whether virtual production seems expensive depends very much on the intent. It’s perhaps safe to say that it can best save money for productions which were otherwise looking at a considerable spend on moving lots of people to lots of international destinations. It can make expensive things cheaper; whether it can make cheap things even cheaper is a more complicated question.

Still, it’s a mistake to assume that every instance of virtual production involves a permanent facility with a vast video wall. Upscale facilities have huge capability, but it’s just as possible to rent a small section of LED wall, along with the people and equipment needed to display images on it, and wheel it into position in the background of a specific shot. The resulting setup has limited scope, though it enjoys all the advantages of virtual production in terms of handling difficult reflective and transparent subjects, and avoids the expense of a big facility.

Stable Fundamentals

All of these things are becoming well-established as standard techniques available to film and TV productions. As with any newly emergent technology, though, advanced techniques are evolving almost daily. It’s increasingly common to involve almost any of the things done by visual effects artists: motion capture, for virtual characters that can react to a live actor in realtime; 3D object scanning and recording; and virtual camera motion techniques which create almost limitless freedom in a 3D world.

There’s still far too much variation in the way virtual productions are configured for anyone to write a point-by-point how-to, but that very flexibility is fundamental to some of the most advanced tricks virtual production can pull off. The fundamentals, though, are likely to remain stable, and those are what we’ll be discussing as the series continues.
