Creating Virtual Production Environments With Production Supervisor Don Bitters
Much of the attention enjoyed by virtual production currently goes to the spectacular stages with LED displays the size of half a dozen cinema screens. The material we put on those displays, though, can come from a number of places, and anyone putting together a virtual shoot will quickly encounter some subtleties that can make life easier – or, if mishandled, a lot harder.
Don Bitters is Lead Visual Designer and Virtual Production Supervisor at Orbital Virtual Studios, an experienced supervisor whose background in VFX has become increasingly relevant as realtime graphics technology has improved. “I oversee everything from the LED screens, to the controllers, to the systems bringing the content to the wall, as well as what that content is,” Bitters begins. “I also have a VFX company that, over the course of the last couple of years, has shifted more to realtime as it’s become more of a viable thing. We've done dozens of environments for virtual production, and we have a lot of info on best practice – things you should do and things you shouldn't do.”
There are several ways to create those environments, from shooting plates to creating fully three-dimensional virtual environments for realtime rendering. Bitters cautions that plates can be a less straightforward solution than they might seem, although “you can then take advantage of something that's a hundred per cent real. It depends on the project. We have one coming up where that's what they're using because they want to move through a lot of environments in a short time, and they need to match live action.”
Bitters cites a sequence produced on an upscale FX series as a good example. “If you're trying to recreate a specific world and not move the camera around, it works great. On Snowfall we did the exterior view from the apartment set and it was really critical… our team went out and lined up three cameras all shooting in sync to allow us to create multiple 6K angles. We could change the image, change time of day, or change where the horizon line was. But if you were to try to do something more complicated – a forest, or a city street where you want control over traffic, or control over the light – you might find yourself very restricted by plates. You do trade off a lot of the fine control that you'd get with fully virtual environments.”
The sheer bulk of data can also create a time-consuming workload. Bitters warns that while shooting can be fast, “plates can take time. It does take time to generate these very high resolution plates that will run on our systems. Alternatively we'll shoot three versions of a plate, three times of day, and I’ll go in and do variations. On Snowfall we had to create every time of day.”
An intermediate option, between shooting real locations and building virtual ones, is to scan a real location and then render the resulting 3D model in realtime. It’s an approach Bitters likes: “if you're dealing with real world locations, what you want to do is scan those locations and turn them into 3D assets, and you do that with lidar or photogrammetry. Some stuff will need to be [edited] to be functional.”
By far the most renowned applications of virtual production, though, exist in worlds that are rendered in real time as the shoot happens. While the process relies on hardware and software – things like Epic Games’ Unreal engine – that exist only because of video gaming, and while there are many transferable skills and technologies, Bitters has some pointers. “You can't just build it like you would a game. Key things to keep in mind are that once you've locked in your general design, you have to approach it knowing you're not creating a whole world. If you have great concept art and you build the whole world without considering where the stage is going to be in that world, you end up overbuilding and have performance issues, or you build too light and then you have too many places you can't go with it.”
“We've also had a few clients who had hired game designers or grabbed assets off the Epic marketplace,” Bitters goes on. “The problem is that none of that stuff is focussed for virtual production. You have game designers who are thinking ‘I have one system I'm running this on and I can have two hundred million pieces of foliage.’ They're not thinking what the end goal is, and they're designing like they are for games.”
Bitters’ practical advice begins straightforwardly. “If something’s not being seen, remove or reduce it. In Unreal, large landscapes can really tax the system and if you have, say, a ten kilometre square landscape and you're only seeing one or two, you should really just remove the rest because it's hitting your performance. The ability to synchronise the rendering over multiple systems does have some overhead, so you have to target everything to a high frame rate. All these systems have to be synced perfectly together and that does take some process time.”
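That advice translates directly into Unreal terms. As a rough illustration – not Orbital's actual tooling, which the article doesn't describe – the C++ sketch below assumes a UE5 utility that limits draw distance on anything beyond an illustrative radius of the stage, so the synchronised render nodes stop paying for scenery the wall never shows; StageOrigin and MaxVisibleRange are hypothetical names.

```cpp
// A minimal sketch, assuming Unreal Engine 5 C++ (illustrative, not the studio's code).
// Limit draw distance on scenery far from the stage so synced render nodes
// aren't spending frame time on landscape the LED wall will never show.
#include "EngineUtils.h"
#include "GameFramework/Actor.h"
#include "Components/PrimitiveComponent.h"

void LimitSceneryDrawDistance(UWorld* World, const FVector& StageOrigin, float MaxVisibleRange)
{
    for (TActorIterator<AActor> It(World); It; ++It)
    {
        AActor* Actor = *It;

        // Leave everything near the stage volume untouched.
        if (FVector::Dist(Actor->GetActorLocation(), StageOrigin) < MaxVisibleRange)
        {
            continue;
        }

        // Cull every primitive on distant actors once the view is this far away.
        TInlineComponentArray<UPrimitiveComponent*> Primitives(Actor);
        for (UPrimitiveComponent* Prim : Primitives)
        {
            Prim->SetCullDistance(MaxVisibleRange);
        }
    }
}
```

The bigger wins, as Bitters suggests, come from simply deleting the parts of the landscape the stage will never face rather than culling them at runtime.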
At the same time, the people behind the Unreal engine have worked hard to keep things moving quickly, particularly, as Bitters confirms, in the most recent version. “The benefit with Unreal 5 is they introduced Nanite technology, which means you can have billions and billions of polygons on screen. They've done tests with a billion polygons on screen. You can often use the same asset you used for VFX. There's still optimisation, the same steps you'd have to take with previous versions of Unreal, but we've done scenes with enormous amounts of Nanite meshes – a single mesh that's 2.8 million polygons!”
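Nanite is enabled per asset, so a VFX-resolution mesh can be brought across largely as-is. The editor-only sketch below is one hedged way of flipping that setting in code, assuming UE5; the wrapper function is illustrative.

```cpp
// An editor-only sketch, assuming Unreal Engine 5: enable Nanite on a high-poly
// static mesh so a full-resolution VFX asset can be used directly on the wall.
#include "Engine/StaticMesh.h"

void EnableNanite(UStaticMesh* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    Mesh->NaniteSettings.bEnabled = true;  // virtualise the full-resolution geometry
    Mesh->PostEditChange();                // rebuild render data so the change takes effect
    Mesh->MarkPackageDirty();              // flag the asset so it gets saved
}
```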
Certain special effects which are common in video gaming need, as Bitters puts it, special consideration. “Particles are usually oriented to point at camera. If you turn that off and keep them fixed you solve problems of things looking incorrect from one rendering system to another. There are also rules of how certain textures should be implemented. One designer did a cloud as flat images configured to point at camera. When the camera moved they would rotate in the sky. All that had to be turned off.”
The flexibility to adjust certain parameters can be important too, not only to create changes of lighting but also to trim the final match between the real and virtual worlds. “If you want to go in and move lights you can’t have the shadows painted on the textures. Maybe I want to change the time of day, the colour of the fog, the colour of those houses,” Bitters says. “You need to have all those controls that the creative team will want built in ahead of time, if you want to control how fast a waterfall moves or where the sun is or the colour or density of the foliage. You need a slider, or material controls so we can do live adjustments.”
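One common way to expose that kind of live control in Unreal is a Material Parameter Collection referenced by every environment material; the article doesn't say which mechanism Orbital uses, so the sketch below is a hedged example of the idea, with EnvControls, FogColor and WaterfallSpeed as illustrative names.

```cpp
// A minimal sketch, assuming Unreal Engine C++ and a Material Parameter Collection
// asset that the environment's materials already reference. Names are illustrative.
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

void ApplyLiveAdjustments(UObject* WorldContext, UMaterialParameterCollection* EnvControls,
                          const FLinearColor& FogColour, float WaterfallSpeed)
{
    // Every material referencing the collection updates in one call, which is
    // what lets an operator drag a slider during the shoot.
    UKismetMaterialLibrary::SetVectorParameterValue(WorldContext, EnvControls, TEXT("FogColor"), FogColour);
    UKismetMaterialLibrary::SetScalarParameterValue(WorldContext, EnvControls, TEXT("WaterfallSpeed"), WaterfallSpeed);
}
```

Because the change propagates to everything that references the collection, a single fader on the operator's desk can retime a waterfall or recolour the fog across the whole scene without touching individual assets.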
Treating a virtual production stage as a sophisticated evolution of back projection, though, is not what Bitters finds most exciting. “If you look at shows like Obi-Wan Kenobi and Star Trek, they're treating it as a backdrop, a very expensive Translight. But it's not a backdrop. It's a window into a virtual space. We're doing stuff where we're considering what's in the environment and how that interacts with your performance. For instance, we took a motion-captured actor controlling a goliath the size of a twenty-story building, and our live action character was scaling it. To do that we pinned the world to the giant’s arm, so when the motion capture actor moves her arm the whole world moves.”
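Under the hood, that kind of trick maps naturally onto Unreal's attachment system. The hedged sketch below assumes a C++ implementation in which the whole environment sits under one root actor that gets attached to a socket on the mocap character's skeletal mesh; actor, component and socket names are illustrative, not taken from the production.

```cpp
// A minimal sketch of the "pin the world to the giant's arm" idea, assuming Unreal C++.
#include "GameFramework/Actor.h"
#include "Components/SkeletalMeshComponent.h"

void PinWorldToGiantArm(AActor* EnvironmentRoot, USkeletalMeshComponent* GiantMesh)
{
    // Keep the environment where it is at the moment of attachment, then let the
    // mocap performer's arm drive it: when the arm moves, the whole world moves.
    EnvironmentRoot->AttachToComponent(GiantMesh,
        FAttachmentTransformRules::KeepWorldTransform,
        TEXT("lowerarm_r"));   // illustrative socket name on the giant's forearm
}
```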
“We have stuff where we have animations running at the time of production,” Bitters continues. “For a shot where the goliath hits the ground and the live action actress goes flying, we had to prebuild an animation and run it at the right moment, so we knew she'd be pulled for the stunt at the exact moment of the impact. We rehearsed a couple of times, then on the day when we went to shoot we'd hit play on that sequence and it'd hit at the moment she'd be pulled.”
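Cueing a canned animation like that is typically done with a Level Sequence that the operator plays on command. The sketch below is a minimal, hedged version assuming UE C++ and a sequence authored in Sequencer; the asset and function names are illustrative rather than taken from the shoot.

```cpp
// A minimal sketch, assuming Unreal Engine C++: play a prebuilt Level Sequence
// (for example, the goliath's ground impact) at the moment the director calls it.
#include "LevelSequence.h"
#include "LevelSequenceActor.h"
#include "LevelSequencePlayer.h"

void PlayImpactSequence(UWorld* World, ULevelSequence* ImpactSequence)
{
    ALevelSequenceActor* SequenceActor = nullptr;
    ULevelSequencePlayer* Player = ULevelSequencePlayer::CreateLevelSequencePlayer(
        World, ImpactSequence, FMovieSceneSequencePlaybackSettings(), SequenceActor);

    if (Player)
    {
        // Hit play on cue; the stunt rig pull is timed to the same mark.
        Player->Play();
    }
}
```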
Another innovation which takes virtual production beyond simply behaving as a backdrop is the potential ability to follow an actor through what Bitters describes as a limitless virtual space. “We're also doing another proof of concept – I don't know if this would apply to every virtual production stage, but I've worked out a methodology where you could walk around, follow a performer from one place to the other, look back through the doorway and there's never any physical set build. We can shoot forever and move from room to room in whatever space you want.”
The state of the art, then, is limited more by the ability of users to conceptualise than the technology itself, something that Bitters links to some of the earliest visual trickery in film production. “I look at what we have the capability of doing with virtual production almost like what Lumière was doing early on, experimenting and creating wild VFX that had never been done before. We're talking about very high-end new technology, but you can also use very old tricks like forced perspective that you couldn't do with traditional backdrops. It really opens up the door to brand new ways of doing shots and telling stories that people are only beginning to get an idea of.”
No matter how clever the techniques become, though, the draw of virtual production remains the truly seamless integration of live and virtual worlds that comes from a completely in-camera composite. “Our first AC has worked in this industry for many years,” Bitters recalls. “He was looking at this completely virtual scene, going through his scopes on the monitor and said ‘the camera can't tell it's not real.’ He was seeing the same as if he'd got that exact same scene on a real set.” Creating that result with minimum effort and maximum effectiveness demands attention to detail, then, and not just in terms of that enticing stage with a huge curve of LED display.