Creative Analysis: Part 5 - Video Displays With Markus Förderer
It’s perhaps a little unfair to blame modern visual effects people for the fact that audiences are becoming jaded about green screen. If we’re to conclude that there’s some sort of quality problem with VFX, we’d need to be sure we were noticing each and every use, so we knew how big the sample was. Many applications of VFX in modern movies are actually comparatively simple fixes or paint-outs of unwanted details that nobody ever notices – the boom reflected in a window, the anachronistic sign in a period piece. If we’re not aware of these VFX, they can’t really be objectionable, and we have no idea what proportion of VFX shots are actually a problem.
Even so, there has to be something behind the general audience exhaustion with CGI and green screen. Cutting a piece out of one picture and replacing it with another has long since become trivial; the rub is in making the new image interact convincingly with the rest of the scene. LED video displays have already seen plenty of use, perhaps most famously on Gravity, to provide very detailed interactive lighting for later visual effects integration, and front projection was used on Oblivion to suit a set with so many reflective surfaces that green screen would have been very difficult. It’s only recently, though, that LED video displays have achieved a combination of power and resolution sufficient to serve as both image source and lighting device simultaneously.
It’s a technique cinematographer Markus Förderer, ASC, BVK, admires. “In my experience people had bad experiences,” he begins. “Oh, we're going to be locked in – why don't you do it green screen? But it has its advantages: the video wall becomes the main light source and when you walk away, the shots are just done.” Förderer’s credit history boasts an enviable combination of shorts and documentaries alongside major theatrical releases including Independence Day: Resurgence, Skyscraper and this year’s Red Notice. As such, he’s very familiar with the VFX approvals process of “endless calls, iterations, something feels fake. The process takes a while, versus if you shoot it in camera you can just move on. Even for wider shots, they may need some VFX touchups, but it's simple fixes.”
Here, Förderer references Megan, a sci-fi short by director Greg Strasz in which significant action on board a helicopter might usually have demanded acres of green. Förderer, however, had other ideas. “They wrote a script for a potential Cloverfield sequel. We had a lot of sponsor companies to support it and we shot that as a proof of concept. They pitched it to Bad Robot and they liked it. I don't know what's going on at the moment, but they loved it. I used it as a platform to test out new technologies, especially the LED video walls. When producers hear about rear projection and similar technologies, they think of a 1950s back projection from a Hitchcock movie. This helped me a lot to show to producers to get a video wall, as it can be quite expensive.”
Förderer describes director Strasz as “a long-time VFX supervisor, and an FX artist for [Independence Day impresario] Roland Emmerich.” With only a two-day shoot available and a limited budget, the production had found a company willing to make a Black Hawk helicopter available, then assessed how best to shoot convincing in-flight interiors without actually flying the aircraft. “The question was how can we effectively shoot that in such a short time and make it not look cheap. We had a limited budget. That's when I pitched the video wall idea, when I saw how little time we had. The shots are pretty much lit by the video wall.”
The technique naturally relies on the availability of footage to be displayed. “It really means you have to shoot plates before day one,” Förderer confirms. “During prep, people will forget about it - ‘oh, we can fix this later’ - but the whole purpose is to shoot the background plates first and capture it on the day… I captured some drone plates with a tiny DJI drone at the LA River, which we almost crashed to get the illusion of crashing!” The more advanced alternative is to use realtime computer-generated backgrounds, an approach being pushed particularly hard by Epic Games for its Unreal Engine. Förderer used the technique on the television project Nightflyers. “We worked with a company called AR Wall. They provided the live tracking. We didn't have to capture any plates; they'd generate it in 3D, load it into a pretty fast computer and create real-time renderings of the backgrounds.”
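The article doesn’t detail AR Wall’s actual pipeline, but the core idea behind camera-tracked wall rendering is well established: the tracked camera position defines an asymmetric viewing frustum onto the fixed plane of the wall, so the rendered background keeps correct parallax as the camera moves. As a rough, purely illustrative sketch (the wall corners, camera position and function name below are assumptions, not figures from the production), the standard off-axis construction looks like this:

```python
# Minimal sketch, not AR Wall's actual pipeline: a tracked camera position
# drives an asymmetric (off-axis) projection onto a fixed, planar LED wall,
# so the rendered background keeps correct parallax from the camera's view.
# Wall corner positions and the camera position below are illustrative.
import numpy as np

def wall_projection(pa, pb, pc, pe, near, far):
    """Off-axis projection for a planar wall.
    pa, pb, pc: wall lower-left, lower-right, upper-left corners (metres).
    pe: tracked camera (eye) position. near/far: clip plane distances."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)          # wall right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)          # wall up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # wall normal
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-wall distance
    left = np.dot(vr, va) * near / d                  # asymmetric frustum edges
    right = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top = np.dot(vu, vc) * near / d
    P = np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0]])
    M = np.eye(4)                                     # rotate world into wall space
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe                                    # then shift to the tracked eye
    return P @ M @ T

# Example: a 6 m x 3 m wall with the tracked camera 4 m back, 1 m off-centre.
pa = np.array([-3.0, 0.0, 0.0])
pb = np.array([3.0, 0.0, 0.0])
pc = np.array([-3.0, 3.0, 0.0])
pe = np.array([1.0, 1.5, 4.0])
print(wall_projection(pa, pb, pc, pe, near=0.1, far=100.0))
```

Feeding a matrix like this to the renderer every frame, from whatever tracking system is in use on set, is essentially what makes the wall read as a window into a scene rather than a flat backdrop.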
Most exteriors of the aircraft shown in Megan are entirely computer-generated, but, as Förderer says, there’s even an advantage to the LED video wall there, since the VFX facility was not saturated with dozens of individually trivial shots requiring green screen replacement. “I knew there was a lot of VFX involved,” Förderer continues. “In a regular scene you have coverage, reverse, insert, every time you see green it becomes a VFX shot, and it takes the focus away from the big VFX shots.” With the video display handling the lion’s share of the work, Förderer and the VFX team were free to focus postproduction effort on those shots that could not be created any other way.
Some shots were computer-generated, but the reduced VFX workload allowed for more time to be spent on them.
Anyone who’s walked down the Las Vegas Strip at night has seen large, high-power video displays, but they are not cheap. As Förderer says, “on the big shows it gets really expensive and takes a lot of prep work if you go really big. What we tried, because of the constrained budget, was to build it on wheels. I did roughly the math to get it as small as possible, so for each shot we could move it into the right spot. It was like a massive TV on wheels, and it was really effective because we could roll it all around the helicopter. If we were shooting toward the pilots, we rolled it in front; for the side, the side window.”
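Förderer doesn’t give his figures, but the rough math is simple geometry: the wall only has to fill the camera’s field of view, plus whatever the windows reveal, at the distance it can be rolled in to. A back-of-envelope sketch, with purely illustrative numbers (the sensor width, focal length and wall distance are assumptions, not values from the production):

```python
# Back-of-envelope wall sizing; all numbers are illustrative assumptions.
# The wall must at least cover the camera's horizontal field of view at
# the distance the wheeled wall can be rolled in to.
import math

sensor_width_mm = 30.0   # roughly a full-width 8K S35 sensor (assumption)
focal_mm = 40.0          # hypothetical anamorphic focal length
squeeze = 2.0            # 2x anamorphic doubles the captured horizontal view
wall_distance_m = 2.5    # how close the wall can be rolled to the window

h_fov = 2 * math.atan((sensor_width_mm * squeeze / 2) / focal_mm)
wall_width_m = 2 * wall_distance_m * math.tan(h_fov / 2)
print(f"horizontal FOV ~{math.degrees(h_fov):.0f} deg, "
      f"wall width needed ~{wall_width_m:.1f} m")   # ~74 deg, ~3.8 m
```

Tighter shots through a single side window need less still, which is why one panel on wheels can plausibly cover a cockpit interior shot by shot.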
Megan was shot on Red Helium using Todd-AO anamorphic lenses, a combination of modern super-sharpness and classical optical effects that Förderer likes. “Traditionally you stopped those lenses down to 5.6, but I like the combination of soft lenses and a high-res sensor. Some shots were wide open. If there's a lot of backlight they maybe get washed out, but because the sensor itself is so sharp, or has such high resolution, it makes up for the soft lens. There's only a handful of sets around in LA; I tested them all. Each rental house has maybe one set, and they're all different generations, with different spherical elements.” Eventually, Förderer chose a set held by renowned Los Angeles rental facility Keslow Camera.
Ordinarily, a visual effects supervisor might have been cautious about the complications imposed by such characterful lenses. With the LED video display, though, there were real light sources to which the classic glass could react, and no concerns over keying at a later date. Förderer confirms that it’s a match made in heaven: “with the video wall it helps to shoot with dirtier lenses, which usually makes it harder for VFX. On green the supervisors want to talk you into shooting spherical, but then even the people look sharp and crisp. With anamorphic, all the blooming, flaring and highlights react interactively with the light coming from the video wall.”
The video wall hardware was provided by PRG, who arrived fully equipped with tools to scale and retime footage to suit the shot, although Förderer did his best to keep things simple. “We ended up not using most of those tools. I felt it was better to keep it naturalistic. You can add fake light sources on the wall, though - a big white circle on the wall, for instance. It was a proof of concept to do it as low-tech as possible. Usually you use a sync box to sync your shutter to the video. We didn't do any of that stuff. We shot everything at the same frame rate. With faster motion you might see the lack of synchronisation, but you can get away with a lot.”
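The reasoning behind skipping the sync box is simple arithmetic: as long as the camera’s exposure spans many complete refresh cycles of the LED panel, scan and roll artefacts average out, and it’s only at short exposures or mismatched rates that genlock really starts to matter. A rough sketch with assumed numbers (the article doesn’t state the panel’s actual refresh rate):

```python
# Rule-of-thumb check, with assumed numbers: how many complete LED refresh
# cycles fit inside one camera exposure. The more cycles, the less visible
# any scan/roll artefact from running without a sync box.
panel_refresh_hz = 3840.0   # hypothetical high-refresh LED processing rate
fps = 24.0                  # camera frame rate, matching the wall content
shutter_deg = 180.0         # 180-degree shutter -> 1/48 s exposure

exposure_s = (shutter_deg / 360.0) / fps
cycles = exposure_s * panel_refresh_hz
print(f"exposure {exposure_s * 1000:.1f} ms spans {cycles:.0f} refresh cycles")
# ~80 cycles: the partial cycle at each end of the exposure is a tiny
# fraction of the total, so frame-to-frame brightness stays stable. At very
# short shutter angles, or with mismatched rates, that fraction grows and
# banding or flicker can appear - which is when genlock becomes worthwhile.
```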
There is the potential for moiré generated by the dense dot pattern of an LED display, and diffusion of various types is sometimes used. Still, as Förderer warns, diffusion can create an unacceptable compromise in the contrast and power for which LED video displays are renowned, and on Megan the simple solution was to ensure the display was always slightly out of focus. The stage was generally lit with daylight-balanced sources - one SkyPanel plus Digital Sputnik DS-6 and DS-3 lights - which the LED display matched well, requiring only a small green-magenta shift on some shots.
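One hedged way to see why a slightly defocused wall suppresses moiré: once the lens’s defocus blur, measured at the wall itself, is larger than the panel’s pixel pitch, the LED grid is smoothed away before it can beat against the camera sensor’s own pixel grid. Illustrative numbers only, none taken from the production:

```python
# Thin-lens estimate of defocus blur at the LED wall; all values are
# illustrative assumptions. Once the blur diameter exceeds the panel's
# pixel pitch, the LED dot pattern can no longer alias (moire) on camera.
focal_mm = 75.0          # hypothetical lens
f_stop = 2.8
focus_dist_mm = 1800.0   # focused on the actor in the foreground
wall_dist_mm = 3000.0    # wall sits behind the plane of focus
pixel_pitch_mm = 2.6     # e.g. a 2.6 mm pitch LED panel (assumption)

aperture_mm = focal_mm / f_stop
# Diameter of the out-of-focus blur, measured at the wall plane:
blur_at_wall_mm = aperture_mm * abs(wall_dist_mm - focus_dist_mm) / focus_dist_mm
print(f"blur on wall ~{blur_at_wall_mm:.1f} mm vs pixel pitch {pixel_pitch_mm} mm")
# ~18 mm of blur against a 2.6 mm pitch: the grid is comfortably dissolved.
```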
“It was a great experience,” Förderer concludes. “It really helped me pitch video wall to directors. Sometimes they can't really grasp what it means. People talk about The Mandalorian. That’s a massive approach and people think they can't afford it, but we've shown it can be beneficial to almost any kind of production.” The advantage, in the end, is simply the vastly improved sense of reality. “People get annoyed by all this fake-feeling stuff. The big challenge is to change the mindset of production. It's so separate in the producer’s head between production and post – whatever ends up [in the background] will be full CG or some plate unit will take care of it. But on all the projects I've shot since Megan, I've used video wall.”