Creative Analysis: Part 5 - Video Displays With Markus Förderer

It’s perhaps a little unfair to blame modern visual effects people for the fact that audiences are becoming a little jaded about green screen. If we’re to conclude that there’s some sort of quality problem with VFX, we’d need to be sure we were noticing each and every use, so we know how big the sample is. Many applications of VFX in modern movies are actually comparatively simple fixes or paint-outs of inappropriate details that nobody ever notices: the boom reflected in a window, the anachronistic sign in a period piece. If we’re not aware of these VFX, they can’t really be objectionable, and we have no idea what proportion of VFX are actually a problem.


Even so, there has to be something behind the general audience exhaustion with CGI and green screen. Cutting out a piece of one picture and replacing it with another has long since become trivial; the rub is in making the new image interact convincingly with the rest of the scene. LED video displays have already seen plenty of use, perhaps most famously on Gravity, where they provided highly detailed interactive lighting for later visual effects integration, and front projection was used on Oblivion to handle a set full of reflective surfaces that would have been very difficult to shoot against green screen. It’s only recently, though, that LED video displays have achieved a combination of power and resolution sufficient to serve as both a video display and a lighting device simultaneously.

It’s a technique cinematographer Markus Förderer, ASC, BVK, admires. “In my experience people had bad experiences,” he begins. “Oh, we're going to be locked in – why don't you do it green screen? But it has its advantages: the video wall becomes the main light source and when you walk away, the shots are just done.” Förderer’s credit history boasts an enviable combination of shorts and documentaries alongside major theatrical releases including Independence Day: Resurgence, Skyscraper and this year’s Red Notice. As such, he’s very familiar with the VFX approvals process of “endless calls, iterations, something feels fake. The process takes a while, versus if you shoot it in camera you can just move on. Even for wider shots, they may need some VFX touchups, but it's simple fixes.”

A helicopter interior with the video display providing the backdrop.

Here, Förderer references Megan, a sci-fi short by director Greg Strasz in which significant action on board a helicopter might usually have demanded acres of green. Förderer, however, had other ideas. “They wrote a script for a potential Cloverfield sequel. We had a lot of sponsor companies to support it and we shot that as a proof of concept. They pitched it to Bad Robot and they liked it. I don't know what's going on at the moment, but they loved it. I used it as a platform to test out new technologies, especially the LED video walls. When producers hear about rear projection and similar technologies, they think of a 1950s back projection from a Hitchcock movie. This helped me a lot to show to producers to get a video wall, as it can be quite expensive.”

Förderer describes director Strasz as “a long-time VFX supervisor, and an FX artist for [Independence Day impresario] Roland Emmerich.” With only a two-day shoot and a limited budget, the production found a company willing to supply a Black Hawk helicopter and then assessed how best to convincingly shoot in-flight interiors without actually flying the aircraft. “The question was how can we effectively shoot that in such a short time and make it not look cheap. We had a limited budget. That's when I pitched the video wall idea, when I saw how little time we had. The shots are pretty much lit by the video wall.”

Black Hawk helicopter against the video display.

The technique naturally relies on the availability of footage to be displayed. “It really means you have to shoot plates before day one,” Förderer confirms. “During prep, people will forget about it - ‘oh we can fix this later’ - but the whole purpose is to shoot the background plates first and capture it on the day… I captured some drone plates with a tiny DJI drone at the LA river, which we almost crashed to get the illusion of crashing!” The more advanced alternative is to use real-time computer-generated backgrounds, something that’s particularly being pushed by Epic Games for its Unreal Engine. Förderer used the technique on the television project Nightflyers. “We worked with a company called AR Wall. They provided the live tracking. We didn't have to capture any plates; they'd generate it in 3D, load it into a pretty fast computer and create real-time renderings of the backgrounds.”
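The reason live tracking matters is parallax: the wall has to show what the camera would see through it, recomputed every frame from the tracked camera position. As a rough illustration of the geometry involved - not a description of AR Wall’s or Epic’s actual implementation, and with entirely made-up dimensions - the short sketch below computes the off-axis frustum a renderer might use for a flat wall:

```python
def offaxis_frustum(cam_pos, wall_w, wall_h, near=0.1):
    """Near-plane frustum bounds (left, right, bottom, top) for a flat LED
    wall centred at the origin, viewed from cam_pos = (x, y, z) in metres,
    with z the camera's distance from the wall. A bare-bones sketch of the
    idea behind camera-tracked wall rendering, not any vendor's API."""
    x, y, z = cam_pos
    scale = near / z  # project the wall edges onto the camera's near plane
    left = (-wall_w / 2 - x) * scale
    right = (wall_w / 2 - x) * scale
    bottom = (-wall_h / 2 - y) * scale
    top = (wall_h / 2 - y) * scale
    return left, right, bottom, top

# Hypothetical setup: camera 3m from a 6m x 3m wall, half a metre off-centre.
# Feeding these bounds to a renderer every frame keeps the background's
# perspective locked to the real camera as it moves.
print(offaxis_frustum((0.5, 0.0, 3.0), 6.0, 3.0))
```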

Most exteriors of the aircraft shown in Megan are entirely computer-generated, but, as Förderer says, there’s even an advantage to the LED video wall there, since the VFX facility was not saturated with dozens of individually trivial shots requiring green screen replacement. “I knew there was a lot of VFX involved,” Förderer continues. “In a regular scene you have coverage, reverse, insert, every time you see green it becomes a VFX shot, and it takes the focus away from the big VFX shots.” With the video display handling the lion’s share of the work, Förderer and the VFX team were free to focus postproduction effort on those shots that could not be created any other way.

Some shots were computer-generated, but the reduced VFX workload allowed for more time to be spent on them.

Anyone who’s walked down the Las Vegas Strip at night has seen large, high-power video displays, but they are not cheap. As Förderer says, “on the big shows it gets really expensive and takes a lot of prep work if you go really big. What we tried, because of the constrained budget, was to build it on wheels. I did roughly the math to get it as small as possible, so for each shot we could move it into the right spot. It was like a massive TV on wheels and it was really effective because we could roll it all around the helicopter. If we were shooting toward the pilots, we rolled it in front; for side angles, to the side window.”
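That “rough math” is essentially a field-of-view calculation: how wide must the wall be, at a given distance from the camera, to fill the frame? The sketch below runs the numbers with purely illustrative figures - the focal length, sensor width and distance are assumptions, not Megan’s actual setup:

```python
import math

def required_wall_width(distance_m, focal_length_mm, sensor_width_mm):
    """Approximate wall width needed to fill the frame horizontally when the
    wall sits distance_m from the camera. Illustrative numbers only, not
    figures from the production."""
    h_fov = 2 * math.atan(sensor_width_mm / (2 * focal_length_mm))
    return 2 * distance_m * math.tan(h_fov / 2)

# Hypothetical example: a 40mm lens on a roughly 30mm-wide sensor, with the
# wall rolled in 4m from the camera -- about 3m of wall covers the frame.
print(f"{required_wall_width(4.0, 40, 30):.1f} m")
```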

Megan was shot on Red Helium using Todd-AO anamorphic lenses, a combination of modern super-sharpness and classical optical effects that Förderer likes. “Traditionally you stopped those lenses down to 5.6, but I like the combination of soft lenses and a high res sensor. Some shots were wide open. If there's a lot of backlight they maybe get washed out, because the sensor itself is so sharp or has such high resolution it makes up for the soft lens. There's only a handful of sets around in LA, I tested them all. Each rental house has maybe one set, they're all different generations, different spherical elements.” Eventually, Förderer chose a set held by renowned Los Angeles rental facility Keslow Camera.

Ordinarily, a visual effects supervisor might have been cautious about the complications imposed by such characterful lenses. With the LED video display, though, there were real light sources to which the classic glass could react, and no concerns over keying at a later date. Förderer confirms that it’s a match made in heaven: “with the video wall it helps to shoot with dirtier lenses, which usually makes it harder for VFX. On green the supervisors want to talk you into shooting spherical, but then even the people look sharp and crisp. With anamorphic, all the blooming, flaring, and highlights react interactively with the light coming from the video wall.”

The LED video wall creates highly convincing interactivity between the environment and the subject.

The video wall hardware was provided by PRG, who came fully equipped with tools to scale and retime footage to suit the shot, although Förderer did his best to keep things simple. “We ended up not using most of those tools. I felt it was better to keep it naturalistic. You can add fake light sources on the wall, though - a big white circle on the wall, for instance. It was a proof of concept to do it as low-tech as possible. Usually you use a sync box to sync your shutter to the video. We didn't do any of that stuff. We shot everything at the same frame rate. With faster motion you might see the lack of synchronisation, but you can get away with a lot.”
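The reason a sync box can be skipped comes down to how many wall refresh cycles each exposure integrates: if every frame captures the same (ideally large) number of cycles, there is little left to flicker or band. A quick back-of-the-envelope check, with refresh rates chosen purely for illustration rather than taken from the production:

```python
def refresh_cycles_per_exposure(camera_fps, shutter_angle_deg, wall_refresh_hz):
    """Number of LED-wall refresh cycles captured within one camera exposure.
    Many cycles per exposure (or an exact whole number) means each frame sees
    the same wall output, so flicker and banding are unlikely even without
    hardware genlock. All values here are illustrative assumptions."""
    exposure_s = (shutter_angle_deg / 360.0) / camera_fps
    return exposure_s * wall_refresh_hz

# 24 fps with a 180-degree shutter against a high-refresh LED processor:
print(refresh_cycles_per_exposure(24, 180, 3840))  # 80.0 cycles -> safe
# The same camera settings against a slow 60 Hz panel:
print(refresh_cycles_per_exposure(24, 180, 60))    # 1.25 cycles -> banding risk
```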

There is the potential for moiré to be generated by the dense dot pattern of an LED display, and diffusion of various types is sometimes used. Still, as Förderer warns, that can be an unacceptable compromise of the contrast and power for which LED video displays are renowned, and on Megan the simple solution was to ensure the display was always slightly out of focus. The stage was generally lit with daylight from one Skypanel and Digital Sputnik DS-6 and DS-3 lights, which the LED display matched well, requiring only a small green-magenta shift on some shots.
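Keeping the wall soft works because the defocus blur, measured at the wall, only has to exceed the panel’s pixel pitch for the dot structure to be averaged away. A rough thin-lens estimate - with a hypothetical 2.6mm pitch and lens settings invented for the example, not taken from the shoot - looks like this:

```python
def defocus_blur_at_wall_mm(focal_mm, f_number, focus_dist_mm, wall_dist_mm):
    """Approximate diameter of the defocus blur projected back onto the wall
    plane, for a wall sitting behind the focus plane (thin-lens estimate).
    If this exceeds the LED pixel pitch, the dot pattern -- and most moire --
    is blurred away. All figures are illustrative assumptions."""
    aperture_mm = focal_mm / f_number
    return aperture_mm * abs(wall_dist_mm - focus_dist_mm) / focus_dist_mm

# Hypothetical case: 40mm lens at T2.8, subject in focus at 2m, wall at 4m,
# against a 2.6mm pixel-pitch panel.
blur = defocus_blur_at_wall_mm(40, 2.8, 2000, 4000)
print(f"blur at the wall ~{blur:.1f} mm vs 2.6 mm pitch")  # ~14.3 mm -> pitch dissolved
```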

Simulating wind and dust.

“It was a great experience,” Förderer concludes. “It really helped me pitch video wall to directors. Sometimes they can't really grasp what it means. People talk about The Mandalorian. That’s a massive approach and people think they can't afford it, but we've shown it can be beneficial to almost any kind of production.” The advantage, in the end, is simply the vastly improved sense of reality. “People get annoyed by all this fake-feeling stuff. The big challenge is to change the mindset of production. It's so separate in the producer’s head between production and post – whatever ends up [in the background] will be full CG or some plate unit will take care of it. But on all the projects I've shot since Megan, I've used video wall.”
