HDR: Part 44 - Creative Technology - 2021 Cinematography Roundup

Filmmaking is now an artform with a long history, and that means a lot of received knowledge: things like primes being better than zooms, and characterful primes best of all. Attitudes to that orthodoxy, though, are showing signs of change, and the dawn of a new year seems like a good time to examine some of the ways that change is starting to show up in production.

Zoom Lenses

Zooms suitable for cinema have only really existed for a few decades, and for much of that time they've often been considered a second-best option. To be fair, for a long while that reputation was deserved; early designs were soft, and the overuse of on-shot zooms in 1970s feature films, when the technology was new, is often viewed as clumsy. That clumsiness might even have been a bigger issue than the optical performance.

Both those problems are solvable, though. The choice to zoom on shot is just that, a choice, and modern zooms do not necessarily give away anything in terms of sheer image quality to primes - almost to a fault. Some of Angenieux and Fujifilm's output is so clean it has been described as boring, although, as we'll see below, that's perhaps more a blessing than a curse.

Either way, it's never been any secret that while prominent productions often keep a high-performance modern zoom available, discussion of those productions often centers around the accompanying set of primes, on the assumption that the zoom is rarely used. Off-the-record conversation with directors of photography, though, increasingly reveals that zooms are being used more widely than is often imagined.

Primes will always be smaller and often faster, which matters to gimbal and drone people, and yes, they might have prettier flares. It's still likely that a lot of discussion is going on about images which people imagine to have been shot on a renowned prime, but which were actually made using a zoom.

Characterful Lenses - Or Not

Digital cinema robbed cinematographers of the choices of film stock and processing, things that had long been used as a way to make a particular production unique. Different digital cinema cameras certainly have unique looks, although for some reason exploiting those looks never became as popular an approach as using lenses with something often vaguely described as character.

Phrases like "take the edge off" are commonly applied here; the softness and low contrast of old designs, which would once have been considered flaws, are often posited as an advantage. The problem is twofold. First, lens manufacturers have spent the last few decades improving their designs for a reason; modern lenses tend to be sharper, faster, more rectilinear and more consistent, especially at wider apertures.

The second problem is that historic designs were often made at a time when f/5.6 was a common shooting stop. Now that shooting wide open is commonplace, the pursuit of something that's characterful can end up capturing something that's simply, well, bad. The solution is perhaps related to the same issues underlying the increased use of zooms. Cinematographers using sharp, fast, modern glass have a greater ability to shoot under reduced artificial light or into a naturally-lit evening, with the image less likely to fall apart as night falls and the aperture opens, and that saves money.

Anyone keen to take the edge off can still filter, an approach that's less likely to create issues with inconsistent black levels and varying degrees of sharpness and flare. All this has always been true, but cinematographers have perhaps become more willing to discuss the sharp-lens-plus-filter approach.

Full Frame

Many column inches have been spent on the validity of various sensor sizes for various jobs. Given the success of The Hurt Locker (Super 16mm) and The Curious Case of Benjamin Button (2/3" digital), it's clear that big chips are not a prerequisite to high-end results.

In principle, the performance of large sensors may not even be much better than that of smaller ones. Assuming we're targeting a similar field of view and a similar depth of field, a larger sensor has to be stopped down further, and its larger, more sensitive photosites may end up spending that advantage on the smaller aperture required. Used that way, the higher sensitivity might yield noise and dynamic range similar to a smaller-sensored option; a zero-sum game, perhaps.

The practical reality is that very few productions have the lighting budget to fully offset the difference in depth of field, which would require more than double the light level, so most instead rely on the skill of the focus puller to work at much reduced depth of field. The results should be proportionally and genuinely better pictures, at the cost of more frequent focus buzzes.
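
To put rough numbers on that, here's a minimal sketch of the equivalence arithmetic, assuming nominal sensor widths of 36mm for full frame and 24.9mm for Super 35 and an example 32mm lens at f/2.8; real sensors and lenses vary, so treat the figures as illustrative only.

```python
# Rough full-frame vs Super-35 equivalence arithmetic (illustrative only).
import math

FULL_FRAME_WIDTH_MM = 36.0   # assumed nominal full-frame sensor width
SUPER_35_WIDTH_MM = 24.9     # assumed nominal Super-35 sensor width

crop_factor = FULL_FRAME_WIDTH_MM / SUPER_35_WIDTH_MM

# To match field of view, focal length scales with the crop factor;
# to also match depth of field, the f-number scales the same way.
s35_focal_mm = 32.0          # example Super-35 focal length (assumption)
s35_f_number = 2.8           # example Super-35 shooting stop (assumption)

ff_focal_mm = s35_focal_mm * crop_factor
ff_f_number = s35_f_number * crop_factor

# The light required rises with the square of the f-number ratio.
extra_light = crop_factor ** 2
extra_stops = math.log2(extra_light)

print(f"Crop factor: {crop_factor:.2f}")
print(f"Full-frame equivalent: {ff_focal_mm:.0f}mm at f/{ff_f_number:.1f}")
print(f"Extra light to match depth of field: {extra_light:.1f}x (~{extra_stops:.1f} stops)")
```

On those assumed figures, matching the Super 35 depth of field costs a little over a stop of extra light, which is exactly where the larger photosites' sensitivity advantage gets spent.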

The concern is how much it matters. With companies such as Blackmagic able to pack 12K of resolution into a Super-35mm sensor with good performance, the near future might allow us to rely on a sober consideration of what best serves the production. For the time being, though, rental houses report huge popularity for large format cameras and lenses.

Resolution

Speaking of Blackmagic's world-beating photosite count, let's consider resolution. It's hard to reliably cite the moment at which the resolution of cinema cameras became a moot point, though the combination of original camera negative, interpositive, internegative and release print was capable of significantly less than 2K. With higher pixel counts now in cellphones, it's clear resolution has been a won war for a while.

The lukewarm reception to Blackmagic's 12K release follows from that, although there was at least some misunderstanding of the camera's purpose. Blackmagic clearly did not intend to propose 12K as a distribution resolution; it was perhaps more interested in the increased color fidelity of a finer-pitched sensor, which is a valid goal. Still, the company's recent announcement of a 6K camera of similar general layout has drawn a warmer response, possibly because 6K is a good engineering compromise: it avoids demanding impractically small photosites on a Super-35mm sensor, while maintaining enough excess resolution to accommodate a significant amount of stabilization, cropping and VFX work and still ensure a good 4K finish.
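
To put approximate numbers on that compromise, here's a sketch assuming a nominal Super-35 sensor width of about 27mm and horizontal photosite counts of 12,288 and 6,144; actual sensor dimensions and counts vary by camera, so the figures are illustrative.

```python
# Approximate photosite pitch for 12K and 6K sensors of Super-35 width.
# The 27mm width and the exact photosite counts are assumptions for illustration.
SENSOR_WIDTH_MM = 27.0

def pitch_microns(horizontal_photosites: int) -> float:
    """Photosite pitch in microns for a given horizontal count."""
    return SENSOR_WIDTH_MM * 1000.0 / horizontal_photosites

for label, count in (("12K", 12288), ("6K", 6144)):
    print(f"{label}: {count} photosites across -> ~{pitch_microns(count):.1f} um pitch")

# A 6K original comfortably oversamples a 4K finish, leaving margin for
# stabilization, reframing and VFX plates.
print(f"6K width over a 4K finish: {6144 / 4096:.2f}x")
```

Photosites in the region of two microns are very small by cinema-camera standards; a 6K sensor at roughly twice that pitch, with a comfortable margin over a 4K deliverable, reads as the more relaxed engineering choice.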

The public debate has been cautious about the resolution race for years, although that race was only definitively won in the last few, when really enormous resolutions became not only available but affordable. There is some commonality here with the large-format situation: cameras such as the massively capable Panasonic S1H offer beyond-4K resolution and big chips at something approaching enthusiast-amateur prices.

Color Management

Resolution, though, is only half the story. The color performance of electronic (not even digital) cinema technology has been a keen target for improvement ever since the first attempts in the 1980s. Camera negative and print stocks have gamut limits; there are extremes of color they can't reproduce, but they are generally much more capable than the color video of decades past.

That's no longer quite so true. With recent improvements such as Rec. 2020 color and various HDR standards making home TVs capable of huge ranges of color and brightness, our electronic distribution standards are probably more capable than film ever was. The problem, as evidenced by endless online question-and-answer sessions, is that the race to innovate has led to a proliferation of standards so vast that even experts find themselves thumbing through the literature on a film set, and even the most basic terminology is often subject to heated discussion.

Cameras and displays usually understand several varieties of color and brightness encoding, and the software we use to handle this material in post production often introduces another set of opportunities for things to be wrong. Fundamental concepts such as "scene referred," where the values recorded in the file are directly related to the light which entered the camera's lens, and "display referred," where those values are manipulated for display on a particular type of monitor, are widely misunderstood. Color and brightness terms are confused with compression or subsampling terms, and even where terminology is used correctly, documentation can be sparse.
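
As a small illustration of that scene-referred/display-referred distinction, here's a sketch that maps scene-linear values through the standard Rec. 709 OETF as a display-oriented encoding; real pipelines add camera log curves, LUTs and display EOTFs on top, so this is only the basic idea.

```python
# Scene-referred linear light mapped to a display-oriented signal using the
# Rec. 709 OETF (ITU-R BT.709). Real workflows are considerably more involved.

def rec709_oetf(scene_linear: float) -> float:
    """Map scene-referred linear light (0-1) to a Rec. 709 non-linear signal."""
    if scene_linear < 0.018:
        return 4.5 * scene_linear
    return 1.099 * scene_linear ** 0.45 - 0.099

# 18% grey, a common scene-referred reference, lands at roughly 0.41 of the
# encoded range once the non-linear curve is applied.
for linear in (0.0, 0.01, 0.18, 0.5, 1.0):
    print(f"scene-referred {linear:.2f} -> encoded {rec709_oetf(linear):.3f}")
```

The point is simply that the numbers stored in a scene-referred file and the numbers sent to a monitor mean different things, and confusing the two is where many of those on-set arguments begin.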

It's a complex field, and the problems are not the fault of any one entity, but discussion in the last year or so has often involved a realization of just how out of hand the explosion of standards has become. Happily, once a workflow has been designed for a particular job that workflow tends to remain valid for the duration of that job.

The Future

In the end, all of the things we've discussed here, from lenses to color encoding, exist in the context of huge capability facilitated by enormous recent innovation in electronics, sensor and optical design. We might reasonably hope that the early twenty-first century will turn out to have much in common with the early twentieth, when motion picture technology was in its infancy and new technologies proliferated wildly. Perhaps now, as then, the technology will settle down to just a few common approaches.

In the meantime, we work in the knowledge that the things we do now may well influence the next near-century's worth of production technology, and as such, we should tread lightly.
