Demands On Production With HDR & WCG

The adoption of HDR demands workflow adjustments that place new requirements on both people and technology, especially when multiple formats must be delivered simultaneously.

HDR images are easy to like, especially alongside better color rendering. The challenge, for the people concerned with budget and engineering, is to achieve that without replacing every person and piece of equipment involved.

Infrastructure

Happily, broadcast infrastructure designed for quality Standard Dynamic Range pictures will generally accommodate HDR images. Since television production began using digital systems, ten bits have been used to represent the three values of each pixel (whether that meant red, green and blue, or the Y, Cr and Cb of component pictures). Ten bits can represent 2¹⁰, or 1024, distinct values. It has been proposed that 12-bit systems, which can represent 4096 values, are necessary for HDR, although 12-bit working is almost never used in practice.
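As a back-of-envelope illustration of what those bit depths mean, the Python sketch below divides a hypothetical 1,000-nit display range into equal steps. The linear split is deliberately naive, since real systems distribute code values nonlinearly, but it shows the scale involved:

# Illustrative only: the code values available at each bit depth, and
# the step each would represent if 0-1000 nits were divided linearly.
# Real transfer functions are nonlinear precisely to avoid wasting
# code values where the eye cannot see the difference.
for bits in (8, 10, 12):
    levels = 2 ** bits              # distinct code values
    step = 1000 / (levels - 1)      # nits per step, naive linear split
    print(f"{bits}-bit: {levels} levels, ~{step:.2f} nits per step")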

Re-purposing existing equipment to handle HDR has meant redefining which digital values represent which levels of brightness. The specifics vary from manufacturer to manufacturer, although a lot of current equipment can optionally handle HDR using standards such as Hybrid Log-Gamma (HLG), the Perceptual Quantizer (PQ) or a manufacturer-specific approach such as Sony’s S-Log series, which was developed for field production and then applied to live broadcast work. As such, field production targeting HDR on a single camera might need little more than different menu options in the camera and the edit suite, alongside HDR-capable monitors.
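To give a flavor of what such standards define, here is a minimal Python sketch of the HLG opto-electrical transfer function from ITU-R BT.2100. The constants come from the specification; the example value at the end is simply a convenient reference point:

import math

# Constants from ITU-R BT.2100 for the HLG OETF.
A = 0.17883277
B = 1.0 - 4.0 * A                   # 0.28466892
C = 0.5 - A * math.log(4.0 * A)     # 0.55991073

def hlg_oetf(scene_light: float) -> float:
    """Map normalized scene-linear light (0..1) to an HLG signal (0..1)."""
    if scene_light <= 1.0 / 12.0:
        return math.sqrt(3.0 * scene_light)          # square-root segment
    return A * math.log(12.0 * scene_light - B) + C  # log segment for highlights

# Scene light of ~0.265 lands at the 75% signal level commonly used
# for HLG diffuse white:
print(hlg_oetf(0.265))   # ~0.75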

Bringing HDR to live production will usually involve changes to more equipment, particularly in vision engineering, whether that’s reconfiguration (of cameras and CCUs) or outright replacement (of displays). Not everything need change. Many people, from camera operators to vision mixers, do not absolutely need to view an HDR display. Vision engineers and other key technical crew certainly do, though, and it’s here where new technologies and whole new skillsets can be necessary.

At the same time, perhaps not all picture sources will be HDR. Special-purpose cameras, such as the high-speed cameras used at sporting events, might have different (or no) HDR capabilities. Sources such as graphics, remote contribution and wireless links might present the same problem. Many sources might, in principle, be capable of HDR, though practical considerations may mean that the proper configuration is hard to achieve.

All of these situations can be handled to varying degrees with conversion tools, whether those tools exist as an external device or as a feature of a vision mixer or other equipment. Up-conversion of non-HDR picture sources is unlikely to match the results of cameras with HDR capability built in, although with a little problem-solving a wide variety of configurations can be accommodated.
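To illustrate the idea, and not any particular product's method, a crude up-mapping might decode the SDR signal and park its peak white at the 75% HLG level, reusing the hlg_oetf function sketched above. The 2.4 display gamma and the 0.265 scene-light figure are assumptions drawn from common practice:

def sdr_to_hlg_direct(sdr_signal: float) -> float:
    """Crude 'direct map' up-conversion sketch: decode an SDR signal
    assuming a 2.4 display gamma, place SDR peak white at ~0.265 scene
    light (which the HLG OETF puts at the 75% signal level), then
    re-encode as HLG. Real converters add far more sophisticated tone
    and gamut handling."""
    display_light = sdr_signal ** 2.4       # normalized SDR display light
    scene_light = 0.265 * display_light     # SDR white -> HLG 75%
    return hlg_oetf(scene_light)            # from the sketch above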

Designing The Workflow

There are at least two main approaches to live broadcast HDR, regardless of the format to be distributed.

It is instinctive to generate the HDR version as a master, then down-convert the SDR output from it. It is (or at least until recently was) not uncommon, though, for vision engineers to work with reference to an SDR image. Both SDR and HDR outputs are derived from the camera’s original image data, which does not itself represent either. There are many subtleties to this approach, but ensuring that exposure and color are properly configured for the more restrictive world of SDR will generally ensure that the more capable HDR output is also reasonable.
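A toy down-conversion along those lines might invert the HLG curve (using the constants A, B and C from the sketch above), normalize diffuse white, and roll speculars off with a simple knee. The knee values here are invented purely for illustration; real converters use far more refined curves:

import math

def hlg_to_sdr(hlg_signal: float) -> float:
    """Toy HDR-to-SDR down-conversion: invert the HLG OETF, scale so
    diffuse white (HLG 75%) approaches SDR 100%, fold the remaining
    highlights in with an invented soft knee, then re-encode at a
    1/2.4 gamma for SDR. Illustrative only."""
    if hlg_signal <= 0.5:
        scene = (hlg_signal ** 2) / 3.0
    else:
        scene = (math.exp((hlg_signal - C) / A) + B) / 12.0
    sdr_linear = scene / 0.265                    # diffuse white -> 1.0
    knee_start, knee_slope = 0.8, 0.065           # invented values
    if sdr_linear > knee_start:
        sdr_linear = knee_start + (sdr_linear - knee_start) * knee_slope
    return min(sdr_linear, 1.0) ** (1.0 / 2.4)    # SDR display gamma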

Regardless, the vision engineer or a close colleague will likely monitor the HDR image to ensure proper results, so some amount of HDR test and measurement equipment will be involved. Monitors must, naturally, be more capable. Test and measurement displays such as waveform monitors must be calibrated differently, since the relationship between signal level and light output changes. Given that HDR is often associated with wide color gamut, color measurement is also likely to change. Traditional vectorscope-style displays may lack the range to meaningfully represent colorspaces such as the ITU’s Recommendation BT.2020.

Various new test and measurement devices have been developed to handle HDR and the wide color gamut standards commonly associated with it. Often, vectorscopes are replaced with devices which plot color on a CIE 1931 chart, which covers the entire range of human color vision. The resulting image is similar in character to a vectorscope trace but displays wide color gamut images far more meaningfully.
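The arithmetic behind such a display is straightforward. As a sketch, using the rounded RGB-to-XYZ coefficients published in ITU-R BT.2020, a linear RGB value can be projected to the xy chromaticity coordinate such a chart plots:

# Linear BT.2020 RGB to CIE 1931 XYZ, rounded coefficients per BT.2020.
RGB2020_TO_XYZ = (
    (0.6370, 0.1446, 0.1689),
    (0.2627, 0.6780, 0.0593),
    (0.0000, 0.0281, 1.0610),
)

def bt2020_to_xy(r: float, g: float, b: float) -> tuple[float, float]:
    """Project a linear BT.2020 RGB value to an xy chromaticity point."""
    X, Y, Z = (m[0] * r + m[1] * g + m[2] * b for m in RGB2020_TO_XYZ)
    total = X + Y + Z
    return (X / total, Y / total)

# Pure BT.2020 green plots far outside the BT.709 triangle:
print(bt2020_to_xy(0.0, 1.0, 0.0))   # ~(0.17, 0.80)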

These are technical considerations. Possibly the biggest challenges, though, are related to those parts of vision engineering which are inherently subjective.

Vision Engineering

Possibly the greatest challenge for vision engineers is the inevitable subjectivity, the need to arrive at the most reasonable, viewable representation of a scene. It has always been part of the role, but HDR changes what “reasonable” means.

In principle, HDR and Wide Color Gamut should make color and brightness handling easier. Even for Standard Dynamic Range, cameras have long had more dynamic range than the distribution system and home TVs are designed to handle, and camera features such as auto knee and various gamma modes were developed to bridge that gap. HDR imposes fewer such restrictions. However, the potential for HDR to be unpleasantly dazzling, or to present uncomfortably large swings of contrast, creates new considerations.
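A hypothetical sketch of the knee principle: scene light passes through untouched up to a knee point, after which a gentle slope folds roughly 600% of nominal white into the top of the SDR signal. The values are invented for illustration, not taken from any real camera:

def sdr_auto_knee(scene_light: float) -> float:
    """Hypothetical auto-knee: linear below the knee point, then a
    gentle slope that squeezes roughly 600% of nominal white into the
    top 15% of the SDR signal. Values invented for illustration."""
    knee_point, knee_slope = 0.85, 0.029
    if scene_light <= knee_point:
        return scene_light
    return knee_point + (scene_light - knee_point) * knee_slope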

One good example of that - based on a real event - might be an outside broadcast of a ski competition. The snow is blindingly bright, to the point where snowblindness is a real-world consideration. At the same time, pine forests are commonly seen in the background, where the dark trees and their even darker shadows create huge contrast against the snow. Meanwhile, competitors might wear clothing in very saturated dyes which could easily exceed the color gamut of a standard dynamic range broadcast mastered according to the ITU’s Recommendation BT.709.

Technologically, HDR and wide color gamut might handle all this more easily. Creatively, though, vision engineers must seek a compromise. On one hand, the improved capability of HDR should be leveraged, so that the audience (and producers) enjoy the best possible picture. On the other hand, long-established requirements for consistency remain important, and audiences must not be made uncomfortable.

The World Of Single-Camera

The work of vision engineering on a live broadcast has always been somewhat analogous to grading a recorded single-camera production. Generally, a colorist working on a drama will take a more interpretive approach, though both roles pursue viewability and consistency. Only on rare occasions have vision engineers been persuaded to aim for a creative result that’s clearly and visibly different from the public’s general expectations for conventional television.

The biggest difference between the creative decision making in drama and broadcast, though, is that color grading for drama is descended from decades of techniques created for photochemical film. When digital cinematography began to take over, cinematographers had a clear desire to bring the capabilities of the telecine suite to feature film production. So, digital cinematography, unlike live broadcast production, has always operated on the assumption of a grading step. That step has the responsibility (among other things) of taking the high dynamic range camera original, whether that means film or digital, and preparing it for a lower-dynamic-range distribution.

So, concerns around re-engineering the signal chain to accommodate HDR are less relevant to single-camera and other non-live material which will be graded. The system more or less already existed in a state which could handle HDR without enormous changes, beyond software updates and a better display. HDR also offers many of the same advantages as it does to broadcasters, potentially easing the work of packing the huge dynamic range of the real world down into a much less capable distribution format.

Even so, working in the field, where there is no vision engineer to control the camera, creates challenges of its own. Some productions will use a digital imaging technician, part of whose responsibility is somewhat analogous to the broadcast vision engineer’s role. Field production for broadcast, however, is less likely to assign a specific individual to that role, and field crews must become used to shooting for HDR results.

Problems may occur on both live and recorded material when - for instance - a less-than-ideal camera position, awkward weather or other externally-imposed factors affect the contrast of an image. In standard dynamic range, shooting an underlit presenter against a bright sky might be unattractive; ultimately, though, the system lacks the sheer power to make that bright sky dazzling. In high dynamic range, the same scene might actually be difficult to watch. In the field, crews may lack a high-grade reference display, and will certainly lack a controlled environment in which to view one, potentially making these problems hard to detect and solve.
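Where no reference display is available, even a scripted sanity check can help. This hypothetical sketch decodes PQ code values to absolute luminance using the constants from ITU-R BT.2100, then reports how much of a frame exceeds a chosen brightness; the 1,000-nit threshold is an arbitrary example:

# PQ decode constants per ITU-R BT.2100.
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ signal (0..1) to absolute luminance in nits."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def highlight_fraction(pq_pixels, threshold_nits=1000.0):
    """Fraction of pixel values brighter than an arbitrary threshold."""
    bright = sum(1 for s in pq_pixels if pq_to_nits(s) > threshold_nits)
    return bright / len(pq_pixels)

# A PQ signal of ~0.75 decodes to roughly 1000 nits:
print(pq_to_nits(0.75))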

Both At Once

Whatever the circumstances, for the foreseeable future broadcasters will often need to satisfy both HDR and SDR audiences simultaneously. Recognising the pitfalls of each makes that work easier - and some significant technologies have been developed to help.
