Momentum Builds Behind HDR+

Ultra HD (UHD) has gone through various iterations since it emerged from the shadow of 3D TV to become the broadcasting industry’s standard bearer for immersive or next generation TV. At first it was all about the higher resolution of 3840 x 2160, four times the pixel count of 1080p “full HD”, but high frame rate (HFR) and Wide Color Gamut (WCG) at 10 bit sampling, rather than 8 bit as before, were also in the picture.

Subsequently High Dynamic Range (HDR) was added to the mix and quickly became one of the most prominent ingredients, because in consumer tests it appeared to deliver a greater improvement in perceived picture quality than the higher resolution did. The latter is now usually referred to on its own as 4K, distinct from UHD, which embraces all the other technology improvements, including object based audio.

A point quickly picked up by a number of vendors as well as broadcasters and operators was that HDR coupled with WCG increases the bit rate only slightly, by 25% at most, whereas 4K normally quadruples it in the absence of new compression methods such as HEVC. Even with advanced coding, 4K generates about 2.5 times as many bits as full HD.
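
The arithmetic behind those figures is easy to reproduce. The back-of-the-envelope Python sketch below is illustrative only: the 25% comes from moving from 8 bit to 10 bit samples, the quadrupling from the pixel count, and it is the codec rather than the raw numbers that brings compressed 4K down to roughly 2.5 times full HD.

```python
# Back-of-the-envelope comparison of raw (uncompressed) bit budgets.
# Illustrative only: delivered bit rates depend heavily on the codec
# and content, which is how 4K ends up nearer 2.5x full HD with HEVC.

def raw_bits_per_frame(width: int, height: int, bit_depth: int) -> float:
    """Raw luma + chroma bits per frame; 4:2:0 carries 1.5 samples/pixel."""
    return width * height * 1.5 * bit_depth

full_hd_sdr = raw_bits_per_frame(1920, 1080, 8)    # 1080p, 8-bit SDR
full_hd_hdr = raw_bits_per_frame(1920, 1080, 10)   # 1080p HDR+, 10-bit
uhd_hdr     = raw_bits_per_frame(3840, 2160, 10)   # 4K HDR, 10-bit

print(f"HDR+ vs full HD (raw):   {full_hd_hdr / full_hd_sdr:.2f}x")  # 1.25x
print(f"4K HDR vs full HD (raw): {uhd_hdr / full_hd_sdr:.2f}x")      # 5.00x
```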

Another key component is the signaling used to convey HDR from camera through the delivery infrastructure to the TV or client device, the emphasis being on preserving as much contrast information as possible. The aim of HDR is to exploit the human eye’s ability to distinguish between huge contrast ranges by making blacks darker and light colors brighter on a screen in a way that looks natural. But this requires an agreed way of converting light to electrical signals and transmitting them faithfully through the ecosystem. That conversion is governed by transfer functions, referred to by various names such as the Electro Optical Transfer Function (EOTF) or the perceptual HDR transfer function. These three ingredients, HDR, WCG and 10 bit sampling, have been bundled together as HDR+, which is being positioned as “UHD Lite” without the 4K or HFR, since both of those impose a heavy bandwidth penalty.
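
To make the idea of a transfer function concrete, here is a minimal Python sketch of the PQ (Perceptual Quantizer) EOTF defined in SMPTE ST 2084, one of the two curves discussed later in this article. The constants are those published in that specification; the snippet is an illustration, not a production implementation.

```python
# Sketch of the SMPTE ST 2084 PQ EOTF: maps a normalized electrical
# code value (0.0-1.0) to absolute display luminance in cd/m^2 (nits).
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875
PEAK = 10000.0           # PQ is defined up to 10,000 nits

def pq_eotf(code: float) -> float:
    """Electrical signal -> light: the display-side half of the chain."""
    e = code ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return PEAK * y ** (1 / M1)

# Code values are spent where the eye is most sensitive to contrast,
# so half the signal range covers only the darker end of the scale.
for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f} -> {pq_eotf(v):8.2f} nits")
```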

The most recent development is agreement within the ITU, the main body for standards related to HDR, on a draft recommendation covering two options for the EOTF. This will enable distribution of HDR+ programming over existing or emerging workflows without substantial change, although HDR compliant TVs will be required to enjoy the full benefits. These standards in turn are being incorporated within Phase A, the first phase of the Ultra HD Forum’s guidelines for the end-to-end workflows involved in creating and delivering live and pre-recorded UHD content, published in April 2016.

This more or less coincided with the launch of the Ultra HD Premium logo from the UHD Alliance, the other main standards development body for UHD. While the Alliance focuses on the two ends of the delivery chain, that is the cameras and CPE such as TVs, the Forum deals with all the infrastructure in between, including encoding, video transport and aspects of security. The Ultra HD Premium logo is supposed to guarantee that a compliant TV can display UHD pictures including HDR. For example Panasonic’s DX902 4K TV carries the badge, while its UB900 Blu-ray player is one of the first devices other than a television to be approved.

The Ultra HD Forum’s guidelines then ensure that those pictures are delivered to the TVs at the agreed quality. The incorporation of the two EOTF standards is a major milestone because it opens the door to commercial UHD services that can harness the infrastructure and do justice to the growing number of TVs carrying the Ultra HD Premium logo.

HDR+ offers the greatest bang for the bit, according to Matthew Goldman, Senior Vice President Technology, TV Compression at Ericsson.

Of the two approaches standardized for encoding and transmitting HDR pictures, the one most broadcasters would prefer to use is Hybrid Log-Gamma (HLG10), developed jointly by the BBC and NHK, because it is backwards compatible with existing standard dynamic range (SDR) displays, meaning current TVs can benefit to some extent from the greater contrast. The complication lies with the other, PQ-based approach, whose metadata will not work with many existing workflows, as was explained by Matthew Goldman, Senior Vice President Technology, TV Compression at Ericsson.

HDR10, the consumer variant of the PQ approach, includes metadata that maximizes display quality but is not accommodated by all existing workflow processes such as transcoding, file conversion and content management. For this reason a stripped down version called PQ10 (Perceptual Quantization, 10 bit) was developed.

“PQ10 is the core subset of [HDR10] without the metadata,” said Goldman. “Then it can survive through existing live workflows. Both these are in the process of being standardized as a new ITU-R recommendation and are now in draft form without a number.” They are currently referred to as Draft Rec ITU-R BT.{HDR-TV}, with final ratification expected in July 2016. “The significance is that now we will have a standard way of distributing HDR content,” said Goldman. “It is unfortunate we have two ways of doing it, but there were originally six.”
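
For comparison, here is a similarly minimal sketch of the HLG OETF, using the constants the BBC and NHK published for the draft recommendation. The lower part of the signal range follows a conventional gamma-like square-root law, which is why an ordinary SDR display can show an HLG picture acceptably, while the logarithmic upper part carries the extra highlight range for HDR sets.

```python
import math

# Sketch of the Hybrid Log-Gamma (HLG) OETF: maps normalized scene
# light E (0.0-1.0) to a normalized electrical signal (0.0-1.0).
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Scene light -> signal: gamma-like below 1/12, logarithmic above."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # segment an SDR display "understands"
    return A * math.log(12 * e - B) + C  # log segment carries HDR highlights

for e in (0.01, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene {e:.3f} -> signal {hlg_oetf(e):.3f}")
```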

There are still some issues to be resolved for HDR, such as how to deal with variations in ambient lighting. When viewing in a dark room the human eye can pick up subtler distinctions between shades of black than can currently be displayed even by Ultra HD Premium compliant TVs, especially LED (that is, LED-backlit LCD) displays. The point here is that there are in fact two sets of Ultra HD Premium criteria, one for LED displays and the other for OLED models, which have different picture characteristics. While LED TVs can display HDR images with higher peak brightness, OLED TVs can display deeper blacks. To cater for these differences, LED TVs have to be capable of reaching 1,000 nits or more peak brightness and less than 0.05 nits black level to be deemed HDR capable, while OLED TVs have the less stringent target of 540 nits brightness but the more demanding 0.0005 nits black level.
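
Dividing peak brightness by black level shows why the two tiers are treated as broadly equivalent despite the different numbers; a quick check of the contrast ratios those figures imply:

```python
# Contrast ratios implied by the two Ultra HD Premium display tiers.
tiers = {
    "LED/LCD": (1000.0, 0.05),   # >= 1,000 nits peak, <= 0.05 nits black
    "OLED":    (540.0, 0.0005),  # >= 540 nits peak, <= 0.0005 nits black
}
for name, (peak_nits, black_nits) in tiers.items():
    print(f"{name}: {peak_nits / black_nits:>12,.0f}:1")
# LED/LCD:     20,000:1  (brighter highlights)
# OLED:     1,080,000:1  (far deeper blacks)
```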

This suggests that over time the standard for TV displays in general may become more rigorous, but there are constraints, such as the fact that higher brightness levels require more energy. This may make it harder to meet environmental requirements, as well as draining batteries too quickly in portable devices like tablets.

Coming back to 4K resolution, there are issues around consumer education as well as bandwidth. While Ericsson in particular is pushing HDR+ as a practical subset of full UHD and an interim solution, the expectation remains that 4K will follow as TV ecosystems evolve the required capacity and capability. But as Goldman noted, 4K only improves the viewing experience when the screen fills a wide enough angle of view; otherwise the extra pixels cannot be appreciated.

“We have to educate the viewer to sit at the right viewing distance, otherwise the eye can’t see it,” said Goldman. “The human visual system can resolve roughly one arc minute (one 60th of a degree). That is roughly three picture heights back for HD resolution, but we have to sit twice as close to see 4K in its glory. To appreciate HDR+ you can be at the far end of the room and still appreciate the extra contrast between bright and dark. Because the eye is so sensitive to contrast we think that is the real wow factor for TV, more than higher resolution.”
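
Goldman’s rules of thumb follow directly from the one arc minute figure: a pixel stops being resolvable once it subtends less than 1/60 of a degree. A small Python sketch of that geometry, using the small-angle approximation:

```python
import math

# Distance, in picture heights, at which a single pixel row subtends
# one arc minute; beyond this the eye can no longer resolve pixels.
ARC_MINUTE = math.radians(1 / 60)

def max_useful_distance(rows: int) -> float:
    """Small-angle approximation: pixel pitch (1/rows) over distance."""
    return (1 / rows) / ARC_MINUTE

print(f"1080p HD: {max_useful_distance(1080):.1f} picture heights")  # ~3.2
print(f"2160p 4K: {max_useful_distance(2160):.1f} picture heights")  # ~1.6
```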

If that is the case many operators and broadcasters may be tempted to drop 4K from their plans for immersive TV and concentrate on HDR+. There may then be a case for deploying higher frame rates before 4K as capacity allows, since HFR does improve the viewing experience for fast moving action, in live sports in particular.
