Ultra HD Forum Factors High Frame Rate Into Latest Infrastructure Guidelines

The Ultra HD Forum has given a stimulus to UHD deployments with the release of its latest 2.1 guidelines that give proper weight to all the ingredients constituting next generation A/V (Audio/Video).

Earlier versions of the guidelines were skewed towards resolution in the first instance, and then towards the combination of HDR (High Dynamic Range) with WCG (Wide Color Gamut) and 10-bit color, leaving HFR (High Frame Rate) and, to an extent, Next Generation Audio (NGA) for the future. Now, with its comprehensive 180-page document, the Ultra HD Forum has set the stage for future deployments, even though it concedes there is still work to be done, particularly on interoperability between different variants.

Ben Schwarz, the Ultra HD Forum’s Communications Chair, believes the new guidelines will stimulate UHD deployments.

The Ultra HD Forum has also moved closer to its sister organization, the UHD Alliance, which focuses on capture and presentation of video, involving the cameras and TVs at either end of the pipeline as well as the content itself, rather than the infrastructure. The Ultra HD Forum focuses particularly on the four core aspects of UHD, that is resolution, color (HDR/WCG/10-bit color), HFR and NGA, addressing interoperability of the various options within the delivery chain. Within its latest guidelines it also deals with a number of key infrastructure issues beyond the actual delivery, including security and Content Aware Encoding (CAE) to reduce bandwidth and storage for a given quality level.

The two organizations have come into closer alignment over these aspects, with the UHD Alliance dovetailing through the logos it issues for TVs and displays that conform to specifications for minimum performance levels, covering resolution, HDR, WCG and NGA.

The UHD Alliance also addresses the content itself and in August 2019 passed a significant milestone by announcing collaboration with leading consumer electronics (CE) makers, Hollywood studios and content creators to develop a new advanced viewing mode for watching movies and episodic TV called “Filmmaker Mode”. This is designed to elevate content production to the standards now capable of being displayed by TVs certified by the UHD Alliance, and by inference able to be delivered over infrastructure governed by Ultra HD Forum guidelines.

The latest version 2.1 Ultra HD Forum guidelines are relevant in this context, not just with reference to the foundation specifications but also to the extension technologies that can enhance more basic UHD content and services. The Ultra HD Forum notes that these additional UHD technologies can be “layered” onto the underlying Foundation UHD technologies in order to upgrade the consumer experience, citing the example of 7.1+4 immersive audio, which incorporates four overhead speakers reproducing sound above the listener, instead of 5.1 surround sound.

Another enhancement is dynamic metadata, which improves the way TVs render HDR by providing instructions on a frame-by-frame basis, as opposed to static metadata, which applies to the content as a whole rather than to individual frames. Without any metadata, displays would render HDR images either as over-bright, washed-out SDR images or as very dark images with little detail. Static metadata ensures the extra range of the content is taken into account even when the TV is incapable of displaying it fully. If, say, the range of the content extended up to 4,000 nits but the TV was only capable of 800 nits, the set would render pixels as realistically as possible within those limitations, but without being able to address the individual requirements of each frame. With dynamic metadata, a given frame may be confined within a much smaller nit range than others, and this can be exploited to render it accurately without compensating unnecessarily for the display’s limitations.
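
To make the distinction concrete, here is a minimal sketch in Python, assuming an invented proportional scaling curve rather than any standardized tone-mapping algorithm. It shows how a dim frame is needlessly compressed when the display only knows the title-wide static peak, but passes through untouched when per-frame dynamic metadata tells it the frame already fits within its capability.

```python
# Illustrative sketch (not any standardized algorithm): how per-frame
# (dynamic) metadata lets a display tone-map each frame to its own range,
# versus static metadata that only knows the peak over the whole title.

def tone_map(pixel_nits, content_peak_nits, display_peak_nits):
    """Compress pixel luminance into the display's range.

    Uses a simple proportional scale when the content peak exceeds the
    display peak; real displays use perceptual roll-off curves instead.
    """
    if content_peak_nits <= display_peak_nits:
        return pixel_nits            # frame already fits: no compression needed
    return pixel_nits * display_peak_nits / content_peak_nits

DISPLAY_PEAK = 800.0                 # e.g. an 800-nit TV
TITLE_PEAK = 4000.0                  # static metadata: peak over the whole title

# A dim frame whose own peak is only 300 nits.
frame_pixels = [0.5, 120.0, 300.0]
frame_peak = max(frame_pixels)       # dynamic metadata carries this per frame

static = [tone_map(p, TITLE_PEAK, DISPLAY_PEAK) for p in frame_pixels]
dynamic = [tone_map(p, frame_peak, DISPLAY_PEAK) for p in frame_pixels]

print(static)   # every pixel compressed to 1/5 of its value -> frame looks too dark
print(dynamic)  # frame peak (300 nits) fits in 800 nits, so pixels pass through unchanged
```

Real displays apply perceptual roll-off curves rather than a straight linear scale, but the benefit of knowing the per-frame peak is the same.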

Another enhancement in the Ultra HD Forum’s 2.1 guidelines concerns CAE, which in practice can improve quality as well as efficiency. CAE exploits variations between frames in their spatial and temporal complexity to reduce the overall bandwidth required to sustain a given level of perceptual quality. The Ultra HD Forum has argued that because CAE reduces the bit rate required for a given quality by 40% to 50% compared with traditional CBR (Constant Bit Rate) delivery, it can improve the experience, with up to a 50% reduction in re-buffering events and a 20% improvement in stream start times for VOD services. As the video bit rate is lower, higher resolutions can be made available to more viewers than with the CBR encoding schemes in operation today.
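
The principle can be illustrated with a short Python sketch; the complexity scores, the bit rate floor and ceiling, and the linear mapping between them are all invented for illustration and do not represent any particular CAE implementation.

```python
# Hedged sketch of the idea behind content-aware encoding: pick a per-segment
# bitrate from an estimate of spatial/temporal complexity instead of using a
# single fixed CBR rate. All numbers below are hypothetical.

def cae_bitrate(complexity, floor_kbps=4000, ceiling_kbps=15000):
    """Map a 0..1 complexity score to a bitrate within a capped range."""
    complexity = max(0.0, min(1.0, complexity))
    return floor_kbps + complexity * (ceiling_kbps - floor_kbps)

CBR_KBPS = 22000  # fixed rate of roughly 22 Mbps, as recommended for 2160p60

segments = [0.5, 0.7, 0.95, 0.6, 0.4]    # hypothetical per-segment complexity
cae_rates = [cae_bitrate(c) for c in segments]

avg_cae = sum(cae_rates) / len(cae_rates)
print(f"average CAE rate: {avg_cae/1000:.1f} Mbps "
      f"(~{100 * (1 - avg_cae / CBR_KBPS):.0f}% below {CBR_KBPS/1000:.0f} Mbps CBR)")
```

With these made-up complexity scores the average comes out at roughly half the CBR reference rate, in line with the 40% to 50% saving the Forum cites.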

The Ultra HD Forum has identified the sweet spot for CAE as delivery of 2160p60 (4K) content at bit rates between 9 Mbps and 15 Mbps, because within this range 4K content can be delivered at the same quality as CBR at the recommended rate of around 22 Mbps. For standard full HD at 1080p the efficiency savings are rather smaller because the content is at a lower resolution in the first place. Presumably, then, the potential for CAE would be even greater for 8K content when that starts being streamed.
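
A quick back-of-the-envelope calculation, assuming the roughly 22 Mbps CBR reference rate cited above, shows the savings implied at either end of that sweet spot:

```python
# Implied savings at the edges of the 9-15 Mbps "sweet spot", relative to
# an assumed ~22 Mbps CBR reference rate for 2160p60.
cbr_mbps = 22.0
for cae_mbps in (9.0, 15.0):
    saving = 100.0 * (1.0 - cae_mbps / cbr_mbps)
    print(f"{cae_mbps:4.1f} Mbps CAE vs {cbr_mbps:.0f} Mbps CBR -> ~{saving:.0f}% saving")
```

That works out at roughly 32% to 59%, bracketing the 40% to 50% figure quoted for typical content.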

The Ultra HD Forum is supposed to be agnostic over which features deliver the greatest value, but its guidelines include a revealing chart showing that extra bandwidth delivers a much bigger boost in quality when applied to some aspects of the picture than to others. For example, a 300% increase in bandwidth is required to step up from 2K full HD to 4K, but only a 50% increase to move to an HFR of 120 fps and just 20% to go from SDR (Standard Dynamic Range) to HDR. WCG and 10-bit color are even less expensive, boosting the bit rate by under 5%.
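
Taking those figures at face value, the short Python calculation below stacks the increments multiplicatively onto a hypothetical 8 Mbps 1080p60 SDR baseline; both the baseline figure and the multiplicative stacking are simplifying assumptions, used only to show how dominant the resolution step is.

```python
# Rough arithmetic based on the bandwidth/quality chart in the 2.1 guidelines:
# relative bit rate cost of layering each enhancement onto an HD baseline.
# The 8 Mbps baseline figure is an assumption, used only for illustration.

baseline_mbps = 8.0                       # hypothetical 1080p60 SDR baseline
steps = [
    ("2K -> 4K resolution", 3.00),        # +300%
    ("60 -> 120 fps (HFR)", 0.50),        # +50%
    ("SDR -> HDR",          0.20),        # +20%
    ("WCG + 10-bit color",  0.05),        # under +5%
]

rate = baseline_mbps
for name, increase in steps:
    rate *= 1.0 + increase
    print(f"{name:<22} -> {rate:5.1f} Mbps")
```

The resolution step alone quadruples the bit rate, while HFR, HDR, WCG and 10-bit color combined add less than a further doubling (1.5 × 1.2 × 1.05 ≈ 1.9×).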

On the other hand, with the help of CAE, the bandwidth cost of higher resolution can be greatly reduced, often beyond the claimed savings of 50%, as the Ultra HD Forum demonstrated on its stand at IBC 2019. It showed footage of an equestrian arena with little movement apart from a horse and rider trotting in the background and, by applying CAE, was able to display full 8K resolution at just 14 Mbps. Although artefacts quickly emerged when more movement occupied a larger proportion of the frame, the demonstration showed the potential of CAE, given that movement, even within sporting events, is so often either slight or confined to a small part of a frame.
