Professional Live IP Video - Migration And Media Standards

As the adoption of IP for live broadcast production begins to mature, we examine the standards and some of the factors to consider in migration.

In many ways, IP isn’t ‘new’ any more. It’s been six years since the debut of the IP Showcase, now a fixture at every IBC and NAB demonstrating the latest in IP-based live production technology. In the audio world, AES67 has been around for almost a decade and the development of Dante started over sixteen years ago. This passage of time has allowed solutions to mature; however, many individuals and broadcasters have yet to use IP outside of their file-based workflows. In this article, we will explore the questions to ask when considering a move to IP, the types of network traffic involved and the importance of software-defined networks (SDNs).

Online streaming services are great examples of real-time IP workflows for video and audio with an interesting, modern and evolving technology stack. However, for live productions with a studio element, technologies like HLS, DASH and WebRTC don’t have the low-latency performance needed, nor do they work for broadcasters who care about having high-quality video assets of their productions. With compression being the biggest cause of latency and quality loss, turning to uncompressed media is an obvious choice, though not without its challenges.
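
As a rough illustration of what ‘uncompressed’ means in practice, the back-of-envelope Python sketch below (our own illustrative calculation, not part of any standard) estimates the active-picture bit rate of an HD signal:

```python
# Rough, illustrative calculation of why uncompressed video is demanding:
# active-picture bandwidth for 1080p59.94, 10-bit 4:2:2 (figures are approximate).
width, height, fps = 1920, 1080, 60000 / 1001        # 1080p59.94
bits_per_pixel = 10 + 10                              # 4:2:2 sampling: luma + shared chroma
active_bps = width * height * bits_per_pixel * fps
print(f"Active HD video is roughly {active_bps / 1e9:.2f} Gb/s")
# ~2.5 Gb/s of active picture, ~3 Gb/s on the wire once blanking is included;
# a 2160p59.94 (UHD) picture is four times larger, hence the ~12 Gb/s of 12G-SDI.
```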

Motivation & Migration to IP

Each organization’s mix of reasons for moving to IP will be different. Some simply can't expand without the higher-density routing and switching that IP offers compared to SDI. Some need the flexibility it enables, whereas others benefit from sharing equipment between facilities. Carefully consider the business case to ensure that any decision to implement IP is made for the right reasons and at the right time.

It’s no secret that ‘going IP' doesn’t have to be an all-or-nothing game. For broadcasters who are at the point of re-investing in their systems, keeping some or all of the system as SDI should be considered. A gradual move to IP can be the key to helping engineers who need to plan and maintain the system gain the additional skills they need. For companies that have their own OB vehicle, for instance, running the truck as an IP system can be a great option. OB vehicles benefit from the weight savings of reduced cabling and, for an almost SDI-free truck, the removal of infrastructure ‘glue’ and SDI matrices is a major space saving as well. With an IP infrastructure, a truck can easily scale up for larger events, which is ideal for sports where important games need more cameras. The critical factor in IP migrations, though, is to keep the scope tightly defined and to test within an isolated island. An alternative to a truck would be a single studio, perhaps a self-op or bulletin studio, which would offer your internal teams lower-risk opportunities to gain experience in diagnosing IT issues as well as broadcast problems.

Key Technologies for Uncompressed Live Production

The two SMPTE standards, ST 2022-6 and its sister ST 2022-5, describe how to send an SDI signal over IP. An SDI signal is a carefully multiplexed combination of video and embedded audio, with subtitles and other ancillary signals carried in the VANC space of the blanking interval. ST 2022-6 describes how to send that same uncompressed data and structure over IP, with ST 2022-5 explaining how to add forward error correction (FEC) to give it some resilience against data loss in the network. ST 2022-6 was the first notable standard to emerge for moving uncompressed video over IP for live production and is still used to this day, particularly where the embedded nature of the signal is important. A typical use of ST 2022-6 is handing off linear channels from a playout system to a satellite operator, because the multiplexed signal is inherently resilient to synchronization errors between its elements. For live production, however, ST 2022-6 acted as a stepping stone to SMPTE ST 2110.
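
As a hedged sketch of what this looks like on the wire, the snippet below estimates ST 2022-6 packet rates; the 1376-byte media payload per RTP datagram is the commonly quoted figure and is treated here as an assumption:

```python
# Hedged sketch: approximate packet rate for ST 2022-6, which wraps the entire
# SDI data stream (blanking included) in RTP datagrams. The 1376-byte media
# payload per packet is the commonly quoted figure, treated here as an assumption.
sdi_rates_bps = {"HD-SDI (1.5G)": 1.485e9, "3G-SDI": 2.970e9}
payload_bytes = 1376
for name, rate in sdi_rates_bps.items():
    pps = rate / 8 / payload_bytes
    print(f"{name}: roughly {pps:,.0f} packets per second")
# Every packet carries video, audio and VANC multiplexed together, just as on the coax.
```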

SMPTE has standardized a whole suite of documents which together are called ST 2110. The standards explain how to manage timing, video, audio and ancillary data. ST 2110’s core concept is that every type of stream should be sent independently. The advantage of keeping everything separate is that very low bitrate streams such as subtitles are not intertwined with 12Gbps UHD video, so computing and bandwidth requirements become proportional to the data each device actually needs to process.
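
To illustrate that proportionality, the sketch below models a single source as separate essence streams; the multicast addresses and bit rates are invented purely for illustration:

```python
# Hedged sketch of ST 2110's "separate essence" idea: each essence is its own
# RTP stream. Multicast addresses and bit rates here are illustrative only.
flows = {
    "video (ST 2110-20, UHD)":  {"mcast": "239.10.1.1", "approx_bps": 12e9},
    "audio (ST 2110-30, 16ch)": {"mcast": "239.10.1.2", "approx_bps": 20e6},
    "ancillary (ST 2110-40)":   {"mcast": "239.10.1.3", "approx_bps": 1e6},
}
for name, flow in flows.items():
    print(f"{name:28s} {flow['mcast']:12s} ~{flow['approx_bps']/1e6:,.0f} Mb/s")
# A device that only needs audio or subtitles subscribes to that stream alone,
# so its compute and bandwidth scale with megabits, not gigabits.
```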

Considering Compression

NDI takes a different tack by recognizing that low latency is the most important thing for live production, but crucially it moves away from the SDI model by sending the video compressed. NDI is a technology owned by Vizrt, although in many cases vendors are free to use it in their own products. By applying light compression, latency can be kept to a few frames while quality remains visually lossless or very good, making it suitable for many applications, even on major broadcast channels. Although NDI latency is higher than ST 2110’s, for many uses it’s not a problem. Due to its ease of use, ‘free’ price tag and compatibility with gigabit networks, NDI has been widely used since its launch in 2016 in the fields of AV, esports and, indeed, parts of broadcast. The cloud, where SDI cables can’t exist, is a natural home for NDI, and many broadcasters, including some who don’t use NDI in their studios, have completely NDI-based workflows in their cloud production environments.
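
As a heavily hedged comparison (the NDI bit rate below is an illustrative assumption, not a specification), the sketch shows why gigabit networks are workable for lightly compressed streams but not uncompressed ones:

```python
# Back-of-envelope comparison; figures are assumptions for illustration only.
link_bps = 1e9                 # gigabit Ethernet
ndi_hd_bps = 150e6             # illustrative figure for a full-bandwidth NDI HD stream
uncompressed_hd_bps = 3e9      # 3G-SDI equivalent for the same picture
print(f"NDI HD streams per 1 GbE link (approx): {int(link_bps // ndi_hd_bps)}")
print(f"Uncompressed HD streams per 1 GbE link: {int(link_bps // uncompressed_hd_bps)}")
# Several compressed streams fit comfortably; a single uncompressed HD stream does not.
```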

Considering Audio

Looking at audio, a number of proprietary technologies exist, including Q-LAN, Livewire, and Audinate’s Dante, all of which combine discovery and connection management with transport and timing protocols for synchronization. Each technology uses a mix of proprietary and open standards. RAVENNA is an open specification which aims to meet or better the functionality of the proprietary technologies. All of these allow for high-quality live production with uncompressed audio, but their coexistence creates the challenge of moving audio between systems.

AES67 is a standardized audio protocol which aims to provide interoperability between different audio-over-IP systems. AES67 is the designated choice for audio transport in SMPTE ST 2110-30, offering 48kHz audio streams of up to 8 channels at 1ms latency, with the option to go down to 125µs. AES67 can also be used within the other major proprietary audio-over-IP and audio-over-Ethernet protocols discussed above. Each of these technologies adds functionality on top of AES67’s audio transport mechanism, such as control and monitoring or device discovery, but importantly, by keeping AES67 at the heart, a stadium running a Dante network can easily exchange audio with OB trucks running RAVENNA or ST 2110-30.
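
The packet arithmetic behind those latency figures is simple enough to sketch; the snippet below assumes 24-bit linear PCM (L24) and up to 8 channels, as commonly used with ST 2110-30:

```python
# Hedged sketch of AES67 / ST 2110-30 packet arithmetic (24-bit linear PCM assumed).
sample_rate = 48_000
channels, bytes_per_sample = 8, 3                 # up to 8 channels of 24-bit audio
for packet_time in (1e-3, 125e-6):                # 1 ms default, 125 µs low-latency option
    samples = int(sample_rate * packet_time)      # 48 or 6 samples per channel per packet
    payload = samples * channels * bytes_per_sample
    print(f"{packet_time*1e6:>5.0f} µs packets: {samples} samples/ch, "
          f"{payload} byte payload, {int(1/packet_time)} packets/s")
```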

Taking Control of Your Network

Software Defined Networking (SDN) is a way of controlling a network to make it behave deterministically, like typical non-IP broadcast infrastructure, and when used with broadcast control software, users and operators can be unaware that anything IP is happening at all. It’s not always healthy to bring along old ways of working when moving to new technologies; a lot of the ideas from tape-based workflows were irrelevant or wrong for file-based workflows. But IP or not, broadcasters need to know their system is resilient and redundant, which is where SDN comes in by taking top-down control.

Normally, switches make millions of individual, isolated routing decisions as packets flow through. This is very effective, but there’s a practical limit on how much control you can have over traffic flow. SDN allows a centralized controller to talk to all switches and orchestrate exactly how each stream travels through the network, dictating each step of the route, and it can also configure the sender and receiver devices. This ensures path redundancy and deterministic routing. Now, a broadcast control system can simply ask for video from ‘Replay 1 Output A’ to be sent to ‘Mixer Input 3’, and the SDN controller will ensure that all the IP information and redundant routing is handled. By adding in these layers of abstraction, we can build a system which is understandable, and hence manageable, at each level.
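
A minimal, hypothetical sketch of that abstraction layer is shown below; the device names, multicast addresses and controller behaviour are invented for illustration only:

```python
# Hypothetical sketch of the abstraction an SDN controller provides: operators
# deal in logical names, the controller deals in multicast groups, ports and
# switch forwarding entries. All names and addresses are invented.
senders   = {"Replay 1 Output A": {"mcast": "239.20.1.1", "port": 5004}}
receivers = {"Mixer Input 3":     {"switch": "leaf-03", "iface": "eth7"}}

def route(source_name: str, destination_name: str) -> None:
    src, dst = senders[source_name], receivers[destination_name]
    # A real controller would also compute a primary and a redundant path
    # (ST 2022-7 style) and push forwarding rules to every switch hop.
    print(f"Program {dst['switch']}:{dst['iface']} to receive "
          f"{src['mcast']}:{src['port']} ({source_name} -> {destination_name})")

route("Replay 1 Output A", "Mixer Input 3")
```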

SDNs have another major benefit: managing the scale of ST 2110 networks. Once audio and metadata are taken into account, each video route could easily need 12 routes in total, each with IP addresses, port numbers and connection profiles that need to be tracked. For a larger broadcaster it’s easy to have far in excess of 200,000 endpoints. Keeping all this information up to date, yet instantly accessible so signals can be flexibly routed around the facility, is challenging, but when SDN is implemented for broadcast, this ability to control the network comes with the power to make sense of the tens or hundreds of thousands of endpoints.
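
A back-of-envelope sketch of how those numbers add up is shown below; every figure is an illustrative assumption rather than a measurement:

```python
# Illustrative arithmetic only: how endpoint counts grow in a large ST 2110 facility.
video_routes    = 5_000    # logical video routes across a multi-studio facility (assumption)
flows_per_route = 12       # video, audio and ancillary essences per route
redundancy      = 2        # duplicated red/blue networks (ST 2022-7 style)
ends_per_flow   = 2        # a sender and a receiver per flow
endpoints = video_routes * flows_per_route * redundancy * ends_per_flow
print(f"Endpoints to track: {endpoints:,}")   # 240,000 - each with addresses, ports and profiles
```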
