Delivering High-Quality, Low-Latency Video on Every Screen With Multicast ABR Technology

New devices, including tablets and smartphones, enable television viewers to enjoy live sports and news anytime, anywhere, but latency remains a real issue, causing frustration among end users. In today’s connected world, where viewers are simultaneously checking social media feeds on their smartphones and tablets, a significant delay means that end users might find out about a game-winning soccer goal from Twitter before actually seeing it. With traditional unicast streaming, buffering several large chunks is necessary to avoid service interruption in the face of bursty, irregular HTTP traffic. This article explores the sources of latency in the video delivery chain and explains how multicast technology and managed network capabilities can ensure smooth traffic, without requiring massive buffering on the player side, while guaranteeing a good quality of experience (QoE). Using multicast technology and other recent innovations, service providers can stream live to connected devices with ultra-low latency.

Nivedita Nouvel is vice president of marketing at Broadpeak.

Why Does Latency Exist?

Traditional television services such as IPTV rely on network bandwidth that is smooth and guaranteed. Because only limited buffering is required in the set-top box (STB), delay is typically low. The challenge with streaming on secondary screens such as connected TVs, smartphones, and tablets is that they are reached over various unmanaged networks (e.g., 3G, 4G and OTT broadband) where bandwidth is not guaranteed. The large quantity of video that must be buffered in the player creates a long delay.

Latency is a delay in the video stream caused by the use of ABR protocols. Players need to buffer some data before playback in order to cope with the bursty nature of HTTP traffic delivered over unmanaged networks such as the internet, which were designed for data transfer rather than for the real-time constraints of video. In the multiscreen environment, operators cannot guarantee a constant bandwidth.

Apple recommends that video chunks last six seconds, and three chunks are usually buffered in the player, implying an extra 18 seconds of delay. Some Android players even require five chunks in the buffer, meaning the delay can exceed 30 seconds.
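The arithmetic behind those figures is simply the chunk duration multiplied by the number of chunks the player holds; a quick sketch using the numbers quoted above:

```python
def player_buffer_latency(chunk_duration_s: float, buffered_chunks: int) -> float:
    """Latency added purely by the player's buffering policy, in seconds."""
    return chunk_duration_s * buffered_chunks

print(player_buffer_latency(6, 3))   # 18 -> typical HLS player holding three 6 s chunks
print(player_buffer_latency(6, 5))   # 30 -> Android players holding five chunks
print(player_buffer_latency(2, 1))   # 2  -> 2 s chunks with a single buffered chunk
```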

Figure 1. Sources of latency with unicast ABR delivery.

The data held in the player buffer ahead of playout hides the irregular character of the traffic and prevents service interruptions. As a consequence, the video the player is rendering is based on “old” chunks sent by the origin servers. Preserving QoE requires this buffering: without it, video playout would be constantly interrupted for rebuffering.

How Multicast Technology Aids in Reducing Latency

At the CDN level, operators can use multicast ABR technology to eliminate jitter. This requires deploying a unicast-to-multicast transcaster in the head-end and multicast-to-unicast agents in the home gateways.
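For illustration only, the sketch below shows in simplified Python what a multicast-to-unicast agent conceptually does inside the home gateway: join a multicast group, cache the segments pushed on it, and serve them over ordinary unicast HTTP to the local player, which keeps behaving as a normal ABR client. The group address, port, and datagram framing are assumptions for the example, not any vendor’s actual protocol.

```python
import socket
import struct
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

MCAST_GRP = "239.1.1.1"   # illustrative multicast group carrying the ABR segments
MCAST_PORT = 5004         # illustrative port

segments = {}             # in-memory cache: segment name -> bytes received so far

def multicast_receiver():
    """Join the multicast group and cache whatever segments are pushed on it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, _addr = sock.recvfrom(65535)
        # Toy framing: each datagram starts with the segment name and a newline.
        # A real agent reassembles a proper transport format into full HLS/DASH segments.
        name, _, payload = data.partition(b"\n")
        segments.setdefault(name.decode(), bytearray()).extend(payload)

class UnicastHandler(BaseHTTPRequestHandler):
    """Serve cached segments to the local player over ordinary unicast HTTP."""
    def do_GET(self):
        segment = segments.get(self.path.lstrip("/"))
        if segment is None:
            self.send_error(404, "Segment not (yet) received on multicast")
            return
        self.send_response(200)
        self.send_header("Content-Type", "video/mp4")
        self.send_header("Content-Length", str(len(segment)))
        self.end_headers()
        self.wfile.write(segment)

if __name__ == "__main__":
    threading.Thread(target=multicast_receiver, daemon=True).start()
    HTTPServer(("127.0.0.1", 8080), UnicastHandler).serve_forever()
```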

Figure 2. Unicast vs. multicast setup.

Multicast ABR technology ensures smooth traffic without requiring massive buffering on the player side to guarantee a good quality of experience.

Figure 3. Multicast ABR technology ensures a smooth quality of experience compared with the peaks that are associated with unicast delivery.

Using multicast ABR technology, operators can bring the latency down to a few seconds instead of the typical 30 seconds, resulting in increased viewer satisfaction.
To summarize, the steps that have to be taken to reduce end-to-end latency include:

  • On the encoder side: Adapt the GOP size so that it is compatible with smaller (2 s) chunks.
  • On the origin packager side: Generate small (2 s) chunks for HLS or DASH.
  • On the CDN side: Use multicast ABR in the operator’s network to kill jitter.
  • On the player side: Adapt the volume of buffered content to the context. Buffer only one chunk on the home network for good quality and low latency; buffer more chunks in mobility to keep good quality, even at the cost of longer latency (see the sketch after this list).
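A minimal Python sketch of that player-side rule, with buffer depths chosen purely for illustration rather than taken from any particular player:

```python
def target_buffer_chunks(on_managed_home_network: bool) -> int:
    """Return how many chunks the player should keep buffered.

    On the managed home network, multicast ABR keeps jitter very low,
    so one buffered chunk is enough (low latency). On mobile or other
    unmanaged networks, buffer more to absorb bandwidth variations,
    accepting a longer latency in exchange for stable quality.
    """
    return 1 if on_managed_home_network else 3

# With 2 s chunks: roughly 2 s of player latency at home, 6 s in mobility.
print(target_buffer_chunks(True), target_buffer_chunks(False))
```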

Just to recap, multicast ABR uses the same network as traditional IPTV. Traffic is prioritized inside the network, so jitter is very low. There is no reason to buffer more in the device than is done with traditional IPTV.

Additional Steps to Decrease Latency

Beyond employing multicast ABR technology at the CDN level, there are several additional steps operators can take to reduce latency. At the encoder stage, operators can encode using only I and P frames, removing the reordering delay introduced by B-frames. This might, however, impact video quality.
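As a hedged illustration of what this can look like with a common encoder, the FFmpeg invocation below (wrapped in Python; file names, bitrate, and frame rate are assumptions) disables B-frames and sets a 2 s GOP so that each small chunk can start on an IDR frame:

```python
import subprocess

# Illustrative FFmpeg/x264 settings (input, output and bitrate are placeholders):
#   -bf 0                  -> no B-frames (I and P only), so no reordering delay
#   -g 50 / -keyint_min 50 -> 2 s GOP at an assumed 25 fps, so every 2 s chunk starts on an IDR
subprocess.run([
    "ffmpeg", "-i", "live_input.ts",
    "-c:v", "libx264", "-b:v", "3M",
    "-bf", "0",
    "-g", "50", "-keyint_min", "50",
    "-f", "mpegts", "live_output.ts",
], check=True)
```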

Steps can also be taken to reduce latency at the origin packager stage. CMAF, or Common Media Application Format, is a standard that provides a common media format for ABR video (a goal only partially achieved because two encryption schemes remain possible). CMAF offers a low-latency flavor that allows operators to produce chunks of between 40 ms and 250 ms inside 2 s fragments. This has to be used in conjunction with the “chunked transfer encoding” technique to bring latency down.

Chunked transfer encoding (CTE) is a streaming data transfer mechanism specific to HTTP/1.1. It allows a server to maintain an HTTP persistent connection for dynamically generated content. The key benefit is that it is not necessary to generate the full content before writing the header; the chunks can be sent on the go, without waiting for the full segment to be produced. Combined with CMAF, which allows very small chunks, CTE brings latency down at the head-end level.
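A minimal sketch of what chunked transfer encoding looks like on the wire, assuming the packager hands the server CMAF chunks one by one (the connection handling and chunk source are simplified assumptions):

```python
def send_segment_chunked(conn, cmaf_chunks):
    """Send one 2 s CMAF fragment over an already-accepted TCP socket using
    HTTP/1.1 chunked transfer encoding, chunk by chunk as each becomes available."""
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: video/mp4\r\n"
        b"Transfer-Encoding: chunked\r\n"
        b"\r\n"
    )
    for chunk in cmaf_chunks:            # e.g. 40-250 ms CMAF chunks from the packager
        # Each HTTP chunk on the wire: hex length, CRLF, payload, CRLF
        conn.sendall(f"{len(chunk):X}\r\n".encode("ascii") + chunk + b"\r\n")
    conn.sendall(b"0\r\n\r\n")           # zero-length chunk terminates the response body
```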

Note that the player should be compliant with CMAF and CTE. It should also adapt the volume of content it buffers to the context. With CMAF, playback can start as soon as a chunk containing an IDR frame is fetched.
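On the player side, progressive consumption of a chunked response can be sketched with the requests library (an assumed choice; the URL and the decoder hand-off are hypothetical):

```python
import requests

SEGMENT_URL = "http://packager.example.com/live/seg_0001.cmfv"  # hypothetical URL

def feed_decoder(data: bytes) -> None:
    """Hypothetical hand-off to the media pipeline; here it just reports sizes."""
    print(f"received {len(data)} bytes, pushing to decoder")

with requests.get(SEGMENT_URL, stream=True) as resp:
    resp.raise_for_status()
    # With stream=True and chunk_size=None, data is yielded as it arrives,
    # so the first CMAF chunk (with its IDR frame) can be decoded immediately.
    for cmaf_chunk in resp.iter_content(chunk_size=None):
        feed_decoder(cmaf_chunk)
```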

Conclusion

Operators can reduce the amount of video buffered in the player by decreasing the chunk size, using CMAF low-latency packaging and chunked transfer encoding, and limiting the number of chunks held in the player. Done without adapting the delivery network to streamline the traffic, these steps do achieve low latency, but video quality will suffer dramatically due to constant rebuffering.

The solution is to use multicast ABR technology in the network. Multicast ABR technology creates the required network conditions to deliver low latency without impacting quality of service. Being able to stream live content with minimal delay and guaranteed quality of service gives operators a competitive edge over on-demand streaming services, increasing their monetization opportunities and customer satisfaction, especially for customers who are sports fans and want to enjoy live streaming on new screens (e.g., connected TVs).
