Live IP Delivery - Part 3 - Monitoring

Throughout this series of articles, we have been investigating how OTT internet delivery works. In this article, we dig deeper into the operational systems and examine the added benefits and necessity of monitoring.

Live television delivery is the key feature for broadcasters and content developers looking to differentiate themselves from on-demand services. Terrestrial, satellite, and cable operators have all benefited from low latency delivery provided by point-to-point and radio frequency mechanisms.

However, consumer viewing habits have changed, and audiences now want to watch their favorite programs on a host of devices from mobile phones to smart TVs. Although the internet is the most convenient delivery platform for viewers, it presents new challenges for broadcasters.

Viewers will Switch

Audiences now have greater choice and are more discerning. If their live sports event is breaking up or distorting, they are more likely to find an alternative broadcaster or streaming service to switch to. And the viewer may even go straight to social media to vent their anger. Consequently, broadcasters need to monitor the whole program chain from playout to device delivery.

As viewers move from linear television to streamed live events, new methods of monetizing content are being established. Pay-per-view events may seem the obvious solution, but even these have their challenges and customer viewing must be closely monitored.

QoE is Paramount

OTT monitoring needs to go much further than standard video and audio level tests to keep viewers engaged with the service. Delivering IP media packets to a mobile device is not enough; broadcasters must be sure the viewer is receiving a good QoE (quality of experience).

Delivery networks usually consist of many different service providers: an ISP for the last mile, a telco to interconnect to the broadcaster's playout center, and a CDN to provide the network connecting the viewer's last mile to the broadcaster's playout server. Each of these can be an independent company with its own interests at heart.

Diagram 1 – Placing monitoring probes throughout the OTT distribution chain will allow broadcasters to gather a mine of information about the integrity of the viewing experience as well as viewer trends and habits for marketing and direct advertising.

Once a program has left the playout facility and the video and audio have been quality tested and proved to be in-spec, the next two parameters critical to a good QoE are latency and bit rate.

HTTP/TCP provides greater resilience over the internet, but at the potential cost of increased latency. If an IP link is dropping packets, the integrity of the data may not be affected because any lost packets are resent, but latency will increase as the received data must be buffered while retransmissions arrive.

Such a condition can occur at any of the three service providers, or at any point within the CDN network, resulting in potentially different latencies between regions.

Strategic Monitoring

Ideally, monitoring probes should be placed at the demarcation points between each service. Probes on the output of the playout center can measure latency and bit rate as the packets make their way to the central CDN playout server, and probes at the output of each regional CDN playout server will help determine whether excessive latency has occurred.
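As an illustration of what such a probe might do, the sketch below repeatedly fetches a media segment over HTTP and records the fetch time and achieved bit rate. The segment URL, probe cadence, and metric names are placeholder assumptions rather than details of any particular monitoring product.

```python
# Sketch of a simple monitoring probe: fetch a media segment over HTTP and
# record the fetch latency and achieved bit rate. URL and cadence are assumptions.

import time
import urllib.request

SEGMENT_URL = "https://example.com/live/channel1/segment_latest.ts"  # hypothetical segment

def probe_once(url: str = SEGMENT_URL) -> dict:
    """Fetch one segment and return latency and throughput figures."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as response:
        payload = response.read()
    elapsed = time.monotonic() - start
    return {
        "fetch_time_s": round(elapsed, 3),
        "bytes": len(payload),
        "bitrate_mbps": round((len(payload) * 8) / (elapsed * 1_000_000), 2),
    }

if __name__ == "__main__":
    while True:
        print(probe_once())   # in practice the figures would be pushed to the orchestration layer
        time.sleep(10)        # probe cadence (assumed)
```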

As well as providing monitoring data to help locate errors, probes can be used to provide QoE information to allow broadcasters to fine-tune delivery and give audiences the best possible viewing experience.

OTT delivery is a fine balance between streaming quality and playback device buffer size. For viewers on the move, the quality of the data service varies enormously, especially when travelling.

Mobile devices switch between cell transceivers, and even to WiFi if a service comes into range. Each handover initially disturbs the received data stream, causing some loss of data, and any dropped segments will need to be resent by the playout server.

Buffer Compromise

The buffer employed by the mobile device is sized to smooth out these transitions so there is no loss of service to the device's media player, and a high quality of service is therefore maintained for the viewer.

The disadvantage of buffering is that the viewer must wait until the buffer reaches its playback threshold, and this can take many seconds. The effect for the viewer is to see the “downloading” icon on top of a freeze frame of the live event they are trying to watch.
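The trade-off can be illustrated with a toy model, a minimal sketch in which media downloads at a fluctuating rate and plays out at a constant rate; the download-rate range and the start-up thresholds are illustrative assumptions, not measurements of any real player.

```python
# Toy model of a playback buffer: media arrives at a fluctuating rate and is
# consumed at a constant rate. A larger start-up threshold delays playback but
# rides out the dips. Rates and thresholds are illustrative assumptions.

import random

def simulate(start_threshold_s, seconds=120, seed=1):
    random.seed(seed)
    buffered = 0.0          # seconds of media currently in the buffer
    playing = False
    start_delay = None
    stalled = 0
    for t in range(seconds):
        buffered += random.uniform(0.3, 1.8)   # downlink delivers 0.3x-1.8x real time
        if not playing and buffered >= start_threshold_s:
            playing, start_delay = True, t
        if playing:
            if buffered >= 1.0:
                buffered -= 1.0                # play one second of media
            else:
                stalled += 1                   # buffer empty: viewer sees a spinner
    return start_delay, stalled

for threshold in (2, 6, 12):
    delay, stalls = simulate(threshold)
    print(f"{threshold:>2} s threshold: playback starts after ~{delay} s, {stalls} stalled seconds")
```

A small threshold starts playback quickly but stalls more often when the data rate dips; a large threshold rarely stalls but keeps the viewer waiting at the start.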

Congestion, buildings, and other mobile device users all conspire to reduce the quality of the data streams available, forcing the mobile device to switch between carriers and causing fluctuations in data rates. But DASH (Dynamic Adaptive Streaming over HTTP) goes a long way toward fixing this, as the playback device can constantly monitor its achievable throughput and move seamlessly to the best of the available bit rates at that time.
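As a rough sketch of that client-side logic, a throughput-based selector might pick the highest rendition that fits comfortably within the recently measured download rate. The bit rate ladder and safety margin below are assumptions, not values from any specific player.

```python
# Sketch of simple throughput-based DASH rate selection. The bit rate ladder
# and safety margin are illustrative assumptions.

LADDER_KBPS = [400, 800, 1600, 3200, 6000]   # available representations (assumed)
SAFETY_MARGIN = 0.8                          # only commit 80% of measured throughput

def select_bitrate(measured_throughput_kbps: float) -> int:
    """Return the highest rendition that fits within the measured throughput."""
    usable = measured_throughput_kbps * SAFETY_MARGIN
    candidates = [rate for rate in LADDER_KBPS if rate <= usable]
    return candidates[-1] if candidates else LADDER_KBPS[0]

# Example: throughput collapses during a cell handover, then recovers.
for throughput in (5000, 2400, 700, 300, 4100):
    print(f"{throughput:>5} kb/s measured -> request the {select_bitrate(throughput)} kb/s rendition")
```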

Strategically placed probes, especially at the edge of each regional CDN, allow broadcasters to analyze and determine the best media bit rates and segment sizes to make available to the mobile device. Too many available bit rate streams will cause unnecessary switching in the playback device and waste bandwidth; too few will cause picture disturbance and defeat the purpose of DASH.

Diagram 2 – Mobile playback devices on a moving train will switch between cell masts 1, 2, and 3, as the train travels. DASH will switch between the best data rates available, giving the viewer the best QoE possible.

Segment sizes greatly influence the QoE. If they are too large, latency in the distribution network increases and the playback device buffer becomes long, causing a lengthy delay between the live event and the viewer seeing it on their device. If segments are too small, latency is lower, but the playback buffer is smaller too, and if it is too small, picture disturbance will result as the viewer travels.
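A rough back-of-the-envelope sketch of this trade-off is shown below, estimating the latency contributed by segmentation and client buffering for a few segment durations. The three-segment buffer threshold and the fixed encode and network overheads are illustrative assumptions, not measured values.

```python
# Rough estimate of how segment duration affects end-to-end latency. The buffer
# depth and the fixed overheads below are illustrative assumptions.

SEGMENTS_BUFFERED = 3      # typical player start-up threshold in segments (assumed)
ENCODE_PACKAGE_S = 1.5     # encoder and packager overhead in seconds (assumed)
NETWORK_CDN_S = 1.0        # propagation and CDN handling in seconds (assumed)

def estimated_latency(segment_duration_s: float) -> float:
    """Glass-to-glass latency, dominated by the segments the player buffers."""
    return SEGMENTS_BUFFERED * segment_duration_s + ENCODE_PACKAGE_S + NETWORK_CDN_S

for duration in (2.0, 4.0, 6.0, 10.0):
    print(f"{duration:>4.0f} s segments -> roughly {estimated_latency(duration):.1f} s behind live")
```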

Broadcasters will increasingly make special live events available as pay-per-view OTT promotions. To watch, the viewer logs on to a website, enters their credentials, and validates their payment method. The playback device then receives a decryption key allowing its DRM (Digital Rights Management) software to display the content.

Validate Viewers

Audiences tend to start viewing close to the beginning of the event, and this could mean thousands, if not millions, of viewers all validating their playback devices at the same time. The resulting congestion at the start of the game could leave many viewers unable to receive the first few minutes.

Although this is technically not a broadcast issue, it still falls under the remit of QoE: if viewers cannot stream the game, they will not be satisfied with the service and will go elsewhere. It is as serious as the video breaking up or the audio disappearing.

Consequently, QoE is part of the media streaming process, and broadcast engineers will need to collaborate with their IT colleagues to understand how to rectify these challenges. Available remedies include allowing viewers access to the first ten minutes of the game and then staggering validation in the background across all playback devices, or spinning up more validation servers at the beginning of the game to take on the anticipated extra load.
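A minimal sketch of the staggered-validation idea is given below, assuming a hypothetical license endpoint, a ten-minute grace window, and a random jitter; the endpoint URL, token handling, and timings are placeholders rather than a real DRM API.

```python
# Sketch: stagger background license validation to avoid a surge of requests at
# kick-off. The endpoint, grace window, and jitter range are assumptions.

import random
import time
import urllib.request

LICENSE_URL = "https://example.com/drm/validate"   # hypothetical validation endpoint
JITTER_RANGE_S = (0, 540)   # spread requests across most of a ten-minute grace window

def validate_in_background(device_token: str) -> int:
    """Wait a random delay within the grace window, then validate the device."""
    time.sleep(random.uniform(*JITTER_RANGE_S))   # a real player would use an async timer
    request = urllib.request.Request(
        LICENSE_URL,
        data=device_token.encode(),
        headers={"Content-Type": "text/plain"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status   # 200 -> keep the decryption key beyond the grace period
```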

Unlike traditional broadcasting, OTT quickly provides the data needed to make these decisions, empowering broadcasters to dynamically optimize their systems and deliver the best QoE possible.

Simplify with Orchestration

Orchestration has been developed to bring all of this together and simplify OTT delivery. It provides a schematic overview of the network (especially at the demarcation points between different network suppliers), consolidation of the monitoring probe data, and control of key parameters in encoding and segmentation.
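As a simple sketch of what consolidating probe data might look like, the example below averages latency and bit rate per demarcation point and flags the part of the chain that exceeds a latency budget. The probe names, readings, and budget are hypothetical.

```python
# Sketch: consolidate probe readings per demarcation point and flag the part of
# the chain exceeding a latency budget. Names, readings, and budget are hypothetical.

from dataclasses import dataclass
from statistics import mean

@dataclass
class ProbeReading:
    demarcation: str        # e.g. "playout -> CDN origin"
    latency_ms: float
    bitrate_mbps: float

def summarize(readings: list[ProbeReading], latency_budget_ms: float = 500.0) -> None:
    by_point = {}
    for reading in readings:
        by_point.setdefault(reading.demarcation, []).append(reading)
    for point, group in by_point.items():
        avg_latency = mean(r.latency_ms for r in group)
        avg_bitrate = mean(r.bitrate_mbps for r in group)
        flag = "  <-- investigate" if avg_latency > latency_budget_ms else ""
        print(f"{point:30s} {avg_latency:7.1f} ms  {avg_bitrate:6.2f} Mb/s{flag}")

summarize([
    ProbeReading("playout -> CDN origin", 180.0, 42.0),
    ProbeReading("CDN origin -> regional edge", 620.0, 38.5),
    ProbeReading("regional edge -> last mile", 240.0, 12.0),
])
```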

Broadcasters can even switch between CDN or telco providers on the fly if a provider is not performing well, and they can analyze the data with the service provider to find out where bottlenecks, congestion, and lost packets have occurred, proactively keeping systems optimized.

As well as engineering data, broadcasters now have virtually instant information about the viewing habits of their audiences and can gain a better understanding of individual viewers to personalize advertising.

OTT requires more than making sure the media signal has reached the viewer intact. It encompasses the audience's whole quality of experience, making sure viewers are satisfied, stay watching, and come back again.
