Designing Media Supply Chains: Part 5 - Analytics And Monitoring Drive Automation

Analytics and monitoring are now more critical than ever for media supply chains to deliver on various targets, including customer retention, regulatory compliance, revenue generation through effective targeting of ads and content, and efficient use of network resources.

Automation is essential for capturing relevant information quickly and at large scale, some of which becomes metadata used at various points of the supply chain, and for acting upon that data to make decisions about routing within the infrastructure and about which content to distribute.

At one time, metadata was generated manually, but this is no longer feasible, even for information used mainly for archiving, simply because it would take too long and cost too much. Furthermore, far more metadata is generated today for a greater range of functions, for example to enforce rules on privacy and content moderation. This applies particularly to the big online streamers and social media platforms, such as Facebook and YouTube, because they have become accountable for the content their platforms distribute, making them more akin to publishers than to neutral brokers of communications such as telco voice and data networks.

But most broadcasters and video service providers are also under a growing obligation to moderate live content as it goes out, which is a particularly pressing challenge for their supply chains. Some have responded through collaborations with their peers and technology providers. Al Jazeera, for example, teamed with a group including news agencies Associated Press (AP) and Reuters, Irish broadcaster RTÉ, and IBM, in a Proof of Concept applying AI techniques, primarily machine learning algorithms, to analyse metadata, either existing or generated on the fly, for moderating live content. Such content might be produced by news teams or freelance journalists, or be user generated.

Most if not all of the big hyperscalers and major IT vendors have tools for content analysis, but this particular collaboration exemplified the growing participation of broadcasters in such projects. These involve analysis of both audio and video to classify content by type and to identify possible infringement of rules relating to incitement to violence, racism and many other categories. The AI referred to is almost all machine learning using neural networks, with the partial exception of Natural Language Processing (NLP) for analysing the audio and extracting meaning from the words. Machine learning comes into NLP as well, but NLP also draws on techniques derived, for example, from mathematical graph theory to analyse words and meanings in tree-and-branch structures.
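As an illustration only, unrelated to any specific project mentioned above, a dependency parse is one familiar example of such tree-and-branch structures. This minimal Python sketch uses the open-source spaCy library and assumes its small English model has been installed (via "python -m spacy download en_core_web_sm"); the sample sentence is hypothetical:

```python
import spacy

# Load the small English pipeline (assumes it has been downloaded beforehand).
nlp = spacy.load("en_core_web_sm")

doc = nlp("The broadcaster flagged the clip for manual review.")
for token in doc:
    # Each token points to its syntactic head, so the sentence forms a tree
    # over words and their grammatical relations.
    print(f"{token.text:12} --{token.dep_:10}--> {token.head.text}")
```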

Such AI-based metadata generation and analysis is also valuable for non-live content, where there is scope for greater depth, not just for content moderation but also for classification to support search, recommendation, and navigation.

On this front, metadata has become more diverse and has expanded beyond basic search to enable more dynamic decision making, with constant updating during workflow processes. This can range from straightforward updating of content to conform with house formats during ingest, to identifying content that matches trending keywords on social media. That in turn can lead to the content being recommended more prominently, taking account of user preferences.
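A toy sketch of that matching step might look like the following; the trending terms, weights and function names are hypothetical assumptions, not any provider's actual scheme:

```python
# Boost an asset's recommendation score when its metadata keywords overlap
# with currently trending terms, scaled by a stored per-user affinity.
TRENDING = {"election", "world cup", "heatwave"}

def boosted_score(base_score: float, asset_keywords: set[str],
                  user_affinity: float) -> float:
    overlap = asset_keywords & TRENDING
    # Assumed scheme: each trending match adds 10%, scaled by user affinity.
    return base_score * (1 + 0.1 * len(overlap) * user_affinity)

# An asset tagged "world cup" gets a modest lift for a sports-leaning user.
print(boosted_score(0.62, {"world cup", "football"}, user_affinity=0.8))
```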

AI-driven metadata is also being used increasingly for QC (Quality Control), which is especially important for streaming content distributed over unmanaged networks, where at one time content owners or rights holders had little idea of what quality their subscribers were enjoying, or not, until they complained or, worse, churned to an alternative provider. Netflix was one of the first to harness advanced data science for QC, both in distribution and in vetting content as it comes in from numerous third-party sources, including studios and documentary filmmakers. For at least eight years Netflix has been performing automated inspections of assets both before and after encoding of the source files, aiming to identify any assets that do not meet specified quality standards and, if necessary, replace them.
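Netflix has not published this tooling in detail, so the following is only a minimal sketch of the general idea: an automated inspection that probes an incoming asset with the open-source ffprobe tool and flags anything falling below assumed house minimums. The thresholds, file name and helper name are hypothetical:

```python
import json
import subprocess

# Assumed house minimums for an HD master; real QC rule sets are far richer.
MIN_WIDTH = 1920
MIN_HEIGHT = 1080

def inspect_asset(path: str) -> list[str]:
    """Return a list of QC failures for the asset at `path`."""
    # Probe the file and parse ffprobe's JSON output.
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-show_streams", "-show_format",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)
    failures = []
    video = [s for s in info["streams"] if s.get("codec_type") == "video"]
    audio = [s for s in info["streams"] if s.get("codec_type") == "audio"]
    if not video:
        failures.append("no video stream found")
    elif (video[0].get("width", 0) < MIN_WIDTH
          or video[0].get("height", 0) < MIN_HEIGHT):
        failures.append("resolution below house minimum")
    if not audio:
        failures.append("no audio stream found")
    return failures

if __name__ == "__main__":
    for problem in inspect_asset("master_delivery.mov"):
        print("QC FAIL:", problem)
```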

Netflix has more recently highlighted its supply chain analytics as one factor helping it recover from the temporary loss of subscribers earlier in 2022. Its use of analytics has even been emulated in non-broadcast sectors such as logistics, given that some of the same core principles apply across almost all types of supply chain. The aim is to increase levels of automation in data extraction and reduction, identifying key metrics for decision making across different time scales and functions.

One of the challenges that Netflix and others had to get on top of was the proliferation of false positives relating to QC issues, resulting from the great expansion in the amount of data collected and analysed. Such data was of varying accuracy and fed an increasingly complex, rather fuzzy model aiming to compute probabilities of certain events occurring, or indeed having already taken place. The use of ML has helped filter this data more effectively, weed out false positives, and lead to more accurate and useful conclusions that can be acted upon automatically where possible, allowing human experts to concentrate on a manageable range of indicators or outcomes.
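The principle of putting confidence thresholds on such probabilistic outputs can be sketched very simply. The numbers, labels and actions below are illustrative assumptions, not any provider's actual rules:

```python
# Route a model's defect probability to one of three outcomes: automated
# action, human review, or logging as a likely false positive.
AUTO_ACT_THRESHOLD = 0.95   # act without human review above this
REVIEW_THRESHOLD = 0.60     # queue for an operator between the two

def route_alert(defect_probability: float) -> str:
    if defect_probability >= AUTO_ACT_THRESHOLD:
        return "auto-reject asset and request redelivery"
    if defect_probability >= REVIEW_THRESHOLD:
        return "queue for human review"
    return "log only (likely false positive)"

for p in (0.99, 0.75, 0.30):
    print(p, "->", route_alert(p))
```

The design point is that the machine computes the probability, but a human chooses where the thresholds sit, which is exactly the division of labour the article returns to below.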

Standardization has also become more critical in the streaming era because of the increasing intermingling of different content providers and networks within supply chains. Broadcast metadata standardization is an old subject, dating back at least two decades to the creation of MPEG-21, also enshrined as ISO/IEC 21000, which has evolved to define and help specify the resources involved in content delivery and playback. For the latter, extensions have been required, particularly for streaming, such as timed metadata that allows control over specific events during playback on the user's device, like insertion of targeted adverts.

The Alliance for Open Media (AOM) has been working on ID3 Timed Metadata in the MPEG Common Media Application Format (CMAF), which is designed to simplify streaming of content by providing a common framework for encoding, packaging, and storage. This works through the concept of Event Message Boxes, which interrupt playback of content at specified times or overlay some desired object, such as graphics or scores in a sporting event.
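As a hedged sketch of how such a box is laid out on the wire, the following Python builds a version 1 'emsg' (Event Message) box as defined in ISO/IEC 23009-1, in the style of the AOM ID3-in-CMAF specification. The scheme URI, value and payload bytes are illustrative assumptions; the payload here is a placeholder, not a conformant ID3 tag:

```python
import struct

def build_emsg_v1(scheme_id_uri: str, value: str, timescale: int,
                  presentation_time: int, duration: int, event_id: int,
                  message_data: bytes) -> bytes:
    """Serialize a version 1 'emsg' box per ISO/IEC 23009-1."""
    body = b"\x01\x00\x00\x00"  # version = 1, flags = 0
    # Fixed fields: timescale (u32), presentation_time (u64),
    # event_duration (u32), id (u32), all big-endian.
    body += struct.pack(">IQII", timescale, presentation_time,
                        duration, event_id)
    body += scheme_id_uri.encode("utf-8") + b"\x00"  # null-terminated strings
    body += value.encode("utf-8") + b"\x00"
    body += message_data
    # Box header: total size (u32) then the four-character type 'emsg'.
    return struct.pack(">I", 8 + len(body)) + b"emsg" + body

# Assumed scheme URI and placeholder payload for illustration only.
box = build_emsg_v1("https://aomedia.org/emsg/ID3", "1",
                    timescale=90000, presentation_time=900000,
                    duration=0, event_id=1,
                    message_data=b"ID3...placeholder...")
print(len(box), "bytes")
```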

There are two types of timed metadata relevant for supply chains: in-band and out-of-band. ID3 metadata is in-band, meaning it is distributed within the same container as the payload content, for example in the TS (Transport Stream) containers of an HLS stream, or in one of these Event Message Boxes in CMAF containers.

But timed metadata can also be sent out of band, outside the media container, for example in the manifest of an MPEG-DASH stream.
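For the out-of-band case, here is a minimal sketch of how a player or workflow component might read such events from a DASH manifest, using the EventStream and Event elements defined in ISO/IEC 23009-1. The manifest snippet and its scheme URI are hypothetical:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical DASH manifest carrying one out-of-band event
# (e.g. an ad break) inside a Period-level EventStream.
MPD = """<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011">
  <Period>
    <EventStream schemeIdUri="urn:example:ad-breaks:2024" timescale="90000">
      <Event presentationTime="900000" duration="2700000" id="1"/>
    </EventStream>
  </Period>
</MPD>"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}
root = ET.fromstring(MPD)
for stream in root.findall(".//dash:EventStream", NS):
    timescale = int(stream.get("timescale", "1"))
    for event in stream.findall("dash:Event", NS):
        # Convert the event's start time from timescale units to seconds.
        start = int(event.get("presentationTime", "0")) / timescale
        print(f"event {event.get('id')} at {start:.1f}s "
              f"(scheme {stream.get('schemeIdUri')})")
```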

Whichever is employed, timed metadata adds further complexity to the workflow and as a result is only implemented where considered necessary. For ad insertion, as well as overlaying of graphics, timing and synchronization issues can be mitigated by performing the operations on the server rather than the client side.

One common thread running through all implementations of analytics within media supply chains is growing automation, both in the extraction of data and metadata from content and in its application. This changes the role of humans in the supply chain to overseers of the processes and ultimate arbiters of what is important. Machines may now execute functions, some of which rely on calculations of probability, but it is up to humans to set confidence thresholds and, at a higher level, to determine evolving strategy for content production, distribution and revenue generation.
