MAM is Dead. Long Live Media Logistics—Part 3

In the third and final part of BroadcastBridge’s MAM feature, we contend that MAM as we’ve known it is dead, and that today’s broadcasters and content delivery firms want a media logistics solution encompassing all ingest, production, distribution and archive, with rich metadata including rights. If so, are the tools in most MAMs capable of ‘orchestrating’ all of these assets?

Here are the comments of Tony Taylor, CEO of TMD.

TT: Many MAM solutions have been designed around a siloed approach. This has been typical of the way software has been developed in the broadcast industry for many years. Even now I find it incredible when I hear some of the stories of MAM implementations that have taken no account of joining up the business of media across organisations.

That joining up has to start with the metadata. The successful media businesses are those who realise the value of the metadata which exists alongside the content, and implement a MAM solution that uses it to the fullest extent possible.

There can be no argument that the future will be around file-based workflows in data centre environments. This depends upon metadata: acting on it, reacting to it and enriching it as it passes between and through facilities. The protection and enrichment of metadata has always been at the heart of any asset management system worth the name, and today it is the only logical place to put the workflow orchestration layer.

If workflow orchestration is about drawing on and adding to metadata, why would you even consider putting orchestration in a separate system? It has to live in the system which is charged with holding the metadata.

Content preparation and delivery firms are required to deliver assets to an ever increasing variety of platforms. How have manufacturers helped content companies gear up for life in a multi-platform world?

TT: You have to think in terms of layers. At the bottom is the hardware: the servers, the encoders and transcoders, and the content delivery networks. Above that is a control layer, which tells the hardware what to do with each piece of content.

Above that is the business layer. This is where executives look at the economics of the operation and make commercial decisions. In a modern media enterprise, these executives should be able to make decisions based on purely commercial considerations, not what the technology allows them to do.

The middle layer is the asset and workflow management. Its rich metadata captures all the information on the content: what rights are available; when and where it can be shown; what content needs to take priority through the encode farms; and more. Most importantly, the asset and workflow management system should both be controlling the hardware at the bottom and reporting and responding to the business systems above it.

Put simply, a CEO should be able to look at one screen – familiar to him or her because it is in the enterprise management layer – and make a decision to, say, put a particular programme on iTunes. That decision should pass automatically to the workflow management system which will draw on the technical metadata to determine precisely which processes are required, and implement them at the right time, again fully automatically.
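The cascade described above — a commercial decision translated automatically into technical processes by comparing asset metadata against a platform's requirements — can be sketched in a few lines. This is an illustrative sketch only; the platform profile, field names and step names are hypothetical, not TMD's actual schema.

```python
# Hypothetical delivery profile for a platform; all names are illustrative.
ITUNES_PROFILE = {"platform": "iTunes", "codec": "prores", "max_height": 1080}

def plan_delivery(asset, profile):
    """Derive the processing steps needed to fulfil a business decision
    by comparing the asset's technical metadata against the target profile."""
    steps = []
    if asset["codec"] != profile["codec"]:
        steps.append(("transcode", profile["codec"]))
    if asset["height"] > profile["max_height"]:
        steps.append(("downscale", profile["max_height"]))
    steps.append(("deliver", profile["platform"]))
    return steps

# A UHD master needs both a transcode and a downscale before delivery.
asset = {"id": "PRG-001", "codec": "xdcam", "height": 2160}
print(plan_delivery(asset, ITUNES_PROFILE))
```

The point of the sketch is that no human re-enters technical detail: the decision ("put this on iTunes") plus the metadata is enough for the workflow layer to derive every step.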

What are the tools to create, deliver and store files and metadata for broadcast, VoD, mobile and web in one workflow?

TT: The very simple answer to that is a rich metadata schema. If the asset and workflow management system knows all there is to know about the content, from rights to resolution, then it can command whatever other equipment is around to make all these things happen.
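To make the "from rights to resolution" idea concrete, here is a minimal sketch of a rich-metadata record carrying both technical and rights fields, with a check the orchestration layer could run before starting any delivery workflow. The record shape and field names are assumptions for illustration, not a published schema.

```python
from datetime import date

# Hypothetical asset record: technical and rights metadata side by side.
asset = {
    "id": "PRG-001",
    "title": "Example Programme",
    "resolution": "1920x1080",
    "rights": [
        {"platform": "vod", "territory": "UK",
         "start": date(2024, 1, 1), "end": date(2025, 1, 1)},
    ],
}

def cleared_for(asset, platform, territory, on_date):
    """Return True if any rights window covers this platform,
    territory and date -- checked before any workflow is triggered."""
    return any(
        r["platform"] == platform and r["territory"] == territory
        and r["start"] <= on_date <= r["end"]
        for r in asset["rights"]
    )
```

With the rights expressed as data rather than tribal knowledge, the same check serves broadcast, VoD, mobile and web from one workflow environment.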

It is, frankly, ridiculous to think that the media industry can think about multi-platform delivery in anything other than a single workflow environment. Conceptually, you are delivering your content to your audience. It is one concept, so how can it be anything other than one workflow environment?

There are many tools that exist to achieve this, from editors to transcoders. But the primary tool to ensure efficient automated media business process management is content intelligence, relying on the metadata. There is no need to compromise if you use the intelligence inherently encapsulated in the metadata and content.

How important is the ability to integrate tools from a range of vendors?

TT: Broadcast engineers have always chosen best of breed solutions: the right set of functionality and performance for a specific installation. Do we really think anyone wants to change that?

However, as we move into the IT-centric and increasingly the cloud era, we have to find ways to maintain and simplify that choice. One of the biggest challenges is scaling services up and down to cater for peaks and troughs in volumes, as well as introducing new technologies and services. At TMD we have designed, integrated and implemented a platform called UMS – unified media services – which is a simple approach to service-oriented architectures that enables broadcast and media organisations to cost-effectively integrate third-party technologies.

There is of course the FIMS standard as a good open foundation, but this does not answer all of the needs of the current broadcast customer. So UMS provides a service bus to support integrations, which includes FIMS, proprietary APIs and other methods to decouple the technology from the operations, allowing users to choose best of breed hardware yet still operate it from automated, metadata-driven workflow orchestration.
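The decoupling a service bus provides — workflows requesting a capability rather than driving a specific vendor's box — can be illustrated generically. This sketch is not TMD's UMS API or the FIMS interfaces; it only shows the pattern: adapters register against capability names, and the orchestration layer never needs to know which vendor sits behind them.

```python
# Generic service-bus sketch: capabilities map to vendor adapters,
# so workflows stay vendor-neutral. Names here are illustrative only.

class ServiceBus:
    def __init__(self):
        self._adapters = {}

    def register(self, capability, adapter):
        """Plug in a vendor adapter under a capability name."""
        self._adapters[capability] = adapter

    def request(self, capability, **params):
        """Route a workflow request to whichever adapter provides it."""
        if capability not in self._adapters:
            raise LookupError(f"no adapter registered for {capability!r}")
        return self._adapters[capability](**params)

bus = ServiceBus()
# Swapping transcoder vendors means re-registering one adapter;
# the metadata-driven workflows calling "transcode" are untouched.
bus.register("transcode", lambda src, profile: f"{src} -> {profile}")
print(bus.request("transcode", src="PRG-001.mxf", profile="h264"))
```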

Is it best to adopt a single system or opt for a modular workflow?

TT: It is best to implement a system that fulfils the real commercial needs of the media company. In some cases that can be done in a one-stop shop solution. In most cases, I suspect, it will best be served by components from a number of top vendors, brought together under a metadata-driven environment. Either way, the question should never be “who do I buy this from?” but “what do I need to make money?”. It has to be looked at from the business perspective and not simply the technology preference of an engineering or IT department.

TMD's Tony Taylor

