The changing landscape of Command and Control

The transition to an IP architecture has created significant changes in command and control and in how systems are monitored and managed. Command and control is the all-encompassing automated set of processes that controls the acquisition, file movement, handling and delivery of media. Monitoring is more than a set of scopes and meters: dashboards and browsers provide the system monitoring tools to manage the handling and quality control of media and metadata across the entire facility.

Automation was the system that master control used to integrate the traffic system with the playout system. The automation controls all the program source devices, such as tape machines and servers, plus routers and master control switchers, to originate programming following the instructions of the traffic schedule. As the industry has moved to multi-channel delivery, the requirements for automation systems have become more complicated: the automation now receives multiple instruction sets (traffic schedules and playlists) and issues multiple commands to the various devices it controls to play out content. And while automation has evolved to support this, it is now only one element in the overall control system of an IP broadcast center.
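To make that concrete, here is a minimal sketch, in Python, of how a single playlist entry might be translated into the cue, play and take commands an automation system issues. The device names, pre-roll time and command strings are illustrative assumptions, not any vendor's protocol.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical playlist event, as an automation system might receive it
# from traffic: an on-air time, a source device and the content to play.
@dataclass
class PlaylistEvent:
    on_air: datetime
    device: str      # e.g. "SERVER-1", "VTR-2" (illustrative names)
    clip_id: str

def commands_for(event: PlaylistEvent) -> list:
    """Translate one schedule entry into the device commands the
    automation must issue: cue ahead of time, then play and take."""
    preroll = event.on_air - timedelta(seconds=5)   # assumed pre-roll
    return [
        f"{preroll.isoformat()} {event.device} CUE {event.clip_id}",
        f"{event.on_air.isoformat()} {event.device} PLAY",
        f"{event.on_air.isoformat()} MC-SWITCHER TAKE {event.device}",
    ]

playlist = [
    PlaylistEvent(datetime(2024, 1, 1, 18, 0, 0), "SERVER-1", "NEWS-OPEN"),
    PlaylistEvent(datetime(2024, 1, 1, 18, 0, 30), "SERVER-2", "PROMO-0042"),
]

for event in playlist:
    for command in commands_for(event):
        print(command)
```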

Editing systems, which control source machines, switchers and mixers, are one of the primary examples of command and control. The editing system issues a set of commands to the source device each time an element is selected, and a different set of commands when the finished program has to be rendered for each delivery format.

Command and control has evolved from serial RS-232, RS-422 and RS-485 with GPI/O relay closures to an IP layer in the stream- and file-based architecture that also handles all the management and movement of media.
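As a rough illustration of what that shift means, compare the single-bit nature of a GPI relay closure with the structured, routable message an IP command can carry. This is a generic sketch; the field names and device name are assumptions, not a specific control protocol.

```python
import json

# A GPI trigger is effectively one bit on a wire: a relay closes and the
# downstream device does whatever it was hard-wired to do.
gpi_trigger = {"gpi_input": 3, "state": "closed"}

# An IP command can carry the same intent plus context: which device,
# which action, which content, and an ID so the response can be matched.
ip_command = {
    "msg_id": "a1b2c3",
    "target": "playout-server-1",   # hypothetical device name
    "action": "cue",
    "payload": {"clip_id": "PROMO-0042", "channel": 2},
}

print(json.dumps(ip_command, indent=2))
```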

IP transport layers (diagram): Media, Metadata, Communication, Command and Control.

This shows the layered topology in the IP environment and how command and control is one layer within a single IP transport stream that carries media, metadata and communication in addition to command and control.
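One way to picture those layers sharing a single IP transport is as separate flows on the same network. The sketch below is purely illustrative; the multicast groups and ports are assumptions, not a published addressing plan.

```python
from dataclasses import dataclass

# Hypothetical sketch of the four traffic classes that share the same
# IP transport in the layered topology described above.
@dataclass
class TransportFlow:
    layer: str
    multicast_group: str   # assumed addresses, for illustration only
    port: int
    description: str

flows = [
    TransportFlow("media", "239.10.10.1", 5004, "video and audio essence"),
    TransportFlow("metadata", "239.10.20.1", 5005, "timing and descriptive data"),
    TransportFlow("communication", "239.10.30.1", 5006, "intercom / talkback"),
    TransportFlow("command-and-control", "239.10.40.1", 5007, "device control messages"),
]

for f in flows:
    print(f"{f.layer:22s} {f.multicast_group}:{f.port}  {f.description}")
```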

Stream- and file-based technologies and workflows have introduced new requirements for broader command and control systems: systems that take control of the entire media lifecycle, from the time of concept to the point of distribution and beyond, with all the processes along the way.

In the file-based broadcast and production environment, every process needs control. Ingest devices need to know when to start recording, plus the format profiles and the destination where the media and metadata should be placed once they are created. The production and media management systems need to be notified that the file is ready for use. A media handling process controls the movement of the files across the different business and production units and into different storage areas for production, media management, archive and delivery.
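As a sketch of the kind of information the command and control layer has to hand to an ingest device, and of the notification sent once the file exists, consider the following. The field names, paths and format profile are illustrative assumptions, not any particular MAM or ingest API.

```python
# Hypothetical ingest job descriptor handed to an ingest device before
# recording starts. All values are illustrative.
ingest_job = {
    "job_id": "ING-2024-0157",
    "start": "2024-01-01T18:00:00Z",
    "duration_sec": 3600,
    "format_profile": "XDCAM-HD422-1080i59.94",
    "destination": "/media/production/incoming/",
    "proxy_destination": "/media/proxy/incoming/",
    "metadata_sidecar": True,
}

def on_ingest_complete(job: dict, file_path: str) -> dict:
    """Build the notification that tells production and media management
    that the file is ready for use."""
    return {
        "event": "ingest.complete",
        "job_id": job["job_id"],
        "file": file_path,
        "register_with": ["production-storage", "media-asset-manager"],
    }

print(on_ingest_complete(ingest_job, "/media/production/incoming/ING-2024-0157.mxf"))
```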

Every aspect of the file-based environment is managed by the command and control layer, and each process and device is typically controlled by an automated process. Even when a process is manual, it is handled through a dashboard or a separately integrated control system that shows the status and movement of files and streams within the entire media management environment.

Orchestration is the new term for an all-encompassing command and control management system; these systems are also called conductors. A conductor provides a unified dashboard to manage the command and control system and hosts the rules and policies that manage all the devices and processes.

The conductor dashboard is the command and control center. It shows all the active processes, device status and where the files are in the system. The conductor controls the ingest processes and devices, handles media movement, interfaces with media management and controls the master control automation system for delivery. Following the schedules, rules and policies, the conductor triggers processes and then tracks their progress, managing priorities and ensuring continuity of the flow of media and metadata.
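A minimal sketch of that rules-and-policies idea, assuming a simple table that maps each completed step to the next process to trigger while the conductor tracks the asset's progress. The step names and transitions are hypothetical.

```python
from dataclasses import dataclass, field

# Rules map an asset's current state to the next process to trigger.
RULES = {
    "ingested":   "generate_proxy",
    "proxied":    "qc_check",
    "qc_passed":  "move_to_production_storage",
    "in_storage": "register_with_mam",
}

@dataclass
class Asset:
    asset_id: str
    state: str = "ingested"
    history: list = field(default_factory=list)

def step(asset: Asset) -> None:
    """Apply the rule for the asset's current state: trigger the next
    process, record it, and advance the state when it completes."""
    next_process = RULES.get(asset.state)
    if next_process is None:
        return                      # end of the workflow for this asset
    asset.history.append(next_process)
    # A real conductor would dispatch this to a device or service and
    # wait for a completion event; here we simply advance the state.
    transitions = {
        "generate_proxy": "proxied",
        "qc_check": "qc_passed",
        "move_to_production_storage": "in_storage",
        "register_with_mam": "registered",
    }
    asset.state = transitions[next_process]

asset = Asset("CLIP-0099")
while asset.state != "registered":
    step(asset)
print(asset.asset_id, asset.state, asset.history)
```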

On a large and unique project I am involved in, there are as many as 17 multi-camera production spaces in operation at the same time daily. Each space produces multi-hour live programming. In each space is a multi-camera robotic system driven by a microphone system. Each time a microphone opens, it triggers a camera preset that is recalled and stabilized before being switched to air. The program feeds are all encoded to file, accessed for editing and streamed live to the web in real time. Additionally, metadata is recorded for all productions.
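A simplified sketch of that microphone-to-camera logic, assuming a mic-to-preset mapping and a fixed stabilization delay. The mapping, the timing and the single "take" step are illustrative assumptions.

```python
import time

# Hypothetical mapping of microphone number to (camera, preset number).
MIC_TO_PRESET = {1: ("CAM-1", 10), 2: ("CAM-2", 14), 3: ("CAM-3", 21)}
STABILIZE_SEC = 1.5   # assumed settling time for the robotic move

def on_mic_open(mic_number: int) -> None:
    """When a microphone opens: recall its camera preset, wait for the
    robotic head to stabilize, then switch that camera to program."""
    camera, preset = MIC_TO_PRESET[mic_number]
    print(f"recall preset {preset} on {camera}")
    time.sleep(STABILIZE_SEC)
    print(f"switcher: take {camera} to program")

on_mic_open(2)
```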

There are numerous processes involved prior to the production and multiple concurrent processes during and after the live events. It is virtually impossible for an operator to manage and control this volume of production manually. The only way is through command and control and multiple automation processes.

A composite schedule for all the production spaces is delivered as XML to the Media Asset Management system. This is parsed into individual schedules for each encoding chain. The same schedule is passed to the microphone management system so it knows when each production will begin. A separate XML is delivered to the microphone management system so it knows who is assigned to each microphone. Both of these XML files are entered as metadata for each production before it starts. As the production starts, the robotic controller begins tracking the microphones and both systems write metadata. In the Network Operation Center, technicians monitor the command and control systems that enable the production while monitoring the signal quality of the program output of each space as it passes through a router to the encoder.
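As an illustration of that first step, here is a sketch of parsing a composite XML schedule and grouping its events per encoding chain. The element and attribute names are assumptions for illustration; the real schema will differ.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical composite schedule covering several production spaces.
COMPOSITE_XML = """
<schedule>
  <event space="Studio-01" chain="ENC-01" start="2024-01-01T09:00:00" title="Morning Session"/>
  <event space="Studio-02" chain="ENC-02" start="2024-01-01T09:00:00" title="Panel A"/>
  <event space="Studio-01" chain="ENC-01" start="2024-01-01T13:00:00" title="Afternoon Session"/>
</schedule>
"""

def split_by_chain(xml_text: str) -> dict:
    """Parse the composite schedule and group events per encoding chain."""
    per_chain = defaultdict(list)
    for event in ET.fromstring(xml_text).findall("event"):
        per_chain[event.get("chain")].append(
            {"space": event.get("space"),
             "start": event.get("start"),
             "title": event.get("title")}
        )
    return per_chain

for chain, events in split_by_chain(COMPOSITE_XML).items():
    print(chain, events)
```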

The file is moved from the encoder to the storage environment, where production personnel can browse and access it for real-time editing. As edited packages are completed, they are registered with the asset manager and delivered to the various distribution platforms. There are more processes happening during the live production and many more downstream once the file is encoded.
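A sketch of that downstream step, assuming a delivery folder that is scanned for completed packages, a stand-in registration call and a fixed list of platforms. The paths, platform names and functions are hypothetical.

```python
from pathlib import Path

DELIVERY_DIR = Path("/media/delivery")   # assumed delivery folder
PLATFORMS = ["web", "vod", "archive"]    # assumed distribution targets

def register_asset(file_path: Path) -> str:
    """Stand-in for a MAM registration call; returns an asset id."""
    return f"ASSET-{file_path.stem}"

def process_completed_packages() -> None:
    """Register each completed package and queue it for delivery."""
    for package in sorted(DELIVERY_DIR.glob("*.mxf")):
        asset_id = register_asset(package)
        for platform in PLATFORMS:
            print(f"queue {asset_id} ({package.name}) for {platform}")

if __name__ == "__main__":
    process_completed_packages()
```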

This is a clear demonstration of the need for command and control throughout the entire media lifecycle. It would be difficult, if not impossible, for an operator to manually manage and perform these operations in a timely manner that assures the program gets to air.

In the stream- and file-based ecosystem, processes get triggered, data moves between systems, devices are controlled and media moves throughout the environment, all while metadata drives the media management.

Command and control is a core process within the IP architecture, and dashboards with centralized management and monitoring are essential tools for the engineer.
