Changing Architecture In The New IP World

The Cloud is the future of live TV production.

Much of the valuable information engineers and technicians once depended on gathering at broadcast TV trade shows and technical sessions has migrated to various forms of webcast. Some of today’s hottest industry news is what is being said in these webcasts. TV engineering presentations are easy to find, but they require watching in real time, plus extra time to replay segments that need further review for clearer understanding.

The Broadcast Bridge knows busy TV engineers and technicians often have more demanding issues at hand and perhaps no time to view and absorb information as fast as it streams by in a webcast. One recent live webcast worthy of an abridged review was the TAG Video Systems presentation about changing facility architecture on the fly. Relevant and revealing, it was presented by TAG’s Chief Architect Paul Briscoe and Pini Maron, TAG VP of Special Services.

Briscoe set the tone, saying, “The world is changing, and live TV has gone through a vast change. Live news and talk shows have moved to remote production, instantly, overnight, in some very haphazard and terrifying ways, and it’s working out.” He went on to say that remote production “is indeed the future of live production and future of how we make media. It’s time that we talk about this stuff.” The following is an edited transcription of the highlights of the discussion that followed.

Consumer computer products are becoming interfaces for live production because we can put GUIs on them that do the job.

REMI

The live production structures we know are typically built into the infrastructure of a TV facility. Live production is a very important part of what broadcasters do, and it has been greatly affected by the pandemic. Most TV facilities are hard-wired, usually built around routing switchers, with shared control rooms where numerous people communicate and work together under time pressure to make a live TV show.

Typical TV station gear has little external connectivity. Sources such as satellite, microwave and internet feeds are usually hard-wired and appear as if they originated in the building. Other than that, there is little dynamic connectivity to the outside world, and external collaboration becomes logistically and technically difficult.

Remote production is also called REMote Integration (REMI). By either name, it is all about remote collaboration and production decision-making: it frees a live crew from having to work in the same room. Today’s remote live production is glued together with consumer tools, broadcast tools, ingenuity and sometimes chewing gum. It is frequently not a seamless solution and is seldom scalable.

Three Pillars

Pini Maron followed up by saying the first big thing that came after IP was solutions based on software applications running on commonly available COTS hardware. This trend has brought tremendous technical and commercial benefits, and it has also led the industry to the “Three Pillars” of new facility IP architecture: Flexibility, Agility and Scalability.

The idea of building a facility without racks of dedicated hardware, and without being tied to a specific location, can make a huge difference. Developing trends indicate that dedicated hardware solutions are going away. Richard Friedel, EVP of Engineering, Operations and Technology at FOX, recently said that the long-term effects of the pandemic will reshape the industry, starting with the swift adoption of Cloud and IP infrastructures. He went on to say, “We’re going to make this part of our DNA core. Forget what you know. We are going somewhere new and this is what will stay and dominate our industry.” Friedel also recently introduced the use of clear plastic wrap to cover control panel surfaces at FOX to slow the pandemic’s spread among operators.

Replacing the SDI router with an IP switch doesn’t advance production technology.

Moving to IP

Briscoe said that typical SDI live production architecture usually consists of a core SDI router, some players and cameras, and remote incoming content feeds from the internet, all of them arriving through hardware-based devices that turn the various internet formats into SDI. Additionally, there are clip and replay systems, and CG/GFX systems, that also connect to the production switcher and the SDI router. Modern systems use multiviewers to drive videowalls for various operational monitoring.

When a facility moves from SDI to IP, the difference is that the SDI router is replaced with an IP switch fabric, with gateways supporting the SDI equipment that is still in operation. The production workflow remains essentially the same, but the swap by itself does nothing to advance the production technologies the facility is already using.
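To make that contrast concrete, here is a minimal conceptual sketch, in Python, of how a “route” changes meaning in the move from SDI to IP: a crosspoint hard-switches one input to one output, while in an IP fabric a route is simply a receiver subscribing to a sender’s multicast group. The device names and addresses are invented for illustration; real systems manage this through control layers such as SMPTE ST 2110 and NMOS, not code like this.

```python
# Conceptual sketch only: how an SDI crosspoint "route" maps to an IP
# fabric "subscription". Names and multicast addresses are illustrative.

class SdiRouter:
    """Classic router: each output is hard-switched to exactly one input."""
    def __init__(self):
        self.crosspoints = {}          # output -> input

    def route(self, source, destination):
        self.crosspoints[destination] = source


class IpFabric:
    """IP fabric: each sender owns a multicast group; receivers join it."""
    def __init__(self):
        self.senders = {}              # source name -> multicast group
        self.subscriptions = {}        # receiver -> multicast group

    def register_sender(self, source, group):
        self.senders[source] = group

    def route(self, source, destination):
        # "Routing" is now just a receiver joining the sender's group,
        # so any number of receivers can take the same source at once.
        self.subscriptions[destination] = self.senders[source]


fabric = IpFabric()
fabric.register_sender("CAM-1", "239.1.1.10")    # illustrative address
fabric.route("CAM-1", "ProductionSwitcher-In1")
fabric.route("CAM-1", "Multiviewer-Tile3")       # same source, second taker
```

The point of the sketch is that the IP version carries no notion of a physical tie-line: adding another destination costs a subscription, not a cable.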

Cloud data centers advance live TV production technology. Most TV people are familiar with a central equipment room filled with racks of dedicated broadcast gear from familiar manufacturers. In data center IP live production architecture, everything is based on some type of computer, server or engine running pure software, sometimes from familiar broadcast hardware manufacturers and increasingly from newer names.

The data center does not need to be on-premises. A shared data center located microseconds away may be more than sufficient. Sometimes the Cloud may be sufficient. The new COTS equipment resides in the data center while all the operational positions remain available in the studio, at home or in some other non-control-room location. IP provides the opportunity and the means to gain flexibility, agility and scalability.

License Serving

Maron commented that in the hardware world everything is rigid and technology licensing is fixed to a specific device. Doing something similar on another device requires another license. The TAG Zero Friction concept was developed on the idea that licenses can be moved around and ‘served’ as needed rather than tied to a specific piece of hardware or function. The user acquires a pool of licenses and assigns them to applications instead of specific hardware. Sometimes that means enabling a function that is needed occasionally but not routinely; other times it means supporting multi-use facilities. The ability to move licenses between different workflows, and even between applications, enables extreme flexibility and maximizes resource usage.

Licenses can be shared as needed by equipment in the same room or in different time zones, reducing the need for duplicate licenses. A single pool can easily be shared between distant facilities, in New York and Paris for example, and resources in either data center can be used by any part of the system in either location.
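As an illustration of the license-serving idea, here is a minimal sketch of a floating license pool, assuming a simple acquire/release model. This is not TAG’s actual licensing API; the class and application names are hypothetical.

```python
# Minimal sketch of "license serving": a shared pool of floating licenses
# checked out by applications on demand rather than pinned to hardware.
# Hypothetical illustration of the concept, not a real vendor API.

class LicensePool:
    def __init__(self, total):
        self.total = total
        self.checked_out = {}          # application -> licenses held

    def available(self):
        return self.total - sum(self.checked_out.values())

    def acquire(self, application, count=1):
        if count > self.available():
            raise RuntimeError("no free licenses in the pool")
        self.checked_out[application] = self.checked_out.get(application, 0) + count

    def release(self, application, count=1):
        held = self.checked_out.get(application, 0)
        self.checked_out[application] = max(0, held - count)


pool = LicensePool(total=10)
pool.acquire("NewYork-Multiviewer", 6)   # daytime shift in New York
pool.release("NewYork-Multiviewer", 6)   # shift ends...
pool.acquire("Paris-Multiviewer", 6)     # ...and Paris reuses the same pool
```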

“This is only the beginning,” Maron said. “Who would have thought Zoom, Skype or Webex could be central to broadcast production this time last year?”

People wearing the headsets work together as if they are all in the same room but can be scattered across the globe working from home.

Into the Cloud

Once a facility has moved to a distributed IP control room running broadcast applications, it can look at moving them into the cloud. Once in the cloud, the domain of the production changes. For example, the live cameras may be in Paris, but everybody is interconnected over the web through an intercom. All the same functions exist, but the means of operating them have changed. There are few, if any, dedicated control panels. A character generator may be a laptop. The prompter may be on a tablet or iPad. The live reporter may be on the telephone or on one of the many internet-based calling apps. Nobody must be in any specific place anymore, which is the pandemic model most are working under today.

Cloud production is the model for the future. Who needs a brick-and-mortar building with a fancy control room and people coming in on shifts? Why travel to a crowded location when it is no longer necessary? People can work from home. Moving everything to a Cloud data center opens the door to casual, cost-reduced operations with new flexibility, agility and scalability.

Web feeds are a natural fit in the Cloud because they are internet-native. A Zoom feed, for example, doesn’t need a hardware device to bring it into the system and turn it into SDI or broadcast IP, because it is already native IP. The same holds true for broadcasting out. The Cloud leads naturally to OTT and linear transmission; a local OTA transmitter’s STL can simply be fed from the Cloud. Playout already lives in the Cloud. There is no reason the Cloud cannot become the go-forward basis for live production.

Human Interface

Separate from the technology under the hood is the evolution of human interfaces. Broadcasters are used to control rooms full of special, custom, bespoke equipment that is no longer necessary nor appropriate. Even technical directors are okay switching on something besides the ubiquitous high-travel, big-button, big-panel production switcher control surface. Everybody is used to touchscreen interfaces.

While control interfaces are moving to laptops and tablets with small screens, the same is happening with monitoring and viewing. Multiviewers can be displayed anywhere on any device, so the picture needed to do the job remotely can be delivered as easily as it is to a screen in a local studio.

Bandwidth has been increasing since the dawn of computing, and the trend shows no signs of slowing. Within the Cloud, bandwidth typically is not an issue. Today’s issue is getting to and from the Cloud. Gigabit local service is not universal. There are still some things that can’t be done quite as directly as we want, but new solutions are coming and they’re coming fast.

In the meantime, broadcasters are doing what they always do best: Using ingenuity to turn an adverse situation into something new and exciting, and making a consumer product like Zoom look good on TV. Well done, everyone.
