Live Cloud-Based Remote Production Slowly Gaining Traction

Although latency and resource coordination continue to challenge those considering cloud-based remote live production, the distributed architecture model is steadily gaining traction as a cost-effective alternative to hardware-based, on-premises projects. To date this IT-centric architecture has not been deployed for high-profile productions like the Super Bowl or World Cup, but remote IP-video contribution, production and distribution has allowed second-tier sporting events to be televised globally that, due to the cost and on-site demands of traditional production methods, might otherwise not have been covered.

The Switch, a live production and distribution company based in New York, offers MIMiC, a suite of on-demand cloud services for live production, and has seen an increase in demand since the beginning of the pandemic. Leveraging UK-based Tellyo’s Stream Studio and Tellyo Pro as core elements, the cloud-based TV Production-as-a-Service offering also provides clipping and real-time editing capabilities. The global IP delivery service allows sports broadcasters, streaming services and other rights holders to take video feeds from anywhere in the world and simultaneously deliver them to multiple destinations – up to hundreds – via the internet and The Switch’s private network.

“We have a number of customers who use the MIMiC Production platform for streaming studio cameras to the cloud and producing the show in it,” said Robert Szabo-Rowe, Senior Vice President of Engineering and Product Management at The Switch. “Certainly the trend is increasing and, more and more, we are seeing this type of workflow being used for lower-cost shows.”

He said there are two main drivers for this use case. The first is the ability to operate with an on-demand, OpEx model that doesn’t require capital investment in a traditional control room; the second, for those with existing control rooms, is a desire to free up those facilities for more complex productions by using cloud-based production for the shows where it makes better sense.

Using MIMiC, camera feeds are encoded using SRT or RIST encoders and delivered via the internet into the public cloud-based production system. Audio is embedded in the camera feeds to ensure that there is no audio/video misalignment. The camera and audio feeds are then available to cloud-based production switchers and audio mixers, where the show is produced alongside file-based video material, graphics and remote contributors or guests. The final output of the show is streamed live to its various destinations, be they traditional broadcast outlets or social media platforms.
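
The article does not detail the encoder configuration, but as a rough sketch of that contribution leg, the snippet below pushes a feed to a cloud SRT ingest point by wrapping ffmpeg from Python (assuming an ffmpeg build with libsrt). The hostname, port, passphrase and encoding settings are placeholders, not The Switch’s or Tellyo’s actual parameters.

```python
# A minimal sketch of the contribution leg: push a feed to a cloud SRT ingest
# point by wrapping ffmpeg from Python. Assumes an ffmpeg build with libsrt;
# the hostname, port, passphrase and encode settings are placeholders.
import subprocess

INGEST_URL = (
    "srt://cloud-ingest.example.com:9000"   # hypothetical ingest endpoint
    "?mode=caller"
    "&latency=200000"                       # recovery window; ffmpeg's libsrt option is in microseconds
    "&passphrase=change-me"                 # SRT can encrypt the stream with a shared passphrase
)

cmd = [
    "ffmpeg",
    "-re",                                  # pace a file like a live source (a real rig would use a capture device)
    "-i", "camera_feed.mp4",                # placeholder source
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "6M",
    "-c:a", "aac", "-b:a", "192k",          # audio stays embedded with the video, as described above
    "-f", "mpegts",                         # SRT contribution is commonly carried as MPEG-TS
    INGEST_URL,
]

subprocess.run(cmd, check=True)
```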

In addition, the crew producing the shows is generally distributed across different locations, with everyone receiving web-based multi-viewer feeds and control surfaces for switching, audio mixing and replay. The whole crew is connected by a cloud-based intercom system, including talent in the studio and elsewhere.

As the foundation of the MIMiC service, Tellyo Stream Studio supports up to 24 live streams that can be composed within eight scenes (acting as Mix Effect, or M/E, units), enabling multi-layered video, audio and live graphics that are fully customizable through resizing, cropping, repositioning and transforming. Stream Studio also offers direct editing at the scene (M/E) level as well as on a specific source.
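
Tellyo’s actual API is not described here, but a purely illustrative data model helps picture a scene (M/E) built from layered, repositionable sources. All names and structures below are assumptions; only the 24-stream and eight-scene limits come from the text.

```python
# Purely illustrative: a toy data model for the scene (M/E) concept described
# above, i.e. layered sources that can be repositioned, resized and cropped.
# This is not Tellyo's API; only the 24-stream and 8-scene limits come from the text.
from dataclasses import dataclass, field
from typing import List, Tuple

MAX_SOURCES = 24   # Stream Studio supports up to 24 live streams
MAX_SCENES = 8     # ...composed within eight scenes (M/E units)

@dataclass
class Layer:
    source_id: int                     # index into the pool of live feeds, files or graphics
    x: int = 0                         # reposition
    y: int = 0
    width: int = 1920                  # resize
    height: int = 1080
    crop: Tuple[int, int, int, int] = (0, 0, 0, 0)   # crop (left, top, right, bottom) in pixels

@dataclass
class Scene:
    name: str
    layers: List[Layer] = field(default_factory=list)   # bottom-to-top compositing order

# Example: a scene with a full-frame camera and a picture-in-picture remote guest
program = Scene("program", layers=[
    Layer(source_id=1),                                          # main camera, full frame
    Layer(source_id=2, x=1280, y=60, width=600, height=338),     # PiP guest window
])

scenes = [program]
assert len(scenes) <= MAX_SCENES and len(program.layers) <= MAX_SOURCES
```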

As anyone experienced in remote production (REMI) understands, latency is one of the issues that you have to work around when operating in the cloud. The fundamental cause of the latency is the use of unmanaged IP connections to deliver video to public cloud-based infrastructure. This is true for internet-based delivery of the video and also, to a lesser extent, within the cloud infrastructure through the use of shared networking.

These unmanaged IP connections require protocols, such as SRT or RIST, that allow the recovery of missing packets, said Szabo-Rowe.
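
SRT and RIST are full transport protocols, but the recovery idea Szabo-Rowe refers to is, at its core, retransmission of lost packets within a latency window. The toy sketch below illustrates only that idea (a receiver spotting gaps in sequence numbers and requesting resends); it is not the real SRT or RIST logic.

```python
# Toy illustration of the recovery idea behind SRT and RIST: the receiver
# tracks sequence numbers and asks the sender to retransmit any gaps, which
# only works if the playout buffer (the configured latency) is long enough
# to wait for the resend. Real SRT/RIST implementations are far more involved.
def find_missing(received, highest_expected):
    """Return the sequence numbers to request again (a simple NACK list)."""
    seen = set(received)
    return [seq for seq in range(highest_expected + 1) if seq not in seen]

# Packets 3 and 7 were lost on the unmanaged internet path
arrived = [0, 1, 2, 4, 5, 6, 8, 9]
print(find_missing(arrived, 9))   # -> [3, 7]; the sender resends these before playout
```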

“Although you can operate these protocols at sub-500 ms delays, typically you end up with 1-2 seconds of delay between the camera and the switcher. If you consider that the switcher interfaces and multi-viewers are also delayed through web-based delivery to the technical directors and producers, you can end up with 2-3 seconds of latency between camera and crew (and return feeds for the talent). For some productions, this isn’t difficult to handle. However, when there is a need to have very tight coordination between technical director, graphics, talent and producer, you need to make sure that everyone involved adjusts to, and is aware of, the delays.”

The real-time intercom communications allow instant voice interactions between all parties, which helps coordinate the production effectively. Determining how much latency is too much depends on the show. For some shows, 30 seconds of delay isn’t an issue. For others, where two remote parties need an interactive discussion, you need sub-100 ms latency. The Switch solves this by deploying a real-time, cloud-based audio overlay that allows for that interactivity while the video, with its embedded audio, is handled with the normal delays.
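
Using the round figures quoted above, a back-of-the-envelope budget shows why the crew can be working 2-3 seconds behind the cameras while intercom has to stay near real time; the numbers below are illustrative, not measurements.

```python
# Back-of-the-envelope latency budget using the round figures quoted above;
# illustrative numbers, not measurements.
camera_to_switcher_s = 1.5        # "typically 1-2 seconds" for SRT/RIST contribution
multiviewer_web_delivery_s = 1.0  # web-based multi-viewers and control surfaces add more delay
camera_to_crew_s = camera_to_switcher_s + multiviewer_web_delivery_s
print(f"camera to crew: ~{camera_to_crew_s:.1f} s")   # lands in the 2-3 second range quoted

# Interactive conversation needs roughly sub-100 ms round trips, which is why
# intercom audio rides a separate real-time overlay rather than the delayed video path.
intercom_target_s = 0.1
print(f"video path is ~{camera_to_crew_s / intercom_target_s:.0f}x too slow for live conversation")
```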

The Switch chose Tellyo's cloud production technology due to its flexibility and “broadcast-grade” capabilities.

Szabo-Rowe said that the biggest challenge to using a cloud-based production workflow is getting everyone involved in the production used to handling and accommodating the latency.

“Professionals in the industry are very used to operating with systems where latency is very low and it takes a little getting used to in order to operate within a system where there is more latency involved,” he said. “Those that have embraced this have made the transition with little difficulty. It is very similar to the situation where production employs remote feeds via satellite or bonded cellular – which also requires that people adopt an operating practice different from the norm.”

Of course, the advantages of a distributed architecture are numerous, with the most significant being cost – which manifests itself in several ways. Live production is well suited to on-demand infrastructure, since live production systems are rarely used continuously around the clock. The investment required to build a dedicated control room can be difficult for some organizations to justify. For those productions suited to the cloud, tapping into its flexibility, scalability and efficiency offers an excellent alternative to traditional production methods.

“A cloud-based approach can offer a lower cost that enables both more secondary productions and the creation of more content targeted at specific audiences,” said Szabo-Rowe. “Cloud-based production allows content creators to produce events for more niche audiences and to serve smaller geographic areas. This enables them to produce content that likely would otherwise not be commercially viable with more traditional production methods.”

Other advantages of cloud-based production include: the ability to scale the number of simultaneous productions without the investment and space that traditional control rooms require to meet peaks in demand; the flexibility of a distributed crew, with operators and production staff working remotely; and the ability to easily include remote guests in productions via web cameras over the internet.

The Switch’s client list includes the NBA, NHL, NFL and MLB in sports, as well as broadcasters like the BBC, CBS, Fox, NBC and Sky.

“The feedback we get from our clients is that they view their use of our MIMiC cloud services platform as a competitive advantage – they are able to do more with less,” said Szabo-Rowe.
