Current SRT Technology Perspectives, Strategies And Workflows
In layman’s terms, SRT is an open-source video transport protocol that enables the delivery of high-quality, secure, low-latency video over the public internet. In more technical terms, it compensates for jitter and bandwidth fluctuations as video is being streamed, while recovering from packet loss.
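To make the transport model a little more concrete, the sketch below shows a minimal SRT “caller” built on the open-source libsrt library. It is an illustration only: the receiver address, port, latency value and passphrase are assumed example values, and a real contribution encoder would feed the socket with a live MPEG-TS stream rather than an empty buffer.

```c
/* Minimal sketch of an SRT "caller" sending MPEG-TS data with libsrt.
 * Assumes libsrt is installed; the address, port, latency and passphrase
 * below are illustrative choices, not values from the article. */
#include <srt/srt.h>
#include <string.h>
#include <stdio.h>
#include <arpa/inet.h>

int main(void)
{
    srt_startup();

    SRTSOCKET sock = srt_create_socket();

    /* Receiver-side latency budget (ms): the buffer SRT can use to
     * retransmit lost packets before handing data to the application. */
    int latency_ms = 120;
    srt_setsockflag(sock, SRTO_RCVLATENCY, &latency_ms, sizeof latency_ms);

    /* Optional AES encryption: both ends must share the passphrase. */
    const char *passphrase = "example-passphrase"; /* illustrative only */
    srt_setsockflag(sock, SRTO_PASSPHRASE, passphrase, (int)strlen(passphrase));

    struct sockaddr_in sa;
    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port = htons(9000);                         /* example port     */
    inet_pton(AF_INET, "203.0.113.10", &sa.sin_addr);  /* example receiver */

    if (srt_connect(sock, (struct sockaddr *)&sa, sizeof sa) == SRT_ERROR) {
        fprintf(stderr, "connect failed: %s\n", srt_getlasterror_str());
        return 1;
    }

    /* Normally filled with 7 x 188-byte TS packets from an encoder. */
    char ts_chunk[1316] = {0};
    srt_sendmsg2(sock, ts_chunk, sizeof ts_chunk, NULL);

    srt_close(sock);
    srt_cleanup();
    return 0;
}
```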
SRT was certainly an industry game-changer when it was introduced by Montreal-based Haivision, and it remains a key component of the streaming revolution, helping to solve three fundamental challenges and workflows: point-to-point streaming, IP contribution, and cloud-to-cloud transport. Like many great technologies, the idea was born to fix a pre-existing problem: broadcasters wanted to send high-quality video over IP-based transport protocols, which at the time could not deliver the low-latency transport the broadcast market required. The story goes that at the IBC Show in 2012, Haivision noticed a recurring conversation topic amongst broadcasters: they needed to transport video over the public internet, but satellite and fiber were too cumbersome, too costly and took too long to provision, and were neither flexible nor scalable. It was these conversations that inspired Haivision to invent the SRT protocol.
The Strategy Behind The SRT Streaming Protocol
The strategy was simple: drive down costs and increase flexibility. There was a unique opportunity for SRT to solve many of the broadcast challenges of that time and carve out a spot as the leading contender for organizations looking to adopt and deploy this kind of technology. At the time, people didn’t want to be locked into heavy-duty satellite provisioning, with all its associated costs, when all they wished to do was backhaul video. While satellite is great for point-to-point contribution, it is costly. This created the opportunity for SRT: sports and news organizations simply wanted to get the video without having to schedule and pay for expensive satellite links, and a lower-cost, reliable alternative would enable media companies to cover more events without compromising quality.
Once the concept went live, implementing it was fairly straightforward, as the aim was to take video from one place and deliver it to another. At the time, Haivision already had high-performance encoders and decoders, so the solution only needed to connect the dots between the two over any network and get the video across as quickly and cleanly as possible between two points. They were solving the raw point-to-point challenge. However, as the SRT story unfolded alongside the broader move towards cloud computing, SRT evolved beyond that simple point-to-point challenge.
Another important development was that Haivision made SRT open source. This really opened up the technology and enabled many more organizations to take advantage of the workflows it enables. There are now more than five hundred organizations in the SRT Alliance, and that number continues to grow. As more manufacturers incorporate SRT into their products, more opportunities exist for it to be integrated into even more workflows. Just about every camera, encoder, and decoder on the market now has SRT built into its devices and apps, which allows contribution and distribution from and to any device.
Comparing Latency And Bandwidth: RTMP vs SRT
Of course, SRT is not the only underlying protocol for low-latency transmission of video over the internet; some have even spoken of “protocol wars”. Real-Time Messaging Protocol (RTMP) is one of them: a mature, well-established streaming protocol with a reputation for reliability thanks to its TCP-based packet retransmission and adjustable buffer, and it still dominates cloud contribution streaming.
To compare the end-to-end latency of SRT and RTMP over public networks, Haivision conducted a test. The setup used a still camera to capture a single image of two screens side by side, each displaying the same video frame with burnt-in timecode. One screen was connected directly to the source and the other to the decoder output, making the round-trip, end-to-end latency clearly visible. Four round-trip routes were tested: Germany to APAC Sydney, Germany to US California, Germany to US N. Virginia, and Germany to DE Frankfurt and back. In these tests SRT dominated: it was more than twice as fast, and when tested using dedicated hardware encoding and decoding equipment the difference was even more dramatic, with SRT being five to twelve times faster than RTMP.
How Is Avid Making Use Of SRT And Other Streaming Protocols?
Avid recently released MediaCentral | Stream, which enables the easy ingest of streams such as SRT and RTMP into an Avid production environment. MediaCentral | Stream runs on a virtual machine and can be cloud-hosted, bringing up to four simultaneous streams into MediaCentral | Production Management and making the feeds available for users to view, edit and repurpose on any platform. One of its key features is that it transcodes the streams on the fly into the codec used by the production team, so there is no change to the downstream workflow for editors using Avid Media Composer or the web-based MediaCentral | Cloud UX. We see growing interest in this kind of workflow, particularly from customers looking to move their workflows into a cloud-hosted or hybrid cloud environment. It is an exciting development. In addition, Avid is adding SRT output to Media Composer so that teams can collaborate more easily when they are not in the same physical space: the editor can enable SRT output from their edit client, and other members of the team can then view the output.
How Over-the-shoulder (OTS) Collaboration Can Finally Enter The Mix In Broadcast And Post Workflows
A clear and direct effect of the ongoing pandemic on our industry was that team members often had to work remotely, at considerable distances from one another. This presented the challenge of recreating “over-the-shoulder” collaboration without a centralized on-premises work environment. It also meant that clients and collaborators had limited access to the content, making it harder to review and make changes quickly. The remote working model that was put in place ended up being deployed successfully and led to investments in new technologies and capabilities that help employees work better no matter their location. With some colleagues on-premises and others working remotely, creating a setup that enables real-time collaboration among everyone was a challenge that needed to be overcome.
The fix is quite simple. If content is to be viewed by a single endpoint, encoded video from the Avid Media Composer output can be transported using the open-source SRT protocol, delivering secure, low-latency and reliable video to one SRT-enabled device. When the content needs to be viewed by multiple stakeholders, encoded video from Avid Media Composer can be routed and replicated through an SRT Gateway to reach numerous destinations at the same time, at different bit rates and in different protocols. The Haivision SRT Gateway is a scalable and secure solution for routing live video streams to one or multiple destinations, and it can run on premises or in the cloud. It’s a win-win.
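For illustration only, here is a minimal sketch of the listener/caller fan-out pattern that a gateway performs, written against the open-source libsrt C API. It is not the Haivision SRT Gateway itself and it does not transcode or change protocols; it simply accepts one incoming SRT connection and replicates each received packet to two hypothetical downstream destinations (the addresses and ports are made up).

```c
/* Fan-out sketch (not the Haivision SRT Gateway): accept one incoming SRT
 * stream in listener mode and replicate each packet to two caller-mode
 * outputs. All addresses and ports below are illustrative. */
#include <srt/srt.h>
#include <string.h>
#include <arpa/inet.h>

static SRTSOCKET connect_out(const char *ip, int port)
{
    SRTSOCKET s = srt_create_socket();
    struct sockaddr_in sa;
    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port = htons(port);
    inet_pton(AF_INET, ip, &sa.sin_addr);
    srt_connect(s, (struct sockaddr *)&sa, sizeof sa);
    return s;
}

int main(void)
{
    srt_startup();

    /* Listener socket for the contribution feed from the edit client. */
    SRTSOCKET ls = srt_create_socket();
    struct sockaddr_in sa;
    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port = htons(9000);
    sa.sin_addr.s_addr = htonl(INADDR_ANY);
    srt_bind(ls, (struct sockaddr *)&sa, sizeof sa);
    srt_listen(ls, 1);

    struct sockaddr_storage peer;
    int peerlen = sizeof peer;
    SRTSOCKET in = srt_accept(ls, (struct sockaddr *)&peer, &peerlen);

    /* Two example downstream viewers (e.g. remote producers). */
    SRTSOCKET out[2] = {
        connect_out("203.0.113.20", 9001),
        connect_out("203.0.113.21", 9002),
    };

    char buf[1500];
    for (;;) {
        int n = srt_recvmsg(in, buf, sizeof buf);
        if (n <= 0)
            break;                        /* stream ended or error */
        for (int i = 0; i < 2; i++)
            srt_sendmsg2(out[i], buf, n, NULL);
    }

    srt_close(in);
    srt_close(out[0]);
    srt_close(out[1]);
    srt_close(ls);
    srt_cleanup();
    return 0;
}
```

A production gateway adds error handling, per-destination latency and encryption settings, and re-packaging into other bit rates and protocols, but the routing principle is the same.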
What The Future Holds For SRT And The Alliance Itself
While we can’t predict the future exactly (look at the ongoing global pandemic: who would have predicted that?), it certainly looks promising for SRT. From a raw technology point of view, SRT has had several inflection points across its development. The first big one was enabling file transfer, so for some people SRT isn’t used for real-time video at all, but for fast file transfer.
Another interesting technology being released by Haivision and the alliance is called Socket Groups. This allows a connection to use multiple network paths, providing resiliency across those paths through redundancy or bonding. It is coming along rapidly, so expect more developments over the course of this year. From a workflow point of view, the major inflection point is going to be in the cloud and in cloud-based workflows. As the industry moves towards remote production, most broadcast infrastructure solution providers are trying to judge when this shift will cross a point of no return.
Another relevant avenue, albeit a surprising one, may be a big innovation shift towards the CDN space. Big questions are being asked about peer-to-peer technology and what it will look like in the future, so at this stage we’re trying to answer some tricky questions such as “can we do it again?”, “do we need to do it again?” and “what’s the most important thing to do for the industry?”. It’s certainly an exciting time for the streaming sector and our industry overall, as we look to come up with more innovative ideas and solutions that will drive our market forward and take consumer experiences to the next level.