AWS Completes Streaming Protocol Set With SRT Support

AWS (Amazon Web Services) has joined the SRT Alliance and added native support for the SRT (Secure Reliable Transport) protocol in its AWS Elemental MediaConnect package.

This completes the low-latency protocol set for Elemental MediaConnect, which already supported video transport using protocols including Zixi, Reliable Internet Stream Transport (RIST), and Real-time Transport Protocol (RTP) with forward error correction (FEC).

Although SRT is the last of these protocols to be added, AWS indicated that customer demand, which it says determines which protocols it supports and which technologies it adopts or builds, had built up. The inclusion of SRT in the portfolio also extends to AWS Elemental Live, an on-premises appliance and software-based live video encoder, whose users gain access to the protocol in the latest software release. Elemental Live can now receive streams using SRT, taking as an input a secure, reliable, low-latency video source with protection against packet loss. SRT also suits applications that require security, since the protocol can secure video end-to-end with 128-bit or 256-bit AES encryption.
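For those who configure MediaConnect programmatically, the sketch below shows roughly how an SRT listener source might be set up through boto3. It is a minimal illustration only: the "srt-listener" protocol value and the port, whitelist and latency fields are assumptions based on the general shape of the MediaConnect create_flow API and should be checked against the current AWS documentation.

```python
# Hypothetical sketch only: create an AWS Elemental MediaConnect flow with an
# SRT listener source via boto3. The "srt-listener" protocol value and the
# port/whitelist/latency fields are assumptions -- verify against the current
# MediaConnect API reference before relying on them.
import boto3

mediaconnect = boto3.client("mediaconnect", region_name="us-west-2")

response = mediaconnect.create_flow(
    Name="srt-contribution-flow",
    AvailabilityZone="us-west-2a",
    Source={
        "Name": "srt-input",
        "Protocol": "srt-listener",          # assumed value: the flow waits for the encoder to call in
        "IngestPort": 5000,                  # UDP port the flow listens on
        "WhitelistCidr": "203.0.113.0/24",   # restrict which encoders may connect
        "MaxLatency": 2000,                  # receive buffer, in milliseconds, for retransmissions
    },
)

print("Flow ARN:", response["Flow"]["FlowArn"])
```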

“SRT has shown that it’s an important transport protocol, and it provides secure and reliable transport of live video to and from the AWS Cloud,” said Dan Gehred, solutions marketing manager for AWS. “With SRT protocol input and output in AWS Elemental MediaConnect, along with input in AWS Elemental Live appliances and software, AWS customers have more options when it comes to building scalable, reliable, and secure live video workflows.”

This comes at a time of growing momentum behind SRT, a month after Sony joined the Alliance, whose more than 450 members also include Microsoft, Alibaba Cloud and Tata Communications.

However, SRT still has a close rival in the RIST protocol proposed by the Video Services Forum, which is similar in capabilities and approach. Both address latency by cutting the delay associated with TCP retransmitting lost IP packets in response to requests from the receiving end of a network link. Instead, both run over UDP (User Datagram Protocol) and add a retransmission mechanism called Automatic Repeat reQuest (ARQ). If the receiver identifies a gap in the stream resulting from missing IP packets, it requests that they be sent again via a negative acknowledgment (NAK) packet. The advantage is that only the missing packets are retransmitted, and recovery is faster because sender and receiver remain in conversation throughout the transmission.
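To make the mechanism concrete, the short Python sketch below mimics NAK-driven selective retransmission: the receiver spots gaps in the sequence numbers it has seen and asks the sender to resend only those packets. The names and data structures are invented for illustration; real SRT and RIST implementations do this over UDP with strict timing rules.

```python
# Minimal illustration of NAK-driven selective retransmission (ARQ), the idea
# behind SRT's and RIST's loss recovery. Names and structures here are
# invented for clarity; the real protocols run over UDP with precise timing.

def find_missing(received_seq_numbers, highest_expected):
    """Receiver side: list the sequence numbers to request again (the NAK)."""
    received = set(received_seq_numbers)
    return [seq for seq in range(highest_expected + 1) if seq not in received]

def retransmit(sent_packets, nak_list):
    """Sender side: resend only the packets named in the NAK, not the whole stream."""
    return {seq: sent_packets[seq] for seq in nak_list}

# Example: the sender emitted packets 0-7, but 2 and 5 were lost in transit.
sent_packets = {seq: f"payload-{seq}" for seq in range(8)}
arrived = [0, 1, 3, 4, 6, 7]

nak = find_missing(arrived, highest_expected=7)   # -> [2, 5]
recovered = retransmit(sent_packets, nak)         # only the missing payloads cross the link again
print("NAK:", nak, "recovered:", recovered)
```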

SRT is sometimes described as combining the speed of plain UDP with the reliability of TCP. In practice, though, ARQ does add some latency, and the impact depends on the retransmission or round-trip time between sender and receiver, which is largely a function of distance.

This distance-related delay includes propagation latency, the time signals take to traverse the physical medium, but the bigger contributor is the time spent buffering IP packets while waiting for possible retransmissions. The farther apart the nodes, the higher the latency, because the buffers must be larger to accommodate the added transmission delay and so take longer to fill and empty.
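A back-of-the-envelope sketch makes the relationship between distance and delay easier to see. The figures below assume light covers roughly 200 km per millisecond in fibre and that the receiver budgets for two retransmission attempts, each costing one round trip; both assumptions are illustrative rather than drawn from either protocol's specification, but they land close to the figures quoted in the next paragraph.

```python
# Illustrative only: how distance drives end-to-end delay when lost packets
# must be recovered by retransmission. The fibre speed and the "two
# retransmission attempts" budget are assumptions, not protocol constants.

KM_PER_MS_IN_FIBRE = 200   # light covers roughly 200 km per millisecond in fibre

def end_to_end_delay_ms(fibre_path_km, retransmission_attempts=2):
    one_way_ms = fibre_path_km / KM_PER_MS_IN_FIBRE        # propagation delay
    round_trip_ms = 2 * one_way_ms                         # NAK out, resent packet back
    buffer_ms = retransmission_attempts * round_trip_ms    # how long the receiver must buffer
    return one_way_ms + buffer_ms

print(f"London-New York (~5,600 km): ~{end_to_end_delay_ms(5_600):.0f} ms")
print(f"Near-antipodal route (~20,000 km): ~{end_to_end_delay_ms(20_000):.0f} ms")
```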

Between, say, London and New York that total delay would be about 150 milliseconds, rising to as much as 500 ms if the source and destination are almost diametrically opposite on the globe. That may still be acceptable for video streaming, but not for some two-way interactive applications such as video conferencing and gaming. For those applications the WebRTC protocol is usually preferred because it involves no packet retransmission at all. Instead, forward error correction (FEC) caters for dropped IP packets up to a certain level by incorporating some redundant information in the stream. This allows some packet recovery while imposing very little additional latency, but there is a bandwidth overhead, and FEC cannot be guaranteed to deliver high-definition streams at sufficient quality over the internet.
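As a toy illustration of the FEC principle, rather than the actual scheme WebRTC uses, the sketch below adds one XOR parity packet per group of media packets, which lets the receiver rebuild any single lost packet in that group without a retransmission, at the cost of the extra parity bandwidth.

```python
# Toy forward error correction: one XOR parity packet per group of media
# packets lets the receiver rebuild any single lost packet in the group with
# no retransmission. Real FEC schemes are far more elaborate, but the
# trade-off is the same: extra bandwidth instead of extra waiting.
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """Sender side: compute one parity packet over a group of equal-length packets."""
    return reduce(xor_bytes, packets)

def recover(received, parity):
    """Receiver side: rebuild the single missing packet (marked None) from the parity."""
    survivors = [p for p in received if p is not None]
    rebuilt = reduce(xor_bytes, survivors, parity)
    return [rebuilt if p is None else p for p in received]

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]   # four equal-length media packets
parity = make_parity(group)                    # one extra packet: 25% overhead here

damaged = [b"pkt0", None, b"pkt2", b"pkt3"]    # packet 1 lost in transit
print(recover(damaged, parity))                # -> the original group, restored
```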

SRT was developed by Haivision around 2012 to meet the challenges of low-latency video streaming by cutting down on those TCP error-correction delays. Wowza Media Systems later joined forces with Haivision to launch the SRT Alliance in April 2017, while making the protocol available as open source to encourage adoption.
