AWS Completes Streaming Protocol Set With SRT Support

AWS (Amazon Web Services) has joined the SRT Alliance and added native support for the Secure Reliable Transport (SRT) protocol in its AWS Elemental MediaConnect package.
This completes the low-latency protocol set for Elemental MediaConnect, which already supported video transport using protocols including Zixi, Reliable Internet Stream Transport (RIST), and Real-time Transport Protocol (RTP) with forward error correction (FEC).
Although SRT is the last of these protocols to be added, AWS indicated that customer demand had built up, and it said such demand determines which protocols it supports and which technologies it adopts or builds. The inclusion of SRT within the portfolio also extends to AWS Elemental Live, an on-premises appliance and software-based live video encoder, whose users now have access to the protocol in the latest software release. Elemental Live can now receive streams over SRT, taking as input a secure, reliable, and low-latency video source with protection against packet loss. SRT also suits applications that require security, since video can be secured end-to-end via the protocol using 128-bit or 256-bit AES encryption.
“SRT has shown that it’s an important transport protocol, and it provides secure and reliable transport of live video to and from the AWS Cloud,” said Dan Gehred, solutions marketing manager for AWS. “With SRT protocol input and output in AWS Elemental MediaConnect, along with input in AWS Elemental Live appliances and software, AWS customers have more options when it comes to building scalable, reliable, and secure live video workflows.”
This comes at a time of growing momentum behind SRT, a month after Sony joined the Alliance, whose more than 450 members also include Microsoft, Alibaba Cloud, and Tata Communications.
However, SRT still has a close rival in RIST, proposed by the Video Services Forum and similar in capabilities and approach. Both address latency by reducing the delay that TCP incurs when it retransmits lost IP packets after requests from the receiving end of a network link. Both run over the connectionless UDP (User Datagram Protocol) and add a retransmission mechanism called Automatic Repeat reQuest (ARQ). If the receiver identifies a gap in a stream resulting from missing IP packets, it requests that those packets be sent again via a negative acknowledgment (NAK) packet. The advantage is that only the missing packets are retransmitted, and this happens faster because sender and receiver remain in conversation throughout the transmission.
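To make that NAK-based loop more concrete, here is a minimal Python sketch of the receiver-side logic. It is not code from the SRT or RIST implementations, and the function names are hypothetical; it simply illustrates how a receiver can spot gaps in the sequence numbers and ask for only the missing packets.

```python
# Minimal illustration of NAK-based ARQ as used conceptually by SRT and RIST.
# This is a simplified sketch, not code from either protocol's implementation.

def find_missing(received_seqs, highest_seq):
    """Return the sequence numbers missing below the highest one seen so far."""
    return [seq for seq in range(highest_seq + 1) if seq not in received_seqs]

def receiver_step(received_seqs, incoming_seq, send_nak):
    """Record an arriving packet and NAK any gaps it reveals."""
    received_seqs.add(incoming_seq)
    missing = find_missing(received_seqs, max(received_seqs))
    if missing:
        send_nak(missing)  # ask the sender to resend only these packets
    return missing

if __name__ == "__main__":
    naks = []
    seen = set()
    for seq in (0, 1, 2, 4):        # packet 3 is lost in transit
        receiver_step(seen, seq, naks.extend)
    print("NAKed sequence numbers:", naks)   # -> [3]
```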
SRT is sometimes described as combining the speed of plain UDP with the reliability of TCP. In practice, though, ARQ does add some latency, and the impact depends on the retransmission, or round-trip, time between sender and receiver, which is largely a function of distance.
This distance-related delay includes propagation latency, the time taken for a signal to traverse the physical medium, but the bigger contributor is the time spent buffering IP packets while waiting for possible retransmissions. The farther apart the nodes, the higher the latency, because buffers have to be larger to accommodate the added transmission delay and so take longer to fill and empty.
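As a rough back-of-the-envelope illustration of why the buffering term dominates, the Python sketch below adds a retransmission buffer of roughly twice the round-trip time on top of the one-way propagation delay. The 2x multiplier, the fibre speed, and the distances are illustrative assumptions, not figures from the SRT or RIST specifications.

```python
# Back-of-the-envelope latency estimate: propagation delay plus a
# retransmission buffer sized as a multiple of round-trip time (RTT).
# The multiplier and distances are illustrative assumptions only.

SPEED_IN_FIBER_KM_S = 200_000          # roughly 2/3 of light speed in vacuum

def propagation_ms(distance_km):
    """One-way signal travel time over fibre, in milliseconds."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

def buffered_latency_ms(distance_km, rtt_multiplier=2):
    """Propagation delay plus a retransmission buffer of N x RTT."""
    one_way = propagation_ms(distance_km)
    rtt = 2 * one_way
    return one_way + rtt_multiplier * rtt

if __name__ == "__main__":
    for label, km in (("London-New York", 5_600), ("Antipodal route", 20_000)):
        print(f"{label}: ~{buffered_latency_ms(km):.0f} ms end-to-end")
```

With these assumed values the estimates land in the same ballpark as the figures quoted below, around 140 ms across the Atlantic and 500 ms for near-antipodal routes.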
Between, say, London and New York, that total delay would be about 150 milliseconds, rising to as much as 500 ms if the source and destination are almost diametrically opposite on the globe. That may still be acceptable for video streaming, but not for some two-way interactive applications such as video conferencing and gaming. For those applications the WebRTC protocol is usually preferred because it involves no packet retransmission at all. Instead, Forward Error Correction (FEC) caters for dropped IP packets up to a certain level by incorporating redundant information in the stream. This allows some packet recovery while imposing very little additional latency, but there is a bandwidth overhead, and FEC cannot be guaranteed to deliver high-definition streams at sufficient quality over the internet.
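For contrast, here is a minimal sketch of the FEC idea in its simplest form: one XOR parity packet per group of data packets, which lets the receiver rebuild a single lost packet without any retransmission. Real FEC schemes used with WebRTC and RTP are more sophisticated, so this is purely illustrative.

```python
# Simplest possible FEC illustration: one XOR parity packet per group of
# equal-length data packets lets the receiver rebuild any single lost packet
# without asking the sender for a retransmission.

def xor_parity(packets):
    """Compute a parity packet as the byte-wise XOR of equal-length packets."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, byte in enumerate(pkt):
            parity[i] ^= byte
    return bytes(parity)

def recover(received, parity):
    """Rebuild the single missing packet (marked as None) from the parity."""
    missing_index = received.index(None)
    rebuilt = bytearray(parity)
    for pkt in received:
        if pkt is not None:
            for i, byte in enumerate(pkt):
                rebuilt[i] ^= byte
    return missing_index, bytes(rebuilt)

if __name__ == "__main__":
    group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
    parity = xor_parity(group)
    damaged = [b"pkt0", None, b"pkt2", b"pkt3"]   # packet 1 lost in transit
    print(recover(damaged, parity))               # -> (1, b'pkt1')
```

The bandwidth overhead is visible here: one extra packet per group, regardless of whether anything was actually lost, which is the trade-off the paragraph above describes.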
SRT was developed by Haivision around 2012 to meet the challenges of low-latency video streaming by cutting down on those TCP error-correction delays. Wowza Media Systems later joined the effort and, together with Haivision, launched the SRT Alliance in April 2017, while making the protocol available as open source to encourage adoption.