AWS Completes Streaming Protocol Set With SRT Support
AWS (Amazon Web Services) has joined the SRT Alliance and added native support for the SRT protocol in its AWS Elemental MediaConnect service.
This completes the low latency protocol set for Elemental MediaConnect, which already supported video transport using protocols including Zixi, Reliable Internet Stream Transport (RIST), and Real-time Transport Protocol (RTP) with forward error correction (FEC).
Although SRT was the last of these protocols to be added, AWS indicated that customer demand had built up, and demand, it said, determines which protocols it supports and which technologies it adopts or builds. The inclusion of SRT in the portfolio also extends to AWS Elemental Live, an on-premises appliance and software-based live video encoder, whose users gain access to the protocol in the latest software release. Elemental Live can now receive SRT streams, taking as input a secure, reliable, low-latency video source with protection against packet loss. SRT also suits applications that require security, since video can be encrypted end-to-end with 128-bit or 256-bit AES via the protocol.
“SRT has shown that it’s an important transport protocol, and it provides secure and reliable transport of live video to and from the AWS Cloud,” said Dan Gehred, solutions marketing manager for AWS. “With SRT protocol input and output in AWS Elemental MediaConnect, along with input in AWS Elemental Live appliances and software, AWS customers have more options when it comes to building scalable, reliable, and secure live video workflows.”
This comes at a time of growing momentum behind SRT, a month after Sony joined the Alliance, whose other scalps include Microsoft, Alibaba Cloud and Tata Communications among more than 450 members in total.
However, SRT still has a close rival in the RIST (Reliable Internet Stream Transport) protocol proposed by the Video Services Forum, which is similar in capabilities and approach. Both address latency by reducing the delay associated with TCP retransmitting lost IP packets after receiving requests from the receiving end of a network link. Both run over UDP (User Datagram Protocol) and incorporate an error-correction mechanism called Automatic Repeat reQuest (ARQ). If the receiver identifies a gap in the stream resulting from missing IP packets, it requests just those packets to be sent again via a negative acknowledgment (NAK) packet. The advantage is that only missing packets are retransmitted, and this happens faster because sender and receiver remain in conversation throughout the transmission.
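As a rough illustration of that ARQ exchange (a minimal sketch, not SRT's or RIST's actual packet format; the class and callbacks here are hypothetical), a receiver can track sequence numbers, buffer anything that arrives out of order, and send a NAK listing only the gaps:

```python
# Minimal sketch of the NAK-based ARQ loop used conceptually by SRT and RIST.
# Packet layout, class and callback names are hypothetical, for illustration only.

class ArqReceiver:
    def __init__(self, send_nak, deliver):
        self.send_nak = send_nak   # callable: sends a NAK listing missing sequence numbers
        self.deliver = deliver     # callable: hands in-order payloads to the decoder
        self.expected = 0          # next sequence number we expect
        self.pending = {}          # out-of-order packets held back until the gap is filled

    def on_packet(self, seq, payload):
        if seq == self.expected:
            self.deliver(payload)
            self.expected += 1
            # Release any later packets that were waiting on this one.
            while self.expected in self.pending:
                self.deliver(self.pending.pop(self.expected))
                self.expected += 1
        elif seq > self.expected:
            # Gap detected: keep this packet and request only the missing ones.
            self.pending[seq] = payload
            self.send_nak(list(range(self.expected, seq)))
        # seq < self.expected: duplicate or already-recovered packet, drop it.


# Usage: packet 1 goes missing, is NAK'd, and arrives on retransmission.
rx = ArqReceiver(send_nak=lambda missing: print("NAK", missing),
                 deliver=lambda p: print("deliver", p))
rx.on_packet(0, b"frame-0")
rx.on_packet(2, b"frame-2")   # packet 1 missing -> NAK [1]
rx.on_packet(1, b"frame-1")   # retransmission fills the gap; 1 and 2 are delivered
```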
SRT is sometimes described as combining the speed of plain UDP with the reliability of TCP. In practice, though, ARQ does add some latency, whose impact depends on the retransmission, or round-trip, time between sender and receiver, which is largely a function of distance.
This distance-related delay includes signal latency, the time taken to traverse the physical medium, but the bigger contributor is the time spent buffering IP packets while waiting for possible retransmissions. The farther apart the nodes, the higher the latency, because buffers must be larger to accommodate the added transmission delay, and so take longer to fill and empty.
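As a back-of-the-envelope sketch of that relationship (the four-round-trip multiplier is an assumption used for illustration, not a figure from the SRT or RIST specifications), the receive buffer has to cover a small multiple of the round-trip time so that a NAK and the retransmission can complete before the packet's playout deadline:

```python
# Rough sketch of how ARQ buffering scales with round-trip time (RTT).
# The multiplier is an assumption for illustration; real deployments tune it
# to the loss characteristics of the link rather than using a fixed value.

def arq_buffer_ms(rtt_ms, rtt_multiplier=4):
    # The receiver must hold packets long enough for a gap to be noticed,
    # a NAK to reach the sender and the retransmission to arrive,
    # i.e. a small multiple of the round-trip time.
    return rtt_multiplier * rtt_ms

for rtt in (10, 70, 250):   # metro, transatlantic and near-antipodal RTTs
    print(f"RTT {rtt:>3} ms -> buffer ~{arq_buffer_ms(rtt)} ms")
```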
Between, say, London and New York, that total delay would be about 150 milliseconds, rising to as much as 500 ms if the source and destination are almost diametrically opposite on the globe. That may still be acceptable for video streaming, but not for some two-way interactive applications such as video conferencing and gaming. For those applications the WebRTC protocol will usually be preferred, because it involves no packet retransmission at all. Instead, Forward Error Correction (FEC) is used to cater for dropped IP packets up to a certain level by incorporating some redundant information in the stream. This allows some packet recovery while imposing very little additional latency, but there is a bandwidth overhead, and FEC cannot be guaranteed to deliver high-definition streams at sufficient quality over the internet.
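The toy scheme below (a deliberately simplified sketch, not the FEC actually used by WebRTC or MediaConnect) shows the trade-off: one XOR parity packet per group of four lets the receiver rebuild any single lost packet without a round trip, at the cost of roughly 25% extra bandwidth in this example:

```python
# Toy XOR-parity FEC: one redundant packet per group can rebuild any single
# lost packet in that group without waiting for a retransmission.
# A simplified illustration only, not the FEC scheme used by WebRTC.

def xor_bytes(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]   # equal-sized media packets
parity = xor_bytes(group)                      # ~25% bandwidth overhead here

# Suppose packet 2 is lost in transit: XOR of the survivors and the parity
# packet reconstructs it immediately, with no round trip to the sender.
received = [group[0], group[1], group[3], parity]
recovered = xor_bytes(received)
assert recovered == group[2]
print("recovered:", recovered)
```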
SRT was developed by Haivision around 2012 to meet the challenges of low-latency video streaming by cutting down on those TCP error-correction delays. Wowza Media Systems later joined the party and, together with Haivision, launched the SRT Alliance in April 2017, while making the protocol available as open source to encourage adoption.