Hybrid CDN - Part 2
This is the second instalment of our deep dive into the rapid growth of OTT, high user expectations, and the developments in hybrid systems that combine CDN with storage and distributed processing to meet demand.
CDN Control
Another option is a complete private CDN service, which gives a broadcaster full control over the distribution. They can place monitoring probes wherever they like and monitor the system at an incredibly granular level of detail. Generally speaking, a private CDN consists of installing cache servers and monitoring systems; it does not include the fiber network or home delivery. The costs vary depending on the use case, and a private CDN can be cheaper once sufficient viewers are watching. As a very rough rule of thumb, 50,000 viewers is the tipping point at which a private CDN becomes cost effective. However, that number may change depending on the programs being streamed. For example, if a broadcaster routinely streams at around 500Gbps, a private CDN can be much more cost effective. But if there are spikes up to 2Tbps once per month, it might be best to use the public CDN for the extra 1.5Tbps. This isn’t a linear relationship, and many variables influence the point at which public, private, or a combination of the two becomes more efficient.
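As a back-of-envelope illustration of that tipping point, the sketch below compares the two cost models. Every price in it is a hypothetical placeholder, not a real CDN rate: public CDN is typically billed per GB delivered, while a private CDN is dominated by fixed infrastructure cost plus a smaller variable cost.

```python
# Illustrative cost-crossover sketch: all prices are hypothetical placeholders.
PUBLIC_COST_PER_GB = 0.02          # hypothetical $/GB delivered
PRIVATE_FIXED_PER_MONTH = 40_000   # hypothetical $/month for caches + monitoring
PRIVATE_COST_PER_GB = 0.004        # hypothetical $/GB (transit, power, upkeep)

def monthly_gb(viewers: int, mbps_per_viewer: float = 5.0,
               hours_per_day: float = 2.0) -> float:
    """Rough monthly traffic estimate: viewers x bitrate x viewing time."""
    seconds = hours_per_day * 3600 * 30
    return viewers * mbps_per_viewer * seconds / 8 / 1000  # megabits -> GB

for viewers in (10_000, 50_000, 100_000):
    gb = monthly_gb(viewers)
    public = gb * PUBLIC_COST_PER_GB
    private = PRIVATE_FIXED_PER_MONTH + gb * PRIVATE_COST_PER_GB
    print(f"{viewers:>7} viewers: public ${public:,.0f} vs private ${private:,.0f}")
```

With these placeholder numbers the crossover lands near the 50,000-viewer mark quoted above, but shifting any one assumption (bitrate, viewing hours, fixed cost) moves it, which is exactly why the relationship isn’t linear.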
The third option is the hybrid CDN model.
Public CDNs do have many advantages: they are cost effective and have deep reach into many of the ISPs needed to reach wide audiences. However, they are shared, and this compromises the viewer’s quality of service and experience. This compromise doesn’t necessarily affect the whole network; there will be many parts of the distribution that work well enough for the public CDN to operate adequately and meet the viewer’s expectations.
It’s often difficult to monitor a public CDN, and this is as much a logistical challenge for the public CDN provider as anything else. Public CDN providers may want to move their equipment, upgrade it, or simply service it. Within the organization there will be tight operational controls on when these tasks can take place, managed through change control notices. These require agreement from all affected parties within the organization and are notoriously difficult to get signed off. It would be even harder if sign-off were also required from the many other broadcasters whose monitoring equipment was installed alongside.
As well as providing managed services, private CDNs provide the opportunity to move the packager and storage closer to the ISP and viewer. This takes advantage of the shorter latency between the viewer’s mobile device and the playout servers, resulting in a much-improved quality of experience for the viewer. In this diagram, the transcode, packager and storage are closer to the customer in the private CDN than they are with the public CDN.
Private CDNs allow monitoring to be installed so the broadcaster can keep close control of how the network is performing. Broadcasters using private CDNs also have better access to the managed network, so they can identify anomalies quickly. However, providing a private CDN for a complete OTT distribution network would result in a level of complexity and cost that few broadcasters would be able to deal with.
There is less chance of a private CDN being shared between broadcasters. Some sharing is possible, but the private CDN service provider will establish brick-wall rate control between clients so that minimum viable latencies, bandwidths, and jitter can be guaranteed as part of a larger service agreement.
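One common mechanism for enforcing such a hard per-client ceiling is a token bucket. The sketch below is a generic illustration of that idea, not the provider’s actual implementation, and the rates used are hypothetical.

```python
import time

class TokenBucket:
    """Generic token-bucket limiter: a common way to enforce a hard
    per-client bandwidth ceiling. Illustrative only; real CDN rate
    control lives in the network/edge stack, not in Python."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # over the ceiling: queue or drop the request

# e.g. cap one tenant at 100 MB/s with a 10 MB burst allowance (hypothetical)
bucket = TokenBucket(rate_bytes_per_s=100e6, burst_bytes=10e6)
```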
Hybrid CDN
However, if we reserve the private CDN for routinely delivered content, or for areas where we know congestion is likely to occur, it can be used as part of a public-private partnership strategy. CDN capacity planning is critical here, and with the correct monitoring, broadcasters will be able to determine where best to focus their resources.
Greater efficiencies can be gained through the placement of storage, encoding and packaging servers. In the example discussed earlier, the packaging process would require approximately 18 streams of video and audio to leave the broadcaster for each service. Even the time-delayed +1 hour services would need to be streamed from the broadcaster and, as these are effectively the same stream delayed by one hour, this is an incredibly inefficient method of operation.
Moving the storage and packaging as close to the edge as possible is a much more efficient and effective method of operation. In terms of processing power, packaging is relatively cheap; transcoding, however, is resource hungry and tends to remain centralized. By “the edge”, we mean as close to the consumer as we can get. This may well mean installing equipment directly in the ISPs’ POPs, and in doing so the data capacity required in the core internet is also greatly reduced.
Instead of sending three versions, each containing six ABR streams (one each for HLS, DASH and Android), only one version with six bit rates needs to be sent over the private CDN, and the packaging can be performed at the edge. At the ISP, the private CDN will have sufficient infrastructure installed to package the streams for ABR distribution to the ISP’s clients, who are, in effect, the broadcaster’s viewers.
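The arithmetic behind this saving is straightforward. A minimal sketch, assuming a hypothetical six-rung bitrate ladder (the per-rendition bitrates are illustrative, not from the article):

```python
# Back-of-envelope contribution bandwidth, following the article's example:
# 3 packaged variants (HLS, DASH, Android) x 6 ABR renditions = 18 streams
# from the broadcaster, versus 6 renditions when packaging happens at the edge.

ladder_mbps = [0.8, 1.2, 2.5, 4.0, 6.0, 8.0]   # hypothetical 6-rung ABR ladder
ladder_total = sum(ladder_mbps)

central_packaging = 3 * ladder_total   # 18 streams leave the broadcaster
edge_packaging = 1 * ladder_total      # 6 streams; edge packagers fan out the rest

print(f"central packaging: {central_packaging:.1f} Mbps per service")
print(f"edge packaging:    {edge_packaging:.1f} Mbps per service")
print(f"core-network saving: {1 - edge_packaging / central_packaging:.0%}")
```

Under these assumptions the core-internet contribution load drops by two thirds per service, before even counting the +1 hour services that no longer need their own streams.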
Efficient Internet Delivery
This method reduces the pressure on the core internet, resulting in better delivery of the program stream to the ISP.
Monitoring is much more effective, as the private CDN provider will have monitoring solutions installed as part of their service, giving much greater transparency to the broadcaster. This not only helps with managing key parameters such as jitter and latency, but also assists in predicting viewer demand. When demand is at its lowest, the broadcaster can take advantage of the caching aspect of the private CDN to distribute their on-demand programs to the storage servers, or consider using the computing power of the edge servers.
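As one example of such a key parameter, the sketch below estimates interarrival jitter using the standard RFC 3550 smoothing formula. The probe data is hypothetical; a private CDN’s monitoring stack would expose this kind of per-stream metric directly.

```python
# Minimal sketch of the RFC 3550 interarrival-jitter estimate (section 6.4.1),
# the kind of per-stream health metric a private CDN's probes can report.
# Timestamps are in seconds; in practice they come from RTP headers or logs.

def interarrival_jitter(send_times, recv_times):
    """Smoothed estimate of transit-time variation between packets."""
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times, recv_times):
        transit = received - sent
        if prev_transit is not None:
            jitter += (abs(transit - prev_transit) - jitter) / 16
        prev_transit = transit
    return jitter

# hypothetical probe data: packets sent every 20 ms, arrivals slightly uneven
sends = [i * 0.020 for i in range(6)]
recvs = [0.050, 0.071, 0.089, 0.112, 0.130, 0.151]
print(f"estimated jitter: {interarrival_jitter(sends, recvs) * 1000:.2f} ms")
```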
Placing distributed storage within the private CDN, as close as possible to the consumer, will reduce congestion on the core internet, as users effectively grab their program streams from the edge storage and associated servers rather than requesting program segments back from the broadcaster. The latency between the viewer and the ISP is inevitably far lower than that between the viewer and the broadcaster, further reducing delay and the potential for the dreaded “wait, loading” symbol on the viewer’s device.
Predictable Delivery
Live programming is greatly improved, as the distribution from the broadcaster to the ISP and then to the viewer becomes more predictable, resulting in a much better quality of experience. If the core internet infrastructure isn’t having to contend with the spontaneous, bursty and unpredictable traffic caused by on-demand services, then the service is bound to be better.
The edge storage and servers used for on-demand can also be used for live programming and distribution. Again, only a reduced number of streams needs to be sent to the edge devices, which can then provide all the packaging, processing and mobile device interaction needed to make program delivery reliable and comparable to OTA broadcasting.
The combination of private and public CDN solves many of the challenges broadcasters face when providing premium OTT services. Combined, they provide the best of both worlds: the client reach of a public CDN with the control and quality of service that a private infrastructure delivers.
But providing private CDNs isn’t just about delivering faster connectivity to POPs within the internet. It’s also about distributing the core infrastructure technology to take it as close as possible to the viewer. OTT fundamentally differs from traditional OTA in that there is significant interaction between the playout server and the mobile device. Instead of the program stream being pushed to the viewer, as happens in OTA, the mobile device requests packets of the stream from the playout device as it needs them.
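A minimal sketch of this pull model is shown below. The edge URL and segment naming are hypothetical, and a real player would parse an HLS or DASH manifest rather than guessing segment names.

```python
# Sketch of OTT's pull model: the player requests each segment as it needs it,
# unlike OTA's one-way push. The host below is a hypothetical edge server.

import urllib.request

BASE = "https://edge.example-isp.net/live/channel1"   # hypothetical

def fetch_segment(seq: int) -> bytes:
    """Pull one media segment on demand; an edge cache answers from storage."""
    with urllib.request.urlopen(f"{BASE}/segment_{seq}.ts", timeout=5) as resp:
        return resp.read()

# player loop: request segments only as the playback buffer needs them
for seq in range(100, 103):
    data = fetch_segment(seq)
    # ...decode/buffer `data`; the request rate adapts to buffer fullness...
```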
Improving Viewer Quality of Experience
ABR technologies are greatly affected by any latency in the delivery network, as they fundamentally rely on TCP to work reliably. As latency increases, as happens over a highly utilized core internet link, the reliability of the program greatly deteriorates, leading to “waiting buffer” issues and even picture break-up. Moving the interaction between the mobile device and the playout server to the edge greatly reduces latency, resulting in a much-improved user experience, and therefore better viewer retention.
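The underlying reason is that a single TCP connection’s throughput is bounded by roughly window ÷ round-trip time, so the same connection can sustain far less bitrate over a long core-internet path than over a short edge path. A minimal sketch, with an illustrative window size and RTTs:

```python
# Why latency hurts ABR: TCP throughput is capped at about window / RTT, so a
# long, congested path can only sustain the lower rungs of the bitrate ladder.

WINDOW_BYTES = 256 * 1024   # hypothetical effective TCP window (256 KB)

def max_mbps(rtt_ms: float) -> float:
    """Upper bound on single-connection throughput: window / RTT."""
    return WINDOW_BYTES * 8 / (rtt_ms / 1000) / 1e6

for label, rtt in [("viewer -> edge cache", 10), ("viewer -> distant origin", 120)]:
    print(f"{label:25} RTT {rtt:3} ms -> ceiling {max_mbps(rtt):7.1f} Mbps")
```

With these illustrative figures, the 10 ms edge path supports the whole ladder comfortably, while the 120 ms origin path caps out below 20 Mbps and leaves far less headroom before the player has to shift down or stall.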
Hybrid CDN also scales to the cloud. Many broadcasters are taking advantage of cloud playout using virtualized storage and servers. Often, the public cloud provider has data circuits directly into the main ISP POPs throughout the world, which further encourages the use of a private CDN, as the edge servers can be placed close to the viewer within the ISP.
Hybrid CDN should be a key consideration for any broadcaster looking to provide reliable OTT for their viewers. The combination of deep reach and control is a winning formula. Working together, it’s even possible to have the public CDN acting as a backup to the private CDN. Quite often in broadcasting we find that a partnership of methods provides the best overall solution, and Hybrid CDN is one of those.