Flexible IP Monitoring - Part 1
Video, audio and metadata monitoring in the IP domain requires different parameter checking than is typically available from the mainstream monitoring tools found in IT. The contents of the data payload are less predictable and the packet distribution is more tightly defined, leading to the need for specialist, media-stream-centric monitoring tools.
ST2110 is the first step for many into the IP world. To keep latency low, its designers restricted ST2110 IP packet distribution to tight tolerances so that smaller receive buffers can be used, which in turn leads to lower latency. This is an unusual method of operation for IP networks, as packets in traditional IT workflows tend towards a more flexible distribution. To rebuild the data for the higher-level applications, large buffers are needed at the receiver, which in turn leads to higher latency.
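To make the latency trade-off concrete, here is a minimal back-of-the-envelope sketch of the packet spacing an evenly paced, uncompressed HD sender has to hold. The frame size, bit depth and payload size used are illustrative assumptions rather than figures taken from the standard.

```python
# Illustrative sketch: the average gap between RTP packets if an uncompressed
# HD video frame is spread evenly across the frame period. All parameters
# below are assumptions for the example, not values from ST2110 itself.

def packet_spacing_us(width, height, bits_per_pixel, fps, payload_bytes):
    """Average inter-packet gap in microseconds for evenly paced sending."""
    bits_per_frame = width * height * bits_per_pixel
    packets_per_frame = bits_per_frame / (payload_bytes * 8)
    frame_period_us = 1_000_000 / fps
    return frame_period_us / packets_per_frame

# 1080p50, 4:2:2 10-bit (~20 bits per pixel), ~1200-byte payloads.
gap_us = packet_spacing_us(1920, 1080, 20, 50, 1200)
print(f"~{gap_us:.1f} us between packets")  # roughly 4-5 us

# A receiver buffering only a few hundred such packets holds just a couple of
# milliseconds of video - the source of ST2110's low latency, and the reason
# its packet timing has to be policed so tightly.
```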
Although PTP is used in the wider industry, regular NICs (Network Interface Cards) are not able to meet the tight timing constraints ST2110 demands, and specialist NICs with hardware PTP processing are required. This adds further to the demands broadcasters place on their monitoring equipment.
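The arithmetic behind PTP is straightforward; what matters is the quality of the timestamps feeding it. As a minimal sketch, the standard offset and mean-path-delay calculation from the four exchange timestamps is shown below with illustrative nanosecond values. Software timestamps taken in the operating system jitter by far more than the offsets being measured, which is why NICs that timestamp in hardware are needed.

```python
# Minimal sketch of the IEEE 1588 (PTP) delay request-response arithmetic.
#   t1: master sends Sync          t2: slave receives Sync
#   t3: slave sends Delay_Req      t4: master receives Delay_Req
# The values below are illustrative nanosecond timestamps, not real captures.

def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2           # slave clock error vs master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # assumes a symmetric path
    return offset, mean_path_delay

offset, delay = ptp_offset_and_delay(t1=1_000_000, t2=1_000_900,
                                     t3=1_050_000, t4=1_050_700)
print(f"offset = {offset:.0f} ns, mean path delay = {delay:.0f} ns")
```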
The data domain in a broadcast IP environment represents the video, audio and metadata. Other than when using test signals, it’s difficult to predict data values with any certainty due to the dynamic nature of video and audio. Our human visual and auditory systems are extremely adept at detecting differences and faults, which is why broadcast engineers often opt to display the video data on screens and listen to the audio data on loudspeakers.
Monitoring in the traditional broadcast sense is merely representing the underlying data in a different domain, that is, vision and sound. Looking at thousands of data samples flying past our eyes may have its occasional uses, but the best method we have of detecting faults and monitoring quality is displaying the pictures on a screen and listening to the sound on loudspeakers.
Traditional IT monitoring simply does not provide the level of monitoring we need. It certainly has its uses, and it provides a great deal of information about the underlying IP distribution and the accuracy of the data, but it does not give us a usable visual and auditory monitoring system.
It is possible to retrofit video and audio monitoring devices to traditional IT monitoring tools, but code will inevitably need to be written to facilitate this, and it’s very difficult to achieve a workable link between the two. Often, when looking at a video image, we will want to simultaneously look at the waveform display and vectorscope, as well as the underlying transport stream. This is very difficult to achieve when using diverse and disconnected unit solutions, especially when working under the pressure of live television, such as high-value sporting events.
Having an integrated monitoring solution that easily and ergonomically connects the monitoring of the video, audio and metadata with the underlying IP transport stream is critical for anybody working in a broadcast facility. Although the data in the IT monitoring equipment may be the same as that in the integrated broadcast solution, the ability to switch between the different monitoring domains is key.
There’s also a new set of metadata emerging from the use of systems such as HDR. This metadata is essential in describing the display and acquisition formats so that the best immersive viewing experience can be maintained. We need to be able not only to monitor this data, but to monitor it in real time in the context of the streaming video and audio of the television production.
Advanced monitoring is one of the most important tools for any broadcaster either transitioning to IP or already there. Integrated monitoring takes this process one step further and delivers a complete toolset that helps broadcasters maintain their audiences and enhance the immersive experience.
Transitioning to IP delivers incredible opportunity for broadcasters. But the asynchronous nature of packet switched networks is new for most engineers, and being able to understand what is going on within the network is essential. The best method we have of observing a network’s behavior is through monitoring, and IP even has advantages here too.
With the benefit of hindsight, it’s now clear that SDI networks are relatively easy to understand. The point-to-point connectivity of synchronous signals increases their predictability but at the expense of flexibility. But with IP and the packet switched network topology, we increase flexibility at the expense of predictability.
A consecutive stream of IP packets does not guarantee that each packet will take the same route through a network. Even with the relatively straightforward spine-leaf or monolithic switch topologies broadcasters are opting for, there is scope for packets to vary their route within the network, leading to out-of-sequence packets arriving at the receiver.
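As a simple illustration, a monitoring probe can spot reordering or loss just by tracking the 16-bit sequence number in each RTP header. The sketch below assumes the raw RTP packets are already being captured by some other means; it only shows the ordering check.

```python
# Hedged sketch: detect out-of-sequence or missing RTP packets by following
# the sequence number in the standard 12-byte RTP header.
import struct

def rtp_sequence(packet: bytes) -> int:
    """Sequence number lives in bytes 2-3 of the RTP header (network order)."""
    return struct.unpack("!H", packet[2:4])[0]

def check_order(packets):
    expected = None
    for pkt in packets:
        seq = rtp_sequence(pkt)
        if expected is not None and seq != expected:
            print(f"reorder or loss: expected {expected}, got {seq}")
        expected = (seq + 1) & 0xFFFF  # sequence numbers wrap at 65535
```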
Network Practicalities
Packet multiplexing is the fundamental method of operation within an Ethernet switch, and it has the potential to increase packet jitter, which in turn leads to timing anomalies if the jitter is excessive.
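One simple way a monitor can quantify this is to compare each measured inter-arrival gap with the sender’s nominal spacing and track the mean and worst-case deviation, as in the sketch below; the timestamps and nominal gap are illustrative assumptions.

```python
# Illustrative sketch: measure packet jitter as the deviation of each
# inter-arrival gap from the nominal sender spacing.

def jitter_stats(arrival_times_ns, nominal_gap_ns):
    gaps = [b - a for a, b in zip(arrival_times_ns, arrival_times_ns[1:])]
    deviations = [abs(gap - nominal_gap_ns) for gap in gaps]
    return max(deviations), sum(deviations) / len(deviations)

# Nominal 4.6 us spacing with one packet delayed inside the switch.
times_ns = [0, 4_600, 9_200, 16_000, 18_400, 23_000]
worst, mean = jitter_stats(times_ns, 4_600)
print(f"worst deviation {worst} ns, mean deviation {mean:.0f} ns")
```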
It’s clear that there is a lot more to monitor within an IP network than was required with SDI. However, the dynamic and scalable infrastructure IP provides far outweighs both this added complexity and the limitations of static SDI systems. And that’s before we start considering the advances in the viewer’s immersive experience with HDR and Wide Color Gamut (WCG).
Sport has traditionally led the way in exhibiting technological excellence. HD, UHD, WCG and HDR are just a few of the technology advances that have been demonstrated at major sports events. Each new technological advance adds further weight to the sports story, allowing production teams to continually build on their productions to create truly outstanding programs.
OB Advances
IP is further adding to this list of technology accolades, as OB trucks are a natural fit for the technology. The reduction in equipment space and weight has delivered incredible benefits, and that’s before we even consider the application-agnostic benefits IP brings.
Monitoring is our window on reality. We have no way of understanding what is happening within a network if we cannot monitor it. With SDI networks we could use a variation of an oscilloscope; with IP networks, life is much more interesting. Not only do we need to consider the detail of the transport stream, but also how the media-specific data being transported is behaving.
This is one of the areas where the SDI analogy starts to break down when thinking about IP networks. The video and audio were an intrinsic part of the SDI transport stream. The bit rate, frame rate, color subsampling and frame size were all a function of the SDI synchronous system and inherently tied to the fundamental clock rate of 270 Mb/s, 1.485 Gb/s or 1.485/1.001 Gb/s, for example. But this is no longer the case with IP networks.
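As a quick worked check of how tightly the SDI clock and the video format are bound together, the rates quoted above fall straight out of the sampling structure:

```python
# SD-SDI (ITU-R BT.601 4:2:2): 13.5 MHz luma plus two 6.75 MHz chroma
# channels, 10 bits per sample.
sd_words_per_sec = 13.5e6 + 2 * 6.75e6          # 27 Mwords/s
print(sd_words_per_sec * 10 / 1e6, "Mb/s")      # 270.0 Mb/s

# HD-SDI: a 1080-line raster of 2200 x 1125 total samples at 30 frames/s
# gives a 74.25 MHz sample clock; 20 bits per sample (10-bit Y + 10-bit C).
hd_samples_per_sec = 2200 * 1125 * 30           # 74.25 Msamples/s
print(hd_samples_per_sec * 20 / 1e9, "Gb/s")    # 1.485 Gb/s
```

Divide the HD clock by 1.001 and the 1.485/1.001 Gb/s rate used in 29.97 fps systems drops out in the same way.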
Flexibility And Complexity
By separating the application video and audio from the underlying IP transport stream, we’ve massively increased our system flexibility and scalability, but at the cost of complexity.
As each IP packet has its own source and destination address, the network itself can determine the optimal route when transferring packets. This often means that systems outside the direct control of the broadcast infrastructure are determining how packets traverse a network.
Although software-defined networks are growing in popularity with broadcasters, and there is some analogy between them and SDI routers, it’s important to remember that the intelligence and routing options exist at the packet level. This routing is fundamental to IP networks and is one of the reasons the internet is so successful, and why IP networks for broadcasters will increase in popularity. However, an unintended consequence of IP networks is that they are considerably more complex than SDI networks, hence the need for flexible monitoring.
IP Packets Underlie Media Streams
It’s interesting to think of video and audio essence, and even metadata, as just data, and in the IT world this is exactly what they are. But in broadcast infrastructures we cannot completely separate the data from the IP network, as the two are intrinsically linked, especially when we start to think about PTP timing.
SMPTE’s ST2110 can and does work independently of the IP network. This can be seen when media is streamed and stored to a hard disk drive. But unlike more general IT data, ST2110 imposes a rigid timing structure on the streaming and temporal gapping of packets. And this is one of the reasons why broadcast engineers need specialist monitoring tools.
It is possible to use IT-centric monitoring and logging tools, and many broadcasters do in some circumstances, but the complex interaction of the media essence and the IP network demands that broadcast-specific tools are used. Furthermore, IT monitoring generally doesn’t measure packet distribution to any great accuracy, and certainly not to the detail needed for ST2110. IT expects the IP packet distribution in a network to be bursty, and evenly gapped packets are anathema.
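To illustrate the difference, a broadcast-centric monitor effectively models a small virtual receive buffer and watches how deep bursty arrivals make it, something generic IT tools rarely do. The sketch below captures only the flavour of the ST2110-21 receiver model; the real standard defines the drain rate, offsets and limits precisely, and the numbers here are purely illustrative.

```python
# Simplified sketch: how deep does a virtual receive buffer get if the
# receiver reads one packet every drain_period_ns? Evenly gapped arrivals
# keep it shallow; a burst drives the peak up.

def peak_buffer_occupancy(arrival_times_ns, drain_period_ns):
    buffered = 0
    peak = 0
    next_read = 0
    for t in arrival_times_ns:
        # Read out any packets whose scheduled read time has already passed.
        while buffered > 0 and next_read <= t:
            buffered -= 1
            next_read += drain_period_ns
        if buffered == 0:
            # Buffer ran dry: next read happens one period after this arrival.
            next_read = max(next_read, t + drain_period_ns)
        buffered += 1
        peak = max(peak, buffered)
    return peak

evenly_gapped = [i * 4_600 for i in range(8)]
bursty        = [0, 100, 200, 300, 400, 23_000, 23_100, 23_200]
print(peak_buffer_occupancy(evenly_gapped, 4_600))  # shallow buffer
print(peak_buffer_occupancy(bursty, 4_600))         # noticeably deeper
```

A real ST2110 monitor runs this kind of measurement continuously, alongside PTP, RTP and essence-level checks, and correlates the results with the pictures and sound.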