Just what is SMPTE ST-2110?
The Joint Task Force on Networked Media (JT-NM) assumed path of networked media development as of April 2017. It will evolve over time.
The big news out of IBC this year was... hmmm, what a great question! Oh yeah, and at the very end of IBC, SMPTE ST 2110 parts 10, 20 and 30 were officially approved. What, you ask? What about the rest of the standard, or rather the suite of standards? Don’t worry, they will be here in time for Xmas, or sometime in the New Year in time for NAB LV 2018, or so they say.
So, what was approved? Well, you could go back and read my article, “SMPTE ST-2110 - We are Family.” Let me summarize.
SMPTE ST 2110-10 is system timing and synchronization, or the standard formerly known as SMPTE ST 2059 parts 1 & 2 (PTP).
SMPTE ST 2110-20 is uncompressed video, i.e. VSF TR-03 / IETF RFC 4175 (a minimal SDP sketch of what a -20 sender advertises follows this list).
SMPTE ST 2110-30 is uncompressed PCM digital audio, i.e. AES67.
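To make that a little more concrete, here is a minimal sketch, in Python, of the kind of SDP description a ST 2110-20 sender publishes so receivers can interpret its RTP stream. The addresses, port, payload type and parameter values are illustrative assumptions, not values taken from the standard.

```python
# A minimal sketch of the SDP a hypothetical ST 2110-20 sender might publish.
# Addresses, ports, payload type and parameter values are illustrative only.
video_params = {
    "sampling": "YCbCr-4:2:2",
    "width": 1920,
    "height": 1080,
    "depth": 10,
    "exactframerate": "30000/1001",
    "colorimetry": "BT709",
}

fmtp = "; ".join(f"{k}={v}" for k, v in video_params.items())

sdp = "\n".join([
    "v=0",
    "o=- 123456 1 IN IP4 192.168.1.10",         # example sender address
    "s=Example ST 2110-20 video essence",
    "t=0 0",
    "m=video 50000 RTP/AVP 96",                 # video on UDP port 50000, payload type 96
    "c=IN IP4 239.100.1.1/64",                  # example multicast group
    "a=rtpmap:96 raw/90000",                    # RFC 4175 'raw' video, 90 kHz RTP clock
    f"a=fmtp:96 {fmtp}",
    "a=mediaclk:direct=0",
    "a=ts-refclk:ptp=IEEE1588-2008:traceable",  # PTP reference per ST 2110-10
])

print(sdp)
```

Note that everything a receiver needs to reconstruct the picture (sampling, raster, bit depth, frame rate) travels in this description rather than being implied by an SDI format.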
OK so what’s missing?
SMPTE ST 2110-21 – Traffic shaping for uncompressed video.
SMPTE ST 2110-31 – AES3 audio transport.
SMPTE ST 2110-40 – Ancillary data, i.e. SMPTE ST 291 packets carried over RTP. This is all the messy stuff like captions, subtitles, active format description, time code, dynamic range and more.
SMPTE ST 2110-50 – Video as SMPTE ST 2022-6, also known as VSF TR-04.
Notice what is not part of the standard? That would be the Networked Media Open Specifications (NMOS) for Discovery and Registration (IS-04), Connection Management (IS-05) and, coming soon, Network Control (IS-06).
SMPTE ST 2110 is a transport standard
Here’s where I get into trouble. I got into a discussion with a colleague on what IP really meant. His position was that it is just transport, not a format, and that the digital video is basically the same component digital video that has been around all along. He is correct.
SMPTE ST-2110 is a transport standard, not a video standard. The video starts life as ITU-R (CCIR) 601 component digital video, YCbCr or RGB, 4:2:2/10/12, 4:4:4/16, etc. In the Old Days, this was encoded into SDI for transport with the audio and all the ANC data embedded. When the SDI payload reached its destination, the video, audio and everything else was unbundled at the receiving device so it could perform whatever its job was, e.g. production switcher, audio console, monitor, test and measurement. These devices did not process SDI; once they had processed the appropriate signal they re-encoded it to SDI for transport to the next device.
Then came SMPTE ST 2022-5/6, which further encapsulated the SDI into an IP packetized transport so Ethernet could be used as the transport mechanism. Same process at the end: IP back to SDI, back to component video and audio.
So how does SMPTE ST 2110 fit into this? Well, it’s a new mode of transportation, not a new format. Video still originates the same way, as component digital video, and is then packetized into IP for transport from device to device. Audio works the same way. The receiving device removes the IP and continues to process audio and video the same way it always has in the digital world. If it needs to output video or audio, it packages it up in IP and sends it along as an IP essence stream. Timing and sync are a different story, but we will leave that for another article.
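As a rough illustration of what moving essence rather than wrapped SDI buys you, here is a back-of-the-envelope sketch in Python. The arithmetic is mine, not a figure from any of the standards, and it ignores RTP/UDP/IP header overhead, which adds a few percent on top of the essence rate.

```python
# Back-of-the-envelope comparison: active-video-only ST 2110-20 essence rate
# versus the full 3G-SDI rate that a ST 2022-6 wrapper would carry.
# Example format: 1080p59.94, YCbCr 4:2:2, 10 bits per component.

width, height = 1920, 1080
bits_per_pixel = 20           # 4:2:2 at 10 bits: Y every pixel, Cb/Cr every other pixel
frame_rate = 60000 / 1001     # 59.94 frames per second

essence_bps = width * height * bits_per_pixel * frame_rate
sdi_bps = 2.970e9             # nominal 3G-SDI rate, blanking included

print(f"ST 2110-20 active video essence: {essence_bps / 1e9:.2f} Gb/s")  # ~2.49 Gb/s
print(f"3G-SDI wrapped by ST 2022-6:     {sdi_bps / 1e9:.2f} Gb/s")
```

The difference is the horizontal and vertical blanking that SDI carries everywhere; the essence-based approach leaves it out, and the ancillary data gets its own stream under ST 2110-40.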
So if all the hub-bub and noise was about transport, then isn’t the IP network, the configuration of the devices and the communication between devices on the network a critical element in all this? Well, it appears that discovery, registration and network control are not part of the SMPTE standard. That was given to a separate joint task force to sort out.
The JT-NM NMOS, or Networked Media Open Specifications, is the interoperability piece that lets devices find each other on the network. Again, transportation.
There are two main specifications in NMOS. They are essentially APIs (Application Programming Interfaces) that manage and control how the devices on the network communicate with each other and move media.
IS-04: Discovery and Registration – This specification lets network-connected devices register themselves in a shared registry and gives a uniform way to query that registry. It also includes peer-to-peer discovery for smaller networks or device-to-device operation.
IS-05: Connection Management – This specification enables a production application to create or remove media stream connections between sending and receiving devices. It is independent of the transport protocol in use and supports both unicast and multicast operation. A connection can be prepared for immediate activation or scheduled for a specified time, and multiple connections can be created or removed in a single "bulk" operation. (A minimal API sketch follows after this list.)
There is a third specification in development for Network Control (IS-06). This will be a control API for the NMOS services and looks at the network as a whole.
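To give a feel for what these APIs look like in practice, here is a minimal sketch in Python using the requests library. The registry address, node address, API versions and the receiver ID are all assumptions for illustration; a real controller would also handle errors, authentication and the transport file (SDP) hand-off.

```python
# A minimal sketch of how a controller might use the NMOS APIs.
# Registry address, node address, API versions and IDs are illustrative assumptions.
import requests

REGISTRY = "http://registry.example.local"    # hypothetical IS-04 registry
NODE = "http://receiver-node.example.local"   # hypothetical receiving device

# IS-04: discover which senders the registry currently knows about.
senders = requests.get(f"{REGISTRY}/x-nmos/query/v1.2/senders").json()
sender = senders[0]                           # pick one for the example

# IS-05: stage and immediately activate a connection on a receiver.
receiver_id = "6a4b2c1d-0000-0000-0000-000000000000"   # example UUID
patch = {
    "sender_id": sender["id"],
    "master_enable": True,
    "activation": {"mode": "activate_immediate"},
}
resp = requests.patch(
    f"{NODE}/x-nmos/connection/v1.0/single/receivers/{receiver_id}/staged",
    json=patch,
)
print(resp.status_code, resp.json())
```

In other words, IS-04 answers "what is out there?" and IS-05 answers "connect this sender to that receiver", which is exactly the routing-control role the SMPTE documents themselves leave open.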
So here’s the conundrum. If SMPTE ST-2110 is a transport standard for media over an IP network, and the network part of it is not included in the standard but lives in a separate specification that’s not quite ready, what actually was approved and how does it get implemented?
It seems like an SDI router without a control protocol?
The IP Pavilion at IBC demonstrating interoperability was a great success and lots of manufacturers participated; however, it was hard-coded to work. It would seem that for a complete IP solution we need all the parts in the network and a way to control them.
I hear the IP Pavilion is getting boxed up and floating across the pond to NAB NY. And more to come by NAB 2018. I remain ever hopeful.
Back to my colleague – his metaphor was “IP is the box car, the network is the freight train, and video and audio are the cattle”. I guess NMOS is the rail system switch control.
Editor’s Note: Gary Olson has a book on IP technology, “Planning and Designing the IP Broadcast Facility – A New Puzzle to Solve”, which is available at bookstores and online.