Navigating the Many Layers of the ATSC 3.0 Ecosystem: Part 1

ATSC 3.0 is slowly rolling out in the form of actual on-air tests at multiple sites. This staged launch gives engineers the perfect opportunity to better understand this complex technology before equipment purchase decisions must be made. This primer peels back the multiple layers of ATSC 3.0 and explains what each layer does and how they all work together to create a high-quality signal that supports multiple formats and new business models.

ATSC 3.0 brings the promise of pristine video and sound, along with advanced applications from second screen to targeted advertising to over-the-air broadcasters. Transitioning to ATSC 3.0 will not be a trivial feat, and it is unlikely that anyone will “turn on” every feature it has to offer on day one.

While ATSC 3.0 seems complex, it’s important to recall that ATSC 1.0 was the same for yesteryear’s NTSC experts. Therefore, it’s helpful to begin with a simpler viewpoint and focus on getting audio, video and data through the system first. Then we can identify where future expansion opportunities lie with the use of advanced codecs, 4K/HDR, immersive audio, mobile and second screen experiences, interactive VoD, targeted advertising and other advanced services through ATSC 3.0.

In part one of this two-part series, we’ll explore the myriad layers of ATSC 3.0 as a means of understanding how these pieces come together to enable new services.

Unidirectional Delivery

ATSC 3.0 is a unidirectional RF delivery system that offers a great deal of flexibility and efficiency. It delivers more bits, inherent Single Frequency Network (SFN) operation and Physical Layer Pipes (PLP) for simultaneous fixed and mobile service in the same RF channel, with enhanced robustness. It’s fully abstracted from the payload for future extensibility. The broadband stack is a typical network topology for the associated physical layers, including Ethernet, Fiber, Wi-Fi and Mobile.

There are a number of layers to understand:

  • Link-Layer Protocols optimize packaging for a particular physical layer. For example, ALP (ATSC Link-Layer Protocol) optimizes high-efficiency delivery over RF.
  • ALPTP (ALP Transport Protocol) maintains the relationships of certain components when the ALP packaging is separate from the scheduler/framer, such as ALP to PLP structures and associated metadata.
  • The network layer uses the same IP (Internet Protocol) found in any other network, handing packets between the physical layer and the appropriate session layer (in this case, from the ATSC or broadband physical layers to UDP or TCP).
  • The ATSC 3.0 transport layer uses UDP (User Datagram Protocol) for unicast or multicast broadcast operation, whereas broadband delivery uses TCP (Transmission Control Protocol) over a bidirectional connection.
  • The session layers, such as MMTP, ROUTE and HTTP, describe the source and destination relationship for IP transport:
    • MMTP (MPEG Media Transport Protocol) is optimized for linear broadcast delivery, and supports many innovative and dynamic new features such as dynamic ad insertion or content changes.
    • ROUTE (Real-time Object Delivery over Unidirectional Transport) brings higher flexibility for Digital Video Recorder (DVR) functions, Video on Demand (VOD), and Non-Real-Time (NRT) delivery functions.
    • HTTP (Hyper-Text Transfer Protocol) is for delivering services between a client and server over an IP network.
  • Service management layers (presentation and application) are where the remaining decoding takes place for media players and consumption of streams, VOD, and other Non-Real-Time (NRT) data.
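As a toy illustration of this layering, the successive wrapping from media segment down to ALP can be sketched in a few lines of Python. The header layouts here are simplified placeholders, not the real bit syntax of any of these protocols:

```python
# Toy illustration of the ATSC 3.0 broadcast-side encapsulation chain.
# Each wrap() adds a 1-byte label and a 2-byte length (simplified headers).

def wrap(label: bytes, payload: bytes) -> bytes:
    """Prefix a payload with a toy label + length header."""
    return label + len(payload).to_bytes(2, "big") + payload

segment = b"<moof+mdat media segment>"   # DASH segment / MPU fragment
route   = wrap(b"R", segment)            # ROUTE (or MMTP) session layer
udp_ip  = wrap(b"U", route)              # UDP/IP transport + network layers
alp     = wrap(b"A", udp_ip)             # ATSC Link-Layer Protocol
# The ALP packets are then scheduled into baseband packets and PLPs.

print(len(alp) - len(segment))  # bytes of toy overhead added: 9
```

Every real layer adds considerably more than three bytes, but the nesting pattern is the same: each stage treats everything above it as an opaque payload.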

Even with the signaling removed, the stack of layers still carries plenty of overhead, passing through roughly seven to eight steps of packaging, encapsulation and wrapping. However, the primary goal was to build the future of TV, and because the popular IP/OTT delivery model is still rapidly evolving, the ATSC followed this topology to embrace future extensibility. That extensibility is the reason for the layers of overhead.

The beauty of ATSC 3.0, however, is that the broadcaster can use all of these stacks together to provide high value to both the broadcaster, through enhanced revenue opportunities, and the end user, through a rich and personalized experience.

Basic studio workflow.

Components and Functions

Vendor equipment will vary with respect to which functions it supports. A typical ATSC 3.0 deployment has two main locations: the studio and the transmitter site. The interface between the two will most likely be the Studio to Transmitter Link Tunneling Protocol (STLTP) carried over SMPTE ST 2022-1, and the interface to the end user will be ATSC 3.0 RF. Implementations may vary due to RF delivery configurations such as channel bonding or SFN operation, and due to equipment functionality.

The goal is to have everything connected via IP and use common routers, switches and hubs to communicate between broadcast equipment. High Efficiency Video Coding (HEVC) holds the highest promise of bitrate versus quality, and is required for 4K/UHD delivery. Scalable High Efficiency Video Coding (SHVC) is also an option to maximize delivery options to a variety of end user equipment.

Dolby AC-4 and MPEG-H are the best options for immersive and personalized audio capabilities. Channel support includes 2.0 stereo, 5.1 and 7.1 surround, plus four additional object audio profiles outlined in the A/342 standard. Broadcasters have the benefit of structuring groups of audio elements into audio program components consisting of complete main mixes, music and effects submixes, dialog-only submixes, video description services, and other audio feeds and mixes, delivered in one or more audio elementary streams.
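One way to picture these presentation structures is as selectable groupings of shared audio elements. The class and field names below are hypothetical, not taken from the A/342 syntax:

```python
from dataclasses import dataclass, field

# Hypothetical model of audio presentations built from substreams.
# Names are illustrative only, not the standard's actual element syntax.

@dataclass
class AudioElement:
    name: str
    kind: str       # e.g. "music_effects", "dialog", "description"
    channels: str   # e.g. "2.0", "5.1", "7.1"

@dataclass
class Presentation:
    label: str
    elements: list = field(default_factory=list)

m_and_e   = AudioElement("M&E", "music_effects", "5.1")
dialog_en = AudioElement("English dialog", "dialog", "2.0")
dialog_es = AudioElement("Spanish dialog", "dialog", "2.0")

# Two viewer-selectable presentations share the same music & effects submix:
english = Presentation("English 5.1", [m_and_e, dialog_en])
spanish = Presentation("Spanish 5.1", [m_and_e, dialog_es])

print(len({id(e) for p in (english, spanish) for e in p.elements}))  # 3
```

The key point is that a single M&E submix is carried once and reused by every language presentation, rather than encoding a full mix per language.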

Caption and subtitle technology is based on SMPTE Timed Text (SMPTE-TT). The Timed Text Markup Language (TTML) standard covers support for worldwide languages, characters, images and symbols, in addition to High Dynamic Range (HDR) and Wide Color Gamut (WCG).
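A minimal TTML caption document can be built with Python's standard library; the attributes here are kept to the bare minimum, whereas real IMSC profiles carry considerably more styling and region metadata:

```python
import xml.etree.ElementTree as ET

# Build a minimal TTML-style caption document programmatically.
TT = "http://www.w3.org/ns/ttml"
ET.register_namespace("", TT)   # serialize TTML as the default namespace

tt   = ET.Element(f"{{{TT}}}tt", {"xml:lang": "en"})
body = ET.SubElement(tt, f"{{{TT}}}body")
div  = ET.SubElement(body, f"{{{TT}}}div")
p    = ET.SubElement(div, f"{{{TT}}}p", {"begin": "0.0s", "end": "2.5s"})
p.text = "Hello, NextGen TV."

print(ET.tostring(tt, encoding="unicode"))
```

The `begin`/`end` timing attributes are what make the markup "timed text": each paragraph carries its own presentation window.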

The next layer of encapsulation includes ROUTE and/or MMTP headers. ROUTE is used to map media segments and their descriptors contained in the Media Presentation Description (MPD) between the UDP transport layer and a common DASH presentation layer via a unidirectional HTTP proxy. ROUTE allows for abstracting various audio, video, and data tracks as well as segments.

MMTP is used to map media segments and their descriptors contained in the Media Processing Unit (MPU) between the UDP transport layer and an MPU presentation layer. At this stage the media/payload is also encapsulated with UDP/IP to allow communication with other broadcast equipment over an IP network.
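Both session protocols ultimately hand media objects to UDP as datagrams that a receiver must reassemble with no return channel. A toy sketch of that ROUTE-style object delivery follows; the field sizes are illustrative, not the real LCT header layout:

```python
# Toy ROUTE-style object delivery: a media segment is split into datagrams,
# each tagged with an object id (TOI) and byte offset, so a receiver can
# reassemble it unidirectionally, even if packets arrive out of order.

MTU = 1400  # payload bytes per datagram (typical Ethernet-safe size)

def packetize(toi: int, obj: bytes, mtu: int = MTU) -> list:
    pkts = []
    for off in range(0, len(obj), mtu):
        hdr = toi.to_bytes(4, "big") + off.to_bytes(4, "big")
        pkts.append(hdr + obj[off:off + mtu])
    return pkts

def reassemble(pkts: list) -> bytes:
    chunks = {}
    for p in pkts:
        off = int.from_bytes(p[4:8], "big")
        chunks[off] = p[8:]
    return b"".join(chunks[off] for off in sorted(chunks))

segment = bytes(range(256)) * 20           # 5120-byte dummy media segment
pkts = packetize(toi=1, obj=segment)
assert reassemble(list(reversed(pkts))) == segment  # order-independent
print(len(pkts))                           # 4 datagrams
```

Real ROUTE adds FEC and signaling on top of this, but the offset-tagged, connectionless delivery model is the essential difference from TCP-based HTTP streaming.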

Next comes encapsulation of the ATSC Link-Layer Protocol (ALP). A multicast IP address is reserved along with a set of ports to map particular ALP packet streams to specific Physical Layer Pipes (PLPs). The ALP process will enforce optimal segmenting/packaging for efficient RF broadcasting.
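Operationally, that reservation amounts to a lookup table from destination address and port to pipe. The addresses, ports and PLP assignments below are hypothetical:

```python
from collections import defaultdict

# Hypothetical (multicast address, UDP port) -> PLP id table, as a
# scheduler/framer might use to steer ALP packet streams into pipes.
PLP_MAP = {
    ("239.255.10.1", 30000): 0,   # robust PLP: signaling + audio
    ("239.255.10.2", 30001): 1,   # high-capacity PLP: fixed UHD video
    ("239.255.10.3", 30002): 2,   # rugged PLP: mobile simulcast
}

def route_to_plps(packets):
    """Bucket (destination, payload) tuples into per-PLP queues."""
    queues = defaultdict(list)
    for dest, payload in packets:
        queues[PLP_MAP[dest]].append(payload)
    return queues

pkts = [(("239.255.10.2", 30001), b"video"),
        (("239.255.10.1", 30000), b"sig")]
q = route_to_plps(pkts)
print(sorted(q))  # [0, 1]
```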

Through an adaptation module, the input formatting will decompose and separate the timing, control and management information from the content, and will build signaling information from each packet stream to pass to the scheduler. IP header overhead is compressed via Robust Header Compression (ROHC), and the content is passed to the baseband formatter. The scheduler provides control and management to the baseband formatter, which handles baseband packet construction, header addition, packet scrambling and baseband framing.
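A back-of-envelope comparison shows why ROHC is worth the trouble. The uncompressed IPv4+UDP header cost is fixed at 28 bytes; the compressed size below is an assumption, since real ROHC gains vary with the compressor's context state:

```python
# Why Robust Header Compression (ROHC) matters: the static and predictable
# IP/UDP header fields need not be re-sent in full with every packet.

IPV4_UDP_HEADER = 20 + 8   # bytes per packet, uncompressed (fixed by spec)
ROHC_HEADER     = 3        # typical compressed size (assumption)
PAYLOAD         = 1400     # bytes of media per packet

def overhead(header: int, payload: int) -> float:
    """Fraction of the transmitted bytes spent on headers."""
    return header / (header + payload)

print(f"{overhead(IPV4_UDP_HEADER, PAYLOAD):.2%}")  # ~1.96%
print(f"{overhead(ROHC_HEADER, PAYLOAD):.2%}")      # ~0.21%
```

Roughly two percent of a channel sounds small, but reclaimed across every packet of every PLP it is meaningful capacity on a fixed-bandwidth RF channel.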

The output of this section will be framed baseband packets for each individual PLP. The baseband PLPs are encapsulated into an IP layer along with timing, scheduling, and management data.

PLPs offer increased protection and isolation for the signal. Most importantly, each PLP can have its own modulation configuration and forward error correction. For example, one PLP can use MMTP to deliver 4K to fixed receivers, while another uses ROUTE/DASH with Single Frequency Network (SFN) operation and channel bonding to reach mobile receivers. A single PLP can carry multiple services, each with the same robustness.
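A pair of PLP profiles along those lines might be configured as below; the modulation and code-rate values are plausible illustrative choices, not a validated link budget:

```python
from dataclasses import dataclass

# Illustrative per-PLP physical layer settings: a high-capacity pipe for
# fixed receivers and a rugged, low-rate pipe for mobile receivers.

@dataclass
class PLPConfig:
    plp_id: int
    service: str
    modulation: str   # constellation
    code_rate: str    # LDPC code rate

profiles = [
    PLPConfig(0, "Fixed 4K (MMTP)",        "256QAM", "10/15"),
    PLPConfig(1, "Mobile HD (ROUTE/DASH)", "QPSK",   "4/15"),
]

for p in profiles:
    print(p.plp_id, p.service, p.modulation, p.code_rate)
```

The trade-off is the usual one: the dense constellation and high code rate buy capacity for fixed antennas, while the sparse constellation and heavy coding buy ruggedness for moving receivers, all within one RF channel.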

ATSC 3.0 transmitter site workflow.

The media and data components are now fully encapsulated into their individual ALP containers and packetized into individual PLP structures. For transport to a remote transmitter site, the individual PLPs will be multiplexed and encapsulated into the Studio to Transmitter Link Tunneling Protocol (STLTP). The STLTP is used to traverse IP links and carries three distinct types of data:

  • Baseband Packet Streams, the data from each individual PLP
  • Preamble data packets derived from the scheduling process outcome and used both to populate emitted Preambles and to control Exciter configurations
  • Timing and Management Data packets, used to control transmitter synchronization, provide physical layer frame identification, and carry other management and control information in an SFN configuration
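A toy multiplex and demultiplex of those three stream types looks like the following; the 1-byte tag is illustrative, since real STLTP uses RTP-based encapsulation:

```python
# Toy STLTP-style multiplex: the three tunneled stream types are tagged,
# interleaved onto the studio-to-transmitter link, then split back out
# at the transmitter site.

TYPES = {0: "baseband", 1: "preamble", 2: "timing_mgmt"}

def mux(streams: dict) -> list:
    """Tag each packet with its stream type and interleave onto one link."""
    return [bytes([t]) + pkt for t, pkts in streams.items() for pkt in pkts]

def demux(link: list) -> dict:
    """Split the tunneled link back into per-type packet streams."""
    out = {name: [] for name in TYPES.values()}
    for pkt in link:
        out[TYPES[pkt[0]]].append(pkt[1:])
    return out

link = mux({0: [b"plp0", b"plp1"], 1: [b"pre"], 2: [b"t&m"]})
print({k: len(v) for k, v in demux(link).items()})
# {'baseband': 2, 'preamble': 1, 'timing_mgmt': 1}
```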

The STL Receiver will, within its error correction capabilities, correct any issues with the stream coming into the transmitter site. The PLP demultiplexer will decompose the stream back to individual PLPs and ALP streams.

In the second part of this feature, we will explore best practices for signal verification and compliance across the ATSC 3.0 ecosystem. Look for Part 2 in July.

Ted Korte, COO, Qligent
