NAB22 BEIT Sessions: ATSC 3.0, Web 3.0, And The Metaverse

What we’ve seen as ATSC 3.0 deploys and develops is just the tip of the NextGen TV iceberg.

Cloud security is a significant concern for everyone, including TV broadcasters. The most serious issues are hacking, phishing, denial of service (DoS) attacks, and other exploits that can occur at large scale on the web. TV broadcasters are looking to ATSC 3.0 as a way to get many of these web security issues under better control by decentralizing web distribution.

Lynn Rowe, Founder, President, and COO of AlchemediaSG LLC, moderated the “What's ATSC 3.0 Got to Do with Web 3.0 and the Metaverse?” session. He noted that in 1960 the industrial age began evolving into the information age with the introduction of mainframe computers and pocket calculators, later followed by PCs and the internet. The market shifted to the revolutionary digital age in 2007, the same year Apple introduced its first iPhone.

Relevant digital age issues in 2022 include strategic cooperation as a means of accelerating development and deployment, connected multi-purpose devices (the Internet of Things), and a sharing economy. The digital age has also brought AI, blockchain, autonomous vehicles, wearables, and the cloud.

TV broadcasting is an industry where shared strategic collaboration will be crucial for survival. The good news is that TV broadcasting is at a point where a shared digital smart grid makes sense. Rowe said the massive complexity, skills, and money needed to build a new infrastructure competitive with hyperscalers and telcos are beyond the reach of individual companies, TV networks, or stations.

He said IP-based ATSC 3.0 was the original Web 3.0. OTA TV has always been a public service, based on public airwaves, benefiting the public good. ATSC 3.0 is something of a reinvention of the original concept of OTA public service as the center of the business model. What could be better than helping people? The industry is beginning to call it the Broadcast Services Core.

3GPP And 5G

The discussion asked, “Are mobile telecommunications protocols from the 3rd Generation Partnership Project (3GPP), such as LTE and 5G, a threat to the broadcast industry?” ATSC 3.0 datacasting holds great promise as a public service. It needs to be a shared and open platform with a core backbone, including edge compute, that automation will extend to ATSC 3.0 stations. The broadcast community as a whole contributes to the broadcast services core; it isn’t a single station or company. The point of Web 3.0 is to decentralize the web infrastructure that has been built and is controlled by a few companies. The goal for TV broadcasters is to make the ‘best use’ of transmitted bits and to monetize them.

ATSC 3.0 opens up real-time communications with millisecond latency. LTE and 5G mobile communications can have seconds or even tens of seconds of latency, often depending on factors such as the number of viewers watching a stream. The latest Super Bowl stream was delayed by 30 seconds or more for some mobile viewers. Real-time ATSC 3.0 communications facilitate emergency broadcasts, live sports, and news without delay.

More Than A/V

ATSC 3.0 is capable of much more than just audio and video. The larger parts of the iceberg you don’t see or hear about are the multicasting, data, and metadata possibilities. The telco version of multicasting, evolved Multimedia Broadcast Multicast Service (eMBMS), is based on multipath TCP. Multipath TCP uses multiple paths to maximize throughput and increase redundancy, but TCP’s back-reporting structure was not made for video encoding or streaming. ATSC 3.0 instead uses multicast User Datagram Protocol (UDP), with bonding or similar automated split-switching built in, for optimum forward error correction.
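
To make the transport difference concrete, here is a minimal Python sketch of one-to-many delivery over multicast UDP. It only illustrates the underlying socket principle; the group address and port are hypothetical, and real ATSC 3.0 delivery layers ROUTE/MMTP signalling and forward error correction on top of UDP/IP rather than using raw sockets like this.

```python
import socket
import struct

GROUP = "239.255.0.1"   # hypothetical multicast group address
PORT = 5004             # hypothetical port

def send(payload: bytes) -> None:
    """Push one datagram toward every receiver joined to the group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(payload, (GROUP, PORT))
    sock.close()

def receive() -> bytes:
    """Join the group and wait for a datagram. Note there is no
    acknowledgement back to the sender, unlike TCP's back-reporting loop."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    data, _sender = sock.recvfrom(65535)
    sock.close()
    return data

if __name__ == "__main__":
    send(b"one transmission, any number of receivers")
```

The design point is that the sender’s work is the same whether one receiver or a million receivers have joined the group, which is why broadcast-style delivery scales without the per-viewer connections a TCP stream requires.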

In theory, ATSC 3.0 should be deeply integrated into all IP networks because it is the best live, linear, multipath, common file type distribution capability currently available.

Is Web3 Hype?

About a month ago, the NAB announced the formation of a Web3 Advisory Council. Web3, aka Web 3.0, is the idea of a new version of the internet based on decentralized blockchain and token-based economics. It extends the network of hyperlinked, human-readable web pages with machine-readable metadata about pages and how they relate to each other, enabling automated agents to access the web more intelligently and perform more work for users.
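
For readers who want “blockchain” pinned down, the following toy Python sketch shows the core mechanism: each block commits to the hash of the previous block, so no single party can quietly rewrite history. It is purely illustrative, not any production Web3 protocol or the advisory council’s work.

```python
import hashlib
import json
import time

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block whose hash covers its contents and its predecessor's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# A tiny chain: a genesis block followed by a hypothetical token transfer.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("token transfer: alice -> bob", chain[-1]["hash"]))

# Tampering with any earlier block changes its hash and breaks every later link.
for prev, cur in zip(chain, chain[1:]):
    assert cur["prev_hash"] == prev["hash"]
```

Decentralization comes from many independent parties holding copies of such a chain and agreeing on which version is valid, rather than trusting one central operator.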

Web1 was basic TCP/IP, where one source could distribute to potentially everyone on the web, and it was essentially read-only. Web2 introduced the sharing economy: still centralized, but more of a two-way street with the ability to share. Web3 is decentralized, where information can be distributed not only from one to many, but from one or more sources to a ‘specific tribe.’ A ‘tribe’ can be defined in many ways, from a group of social friends and game competitors to groups such as NAB Show visitors and virtual visitors, or the users of a digital currency.

Rick Champagne, Global Industry Strategy and Marketing, Media & Entertainment, with NVIDIA, discussed his view of Web3. He agrees the metaverse must be open and interoperable. For instance, NVIDIA introduced Omniverse for Media & Entertainment, which is open, interoperable, and based on Pixar’s Universal Scene Description (USD). USD is an easily extensible, open-source 3D scene description and file format developed by Pixar for content creation and interchange among different tools. NVIDIA has expanded upon USD by developing new tools, integrating technologies, and providing samples and tutorials.
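
As a small taste of what USD looks like in practice, here is a sketch using Pixar’s open-source USD Python bindings (the `pxr` module, available for example via the usd-core package). The file name and prim paths are hypothetical; the point is simply that a scene is authored as plain, tool-agnostic data that any USD-aware application, including Omniverse, can open.

```python
from pxr import Usd, UsdGeom

# Create a new, human-readable .usda stage (hypothetical file name).
stage = Usd.Stage.CreateNew("hello_omniverse.usda")

# Define a transform prim and a sphere beneath it.
xform = UsdGeom.Xform.Define(stage, "/World")
ball = UsdGeom.Sphere.Define(stage, "/World/Ball")

# Author an attribute on the sphere.
ball.GetRadiusAttr().Set(2.0)

# Save the layer; other USD-aware tools can now open and extend this scene.
stage.GetRootLayer().Save()
```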

NVIDIA built a sophisticated, fully playable, physics-based marble game in a lab that uses NVIDIA AI Ray and Path Tracing technology to illustrate the power of RTX on the Omniverse Platform. Courtesy NVIDIA via YouTube.

The Omniverse platform is for simulating the physical world in the virtual world in order to create a better physical world. It provides interoperability among 3D worlds and between the physical and virtual worlds.

In media & entertainment, Omniverse is a business-to-business solution for content collaboration. Individuals with unique skills from around the world can work together in real time to create and fine-tune fresh content.

Anything that is manufactured or built can first be simulated in a virtual world. Anything that moves can be made autonomous and simulated in a virtual world. For example, an autonomous car can be simulated for billions of hours across different roads, weather conditions, countries, seasons, times of day, and other scenarios, and what is learned can then be transferred to a real autonomous car.

Gaming In The Metaverse

Bjorn Book-Larsson is VP of Product for the Avatar Team at Roblox. His team builds the identity of everyone who plays on the platform and the tools that allow the community to shape their own identities. Book-Larsson has been in the gaming industry for more than 20 years and joined Roblox a couple of years ago. There he learned that Roblox isn’t a game, it’s a platform. The way it empowers people to make their own experiences is unique. That imposes some technical challenges, but it also creates engaging environments that people around the world can participate in, regardless of where the content was made.

Roblox as a platform can also be thought of as a broadcast medium because millions of creators make content that is then consumed by hundreds of millions of players every month. "We in our 'sandbox' are solving specific things which could lead one day to the larger metaverse," he said.

Book-Larsson played a video showing the variety of content on the Roblox platform and demonstrating how the platform is a world unto itself. Most of the content looked like powerful video game graphics. Part of the video showed the Lil Nas X Roblox concert, which attracted 37 million viewers over one weekend, more than the final episodes of Home Improvement or Family Ties.

Roblox isn’t a game; it is its own language and ecosystem, a kind of metaverse unto itself. Roblox has been in business for 17 years. About half the Roblox market is age 14 and under. They are the future NextGen TV users, although they may not use it to passively watch linear TV content.

The MILE

Millions of people can watch traditional linear video, but they can’t affect what happens. An online video game typically allows about 50-100 people to interact. Genvid created the Massively Interactive Live Event (MILE), which allows millions of people not only to watch an experience as it happens, but also to influence events in the game.

For example, Jacob Navok, Co-founder and CEO of Genvid Technologies, showed a video trailer for Rival Peak, an experimental show Genvid developed for Facebook Watch. The show follows 12 avatar contestants who must survive in the woods. The audience can make choices that affect the actions of the contestants, such as telling them what to do next.

Currently, there are 12 Rival Peak episodes posted on Facebook. Navok then presented another video trailer for PAC-MAN Community, which allows real-time play with others participating live, controlling an AI PAC-MAN with AI ghosts and creating custom mazes. Both titles are available to watch and play on Facebook and are easy to pick up.

The future of entertainment is literally changing before our eyes. Monetizing NextGen TV will keep TV broadcasters in the game.
