The Sponsors Perspective: Metaverse - The Next Chapter Of Broadcast

The broadcast industry has been evolving towards greater immersion since its inception. As technology evolves, so do the capabilities of broadcasters and content creators to bring audiences inside their stories.


This article was first published as part of Essential Guide: Broadcast And The Metaverse.

Today’s immersive technologies and methodologies span the entire content pipeline from creation to delivery to consumption. They include 360-video capture, volumetric video capture, and spatial audio in content creation; augmented reality (AR) and virtual reality (VR) headsets for extended reality (XR) content distribution; and speech AI and data-driven visualization for interactivity during content consumption.

These technologies combine to add a new layer to the viewing experience. This evolution of immersion in broadcast is taking us closer to the metaverse - the 3D Internet - where the next step is a shared virtual world. This is where audiences are headed, and we are already seeing examples in other parts of the media and entertainment industry, from gaming to advertising to film.

Technologies That Enable The Metaverse

Delivering experiences to audiences in a 3D Internet requires using several technologies, many of which are already in use by media companies today.

Accelerated Computing

At its core is accelerated computing to process vast quantities of data generated in creating, delivering and consuming the virtual experience.

The Dell Precision 7865 workstation combined with the NVIDIA RTX™ 6000 Ada Generation GPU delivers impressive computing capabilities ideal for content creation and dissemination in the metaverse. This computational powerhouse offers up to 64 cores of processing power via its AMD Ryzen Threadripper PRO CPU and can be equipped with up to two RTX 6000 professional graphics cards. This provides the computational resources to render high-quality, detailed 3D graphics, to train and deploy AI models, and to encode video for multiple platforms, including head-mounted devices. The Precision 7865 comfortably meets the demanding performance requirements of these workflows.

Content creators in this space need workstations that can handle vast amounts of data and provide the processing power required for the complex tasks of creating immersive virtual environments, avatars, and interactive experiences. The Precision 7865 workstation equipped with the RTX 6000 is more than up to the task. Its fast processing and rendering capabilities enable efficient content delivery, ensuring that users have a seamless and immersive experience.
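The encoding step mentioned above is one concrete place where that GPU horsepower is applied. The following is a minimal, illustrative sketch - not a prescribed workflow - that transcodes one master file into two delivery renditions using FFmpeg's NVENC encoders. It assumes an FFmpeg build with NVENC support and an available NVIDIA GPU; the file names are hypothetical.

```python
import subprocess

# Hypothetical file name; assumes an FFmpeg build compiled with NVENC
# support and an NVIDIA GPU visible to the driver.
SOURCE = "master_program.mov"

renditions = [
    # (output file, NVENC video encoder, resolution, video bitrate)
    ("delivery_1080p.mp4", "h264_nvenc", "1920x1080", "8M"),
    ("delivery_2160p.mp4", "hevc_nvenc", "3840x2160", "25M"),
]

for out_file, codec, size, bitrate in renditions:
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", SOURCE,      # master file to transcode
            "-c:v", codec,     # GPU-accelerated (NVENC) video encoder
            "-s", size,        # target resolution
            "-b:v", bitrate,   # target video bitrate
            "-c:a", "aac",     # re-encode audio for delivery
            out_file,
        ],
        check=True,
    )
```

Offloading the encodes to the GPU in this way leaves the CPU cores free for the rendering and AI workloads running alongside them.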

Extended Reality (XR)

The Precision 7865 workstation supports a full range of professional NVIDIA graphics cards, making it capable of running high-resolution VR software and headsets. To provide customers with the best experience, Dell offers the Ready for VR program, ensuring customers choose a workstation and professional graphics card capable of driving superior VR experiences.

CloudXR, NVIDIA’s streaming technology, delivers VR and AR across 5G and Wi-Fi networks. Built on NVIDIA RTX technology, CloudXR is fully scalable for data center and edge networks.

With the CloudXR SDK, extended reality content from OpenVR applications can be streamed to Android and Windows devices, dynamically adjusting to network conditions for maximum image quality and frame rates. This frees users from the traditional confines of tethered VR and AR, streaming complex experiences from remote servers to any device over 5G and Wi-Fi.

Platform For Metaverse Application

NVIDIA Omniverse Enterprise is a scalable, end-to-end platform that enables enterprises to build and operate metaverse applications. Omniverse is a real-time, large-scale virtual world simulation engine and a computing platform that enables 3D designers and teams to better connect and build custom 3D content creation pipelines.

Omniverse supports the entire scope of today’s 3D workflows, touching every industry, whether for building 3D assets and worlds or operating digital twins. Omniverse Enterprise is open and interoperable: it is built on Universal Scene Description (USD), an open 3D framework and the foundation of 3D worlds, and on the Material Definition Language (MDL). It is also easily extensible and customizable - customers can inspect, tweak, customize, and build upon the apps and extensions that NVIDIA offers as source.

It scales from workstations, including the Precision 7865, to the data center, to the cloud - and Omniverse performance scales with the compute available to it. The platform can be deployed across hybrid infrastructure, and will soon be accessible from anywhere.
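Because Omniverse scenes are described in USD, broadcast content pipelines can read and write the same scene files from any USD-aware tool. The short sketch below shows how a simple virtual set element might be authored programmatically; it assumes the open-source pxr Python bindings (the usd-core package), and the prim and file names are purely illustrative.

```python
from pxr import Usd, UsdGeom, Gf

# Author a minimal virtual-set scene as a USD layer; names are illustrative.
stage = Usd.Stage.CreateNew("virtual_studio.usda")

# A root transform for the set, plus a placeholder anchor desk.
world = UsdGeom.Xform.Define(stage, "/World")
desk = UsdGeom.Cube.Define(stage, "/World/AnchorDesk")
desk.GetSizeAttr().Set(2.0)                          # 2-unit cube as a stand-in
desk.AddTranslateOp().Set(Gf.Vec3d(0.0, 1.0, -3.0))  # place it in front of camera

stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()
```

Because the result is plain USD, the same file can be opened in Omniverse, a DCC application, or a render pipeline without conversion.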

Avatars

Intelligent, lifelike avatars are a critical component of the broadcast metaverse, heightening audience engagement in this new environment. Examples of how interactive avatars can be used in broadcast include virtual news anchors, commentators and meteorologists delivering the news from a virtual studio; virtual hosts guiding participants through an event; and virtual celebrities and television personalities interacting with fans about what they have just watched.

NVIDIA Omniverse Avatar Cloud Engine (ACE) offers the fastest and most versatile solution for creating interactive avatars and digital human applications at scale. Broadcasters can leverage ACE to animate 2D or 3D characters, and to give them the ability to speak and interact with users. The ACE end-to-end avatar development suite enables seamless integration and deployment, allowing broadcasters to build, configure, and deploy avatar applications across any engine and in any public or private cloud.

Artificial Intelligence

AI is one of the cornerstones of the broadcast metaverse. It is woven into many of the applications mentioned above, but it deserves a specific call-out. Speech AI, computer vision, recommendation engines, and more come together to accelerate content production, increase the accessibility of content as it is distributed, and make content more personalized in its consumption. These technologies will allow audiences to navigate virtual environments, interact with avatars, request the content they want to see or ask for recommendations, and more.
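As one concrete example of speech AI in that chain, the sketch below transcribes a viewer’s spoken request so it could be handed on to a recommendation or avatar service. It is a minimal sketch only, assuming the nvidia-riva-client Python package and a Riva ASR server already running at localhost:50051; the audio file name is hypothetical, and the configuration fields follow the Riva ASR protocol.

```python
import riva.client

# Connect to a Riva speech AI server (assumed to be running locally).
auth = riva.client.Auth(uri="localhost:50051")
asr_service = riva.client.ASRService(auth)

# Basic offline (batch) recognition config for English speech.
config = riva.client.RecognitionConfig(
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
)

# Hypothetical recording of a viewer's spoken request.
with open("viewer_request.wav", "rb") as fh:
    audio_bytes = fh.read()

response = asr_service.offline_recognize(audio_bytes, config)
transcript = response.results[0].alternatives[0].transcript
print(transcript)  # e.g. pass the text to a recommendation or avatar service
```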

Cindy Olivo - Global Media and Entertainment Marketing Manager - Dell Technologies (left) and Sepi Motamedi - Global Broadcast Industry Marketing and Strategy - NVIDIA (right).

The Time To Build Towards The Metaverse Is Now

With its immersive and interactive nature, the metaverse represents the next chapter for content delivery and consumption in broadcast. With the increasing popularity of virtual and augmented reality and interactive visualizations, viewers are looking for immersive experiences beyond traditional TV and streaming. By creating virtual worlds and interactive avatars, broadcasters can further engage existing audiences, expand their reach to new demographics, and create new revenue streams.

As the metaverse becomes more accessible, it can revolutionize entertainment. Broadcasters must be at the forefront of this change to stay in step with audiences in the ever-evolving media landscape.
