The Sponsors Perspective: Metaverse - The Next Chapter Of Broadcast
The broadcast industry has been evolving towards greater immersion since its inception. As technology advances, so do the capabilities of broadcasters and content creators to bring audiences inside their stories.
Today’s immersive technologies and methodologies span the entire content pipeline from creation to delivery to consumption. They include 360-video capture, volumetric video capture, and spatial audio in content creation; augmented reality (AR) and virtual reality (VR) headsets for extended reality (XR) content distribution; and speech AI and data-driven visualization for interactivity during content consumption.
These technologies combine to add another layer to the viewing experience. This evolution of immersion in broadcast is taking us closer to the metaverse - the 3D Internet - and its next step is a shared virtual world. This is where audiences are headed, as examples from other parts of the media and entertainment industry, from gaming to advertising to film, already show.
Technologies That Enable The Metaverse
Delivering experiences to audiences in a 3D Internet requires using several technologies, many of which are already in use by media companies today.
Accelerated Computing
At its core is accelerated computing to process vast quantities of data generated in creating, delivering and consuming the virtual experience.
The Dell Precision 7865 workstation combined with the NVIDIA RTX™ 6000 Ada Generation GPU delivers impressive computing capabilities ideal for content creation and dissemination in the metaverse. This computational powerhouse boasts up to 64 cores of processing power via its AMD Ryzen Threadripper PRO CPU and comes equipped with up to two RTX 6000 professional graphics cards. This provides the computational resources to render high-quality, detailed 3D graphics, to train and deploy AI models, and to encode video for multiple platforms, including head-mounted devices. The Precision 7865 comfortably exceeds the demanding performance requirements of these workflows.
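To make the encoding part of that workload concrete, the sketch below drives FFmpeg's GPU-accelerated NVENC encoder from Python to produce two delivery renditions from a single master. It is a minimal illustration only; the file names, resolutions and bitrates are assumptions, not figures from this article.

```python
# Minimal sketch: GPU-accelerated encode of a master file into two delivery
# renditions using FFmpeg's NVENC encoder (h264_nvenc). File names, resolutions
# and bitrates are illustrative assumptions.
import subprocess

RENDITIONS = [
    ("delivery_1080p.mp4", "1920x1080", "8M"),   # conventional flat playback
    ("delivery_hmd.mp4",   "3840x2160", "25M"),  # higher-bandwidth headset target
]

for output, resolution, bitrate in RENDITIONS:
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", "studio_master.mp4",   # hypothetical mezzanine/master file
            "-c:v", "h264_nvenc",        # offload encoding to the NVIDIA GPU
            "-b:v", bitrate,
            "-s", resolution,
            output,
        ],
        check=True,
    )
```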
Content creators in this space need workstations that can manage vast amounts of data and deliver the processing power demanded by the complex tasks of creating immersive virtual environments, avatars, and interactive experiences. The Precision 7865 workstation equipped with the RTX 6000 is more than up to the task. Its fast processing and rendering capabilities enable content delivery that is both quick and efficient, ensuring that users have a seamless and immersive experience.
Extended Reality (XR)
The Precision 7865 workstation supports a full range of professional NVIDIA graphics cards, making it capable of running high-resolution VR software and headsets. To provide customers with the best experience, Dell offers the Ready for VR program, ensuring customers choose a workstation and professional graphics card capable of driving superior VR experiences.
CloudXR, NVIDIA’s streaming technology, delivers VR and AR across 5G and Wi-Fi networks. Built on NVIDIA RTX technology, CloudXR is fully scalable for data center and edge networks.
With the CloudXR SDK, extended reality content from OpenVR applications can be streamed to Android and Windows devices, dynamically adjusting to network conditions for maximum image quality and frame rates. This frees users from traditional VR and AR confines, streaming complex experiences from remote servers across 5G and Wi-Fi networks to any device, wirelessly.
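The CloudXR SDK itself is a native client library rather than Python, but the adaptation principle described above - trading stream bitrate and resolution against measured network conditions to protect frame rate - can be sketched generically. The Python sketch below is an illustrative model of that feedback loop, not the CloudXR API; the rendition ladder, headroom factor and function names are all hypothetical.

```python
# Illustrative model of network-adaptive XR streaming: pick the highest-quality
# rendition the measured link can sustain while keeping a safety margin.
# This is NOT the CloudXR API; names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Rendition:
    label: str
    width: int
    height: int
    bitrate_mbps: float

LADDER = [
    Rendition("low",    1440,  720, 15.0),
    Rendition("medium", 2160, 1080, 30.0),
    Rendition("high",   2880, 1440, 60.0),
]

def choose_rendition(measured_mbps: float, headroom: float = 0.8) -> Rendition:
    """Return the best rendition that fits within a safety margin of the link."""
    usable = measured_mbps * headroom
    best = LADDER[0]
    for r in LADDER:
        if r.bitrate_mbps <= usable:
            best = r
    return best

# Example: a Wi-Fi link measured at 45 Mbps keeps the stream at the medium tier.
print(choose_rendition(45.0).label)  # -> "medium"
```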
Platform For Metaverse Applications
NVIDIA Omniverse Enterprise is a scalable, end-to-end platform enabling enterprises to build and operate metaverse applications. Omniverse is a real-time, large-scale virtual world simulation engine and a computing platform that enables 3D designers and teams to better connect and build custom 3D content creation pipelines.
Omniverse addresses the entire scope of today’s 3D workflows across industries, whether for building 3D assets and worlds or operating digital twins. Omniverse Enterprise is open and interoperable, built on Universal Scene Description (USD) – an open 3D framework and the foundation of 3D worlds – and on the Material Definition Language (MDL). It is also easily extensible and customizable: customers can inspect, tweak, customize, and build on the apps and extensions NVIDIA offers as source.
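Because Omniverse is built on USD, its scenes can be authored and inspected with the open-source USD Python API. The snippet below is a minimal sketch of authoring a tiny USD stage of the kind Omniverse consumes; the prim names and dimensions are illustrative assumptions, not details from this article.

```python
# Minimal sketch: author a small USD scene for a virtual studio set.
# Requires the open-source USD Python bindings (pip install usd-core);
# prim paths and sizes are illustrative.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("virtual_studio.usda")
UsdGeom.Xform.Define(stage, "/World")

# A placeholder anchor desk, positioned at the centre of the set.
desk = UsdGeom.Cube.Define(stage, "/World/AnchorDesk")
desk.GetSizeAttr().Set(1.5)
UsdGeom.XformCommonAPI(desk.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.75, 0.0))

stage.GetRootLayer().Save()
```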
It scales from workstations, including the Precision 7865, to the data center and the cloud – and Omniverse performance scales with the compute you give it. The platform can be deployed across hybrid infrastructure and will soon be accessible from anywhere.
Avatars
Intelligent, lifelike avatars are a critical component of the broadcast metaverse, heightening audience engagement in this new environment. Examples of interactive avatars in broadcast include virtual news anchors, commentators and meteorologists delivering the news from a virtual studio; virtual hosts guiding participants through an event; and virtual celebrities and television personalities interacting with fans about what they have just watched.
NVIDIA Omniverse Avatar Cloud Engine (ACE) offers the fastest and most versatile solution for creating interactive avatars and digital human applications at scale. Broadcasters can leverage ACE to animate 2D or 3D characters and to give them the ability to speak and interact with users. The ACE end-to-end avatar development suite enables seamless integration and deployment, allowing broadcasters to build, configure, and deploy avatar applications across any engine in any public or private cloud.
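To make that pipeline concrete, the sketch below models the basic interaction loop an ACE-style avatar application follows: speech recognition, dialogue generation, then speech synthesis and animation. The helper functions are hypothetical stand-ins, not ACE API calls.

```python
# Illustrative interaction loop for an interactive broadcast avatar:
# audio in -> transcript -> dialogue response -> synthesized speech + animation.
# The three service functions are hypothetical placeholders, not ACE APIs.

def transcribe(audio_chunk: bytes) -> str:
    """Speech AI (ASR): viewer audio to text. Placeholder implementation."""
    return "What was the score in the second half?"

def generate_reply(transcript: str, context: dict) -> str:
    """Dialogue model: produce the avatar's answer from the question and show context."""
    return f"In the second half, {context['home']} outscored {context['away']} 14 to 3."

def speak_and_animate(reply: str) -> None:
    """TTS plus facial animation: drive the avatar's voice and lip sync."""
    print(f"[avatar] {reply}")

def interaction_step(audio_chunk: bytes, context: dict) -> None:
    question = transcribe(audio_chunk)
    reply = generate_reply(question, context)
    speak_and_animate(reply)

# Example turn with a dummy audio buffer and match context.
interaction_step(b"\x00" * 1024, {"home": "the home side", "away": "the visitors"})
```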
Artificial Intelligence
AI is one of the cornerstones of the broadcast metaverse. AI is woven into many of the applications mentioned above, but it deserves a specific call out. Speech AI, computer vision, recommendation engines, and more come together to accelerate content production, increase the accessibility of content as it gets distributed, and make content more personalized in its consumption. These technologies will allow audiences to navigate through virtual environments, interact with avatars, request the content they want to see or ask for recommendations, and more.
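As one small example of the recommendation side of this, the sketch below ranks catalogue items against a viewer's interest profile using cosine similarity over feature vectors. The titles, genre features and weights are invented purely for illustration.

```python
# Tiny content-recommendation sketch: rank items by cosine similarity between
# a viewer profile vector and per-item feature vectors (titles/genres invented).
import numpy as np

# Feature order: [sport, news, drama, music]
CATALOGUE = {
    "Match Highlights": np.array([0.9, 0.1, 0.0, 0.0]),
    "Evening Bulletin": np.array([0.1, 0.9, 0.1, 0.0]),
    "Concert Special":  np.array([0.0, 0.0, 0.2, 0.9]),
}

def recommend(profile: np.ndarray, top_k: int = 2) -> list[str]:
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(CATALOGUE, key=lambda t: cosine(profile, CATALOGUE[t]), reverse=True)
    return ranked[:top_k]

# A viewer who mostly watches sport, with a little news.
print(recommend(np.array([0.8, 0.3, 0.0, 0.1])))
```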
Cindy Olivo - Global Media and Entertainment Marketing Manager - Dell Technologies (left) and Sepi Motamedi - Global Broadcast Industry Marketing and Strategy - NVIDIA (right).
The Time To Build Towards The Metaverse Is Now
With its immersive and interactive nature, the metaverse represents the next chapter for content delivery and consumption in broadcast. With the increasing popularity of virtual and augmented reality and interactive visualizations, viewers are looking for immersive experiences beyond traditional TV and streaming. By creating virtual worlds and interactive avatars, broadcasters can further engage existing audiences, expand their reach to new demographics, and create new revenue streams.
As the metaverse becomes more accessible, it can revolutionize entertainment. Broadcasters must be at the forefront of this change to stay in step with audiences in the ever-evolving media landscape.