Vendor Content.

Implementing & Operating A Virtual Production System For Broadcast

Our partner ROE Visual discusses how, like any production engineering project, creating and operating a successful virtual production system is all about methodical planning, configuration & testing.
Virtual Production (VP) merges traditional broadcast techniques with real-time technology, blending live-action with computer-generated imagery (CGI) to create immersive sets in real time. While technologies like LED volumes, real-time rendering, and live camera tracking are essential, the true innovation lies in integrating visual effects throughout the entire production process, beginning in pre-production. This approach ensures high-quality effects and imagery are part of the project from the start.

News shows, sports channels (like Sky Sports and ESPN), and election coverage (such as the BBC's) have been early adopters of VP. But what does this technology involve?

At its core, VP consists of an intricate system comprising LED displays, LED processing, content sources, render nodes, set extensions, camera tracking, a sync pulse generator (SPG), and synchronized playback. This guide provides a technical overview for broadcast technicians, covering the essential aspects of setting up and operating a VP system. The technical complexity of a virtual production eco-system is considerable, requiring multiple technologies to work in perfect harmony; hiring an expert and thoroughly testing your setup is strongly advised.

LED Display Panels

LED panels are the cornerstone of virtual production, replacing traditional green screens with dynamic, real-time backdrops. To ensure your setup looks great on camera, consider the following specifications:

  • Pixel Pitch: The distance between the center of two adjacent pixels. Standard pixel pitches for VP range from 1.5mm to 2.6mm, balancing resolution and cost.
  • Refresh Rate: The frequency at which the LED screen refreshes its image; for this application, typically 7,680Hz. A higher refresh rate is crucial to avoid visible scan lines and flickering, especially at lower brightness levels, ensuring a smooth and consistent visual experience.
  • Frame Rate Support: The responsiveness of an LED panel impacts the fluidity of motion in video content. High-quality panels that support high frame rates reduce motion blur, enhancing overall visual clarity.
  • Color Consistency: Each LED must be calibrated to meet specific color standards, ensuring accurate and consistent color reproduction, which is vital in broadcast or film productions. The most critical parameters are the specific wavelengths of the red, green, and blue emitters, brightness (efficiency), stability, black level, and viewing angle (no color shift). Quality LED panels, such as those from ROE Visual, are characterized by their uniformity, ensuring that every pixel emits light with the same intensity and color accuracy and contributing to a seamless visual output.
  • Contrast: The contrast ratio, a key determinant of image quality, is enhanced by the panel’s ability to deliver deep blacks and bright whites, creating a more immersive viewing experience.

The quality of an LED panel is foundational to achieving outstanding visual displays. It influences everything from color accuracy to motion fluidity, making it crucial to select panels that meet the highest standards for your production needs. When investing in display technology, recognizing the pivotal role of the LED panel in the overall technical setup is paramount.
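When planning a wall, the native resolution follows directly from its physical dimensions and pixel pitch, which in turn determines how many processor and render outputs are needed. The sketch below is a minimal illustration of that arithmetic; the wall dimensions and the 1.5mm pitch are hypothetical example values, not a recommendation:

```python
# Sketch: derive an LED wall's native resolution from its physical size
# and pixel pitch. The figures used below are illustrative examples.

def wall_resolution(width_m: float, height_m: float, pitch_mm: float):
    """Return (horizontal, vertical) pixel counts for a wall of the
    given physical size built from panels with the given pixel pitch."""
    px_w = round(width_m * 1000 / pitch_mm)
    px_h = round(height_m * 1000 / pitch_mm)
    return px_w, px_h

# A hypothetical 10 m x 4 m wall at 1.5 mm pitch:
w, h = wall_resolution(10.0, 4.0, 1.5)
print(f"{w} x {h} pixels")  # well beyond UHD width, so the canvas will
                            # span multiple processor/render outputs
```

The same calculation run the other way (fixing a target resolution and solving for pitch or wall size) is a quick budget check before panels are ordered.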

LED Processing

While the LED processor undoubtedly plays a crucial role in signal processing and optimizing content delivery, its effectiveness is maximized when paired with a high-quality LED panel.

LED processing translates video signals into data that controls each LED’s color and brightness. There are numerous LED controllers on the market, but when using a technically demanding setup, such as virtual production eco-systems, using a high-end system is strongly advised. LED processors like Brompton or Megapixel significantly enhance the capabilities of LED screens, providing advanced image processing, real-time adjustments, and critical synchronization features essential for professional-grade displays. Whether for live events, broadcast productions, or virtual production, these processors ensure that the LED screens deliver the highest quality visuals with maximum reliability and flexibility.

LED processing is responsible for the following:

  • Scaling and Mapping: LED processors manage the scaling and mapping of content to the LED panels, ensuring that the video source fits perfectly on the screen, whether it’s a standard resolution or a custom aspect ratio.
  • Color Calibration: LED processors are essential for calibrating the colors across the entire LED screen, ensuring uniform color reproduction and brightness. Some LED processors offer dynamic calibration, allowing real-time adjustments to the screen’s brightness and color balance based on the ambient lighting conditions or content requirements.
  • Low Latency: High-end LED processors are designed to minimize latency, ensuring that the content displayed on the LED screen is in sync with live events, making them ideal for broadcast environments.
  • Genlock Support: For virtual production environments, genlock is essential to ensure that the LED screen and the cameras are perfectly synchronized, preventing visual artifacts like tearing. LED processors handle this synchronization, ensuring all elements are in lockstep.
  • Shading & Gamma Adjustment: In virtual production, matching the on-screen content with the lighting and camera settings is critical. LED processors provide fine control over shading and gamma to match these elements perfectly.
  • Multi-Signal Processing: LED processors can handle multiple video inputs simultaneously, allowing for complex content management scenarios, such as Picture-in-Picture (PiP) or overlaying multiple video sources on the screen.
  • Error Detection & Correction: LED processors monitor the performance of the LED panels, detecting and correcting errors like dead pixels and ensuring the display remains flawless throughout its use.

Content Source & Render Nodes

In a VP setup, content is typically managed by a media server, sourced from pre-rendered assets, or dynamically generated by game engines like Unreal Engine or Unity. Various media server manufacturers offer specialized virtual production applications like VizRT, Disguise, or Pixera. Render nodes are computers that process and render the content from the source. For complex scenes, multiple render nodes may be required. All render nodes must be in sync to ensure consistent image display across the LED wall.
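When several render nodes share one wall, each node typically renders a fixed slice of the overall canvas, and those slices must tile the canvas exactly. The sketch below shows one simple way such per-node viewports could be computed; the canvas size and node count are hypothetical, and real media servers (Disguise, Pixera, etc.) handle this mapping internally:

```python
# Sketch: split a wall canvas into per-node viewports (vertical strips).
# Canvas size and node count are hypothetical examples; production
# media servers perform this mapping themselves.

def node_viewports(canvas_w: int, canvas_h: int, nodes: int):
    """Return a list of (x_offset, y_offset, width, height) viewports,
    one per render node, dividing the canvas into vertical strips.
    Any remainder pixels go to the last node."""
    base = canvas_w // nodes
    viewports = []
    for i in range(nodes):
        x = i * base
        w = base if i < nodes - 1 else canvas_w - x
        viewports.append((x, 0, w, canvas_h))
    return viewports

for vp in node_viewports(6667, 2667, 3):
    print(vp)
```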

Set Extensions & Camera Tracking

Set extensions involve expanding the physical set with CGI, blending seamlessly with the real environment through techniques like projection mapping. It’s crucial to align the camera’s field of view (frustum) with the virtual scene’s perspective. The focal length and depth of field must match the physical and virtual cameras for realistic depth perception.

Camera tracking is essential for aligning virtual content with the live camera's movement. Several camera tracking systems are on the market, such as Mo-Sys, Stype, Ncam, or Pixotope; these use either through-the-lens tracking or physical markers to detect the actual camera position and movement, aligning it with the correct 3D perspective, the so-called frustum. The tracking system sends data to the render engine, updating the virtual camera's position in real time. Calibration of the real-time rendering system ensures that 3D content is displayed correctly on the LED wall: camera position tracking calibrates the virtual camera to follow the physical camera's movements, rendering the scene from the correct viewpoint.
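For a flat wall, the width of the frustum where it meets the screen follows from simple trigonometry: half-width = distance × tan(half the horizontal field of view). A minimal sketch of that relation (the camera distance and FOV are made-up example values):

```python
import math

# Sketch: width of the camera frustum where it intersects a flat LED
# wall, given the camera's distance and horizontal field of view.
# The example values are hypothetical.

def frustum_width_on_wall(distance_m: float, hfov_deg: float) -> float:
    """Width (m) of the camera's view on a flat wall perpendicular to
    the camera axis at the given distance."""
    return 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)

# A camera 5 m from the wall with a 60-degree horizontal FOV:
print(round(frustum_width_on_wall(5.0, 60.0), 2))  # ~5.77 m
```

Dividing that width by the pixel pitch then tells you how many wall pixels the inner frustum covers, which is one way to sanity-check whether the wall resolution holds up at your closest shooting position.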

For curved LED walls, the render engine must know the exact geometry and position of the screen to make a perfect projection. Usually this is achieved by creating a LiDAR (Light Detection and Ranging) scan of the physical LED setup to map its geometry into the virtual environment. The resulting 3D point cloud of the LED wall is used to create or align the 3D model in the media engine, ensuring that content is projected accurately, adjusting the image projection to account for the wall's curvature, and guaranteeing consistent perspective.
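To illustrate the geometry involved, the radius of a uniformly curved wall can be recovered from as few as three scanned points using the circumradius formula R = abc / (4 × area). This is only a toy version of what a real workflow does (a full point-cloud fit across thousands of LiDAR samples); the points below are synthetic:

```python
import math

# Sketch: estimate a curved wall's radius from three points of a
# (hypothetical) top-down LiDAR scan, using the circumradius formula
# R = abc / (4 * area). A real workflow fits the full point cloud.

def curvature_radius(p1, p2, p3):
    """Radius of the circle through three 2D points (x, z), in meters."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # Twice the signed triangle area, via the cross product.
    cross = ((p2[0] - p1[0]) * (p3[1] - p1[1])
             - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    area = abs(cross) / 2.0
    return a * b * c / (4.0 * area)

# Three synthetic points sampled from a wall curved with a 7 m radius:
pts = [(-2.0, 7.0 - math.sqrt(49 - 4)), (0.0, 0.0),
       (2.0, 7.0 - math.sqrt(49 - 4))]
print(round(curvature_radius(*pts), 2))  # ~7.0
```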

SPG & Synchronization

The Sync Pulse Generator (SPG) and Genlock are critical for keeping all devices in sync. Genlock synchronizes the frame rate of the cameras with the LED panels, preventing screen tearing and stuttering. The SPG distributes a reference signal to ensure all devices operate at the same frame rate. Synchronization is vital for maintaining coherence between the LED wall and the camera, ensuring content is displayed and captured in sync. Timecode embeds time information in the video signal, keeping all elements in sync, while latency management minimizes the delay between camera movement and content display.
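Latency budgets are easier to reason about in whole frames than in milliseconds, since a genlocked chain quantizes delay to frame boundaries. A small sketch of that conversion (the delay figures are illustrative, not measured values):

```python
import math

# Sketch: express an end-to-end system delay as the whole number of
# frames it occupies at a given frame rate, as used when budgeting
# camera-to-wall latency. The delay figures are illustrative.

def delay_in_frames(delay_ms: float, fps: float) -> int:
    """Round a delay up to the whole number of frames it occupies."""
    frame_ms = 1000.0 / fps
    return math.ceil(delay_ms / frame_ms)

# e.g. a hypothetical 70 ms processing chain at 50 fps:
print(delay_in_frames(70.0, 50.0))  # 4 frames (each frame is 20 ms)
```

Knowing the delay in frames is exactly what the delay-calibration step later in this guide adjusts for, so the rendered content and the camera feed line up on the same frame.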


Figure 1 – The Roe Visual demo system at IBC 2023 was a good example of a typical virtual production system configuration.



Calibrating The Virtual Production Eco-system

Calibrating the eco-system involves aligning physical and virtual cameras to ensure correct content rendering from the camera’s perspective. This includes:

  • Lens Calibration: Matching the real camera lens with virtual lens settings, including focal length, aperture, and distortion.
  • Sensor Calibration: Ensuring the camera’s sensor data corresponds accurately to the virtual camera.

Synchronization & Artifact Prevention

All components must remain in perfect sync to maintain seamless visual effects, avoiding common issues like stuttering caused by synchronization errors between the camera, render engine, and LED wall. Moiré patterns can be minimized or avoided by adjusting the LED pixel pitch, camera distance, or camera settings.
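One widely quoted rule of thumb for moiré is to keep the camera at a minimum distance of roughly the pixel pitch in millimeters multiplied by a safety factor, read as meters. This is a heuristic only (lens choice, focus, and sensor resolution all influence the real result), and the factor of 3 below is an assumed example, not a specification:

```python
# Sketch: a common rule-of-thumb minimum camera distance to reduce the
# risk of moire: roughly pixel pitch (mm) x a safety factor, read as a
# distance in meters. Heuristic only; lens, focus and sensor resolution
# all influence the real result. The factor of 3 is an assumption.

def min_camera_distance_m(pitch_mm: float, factor: float = 3.0) -> float:
    """Heuristic minimum camera-to-wall distance in meters."""
    return pitch_mm * factor

for pitch in (1.5, 1.9, 2.6):
    d = min_camera_distance_m(pitch)
    print(f"{pitch} mm pitch -> keep the camera ~{d:.1f} m back")
```

On set, such a figure is only a starting point; the reliable test is to rack through your planned distances and focal lengths on camera and watch for interference patterns.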

Workflow Management

Testing your setup is crucial. Ensure that the entire workflow is optimized for real-time production, using pre-visualization tools to simulate the final output before shooting begins. Regularly monitor all components for latency, synchronization, and performance issues, and re-calibrate the system frequently, especially when changing camera lenses, reconfiguring the LED wall, or updating software.

A Step-By-Step Guide

1. Confirm LED Screen Configuration

Step: Verify the configuration of your LED screen through the LED processor.

Purpose: Ensure the LED screen’s resolution, pixel pitch, and other vital parameters align with the project’s requirements.

2. Configure Content Source Machine & Render Nodes

Step: Set up your content source machine and render nodes to output the video signal to the LED processor and screen. Integrate a 3D mesh of the LED screen within the Media Server or Render Engine configuration.

Purpose: Ensure the content is correctly mapped to the LED screen, accounting for the screen’s geometry.

3. Rig & Calibrate The Camera

Step: Rig the camera and route its video output to the input of the content source machine for calibration.

Purpose: Establish a connection between the camera and the content source for calibration and real-time rendering.

4. Set Up & Calibrate The Camera Tracking System

Step: Install and calibrate the camera tracking system to capture translational and rotational position data and lens encoder metadata.

Purpose: Enable accurate tracking of the camera’s position and movement to align virtual content with the physical camera view.

5. Route & Confirm Tracking System Data

Step: Ensure the tracking system data is correctly routed to the content and control machines as required.

Purpose: Provide the content machine with real-time position and movement data for accurate rendering.

6. Route & Confirm Genlock Synchronization

Step: Route and confirm genlock signals to all necessary devices, including the LED processor, media server/render engine, camera, and camera tracking system.

Purpose: Synchronize all devices to prevent visual artifacts and ensure smooth operation.

7. Calibrate The Camera In The Media Server/Render Engine

Step: Perform camera calibration within the Media Server or Render Engine using the camera’s video feed and tracking data. Spatial Calibration: Align the virtual and physical camera spaces. Delay Calibration: Adjust for latency between the camera feed and the rendered content. Lens Calibration: Match the real camera lens settings (e.g., focal length, distortion) with the virtual camera.

Purpose: Ensure the rendered content perfectly aligns with the physical camera’s perspective.
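Delay calibration is often done by flashing a test pattern and measuring how many frames later the flash appears in the camera feed. A minimal sketch of that idea, comparing two per-frame brightness traces (the traces below are synthetic, and real calibration tools automate this inside the media server):

```python
# Sketch: estimate the frame delay between rendered output and the
# camera feed by locating a test flash in both brightness traces.
# The traces below are synthetic examples.

def flash_delay_frames(rendered: list, captured: list) -> int:
    """Delay (in frames) between a flash in the rendered trace and the
    same flash appearing in the captured camera trace."""
    return captured.index(max(captured)) - rendered.index(max(rendered))

rendered = [0, 0, 0, 255, 0, 0, 0, 0, 0, 0]   # flash emitted at frame 3
captured = [0, 0, 0, 0, 0, 0, 0, 200, 0, 0]   # flash seen at frame 7
print(flash_delay_frames(rendered, captured))  # 4 frames of delay
```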

8. Implement Color Management

Step: Adjust color management across the entire workflow, from the rendered content to the camera input. Rendered Content: Ensure accurate color representation in the media server output. Media Server Output: Match colors as they are output from the media server. LED Processor Input: Fine-tune colors as the LED processor processes them. LED Output: Verify color accuracy on the LED screen. Camera Input: Adjust for any color shifts as captured by the camera.

Purpose: Achieve consistent and desired color reproduction throughout the entire visual pipeline.
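Gamma handling is one concrete piece of this pipeline: content is encoded with a transfer function at one stage and must be decoded with the same function at the next, or grey levels shift visibly. A simple sketch using a plain power-law gamma (the 2.4 exponent is an example value; real pipelines may use sRGB, PQ, or HLG transfer functions instead):

```python
# Sketch: a plain power-law gamma encode/decode pair, illustrating why
# every stage in the pipeline must agree on the transfer function.
# The 2.4 exponent is an example; real pipelines may use sRGB, PQ or HLG.

def encode(linear: float, gamma: float = 2.4) -> float:
    """Linear light (0-1) -> gamma-encoded signal (0-1)."""
    return linear ** (1.0 / gamma)

def decode(signal: float, gamma: float = 2.4) -> float:
    """Gamma-encoded signal (0-1) -> linear light (0-1)."""
    return signal ** gamma

lin = 0.18  # mid-grey in linear light
roundtrip = decode(encode(lin))
print(abs(roundtrip - lin) < 1e-9)  # matched gammas cancel out exactly
```

Decoding with a mismatched exponent (say, encoding at 2.4 but decoding at 2.2) does not cancel, which is exactly the kind of stage-to-stage disagreement this step is meant to catch.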

This step-by-step workflow ensures that all components in a virtual production system are correctly set up and calibrated, leading to seamless integration and high-quality visual output.

Conclusion

Implementing and operating a virtual production system requires a comprehensive understanding of the technology and components involved. This document provides a basic outline of how to integrate LED displays seamlessly into virtual production projects. Ensure you're well-informed by your suppliers, approach each detail in the process meticulously, and conduct thorough testing. While starting with virtual production can seem complex, the wealth of experience and knowledge available can guide you. Virtual production enhances creativity and production quality, streamlines workflows, and positions your team at the forefront of cutting-edge media technology.