Sony Live Production Portfolio Now Supports IP Workflows And More
Sony continues to have the industry’s largest camera portfolio but has expanded into deploying and managing a variety of IP architectures.
While cameras continue to be its forte, Sony’s most recent virtual press conference made it abundantly clear that the company has gone to considerable lengths over the past few years to emerge as a comprehensive solutions provider that is no longer just helping customers make pretty pictures. Sharing those files, collaborative workflows and remote production are all part of the portfolio now.
Of course, this being a major Sony press conference, there were new camera and grading monitor announcements: the HDC-F5500 4K HDR model, with a built-in Super 35 mm 4K CMOS global shutter sensor, and the PVM-X3200 Trimaster monitor, both of which start shipping in December. However, this showing was more about how Sony has refocused its R&D, acquired Nevion, a leading-edge virtualized media production innovator, advanced its Ci media cloud platform to streamline file sharing, production and long-distance, high-speed transfer (via SRT), and has even begun dabbling in artificial intelligence (AI) after acquiring Hawk-Eye Innovations to create in-depth live sports analytics and other augmented reality graphics.
The moves were critical to keeping the company competitive in today’s growing IP-centric landscape.
“Sony’s commitment in the professional space remains on imaging technology, IP and Cloud workflows, and the integration of artificial intelligence,” said John Stoddert, Vice President, Sales & Marketing, Media & Sports (US) at Sony.
Traditional Imaging With A Twist
With live sports and entertainment TV shows now using shallow depth-of-field shooting to bring viewers closer to the action, Sony introduced the new HDC-F5500, which features a highly sensitive Super 35mm 4K CMOS global shutter image sensor and 12G-SDI output directly from the camera, enabling this close-up style of acquisition.
“There is a strong desire to integrate the ‘cinematic look’ into sports and entertainment productions to heighten emotions and enable more connected storytelling,” said Theresa Alesso, Pro Division President, Sony Electronics.
The HDC-F5500 features a motorized 8-step ND filter—used on Sony’s Venice digital motion picture camera—that can be controlled locally or remotely. Sony said it also allows for the precise selection of focus depth as well as controlled capture of fast-moving subjects even in bright lighting conditions. The new system camera also offers a wide color gamut and support for BT.2020, S-Gamut3/S-Gamut3.cine and HLG HDR processing.
In addition, the new camera provides numerous workflow enhancements for more efficient live production. The multiformat (4K/HD) HDC-F5500 integrates with Sony’s IP Live production system through the use of the HDCU-5000 series, which supports SMPTE ST 2110 and AMWA NMOS standards. The HDC-F5500 can utilize the company’s new IP extension adaptor (HDCE-TX50) for remote production or multi-camera flight pack “CCU-Less” operation. The HDC-F5500 is also compatible with Sony’s existing SR Live for HDR workflow with HLG and S-Log3.
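For context on what that ST 2110/NMOS integration involves in practice: the AMWA NMOS specifications referenced above (IS-04 for discovery and registration, IS-05 for connection management) are plain HTTP APIs, so a controller can look up senders in a registry and patch them onto receivers. The sketch below is a minimal illustration of that pattern against a hypothetical registry address and device IDs; it is not Sony’s control software, and everything beyond the published NMOS URL paths is an assumption.

```python
# Minimal sketch: listing ST 2110 senders via an AMWA NMOS IS-04 Query API
# and routing one to a receiver with the IS-05 Connection API.
# The registry address, hosts and IDs are hypothetical placeholders; the URL
# paths follow the published AMWA specifications, not any Sony-specific API.
import requests

REGISTRY = "http://nmos-registry.example.local"   # hypothetical registry (RDS)
QUERY = f"{REGISTRY}/x-nmos/query/v1.3"

def list_senders():
    """Return all senders currently registered with the registry."""
    return requests.get(f"{QUERY}/senders", timeout=5).json()

def connect(receiver_host, receiver_id, sender_id, sdp_url):
    """Stage and immediately activate a sender on a receiver (IS-05)."""
    staged = {
        "sender_id": sender_id,
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
        "transport_file": {
            "type": "application/sdp",
            "data": requests.get(sdp_url, timeout=5).text,
        },
    }
    url = (f"http://{receiver_host}/x-nmos/connection/v1.1/"
           f"single/receivers/{receiver_id}/staged")
    resp = requests.patch(url, json=staged, timeout=5)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for sender in list_senders():
        print(sender["id"], sender.get("label", ""))
```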
Aerial Videography
Sony also showed a preview of its new Airpeak S1 aerial drone system for its Alpha series of mirrorless cameras, which uses an onboard 3D spatial sensing system, powered by AI algorithms and five dedicated sensors, to help operators avoid obstacles and collisions.
Cloud Production And Distribution
In the area of cloud-based operations, Sony’s Ci scalable media cloud platform is gaining momentum within the industry for its ability to bring production teams together virtually, as if they were in the next room. This can include camera-to-cloud remote workflows, live streaming, new ways to execute cloud-based workflows, and the ability to manage media across that entire process.
“There’s been a lot of interest in camera-to-cloud workflows over the last year,” said David Rosen, VP, Cloud Applications and Solutions at Sony. “There are two main drivers for this: 5G and the pandemic. The first makes sending large files over a wireless network not only feasible but in many cases preferable to the old-school ‘sneakernet’.”
To support this cloud initiative, the company announced a mobile camera gateway app for Android and iOS phones that connects the camera to all of its virtualized Ci platform services. The app allows users to send files from their camera to anywhere in the world in real time, while taking advantage of cloud production workflows. It will be available in early 2022.
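To make the camera-to-cloud idea concrete, the sketch below shows the kind of SRT push a contribution device might perform toward a cloud ingest point, using FFmpeg’s libsrt support (the article notes Ci uses SRT for high-speed transfer). The hostname, port and passphrase are placeholders; this is a generic illustration, not the Ci platform’s actual ingest API or the gateway app’s internals.

```python
# Illustrative only: pushing a local recording to a cloud ingest point over
# SRT with FFmpeg. The hostname, port and passphrase are placeholders and do
# not represent Sony Ci's actual ingest endpoints or API.
import subprocess

def push_over_srt(source_file: str, host: str, port: int, passphrase: str):
    """Remux a local file to MPEG-TS and send it as an SRT caller."""
    srt_url = (f"srt://{host}:{port}"
               f"?mode=caller&passphrase={passphrase}&latency=2000000")
    cmd = [
        "ffmpeg", "-re",       # read the input at its native frame rate
        "-i", source_file,
        "-c", "copy",          # no re-encode, just remux into a transport stream
        "-f", "mpegts",
        srt_url,               # 2 s SRT latency (FFmpeg takes this in microseconds)
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    push_over_srt("clip.mp4", "ingest.example.com", 9000, "replace-me-1234")
```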
IP Remote Production
Developed in partnership with Skyline Communications, Sony’s new Live Element Orchestrator (LEO) is designed to help configure and manage remote workflows. It’s a software-based facilities management system that oversees production and non-production resources to enable users to get the most out of their equipment investments. It offers live monitoring and device settings for the entire live production environment.
“These features empower content creators to share production resources confidently, in advance or ad hoc,” said Deon LeCointe, Director, Networked Solutions at Sony, adding that it can be deployed on premises or in the cloud to support many different remote production architectures where network connectivity is present. The LEO can be used to manage and configure Sony equipment as well as a wide range of third-party products and services.
Sony is also leveraging its Nevion acquisition, adding new types of broadcast control capabilities to Nevion’s VideoIPath SDN orchestration software through Sony’s IP Live System Manager (LSM) software. The system manages and monitors IP activity in both LAN and WAN environments and can establish and release connections in the network in a scheduled or ad hoc way. This ensures full system protection while managing resource capacity, including bandwidth and ports.
LeCointe said the real strength of VideoIPath lies in software-defined networks, where it can control COTS IP switches from all of the major vendors and handle hundreds of thousands of IP connections.
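The scheduling and capacity management described here boils down to admission control: every connection, whether booked in advance or requested ad hoc, reserves bandwidth on specific links, and a request that would oversubscribe the network is refused. The toy model below illustrates that booking logic only; it is not Nevion’s VideoIPath API, and the link names and figures are invented.

```python
# Toy model of orchestrator booking logic: each connection (scheduled or
# ad hoc) reserves bandwidth on a link for a time window, and a request is
# refused rather than oversubscribing the link. Conceptual illustration only,
# not Nevion's VideoIPath API.
from dataclasses import dataclass, field

@dataclass
class Link:
    name: str
    capacity_gbps: float
    bookings: list = field(default_factory=list)   # (label, start, end, gbps)

    def usage_at(self, t: float) -> float:
        """Bandwidth already reserved at time t."""
        return sum(g for _, s, e, g in self.bookings if s <= t < e)

    def reserve(self, label: str, start: float, end: float, gbps: float) -> bool:
        """Admit the flow only if the link has headroom for the whole window."""
        # Usage only changes at booking start times, so those are the points to check.
        check_points = [start] + [s for _, s, _, _ in self.bookings if start < s < end]
        if any(self.usage_at(t) + gbps > self.capacity_gbps for t in check_points):
            return False                     # would oversubscribe: refuse the request
        self.bookings.append((label, start, end, gbps))
        return True

if __name__ == "__main__":
    uplink = Link("leaf-spine uplink", capacity_gbps=100)
    uplink.reserve("studio A 2110-20 feed", start=0, end=120, gbps=12)   # scheduled
    uplink.reserve("ad hoc replay feed", start=30, end=60, gbps=12)      # ad hoc
    print(uplink.usage_at(45))   # 24.0 Gb/s in use during the overlap
```

A production orchestrator applies this kind of check across every hop of a computed path and, as noted above, accounts for ports as well as bandwidth when it admits or refuses a connection.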
Artificial Intelligence
“At Sony we believe in AI that releases imagination and creativity that allows content creators to realize their vision,” said Hugo Gaggioni, Chief Technology Officer at Sony. “We continue to be on the cutting edge of creating the latest technologies that use our AI engines in unique ways to improve and enhance our products and our customers’ workflows.”
Among a number of applications, AI is being used to automatically generate multi-camera highlight reels during live sports events.
The company is now offering a cognitive processor series called XR—XR Picture for texture creation and true color recreation and XR Sound to support immersive audio in home theaters—that can cross-analyze hundreds of thousands of elements to provide an immersive and realistic viewing experience. This includes immersive audio processing for multichannel rendering on its Bravia XR TV sets.
Indeed, Gaggioni said AI is being used across Sony’s product portfolio where it makes sense, including talent tracking for its PTZ cameras, auto-focus and Sony’s Media Analytics Portal System, a microservices-based virtualized architecture that uses Sony and third-party AI engines to analyze live video and data feeds (and was used by CBS during this year’s Super Bowl 55 telecast).
Sony executives said they will continue to provide tools for the best picture quality, easiest workflow orchestration and most flexible ways to work for professionals around the world. However, owing to its storied heritage in the video production industry, the company will still argue that it all starts with acquiring the most pristine image possible.