Sony Wants To Drive The Future Of Media Production And Delivery
Sony laid out its plan for helping customers navigate today’s rapidly changing production landscape.
Sony Electronics used its first in-person NAB press conference in two years to outline the many areas of production and content delivery where the company is placing much of its emphasis on product development.
Company executives said that the global pandemic has accelerated several industry trends, including increased demand for high-quality content, a rapid expansion of streaming platforms, and a shift to remote production workflows. Sony, they said, has been at the forefront of this rapid change.
Of particular interest to its customers this year are virtual production (including in-camera VFX and a new series of Crystal LED video walls), a cinematic look for television and sporting events, cloud-based production and content delivery, and a new replay and clip highlight reel production system.
Virtual Production
In the area of virtual production, Sony is hoping customers will pair its new cameras and LED video walls to create virtual production stages that allow actors to interact with the computer-generated set around them in real time, saving time and money in post-production.
To this end, Sony announced a new partnership with the USC School of Cinematic Arts: the school is building a virtual production studio with a Crystal LED B-Series screen backdrop and has introduced a full virtual production curriculum that will make extensive use of it.
To complement its "virtual production" message, Sony unveiled the high-contrast Crystal LED C-Series video wall alongside the B-Series wall built for virtual production. The C-Series is available in 1.2 mm or 1.5 mm pixel pitch configurations, while the B-Series, with its wide color gamut reproduction, is scalable so it can be customized for individual production studios large and small.
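As a rough, hypothetical illustration of what pixel pitch means in practice (it is the spacing between adjacent LED emitters), a finer pitch packs more pixels into the same wall area; the wall dimensions below are made up for the example:

```python
# Rough illustration (not from the article): how pixel pitch — the
# spacing between LEDs — sets the resolution a video wall delivers.
# Wall dimensions below are hypothetical.

def wall_resolution(width_m, height_m, pitch_mm):
    """Pixel count of an LED wall whose emitters sit pitch_mm apart."""
    px_per_m = 1000.0 / pitch_mm
    return round(width_m * px_per_m), round(height_m * px_per_m)

# A hypothetical 6 m x 3 m backdrop:
print(wall_resolution(6, 3, 1.2))  # (5000, 2500) — finer pitch, more pixels
print(wall_resolution(6, 3, 1.5))  # (4000, 2000)
```

A finer pitch matters for in-camera VFX because the taking camera can move closer to the wall before individual emitters or moiré patterns become visible in the shot.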
The idea is to pair the Venice camera with the new LED walls to create cinematic images in real time while finding effective ways to decrease work in post. In-Camera VFX is a new way of shooting real-time visual effects during a live action film shoot.
The company also announced an upcoming firmware update for its PVM-X series of 4K HDR monitors, which will include support for signal conversion and 3D LUT Output.
Cinematic Look
The cinematic look generated by the Venice line, with its shallow depth-of-field image capture and 16 stops of dynamic range to enhance storytelling, has been embraced not only in feature films but also for live sports. For example, a dozen Venice cameras were used to shoot the half-time show at this year's NFL Super Bowl and were integrated seamlessly into the live broadcast production. The production also used Sony FX6 cameras for handheld shooting and P50/P43 cameras for the in-stadium Skycam systems.
[Of note: on the production (led by Jesse Collins and Roc Nation), the DIT set focus remotely from a control room, and because of rapidly changing lighting conditions, technicians performed live color grading.]
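As a back-of-the-envelope sketch (not from the article) of why a large sensor produces the shallow depth-of-field look described above, the standard thin-lens approximation DoF ≈ 2·u²·N·c / f² can be compared across formats; all lens and circle-of-confusion values below are hypothetical:

```python
# Hypothetical comparison of total depth of field for two sensor
# formats framing the same shot. Uses the thin-lens approximation
# DoF ≈ 2*u^2*N*c / f^2, valid when the subject distance is well
# short of the hyperfocal distance.

def total_dof_m(focal_mm, f_number, subject_m, coc_mm):
    """Approximate total depth of field, in meters."""
    u_mm = subject_m * 1000.0
    return 2.0 * u_mm**2 * f_number * coc_mm / (focal_mm**2) / 1000.0

# Same framing, same stop, subject at 3 m:
# full frame (50 mm lens, 0.03 mm circle of confusion) vs.
# Super 35 (33 mm lens for a similar field of view, 0.02 mm CoC).
full_frame = total_dof_m(50, 2, 3, 0.03)  # ~0.43 m in focus
super_35 = total_dof_m(33, 2, 3, 0.02)    # ~0.66 m in focus
print(f"full frame: {full_frame:.2f} m, Super 35: {super_35:.2f} m")
```

The larger format holds a thinner slice of the scene in focus for the same framing, which is why focus pulling on these productions is demanding enough to justify a dedicated remote DIT.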
Cloud Production
Broadcasters are increasingly turning to IP in the cloud to connect and manage distributed networks and global content, and to support remote teams. For those looking to deploy the flexible workflows required, Sony's cloud-based Ci Media Cloud production platform has added new tools for VFX and conform applications during acquisition. Among other advantages, this allows for quicker dailies approvals and publishing to social media.
And thanks to Sony’s acquisition of Nevion, remote productions of all types can now benefit from reliable low-latency transfer of video from data centers to editing and quality control suites located locally or miles away.
“It’s becoming increasingly important to give people the tools to collaborate efficiently, remotely and in real time,” said Theresa Alesso, President, Imaging Products & Solutions Americas, Sony Electronics. “The last few years have redefined the world as well as our industry.”
New Cameras
This being a Sony press conference, new cameras were introduced: the Venice 2 cinema camera and the HDC-F5500 and HDC-3200 system cameras.
The new Venice 2, with a smaller body for better maneuverability, features a new 8.6K full-frame CMOS image sensor along with internal X-OCN and 4K Apple ProRes 4444 and 422 HQ recording. In addition, the Venice 2 has a dual base ISO and 8 stops of built-in ND filtration. It is already in use on several feature productions.
The HDC-F5500 system camera, with its Super 35mm 4K CMOS global-shutter image sensor, helps shooters achieve that desired cinematic look. Offering shallow depth of field, high-frame-rate capability, and easy integration with other Sony system cameras, it has been used on marquee sporting events and major motion picture and music awards shows, among others.
Sony has also expanded its HDC-3000 series with the new, “more affordable” 4K HDC-3200. It features a 2/3-inch type 3CMOS image sensor with global shutter, fiber output for sending signals long distances, 4K and HDR capability, and support for numerous signal formats. The HDC-3200 is compatible with the existing lineup of Sony viewfinders, large lens adapters and IP transmission systems.
5G News Production
For news production, Sony cited major customers such as Gannett and the Associated Press, which use Sony technology to deliver thousands of critical news stories across the globe every day. In addition to cameras, the AP is using Xperia PRO 5G smartphones to send video over wireless connections, taking advantage of 5G's faster transfer speeds and higher bandwidth for better quality.
Optical Tracking And Replay
Meanwhile, in the area of optical camera tracking and data platforms for rendering live sports analysis as fully animated graphics, the company’s Hawk-Eye Innovations subsidiary has introduced Hawk-Eye Replay, a high-performance, cost-effective remote clipping, replay and highlights solution.
Hawk-Eye Replay combines Hawk-Eye’s optical tracking capabilities and data platforms into a new production solution, and it can be integrated with a variety of remote and cloud production tools, including Sony's Ci Media Cloud. For the past year, the company has been demonstrating real-time skeletal tracking of players at live sporting events.
“Our commitment to enhancing the daily lives of our customers remains unchanged,” said Alesso. “As Sony looks to the future, we’re using our influence to motivate and shape tomorrow’s creators by providing them with the necessary resources and tools today. Our solutions continue to inform, entertain, engage and connect fans and professionals in ways that could never have been imagined before.”