Cinema’s Growing Influence On Live Sport Production
With increasing regularity, digital cinema cameras like Sony’s VENICE and RED’s KOMODO are making their way onto the fields of major live sporting events and into multi-camera video coverage to create a “cinematic” look that enhances the viewing experience.
These cameras, with their single 35mm CMOS sensors and shallow depth of field, add yet another color to the director’s palette and open up new perspectives for fans to enjoy the game.
Pair these technologies with real-time, low-latency processing and motion picture color fidelity and you’ve got a winning formula to captivate TV viewers in new and exciting ways.
Recreating The Look Of A Blockbuster Movie For Live TV
With this approach in mind, FOX Sports decided in the summer of 2021 to translate the look and feel of Universal Pictures’ motion picture “Field of Dreams” to a regular season baseball game broadcast featuring the New York Yankees versus the Chicago White Sox. The network broke new ground, not only building a custom stadium near the original film set, but also by melding numerous cinema production techniques—including real-time frame rate conversion and post-production finishing—into a traditional 1080p/60 HDR video production workflow.
In an homage to the original film, FOX and MLB commissioned a full-sized baseball field to be built near a corn field, a mere 1,000 feet from the original “Field of Dreams” set in Dyersville, IA. During the broadcast, the players were introduced as they walked out of the stalks of corn, reminiscent of the film’s ghosts of legendary players emerging onto the field to play ball. Several camera angles and iconic scenes from the movie were incorporated into the production to support this effect and others across the live telecast.
The film’s famous promise is “If you build it, he will come,” and last August, come they did: 8,000 invited fans attended in person, while an additional six million viewers watched at home, making it FOX Sports’ most-viewed MLB regular-season game since 2005.
Real-Time, Mixed Format Workflow
The telecast took advantage of production elements befitting a postseason game, including 39 Sony 4K HDR/SDR cameras, all contributing native 1080p HDR to the production; over 50 microphones, with multiple mics buried in the field and one on each base; multiple aerial production drones provided by Beverly Hills Aerials, marking the first time FOX Sports had used drones as part of an MLB game; a FlyCam overhead cable system; and four Super SloMo cameras (Sony HDC-4300s with up to 8x high frame rate capability).
Brad Cheney, Vice President of Field Operations and Engineering at FOX Sports, oversaw the technical production, working closely with mobile services provider Game Creek Video. He said that blending video and film-style production was a challenge that took careful planning to ensure that all of the shots matched on screen, but it all went off without a hitch.
“Between taped, pre-produced elements, and live footage, we had to think carefully about how we could blend cinematic and live broadcast looks,” he said. “We wanted to emulate the capture mechanism of cinema, so we standardized on 1080p HDR, HLG wide color gamut BT.2020. This allowed us to maintain a consistent aesthetic as we transitioned from a pre-produced opener featuring [actor] Kevin Costner into the live game broadcast, which was shot with high frame rate cameras and other standard broadcast equipment. And we did this all without compromising either workflow.”
Post-Production In The Field
The key to success was integrating the cinema camera shots seamlessly with the typical baseball game telecast video workflow, switching back and forth between the telecast’s digital cinema and video segments. That meant devising new techniques for using the cameras’ shallow depth of field acquisition capabilities to give the live game its cinematic look.
“Tricking shallow depth of field looks with broadcast cameras is possible, but is best realized with specialized film equipment, like cinematic cameras with large sensors and prime lenses,” said Cheney. “We then played with the light levels and changed the apertures. To recreate the color, warmth, and feel of ‘Field of Dreams’ we leaned on our camera technology and AJA Video gear.”
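Cheney’s point about sensor size can be made concrete with a little optics. The Python sketch below is only a back-of-the-envelope illustration (the sensor widths, focal lengths, circle-of-confusion values, and f-stop are generic examples, not the production’s actual camera settings): at matched framing, subject distance, and aperture, a Super 35 sensor needs a longer focal length than a 2/3" broadcast sensor, and that alone cuts the depth of field to roughly a third.

```python
def depth_of_field(focal_mm: float, f_number: float, coc_mm: float, subject_m: float):
    """Near/far limits of acceptable focus from the hyperfocal-distance formulas."""
    u = subject_m * 1000.0                               # subject distance in mm
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm   # hyperfocal distance in mm
    near = u * (h - focal_mm) / (h + u - 2 * focal_mm)
    far = u * (h - focal_mm) / (h - u) if u < h else float("inf")
    return near / 1000.0, far / 1000.0                   # back to meters

# Same framing, same 8 m subject distance, same f/2.8 aperture.
# 2/3" broadcast sensor: ~25 mm lens, circle of confusion ~0.011 mm.
print(depth_of_field(25.0, 2.8, 0.011, 8.0))   # roughly 5.7 m .. 13.2 m in focus
# Super 35 cinema sensor: ~65 mm lens for the same framing, CoC ~0.025 mm.
print(depth_of_field(65.0, 2.8, 0.025, 8.0))   # roughly 7.1 m .. 9.2 m in focus
```

Open a cinema prime wider still and the in-focus band shrinks further, which is the subject-isolating look broadcast zooms struggle to match.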
The workflow included more than 60 AJA FS-HDR frame synchronizers/converters, which allowed Cheney’s team to strike the right balance between the two looks (film and video), and easily move between them as they transitioned between video sources. FS-HDR was also used extensively to control blacks, whites, gammas, and matrix settings on some devices—all in real time, which was critical to incorporating the processed cinema images into the live broadcast.
“Having this color correction ability built into our converter was paramount, considering we used some 1080p SDR and 4K/UHD SDR cameras as well,” said Cheney. “The FS-HDR helped us match the looks across the board.”
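AJA does not publish the FS-HDR’s internal processing, and the numbers below are purely hypothetical, but a minimal Python sketch of generic lift/gamma/gain and 3x3 matrix adjustments shows the kind of real-time controls Cheney is describing: lift sets where the blacks sit, gain sets where the whites land, gamma bends the midtones, and the matrix trades color between channels to warm or cool the image.

```python
import numpy as np

def lift_gamma_gain(rgb: np.ndarray, lift: float, gamma: float, gain: float) -> np.ndarray:
    """One common lift/gamma/gain formulation on normalized [0, 1] video:
    lift raises the blacks, gain scales the whites, gamma shapes the midtones."""
    graded = np.clip(rgb, 0.0, 1.0) * (gain - lift) + lift
    return np.clip(graded, 0.0, 1.0) ** (1.0 / gamma)

def apply_matrix(rgb: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    """3x3 color matrix: each output channel is a weighted mix of R, G and B."""
    return np.clip(rgb @ matrix.T, 0.0, 1.0)

# Hypothetical "warm film" trim: blacks lifted a touch, whites pulled in slightly,
# midtones opened up, and a matrix that nudges red up and blue down.
warm_matrix = np.array([
    [1.04, 0.00, -0.04],
    [0.00, 1.00,  0.00],
    [0.00, 0.04,  0.96],
])

frame = np.random.rand(4, 3)   # stand-in for a few pixels of a video frame
graded = apply_matrix(lift_gamma_gain(frame, lift=0.02, gamma=1.1, gain=0.98), warm_matrix)
print(graded)
```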
Achieving The Right Look
For the production team, a lot of experimentation was necessary to determine the right equipment settings and figure out how all of the gear would work together. There were times when 24p looked great, but others when it was too jumpy for live game action. So the team opted for an acquisition frame rate of 29.97p for many of the specialty cameras, always being careful to account for subtleties in the look based on the equipment in the workflow.
“Ultimately, this approach ensured we had the right conversion and settings in all the right places and gave us an appreciation for recent technological evolutions,” said Cheney. “You can now have an RF transmitter that can do 24, 25, 29, 50, 60 and 59.94, which is amazing. The same thing can be said for conversion gear like the FS-HDR, which allows us to determine what the picture will look like. We can quickly move between looks, line them up, and compare them to get a clear picture of what we’re aiming for.”
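The 24p-versus-29.97p trade-off comes down to how each rate sits inside a 59.94 Hz broadcast chain. The short Python sketch below is only an illustration of the cadence math (not any broadcaster’s actual conversion code): 29.97p divides evenly into 59.94p, so every frame is simply shown twice, while 23.976p forces an uneven 3:2 repeat pattern that reads as judder on fast pans and live game action.

```python
from fractions import Fraction

def repeat_pattern(src_fps: Fraction, dst_fps: Fraction, n_frames: int) -> list[int]:
    """For each source frame, count how many destination frames show it when the
    destination chain simply repeats the most recent source frame."""
    counts = [0] * n_frames
    total_dst = int(n_frames * dst_fps / src_fps)   # destination frames covering the clip
    for d in range(total_dst):
        t = Fraction(d) / dst_fps                   # presentation time of destination frame d
        s = int(t * src_fps)                        # source frame visible at that time
        if s < n_frames:
            counts[s] += 1
    return counts

P59_94 = Fraction(60000, 1001)

# 23.976p into a 59.94p chain: uneven 3-2-3-2 cadence (the "jumpy" judder on motion)
print(repeat_pattern(Fraction(24000, 1001), P59_94, 8))   # [3, 2, 3, 2, 3, 2, 3, 2]

# 29.97p into the same chain: every frame held exactly twice, so motion stays even
print(repeat_pattern(Fraction(30000, 1001), P59_94, 8))   # [2, 2, 2, 2, 2, 2, 2, 2]
```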
This was vitally important, because the timeline for devising a plan and implementing it was very tight.
Lighting The Way
The field lighting was designed for optimal baseball play, not for shooting a film, so the team had originally planned to apply pre-defined looks to the live content that matched the original “Field of Dreams” aesthetic. Ultimately, though, they reduced the amount of color correction because the natural sky provided the cinematic lighting they needed. That natural light introduced its own challenges, like figuring out exactly when the sun would set and then bridging that look as evening transitioned into night. Having color correction integrated into the workflow via the FS-HDR sped up the transition and allowed them to keep more things active in real time.
“We made a number of choices based on research leading up to the production that got us to a concept of where we wanted to be, and once the [natural] lighting conditions showed up, we knew what adjustments to make to get to our look,” he said.
A Variety Of Cameras Captured The Action
The ground-breaking production leveraged an extensive range of cameras for capture, mostly broadcast-style cameras, but it also utilized a dozen cinema cameras. The lineup included multiple Sony VENICE camera systems with PL-mount lenses; Sony F55, HDC-4300, HDC-5500, and HDC-P50 cameras; and RED Digital Cinema cameras, including the KOMODO 6K. They also had many 1080p-capable POV cameras buried around the infield.
HDR Conversion Workflow
All the cinema camera feeds were run through a total of 60 FS-HDR frame synchronizers/converters (each 1 RU), which helped create the desired look with full color imagery, balance, and color correction. The main game feed was produced in 1080p HDR, but content was also delivered in other formats for some international distributors.
That meant a lot of FS-HDR conversion was going on in the field, and it is why the FS-HDR was the ideal tool for the job. The FS-HDR is designed to handle real-time HDR transforms as well as 4K/HD up/down/cross conversions. Combining AJA’s frame synchronization and conversion technology with video and color space processing algorithms from the Colorfront Engine (found on many motion picture sets), the FS-HDR delivers the real-time, low-latency processing and color fidelity that live broadcast television projects require.
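Colorfront’s processing is proprietary and the FS-HDR’s internal math isn’t public, but the general direction of such a conversion can be sketched from the underlying standards. The Python snippet below is a simplified, standards-based illustration only (a plain 2.4 gamma stands in for the SDR EOTF, and no tone expansion or OOTF handling is applied): linearize the SDR signal, widen the gamut from BT.709 to BT.2020 with the ITU-R BT.2087 matrix, then encode with the BT.2100 HLG OETF.

```python
import numpy as np

# Linear-light BT.709 -> BT.2020 primaries conversion matrix (ITU-R BT.2087)
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def hlg_oetf(e: np.ndarray) -> np.ndarray:
    """ITU-R BT.2100 HLG OETF: normalized linear scene light -> HLG signal value."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    log_part = a * np.log(np.maximum(12.0 * e - b, 1e-12)) + c
    return np.where(e <= 1.0 / 12.0, np.sqrt(3.0 * e), log_part)

def sdr709_to_hlg2020(rgb_709: np.ndarray) -> np.ndarray:
    """Toy SDR (BT.709) to HLG (BT.2020) up-map; see caveats in the text above."""
    linear_709 = np.clip(rgb_709, 0.0, 1.0) ** 2.4     # rough stand-in for the SDR EOTF
    linear_2020 = linear_709 @ M_709_TO_2020.T         # widen the color gamut
    return hlg_oetf(np.clip(linear_2020, 0.0, 1.0))    # encode as HLG

# Example: SDR mid-gray and a fully saturated BT.709 red, as HLG BT.2020 signal values
print(sdr709_to_hlg2020(np.array([[0.5, 0.5, 0.5], [1.0, 0.0, 0.0]])))
```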
“We could easily convert SDR to HDR for production and HDR to SDR for domestic and global distribution,” said Cheney. “It proved key, as we all know that starting with the highest image quality possible will result in the best final product, and the AJA FS-HDRs helped make that possible.”
A “Beautiful TV Moment”
In the end, the crew of more than 200, spanning disciplines from ENG camera people to drone professionals, had to relearn and unlearn much of what they would normally do with ENG crews, drones, and specialty film cameras. The momentous project has given FOX Sports, and the industry at large, a solid game plan for bringing cinematic integration to future productions at scale.
“When live action and replays can be shown with that same beautiful post-produced look that you’d expect from a post-produced highlight show in a non-jarring way, you can make the game feel bigger than life for fans,” said Cheney. “We now know how to execute something like this quickly, so in the future, we want to provide this amazing depth of field and color to audiences in live broadcasts where it makes sense.
“From an industry standpoint, we’re on the cusp of being able to do all the things we want to in sport in high frame rate and with a cinematic look, things that just weren’t achievable live five or ten years ago,” he said. “We’ve now reached a point where this technique can be applied to other productions on a full-time basis, versus just as an added feature for select telecasts. For this project we made equipment modifications as necessary and refined the workflow until we hit the mark – bringing a cinema look to live broadcast. It was one of those beautiful television moments.”