Broadcasting Innovations At Paris 2024 Olympic Games

France Télévisions was the standout video service performer at the 2024 Paris Summer Olympics, with a collection of technical deployments that secured the EBU’s Excellence in Media Award for innovations enabled by cloud-based IP production.

The 2024 Paris Olympics has been widely acknowledged as a successful proving ground for several technical innovations across the whole AV lifecycle, from remote production through contribution to distribution. Summer Olympic Games have long been prominent stages for emerging broadcast technologies, in large part because of the diversity of both events and locations.

Venues range from stadia to open countryside and water, posing challenges for production and contribution in capturing critical aspects of events at sufficient resolution. The sheer number of events is equally challenging on the distribution front, and helped propel features such as the BBC’s original red button service, which allowed viewers to switch between different feeds. That was pioneered at the 1999 Wimbledon tennis championships but bedded in at the Sydney summer Olympics the following year.

The 2012 London Olympics was the first to feature streaming in a big way, with the focus on expanding red button capabilities to allow live access, in principle, to all events, although there were issues with quality and reliability. The 2021 Tokyo Olympics was notable for the full coming of age of mobile streaming, and for the use of cellular networks for coverage of remote events.

The 2024 Games have been widely billed as the 5G Olympics, exploiting the more advanced capabilities of that mobile generation, especially the SA (Standalone) variant based on 5G network cores and 5G Radio Access Networks (RANs). 5G featured across the video lifecycle and throughout the Games, starting with the opening ceremony, where Samsung deployed its Galaxy S24 Ultra smartphones on sticks to capture the action as it proceeded along the River Seine. French telco Orange, the official mobile operator for the Games, deployed antennas along the Seine to establish what it claimed was France’s first commercial 5G SA network.

5G was a major feature of the Paris Olympics, starting with use of Samsung smartphones to follow the opening ceremony along the Seine.

The 2024 Olympics was also notable for showing that 5G networks themselves, as well as the mobile devices they support, can be portable and move around in real time. For this, France Télévisions (FTV), the host country’s official broadcaster, adopted a system called the 5G Dome from Obvios to deploy private 5G networks in locations requiring roaming, including areas not normally served by cellular connectivity. The 5G Dome packages the RAN and core in a single box, combining the features required to deliver and process 5G signals on the ground.

One further element was required: backhaul connectivity, which is usually provided by fiber or microwave links in permanently configured mobile networks. The Paris Olympics’ portable 5G networks needed a ubiquitous wireless backhaul instead, and this was provided by the Starlink constellation of LEO (Low Earth Orbit) satellites operated by Elon Musk’s SpaceX. The absence of such satellite backhaul options at previous Olympics meant that this mobile-in-a-box connectivity was not possible then, especially for more remote events.

For the 2024 Games, FTV deployed the 5G Dome in a car following the torch during the opening ceremony, with a 5G antenna on top creating, in effect, a 5G dome around the torch bearer that moved with it. The use of LEO satellites for backhaul kept latency, or round-trip delay, down to about 50 milliseconds, ample for the application.
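
To put that 50 millisecond figure in context, the sketch below adds up a rough latency budget for a portable 5G cell with LEO satellite backhaul. The individual component figures are assumptions chosen for illustration, not measured values from the Games.

# Illustrative latency budget for a portable 5G cell with LEO satellite backhaul.
# All component figures are rough assumptions for this sketch, not measured values.

latency_budget_ms = {
    "camera_to_5g_uplink": 10,      # assumed radio access delay on the private 5G cell
    "leo_backhaul_round_trip": 30,  # assumed LEO round trip at a few hundred km altitude
    "terrestrial_routing": 10,      # assumed ground-network hops to the production hub
}

total_ms = sum(latency_budget_ms.values())
print(f"Estimated round-trip delay: {total_ms} ms")  # ~50 ms, in line with the reported figure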

This mobile-in-a-box approach allowed FTV to provide coverage of remote events throughout the Games, using a combination of traditional cameras equipped with 4G/5G devices and drones. It enabled a breakthrough in broadcasting the sailing competition, an event that has traditionally made for uncompelling viewing, confined to glimpses of distant yachts without conveying the drama or tension involved. For the 2024 Games, 5G base stations were placed on the race buoys so that drones equipped with cameras and the 5G boxes could follow the yachts and provide close-up views of the action on board. This was conveyed not just to remote viewers, but also to bystanders at the harbor via a 50-foot floating screen.

Similarly, footage of players at the golf tournament was captured and relayed back over 5G, so that for the first time attendees at the event could rent devices to watch players at different holes from where they were sitting or standing. According to FTV, this approach will be incorporated into coverage of live events generally.

Mercifully, the Games were not billed as the first AI Olympics, at least not from the broadcasting perspective. FTV did apply machine learning in a speech-to-text system it had been developing for two years, and suggested that the Games had helped drive that work forward.

The initial aim was to develop a speech-to-text and translation infrastructure specific to FTV’s own requirements. For the Olympics, it was used to transcribe interviews sent by OBS (Olympic Broadcasting Services) and translate them into French where required, before they were integrated into FTV’s sports production system.
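
FTV’s in-house system is not public, but the general shape of such a transcribe-then-translate pipeline can be sketched with openly available models, here openai-whisper for transcription and a Hugging Face translation pipeline for the English-to-French step. The model choices and the file name are assumptions for illustration only.

# A minimal sketch of a speech-to-text plus translation workflow using open models.
# This is illustrative only and does not represent FTV's actual system.

import whisper
from transformers import pipeline

def transcribe_and_translate(audio_path: str) -> dict:
    # Transcribe the interview audio; Whisper also detects the spoken language.
    asr_model = whisper.load_model("base")
    result = asr_model.transcribe(audio_path)

    text = result["text"]
    language = result.get("language", "unknown")

    # Translate to French only when the source is English, as one example path.
    if language == "en":
        translator = pipeline("translation_en_to_fr")
        text = translator(text)[0]["translation_text"]

    return {"source_language": language, "french_text": text}

# Example call with a hypothetical file name:
# transcribe_and_translate("obs_interview.wav")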

AI innovations at the Olympics were more notable for application to sporting events themselves, although some of these involved video capture and provided information for the broadcast coverage. One came from watchmaker Omega, whose Olympic involvement began with precision timing and photo finish cameras.

At the Paris Olympics, Omega’s computer vision systems came into play for various sports because of their ability to track athletes and other objects accurately for performance metrics. In the case of beach volleyball, the distance covered by each player, the speed of the ball, jump heights, and shot types such as smashes, blocks, and spikes were all recorded.

While many of these metrics had been captured before, they required athletes to wear sensors, which was a slight encumbrance and limited the scope for instant analysis. Now the metrics are captured by high-definition cameras around the field of play, providing higher-resolution data to feed AI models trained for each sport. This enabled analysis of each event in much greater depth than before and fed broadcasters plenty of nuggets to enrich their narratives.
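
Omega’s models and data formats are proprietary, but the basic step of turning per-frame tracked positions into broadcast-friendly metrics such as distance covered and peak speed is straightforward, as the sketch below shows. The frame rate, coordinates, and sample track are assumptions for illustration.

# A simplified sketch of deriving distance covered and peak speed from per-frame
# player positions produced by a camera-based tracking system. Not Omega's system;
# the frame rate and coordinates are assumed values for illustration.

import math

FRAME_RATE = 50.0  # assumed camera frame rate in frames per second

def distance_and_peak_speed(positions):
    """positions: list of (x, y) court coordinates in metres, one per frame."""
    total_distance = 0.0
    peak_speed = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        step = math.hypot(x1 - x0, y1 - y0)              # metres moved between frames
        total_distance += step
        peak_speed = max(peak_speed, step * FRAME_RATE)  # metres per second
    return total_distance, peak_speed

# Example with made-up tracking samples:
track = [(0.0, 0.0), (0.1, 0.0), (0.25, 0.05), (0.45, 0.1)]
dist, speed = distance_and_peak_speed(track)
print(f"Distance: {dist:.2f} m, peak speed: {speed:.1f} m/s")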

The standout feature of the Olympics, though, was not any single innovation but the cumulative impact of various enabling technologies at the production and contribution level, which combined to elevate the viewing experience. Such at least was the view of the EBU (European Broadcasting Union) in awarding FTV its annual Excellence in Media Award, presented at IBC 2024, for its coverage of the Paris 2024 Olympics and Paralympics.

France Télévisions won the EBU’s Excellence in Media Award for innovations enabled by application of cloud-based IP production at the 2024 Summer Olympics.

The EBU singled out the role of the SMPTE ST 2110 suite of standards for transporting digital media over IP networks, to which FTV has been migrating since 2021. Following tests earlier in 2024, this ensured FTV could deliver its Paris Olympics coverage in 4K UHD with High Dynamic Range (HDR10), 10-bit color depth, wide color gamut (BT.2020), higher frame rates (50 fps progressive), and Next Generation Audio (AC-4).
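
In ST 2110 workflows, a stream’s essence format is described by parameters of this kind in an SDP media description. The sketch below assembles such a description for the format stated above (3840x2160, 50p, 10-bit, BT.2020 colorimetry, PQ transfer); the exact parameter set FTV used is not public, and the payload type number and sampling structure here are assumptions.

# A hedged sketch of how the delivery format above maps onto the media description
# parameters used for an ST 2110-20 uncompressed video stream. Values follow the
# article's stated format; the payload type (96) and 4:2:2 sampling are assumptions.

video_format = {
    "sampling": "YCbCr-4:2:2",
    "width": 3840,
    "height": 2160,
    "exactframerate": 50,
    "depth": 10,
    "colorimetry": "BT2020",
    "TCS": "PQ",  # transfer characteristic system for HDR10-style PQ
}

# Render as an SDP 'fmtp' attribute of the kind used to describe ST 2110-20 essence.
fmtp = "a=fmtp:96 " + "; ".join(f"{k}={v}" for k, v in video_format.items())
print(fmtp)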

The latter is particularly worth mentioning because, while the other aspects of what might be called next-gen TV have largely been proven in trials and deployments, NGA (Next Generation Audio) is very much a work in progress. Its role is not confined to improving the general viewing experience through object-based sound; it also improves accessibility for the hard of hearing and facilitates the generation and transmission of multiple language tracks.

NGA relies heavily on metadata to deliver the best experience on target devices. The metadata conveys critical aspects of the audio that instruct the playback system, whether a headset, soundbar, or multiple speakers in a surround sound setup, how to reproduce each element effectively.

Indeed, NGA relies for its flexibility and scalability on metadata describing all parts of the audio, created during the mixing process and transported alongside the audio elements. The metadata is also essential for matching playback to the user’s preferences and control inputs.

Naturally, this requires a standardized format to ensure playback devices can interpret the instructions accurately and without ambiguity. The key standard emerging in this area, and adopted by FTV for its Olympic coverage, is SMPTE ST 2110-41 Fast Metadata, part of the ST 2110 family.
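
To make the idea concrete, the sketch below models a deliberately simplified set of per-object audio metadata and a renderer-style selection step that honors a listener’s language preference and a dialogue boost for accessibility. The field names, gain values, and selection logic are illustrative assumptions, not the actual ADM or ST 2110-41 schema.

# A deliberately simplified sketch of per-object NGA-style metadata and how a
# playback device might use it. Field names and logic are illustrative only,
# not the actual ADM / ST 2110-41 schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AudioObjectMetadata:
    name: str
    gain_db: float            # mixing level relative to the ambience bed
    language: Optional[str]   # e.g. "fr" for a French commentary track, None for beds
    is_dialogue: bool         # lets accessibility modes boost speech

objects = [
    AudioObjectMetadata("crowd_bed", gain_db=0.0, language=None, is_dialogue=False),
    AudioObjectMetadata("commentary_fr", gain_db=-3.0, language="fr", is_dialogue=True),
    AudioObjectMetadata("commentary_en", gain_db=-3.0, language="en", is_dialogue=True),
]

def select_for_listener(objs, preferred_language="fr", dialogue_boost_db=6.0):
    """Pick the commentary in the preferred language and boost dialogue, as a renderer might."""
    mix = []
    for obj in objs:
        if obj.language and obj.language != preferred_language:
            continue  # drop commentary tracks in other languages
        gain = obj.gain_db + (dialogue_boost_db if obj.is_dialogue else 0.0)
        mix.append((obj.name, gain))
    return mix

print(select_for_listener(objects))  # [('crowd_bed', 0.0), ('commentary_fr', 3.0)]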

It is, then, all-round progress across the whole workflow instigated by the Olympics that should be celebrated. AI contributed to some of these advances, but mostly by fine-tuning existing technologies rather than, at this stage, enabling any radically new use case for sports broadcasting.
