The Sponsors Perspective: The Object Of The Game
How it fits in - a live broadcast workflow example for the AMBEO Sports Microphone Array, incorporating Lawo’s Kick software.
Sennheiser introduces its new and innovative approach to isolating audio objects in live sports broadcast production with the power of beamforming.
This article was first published as part of Essential Guide: Immersive Audio Pt 4 - Options And Tools For Production Of Live Immersive Content
Sports broadcasting is one of the most valuable sectors for paid services worldwide, and also one of the most prolific when it comes to pioneering new, premium viewer experiences. Immersive audio and object-based broadcasts are naturally suited to sports: competing athletes and teams offer obvious points of view, and the creative rewards of reproducing an immersive live atmosphere far exceed the value of immersive audio for many other programme formats.
However, even top-tier immersive sports broadcasts have tended not to embrace all the possibilities of the medium. Instead, a ‘pseudo’ immersion is constructed, shaped both by perceptions of what consumers want and can cope with, and by what is practically possible in live production. Now, though, the rise of ambisonic encoding and streaming services is pushing immersive audio into new realms, enabled by impressive modern tracking technologies and joined-up audio-follow-camera production automation.
Last year, Sennheiser proposed a new microphone product that embraced the possibilities of beamforming technologies and production automation systems, providing a route to connecting tracking technologies to the microphone. The Sennheiser AMBEO Sports Microphone Array - now refined, field-tested, and ready for deployment - offers unprecedented steerable directionality and off-axis rejection. It’s a real opportunity to multiply the impact of live sports: reacting to and reproducing accurate positional information, and also bringing important creative audio opportunities such as ball impacts, competitor effort and in-field dialogue within reach of the live mix engineer, something previously available only to post-production and sound design teams.
Importantly, the microphone allows many simultaneous beams to be generated from a single array, so that multiple objects can be isolated at once. One beam might follow the ball while another is focussed on the referee, for example.
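As a rough illustration of how such a multi-beam setup might be represented in a control layer, the Python sketch below models each beam as a named steering direction with a per-beam gain. The Beam class and its fields are hypothetical and not part of any Sennheiser or Lawo API.

```python
from dataclasses import dataclass

@dataclass
class Beam:
    """One steerable beam generated from the array (hypothetical model)."""
    name: str           # e.g. "ball" or "referee"
    azimuth_deg: float  # steering direction in the horizontal plane
    gain_db: float      # per-beam make-up gain, e.g. for distance compensation

# Several beams rendered simultaneously from the same 31-capsule array.
beams = [
    Beam(name="ball", azimuth_deg=37.5, gain_db=6.0),
    Beam(name="referee", azimuth_deg=112.0, gain_db=3.0),
]

for beam in beams:
    print(f"{beam.name}: steer to {beam.azimuth_deg} deg, gain {beam.gain_db} dB")
```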
The project was initiated to address a long list of issues currently experienced in sports broadcast:
- The sound event may take place at a considerable distance from any microphone.
- The sounds of interest may be far lower in level than the general ambient level in a stadium.
- The objects creating the sound may be moving at high speed.
- Any processing of the captured sounds must be compatible with live broadcast of the audio stream, and therefore introduce no latency, or latency low enough to be negligible.
- Systems should withstand adverse weather conditions such as rain or wind.
- Systems must withstand mechanical impact such as a ball hitting a microphone.
- Microphones shall not obstruct any camera view.
- The frequency spectrum of the sounds of interest ranges from low frequencies below 200Hz up to 5kHz and above, while typical crowd noises cover the same frequencies.
- Depending on the camera view, the sound may need to be panned and played back from a different angle (see the panning sketch after this list).
- Some sounds may only be of interest if they add information to the visuals, while others (such as the referee’s whistle) need to be heard independently of what is shown on screen.
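For the repanning point above, a simple way to picture the requirement is a constant-power pan law driven by the object’s bearing relative to the on-air camera. This is only a generic stereo illustration, not the processing used in the AMBEO system; the function name and coordinate convention are assumptions.

```python
import math

def constant_power_pan(bearing_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan law.

    bearing_deg is the object's bearing relative to the on-air camera axis:
    -45 is hard left, 0 is centre, +45 is hard right.
    """
    theta = math.radians(bearing_deg + 45.0)   # map -45..+45 deg onto 0..90 deg
    return math.cos(theta), math.sin(theta)    # (left gain, right gain)

# A whistle isolated by one beam, but 20 degrees right of the current camera axis.
left, right = constant_power_pan(20.0)
print(f"L = {left:.3f}, R = {right:.3f}")
```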
The polar response of a single Sennheiser MKH8070 shotgun microphone (left). The polar response of the AMBEO Sports Microphone Array, showing a marked increase in directivity and rear / off-axis rejection (right).
The solution to these points - the Sennheiser AMBEO Sports Microphone Array - is a 360-degree array of 31 shotgun microphones in the horizontal plane. Sennheiser uses beamforming algorithms (modal beamforming) paired with detailed modelling of the shotgun capsules’ response and their inherent directionality to achieve a highly directional output that can be steered in any direction via a control link.
The array can achieve ‘proper’ beamforming over a relatively wide bandwidth and with a flat frequency response – from below 200Hz to over 5kHz. Level differences due to distance can be accounted for with a variable gain for each beam.
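The per-beam gain can be pictured with a simple free-field inverse-distance law: a source n times further away loses 20·log10(n) dB, and the beam’s gain can make up some or all of that. The figures and function below are illustrative only, not Sennheiser’s actual gain law.

```python
import math

def distance_makeup_gain_db(distance_m: float, reference_m: float = 1.0) -> float:
    """Free-field (1/r) level loss relative to a reference distance,
    returned as the make-up gain that would restore it."""
    return 20.0 * math.log10(distance_m / reference_m)

# A ball impact 40 m from the array is roughly 32 dB down on the same impact
# at 1 m, so a beam gain of that order helps level-match it with closer beams.
print(f"{distance_makeup_gain_db(40.0):.1f} dB")
```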
There is no mechanical or motorised movement of the array; steering is done purely by processing the individual outputs of all the microphones in the array so that the combined contribution of the off-axis mics is attenuated and the on-axis audio is accentuated. Vertical rejection is handled by the shotgun mics themselves, through the usual action of their interference tubes.
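To make the idea of “purely processing the individual outputs” concrete, the sketch below shows the simplest steered-array technique, narrowband delay-and-sum, for a uniform circular array. The real array uses modal beamforming combined with a model of the shotgun capsules’ directivity, so treat this only as an indication of how capsule signals are phase-aligned toward a chosen direction; the radius and other parameters are assumptions.

```python
import numpy as np

def delay_and_sum_weights(n_mics: int, radius_m: float, steer_deg: float,
                          freq_hz: float, c: float = 343.0) -> np.ndarray:
    """Narrowband delay-and-sum steering weights for a uniform circular array."""
    mic_angles = 2.0 * np.pi * np.arange(n_mics) / n_mics  # capsule positions on the ring
    steer = np.deg2rad(steer_deg)
    # Relative arrival-time advance of a plane wave from `steer` at each capsule.
    delays = radius_m * np.cos(mic_angles - steer) / c
    return np.exp(2j * np.pi * freq_hz * delays) / n_mics

def beam_output(mic_spectra: np.ndarray, weights: np.ndarray) -> complex:
    """Combine one frequency bin from all capsules into a single beam output:
    on-axis components add coherently, off-axis components largely cancel."""
    return np.vdot(weights, mic_spectra)  # conjugate the weights, then sum

# 31 capsules on an assumed 0.5 m radius ring, beam steered to 37.5 deg at 1 kHz.
w = delay_and_sum_weights(n_mics=31, radius_m=0.5, steer_deg=37.5, freq_hz=1000.0)
print(w.shape)  # (31,)
```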
The fact that this directionality can be steered ‘hands-free’ by the algorithm is an important rung on the immersive and object-based ladder.
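Hands-free steering ultimately means converting tracking data into a beam azimuth. A minimal sketch, assuming a 2D pitch coordinate frame with the array at a known position (the coordinate system and function are hypothetical, not part of the product):

```python
import math

def steering_azimuth_deg(array_xy: tuple[float, float],
                         target_xy: tuple[float, float]) -> float:
    """Azimuth (0-360 deg) from the array to a tracked object in pitch coordinates.
    A real system would also fold in the array's orientation and tracking latency."""
    dx = target_xy[0] - array_xy[0]
    dy = target_xy[1] - array_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Array behind the goal at the origin; ball tracked at (30 m, 12 m) on the pitch.
print(f"{steering_azimuth_deg((0.0, 0.0), (30.0, 12.0)):.1f} deg")
```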
During the design phase, arrays with other microphone types were trialled and considered, but nothing offered as much off-axis rejection and control as the high-quality shotgun mics. A further benefit is that they provide sufficient rejection in the vertical plane without the array having to be three-dimensional, which avoids any obstruction issues with the cameras.
Since the prototype stage at IBC 2018, Sennheiser has made some big gains with the system, especially with the form factor. The array diameter has been reduced to one metre, which makes it a realistic consideration for pitch-side placement, where the larger original could obstruct digital advertising boards.
The system has been tested with the German and Spanish football leagues, as well as in the US on American football and basketball. For football specifically, trials have shown the most effective deployment to be two arrays behind each goal and one array opposite the centre-line camera and microphone. Sennheiser expects a unit to be commercially available in 2020, initially aimed at the football broadcast market. In the meantime, it continues to show off the technology, coupled with Lawo’s innovative Kick system, at the major broadcast shows, including IBC 2019.
The hope is that object tracking technologies, together with protocols such as Dolby’s ED2 that enable object-audio metadata to be carried along with the audio, will eventually bring all aspects of a live object-based production together, moving fully positional tracking data around alongside live-generated audio objects. In that object-based future, production automation will become the engineer’s third hand.
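The kind of information such metadata needs to carry per object can be sketched as below. This is purely an illustrative record, not the ED2 / SMPTE ST 2109 wire format; the field names are assumptions.

```python
import json

# A hypothetical per-frame metadata record for one live-generated audio object.
object_frame = {
    "object_id": "ball_beam",
    "timecode": "00:14:32:07",      # when this metadata frame applies
    "position": {                   # where the renderer should place the object
        "azimuth_deg": 37.5,
        "elevation_deg": 0.0,
        "distance_m": 40.0,
    },
    "gain_db": 6.0,
}

print(json.dumps(object_frame, indent=2))
```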