Asia’s Golden Melody Awards Feature Stunning AR-Supported Performance

Television production these days is tricky enough without adding virtual elements like augmented reality (AR) graphics, but that’s exactly what Taipei-based production company Getop did for the live telecast of the 2020 Golden Melody Awards (GMA). The highly rated annual awards ceremony - considered the “Asian Grammys” by many - celebrates top musical talent from across Southeast Asia.

Originally scheduled for June 27th, the ceremony was rescheduled to October 3rd due to the pandemic and moved to the outdoor Taipei Music Center. It was broadcast to high ratings on the Taiwan Television Channel and streamed live, with Taiwanese talent taking home many of the top prizes.

AR Takes Center Stage

The event featured a surprise performance by Taiwanese artist Hebe Tien that was supported by a large videowall backdrop and a number of AR effects floating across the stage and around the star. The complex AR segment presented several technical challenges in making the virtual effects work for viewers at home. Beyond the immense logistics - working with hundreds of people from different companies and departments - the AR portion of the production took a lot of technical coordination and finessing to get right. Paul Chou, CEO of Getop, said they began preparing for the event a month and a half prior and spent a lot of time rehearsing to ensure everything worked as desired.

KUMO 3232-12G router.

He said the key to making the AR effect believable and seamless during Tien’s performance was synchronizing the timing of the live feeds from each camera angle in real time using both software and hardware. They had to get it perfect or the stunning effect would be lost on viewers, and the pressure during production was intense.
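In practice, that kind of alignment comes down to trimming the leading frames of the slower paths so every feed starts on the same capture time, then consuming the feeds in lockstep. Below is a minimal sketch of the idea in Python, using hypothetical per-feed delays; it illustrates the general technique, not Getop’s actual software.

```python
from collections import deque

# Hypothetical per-feed delays in frames (illustrative, not measured
# values from the GMA production).
DELAYS = {"cam1": 2, "cam2": 4, "pgm": 1}

class FeedAligner:
    def __init__(self, delays):
        min_delay = min(delays.values())
        # A slower path delivers older frames first; those leading frames
        # have no counterpart on the faster feeds, so they are dropped to
        # bring every feed onto the same capture time.
        self.to_drop = {name: d - min_delay for name, d in delays.items()}
        self.buffers = {name: deque() for name in delays}

    def push(self, name, frame):
        if self.to_drop[name] > 0:
            self.to_drop[name] -= 1  # discard a leading, un-matchable frame
            return
        self.buffers[name].append(frame)

    def pop_aligned(self):
        # Once every feed has at least one frame buffered, the oldest
        # frames share the same capture time and can be emitted together.
        if all(self.buffers[name] for name in self.buffers):
            return {name: self.buffers[name].popleft() for name in self.buffers}
        return None

# Simulate arrivals: at tick t, each feed delivers the frame captured
# d frames earlier.
aligner = FeedAligner(DELAYS)
for t in range(6):
    for name, d in DELAYS.items():
        aligner.push(name, f"capture_{t - d}")
    frame_set = aligner.pop_aligned()
    if frame_set:
        print(frame_set)  # each set holds one capture time across all feeds
```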

“The high risk of live television made this production a high-pressure working environment because it’s either rated 0 or 100 at the end of the day,” said Chou. “AR brings these richly produced images to the production and opens up more opportunities for creativity. It satisfies the needs of the new generation that is used to seeing content from the internet that contains a lot of effects.”

The Getop team used a variety of 12G equipment from AJA Video Systems in tandem with gear from other vendors to make it happen. They developed a flexible live AR production system for the event that included a Dell 7920 workstation, an AJA Corvid 88 card for audio and video I/O, Pixotope - a software solution running on COTS hardware used to create the AR graphics - and Epic Games’ Unreal Engine for real-time rendering and compositing. An AJA GEN10 sync generator handled synchronization of the camera feeds, while an HD10CEA HD/SD-SDI converter was used to split the feed in two, generating additional signals for the effect.
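Conceptually, a broadcast AR render engine hands downstream equipment two pictures: a “fill” containing the rendered graphics and a “key” (alpha matte) marking where they are opaque, which are then blended over the live camera image. The NumPy sketch below shows that linear keying step with straight (non-premultiplied) alpha and placeholder frames; the real Pixotope/Unreal Engine path is far more involved.

```python
import numpy as np

# A toy linear-key composite: "fill" is the rendered AR graphic, "key"
# is its alpha matte, and both are blended over the live camera frame.
# An illustrative sketch of the general technique only.
def composite(camera, fill, key):
    """camera, fill: float32 arrays in [0, 1], shape (H, W, 3).
    key: float32 array in [0, 1], shape (H, W, 1), straight alpha."""
    return fill * key + camera * (1.0 - key)

# Placeholder frames stand in for live SDI inputs.
h, w = 1080, 1920
camera = np.zeros((h, w, 3), dtype=np.float32)   # live picture
fill = np.ones((h, w, 3), dtype=np.float32)      # rendered AR graphic
key = np.zeros((h, w, 1), dtype=np.float32)
key[400:700, 800:1100] = 1.0  # the AR element occupies this region
out = composite(camera, fill, key)
```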

HD10CEA HD/SD-SDI converter.

Pre-Production Challenges

During pre-production there were time delay issues in the Unreal Engine workflow, and an AJA Ki Pro Ultra 12G multichannel HD recorder, which supports the Apple ProRes codec, was used to solve them. Images were composited in real time and output to an OB truck parked on site. With the Ki Pro Ultra 12G’s multichannel HD recording capabilities, the crew was able to capture the output of the AR rendering engine alongside the other live production feeds coming from the OB truck and the cameras. This was critically important, since the delay of each signal was different and had to be adjusted in order for the AR effect to work.
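One common way to line up multichannel recordings that started with different latencies is to compare their start timecodes and trim every clip to the latest common start. The snippet below shows that arithmetic with made-up timecode values and an assumed frame rate; the article does not detail exactly how the crew adjusted their files.

```python
# Hypothetical start timecodes for three recorded channels (HH:MM:SS:FF).
FPS = 50  # assumed frame rate, not stated in the article
recordings = {"ar_engine": "10:00:00:12", "pgm": "10:00:00:04", "cam1": "10:00:00:00"}

def tc_to_frames(tc, fps=FPS):
    """Convert a HH:MM:SS:FF timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

frames = {name: tc_to_frames(tc) for name, tc in recordings.items()}
latest = max(frames.values())
# Each clip must lose (latest - own start) frames so all begin together.
trim = {name: latest - f for name, f in frames.items()}
print(trim)  # {'ar_engine': 0, 'pgm': 8, 'cam1': 12}
```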

Overcoming Delay

Chou said these signal delays occurred for several reasons, including the continuous processing of large amounts of data in Unreal Engine to precisely position high-fidelity AR effects. But the most noticeable delay, he said, came from the signal that was passed through the broadcast systems and captured with the AJA recorder. Because the live camera outputs often carried longer delays than the PGM output, there was a noticeable lag when the AR elements had to be produced in sync with the musical performance.
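The scale of the problem is easy to see with some back-of-envelope math: at broadcast frame rates each frame of pipeline latency is tens of milliseconds, so a handful of frames adds up to lag that is clearly visible against live music. The figures below are illustrative, not measurements from the GMA production.

```python
# Back-of-envelope delay math with assumed, illustrative numbers.
fps = 50                      # assumed broadcast frame rate
render_frames = 3             # hypothetical Unreal Engine render latency
transport_frames = 2          # hypothetical truck/processing latency
total_ms = (render_frames + transport_frames) * 1000 / fps
print(f"{total_ms:.0f} ms")   # 100 ms, i.e. 5 frames of lag to compensate
```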

“When we received the signal from the truck to monitor the program feed, it contained some delay problems,” said Chou. “For all of these signals, where we were getting different delay times, the Ki Pro Ultra 12G made it easy for us to record all the signals in sync, without having to use genlock as a reference. This helped reduce a lot of time spent managing the image files.”

For routing, Getop chose an AJA KUMO 3232-12G router, which handled all of the various audio and video sources and automatically distributed them to the appropriate destinations (within the venue and outside) using a number of preset parameters. The Getop crew also used a host of Mini-Converters to change the SDI signal to HDMI for the inputs of the large monitors on stage behind Tien as she sang. The AR elements appeared to flow out of the 2D screens as 3D graphics and land lightly on the stage before disappearing. These effects were only seen by viewers at home and on smaller monitors mounted throughout the venue.
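A preset-driven router of this kind essentially stores named salvo tables that map sources to destinations and fires a whole table at once. Here is a rough sketch of that idea; the source and destination names are placeholders of my own, and the Router class is a stand-in rather than KUMO’s real control interface.

```python
# Hypothetical salvo table; not Getop's actual routing configuration.
PRESETS = {
    "show": {
        "cam1": "unreal_in_1",
        "cam2": "unreal_in_2",
        "unreal_pgm": "ob_truck",
        "ob_pgm": "stage_monitors",
    },
}

class Router:
    """Stand-in for a real router control interface (assumed API)."""
    def route(self, source, destination):
        print(f"routing {source} -> {destination}")

def apply_preset(router, name):
    # Fire the whole salvo so every destination switches together.
    for source, destination in PRESETS[name].items():
        router.route(source, destination)

apply_preset(Router(), "show")
```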

Grading For Rec. 709 Color Space

Tien’s AR performance was recorded on a Ki Pro Ultra 12G recorder for archival purposes and monitored on an EIZO CG319X Grade One monitor working in the SDR Rec. 709 color space. The reference monitor was used extensively to check for correct color rendition.
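For reference, SDR Rec. 709 grading targets the opto-electronic transfer function defined in ITU-R BT.709, which maps scene-linear light to code values. Below is a direct transcription of that curve as a reference snippet; it is not part of Getop’s tooling.

```python
# The Rec. 709 OETF per ITU-R BT.709.
def rec709_oetf(linear):
    """Map scene-linear light in [0, 1] to a Rec. 709 code value in [0, 1]."""
    if linear < 0.018:
        return 4.5 * linear          # linear toe near black
    return 1.099 * linear ** 0.45 - 0.099  # power-law segment

print(rec709_oetf(0.18))  # 18% grey encodes to roughly 0.41
```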

At the end of the day, Chou said, using technology that is pro-certified to work seamlessly together makes networking and production workflows run more smoothly because the equipment all speaks the same protocol language. This interoperability saves setup time on-site during a live production and helps ensure its success, but experience and familiarity with how the technology operates count most.

Ki Pro Ultra 12G multichannel HD recorder.

“Virtual production is a mix of the real world and the virtual world, so you must master the techniques from both and accumulate a great deal of experience in order to get it right,” said Chou, adding that deploying a comprehensive workflow built on technology from AJA Video Systems helped ensure that everything worked together on the same local area network. “At the end of the day, you’ll need a high level of experience to present the art that has been created here and a good technology partner like AJA.”
