Virtual SMPTE 2020 Targets Global Community With Interactive Platform

Like most industry gatherings this year, the 2020 SMPTE show is virtual and runs from November 10-12, complete with an interactive environment that incorporates a main conference hub, meeting rooms, theater space for sessions and the annual awards gala, and an exhibition hall with private meeting spaces. Many of the events, in addition to the Awards Gala, have been pre-recorded so attendees can view them at their leisure.

Veteran technologist Bruce Devlin, who also serves as Standards Vice President of SMPTE (the Society of Motion Picture and Television Engineers), said going virtual will allow the conference to be truly global, enabling more people from distant places to participate in the annual event, which has been held in Los Angeles for the past several years.

“We wanted to make certain this was a global show that is a bit different than our physical shows in the past,” he said. “Previously we’ve had attendees from around the world, but the percentage compared to U.S. attendees was always very low. Because this is a virtual event, we wanted to show this inclusivity by making sure that there are sessions available during the night for Europeans and in prime time for the Asian community. So everyone who wants to can participate.”

The show’s creators also wanted to give their online platform a unique look and feel. The highly interactive interface, with 3D visuals developed by Storycraft Lab in New York from an original design concept by SMPTE staff, straddles the line between an esports gaming environment and the traditional media world, offering distinct areas attendees can click through to reach the session or vendor demo they want. Content will remain online for paid attendees for 30 days after the show closes on the 12th.

The online platform features a series of gaming-type portals that lead to various sessions and tutorials.

One of the big topics this year will be remote production, a method of producing content from a distance, including a session that will discuss the required connectivity and what remote production actually means in the real world. Another will look at how to control latency across the various networks involved in a production.

Devlin himself will be part of a panel discussion with Thomas Bause Mason on Tuesday, Nov. 10, from 3:15-3:45 PM, entitled “Are standards still relevant in a post-COVID world?” The session will look at the changing face of the media technology industry and explore a future in which open source, standards and the cloud fulfill different but essential roles in creating an interoperable global ecosystem.

During this opening session, Devlin and Mason will also discuss how SMPTE’s future relies on good relations with other technology and standards groups, such as ATSC, IEEE and DVB, in order to develop universal standards and foster collaboration among the organizations.

“The pipeline between the photon going into the camera and the photon coming at you off the screen is getting shorter and shorter,” said Devlin. “So, we’re focused on building a better relationship with organizations that have similar goals of helping users develop better production and distribution methods.”

Speaking of standards, SMPTE recently published its VC-6 encoding standard, a sibling to the MPEG-5 LCEVC standard. Both encoding methods, which together span everything from lossless, very-high-quality production at the top end down to enhanced distribution, will be discussed during the online show.

Machine Learning (ML) will be part of a keynote presentation given by Anima Anandkumar on Tuesday, Nov. 10 at 5:00 PM that will look at the role of ML in the media industry.

Attendee collaboration is a big part of what SMPTE is trying to accomplish with its online conference.

In fact, SMPTE has formed a new task force on ML and artificial intelligence (AI) in the media space and is working with the Entertainment Technology Center at the University of Southern California’s (USC) School of Cinematic Arts to develop new workflow processes. A Standards Session during the show, entitled “SMPTE Task Force,” will be presented by Yves Bergquist and Frederick Walls on Wednesday, Nov. 11 at 3:00 PM.

ML is also a topic near to Devlin’s heart: in his other role as owner of the Mr MXF consultancy, he has been working with clients to figure out how to harness its power to streamline a wide variety of workflows.

“I think there’s a whole bunch of stuff that’s really hard for humans to do that machines can accomplish so much faster and more accurately,” he said. “So, generating metadata to identify what actor is in a scene or whatever, that’s got to be done by humans today. There’s no reason that can’t be done by machines.”
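
Taking that idea literally, the kind of machine-generated cast metadata Devlin describes might look something like the sketch below. It is purely illustrative: identify_faces() and iter_frames() are hypothetical stand-ins for whatever face-recognition model and frame decoder a facility actually runs, not part of any SMPTE-specified workflow.

```python
# A sketch of machine-generated cast metadata, in the spirit of Devlin's example.
# identify_faces() and iter_frames() are hypothetical stand-ins for whatever
# face-recognition model and frame decoder a real pipeline would use.

from collections import defaultdict

def tag_cast(video_path, iter_frames, identify_faces, sample_every_s=1.0):
    """Sample frames, identify on-screen actors, and emit time-coded metadata."""
    appearances = defaultdict(list)              # actor name -> timestamps (seconds)
    for timestamp, frame in iter_frames(video_path, every_s=sample_every_s):
        for actor in identify_faces(frame):      # e.g. returns ["Actor A", "Actor B"]
            appearances[actor].append(timestamp)
    # Collapse per-frame hits into a simple sidecar record per actor.
    return [
        {"actor": actor,
         "first_seen_s": min(times),
         "last_seen_s": max(times),
         "samples": len(times)}
        for actor, times in sorted(appearances.items())
    ]

# Example use, with whichever decoder and model the facility actually runs:
# metadata = tag_cast("episode_101.mxf", iter_frames, identify_faces)
```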

Some of his clients are using ML for tasks like video filter optimization. To upconvert legacy content, you typically take an old interlaced SD image and process it through a filter. The problem is that different countries use different frame rates, so the content distributor has to find the right filter to get the best image quality.

“Upconverting that to progressive 4K images that do not look terrible is quite hard,” said Devlin. “What you end up doing is experimenting with 40 or 50 different filters and trying to figure out which one looks best. It’s really tedious for a human being to look at hours of content and judge which ones are best. Alternatively, we can teach a machine learning algorithm what ‘best’ looks like, give it 40 different filters, and it will apply them all to the images in a fraction of the time it would take a human.”
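
In rough terms, the selection loop Devlin outlines could be sketched as below. Both apply_filter() and quality_model() are assumptions for illustration: one stands in for a real de-interlace/upscale chain, the other for a scorer trained on examples of what ‘best’ looks like.

```python
# Sketch of the ML-assisted filter selection Devlin outlines. apply_filter()
# and quality_model() are hypothetical stand-ins: a real deployment would plug
# in its own de-interlace/upscale chain and a trained quality scorer.

def pick_best_filter(sd_clip, candidate_filters, apply_filter, quality_model):
    """Return the (filter_name, score) pair with the highest predicted quality."""
    best_name, best_score = None, float("-inf")
    for name in candidate_filters:               # e.g. 40-50 named filter recipes
        uhd_clip = apply_filter(sd_clip, name)   # interlaced SD -> progressive 4K
        score = quality_model(uhd_clip)          # learned proxy for a human viewer
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Example use:
# best, score = pick_best_filter(clip, FILTER_RECIPES, apply_filter, quality_model)
```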

He’s also excited about metadata and the increased role it will play in the future of the broadcast industry. Embedding metadata allows the user to deliver entertainment in new and interesting ways, and, he said, it will only become more common as we move forward. Devlin is trying to figure out how to take exotic on-set metadata and get it into the production pipeline at minimum cost; he tested the technique while recording a concert in Germany this summer.

“I hope that metadata gains some respect and is not overlooked like a third-class citizen, as it has been,” he said. “Metadata has always been the poor stepchild. But metadata can be used to deliver better content, to select better filters, to correlate different bits of a production. Moving metadata to the front of the line will have a massive impact on how we create content and how we consume content.

“In general this show will be talking about the core concept of what it means to be a broadcaster these days,” said Devlin. “Part of what we’re trying to do is make certain that those who haven’t been paying enough attention to what is going on can easily find the information they need.”
