2022 HPA Tech Retreat Returns In Person

After a year of meeting virtually, the Hollywood Professional Association (HPA) is hosting its Technology Retreat conference in person this year, and members could not be happier. The highly anticipated gathering of the industry’s forward-looking technologists, working at many of the largest companies in the U.S., is being held February 21-24 at the Westin Mission Hills Golf Resort & Spa in Rancho Mirage, Calif.

It’s four days in the sun hobnobbing with some of the country’s most brilliant broadcast, production and post technology minds.

A wide range of topics will be spotlighted across the four-day event, from virtual production and cloud processing to HDR workflows, AI-based video codecs, Next-Gen TV, Edge computing, virtual reality and more. The technology exhibit area will feature a number of live demonstrations and a few novelties: a complete camera (including optical system) the size of a grain of salt, a TV set you can taste, and a media-equipped refrigerator — from 1956.

Be sure to sign up for the event’s daily “roundtables” focused on specific topics, with up to 30 topics on the agenda. For example, there’s a roundtable on "Broadcasting at the Edge." But understand that space at the tables is filling up fast.

“This year we’re hosting an in-person-only event with extraordinary health & safety precautions,” said Mark Schubin, "Program Maestro" for the HPA Retreat. “Therefore, the most valuable part of the event might be the in-person networking.”

Indeed, past Retreats have seen the president of Sony Pictures sitting next to a senior member of the Sinclair Broadcast team, so you never know who you’ll meet (or see for the first time since the pandemic started).

As usual, Schubin will present his “Technology Year in Review” report, filled with colorful anecdotes and uncanny comparisons to technology from years ago.

Yet it also covers things up to the last minute, “so I can't tell you everything that will be in it, but it will include some wild -- almost unbelievable -- developments in satellites, cameras, and lenses at the very least. There will also be some revelations about carbon impact that I found quite interesting.”

Tuesday of the conference will feature a full agenda of sessions covering in-camera virtual production for VFX work. These sessions have been coordinated by co-chairman Erik Weaver, whose day job is running the Adaptive and Virtual Production department at the Entertainment Technology Center at USC. He and his team have been responsible for helping create the now standardized Digital Cinema and IMF formats that streamline feature film effects and live production work.

One session, “Introduction to the VAD and Final Pixel” (Tuesday, 9:25-10:15 am), will boast a stellar panel discussing the fundamentals of creating a digital world using cloud-based Virtual Art Department (VAD) technology and Final Pixel processes that result in photo-realistic images, rendered in real time with Unreal Engine.

The popular Technology Exhibits area will return again this year.

“With AI or volumetric video capture, it’s a whole bunch of cameras pointing in at the subject and then they are inserting that into a 360 world,” said Weaver. “This is a real human being on a stage that’s being photographed with all of your visual effects in real time through the camera. And it leverages parallax processing so that as the camera moves, everything moves in proper perspective in a real time nature.”
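
For readers curious about the geometry behind that statement, the sketch below is a much-simplified, purely illustrative example (not drawn from the session): it intersects the ray from a tracked camera through a virtual scene point with the plane of the LED wall, which is the basic operation that keeps perspective correct as the camera moves. The function name and numbers are hypothetical.

```python
# Illustrative only: the basic parallax geometry behind in-camera VFX.
# A virtual scene point is re-projected onto the LED wall plane along the
# ray from the tracked camera, so its on-wall position shifts as the
# camera moves and perspective stays correct through the lens.
import numpy as np

def project_to_wall(point, camera_pos, wall_z=0.0):
    """Intersect the camera-to-point ray with the wall plane z = wall_z."""
    direction = point - camera_pos
    if abs(direction[2]) < 1e-9:
        raise ValueError("Ray is parallel to the wall plane")
    t = (wall_z - camera_pos[2]) / direction[2]
    return camera_pos + t * direction

# A virtual set piece 10 m "behind" the wall (hypothetical coordinates).
virtual_point = np.array([2.0, 1.5, -10.0])

# The same point lands at a different spot on the wall as the camera dollies.
for cam_x in (0.0, 0.5, 1.0):
    camera = np.array([cam_x, 1.7, 4.0])
    print(cam_x, project_to_wall(virtual_point, camera)[:2])
```

A real volume does this per pixel inside a game engine such as Unreal, driven by live camera-tracking data, but the underlying idea is the same.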

Panelists include Kristin Turnipseed, Lux Machina Consulting; Felix Jorge, Happy Mushroom; Ben Baker, ETC; Dane Smith, The Third Floor; and Arvind Arumbakkam from Wacom. All are experts in the field and are currently participating in real-world trials of the innovative processes.

Next up, at 1:40 pm on Tuesday, is “Optimizing dvLED Performance for Virtual Production (Or Any In Camera Usage),” presented by Gary Feather, who will take participants on a journey through LED display technologies and the benefits of each type.

The display used for virtual production or xR is referred to as dvLED, a direct view LED display. The system design and performance of the selected dvLED display can enhance or hinder its utility for a production, with the controller being an essential element. This presentation will cover the optical, electrical, display and physical characteristics that must be considered.
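
To make the trade-offs a little more concrete, here is a rough, hypothetical sizing helper; the “distance in metres roughly equals pixel pitch in millimetres” figure is a common industry rule of thumb and an assumption on our part, not something taken from Feather’s presentation.

```python
# Hypothetical back-of-the-envelope helper for dvLED planning. The
# "distance in metres ~ pixel pitch in millimetres" figure is a common
# rule of thumb and an assumption here, not from the presentation.

def wall_resolution(width_m, height_m, pitch_mm):
    """Pixel count of a dvLED wall of the given physical size and pitch."""
    pitch_m = pitch_mm / 1000.0
    return int(width_m / pitch_m), int(height_m / pitch_m)

def min_camera_distance_m(pitch_mm):
    """Rough distance beyond which individual LEDs (and moire against the
    camera sensor) tend to stop being visible."""
    return pitch_mm  # e.g. a 2.3 mm pitch -> roughly 2.3 m

w, h = wall_resolution(18.0, 6.0, 2.3)  # an 18 m x 6 m wall, 2.3 mm pitch
print(f"{w} x {h} pixels; keep the taking camera beyond ~{min_camera_distance_m(2.3)} m")
```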

Also on Tuesday, from 4:05 pm to 4:30 pm, is a session called “Final Sample - On Set Virtual Production Sound Challenges.” It will look at the acoustical equivalent of virtual production’s final pixel, called “final sample,” which is used to capture and deliver spoken on-set performances directly to audiences with minimal post processing or ADR. Acoustical camera technology visually demonstrates the challenges in 3D, including echoes from sound reflections. This session will be led by Eric Rigney, Executive Vice President of the Media & Entertainment Data Center Alliance, whose day job focuses on on-set virtual production technologies, workflows, and infrastructure.
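
As a rough illustration of why those reflections matter (our own back-of-the-envelope example, not material from the session), the sketch below estimates how much later a single bounce off a hard LED wall reaches the boom mic than the direct voice does; delays of more than a few tens of milliseconds start to read as a distinct echo.

```python
# Illustrative only: extra arrival time of one reflection off a hard, flat
# LED wall relative to the direct voice path, using the mirror-image method.
# Geometry and distances are hypothetical, not from the session.
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

def reflection_delay_ms(actor_to_wall, mic_to_wall, lateral_offset):
    """Delay of a first-order wall reflection versus the direct sound."""
    direct = math.hypot(lateral_offset, actor_to_wall - mic_to_wall)
    reflected = math.hypot(lateral_offset, actor_to_wall + mic_to_wall)
    return (reflected - direct) / SPEED_OF_SOUND * 1000.0

# Actor 2 m from the wall, boom mic 3 m from it, 1.5 m apart along the wall:
print(f"Reflection arrives {reflection_delay_ms(2.0, 3.0, 1.5):.1f} ms late")
```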

On Wednesday, February 23rd, 10:30 am – 11:15 am, Mobile TV Group’s Mark Chiolis will lead a panel entitled “Remote, Mobile, and Live Workflow Innovation Updates.” It will look at the effects of Covid on the industry and the many disruptive changes to live production workflows that have emerged. The panel will dive into workflow innovations across sports, esports, entertainment, corporate, concert, awards, and other events that have become, or soon will become, part of accepted workflows. Panelists will share what’s new, what worked, what maybe didn’t work, and how the way we all do “live production” will never be the same as it was.

Panelists include Scott Rothenberg, NEP Group; Phil Garvin, Mobile TV Group; Tony Cole, NFL Media; and Wileen Charles, Crown Media Family Networks/Hallmark Channel.

Other topics covered at the HPA Tech Retreat include the notion of systems integration in the cloud. Many have posited that once in the cloud, you no longer need a systems integrator. On Thursday at 11:50 am – 12:05 pm, Dave Van Hoy, president of Advanced Systems Group (San Francisco, Calif.) will host a session on “What Does Systems Integration Look Like in the Cloud.”

Matthew Goldman, now at the Sinclair Broadcast Group, will discuss “Broadcast HDR” at 4:30 pm – 5:00 pm on Wednesday.

Finally, at the close of each day of the conference, at 6:15 pm – 6:30 pm, Annie Chang and Leon Silverman will host “What Just Happened?”, a review of the day.

As in years past, HPA members will come from all parts of the globe to offer their insight. Register quickly: hotel space has already sold out, and registration, which is capped, is sure to fill up.
