HPA Tech Retreat 2017 (Part 1) – Technology, Innovation & Fellowship

There are any number of media conventions, conferences, trade shows and technical meetings across the world, but until last year there was only one technology retreat: HPA’s.

This year’s Hollywood Professional Association (HPA) Tech Retreat was held February 21-24, 2017 at the Hyatt Regency Resort in a suburb of Palm Springs. The second annual HPA Tech Retreat UK is currently requesting presentation proposals; that event will run July 11 through 13 in Oxfordshire, UK.

The unique mix of attendees alone made the original event stand out: 650 or so people from network and group broadcast engineering, video editing, senior post-production executives and engineers, motion picture distribution and exhibition, display technologies, cinematography, manufacturing, media law, and research & development -- Greater Hollywood -- holding informal meetings on emerging concepts and the latest media technologies. Factor in that the four-plus days of the Tech Retreat include four breakfasts, seven coffee/tea breaks, three catered lunches, an open cocktail reception and a dinner (with two drinks included), and the distinction is clear: the Tech Retreat is an ideal environment for getting to know other attendees, participants and panelists.

Scheduled providently for two or so months before the National Association of Broadcasters (NAB) convention, with proposed presentations closing about two months after the International Broadcasting Convention (IBC), the Tech Retreat provides new companies and established firms with an opportunity to debut (or preview) their work and offerings and enjoy direct feedback, without the crowds, rush, formality and bustle that characterize so many other industry events.

How formal could an event be where the host sports a long grey beard and wears a natty, probably recently laundered, at-one-time-surely-white T-shirt atop thin flannel trousers bearing patterns and colors from childhood pajamas, and is shod in sandals (no socks)?

Mark Schubin, the Tech Retreat’s friendly host and an engineering fashion iconoclast.


As formal as can be attained when the host is Mark Schubin: Society of Motion Picture and Television Engineers (SMPTE) Fellow, media and technology writer, historian and researcher, winner of multiple Emmy awards, and long-time technical consultant to the Metropolitan (“Met”) Opera in New York.

Covering the Tech Retreat can be a challenge due to its nature as a semi-formal engineering meeting whose table-top exhibits may be no more than a basic proof of a very interesting concept. Presenters and exhibitors may not be “press-ready,” and, with no exhibitor staff or exhibitor guest passes, few marketing folks are in attendance. Topics covered and explored at the Tech Retreat tend to span all media platforms and may not fit readily into categories such as broadcast, production and post-production. Mark Schubin himself, with nothing to sell and a devotion to exploring the new, fits no previously identified category.

The 23rd Annual Tech Retreat was preceded by a half-day “TR-X” session on the current state and prospects of virtual and augmented reality.

Virtual and other realities

Virtual, Augmented and Mixed realities were the focus of a half-day TR-X session on Monday.


The developing technologies, creative possibilities and practical considerations of virtual reality, ‘360° video’ and augmented (or ‘mixed’) reality have drawn great interest and investment in recent years: a reported $1.8 billion in 2016, more than double the 2015 level. Projections for 2017 call for yet another doubling of expenditures. What goes unreported are actual industry revenues or returns on VR/AR and the like. The VR/AR sessions at TR-X were refreshingly practical. All seats were occupied, with rows of standees at the back of the room.

The first presenter was Lucas Wilson, founder of SuperSphere VR. He provided an overview of the emerging and newly practical imaging and presentation technologies that constitute “virtual reality” and related technologies.

Six degrees of freedom (also known as 6DOF) is techno-speak for the ways a camera can move. Five are the familiar camera directions: pedestal (up/down vertically), dolly (closer to or farther from the subject), truck (left/right movement), pan (pivoting left or right horizontally) and tilt (pivoting the camera up or down).

The sixth degree of freedom is rolling (leaning) left or right, which is known to cause nausea. Learning to curtail use of that degree of freedom will help make VR acceptable. While VR-induced nausea has created a spot of controversy, many forms of media initially caused some degree of nausea in the populace, including television.
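Taken together, the six degrees break down into three translations and three rotations. A minimal sketch of a camera pose carrying all six (the class name, field names and units are illustrative assumptions, not any rig’s API):

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """All six degrees of freedom, using the terms defined above.

    Units are assumed: metres for translations, degrees for rotations.
    """
    truck: float = 0.0     # translation left/right
    pedestal: float = 0.0  # translation up/down
    dolly: float = 0.0     # translation toward/away from the subject
    pan: float = 0.0       # rotation left/right (yaw)
    tilt: float = 0.0      # rotation up/down (pitch)
    roll: float = 0.0      # rotation leaning left/right -- the queasy sixth

# A VR move that exercises five degrees but curtails roll, per the advice above:
pose = CameraPose(dolly=1.5, pedestal=0.2, pan=30.0, tilt=-5.0, roll=0.0)
```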

Stereo videography employs two matched video cameras that share a mount and horizontal plane and move as one unit, providing video that emulates binocular vision. Some stereo video implementations instead employ two anamorphic lenses that divide up the sensor of a single camera. When viewed on a suitable device, including 3D TV sets, stereo adds a “depth sense” to video. Unfortunately, stereo videography tends to flatten people and objects compared to a binocular view of the same scene with the naked eye. While stereoscopic cinematography can be used in “3D” (now called stereo-3D or S3D) cinema productions, due to workflow, consistency and other considerations 3D effects are largely synthesized in post-production from single-camera captures.

Stereoscopic video camera arrangement.

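The “depth sense” comes from horizontal disparity: the same object lands at slightly different positions in the left and right images. As a minimal sketch (the function name and numbers are illustrative, not from the session), the standard rectified-stereo pinhole relation recovers distance from that disparity:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified-stereo pinhole relation: depth = focal length * baseline / disparity.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal shift of a feature between left and right images
    """
    if disparity_px <= 0.0:
        raise ValueError("non-positive disparity implies a point at or beyond infinity")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 65 mm baseline (roughly human
# eye spacing), 20 px disparity -> a point about 3.25 m from the rig.
print(depth_from_disparity(1000.0, 0.065, 20.0))
```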

360° video is captured by an array of nominally matched cameras mounted at equidistant points around a circle on a common horizontal plane. The cameras all move as one. Each camera ‘viewpoint’ can sport a single camera or a stereo video pair. The cameras tend to employ fixed lenses with wide-angle or fish-eye fields of view. Implementations can add cameras along the vertical axis in addition to the horizontal for true ‘spherical 360°’ coverage. “Computational videography” and other algorithms stitch the video signals together into a composite, normally compressed, feed. 360° video can be viewed using a variety of low-fidelity (but ever-improving) personal handheld viewers and is also suitable for “video in the round” direct displays that could give groups or audiences a 360° experience. Unfortunately, most current viewing devices support only one or (at best) two viewers. To be usable in wider contexts, 360° video could benefit from flat image capture that leaves the stitching and lens-distortion correction to post-production.

A 360° video camera array.

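What the stitching step ultimately computes is a mapping from each source pixel’s viewing direction into a shared panoramic frame, most commonly an equirectangular projection. A minimal sketch under assumed conventions (+z forward, +y up; not any vendor’s pipeline):

```python
import math

def direction_to_equirect(x: float, y: float, z: float, width: int, height: int):
    """Map a 3D viewing direction onto equirectangular pixel coordinates.

    Longitude wraps the horizon onto the image's width; latitude maps
    up/down onto its height. Stitchers apply a mapping like this (plus
    lens-distortion correction and blending) to every source pixel.
    """
    lon = math.atan2(x, z)                            # -pi..pi around the horizon
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))   # -pi/2..pi/2 up/down
    u = (lon / math.pi + 1.0) * 0.5 * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# Looking straight ahead lands in the centre of a 3840x1920 frame:
print(direction_to_equirect(0.0, 0.0, 1.0, 3840, 1920))  # (1919.5, 959.5)
```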

Plenoptics (from the Latin plenus, ‘full’, plus optics) uses an array of cameras (or an array of lenses within a camera) to capture more of the “light field” that emanates from illuminated objects in a scene but is discarded by traditional lenses, which focus a single ray per point. Early thinking on light fields included contributions by Michael Faraday, among others. Recent advances in raw data processing speed and bandwidth have breathed new life into a field of study that lay dormant for well over a century. Practical plenoptic viewing solutions have yet to appear on the horizon, but plenoptics portends great promise for live and digital cinema productions once data processing speed and bandwidth improve further. In post-production, plenoptically captured scenes can be refocused; the effective aperture and f-stop can be changed, even the frame rate, and cinematographers can achieve depth of field impossible with mere optics and even benefit from a new keying tool: “depth keying.” If a broad enough bundle of rays is included in a plenoptic capture, the point of view can be adjusted, almost as if the viewer could look around a subject in the middle of the image. While some plenoptic implementations use an array of cameras in a single mount, others comprise multiple cameras on two or more sides of a field of action.
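The refocusing trick can be illustrated with the classic “shift-and-add” algorithm: shift each sub-aperture view in proportion to its offset from the aperture centre, then average. A minimal NumPy sketch, assuming the sub-aperture images have already been extracted (the array layout is an assumption; real pipelines use sub-pixel warps rather than integer shifts):

```python
import numpy as np

def refocus(subviews: np.ndarray, shift: float) -> np.ndarray:
    """Synthetic refocusing by shift-and-add over sub-aperture views.

    subviews -- shape (U, V, H, W): a U x V grid of grayscale sub-aperture
                images extracted from a plenoptic capture.
    shift    -- pixels of translation per unit of aperture offset; sweeping
                this value moves the synthetic focal plane through the scene.
    """
    U, V, H, W = subviews.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    acc = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            dy = int(round((u - cu) * shift))
            dx = int(round((v - cv) * shift))
            # np.roll stands in for a proper sub-pixel warp
            acc += np.roll(subviews[u, v], (dy, dx), axis=(0, 1))
    return acc / (U * V)
```

Sweeping shift after capture is exactly the “refocus in post” capability described above; depth keying falls out of the same data, since objects at different depths align at different shift values.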

Holography employs coherent laser light and spatial light modulators to capture three-dimensional scenes and play the capture back.

When one or more of the above technologies is combined with a virtual world and user-directed interactivity (such as camera movements), the result is “virtual reality,” although that term is mostly used for 360° video.

Augmented (or mixed) reality overlays live motion imagery atop the virtual or actual scene one is viewing. One example would be the much-derided (but not officially dead) Google Glass camera/computer/display/eyeglass combo, which overlaid a miniature computer screen atop the field of view of one’s right eye.

Later TR-X VR/AR panels addressed the VR/AR market and various presentation technologies, as well as needed work on VR/AR processing workflows. Andrew McGovern of Digital Domain and Michael Mansouri of Radiant Images presented case studies on live 360° video productions. One interesting aspect of 360° video production is where to locate the crew so they remain off-camera and off-microphone; huddling amid the tripod legs is one of the common approaches. The VR/AR sessions concluded with a panel of young professionals engaged in VR development.

Changing consumer expectations for content

Tech Retreat sessions are preceded by breakfast roundtable presentations over coffee, juice and breakfast buffets. Any attendee can host a breakfast roundtable; just write the topic next to an empty table number and proceed. The end of the breakfast is announced by Mark Schubin softly sounding his pink bicycle horn.

The first full day of Tech Retreat 2017 was a “super-session” on changes in the consumption and delivery of media content. Panels and presentations ranged from theater owners and operators to participants in the latest wave of news and entertainment startups, plus an uncredited appearance by a Nielsen vice president detailing the continued supremacy of broadcasting, contrasted with the inroads that streaming media continue to make in viewer preferences.

At this Tech Retreat, the HPA coined a new term that drew groans and chuckles from the mostly middle-aged audience: HPA has started a “youthenization” initiative to broaden HPA membership with younger members. Fortunately, the initiative does not seem to involve actual euthanization of older members.

One panel of speakers addressing young people and their media habits contained only younger voices and viewpoints.

Sana Saeed, a producer and host at AJ+, stood out on that panel. AJ+ is an online-only, social-only multimedia news service that was carved out of the closure of Al Jazeera America. The “network” garners very high audience and engagement numbers without a web site or dedicated TV channel. AJ+’s produced and reported pieces are streamed and consumed on YouTube, Facebook, Twitter/Periscope and other social media platforms, where many viewers interact with each other and with the reported and produced stories.

Tech Retreat audience before the day's first speaker.


At the Tech Retreat, broadcasting per se is usually addressed in the Broadcaster’s Panel, hosted by Matthew Goldman, SVP Technology, TV & Media, Ericsson. This year’s panel included engineering executives from CBS, NBC, FOX, PBS and group station operators. Most of the panelists talked about the prospects for ATSC 3.0 and, publicly for the first time, addressed RF band repacking, as the FCC-imposed “quiet period” had elapsed shortly before the event.

Robert Seidel, VP Engineering & Advanced Technology, CBS & CW TV Networks, alone among the panelists, said that his network had no interest in broadcasting 4K video -- as opposed to capturing and processing in 4K -- but is very interested in High Dynamic Range (HDR) video. He outlined how ATSC 3.0 could be used to transmit 1080p video in HDR. An ATSC 3.0 TV set would automatically upconvert the video to 4K, while the transmission would use vastly fewer bits than native 4K would require.
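Back-of-the-envelope arithmetic (illustrative only, not figures from the panel) shows the raw-pixel side of Seidel’s point:

```python
uhd_pixels = 3840 * 2160       # "4K" UHD frame
hd_pixels = 1920 * 1080        # 1080p frame
print(uhd_pixels / hd_pixels)  # 4.0 -- native 4K carries four times the raw
                               # pixels per frame before any codec savings
```

Compression narrows the gap, but a 1080p HDR emission with set-top upconversion still spends far fewer bits than a native 4K service.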

Del Parks, Sinclair Broadcast Group Senior Vice President and Chief Technology Officer, reported that the repack would involve 93 “RF moves” for his company. More than 1200 of the 1600 or so stations (excluding Class A, LPTV and TV translator stations) on air would be affected, without considering the effect of channel-sharing arrangements.

Both Sinclair and Fox are committed to rolling out ATSC 3.0, including 4K video and associated online interactive services. Rich Friedel, Fox Executive Vice President & General Manager, Engineering & Operations, said that Fox was all-in for the ATSC 3.0 rollout and noted that he is also chairman of the Advanced Television Systems Committee (ATSC), which developed the standard.

One unexpected presentation was the interview, termed the “Spectacle of the Theater,” with Neil Campbell, President & CEO of Landmark Cinemas (Western Canada), that country’s second-largest exhibition chain with 44 locations and 336 screens. Mr. Campbell described his chain as favoring movie lovers, with new projection systems, seat reservations, a move to more comfortable seating, and expanded food and beverage options.

The day’s sessions ended with a cocktail party in the Innovation Zone, where new and existing firms showed their innovations.

The second part of this report will cover additional technological advancements discussed at the HPA Tech Retreat 2017 and provide an intriguing look at interactive plenoptics possibilities and the history of the HPA Tech Retreat.

Part 2 of this report can be read here.
