Blending real and virtual elements on set: part 3

Although augmented reality (AR) and virtual studio (VS) technology has been around for decades, its use is only now coming of age. Previous barriers of price, design quality, graphics performance, and ease of use have fallen, so that a new studio built around AR and VS now costs roughly the same as a traditional studio with practical sets.

Discussing the topic in our third look at blending real and virtual sets are Mark Bowden, Senior Product Manager at ChyronHego, and Dave Larson, General Manager of Ross Virtual Solutions.

AR, explains Larson, technically means there is no green screen background, so virtual elements can only appear in the foreground or the illusion is lost. With a green screen, virtual elements can sit in both the foreground and the background.
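
One way to picture the difference is as a question of compositing order. The following sketch, a hedged illustration in Python with NumPy rather than either vendor’s actual pipeline, treats each source as an RGB image with an optional alpha channel: without a key the camera feed is opaque, so graphics can only be layered over it, while a chroma key gives the camera feed its own alpha so virtual elements can sit both behind and in front of the talent.

```python
import numpy as np

def over(front_rgb, front_alpha, back_rgb):
    """Standard 'over' composite: lay the front layer on top of the back layer."""
    a = front_alpha[..., None]  # per-pixel alpha in 0..1
    return front_rgb * a + back_rgb * (1.0 - a)

def composite_ar(camera_rgb, graphics_rgb, graphics_alpha):
    """AR without a green screen: the camera image is fully opaque,
    so virtual elements can only be placed in front of it."""
    return over(graphics_rgb, graphics_alpha, camera_rgb)

def composite_virtual_studio(camera_rgb, key_alpha,
                             virtual_bg_rgb, graphics_rgb, graphics_alpha):
    """Virtual studio with a chroma key: the key gives the camera image its own
    alpha (talent opaque, green screen transparent), so virtual elements can sit
    both behind and in front of the talent."""
    talent_over_virtual_set = over(camera_rgb, key_alpha, virtual_bg_rgb)
    return over(graphics_rgb, graphics_alpha, talent_over_virtual_set)
```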

“The biggest challenge in blending real and virtual elements is the creation of more photorealistic virtual elements and making a seamless transition between the real and the virtual,” suggests Larson. “This includes matching the lighting and the reflections of the real environment on the virtual elements so they composite seamlessly into the real environment.

“The goal should always be to create virtual elements that support and enhance storytelling. The same principles that go into making real practical elements should be applied to developing virtual elements. Designing with virtual elements needs to include the interaction of the talent with the virtual elements as well as having proper space and location for the virtual elements. Uniquely, virtual elements can defy the laws of physics, enabling new approaches in using virtual elements in storytelling.”

AR for VS Design

Bowden's advice is to approach AR as something that enhances the shoot, rather than as a design that overtakes the set and forces the crew to shoot around AR elements. In other words, he says, the crew shouldn’t have to change their normal shooting routine; the AR should blend in. AR has to work within the realm of the physical set: it should be dictated to by the show, not dictate the show.

However, AR is not always suitable. “It should never be used if it doesn’t serve a purpose or becomes editorially jarring,” says Bowden, adding, “AR should blend in seamlessly and never distract, and it should never dictate the overall look of the set.”

Neither is it particularly appropriate, suggests Larson, for programming that requires a lot of physical interaction with set elements, or for shows with audience participation, where the audience cannot see the virtual elements on stage and must rely on monitors. “Environments with uncontrolled lighting can be difficult since this may change or break the illusion,” he says.

Rehearsals are the key to making talent comfortable with the technology. This can include placing physical marks, visible only to the talent, to identify the location of virtual elements. Larson also recommends the “placement of large monitors off camera to provide talent feedback so they can visually and physically interact with the virtual elements.” Ross Video suggests a walk-before-you-run approach, introducing virtual graphics incrementally as talent works with the technology.

It takes time and patience for a presenter to become comfortable with AR elements, and also to create an environment in which the presenter looks natural with the AR. “This requires careful placement of monitors, repetition until such factors as the proper eye line become natural, and keeping the environment as simple as possible,” advises Bowden. “It’s all about making the presenter’s words work and letting the AR elements handle the complex visual aspects of the story.”

Lens calibration is vital; its importance for creating a realistic, non-distracting AR or VS experience can’t be overstated. On too many broadcasts, the vendors note, glaring AR errors appear because the lens was not calibrated properly. “In order to blend AR with physical elements seamlessly and in a believable manner, the lens focal length has to be perfect from tight to wide angle,” says Bowden.
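
In practice, “calibrated” means that for every zoom (and focus) encoder position the lens can report, the graphics engine knows the real field of view so the virtual camera’s frustum can match it across the whole range from wide to tight. The sketch below illustrates only that core idea, using invented sample values and simple interpolation; real calibration covers many more points and also models lens distortion and nodal point shift.

```python
import numpy as np

# Hypothetical calibration table: zoom encoder counts measured against the
# horizontal field of view (in degrees) observed at each lens setting.
zoom_counts = np.array([0, 1000, 2000, 3000, 4000])   # wide angle -> tight
fov_degrees = np.array([62.0, 41.0, 24.5, 12.0, 5.5])

def virtual_camera_fov(zoom_count):
    """Interpolate the calibrated field of view for the current zoom position
    so the virtual camera matches the real lens."""
    return float(np.interp(zoom_count, zoom_counts, fov_degrees))

print(virtual_camera_fov(2500))  # ~18.25 degrees, between two measured points
```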

A look to the future

“From a technology point of view, more powerful graphics will enable virtual elements to become more photorealistic, with real-time shadows and reflections making the blend of virtual and real seamless,” says Larson. “Improved talent interaction with virtual elements as well as advanced talent tracking technology will play a big role in the future.

“Camera tracking will become ubiquitous, with all cameras and lenses tracking their position and field of view, streaming the data to real-time graphics engines as well as saving the metadata for postproduction workflows. Both real-time depth keying and difference keying will be used as those technologies evolve to the current level of chromakeying technology.”
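
Larson’s “streaming the data to real-time graphics engines” boils down to a small per-frame sample of camera pose and lens state. The sketch below shows a simplified, hypothetical packet layout and UDP sender, purely to illustrate the kind of metadata involved; it is not the FreeD protocol or any vendor’s actual format, and the port number is arbitrary.

```python
import socket
import struct
from dataclasses import dataclass

@dataclass
class CameraTrackingFrame:
    """Hypothetical per-frame tracking sample: camera pose plus lens state."""
    frame: int                             # video frame counter, kept as metadata
    x: float; y: float; z: float           # camera position (metres)
    pan: float; tilt: float; roll: float   # camera rotation (degrees)
    zoom: float; focus: float              # normalised lens encoder values

    def pack(self) -> bytes:
        # Fixed little-endian layout: one uint32 followed by eight float32 values.
        return struct.pack("<I8f", self.frame, self.x, self.y, self.z,
                           self.pan, self.tilt, self.roll, self.zoom, self.focus)

def send_to_graphics_engine(sample: CameraTrackingFrame,
                            host: str = "127.0.0.1", port: int = 6301) -> None:
    """Stream one tracking sample to the render engine over UDP, once per frame."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(sample.pack(), (host, port))
```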

At NAB 2015, Ross expects broadcasters to be looking seriously at how this technology can enhance storytelling, retain and gain viewership, create new advertising opportunities, and reduce future design and operational costs.

“We anticipate that more technology companies will be showing how to practically use AR and VS for news, weather, and sports today. We also will see more use of a combination of both practical and virtual elements to enable customers to get started now and grow with the technology over time.”

ChyronHego's take is that forthcoming solutions will enable users to automatically drive virtual graphics. As operators become more comfortable with the AR tools, we’ll start to see their use with jibs and Steadicams rather than just stationary pedestals. If not this year then at the 2016 NAB Show, Bowden thinks we’ll see more dynamic cameras that facilitate this AR interaction with presenters – such as the new class of depth-sensing cameras from Intel.
