Blending real and virtual elements on set: part 2

In the second part of our look at augmented and virtual reality elements in virtual sets, we'll start by explaining the difference between AR and VR. The terms augmented reality (AR), virtual reality (VR), virtual studio and virtual set all refer to different flavours of the same technology, each emphasising different features.

Gerhard Lang, chief engineering officer, Vizrt explains: “When we’re talking about augmented reality we are, for the most part, referencing tracked virtual elements. They’re arranged three-dimensionally and attached to a live video signal. These are outputted as a ready-composite of live video and graphics. Virtual reality is an older term that refers to live motion in a three dimensional world. This doesn’t necessarily have any live video in it.”

The term virtual studio (or virtual set) refers to a three-dimensional space onto which elements – like a presenter – are keyed. Camera motion inside the virtual studio is driven by tracking data from a live studio camera, creating a realistic environment. The tracking data from the camera and the rendering of the graphics are composited and output as a final image. Keyed graphics (for example, AR graphics) can still be placed on top of this image, adding further layers to the final output.
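This layering can be pictured as a simple chain of "over" composites: the tracked virtual set render at the bottom, the keyed presenter on top, and any additional AR graphics as the final layer. The sketch below is a minimal conceptual illustration assuming numpy-style image and key arrays; the function names are placeholders, not Viz Engine's actual pipeline.

```python
import numpy as np

def key_over(background, foreground_rgb, foreground_alpha):
    """Standard 'over' composite: keyed foreground placed on top of the background."""
    a = foreground_alpha[..., None]          # per-pixel key (0 = transparent, 1 = opaque)
    return foreground_rgb * a + background * (1.0 - a)

def compose_final_frame(virtual_set_render, talent_rgb, talent_key, ar_rgb, ar_key):
    frame = key_over(virtual_set_render, talent_rgb, talent_key)  # presenter over the 3D set
    frame = key_over(frame, ar_rgb, ar_key)                       # AR graphics as the top layer
    return frame
```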

Lang says AR graphics are more complicated to work with than normal real-time 3D graphics. “Users and operators need to understand not only the physics of what is going on but also the potential and limitations of the technology,” he says.

“The human eye is highly sensitive to irregularities in any video content and picks them up immediately,” he explains. “This ruins the illusion. To get AR right, we must minimise any errors introduced by technology. One of the most important elements is studio camera tracking, which reports the position and the viewing direction of every studio camera, as well as the zoom and focus of each lens. This tracking data must be accurate to the smallest detail and delivered with as little delay as possible with constant attention paid to timing behaviour.”
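As a rough illustration of the tracking data Lang describes, the sketch below models one per-frame sample carrying position, viewing direction, zoom, focus and a timestamp that can be checked against render time. The field names and units are assumptions made for illustration, not Vizrt's actual tracking protocol.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One per-frame tracking sample; field names and units are illustrative
    assumptions, not Vizrt's actual tracking format."""
    timestamp: float      # capture time in seconds, used for delay/timing checks
    position: tuple       # camera position (x, y, z) in studio space, metres
    pan_tilt_roll: tuple  # viewing direction in degrees
    zoom: float           # normalised zoom ring position (0.0 wide .. 1.0 tele)
    focus: float          # normalised focus ring position

def tracking_delay(sample: TrackingSample, render_time: float) -> float:
    """How late the tracking data is relative to the frame being rendered."""
    return render_time - sample.timestamp
```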

An AR virtual camera must match its real camera as exactly as possible, including image deformation when using a specific wide-angle lens, and depth of field. The position and rotation of the camera are, of course, local: the camera’s coordinate space must match the one used to build the scene. Put simply, an AR element that is meant to sit on a real table won’t appear to be there if the two coordinate systems don’t agree.
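The coordinate-space requirement can be illustrated with a generic world-to-camera transform: the tracked pose only places a virtual object on the real table if the scene and the tracking system share the same origin, axes and units. The maths below is a simplified illustration, not the renderer's actual implementation.

```python
import numpy as np

def view_matrix(camera_position, camera_rotation):
    """World-to-camera transform built from a tracked position and rotation."""
    R = np.asarray(camera_rotation)        # 3x3 rotation, world axes -> camera axes
    t = -R @ np.asarray(camera_position)   # move the world origin into camera space
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# An AR element placed "on the table" at (2.0, 0.0, 0.9) metres in studio
# coordinates only renders on the real table if the tracking data uses the same
# origin, axes and units as the scene. Deliver centimetres where the scene was
# built in metres, or use a different origin, and the object drifts off the table.
table_point = np.array([2.0, 0.0, 0.9, 1.0])          # homogeneous studio-space point
cam_pose = view_matrix([0.0, -4.0, 1.6], np.eye(3))   # example tracked camera pose
point_in_camera_space = cam_pose @ table_point
```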

“The virtual elements must fit naturally with the real set with regards to colour, contrast, light and shadows – even if the element is something which does not exist in real life,” outlines Lang. “It’s distracting to the human eye if the shadows cast by virtual 3D objects are different to the ones cast by the real objects in the space, or if an object appears a lot brighter than the real-life studio lights would allow.”

What are some of the best approaches to designing with AR in mind?

Lang: Focus on getting the idea right. Content producers need to be sure that the AR they’re putting into their work will enhance the content and make the production stand out. And involve everyone: producers, designers and virtual set wizards should all have a say in the process.

Secondly, limit your idea to something your team is able to produce. Make sure that your AR objects, texture videos and so on will run in real time on your Viz Engine. Think carefully through the production set-up so you know what cameras and shots will be needed during production and how you’ll cut between them live. Stick with real-life measurements and avoid scaling. If possible, have a mock-up of the real environment available inside Viz Artist so you can check how your elements will work without needing a camera and tracking available for planning.

When does AR/VR not work?

Lang: Everything in this area now works so well that there aren’t really any limits to the use of AR anymore. With highly portable tracking systems, like ncam, AR is now possible everywhere. If the hardware has already been assembled and calibrated, on-site setup time is really short. AR enhancements can now be used in a live broadcast within a few minutes. AR systems from Vizrt consisting of a laptop and Thunderbolt expansion box are lightweight and easy to transport anywhere. Because there are no more limitations, technically-speaking, the question producers need to ask themselves is whether AR makes sense and is appropriate for their particular project.

Vizrt virtual set at Sky Sports

How do you best prepare talent to work with AR/VR?

Lang: It’s fair to say that some talent is better than others when working with AR or VR. Those who are better can visualise for themselves how things will look, and they often have better timing, which helps them understand how to work with virtual elements. Having enough monitors to show the final picture makes it easier for everyone, not just the talent. But most importantly, there should be plenty of rehearsal and time for talent to study plans of the content. Moving with an AR element or within a VR environment should be as natural as moving with real things in a real environment.

How important is lens calibration?

Lang: It is crucial for getting virtual and real elements to match up at all zoom levels. It’s a job that simply has to be done, and at Vizrt we have made some good tools that simplify the process and cut down the time needed for calibrating lenses. At Vizrt we keep the lens and camera calibration separate, so once you’ve calibrated a lens, its data can easily be recalled when the lens is mounted on a different camera body. A little fine tuning and you are ready to shoot. That saved time is often very valuable, as setup should be as short as possible.
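One way to picture keeping lens data separate from the camera body is a small calibration store keyed by the lens itself, holding a zoom-to-field-of-view curve that can be recalled and interpolated on any body. This is an illustrative sketch only; the lens identifier, data layout and interpolation are assumptions, not Vizrt's calibration format.

```python
import json

def save_lens_calibration(lens_id, samples, path):
    """Store calibration against the lens itself, not the camera body.
    samples: (zoom_position, field_of_view_deg) pairs measured across the zoom range."""
    with open(path, "w") as f:
        json.dump({"lens_id": lens_id, "samples": samples}, f)

def load_lens_calibration(path):
    with open(path) as f:
        return json.load(f)["samples"]

def field_of_view(samples, zoom):
    """Linearly interpolate field of view for the current zoom position."""
    samples = sorted(samples)
    if zoom <= samples[0][0]:
        return samples[0][1]
    for (z0, fov0), (z1, fov1) in zip(samples, samples[1:]):
        if z0 <= zoom <= z1:
            return fov0 + (zoom - z0) / (z1 - z0) * (fov1 - fov0)
    return samples[-1][1]

# Hypothetical lens: calibrate once, then recall the same data on another camera body.
save_lens_calibration("wide_zoom_serial_1234",
                      [(0.0, 63.0), (0.5, 31.0), (1.0, 9.1)], "lens.json")
print(field_of_view(load_lens_calibration("lens.json"), 0.25))
```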

What can we expect from Vizrt at NAB?

Lang: Augmented reality is now a technology we see used in more and more productions, without being tied to a specific programme type like sports, news, finance or weather. Even so, graphics products can be tailored for specific genres, like Viz Arena for sports, which handles advertisements and field-tied graphics such as player line-ups, scores, team logos and so on.

Portability is also key to bringing AR into areas where it has not been used before. Viz on a laptop with ncam is a highly portable system which can go anywhere. We have even simplified our 4K AR solution: while we can’t do this on a laptop yet, we’ve reduced the setup to a single box which can do everything in 4K that people are used to in HD. AR elements can be used in a Viz Mosart automated production together with robotic cameras like the one from Electric Friends, which we’ll show at NAB 2015.

Having the camera moves and the appearance of AR elements automated and predictable inspires even more confidence in people working with AR. In studio environments we try to limit the setup time to a few minutes before a production can start. Together with MA we have a system which is able to track objects the talent is holding, as well as measure the position and orientation of objects in the real set. When working in a studio where the set changes all the time, this helps tremendously by adjusting masks and surfaces automatically.
