Mo-Sys Partners With VividQ For Next-Generation AR Displays

Mo-Sys has teamed up with VividQ, pioneers in computer-generated holography for next-generation augmented reality (AR) displays.

The partnership allows 3D holographic projections to be placed precisely in real space, enabling users of future AR devices, such as smart glasses, to explore virtual content in context with the natural environment.

Mo-Sys StarTracker is a proven and powerful camera tracking technology, widely used in television production and other creative environments for applications ranging from virtual studios to realtime set extensions. It provides the camera's precise position in XYZ space together with its full rotation.

VividQ software for computer-generated holography is used in innovative display applications from AR wearables to head-up displays. Holography, often described as the holy grail of display technologies, relies on high-performance computation of complex light patterns to project realistic objects and scenes, for example in AR devices. VividQ generates holographic projections which, thanks to the precise positional data from Mo-Sys, can be displayed to the user at the correct place in the real environment. This is a major advance on today’s AR devices, where flat (stereoscopic) objects are mismatched with the real world. By presenting holographic projections with depth, the user’s eyes can focus naturally as they scan the scene.
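To make the idea concrete, the sketch below shows one way a tracked six-degree-of-freedom camera pose (XYZ position plus rotation, as StarTracker provides) could be used to transform a world-anchored virtual object into camera coordinates, preserving its true depth for a holographic display. This is purely an illustrative assumption, not Mo-Sys or VividQ code; all function names and values are hypothetical.

```python
# Illustrative sketch: anchoring a virtual object in real space using a
# tracked camera pose. Not actual Mo-Sys StarTracker or VividQ API code.
import numpy as np

def world_to_camera_matrix(position, rotation):
    """Build a 4x4 world-to-camera matrix from a tracked camera pose.

    position : (3,) camera position in world XYZ
    rotation : (3, 3) camera orientation as a rotation matrix
    """
    view = np.eye(4)
    view[:3, :3] = rotation.T              # inverse of an orthonormal rotation
    view[:3, 3] = -rotation.T @ position   # inverse translation
    return view

def anchor_in_camera_space(world_point, view):
    """Transform a world-anchored hologram point into camera coordinates.

    The z component of the result is the object's real depth, which a
    holographic display can reproduce so the eye focuses naturally on it.
    """
    p = np.append(world_point, 1.0)        # homogeneous coordinates
    return (view @ p)[:3]

# Example: a hologram anchored 2 m in front of the studio origin, viewed by
# a camera tracked at (0.5, 1.6, -1.0) with no rotation (identity, for brevity).
view = world_to_camera_matrix(np.array([0.5, 1.6, -1.0]), np.eye(3))
print(anchor_in_camera_space(np.array([0.0, 1.5, 2.0]), view))
```

In a stereoscopic AR headset only the left/right image disparity changes with this transform; a holographic display can additionally reconstruct the computed depth itself, which is the distinction the article draws.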

“The possibilities and applications of augmented reality in realtime devices are only just being explored,” said Michael Geissler, CEO of Mo-Sys Engineering. “We are at the cutting edge of camera tracking; VividQ is at the cutting edge of computer-generated holography, and we are excited to work together to bring some of these concepts to reality.”

Darran Milne, CEO of VividQ, added: “Our partnership with Mo-Sys is key to understanding the potential of computer-generated holography in future AR applications, developing experiences where virtual objects can blend seamlessly into the real world.”
