Mo-Sys Partner With VividQ For Next-Generation AR Displays
Precision positioning of 3D models in augmented reality devices using computer-generated holography.
Mo-Sys has teamed up with VividQ, pioneers in computer-generated holography for next-generation augmented reality (AR) displays.
This allows 3D holographic projections to be placed precisely in real space, enabling users of future AR devices, like smart glasses, to explore virtual content in context with the natural environment.
Mo-Sys StarTracker is a proven and powerful camera tracking technology, widely used in television production and other creative environments for applications ranging from virtual studios to realtime set extensions. It delivers the precise position of the camera in XYZ space, together with its full rotational orientation.
VividQ software for computer-generated holography is used in innovative display applications, from AR wearables to head-up displays. Holography, often described as the holy grail of display technologies, relies on high-performance computation of complex light patterns to project realistic objects and scenes, for example in AR devices. VividQ generates holographic projections which, thanks to the precision location data from Mo-Sys, can be displayed to the user at the correct place in the real environment. This is a major advance over today's AR devices, where flat (stereoscopic) objects are mismatched with the real world. By presenting holographic projections with depth, the user's eyes can focus naturally as they scan the scene.
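In essence, the tracking data described above is a six-degrees-of-freedom camera pose: an XYZ position plus a rotation. As a minimal illustrative sketch (not Mo-Sys or VividQ code; all names and values here are hypothetical), the snippet below shows how such a pose can re-express a world-anchored virtual object in camera coordinates, the kind of step a display pipeline needs before rendering content at the correct place in the scene:

```python
import numpy as np

def rotation_z(theta):
    """3x3 rotation matrix about the Z axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def world_to_camera(point_world, cam_position, cam_rotation):
    """Transform a world-space point into the camera's frame.

    cam_rotation is the camera-to-world rotation, so its transpose
    maps world-frame vectors into camera coordinates.
    """
    return cam_rotation.T @ (point_world - cam_position)

# A virtual object anchored 2 m along the world Y axis.
anchor = np.array([0.0, 2.0, 0.0])

# Hypothetical tracked pose: camera at (1, 0, 0), yawed 90 degrees.
cam_pos = np.array([1.0, 0.0, 0.0])
cam_rot = rotation_z(np.pi / 2)

p_cam = world_to_camera(anchor, cam_pos, cam_rot)
print(p_cam)  # object position as seen from the camera
```

Because the tracked pose updates every frame, re-running this transform keeps the virtual object locked to its real-world anchor as the camera moves.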
“The possibilities and applications of augmented reality in realtime devices are only just being explored,” said Michael Geissler, CEO of Mo-Sys Engineering. “We are at the cutting edge of camera tracking; VividQ is at the cutting edge of computer-generated holography, and we are excited to work together to bring some of these concepts to reality.”
Darran Milne, CEO of VividQ, added: "Our partnership with Mo-Sys is key to understanding the potential of computer-generated holography in future AR applications, developing experiences where virtual objects can blend seamlessly into the real world."