Sony Introduces New OCELLUS Camera Tracking System

Sony Electronics is launching its first camera tracking system, OCELLUS (ASR-CT1), designed to simplify augmented reality (AR) and virtual production applications in broadcast and cinema by providing marker-free camera tracking through multiple sensors. OCELLUS is camera-agnostic and can be used with both cinema and broadcast cameras.
Well suited to virtual production workflows such as In-Camera VFX and AR, OCELLUS streams camera position and orientation data while the camera is shooting. The system comprises a sensor unit, a processing box, and three lens encoders, and can be used with Sony Cinema Line cameras, system cameras, and non-Sony cameras.
Using its five image sensors and Sony's Visual SLAM (Simultaneous Localization and Mapping) technology, the system builds a reference map[1], enabling stable marker-free tracking both indoors and outdoors.
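Sony has not published the internals of its Visual SLAM tracking, but the general idea of estimating a camera pose by matching image features against a pre-built reference map can be illustrated with standard computer-vision tools. The sketch below is a generic, hypothetical illustration only, using OpenCV ORB features and PnP with RANSAC; map_points_3d, map_descriptors, and the intrinsics K are assumed inputs, and nothing here reflects the actual OCELLUS implementation.

```python
# Generic illustration of map-based, marker-free pose estimation.
# NOT Sony's algorithm: map_points_3d, map_descriptors and K are hypothetical inputs.
import numpy as np
import cv2


def estimate_camera_pose(frame_gray, map_points_3d, map_descriptors, K):
    """Match ORB features in the current frame against a pre-built 3D map
    and recover the camera rotation/position with PnP + RANSAC."""
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None  # no usable feature points in this frame

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 6:
        return None  # too few correspondences for a stable solution

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    ok, rvec, tvec, _ = cv2.solvePnPRansac(object_pts, image_pts, K, None)
    if not ok:
        return None

    R, _ = cv2.Rodrigues(rvec)       # rotation matrix (world -> camera)
    camera_position = -R.T @ tvec    # camera centre in map coordinates
    return R, camera_position
```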
When using Sony cameras, metadata about focus, iris and zoom values from the camera and lens[2] can be obtained via the camera's SDI[3] output and transmitted in real time to external devices over Ethernet[4]. If the lens does not support metadata acquisition through the camera, lens encoders can be attached to obtain this metadata. The acquired metadata can then be used for virtual production and AR.
The system also supports recording of tracking data, camera/lens metadata, timecode and file names, which can be used in the post-production workflow.
Sony continues its unwavering support for creators in the Virtual Production and AR space, with tools ranging from acquisition through to displays such as the Crystal LED VERONA and software-based solutions such as the Virtual Production Toolset.
Key Features
Compact and lightweight sensor unit with five image sensors:
- Four of the five image sensors on the sensor unit are selected for use, providing stable marker-free tracking and the high occlusion resistance critical for operation.
- If at least one image sensor in use captures valid feature points, tracking data can be extracted.
- IR LEDs on both sides of each image sensor help tracking in low-light environments.
- Visible Light Cut Unit included for stable tracking in environments with frequent lighting changes.
- Sensor unit dimensions: approx. 86 mm × 60 mm × 43 mm (W × H × D) (3.39" × 2.36" × 1.69"), weight: approx. 250 g[5].
- Easy installation and position adjustment using the NATO rail mounting parts (included).
- Connects to the processing box via a single USB Type-C® cable with a lock mechanism, and is powered by the processing box through the same cable.
Processing Box:
- Real-time transmission of tracking data and camera[3]/lens[2] metadata to CG rendering software such as Unreal Engine over Ethernet[6] in the free-d format (see the sketch after this list).
- Equipped with Genlock input, Timecode input, SDI input/output terminals, and lens encoder connection terminals.
- Supports recording tracking data and camera/lens metadata as FBX files on SDXC memory cards (UHS-II/UHS-I), synchronized with the main camera's video files.
- OLED display for checking IP address, tracking information, lens data, and more.
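To give a rough sense of how a render engine might consume the stream described above, here is a minimal sketch of a UDP listener that decodes D1-type free-d packets. It assumes the commonly documented 29-byte D1 layout and typical angle/position scaling; the port number is a placeholder, and the actual transport settings and field scaling should be taken from the OCELLUS documentation and the renderer's free-d configuration.

```python
# Minimal free-d (D1) listener sketch. The packet layout and scaling below follow
# the commonly documented free-d D1 message and are assumptions, not OCELLUS specs.
import socket

FREED_PORT = 40000           # placeholder; set to match the processing box configuration
ANGLE_SCALE = 1.0 / 32768    # degrees per count (15 fractional bits, as commonly documented)
POS_SCALE = 1.0 / 64         # millimetres per count (6 fractional bits)


def s24(b: bytes) -> int:
    """Interpret 3 bytes as a signed 24-bit big-endian integer."""
    return int.from_bytes(b, "big", signed=True)


def decode_d1(packet: bytes):
    """Decode one free-d D1 message; return None for anything else."""
    if len(packet) < 29 or packet[0] != 0xD1:
        return None
    return {
        "camera_id": packet[1],
        "pan_deg":   s24(packet[2:5]) * ANGLE_SCALE,
        "tilt_deg":  s24(packet[5:8]) * ANGLE_SCALE,
        "roll_deg":  s24(packet[8:11]) * ANGLE_SCALE,
        "x_mm":      s24(packet[11:14]) * POS_SCALE,
        "y_mm":      s24(packet[14:17]) * POS_SCALE,
        "z_mm":      s24(packet[17:20]) * POS_SCALE,
        "zoom_raw":  s24(packet[20:23]),   # raw counts; mapping to focal length is lens-specific
        "focus_raw": s24(packet[23:26]),   # raw counts; final byte (checksum) not verified here
    }


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", FREED_PORT))
    while True:
        data, _ = sock.recvfrom(64)
        pose = decode_d1(data)
        if pose:
            print(pose)
```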
Lens Encoder:
- Detects precise rotation angles and positions for lens focus, zoom, and iris.
- Transmits detected data to the processing box via LEMO 7-pin cable.
- Enables metadata acquisition for lenses and cameras not supporting lens data embedding on SDI output.
- Includes five different types of gears for various lenses.
[1] There are limitations to the size of the map.
[2] Supports Cooke /i lenses, B4 lenses, and E-mount lenses.
[3] The camera must support metadata embedding on SDI output.
[4] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.
[5] Please note that the final specifications may differ.
[6] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.