Sony Introduces New OCELLUS Camera Tracking System

Sony Electronics is launching its first camera tracking system, OCELLUS (ASR-CT1), designed to simplify augmented reality and virtual production applications in broadcast and cinema by providing marker-free camera tracking through multiple sensors. OCELLUS is camera agnostic and can be used with both cinema and broadcast cameras.

Well suited to virtual production workflows such as In-Camera VFX and AR, OCELLUS sends camera position and orientation data while the camera is shooting. The system comprises a sensor unit, a processing box, and three lens encoders, and can be used with Sony Cinema Line cameras, system cameras, and non-Sony cameras.

With its five image sensors and Sony's Visual SLAM (Simultaneous Localization and Mapping) technology, the system creates a reference map[1], enabling stable marker-free tracking both indoors and outdoors.

When using Sony cameras, metadata about focus, iris and zoom values from the camera and lens[2] can be obtained via the camera's SDI[3] output and transmitted in real time to external devices via Ethernet cable[4]. If the lens does not support metadata acquisition through the camera, lens encoders can be affixed to the camera to obtain this metadata. The acquired metadata can then be used for virtual production and AR.

The system also supports recording tracking data, camera/lens metadata, timecode and file name, which can be used in post-production workflows.

Sony continues its unwavering support for creators in the Virtual Production and AR space, with tools ranging from acquisition to display screens such as the Crystal LED VERONA, and software-based solutions such as the Virtual Production Toolset.

Key Features

Compact and lightweight sensor unit with five image sensors: 

  • Four of the five image sensors on the sensor unit are selected for use, providing stable marker-free tracking and high occlusion resistance, which is critical for operation.
  • If at least one image sensor in use captures valid feature points, tracking data can be extracted.
  • IR LEDs on both sides of each image sensor help tracking in low-light environments.
  • Visible Light Cut Unit included for stable tracking in environments with frequent lighting changes.
  • Sensor unit dimensions: approx. 86 mm × 60 mm × 43 mm (W × H × D) (3.39" × 2.36" × 1.69"), weight: approx. 250 g[5].
  • Easy installation and position adjustment using the NATO rail mounting parts (included).
  • Connection to the processing box via a single USB Type-C® cable with a lock mechanism; the sensor unit is powered by the processing box over the same cable.

Processing Box:

  • Real-time transmission of tracking data and camera[3]/lens[2] metadata to CG rendering software such as Unreal Engine via Ethernet cable[6] in the free-d format.
  • Equipped with Genlock input, Timecode input, SDI input/output terminals, and lens encoder connection terminals.
  • Supports recording tracking data and camera/lens metadata as FBX files on SDXC memory cards (UHS-II/UHS-I), synchronized with the main camera's video files.
  • OLED display for checking IP address, tracking information, lens data, and more.
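The free-d format the processing box transmits is a long-established, fixed-layout camera tracking protocol, typically carried over UDP. As a rough illustration of what a receiving application decodes, here is a minimal Python sketch of a parser for the common free-d "type D1" position/orientation message; the field layout and scalings follow the published free-d protocol, but the function and field names are our own, and this is not Sony's implementation.

```python
def _s24(b: bytes) -> int:
    """Decode a 3-byte big-endian signed (two's-complement) integer."""
    v = int.from_bytes(b, "big")
    return v - 0x1000000 if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Parse a 29-byte free-d 'type D1' camera position/orientation packet.

    Per the free-d protocol: angles are signed 24-bit values in units of
    1/32768 degree, positions are signed 24-bit values in units of 1/64 mm,
    and zoom/focus are raw counts whose meaning depends on lens calibration.
    """
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a free-d D1 packet")
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) & 0xFF != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg":  _s24(pkt[2:5])  / 32768.0,
        "tilt_deg": _s24(pkt[5:8])  / 32768.0,
        "roll_deg": _s24(pkt[8:11]) / 32768.0,
        "x_mm": _s24(pkt[11:14]) / 64.0,
        "y_mm": _s24(pkt[14:17]) / 64.0,
        "z_mm": _s24(pkt[17:20]) / 64.0,
        "zoom_raw":  int.from_bytes(pkt[20:23], "big"),
        "focus_raw": int.from_bytes(pkt[23:26], "big"),
    }
```

In practice a renderer such as Unreal Engine listens on a UDP socket, reads each 29-byte datagram, and feeds the decoded pose to its virtual camera every frame.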

Lens Encoder:

  • Detects precise rotation angles and positions for lens focus, zoom, and iris.
  • Transmits detected data to the processing box via LEMO 7-pin cable.
  • Enables metadata acquisition for lenses and cameras not supporting lens data embedding on SDI output.
  • Includes five different types of gears for various lenses.
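Raw encoder counts only become meaningful focus, zoom, or iris values after a per-lens calibration maps counts to measured lens marks. As a hypothetical illustration (not Sony's implementation, and the calibration points are invented), a minimal mapping could linearly interpolate between calibration samples:

```python
from bisect import bisect_right

def encoder_to_lens_value(raw: int, calibration: list) -> float:
    """Map a raw encoder count to a calibrated lens value.

    `calibration` is a list of (raw_count, lens_value) pairs sorted by
    raw_count, e.g. focus distances measured at known ring positions.
    Values outside the calibrated range are clamped; real systems often
    use non-linear fits between marks, so this is only a sketch.
    """
    counts = [c for c, _ in calibration]
    if raw <= counts[0]:
        return calibration[0][1]
    if raw >= counts[-1]:
        return calibration[-1][1]
    i = bisect_right(counts, raw)
    (c0, v0), (c1, v1) = calibration[i - 1], calibration[i]
    t = (raw - c0) / (c1 - c0)          # fractional position between marks
    return v0 + t * (v1 - v0)
```

For example, with focus calibrated at counts 0, 1000 and 2000 mapping to 0.5 m, 1.0 m and 3.0 m, a count of 500 would interpolate to 0.75 m.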

 


[1] There are limitations to the size of the map.
[2] Supports Cooke /i lenses, B4 lenses, and E-mount lenses.
[3] The camera must support metadata embedding on SDI output.
[4] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.
[5] Please note that the final specifications may differ.
[6] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.
