Applied Technology: Timecode Systems — It’s All About The Sync

In the last decade, there has been a significant shift in what viewers like to see on their televisions, with much of that content recorded as live, reality-style shows. Capturing such spontaneous content brings many challenges, but one of the biggest is managing the huge amounts of media created. Fortunately, with the help of timecode and synchronisation, it is possible to wrestle back some control over the content while saving money.

Shooting unscripted, fly-on-the-wall content shares many of the technical demands of live broadcast, most significantly in that it requires production teams to relinquish a large amount of control. With careful preparation, crews have a pretty good idea what the rough sequence of events is going to be, but it’s impossible to know exactly what is going to happen. There is little or sometimes no flexibility to stop, reset and repeat the action. Executing a successful shoot depends on the ability to capture every enraged outburst, revolted glance or controversial exchange as it happens. Virtually nothing can be missed.

As the appetite for reality television has grown, programme making has adapted to meet the voyeuristic demands of the audience. This has led to an increase in the use of the POV shot, often using wearable cameras, mounted everywhere and on everyone to ensure each angle is covered.

Managing all the data

It may be relatively hassle-free to film from multiple and varied camera sources nowadays, but the real issue comes with how to sync all that footage together. Although it’s possible to fix sync problems in the edit suite, hours and hours can be spent manually lining up shots to sync with each other and with any separately recorded audio. Using timecode to synchronise footage at the point of shooting offers massive benefits, especially for production teams working on tight budgets or operating with short turnaround times.

Timecode is an important form of production metadata that, if used correctly, can save productions time and money. The theory is simple. Each frame recorded is assigned a specific timecode, allowing editors to find a particular frame across multiple camera and audio sources by referencing this number. If each camera and audio device on a shoot is running timecode, all sources are synced and the data can then easily be dropped into the edit timeline and automatically aligned. Simple, right? 
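The frame-numbering idea can be sketched in a few lines of code. This is an illustrative example only, not vendor software: it assumes a fixed 25 fps, non-drop-frame rate, and the function names are invented for the sketch.

```python
FPS = 25  # assumption for this example: PAL-style 25 fps, non-drop-frame

def frames_to_timecode(frame: int, fps: int = FPS) -> str:
    """Convert an absolute frame count to an HH:MM:SS:FF label."""
    ff = frame % fps
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Inverse mapping: HH:MM:SS:FF back to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

print(frames_to_timecode(90125))  # 01:00:05:00
```

Because the mapping is exact in both directions, an editing system can match frame 90125 on one source to the same timecode label on every other synced source.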

Unfortunately, the internal clocks in cameras and recording devices run at marginally different rates, causing drift, and synchronisation is consequently lost. This is where using external timecode can help, by jamming every camera and audio source to one highly accurate master clock.
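A quick back-of-the-envelope calculation shows why this drift matters over a shooting day. The figures below are assumptions chosen for illustration (a 20 ppm clock error is a plausible order of magnitude for a consumer crystal oscillator, not a measured spec for any particular camera):

```python
FPS = 25            # assumed frame rate for the example
PPM_ERROR = 20      # assumed clock error: 20 parts per million
HOURS = 4           # length of a shooting block

seconds = HOURS * 3600
drift_seconds = seconds * PPM_ERROR / 1_000_000   # accumulated clock error
drift_frames = drift_seconds * FPS                # error expressed in frames

print(f"After {HOURS} h: {drift_seconds * 1000:.0f} ms drift, "
      f"about {drift_frames:.1f} frames")
```

Even at this modest error rate, two free-running cameras can be several frames apart by lunchtime, which is more than enough to break lip sync in the edit.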

Block diagram of timecode synchronization across multiple cameras.

Synchronize the clocks

In terms of avoiding drift and providing syncing points for postproduction, wireless timecode is a great tool, but the complex shooting scenarios of reality shows often call for even more accuracy and reliability. Teams on major productions now routinely demand genlock as well as timecode for multicamera shoots. We always recommend using timecode products with the added capability of genlock.

Most wireless timecode systems, including the Timecode Systems solution, use a master and slave relationship as the basis for synchronisation. To set up your sync network, every camera and audio source needs an external timecode unit. One unit should be assigned as the master device and set to transmit timecode (or genlock if relevant). This ensures the master device holds the clock settings that you want all of the devices in your network to use.

Then, set all of the timecode devices on the other recording units to run as slaves, using the same RF channel as the master. When a slave device is on, it transmits a signal via the set RF channel. If there is a master device within range and using the same RF channel, the master detects the slave. It then relays its clock settings to the slave. The slave sets its own clock to match the master, and this jams all sources to the same accurate clock.
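The jam process described above can be modelled in a short sketch. This is a simplified illustration of the master/slave idea, not Timecode Systems' actual protocol; the class and field names are invented for the example:

```python
class TimecodeUnit:
    """Toy model of a timecode device: an RF channel and a clock in frames."""
    def __init__(self, rf_channel: int, frames: int):
        self.rf_channel = rf_channel
        self.frames = frames  # internal clock position, counted in frames

def jam_slaves(master: TimecodeUnit, slaves: list) -> None:
    """The master relays its clock to every slave on the same RF channel."""
    for slave in slaves:
        if slave.rf_channel == master.rf_channel:
            slave.frames = master.frames  # slave sets its clock to the master's

master = TimecodeUnit(rf_channel=4, frames=90000)
cam_a = TimecodeUnit(rf_channel=4, frames=90007)  # has drifted 7 frames
cam_b = TimecodeUnit(rf_channel=9, frames=88000)  # wrong channel, never jammed

jam_slaves(master, [cam_a, cam_b])
print(cam_a.frames, cam_b.frames)  # 90000 88000
```

Note that the unit on the wrong RF channel is untouched, which mirrors the real-world requirement that every device in the sync network be set to the same channel as the master.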

As an example, here is a fairly simple multicamera set-up using Timecode Systems' equipment as the sync solution. Imagine there's a large-scale main camera and sound recordist with the presenter, and a portable single camera (PSC) with the contributors and multiple GoPros mounted around the location. In this scenario, we would connect a :pulse mini base station to the sound mixer and set it as the master unit. To complete the network, :minitrx+ units are mounted on the main camera and PSC cameras, and each GoPro is rigged with a SyncBac PRO. All devices are set to receive timecode from the master over the same RF channel. All sources are now shooting in sync.

Timecode Systems SyncBac Pro accessory for the GoPro HERO4 Silver and Black seamlessly attaches via the HeroBus 30 pin port.

If the shoot takes an unexpected turn and a camera roams out of the RF range of the master, each device will continue to run using its own internal timecode. It will then sync back to the master clock as soon as it can again receive the RF signal. Furthermore, if additional cameras arrive on location at any point in the shoot, users can simply attach a :minitrx+ to the camera (or a SyncBac PRO to a GoPro), select the chosen RF channel, and the content recorded will be instantly synced to the master clock.

At the end of the shoot, the memory cards from the main cameras, PSC cameras, sound mixer and GoPros will all contain data files stamped with the same embedded timecode. The media can be dropped into the edit timeline and automatically aligned for a swift and efficient edit.

The BLINK Hub Production Dashboard And Device Control app enables users to view, monitor & control all BLINK enabled Timecode Systems devices.

However, syncing timecode is only part of the story. It is also possible to synchronise other metadata as well. In the example above, in addition to transmitting timecode to the receiving units, the :pulse master unit will also collect all status, control commands and metadata from the slave devices (in this case, the :minitrx+ units and SyncBac PROs). This feature allows Timecode Systems devices to be monitored and controlled centrally and remotely from the BLINK Hub production dashboard and device control app on a smartphone, tablet or MacBook.

The most exciting thing, however, is that this is still only the start of where wireless workflow and synchronisation technology is heading. Look for the streamlining of media workflows to become an important talking point for broadcasters in 2017.
