Broadcast For IT - Part 2 - Video Frames
In this series of articles, we explain broadcasting for IT engineers. Television is an illusion: there are no moving pictures, and today's broadcast formats are heavily dependent on decisions engineers made in the 1930s and 1940s. In this article we look at video frames, the frequencies used, and what they mean in broadcasting.
Film recording and projection rely on a series of still pictures photographed 24 times a second; this is the lowest recording and playback speed that guarantees fluidity of motion. If we record at fewer than 24 frames per second, the motion appears jerky.
When we watch a film at the cinema, we are in effect seeing a series of still images played back at 24 frames per second, and this gives us the perception of motion. There is no real motion, but the human visual system interprets the rapid succession of still images as movement.
Remove Flicker
Film projectors achieve playback at this speed using a gate and shutter system. During projection, the shutter blocks the light long enough for the mechanism to pull the next frame of film down into the gate. When the frame is stable and still, the shutter opens and light is projected through the film. This process repeats 24 times per second until the end of the film and guarantees fluidity of motion. But to remove flicker, each frame is shown twice, giving 48 flashes of light per second.
Television works in an analogous way, but instead of creating frames of film, we create frames of video. In video terms, a frame is one complete picture, just like a single frame of film.
Interference from Lighting
In the early days of television, electronics were primitive, and engineers found that studio lamps would interfere visually with the television picture due to the frequency of the electrical mains supply. A type of flicker would be seen on the television. In Europe and the UK the mains frequency was 50Hz, and in the USA it was 60Hz. Therefore, the first picture refresh rate for television in the UK and Europe was 50Hz, and 60Hz in the USA. The electronics that generated this 50Hz or 60Hz timing were synchronized to the mains frequency of the country in which the broadcaster was operating.
The USA, Japan and other countries using NTSC systems no longer use 60Hz, but 59.94Hz. This change happened at the advent of color and is explained later. Understanding why is crucial to understanding television in NTSC countries such as the USA.
Fluidity of Motion
In the 1930s and 1940s, the UK, Europe and the USA transmitted monochrome (black and white) television. In broadcasting we are constantly trying to reduce the amount of bandwidth used in a transmission: the lower the bandwidth, the more channels we can transmit, and hence the more advertising revenue for the broadcaster. Engineers constantly balance reducing bandwidth against maintaining high picture and sound quality.
The refresh rates of 50Hz and 60Hz gave a good perception of motion fluidity and kept flicker imperceptible. However, these systems were bandwidth hungry, and engineers found they could halve the bandwidth while maintaining fluidity of motion and keeping flicker low by using interlace.
Two Fields Make One Frame
Interlace was developed in the 1930s and is still with us to this day. It works by splitting a video frame into two fields; each field uses half the bandwidth of a frame because it contains half the number of video lines. Video lines will be explained in a later article.
Each field comprises either the odd or the even lines. Field one contains lines one, three, five and so on, and field two contains lines two, four, six and so on. In UK and European broadcasts we send 50 fields per second, and as two fields make up one complete picture we are sending 25 frames per second. In USA monochrome broadcasts, 60 fields per second were sent, giving 30 frames per second.
Using this method we keep flicker low, because the screen is still refreshed 50 or 60 times per second, and we maintain fluidity of motion, as we are sending 25 (Europe) or 30 (USA, before color) complete frames per second. A simple sketch of how a frame divides into two fields follows below.
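Purely as an illustration (this is not how broadcast hardware is built), a short Python sketch of splitting a frame's lines into two fields might look like this:

```python
# A minimal sketch (illustration only, not broadcast code) of how one frame's
# lines divide into two interlaced fields. Lines are numbered from one, as in
# the text above.
def split_into_fields(frame_lines):
    """Return (field_one, field_two) for a list of video lines."""
    field_one = frame_lines[0::2]   # lines 1, 3, 5, ... (odd-numbered lines)
    field_two = frame_lines[1::2]   # lines 2, 4, 6, ... (even-numbered lines)
    return field_one, field_two

frame = [f"line {n}" for n in range(1, 11)]   # a toy ten-line frame
field_one, field_two = split_into_fields(frame)
print(field_one)   # ['line 1', 'line 3', 'line 5', 'line 7', 'line 9']
print(field_two)   # ['line 2', 'line 4', 'line 6', 'line 8', 'line 10']
# Each field carries half the lines, so 50 fields per second delivers
# 25 complete frames per second in half the instantaneous bandwidth.
```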
Video is split into color and luma to simultaneously broadcast color and black-and-white television.
In Europe and the UK, rates of 25 frames per second and 50 fields per second are still used, even for color. However, in the USA and other countries using the NTSC system, the field rate changed to 59.94Hz and the frame rate to 29.97Hz when color was introduced in 1953.
Luma and Chroma
As technology evolved, better standards for television emerged. Color broadcasting began in the USA in the 1950s, and from the mid-1960s countries throughout the rest of the world started to transmit in color. However, many viewers had invested in black and white sets, and as they were expensive to buy they were reluctant to replace them. Therefore, broadcasters had to transmit pictures that would work on both the existing black and white televisions and the new color sets.
In a later article we will cover how color is represented in television, but for the time being assume color television signals consist of two components – luminance (luma) and chrominance (chroma). The luma part of the video signal carries the black and white information and maintained backwards compatibility with existing television sets. The chroma was added to the luma signal using quadrature amplitude modulation of a subcarrier, originally at 3.583125MHz for the USA and at 4.43361875MHz for European video.
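As a rough sketch only, the quadrature amplitude modulation described above can be pictured as the luma plus two chroma components riding on sine and cosine versions of the subcarrier. The component names u and v and the sample values below are illustrative assumptions, not taken from this article or any specification:

```python
import math

F_SC = 4.43361875e6   # the European color subcarrier frequency quoted above, in Hz

def composite_sample(luma, u, v, t):
    """One sample of a composite signal at time t (in seconds).

    u and v stand for the two color-difference (chroma) components that are
    quadrature modulated onto the subcarrier; the values used below are
    arbitrary illustrative numbers.
    """
    return luma + u * math.sin(2 * math.pi * F_SC * t) \
                + v * math.cos(2 * math.pi * F_SC * t)

# A black and white receiver effectively ignores the high-frequency subcarrier
# and displays only the luma term; a color receiver demodulates u and v from it.
print(composite_sample(luma=0.5, u=0.1, v=-0.05, t=1e-6))
```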
USA Color Subcarrier Problem
Black and white televisions would naturally filter out the high-frequency subcarrier, so the color information would not be visible on existing sets, but it could be decoded by new color sets to recover the chroma and display a color picture.
In the USA a problem arose: development engineers found that the color subcarrier frequency and the audio carrier frequency interfered with each other, causing a visible beat (strobing) pattern on the television screen. To overcome this, engineers moved the color subcarrier frequency (CSC) down a little to 3.579545MHz. As the CSC is related to the line rate, and hence the field rate, the field rate also decreased slightly, giving 60 * 1000/1001 = 59.94005994...Hz (the decimal expansion recurs indefinitely, but the value is generally referred to as 59.94Hz); the arithmetic is sketched below. To this day, 99% of all broadcasts based on the USA 525-line color system are still transmitted at 59.94 fields per second.
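Purely as a numerical check of the figures above (the 525 lines per frame and the standard NTSC rule that the subcarrier sits at 227.5 times the line rate are assumed here, not stated in this article), the arithmetic works out as follows:

```python
from fractions import Fraction

# A numerical check of the figures quoted above, using exact fractions.
# The 1000/1001 adjustment applied to the original 60Hz field rate:
field_rate = Fraction(60) * Fraction(1000, 1001)
print(float(field_rate))            # 59.94005994... -> quoted as "59.94Hz"
print(float(field_rate) / 2)        # 29.97002997... -> the 29.97Hz frame rate

# With 525 lines per frame (262.5 per field), and the color subcarrier at
# 227.5 times the line rate, the quoted subcarrier frequency falls out directly:
line_rate = field_rate * Fraction(525, 2)     # lines per second
subcarrier = line_rate * Fraction(455, 2)     # Hz
print(float(line_rate))             # 15734.265734...
print(float(subcarrier) / 1e6)      # 3.579545... MHz
```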
The same problem did not occur in the 625-line European systems, so there was no need to change the rate; to this day it is still 50 fields per second, equivalent to 25 frames per second.
In the next article we look at video lines and how they relate to fields and frames.