Broadcast For IT - Part 1 - Introduction
In this series of articles, we will explain broadcasting for IT engineers. Television is an illusion: there are no moving pictures, and today's broadcast formats are heavily dependent on decisions engineers made in the 1930's and 1940's. Understanding broadcast video and audio is a lesson in history and backwards compatibility, and to a certain extent the two are inextricably entwined.
Televisions started to emerge in the 1930’s when electronics technology was in its infancy. The thermionic valve, invented in 1904 by John Ambrose Fleming, played a major part in all aspects of telecommunications at that time. However, valves were bulky, required power rails of several hundred volts, and product development times were painfully slow. Consequently, any television set built with them was heavy, big and expensive.
Mechanical Television
Although much research had been conducted into semiconductors, it wasn't until the 1940's that solid-state diodes became available; the first commercially available transistors followed in the 1950's, and integrated circuits started to appear in the 1960's.
Innovation is constantly driven by technological advances, and broadcasting is a perfect example of this. In the late nineteenth century, television started to emerge as a mechanical format with German inventor Paul Nipkow's disk - a rotating disk with a series of holes that built up a raster image. In the 1920's, Scottish inventor John Logie Baird used the Nipkow disk in his prototype mechanical television system.
Cathode Ray Tubes
At the same time, electronic systems were being developed using the cathode ray tube work of English physicist J J Thomson and German physicist Karl Ferdinand Braun. These developments led to the Cathode Ray Tube (CRT), the display at the heart of television sets up until the 1990's. In 1926, Hungarian engineer Kalman Tihanyi designed a system to turn light into electronic signals with his charge storage camera tube. Variations of these tubes were still in operation up until the 1990's.
In 1936, the UK government set up a series of trials between the Marconi-EMI electronic television system and Baird's mechanical system by alternating weekly broadcasts from London's Alexandra Palace. Marconi-EMI won with their 405-line monochrome system, and the BBC became established as a world leader in television broadcasting.
Similar developments were happening around the world, with Vladimir Zworykin demonstrating transmission of electronic images with his tube at Westinghouse Electric in 1923, and Philo Farnsworth's image dissector camera tube transmitting its first image in San Francisco in 1927. In 1939, US broadcaster NBC established an experimental station, W2XBS in New York, and transmitted for four hours a day. Various formats were in operation throughout the USA using 343 to 441-line systems, but in 1941 the 525-line monochrome standard was born.
Understanding the Human Visual and Hearing Systems
Key to understanding television is to understand how the human visual and hearing systems work. Audio is often considered the simpler of the two but is the more difficult to get right. The bandwidth of the human auditory system ranges from 20Hz to 20kHz, but this deteriorates as we get older. Intelligible human speech ranges from approximately 300Hz to 3kHz, and telephony takes advantage of this by limiting telephone speech circuits to a bandwidth of around 3kHz so that more telephone channels can be carried in a fixed-bandwidth circuit.
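To make that trade-off concrete, here is a minimal Python sketch of the channel-packing argument. The 4kHz channel slot (3kHz speech plus guard band) follows classic frequency-division multiplexing practice, but the trunk bandwidth is purely illustrative.

```python
# A minimal sketch of the trade-off: the narrower each speech channel,
# the more channels fit in a fixed-bandwidth trunk. The 4 kHz slot
# (3 kHz speech plus guard band) follows classic FDM telephony practice;
# the trunk width is an illustrative assumption.

TRUNK_BANDWIDTH_HZ = 48_000   # hypothetical trunk (a classic FDM "group")
CHANNEL_SLOT_HZ = 4_000       # ~300 Hz-3 kHz speech + guard band

channels = TRUNK_BANDWIDTH_HZ // CHANNEL_SLOT_HZ
print(f"{channels} speech channels fit in a {TRUNK_BANDWIDTH_HZ / 1000:.0f} kHz trunk")
# -> 12 speech channels fit in a 48 kHz trunk
```

Carrying full 20Hz to 20kHz audio on the same trunk would fit only two channels, which is why telephony sacrifices fidelity for capacity.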
Our ear-brain system is very sensitive to interruptions in sound, probably as part of our ancestral fight-or-flight response. If our caveman ancestors heard the crack of a twig in the dead of night, they would wake instantly as it could indicate a fatal attack, and this system still exists in our brains. We find it acceptable, although annoying, to watch a programme with good sound but poor vision. However, with poor sound and good vision we find it almost impossible to watch and even become stressed, a direct ancestral throwback to our fight-or-flight response.
Fluidity of Motion
At the advent of television, the film industry had already established that 24 frames per second was the slowest you could play a film and still represent fluid motion. If you look at a reel of cinema film, it is a series of still images. These are played back in the projector at 24 frames per second to give fluidity of motion.
Legacy processes in the human visual system have left us with the ability to detect fast movements in low light; this was useful for our ancestors if a bear crept up behind them in a cave. In film, this manifests itself as visible flicker on a 24 frames per second projection. To overcome this, engineers devised a method of flashing the projector light twice on each frame of film, doubling the flash rate to 48 flashes per second and reducing our perception of flicker while maintaining our perception of fluid motion.
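A small Python sketch of this double-shuttering trick: the film still advances at 24 frames per second, but each frame is flashed twice, so the eye sees 48 flashes per second. The timing arithmetic below is a simplified model, not a description of a real projector mechanism.

```python
# A minimal sketch of cinema double-shuttering: the projector advances
# film at 24 frames/s, but a two-bladed shutter flashes each frame
# twice, so the light flickers at 48 Hz - above the flicker threshold -
# while motion is still only sampled 24 times per second.

FRAME_RATE = 24        # distinct images per second
FLASHES_PER_FRAME = 2  # two-bladed shutter

flash_rate = FRAME_RATE * FLASHES_PER_FRAME
print(f"Motion sampled at {FRAME_RATE} Hz, light flashes at {flash_rate} Hz")

# First few flash times (seconds): each frame is flashed twice
# before the film advances to the next frame.
for frame in range(2):
    for flash in range(FLASHES_PER_FRAME):
        t = frame / FRAME_RATE + flash / flash_rate
        print(f"frame {frame}, flash {flash}, t = {t:.4f}s")
```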
Film to SD Cards
And that, more or less, is how television works. Instead of a film camera and projector, we have a television camera, electrical signals, video recorders and televisions. Times have moved on, and we now store television pictures on SD cards or computer servers, but the principle is still the same.
The main difference is that video is a scanning system. In pre-1980's analogue broadcasting, an electron beam scanned the camera tube under the control of electromagnetic coils, producing an electrical current proportional to the amount of light falling on the tube. At the television end, electromagnetic coils on the back of the television's CRT scanned another electron beam across the screen, and the phosphor it struck emitted light proportional to the voltage level of the signal.
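The essence of scanning is that a two-dimensional picture is read out line by line as a one-dimensional signal, then rebuilt the same way at the display. Here is a minimal Python sketch of that idea; the toy resolution is an illustrative assumption, not a real format.

```python
import numpy as np

# A minimal sketch of raster scanning: a 2D image is read out line by
# line into a 1D "electrical signal", then rebuilt at the display by
# scanning in exact synchronisation. The resolution is illustrative.

LINES, PIXELS_PER_LINE = 4, 6                    # toy raster, not a real format
image = np.random.rand(LINES, PIXELS_PER_LINE)   # brightness values 0.0-1.0

# Camera end: the beam sweeps each line in turn, producing a signal
# proportional to brightness - here, simply the rows laid end to end.
signal = image.flatten()

# Television end: synchronised scanning places each sample back on
# the correct line of the screen.
reconstructed = signal.reshape(LINES, PIXELS_PER_LINE)

assert np.array_equal(image, reconstructed)      # perfect, noiseless channel
```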
Remove Flicker
If the scanning happens fast enough, that is, faster than 24 frames per second, then we get fluidity of motion. If we can double the scan rate, then we get rid of the flicker too.
The scan at the camera and the scan at the television must be in absolute synchronisation, and to achieve this we use synchronising pulses to tell the electromagnetic scan coils where to start in the horizontal and vertical domains.
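The sketch below shows, in much-simplified form, what one line of an analogue video signal looks like: a sync pulse below black level marks where the line starts, followed by blanking and then the active picture. The voltage levels loosely follow analogue composite practice, but the sample counts are illustrative assumptions, not a real standard.

```python
import numpy as np

# A much-simplified sketch of one analogue video line: a sync pulse
# below black level tells the receiver's scan coils where the line
# starts, followed by blanking (the "back porch") and the picture.
# Levels loosely follow analogue composite practice (sync at -0.3 V,
# active video 0.0-0.7 V); sample counts are illustrative only.

SYNC_LEVEL, BLANK_LEVEL = -0.3, 0.0   # volts

def video_line(pixels: np.ndarray) -> np.ndarray:
    """One scan line: sync pulse, blanking, then active video."""
    sync = np.full(5, SYNC_LEVEL)     # horizontal sync pulse
    porch = np.full(3, BLANK_LEVEL)   # back porch / blanking
    active = 0.7 * pixels             # brightness mapped to 0-0.7 V
    return np.concatenate([sync, porch, active])

line = video_line(np.linspace(0.0, 1.0, 12))   # a simple brightness ramp
print(np.round(line, 2))
```

The receiver only has to detect the level that drops below black to know a new line is starting, which is why camera and television stay locked together.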
Formats Were Established in 1936
Although modern cameras use Charge Coupled Devices (CCDs) to turn light into electrical signals, and we use Liquid Crystal Displays (LCDs) in the television to turn the electrical signals back into pictures, the basic operation hasn't changed since the 1936 Marconi-EMI trials at the BBC in London and the 1941 NBC broadcasts in New York.
In the next article we look at video frames, one of the fundamental components of a television picture.