Timing: Part 4 - Analog Television
One of the earliest and most widespread applications of synchronizing was in television.
Looking through the history of television inventions, the idea of breaking the image down into elements, what we now call pixels, was arrived at quite early on. In one system, each element of the picture at the camera had its own photo-cell, and each photo-cell had its own wire leading to a light emitter at the display. Whilst this would have worked, the number of wires, one per element, was clearly impracticable.
The solution was to use scanning, in which the state of the elements was sent down a single wire one at a time in rapid succession. The system would only work if the camera and the display were in complete agreement as to which element was being sent at a given time.
In very early television systems the scanning would be mechanical, involving rotating disks carrying lenses or fast-spinning prisms. The development of the cathode ray tube (CRT) at both camera and display put paid to the mechanical era of television. The electron beam of a CRT could be deflected much faster than any mechanism could operate.
Early CRTs had to be cylindrical to withstand atmospheric pressure on the outside with a vacuum on the inside. A rectangular landscape format picture with a moderate aspect ratio was chosen to make the best use of a circular tube face.
Scanning a rectangular picture requires two sawtooth waveforms, one much faster than the other. The slow one runs at the frame rate, the rate at which an entire picture is scanned. With a rectangular landscape picture, it was easier to have the high-speed scan going from top to bottom, as that allowed the lower scanning amplitude. Experimental TV systems therefore scanned picture columns at high speed and scanned across the screen at low speed. The high-speed scan ran at an integer multiple of the frequency of the low-speed scan.
The camera and the display would both run from a sync pulse generator that sent frame-rate pulses to initiate flyback of the slow sawtooth and column-rate pulses to initiate flyback of the fast sawtooth.
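The relationship is easy to visualize. The sketch below is purely illustrative, with made-up numbers rather than any broadcast standard: it generates the two deflection ramps as phase-wrapped sawtooths, and because their frequency ratio is an integer, every frame scans the same set of columns.

```python
# Illustrative sketch of the two sawtooth deflection waveforms used for
# scanning. The fast scan runs at an integer multiple of the slow scan
# so that the raster repeats exactly from frame to frame.
import numpy as np

frame_rate = 50.0          # slow (frame) scan frequency, Hz
columns_per_frame = 100    # hypothetical integer multiple, not a real standard

t = np.arange(0, 0.04, 1e-6)                        # two frames at 1 us steps
slow = (t * frame_rate) % 1.0                       # 0..1 ramp, one flyback per frame
fast = (t * frame_rate * columns_per_frame) % 1.0   # 0..1 ramp, one per column

# 'slow' and 'fast' drive the two deflection axes; the integer ratio
# keeps camera and display in step once both receive the same pulses.
```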
The picture rate was chosen to avoid flicker, and to avoid beating effects with lighting and magnetic fields it was invariably made the same as the local power frequency: 50Hz or 60Hz, according to the country.
The number of columns determined the horizontal resolution, and the vertical resolution was made to match by providing enough bandwidth that detail of the same order as the column width could be reproduced. If it were desired to increase the resolution, more columns would be needed, with more detail in each. This meant that the bandwidth needed would increase as the square of the resolution, something that remains true today.
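As a quick check of that square law, with purely hypothetical numbers: doubling the resolution doubles the number of scans and doubles the detail within each, so the bandwidth quadruples.

```python
# Hypothetical square-law check: bandwidth is proportional to resolution squared.
def bandwidth(resolution, k=1.0):
    # k is an arbitrary proportionality constant for this sketch
    return k * resolution ** 2

print(bandwidth(400) / bandwidth(200))   # -> 4.0: double the resolution, four times the bandwidth
```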
In order to contain the bandwidth, a form of compression was devised that allowed the frame rate to be halved without affecting the flicker too much. This was interlace, in which the perceived flicker was that of the field rate, but each field contained only half of the picture and two complete fields were needed per picture. The frame rate would then be 25 or 30Hz.
Fig.1 - The voltage space or gamut is divided into two regions. The upper region carries picture information and below that is the synchronizing region. As the synchronizing region is below black, no sync signals can ever be seen on the screen.
On still pictures, the system worked well, but with column scanning, any horizontal motion was reproduced badly as the two fields no longer interlaced properly. The solution was to turn the scanning system through 90 degrees so that horizontal lines were scanned and the system only failed on vertical motion, which was much less common.
2:1 interlaced scanning having a frame rate of half the local power frequency was first broadcast just before WWII in the UK, and then suspended for the duration of the war. It was adopted internationally as the definitive way of broadcasting analog television and even found its way into the first generation of digital TV broadcasts.
2:1 interlaced scanning requires there to be an odd number of lines in the frame. The first field begins on a whole line and ends on a half line. The second field begins on a half line so the scanned lines fit between the lines of the first field. The second field ends on a complete line so that the two-field sequence can begin again.
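The arithmetic behind those half lines is simple. A minimal sketch, using the 525-line system as the example (the 625-line case works the same way):

```python
# 2:1 interlace line counting for the 525-line system. An odd number of
# lines per frame means each field is exactly half a frame, including
# the half line that makes the second field's lines fall between the
# first field's lines.
lines_per_frame = 525
lines_per_field = lines_per_frame / 2        # 262.5 lines per field

field_rate = 60.0                            # matches the local power frequency
frame_rate = field_rate / 2                  # 30 Hz: two fields per frame
line_rate = lines_per_frame * frame_rate     # 15750 Hz for monochrome 525/60

print(lines_per_field, frame_rate, line_rate)
```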
It is a characteristic of analog broadcasting that there is effectively one signal that changes with time, and this one signal had to carry the picture information as well as the synchronizing. The system universally adopted was to break up the voltage range or gamut of the signal into two spaces. The upper space, corresponding to about two thirds of the gamut, represents the brightness information. The lower space, the remaining third, is used for synchronizing. Any signals in that range are considered to be blacker than black as far as the picture information is concerned, so would never be seen on the screen.
The one volt peak-to-peak signal is universal in analog television. American practice was to divide it between vision and sync in the ratio of 10:4, whereas European practice divided it 7:3. As Fig.1 shows, in a bandwidth-limited system the edge of a sync pulse must have a finite slope, and the time it represents will depend on the threshold voltage of any comparator. In television the comparator voltage was defined as one half of the sync pulse amplitude.
Fig.2 - The vertical pulse lasts much longer and contains H-pulses to keep the line scan locked. There are two types of vertical interval in interlaced systems that differ in the relationship of H and V.
A precision sync separator would sample the blanking voltage and the sync tip, and set the comparator threshold half way between them using a potential divider.
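A minimal sketch of that slicing scheme, with voltages following the European 7:3 split of a 1V peak-to-peak signal (the 10:4 split works identically):

```python
# Precision sync slicing: sample the blanking level and the sync tip,
# then set the comparator threshold half way between them, as the
# potential divider does in hardware.
blanking = 0.0        # blanking/black level, volts
sync_tip = -0.3       # sync tip, 0.3 V below blanking in 7:3 practice

threshold = (blanking + sync_tip) / 2.0    # midpoint of the two sampled levels

def is_sync(v):
    # Comparator: anything below half the sync amplitude counts as sync
    return v < threshold

print(threshold)      # -0.15 V, i.e. one half of the sync pulse amplitude
```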
The horizontal synchronizing was obtained using a pulse sent every line. The scan coils of CRTs had considerable inductance and it took a finite time to retrace the beam. Retrace (and horizontal blanking) would begin at the leading edge of H-sync and by the end of the back porch the beam would be starting a new forward sweep and blanking would give way to a new active line.
The inertia of the vertical flyback was much greater and vertical sync pulses needed to be much longer, lasting for several line periods. In order to keep the horizontal scanning synchronized during vertical retrace, the H-sync pulses appeared inverted during the V-sync pulses. As Fig.2 shows, the sync waveform returns to blanking level in advance so that the falling edge of sync can take place at the usual time.
On account of interlace there were two types of vertical interval as shown, having two relationships between H and V. The final additional complexity was the use of equalizing pulses in the vertical interval to reduce the effect of the V-pulse on the DC component of the signal.
The monochrome television world used 525 lines in the USA, whereas in the UK the 405-line system was supplanted by a 625-line system. One oddball was a French system that used triple interlace with 819 lines.
The next demand for synchronizing in television was the development of color. The constraints on any color television system were severe and included compatibility with existing black and white TVs as well as the need to stay within the existing TV channel bandwidth structure.
The solution eventually adopted was to add a subcarrier to the vision signal that a monochrome TV would display as if it were part of the vision signal whereas a color TV could demodulate it. In order to reduce visibility of the subcarrier the frequency was carefully chosen so that a whole number of cycles fitted into two TV lines. In NTSC there were 227.5 cycles per line, corresponding to a frequency of 3.58MHz.
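The arithmetic is worth a moment. A short sketch using the published NTSC figures:

```python
# NTSC subcarrier arithmetic: 227.5 cycles per line is an odd multiple
# of half a cycle, so the subcarrier phase inverts from one line to the
# next, which is what makes it nearly invisible on a monochrome display.
line_rate = 4.5e6 / 286            # NTSC color line rate, ~15734.27 Hz
cycles_per_line = 227.5

subcarrier = cycles_per_line * line_rate
print(subcarrier)                  # ~3579545 Hz, the familiar 3.58 MHz
```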
Fig.3 - Subcarrier frequency is carefully chosen in relation to the scanning frequencies so that the subcarrier inverts on adjacent lines and between adjacent frames. This makes it virtually invisible on a monochrome TV set.
As Fig.3 shows, the result is that where a positive half cycle of subcarrier makes the screen too bright at some horizontal position on one line, on the next line and in the next frame the same horizontal position becomes too dark and the effects largely cancel out. Viewers tended to sit far enough away that the individual lines blended to form a picture. From that distance the subcarrier could hardly be seen on a monochrome TV.
The RGB signal from a color camera was matrixed to produce Y, R-Y and B-Y. Psycho-optic research had shown that the bandwidth of the color difference signals could be reduced dramatically because the perceived sharpness of the picture was carried in the luma. A further advantage of color difference working was that in the absence of R-Y and B-Y the luma signal alone produced a black and white picture. A black and white TV still worked with a color signal because it failed to decode the subcarrier and just displayed the luma.
The subcarrier was quadrature modulated with the two color difference signals. In order to demodulate such a signal, there has to be a reference signal at the receiver of known and constant phase. Creating this signal at the receiver was the function of the burst, which was a short sample of unmodulated subcarrier added to the back porch after the sync pulse. A phase-locked loop in the TV set could extend the burst to provide a continuous subcarrier reference.
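A minimal sketch of quadrature modulation and synchronous detection, assuming the burst-locked reference phase is already available, which is precisely the job the phase-locked loop performs. The signal values here are hypothetical.

```python
# Quadrature modulation of two signals onto one subcarrier, then
# recovery by synchronous detection against the two reference phases.
import numpy as np

fsc = 3.579545e6                     # NTSC subcarrier, Hz
fs = 4 * fsc                         # illustrative sampling rate, 4 samples/cycle
n = 284                              # exactly 71 subcarrier cycles
t = np.arange(n) / fs

# Two hypothetical color difference values carried on the quadrature pair
a_in, b_in = 0.3, -0.1
chroma = a_in * np.cos(2 * np.pi * fsc * t) + b_in * np.sin(2 * np.pi * fsc * t)

# Multiply by each reference phase and average; the average stands in
# for the low-pass filter of a real decoder.
a_out = 2 * np.mean(chroma * np.cos(2 * np.pi * fsc * t))
b_out = 2 * np.mean(chroma * np.sin(2 * np.pi * fsc * t))
print(round(a_out, 3), round(b_out, 3))   # -> 0.3 -0.1
```

Because the two reference phases are orthogonal, each detector rejects the other component entirely, which is why a stable reference phase is essential: any phase error leaks one color difference signal into the other.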