Color and Colorimetry – Part 10
Since its adoption for NTSC, essentially every subsequent electronic distribution means for color images has relied on color differences, making it a topic of some importance.
The RGB formats are based on three signals, each of which controls the output of one primary color. It is a characteristic of the HVS that the amount of detail in a picture is assessed primarily from changes in brightness. The acuity of the eye when it comes to discerning changes of color across the screen is nowhere near as good and a suitable perceptual compression system can take advantage of that.
Color cameras, whether based on three sensors or using a Bayer pattern, output video in the RGB format, which will usually be gamma corrected to R', G' and B'. Whilst that format gives the best results for production processes such as chroma key and color correction, it is not necessary or desirable for final delivery of video to the viewer.
Fig.1 shows the block diagram of a color difference matrix, which takes in three signals and outputs three signals. R', G' and B' are input to a weighted addition to create luma, Y', and this is then subtracted from R' and B' to produce R' - Y' and B' - Y'.
Only luma needs to be sent with full bandwidth, as it determines the perceived sharpness of the picture, whereas in the R', G' and B' format all three of those signals must have full bandwidth.
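As a rough sketch of the matrix in Fig.1, assuming the familiar 601 luma weights (other standards use different coefficients), the process looks like this:

def rgb_to_color_difference(r, g, b):
    # Weighted sum produces luma Y', which carries the detail and keeps full bandwidth
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Subtracting Y' from R' and B' produces the two color difference signals
    return y, r - y, b - y

rgb_to_color_difference(0.5, 0.5, 0.5)   # mid gray: both color differences are zero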
The luma signal is substantially the same signal that would have been obtained using a panchromatic black and white camera, i.e. one with the same spectral response as human vision. Clearly such a signal alone could drive a black and white display, and equally clearly it has to be sent with full bandwidth, which determines the perceived sharpness. This also means that backward compatibility with black and white displays is readily obtained; something that was important in the early days of color television.
Such a luma signal could also produce a sharp black and white picture on a color display if it were fed to all three primaries. This also meant that legacy black-and-white material could be displayed on a color set. By definition the color display would remain at its white point. To obtain a color picture, the display must be moved away from the white point, without changing Y'. This means that the proportions of R', G' and B' must change without the weighted sum changing.
The key point is that shifting the display from its white point does not have to be done with high resolution. The R' - Y' and B' - Y' signals can have significantly less resolution than Y' and no-one can see it. The theory is, however, only correct for linear light, and the use of gamma correction takes the system some way away from the ideal. Nevertheless the use of color differences is still worthwhile.
Only two color differences are needed, because in decoding, once R' and B' are known, G' can be obtained by reversing the luma summation. In principle any pair of color differences could be used. In practice G' - Y' is not used because G' and Y' are very similar in typical pictures, causing any G' - Y' signal to be small and prone to noise, hence the selection of R' - Y' and B' - Y'.
To give an example, assume for the moment that all signals are sent with full bandwidth and that the screen is showing mid gray, meaning that both color difference signals have a value of zero. Suppose that we now want to make the screen greener whilst Y' remains constant. R' and B' both have to get smaller as G' increases.
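In terms of the 601 weights used above, holding Y' constant means any changes must satisfy 0.299ΔR' + 0.587ΔG' + 0.114ΔB' = 0, so a positive change in G' has to be balanced by negative changes in R' and B'.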
At the display, R' and B' are obtained by adding Y' to the color difference signals. As Y' is the weighted sum of R', G' and B', we can obtain G' by subtracting the weighted contributions of R' and B' from Y', so we get back to our primary signals to drive the display. With full bandwidth the system is fully reversible. Whatever happens to R', G' and B', their weighted sum must equal Y', and Y' is sent with full bandwidth so there can be no loss of detail.
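A sketch of that inverse matrix, again assuming the 601 weights, might look like this:

def color_difference_to_rgb(y, r_minus_y, b_minus_y):
    # Adding Y' back to each color difference recovers R' and B'
    r = y + r_minus_y
    b = y + b_minus_y
    # Reversing the weighted sum Y' = 0.299R' + 0.587G' + 0.114B' recovers G'
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b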
If the color difference signals are reduced in bandwidth, that doesn't change the detail carried in Y'. What happens is that the balance between R', G' and B' changes with less resolution, which the eye can't discern.
Fig.1a) The color difference matrix performs a weighted sum on the incoming components to produce luma, Y', which carries the detail in the picture and is preserved with full bandwidth. The color differences, R' - Y' and B' - Y' can be sent with reduced bandwidth.
Conversion to color difference somewhat changes the signal space. In R'G'B', the luma axis passes diagonally across the cube from black to white. In color difference working, the luma axis is considered to be vertical, in the center of the color difference cube and orthogonal to the color difference plane that radiates from it. Brightness changes slide the entire color difference plane up and down the luma axis with full bandwidth.
The color difference signals must be bipolar, i.e. they can be positive or negative. In the analog domain the voltages go above and below zero. In 601, offset binary is used, so in an 8-bit system 128 decimal represents zero color difference. Numbers below 128 represent negative values.
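As an illustration of offset binary (the scale factor here is arbitrary and ignores the headroom that 601 actually reserves), the mapping might look like this:

def to_offset_binary(cd, scale=127):
    # 128 represents zero color difference; codes below 128 represent negative values
    code = int(round(128 + cd * scale))
    return max(0, min(255, code))   # keep within the 8-bit range

to_offset_binary(0.0)     # 128: zero color difference
to_offset_binary(-0.25)   # a code below 128: a negative color difference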
In color difference space, hue is a function of the direction from the luma axis and the saturation is a function of the distance from the luma axis. The position in the color difference plane may be sent with reduced bandwidth.
Making the gray scale vertical has the effect of standing the R'G'B' cube on its black corner with the white corner vertically above. Fig.2 shows that the weighted sum used to create Y' puts the Red, Green and Blue points at different places on the Y' axis, such that the triangular color space joining the primaries is not parallel to the color difference plane, but is inclined. The R'G'B' cube is distorted so the sides become rhombic.
Fig.1 also included the full expressions for R' - Y' and B' - Y', which can be used to locate points in the color difference cube in Fig.2. For example saturated green causes R' and B' to be absent. Y' becomes 0.587 and both R' - Y' and B' - Y' become -0.587.
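Writing those expressions out with the 601 weights assumed here:

Y' = 0.299R' + 0.587G' + 0.114B'
R' - Y' = 0.701R' - 0.587G' - 0.114B'
B' - Y' = -0.299R' - 0.587G' + 0.886B'

Substituting R' = B' = 0 and G' = 1 gives the values above.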
Fig.2 - In color difference space the Y' axis is vertical and the R', G', B' cube is stood on its black corner. The shadow of the cube is the hexagonal vectorscope display.
When viewed from above, looking along the Y' axis, the six remaining corners of the distorted cube form the familiar hexagon that a saturated color bar signal will draw on a vectorscope. Note that the green vector is at 45 degrees between the color difference axes because both color differences have the same value for primary green.
It will be seen from Fig.2 that the R'G'B' cube cannot fill the color difference cube. The R'G'B' cube occupies less than a quarter of the volume, so there is some redundancy. There are two consequences. The first is that there are many combinations outside the R'G'B' cube, which are therefore illegal. As black or white is approached, color difference space gets smaller, so saturated colors are only possible at mid levels of luminance.
The second consequence is that the color difference signals need a higher signal to noise ratio in the analog domain, or more bits in the digital domain, because of that redundancy. For example, with three 8-bit signals there are 2^24 combinations, but in color difference only about a quarter of them are legal, roughly 2^22. A factor of four in the number of usable codes amounts to two bits, so there will be a loss of precision unless the color difference signals have a greater wordlength.
In color difference space there are two equivalent ways of locating a given color. Starting from the luma axis in the center of color difference space, a given color can be located using polar co-ordinates travelling at a given angle for a given radius, or by Cartesian co-ordinates, travelling in two dimensions. The Cartesian color differences are bipolar so that in the case of a black and white picture, both have a value of zero.
The polar co-ordinate approach would be used to control the phase and amplitude of a chroma signal in Y/C video and in composite formats such as NTSC and PAL. The Cartesian approach would be used in Betacam, MAC, CCIR (now ITU-R) 601, JPEG and MPEG.
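As a simple illustration of the equivalence, a conversion from Cartesian color differences to a polar hue and saturation might look like this (the names and scaling are illustrative, not those of any broadcast standard):

import math

def to_polar(r_minus_y, b_minus_y):
    saturation = math.hypot(r_minus_y, b_minus_y)          # distance from the luma axis
    hue = math.degrees(math.atan2(r_minus_y, b_minus_y))   # direction from the luma axis
    return hue, saturation

to_polar(-0.587, -0.587)   # saturated green lies midway between the two axes, 45 degrees from each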
In analog video the bandwidth reduction of the color difference signals was done with a low-pass filter that only filtered in the horizontal axis. Vertical filtering was not practicable. In the digital domain vertical filtering can also be used, although the use of interlace in legacy formats made that difficult. All filtering of video signals needs to be phase-linear, otherwise different frequency components are shifted across the screen.
In the digital domain the bandwidth of color difference signals can be limited using a finite impulse response (FIR) filter, which has a symmetrical impulse response to assure phase linearity. The sampling rate of the color difference signals can then be reduced, saving data. Without filtering, simply dropping samples would cause aliasing.
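A minimal sketch of that process for one line of a color difference signal, assuming a short symmetrical kernel purely for illustration (real systems use longer, carefully designed filters):

import numpy as np

taps = np.array([0.25, 0.5, 0.25])   # symmetrical impulse response, hence phase-linear

def subsample_chroma(row):
    filtered = np.convolve(row, taps, mode='same')   # band-limit the color difference line
    return filtered[::2]                             # then drop every other sample

cr_row = np.random.rand(16) - 0.5     # a bipolar color difference line
half_rate = subsample_chroma(cr_row)  # half the samples remain, with aliasing suppressed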