Data Recording and Transmission: Part 8 - RF
In part 8 of the series “Data transmission and storage”, consultant John Watkinson looks at some of the intricacies of RF transmission.
Unlike the controlled environment of a cable or optical fiber, broadcasting represents a challenge because there is no control of the environment and the transmitted signal is subject to all kinds of abuse on its way to a receiver.
Error correction helps, but if the basic channel is very poor, error-free reception will require the bit rate to be reduced. As will be discussed in a future piece, error correction is at its best when turning a reasonable channel into a good one. It is therefore important to make the basic channel fairly good.
Out of all of the sources of error in a terrestrial broadcast signal, one of the worst is multipath reception. Multipath occurs when the receiver picks up, in addition to the direct signal, one or more extra signals that have bounced off a reflecting object and thereby travelled further. Broadcasts from satellites are clearly not affected, as there are no significant reflecting objects up there; satellite broadcasts therefore need different modulation schemes.
Given the finite propagation speed of radio waves, the extra distance travelled by the reflected signals causes a time shift with respect to the direct signal. Clearly the higher the frequency of a transmission, the shorter the time of each cycle and the greater the chance of a reflection moving into another cycle and interfering. Thus, the relatively low frequencies used in AM radio are not seriously affected by multipath reception, whereas the frequencies used for FM radio and TV will be.
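To put illustrative numbers on this: a reflection path 300 m longer than the direct path delays the echo by about one microsecond, and that same delay spans very different numbers of carrier cycles in different broadcast bands. The figures below are back-of-envelope assumptions, not real planning values:

```python
# Back-of-envelope: how many carrier cycles does a multipath delay span?
C = 3.0e8             # propagation speed of radio waves, m/s (approx.)
extra_path_m = 300.0  # illustrative extra distance travelled by the reflection
delay_s = extra_path_m / C  # about 1 microsecond

for name, freq_hz in [("AM radio", 1.0e6),    # medium wave
                      ("FM radio", 100.0e6),
                      ("UHF TV",   600.0e6)]:
    cycles = delay_s * freq_hz  # the delay expressed in carrier cycles
    print(f"{name:9s}: {cycles:8.1f} carrier cycles of delay")
```

The same physical delay that barely disturbs a medium-wave carrier shifts a UHF carrier by hundreds of cycles, which is why the higher bands suffer most.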
Fig.1 shows a simple example of a steady reflection causing an added signal having a fixed delay. The system is effectively a comb filter, in which the delay causes alternate cancellation and reinforcement. The loss of signal takes the form of notches in the received spectrum, and if a notch coincides with the transmitted channel there will be considerable degradation.
Fig.1 A steady reflection effectively creates a comb filter, which puts cancellation notches in the signal.
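The comb-filter behavior is easy to reproduce numerically. For a reflection of relative amplitude a and delay tau, the received response is |1 + a·e^(−j2πfτ)|, with notches wherever the reflection arrives in antiphase. The delay and amplitude below are illustrative assumptions:

```python
import numpy as np

# A direct signal plus one reflection delayed by tau behaves as a comb
# filter: H(f) = 1 + a*exp(-j*2*pi*f*tau). Notches fall where the
# reflection arrives in antiphase, at odd multiples of 1/(2*tau).
tau = 1.0e-6   # 1 microsecond reflection delay (illustrative)
a = 0.9        # reflection amplitude relative to the direct signal

f = np.linspace(0, 1.0e6, 1001)   # 0..1 MHz, covering the first notch
H = np.abs(1 + a * np.exp(-2j * np.pi * f * tau))

first_notch = f[np.argmin(H)]
print(f"deepest response {H.min():.2f} at {first_notch/1e6:.2f} MHz")
# first notch expected at 1/(2*tau) = 0.5 MHz, depth 1 - a = 0.1
```

A stronger reflection (a closer to 1) deepens the notches; the delay only sets how closely they are spaced.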
The difficulty with multipath reception is that the reflections may be sustained. If for example the reflections are due to an apartment block, they are likely to cause constant loss of signal quality. Mobile reception, especially in a vehicle, which cannot have a tall antenna, will suffer from varying reflections whose notches will sweep through the signal spectrum like the phasing that was once popular on pop songs.
The effect of spectral notches can be reduced by the use of spectral interleaving as shown in Fig.2. Instead of attempting to broadcast one TV channel in one contiguous part of the spectrum, as was done with analog television, the spectrum is broken up into smaller slots and the data from several TV channels is distributed over a range of slots. The result is that a spectral notch now causes a small amount of damage to all of the TV channels, rather than a large amount of damage to one. Error correction becomes more effective when the extent of errors is limited by techniques such as interleaving.
Fig.2 In spectral interleaving, several TV channels are combined in one multiplex. It will be seen that a notch does a controlled amount of damage to all channels instead of serious damage to one.
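The benefit of interleaving can be shown with a toy model. The channel and slot counts below are illustrative only; real multiplexes spread data over thousands of carriers:

```python
# Toy spectral interleaving: data from several channels is spread
# round-robin across many spectral slots, so a notch that wipes out
# a slot damages every channel a little instead of one channel a lot.
NUM_CHANNELS = 4
NUM_SLOTS = 16   # illustrative; real systems use thousands of carriers

# each channel contributes NUM_SLOTS // NUM_CHANNELS symbols
channels = {ch: [f"ch{ch}-sym{i}" for i in range(NUM_SLOTS // NUM_CHANNELS)]
            for ch in range(NUM_CHANNELS)}

# interleave: slot s carries a symbol from channel s % NUM_CHANNELS
slots = [channels[s % NUM_CHANNELS][s // NUM_CHANNELS]
         for s in range(NUM_SLOTS)]

# a narrow spectral notch destroys slot 5
lost = slots[5]
print("narrow notch loses:", lost)

# a wider notch covering slots 4..7 still costs each channel only one symbol
wide_lost = slots[4:8]
per_channel = {ch: sum(1 for s in wide_lost if s.startswith(f"ch{ch}-"))
               for ch in range(NUM_CHANNELS)}
print("per-channel losses:", per_channel)
```

Without interleaving, the same four-slot notch would have destroyed an entire channel's worth of data, beyond what error correction could repair.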
One consequence of transmitting a frequency-interleaved multiplex of that kind is that the concept of tuning to a TV channel goes away. The receiver tunes to a broad spectrum carrying many channels and selects the appropriate data for the wanted channel.
As the basic mechanism of multipath reception is delay, it should be clear that one way of defeating multipath is to send symbols very slowly. In fact, if the symbol time is long enough, the energy of the reflection will add to the energy in the direct signal and augment it.
In order to obtain a sufficiently high bit rate for practical applications, it is necessary to use many slow channels in parallel. The challenge is how to do that efficiently. Another challenge is to explain it!
It is necessary to explore the concept of transform duality to see how efficient digital broadcasting works. A transform is a mathematical process that allows a signal to be converted between domains. The time and frequency domains are commonly involved.
Digital audio and video work by sampling, and, according to Shannon, a sampling process can only work if it is band-limited. For simplicity, let us consider the ideal case where the band limiting is done by a brick-wall filter having a rectangular frequency response.
The impulse response of such a filter is a sinx/x curve. The entire edifice of digital audio and video rests on the fact that, starting from the center of the impulse, the zero crossings in the curve fall at integer multiples of the sample period. Fig.3 shows a number of samples being passed through such a filter to reconstruct the original waveform. If any one sample is considered, at the center of the impulse due to that sample, the impulses due to all the other samples are crossing through zero.
Fig.3. The impulse response of a rectangular filter is a sinx/x wave having periodic zero crossings. In reconstruction, each sample falls at the zero crossings of all the others.
In other words, the adjacent samples cannot affect the filter output at the precise time of any one sample. The output waveform joins up the tops of the samples. The samples are said to be orthogonal because at the center of each one, they don't interfere with each other. This is just as well, because were it not so, digital audio and video would not work as a sampled waveform could not accurately be reconstructed.
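The zero-crossing property is easy to verify numerically. In this sketch, with an illustrative sample rate and test signal, reconstruction by summing shifted sinx/x (sinc) impulses reproduces every sample value exactly:

```python
import numpy as np

# Reconstruction with sinx/x (sinc) impulses: at the instant of any
# one sample, the impulses of all the other samples pass through zero,
# so the reconstructed waveform hits every sample value exactly.
fs = 8.0                                # sample rate (arbitrary units)
n = np.arange(-16, 17)                  # sample indices
x = np.sin(2 * np.pi * 0.7 * n / fs)    # a band-limited test signal

t = n / fs   # evaluate the reconstruction at the sample instants
# sum of shifted sinc impulses, one per sample (np.sinc(y) = sin(pi*y)/(pi*y))
recon = np.array([np.sum(x * np.sinc(fs * ti - n)) for ti in t])

print(np.max(np.abs(recon - x)))   # essentially zero: samples reproduced
```

Between the sample instants the impulses overlap and sum to the smooth band-limited waveform; only at the instants themselves does each sample stand alone.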
As mentioned, the impulse response of a rectangular filter is a sinx/x curve. However, the spectrum of a rectangular waveform, such as a square wave, is also a sampled sinx/x curve. The zero crossings in the curve correspond to the lack of even harmonics in a square wave. This is transform duality.
Fig.4 shows some examples of transform duality. The time and frequency domains can be interchanged across the diagram. A singularity in one domain becomes an infinite extent in the other. Thus a constant voltage extends infinitely in the time domain, but all of its energy appears at a single frequency: a singularity at zero Hz. The dual of a constant voltage is a singularity in the time domain, a delta function, which has an infinite spectrum.
In this case, however, we are interested in orthogonality. If samples can be orthogonal in the time domain, then transform duality says it is possible for spectra to be orthogonal in the frequency domain. Fig.5 is simply Fig.3 with the horizontal axis changed to frequency. Each sinx/x curve is now the spectrum of a carrier. If we modulate a series of carriers with binary data and space them apart by the correct frequency, they will be orthogonal and will therefore not interfere with one another.
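This orthogonality can be checked numerically: carriers spaced by exactly one cycle per symbol period have zero inner product over that period. The symbol length and carrier indices below are illustrative:

```python
import numpy as np

# Carriers spaced exactly 1/T apart are orthogonal over a symbol of
# length T: the inner product of two different carriers is zero.
N = 64                # samples per symbol (illustrative)
n = np.arange(N)

def carrier(k):
    # complex carrier with k whole cycles per symbol (spacing 1/T)
    return np.exp(2j * np.pi * k * n / N)

same = np.vdot(carrier(5), carrier(5)) / N   # a carrier with itself -> 1
diff = np.vdot(carrier(5), carrier(6)) / N   # two different carriers -> 0
print(abs(same), abs(diff))
```

Shift the spacing away from exactly 1/T and the inner product is no longer zero: the carriers leak into one another, which is why carrier-accurate frequency analysis matters.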
This is the basis of COFDM (coded orthogonal frequency division multiplexing), which achieves significant multipath rejection by transmitting a large number of closely spaced orthogonal carriers, each of which works at a low bit rate. It is ironic that OFDM originated in the USA, yet COFDM was adopted for DAB but not for ATSC.
Fig.5 The dual of Fig.3 is a spectrum in which discrete frequencies are orthogonal. The spectrum of one carrier is shown at a). The center of each carrier falls at the nulls of all the others.
A COFDM receiver is effectively carrying out a Fourier analysis on the received signal; the coefficients from the analysis of one specific carrier carry that carrier's data. With such a large number of carriers in the multiplex, it is vital that the frequency analysis in the receiver is accurate to the individual carrier, else the data will be meaningless.
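In practice the Fourier analysis in the receiver, and its inverse in the transmitter, is an FFT. A minimal round-trip sketch, using illustrative sizes and QPSK data rather than any real broadcast parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64   # number of carriers (illustrative)

# QPSK: one complex data symbol per carrier
bits = rng.integers(0, 2, size=(N, 2))
tx_symbols = (2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)

# transmitter: the inverse FFT turns one symbol per carrier
# into a single composite time-domain signal
time_signal = np.fft.ifft(tx_symbols)

# receiver: the FFT is exactly the Fourier analysis described above;
# each output coefficient is the data carried by one carrier
rx_symbols = np.fft.fft(time_signal)

print(np.allclose(rx_symbols, tx_symbols))   # True
```

With a perfect channel the round trip is exact; real receivers must first find the carrier positions, which is what the power markers are for.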
The solution is for the transmitted spectrum to contain markers: certain carriers, often called pilots, that have greater power than the rest. The markers are placed at precisely defined carrier positions, and the receiver can identify them to obtain spectral synchronization.
The use of COFDM allows a national single-frequency network (SFN) of transmitters that all work on the same frequency. A randomly located receiver close to one transmitter will treat the weak signals from other transmitters as reflections, and a receiver equidistant from two transmitters will effectively receive twice as much power.
In an adaptation of COFDM, the effective time taken to transmit a symbol can be extended by the use of guard intervals between symbols. This makes the rate at which each carrier is modulated even lower, so greater timing errors due to propagation delays can be tolerated.
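Practical COFDM systems such as DVB-T usually fill the guard interval with a cyclic repetition of the end of the symbol (a cyclic prefix) rather than a true null. The sketch below, with illustrative sizes, shows why a delayed echo then does no harm as long as its delay fits inside the guard interval:

```python
import numpy as np

rng = np.random.default_rng(1)
N, GUARD = 64, 16   # carriers and guard-interval length in samples (illustrative)

data = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], size=N)   # QPSK symbols
symbol = np.fft.ifft(data)
# guard interval implemented as a cyclic prefix: copy the symbol's tail
tx = np.concatenate([symbol[-GUARD:], symbol])

# two-path channel: the direct signal plus an echo delayed by 5 samples
h = np.zeros(6, dtype=complex)
h[0], h[5] = 1.0, 0.5
rx = np.convolve(tx, h)[:N + GUARD]

# receiver: discard the guard interval, FFT, then divide out the
# per-carrier gain of the channel (one-tap equalization)
R = np.fft.fft(rx[GUARD:])
H = np.fft.fft(h, N)
equalized = R / H

print(np.allclose(equalized, data))   # True: the echo did no harm
```

Because the echo's delay (5 samples) is shorter than the guard interval (16 samples), the echo only ever carries energy from the same symbol, and each carrier sees a simple gain that one division removes.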
Multi-transmitter COFDM networks need very precise synchronization. Frequency synchronization using techniques such as the program clock reference (PCR) of MPEG merely recreates the correct frequency at the receiver, but with an unknown delay. That is not good enough for COFDM networks, which need to be synchronous in the time domain; something that cannot be achieved over conventional data networks, as these use packet multiplexing and insert an unknown delay into the transmissions.
One way in which time synchronization can be obtained is to use GPS transmissions, which are by definition more accurate in time than COFDM requires. The difficulty is that if the GPS signal is lost for any reason, the COFDM national network will fail. Other synchronization techniques exist, but unfortunately cost more.
John Watkinson, Consultant, publisher, London (UK)