Virtualization - Part 2
In part one, we saw how virtualization is nothing new and that we rely on it to understand and interact with the world. In this second part, we will see how new developments like the cloud and Video over IP have allowed us to abstract the conceptual parts of a broadcast infrastructure and spread them across the world. And when we do this, the possibilities don’t just add together; they multiply.
Virtualization as a way to simplify a complex system has a long history. Horses are extremely complex systems, but by accessorizing them with reins, saddles and stirrups, horse riders abstract themselves from the biological system with a few simple mechanical controls. While there’s undoubtedly a complex and nuanced relationship between horse and horse rider, the controls limit the “vocabulary” and allow a degree of focus that makes the communication concise but effective. Cars have a similarly limited range of controls, but within the boundaries of all the things that you can do with a car, they are extraordinarily effective.
Helicopters have many more degrees of movement and are notoriously difficult to learn to fly. Their control systems are, of course, an abstraction, but you can’t help feeling that they are a bit too “low level” and that, if we’re ever going to have proper flying cars, they’re going to need a much simpler way to fly them. Technology will make this possible and is, perhaps, an increasingly important part of this story. When you don’t have to rely on purely mechanical linkages between control and system, then you can have almost any degree of abstraction and a completely virtual control interface.
Remote controllers and control surfaces also go back a long way. Early TV remote controls were connected to the TV with a fat cable, and if you wanted to change the channel, the equivalent of the “program-up” button initiated an electric motor that physically switched between stations.
In the professional domain, digital audio started to make waves in the middle of the 1980s, and around that time, controller specialist JL Cooper brought out a MIDI controller with eight faders and a jog wheel. A more recent but very similar version remains on sale today. A small Seattle-based company called Spectral Synthesis made a multitrack recorder/sampler based on PC cards and a digital audio bus. Spectral incorporated support for the JL Cooper controller, and the result was a complete transformation in the user experience. At the time, the state of the art was a PC with a 486 processor, but all the DSP was hosted on the plug-in cards. With the controller, you could not only start, stop and record with tactile transport controls but also use physical faders and - best of all - perform audio scrubbing with the hardware jog wheel. The result was that a clever but slow system was made to feel organic. It offered all the advantages of digital with the familiar, tactile feel of a tape recorder. (Spectral Synthesis was later purchased by Euphonix, which eventually became part of Avid.)
At around the same time, again as a result of the rise of digital audio, the first digital mixing consoles started to appear. For anyone brought up with analog desks (that was everyone at the time), it seemed like some fundamental rules were about to be broken.
Before we go into that, and with apologies if this seems obvious - because it certainly wasn’t at the time - in a digital audio mixing desk, no actual audio passes through the faders. There’s no need. The mixing desk control surface becomes a virtual mixing desk, with the processing itself taking place conceptually somewhere else. Smaller consoles might have their digital electronics in the same case as the control surface, but with bigger devices, there would be a digital processor in a rack or some kind of separate unit. The same goes for input and output. With digital connections, there was no longer any need to have analog I/O near the desk. This extra flexibility meant that the I/O could be located nearer the microphones - on a studio floor, for example, with only a digital cable between the pre-amps (and their necessary A/D converters) and the console’s digital processing unit.
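To make that separation concrete, here is a minimal sketch (in Python, with an entirely made-up message format and address) of what a digital desk’s fader effectively sends: a few bytes of control data to the processing engine over the network, while the audio itself never passes through the control surface.

```python
import socket
import struct

# Hypothetical control message: the surface sends only control data
# (channel number + level) to the remote DSP engine; no audio flows here.
DSP_ENGINE = ("192.0.2.10", 9000)   # illustrative address of the processing rack

def send_fader_move(channel: int, level: float) -> None:
    """Ask the remote DSP engine to set 'channel' to 'level' (0.0 - 1.0)."""
    # Packed as: 2-byte message type, 2-byte channel, 4-byte float level.
    message = struct.pack("!HHf", 0x0001, channel, level)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, DSP_ENGINE)

# Moving fader 3 to half travel is just an 8-byte datagram, not an audio path.
send_fader_move(channel=3, level=0.5)
```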
A significant benefit of this new and flexible architecture is that there is no longer a 1:1 relationship between inputs and channels. Previously, if you wanted a cello on channel 8, you’d have to plug the microphone into the desk’s channel 8 input socket. With digital, if you wanted to, you could plug instruments and vocalists into the next available input on the remote input box and sort out the routing on the desk. Just to be completely clear: if your cello was plugged into physical input number 8, you could arrange for it to appear on the desk’s channel 3 with a simple visit to the input routing menu. This kind of flexibility has now arrived in video with Video over IP and standards such as SMPTE 2110 and protocols like NDI.
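The routing itself is little more than a lookup table. As a purely illustrative sketch (the mapping and function are invented, not taken from any real console), the cello example might look like this in software:

```python
# Illustrative source-to-channel routing table: physical input sockets on the
# remote I/O box mapped to whichever desk channel the engineer prefers.
routing = {
    8: 3,   # cello on physical input 8 appears on desk channel 3
    1: 1,   # vocal mic stays on channel 1
    2: 7,   # acoustic guitar routed to channel 7
}

def desk_channel_for(physical_input: int) -> int:
    """Return the desk channel a physical input is routed to (default: itself)."""
    return routing.get(physical_input, physical_input)

print(desk_channel_for(8))   # -> 3: the cello shows up on channel 3
```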
Before these Video over IP standards, there was strictly one (SDI) connection per video channel and very likely a big, heavy router to manage sources and destinations. Network video broke down this one-to-one physicality of connections. Instead, it’s now possible to carry as many channels of video (and, of course, audio) over a single network connection as the bandwidth will allow. With compression, that can be a large number. It also means that you can build incredibly flexible studio facilities, where the only wiring is network cables. Into a network socket you can plug a camera, a video switcher, or a monitor. It’s all software-defined, and the exquisite beauty of it is that you can reconfigure a studio space in minutes. With no legacy fixed wiring scheme, you can plug anything anywhere and configure it with either a preset or a few menu selections. It’s absolutely transformative for cost-sensitive facilities.
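The arithmetic behind “as many channels as the bandwidth will allow” is simple enough. The figures below are only rough, illustrative assumptions (around 3 Gb/s for an uncompressed HD stream and around 150 Mb/s for a lightly compressed one), but they show why compression changes the numbers so dramatically:

```python
# Rough, illustrative stream counts for a single 10 GbE network connection.
LINK_CAPACITY_GBPS = 10.0
HEADROOM = 0.8                      # keep some bandwidth in reserve

UNCOMPRESSED_HD_GBPS = 3.0          # assumed rate for an uncompressed HD stream
COMPRESSED_HD_GBPS = 0.15           # assumed rate for a lightly compressed stream

usable = LINK_CAPACITY_GBPS * HEADROOM
print(int(usable // UNCOMPRESSED_HD_GBPS))   # a handful of uncompressed streams
print(int(usable // COMPRESSED_HD_GBPS))     # dozens of compressed streams
```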
Another aspect of virtualization is plug-ins. Editors will be familiar with plug-ins in their DAWs and NLEs. As the technology underpinning these incredibly useful functional units has improved (especially digital signal processing), so have their user interfaces. The state of the art in plug-ins is now such that, despite being unashamedly digital, they can convincingly emulate their analog counterparts. Not only can NLEs conjure up pretty accurate film looks (even in a live broadcast), but DAWs can use digital emulations of vintage audio compressors and process microphones to sound like specific models from any time in the last sixty years. Some of these plug-ins have functional-looking interfaces, while others are simply gorgeous to look at.
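Behind the gorgeous front panels, an “analog-sounding” plug-in is ultimately arithmetic on samples. As a toy sketch (nowhere near a production-quality emulation), a tanh soft-clipper is one classic way to approximate the gentle saturation of driven analog circuitry:

```python
import math

def soft_clip(samples, drive=2.0):
    """Toy 'analog-style' saturation: a normalized tanh soft-clipper."""
    return [math.tanh(drive * s) / math.tanh(drive) for s in samples]

# A sine wave pushed into the clipper gets the rounded-off peaks (and the
# added harmonics) loosely associated with driven analog gear.
sine = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(48)]
saturated = soft_clip(sine, drive=3.0)
```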
Away from the glamor of popular plug-ins, physical control surfaces for virtual devices have a huge advantage in smoothing over rapid, large-scale technology changes. It’s hard to put a figure on the value of production professionals’ accumulated knowledge and experience, but it is significant. Controllers that emulate a previous generation of physical processors and other specialist equipment provide a bridge to new generations of technology by keeping the familiar controls. When this is done well, operators might barely notice that they’re controlling a virtual system. For example, in live cloud production, physical control surfaces can work the way they always have, and with sufficiently low latency, operators could almost forget that they’re controlling a video switcher in the sky.
A quick note about designing and implementing physical controllers for virtual systems: there are a few factors that deserve close attention.
Latency - the delay between issuing a command and it taking audible or visible effect - is an ever-present factor in virtual or remote operations. Imagine playing a MIDI keyboard connected to a virtual synthesizer. If the latency is greater than around 20ms, the keyboard will feel spongy and unresponsive, and musicians will find it hard to play accurately. The same applies to any kind of control surface (imagine trying to control a car if the steering had a 300ms latency!).
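A useful way to think about this is as a latency budget: every stage between the control surface and the thing it controls adds a few milliseconds, and the total has to stay under the point at which the interface starts to feel spongy. The stages and figures below are purely illustrative:

```python
# Illustrative latency budget for a physical controller driving a remote system.
budget_ms = 20.0                     # roughly where a keyboard starts to feel spongy

contributions_ms = {
    "control surface scan": 2.0,     # reading the faders/keys
    "network (one way)": 8.0,        # e.g. controller to a cloud production engine
    "processing": 3.0,               # the virtual device acting on the command
    "output buffering": 5.0,         # audio/video buffer before it is heard or seen
}

total = sum(contributions_ms.values())
print(f"total {total} ms -> {'OK' if total <= budget_ms else 'feels spongy'}")
```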
Another factor, which I mention as a curiosity, is “zipper noise”. In the early days of digital audio mixing consoles, fader controllers had only eight bits of resolution or less. You might think that 256 fader positions would be enough - and they were in terms of levels. But when a fader was moved, the audio would produce an audible “zipper” as the level swept through the relatively small number of steps. Designers had to build specific “de-zipper” code to counter this until the ultimate solution - 16-bit (or higher) control resolution - became feasible.
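For the curious, “de-zipper” code is essentially smoothing: rather than letting the gain jump from one coarse fader step to the next, the processor ramps between the two values over a block of samples. A minimal sketch of the idea (not any particular console’s algorithm):

```python
def dezipper(samples, old_gain, new_gain):
    """Apply a gain change as a per-sample linear ramp instead of a hard step."""
    n = len(samples)
    out = []
    for i, s in enumerate(samples):
        gain = old_gain + (new_gain - old_gain) * (i / max(n - 1, 1))
        out.append(s * gain)
    return out

# A hard jump between adjacent 8-bit fader steps (e.g. 0.50 to 0.51) clicks
# audibly; ramping it across a block of samples removes the 'zipper'.
block = [0.3] * 480                      # 10 ms of audio at 48 kHz
smoothed = dezipper(block, 0.50, 0.51)
```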
Virtualization will probably be the default approach within the next decade. It will open up new creative possibilities while massively reducing production costs and - as a bonus - making studio and live event production more flexible than ever.