Virtualization - Part 1

As progress marches us resolutely onwards to a future broadcast infrastructure that will almost certainly include a lot more software running on cloud-based infrastructure, this seems like a good moment to consider the nature of virtualization.


If you want to understand virtualization in a fundamental way, you don’t have to look further than the room (or garden, or orbiting space station) that you’re currently in. It might feel pretty solid to you, a sensation reinforced by pretty much everything (legal!) that you’ve done since you were born. Most people live their entire lives without questioning their understanding of reality. But what does that mean? At the very least, you can probably identify a few pragmatic rules. For example, solid means solid: you can’t pass your hand through a tabletop, in much the same way as you can’t walk through walls. We live according to a vast collection of cumulatively reinforced empirical laws of behavior, and we wouldn’t be able to play pool, drive a car or even lie peacefully on a sofa if these rules were even slightly flaky. Can you imagine being able to relax if, one time in every hundred, you fell through the sofa into an unfathomable void?

Everything described above is a perfectly valid and consistent type of reality. But what does it even mean to admit the possibility that there are other types of reality of which we’re seemingly completely unaware?

Virtualization isn’t something new, swept in by modern technology. On the contrary, everything we do and everything we know comes to us courtesy of some kind of virtualization. The concept shouldn’t seem strange to us - in a sense, it’s simply more of what we’re already used to.

Donald Hoffman compared the way we see reality to an operating system’s desktop, where the familiar-looking objects on the surface stand in for the complex and opaque processes that lie underneath. It makes sense for computers, and it makes sense for us. We’re all familiar with the idea of atoms as tiny, fundamental constituents of matter. In fact, they’re not even particularly fundamental. We now know that atoms can be split and that they’re made up of smaller protons, neutrons, electrons and so on. And even these especially tiny objects can be split into a kaleidoscope of weirdly named constituent particles that are so abstract it almost beggars belief - were it not for strong experimental proof.

There’s absolutely no way that our brains can cope with all of that detail, and nor do they have to, because we have evolved a way to see the world through what are essentially proxies for reality. So, when I see a table, I don’t see a swirling sea of sub-atomic particles; instead, I perceive a wooden surface standing on metal legs. And I know - despite our latterly-obtained knowledge that the table is strictly composed mostly of empty space - that my coffee cup won’t pass through it when I put it down after taking a sip.

Turning back to computers, the desktop analogy is so effective that it has opened up computing to the vast majority of the population that doesn’t write machine code. Even before graphical user interfaces, command-line operating systems like MS-DOS and UNIX were already massive abstractions of the hardware beneath. Today’s computers (even Apple’s Vision Pro) almost universally present a desktop paradigm in their UI.

This illustrates an incredibly important principle: virtualization (and the GUI is a prime example) can have a catalytic effect on usability. Before the GUI, computers were usable only by experts. If they had remained that inaccessible, they would still be niche, specialist machines, which would have ruled out the vast majority of today’s computer users, who are not specialists (and nor should they have to be). As a result of today’s profoundly accessible computers, practically every field of research and wider endeavor is enhanced by their near-universal availability.

To see the proof of that, look no further than email. If you’re using Gmail or anything like it, then you are confidently using something that is massively virtualized. You might see what’s labelled an “inbox” on your computer, but what that corresponds to in the physical world is a few bytes stored on some kind of massive virtualized storage system somewhere - who knows where? - in the world. To the user, the details don’t matter because it just works and it’s easy to use.

There’s a pattern here that’s far from obvious (that’s the nature of virtualization!): virtualization occurs at every level of the technology stack. It can be a programming language like C sitting above machine code, or an operating system written in C that gives users a graphical interface. And today, you can get into an electric car, speak your destination to it, and it will take you there. That is, essentially, a virtual journey: you don’t need to understand the first thing about driving, how a car works, vehicle dynamics or the law as it applies to drivers.
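To make that layering concrete, here’s a minimal sketch in C (the filename is purely illustrative). Even these few lines cross several layers of virtualization: stdio’s FILE type is a buffered, portable abstraction over the kernel’s integer file descriptors, which in turn abstract filesystems, block devices and, eventually, physical storage. The programmer sees none of that.

```c
/* A few lines of C that quietly cross several layers of virtualization:
 * FILE abstracts file descriptors, which abstract filesystems, which
 * abstract block devices, which abstract physical storage. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("note.txt", "w");   /* FILE: stdio's virtual "file" */
    if (f == NULL) {
        perror("fopen");
        return 1;
    }
    fputs("Written through layers of abstraction.\n", f); /* buffered in user space */
    fclose(f);  /* flushes the buffer; the kernel decides when the disk is touched */
    return 0;
}
```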

So, wherever you look, you can find examples of virtualization at any level. Before we go on, it’s a good idea to talk about the distinction between virtualization and abstraction. There’s actually very little difference, except, perhaps, that abstraction is a more general term. Specifically (although this distinction is a soft one), abstraction tends to hide complexity, while virtualization does the same thing while also making the abstracted view look like something else that is familiar. In practice, the two terms are so interwoven that they are largely interchangeable, and I’m going to treat them as such, except when we’re looking at specific applications.

Virtual machines (VMs) are a case in point: entire, configured computers that are abstracted from specific hardware but behave in every other way like a real computer. It’s quite tricky to achieve, and the companies that make virtual machine software do it extremely well. Here are some of the advantages:

Run Multiple Machines On A Single Computer

Running a virtual machine will always require some computing power (“compute”) just to enable the processes needed to support it, and “supporting” effectively means providing an emulation layer that makes the virtual machine think it’s a real computer. There’s also a hypervisor, which switches between multiple virtual machines in a way that makes sense to the actual, physical computer. But if the virtual instances aren’t too demanding, there’s no reason why you shouldn’t run several of them on a single physical computer. This is a huge advantage: it breaks the hard link between the number of computer instances and the number of physical devices. Is it a panacea? No. You can’t expect miracles, especially where heavy-duty processing or lots of real-time I/O is called for.
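The emulation is usually good enough that a guest OS can’t tell the difference, but hypervisors do leave a deliberate fingerprint. As a small illustration in C (x86 only, GCC or Clang, using their <cpuid.h> helper): CPUID leaf 1 sets bit 31 of ECX, the “hypervisor present” bit, when the code is running inside a VM, and leaf 0x40000000 reports the hypervisor’s vendor signature. On bare metal that bit is reserved and reads as zero, which is exactly why hypervisors use it to announce themselves.

```c
/* A minimal sketch: ask the CPU whether a hypervisor sits beneath us. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        fprintf(stderr, "CPUID not supported\n");
        return 1;
    }

    if (ecx & (1u << 31)) {                 /* "hypervisor present" bit */
        char vendor[13] = {0};
        __cpuid(0x40000000, eax, ebx, ecx, edx);
        memcpy(vendor + 0, &ebx, 4);        /* signature spans EBX, ECX, EDX */
        memcpy(vendor + 4, &ecx, 4);
        memcpy(vendor + 8, &edx, 4);
        printf("Running under a hypervisor: %s\n", vendor);
    } else {
        printf("No hypervisor detected (bare metal, as far as we can tell)\n");
    }
    return 0;
}
```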

Easily Replace A VM That’s Failed

When a VM fails, it can be for any number of reasons (a bad software update, for example). In a high-pressure environment, instead of stopping production to figure out what went wrong, it is quicker simply to close down the troublesome instance and start a new one that was configured and stored for exactly this purpose before the problem hit. Remember: there’s no change to the physical environment.
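Here’s a hedged sketch of that “replace, don’t repair” pattern using libvirt’s C API on a QEMU/KVM host (compile with -lvirt). The domain names playout-vm-2 and playout-vm-spare are hypothetical, and it assumes the spare was defined in advance from a known-good image.

```c
/* Sketch: swap a failed VM for a pre-configured spare via libvirt. */
#include <stdio.h>
#include <libvirt/libvirt.h>

int main(void)
{
    virConnectPtr conn = virConnectOpen("qemu:///system");
    if (conn == NULL) {
        fprintf(stderr, "Failed to connect to the hypervisor\n");
        return 1;
    }

    /* Force off the misbehaving instance (hypothetical name)... */
    virDomainPtr failed = virDomainLookupByName(conn, "playout-vm-2");
    if (failed != NULL) {
        virDomainDestroy(failed);   /* hard power-off of the VM, not the host */
        virDomainFree(failed);
    }

    /* ...and boot the pre-configured replacement. */
    virDomainPtr spare = virDomainLookupByName(conn, "playout-vm-spare");
    if (spare != NULL && virDomainCreate(spare) == 0)
        printf("Replacement VM started; physical environment untouched\n");

    if (spare != NULL)
        virDomainFree(spare);
    virConnectClose(conn);
    return 0;
}
```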

Demo An Entire Set-up Without Having To Physically Configure A Computer

This is a niche use case, but it nicely illustrates the advantages of a VM. Companies that sell complex computer software or services can offer their customers a “canned” version of multiple set-ups. There’s no need to waste time configuring the demonstration at the customer’s premises: it’s already set up on one or more virtual machines. It’s quick, slick, and a godsend for weary tech salespeople.

The concept of virtualization has much more to it, and we’ll examine more examples in the next article.
