Sound engineers have spent over twenty years implementing and improving audio-over-IP systems. This has given audio a head start in the race to migrate to IP. Not only does the sound transfer seamlessly across networks, but recent designs have also propelled advances in security, integration, and control.
The push to create the ideal digital cinematography camera has now been going on for, arguably, two decades. There were a couple of standout attempts in the 1980s involving high definition tube cameras, but the introduction of Sony’s HDCAM tape format in 1998 served more or less as the starting point of recognizably modern digital cinema. Since then, a huge effort has been made to meet the standards of a century of conventional, photochemical moviemaking. Arguments about whether that’s been achieved, or ever will be achieved, seem likely to rage forever, but in 2019 there seems to be at least some interest in going way, way beyond (some parts of) what 35mm film could ever do. The question is why.
In the fourth and final part of this series, we wrap up with an explanation of how PTP is used to support SMPTE ST 2110-based services, and we dive into timing constraints related to using COTS (Commercial Off-The-Shelf) hardware, i.e. servers.
People have been making pictures for both the big and small screens for almost a century. In an industry with a history that long, it’s no surprise that the perpetual search for something new has long been tempered by a certain respect for tradition. Or, to put it another way, directors of photography are very often looking for ways to make pictures look different, and different in a way that’s somehow appropriate.
DVB, the consortium developing open technical specifications for the delivery of broadcast services, has formally approved its DVB-I specification for delivery of linear TV over IP networks, including the internet.
As the amount of data in the world continues to multiply exponentially, a Holy Grail of research is finding a way to reliably preserve that data for the ages. Researchers are now closing in on methods to make data permanent. The problem is that there is no way to be absolutely sure any of them will work far into the future.
In the data recording and transmission fields, any time a recovered bit differs from what was supplied to the channel, an error has occurred. Different types of data have different tolerances to error, and any time the error rate exceeds that tolerance, we need to do something. Error handling is a broad term that covers everything to do with errors: studying the error sensitivity of different types of data, measures to prevent or reduce errors, measures to actually correct them, and measures to conceal an error when correction is not possible.
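The distinction between detecting and correcting errors can be made concrete with a minimal sketch (not from the article): a single even-parity bit can only detect an odd number of flipped bits, while a simple repetition code can actually correct a flipped bit by majority vote. The function names here are illustrative, not from any standard library.

```python
def add_parity(bits):
    """Append an even-parity bit: the total number of 1s becomes even.
    Detects any single-bit error, but cannot say which bit flipped."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """True if the word (data + parity bit) still has even parity."""
    return sum(word) % 2 == 0

def encode_repetition(bits, n=3):
    """Send each bit n times; redundancy is what enables correction."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Majority vote over each group of n: corrects up to (n-1)//2
    flipped bits per group, at the cost of n times the data rate."""
    return [1 if sum(coded[i:i + n]) > n // 2 else 0
            for i in range(0, len(coded), n)]

# Detection: parity flags the error but cannot fix it.
word = add_parity([1, 0, 1])          # [1, 0, 1, 0]
word[2] ^= 1                          # channel flips a bit
assert not parity_ok(word)

# Correction: the repetition decoder recovers the original data.
coded = encode_repetition([1, 0, 1])  # [1,1,1, 0,0,0, 1,1,1]
coded[1] ^= 1                         # channel flips a bit
assert decode_repetition(coded) == [1, 0, 1]
```

Real channels use far more efficient codes (CRCs for detection, Reed–Solomon or LDPC for correction), but the trade-off sketched here is the same: correction always costs more redundancy than detection alone.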
This past summer the NBA did a little experimenting, using 5G and mobile phones to cover its summer league. This was not User Generated Content (UGC) by any means. Nor was it an off-the-shelf deployment of 5G and a demonstration of its capability.