HDR: Part 39 - Creative Technology - The Good Old Days
Progress inevitably comes with compromise. We can’t complain about the technology that’s brought us from hundred-kilo tube cameras to the 4K cellphones of today, but anyone who remembers the fuzzy old days of the 1990s might also remember a few things we still miss in 2021. Let’s examine some of them.
Camera Color
Most people know that color images require red, green and blue records, and that the sensors we use only see brightness. In most cameras, we use conventional absorption filters, much like any other sort of ink or dye, which absorb all the light they don't transmit. In older cameras - and still in many three-chip broadcast cameras - dichroic filters were used, which reflect the light they don't transmit. It seems an academic difference until we realize that those dichroic filters are often much deeper and more saturated than the filters on more modern cameras.
There are two interconnected motivations for this. First, less saturated filters let more light through, resulting in a more sensitive camera. Second, most objects will register on the red-, green- and blue-sensitive photosites alike, which is useful for the mathematical algorithms that reconstruct a finished image from a single sensor's data. The resulting image will be low in saturation, so another algorithm, broadly equivalent to a saturation slider in Photoshop, may be used. As a result, modern cameras sometimes don't have quite the same color behavior as older, three-chip types, and can be fooled. Some, for instance, report deep blue light from blue LEDs as purple.
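As a rough illustration of what that post-demosaic saturation boost involves, the sketch below applies a simple 3x3 saturation matrix to a washed-out pixel. The matrix construction and the Rec. 709 luma weights are assumptions for the example; real cameras use their own proprietary math, but the principle, and the way it exaggerates any color error, is the same.

```python
import numpy as np

# Illustrative only: a simple saturation matrix of the kind a camera might
# apply after demosaicing a sensor with weakly saturated color filters.
# Rec. 709 luma weights are assumed; real cameras use proprietary matrices.
LUMA = np.array([0.2126, 0.7152, 0.0722])

def saturation_matrix(s):
    """Return a 3x3 matrix that scales color away from luma by factor s."""
    return np.outer(np.ones(3), LUMA) * (1.0 - s) + np.eye(3) * s

rgb = np.array([0.30, 0.32, 0.45])        # a washed-out, slightly blue pixel
boosted = saturation_matrix(2.0) @ rgb    # double the saturation
print(boosted)                            # the blue cast is exaggerated - and so is any error
```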
That's an extreme example, and modern cameras give us other huge advantages, but colorimetry will probably always be a target for improvement.
Lightweight Zooms
Bigger imaging sensors were developed largely for compatibility with traditional cinema lenses, to create results comparable to those established by nearly a century of cinema production. Mainly because of that, cameras with three separate, small imaging sensors gave way to designs with one big chip, at least in many applications.
The problem is that lenses designed for historic broadcast cameras - the familiar zoom with a handgrip on the side - don't work on big-chip designs because they simply don't cast a big enough image. It's possible to build comparable lenses for big-chip cameras, and that's what Fujifilm did with the XK6x20 and Canon with the CN7x17. These lenses, though, are inevitably more expensive, larger and heavier than traditional broadcast zooms, and may be slower, with a reduced zoom range.
In the days of 16mm film, a documentary team could enjoy a handy, portable lens such as the Canon 8-64mm T2.4, now famed as the lens used to shoot The Hurt Locker. Fundamental physics prevents something similar being made for bigger-chip cameras using conventional techniques; solutions are likely to be incremental at best.
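To put some rough numbers on that physics, the sketch below compares approximate Super 16 and Super 35 gate sizes (generic approximations, not any specific camera) and scales the 8-64mm accordingly. Covering the bigger format with the same angle of view roughly doubles every focal length, and holding T2.4 at those focal lengths demands proportionally wider, heavier glass.

```python
import math

# Rough scaling exercise using approximate gate sizes; exact dimensions
# vary by camera and aspect ratio, so treat these as ballpark figures.
super16 = (12.52, 7.41)    # mm, approximate Super 16 frame
super35 = (24.89, 14.00)   # mm, approximate 3-perf Super 35 frame

def diagonal(w, h):
    return math.hypot(w, h)

crop = diagonal(*super35) / diagonal(*super16)   # roughly 2x

# The 8-64mm on Super 16 frames like a lens of about this range on Super 35,
# and matching T2.4 there needs an entrance pupil about 'crop' times wider.
print(f"crop factor ~{crop:.1f}, equivalent zoom ~{8 * crop:.0f}-{64 * crop:.0f}mm")
```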
Global Shutter
A trivial matter, maybe, but the idea of an image being captured sequentially, line by line, is one that only really began to afflict film and television in the days of big-chip digital capture. Accepting a rolling shutter made sensor engineering much, much easier, but created issues with only half of a frame being illuminated by something like a muzzle flash from a firearm, and potentially even visible skew during sharp horizontal pans.
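A back-of-envelope calculation shows why the skew is visible. The numbers below are assumptions for illustration - a worst-case readout of one whole frame period and a guessed field of view - but they give a feel for the scale of the lean a brisk pan can produce.

```python
# Back-of-envelope rolling-shutter skew. All figures below are assumed for
# illustration; real sensors generally read out faster than a full frame period.
fps = 24
readout_s = 1 / fps               # assumed worst case: readout takes a whole frame
pan_deg_per_s = 60                # a brisk horizontal pan
h_fov_deg = 40                    # assumed horizontal field of view
width_px = 3840                   # UHD frame width

# How far the scene shifts between reading the first line and the last:
skew_px = (pan_deg_per_s * readout_s) / h_fov_deg * width_px
print(f"~{skew_px:.0f} pixels of lean between the top and bottom of the frame")
# A brief muzzle flash, likewise, only reaches the lines exposing during the
# flash, so it appears as a bright band rather than lighting the whole frame.
```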
It's not something anyone wants, and at the time of writing some manufacturers have begun releasing cameras which avoid rolling shutters. Some compromise, often to sensitivity or dynamic range, used to be inevitable, simply because the shuttering components had to take up area on the sensor that might otherwise be used to sense light. Modern approaches can solve this by stacking sensor components one behind the other, and we can reasonably hope that one day soon a global shutter will go without saying on most cameras.
Standard Monitors
For most of the history of television, monitoring standards were fairly fixed; a standard-definition display should implement ITU-R BT.601, and that was that. There were loopholes, in that the brightness behavior of high-definition displays was only formally specified in 2011, in ITU-R BT.1886, although practically speaking both high- and standard-definition systems had been de facto standardized for years. Camera and monitor standards ensured any camera could be connected to any display and things would look right.
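For the curious, BT.1886 boils down to one formula: a power law with an exponent of 2.4, anchored to the display's measured white and black levels. A small sketch, using example luminance figures rather than any particular monitor:

```python
# The reference EOTF from ITU-R BT.1886: screen luminance as a function of the
# video signal, for a display with white level lw and black level lb in cd/m^2.
# The luminance figures below are examples, not a recommendation.
def bt1886(v, lw=100.0, lb=0.1, gamma=2.4):
    """Map a normalized video level v (0..1) to luminance in cd/m^2."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

for code in (0.0, 0.18, 0.5, 1.0):
    print(f"video {code:4.2f} -> {bt1886(code):6.2f} cd/m^2")
```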
In modern practice, in single-camera drama and related fields, that's now far from the case. Without even taking the various HDR standards into account, every camera manufacturer typically has at least one proprietary way of encoding color and one way of encoding brightness, and monitors generally need to be manually configured to display images correctly. Sometimes mistakes are obvious and easily corrected; sometimes they may be subtle and hard to fix.
Standardization here, given the various commercial interests at stake, seems unlikely, though initiatives such as the Academy Color Encoding System are at least intended to address the problem.
Instant Backup
There are two magnetic tape storage systems left in the world, one of which is essentially proprietary to IBM server installations, and neither of which is a video tape format. The one most often encountered in film and TV work is LTO, a solid option which stores lots of data quickly. No matter how good it is, though, it'll never quite match the convenience of pulling a tape out of a camera and putting it on a shelf.
Of course, historic camera formats were never really intended to have much archival permanence. Now that many of them are decades obsolete, it's started to become clear how quickly they can fail. Still, they're more permanent than a flash card we're required to reuse the following day, and there's a convenience in creating the archive in-camera as the scene is recorded.
Really good backup solutions remain a challenge for information technologists in general, let alone film and television. Because it's a desire held so strongly by so many very different industries, it's reasonable to hope that something will come along at some point. That has been the case for a while, though, and we've seen many early demos of new ideas come and go with no product offered; soon, perhaps.
Lights That Match
There's a refreshing simplicity to a tungsten light bulb. Halogen technology, which allows longer life and higher operating temperatures, is a complication, but it's still basically a piece of metal that glows because it's hot. The color quality is almost perfect (look up "blackbody radiator") and the color temperature depends directly on the temperature of the metal. Theoretically, slight variances in the thickness or length of the filament might marginally alter its resistance, its temperature and thus its color temperature, but in reality, tungsten lightbulbs tend to match closely.
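The "blackbody radiator" point can be made concrete with Planck's law, which gives the spectrum of a hot object purely as a function of its temperature. The short sketch below compares a roughly domestic-bulb filament with a studio tungsten one; the exact temperatures are illustrative.

```python
import math

# Planck's law for a blackbody radiator: spectral radiance as a function of
# wavelength and temperature, showing why filament temperature alone sets the
# color. Constants are in SI units; the filament temperatures are illustrative.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody in W per steradian per cubic metre."""
    return (2 * H * C ** 2 / wavelength_m ** 5) / (
        math.exp(H * C / (wavelength_m * K * temp_k)) - 1
    )

for temp in (2800, 3200):                 # domestic bulb vs studio tungsten, roughly
    ratio = planck(450e-9, temp) / planck(650e-9, temp)
    print(f"{temp} K: blue-to-red ratio {ratio:.2f}")
# The hotter filament emits relatively more blue - a higher color temperature.
```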
Other technologies vary. HMI generally matches when it's new, but drifts toward the blue-green as it ages. Fluorescent tubes vary on a magenta-to-yellow axis. Even LED lights using the same broad technological approach can look very different. Now we have lots of LED lights using very different technological approaches. Even the best may not match between manufacturers, and two shades of sunlight in the same scene is never a good look.
The trend is for LED lights to improve over time, and it's reasonable to imagine a future in which they will match (and periodically be recalibrated so they keep matching), and we might write that off as a necessary caveat of their huge flexibility.
Low Latency
Back in the days of analogue electronics, a one-frame delay in a camera or monitor was impossible; the technology simply couldn't store a frame, and images went from the glass of the lens to the glass of the display as fast as a few transistors could pass the signal. Modern designs, conversely, usually have enough memory to keep a whole picture stored while various mathematical algorithms do their work.
That's become so prevalent that there may be a perceptible delay between reality and the monitor. It's much easier for designers to delay by multiples of a whole frame, waiting for the entire image to be available before starting to work on it, and as such it's easy to build up enough delay that operating from the viewfinder becomes harder than it should be.
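Some simple arithmetic shows how quickly those whole-frame delays add up. The pipeline stages below are made up for the sake of the example, not taken from any real camera.

```python
# How whole-frame buffering accumulates into perceptible delay.
# The stage list is purely illustrative, not any particular product.
fps = 24
frame_ms = 1000 / fps               # about 41.7 ms per frame at 24fps

stages = {
    "sensor readout": 1,            # frames of buffering per stage (assumed)
    "image processing": 2,
    "monitor scaling": 1,
}
total_frames = sum(stages.values())
print(f"{total_frames} frames buffered = ~{total_frames * frame_ms:.0f} ms glass-to-glass")
# Around 167ms at 24fps - enough for operating to feel noticeably sluggish.
```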
Partly it's a necessary caveat of all digital signal processing; partly it's a design philosophy which may prioritize a long list of features over snappy response times. Some manufacturers have released firmware updates for cameras promoting reduced glass-to-glass latency as a feature, and it's reassuring that the industry treats this as a real issue worth addressing.
Solutions
Some of these issues are more fixable than others; some, particularly the concerns over backup, affect a much wider area of industry than just film and television, and seem likely to enjoy very large R&D investment as a result. Still, anyone who's been involved for a decade or three might be willing to raise a glass to certain long-obsolete technologies, and perhaps introduce current designers to the best aspects of what went before.