HDR Picture Fundamentals: Camera Technology

Understanding the terminology and technical theory of camera sensors & lenses is a key element of specifying systems to meet the consumer desire for High Dynamic Range.

Dynamic range has been a target for improvement for almost as long as cameras have existed. Modern records of the first electronic images may not be an accurate representation of how they looked at the time, but their ability to handle high-contrast scenes seems to have been fairly limited. That improved enormously throughout the twentieth century, and by the turn of the millennium it was clear that electronic cameras might soon achieve the sort of dynamic range required for high-end feature film production.

The current need to satisfy improved, HDR-capable distribution and display technologies means that broadcast production now demands the same expanded capabilities. Happily, most cameras already have enough performance for most situations. Even so, evaluating camera and lens options can still be complicated, especially where special circumstances exist - such as the increasingly popular deployment of cinema cameras to lend an extra gloss to broadcast work.

The Past

Vacuum-tube broadcast cameras had poor dynamic range by modern standards, perhaps around seven stops. Early solid-state sensors were often worse, albeit smaller, lighter, cheaper, and with less complex support requirements. They quickly improved, to the point where the first digital cinema cameras worthy of the name were modified broadcast cameras. Around the turn of the millennium, Thomson’s LDK-7500, known as the Viper FilmStream, was among the first electronic cameras with sufficient dynamic range to be realistically competitive with the 35mm film process used in single-camera drama.

With a dynamic range of perhaps eleven stops, Viper would not be competitive today, but it was state of the art at release. The three-CCD optical assembly was similar (perhaps identical) to those used in contemporary broadcast cameras, but the color processing differed. Most broadcast cameras were configured to satisfy the ITU’s Recommendations BT.601 and BT.709, which had been designed to accommodate far less capable technology. Viper was an early adopter of the now-familiar logarithmic approach to encoding everything the CCDs could capture for downstream manipulation.
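The principle is easy to sketch. A logarithmic curve spends code values per stop of scene brightness rather than per unit of linear signal, so deep shadows and strong highlights both survive quantisation. The Python below is a minimal illustration with arbitrarily chosen black point, white point and bit depth - not the actual FilmStream transfer function:

```python
import numpy as np

def log_encode(linear, black=0.001, white=8.0, bits=10):
    """Map linear scene light (mid-grey = 1.0) to integer code values,
    spending an equal slice of the code range on each stop.
    The black/white points and bit depth here are illustrative only."""
    clipped = np.clip(linear, black, white)
    stops_above_black = np.log2(clipped / black)   # position in stops
    total_stops = np.log2(white / black)           # encodable range
    return np.round(stops_above_black / total_stops * (2**bits - 1)).astype(int)

# A deep shadow, mid-grey and a highlight 8x above mid-grey all fit:
print(log_encode(np.array([0.001, 1.0, 8.0])))     # -> [0, 786, 1023]
```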

Much the same approach is now being used to place broadcast cameras in HDR workflows. For some time, the performance of most cameras has been so far in advance of Standard Dynamic Range distribution that it has not been a key consideration. Even the least capable HDR distribution formats, though, can describe images with at least 16 stops of dynamic range, similar to the best current cinema cameras.
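As one concrete example, the PQ (Perceptual Quantizer) curve standardised in SMPTE ST 2084 encodes absolute luminance up to 10,000 cd/m² - far beyond anything SDR can describe - into a normalised signal. A sketch of the inverse EOTF, using the constants from the standard:

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m2
    mapped to a normalised 0..1 signal value."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

# 100-nit SDR white sits only about halfway up the PQ signal range:
print(round(pq_encode(100), 3), round(pq_encode(10000), 3))   # ~0.508, 1.0
```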

Effective HDR does not require that every image be shot with cameras capable of exploring the limits of the distribution format. Even so, current broadcast practice might involve cameras using any of a variety of layouts, with very different sensor configurations.

Selecting Sensors

In a modern image sensor, each light-sensitive element is a photosite. A photosite measures a single light level, so a full-color pixel requires at least three photosites with (usually) red, green and blue filters. Those might be three photosites on one sensor, or one photosite on each of three sensors. The photosite converts photons into electrons, storing those electrons during the exposure.

The maximum number of electrons which can be stored is related to the physical size of the photosite and is often called the full-well capacity. That capacity is often in the tens or hundreds of thousands of electrons, and it determines the brightest light the photosite can record and therefore the upper limit of dynamic range (larger photosites are also more sensitive, since photons are simply more likely to land on them).

Random noise generally sets the lower limit of dynamic range, and is related to the accuracy with which the sensor can count the electrons stored on each photosite. Random variation in that count creates noise which is usually well understood, often amounting to just a few electrons. The black level is chosen to ensure that noise is minimally visible in the final image.

Other things affect photosite size. CMOS sensors include a lot of built-in features, such as on-board analogue-to-digital conversion, which means they must accommodate processing electronics as well as photosites. This is useful - in fact, it makes today’s high-performance sensors practical - but it does mean that some of the area which could have been used for photosites is occupied by other circuitry.

There are many mitigations for these issues. One example involves microlenses, where a layer of tiny lenses is positioned in front of the photosites to focus light from a larger area into the small photosite. Fewer photons fall on non-photosite regions of the sensor and go undetected, enhancing sensitivity. However, microlenses can affect the ability of the sensor to detect light which approaches from an angle, which may require specific behaviour (image-space telecentricity) from lenses.

Ultimately the difference between black level and full-well capacity defines the dynamic range, making two things clear. First, dynamic range and sensitivity are influenced by the designer’s view of how much noise is acceptable. Second, all else being equal, a physically larger sensor will have some combination of higher sensitivity, improved dynamic range, or lower noise. The balance of this compromise is a matter of engineering opinion.
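Expressed in stops, that range is just the base-2 logarithm of the ratio between full-well capacity and the noise floor. A back-of-envelope sketch, with electron counts chosen purely for illustration:

```python
from math import log2

def dynamic_range_stops(full_well_electrons: float,
                        noise_floor_electrons: float) -> float:
    """Dynamic range as the ratio of the brightest measurable signal
    (full-well capacity) to the noise floor, expressed in stops."""
    return log2(full_well_electrons / noise_floor_electrons)

# Illustrative figures only: a 30,000-electron well and 3 electrons of noise.
print(round(dynamic_range_stops(30_000, 3), 1))   # ~13.3 stops
```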

The consequence of all this is that low-level sensor design trades off noise, sensitivity and dynamic range in return for other benefits. Those benefits vary from design to design, but the overall result is that high-resolution sensors may need to be physically larger to maintain a given dynamic range.

Sensor Layouts

Specifying familiar, broadcast-style equipment is likely to make the operator’s life easier on live productions, but lenses are built differently for different types of sensor layout, which vary in their intended use and mode of operation.

Sensor layouts are determined by the fact that any camera needs to record at least three color-filtered images to generate a full color image. Single-sensor cameras must include alternating patterns of different color filters - usually but not exclusively red, green and blue - and recover a color image mathematically. This is now an advanced art, although it has been argued that a single sensor camera cannot resolve fine details in patterns of bright color as well as an otherwise equivalent three-sensor camera, which avoids the need to calculate red, green and blue images from a single sensor.
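To illustrate what that mathematics involves, here is a deliberately naive bilinear reconstruction of an RGGB Bayer mosaic in Python; production demosaicing algorithms are far more sophisticated and edge-aware, but the underlying task is the same:

```python
import numpy as np
from scipy.signal import convolve2d

def naive_demosaic_rggb(mosaic: np.ndarray) -> np.ndarray:
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W) into H x W x 3 RGB.
    Each output channel keeps the photosites that actually carry that
    colour filter and estimates the rest from their neighbours."""
    h, w = mosaic.shape
    red = np.zeros((h, w)); red[0::2, 0::2] = 1     # R sites
    blue = np.zeros((h, w)); blue[1::2, 1::2] = 1   # B sites
    green = 1 - red - blue                          # G sites (twice as many)
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    rgb = np.zeros((h, w, 3))
    for i, mask in enumerate((red, green, blue)):
        # Normalised convolution: average the known samples of this colour
        # falling under the kernel at each position.
        rgb[..., i] = (convolve2d(mosaic * mask, kernel, mode="same")
                       / convolve2d(mask, kernel, mode="same"))
    return rgb
```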

Crucially, three-sensor cameras are likely to use smaller sensors. Compromises remain: the familiar ⅔” sensor size for broadcast cameras does not refer to a sensor which is actually two-thirds of an inch in any dimension. The notation is inherited from obsolete vacuum-tube technology, in which a tube of ⅔” diameter yielded a usable image area of about 9.6 by 5.4mm. That’s small compared to Super-35mm cinema sensors at about 24.9 by 18.7mm; the three ⅔” sensors’ total area is still only around a third that of a Super-35mm sensor.
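The arithmetic behind that comparison is straightforward enough to verify directly:

```python
# Sensor dimensions from the text, in millimetres.
two_thirds_area = 9.6 * 5.4     # one 2/3-inch 16:9 sensor: ~51.8 mm^2
super_35_area = 24.9 * 18.7     # Super-35mm full aperture: ~465.6 mm^2

# Three 2/3" sensors together cover about a third of a Super-35mm sensor.
print(round(3 * two_thirds_area / super_35_area, 2))   # ~0.33
```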

Lenses & Dynamic Range

The benefit of a three-sensor design is that the small effective sensor size eases lens design, while the comparatively large total sensor area improves photographic performance. The caveats include a more complicated and expensive camera, and some additional complexity in the lens so it can consistently form an image on three separate sensors. Even so, broadcast-style lenses for ⅔” cameras are generally smaller and less expensive for comparable f-number and field of view.

Cinema cameras repurposed for live work, with their larger sensors, consequently require longer focal lengths for the same framing, reducing depth of field and making the operator work harder to keep things sharp. Mostly, though, outside of special situations such as 8K or the creative desire for a high-gloss half-time show, lenses and cameras will be conventional and there will be very little extra operator workload.
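To put rough numbers on that: matching the field of view of a 25mm lens on a ⅔” sensor requires roughly a 65mm lens on Super-35mm, and at the same f-number the depth of field shrinks by roughly the crop ratio. The sketch below uses the standard hyperfocal approximation, with the circle of confusion simply scaled to sensor width - an assumption made so the two formats compare on equal terms, not a standardised value:

```python
def total_dof_m(focal_mm, f_number, subject_m, sensor_width_mm):
    """Approximate total depth of field (metres) via the hyperfocal distance.
    Circle of confusion is assumed to be sensor_width / 1500, purely so
    the two formats are judged by the same sharpness criterion."""
    coc = sensor_width_mm / 1500.0
    s = subject_m * 1000.0                              # work in millimetres
    hyperfocal = focal_mm**2 / (f_number * coc) + focal_mm
    near = s * (hyperfocal - focal_mm) / (hyperfocal + s - 2 * focal_mm)
    far = (s * (hyperfocal - focal_mm) / (hyperfocal - s)
           if s < hyperfocal else float("inf"))
    return (far - near) / 1000.0                        # back to metres

# Same framing at f/4, subject at 5 m: 25mm on 2/3" vs ~65mm on Super-35mm.
print(round(total_dof_m(25.0, 4.0, 5.0, 9.6), 2))               # ~2.13 m
print(round(total_dof_m(25.0 * 24.9 / 9.6, 4.0, 5.0, 24.9), 2)) # ~0.78 m
```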

It remains true that HDR may highlight variations in lens performance. Flares which might usually appear bright can become unpleasantly dazzling, and chromatic aberration - colored fringing around very high-contrast edges - may be brighter and therefore more objectionable.

Finally, lenses themselves inevitably have limited dynamic range, at least within restricted regions of the image. All lenses have at least some diffusing effect, creating glow. Because of this, some dynamic range test charts place smaller test patches at the very brightest end of the range, to minimise the degree to which a large, bright area might contaminate the rest of the image with glow. In principle, a lens might not be capable of forming an image of a very dark shadow near a very bright highlight, because glow from the highlight will raise the apparent brightness of the shadow - a de facto limit on local dynamic range.
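A quick illustration of that limit: if a lens scatters even a small fraction of a highlight’s light across the frame as veiling glare, nearby shadows cannot sit more than a fixed number of stops below that highlight, whatever the sensor itself can resolve. The glare fraction below is assumed purely for illustration:

```python
from math import log2

def local_dr_limit_stops(glare_fraction: float) -> float:
    """If a fraction of a highlight's light is scattered as a uniform veil,
    no nearby shadow can fall more than log2(1/fraction) stops below the
    highlight, regardless of the sensor's own dynamic range."""
    return log2(1.0 / glare_fraction)

# Assumed 0.5% veiling glare limits local contrast to about 7.6 stops.
print(round(local_dr_limit_stops(0.005), 1))
```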

These considerations rarely lead to serious issues, and most of them are creative rather than technical. Most cameras are now good enough for most applications, although unusual or specialist equipment (high-speed or high-sensitivity cameras, miniature cameras, ultra-long, short or probe lenses, underwater shooting, etc) is best evaluated carefully before being adopted into an HDR workflow. Performing those tests, and making that workflow practical for day-to-day broadcast operations, will require operators and vision engineers to adopt some new ideas and tools.
