There Is A Need For Realtime Lens Distortion And Perspective Correction

Small broadcast-quality HD cameras with wide-angle or fisheye lenses have become increasingly prevalent. They are used in live broadcast applications such as sport, ENG, wildlife programming, and reality TV, and in non-broadcast fields including medical imaging, forensics/security, and online gaming.

However, the “fisheye” effect associated with these extreme lenses is not well received by viewers or programme makers.

The lens distortion and curvilinear perspective errors arising from these camera lenses, as well as from the physical camera positioning, can produce results that are considered undesirable and require correction.

How can these errors be corrected? For live image feeds, offline software rendering is not practicable, and any correction process must be easy to use in a real-time environment.

Software? Hardware?

Software processes require rendering of images, an offline process, so such solutions do not work in real time with live streams, which is crucial for sport applications, for example. For forensic/security and reality TV applications that do not require real-time output, software-based processes may still be useful.

(Left) Uncorrected live feed showing classic lens distortion and curvilinear perspective artefacts, and (Right) corrected live feed with those artefacts removed.

For real-time use, a hardware solution is more appropriate. A hardware-based sub-pixel geometry engine corrects the lens distortions in real time and, most importantly, adds no visible artefacts to the corrected output. This geometry engine concept allows correction of lens aberrations across the range of operational HD resolutions, making it useful for a broad range of media applications.
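The core idea behind a sub-pixel geometry engine is inverse mapping: each corrected output pixel is traced back to a fractional coordinate in the distorted source frame, and the value there is interpolated from neighbouring pixels. As a minimal sketch (the function name and bilinear scheme are illustrative assumptions, not the unit's actual implementation, which in hardware would typically use higher-order filtering):

```python
import numpy as np

def sample_bilinear(img, x, y):
    """Sample a single-channel image at a fractional (sub-pixel)
    coordinate by bilinearly blending the four nearest pixels."""
    h, w = img.shape
    x0 = int(np.clip(np.floor(x), 0, w - 2))
    y0 = int(np.clip(np.floor(y), 0, h - 2))
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x0 + 1]
    bot = (1 - fx) * img[y0 + 1, x0] + fx * img[y0 + 1, x0 + 1]
    return (1 - fy) * top + fy * bot
```

Sampling between pixels rather than snapping to the nearest one is what avoids the stair-stepping artefacts that would otherwise appear on corrected diagonals and curves.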

The technology does not require co-location with the camera source. The unit can be located remotely via industry-standard 3G coaxial cable, subject to commensurate distance limits, and optionally supports fibre connections to distances of 1,000 metres and beyond. The unit also has a Genlock input, removing the need for external Genlock/synchroniser hardware. Setup is done via a USB connection to a Mac or Windows platform.

System overview of the lens correction Geometry Engine – graphic courtesy AlphaEye.tv

The image correction parameters include zoom, rotation, X- and Y-axis tilt, and offset, as well as adjustment of the strength of ‘barrel’ distortion correction applied to the image.
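To illustrate how such parameters might combine into a single correction mapping, the sketch below builds an inverse map for zoom, rotation, offset and a one-term radial barrel model. This is an assumption-laden simplification: the parameter names are hypothetical, the X/Y tilt (a perspective warp) is omitted for brevity, and real hardware would likely use a multi-term lens model.

```python
import numpy as np

def correction_map(width, height, zoom=1.0, rotate_deg=0.0,
                   offset=(0.0, 0.0), barrel_k=0.0):
    """For every output pixel, compute the (x, y) source coordinate
    to sample in the distorted input frame. barrel_k > 0 undoes
    barrel distortion via r_src = r * (1 + barrel_k * r^2)."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    # Centre the coordinates, then apply zoom and offset (normalised units).
    u = (xs - cx) / cx / zoom - offset[0]
    v = (ys - cy) / cy / zoom - offset[1]
    # Rotate about the image centre.
    a = np.radians(rotate_deg)
    ur = u * np.cos(a) - v * np.sin(a)
    vr = u * np.sin(a) + v * np.cos(a)
    # One-term radial (barrel) model on the normalised radius.
    scale = 1.0 + barrel_k * (ur * ur + vr * vr)
    return ur * scale * cx + cx, vr * scale * cy + cy
```

With all parameters at their defaults the map is the identity, so each output pixel samples its own position; increasing `barrel_k` pushes samples outward towards the frame edges, which is what flattens the characteristic barrel curvature.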
