Understanding Quality Control for File-Based Video Workflows
Quality control (QC) is an important process consideration at many points in file-based workflows, and one that can have a tangible impact on the business if neglected or improperly implemented. Defects that can be seen or heard by the end viewer, such as missed or faulty commercial spots, can result in lost advertising revenue, and poor or spotty picture quality can damage the brand and ultimately lead to subscriber loss. Quality or delivery-compliance issues caught prior to broadcast can still add cost if the program material must be rejected and sent back to the content provider for rework.
Typically, the earlier issues are found, the lower the cost to resolve them. Therefore, the goal for any broadcaster is to make quality control in file-based workflows as streamlined and efficient as possible while also identifying problems as early as possible.
Human QC operators – the traditional method for video quality control – are skilled at finding many kinds of visible and audible defects through inspection during playback. However, this approach does not scale to the large number of files and formats typically found in a modern workflow, especially when adaptive streaming video packages are considered. Operators barely have time to spot-check files, viewing only a few minutes at the start, middle, and end of a program. Human inspection is also inherently subjective: the threshold for rejecting bad content varies with the judgment of the individual operator.
A blend of automated QC software and manual review mitigates many of these problems. QC software can decode and check each frame of video faster than normal playback speed, and automated QC can run continuously 24 hours a day, allowing larger volumes of content to be checked. A software solution also produces consistent, objective results and easily detects many types of non-visible errors, such as metadata errors. The result is that QC operators can spend more time fixing problems than trying to find them.
Delivery specifications
A delivery specification is a set of requirements for the transfer of media content from a production company to the broadcast network or streaming media provider. It includes a strict list of acceptable formats and technical attributes such as frame rate or picture size for the content, and often a description of minimum acceptable quality criteria. Delivery specifications are usually defined by the receiver, and it is the responsibility of the provider to meet acceptance criteria. It is important for the provider to check for compliance before delivery to avoid rejected material.
The Netflix Full Specifications and Operators Manual (available at https://backlothelp.netflix.com) is one example of a delivery specification. It includes separate specifications for SD, HD and UHD/4K formats. For HD, three options are acceptable: MPEG-2 video (80 Mbps, I-frame only) in an MPEG-TS container, ProRes 422 HQ video in a QuickTime (MOV) container, or JPEG2000 video in an IMF Application 2 Extended container. The ProRes option is useful for many content providers because it conforms exactly to Apple’s iTunes package format, so the same package can be used with both companies.
In the United Kingdom, the Digital Production Partnership (DPP) is an organization whose membership includes media companies representing the entire workflow from production to broadcasters. The DPP has published a common delivery specification that is mandatory for all content delivered since October 1, 2014. The DPP’s Programme Delivery Standard is based on AMWA’s AS-11, but also extends the specification to include mandatory technical and editorial metadata objects that must be present in the MXF file. DPP also requires compliance to several quality standards, such as EBU R128 for audio loudness and Ofcom 2009 guidelines for photosensitive epilepsy (PSE).
The success of the DPP has led to the development of common delivery standards for other countries. Variations of the original AS-11 specification are intended to be used in Australia and New Zealand (AS-11 X2), the Nordic countries (AS-11 X3 and AS-11 X4), and in the United States and Canada (AS-11 X8 and AS-11 X9). The North American Broadcasters Association (NABA) has collaborated with the DPP to develop the X8 and X9 versions. Broadcasters in Germany have adopted ARD_ZDF_HFD encoding profiles (XDCAM HD or AVC Intra video in an MXF container). In France, the “Prêt à Diffuser” (PAD) specification (based on AS-10) is in use by that nation’s broadcasters and post-production companies.
Delivery specifications have gone a long way toward improving the quality and consistency of incoming file-based content, but in and of themselves are no substitute for rigorous QC workflows.
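In an automated workflow, the technical portion of a delivery specification is usually captured as a machine-checkable profile that incoming files are tested against. The sketch below is a minimal, purely illustrative example; the attribute names and permitted values are hypothetical rather than taken from any of the published specifications mentioned above.

```python
# Hypothetical sketch: part of a delivery specification expressed as
# machine-checkable rules. Attribute names and permitted values are
# illustrative only, not copied from any published specification.
HD_DELIVERY_PROFILE = {
    "container": {"MXF"},
    "video_codec": {"AVC-Intra 100", "XDCAM HD 50"},
    "frame_rate": {25.0},
    "picture_size": {(1920, 1080)},
}

def check_attributes(file_attrs: dict, profile: dict) -> list:
    """Return human-readable failures for simple set-membership rules."""
    failures = []
    for key, allowed in profile.items():
        if file_attrs.get(key) not in allowed:
            failures.append(f"{key}: {file_attrs.get(key)!r} is not an acceptable value")
    return failures
```

Quality criteria such as loudness targets or PSE limits require measurement of the decoded essence rather than a simple attribute comparison, which is where the checks described in the next section come in.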
Typical file-based errors
Several different types of errors can exist in video files. Many error types are visible or audible to the viewer, but a bad metadata value, for example, would only be detected by a software tool that decoded the value. Some QC tests have a clear pass or fail definition: is the container type acceptable, or is the frame rate correct? Other QC tests are subjective. For instance, how many visible compression artifacts are tolerable before picture quality is deemed unacceptable is a judgment that varies from viewer to viewer.
The simplest types of errors that can be detected by QC software relate to the attributes of the file, such as formats and metadata values. Some checks are immediate, such as verifying the video codec and its profile and level. If a network’s specification mandates H.264 video with High Profile @ Level 4.1 and the received file is a different format, it can be rejected immediately. Other QC checks may require measurements to be made. If the play duration of an ad spot must be 30 seconds (perhaps with a tolerance of ± 0.1 seconds), the number of frames in the clip must be counted. The QC system can also verify that measured values and attributes match the corresponding metadata values encoded with the file. Playout issues might arise if the video frame rate is actually 23.976 fps but the header metadata claims it is 29.97 fps.
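As a sketch of the measurement-based checks just described, the functions below test an ad-spot duration against a tolerance and compare a measured frame rate against the header metadata. The input values are assumed to have been extracted by an earlier probing or decoding step.

```python
# Sketch of two measurement-based checks. The inputs (frame count, measured
# and header frame rates) are assumed to come from an earlier probe/decode step.
def check_ad_duration(frame_count: int, frame_rate: float,
                      target_s: float = 30.0, tolerance_s: float = 0.1) -> bool:
    """Pass if the measured play duration is within tolerance of the target."""
    duration_s = frame_count / frame_rate
    return abs(duration_s - target_s) <= tolerance_s

def frame_rate_matches_header(measured_fps: float, header_fps: float,
                              eps: float = 0.01) -> bool:
    """Fail when the measured frame rate contradicts the header metadata,
    e.g. essence that is 23.976 fps but signalled as 29.97 fps."""
    return abs(measured_fps - header_fps) <= eps

print(check_ad_duration(frame_count=899, frame_rate=29.97))   # True (~29.997 s)
print(frame_rate_matches_header(23.976, 29.97))               # False: mismatch
```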
Errors in how a file is encoded are best detected by a complete decode. At the container level, structural errors will often prevent the file from playing properly. Syntax errors in the video or audio tracks will usually result in visible or audible defects such as those shown in Figure 1, although set-top boxes and player applications typically attempt to conceal these errors. The decoder in a QC application reports syntax errors instead of attempting to hide them. Correctly encoded video can still have poor picture quality: if the bit rate is too low, compression artifacts such as macroblocking (blocky edges) and quantization (banding) become visible.
Figure 1. Syntax (decode) errors in a file can result in visible artifacts.
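A dedicated QC product performs this full decode internally, but the same idea can be illustrated with a general-purpose tool. The sketch below, which assumes the FFmpeg command-line tool is installed, forces a complete decode and collects any decoder error messages rather than concealing them.

```python
# Sketch: force a full decode with FFmpeg and collect decoder error messages
# instead of concealing them. Assumes the ffmpeg binary is on the PATH; a
# dedicated QC product would report the same class of errors in a structured way.
import subprocess

def decode_errors(path: str) -> list:
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path, "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # With "-v error", stderr carries only demuxer/decoder error messages.
    return [line for line in result.stderr.splitlines() if line.strip()]

errors = decode_errors("incoming/program.mxf")
print(f"{len(errors)} decode error(s) reported")
```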
Baseband errors can be detected in the video and audio essence in the file by decoding each frame and applying specific algorithms to the image and audio data. Video and audio dropouts appear as solid black frames and low-level audio data respectively. Ingest errors have signature patterns that can be detected by QC software. Out-of-range video levels (gamut errors) or audio levels (loudness errors) can be detected by measuring the post-decode pixel and audio sample data respectively.
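The sketch below illustrates the kind of frame-level measurements involved. It assumes decoded video frames are available as 8-bit luma arrays and audio as floating-point sample blocks (for example from a decoding library); the thresholds are illustrative, not values from any standard.

```python
# Sketch of simple baseband checks on already-decoded essence. Assumes video
# frames arrive as 8-bit luma arrays and audio as floating-point sample blocks
# (e.g. from a decoding library); the thresholds are illustrative only.
import numpy as np

def is_black_frame(luma: np.ndarray, max_mean: float = 20.0) -> bool:
    """A video dropout often shows up as an essentially black frame."""
    return float(luma.mean()) <= max_mean

def is_silent(samples: np.ndarray, threshold_dbfs: float = -60.0) -> bool:
    """An audio dropout shows up as a near-silent block of samples."""
    rms = float(np.sqrt(np.mean(np.square(samples)))) + 1e-12
    return 20.0 * np.log10(rms) <= threshold_dbfs

def out_of_range_pixels(luma: np.ndarray, lo: int = 16, hi: int = 235) -> int:
    """Count pixels outside nominal 8-bit video levels (a simple gamut-style check)."""
    return int(np.count_nonzero((luma < lo) | (luma > hi)))
```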
Workflow automation
File-based workflow processes such as transcode and QC can be automated using a folder-based system or in conjunction with an asset management system. Folder-based automation, such as that shown in Figure 2, is used in many workflows, including large workflows that process thousands of files daily. Each software application is configured to monitor “watch folders” on the media server by making periodic directory listings. When new files appear in the watch folder, they are added to the list of jobs. When a job has finished, the file can be left in place or moved, depending on how the workflow is constructed. For example, a QC system might have a different output folder for files that pass and files that fail the QC check. The “quarantine” folder for failed files is managed manually by the QC operator, who fixes (and resubmits) or rejects each file as appropriate. The output folder for files that pass QC may in turn be the input watch folder for the next application in the workflow (e.g. transcode).
Figure 2. Folders on the media server can be “watched” by the automated QC system. New files are tested automatically when they appear in the folder. The files can be moved to different output folders based on the QC results.
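A minimal watch-folder loop might look like the sketch below. The folder names and the qc_check() function are hypothetical placeholders for a real QC engine's interface; a production system would also confirm that each file transfer is complete before testing it.

```python
# Sketch of a folder-based automation loop. The folder names and qc_check()
# are hypothetical placeholders for a real QC engine's interface.
import shutil
import time
from pathlib import Path

WATCH, PASSED, QUARANTINE = Path("watch"), Path("passed"), Path("quarantine")
for d in (WATCH, PASSED, QUARANTINE):
    d.mkdir(exist_ok=True)

def qc_check(path: Path) -> bool:
    """Placeholder for the real QC engine call; True means the file passed."""
    return True

while True:
    for f in sorted(WATCH.glob("*.mxf")):        # periodic directory listing
        dest = PASSED if qc_check(f) else QUARANTINE
        shutil.move(str(f), str(dest / f.name))  # route the file by QC result
    time.sleep(10)                               # poll interval in seconds
```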
Asset management systems typically use a direct control interface to each workflow application for automation. These applications are integrated using plug-in software and application programming interfaces (APIs), usually built on a web services architecture. As shown in Figure 3, a media asset management (MAM) application initiates a message exchange by sending a request such as “create job” or “get job status.” The server side replies with the appropriate response message. Web services protocols make it easy to create requests and parse reply messages in software.
With both types of automation, notification and reporting mechanisms are required. For a QC system, the operator is often notified by an email message when a file fails the QC check. That email message might carry the QC report as an attachment, making it easy to determine whether manual review is required.
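A simple notification of this kind could be sent with Python's standard email and smtplib modules, as sketched below; the mail server, addresses, and report path are placeholders.

```python
# Sketch of a failure notification with the QC report attached. The mail
# server, addresses, and report path are placeholders.
import smtplib
from email.message import EmailMessage
from pathlib import Path

def notify_failure(clip_name: str, report_path: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = f"QC FAILED: {clip_name}"
    msg["From"] = "qc-system@example.com"
    msg["To"] = "qc-operator@example.com"
    msg.set_content(f"{clip_name} failed automated QC. Report attached for review.")
    report = Path(report_path)
    msg.add_attachment(report.read_bytes(), maintype="application",
                       subtype="pdf", filename=report.name)
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)
```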
Figure 3. Using the API of a QC system, automation client software can programmatically start new QC jobs, query job status, and collect the results at job completion.
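The message exchange in Figure 3 could be driven by a small client like the sketch below. It assumes a hypothetical REST-style QC service and uses the third-party requests library; the endpoint paths and JSON fields are illustrative, not any particular product's API.

```python
# Sketch of API-driven automation against a hypothetical REST-style QC service.
# The endpoint paths and JSON fields are illustrative, not a real product's API.
import time
import requests

QC_SERVER = "http://qc-server.example.com/api"

def run_qc_job(file_path: str, template: str) -> dict:
    # "Create job": submit the file and the QC test template to apply.
    job = requests.post(f"{QC_SERVER}/jobs",
                        json={"file": file_path, "template": template}).json()
    job_id = job["id"]

    # "Get job status": poll until the job completes, then return the results.
    while True:
        status = requests.get(f"{QC_SERVER}/jobs/{job_id}").json()
        if status["state"] in ("passed", "failed", "error"):
            return status
        time.sleep(5)

result = run_qc_job("/media/incoming/program.mxf", template="HD-delivery")
print(result["state"])
```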
Quality control is a key component of any file-based workflow. QC software can assist the human operator by finding both visible defects and those hidden within files. Compliance with a broadcaster’s delivery specification can be confirmed prior to delivery, avoiding rejected content. And automated QC can scale in capacity to meet the growing volumes of content that must be tested.