Musings of a Consultant - Workflow in Modern Media

This article is the first in a series that will continue for some time. I hope to explore aspects of workflow for modern media and bring some clarity to terms, trends, pitfalls, and successes. I am also hoping that as we explore this together you will feed back questions and comments that can steer future postings in the series toward what is most useful to you.

We all have opinions about what constitutes workflow for moving media. The topic itself is not new, only our focus on it as a discipline. Moving-media workflow can certainly be traced back to the early days of film. Over decades, the craft of film editing developed into a set of approaches for completing film content in ways that allowed collaboration between practitioners and the service agencies that provided the unique services a film editor needed, such as sound or film special effects work. By staying inside a repeatable workflow the results became repeatable, and of course less expensive. Clearly that is important. Part of the workflow was the set of devices needed to complete projects, such as the well-known Steenbeck flatbed editor used for decades in documentary, television, and motion picture work.

In electronic media, purely electronic production workflow can be traced to the early days of professional videotape, facilitated by Ampex’s invention of the quadruplex videotape format. While not file based in quite the sense we think of today, the analog electronic records on the videotape were more ‘file like’ than ‘film like’ in that they were only useful when reproduced and displayed. The same is quite true of files today: we work with electronic, digital versions of the content we are producing, but only at the time of playback does it become ‘television’.

The workflow that evolved in color television before digital files was based on tools that allowed linear content to be edited and effects to be performed in ways that were well understood. It involved control over the playback functions of videotape recorders, and manual operations of course. Take the workflow for the critically acclaimed program of the late ’60s and early ’70s, ‘Rowan and Martin’s Laugh-In’. Art Schneider, a highly accomplished editor who passed away in 2009, developed a workflow that allowed a program with sometimes hundreds of edits to be assembled weekly. He used an offline process based in film and then conformed the original videotape (2” quadruplex recordings) by physically cutting the tape with razor blades. Once completed, the program was ‘sweetened’ by adding a laugh track and other sound elements as appropriate. His approach was unique, and the production technique was awarded an Emmy in 1968, validating the importance of the ‘jump cut’ edit as well as the workflow, which facilitated sometimes as many as 500 edits in a single hour. Art had a production concept that could not be realized by any existing workflow, so he invented one that worked for the production values he wanted. Perfect: form following function.

The Smith 2-inch quad videotape splicer. Image courtesy of the Early Television Foundation and Museum and Steve McVoy.

As digital technology entered the industry we had to adapt workflow to yet more new tools. Interestingly, the workflow developed for analog video recording was simply modified for digital recording. The first digital recorders were very nearly recording files on tape; only the lack of headers, footers, and other file structures kept the record on the digital tape from being interpreted as a file (it was a digital stream, a subject for a future article…). We initially interfaced to the digital recorders with analog connections, later replaced with digital interfaces as other digital production devices such as switchers and graphics systems forced the development and standardization of fully digital studio systems. Even then, much of post production was done with techniques that simply used digital versions of analog tools, with digital interfaces, to complete essentially the same linear workflow. The source and output were digital, but the workflow was linear and additive, with no ability to ‘undo’ once a process had been completed.

The first major disruption of this long-standing pattern of workflow came with the implementation of islands of non-linear, computer-based editing tools. Those first primitive file-based tools fundamentally changed how we look at production and, in the context of this series, how workflow is crafted and completed. It was different in the following ways.


First, it was neither additive nor destructive. Until ‘flattened and rendered’, editing changes could still be made, including the addition or removal of graphics and special effects. To a degree this was possible in a well-crafted linear workflow, so long as all of the production elements were left unmodified and used as input sources in a combining stage, usually in a sophisticated editing room with a large production switcher, an audio console, and of course multiple people. In the previous workflow that often meant the content had to be dubbed (copied), inexorably lowering the quality with each successive generation away from the original material. The major difference with files is that the next generation is the same quality as the previous one, so the flattened master preserves the quality of the original recording.

Second, the ability to ‘undo’ in a non-destructive way meant that many production decisions could be explored and then abandoned. Though this was possible before in a ‘rehearsal or preview mode’, only a single edit could be previewed. With non-linear, file-based workflow it was possible to edit entire sections of a production and then choose to accept the result and move on, or try another approach, non-destructively. This changed the game in profound ways. Some say it sped up production; others said it slowed it down because so many options could be explored. In my own experience as an editor it was still up to the editor to recommend what was appropriate and then demonstrate the wisdom of experience. And sometimes that actually worked!
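To make that difference concrete, here is a minimal conceptual sketch, in Python, of an edit decision list that never touches the source media: every edit is simply a reference into an unmodified clip, any decision can be undone, and a new generation is only produced at the final flatten/render step. This is purely illustrative and is not the data model of any actual editing product.

```python
# Conceptual sketch of non-destructive editing: edits are kept as a decision
# list referencing unmodified source clips, so any step can be undone, and a
# new "generation" is only created at the final flatten/render.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Edit:
    """One decision: take a span of a source clip and place it on the timeline."""
    source_clip: str   # name of an untouched source file
    in_point: float    # seconds into the source
    out_point: float   # seconds into the source


@dataclass
class Timeline:
    edits: List[Edit] = field(default_factory=list)

    def add(self, edit: Edit) -> None:
        # Adding an edit never modifies source media, only the decision list.
        self.edits.append(edit)

    def undo(self) -> None:
        # Non-destructive: dropping the last decision restores the prior state
        # exactly, because the sources were never altered.
        if self.edits:
            self.edits.pop()

    def flatten(self) -> str:
        # Stand-in for the render step: only here is a new generation made,
        # and with files that generation matches the original quality.
        return " + ".join(
            f"{e.source_clip}[{e.in_point}-{e.out_point}]" for e in self.edits
        )


if __name__ == "__main__":
    tl = Timeline()
    tl.add(Edit("interview_A", 12.0, 18.5))
    tl.add(Edit("cutaway_B", 3.0, 5.0))
    tl.undo()              # try an idea, then abandon it at no cost
    print(tl.flatten())    # interview_A[12.0-18.5]
```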

There was one aspect of this change that cannot be missed. Until field and studio production moved to file-based acquisition, which came considerably after the editing tools created by CMX, Avid, ImMix, Lucasfilm, Montage, and others, the weak link was the movement of content from acquisition to post production. Content had to be ingested, which meant real-time copying of the content into the computer's file-based environment before editing could begin. And of course it also had to be rendered and ‘printed’ to linear media such as tape for delivery and consumption.

EditCAM recorded on removable hard drives, with files formatted to interchange directly with Avid Media Composer. Photo courtesy of Ikegami.

Once Ikegami and Avid created the EditCAM in 1995, things changed forever. The camera recorded files that could be copied into the edit system considerably faster than real time. Things would never go back to linear I/O again once this cat was out of the bag.
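As a rough back-of-the-envelope illustration of what ‘considerably faster than real time’ means, the short calculation below compares playing a tape back at 1x against copying a file. The recording bitrate and copy speed are assumed round numbers chosen only for the arithmetic; they are not figures from the article or from EditCAM specifications.

```python
# Illustrative only: assumed round numbers, not EditCAM specifications.
recording_bitrate_mbps = 25    # assumed camera recording rate, megabits/s
copy_speed_mbps = 400          # assumed sustained file-copy rate, megabits/s
program_minutes = 30           # length of the recorded material

realtime_ingest_minutes = program_minutes   # tape plays back at 1x
file_copy_minutes = program_minutes * recording_bitrate_mbps / copy_speed_mbps

print(f"Real-time (tape) ingest: {realtime_ingest_minutes:.1f} minutes")
print(f"File copy:               {file_copy_minutes:.1f} minutes "
      f"({copy_speed_mbps / recording_bitrate_mbps:.0f}x faster than real time)")
```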

To be continued soon… Next: “What is the role of metadata in production?”
