On-line, near-line or archive. Just how should data be stored?
Shown is a Spectra Logic T-Finity tape archive system.
Any discussion of media storage relies on four generic terms: on-line, near-line, archive and off-line. What storage technology is best suited for each task?
Let’s set some general definitions for our discussion of data storage. On-line means the information is immediately usable for any required purpose. The same data could be on-line for transcoding while at the same time being off-line for editing. In other words, storage terms like “on-line” cannot be used meaningfully unless an application is also specified.
Off-line has occupied a range of under-defined usage scenarios for the last 10 to 20 years. For me it has always meant "Damn it, I have to wait for something to happen before I can do what I really wanted to do." Obviously, interruptions in a workflow severe enough that KPIs are not met require the media to be less "off-line".
Types of storage
Why do we need a third definition of storage usage scenarios? If something is "on-shelf", is it not simply further off-line? I would like to propose that we define the difference in terms of human interaction.
Off-line can become on-line as an automated process. The cheapest, and in some sense the most secure, storage is on the shelf, preferably kept in two geographically separated, disaster-protected locations. Deciding what to put where becomes easier once we know that all three kinds of storage are available. Just to be clear, these distinctions still have value even when provisioning from the cloud.
Localized storage typically consists of multiple hard drives installed in rack mounts of various sizes and configurations. Shown here is a Facilis TX16 storage system.
Today’s storage systems are virtually all disk based. Solid-state drives are available, and RAM is used along the way, but both provide insufficient capacity and are more expensive than rotating disk, especially for media projects. So what are the key differences between types of storage, and how do we judge their performance?
The criteria for data storage are Permanence, Availability, Scalability and Security. Made into an acronym, we get P.A.S.S. (see the sketch after this list).
- Permanence means that data is never lost.
- Availability means that the user/application requirements for access/performance are met.
- Scalability defines the ease of meeting changing requirements.
- Security defines the granularity and durability of access privileges.
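As a rough illustration of how the P.A.S.S. criteria might be used to compare storage tiers, here is a minimal Python sketch. The tier names, 0-10 scores and requirement profiles are purely hypothetical assumptions for the example, not figures from this article.

```python
from dataclasses import dataclass

@dataclass
class PassProfile:
    """Hypothetical 0-10 scores for the four P.A.S.S. criteria."""
    permanence: int    # protection against data loss
    availability: int  # meeting access/performance requirements
    scalability: int   # ease of meeting changing capacity requirements
    security: int      # granularity and durability of access privileges

# Illustrative, made-up profiles for three generic tiers.
TIERS = {
    "online":   PassProfile(permanence=7, availability=10, scalability=6,  security=8),
    "nearline": PassProfile(permanence=8, availability=6,  scalability=8,  security=8),
    "shelf":    PassProfile(permanence=9, availability=2,  scalability=10, security=9),
}

def meets(profile: PassProfile, required: PassProfile) -> bool:
    """A tier is acceptable only if it meets or exceeds every requirement."""
    return all(
        getattr(profile, field) >= getattr(required, field)
        for field in ("permanence", "availability", "scalability", "security")
    )

# An application that needs near-instant access but only modest capacity growth:
grading_needs = PassProfile(permanence=7, availability=9, scalability=5, security=7)
print([name for name, p in TIERS.items() if meets(p, grading_needs)])  # -> ['online']
```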
Providing every application with the best technologically available P.A.S.S. is going to be prohibitively expensive. Therefore we need to match the storage to the application, which means trading storage performance and availability against cost. But that, in turn, means we have to maintain multiple types of storage.
Cloud storage gives us the advantage of bespoke storage without the additional overhead. Generic cloud storage enables economies of scale where multiple clients are served from a single enterprise storage system. But you still need to ask: will the cloud provider at some point off-load your deep archive and ship it to Iron Mountain? Or can you even afford to keep all of your projects on expensive, always-ready, on-line storage?
Let’s assume that we have determined what P.A.S.S. we need for each application used within our acquisition, post-production and distribution pipeline. How do we determine when to move the data to less expensive storage? Changing technology (lower prices) means that this decision is always open for reinterpretation. It all comes back to KPIs: if we can achieve the KPI while moving the data to less expensive storage, then do it!
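That rule of thumb can be sketched in a few lines of Python. The KPI used here (a maximum acceptable retrieval time) and the per-tier cost and retrieval figures are illustrative assumptions, not values from this article.

```python
# Illustrative cost ($/TB/month) and retrieval-time figures; real values vary widely.
TIER_COST = {"online": 50.0, "nearline": 17.0, "shelf": 0.5}
TIER_RETRIEVAL_HOURS = {"online": 0.0, "nearline": 2.0, "shelf": 48.0}

def cheapest_tier_meeting_kpi(max_retrieval_hours: float) -> str:
    """Pick the least expensive tier whose retrieval time still meets the KPI."""
    candidates = [
        tier for tier, hours in TIER_RETRIEVAL_HOURS.items()
        if hours <= max_retrieval_hours
    ]
    return min(candidates, key=lambda tier: TIER_COST[tier])

# A project that must be editable within the hour stays on-line;
# one that can wait a couple of days can move to the shelf.
print(cheapest_tier_meeting_kpi(1.0))   # -> "online"
print(cheapest_tier_meeting_kpi(72.0))  # -> "shelf"
```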
Match workflows to KPIs
An automated workflow should make data migration between types of storage a transparent background process. This works because the task requirements are anticipated and built into the system.
Grading systems arguably have the highest availability requirements in the production process, but does the storage used for this application have to be mirrored? Do you have to simultaneously store all current projects? Or can you live with working on just one project on-line at a time? Tradeoffs can be made to permit a sufficient level of availability, redundancy and capacity. After all, the redundancy required for the safe operation of a nuclear power plant may be excessive for the production of the nightly news!
Diagram illustrates a typical broadcast workflow using Isilon technology. A modern storage platform allows users to adapt available storage to project needs—all without the complexity of becoming a storage infrastructure expert.
Correctly designed workflow management systems acquire the necessary information to anticipate data access requirements and move the data to where it will be needed in a timely manner. This can even include an automatic order to get backups from Iron Mountain in advance, so that the material is in place when needed for post. Today’s workflow management solutions are sophisticated enough to anticipate most production storage needs and to retrieve and move the data where it is needed without human intervention.
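The anticipation logic can be imagined roughly as follows. The lead times and the scheduling structure are assumptions made for this sketch; real workflow management products expose this very differently.

```python
from datetime import datetime, timedelta

# Assumed lead times for getting material back on-line from each tier.
RESTORE_LEAD_TIME = {
    "nearline": timedelta(hours=4),  # automated tape-library restore
    "shelf": timedelta(days=2),      # manual retrieval, e.g. from an off-site vault
}

def restore_order_date(needed_on: datetime, tier: str) -> datetime:
    """Latest moment the restore must be triggered so material is on-line in time."""
    return needed_on - RESTORE_LEAD_TIME[tier]

# Example: a post session scheduled for 09:00 on 1 June needs shelf material
# ordered back no later than 09:00 on 30 May.
session = datetime(2025, 6, 1, 9, 0)
print(restore_order_date(session, "shelf"))  # -> 2025-05-30 09:00:00
```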
Storage costs continue to drop. But, that doesn’t mean you should chase them. Shown here is a Samsung 1TB SSD, which today costs about $400. A 1TB HD may cost less than $50.
Exact pricing for each storage option is a moving target; however, the relationship between the options should remain essentially the same.
Off-line storage costs are about one-third of on-line storage; this is without geographic replication. Using LTO-6 cassettes of roughly 3TB at 50 cents per tape per month makes the physical cost of archive storage about 1/100 of off-line costs. The latter comparison is, of course, unfair, as it does not include the cost of the tape itself or the additional cost of physical retrieval.
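Working those ratios through with round numbers gives a feel for the spread. Only the 1/3 and 1/100 relationships and the 50-cents-per-tape figure come from the text above; treating them as exact per-terabyte prices is an assumption for illustration.

```python
# Shelf (archive) cost from the figures above: a ~3TB cassette at $0.50/month.
shelf_per_tb_month = 0.50 / 3                      # ~ $0.17 per TB per month

# The text puts shelf storage at roughly 1/100 of off-line cost...
offline_per_tb_month = shelf_per_tb_month * 100    # ~ $17 per TB per month

# ...and off-line at roughly one-third of on-line cost.
online_per_tb_month = offline_per_tb_month * 3     # ~ $50 per TB per month

print(f"shelf:   ${shelf_per_tb_month:.2f}/TB/month")
print(f"offline: ${offline_per_tb_month:.2f}/TB/month")
print(f"online:  ${online_per_tb_month:.2f}/TB/month")
```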
However, the extreme discrepancy between automated retrieval taking hours and manual retrieval taking days leaves room for a new service offering 24-hour retrieval of on-shelf storage. When thinking about the viability of shelf storage, remember that Disney destroyed the 4K data used for the latest release of Snow White and only keeps the physical separations!