LTO Tape is the Star for Archival Data Storage
In an era when archival data demands ever-larger quantities of storage, it is ironic that the oldest, tried-and-true format, magnetic tape, is winning the competition against newer media.
The tape format of choice is Linear Tape-Open, or LTO, which was originally developed in the late 1990s as an open standards alternative to the proprietary magnetic tape formats that were available at the time. Hewlett Packard Enterprise, IBM and Quantum control the LTO Consortium, which directs development and manages licensing and certification of media and mechanism manufacturers.
The original version of LTO tape was released in 2000 and could hold 100 gigabytes of data in a cartridge. Now, we are up to the seventh generation of LTO, which was released in 2015 and can hold six terabytes in a cartridge of the same size.
LTO rapidly defined the data tape market segment and has consistently been the best-selling format for use with both small and large computer systems, especially for cold storage — or long-term archiving.
The irony is that half-inch (12.65 mm) magnetic tape has been used for data storage for more than 60 years. In the mid-1980s, IBM and DEC put this kind of tape into a single-reel, enclosed cartridge. Though the formats have changed over the years, the basic idea was a winner.
Another major development in LTO’s history is the Linear Tape File System (LTFS), a self-describing tape format and file system made possible by the media partitioning introduced with LTO-5. File data and filesystem metadata are stored in separate partitions on the tape.
The metadata, which uses a standard XML schema, is readable by any LTFS-aware system and can be modified separately from the data it describes. Since LTFS is an open standard, LTFS-formatted tapes are usable by a wide variety of computing systems.
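Because the LTFS index is plain XML, any system with an XML parser can read a tape's directory listing without touching the file data itself. The sketch below illustrates the idea with Python's standard library; the element names here are a simplified, hypothetical stand-in, not the actual LTFS index schema.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative sketch of the kind of metadata an LTFS
# index records (file names, sizes, timestamps). The element and
# attribute names are hypothetical, not the real LTFS schema.
INDEX_XML = """
<ltfsindex>
  <directory name="clips">
    <file name="promo_v2.mxf" length="1073741824" modifytime="2015-06-01T12:00:00Z"/>
    <file name="interview.mov" length="524288000" modifytime="2015-06-02T09:30:00Z"/>
  </directory>
</ltfsindex>
"""

def list_files(index_xml: str):
    """Return (path, size_in_bytes) pairs from the simplified index."""
    root = ET.fromstring(index_xml)
    entries = []
    for directory in root.iter("directory"):
        for f in directory.iter("file"):
            path = f"{directory.get('name')}/{f.get('name')}"
            entries.append((path, int(f.get("length"))))
    return entries
```

A real LTFS-aware system reads the actual index partition from tape, but the principle is the same: the catalog can be enumerated, searched and updated independently of the bulk data.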
LTO technology competes in the market against other tape technologies and hard disk technology. In the course of its existence, the format has succeeded on both fronts — almost completely displacing all other mid-range and low-end tape technologies and preventing, or at least delaying, the predicted “death of tape” at the hands of newer media.
Part of the reason for LTO’s continuing success comes from innovations by companies like Spectra Logic, a developer of deep storage solutions that solve the problem of archival storage for clients dealing with exponential data growth.
The company was founded in 1979 in Boulder, Colorado, by Nathan C. Thompson, who gathered his last $500 to start buying and reselling used computer equipment from his dorm room while a student at the University of Colorado. That venture became Spectra Logic, and Thompson remains its CEO today.
Spectra Logic’s technology is used by a who’s who of the broadcasting and entertainment industry. Clients include ABC, NBC, Comcast, DirecTV, Discovery, DISH, ESPN, NASCAR, PBS, Showtime, Sony, Saturday Night Live and dozens of individual television stations around the world.
Spectra Logic’s chief technology officer, Matthew Starr, spoke with The Broadcast Bridge’s Frank Beacham about what his company is doing to enhance LTO for archival storage.
The Broadcast Bridge: Your company has a new product that is redefining archival storage and enhancing the use of LTO tape, called BlackPearl. What does it do?
Matthew Starr: An archive is a very large repository of all the high resolution video clips and sometimes the proxies. Between the editor and the archive is a workflow — a very complex software stack. Users might buy two or three software packages and use them together to create a workflow where an editor can ask for something and the software can then dig down into the physical tape archive, find it and stream it back.
We created BlackPearl to simplify this process. We adopted the same protocol that Amazon does for their cloud storage. It uses something called Simple Storage Services, or Amazon S3. If you put a BlackPearl in front of a physical tape library, it makes it look like you have a private cloud. It can simultaneously go to the cloud and deliver the data with LTFS to any LTO tape archive system.
The Broadcast Bridge: So BlackPearl sits between the edit system with its nearline storage and the archival tape system.
Matthew Starr: Yes. For example, we’ve developed a tie-in so the Avid Interplay system can talk directly to the BlackPearl. From the editor’s standpoint, it may not look that different. But from the engineer’s standpoint, BlackPearl takes a lot of complexity and cost out of the way. With BlackPearl, when the editor wants something from the archive, it’s just two steps away.
The Broadcast Bridge: Your company makes libraries using LTO tape that range from small to gigantic systems supporting supercomputers. What is driving the use of LTO tape for archives?
Matthew Starr: Cost! Are you willing to take the possible one- or two-minute time penalty to save hundreds of thousands of dollars in storage costs for your cold assets? If you want to hold a petabyte of data — or a few hundred thousand hours of content — but you only play out 0.1 percent of that content per year, you certainly wouldn’t want to store it on a flash system that costs hundreds of dollars per gigabyte. You would want to store it on the coldest tier. Today that storage is tape.
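The economics Starr describes are easy to sanity-check with back-of-envelope arithmetic. The per-gigabyte prices below are illustrative assumptions of my own, not quoted vendor pricing; the point is only the order-of-magnitude gap between tiers at petabyte scale.

```python
# Back-of-envelope cost comparison for 1 PB of cold content.
# Prices are illustrative placeholders, not real vendor figures.
PETABYTE_GB = 1_000_000   # 1 PB expressed in GB (decimal)

FLASH_PER_GB = 1.00       # assumed $/GB for an enterprise flash tier
TAPE_PER_GB = 0.02        # assumed $/GB for LTO media

def tier_cost(capacity_gb: int, price_per_gb: float) -> float:
    """Raw media cost of storing capacity_gb at a given $/GB price."""
    return capacity_gb * price_per_gb

flash_cost = tier_cost(PETABYTE_GB, FLASH_PER_GB)
tape_cost = tier_cost(PETABYTE_GB, TAPE_PER_GB)
savings = flash_cost - tape_cost
```

Even with conservative assumptions, a petabyte that is read 0.1 percent of the time costs tens of thousands of dollars on tape versus a seven-figure sum on flash, which is why the retrieval-latency penalty is an easy trade for cold assets.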
The other thing driving LTO sales is that the world of storage today is made up of three primary platforms: flash, disk and tape. Flash sales are chewing into disk drives. Flash is roughly ten times faster and only two or three times the cost of some disk arrays, so many people buy flash. Sales of disk drives are going down, and that is driving LTO shipments up. People are starting to understand that data is in one of two states — either very warm or very cool. There is very little lukewarm data.
The Broadcast Bridge: How long does LTO tape last and how often should you change out your data?
Matthew Starr: LTO tape has a 30-year life. But my recommendation for LTO users is they should migrate to the latest version every five to seven years. The maximum I would wait is ten years. It’s not so much that the data is degrading, but think of the operating systems and storage systems that we ran ten years ago. There has been a lot of change. If your content is valuable, a regular migration path is essential.