Choosing the Right Storage for Collaborative Workflows

Like the creative professionals who use them, today’s storage systems need to be agile and able to serve up content as quickly and efficiently as possible whenever it is requested. A ‘Primary’ storage system that performs in real time also needs to accommodate multiple, disparately located users working in a collaborative environment. There are many successful variations on how to set up such an infrastructure, but the type of storage technology you choose—spinning hard disk drives (HDD) or solid-state flash media (SSD)—could mean the difference between a project’s success and failure.

While many have looked to flash storage for its fast read/write times (i.e., low latency), experts say you have to understand your workflow and deploy the technology accordingly. Storage vendor Quantum recently conducted a series of lab tests on HDD and flash media at its Englewood, Colorado facility and found some interesting, and somewhat surprising, results. For example, Quantum says it found that flash is not necessarily the best option, in terms of value, for certain types of shared content.

Quantum's testing showed that flash storage provides excellent value in media workflows with a high number of compressed streams—which can leverage the low-latency performance of flash—and where high storage capacity is less important than multi-stream performance (“I need lots of files now”). On the other hand, because of the nature of uncompressed formats, flash does not significantly improve performance when there is a smaller number of uncompressed streams.

To conduct the storage tests—essentially looking at 4K performance in a real-time collaborative editing environment—Quantum engineers used both Autodesk Flame 3D visual effects software and Blackmagic Design DaVinci Resolve, running as many streams as they could to see at what point the storage failed to deliver the required performance. They then calibrated a benchmarking tool called VidIO against the stream counts achieved with those real-world applications, and used VidIO to measure how the storage would actually behave in real-world use.

HD versus 4K UHD storage formats evaluated by Quantum.

“We got more streams out of compressed formats on flash than we expected, but we also got less uncompressed streams than we expected,” said Dave Frederick, senior director of media and entertainment at Quantum. “Customers want to know that they’re going to get what they expect out of the system to support their staff. Predictability is more important than the number of streams.”

Quantum offers its StorNext software in tandem with its Xcellis workflow storage system. The in-house testing covered 14 different Xcellis storage configurations and a range of 4K media formats, totaling more than 500 individual tests.

HDD Storage Offers Best Value for Uncompressed Workflows

Among the findings, which are detailed in a new white paper released by Quantum, real-time collaborative editing workflows using uncompressed HD or 4K files get the best value from HDD storage rather than all-flash SSD storage. Also, stream count (relating to the number of users on a network) was found to be as important a factor for media choice as stream data rates and capacity requirements. Quantum’s StorNext data management system works with both Flash and HDD storage, thereby giving users the freedom to choose the infrastructure that best fits their performance, capacity and cost requirements.

The company’s engineers found that 3.5-inch HDDs work best for a low number of compressed or uncompressed 4K stream counts requiring high capacity, while 2.5-inch drives are best for a higher number of compressed 4K stream counts with mid capacity requirements. And finally, Flash storage is ideal for very high compressed 4K stream counts that require low capacity.
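
As a rough way to visualize that guidance, the short Python sketch below maps a workload’s stream count, compression and capacity needs onto those three media types. The thresholds are hypothetical placeholders chosen for illustration, not figures from Quantum’s white paper.

    # Illustrative only: the thresholds below are hypothetical, not Quantum's.
    def suggest_media(stream_count: int, compressed: bool, capacity_tb: float) -> str:
        """Map workload shape to a storage media type, following the pattern
        described above: capacity-heavy, low-stream work favors 3.5-inch HDD;
        mid-capacity, higher compressed stream counts favor 2.5-inch HDD; very
        high compressed stream counts with modest capacity favor flash."""
        if compressed and stream_count > 32 and capacity_tb < 100:
            return "SSD (flash)"
        if compressed and stream_count > 8:
            return "2.5-inch HDD array"
        return "3.5-inch HDD array"

    print(suggest_media(stream_count=4, compressed=False, capacity_tb=400))  # 3.5-inch HDD array
    print(suggest_media(stream_count=16, compressed=True, capacity_tb=150))  # 2.5-inch HDD array
    print(suggest_media(stream_count=48, compressed=True, capacity_tb=60))   # SSD (flash)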

System designers should also consider the difference between the bandwidth and the latency a system provides. When working with multiple streams of compressed formats, you want the lowest latency you can get.

“It’s about the amount of time it takes the discs to get to the piece of data you are looking for, as opposed to the amount of data they can push out at a certain amount of time,” Frederick said. “It’s the difference between seek time and playback time.”

Working with compressed formats, users should look for low-latency storage, which lets them seek quickly between those compressed files; and because the files are not especially heavy, playing them out does not consume much bandwidth. A flash system or a high-speed disk system can therefore produce a lot of streams. But when moving to an uncompressed workflow, the test data showed that you max out the bandwidth long before you reach the point where flash storage has a positive impact on the number of streams you can handle.
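
A back-of-the-envelope calculation makes the point. Using the roughly 4,406 GB/hour figure cited below, an uncompressed 4K stream at 24 fps needs on the order of 1.22 GB/s; the compressed per-stream rate and the 16 Gb Fibre Channel client link in this sketch are illustrative assumptions, not Quantum’s test parameters.

    # Back-of-the-envelope sketch: the link speed and compressed stream rate
    # are assumptions; the uncompressed rate is derived from ~4,406 GB/hour.
    GBPS_TO_BYTES = 1e9 / 8                 # bytes per second in 1 Gb/s

    link_gbps = 16                          # assumed 16 Gb Fibre Channel client link
    link_bytes_per_s = link_gbps * GBPS_TO_BYTES

    stream_rates = {
        "uncompressed 4K, 24 fps": 1.22e9,  # ~1.22 GB/s per stream (from 4,406 GB/hour)
        "compressed 4K mezzanine": 0.10e9,  # ~100 MB/s per stream (assumed)
    }

    for name, rate in stream_rates.items():
        max_streams = int(link_bytes_per_s // rate)
        print(f"{name}: ~{max_streams} stream(s) before the link saturates")

    # Uncompressed streams hit the bandwidth ceiling after a stream or two,
    # long before seek latency becomes the limiting factor, which is why
    # flash's latency advantage buys little in uncompressed workflows.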

At full 4K resolution, Quantum found that 24 fps material requires a storage capacity of 4,406 GB/hour.
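
For a sense of where a figure like that comes from, the arithmetic below assumes full-aperture 4K frames of 4096 x 3112 pixels stored as 10-bit RGB DPX, where each pixel packs into a 4-byte word. Those frame dimensions and that packing are assumptions for illustration; the white paper defines the exact formats Quantum tested.

    # Rough reconstruction of the hourly capacity figure (assumed frame geometry).
    width, height = 4096, 3112      # assumed full-aperture 4K frame size
    bytes_per_pixel = 4             # 10-bit RGB padded to 32 bits (DPX-style packing)
    fps = 24
    seconds_per_hour = 3600

    bytes_per_frame = width * height * bytes_per_pixel
    bytes_per_hour = bytes_per_frame * fps * seconds_per_hour
    print(f"{bytes_per_hour / 1e9:,.0f} GB per hour")  # ~4,405 GB/hour, within a
                                                       # gigabyte of the figure above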

“So, many companies that sell flash boxes are saying, ‘Just buy flash, it will solve all of your 4K problems,’” Frederick said. “We say, yes, flash will allow you to work in multiple streams of compressed 4K. But if a customer wants to do a color correction suite in uncompressed 4K, it would be a waste of money to outfit that with flash. You can get enough performance from spinning disks, at a much lower price.”

Primary, Secondary and Tertiary Storage

The testing also showed that while flash is fast at getting data on and off the drive, it is too expensive to provide the capacity that a 4K collaborative editing environment requires.

“Uncompressed files are much larger than compressed files,” Frederick said, “so it’s a double whammy: you don’t benefit from the low latency of flash with uncompressed files, and you take up so much extra space that you end up spending way too much money trying to support your uncompressed workflow on flash. In our opinion, you are much better off with spinning disks. That’s the counter-intuitive result of our testing: we thought that if you get faster storage, you can do more work, but it turns out that it depends upon the type of work you are doing.”

Across the industry, storage systems are classified into three main categories: Primary (“I’m working on this file right now”), Secondary (think object storage) and Tertiary (LTO tape).

If your facility has built up an archive of 4K material (Tertiary), you want to keep everything as close to Primary storage as possible, because transferring large amounts of content over the Internet costs a lot of time and money. For the archive itself you want the lowest-cost, most reliable format, and that’s LTO data tape cartridges: the cheapest, safest long-term storage. Many people are also looking at spinning disk-based object storage, because it is intended as a long-term solution and keeps evolving as the technology improves.

Cost-Effective Expandability

And, according to Quantum, whenever you can move files off Primary storage and onto Secondary or Tertiary storage, you are effectively getting free Primary storage back.
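
A minimal sketch of that idea, assuming a simple idle-time rule: files untouched for a set number of days are marked for migration to a lower tier, and the space they occupied is counted as reclaimed Primary capacity. This is a conceptual illustration, not StorNext’s actual policy engine.

    # Conceptual sketch only: not StorNext's policy engine or API.
    import time
    from dataclasses import dataclass

    @dataclass
    class FileRecord:
        path: str
        size_gb: float
        last_access: float        # epoch seconds
        tier: str = "primary"

    def demote_cold_files(files: list[FileRecord], idle_days: float = 90) -> float:
        """Mark files untouched for `idle_days` as tertiary and return the
        Primary capacity (in GB) reclaimed by moving them."""
        cutoff = time.time() - idle_days * 86400
        reclaimed = 0.0
        for f in files:
            if f.tier == "primary" and f.last_access < cutoff:
                f.tier = "tertiary"   # e.g. migrate to LTO and leave a stub behind
                reclaimed += f.size_gb
        return reclaimed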

“The prevailing attitude is that ‘I don't want to archive because I can't get back to it easily,’” said Frederick. “But what users need to understand is that Tertiary storage is not a tape library in the sense that it’s videotapes on a shelf. This is an automated library that’s controlled by a robot and accessible through the same file system that they access their everyday work on.”

Using the Quantum StorNext file system and data management system, the user can cost-effectively add a storage array and extend the file system across to that new array. And everyone on the team gets the benefit of the added HDDs that provide more performance and more capacity.

Contrast that with a clustered file system. Every time you want to scale, you must add a whole new cluster, including all of the overhead that comes with it. In Quantum’s case, the Xcellis controller is already in place and all that’s needed is an additional HDD storage array. Frederick said customers that are buying clusters are also adding a lot of compute hardware they don’t need.

“With our system, you never need to add controller hardware,” he said. “The only thing users need to add to go from terabytes of storage to petabytes of storage is just storage. The processing power is built into the file system.”

With clustered systems, the computer sits between the storage and the user. With Quantum’s StorNext, the computer is off to the side, which allows for fast retrieval times. The system also scales easily: Frederick said one customer has more than 1,600 users on a single StorNext file system.

“StorNext handles the additional demands by not being in band,” Frederick said. “We don't get in the way of the workflow. The system tells the user where they can go to get the file. So the controller is not part of that process.”
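
The sketch below is a toy illustration of that out-of-band arrangement: a metadata service answers “where does this file live?”, and the client then reads the data directly from shared storage, so the controller never sits in the data path. The class and method names are hypothetical, not StorNext APIs.

    # Toy illustration of an out-of-band metadata controller (hypothetical names).
    class MetadataController:
        """Maps file names to locations on shared storage; never touches file data."""
        def __init__(self) -> None:
            self._catalog: dict[str, str] = {}

        def register(self, name: str, location: str) -> None:
            self._catalog[name] = location

        def locate(self, name: str) -> str:
            return self._catalog[name]        # small, fast metadata answer

    class EditClient:
        """Asks the controller where a file is, then reads it straight from storage."""
        def __init__(self, controller: MetadataController) -> None:
            self.controller = controller

        def read(self, name: str) -> bytes:
            location = self.controller.locate(name)   # out-of-band lookup
            with open(location, "rb") as f:           # direct data path to the storage
                return f.read()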

It’s All About Access

Access is another big benefit of Quantum’s Xcellis HDD storage arrays: content can be retrieved either directly over Fibre Channel to the storage or via a Distributed LAN Client (DLC), which carries the Fibre Channel protocol over Ethernet. Fibre Channel over Ethernet is built into the Macintosh OS, so it’s a way of using Ethernet while still getting the benefit of direct communication between the computer and the storage. There’s also the option of adding Network Attached Storage (NAS) arrays, so the system can be customized to specific needs. This way every user on the network gets the same performance.

“Playing a single stream is not that hard,” Frederick said. “Playing a single stream of uncompressed, 60 frames per second, full aperture in 4K still requires 3 Gbps of bandwidth, and that is hard. That’s where bandwidth becomes an issue. But when you then start to add streams, now what happens is that you need to be able to read from stream A and stream B, basically simultaneously. And those files can exist on the same disc, in two different places. So now you’re asking the system to go back and forth between those files and say ‘give me some of A and some of B.’ The longer it takes the drive to seek the data, the larger the delay from getting from A to B.

“If you are going from A to Z, with multiple users working at the same time, you can imagine the demands on the hard drive,” Frederick said. “That’s why if you get drives with low latency, they can get between the files faster and you can do more streams. Remember, it’s not about how much data you can pump out of a single stream, it’s about how you can get all of those streams to a specific desktop at the same time.”
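
A simple round-robin model illustrates the effect. Assume each chunk read for a stream is preceded by one seek, and compare a spinning drive with roughly 8 ms seeks against flash with near-zero seek time; the device throughput, chunk size, seek times and per-stream rate below are all illustrative assumptions, not measured values from Quantum’s tests.

    # Rough model: every chunk read for a stream costs one seek first.
    # All numbers below are illustrative assumptions, not measured values.
    def effective_streams(device_bps: float, seek_s: float,
                          chunk_bytes: float, stream_bps: float) -> int:
        """Estimate how many real-time streams a device sustains when reads
        are interleaved round-robin and each chunk is preceded by a seek."""
        time_per_chunk = seek_s + chunk_bytes / device_bps
        total_bps = chunk_bytes / time_per_chunk
        return int(total_bps // stream_bps)

    hdd = effective_streams(device_bps=1.5e9, seek_s=0.008,
                            chunk_bytes=8e6, stream_bps=100e6)
    ssd = effective_streams(device_bps=1.5e9, seek_s=0.0001,
                            chunk_bytes=8e6, stream_bps=100e6)
    print(f"compressed streams (~100 MB/s each): HDD ~{hdd}, flash ~{ssd}")

In this toy model the low-seek device sustains roughly twice as many compressed streams; push the per-stream rate up to uncompressed levels and both devices hit the throughput ceiling after a stream or two, which is the pattern Frederick describes.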

In the end, Quantum’s testing provided a number of key conclusions about stream counts and performance levels: when and where flash is the most cost-effective solution for high-performance, higher-resolution workflows, and when a spinning disk configuration can provide better performance for less cost.

“Our goal for the test was to validate the performance of the system in 4K,” Frederick said. “We want to be able to tell customers precisely how many streams they can expect from the various formats. And in the process of doing that, we discovered this unexpected attribute of stream count becoming a bigger deal than we thought. That’s when we learned that Flash is not the best option, in terms of value, for certain types of streams.”

Registered readers can download the full white paper at the link below.
