Performance and Scale: Overcoming the Hurdles of Video and Video-like Data

It’s no secret that video and high-res content continue to grow exponentially as consumers respond to the engaging, interactive experiences they offer. Indeed, 50 percent of all data is now video, and this explosion of high-res content is driving performance and capacity demands across a number of markets. The content itself enhances the capabilities of businesses across the globe; the problem is that the lack of infrastructure and management tools suited to video and video-like data leaves organizations with major performance and scalability challenges to overcome.

For example, data infrastructure, database management services, virtual environments and traditional corporate file data are all grounded in technologies that are not necessarily well-suited to the large file sizes of video. Some of those technologies, like replication, become far too expensive at video scale, while others, such as compression and deduplication, deliver little benefit because video files are already heavily compressed.

At the same time, new applications for video and high-res content are emerging rapidly, expanding beyond media and entertainment into areas such as video surveillance, medical imagery and the Internet of Things (IoT).

These use cases require advanced technology that is usable, practical and adaptable enough to work for non-technical markets. Whatever the industry, the core capabilities needed are the same: high-speed video ingestion, sophisticated high-volume analytics and economical storage tiering.

Real World Applications

When it comes to leveraging video and high-res content, businesses shouldn’t need to hire computer scientists. Whether the data comes from images, photographs, video, lidar, radar or IoT sensors, the content must be easy to work with, backed by the low-latency, high-speed storage required to make it practical.

For example, high-end auto makers are adding high-speed computing features to their cars, but don't want the heavyweight complexity of a supercomputing system. They need data analytics, high-performance computing and storage at an economical price.

Then there are high-tech stadiums, which are using hyper-converged video and data to support their security operations, and life sciences applications ingesting massive volumes of raw data for processing and study. Not only must this data be indexed and preserved to validate studies and meet regulatory requirements, it must also remain available to guide future studies.

Jamie Lerner is CEO of Quantum.

What do users expect?

Users of video and rich media expect the same sophisticated level of data services that they enjoy with text-based data: the ability to ingest, analyse, search and store it. This requires technology that manages these assets throughout their lifecycle.

The key is having a compute/storage ecosystem that provides high-performance data movement, low cost/high capacity storage and tools for analysis and pattern recognition. Video surveillance, which continuously generates massive files that users must protect and store, provides a perfect example. These data services need to efficiently search and identify matching patterns, such as license plates, faces, an unattended piece of luggage in an airport, or a person holding a weapon.

Video surveillance professionals then need access to tools that ensure the data is immediately available and highly searchable, which is where technologies such as programmatic APIs, high-speed flash, software data services and intelligent storage tiering come into play.

Tiering in particular is key. From high-performance ingest, to SSD flash tiers, to high-speed disk and tape, intelligent, policy-driven tiering manages massive media files efficiently for both performance and retention.
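To make the idea concrete, here is a minimal sketch of what such a policy could look like in code. The tier names and age thresholds are purely illustrative assumptions, not any particular product’s policy engine.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Asset:
    name: str
    last_accessed: datetime
    tier: str = "flash"  # hypothetical: new ingests land on the fast flash tier

def apply_tier_policy(asset: Asset, now: datetime) -> str:
    """Demote an asset to a cheaper tier as it ages out of active use."""
    idle = now - asset.last_accessed
    if idle > timedelta(days=90):
        asset.tier = "tape"   # long-term, lowest-cost retention
    elif idle > timedelta(days=7):
        asset.tier = "disk"   # nearline capacity, still quick to restore
    # otherwise the asset stays on flash for low-latency access
    return asset.tier

# A clip untouched for a month is demoted from flash to the disk tier.
clip = Asset("match_footage_cam02.mxf", last_accessed=datetime(2024, 1, 1))
print(apply_tier_policy(clip, now=datetime(2024, 2, 1)))  # -> disk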

Innovations such as high-speed Non-Volatile Memory Express (NVMe) storage allow massive volumes of sensor data and video to be ingested quickly, then managed for optimum performance and capacity to keep storage costs down. Although these technologies have historically been used within the media and entertainment space, they are now driving new, on-premises scalability architectures into several other verticals.

For example, they are giving sports broadcasters immediate access to historical footage, which can then be edited into new broadcasts. In many cases, broadcasters are turning to tape, which offers cost-effective capacity within a cloud-like, video/rich media management infrastructure, with download speeds able to cope with today’s accelerated broadcast schedules.

Users also need to be able to search their assets, be it for stats from a sporting event or intelligence on terrorist activity. These queries require artificial intelligence that enriches the metadata upon ingestion, so users can intelligently search their assets for the information they require.
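As a rough illustration of that idea (a sketch under assumptions, not any vendor’s actual API), metadata enrichment at ingest and a simple tag-based search might be structured along these lines, with detect_objects standing in for whatever AI model is used:

from typing import Dict, List

index: Dict[str, List[str]] = {}  # asset name -> AI-detected tags

def detect_objects(frame_path: str) -> List[str]:
    # Stand-in for a real detection model (faces, license plates, objects).
    return ["face", "license_plate"]

def ingest(asset_name: str, frame_path: str) -> None:
    """Enrich the asset's metadata with detected tags at ingest time."""
    index[asset_name] = detect_objects(frame_path)

def search(tag: str) -> List[str]:
    """Return every asset whose enriched metadata contains the tag."""
    return [name for name, tags in index.items() if tag in tags]

ingest("gate_cam_0413.mp4", "/frames/gate_cam_0413/0001.jpg")
print(search("license_plate"))  # -> ['gate_cam_0413.mp4']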

What’s next?

Technology development is showing no signs of slowing. Forward-thinking video lifecycle vendors are investing heavily in several areas, including AI development, on-premises storage scalability, and search and analytics. They are also looking to bundle these technologies into managed services for their customers.

We also expect to see future growth in cloud-managed, on-premises storage, which combines the speed and cost savings of on-premises systems with the ease of use and elasticity of cloud offerings. This could bridge the gap until post-production in media and entertainment can be done efficiently, and entirely, in the cloud.

Ultimately, customers expect their vendors to help them store, analyse, tier and protect their rich media so that they can generate insights from it. This is certainly a challenge, but it also provides a rich opportunity for those storage vendors that are able to step up to the plate and answer the call.
