Spectra Logic Enhances StorCycle Software With Long-Term Data Protection

The new features in StorCycle 3.5 allow tiering and protection of cloud data, provide increased protection against ransomware attacks, and boost metadata searchability and accessibility, among other benefits.

“With new cloud mandates and mounting ransomware attacks, it’s more critical than ever for organisations to get visibility into their data and manage that data for its lifetime,” said Jeff Braunstein, Spectra Logic's director of product management. “Organisations need to know how much data they have, where it is located, how to find it when needed, and where to protect it long-term. StorCycle is built to meet those requirements.”

StorCycle scans primary onsite or cloud storage and migrates or copies files that meet policy-based criteria to a lower-cost storage tier, which can include any combination of cloud storage, object storage disk, network-attached storage (NAS) and object storage tape, while keeping the data accessible and usable.

Latest StorCycle features include:

  • S3 Source Storage – In addition to StorCycle’s support for cloud storage targets, the software now allows users to migrate or copy S3 cloud data to a BlackPearl object storage device.
  • Data Encryption – Data encryption is now available for data migrated by StorCycle. Previously, data encryption was provided in StorCycle via the encryption capabilities of the storage targets themselves. Now, the software can also encrypt the data as it is moved or copied to a storage target. This integrated feature provides users with an easy method to protect all migrated data.
  • Single HTML Links for Jobs – A single HTML link can now be left for an entire job/project, rather than one for each migrated file, making it easier to access and restore data while keeping the directory structure on the source clean and organised.
  • Job Queue Priority Control – This feature allows users to prioritise jobs in the queue, giving users greater control over job execution order.
  • Linked Instances – For organisations with multiple StorCycle installations on the same network, the software can now link to other instances and provide a single search window. This feature makes the search for migrate/store projects much easier.
  • Restore User: Daily Capacity Limitations – Administrators can set daily restore limits for users to regulate the amount of data returned to primary storage. This keeps restored data below a pre-determined threshold, preserving primary storage capacity and performance.
  • Custom Age Filter – Instead of pre-set age ranges for data migration, StorCycle now enables users to customise the age range that triggers a migration. For example, a user might set the policy to migrate all data that is 97 days old and older. 
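The custom age filter described above amounts to selecting files by last-modified time against a user-chosen cutoff. As a rough illustration of that policy logic only (not StorCycle's actual implementation or any Spectra Logic API), a sketch in Python might look like:

```python
import time
from pathlib import Path

AGE_THRESHOLD_DAYS = 97  # the example cutoff from the article

def files_older_than(root: str, days: int):
    """Yield files under `root` whose last-modified time is at least `days` days old."""
    cutoff = time.time() - days * 86400  # 86400 seconds per day
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime <= cutoff:
            yield path

# Files matching the policy would then be candidates for migration:
# for candidate in files_older_than("/data/projects", AGE_THRESHOLD_DAYS):
#     migrate(candidate)  # hypothetical migration step
```

In practice, StorCycle applies such criteria through its policy configuration rather than user-written scripts; the sketch only shows the filtering rule a "97 days and older" policy expresses.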

The company also announced a free 60-day trial of StorCycle software.
