The following use cases illustrate how implementing the two-tier approach with storage lifecycle management software delivers more storage capacity, lower costs, greater data access, enhanced protection and fewer silos.
Maximizing Efficiencies
As much as 80 percent of enterprise data may qualify as inactive, yet it resides on primary storage. In a two-tier storage model, inactive data is moved to the lower-cost Perpetual Tier. By allowing organizations to configure a smaller Primary Tier, this strategy reduces primary storage costs across hardware, software and storage licensing, administration, maintenance and support. As the size of an organization’s primary storage shrinks, backups and replication snapshots shrink with it, leading to shorter backup windows and reduced backup storage costs.
Picture a small corporate IT group with around 12TB of data growing at 25 percent per year. Their backup software licensing is capacity-based, and they have conservatively estimated that 50 percent of their data is inactive. Given the relatively small amount of data and their limited resources, they decide to move this inactive data to the cloud. Not only does this move cut their backup windows in half, it also lowers their capacity-based backup charges. They are able to make two redundant archive copies in cloud storage for a fraction of what it would have cost to keep that data on the Primary Tier, and the change is seamless to their existing backup operations. Implementing a cloud archive strategy is both easier and less costly with the right storage lifecycle management software.
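The arithmetic behind this scenario can be sketched in a few lines. The figures come from the example above; the split between tiers is illustrative, and actual savings depend on the backup product's licensing model:

```python
# Illustrative projection of the small-IT-group scenario above.
# Assumptions (from the example): 12TB total, 25% annual growth,
# 50% of data is inactive and archived to cloud storage.

def tier_sizes(total_tb, inactive_fraction):
    """Split total capacity into Primary and Perpetual tiers."""
    perpetual = total_tb * inactive_fraction
    primary = total_tb - perpetual
    return primary, perpetual

total = 12.0          # TB under management today
growth = 0.25         # 25% per year
inactive = 0.50       # conservative estimate of inactive data

for year in range(4):
    primary, perpetual = tier_sizes(total, inactive)
    # Capacity-based backup licensing is charged on the Primary Tier,
    # so halving primary data roughly halves backup windows and fees.
    print(f"Year {year}: total={total:5.1f}TB  "
          f"primary={primary:5.1f}TB  perpetual(cloud)={perpetual:5.1f}TB")
    total *= 1 + growth
```

Even as total data grows, the backup-licensed Primary Tier stays at half the managed capacity under this policy.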
Managing Budgets and Growing Capacities
When a data-driven business predicts that its data will expand at a compounded annual rate, business growth mandates may seem to clash directly with budget constraints. Say, for example, that a midsized corporate IT department currently runs 600TB on its primary storage and expects its managed data to increase by roughly 40 percent per year. Adding 240TB of high-performance HDD per year doesn’t fit the budget. Rather than imposing severe storage quotas on departments, they deploy storage lifecycle management software, which lets them move an estimated 70 percent of their data, deemed inactive, to a Perpetual Tier consisting of a combination of NAS and tape.
The upside is significant in this case. They project being able to hold their primary storage capacity steady for 18 to 24 months. They are also able to increase storage quotas for data stored on NAS and to eliminate storage quotas entirely for anything moved to tape – where users still have seamless access to that data, restoring any file or directory on low-cost storage from their local work machines.
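A rough back-of-the-envelope calculation, using only the figures from this example, shows where the 18-to-24-month projection comes from (this simple model ignores tiering of newly created data, which would extend the runway further):

```python
# Illustrative headroom calculation for the 600TB scenario above.
# Assumptions (from the example): 600TB on primary, 40% annual growth,
# 70% of existing data is inactive and eligible for the Perpetual Tier.

primary_tb = 600.0
growth_rate = 0.40
inactive_fraction = 0.70

archived = primary_tb * inactive_fraction    # moved to NAS + tape
active = primary_tb - archived               # remains on primary
yearly_growth = primary_tb * growth_rate     # ~240TB of new data per year

# The capacity freed on primary absorbs raw growth for this long:
runway_months = 12 * archived / yearly_growth

print(f"Archived {archived:.0f}TB; primary now holds {active:.0f}TB")
print(f"Freed capacity absorbs ~{runway_months:.0f} months of growth")
```

The freed 420TB absorbs roughly 21 months of growth at 240TB per year, squarely inside the 18-to-24-month window the department projects.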
Greater Access, Enhanced Protection and Fewer Silos
As we continue to look at use case examples, it is important to keep in mind that the Perpetual Tier of storage is not limited to older data. The above use cases focus on moving data into the Perpetual Tier based on age and last access date. The Perpetual Tier should also serve as an archive tier for large data sets or projects that may be moved immediately after creation or collection.
Imagine a government research facility collecting large amounts of machine-generated data from sensors that detect physical phenomena. The output may not be analyzed immediately, but the data is by no means disposable and must be kept for future reference. Most of the machine-generated data they receive initially requires high-speed disk as a landing zone. With storage lifecycle management software, individual researchers can designate the storage tier for their data, moving it to lower-cost storage immediately after it lands and bringing it back as needed. Data that needs to be safeguarded can be moved to a tape storage tier – which is not externally accessible – as well as directed to the cloud for distribution or sharing. This type of project identification allows multiple forms of data, from multiple types of data generators, to be collected, archived and accessed many years into the future.
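The routing logic described here can be sketched as a small policy function. The tier names and the `route` function are hypothetical, not the API of any particular storage lifecycle management product:

```python
# Hypothetical sketch of project-based tier routing as described above.
# Tier names and dataset attributes are illustrative assumptions.

def route(dataset):
    """Decide target tiers for a freshly landed dataset."""
    tiers = ["landing-zone-disk"]       # high-speed disk on ingest
    if dataset.get("archive_on_ingest"):
        tiers = ["perpetual-nas"]       # moved off primary immediately
    if dataset.get("safeguard"):
        tiers.append("tape")            # not externally accessible
    if dataset.get("share"):
        tiers.append("cloud")           # distribution / sharing copy
    return tiers

sensor_run = {"project": "seismic-sensors", "archive_on_ingest": True,
              "safeguard": True, "share": True}
print(route(sensor_run))   # ['perpetual-nas', 'tape', 'cloud']
```

One sensor run can thus land on an air-gapped tape copy for protection and a cloud copy for sharing in a single policy decision.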
In environments such as a large university supporting multiple research projects, where different groups use the same data storage infrastructure under standardized, performance-based service level agreements (SLAs), storage lifecycle management software can help accelerate research by moving inactive data and whole data sets off high-speed storage. By providing access across all data – human-, application- or machine-generated – it allows administrators and researchers to identify what is actively used, what needs to be archived and what data is orphaned.
With modern storage lifecycle management software, researchers can scan and identify inactive data sets for archive, or move files or directories associated with a project as a group to the Perpetual Tier after a large project is completed. Archived data sets can be tagged with additional metadata to enable easy search and access even after a researcher leaves the university.
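The tag-and-search pattern described above can be illustrated with a minimal catalog sketch. The schema and function names are hypothetical, not a specific product's catalog format:

```python
# Illustrative sketch of tagging archived data sets with extra metadata
# so they remain findable after a researcher leaves the university.

archive_catalog = []

def archive(path, **tags):
    """Record an archived data set with searchable metadata tags."""
    archive_catalog.append({"path": path, **tags})

def search(**criteria):
    """Return archived entries matching all given tag values."""
    return [entry for entry in archive_catalog
            if all(entry.get(k) == v for k, v in criteria.items())]

archive("/perpetual/genomics/run-0142", project="genomics",
        pi="j.smith", instrument="sequencer-a", year=2021)
archive("/perpetual/genomics/run-0187", project="genomics",
        pi="j.smith", instrument="sequencer-b", year=2022)

# Years later, find everything from one instrument,
# regardless of who has since left.
print(search(instrument="sequencer-b"))
```

Because the tags travel with the archive catalog rather than living in one researcher's head, the data sets stay discoverable for the life of the Perpetual Tier.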
Deploying New Technology
Now imagine a midsized corporate IT department in the process of upgrading its primary storage flash array. It has roughly 500TB of data in primary storage and estimates that 60 percent of it is inactive. By deploying storage lifecycle management software to move “low transaction” data off the Primary Tier, the department can more affordably upgrade to high-performance SSD, NVMe flash or other cutting-edge technologies. It significantly increases data center performance and works with state-of-the-art technology, while users retain seamless access to migrated data on the NAS in the Perpetual Tier.
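The sizing effect is easy to quantify. The capacity figures come from the example above; the cost-per-TB values are hypothetical placeholders, since the document quotes no prices:

```python
# Illustrative sizing for the flash-upgrade scenario above: with 60% of
# 500TB classified inactive, only the active remainder must move to the
# new high-performance array. Cost-per-TB figures are hypothetical.

primary_tb = 500.0
inactive_fraction = 0.60

to_nas = primary_tb * inactive_fraction    # stays accessible on NAS
to_flash = primary_tb - to_nas             # sizes the new flash array

flash_cost_per_tb = 400.0   # hypothetical $/TB for NVMe flash
nas_cost_per_tb = 60.0      # hypothetical $/TB for capacity NAS

full_upgrade = primary_tb * flash_cost_per_tb
tiered_upgrade = to_flash * flash_cost_per_tb + to_nas * nas_cost_per_tb

print(f"Flash array needed: {to_flash:.0f}TB instead of {primary_tb:.0f}TB")
print(f"Upgrade cost: ${tiered_upgrade:,.0f} vs ${full_upgrade:,.0f} all-flash")
```

Whatever the real unit prices, buying 200TB of cutting-edge flash instead of 500TB is what makes the upgrade affordable.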