By Rich Gadomski, Head of Tape Evangelism, FUJIFILM Recording Media U.S.A., Inc.
In mid-December, Fujifilm issued a press release announcing that, together with IBM Research, it had achieved a record areal density of 317 Gbpsi (billion bits per square inch) on next-generation magnetic tape coated with next-generation Strontium Ferrite (SrFe) magnetic particles. This areal density would yield a remarkable native storage capacity of 580TB on a standard-sized data cartridge. That's almost 50 times the 12TB native capacity we have now with an LTO-8 tape based on Barium Ferrite (BaFe).
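The capacity figure follows directly from areal density times recordable tape area. The sketch below checks the arithmetic; the recordable-area value is an assumption chosen to match a standard LTO-sized cartridge, since the exact track layout and overhead regions of the demo media are not public.

```python
# Back-of-the-envelope check: areal density x recordable tape area -> capacity.
# The recordable area below is an illustrative assumption, not a published spec.

def native_capacity_tb(areal_density_gbpsi: float, recordable_area_sq_in: float) -> float:
    """Native capacity in decimal terabytes (1 TB = 1e12 bytes)."""
    bits = areal_density_gbpsi * 1e9 * recordable_area_sq_in
    return bits / 8 / 1e12

# ~14,600 square inches of recordable area (assumed) at 317 Gbpsi:
print(round(native_capacity_tb(317, 14_600)))  # ≈ 579 TB, close to the quoted 580TB
```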
by Randy Kerns, Senior Strategist & Analyst, Evaluator Group
Due to many factors, including the COVID-19 pandemic, professionals in Information Technology have been focused on optimizing their environments and reducing expenses. Recent strategic projects with our IT clients have provided new insights into the current needs and methods for achieving maximum efficiency. One approach is to identify inactive or project-based data that resides on primary storage and move it to less costly secondary storage such as object storage, NAS systems, cloud storage, or tape. Cost savings came not only from recovering the capacity freed by moving the inactive data, which reduces the need to purchase additional primary storage, but also from the lower cost of protecting unchanging data.
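The two savings components described above can be made concrete with a simple model. All prices and ratios below are illustrative assumptions, not figures from the article.

```python
# Illustrative tiering model (all $/TB/yr prices and ratios are assumptions).

def annual_savings(total_tb, inactive_frac, primary_cost, secondary_cost, backup_cost):
    """Savings from moving inactive data off primary storage.

    Two components, mirroring the article's point:
      1. freed primary capacity no longer needs to be (re)purchased, and
      2. unchanging data no longer needs repeated backup cycles.
    """
    moved = total_tb * inactive_frac
    capacity_savings = moved * (primary_cost - secondary_cost)
    protection_savings = moved * backup_cost   # repeated backups of static data avoided
    return capacity_savings + protection_savings

# 1 PB with 60% inactive, using assumed per-TB annual costs:
print(annual_savings(1000, 0.6, primary_cost=250, secondary_cost=50, backup_cost=80))
# 168000.0
```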
What do you do when you need to increase the space available for storing vast amounts of data? One solution is to install a high-capacity automated tape library or ‘Tape Robot’.
STFC’s Scientific Computing Department (SCD) needed to increase its capacity for storing and managing the vast and ever-growing amounts of data from the JASMIN Facility (environment and climate data) and through IRIS (astronomy, particle physics, and data from national facilities such as the ISIS Neutron Source, Diamond Light Source and the Central Laser Facility).
The extra capacity needed to be at least 60 petabytes (PB) – the equivalent of storing more than 10 million DVDs – with the potential to increase again to meet future data demands.
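The DVD comparison checks out: at a single-layer DVD's 4.7 GB capacity, 60 PB works out to well over 10 million discs.

```python
# Sanity check on the DVD comparison, using decimal units throughout:
# 60 PB divided by a single-layer DVD's 4.7 GB capacity.
dvds = 60e15 / 4.7e9
print(f"{dvds / 1e6:.1f} million DVDs")  # ~12.8 million, i.e. "more than 10 million"
```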
by Elizabeth Rosenthal
Oak Ridge National Laboratory
Results from supercomputer simulations will reside in modernized tape library
Tape recorders and videocassette tapes may be considered archaic in the entertainment industry, but magnetic tape storage remains one of the most durable, secure, and cost-effective ways to archive big data generated by supercomputers.
After researchers run code on high-performance computing (HPC) systems, key results are initially stored on disks and then transferred to tape libraries for long-term storage. These systems contain tape drives that write data onto magnetic tape and hundreds or thousands of tape cartridges. Robotic arms place these cartridges in slots for safekeeping and retrieve them when users need to access data to cite experiment outcomes in academic articles or share accomplishments with colleagues.
by Fred Moore, President, Horison Information Strategies
The changing landscape of the data protection industry has evolved from backing up data to providing recovery from hardware and network failures, software bugs, and human errors, as well as fighting a mounting wave of cybercrime. Over the years, the reliability and resiliency levels of hardware and software have improved significantly. Cybercrime, however, has now become a bigger threat to data protection than accidental deletion, bit rot, or drive failure. And the stakes are getting higher as anonymous cybercriminals seek to profit from the valuable digital data of others. With a ceasefire in the cybercrime war unlikely, we are witnessing the convergence of data protection and cybersecurity to counter rapidly growing cybercrime threats, including ransomware.
We all know the story – data growth is exploding, but IT budgets are staying flat. Customers realize that storing and managing all data on primary storage is not sustainable – especially since the amount of data managed is typically 300%+ of the amount created (for 1TB of data stored, 3TB to 5TB is managed: 1TB source, 1TB mirror/disaster recovery, and 1 to 3TB of backup copies).
Since 80%+ of this data is cold/inactive, by moving the cold data and its management to a cost-efficient secondary storage such as Spectra’s BlackPearl® Converged Storage System, businesses can typically save 70%+ on storage costs.
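The multipliers above can be worked through directly. The cost figures in this sketch are illustrative assumptions; the 70%+ savings figure is reached when secondary storage costs a small fraction of primary per TB.

```python
# Worked version of the numbers above (cost figures are illustrative assumptions).

def managed_tb(source_tb, backup_copies):
    # source copy + mirror/DR copy + backup copies
    return source_tb * (1 + 1 + backup_copies)

def savings_pct(cold_frac, primary_cost, secondary_cost):
    """Percent saved by moving the cold fraction of data to cheaper secondary storage."""
    before = primary_cost
    after = (1 - cold_frac) * primary_cost + cold_frac * secondary_cost
    return 100 * (1 - after / before)

print(managed_tb(1, 3))                  # 5 TB managed per source TB at 3 backup copies
print(round(savings_pct(0.8, 100, 10)))  # 72, i.e. the 70%+ range, with secondary at
                                         # an assumed 10% of primary cost per TB
```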
But customers tell us that data often continues to sit and be managed exactly where users first put it – on expensive primary storage – because of some key roadblocks that create friction. The combined Komprise data management and Spectra solution eliminates these hurdles and creates a frictionless solution for massive data management:
Driving Blind with Data: The #1 issue we hear from many customers is that IT is often making critical storage decisions in the dark. IT does not create most of the data it is managing, and IT does not often have a good understanding of what data is still being actively used, what isn’t, or how fast data is expected to grow. In many cases, this problem is further compounded by data sitting on different vendor storage systems, with little or no visibility across these silos.
With the Komprise and Spectra Solution, you get full visibility into how data is growing and being used across all your NAS storage. And, based on policies you set, Komprise projects the estimated savings of using Spectra targets to transparently archive and/or replicate data.
Users Don’t Want to Look for Data in Multiple Places: Typically, when cold data is moved off primary storage, users need to go to the secondary location to find the data. This requires a behavior change for users and creates friction and resistance to moving data off primary storage.
With the Komprise Spectra Solution, all the data moved to Spectra targets is still accessible as files on the source, with no disruption for users and applications. Spectra BlackPearl provides object access to the moved data, while Komprise mimics a transparent, fully hierarchical file system on BlackPearl and maintains a dynamic link on the primary NAS, so that users still see and access the moved files as if they were still on the original NAS. All NTFS privileges, file hierarchies, and attributes are fully preserved.
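The dynamic-link idea can be sketched in a few lines: move a cold file to a secondary tier, then leave a link at the original path so users and applications read it exactly as before. This is only an illustration on a POSIX filesystem; Komprise's actual mechanism is more sophisticated and preserves ACLs, attributes, and object access via BlackPearl.

```python
# Minimal sketch of "transparent archiving" on a POSIX filesystem:
# move a file to secondary storage and leave a symlink behind so users
# still find it at the original path. Illustration only, not the vendor's
# actual dynamic-link implementation.
import os
import shutil
import tempfile

primary = tempfile.mkdtemp(prefix="primary_")      # stands in for the primary NAS
secondary = tempfile.mkdtemp(prefix="secondary_")  # stands in for the secondary tier

src = os.path.join(primary, "report.dat")
with open(src, "w") as f:
    f.write("cold project data")

# Move the cold file to the secondary tier, then link back to it.
dst = os.path.join(secondary, "report.dat")
shutil.move(src, dst)
os.symlink(dst, src)

# Users and applications still read the file at its original location.
with open(src) as f:
    print(f.read())  # cold project data
```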
by Floyd Christofferson, SVP, Products, StrongBox Data Solutions
Spectra invited StrongBox Senior VP of Products Floyd Christofferson to discuss the attributes of the StrongBox and BlackPearl solution.
How does the StrongLink® solution work?
StrongLink leverages the power of metadata and machine learning to deliver the first solution to truly automate storage and data management across multi-vendor storage environments. StrongLink is not limited by a particular filesystem or storage vendor ecosystem but can bridge multi-vendor environments with a policy-driven management layer to automate and simplify all aspects of data management, data protection, storage resource management and more.
The result is reduced complexity for IT staff and users alike, which results in reduced management overhead, better ROI from existing storage resources, and greater control over protection and utilization of the data.
How does Spectra’s BlackPearl enhance your solution and benefit customers?
The direct integration between StrongLink and the Spectra® BlackPearl® Converged Storage System means that tape-based storage may be seamlessly tied into a cross-platform policy-based storage management fabric. StrongLink extends BlackPearl and Spectra disk and tape-based platforms to be part of a global namespace, automating data protection policies, data migration, data placement for workflows, etc.
With the combined Spectra and StrongBox solution, IT administrators can centrally manage all of their resources and seamlessly leverage the advantages of each of Spectra’s storage solutions to automatically ensure that data is protected and accessible in the correct storage tier according to the stage of its lifecycle.
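A lifecycle placement policy of the kind described above can be sketched as a simple rule mapping file metadata to a storage tier. The tier names and age thresholds below are illustrative assumptions, not StrongLink's actual policy language.

```python
# Sketch of a lifecycle placement policy of the kind StrongLink automates.
# Tier names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class FileMeta:
    path: str
    days_since_access: int

def place(meta: FileMeta) -> str:
    """Map a file to a storage tier by how recently it was accessed."""
    if meta.days_since_access <= 30:
        return "primary-flash"
    if meta.days_since_access <= 365:
        return "nearline-disk"   # e.g. a BlackPearl disk tier
    return "tape-archive"        # e.g. a Spectra tape library

print(place(FileMeta("scratch/run42.h5", 7)))    # primary-flash
print(place(FileMeta("results/2019.tar", 900)))  # tape-archive
```

A real policy engine would evaluate rules like this continuously against harvested metadata and trigger the data movement automatically, rather than requiring IT staff to locate and migrate files by hand.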
What are the ideal markets/vertical industries for the solution?
The greater the volume of data that must be managed, particularly unstructured data, the greater StrongLink’s value is to customers. This applies to Higher Ed and Research, Life Sciences pipelines such as genomics labs, Media and Entertainment, large-scale video surveillance, oil and gas, manufacturing, and so on. Customers who have more than 300-500TB of data, or who are on the path to get there, will see improved ROI on their existing storage resources and a reduced load on IT staff as they move forward.
What are some of the biggest challenges end users face today that led you to develop this solution?
The extreme growth of unstructured data has created a burgeoning set of new problems for IT organizations. This is particularly the case where expensive primary storage platforms end up housing a high percentage of inactive data. Storage vendors offer storage-centric solutions to this, but those typically are limited to the vendor’s file system, or storage ecosystem. These solutions usually lock customers into a limited way of doing things.
Whether it is to get better utilization out of primary storage, or to enable multiple storage tiers or cloud options, or to simplify data protection, or optimize multiple existing storage resources, customers want the freedom to select whichever storage type they need to match budget, performance requirements, or current and future use cases. And they want to do this without adding complexity to their environment, or interruption to users.
StrongLink was built to provide a data-centric approach that works across any storage type. By leveraging the power of metadata in a vendor-neutral platform, StrongLink can automate, and thus dramatically simplify, cross-platform data management, data protection, and replication, reducing the load on IT staff and simplifying access for users.
What three benefits will end users realize from implementing the solution?
Intelligence About Your Data:
By aggregating multiple types of metadata about the files, the system gives IT administrators the intelligence about their data to know what it is, where it is, whether it is stored in the right place, how it is protected, and so on.
Simplify data and storage management with Automation:
With the knowledge derived from automated data classification, StrongLink enables IT managers to automate cross-platform data migration, data protection, storage optimization and other routine tasks, without causing disruption to users.
Improved ROI on Existing Storage Resources:
Data classification and automation have a direct impact on ROI, both by reducing IT management overhead and by enabling storage optimization to defer or reduce the need to buy more expensive primary storage. This is only effective when it is done without impact to users, and in a way that does not add burden to IT. This is the core of the design principles for the StrongLink solution.
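The ROI claim combines two benefit streams: deferred primary-storage purchases and reduced administration time. The figures in this sketch are entirely illustrative assumptions, used only to show how the two streams combine.

```python
# Illustrative ROI arithmetic (all figures are assumptions): deferred primary
# purchases from offloading inactive data, plus reduced admin time, over 3 years.

def three_year_roi(deferred_primary_tb, primary_cost_per_tb,
                   admin_hours_saved_per_yr, hourly_rate, solution_cost):
    """Return ROI as a fraction: (benefit - cost) / cost over three years."""
    benefit = (deferred_primary_tb * primary_cost_per_tb
               + 3 * admin_hours_saved_per_yr * hourly_rate)
    return (benefit - solution_cost) / solution_cost

# 400TB of deferred primary purchases at an assumed $250/TB, 300 admin hours
# saved per year at $90/hr, against an assumed $120k solution cost:
print(round(three_year_roi(400, 250, 300, 90, 120_000), 2))  # 0.51, i.e. ~51% ROI
```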