Spectra Customer Highlight: CERN, Geneva Switzerland


By Betsy Doughty
VP, Corporate Marketing, Spectra Logic

CERN is a European research organization that runs the largest particle physics laboratory in the world. It recently deployed two fourteen-frame TFinity libraries. Spectra, recognized as an expert in long-term storage for multi-petabyte to exabyte environments, is CERN's partner in this project. Here is an excerpt from a white paper exploring CERN's mission, research, and need for a large storage solution that can continue to grow:

When the limits of human understanding are being pushed by the desire to answer some of life's most complex questions, the duration and complexity of the research demand the most secure and scalable storage available. CERN uses some of the world's largest and most complex scientific instruments to study the fundamental particles of matter, quite literally discovering the "God Particle", the Higgs boson, a missing cornerstone in our knowledge of nature. It does this with the Large Hadron Collider (LHC), the world's largest and most powerful particle accelerator. The LHC is a 27 km (16.8 mile) ring of superconducting magnets that accelerates high-energy particle beams to near the speed of light before colliding them, with the goal of understanding how the universe was formed. This research means the CERN Data Centre produces data at an astronomical rate of one petabyte per day, pushing the CERN Advanced STORage manager (CASTOR) to a massive 330PB of data stored on tape. According to CERN, this is equivalent to 2,000 years of 24/7 HD video recording.
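CERN's "2,000 years of HD video" comparison can be sanity-checked with simple arithmetic. A minimal sketch, where the ~40 Mbit/s bitrate is our own assumption for high-quality HD video, not a figure from CERN:

```python
# Sanity check: is 330 PB on tape roughly 2,000 years of 24/7 HD video?
# The HD bitrate below is an assumed value for illustration only.

PB = 10**15                       # bytes in a petabyte (decimal)
archive_bytes = 330 * PB          # CASTOR archive size from the article

hd_bitrate_bps = 40e6             # assumed high-quality HD bitrate, bits/s
bytes_per_year = hd_bitrate_bps / 8 * 365.25 * 24 * 3600

years = archive_bytes / bytes_per_year
print(f"{years:.0f} years of continuous HD recording")
```

With that assumed bitrate the result lands right around CERN's 2,000-year figure; a lower-bitrate stream would stretch the equivalent recording time even further.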

The LHC recently received a major performance upgrade that kept it shut down for two years, long enough for CERN to upgrade its data archival software and tape system to handle the much higher data volume. Once the LHC resumes its next run in 2022, data creation is expected to double, meaning an additional 600PB or more will be stored after run three ends in 2023. This will put CERN very close to, if not over, one exabyte of data on tape by the end of 2023. The LHC will then shut down again for upgrades to many of the sensors, magnets, and testing devices in the collider, and when it runs again from 2026 to 2029, we can expect an increase of five times the current level of data stored. This means that run four will produce over 1.5 exabytes of additional data that must be archived to tape.
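The growth projections above can be tallied in a quick back-of-the-envelope calculation, using only the figures stated in the article (decimal petabytes assumed):

```python
# Back-of-the-envelope check of the capacity projections in the article.
# All figures are taken from the text; units are decimal petabytes.

current_pb = 330            # on tape today under CASTOR
run3_additional_pb = 600    # expected new data by end of run three (2023)

after_run3_pb = current_pb + run3_additional_pb
print(f"After run three: {after_run3_pb} PB")     # just under 1 EB

run4_additional_pb = 1500   # article's estimate for run four (2026-2029)
after_run4_pb = after_run3_pb + run4_additional_pb
print(f"After run four: {after_run4_pb / 1000:.2f} EB")
```

The totals show why the article treats one exabyte as the 2023 threshold: 330PB plus 600PB leaves CERN at roughly 930PB, and run four's additional 1.5 exabytes pushes the archive well past 2 exabytes.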

The new CERN Tape Archive (CTA) software will replace CASTOR and will store the existing 330 petabytes of data, as well as all new data created. It is being designed to handle these massive amounts of data. The data produced at CERN is extremely valuable and must be preserved for future generations of physicists, making tape the ideal archive storage technology. The data is also shared worldwide: CERN has transferred 830PB of data and 1.1 billion files to other HEPiX research organizations around the globe. This allows other physicists to conduct research, and it also means the data is archived geographically, with multiple copies, so that it can safely be kept forever. CTA is able to store an exabyte in native capacity, and CERN is rapidly approaching the need for an exabyte of storage. CERN is counting on the industry to continue to innovate so that multiple exabytes can be stored and preserved for future generations in the coming decade, and with the roadmap of LTO tape, it looks to be a partnership that will stand the test of time.

To understand the challenges of storing data in massively large storage environments, download the white paper, “Multi-Petabyte Storage Environments: Sharing and Storing Strategies for Data-Intensive Organizations” here.

To see how Spectra Logic helps CERN advance the boundaries of human knowledge by providing a storage solution for the vast quantities of data generated by the Large Hadron Collider (LHC) and experiments elsewhere in the accelerator complex, check out our newest case study here.