July 27, 2022

SLAC expands and centralizes computing infrastructure to prepare for data challenges of the future

An extension of the Stanford Research Computing Facility will host several data centers to handle the unprecedented data streams that will be produced by a new generation of scientific projects.

By Manuel Gnida

SRCF-II
The Stanford Research Computing Facility (SRCF) on the SLAC grounds is doubling in size, preparing the lab for a new generation of scientific endeavors that require the handling of unprecedented data streams. The extension, SRCF-II (dark gray), will host several data centers, including the SLAC Shared Science Data Facility (S3DF) and the U.S. data facility (USDF) for Vera C. Rubin Observatory. (Erik Piedad/Devcon Architecture)

A computing facility at the Department of Energy’s SLAC National Accelerator Laboratory is doubling in size, preparing the lab for new scientific endeavors that promise to revolutionize our understanding of the world from atomic to cosmic scales but also require handling unprecedented data streams.

When SLAC’s superconducting X-ray laser comes online, for example, it’ll eventually accumulate data at a dizzying rate of a terabyte per second. And the world’s largest digital camera for astronomy, under construction at the lab for the Vera C. Rubin Observatory, will eventually capture a whopping 20 terabytes of data every night.
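To put those two figures on a common scale, here is a quick back-of-envelope calculation. This is illustrative arithmetic based only on the rates quoted above, not an official SLAC projection:

```python
# Back-of-envelope comparison of the two data rates quoted above.
# Illustrative only; real duty cycles and observing schedules vary.

# LCLS-II: roughly one terabyte per second at full rate
lcls2_rate_tb_per_s = 1
seconds_per_day = 24 * 60 * 60
lcls2_tb_per_day = lcls2_rate_tb_per_s * seconds_per_day  # 86,400 TB

# Rubin Observatory: about 20 terabytes per night
rubin_tb_per_night = 20
rubin_tb_per_year = rubin_tb_per_night * 365  # 7,300 TB

print(f"LCLS-II at full rate: ~{lcls2_tb_per_day / 1000:.1f} PB per day")
print(f"Rubin Observatory: ~{rubin_tb_per_year / 1000:.1f} PB per year")
```

In other words, running flat out, the X-ray laser alone could in principle produce more data in a single day than the observatory collects in a decade of nights, which is why data reduction (discussed below) matters so much.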

“The new computing infrastructure will be up for these challenges and more,” said Amedeo Perazzo, who leads the Controls and Data Systems division within the lab’s Technology Innovation Directorate. “We’re adopting some of the latest, greatest technology to create computing capabilities for all of SLAC for years to come.”

SRCF-II construction
Construction of an annex to the Stanford Research Computing Facility (SRCF) began in November 2021 and is expected to be completed in the second half of 2023.  (Olivier Bonin/SLAC National Accelerator Laboratory)

The Stanford University-led construction adds a second building to the existing Stanford Research Computing Facility (SRCF). SLAC will become a major tenant of SRCF-II – a modern data center designed to operate 24/7 without service interruptions and with data integrity in mind. SRCF-II will double the current data center capabilities, for a total of 6 megawatts of power capacity.

“Computing is a core competency for a science-driven organization like SLAC,” said Adeyemi Adesanya, head of the Scientific Computing Systems department of Perazzo’s division. “I’m thrilled to see our vision for an integrated computing facility come to life. It’s a necessity for analyzing data on massive scales, and it’ll also pave the way for new initiatives.”

A hub for SLAC’s Big Data

Adesanya’s team is preparing to set up hardware for the SLAC Shared Science Data Facility (S3DF), which will find its home inside SRCF-II. It’ll become a computing hub for all data-intensive experiments performed at the lab.

First and foremost, it’ll benefit future users of LCLS-II, the upgrade of the Linac Coherent Light Source (LCLS) X-ray laser that will produce over 8,000 times more pulses per second than the first-generation machine. Researchers hope to use LCLS-II to gain new insights into atomic processes that are fundamental to some of the most pressing challenges of our time, including the chemistry of clean energy technologies, the molecular design of drugs and the development of quantum materials and devices.

ChemRIXS
When SLAC’s superconducting X-ray laser LCLS-II comes online, it’ll fire up to a million light bursts per second, enabling unprecedented experiments, like at the new ChemRIXS instrument shown in this photo. LCLS-II will produce a terabyte of data every second that will be processed at the future SLAC Shared Science Data Facility (S3DF). (Jacqueline Ramseyer Orrell/SLAC National Accelerator Laboratory)

But with the new capabilities come tough computational challenges, said Jana Thayer, head of the LCLS Data Systems division. “To get the best science results and make the most of their time at LCLS-II, users will need fast feedback – within minutes – on the quality of their data,” she said. “To do that with an X-ray laser that produces thousands of times more data every second than its predecessor, we need the petaflops of computing power that S3DF will provide.”

Another issue researchers will have to contend with is that LCLS-II will amass too much data to store it all. The new data facility will run an innovative data reduction pipeline that throws out unnecessary data before it gets saved for analysis.
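The article doesn’t describe how the pipeline works internally. As a rough illustration of the general idea – vetoing near-empty detector shots and keeping only the pixels that carry signal – a minimal sketch might look like this. All function names and thresholds here are hypothetical, not part of the actual LCLS-II software:

```python
def reduce_shot(image, signal_threshold=50.0):
    """Hypothetical per-shot data reduction (illustrative only).

    image: 2D list of pixel intensities from one detector readout.
    Returns None if the shot is vetoed (no interesting signal),
    otherwise a sparse list of (row, col, value) for bright pixels.
    """
    brightest = max(value for row in image for value in row)
    if brightest < signal_threshold:
        return None  # veto: nothing worth storing in this shot
    # Keep only the pixels above threshold (a sparse representation)
    return [(r, c, value)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value >= signal_threshold]

# An empty shot is discarded; a shot with a bright spot is kept sparsely
empty_shot = [[0.0] * 4 for _ in range(4)]
bright_shot = [[0.0] * 4 for _ in range(4)]
bright_shot[1][2] = 120.0

assert reduce_shot(empty_shot) is None
assert reduce_shot(bright_shot) == [(1, 2, 120.0)]
```

Even this toy version shows the payoff: a mostly dark detector image shrinks from thousands of stored pixels to a handful of entries, and uninteresting shots are never written to disk at all.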

Another computationally demanding technique that will benefit from the new infrastructure is cryogenic electron microscopy (cryo-EM) of biomolecules, such as proteins, RNA or virus particles. In this method, scientists take images of how a beam of electrons interacts with a sample that contains the biomolecules. They sometimes need to analyze millions of images to reconstruct the three-dimensional molecular structure in near-atomic detail. Researchers also hope to visualize molecular components in cells, not just biochemically purified molecules, at high resolution in the future.

CryoEM
Cryogenic electron microscopy (cryo-EM) is a powerful tool to reveal the three-dimensional structures of biomolecules. Producing images like this one is a computationally challenging process that will benefit from the state-of-the art infrastructure of the future SLAC Shared Science Data Facility (S3DF). (Greg Stewart/SLAC National Accelerator Laboratory)

The complex image reconstruction process requires lots of CPU and GPU power and involves elaborate machine learning algorithms. Doing these calculations at the S3DF will bring new opportunities, said Wah Chiu, head of the Stanford-SLAC Cryo-EM Center.

“I really hope that the S3DF will become an intellectual hub for computing, where experts gather to write code that allows us to visualize increasingly complex biological systems,” Chiu said. “There is a lot of potential to discover new structural states of molecules and organelles in normal and pathological cells at SLAC.”

In fact, everyone at the lab will be able to use available computing resources. Other potential “customers” include SLAC’s instrument for ultrafast electron diffraction (MeV-UED), the Stanford Synchrotron Radiation Lightsource (SSRL), the lab-wide machine learning initiative and applications in accelerator science. All in all, the S3DF will be able to support 80% of SLAC’s computing needs, while 20% of the most demanding scientific computing will be done at supercomputer facilities offsite.

Multiple services under one roof

SRCF-II will host two other major data facilities.

LSSTCam
This photo shows the 3,200-megapixel focal plane of Rubin Observatory’s LSST Camera, under construction at SLAC. In a few years, the camera will begin taking images of the Chilean night sky, collecting 20 terabytes of data every night. An extension of the Stanford Research Computing Facility (SRCF) at SLAC will host Rubin Observatory’s U.S. data facility (USDF) for the handling of the gigantic data stream. (Jacqueline Ramseyer Orrell/SLAC National Accelerator Laboratory)

One of them is Rubin Observatory’s U.S. data facility (USDF). In a few years, the observatory will begin taking images of the Southern night sky from a mountaintop in Chile using its SLAC-built 3,200-megapixel camera. For the Legacy Survey of Space and Time (LSST), it’ll take two images every 37 seconds for 10 years. The resulting information might hold answers to some of the biggest questions about our universe, including what exactly speeds up its expansion, but that information will be contained in a 60-petabyte catalog that researchers will have to sift through. The resulting image archive will reach some 300 petabytes, dominating the storage usage in SRCF-II. The USDF, together with two other centers in the UK and France, will handle production of the enormous data catalog.
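The survey cadence and the nightly data volume quoted above can be combined into a rough per-image estimate. This is illustrative arithmetic under a stated assumption (roughly 10 observing hours per night), not an official Rubin figure:

```python
# Rough pace of the Legacy Survey of Space and Time, from the
# figures quoted above. Illustrative only; real scheduling,
# weather losses and compression make the true numbers differ.

images_per_visit = 2
seconds_per_visit = 37
images_per_hour = images_per_visit * 3600 / seconds_per_visit  # ~195

# Hypothetical assumption: about 10 observing hours per night
observing_hours_per_night = 10
images_per_night = images_per_hour * observing_hours_per_night  # ~1,946

# At ~20 TB per night, each image is on the order of 10 GB
tb_per_night = 20
gb_per_image = tb_per_night * 1000 / images_per_night

print(f"~{images_per_night:.0f} images per night")
print(f"~{gb_per_image:.1f} GB per image")
```

That order-of-magnitude size is plausible for a 3,200-megapixel detector readout, and it makes clear why the image archive, rather than the processed catalog, dominates storage in SRCF-II.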

A third data hub will serve the user community of SLAC’s first-generation X-ray laser. Existing computing infrastructure for the LCLS data analysis will gradually move to SRCF-II and become a much larger system there.

Although each data center has specific needs in terms of technical specifications, they all rely on a core of shared services: Data always need to be transferred, stored, analyzed and managed. Working closely with Stanford, Rubin Observatory, LCLS and other partners, Perazzo’s and Adesanya’s teams are setting up all three systems.

S3DF Team
The Scientific Computing Systems department, a unit of the Controls and Data Systems division within SLAC’s Technology Innovation Directorate, is creating an integrated scientific computing infrastructure for the lab. From left: Lance Nakata, Grace Tsai, Guangwei Che, Amedeo Perazzo, Renata Dart, Jon Bergman, Victor Elmir, Yee Ting Li, Riccardo Veraldi, Hoang Vu, Julieth Otero, and Adeyemi Adesanya. (Jacqueline Ramseyer Orrell/SLAC National Accelerator Laboratory)

For Adesanya, this unified approach – which includes a cost model that will help pay for future upgrades and growth – is a dream come true. “Historically, computing at SLAC was highly distributed and each facility would have its own, specialized system,” he said. “The new, more centralized approach will help stimulate new lab-wide initiatives, such as machine learning, and by breaking down the silos and converging to an integrated data facility, we’re building something that is more capable than the sum of everything we had before.”

SRCF-II construction is a Stanford project. Large parts of the S3DF infrastructure are funded by the Department of Energy’s Office of Science. LCLS and SSRL are Office of Science user facilities. Rubin Observatory is a joint initiative of the National Science Foundation (NSF) and the Office of Science. Its primary mission is to carry out the Legacy Survey of Space and Time, providing an unprecedented data set for scientific research supported by both agencies. Rubin is operated jointly by NSF’s NOIRLab and SLAC. NOIRLab is managed for NSF by the Association of Universities for Research in Astronomy and SLAC is operated for DOE by Stanford. Stanford-SLAC Cryo-EM Center (S2C2) is supported by the National Institutes of Health (NIH) Common Fund Transformative High-Resolution Cryo-Electron Microscopy program.

For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.


SLAC is a vibrant multiprogram laboratory that explores how the universe works at the biggest, smallest and fastest scales and invents powerful tools used by scientists around the globe. With research spanning particle physics, astrophysics and cosmology, materials, chemistry, bio- and energy sciences and scientific computing, we help solve real-world problems and advance the interests of the nation.

SLAC is operated by Stanford University for the U.S. Department of Energy’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.

 
