
In Texas, A Stampede of Petaflops



The Stampede supercomputer is housed in nearly 200 cabinets in a new data center at the Texas Advanced Computing Center in Austin. (Photo: TACC)

In its first days of operation, the new Stampede system at the University of Texas at Austin’s Texas Advanced Computing Center (TACC) debuted as the world’s seventh-fastest supercomputer. But there’s plenty more power in the pipeline.

For its first outing on the prestigious Top500 list, Stampede harnessed 6,400 nodes with two Intel Xeon E5 processors each, recording a speed of 2.6 petaflops. The pending addition of 6,880 Xeon Phi coprocessors, which are currently in user evaluation mode, would add more than seven additional petaflops of performance to Stampede. With a theoretical peak performance of nearly 10 petaflops (10 quadrillion mathematical calculations per second), Stampede would sit comfortably among the top four supercomputers in the world.
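The "nearly 10 petaflops" figure can be sanity-checked with back-of-the-envelope arithmetic. The sketch below is illustrative only: the per-device peak rates (8-core 2.7 GHz Xeon E5 sockets sustaining 8 double-precision flops per cycle, and roughly 1.07 teraflops per Xeon Phi card) are assumed values typical of this hardware generation, not figures stated in the article.

```python
# Back-of-the-envelope estimate of Stampede's theoretical peak performance.
# Node and card counts come from the article; per-device peak rates are
# assumptions, marked below.

NODES = 6400
SOCKETS_PER_NODE = 2
CORES_PER_SOCKET = 8      # assumption
GHZ = 2.7                 # assumption (clock rate)
FLOPS_PER_CYCLE = 8       # assumption (AVX, double precision)
PHI_CARDS = 6880
PHI_TFLOPS = 1.07         # assumption (peak per Xeon Phi card)

# cores * GHz * flops/cycle gives gigaflops; divide by 1e6 for petaflops
cpu_pflops = (NODES * SOCKETS_PER_NODE * CORES_PER_SOCKET
              * GHZ * FLOPS_PER_CYCLE) / 1e6
phi_pflops = PHI_CARDS * PHI_TFLOPS / 1e3  # teraflops -> petaflops

total = cpu_pflops + phi_pflops
print(f"CPU: {cpu_pflops:.2f} PF, Phi: {phi_pflops:.2f} PF, total: {total:.2f} PF")
# -> CPU: 2.21 PF, Phi: 7.36 PF, total: 9.57 PF
```

Under these assumed rates the base Xeon partition contributes a bit over 2 petaflops of peak, the Phi cards over 7 more, landing close to the "nearly 10 petaflops" total the article cites.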

Stampede is a massive Dell/Intel cluster, and is a centerpiece of the National Science Foundation’s investment in an integrated advanced cyberinfrastructure. The system also features NVIDIA GPUs for remote visualization, a Lustre file system, Mellanox InfiniBand networking, 270 terabytes of memory, and 14 petabytes of storage. The data center housing Stampede is 11,000 square feet and consumes an average 3 megawatts of power.

Coprocessors like Intel’s Xeon Phi supplement the performance of the primary processor, and have become a common feature in the fastest supercomputers.  Phi is the new brand for products using Intel’s Many Integrated Core (MIC) architecture for highly parallel workloads.

Yesterday Stampede was formally introduced to the public at a dedication ceremony at TACC. The system, which began operating on January 7, has successfully executed more than 450,000 computational jobs to date.

Powering New Scientific Research

The supercomputer has enabled research teams to predict where and when earthquakes may strike, how much sea levels could rise, and how fast brain tumors grow. It allows scientists and engineers to interactively share advanced computational resources, data and expertise to further research across scientific disciplines. Early research projects completed on Stampede include seismic hazard mapping, ice sheet modeling to study climate change, improving the imaging quality of brain tumors, and carbon dioxide capture and conversion.

“Stampede has been designed to support a large, diverse research community,” said TACC Director Jay Boisseau. “We’re as excited about Stampede’s comprehensive capabilities and its high usability as we are of its tremendous performance. Stampede will lead the way to major advances in all fields of science and engineering. It’s an honor to be at this intersection of advanced computing technologies and world-class science, and we thank NSF, Dell, and Intel for their roles in helping TACC design, deploy, and operate Stampede.”

Farnam Jahanian, head of the NSF Directorate for Computer and Information Science and Engineering, helped dedicate Stampede, along with U.S. House Science, Space, and Technology Committee Chairman Lamar Smith and representatives from Dell, Intel, UT Austin and TACC.

“Stampede is an important part of NSF’s portfolio for advanced computing infrastructure enabling cutting-edge foundational research for computational and data-intensive science and engineering,” said Jahanian. “Society’s ability to address today’s global challenges depends on advancing cyberinfrastructure.”

The base Stampede system has been accepted by NSF.

About the Author

John Rath is a veteran IT professional and regular contributor at Data Center Knowledge. He has served many roles in the data center, including support, system administration, web development and facility management.
