Supercomputers Extend Frontiers of Research


Supercomputing on a smartphone? An earth-human knowledge management system? Powerful computers are enabling researchers to extend the frontiers of their work in amazing and important ways. With 16.4 percent of the Top500 supercomputers in the world dedicated to research, a number of exciting projects are under way on some very powerful machines. Here’s a look at some notable projects in the pipeline:

There’s an App for That
Researchers at MIT are collaborating with staff at the Texas Advanced Computing Center to create a smartphone application that performs real-time computations backed by a supercomputer. Working with the Ranger supercomputer (#11 on the June 2010 Top500 list), researchers performed a series of computationally expensive high-fidelity simulations to generate a small “reduced order model,” which was then transferred to a Google Android phone.

“If you have sensors feeding in data to the reduced order model system, then it could solve the equation corresponding to the input data, and indicate the appropriate response in real-time based on the calculations you performed on a supercomputer,” said David Knezevic, a post-doctoral associate in Mechanical Engineering at MIT.
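The workflow Knezevic describes has two stages: expensive “offline” solves on the supercomputer compress the physics into a small reduced-order model, and the phone then solves only that tiny system “online.” Here is a minimal sketch of the idea in Python; the toy parametrized linear system, dimensions and basis size are illustrative assumptions, not MIT’s actual model:

```python
import numpy as np

# Offline stage (run once, on the supercomputer): solve the expensive
# full problem A(mu) x = b for many training parameters, then compress
# the snapshots into a small basis V via SVD (proper orthogonal
# decomposition). This toy diagonal system stands in for a real
# high-fidelity solver.
n = 500                                   # full-order dimension
A0 = np.diag(np.linspace(1.0, 10.0, n))   # parameter-independent part
A1 = np.diag(np.linspace(0.5, 1.5, n))    # parameter-dependent part
b = np.ones(n)

snapshots = [np.linalg.solve(A0 + mu * A1, b)
             for mu in np.linspace(0.0, 1.0, 20)]
U, _, _ = np.linalg.svd(np.column_stack(snapshots), full_matrices=False)
V = U[:, :5]                              # keep 5 basis vectors

# Project the operators once; these tiny matrices are all the phone
# needs to store.
A0r, A1r, br = V.T @ A0 @ V, V.T @ A1 @ V, V.T @ b

# Online stage (phone, real time): for a new sensor-driven parameter,
# solve only a 5x5 system and lift the result back to full dimension.
mu_new = 0.37
x_rom = V @ np.linalg.solve(A0r + mu_new * A1r, br)

# Compare against a full solve to check the reduced model's accuracy.
x_full = np.linalg.solve(A0 + mu_new * A1, b)
rel_err = np.linalg.norm(x_rom - x_full) / np.linalg.norm(x_full)
print(rel_err)   # small relative error at a tiny fraction of the cost
```

The online solve touches only 5×5 matrices regardless of how large the original simulation was, which is what makes real-time answers on a phone plausible.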

Supercomputing for the birds
What do you do when you have amassed so much data that it would take a single processor around 35,000 days to crunch it? The eBird project, a joint venture of the Cornell Laboratory of Ornithology and the National Audubon Society, faced this problem after receiving 48 million bird observations over the last eight years. Steve Kelling of the eBird project told Fast Company that a single processor would take about 10 days to run a single year’s worth of data on a single species, and the project has around 700 species and five years’ worth of data. The NSF has given the eBird project 100,000 hours on its TeraGrid supercomputing network to help analyze the data.
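The 35,000-day figure follows directly from the numbers Kelling quotes. A quick back-of-the-envelope check, assuming 10 processor-days per species per year of data:

```python
# Serial cost quoted by eBird: ~10 processor-days per species-year.
days_per_species_year = 10
species = 700
years = 5

serial_days = days_per_species_year * species * years
print(serial_days)                  # 35000 processor-days
print(round(serial_days / 365, 1))  # ~95.9 years on a single processor
```

Nearly a century of serial compute is exactly the kind of workload that only makes sense when spread across thousands of TeraGrid cores in parallel.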

Blue Waters – Petascale research
Earlier this summer the NCSA unveiled the National Petascale Computing Facility, where the Blue Waters supercomputer will reside in 2011. Earlier this month, retired NCSA chief science officer Bob Wilhelmson gave a keynote at the TeraGrid 2010 conference, where he shared several projects planned for Blue Waters.

The supercomputer is expected to help address scientific and social challenges: its more than 300,000 cores will simulate weather, study the formation of the first galaxies, and deepen understanding of human disease and drug development.

“Machines are just technology,” said Wilhelmson. “They live for five years and then they’re gone, replaced by something else. What does not die is the application, because it is developed and used to gain a deeper understanding of the world around us.”

Looking forward to exascale computing, Wilhelmson stressed the need for funding to meet the enormous power demands and to maintain adequate levels of application development and system support.

Sustainable Cyberinfrastructure
Also speaking at TeraGrid 2010 was Dr. Tim Killeen from the National Science Foundation (NSF). Killeen’s keynote emphasized the need for sustainable cyberinfrastructure in the geosciences and across all domains of science.

“Overall, it’s a sustained investment and a balanced approach to drive transformation in the scientific disciplines,” said Killeen. “This is what NSF wants to get to … it involves training people to address incredibly important societal challenges with all the tools at our command. It’s going to be the cyberinfrastructure that transforms the geosciences and takes it to the next level.”

Killeen challenged conference participants to think about “Earth-Cubed,” the development of an earth-human knowledge management system. A year ago the NSF and agencies from countries including Brazil, Australia, Russia, Canada, France, Germany, Great Britain and Japan agreed to work together to deliver such a system to support human action and adaptation to regional environmental change.

Speeding Genetic Research
Cox Business announced it will provide the Translational Genomics Research Institute (TGen) with a high-speed connection based on advanced, military-grade technology from Obsidian Strategics.

The Obsidian technology will move data 100 times faster between TGen and Saguaro 2, Arizona State University’s (ASU) supercomputer. Saguaro 2 is a Dell PowerEdge cluster that debuted at #83 on the Top500 list in November 2008 and stood at #332 in June 2010.

“The field of biomedical research presents one of the greatest opportunities in transferring massive amounts of data from point to point. Our Cox LightWave Service accomplishes this quickly, and with 100 percent security, over our wholly-owned network. It’s ideal for enterprises like TGen, ASU and datacenters that transmit and receive information in terabytes,” said Hyman Sukiennik, vice president, Cox Business Arizona.

TGen was recently awarded a $1.99 million grant from the National Institutes of Health to enhance the molecular identification of numerous diseases.

About the Author

John Rath is a veteran IT professional and regular contributor at Data Center Knowledge. He has served many roles in the data center, including support, system administration, web development and facility management.
