Big Data For CERN Requires a Big Network

The CERN Large Hadron Collider (LHC) generates over 100 petabytes of data every year at its home near Geneva, Switzerland. CERN's new data center in Budapest is set to be one of the first beneficiaries of a new terabit network created by GÉANT, a European data network for researchers and scientists.

John Rath

November 21, 2012

3 Min Read
The Large Hadron Collider is the world’s largest and most powerful particle accelerator (Image: CERN)

A look at the ATLAS particle detector experiment at the Large Hadron Collider (LHC), the huge particle accelerator at CERN near Geneva, Switzerland. (Photo: Image Editor via Flickr)

The CERN Large Hadron Collider (LHC) generates over 100 petabytes of data every year at its home near Geneva, Switzerland. Distributing that data to data centers around the world for analysis requires a correspondingly big network to move it.

CERN's new data center in Budapest is set to be one of the first beneficiaries of a new terabit network created by GÉANT, a European data network for researchers and scientists. It will act as an extension to CERN’s existing data center as well as provide business continuity in case of any disruptions that could affect the CERN facility. GÉANT launched its 50,000 kilometer pan-European research and education network in May, with Infinera and Imtech providing the transmission equipment and switching platform, respectively.

The Wigner Research Centre for Physics in Hungary will host CERN’s new remote data center and will process and store data from CERN for the LHC. Together with CERN it will be the first to use multiple new 100Gbps links, and support the type of big data requirements the network was built to serve.

“Having a remote site and operations places a lot of requirements on the networking solutions," said David Foster, Deputy Head of the CERN IT Department.  "Together with GÉANT and NIIF/Hungarnet, as well as our research and education and commercial partners we will be implementing state-of-the-art capabilities to connect CERN and Wigner. The GÉANT network is fundamental to our data transfer needs, and we’re delighted that we will be continuing this successful relationship.”

GÉANT’s migration to the latest transmission and switching technology is designed to support up to 2Tbps (terabits per second) of capacity across the core network. From first implementation, 500Gbps of capacity will be available across the core, delivering circuits across Europe that allow individual users to transfer data at speeds of up to 100Gbps, or multiples thereof, enabling faster collaboration on critical projects and meeting the rapidly increasing demand for data transfer.
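To put those figures in perspective, a rough back-of-envelope calculation (not from the article, and assuming decimal petabytes and steady utilization) shows why 100Gbps circuits are the practical unit for LHC-scale traffic: 100 PB per year averages out to roughly 25Gbps of sustained bandwidth before replication to multiple sites and bursty transfer patterns are taken into account.

```python
# Back-of-envelope sketch (illustrative assumptions, not CERN's actual traffic model):
# how much sustained bandwidth 100 PB/year implies, and how long a petabyte
# takes to move over a single link at full utilization.

PETABYTE_BITS = 1e15 * 8            # bits in one decimal (SI) petabyte
SECONDS_PER_YEAR = 365 * 24 * 3600

def avg_gbps(petabytes_per_year: float) -> float:
    """Average sustained rate in Gbps needed to move a yearly data volume."""
    return petabytes_per_year * PETABYTE_BITS / SECONDS_PER_YEAR / 1e9

def transfer_hours(petabytes: float, link_gbps: float) -> float:
    """Hours to move a given volume over one link at 100% utilization."""
    return petabytes * PETABYTE_BITS / (link_gbps * 1e9) / 3600

if __name__ == "__main__":
    print(f"100 PB/year average rate:   {avg_gbps(100):.1f} Gbps")
    print(f"1 PB over a 100 Gbps link:  {transfer_hours(1, 100):.1f} hours")
    print(f"1 PB over a 10 Gbps link:   {transfer_hours(1, 10):.1f} hours")
```

The same petabyte that would tie up a 10Gbps link for more than nine days can cross a 100Gbps circuit in under a day, which is why the upgraded core and multiple 100Gbps links to Wigner matter for keeping remote analysis in step with data taking.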

"The Wigner data centre is exactly the kind of power user that the upgraded network will continue to support, and we look forward to working with all the partners involved to ensure the continued success of the LHC research,” said Matthew Scott and Niels Hersoug, joint General Managers of DANTE, which operates the GÉANT network.

At the recent SC12 conference in Salt Lake City, a collaboration between Caltech, the University of Victoria and the University of Michigan demonstrated a 100Gbps interconnect linking three major LHC Tier-2 computing sites with the SC12 show floor, using all of Internet2’s Advanced Layer 2 Service links to SC12.
