CERN is making headlines for the successful launch of the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. While much of the press attention has focused on the potential risks of the experiments, there’s also an interesting technology story in the computing infrastructure required to filter, interpret and understand the enormous volumes of data being created. CERN’s LHC experiments will produce roughly 15 petabytes (15 million gigabytes) of data annually – enough to fill more than 1.7 million dual-layer DVDs a year. CERN is collaborating with institutions in 33 countries to operate a distributed computing and data storage infrastructure known as the LHC Computing Grid. There’s even an LHC@home program that lets anyone contribute their computer’s idle time to the effort.
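The DVD comparison is easy to sanity-check. As a rough sketch, assuming the standard 8.5 GB capacity of a dual-layer DVD:

```python
# Back-of-envelope check of the article's figure: 15 petabytes of annual
# LHC data expressed as a count of dual-layer DVDs.
annual_data_gb = 15_000_000   # 15 PB, per CERN's estimate
dvd_capacity_gb = 8.5         # nominal dual-layer DVD capacity (assumed)

dvds_per_year = annual_data_gb / dvd_capacity_gb
print(f"{dvds_per_year:,.0f} DVDs per year")  # roughly 1.76 million
```

That works out to just under 1.8 million discs, consistent with the “more than 1.7 million” figure above.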
CERN also has industrial-strength computing infrastructure of its own. We’ve put together a look inside the CERN Computer Center, with images of the servers and storage supporting the Large Hadron Collider. It’s not surprising that the Geneva, Switzerland facility would have industrial-strength infrastructure – it is, after all, the birthplace of the web and home to the first web server. There are also stories today looking at the role of Linux and VMware software in supporting the LHC project.