[caption id="attachment_2698" align="aligncenter" width="561"] The Large Hadron Collider (LHC), where the magic happens.[/caption]

It takes a lot of computing power to change our understanding of the universe. On July 4, physicists working for the European Organization for Nuclear Research (known as CERN) announced the discovery of a particle consistent in its behavior with the Higgs boson. Pursued by physicists for decades, the Higgs particle is considered a missing element in the Standard Model of particle physics; identify it with certainty, and you take a major step toward explaining how elementary particles acquire mass.

Determining the parameters of the new particle involved experiments conducted with the Large Hadron Collider (LHC), a giant particle accelerator, combined with a network of data centers designed to crunch epic amounts of information.

“The amount of data collected by the LHC in one year of data-taking is on the order of 3 petabytes per experiment,” Professor Giovanni Organtini of Italy’s Istituto Nazionale di Fisica Nucleare (the National Institute of Nuclear Physics), one of the scientists involved in the CMS experiment at the Large Hadron Collider, wrote in an email. “These data are spread around the world by transferring them to Tier-1 and Tier-2 computing centers specially set up in various countries.”

Roughly 100 data centers worldwide help host the data and provide compute power to the physicists working on the project, all of them connected via the LHC Computing Grid.

“Physicists prepare their analysis jobs on their own computers,” Organtini wrote. “Then, once successfully tested on a small subsample of the data, they submit the job to a central system that automatically finds the location of the required data, splits the job into sub-jobs sent to each computing center” according to where the data resides.
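The submission system Organtini describes follows a classic scatter-gather pattern: group the requested data by the center that hosts it, run one sub-job per center in parallel, then merge the results for the requesting physicist. Here is a hypothetical, heavily simplified sketch in Python; the data catalog, center names, and function names (`split_job`, `run_sub_job`) are invented for illustration and are not CERN's actual grid middleware.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical catalog mapping data blocks to the centers that host them.
# In the real LHC Computing Grid this lookup is handled by dedicated grid
# middleware, not an in-memory dict.
DATA_LOCATIONS = {
    "block-A": "tier1-cnaf",
    "block-B": "tier2-rome",
    "block-C": "tier1-cnaf",
}

def split_job(blocks):
    """Split one analysis job into sub-jobs, grouped by hosting center."""
    sub_jobs = {}
    for block in blocks:
        center = DATA_LOCATIONS[block]
        sub_jobs.setdefault(center, []).append(block)
    return sub_jobs

def run_sub_job(center, blocks):
    """Stand-in for running the analysis against one center's local data."""
    return [f"{center}:{block}:selected-events" for block in blocks]

def run_analysis(blocks):
    """Scatter sub-jobs across centers in parallel, then merge the results."""
    sub_jobs = split_job(blocks)
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(run_sub_job, center, blks)
                   for center, blks in sub_jobs.items()]
        # Gather: merge each center's partial output for the requestor.
        merged = []
        for future in futures:
            merged.extend(future.result())
    return merged
```

The key design point, mirrored in the quote above, is that the job travels to the data rather than the other way around: moving a small analysis program to each center is far cheaper than shipping petabytes of collision data to the physicist.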
Jobs are run in parallel within each data center to extract the most interesting bits of data; those results are then merged and sent back to the original requestor, typically as files on remote virtual disks.

“All the infrastructure has been built on top of the Linux OS, found to be the easiest to manage and the most efficient for our purposes,” Organtini added. “In many cases, as in Rome, computer centers dedicated to the LHC were built from scratch.”

Despite the discovery of a particle that might be the Higgs boson, there’s still a lot of work to be done. “There are still many open questions not answered by the current theory,” according to Organtini. “That's why several new models have been elaborated, and we are still trying to confirm or exclude some of them. Every new model predicts new phenomena that should be observed at the LHC, whose program is still vast and is expected to last for the next 10 years.”

In other words, much more theory-altering and data-crunching awaits.

Image: CERN