Monday, April 25, 2016

CERN provides 300 TB of particle collision data – Tech & Net

CERN – the European Organization for Nuclear Research – has just released 300 terabytes of experimental data from the LHC, the world's largest particle collider. Despite the huge volume of information, the release covers only about half of the data from experiments carried out during 2011.

In addition to releasing information that will allow further study of the fundamental particles of matter, CERN is also providing its data-analysis software for download: the virtual-machine-based tool (CernVM) that scientists use at the complex located on the Franco-Swiss border. The 300 terabytes of released data come in two different forms, primary datasets and derived datasets. The former are the same data used by CERN's own researchers, while the latter are intended for a wider audience.
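For readers who want a quick look at the derived datasets without setting up the CernVM image, the short Python sketch below shows one possible way to load a CSV-format derived dataset from the CERN Open Data portal with pandas. The record URL and file name in it are placeholders rather than an actual dataset path, and pandas is an assumption of this sketch, not a tool mentioned by CERN; browse opendata.cern.ch for the real records.

    # Minimal sketch: reading one of the CSV-format derived datasets with pandas.
    # The URL below is a placeholder (hypothetical record number and file name);
    # look up real records on the CERN Open Data portal, opendata.cern.ch.
    import pandas as pd

    CSV_URL = "http://opendata.cern.ch/record/XXXX/files/example_derived_events.csv"

    def load_derived_dataset(url: str) -> pd.DataFrame:
        """Download a derived CSV dataset and return it as a pandas DataFrame."""
        events = pd.read_csv(url)
        print(f"Loaded {len(events)} events with columns: {list(events.columns)}")
        return events

    if __name__ == "__main__":
        events = load_derived_dataset(CSV_URL)
        # A dimuon-style derived dataset would typically include an invariant-mass
        # column that can be histogrammed, e.g. events["M"].hist(bins=100).

The primary datasets, by contrast, are stored in the experiment's own formats and are meant to be analysed inside the CernVM environment with CERN's software.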

This is the second time CERN has released data from its research, but in 2014 it provided only 17 terabytes, a far cry from the stunning 300 terabytes released now. With the virtual machine also available for download, researchers from around the world can thus access data that those working at CERN did not get to analyse, resulting from experiments with CMS – the Compact Muon Solenoid, the detector built around a coiled solenoid magnet that allows the study of the muon particle.

“Members of the CMS Collaboration put in a great deal of effort and thousands of person-hours each in order to operate the CMS detector and collect these research data for our analysis,” explains Kati Lassila-Perini, the CMS physicist who leads these data-preservation efforts. “However, once we have exhausted our exploration of the data, we see no reason not to make them available to the public. The benefits are numerous, from inspiring high-school students to become the particle physicists of tomorrow. And personally, as coordinator for the preservation of the CMS data, this is a crucial part of ensuring the long-term availability of our research data.”
