The CMS Collaboration at CERN has released more than 300 terabytes (TB) of high-quality open data. These include over 100 TB, or 2.5 inverse femtobarns (fb⁻¹), of data from proton collisions at 7 TeV, making up half the data collected at the LHC by the CMS detector in 2011. This follows a previous release from November 2014, which made available around 27 TB of research data collected in 2010.
Available on the CERN Open Data Portal — which is built in collaboration with members of CERN’s IT Department and Scientific Information Service — the collision data are released into the public domain under the CC0 waiver and come in two types. The so-called “primary datasets” are in the same format used by the CMS Collaboration to perform research. The “derived datasets”, on the other hand, require far less computing power and can be readily analysed by university or high-school students; CMS has provided a limited number of datasets in this format.
Notably, CMS is also providing the simulated data generated with the same software version that should be used to analyse the primary datasets. Simulations play a crucial role in particle-physics research, and CMS is also making available the protocols for generating the simulations that are provided. The data release is accompanied by analysis tools and code examples tailored to the datasets. A virtual-machine image based on CernVM, which comes preloaded with the software environment needed to analyse the CMS data, can also be downloaded from the portal.

These data are being made public in accordance with CMS’s commitment to long-term data preservation and as part of the collaboration’s open-data policy. “Members of the CMS Collaboration put in lots of effort and thousands of person-hours each of service work in order to operate the CMS detector and collect these research data for our analysis,” explains Kati Lassila-Perini, a CMS physicist who leads these data-preservation efforts. “However, once we’ve exhausted our exploration of the data, we see no reason not to make them available publicly. The benefits are numerous, from inspiring high-school students to the training of the particle physicists of tomorrow. And personally, as CMS’s data-preservation co-ordinator, this is a crucial part of ensuring the long-term availability of our research data.”
The scope of open LHC data has already been demonstrated with the previous release of research data. A group of theorists at MIT wanted to study the substructure of jets — showers of hadron clusters recorded in the CMS detector. Since CMS had not performed this particular research, the theorists got in touch with the CMS scientists for advice on how to proceed. This blossomed into a fruitful collaboration between the theorists and CMS revolving around CMS open data. “As scientists, we should take the release of data from publicly funded research very seriously,” says Salvatore Rappoccio, a CMS physicist who worked with the MIT theorists. “In addition to showing good stewardship of the funding we have received, it also provides a scientific benefit to our field as a whole. While it is a difficult and daunting task with much left to do, the release of CMS data is a giant step in the right direction.”
Further, a CMS physicist in Germany tasked two undergraduates with validating the CMS Open Data by reproducing key plots from some highly cited CMS papers that used data collected in 2010. Using openly available documentation about CMS’s analysis software and with some guidance from the physicist, the students were able to re-create plots that look nearly identical to those from CMS, showing what can be achieved with these data. “I was pleasantly surprised by how easy it was for the students to get started working with the CMS Open Data and how well the exercise worked,” says Achim Geiser, the physicist behind this project. Simplified example code from one of these analyses is available on the CERN Open Data Portal, and more is on its way.
Prior to the launch of the CERN Open Data Portal with the first batch of research-quality data from CMS, the Collaboration had provided certain curated datasets for use in high-school workshops. These “masterclasses”, developed by QuarkNet and conducted under the aegis of the International Particle Physics Outreach Group, bring particle-physics data to thousands of high-school students each year. These educational datasets are also available on the CERN Open Data Portal, along with an “event display” for visualising the particle-collision events.
“We are very pleased that we can make all these data publicly available,” adds Kati. “We look forward to how they are utilised outside our collaboration, for research as well as for building educational tools.”
Exotic subatomic particle confirmed at Large Hadron Collider after earlier false sightings.
An exotic particle made up of five quarks has been discovered a decade after experiments seemed to rule out its existence.
The short-lived ‘pentaquark’ was spotted by researchers analysing data on the decay of unstable particles in the LHCb experiment at the Large Hadron Collider (LHC) at CERN, Europe’s particle-physics laboratory near Geneva. The finding, says LHCb spokesperson Guy Wilkinson, opens a new era in physicists’ understanding of the strong nuclear force that holds atomic nuclei together.
“The pentaquark is not just any new particle — it represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before,” he says. “Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”
This animation shows how the Large Hadron Collider (LHC) works.
The film begins with an aerial view of CERN near Geneva, with outlines of the accelerator complex, including the underground Large Hadron Collider (LHC), 27 km in circumference. The positions of the four largest LHC experiments (ALICE, ATLAS, CMS and LHCb) are revealed before we see protons travelling around the LHC ring.
The proton source is a simple bottle of hydrogen gas. An electric field is used to strip hydrogen atoms of their electrons to yield protons. Linac 2, the first accelerator in the chain, accelerates the protons to an energy of 50 MeV. The beam is then injected into the Proton Synchrotron Booster (PSB), which accelerates the protons to 1.4 GeV, followed by the Proton Synchrotron (PS), which pushes the beam to 25 GeV. Protons are then sent to the Super Proton Synchrotron (SPS), where they are accelerated to 450 GeV.
The protons are finally transferred to the two beam pipes of the LHC. The beam in one pipe circulates clockwise while the beam in the other pipe circulates anticlockwise, increasing in energy until they reach 6.5 TeV. Beams circulate for many hours inside the LHC beam pipes under normal operating conditions. The two beams are brought into collision inside four detectors – ALICE, ATLAS, CMS and LHCb – where the total energy at the collision point is equal to 13 TeV.
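As an aside, the beam energies quoted above can be turned into proton speeds with basic relativistic kinematics. A minimal sketch, treating the quoted figures as kinetic energies (exact at Linac 2 energies; at the higher stages kinetic and total energy are nearly the same) and using a proton rest energy of about 938.272 MeV:

```python
import math

PROTON_REST_ENERGY_GEV = 0.938272  # proton rest energy in GeV

def beta_for_kinetic_energy(kinetic_energy_gev: float) -> float:
    """Speed (as a fraction of the speed of light c) of a proton
    with the given kinetic energy."""
    gamma = 1.0 + kinetic_energy_gev / PROTON_REST_ENERGY_GEV  # Lorentz factor
    return math.sqrt(1.0 - 1.0 / gamma ** 2)

# Beam energies along the chain, as quoted above (in GeV)
chain = {"Linac 2": 0.050, "PSB": 1.4, "PS": 25.0, "SPS": 450.0, "LHC": 6500.0}

for stage, energy_gev in chain.items():
    print(f"{stage}: {beta_for_kinetic_energy(energy_gev):.9f} c")
```

Already at the Linac 2 stage the protons move at roughly a third of the speed of light; by the time they reach the LHC they are within a few parts per hundred million of it.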
Collisions occur once every 25 nanoseconds; trigger level 1 performs ultrafast event selection before data move to trigger levels 2 and 3 at the PC farm. Selected event data are then sent to the CERN data centre, which performs initial data reconstruction and makes a copy of the data for long-term storage, while raw and reconstructed data are sent to the Computing Grid. The Worldwide LHC Computing Grid infrastructure includes two “Tier 0” sites, one at CERN and one in Budapest, Hungary, as well as further smaller computing sites located around the world.
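The 25-nanosecond bunch spacing implies a 40 MHz crossing rate, which is why the trigger has to discard the overwhelming majority of events. A back-of-envelope sketch, where the number of events kept for storage is an illustrative assumption rather than an official figure:

```python
# Bunch spacing and crossing rate, from the figures in the text
bunch_spacing_s = 25e-9                   # 25 nanoseconds between crossings
crossing_rate_hz = 1.0 / bunch_spacing_s  # = 40 million crossings per second

# Illustrative assumption (not an official figure): suppose the full
# trigger chain keeps about 1,000 events per second for permanent storage.
kept_events_per_s = 1_000
rejection_factor = crossing_rate_hz / kept_events_per_s

print(f"crossing rate: {crossing_rate_hz / 1e6:.0f} MHz")
print(f"only ~1 in {rejection_factor:,.0f} crossings would be kept")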
As collision data accumulate, physicists build up enough statistics to test theoretical predictions, such as the prediction of a Higgs boson, discovered in the data from the LHC’s first physics run (shown as a bump in the graphs in the animation). The LHC allows physicists to probe the nature of matter. The new, higher collision energy of 13 TeV opens up new frontiers in particle physics.
Directors: Daniel Dominguez, Arzur Catel Torres
Music: F_Fact_-_State_of_Mind_(_psystep_vers._of_the_beach) by “Platinum Butterfly” CC BY 3.0
Two years ago, the Higgs boson was discovered by the ATLAS and CMS experiments. But how precisely does it fill its role as the last missing piece in the Standard Model of particle physics?
The Large Hadron Collider will restart in 2015 with almost double the collision energy to test just that. But even then, this theory only accounts for 5% of the Universe, and does not include gravity. Can the LHC shed light on the origin of dark matter? Why is gravity so much weaker than the other forces? Dr Pippa Wells explains how the LHC will explore these mysteries of matter.
Pippa Wells was the Inner Detector System Project Leader on the ATLAS Experiment at CERN. ATLAS is one of two general-purpose detectors at the Large Hadron Collider (LHC). It investigates a wide range of physics, from the search for the Higgs boson to extra dimensions and particles that could make up dark matter.
Yes, that’s correct: photon collider.
The Large Hadron Collider is known for smashing together protons. The energy from these collisions gets converted into matter, producing new particles that allow us to explore the nature of our Universe. The protons are not fired at one another individually; instead, they are circulated in bunches inside the LHC, each bunch containing some 100 billion (100,000,000,000) particles. When two bunches cross each other in the centre of CMS, a few of the protons — around 25 or so — will collide with one another. The rest of the protons continue flying through the LHC unimpeded until the next time two bunches cross.
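The figures in this paragraph make it easy to see how rarely any individual proton actually collides. A quick sketch using the approximate numbers quoted above:

```python
protons_per_bunch = 100_000_000_000  # ~100 billion protons in each bunch
collisions_per_crossing = 25         # typical proton-proton collisions per crossing

# Each collision uses up one proton from each of the two crossing bunches,
# so the fraction of a bunch that collides in any one crossing is tiny.
fraction_colliding = collisions_per_crossing / protons_per_bunch
print(f"fraction of a bunch colliding per crossing: {fraction_colliding:.1e}")
```

In other words, only about one proton in four billion is used up per crossing; the rest of the bunch survives to cross again.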
Sometimes, something very different happens. As they fly through the LHC, the accelerating protons radiate photons, the quanta of light. If two protons going in opposite directions fly very close to one another within CMS, photons radiated from each can collide together and produce new particles, just as in proton collisions. The two parent protons remain completely intact but recoil as a result of this photon-photon interaction: they get slightly deflected from their original paths but continue circulating in the LHC. We can determine whether the photon interactions took place by identifying these deflected protons, thus effectively treating the LHC as a photon collider and adding a new probe to our toolkit for exploring fundamental physics.
Speaking this morning at a press conference held during the 37th International Conference on High Energy Physics (ICHEP) in Valencia, Spain, CERN Director-General Rolf Heuer summarized the results being presented from CERN.
The conference, which began last Thursday with three days of parallel sessions, now moves on to plenary sessions until Wednesday, summing up the current state of the art in the field. The plenary sessions will be webcast.
“Two years on from the last ICHEP conference, during which the discovery of the messenger of the Brout-Englert-Higgs mechanism, a Higgs boson, was announced, this topic is still a strong focus of the presentations from CERN,” said Heuer. “But for me, the main message I’m taking away from this conference is that there’s a lot at stake for the LHC’s second run starting next year, and the experiments are all ready to exploit the full potential that higher-energy running brings.”
All four LHC experiments presented new results from the LHC’s first run, which concluded in 2013. For ATLAS and CMS, the run-1 Brout-Englert-Higgs (BEH) analyses are reaching a conclusion. All show that the Higgs particle behaves in a way consistent with the Standard Model: the theory that accounts for the behaviour of the fundamental particles of matter and the interactions at work between them. Nevertheless, based on the run-1 sample, the BEH analyses do not rule out new physics, and with a much higher Higgs production rate at higher energy, run-2 BEH physics holds much promise. The Standard Model describes the behaviour of what we consider to be ordinary matter to great precision, but we know that ordinary matter makes up only about 5% of the total matter and energy of the universe: there’s much more to be discovered in the so-called dark universe of dark matter and dark energy.
One possible explanation for dark matter is supersymmetry, a theory that predicts a range of so-far-unobserved particles, some of which could make up the 27% of the universe composed of dark matter. Through run 1, the LHC experiments have ruled out a number of supersymmetric models, but more possibilities will be within reach in run 2.
Spearheaded by the ALICE experiment, which is dedicated to exploring quark-gluon plasma, the hot, dense state of matter that would have existed just after the big bang, all the LHC experiments have delivered new insights into this exotic form of matter. And LHCb, the experiment that specializes in measuring short-lived particles with great precision, presented a range of results showing the power of the LHCb detector in contributing to a wide range of topics, from quark-gluon plasma to matter-antimatter asymmetry.
After 18 months of maintenance and upgrading, the CERN accelerator complex is now starting up for physics. Research programmes at all the accelerators with the exception of the LHC will be underway in 2014, with the LHC joining in spring 2015.