The CERN Data Centre is the heart of CERN’s entire scientific, administrative and computing infrastructure. All services, including email, scientific data management and videoconferencing, use equipment based here.

The 230 000 processor cores and 15 000 servers run 24/7. Over 90% of the resources for computing in the Data Centre are provided through a private cloud based on OpenStack, an open-source project to deliver a massively scalable cloud operating system.

CERN is one of the most demanding computing environments in the research world. The World Wide Web was originally conceived and developed at CERN to meet the demand for automated information-sharing between scientists in universities and institutes around the world. Computing is at the heart of CERN’s infrastructure: software development, data processing and storage, networks, support for the LHC and non-LHC experimental programmes, automation and controls, and services for the accelerator complex, the whole laboratory and its users.

The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of thousands of physicists near real-time access to LHC data. The CERN data centre is at the heart of WLCG, the first point of contact between experimental data from the LHC and the grid. Through CERN openlab, a unique public-private partnership, CERN collaborates with leading ICT companies and other research organisations to accelerate the development of cutting-edge ICT solutions for the research community.

Storage – What data to record?

Collisions in the LHC generate particles that often decay in complex ways into even more particles. Electronic circuits record the passage of each particle through a detector as a series of electronic signals, and send the data to the CERN Data Centre for digital reconstruction. The digitised summary is recorded as a ‘collision event’. Up to about 1 billion particle collisions can take place every second inside the LHC experiments’ detectors. It is not possible to read out all of these events. A ‘trigger’ system is therefore used to filter the data and select those events that are potentially interesting for further analysis.
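Conceptually, a trigger is a fast predicate applied to each event, keeping only those worth recording. The following toy Python sketch illustrates that idea only; the event fields, the energy threshold and the random energies are invented for illustration and bear no relation to the real multi-level LHC trigger criteria:

```python
import random

random.seed(7)

def passes_trigger(event, energy_threshold=50.0):
    """Toy trigger decision: keep an event only if its total deposited
    energy exceeds a threshold. A stand-in for the real hardware and
    software trigger levels, which apply far more complex selections."""
    return event["energy"] >= energy_threshold

# Simulate a burst of collision events with random energy deposits (GeV).
events = [{"id": i, "energy": random.uniform(0.0, 100.0)}
          for i in range(100_000)]

# Filter: only triggered events would be sent on for reconstruction.
selected = [e for e in events if passes_trigger(e)]
print(f"kept {len(selected)} of {len(events)} events "
      f"({100 * len(selected) / len(events):.1f}%)")
```

In the real system this decision must be made in microseconds, which is why the first trigger level is implemented in custom hardware rather than in software like this sketch.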
Even after the drastic data reduction performed by the experiments, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. The LHC experiments produce about 90 petabytes of data per year, and other (non-LHC) experiments at CERN produce an additional 25 petabytes per year. Archiving these vast quantities of data is an essential function at CERN. Magnetic tapes are used as the main long-term storage medium, and data from the archive is continuously migrated to newer, higher-density tape technology.
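As a quick sanity check on these figures, a back-of-the-envelope sketch using only the numbers quoted above:

```python
# Data-volume arithmetic from the figures quoted in the text.
PB_IN_GB = 10**6  # one petabyte expressed in gigabytes

lhc_pb_per_year = 90       # PB/year produced by the LHC experiments
non_lhc_pb_per_year = 25   # PB/year from other (non-LHC) experiments
processed_pb_per_day = 1   # PB/day processed by the Data Centre

total_archived = lhc_pb_per_year + non_lhc_pb_per_year
processed_per_year = processed_pb_per_day * 365

print(f"archived per year:  ~{total_archived} PB "
      f"({total_archived * PB_IN_GB:,} GB)")
print(f"processed per year: ~{processed_per_year} PB")
```

Note that the processing figure exceeds the production figure: the Data Centre repeatedly reprocesses and redistributes stored data, so the daily processing load is not simply new data arriving from the detectors.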
The CERN storage system, EOS, was created for the extreme computing requirements of the LHC. EOS instances at CERN exceed three billion files (as of 2019), matching the exceptional performance of the LHC machine and experiments. EOS is now expanding to data-storage needs beyond high-energy physics: AARNET (the Australian Academic and Research Network) and the EU Joint Research Centre for Digital Earth and Reference Data have adopted it for their big-data systems.