This website is no longer maintained. We assume no liability for its content and links. Please visit the website of the successor cluster ORIGINS.

Atomic and Subatomic Physics: Swiftly sifting the data

18.12.2018 —

Large-scale experiments in fundamental research require more and more computing and storage resources. German physicists have now joined forces to develop innovative digital processing methods.

What are the fundamental building blocks of nature? How did the universe come into being and how did it evolve? Scientists are investigating these fundamental questions with very different methods. At the Large Hadron Collider (LHC) at CERN and at the Belle II experiment in Japan, they are searching for new elementary particles in proton-proton or electron-positron collisions. At the FAIR accelerator (Facility for Antiproton and Ion Research), researchers hope to produce compressed neutron-star matter, the source of the heavy elements in the cosmos, in the laboratory within a few years. At the Pierre Auger Observatory in Argentina, scientists gain insights into astrophysical and cosmological processes by detecting high-energy cosmic rays.

50 million gigabytes of data annually at CERN alone

Despite using different methods and probing different scientific questions, these researchers all face a common challenge. The ever higher resolution of the measuring instruments and the increased performance of the accelerators promise new scientific findings, but they also cause data volumes to grow rapidly. The experiments at CERN, for example, already generate around 50 petabytes of data per year. That is around 50 million gigabytes, an almost unimaginable amount of data. Stored on some 10 million DVDs, the stack of discs would be 13 kilometres high. And that is by no means the end of the line. "In the next 10 years, we expect an increase in data volumes by a factor of 50 due to the further development of detectors and accelerators," says Prof. Thomas Kuhr of LMU.
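As a rough plausibility check, the DVD comparison quoted above can be reproduced with a few lines of Python. The assumed DVD capacity (4.7 GB) and disc thickness (1.2 mm) are standard single-layer values, not figures from the project.

# Back-of-the-envelope check of the data volume comparison above.
annual_data_gb = 50e6        # ~50 petabytes per year, i.e. 50 million gigabytes
dvd_capacity_gb = 4.7        # assumed single-layer DVD capacity
dvd_thickness_mm = 1.2       # assumed thickness of one disc

n_dvds = annual_data_gb / dvd_capacity_gb
stack_height_km = n_dvds * dvd_thickness_mm / 1e6

print(f"DVDs needed:  {n_dvds / 1e6:.1f} million")   # about 10.6 million
print(f"Stack height: {stack_height_km:.1f} km")     # about 12.8 km, roughly 13 km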

New developments in storage and processor technology cannot hope to keep pace with this growth in storage requirements. If researchers are to keep analysing their data, completely new computing concepts will be required. This is why scientists engaged in particle physics, hadron and nuclear physics, and astroparticle physics have joined forces to form an interdisciplinary network.

Joint project brings together core competences

Within the framework programme Erforschung von Universum und Materie (ErUM, "Exploration of the Universe and Matter"), the Federal Ministry of Education and Research (BMBF) is funding this network through the pilot project "Innovative Digital Technologies for Research on Universe and Matter", providing a total of 3.6 million euros over the next three years. The participating researchers will contribute their diverse experience and knowledge of distributed computing infrastructures and algorithm development to the project. They belong to research groups based at the Universities of Aachen, Erlangen-Nuremberg, Frankfurt am Main, Freiburg, Hamburg, Mainz, Munich and Wuppertal and the Karlsruhe Institute of Technology, as well as the associated partners DESY (Deutsches Elektronen-Synchrotron), CERN, Forschungszentrum Jülich, the Grid Computing Centre Karlsruhe (GridKa), the GSI Helmholtzzentrum für Schwerionenforschung and the Universities of Bonn, Göttingen and Münster. Prof. Thomas Kuhr of LMU will coordinate the network.

Over the next three years, the joint project will develop and test new computing systems. One promising approach is the use of virtualization technologies to tap previously inaccessible resources. The scientists are also considering new processor architectures, such as those used in graphics cards, which promise better energy efficiency (Green IT). In addition, the researchers regard the development of improved algorithms and the use of artificial intelligence (AI) for Big Data analyses as an important component of the project. Innovative machine-learning methods will play an important role here.
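To give a sense of what such a machine-learning analysis looks like in practice, the following minimal Python sketch trains a classifier to separate "signal" from "background" events. The data and features are entirely synthetic and invented for illustration; real analyses use reconstructed event quantities from the experiments.

# Minimal sketch: classifying synthetic "events" with machine learning.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)

# Toy events with two features each; signal and background are drawn
# from slightly shifted Gaussian distributions (purely illustrative).
n_events = 10_000
background = rng.normal(loc=0.0, scale=1.0, size=(n_events, 2))
signal = rng.normal(loc=1.0, scale=1.0, size=(n_events, 2))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n_events), np.ones(n_events)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")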

All scientific fields face the digital challenge

"The huge amounts of data are a great challenge for us. Innovative digital methods will be indispensable in the future if fundamental research is to advance decisively," said network coordinator Prof. Thomas Kuhr. But it is not only physical research that faces the digital challenge. "Sooner or later, other scientific disciplines will also need powerful computing environments and will benefit from the new competences," Kuhr emphasises. The joint project offers the participating young scientists an excellent opportunity to acquire a comprehensive knowledge of new computing technologies. This means they will be well prepared to fill leading positions in science or business in order to drive digital change forward.

In addition to coordinating the project, LMU will be responsible for accelerating simulation calculations with deep neural networks, initially using the Belle II experiment as an example. LMU scientists are also developing new methods for efficiently accessing huge amounts of data over long distances. "Only if remote data can be accessed quickly and easily can unused resources be used for all experiments," says Dr. Günter Duckeck, an LMU scientist involved in the ATLAS experiment.
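The principle behind accelerating simulations with neural networks can be sketched in a few lines of Python: an expensive simulation step is evaluated once to produce training data, and a network trained on that data then stands in for it. The function and network below are purely illustrative placeholders, not the methods actually used for Belle II.

# Sketch: a neural-network surrogate that replaces a slow simulation step.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(seed=1)

def expensive_simulation(x):
    # Stand-in for a costly detector simulation (hypothetical).
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

# Run the slow simulation once to obtain training data.
X_train = rng.uniform(-1, 1, size=(5_000, 2))
y_train = expensive_simulation(X_train)

# Train a small fully connected network as a fast surrogate.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2_000)
surrogate.fit(X_train, y_train)

# The trained surrogate now replaces the simulation for new inputs.
X_new = rng.uniform(-1, 1, size=(5, 2))
print("simulation:", expensive_simulation(X_new))
print("surrogate: ", surrogate.predict(X_new))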

Contact:
Prof. Dr. Thomas Kuhr, LMU
Phone: +49 (0) 89/35831-7174
Email: Thomas.Kuhr@lmu.de

The complexity of data collection: The Belle II detector. Credit: Van Than Dong/Belle II Collaboration
