[introductory/intermediate] Generative Models in High Energy Physics: Examples from CERN
Generative models are among the most successful deep learning methods studied today. Their range of applicability is very wide, spanning simulation, anomaly detection, data augmentation, compression, and more. The High Energy Physics community has a long tradition of using machine learning to solve specific tasks, in particular the efficient selection of interesting events over the overwhelming background produced at colliders such as the LHC. In recent years, several studies have demonstrated the benefit of deep learning techniques, and building on these examples, many High Energy Physics experiments are now working to integrate deep learning into their workflows for different applications: from pattern recognition, to real-time selection of interesting collision events, to simulation and data analysis. Initial tests of generative models, in particular, have shown promising results in several areas. This course covers key aspects of generative modelling, describing recent algorithmic developments while highlighting the specific challenges of applying generative modelling in a scientific environment. Example applications from the Large Hadron Collider experiments at CERN will be presented and used throughout the course as case studies.
Introduction and background
Deep generative models
• Fully observed likelihood-based models
• Latent variable models
• Implicit generative models
• Energy-based models
Practical applications in High Energy Physics
Uncertainty estimation, results validation, and integration into the scientific computing workflow
- Stefano Ermon and Yang Song, "Deep Generative Models", course CS236, Stanford, 2021
- David Foster, "Generative Deep Learning", O'Reilly, 2019
- Ian Goodfellow, "NIPS 2016 Tutorial: Generative Adversarial Networks", arXiv:1701.00160
- Shakir Mohamed and Danilo Rezende, "Tutorial on Deep Generative Models", Uncertainty in Artificial Intelligence, July 2017
- I. Goodfellow, Y. Bengio and A. Courville, "Deep Learning", MIT Press, 2016
Basic statistics and a basic knowledge of machine learning. Familiarity with common learning primitives (convolutions, recurrent neural networks, graph neural networks, …).
Dr. Sofia Vallecorsa is a CERN physicist with extensive experience in software development in the High Energy Physics domain. She obtained her PhD at the University of Geneva and has worked on several experiments, from CDF to IceCube and ATLAS. Dr. Vallecorsa coordinates the Quantum Computing area of the recently established CERN Quantum Technology Initiative. She is also responsible for Deep Learning and Quantum Computing research within CERN openlab (http://openlab.cern.ch), a unique public-private partnership between CERN and leading ICT companies. Before joining openlab, Dr. Vallecorsa was responsible for the development of deep-learning-based technologies for the simulation of particle transport through detectors at CERN, and she worked on code modernization projects in the field of Monte Carlo simulation.