Wahid Bhimji
Deep Learning on Supercomputers for Fundamental Science [virtual]
Summary
Deep learning is already being used to drive new insights in the fundamental sciences. The field is on the verge of having a transformative impact, but achieving this will require not only novel deep learning approaches but also the exploitation of large-scale computing resources. This talk will highlight particular applications of deep learning for fundamental science, including materials science, climate, cosmology, and particle physics. We will explain how these applications have exploited supercomputers to achieve their current insights, and what future directions will be needed for AI to truly uncover the fundamental secrets of the universe.
Short bio
Wahid Bhimji currently leads the Data, AI and Analytics Services Group at NERSC, Lawrence Berkeley National Laboratory: the mission HPC center for the US Department of Energy, Office of Science. He has been responsible for aspects of AI applications, system design, software, and training for "Perlmutter", the newest supercomputer at NERSC, which has over 6000 A100 GPUs. His interests in deep learning include generative models, probabilistic programming, large-compute-scale model training, and applications to fundamental science, particularly high-energy physics. He is also involved in optimizing many other aspects of scientific big-data workflows running on high-performance computing resources. Wahid was previously heavily involved in data management and analysis for the Large Hadron Collider at CERN and for the UK government, and holds a Ph.D. in high-energy particle physics.