Supercomputing and the scientist: How HPC and large-scale data analytics are transforming experimental science
Debbie Bard, Group Lead for Data Science Engagement, National Energy Research Scientific Computing Center
Abstract: Computing has been an important part of the scientist's toolkit for decades, but the increasing volume and complexity of scientific datasets are transforming the way we think about the use of computing for experimental science. NERSC is the mission computing center for the US Department of Energy Office of Science, and runs some of the most powerful computers on the planet. While we traditionally think of supercomputers as tools for theoretical simulations of complex physical systems, I will discuss how supercomputing at NERSC is being leveraged in experimental science to revolutionize the way we collect and analyze data in fields as diverse as particle physics, cosmology, materials science and structural biology. Through these case studies and real-life challenges, I will describe how supercomputing is solving some of the critical problems faced in experiment design and operation; the management of data, metadata and provenance; and the application of advanced analytic techniques to complex multi-dimensional datasets.
Bio: Debbie Bard leads the Data Science Engagement Group at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. NERSC is the mission supercomputing center for the US Department of Energy, and supports over 7000 scientists and 700 projects with their supercomputing needs.
A native of the UK, she has built a career spanning research in particle physics, cosmology and computing on both sides of the Atlantic. She obtained her PhD at Edinburgh University, and worked at Imperial College London and SLAC National Accelerator Laboratory in the USA before joining the Data Department at NERSC, where she focuses on data-intensive computing and research, including supercomputing for experimental science and machine learning at scale.