
Nachman Group: AI for Fundamental Physics

Viewing fundamental physics through the lens of modern machine learning.

We develop, adapt, and deploy novel physics and statistics methods to particle, nuclear, and astrophysics.


Across the physical sciences, there has been a paradigm shift from a theory-driven to a data-driven era. In this new regime, we let the data speak for themselves by using modern machine learning tools that were unimaginable before the deep learning revolution of the last decade. At the same time, the physical sciences face unique challenges that require dedicated solutions to maximize the potential for discovery. Now, more than ever, we need a new kind of researcher: a phystatistician (like a biostatistician) or a data physicist (like a data scientist).

Dr. Benjamin Nachman is a data physicist, fusing particle physics theory and experiment with statistics and modern machine learning.

He is an Associate Professor of Particle Physics and Astrophysics and, by courtesy, of Physics and of Statistics at Stanford University and SLAC National Laboratory. Nachman is a co-director of the Stanford Data Science Center for Decoding the Universe and a senior member of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC). His team of data physicists develops and deploys novel physics and statistics methods for particle physics, nuclear physics, astrophysics, and beyond.