Mar 2 – 4, 2026
Karlsruhe Institute of Technology
Europe/Berlin timezone

Geometric Optimization in Scientific Machine Learning

Mar 2, 2026, 6:24 PM
3m
Triangel

Kaiserstraße 93, 76133 Karlsruhe

Speaker

Johannes Müller

Description

We discuss an “optimize-then-project” approach to problems in scientific machine learning. The key idea is to design algorithms at the infinite-dimensional level and subsequently discretize them in the tangent space of the neural network ansatz, in the style of natural gradient methods. We illustrate this approach in the context of the variational Monte Carlo method for quantum many-body problems, where neural quantum states have recently emerged as powerful representations of high-dimensional wavefunctions. In this setting, we recover the celebrated stochastic reconfiguration algorithm, interpreting it as a projected Riemannian $L^2$ gradient descent method. We further explore extensions to Riemannian Newton methods and conclude with considerations regarding the scalability of these schemes.
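To make the optimize-then-project idea concrete, here is a minimal NumPy sketch (a toy illustration, not the speaker's code) of a stochastic-reconfiguration-style update: the Euclidean gradient is preconditioned by the Gram (metric) matrix of the ansatz's tangent-space basis, i.e. $\theta_{k+1} = \theta_k - \eta\,(S + \varepsilon I)^{-1} g$ with $S = J^\top J / n$. The model, sampling, and parameter names below are all hypothetical.

```python
import numpy as np

def sr_step(jacobian, residual, theta, eta=0.1, eps=1e-6):
    """One projected L^2 gradient step (stochastic-reconfiguration style).

    jacobian : (n_samples, n_params) derivative of the ansatz w.r.t. theta
               at the sample points (tangent-space basis)
    residual : (n_samples,) pointwise L^2 gradient of the loss
    """
    n = jacobian.shape[0]
    S = jacobian.T @ jacobian / n          # Gram / metric matrix
    grad = jacobian.T @ residual / n       # Euclidean gradient
    # Regularized solve projects the infinite-dimensional update
    # onto the tangent space of the parametric ansatz.
    delta = np.linalg.solve(S + eps * np.eye(len(theta)), grad)
    return theta - eta * delta

# Toy usage: fit a linear ansatz u(x) = theta0 + theta1 * x to samples.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 64)
y = 2.0 * x + 1.0
theta = np.zeros(2)
J = np.stack([np.ones_like(x), x], axis=1)  # Jacobian of the linear ansatz
for _ in range(50):
    residual = J @ theta - y               # pointwise error
    theta = sr_step(J, residual, theta, eta=0.5)
# theta converges to the exact coefficients [1.0, 2.0]
```

For a linear ansatz this preconditioned step coincides with a damped Gauss–Newton iteration, which is why convergence here is geometric; for a nonlinear neural quantum state the same structure applies with the Jacobian evaluated at Monte Carlo samples.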

Co-authors

Hang Zhang (ETH Zurich), Jannes Nys (ETH Zurich), Juan Carrasquilla (ETH Zurich), Marius Zeinhofer (ETH Zurich), Siddhartha Mishra (ETH Zurich), Victor Armegioiu (ETH Zurich)

Presentation materials

There are no materials yet.