Speaker
Description
The conjugate gradient method is the standard technique for computing propagators in lattice QCD. Preconditioning the Dirac operator makes this method converge faster. The goal of our project is to develop neural networks that predict preconditioners for the Dirac operator, taking the gauge configuration as input.
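To illustrate where a predicted preconditioner would enter, here is a minimal sketch of preconditioned conjugate gradient on a toy symmetric positive-definite system. The Jacobi (diagonal) preconditioner below is a placeholder for the learned one, and the matrix is a stand-in, not the actual Dirac operator:

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-10, max_iter=1000):
    """Solve A x = b by preconditioned conjugate gradient.

    A     : SPD matrix (stand-in for the Dirac normal operator)
    M_inv : function applying the preconditioner inverse, z = M^{-1} r
    """
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = M_inv(r)                  # preconditioned residual
    p = z.copy()                  # initial search direction
    rz = r @ z
    for it in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p # update search direction
        rz = rz_new
    return x, it + 1

# Toy SPD system with a Jacobi preconditioner as a placeholder
rng = np.random.default_rng(0)
n = 50
Q = rng.standard_normal((n, n))
A = Q @ Q.T + n * np.eye(n)       # SPD by construction
b = rng.standard_normal(n)
x, iters = preconditioned_cg(A, b, M_inv=lambda r: r / np.diag(A))
```

A better preconditioner clusters the spectrum of the preconditioned operator, which is what reduces the iteration count.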
Our approach closely follows recent successes from MIT [1], where traditional neural networks were used to learn preconditioners by optimizing a differentiable loss function. We plan to first set up similar neural networks for 4D SU(3) gauge configurations.
The primary objective is then to develop a reinforcement learning-based neural network that achieves similar outcomes. Reinforcement learning can train networks even when the loss function is not differentiable, making this approach viable for a broader range of applications.
[1] Y. Sun, S. Eswar, Y. Lin, W. Detmold, P. Shanahan, X. Li, Y. Liu and P. Balaprakash, "Matrix-free Neural Preconditioner for the Dirac Operator in Lattice Gauge Theory," arXiv:2509.10378 [hep-lat].
| Parallel Session (for talks only) | Algorithms and artificial intelligence |
|---|---|
