Approaching the continuum limit in a lattice field theory is an important but computationally difficult problem: on the one hand, most traditional Monte Carlo methods suffer from critical slowing down; on the other hand, generative models find it increasingly difficult to learn the map from a simpler theory to the target theory.
To tackle this problem, we construct a generative model based on the physics-informed renormalisation group. The layered map is trained and optimised layer by layer: for each layer, training reduces to solving an independent linear differential equation, which potentially parallelises the training procedure. Moreover, since the differential equation is given analytically, it admits systematic error correction and improvement beyond training time, without requiring samples from the target theory. These features pave the way for future research and enable the systematic correction and design of generative models for lattice field theories. We illustrate the practical feasibility of the architecture in simulations of scalar field theories.
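The layer-wise structure described above can be illustrated with a minimal toy sketch. Everything here is an illustrative assumption, not the actual construction of the talk: each layer is represented by a matrix obtained by integrating its own linear ODE dW/dt = A_l W with a hypothetical generator A_l, and the layers compose into the full map. Because each solve depends only on its own layer data, the solves are mutually independent and could run in parallel.

```python
import numpy as np

def solve_layer(A, W0, t_max=1.0, n_steps=1000):
    """Explicit-Euler integration of the linear ODE dW/dt = A @ W.

    A toy stand-in for the per-layer training step; the real
    construction solves a physics-informed equation, not this one.
    """
    dt = t_max / n_steps
    W = W0.copy()
    for _ in range(n_steps):
        W = W + dt * (A @ W)
    return W

rng = np.random.default_rng(0)
dim, n_layers = 4, 3

# One hypothetical generator A_l per layer (toy random data).
As = [0.1 * rng.standard_normal((dim, dim)) for _ in range(n_layers)]

# The per-layer solves are independent of each other, so this loop
# could be distributed across workers without any communication.
layers = [solve_layer(A, np.eye(dim)) for A in As]

# Compose the layered map: x -> W_L ... W_1 x
x = rng.standard_normal(dim)
for W in layers:
    x = W @ x
print(x.shape)
```

The key point the sketch mirrors is structural: since each layer's equation is given analytically and solved in isolation, a layer can later be re-solved more accurately (smaller steps, better integrator) to correct errors without retraining the other layers.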
| Parallel Session (for talks only) | Algorithms and artificial intelligence |
|---|---|
