Speaker
Anindita Maiti
Description
Neural network (NN) output distributions at initialization give rise to free or interacting field theories, known as NN field theories. Details of the NN architecture, e.g., parameters and hyperparameters, control the field interaction strengths. Following an introduction to NN field theories, I will review the construction of the global symmetry groups of these field actions via a dual framework of NN parameter distributions. This duality in architecture is exploited to construct free and interacting Grassmann-valued NN field theories, starting from the Grassmann central limit theorem. I will present preliminary results on Grassmann NN field theories. Such constructions of NN field theories via architecture have potential impact in both physics and ML.
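
For orientation, here is a minimal numpy sketch (not the speaker's construction) illustrating the idea that architecture controls interaction strength: outputs of a wide network at initialization approach a Gaussian (free) theory, while finite width induces non-Gaussian, interaction-like corrections. The widths, initialization scales, and the excess-kurtosis diagnostic below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_outputs(width, n_nets=20000, x=1.0):
    """Sample scalar outputs f(x) of freshly initialized 1-hidden-layer nets."""
    # Weights drawn i.i.d., with the readout variance scaled by 1/width
    # so the output variance stays finite as width grows.
    W = rng.normal(0.0, 1.0, size=(n_nets, width))                   # input -> hidden
    b = rng.normal(0.0, 1.0, size=(n_nets, width))                   # hidden biases
    V = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_nets, width))  # hidden -> output
    h = np.tanh(W * x + b)                                           # hidden activations
    return np.sum(V * h, axis=1)                                     # one output per net

for width in (2, 8, 64, 512):
    f = init_outputs(width)
    f = (f - f.mean()) / f.std()
    # Excess kurtosis tracks the connected 4-point function: it vanishes in the
    # infinite-width (Gaussian, free) limit and grows in magnitude at small width.
    kurt = np.mean(f**4) - 3.0
    print(f"width={width:4d}  excess kurtosis ~ {kurt:+.3f}")
```

Running the loop shows the non-Gaussian correction shrinking as width increases, a toy version of how architectural details tune the interaction strength of the corresponding NN field theory.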