Speaker
Description
Optimization has long served as the primary paradigm for inference and design across science and engineering. More recently, machine learning has shifted this picture by trading explicit structure for universality in highly over-parameterized black-box models. Though powerful, over-parameterization obscures the interpretability of individual degrees of freedom, an effect that optimization further amplifies. In this talk, I argue for an enriched perspective in which solutions are not optimized but instead emerge from collective behaviour. I will motivate this framework with an example from topology optimization, where early-stage collective dynamics encode critical design information well before convergence. Using Bayesian statistics, I then show how inference can be reformulated in terms of ensembles, where collective variables emerge naturally from the geometry of over-parameterized models without any a priori assumptions. Finally, I demonstrate how this framework extends to learning in neural networks, state estimation, and parameter identification. The key message is that useful solutions need not be enforced by optimization but can arise through collective dynamics.
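To make the ensemble view of inference concrete, here is a minimal sketch using ensemble Kalman inversion, one standard example of a method in which a parameter estimate for y = G(u) + noise emerges from the collective update of an ensemble rather than from gradient-based optimization. This is an illustration in the spirit of the abstract, not the speaker's own algorithm; the forward map `G`, noise covariance, and ensemble size below are assumptions chosen for a toy problem.

```python
import numpy as np

def eki(G, y, u0, Gamma, n_iters=20, rng=None):
    """Ensemble Kalman inversion: the estimate emerges from the
    collective motion of an ensemble; no gradients of G are used.

    G      : forward map, G(u) for a single parameter vector u
    y      : observed data, shape (d,)
    u0     : initial ensemble, shape (J, p)  (J members, p parameters)
    Gamma  : observation-noise covariance, shape (d, d)
    """
    rng = np.random.default_rng() if rng is None else rng
    U = u0.copy()
    J = U.shape[0]
    for _ in range(n_iters):
        Gu = np.array([G(u) for u in U])            # forward evaluations, (J, d)
        u_mean, g_mean = U.mean(axis=0), Gu.mean(axis=0)
        dU, dG = U - u_mean, Gu - g_mean
        C_ug = dU.T @ dG / (J - 1)                  # cross-covariance, (p, d)
        C_gg = dG.T @ dG / (J - 1)                  # output covariance, (d, d)
        # Perturbed observations keep the ensemble spread honest.
        Y = y + rng.multivariate_normal(np.zeros(len(y)), Gamma, size=J)
        K = C_ug @ np.linalg.inv(C_gg + Gamma)      # Kalman-like gain
        U = U + (Y - Gu) @ K.T                      # collective update of all members
    return U  # final ensemble; its mean serves as the point estimate

# Toy parameter-identification problem: recover u from y = A u + noise.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 2))
u_true = np.array([1.0, -2.0])
Gamma = 0.01 * np.eye(5)
y = A @ u_true + rng.multivariate_normal(np.zeros(5), Gamma)
ensemble = eki(lambda u: A @ u, y, rng.normal(size=(50, 2)), Gamma, rng=rng)
print("estimate:", ensemble.mean(axis=0), "truth:", u_true)
```

Note that the update rule only queries the forward map on each ensemble member; the solution is encoded in the geometry of the ensemble itself, which is the sense in which it "emerges" collectively rather than being enforced by an optimizer.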
| Keyword | Term |
|---|---|
| Keyword-1 | Emergent behavior |
| Keyword-2 | Statistical inference |
| Keyword-3 | Optimization and learning |