Speaker
Description
Accurate measurements of the distribution of dark matter on small scales and of the expansion rate of the Universe (H0) have emerged as two of the most pressing challenges in modern cosmology. Strong gravitational lensing is a natural phenomenon that can probe both: surface brightness modeling of galaxy-galaxy lenses enables detailed mapping of the matter distribution in foreground galaxies, while time delays between the images of lensed quasars offer a precise method for measuring H0. Upcoming optical surveys such as LSST and Euclid are expected to discover hundreds of thousands of strong lenses, an increase of several orders of magnitude over current samples.
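To make the H0 link concrete: the delay between the images of a lensed quasar scales with the time-delay distance D_dt = (1 + z_l) * D_l * D_s / D_ls, which is inversely proportional to H0. The sketch below is illustrative only, with assumed example redshifts, and is not part of the talk; it uses astropy to show how D_dt shifts between two candidate H0 values.

```python
# Illustrative sketch, not the talk's pipeline: the time-delay distance
# D_dt = (1 + z_l) * D_l * D_s / D_ls scales as 1/H0, so a measured delay
# Delta_t = (D_dt / c) * Delta_phi pins down H0. Redshifts are assumed
# example values.
from astropy.cosmology import FlatLambdaCDM

z_lens, z_source = 0.5, 2.0  # hypothetical lens and source redshifts

for H0 in (67.0, 73.0):  # two candidate expansion rates, km/s/Mpc
    cosmo = FlatLambdaCDM(H0=H0, Om0=0.3)
    D_l = cosmo.angular_diameter_distance(z_lens)
    D_s = cosmo.angular_diameter_distance(z_source)
    D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_source)
    D_dt = (1 + z_lens) * D_l * D_s / D_ls  # time-delay distance
    print(f"H0 = {H0}: D_dt = {D_dt:.0f}")
```

Because the delay itself is observed directly, a larger inferred D_dt translates into a lower H0.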
However, accurate and unbiased analysis of these large data volumes with traditional likelihood-based methods has proven impractical. Our team leads the development of rigorous machine-learning-based statistical frameworks for strong lensing data analysis. I will share some of our latest exciting results, particularly on the inference of high-dimensional variables, and show that, beyond accelerating the analysis, these methods reach levels of accuracy previously deemed unattainable.
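The abstract does not name the frameworks, but the shared pattern behind likelihood-free, ML-driven analysis can be shown with a toy simulation-based-inference loop. The rejection step below stands in for the neural density estimators used in practice; the simulator, numbers, and tolerance are all hypothetical.

```python
# Toy simulation-based inference (rejection ABC): a minimal sketch of the
# likelihood-free pattern, not the speakers' actual method. The Gaussian
# simulator stands in for an expensive lens-image simulator.
import numpy as np

rng = np.random.default_rng(0)
n = 50  # observations per simulated dataset

def simulate(theta):
    """Forward model: n noisy draws around each parameter value."""
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    return rng.normal(theta[:, None], 1.0, size=(theta.size, n))

observed = simulate(1.5)   # pretend observation with true theta = 1.5
s_obs = observed.mean()    # summary statistic

# Sample the prior, simulate, and keep draws whose simulated summary lands
# near the observed one: the kept draws approximate the posterior.
theta = rng.uniform(-5.0, 5.0, size=100_000)
s_sim = simulate(theta).mean(axis=1)
posterior = theta[np.abs(s_sim - s_obs) < 0.05]
print(f"posterior: {posterior.mean():.2f} +/- {posterior.std():.2f}")
```

Neural approaches replace this hard accept/reject cut with a learned density over parameters, which is what makes inference over high-dimensional variables tractable.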
I will conclude by discussing how applying these methods to the H0 tension and the small-scale structure problem could bring about a paradigm shift in cosmology.
Keywords: Machine Learning, Cosmology, Bayesian Inference