Tsinghua Workshop on Machine Learning in Geometry and Physics 2018
Sunday 10 June 2018 (16:00) to Friday 15 June 2018 (17:00)
Sunday 10 June 2018
Monday 11 June 2018
07:30 - 08:30
Breakfast
08:40 - 09:00
Opening Ceremony
09:00 - 10:00
Learning and Lie Groups
Gregory S. Chirikjian (Johns Hopkins University)
Machine learning methods are mostly built on calculus, probability, and statistics in Euclidean spaces. However, many interesting problems are naturally articulated as learning on lower-dimensional embedded manifolds and on Lie groups. This talk reviews how learning and Lie groups fit together, and how the machine learning community can benefit from modern mathematical developments. Topics include:
• Introduction to Calculus on Lie Groups (Differential Operators, Integration)
• Probability on Lie Groups (Convolution, Fourier Analysis, Diffusion Equations)
• Application 1: Workspace Generation and Inverse Kinematics of Highly Articulated Robotic Manipulators
• Application 2: Pose Distributions for Mobile Robots
• Application 3: Lie-Theoretic Invariances in Image Processing and Computer Vision
• Application 4: Coset Spaces of Lie Groups by Discrete Subgroups in Crystallography
• Prospects for the Future
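A minimal sketch of the Lie-group toolkit the talk builds on (an illustration added here, not part of the abstract): the exponential and logarithm maps on SO(3) via the Rodrigues formula, used to compute an iterative geodesic (Karcher) mean of rotations. All names and parameters are illustrative.

    import numpy as np

    def hat(w):
        # so(3) hat map: R^3 -> 3x3 skew-symmetric matrix
        return np.array([[0, -w[2], w[1]],
                         [w[2], 0, -w[0]],
                         [-w[1], w[0], 0]])

    def exp_so3(w):
        # Rodrigues formula: exponential of a rotation vector
        th = np.linalg.norm(w)
        if th < 1e-12:
            return np.eye(3)
        K = hat(w / th)
        return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

    def log_so3(R):
        # inverse of exp_so3, returning a rotation vector
        th = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
        if th < 1e-12:
            return np.zeros(3)
        w_hat = (R - R.T) * th / (2 * np.sin(th))
        return np.array([w_hat[2, 1], w_hat[0, 2], w_hat[1, 0]])

    def karcher_mean(rotations, iters=50):
        # iterative geodesic mean: average log-displacements in the tangent space
        mu = rotations[0]
        for _ in range(iters):
            delta = np.mean([log_so3(mu.T @ R) for R in rotations], axis=0)
            mu = mu @ exp_so3(delta)
        return mu

    samples = [exp_so3(0.1 * np.random.randn(3)) for _ in range(20)]
    print(karcher_mean(samples))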
10:00 - 10:30
Tea Break
10:30 - 11:15
A Generalized Construction of Calabi-Yau Manifolds and Mirror Symmetry
Per Berglund (New Hampshire)
We extend the construction of Calabi-Yau manifolds to hypersurfaces in non-Fano toric varieties. The associated non-reflexive polytopes provide a generalization of Batyrev’s original work, allowing us to construct new pairs of mirror manifolds. In particular, this allows us to find new K3-fibered Calabi-Yau manifolds, relevant for string compactifications.
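For context (added here, not part of the abstract), Batyrev's original construction, which the talk generalizes to non-reflexive polytopes, pairs a lattice polytope with its polar dual,

    \Delta^{*} \;=\; \{\, y \in N_{\mathbb{R}} \;:\; \langle x, y \rangle \ge -1 \ \text{for all } x \in \Delta \,\},

and calls \Delta reflexive when both \Delta and \Delta^{*} are lattice polytopes; anticanonical Calabi-Yau hypersurfaces in the associated toric varieties P_\Delta and P_{\Delta^{*}} then form a mirror pair.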
11:15 - 12:00
Patterns in Calabi-Yau Threefolds
Vishnu Jejjala (Witwatersrand)
TBA
12:00 - 14:00
Lunch Break
14:00 - 14:45
Neural Program Synthesis and Neural Automated Theorem Proving, via Curry-Howard Correspondence
Greg Yang (Microsoft Research)
The Curry-Howard correspondence is, roughly speaking, the observation that proving a theorem is equivalent to writing a program. Using this principle, I will present a unified survey of recent trends in the application of deep learning to program synthesis and automated theorem proving, with commentary on their applicability to working mathematicians and physicists.
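As a one-line illustration of the correspondence (added here, not from the talk), a constructive proof of modus ponens in Lean is literally a program, namely function application:

    -- Under Curry-Howard, the proposition A → (A → B) → B is a type,
    -- and this definition is simultaneously its proof and a program.
    def modusPonens {A B : Prop} (a : A) (f : A → B) : B := f a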
14:45 - 15:30
Quantum Computers and Machine Learning
Artur Garcia Saez (Barcelona Supercomputing Center)
I will discuss the twofold relation between quantum computers and machine learning. On the one hand, quantum computers offer new algorithms to perform training tasks on classical or quantum data. On the other hand, machine learning offers new tools to study quantum matter and to control quantum experiments.
15:30 - 16:00
Tea Break
16:00 - 18:00
Free Discussion
18:00 - 19:30
Dinner
Tuesday 12 June 2018
07:30 - 08:30
Breakfast
09:00 - 10:00
Renormalization and hierarchical knowledge representations
Cedric Beny (Hanyang University)
Our understanding of any given complex physical system is made up of not just one but many theories, which capture different aspects of the system. These theories are often stitched together only in informal ways. An exception is given by renormalization group techniques, which provide formal ways of hierarchically connecting descriptions at different scales. In machine learning, the various layers of a deep neural network seem to represent different levels of abstraction. How does this compare to scale in renormalization? Can one build a common information-theoretic framework underlying those techniques? To approach these questions, I compare two different renormalization techniques (which emerged from quantum information theory) and attempt to adapt them to unsupervised learning tasks. One approach, MERA, superficially resembles a deep convolutional neural net, while another approach based on dimensional reduction yields something similar to principal component analysis.
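For a concrete point of comparison (a toy sketch added here, not from the talk), the dimensional-reduction baseline mentioned above, principal component analysis, keeps the directions of largest variance via an SVD:

    import numpy as np

    def pca(X, k):
        # X: (n_samples, n_features); return the k leading principal directions
        # and the data projected onto them.
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        components = Vt[:k]            # top-k directions of largest variance
        return components, Xc @ components.T

    X = np.random.randn(500, 20) @ np.random.randn(20, 20)  # correlated toy data
    comps, Z = pca(X, k=3)
    print(comps.shape, Z.shape)        # (3, 20) (500, 3)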
10:00 - 10:30
Tea Break
10:30 - 11:15
Real-space renormalization group
Maciej Koch-Janusz (ETH Zurich)
Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the renormalization group (RG) procedure, which systematically retains ‘slow’ degrees of freedom and integrates out the rest. We demonstrate a machine-learning algorithm based on a model-independent, information-theoretic characterization of a real-space RG capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We apply it to classical statistical physics problems in 1 and 2D: we demonstrate RG flow and extract critical exponents. We also discuss the optimality of the procedure.
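For orientation (a toy sketch added here; this is the classic real-space RG move that the talk's information-theoretic algorithm automates and optimizes, not that algorithm itself): a majority-rule block-spin step on a 2D Ising configuration.

    import numpy as np

    def majority_block_spin(spins):
        # spins: (L, L) array of +/-1 with L even; coarse-grain over 2x2 blocks.
        L = spins.shape[0]
        block_sums = spins.reshape(L // 2, 2, L // 2, 2).sum(axis=(1, 3))
        coarse = np.sign(block_sums)
        ties = coarse == 0                        # break ties randomly
        coarse[ties] = np.random.choice([-1, 1], size=int(ties.sum()))
        return coarse.astype(int)

    config = np.random.choice([-1, 1], size=(64, 64))
    print(majority_block_spin(config).shape)      # (32, 32)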
11:15 - 12:00
Topological Data Analysis for Cosmology and String Theory
Gary Shiu (University of Wisconsin-Madison)
Topological data analysis (TDA) is a multi-scale approach in computational topology used to analyze the “shape” of large datasets by identifying which homological characteristics persist over a range of scales. In this talk, I will discuss how TDA can be used to extract physics from cosmological datasets (e.g., primordial non-Gaussianities generated by cosmic inflation) and to explore the structure of the string landscape.
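A minimal, self-contained illustration of persistence (added here; real analyses use dedicated TDA libraries): zeroth-homology (connected-component) persistence of a point cloud under the Vietoris-Rips filtration, computed with a union-find structure.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def h0_persistence(points):
        # Birth/death pairs of connected components as the distance scale grows.
        d = squareform(pdist(points))
        n = len(points)
        parent = list(range(n))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        # Process edges in order of increasing length (Rips filtration for H0).
        edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
        pairs = []
        for r, i, j in edges:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                pairs.append((0.0, r))   # a component born at scale 0 dies at scale r
        return pairs                     # one component (born at 0) never dies

    pts = np.random.rand(50, 2)
    print(len(h0_persistence(pts)))      # 49 finite bars for 50 points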
12:00 - 14:00
Lunch Break
14:00 - 18:00
Excursion: Nanshan Temple
18:00 - 20:00
Banquet
Wednesday 13 June 2018
07:30 - 08:30
Breakfast
09:00 - 10:00
Algebraic geometry of the restricted Boltzmann machine
Jason Morton (Pennsylvania State University)
TBA
10:00 - 10:30
Tea Break
10:30 - 11:15
Reinforcement learning in the string landscape
Fabian Ruehle (Oxford)
In studying the string landscape, we often want to find vacua with specific properties, but do not know how to select the string geometry that gives rise to such vacua. For this reason, we apply reinforcement learning, a semi-supervised approach to machine learning in which the algorithm explores the landscape autonomously while being guided towards models with given properties. We illustrate the approach using examples from heterotic, type II, and F-theory.
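A schematic of the reinforcement-learning loop involved (a toy sketch added here, on a stand-in search problem rather than the string-theoretic setup of the talk): tabular Q-learning with an epsilon-greedy policy.

    import numpy as np

    # Toy environment: states 0..N-1 on a line, actions {0: left, 1: right};
    # reward only for reaching the target state (a stand-in for "a model with
    # the desired properties").
    rng = np.random.default_rng(0)
    N, target = 16, 15
    Q = np.zeros((N, 2))
    alpha, gamma, eps = 0.5, 0.95, 0.2

    for episode in range(2000):
        s = int(rng.integers(N))                  # random start state
        for _ in range(50):
            a = int(rng.integers(2)) if rng.random() < eps else int(Q[s].argmax())
            s_next = max(0, min(N - 1, s + (1 if a == 1 else -1)))
            r = 1.0 if s_next == target else 0.0
            Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
            s = s_next
            if r > 0:
                break

    print(Q.argmax(axis=1))   # greedy policy: action 1 ("move right") toward the target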
11:15 - 12:00
On Finding Small Cosmological Constants with Deep Reinforcement Learning
Jim Halverson (Northeastern)
I will review the Bousso-Polchinski model and aspects of its computational complexity. An asynchronous advantage actor-critic will be used to find small cosmological constants.
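For reference (added here), the Bousso-Polchinski model assigns to integer flux quanta n_i with charges q_i the vacuum energy Lambda = Lambda_0 + (1/2) * sum_i n_i^2 q_i^2, and the task is to find fluxes making |Lambda| very small. The random scan below is a toy stand-in for that search, not the actor-critic approach of the talk.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 20
    q = rng.uniform(0.1, 1.0, size=N)          # flux charges
    Lambda0 = -rng.uniform(50, 100)            # bare (negative) cosmological constant

    best = (np.inf, None)
    for _ in range(100_000):
        n = rng.integers(-10, 11, size=N)      # integer flux quanta
        Lam = Lambda0 + 0.5 * np.sum((n * q) ** 2)
        if abs(Lam) < best[0]:
            best = (abs(Lam), n)

    print("smallest |Lambda| found:", best[0])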
12:00 - 14:00
Lunch Break
14:00 - 14:45
Faster exploration of parameter space in supersymmetry and string theory using machine learning
Sven Krippendorf (LMU Munich)
TBA
14:45 - 15:30
The Nelson-Seiberg theorem, its extensions, string realizations, and possible machine learning applications
Zheng Sun (Sichuan University)
The Nelson-Seiberg theorem relates F-term SUSY breaking and R-symmetries in N=1 SUSY field theories. I will discuss several of its extensions, including a revision to a necessary and sufficient condition, generalizations to discrete and non-Abelian R-symmetries, the relation to SUSY and W=0 vacua in the string landscape, and some possible machine learning applications in the search for SUSY vacua.
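For reference (added here, not part of the abstract), supersymmetric vacua are solutions of the F-flatness conditions

    \frac{\partial W}{\partial \phi_i} = 0 \quad \text{for all } i ,

and the original Nelson-Seiberg theorem states that, for a generic superpotential W, an R-symmetry is necessary, and a spontaneously broken R-symmetry sufficient, for these equations to have no solution, i.e. for F-term SUSY breaking.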
15:30 - 16:00
Tea Break
16:00 - 18:00
Free Discussion
18:00 - 19:30
Dinner
Thursday 14 June 2018
07:30 - 08:30
Breakfast
09:00 - 10:00
Tensor network from quantum simulations to machine learning
Jing Chen (Flatiron Institute)
Tensor networks are both a theoretical and a numerical tool that has achieved great success in many-body physics, from calculating thermodynamic properties and quantum phase transitions to simulations of black holes. As a general form of high-dimensional data structure, tensors have been adopted in diverse branches of data analysis, such as signal and image processing, psychometrics, quantum chemistry, biometrics, quantum information, black holes, and brain science. Tensor networks model the interactions between tensors and are becoming a powerful tool in these new fields. In recent years, tensor network numerical methods such as the matrix product state (MPS) and the projected entangled pair state (PEPS) have also found their way into machine learning. Moreover, the physical concept of entanglement offers a new theoretical approach to the design of neural networks. For example, we find that graphical models such as the restricted Boltzmann machine (RBM) are equivalent to specific tensor networks, which allows us to study the expressive power of the RBM. (Phys. Rev. B 97, 085104 (2018), arXiv:1712.04144)
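A minimal illustration of the MPS data structure mentioned above (a sketch with random tensors added here): an MPS encodes a vector over 2^n configurations as a chain of small tensors, and evaluating a single amplitude is just a chain of matrix products.

    import numpy as np

    def mps_amplitude(tensors, bits):
        # tensors[k] has shape (chi_left, 2, chi_right); the amplitude of a bit
        # string is the product of the matrices selected by each bit.
        v = np.ones(1)
        for A, b in zip(tensors, bits):
            v = v @ A[:, b, :]
        return v.item()

    n, chi = 8, 4
    dims = [1] + [chi] * (n - 1) + [1]
    mps = [np.random.randn(dims[k], 2, dims[k + 1]) for k in range(n)]
    print(mps_amplitude(mps, [0, 1, 1, 0, 1, 0, 0, 1]))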
10:00 - 10:30
Tea Break
10:30 - 11:15
Tensor Network Holography and Deep Learning
Yi-Zhuang You (UCSD & Harvard)
Motivated by the close relations of the renormalization group with both holographic duality and deep learning, we propose that holographic geometry can emerge from deep learning the entanglement feature of a quantum many-body state. We develop a concrete algorithm, called entanglement feature learning (EFL), based on the random tensor network (RTN) model of tensor network holography. We show that each RTN can be mapped to a Boltzmann machine, trained on the entanglement entropies over all subregions of a given quantum many-body state. The goal is to construct the optimal RTN that best reproduces the entanglement feature. The RTN geometry can then be interpreted as the emergent holographic geometry. We demonstrate the EFL algorithm on a 1D free fermion system and observe the emergence of hyperbolic geometry (AdS_3 spatial geometry) as we tune the fermion system towards the gapless critical point (CFT_2 point).
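The training data referred to above are subregion entanglement entropies; for a 1D free-fermion chain these follow from the two-point correlation matrix. The sketch below (added here) shows that standard computation, not the EFL algorithm itself.

    import numpy as np

    # Ground state of a free-fermion hopping chain; entanglement entropy of an
    # interval from the eigenvalues of the restricted correlation matrix.
    L = 64
    H = -np.diag(np.ones(L - 1), 1) - np.diag(np.ones(L - 1), -1)
    eps, U = np.linalg.eigh(H)
    occ = U[:, eps < 0]                     # fill negative-energy modes (half filling)
    C = occ @ occ.T                         # correlation matrix <c^dag_i c_j>

    def interval_entropy(C, sites):
        zeta = np.linalg.eigvalsh(C[np.ix_(sites, sites)])
        zeta = np.clip(zeta, 1e-12, 1 - 1e-12)
        return float(-np.sum(zeta * np.log(zeta) + (1 - zeta) * np.log(1 - zeta)))

    print([round(interval_entropy(C, list(range(l))), 3) for l in (2, 4, 8, 16, 32)])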
11:15 - 12:00
Reverse engineering Hamiltonian from spectrum
Hiroyuki Fujita (University of Tokyo)
Handling the large number of degrees of freedom with proper approximations, namely the construction of effective Hamiltonians, is at the heart of (condensed matter) physics. Here we propose a simple scheme for constructing Hamiltonians from a given energy spectrum. The sparse nature of physical Hamiltonians allows us to formulate this as a solvable supervised learning problem. Taking a simple model of correlated electron systems, we demonstrate the data-driven construction of its low-energy effective model. Moreover, we find that the same approach works for the construction of the entanglement Hamiltonian of a given quantum many-body state from its entanglement spectrum. Compared to the known approach based on full diagonalization of the reduced density matrix, ours is computationally much cheaper, thus offering a way of studying the entanglement of large (sub)systems under various boundary conditions.
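A toy version of such a spectrum fit (added here; not the authors' method): parametrize the Hamiltonian as a linear combination of fixed operator terms and tune the couplings so that its eigenvalues match a target spectrum.

    import numpy as np
    from scipy.optimize import minimize

    # Operator basis: a few random symmetric matrices standing in for sparse
    # physical terms (hoppings, interactions, ...).
    rng = np.random.default_rng(1)
    dim, n_terms = 12, 4
    ops = [(lambda A: (A + A.T) / 2)(rng.standard_normal((dim, dim)))
           for _ in range(n_terms)]

    def spectrum(c):
        H = sum(ci * Oi for ci, Oi in zip(c, ops))
        return np.linalg.eigvalsh(H)

    c_true = np.array([1.0, -0.5, 0.3, 0.8])
    target = spectrum(c_true)               # the "given" energy spectrum

    loss = lambda c: float(np.sum((spectrum(c) - target) ** 2))
    res = minimize(loss, x0=np.zeros(n_terms), method="Nelder-Mead")
    # The fit may land on any coupling set that reproduces the spectrum.
    print("fitted couplings:", res.x.round(3), " loss:", round(res.fun, 8))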
12:00 - 14:00
Lunch Break
14:00 - 14:45
Toward reduction of autocorrelation in HMC by machine learning
Akinori Tanaka (RIKEN)
Recent developments in machine learning (ML), especially deep learning, are remarkable. It has been applied to image recognition, image generation, and so on with very good precision. From a mathematical point of view, images are just real matrices, so it is natural to replace these matrices with configurations of a physical system created by numerical simulation and see what happens. In this talk, I will review our attempt to reduce the autocorrelation of the Hamiltonian Monte Carlo (HMC) algorithm. In addition, I would like to discuss the possibility of using recent sophisticated generative models such as VAEs and GANs to improve HMC. (Work in collaboration with A. Tomiya, arXiv:1712.03893)
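As a baseline for what is being improved (added here, not the proposal of the talk): a minimal HMC sampler for a one-dimensional target, with a crude lag-1 autocorrelation estimate.

    import numpy as np

    def hmc_sample(grad_U, U, x0, n_samples, eps=0.2, n_leapfrog=10):
        # Standard HMC: leapfrog integration of Hamiltonian dynamics + Metropolis test.
        rng = np.random.default_rng(0)
        x, chain = x0, []
        for _ in range(n_samples):
            p = rng.standard_normal()
            x_new, p_new = x, p - 0.5 * eps * grad_U(x)
            for _ in range(n_leapfrog):
                x_new = x_new + eps * p_new
                p_new = p_new - eps * grad_U(x_new)
            p_new = p_new + 0.5 * eps * grad_U(x_new)   # undo half of the last kick
            dH = (U(x_new) + 0.5 * p_new**2) - (U(x) + 0.5 * p**2)
            if np.log(rng.random()) < -dH:
                x = x_new
            chain.append(x)
        return np.array(chain)

    U = lambda x: 0.5 * x**2                 # standard normal target
    chain = hmc_sample(lambda x: x, U, 0.0, 5000)
    c = chain - chain.mean()
    print("lag-1 autocorrelation:", round(float((c[:-1] * c[1:]).mean() / c.var()), 3))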
14:45 - 15:30
Machine learning for parton distribution determination
Stefano Carrazza (CERN)
Parton Distribution Functions (PDFs) are a crucial ingredient for accurate and reliable theoretical predictions for precision phenomenology at the LHC. The NNPDF approach to the extraction of Parton Distribution Functions relies on Monte Carlo techniques and Artificial Neural Networks to provide an unbiased determination of parton densities with a reliable determination of their uncertainties. I will discuss the NNPDF methodology in general, the latest NNPDF global fit (NNPDF3.1) and then present ideas to improve the training methodology used in the NNPDF fits and related PDFs.
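A cartoon of the Monte Carlo replica idea underlying the NNPDF approach (added here, with a trivial least-squares model standing in for the neural network): refluctuate the data within its uncertainties, fit each replica independently, and read the uncertainty off the spread of the fits.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.05, 0.95, 30)
    truth = lambda x: x**-0.3 * (1 - x)**2        # toy "parton distribution" shape
    sigma = 0.05 * np.ones_like(x)
    data = truth(x) + sigma * rng.standard_normal(x.size)

    n_rep, deg = 100, 4
    fits = [np.polyfit(x, data + sigma * rng.standard_normal(x.size), deg)
            for _ in range(n_rep)]                # one fit per pseudo-data replica
    curves = np.array([np.polyval(c, x) for c in fits])

    central, uncertainty = curves.mean(axis=0), curves.std(axis=0)
    print(round(float(uncertainty.mean()), 4))    # replica spread tracks the data errors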
15:30 - 16:00
Tea Break
16:00 - 18:00
Free Discussion
18:00 - 19:30
Dinner
Friday 15 June 2018
07:30 - 08:30
Breakfast
09:00 - 10:00
Deep Learning and AdS/CFT
Koji Hashimoto (Osaka University)
We present a deep neural network representation of the AdS/CFT correspondence, and demonstrate the emergence of the bulk metric function via the learning process for given data sets of response in boundary quantum field theories. The emergent radial direction of the bulk is identified with the depth of the layers, and the network itself is interpreted as a bulk geometry. Our network provides a data-driven holographic modeling of strongly coupled systems. For a scalar ϕ^4 theory with unknown mass and coupling, in an unknown curved spacetime with a black hole horizon, we demonstrate that our deep learning (DL) framework can determine the metric, mass and coupling that fit given response data. First, we show that, from boundary data generated by the AdS-Schwarzschild spacetime, our network can reproduce the metric. Second, we demonstrate that our network, with experimental data as input, can determine the bulk metric, the mass and the quadratic coupling of the holographic model. As an example we use experimental data on the magnetic response of the strongly correlated material Sm0.6Sr0.4MnO3. This AdS/DL correspondence not only enables gravity modeling of strongly correlated systems, but also sheds light on a hidden mechanism of the emerging space in both AdS and DL. (Work in collaboration with A. Tanaka, A. Tomiya and S. Sugishita, arXiv:1802.08313)
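A schematic of the layers-as-radial-steps idea (added here; the discretization and parameter names are assumptions for illustration, not the network of arXiv:1802.08313): each layer propagates the boundary field value and its conjugate one radial step into the bulk, with a per-layer weight playing the role of the discretized metric function.

    import numpy as np

    def forward(phi, pi, h, m2, lam, d_eta=0.1):
        # h[k] is the trainable "metric" weight of layer k; m2 and lam are the
        # scalar mass^2 and quartic coupling.  Purely illustrative dynamics.
        for hk in h:
            phi, pi = (phi + d_eta * pi,
                       pi + d_eta * (hk * pi + m2 * phi + lam * phi**3))
        return phi, pi

    h = np.zeros(10)                  # ten layers = ten radial steps
    print(forward(phi=0.5, pi=0.1, h=h, m2=-1.0, lam=0.3))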
10:00 - 10:30
Tea Break
10:30 - 11:15
Detection of phase transition via convolutional neural networks
Akio Tomiya (Central China Normal University)
A convolutional neural network (CNN) is designed to study the correlation between the temperature and the spin configuration of the two-dimensional Ising model. Our CNN is able to find the characteristic features of the phase transition without prior knowledge. A novel order parameter based on the CNN is also introduced to identify the location of the critical temperature; the result is found to be consistent with the exact value. This talk is based on the paper Journal of the Physical Society of Japan 86 (6), 063001 (arXiv:1609.09087).
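A minimal sketch of this kind of setup (added here; the architecture and hyperparameters are illustrative, not those of the paper): a small CNN mapping an L x L spin configuration to logits over temperature bins.

    import torch
    import torch.nn as nn

    class IsingCNN(nn.Module):
        # Maps an L x L configuration of +/-1 spins to logits over temperature bins.
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(16, n_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = IsingCNN()
    spins = torch.randint(0, 2, (32, 1, 32, 32)).float() * 2 - 1   # placeholder configs
    logits = model(spins)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 10, (32,)))
    loss.backward()
    print(logits.shape)               # torch.Size([32, 10])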
11:15 - 12:00
The Machine Learning Role on High Energy Physics: A theoretical view
Javier Andres Orduz Ducuara (UNAM)
In this talk, I present some concepts in computing, physics, and mathematics, focusing on high energy physics. I survey some programming languages and tools used to compute amplitudes, decay rates, and cross sections. In particular, I explore the Two-Higgs Doublet Model and an Extended Gauge Group Model, and present some results using artificial intelligence.
12:00 - 14:00
Lunch & Departure