Name: Florian Schaefer
Date: Tuesday, February 23, 2021 at 11:00 am
Title: Competitive optimization, statistical inference, and fast solvers
Abstract: In this talk, we will use perspectives from game theory and statistical inference to design simple, novel, and efficient algorithms for classical problems in computational science.
In the first part of the talk, we propose competitive gradient descent (CGD) as a natural generalization of gradient descent to saddle point problems and general zero-sum games. Whereas gradient descent minimizes a local linear approximation at each step, CGD uses the Nash equilibrium of a local bilinear approximation. Explicitly accounting for agent interaction significantly improves the convergence properties, as demonstrated in applications to GANs, reinforcement learning, and computational geometry.
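The coupled update can be sketched in a few lines. The following toy code is not the speaker's implementation; it is a minimal sketch of a CGD-style step for a zero-sum game min_x max_y f(x, y), assuming the update solves the local bilinear game through the mixed Hessian blocks D_xy f and D_yx f. On the bilinear game f(x, y) = x·y, simultaneous gradient descent spirals outward, while this coupled step contracts to the equilibrium (0, 0).

```python
import numpy as np

def cgd_step(x, y, gx, gy, Dxy, Dyx, eta):
    """One CGD-style update for min_x max_y f(x, y).
    gx, gy: gradients of f w.r.t. x and y at (x, y);
    Dxy, Dyx: mixed second-derivative blocks of f; eta: step size.
    Each player best-responds to the other's anticipated move, which
    couples the two updates through the mixed derivatives."""
    n, m = len(x), len(y)
    dx = -eta * np.linalg.solve(np.eye(n) + eta**2 * Dxy @ Dyx,
                                gx + eta * Dxy @ gy)
    dy = eta * np.linalg.solve(np.eye(m) + eta**2 * Dyx @ Dxy,
                               gy - eta * Dyx @ gx)
    return x + dx, y + dy

# Toy zero-sum game f(x, y) = x * y, equilibrium at (0, 0):
# here grad_x f = y, grad_y f = x, and both mixed blocks are [[1]].
x, y = np.array([1.0]), np.array([1.0])
Dxy = Dyx = np.array([[1.0]])
for _ in range(500):
    x, y = cgd_step(x, y, gx=y, gy=x, Dxy=Dxy, Dyx=Dyx, eta=0.2)
```

Setting the mixed blocks to zero recovers plain simultaneous gradient descent, which diverges on this game; the interaction terms are what produce convergence.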
In the second part of the talk, we show that the conditional near-independence properties of smooth Gaussian processes imply the near-sparsity of Cholesky factors of dense kernel matrices. We use this insight to derive simple, fast solvers with state-of-the-art complexity vs. accuracy guarantees for general elliptic differential and integral equations. Our methods come with rigorous error estimates, are easy to parallelize, and show good performance in practice.
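The link between conditional independence and sparsity is easy to see in an extreme case. The sketch below (illustrative only, not the speaker's code) uses the 1-d exponential kernel k(r) = exp(-|r|), whose Gaussian process is Markov: two points are conditionally independent given any point between them, so with the points sorted, the precision matrix K⁻¹ (and hence its Cholesky factor) is tridiagonal despite K being dense. For smoother kernels in higher dimensions the independence is only approximate, and recovering near-sparse factors requires the coarse-to-fine orderings the talk describes.

```python
import numpy as np

# Dense kernel matrix of the exponential (Ornstein-Uhlenbeck) kernel
# on sorted points in [0, 1].
x = np.linspace(0.0, 1.0, 10)
K = np.exp(-np.abs(x[:, None] - x[None, :]))

# Screening: conditional covariance of the endpoints given one
# intermediate point is (exactly, for this Markov kernel) zero.
cond = K[0, 9] - K[0, 5] * K[5, 9] / K[5, 5]

# The same independence makes the precision matrix tridiagonal:
# all entries beyond the first off-diagonal vanish (up to round-off),
# even though K itself has no zero entries at all.
P = np.linalg.inv(K)
off = np.abs(np.triu(P, k=2)).max()
```

Here `cond` and `off` are both numerically zero; the fast solvers exploit the approximate version of this structure, where far-off entries of the (suitably ordered) Cholesky factors are not exactly zero but decay exponentially.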
Bio: I am a PhD candidate in applied and computational mathematics at Caltech, advised by Houman Owhadi. Before coming to Caltech, I obtained my Bachelor's and Master's degrees in mathematics at the University of Bonn. My research combines ideas from game theory, statistical inference, and applied mathematics to solve problems in computational science and engineering.