University of Wisconsin – Madison, Department of Computer Sciences
"On Min-Max Optimization and Halpern Iteration"
Abstract: Min-max optimization constitutes a fundamental class of problems with a range of applications in areas such as game theory, engineering, finance, and the training of adversarial models in machine learning. In this talk, I will provide an overview of smooth convex-concave min-max optimization. I will then discuss connections between smooth min-max optimization and the problem of finding a fixed point of a nonexpansive (1-Lipschitz) map. This connection allows us to use variants of Halpern iteration, a classical method for finding fixed points of nonexpansive maps, to solve broad classes of min-max optimization problems. The resulting algorithms attain near-optimal convergence rates for different classes of smooth min-max problems, are parameter-free, and lead to guarantees both in terms of approximating the associated optimality gap and the appropriate notion of stationarity (in unconstrained optimization, an approximate stationary point is a point with small gradient). These are the first results that can simultaneously achieve all of the aforementioned guarantees and, up to logarithmic factors, completely resolve the complexity of smooth min-max optimization.
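To illustrate the classical method the abstract refers to: Halpern iteration finds a fixed point of a nonexpansive map T by averaging the current iterate's image under T with a fixed anchor point x0, with the anchor weight shrinking over time. The sketch below is a minimal illustration, not the talk's algorithm; the anchoring schedule lambda_k = 1/(k+2) and the simple 1-D example map are standard textbook choices assumed here for concreteness.

```python
def halpern(T, x0, iters=2000):
    """Halpern iteration: x_{k+1} = lam_k * x0 + (1 - lam_k) * T(x_k),
    with the classical anchoring schedule lam_k = 1/(k+2)."""
    x = x0
    for k in range(iters):
        lam = 1.0 / (k + 2)
        x = lam * x0 + (1.0 - lam) * T(x)
    return x

# Example: T(x) = 0.5*x + 1 is nonexpansive (in fact a contraction),
# with unique fixed point x* = 2. This map is purely illustrative.
approx = halpern(lambda x: 0.5 * x + 1.0, x0=0.0)
```

Because the anchor weight lambda_k vanishes, the iterates converge to a fixed point of T rather than to a point biased toward x0; for general nonexpansive maps (where plain repeated application of T can fail to converge), this vanishing anchoring is what guarantees convergence.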
Bio: Jelena Diakonikolas is an assistant professor in the Department of Computer Sciences at UW-Madison. Prior to joining UW-Madison, she held postdoctoral positions at UC Berkeley and Boston University. She received her PhD from Columbia University in 2016. Her research interests are in efficient optimization methods for large-scale problems and connections between optimization and other areas such as dynamical systems and Markov chain Monte Carlo methods. Jelena is a recipient of a Simons-Berkeley Research Fellowship, the Morton B. Friedman Prize for Excellence at Columbia Engineering, and a Qualcomm Innovation Fellowship.