August 31, 2021

12:00 pm – 1:00 pm

Venue

Held virtually; Link TBA

Recorded Seminar Link:
https://wse.zoom.us/rec/play/i7p2KEZSOhj9-dDN9G7qrPHHtp-w6s-kYzWE-ndNi-h5S2p_A4lx_I0sLa5VZp5WrK86aRDWsEfalg2q.oErb6GzH4Xap8YOu?_x_zm_rhtaid=176&_x_zm_rtaid=36bHVUOLRYeH7ydgeRiTeQ.1630441499172.ab64fb8b2d258128cf6140e3738d8a7e&autoplay=true&continueMode=true&startTime=1630425620000

Title: "Radial Duality: Scalable, Projection-Free Optimization Methods"

 

Abstract: Generalizing ideas of Renegar (2016) for solving conic programs, this talk will introduce a new radial duality between optimization problems. This duality allows us to reformulate (potentially nonconvex) constrained optimization problems into an equivalent unconstrained, Lipschitz continuous form. Designing algorithms to solve this radially dual problem yields new projection-free first-order methods that avoid potentially costly orthogonal projection steps and do not require any Lipschitz continuity-type assumptions. The resulting radial optimization methods present an opportunity to scale up much more efficiently than classic alternatives like projected gradient descent or Frank-Wolfe.
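To make the contrast concrete, below is a minimal, illustrative Python sketch (not taken from the talk) of projected gradient descent on a simple ball-constrained quadratic. The explicit projection step in every iteration is exactly the kind of operation the projection-free radial methods described above aim to avoid; the objective, feasible set, and step size here are hypothetical choices for illustration only.

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    """Orthogonal projection onto the Euclidean ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_gradient_descent(grad, x0, step=0.1, iters=100, radius=1.0):
    """Classic projected gradient descent: every iteration takes a gradient
    step and then projects back onto the feasible set."""
    x = x0
    for _ in range(iters):
        x = project_onto_ball(x - step * grad(x), radius)
    return x

# Hypothetical example: minimize ||x - c||^2 over the unit ball,
# with c chosen outside the ball so the constraint is active.
c = np.array([2.0, 1.0])
grad = lambda x: 2.0 * (x - c)
x_star = projected_gradient_descent(grad, x0=np.zeros(2))
print(x_star)  # approximately c / ||c||, the projection of c onto the ball
```

For a Euclidean ball the projection is a cheap rescaling, but for more complicated feasible regions this per-iteration projection can dominate the cost of the method, which is the scalability issue the radial reformulation targets.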

 

Biography: Benjamin Grimmer recently joined the Johns Hopkins AMS department as an assistant professor after completing his PhD in Operations Research at Cornell University, advised by Jim Renegar and Damek Davis and supported by an NSF Graduate Research Fellowship. Ben's research focuses on mathematical programming and continuous optimization methods that work at large scale. An overarching theme in Ben's research is bridging the gap between our understanding of classical continuous optimization approaches and the potentially stochastic, nonconvex, nonsmooth, adversarial models employed in many modern data science and machine learning settings.