Operations Research and Optimization

Adaptive first-order algorithms for convex optimization

Published: 2024-04-15

Speaker: Prof. Yurii Malitskyi (University of Vienna)

Time: 16:00, Monday, April 22, 2024

Venue: Online (Zoom)

Meeting ID: 697 8613 5281

Passcode: 107700

Abstract: In this talk, we will cover several results related to adaptive algorithms. In particular, we show how to make gradient descent and the proximal gradient method fully adaptive without increasing their iteration cost. Our approach requires even weaker assumptions than the classical results: we only need local Lipschitz continuity of the gradient. The introduced stepsizes dynamically approximate the local curvature of the differentiable function and are allowed to increase incrementally over the iterations. We will discuss some limitations and open problems. Time permitting, we will continue with the more challenging problem of solving saddle point problems.
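For readers who want a concrete picture of the stepsize rule sketched in the abstract, the following is a minimal illustration in the spirit of the speaker's adaptive gradient descent (Malitsky and Mishchenko, 2020), where the stepsize is estimated from two consecutive iterates and gradients instead of a global Lipschitz constant. The function and parameter names (grad, x0, lam0, iters) are illustrative choices for this note, not an API from the talk.

```python
import numpy as np

def adaptive_gd(grad, x0, lam0=1e-7, iters=1000):
    """Sketch of adaptive gradient descent with curvature-estimating stepsizes.

    lam_k is capped by ||x_k - x_{k-1}|| / (2 ||g_k - g_{k-1}||), a local
    inverse-curvature estimate, and may otherwise grow by a factor
    sqrt(1 + theta_{k-1}) per iteration.
    """
    x_prev, g_prev = x0, grad(x0)
    x = x0 - lam0 * g_prev          # one tiny initial step
    lam_prev, theta = lam0, np.inf  # theta = ratio of consecutive stepsizes
    for _ in range(iters):
        g = grad(x)
        diff_x = np.linalg.norm(x - x_prev)
        diff_g = np.linalg.norm(g - g_prev)
        # Local curvature estimate; guard against identical gradients.
        curv_cap = diff_x / (2 * diff_g) if diff_g > 0 else np.inf
        lam = min(np.sqrt(1 + theta) * lam_prev, curv_cap)
        theta = lam / lam_prev
        x_prev, g_prev, lam_prev = x, g, lam
        x = x - lam * g
    return x

# Usage: minimize the quadratic f(x) = 0.5 * x^T A x (gradient is A x).
A = np.diag([1.0, 10.0, 100.0])
x_star = adaptive_gd(lambda x: A @ x, x0=np.ones(3))
```

The sqrt(1 + theta) factor is what permits the incremental stepsize growth mentioned in the abstract, while the ratio of iterate and gradient differences caps the stepsize by the locally observed curvature, so no global smoothness constant is ever needed.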

About the speaker: Yura Malitsky is an assistant professor of computational optimization at the University of Vienna in Austria. He completed his PhD at Kyiv National University in Ukraine in 2015. Following that, he held several postdoctoral positions across Europe and served as an assistant professor in Linköping, Sweden. His primary research focus is continuous optimization, with applications in machine learning. You can find more about his work at https://ymalitsky.com.

Host: Prof. Min Tao (陶敏)