Academic Seminar

Title: Quadratic Optimization with Orthogonality Constraint: Explicit Lojasiewicz Exponent and Linear Convergence of Retraction-Based Line-Search and Stochastic Variance-Reduced Gradient Methods


Speaker: Anthony Man-Cho So




Time: October 17, 2018, 10:00 a.m.


Abstract: The problem of optimizing a quadratic form over an orthogonality constraint (QP-OC for short) is one of the most fundamental matrix optimization problems and arises in many applications. In this work, we characterize the growth behavior of the objective function around the critical points of the QP-OC problem and demonstrate how such characterization can be used to establish the linear convergence of iterative methods that exploit the manifold structure of the orthogonality constraint to find a critical point of the problem. We also propose a stochastic variance-reduced gradient (SVRG) method called Stiefel-SVRG for solving the QP-OC problem and present a novel linear convergence analysis of the method. An important feature of Stiefel-SVRG is that it allows for general retractions and does not require the computation of any vector transport on the Stiefel manifold. As such, it is computationally more advantageous than other recently proposed SVRG-type algorithms for manifold optimization.
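To illustrate the retraction-based setting the abstract refers to, the sketch below runs plain Riemannian gradient descent (not the talk's Stiefel-SVRG method) on a QP-OC instance: minimize tr(XᵀAX) subject to XᵀX = I, using a QR-factorization retraction on the Stiefel manifold. The step size, iteration count, and problem sizes are illustrative assumptions, not values from the talk.

```python
import numpy as np

def qr_retraction(Y):
    """Map a matrix back onto the Stiefel manifold via the Q factor
    of its thin QR decomposition (signs fixed so diag(R) > 0,
    which makes the retraction the identity at a feasible point)."""
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def stiefel_gradient_descent(A, p, steps=3000, eta=0.05, seed=0):
    """Minimize tr(X^T A X) over X with X^T X = I (A symmetric, n x n).

    Each iteration projects the Euclidean gradient onto the tangent
    space of the Stiefel manifold and retracts the step via QR --
    no vector transport is needed, mirroring the retraction-only
    flavor of the methods discussed in the abstract.
    """
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = qr_retraction(rng.standard_normal((n, p)))  # random feasible start
    for _ in range(steps):
        G = 2.0 * A @ X                       # Euclidean gradient of tr(X^T A X)
        sym = (X.T @ G + G.T @ X) / 2.0
        grad = G - X @ sym                    # tangent-space projection
        X = qr_retraction(X - eta * grad)     # retraction-based step
    return X
```

For a symmetric A, the global minimizer spans the eigenvectors of the p smallest eigenvalues, so the optimal objective value equals the sum of those eigenvalues, which gives a quick sanity check on the iterate returned above.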


Host: Prof. Min Tao (陶敏)