Revisiting Riemannian optimization for the symmetric eigenvalue problem
Bart Vandereycken
Department of Mathematics, University of Geneva

Computing dominant eigenpairs of a symmetric matrix is one of the most prototypical problems for Riemannian optimization. In particular, it naturally leads to the maximization of a smooth objective function on the Stiefel manifold (of orthonormal matrices) or the Grassmann manifold (of subspaces). Several well-known algorithms in numerical linear algebra for the symmetric eigenvalue problem can be elegantly analyzed by exploiting this Riemannian point of view.
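As a concrete illustration (not taken from the talk itself): for a p-dimensional dominant subspace of a symmetric matrix A, the standard objective is the block Rayleigh quotient f(X) = trace(X^T A X) over orthonormal X. A minimal NumPy sketch of this objective and its Riemannian gradient, with illustrative function names, could look as follows.

    import numpy as np

    def rayleigh_objective(A, X):
        # Block Rayleigh quotient f(X) = trace(X^T A X); X has
        # orthonormal columns (a point on the Stiefel manifold).
        # On the Grassmann manifold only the column span of X matters.
        return np.trace(X.T @ A @ X)

    def riemannian_gradient(A, X):
        # Project the Euclidean gradient 2 A X onto the tangent space
        # via I - X X^T, as used for the Grassmann manifold.
        G = 2.0 * A @ X
        return G - X @ (X.T @ G)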

In this talk, we revisit this program and show how recent advances in non-convex optimization, such as weak quasi-convexity, can be generalized to the Grassmann manifold and the symmetric eigenvalue problem. In particular, in contrast to existing results on local convexity, we are able to show that Riemannian steepest descent converges linearly to the dominant subspace from any initial point outside a set of measure zero.
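To make the iteration concrete, here is a hedged sketch of Riemannian steepest ascent (for the maximization form of the problem) with a fixed step size and a QR-based retraction; the step-size rule and retraction used in the actual analysis may differ.

    def steepest_descent(A, X0, step, iters):
        # Riemannian steepest ascent for maximizing trace(X^T A X):
        # move along the projected gradient, then return to the
        # manifold with a thin QR factorization (a standard retraction).
        X = X0
        for _ in range(iters):
            xi = riemannian_gradient(A, X)
            X, _ = np.linalg.qr(X + step * xi)
        return X

For instance, with A symmetric and X0 a random orthonormal n-by-p block, the span of the iterates approaches the invariant subspace of the p largest eigenvalues; a step size on the order of 1/||A|| is a safe illustrative choice.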

While steepest descent is not competitive with Krylov methods or even subspace iteration, it has the benefit that it is very robust to perturbations of the matrix and that it can be accelerated with a momentum term. We will illustrate this empirically and compare with existing methods.
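A simple way to add momentum, shown here purely as an illustrative heavy-ball-style sketch (the accelerated scheme discussed in the talk is not specified in this abstract), is to keep a search direction that mixes the previous direction, re-projected onto the current tangent space, with the new gradient.

    def steepest_descent_momentum(A, X0, step, beta, iters):
        # Heavy-ball-style variant: beta in [0, 1) controls the momentum.
        # The previous direction V is re-projected onto the tangent space
        # at the current iterate instead of using a vector transport
        # (a common simplification in practice).
        X, V = X0, np.zeros_like(X0)
        for _ in range(iters):
            xi = riemannian_gradient(A, X)
            V = beta * (V - X @ (X.T @ V)) + xi
            X, _ = np.linalg.qr(X + step * V)
        return X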

Joint work with: Foivos Alimisis (Geneva) and Yousef Saad (Minnesota).

