We formulate a Riemannian Newton algorithm for solving a class of nonlinear eigenvalue problems by minimizing a total energy function subject to an orthogonality constraint. Under mild assumptions, we establish the global and quadratic convergence of the proposed method. Moreover, we derive a positive-definiteness condition for the Riemannian Hessian of the total energy function at a solution. Numerical tests are reported to illustrate the efficiency of the proposed method on large-scale problems.
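To make the constrained-minimization setting concrete, the following is a minimal NumPy sketch of energy minimization over the Stiefel manifold {X : XᵀX = I}. It uses plain Riemannian gradient descent with a QR retraction rather than the Newton iteration of the abstract, and the quadratic energy, function name, and step size are illustrative assumptions, not the paper's model.

```python
import numpy as np

def stiefel_energy_descent(A, p, steps=500, tau=0.1):
    """Minimize the model energy E(X) = tr(X^T A X) over the Stiefel
    manifold {X in R^{n x p} : X^T X = I} by Riemannian gradient
    descent with a QR retraction -- a simple stand-in for Newton-type
    schemes on the same constraint set (illustrative, not the paper's
    algorithm)."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random orthonormal start
    for _ in range(steps):
        G = 2 * A @ X                            # Euclidean gradient of tr(X^T A X)
        # Project onto the tangent space at X: G - X * sym(X^T G)
        rgrad = G - X @ (X.T @ G + G.T @ X) / 2
        # Retract the step back onto the manifold via thin QR
        Q, R = np.linalg.qr(X - tau * rgrad)
        X = Q * np.sign(np.diag(R))              # fix the sign ambiguity of QR
    return X
```

For a symmetric matrix A, the minimizer spans the eigenvectors of the p smallest eigenvalues, so the sketch doubles as a toy eigensolver under the same orthogonality constraint as the abstract's energy minimization.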
In this paper, we focus on the stochastic inverse eigenvalue problem of reconstructing a stochastic matrix from a prescribed spectrum. We reformulate the stochastic inverse eigenvalue problem directly as a constrained optimization problem over several matrix manifolds, minimizing the distance between isospectral matrices and stochastic matrices. We then propose a geometric Polak–Ribière–Polyak-based nonlinear conjugate gradient method for solving the constrained optimization problem, and establish its global convergence. Our method also extends to the stochastic inverse eigenvalue problem with prescribed entries. An extra advantage is that our models yield new isospectral flow methods. Finally, we report numerical tests illustrating the efficiency of the proposed method for the stochastic inverse eigenvalue problem, including the case of prescribed entries.
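For readers unfamiliar with the Polak–Ribière–Polyak update, here is its flat-space (Euclidean) analogue as a hedged sketch: on a manifold, the gradients would be Riemannian and the previous search direction would need a vector transport before recombining, both of which this sketch omits. The function name, the PRP+ nonnegativity clip, and the Armijo parameters are illustrative choices, not the paper's method.

```python
import numpy as np

def prp_cg(f, grad, x0, steps=200, c=1e-4, shrink=0.5):
    """Polak-Ribiere-Polyak nonlinear conjugate gradient with Armijo
    backtracking: the Euclidean analogue of a geometric PRP scheme
    (illustrative sketch, not the paper's manifold algorithm)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(steps):
        if g @ d >= 0:            # restart if d is not a descent direction
            d = -g
        t = 1.0                   # Armijo backtracking line search along d
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ coefficient, clipped at zero for stability
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d     # new conjugate direction
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-12:
            break
    return x
```

On a convex quadratic this reduces to a (inexact-line-search) conjugate gradient iteration; the geometric version replaces the vector sums by tangent-space operations on the product of matrix manifolds.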