In this paper, we focus on the stochastic inverse eigenvalue problem of reconstructing a stochastic matrix from a prescribed spectrum. We reformulate the stochastic inverse eigenvalue problem directly as a constrained optimization problem over several matrix manifolds, minimizing the distance between isospectral matrices and stochastic matrices. We then propose a geometric Polak–Ribière–Polyak-based nonlinear conjugate gradient method for solving the constrained optimization problem, and we establish the global convergence of the proposed method. Our method also extends to the stochastic inverse eigenvalue problem with prescribed entries. An extra advantage is that our models yield new isospectral flow methods. Finally, we report numerical tests illustrating the efficiency of the proposed method for the stochastic inverse eigenvalue problem and for the case of prescribed entries.
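One ingredient of such a distance-based formulation can be made concrete (a minimal illustrative sketch, not the authors' actual algorithm): the set of row-stochastic matrices is a product of probability simplices, so the nearest stochastic matrix to a given real matrix is obtained by projecting each row onto the simplex, e.g. with the standard sort-based Euclidean projection:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of a vector onto the probability simplex
    {x : x >= 0, sum(x) = 1} (standard sort-based algorithm)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    k = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / k > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def nearest_stochastic(X):
    """Row-wise projection of X onto the set of row-stochastic matrices."""
    return np.apply_along_axis(project_simplex, 1, X)

def dist_to_stochastic(X):
    """Frobenius distance from X to the stochastic matrices -- the kind of
    quantity a distance-minimizing formulation drives to zero."""
    return np.linalg.norm(X - nearest_stochastic(X))

X = np.array([[0.9, 0.3], [-0.2, 0.8]])
P = nearest_stochastic(X)   # rows of P are nonnegative and sum to 1
```

The names `project_simplex`, `nearest_stochastic`, and `dist_to_stochastic` are hypothetical helpers for this sketch; the paper's manifold formulation handles the isospectral constraint in addition to the stochastic one.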
The Lanczos method is often used to solve large-scale symmetric matrix eigenvalue problems. It is well known that the single-vector Lanczos method can find only one copy of any multiple eigenvalue and converges slowly towards clustered eigenvalues. The block Lanczos method, on the other hand, can compute all or some of the copies of a multiple eigenvalue and, with a suitable block size, also computes clustered eigenvalues much faster. The existing convergence theory due to Saad for the block Lanczos method, however, does not fully reflect this phenomenon, since that theory was established to bound approximation errors in each individual approximate eigenpair. Here, it is argued that in the presence of an eigenvalue cluster,
the entire approximate eigenspace associated with the cluster should be considered as a whole, rather than each individual approximate eigenvector, and likewise for approximating clusters of eigenvalues. In this paper, we obtain error bounds on approximating eigenspaces and eigenvalue clusters. Our bounds are much sharper than the existing ones and expose the true rates of convergence of the block Lanczos method towards eigenvalue clusters. Furthermore, their sharpness is independent of
the closeness of eigenvalues within a cluster. Numerical examples are presented to support our claims.
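The single-vector versus block behaviour described above can be reproduced in a few lines (a hedged sketch using full reorthogonalization and Rayleigh–Ritz extraction; the matrix size, spectrum, and step counts are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Symmetric test matrix: eigenvalues spread over [0, 1] plus a DOUBLE eigenvalue at 2.
evals = np.concatenate([np.linspace(0.0, 1.0, n - 2), [2.0, 2.0]])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(evals) @ Q.T

def block_lanczos_ritz(A, X, steps):
    """Ritz values after `steps` block Lanczos steps started at X (block size =
    number of columns of X), with full reorthogonalization for stability."""
    Qb, _ = np.linalg.qr(X)
    basis = [Qb]
    for _ in range(steps - 1):
        W = A @ Qb
        for _ in range(2):              # two reorthogonalization sweeps
            for V in basis:
                W -= V @ (V.T @ W)
        Qb, _ = np.linalg.qr(W)
        basis.append(Qb)
    K = np.hstack(basis)                # orthonormal basis of the block Krylov space
    return np.sort(np.linalg.eigvalsh(K.T @ A @ K))[::-1]

# Single-vector Lanczos (block size 1): only ONE copy of the double eigenvalue.
r1 = block_lanczos_ritz(A, rng.standard_normal((n, 1)), 12)
# Block size 2: BOTH copies of the double eigenvalue are found.
r2 = block_lanczos_ritz(A, rng.standard_normal((n, 2)), 10)
print(np.sum(np.abs(r1 - 2.0) < 1e-6), np.sum(np.abs(r2 - 2.0) < 1e-6))
```

With the single starting vector, exactly one Ritz value converges to the double eigenvalue 2, while the block run with block size 2 recovers both copies, matching the phenomenon the abstract describes.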