LOBPCG

Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) is a matrix-free method for finding the largest (or smallest) eigenvalues and the corresponding eigenvectors of a symmetric positive definite generalized eigenvalue problem

$A x = \lambda B x,$

for a given pair $(A, B)$ of complex Hermitian or real symmetric matrices, where the matrix $B$ is also assumed positive-definite.
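For small dense pairs $(A, B)$ the problem can also be solved directly, which is useful as a reference when experimenting with large-scale iterative solvers such as LOBPCG. A minimal sketch using SciPy's dense generalized eigensolver, with arbitrary example matrices:

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                  # real symmetric A
C = rng.standard_normal((n, n))
B = C @ C.T + n * np.eye(n)        # symmetric positive-definite B

lam, X = eigh(A, B)                # dense reference solution, eigenvalues ascending
k = 0                              # each column X[:, k] satisfies A x = lam[k] B x
assert np.allclose(A @ X[:, k], lam[k] * (B @ X[:, k]))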

Algorithm

The method performs an iterative maximization (or minimization) of the generalized Rayleigh quotient

$\rho(x) := \rho(A, B; x) := \frac{x^T A x}{x^T B x},$

which results in finding the largest (or smallest) eigenpairs of $A x = \lambda B x$.
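In code, the quotient is a one-line function; this sketch is illustrative, with names taken from the formulas above:

import numpy as np

def rayleigh_quotient(A, B, x):
    # rho(A, B; x) = (x^T A x) / (x^T B x); at an eigenvector x,
    # the quotient equals the corresponding eigenvalue.
    return (x @ A @ x) / (x @ B @ x)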

The direction of steepest ascent of the generalized Rayleigh quotient, which is its gradient, is positively proportional to the vector

$r := A x - \rho(x) B x,$

called the eigenvector residual. If a preconditioner $T$ is available, it is applied to the residual, giving the vector

$w := T r,$

called the preconditioned residual. Without preconditioning, we set $T := I$ and so $w := r$. An iterative method

$x_{i+1} := x_i + \alpha_i T (A x_i - \rho(x_i) B x_i),$

or, in short,

$x_{i+1} := x_i + \alpha_i w_i, \quad w_i := T r_i, \quad r_i := A x_i - \rho(x_i) B x_i,$

is known as preconditioned steepest ascent (or descent), where the scalar $\alpha_i$ is called the step size. The optimal step size can be determined by maximizing the Rayleigh quotient, i.e.,

$x_{i+1} := \arg\max_{y \in \operatorname{span}\{x_i, w_i\}} \rho(y)$

(or $\arg\min$ in the case of minimizing), in which case the method is called locally optimal. To further accelerate the convergence of the locally optimal preconditioned steepest ascent (or descent), one can add one extra vector to the two-term recurrence relation to make it three-term:

$x_{i+1} := \arg\max_{y \in \operatorname{span}\{x_i, w_i, x_{i-1}\}} \rho(y)$

(use $\arg\min$ in the case of minimizing). The maximization (or minimization) of the Rayleigh quotient in a 3-dimensional subspace can be performed numerically by the Rayleigh–Ritz method. As the iterations converge, the vectors $x_i$ and $x_{i-1}$ become nearly linearly dependent, making the Rayleigh–Ritz method numerically unstable in the presence of round-off errors. It is possible to substitute the vector $x_{i-1}$ with an explicitly computed difference $p_i = x_{i-1} - x_i$, making the Rayleigh–Ritz method more stable.
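Putting the pieces together, the following is a minimal sketch of this single-vector iteration, with the Rayleigh–Ritz step on the basis $\{x_i, w_i, p_i\}$ solved by SciPy's dense eigensolver. The tolerances, the lack of extra basis orthogonalization, and passing the preconditioner as an explicit matrix T are simplifying assumptions for illustration, not part of the algorithm's definition:

import numpy as np
from scipy.linalg import eigh

def lobpcg_single(A, B, T, x, maxiter=200, tol=1e-8):
    # Single-vector LOBPCG sketch: maximize rho over span{x, w, p}
    # by the Rayleigh-Ritz method at every iteration.
    x = x / np.sqrt(x @ B @ x)            # B-normalize the initial guess
    p = None                              # no previous direction yet
    rho = (x @ A @ x) / (x @ B @ x)
    for _ in range(maxiter):
        r = A @ x - rho * (B @ x)         # eigenvector residual
        if np.linalg.norm(r) < tol:
            break
        w = T @ r                         # preconditioned residual
        S = np.column_stack([x, w] if p is None else [x, w, p])
        # Rayleigh-Ritz: project onto the trial subspace and solve the
        # small generalized eigenproblem for the largest Ritz pair.
        lam, C = eigh(S.T @ A @ S, S.T @ B @ S)
        c = C[:, -1]                      # use C[:, 0] and lam[0] to minimize
        x_new = S @ c
        p = x_new - x                     # difference of successive iterates;
                                          # its sign does not change the span
        x = x_new / np.sqrt(x_new @ B @ x_new)
        rho = lam[-1]
    return rho, x

Without preconditioning one can pass T = np.eye(n); a common inexpensive alternative, given here only as an example choice, is the Jacobi preconditioner np.diag(1.0 / np.diag(A)).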

This is a single-vector version of the LOBPCG method. It is one possible generalization of the preconditioned conjugate gradient linear solvers to the case of symmetric eigenvalue problems. Even in the trivial case $T = I$ and $B = I$, the resulting approximation with $i > 3$ will be different from that obtained by the Lanczos algorithm, although both approximations belong to the same Krylov subspace.

Iterating several approximate eigenvectors together in a block in a similar locally optimal fashion gives the full block version of LOBPCG. It allows robust computation of eigenvectors corresponding to nearly multiple eigenvalues.

General software implementations

LOBPCG's inventor, Andrew Knyazev, published an implementation called Block Locally Optimal Preconditioned Eigenvalue Xolvers (BLOPEX) with interfaces to PETSc and hypre. Other implementations are available in, e.g., Octave, MATLAB, Java, Anasazi (Trilinos), SLEPc, SciPy, and MAGMA.
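As one concrete example, the SciPy implementation is exposed as scipy.sparse.linalg.lobpcg. A short usage sketch on an arbitrary sparse test matrix (a 1-D discrete Laplacian; the block size of three is an example choice):

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 1000
A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")  # 1-D Laplacian
X = np.random.default_rng(0).standard_normal((n, 3))            # block of 3 guesses

# largest=False requests the smallest eigenpairs; the matrix B and a
# preconditioner M are optional keyword arguments.
eigvals, eigvecs = lobpcg(A, X, largest=False, tol=1e-8, maxiter=500)
print(eigvals)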

Material sciences

LOBPCG is implemented in ABINIT (including its CUDA version) and Octopus. It has been used for multi-billion-size matrices by Gordon Bell Prize finalists on the Earth Simulator supercomputer in Japan. Recent implementations include TTPY, Platypus‐QM, and MFDn.

Maxwell's equations

LOBPCG is one of the core eigenvalue solvers in PYFEMax, NGSolve, and MFEM.

Data mining

The software package Megaman uses LOBPCG to scale manifold learning algorithms to large data sets.
