Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) is a matrix-free method for finding the largest (or smallest) eigenvalues and the corresponding eigenvectors of a symmetric positive definite generalized eigenvalue problem

$Ax = \lambda Bx,$
for a given pair $(A, B)$ of complex Hermitian or real symmetric matrices, where the matrix $B$ is also assumed positive-definite.
Algorithm
The method performs an iterative maximization (or minimization) of the generalized Rayleigh quotient

$\rho(x) := \rho(A, B; x) := \frac{x^T A x}{x^T B x},$
which results in finding the largest (or smallest) eigenpairs of $Ax = \lambda Bx$.
The direction of the steepest ascent, which is the gradient, of the generalized Rayleigh quotient is positively proportional to the vector

$r := Ax - \rho(x) Bx,$
called the eigenvector residual. If a preconditioner $T$ is available, it is applied to the residual, giving the vector

$w := Tr,$

called the preconditioned residual. Without preconditioning, we set $T := I$ and thus $w := r$. An iterative method

$x^{i+1} := x^i + \alpha^i T(Ax^i - \rho(x^i) Bx^i),$

or, in short,

$x^{i+1} := x^i + \alpha^i w^i, \quad w^i := Tr^i, \quad r^i := Ax^i - \rho(x^i) Bx^i,$

is known as preconditioned steepest ascent (or descent), where the scalar $\alpha^i$ is called the step size. The optimal step size can be determined by maximizing the Rayleigh quotient, i.e.,

$x^{i+1} := \arg\max_{y \in \operatorname{span}\{x^i, w^i\}} \rho(y)$

(or $\arg\min$ in the case of minimizing), in which case the method is called locally optimal. To further accelerate convergence, one extra vector can be added to this two-term recurrence to make it three-term:

$x^{i+1} := \arg\max_{y \in \operatorname{span}\{x^i, w^i, x^{i-1}\}} \rho(y)$

(use $\arg\min$ in the case of minimizing).
This is a single-vector version of the LOBPCG method. It is one of the possible generalizations of the preconditioned conjugate gradient linear solvers to the case of symmetric eigenvalue problems. Even in the trivial case $T = I$ and $B = I$, the resulting approximation with $i > 3$ will be different from that obtained by the Lanczos algorithm, although both approximations belong to the same Krylov subspace.
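The recurrence can be made concrete as a small Rayleigh–Ritz procedure on the three-dimensional trial subspace. The following is a minimal single-vector sketch in Python (NumPy/SciPy), assuming dense symmetric $A$ and positive-definite $B$; the function name lobpcg_single and the callable preconditioner T are illustrative, not part of any library.

```python
import numpy as np
from scipy.linalg import eigh

def lobpcg_single(A, B, x, T=None, maxiter=100, tol=1e-8):
    """Sketch: smallest eigenpair of A x = lambda B x via the
    locally optimal three-term recurrence described above."""
    if T is None:
        T = lambda r: r                   # no preconditioning: T = I, so w = r
    x = x / np.sqrt(x @ B @ x)            # B-normalize the initial guess
    x_prev = None
    for _ in range(maxiter):
        rho = (x @ A @ x) / (x @ B @ x)   # generalized Rayleigh quotient
        r = A @ x - rho * (B @ x)         # eigenvector residual
        if np.linalg.norm(r) < tol:
            break
        w = T(r)                          # preconditioned residual
        # Rayleigh-Ritz on span{x, w} (first step) or span{x, w, x_prev}
        cols = [x, w] if x_prev is None else [x, w, x_prev]
        S = np.column_stack(cols)
        vals, vecs = eigh(S.T @ A @ S, S.T @ B @ S)
        y = S @ vecs[:, 0]                # minimizer of rho over the subspace
        x_prev, x = x, y / np.sqrt(y @ B @ y)
    return rho, x
```

In robust implementations the trial basis is additionally B-orthonormalized before the Rayleigh–Ritz step, and the vector $x^{i-1}$ is replaced by an implicitly computed direction, which is mathematically equivalent but numerically more stable.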
Iterating several approximate eigenvectors together in a block in a similar locally optimal fashion gives the full block version of LOBPCG. It allows robust computation of eigenvectors corresponding to nearly multiple eigenvalues.
General software implementations
LOBPCG's inventor, Andrew Knyazev, published an implementation called Block Locally Optimal Preconditioned Eigenvalue Xolvers (BLOPEX) with interfaces to PETSc and hypre. Other implementations are available in, e.g., Octave, MATLAB, Java, Anasazi (Trilinos), SLEPc, SciPy, and MAGMA.
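For example, the SciPy implementation scipy.sparse.linalg.lobpcg exposes the block version directly. The call below is an illustrative sketch: the 1-D Laplacian test matrix and the Jacobi (diagonal) preconditioner are arbitrary example choices, not requirements of the solver.

```python
import numpy as np
from scipy.sparse import spdiags
from scipy.sparse.linalg import LinearOperator, lobpcg

n = 1000
# Example operator: tridiagonal 1-D Laplacian, a common test matrix
data = np.array([-np.ones(n), 2 * np.ones(n), -np.ones(n)])
A = spdiags(data, [-1, 0, 1], n, n).tocsr()

# Simple Jacobi (diagonal) preconditioner wrapped as a LinearOperator
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda r: inv_diag * r)

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))   # block of 4 initial vectors

# Four smallest eigenpairs of the standard problem A x = lambda x
eigvals, eigvecs = lobpcg(A, X, M=M, tol=1e-8, maxiter=200, largest=False)
print(eigvals)
```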
Material Sciences
LOBPCG is implemented in ABINIT (including the CUDA version) and Octopus. It has been used for multi-billion-size matrices by Gordon Bell Prize finalists on the Earth Simulator supercomputer in Japan. Recent implementations include TTPY, Platypus-QM, and MFDn.
Maxwell's equations
LOBPCG is one of the core eigenvalue solvers in PYFEMax, NGSolve, and MFEM.
Data Mining
The software package Megaman uses LOBPCG to scale manifold learning algorithms to large data sets.