Matrix-free methods


In computational mathematics, a matrix-free method is an algorithm for solving a linear system of equations or an eigenvalue problem that does not store the coefficient matrix explicitly, but instead accesses the matrix by evaluating matrix-vector products. Such methods are preferable when the matrix is so large that storing and manipulating it would be prohibitively expensive in memory and computation time, even with sparse-matrix techniques. Many iterative methods allow for a matrix-free implementation (a minimal sketch is given after the list below), including:

  • the power method,
  • the Lanczos algorithm,
  • Locally Optimal Block Preconditioned Conjugate Gradient Method (LOBPCG),
  • Wiedemann's coordinate recurrence algorithm, and
  • the conjugate gradient method.

Distributed solutions have also been explored using coarse-grain parallel software systems to achieve homogeneous solutions of linear systems.
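
Because such a solver only ever needs the action of the matrix on a vector, a matrix-free implementation can be built around any routine that computes that product. Below is a minimal sketch using SciPy's LinearOperator together with its conjugate gradient solver; the 1-D discrete Laplacian used as the operator is only an illustrative choice.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 500

def laplacian_matvec(v):
    # Apply the 1-D discrete Laplacian stencil [-1, 2, -1] to v
    # without ever forming the n-by-n matrix.
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

# The solver only sees a matrix-vector product, never the matrix itself.
A = LinearOperator((n, n), matvec=laplacian_matvec, dtype=float)

b = np.ones(n)
x, info = cg(A, b)  # info == 0 signals convergence
print("cg info:", info, "residual norm:", np.linalg.norm(laplacian_matvec(x) - b))
```

Here the full n-by-n matrix is never allocated; only a handful of length-n vectors are kept in memory.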

Matrix-free methods are commonly used for solving non-linear systems such as the Euler equations in computational fluid dynamics. Solving these equations with Newton-type methods requires the Jacobian, which is costly to compute and store in terms of CPU time and memory. To avoid this expense, matrix-free methods are employed: rather than calculating the Jacobian, the Jacobian-vector product is formed directly, for example through the finite-difference approximation J(u)v ≈ (F(u + εv) − F(u))/ε, which needs only evaluations of the residual F. Calculating and manipulating this vector is much easier than working with a large matrix or explicit linear system.
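
As a rough illustration of this idea, the sketch below wraps a finite-difference Jacobian-vector product in a LinearOperator and takes a single Newton step with GMRES. The residual function F is a made-up example, and the fixed perturbation eps is a simplification; practical Jacobian-free Newton-Krylov codes scale it with the norms of u and v.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(u):
    # Hypothetical nonlinear residual; stands in for a discretized PDE residual.
    return u**3 + 2.0 * u - 1.0

def jac_vec(u, v, eps=1e-7):
    # Finite-difference Jacobian-vector product:
    #   J(u) v ~= (F(u + eps*v) - F(u)) / eps
    # The Jacobian matrix itself is never formed or stored.
    return (F(u + eps * v) - F(u)) / eps

u = np.full(100, 0.5)  # current Newton iterate
J = LinearOperator((u.size, u.size), matvec=lambda v: jac_vec(u, v), dtype=float)

# One Newton step: solve J(u) du = -F(u) with a Krylov method that needs
# only matrix-vector products.
du, info = gmres(J, -F(u))
u_next = u + du
print("gmres info:", info, "residual norm after step:", np.linalg.norm(F(u_next)))
```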
