In optimization, the line search strategy is one of two basic iterative approaches to find a local minimum x* of an objective function f : ℝⁿ → ℝ. The other approach is trust region.
The line search approach first finds a descent direction along which the objective function f will be reduced, and then computes a step size that determines how far x should move along that direction. The descent direction can be computed by various methods, such as gradient descent or a quasi-Newton method; the step size can be determined either exactly or inexactly.
Example use
Here is an example gradient method that uses a line search in step 4.
1. Set iteration counter k = 0, and make an initial guess x₀ for the minimum.
2. Repeat:
3. Compute a descent direction p_k.
4. Choose α_k to loosely minimize h(α) = f(x_k + α p_k) over α ∈ ℝ₊.
5. Update x_{k+1} = x_k + α_k p_k, and k = k + 1.
6. Until ‖∇f(x_k)‖ < tolerance.
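The steps above can be sketched as follows. This is a minimal illustration, not the article's own code: the quadratic test problem and the trivial fixed-step stand-in for the line search are assumptions, chosen only so the loop is runnable end to end.

```python
import numpy as np

def gradient_method(f, grad, x0, line_search, tol=1e-8, max_iter=1000):
    """Gradient method following the numbered steps above."""
    x = np.asarray(x0, dtype=float)                 # step 1: initial guess
    for _ in range(max_iter):                       # step 2: repeat
        g = grad(x)
        if np.linalg.norm(g) < tol:                 # step 6: until small gradient
            break
        p = -g                                      # step 3: descent direction
        alpha = line_search(f, grad, x, p)          # step 4: choose alpha_k
        x = x + alpha * p                           # step 5: update x_{k+1}
    return x

# Hypothetical test problem (an assumption): f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

# A fixed small step as a trivial stand-in for the line search of step 4.
fixed_step = lambda f, grad, x, p: 0.1
x_star = gradient_method(f, grad, np.zeros(2), fixed_step)
```

Any line search satisfying step 4 can be dropped in for `fixed_step` without changing the outer loop.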
At the line search step (4) the algorithm might either exactly minimize h, by solving h′(α_k) = 0, or loosely, by asking for a sufficient decrease in h.
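One common "loose" strategy is backtracking under the Armijo sufficient-decrease condition. A minimal sketch, assuming conventional default constants c and rho (these values are an assumption, not from the original):

```python
import numpy as np

def backtracking(f, grad, x, p, alpha0=1.0, c=1e-4, rho=0.5):
    """Shrink alpha until the Armijo sufficient-decrease condition holds:
        f(x + alpha p) <= f(x) + c * alpha * grad f(x)^T p
    """
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ p        # h'(0); negative when p is a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho           # reject alpha, try a smaller step
    return alpha

# Hypothetical 1-D example (an assumption): f(x) = x^2 at x = 1,
# with the steepest-descent direction p = -grad f(x).
f = lambda x: float(x @ x)
grad = lambda x: 2 * x
x = np.array([1.0])
p = -grad(x)
alpha = backtracking(f, grad, x, p)
```

Exact minimization instead solves h′(α_k) = 0, which is usually only practical when h has a closed form, e.g. for quadratic objectives.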
Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.
Direct search methods
In this method, the minimum must first be bracketed, so the algorithm must identify points x1 and x2 such that the sought minimum lies between them. The interval is then divided by evaluating the function at interior points and discarding the subinterval that cannot contain the minimum; repeating this narrows the bracket until the minimum is located to the desired accuracy.
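A minimal sketch of one such bracketing method, golden-section search, which shrinks the bracket by a constant factor of about 0.618 per step. It assumes f is unimodal on [x1, x2]; the tolerance and test function are assumptions for illustration:

```python
import math

def golden_section(f, x1, x2, tol=1e-8):
    """Golden-section search for the minimum of a unimodal f on [x1, x2]."""
    invphi = (math.sqrt(5) - 1) / 2   # 1/phi, about 0.618
    a, b = x1, x2
    c = b - invphi * (b - a)          # two interior points dividing the interval
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):               # minimum lies in [a, d]; drop [d, b]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                         # minimum lies in [c, b]; drop [a, c]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2
```

Because each step reuses one interior point, only one new function evaluation is needed per iteration.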