Steepest ascent


(Descent, if minimizing.) This is a class of algorithms of the form $x' = x + sd$, where the direction vector $d$ is chosen to maximize the initial "velocity" of change and the step size $s$ is chosen by line search. It is generally used in the context of unconstrained optimization, where the mathematical program is $\max \{f(x) : x \in \mathbb{R}^n\}$ with $f \in C^1$. (For descent, change max to min.) The direction $d$ is chosen to maximize the first-order Taylor approximation, subject to a normalization constraint: $\max \{\nabla f(x)\, d : \|d\| = 1\}$, where $\|d\|$ denotes the norm of the direction vector $d$. When the Euclidean norm is used, this yields Cauchy's original steepest ascent algorithm, which moves in the direction of the gradient:

$$d = \nabla f(x) / \|\nabla f(x)\|.$$

(No direction vector is sought if $\nabla f(x) = 0$; such algorithms stop upon reaching a stationary point.)
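
To make the method concrete, here is a minimal sketch in Python of steepest ascent with a simple backtracking line search; the objective f, its gradient grad_f, the starting point, and the backtracking parameters are illustrative assumptions, not part of the glossary definition.

```python
import numpy as np

def steepest_ascent(f, grad_f, x0, tol=1e-8, max_iter=1000):
    """Maximize f by moving along the normalized gradient direction,
    with a backtracking line search to choose the step size s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        norm_g = np.linalg.norm(g)
        if norm_g < tol:           # stationary point: no direction is sought
            break
        d = g / norm_g             # Euclidean steepest-ascent direction
        s = 1.0
        while f(x + s * d) <= f(x) and s > 1e-12:
            s *= 0.5               # backtrack until the step improves f
        x = x + s * d
    return x

# Example (assumed): maximize f(x, y) = -(x - 1)^2 - 2(y + 0.5)^2,
# whose unique maximizer is (1, -0.5).
f = lambda x: -(x[0] - 1)**2 - 2 * (x[1] + 0.5)**2
grad_f = lambda x: np.array([-2 * (x[0] - 1), -4 * (x[1] + 0.5)])
print(steepest_ascent(f, grad_f, [0.0, 0.0]))  # approx. [1.0, -0.5]
```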

Other norms, such as $\|d\|^2 = d' Q d$ with $Q$ symmetric and positive definite, lead to other directions that are steepest relative to that norm. In particular, taking $Q = H_f(x)$, the Hessian of $f$ at $x$, yields the modified Newton method.
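
For the $Q$-norm, maximizing $\nabla f(x)\, d$ subject to $d' Q d = 1$ gives a direction proportional to $Q^{-1} \nabla f(x)$. Below is a minimal sketch of computing this direction; the example gradient and Hessian are assumptions chosen purely for illustration.

```python
import numpy as np

def q_norm_direction(g, Q):
    """Steepest-ascent direction relative to the norm ||d||^2 = d' Q d:
    the maximizer of g'd subject to d'Qd = 1 is proportional to Q^{-1} g."""
    d = np.linalg.solve(Q, g)      # Q^{-1} g, without forming the inverse
    return d / np.sqrt(g @ d)      # scale so that d'Qd = 1

# With Q = H_f(x) (the Hessian, assumed positive definite), this direction
# coincides with the modified Newton step up to a positive scaling.
g = np.array([2.0, -2.0])                   # example gradient (assumed)
Q = np.array([[2.0, 0.0], [0.0, 4.0]])      # example Hessian (assumed)
print(q_norm_direction(g, Q))               # approx. [0.577, -0.289]
```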

