# Steepest ascent

### From Glossary

(**descent**, if minimizing). This is a class of algorithms of the form

$$x^{k+1} = x^k + s_k d^k,$$

where the *direction vector* $d^k$ is chosen by maximizing the initial "velocity" of change, and the step size $s_k$ is chosen by line search. Generally used in the context of unconstrained optimization, the mathematical program is

$$\max \{ f(x) : x \in \mathbb{R}^n \},$$

where $f$ is differentiable. (For *descent*, change max to min.) Then, $d^k$ is chosen to maximize the first-order Taylor approximation, subject to a normalization constraint:

$$d^k \in \operatorname{argmax} \{ \nabla f(x^k) \cdot d : \|d\| = 1 \},$$

where $\|d\|$ denotes the norm of the direction vector $d$. When the Euclidean norm is used, this yields the original steepest ascent algorithm by Cauchy, which moves in the direction of the gradient:

$$d^k = \frac{\nabla f(x^k)}{\|\nabla f(x^k)\|}.$$

(No direction vector is sought if $\nabla f(x^k) = 0$; such algorithms stop when reaching a stationary point.)
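The iteration above can be sketched as follows. This is a minimal illustrative implementation, not a production algorithm: the backtracking (Armijo-style) line search, the tolerances, and the example function are all choices made here for demonstration.

```python
import numpy as np

def steepest_ascent(f, grad, x0, tol=1e-8, max_iter=500):
    """Cauchy steepest ascent with a simple backtracking line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stationary point: no direction is sought
            break
        d = g / np.linalg.norm(g)     # Euclidean steepest-ascent direction
        # Backtracking line search: halve s until a sufficient increase holds.
        s = 1.0
        while f(x + s * d) < f(x) + 1e-4 * s * (g @ d) and s > 1e-12:
            s *= 0.5
        x = x + s * d
    return x

# Example: maximize the concave f(x, y) = -(x - 1)^2 - 2*(y + 3)^2,
# whose unique maximizer is (1, -3).
f = lambda x: -(x[0] - 1) ** 2 - 2 * (x[1] + 3) ** 2
grad = lambda x: np.array([-2 * (x[0] - 1), -4 * (x[1] + 3)])
x_star = steepest_ascent(f, grad, x0=[0.0, 0.0])
```

For descent, one would negate the objective (or move along $-\nabla f$) and require a sufficient *decrease* in the line search.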

Other norms, such as

$$\|d\|_A = \sqrt{d^\top A d},$$

where $A$ is symmetric and positive definite, lead to other directions that are *steepest* relative to that norm. In particular, taking $A$ to be the Hessian of $f$ at $x^k$ (modified as needed so that it is positive definite) yields the modified Newton method.
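Maximizing $\nabla f(x^k) \cdot d$ subject to $d^\top A d = 1$ has the closed-form solution $d \propto A^{-1} \nabla f(x^k)$, which the sketch below computes; the example gradient and matrix are arbitrary illustrative values.

```python
import numpy as np

def steepest_direction(g, A):
    """Direction maximizing g.d subject to d'Ad = 1, for symmetric
    positive definite A. The maximizer is A^{-1} g, rescaled to unit A-norm."""
    d = np.linalg.solve(A, g)          # A^{-1} g
    return d / np.sqrt(d @ A @ d)      # normalize so that d'Ad = 1

g = np.array([2.0, -12.0])                 # gradient at the current point (example)
A = np.array([[2.0, 0.0], [0.0, 4.0]])     # symmetric positive definite (example)
d = steepest_direction(g, A)
```

With $A = I$ this reduces to Cauchy's direction $\nabla f / \|\nabla f\|$; with $A$ equal to the (positive definite, possibly modified) Hessian, $A^{-1} \nabla f$ is the Newton direction, here rescaled to unit $A$-norm.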