# Reduced gradient method

### From Glossary

This applies to the case of linear constraints: $\min f(x)$ subject to $Ax = b$ and $x \ge 0$, where $A$ is an $m \times n$ matrix of full row rank. The variables are partitioned into $x = (v, w)$, with the corresponding partition of $A = [B \;\; C]$, such that the mathematical program is equivalent to:

$$\min \{\, f(v, w) : Bv + Cw = b,\ (v, w) \ge 0 \,\}.$$

The method assumes $B$ is nonsingular (i.e., only variables whose columns in $A$ are linearly independent can be grouped into $v$), and makes the nondegeneracy assumption $v > 0$. Now $dw$ is the independent direction of change, and

$$dv = -B^{-1}C\,dw,$$

thus keeping $(v + dv,\ w + dw)$ on the hyperplanes $Ax = b$ -- i.e., $B\,dv + C\,dw = 0$.
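As a quick numerical check of this relation, the following sketch (with a small hypothetical $B$ and $C$) verifies that the combined direction $(dv, dw)$ lies in the null space of $A = [B \;\; C]$:

```python
import numpy as np

# Hypothetical partition of A = [B C] with B nonsingular, so the
# constraint A x = b splits as B v + C w = b.
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # columns of the dependent variables v
C = np.array([[1.0],
              [3.0]])        # column of the independent variable w

dw = np.array([1.0])                 # any direction for the independents
dv = -np.linalg.solve(B, C @ dw)     # dv = -B^{-1} C dw

# The combined direction stays on the hyperplanes: B dv + C dw = 0.
print(np.allclose(B @ dv + C @ dw, 0.0))  # True
```

Any choice of `dw` works here; the dependent direction `dv` is fully determined by it.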

The method considers a direction $dw$ for the independent variables, which then determines the direction $dv$ for the dependent variables. In particular, $dw$ is chosen by evaluating the gradient of the objective, $\nabla f = (\nabla_v f,\ \nabla_w f)$. The gradient with respect to $w$, after eliminating $v$, is called the *reduced gradient*:

$$r = \nabla_w f(v, w) - \nabla_v f(v, w)\, B^{-1} C,$$

evaluated at the current point $x = (v, w)$. Then, setting $dw = -r$ (or a modification of it) completes the first part of the iteration. The second part is to select a step size along $(dv, dw)$, which can be blocked by the non-negativity of $v$. The resulting change can cause the working set to change, such that the partition $(B, C)$ changes.
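The two parts of the iteration can be sketched as follows. This is a minimal illustration, not a production implementation: the problem data (a quadratic objective $f(x) = \tfrac{1}{2}\|x\|^2$, the partition $B$, $C$, and the starting point) are hypothetical, and the step size uses a crude ratio test in place of a proper line search:

```python
import numpy as np

def reduced_gradient_step(f_grad, B, C, v, w):
    """One reduced-gradient iteration: direction from the reduced
    gradient r, then a ratio-test step keeping x = (v, w) >= 0."""
    n_v = len(v)
    g = f_grad(np.concatenate([v, w]))
    gv, gw = g[:n_v], g[n_v:]
    # Reduced gradient: r = grad_w f - grad_v f B^{-1} C
    r = gw - np.linalg.solve(B.T, gv) @ C
    # Direction: dw = -r, but do not push an independent variable
    # already at its bound of 0 below 0.
    dw = np.where((w <= 0) & (-r < 0), 0.0, -r)
    dv = -np.linalg.solve(B, C @ dw)      # dv = -B^{-1} C dw
    # Ratio test: largest step t keeping (v, w) >= 0.
    x = np.concatenate([v, w])
    d = np.concatenate([dv, dw])
    neg = d < 0
    t_max = np.min(x[neg] / -d[neg]) if neg.any() else np.inf
    t = min(1.0, t_max)                   # crude step; a line search is usual
    return v + t * dv, w + t * dw

# Hypothetical example: f(x) = 0.5 ||x||^2, so grad f(x) = x.
B = np.eye(2)
C = np.array([[1.0], [1.0]])
v, w = np.array([3.0, 3.0]), np.array([1.0])  # feasible for b = (4, 4)
v1, w1 = reduced_gradient_step(lambda x: x, B, C, v, w)
print(v1, w1)  # v1 = [0, 0], w1 = [4]: step blocked where v hits 0
```

Here $r = 1 - 6 = -5$, so $dw = 5$ and $dv = (-5, -5)$; the ratio test stops at $t = 0.6$, where both dependent variables reach zero. That blocking is exactly what triggers a change of working set and a repartition of $(B, C)$.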

See also the *generalized reduced gradient method*, which extends this approach to nonlinear constraints.