# All Pages

### From Glossary

**100% Rule**

This pertains to sensitivity analysis in linear programming. In its original form, it uses the convexity of the set of admissible changes
in the rim data to test whether a particular change is admissible: *any combination
of changes can occur as long as the total percentage deviation from
the coordinate extremes does not exceed 100%*. (Note: this
applies to right-hand sides (b) and costs (c) separately.)
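The percentage test above can be sketched directly: each proposed change is divided by the one-at-a-time allowable change in that direction, and the ratios are summed. This is a minimal illustration with hypothetical function and parameter names, not part of the glossary itself.

```python
def within_100_rule(deltas, allowable_increase, allowable_decrease):
    """Check simultaneous changes against the 100% rule.

    deltas: proposed change in each coefficient (positive or negative).
    allowable_increase / allowable_decrease: the one-at-a-time
    sensitivity limits (coordinate extremes) for each coefficient.
    """
    total = 0.0
    for d, inc, dec in zip(deltas, allowable_increase, allowable_decrease):
        if d > 0:
            total += d / inc       # fraction of the allowable increase used
        elif d < 0:
            total += -d / dec      # fraction of the allowable decrease used
    return total <= 1.0            # total deviation must not exceed 100%

# Example: 3/6 + 0.2/0.5 = 0.9 <= 1, so this combination is admissible.
print(within_100_rule([3, -0.2], [6, 2], [4, 0.5]))   # True
# 3/6 + 2/2 = 1.5 > 1, so this one is not.
print(within_100_rule([3, 2], [6, 2], [4, 0.5]))      # False
```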

More generally, suppose the optimal value remains constant if the cost coefficient vector $c$ in a linear program is replaced with any of $c^1, \ldots, c^k$ (we could have $c^j = c + \theta_j e_j$ and let $\theta_j$ be the $j$-th coordinate extreme for $c_j$, but that is not necessary). Then, the optimal objective value is the same for $c = \sum_{j=1}^k \lambda_j c^j$, provided $\lambda \ge 0$ and $\sum_{j=1}^k \lambda_j = 1$.
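As a concrete illustration of the convex-combination idea for cost vectors, the sketch below brute-forces a tiny hypothetical LP given by the extreme points of its feasible region (an optimum of a bounded LP always occurs at a vertex): if the same vertex is optimal for two cost vectors, it remains optimal for any convex combination of them. The polytope and cost vectors are assumptions for illustration only.

```python
# Extreme points of a hypothetical feasible region
# {x >= 0, x1 + x2 <= 4, x1 <= 3}.
vertices = [(0, 0), (3, 0), (3, 1), (0, 4)]

def argmax_vertex(c):
    # For max c.x over a bounded polyhedron some vertex is optimal,
    # so brute force over the extreme points suffices here.
    return max(vertices, key=lambda v: c[0] * v[0] + c[1] * v[1])

c1, c2 = (2, 1), (3, 1)
assert argmax_vertex(c1) == (3, 1)   # x* = (3, 1) is optimal for c1 ...
assert argmax_vertex(c2) == (3, 1)   # ... and for c2

lam = 0.4
c = tuple(lam * a + (1 - lam) * b for a, b in zip(c1, c2))
print(argmax_vertex(c))  # (3, 1): still optimal for the convex combination
```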

The same applies to convex combinations of changes in the right-hand side (possibly including the origin, which represents no change). If the basis remains optimal when $b$ is replaced with any of $b^1, \ldots, b^k$, then it is also optimal for the right-hand side $b = \sum_{j=1}^k \lambda_j b^j$, so long as $\lambda \ge 0$ and $\sum_{j=1}^k \lambda_j = 1$.
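For the right-hand side, optimality of a fixed basis $B$ reduces to primal feasibility $B^{-1}b \ge 0$ (dual feasibility does not involve $b$), and that condition is linear in $b$, so it is preserved under convex combinations. A minimal sketch with a hypothetical $2 \times 2$ basis matrix:

```python
# Inverse of a hypothetical basis matrix B = [[1, 0], [1, 1]].
# The basis stays optimal for any b with B^{-1} b >= 0, because
# dual feasibility (which fixes optimality) does not depend on b.
Binv = [[1, 0], [-1, 1]]

def basic_solution(b):
    return [sum(Binv[i][j] * b[j] for j in range(2)) for i in range(2)]

def feasible(b):
    return all(x >= 0 for x in basic_solution(b))

b1, b2 = (2, 5), (4, 4)
assert feasible(b1) and feasible(b2)   # basis optimal at both b1 and b2

lam = 0.3
b = tuple(lam * p + (1 - lam) * q for p, q in zip(b1, b2))
print(feasible(b))  # True: feasibility is linear in b, hence convex-closed
```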