# Objective function

The (real-valued) function to be optimized. In a mathematical program in standard form, this is denoted $f.$ Also see multiple objectives.

# Online problem

A decision problem that is ongoing, so its data values change frequently and with uncertainty. Here are some examples:

1. Allocation of bandwidth during video conferencing on a telecommunications network;
2. Investment decisions in a portfolio, selling and buying as the market changes;
3. Scheduling jobs as they arrive;
4. Load forecasting and dispatching for electric power.

In general, an online decision problem involves allocating resources in real time as the amounts of resources, their costs, and the demands for what they produce change. A problem that is not online is sometimes called an offline problem.

# Optimal

For a mathematical program in standard form, $x^* \in X$ (the domain) is an optimal solution if:

1. $x^*$ is feasible;
2. $f(x^*) \ge f(x)$ for all feasible $x$ (for a maximum; reverse the inequality for a minimum).

Some authors say optimal solution when they mean a local optimum; others mean a member of the optimality region (the set of global optima). In either case, the optimal value is the objective value evaluated at an optimal solution. A solution is nearly optimal if it is feasible but the optimality condition (2) is replaced by

$f(x^*) \ge f(x) - t \mbox{ for all feasible } x,$

where $t > 0.$ ($t = 0$ corresponds to being optimal.) Typically, $t$ is specified as a small fraction, such as a cut-off tolerance for an algorithm to terminate finitely.
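As a minimal sketch (the function name and numerical values are illustrative, not from the source), the near-optimality test for a maximization problem can be written as:

```python
def is_nearly_optimal(f_x, f_best, t):
    """Near-optimality check for maximization:
    accept x if f(x) >= f_best - t, where f_best is the best known
    feasible objective value and t >= 0 is the tolerance.
    Taking t = 0 recovers the exact optimality condition."""
    return f_x >= f_best - t
```

With tolerance $t = 0.01,$ a candidate objective of 9.995 against a best value of 10 is accepted, while 9.9 is not.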

# Optimal partition

Consider a primal-dual pair of linear programs:

$\min \left \{cx: x \ge 0, Ax \ge b \right \} \mbox{ and } \max \left \{yb: y \ge 0, yA \le c \right \}.$

Define the primal surplus variables, $s = Ax - b,$ and the dual slack variables, $d = c - yA.$ For any non-negative vector $v,$ define its support set as the indices at which the coordinate value is positive: $I(v)= \left \{j: v_j > 0 \right \}.$ At an optimal solution, complementary slackness must hold, so $j \in I(x)$ implies $j \notin I(d),$ and $i \in I(s)$ implies $i \notin I(y).$ Thus, the intersections $I(x) \cap I(d)$ and $I(s) \cap I(y)$ are empty. If the solution is strictly complementary, these support sets yield partitions, because then $I(x) \cup I(d) = \left \{1, \dots, n \right \}$ and $I(s) \cup I(y) = \left \{1, \dots, m \right \}.$ A strictly complementary solution is an interior solution (and conversely), so each interior solution yields a partition of these index sets.

A key fact is that every LP with an optimal solution must have an optimal interior solution. Further, every optimal interior solution yields the same partition, so it is called the optimal partition.
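The support-set and complementarity conditions above can be sketched in Python; the vectors below are hand-picked illustrative values satisfying strict complementarity, not output from a solver:

```python
def support(v, tol=1e-9):
    """Support set I(v) = {j : v_j > 0} of a non-negative vector."""
    return {j for j, vj in enumerate(v) if vj > tol}

# Illustrative values at an optimal solution of a 2x2 primal-dual pair:
x = [2.0, 0.0]   # primal variables
d = [0.0, 1.0]   # dual slacks   d = c - yA
s = [0.0, 3.0]   # primal surplus s = Ax - b
y = [4.0, 0.0]   # dual variables

# Complementary slackness: the intersections are empty.
assert support(x) & support(d) == set()
assert support(s) & support(y) == set()

# Strict complementarity: the supports partition {1, ..., n} and {1, ..., m}.
assert support(x) | support(d) == {0, 1}
assert support(s) | support(y) == {0, 1}
```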

# Optimal response function

The optimal value of the objective as a function of parameters, typically the right-hand side:

$f^*(b, c) = \sup \left \{f(x): x \in X, g(x) \le b, h(x) = c\right\}.$
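For a concrete (hypothetical) instance, consider $f^*(b) = \sup \{x_1 + x_2 : x_1 + 2x_2 \le b,\ x \ge 0\}$ for $b \ge 0.$ Since a linear program attains its optimum at a vertex, the response can be computed by enumerating the three vertices of this small feasible polyhedron:

```python
def f_star(b):
    """Optimal response f*(b) = sup {x1 + x2 : x1 + 2*x2 <= b, x >= 0}
    for b >= 0, computed by evaluating the objective at each vertex of
    the feasible polyhedron (an LP optimum occurs at a vertex)."""
    vertices = [(0.0, 0.0), (b, 0.0), (0.0, b / 2.0)]
    return max(x1 + x2 for x1, x2 in vertices)
```

Here the optimum puts everything on $x_1,$ so $f^*(b) = b$: the response is linear in the right-hand side, with slope equal to the dual price of the constraint.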

# Optimality gap

Generally the difference between a best known solution, e.g. the incumbent solution in mixed integer programming, and a value that bounds the best possible solution. One such measure is the duality gap. The term is often qualified as the absolute gap, which is the magnitude of the difference between the best known solution and the best bound, or as the relative gap, which is the absolute gap divided by the best bound.
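As a small sketch (the function and argument names are illustrative), the two gap measures for a minimization problem with a lower bound:

```python
def optimality_gaps(incumbent, best_bound):
    """Absolute and relative optimality gap for minimization, where
    `incumbent` is the best known objective value and `best_bound` is a
    lower bound on the best possible value (e.g., from an LP relaxation)."""
    absolute = abs(incumbent - best_bound)
    relative = absolute / abs(best_bound) if best_bound != 0 else float("inf")
    return absolute, relative
```

An incumbent of 105 against a bound of 100 gives an absolute gap of 5 and a relative gap of 5%; solvers typically terminate when the relative gap falls below a tolerance.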

# Optimality region

Set of optimal points.

# Optimization Software Library

(OSL). A library of optimization routines, callable from Fortran or C, produced by IBM.

# Optimum

(pl. optima). Minimum or maximum.

See convergence.

# Orthogonal complement

The orthogonal complement of a subspace $S$ is $\left \{y: yx = 0 \mbox{ for all } x \in S \right \}.$ In particular, if $S = \left \{x: x = Av \mbox{ for some } v \in \mathbb{R}^n \right \},$ where $A$ is an $m \times n$ matrix, its orthogonal complement is $\left \{y: yA = 0 \right \}.$
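A small numerical check of the second statement, using row-vector conventions; the matrix and vectors are hypothetical examples:

```python
def vec_mat(y, A):
    """Row-vector/matrix product yA for y in R^m and an m x n matrix A
    (A given as a list of rows)."""
    m, n = len(A), len(A[0])
    return [sum(y[i] * A[i][j] for i in range(m)) for j in range(n)]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# S = {x : x = Av} is the column space of a 3 x 2 example matrix A.
A = [[1, 0], [0, 1], [1, 1]]
y = [-1, -1, 1]        # candidate member of the orthogonal complement

# y is in the complement iff yA = 0 ...
assert vec_mat(y, A) == [0, 0]

# ... equivalently, y is orthogonal to every x = Av, e.g. v = (2, 3):
v = (2, 3)
x = [sum(A[i][j] * v[j] for j in range(2)) for i in range(3)]
assert dot(y, x) == 0
```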

# Orthogonal matrix

A square matrix, $A,$ such that $AA^T=I.$ Equivalently, its rows (and columns) are orthonormal.

# Orthogonal vectors

Two vectors in $\mathbb{R}^n$ whose inner product is zero.
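Both definitions are easy to verify numerically; as a sketch (the rotation matrix and the vector pair are illustrative examples):

```python
import math

def matmul_T(A):
    """Compute A A^T for a square matrix A given as a list of rows."""
    n = len(A)
    return [[sum(A[i][k] * A[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A rotation matrix is orthogonal: A A^T = I.
t = math.pi / 6
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]
P = matmul_T(A)
I = [[1.0, 0.0], [0.0, 1.0]]
assert all(abs(P[i][j] - I[i][j]) < 1e-12 for i in range(2) for j in range(2))

# Orthogonal vectors: inner product zero.
u, v = [1, 2, -1], [3, -1, 1]
assert sum(ui * vi for ui, vi in zip(u, v)) == 0
```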

# Out-of-kilter algorithm

Applies to network flows, where the balance equations hold at every iteration but the bound and duality (optimality) conditions may be violated. The dual prices, often called potentials in this context, are modified along with the flows so as to move closer to both primal and dual feasibility.

# Outer approximation

An approach that solves a sequence of approximations to a mathematical program, where each approximating problem's feasible region contains the original feasible region. Examples are cutting plane algorithms and relaxation.
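As a minimal sketch of the cutting-plane case (Kelley's method on a one-dimensional convex function; the grid search used to minimize the model is a simplification for clarity, not how production codes solve the subproblem): tangent lines underestimate a convex $f,$ so the piecewise-linear model relaxes, i.e. contains, the original problem, and each iteration tightens the outer approximation.

```python
def kelley_min(f, fprime, lo, hi, iters=20):
    """Cutting-plane (outer approximation) sketch for a convex f on
    [lo, hi]: each tangent line f(xk) + f'(xk)(x - xk) underestimates f,
    and we repeatedly minimize the upper envelope of the cuts collected
    so far (here by grid search) to get the next trial point."""
    grid = [lo + (hi - lo) * i / 1000 for i in range(1001)]
    cuts = []              # each cut: (f(xk), f'(xk), xk)
    xk = lo                # initial trial point
    for _ in range(iters):
        cuts.append((f(xk), fprime(xk), xk))
        # minimize the piecewise-linear outer model over the grid
        xk = min(grid, key=lambda x: max(fx + g * (x - x0)
                                         for fx, g, x0 in cuts))
    return xk

# Minimize f(x) = x^2 on [-2, 2]; the iterates close in on x* = 0.
x_star = kelley_min(lambda x: x * x, lambda x: 2 * x, -2.0, 2.0)
```

Because every cut underestimates $f,$ the model's minimum value is a valid lower bound at each iteration, which is exactly the outer-approximation property.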

# Over-constrained problem

A term generally meaning that a mathematical program has too many constraints: either the feasible region can be described by a proper subset of the constraints (the rest are redundant), or it is empty.

# Over-optimize

A term meaning that the model is too optimistic in expecting an optimal solution to produce the benefits in its objective. For example, a time-staged model generally presumes perfect information over the planning horizon and can "over-optimize" its allocation of resources.