
Mixed Integer Linear Programming

Restricting the domain of all or some of the variables x_{i} of an LP problem to integer values yields an integer (ILP) or a mixed-integer linear programming problem (MILP).

Some Remarks on MILP Models

Building mixed-integer models requires great care. Often there are several ways to formulate an optimization problem, and sometimes adding redundant constraints makes an algorithm work faster, e.g., if the gap between the optimal solutions of the LP relaxation and of the original problem is thereby diminished. Even some nonlinear optimization problems can be transformed into MILPs using special types of discrete variables.
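As an illustration of how discrete variables can linearize a nonlinear (here: discontinuous) cost, consider the standard textbook fixed-charge device; this example is not taken from the text above but is a common instance of the transformation it mentions. A cost that jumps from 0 to f as soon as x becomes positive can be written with a binary variable y and a big-M constant M (any valid upper bound on x):

```latex
\min \; c\,x + f\,y
\quad \text{s.t.} \quad
x \le M\,y, \qquad 0 \le x, \qquad y \in \{0,1\}
```

If x > 0, the constraint x \le M y forces y = 1 and the fixed charge f is incurred; if y = 0, then x = 0 and no charge arises. The quality of the LP relaxation depends strongly on how tight M is chosen, which is one reason model formulation requires the care noted above.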
Solution Approaches for MILP Problems

A great variety of algorithms for solving mixed-integer optimization problems has arisen during the last decades. Among the best-known exact algorithms for solving MILPs are the following methods:

- Efficient implicit enumerative algorithms include pruning criteria so that not all feasible solutions have to be tested to find the optimal solution and to prove optimality. The widely used Branch&Bound (B&B) algorithm with LP relaxation (see figure ''Branch&Bound'') is the most important representative of enumerative algorithms and is therefore discussed in more detail in the next subsection.
- Cutting plane algorithms and Branch&Cut [Gomory (1958), Crowder et al. (1983), Wolsey and Van Roy (1987), Padberg and Rinaldi (1991), Balas et al. (1993), Boyd (1994)] for MILPs are derived from the Simplex algorithm. After computing the continuous optimum by LP relaxation of the integrality constraints, new constraints are added to the MILP step by step. With the help of these additional inequalities, non-integer variables of the continuous solutions are forced to take integer values. Cutting plane methods are not restricted to MILPs; they are used, e.g., in nonlinear and nondifferentiable optimization as well (Lemaréchal).
- Dynamic programming is not a general-purpose algorithm like the methods belonging to the first two groups. Originally, it was developed for the optimization of sequential decision processes. This technique for multistage problem solving may be applied to linear and nonlinear optimization problems which can be described as a nested family of subproblems. The original problem is solved recursively from the solutions of the subproblems.
- Decomposition methods [Benders (1962), Fisher (1981), Geoffrion and McBride (1978), Magnanti and Wong (1981), van Roy (1983), Van Roy (1986)].
- Logic-based methods [Balas (1975, 1979), Jeroslow (1977), Hooker (1988), Jeroslow and Lowe (1984, 1985), Jeroslow and Wang (1990), Raman and Grossmann (1992)].
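The cutting plane idea can be made concrete with the classical Gomory fractional cut. The following is a minimal sketch, not code from the source: it assumes a pure-integer tableau row (all variables in the row required to be integral) and the function name is illustrative.

```python
import math


def gomory_fractional_cut(row, rhs):
    """Derive a Gomory fractional cut from one simplex tableau row.

    For a row  sum_j a_j * x_j = b  with every x_j required to be
    integer and b fractional, the inequality

        sum_j frac(a_j) * x_j >= frac(b)

    is satisfied by every integer-feasible point but violated by the
    current LP optimum (where all nonbasic x_j are 0), so adding it
    cuts off the fractional solution.
    """
    frac = lambda v: v - math.floor(v)  # fractional part, also for negatives
    return [frac(a) for a in row], frac(rhs)


# Example tableau row: x1 + 1.5*x2 - 0.5*x3 = 2.7
coeffs, f0 = gomory_fractional_cut([1.0, 1.5, -0.5], 2.7)
# resulting cut: 0.5*x2 + 0.5*x3 >= 0.7
```

Note that frac(-0.5) = 0.5 because the floor of -0.5 is -1; using the floor (rather than truncation) is essential for the validity of the cut.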
Furthermore, there exist heuristics as well as local and global search algorithms, such as simulated annealing.
Branch&Bound Algorithm

The first B&B algorithm was developed in 1960 by Land and Doig. The ''branch'' in B&B refers to the partitioning process, a ''divide-and-conquer''-like approach used to prove optimality of a solution. Lower bounds are used during this process to avoid an exhaustive search of the solution space. The B&B idea, or implicit enumeration, characterizes a wide class of algorithms which can be applied to discrete optimization problems in general. The computational steps of the B&B algorithm are as follows: After some initialization, the LP relaxation (that is, the LP problem which results if we relax all integer variables to continuous ones) establishes the first node. The node selection is obvious in the first step (just take the LP relaxation); later on it is based on some heuristics. The B&B algorithm of Dakin (1965) with LP relaxations uses three pruning criteria: infeasibility, optimality, and a value dominance relation. In a maximization problem, the integer solutions found lead to an increasing sequence of lower bounds, while the LP problems in the tree decrease the upper bound. Note that a denotes an addcut, which causes the algorithm to accept a new integer solution only if it is better by at least the value of a. If the pruning criteria fail, branching starts: The branching in this algorithm is done by variable dichotomy: for a fractional y_{j}^{*}, two son nodes are created with the additional constraint y_{j} ≤ ⌊y_{j}^{*}⌋ resp. y_{j} ≥ ⌊y_{j}^{*}⌋+1. Other possibilities for dividing the search space are, for instance, generalized upper bound dichotomy or the enumeration of all possible values if the domain of a variable is finite. The advantage of variable dichotomy is that only simple constraints on lower and upper bounds are added to the problem. The search strategy plays an important role in implicit enumeration; widely used is the depth-first plus backtracking rule as presented above. If a node is not pruned, one of its two sons is considered.
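The steps above (relaxation bound, incumbent lower bounds, depth-first branching with pruning) can be sketched on a 0/1 knapsack, a maximization MILP whose LP relaxation happens to be solvable greedily. This is a minimal illustration, not the algorithm of the source; the function name and the instance are assumptions, and positive weights are assumed throughout.

```python
def knapsack_bb(values, weights, capacity):
    """Tiny Branch&Bound sketch for a 0/1 knapsack (maximization).

    The LP relaxation of a knapsack subproblem is solved greedily
    (items sorted by value/weight ratio; at most one item taken
    fractionally) and yields an upper bound. Incumbent integer
    solutions give an increasing sequence of lower bounds; a node is
    pruned when its relaxation bound cannot beat the incumbent
    (value dominance). Branching is the binary variable dichotomy
    y_i = 1 / y_i = 0, explored depth-first with backtracking.
    """
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)

    def bound(k, cap, val):
        # LP-relaxation upper bound for the subproblem that has fixed
        # the items order[:k]; greedy fill, last item fractional.
        for i in order[k:]:
            if weights[i] <= cap:
                cap -= weights[i]
                val += values[i]
            else:
                return val + values[i] * cap / weights[i]
        return val

    best = [0]  # incumbent objective value (lower bound)

    def dfs(k, cap, val):
        if k == len(order):
            best[0] = max(best[0], val)
            return
        if bound(k, cap, val) <= best[0]:
            return                                # pruned
        i = order[k]
        if weights[i] <= cap:                     # son with y_i = 1
            dfs(k + 1, cap - weights[i], val + values[i])
        dfs(k + 1, cap, val)                      # son with y_i = 0

    dfs(0, capacity, 0)
    return best[0]


print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # prints 220
```

An addcut a (as mentioned above) would correspond to pruning with `bound(...) <= best[0] + a`, trading optimality guarantees for a smaller search tree.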
If a node is pruned, the algorithm goes back to the last node with a son which has not yet been considered (backtracking). Since only lower and upper bound constraints are added, the dual simplex algorithm can reoptimize the problem directly without data transfer or basis reinversion. Furthermore, experience has shown that feasible solutions are more likely to be found deep in the tree. Nevertheless, in some cases the use of the opposite strategy, breadth-first search, may be advantageous. Another important point is the selection of the branching variable. A common way of choosing a branching variable is by user-specified priorities, because no robust general strategy is known. Degradations or penalties may also be used to choose the branching variable; both methods estimate or calculate the increase of the objective function value if a variable is required to be integral. Penalties in particular are costly to compute in relation to the information gained, so they are used quite rarely.
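A branching-variable rule of the kind just described can be sketched as follows. This is an illustrative helper, not from the source: it implements the user-priority rule, falling back to the common "most fractional" heuristic when no priorities are given.

```python
def pick_branching_variable(x, priorities=None, tol=1e-6):
    """Choose a branching variable index from an LP solution x.

    If user-specified priorities are given, the fractional variable
    with the highest priority is selected; otherwise the variable
    whose value lies farthest from the nearest integer ("most
    fractional") is taken. Both are heuristics, in line with the
    observation that no robust general strategy is known.
    """
    frac = {j: abs(v - round(v)) for j, v in enumerate(x)}
    fractional = [j for j, f in frac.items() if f > tol]
    if not fractional:
        return None  # the solution is already integral: nothing to branch on
    if priorities is not None:
        return max(fractional, key=lambda j: priorities[j])
    return max(fractional, key=lambda j: frac[j])
```

For example, with x = [2.0, 1.3, 0.5] the fallback rule picks index 2 (distance 0.5 from an integer), while priorities [5, 9, 1] redirect the choice to index 1.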
The B&B algorithm terminates after a finite number of steps if the solution space of the LP relaxation of the MILP is bounded.

Copyright © 19982013, MaBOS GmbH