MVE165/MMG631, Linear and integer optimization with applications
The simplex method: degeneracy; unbounded solutions; starting solutions; infeasibility; alternative optimal solutions
Ann-Brith Strömberg, 2013-04-09
Summary of the simplex method
Optimality condition: For a minimization [maximization] problem, the variable entering the basis should have the most negative [positive] reduced cost [marginal value]. The entering variable determines a direction (along an edge of the polyhedron) in which the objective value decreases [increases]. If all reduced costs are non-negative [non-positive], the current basis is optimal.
Feasibility condition: The variable leaving the basis is the one with the smallest quotient of the corresponding right-hand side and the (positive) constraint coefficient of the entering variable (the search direction). The leaving variable corresponds to the constraint that is reached first. A negative (zero) coefficient of the entering variable means that the corresponding (in)equality does not restrict the step.
Simplex search for linear (minimization) programs (Ch. 4.6)
1. Initialization: Choose any feasible basis and construct the corresponding basic solution x^0; let t = 0
2. Step direction: Select a variable to enter the basis using the optimality condition (negative reduced cost). If no entering variable exists: Stop; the solution is optimal
3. Step length: Select a leaving variable using the feasibility condition (smallest non-negative quotient). If all coefficients of the entering variable are ≤ 0: Stop; the problem has an unbounded solution
4. New iterate: Compute the new basic solution x^{t+1} by pivoting (matrix operations)
5. Let t := t + 1 and repeat from 2
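The five steps above can be sketched as a small dense-tableau implementation. This is an illustrative sketch, not the course's reference code: the function name `simplex` and the tableau layout are my own choices, and it assumes the standard form min c^T x subject to Ax ≤ b, x ≥ 0 with b ≥ 0, so that the slack variables provide the initial feasible basis.

```python
import numpy as np

def simplex(c, A, b, tol=1e-9):
    """Tableau simplex for  min c^T x  s.t.  A x <= b, x >= 0,  with b >= 0
    (the slack variables then give an obvious feasible starting basis)."""
    m, n = A.shape
    # Tableau: reduced-cost row [c | 0 | 0] on top of [A | I | b].
    T = np.zeros((m + 1, n + m + 1))
    T[0, :n] = c
    T[1:, :n] = A
    T[1:, n:n + m] = np.eye(m)        # slack columns
    T[1:, -1] = b
    basis = list(range(n, n + m))     # slacks are basic initially
    while True:
        # Step direction: most negative reduced cost (Dantzig's rule;
        # no anti-cycling safeguard -- see the degeneracy slides).
        j = int(np.argmin(T[0, :-1]))
        if T[0, j] >= -tol:
            break                     # optimality condition met
        col = T[1:, j]
        if np.all(col <= tol):
            raise ValueError("unbounded: entering column has no positive coefficient")
        # Step length: smallest non-negative quotient over positive coefficients.
        ratios = np.where(col > tol, T[1:, -1] / np.where(col > tol, col, 1), np.inf)
        r = int(np.argmin(ratios)) + 1
        T[r] /= T[r, j]               # pivot row
        for i in range(m + 1):
            if i != r:
                T[i] -= T[i, j] * T[r]
        basis[r - 1] = j
    x = np.zeros(n + m)
    for i, bi in enumerate(basis):
        x[bi] = T[i + 1, -1]
    return x[:n], -T[0, -1]           # objective-row RHS holds -z

# The degeneracy example from this lecture: min -x1 - 2x2
x, z = simplex(np.array([-1., -2.]),
               np.array([[1., 1.], [0., 1.], [1., 2.]]),
               np.array([6., 3., 9.]))
print(x, z)
```

On the lecture's degeneracy example the sketch returns x = (3, 3) with z = −9, in agreement with the tableaux on the following slides.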
Special case: Degeneracy (Ch. 4.10)
If the smallest non-negative quotient is zero, the value of a basic variable will become zero in the next iteration. The solution is degenerate, and the objective value will not improve in this iteration.
Risk: cycling among (non-optimal) bases
Reason: a redundant constraint touches the feasible set
Example: x1 + x2 ≤ 6, x2 ≤ 3, x1 + 2x2 ≤ 9, x1, x2 ≥ 0
Special case: Degeneracy example

minimize z = -x1 - 2x2
subject to x1 + x2 ≤ 6
           x2 ≤ 3
           x1 + 2x2 ≤ 9
           x1, x2 ≥ 0     (slacks x3, x4, x5; the RHS of the z-row equals -z)

basis  z   x1  x2  x3  x4  x5 | RHS
z      1   -1  -2   0   0   0 |  0
x3     0    1   1   1   0   0 |  6     x_B = (x3, x4, x5) = (6, 3, 9)
x4     0    0   1   0   1   0 |  3
x5     0    1   2   0   0   1 |  9

z      1   -1   0   0   2   0 |  6
x3     0    1   0   1  -1   0 |  3     x_B = (x3, x2, x5) = (3, 3, 3)
x2     0    0   1   0   1   0 |  3
x5     0    1   0   0  -2   1 |  3

z      1    0   0   0   0   1 |  9
x3     0    0   0   1   1  -1 |  0     x_B = (x3, x2, x1) = (0, 3, 3)
x2     0    0   1   0   1   0 |  3
x1     0    1   0   0  -2   1 |  3

z      1    0   0   0   0   1 |  9
x4     0    0   0   1   1  -1 |  0     x_B = (x4, x2, x1) = (0, 3, 3)
x2     0    0   1  -1   0   1 |  3
x1     0    1   0   2   0  -1 |  3

z      1    0   0   1   1   0 |  9
x5     0    0   0  -1  -1   1 |  0     x_B = (x5, x2, x1) = (0, 3, 3)
x2     0    0   1   0   1   0 |  3
x1     0    1   0   1  -1   0 |  3

This last iteration would not appear in the simplex method. Why?
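As a cross-check (not part of the lecture), the same example can be handed to SciPy's `linprog`; the solver confirms the optimal value z = −9 at (x1, x2) = (3, 3), matching the final tableau (whose z-row RHS shows −z = 9).

```python
from scipy.optimize import linprog

# min z = -x1 - 2x2  s.t.  x1 + x2 <= 6,  x2 <= 3,  x1 + 2x2 <= 9,  x >= 0
res = linprog(c=[-1, -2],
              A_ub=[[1, 1], [0, 1], [1, 2]],
              b_ub=[6, 3, 9],
              bounds=[(0, None), (0, None)])
print(res.x, res.fun)
```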
Special case: Degeneracy
[Figure: typical progress of the objective value versus the iteration number; the curve is flat during degenerate iterations]
Computation rules to prevent infinite cycling: careful choices of leaving and entering variables (e.g., Bland's rule)
In modern software: perturb the right-hand side (b → b + δb), solve, decrease δb and re-solve starting from the current basis. Repeat until δb = 0
Unbounded solutions (Ch. 4.4, 4.6)
If all quotients are negative (the entering variable has no positive constraint coefficient), the value of the variable entering the basis may increase infinitely. The feasible set is unbounded. In a real application this would probably be due to some incorrect modelling assumption.
Example: minimize z = -x1 - 2x2
subject to -x1 + x2 ≤ 2
           -2x1 + x2 ≤ 1
           x1, x2 ≥ 0
Draw the graph!
Unbounded solutions (Ch. 4.4, 4.6)
A feasible basis is given by x1 = 1, x2 = 3, with corresponding tableau:

basis  z   x1  x2  s1  s2 | RHS
z      1    0   0   5  -3 |  7
x1     0    1   0   1  -1 |  1
x2     0    0   1   2  -1 |  3

Entering variable is s2 (reduced cost -3)
Row 1: x1 = 1 + s2 ≥ 0  ⇒  s2 ≥ -1
Row 2: x2 = 3 + s2 ≥ 0  ⇒  s2 ≥ -3
No leaving variable can be found, since no constraint prevents s2 from increasing infinitely
Homework: Find the above basis using the simplex method
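In practice an LP solver reports unboundedness rather than iterating forever. A short check with SciPy on the lecture's example (SciPy's convention, documented for `linprog`, is that `status == 3` means the problem appears to be unbounded):

```python
from scipy.optimize import linprog

# min z = -x1 - 2x2  s.t.  -x1 + x2 <= 2,  -2x1 + x2 <= 1,  x1, x2 >= 0
res = linprog(c=[-1, -2],
              A_ub=[[-1, 1], [-2, 1]],
              b_ub=[2, 1],
              bounds=[(0, None), (0, None)])
print(res.status)   # 3: problem is unbounded
```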
Starting solution: finding an initial basis (Ch. 4.9)
Example: minimize z = 2x1 + 3x2
subject to 3x1 + 2x2 = 14
           2x1 - 4x2 ≥ 2
           4x1 + 3x2 ≤ 19
           x1, x2 ≥ 0
Draw the graph!
Add surplus and slack variables:
minimize z = 2x1 + 3x2
subject to 3x1 + 2x2 = 14
           2x1 - 4x2 - s1 = 2
           4x1 + 3x2 + s2 = 19
           x1, x2, s1, s2 ≥ 0
How to find an initial basis? Only s2 is an obvious candidate!
Artificial variables
Add artificial variables a1 and a2 to the first and second constraints, respectively, and solve an artificial problem, the phase one problem:
minimize w = a1 + a2
subject to 3x1 + 2x2 + a1 = 14
           2x1 - 4x2 - s1 + a2 = 2
           4x1 + 3x2 + s2 = 19
           x1, x2, s1, s2, a1, a2 ≥ 0
An initial basis is given by a1 = 14, a2 = 2, and s2 = 19:

basis  w   x1  x2  s1  s2  a1  a2 | RHS
w      1   -5   2   1   0   0   0 | -16
a1     0    3   2   0   0   1   0 |  14
a2     0    2  -4  -1   0   0   1 |   2
s2     0    4   3   0   1   0   0 |  19
Find an initial solution using artificial variables
x1 enters and a2 leaves (then x2 enters and s2 leaves, then s1 enters and a1 leaves). The column of an artificial variable is dropped from the tableau once that variable has left the basis.

basis  w   x1  x2    s1      s2   a1  a2 | RHS
w      1   -5   2     1       0    0   0 | -16
a1     0    3   2     0       0    1   0 |  14
a2     0    2  -4    -1       0    0   1 |   2
s2     0    4   3     0       1    0   0 |  19

w      1    0  -8  -1.5       0    0     | -11
a1     0    0   8   1.5       0    1     |  11
x1     0    1  -2  -0.5       0    0     |   1
s2     0    0  11     2       1    0     |  15

w      1    0   0  -0.045  0.727   0     | -0.091
a1     0    0   0   0.045 -0.727   1     |  0.091
x1     0    1   0  -0.136  0.182   0     |  3.727
x2     0    0   1   0.182  0.091   0     |  1.364

w      1    0   0     0       0          |  0
s1     0    0   0     1     -16          |  2
x1     0    1   0     0      -2          |  4
x2     0    0   1     0       3          |  1

A feasible basis for the original problem is given by x1 = 4, x2 = 1, and s1 = 2
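The phase one problem is itself an ordinary LP, so any LP solver can run it. A minimal sketch with SciPy, assuming the variable ordering (x1, x2, s1, s2, a1, a2), which is my own choice:

```python
from scipy.optimize import linprog

# Phase one problem: minimize w = a1 + a2 over
# (x1, x2, s1, s2, a1, a2) >= 0 with all constraints as equalities.
c = [0, 0, 0, 0, 1, 1]                 # w = a1 + a2
A_eq = [[3,  2,  0, 0, 1, 0],          # 3x1 + 2x2           + a1 = 14
        [2, -4, -1, 0, 0, 1],          # 2x1 - 4x2 - s1      + a2 =  2
        [4,  3,  0, 1, 0, 0]]          # 4x1 + 3x2      + s2      = 19
b_eq = [14, 2, 19]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
print(res.fun)   # optimal w; w = 0 means a feasible point was found
print(res.x)     # (x1, x2) part is feasible for the original problem
```

The solver drives both artificial variables to zero (w = 0), confirming that the original problem is feasible.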
Find an initial solution: illustration
[Figure: the phase one iterates illustrated in the (x1, x2)-plane]
Infeasible linear programs (Ch. 4.9)
If the phase one problem has the optimal value w = 0, a feasible basis has been found. Start optimizing the original objective function z from this basis (homework).
If the phase one problem has an optimal value w > 0, no feasible solutions exist. What would this mean in a real application?
Alternative: the Big-M method. Add the artificial variables to the original objective with a large coefficient M > 0.
Example: minimize z = 2x1 + 3x2 becomes minimize z_a = 2x1 + 3x2 + Ma1 + Ma2
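The Big-M idea can be sketched on the same example data as the phase one slides; this is an illustrative sketch, with M = 1e5 an arbitrary value chosen large enough to dominate the original costs, and the variable ordering (x1, x2, s1, s2, a1, a2) my own choice:

```python
from scipy.optimize import linprog

M = 1e5                                # "big M": dominates the costs 2 and 3
# Variables: (x1, x2, s1, s2, a1, a2), all >= 0
c = [2, 3, 0, 0, M, M]                 # z_a = 2x1 + 3x2 + M*a1 + M*a2
A_eq = [[3,  2,  0, 0, 1, 0],          # 3x1 + 2x2           + a1 = 14
        [2, -4, -1, 0, 0, 1],          # 2x1 - 4x2 - s1      + a2 =  2
        [4,  3,  0, 1, 0, 0]]          # 4x1 + 3x2      + s2      = 19
b_eq = [14, 2, 19]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
x1, x2, s1, s2, a1, a2 = res.x
# Feasible problem: both artificials are driven to 0, so res.fun is the
# optimal value of the original problem. If some a stayed positive at the
# optimum, the original problem would be infeasible.
print(res.fun, (a1, a2))
```

For this data both artificial variables end at zero, so the reported optimum is that of the original problem (it works out to 28/3 ≈ 9.33 at (x1, x2) = (14/3, 0)).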
Alternative optimal solutions (Ch. 4.6)
Example: maximize z = 2x1 + 4x2
subject to x1 + 2x2 ≤ 5
           x1 + x2 ≤ 4
           x1, x2 ≥ 0
Draw the graph!
The extreme points (0, 5/2) and (3, 1) have the same optimal value z = 10. All solutions that are convex combinations of these are optimal:
(x1, x2) = α(0, 5/2) + (1 - α)(3, 1),  0 ≤ α ≤ 1
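This can be checked with a few lines of arithmetic (an illustrative check, not from the slides): every convex combination of the two extreme points is feasible and attains the same objective value z = 10, because the objective and the constraints are linear.

```python
# The two optimal extreme points of  max z = 2x1 + 4x2
p, q = (0.0, 2.5), (3.0, 1.0)

def z(x1, x2):
    return 2 * x1 + 4 * x2

def feasible(x1, x2, eps=1e-9):
    return (x1 + 2 * x2 <= 5 + eps and x1 + x2 <= 4 + eps
            and x1 >= -eps and x2 >= -eps)

for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    x1 = alpha * p[0] + (1 - alpha) * q[0]
    x2 = alpha * p[1] + (1 - alpha) * q[1]
    assert feasible(x1, x2) and abs(z(x1, x2) - 10) < 1e-9
print("all convex combinations feasible with z = 10")
```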
Next lecture on Thursday, April 11
Sensitivity analysis (Ch. 5)
Linear optimization duality theory (Ch. 6)