Linear Programming - Interior Point Methods
NPTEL Course, Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India
Example 1: Computational Complexity of the Simplex Algorithm

min −2^{n−1} x_1 − 2^{n−2} x_2 − ... − 2 x_{n−1} − x_n
s.t. x_1 ≤ 5
     4x_1 + x_2 ≤ 25
     8x_1 + 4x_2 + x_3 ≤ 125
     ...
     2^n x_1 + 2^{n−1} x_2 + ... + 4 x_{n−1} + x_n ≤ 5^n
     x_j ≥ 0, j = 1, ..., n

The simplex method, starting at x = 0, visits all 2^n extreme points before reaching the optimal solution.¹

¹ V. Klee and G.J. Minty, How good is the simplex algorithm? In O. Shisha, editor, Inequalities III, pp. 159-175, Academic Press, 1972.
For n = 3:

min −4x_1 − 2x_2 − x_3
s.t. x_1 ≤ 5
     4x_1 + x_2 ≤ 25
     8x_1 + 4x_2 + x_3 ≤ 125
     x_1, x_2, x_3 ≥ 0

Iteration | Basic variables   | Objective function
    1     | x_4, x_5, x_6     |    0
    2     | x_1, x_5, x_6     |  −20
    3     | x_1, x_2, x_6     |  −30
    4     | x_4, x_2, x_6     |  −50
    5     | x_4, x_2, x_3     |  −75
    6     | x_1, x_2, x_3     |  −95
    7     | x_1, x_5, x_3     | −105
    8     | x_4, x_5, x_3     | −125

(x_4, x_5, x_6 are the slack variables.) The simplex algorithm is therefore not a polynomial-time algorithm: the number of computational steps can grow as an exponential function of the number of variables, rather than as a polynomial function.
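As a sanity check on the n = 3 Klee-Minty cube above, the following numpy sketch (my own, not from the lecture) enumerates all basic feasible solutions of the standard-form problem and confirms that it has 2^3 = 8 extreme points with minimum objective value −125:

```python
import itertools
import numpy as np

# min -4x1 - 2x2 - x3  s.t.  x1 <= 5, 4x1 + x2 <= 25, 8x1 + 4x2 + x3 <= 125,
# in standard form with slack variables x4, x5, x6.
c = np.array([-4.0, -2.0, -1.0, 0.0, 0.0, 0.0])
A = np.array([[1.0, 0.0, 0.0, 1.0, 0.0, 0.0],
              [4.0, 1.0, 0.0, 0.0, 1.0, 0.0],
              [8.0, 4.0, 1.0, 0.0, 0.0, 1.0]])
b = np.array([5.0, 25.0, 125.0])

vertices = []
for cols in itertools.combinations(range(6), 3):   # all candidate bases
    B = A[:, cols]
    if abs(np.linalg.det(B)) < 1e-9:
        continue                                   # singular basis matrix
    xB = np.linalg.solve(B, b)
    if np.all(xB >= -1e-9):                        # basic feasible solution
        x = np.zeros(6)
        x[list(cols)] = xB
        vertices.append(x)

print(len({tuple(np.round(v, 6)) for v in vertices}))   # 8 extreme points
print(min(c @ v for v in vertices))                     # -125.0
```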
Interior Point Methods for Linear Programming

Points generated are in the "interior" of the feasible region. Based on nonlinear programming techniques. Some interior point methods: Affine Scaling, Karmarkar's Method.

We consider the linear program,
min c^T x
s.t. Ax = b, x ≥ 0
where A ∈ R^{m×n} and rank(A) = m.
Affine Scaling

Idea (1): Use the projected steepest descent direction at every iteration.

Given a feasible interior point x^k at the current iteration k: Ax^k = b, x^k > 0.
Let d denote a direction such that x^{k+1} = x^k + α_k d, α_k > 0. For x^{k+1} to remain feasible, Ax^{k+1} = b, and therefore Ad = 0.

Projected Steepest Descent Direction: project the steepest descent direction, −c, onto the null space of A.
Let ĉ = −c = p_c + q, where
p_c ∈ Null(A), i.e., A p_c = 0, and
q ∈ Row(A), i.e., q = A^T λ.
Then Aĉ = AA^T λ, so λ = (AA^T)^{-1} A ĉ, and
p_c = ĉ − q = ĉ − A^T (AA^T)^{-1} A ĉ = (I − A^T (AA^T)^{-1} A) ĉ = P ĉ = −P c,
where P = I − A^T (AA^T)^{-1} A denotes the projection matrix onto the null space of A.
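A quick numpy check (mine, not from the slides) that P = I − A^T (AA^T)^{-1} A really projects onto the null space of A, so the projected direction keeps Ax = b:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))      # random A; full row rank almost surely
c = rng.standard_normal(6)

# Projection matrix onto Null(A); solve avoids forming (A A^T)^{-1} explicitly.
P = np.eye(6) - A.T @ np.linalg.solve(A @ A.T, A)
p_c = -P @ c                         # projected steepest descent direction

print(np.allclose(A @ p_c, 0))      # True: moving along p_c preserves Ax = b
print(np.allclose(P @ P, P))        # True: P is idempotent, i.e., a projector
```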
Idea (2): Position the current point close to the centre of the feasible region.

For example, one possible choice of "central" point is 1 = (1, 1, ..., 1)^T.
Given a point x^k in the interior of the feasible region, define X_k = diag(x^k) and the transformation
y = T(x) = X_k^{-1} x.
Then y^k = X_k^{-1} x^k = 1, or equivalently X_k y^k = x^k.
Affine Scaling Algorithm:
Start with any interior point x^0.
while (stopping condition is not satisfied at the current point)
    Transform the current problem into an equivalent problem in the y space so that the current point is close to the centre of the feasible region
    Use the projected steepest descent direction to take a step in the y space without crossing the boundary of the feasible set
    Map the new point back to the corresponding point in the x space
endwhile
Stopping Condition for the Affine Scaling Algorithm

Consider the following primal and dual problems:

Primal Problem (P): min c^T x s.t. Ax = b, x ≥ 0
Dual Problem (D):   max b^T μ s.t. A^T μ ≤ c

For any primal feasible x and dual feasible μ,
c^T x ≥ b^T μ   (Weak Duality)
At optimality,
c^T x − b^T μ = 0   (Strong Duality)

Idea: Use the duality gap, c^T x − b^T μ, to check optimality.
Q. How to get μ?
Consider the primal problem: min c^T x s.t. Ax = b, x ≥ 0.
Define the Lagrangian function,
L(x, λ, μ) = c^T x + μ^T (b − Ax) − λ^T x.
Assumption: x is primal feasible and λ ≥ 0.
KKT conditions at optimality:
∇_x L(x, λ, μ) = 0  ⟹  A^T μ + λ = c
λ_i x_i = 0,  i = 1, ..., n.
Defining X = diag(x) and eliminating λ = c − A^T μ, the complementarity conditions become X(c − A^T μ) = 0.
Solve the following least-squares problem to get μ:
min_μ ‖Xc − XA^T μ‖²  ⟹  μ = (AX²A^T)^{-1} AX² c.
Thus, at a given point x^k,
Duality Gap = c^T x^k − b^T μ^k,
where μ^k = (A X_k² A^T)^{-1} A X_k² c and X_k = diag(x^k).
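The dual estimate and duality gap above can be sketched in numpy. The LP data below are the example solved later in these slides (min −3x_1 − x_2 with slacks added); the interior point is my own choice near the optimum:

```python
import numpy as np

# min -3x1 - x2  s.t.  x1 + x2 <= 2, x1 <= 1, x >= 0,
# in standard form with slacks x3, x4.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 1.0]])
b = np.array([2.0, 1.0])
c = np.array([-3.0, -1.0, 0.0, 0.0])

x = np.array([0.9, 1.0, 0.1, 0.1])       # interior feasible point (A x = b, x > 0)
X2 = np.diag(x ** 2)                      # X^2 for X = diag(x)

# mu = (A X^2 A^T)^{-1} A X^2 c, via a linear solve
mu = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)
gap = c @ x - b @ mu                      # duality gap at this point (~0.285 here)
print(gap)
```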
Step 1: Formulate an equivalent problem in which the current point is well-centred, to allow considerable improvement in the objective function.

Given x^k, define X_k = diag(x^k) and the transformation T as
y = T(x) = X_k^{-1} x,
so that X_k y^k = x^k and y^k = 1.

Using this transformation,
min c^T x s.t. Ax = b, x ≥ 0
becomes
min c^T X_k y s.t. A X_k y = b, y ≥ 0,
which can be written in standard form as
min c̄^T y s.t. Ā y = b, y ≥ 0,
where c̄ = X_k c and Ā = A X_k.
Step 2: Find the projected steepest descent direction and step length at y^k for the problem,
min c̄^T y s.t. Ā y = b, y ≥ 0.

Given x^k, y^k = X_k^{-1} x^k = 1. The projection of −c̄ onto the null space of Ā is
d^k = −(I − X_k A^T (A X_k² A^T)^{-1} A X_k) X_k c.

Let α_k (> 0) denote the step length. We require
y^{k+1} = y^k + α_k d^k = 1 + α_k d^k ≥ 0.
Let α_max = min_{j : d^k_j < 0} (−1 / d^k_j) and set α_k = 0.9 α_max.

Step 3: x^{k+1} = X_k y^{k+1}.
Affine Scaling Algorithm (to solve an LP in standard form)
(1) Input: A, b, c, x^0, ε
(2) Set k := 0.
(3) X_k = diag(x^k)
(4) μ^k = (A X_k² A^T)^{-1} A X_k² c
(5) while (c^T x^k − b^T μ^k) > ε
    (a) d^k = −(I − X_k A^T (A X_k² A^T)^{-1} A X_k) X_k c
    (b) α_k = 0.9 min_{j : d^k_j < 0} (−1 / d^k_j)
    (c) x^{k+1} = X_k (1 + α_k d^k)
    (d) X_{k+1} = diag(x^{k+1})
    (e) μ^{k+1} = (A X_{k+1}² A^T)^{-1} A X_{k+1}² c
    (f) k := k + 1
    endwhile
Output: x* = x^k
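The steps above can be sketched compactly in numpy. This is my own illustrative implementation (the starting point x^0 and the stopping tolerance are my choices), run on the example LP from the next slide in standard form:

```python
import numpy as np

def affine_scaling(A, b, c, x0, eps=1e-6, max_iter=100):
    """Affine scaling for min c^T x s.t. Ax = b, x >= 0; x0 interior feasible."""
    x = x0.astype(float)
    for _ in range(max_iter):
        X2 = np.diag(x ** 2)
        mu = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)   # dual estimate
        if c @ x - b @ mu <= eps:                        # duality-gap stopping test
            break
        # d^k = -(I - X A^T (A X^2 A^T)^{-1} A X) X c simplifies to X(A^T mu) - Xc
        d = x * (A.T @ mu) - x * c
        neg = d < 0
        if not neg.any():
            break                                        # problem unbounded below
        alpha = 0.9 * np.min(-1.0 / d[neg])              # 0.9 of max feasible step
        x = x * (1.0 + alpha * d)                        # x^{k+1} = X_k y^{k+1}
    return x

# Example: min -3x1 - x2, x1 + x2 <= 2, x1 <= 1, x >= 0, with slacks x3, x4.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 1.0]])
b = np.array([2.0, 1.0])
c = np.array([-3.0, -1.0, 0.0, 0.0])
x0 = np.array([0.5, 0.5, 1.0, 0.5])                      # interior feasible start
x_star = affine_scaling(A, b, c, x0)
print(np.round(x_star[:2], 2))                           # close to (1, 1)
```

The identity used for d^k follows from μ = (A X² A^T)^{-1} A X² c: multiplying out, X A^T (A X² A^T)^{-1} A X (Xc) = X A^T μ.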
Application of the Affine Scaling Algorithm to the problem,
min −3x_1 − x_2
s.t. x_1 + x_2 ≤ 2
     x_1 ≤ 1
     x_1, x_2 ≥ 0
with optimal solution x* = (1, 1)^T:

Iteration k | x^{kT}      | ‖x^k − x*‖
     0      | (.6, .6)    | .57
     1      | (.87, .81)  | .23
     2      | (.88, 1.00) | .12
     3      | (.90, .99)  | .1026
     4      | (.90, 1.00) | .1001
[Figure: iterates of the affine scaling algorithm plotted in the (x_1, x_2) plane over the feasible region.]
Karmarkar's Method

Assumptions:
The problem is in homogeneous form:
min c^T x
s.t. Ax = 0, 1^T x = 1, x ≥ 0.
The optimum objective function value is 0.

Idea:
(1) Use a projective transformation to move an interior point to the centre of the feasible region
(2) Move along the projected steepest descent direction
Projective Transformation: Consider the transformation T defined by
y = T(x) = X_k^{-1} x / (1^T X_k^{-1} x),  x ≥ 0.
Remarks:
x^k ↦ y^k = (1/n, ..., 1/n)^T, and 1^T y = 1.
Inverse transformation: from x = (1^T X_k^{-1} x)(X_k y) we get 1^T x = (1^T X_k^{-1} x) 1^T (X_k y). For every feasible x, 1^T x = 1, so 1^T X_k^{-1} x = 1 / (1^T X_k y), and hence
T^{-1}(y) = x = X_k y / (1^T X_k y).
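A small numpy sanity check (mine, not from the slides) that T sends x^k to the centre of the simplex and that T^{-1} inverts T on the simplex:

```python
import numpy as np

n = 4
rng = np.random.default_rng(1)
xk = rng.random(n) + 0.1
xk /= xk.sum()                       # interior point of the simplex (1^T xk = 1)
Xk_inv = np.diag(1.0 / xk)

def T(x):
    """Projective map y = X_k^{-1} x / (1^T X_k^{-1} x)."""
    z = Xk_inv @ x
    return z / z.sum()

def T_inv(y):
    """Inverse map x = X_k y / (1^T X_k y)."""
    z = xk * y
    return z / z.sum()

x = rng.random(n) + 0.1
x /= x.sum()                         # another point on the simplex
print(np.allclose(T(xk), np.full(n, 1.0 / n)))   # x^k maps to the centre
print(np.allclose(T_inv(T(x)), x))               # round trip recovers x
```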
Using the transformation x = X_k y / (1^T X_k y), the problem
min c^T x s.t. Ax = 0, 1^T x = 1, x ≥ 0
becomes
min c^T X_k y / (1^T X_k y) s.t. A X_k y = 0, 1^T y = 1, y ≥ 0,
which is equivalent to
min c^T X_k y s.t. A X_k y = 0, 1^T y = 1, y ≥ 0.
The equivalence is based on the assumptions that the optimal objective function value is 0 and that 1^T X_k y > 0.
Consider the problem,
min c^T X_k y s.t. A X_k y = 0, 1^T y = 1, y ≥ 0.

Step 1: Find the projected steepest descent direction in the y space (the projection of −X_k c onto the subspace {d : A X_k d = 0, 1^T d = 0}).
Find d by solving
min (1/2) ‖X_k c − d‖² s.t. A X_k d = 0,
then project the result onto the null space of 1^T; the step length is chosen so that y remains ≥ 0.
Consider the problem,
min (1/2) ‖X_k c − d‖² s.t. A X_k d = 0.
L(d, μ) = (1/2) ‖X_k c − d‖² + μ^T A X_k d
∇_d L(d, μ) = 0  ⟹  −(X_k c − d) + X_k A^T μ = 0  ⟹  d = X_k c − X_k A^T μ
0 = A X_k d = A X_k² c − A X_k² A^T μ  ⟹  A X_k² c = A X_k² A^T μ.
Projecting the negative of this direction onto the null space of 1^T gives the descent direction
d^k = −(I − (1/n) 1 1^T)(X_k c − X_k A^T μ).
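The direction computation above can be checked numerically. In this numpy sketch (data construction is mine: A is adjusted so that A x^k = 0, making x^k feasible for the homogeneous form), the resulting d^k satisfies both A X_k d^k = 0 and 1^T d^k = 0:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 2
xk = rng.random(n) + 0.1
xk /= xk.sum()                                 # interior point: 1^T xk = 1, xk > 0
A0 = rng.standard_normal((m, n))
A = A0 - np.outer(A0 @ xk, np.ones(n))         # adjust rows so that A xk = 0
c = rng.standard_normal(n)
X = np.diag(xk)

# mu solves A X^2 A^T mu = A X^2 c; g is the projection of X c onto Null(A X)
mu = np.linalg.solve(A @ X @ X @ A.T, A @ X @ X @ c)
g = X @ c - X @ A.T @ mu
# Subtracting the mean applies (I - (1/n) 1 1^T); the minus gives descent.
d = -(g - g.mean())

print(np.allclose(A @ (X @ d), 0))             # True: A X_k d^k = 0
print(abs(d.sum()) < 1e-10)                    # True: 1^T d^k = 0
```

Note that A X_k d^k = 0 survives the second projection precisely because A x^k = 0, so A X_k 1 = A x^k = 0.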
Karmarkar's Projective Scaling Algorithm (for a homogeneous linear program)
(1) Input: homogeneous LP data A, c, ε
(2) Set k := 0, x^k = (1/n) 1
(3) X_k = diag(x^k)
(4) while c^T x^k > ε
    (a) Find the projected steepest descent direction d^k
    (b) y^{k+1} = (1/n) 1 + (δ / √(n(n−1))) d^k / ‖d^k‖   (δ = 1/3)
    (c) x^{k+1} = T^{-1}(y^{k+1})
    (d) X_{k+1} = diag(x^{k+1})
    (e) k := k + 1
    endwhile
Output: x* = x^k
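The algorithm can be sketched end-to-end in numpy. The example problem below is my own (not from the lecture): its feasible set is the segment between (2/3, 1/3, 0) and (0, 1/3, 2/3), and c^T x = 0 at the first of these, so the homogeneous-form assumption of optimal value 0 holds:

```python
import numpy as np

def karmarkar(A, c, eps=1e-3, delta=1.0/3.0, max_iter=1000):
    """Karmarkar's method for min c^T x s.t. Ax = 0, 1^T x = 1, x >= 0,
    assuming the optimal objective value is 0 (illustrative sketch)."""
    n = A.shape[1]
    x = np.full(n, 1.0 / n)                    # start at the simplex centre
    step = delta / np.sqrt(n * (n - 1))        # delta times inscribed-ball radius
    for _ in range(max_iter):
        if c @ x <= eps:
            break
        X = np.diag(x)
        # projected steepest descent direction (see the derivation above)
        mu = np.linalg.solve(A @ X @ X @ A.T, A @ X @ X @ c)
        g = X @ c - X @ A.T @ mu
        d = -(g - g.mean())
        y = np.full(n, 1.0 / n) + step * d / np.linalg.norm(d)
        x = x * y / (x * y).sum()              # map back: x^{k+1} = T^{-1}(y^{k+1})
    return x

A = np.array([[1.0, -2.0, 1.0]])               # homogeneous constraint A x = 0
c = np.array([1.0, -2.0, 5.0])                 # c^T x = 0 at x* = (2/3, 1/3, 0)
x_star = karmarkar(A, c)
print(np.round(x_star, 3))                     # approaches (2/3, 1/3, 0)
```

Since δ/√(n(n−1)) is strictly inside the largest ball inscribed in the simplex, every y^{k+1} stays strictly positive, so the iterates remain interior.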