# Lecture 5. Theorems of Alternatives and Self-Dual Embedding



A system of linear equations may not have a solution. It is well known that either $Ax = c$ has a solution, or $A^T y = 0,\ c^T y = 1$ has a solution (but never both, and never neither). A statement of this type is called a theorem of alternatives. There are in fact quite a few such theorems of alternatives in the context of linear algebra, as we will discuss shortly.
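As a quick numerical sanity check of this alternative, consider a tiny instance (the $2 \times 2$ matrix $A$ and vector $c$ below are illustrative choices, not data from the lecture):

```python
import numpy as np

# Illustrative data: Ax = c forces x1 = 0 and x1 = 1 simultaneously,
# so the system of equations is inconsistent.
A = np.array([[1.0, 0.0],
              [1.0, 0.0]])
c = np.array([0.0, 1.0])

# rank([A | c]) > rank(A) confirms that Ax = c has no solution.
assert np.linalg.matrix_rank(np.column_stack([A, c])) > np.linalg.matrix_rank(A)

# The alternative system A^T y = 0, c^T y = 1 then has a solution:
y = np.array([-1.0, 1.0])
assert np.allclose(A.T @ y, 0)   # y is orthogonal to the columns of A
assert np.isclose(c @ y, 1.0)    # yet y "sees" c
```

The certificate $y$ proves infeasibility: if $Ax = c$ had a solution, then $1 = c^T y = (Ax)^T y = x^T (A^T y) = 0$, a contradiction.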

A very general form of theorem of alternatives is Hilbert's Nullstellensatz. Let $P_i(x)$ be a polynomial in the $n$ complex variables $x_1, \dots, x_n$ (i.e., $x \in \mathbb{C}^n$), $i = 1, \dots, m$.

Theorem 1 (Nullstellensatz) Either $P_1(x) = 0,\ P_2(x) = 0,\ \dots,\ P_m(x) = 0$ has a solution $x \in \mathbb{C}^n$, or there exist polynomials $Q_1(x), \dots, Q_m(x)$ such that $$P_1(x)Q_1(x) + P_2(x)Q_2(x) + \cdots + P_m(x)Q_m(x) = 1.$$

For more details, visit Terence Tao's blog at:

For systems of linear inequalities, the most famous theorem of alternatives is the following Farkas lemma (shown by Farkas around the end of the 19th century).

Theorem 2 (Farkas) Either $Ax = b,\ x \ge 0$ has a solution, or $A^T y \ge 0,\ b^T y < 0$ has a solution.

In fact, more refined forms of theorems of alternatives exist, in the style of the Farkas lemma. Some examples are shown on the next few slides.
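Again a small numerical illustration (the data below is an illustrative choice): for an infeasible system $Ax = b,\ x \ge 0$, a Farkas certificate $y$ is easy to exhibit and to verify.

```python
import numpy as np

# Illustrative data: x1 + x2 = -1 with x >= 0 is clearly infeasible.
A = np.array([[1.0, 1.0]])
b = np.array([-1.0])

# Farkas certificate: A^T y >= 0 and b^T y < 0.
y = np.array([1.0])
assert np.all(A.T @ y >= 0)
assert b @ y < 0

# The certificate proves infeasibility: any x >= 0 with Ax = b would give
# 0 <= (A^T y)^T x = y^T (Ax) = b^T y < 0, a contradiction.
```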

Let $x$ and $y$ be two vectors. Write $x \gneq y$ to mean $x \ge y$ and $x \ne y$; similarly for $x \lneq y$.

Theorem 3 (Gordan) Either $Ax > 0$ has a solution, or $A^T y = 0,\ y \gneq 0$ has a solution.

Theorem 4 (Stiemke) Either $Ax \gneq 0$ has a solution, or $A^T y = 0,\ y > 0$ has a solution.

Theorem 5 (Gale) Either $Ax \le b$ has a solution, or $A^T y = 0,\ b^T y = -1,\ y \ge 0$ has a solution.

Theorem 6 (Tucker) Suppose $A \ne 0$. Either $Ax \gneq 0,\ Bx \ge 0,\ Cx = 0$ has a solution, or $A^T u + B^T v + C^T w = 0,\ u > 0,\ v \ge 0$ has a solution.

Theorem 7 (Motzkin) Suppose $A \ne 0$. Either $Ax > 0,\ Bx \ge 0,\ Cx = 0$ has a solution, or $A^T u + B^T v + C^T w = 0,\ u \gneq 0,\ v \ge 0$ has a solution.

The most important theorem of alternatives is still the Farkas lemma. The Farkas lemma can be proven either by the simplex method for linear programming, or by the separation theorem. Interestingly, the Farkas lemma can also be proven in an elementary manner, thanks to a recent brilliant expository article by Terence Tao.

Let us slightly rephrase the Farkas lemma: Let $P_i(x) = \sum_{j=1}^n p_{ij} x_j - r_i$, $i = 1, 2, \dots, m$. If the system $P_i(x) \ge 0$, $i = 1, 2, \dots, m$, does not have a solution, then there are $y_i \ge 0$ such that $\sum_{i=1}^m y_i P_i(x) \equiv -1$.

We prove the result by induction on $n$. When $n = 1$, after scaling we may put the inequalities $P_i(x) \ge 0$ into three groups: $$x_1 - a_i \ge 0,\ i \in I_+; \qquad -x_1 + b_j \ge 0,\ j \in I_-; \qquad c_k \ge 0,\ k \in I_0.$$ If there is $k \in I_0$ with $c_k < 0$, then we may let $y_k = -1/c_k$ and $y_l = 0$ for all $l \ne k$. Otherwise, infeasibility means there are $i \in I_+$ and $j \in I_-$ with $a_i > b_j$. We let $y_i = y_j = 1/(a_i - b_j)$ and $0$ elsewhere, yielding $\sum_{i=1}^m y_i P_i(x) \equiv -1$.
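The base-case construction can be checked mechanically. The sketch below uses the normalization in which the nonnegative combination equals $-1$ identically, and the concrete numbers $a_i = 3$, $b_j = 1$ are illustrative choices:

```python
# Each affine function P(x) = p*x + q is stored as a coefficient pair (p, q).
P1 = (1.0, -3.0)    # x - 3 >= 0   means x >= 3  (a_i = 3)
P2 = (-1.0, 1.0)    # -x + 1 >= 0  means x <= 1  (b_j = 1)
a_i, b_j = 3.0, 1.0
assert a_i > b_j    # the two inequalities are inconsistent

# The multiplier from the proof: y_i = y_j = 1/(a_i - b_j).
y = 1.0 / (a_i - b_j)
comb_p = y * P1[0] + y * P2[0]   # coefficient of x in y_i*P_i + y_j*P_j
comb_q = y * P1[1] + y * P2[1]   # constant term
assert comb_p == 0.0 and comb_q == -1.0   # the combination is identically -1
```

The $x$-coefficients cancel exactly, leaving the constant $-1$, which certifies infeasibility since each $y_i P_i(x)$ would be nonnegative at a feasible $x$.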

Now suppose that the result is proven when the dimension of $x$ is at most $n$. Consider the case of $n + 1$ variables: $\bar x = (x; x_{n+1})$, where $x \in \mathbb{R}^n$. Depending on the sign of the coefficient of $x_{n+1}$, we scale the original inequalities to get the following equivalent system: $$P_i(\bar x) = x_{n+1} - Q_i(x) \ge 0,\ i \in I_+; \qquad P_j(\bar x) = -x_{n+1} + Q_j(x) \ge 0,\ j \in I_-; \qquad P_k(\bar x) = Q_k(x) \ge 0,\ k \in I_0.$$ If the reduced system $$-Q_i(x) + Q_j(x) \ge 0,\ i \in I_+ \text{ and } j \in I_-; \qquad Q_k(x) \ge 0,\ k \in I_0$$ had a solution, then the original system would have a solution too (choose $x_{n+1}$ between $\max_{i \in I_+} Q_i(x)$ and $\min_{j \in I_-} Q_j(x)$).

By the condition of the lemma, therefore, the reduced system cannot have a solution. Now, using the induction hypothesis, there are $y_{ij}, y_k \ge 0$ such that $$\sum_{i \in I_+,\, j \in I_-} y_{ij}\left(-Q_i(x) + Q_j(x)\right) + \sum_{k \in I_0} y_k Q_k(x) \equiv -1.$$ Hence, $$\sum_{i \in I_+} \Big(\sum_{j \in I_-} y_{ij}\Big)\big(x_{n+1} - Q_i(x)\big) + \sum_{j \in I_-} \Big(\sum_{i \in I_+} y_{ij}\Big)\big(-x_{n+1} + Q_j(x)\big) + \sum_{k \in I_0} y_k Q_k(x) \equiv -1,$$ since the $x_{n+1}$ terms cancel. The Farkas lemma is thus proven by this simple induction argument.

In fact, the strict separation theorem for convex polytopes follows from the Farkas lemma. The separation theorem that we are talking about here is: if polytopes $P_1$ and $P_2$ do not intersect, then there is an affine function $f(x) = \langle y, x\rangle + y_0$ such that $f(x) \ge 1$ for $x \in P_1$, and $f(x) \le -1$ for $x \in P_2$.

Let the extreme points of $P_1$ be $\{p_1, \dots, p_s\}$, and the extreme points of $P_2$ be $\{q_1, \dots, q_t\}$. Then $f(x) \ge 1$ for $x \in P_1$ and $f(x) \le -1$ for $x \in P_2$ if and only if $$\langle y, p_i\rangle + y_0 \ge 1,\ i = 1, \dots, s; \qquad \langle y, q_j\rangle + y_0 \le -1,\ j = 1, \dots, t.$$

Let us now use a contradiction argument. Suppose the above system has no solution in $y$ and $y_0$. Then the Farkas lemma asserts that there are $u_i \ge 0$ ($i = 1, \dots, s$) and $v_j \ge 0$ ($j = 1, \dots, t$) such that $$\sum_{i=1}^s u_i\left(\langle y, p_i\rangle + y_0 - 1\right) + \sum_{j=1}^t v_j\left(-\langle y, q_j\rangle - y_0 - 1\right) \equiv -1.$$ This implies that $$\sum_{i=1}^s u_i p_i - \sum_{j=1}^t v_j q_j = 0, \qquad \sum_{i=1}^s u_i - \sum_{j=1}^t v_j = 0, \qquad \sum_{i=1}^s u_i + \sum_{j=1}^t v_j = 1.$$ Therefore $\sum_{i=1}^s u_i = \sum_{j=1}^t v_j = 1/2$. Moreover, $$P_1 \ni 2\sum_{i=1}^s u_i p_i = 2\sum_{j=1}^t v_j q_j \in P_2,$$ which contradicts the assumption that $P_1 \cap P_2 = \emptyset$. The same argument works if the polytopes are replaced by polyhedra.
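A concrete instance of the separation statement, with two disjoint one-dimensional polytopes chosen for illustration (the intervals and the separating function below are not from the lecture):

```python
# Illustrative 1-D polytopes: P1 = conv{2, 3} and P2 = conv{0, 1} are disjoint.
p_extreme = [2.0, 3.0]
q_extreme = [0.0, 1.0]

# A separating affine function f(x) = y*x + y0, as in the statement above.
y, y0 = 2.0, -3.0
f = lambda x: y * x + y0

# f >= 1 on the extreme points of P1 (hence on all of P1, by convexity) ...
assert all(f(p) >= 1 for p in p_extreme)
# ... and f <= -1 on the extreme points of P2 (hence on all of P2).
assert all(f(q) <= -1 for q in q_extreme)
```

Checking the inequalities only at extreme points suffices, which is exactly the reduction used in the proof.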

Let us return to conic optimization models. To study the geometric structure of the problem, introduce the linear subspace $L := \{x \mid Ax = 0\}$ and its orthogonal complement $L^\perp = \{A^T y \mid y \in \mathbb{R}^m\}$. Let $a = A^T (AA^T)^{-1} b$, so that $a + L = \{x \mid Ax = b\}$. In this notation, the generic conic optimization problem can be written as $$\text{minimize } c^T x \quad \text{subject to } x \in (a + L) \cap K. \tag{1}$$

Geometrically, conic optimization is nothing but linear optimization over the intersection of an affine subspace $a + L$ and a given closed convex cone $K$. Such a problem can be feasible or infeasible; its optimal value can be unbounded or finite; and an attainable optimal solution may or may not exist. We shall first see that the situation for a generic conic optimization problem is more complicated than for linear programming.

Let us consider the case of infeasibility. It is well known that for linear programming infeasibility is rather stable, in the sense that if the data corresponds to an infeasible problem instance, then the problem remains infeasible under any sufficiently small perturbation of the data. Why? Consider the following equivalence, due to the Farkas lemma: $$\{x \in \mathbb{R}^n \mid Ax = b,\ x \in \mathbb{R}^n_+\} = \emptyset \iff \{y \in \mathbb{R}^m \mid A^T y \in \mathbb{R}^n_+,\ b^T y < 0\} \ne \emptyset.$$ The condition $b^T y < 0$ is strict, so a certificate $y$ survives small perturbations of the data.

For conic optimization, however, this no longer holds. Consider

Example 1 $$\text{minimize } x_{22} \quad \text{subject to } x_{11} = b_1,\ x_{12} = b_2,\ \begin{pmatrix} x_{11} & x_{12} \\ x_{21} & x_{22} \end{pmatrix} \succeq 0.$$

For $(b_1, b_2) = (0, 1)$ the problem is infeasible. However, for $(b_1, b_2) = (\epsilon, 1)$, where $\epsilon$ is any positive number, the problem is always feasible.
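Example 1 can be verified numerically (treating the matrix variable as symmetric, with $x_{21} = x_{12}$; the candidate values of $x_{22}$ below are illustrative):

```python
import numpy as np

def is_psd(M, tol=1e-9):
    """Check positive semidefiniteness via the smallest eigenvalue."""
    return np.linalg.eigvalsh(M).min() >= -tol

# (b1, b2) = (0, 1): any candidate [[0, 1], [1, t]] has determinant -1 < 0,
# so no value of x22 = t makes it PSD -- the problem is infeasible.
for t in [0.0, 1.0, 100.0, 1e6]:
    assert not is_psd(np.array([[0.0, 1.0], [1.0, t]]))

# (b1, b2) = (eps, 1): [[eps, 1], [1, t]] becomes PSD once t >= 1/eps.
eps = 1e-3
assert is_psd(np.array([[eps, 1.0], [1.0, 1.0 / eps]]))
```

Note how the feasible $x_{22}$ blows up like $1/\epsilon$ as $\epsilon \downarrow 0$, which is the hallmark of the fragile infeasibility discussed next.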

Definition 1 We call the conic optimization problem:

- weakly infeasible, if $(a + L) \cap K = \emptyset$ but $\mathrm{dist}(a + L, K) = 0$;
- strongly infeasible, if $\mathrm{dist}(a + L, K) > 0$;
- weakly feasible, if $(a + L) \cap K \ne \emptyset$ but $(a + L) \cap \mathrm{int}\, K = \emptyset$;
- strongly feasible, if $(a + L) \cap \mathrm{int}\, K \ne \emptyset$.

Theorem 8 (Alternatives of strong infeasibility) $\{x \in \mathbb{R}^n \mid Ax = b,\ x \in K\}$ is not strongly infeasible $\iff$ $\{y \in \mathbb{R}^m \mid A^T y \in K^*,\ b^T y < 0\} = \emptyset$.

Proof. Note that for $s \in L^\perp$, i.e. $s = A^T y$ for some $y \in \mathbb{R}^m$, we have $a^T s = a^T A^T y = b^T y$. Observe the following chain of equivalent statements:

$\{y \in \mathbb{R}^m \mid A^T y \in K^*,\ b^T y < 0\} = \emptyset$
$\iff$ for all $s \in L^\perp \cap K^*$ it must follow that $a^T s \ge 0$
$\iff$ $a \in (L^\perp \cap K^*)^* = \mathrm{cl}\,(K + L)$
$\iff$ there are $x^i \in K$, $\bar x^i \in L$ such that $a - (x^i + \bar x^i) = (a - \bar x^i) - x^i \to 0$
$\iff$ $\mathrm{dist}(a + L, K) = 0$, i.e. $(a + L) \cap K$ is not strongly infeasible.

The theorem is proven.

Definition 2 We call

- $d_P$ a primal improving direction if $d_P \in L \cap K$ and $c^T d_P < 0$;
- $\{d_P^i\}$ a primal improving direction sequence if $d_P^i \in K$, $\mathrm{dist}(d_P^i, L) \to 0$ and $\limsup_i c^T d_P^i < 0$;
- $d_D$ a dual improving direction if $d_D \in L^\perp \cap K^*$ and $a^T d_D < 0$;
- $\{d_D^i\}$ a dual improving direction sequence if $d_D^i \in K^*$, $\mathrm{dist}(d_D^i, L^\perp) \to 0$ and $\limsup_i a^T d_D^i < 0$.

If a problem is not strongly infeasible, then it is either feasible or weakly infeasible. The next question is: can we further characterize the feasibility of a conic problem? If this is possible, then by exclusion we will also have characterized weak infeasibility.

Theorem 9 (Alternatives of primal feasibility) $\{x \in \mathbb{R}^n \mid Ax = b,\ x \in K\}$ is infeasible if and only if there is a dual improving direction sequence.

Proof. Let $\bar L$ be the homogenization of $\{x \mid Ax = b\}$: $$\bar L = \left\{ (x_0; x) \mid -x_0 b + Ax = 0 \right\}, \qquad \bar K = \mathbb{R}_+ \times K.$$ We have $$\bar L^\perp = \left\{ (s_0; s) \mid s_0 = -b^T y,\ s = A^T y,\ y \in \mathbb{R}^m \right\}, \qquad \bar K^* = \mathbb{R}_+ \times K^*.$$ Let $\bar c = (-1; 0_n)$.

Theorem 8 states that strong infeasibility of the primal problem is equivalent to the existence of a dual improving direction. Stated in the dual way, Theorem 8 can also be written as $$(\bar c + \bar L^\perp) \cap \bar K^* \text{ is strongly infeasible} \iff \exists\, \bar x \in \bar L \cap \bar K \text{ with } \bar c^T \bar x < 0. \tag{2}$$ It is easy to see that $$\{x \mid Ax = b,\ x \in K\} \text{ is feasible} \iff \exists\, \bar x \in \bar L \cap \bar K \text{ with } \bar c^T \bar x < 0 \iff (\bar c + \bar L^\perp) \cap \bar K^* \text{ is strongly infeasible (by (2))}.$$

Therefore, $\{x \mid Ax = b,\ x \in K\}$ is infeasible
$\iff$ $(\bar c + \bar L^\perp) \cap \bar K^*$ is not strongly infeasible
$\iff$ there are $s_0^i \ge 0$, $s^i \in K^*$ such that $\mathrm{dist}\big((s_0^i; s^i),\ \bar c + \bar L^\perp\big) \to 0$
$\iff$ $s^i \in K^*$, $\mathrm{dist}(s^i, L^\perp) \to 0$ and $\limsup_i a^T s^i < 0$
$\iff$ there is a dual improving direction sequence.

As a consequence, by exclusion, this implies the following result:

Corollary 1 $\{x \in \mathbb{R}^n \mid Ax = b,\ x \in K\}$ is weakly infeasible $\iff$ there is a dual improving direction sequence but no dual improving direction.

So far we have been concerned with characterizing infeasible conic programs. How about feasible ones? Recall that we have two types of feasibility: strong feasibility (also known as satisfying the Slater condition), and weak feasibility, which means that the problem is feasible but the Slater condition is not satisfied. It is helpful to derive an equivalent condition for the Slater condition using dual information.

Lemma 1 $(a + L) \cap \mathrm{int}\, K \ne \emptyset$ if and only if for any $d \ne 0$ with $d \in L^\perp \cap K^*$ it must follow that $a^T d > 0$.

Proof. Note that $x \in \mathrm{int}\, K$ $\iff$ for all $0 \ne y \in K^*$ it follows that $x^T y > 0$.

$\Longrightarrow$: Let $x \in (a + L) \cap \mathrm{int}\, K$; in particular, write $x = a + z$ with $z \in L$. Take any $d \ne 0$ with $d \in L^\perp \cap K^*$: $$0 < x^T d = (a + z)^T d = a^T d.$$

$\Longleftarrow$: If $(a + L) \cap \mathrm{int}\, K = \emptyset$, then $a + L$ and $\mathrm{int}\, K$ can be separated by a hyperplane. That is, there is $s \ne 0$ such that $s^T x \ge 0$ for all $x \in K$ and $s^T (a + y) \le 0$ for all $y \in L$. The last inequality means that $s^T a \le 0$ and $s^T y = 0$ for all $y \in L$. Thus, we have found a non-zero vector $s \in L^\perp \cap K^*$ with $a^T s \le 0$, which contradicts the assumption.

Theorem 10 If the primal conic program satisfies the Slater condition, and its dual is feasible, then the dual has a non-empty and compact optimal solution set.

Proof. By the weak duality theorem, the objective value of the dual problem is bounded above. The only way for the dual problem to fail to possess an optimal solution is for the supremum to be non-attainable; i.e. there is an infinite sequence $s^i = c - A^T y^i \in (c + L^\perp) \cap K^*$ such that $$a^T s^i = a^T c - b^T y^i \to a^T c - (\text{optimal value of the dual problem}),$$ while $\|s^i\| \to \infty$. Without loss of generality, suppose that $s^i / \|s^i\|$ converges to $d$. Then $d \ne 0$, $d \in L^\perp \cap K^*$ and $a^T d = 0$. This contradicts the primal Slater condition, in light of Lemma 1. Hence the optimal solution set must be bounded, and, being closed, compact.

Therefore, if a pair of primal–dual conic programs both satisfy the Slater condition, then attainable optimal solutions exist for both problems, according to Theorem 10. The only remaining issue in this regard is the strong duality theorem: do these optimal solutions satisfy the complementary slackness conditions? The answer is yes.

Theorem 11 If the primal conic program and its dual conic program both satisfy the Slater condition, then the optimal solution sets of both problems are non-empty and compact. Moreover, the optimal solutions are complementary to each other, with zero duality gap.

Proof. The first part of the theorem is simply Theorem 10, applied to the primal and to the dual separately. We now focus on the second part. Let $x^*$ and $s^*$ be optimal solutions for the primal and the dual conic optimization problems, respectively. Write the primal problem in the parametric form: $$(P(a)) \quad \text{minimize } c^T x \quad \text{subject to } x \in a + L,\ x \in K.$$ Let the optimal value of $(P(a))$, as a function of the vector $a$, be $v(a)$. Clearly $v(a)$ is a convex function on its domain $D$, the set of $a$ for which $(P(a))$ is feasible. Obviously, $D \supseteq K$. Also, we see that $v(0) = 0$: the point $x = 0$ is feasible for $(P(0))$, and $v(0) < 0$ would yield a primal improving direction, which by the dual form of Theorem 8 would make the (feasible) dual strongly infeasible.

By the Slater condition we know that $a \in \mathrm{int}\, D$. Let $g$ be a subgradient of $v$ at $a$; that is, $$v(z) \ge v(a) + g^T (z - a) \quad \text{for all } z \in D. \tag{3}$$ Observe that for any $d \in L$ we have $a + d + L = a + L$. Thus, $v(a + d) = v(a)$ for any $d \in L$. This implies $g \in L^\perp$. Take any $u \in K$. Since $x^* + u$ is a feasible solution for $(P(a + u))$, it follows that $$v(a + u) \le c^T (x^* + u) = v(a) + c^T u.$$ Together with the subgradient inequality (3) this yields $(c - g)^T u \ge 0$ for any $u \in K$; hence $c - g \in K^*$. Thus, $c - g$ is a dual feasible slack, and since maximizing $b^T y$ amounts to minimizing $a^T s$ over $s \in (c + L^\perp) \cap K^*$, we get $a^T s^* \le a^T (c - g)$.

Taking $z = 0$, the subgradient inequality gives us $$0 = v(0) \ge v(a) + g^T (0 - a) = c^T x^* - g^T a. \tag{4}$$ Combining $a^T s^* \le a^T(c - g)$ with (4) we have $a^T c - c^T x^* - a^T s^* \ge 0$. Hence, since $x^* - a \in L$ and $s^* - c \in L^\perp$, $$0 = (x^* - a)^T (s^* - c) = (x^*)^T s^* + a^T c - c^T x^* - a^T s^* \ge (x^*)^T s^* \ge 0.$$ Thus, $(x^*)^T s^* = 0$, and the complementary slackness condition is satisfied.

So far we have discussed the relationship between a duality pair of conic optimization problems $$(P) \quad \text{minimize } c^T x \quad \text{subject to } Ax = b,\ x \in K,$$ and $$(D) \quad \text{maximize } b^T y \quad \text{subject to } A^T y + s = c,\ s \in K^*.$$

In the case $K = K^* = \mathbb{R}^n_+$ (the linear programming case) the following facts are well known:

- If $(P)$ and $(D)$ are both feasible, then both problems are solvable, with attainable optimal solutions.
- It is possible that $(P)$ is feasible and unbounded in its objective; in that case, $(D)$ is infeasible. Symmetrically, if $(D)$ is feasible and unbounded, then $(P)$ is infeasible.
- It is possible that both $(P)$ and $(D)$ are infeasible.

This can be summarized by the following table, where the row is the status of the primal problem and the column represents the status of the dual problem ("Feasible" here means feasible with finite optimal value):

| | Feasible | Unbounded | Infeasible |
|---|---|---|---|
| Feasible | $(P)$, $(D)$ solvable | impossible | impossible |
| Unbounded | impossible | impossible | possible |
| Infeasible | impossible | possible | possible |
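The "both infeasible" cell can be witnessed by explicit certificates. The LP pair below is an illustrative instance of my own choosing, checked via the two Farkas-type certificates rather than by running an LP solver:

```python
import numpy as np

# Illustrative LP pair in which BOTH (P) and (D) are infeasible (K = R^n_+):
#   (P) min c^T x  s.t. Ax = b, x >= 0        (D) max b^T y  s.t. A^T y <= c
A = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])
b = np.array([1.0, 1.0])
c = np.array([-1.0, -1.0])

# Farkas certificate of primal infeasibility: A^T y >= 0 and b^T y < 0.
y_cert = np.array([-1.0, -1.0])
assert np.all(A.T @ y_cert >= 0) and b @ y_cert < 0

# Certificate of dual infeasibility: a ray d >= 0 with Ad = 0 and c^T d < 0;
# then d^T(A^T y) = 0 while d^T c < 0 rules out A^T y <= c.
d_cert = np.array([1.0, 1.0])
assert np.all(d_cert >= 0) and np.allclose(A @ d_cert, 0) and c @ d_cert < 0
```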

For general conic programs, assuming $(D)$ is feasible, the relationship with $(P)$ is as follows:

| Primal status | Dual side |
|---|---|
| strongly feasible | bounded optimal solution set (Theorem 10) |
| weakly feasible | no improving direction sequence (Theorem 9) |
| weakly infeasible | improving direction sequence but no improving direction (Corollary 1) |
| strongly infeasible | improving direction (Theorem 8) |

Example 2 $$\text{minimize } x_{12} + x_{21} \quad \text{subject to } x_{12} + x_{21} + x_{33} = 1,\ x_{22} = 0,\ \begin{pmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{pmatrix} \succeq 0,$$ with the dual $$\text{maximize } y_1 \quad \text{subject to } \begin{pmatrix} 0 & 1 - y_1 & 0 \\ 1 - y_1 & -y_2 & 0 \\ 0 & 0 & -y_1 \end{pmatrix} \succeq 0.$$ The primal problem is weakly feasible, while the dual is weakly infeasible.
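The weak infeasibility of the dual in Example 2 can be probed numerically. The sketch below assumes the sign convention in which the dual slack at $(y_1, y_2) = (0, -t)$ is the matrix $S(t)$ coded below; the distance to the PSD cone stays positive but tends to $0$ as $t \to \infty$:

```python
import numpy as np

def dist_to_psd(S):
    """Frobenius distance from symmetric S to the PSD cone (eigenvalue clipping)."""
    w, V = np.linalg.eigh(S)
    S_proj = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return np.linalg.norm(S - S_proj)

def S(t):
    # Assumed dual slack of Example 2 at y1 = 0, y2 = -t.
    return np.array([[0.0, 1.0, 0.0],
                     [1.0, t,   0.0],
                     [0.0, 0.0, 0.0]])

# The slack is never PSD (its top-left 2x2 block has determinant -1),
# yet its distance to the PSD cone shrinks like 1/t:
dists = [dist_to_psd(S(t)) for t in [1.0, 10.0, 1e3, 1e6]]
assert all(d > 0 for d in dists)     # infeasible for every t ...
assert dists[-1] < 1e-5              # ... but dist -> 0: weak infeasibility
```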

Example 3 $$\text{minimize } x_{12} + x_{21} \quad \text{subject to } x_{12} + x_{21} + x_{23} + x_{32} = 1,\ x_{22} = 0,\ \begin{pmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{pmatrix} \succeq 0,$$ with the dual $$\text{maximize } y_1 \quad \text{subject to } \begin{pmatrix} 0 & 1 - y_1 & 0 \\ 1 - y_1 & -y_2 & -y_1 \\ 0 & -y_1 & 0 \end{pmatrix} \succeq 0.$$ Both the primal and the dual are weakly infeasible.

Example 4 $$\text{minimize } x_{12} + x_{21} - x_{33} \quad \text{subject to } x_{12} + x_{21} + x_{23} + x_{32} = 1,\ x_{22} = 0,\ \begin{pmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{pmatrix} \succeq 0,$$ with the dual $$\text{maximize } y_1 \quad \text{subject to } \begin{pmatrix} 0 & 1 - y_1 & 0 \\ 1 - y_1 & -y_2 & -y_1 \\ 0 & -y_1 & -1 \end{pmatrix} \succeq 0.$$ In this case, the primal is weakly infeasible, while the dual is strongly infeasible.

Example 5 $$\text{minimize } x_{12} + x_{21} - x_{33} \quad \text{subject to } x_{12} + x_{21} + x_{23} + x_{32} = 1,\ x_{22} = -1,\ \begin{pmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{pmatrix} \succeq 0,$$ with the dual $$\text{maximize } y_1 - y_2 \quad \text{subject to } \begin{pmatrix} 0 & 1 - y_1 & 0 \\ 1 - y_1 & -y_2 & -y_1 \\ 0 & -y_1 & -1 \end{pmatrix} \succeq 0.$$ In this case, both the primal and the dual are strongly infeasible.

Consider the standard conic optimization problems $(P)$ and $(D)$. Take any $x^0 \in \mathrm{int}\, K$, $s^0 \in \mathrm{int}\, K^*$, and $y^0 \in \mathbb{R}^m$. Moreover, define $$r_p = b - Ax^0, \qquad r_d = s^0 - c + A^T y^0, \qquad r_g = 1 + c^T x^0 - b^T y^0.$$ The following model is self-dual:

$$\begin{aligned} (SD)\quad \min\ & \beta\theta \\ \text{s.t.}\ & Ax - b\tau + r_p\theta = 0, \\ & -A^T y + c\tau + r_d\theta - s = 0, \\ & b^T y - c^T x + r_g\theta - \kappa = 0, \\ & -r_p^T y - r_d^T x - r_g\tau = -\beta, \\ & x \in K,\ \tau \ge 0,\ s \in K^*,\ \kappa \ge 0, \end{aligned}$$

where $\beta = 1 + (x^0)^T s^0 > 1$, and the decision variables are $(y, x, \tau, \theta, s, \kappa)$.
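The self-duality comes from the skew-symmetry of the coefficient matrix of $(SD)$ in the variables $(y, x, \tau, \theta)$. The sketch below checks this, and the interior feasibility claim of the next slide, on a tiny LP instance ($K = K^* = \mathbb{R}^n_+$; the data $A$, $b$, $c$ is an illustrative choice):

```python
import numpy as np

# Tiny illustrative LP instance (K = K^* = R^n_+).
A = np.array([[1.0, 2.0]])
b = np.array([3.0])
c = np.array([1.0, 1.0])
m, n = A.shape

x0 = np.ones(n)          # x0 in int K
s0 = np.ones(n)          # s0 in int K^*
y0 = np.zeros(m)

r_p = b - A @ x0
r_d = s0 - c + A.T @ y0
r_g = 1.0 + c @ x0 - b @ y0
beta = 1.0 + x0 @ s0

# Coefficient matrix of (SD) in the variables (y, x, tau, theta):
M = np.block([
    [np.zeros((m, m)), A,                -b[:, None],        r_p[:, None]],
    [-A.T,             np.zeros((n, n)), c[:, None],         r_d[:, None]],
    [b[None, :],       -c[None, :],      np.zeros((1, 1)),   np.array([[r_g]])],
    [-r_p[None, :],    -r_d[None, :],    np.array([[-r_g]]), np.zeros((1, 1))],
])
assert np.allclose(M, -M.T)   # skew-symmetry is what makes (SD) self-dual

# The point (y, x, tau, theta, s, kappa) = (y0, x0, 1, 1, s0, 1) is feasible:
# the four equation blocks evaluate to (0; s0; kappa; -beta).
v = np.concatenate([y0, x0, [1.0], [1.0]])
rhs = np.concatenate([np.zeros(m), s0, [1.0], [-beta]])
assert np.allclose(M @ v, rhs)
```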

The above self-dual model admits an interior feasible solution $$(y, x, \tau, \theta, s, \kappa) = (y^0, x^0, 1, 1, s^0, 1).$$

Proposition 1 The problem $(SD)$ has a maximally complementary optimal solution, denoted by $(y^*, x^*, \tau^*, \theta^*, s^*, \kappa^*)$, such that $\theta^* = 0$ and $(x^*)^T s^* + \tau^* \kappa^* = 0$. Moreover, if $\tau^* > 0$, then $x^*/\tau^*$ is an optimal solution for the primal problem, and $(y^*/\tau^*, s^*/\tau^*)$ is an optimal solution for the dual problem. If $\kappa^* > 0$, then either $c^T x^* < 0$ or $b^T y^* > 0$; in the former case the dual problem is strongly infeasible, and in the latter case the primal problem is strongly infeasible.

If strict complementarity fails (i.e. $\tau^* = \kappa^* = 0$), then one can only conclude that the original problems do not have complementary optimal solutions. In that case, one may wish to identify whether the problems are weakly feasible or weakly infeasible. However, weak infeasibility is slippery terrain: it cannot be certified by a finite proof. The type of certificate that one can expect is in the spirit of: if the problem is feasible at all, then any of its feasible solutions must have a very large norm.

For linear programming, the situation is particularly interesting. Remember we have shown before that if a primal–dual LP pair is strongly feasible, then the primal–dual central path solutions, defined by $$Ax(\mu) = b, \qquad A^T y(\mu) + s(\mu) = c, \qquad x_i(\mu)\, s_i(\mu) = \mu,\ i = 1, 2, \dots, n,$$ converge to a strictly complementary optimal solution: $x^* := \lim_{\mu \downarrow 0} x(\mu)$ and $s^* := \lim_{\mu \downarrow 0} s(\mu)$ are strictly complementary.
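For a tiny LP, the central path can be traced by hand. The instance below is an illustrative choice (one equality constraint, two variables), where the central path equations reduce to a scalar equation in the dual variable $y$, solved here by bisection:

```python
import numpy as np

# Illustrative LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0.
# Central path: x1 + x2 = 1, s = (1 - y, 2 - y), x_i * s_i = mu.
def central_path(mu):
    # Solve mu/(1-y) + mu/(2-y) = 1 for y < 1 by bisection
    # (the left-hand side is increasing in y).
    lo, hi = -1e8, 1.0 - 1e-12
    for _ in range(200):
        y = 0.5 * (lo + hi)
        if mu / (1.0 - y) + mu / (2.0 - y) > 1.0:
            hi = y
        else:
            lo = y
    y = 0.5 * (lo + hi)
    x = np.array([mu / (1.0 - y), mu / (2.0 - y)])
    s = np.array([1.0 - y, 2.0 - y])
    return x, y, s

x, y, s = central_path(1e-9)
# The limit point is the strictly complementary pair x* = (1, 0), s* = (0, 1):
assert np.allclose(x, [1.0, 0.0], atol=1e-6)
assert np.allclose(s, [0.0, 1.0], atol=1e-6)
assert np.all(x + s > 1e-3)      # strict complementarity: x* + s* > 0
```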

When we apply this to $(SD)$, we obtain $$Ax^* - b\tau^* = 0, \qquad -A^T y^* + c\tau^* - s^* = 0, \qquad b^T y^* - c^T x^* - \kappa^* = 0,$$ with $x^* \ge 0$, $s^* \ge 0$, $\tau^* \ge 0$, $\kappa^* \ge 0$ and $x^* + s^* > 0$, $\tau^* + \kappa^* > 0$.

This constructively proves a well-known result of Goldman and Tucker:

Theorem 12 (Goldman and Tucker, 1956) Suppose that $M$ is skew-symmetric. Then the following system always has a solution: $$Mx - s = 0, \qquad x \ge 0,\ s \ge 0,\ x + s > 0.$$

In fact, the theorem of Goldman and Tucker can also be used to prove the previous result: the two are equivalent.
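For a concrete skew-symmetric $M$ (an illustrative $2 \times 2$ choice), such a strictly complementary solution is easy to exhibit and verify:

```python
import numpy as np

# Illustrative skew-symmetric M; exhibit a solution of Mx - s = 0 with
# x >= 0, s >= 0 and x + s > 0.
M = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
assert np.allclose(M, -M.T)

x = np.array([0.0, 1.0])
s = M @ x                        # s = (1, 0)
assert np.all(x >= 0) and np.all(s >= 0)
assert np.all(x + s > 0)         # every index is positive in exactly one of x, s
# Skew-symmetry forces x^T s = x^T M x = 0, so x and s are complementary.
assert np.isclose(x @ s, 0.0)
```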

Key References:

- T. Tao, Structure and Randomness, American Mathematical Society.
- Z.Q. Luo, J.F. Sturm and S. Zhang, Conic Convex Programming and Self-Dual Embedding, Optimization Methods and Software, 14.
- J.F. Sturm, Theory and Algorithms for Semidefinite Programming. In High Performance Optimization, eds. H. Frenk, K. Roos, T. Terlaky, and S. Zhang, Kluwer Academic Publishers.
- Y. Ye, M.J. Todd, and S. Mizuno, An $O(\sqrt{n}L)$-iteration Homogeneous and Self-Dual Linear Programming Algorithm, Mathematics of Operations Research, 19, 53–67, 1994.


### - Well-characterized problems, min-max relations, approximate certificates. - LP problems in the standard form, primal and dual linear programs

LP-Duality ( Approximation Algorithms by V. Vazirani, Chapter 12) - Well-characterized problems, min-max relations, approximate certificates - LP problems in the standard form, primal and dual linear programs

### CS261: A Second Course in Algorithms Lecture #9: Linear Programming Duality (Part 2)

CS261: A Second Course in Algorithms Lecture #9: Linear Programming Duality (Part 2) Tim Roughgarden February 2, 2016 1 Recap This is our third lecture on linear programming, and the second on linear programming

### (Linear Programming program, Linear, Theorem on Alternative, Linear Programming duality)

Lecture 2 Theory of Linear Programming (Linear Programming program, Linear, Theorem on Alternative, Linear Programming duality) 2.1 Linear Programming: basic notions A Linear Programming (LP) program is

### CSCI5654 (Linear Programming, Fall 2013) Lecture-8. Lecture 8 Slide# 1

CSCI5654 (Linear Programming, Fall 2013) Lecture-8 Lecture 8 Slide# 1 Today s Lecture 1. Recap of dual variables and strong duality. 2. Complementary Slackness Theorem. 3. Interpretation of dual variables.

### Introduction to optimization

Introduction to optimization Geir Dahl CMA, Dept. of Mathematics and Dept. of Informatics University of Oslo 1 / 24 The plan 1. The basic concepts 2. Some useful tools (linear programming = linear optimization)

### GEORGIA INSTITUTE OF TECHNOLOGY H. MILTON STEWART SCHOOL OF INDUSTRIAL AND SYSTEMS ENGINEERING LECTURE NOTES OPTIMIZATION III

GEORGIA INSTITUTE OF TECHNOLOGY H. MILTON STEWART SCHOOL OF INDUSTRIAL AND SYSTEMS ENGINEERING LECTURE NOTES OPTIMIZATION III CONVEX ANALYSIS NONLINEAR PROGRAMMING THEORY NONLINEAR PROGRAMMING ALGORITHMS

### Convex Optimization & Lagrange Duality

Convex Optimization & Lagrange Duality Chee Wei Tan CS 8292 : Advanced Topics in Convex Optimization and its Applications Fall 2010 Outline Convex optimization Optimality condition Lagrange duality KKT

### CONSTRAINT QUALIFICATIONS, LAGRANGIAN DUALITY & SADDLE POINT OPTIMALITY CONDITIONS

CONSTRAINT QUALIFICATIONS, LAGRANGIAN DUALITY & SADDLE POINT OPTIMALITY CONDITIONS A Dissertation Submitted For The Award of the Degree of Master of Philosophy in Mathematics Neelam Patel School of Mathematics

### EE364a Review Session 5

EE364a Review Session 5 EE364a Review announcements: homeworks 1 and 2 graded homework 4 solutions (check solution to additional problem 1) scpd phone-in office hours: tuesdays 6-7pm (650-723-1156) 1 Complementary

### LECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE

LECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE CONVEX ANALYSIS AND DUALITY Basic concepts of convex analysis Basic concepts of convex optimization Geometric duality framework - MC/MC Constrained optimization

### On smoothness properties of optimal value functions at the boundary of their domain under complete convexity

On smoothness properties of optimal value functions at the boundary of their domain under complete convexity Oliver Stein # Nathan Sudermann-Merx June 14, 2013 Abstract This article studies continuity

### Linear Programming. Larry Blume Cornell University, IHS Vienna and SFI. Summer 2016

Linear Programming Larry Blume Cornell University, IHS Vienna and SFI Summer 2016 These notes derive basic results in finite-dimensional linear programming using tools of convex analysis. Most sources

### LP. Kap. 17: Interior-point methods

LP. Kap. 17: Interior-point methods the simplex algorithm moves along the boundary of the polyhedron P of feasible solutions an alternative is interior-point methods they find a path in the interior of

### Linear and Integer Optimization (V3C1/F4C1)

Linear and Integer Optimization (V3C1/F4C1) Lecture notes Ulrich Brenner Research Institute for Discrete Mathematics, University of Bonn Winter term 2016/2017 March 8, 2017 12:02 1 Preface Continuous updates

Iranian Journal of Operations Research Vol. 2, No. 2, 20, pp. 29-34 A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint M. Salahi Semidefinite

### Spring 2017 CO 250 Course Notes TABLE OF CONTENTS. richardwu.ca. CO 250 Course Notes. Introduction to Optimization

Spring 2017 CO 250 Course Notes TABLE OF CONTENTS richardwu.ca CO 250 Course Notes Introduction to Optimization Kanstantsin Pashkovich Spring 2017 University of Waterloo Last Revision: March 4, 2018 Table

### SEMIDEFINITE PROGRAM BASICS. Contents

SEMIDEFINITE PROGRAM BASICS BRIAN AXELROD Abstract. A introduction to the basics of Semidefinite programs. Contents 1. Definitions and Preliminaries 1 1.1. Linear Algebra 1 1.2. Convex Analysis (on R n

### CSC Linear Programming and Combinatorial Optimization Lecture 10: Semidefinite Programming

CSC2411 - Linear Programming and Combinatorial Optimization Lecture 10: Semidefinite Programming Notes taken by Mike Jamieson March 28, 2005 Summary: In this lecture, we introduce semidefinite programming

### Linear and Combinatorial Optimization

Linear and Combinatorial Optimization The dual of an LP-problem. Connections between primal and dual. Duality theorems and complementary slack. Philipp Birken (Ctr. for the Math. Sc.) Lecture 3: Duality

### MAT-INF4110/MAT-INF9110 Mathematical optimization

MAT-INF4110/MAT-INF9110 Mathematical optimization Geir Dahl August 20, 2013 Convexity Part IV Chapter 4 Representation of convex sets different representations of convex sets, boundary polyhedra and polytopes:

### Example Problem. Linear Program (standard form) CSCI5654 (Linear Programming, Fall 2013) Lecture-7. Duality

CSCI5654 (Linear Programming, Fall 013) Lecture-7 Duality Lecture 7 Slide# 1 Lecture 7 Slide# Linear Program (standard form) Example Problem maximize c 1 x 1 + + c n x n s.t. a j1 x 1 + + a jn x n b j

### Lecture 6: Conic Optimization September 8

IE 598: Big Data Optimization Fall 2016 Lecture 6: Conic Optimization September 8 Lecturer: Niao He Scriber: Juan Xu Overview In this lecture, we finish up our previous discussion on optimality conditions

### Interior Point Methods for Nonlinear Optimization

Interior Point Methods for Nonlinear Optimization Imre Pólik 1 and Tamás Terlaky 2 1 School of Computational Engineering and Science, McMaster University, Hamilton, Ontario, Canada, imre@polik.net 2 School

### Optimality, Duality, Complementarity for Constrained Optimization

Optimality, Duality, Complementarity for Constrained Optimization Stephen Wright University of Wisconsin-Madison May 2014 Wright (UW-Madison) Optimality, Duality, Complementarity May 2014 1 / 41 Linear

### 14. Duality. ˆ Upper and lower bounds. ˆ General duality. ˆ Constraint qualifications. ˆ Counterexample. ˆ Complementary slackness.

CS/ECE/ISyE 524 Introduction to Optimization Spring 2016 17 14. Duality ˆ Upper and lower bounds ˆ General duality ˆ Constraint qualifications ˆ Counterexample ˆ Complementary slackness ˆ Examples ˆ Sensitivity

### Detecting Infeasibility in Infeasible-Interior-Point Methods for Optimization

Detecting Infeasibility in Infeasible-Interior-Point Methods for Optimization M. J. Todd January 16, 2003 Abstract We study interior-point methods for optimization problems in the case of infeasibility

### The Karush-Kuhn-Tucker (KKT) conditions

The Karush-Kuhn-Tucker (KKT) conditions In this section, we will give a set of sufficient (and at most times necessary) conditions for a x to be the solution of a given convex optimization problem. These

### Optimization WS 13/14:, by Y. Goldstein/K. Reinert, 9. Dezember 2013, 16: Linear programming. Optimization Problems

Optimization WS 13/14:, by Y. Goldstein/K. Reinert, 9. Dezember 2013, 16:38 2001 Linear programming Optimization Problems General optimization problem max{z(x) f j (x) 0,x D} or min{z(x) f j (x) 0,x D}

### Lecture: Algorithms for LP, SOCP and SDP

1/53 Lecture: Algorithms for LP, SOCP and SDP Zaiwen Wen Beijing International Center For Mathematical Research Peking University http://bicmr.pku.edu.cn/~wenzw/bigdata2018.html wenzw@pku.edu.cn Acknowledgement:

### On implementing a primal-dual interior-point method for conic quadratic optimization

On implementing a primal-dual interior-point method for conic quadratic optimization E. D. Andersen, C. Roos, and T. Terlaky December 18, 2000 Abstract Conic quadratic optimization is the problem of minimizing

### A Generalized Homogeneous and Self-Dual Algorithm. for Linear Programming. February 1994 (revised December 1994)

A Generalized Homogeneous and Self-Dual Algorithm for Linear Programming Xiaojie Xu Yinyu Ye y February 994 (revised December 994) Abstract: A generalized homogeneous and self-dual (HSD) infeasible-interior-point

### Lectures 6, 7 and part of 8

Lectures 6, 7 and part of 8 Uriel Feige April 26, May 3, May 10, 2015 1 Linear programming duality 1.1 The diet problem revisited Recall the diet problem from Lecture 1. There are n foods, m nutrients,

### Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Instructor: Moritz Hardt Email: hardt+ee227c@berkeley.edu Graduate Instructor: Max Simchowitz Email: msimchow+ee227c@berkeley.edu

### Lecture 10. Primal-Dual Interior Point Method for LP

IE 8534 1 Lecture 10. Primal-Dual Interior Point Method for LP IE 8534 2 Consider a linear program (P ) minimize c T x subject to Ax = b x 0 and its dual (D) maximize b T y subject to A T y + s = c s 0.

### Constrained Optimization and Lagrangian Duality

CIS 520: Machine Learning Oct 02, 2017 Constrained Optimization and Lagrangian Duality Lecturer: Shivani Agarwal Disclaimer: These notes are designed to be a supplement to the lecture. They may or may

### Assignment 1: From the Definition of Convexity to Helley Theorem

Assignment 1: From the Definition of Convexity to Helley Theorem Exercise 1 Mark in the following list the sets which are convex: 1. {x R 2 : x 1 + i 2 x 2 1, i = 1,..., 10} 2. {x R 2 : x 2 1 + 2ix 1x

### Chapter 1 Linear Programming. Paragraph 5 Duality

Chapter 1 Linear Programming Paragraph 5 Duality What we did so far We developed the 2-Phase Simplex Algorithm: Hop (reasonably) from basic solution (bs) to bs until you find a basic feasible solution

### Discrete Optimization

Prof. Friedrich Eisenbrand Martin Niemeier Due Date: April 15, 2010 Discussions: March 25, April 01 Discrete Optimization Spring 2010 s 3 You can hand in written solutions for up to two of the exercises