Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 4
Instructor: Farid Alizadeh
Scribe: Haengju Lee
10/01/2001

1 Overview

We examine the dual of the Fermat-Weber problem. Next we study optimality conditions in the form of a generalized complementary slackness theorem. Finally, we begin the study of the eigenvalue optimization problem as a semidefinite program.

2 Dual of the Fermat-Weber Problem

Recall that the Fermat-Weber problem seeks a point in m-dimensional space whose weighted Euclidean distance from a set of n given points is minimized (see Lecture 1). Given points v_1, v_2, ..., v_n ∈ R^m and weights w_1, w_2, ..., w_n, this problem can be formulated as follows:

    min ∑_{i=1}^n w_i ‖v_i − x‖.

The problem can be written equivalently as a cone-LP over Q, the second order cone:

    min  w_1 z_1 + ... + w_n z_n
    s.t. z_i ≥ ‖v_i − x‖,  i = 1, ..., n.
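As a numerical aside (not part of the lecture), the weighted-distance objective above can be minimized directly by the classical Weiszfeld iteration, an iteratively re-weighted averaging scheme. A minimal numpy sketch:

```python
import numpy as np

def weiszfeld(points, weights, iters=200):
    """Iteratively re-weighted averaging converging to the Fermat-Weber point."""
    x = np.average(points, axis=0, weights=weights)   # start at weighted centroid
    for _ in range(iters):
        d = np.linalg.norm(points - x, axis=1)
        d = np.maximum(d, 1e-12)                      # guard against division by zero
        w = weights / d
        x = (w[:, None] * points).sum(axis=0) / w.sum()
    return x

# Three unit-weight points: the minimizer of the sum of distances
# to the vertices of this triangle is its Fermat point.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])
x_star = weiszfeld(pts, np.ones(3))
```

The computed point beats the centroid and every vertex on the objective ∑_i w_i ‖v_i − x‖, as the cone-LP formulation predicts for the optimum.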
But

    z_i ≥ ‖v_i − x‖  ⟺  (z_i, v_i − x) ⪰_Q 0
                     ⟺  (z_i, x) ⪰_Q (0, v_i)
                     ⟺  z_i e + x̂ ⪰_Q (0, v_i),

where e = (1, 0, ..., 0)^T and x̂ = (0, x). Now the cone-LP formulation is:

    Primal:  min  w_1 z_1 + ... + w_n z_n
             s.t. z_i e + x̂ ⪰_Q (0, v_i),  i = 1, ..., n.

If we define a dual variable y_i = (y_{i0}, ȳ_i) corresponding to the i-th second order cone inequality in the primal, then the dual can be formulated as:

    Dual:  max  ∑_{i=1}^n v_i^T ȳ_i
           s.t. y_{i0} = w_i,  i = 1, ..., n      (from the z_i)
                ȳ_1 + ... + ȳ_n = 0              (from x)
                y_i ⪰_Q 0,     i = 1, ..., n      (since they arise from Q).

After simplification (for instance, eliminating the y_{i0}) we get:

    max  ∑_{i=1}^n v_i^T ȳ_i
    s.t. ∑_{i=1}^n ȳ_i = 0
         ‖ȳ_i‖ ≤ w_i,  i = 1, ..., n.

The dual of the Fermat-Weber problem has an interesting interpretation in dynamics. Let us assume that the w_i are weights of objects hanging from threads that go through a set of holes in a table. We are to take the other ends of the threads and tie them together at a position of equilibrium, spending a minimal amount of energy. Then the ȳ_i are interpreted as forces, and they must add up to zero so that we have equilibrium. The condition ‖ȳ_i‖ ≤ w_i simply states that the magnitude of the force exerted at the knot by the i-th object cannot be larger than its weight. Assuming that the optimal location is x*, we can write the value of the objective function as ∑_i (v_i − x*)^T ȳ_i because (x*)^T ∑_i ȳ_i = 0. Then the objective seeks the location with minimum potential energy. (Question: can you give an interpretation of the primal, and explain why the primal and dual objectives are equal at the optimum?)

[Figure: five weights w_1, ..., w_5 hanging by threads through holes in a table, tied together at a single knot.]

3 Duality in different spaces

In many situations an m-dimensional cone K_1 can be expressed as the intersection of another n-dimensional cone and a linear space: K_1 = K ∩ L, where n > m. Then, remembering that a linear space is also a cone and that its dual as a cone is simply its orthogonal complement L^⊥ (why?), we get K_1* = K* + L^⊥. Here K_1* is the dual of K_1 in the space R^n. But if we take the dual within the space L, then the dual cone will be m-dimensional and different from K_1*; let us call the dual of K_1 within the space L, K_1^+. If it is at all possible to find a good characterization of K_1^+, we should use it instead of K_1*. Let us look at an example and see what the problems would be if we don't.

In linear programming our cone is the non-negative orthant R^n_+ and cone-LP is simply the ordinary LP:

    Primal:  min  c^T x                          Dual:  max  b^T y
             s.t. a_i^T x = b_i, i = 1, ..., m          s.t. ∑_i y_i a_i + s = c
                  x_j ≥ 0,       j = 1, ..., n               s_j ≥ 0,  j = 1, ..., n.

Now suppose that we express the non-negative orthant as the intersection of the positive semidefinite cone and the linear space L which consists of only diagonal matrices, that is, X ∈ L iff X_ij = 0 for all i ≠ j. We define diagonal matrices C = Diag(c) and A_i = Diag(a_i), that is, matrices whose diagonal entries (j, j) are c_j (respectively (a_i)_j) and whose off-diagonal entries are all zero. Now the primal linear program can be written as a semidefinite program:

    Primal:  min{ C · X | A_i · X = b_i for i = 1, ..., m,  X_ij = 0 for i ≠ j,  X ⪰ 0 }.

Note that the condition X_ij = 0 is the same as (E_ij + E_ji) · X = 0, where E_ij is the matrix with all entries 0 except the (i, j) entry, which is one. Now taking the dual of this SDP we arrive at a problem that is not equivalent to the dual of the LP:

    Dual:  max{ b^T y | ∑_i y_i A_i + ∑_{i<j} s_ij (E_ij + E_ji) ⪯ C }.

Even if the original LP has unique primal and dual solutions, it is unlikely in general that the dual of the SDP formulation has unique solutions. The constraints in the dual imply that ∑_i y_i a_i ≤ c, but there are in general infinitely many s_ij that can be added to a given optimal y. The lesson is that it is not a good idea to formulate an LP as an SDP (which was obvious at the outset). But for the same reason it is not generally a good idea to express the dual of a cone-LP over K_1 = K ∩ L as K* + L^⊥.

As another example consider the second order cone Q. We know that x ∈ Q iff Arw(x) ⪰ 0. Thus again SOCP can be expressed as an SDP: write Q = P^{n×n} ∩ L, where L is the linear space of arrow-shaped matrices, i.e. X_ij = 0 if i ≠ j with i ≠ 0 and j ≠ 0, and X_ii = X_jj for all i, j. But again, formulating an SOCP as an SDP is not a good idea. If we form the dual as an SDP we will have extra and unnecessary variables that play no essential role and can make the solution numerically unstable, even if the original SOCP does not have numerical problems. In future lectures we will see even more compelling reasons why the SOCP problem should be treated in its own right rather than as a special case of SDP.
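The embedding of Q into the semidefinite cone is easy to see numerically. A small numpy sketch (the helper `arw` is our own, mirroring the arrow matrix used above): membership in the second order cone corresponds exactly to positive semidefiniteness of the arrow matrix.

```python
import numpy as np

def arw(x):
    """Arrow matrix of x = (x_0, x̄): [[x_0, x̄^T], [x̄, x_0 I]]."""
    A = x[0] * np.eye(len(x))
    A[0, 1:] = x[1:]
    A[1:, 0] = x[1:]
    return A

inside = np.array([2.0, 1.0, 1.0])    # 2 >= ||(1,1)||, so inside Q
outside = np.array([1.0, 1.0, 1.0])   # 1 <  ||(1,1)||, so outside Q

# x ∈ Q  iff  Arw(x) is positive semidefinite (all eigenvalues >= 0)
in_Q = lambda x: np.linalg.eigvalsh(arw(x)).min() >= -1e-12
```

The eigenvalues of Arw(x) are x_0 and x_0 ± ‖x̄‖, which makes the equivalence transparent.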
4 Generalization of Complementary Slackness Conditions

Consider the pair of cone-LP problems:

    Primal:  min  c^T x          Dual:  max  b^T y
             s.t. Ax = b                s.t. A^T y + s = c
                  x ⪰_K 0                    s ⪰_{K*} 0.

We studied before that at the optimum the following three relations hold:

    x ⪰_K 0,   s ⪰_{K*} 0,   and   x^T s = 0.

In the case of LP, SDP and SOCP these conditions actually imply stronger relations, which we now examine.
Example 1 (non-negative orthant) When K = K* = R^n_+, at the optimum x_i ≥ 0 for i = 1, ..., n, s_i ≥ 0 for i = 1, ..., n, and x^T s = 0 together imply x_i s_i = 0 for i = 1, ..., n, because if a sum of non-negative numbers x_i s_i is zero then each of them must be zero. This is the familiar complementary slackness theorem of linear programming.

Example 2 (the semidefinite cone) When K = K* = P^{n×n}, at the optimum X ⪰ 0, S ⪰ 0, and X · S = tr(XS) = 0. Since the matrix S is symmetric, S can be expressed as

    S = Q^T Ω Q = (Q^T Ω^{1/2} Q)(Q^T Ω^{1/2} Q) = S^{1/2} S^{1/2},

where Q is an orthogonal matrix and Ω is a diagonal matrix containing the eigenvalues of S on its diagonal. This shows that each positive semidefinite matrix has a unique positive semidefinite square root, denoted by S^{1/2}. Now,

    0 = tr(XS) = tr(X S^{1/2} S^{1/2}) = tr(S^{1/2} X S^{1/2}).

This implies that S^{1/2} X S^{1/2} = 0, because S^{1/2} X S^{1/2} is also a positive semidefinite matrix, with non-negative eigenvalues and trace zero. Since the trace is the sum of the eigenvalues, this is possible only when all eigenvalues are zero, which, in the case of symmetric matrices, implies that the matrix S^{1/2} X S^{1/2} is zero. Thus, setting A = X^{1/2} S^{1/2},

    0 = (S^{1/2} X^{1/2})(X^{1/2} S^{1/2}) = A^T A.

We know that A^T A = 0 iff A = 0, thus X^{1/2} S^{1/2} = 0, which implies XS = 0. We have shown:

Theorem 1 (Complementary slackness theorem for SDP) If X is optimal for the primal SDP, (y, S) is optimal for the dual SDP, and the duality gap X · S = 0, then XS = 0.

Example 3 (the second order cone) When K = K* = Q, we have x ⪰_Q 0, s ⪰_Q 0 and x^T s = 0, where x, s ∈ R^{n+1} are indexed from 0. This means that x_0 ≥ ‖x̄‖, s_0 ≥ ‖s̄‖, and x^T s = 0, or equivalently (assuming for now x_0 > 0 and s_0 > 0),

    x_0 s_0 ≥ ‖x̄‖ s_0 ≥ (x_1^2 + ... + x_n^2) s_0 / x_0,          (1)
    x_0 s_0 ≥ ‖s̄‖ x_0 ≥ (s_1^2 + ... + s_n^2) x_0 / s_0,          (2)
    x_0 s_0 = −(x_1 s_1 + ... + x_n s_n).                           (3)

Now, adding (1), (2) and twice (3), we get

    0 ≥ ∑_i ( x_i^2 s_0/x_0 + s_i^2 x_0/s_0 + 2 x_i s_i )
      = ∑_i ( x_i^2 s_0^2 + s_i^2 x_0^2 + 2 x_i s_i x_0 s_0 ) / (x_0 s_0)
      = ∑_i (x_i s_0 + s_i x_0)^2 / (x_0 s_0).

Again, a sum of non-negative numbers is less than or equal to zero; therefore all of them must be zero. We thus have

    x_i s_0 + x_0 s_i = 0,  i = 1, ..., n,   and   x^T s = 0.
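These pairwise conditions are easy to check numerically. A numpy sketch, on a complementary pair that we construct by hand (both vectors on the boundary of Q, with zero inner product):

```python
import numpy as np

def arw(x):
    """Arrow matrix [[x_0, x̄^T], [x̄, x_0 I]] of x = (x_0, x̄)."""
    A = x[0] * np.eye(len(x))
    A[0, 1:] = x[1:]
    A[1:, 0] = x[1:]
    return A

# A complementary pair on the boundary of Q:
# x_0 = ||x̄||, s_0 = ||s̄||, and x^T s = 0.
x = np.array([1.0, 1.0, 0.0])
s = np.array([1.0, -1.0, 0.0])

e = np.zeros(3); e[0] = 1.0
residual = arw(x) @ arw(s) @ e   # should vanish: Arw(x) Arw(s) e = 0
```

The vector Arw(x) Arw(s) e collects x^T s in its 0-th entry and x_0 s_i + s_0 x_i in the i-th, so its vanishing encodes all n + 1 conditions at once.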
We have shown:

Theorem 2 (Complementary slackness for SOCP) If x ⪰_Q 0, s ⪰_Q 0, and x^T s = 0, then x_0 s_i + x_i s_0 = 0 for i = 1, ..., n. These conditions (along with x^T s = 0) can be written more succinctly as

    Arw(x) Arw(s) e = 0.

We have implicitly assumed that x_0 ≠ 0 and s_0 ≠ 0. If x_0 = 0, then x_0 ≥ ‖x̄‖ implies that x = 0 and the theorem above is trivially true. The same holds when s_0 = 0.

5 A general complementary slackness theorem

For a proper cone K ⊆ R^n, define C(K):

    C(K) = { (x, s) | x ⪰_K 0, s ⪰_{K*} 0, x^T s = 0 } ⊆ R^{2n}.

Now, on the surface, the set C(K) seems to be a (2n − 1)-dimensional set: its members have 2n coordinates, and since x^T s = 0 we are left with 2n − 1 degrees of freedom. The condition x ∈ K by itself does not impose a restriction on the dimension of the set, nor does the condition s ∈ K*. Nevertheless, it turns out that C(K) is actually an n-dimensional set! Here is why:

Theorem 3 There is a one-to-one, onto and continuous mapping from C(K) to R^n.

Before we proceed to the proof we recall the following basic

Fact 1 Let S ⊆ R^n be a closed convex set and a ∈ R^n. Then there is a unique point x = Π_S(a) in S which is closest to a, i.e. there is a unique point x ∈ S such that x = argmin_{y ∈ S} ‖a − y‖.

The unique point above is called the projection of a onto S. The proof of this fact can be found in many texts and is based on Weierstrass's theorem. Now we give the proof of Theorem 3.

Proof: Let a ∈ R^n be an arbitrary point, let x = Π_K(a), and define s = x − a. We will first show that s ∈ K*, and then show that the correspondence between a and (x, s) is one-to-one, onto and continuous.

First we show that s ∈ K*. For every u ∈ K, define the convex combination u_α = αu + (1 − α)x, where 0 ≤ α ≤ 1, and define ζ(α) = ‖a − u_α‖^2. Then ζ(α) is a differentiable function on the interval [0, 1] and min_{0 ≤ α ≤ 1} ζ(α) is attained at α = 0.

Claim: dζ/dα |_{α=0} ≥ 0.
Proof of Claim: Otherwise there would be α in some neighborhood of 0 such that ‖a − u_α‖ < ‖a − u_0‖, contradicting the fact that x = u_0 is the closest point to a in K.

From this claim,

    dζ/dα |_{α=0} = −2(a − x)^T (u − x) ≥ 0
    ⟹ 2(x − a)^T (u − x) ≥ 0
    ⟹ 2 s^T (u − x) ≥ 0.                                    (4)

This latter inequality is true for any u ∈ K. If we choose u = 2x then we get s^T x ≥ 0. If we choose u = x/2 then we get s^T x ≤ 0. We conclude that x^T s = 0. If we plug this into (4) we get s^T u ≥ 0 for all u ∈ K, which means s ∈ K*. Thus, for each a we get a pair (x, s) ∈ C(K). Clearly each a results in a unique (x, s), as the projection x is unique and thus so is s = x − a. Also, both the projection operation and the map s = x − a are continuous.

Conversely, if (x, s) ∈ C(K), then we can set a = x − s. All we have to show now is that the projection of a onto K is x. Assume otherwise. Then there is a point u ∈ K such that ‖a − u‖ < ‖a − x‖, that is,

    (a − x)^T (a − x) > (a − u)^T (a − u)
    x^T x − 2(x − s)^T x > u^T u − 2(x − s)^T u,

and, noting that x^T s = 0,

    0 > u^T u + x^T x − 2 x^T u + 2 u^T s
    0 > ‖u − x‖^2 + 2 s^T u,

which implies that s^T u < 0, contradicting the fact that s ∈ K*. (This proof is due to Osman Güler.)

Example 4 (C(K) of the half line) Let us see what C(K) looks like in the case of the half-line, that is, when K = K* = R_+:

    C(K) = { (x, s) | x ≥ 0, s ≥ 0, xs = 0 } ⊆ R^2.

In other words, C(R_+) is the union of the non-negative parts of the x and s axes: it is the real line R bent at the origin by a 90° angle.

Now the implication of this theorem is that, since C(K) is n-dimensional, there must exist a set of n equations, independent in some sense, that define the manifold C(K). These n equations are precisely the complementary slackness conditions. In the case of the non-negative orthant, semidefinite and second order cones we were able to obtain these equations explicitly. When the cone K is given by a set of inequalities of the form g_i(x) ≤ 0 for i = 1, ..., n, where the g_i are homogeneous convex functions, the classical Karush-Kuhn-Tucker conditions give us a method of obtaining these equations.
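The construction in the proof (project a onto K, then set s = x − a) can be watched numerically for K = R^n_+, where projection is just componentwise clipping. A numpy sketch; the pair it produces is exactly a point of C(K):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(6)        # an arbitrary point of R^6

x = np.maximum(a, 0.0)            # projection of a onto K = R^6_+
s = x - a                         # then s = max(-a, 0), which lies in K* = R^6_+

# (x, s) is the point of C(K) associated with a by Theorem 3:
# x ⪰_K 0, s ⪰_{K*} 0, and x^T s = 0, with a = x − s recoverable.
```

For each coordinate, exactly one of x_i and s_i can be nonzero, which is the n-equation description x_i s_i = 0 of the n-dimensional manifold C(R^n_+).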
6 Eigenvalue Optimization

In this section we consider the eigenvalues λ_1(A) ≥ λ_2(A) ≥ ... ≥ λ_n(A) of a symmetric matrix A ∈ S^{n×n}. Let us find an SDP formulation of the largest eigenvalue, λ_1(A). This problem can be formulated by primal and dual SDPs as follows:

    Primal:  min  z               Dual:  max  A · Y
             s.t. zI − A ⪰ 0             s.t. tr(Y) = 1
                                              Y ⪰ 0.

The primal formulation simply says: find the smallest z such that z is larger than all eigenvalues of A. But z is larger than all eigenvalues of A iff zI − A is positive semidefinite. The dual characterization is obtained by simply taking the dual. Now define the feasible set of the dual to be S, that is:

Definition 1

    S = { Y ∈ S^{n×n} | tr(Y) = 1, Y ⪰ 0 }        (5)
    E = { qq^T | ‖q‖ = 1 }                         (6)

We can characterize the extreme points of S as follows:

Theorem 4 S is a convex set and the set of extreme points of S is E.

Proof: Convexity of S is obvious, since it is the intersection of the semidefinite cone and an affine set. Y ⪰ 0 implies that

    Y = ω_1 q_1 q_1^T + ... + ω_k q_k q_k^T,

where ∑_i ω_i = 1, ω_i ≥ 0, and ‖q_i‖ = 1. This shows that the extreme points of S are among the elements of E. Now we prove that all elements of E are extreme points. Otherwise, for some qq^T there are p and r with ‖p‖ = ‖r‖ = 1 and

    qq^T = α pp^T + (1 − α) rr^T = ( √α p  √(1−α) r ) ( √α p  √(1−α) r )^T.

If α ≠ 0 or 1, we have a contradiction to the fact that rank(qq^T) = 1. So the qq^T are extreme points.

Since the optimum of a linear function over a convex set is attained at an extreme point, it follows that the Y that maximizes A · Y in the dual characterization above is of the form Y = qq^T with ‖q‖ = 1. That is,

    λ_1(A) = max_{‖q‖=1} q^T A q.

This is a well-known result in linear algebra that we have proved using duality of SDP. In future lectures we will use this characterization to express optimization of eigenvalues over an affine class of matrices.
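The variational characterization λ_1(A) = max_{‖q‖=1} q^T A q is easy to probe numerically (a numpy sketch on a random symmetric matrix): every unit vector yields a lower bound, and the top eigenvector attains the maximum.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                         # a random symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)      # eigenvalues in ascending order
lam1 = eigvals[-1]                        # λ_1(A), the largest eigenvalue
q1 = eigvecs[:, -1]                       # corresponding unit eigenvector

# Random unit vectors never beat λ_1; the top eigenvector attains it.
samples = rng.standard_normal((1000, 5))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
rayleigh = np.einsum('ij,jk,ik->i', samples, A, samples)   # q^T A q per sample
```

Note that Y = q1 q1^T is exactly a dual-feasible matrix of the form described in Theorem 4: rank one, unit trace, positive semidefinite.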
More information"SYMMETRIC" PRIMAL-DUAL PAIR
"SYMMETRIC" PRIMAL-DUAL PAIR PRIMAL Minimize cx DUAL Maximize y T b st Ax b st A T y c T x y Here c 1 n, x n 1, b m 1, A m n, y m 1, WITH THE PRIMAL IN STANDARD FORM... Minimize cx Maximize y T b st Ax
More informationThe Q Method for Second-Order Cone Programming
The Q Method for Second-Order Cone Programming Yu Xia Farid Alizadeh July 5, 005 Key words. Second-order cone programming, infeasible interior point method, the Q method Abstract We develop the Q method
More informationConvex Optimization. (EE227A: UC Berkeley) Lecture 6. Suvrit Sra. (Conic optimization) 07 Feb, 2013
Convex Optimization (EE227A: UC Berkeley) Lecture 6 (Conic optimization) 07 Feb, 2013 Suvrit Sra Organizational Info Quiz coming up on 19th Feb. Project teams by 19th Feb Good if you can mix your research
More informationTutorial on Convex Optimization: Part II
Tutorial on Convex Optimization: Part II Dr. Khaled Ardah Communications Research Laboratory TU Ilmenau Dec. 18, 2018 Outline Convex Optimization Review Lagrangian Duality Applications Optimal Power Allocation
More information6-1 The Positivstellensatz P. Parrilo and S. Lall, ECC
6-1 The Positivstellensatz P. Parrilo and S. Lall, ECC 2003 2003.09.02.10 6. The Positivstellensatz Basic semialgebraic sets Semialgebraic sets Tarski-Seidenberg and quantifier elimination Feasibility
More informationConvex Optimization M2
Convex Optimization M2 Lecture 8 A. d Aspremont. Convex Optimization M2. 1/57 Applications A. d Aspremont. Convex Optimization M2. 2/57 Outline Geometrical problems Approximation problems Combinatorial
More information3. Duality: What is duality? Why does it matter? Sensitivity through duality.
1 Overview of lecture (10/5/10) 1. Review Simplex Method 2. Sensitivity Analysis: How does solution change as parameters change? How much is the optimal solution effected by changing A, b, or c? How much
More informationGeometric problems. Chapter Projection on a set. The distance of a point x 0 R n to a closed set C R n, in the norm, is defined as
Chapter 8 Geometric problems 8.1 Projection on a set The distance of a point x 0 R n to a closed set C R n, in the norm, is defined as dist(x 0,C) = inf{ x 0 x x C}. The infimum here is always achieved.
More informationDuality. Lagrange dual problem weak and strong duality optimality conditions perturbation and sensitivity analysis generalized inequalities
Duality Lagrange dual problem weak and strong duality optimality conditions perturbation and sensitivity analysis generalized inequalities Lagrangian Consider the optimization problem in standard form
More informationIII. Applications in convex optimization
III. Applications in convex optimization nonsymmetric interior-point methods partial separability and decomposition partial separability first order methods interior-point methods Conic linear optimization
More informationChapter 1: Linear Programming
Chapter 1: Linear Programming Math 368 c Copyright 2013 R Clark Robinson May 22, 2013 Chapter 1: Linear Programming 1 Max and Min For f : D R n R, f (D) = {f (x) : x D } is set of attainable values of
More informationLMI MODELLING 4. CONVEX LMI MODELLING. Didier HENRION. LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ. Universidad de Valladolid, SP March 2009
LMI MODELLING 4. CONVEX LMI MODELLING Didier HENRION LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ Universidad de Valladolid, SP March 2009 Minors A minor of a matrix F is the determinant of a submatrix
More informationShiqian Ma, MAT-258A: Numerical Optimization 1. Chapter 4. Subgradient
Shiqian Ma, MAT-258A: Numerical Optimization 1 Chapter 4 Subgradient Shiqian Ma, MAT-258A: Numerical Optimization 2 4.1. Subgradients definition subgradient calculus duality and optimality conditions Shiqian
More informationCSCI 1951-G Optimization Methods in Finance Part 10: Conic Optimization
CSCI 1951-G Optimization Methods in Finance Part 10: Conic Optimization April 6, 2018 1 / 34 This material is covered in the textbook, Chapters 9 and 10. Some of the materials are taken from it. Some of
More informationSpring 2017 CO 250 Course Notes TABLE OF CONTENTS. richardwu.ca. CO 250 Course Notes. Introduction to Optimization
Spring 2017 CO 250 Course Notes TABLE OF CONTENTS richardwu.ca CO 250 Course Notes Introduction to Optimization Kanstantsin Pashkovich Spring 2017 University of Waterloo Last Revision: March 4, 2018 Table
More informationLecture: Cone programming. Approximating the Lorentz cone.
Strong relaxations for discrete optimization problems 10/05/16 Lecture: Cone programming. Approximating the Lorentz cone. Lecturer: Yuri Faenza Scribes: Igor Malinović 1 Introduction Cone programming is
More informationORF 523 Lecture 9 Spring 2016, Princeton University Instructor: A.A. Ahmadi Scribe: G. Hall Thursday, March 10, 2016
ORF 523 Lecture 9 Spring 2016, Princeton University Instructor: A.A. Ahmadi Scribe: G. Hall Thursday, March 10, 2016 When in doubt on the accuracy of these notes, please cross check with the instructor
More informationLP Duality: outline. Duality theory for Linear Programming. alternatives. optimization I Idea: polyhedra
LP Duality: outline I Motivation and definition of a dual LP I Weak duality I Separating hyperplane theorem and theorems of the alternatives I Strong duality and complementary slackness I Using duality
More information12. Interior-point methods
12. Interior-point methods Convex Optimization Boyd & Vandenberghe inequality constrained minimization logarithmic barrier function and central path barrier method feasibility and phase I methods complexity
More informationHomework 4. Convex Optimization /36-725
Homework 4 Convex Optimization 10-725/36-725 Due Friday November 4 at 5:30pm submitted to Christoph Dann in Gates 8013 (Remember to a submit separate writeup for each problem, with your name at the top)
More informationA Brief Review on Convex Optimization
A Brief Review on Convex Optimization 1 Convex set S R n is convex if x,y S, λ,µ 0, λ+µ = 1 λx+µy S geometrically: x,y S line segment through x,y S examples (one convex, two nonconvex sets): A Brief Review
More informationLecture 7. Econ August 18
Lecture 7 Econ 2001 2015 August 18 Lecture 7 Outline First, the theorem of the maximum, an amazing result about continuity in optimization problems. Then, we start linear algebra, mostly looking at familiar
More informationLecture 7 Duality II
L. Vandenberghe EE236A (Fall 2013-14) Lecture 7 Duality II sensitivity analysis two-person zero-sum games circuit interpretation 7 1 Sensitivity analysis purpose: extract from the solution of an LP information
More informationConvex Optimization Lecture 6: KKT Conditions, and applications
Convex Optimization Lecture 6: KKT Conditions, and applications Dr. Michel Baes, IFOR / ETH Zürich Quick recall of last week s lecture Various aspects of convexity: The set of minimizers is convex. Convex
More informationAgenda. Interior Point Methods. 1 Barrier functions. 2 Analytic center. 3 Central path. 4 Barrier method. 5 Primal-dual path following algorithms
Agenda Interior Point Methods 1 Barrier functions 2 Analytic center 3 Central path 4 Barrier method 5 Primal-dual path following algorithms 6 Nesterov Todd scaling 7 Complexity analysis Interior point
More information