
MS&E 314/CME 336: Conic Linear Programming
Assignment 2, January 3, 2015
Prof. Yinyu Ye (6 pages)

ASSIGNMENT 2 SOLUTIONS

Problem 1 (Exercise 2.2, Monograph)

We prove part (ii) of Theorem 2.1 (Farkas' Lemma for CLP).

(a) It is easy to show that the set $\mathcal{C} := \{S - A^T y : S \in K,\ y \in \mathbb{R}^m\}$ is convex. To see why, take two arbitrary points $S' - A^T y',\ S'' - A^T y'' \in \mathcal{C}$ and a constant $\alpha \in (0, 1)$; then
$$
\alpha (S' - A^T y') + (1 - \alpha)(S'' - A^T y'')
= \underbrace{\{\alpha S' + (1 - \alpha) S''\}}_{\in K} - A^T \big(\alpha y' + (1 - \alpha) y''\big) \in \mathcal{C}.
$$

Now we prove that $\mathcal{C}$ is closed. Take a convergent sequence $z_k := S_k - A^T y_k \to \bar z$; we show that $\bar z \in \mathcal{C}$. Note that convergence implies $\{z_k\}$ is bounded. So for the given $\hat X \in \mathring{(K^*)}$ (which satisfies $A \hat X = 0$), there is a constant $c > 0$ such that
$$
c \ \ge\ z_k \bullet \hat X
= (S_k - A^T y_k) \bullet \hat X
= S_k \bullet \hat X - (y_k)^T \underbrace{(A \hat X)}_{=0}
= S_k \bullet \hat X.
$$
Since $S_k \in K$ and $\hat X \in \mathring{(K^*)}$, Proposition 1.3 implies that $\{S_k\}$ is a bounded sequence; hence $\{A^T y_k\} = \{S_k - z_k\}$ is bounded, and thus $\{y_k\}$ is bounded (assuming, as usual, that $A$ has full row rank). By the Bolzano-Weierstrass theorem [1], there is a convergent subsequence $S_{k_n} \to \bar S \in K$ ($K$ is closed), and from it we can extract a further convergent subsequence $y_{k_{n_m}} \to \bar y$ (note $\{k_{n_m}\} \subseteq \{k_n\} \subseteq \mathbb{N}$). Hence
$$
z_{k_{n_m}} = S_{k_{n_m}} - A^T y_{k_{n_m}} \ \to\ \bar S - A^T \bar y.
$$
But $z_k \to \bar z$ gives
$$
\bar z = \bar S - A^T \bar y \in \mathcal{C},
$$
which proves the closedness.

(b) ($\Rightarrow$) Suppose $F_d$ is feasible, say we have a vector $\bar y$ satisfying $C - A^T \bar y \in K$.

[1] http://en.wikipedia.org/wiki/bolzano-weierstrass_theorem
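The alternative pair asserted by part (ii) can be sanity-checked numerically in the LP special case $K = K^* = \mathbb{R}^n_+$, where membership in $K$ is componentwise nonnegativity. The instance below ($A$, the two cost vectors, and the grid bounds) is made up for illustration, and the grid search is only a heuristic feasibility check over a bounded range, not a proof:

```python
# LP special case of the CLP Farkas lemma: exactly one of
#   F_d = {y : c - A^T y >= 0}   and   {x >= 0 : A x = 0, c^T x < 0}
# should be feasible.  A and the cost vectors are illustrative data only.

def Fd_feasible_on_grid(A, c, lo=-10.0, hi=10.0, steps=2001):
    """Heuristic: search a grid of scalars y for a point with c - A^T y >= 0."""
    n = len(A[0])
    assert len(A) == 1  # keep the demo one-dimensional in y
    for k in range(steps):
        y = lo + (hi - lo) * k / (steps - 1)
        if all(c[j] - A[0][j] * y >= 0 for j in range(n)):
            return True
    return False

def alternative_feasible(A, c):
    """With A = [[1, -1]], A x = 0 and x >= 0 force x = (t, t), t >= 0,
    so the alternative system has a solution iff c[0] + c[1] < 0."""
    assert A == [[1, -1]]
    return c[0] + c[1] < 0

A = [[1, -1]]
# F_d feasible (e.g. y = 0 works for c = (1, 2)), so the alternative is empty:
assert Fd_feasible_on_grid(A, [1, 2]) and not alternative_feasible(A, [1, 2])
# F_d empty for c = (-1, -1) (y <= -1 and y >= 1 cannot both hold),
# so the alternative system is feasible:
assert not Fd_feasible_on_grid(A, [-1, -1]) and alternative_feasible(A, [-1, -1])
```

In both cases exactly one of the two systems is feasible, as the lemma predicts.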

Suppose further that for some $X \in K^*$ we have $A X = 0$. Then by the definition of the dual cone $K^*$,
$$
\underbrace{(C - A^T \bar y)}_{\in K} \bullet \underbrace{X}_{\in K^*}
= C \bullet X - \bar y^T \underbrace{(A X)}_{=0}
= C \bullet X \ \ge\ 0,
$$
thus $\{X : A X = 0,\ X \in K^*,\ C \bullet X < 0\} = \emptyset$.

($\Leftarrow$) Suppose $F_d = \{y : C - A^T y \in K\} = \emptyset$. That is, $C \notin \{S - A^T y : S \in K,\ y \in \mathbb{R}^m\}$, which was proven above to be a closed convex set. So by the separating hyperplane theorem, there exists $\bar X$ such that
$$
C \bullet \bar X \ <\ \inf_{y \in \mathbb{R}^m,\, S \in K} (S - A^T y) \bullet \bar X
= \inf_{y \in \mathbb{R}^m,\, S \in K} S \bullet \bar X - (A^T y) \bullet \bar X
= \inf_{y \in \mathbb{R}^m,\, S \in K} S \bullet \bar X - y^T (A \bar X). \tag{1}
$$
We claim that $\bar X \in K^*$. Suppose not; then there exists $\bar S \in K$ such that $\bar S \bullet \bar X < 0$, which gives $(\alpha \bar S) \bullet \bar X \to -\infty$ as $\alpha \to \infty$. However, this contradicts the boundedness from below shown in (1). So we have $\bar X \in K^*$. Similarly, we can show from (1) that $A \bar X = 0$: otherwise a suitable choice of $y$ would drive $-y^T (A \bar X)$ to $-\infty$. Also note that if we set $y = 0$ and $S = 0$, we obtain $C \bullet \bar X < 0$, and this completes the proof.

Problem 2 (Exercise 2.6, Monograph)

If we define
$$
\bar Q := \begin{bmatrix} Q & b \\ b^T & 0 \end{bmatrix}, \qquad
I_{1:n} := \begin{bmatrix} I & 0 \\ 0^T & 0 \end{bmatrix}, \qquad
I_{n+1} := \begin{bmatrix} 0 & 0 \\ 0^T & 1 \end{bmatrix},
$$
then we can reformulate the given problem as
$$
\begin{aligned}
\text{minimize} \quad & \bar Q \bullet X \\
\text{s.t.} \quad & I_{1:n} \bullet X = 1, \\
& I_{n+1} \bullet X = 1, \\
& X \succeq 0,
\end{aligned}
$$

which has the dual
$$
\begin{aligned}
\text{maximize} \quad & y_1 + y_2 \\
\text{s.t.} \quad & y_1 I_{1:n} + y_2 I_{n+1} + S = \bar Q, \\
& S \succeq 0.
\end{aligned}
$$
Then from the optimality conditions of SDP, we must have
$$
X S = 0, \qquad I_{1:n} \bullet X = 1,\ \ I_{n+1} \bullet X = 1, \qquad
y_1 I_{1:n} + y_2 I_{n+1} + S = \bar Q, \qquad X \succeq 0,\ S \succeq 0,
$$
which gives the desired condition.

Problem 3 (Exercise 2.7, Monograph)

The objective function can be rewritten as
$$
x^T A y = \sum_{i,j} x_i a_{ij} y_j = \sum_{i,j} (x_i y_j) a_{ij} = (x y^T) \bullet A,
$$
and the two constraints are equivalent to
$$
\|x\|^2 = \sum_i x_i^2 = 1 \iff (x x^T) \bullet I = 1, \qquad
\|y\|^2 = 1 \iff (y y^T) \bullet I = 1.
$$
So if we define
$$
Z := \begin{bmatrix} x x^T & x y^T \\ y x^T & y y^T \end{bmatrix},
$$
the equivalent SDP is
$$
\begin{aligned}
\text{minimize} \quad & \begin{bmatrix} 0 & \tfrac{1}{2} A \\ \tfrac{1}{2} A^T & 0 \end{bmatrix} \bullet Z \\
\text{s.t.} \quad & \begin{bmatrix} I_n & 0 \\ 0 & 0 \end{bmatrix} \bullet Z = 1, \qquad
\begin{bmatrix} 0 & 0 \\ 0 & I_m \end{bmatrix} \bullet Z = 1, \qquad Z \succeq 0.
\end{aligned}
$$
Since there are only two equality constraints, we can relax the rank constraint $\operatorname{rank}(Z) = 1$, because the above SDP always has a rank-1 solution by the rank reduction Theorem 2.5.
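The lifting used in Problem 3 can be verified numerically on a small instance: with the stacked vector $z = (x; y)$ and the rank-one lift $Z = z z^T$, the Frobenius product of the objective matrix with $Z$ recovers $x^T A y$, and the two block traces recover $\|x\|^2$ and $\|y\|^2$. The vectors and matrix below are arbitrary illustrative data:

```python
# Check: with z = (x; y) and Z = z z^T,
#   [[0, A/2], [A^T/2, 0]] . Z = x^T A y,  tr(Z_xx) = ||x||^2,  tr(Z_yy) = ||y||^2.
x = [0.6, 0.8]            # ||x|| = 1
y = [1.0, 0.0, 0.0]       # ||y|| = 1
A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]     # arbitrary 2 x 3 data

n, m = len(x), len(y)
z = x + y                                   # stacked vector (x; y)
Z = [[zi * zj for zj in z] for zi in z]     # rank-one lift Z = z z^T

# objective matrix M = [[0, A/2], [A^T/2, 0]], assembled as a full matrix
M = [[0.0] * (n + m) for _ in range(n + m)]
for i in range(n):
    for j in range(m):
        M[i][n + j] = A[i][j] / 2
        M[n + j][i] = A[i][j] / 2

frob = sum(M[i][j] * Z[i][j] for i in range(n + m) for j in range(n + m))
bilinear = sum(x[i] * A[i][j] * y[j] for i in range(n) for j in range(m))
trace_x = sum(Z[i][i] for i in range(n))             # = ||x||^2
trace_y = sum(Z[n + j][n + j] for j in range(m))     # = ||y||^2

assert abs(frob - bilinear) < 1e-12
assert abs(trace_x - 1.0) < 1e-12 and abs(trace_y - 1.0) < 1e-12
```

This confirms that a rank-one feasible $Z$ of the SDP carries exactly the objective value and constraints of the original bilinear problem.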

Problem 4

(a) We can follow the proof of Theorem 2.5 in the Monograph. Let $X = V V^T$, $V \in \mathbb{R}^{n \times r}$. Then the projected problem is
$$
\begin{aligned}
\text{minimize} \quad & (V^T C V) \bullet U \\
\text{s.t.} \quad & (V^T A_i V) \bullet U = b_i, \quad i = 1, \dots, m, \\
& (V^T Q_j V) \bullet U = 0, \quad j = 1, \dots, q, \\
& U \succeq 0.
\end{aligned}
$$
Since $Q_j$ is positive semidefinite, $Q_j \bullet X = 0$ and $X = V V^T$ imply $V^T Q_j V = 0$, which means $(V^T Q_j V) \bullet U = 0$ is always satisfied. The remaining proof exactly follows the proof of Theorem 2.5.

(b) The equivalent SDP is
$$
\begin{aligned}
\text{minimize} \quad & \begin{bmatrix} Q & c \\ c^T & 0 \end{bmatrix} \bullet Z \\
\text{s.t.} \quad & \begin{bmatrix} A^T A & 0 \\ 0^T & 0 \end{bmatrix} \bullet Z = 0, \\
& \begin{bmatrix} I_n & 0 \\ 0^T & 0 \end{bmatrix} \bullet Z = 1, \\
& \begin{bmatrix} 0 & 0 \\ 0^T & 1 \end{bmatrix} \bullet Z = 1, \\
& Z \succeq 0.
\end{aligned}
$$
Since there are only two non-homogeneous equality constraints (the other is homogeneous, with a positive semidefinite coefficient matrix), we can relax the rank constraint $\operatorname{rank}(Z) = 1$, because the above SDP always has a rank-1 solution by the rank reduction we proved in part (a) (it is easy to see that there exists an optimal solution).

Problem 5 (Exercise 2.4, Monograph)

(a) Suppose we have a realization $\{x_i\}_{i=1,\dots,n}$ for the given problem, and let $\{x_j + a\}_{1 \le j \le n}$ be any translation of this realization with minimum norm, that is, a translation such that
$$
\sum_{j=1}^n \|x_j + a\|^2
$$

is minimized. Then the gradient of this sum of squared norms with respect to $a$ must be zero:
$$
2 \sum_{j=1}^n (x_j + a) = 0.
$$
Therefore we get
$$
a = -\frac{1}{n} \sum_{j=1}^n x_j = -\bar x,
$$
and the solution is given by $x_j - \bar x$, $j = 1, \dots, n$. It is still subject to rotation and reflection, since rotation and reflection preserve the 2-norm, i.e., $\|Q x\| = \|x\|$ for any orthogonal matrix $Q$; note that any rotation or reflection is an orthogonal transformation.

(b) Let $X = [x_1 \ \cdots \ x_n]$ be the $d \times n$ matrix that needs to be determined. Then
$$
\|x_i - x_j\|^2 = \|X e_i - X e_j\|^2 = \|X e_{ij}\|^2 = e_{ij}^T X^T X e_{ij},
$$
where we use the notation $e_{ij} := e_i - e_j$. Also, the objective function can be written as
$$
\sum_{j=1}^n \|x_j\|^2 = \operatorname{tr}(X^T X).
$$
So we can write this problem as an SDP relaxation by substituting $Y = X^T X$:
$$
(\mathrm{SDP}) \quad
\begin{aligned}
\text{minimize} \quad & \operatorname{tr}(Y) \\
\text{s.t.} \quad & e_{ij}^T Y e_{ij} = d_{ij}^2, \quad (i, j) \in N_x, \\
& Y \succeq 0,
\end{aligned}
$$
or equivalently,
$$
(\mathrm{SDP}) \quad
\begin{aligned}
\text{minimize} \quad & I \bullet Y \\
\text{s.t.} \quad & (e_{ij} e_{ij}^T) \bullet Y = d_{ij}^2, \quad (i, j) \in N_x, \\
& Y \succeq 0.
\end{aligned}
$$
And the dual of the SDP relaxation is given by
$$
(\mathrm{SDD}) \quad
\begin{aligned}
\text{maximize} \quad & \sum_{(i,j) \in N_x} w_{ij} d_{ij}^2 \\
\text{s.t.} \quad & I - \sum_{(i,j) \in N_x} w_{ij} e_{ij} e_{ij}^T \succeq 0.
\end{aligned}
$$
Note that the dual is always feasible and has an interior, since $w_{ij} = 0$ for all $(i, j) \in N_x$ is an interior feasible solution.
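Both parts of Problem 5 can be checked numerically on a toy point set (made up for illustration): the centroid shift $a = -\bar x$ minimizes the total squared norm among randomly perturbed translations, and the Gram matrix $Y = X^T X$ of the centered points reproduces the squared distances via $e_{ij}^T Y e_{ij} = Y_{ii} - 2 Y_{ij} + Y_{jj}$:

```python
# Part (a): a = -centroid minimizes sum ||x_j + a||^2 (the sum is strictly
# convex in a, so random perturbations can only do worse).
# Part (b): e_ij^T (X^T X) e_ij = ||x_i - x_j||^2.
import random

pts = [[1.0, 2.0], [3.0, -1.0], [-2.0, 0.5], [0.0, 4.0]]  # illustrative data
n, d = len(pts), len(pts[0])

centroid = [sum(p[k] for p in pts) / n for k in range(d)]
a_star = [-c for c in centroid]

def total_sq_norm(a):
    return sum(sum((p[k] + a[k]) ** 2 for k in range(d)) for p in pts)

best = total_sq_norm(a_star)
random.seed(0)
for _ in range(1000):
    a = [a_star[k] + random.uniform(-1, 1) for k in range(d)]
    assert total_sq_norm(a) >= best - 1e-12   # a* is never beaten

# Gram matrix Y = X^T X of the centered points
centered = [[p[k] - centroid[k] for k in range(d)] for p in pts]
Y = [[sum(centered[i][k] * centered[j][k] for k in range(d)) for j in range(n)]
     for i in range(n)]
for i in range(n):
    for j in range(n):
        dist_sq = sum((centered[i][k] - centered[j][k]) ** 2 for k in range(d))
        assert abs((Y[i][i] - 2 * Y[i][j] + Y[j][j]) - dist_sq) < 1e-9
```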

Problem 6

We omit the MATLAB code.

(a) The solution ranks are: (i) 3, (ii) 3, (iii) 3.

(b) The solution ranks are: (i) 2, (ii) 1, (iii) 1.
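Since the MATLAB code is omitted, here is a language-agnostic sketch of how such solution ranks can be checked: compute the numerical rank of the returned matrix by Gaussian elimination with a pivot tolerance. The helper name, the tolerance, and the test matrices below are illustrative, not the actual problem instances:

```python
# Numerical rank via row reduction with partial pivoting; pivots below `tol`
# are treated as zero.  This mimics what rank() does in MATLAB, up to the
# choice of tolerance.

def numerical_rank(M, tol=1e-9):
    """Count pivots above tol in a row-echelon reduction of a copy of M."""
    A = [row[:] for row in M]
    rows, cols = len(A), len(A[0])
    rank, r = 0, 0
    for c in range(cols):
        piv = max(range(r, rows), key=lambda i: abs(A[i][c]), default=None)
        if piv is None or abs(A[piv][c]) < tol:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            for j in range(c, cols):
                A[i][j] -= f * A[r][j]
        rank += 1
        r += 1
        if r == rows:
            break
    return rank

# Sanity check on a made-up rank-2 PSD matrix G = v v^T + w w^T:
v, w = [1.0, 2.0, 3.0], [0.0, 1.0, -1.0]
G = [[v[i] * v[j] + w[i] * w[j] for j in range(3)] for i in range(3)]
assert numerical_rank(G) == 2
```

Applied to the SDP solutions of parts (a) and (b), such a routine would report the ranks listed above.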