Memorandum No An easy way to obtain strong duality results in linear, linear semidefinite and linear semi-infinite programming


Faculty of Mathematical Sciences
University of Twente
University for Technical and Social Sciences
P.O. Box AE Enschede, The Netherlands

Memorandum No.

An easy way to obtain strong duality results in linear, linear semidefinite and linear semi-infinite programming

P. Pop and G.J. Still

July 1999

ISSN

AN EASY WAY TO OBTAIN STRONG DUALITY RESULTS IN LINEAR, LINEAR SEMIDEFINITE AND LINEAR SEMI-INFINITE PROGRAMMING

PETRICA POP AND GEORG STILL

ABSTRACT. In linear programming it is known that an appropriate non-homogeneous Farkas lemma leads to a short proof of the strong duality results for a pair of primal and dual programs. By using a corresponding generalized Farkas lemma we give a similar proof of the strong duality results for semidefinite programs under constraint qualifications. The proof includes optimality conditions. The same approach leads to corresponding results for linear semi-infinite programs. For completeness, the proofs for linear programs and the proofs of all auxiliary lemmata for the semidefinite case are included.

KEYWORDS: linear programming, semidefinite programming, semi-infinite programming, duality

MATHEMATICS SUBJECT CLASSIFICATION: 90C05, 90C25, 90C34

1. STRONG DUALITY RESULTS IN LINEAR PROGRAMMING

Consider the pair of primal and dual linear programs

$$P: \ \max_{x \in \mathbb{R}^n} c^T x \ \text{ s.t. } Ax \le b, \qquad D: \ \min_{y \in \mathbb{R}^m} b^T y \ \text{ s.t. } A^T y = c, \ y \ge 0,$$

where $A$ is an $(m \times n)$-matrix $(m \ge n)$ and $c \in \mathbb{R}^n$, $b \in \mathbb{R}^m$. Let $v_P$ denote the maximum value of the primal program $P$ and $v_D$ the minimum value of the dual program $D$. The feasible sets of $P$ and $D$ are abbreviated by $F_P$ and $F_D$.

Most commonly a homogeneous Farkas lemma is used to prove optimality conditions for $P$ and $D$. We will use the following non-homogeneous version to prove strong duality and optimality conditions in one step.

Lemma 1. Let an $(m \times n)$-matrix $B$, a $(k \times n)$-matrix $C$ and $b \in \mathbb{R}^m$, $c \in \mathbb{R}^k$ be given. Then precisely one of the following alternatives is valid.
(a) There is a solution $x \in \mathbb{R}^n$ of $Bx \le b$, $Cx = c$.
(b) There exist vectors $\mu \in \mathbb{R}^m$, $\mu \ge 0$, and $\lambda \in \mathbb{R}^k$ such that

$$\begin{pmatrix} B^T \\ b^T \end{pmatrix} \mu + \begin{pmatrix} C^T \\ c^T \end{pmatrix} \lambda = \begin{pmatrix} 0 \\ -1 \end{pmatrix}.$$

This result is an easy corollary of a common version of the Farkas lemma (see Section 4 for a proof). We begin with the weak duality result.

Lemma 2 (Weak Duality). Let $x \in F_P$, $y \in F_D$ be given. Then

(1) $\quad b^T y - c^T x = y^T (b - Ax) \ge 0.$

If in (1) we have $b^T y - c^T x = 0$, then $x, y$ are solutions of $P$, $D$ with $v_P = v_D$.

Proof. For feasible $x, y$ we find $b^T y - c^T x = b^T y - y^T A x = y^T(b - Ax) \ge 0$, or equivalently $b^T y \ge c^T x$. The equality sign implies that $y$ is minimal for $D$ and $x$ maximal for $P$ with the same value $b^T y = v_D = c^T x = v_P$.

We now prove the strong duality result, the existence of solutions and optimality conditions.

Theorem 1 (Strong Duality). The following holds.
(a) Suppose $F_P \ne \emptyset$. Then $F_D = \emptyset$ if and only if $v_P = \infty$. Suppose $F_D \ne \emptyset$. Then $F_P = \emptyset$ if and only if $v_D = -\infty$.
(b) Suppose $F_P \ne \emptyset$, $F_D \ne \emptyset$. Then $P$ and $D$ have solutions $x$ and $y$ satisfying $c^T x = b^T y$, i.e. $v_P = v_D$. Moreover, the following optimality conditions hold:

$x \in F_P$ solves $P$ $\iff$ there exists $y \in F_D$ such that $y^T(b - Ax) = 0$;
$y \in F_D$ solves $D$ $\iff$ there exists $x \in F_P$ such that $y^T(b - Ax) = 0$.

Proof. (a) We prove the first statement. Suppose $F_D = \emptyset$. Then there is no solution of $A^T y = c$, $y \ge 0$. By Lemma 1 there exist vectors $\mu \ge 0$, $\lambda$ such that

$$\begin{pmatrix} -I \\ 0 \end{pmatrix} \mu + \begin{pmatrix} A \\ c^T \end{pmatrix} \lambda = \begin{pmatrix} 0 \\ -1 \end{pmatrix}, \quad \text{or} \quad A\lambda = \mu \ge 0 \ \text{ and } \ c^T \lambda = -1.$$

Then, with $x \in F_P$, the vector $x(t) = x - t\lambda$ is feasible for all $t > 0$, with $c^T x(t) = c^T x + t \to \infty$ for $t \to \infty$. This implies $v_P = \infty$. On the other hand, if $F_D \ne \emptyset$, then with a vector $y \in F_D$ it follows by Lemma 2 that $v_P \le b^T y < \infty$. The proof of the other case is similar.

(b) By Lemma 2 we have shown that $x, y$ are solutions of $P$, $D$ with $b^T y - c^T x = 0$ if we show that $x, y$ satisfy the relations

(2) $\quad Ax \le b, \qquad -Iy \le 0, \qquad A^T y = c, \qquad -c^T x + b^T y \le 0.$

Suppose that this system does not have a solution $x, y$. Then by Lemma 1, applied to the system (2) in the variables $(x, y)$, there exist $\alpha \ge 0$ and vectors $\mu_x \ge 0$, $\mu_y \ge 0$, $\lambda$ such that

(3) $\quad A^T \mu_x = \alpha c, \qquad A(-\lambda) = \alpha b - \mu_y, \qquad b^T \mu_x - c^T(-\lambda) = -1.$

We distinguish between two cases.

Case $\alpha = 0$: Then, in view of (3), with $x \in F_P$, $y \in F_D$ the vectors $x(t) = x - t\lambda$, $y(t) = y + t\mu_x$ are feasible with $b^T y(t) - c^T x(t) = b^T y - c^T x - t \to -\infty$ for $t \to \infty$, in contradiction to Lemma 2.

Case $\alpha > 0$: Then by dividing the relations (3) by $\alpha$ we obtain a solution of the system (2), a contradiction.

This shows the first part of (b). The optimality conditions are obtained as follows. Suppose $x$ is a solution of $P$. As shown, there exists a solution $y$ of $D$ with $0 = b^T y - c^T x = y^T(b - Ax)$. On the other hand, if for $x \in F_P$ the vector $y \in F_D$ satisfies $y^T(b - Ax) = 0$, then by Lemma 2, $x$ is a solution of $P$. The optimality conditions for $y \in F_D$ are obtained similarly.
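Theorem 1 is easy to check numerically on a small instance. The sketch below is illustrative only and not part of the memorandum: it uses SciPy's linprog on made-up data $A$, $b$, $c$, solves $P$ and $D$, and confirms $v_P = v_D$ together with the optimality condition $y^T(b - Ax) = 0$.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up instance with A a (3 x 2)-matrix, so m >= n as in the text.
A = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, 0.0]])
b = np.array([4.0, 6.0, 2.0])
c = np.array([1.0, 1.0])

# Primal P: max c^T x s.t. Ax <= b, x free (linprog minimizes, so negate c).
primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)

# Dual D: min b^T y s.t. A^T y = c, y >= 0.
dual = linprog(b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 3)

x, y = primal.x, dual.x
v_P, v_D = float(c @ x), float(b @ y)

print(v_P, v_D)                    # equal, by Theorem 1(b)
print(float(y @ (b - A @ x)))      # zero up to solver tolerance
```

The zero inner product $y^T(b - Ax)$ is exactly the complementarity statement of Theorem 1(b): each dual variable vanishes on a slack primal constraint.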

2. STRONG DUALITY RESULTS IN SEMIDEFINITE PROGRAMMING

In this section we give a similar proof of the strong duality result and optimality conditions in semidefinite programming. Consider the pair of primal and dual linear semidefinite programs

$$P: \ \max_{x \in \mathbb{R}^n} c \cdot x \ \text{ s.t. } \ \mathcal{A}(x) = B - \sum_{i=1}^n x_i A_i \succeq 0, \qquad D: \ \min_{Y} B \cdot Y \ \text{ s.t. } \ A_i \cdot Y = c_i, \ i = 1, \ldots, n, \ Y \succeq 0,$$

where $B, A_i$ are symmetric $(m \times m)$-matrices and $c \in \mathbb{R}^n$. We write $Y \succeq 0$ for a positive semidefinite and $Y \succ 0$ for a positive definite matrix $Y$. By $B \cdot Y$ we denote the inner product $B \cdot Y = \sum_{ij} b_{ij} y_{ij}$ (coinciding with the trace of $BY$). For convenience of notation we have also replaced $c^T x$ by $c \cdot x$. Let again $v_P, v_D$ be the maximum and minimum values of $P$, $D$, respectively, and $F_P$, $F_D$ the feasible sets. Points $x \in F_P$, $Y \in F_D$ are called strictly feasible if $\mathcal{A}(x)$, $Y$ are positive definite.

We give a generalized non-homogeneous Farkas lemma (see Section 4 for a proof). For a given set $S$ let $\operatorname{cone}(S)$ denote the convex cone, $\operatorname{lin}(S)$ the linear hull and $\operatorname{clos}(S)$ the closure of $S$.

Lemma 3. Let $S_0 = \{(b_k, \beta_k) \mid b_k \in \mathbb{R}^q, \ \beta_k \in \mathbb{R}, \ k \in K\}$, $K$ a possibly infinite set, and $S_1 = \{(c_j, \gamma_j) \mid c_j \in \mathbb{R}^q, \ \gamma_j \in \mathbb{R}, \ j \in J\}$, $J$ a finite set, be given. Then precisely one of the following alternatives is valid, with $S = \operatorname{cone}(S_0) + \operatorname{lin}(S_1)$.
(a) There is a solution $\xi$ of $b_k^T \xi \le \beta_k$, $k \in K$, and $c_j^T \xi = \gamma_j$, $j \in J$.
(b) We have $(0, -1) \in \operatorname{clos}(S)$.

We need a result for semidefinite matrices. A proof is given in Section 4.

Lemma 4. Let $A \succeq 0$, $B \succeq 0$ be given. Then $A \cdot B \ge 0$, and $A \cdot B = 0$ if and only if $AB = 0$. If moreover $A \succ 0$, then $A \cdot B = 0$ implies $B = 0$.

The possibility to treat the semidefinite problem as a direct generalization of the linear case depends on the following observation. Let $V_m$ denote the compact set $V_m = \{V = vv^T \mid v \in \mathbb{R}^m, \ |v| = 1\}$. Then, in view of $A \cdot vv^T = v^T A v$, it follows that

(4) $\quad A \succeq 0 \iff A \cdot V \ge 0 \ \text{ for all } V \in V_m.$

We now proceed as in the case of linear programs.

Lemma 5 (Weak Duality). Let $x \in F_P$, $Y \in F_D$ be given. Then

(5) $\quad B \cdot Y - c \cdot x = Y \cdot \mathcal{A}(x) \ge 0.$

If in (5) we have $B \cdot Y - c \cdot x = 0$, then $x, Y$ are solutions of $P$, $D$ with $v_P = v_D$.

Proof. For feasible $x, Y$ we find $B \cdot Y - c \cdot x = B \cdot Y - \sum_{i=1}^n x_i A_i \cdot Y = Y \cdot \mathcal{A}(x) \ge 0$, or $B \cdot Y \ge c \cdot x$. The equality sign implies that $Y$ is minimal for $D$ and $x$ maximal for $P$ with the same value $B \cdot Y = v_D = c \cdot x = v_P$.

We give the proof of the strong duality results together with optimality conditions under the usual constraint qualifications.
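The identity (5) is purely algebraic once $A_i \cdot Y = c_i$ holds, so it can be verified on random data. The following sketch is illustrative only (all matrices are randomly generated, not from the paper): it builds a dual-feasible $Y$, defines $c_i := A_i \cdot Y$, takes the primal-feasible point $x = 0$ with $B$ positive semidefinite, and checks $B \cdot Y - c \cdot x = Y \cdot \mathcal{A}(x) \ge 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 2

def inner(P, Q):
    # Trace inner product P . Q = sum_ij p_ij q_ij (= tr(PQ) for symmetric P, Q).
    return float(np.sum(P * Q))

# Random symmetric A_i, and a random PSD Y; setting c_i = A_i . Y makes
# Y dual-feasible by construction.
A_mats = [(M + M.T) / 2 for M in rng.standard_normal((n, m, m))]
R = rng.standard_normal((m, m))
Y = R @ R.T                                      # Y >= 0
c = np.array([inner(Ai, Y) for Ai in A_mats])    # enforces A_i . Y = c_i

# Take B PSD and x = 0, so A(x) = B - sum_i x_i A_i = B >= 0 (primal-feasible).
Q = rng.standard_normal((m, m))
B = Q @ Q.T
x = np.zeros(n)
A_x = B - sum(xi * Ai for xi, Ai in zip(x, A_mats))

lhs = inner(B, Y) - float(c @ x)
rhs = inner(Y, A_x)
print(lhs, rhs)   # identical, and nonnegative since Y and A(x) are PSD
```

The nonnegativity of `rhs` is the first assertion of Lemma 4: the trace inner product of two positive semidefinite matrices is nonnegative.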

Theorem 2 (Strong Duality). The following holds.
(a) Suppose $P$ is strictly feasible. Then $F_D = \emptyset$ if and only if $v_P = \infty$. Suppose $D$ is strictly feasible. Then $F_P = \emptyset$ if and only if $v_D = -\infty$.
(b) Suppose $P$ and $D$ are strictly feasible. Then $P$ and $D$ have solutions $x$ and $Y$ satisfying $c \cdot x = B \cdot Y$. Moreover, the following optimality conditions hold:

$x \in F_P$ solves $P$ $\iff$ there exists $Y \in F_D$ such that $Y \cdot \mathcal{A}(x) = 0$;
$Y \in F_D$ solves $D$ $\iff$ there exists $x \in F_P$ such that $Y \cdot \mathcal{A}(x) = 0$.

Proof. In $P$ we can assume that $A_i$, $i = 1, \ldots, n$, are linearly independent.

(a) Assuming $F_D \ne \emptyset$, then with $Y \in F_D$ we obtain from Lemma 5 that $v_P \le B \cdot Y$, i.e. $v_P < \infty$. Suppose now that $F_D = \emptyset$, i.e. there is no solution $Y$ of

$$A_i \cdot Y = c_i, \ i = 1, \ldots, n, \qquad Y \cdot V \ge 0 \ \text{ for all } V \in V_m.$$

By Lemma 3, $(0, -1) \in \operatorname{clos}\big(\operatorname{cone}(-V_m \times \{0\}) + \operatorname{lin}\{(A_i, c_i), \ i = 1, \ldots, n\}\big)$, i.e. there exist $V_k^\nu \in V_m$, $\mu_k^\nu \ge 0$, $k \in K^\nu$ ($K^\nu$ finite), and $\lambda_i^\nu \in \mathbb{R}$ such that

$$\sum_{k \in K^\nu} \mu_k^\nu \begin{pmatrix} -V_k^\nu \\ 0 \end{pmatrix} + \sum_{i=1}^n \lambda_i^\nu \begin{pmatrix} A_i \\ c_i \end{pmatrix} \to \begin{pmatrix} 0 \\ -1 \end{pmatrix} \quad \text{for } \nu \to \infty.$$

Putting $S^\nu = \sum_{k \in K^\nu} \mu_k^\nu V_k^\nu \succeq 0$ and $x^\nu = -\lambda^\nu$, this is equivalent with

(6) $\quad -\sum_{i=1}^n x_i^\nu A_i + E^\nu = S^\nu, \qquad c \cdot x^\nu = 1 + \varepsilon_0^\nu,$

with $\varepsilon^\nu = (E^\nu, \varepsilon_0^\nu) \to 0$ for $\nu \to \infty$. In the expression $(E^\nu, \varepsilon_0^\nu)$ the element is to be seen as a vector in $\mathbb{R}^{m^2+1}$; put $\bar\varepsilon^\nu = \|(E^\nu, \varepsilon_0^\nu)\|$. With a strictly feasible $x$ we have $\mathcal{A}(x) \succ 0$ and we can choose $M > 0$ large enough such that $M \bar\varepsilon^\nu \mathcal{A}(x) \succeq E^\nu$ for all $\nu \in \mathbb{N}$. This implies

$$M \bar\varepsilon^\nu B - \sum_{i=1}^n \big(M \bar\varepsilon^\nu x_i + x_i^\nu\big) A_i = M \bar\varepsilon^\nu \mathcal{A}(x) - E^\nu + S^\nu \succeq 0, \qquad c \cdot \big(M \bar\varepsilon^\nu x + x^\nu\big) = 1 + \varepsilon_0^\nu + M \bar\varepsilon^\nu \, c \cdot x.$$

Dividing by $M \bar\varepsilon^\nu$ and using $\bar\varepsilon^\nu \to 0$, we obtain that $x + x^\nu / (M \bar\varepsilon^\nu)$ is feasible for $P$ with

$$c \cdot \Big(x + \frac{x^\nu}{M \bar\varepsilon^\nu}\Big) = c \cdot x + \frac{1 + \varepsilon_0^\nu}{M \bar\varepsilon^\nu} \to \infty \quad \text{for } \nu \to \infty,$$

so that $v_P = \infty$. The other case can be proven similarly.

(b) In view of Lemma 5 and using (4), to prove the first part of the statement it is sufficient to show that there exists a solution $x, Y$ of

(7) $\quad \sum_{i=1}^n x_i A_i \cdot V \le B \cdot V, \ V \in V_m; \qquad -Y \cdot V \le 0, \ V \in V_m; \qquad Y \cdot A_i = c_i, \ i = 1, \ldots, n; \qquad -\sum_{i=1}^n x_i c_i + B \cdot Y \le 0.$

Suppose that this system is not solvable. By Lemma 3 there exist $\alpha^\nu \ge 0$, $V_l^\nu, V_k^\nu \in V_m$, $\mu_l^\nu, \mu_k^\nu \ge 0$, $l \in L^\nu$, $k \in K^\nu$ (finite sets), and $\lambda_i^\nu \in \mathbb{R}$ such that, for $\nu \to \infty$,

$$\sum_{l \in L^\nu} \mu_l^\nu \begin{pmatrix} A_1 \cdot V_l^\nu \\ \vdots \\ A_n \cdot V_l^\nu \\ 0 \\ B \cdot V_l^\nu \end{pmatrix} + \sum_{k \in K^\nu} \mu_k^\nu \begin{pmatrix} 0 \\ \vdots \\ 0 \\ -V_k^\nu \\ 0 \end{pmatrix} + \sum_{i=1}^n \lambda_i^\nu \begin{pmatrix} 0 \\ \vdots \\ 0 \\ A_i \\ c_i \end{pmatrix} + \alpha^\nu \begin{pmatrix} -c_1 \\ \vdots \\ -c_n \\ B \\ 0 \end{pmatrix} \to \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 0 \\ -1 \end{pmatrix}.$$

Putting $Y^\nu = \sum_{l \in L^\nu} \mu_l^\nu V_l^\nu$, $S^\nu = \sum_{k \in K^\nu} \mu_k^\nu V_k^\nu$, $x^\nu = -\lambda^\nu$, this is equivalent with

(8) $\quad A_i \cdot Y^\nu - \alpha^\nu c_i + \varepsilon_i^\nu = 0, \ i = 1, \ldots, n; \qquad \alpha^\nu B - \sum_{i=1}^n x_i^\nu A_i + E^\nu = S^\nu; \qquad B \cdot Y^\nu - c \cdot x^\nu = -1 + \varepsilon_0^\nu,$

where $\varepsilon^\nu = (\varepsilon_1^\nu, \ldots, \varepsilon_n^\nu, E^\nu, \varepsilon_0^\nu) \to 0$ for $\nu \to \infty$. Define the numbers $\rho^\nu = \max\{\|(Y^\nu, S^\nu)\|, \|x^\nu\|, \alpha^\nu\}$. We distinguish between two cases.

Case $\rho^\nu \le M$ for all $\nu \in \mathbb{N}$: Then there exist convergent subsequences $Y^\nu \to Y$, $S^\nu \to S$, $x^\nu \to x$, $\alpha^\nu \to \alpha$, and from (8) we find

(9) $\quad A_i \cdot Y = \alpha c_i, \ i = 1, \ldots, n; \qquad \alpha B - \sum_{i=1}^n x_i A_i = S \succeq 0; \qquad B \cdot Y - c \cdot x = -1.$

We distinguish between two sub-cases. If $\alpha > 0$, then by dividing the relations (9) by $\alpha$ we obtain a solution of the system (7), a contradiction. If $\alpha = 0$, then in view of (9), with feasible $\bar x \in F_P$, $\bar Y \in F_D$, the points $\bar x(t) = \bar x + t x$, $\bar Y(t) = \bar Y + t Y$ are feasible with $B \cdot \bar Y(t) - c \cdot \bar x(t) = B \cdot \bar Y - c \cdot \bar x - t \to -\infty$ for $t \to \infty$, in contradiction to Lemma 5.

Case $\rho^\nu \to \infty$ (for some subsequence): By dividing (8) by $\rho^\nu$ and taking converging subsequences we obtain, with some $\hat Y \succeq 0$, $\hat S \succeq 0$, $\hat\alpha \ge 0$, $\hat x$,

(10) $\quad A_i \cdot \hat Y - \hat\alpha c_i = 0, \ i = 1, \ldots, n; \qquad \hat\alpha B - \sum_{i=1}^n \hat x_i A_i = \hat S; \qquad B \cdot \hat Y - c \cdot \hat x = 0,$

and $\max\{\|(\hat Y, \hat S)\|, \|\hat x\|, \hat\alpha\} = 1$. It now follows that $\hat\alpha > 0$. In fact, for $\hat\alpha = 0$ we find from (10), with strictly feasible $\bar x, \bar Y$ and using $A_i \cdot \bar Y = c_i$,

(11) $\quad \mathcal{A}(\bar x) \cdot \hat Y + \hat S \cdot \bar Y = 0 \quad \text{with } \mathcal{A}(\bar x), \bar Y \succ 0.$

In view of Lemma 4 it follows that $\hat Y = \hat S = 0$, and by the linear independence of the $A_i$, in (10) also $\hat x = 0$, a contradiction.

The relation $\hat\alpha > 0$ implies that (8) is valid with $\alpha^\nu \to \infty$ (some subsequence). Now, since the $A_i$ are linearly independent, we can choose $Y_\varepsilon^\nu$ such that with some $M > 0$

(12) $\quad A_i \cdot Y_\varepsilon^\nu = \varepsilon_i^\nu, \ i = 1, \ldots, n, \quad \text{and} \quad \|Y_\varepsilon^\nu\| \le M \|\varepsilon^\nu\| \ \text{ for all } \nu \in \mathbb{N}.$

Thus, with strictly feasible $\bar x, \bar Y$, there exists $M > 0$ such that

(13) $\quad Y_\varepsilon^\nu + M \|\varepsilon^\nu\| \bar Y \succeq 0, \qquad M \|\varepsilon^\nu\| \Big(B - \sum_{i=1}^n \bar x_i A_i\Big) \succeq E^\nu, \quad \nu \in \mathbb{N}.$

For $Y^\nu + Y_\varepsilon^\nu + M \|\varepsilon^\nu\| \bar Y$ and $x^\nu + M \|\varepsilon^\nu\| \bar x$ we find, using (8), (12), (13) and $\varepsilon^\nu \to 0$,

(14) $\quad A_i \cdot \big(Y^\nu + Y_\varepsilon^\nu + M \|\varepsilon^\nu\| \bar Y\big) - \big(\alpha^\nu + M \|\varepsilon^\nu\|\big) c_i = 0, \ i = 1, \ldots, n;$
$\quad \big(\alpha^\nu + M \|\varepsilon^\nu\|\big) B - \sum_{i=1}^n \big(x_i^\nu + M \|\varepsilon^\nu\| \bar x_i\big) A_i \succeq 0;$
$\quad B \cdot \big(Y^\nu + Y_\varepsilon^\nu + M \|\varepsilon^\nu\| \bar Y\big) - c \cdot \big(x^\nu + M \|\varepsilon^\nu\| \bar x\big) = -1 + \varepsilon_0^\nu + O(\|\varepsilon^\nu\|) \le -\tfrac{1}{2}$

for any fixed $\nu$ large enough. Since $\alpha^\nu \to \infty$, we obtain $\alpha^\nu + M \|\varepsilon^\nu\| > 0$ for large $\nu$. By dividing (14) by $\alpha^\nu + M \|\varepsilon^\nu\| > 0$ we obtain a solution of (7), in contradiction to our assumption. This shows the first part of (b).

The optimality conditions are obtained as follows. Suppose $x$ is a solution of $P$. As shown, there exists a solution $Y$ of $D$ with $0 = B \cdot Y - c \cdot x = Y \cdot \mathcal{A}(x)$. Lemma 4 implies $Y \mathcal{A}(x) = 0$. On the other hand, if for $x \in F_P$ the matrix $Y \in F_D$ satisfies $Y \mathcal{A}(x) = 0$, and thus $Y \cdot \mathcal{A}(x) = 0$, then by Lemma 5, $x$ is a solution of $P$. The optimality conditions for $Y \in F_D$ are obtained similarly.

The proof of the semidefinite case is longer than the proof of the statements of Theorem 1 for linear programs. The reason is that the set $S = \operatorname{cone}(S_1^0) + \operatorname{cone}(S_2^0) + \operatorname{lin}(S_1) + \operatorname{cone}\{s_0\}$ with

$$S_1^0 = \{(A_1 \cdot V, \ldots, A_n \cdot V, 0, B \cdot V) \mid V \in V_m\}, \qquad S_2^0 = \{(0, \ldots, 0, -V, 0) \mid V \in V_m\},$$
$$S_1 = \{(0, \ldots, 0, A_i, c_i) \mid i = 1, \ldots, n\}, \qquad s_0 = (-c_1, \ldots, -c_n, B, 0),$$

need not be closed, although the strict feasibility assumptions in Theorem 2(b) imply that the set $\operatorname{cone}(S_1^0) + \operatorname{cone}(S_2^0) + \operatorname{lin}(S_1)$ is closed. Hence, in the proof of Theorem 2(b), the case $\rho^\nu \to \infty$ cannot be excluded. This complication is not present in linear programming, since cones generated by finitely many vectors are always closed. For further details on semidefinite programming, such as duality gaps, we refer to [3].

Commonly the duality results and optimality conditions for semidefinite problems are obtained by transforming the semidefinite programs into a more abstract cone-constrained form. Our approach avoids such a transformation by treating the programs as a special case of a semi-infinite problem (see also Section 3).
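The observation (4), which also underlies the passage to semi-infinite programs, says that $A \succeq 0$ exactly when $A \cdot V \ge 0$ for every $V = vv^T$ with $|v| = 1$; equivalently, the infimum of the Rayleigh quotient $v^T A v$ over unit vectors is the smallest eigenvalue of $A$. The sketch below is illustrative only (a random symmetric matrix and a crude Monte Carlo sampling of $V_m$, not part of the paper): it compares the sampled minimum of $A \cdot V$ with the smallest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 5
M = rng.standard_normal((m, m))
A = (M + M.T) / 2          # symmetric test matrix, not necessarily PSD

# A . vv^T = v^T A v, so inf_{V in V_m} A . V equals the smallest
# eigenvalue of A (Rayleigh quotient characterization).
lam_min = float(np.linalg.eigvalsh(A)[0])

# Crude sampling of V_m = { vv^T : |v| = 1 } as a sanity check.
vs = rng.standard_normal((20000, m))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)
sampled = np.einsum('ki,ij,kj->k', vs, A, vs)   # v^T A v for each sample
sampled_min = float(sampled.min())

print(lam_min, sampled_min)   # sampled_min >= lam_min, approaching it with more samples
```

Every sampled value lies between the extreme eigenvalues, so checking `A . V >= 0` over $V_m$ is the same as checking positive semidefiniteness, which is what lets the semidefinite program be rewritten as a semi-infinite one in Section 3.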
3. STRONG DUALITY RESULTS IN SEMI-INFINITE PROGRAMMING

In this section we briefly outline how the same approach can be applied to linear semi-infinite programs. A common linear semi-infinite problem is of the form

$$P: \ \max_{x \in \mathbb{R}^n} c \cdot x \ \text{ s.t. } \ \sum_{i=1}^n x_i a_i(t) \le b(t) \ \text{ for all } t \in T,$$

where $c \in \mathbb{R}^n$ is a given vector and $b(t), a_i(t) \in C(T, \mathbb{R})$, $T$ a compact subset of a topological space. Again we have replaced $c^T x$ by $c \cdot x$. $C(T, \mathbb{R})$ denotes the space of real-valued functions $f$, continuous on $T$, with norm $\|f\| = \max\{|f(t)| \mid t \in T\}$. Note that, in view of (4), the semidefinite program of the previous section can be written as a semi-infinite program by defining

$$b(t) = t^T B t, \quad a_i(t) = t^T A_i t, \ i = 1, \ldots, n, \qquad t \in T = \{t \in \mathbb{R}^m \mid |t| = 1\}.$$

For $f \in C(T, \mathbb{R})$ we write $f \ge 0$ ($f > 0$) if $f(t) \ge 0$ ($f(t) > 0$) for all $t \in T$. The dual $C(T, \mathbb{R})^*$ of the space $C(T, \mathbb{R})$ is the space of all real-valued Borel measures $y$ on $T$

(see [1]). We define

$$f \cdot y = \int_T f(t) \, dy(t), \qquad f \in C(T, \mathbb{R}), \ y \in C(T, \mathbb{R})^*.$$

The measure $y$ is said to be non-negative (notation $y \ge 0$) if $f \cdot y \ge 0$ for all $f \in C(T, \mathbb{R})$, $f \ge 0$, and positive ($y > 0$) if $f \cdot y > 0$ for all $f \in C(T, \mathbb{R})$, $f \ge 0$, $f \ne 0$. The dual of $P$ then reads

$$D: \ \min_{y \in C(T, \mathbb{R})^*} b \cdot y \ \text{ s.t. } \ a_i \cdot y = c_i, \ i = 1, \ldots, n, \ y \ge 0.$$

As before, let $v_P, v_D$ denote the values of $P$, $D$ and $F_P$, $F_D$ the feasible sets. Elements $x \in F_P$ and $y \in F_D$ are said to be strictly feasible if

$$a(x) = b - \sum_{i=1}^n x_i a_i > 0 \quad \text{and} \quad y > 0.$$

We introduce the set $K_1^+ = \{f \in C(T, \mathbb{R}) \mid f \ge 0, \ \|f\| \le 1\}$. With these settings we can proceed as in the semidefinite case. The full system for the solutions $x$ of $P$, $y$ of $D$ corresponding to (7), for example, becomes in the semi-infinite case

$$\sum_{i=1}^n x_i a_i(t) \le b(t), \ t \in T; \qquad -q \cdot y \le 0, \ q \in K_1^+; \qquad a_i \cdot y = c_i, \ i = 1, \ldots, n; \qquad -\sum_{i=1}^n x_i c_i + b \cdot y \le 0.$$

By considering some appropriate modifications in the proofs of Section 2 we can prove weak and strong duality results for semi-infinite programs along the same lines as in the semidefinite case. For shortness we only give the strong duality result.

Theorem 3 (Strong Duality). The following holds.
(a) Suppose $P$ is strictly feasible. Then $F_D = \emptyset$ if and only if $v_P = \infty$. Suppose $D$ is strictly feasible. Then $F_P = \emptyset$ if and only if $v_D = -\infty$.
(b) Suppose $P$ and $D$ are strictly feasible. Then $P$ and $D$ have solutions $x$ and $y$ satisfying $c \cdot x = b \cdot y$. Moreover, the following optimality conditions hold:

$x \in F_P$ solves $P$ $\iff$ there exists $y \in F_D$ such that $a(x) \cdot y = 0$;
$y \in F_D$ solves $D$ $\iff$ there exists $x \in F_P$ such that $a(x) \cdot y = 0$.

For further details on semi-infinite programming we refer to the paper [2].

4. PROOFS OF THE AUXILIARY LEMMATA

For completeness, in this section the proofs of all auxiliary lemmata of Sections 1 and 2 are given.

Proof of Lemma 1. We prove the statement with the help of the following common homogeneous version of the Farkas lemma: Given an $(m \times n)$-matrix $A$ and $b \in \mathbb{R}^m$, precisely one of the alternatives (a$'$), (b$'$) is valid:

(a$'$) $Ax \le 0$, $b^T x > 0$ is solvable;
(b$'$) $A^T y = b$, $y \ge 0$ is solvable.

By introducing, in the situation of Lemma 1, an auxiliary variable $x_{n+1}$, the statement (a) is equivalent with: there exists a solution $(x, x_{n+1})$ of

$$x_{n+1} > 0, \qquad Bx - x_{n+1} b \le 0, \qquad Cx - x_{n+1} c \le 0, \qquad -Cx + x_{n+1} c \le 0.$$

This system is of the form (a$'$). Its alternative (b$'$) states: there exist vectors $\mu \ge 0$, $\mu_+ \ge 0$, $\mu_- \ge 0$ such that, with $\lambda = \mu_+ - \mu_-$,

$$\begin{pmatrix} B^T \\ b^T \end{pmatrix} \mu + \begin{pmatrix} C^T \\ c^T \end{pmatrix} \lambda = \begin{pmatrix} 0 \\ -1 \end{pmatrix},$$

which is precisely (b).

Proof of Lemma 4. $AB = 0$ directly implies $A \cdot B = \operatorname{tr}(AB) = 0$. To prove the converse, consider the transformation of $A, B$ to diagonal form,

$$A = \sum_{i=1}^m \alpha_i q_i q_i^T, \qquad B = \sum_{j=1}^m \beta_j v_j v_j^T,$$

where $q_i, v_j$ are the orthonormal eigenvectors and $\alpha_i, \beta_j \ge 0$ the corresponding eigenvalues of $A$, $B$. Then with $A \cdot B = \operatorname{tr}(AB)$ we find, using $\alpha_i \beta_j \ge 0$,

$$A \cdot B = \sum_{i,j=1}^m \alpha_i \beta_j \operatorname{tr}(q_i q_i^T v_j v_j^T) = \sum_{i,j=1}^m \alpha_i \beta_j (v_j^T q_i q_i^T v_j) = \sum_{i,j=1}^m \alpha_i \beta_j (q_i^T v_j)^2 \ge 0.$$

Moreover, $A \cdot B = 0$ implies $\alpha_i \beta_j (q_i^T v_j)^2 = 0$, or $\alpha_i \beta_j (q_i^T v_j) = 0$ for all $i, j$, and then

$$AB = \sum_{i,j=1}^m \alpha_i \beta_j q_i q_i^T v_j v_j^T = \sum_{i,j=1}^m \alpha_i \beta_j (q_i^T v_j) \, q_i v_j^T = 0.$$

When $A \succ 0$, then in particular the matrix $A$ is regular, and $A \cdot B = 0$ implies $AB = 0$ and hence $B = A^{-1}(AB) = 0$.

Proof of Lemma 3. We prove the statement by using the following standard separation theorem: Let $S \subset \mathbb{R}^q$ be a closed convex set and $y \in \mathbb{R}^q$. Then precisely one of the alternatives (a$''$), (b$''$) holds:

(a$''$) There exist $\xi \in \mathbb{R}^q$, $\alpha \in \mathbb{R}$ such that $\xi^T s \le \alpha$ for all $s \in S$, and $\xi^T y > \alpha$;
(b$''$) $y \in S$.

It is easy to show that if (b) is valid then (a) cannot hold. Suppose now that (b) is not true. By putting $y = (0, -1)$ and $S = \operatorname{clos}(\operatorname{cone}(S_0) + \operatorname{lin}(S_1))$, the condition (b$''$) is not fulfilled. Thus by (a$''$) there exist a vector $(\xi, \xi_0) \in \mathbb{R}^{q+1}$ and $\alpha \in \mathbb{R}$ such that

(15) $\quad \xi^T b + \xi_0 \beta \le \alpha \ \text{ for all } (b, \beta) \in \operatorname{cone}(S_0); \qquad \xi^T c + \xi_0 \gamma \le \alpha \ \text{ for all } (c, \gamma) \in \operatorname{lin}(S_1); \qquad -\xi_0 > \alpha.$

With $(c, \gamma) \in \operatorname{lin}(S_1)$, $(b, \beta) \in \operatorname{cone}(S_0)$, these relations also hold for $\pm t (c, \gamma)$ and $t (b, \beta)$, $t \ge 0$. This implies $\xi^T c + \xi_0 \gamma = 0$ and $\xi^T b + \xi_0 \beta \le 0$, and we can choose $\alpha = 0$. Then $\xi_0 < 0$, and by dividing (15) by $-\xi_0 > 0$ we obtain, with $\bar\xi = \xi / (-\xi_0)$, the relations $\bar\xi^T b \le \beta$, $\bar\xi^T c = \gamma$ for all $(b, \beta) \in \operatorname{cone}(S_0)$, $(c, \gamma) \in \operatorname{lin}(S_1)$, i.e. (a).

REFERENCES

[1] W. Rudin, Functional Analysis, McGraw-Hill, 1973.
[2] R. Hettich and K. Kortanek, Semi-infinite programming: theory, methods, and applications, SIAM Review, Vol. 35, No. 3, 1993.
[3] L. Vandenberghe and S. Boyd, Semidefinite programming, SIAM Review, 38, 49-95, 1996.


More information

5.) For each of the given sets of vectors, determine whether or not the set spans R 3. Give reasons for your answers.

5.) For each of the given sets of vectors, determine whether or not the set spans R 3. Give reasons for your answers. Linear Algebra - Test File - Spring Test # For problems - consider the following system of equations. x + y - z = x + y + 4z = x + y + 6z =.) Solve the system without using your calculator..) Find the

More information

Inequality Constraints

Inequality Constraints Chapter 2 Inequality Constraints 2.1 Optimality Conditions Early in multivariate calculus we learn the significance of differentiability in finding minimizers. In this section we begin our study of the

More information

Constrained Optimization Theory

Constrained Optimization Theory Constrained Optimization Theory Stephen J. Wright 1 2 Computer Sciences Department, University of Wisconsin-Madison. IMA, August 2016 Stephen Wright (UW-Madison) Constrained Optimization Theory IMA, August

More information

ELE539A: Optimization of Communication Systems Lecture 15: Semidefinite Programming, Detection and Estimation Applications

ELE539A: Optimization of Communication Systems Lecture 15: Semidefinite Programming, Detection and Estimation Applications ELE539A: Optimization of Communication Systems Lecture 15: Semidefinite Programming, Detection and Estimation Applications Professor M. Chiang Electrical Engineering Department, Princeton University March

More information

Conic Linear Optimization and its Dual. yyye

Conic Linear Optimization and its Dual.   yyye Conic Linear Optimization and Appl. MS&E314 Lecture Note #04 1 Conic Linear Optimization and its Dual Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A.

More information

1 Review of last lecture and introduction

1 Review of last lecture and introduction Semidefinite Programming Lecture 10 OR 637 Spring 2008 April 16, 2008 (Wednesday) Instructor: Michael Jeremy Todd Scribe: Yogeshwer (Yogi) Sharma 1 Review of last lecture and introduction Let us first

More information

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra. DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1

More information

Appendix PRELIMINARIES 1. THEOREMS OF ALTERNATIVES FOR SYSTEMS OF LINEAR CONSTRAINTS

Appendix PRELIMINARIES 1. THEOREMS OF ALTERNATIVES FOR SYSTEMS OF LINEAR CONSTRAINTS Appendix PRELIMINARIES 1. THEOREMS OF ALTERNATIVES FOR SYSTEMS OF LINEAR CONSTRAINTS Here we consider systems of linear constraints, consisting of equations or inequalities or both. A feasible solution

More information

Lecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima

Lecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima B9824 Foundations of Optimization Lecture 1: Introduction Fall 2009 Copyright 2009 Ciamac Moallemi Outline 1. Administrative matters 2. Introduction 3. Existence of optima 4. Local theory of unconstrained

More information

Optimality, Duality, Complementarity for Constrained Optimization

Optimality, Duality, Complementarity for Constrained Optimization Optimality, Duality, Complementarity for Constrained Optimization Stephen Wright University of Wisconsin-Madison May 2014 Wright (UW-Madison) Optimality, Duality, Complementarity May 2014 1 / 41 Linear

More information

Semidefinite Programming Duality and Linear Time-invariant Systems

Semidefinite Programming Duality and Linear Time-invariant Systems Semidefinite Programming Duality and Linear Time-invariant Systems Venkataramanan (Ragu) Balakrishnan School of ECE, Purdue University 2 July 2004 Workshop on Linear Matrix Inequalities in Control LAAS-CNRS,

More information

LP Duality: outline. Duality theory for Linear Programming. alternatives. optimization I Idea: polyhedra

LP Duality: outline. Duality theory for Linear Programming. alternatives. optimization I Idea: polyhedra LP Duality: outline I Motivation and definition of a dual LP I Weak duality I Separating hyperplane theorem and theorems of the alternatives I Strong duality and complementary slackness I Using duality

More information

Lecture 6 - Convex Sets

Lecture 6 - Convex Sets Lecture 6 - Convex Sets Definition A set C R n is called convex if for any x, y C and λ [0, 1], the point λx + (1 λ)y belongs to C. The above definition is equivalent to saying that for any x, y C, the

More information

Lecture 7: Semidefinite programming

Lecture 7: Semidefinite programming CS 766/QIC 820 Theory of Quantum Information (Fall 2011) Lecture 7: Semidefinite programming This lecture is on semidefinite programming, which is a powerful technique from both an analytic and computational

More information

The Simplest Semidefinite Programs are Trivial

The Simplest Semidefinite Programs are Trivial The Simplest Semidefinite Programs are Trivial Robert J. Vanderbei Bing Yang Program in Statistics & Operations Research Princeton University Princeton, NJ 08544 January 10, 1994 Technical Report SOR-93-12

More information

Extreme Abridgment of Boyd and Vandenberghe s Convex Optimization

Extreme Abridgment of Boyd and Vandenberghe s Convex Optimization Extreme Abridgment of Boyd and Vandenberghe s Convex Optimization Compiled by David Rosenberg Abstract Boyd and Vandenberghe s Convex Optimization book is very well-written and a pleasure to read. The

More information

Linear Algebra- Final Exam Review

Linear Algebra- Final Exam Review Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.

More information

Paul Schrimpf. October 18, UBC Economics 526. Unconstrained optimization. Paul Schrimpf. Notation and definitions. First order conditions

Paul Schrimpf. October 18, UBC Economics 526. Unconstrained optimization. Paul Schrimpf. Notation and definitions. First order conditions Unconstrained UBC Economics 526 October 18, 2013 .1.2.3.4.5 Section 1 Unconstrained problem x U R n F : U R. max F (x) x U Definition F = max x U F (x) is the maximum of F on U if F (x) F for all x U and

More information

x 3y 2z = 6 1.2) 2x 4y 3z = 8 3x + 6y + 8z = 5 x + 3y 2z + 5t = 4 1.5) 2x + 8y z + 9t = 9 3x + 5y 12z + 17t = 7

x 3y 2z = 6 1.2) 2x 4y 3z = 8 3x + 6y + 8z = 5 x + 3y 2z + 5t = 4 1.5) 2x + 8y z + 9t = 9 3x + 5y 12z + 17t = 7 Linear Algebra and its Applications-Lab 1 1) Use Gaussian elimination to solve the following systems x 1 + x 2 2x 3 + 4x 4 = 5 1.1) 2x 1 + 2x 2 3x 3 + x 4 = 3 3x 1 + 3x 2 4x 3 2x 4 = 1 x + y + 2z = 4 1.4)

More information

Research Note. A New Infeasible Interior-Point Algorithm with Full Nesterov-Todd Step for Semi-Definite Optimization

Research Note. A New Infeasible Interior-Point Algorithm with Full Nesterov-Todd Step for Semi-Definite Optimization Iranian Journal of Operations Research Vol. 4, No. 1, 2013, pp. 88-107 Research Note A New Infeasible Interior-Point Algorithm with Full Nesterov-Todd Step for Semi-Definite Optimization B. Kheirfam We

More information

Agenda. Interior Point Methods. 1 Barrier functions. 2 Analytic center. 3 Central path. 4 Barrier method. 5 Primal-dual path following algorithms

Agenda. Interior Point Methods. 1 Barrier functions. 2 Analytic center. 3 Central path. 4 Barrier method. 5 Primal-dual path following algorithms Agenda Interior Point Methods 1 Barrier functions 2 Analytic center 3 Central path 4 Barrier method 5 Primal-dual path following algorithms 6 Nesterov Todd scaling 7 Complexity analysis Interior point

More information

Lecture 8 : Eigenvalues and Eigenvectors

Lecture 8 : Eigenvalues and Eigenvectors CPS290: Algorithmic Foundations of Data Science February 24, 2017 Lecture 8 : Eigenvalues and Eigenvectors Lecturer: Kamesh Munagala Scribe: Kamesh Munagala Hermitian Matrices It is simpler to begin with

More information

1. Find the solution of the following uncontrolled linear system. 2 α 1 1

1. Find the solution of the following uncontrolled linear system. 2 α 1 1 Appendix B Revision Problems 1. Find the solution of the following uncontrolled linear system 0 1 1 ẋ = x, x(0) =. 2 3 1 Class test, August 1998 2. Given the linear system described by 2 α 1 1 ẋ = x +

More information

UNDERGROUND LECTURE NOTES 1: Optimality Conditions for Constrained Optimization Problems

UNDERGROUND LECTURE NOTES 1: Optimality Conditions for Constrained Optimization Problems UNDERGROUND LECTURE NOTES 1: Optimality Conditions for Constrained Optimization Problems Robert M. Freund February 2016 c 2016 Massachusetts Institute of Technology. All rights reserved. 1 1 Introduction

More information

Optimality Conditions for Constrained Optimization

Optimality Conditions for Constrained Optimization 72 CHAPTER 7 Optimality Conditions for Constrained Optimization 1. First Order Conditions In this section we consider first order optimality conditions for the constrained problem P : minimize f 0 (x)

More information

JUST THE MATHS UNIT NUMBER 9.9. MATRICES 9 (Modal & spectral matrices) A.J.Hobson

JUST THE MATHS UNIT NUMBER 9.9. MATRICES 9 (Modal & spectral matrices) A.J.Hobson JUST THE MATHS UNIT NUMBER 9.9 MATRICES 9 (Modal & spectral matrices) by A.J.Hobson 9.9. Assumptions and definitions 9.9.2 Diagonalisation of a matrix 9.9.3 Exercises 9.9.4 Answers to exercises UNIT 9.9

More information

A priori bounds on the condition numbers in interior-point methods

A priori bounds on the condition numbers in interior-point methods A priori bounds on the condition numbers in interior-point methods Florian Jarre, Mathematisches Institut, Heinrich-Heine Universität Düsseldorf, Germany. Abstract Interior-point methods are known to be

More information

A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials

A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials G. Y. Li Communicated by Harold P. Benson Abstract The minimax theorem for a convex-concave bifunction is a fundamental theorem

More information

Semidefinite Programming Basics and Applications

Semidefinite Programming Basics and Applications Semidefinite Programming Basics and Applications Ray Pörn, principal lecturer Åbo Akademi University Novia University of Applied Sciences Content What is semidefinite programming (SDP)? How to represent

More information

Structural and Multidisciplinary Optimization. P. Duysinx and P. Tossings

Structural and Multidisciplinary Optimization. P. Duysinx and P. Tossings Structural and Multidisciplinary Optimization P. Duysinx and P. Tossings 2018-2019 CONTACTS Pierre Duysinx Institut de Mécanique et du Génie Civil (B52/3) Phone number: 04/366.91.94 Email: P.Duysinx@uliege.be

More information

Chapter 6 Inner product spaces

Chapter 6 Inner product spaces Chapter 6 Inner product spaces 6.1 Inner products and norms Definition 1 Let V be a vector space over F. An inner product on V is a function, : V V F such that the following conditions hold. x+z,y = x,y

More information

Extreme points of compact convex sets

Extreme points of compact convex sets Extreme points of compact convex sets In this chapter, we are going to show that compact convex sets are determined by a proper subset, the set of its extreme points. Let us start with the main definition.

More information

. The following is a 3 3 orthogonal matrix: 2/3 1/3 2/3 2/3 2/3 1/3 1/3 2/3 2/3

. The following is a 3 3 orthogonal matrix: 2/3 1/3 2/3 2/3 2/3 1/3 1/3 2/3 2/3 Lecture Notes: Orthogonal and Symmetric Matrices Yufei Tao Department of Computer Science and Engineering Chinese University of Hong Kong taoyf@cse.cuhk.edu.hk Orthogonal Matrix Definition. An n n matrix

More information

EE/ACM Applications of Convex Optimization in Signal Processing and Communications Lecture 2

EE/ACM Applications of Convex Optimization in Signal Processing and Communications Lecture 2 EE/ACM 150 - Applications of Convex Optimization in Signal Processing and Communications Lecture 2 Andre Tkacenko Signal Processing Research Group Jet Propulsion Laboratory April 5, 2012 Andre Tkacenko

More information

MIT LIBRARIES. III III 111 l ll llljl II Mil IHII l l

MIT LIBRARIES. III III 111 l ll llljl II Mil IHII l l MIT LIBRARIES III III 111 l ll llljl II Mil IHII l l DUPL 3 9080 02246 1237 [DEWEy )28 1414 \^^ i MIT Sloan School of Management Sloan Working Paper 4176-01 July 2001 ON THE PRIMAL-DUAL GEOMETRY OF

More information

Assignment 1: From the Definition of Convexity to Helley Theorem

Assignment 1: From the Definition of Convexity to Helley Theorem Assignment 1: From the Definition of Convexity to Helley Theorem Exercise 1 Mark in the following list the sets which are convex: 1. {x R 2 : x 1 + i 2 x 2 1, i = 1,..., 10} 2. {x R 2 : x 2 1 + 2ix 1x

More information

Absolute Value Programming

Absolute Value Programming O. L. Mangasarian Absolute Value Programming Abstract. We investigate equations, inequalities and mathematical programs involving absolute values of variables such as the equation Ax + B x = b, where A

More information

An Alternative Proof of Primitivity of Indecomposable Nonnegative Matrices with a Positive Trace

An Alternative Proof of Primitivity of Indecomposable Nonnegative Matrices with a Positive Trace An Alternative Proof of Primitivity of Indecomposable Nonnegative Matrices with a Positive Trace Takao Fujimoto Abstract. This research memorandum is aimed at presenting an alternative proof to a well

More information

Lecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima

Lecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima B9824 Foundations of Optimization Lecture 1: Introduction Fall 2010 Copyright 2010 Ciamac Moallemi Outline 1. Administrative matters 2. Introduction 3. Existence of optima 4. Local theory of unconstrained

More information

5. Duality. Lagrangian

5. Duality. Lagrangian 5. Duality Convex Optimization Boyd & Vandenberghe Lagrange dual problem weak and strong duality geometric interpretation optimality conditions perturbation and sensitivity analysis examples generalized

More information

The proximal mapping

The proximal mapping The proximal mapping http://bicmr.pku.edu.cn/~wenzw/opt-2016-fall.html Acknowledgement: this slides is based on Prof. Lieven Vandenberghes lecture notes Outline 2/37 1 closed function 2 Conjugate function

More information

Chapter 1. Preliminaries

Chapter 1. Preliminaries Introduction This dissertation is a reading of chapter 4 in part I of the book : Integer and Combinatorial Optimization by George L. Nemhauser & Laurence A. Wolsey. The chapter elaborates links between

More information

Absolute value equations

Absolute value equations Linear Algebra and its Applications 419 (2006) 359 367 www.elsevier.com/locate/laa Absolute value equations O.L. Mangasarian, R.R. Meyer Computer Sciences Department, University of Wisconsin, 1210 West

More information

LMI MODELLING 4. CONVEX LMI MODELLING. Didier HENRION. LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ. Universidad de Valladolid, SP March 2009

LMI MODELLING 4. CONVEX LMI MODELLING. Didier HENRION. LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ. Universidad de Valladolid, SP March 2009 LMI MODELLING 4. CONVEX LMI MODELLING Didier HENRION LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ Universidad de Valladolid, SP March 2009 Minors A minor of a matrix F is the determinant of a submatrix

More information

Second Order Elliptic PDE

Second Order Elliptic PDE Second Order Elliptic PDE T. Muthukumar tmk@iitk.ac.in December 16, 2014 Contents 1 A Quick Introduction to PDE 1 2 Classification of Second Order PDE 3 3 Linear Second Order Elliptic Operators 4 4 Periodic

More information

Lecture: Duality of LP, SOCP and SDP

Lecture: Duality of LP, SOCP and SDP 1/33 Lecture: Duality of LP, SOCP and SDP Zaiwen Wen Beijing International Center For Mathematical Research Peking University http://bicmr.pku.edu.cn/~wenzw/bigdata2017.html wenzw@pku.edu.cn Acknowledgement:

More information

Linear and non-linear programming

Linear and non-linear programming Linear and non-linear programming Benjamin Recht March 11, 2005 The Gameplan Constrained Optimization Convexity Duality Applications/Taxonomy 1 Constrained Optimization minimize f(x) subject to g j (x)

More information

REGULAR LAGRANGE MULTIPLIERS FOR CONTROL PROBLEMS WITH MIXED POINTWISE CONTROL-STATE CONSTRAINTS

REGULAR LAGRANGE MULTIPLIERS FOR CONTROL PROBLEMS WITH MIXED POINTWISE CONTROL-STATE CONSTRAINTS REGULAR LAGRANGE MULTIPLIERS FOR CONTROL PROBLEMS WITH MIXED POINTWISE CONTROL-STATE CONSTRAINTS fredi tröltzsch 1 Abstract. A class of quadratic optimization problems in Hilbert spaces is considered,

More information

LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM

LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F is R or C. Definition 1. A linear operator

More information

Mathematical Optimisation, Chpt 2: Linear Equations and inequalities

Mathematical Optimisation, Chpt 2: Linear Equations and inequalities Mathematical Optimisation, Chpt 2: Linear Equations and inequalities Peter J.C. Dickinson p.j.c.dickinson@utwente.nl http://dickinson.website version: 12/02/18 Monday 5th February 2018 Peter J.C. Dickinson

More information

Optimization Theory. A Concise Introduction. Jiongmin Yong

Optimization Theory. A Concise Introduction. Jiongmin Yong October 11, 017 16:5 ws-book9x6 Book Title Optimization Theory 017-08-Lecture Notes page 1 1 Optimization Theory A Concise Introduction Jiongmin Yong Optimization Theory 017-08-Lecture Notes page Optimization

More information

Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A.

Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A. . Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A. Nemirovski Arkadi.Nemirovski@isye.gatech.edu Linear Optimization Problem,

More information

University of Twente. Faculty of Mathematical Sciences. A short proof of a conjecture on the T r -choice number of even cycles

University of Twente. Faculty of Mathematical Sciences. A short proof of a conjecture on the T r -choice number of even cycles Faculty of Mathematical Sciences University of Twente University for Technical and Social Sciences P.O. Box 217 7500 AE Enschede The Netherlands Phone: +31-53-4893400 Fax: +31-53-4893114 Email: memo@math.utwente.nl

More information

More First-Order Optimization Algorithms

More First-Order Optimization Algorithms More First-Order Optimization Algorithms Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A. http://www.stanford.edu/ yyye Chapters 3, 8, 3 The SDM

More information