A Unified Theorem on SDP Rank Reduction


Slide 1: A Unified Theorem on SDP Rank Reduction (Yinyu Ye, EURO XXII). Yinyu Ye, Department of Management Science and Engineering and Institute for Computational and Mathematical Engineering, Stanford University, Stanford, CA 94305, U.S.A. Joint work with Anthony So and Jiawei Zhang.

Slide 2: Outline.
- Problem Statement
- Applications
- New SDP Rank Reduction Theorem and Algorithm
- Sketch of Proof
- Extensions and Questions

Slide 3: Problem Statement. Consider the system of semidefinite programming constraints:
$$A_i \bullet X = b_i, \quad i = 1, \dots, m, \qquad X \succeq 0,$$
where the given $A_1, \dots, A_m$ are $n \times n$ symmetric positive semidefinite matrices, $b_1, \dots, b_m \ge 0$, and $A \bullet X = \sum_{i,j} a_{ij} x_{ij} = \mathrm{Tr}(A^T X)$. Clearly, the feasibility of the above system can be decided using SDP interior-point algorithms (Alizadeh 91; Nesterov and Nemirovskii 93; etc.). More precisely, one finds an $\epsilon$-approximate solution in time linear in $\log(1/\epsilon)$.
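
As a quick sanity check on the notation, here is a minimal NumPy sketch (all function and variable names are ours, not from the talk) of the inner product $A \bullet X = \mathrm{Tr}(A^T X)$ and a feasibility test for the system; note that $X = I$ is trivially feasible when $b_i = \mathrm{Tr}(A_i)$:

```python
import numpy as np

def inner(A, X):
    """Matrix inner product A . X = sum_{i,j} a_ij x_ij = Tr(A^T X)."""
    return np.trace(A.T @ X)

def is_feasible(As, bs, X, tol=1e-9):
    """Check that A_i . X = b_i for all i and that X is PSD."""
    psd = np.linalg.eigvalsh(X).min() >= -tol
    return psd and all(abs(inner(A, X) - b) <= tol for A, b in zip(As, bs))

def rand_psd(rng, n):
    """Random symmetric PSD data matrix."""
    B = rng.standard_normal((n, n))
    return B @ B.T

# Small instance: choosing b_i = Tr(A_i) makes X = I a solution.
rng = np.random.default_rng(0)
n, m = 4, 3
As = [rand_psd(rng, n) for _ in range(m)]
bs = [np.trace(A) for A in As]
print(is_feasible(As, bs, np.eye(n)))  # True
```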

Slide 4: Problem Statement (Cont'd). However, we are interested in finding a low-rank solution to the above system. The low-rank problem arises in many applications, e.g.:
- sensor network localization (e.g., Biswas and Ye 03; So and Ye 04)
- metric embedding / dimension reduction (e.g., Johnson and Lindenstrauss 84; Matousek 90)
- approximating non-convex (complex, quaternion) quadratic optimization (e.g., Nemirovskii, Roos and Terlaky 99; Luo, Sidiropoulos, Tseng and Zhang 06; Faybusovich 07)
- graph rigidity / distance matrices (e.g., Alfakih, Khandani and Wolkowicz 99; etc.)

Slide 5: Graph Realization. Given a graph $G = (V, E)$ and sets of non-negative weights, say $\{d_{ij} : (i,j) \in E\}$ and $\{\theta_{ilj} : (i,l,j) \in \Theta\}$, the goal is to compute a realization of $G$ in the Euclidean space $R^d$ for a given low dimension $d$, i.e., to place the vertices of $G$ in $R^d$ such that the Euclidean distance between every pair of adjacent vertices $(i,j)$ equals (or is bounded by) the prescribed weight $d_{ij}$, and the angle between edges $(i,l)$ and $(j,l)$ equals (or is bounded by) the prescribed weight $\theta_{ilj}$.

Slide 6: Figure 1: 50-node 2-D sensor localization.

Slide 7: Figure 2: A 3-D tensegrity graph realization (provided by Anstreicher).

Slide 8: Figure 3: Tensegrity graph: a needle tower (provided by Anstreicher).

Slide 9: Figure 4: Molecular conformation: 1F39 (1534 atoms), with 85% of distances below 6 Å and 10% noise on the upper and lower bounds.

Slide 10: Math Programming: Rank-Constrained SDP. Given anchors $a_k \in R^d$ and weights $d_{ij}$ for $(i,j) \in N_x$, $\hat d_{kj}$ for $(k,j) \in N_a$, and $v_{ilj}$ for $(i,l,j) \in \Theta$, find $x_i \in R^d$ such that
$$\|x_i - x_j\|^2 \;(\le/=/\ge)\; d_{ij}^2, \quad (i,j) \in N_x,\ i < j,$$
$$\|a_k - x_j\|^2 \;(\le/=/\ge)\; \hat d_{kj}^2, \quad (k,j) \in N_a,$$
$$(x_i - x_l)^T (x_j - x_l) \;(\le/=/\ge)\; v_{ilj}, \quad (i,l,j) \in \Theta,$$
which leads to the rank-constrained SDP system
$$A_i \bullet X = b_i,\ i = 1, \dots, m, \quad X \succeq 0, \quad \mathrm{rank}(X) \le d,$$
and is relaxed to
$$A_i \bullet X = b_i,\ i = 1, \dots, m, \quad X \succeq 0.$$

Slide 11: Some Background. Barvinok 95 showed that if the system is feasible, then there exists a solution $X$ whose rank is at most $\sqrt{2m}$ (also see Carathéodory's theorem). Moreover, Pataki 98 showed how to construct such an $X$ efficiently. Unfortunately, for the applications mentioned above, this is not enough: we want a fixed-low-rank (say $d$) solution! However, there are some issues:
- Such a solution may not exist!
- Even if it does, one may not be able to find it efficiently.
So we consider an approximation of the problem.

Slide 12: Approximating the Problem. We consider the problem of finding an $\hat X \succeq 0$ of rank at most $d$ that satisfies the system approximately:
$$\beta(m,n,d)\, b_i \;\le\; A_i \bullet \hat X \;\le\; \alpha(m,n,d)\, b_i, \quad i = 1, \dots, m.$$
Here the distortion factors satisfy $\alpha \ge 1$ and $\beta \in (0,1]$. Clearly, the closer both are to 1, the better.
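
The distortion factors achieved by a candidate $\hat X$ are easy to measure directly; a small NumPy sketch (function and variable names are ours) computing the attained $\beta$ and $\alpha$ as the smallest and largest ratios $(A_i \bullet \hat X)/b_i$:

```python
import numpy as np

def distortion(As, bs, X_hat):
    """Attained distortion factors of a candidate X_hat:
    beta = min_i (A_i . X_hat)/b_i,  alpha = max_i (A_i . X_hat)/b_i."""
    ratios = [np.trace(A.T @ X_hat) / b for A, b in zip(As, bs)]
    return min(ratios), max(ratios)

rng = np.random.default_rng(1)
n = 5
As = []
for _ in range(4):
    B = rng.standard_normal((n, n))
    As.append(B @ B.T)              # symmetric PSD data
bs = [np.trace(A) for A in As]      # X = I is exactly feasible

# Uniformly shrinking a feasible solution shrinks every ratio equally.
beta, alpha = distortion(As, bs, 0.5 * np.eye(n))
print(beta, alpha)                  # both equal 0.5
```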

Slides 13-15: Our Result. Theorem 1. Suppose that the original system is feasible. Let $r = \max_i \{\mathrm{rank}(A_i)\}$. Then, for any $d \ge 1$, there exists an $\hat X \succeq 0$ of rank at most $d$ such that $\beta(m,n,d)\, b_i \le A_i \bullet \hat X \le \alpha(m,n,d)\, b_i$ for $i = 1, \dots, m$, where
$$\alpha(m,n,d) = \begin{cases} 1 + \dfrac{12\log(4mr)}{d} & \text{for } 1 \le d \le 12\log(4mr), \\[4pt] 1 + \sqrt{\dfrac{12\log(4mr)}{d}} & \text{for } d > 12\log(4mr), \end{cases}$$
$$\beta(m,n,d) = \begin{cases} \dfrac{1}{5e}\, m^{-2/d} & \text{for } 1 \le d \le \dfrac{2\log m}{\log\log(2m)}, \\[4pt] \dfrac{1}{4e}\, \dfrac{1}{\log^{f(m)/d}(2m)} & \text{for } \dfrac{2\log m}{\log\log(2m)} < d \le 4\log(4mr), \\[4pt] 1 - \sqrt{\dfrac{4\log(4mr)}{d}} & \text{for } d > 4\log(4mr), \end{cases}$$
where $f(m) = 3\log m / \log\log(2m)$. Moreover, such an $\hat X$ can be found in randomized polynomial time.

Slide 16: Some Remarks. In general, the data parameter $r$ can be bounded by $2m$, so that
$$\alpha(m,n,d) = 1 + O\!\left(\frac{\log m}{d}\right)$$
and
$$\beta(m,n,d) = \begin{cases} \Omega\big(m^{-2/d}\big) & \text{for } d = O\!\left(\dfrac{\log m}{\log\log m}\right), \\[4pt] \Omega\big((\log m)^{-3\log m/(d\log\log m)}\big) & \text{otherwise.} \end{cases}$$

Slide 17: Some Remarks (Cont'd).
- In the region $1 \le d \le 2\log m/\log\log(2m)$, the lower bound $\beta$ is independent of the ranks of $A_1, \dots, A_m$.
- $f(m)/d \le 3/2$ in the region $2\log m/\log\log(2m) < d \le 4\log(4mr)$.
- $\beta$ is a constant in the region $d > 4\log(4mr)$.
- Our result contains as special cases several well-known results in the literature.

Slide 18: Early Result: Metric Embedding. Given an $n$-point set $V = \{v_1, \dots, v_n\}$ in $R^l$, we would like to embed it into a low-dimensional Euclidean space as faithfully as possible. Specifically, a map $f : V \to R^d$ is an $\alpha$-embedding (where $\alpha \ge 1$) if
$$\|u - v\|_2 \;\le\; \|f(u) - f(v)\|_2 \;\le\; \alpha \|u - v\|_2.$$
The goal is to find an $f$ such that $\alpha$ is as small as possible. It is known that:
- for any $\epsilon > 0$, a $(1+\epsilon)$-embedding into $R^{O(\epsilon^{-2}\log n)}$ exists (Johnson-Lindenstrauss);
- for any fixed $d \ge 1$, an $O\big(n^{2/d}\sqrt{(\log n)/d}\big)$-embedding into $R^d$ exists (Matousek).
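
The Johnson-Lindenstrauss flavor of this slide can be illustrated with a random Gaussian map (a sketch; the dimensions below are arbitrary choices of ours, and the code demonstrates the concentration phenomenon, not the exact JL constants):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(42)
n, l, d = 200, 50, 24              # n points in R^l, target dimension d
V = rng.standard_normal((n, l))    # rows are the points v_1, ..., v_n

# Random Gaussian map f(v) = G v with entries of G ~ N(0, 1/d),
# so that E||G v||^2 = ||v||^2.
G = rng.standard_normal((d, l)) / np.sqrt(d)
W = V @ G.T                        # embedded points in R^d

# Ratios of embedded to original pairwise distances.
ratios = [np.linalg.norm(W[i] - W[j]) / np.linalg.norm(V[i] - V[j])
          for i, j in combinations(range(n), 2)]
print(min(ratios), max(ratios))    # concentrated around 1
```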

Slide 19: Early Result: Metric Embedding (Cont'd). We can get these results from our Theorem; we focus on the fixed-$d$ case. Let $\{e_i\}_{i=1}^n$ be the standard basis vectors, and set $E_{ij} = (e_i - e_j)(e_i - e_j)^T$. Let $U$ be the $l \times n$ matrix whose $i$-th column is $v_i$. Then $X = U^T U$ satisfies the system $E_{ij} \bullet X = \|v_i - v_j\|_2^2$ for $1 \le i < j \le n$. By our Theorem, we can find an $\hat X \succeq 0$ of rank at most $d$ such that
$$\Omega(n^{-4/d})\, \|v_i - v_j\|_2^2 \;\le\; E_{ij} \bullet \hat X \;\le\; O(\log n / d)\, \|v_i - v_j\|_2^2.$$
Upon decomposing $\hat X = \hat U^T \hat U$, where $\hat U$ is $d \times n$, we recover points $\hat u_1, \dots, \hat u_n \in R^d$ such that
$$\Omega(n^{-2/d})\, \|v_i - v_j\| \;\le\; \|\hat u_i - \hat u_j\| \;\le\; O\big(\sqrt{(\log n)/d}\big)\, \|v_i - v_j\|.$$
Note that the embedding results imply only a weaker version ($r = 1$) of our theorem.
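
The identity $E_{ij} \bullet X = \|v_i - v_j\|_2^2$ that drives this reduction is exact algebra and can be checked numerically (a NumPy sketch with our own variable names and test data):

```python
import numpy as np

rng = np.random.default_rng(7)
n, l = 6, 3
V = rng.standard_normal((l, n))    # column i is the point v_i
X = V.T @ V                        # Gram matrix: PSD, rank <= l

def E(i, j, n):
    """E_ij = (e_i - e_j)(e_i - e_j)^T."""
    e = np.zeros(n)
    e[i], e[j] = 1.0, -1.0
    return np.outer(e, e)

# E_ij . X recovers the squared distance ||v_i - v_j||^2 for every pair.
ok = all(
    np.isclose(np.trace(E(i, j, n) @ X),
               np.linalg.norm(V[:, i] - V[:, j]) ** 2)
    for i in range(n) for j in range(i + 1, n)
)
print(ok)  # True
```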

Slide 20: Early Result: Approximating QPs. Let $A_1, \dots, A_m$ be positive semidefinite. Consider the following QP:
$$v = \text{maximize } x^T A x \quad \text{subject to } x^T A_i x \le 1,\ i = 1, \dots, m,$$
and its natural SDP relaxation:
$$v^{sdp} = \text{maximize } A \bullet X \quad \text{subject to } A_i \bullet X \le 1,\ i = 1, \dots, m;\ X \succeq 0.$$
Let $\bar X$ be an optimal solution to the SDP. Nemirovskii et al. showed that one can randomly extract a rank-1 matrix $\hat X$ from $\bar X$ such that it is feasible for the SDP and $E[A \bullet \hat X] \ge \Omega(\log^{-1} m)\, v$.

Slide 21: Early Result: Approximating QPs (Cont'd). We can obtain a similar result from our Theorem. The matrix $\bar X$ satisfies the system
$$A_i \bullet \bar X = b_i \le 1, \quad i = 1, \dots, m.$$
Our proof of the Theorem shows that one can find a rank-1 matrix $\hat X \succeq 0$ such that
$$E[A \bullet \hat X] = v^{sdp}, \qquad A_i \bullet \hat X \le O(\log m)\, b_i, \quad i = 1, \dots, m.$$
By scaling down $\hat X$ by a factor of $O(\log m)$, we obtain a feasible rank-1 matrix $\hat X'$ that satisfies $E[A \bullet \hat X'] \ge \Omega(\log^{-1} m)\, v$.

Slide 22: Early Result: Approximating QPs (Cont'd). Luo et al. considered the following real (complex) QP:
$$\text{minimize } x^T A x \quad \text{subject to } x^T A_i x \ge 1,\ i = 1, \dots, m,$$
and its natural SDP relaxation:
$$\text{minimize } A \bullet X \quad \text{subject to } A_i \bullet X \ge 1,\ i = 1, \dots, m;\ X \succeq 0.$$
They showed how to extract a solution $\hat x$ from an optimal solution matrix of the SDP so that it is feasible and within a factor of $O(m^2)$ (resp. $O(m)$) of the optimum. Again, we can obtain the same results from our Theorem for both the real ($d = 1$) and complex ($d = 2$) cases.

Slide 23: How Sharp are the Bounds? For metric embedding, it is known that:
- for any $d \ge 1$, there exists an $n$-point set $V \subset R^{d+1}$ such that any embedding of $V$ into $R^d$ requires distortion $D = \Omega\big(n^{1/\lfloor (d+1)/2 \rfloor}\big)$ (Matousek);
- there exists an $n$-point set $V \subset R^l$ for some $l$ such that, for any $\epsilon \in (n^{-1/2}, 1/2)$, say, a $(1+\epsilon)$-embedding of $V$ into $R^d$ requires $d = \Omega\big((\epsilon^2 \log(1/\epsilon))^{-1} \log n\big)$ (Alon 03).
Thus, from the metric embedding perspective, the ratio of our upper and lower bounds is almost tight for $d \ge 3$.

Slide 24: How Sharp are the Bounds? (Cont'd). For the QP
$$v = \text{maximize } x^T A x \quad \text{subject to } x^T A_i x \le 1,\ i = 1, \dots, m,$$
and its natural SDP relaxation
$$v^{sdp} = \text{maximize } A \bullet X \quad \text{subject to } A_i \bullet X \le 1,\ i = 1, \dots, m;\ X \succeq 0,$$
Nemirovskii et al. showed that the ratio between $v$ and $v^{sdp}$ can be as large as $\Omega(\log m)$. For the minimization version, Luo et al. showed that the ratio can be as small as $\Omega(m^{-2})$. Thus, from the QP perspective, the ratio of our upper and lower bounds is almost tight for $d = 1$.

Slide 25: Sketch of Proof of the Theorem. Without loss of generality, we may assume that $X = I$ is feasible for the original system; that is, our system becomes
$$A_i \bullet X = \mathrm{Tr}(A_i), \quad i = 1, \dots, m; \qquad X \succeq 0.$$
Thus, the Theorem becomes:
Theorem 2. Let $A_1, \dots, A_m$ be $n \times n$ positive semidefinite matrices. Then, for any $d \ge 1$, there exists an $\hat X \succeq 0$ with rank at most $d$ such that
$$\beta(m,n,d)\, \mathrm{Tr}(A_i) \;\le\; A_i \bullet \hat X \;\le\; \alpha(m,n,d)\, \mathrm{Tr}(A_i), \quad i = 1, \dots, m.$$

Slide 26: Sketch of Proof of the Theorem (Cont'd). The algorithm for generating $\hat X$ is simple:
- generate i.i.d. Gaussian RVs $\xi_i^j$ with mean 0 and variance $1/d$, for $1 \le i \le n$ and $1 \le j \le d$, and define the column vectors $\xi^j = (\xi_1^j; \dots; \xi_n^j)$;
- return $\hat X = \sum_{j=1}^d \xi^j (\xi^j)^T$.
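
This rounding procedure is a few lines of NumPy. The sketch below (names and test data are ours; the guarantees, of course, lie in the lemmas that follow) produces an $\hat X \succeq 0$ of rank at most $d$ with $E[\hat X] = I$:

```python
import numpy as np

def sdp_round(n, d, rng):
    """Randomized rounding of Slide 26: X_hat = sum_j xi^j (xi^j)^T,
    where the xi^j are i.i.d. N(0, I/d) vectors. Then rank(X_hat) <= d
    and E[X_hat] = I."""
    Xi = rng.standard_normal((n, d)) / np.sqrt(d)  # columns are xi^j
    return Xi @ Xi.T

rng = np.random.default_rng(2024)
n, d = 30, 5
X_hat = sdp_round(n, d, rng)
print(np.linalg.matrix_rank(X_hat))        # at most d

# For a PSD H, H . X_hat concentrates around Tr(H) = H . I.
H = np.diag(np.arange(1.0, n + 1))
print(np.trace(H @ X_hat) / np.trace(H))   # ratio near 1
```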

Slide 27: Sketch of Proof of the Theorem (Cont'd). The analysis makes use of the following Markov-type inequality:
Lemma 1. Let $\xi_1, \dots, \xi_n$ be i.i.d. standard Gaussian RVs, let $\alpha \in (1, \infty)$ and $\beta \in (0,1)$ be constants, and let $U_n = \sum_{i=1}^n \xi_i^2$ (a chi-square RV). Then the following hold:
$$\Pr(U_n \ge \alpha n) \le \exp\!\left[\frac{n}{2}(1 - \alpha + \log \alpha)\right],$$
$$\Pr(U_n \le \beta n) \le \exp\!\left[\frac{n}{2}(1 - \beta + \log \beta)\right].$$
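
A Monte Carlo check of these chi-square tail bounds (the sample size and thresholds below are arbitrary choices of ours): the empirical tail frequencies should fall below the stated exponential bounds.

```python
import numpy as np

def upper_bound(n, a):
    """Bound on Pr(U_n >= a*n) for a > 1."""
    return np.exp((n / 2) * (1 - a + np.log(a)))

def lower_bound(n, b):
    """Bound on Pr(U_n <= b*n) for 0 < b < 1."""
    return np.exp((n / 2) * (1 - b + np.log(b)))

rng = np.random.default_rng(3)
n, trials = 10, 200_000
U = (rng.standard_normal((trials, n)) ** 2).sum(axis=1)  # samples of U_n

a, b = 2.0, 0.5
emp_hi = (U >= a * n).mean()
emp_lo = (U <= b * n).mean()
print(emp_hi <= upper_bound(n, a), emp_lo <= lower_bound(n, b))  # True True
```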

Slides 28-29: Sketch of Proof of the Theorem (Cont'd). Lemma 2. Let $H$ be an $n \times n$ positive semidefinite matrix and $r = \mathrm{rank}(H)$. Then, for any $\beta \in (0,1)$, we have
$$\Pr\big(H \bullet \hat X \le \beta\, \mathrm{Tr}(H)\big) \le r \exp\!\left[\frac{d}{2}(1 - \beta + \log \beta)\right]. \tag{1}$$
On the other hand, if $\beta$ satisfies $e\beta \log r \le 1/5$, then the above can be sharpened to
$$\Pr\big(H \bullet \hat X \le \beta\, \mathrm{Tr}(H)\big) \le (5e\beta/2)^{d/2}. \tag{2}$$
Note that (2) is independent of $r$! For any $\alpha > 1$, we have
$$\Pr\big(H \bullet \hat X \ge \alpha\, \mathrm{Tr}(H)\big) \le r \exp\!\left[\frac{d}{2}(1 - \alpha + \log \alpha)\right]. \tag{3}$$

Slide 30: Sketch of Proof of the Theorem (Cont'd). It is easy to establish (1) and (3) using Lemma 1. By applying the bounds (1) and (3) of Lemma 2 to each of $A_1, \dots, A_m$ and taking the union bound, we get the upper bound in the Theorem. However, the lower bound obtained this way is weaker. To obtain a better lower bound (for the region $1 \le d \le 2\log m/\log\log(2m)$), we use the bound (2) of Lemma 2. To prove it, consider the spectral decomposition $H = \sum_{k=1}^r \lambda_k v_k v_k^T$ with $\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_r > 0$.

Slide 31: Sketch of Proof of the Theorem (Cont'd). Recall that (2) says: if $\beta \in (0,1)$ satisfies $e\beta \log r \le 1/5$, then
$$\Pr\big(H \bullet \hat X \le \beta\, \mathrm{Tr}(H)\big) \le (5e\beta/2)^{d/2}.$$
First, by the spectral decomposition, we have
$$H \bullet \hat X = \left(\sum_{k=1}^r \lambda_k v_k v_k^T\right) \bullet \left(\sum_{j=1}^d \xi^j (\xi^j)^T\right) = \sum_{k=1}^r \lambda_k \sum_{j=1}^d (v_k^T \xi^j)^2.$$
Now, observe that the $(v_k^T \xi^j)_{k,j}$ are $N(0, 1/d)$ and mutually independent. Hence, $H \bullet \hat X$ has the same distribution as the weighted chi-square $\sum_{k=1}^r \lambda_k \sum_{j=1}^d \xi_{kj}^2$, where the $\xi_{kj}$ are i.i.d. $N(0, 1/d)$.
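
The identity $H \bullet \hat X = \sum_k \lambda_k \sum_j (v_k^T \xi^j)^2$ is exact algebra, so it can be verified numerically on a random instance (a NumPy sketch with our own test matrix and names):

```python
import numpy as np

rng = np.random.default_rng(5)
n, d = 8, 3

# A PSD H with known spectral decomposition H = sum_k lam_k v_k v_k^T.
B = rng.standard_normal((n, n))
H = B @ B.T
lam, V = np.linalg.eigh(H)          # columns of V are eigenvectors v_k

# The rounding matrix X_hat = sum_j xi^j (xi^j)^T with xi^j ~ N(0, I/d).
Xi = rng.standard_normal((n, d)) / np.sqrt(d)
X_hat = Xi @ Xi.T

lhs = np.trace(H @ X_hat)
rhs = sum(lam[k] * sum((V[:, k] @ Xi[:, j]) ** 2 for j in range(d))
          for k in range(n))
print(np.isclose(lhs, rhs))  # True: the identity holds up to round-off
```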

Slide 32: Sketch of Proof of the Theorem (Cont'd). Let $\bar\lambda_k = \lambda_k / \sum_{k=1}^r \lambda_k$. It then follows that
$$\Pr\big(H \bullet \hat X \le \beta\, \mathrm{Tr}(H)\big) = \Pr\left(\sum_{k=1}^r \bar\lambda_k \sum_{j=1}^d \xi_{kj}^2 \le \beta\right) =: p(r, \bar\lambda, \beta).$$
We now bound $p(r, \bar\lambda, \beta)$. On the one hand, by replacing all $\bar\lambda_k$ by the smallest one, $\bar\lambda_r$, and using the tail estimates of Lemma 1, we have
$$p(r, \bar\lambda, \beta) \le \Pr\left(\bar\lambda_r \sum_{k=1}^r \sum_{j=1}^d \xi_{kj}^2 \le \beta\right) \le \left(\frac{e\beta}{r \bar\lambda_r}\right)^{rd/2}.$$

Slide 33: On the other hand, by dropping the smallest term $\bar\lambda_r$ from the summation,
$$p(r, \bar\lambda, \beta) \le \Pr\left(\sum_{k=1}^{r-1} \sum_{j=1}^d \bar\lambda_k \xi_{kj}^2 \le \beta\right) = \Pr\left(\sum_{k=1}^{r-1} \sum_{j=1}^d \frac{\bar\lambda_k}{1 - \bar\lambda_r}\, \xi_{kj}^2 \le \frac{\beta}{1 - \bar\lambda_r}\right) = p\left(r-1,\ \frac{\bar\lambda_{1:r-1}}{1 - \bar\lambda_r},\ \frac{\beta}{1 - \bar\lambda_r}\right).$$

Slide 34: Sketch of Proof of the Theorem (Cont'd). By unrolling the recursive formula, we have
$$p(r, \bar\lambda, \beta) \le \min_{1 \le k \le r} \left\{\left(\frac{e\beta}{k \bar\lambda_k}\right)^{kd/2}\right\}.$$
Let $\gamma = p(r, \bar\lambda, \beta)^{2/d}$; note that $\gamma \in (0,1)$. From the above, we have $\bar\lambda_k \le (k\gamma^{1/k})^{-1} e\beta$ for $k = 1, \dots, r$. Upon summing over $k$ and using the fact that $\sum_{k=1}^r \bar\lambda_k = 1$, we obtain
$$e\beta \sum_{k=1}^r \frac{1}{k\gamma^{1/k}} \ge 1.$$

Slide 35: Sketch of Proof of the Theorem (Cont'd). Now,
$$\sum_{k=1}^r \frac{1}{k\gamma^{1/k}} \le \frac{1}{\gamma} + \int_1^r \frac{1}{t\gamma^{1/t}}\, dt = \frac{1}{\gamma} + \int_{\log(1/\gamma)/r}^{\log(1/\gamma)} \frac{e^t}{t}\, dt.$$
Then one can show that the above implies
$$\frac{1}{e\beta} \le \frac{2}{\gamma} + \log r.$$
Together with the assumption that $e\beta \log r \le 1/5$, we conclude that
$$5e\beta/2 \;\ge\; \gamma \;=\; \Pr\big(H \bullet \hat X \le \beta\, \mathrm{Tr}(H)\big)^{2/d},$$
as desired.

Slide 36: SDP with an Objective Function. Our result can also be used for solving an SDP with an objective function:
$$\min\ C \bullet X \quad \text{subject to } A_i \bullet X = b_i,\ i = 1, \dots, m;\ X \succeq 0.$$
When $\bar X$ is optimal, there must be a $(\bar S, \bar y)$ feasible for the dual such that $\bar S \bullet \bar X = 0$ (under a mild condition). One can treat $\bar S \bullet X = 0$ as an additional equality constraint. Thus, the rounding method will preserve $\bar S \bullet \hat X = 0$; that is, the low-rank $\hat X$ is optimal for a nearby problem to the original SDP with the identical objective.

Slide 37: Questions.
- Is there a deterministic algorithm? Choose the largest-$d$-eigenvalue component of $\bar X$?
- In practical applications, we see much smaller distortion. Why?
- Add a regularization objective to find a low-rank SDP solution?


More information

Preliminaries Overview OPF and Extensions. Convex Optimization. Lecture 8 - Applications in Smart Grids. Instructor: Yuanzhang Xiao

Preliminaries Overview OPF and Extensions. Convex Optimization. Lecture 8 - Applications in Smart Grids. Instructor: Yuanzhang Xiao Convex Optimization Lecture 8 - Applications in Smart Grids Instructor: Yuanzhang Xiao University of Hawaii at Manoa Fall 2017 1 / 32 Today s Lecture 1 Generalized Inequalities and Semidefinite Programming

More information

Theory of semidefinite programming for Sensor Network Localization

Theory of semidefinite programming for Sensor Network Localization Math. Program. Ser. B DOI 10.1007/s10107-006-0040-1 FULL LENGTH PAPER Theory of semidefinite programming for Sensor Network Localization Anthony Man-Cho So Yinyu Ye Received: 22 November 2004 / Accepted:

More information

A NEW SECOND-ORDER CONE PROGRAMMING RELAXATION FOR MAX-CUT PROBLEMS

A NEW SECOND-ORDER CONE PROGRAMMING RELAXATION FOR MAX-CUT PROBLEMS Journal of the Operations Research Society of Japan 2003, Vol. 46, No. 2, 164-177 2003 The Operations Research Society of Japan A NEW SECOND-ORDER CONE PROGRAMMING RELAXATION FOR MAX-CUT PROBLEMS Masakazu

More information

Approximation Algorithms

Approximation Algorithms Approximation Algorithms Chapter 26 Semidefinite Programming Zacharias Pitouras 1 Introduction LP place a good lower bound on OPT for NP-hard problems Are there other ways of doing this? Vector programs

More information

Nonconvex Quadratic Programming: Return of the Boolean Quadric Polytope

Nonconvex Quadratic Programming: Return of the Boolean Quadric Polytope Nonconvex Quadratic Programming: Return of the Boolean Quadric Polytope Kurt M. Anstreicher Dept. of Management Sciences University of Iowa Seminar, Chinese University of Hong Kong, October 2009 We consider

More information

LECTURE NOTE #11 PROF. ALAN YUILLE

LECTURE NOTE #11 PROF. ALAN YUILLE LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform

More information

Further Relaxations of the SDP Approach to Sensor Network Localization

Further Relaxations of the SDP Approach to Sensor Network Localization Further Relaxations of the SDP Approach to Sensor Network Localization Zizhuo Wang, Song Zheng, Stephen Boyd, and Yinyu Ye September 27, 2006; Revised May 2, 2007 Abstract Recently, a semi-definite programming

More information

Problem Set 2. Assigned: Mon. November. 23, 2015

Problem Set 2. Assigned: Mon. November. 23, 2015 Pseudorandomness Prof. Salil Vadhan Problem Set 2 Assigned: Mon. November. 23, 2015 Chi-Ning Chou Index Problem Progress 1 SchwartzZippel lemma 1/1 2 Robustness of the model 1/1 3 Zero error versus 1-sided

More information

Equivalent relaxations of optimal power flow

Equivalent relaxations of optimal power flow Equivalent relaxations of optimal power flow 1 Subhonmesh Bose 1, Steven H. Low 2,1, Thanchanok Teeraratkul 1, Babak Hassibi 1 1 Electrical Engineering, 2 Computational and Mathematical Sciences California

More information

IE 521 Convex Optimization

IE 521 Convex Optimization Lecture 14: and Applications 11th March 2019 Outline LP SOCP SDP LP SOCP SDP 1 / 21 Conic LP SOCP SDP Primal Conic Program: min c T x s.t. Ax K b (CP) : b T y s.t. A T y = c (CD) y K 0 Theorem. (Strong

More information

INTERIOR-POINT METHODS ROBERT J. VANDERBEI JOINT WORK WITH H. YURTTAN BENSON REAL-WORLD EXAMPLES BY J.O. COLEMAN, NAVAL RESEARCH LAB

INTERIOR-POINT METHODS ROBERT J. VANDERBEI JOINT WORK WITH H. YURTTAN BENSON REAL-WORLD EXAMPLES BY J.O. COLEMAN, NAVAL RESEARCH LAB 1 INTERIOR-POINT METHODS FOR SECOND-ORDER-CONE AND SEMIDEFINITE PROGRAMMING ROBERT J. VANDERBEI JOINT WORK WITH H. YURTTAN BENSON REAL-WORLD EXAMPLES BY J.O. COLEMAN, NAVAL RESEARCH LAB Outline 2 Introduction

More information

1 Regression with High Dimensional Data

1 Regression with High Dimensional Data 6.883 Learning with Combinatorial Structure ote for Lecture 11 Instructor: Prof. Stefanie Jegelka Scribe: Xuhong Zhang 1 Regression with High Dimensional Data Consider the following regression problem:

More information

On the local stability of semidefinite relaxations

On the local stability of semidefinite relaxations On the local stability of semidefinite relaxations Diego Cifuentes Department of Mathematics Massachusetts Institute of Technology Joint work with Sameer Agarwal (Google), Pablo Parrilo (MIT), Rekha Thomas

More information

The maximal stable set problem : Copositive programming and Semidefinite Relaxations

The maximal stable set problem : Copositive programming and Semidefinite Relaxations The maximal stable set problem : Copositive programming and Semidefinite Relaxations Kartik Krishnan Department of Mathematical Sciences Rensselaer Polytechnic Institute Troy, NY 12180 USA kartis@rpi.edu

More information

Approximation Algorithms for Homogeneous Polynomial Optimization with Quadratic Constraints

Approximation Algorithms for Homogeneous Polynomial Optimization with Quadratic Constraints Approximation Algorithms for Homogeneous Polynomial Optimization with Quadratic Constraints Simai HE Zhening LI Shuzhong ZHANG July 19, 010 Abstract In this paper, we consider approximation algorithms

More information

Convex Optimization M2

Convex Optimization M2 Convex Optimization M2 Lecture 8 A. d Aspremont. Convex Optimization M2. 1/57 Applications A. d Aspremont. Convex Optimization M2. 2/57 Outline Geometrical problems Approximation problems Combinatorial

More information

The Ellipsoid (Kachiyan) Method

The Ellipsoid (Kachiyan) Method Yinyu Ye, MS&E, Stanford MS&E310 Lecture Note: Ellipsoid Method 1 The Ellipsoid (Kachiyan) Method Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A.

More information

On the Power of Robust Solutions in Two-Stage Stochastic and Adaptive Optimization Problems

On the Power of Robust Solutions in Two-Stage Stochastic and Adaptive Optimization Problems MATHEMATICS OF OPERATIONS RESEARCH Vol. xx, No. x, Xxxxxxx 00x, pp. xxx xxx ISSN 0364-765X EISSN 156-5471 0x xx0x 0xxx informs DOI 10.187/moor.xxxx.xxxx c 00x INFORMS On the Power of Robust Solutions in

More information

Optimization in Wireless Communication

Optimization in Wireless Communication Zhi-Quan (Tom) Luo Department of Electrical and Computer Engineering University of Minnesota 200 Union Street SE Minneapolis, MN 55455 2007 NSF Workshop Challenges Optimization problems from wireless applications

More information

A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint

A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint Iranian Journal of Operations Research Vol. 2, No. 2, 20, pp. 29-34 A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint M. Salahi Semidefinite

More information

A new primal-dual path-following method for convex quadratic programming

A new primal-dual path-following method for convex quadratic programming Volume 5, N., pp. 97 0, 006 Copyright 006 SBMAC ISSN 00-805 www.scielo.br/cam A new primal-dual path-following method for convex quadratic programming MOHAMED ACHACHE Département de Mathématiques, Faculté

More information

Approximation norms and duality for communication complexity lower bounds

Approximation norms and duality for communication complexity lower bounds Approximation norms and duality for communication complexity lower bounds Troy Lee Columbia University Adi Shraibman Weizmann Institute From min to max The cost of a best algorithm is naturally phrased

More information

A Linear Round Lower Bound for Lovasz-Schrijver SDP Relaxations of Vertex Cover

A Linear Round Lower Bound for Lovasz-Schrijver SDP Relaxations of Vertex Cover A Linear Round Lower Bound for Lovasz-Schrijver SDP Relaxations of Vertex Cover Grant Schoenebeck Luca Trevisan Madhur Tulsiani Abstract We study semidefinite programming relaxations of Vertex Cover arising

More information

15. Conic optimization

15. Conic optimization L. Vandenberghe EE236C (Spring 216) 15. Conic optimization conic linear program examples modeling duality 15-1 Generalized (conic) inequalities Conic inequality: a constraint x K where K is a convex cone

More information

ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis

ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis Lecture 7: Matrix completion Yuejie Chi The Ohio State University Page 1 Reference Guaranteed Minimum-Rank Solutions of Linear

More information

POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS

POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS Sercan Yıldız syildiz@samsi.info in collaboration with Dávid Papp (NCSU) OPT Transition Workshop May 02, 2017 OUTLINE Polynomial optimization and

More information

Applications of Robust Optimization in Signal Processing: Beamforming and Power Control Fall 2012

Applications of Robust Optimization in Signal Processing: Beamforming and Power Control Fall 2012 Applications of Robust Optimization in Signal Processing: Beamforg and Power Control Fall 2012 Instructor: Farid Alizadeh Scribe: Shunqiao Sun 12/09/2012 1 Overview In this presentation, we study the applications

More information

Semidefinite Programming Basics and Applications

Semidefinite Programming Basics and Applications Semidefinite Programming Basics and Applications Ray Pörn, principal lecturer Åbo Akademi University Novia University of Applied Sciences Content What is semidefinite programming (SDP)? How to represent

More information

Conic Linear Optimization and its Dual. yyye

Conic Linear Optimization and its Dual.   yyye Conic Linear Optimization and Appl. MS&E314 Lecture Note #04 1 Conic Linear Optimization and its Dual Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A.

More information

Constructing Explicit RIP Matrices and the Square-Root Bottleneck

Constructing Explicit RIP Matrices and the Square-Root Bottleneck Constructing Explicit RIP Matrices and the Square-Root Bottleneck Ryan Cinoman July 18, 2018 Ryan Cinoman Constructing Explicit RIP Matrices July 18, 2018 1 / 36 Outline 1 Introduction 2 Restricted Isometry

More information

MIT Algebraic techniques and semidefinite optimization February 14, Lecture 3

MIT Algebraic techniques and semidefinite optimization February 14, Lecture 3 MI 6.97 Algebraic techniques and semidefinite optimization February 4, 6 Lecture 3 Lecturer: Pablo A. Parrilo Scribe: Pablo A. Parrilo In this lecture, we will discuss one of the most important applications

More information

w Kluwer Academic Publishers Boston/Dordrecht/London HANDBOOK OF SEMIDEFINITE PROGRAMMING Theory, Algorithms, and Applications

w Kluwer Academic Publishers Boston/Dordrecht/London HANDBOOK OF SEMIDEFINITE PROGRAMMING Theory, Algorithms, and Applications HANDBOOK OF SEMIDEFINITE PROGRAMMING Theory, Algorithms, and Applications Edited by Henry Wolkowicz Department of Combinatorics and Optimization Faculty of Mathematics University of Waterloo Waterloo,

More information

8 Approximation Algorithms and Max-Cut

8 Approximation Algorithms and Max-Cut 8 Approximation Algorithms and Max-Cut 8. The Max-Cut problem Unless the widely believed P N P conjecture is false, there is no polynomial algorithm that can solve all instances of an NP-hard problem.

More information

Notes taken by Costis Georgiou revised by Hamed Hatami

Notes taken by Costis Georgiou revised by Hamed Hatami CSC414 - Metric Embeddings Lecture 6: Reductions that preserve volumes and distance to affine spaces & Lower bound techniques for distortion when embedding into l Notes taken by Costis Georgiou revised

More information

Explicit Sensor Network Localization using Semidefinite Programming and Clique Reductions

Explicit Sensor Network Localization using Semidefinite Programming and Clique Reductions Explicit Sensor Network Localization using Semidefinite Programming and Clique Reductions Nathan Krislock and Henry Wolkowicz Dept. of Combinatorics and Optimization University of Waterloo Haifa Matrix

More information

Relaxations and Randomized Methods for Nonconvex QCQPs

Relaxations and Randomized Methods for Nonconvex QCQPs Relaxations and Randomized Methods for Nonconvex QCQPs Alexandre d Aspremont, Stephen Boyd EE392o, Stanford University Autumn, 2003 Introduction While some special classes of nonconvex problems can be

More information

Lecture 1. 1 Conic programming. MA 796S: Convex Optimization and Interior Point Methods October 8, Consider the conic program. min.

Lecture 1. 1 Conic programming. MA 796S: Convex Optimization and Interior Point Methods October 8, Consider the conic program. min. MA 796S: Convex Optimization and Interior Point Methods October 8, 2007 Lecture 1 Lecturer: Kartik Sivaramakrishnan Scribe: Kartik Sivaramakrishnan 1 Conic programming Consider the conic program min s.t.

More information

Semidefinite Programming

Semidefinite Programming Chapter 2 Semidefinite Programming 2.0.1 Semi-definite programming (SDP) Given C M n, A i M n, i = 1, 2,..., m, and b R m, the semi-definite programming problem is to find a matrix X M n for the optimization

More information

1 The linear algebra of linear programs (March 15 and 22, 2015)

1 The linear algebra of linear programs (March 15 and 22, 2015) 1 The linear algebra of linear programs (March 15 and 22, 2015) Many optimization problems can be formulated as linear programs. The main features of a linear program are the following: Variables are real

More information

Faster Johnson-Lindenstrauss style reductions

Faster Johnson-Lindenstrauss style reductions Faster Johnson-Lindenstrauss style reductions Aditya Menon August 23, 2007 Outline 1 Introduction Dimensionality reduction The Johnson-Lindenstrauss Lemma Speeding up computation 2 The Fast Johnson-Lindenstrauss

More information

Distance Geometry-Matrix Completion

Distance Geometry-Matrix Completion Christos Konaxis Algs in Struct BioInfo 2010 Outline Outline Tertiary structure Incomplete data Tertiary structure Measure diffrence of matched sets Def. Root Mean Square Deviation RMSD = 1 n x i y i 2,

More information

Solving Corrupted Quadratic Equations, Provably

Solving Corrupted Quadratic Equations, Provably Solving Corrupted Quadratic Equations, Provably Yuejie Chi London Workshop on Sparse Signal Processing September 206 Acknowledgement Joint work with Yuanxin Li (OSU), Huishuai Zhuang (Syracuse) and Yingbin

More information

Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations

Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations Dian Mo Department of Electrical and Computer Engineering University of Massachusetts Amherst, MA 3 mo@umass.edu Marco F.

More information

Inderjit Dhillon The University of Texas at Austin

Inderjit Dhillon The University of Texas at Austin Inderjit Dhillon The University of Texas at Austin ( Universidad Carlos III de Madrid; 15 th June, 2012) (Based on joint work with J. Brickell, S. Sra, J. Tropp) Introduction 2 / 29 Notion of distance

More information

Lecture 8: The Goemans-Williamson MAXCUT algorithm

Lecture 8: The Goemans-Williamson MAXCUT algorithm IU Summer School Lecture 8: The Goemans-Williamson MAXCUT algorithm Lecturer: Igor Gorodezky The Goemans-Williamson algorithm is an approximation algorithm for MAX-CUT based on semidefinite programming.

More information

Further Relaxations of the SDP Approach to Sensor Network Localization

Further Relaxations of the SDP Approach to Sensor Network Localization Further Relaxations of the SDP Approach to Sensor Network Localization Zizhuo Wang, Song Zheng, Stephen Boyd, and inyu e September 27, 2006 Abstract Recently, a semidefinite programming (SDP) relaxation

More information