Semidefinite Programming, Combinatorial Optimization and Real Algebraic Geometry


Semidefinite Programming, Combinatorial Optimization and Real Algebraic Geometry
Assoc. Prof., Ph.D., UNM - Faculty of Information Studies
Edinburgh, 16 September 2014

Outline: Introduction, Definition of SDP, Dual theory

Some notation
- $I$ denotes the identity matrix.
- For $X \in \mathbb{R}^{m \times n}$: $x = \mathrm{vec}(X) \in \mathbb{R}^{mn}$.
- $\langle x, y \rangle = x^T y = \sum_i x_i y_i$.
- $\langle A, B \rangle = \mathrm{trace}(A^T B) = \sum_{i,j} a_{ij} b_{ij}$.

Some notation (cont.)
- Symmetric matrices: $\mathcal{S}^n := \{X \in \mathbb{R}^{n \times n} : X^T = X\}$.
- Cone of positive semidefinite matrices: $\mathcal{S}^n_+ := \{X \in \mathcal{S}^n : u^T X u \ge 0\ \forall u \in \mathbb{R}^n\}$, a closed convex pointed cone.
- Positive definite matrices: $\mathcal{S}^n_{++} := \{X \in \mathcal{S}^n : u^T X u > 0\ \forall u \in \mathbb{R}^n,\ u \neq 0\}$.
- The dual cone: $(\mathcal{S}^n_+)^* = \{Y \in \mathcal{S}^n : \langle Y, X \rangle \ge 0\ \forall X \in \mathcal{S}^n_+\} = \mathcal{S}^n_+$, i.e. the cone is self-dual. (Note: $\inf_{X \succeq 0} \langle X, Y \rangle = 0 \iff Y \succeq 0$.)
- For $X \in \mathcal{S}^n_+$ we write $X \succeq 0$.

Few properties of PSD matrices
For $A \in \mathcal{S}^n$ the following are equivalent:
- $A \in \mathcal{S}^n_+$;
- all eigenvalues of $A$ are nonnegative real numbers;
- there exist an orthogonal $P$ and $D = \mathrm{Diag}(d)$, $d \ge 0$, such that $A = P D P^T$;
- $\det A_{II} \ge 0$ for every principal submatrix $A_{II}$ ($A_{II} = [a_{ij}]_{i,j \in I}$ is a principal submatrix for $I \subseteq \{1, \dots, n\}$);
- $B A B^T \in \mathcal{S}^n_+$ for some nonsingular $B$.
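A quick numerical sanity check of these characterizations (a sketch in NumPy; the test matrix and the tolerances are illustrative, not from the slides):

```python
import numpy as np
from itertools import combinations

A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])   # a symmetric test matrix

# (1) all eigenvalues nonnegative
eig_ok = np.all(np.linalg.eigvalsh(A) >= -1e-12)

# (2) A = P D P^T with P orthogonal and D = Diag(d), d >= 0
d, P = np.linalg.eigh(A)
fact_ok = np.allclose(P @ np.diag(d) @ P.T, A) and np.all(d >= -1e-12)

# (3) det(A_II) >= 0 for every principal submatrix A_II
minor_ok = all(np.linalg.det(A[np.ix_(I, I)]) >= -1e-12
               for r in range(1, A.shape[0] + 1)
               for I in combinations(range(A.shape[0]), r))

print(eig_ok, fact_ok, minor_ok)   # all True for a PSD matrix
```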

Few properties of PSD matrices (cont.)
- For $X, Y \in \mathcal{S}^n_+$: $\langle X, Y \rangle \ge 0$, and $\langle X, Y \rangle = 0 \iff XY = 0$.
- Schur complement: if $A \in \mathcal{S}^n_{++}$, then
$$\begin{bmatrix} A & B \\ B^T & C \end{bmatrix} \succeq 0 \iff C - B^T A^{-1} B \succeq 0.$$
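A small numerical illustration of the Schur complement criterion (a sketch; the blocks below are made up for the example):

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])        # A positive definite
B = np.array([[1.0], [2.0]])
for c in (2.0, 1.0):                          # two choices of the 1x1 block C
    C = np.array([[c]])
    M = np.block([[A, B], [B.T, C]])
    schur = C - B.T @ np.linalg.inv(A) @ B
    psd_M = np.all(np.linalg.eigvalsh(M) >= -1e-12)
    psd_S = np.all(np.linalg.eigvalsh(schur) >= -1e-12)
    print(c, psd_M, psd_S)                    # the two flags always agree
```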

Primal and dual SDP
Primal semidefinite programming problem (PSDP):
$$\inf\ \langle C, X \rangle \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i,\ i = 1, \dots, m, \quad X \in \mathcal{S}^n_+.$$
Dual semidefinite programming problem (DSDP):
$$\sup\ b^T y \quad \text{s.t.} \quad \sum_{i=1}^m y_i A_i + Z = C, \quad Z \in \mathcal{S}^n_+.$$
PSDP and DSDP are duals of each other.
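A minimal sketch of this primal-dual pair in CVXPY (CVXPY is not mentioned on the slides; the random data below is purely illustrative and is chosen so that both problems are strictly feasible):

```python
import numpy as np
import cvxpy as cp

n, m = 3, 2
rng = np.random.default_rng(0)
A = [rng.standard_normal((n, n)) for _ in range(m)]
A = [(Ai + Ai.T) / 2 for Ai in A]                  # symmetric data matrices
G = rng.standard_normal((n, n))
C = G @ G.T + np.eye(n)                            # C positive definite -> dual strictly feasible (y = 0, Z = C)
X0 = np.eye(n)                                     # a positive definite primal point
b = np.array([np.trace(Ai @ X0) for Ai in A])      # b chosen so that X0 is strictly feasible

# Primal: inf <C,X>  s.t.  <A_i,X> = b_i, X PSD
X = cp.Variable((n, n), PSD=True)
primal = cp.Problem(cp.Minimize(cp.trace(C @ X)),
                    [cp.trace(A[i] @ X) == b[i] for i in range(m)])
primal.solve()

# Dual: sup b^T y  s.t.  sum_i y_i A_i + Z = C, Z PSD
y = cp.Variable(m)
Z = cp.Variable((n, n), PSD=True)
dual = cp.Problem(cp.Maximize(b @ y),
                  [sum(y[i] * A[i] for i in range(m)) + Z == C])
dual.solve()

print(primal.value, dual.value)   # (approximately) equal: both problems are strictly feasible
```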

Example 1: linear programming (LP)
Linear programming problem (LP) in standard primal form:
$$\min\ c^T x \quad \text{s.t.} \quad Ax = b,\ x \ge 0.$$
LP as PSDP: set $C = \mathrm{Diag}(c)$, $X = \mathrm{Diag}(x)$, $A_i = \mathrm{Diag}(A(i,:))$:
$$\min\ \langle C, X \rangle \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i,\ 1 \le i \le m, \quad X \in \mathcal{S}^n_+.$$
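A quick check of this reduction on a toy LP (a sketch; the data and solver choices are illustrative). Although the SDP lets $X$ range over all PSD matrices, only its diagonal enters the objective and constraints, so the two optimal values coincide:

```python
import numpy as np
import cvxpy as cp
from scipy.optimize import linprog

c = np.array([1.0, 2.0, 3.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

# LP: min c^T x  s.t.  Ax = b, x >= 0
lp = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3)

# The same problem as an SDP with diagonal data matrices
X = cp.Variable((3, 3), PSD=True)
constraints = [cp.trace(np.diag(A[i]) @ X) == b[i] for i in range(A.shape[0])]
sdp = cp.Problem(cp.Minimize(cp.trace(np.diag(c) @ X)), constraints)
sdp.solve()

print(lp.fun, sdp.value)   # both are 1.0 (up to solver tolerance)
```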

Example 2: convex quadratic programming
Def. Convex quadratic programming problem (CCP):
$$\inf\ f_0(x) \quad \text{s.t.} \quad f_i(x) \le 0,\ i = 1, \dots, m, \qquad \text{(CCP)}$$
where $f_i(x) = x^T U_i x - v_i^T x - z_i$ with $U_i \succeq 0$.
Rem. Note that
$$f_i(x) \le 0 \iff A_i(x) := \begin{bmatrix} I & U_i^{1/2} x \\ (U_i^{1/2} x)^T & v_i^T x + z_i \end{bmatrix} \succeq 0,$$
and $A_i(x)$ is affine in $x$:
$$A_i(x) = \begin{bmatrix} I & 0 \\ 0 & z_i \end{bmatrix} + \sum_{k=1}^n x_k \begin{bmatrix} 0 & U_i^{1/2}(:,k) \\ U_i^{1/2}(k,:) & v_{ik} \end{bmatrix}.$$

Example 2: convex quadratic programming (cont.)
Finding the minimum of $f_0(x)$ is equivalent to finding the minimum of $t$ under the additional constraint $f_0(x) \le t$. Hence (CCP) is equivalent to
$$\inf\ t \quad \text{s.t.} \quad \mathrm{Diag}\big(A_0(x), A_1(x), \dots, A_m(x)\big) \succeq 0,$$
where
$$A_0(x) = \begin{bmatrix} I & U_0^{1/2} x \\ (U_0^{1/2} x)^T & v_0^T x + z_0 + t \end{bmatrix}.$$
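A numerical sanity check of the Schur complement argument behind this LMI (a sketch with made-up data $U_0$, $v_0$, $z_0$): the block matrix $A_0(x)$ is PSD exactly when $f_0(x) \le t$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
G = rng.standard_normal((n, n))
U0 = G @ G.T                                  # U0 positive semidefinite
v0 = rng.standard_normal(n)
z0 = 0.5
lam, V = np.linalg.eigh(U0)
U0_sqrt = V @ np.diag(np.sqrt(np.maximum(lam, 0))) @ V.T   # symmetric square root

f0 = lambda x: x @ U0 @ x - v0 @ x - z0

def A0(x, t):
    """The LMI block encoding the epigraph constraint f0(x) <= t."""
    u = U0_sqrt @ x
    return np.block([[np.eye(n), u[:, None]],
                     [u[None, :], np.array([[v0 @ x + z0 + t]])]])

x = rng.standard_normal(n)
for t in (f0(x) + 0.1, f0(x) - 0.1):          # just above / just below f0(x)
    psd = np.all(np.linalg.eigvalsh(A0(x, t)) >= -1e-9)
    print(t >= f0(x), psd)                    # the two booleans agree
```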

Weak duality
Define the linear operator $\mathcal{A}(X) := (\langle A_1, X \rangle, \dots, \langle A_m, X \rangle)^T$, so the primal constraints read $\mathcal{A}(X) = b$, and its adjoint $\mathcal{A}^T(y) := \sum_i y_i A_i$. Then
$$\mathrm{OPT}_P = \inf\{\langle C, X \rangle : \mathcal{A}(X) = b,\ X \succeq 0\}, \qquad \mathrm{OPT}_D = \sup\{b^T y : \mathcal{A}^T(y) + Z = C,\ Z \succeq 0,\ y \in \mathbb{R}^m\}.$$
Conventions: $\sup \emptyset = -\infty$, $\inf \emptyset = +\infty$.
Thm.: $\mathrm{OPT}_P \ge \mathrm{OPT}_D$.
Proof: for any primal feasible $X$ and dual feasible $(y, Z)$,
$$\langle C, X \rangle - b^T y = \langle \mathcal{A}^T(y) + Z, X \rangle - b^T y = \sum_i y_i \langle A_i, X \rangle + \langle Z, X \rangle - b^T y = \langle X, Z \rangle \ge 0.$$
In particular, the duality gap at optimal solutions is $\mathrm{OPT}_P - \mathrm{OPT}_D = \langle C, X_{\mathrm{opt}} \rangle - b^T y_{\mathrm{opt}} = \langle X_{\mathrm{opt}}, Z_{\mathrm{opt}} \rangle$.
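The key identity in the proof, $\langle C, X \rangle - b^T y = \langle X, Z \rangle$, is easy to verify numerically (a sketch with random feasible points constructed backwards from the data):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 3
A = [rng.standard_normal((n, n)) for _ in range(m)]
A = [(Ai + Ai.T) / 2 for Ai in A]

X = rng.standard_normal((n, n)); X = X @ X.T          # primal feasible point (PSD)
y = rng.standard_normal(m)
Z = rng.standard_normal((n, n)); Z = Z @ Z.T          # dual slack (PSD)
C = sum(yi * Ai for yi, Ai in zip(y, A)) + Z          # choose C so that A^T(y) + Z = C
b = np.array([np.trace(Ai @ X) for Ai in A])          # choose b so that A(X) = b

gap = np.trace(C @ X) - b @ y
print(np.isclose(gap, np.trace(X @ Z)), gap >= 0)     # True True
```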

Strong duality
Def.: PSDP is strictly feasible if there exists $X \in \mathcal{S}^n_{++}$ such that $\mathcal{A}(X) = b$. DSDP is strictly feasible if there exists a pair $(y, Z) \in \mathbb{R}^m \times \mathcal{S}^n_{++}$ such that $\mathcal{A}^T(y) + Z = C$.
Thm.: Let DSDP be strictly feasible. Then $\mathrm{OPT}_P = \mathrm{OPT}_D$. If moreover $\mathrm{OPT}_P < \infty$, then there exists $X \succeq 0$ such that $\mathcal{A}(X) = b$ and $\langle C, X \rangle = \mathrm{OPT}_P$, i.e. the primal optimum is attained.

Example 3: the optimum is not attained
$$\min\ \left\langle \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, X \right\rangle \quad \text{s.t.} \quad \left\langle \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, X \right\rangle = 2, \quad X \succeq 0.$$
Feasible solutions: $X = \begin{bmatrix} x_{11} & 1 \\ 1 & x_{22} \end{bmatrix}$ with $x_{11}, x_{22} > 0$ and $x_{11} x_{22} \ge 1$. Hence $\mathrm{OPT}_P = 0$, but $\mathrm{OPT}_P$ is not attained.
Interpretation: DSDP has no strictly feasible solution, so the strong duality theorem does not guarantee attainment of the primal optimum.
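A tiny numerical illustration (a sketch): along the feasible family $X(\varepsilon) = \begin{bmatrix} \varepsilon & 1 \\ 1 & 1/\varepsilon \end{bmatrix}$ the objective value equals $\varepsilon$, which can be made arbitrarily small but never $0$.

```python
import numpy as np

C = np.array([[1.0, 0.0], [0.0, 0.0]])
for eps in (1.0, 0.1, 1e-3, 1e-6):
    X = np.array([[eps, 1.0], [1.0, 1.0 / eps]])
    feasible = np.all(np.linalg.eigvalsh(X) >= -1e-12) and np.isclose(2 * X[0, 1], 2.0)
    print(eps, feasible, np.trace(C @ X))     # objective equals eps, always > 0
```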

Example 4: positive duality gap
$$\min\ \left\langle \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}, X \right\rangle \quad \text{s.t.} \quad \left\langle \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix}, X \right\rangle = 0, \quad \left\langle \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 2 \end{bmatrix}, X \right\rangle = 2, \quad X \succeq 0.$$
Here $\mathrm{OPT}_P = 1$ and $\mathrm{OPT}_D = 0$.

Optimality conditions for SDP
Let PSDP and DSDP be strictly feasible.
Thm.: $X^*$ and $(y^*, Z^*)$ are optimal for PSDP and DSDP, respectively, if and only if:
- (primal feasibility) $\mathcal{A}(X^*) = b$, $X^* \succeq 0$,
- (dual feasibility) $\mathcal{A}^T(y^*) + Z^* = C$, $Z^* \succeq 0$,
- (zero duality gap) $X^* Z^* = 0$ (equivalently $b^T y^* = \langle C, X^* \rangle$).

Central path
Assumptions:
- the matrices $A_i$, $i = 1, \dots, m$, are linearly independent,
- PSDP and DSDP are strictly feasible.
Then, for every $\mu > 0$, the system
- (primal feasibility) $\mathcal{A}(X) = b$, $X \succeq 0$,
- (dual feasibility) $\mathcal{A}^T(y) + Z = C$, $Z \succeq 0$,
- $XZ = \mu I$
has a unique solution $(X_\mu, y_\mu, Z_\mu)$.
Def.: The central path is the set $\{(X_\mu, y_\mu, Z_\mu) : \mu > 0\}$.

Example
PSDP:
$$\min\ \left\langle \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, X \right\rangle \quad \text{s.t.} \quad \left\langle \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, X \right\rangle = 2, \quad X \succeq 0.$$
DSDP:
$$\max\ 2y \quad \text{s.t.} \quad Z = \begin{bmatrix} 1 & -y \\ -y & 1 \end{bmatrix} \succeq 0.$$

The central path (primal part)
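For this example the central path can be written in closed form: feasible points have $x_{12} = 1$ and $Z = \begin{bmatrix} 1 & -y \\ -y & 1 \end{bmatrix}$, and $XZ = \mu I$ holds exactly when $x_{11} = x_{22} = 1/y$ and $1/y - y = \mu$. The sketch below (the closed-form formulas are derived here, not taken from the slides) evaluates a few points and shows convergence to the optimal pair $X^* = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$, $y^* = 1$ as $\mu \to 0$.

```python
import numpy as np

A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
for mu in (1.0, 0.1, 0.01, 1e-4):
    y = (-mu + np.sqrt(mu**2 + 4)) / 2          # positive root of 1/y - y = mu
    X = np.array([[1 / y, 1.0], [1.0, 1 / y]])
    Z = np.array([[1.0, -y], [-y, 1.0]])
    assert np.isclose(np.trace(A1 @ X), 2.0)    # primal feasibility
    assert np.allclose(X @ Z, mu * np.eye(2))   # centrality: XZ = mu*I
    print(mu, y, X[0, 0])                       # x11 -> 1 and y -> 1 as mu -> 0
```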

The properties of the central path
Thm.: The central path is a smooth curve, parameterized by $\mu$.
Thm.: The central path converges: $\lim_{\mu \to 0} (X_\mu, y_\mu, Z_\mu) = (X^*, y^*, Z^*)$, and the limit point is a maximally complementary primal-dual optimal solution.
Def.: A primal-dual optimal solution $(X^*, y^*, Z^*)$ is maximally complementary if $\mathrm{rank}(X^*)$ is maximal among all primal optimal solutions and $\mathrm{rank}(Z^*)$ is maximal among all dual optimal solutions.
Proof: see e.g. H. Wolkowicz, R. Saigal, L. Vandenberghe (eds.): Handbook of Semidefinite Programming. Kluwer Academic Publishers, Boston-Dordrecht-London, 2000.

Path following IPMs
Idea: we solve the equations defining the central path only approximately, i.e. we follow the central path approximately.

Path following IPMs - more precisely
Input: $A_1, \dots, A_m \in \mathcal{S}^n$, $b \in \mathbb{R}^m$, $C \in \mathcal{S}^n$, $\varepsilon > 0$, $\sigma \in (0,1)$, and $(X_0, y_0, Z_0)$ strictly feasible for PSDP and DSDP.
1. Set $k := 0$.
2. Repeat:
   2.1 $\mu_k = \langle X_k, Z_k \rangle / n$.
   2.2 Solve for $(\Delta X, \Delta y, \Delta Z)$:
       $\mathcal{A}(X_k + \Delta X) = b$,
       $\mathcal{A}^T(y_k + \Delta y) + Z_k + \Delta Z = C$,
       $(X_k + \Delta X)(Z_k + \Delta Z) = \sigma \mu_k I$.
   2.3 $\Delta X := (\Delta X + \Delta X^T)/2$.
   2.4 $(X_{k+1}, y_{k+1}, Z_{k+1}) := (X_k + \Delta X, y_k + \Delta y, Z_k + \Delta Z)$.
   2.5 $k := k + 1$.
3. Until $\langle X_k, Z_k \rangle \le \varepsilon$.
Output: $(X_k, y_k, Z_k)$.

Solving the system
Note that the starting point is strictly feasible, so the linearized system (dropping the $\Delta X\, \Delta Z$ term) reads
$$\mathcal{A}(\Delta X) = 0, \qquad \mathcal{A}^T(\Delta y) + \Delta Z = 0, \qquad X_k \Delta Z + \Delta X\, Z_k = \sigma \mu_k I - X_k Z_k.$$
Writing $\mu := \sigma \mu_k$, we eliminate the unknowns:
FIRSTLY: $\Delta Z = -\mathcal{A}^T(\Delta y)$.
THEN: $\Delta X = (\mu I - X_k Z_k - X_k \Delta Z) Z_k^{-1}$, afterwards symmetrized as $\Delta X := (\Delta X + \Delta X^T)/2$.
FINALLY: substituting into $\mathcal{A}(\Delta X) = 0$ gives a linear system for $\Delta y$:
$$\mathcal{A}\big(X_k \mathcal{A}^T(\Delta y) Z_k^{-1}\big) = \mathcal{A}(X_k) - \mu\, \mathcal{A}(Z_k^{-1}) = b - \mu\, \mathcal{A}(Z_k^{-1}).$$

Solving the system: bottlenecks
At each iteration we have to:
- compute one inverse ($Z_k^{-1}$),
- form the system matrix $M$ of the equation $M \Delta y = b - \mu \mathcal{A}(Z_k^{-1})$; each entry is $m_{ij} = \langle A_i, X_k A_j Z_k^{-1} \rangle$, and forming $M$ takes $O(m^2 n^2 + m n^3)$ flops,
- solve the system $M \Delta y = b - \mu \mathcal{A}(Z_k^{-1})$ ($O(m^3)$ flops),
- compute $\Delta Z$ and $\Delta X$ ($O(m n^2 + n^3)$ flops).
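A compact NumPy sketch of this path-following scheme for dense data (the function name, the damped step-length safeguard and the tolerances are my own additions, not part of the slides, which take full steps):

```python
import numpy as np

def solve_sdp_ipm(C, A_list, b, X, y, Z, sigma=0.3, eps=1e-8, max_iter=100):
    """Toy dense path-following IPM following the scheme on the previous slides.

    (X, y, Z) must be strictly feasible: A(X) = b, A^T(y) + Z = C, X and Z positive definite.
    """
    n, m = C.shape[0], len(A_list)
    A_op  = lambda M: np.array([np.trace(Ai @ M) for Ai in A_list])      # A(M)
    A_adj = lambda v: sum(vi * Ai for vi, Ai in zip(v, A_list))          # A^T(v)

    for _ in range(max_iter):
        gap = np.trace(X @ Z)
        if gap <= eps:
            break
        mu = sigma * gap / n                 # target value sigma * mu_k
        Zinv = np.linalg.inv(Z)

        # Schur complement system  M dy = b - mu * A(Z^{-1}),  m_ij = <A_i, X A_j Z^{-1}>.
        M = np.array([[np.trace(Ai @ X @ Aj @ Zinv) for Aj in A_list] for Ai in A_list])
        dy = np.linalg.solve(M, b - mu * A_op(Zinv))

        dZ = -A_adj(dy)
        dX = (mu * np.eye(n) - X @ Z - X @ dZ) @ Zinv
        dX = (dX + dX.T) / 2                 # symmetrize

        # Damped step to keep X and Z positive definite (a safeguard added here).
        alpha = 1.0
        while alpha > 1e-10:
            if (np.all(np.linalg.eigvalsh(X + alpha * dX) > 0)
                    and np.all(np.linalg.eigvalsh(Z + alpha * dZ) > 0)):
                break
            alpha /= 2
        X, y, Z = X + alpha * dX, y + alpha * dy, Z + alpha * dZ

    return X, y, Z

# The 2x2 example from the slides: min <I, X> s.t. <A1, X> = 2, X PSD.
C  = np.eye(2)
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
X0 = np.array([[2.0, 1.0], [1.0, 2.0]])      # strictly feasible primal start, <A1, X0> = 2
y0, Z0 = np.zeros(1), np.eye(2)              # strictly feasible dual start, A^T(y0) + Z0 = C
X, y, Z = solve_sdp_ipm(C, [A1], np.array([2.0]), X0, y0, Z0)
print(np.trace(C @ X), y)                    # close to the optimal value 2 and y* = 1
```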

Theoretical guarantee
Thm.: Let $(X_0, y_0, Z_0)$ be a strictly feasible starting point close to the central path in the sense that $\| Z^{1/2} X Z^{1/2} - \mu I \| \le \theta \mu$, where $\mu = \langle X, Z \rangle / n$. Then the described path-following IPM produces an $\varepsilon$-optimal solution in at most $(n/\delta) \log\big(\langle X_0, Z_0 \rangle / \varepsilon\big)$ iterations, where $\delta > 0$ is a constant depending only on $\sigma$ and $\theta$ (for standard choices of $\sigma$ and $\theta$ this yields an $O(\sqrt{n} \log(1/\varepsilon))$ iteration bound).

(Povh, Rendl, Wiegele, 2005)
Relaxed optimality conditions:
- (primal feasibility) $\mathcal{A}(X) \approx b$, $X \succeq 0$,
- (dual feasibility) $\mathcal{A}^T(y) + Z \approx C$, $Z \succeq 0$,
- (zero duality gap) $XZ = 0$.

Idea behind the augmented Lagrangian approach to DSDP
$$\sup\{ b^T y : \mathcal{A}^T(y) + Z = C,\ Z \in \mathcal{S}^n_+ \}.$$
Replace DSDP by the augmented Lagrangian subproblem DSDP-L (with multiplier $X$ and penalty parameter $\sigma > 0$):
$$\min_{y,\ Z \succeq 0}\ -b^T y + \langle X, \mathcal{A}^T(y) + Z - C \rangle + \frac{\sigma}{2}\, \| \mathcal{A}^T(y) + Z - C \|^2,$$
written here with signs consistent with the multiplier update in the algorithm below.

BPM: algorithm
Input: $A_1, \dots, A_m$, $b$, $C \in \mathcal{S}^n$.
1. Select $\sigma > 0$, $X_0 \succeq 0$.
2. $k := 0$.
3. Repeat:
   3.1 Solve DSDP-L approximately to get $y_k$ and $Z_k \succeq 0$.
   3.2 Update $X_{k+1} := X_k + \sigma\big(Z_k - C + \mathcal{A}^T(y_k)\big)$.
   3.3 $k := k + 1$.
4. Until the stopping criterion is satisfied.
Output: $X_k, y_k, Z_k$.
Efficient idea: solve DSDP-L by alternating minimization over $y$ and $Z$ (see the sketch below).
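A NumPy sketch of this boundary point / augmented Lagrangian iteration with a single alternation over $y$ and $Z$ per outer step (the closed-form subproblem solutions, the stopping test and all names are my own reconstruction under the sign conventions fixed above, not code from the slides): the $y$-step is a linear system with the Gram matrix of the $A_i$, and the $Z$-step is a projection onto the PSD cone.

```python
import numpy as np

def psd_projection(W):
    """Projection of a symmetric matrix onto the PSD cone."""
    lam, U = np.linalg.eigh(W)
    return U @ np.diag(np.maximum(lam, 0.0)) @ U.T

def boundary_point_sketch(C, A_list, b, sigma=1.0, tol=1e-8, max_iter=2000):
    n, m = C.shape[0], len(A_list)
    A_op  = lambda M: np.array([np.trace(Ai @ M) for Ai in A_list])
    A_adj = lambda v: sum(vi * Ai for vi, Ai in zip(v, A_list))
    gram = np.array([[np.trace(Ai @ Aj) for Aj in A_list] for Ai in A_list])  # A A^T
    X, Z, y = np.zeros((n, n)), np.zeros((n, n)), np.zeros(m)
    for _ in range(max_iter):
        # y-step: minimize DSDP-L over y (an unconstrained convex quadratic -> linear system).
        y = np.linalg.solve(gram, (b - A_op(X)) / sigma + A_op(C - Z))
        # Z-step: minimize DSDP-L over Z >= 0 (projection onto the PSD cone).
        Z = psd_projection(C - A_adj(y) - X / sigma)
        # Multiplier update; X stays PSD and X Z = 0 by construction.
        X = X + sigma * (A_adj(y) + Z - C)
        r_primal = np.linalg.norm(A_op(X) - b)
        r_dual   = np.linalg.norm(A_adj(y) + Z - C)
        if max(r_primal, r_dual) <= tol:
            break
    return X, y, Z

# Same 2x2 example: the iterates approach X* = [[1,1],[1,1]] and y* = 1.
C  = np.eye(2)
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
X, y, Z = boundary_point_sketch(C, [A1], np.array([2.0]))
print(np.round(X, 4), y)
```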

In practice we use software packages
Some of the most widespread and robust packages:
- SEDUMI (http://sedumi.ie.lehigh.edu/)
- SDPT3 (http://www.math.nus.edu.sg/~mattohkc/sdpt3.html)
- SDPA (http://sdpa.sourceforge.net/)
- MOSEK (http://www.mosek.com/)
- YALMIP (http://users.isy.liu.se/johanl/yalmip/)
To solve large SDPs (when m is large) we can use:
- SDPLR (http://dollar.biz.uiowa.edu/~burer/software/sdplr/)
- the spectral bundle method (http://www-user.tu-chemnitz.de/~helmberg/sbmethod/)
- the boundary point method (http://www.math.uni-klu.ac.at/or/software/)

Literature
1. M. F. Anjos, J. B. Lasserre (eds.): Handbook on Semidefinite, Conic and Polynomial Optimization: Theory, Algorithms, Software and Applications, International Series in Operations Research & Management Science, vol. 166. Springer, 2012.
2. E. de Klerk: Aspects of Semidefinite Programming: Interior Point Algorithms and Selected Applications. Kluwer Academic Publishers, Dordrecht, 2002.
3. M. Laurent: Sums of squares, moment matrices and optimization over polynomials. The IMA Volumes in Mathematics and its Applications, vol. 149, pp. 157-270. Springer, 2009.
4. J. Povh: Semidefinitno programiranje in kombinatorična optimizacija [Semidefinite programming and combinatorial optimization], Master's thesis. Ljubljana, 2002.
5. J. Povh: Semidefinitno programiranje [Semidefinite programming]. Obzornik za matematiko in fiziko, vol. 49, no. 6, pp. 161-173, 2002.
6. H. Wolkowicz, R. Saigal, L. Vandenberghe (eds.): Handbook of Semidefinite Programming. Kluwer Academic Publishers, Boston-Dordrecht-London, 2000.