Real Symmetric Matrices and Semidefinite Programming

Tatsiana Maskalevich

Abstract

Real symmetric matrices have the important property that all of their eigenvalues are real. This gives rise to many applications in areas across the mathematical and earth sciences. In semidefinite programming, we optimize a linear function subject to the constraint that a combination of symmetric matrices is positive semidefinite. Semidefinite programming is used to model problems that arise in operations research and optimization. This paper gives a brief introduction to symmetric matrices, together with the theory and applications of semidefinite programming.

1 Introduction

The beautiful properties of symmetric positive semidefinite matrices and their associated convex quadratic forms have fascinated mathematicians since the discovery of conic sections. Many properties of nonlinear objects have therefore been related to the behavior of convex quadratic functions, which are used in control theory and in combinatorial optimization [6]. The development of interior point methods for semidefinite programming in the late eighties made it possible to optimize over this set. That aroused much interest and led to intense activity in the field. In fact, semidefinite programming has become one of the basic modeling and optimization tools, along with linear and quadratic programming.

The article is organized as follows. In the Background section, we review some basic notions about symmetric matrices and fundamental properties of the cone of positive semidefinite matrices. Semidefinite programs and their characteristics are introduced in Section 3. An application of semidefinite programming is discussed in Section 4.

2 Background

Let $M$ be an $n \times m$ matrix with entries in the field $\mathbb{R}$. The transpose of $M$ is the $m \times n$ matrix $M^T$ defined by
\[
(M^T)_{ij} := m_{ji}, \qquad 1 \le i \le m, \quad 1 \le j \le n.
\]
From this definition it is easy to observe that $(M^T)^T = M$. When $M$ is a square matrix, $M$ and $M^T$ have the same size and we can compare them.
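To make these definitions concrete, here is a minimal NumPy sketch (an illustration added for this edit, not part of the original text; the matrices are arbitrary examples) checking the transpose identities above:

```python
import numpy as np

# A 2x3 matrix M; its transpose is 3x2, with (M^T)_{ij} = m_{ji}.
M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
assert M.T.shape == (3, 2)
assert np.array_equal(M.T.T, M)      # (M^T)^T = M

# A square matrix can be compared with its transpose:
S = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
print(np.array_equal(S, S.T))        # True, so S is symmetric
```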

Proposition 2.1 A square matrixm with real entries is symmetric if M T = M skew-symmetric if M T = M orthogonal if M T = M 1 A matrix M is diagonal if all its off-diagonal entries are zero. Moreover, M is diagonalisable if there is an invertible matrix B such that BMB 1 is diagonal. We say, matrix is unitary if Mx = x. Unitary matrices satisfy det(m) = ±1 since det(m T M) = (det(m)) 2 for every matrix M and M T M = MM T = I when M is unitary. Thus we have the following Lemma: Lemma 2.1 A matrix with real entries is orthogonal if and only if it is unitary. An eigenvector of an n n matrix M is a nonzero vector x such that Mx = λx. We call λ an eigenvalue of M [3]. Proposition 2.2 The eigenvalues of real symmetric matrices are real [1]. Sketch of the proof: If M is the symmetric matrix and λ is an eigenvalue of M. Take transpose of the Mx = λx to obtain x T M = λx T, thus λx T x = x T (Mx) = (xm)x = λx T x Obtain λ λ = 0 implies λ is real. Thus we are guarantee to have only real eigenvalues. Lemma 2.2 A is a real symmetric matrix if and only if A is orthogonally similar to a real-diagonal matrix D, i.e D = P T AP for some orthogonal P [1]. Since the symmetric structure of a matrix forces its eigenvalues to be real, what additional property will force all eigenvalues to be positive (or non- negative)? If A R n n is symmetric, then, as observed above, there is an orthogonal matrix P such λ 1 0... 0 that A = P DP T 0 λ 2... 0, where D =.... is real. If λ i 0 for each i, then D 0 0... λ n exists, so A = P DP T = P D DP T = B T B for B = DP T 2

and $\lambda_i > 0$ for each $i$ if and only if $B$ is nonsingular. Conversely, if $A$ can be factored as $A = B^T B$, then all eigenvalues of $A$ are nonnegative, because for any eigenvector $x$,
\[
\lambda = \frac{x^T A x}{x^T x} = \frac{x^T B^T B x}{x^T x} = \frac{\|Bx\|_2^2}{\|x\|_2^2} \ge 0.
\]
Moreover, if $B$ is nonsingular, then $\lambda > 0$. In other words, a real symmetric matrix $A$ has nonnegative eigenvalues if and only if $A$ can be factored as $A = B^T B$, and all eigenvalues are positive if and only if $B$ is nonsingular.

Definition 1 A symmetric matrix $A$ whose eigenvalues are positive is called positive definite; when the eigenvalues are just nonnegative, $A$ is said to be positive semidefinite.

This leads us to the following proposition [5]:

Proposition 2.3 For $A \in \mathbb{R}^{n \times n}$ the following statements are equivalent:

1. $A$ is positive semidefinite.
2. $\lambda_i \ge 0$, $i = 1, \ldots, n$.
3. There exists a matrix $B$ such that $A = B^T B$ and $\mathrm{rank}(B) = \mathrm{rank}(A)$.
4. $\langle A, B \rangle \ge 0$ for all positive semidefinite $B$, where $\langle \cdot, \cdot \rangle$ is the usual inner product on the space $S^n$ of real symmetric $n \times n$ matrices, defined by $\langle A, B \rangle = \mathrm{Tr}(A^T B) = \sum_{ij} a_{ij} b_{ij}$.

The set of positive semidefinite matrices, denoted $S^n_+$, is a full-dimensional, closed, pointed cone in $\mathbb{R}^{\binom{n+1}{2}}$. Since the eigenvalues are the roots of the characteristic polynomial, they depend continuously on the matrix entries. To characterize these matrices we need the following definition [4]:

Definition 2 (Convex Cone) A convex cone is a subset of a vector space that is closed under linear combinations with nonnegative coefficients. That is, a set $K$ is a convex cone if and only if any nonnegative combination of elements from $K$ remains in $K$. The set of all convex cones is a proper subset of the set of all cones.

The set of positive definite matrices forms the interior of the cone $S^n_+$; the boundary of the cone consists of the positive semidefinite matrices having at least one zero eigenvalue.
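As a numerical illustration of the equivalences in Proposition 2.3, the following sketch (using NumPy; the matrix $B$ below is arbitrary test data, not from the paper) builds $A = B^T B$ and verifies that its eigenvalues are nonnegative and that a factorization can be recovered:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))      # a generic B, almost surely nonsingular
A = B.T @ B                          # hence A = B^T B is positive definite

# eigvalsh exploits symmetry and returns real eigenvalues in ascending order
print(np.linalg.eigvalsh(A))         # all entries > 0 (up to round-off)

# Cholesky succeeds exactly for positive definite matrices and recovers
# a factor L with A = L L^T, a variant of the B^T B factorization above.
L = np.linalg.cholesky(A)
print(np.allclose(A, L @ L.T))       # True
```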

3 Semidefinite Programming

Symmetric matrices play an important role in many areas of algebra and optimization. Let $S^n$ denote the space of real symmetric $n \times n$ matrices. If $X \in S^n$, we can consider a linear function of $X$ defined by the inner product on this space, $\langle C, X \rangle = \mathrm{Tr}(C^T X) = \sum_{ij} C_{ij} X_{ij}$. We assume that $C$ is a symmetric matrix. We consider semidefinite programs (SDP) in the following standard form:
\[
\begin{aligned}
\min \ & \langle C, X \rangle \\
\text{such that} \ & \langle A_k, X \rangle = b_k, \quad k = 1, \ldots, m, \\
& X \succeq 0,
\end{aligned}
\tag{3.1}
\]
where $C$, $A_k$ ($k = 1, 2, \ldots, m$) and $X$ are symmetric matrices, and $b = (b_1, \ldots, b_m)$ is the $m$-vector that forms the $m$ linear equations. We write $X \succeq 0$ to mean that $X$ lies in the closed convex cone of positive semidefinite matrices $S^n_+$ [2].

In order to derive the dual of this program, define the linear operator $\mathcal{A} : S^n \to \mathbb{R}^m$ by
\[
\mathcal{A} X = \begin{pmatrix} \langle A_1, X \rangle \\ \vdots \\ \langle A_m, X \rangle \end{pmatrix}.
\]
Since $\langle \mathcal{A} X, y \rangle = \sum_i y_i \langle A_i, X \rangle = \langle X, \sum_i y_i A_i \rangle$ for all $X \in S^n$ and $y \in \mathbb{R}^m$, the adjoint of $\mathcal{A}$ is
\[
\mathcal{A}^T y = \sum_{i=1}^m y_i A_i.
\]
The dual of (3.1) is [2]:
\[
\begin{aligned}
\max \ & b^T y \\
\text{such that} \ & \sum_{k=1}^m y_k A_k + Z = C, \\
& Z \succeq 0,
\end{aligned}
\tag{3.2}
\]
where $Z \in S^n$ is a positive semidefinite dual variable. In other words, in the dual problem we seek $y_1, \ldots, y_m$ maximizing the linear function $b^T y$, subject to the constraint that the dual slack matrix $Z = C - \sum_{k=1}^m y_k A_k$ is positive semidefinite.
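For concreteness, the following sketch solves a tiny instance of the primal (3.1) with CVXPY (assuming the cvxpy package is available; the data $C$, $A_1$, $b_1$ are a made-up example, not from the paper). The dual multiplier reported on the equality constraint corresponds to $y$ in (3.2):

```python
import numpy as np
import cvxpy as cp

# min <C, X>  s.t.  <A1, X> = b1,  X >= 0   -- standard form (3.1)
C = np.eye(2)                         # objective <C, X> = Tr(X)
A1 = np.array([[0.0, 1.0],
               [1.0, 0.0]])           # <A1, X> = 2 * X[0,1]
b1 = 1.0

X = cp.Variable((2, 2), symmetric=True)
constraints = [cp.trace(A1 @ X) == b1, X >> 0]
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
prob.solve()

print(prob.value)                     # optimal value, approximately 1
print(constraints[0].dual_value)      # the dual multiplier y1 from (3.2)
```

Here the positive semidefiniteness of $X$ forces $X_{11} X_{22} \ge X_{12}^2 = 1/4$, so the minimum trace is $1$, attained at $X = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$.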

4 Application of Semidefinite Programming

A fundamental problem in real algebraic geometry is whether a representation of a multivariate polynomial as a sum of squares (SOS) exists and can be computed [7]. While algebraic techniques have been proposed to decide the problem, recent results suggest that it can be solved much more efficiently using numerical optimization techniques such as semidefinite programming.

Let $p(x)$ be a multivariate polynomial. Then $p(x)$ can be written as
\[
p(x) = \sum_\alpha p_\alpha x_1^{\alpha_1} \cdots x_n^{\alpha_n} = \sum_\alpha p_\alpha x^\alpha,
\]
and the degree of $p(x)$ is $\max\{\deg(x^\alpha) = \sum_{i=1}^n \alpha_i : p_\alpha \ne 0\}$. We say $p(x)$ is a SOS if
\[
p(x) = \sum_{j=1}^m (h_j(x))^2 \quad \text{for some polynomials } h_1, h_2, \ldots, h_m.
\]
Then the degree of $p(x)$ is even and $\deg(h_j) \le \deg(p(x))/2$.

Example 1 $x^2 + y^2 + 2xy + z^6 = (x + y)^2 + (z^3)^2$ is SOS.

To decide whether a representation of a multivariate polynomial of degree $2d$ as a sum of squares exists, we can consider the following semidefinite feasibility program: find $X \succeq 0$ such that
\[
\sum_{\substack{\beta + \gamma = \alpha \\ \beta, \gamma \le d}} X_{\beta, \gamma} = p_\alpha \qquad (|\alpha| \le 2d),
\]
where $X$ is a symmetric matrix of order $\binom{n+d}{d} \times \binom{n+d}{d}$, indexed by the monomials of degree at most $d$, subject to $\binom{n+2d}{2d}$ equations.

Example 2 Let $p(x, y) = 2x^4 + 2x^3 y - x^2 y^2 + 5y^4$. Is it SOS? If so, there exists a decomposition of the following form:
\[
p(x, y) = \begin{pmatrix} x^2 & y^2 & xy \end{pmatrix}
\underbrace{\begin{pmatrix} a & b & c \\ b & d & e \\ c & e & f \end{pmatrix}}_{X \succeq 0}
\begin{pmatrix} x^2 \\ y^2 \\ xy \end{pmatrix}.
\]
Multiplying out and comparing coefficients monomial by monomial:
\[
\begin{aligned}
x^4 &= x^2 \cdot x^2: & a &= 2, \\
x^3 y &= x^2 \cdot xy: & 2c &= 2, \\
x^2 y^2 &= x^2 \cdot y^2 \text{ and } (xy)(xy): & 2b + f &= -1, \\
x y^3 &= y^2 \cdot xy: & 2e &= 0, \\
y^4 &= y^2 \cdot y^2: & d &= 5.
\end{aligned}
\]

Hence
\[
X = \begin{pmatrix} 2 & b & 1 \\ b & 5 & 0 \\ 1 & 0 & -1 - 2b \end{pmatrix},
\]
and for $b = -3$,
\[
X = \begin{pmatrix} 2 & -3 & 1 \\ -3 & 5 & 0 \\ 1 & 0 & 5 \end{pmatrix} \succeq 0,
\qquad
X = \frac{1}{2} \begin{pmatrix} 2 & 0 \\ -3 & 1 \\ 1 & 3 \end{pmatrix}
\begin{pmatrix} 2 & -3 & 1 \\ 0 & 1 & 3 \end{pmatrix}.
\]
Thus the SOS decomposition is
\[
p(x, y) = \frac{1}{2}(2x^2 - 3y^2 + xy)^2 + \frac{1}{2}(y^2 + 3xy)^2.
\]
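The same search for a Gram matrix $X$ can be automated as the semidefinite feasibility program described above. Below is a sketch for Example 2 using CVXPY (an illustration under the assumption that the cvxpy package is available; the variable names are ours, not the paper's). Any PSD solution returned by the solver can be factored, e.g. by Cholesky or an eigendecomposition, as $X = L^T L$, and the rows of $L$ then give the polynomials $h_j$:

```python
import cvxpy as cp

# Monomial basis v = (x^2, y^2, xy); we look for X >= 0 with
# v^T X v = 2x^4 + 2x^3 y - x^2 y^2 + 5y^4.
X = cp.Variable((3, 3), symmetric=True)
constraints = [
    X >> 0,
    X[0, 0] == 2,                 # coefficient of x^4
    2 * X[0, 2] == 2,             # coefficient of x^3 y
    2 * X[0, 1] + X[2, 2] == -1,  # coefficient of x^2 y^2
    2 * X[1, 2] == 0,             # coefficient of x y^3
    X[1, 1] == 5,                 # coefficient of y^4
]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve()
print(prob.status)                # 'optimal' certifies that p is SOS
print(X.value)                    # one PSD Gram matrix for p
```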

References

[1] Leslie Hogben, Handbook of Linear Algebra, Chapman & Hall/CRC, Taylor and Francis Group, 2007.

[2] Etienne de Klerk, Aspects of Semidefinite Programming, Kluwer Academic Publishers, 2002.

[3] Denis Serre, Matrices: Theory and Applications, Springer Science+Business Media, 2010.

[4] Dattorro, Convex Optimization & Euclidean Distance Geometry, Meboo Publishing, 2005, v2010.10.26.

[5] Thomas S. Shores, Applied Linear Algebra and Matrix Analysis, Springer Science+Business Media, 2007.

[6] L. Vandenberghe and S. Boyd, Semidefinite Programming, SIAM Review, 38(1):49-95, March 1996.

[7] Karin Gatermann and Pablo A. Parrilo, Symmetry groups, semidefinite programs, and sums of squares, arXiv:math/0211450v1, 2002.