An Introduction to Linear Matrix Inequalities
Raktim Bhattacharya
Aerospace Engineering, Texas A&M University
Linear Matrix Inequalities: What are they?

- Inequalities involving matrix variables
- Matrix variables appear linearly
- Represent convex sets described by polynomial inequalities
- Critical tool in post-modern control theory

AERO 632, Instructor: Raktim Bhattacharya
Standard Form

$$F(x) := F_0 + x_1 F_1 + \cdots + x_n F_n > 0,$$
where $x := [x_1 \; x_2 \; \cdots \; x_n]^T$ and each $F_i \in \mathbb{S}^m$, an $m \times m$ symmetric matrix. Think of $F(x) : \mathbb{R}^n \to \mathbb{S}^m$.

Example:
$$\begin{bmatrix} 1 & x \\ x & 1 \end{bmatrix} > 0 \iff \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + x \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} > 0.$$
Positive Definiteness

- $F > 0$ denotes a positive definite matrix
- $F > 0 \iff x^T F x > 0$ for all $x \neq 0$
- $F > 0 \iff$ all leading principal minors of $F$ are positive

For
$$F = \begin{bmatrix} F_{11} & F_{12} & F_{13} \\ F_{21} & F_{22} & F_{23} \\ F_{31} & F_{32} & F_{33} \end{bmatrix},$$
the condition $F > 0$ is equivalent to the $n$ polynomial constraints
$$F_{11} > 0, \quad \det\begin{bmatrix} F_{11} & F_{12} \\ F_{21} & F_{22} \end{bmatrix} > 0, \quad \det\begin{bmatrix} F_{11} & F_{12} & F_{13} \\ F_{21} & F_{22} & F_{23} \\ F_{31} & F_{32} & F_{33} \end{bmatrix} > 0.$$
This expresses polynomial constraints as a linear matrix inequality.
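The minor and eigenvalue characterizations above can be cross-checked numerically. A minimal sketch (the helper names and test matrices are illustrative, not from the slides):

```python
import numpy as np

def is_positive_definite_minors(F):
    """F > 0 iff every leading principal minor det(F[:k, :k]) is positive."""
    n = F.shape[0]
    return all(np.linalg.det(F[:k, :k]) > 0 for k in range(1, n + 1))

def is_positive_definite_eigs(F):
    """F > 0 iff x^T F x > 0 for all x != 0, i.e. all eigenvalues positive."""
    return bool(np.all(np.linalg.eigvalsh(F) > 0))

F_pd = np.array([[2.0, -1.0, 0.0],
                 [-1.0, 2.0, -1.0],
                 [0.0, -1.0, 2.0]])   # tridiagonal, positive definite
F_ind = np.array([[1.0, 2.0],
                  [2.0, 1.0]])        # indefinite: eigenvalues 3 and -1

agree_pd = is_positive_definite_minors(F_pd) == is_positive_definite_eigs(F_pd)
agree_ind = is_positive_definite_minors(F_ind) == is_positive_definite_eigs(F_ind)
```

Both tests agree on both matrices, as the equivalence predicts.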
Definiteness

- Positive semi-definite: $F \geq 0$ iff all principal minors are $\geq 0$ (not just the leading ones)
- Negative definite: $F < 0$ iff every odd leading principal minor is $< 0$ and every even leading principal minor is $> 0$; they alternate sign, starting with $< 0$
- Negative semi-definite: $F \leq 0$ iff every odd principal minor is $\leq 0$ and every even principal minor is $\geq 0$

Reference: Matrix Analysis, Roger Horn.
Example 1

$$y > 0, \quad y - x^2 > 0 \iff \begin{bmatrix} y & x \\ x & 1 \end{bmatrix} > 0$$

The LMI written as $\begin{bmatrix} y & x \\ x & 1 \end{bmatrix} > 0$ is in general form. We can write it in standard form as
$$\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} + y \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + x \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} > 0.$$
The general form saves notation and may lead to more efficient computation.
Example 2

$$x_1^2 + x_2^2 < 1 \iff \begin{bmatrix} 1 & 0 & x_1 \\ 0 & 1 & x_2 \\ x_1 & x_2 & 1 \end{bmatrix} > 0$$

The leading principal minors are
$$1 > 0, \quad \det\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} > 0, \quad \det\begin{bmatrix} 1 & 0 & x_1 \\ 0 & 1 & x_2 \\ x_1 & x_2 & 1 \end{bmatrix} > 0.$$
The last inequality simplifies to $1 - (x_1^2 + x_2^2) > 0$.
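This equivalence can be spot-checked numerically at a point inside and a point outside the unit disk (the sample points below are illustrative):

```python
import numpy as np

def F(x1, x2):
    """The 3x3 LMI matrix for the constraint x1^2 + x2^2 < 1."""
    return np.array([[1.0, 0.0, x1],
                     [0.0, 1.0, x2],
                     [x1,  x2,  1.0]])

inside = np.all(np.linalg.eigvalsh(F(0.3, 0.4)) > 0)    # 0.09 + 0.16 < 1
outside = np.all(np.linalg.eigvalsh(F(0.8, 0.7)) > 0)   # 0.64 + 0.49 > 1
# The last leading minor equals 1 - (x1^2 + x2^2), as on the slide.
det_check = np.isclose(np.linalg.det(F(0.3, 0.4)), 1 - (0.3**2 + 0.4**2))
```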
Eigenvalue Minimization

Let $A_i \in \mathbb{S}^n$, $i = 0, 1, \ldots, n$. Let
$$A(x) := A_0 + A_1 x_1 + \cdots + A_n x_n.$$
Find $x := [x_1 \; x_2 \; \cdots \; x_n]^T$ that minimizes
$$J(x) := \lambda_{\max}\big(A(x)\big).$$
How to solve this problem?
Eigenvalue Minimization (contd.)

Recall that for $M \in \mathbb{S}^n$,
$$\lambda_{\max}(M) \leq t \iff M - tI \leq 0.$$
Linear algebra result: Matrix Analysis, R. Horn, C. R. Johnson.

The optimization problem is therefore
$$\min_{x,\,t} \; t \quad \text{such that} \quad A(x) - tI \leq 0.$$
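The fact behind this reformulation, $\lambda_{\max}(M) \leq t \iff M - tI \leq 0$, can be verified numerically for a sample matrix (a sketch; the matrix and tolerance are illustrative):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam_max = np.max(np.linalg.eigvalsh(M))   # equals (5 + sqrt(5)) / 2

def lmi_holds(t):
    """True when M - t I is negative semidefinite (up to roundoff)."""
    return bool(np.all(np.linalg.eigvalsh(M - t * np.eye(2)) <= 1e-12))

feasible_t = lmi_holds(lam_max + 0.1)    # any t >= lambda_max works
tight_t = lmi_holds(lam_max)             # equality is still feasible
infeasible_t = lmi_holds(lam_max - 0.1)  # t below lambda_max fails
```

So minimizing $t$ subject to the LMI drives $t$ down to exactly $\lambda_{\max}(A(x))$.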
Matrix Norm Minimization

Let $A_i \in \mathbb{R}^{n \times n}$, $i = 0, 1, \ldots, n$. Let
$$A(x) := A_0 + A_1 x_1 + \cdots + A_n x_n.$$
Find $x := [x_1 \; x_2 \; \cdots \; x_n]^T$ that minimizes
$$J(x) := \|A(x)\|_2.$$
How to solve this problem?
Matrix Norm Minimization (contd.)

Recall
$$\|A\|_2 := \sqrt{\lambda_{\max}(A^T A)}.$$
This implies
$$\|A(x)\|_2 \leq t \iff A(x)^T A(x) - t^2 I \leq 0,$$
or, by the Schur complement (for $t > 0$),
$$\begin{bmatrix} tI & A(x) \\ A(x)^T & tI \end{bmatrix} \geq 0.$$
The optimization problem is therefore
$$\min_{x,\,t} \; t \quad \text{subject to} \quad \begin{bmatrix} tI & A(x) \\ A(x)^T & tI \end{bmatrix} \geq 0.$$
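Both steps, the norm identity and the block LMI, can be checked on a random matrix (a sketch; the random seed and tolerance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# ||A||_2 = sqrt(lambda_max(A^T A)); matches numpy's matrix 2-norm.
norm2 = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))
assert np.isclose(norm2, np.linalg.norm(A, 2))

def block_psd(t):
    """True when [[t I, A], [A^T, t I]] is positive semidefinite."""
    n = A.shape[0]
    blk = np.block([[t * np.eye(n), A],
                    [A.T, t * np.eye(n)]])
    return bool(np.all(np.linalg.eigvalsh(blk) >= -1e-10))

above = block_psd(norm2 + 0.1)   # feasible: t above the norm
below = block_psd(norm2 - 0.1)   # infeasible: t below the norm
```

The block matrix has eigenvalues $t \pm \sigma_i(A)$, so feasibility switches exactly at $t = \|A\|_2$.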
Important Inequalities
Generalized Square Inequalities

Lemma: For arbitrary scalars $x, y$ and $\delta > 0$, we have
$$\left(\sqrt{\delta}\, x - \frac{1}{\sqrt{\delta}}\, y\right)^2 = \delta x^2 + \frac{1}{\delta} y^2 - 2xy \geq 0.$$
This implies
$$2xy \leq \delta x^2 + \frac{1}{\delta} y^2.$$
Generalized Square Inequalities: Restriction-Free Inequalities

Lemma: Let $X, Y \in \mathbb{R}^{m \times n}$, $F \in \mathbb{S}^m$ with $F > 0$, and let $\delta > 0$ be a scalar. Then
$$X^T F Y + Y^T F X \leq \delta X^T F X + \delta^{-1} Y^T F Y.$$
When $X = x$ and $Y = y$ are vectors,
$$2 x^T F y \leq \delta x^T F x + \delta^{-1} y^T F y.$$

Proof: completion of squares:
$$\left(\sqrt{\delta}\, X - \frac{1}{\sqrt{\delta}}\, Y\right)^T F \left(\sqrt{\delta}\, X - \frac{1}{\sqrt{\delta}}\, Y\right) \geq 0.$$
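A random-instance check of this matrix inequality (a sketch; dimensions, seed, and the choice of $\delta$ values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 3
X = rng.standard_normal((m, n))
Y = rng.standard_normal((m, n))
G = rng.standard_normal((m, m))
F = G @ G.T + np.eye(m)          # a generic positive definite F

def gap(delta):
    """Eigenvalues of RHS - LHS; the inequality holds iff all >= 0."""
    lhs = X.T @ F @ Y + Y.T @ F @ X
    rhs = delta * (X.T @ F @ X) + (1 / delta) * (Y.T @ F @ Y)
    return np.linalg.eigvalsh(rhs - lhs)

holds = all(bool(np.all(gap(d) >= -1e-10)) for d in (0.5, 1.0, 2.0))
```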
Generalized Square Inequalities: Inequalities with Restrictions

Let $\mathcal{F} = \{F \mid F \in \mathbb{R}^{n \times n},\; F^T F \leq I\}$.

Lemma: Let $X \in \mathbb{R}^{m \times n}$, $Y \in \mathbb{R}^{n \times m}$. Then for arbitrary $\delta > 0$,
$$X F Y + Y^T F^T X^T \leq \delta X X^T + \delta^{-1} Y^T Y, \quad \forall F \in \mathcal{F}.$$

Proof (Approach 1): completion of squares. Start with
$$\left(\sqrt{\delta}\, X^T - \frac{1}{\sqrt{\delta}}\, F Y\right)^T \left(\sqrt{\delta}\, X^T - \frac{1}{\sqrt{\delta}}\, F Y\right) \geq 0.$$
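The restricted version can be checked over a few contractions $F$ with $F^T F \leq I$, e.g. scaled orthogonal matrices (a sketch; the sample contractions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 4
X = rng.standard_normal((m, n))
Y = rng.standard_normal((n, m))

Qo, _ = np.linalg.qr(rng.standard_normal((n, n)))
contractions = [0.3 * Qo, 0.9 * Qo, np.zeros((n, n))]   # all satisfy F^T F <= I

def holds(F, delta):
    """True when delta X X^T + (1/delta) Y^T Y - (X F Y + Y^T F^T X^T) >= 0."""
    lhs = X @ F @ Y + Y.T @ F.T @ X.T
    rhs = delta * (X @ X.T) + (1 / delta) * (Y.T @ Y)
    return bool(np.all(np.linalg.eigvalsh(rhs - lhs) >= -1e-10))

all_hold = all(holds(F, d) for F in contractions for d in (0.5, 2.0))
```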
Schur Complements

Very useful for identifying convex sets. Let
$$\begin{bmatrix} Q(x) & S(x) \\ S^T(x) & R(x) \end{bmatrix}, \quad Q(x) \in \mathbb{S}^{m_1},\; R(x) \in \mathbb{S}^{m_2},$$
where $Q(x), R(x), S(x)$ are affine functions of $x$. Then
$$\begin{bmatrix} Q(x) & S(x) \\ S^T(x) & R(x) \end{bmatrix} > 0 \iff Q(x) > 0, \;\; R(x) - S^T(x) Q(x)^{-1} S(x) > 0.$$

Generalizing,
$$\begin{bmatrix} Q(x) & S(x) \\ S^T(x) & R(x) \end{bmatrix} \geq 0 \iff Q(x) \geq 0, \;\; S^T(x)\big(I - Q(x) Q^{\dagger}(x)\big) = 0, \;\; R(x) - S^T(x) Q^{\dagger}(x) S(x) \geq 0,$$
where $Q^{\dagger}(x)$ is the pseudo-inverse. This generalization is used when $Q(x)$ is positive semidefinite but singular.
Schur Complement Lemma

Let
$$A := \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}.$$
Define
$$S_{ch}(A_{11}) := A_{22} - A_{21} A_{11}^{-1} A_{12}, \quad S_{ch}(A_{22}) := A_{11} - A_{12} A_{22}^{-1} A_{21}.$$
For symmetric $A$,
$$A > 0 \iff A_{11} > 0,\; S_{ch}(A_{11}) > 0 \iff A_{22} > 0,\; S_{ch}(A_{22}) > 0.$$
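A numerical illustration of the lemma on a random symmetric positive definite matrix split into blocks (a sketch; the size and block split are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
G = rng.standard_normal((5, 5))
A = G @ G.T + 0.1 * np.eye(5)     # positive definite by construction
k = 2                             # size of the A11 block
A11, A12 = A[:k, :k], A[:k, k:]
A21, A22 = A[k:, :k], A[k:, k:]

def pd(M):
    return bool(np.all(np.linalg.eigvalsh(M) > 0))

sch_A11 = A22 - A21 @ np.linalg.inv(A11) @ A12
sch_A22 = A11 - A12 @ np.linalg.inv(A22) @ A21
via_A11 = pd(A11) and pd(sch_A11)
via_A22 = pd(A22) and pd(sch_A22)
equiv = pd(A) == via_A11 == via_A22
```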
Example 1

$$x_1^2 + x_2^2 < 1 \iff 1 - x^T x > 0 \iff \begin{bmatrix} I & x \\ x^T & 1 \end{bmatrix} > 0$$
Here $R(x) = 1$ and $Q(x) = I > 0$.
Example 2

$$\|x\|_P < 1 \iff 1 - x^T P x > 0,$$
or
$$1 - x^T P x = 1 - (\sqrt{P}\, x)^T (\sqrt{P}\, x) > 0,$$
where $\sqrt{P}$ is the matrix square root. This gives the equivalent LMIs
$$\begin{bmatrix} P^{-1} & x \\ x^T & 1 \end{bmatrix} > 0 \quad \text{or} \quad \begin{bmatrix} I & \sqrt{P}\, x \\ (\sqrt{P}\, x)^T & 1 \end{bmatrix} > 0.$$
LMIs are not unique

If $F$ is positive definite, then any congruence transformation of $F$ is also positive definite:
$$F > 0 \iff x^T F x > 0 \;\; \forall x \neq 0 \iff y^T M^T F M y > 0 \;\; \forall y \neq 0 \text{ and nonsingular } M \iff M^T F M > 0.$$
This implies that rearrangement of matrix elements does not change the feasible set:
$$\begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} > 0 \iff \begin{bmatrix} 0 & I \\ I & 0 \end{bmatrix} \begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} \begin{bmatrix} 0 & I \\ I & 0 \end{bmatrix} > 0 \iff \begin{bmatrix} R & S^T \\ S & Q \end{bmatrix} > 0.$$
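A quick numerical check of the congruence property, including the block-swap special case above (a sketch; the random instances are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
G = rng.standard_normal((4, 4))
F = G @ G.T + np.eye(4)                 # F > 0
M = rng.standard_normal((4, 4))
assert abs(np.linalg.det(M)) > 1e-8     # M nonsingular (generic case)

def pd(X):
    return bool(np.all(np.linalg.eigvalsh(X) > 0))

congruent_pd = pd(M.T @ F @ M)          # congruence preserves F > 0

# Permutation congruence: swapping the 2x2 blocks of F keeps F > 0.
P = np.block([[np.zeros((2, 2)), np.eye(2)],
              [np.eye(2), np.zeros((2, 2))]])
swapped_pd = pd(P.T @ F @ P)
```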
Variable Elimination Lemma

Lemma: For arbitrary nonzero vectors $x, y \in \mathbb{R}^n$, there holds
$$\max_{F \in \mathcal{F}:\, F^T F \leq I} (x^T F y)^2 = (x^T x)(y^T y).$$

Proof: From the Schwarz inequality,
$$(x^T F y)^2 \leq (x^T x)(y^T F^T F y) \leq (x^T x)(y^T y).$$
Therefore for arbitrary $x, y$ we have
$$(x^T F y)^2 \leq (x^T x)(y^T y).$$
Next, show that equality is attained.
Variable Elimination Lemma (contd.)

Let
$$F_0 = \frac{x y^T}{\sqrt{x^T x}\,\sqrt{y^T y}}.$$
Therefore,
$$F_0^T F_0 = \frac{y x^T x y^T}{(x^T x)(y^T y)} = \frac{y y^T}{y^T y}.$$
We can show that
$$\sigma_{\max}(F_0^T F_0) = \sigma_{\max}(F_0 F_0^T) = 1 \implies F_0^T F_0 \leq I,$$
thus $F_0 \in \mathcal{F}$.
Variable Elimination Lemma (contd.)

Therefore,
$$(x^T F_0 y)^2 = \left(\frac{x^T x \; y^T y}{\sqrt{x^T x}\,\sqrt{y^T y}}\right)^2 = (x^T x)(y^T y).$$
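The maximizer $F_0$ can be checked numerically: it is a contraction and attains the bound (a sketch; the random vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# F0 = x y^T / (||x|| ||y||), the maximizer from the slide.
F0 = np.outer(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

attained = np.isclose((x @ F0 @ y) ** 2, (x @ x) * (y @ y))
# F0^T F0 = y y^T / (y^T y) has eigenvalues {1, 0, ..., 0}, so F0^T F0 <= I.
contraction = bool(np.max(np.linalg.eigvalsh(F0.T @ F0)) <= 1 + 1e-12)
```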
Variable Elimination Lemma (contd.)

Lemma: Let $X \in \mathbb{R}^{m \times n}$, $Y \in \mathbb{R}^{n \times m}$, and $Q \in \mathbb{R}^{m \times m}$. Then
$$Q + X F Y + Y^T F^T X^T < 0, \quad \forall F \in \mathcal{F},$$
iff there exists $\delta > 0$ such that
$$Q + \delta X X^T + \frac{1}{\delta} Y^T Y < 0.$$

Proof (Sufficiency):
$$Q + X F Y + Y^T F^T X^T \leq Q + \delta X X^T + \frac{1}{\delta} Y^T Y \quad \text{(from the previous Lemma)} \;< 0.$$
Variable Elimination Lemma (contd.)

Proof (Necessity): Suppose $Q + X F Y + Y^T F^T X^T < 0$ for all $F \in \mathcal{F}$. Then for arbitrary nonzero $x$,
$$x^T (Q + X F Y + Y^T F^T X^T) x < 0,$$
or
$$x^T Q x + 2 x^T X F Y x < 0.$$
Using the previous lemma,
$$\max_{F \in \mathcal{F}} (x^T X F Y x) = \sqrt{(x^T X X^T x)(x^T Y^T Y x)},$$
so
$$x^T Q x + 2\sqrt{(x^T X X^T x)(x^T Y^T Y x)} < 0.$$
Variable Elimination Lemma (contd.)

Therefore,
$$x^T Q x + 2\sqrt{(x^T X X^T x)(x^T Y^T Y x)} < 0 \implies x^T Q x - 2\sqrt{(x^T X X^T x)(x^T Y^T Y x)} < 0,$$
and $x^T Q x < 0$. Multiplying these two negative quantities gives
$$\underbrace{(x^T Q x)^2}_{b^2} - 4\, \underbrace{(x^T X X^T x)}_{a}\, \underbrace{(x^T Y^T Y x)}_{c} > 0, \quad \text{i.e.,} \quad b^2 - 4ac > 0.$$
Variable Elimination Lemma (contd.)

So the quadratic equation $a\delta^2 + b\delta + c = 0$ has real roots
$$\delta = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.$$
Recall
$$a := x^T X X^T x > 0, \quad b := x^T Q x < 0, \quad c := x^T Y^T Y x > 0.$$
This implies $-b/(2a) > 0$, so there is at least one positive root.
Variable Elimination Lemma (contd.)

Therefore there exists $\delta > 0$ such that $a\delta^2 + b\delta + c < 0$. Dividing by $\delta$ we get
$$a\delta + b + \frac{c}{\delta} < 0,$$
or
$$x^T Q x + \delta\, x^T X X^T x + \frac{1}{\delta}\, x^T Y^T Y x < 0,$$
or
$$x^T \left(Q + \delta X X^T + \frac{1}{\delta} Y^T Y\right) x < 0,$$
or
$$Q + \delta X X^T + \frac{1}{\delta} Y^T Y < 0.$$
Elimination of Variables: In a Partitioned Matrix

Lemma: Let
$$Z = \begin{bmatrix} Z_{11} & Z_{12} \\ Z_{12}^T & Z_{22} \end{bmatrix}, \quad Z_{11} \in \mathbb{R}^{n \times n},$$
be symmetric. Then there exists $X = X^T$ such that
$$\begin{bmatrix} Z_{11} - X & Z_{12} & X \\ Z_{12}^T & Z_{22} & 0 \\ X & 0 & -X \end{bmatrix} < 0 \iff Z < 0.$$

Proof: Apply the Schur complement lemma:
$$\begin{bmatrix} Z_{11} - X & Z_{12} & X \\ Z_{12}^T & Z_{22} & 0 \\ X & 0 & -X \end{bmatrix} < 0 \iff -X < 0, \;\; S_{ch}(-X) < 0.$$
Elimination of Variables: In a Partitioned Matrix (contd.)

$$0 > S_{ch}(-X) = \begin{bmatrix} Z_{11} - X & Z_{12} \\ Z_{12}^T & Z_{22} \end{bmatrix} - \begin{bmatrix} X \\ 0 \end{bmatrix} (-X)^{-1} \begin{bmatrix} X & 0 \end{bmatrix} = \begin{bmatrix} Z_{11} - X & Z_{12} \\ Z_{12}^T & Z_{22} \end{bmatrix} + \begin{bmatrix} X & 0 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} Z_{11} & Z_{12} \\ Z_{12}^T & Z_{22} \end{bmatrix} = Z.$$
Elimination of Variables: In a Partitioned Matrix (contd.)

Lemma: There exists $X$ such that
$$\begin{bmatrix} Z_{11} & Z_{12} & Z_{13} \\ Z_{12}^T & Z_{22} & Z_{23} + X^T \\ Z_{13}^T & Z_{23}^T + X & Z_{33} \end{bmatrix} < 0 \iff \begin{bmatrix} Z_{11} & Z_{12} \\ Z_{12}^T & Z_{22} \end{bmatrix} < 0 \;\text{ and }\; \begin{bmatrix} Z_{11} & Z_{13} \\ Z_{13}^T & Z_{33} \end{bmatrix} < 0,$$
with $X = Z_{13}^T Z_{11}^{-1} Z_{12} - Z_{23}^T$.

Proof (Necessity): apply the rules for negative definiteness to the principal submatrices.
(Sufficiency): the following are true from the Schur complement lemma:
$$Z_{11} < 0, \quad Z_{22} - Z_{12}^T Z_{11}^{-1} Z_{12} < 0, \quad Z_{33} - Z_{13}^T Z_{11}^{-1} Z_{13} < 0.$$
Elimination of Variables: In a Partitioned Matrix (contd.)

Look at the Schur complement of $Z_{11}$ in
$$\begin{bmatrix} Z_{11} & Z_{12} & Z_{13} \\ Z_{12}^T & Z_{22} & Z_{23} + X^T \\ Z_{13}^T & Z_{23}^T + X & Z_{33} \end{bmatrix}:$$
$$\begin{bmatrix} Z_{22} & Z_{23} + X^T \\ Z_{23}^T + X & Z_{33} \end{bmatrix} - \begin{bmatrix} Z_{12}^T \\ Z_{13}^T \end{bmatrix} Z_{11}^{-1} \begin{bmatrix} Z_{12} & Z_{13} \end{bmatrix} = \begin{bmatrix} Z_{22} - Z_{12}^T Z_{11}^{-1} Z_{12} & Z_{23} + X^T - Z_{12}^T Z_{11}^{-1} Z_{13} \\ Z_{23}^T + X - Z_{13}^T Z_{11}^{-1} Z_{12} & Z_{33} - Z_{13}^T Z_{11}^{-1} Z_{13} \end{bmatrix} < 0.$$
With $X = Z_{13}^T Z_{11}^{-1} Z_{12} - Z_{23}^T$ the off-diagonal blocks vanish, leaving a block-diagonal matrix whose diagonal blocks are negative definite. Also $Z_{11} < 0$.
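The explicit choice of $X$ can be exercised numerically: start from any negative definite $Z$, so both corner blocks qualify, and check that the completed 3x3 block matrix is negative definite (a sketch; the construction is illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2
G = rng.standard_normal((3 * n, 3 * n))
Z = -(G @ G.T) - 0.5 * np.eye(3 * n)   # Z < 0, so every sub-block condition holds

def blk(i, j):
    return Z[i * n:(i + 1) * n, j * n:(j + 1) * n]

Z11, Z12, Z13 = blk(0, 0), blk(0, 1), blk(0, 2)
Z22, Z23, Z33 = blk(1, 1), blk(1, 2), blk(2, 2)

# The slide's explicit completion variable.
X = Z13.T @ np.linalg.inv(Z11) @ Z12 - Z23.T
M = np.block([[Z11, Z12, Z13],
              [Z12.T, Z22, Z23 + X.T],
              [Z13.T, Z23.T + X, Z33]])
neg_def = bool(np.all(np.linalg.eigvalsh(M) < 0))
```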
Elimination of Variables: Projection Lemma

Definition: Let $A \in \mathbb{R}^{m \times n}$. Then $M_a$ is a left orthogonal complement of $A$ if it satisfies
$$M_a A = 0, \quad \operatorname{rank}(M_a) = m - \operatorname{rank}(A).$$

Definition: Let $A \in \mathbb{R}^{m \times n}$. Then $N_a$ is a right orthogonal complement of $A$ if it satisfies
$$A N_a = 0, \quad \operatorname{rank}(N_a) = n - \operatorname{rank}(A).$$
Elimination of Variables: Projection Lemma (contd.)

Lemma: Let $P$, $Q$, and $H = H^T$ be matrices of appropriate dimensions. Let $N_p$, $N_q$ be right orthogonal complements of $P$, $Q$ respectively. Then there exists $X$ such that
$$H + P^T X^T Q + Q^T X P < 0 \iff N_p^T H N_p < 0 \;\text{ and }\; N_q^T H N_q < 0.$$

Proof (Necessity): multiply on the left and right by $N_p^T$, $N_p$ (or $N_q^T$, $N_q$).
(Sufficiency): a little more involved; use a basis for the kernels of $P$, $Q$, followed by the Schur complement lemma.
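The necessity direction can be illustrated numerically: on the null spaces of $P$ and $Q$ the $X$-dependent terms cancel, so the projected inequalities involve $H$ alone. A sketch using an SVD-based right null space (the dimensions and instance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5
P = rng.standard_normal((2, n))
Q = rng.standard_normal((3, n))
X = rng.standard_normal((3, 2))       # sized so that P^T X^T Q is n x n
G = rng.standard_normal((n, n))
H = -(G @ G.T) - np.eye(n)            # H = H^T < 0

def right_null(A, tol=1e-10):
    """Columns spanning the right null space of A (A @ N = 0)."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

Np, Nq = right_null(P), right_null(Q)
L = H + P.T @ X.T @ Q + Q.T @ X @ P   # the matrix the lemma makes < 0

# Since P Np = 0, the X-terms vanish: Np^T L Np = Np^T H Np.
cancels = np.allclose(Np.T @ L @ Np, Np.T @ H @ Np)
np_neg = bool(np.all(np.linalg.eigvalsh(Np.T @ H @ Np) < 0))
nq_neg = bool(np.all(np.linalg.eigvalsh(Nq.T @ H @ Nq) < 0))
```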
Elimination of Variables: Reciprocal Projection Lemma

Lemma: Let $P$ be any given positive definite matrix. The following statements are equivalent:
1. $\Psi + S + S^T < 0$.
2. The LMI problem
$$\begin{bmatrix} \Psi + P - (W + W^T) & S^T + W^T \\ S + W & -P \end{bmatrix} < 0$$
is feasible with respect to $W$.

Proof: Apply the projection lemma with respect to the general variable $W$. Let
$$X = \begin{bmatrix} \Psi + P & S^T \\ S & -P \end{bmatrix}, \quad Y = \begin{bmatrix} I_n & 0 \end{bmatrix}, \quad Z = \begin{bmatrix} -I_n & I_n \end{bmatrix}.$$
Elimination of Variables: Reciprocal Projection Lemma (contd.)

With
$$X = \begin{bmatrix} \Psi + P & S^T \\ S & -P \end{bmatrix}, \quad Y = \begin{bmatrix} I_n & 0 \end{bmatrix}, \quad Z = \begin{bmatrix} -I_n & I_n \end{bmatrix},$$
right orthogonal complements of $Y$, $Z$ are
$$N_y = \begin{bmatrix} 0 \\ P^{-1} \end{bmatrix}, \quad N_z = \begin{bmatrix} I_n \\ I_n \end{bmatrix}.$$
Verify that $Y N_y = 0$ and $Z N_z = 0$. We can show
$$N_y^T X N_y = -P^{-1}, \quad N_z^T X N_z = \Psi + S^T + S.$$
Now apply the projection lemma.
Elimination of Variables: Reciprocal Projection Lemma (contd.)

$$N_y^T X N_y = -P^{-1}, \quad N_z^T X N_z = \Psi + S^T + S.$$
The expression
$$X + Y^T W^T Z + Z^T W Y = \begin{bmatrix} \Psi + P - (W + W^T) & S^T + W^T \\ S + W & -P \end{bmatrix}.$$
Therefore, if
$$N_y^T X N_y < 0, \quad N_z^T X N_z < 0,$$
then
$$\begin{bmatrix} \Psi + P - (W + W^T) & S^T + W^T \\ S + W & -P \end{bmatrix} < 0$$
is feasible with respect to $W$.
Trace of Matrices in LMIs

Lemma: Let $A(x) \in \mathbb{S}^m$ be a matrix function of $x \in \mathbb{R}^n$, and let $\gamma \in \mathbb{R}$, $\gamma > 0$. The following statements are equivalent:
1. There exists $x \in \mathbb{R}^n$ such that $\operatorname{tr} A(x) < \gamma$.
2. There exist $x \in \mathbb{R}^n$, $Z \in \mathbb{S}^m$ such that $A(x) < Z$, $\operatorname{tr} Z < \gamma$.

Proof: homework problem.
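Without giving away the homework, the easy direction (1 implies 2) can be illustrated numerically: when $\operatorname{tr} A < \gamma$, the choice $Z = A + \varepsilon I$ with a small enough $\varepsilon > 0$ works, since $Z - A = \varepsilon I > 0$ and $\operatorname{tr} Z = \operatorname{tr} A + m\varepsilon$. A sketch with an illustrative instance:

```python
import numpy as np

A = np.array([[1.0, 0.5],
              [0.5, -2.0]])            # tr A = -1
gamma = 1.0                            # tr A < gamma
m = A.shape[0]
eps = (gamma - np.trace(A)) / (2 * m)  # keeps tr Z strictly below gamma
Z = A + eps * np.eye(m)                # certificate for statement (2)

strictly_less = bool(np.all(np.linalg.eigvalsh(Z - A) > 0))  # A < Z
trace_ok = np.trace(Z) < gamma
```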