Hamiltonian Matrices and the Algebraic Riccati Equation
Seminar presentation by Piyapong Yuantong
7th December 2009, Technische Universität Chemnitz
1 Hamiltonian matrices

We start by introducing the square matrix J ∈ R^{2n×2n} defined by

    J = [  O_n   I_n ]
        [ -I_n   O_n ],                                            (1)

where O_n ∈ R^{n×n} is the zero matrix and I_n ∈ R^{n×n} is the identity matrix.

Remark. It is not very complicated to prove the following properties of the matrix J:
i.   J^T = -J
ii.  J^{-1} = J^T
iii. J^T J = I_{2n}
iv.  J J^T = I_{2n}
v.   J^2 = -I_{2n}
vi.  det J = 1

Definition 1. A matrix A ∈ R^{2n×2n} is called Hamiltonian if JA is symmetric, i.e.

    JA = (JA)^T   ⇔   A^T J + JA = 0,

where J ∈ R^{2n×2n} is the matrix from (1). We will denote by

    H_n = { A ∈ R^{2n×2n} : A^T J + JA = 0 }

the set of 2n×2n Hamiltonian matrices.
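The properties of J listed in the Remark are easy to check numerically. A minimal sketch in NumPy (the block size n = 3 is an arbitrary illustrative choice):

```python
import numpy as np

n = 3  # arbitrary block size for illustration
O = np.zeros((n, n))
I = np.eye(n)

# J = [ O  I ; -I  O ] as in (1)
J = np.block([[O, I], [-I, O]])

assert np.allclose(J.T, -J)                    # i.   J^T = -J
assert np.allclose(np.linalg.inv(J), J.T)      # ii.  J^{-1} = J^T
assert np.allclose(J.T @ J, np.eye(2 * n))     # iii. J^T J = I_{2n}
assert np.allclose(J @ J.T, np.eye(2 * n))     # iv.  J J^T = I_{2n}
assert np.allclose(J @ J, -np.eye(2 * n))      # v.   J^2 = -I_{2n}
assert np.isclose(np.linalg.det(J), 1.0)       # vi.  det J = 1
print("all properties of J verified")
```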
Proposition 1.1. The following are equivalent:
a) A is a Hamiltonian matrix;
b) A = JS, where S = S^T;
c) (JA)^T = JA.

Proof.
a) ⇔ b): Let A ∈ H_n and set S := -JA. Since J^2 = -I_{2n}, we have JS = J(-JA) = A. Moreover,

    S^T = (-JA)^T = -A^T J^T = A^T J = -JA = S,

where the last equality uses A^T J + JA = 0. Conversely, if A = JS with S = S^T, then

    A^T J + JA = (JS)^T J + JJS = S^T J^T J + J^2 S = S - S = 0,

so A ∈ H_n.

a) ⇔ c): Since (JA)^T = A^T J^T = -A^T J, we have

    (JA)^T = JA   ⇔   -A^T J = JA   ⇔   A^T J + JA = 0.
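Proposition 1.1 b) gives a convenient recipe for generating Hamiltonian matrices numerically: take any symmetric S and form A = JS. A small sketch (random data with a fixed seed; the seed and size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

# b) -> a): a random symmetric S yields a Hamiltonian A = J S
M = rng.standard_normal((2 * n, 2 * n))
S = (M + M.T) / 2                        # symmetrize
A = J @ S
assert np.allclose(A.T @ J + J @ A, 0)   # A^T J + J A = 0, so A is Hamiltonian

# a) -> b): recover S = -J A and check that it is symmetric
S_rec = -J @ A
assert np.allclose(S_rec, S_rec.T)
assert np.allclose(S_rec, S)
```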
Proposition 1.2. Let A, B ∈ H_n and α ∈ R. The following are true:
a) A + B ∈ H_n;
b) αA ∈ H_n;
c) [A, B] ∈ H_n, where [A, B] := AB - BA.

Proof.
a) Because A and B are Hamiltonian matrices, A^T J + JA = 0 and B^T J + JB = 0, respectively. By adding those two relations we obtain

    (A^T + B^T)J + J(A + B) = 0   ⇔   (A + B)^T J + J(A + B) = 0,

so A + B ∈ H_n.

b) Multiplying A^T J + JA = 0 by α gives

    (αA)^T J + J(αA) = 0,

so αA ∈ H_n.

c) We will prove that [A, B] = JM, where M = M^T; by Proposition 1.1 this implies [A, B] ∈ H_n. By Proposition 1.1 we may write A = JS and B = JR, where S = S^T and R = R^T. Then

    [A, B] = AB - BA = JSJR - JRJS = J(SJR - RJS),

from where, using the notation M := SJR - RJS, we obtain [A, B] = JM. Now we show that M = M^T:

    M^T = (SJR - RJS)^T = (SJR)^T - (RJS)^T = R^T J^T S^T - S^T J^T R^T
        = -RJS + SJR = M.
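The closure properties of Proposition 1.2 can be checked directly on random Hamiltonian matrices. A sketch, assuming the helper names `random_hamiltonian` and `is_hamiltonian` (they are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

def random_hamiltonian(rng, n, J):
    """Random Hamiltonian matrix A = J S with S symmetric (Proposition 1.1 b)."""
    M = rng.standard_normal((2 * n, 2 * n))
    return J @ (M + M.T)

def is_hamiltonian(A, J):
    """Check the defining relation A^T J + J A = 0."""
    return np.allclose(A.T @ J + J @ A, 0)

A = random_hamiltonian(rng, n, J)
B = random_hamiltonian(rng, n, J)

# Proposition 1.2: H_n is closed under sums, scalar multiples and the commutator
assert is_hamiltonian(A + B, J)
assert is_hamiltonian(2.5 * A, J)
assert is_hamiltonian(A @ B - B @ A, J)   # [A, B] = AB - BA
```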
Consequence. (H_n, [·,·]) is a Lie algebra.

Proof. We verify the necessary properties of the bracket: bilinearity, antisymmetry and the Jacobi identity.

i. [αA + βB, C] = α[A, C] + β[B, C] follows directly from the definition, so the operation is bilinear.

ii. [A, B] = AB - BA = -(BA - AB) = -[B, A], so the operation is antisymmetric.

iii. The Jacobi identity is satisfied:

    [[A, B], C] + [[C, A], B] + [[B, C], A]
      = (AB - BA)C - C(AB - BA) + (CA - AC)B - B(CA - AC) + (BC - CB)A - A(BC - CB)
      = ABC - BAC - CAB + CBA + CAB - ACB - BCA + BAC + BCA - CBA - ABC + ACB
      = 0.

Proposition 1.3. Let A ∈ H_n and let p_A(x) be the characteristic polynomial of the matrix A. Then:
a) p_A(x) = p_A(-x);
b) if p_A(c) = 0, then p_A(-c) = p_A(c̄) = p_A(-c̄) = 0, where c ∈ C.
Proof.
a) p_A(x) = det(A - xI_{2n}). Since A is Hamiltonian, A = JA^T J (indeed, A^T J = -JA and J^{-1} = -J give A^T = JAJ and hence A = JA^T J). Therefore

    p_A(x) = det(JA^T J - xI_{2n})
           = det(JA^T J + xJ^2)                 (since J^2 = -I_{2n})
           = det(J(A^T + xI_{2n})J)
           = det J · det(A^T + xI_{2n}) · det J
           = det(A^T + xI_{2n})
           = det((A + xI_{2n})^T)
           = det(A + xI_{2n})
           = det(A - (-x)I_{2n}) = p_A(-x).

b) If p_A(c) = 0, then p_A(-c) = 0 by a). Since p_A is a polynomial with real coefficients, also p_A(c̄) = 0, and applying a) once more gives p_A(-c̄) = 0.

2 The Algebraic Riccati Equation

Definition 2.1. The linear space 𝒱 spanned by the vectors v_1, ..., v_k is called an invariant subspace of the matrix A (or A-invariant) if for every v ∈ 𝒱, Av ∈ 𝒱. This is equivalent to Av_i ∈ 𝒱 for i = 1, ..., k.

We consider the general Algebraic Riccati Equation (ARE)

    0 = R(X) = F + A^T X + XA - XGX,                               (2)

where A, F, G, X ∈ R^{n×n} and F, G are symmetric matrices (F = F^T, G = G^T). Equation (2) is solved for a symmetric unknown X. First, we define the following 2n×2n matrix

    H = [  A    -G   ]
        [ -F   -A^T  ].                                            (3)
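The matrix H from (3) is Hamiltonian for any symmetric F, G, and by Proposition 1.3 its spectrum is symmetric with respect to the origin: eigenvalues come in pairs (λ, -λ). A numeric sketch with random symmetric F and G (sizes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
T = rng.standard_normal((n, n)); F = (T + T.T) / 2   # F = F^T
T = rng.standard_normal((n, n)); G = (T + T.T) / 2   # G = G^T

# H = [ A  -G ; -F  -A^T ] as in (3)
H = np.block([[A, -G], [-F, -A.T]])
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

# H is Hamiltonian: (JH)^T = JH
assert np.allclose((J @ H).T, J @ H)

# spectrum symmetry from Proposition 1.3: lambda(H) = -lambda(H) as a set
ev = np.linalg.eigvals(H)
assert np.allclose(np.sort_complex(ev), np.sort_complex(-ev))
```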
Let the columns of [U^T, V^T]^T, with U, V ∈ R^{n×n}, span an H-invariant n-dimensional subspace, i.e.,

    [  A    -G   ] [ U ]   [ U ]
    [ -F   -A^T  ] [ V ] = [ V ] Z,     Z ∈ R^{n×n},  λ(Z) ⊂ λ(H).     (4)

Assuming that U is nonsingular, we obtain from the first block row of (4)

    AU - GV = UZ   ⟹   U^{-1}AU - U^{-1}GV = Z.

Inserting this into the equation resulting from the second block row of (4) yields

    -FU - A^T V = VZ = VU^{-1}AU - VU^{-1}GV.

Multiplying from the right by U^{-1}, the above equation is equivalent to

    0 = F + A^T VU^{-1} + VU^{-1}A - VU^{-1}GVU^{-1},

so setting X := VU^{-1} we see that X solves (2). Hence, from an H-invariant subspace of dimension n, given as the range of [U^T, V^T]^T with U nonsingular, we obtain a solution of the Algebraic Riccati Equation (ARE). What remains is the problem of how to choose U, V such that U is nonsingular, VU^{-1} is symmetric and X is stabilizing. Before we can solve this problem, we need some properties of the matrix H in (3).

Remark. It is easy to see that any Hamiltonian matrix must have a block representation of the form shown in (3). Moreover, it is easy to verify that the matrix H defined in (3) is Hamiltonian, since (JH)^T = JH. By using the similarity transformation

    J^{-1}HJ = -JHJ = -H^T,                                        (5)

it can be shown that if λ is an eigenvalue of H, then -λ is also an eigenvalue of H. Let χ be the subspace spanned by the eigenvectors of H corresponding to the eigenvalues with negative real part (the stable eigenvalues). This subspace is invariant under H; i.e., if x ∈ χ then Hx ∈ χ.
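The construction X = VU^{-1} can be carried out numerically via an eigendecomposition of H. A sketch for a generic problem where G = BB^T with random B and F = I (this choice, including the seed, is an assumption made so that H generically has no eigenvalues on the imaginary axis):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
G = B @ B.T        # G = G^T >= 0
F = np.eye(n)      # F = F^T > 0

H = np.block([[A, -G], [-F, -A.T]])

# eigenvectors belonging to the n stable eigenvalues span the subspace chi
ev, W = np.linalg.eig(H)
stable = ev.real < 0
assert stable.sum() == n        # no eigenvalues on the imaginary axis

U = W[:n, stable]               # upper block, U in (4)
V = W[n:, stable]               # lower block, V in (4)

# X = V U^{-1}; the imaginary parts cancel since H is real
X = np.real(V @ np.linalg.inv(U))

# X solves the ARE (2): F + A^T X + X A - X G X = 0, and is symmetric
residual = F + A.T @ X + X @ A - X @ G @ X
assert np.allclose(residual, 0, atol=1e-6)
assert np.allclose(X, X.T, atol=1e-6)
```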
Theorem. Consider the Hamiltonian matrix H from (3) with no eigenvalues on the imaginary axis, and the n-dimensional invariant subspace χ ⊂ R^{2n} given by

    χ = Im [ X_1 ]
           [ X_2 ],     X_1, X_2 ∈ R^{n×n},

which is spanned by the eigenvectors associated with the stable eigenvalues. If X_1 is invertible, then X = X_2 X_1^{-1} is a (stabilizing and symmetric) solution of the corresponding Riccati equation, and A - GX is stable.

Proof. Because χ is an invariant subspace, there exists a stable matrix H̃ ∈ R^{n×n} such that

    H [ X_1 ]   [ X_1 ]
      [ X_2 ] = [ X_2 ] H̃.

Multiplying from the right by X_1^{-1} and writing X := X_2 X_1^{-1}, we get

    [  A    -G   ] [ I ]   [ I ]
    [ -F   -A^T  ] [ X ] = [ X ] X_1 H̃ X_1^{-1}.

The first block row gives A - GX = X_1 H̃ X_1^{-1}, and inserting this into the second block row yields

    -F - A^T X = X(A - GX) = XA - XGX   ⟹   F + A^T X + XA - XGX = 0,

so X solves (2). From the other side,

    A - GX = X_1 H̃ X_1^{-1}

shows that A - GX is similar to the stable matrix H̃, which proves stability.

To show that X is symmetric, we use equation (5): premultiplying the invariance relation by [X_1^T  X_2^T] J gives

    [ X_1 ]^T     [ X_1 ]   [ X_1 ]^T   [ X_1 ]
    [ X_2 ]  J H  [ X_2 ] = [ X_2 ]  J  [ X_2 ] H̃ = (X_1^T X_2 - X_2^T X_1) H̃.
The left-hand side of this equation is symmetric (it has the form W^T (JH) W with JH symmetric), and therefore the right-hand side is also symmetric:

    (X_1^T X_2 - X_2^T X_1) H̃ = [(X_1^T X_2 - X_2^T X_1) H̃]^T
                               = H̃^T (X_2^T X_1 - X_1^T X_2)
                               = -H̃^T (X_1^T X_2 - X_2^T X_1),

hence

    (X_1^T X_2 - X_2^T X_1) H̃ + H̃^T (X_1^T X_2 - X_2^T X_1) = 0.

This is a Lyapunov equation with the stable matrix H̃, whose only solution is zero, and therefore

    X_1^T X_2 - X_2^T X_1 = 0   ⟹   X_1^T X_2 = X_2^T X_1.

It follows that

    X = X_2 X_1^{-1} = X_1^{-T} (X_1^T X_2) X_1^{-1} = X_1^{-T} (X_2^T X_1) X_1^{-1} = (X_2 X_1^{-1})^T = X^T,

so X is indeed symmetric.

Remark. As the matrix H is real, it can be shown that the solution X = X_2 X_1^{-1} is also real. Furthermore, this solution does not depend on the particular basis chosen for χ, and it is the unique stabilizing solution determined by this invariant subspace.
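The conclusions of the Theorem can be cross-checked against a library solver. SciPy's `scipy.linalg.solve_continuous_are(a, b, q, r)` solves A^T X + XA - XBR^{-1}B^T X + Q = 0, which matches (2) with F = Q and G = BR^{-1}B^T; below we take R = I (the problem data and seed are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

rng = np.random.default_rng(4)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
G = B @ B.T
F = np.eye(n)   # F plays the role of Q, with R = I

# stabilizing solution via the stable invariant subspace of H, as in the Theorem
H = np.block([[A, -G], [-F, -A.T]])
ev, W = np.linalg.eig(H)
stable = ev.real < 0
X1 = W[:n, stable]
X2 = W[n:, stable]
X = np.real(X2 @ np.linalg.inv(X1))

# reference solution from SciPy agrees (the stabilizing solution is unique)
X_ref = solve_continuous_are(A, B, F, np.eye(n))
assert np.allclose(X, X_ref, atol=1e-6)

# Theorem: X is symmetric and A - G X is stable
assert np.allclose(X, X.T, atol=1e-6)
assert np.all(np.linalg.eigvals(A - G @ X).real < 0)
```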