Lectures on Dynamic Systems and Control
Mohammed Dahleh, Munther A. Dahleh, George Verghese
Department of Electrical Engineering and Computer Science
Massachusetts Institute of Technology
Chapter 6

Balanced Realization

6.1 Introduction

One popular approach for obtaining a minimal realization is known as balanced realization. In this approach, a new state-space description is obtained so that the reachability and observability gramians are simultaneously diagonalized. This defines a new set of invariant parameters known as Hankel singular values. This approach plays a major role in model reduction, which will be highlighted in this chapter.

6.2 Balanced Realization

Let us start with a system G with minimal realization

    G : (A, B, C).

As we have seen in an earlier lecture, the controllability gramian P and the observability gramian Q are obtained as solutions to the following Lyapunov equations:

    AP + PA' + BB' = 0,
    A'Q + QA + C'C = 0.

P and Q are symmetric, and since the realization is minimal they are also positive definite. The eigenvalues of the product of the controllability and observability gramians play an important role in system theory and control. We define the Hankel singular values σ_i as the square roots of the eigenvalues of PQ:

    σ_i := (λ_i(PQ))^{1/2}.

We would like to obtain a coordinate transformation T that results in a realization for which the controllability and observability gramians are equal and diagonal. The diagonal entries of the transformed gramians will be the Hankel singular values. With the coordinate transformation T, the new system realization is given by

    Ĝ : (T⁻¹AT, T⁻¹B, CT) = (Â, B̂, Ĉ).
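As a concrete illustration, the gramians and Hankel singular values can be computed numerically. The sketch below is not from the notes: the system matrices A, B, C are an arbitrary stable, minimal two-state example, and SciPy's `solve_continuous_lyapunov` is one (assumed available) way to solve the two Lyapunov equations above.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable, minimal realization (not from the notes).
A = np.array([[-1.0, 1.0], [0.0, -2.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Controllability gramian: A P + P A' + B B' = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
# Observability gramian:   A' Q + Q A + C' C = 0
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Hankel singular values: sigma_i = sqrt(lambda_i(P Q)), sorted descending.
hsv = np.sqrt(np.sort(np.linalg.eigvals(P @ Q).real)[::-1])
print(hsv)
```

Since the realization is minimal, P and Q are positive definite and all σ_i come out real and strictly positive.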
The Lyapunov equations in the new coordinates are given by

    Â(T⁻¹P(T⁻¹)') + (T⁻¹P(T⁻¹)')Â' + B̂B̂' = 0,
    Â'(T'QT) + (T'QT)Â + Ĉ'Ĉ = 0.

Therefore the controllability and observability gramians in the new coordinate system are given by

    P̂ = T⁻¹P(T⁻¹)',    Q̂ = T'QT.

We are looking for a transformation T such that

    P̂ = Q̂ = Σ = diag(σ₁, σ₂, …, σ_n).

We have the relation

    (T⁻¹P(T⁻¹)')(T'QT) = T⁻¹PQT = Σ².    (6.1)

Since Q = Q' and is positive definite, we can factor it as Q = R'R, where R is an invertible matrix. We can write equation 6.1 as T⁻¹PR'RT = Σ², which is equivalent to

    (RT)⁻¹RPR'(RT) = Σ².    (6.2)

Equation 6.2 means that RPR' is similar to Σ², and it is positive definite. Therefore, there exists an orthogonal transformation U, U'U = I, such that

    RPR' = UΣ²U'.    (6.3)

By setting RT = UΣ^{1/2}, we arrive at a definition for T and T⁻¹:

    T = R⁻¹UΣ^{1/2},    T⁻¹ = Σ^{-1/2}U'R.

With this transformation it follows that

    P̂ = T⁻¹P(T⁻¹)' = (Σ^{-1/2}U'R)P(R'UΣ^{-1/2}) = Σ^{-1/2}U'(UΣ²U')UΣ^{-1/2} = Σ

and

    Q̂ = T'QT = (Σ^{1/2}U'(R')⁻¹)(R'R)(R⁻¹UΣ^{1/2}) = Σ^{1/2}U'UΣ^{1/2} = Σ.
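This construction translates directly into a numerical recipe: factor Q by a Cholesky factorization, diagonalize RPR', and assemble T. A minimal sketch, assuming an illustrative two-state system (the matrices are not from the notes; `cholesky` and `eigh` are standard SciPy/NumPy routines):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky

# Illustrative stable, minimal realization (not from the notes).
A = np.array([[-1.0, 1.0], [0.0, -2.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])

P = solve_continuous_lyapunov(A, -B @ B.T)    # A P + P A' + B B' = 0
Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # A' Q + Q A + C' C = 0

# Factor Q = R'R (upper-triangular Cholesky factor R).
R = cholesky(Q)
# Diagonalize R P R' = U Sigma^2 U' (symmetric eigenproblem), descending order.
lam, U = np.linalg.eigh(R @ P @ R.T)
idx = np.argsort(lam)[::-1]
lam, U = lam[idx], U[:, idx]
sigma = np.sqrt(lam)                          # Hankel singular values
# T = R^{-1} U Sigma^{1/2},  T^{-1} = Sigma^{-1/2} U' R
T = np.linalg.solve(R, U) * np.sqrt(sigma)
Tinv = (U.T / np.sqrt(sigma)[:, None]) @ R
```

Both transformed gramians then equal diag(σ₁, …, σ_n), which is exactly the balancing property derived above.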
6.3 Model Reduction by Balanced Truncation

Suppose we start with a system

    G : (A, B, C),

where A is asymptotically stable, and suppose T is the transformation that converts this realization to a balanced realization (Â, B̂, Ĉ) with

    P̂ = Q̂ = Σ = diag(σ₁, σ₂, …, σ_n).

In many applications it may be beneficial to consider only the subsystem of G that corresponds to the Hankel singular values that are larger than a certain small constant. For that reason, suppose we partition Σ as

    Σ = [ Σ₁  0 ;  0  Σ₂ ],

where Σ₂ contains the small Hankel singular values. We can partition the realization of G accordingly as

    Â = [ Â₁₁  Â₁₂ ; Â₂₁  Â₂₂ ],    B̂ = [ B̂₁ ; B̂₂ ],    Ĉ = [ Ĉ₁  Ĉ₂ ].

Recall that the following Lyapunov equations hold:

    ÂΣ + ΣÂ' + B̂B̂' = 0,
    Â'Σ + ΣÂ + Ĉ'Ĉ = 0,

which can be expanded blockwise as

    [ Â₁₁Σ₁ + Σ₁Â₁₁' + B̂₁B̂₁'    Â₁₂Σ₂ + Σ₁Â₂₁' + B̂₁B̂₂' ;
      Â₂₁Σ₁ + Σ₂Â₁₂' + B̂₂B̂₁'    Â₂₂Σ₂ + Σ₂Â₂₂' + B̂₂B̂₂' ] = 0

and

    [ Â₁₁'Σ₁ + Σ₁Â₁₁ + Ĉ₁'Ĉ₁    Â₂₁'Σ₂ + Σ₁Â₁₂ + Ĉ₁'Ĉ₂ ;
      Â₁₂'Σ₁ + Σ₂Â₂₁ + Ĉ₂'Ĉ₁    Â₂₂'Σ₂ + Σ₂Â₂₂ + Ĉ₂'Ĉ₂ ] = 0.

From the above two matrix equations we get the following set of equations:

    Â₁₁Σ₁ + Σ₁Â₁₁' + B̂₁B̂₁' = 0,    (6.4)
    Â₂₁Σ₁ + Σ₂Â₁₂' + B̂₂B̂₁' = 0,    (6.5)
    Â₂₂Σ₂ + Σ₂Â₂₂' + B̂₂B̂₂' = 0,    (6.6)
    Â₁₁'Σ₁ + Σ₁Â₁₁ + Ĉ₁'Ĉ₁ = 0,    (6.7)
    Â₁₂'Σ₁ + Σ₂Â₂₁ + Ĉ₂'Ĉ₁ = 0,    (6.8)
    Â₂₂'Σ₂ + Σ₂Â₂₂ + Ĉ₂'Ĉ₂ = 0.    (6.9)

From this decomposition we can extract two subsystems:

    G₁ : (Â₁₁, B̂₁, Ĉ₁),    G₂ : (Â₂₂, B̂₂, Ĉ₂).

Theorem 6.1 Suppose G is an asymptotically stable system. If Σ₁ and Σ₂ do not have any common diagonal elements, then G₁ and G₂ are asymptotically stable.

Proof: Let us show that the subsystem G₁ : (Â₁₁, B̂₁, Ĉ₁) is asymptotically stable. Since Σ₁ satisfies the Lyapunov equation

    Â₁₁Σ₁ + Σ₁Â₁₁' + B̂₁B̂₁' = 0,

it immediately follows that all the eigenvalues of Â₁₁ lie in the closed left half of the complex plane; that is, Re λ_i(Â₁₁) ≤ 0. In order to show asymptotic stability we must show that Â₁₁ has no purely imaginary eigenvalues. Suppose jω is an eigenvalue of Â₁₁, and let v be an associated eigenvector, (Â₁₁ − jωI)v = 0. Assume that the kernel of (Â₁₁ − jωI) is one-dimensional; the general case, where there may be several independent eigenvectors associated with jω, can be handled by a slight modification of the argument. Equation 6.7 can be written as

    (Â₁₁ − jωI)'Σ₁ + Σ₁(Â₁₁ − jωI) + Ĉ₁'Ĉ₁ = 0.

By multiplying the above equation by v on the right and v' on the left we get

    v'(Â₁₁ − jωI)'Σ₁v + v'Σ₁(Â₁₁ − jωI)v + v'Ĉ₁'Ĉ₁v = 0,

which implies that (Ĉ₁v)'(Ĉ₁v) = 0, and this in turn implies that

    Ĉ₁v = 0.    (6.10)

Again from equation 6.7, multiplying on the right by v, we get

    (Â₁₁ − jωI)'Σ₁v + Σ₁(Â₁₁ − jωI)v + Ĉ₁'Ĉ₁v = 0,

which implies that

    (Â₁₁ − jωI)'Σ₁v = 0.    (6.11)

Now we multiply equation 6.4, written as

    (Â₁₁ − jωI)Σ₁ + Σ₁(Â₁₁ − jωI)' + B̂₁B̂₁' = 0,

from the right by Σ₁v and from the left by v'Σ₁ to get

    v'Σ₁(Â₁₁ − jωI)Σ₁v + v'Σ₁(Â₁₁ − jωI)'Σ₁v + v'Σ₁B̂₁B̂₁'Σ₁v = 0.
The first two terms vanish by equation 6.11, so this implies that (B̂₁'Σ₁v)'(B̂₁'Σ₁v) = 0, and hence B̂₁'Σ₁v = 0. By multiplying equation 6.4 on the right by Σ₁v we get

    (Â₁₁ − jωI)Σ₁²v + Σ₁(Â₁₁ − jωI)'Σ₁v + B̂₁B̂₁'Σ₁v = 0,

and hence

    (Â₁₁ − jωI)Σ₁²v = 0.    (6.12)

Since the kernel of (Â₁₁ − jωI) is one-dimensional, and both v and Σ₁²v belong to it, it follows that Σ₁²v = σ̂²v, where σ̂² is one of the diagonal elements of Σ₁². Now multiply equation 6.5 on the right by Σ₁v and equation 6.8 on the right by v; using B̂₁'Σ₁v = 0 and Ĉ₁v = 0 we get

    Â₂₁Σ₁²v + Σ₂Â₁₂'Σ₁v = 0    (6.13)

and

    Â₁₂'Σ₁v + Σ₂Â₂₁v = 0.    (6.14)

From equations 6.13 and 6.14 we get

    −Â₂₁σ̂²v + Σ₂²Â₂₁v = 0,

which can be written as

    (Σ₂² − σ̂²I)Â₂₁v = 0.

Since by assumption Σ₁ and Σ₂ have no common diagonal elements, σ̂²I and Σ₂² have no common eigenvalues, and hence Â₂₁v = 0. We now have

    (Â₁₁ − jωI)v = 0,    Â₂₁v = 0,

which can be written as

    [ Â₁₁  Â₁₂ ; Â₂₁  Â₂₂ ] [ v ; 0 ] = jω [ v ; 0 ].

This statement implies that jω is an eigenvalue of Â, which contradicts the assumption of the theorem that G is asymptotically stable. The same argument applied to equations 6.6 and 6.9 shows that Â₂₂ has no purely imaginary eigenvalues, so G₂ is asymptotically stable as well.
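The truncation procedure of Section 6.3 and the stability claim of Theorem 6.1 can be sanity-checked numerically. In the sketch below everything specific is an illustrative assumption: the three-state system matrices, the choice to keep k = 1 state, and the SciPy routines used to solve the Lyapunov equations. It balances the system, partitions the balanced Â, and inspects the eigenvalues of the diagonal blocks.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky

# Illustrative stable, minimal 3-state system (not from the notes).
A = np.diag([-1.0, -2.0, -5.0])
B = np.array([[1.0], [1.0], [1.0]])
C = np.array([[1.0, 1.0, 1.0]])

# Gramians: A P + P A' + B B' = 0 and A' Q + Q A + C' C = 0.
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Balancing transformation: Q = R'R, then R P R' = U Sigma^2 U'.
R = cholesky(Q)
lam, U = np.linalg.eigh(R @ P @ R.T)
idx = np.argsort(lam)[::-1]
lam, U = lam[idx], U[:, idx]
sigma = np.sqrt(lam)                        # Hankel singular values, descending
T = np.linalg.solve(R, U) * np.sqrt(sigma)  # T = R^{-1} U Sigma^{1/2}
Tinv = (U.T / np.sqrt(sigma)[:, None]) @ R  # T^{-1} = Sigma^{-1/2} U' R

# Balanced realization and its partition into the two subsystems.
Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
k = 1                                       # keep only the largest sigma (illustrative)
A11, B1, C1 = Ab[:k, :k], Bb[:k, :], Cb[:, :k]
A22, B2, C2 = Ab[k:, k:], Bb[k:, :], Cb[:, k:]

# Theorem 6.1: both diagonal blocks should be asymptotically stable.
print(np.linalg.eigvals(A11).real, np.linalg.eigvals(A22).real)
```

Here G₁ = (A11, B1, C1) is the reduced-order model produced by balanced truncation, and G₂ = (A22, B2, C2) is the discarded subsystem; the theorem predicts both have all eigenvalues in the open left half-plane when Σ₁ and Σ₂ share no diagonal entries.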