Lectures on Dynamic Systems and Control
Mohammed Dahleh, Munther A. Dahleh, George Verghese
Department of Electrical Engineering and Computer Science
Massachusetts Institute of Technology
Chapter 6

Balanced Realization

6.1 Introduction

One popular approach for obtaining a minimal realization is known as Balanced Realization. In this approach, a new state-space description is obtained so that the controllability and observability gramians are equal and diagonal. This defines a new set of invariant parameters known as Hankel singular values. This approach plays a major role in model reduction, which will be highlighted in this chapter.

6.2 Balanced Realization

Let us start with a system $G$ with minimal realization
$$ G : \left[ \begin{array}{c|c} A & B \\ \hline C & \end{array} \right]. $$
As we have seen in an earlier lecture, the controllability gramian $P$ and the observability gramian $Q$ are obtained as solutions to the following Lyapunov equations:
$$ AP + PA^* + BB^* = 0, $$
$$ A^*Q + QA + C^*C = 0. $$
$P$ and $Q$ are symmetric, and since the realization is minimal they are also positive definite. The eigenvalues of the product of the controllability and observability gramians play an important role in system theory and control. We define the Hankel singular values, $\sigma_i$, as the square roots of the eigenvalues of $PQ$:
$$ \sigma_i \triangleq \left( \lambda_i(PQ) \right)^{1/2}. $$
We would like to obtain a coordinate transformation, $T$, that results in a realization for which the controllability and observability gramians are equal and diagonal. The diagonal entries of the transformed controllability and observability gramians will be the Hankel singular values. With the coordinate transformation $T$, the new system realization is given by
$$ \hat{G} = \left[ \begin{array}{c|c} T^{-1}AT & T^{-1}B \\ \hline CT & \end{array} \right] = \left[ \begin{array}{c|c} \hat{A} & \hat{B} \\ \hline \hat{C} & \end{array} \right], $$
and the Lyapunov equations in the new coordinates are given by
$$ \hat{A}(T^{-1}PT^{-*}) + (T^{-1}PT^{-*})\hat{A}^* + \hat{B}\hat{B}^* = 0, $$
$$ \hat{A}^*(T^*QT) + (T^*QT)\hat{A} + \hat{C}^*\hat{C} = 0. $$
Therefore the controllability and observability gramians in the new coordinate system are given by
$$ \hat{P} = T^{-1}PT^{-*}, \qquad \hat{Q} = T^*QT. $$
We are looking for a transformation $T$ such that
$$ \hat{P} = \hat{Q} = \Sigma = \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \end{bmatrix}. $$
We have the relation
$$ (T^{-1}PT^{-*})(T^*QT) = T^{-1}PQT = \Sigma^2. \qquad (6.1) $$
Since $Q = Q^*$ and is positive definite, we can factor it as $Q = R^*R$, where $R$ is an invertible matrix. We can write equation (6.1) as $T^{-1}PR^*RT = \Sigma^2$, which is equivalent to
$$ (RT)^{-1}\,RPR^*\,(RT) = \Sigma^2. \qquad (6.2) $$
Equation (6.2) means that $RPR^*$ is similar to $\Sigma^2$ and is positive definite. Therefore, there exists an orthogonal transformation $U$, $U^*U = I$, such that
$$ RPR^* = U\Sigma^2U^*. \qquad (6.3) $$
By setting $(RT)^{-1}U\Sigma^{1/2} = I$, we arrive at a definition for $T$ and $T^{-1}$:
$$ T = R^{-1}U\Sigma^{1/2}, \qquad T^{-1} = \Sigma^{-1/2}U^*R. $$
With this transformation it follows that
$$ \hat{P} = (\Sigma^{-1/2}U^*R)\,P\,(R^*U\Sigma^{-1/2}) = \Sigma^{-1/2}U^*(U\Sigma^2U^*)U\Sigma^{-1/2} = \Sigma, $$
and
$$ \hat{Q} = (R^{-1}U\Sigma^{1/2})^*\,Q\,(R^{-1}U\Sigma^{1/2}) = \Sigma^{1/2}U^*(R^{-*}R^*RR^{-1})U\Sigma^{1/2} = \Sigma. $$
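The construction above can be sketched numerically. The following is a minimal sketch, not part of the original notes, using SciPy's Lyapunov solver on a small hypothetical system (all matrix values are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, eigh

# Hypothetical stable minimal realization (illustrative values only).
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])
B = np.array([[1.0],
              [1.0]])
C = np.array([[1.0, 1.0]])

# Gramians from the two Lyapunov equations:
#   A P + P A* + B B* = 0   and   A* Q + Q A + C* C = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Factor Q = R* R (Cholesky, R upper triangular) and
# diagonalize R P R* = U Sigma^2 U*.
R = cholesky(Q)                       # Q = R.T @ R
s2, U = eigh(R @ P @ R.T)             # eigenvalues of R P R* (ascending)
s2, U = s2[::-1], U[:, ::-1]          # reorder so sigma_1 >= ... >= sigma_n
sigma = np.sqrt(s2)                   # Hankel singular values

# T = R^{-1} U Sigma^{1/2},   T^{-1} = Sigma^{-1/2} U* R
T = np.linalg.solve(R, U @ np.diag(sigma ** 0.5))
Tinv = np.diag(sigma ** -0.5) @ U.T @ R

# In the new coordinates both gramians equal Sigma.
P_hat = Tinv @ P @ Tinv.T
Q_hat = T.T @ Q @ T
Sigma = np.diag(sigma)
print(np.allclose(P_hat, Sigma), np.allclose(Q_hat, Sigma))
```

The Cholesky factorization plays the role of $Q = R^*R$, and the symmetric eigendecomposition of $RPR^*$ supplies the orthogonal $U$ of equation (6.3); for real matrices $^*$ reduces to the transpose.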
6.3 Model Reduction by Balanced Truncation

Suppose we start with a system
$$ G : \left[ \begin{array}{c|c} A & B \\ \hline C & \end{array} \right], $$
where $A$ is asymptotically stable. Suppose $T$ is the transformation that converts the above realization to a balanced realization,
$$ \hat{G} : \left[ \begin{array}{c|c} \hat{A} & \hat{B} \\ \hline \hat{C} & \end{array} \right], $$
with $\hat{P} = \hat{Q} = \Sigma = \mathrm{diag}(\sigma_1, \ldots, \sigma_n)$. In many applications it may be beneficial to consider only the subsystem of $\hat{G}$ that corresponds to the Hankel singular values that are larger than a certain small constant. For that reason, suppose we partition $\Sigma$ as
$$ \Sigma = \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix}, $$
where $\Sigma_2$ contains the small Hankel singular values. We can partition the realization of $\hat{G}$ accordingly as
$$ \hat{G} \triangleq \left[ \begin{array}{cc|c} \hat{A}_{11} & \hat{A}_{12} & \hat{B}_1 \\ \hat{A}_{21} & \hat{A}_{22} & \hat{B}_2 \\ \hline \hat{C}_1 & \hat{C}_2 & \end{array} \right]. $$
Recall that the following Lyapunov equations hold:
$$ \hat{A}\Sigma + \Sigma\hat{A}^* + \hat{B}\hat{B}^* = 0, $$
$$ \hat{A}^*\Sigma + \Sigma\hat{A} + \hat{C}^*\hat{C} = 0, $$
which can be expanded as
$$ \begin{bmatrix} \hat{A}_{11} & \hat{A}_{12} \\ \hat{A}_{21} & \hat{A}_{22} \end{bmatrix} \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} + \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} \begin{bmatrix} \hat{A}_{11}^* & \hat{A}_{21}^* \\ \hat{A}_{12}^* & \hat{A}_{22}^* \end{bmatrix} + \begin{bmatrix} \hat{B}_1 \\ \hat{B}_2 \end{bmatrix} \begin{bmatrix} \hat{B}_1^* & \hat{B}_2^* \end{bmatrix} = 0, $$
$$ \begin{bmatrix} \hat{A}_{11}^* & \hat{A}_{21}^* \\ \hat{A}_{12}^* & \hat{A}_{22}^* \end{bmatrix} \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} + \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} \begin{bmatrix} \hat{A}_{11} & \hat{A}_{12} \\ \hat{A}_{21} & \hat{A}_{22} \end{bmatrix} + \begin{bmatrix} \hat{C}_1^* \\ \hat{C}_2^* \end{bmatrix} \begin{bmatrix} \hat{C}_1 & \hat{C}_2 \end{bmatrix} = 0. $$
From the above two matrix equations we get the following set of equations:
$$ \hat{A}_{11}\Sigma_1 + \Sigma_1\hat{A}_{11}^* + \hat{B}_1\hat{B}_1^* = 0, \qquad (6.4) $$
$$ \hat{A}_{21}\Sigma_1 + \Sigma_2\hat{A}_{12}^* + \hat{B}_2\hat{B}_1^* = 0, \qquad (6.5) $$
$$ \hat{A}_{22}\Sigma_2 + \Sigma_2\hat{A}_{22}^* + \hat{B}_2\hat{B}_2^* = 0, \qquad (6.6) $$
$$ \hat{A}_{11}^*\Sigma_1 + \Sigma_1\hat{A}_{11} + \hat{C}_1^*\hat{C}_1 = 0, \qquad (6.7) $$
$$ \hat{A}_{21}^*\Sigma_2 + \Sigma_1\hat{A}_{12} + \hat{C}_1^*\hat{C}_2 = 0, \qquad (6.8) $$
$$ \hat{A}_{22}^*\Sigma_2 + \Sigma_2\hat{A}_{22} + \hat{C}_2^*\hat{C}_2 = 0. \qquad (6.9) $$
From this decomposition we can extract two subsystems:
$$ G_1 : \left[ \begin{array}{c|c} \hat{A}_{11} & \hat{B}_1 \\ \hline \hat{C}_1 & \end{array} \right], \qquad G_2 : \left[ \begin{array}{c|c} \hat{A}_{22} & \hat{B}_2 \\ \hline \hat{C}_2 & \end{array} \right]. $$

Theorem 6.1  Suppose $G$ is an asymptotically stable system. If $\Sigma_1$ and $\Sigma_2$ do not have any common diagonal elements, then $G_1$ and $G_2$ are asymptotically stable.

Proof: Let us show that the subsystem $G_1 : (\hat{A}_{11}, \hat{B}_1, \hat{C}_1)$ is asymptotically stable. Since $\hat{A}_{11}$ satisfies the Lyapunov equation
$$ \hat{A}_{11}\Sigma_1 + \Sigma_1\hat{A}_{11}^* + \hat{B}_1\hat{B}_1^* = 0, $$
it immediately follows that all the eigenvalues of $\hat{A}_{11}$ must be in the closed left half of the complex plane; that is, $\mathrm{Re}\,\lambda_i(\hat{A}_{11}) \le 0$. In order to show asymptotic stability we must show that $\hat{A}_{11}$ has no purely imaginary eigenvalues. Suppose $j\omega$ is an eigenvalue of $\hat{A}_{11}$, and let $v$ be an eigenvector associated with $j\omega$: $(\hat{A}_{11} - j\omega I)v = 0$. Assume that the kernel of $(\hat{A}_{11} - j\omega I)$ is one-dimensional; the general case, where there may be several independent eigenvectors associated with $j\omega$, can be handled by a slight modification of the argument. Equation (6.7) can be written as
$$ (\hat{A}_{11} - j\omega I)^*\Sigma_1 + \Sigma_1(\hat{A}_{11} - j\omega I) + \hat{C}_1^*\hat{C}_1 = 0. $$
By multiplying the above equation by $v$ on the right and $v^*$ on the left we get
$$ v^*(\hat{A}_{11} - j\omega I)^*\Sigma_1 v + v^*\Sigma_1(\hat{A}_{11} - j\omega I)v + v^*\hat{C}_1^*\hat{C}_1 v = 0, $$
which implies that $(\hat{C}_1 v)^*(\hat{C}_1 v) = 0$, and this in turn implies that
$$ \hat{C}_1 v = 0. \qquad (6.10) $$
Again from equation (6.7) we get
$$ (\hat{A}_{11} - j\omega I)^*\Sigma_1 v + \Sigma_1(\hat{A}_{11} - j\omega I)v + \hat{C}_1^*\hat{C}_1 v = 0, $$
which implies that
$$ (\hat{A}_{11} - j\omega I)^*\Sigma_1 v = 0. \qquad (6.11) $$
Now we write equation (6.4) as
$$ (\hat{A}_{11} - j\omega I)\Sigma_1 + \Sigma_1(\hat{A}_{11} - j\omega I)^* + \hat{B}_1\hat{B}_1^* = 0, $$
and multiply it from the right by $\Sigma_1 v$ and from the left by $v^*\Sigma_1$ to get
$$ v^*\Sigma_1(\hat{A}_{11} - j\omega I)\Sigma_1\,\Sigma_1 v + v^*\Sigma_1^2(\hat{A}_{11} - j\omega I)^*\Sigma_1 v + v^*\Sigma_1\hat{B}_1\hat{B}_1^*\Sigma_1 v = 0.
$$
The first two terms vanish by equation (6.11). This implies that $(v^*\Sigma_1\hat{B}_1)(\hat{B}_1^*\Sigma_1 v) = 0$, and hence $\hat{B}_1^*\Sigma_1 v = 0$. By multiplying equation (6.4) on the right by $\Sigma_1 v$ we get
$$ (\hat{A}_{11} - j\omega I)\Sigma_1^2 v + \Sigma_1(\hat{A}_{11} - j\omega I)^*\Sigma_1 v + \hat{B}_1\hat{B}_1^*\Sigma_1 v = 0, $$
and hence
$$ (\hat{A}_{11} - j\omega I)\Sigma_1^2 v = 0. \qquad (6.12) $$
Since the kernel of $(\hat{A}_{11} - j\omega I)$ is one-dimensional, and both $v$ and $\Sigma_1^2 v$ belong to it, it follows that $\Sigma_1^2 v = \hat{\sigma}^2 v$, where $\hat{\sigma}^2$ is one of the diagonal elements of $\Sigma_1^2$. Now multiply equation (6.5) on the right by $\Sigma_1 v$, and equation (6.8) on the left by $v^*$, to get
$$ \hat{\sigma}^2\hat{A}_{21}v + \Sigma_2\hat{A}_{12}^*\Sigma_1 v = 0 \qquad (6.13) $$
(using $\hat{A}_{21}\Sigma_1^2 v = \hat{\sigma}^2\hat{A}_{21}v$ and $\hat{B}_1^*\Sigma_1 v = 0$), and
$$ v^*\hat{A}_{21}^*\Sigma_2 + v^*\Sigma_1\hat{A}_{12} = 0 \qquad (6.14) $$
(using $\hat{C}_1 v = 0$). From equations (6.13) and (6.14), taking the conjugate transpose of (6.13) and substituting $v^*\Sigma_1\hat{A}_{12} = -v^*\hat{A}_{21}^*\Sigma_2$, we get
$$ -v^*\hat{A}_{21}^*\Sigma_2^2 + \hat{\sigma}^2 v^*\hat{A}_{21}^* = 0, $$
which can be written as
$$ (v^*\hat{A}_{21}^*)\left( \hat{\sigma}^2 I - \Sigma_2^2 \right) = 0. $$
Since by assumption $\Sigma_1$ and $\Sigma_2$ have no common diagonal elements, $\hat{\sigma}^2 I$ and $\Sigma_2^2$ have no common eigenvalues, so $\hat{\sigma}^2 I - \Sigma_2^2$ is invertible, and hence $\hat{A}_{21}v = 0$. We have $(\hat{A}_{11} - j\omega I)v = \hat{A}_{21}v = 0$, which can be written as
$$ \begin{bmatrix} \hat{A}_{11} & \hat{A}_{12} \\ \hat{A}_{21} & \hat{A}_{22} \end{bmatrix} \begin{bmatrix} v \\ 0 \end{bmatrix} = j\omega \begin{bmatrix} v \\ 0 \end{bmatrix}. $$
This statement implies that $j\omega$ is an eigenvalue of $\hat{A}$, which contradicts the assumption of the theorem that $G$ is asymptotically stable.
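The theorem can be checked numerically. The sketch below is not part of the original notes: it balances a randomly generated stable system (all values are hypothetical), truncates it, and verifies that both diagonal subsystems $\hat{A}_{11}$ and $\hat{A}_{22}$ are asymptotically stable. For a random system the Hankel singular values are generically distinct, so $\Sigma_1$ and $\Sigma_2$ share no diagonal entries and the theorem applies:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, eigh

def balanced_realization(A, B, C):
    """Balance a stable minimal realization; return (A_bal, B_bal, C_bal, sigma)."""
    P = solve_continuous_lyapunov(A, -B @ B.T)      # A P + P A* + B B* = 0
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)    # A* Q + Q A + C* C = 0
    R = cholesky(Q)                                 # Q = R* R
    s2, U = eigh(R @ P @ R.T)                       # R P R* = U Sigma^2 U*
    s2, U = s2[::-1], U[:, ::-1]                    # descending order
    sigma = np.sqrt(s2)
    T = np.linalg.solve(R, U @ np.diag(sigma ** 0.5))
    Tinv = np.diag(sigma ** -0.5) @ U.T @ R
    return Tinv @ A @ T, Tinv @ B, C @ T, sigma

rng = np.random.default_rng(1)
n, k = 4, 2                                         # keep k = 2 states

# Random asymptotically stable A: shift a random matrix into the open LHP.
M = rng.standard_normal((n, n))
A = M - (np.max(np.linalg.eigvals(M).real) + 1.0) * np.eye(n)
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

Ab, Bb, Cb, sigma = balanced_realization(A, B, C)

# Truncated model G1 = (A11, B1, C1); discarded subsystem G2 uses A22.
A11, B1, C1 = Ab[:k, :k], Bb[:k, :], Cb[:, :k]
A22 = Ab[k:, k:]

# Theorem 6.1: both diagonal blocks should be Hurwitz.
print(sigma)
print(np.linalg.eigvals(A11).real.max(), np.linalg.eigvals(A22).real.max())
```

The truncation order `k` would in practice be chosen from the decay of `sigma`, e.g. keeping all states with $\sigma_i$ above a small fraction of $\sigma_1$.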
MIT OpenCourseWare
http://ocw.mit.edu

6.241J / 16.338J Dynamic Systems and Control
Spring

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.