ECEEN 5448 Fall 2011 Homework #5 Solutions
Professor David G. Meyer
December 8, 2011

1. Consider the 1-dimensional time-varying linear system

    ẋ = t(u + x)

(a) Find the state-transition matrix, Φ(t, τ).

Here A(t) = t. Since the A-matrix is a scalar, of course A(t) and ∫₀ᵗ A(θ) dθ commute. So in this case a fundamental matrix is

    Ψ(t) = e^{∫₀ᵗ A(θ) dθ} = e^{t²/2}

and the state transition matrix is thus

    Φ(t, τ) = Ψ(t)Ψ⁻¹(τ) = e^{t²/2} e^{−τ²/2} = e^{(t² − τ²)/2}

You can also get this result by solving the IVP

    dx/dt = tx;  x(τ) = xᵢ

by separation of variables:

    dx/x = t dt  ⟹  log(x) = t²/2 + C  ⟹  x = C e^{t²/2}

and x(τ) = xᵢ tells us xᵢ = C e^{τ²/2}, so C = xᵢ e^{−τ²/2} and

    x(t) = e^{−τ²/2} e^{t²/2} xᵢ

which leads to the same result for Φ(t, τ).

(b) Given x(0) = 1 and u(t) = 1 for t ≥ 0, find x(t) for t ≥ 0.

The solution is

    x(t) = Φ(t, 0)x(0) + ∫₀ᵗ Φ(t, τ) b(τ) u(τ) dτ

and since b(τ) = τ here, the solution when x(0) = 1 and u(t) ≡ 1 is

    x(t) = e^{t²/2} + ∫₀ᵗ e^{(t² − τ²)/2} τ dτ = e^{t²/2} + e^{t²/2}(1 − e^{−t²/2}) = 2e^{t²/2} − 1
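Since the whole problem reduces to a scalar ODE, the closed forms are easy to check numerically. The sketch below is not part of the original solutions; it assumes NumPy is available, and `rk4_scalar` is just an ad-hoc helper name. It integrates ẋ = tx and ẋ = t(x + 1) with fixed-step RK4 and compares against Φ(t, τ) = e^{(t² − τ²)/2} and x(t) = 2e^{t²/2} − 1.

```python
import numpy as np

def rk4_scalar(f, x0, t0, t1, n=20000):
    """Fixed-step classical RK4 for a scalar ODE x' = f(t, x)."""
    h = (t1 - t0) / n
    t, x = t0, x0
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h * k1 / 2)
        k3 = f(t + h / 2, x + h * k2 / 2)
        k4 = f(t + h, x + h * k3)
        x += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return x

# Homogeneous check: Phi(t, tau) = exp((t^2 - tau^2)/2), with tau = 0.5, t = 1.5.
tau, t = 0.5, 1.5
phi_num = rk4_scalar(lambda s, x: s * x, 1.0, tau, t)   # x(tau) = 1, u = 0
phi_closed = np.exp((t**2 - tau**2) / 2)

# Forced check, part (b): xdot = t*(x + 1), x(0) = 1, answer 2 e^{t^2/2} - 1.
x_num = rk4_scalar(lambda s, x: s * (x + 1.0), 1.0, 0.0, 1.0)
x_closed = 2 * np.exp(0.5) - 1

print(phi_num - phi_closed, x_num - x_closed)   # both differences ~0
```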
2. Consider the nonlinear system

    q̇ = αq − q³

where α is a real parameter.

(a) Find all the equilibrium points.

The equilibrium points are determined by the solutions to

    αq − q³ = q(α − q²) = 0

i.e. q = 0 or q² = α, and so the equilibrium points are

    q_e = 0 when α ≤ 0;   q_e ∈ { 0, ±√α } when α > 0

(b) Using linearization, classify, as much as possible, each equilibrium point found in (2a).

    d/dq (αq − q³) = α − 3q²

so we get

    when α < 0:  q_e = 0 is stable
    when α = 0:  no information available; q_e = 0 is a critical case
    when α > 0:  q_e = 0 is unstable
    when α > 0:  q_e = √α is stable
    when α > 0:  q_e = −√α is stable

3. We have the nonlinear system ẋ = F(x, u) and F(0, 0) = 0.

(a) If A = ∂F/∂x (0,0) and b = ∂F/∂u (0,0) and det(sI − A) = s(s − 1)(s + 2), then what can we conclude about the asymptotic stability of the equilibrium point x = 0, u = 0?

A has an eigenvalue at s = 1, which is in the RHP. This means that when u = 0, the equilibrium point at x = 0 is unstable.

(b) Suppose that there is a k such that det(sI − A + bk) = (s + 7)²(s + 1). What can you say about the nonlinear system ẋ = F(x, −kx)?

By the chain rule

    ∂/∂x [F(x, −kx)] |_{(0,0)} = ∂F/∂x (0,0) + ∂F/∂u (0,0) · ∂(−kx)/∂x = A − bk

And A − bk has all eigenvalues in the (open) LHP, so for the nonlinear system ẋ = F(x, −kx) the equilibrium point at x = 0 is asymptotically stable. This little calculation is the basis for designing feedback laws for nonlinear systems based on a linear approximation.
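A quick simulation confirms the phase-line picture for q̇ = αq − q³. This is a hypothetical check, not part of the original solutions; forward Euler with a small step is enough here, since the dynamics are one-dimensional and well damped near the stable equilibria.

```python
import numpy as np

def simulate(alpha, q0, dt=1e-3, steps=20000):
    """Forward-Euler simulation of qdot = alpha*q - q^3 out to t = steps*dt."""
    q = q0
    for _ in range(steps):
        q += dt * (alpha * q - q**3)
    return q

# alpha > 0: q_e = 0 is unstable, q_e = +/- sqrt(alpha) are stable.
alpha = 2.0
q_plus = simulate(alpha, 0.01)    # tiny perturbation above 0 -> settles at +sqrt(2)
q_minus = simulate(alpha, -3.0)   # large negative start -> settles at -sqrt(2)

# alpha < 0: q_e = 0 is stable.
q_zero = simulate(-1.0, 0.5)      # decays toward 0

print(q_plus, q_minus, q_zero)
```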
4. Let us say that a time-varying square matrix A(t) is constantly diagonable if, and only if, there is a constant invertible matrix, T, such that T⁻¹A(t)T = D(t) for all t and D(t) is diagonal for all t.

(a) If A(t) is constantly diagonable, prove that A(t) and ∫₀ᵗ A(θ) dθ commute.

Assume A(t) is constantly diagonable. First note that, since T does not depend on time,

    ∫₀ᵗ A(θ) dθ = T ( ∫₀ᵗ D(θ) dθ ) T⁻¹

then

    A(t) ∫₀ᵗ A(θ) dθ = T D(t) T⁻¹ · T ( ∫₀ᵗ D(θ) dθ ) T⁻¹ = T D(t) ( ∫₀ᵗ D(θ) dθ ) T⁻¹

but D(t) is diagonal, so D(t) ∫₀ᵗ D(θ) dθ = ( ∫₀ᵗ D(θ) dθ ) D(t), hence:

    T D(t) ( ∫₀ᵗ D(θ) dθ ) T⁻¹ = T ( ∫₀ᵗ D(θ) dθ ) D(t) T⁻¹ = ( ∫₀ᵗ A(θ) dθ ) A(t)

(b) Find the state-transition matrix for the parametrically chirped harmonic oscillator system:

    ẋ(t) = 1/(1+t) [  0  1 ] x(t) + 1/(1+t) [ 1 ] u(t)
                   [ −1  0 ]                [ 0 ]

Hint:

    e^{[ 0 g(t); −g(t) 0 ]} = [  cos(g(t))  sin(g(t)) ]
                              [ −sin(g(t))  cos(g(t)) ]

which is a rotation matrix. And

    [  cos(θ₁)  sin(θ₁) ] [  cos(θ₂)  sin(θ₂) ]⁻¹ = [  cos(θ₁−θ₂)  sin(θ₁−θ₂) ]
    [ −sin(θ₁)  cos(θ₁) ] [ −sin(θ₂)  cos(θ₂) ]     [ −sin(θ₁−θ₂)  cos(θ₁−θ₂) ]

One can easily check that the A(t) for the parametrically chirped harmonic oscillator is constantly diagonable. Hence the fundamental matrix is Ψ(t) = e^{∫₀ᵗ A(θ) dθ} and we just need to do some computations. First

    ∫₀ᵗ A(θ) dθ = [      0        log(1+t) ]
                  [ −log(1+t)        0     ]

From the hint, then

    Ψ(t) = [  cos(log(1+t))  sin(log(1+t)) ]
           [ −sin(log(1+t))  cos(log(1+t)) ]

Notice Ψ(0) = I as expected. Now we find Φ(t, τ) from

    Φ(t, τ) = Ψ(t)Ψ⁻¹(τ) = [  cos(log(1+t) − log(1+τ))  sin(log(1+t) − log(1+τ)) ]
                           [ −sin(log(1+t) − log(1+τ))  cos(log(1+t) − log(1+τ)) ]

Since log(a/b) = log(a) − log(b) you can simplify this somewhat:

    Φ(t, τ) = [  cos( log((1+t)/(1+τ)) )  sin( log((1+t)/(1+τ)) ) ]
              [ −sin( log((1+t)/(1+τ)) )  cos( log((1+t)/(1+τ)) ) ]
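One way to gain confidence in Ψ(t) is to integrate Ψ̇ = A(t)Ψ, Ψ(0) = I numerically and compare with the rotation by log(1+t). A sketch, not part of the original solutions (assumes NumPy; `rk4_mat` is an ad-hoc helper name):

```python
import numpy as np

# A(t) for the chirped oscillator.
A = lambda t: (1.0 / (1.0 + t)) * np.array([[0.0, 1.0], [-1.0, 0.0]])

def rk4_mat(t0, t1, n=20000):
    """Integrate Psi' = A(t) Psi with Psi(t0) = I, fixed-step RK4."""
    h = (t1 - t0) / n
    t, P = t0, np.eye(2)
    for _ in range(n):
        k1 = A(t) @ P
        k2 = A(t + h / 2) @ (P + h / 2 * k1)
        k3 = A(t + h / 2) @ (P + h / 2 * k2)
        k4 = A(t + h) @ (P + h * k3)
        P = P + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return P

t = 3.0
g = np.log(1.0 + t)
Psi_closed = np.array([[np.cos(g), np.sin(g)], [-np.sin(g), np.cos(g)]])
err = np.abs(rk4_mat(0.0, t) - Psi_closed).max()
print(err)   # ~0: the closed-form rotation matches the integrated fundamental matrix
```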
(c) Is the system asymptotically stable? Prove your answer.

NO. Since

    lim_{t→∞} Φ(t, 0) = lim_{t→∞} [  cos(log(1+t))  sin(log(1+t)) ]
                                  [ −sin(log(1+t))  cos(log(1+t)) ]

and we see this limit is NOT zero (in fact, the limit does not even exist!).

(d) If we add the output equation

    y(t) = [ 1  0 ] x(t)                                        (1)

show we have an observable system on any interval [0, T₂] with T₂ > 0.

On [0, T₂] the observability gramian is

    W_o(0, T₂) = ∫₀^{T₂} Φᵀ(t,0) cᵀ(t) c(t) Φ(t,0) dt = ∫₀^{T₂} Φᵀ(t,0) [ 1  0 ] Φ(t,0) dt
                                                                       [ 0  0 ]

In order for W_o(0, T₂) to be singular, we'd need to find a vector v ≠ 0 such that cΦ(t,0)v = 0 for all t between 0 and T₂. But Φ(t,0) is rotation through log(1+t) radians, and since log(1) = 0, near t = 0, Φ(t,0) rotates through as small an angle as we choose. Clearly one vector v cannot be orthogonal to both c and c rotated by a suitably small amount. Hence W_o(0, T₂) is non-singular.

Notice that the impulse response for the system is

    h(t, τ) = c(t)Φ(t, τ)b(τ) = 1/(1+τ) · cos( log((1+t)/(1+τ)) )

So a check for BIBO (Bounded-Input, Bounded-Output) stability is

    sup_t ∫₀ᵗ | cos( log((1+t)/(1+z)) ) | · 1/(1+z) dz < ∞

5. Deducing asymptotic stability through the output.

(a) Consider the observable time-invariant linear system

    ẋ = Ax,  x(0) = x₀,  y = cx

Prove the following: If we find y(t) → 0 as t → ∞ for all x₀, then in fact e^{At} → 0 as t → ∞.

An easy way to proceed is by contradiction. Suppose e^{At} doesn't go to zero as t → ∞. We know this means A has at least one eigenvalue with non-negative real part. Let λ₀ be an eigenvalue of A with non-negative real part and let v₀ be a corresponding eigenvector. Then if x₀ = v₀ the output is

    y(t) = c e^{At} v₀ = (c v₀) e^{λ₀ t}

Since {A, c} is observable, by PBH we know cv₀ ≠ 0. Since Re(λ₀) ≥ 0 we know e^{λ₀t} does not go to zero as t → ∞. Thus y(t) does not go to zero as t → ∞, which is a contradiction.
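The non-singularity claim in (d) can be spot-checked by assembling W_o(0, T₂) with a trapezoid rule from the closed-form Φ(t,0). This is a numerical illustration, not part of the original solution; note how the smallest eigenvalue shrinks as T₂ → 0 yet stays strictly positive, matching the small-rotation argument.

```python
import numpy as np

def W_o(T2, n=20000):
    """Trapezoid-rule W_o(0, T2) for c = [1 0] and Phi(t,0) = rotation by log(1+t)."""
    ts = np.linspace(0.0, T2, n + 1)
    g = np.log(1.0 + ts)
    rows = np.stack([np.cos(g), np.sin(g)], axis=1)   # c Phi(t,0) at each sample t
    outer = rows[:, :, None] * rows[:, None, :]       # (n+1, 2, 2) integrand
    dt = T2 / n
    return (outer[:-1] + outer[1:]).sum(axis=0) * dt / 2

mins = [np.linalg.eigvalsh(W_o(T2)).min() for T2 in (0.01, 0.1, 1.0, 10.0)]
print(mins)   # every entry strictly positive, so W_o(0, T2) > 0 on each interval
```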
(b) Would this theorem generalize to time-varying systems? Consider

    ẋ = A(t)x,  x(0) = x₀,  y = c(t)x

and suppose we find y(t) → 0 as t → ∞ for all x₀. Can we conclude Φ(t, τ) → 0 as t → ∞ for each τ? Either prove it or find a counter-example.

Since our proof in the time-invariant case relied on relating the asymptotic behavior of e^{At} to the eigenvalues of A, we are immediately skeptical that this theorem will hold true in the time-varying case, where the asymptotic behavior of Φ(t, 0) cannot be related to the eigenvalues of A(t).

Consider the 1-dimensional system with A(t) = 1 and c(t) = e^{−2t}. The state-transition matrix is clearly

    Φ(t, τ) = e^{t−τ}

and so here clearly lim_{t→∞} Φ(t, 0) ≠ 0. On the other hand, however,

    y(t) = e^{−2t} x(t) = e^{−2t} e^{t} x₀ = e^{−t} x₀ → 0 as t → ∞

We just need to show that this {A(t), c(t)} pair is observable and we have our counterexample. The [t_o, t_f] Observability Gramian is

    W_o(t_o, t_f) = ∫_{t_o}^{t_f} Φᵀ(t, t_o) cᵀ(t) c(t) Φ(t, t_o) dt
                  = ∫_{t_o}^{t_f} e^{t−t_o} e^{−4t} e^{t−t_o} dt
                  = e^{−2t_o} ∫_{t_o}^{t_f} e^{−2t} dt
                  = (e^{−2t_o}/2) ( e^{−2t_o} − e^{−2t_f} )

which is clearly PD if t_f > t_o, so the system is observable.
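The counterexample is one-dimensional, so the numbers are easy to tabulate (a hypothetical check, not part of the original solutions; assumes NumPy): Φ(t,0) = e^t blows up while y(t) = e^{−t}x₀ decays, and the Gramian formula agrees with direct numerical integration.

```python
import numpy as np

x0 = 1.0
for t in (1.0, 5.0, 10.0, 20.0):
    Phi = np.exp(t)                    # Phi(t, 0) = e^t: does NOT go to zero
    y = np.exp(-2.0 * t) * Phi * x0    # y(t) = e^{-2t} x(t) = e^{-t} x0: decays
    print(t, Phi, y)

# Closed-form Gramian on [t_o, t_f] versus trapezoid integration of the definition.
t_o, t_f = 0.0, 1.0
W = np.exp(-2 * t_o) * (np.exp(-2 * t_o) - np.exp(-2 * t_f)) / 2
ts = np.linspace(t_o, t_f, 100001)
integrand = np.exp(2 * (ts - t_o)) * np.exp(-4 * ts)
W_num = ((integrand[:-1] + integrand[1:]) / 2 * np.diff(ts)).sum()
print(W, W_num)   # positive, and the two values agree
```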
6. Given a T > 0 and a function h(·,·) : R × R → R satisfying

    sup_t ∫₀ᵀ h²(t, τ) dτ < ∞

consider the operator H defined by

    (Hu)(t) = ∫₀ᵀ h(t, τ) u(τ) dτ

If we think of H as mapping L²[0, T] to L²[0, T], compute H* (the adjoint of H).

H* will be a map from L²[0, T] back to L²[0, T]. The defining property is

    ⟨ y, Hu ⟩ = ⟨ H*y, u ⟩

The inner product is the standard L² inner product, so we require

    ∫₀ᵀ y(t) (Hu)(t) dt = ∫₀ᵀ y(t) ∫₀ᵀ h(t, τ) u(τ) dτ dt
                        = ∫₀ᵀ ∫₀ᵀ y(t) h(t, τ) u(τ) dτ dt
                        = ∫₀ᵀ { ∫₀ᵀ y(t) h(t, τ) dt } u(τ) dτ
                        = ∫₀ᵀ (H*y)(t) u(t) dt                      (2)

If we look at equation (2), realizing that t and τ are just dummy variables, we can rewrite it as

    ∫₀ᵀ ( ∫₀ᵀ y(t) h(t, z) dt ) u(z) dz = ∫₀ᵀ (H*y)(z) u(z) dz

and from this it is clear that

    (H*y)(t) = ∫₀ᵀ h(τ, t) y(τ) dτ

Notice that this is much like finite-dimensional linear maps. For a matrix, (A*)_{ij} = A_{ji}. If we think of h(t, τ) as a matrix with uncountably many rows and columns, the adjoint is found by transposition as well: h*(t, τ) = h(τ, t)!
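The transposition analogy suggests an immediate numerical check: discretize [0, T] on a midpoint grid so that H becomes a matrix, build H* from the flipped kernel h(τ, t), and verify ⟨y, Hu⟩ = ⟨H*y, u⟩. This is an illustration with an arbitrarily chosen kernel, not part of the original solutions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 400
ts = np.linspace(0.0, T, n, endpoint=False) + T / (2 * n)   # midpoint grid
dt = T / n

# An arbitrary (non-symmetric) kernel h(t, tau); H and H* act on grid samples.
h = lambda t, tau: np.exp(-(t - tau) ** 2) + np.sin(3 * t) * tau
Hmat = h(ts[:, None], ts[None, :]) * dt    # (H u)(t_i) ~ sum_j h(t_i, t_j) u_j dt
Hstar = h(ts[None, :], ts[:, None]) * dt   # flipped kernel h(tau, t): Hstar = Hmat.T

u = rng.standard_normal(n)
y = rng.standard_normal(n)

lhs = (y * (Hmat @ u)).sum() * dt    # <y, Hu>
rhs = ((Hstar @ y) * u).sum() * dt   # <H*y, u>
print(lhs - rhs)   # ~0, since the discretized adjoint is exactly the transpose
```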
7. A time-varying linear system is a so-called jump-parameter system and its A(t), b(t), and c(t) are given by

    b(t) = [ 1 ]    c(t) = [ 1  1 ]    A(t) = { A₁ = [ 0  1 ]  for t < 2
           [ 0 ]                               {      [ 0  1 ]
                                               { A₂ = [ 0  0 ]  for t ≥ 2
                                               {      [ 1  1 ]

(a) Find the state-transition matrix, Φ(t, τ).

If t, τ < 2 then the system looks time-invariant and Φ(t, τ) = e^{A₁(t−τ)}. Likewise, if t, τ ≥ 2 we have Φ(t, τ) = e^{A₂(t−τ)}. It is easy to calculate (e.g. from L⁻¹{(sI − A)⁻¹}) that

    e^{A₁t} = [ 1   e^t − 1 ]    and    e^{A₂t} = [    1      0  ]
              [ 0     e^t   ]                     [ e^t − 1  e^t ]

Now, when τ < 2 but t ≥ 2 we cross the jump boundary. The system state is propagated first by A₁ for 2 − τ and then by A₂ for t − 2, so

    Φ(t, τ) = e^{A₂(t−2)} e^{A₁(2−τ)}

Notice the ORDER here!! So, finally

    Φ(t, τ) = [ 1   e^{t−τ} − 1 ]
              [ 0     e^{t−τ}   ]          for t, τ < 2

            = e^{A₂(t−2)} e^{A₁(2−τ)}      for τ < 2 and t ≥ 2

            = e^{A₁(t−2)} e^{A₂(2−τ)}      for t < 2 and τ ≥ 2

            = [     1          0     ]
              [ e^{t−τ} − 1  e^{t−τ} ]     for t, τ ≥ 2

(b) Give an exhaustive enumeration of the regions in (t_o, t_f)-space where the Controllability Gramian, W_c(t_o, t_f), of the system is positive definite.

The controllability Gramian will be PD on those intervals [t_o, t_f] where we can hit any desired end state x(t_f) we want starting from any initial state x(t_o) we want. For t < 2 the time-invariant system {A₁, b} is uncontrollable. For t ≥ 2 the TI system {A₂, b} is controllable. Thus W_c(t_o, t_f) is

    not PD when t_f ≤ 2 (only the uncontrollable pair {A₁, b} has acted);
    PD whenever t_f > 2.

(c) Repeat the above for the Observability Gramian, W_o(t_o, t_f).

Similar reasoning (here {A₁, c} is observable and {A₂, c} is unobservable) shows that W_o(t_o, t_f) is

    PD when t_o < 2;
    not PD whenever t_o ≥ 2.
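Because closed forms for e^{A₁t} and e^{A₂t} are available, the jump formula (and in particular the ordering of the two factors) can be verified by directly integrating Φ̇ = A(t)Φ across the boundary. A sketch, not part of the original solutions (`rk4_phi` is an ad-hoc helper; the residual in the jump case is limited by the integration step that straddles t = 2):

```python
import numpy as np

A1 = np.array([[0.0, 1.0], [0.0, 1.0]])
A2 = np.array([[0.0, 0.0], [1.0, 1.0]])
A = lambda t: A1 if t < 2.0 else A2

# Closed forms for the two matrix exponentials from the solution.
eA1 = lambda t: np.array([[1.0, np.exp(t) - 1.0], [0.0, np.exp(t)]])
eA2 = lambda t: np.array([[1.0, 0.0], [np.exp(t) - 1.0, np.exp(t)]])

def rk4_phi(tau, t, n=40000):
    """Integrate Phi' = A(s) Phi with Phi(tau, tau) = I, fixed-step RK4."""
    h = (t - tau) / n
    s, P = tau, np.eye(2)
    for _ in range(n):
        k1 = A(s) @ P
        k2 = A(s + h / 2) @ (P + h / 2 * k1)
        k3 = A(s + h / 2) @ (P + h / 2 * k2)
        k4 = A(s + h) @ (P + h * k3)
        P = P + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        s += h
    return P

# Pure pre-jump region: Phi(t, tau) = e^{A1 (t - tau)}.
err1 = np.abs(rk4_phi(0.0, 1.5) - eA1(1.5)).max()

# Crossing the jump at t = 2: Phi(t, tau) = e^{A2 (t-2)} e^{A1 (2-tau)} -- note the order!
tau, t = 1.0, 3.0
err = np.abs(rk4_phi(tau, t) - eA2(t - 2.0) @ eA1(2.0 - tau)).max()
print(err1, err)   # both small
```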
8. Show that if P₁ > 0 and P₂ > 0 (both are symmetric) then the product P₁P₂, while neither symmetric nor PD, actually has real, positive eigenvalues.

Since P₂ > 0, a symmetric and positive definite square root, P₂^{1/2}, exists. Since P₂^{1/2} is PD, it is invertible, so P₂^{−1/2} exists and is also symmetric and PD. Since a similarity transform preserves eigenvalues, the eigenvalues of P₁P₂ are the same as those of

    P₂^{1/2} (P₁P₂) P₂^{−1/2} = P₂^{1/2} P₁ P₂^{1/2}

and this matrix is symmetric. So the eigenvalues of P₁P₂ are real. Further, since P₂^{1/2} is symmetric and P₁ is PD,

    xᵀ P₂^{1/2} P₁ P₂^{1/2} x = (P₂^{1/2}x)ᵀ P₁ (P₂^{1/2}x) > 0   if P₂^{1/2}x ≠ 0

but since P₂^{1/2} is invertible, P₂^{1/2}x = 0 only when x = 0, and hence P₂^{1/2} P₁ P₂^{1/2} is PD. So the eigenvalues of P₁P₂ are positive as well. That completes the proof.
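A randomized sanity check of the result (hypothetical, not part of the original solutions; assumes NumPy): generate pairs of symmetric positive-definite matrices and inspect the spectrum of their product.

```python
import numpy as np

rng = np.random.default_rng(7)

def random_spd(n):
    """Random symmetric positive-definite matrix: M M^T plus a small diagonal shift."""
    M = rng.standard_normal((n, n))
    return M @ M.T + 0.1 * np.eye(n)

worst_imag, worst_min = 0.0, np.inf
for _ in range(100):
    P1, P2 = random_spd(5), random_spd(5)
    lam = np.linalg.eigvals(P1 @ P2)          # P1 P2 is NOT symmetric in general
    worst_imag = max(worst_imag, np.abs(lam.imag).max())
    worst_min = min(worst_min, lam.real.min())

print(worst_imag, worst_min)   # imaginary parts negligible, all real parts positive
```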