LMI Methods in Optimal and Robust Control

LMI Methods in Optimal and Robust Control. Matthew M. Peet, Arizona State University. Lecture 15: Nonlinear Systems and Lyapunov Functions.

Overview. Our next goal is to extend LMIs and optimization to nonlinear systems analysis. Today we will discuss:
1. Nonlinear Systems Theory
  1.1 Existence and Uniqueness
  1.2 Contractions and Iterations
  1.3 Gronwall-Bellman Inequality
2. Stability Theory
  2.1 Lyapunov Stability
  2.2 Lyapunov's Direct Method
  2.3 A Collection of Converse Lyapunov Results
The purpose of this lecture is to show that Lyapunov stability can be solved exactly via optimization of polynomials.

Ordinary Nonlinear Differential Equations: Computing Stability and Domain of Attraction. Consider a system of nonlinear ordinary differential equations ẋ(t) = f(x(t)). Problem (Stability): Given a specific polynomial f : Rⁿ → Rⁿ, find the largest X ⊂ Rⁿ such that for any x(0) ∈ X, lim_{t→∞} x(t) = 0.

Nonlinear Dynamical Systems: Long-Range Weather Forecasting and the Lorenz Attractor. A model of atmospheric convection analyzed by E.N. Lorenz, Journal of Atmospheric Sciences, 1963:
ẋ = σ(y − x)
ẏ = rx − y − xz
ż = xy − bz

Stability and Periodic Orbits: The Poincaré-Bendixson Theorem and the van der Pol Oscillator. An oscillating circuit model:
ẋ = y
ẏ = −x − (x² − 1)y
Figure: The van der Pol oscillator in reverse (domain of attraction shown).
Theorem 1 (Poincaré-Bendixson). Nonempty compact invariant sets in R² always contain a limit cycle or a fixed point.

Stability of Ordinary Differential Equations. Consider ẋ(t) = f(x(t)) with x(0) ∈ Rⁿ.
Theorem 2 (Lyapunov Stability). Suppose there exists a continuous V and α, β, γ > 0 where
β‖x‖² ≤ V(x) ≤ α‖x‖²
∇V(x)ᵀf(x) ≤ −γ‖x‖²
for all x ∈ X. Then any sub-level set of V in X is a Domain of Attraction.
A Sublevel Set has the form V_δ = {x : V(x) ≤ δ}.

Do The Equations Have a Solution? The Cauchy Problem. The first question people ask is the Cauchy problem. For Autonomous (Uncontrolled) Systems:
Definition 3 (Cauchy Problem). The Cauchy problem is to find a unique, continuous x : [0, t_f] → Rⁿ for some t_f such that ẋ is defined and ẋ(t) = f(t, x(t)) for all t ∈ [0, t_f].
If f is continuous, the solution must be continuously differentiable.
Controlled Systems: For a controlled system, we have ẋ(t) = f(x(t), u(t)) and assume u(t) is given. This precludes feedback.
In this lecture, we focus on the autonomous system. Including t complicates the analysis; however, the results are almost all the same.

Ordinary Differential Equations: Existence of Solutions. There exist many systems for which no solution exists, or for which a solution only exists over a finite time interval (Finite Escape Time). Even for something as simple as
ẋ(t) = x(t)², x(0) = x₀,
the solution is
x(t) = x₀ / (1 − x₀ t), 0 ≤ t < 1/x₀,
which clearly has escape time t_e = 1/x₀.
Figure: Simulation of ẋ = x² for several x(0).
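
As a quick numerical check (not part of the original slides), here is a minimal Python sketch, assuming numpy and scipy are available, that integrates ẋ = x² and compares the result with the closed-form solution just below the escape time:

# Minimal sketch: reproduce the finite escape time of dx/dt = x^2, whose
# exact solution is x(t) = x0 / (1 - x0*t).
import numpy as np
from scipy.integrate import solve_ivp

x0 = 1.0
t_escape = 1.0 / x0                               # blow-up time of the exact solution

sol = solve_ivp(lambda t, x: x**2, (0.0, 0.99 * t_escape), [x0],
                rtol=1e-10, atol=1e-12, dense_output=True)

t = np.linspace(0.0, 0.99 * t_escape, 5)
print("numeric:", sol.sol(t).ravel())
print("exact  :", x0 / (1.0 - x0 * t))            # both blow up as t -> 1/x0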

Ordinary Differential Equations: Non-Uniqueness. A classical example of a system without a unique solution is
ẋ(t) = x(t)^(1/3), x(0) = 0.
For the given initial condition, it is easy to verify that
x(t) = 0 and x(t) = (2t/3)^(3/2)
both satisfy the differential equation.
Figures: Matlab simulations of ẋ(t) = x(t)^(1/3) with x(0) = 0 and with x(0) = .000001.

Ordinary Differential Equations: Non-Uniqueness. An example of a system with several solutions is given by
ẋ(t) = √(x(t)), x(0) = 0.
For the given initial condition, it is easy to verify that, for any C ≥ 0,
x(t) = (t − C)²/4 for t > C, and x(t) = 0 for t ≤ C
satisfies the differential equation.
Figure: Several solutions of ẋ = √x.
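
A minimal sketch (not from the slides, and assuming any C ≥ 0 is admissible as claimed above) that substitutes the whole family of candidate solutions into ẋ = √x and confirms the residual vanishes:

# Check that x_C(t) = (t - C)^2/4 for t > C, 0 otherwise, solves dx/dt = sqrt(x).
import numpy as np

def x_C(t, C):
    return np.where(t > C, (t - C)**2 / 4.0, 0.0)

def dx_C(t, C):                                   # exact derivative of the candidate
    return np.where(t > C, (t - C) / 2.0, 0.0)

t = np.linspace(0.0, 5.0, 1001)
for C in (0.0, 1.0, 2.5):
    residual = dx_C(t, C) - np.sqrt(x_C(t, C))
    print(f"C = {C}: max |dx/dt - sqrt(x)| = {np.abs(residual).max():.2e}")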

Continuity of a Function: Customary Notions of Continuity. Nonlinear stability requires some additional mathematical definitions.
Definition 4 (Continuity at a Point). For normed spaces X, Y, a function f : X → Y is continuous at the point x₀ ∈ X if for any ε > 0, there exists a δ > 0 such that ‖x − x₀‖ < δ implies ‖f(x) − f(x₀)‖ < ε.

Ordinary Differential Equations: Customary Notions of Continuity.
Definition 5 (Continuity on a Set of Points B). For normed spaces X, Y, a function f : A ⊂ X → Y is continuous on B if it is continuous at every point x₀ ∈ B. A function is simply continuous if B = A.
Dropping some of the notation,
Definition 6 (Uniform Continuity on a Set of Points B). f : A ⊂ X → Y is uniformly continuous on B if for any ε > 0, there exists a δ > 0 such that for x, y ∈ B, ‖x − y‖ < δ implies ‖f(x) − f(y)‖ < ε. A function is simply uniformly continuous if B = A.
Example: f(x) = x³ is uniformly continuous on B = [0, 1], but not on B = R. Since |f′(x)| = 3x² ≤ 3 for x ∈ [0, 1], we have |f(x) − f(y)| ≤ 3|x − y|. So given ε > 0, choose δ = ε/3.

Lipschitz Continuity: A Quantitative Notion of Continuity.
Definition 7 (Lipschitz Continuity). The function f is Lipschitz continuous on X if there exists some L > 0 such that ‖f(x) − f(y)‖ ≤ L‖x − y‖ for any x, y ∈ X. The constant L is referred to as the Lipschitz constant for f on X.
Definition 8 (Local Lipschitz Continuity). The function f is locally Lipschitz continuous on X if for every x ∈ X, there exists a neighborhood B of x such that f is Lipschitz continuous on B.
Definition 9. The function f is globally Lipschitz if it is Lipschitz on its entire domain.
Example: f(x) = x³ is Lipschitz on [−1, 1] with L = 3, but f(x) = x³ is NOT globally Lipschitz on R. L is typically just a bound on the derivative.

A Theorem on Existence of Solutions: Existence and Uniqueness. Let B(x₀, r) be the ball of radius r centered at x₀.
Theorem 10 (A Typical Existence Theorem). Suppose x₀ ∈ Rⁿ, f : Rⁿ → Rⁿ and there exist L, r such that for any x, y ∈ B(x₀, r),
‖f(x) − f(y)‖ ≤ L‖x − y‖ and ‖f(x)‖ ≤ c for x ∈ B(x₀, r).
Let t_f < min{1/L, r/c}. Then there exists a unique differentiable x : [0, t_f] → Rⁿ such that x(0) = x₀, x(t) ∈ B(x₀, r) and ẋ(t) = f(x(t)).

Examples on Existence of Solutions.
Theorem 11 (A Typical Existence Theorem). Suppose x₀ ∈ Rⁿ, f : Rⁿ → Rⁿ and there exist L, r such that for any x, y ∈ B(x₀, r), ‖f(x) − f(y)‖ ≤ L‖x − y‖ and ‖f(x)‖ ≤ c for x ∈ B(x₀, r). Let t_f < min{1/L, r/c}. Then there exists a unique differentiable solution on the interval [0, t_f].
Recall: ẋ(t) = x(t)², x(0) = x₀ has the solution x(t) = x₀ / (1 − x₀ t).
Let us take r = 1 and x₀ = 1, so B(x₀, r) = [0, 2]. Then L = sup_{x ∈ [0,2]} |f′(x)| = 4 and c = sup_{x ∈ [0,2]} |f(x)| = 4. Hence we have a solution for t_f < min{1/L, r/c} = min{1/4, 1/4} = 1/4, with x(t) < 2 for t ∈ [0, t_f]. We can verify that the solution x(t) = 1/(1 − t) < 4/3 for t < t_f.
Figure: Finite escape time of dx/dt = x².
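
The same bookkeeping can be done numerically. A minimal sketch (an addition, assuming numpy) that recomputes L, c and the guaranteed interval for f(x) = x² with x₀ = 1, r = 1:

# Recompute the existence-theorem bound for f(x) = x^2 on B(1, 1) = [0, 2].
import numpy as np

xs = np.linspace(0.0, 2.0, 100001)                # the ball B(x0, r) = [0, 2]
L = np.max(2.0 * xs)                              # Lipschitz bound: sup |f'(x)| = 4
c = np.max(xs**2)                                 # bound on f:      sup |f(x)|  = 4
t_bound = min(1.0 / L, 1.0 / c)                   # t_f must satisfy t_f < 1/4
print(f"L = {L}, c = {c}, guaranteed interval length < {t_bound}")
print("exact solution stays below 4/3 on [0, t_f):",
      1.0 / (1.0 - 0.999 * t_bound) < 4.0 / 3.0)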

Examples on Existence of Solutions: Non-Uniqueness.
Theorem 12 (A Typical Existence Theorem). Suppose x₀ ∈ Rⁿ, f : Rⁿ → Rⁿ and there exist L, r such that for any x, y ∈ B(x₀, r), ‖f(x) − f(y)‖ ≤ L‖x − y‖ and ‖f(x)‖ ≤ c for x ∈ B(x₀, r). Let t_f < min{1/L, r/c}. Then there exists a unique differentiable solution on the interval [0, t_f].
Recall the system without a unique solution: ẋ(t) = x(t)^(1/3), x(0) = 0.
The problem here is that f′(x) = (1/3) x^(−2/3) = 1/(3x^(2/3)), so
L = sup_{x ∈ [0,2]} |f′(x)| = sup_{x ∈ [0,2]} 1/(3x^(2/3)) = ∞,
since 1/0 = ∞. So there is no Lipschitz bound.

Typical Existence of Solutions: Proof Approach. Step 1: Contraction Mapping Theorem.
Theorem 13 (Contraction Mapping Principle). Let (X, ‖·‖) be a complete normed space and let P : X → X. Suppose there exists a ρ < 1 such that ‖Px − Py‖ ≤ ρ‖x − y‖ for all x, y ∈ X. Then there is a unique x* ∈ X such that Px* = x*. Furthermore, for y ∈ X, define the sequence {x_i} as x_1 = y and x_i = P x_{i−1} for i ≥ 2. Then lim_{i→∞} x_i = x*.
Some Observations:
Proof of CM: Show that P^k y is a Cauchy sequence for any y ∈ X.
For existence of solutions, X is the space of solutions. The contraction is
(Px)(t) = x₀ + ∫₀ᵗ f(x(s)) ds,
and ‖x‖ := sup_{t ∈ [0, t_f]} ‖x(t)‖.

Ordinary Differential Equations: Contraction Mapping Theorem. Key Point: Px = x if and only if x is a solution!
(Px)(t) = x₀ + ∫₀ᵗ f(x(s)) ds
Theorem 14 (Fundamental Theorem of Calculus). Suppose x and f : Rⁿ → Rⁿ are continuous. Then the following are equivalent:
1. x is differentiable at every t ∈ [0, t_f], ẋ(t) = f(x(t)) at all t ∈ [0, t_f], and x(0) = x₀.
2. x(t) = x₀ + ∫₀ᵗ f(x(s)) ds for all t ∈ [0, t_f].

Ordinary Differential Equations: Existence and Uniqueness. First recall what we are trying to prove:
Theorem 15 (A Typical Existence Theorem). Suppose x₀ ∈ Rⁿ, f : Rⁿ → Rⁿ and there exist L, r such that for any x, y ∈ B(x₀, r), ‖f(x) − f(y)‖ ≤ L‖x − y‖ and ‖f(x)‖ ≤ c for x ∈ B(x₀, r). Let t_f < min{1/L, r/c}. Then there exists a unique differentiable solution on the interval [0, t_f].
We will show that (Px)(t) = x₀ + ∫₀ᵗ f(x(s)) ds is a contraction if t_f is sufficiently small. Then Px = x has a solution!

Proof of Existence Theorem.
Proof. For given x₀, define the complete normed space B = {x ∈ C[0, t_f] : x(0) = x₀, x(t) ∈ B(x₀, r)} with norm ‖x‖ = sup_{t ∈ [0, t_f]} ‖x(t)‖ (proof of completeness omitted). Recall we define P as
(Px)(t) = x₀ + ∫₀ᵗ f(x(s)) ds.
Step 1: We first show that if x ∈ B, then Px ∈ B. Suppose x ∈ B. To see that Px is continuous in t, we note that
‖Px(t₂) − Px(t₁)‖ = ‖∫_{t₁}^{t₂} f(x(s)) ds‖ ≤ ∫_{t₁}^{t₂} ‖f(x(s))‖ ds ≤ c(t₂ − t₁),
which gives uniform continuity. Clearly Px(0) = x₀. Finally, we need to show (Px)(t) ∈ B(x₀, r). Since ‖f(x)‖ ≤ c when x ∈ B(x₀, r), we have
sup_{t ∈ [0, t_f]} ‖Px(t) − x₀‖ = sup_{t ∈ [0, t_f]} ‖∫₀ᵗ f(x(s)) ds‖ ≤ ∫₀^{t_f} ‖f(x(s))‖ ds ≤ t_f c < r,
where the final inequality is from the constraint t_f < min{1/L, r/c}.

Proof of Existence Theorem, continued.
Proof. We have shown that P : B → B. Step 2: Now show that P is a contraction. For any two x, y ∈ B, we have
‖Px − Py‖ = sup_{t ∈ [0, t_f]} ‖∫₀ᵗ (f(x(s)) − f(y(s))) ds‖ ≤ sup_{t ∈ [0, t_f]} ∫₀ᵗ ‖f(x(s)) − f(y(s))‖ ds = ∫₀^{t_f} ‖f(x(s)) − f(y(s))‖ ds ≤ L ∫₀^{t_f} ‖x(s) − y(s)‖ ds ≤ L t_f ‖x − y‖ < ‖x − y‖,
where the last inequality comes from the constraint t_f < min{1/L, r/c}. Thus P is a contraction, which means there exists a solution x ∈ B such that ẋ(t) = f(x(t)).

Picard Iteration: Constructing the Solution. This proof provides a way of actually constructing the solution.
Picard-Lindelöf Iteration: From the proof, the unique fixed point x* of Px = x is a solution of ẋ* = f(x*), where
(Px)(t) = x₀ + ∫₀ᵗ f(x(s)) ds.
From the contraction mapping theorem, the solution of Px = x can be found as
x* = lim_{k→∞} P^k z for ANY z ∈ B.
Note that this existence theorem only guarantees existence on the interval [t₀, 1/L] or [t₀, r/c], where L is a Lipschitz constant for f in the ball of radius r centered at x₀ and c is a bound for f in the ball of radius r centered at x₀.
Q: What does a Lyapunov function prove if there is no guaranteed solution?
A: Nothing. (Particularly problematic for PDEs.)

Illustration of Picard Iteration. This is a plot of Picard iterations for the solution map of ẋ = −x³:
z(t, x) = 0;
(Pz)(t, x) = x;
(P²z)(t, x) = x − tx³;
(P³z)(t, x) = x − tx³ + (3/2)t²x⁵ − t³x⁷ + (1/4)t⁴x⁹.
Figure: The solution for x₀ = 1.
Convergence is only guaranteed on the interval t ∈ [0, .333].
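
The iterates above can be reproduced symbolically. A minimal sketch (an addition, assuming sympy is available) that applies the Picard map (Pz)(t) = x + ∫₀ᵗ f(z(s)) ds to the zero initial guess for f(z) = −z³:

# Reproduce the Picard iterates for dx/dt = -x^3 symbolically.
import sympy as sp

t, s, x = sp.symbols('t s x', real=True)
f = lambda z: -z**3

z = sp.Integer(0)                                 # initial guess z(t, x) = 0
for k in range(4):
    print(f"P^{k} z =", sp.expand(z))
    z = x + sp.integrate(f(z.subs(t, s)), (s, 0, t))   # (Pz)(t) = x + int_0^t f(z(s)) ds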

Extension of Existence Theorem. For Lyapunov functions to work, we need to extend the solution indefinitely!
Step 1: Prove that solutions exist for a fixed r (yielding L, c, and t_f).
Step 2: Use a Lyapunov function to show that there exists a δ < r such that any solution which begins in B(0, δ) will stay in B(0, r).
Step 3: Since ‖x(0)‖ < δ < r, a solution exists on [0, t_f]. Further, ‖x(t_f)‖ ≤ r.
Step 4: Since ‖x(t_f)‖ < r, a solution exists on [0, 2t_f]. Further, Step 2 proves ‖x(2t_f)‖ ≤ r.
Repeat Step 4 indefinitely.
For a given set W, define W_r := {x : ‖x − y‖ ≤ r, y ∈ W}.
Theorem 16 (Formal Solution Extension Theorem). Suppose that there exists a domain D and L > 0 such that ‖f(x) − f(y)‖ ≤ L‖x − y‖ for all x, y ∈ D ⊂ Rⁿ and t > 0. Suppose there exists a compact set W and r > 0 such that W_r ⊂ D. Furthermore, suppose that it has been proven that for x₀ ∈ W, any solution to ẋ(t) = f(x(t)), x(0) = x₀, must lie entirely in W. Then, for x₀ ∈ W, there exists a unique solution x with x(0) = x₀ such that x lies entirely in W.

Illustration of Extended Picard Iteration. We take the previous approximation to the solution map and extend it.
Figure: The solution map φ and the functions G_i^k for k = 1, 2, 3, 4, 5 and i = 1, 2, 3 for the system ẋ(t) = −x(t)³. The interval of convergence of the Picard iteration is T = 1/3.

Stability Definitions. Whenever you are trying to prove stability, please define your notion of stability!
Denote the set of bounded continuous functions by C := {x : x continuous, ‖x(t)‖ ≤ r for all t, for some r ≥ 0}, with norm ‖x‖ = sup_t ‖x(t)‖. We define g : D → C to be the solution map:
(∂/∂t) g(x₀, t) = f(g(x₀, t)) and g(x₀, 0) = x₀ for all x₀ ∈ D.
Definition 17. The system is locally Lyapunov stable on D, where D contains an open neighborhood of the origin, if it defines a unique map g : D → C which is continuous at the origin (x₀ = 0).
Equivalently, the system is locally Lyapunov stable on D if for any ε > 0, there exists a δ(ε) such that for ‖x(0)‖ ≤ δ(ε), x(0) ∈ D, we have ‖x(t)‖ ≤ ε for all t ≥ 0.

Stability Definitions.
Definition 18. The system is globally Lyapunov stable if it defines a unique map g : Rⁿ → C which is continuous at the origin.
We define the subspace of bounded continuous functions which tend to the origin by G := {x ∈ C : lim_{t→∞} x(t) = 0}, with norm ‖x‖ = sup_t ‖x(t)‖.
Definition 19. The system is locally asymptotically stable on D, where D contains an open neighborhood of the origin, if it defines a map g : D → G which is continuous at the origin.

Stability Definitions.
Definition 20. The system is globally asymptotically stable if it defines a map g : Rⁿ → G which is continuous at the origin.
Definition 21. The system is locally exponentially stable on D if it defines a map g : D → G where ‖g(x, t)‖ ≤ K e^{−γt} ‖x‖ for some positive constants K, γ > 0 and any x ∈ D.
Definition 22. The system is globally exponentially stable if it defines a map g : Rⁿ → G where ‖g(x, t)‖ ≤ K e^{−γt} ‖x‖ for some positive constants K, γ > 0 and any x ∈ Rⁿ.

Lyapunov Theorem. Consider the system ẋ = f(x), f(0) = 0.
Theorem 23. Let V : D → R be a continuously differentiable function on a compact set D such that
V(0) = 0,
V(x) > 0 for x ∈ D, x ≠ 0,
∇V(x)ᵀf(x) ≤ 0 for x ∈ D.
Then ẋ = f(x) is well-posed and locally Lyapunov stable on the largest sublevel set V_γ = {x : V(x) ≤ γ} of V contained in D. Furthermore, if ∇V(x)ᵀf(x) < 0 for x ∈ D, x ≠ 0, then ẋ = f(x) is locally asymptotically stable on the largest sublevel set V_γ = {x : V(x) ≤ γ} contained in D.

Lyapunov Theorem: Sublevel Set. For a given Lyapunov function V and positive constant γ, we denote the set V_γ = {x : V(x) ≤ γ}.
Proof.
Existence: Denote the largest bounded sublevel set of V contained in the interior of D by V_γ. Because (d/dt)V(x(t)) = ∇V(x(t))ᵀf(x(t)) ≤ 0, if x(0) ∈ V_γ, then x(t) ∈ V_γ for all t ≥ 0. Therefore, since f is locally Lipschitz continuous on the compact set V_γ, by the extension theorem there is a unique solution for any initial condition x(0) ∈ V_γ.
Lyapunov Stability: Given any ε > 0, choose ε′ < ε with B(ε′) ⊂ V_γ, and choose γᵢ such that V_{γᵢ} ⊂ B(ε′). Now choose δ > 0 such that B(δ) ⊂ V_{γᵢ}. Then B(δ) ⊂ V_{γᵢ} ⊂ B(ε′), and hence if x(0) ∈ B(δ), then x(0) ∈ V_{γᵢ} and so x(t) ∈ V_{γᵢ} ⊂ B(ε′) ⊂ B(ε) for all t ≥ 0.
Asymptotic Stability: V monotone decreasing along trajectories implies lim_{t→∞} V(x(t)) = 0, and V(x) = 0 implies x = 0. Proof omitted.

Lyapunov Theorem: Exponential Stability.
Theorem 24. Suppose there exists a continuously differentiable function V, constants c₁, c₂, c₃ > 0, and a radius r > 0 such that the following holds for all x ∈ B(r):
c₁‖x‖ᵖ ≤ V(x) ≤ c₂‖x‖ᵖ,
∇V(x)ᵀf(x) ≤ −c₃‖x‖ᵖ.
Then ẋ = f(x) is exponentially stable on any ball contained in the largest sublevel set contained in B(r).
Exponential stability allows a quantitative prediction of system behavior.

The Gronwall-Bellman Inequality.
Lemma 25 (Gronwall-Bellman). Let λ be continuous and µ be continuous and nonnegative. Let y be continuous and satisfy, for a ≤ t ≤ b,
y(t) ≤ λ(t) + ∫ₐᵗ µ(s) y(s) ds.
Then
y(t) ≤ λ(t) + ∫ₐᵗ λ(s) µ(s) exp[∫ₛᵗ µ(τ) dτ] ds.
If λ and µ are constants (with a = 0), then y(t) ≤ λ e^{µt}.
For λ(t) = y₀, the condition corresponds to ẏ(t) ≤ µ(t) y(t), y(0) = y₀.
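
As a small numerical illustration (an addition, assuming numpy and scipy), the constant-coefficient bound y(t) ≤ λe^{µt} can be checked on an example y whose derivative is dominated by µy:

# Check y(t) <= lam * exp(mu*t) for y' = (mu - 0.5*sin(t)^2) * y <= mu * y, y(0) = lam.
import numpy as np
from scipy.integrate import solve_ivp

mu, lam = 1.0, 2.0
sol = solve_ivp(lambda t, y: (mu - 0.5 * np.sin(t)**2) * y, (0, 5), [lam],
                dense_output=True, rtol=1e-9)
t = np.linspace(0, 5, 200)
print("Gronwall bound holds:", np.all(sol.sol(t)[0] <= lam * np.exp(mu * t) + 1e-9))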

Lyapunov Theorem: Exponential Stability.
Proof (with p = 2). We begin by noting that we already satisfy the conditions for existence, uniqueness and asymptotic stability, and that x(t) ∈ B(r). Now observe that
(d/dt)V(x(t)) ≤ −c₃‖x(t)‖² ≤ −(c₃/c₂) V(x(t)),
which implies by the Gronwall-Bellman inequality (µ = −c₃/c₂, λ = V(x(0))) that
V(x(t)) ≤ V(x(0)) e^{−(c₃/c₂) t}.
Hence
‖x(t)‖² ≤ (1/c₁) V(x(t)) ≤ (1/c₁) e^{−(c₃/c₂) t} V(x(0)) ≤ (c₂/c₁) e^{−(c₃/c₂) t} ‖x(0)‖².
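
A minimal sketch of this bound for the linear special case (an addition, not from the slides): for ẋ = Ax with V(x) = xᵀPx obtained from the Lyapunov equation AᵀP + PA = −Q, the constants are c₁ = λmin(P), c₂ = λmax(P), c₃ = λmin(Q), and the decay estimate can be verified along a simulated trajectory (the matrices A and Q below are illustrative choices):

# Verify ||x(t)||^2 <= (c2/c1) exp(-c3 t / c2) ||x(0)||^2 for a linear system.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-2.0, -1.0]])
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)            # solves A'P + PA = -Q

c1, c2 = np.linalg.eigvalsh(P)[[0, -1]]           # lambda_min(P), lambda_max(P)
c3 = np.linalg.eigvalsh(Q)[0]                     # lambda_min(Q)

x0 = np.array([1.0, -0.5])
sol = solve_ivp(lambda t, x: A @ x, (0, 10), x0, dense_output=True, rtol=1e-9)

t = np.linspace(0, 10, 50)
lhs = np.sum(sol.sol(t)**2, axis=0)                               # ||x(t)||^2
rhs = (c2 / c1) * np.exp(-c3 * t / c2) * np.dot(x0, x0)           # Lyapunov bound
print("bound respected everywhere:", np.all(lhs <= rhs + 1e-9))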

Lyapunov Theorem: Invariance. Sometimes, we want to prove convergence to a set. Recall V_γ = {x : V(x) ≤ γ}.
Definition 26. A set X is Positively Invariant if x(0) ∈ X implies x(t) ∈ X for all t ≥ 0.
Theorem 27. Suppose that there exists some continuously differentiable function V such that
V(x) > 0 for x ∈ D, x ≠ 0,
∇V(x)ᵀf(x) ≤ 0 for x ∈ D.
Then for any γ such that the level set X = {x : V(x) = γ} ⊂ D, we have that V_γ is positively invariant. Furthermore, if ∇V(x)ᵀf(x) < 0 for x ∈ D with V(x) > γ, then for any δ such that X ⊂ V_δ ⊂ D, any trajectory starting in V_δ will approach the sublevel set V_γ.

Problem Statement 1: Global Lyapunov Stability.
Given: vector field f(x).
Find: a function V and scalars αᵢ, βᵢ with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)ⁱ for all x,
V(x) ≤ Σᵢ βᵢ (xᵀx)ⁱ for all x,
∇V(x)ᵀf(x) ≤ 0 for all x.
Conclusion: Lyapunov stability for any x(0) ∈ Rⁿ.
One can replace V(x) ≤ Σᵢ βᵢ (xᵀx)ⁱ with V(0) = 0 if V is well-behaved.

Examples of Lyapunov Functions.
Mass-Spring: ẍ = −(c/m)ẋ − (k/m)x, with
V(x) = ½ m ẋ² + ½ k x²,
V̇(x) = ẋ(−cẋ − kx) + kxẋ = −cẋ² − kẋx + kxẋ = −cẋ² ≤ 0.
Pendulum: ẋ₁ = x₂, ẋ₂ = −(g/l) sin x₁, with
V(x) = (1 − cos x₁) g l + ½ l² x₂²,
V̇(x) = g l x₂ sin x₁ − g l x₂ sin x₁ = 0.
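
Both derivative computations can be verified symbolically. A minimal sketch (an addition, assuming sympy; x1, x2 denote position/angle and velocity):

# Verify the Lyapunov derivative computations for the mass-spring and pendulum.
import sympy as sp

m, c, k, g, l = sp.symbols('m c k g l', positive=True)
x1, x2 = sp.symbols('x1 x2', real=True)

# Mass-spring: x1 = position, x2 = velocity
V_ms = sp.Rational(1, 2) * m * x2**2 + sp.Rational(1, 2) * k * x1**2
f_ms = sp.Matrix([x2, -(c / m) * x2 - (k / m) * x1])
print(sp.simplify((sp.Matrix([V_ms]).jacobian([x1, x2]) @ f_ms)[0]))   # -> -c*x2**2

# Pendulum: x1 = angle, x2 = angular velocity
V_p = (1 - sp.cos(x1)) * g * l + sp.Rational(1, 2) * l**2 * x2**2
f_p = sp.Matrix([x2, -(g / l) * sp.sin(x1)])
print(sp.simplify((sp.Matrix([V_p]).jacobian([x1, x2]) @ f_p)[0]))     # -> 0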

Problem Statement 2: Global Exponential Stability.
Given: vector field f(x) and exponent p.
Find: a function V and scalars αᵢ, βᵢ, δ with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)^{pi} for all x,
V(x) ≤ Σᵢ βᵢ (xᵀx)^{pi} for all x,
∇V(x)ᵀf(x) ≤ −δ V(x) for all x.
Conclusion: exponential stability for any x(0) ∈ Rⁿ.
Convergence rate: ‖x(t)‖ ≤ (β_max/α_min)^{1/(2p)} ‖x(0)‖ e^{−(δ/(2p)) t}.

Problem Statement 2: Global Exponential Stability, Example. Consider the attitude dynamics of a rotating spacecraft:
J₁ ω̇₁ = (J₂ − J₃) ω₂ ω₃
J₂ ω̇₂ = (J₃ − J₁) ω₃ ω₁
J₃ ω̇₃ = (J₁ − J₂) ω₁ ω₂
What about V(x) = ω₁² + ω₂² + ω₃²? Then
∇V(x)ᵀf(x) = 2ω₁ω̇₁ + 2ω₂ω̇₂ + 2ω₃ω̇₃
= 2 ((J₂ − J₃)/J₁ + (J₃ − J₁)/J₂ + (J₁ − J₂)/J₃) ω₁ ω₂ ω₃
= 2 (J₂²J₃ − J₃²J₂ + J₃²J₁ − J₁²J₃ + J₁²J₂ − J₂²J₁)/(J₁ J₂ J₃) ω₁ ω₂ ω₃.
OK, maybe not. Try uᵢ = −kᵢ ωᵢ.
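
The derivative above can be computed symbolically to confirm that it is sign-indefinite (an addition, assuming sympy; w1, w2, w3 stand for ω₁, ω₂, ω₃):

# Compute grad(V)'f for the rotation dynamics and factor the result.
import sympy as sp

J1, J2, J3 = sp.symbols('J1 J2 J3', positive=True)
w1, w2, w3 = sp.symbols('w1 w2 w3', real=True)

f = sp.Matrix([(J2 - J3) / J1 * w2 * w3,
               (J3 - J1) / J2 * w3 * w1,
               (J1 - J2) / J3 * w1 * w2])
V = w1**2 + w2**2 + w3**2
Vdot = (sp.Matrix([V]).jacobian([w1, w2, w3]) @ f)[0]
print(sp.factor(Vdot))
# Something like 2*w1*w2*w3*(J1 - J2)*(J1 - J3)*(J2 - J3)/(J1*J2*J3): it is
# sign-indefinite unless two inertias coincide, so this V alone does not
# certify stability, which motivates the damping feedback u_i = -k_i*w_i.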

Problem Statement 3: Local Exponential Stability.
Given: vector field f(x), exponent p, and a ball of radius r.
Find: a function V and scalars αᵢ, βᵢ, δ with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)^{pi} for all x : xᵀx ≤ r²,
V(x) ≤ Σᵢ βᵢ (xᵀx)^{pi} for all x : xᵀx ≤ r²,
∇V(x)ᵀf(x) ≤ −δ V(x) for all x : xᵀx ≤ r².
Conclusion: a Domain of Attraction of the origin! Exponential stability for x(0) ∈ {x : V(x) ≤ γ} if V_γ ⊂ B_r.
Sub-Problem: Given V and r,
max_γ γ such that V(x) ≥ γ for all x ∈ {x : xᵀx ≥ r²}.
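
For a quadratic V the sub-problem has a closed form, which a quick numerical sketch can confirm (an addition; the matrix P below is a hypothetical example, not from the slides): the largest γ with {V ≤ γ} ⊂ B_r is min over the sphere ‖x‖ = r of V(x), which equals r² λmin(P) when V(x) = xᵀPx.

# Estimate max gamma with {x'Px <= gamma} contained in the ball of radius r.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[2.0, 0.5], [0.5, 1.0]])            # example quadratic Lyapunov function
r = 1.0

pts = rng.normal(size=(100000, 2))
pts = r * pts / np.linalg.norm(pts, axis=1, keepdims=True)   # samples on ||x|| = r
gamma_sampled = np.min(np.einsum('ij,jk,ik->i', pts, P, pts))
gamma_exact = r**2 * np.linalg.eigvalsh(P)[0]
print(gamma_sampled, "~", gamma_exact)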

Domain of Attraction: The van der Pol Oscillator. An oscillating circuit (in reverse time):
ẋ = −y
ẏ = x + (x² − 1)y
Choose V(x) = x² + y², r = 1.
Derivative:
∇V(x)ᵀf(x) = [2x, 2y] · [−y, x + (x² − 1)y]ᵀ = −2xy + 2xy + 2(x² − 1)y² ≤ 0 for x² ≤ 1.
Level Set: V_{γ=1} = {(x, y) : x² + y² ≤ 1} = B₁.
Figure: The van der Pol oscillator in reverse (domain of attraction shown).
So B₁ = V_{γ=1} is a Domain of Attraction!
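
A minimal numeric sketch (an addition, assuming numpy) that grid-checks the derivative condition on the unit disk for the reverse-time van der Pol field:

# Grid-check grad(V)'f <= 0 on the unit disk, with V = x^2 + y^2.
import numpy as np

x, y = np.meshgrid(np.linspace(-1, 1, 401), np.linspace(-1, 1, 401))
mask = x**2 + y**2 <= 1.0                         # the sublevel set V <= 1

fx, fy = -y, x + (x**2 - 1.0) * y                 # reverse-time van der Pol
Vdot = 2.0 * x * fx + 2.0 * y * fy
print("max of Vdot on the unit disk:", Vdot[mask].max())   # approx. 0, never positive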

Problem Statement 4: Invariant Regions/Manifolds.
Given: vector field f(x), exponent p, and a ball of radius r.
Find: a function V and scalars αᵢ, βᵢ, δ with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)^{pi} for all x : xᵀx ≥ r²,
V(x) ≤ Σᵢ βᵢ (xᵀx)^{pi} for all x : xᵀx ≥ r²,
∇V(x)ᵀf(x) ≤ −δ V(x) for all x : xᵀx ≥ r².
Conclusion: there exists a T such that x(t) ∈ {x : V(x) ≤ γ} for all t ≥ T if B_r ⊂ V_γ.
Sub-Problem: Given V and r,
min_γ γ such that xᵀx ≥ r² for all x ∈ {x : V(x) ≥ γ}.

Nonlinear Dynamical Systems: Long-Range Weather Forecasting and the Lorenz Attractor. A model of atmospheric convection analyzed by E.N. Lorenz, Journal of Atmospheric Sciences, 1963:
ẋ = σ(y − x)
ẏ = rx − y − xz
ż = xy − bz
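
As an illustration of the kind of set Problem 4 asks a V to certify, a minimal simulation sketch (an addition, assuming numpy and scipy, with the classical parameter choice σ = 10, r = 28, b = 8/3) shows that Lorenz trajectories remain in a bounded region:

# Simulate the Lorenz system and observe that the trajectory stays bounded.
import numpy as np
from scipy.integrate import solve_ivp

sigma, r, b = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, s):
    x, y, z = s
    return [sigma * (y - x), r * x - y - x * z, x * y - b * z]

sol = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], max_step=0.01)
print("max ||(x,y,z)|| over the run:", np.linalg.norm(sol.y, axis=0).max())
# bounded (roughly a few tens), consistent with convergence to a bounded attractor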

Problem Statement 5: Controller Synthesis (Local). Suppose ẋ(t) = f(x(t)) + g(x(t)) u(t), with u(t) = k(x(t)).
Given: vector fields f(x), g(x), exponent p, and a ball of radius r.
Find: functions k, V and scalars α ≥ 0, β ≥ 0 with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)^{pi} for all x : xᵀx ≤ r²,
V(x) ≤ Σᵢ βᵢ (xᵀx)^{pi} for all x : xᵀx ≤ r²,
∇V(x)ᵀf(x) + ∇V(x)ᵀg(x) k(x) ≤ 0 for all x : xᵀx ≤ r².
Conclusion: the controller u(t) = k(x(t)) stabilizes the system for x(0) ∈ {x : V(x) ≤ γ} if V_γ ⊂ B_r.

Problem Statement 6: Output Feedback Controller Synthesis (Global Exponential). Suppose ẋ(t) = f(x(t)) + g(x(t)) u(t), y(t) = h(x(t)), with u(t) = k(y(t)).
Given: vector fields f(x), g(x), h(x), and exponent p.
Find: functions k, V and scalars α ≥ 0, β ≥ 0, δ > 0 with Σᵢ αᵢ = .01, Σᵢ βᵢ = .01 such that
V(x) ≥ Σᵢ αᵢ (xᵀx)^{pi} for all x,
V(x) ≤ Σᵢ βᵢ (xᵀx)^{pi} for all x,
∇V(x)ᵀf(x) + ∇V(x)ᵀg(x) k(h(x)) ≤ −δ V(x) for all x.
Conclusion: the controller u(t) = k(y(t)) exponentially stabilizes the system for any x(0) ∈ Rⁿ.

How to Solve these Problems? General framework for solving these problems: convex optimization over functions. Variables V ∈ C[Rⁿ] and γ ∈ R:
max_{V, γ} γ
subject to V(x) − xᵀx ≥ 0 for all x ∈ X,
∇V(x)ᵀf(x) + γ xᵀx ≤ 0 for all x ∈ X.
Going Forward:
Assume all functions are polynomials or rationals.
Assume X := {x : g_i(x) ≥ 0} (semialgebraic).
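
Once everything is polynomial, the standard way to make such constraints computable is to replace nonnegativity by a sum-of-squares condition, which is an LMI on a Gram matrix. A minimal toy sketch (an addition, assuming the cvxpy package and a default SDP solver are installed; the polynomial is an illustrative example): certify that p(x) = x⁴ + 2x³ + 3x² + 2x + 1 is SOS by finding a PSD matrix Q with p(x) = z(x)ᵀQ z(x), z(x) = [1, x, x²].

# Sum-of-squares feasibility as a small semidefinite program.
import cvxpy as cp

Q = cp.Variable((3, 3), PSD=True)
constraints = [
    Q[0, 0] == 1,                    # coefficient of 1
    2 * Q[0, 1] == 2,                # coefficient of x
    2 * Q[0, 2] + Q[1, 1] == 3,      # coefficient of x^2
    2 * Q[1, 2] == 2,                # coefficient of x^3
    Q[2, 2] == 1,                    # coefficient of x^4
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)                   # 'optimal' means p is SOS; Q.value is a certificate

The same Gram-matrix idea, applied to V(x) − xᵀx and to −∇V(x)ᵀf(x) − γxᵀx with multipliers for the semialgebraic set X, is what turns the optimization problems above into LMIs in the following lectures.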