Lecture 5: Lyapunov Functions and Storage Functions


Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
6.243j (Fall 2003): DYNAMICS OF NONLINEAR SYSTEMS
by A. Megretski

Lecture 5: Lyapunov Functions and Storage Functions
(Version of September 19, 2003)

This lecture gives an introduction to system analysis using Lyapunov functions and their generalizations.

5.1 Recognizing Lyapunov functions

There are a number of slightly different ways of defining what constitutes a Lyapunov function for a given system. Depending on the strength of the assumptions, a variety of conclusions about a system's behavior can be drawn.

5.1.1 Abstract Lyapunov and storage functions

In general, Lyapunov functions are real-valued functions of the system's state which are monotonically non-increasing along every signal from the system's behavior set. More generally, storage functions are real-valued functions of the system's state for which explicit upper bounds on the increments are available.

Let B = {z} be a behavior set of a system (i.e. the elements of B are vector signals, which represent all possible outputs for autonomous systems, and all possible input/output pairs for systems with an input). Remember that by a state of a system we mean a function x : B × [0, ∞) → X such that two signals z_1, z_2 ∈ B define the same state of B at time t whenever x(z_1(·), t) = x(z_2(·), t) (see the Lecture 1 notes for details and examples). Here X is a set which can be called the state space of B. Note that, given the behavior set B, the state space X is not uniquely defined.
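
As a simple illustration of this non-uniqueness (an example added here for concreteness, not taken from the Lecture 1 notes), consider the integrator ẏ(t) = v(t) with behavior set B = {z = [v; y]} of all input/output pairs. The map x(z(·), t) = y(t) is a valid state with state space X = R, since the value y(t) together with the future input determines the future output. But x̃(z(·), t) = 2y(t), or more generally φ(y(t)) for any injective function φ, works equally well: these maps induce the same equivalence between signals at every time t, so neither the state map nor the state space X is determined uniquely by B.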

Definition. A real-valued function V : X → R, defined on the state space X of a system with behavior set B and state x : B × [0, ∞) → X, is called a Lyapunov function if

    t ↦ V(t) = V(x(t)) = V(x(z(·), t))

is a non-increasing function of time for every z ∈ B.

According to this definition, Lyapunov functions provide limited but very explicit information about system behavior. For example, if X = R^n and V(x(t)) = |x(t)|^2 is a Lyapunov function, then we know that the system state x(t) remains bounded for all times, though we may have no idea of what the exact value of x(t) is.

For conservative systems in physics, the total energy is always a Lyapunov function. Even for non-conservative systems, it is frequently important to look for energy-like expressions as Lyapunov function candidates.

One can say that Lyapunov functions have an explicit upper bound (zero) imposed on their increments along system trajectories:

    V(x(z(·), t_1)) − V(x(z(·), t_0)) ≤ 0   for all t_1 ≥ t_0 ≥ 0, z ∈ B.

A useful generalization of this is given by storage functions.

Definition. Let B be a set of n-dimensional vector signals z : [0, ∞) → R^n. Let σ : R^n → R be a given function such that σ(z(t)) is locally integrable for all z(·) ∈ B. A real-valued function V : X → R, defined on the state space X of a system with behavior set B and state x : B × [0, ∞) → X, is called a storage function with supply rate σ if

    V(x(z(·), t_1)) − V(x(z(·), t_0)) ≤ ∫_{t_0}^{t_1} σ(z(t)) dt   for all t_1 ≥ t_0 ≥ 0, z ∈ B.   (5.1)

In many applications σ is a function comparing the instantaneous values of the input and the output. For example, if B = {z(t) = [v(t); w(t)]} is the set of all possible input/output pairs of a given system, the existence of a non-negative storage function with supply rate σ(z(t)) = |v(t)|^2 − |w(t)|^2 proves that the power of the output, defined as

    lim sup_{T→∞} (1/T) ∫_{t_0}^{t_0+T} |w(τ)|^2 dτ,

never exceeds the power of the input (defined in the same way with v in place of w).

Example 5.1. Let the behavior set B = {(i(t), v(t))} describe the (dynamical) voltage-current relation of a passive single-port electronic circuit. Then the total energy E = E(t) accumulated in the circuit can serve as a storage function with supply rate σ(i(t), v(t)) = i(t)v(t).
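
As a quick numerical illustration of Example 5.1 (a sketch added here, not part of the original notes; the series RC one-port, its component values, and the driving voltage are arbitrary choices), the following Python fragment integrates the circuit equation and checks the dissipation inequality (5.1) with the stored energy E = C v_C^2 / 2 as the storage function and σ = i·v as the supply rate:

    # Series RC one-port:  v = R*i + v_C,  C*dv_C/dt = i.
    # Storage function E = 0.5*C*v_C**2, supply rate sigma(i, v) = i*v.
    import numpy as np

    R, C = 2.0, 0.5                # illustrative component values
    dt, T = 1e-4, 5.0
    t = np.arange(0.0, T, dt)
    v = np.sin(3 * t) + 0.5 * np.sign(np.sin(0.7 * t))   # port voltage (has jumps)

    v_C = np.zeros_like(t)         # capacitor voltage, v_C(0) = 0
    for k in range(len(t) - 1):
        i_k = (v[k] - v_C[k]) / R                  # port current
        v_C[k + 1] = v_C[k] + dt * i_k / C         # forward Euler step

    i = (v - v_C) / R
    E = 0.5 * C * v_C ** 2                          # stored energy E(t_k)
    supplied = np.concatenate(([0.0], np.cumsum((i * v)[:-1]) * dt))  # ~ integral of i*v on [0, t_k]

    # E(t_k) - E(0) should never exceed the energy supplied through the port
    print("max violation of (5.1):", np.max(E - E[0] - supplied))

Since dE/dt = v_C i = i v − R i^2 ≤ i v, the reported violation is non-positive (up to discretization error), in agreement with (5.1).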

5.1.2 Lyapunov functions for ODE models

It is important to have tools for verifying that a given function of a system's state is monotonically non-increasing along system trajectories, without explicitly calculating solutions of the system equations. For systems defined by ODE models, this can usually be done.

Consider an autonomous system defined by the ODE model

    ẋ(t) = a(x(t)),   (5.2)

where a : X → R^n is a function defined on a subset X of R^n. A function V : X → R is a Lyapunov function for system (5.2) if t ↦ V(x(t)) is monotonically non-increasing for every solution of (5.2). Remember that x : [t_0, t_1] → X is called a solution of (5.2) if the composition a ∘ x is absolutely integrable on [t_0, t_1] and the equality

    x(t) = x(t_0) + ∫_{t_0}^{t} a(x(τ)) dτ

holds for all t ∈ [t_0, t_1].

To check that a given function V is a Lyapunov function for system (5.2), one usually attempts to differentiate V(x(t)) with respect to t. If X is an open set, and both V and x are differentiable (note that the differentiability of x is assured by the continuity of a), the composition t ↦ V(x(t)) is also differentiable, and the monotonicity condition is given by

    ∇V(x̄) a(x̄) ≤ 0   for all x̄ ∈ X,   (5.3)

where ∇V(x̄) denotes the gradient of V at x̄.

In some applications one may be forced to work with systems that have non-differentiable solutions (for example, because of a jump in an external input signal). Convenient Lyapunov function candidates V may also be non-differentiable at some points. In such situations, it is tempting to consider, for every x̄ ∈ X, the one-sided increments of V at x̄ in the direction a(x̄). One may expect that their non-positivity, which can be expressed as

    lim sup_{ε→0, ε>0}  sup_{0<t<ε}  [V(x̄ + t a(x̄)) − V(x̄)] / t  ≤  0   for all x̄ ∈ X,   (5.4)

implies that V is a valid Lyapunov function. However, this is not always true.

Example 5.2. Using the famous example of a Kantor function, one can construct a bounded integrable function a : R → R and a continuous function V : R → R such that t ↦ V(x̄ + t a(x̄)) is constant in a neighborhood of t = 0 for every x̄ ∈ R, but the ODE (5.2) has a solution along which V(x(t)) grows monotonically and without bound! Here by a Kantor function we mean a continuous non-decreasing function k : [0, 1] → R such that k(0) = 0 and k(1) = 1 despite the fact that k(t) is constant on a family T = {T} of open disjoint intervals T ⊂ [0, 1] of total length 1.

Indeed, for a fixed Kantor function k define

    V(x̄) = floor(x̄) + k(x̄ − floor(x̄)),

where floor(x̄) denotes the largest integer not larger than x̄. Let a(x̄) = 1 on every interval (m + t_1, m + t_2), where m is an integer and (t_1, t_2) ∈ T, and a(x̄) = 0 otherwise. Since a(τ) = 1 for almost every τ, the function x(t) = t is a solution of ODE (5.2), and V(x(t)) = V(t) grows monotonically from m to m + 1 on every interval [m, m + 1], despite the fact that t ↦ V(x̄ + t a(x̄)) is constant in a neighborhood of t = 0 for every x̄ ∈ R.

However, if V and all solutions of (5.2) are smooth enough, condition (5.4) is sufficient for V to be a Lyapunov function.

Theorem 5.1. If X is an open set in R^n, V : X → R is locally Lipschitz, a : X → R^n is continuous, and condition (5.4) is satisfied, then V(x(t)) is monotonically non-increasing for all solutions x : [t_0, t_1] → X of (5.2).

Proof. We will use the following statement: if h : [t_0, t_1] → R is continuous and satisfies

    lim sup_{d→0, d>0}  sup_{δ∈(0,d)}  [h(t + δ) − h(t)] / δ  ≤  0   for all t ∈ [t_0, t_1),   (5.5)

then h is monotonically non-increasing. Indeed, for every r > 0 let h_r(t) = h(t) − rt. If h_r is monotonically non-increasing for all r > 0 then so is h. Otherwise, assume that h_r(t_3) > h_r(t_2) for some t_2 < t_3 ≤ t_1 and r > 0. Let t_4 be the maximal solution of the equation h_r(t) = h_r(t_2) with t ∈ [t_2, t_3]. Then h_r(t) > h_r(t_4) for all t ∈ (t_4, t_3], and hence (5.5) is violated at t = t_4.

Now let M be a Lipschitz constant for V in a neighborhood of the trajectory of x. Since a is continuous,

    lim_{δ→0, δ>0}  |x(t + δ) − x(t) − δ a(x(t))| / δ  =  0   for all t.

Hence the maximum (over t ∈ [t_0, t_1 − δ]) of

    [V(x(t + δ)) − V(x(t))] / δ
        = [V(x(t) + δ a(x(t))) − V(x(t))] / δ + [V(x(t + δ)) − V(x(t) + δ a(x(t)))] / δ
        ≤ [V(x(t) + δ a(x(t))) − V(x(t))] / δ + M |x(t + δ) − x(t) − δ a(x(t))| / δ

converges to a non-positive limit as δ → 0, so h(t) = V(x(t)) satisfies (5.5) and is therefore monotonically non-increasing.

A time-varying ODE model

    ẋ_1(t) = a_1(x_1(t), t)   (5.6)

can be converted to the form (5.2) by introducing the augmented state

    x(t) = [x_1(t); t],   a([x̄_1; τ]) = [a_1(x̄_1, τ); 1],

in which case the Lyapunov function V = V(x(t)) = V(x_1(t), t) can naturally depend on time.

5.1.3 Storage functions for ODE models

Consider the ODE model

    ẋ(t) = f(x(t), u(t))   (5.7)

with state vector x(t) ∈ X ⊂ R^n and input u(t) ∈ U ⊂ R^m, where f : X × U → R^n is a given function. Let σ : X × U → R be a given functional. A function V : X → R is called a storage function with supply rate σ for system (5.7) if

    V(x(t_1)) − V(x(t_0)) ≤ ∫_{t_0}^{t_1} σ(x(t), u(t)) dt

for every pair of integrable functions x : [t_0, t_1] → X, u : [t_0, t_1] → U such that the composition t ↦ f(x(t), u(t)) is absolutely integrable on [t_0, t_1] and the identity

    x(t) = x(t_0) + ∫_{t_0}^{t} f(x(τ), u(τ)) dτ

holds for all t ∈ [t_0, t_1].

When X is an open set, f and σ are continuous, and V is continuously differentiable, verifying that a given V is a valid storage function with supply rate σ is straightforward: it is sufficient to check that

    ∇V(x̄) f(x̄, ū) ≤ σ(x̄, ū)   for all x̄ ∈ X, ū ∈ U.

When V is locally Lipschitz, the following generalization of Theorem 5.1 is available.

Theorem 5.2. If X is an open set in R^n, V : X → R is locally Lipschitz, f : X × U → R^n and σ : X × U → R are continuous, and the condition

    lim sup_{ε→0, ε>0}  sup_{0<t<ε}  [V(x̄ + t f(x̄, ū)) − V(x̄)] / t  ≤  σ(x̄, ū)   for all x̄ ∈ X, ū ∈ U   (5.8)

is satisfied, then V is a storage function with supply rate σ for system (5.7).

The proof follows the lines of the proof of Theorem 5.1. Further generalizations, to discontinuous functions f, etc., are possible.
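
As a concrete illustration of this differential test (a sketch added here; the scalar system, the storage candidate V, and the supply rate σ are illustrative choices, not from the notes), the following Python fragment checks ∇V(x̄)f(x̄, ū) ≤ σ(x̄, ū) for ẋ = −x + u with V(x) = x^2/2 and σ(x, u) = u·x, and then verifies the integrated storage inequality along a simulated trajectory:

    # Scalar system xdot = f(x, u) = -x + u, storage candidate V(x) = 0.5*x**2,
    # supply rate sigma(x, u) = u*x (the passivity supply rate for output y = x).
    # Pointwise: grad V(x)*f(x, u) = x*(-x + u) = -x**2 + u*x <= sigma(x, u).
    import numpy as np

    f = lambda x, u: -x + u
    V = lambda x: 0.5 * x ** 2
    grad_V = lambda x: x
    sigma = lambda x, u: u * x

    # 1) pointwise differential test over a grid of (x, u) pairs
    worst = max(grad_V(x) * f(x, u) - sigma(x, u)
                for x in np.linspace(-5.0, 5.0, 201)
                for u in np.linspace(-5.0, 5.0, 201))
    print("max of grad V * f - sigma:", worst)              # should be <= 0

    # 2) integrated inequality V(x(t_1)) - V(x(t_0)) <= integral of sigma along a trajectory
    dt, T = 1e-3, 10.0
    t = np.arange(0.0, T, dt)
    u = np.cos(2 * t) + 0.3                                 # some input signal
    x = np.zeros_like(t)
    x[0] = 2.0
    for k in range(len(t) - 1):
        x[k + 1] = x[k] + dt * f(x[k], u[k])                # forward Euler step
    supplied = np.concatenate(([0.0], np.cumsum(sigma(x, u)[:-1]) * dt))  # ~ integral of sigma on [0, t_k]
    print("max violation of the storage inequality:", np.max(V(x) - V(x[0]) - supplied))

The first printed value is exactly zero for this example (attained at x̄ = 0), and the second is non-positive up to discretization error, consistent with V being a storage function with supply rate σ.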