Nonlinear Control Systems

Nonlinear Control Systems
António Pedro Aguiar (pedro@isr.ist.utl.pt)
3. Fundamental Properties
IST-DEEC PhD Course, 2012
http://users.isr.ist.utl.pt/%7epedro/ncs2012/

Example
Consider the system ẋ = f(t, x), x(t₀) = x₀. Does it have a solution over an interval [t₀, t₁]? That is, does there exist a continuous function x : [t₀, t₁] → Rⁿ such that ẋ(t) is defined and satisfies ẋ(t) = f(t, x(t)) for all t ∈ [t₀, t₁]? Is it unique, or is it possible to have more than one solution?
And if we restrict f(t, x) to be continuous in x and piecewise continuous in t, is this sufficient to guarantee existence and uniqueness? No! For example, ẋ = x^{1/3}, x(0) = 0 has the solution x(t) = (2t/3)^{3/2}, but the solution is not unique, since x(t) ≡ 0 is another solution.
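
As a quick illustration (a numerical sketch in Python, not part of the original slides), one can check that both candidate solutions satisfy the initial-value problem ẋ = x^{1/3}, x(0) = 0:

```python
import numpy as np

# Right-hand side of the scalar ODE xdot = x^(1/3).
f = lambda x: np.cbrt(x)

t = np.linspace(0.0, 2.0, 201)

# Candidate 1: the trivial solution x(t) = 0.
x0 = np.zeros_like(t)

# Candidate 2: x(t) = (2t/3)^(3/2).
x1 = (2.0 * t / 3.0) ** 1.5
dx1 = np.gradient(x1, t)            # numerical derivative of candidate 2

# Both residuals |xdot - f(x)| vanish (up to discretization error),
# so the initial-value problem has at least two distinct solutions.
print(np.max(np.abs(0.0 - f(x0))))  # exactly 0
print(np.max(np.abs(dx1 - f(x1))))  # small discretization error
```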

Lipschitz condition
A function f(x) is said to be locally Lipschitz on a domain (open and connected set) D ⊂ Rⁿ if each point of D has a neighborhood D₀ such that f(·) satisfies
‖f(x) − f(y)‖ ≤ L‖x − y‖, ∀x, y ∈ D₀
(with the same Lipschitz constant L). The same terminology is extended to a function f(t, x), provided the Lipschitz condition holds uniformly in t for all t in a given interval.
Remark: For f : R → R, the condition reads |f(x) − f(y)| / |x − y| ≤ L, which means that a straight line joining any two points of the graph of f(·) cannot have a slope whose absolute value is greater than L.
Example: f(x) = x^{1/3} is not locally Lipschitz at x = 0, since f′(x) = (1/3) x^{−2/3} → ∞ as x → 0.
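
The loss of the Lipschitz property at x = 0 can also be seen numerically from the difference quotient |f(x) − f(0)| / |x − 0|, which grows without bound as x → 0 (an illustrative sketch, not from the slides):

```python
import numpy as np

f = lambda x: np.cbrt(x)   # f(x) = x^(1/3)

# Difference quotients |f(x) - f(0)| / |x - 0| for x -> 0+.
for x in [1e-1, 1e-3, 1e-6, 1e-9]:
    print(x, abs(f(x) - f(0.0)) / abs(x - 0.0))
# The quotient grows like x^(-2/3), so no finite Lipschitz constant L
# works on any neighborhood of x = 0.
```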

Existence and Uniqueness
Theorem 3.1 - Local Existence and Uniqueness
Let f(t, x) be piecewise continuous in t and satisfy the Lipschitz condition
‖f(t, x) − f(t, y)‖ ≤ L‖x − y‖, ∀x, y ∈ B = {x ∈ Rⁿ : ‖x − x₀‖ ≤ r}, ∀t ∈ [t₀, t₁].
Then, there exists some δ > 0 such that the state equation ẋ = f(t, x) with x(t₀) = x₀ has a unique solution over [t₀, t₀ + δ].
Theorem 3.2 - Global Existence and Uniqueness
Suppose that f(t, x) is piecewise continuous in t and satisfies
‖f(t, x) − f(t, y)‖ ≤ L‖x − y‖, ∀x, y ∈ Rⁿ, ∀t ∈ [t₀, t₁].
Then, the state equation ẋ = f(t, x), with x(t₀) = x₀, has a unique solution over [t₀, t₁].
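
The standard proof of Theorem 3.1 (not reproduced in these slides) constructs the solution by Picard successive approximations, x_{k+1}(t) = x₀ + ∫_{t₀}^{t} f(s, x_k(s)) ds, which converge under the stated Lipschitz condition. A minimal numerical sketch of this iteration, assuming the simple scalar example ẋ = −x, x(0) = 1:

```python
import numpy as np

# Picard iteration x_{k+1}(t) = x0 + \int_{t0}^{t} f(s, x_k(s)) ds,
# discretized on a grid; f(t, x) = -x and x0 = 1 are assumed for illustration.
f = lambda t, x: -x
t = np.linspace(0.0, 1.0, 1001)
x0 = 1.0

x = np.full_like(t, x0)                 # initial guess: x_0(t) = x0
for _ in range(20):
    integrand = f(t, x)
    # cumulative trapezoidal integral from t0 = 0 to each grid point
    integral = np.concatenate(([0.0], np.cumsum(
        0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
    x = x0 + integral

print(np.max(np.abs(x - np.exp(-t))))   # iterates converge to e^{-t}
```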

Existence and Uniqueness
Lemma 3.2
If f(t, x) and [∂f/∂x](t, x) are continuous on [a, b] × D, for some domain D ⊂ Rⁿ, then f is locally Lipschitz in x on [a, b] × D.
Lemma 3.3
If f(t, x) and [∂f/∂x](t, x) are continuous on [a, b] × Rⁿ, then f is globally Lipschitz in x on [a, b] × Rⁿ if and only if [∂f/∂x] is uniformly bounded on [a, b] × Rⁿ.

Examples
1. ẋ = A(t)x + g(t) = f(t, x)    (1)
with A(t), g(t) piecewise continuous functions of t. Then
‖f(t, x) − f(t, y)‖ = ‖A(t)x + g(t) − (A(t)y + g(t))‖ = ‖A(t)(x − y)‖ ≤ ‖A(t)‖ ‖x − y‖.
Note that for any finite interval of time [t₀, t₁], the elements of A(t) are bounded. Thus ‖A(t)‖ ≤ a for any induced norm, and
‖f(t, x) − f(t, y)‖ ≤ a ‖x − y‖.
Therefore, from Theorem 3.2 we conclude that (1) has a unique solution over [t₀, t₁]. Since t₁ can be arbitrarily large, it follows that the system has a unique solution for all t ≥ t₀. There is no finite escape time.
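
This conclusion can be checked in simulation for a particular choice of A(t) and g(t); the pair used below is an assumed example, not from the slides, and the run merely confirms that the state remains finite over a long horizon:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical piecewise-continuous data: A(t) is bounded on any finite interval.
A = lambda t: np.array([[0.0, 1.0], [-2.0 - np.sin(t), -0.5]])
g = lambda t: np.array([0.0, np.cos(3.0 * t)])

rhs = lambda t, x: A(t) @ x + g(t)          # f(t, x) = A(t)x + g(t)

sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)

# Globally Lipschitz in x on every finite interval => no finite escape time:
# the integration reaches t = 100 with a finite state.
print(sol.status, np.max(np.abs(sol.y)))
```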

Examples
2. ẋ = −x³ = f(x), x ∈ R    (2)
Is it globally Lipschitz? No! From Lemma 3.3, f(x) is continuous but the Jacobian ∂f/∂x = −3x² is not globally bounded. Nevertheless, for any x(t₀) = x₀, (2) has the unique solution
x(t) = sgn(x₀) √( x₀² / (1 + 2x₀²(t − t₀)) ).
3. ẋ = −x², x(0) = −1    (3)
From Lemma 3.2 we conclude that f is locally Lipschitz on any compact subset of R, because f(x) and ∂f/∂x are continuous. Hence, there exists a unique solution over [0, δ] for some δ > 0. In this case the solution is x(t) = 1/(t − 1), which only exists over [0, 1), i.e., there is a finite escape time at t = 1!
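
The finite escape time of example 3 is easy to reproduce numerically. The sketch below assumes the reconstruction ẋ = −x², x(0) = −1 and compares the numerical solution with x(t) = 1/(t − 1):

```python
import numpy as np
from scipy.integrate import solve_ivp

rhs = lambda t, x: -x**2                    # xdot = -x^2
sol = solve_ivp(rhs, (0.0, 2.0), [-1.0], rtol=1e-10, atol=1e-12)

# The solver cannot continue past the escape time: |x(t)| blows up as t -> 1,
# so the last accepted time is just below 1 and the state magnitude is huge.
print(sol.status, sol.t[-1], sol.y[0, -1])

# Away from the escape time the numerical solution matches x(t) = 1/(t - 1).
mask = sol.t < 0.9
print(np.max(np.abs(sol.y[0, mask] - 1.0 / (sol.t[mask] - 1.0))))
```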

Uniqueness and Existence
Theorem 3.3
Let f(t, x) be piecewise continuous in t and locally Lipschitz in x for all t ≥ t₀ and all x in a domain D ⊂ Rⁿ. Let W be a compact subset of D, x₀ ∈ W, and suppose it is known that every solution of ẋ = f(t, x), x(t₀) = x₀ lies entirely in W. Then, there is a unique solution that is defined for all t ≥ t₀.
Proof. By Theorem 3.1, there is a unique solution over [t₀, t₀ + δ]. Let [t₀, T) be its maximum interval of existence. We would like to show that T = ∞. Suppose this is not the case, i.e., T is finite. Then the solution must leave any compact subset of D. But this is a contradiction, because x never leaves the compact set W. Thus we conclude that T = ∞.

Example
Returning to the example ẋ = −x³: it is locally Lipschitz on R. For any initial condition x(0) = x₀ ∈ R, the solution cannot leave the compact set W = {x ∈ R : |x| ≤ |x₀|}, because at any instant of time
if x > 0 then ẋ < 0,
if x < 0 then ẋ > 0.
Thus, without computing the solution explicitly, we can conclude from Theorem 3.3 that the system has a unique solution for all t ≥ 0.
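
A short simulation (an illustrative sketch, not part of the slides) confirms that trajectories of ẋ = −x³ never leave W = {x : |x| ≤ |x₀|}:

```python
import numpy as np
from scipy.integrate import solve_ivp

rhs = lambda t, x: -x**3                       # xdot = -x^3

for x0 in [-3.0, -0.5, 0.5, 3.0]:
    sol = solve_ivp(rhs, (0.0, 50.0), [x0], rtol=1e-9, atol=1e-12)
    # The trajectory stays inside W = {x : |x| <= |x0|} for all time,
    # so Theorem 3.3 gives global existence and uniqueness.
    print(x0, np.max(np.abs(sol.y[0])) <= abs(x0) + 1e-9)
```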

Continuous dependence on initial conditions and parameters
Consider the following nominal model
ẋ = f(t, x, λ₀)    (4)
where λ₀ ∈ Rᵖ denotes the nominal vector of constant parameters of the model and x ∈ Rⁿ is the state. Let y(t) be a solution of (4) that starts at y(t₀) = y₀ and is defined on the interval [t₀, t₁]. Let z(t) be a solution of ẋ = f(t, x, λ) defined on [t₀, t₁] with z(t₀) = z₀.
When does z(t) remain close to y(t)? In other words, does the solution depend continuously on the initial condition and on the parameter λ? That is,
∀ε > 0, ∃δ > 0 : ‖z₀ − y₀‖ < δ, ‖λ − λ₀‖ < δ ⇒ ‖z(t) − y(t)‖ < ε, ∀t ∈ [t₀, t₁].

Continuous dependence on initial conditions and parameters
Theorem 3.4
Let f(t, x) be piecewise continuous in t and Lipschitz in x (with Lipschitz constant L) on [t₀, t₁] × W, where W ⊂ Rⁿ is an open connected set. Let y(t) and z(t) be solutions of
ẏ = f(t, y), y(t₀) = y₀
ż = f(t, z) + g(t, z), z(t₀) = z₀
such that y(t), z(t) ∈ W for all t ∈ [t₀, t₁]. Suppose that
‖g(t, z)‖ ≤ µ, ∀(t, z) ∈ [t₀, t₁] × W
for some µ > 0. Then
‖y(t) − z(t)‖ ≤ ‖y₀ − z₀‖ e^{L(t − t₀)} + (µ/L)(e^{L(t − t₀)} − 1), ∀t ∈ [t₀, t₁].

Proof
y(t) = y₀ + ∫_{t₀}^{t} f(τ, y(τ)) dτ
z(t) = z₀ + ∫_{t₀}^{t} [f(τ, z(τ)) + g(τ, z(τ))] dτ
Subtracting and taking norms yields
‖y(t) − z(t)‖ ≤ ‖y₀ − z₀‖ + ∫_{t₀}^{t} ‖f(τ, y(τ)) − f(τ, z(τ))‖ dτ + ∫_{t₀}^{t} ‖g(τ, z(τ))‖ dτ
≤ γ + ∫_{t₀}^{t} L ‖y(τ) − z(τ)‖ dτ + µ(t − t₀),
where γ := ‖y₀ − z₀‖.

Applying the Gronwall-Bellman inequality yields
‖y(t) − z(t)‖ ≤ γ + µ(t − t₀) + ∫_{t₀}^{t} L [γ + µ(τ − t₀)] e^{L(t − τ)} dτ.
Integrating the right-hand side by parts (∫ u dv = uv − ∫ v du) we obtain
‖y(t) − z(t)‖ ≤ γ + µ(t − t₀) − γ − µ(t − t₀) + γ e^{L(t − t₀)} + ∫_{t₀}^{t} µ e^{L(t − s)} ds
= γ e^{L(t − t₀)} + (µ/L)(e^{L(t − t₀)} − 1).
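
The bound of Theorem 3.4 can be verified numerically for a concrete pair of systems. In the sketch below the data are assumed for illustration (f(t, x) = −x, so L = 1, and g(t, z) = µ sin t with µ = 0.1); the observed deviation ‖y(t) − z(t)‖ stays below the theoretical estimate:

```python
import numpy as np
from scipy.integrate import solve_ivp

L, mu = 1.0, 0.1                       # Lipschitz constant of f, bound on g
f = lambda t, x: -x                    # |f(t,x) - f(t,y)| <= L|x - y| with L = 1
g = lambda t, z: mu * np.sin(t)        # perturbation with |g(t,z)| <= mu

t_eval = np.linspace(0.0, 5.0, 501)
y = solve_ivp(f, (0.0, 5.0), [1.0], t_eval=t_eval, rtol=1e-10).y[0]
z = solve_ivp(lambda t, x: f(t, x) + g(t, x), (0.0, 5.0), [1.05],
              t_eval=t_eval, rtol=1e-10).y[0]

gamma = abs(1.0 - 1.05)                # |y0 - z0|
bound = gamma * np.exp(L * t_eval) + (mu / L) * (np.exp(L * t_eval) - 1.0)
print(np.all(np.abs(y - z) <= bound))  # True: the Theorem 3.4 bound holds
```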

Theorem 3.5 - Continuity of solutions
Let f(t, x, λ) be continuous in (t, x, λ) and locally Lipschitz in x on [t₀, t₁] × D × {‖λ − λ₀‖ ≤ c}, where D ⊂ Rⁿ is an open connected set. Let y(t, λ₀) be a solution of ẋ = f(t, x, λ₀), with y(t₀, λ₀) = y₀ ∈ D. Suppose y(t, λ₀) is defined and belongs to D for all t ∈ [t₀, t₁]. Then, given ε > 0, there is δ > 0 such that if
‖z₀ − y₀‖ < δ, ‖λ − λ₀‖ < δ
then there is a unique solution z(t, λ) of ẋ = f(t, x, λ) defined on [t₀, t₁], with z(t₀, λ) = z₀, such that
‖z(t, λ) − y(t, λ₀)‖ < ε, ∀t ∈ [t₀, t₁].
Proof. Write
ż = f(t, z, λ₀) + [f(t, z, λ) − f(t, z, λ₀)] = f(t, z, λ₀) + g(t, z).
By continuity of f in λ, for every µ > 0 there is δ > 0 such that ‖λ − λ₀‖ < δ ⇒ ‖g(t, z)‖ < µ. Therefore, applying Theorem 3.4 and noticing that ‖y₀ − z₀‖ and µ can be chosen arbitrarily small, we conclude Theorem 3.5.
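
Theorem 3.5 can be illustrated numerically: as (z₀, λ) approaches (y₀, λ₀), the perturbed trajectory approaches the nominal one uniformly on [t₀, t₁]. A minimal sketch, assuming the example ẋ = −x + λ sin x with λ₀ = 1:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parametrized system: xdot = f(t, x, lam) = -x + lam*sin(x).
f = lambda t, x, lam: -x + lam * np.sin(x)

t_eval = np.linspace(0.0, 10.0, 1001)
lam0 = 1.0
y = solve_ivp(lambda t, x: f(t, x, lam0), (0.0, 10.0), [2.0],
              t_eval=t_eval, rtol=1e-10).y[0]

for dlam in [1e-1, 1e-2, 1e-3]:
    z = solve_ivp(lambda t, x: f(t, x, lam0 + dlam), (0.0, 10.0), [2.0 + dlam],
                  t_eval=t_eval, rtol=1e-10).y[0]
    # sup-norm deviation shrinks as (z0, lambda) -> (y0, lambda0)
    print(dlam, np.max(np.abs(z - y)))
```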

Comparison Principle
Quite often, when we study the state equation ẋ = f(t, x), we need to compute bounds on the solution x(t). For that we have the Gronwall-Bellman inequality and the comparison lemma.
The comparison lemma compares the solution of the differential inequality v̇(t) ≤ f(t, v(t)) with the solution of u̇(t) = f(t, u). Moreover, v(t) need not be differentiable.
Definition: Upper right-hand derivative
D⁺v(t) = lim sup_{h → 0⁺} (v(t + h) − v(t)) / h
The following properties hold:
if v(t) is differentiable at t, then D⁺v(t) = v̇(t);
if (1/h)(v(t + h) − v(t)) ≤ g(t, h) and lim_{h → 0⁺} g(t, h) = g₀(t), then D⁺v(t) ≤ g₀(t).
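
The upper right-hand derivative can be approximated directly from its definition. The sketch below (illustrative, not from the slides) evaluates the one-sided difference quotients of v(t) = |sin t| at t = π, a point where v is continuous but not differentiable, and finds D⁺v(π) = 1:

```python
import numpy as np

v = lambda t: np.abs(np.sin(t))   # continuous, not differentiable at t = pi

t = np.pi
# One-sided difference quotients (v(t + h) - v(t)) / h for h -> 0+.
for h in [1e-1, 1e-3, 1e-6, 1e-9]:
    print(h, (v(t + h) - v(t)) / h)
# The quotients approach 1, i.e. D+ v(pi) = 1, even though the ordinary
# two-sided derivative of v does not exist at t = pi.
```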

Comparison Principle
Lemma 3.4 - Comparison Lemma
Consider the scalar differential equation
u̇ = f(t, u), u(t₀) = u₀
where f(t, u) is continuous in t and locally Lipschitz in u, for all t ≥ 0 and all u ∈ J ⊂ R. Let [t₀, T) (T may be ∞) be the maximal interval of existence of the solution u(t) ∈ J. Let v(t) be a continuous function that satisfies
D⁺v(t) ≤ f(t, v(t)), v(t₀) ≤ u₀
with v(t) ∈ J for all t ∈ [t₀, T). Then,
v(t) ≤ u(t), ∀t ∈ [t₀, T).

Example
Show that the solution of
ẋ = f(x) = −(1 + x²)x, x(0) = a
is unique and defined for all t ≥ 0.
Because f(x) is locally Lipschitz, the equation has a unique solution on [0, t₁] for some t₁ > 0. Let v(t) = x²(t). Then
v̇(t) = 2x(t)ẋ(t) = −2x²(t) − 2x⁴(t) ≤ −2v(t), v(0) = a².
Let u(t) be the solution of
u̇ = −2u, u(0) = a², i.e. u(t) = a² e^{−2t}.
Then, by the comparison lemma, the solution x(t) is defined for all t ≥ 0 and satisfies
|x(t)| = √v(t) ≤ |a| e^{−t}, ∀t ≥ 0.
By Theorem 3.3 it follows that the solution is unique and defined for all t ≥ 0.
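
The comparison-lemma estimate |x(t)| ≤ |a| e^{−t} can be checked in simulation (a sketch, assuming the reconstructed right-hand side f(x) = −(1 + x²)x):

```python
import numpy as np
from scipy.integrate import solve_ivp

rhs = lambda t, x: -(1.0 + x**2) * x            # xdot = -(1 + x^2) x

t_eval = np.linspace(0.0, 10.0, 1001)
for a in [-5.0, -1.0, 0.5, 3.0]:
    sol = solve_ivp(rhs, (0.0, 10.0), [a], t_eval=t_eval, rtol=1e-10, atol=1e-12)
    # Comparison lemma: |x(t)| = sqrt(v(t)) <= |a| * exp(-t) for all t >= 0.
    print(a, np.all(np.abs(sol.y[0]) <= abs(a) * np.exp(-t_eval) + 1e-9))
```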