STABILITY ANALYSIS OF DYNAMIC SYSTEMS


AUTOMATIC CONTROL AND SYSTEM THEORY. STABILITY ANALYSIS OF DYNAMIC SYSTEMS. Claudio Melchiorri, Dipartimento di Ingegneria dell'Energia Elettrica e dell'Informazione (DEI), Università di Bologna. Email: claudio.melchiorri@unibo.it

Definition of stability. Let us consider a generic dynamic system (linear or non-linear, time-varying or stationary) described by its state transition function. Assume that U, U_f, X, Y are normed vector spaces (i.e. equipped with a norm). Given the initial time instant t₀, the initial state x(t₀) and the input function u(·), let us consider the reference motion x(t) = ϕ(t, t₀, x(t₀), u(·)). We are interested in studying two types of perturbations: 1) variation of the motion due to a perturbation of the initial state (difference between the perturbed and the reference motion); 2) variation of the motion due to a perturbation of the input (difference between the perturbed and the reference motion).

Local stability: initial state perturbation. The motion x(·) corresponding to the input function u(·) is stable with respect to perturbations of the initial state if, for all ε > 0, there exists η > 0 such that if ‖δx(t₀)‖ < η then ‖δx(t)‖ < ε for all t ≥ t₀. The motion x(·) is asymptotically stable with respect to perturbations of the initial state if, in addition, ‖δx(t)‖ → 0 as t → ∞.

Local stability: input perturbation. The motion x(·) corresponding to the input function u(·) is stable with respect to perturbations of the input function if, for all ε > 0, there exists η > 0 such that if ‖δu(t)‖ < η for all t ≥ t₀ then ‖δx(t)‖ < ε for all t ≥ t₀. Notes. 1. These definitions of stability refer to a particular motion, defined by the three elements t₀, x(t₀), u(·). 2. The stability definitions, besides ϕ(t, t₀, x(t₀), u(·)), can also be applied to the response function γ(t, t₀, x(t₀), u(·)). 3. Stability can be referred to an equilibrium state x(t) = ϕ(t, t₀, x(t₀), u(·)), which is a particular (constant) motion of the system.

Local stability. Notes. 4. If the system is stationary, stability does not depend on t₀ but only on x(t₀) and u(·). 5. A particular motion may correspond to different input functions, and its stability may depend on the applied input. Example: the state x = 0, kept by a constant input u(·), is an equilibrium state: stable if u > Mg, unstable if u < Mg. 6. The given definition of stability refers to a local stability property (for small perturbations).

Stability in the large: initial state perturbations. Let us consider asymptotic stability with respect to perturbations of the initial state. The set X₀(t₀, x(t₀), u(·)) of all initial states x(t₀) + δx₁(t₀) whose corresponding motions converge asymptotically to the reference motion is called the asymptotic stability domain with respect to t₀, x(t₀), u(·). If X₀(t₀, x(t₀), u(·)) = X (i.e. the whole state space), then the motion x(·) corresponding to the input function u(·) is globally asymptotically stable. If this condition holds for all t₀ and all u(·), the system is globally asymptotically stable.

Stability in the large. Example: x = 0 is an equilibrium point whose stability domain is −X < x < X.

Stability in the large: input perturbations. Let us consider again the definition of stability with respect to variations of the input function. The system is bounded-input bounded-state stable (b.i.b.s.) with respect to the motion t₀, x(t₀), u(·) if two positive real numbers M_u and M_x exist, in general functions of t₀, x(t₀), u(·), such that ‖δx(t)‖ < M_x for all t ≥ t₀ whenever the input perturbation satisfies ‖δu(t)‖ < M_u for all t ≥ t₀. Similarly, by considering the response function γ(t, t₀, x(t₀), u(·)), it is possible to define bounded-input bounded-output stability (b.i.b.o.).

Stability of the equilibrium. Stability of an equilibrium state. Let us consider an autonomous (homogeneous) system ẋ(t) = f(x(t), t). Def. 1: x_e is an equilibrium point (state) if 0 = f(x_e, t) for all t ≥ t₀. Let us consider now system (1) and assume, without loss of generality, that the zero state is an equilibrium state (x_e = 0), i.e. f(0, t) = 0.

Stability of the equilibrium. Stability of an equilibrium state. The assumption of considering the zero state as an equilibrium state is not restrictive. As a matter of fact, if the equilibrium is x_e ≠ 0 (i.e. f(x_e, t) = 0), it is possible to consider a new state z = x − x_e and a new function g(z, t) = f(z + x_e, t), so that ż = g(z, t), in which, obviously, we have g(0, t) = 0.

Stability of the equilibrium. Stability of an equilibrium state. Let us consider an autonomous system ẋ(t) = f(x(t)), (1). Note: a system is autonomous if it has no input and does not depend explicitly on time. Def. 2: the zero state of system (1) is stable in the sense of Lyapunov at time instant t₀ if for all ε > 0 there exists a parameter η > 0 such that ‖x(t₀)‖ < η implies ‖x(t)‖ < ε for all t ≥ t₀. Def. 3: the zero state of system (1) is asymptotically stable in the sense of Lyapunov at time instant t₀ if it is stable and if, in addition, x(t) → 0 as t → ∞. Note: these definitions apply to the equilibrium state, not to the system. A dynamic system can have several equilibrium points.

Stability of the equilibrium. Def. 4: the zero state of system (1) is said to be globally asymptotically stable, or asymptotically stable in the large, in the sense of Lyapunov at time instant t₀ if it is stable and if x(t) → 0 as t → ∞ for any initial state x(t₀) ∈ X, where X is the whole state space. In this case, since no other equilibrium state exists, the system (1) is said to be globally asymptotically stable.

Stability of the equilibrium. Figure: trajectories x₁(t) starting within the band ‖x(t₀)‖ < η; case 1 remains within ‖x(t)‖ < ε, case 2 converges to zero, case 3 leaves the band. In summary: 1. stable; 2. asymptotically stable; 3. unstable. If the previous definitions hold for all initial time instants t₁ ∈ [t₀, ∞), the stability property is said to be uniform in [t₀, ∞).

Stability of the equilibrium. Def. 5: the zero state of system (1) is said to be (locally) exponentially stable if there exist two real constants α, λ > 0 such that ‖x(t) − x_e‖ ≤ α ‖x(0) − x_e‖ e^(−λt) for all t > 0, whenever ‖x(0) − x_e‖ < δ. It is said to be globally exponentially stable if this property holds for any x(0) ∈ Rⁿ. Note: clearly, exponential stability implies asymptotic stability. The converse is, however, not true.

The integrator as a dynamic system. Example: the integrator, with transfer function 1/s between u(s) and y(s). We want to study the time evolution of the state x(t) with initial condition x₀ and null input. 1) Time evolution of the state x(t) due to perturbations of x₀: given x₀, x(t) = x₀; given a perturbation of the initial state δx₀, x(t) = x₀ + δx₀. The perturbations on the time evolution of the state and of the output are bounded (globally stable in the sense of Lyapunov, but not asymptotically stable). 2) Time evolution of the state x(t) due to perturbations of the input: given x₀ and u(t) = 0, x(t) = x₀; in case of a (constant) perturbation of the input δu, x(t) = x₀ + δu·t. The perturbations on the time evolution of the state and of the output are not bounded, x(t) → ∞ for t → ∞ (not b.i.b.s. stable and not b.i.b.o. stable).
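As a minimal numerical check of this behaviour (a sketch assuming simple integrator dynamics ẋ = u; the step size and perturbation values are illustrative, not from the slides):

import numpy as np

# Integrator dynamics x' = u, simulated with forward Euler (illustrative step size).
def simulate_integrator(x0, u, t_end=10.0, dt=0.01):
    t = np.arange(0.0, t_end, dt)
    x = np.empty_like(t)
    x[0] = x0
    for k in range(1, len(t)):
        x[k] = x[k - 1] + dt * u(t[k - 1])
    return t, x

# 1) Initial-state perturbation, zero input: the deviation stays constant (bounded).
_, x_nom = simulate_integrator(x0=1.0, u=lambda t: 0.0)
_, x_pert = simulate_integrator(x0=1.0 + 0.1, u=lambda t: 0.0)
print("max state deviation:", np.max(np.abs(x_pert - x_nom)))   # stays 0.1

# 2) Constant input perturbation: the state grows linearly, x(t) = x0 + du*t (unbounded).
_, x_dist = simulate_integrator(x0=1.0, u=lambda t: 0.1)
print("state after 10 s with du = 0.1:", x_dist[-1])            # about 2.0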

Analysis of stability for dynamic systems. The study of stability properties is of fundamental importance for the analysis of dynamic systems and the design of control laws. In the following, we discuss two methods applicable to both linear and non-linear systems expressed in state form ẋ(t) = f(x, u, t), x(t₀) = x₀: 1) first Lyapunov method (local linearization); 2) second Lyapunov method (direct method). Aleksandr Mikhailovich Lyapunov (1857-1918) was a Russian mathematician, mechanician and physicist. Lyapunov is known for his development of the stability theory of dynamical systems, as well as for his many contributions to mathematical physics and probability theory.

First Lyapunov method (local linearization). Let us consider the autonomous system ẋ(t) = f(x(t)) (2), where x = 0 is an equilibrium point. If it is possible to compute the derivatives of the functions f(·), then in a proper neighbourhood of the origin it is possible to write, using the Taylor series expansion, ẋ(t) = f(x(t)) = f(0) + ∂f/∂x|_{x=0} x(t) + f_r(x(t)), where f_r(x(t)) collects the residuals of the Taylor series expansion, assumed negligible. Since f(0) = 0, by defining A = ∂f/∂x|_{x=0}, the homogeneous linear system ẋ(t) = A x(t) (3) represents the local linearization, or local approximation, of the non-linear system (2).

First Lyapunov method (local linearization). Theorem (Lyapunov linearization method): 1) if the linearized system (3) is strictly stable (all the eigenvalues of A have negative real part), the equilibrium point is asymptotically stable for system (2); 2) if the linearized system (3) is unstable (at least one eigenvalue of A has positive real part), the equilibrium point is unstable for system (2); 3) if A has eigenvalues with null real part, the linearization does not give information on the stability of the considered equilibrium point.
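A quick numerical illustration of this test (the vector field f below and the finite-difference Jacobian are illustrative choices, not the example of the slides):

import numpy as np

# Illustrative nonlinear vector field with an equilibrium at the origin.
def f(x):
    return np.array([-x[0] + x[1]**2,
                     -np.sin(x[1])])

# Finite-difference Jacobian A = df/dx evaluated at x = 0.
def jacobian(f, x0, eps=1e-6):
    n = len(x0)
    A = np.zeros((n, n))
    fx = f(x0)
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx) - fx) / eps
    return A

A = jacobian(f, np.zeros(2))
eigs = np.linalg.eigvals(A)
print("eigenvalues:", eigs)
if np.all(eigs.real < 0):
    print("linearization strictly stable: the origin is (locally) asymptotically stable")
elif np.any(eigs.real > 0):
    print("linearization unstable: the origin is unstable")
else:
    print("eigenvalues on the imaginary axis: the first method is inconclusive")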

First Lyapunov method (local linearization). Example 1. The Jacobian matrix ∂f/∂x, with entries ∂f_i/∂x_j involving sine and cosine terms of the state, is evaluated at the equilibrium x = 0: all its eigenvalues have negative real part, so the equilibrium is asymptotically stable.

First Lyapunov method (local linearization). Example 1. Figure: time evolution of the original system (solid) and of the linearized system (dashed).

First Lyapunov method. Example: the pendulum, subject to the gravity force, viscous friction and an input torque u(t). If u(t) = 0, the system has the equilibrium states x_e,1 = [0, 0]ᵀ and x_e,2 = [π, 0]ᵀ.

First Lyapunov method. Example: the pendulum. The Jacobian is ∂f/∂x = [0, 1; −(g/L) cos x₁, −d/(m L²)]. Then: at x_e,1 = [0, 0]ᵀ both eigenvalues have negative real part, so the equilibrium is STABLE (asymptotically); at x_e,2 = [π, 0]ᵀ an eigenvalue is positive, so the equilibrium is UNSTABLE.

First Lyapunov method. Example: the pendulum. With the same Jacobian ∂f/∂x = [0, 1; −(g/L) cos x₁, −d/(m L²)] and d = 0: at x_e,1 = [0, 0]ᵀ the eigenvalues are purely imaginary, so the linearization gives no information; at x_e,2 = [π, 0]ᵀ an eigenvalue is positive, so the equilibrium is UNSTABLE.
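A small numerical check of these cases; the sketch assumes the pendulum model used above, ẋ₁ = x₂, ẋ₂ = −(g/L) sin x₁ − d/(m L²) x₂, with illustrative parameter values:

import numpy as np

g, L, m = 9.81, 1.0, 1.0   # illustrative parameters

def pendulum_jacobian(x1, d):
    # Linearization of x1' = x2, x2' = -(g/L) sin x1 - d/(m L^2) x2 at (x1, 0)
    return np.array([[0.0, 1.0],
                     [-(g / L) * np.cos(x1), -d / (m * L**2)]])

for d in (0.5, 0.0):
    for x1, name in ((0.0, "x_e1 = [0,0]"), (np.pi, "x_e2 = [pi,0]")):
        eigs = np.linalg.eigvals(pendulum_jacobian(x1, d))
        print(f"d={d}, {name}: eigenvalues {np.round(eigs, 3)}")

# d>0, x_e1: both real parts negative  -> asymptotically stable
# d>0, x_e2: one positive eigenvalue   -> unstable
# d=0, x_e1: purely imaginary          -> first method inconclusive
# d=0, x_e2: one positive eigenvalue   -> unstable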

Lyapunov method. Definitions. Positive definite functions: a function V(x): Rⁿ → R is positive definite in ϑ(0,ρ), a spherical neighbourhood of the origin of the state space, if: 1. V(0) = 0; 2. V(x) > 0 for all x ≠ 0 in ϑ(0,ρ). Lyapunov functions: a function V(x): Rⁿ → R is a Lyapunov function in ϑ(0,ρ) for system (2) if: 1. V(x) is positive definite in ϑ(0,ρ); 2. V(x) has continuous first-order partial derivatives with respect to x; 3. dV(x)/dt ≤ 0 along the trajectories of the system, for all x ∈ ϑ(0,ρ).
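These three conditions can be checked symbolically for a concrete candidate; the system and the function V in the sketch below are illustrative choices, not taken from the slides:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# Illustrative system x1' = x2, x2' = -x1 - x2 and candidate V(x) = x1^2 + x2^2
f = sp.Matrix([x2, -x1 - x2])
V = x1**2 + x2**2

# Conditions 1-2: V(0) = 0 and V(x) > 0 for x != 0 (sum of squares)
print("V(0) =", V.subs({x1: 0, x2: 0}))

# Condition 3: dV/dt = grad(V) . f along the trajectories of the system
Vdot = sp.simplify(sp.Matrix([V]).jacobian([x1, x2]) * f)[0]
print("dV/dt =", Vdot)   # -2*x2**2 <= 0, so V is a Lyapunov function for this system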

Lyapunov method. Figure: surface plots of positive definite functions.

Lyapunov method. Figure: comparison between an indefinite function and a positive definite function (surface plots and level curves).

Lyapunov method. Lyapunov theorems. 1. (simple stability): the zero state is stable in the sense of Lyapunov if there exists a Lyapunov function V in a neighbourhood of the origin ϑ(0,ρ). 2. (asymptotic stability): the zero state is asymptotically stable (AS) in the sense of Lyapunov if there exists a Lyapunov function V in a neighbourhood of the origin ϑ(0,ρ) such that dV(x)/dt < 0 for all x ∈ ϑ(0,ρ), x ≠ 0. 3. (global asymptotic stability): system (2) is globally asymptotically stable (GAS) in x = 0 if there exists a Lyapunov function V defined in the whole state space such that dV(x)/dt < 0 for all x ≠ 0 and V(x) → ∞ as ‖x‖ → ∞ (radially unbounded Lyapunov function).

Lyapunov method. 4. (asymptotic stability domain): let V(x) be a Lyapunov function and h a positive real number such that the (open) set D = {x : V(x) < h} is bounded, and let dV(x)/dt < 0 for all x ∈ D, x ≠ 0. Then all trajectories starting from a point in D converge asymptotically to zero. Figure: time evolution of the Lyapunov function V(x(t)), and time evolution of the state of an autonomous system starting from x(0).

Lyapunov method. NOTE: the function dV(x)/dt = ∇V(x)·f(x) describes the variation in time of the function V(x) when it is computed along a solution x(t) of the differential equation. NOTE: the previous theorems can also be applied to non-autonomous systems ẋ = f(x, t). This can be done by defining a Lyapunov function V(x, t), explicitly dependent on time, and by properly modifying the theorems in order to take the term ∂V/∂t into account.

Lyapunov method. Chetaev theorem (instability theorem). If there exists a positive definite function V(x) whose derivative dV(x)/dt is also positive definite in an arbitrarily small neighbourhood ϑ(0, ρ) of the origin, that is V(x) > 0 and dV(x)/dt > 0 for all x ∈ ϑ(0, ρ), x ≠ 0, then the zero state x = 0 of the system is unstable.

Lyapunov method. Chetaev theorem (instability theorem, another formulation). Consider an autonomous dynamical system and assume that x = 0 is an equilibrium point. Let V : D → R have the following properties: (i) V(0) = 0; (ii) there exists x₀ ∈ Rⁿ, arbitrarily close to x = 0, such that V(x₀) > 0; (iii) dV/dt > 0 for all x ∈ U, where the set U is defined as U = {x ∈ D : ‖x‖ ≤ ρ and V(x) > 0}. Under these conditions, x = 0 is unstable.

Lyapunov method. Example: mass with spring and damper, m ẍ = −k x − b ẋ. The function V(x), the total (kinetic plus elastic) energy, decreases since a dissipative element (b ≠ 0) is present; otherwise V(x) would be constant and the energy would not be dissipated. The system evolves until the velocity reaches zero, and then stops (in a stable equilibrium configuration). In this configuration, from the differential equation of the system we obtain 0 = −k x, i.e. x = 0. Therefore the system indeed stops in the zero state x = 0.
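A simulation sketch of this energy argument (the parameter values m = 5, k = 5, b = 1 mirror the figures that follow; the integration scheme is an illustrative choice):

import numpy as np

m, k, b = 5.0, 5.0, 1.0
dt, T = 0.001, 20.0

x, v = 1.0, 0.0               # initial state x0 = [1, 0]
energies = []
for _ in range(int(T / dt)):
    a = (-k * x - b * v) / m  # m x'' = -k x - b x'
    v += dt * a               # semi-implicit Euler step
    x += dt * v
    energies.append(0.5 * m * v**2 + 0.5 * k * x**2)  # V = kinetic + elastic energy

print("V(0+) = %.3f, V(T) = %.4f" % (energies[0], energies[-1]))
# V decreases along the trajectory (dV/dt = -b*v^2 <= 0): the state converges toward x = 0.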

Lyapunov method. Figure: simulation with x₀ = [1, 0]ᵀ, m = 5, k = 5, b = 1; time evolution of the state and plots of V(x) and dV(x)/dt.

Lyapunov method. Figure: simulation with x₀ = [1, 0]ᵀ, m = 5, k = 5 and a different damping coefficient b; time evolution of the state and plots of V(x) and dV(x)/dt.

Lyapunov method. The pendulum, subject to the gravity force, viscous friction and an input torque u(t). By considering u(t) = 0, the system has the equilibrium states x_e,1 = [0, 0]ᵀ and x_e,2 = [π, 0]ᵀ.

Lyapunov method. The pendulum, subject to the gravity force, viscous friction and an input torque u(t). Let us define the function V(x) as the sum of the potential and kinetic energy, V(x) = m g L (1 − cos x₁) + (1/2) m L² x₂². Clearly, V(x) is positive definite in a neighbourhood of the origin.

Lyapunov method. The pendulum. Figure: surface plot of V(x); x₁: angular position θ(t), x₂: angular velocity ω(t).

Lyapunov method. The pendulum. Figure: phase-plane plot; x₁: angular position θ(t), x₂: angular velocity ω(t).

Lyapunov method. The pendulum. The derivative of V(x) along the trajectories is dV(x)/dt = −d x₂². If d = 0: the system is conservative; V(x) = c, a constant depending on the initial condition; V(x) = c is a closed curve about x = 0; if c is sufficiently small, then x(t) is constrained to remain close to zero, and x = 0 is then a stable equilibrium. If d ≠ 0: the system dissipates energy; V(x) decreases and dV(x)/dt ≤ 0; V(x) decreases as long as the system is in motion (velocity non-zero); x tends to 0, and x = 0 is then an asymptotically stable equilibrium.
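This derivative can be verified symbolically; the sketch below uses the pendulum model and the energy function assumed above, with symbolic g, L, m, d:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
g, L, m, d = sp.symbols('g L m d', positive=True)

# Pendulum model assumed above: x1' = x2, x2' = -(g/L) sin x1 - d/(m L^2) x2
f = sp.Matrix([x2, -(g / L) * sp.sin(x1) - d / (m * L**2) * x2])

# Energy function V = potential + kinetic
V = m * g * L * (1 - sp.cos(x1)) + sp.Rational(1, 2) * m * L**2 * x2**2

Vdot = sp.simplify((sp.Matrix([V]).jacobian([x1, x2]) * f)[0])
print(Vdot)   # -d*x2**2: zero when d = 0 (conservative), negative when d > 0 and x2 != 0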

Applications of the Lyapunov method. First order systems. Let us consider the following first order system, with f(x) continuous: ẋ + f(x) = 0 (1), together with the condition x f(x) > 0 for x ≠ 0 (2). Let us consider the Lyapunov function V = x²; then dV/dt = 2 x ẋ = −2 x f(x) < 0 for x ≠ 0. The origin x = 0 is a globally asymptotically stable (GAS) point; in fact V is also unbounded for x → ∞ (radially unbounded).

Applications of the Lyapunov method. First order systems, example. Given the system ẋ + x − sin x = 0, the function f(x) = x − sin x verifies condition (2), since x f(x) = x² − x sin x > 0 for all x ≠ 0. Figure: plot of f(x).
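A quick simulation sketch of this example (the initial condition x₀ = 3 matches the plots that follow; the step size is an illustrative choice):

import numpy as np

# First-order system x' = -f(x) with f(x) = x - sin(x); x f(x) > 0 for x != 0,
# so V = x^2 decreases along the trajectories and the origin is GAS.
f = lambda x: x - np.sin(x)

dt, T, x = 0.01, 15.0, 3.0   # initial condition x0 = 3
for _ in range(int(T / dt)):
    x -= dt * f(x)           # forward Euler step of x' = -f(x)

print("x(T) =", x)           # decreasing toward 0 (slowly near the origin, since f(x) ~ x^3/6)
print("V(T) = x^2 =", x**2)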

Applications of the Lyapunov method. First order systems, example. Figure: position and velocity ẋ versus time, phase plane, and plot of V(x) = x² for the initial condition x₀ = 3.

Applications of the Lyapunov method. Second order systems. Let us consider the following second order system, where f and g are continuous functions: ẍ + f(ẋ) + g(x) = 0 (3), with the conditions ẋ f(ẋ) > 0 for ẋ ≠ 0 and x g(x) > 0 for x ≠ 0 (4).

Applications of the Lyapunov method. Second order systems. Consider the Lyapunov function V(x, ẋ) = (1/2) ẋ² + ∫₀ˣ g(σ) dσ. Then dV/dt = ẋ ẍ + g(x) ẋ = −ẋ f(ẋ), which is negative for ẋ ≠ 0 and zero for ẋ = 0. If the integral function is unbounded for x → ∞, V is radially unbounded and the origin is GAS.

Applications of the Lyapunov method. Second order systems, example. Equations such as (3), with conditions such as (4), are common when mechanical or electrical systems with elements able to accumulate energy (potential or kinetic) are considered (mass-spring-damper or RLC circuits). Let us consider the circuit shown in the figure: with x(t) = i(t), its equation is of type (3) and verifies (4). The integral term is radially unbounded, so the system is GAS. The Lyapunov function can be interpreted as the sum of the kinetic and potential energy; the fact that it decreases is related to the dissipativity of the system.
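A simulation sketch for a system of type (3); the specific nonlinear choices of f and g below are illustrative, they satisfy conditions (4) but are not the circuit of the slides:

import numpy as np

# x'' + f(x') + g(x) = 0 with f(v) = v + v^3 (v f(v) > 0 for v != 0)
# and g(x) = x^3 (x g(x) > 0 for x != 0)
f = lambda v: v + v**3
g = lambda x: x**3

# Lyapunov function V = v^2/2 + integral_0^x g(s) ds = v^2/2 + x^4/4 (radially unbounded)
V = lambda x, v: 0.5 * v**2 + 0.25 * x**4

dt, T = 0.001, 30.0
x, v = 2.0, 0.0
trace = []
for _ in range(int(T / dt)):
    v += dt * (-f(v) - g(x))   # v' = -f(v) - g(x)
    x += dt * v                # x' = v
    trace.append(V(x, v))

print("V(0+) = %.3f, V(T) = %.5f" % (trace[0], trace[-1]))
# V decreases along the trajectory, consistent with dV/dt = -v f(v) <= 0: the origin is GAS.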

Applications of the Lyapunov method. Second order systems - Example.

Limit cycles and invariant sets. Non-linear systems may have: 1. equilibrium points, stable or unstable; 2. limit cycles, stable or unstable (closed trajectories). Definition (limit sets): a positive limit set S⁺ is a set in the state space that attracts sufficiently close trajectories as t → ∞; a negative limit set S⁻ is a set in the state space that attracts sufficiently close trajectories as τ → ∞, where τ = −t. An invariant set J is a set defined in the state space with the property that, for each initial state belonging to J, the whole trajectory belongs to J for both positive and negative values of t. Figure: example of a closed trajectory in the phase plane.

Limit cycles and invariant sets. Let us consider a non-linear autonomous system ẋ = f(x). Assume that: 1) a positive definite function V₁ exists such that in the bounded domain D₁ = {x : V₁(x) < h₁} the condition dV₁(x)/dt > 0 holds; 2) a positive definite function V₂ exists such that in the domain D₂ = {x : h₂ < V₂(x) < h₃} the condition dV₂(x)/dt < 0 holds. D₂ can be unbounded: in this case the bound h₃ is not considered. It is possible to show that the closed domain between D₁ and D₂ contains a positive limit set.

Limit cycles. Figure: the domains D₁ = {x : V₁(x) < h₁} and D₂ = {x : h₂ < V₂(x) < h₃} in the state plane (x₁, x₂), the functions V₁ and V₂ with the levels h₁, h₂, h₃, and the closed domain D₃ between D₁ and D₂ that contains a positive limit set.

Limit cycles. Let us consider a system whose trajectories are conveniently described through the radius r = (x₁² + x₂²)^(1/2). It is simple to verify that a limit cycle exists for r = 1 and that this limit cycle is globally asymptotically stable for the system, that is, for any initial condition the trajectories tend to this limit cycle. Figure: plot of V(z) and trajectories in the (x₁, x₂) plane; the limit cycle r = 1 is asymptotically stable. N.B.: r > 0 in any case.
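Since the system equations are not reproduced here, the sketch below uses a standard system with the same qualitative behaviour, a globally attractive limit cycle at r = 1 (an illustrative choice, not necessarily the exact system of the slides):

import numpy as np

# x1' = x1 (1 - r^2) - x2,   x2' = x2 (1 - r^2) + x1,   with r^2 = x1^2 + x2^2
def step(x, dt):
    r2 = x[0]**2 + x[1]**2
    dx = np.array([x[0] * (1 - r2) - x[1],
                   x[1] * (1 - r2) + x[0]])
    return x + dt * dx

for x0 in (np.array([0.1, 0.0]), np.array([3.0, 0.0])):
    x = x0.copy()
    for _ in range(20000):          # 20 s of simulation with dt = 0.001
        x = step(x, 0.001)
    print("from", x0, "final radius =", np.hypot(*x))   # both converge to r close to 1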

Limit cycles. Let us consider the system ẋ₁ = x₂, ẋ₂ = μ (1 − x₁²) x₂ − x₁, ẋ₃ = λ x₃: a Van der Pol oscillator with an additional decoupled state (λ < 0).

Limit cycles. Evolution of a linear oscillatory system: in the state space the system has a closed (periodic) evolution, with non-zero initial conditions, as shown in the figure. N.B.: this system is linear, so this closed trajectory is NOT a limit cycle! Figure: time evolution of the state variables and closed trajectory in the phase plane.

Lyapunov method. Non-stationary linear systems. Let us consider the non-stationary homogeneous system ẋ(t) = A(t) x(t) (1). As is known, this system is stable in the sense of Lyapunov if, for all t₀ and for all ε > 0, a parameter η > 0 exists such that ‖x(t₀)‖ < η implies ‖x(t)‖ < ε for all t ≥ t₀. The system is asymptotically stable in the sense of Lyapunov if it is stable and x(t) → 0 as t → ∞.

Lyapunov method. Non-stationary linear systems. The following theorem relates the stability properties of system (1) to the state transition matrix Φ(t, t₀). Theorem: system (1) is stable in the sense of Lyapunov if and only if for all t₀ a positive real number M exists such that ‖Φ(t, t₀)‖ ≤ M for all t ≥ t₀. System (1) is asymptotically stable in the sense of Lyapunov if it is stable and if ‖Φ(t, t₀)‖ → 0 as t → ∞.

Lyapunov method. Non-stationary linear systems. Let us consider the non-stationary linear system ẋ(t) = A(t) x(t) + B(t) u(t), y(t) = C(t) x(t) (3). The system is B.I.B.S. stable if for all t₀ and for all ε > 0 a parameter η > 0 exists such that if ‖u(t)‖ < η, t ≥ t₀, then ‖x(t)‖ < ε for all t ≥ t₀. Theorem: system (3) is B.I.B.S. stable if and only if a positive real number M exists such that ∫ from t₀ to t of ‖Φ(t, τ) B(τ)‖ dτ ≤ M for all t ≥ t₀.

Lyapunov method. Non-stationary linear systems. Similarly, system (3) is B.I.B.O. stable if for all t₀ and for all ε > 0 a parameter η > 0 exists such that if ‖u(t)‖ < η, t ≥ t₀, then ‖y(t)‖ < ε for all t ≥ t₀. Theorem: system (3) is B.I.B.O. stable if and only if a positive real number M exists such that ∫ from t₀ to t of ‖C(t) Φ(t, τ) B(τ)‖ dτ ≤ M for all t ≥ t₀.

Lyapunov method. Continuous-time, time-invariant linear systems. Let us consider the linear time-invariant system ẋ(t) = A x(t) (1). By considering V(x) = xᵀ P x, with P symmetric positive definite, we obtain dV(x)/dt = xᵀ (Aᵀ P + P A) x = −xᵀ M x, where Aᵀ P + P A = −M (2). If the matrix M is positive definite, system (1) is globally asymptotically stable. The following theorem holds. Theorem: the Lyapunov matrix equation (2) admits a unique solution P (a symmetric positive definite matrix) for each symmetric positive definite matrix M if and only if all the eigenvalues of A have negative real part.
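A numerical sketch of this test using SciPy's continuous-time Lyapunov solver (the matrix A below is an illustrative stable choice):

import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])        # illustrative matrix, eigenvalues -1 and -2
M = np.eye(2)                       # any symmetric positive definite M

# Solve A^T P + P A = -M (scipy solves a X + X a^H = q, so pass a = A^T, q = -M)
P = solve_continuous_lyapunov(A.T, -M)

print("P =\n", P)
print("P positive definite:", np.all(np.linalg.eigvalsh(P) > 0))   # True: A is Hurwitz
print("residual:", np.linalg.norm(A.T @ P + P @ A + M))            # close to 0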

Lyapunov method. Discrete-time, time-invariant linear systems. Let us consider the linear time-invariant system x(k+1) = A x(k) (3). By considering V(x) = xᵀ P x, with P symmetric positive definite, we obtain ΔV(x) = V(x(k+1)) − V(x(k)) = xᵀ (Aᵀ P A − P) x = −xᵀ M x, where Aᵀ P A − P = −M (4). If the matrix M is positive definite, system (3) is globally asymptotically stable. The following theorem holds. Theorem: the Lyapunov matrix equation (4) admits a unique solution P (a symmetric positive definite matrix) for each symmetric positive definite matrix M if and only if all the eigenvalues of A have magnitude less than 1.
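The discrete-time counterpart, again with an illustrative Schur-stable matrix A:

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.5, 0.2],
              [0.0, 0.8]])          # illustrative matrix, eigenvalues 0.5 and 0.8
M = np.eye(2)

# Solve A^T P A - P = -M (scipy solves a X a^H - X + q = 0, so pass a = A^T, q = M)
P = solve_discrete_lyapunov(A.T, M)

print("P positive definite:", np.all(np.linalg.eigvalsh(P) > 0))   # True: |eig(A)| < 1
print("residual:", np.linalg.norm(A.T @ P @ A - P + M))            # close to 0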

AUTOMATIC CONTROL AND SYSTEM THEORY. STABILITY ANALYSIS OF DYNAMIC SYSTEMS. THE END