Complex Dynamic Systems: Qualitative vs Quantitative Analysis
Chiara Mocenni, Department of Information Engineering and Mathematics, University of Siena (mocenni@diism.unisi.it)
Dynamic Systems and ODEs

Autonomous dynamic systems in continuous time are described by an ordinary differential equation (ODE):

    ẋ = f(x),  x ∈ R^n,   (1)

where ẋ = dx/dt is the time derivative of the variable x and f : U → R^n, with U ⊆ R^n, is a differentiable function. A solution of (1) is a function x(x₀, t) with values in R^n which depends on t and on the initial condition x(0) = x₀.
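As a concrete illustration (not part of the original slides), the solution x(x₀, t) can be approximated numerically. A minimal forward-Euler sketch, assuming NumPy is available and using the test equation ẋ = −x, whose exact solution is x₀ e^{−t}:

```python
import numpy as np

def euler(f, x0, t_end, dt=1e-3):
    """Approximate the solution x(x0, t_end) of the autonomous ODE ẋ = f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(round(t_end / dt)):
        x = x + dt * f(x)   # one explicit Euler step
    return x

# ẋ = -x has the exact solution x(t) = x0 · e^{-t}
x1 = euler(lambda x: -x, 1.0, 1.0)
```

The function name and step size are illustrative choices; any standard integrator would do.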
The flow defined by the ODE

The right-hand side of (1) defines a flow φ_t : U → R^n, U ⊆ R^n, where φ_t(x) = φ(x, t) is a smooth function satisfying equation (1) for all x ∈ U and for all t ∈ I ⊆ R. Satisfying (1) means that

    d/dt φ(x, t) |_{t=τ} = f(φ(x, τ))

for all x ∈ U and τ ∈ I.
Systems of equations

A system of ordinary differential equations is defined as:

    ẋ₁ = f₁(x₁, ..., x_n)
    ẋ₂ = f₂(x₁, ..., x_n)
    ...
    ẋ_n = f_n(x₁, ..., x_n)   (2)

A solution of this system is a vector-valued function x(t) : I → R^n starting from the initial condition x(0) and satisfying equations (2). R^n is called the phase space. The elements of the phase space are the state variables.
Existence of solutions

Given the initial condition x(0), if the function f(x) is differentiable with continuous derivative, then the ODE system ẋ = f(x) has a solution, and the solution is unique. This fact has important implications:

- A trajectory can't cross itself in the phase space.
- If a trajectory returns to a point it has already visited, by uniqueness it must retrace the same path forever: the trajectory is a closed orbit and the motion is periodic.
The vector field

At each point of the phase space the system defines a vector, whose components represent the velocity of each state variable. The ensemble of all these vectors is a vector field. An example in the 1-dimensional case:
Some examples in 1D

    ẋ = x² - 1

Other examples:

    ẋ = x - cos(x)
    ẋ = r x (1 - x/k)
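In 1D the steady states are the zeros of f, and their stability follows from the sign of f'(x*): negative slope means stable, positive means unstable. A small numerical check for the first example above (a sketch, not from the slides; the function name is my own):

```python
def stability_1d(f, x_star, h=1e-6):
    """Classify a steady state x* of ẋ = f(x) by the sign of f'(x*),
    estimated with a central finite difference."""
    d = (f(x_star + h) - f(x_star - h)) / (2 * h)
    if d < 0:
        return "stable"
    if d > 0:
        return "unstable"
    return "indifferent"

f = lambda x: x**2 - 1          # steady states at x = -1 and x = +1
left = stability_1d(f, -1.0)    # f'(-1) = -2 < 0
right = stability_1d(f, 1.0)    # f'(+1) = +2 > 0
```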
Periodic functions f (x) in 1D ẋ = sin(x)
An example in 2D: the harmonic oscillator

    m ẍ + k x = 0

rewritten as a system (with ω² = k/m):

    ẋ = v
    v̇ = -(k/m) x = -ω² x

At each point of the Cartesian plane (x, v), the equation associates the vector (ẋ, v̇) = (v, -ω² x).
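The closed orbits of the oscillator can be checked numerically: after one full period T = 2π/ω the state returns to its starting point. A Runge–Kutta sketch (assuming NumPy; the value ω = 2 is an illustrative assumption, not from the slides):

```python
import numpy as np

def rk4_step(f, s, dt):
    """One classical 4th-order Runge-Kutta step for ṡ = f(s)."""
    k1 = f(s)
    k2 = f(s + dt / 2 * k1)
    k3 = f(s + dt / 2 * k2)
    k4 = f(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

omega = 2.0                                         # illustrative value
f = lambda s: np.array([s[1], -omega**2 * s[0]])    # (ẋ, v̇) = (v, -ω² x)

s = np.array([1.0, 0.0])
dt = 1e-3
for _ in range(round(2 * np.pi / omega / dt)):      # integrate one period
    s = rk4_step(f, s, dt)
# s should be close to the initial state (1, 0) again
```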
The phase space of the pendulum

    ẋ₁ = x₂
    ẋ₂ = -(g/L) sin(x₁)

What about a pendulum with friction?

    ẋ₁ = x₂
    ẋ₂ = -(g/L) sin(x₁) - (k/m) x₂
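With friction the origin becomes attracting, which can be verified by integrating forward in time; a sketch with illustrative parameter values (g/L = 9.81 and k/m = 0.5 are assumptions, not from the slides):

```python
import numpy as np

g_over_L, k_over_m = 9.81, 0.5     # illustrative parameter values

def f(s):
    """Right-hand side of the damped pendulum system."""
    x1, x2 = s
    return np.array([x2, -g_over_L * np.sin(x1) - k_over_m * x2])

s = np.array([1.0, 0.0])           # released from angle 1 rad, at rest
dt = 1e-3
for _ in range(round(60.0 / dt)):  # integrate 60 s with explicit Euler
    s = s + dt * f(s)
# the damped pendulum should have settled near the steady state (0, 0)
```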
The steady states

A steady state of ẋ = f(x) is a point x* such that

    f(x*) = 0

This means that at a steady state the system is motionless. Steady states can be stable, unstable or indifferent.
Qualitative vs quantitative

Quantitative analysis deals with the behavior in time of single ODE solutions (solutions of the Cauchy problem). Qualitative analysis deals with:

- families of solutions in the phase space (trajectories)
- the geometry of the phase space
- the existence and stability of steady states
- the flow in the phase space

Some questions about periodic motion:

- What does periodic behavior mean in time and in the phase space?
- Is periodic motion possible in 1D systems?
- Is periodic motion possible in linear systems? If yes, what kind of periodic motion?
- In general, how can we study the geometry of the phase space?
Geometry of the phase space

In order to study the geometry of the phase space we need to:

- locate the steady states
- study the stability of the steady states
- know the flow near the steady states (asymptotic behavior)
- linearise nonlinear systems near the steady states
Classification of steady states

Steady states can be:

- nodes
- symmetric nodes (stars)
- degenerate nodes
- lines of steady states
- saddles
- spirals
- centers
Stability of steady states

A steady state x* is attracting if all trajectories starting near x* approach it as t → ∞. If x* attracts all trajectories, it is globally attracting. A steady state x* is Lyapunov stable if all trajectories that start sufficiently close to x* remain close to it for all time. A steady state can be Lyapunov stable but not attracting: such steady states are called neutrally stable (indifferent). Steady states that are both Lyapunov stable and attracting are called asymptotically stable (or stable).
Attracting and Lyapunov stable steady states
Two simple linear systems

Study the 1D system

    ẋ = a x

Study the 2D system

    ẋ = a x
    ẏ = -y
Solutions of 2D linear systems (1)

A linear system can be written as

    ẋ = A x

where A is a matrix. We seek solutions of the form

    x(t) = v e^{λt}

where v is an eigenvector and λ an eigenvalue of A. We can check:

    ẋ = λ v e^{λt} = A v e^{λt}

This implies the condition for λ and v to be an eigenvalue and eigenvector of A:

    A v = λ v
Solutions of 2D linear systems (2)

Provided that A has two distinct non-zero eigenvalues λ₁ and λ₂, with eigenvectors v₁ and v₂, the general solution is

    x(t) = α v₁ e^{λ₁t} + β v₂ e^{λ₂t}

We can check:

    ẋ = α λ₁ v₁ e^{λ₁t} + β λ₂ v₂ e^{λ₂t}

and

    A x = α (A v₁) e^{λ₁t} + β (A v₂) e^{λ₂t} = α λ₁ v₁ e^{λ₁t} + β λ₂ v₂ e^{λ₂t}

Since we assumed λ₁ and λ₂ distinct, v₁ and v₂ are independent and span R². Then, for any initial condition x₀ we can find the arbitrary constants α and β satisfying

    x₀ = α v₁ + β v₂

Solutions of 2D linear systems (3)

What about complex eigenvalues, λ₁,₂ = a ± ib? From Euler's formula we know that

    e^{λt} = e^{(a ± ib)t} = e^{at} e^{±ibt} = e^{at} (cos(bt) ± i sin(bt))

Hence, x(t) is a combination of terms involving e^{at} cos(bt) and e^{at} sin(bt) (no need to calculate the solutions; see a linear algebra text if interested). Such terms represent exponentially growing oscillations if a = Re(λ) > 0 and decaying oscillations if a = Re(λ) < 0. Zero eigenvalues and the degenerate case will be treated later.
An example

    ẋ = [ 1   1 ] x
        [ 4  -2 ]

The steady state is (0, 0). The eigenvalues are λ₁ = 2, λ₂ = -3, with eigenvectors v₁ = (1, 1), v₂ = (1, -4). Then

    x(t) = α (1, 1) e^{2t} + β (1, -4) e^{-3t}

α and β can be calculated by solving

    (2, -3) = α (1, 1) + β (1, -4)

We find α = 1 and β = 1. Then x(t) = e^{2t} + e^{-3t} and y(t) = e^{2t} - 4 e^{-3t}.
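The example can be verified numerically; a sketch assuming NumPy (not part of the slides):

```python
import numpy as np

A = np.array([[1.0,  1.0],
              [4.0, -2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # expect λ = 2 and λ = -3

# solution through x(0) = (2, -3): x(t) = (1, 1) e^{2t} + (1, -4) e^{-3t}
def sol(t):
    return np.exp(2 * t) * np.array([1.0, 1.0]) \
         + np.exp(-3 * t) * np.array([1.0, -4.0])
```

A finite-difference check confirms that sol(t) indeed satisfies ẋ = A x.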
Classification of the steady states of a 2D linear system

If λ₁ and λ₂ are the two eigenvalues of A, then the trace of the matrix A is τ = λ₁ + λ₂ and the determinant of A is Δ = λ₁ λ₂. Moreover, λ₁,₂ are the roots of the polynomial

    λ² - τλ + Δ = 0

Then we can classify all the steady states of a 2D linear system on the basis of τ and Δ.
The plane (Δ, τ)

[Figure: the (Δ, τ) plane. Saddles for Δ < 0; stable nodes and stable spirals for Δ > 0, τ < 0; unstable nodes and unstable spirals for Δ > 0, τ > 0 (nodes where τ² > 4Δ, spirals where τ² < 4Δ); centers on the half-axis τ = 0, Δ > 0; stars and degenerate nodes on the parabola τ² = 4Δ; lines and planes of steady states for Δ = 0.]
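This classification can be written down directly as a function of τ and Δ; a minimal sketch (the function name and tolerance are my own, not from the slides):

```python
def classify(tau, delta, eps=1e-12):
    """Classify the steady state of a 2D linear system from the
    trace τ and determinant Δ of its matrix A."""
    if abs(delta) < eps:
        return "non-isolated steady states"
    if delta < 0:
        return "saddle"
    if abs(tau) < eps:
        return "center"
    kind = "node" if tau**2 >= 4 * delta else "spiral"
    return ("stable " if tau < 0 else "unstable ") + kind

# the earlier worked example had τ = 2 + (-3) = -1 and Δ = 2 · (-3) = -6
kind = classify(-1.0, -6.0)
```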
Geometry of the phase space: the nullclines

The nullclines are the curves of the phase space where the velocity of one of the state variables vanishes. For example, in a 2D system the nullclines are the curves of the (x, y) plane such that ẋ = 0 and ẏ = 0, respectively.

- In a linear 2D system the nullclines are lines in the (x, y) phase space.
- The steady states lie at the intersections of all the nullclines.
- The nullclines are different from the eigenvectors (although sometimes they coincide).
Geometry of the phase space: the vector field

- The ODE defines a flow in the phase space.
- This flow is associated to a vector field.
- The vector field can be seen as the velocity field (ẋ₁, ẋ₂, ..., ẋ_n) (in 2D it is (ẋ, ẏ)).
- The eigenvectors give the invariant directions of the flow: along them the trajectories are straight lines.
- The nullclines show where the flow is purely vertical (ẋ = 0) or purely horizontal (ẏ = 0), and hence how the flow curves between these particular curves.
An example of node
An example of stable node

    A = [ -4  -1 ]
        [  2  -1 ]

    ẋ = -4x - y
    ẏ = 2x - y
An example of saddle

    A = [ 2  -1 ]
        [ 1  -3 ]

    ẋ = 2x - y
    ẏ = x - 3y
An example of stable spiral

    A = [ 2  -4 ]
        [ 2  -3 ]

    ẋ = 2x - 4y
    ẏ = 2x - 3y
An example of center

    A = [ -0.6  -2   ]
        [  1     0.6 ]

    ẋ = -0.6x - 2y
    ẏ = x + 0.6y
An example of marginally stable steady states

    A = [ 0   0 ]
        [ 0  -1 ]

    ẋ = 0
    ẏ = -y
Something about using equations for modeling!

Question: in which way are these equations useful? Let's try to understand a bit!

- Suppose that all parameters a, b, c and d are positive numbers.
- Concentrate on the right-hand side of the equations.
- The sign in front of the parameters a, b, c, d tells us whether the corresponding term contributes to increase or decrease the velocity of the variable on the left-hand side.
- Thus it says something about increasing or decreasing state variables.
- Let's use the classification of 2D linear systems.
An example of modeling

Let x be the amount of goods produced by a company A and y be the amount of goods consumed by B. Let's write the model equations:

    ẋ = ax - by
    ẏ = cx - dy

- The term ax indicates that the velocity of production may be proportional to the amount of goods produced at any time instant. a is never negative, but it may be null (the velocity of production is independent of the available goods).
- The term -by means that the consumers B reduce the amount of goods produced by A.
- The term cx indicates that the amount of goods possessed by B increases proportionally to the amount of goods produced by A.
- The term -dy means that the consumers B will lose some of the goods got from A through deterioration or saturation.
If a = 0 and d = 0 we have

    ẋ = -by
    ẏ = cx

What do you expect the dynamics will be?
An example of modeling

The general case:

    ẋ = ax - by
    ẏ = cx - dy

What do you expect the dynamics will be?
Why nonlinear?

In the linear case the amount of goods consumed depends only on the consumer (-by). But it should also depend on the amount of goods produced by A: consumers can't buy more than the available goods! Correspondingly, the amount of goods acquired by B will also depend on the same variables. For example:

    ẋ = ax - bxy
    ẏ = cxy - dy

What do you expect the dynamics will be? How many steady states does the system have?
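Setting both right-hand sides to zero answers the steady-state question: ẋ = x(a − by) and ẏ = y(cx − d) both vanish at (0, 0) and at the coexistence point (d/c, a/b). A quick check with illustrative parameter values (the numbers are assumptions, not from the slides):

```python
a, b, c, d = 1.0, 0.5, 0.5, 1.0           # illustrative positive parameters

f = lambda x, y: (a * x - b * x * y,      # ẋ = x (a - b y)
                  c * x * y - d * y)      # ẏ = y (c x - d)

steady_states = [(0.0, 0.0), (d / c, a / b)]
residuals = [f(x, y) for x, y in steady_states]   # should all be (0, 0)
```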
Nonlinear and saturation dynamics (1)

In the previous case the amount of goods produced by A can grow without bound (when no consumers are present, the state x(t) grows exponentially!). The amount produced may saturate nonlinearly. For example:

    ẋ = ax(1 - x) - bxy
    ẏ = cxy - dy

What do you expect the dynamics will be? How many steady states does the system have?
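With the logistic term the producer equation gains a carrying capacity, and a third steady state appears: besides (0, 0) and the consumer-free state (1, 0), there is a coexistence point x* = d/c, y* = a(1 − x*)/b when d/c < 1. Counting them for illustrative parameter values (the numbers are assumptions, not from the slides):

```python
a, b, c, d = 1.0, 0.5, 2.0, 1.0   # illustrative values chosen so that d/c < 1

f = lambda x, y: (a * x * (1 - x) - b * x * y,   # ẋ with logistic saturation
                  c * x * y - d * y)             # ẏ

# steady states: extinction (0, 0), consumer-free (1, 0),
# and coexistence x* = d/c, y* = a (1 - x*) / b
x_star = d / c
steady_states = [(0.0, 0.0), (1.0, 0.0), (x_star, a * (1 - x_star) / b)]
residuals = [f(x, y) for x, y in steady_states]   # should all be (0, 0)
```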
Nonlinear and saturation dynamics (2)

Consumer B may be subject to saturation because of the finite capacity of storage or lack of money. The same holds for the consumer equation. For example:

    ẋ = ax(1 - x) - bx · y/(e + y)
    ẏ = cx · y/(e + y) - dy

What do you expect the dynamics will be? How many steady states does the system have?