Control Theory in Physics and other Fields of Science


Michael Schulz: Control Theory in Physics and other Fields of Science. Concepts, Tools, and Applications. With 46 Figures. Springer.

1 Introduction
   1.1 The Aim of Control Theory
   1.2 Dynamic State of Classical Mechanical Systems
   1.3 Dynamic State of Complex Systems
      1.3.1 What Is a Complex System?
      1.3.2 Relevant and Irrelevant Degrees of Freedom
      1.3.3 Quasi-Deterministic Versus Quasi-Stochastic Evolution
   1.4 The Physical Approach to Control Theory
   References

2 Deterministic Control Theory
   2.1 Introduction: The Brachistochrone Problem
   2.2 The Deterministic Control Problem
      2.2.1 Functionals, Constraints, and Boundary Conditions
      2.2.2 Weak and Strong Minima
   2.3 The Simplest Control Problem: Classical Mechanics
      2.3.1 Euler-Lagrange Equations
      2.3.2 Optimum Criterion
      2.3.3 One-Dimensional Systems
   2.4 General Optimum Control Problem
      2.4.1 Lagrange Approach
      2.4.2 Hamilton Approach
      2.4.3 Pontryagin's Maximum Principle
      2.4.4 Applications of the Maximum Principle
      2.4.5 Controlled Molecular Dynamic Simulations
   2.5 The Hamilton-Jacobi Equation
   References

3 Linear Quadratic Problems
   3.1 Introduction to Linear Quadratic Problems
      3.1.1 Motivation
      3.1.2 The Performance Functional
      3.1.3 Stability Analysis
      3.1.4 The General Solution of Linear Quadratic Problems
   3.2 Extensions and Applications
      3.2.1 Modifications of the Performance
      3.2.2 Inhomogeneous Linear Evolution Equations
      3.2.3 Scalar Problems
   3.3 The Optimal Regulator
      3.3.1 Algebraic Riccati Equation
      3.3.2 Stability of Optimal Regulators
   3.4 Control of Linear Oscillations and Relaxations
      3.4.1 Integral Representation of State Dynamics
      3.4.2 Optimal Control of Generalized Linear Evolution Equations
      3.4.3 Perturbation Theory for Weakly Nonlinear Dynamics
   References

4 Control of Fields
   4.1 Field Equations
      4.1.1 Classical Field Theory
      4.1.2 Hydrodynamic Field Equations
      4.1.3 Other Field Equations
   4.2 Control by External Sources
      4.2.1 General Aspects
      4.2.2 Control Without Spatial Boundaries
      4.2.3 Passive Boundary Conditions
   4.3 Control via Boundary Conditions
   References

5 Chaos Control
   5.1 Characterization of Trajectories in the Phase Space
      5.1.1 General Problems
      5.1.2 Conservative Hamiltonian Systems
      5.1.3 Nonconservative Systems
   5.2 Time-Discrete Chaos Control
      5.2.1 Time Continuous Control Versus Time Discrete Control
      5.2.2 Chaotic Behavior of Time Discrete Systems
      5.2.3 Control of Time Discrete Equations
      5.2.4 Reachability and Stabilizability
      5.2.5 Observability
   5.3 Time-Continuous Chaos Control
      5.3.1 Delayed Feedback Control

      5.3.2 Synchronization
   References

6 Nonequilibrium Statistical Physics
   6.1 Statistical Approach to Phase Space Dynamics
      6.1.1 The Probability Distribution
   6.2 The Liouville Equation
   6.3 Generalized Rate Equations
      6.3.1 Probability Distribution of Relevant Quantities
      6.3.2 The Formal Solution of the Liouville Equation
      6.3.3 The Nakajima-Zwanzig Equation
   6.4 Notation of Probability Theory
      6.4.1 Measures of Central Tendency
      6.4.2 Measure of Fluctuations around the Central Tendency
      6.4.3 Moments and Characteristic Functions
      6.4.4 Cumulants
   6.5 Combined Probabilities
      6.5.1 Conditional Probability
      6.5.2 Joint Probability
   6.6 Markov Approximation
   6.7 Generalized Fokker-Planck Equation
      6.7.1 Differential Chapman-Kolmogorov Equation
      6.7.2 Deterministic Processes
      6.7.3 Markov Diffusion Processes
      6.7.4 Jump Processes
   6.8 Correlation and Stationarity
      6.8.1 Stationarity
      6.8.2 Correlation
      6.8.3 Spectra
   6.9 Stochastic Equations of Motion
      6.9.1 The Mori-Zwanzig Equation
      6.9.2 Separation of Time Scales
      6.9.3 Wiener Process
      6.9.4 Stochastic Differential Equations
      6.9.5 Itô's Formula and Fokker-Planck Equation
   References

7 Optimal Control of Stochastic Processes
   7.1 Markov Diffusion Processes under Control
      7.1.1 Information Level and Control Mechanisms
      7.1.2 Path Integrals
      7.1.3 Performance
   7.2 Optimal Open Loop Control
      7.2.1 Mean Performance
      7.2.2 Tree Approximation

   7.3 Feedback Control
      7.3.1 The Control Equation
      7.3.2 Linear Quadratic Problems
   References

8 Filters and Predictors
   8.1 Partial Uncertainty of Controlled Systems
   8.2 Gaussian Processes
      8.2.1 The Central Limit Theorem
      8.2.2 Convergence Problems
   8.3 Lévy Processes
      8.3.1 Form-Stable Limit Distributions
      8.3.2 Convergence to Stable Lévy Distributions
      8.3.3 Truncated Lévy Distributions
   8.4 Rare Events
      8.4.1 The Cramér Theorem
      8.4.2 Extreme Fluctuations
   8.5 Kalman Filter
      8.5.1 Linear Quadratic Problems with Gaussian Noise
      8.5.2 Estimation of the System State
      8.5.3 Lyapunov Differential Equation
      8.5.4 Optimal Control Problem for Kalman Filters
   8.6 Filters and Predictors
      8.6.1 General Filter Concepts
      8.6.2 Wiener Filters
      8.6.3 Estimation of the System Dynamics
      8.6.4 Regression and Autoregression
      8.6.5 The Bayesian Concept
      8.6.6 Neural Networks
   References

9 Game Theory
   9.1 Unpredictable Systems
   9.2 Optimal Control and Decision Theory
      9.2.1 Nondeterministic and Probabilistic Regime
      9.2.2 Strategies
   9.3 Zero-Sum Games
      9.3.1 Two-Player Games
      9.3.2 Deterministic Strategy
      9.3.3 Random Strategy
   9.4 Nonzero-Sum Games
      9.4.1 Nash Equilibrium
      9.4.2 Random Nash Equilibria
   References

10 Optimization Problems
   10.1 Notations of Optimization Theory
      10.1.1 Introduction
      10.1.2 Convex Objects
   10.2 Optimization Methods
      10.2.1 Extremal Solutions Without Constraints
      10.2.2 Extremal Solutions with Constraints
      10.2.3 Linear Programming
      10.2.4 Combinatorial Optimization Problems
      10.2.5 Evolution Strategies
   References