Sysid Course VT: An Overview


Outline 2(42)

Sysid Course VT: An Overview. Automatic Control, ISY, Linköpings Universitet.

An umbrella contribution for the material in the course:
- The classic, conventional System Identification setup
- The Identification Loop
- Models and Model Structures
- Identification Methods
- Model Validation

[xyz] means: details are given on page xyz in the textbook Ljung: System Identification - Theory for the User, 2nd Edition, Prentice-Hall.

An Introductory Example [2, 530] 3(42)

Data from Gripen 4(42)

Signals: pitch rate, canard angle, elevator angle, leading edge flap.
- How do the control surface angles affect the pitch rate?
- Aerodynamical derivatives?
- How to use the information in flight data?

Aircraft Dynamics: From Input 1 5(42)

y(t): pitch rate at time t. u1(t): canard angle at time t. T = 1/60.

Try  y(t) = -a_1 y(t-T) - a_2 y(t-2T) - a_3 y(t-3T) - a_4 y(t-4T)
            + b_1 u1(t-T) + b_2 u1(t-2T) + b_3 u1(t-3T) + b_4 u1(t-4T)

Using all inputs 6(42)

u1: canard angle; u2: elevator angle; u3: leading edge flap.

y(t) = -a_1 y(t-T) - a_2 y(t-2T) - a_3 y(t-3T) - a_4 y(t-4T)
       + b_1^1 u1(t-T) + ... + b_4^1 u1(t-4T)
       + b_1^2 u2(t-T) + ... + b_1^3 u3(t-T) + ... + b_4^3 u3(t-4T)

Dashed line: actual pitch rate. Solid line: 10-step-ahead predicted pitch rate, based on the fourth order model from canard angle only.

System Identification: Issues [14] 7(42)

- Select a class of candidate models
- Select a member in this class using the observed data
- Evaluate the quality of the obtained model
- Design the experiment so that the model will be good

System Identification: State-of-the-Art Setup 8(42)

A typical problem: given observed input-output data, find a description of the system that generated the data [simulator or predictor; linear system: impulse response or Bode plot].

Basic approach: find a suitable model structure, estimate its parameters, and compute the response of the resulting model.

Techniques: estimate the parameters by ML/PEM techniques (prediction error methods). Find the model structure by AIC, BIC or cross validation.
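The fourth-order difference-equation model above is easy to turn into code. Below is a minimal sketch in plain Python (the course material itself uses MATLAB); the coefficient and signal values are invented for illustration and are not Gripen aerodynamical derivatives:

```python
# Sketch (not the Gripen data): a fourth-order difference-equation predictor
# of the form on slide 5(42). The coefficients a_i, b_i below are made-up
# illustration values, not estimated aerodynamical derivatives.
a = [0.5, -0.2, 0.1, -0.05]   # hypothetical AR coefficients
b = [1.0, 0.4, -0.3, 0.1]     # hypothetical canard-angle coefficients

def predict(y_past, u_past):
    """One-step prediction of y(t) from y(t-T)..y(t-4T) and u1(t-T)..u1(t-4T)."""
    return (-sum(ai * yi for ai, yi in zip(a, y_past))
            + sum(bi * ui for bi, ui in zip(b, u_past)))

# y_past[0] = y(t-T), ..., u_past[0] = u1(t-T), ...
print(predict([0.1, 0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0]))
```

With real flight data the coefficients would instead be estimated from measured y and u1, as discussed on the following slides.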

The SI Flow [15] 9(42)

The experiment - the measured data - the model set - the identification method - the validation procedure. If the model fails validation: no - try a new model set, method or experiment, and loop again.

The SI Flow; Model Structures 10(42)

Models: General Aspects [79, 161] 11(42)

A model is a mathematical expression that describes the connections between measured inputs and outputs, and possibly related noise sequences. They can come in many different forms. The models are labeled with a parameter vector θ. A common framework is to describe the model as a predictor of the next output, based on observations of past input-output data.

Observed input-output (u, y) data up to time t: Z^t.
Model described by a predictor: M(θ): ŷ(t|θ) = g(t, θ, Z^{t-1}).

Linear Models [79] 12(42)

General description:
y(t) = G(q, θ)u(t) + H(q, θ)e(t),  q: shift operator, e: white noise
G(q, θ)u(t) = Σ_{k=1}^∞ g_k u(t-k);  H(q, θ)e(t) = e(t) + Σ_{k=1}^∞ h_k e(t-k)
y(t) = G(q, θ)u(t) + v(t);  spectrum Φ_v(ω) = λ |H(e^{iω})|²

Predictor: ŷ(t|θ) = G(q, θ)u(t) + [1 - H^{-1}(q, θ)][y(t) - G(q, θ)u(t)]
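As a concrete instance of the predictor formula ŷ(t|θ) = G u(t) + [1 - H^{-1}][y(t) - G u(t)]: for the first-order ARX case G = b q^{-1}/(1 + a q^{-1}), H = 1/(1 + a q^{-1}), the formula collapses to ŷ(t) = -a y(t-1) + b u(t-1). A small Python sketch with illustrative signal and parameter values:

```python
# One-step-ahead predictor for the first-order ARX special case of
#   yhat(t) = G u(t) + (1 - H^{-1})(y(t) - G u(t)),
# where it reduces to yhat(t) = -a*y(t-1) + b*u(t-1).
# The signals and (a, b) below are illustrative values, not course data.

def one_step_predictor(y, u, a, b):
    """Return yhat(t) = -a*y(t-1) + b*u(t-1) for t = 1..len(y)-1."""
    return [-a * y[t - 1] + b * u[t - 1] for t in range(1, len(y))]

y = [0.0, 1.0, 0.3, -0.2]
u = [1.0, 0.5, 0.0, 0.0]
print(one_step_predictor(y, u, a=0.7, b=2.0))  # predictions for t = 1, 2, 3
```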

Common Black-Box Parameterizations [81] 13(42)

BJ (Box-Jenkins):
G(q, θ) = B(q)/F(q);  H(q, θ) = C(q)/D(q)
B(q) = b_1 q^{-1} + b_2 q^{-2} + ... + b_{nb} q^{-nb}
F(q) = 1 + f_1 q^{-1} + ... + f_{nf} q^{-nf}
θ = [b_1, b_2, ..., f_{nf}]

ARX:
y(t) = (B(q)/A(q)) u(t) + (1/A(q)) e(t), or
A(q) y(t) = B(q) u(t) + e(t), or
y(t) + a_1 y(t-1) + ... + a_{na} y(t-na) = b_1 u(t-1) + ... + b_{nb} u(t-nb)

Special Feature of ARX Models 14(42)

The ARX model is a linear regression. The predictor for ARX can be written
ŷ(t|θ) = φ^T(t) θ
φ(t) = [-y(t-1) ... -y(t-na)  u(t-1) ... u(t-nb)]^T
θ = [a_1 ... a_{na}  b_1 ... b_{nb}]^T

Common Black and Grey Parameterizations [97] 15(42)

State-space with possibly physically parameterized matrices:
x(t+1) = A(θ) x(t) + B(θ) u(t) + K(θ) e(t)
y(t) = C(θ) x(t) + e(t)
Corresponds to G(q, θ) = C(θ)(qI - A(θ))^{-1} B(θ);  H(q, θ) = C(θ)(qI - A(θ))^{-1} K(θ) + I

Continuous Time (CT) Models [93] 16(42)

Physical model with unknown parameters:
ẋ(t) = F(θ) x(t) + G(θ) u(t) + w(t)
y(t) = C(θ) x(t) + D(θ) u(t) + v(t)
Sample it (with correct input intersample behaviour):
x(t+1) = A(θ) x(t) + B(θ) u(t) + K(θ) e(t)
y(t) = C(θ) x(t) + e(t)
Now apply the discrete time formalism to this model, which is parameterized in terms of the CT parameters θ.
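Because the ARX predictor is linear in θ, the parameters have a closed-form least-squares estimate. A self-contained Python sketch with synthetic data (illustrative true values a = 0.5, b = 1.5, na = nb = 1; the course tools would use the MATLAB arx command instead):

```python
import random

# Sketch: the ARX predictor yhat(t) = phi(t)^T theta is linear in theta,
# so theta can be estimated by ordinary least squares via the normal
# equations. Synthetic first-order data; a_true, b_true are illustrative.
random.seed(0)
a_true, b_true = 0.5, 1.5
u = [random.uniform(-1, 1) for _ in range(400)]
y = [0.0]
for t in range(1, 400):
    y.append(-a_true * y[t - 1] + b_true * u[t - 1] + random.gauss(0, 0.05))

# Regressors phi(t) = [-y(t-1), u(t-1)]^T; accumulate S = sum phi phi^T, r = sum phi*y
S = [[0.0, 0.0], [0.0, 0.0]]; r = [0.0, 0.0]
for t in range(1, 400):
    phi = [-y[t - 1], u[t - 1]]
    for i in range(2):
        r[i] += phi[i] * y[t]
        for j in range(2):
            S[i][j] += phi[i] * phi[j]

# Solve the 2x2 system S theta = r by Cramer's rule
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
a_hat = (S[1][1] * r[0] - S[0][1] * r[1]) / det
b_hat = (S[0][0] * r[1] - S[1][0] * r[0]) / det
print(a_hat, b_hat)  # close to the true 0.5 and 1.5
```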

Non-linear Models 17(42)

M(θ): ŷ(t|θ) = g(t, θ, Z^{t-1}),  g non-linear in Z. More in the NL lecture notes.

Physical grey-box models: perform physical modeling (e.g. in MODELICA) and denote unknown physical parameters by θ. Collect the model equations as
ẋ(t) = f(x(t), u(t), θ)
y(t) = h(x(t), u(t), θ)

NLARX models. Block-oriented models: Wiener, Hammerstein and similar. Local linear models, linear parameter-varying models: a linear model G(z, p) parameterized by a measured regime variable p.

The SI Flow: Estimation 18(42)

Criteria: 1. Pragmatically 19(42)

If a model ŷ(t|θ) essentially is a predictor of the next output, it is natural to evaluate its quality by assessing how well it predicts. Form the prediction error and measure its size:
ε(t, θ) = y(t) - ŷ(t|θ),  l(ε(t, θ))
(Note: for a linear system ε = H^{-1}(y - Gu).) Typically l(x) = x².

How has it performed historically? Which model in the structure performed best?
V_N(θ) = Σ_{t=1}^N l(ε(t, θ));  θ̂_N = arg min_θ V_N(θ)

Criteria: 2. Statistically 20(42)

Maximum Likelihood (ML): which model makes the recorded observations most likely? (pdf: probability density function)
max_θ p(y|θ),  p: the pdf of the outputs for a given parameter value
If the innovations e of the measured data have the pdf f(x), then
-log p(y|θ) = V_N(θ) = Σ_{t=1}^N l(ε(t, θ)),  with l(x) = -log f(x)
So what minimizes the pragmatic fit is also the ML estimate! Gaussian pdf: l(x) ∝ x².
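The prediction-error criterion can be illustrated with a crude grid search for a scalar model y(t) = θ u(t) + e(t) with Gaussian e, where minimizing V_N(θ) = Σ ε² is also the ML estimate. Synthetic data with an illustrative true value θ = 0.8:

```python
import random

# Sketch of the prediction-error criterion V_N(theta) = sum_t eps^2(t, theta),
# minimized by grid search. For Gaussian innovations this coincides with ML.
# Model: y(t) = theta*u(t) + e(t); theta_true = 0.8 is an illustrative value.
random.seed(1)
theta_true = 0.8
u = [random.uniform(-1, 1) for _ in range(200)]
y = [theta_true * ui + random.gauss(0, 0.1) for ui in u]

def V(theta):
    """Sum of squared prediction errors for the candidate parameter theta."""
    return sum((yt - theta * ut) ** 2 for yt, ut in zip(y, u))

grid = [k / 1000 for k in range(0, 2001)]   # theta in [0, 2], step 0.001
theta_hat = min(grid, key=V)
print(theta_hat)  # near the true value 0.8
```

With l(x) = x² this is exactly the "pragmatic" criterion; for a Gaussian innovation pdf, -log f(x) is x² up to constants, which is the ML equivalence stated on slide 20(42).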

Estimation: Variants 21(42)

More in the Special Issues lecture notes.

Regularization [221]: to curb the flexibility of the model structure and to provide better numerics in the minimization we can postulate a regularized criterion:
W_N(θ) = V_N(θ) + λ (θ - θ*)^T R (θ - θ*)

Bayesian view [213]: the parameter vector θ is a random vector with a certain prior pdf, and we seek the posterior pdf, given the observations. Suppose the prior is θ ∈ N(θ*, R^{-1}/λ).

Sub-space methods [208]. EM-techniques [224] (the EM algorithm, errors-in-variables problems).

Model Estimate Properties (θ̂_N) 22(42)

Model Estimate Properties [223] 23(42)

As the number of data, N, tends to infinity:
θ̂_N → θ* = arg min_θ E l(ε(t, θ)) - the best possible predictor in M.
If M contains a true description of the system:
Cov θ̂_N = (λ/N) [E ψ(t) ψ^T(t)]^{-1},  ψ(t) = (d/dθ) ŷ(t|θ),  λ: noise level
which is the Cramér-Rao lower bound for any (unbiased) estimator. E: expectation.

These are very nice optimal properties:
- If the model structure is large enough: the ML/PE estimated model is (asymptotically) the best possible one; it has the smallest possible variance (Cramér-Rao).
- If the model structure is not large enough: the ML/PE estimate converges to the best possible approximation of the system; the estimate has the smallest possible asymptotic bias.

Proof: Formal Calculations 24(42)

We illustrate the main ideas for a simple curve-fitting problem.
System: y(t) = g_0(x_t) + e(t), e white noise. Model: g(x_t, θ).

Convergence (LLN = Law of Large Numbers):
V_N(θ) = (1/N) Σ (g_0(x_t) + e(t) - g(x_t, θ))²
       = (1/N) Σ (g_0(x_t) - g(x_t, θ))² + (1/N) Σ e²(t) + (2/N) Σ (g_0(x_t) - g(x_t, θ)) e(t)
LLN: (1/N) Σ (g_0(x_t) - g(x_t, θ)) e(t) → 0, so
V_N(θ) → E(g_0(x_t) - g(x_t, θ))² + λ (uniformly in θ!), where
E(g_0(x_t) - g(x_t, θ))² = lim (1/N) Σ (g_0(x_t) - g(x_t, θ))² as N → ∞.
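The regularized criterion W_N(θ) has a closed-form minimizer in the scalar linear case (taking R = 1). A sketch with made-up numbers, showing how increasing λ pulls the estimate toward the prior mean θ*:

```python
# Sketch of the regularized criterion
#   W_N(theta) = V_N(theta) + lam * (theta - theta_star)^2   (scalar, R = 1)
# for the linear model y = theta*u + e. Setting dW/dtheta = 0 gives
#   theta_hat = (sum u*y + lam*theta_star) / (sum u^2 + lam).
# All numbers below are illustrative.

def regularized_estimate(y, u, lam, theta_star):
    """Minimizer of sum (y - theta*u)^2 + lam*(theta - theta_star)^2."""
    suy = sum(ui * yi for ui, yi in zip(u, y))
    suu = sum(ui * ui for ui in u)
    return (suy + lam * theta_star) / (suu + lam)

u = [1.0, 2.0, -1.0]
y = [1.1, 2.1, -0.8]
print(regularized_estimate(y, u, lam=0.0, theta_star=0.0))    # plain least squares
print(regularized_estimate(y, u, lam=100.0, theta_star=0.0))  # pulled toward the prior
```

This matches the Bayesian view above: with the prior θ ∈ N(θ*, R^{-1}/λ) and Gaussian noise, the regularized minimizer is the posterior (MAP) estimate.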

Proof: Formal Calculations, cont'd 25(42)

Asymptotic distribution (CLT = Central Limit Theorem):
0 = V_N'(θ̂_N) ≈ V_N'(θ*) + V_N''(θ*)(θ̂_N - θ*)
√N (θ̂_N - θ*) = -[V_N''(θ*)]^{-1} √N V_N'(θ*)
V_N'(θ) = -(2/N) Σ (y(t) - g(x_t, θ)) g'(x_t, θ)
V_N'(θ*) = -(2/N) Σ e(t) ψ(t);  ψ(t) = g'(x_t, θ*)
LLN: V_N''(θ*) = (2/N) Σ ψ(t) ψ^T(t) - (2/N) Σ e(t) g''(x_t, θ*) → 2 E ψψ^T
CLT: (1/√N) Σ e(t) ψ(t) ∈ N(0, λ E ψψ^T)
so √N (θ̂_N - θ*) ∈ N(0, λ [E ψψ^T]^{-1})

Asymptotics for Linear Models [263] 26(42)

[Φ_u, Φ_v: spectra of the input and of the additive noise v = He. G_0: true transfer function.]
θ̂_N → θ* = arg min_θ ∫_{-π}^{π} |G(e^{iω}, θ) - G_0(e^{iω})|² (Φ_u(ω) / |H(e^{iω}, θ)|²) dω
Cov G(e^{iω}, θ̂_N) ≈ (n/N) (Φ_v(ω) / Φ_u(ω)) as n, N → ∞;  n: model order

Experiment Design 27(42)

See Lecture Notes Experiment Design.

Model Validation 28(42)
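The asymptotic covariance result can be checked by Monte Carlo for the scalar linear regression y(t) = θ u(t) + e(t), where ψ(t) = u(t) and the formula predicts Var θ̂ ≈ σ²/Σ u²(t). A sketch with illustrative settings:

```python
import random

# Monte Carlo sketch of Cov(theta_hat) ~ (lambda/N) [E psi psi^T]^{-1}:
# for y(t) = theta*u(t) + e(t), psi(t) = u(t), so the predicted variance is
# sigma^2 / sum(u(t)^2). All settings below are illustrative.
random.seed(2)
N, sigma, theta_true = 100, 0.2, 1.0
u = [random.uniform(-1, 1) for _ in range(N)]
suu = sum(ui * ui for ui in u)

estimates = []
for _ in range(2000):                      # repeat the experiment, same input
    y = [theta_true * ui + random.gauss(0, sigma) for ui in u]
    estimates.append(sum(ui * yi for ui, yi in zip(u, y)) / suu)

mean = sum(estimates) / len(estimates)
sample_var = sum((e - mean) ** 2 for e in estimates) / len(estimates)
predicted_var = sigma ** 2 / suu
print(sample_var, predicted_var)  # the two should be close
```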

Model Validation Techniques 29(42)

Twist and turn the model(s) to check if they are good enough for the intended application. Essentially a subjective decision. Several basic techniques are available:
1. Simulation (prediction) cross validation: check how well the estimated model can reproduce new, validation data.
2. Residual analysis: are the "leftovers" unpredictable?
3. Comparing different models: which one performs best?
4. Bias-variance trade-off: what is the suitable complexity (flexibility) of a model?

Simulation Cross Validation 30(42)

Collect an estimation data set Z_e and a validation data set Z_v. Estimate the model(s) using the estimation data and simulate it using the input signal in the validation data. Plot that simulated model output together with the measured validation output.
m = arx(z1(1:150),[2 2 1]); compare(z1(151:end),m)
The ultimate test. Needs no probabilistic justification. Plot simulated outputs and measured outputs for the fresh data set. Alternatively one may use predictions over longer periods. Typically the performance can be evaluated as a sum of squared mismatches (FIT).

Model Validation: Residual Analysis 31(42)

ε(t) = ε(t, θ̂_N) = y(t) - ŷ(t|θ̂_N)
ε(t) should be independent of u(t-τ), τ > 0. ε(t) should ideally be white noise. Compute the corresponding correlation functions. Confidence intervals for the estimates can also be computed - this requires some care!

Typical Plots 32(42)

[Plots: correlation function of residuals, output #1; cross-correlation function between input 1 and residuals from output 1, against lag.] True delay: 4; guessed delay: 5.
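Simulation cross validation can be sketched end to end in plain Python (mirroring the MATLAB arx/compare calls above, but on synthetic rather than course data): estimate an ARX(1,1) model on the first half of the record, then run a pure simulation, driven only by the input, on the second half and sum the squared mismatches:

```python
import random

# Sketch of simulation cross validation. Data, orders and noise level are
# illustrative; the course would use m = arx(...) and compare(...) instead.
random.seed(3)
a, b = 0.6, 1.0
u = [random.uniform(-1, 1) for _ in range(300)]
y = [0.0]
for t in range(1, 300):
    y.append(-a * y[t - 1] + b * u[t - 1] + random.gauss(0, 0.05))

# Least-squares ARX(1,1) fit on the estimation half (t = 1..149)
s = [[0.0, 0.0], [0.0, 0.0]]; r = [0.0, 0.0]
for t in range(1, 150):
    phi = [-y[t - 1], u[t - 1]]
    for i in range(2):
        r[i] += phi[i] * y[t]
        for j in range(2):
            s[i][j] += phi[i] * phi[j]
det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
a_hat = (s[1][1] * r[0] - s[0][1] * r[1]) / det
b_hat = (s[0][0] * r[1] - s[1][0] * r[0]) / det

# Pure simulation on the validation half: use only u, never the measured y
ysim = [y[150]]                      # initial condition from the data
for t in range(151, 300):
    ysim.append(-a_hat * ysim[-1] + b_hat * u[t - 1])
sse = sum((y[t] - ysim[t - 150]) ** 2 for t in range(150, 300))
print(a_hat, b_hat, sse)             # small sse = good reproduction of fresh data
```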

Typical Plots, cont'd 33(42)

[Plots: correlation function of residuals, output #1; cross-correlation function between input 1 and residuals from output 1, against lag.] Guessed delay = true delay.

Model Validation Techniques 34(42)

(Same list of techniques as on slide 29(42): simulation cross validation, residual analysis, comparing different models, bias-variance trade-off.)

Comparing Different Models 35(42)

What to compare?
- Simulation performance
- Prediction performance: compare(zv,th,k)
Comparing models on fresh data sets (simulation cross validation as before). Comparing models on second-hand data sets.

Comparing Models on Second-Hand Data 36(42)

Problem: larger models will always perform better on estimation data [plot: fit vs. model order]. BASIC IDEA: compensate for over-fit by adding a model complexity penalty - Akaike: AIC, FPE; Rissanen: MDL. Or use a hypothesis test: check if the decrease in fit is larger than expected.

AIC, FPE, MDL 37(42)

AIC for the Gaussian case:
min_θ [(1 + 2d/N) (1/N) Σ_{t=1}^N ε²(t, θ)],  [≈ -log likelihood + d],  d = dim θ
FPE is similar, with (1 + 2d/N) replaced by (N + d)/(N - d). Aims at estimating the fit that would be obtained for a fresh data set. MDL (or BIC) is like AIC but with the 2d replaced by a d log N penalty.

Comparing Different Models 38(42)

What to compare?
- Simulation performance
- Prediction performance: compare(zv,th,k)
Comparing models on fresh data sets (simulation cross validation as before). Comparing models on second-hand data sets. Comparing many models simultaneously.

Bias-Variance Trade-Off 39(42)

Any estimated model is incorrect. The errors have two sources:
- Bias: the model structure is not flexible enough to contain a correct description of the system.
- Variance: the disturbances on the measurements affect the model estimate, and cause variations when the experiment is repeated, even with the same input.
Mean Square Error (MSE) = Bias² + Variance. When model flexibility increases, bias decreases and variance increases. To minimize the MSE is a good trade-off in flexibility. In state-of-the-art identification, this flexibility trade-off is governed primarily by model order.

Status of the State-of-the-Art Framework 40(42)

- Well established statistical theory
- Optimal asymptotic properties
- Efficient software
- Many applications in very diverse areas. Some examples: aircraft dynamics, brain activity (fMRI), pulp buffer vessel.
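The AIC formula can be applied directly: fit FIR models y(t) = b_1 u(t-1) + ... + b_d u(t-d) of increasing order d by least squares and evaluate W(d) = (1 + 2d/N)(1/N) Σ ε². A sketch on synthetic data with an illustrative true order of 2:

```python
import random

# Sketch of order selection with AIC for the Gaussian case:
#   W(d) = (1 + 2d/N) * (1/N) * sum_t eps^2(t, theta_hat_d),  d = dim theta.
# FIR models of orders d = 1..5 are fitted by least squares; the true system
# has order 2 with illustrative coefficients [1.0, -0.5].
random.seed(4)
N = 500
b_true = [1.0, -0.5]
u = [random.uniform(-1, 1) for _ in range(N + 5)]
y = [b_true[0] * u[t - 1] + b_true[1] * u[t - 2] + random.gauss(0, 0.1)
     for t in range(5, N + 5)]

def solve(A, c):
    """Solve A x = c by Gaussian elimination with partial pivoting (A small)."""
    n = len(c)
    M = [row[:] + [ci] for row, ci in zip(A, c)]
    for i in range(n):
        p = max(range(i, n), key=lambda k: abs(M[k][i]))
        M[i], M[p] = M[p], M[i]
        for k in range(i + 1, n):
            f = M[k][i] / M[i][i]
            for j in range(i, n + 1):
                M[k][j] -= f * M[i][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def aic(d):
    """Least-squares FIR fit of order d, then the AIC-weighted loss W(d)."""
    phis = [[u[t - k] for k in range(1, d + 1)] for t in range(5, N + 5)]
    A = [[sum(p[i] * p[j] for p in phis) for j in range(d)] for i in range(d)]
    c = [sum(p[i] * yt for p, yt in zip(phis, y)) for i in range(d)]
    th = solve(A, c)
    V = sum((yt - sum(ti * pi for ti, pi in zip(th, p))) ** 2
            for yt, p in zip(y, phis)) / N
    return (1 + 2 * d / N) * V

scores = {d: aic(d) for d in range(1, 6)}
best = min(scores, key=scores.get)
print(best, scores[best])  # AIC typically selects an order close to the true 2
```

Too small an order leaves a large bias term in the loss; beyond the true order, the penalty factor (1 + 2d/N) outweighs the small decrease in fit, which is exactly the over-fit compensation described on slide 36(42).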


More information

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS M Gevers,1 L Mišković,2 D Bonvin A Karimi Center for Systems Engineering and Applied Mechanics (CESAME) Université Catholique de Louvain B-1348 Louvain-la-Neuve,

More information

Regression Estimation - Least Squares and Maximum Likelihood. Dr. Frank Wood

Regression Estimation - Least Squares and Maximum Likelihood. Dr. Frank Wood Regression Estimation - Least Squares and Maximum Likelihood Dr. Frank Wood Least Squares Max(min)imization Function to minimize w.r.t. β 0, β 1 Q = n (Y i (β 0 + β 1 X i )) 2 i=1 Minimize this by maximizing

More information

Model comparison and selection

Model comparison and selection BS2 Statistical Inference, Lectures 9 and 10, Hilary Term 2008 March 2, 2008 Hypothesis testing Consider two alternative models M 1 = {f (x; θ), θ Θ 1 } and M 2 = {f (x; θ), θ Θ 2 } for a sample (X = x)

More information

Statistical inference

Statistical inference Statistical inference Contents 1. Main definitions 2. Estimation 3. Testing L. Trapani MSc Induction - Statistical inference 1 1 Introduction: definition and preliminary theory In this chapter, we shall

More information

Basic concepts in estimation

Basic concepts in estimation Basic concepts in estimation Random and nonrandom parameters Definitions of estimates ML Maimum Lielihood MAP Maimum A Posteriori LS Least Squares MMS Minimum Mean square rror Measures of quality of estimates

More information

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012 Parametric Models Dr. Shuang LIANG School of Software Engineering TongJi University Fall, 2012 Today s Topics Maximum Likelihood Estimation Bayesian Density Estimation Today s Topics Maximum Likelihood

More information

Regression, Ridge Regression, Lasso

Regression, Ridge Regression, Lasso Regression, Ridge Regression, Lasso Fabio G. Cozman - fgcozman@usp.br October 2, 2018 A general definition Regression studies the relationship between a response variable Y and covariates X 1,..., X n.

More information

Detection theory 101 ELEC-E5410 Signal Processing for Communications

Detection theory 101 ELEC-E5410 Signal Processing for Communications Detection theory 101 ELEC-E5410 Signal Processing for Communications Binary hypothesis testing Null hypothesis H 0 : e.g. noise only Alternative hypothesis H 1 : signal + noise p(x;h 0 ) γ p(x;h 1 ) Trade-off

More information

COS513: FOUNDATIONS OF PROBABILISTIC MODELS LECTURE 10

COS513: FOUNDATIONS OF PROBABILISTIC MODELS LECTURE 10 COS53: FOUNDATIONS OF PROBABILISTIC MODELS LECTURE 0 MELISSA CARROLL, LINJIE LUO. BIAS-VARIANCE TRADE-OFF (CONTINUED FROM LAST LECTURE) If V = (X n, Y n )} are observed data, the linear regression problem

More information

ECE531 Lecture 8: Non-Random Parameter Estimation

ECE531 Lecture 8: Non-Random Parameter Estimation ECE531 Lecture 8: Non-Random Parameter Estimation D. Richard Brown III Worcester Polytechnic Institute 19-March-2009 Worcester Polytechnic Institute D. Richard Brown III 19-March-2009 1 / 25 Introduction

More information

Solutions for examination in TSRT78 Digital Signal Processing,

Solutions for examination in TSRT78 Digital Signal Processing, Solutions for examination in TSRT78 Digital Signal Processing 212-12-2 1. (a) The forgetting factor balances the accuracy and the adaptation speed of the RLS algorithm. This as the forgetting factor controls

More information

LTI Systems, Additive Noise, and Order Estimation

LTI Systems, Additive Noise, and Order Estimation LTI Systems, Additive oise, and Order Estimation Soosan Beheshti, Munther A. Dahleh Laboratory for Information and Decision Systems Department of Electrical Engineering and Computer Science Massachusetts

More information

HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS

HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS EMERY N. BROWN AND CHRIS LONG NEUROSCIENCE STATISTICS RESEARCH LABORATORY DEPARTMENT

More information

Lecture 25: Review. Statistics 104. April 23, Colin Rundel

Lecture 25: Review. Statistics 104. April 23, Colin Rundel Lecture 25: Review Statistics 104 Colin Rundel April 23, 2012 Joint CDF F (x, y) = P [X x, Y y] = P [(X, Y ) lies south-west of the point (x, y)] Y (x,y) X Statistics 104 (Colin Rundel) Lecture 25 April

More information

STAT Financial Time Series

STAT Financial Time Series STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR

More information

Lecture 7 Introduction to Statistical Decision Theory

Lecture 7 Introduction to Statistical Decision Theory Lecture 7 Introduction to Statistical Decision Theory I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 20, 2016 1 / 55 I-Hsiang Wang IT Lecture 7

More information

Cramér-Rao Bounds for Estimation of Linear System Noise Covariances

Cramér-Rao Bounds for Estimation of Linear System Noise Covariances Journal of Mechanical Engineering and Automation (): 6- DOI: 593/jjmea Cramér-Rao Bounds for Estimation of Linear System oise Covariances Peter Matiso * Vladimír Havlena Czech echnical University in Prague

More information

Lecture Stat Information Criterion

Lecture Stat Information Criterion Lecture Stat 461-561 Information Criterion Arnaud Doucet February 2008 Arnaud Doucet () February 2008 1 / 34 Review of Maximum Likelihood Approach We have data X i i.i.d. g (x). We model the distribution

More information

Lecture 5 September 19

Lecture 5 September 19 IFT 6269: Probabilistic Graphical Models Fall 2016 Lecture 5 September 19 Lecturer: Simon Lacoste-Julien Scribe: Sébastien Lachapelle Disclaimer: These notes have only been lightly proofread. 5.1 Statistical

More information

ARIMA Modelling and Forecasting

ARIMA Modelling and Forecasting ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first

More information

Mathematical statistics

Mathematical statistics October 18 th, 2018 Lecture 16: Midterm review Countdown to mid-term exam: 7 days Week 1 Chapter 1: Probability review Week 2 Week 4 Week 7 Chapter 6: Statistics Chapter 7: Point Estimation Chapter 8:

More information

CMU-Q Lecture 24:

CMU-Q Lecture 24: CMU-Q 15-381 Lecture 24: Supervised Learning 2 Teacher: Gianni A. Di Caro SUPERVISED LEARNING Hypotheses space Hypothesis function Labeled Given Errors Performance criteria Given a collection of input

More information

Tutorial lecture 2: System identification

Tutorial lecture 2: System identification Tutorial lecture 2: System identification Data driven modeling: Find a good model from noisy data. Model class: Set of all a priori feasible candidate systems Identification procedure: Attach a system

More information

Statistics 910, #15 1. Kalman Filter

Statistics 910, #15 1. Kalman Filter Statistics 910, #15 1 Overview 1. Summary of Kalman filter 2. Derivations 3. ARMA likelihoods 4. Recursions for the variance Kalman Filter Summary of Kalman filter Simplifications To make the derivations

More information

Estimation and Model Selection in Mixed Effects Models Part I. Adeline Samson 1

Estimation and Model Selection in Mixed Effects Models Part I. Adeline Samson 1 Estimation and Model Selection in Mixed Effects Models Part I Adeline Samson 1 1 University Paris Descartes Summer school 2009 - Lipari, Italy These slides are based on Marc Lavielle s slides Outline 1

More information

MOOSE: Model Based Optimal Input Design Toolbox for Matlab

MOOSE: Model Based Optimal Input Design Toolbox for Matlab MOOSE: Model Based Optimal Input Design Toolbox for Matlab Technical Report Mariette Annergren mariette.annergren@ee.kth.se Christian Larsson christian.larsson@ee.kth.se September 23, 2011 Contents 1 Introduction

More information

557: MATHEMATICAL STATISTICS II BIAS AND VARIANCE

557: MATHEMATICAL STATISTICS II BIAS AND VARIANCE 557: MATHEMATICAL STATISTICS II BIAS AND VARIANCE An estimator, T (X), of θ can be evaluated via its statistical properties. Typically, two aspects are considered: Expectation Variance either in terms

More information

Univariate ARIMA Models

Univariate ARIMA Models Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.

More information

The Behaviour of the Akaike Information Criterion when Applied to Non-nested Sequences of Models

The Behaviour of the Akaike Information Criterion when Applied to Non-nested Sequences of Models The Behaviour of the Akaike Information Criterion when Applied to Non-nested Sequences of Models Centre for Molecular, Environmental, Genetic & Analytic (MEGA) Epidemiology School of Population Health

More information

University of Pavia. M Estimators. Eduardo Rossi

University of Pavia. M Estimators. Eduardo Rossi University of Pavia M Estimators Eduardo Rossi Criterion Function A basic unifying notion is that most econometric estimators are defined as the minimizers of certain functions constructed from the sample

More information

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Jeremy S. Conner and Dale E. Seborg Department of Chemical Engineering University of California, Santa Barbara, CA

More information

On instrumental variable-based methods for errors-in-variables model identification

On instrumental variable-based methods for errors-in-variables model identification On instrumental variable-based methods for errors-in-variables model identification Stéphane Thil, Marion Gilson, Hugues Garnier To cite this version: Stéphane Thil, Marion Gilson, Hugues Garnier. On instrumental

More information

Chapter 6: Nonparametric Time- and Frequency-Domain Methods. Problems presented by Uwe

Chapter 6: Nonparametric Time- and Frequency-Domain Methods. Problems presented by Uwe System Identification written by L. Ljung, Prentice Hall PTR, 1999 Chapter 6: Nonparametric Time- and Frequency-Domain Methods Problems presented by Uwe System Identification Problems Chapter 6 p. 1/33

More information

Parametric Techniques

Parametric Techniques Parametric Techniques Jason J. Corso SUNY at Buffalo J. Corso (SUNY at Buffalo) Parametric Techniques 1 / 39 Introduction When covering Bayesian Decision Theory, we assumed the full probabilistic structure

More information

STATS 200: Introduction to Statistical Inference. Lecture 29: Course review

STATS 200: Introduction to Statistical Inference. Lecture 29: Course review STATS 200: Introduction to Statistical Inference Lecture 29: Course review Course review We started in Lecture 1 with a fundamental assumption: Data is a realization of a random process. The goal throughout

More information

Variations. ECE 6540, Lecture 10 Maximum Likelihood Estimation

Variations. ECE 6540, Lecture 10 Maximum Likelihood Estimation Variations ECE 6540, Lecture 10 Last Time BLUE (Best Linear Unbiased Estimator) Formulation Advantages Disadvantages 2 The BLUE A simplification Assume the estimator is a linear system For a single parameter

More information

COS513 LECTURE 8 STATISTICAL CONCEPTS

COS513 LECTURE 8 STATISTICAL CONCEPTS COS513 LECTURE 8 STATISTICAL CONCEPTS NIKOLAI SLAVOV AND ANKUR PARIKH 1. MAKING MEANINGFUL STATEMENTS FROM JOINT PROBABILITY DISTRIBUTIONS. A graphical model (GM) represents a family of probability distributions

More information

Econometrics II - EXAM Outline Solutions All questions have 25pts Answer each question in separate sheets

Econometrics II - EXAM Outline Solutions All questions have 25pts Answer each question in separate sheets Econometrics II - EXAM Outline Solutions All questions hae 5pts Answer each question in separate sheets. Consider the two linear simultaneous equations G with two exogeneous ariables K, y γ + y γ + x δ

More information

ECE531 Lecture 10b: Maximum Likelihood Estimation

ECE531 Lecture 10b: Maximum Likelihood Estimation ECE531 Lecture 10b: Maximum Likelihood Estimation D. Richard Brown III Worcester Polytechnic Institute 05-Apr-2011 Worcester Polytechnic Institute D. Richard Brown III 05-Apr-2011 1 / 23 Introduction So

More information