5.12 Approximate Modelling


5.12 Approximate Modelling

What can be said about G(q, θ*) and H(q, θ*) if S ∉ M, and even G₀ ∉ G?

Frequency-domain expression for the limit model. Combine

    ε(t, θ) = H(q, θ)⁻¹ [y(t) − G(q, θ)u(t)]
    y(t) = G₀(q)u(t) + v(t)

We know (convergence result) that θ* = arg min_θ V̄(θ), with

    V̄(θ) = Ē ε(t, θ)² = (1/2π) ∫_{−π}^{π} Φ_ε(ω) dω

(Parseval; both expressions are equal to R_ε(0)). Then it follows that

    ε(t, θ) = H(q, θ)⁻¹ [(G₀(q) − G(q, θ))u(t) + v(t)]

Consequence of the identification criterion:

    V̄(θ) = (1/2π) ∫_{−π}^{π} [ |G₀(e^{iω}) − G(e^{iω}, θ)|² Φ_u(ω) + Φ_v(ω) ] / |H(e^{iω}, θ)|² dω

This criterion plays the role of an approximation criterion.

Alternative form. Write

    ε(t, θ) = e(t) + [(G₀(q) − G(q, θ))/H(q, θ)] u(t) + [(H₀(q) − H(q, θ))/H(q, θ)] e(t)

then

    θ* = arg min_θ (1/2π) ∫_{−π}^{π} [ |G₀(e^{iω}) − G(e^{iω}, θ)|²/|H(e^{iω}, θ)|² · Φ_u(ω)
              + |H₀(e^{iω}) − H(e^{iω}, θ)|²/|H(e^{iω}, θ)|² · σ_e² ] dω

The expression shows how θ* is obtained, through two mechanisms:

- minimization of |G₀(e^{iω}) − G(e^{iω}, θ)|² Φ_u(ω) / |H(e^{iω}, θ)|²
- minimization of |H₀(e^{iω}) − H(e^{iω}, θ)|² σ_e² / |H(e^{iω}, θ)|²

The two problems are coupled if H(q, θ) is parametrized. Observation: the type of approximation of G₀ by G(q, θ̂_N) depends on the noise model H(q, θ).
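The role of V̄(θ) as a frequency-weighted approximation criterion can be made concrete numerically. The sketch below, for the simplest case of a fixed noise model H* = 1 (OE), evaluates the integral on a frequency grid for a toy first-order G₀ and a one-pole model class; both systems are assumptions for illustration, not the course example.

```python
import numpy as np

# Sketch: limit criterion for fixed noise model H* = 1 (OE),
# V̄(θ) = (1/2π) ∫ |G0(e^{iω}) − G(e^{iω},θ)|² Φ_u(ω) dω
w = np.linspace(-np.pi, np.pi, 4001)
dw = w[1] - w[0]
z1 = np.exp(-1j * w)                      # z^{-1} on the unit circle

G0 = 0.5 * z1 / (1 - 0.8 * z1)            # toy "true" system (assumed)
Phi_u = np.ones_like(w)                   # white input: Φ_u(ω) = 1

def Vbar(b1, f1):
    """Criterion for the candidate model G(z,θ) = b1 z^{-1}/(1 + f1 z^{-1})."""
    G = b1 * z1 / (1 + f1 * z1)
    return np.sum(np.abs(G0 - G)**2 * Phi_u) * dw / (2 * np.pi)

V_true = Vbar(0.5, -0.8)                  # zero at the true parameters
V_off  = Vbar(0.5, -0.5)                  # positive at a perturbed model
```

The criterion vanishes exactly when the model class contains G₀ and is otherwise strictly positive; changing Φ_u reweights which frequency bands dominate the fit.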

Special case: fixed noise model H(q, θ) = H*(q) (e.g. OE, with H*(q) = 1). Then

    θ* = arg min_θ (1/2π) ∫_{−π}^{π} |G₀(e^{iω}) − G(e^{iω}, θ)|² Φ_u(ω)/|H*(e^{iω})|² dω

(the second term in the integrand is θ-independent). θ* is determined by minimizing the integrated quadratic error G₀ − G(θ) with weighting function

    Φ_u(ω)/|H*(e^{iω})|²

At those frequencies where the weighting function is large, the model error will be small.

Prefiltering of data. Prefiltering the data with a filter L(q) leads to

    ε_F(t, θ) = L(q)ε(t, θ)   and   Φ_{ε_F}(ω) = |L(e^{iω})|² Φ_ε(ω)

For a fixed noise model the weighting function becomes

    Φ_u(ω)|L(e^{iω})|²/|H*(e^{iω})|²

Example. y(t) = G₀(q)u(t), with G₀ 5th order and Φ_u(ω) = 1 (white noise). To illustrate the effect of approximation, 2nd-order models will be estimated:

    OE model, 2nd order:   y(t) = (b₁q⁻¹ + b₂q⁻²)/(1 + f₁q⁻¹ + f₂q⁻²) u(t) + e(t)
    ARX model, 2nd order:  (1 + a₁q⁻¹ + a₂q⁻²) y(t) = (b₁q⁻¹ + b₂q⁻²) u(t) + e(t)

[Figure: Bode plot (amplitude and phase versus frequency in rad/s) of G₀, the 2nd-order OE model and the 2nd-order ARX model.]
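The ARX structure above is linear in the parameters, so its estimate follows from a single linear least-squares solve. A minimal sketch on a toy third-order system (coefficients assumed for illustration, not the 5th-order course example):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
N = 5000
u = rng.standard_normal(N)                         # white input, Φ_u = 1

# Toy "true" system (assumed)
b0, a0 = [0, 0.3, 0.2, 0.1], [1, -1.2, 0.6, -0.1]
y = lfilter(b0, a0, u)

# 2nd-order ARX by linear LS:
# (1 + a1 q^-1 + a2 q^-2) y(t) = (b1 q^-1 + b2 q^-2) u(t) + e(t)
Phi = np.column_stack([-y[1:-1], -y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
a1, a2, b1, b2 = theta
eps = y[2:] - Phi @ theta                          # one-step-ahead residuals
```

An OE estimate of the same order would instead require a nonlinear (iterative) minimization, since the predictor is not linear in (b, f).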

Comparison of situations:

    OE model:   min_θ (1/2π) ∫_{−π}^{π} |G₀ − G(θ)|² Φ_u dω
    ARX model:  min_θ (1/2π) ∫_{−π}^{π} |G₀ − G(θ)|² Φ_u |A(e^{iω}, θ)|² dω

The ARX case has an additional weighting with the (a priori unknown) function |A(e^{iω}, θ)|².

[Figure: amplitude Bode plot of the weighting function |A(e^{iω}, θ̂_N)|² in the ARX case, versus frequency (rad/sec).]

General situation:

    (1/2π) ∫_{−π}^{π} [ |G₀(e^{iω}) − G(e^{iω}, θ)|² Φ_u(ω) + Φ_v(ω) ] / |H(e^{iω}, θ)|² dω

In case of an independent parametrization or a fixed noise model H = H*, G(q, ρ̂_N) is obtained through

    ρ̂_N = arg min_ρ (1/2π) ∫_{−π}^{π} |G₀ − G(ρ)|² Φ_u |L|²/|H*|² dω

This holds for OE, BJ and FIR. Influencing the approximation criterion (and so the resulting model) by:

- choice of the input spectrum Φ_u
- choice of the prefilter L
- choice of the noise model H*

In other cases: a compromise results, in which a.o. Φ_v is playing a role in the approximation of G₀.

Example. S: y(t) = G₀(q)u(t) + e(t), with G₀ 4th order with three delays. We have to use a given set of data Z^N (N = 5) for the identification, where u is the sum of a white noise of variance 5 and three high-frequency sinusoids.

Objective: using the given data, identify a good model G(q, θ̂_N) for G₀ in the frequency range [0, 0.7] in the reduced-order model structure

    M = OE(n_b = 2, n_f = 2, n_k = 3)

Since Z^N is given, the only degree of freedom we have is to use a prefilter L to shape the bias error. We want a small bias error in the frequency range [0, 0.7], so we choose L such that |L(e^{iω})|² Φ_u(ω) is relatively (much) larger in the frequency range [0, 0.7] than in [0.7, π]:

    L = Butterworth low-pass filter of order 7 and cut-off frequency 0.7 rad/s

We filter the u and y collected from S by this L, and with the resulting filtered data we perform the identification in M.

[Figure: amplitude and phase, from u to y, of G₀ (red) and G(θ̂_N) (blue) identified with the filtered data: G(θ̂_N) is OK in the range of interest.]

What if we do not use a prefilter L?

[Figure: amplitude and phase, from u to y, of G₀ (red) and G(θ̂_N) (blue) identified with the raw data in Z^N: G(θ̂_N) is not OK.]
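The prefiltering step itself is a one-liner with a standard filter-design routine. A hedged sketch with a placeholder system and signals (the real Z^N is not available here); note that scipy normalizes the digital cut-off to the Nyquist frequency π rad/sample:

```python
import numpy as np
from scipy.signal import butter, lfilter

rng = np.random.default_rng(1)
u = rng.standard_normal(2000)
y = lfilter([0, 0, 0, 0.2, 0.1], [1, -0.9], u)   # placeholder system with 3 delays

# 7th-order Butterworth low-pass, cut-off 0.7 rad/sample
b, a = butter(7, 0.7 / np.pi)                    # Wn normalized to Nyquist (π)
uF, yF = lfilter(b, a, u), lfilter(b, a, y)

# The filter suppresses input energy above the cut-off:
U  = np.abs(np.fft.rfft(u))**2
UF = np.abs(np.fft.rfft(uF))**2
w  = np.linspace(0, np.pi, len(U))
hi = w > 1.5
ratio = UF[hi].sum() / U[hi].sum()               # fraction of energy surviving
```

The identification in M is then run on (uF, yF) exactly as it would be on the raw data.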

Identification criteria - from PE to ML

Extensions on identification methods and model structures. Contents:

- Identification criteria - from PE to ML
- The stochastic/deterministic identification paradigm
- Attractive model structures: OBFs

Prediction error characteristics. Model structure represented by (G(q, θ), H(q, θ)). One-step-ahead predictor:

    ŷ(t|t−1; θ) = H(q, θ)⁻¹ G(q, θ)u(t) + [1 − H(q, θ)⁻¹]y(t)

Prediction error:

    ε_F(t, θ) = L(q)[y(t) − ŷ(t|t−1; θ)]

Identification criterion:

    θ̂_N = arg min_θ (1/N) Σ_{t=1}^{N} ε_F(t, θ)²

Characterization of the results in terms of bias and variance of the parameter θ̂_N and the related transfer functions G(q, θ̂_N) and H(q, θ̂_N).

Minimization of a quadratic cost function is not the unambiguous universal choice. An alternative for the quadratic prediction error criterion is a correlation approach, the Instrumental Variable (IV) estimator:

    θ̂_N = sol{ θ | (1/N) Σ_{t=1}^{N} ζ(t)ε_F(t, θ) = 0 }

with ζ(t) ∈ R^d a signal vector with a dimension at least as large as dim(θ); ζ(t) is the vector of instrumental variables/signals. θ̂_N is now obtained as the solution of a set of equations, rather than as the minimizing argument of a cost function. The most simple choice is

    ζ(t) = [u(t−1) ⋯ u(t−d)]^T

leading to a parameter estimate that generates a prediction error signal ε_F(t, θ) that is uncorrelated with past inputs. IV estimators are generally directed towards estimating G₀ only. Consistency properties are attractive and in line with general PE methods.

How to compare different estimators? Here we will take a look at the (asymptotic) estimator variance.
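For a model that is linear in the parameters, the IV equations are a square linear system and need no iterative search. A sketch on an assumed first-order ARX-structured model with coloured noise, where plain LS is biased but the IV estimate with delayed-input instruments is not:

```python
import numpy as np
from scipy.signal import lfilter

# Toy setup (assumed): y(t) = 0.7 y(t-1) + 0.5 u(t-1) + coloured noise
rng = np.random.default_rng(2)
N = 20000
u = rng.standard_normal(N)
v = lfilter([1, 0.5], [1], rng.standard_normal(N))   # MA(1) disturbance
y = lfilter([0, 0.5], [1, -0.7], u) + v

phi  = np.column_stack([-y[1:-1], u[1:-1]])          # regressors [-y(t-1), u(t-1)]
zeta = np.column_stack([u[1:-1], u[:-2]])            # instruments [u(t-1), u(t-2)]
rhs  = y[2:]

# IV: solve (1/N) Σ ζ(t)ε(t,θ) = 0, i.e. (ζᵀφ)θ = ζᵀy
theta_iv = np.linalg.solve(zeta.T @ phi, zeta.T @ rhs)
theta_ls, *_ = np.linalg.lstsq(phi, rhs, rcond=None)  # biased here
```

The true parameter vector is (a, b) = (−0.7, 0.5); the instruments are uncorrelated with the noise, which restores consistency for G₀ while LS converges to a biased value.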

PE methods: from covariance matrix to uncertainty interval

Asymptotic result (under appropriate conditions), for N → ∞:

    √N(θ̂_N − θ*) → N(0, P_θ)

where P_θ can be specified for θ* = θ₀ (no bias). The sum of squared standardized variables follows a χ² distribution:

    √N(θ̂_N − θ₀) → N(0, P_θ)   ⟹   N(θ̂_N − θ₀)^T P_θ⁻¹ (θ̂_N − θ₀) → χ²(n)

Let P_θ be decomposed as P_θ = R^T R; then

    √N R^{−T}(θ̂_N − θ₀) → N(0, I_n)

i.e. a vector of n standard (independent) normally distributed random variables.

Note that the contour lines

    N(θ − θ̂_N)^T P_θ⁻¹ (θ − θ̂_N) = c

are ellipsoidal contour lines of the multivariable Gaussian distribution, while the level of probability related to the event

    (θ − θ̂_N)^T P_θ⁻¹ (θ − θ̂_N) < c/N

is determined by the χ²(n) distribution. The ellipsoid axes point along the eigenvectors w_i of P_θ and have lengths proportional to 2√(cλ_i), with λ_i the eigenvalues of P_θ.

[Figure: example of a χ²(2) probability density function, and the corresponding elliptical contour lines with axes along w₁ and w₂.]
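The ellipsoid geometry follows directly from an eigendecomposition of P_θ and a χ² quantile. A short sketch for n = 2 with a hypothetical covariance matrix:

```python
import numpy as np
from scipy.stats import chi2

# Axes of the α-level confidence ellipsoid
# { θ : N (θ - θ̂_N)ᵀ P_θ⁻¹ (θ - θ̂_N) < c } for an assumed P_θ.
N, n, alpha = 1000, 2, 0.95
P = np.array([[2.0, 0.8],
              [0.8, 1.0]])              # hypothetical asymptotic covariance P_θ

c = chi2.ppf(alpha, df=n)               # χ²(α, n) level, ≈ 5.99 for α=0.95, n=2
lam, W = np.linalg.eigh(P)              # eigenvalues λ_i (ascending), eigenvectors w_i
half_axes = np.sqrt(c * lam / N)        # half-axis lengths along the w_i directions
```

The ellipsoid shrinks with 1/√N, and its orientation (the columns of W) reveals which parameter combinations are well and poorly determined.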

Denote

    D_θ = { θ | (θ − θ₀)^T P_θ⁻¹ (θ − θ₀) < c_χ(α, n)/N }

the ellipsoid centered at θ₀; then θ̂_N ∈ D_θ with probability α. Alternatively, denote

    D_{θ̂_N} = { θ | (θ − θ̂_N)^T P_θ⁻¹ (θ − θ̂_N) < c_χ(α, n)/N }

the ellipsoid centered at θ̂_N; then it can simply be verified that

    θ̂_N ∈ D_θ  ⟺  θ₀ ∈ D_{θ̂_N}

so that θ₀ ∈ D_{θ̂_N} with probability α.

Central question in estimation theory: does there exist - in a specified situation - a lower bound for the variance of a parameter estimator?

Cramér-Rao lower bound (CRLB). Consider observations of a random variable y with pdf f_y(y; θ), where θ is the unknown parameter. Then for any unbiased estimator θ̂ of the parameter, its covariance matrix satisfies the inequality

    cov(θ̂) ≥ J⁻¹

with the Fisher information matrix

    J = E[ −∂² log f_y(y; θ)/∂θ² ]|_{θ=θ₀}

Remarks on the CRLB:

- The CRLB requires knowledge of the pdf f_y(y; θ).
- The CRLB generally requires exact knowledge of θ₀; exception: Gaussian pdfs with a linear regression model.
- It provides a lower bound for unbiased estimators.
- It is independent of the particular estimation method.
- It is useful for issues such as analysis and experiment design.
- An estimator that reaches the CRLB does not necessarily exist!

Example. Measurement of a physical variable θ₀ with 5 different sensors:

    y_i = θ₀ + e_i,   i = 1, …, 5

with e_i independent zero-mean Gaussian random variables with variances σ₁², …, σ₅². Then

    f_y(y; θ) = 1/((2π)^{5/2} (det Σ)^{1/2}) exp[ −½ (y − θ·𝟙)^T Σ⁻¹ (y − θ·𝟙) ]

with y := [y₁ y₂ ⋯ y₅]^T, θ·𝟙 = [θ ⋯ θ]^T and covariance matrix Σ = diag(σ₁², …, σ₅²). Then

    log f_y(y; θ) = c − ½ Σ_{i=1}^{5} (y_i − θ)²/σ_i²
    ∂ log f_y(y; θ)/∂θ = Σ_{i=1}^{5} (y_i − θ)/σ_i²
    −∂² log f_y(y; θ)/∂θ² = Σ_{i=1}^{5} 1/σ_i²

and therefore

    var(θ̂) ≥ 1/(1/σ₁² + ⋯ + 1/σ₅²)

The CRLB decreases with every additional measurement (determined by the essential information content in the data). Note that an (unweighted) LS estimator will provide

    θ̂ = (1/5) Σ_{i=1}^{5} y_i   with   var(θ̂) = (1/25) Σ_{i=1}^{5} σ_i²

For σ_i² = (0.1, 1, 5, 10, 20) this equals 1.444, while the CRLB = 0.088.

Maximum likelihood estimator

General estimation principle for situations where the probability density function (pdf) of the observed/measured variables is known. Goal: estimate an unknown parameter θ in the pdf of a random variable y on the basis of observations of y.

Example: the random variable y has a Gaussian (unit variance) pdf with unknown mean μ_y = θ:

    f_y(y; θ) = (1/√(2π)) e^{−(y−θ)²/2}

For given θ this is a pdf; for given y and unknown θ it is a deterministic function of θ, the likelihood function L(θ).

Maximum likelihood principle: for given observations y, determine θ such that L(θ) is maximal (choose that pdf that - a posteriori - makes the observed value of the considered variable most likely). For one observation y:

    L(θ) = (1/√(2π)) e^{−(y−θ)²/2}

Maximizing L(θ) leads to θ̂ = y. For N independent observations y_i:

    L(θ) = f(y₁, …, y_N; θ) = Π_{i=1}^{N} f(y_i; θ)
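The numbers in the five-sensor example can be checked directly; the sketch also computes the inverse-variance weighted mean (the BLUE), which in this Gaussian linear case attains the CRLB:

```python
import numpy as np

# Five-sensor example: unweighted-mean variance vs. the CRLB.
sigma2 = np.array([0.1, 1.0, 5.0, 10.0, 20.0])   # sensor noise variances

var_ls = sigma2.sum() / 25               # var((1/5) Σ y_i) = (1/25) Σ σ_i²
crlb = 1.0 / np.sum(1.0 / sigma2)        # 1 / Σ (1/σ_i²)

# Inverse-variance weighted mean (BLUE): weights ∝ 1/σ_i²
w = (1.0 / sigma2) / np.sum(1.0 / sigma2)
var_blue = np.sum(w**2 * sigma2)         # equals the CRLB here
```

The gap (1.444 versus 0.088) illustrates the slide's point: the information content is dominated by the accurate sensors, which the unweighted average fails to exploit.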

Example: observation with model y = θu + e.

[Figure: pdf f_y(y; θ) and likelihood function L(θ; y = 1); when observing y = 1 this leads to θ̂ = arg max_θ L(θ; y = 1).]

Model: y(t) = ŷ(t|t−1; θ) + ε(t), with ε = e for θ = θ₀. If the {e(t)} are independent random variables for different t with equal pdf f_e, then

    f_y(x^N; θ) = Π_{t=1}^{N} f_e(x(t) − x̂(t|t−1; θ))

(the probability density function of the observations), and for the particular observation x = y (a posteriori):

    L_y(θ; y^N) = Π_{t=1}^{N} f_e(y(t) − ŷ(t|t−1; θ)) = Π_{t=1}^{N} f_e(ε(t, θ))

the likelihood function, a deterministic function of θ.

If f_e is Gaussian:

    L_y(θ; y^N) = Π_{t=1}^{N} (1/(√(2π)σ_e)) e^{−ε(t,θ)²/(2σ_e²)}

ML estimator: maximizing L_y is equivalent to minimizing −log L_y:

    −log L_y(θ; y^N) = (N/2) log 2π + N log σ_e + (1/(2σ_e²)) Σ_{t=1}^{N} ε(t, θ)²

If σ_e is either fixed or parametrized independently of θ, then

    min_θ Σ_{t=1}^{N} ε(t, θ)²  =  LS

If the noise e on the data is a sequence of independent observations (white noise) from a Gaussian pdf with zero mean and equal variance for all observations, then the ML estimator is equal to the least-squares (LS) prediction error estimator.
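The equivalence "ML = LS for Gaussian e with fixed σ_e" can be verified numerically: minimizing the negative log-likelihood and solving the LS problem give the same parameter. A sketch for the scalar model y = θu + e (toy data, θ₀ = 2 assumed):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
u = rng.standard_normal(200)
y = 2.0 * u + rng.standard_normal(200)          # y = θ0 u + e, θ0 = 2 (assumed)

sigma_e = 1.0                                   # fixed, θ-independent
def neg_log_lik(theta):
    eps = y - theta * u                         # ε(t, θ)
    return (len(y)/2)*np.log(2*np.pi) + len(y)*np.log(sigma_e) \
           + np.sum(eps**2) / (2*sigma_e**2)

theta_ml = minimize_scalar(neg_log_lik).x       # ML estimate
theta_ls = np.sum(u*y) / np.sum(u**2)           # closed-form LS estimate
```

Only the Σε² term of −log L_y depends on θ, so the two minimizers coincide regardless of the value chosen for σ_e.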

Properties of the ML estimator for N → ∞:

    √N(θ̂_N − θ₀) → N(0, N·J⁻¹)

with J the Fisher information matrix, so that asymptotically in N

    cov(θ̂_N) = J⁻¹   (Cramér-Rao lower bound)

This implies that the ML estimator

- is consistent
- asymptotically reaches the smallest possible variance (CRLB) over all unbiased estimators
- gives no guarantees for properties in case of finite N

Or differently phrased: if in the PE setting the noise disturbance e is Gaussian and S ∈ M, then PE = ML.

The stochastic/deterministic identification paradigm

Alternative interpretation of stochastic estimation in PE.

[Figure: block diagrams of the system (u → G₀, output disturbance v = H₀e) and the model (u → G(q, θ), predictor filter 1/H(q, θ) producing ε(t, θ)).]

- The probabilistic estimation framework results from assumptions on the noise e.
- Modelling the characteristics of e or v is a user's choice.
- The probabilistic framework is useful for random types of disturbances.
- The assumptions on e determine the parameter uncertainty set.
- The (probabilistic) ellipsoidal uncertainty set is a selection of θ's for which the resulting ε(t, θ) is likely to be a realization of a (Gaussian) white noise process.
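The last point, selecting θ's whose ε(t, θ) could plausibly be white noise, is what a standard residual whiteness check operationalizes: under whiteness the sample autocorrelations scaled by √N are approximately standard normal. A sketch (the residual signal here is a white stand-in, not real identification output):

```python
import numpy as np

rng = np.random.default_rng(4)
eps = rng.standard_normal(5000)               # stand-in for residuals ε(t, θ̂)

N = len(eps)
e0 = eps - eps.mean()
def acorr(tau):
    """Sample autocorrelation r(τ), normalized by r(0)."""
    return np.dot(e0[:-tau], e0[tau:]) / np.dot(e0, e0)

r = np.array([acorr(tau) for tau in range(1, 21)])
# Under whiteness, √N r(τ) ~ N(0,1); count lags outside the 99% band:
n_violations = np.sum(np.abs(r) > 2.58 / np.sqrt(N))
```

A θ for which many lags violate the band would be excluded from the uncertainty set.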

Generalization: set-membership identification

[Figure: block diagram of the system (u → G₀, disturbance v) and the model G(q, θ) generating ε(t, θ).]

Identification of model uncertainty bounds can be reformulated:

- Choose a noise set V of which the disturbance signal v is assumed to be a member.
- For a given identification experiment (y^N, u^N), determine the parameter uncertainty set as

      Θ_unc = { θ | v̂(t, θ) := ε(t, θ) ∈ V (with probability α) }

If V is non-probabilistic, then Θ_unc is also.

Driven by the desire to have a hard-bounded uncertainty set Θ_unc, the noise set V can be chosen hard-bounded (deterministic) also:

- amplitude-bounded: |v(t)| ≤ c
- energy-bounded: (1/N) Σ_{t=1}^{N} v²(t) ≤ c
- ...

In combination with a linear-in-the-parameters (l-i-p) model structure, the resulting identification algorithm becomes a linear programming problem.

Set-membership identification (cont'd): characteristic properties.

- Alternative noise sets allow for nonlinear, time-varying disturbances.
- Close to the parameter uncertainty bounds, the related (worst-case) v becomes strongly correlated with u ("Nature is playing against you").
- With the indicated noise sets, uncertainty bounds become hard-bounded but very large. Conservatism results from a choice of V that is bigger than the realistically occurring set of noise signals.
- Improvements can be made in V by including a bound on the cross-correlation

      |(1/N) Σ_{t=1}^{N} v(t)u(t − τ)| ≤ c   (Hakvoort et al., 1995)

  This limits the knowledge that the (worst-case) noise signal has about the (past and future) input.
- Consistency is generally lost (the set does not shrink to a single point for increasing N).
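For an l-i-p model with an amplitude-bounded noise set, the feasible parameter set is a polytope, and per-parameter bounds are obtained by minimizing/maximizing each coordinate over it, one LP each. A sketch for a 2-tap FIR model (system and bound c are toy assumptions):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)
N, c = 200, 0.1
u = rng.standard_normal(N)
theta0 = np.array([0.8, 0.4])                       # "true" FIR coefficients (toy)
Phi = np.column_stack([u[1:-1], u[:-2]])            # regressors [u(t-1), u(t-2)]
y = Phi @ theta0 + rng.uniform(-c, c, size=N-2)     # |v(t)| <= c by construction

# Feasible set { θ : |y - Phi θ| <= c } as two stacked linear inequalities
A = np.vstack([Phi, -Phi])
b = np.concatenate([y + c, c - y])

def bound(i, sign):
    """min (sign=+1) or max (sign=-1) of θ_i over the feasible polytope."""
    cost = np.zeros(2)
    cost[i] = sign
    res = linprog(cost, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)
    return sign * res.fun

lo0, hi0 = bound(0, +1), bound(0, -1)               # hard interval for θ_1
```

The interval [lo0, hi0] is guaranteed to contain the true coefficient whenever the noise really satisfies the assumed bound, which is exactly the hard-bounded uncertainty promised above.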

Attractive model structures: OBFs

In the PE results (Chapter 5), two attractive properties of model structures appear:

- linearity-in-the-parameters: LS leads to a convex optimization;
- Output-Error structure: consistent identification of G₀ irrespective of the noise model.

Both properties are combined by the finite impulse response (FIR) model:

    G(z, θ) = Σ_{k=1}^{n_b} b_k z^{−k},   H(z, θ) = 1 (fixed)

Disadvantage: for systems with relatively high sampling rates and small damping (long tails), the necessary number of parameters n_b can become very high. General aim towards parsimonious parametrizations: use parametrizations with few parameters, but in strategic locations.

Generalized basis functions: from linear combinations of shifted pulse functions towards linear combinations of smartly chosen dynamic functions,

    G(z, θ) = Σ_{k=1}^{n} c_k F_k(z)

Particular properties of the sequence {z⁻¹, z⁻², …}:

- They constitute a complete basis for the systems space H₂: systems with a pulse response g(k) ∈ ℓ₂, i.e. satisfying Σ_k g(k)² < ∞. The inner product in this space is

      <G₁, G₂> := (1/2π) ∫_{−π}^{π} G₁(e^{iω}) G₂*(e^{iω}) dω = Σ_k g₁(k)g₂(k)

- All basis functions are orthonormal: <z⁻ⁱ, z⁻ʲ> = 0 for i ≠ j, and = 1 for i = j.
- When using a white-noise input signal u: E[q⁻ⁱu(t) · q⁻ʲu(t)] = 0 for i ≠ j.
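The FIR estimate is a plain linear regression on delayed inputs, and with H = 1 fixed it recovers the impulse response of G₀ consistently for a white input even with additive output noise. A sketch on a toy first-order system (all values assumed):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(6)
N, nb = 10000, 30
u = rng.standard_normal(N)
y = lfilter([0, 0.5], [1, -0.8], u) + 0.1 * rng.standard_normal(N)  # toy G0 + noise

# Regressor matrix with columns u(t-1), ..., u(t-nb)
Phi = np.column_stack(
    [np.concatenate([np.zeros(k), u[:-k]]) for k in range(1, nb + 1)])
b_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)

g0 = 0.5 * 0.8 ** np.arange(nb)      # true impulse response g0(k) = 0.5·0.8^{k-1}
```

The slowly decaying tail (pole at 0.8) is exactly why nb must be large here, which is the parsimony problem the basis functions below address.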

Laguerre functions. Consider the sequence of functions

    { 1/(z − a), 1/(z − a)², 1/(z − a)³, … },   |a| < 1

with a a real-valued pole. These functions constitute a complete basis for H₂. After Gram-Schmidt orthogonalization:

    F_k(z) = (√(1 − a²)/(z − a)) · ((1 − az)/(z − a))^{k−1},   k = 1, 2, …

Note: if a = 0 then the FIR structure results.

Generalized orthonormal basis functions. Generalization of this idea: consider the sequence of functions

    { 1/(z − ξ₁), 1/((z − ξ₁)(z − ξ₂)), 1/((z − ξ₁)(z − ξ₂)(z − ξ₃)), … },   |ξ_i| < 1

Classical result (Walsh, 1935): the basis functions are complete in H₂ if

    Σ_{i=1}^{∞} (1 − |ξ_i|) = ∞

(i.e. the poles should not tend to the unit circle). After Gram-Schmidt orthogonalization:

    F_k(z) = (√(1 − |ξ_k|²)/(z − ξ_k)) · Π_{i=1}^{k−1} (1 − ξ̄_i z)/(z − ξ_i)

where the product term is an all-pass function. Again: for ξ_i = 0 for all i, the FIR structure results.

Use in system identification. The considered model structure leads to

    ŷ(t|t−1; θ) = Σ_{k=1}^{n} c_k F_k(q)u(t)

Then ε(t, θ) = y(t) − φ^T(t)θ with

    φ(t) := [F₁(q)u(t)  F₂(q)u(t)  ⋯  F_n(q)u(t)]^T

The LS estimate is simply obtained by

    θ̂_N = [ (1/N) Σ_{t=1}^{N} φ(t)φ^T(t) ]⁻¹ [ (1/N) Σ_{t=1}^{N} φ(t)y(t) ]

and all relevant properties of linear regression estimators are maintained.

Degrees of freedom: selection of basis poles. Pole selection is usually done periodically:

    {ξ₁, ξ₂, …, ξ_n} = {ξ₁, ξ₂, …, ξ_{n_b}, ξ₁, …, ξ_{n_b}, …}

Convergence result: if G₀(z) has poles {p_i}_{i=1,…,n}, and the basis is induced by a set of poles {ξ_j}_{j=1,…,n_b}, then the slowest eigenvalue that determines the convergence rate of the coefficients in G₀(z) = Σ_k c_k F_k(z) is given by

    max_i Π_{j=1}^{n_b} | (p_i − ξ_j)/(1 − ξ̄_j p_i) |

so the coefficients c_k decay quickly if the pole sets are close.
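The Laguerre regressors can be generated recursively: F₁ is a first-order low-pass, and each next basis signal is obtained by passing the previous one through the all-pass section (1 − az)/(z − a). A sketch where the (assumed) true system has its pole exactly at the basis pole a, so a single coefficient suffices:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(7)
N, n, a = 5000, 8, 0.7
u = rng.standard_normal(N)
y = lfilter([0, 1.0], [1, -0.7], u)        # toy G0 = 1/(z - 0.7)

# F_1(z) = sqrt(1-a²) z^{-1}/(1 - a z^{-1}); next functions multiply by the
# all-pass (1 - a z)/(z - a) = (z^{-1} - a)/(1 - a z^{-1}).
x = lfilter([0, np.sqrt(1 - a**2)], [1, -a], u)
cols = [x]
for _ in range(n - 1):
    x = lfilter([-a, 1], [1, -a], x)       # all-pass section
    cols.append(x)
Phi = np.column_stack(cols)

c_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ c_hat
fit = 1 - np.sum((y - y_hat)**2) / np.sum(y**2)
```

With the basis pole matching the system pole, essentially all energy lands in c₁ (here c₁ = 1/√(1 − a²)); a mismatched pole would spread it over many coefficients, exactly as the convergence result predicts.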

Example: 3rd-order G₀ with poles p₁,₂ = 0.985 ± 0.06i and p₃ = 0.75.

[Table: number of expansion coefficients N₀.₀₁ needed before the coefficient magnitude has decayed below 1%, for FIR, a Laguerre basis, a Kautz basis (ξ₁,₂ = 0.97 ± 0.1i) and two GOBF choices (ξ₁,₂ = 0.97 ± 0.1i plus a real ξ₃, and ξ₁,₂ = 0.98 ± 0.05i plus a real ξ₃); the numeric entries did not survive transcription.]

Features in identification with GOBFs: use of uncertain prior knowledge in identification. The more accurate the priors (pole information), the simpler the ID problem (fewer parameters), but with no loss of generality (for increasing n, still all systems can be represented).

Additional properties:

- Variance analysis is available, including finite-time variance analysis.
- A noise model can be added (iteratively, through GLS).
- Easily extendable to MIMO models.
- Linear constraints can be added (static gain, ...).

Alternative interpretation of the basis construction:

[Figure: cascade of identical all-pass sections G_b driven by u(t), with balanced state readouts x_b⁽¹⁾(t), …, x_b⁽ⁿ⁾(t) weighted by c₁, …, c_n and summed to y(t).]

Through balanced-state representations of the all-pass functions G_b generated by the poles {ξ₁, …, ξ_{n_b}}. This is a generalization of the so-called tapped delay line (G_b = z⁻¹).

Summary

- Interpretation of the PE identification criterion in a statistical ML framework
- Justification of a stochastic approach
- Discussion of deterministic alternatives
- Extension of the classical model structures towards attractive GOBFs (Generalized Orthonormal Basis Functions)

Further reading in Heuberger, Van den Hof and Wahlberg (Springer 2005).

What we could not discuss yet:

- MIMO models and subspace identification (Lect. 7)
- Nonlinear dynamics
- Estimation of physical parameters (on-line/recursive)
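The decay-rate bound max_i Π_j |(p_i − ξ_j)/(1 − ξ̄_j p_i)| from the convergence result is easy to evaluate. A sketch for poles in the spirit of this example (the imaginary part 0.06 and the GOBF pole set are assumptions, since the table values were lost in transcription):

```python
import numpy as np

p = np.array([0.985 + 0.06j, 0.985 - 0.06j, 0.75])   # system poles (0.06 assumed)

def decay_rate(xi):
    """max_i Π_j |(p_i - ξ_j)/(1 - conj(ξ_j) p_i)| over the system poles p."""
    xi = np.asarray(xi, dtype=complex)
    rho = [np.prod(np.abs((pi - xi) / (1 - np.conj(xi) * pi))) for pi in p]
    return max(rho)

rho_fir  = decay_rate([0.0])                              # FIR: basis pole at 0
rho_gobf = decay_rate([0.97 + 0.1j, 0.97 - 0.1j, 0.75])   # poles near p (assumed)
```

For FIR the rate equals max|p_i| ≈ 0.987, so hundreds of coefficients are needed; with basis poles near the system poles the rate drops well below that, which is the mechanism behind the N₀.₀₁ comparison in the table.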

EL1820 Modeling of Dynamical Systems

EL1820 Modeling of Dynamical Systems EL1820 Modeling of Dynamical Systems Lecture 9 - Parameter estimation in linear models Model structures Parameter estimation via prediction error minimization Properties of the estimate: bias and variance

More information

EL1820 Modeling of Dynamical Systems

EL1820 Modeling of Dynamical Systems EL1820 Modeling of Dynamical Systems Lecture 10 - System identification as a model building tool Experiment design Examination and prefiltering of data Model structure selection Model validation Lecture

More information

Chapter 6: Nonparametric Time- and Frequency-Domain Methods. Problems presented by Uwe

Chapter 6: Nonparametric Time- and Frequency-Domain Methods. Problems presented by Uwe System Identification written by L. Ljung, Prentice Hall PTR, 1999 Chapter 6: Nonparametric Time- and Frequency-Domain Methods Problems presented by Uwe System Identification Problems Chapter 6 p. 1/33

More information

Basic concepts in estimation

Basic concepts in estimation Basic concepts in estimation Random and nonrandom parameters Definitions of estimates ML Maimum Lielihood MAP Maimum A Posteriori LS Least Squares MMS Minimum Mean square rror Measures of quality of estimates

More information

2 Statistical Estimation: Basic Concepts

2 Statistical Estimation: Basic Concepts Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof. N. Shimkin 2 Statistical Estimation:

More information

PERFORMANCE ANALYSIS OF CLOSED LOOP SYSTEM WITH A TAILOR MADE PARAMETERIZATION. Jianhong Wang, Hong Jiang and Yonghong Zhu

PERFORMANCE ANALYSIS OF CLOSED LOOP SYSTEM WITH A TAILOR MADE PARAMETERIZATION. Jianhong Wang, Hong Jiang and Yonghong Zhu International Journal of Innovative Computing, Information and Control ICIC International c 208 ISSN 349-498 Volume 4, Number, February 208 pp. 8 96 PERFORMANCE ANALYSIS OF CLOSED LOOP SYSTEM WITH A TAILOR

More information

Further Results on Model Structure Validation for Closed Loop System Identification

Further Results on Model Structure Validation for Closed Loop System Identification Advances in Wireless Communications and etworks 7; 3(5: 57-66 http://www.sciencepublishinggroup.com/j/awcn doi:.648/j.awcn.735. Further esults on Model Structure Validation for Closed Loop System Identification

More information

Identification, Model Validation and Control. Lennart Ljung, Linköping

Identification, Model Validation and Control. Lennart Ljung, Linköping Identification, Model Validation and Control Lennart Ljung, Linköping Acknowledgment: Useful discussions with U Forssell and H Hjalmarsson 1 Outline 1. Introduction 2. System Identification (in closed

More information

ECE 636: Systems identification

ECE 636: Systems identification ECE 636: Systems identification Lectures 3 4 Random variables/signals (continued) Random/stochastic vectors Random signals and linear systems Random signals in the frequency domain υ ε x S z + y Experimental

More information

Outline 2(42) Sysid Course VT An Overview. Data from Gripen 4(42) An Introductory Example 2,530 3(42)

Outline 2(42) Sysid Course VT An Overview. Data from Gripen 4(42) An Introductory Example 2,530 3(42) Outline 2(42) Sysid Course T1 2016 An Overview. Automatic Control, SY, Linköpings Universitet An Umbrella Contribution for the aterial in the Course The classic, conventional System dentification Setup

More information

IDENTIFICATION FOR CONTROL

IDENTIFICATION FOR CONTROL IDENTIFICATION FOR CONTROL Raymond A. de Callafon, University of California San Diego, USA Paul M.J. Van den Hof, Delft University of Technology, the Netherlands Keywords: Controller, Closed loop model,

More information

CONTROL SYSTEMS, ROBOTICS, AND AUTOMATION - Vol. V - Prediction Error Methods - Torsten Söderström

CONTROL SYSTEMS, ROBOTICS, AND AUTOMATION - Vol. V - Prediction Error Methods - Torsten Söderström PREDICTIO ERROR METHODS Torsten Söderström Department of Systems and Control, Information Technology, Uppsala University, Uppsala, Sweden Keywords: prediction error method, optimal prediction, identifiability,

More information

On Input Design for System Identification

On Input Design for System Identification On Input Design for System Identification Input Design Using Markov Chains CHIARA BRIGHENTI Masters Degree Project Stockholm, Sweden March 2009 XR-EE-RT 2009:002 Abstract When system identification methods

More information

EECE Adaptive Control

EECE Adaptive Control EECE 574 - Adaptive Control Basics of System Identification Guy Dumont Department of Electrical and Computer Engineering University of British Columbia January 2010 Guy Dumont (UBC) EECE574 - Basics of

More information

Estimators as Random Variables

Estimators as Random Variables Estimation Theory Overview Properties Bias, Variance, and Mean Square Error Cramér-Rao lower bound Maimum likelihood Consistency Confidence intervals Properties of the mean estimator Introduction Up until

More information

12. Prediction Error Methods (PEM)

12. Prediction Error Methods (PEM) 12. Prediction Error Methods (PEM) EE531 (Semester II, 2010) description optimal prediction Kalman filter statistical results computational aspects 12-1 Description idea: determine the model parameter

More information

Experiment design for batch-to-batch model-based learning control

Experiment design for batch-to-batch model-based learning control Experiment design for batch-to-batch model-based learning control Marco Forgione, Xavier Bombois and Paul M.J. Van den Hof Abstract An Experiment Design framewor for dynamical systems which execute multiple

More information

6.435, System Identification

6.435, System Identification SET 6 System Identification 6.435 Parametrized model structures One-step predictor Identifiability Munther A. Dahleh 1 Models of LTI Systems A complete model u = input y = output e = noise (with PDF).

More information

Introduction to system identification

Introduction to system identification Introduction to system identification Jan Swevers July 2006 0-0 Introduction to system identification 1 Contents of this lecture What is system identification Time vs. frequency domain identification Discrete

More information

On Identification of Cascade Systems 1

On Identification of Cascade Systems 1 On Identification of Cascade Systems 1 Bo Wahlberg Håkan Hjalmarsson Jonas Mårtensson Automatic Control and ACCESS, School of Electrical Engineering, KTH, SE-100 44 Stockholm, Sweden. (bo.wahlberg@ee.kth.se

More information

Model structure. Lecture Note #3 (Chap.6) Identification of time series model. ARMAX Models and Difference Equations

Model structure. Lecture Note #3 (Chap.6) Identification of time series model. ARMAX Models and Difference Equations System Modeling and Identification Lecture ote #3 (Chap.6) CHBE 70 Korea University Prof. Dae Ryoo Yang Model structure ime series Multivariable time series x [ ] x x xm Multidimensional time series (temporal+spatial)

More information

System Identification

System Identification System Identification Lecture : Statistical properties of parameter estimators, Instrumental variable methods Roy Smith 8--8. 8--8. Statistical basis for estimation methods Parametrised models: G Gp, zq,

More information

CHEAPEST IDENTIFICATION EXPERIMENT WITH GUARANTEED ACCURACY IN THE PRESENCE OF UNDERMODELING

CHEAPEST IDENTIFICATION EXPERIMENT WITH GUARANTEED ACCURACY IN THE PRESENCE OF UNDERMODELING CHEAPEST IDENTIFICATION EXPERIMENT WITH GUARANTEED ACCURACY IN THE PRESENCE OF UNDERMODELING Xavier Bombois, Marion Gilson Delft Center for Systems and Control, Delft University of Technology, Mekelweg

More information

ECE 636: Systems identification

ECE 636: Systems identification ECE 636: Systems identification Lectures 9 0 Linear regression Coherence Φ ( ) xy ω γ xy ( ω) = 0 γ Φ ( ω) Φ xy ( ω) ( ω) xx o noise in the input, uncorrelated output noise Φ zz Φ ( ω) = Φ xy xx ( ω )

More information

Outline. What Can Regularization Offer for Estimation of Dynamical Systems? State-of-the-Art System Identification

Outline. What Can Regularization Offer for Estimation of Dynamical Systems? State-of-the-Art System Identification Outline What Can Regularization Offer for Estimation of Dynamical Systems? with Tianshi Chen Preamble: The classic, conventional System Identification Setup Bias Variance, Model Size Selection Regularization

More information

Improving performance and stability of MRI methods in closed-loop

Improving performance and stability of MRI methods in closed-loop Preprints of the 8th IFAC Symposium on Advanced Control of Chemical Processes The International Federation of Automatic Control Improving performance and stability of MRI methods in closed-loop Alain Segundo

More information

Non-parametric identification

Non-parametric identification Non-parametric Non-parametric Transient Step-response using Spectral Transient Correlation Frequency function estimate Spectral System Identification, SSY230 Non-parametric 1 Non-parametric Transient Step-response

More information

EECE Adaptive Control

EECE Adaptive Control EECE 574 - Adaptive Control Recursive Identification in Closed-Loop and Adaptive Control Guy Dumont Department of Electrical and Computer Engineering University of British Columbia January 2010 Guy Dumont

More information

THIS paper studies the input design problem in system identification.

THIS paper studies the input design problem in system identification. 1534 IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL. 50, NO. 10, OCTOBER 2005 Input Design Via LMIs Admitting Frequency-Wise Model Specifications in Confidence Regions Henrik Jansson Håkan Hjalmarsson, Member,

More information

Estimation theory. Parametric estimation. Properties of estimators. Minimum variance estimator. Cramer-Rao bound. Maximum likelihood estimators

Estimation theory. Parametric estimation. Properties of estimators. Minimum variance estimator. Cramer-Rao bound. Maximum likelihood estimators Estimation theory Parametric estimation Properties of estimators Minimum variance estimator Cramer-Rao bound Maximum likelihood estimators Confidence intervals Bayesian estimation 1 Random Variables Let

More information

Lecture 7: Discrete-time Models. Modeling of Physical Systems. Preprocessing Experimental Data.

Lecture 7: Discrete-time Models. Modeling of Physical Systems. Preprocessing Experimental Data. ISS0031 Modeling and Identification Lecture 7: Discrete-time Models. Modeling of Physical Systems. Preprocessing Experimental Data. Aleksei Tepljakov, Ph.D. October 21, 2015 Discrete-time Transfer Functions

More information

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Jeremy S. Conner and Dale E. Seborg Department of Chemical Engineering University of California, Santa Barbara, CA

More information

Identification of Linear Systems

Identification of Linear Systems Identification of Linear Systems Johan Schoukens http://homepages.vub.ac.be/~jschouk Vrije Universiteit Brussel Department INDI /67 Basic goal Built a parametric model for a linear dynamic system from

More information

Cover page. : On-line damage identication using model based orthonormal. functions. Author : Raymond A. de Callafon

Cover page. : On-line damage identication using model based orthonormal. functions. Author : Raymond A. de Callafon Cover page Title : On-line damage identication using model based orthonormal functions Author : Raymond A. de Callafon ABSTRACT In this paper, a new on-line damage identication method is proposed for monitoring

More information

Control Systems Lab - SC4070 System Identification and Linearization

Control Systems Lab - SC4070 System Identification and Linearization Control Systems Lab - SC4070 System Identification and Linearization Dr. Manuel Mazo Jr. Delft Center for Systems and Control (TU Delft) m.mazo@tudelft.nl Tel.:015-2788131 TU Delft, February 13, 2015 (slides

More information

Variations. ECE 6540, Lecture 10 Maximum Likelihood Estimation

Variations. ECE 6540, Lecture 10 Maximum Likelihood Estimation Variations ECE 6540, Lecture 10 Last Time BLUE (Best Linear Unbiased Estimator) Formulation Advantages Disadvantages 2 The BLUE A simplification Assume the estimator is a linear system For a single parameter

More information

14 - Gaussian Stochastic Processes

14 - Gaussian Stochastic Processes 14-1 Gaussian Stochastic Processes S. Lall, Stanford 211.2.24.1 14 - Gaussian Stochastic Processes Linear systems driven by IID noise Evolution of mean and covariance Example: mass-spring system Steady-state

More information

14 th IFAC Symposium on System Identification, Newcastle, Australia, 2006

14 th IFAC Symposium on System Identification, Newcastle, Australia, 2006 14 th IFAC Symposium on System Identification, Newcastle, Australia, 26 LINEAR REGRESSION METHOD FOR ESTIMATING APPROXIMATE NORMALIZED COPRIME PLANT FACTORS M.R. Graham R.A. de Callafon,1 University of

More information

Time Series Analysis

Time Series Analysis Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Input-Output systems The z-transform important issues

More information

FrE09-3. Feedforward Noise Cancellation in an Airduct using Generalized FIR Filter Estimation

FrE09-3. Feedforward Noise Cancellation in an Airduct using Generalized FIR Filter Estimation Proceedings of the 42nd IEEE Conference on Decision and Control M aui, Hawaii USA, December 23 FrE9-3 Feedforward Noise Cancellation in an Airduct using Generalized FIR Filter Estimation Jie Zeng University

More information

SGN Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection

SGN Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection SG 21006 Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection Ioan Tabus Department of Signal Processing Tampere University of Technology Finland 1 / 28

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 3,800 116,000 120M Open access books available International authors and editors Downloads Our

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Statistical signal processing

Statistical signal processing Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable

More information

Matlab software tools for model identification and data analysis 11/12/2015 Prof. Marcello Farina

Matlab software tools for model identification and data analysis 11/12/2015 Prof. Marcello Farina Matlab software tools for model identification and data analysis 11/12/2015 Prof. Marcello Farina Model Identification and Data Analysis (Academic year 2015-2016) Prof. Sergio Bittanti Outline Data generation

More information

9. Model Selection. statistical models. overview of model selection. information criteria. goodness-of-fit measures

9. Model Selection. statistical models. overview of model selection. information criteria. goodness-of-fit measures FE661 - Statistical Methods for Financial Engineering 9. Model Selection Jitkomut Songsiri statistical models overview of model selection information criteria goodness-of-fit measures 9-1 Statistical models

More information

Finite Sample Confidence Regions for Parameters in Prediction Error Identification using Output Error Models

Finite Sample Confidence Regions for Parameters in Prediction Error Identification using Output Error Models Proceedings of the 7th World Congress The International Federation of Automatic Control Finite Sample Confidence Regions for Parameters in Prediction Error Identification using Output Error Models Arnold

More information

DOA Estimation using MUSIC and Root MUSIC Methods

DOA Estimation using MUSIC and Root MUSIC Methods DOA Estimation using MUSIC and Root MUSIC Methods EE602 Statistical signal Processing 4/13/2009 Presented By: Chhavipreet Singh(Y515) Siddharth Sahoo(Y5827447) 2 Table of Contents 1 Introduction... 3 2

More information

LPV system identification using series expansion models

LPV system identification using series expansion models LPV system identification using series expansion models Toth, R.; Heuberger, P.S.C.; Van den Hof, P.M.J. Published in: Linear parameter-varying system identification : new developments and trends DOI:

More information

Frequency-Domain Robust Control Toolbox

Frequency-Domain Robust Control Toolbox Frequency-Domain Robust Control Toolbox Alireza Karimi Abstract A new frequency-domain robust control toolbox is introduced and compared with some features of the robust control toolbox of Matlab. A summary

More information

Modelling Non-linear and Non-stationary Time Series

Modelling Non-linear and Non-stationary Time Series Modelling Non-linear and Non-stationary Time Series Chapter 2: Non-parametric methods Henrik Madsen Advanced Time Series Analysis September 206 Henrik Madsen (02427 Adv. TS Analysis) Lecture Notes September

More information

Quantification of frequency domain error bounds with guaranteed confidence level in Prediction Error Identification

Quantification of frequency domain error bounds with guaranteed confidence level in Prediction Error Identification Quantification of frequency domain error bounds with guaranteed confidence level in Prediction Error Identification X. Bombois (1), B.D.O. Anderson (2), M. Gevers (3) (1) Delft Center for Systems and Control,

More information

Subspace-based Identification

Subspace-based Identification of Infinite-dimensional Multivariable Systems from Frequency-response Data Department of Electrical and Electronics Engineering Anadolu University, Eskişehir, Turkey October 12, 2008 Outline 1 2 3 4 Noise-free

More information

Parametric Inference Maximum Likelihood Inference Exponential Families Expectation Maximization (EM) Bayesian Inference Statistical Decison Theory

Parametric Inference Maximum Likelihood Inference Exponential Families Expectation Maximization (EM) Bayesian Inference Statistical Decison Theory Statistical Inference Parametric Inference Maximum Likelihood Inference Exponential Families Expectation Maximization (EM) Bayesian Inference Statistical Decison Theory IP, José Bioucas Dias, IST, 2007

More information

Analysis of Discrete-Time Systems

Analysis of Discrete-Time Systems TU Berlin Discrete-Time Control Systems 1 Analysis of Discrete-Time Systems Overview Stability Sensitivity and Robustness Controllability, Reachability, Observability, and Detectabiliy TU Berlin Discrete-Time

More information

Rozwiązanie zagadnienia odwrotnego wyznaczania sił obciąŝających konstrukcje w czasie eksploatacji

Rozwiązanie zagadnienia odwrotnego wyznaczania sił obciąŝających konstrukcje w czasie eksploatacji Rozwiązanie zagadnienia odwrotnego wyznaczania sił obciąŝających konstrukcje w czasie eksploatacji Tadeusz Uhl Piotr Czop Krzysztof Mendrok Faculty of Mechanical Engineering and Robotics Department of

More information

Identification of LPV Output-Error and Box-Jenkins Models via Optimal Refined Instrumental Variable Methods

Identification of LPV Output-Error and Box-Jenkins Models via Optimal Refined Instrumental Variable Methods 200 American Control Conference Marriott Waterfront, Baltimore, MD, USA June 30-July 02, 200 ThB2.2 Identification of LPV Output-Error and Box-Jenkins Models via Optimal Refined Instrumental Variable Methods

More information

A New Subspace Identification Method for Open and Closed Loop Data

A New Subspace Identification Method for Open and Closed Loop Data A New Subspace Identification Method for Open and Closed Loop Data Magnus Jansson July 2005 IR S3 SB 0524 IFAC World Congress 2005 ROYAL INSTITUTE OF TECHNOLOGY Department of Signals, Sensors & Systems

More information

EIE6207: Estimation Theory

EIE6207: Estimation Theory EIE6207: Estimation Theory Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: Steven M.

More information

Lecture 9 Infinite Impulse Response Filters

Lecture 9 Infinite Impulse Response Filters Lecture 9 Infinite Impulse Response Filters Outline 9 Infinite Impulse Response Filters 9 First-Order Low-Pass Filter 93 IIR Filter Design 5 93 CT Butterworth filter design 5 93 Bilinear transform 7 9

More information

Lecture 3. G. Cowan. Lecture 3 page 1. Lectures on Statistical Data Analysis

Lecture 3. G. Cowan. Lecture 3 page 1. Lectures on Statistical Data Analysis Lecture 3 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,

More information

System Identification, Lecture 4

System Identification, Lecture 4 System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2012 F, FRI Uppsala University, Information Technology 30 Januari 2012 SI-2012 K. Pelckmans

More information

System Identification, Lecture 4

System Identification, Lecture 4 System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2016 F, FRI Uppsala University, Information Technology 13 April 2016 SI-2016 K. Pelckmans

More information

Introduction. Performance and Robustness (Chapter 1) Advanced Control Systems Spring / 31

Introduction. Performance and Robustness (Chapter 1) Advanced Control Systems Spring / 31 Introduction Classical Control Robust Control u(t) y(t) G u(t) G + y(t) G : nominal model G = G + : plant uncertainty Uncertainty sources : Structured : parametric uncertainty, multimodel uncertainty Unstructured

More information

Statistical Signal Processing Detection, Estimation, and Time Series Analysis

Statistical Signal Processing Detection, Estimation, and Time Series Analysis Statistical Signal Processing Detection, Estimation, and Time Series Analysis Louis L. Scharf University of Colorado at Boulder with Cedric Demeure collaborating on Chapters 10 and 11 A TT ADDISON-WESLEY

More information

Ch.10 Autocorrelated Disturbances (June 15, 2016)

Ch.10 Autocorrelated Disturbances (June 15, 2016) Ch10 Autocorrelated Disturbances (June 15, 2016) In a time-series linear regression model setting, Y t = x tβ + u t, t = 1, 2,, T, (10-1) a common problem is autocorrelation, or serial correlation of the

More information

Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables

Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables Proceedings of the 18th World Congress The International Federation of Automatic Control Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables Younghee Han and Raymond

More information

Linear Approximations of Nonlinear FIR Systems for Separable Input Processes

Linear Approximations of Nonlinear FIR Systems for Separable Input Processes Linear Approximations of Nonlinear FIR Systems for Separable Input Processes Martin Enqvist, Lennart Ljung Division of Automatic Control Department of Electrical Engineering Linköpings universitet, SE-581

More information

Econ 582 Nonparametric Regression

Econ 582 Nonparametric Regression Econ 582 Nonparametric Regression Eric Zivot May 28, 2013 Nonparametric Regression Sofarwehaveonlyconsideredlinearregressionmodels = x 0 β + [ x ]=0 [ x = x] =x 0 β = [ x = x] [ x = x] x = β The assume

More information

Finite-time experiment design with multisines

Finite-time experiment design with multisines Proceedings of the 7th World Congress The International Federation of Automatic Control Seoul, Korea, July 6-, 8 Finite-time experiment design with multisines X. Bombois M. Barenthin P.M.J. Van den Hof

More information

Econometrics II - EXAM Answer each question in separate sheets in three hours

Econometrics II - EXAM Answer each question in separate sheets in three hours Econometrics II - EXAM Answer each question in separate sheets in three hours. Let u and u be jointly Gaussian and independent of z in all the equations. a Investigate the identification of the following

More information

DS-GA 1002 Lecture notes 10 November 23, Linear models

DS-GA 1002 Lecture notes 10 November 23, Linear models DS-GA 2 Lecture notes November 23, 2 Linear functions Linear models A linear model encodes the assumption that two quantities are linearly related. Mathematically, this is characterized using linear functions.

More information

Model Selection and Geometry

Model Selection and Geometry Model Selection and Geometry Pascal Massart Université Paris-Sud, Orsay Leipzig, February Purpose of the talk! Concentration of measure plays a fundamental role in the theory of model selection! Model

More information

Development of Nonlinear Black Box Models using Orthonormal Basis Filters: A Review*

Development of Nonlinear Black Box Models using Orthonormal Basis Filters: A Review* Development of Nonlinear Black Box Models using Orthonormal Basis Filters: A Review* Sachin C. atwardhan Abstract Over the last two decades, there has been a growing interest in the use of orthonormal

More information

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3.0 INTRODUCTION The purpose of this chapter is to introduce estimators shortly. More elaborated courses on System Identification, which are given

More information

Analysis of Discrete-Time Systems

Analysis of Discrete-Time Systems TU Berlin Discrete-Time Control Systems TU Berlin Discrete-Time Control Systems 2 Stability Definitions We define stability first with respect to changes in the initial conditions Analysis of Discrete-Time

More information

System Identification for MPC

System Identification for MPC System Identification for MPC Conflict of Conflux? B. Erik Ydstie, Carnegie Mellon University Course Objectives: 1. The McNamara Program for MPC 2. The Feldbaum Program for MPC 3. From Optimal Control

More information

Estimation, Detection, and Identification CMU 18752

Estimation, Detection, and Identification CMU 18752 Estimation, Detection, and Identification CMU 18752 Graduate Course on the CMU/Portugal ECE PhD Program Spring 2008/2009 Instructor: Prof. Paulo Jorge Oliveira pjcro @ isr.ist.utl.pt Phone: +351 21 8418053

More information

Terminology Suppose we have N observations {x(n)} N 1. Estimators as Random Variables. {x(n)} N 1

Terminology Suppose we have N observations {x(n)} N 1. Estimators as Random Variables. {x(n)} N 1 Estimation Theory Overview Properties Bias, Variance, and Mean Square Error Cramér-Rao lower bound Maximum likelihood Consistency Confidence intervals Properties of the mean estimator Properties of the

More information

Identification for robust deconvolution filtering

Identification for robust deconvolution filtering Identification for robust deconvolution filtering Xavier Bombois, Håkan Hjalmarsson, Gérard Scorletti To cite this version: Xavier Bombois, Håkan Hjalmarsson, Gérard Scorletti. Identification for robust

More information

OPTIMAL DESIGN INPUTS FOR EXPERIMENTAL CHAPTER 17. Organization of chapter in ISSO. Background. Linear models

OPTIMAL DESIGN INPUTS FOR EXPERIMENTAL CHAPTER 17. Organization of chapter in ISSO. Background. Linear models CHAPTER 17 Slides for Introduction to Stochastic Search and Optimization (ISSO)by J. C. Spall OPTIMAL DESIGN FOR EXPERIMENTAL INPUTS Organization of chapter in ISSO Background Motivation Finite sample

More information

Chapter 2. Classical Control System Design. Dutch Institute of Systems and Control

Chapter 2. Classical Control System Design. Dutch Institute of Systems and Control Chapter 2 Classical Control System Design Overview Ch. 2. 2. Classical control system design Introduction Introduction Steady-state Steady-state errors errors Type Type k k systems systems Integral Integral

More information

Statistical Data Analysis Stat 3: p-values, parameter estimation

Statistical Data Analysis Stat 3: p-values, parameter estimation Statistical Data Analysis Stat 3: p-values, parameter estimation London Postgraduate Lectures on Particle Physics; University of London MSci course PH4515 Glen Cowan Physics Department Royal Holloway,

More information

ESTIMATION ALGORITHMS

ESTIMATION ALGORITHMS ESTIMATIO ALGORITHMS Solving normal equations using QR-factorization on-linear optimization Two and multi-stage methods EM algorithm FEL 3201 Estimation Algorithms - 1 SOLVIG ORMAL EQUATIOS USIG QR FACTORIZATIO

More information

Multivariate ARMA Processes

Multivariate ARMA Processes LECTURE 8 Multivariate ARMA Processes A vector y(t) of n elements is said to follow an n-variate ARMA process of orders p and q if it satisfies the equation (1) A 0 y(t) + A 1 y(t 1) + + A p y(t p) = M

More information

Übersetzungshilfe / Translation aid (English) To be returned at the end of the exam!

Übersetzungshilfe / Translation aid (English) To be returned at the end of the exam! Prüfung Regelungstechnik I (Control Systems I) Prof. Dr. Lino Guzzella 5. 2. 2 Übersetzungshilfe / Translation aid (English) To be returned at the end of the exam! Do not mark up this translation aid -

More information

Properties of Open-Loop Controllers

Properties of Open-Loop Controllers Properties of Open-Loop Controllers Sven Laur University of Tarty 1 Basics of Open-Loop Controller Design Two most common tasks in controller design is regulation and signal tracking. Regulating controllers

More information

Likelihood Bounds for Constrained Estimation with Uncertainty

Likelihood Bounds for Constrained Estimation with Uncertainty Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference 5 Seville, Spain, December -5, 5 WeC4. Likelihood Bounds for Constrained Estimation with Uncertainty

More information

Cointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Financial Econometrics / 56

Cointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Financial Econometrics / 56 Cointegrated VAR s Eduardo Rossi University of Pavia November 2013 Rossi Cointegrated VAR s Financial Econometrics - 2013 1 / 56 VAR y t = (y 1t,..., y nt ) is (n 1) vector. y t VAR(p): Φ(L)y t = ɛ t The

More information

Refined Instrumental Variable Methods for Identifying Hammerstein Models Operating in Closed Loop

Refined Instrumental Variable Methods for Identifying Hammerstein Models Operating in Closed Loop Refined Instrumental Variable Methods for Identifying Hammerstein Models Operating in Closed Loop V. Laurain, M. Gilson, H. Garnier Abstract This article presents an instrumental variable method dedicated

More information

Estimation Theory. as Θ = (Θ 1,Θ 2,...,Θ m ) T. An estimator

Estimation Theory. as Θ = (Θ 1,Θ 2,...,Θ m ) T. An estimator Estimation Theory Estimation theory deals with finding numerical values of interesting parameters from given set of data. We start with formulating a family of models that could describe how the data were

More information

Kalman filter using the orthogonality principle

Kalman filter using the orthogonality principle Appendix G Kalman filter using the orthogonality principle This appendix presents derivation steps for obtaining the discrete Kalman filter equations using a method based on the orthogonality principle.

More information

Econ 623 Econometrics II Topic 2: Stationary Time Series

Econ 623 Econometrics II Topic 2: Stationary Time Series 1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the

More information

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS M Gevers,1 L Mišković,2 D Bonvin A Karimi Center for Systems Engineering and Applied Mechanics (CESAME) Université Catholique de Louvain B-1348 Louvain-la-Neuve,

More information

Econometrics I, Estimation

Econometrics I, Estimation Econometrics I, Estimation Department of Economics Stanford University September, 2008 Part I Parameter, Estimator, Estimate A parametric is a feature of the population. An estimator is a function of the

More information

LTI Systems, Additive Noise, and Order Estimation

LTI Systems, Additive Noise, and Order Estimation LTI Systems, Additive oise, and Order Estimation Soosan Beheshti, Munther A. Dahleh Laboratory for Information and Decision Systems Department of Electrical Engineering and Computer Science Massachusetts

More information

Full-State Feedback Design for a Multi-Input System

Full-State Feedback Design for a Multi-Input System Full-State Feedback Design for a Multi-Input System A. Introduction The open-loop system is described by the following state space model. x(t) = Ax(t)+Bu(t), y(t) =Cx(t)+Du(t) () 4 8.5 A =, B =.5.5, C

More information

Design Methods for Control Systems

Design Methods for Control Systems Design Methods for Control Systems Maarten Steinbuch TU/e Gjerrit Meinsma UT Dutch Institute of Systems and Control Winter term 2002-2003 Schedule November 25 MSt December 2 MSt Homework # 1 December 9

More information

Vector Auto-Regressive Models

Vector Auto-Regressive Models Vector Auto-Regressive Models Laurent Ferrara 1 1 University of Paris Nanterre M2 Oct. 2018 Overview of the presentation 1. Vector Auto-Regressions Definition Estimation Testing 2. Impulse responses functions

More information

The problem is to infer on the underlying probability distribution that gives rise to the data S.

The problem is to infer on the underlying probability distribution that gives rise to the data S. Basic Problem of Statistical Inference Assume that we have a set of observations S = { x 1, x 2,..., x N }, xj R n. The problem is to infer on the underlying probability distribution that gives rise to

More information