
Model Validation in Non-Linear Continuous-Discrete Grey-Box Models

Jan Holst, Erik Lindström, Henrik Madsen and Henrik Aalborg Nielsen

Division of Mathematical Statistics, Centre for Mathematical Sciences, Lund Institute of Technology, Lund, Sweden
Informatics and Mathematical Modelling, Technical University of Denmark, Kongens Lyngby, Denmark

General model: Class of models

$$dX(t) = \mu(t, u(t), X(t), \theta)\,dt + \sigma(t, u(t), X(t), \theta)\,dW(t)$$
$$Y(t_k) = h(t_k, u(t_k), X(t_k), \theta) + e_k$$

where

$\mu(t, u(t), X(t), \theta) : [0, T] \times \mathbb{R}^l \times \mathbb{R}^n \times \mathbb{R}^d \to \mathbb{R}^n$
$\sigma(t, u(t), X(t), \theta) : [0, T] \times \mathbb{R}^l \times \mathbb{R}^n \times \mathbb{R}^d \to \mathbb{R}^{n \times m}$
$W(t)$ is an $m$-dimensional Wiener process
$h(t, u(t), X(t), \theta) : [0, T] \times \mathbb{R}^l \times \mathbb{R}^n \times \mathbb{R}^d \to \mathbb{R}^p$
$e_k \sim N(0, S)$, $e_k \in \mathbb{R}^p$

The integrals are interpreted in the sense of Itô.
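As an illustration of this model class, the sketch below simulates a continuous-discrete grey-box model by Euler-Maruyama discretisation of the SDE between sampling instants and adds Gaussian measurement noise at the sampling times. This is not part of the original slides; the drift, diffusion and observation functions at the bottom are hypothetical placeholders, inputs u(t) are omitted for brevity, and the diffusion is assumed to return an array conformable with the state.

```python
import numpy as np

def simulate_cd_greybox(mu, sigma, h, x0, t_obs, theta, S, n_sub=50, rng=None):
    """Simulate dX = mu dt + sigma dW between the observation times t_obs and
    return noisy observations Y(t_k) = h(t_k, X(t_k), theta) + e_k, e_k ~ N(0, S)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    X = [x.copy()]
    for t0, t1 in zip(t_obs[:-1], t_obs[1:]):
        dt = (t1 - t0) / n_sub
        t = t0
        for _ in range(n_sub):                       # Euler-Maruyama sub-steps
            dW = rng.normal(scale=np.sqrt(dt), size=x.shape)
            x = x + mu(t, x, theta) * dt + sigma(t, x, theta) * dW
            t += dt
        X.append(x.copy())
    X = np.array(X)
    e = rng.multivariate_normal(np.zeros(S.shape[0]), S, size=len(t_obs))
    Y = np.array([h(t_k, x_k, theta) for t_k, x_k in zip(t_obs, X)]) + e
    return X, Y

# Hypothetical example: scalar Ornstein-Uhlenbeck state observed directly.
mu_fun    = lambda t, x, th: -th[0] * x
sigma_fun = lambda t, x, th: th[1] * np.ones_like(x)
h_fun     = lambda t, x, th: x
X, Y = simulate_cd_greybox(mu_fun, sigma_fun, h_fun, x0=[1.0],
                           t_obs=np.linspace(0.0, 10.0, 101),
                           theta=(0.5, 0.2), S=np.array([[0.01]]))
```

Euler-Maruyama is only a first-order scheme, so a fairly fine sub-step grid (n_sub) between observations is usually needed.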

Overview

Three tools for model validation in continuous-discrete time grey-box models:
- Dependence identification.
- Model structure identification.
- Generalized Gaussian residuals.

Dependence identification

It is often reasonable, under some regularity conditions, to approximate the system as Gaussian. This means that filter methods can be used to calculate the conditional mean and covariance.

Define the one-step prediction error
$$r_k = \frac{Y_{t_k} - E[Y_{t_k} \mid \mathcal{F}_{k-1}]}{\sqrt{V[Y_{t_k} \mid \mathcal{F}_{k-1}]}},$$
where $\mathcal{F}_{k-1}$ is the information available at time $t_{k-1}$.

Non-equidistantly sampled data are thereby normalized.
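A minimal sketch of how these standardized one-step prediction errors could be formed in practice, assuming some filter (e.g. an extended Kalman filter) has already produced the one-step predicted observation means and variances; the array names are hypothetical and scalar observations are assumed.

```python
import numpy as np

def standardized_residuals(y, y_pred_mean, y_pred_var):
    """r_k = (Y_{t_k} - E[Y_{t_k} | F_{k-1}]) / sqrt(V[Y_{t_k} | F_{k-1}]).

    y           : observed outputs
    y_pred_mean : filter's one-step predicted means  E[Y_{t_k} | F_{k-1}]
    y_pred_var  : filter's one-step predicted variances V[Y_{t_k} | F_{k-1}]
    """
    y = np.asarray(y, dtype=float)
    m = np.asarray(y_pred_mean, dtype=float)
    v = np.asarray(y_pred_var, dtype=float)
    return (y - m) / np.sqrt(v)   # also normalizes non-equidistantly sampled data
```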

Dependence identification, cont.

The squared multiple correlation coefficient is given by
$$\rho^2_{0(1\ldots k)} = \frac{V[Y] - V[Y \mid X_1, \ldots, X_k]}{V[Y]}.$$

Assuming normality and maximum likelihood estimates yields
$$R^2_{0(1\ldots k)} = \frac{SS_0 - SS_{0(1\ldots k)}}{SS_0},$$
where $SS_0 = \sum_i \big(y_i - \tfrac{1}{N}\sum_j y_j\big)^2$ and $SS_{0(1\ldots k)}$ is calculated conditional on $X_1, \ldots, X_k$.

Dependence identification, cont.

$\rho^2$ is very similar to the squared Sample AutoCorrelation Function!

Let $f_k(r) = E[r_n \mid r_{n-k} = r]$ and define the Lag Dependent Function as
$$\mathrm{LDF}(k) = \mathrm{sign}\big(\hat{f}_k(r_{\max}) - \hat{f}_k(r_{\min})\big)\,\sqrt{\big(R^2_{0(k)}\big)^{+}}.$$

The SACF is obtained as a special case of the LDF if a linear model is used to calculate $R^2_{0(k)}$.
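As a rough illustration (not the estimator used by the authors, who rely on more refined non-parametric regression), the sketch below estimates LDF(k) for a residual series using a simple Nadaraya-Watson kernel smoother for the conditional mean and the R² of that fit; the bandwidth rule and function names are assumptions.

```python
import numpy as np

def ldf(r, k, bandwidth=None):
    """Lag Dependent Function at lag k: kernel regression of r[n] on r[n-k],
    R^2 of the fit (truncated at zero), signed by the slope of the fitted
    curve between the smallest and largest lagged residual."""
    r = np.asarray(r, dtype=float)
    y, x = r[k:], r[:-k]                                   # response and lag-k regressor
    h = bandwidth or 1.06 * x.std() * len(x) ** (-1 / 5)   # rule-of-thumb bandwidth

    def f_hat(x0):   # Nadaraya-Watson estimate of E[r_n | r_{n-k} = x0]
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        return np.sum(w * y) / np.sum(w)

    fitted = np.array([f_hat(xi) for xi in x])
    ss0 = np.sum((y - y.mean()) ** 2)
    r2 = max(0.0, (ss0 - np.sum((y - fitted) ** 2)) / ss0)  # (R^2_{0(k)})^+
    return np.sign(f_hat(x.max()) - f_hat(x.min())) * np.sqrt(r2)
```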

Dependence identification, cont.

We can define Partial Lag Dependent Functions in a similar fashion.

The partial squared correlation coefficient is given by
$$\rho^2_{(0k)\mid(1\ldots k-1)} = \frac{V[Y \mid X_1, \ldots, X_{k-1}] - V[Y \mid X_1, \ldots, X_k]}{V[Y \mid X_1, \ldots, X_{k-1}]}.$$

Warning: Defining the Partial Lag Dependent Function for general non-linear autoregressive models
$$X(n) = g\big(X(n-1), \ldots, X(n-k)\big) + \varepsilon(n)$$
is not feasible for large $k$.

Dependence identification, cont.

Confidence intervals are obtained by bootstrap under the hypothesis of an i.i.d. process (for the LDF and PLDF).

The distribution of the LDF will be symmetric about zero. Hence only an upper confidence limit for LDF(k) or PLDF(k) needs to be approximated; the lower limit follows by symmetry.

Confidence intervals will in general be wider than the corresponding intervals for linear models (SACF and/or SPACF).
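A sketch of the bootstrap confidence limit under the i.i.d. hypothesis, reusing the hypothetical ldf function sketched above: resampling the residuals with replacement destroys any serial dependence, so an upper quantile of the resampled |LDF(k)| values approximates the confidence limit.

```python
import numpy as np

def ldf_upper_limit(r, k, n_boot=500, level=0.95, rng=None):
    """Upper confidence limit for LDF(k) under the hypothesis of an i.i.d.
    process; by symmetry the lower limit is its negative."""
    rng = np.random.default_rng(rng)
    r = np.asarray(r, dtype=float)
    stats = [abs(ldf(rng.choice(r, size=len(r), replace=True), k))
             for _ in range(n_boot)]
    return np.quantile(stats, level)
```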

Dependence identification, cont.

Observations from a non-linear MA(1) model:
$$X_t = e_t + 2\cos(e_{t-1}).$$

[Figure: Sample AutoCorrelation Function (left) and Lag Dependent Function (right), plotted against lag.]

Model structure identification

Most physical, chemical or biological systems are thought of as having complex deterministic behaviour and only small state noise.

The system noise can sometimes be interpreted as model deficiency.

Filtering techniques allow us to separate the state noise from the measurement noise.

Idea: Use the estimated elements in the diffusion term to pinpoint model deficiencies.

Model structure identification, cont.

Pseudoalgorithm:
- Test the residuals for dependence, using e.g. the LDF or PLDF.
- Extend the state space, guided by large elements in the diffusion term.
- Pinpoint dependence by non-parametric regression.
- Expand the state space model to model the dependence.
- Repeat until no significant dependence remains.

Model structure identification, cont.

We can test parameter dependence by extending the state space. This can also be generalized to other functional relations $r(t) = g(t, u(t), X(t), \theta)$.

Example: Let $r(t) = \theta^{(j)}$. We test this by extending the state space
$$\begin{pmatrix} dX(t) \\ dr(t) \end{pmatrix} = \begin{pmatrix} \mu(\cdot) \\ 0 \end{pmatrix} dt + \begin{pmatrix} \sigma(\cdot) & 0 \\ 0 & \sigma_r \end{pmatrix} \begin{pmatrix} dW(t) \\ d\widetilde{W}(t) \end{pmatrix}.$$

We expect the added state (a fixed parameter) to be independent of $t$, $u(t)$ and $X(t)$.
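A small sketch of what this state augmentation could look like in code, in the spirit of the simulation sketch earlier: given drift and diffusion functions mu(t, x, theta) and sigma(t, x, theta) (with sigma assumed to return an n x m matrix), it returns the drift and diffusion of the augmented state z = (x, theta^(j)), where the parameter is carried as a zero-drift random-walk state with intensity sigma_r. All names are hypothetical.

```python
import numpy as np

def augment_with_parameter(mu, sigma, j, sigma_r):
    """Drift/diffusion of the augmented state z = (x, theta_j), where the
    j-th parameter is modelled as an extra state with zero drift."""
    def mu_aug(t, z, theta):
        x, theta_j = z[:-1], z[-1]
        th = list(theta); th[j] = theta_j        # the added state replaces theta^(j)
        return np.append(mu(t, x, th), 0.0)      # d(theta_j): zero drift

    def sigma_aug(t, z, theta):
        x, theta_j = z[:-1], z[-1]
        th = list(theta); th[j] = theta_j
        s = np.atleast_2d(sigma(t, x, th))       # n x m diffusion of the original states
        top = np.hstack([s, np.zeros((s.shape[0], 1))])
        bottom = np.hstack([np.zeros((1, s.shape[1])), [[sigma_r]]])
        return np.vstack([top, bottom])          # block-diagonal augmented diffusion

    return mu_aug, sigma_aug
```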

Model structure identification, cont.

Example: Simulated model of a fed-batch bioreactor:
$$dX(t) = \big(\mu(S(t))X(t) - F(t)X(t)/V(t)\big)\,dt + \sigma_{11}\,dW_1(t) \qquad (1)$$
$$dS(t) = \big(-\mu(S(t))X(t)/Y(t) + F(t)(S_F(t) - S(t))/V(t)\big)\,dt + \sigma_{22}\,dW_2(t) \qquad (2)$$
$$dV(t) = F(t)\,dt + \sigma_{33}\,dW_3(t) \qquad (3)$$
where $X(t)$ is the biomass concentration, $S(t)$ is the substrate concentration, $V(t)$ is the volume, $F(t)$ is the feed flow rate, $S_F(t)$ is the feed concentration of substrate and $Y(t)$ is the yield coefficient of biomass.

Model structure identification, cont.

Finally, $\mu(S)$ is the biomass growth rate, given by
$$\mu(S) = \frac{\mu_{\max}\, S}{K_2 S^2 + S + K_1}.$$

Measurement equation:
$$\begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix}_k = \begin{pmatrix} X \\ S \\ V \end{pmatrix}_k + e_k,$$
where $e_k \sim N(0, S)$ and $S = \mathrm{diag}(S_{11}, S_{22}, S_{33})$.

Model structure identification, cont.

Test the model where the growth rate $\mu(S)$ is replaced by an additional state $\mu(t)$, i.e. we extend the state space:
$$dX(t) = \big(\mu(t)X(t) - F(t)X(t)/V(t)\big)\,dt + \sigma_{11}\,dW_1(t) \qquad (1)$$
$$dS(t) = \big(-\mu(t)X(t)/Y(t) + F(t)(S_F(t) - S(t))/V(t)\big)\,dt + \sigma_{22}\,dW_2(t) \qquad (2)$$
$$dV(t) = F(t)\,dt + \sigma_{33}\,dW_3(t) \qquad (3)$$
$$d\mu(t) = 0\,dt + \sigma_{44}\,dW_4(t) \qquad (4)$$

The parameters and states are estimated using an Extended Kalman filter.

Model structure identification, cont.

Parameter estimates (Quasi Maximum Likelihood):

[Table: for each parameter, the true value, the estimate, its standard deviation, the t-score and whether the parameter is significant.]

Model structure identification, cont.

Partial dependence plots of $\hat{\mu}_{k|k}$ vs. $\hat{X}_{k|k}$ and of $\hat{\mu}_{k|k}$ vs. $\hat{S}_{k|k}$.

[Figure: partial dependence plots of the filtered growth-rate state against the filtered biomass and substrate states.]

Generalized Gaussian residuals

Univariate, but state dependence in the diffusion term is no theoretical problem:
$$dX(t) = \mu(t, u(t), X(t), \theta)\,dt + \sigma(t, u(t), X(t), \theta)\,dW(t),$$
$$Y(t_k) = X(t_k),$$
where
$\mu(t, u(t), X(t), \theta) : [0, T] \times \mathbb{R}^l \times \mathbb{R} \times \mathbb{R}^d \to \mathbb{R}$
$\sigma(t, u(t), X(t), \theta) : [0, T] \times \mathbb{R}^l \times \mathbb{R} \times \mathbb{R}^d \to \mathbb{R}$
$dW(t)$ is the increment of a Wiener process.

Generalized Gaussian residuals, cont.

How do we define residuals?

Discrete time models:
$$X(n+1) = f\big(X(n), \theta\big) + g\big(X(n), \theta\big)\,\varepsilon(n+1).$$
The i.i.d. sequence $\{\varepsilon(n)\}$ is the natural choice.

Continuous time models:
$$X(n+1) = X(n) + \int_{t_n}^{t_{n+1}} \mu\big(s, u(s), X(s), \theta\big)\,ds + \int_{t_n}^{t_{n+1}} \sigma\big(s, u(s), X(s), \theta\big)\,dW(s).$$
No natural, general choice!

Generalized Gaussian residuals, cont.

We follow the approach taken in (Pedersen, 1994) but define Generalized Gaussian Residuals. The reason is threefold:
- Uncorrelated Gaussian random variables are independent!
- Better outlier detection.
- More powerful distributional tests.

Generalized Gaussian residuals, cont.

Pseudoalgorithm: Define
$$x = F^{-1}_{X(n) \mid X(n-1)}(y) = \inf\{x : F_{X(n) \mid X(n-1)}(x) \ge y\}$$
as the generalized inverse. It follows that
$$U(n) = F_{X(n) \mid X(n-1)}\big(X(n)\big) \sim U(0, 1).$$
$U(n)$ are the generalized (uniform) residuals introduced by (Pedersen, 1994).
$$Y(n) = \Phi^{-1}\big(U(n)\big) \sim N(0, 1).$$
This is the inverse method! We will define the sequence $\{Y(n)\}$ as the Generalized Gaussian Residuals.
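A sketch of this transformation in code, assuming a callable that evaluates the conditional (one-step transition) CDF, for instance obtained from the numerical Fokker-Planck solution discussed on the following slides; the clipping step guards against values of exactly 0 or 1 in finite-precision tails, where Φ⁻¹ would return ±∞.

```python
import numpy as np
from scipy.stats import norm

def generalized_gaussian_residuals(x, cond_cdf):
    """Y(n) = Phi^{-1}( F_{X(n)|X(n-1)}(X(n)) ), n = 1, ..., N.

    x        : observed sample path x[0], ..., x[N]
    cond_cdf : callable cond_cdf(x_next, x_prev) evaluating the one-step
               transition CDF F_{X(n)|X(n-1)}.
    """
    x = np.asarray(x, dtype=float)
    u = np.array([cond_cdf(x[n], x[n - 1]) for n in range(1, len(x))])  # uniform (Pedersen) residuals
    u = np.clip(u, 1e-12, 1.0 - 1e-12)   # avoid Phi^{-1}(0) = -inf and Phi^{-1}(1) = +inf
    return norm.ppf(u)                    # Generalized Gaussian Residuals
```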

Algorithm, cont.

Calculation of $F_{X(n) \mid X(n-1)}(x)$ can be done by using:
- Monte Carlo techniques.
- Solving a partial differential equation (Fokker-Planck).
- Other approaches, such as binomial trees, path integration, etc.

Algorithm, cont.

It turns out that simulation is not well suited for this problem, as
$$\hat{U}(n) = P\big(X(n) \le x(n) \mid X(n-1)\big) = \frac{1}{N}\sum_{j=1}^{N} 1_{\{X^{(j)}(n) \le x(n)\}},$$
which is a discrete approximation of the distribution function with low accuracy in the tails.

Applying $\Phi^{-1}$ will amplify the errors, leading to a bad approximation (possibly $\Phi^{-1}(0) = -\infty$).

This problem is magnified by heavy tails, cf. Jarque-Bera tests.

Algorithm, cont.

The Fokker-Planck (or Kolmogorov forward) equation for a diffusion
$$dX(t) = \mu(t, u(t), X(t), \theta)\,dt + \sigma(t, u(t), X(t), \theta)\,dW(t)$$
is given by
$$\frac{\partial p}{\partial t} = \mathcal{A}^{*} p(x_{t_{i-1}}, t_{i-1}; x_{t_i}, t_i),$$
where the operator $\mathcal{A}^{*}$ is defined as
$$\mathcal{A}^{*} f = -\frac{\partial}{\partial x}\big(\mu(t, u(t), x, \theta) f\big) + \frac{1}{2}\frac{\partial^2}{\partial x^2}\big(\sigma^2(t, u(t), x, \theta) f\big).$$

The equation is solved by
- finite difference approximation of the derivatives,
- Padé approximation of matrix exponentials.
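A minimal sketch of this approach for a scalar diffusion: the forward operator is discretised with central differences on a grid, the density is propagated over one sampling interval with a matrix exponential (scipy.linalg.expm internally uses a Padé approximation), and the transition CDF is obtained by a simple cumulative quadrature, a crude stand-in for the quadrature rule mentioned on the next slide. Grid choice, boundary handling and function names are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import expm

def transition_cdf_fokker_planck(mu, sigma, theta, x_prev, dt, x_grid):
    """Approximate F_{X(n)|X(n-1)=x_prev}(x) on x_grid by solving
    dp/dt = -d/dx(mu p) + 0.5 d^2/dx^2(sigma^2 p) with finite differences."""
    n = len(x_grid)
    dx = x_grid[1] - x_grid[0]
    m = mu(x_grid, theta)                       # drift evaluated on the grid
    d = 0.5 * sigma(x_grid, theta) ** 2         # half squared diffusion on the grid

    A = np.zeros((n, n))                        # discretised forward operator
    for i in range(1, n - 1):                   # central differences, absorbing boundaries
        A[i, i - 1] =  m[i - 1] / (2 * dx) + d[i - 1] / dx**2
        A[i, i]     = -2 * d[i] / dx**2
        A[i, i + 1] = -m[i + 1] / (2 * dx) + d[i + 1] / dx**2

    p0 = np.zeros(n)                            # approximate point mass at x_prev
    p0[np.argmin(np.abs(x_grid - x_prev))] = 1.0 / dx
    p = expm(A * dt) @ p0                       # propagate the density over one sampling interval
    cdf = np.cumsum(p) * dx                     # cumulative quadrature of the density
    return np.clip(cdf / cdf[-1], 0.0, 1.0)
```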

Algorithm, cont.

Finally,
$$F_{X(n) \mid X(n-1)}(x) = \int_{-\infty}^{x} p(s, x_s; t, y)\,dy$$
is calculated by a Gauss-Cotes method of the same order of accuracy as the finite difference approximation.

The calculation of the Generalized Gaussian residuals is closely connected to Maximum Likelihood estimation. It is therefore suggested that the model is identified by
- estimating the parameters in an approximative discrete time model, and
- using these estimates as initial values for the Maximum Likelihood estimator.

Example, Cox-Ingersoll-Ross

Model:
$$dr_t = 0.17\,(0.05 - r_t)\,dt + \sigma\sqrt{r_t}\,dW_t.$$
500 observations, $\Delta t = 1$. (Simulated data.)

[Figure: normal probability plot of the residuals and their autocorrelation against lag.]

Example, Cox-Ingersoll-Ross

Generalized Gaussian residuals.

[Figure: normal probability plot of the Generalized Gaussian residuals and their autocorrelation against lag.]
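For the Cox-Ingersoll-Ross model the transition law is known in closed form (a scaled non-central χ² distribution), so the Generalized Gaussian residuals can be computed directly without a numerical Fokker-Planck solution. The sketch below assumes the parameterisation dr_t = κ(θ − r_t)dt + σ√r_t dW_t; the diffusion coefficient used in the slide's simulation is not reproduced here.

```python
import numpy as np
from scipy.stats import ncx2, norm

def cir_gaussian_residuals(r, dt, kappa, theta, sigma):
    """Generalized Gaussian residuals for dr = kappa*(theta - r) dt + sigma*sqrt(r) dW.

    Uses the closed-form transition: 2*c*r_t | r_s ~ ncx2(df, nc) with
    c  = 2*kappa / (sigma**2 * (1 - exp(-kappa*dt))),
    df = 4*kappa*theta / sigma**2,
    nc = 2*c*r_s*exp(-kappa*dt).
    """
    r = np.asarray(r, dtype=float)
    c = 2.0 * kappa / (sigma**2 * (1.0 - np.exp(-kappa * dt)))
    df = 4.0 * kappa * theta / sigma**2
    nc = 2.0 * c * r[:-1] * np.exp(-kappa * dt)
    u = ncx2.cdf(2.0 * c * r[1:], df, nc)        # uniform (Pedersen) residuals
    u = np.clip(u, 1e-12, 1.0 - 1e-12)
    return norm.ppf(u)                            # Generalized Gaussian residuals
```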

Example, CKLS

Model:
$$dr_t = (\alpha + \beta r_t)\,dt + \sigma r_t^{\gamma}\,dW_t.$$
Weekly observations, US 3-month T-bill.

[Figure: the weekly US 3-month T-bill rate, Feb 1983 - Aug 1995.]

Example, CKLS

Generalized Gaussian residuals. Parameters estimated using an approximative Maximum Likelihood estimator.

[Figure: the Generalized Gaussian residuals over time, their normal probability plot and their autocorrelation against lag.]

Summary

We have presented three tools:
- Lag Dependent Functions to identify dependence.
- Non-parametric regression and the estimated state diffusion term to identify model deficiencies.
- Generalized Gaussian Residuals as a definition of residuals with which the true residuals can be examined.
