Fast computation of reconciled forecasts in hierarchical and grouped time series


1 Rob J Hyndman

Fast computation of reconciled forecasts in hierarchical and grouped time series

[Title-slide diagram: a two-level hierarchy — Total; A, B, C; AX, AY, AZ, BX, BY, BZ, CX, CY, CZ]

2 Outline

1 Optimally reconciled forecasts
2 Fast computation
3 hts package for R
4 References

5 Hierarchical data

[Diagram: Total → A, B, C → AX, AY, AZ, BX, BY, BZ, CX, CY, CZ]

$$
Y_t =
\begin{pmatrix} Y_t \\ Y_{A,t} \\ Y_{B,t} \\ Y_{C,t} \\ Y_{AX,t} \\ Y_{AY,t} \\ \vdots \\ Y_{CZ,t} \end{pmatrix}
=
\underbrace{\begin{pmatrix} \mathbf{1}'_9 \\ I_3 \otimes \mathbf{1}'_3 \\ I_9 \end{pmatrix}}_{S}
\underbrace{\begin{pmatrix} Y_{AX,t} \\ Y_{AY,t} \\ \vdots \\ Y_{CZ,t} \end{pmatrix}}_{B_t}
$$

That is, $Y_t = S B_t$, where $B_t$ contains the bottom-level series and the summing matrix $S$ encodes the aggregation constraints.
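As a concrete illustration (not from the slides), the summing matrix for this 3 × 3 hierarchy can be built and checked numerically; the bottom-level values below are made up:

```python
import numpy as np

# Illustrative sketch: build S for the hierarchy Total -> A,B,C -> AX,...,CZ
# and verify Y_t = S B_t. The bottom-level values are made up.
m = 9
top = np.ones((1, m))                      # Total sums all bottom-level series
mid = np.kron(np.eye(3), np.ones((1, 3)))  # A, B, C each sum their 3 children
S = np.vstack([top, mid, np.eye(m)])       # 13 x 9 summing matrix

B_t = np.arange(1.0, 10.0)                 # example bottom-level observations
Y_t = S @ B_t
print(Y_t[0])    # Total: 45.0
print(Y_t[1:4])  # A, B, C: [ 6. 15. 24.]
```

Every vector of the form $S B_t$ automatically satisfies the aggregation constraints, which is what reconciliation exploits.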

8 Grouped data

[Diagram: two crossed groupings of the same bottom-level series — Total → A, B, C and Total → X, Y, Z, with bottom level AX, AY, AZ, BX, BY, BZ, CX, CY, CZ]

$$
Y_t =
\begin{pmatrix} Y_t \\ Y_{A,t} \\ Y_{B,t} \\ Y_{C,t} \\ Y_{X,t} \\ Y_{Y,t} \\ Y_{Z,t} \\ Y_{AX,t} \\ \vdots \\ Y_{CZ,t} \end{pmatrix}
= S \underbrace{\begin{pmatrix} Y_{AX,t} \\ Y_{AY,t} \\ \vdots \\ Y_{CZ,t} \end{pmatrix}}_{B_t}
$$

So again $Y_t = S B_t$.

9 Forecasts

Key idea: forecast reconciliation
- Ignore the structural constraints and forecast every series of interest independently.
- Then adjust the forecasts to impose the constraints.

Let $\hat{Y}_n(h)$ be the vector of initial $h$-step forecasts, made at time $n$, stacked in the same order as $Y_t$. Since $Y_t = S B_t$, we can write
$$ \hat{Y}_n(h) = S \beta_n(h) + \varepsilon_h, $$
where $\beta_n(h) = \mathrm{E}[B_{n+h} \mid Y_1, \dots, Y_n]$ and $\varepsilon_h$ has zero mean. Estimate $\beta_n(h)$ by weighted least squares.


18 Optimal reconciled forecasts

$$ \tilde{Y}_n(h) = S \hat{\beta}_n(h) = S (S' \Lambda S)^{-1} S' \Lambda \hat{Y}_n(h) $$

Here $\tilde{Y}_n(h)$ are the revised (reconciled) forecasts and $\hat{Y}_n(h)$ are the initial forecasts.

- $\Lambda$ contains the weights (usually the inverses of the one-step forecast variances, but any choice is possible).
- Easy to estimate, and places more weight where we have the best forecasts.
- Ignores covariances.
- Still difficult to compute for large numbers of time series: we need to do the calculation without explicitly forming $S$, $(S'\Lambda S)^{-1}$ or $S'\Lambda$.

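To make the formula concrete, here is a small numerical sketch (mine, not from the slides) of the reconciliation step for a toy two-node hierarchy, with made-up forecasts and weights:

```python
import numpy as np

# Sketch of reconciliation Y~ = S (S'ΛS)^{-1} S'Λ Ŷ for Total -> A, B.
# Forecasts and weights are illustrative, not from the slides.
S = np.array([[1.0, 1.0],   # Total
              [1.0, 0.0],   # A
              [0.0, 1.0]])  # B

y_hat = np.array([101.0, 40.0, 58.0])  # incoherent base forecasts: 40 + 58 != 101
Lam = np.diag([0.25, 1.0, 1.0])        # e.g. inverse one-step forecast variances

beta = np.linalg.solve(S.T @ Lam @ S, S.T @ Lam @ y_hat)
y_tilde = S @ beta                     # revised (reconciled) forecasts

print(y_tilde)  # [99.  40.5 58.5] -- now 40.5 + 58.5 == 99
```

Because the Total forecast is given a smaller weight here, most of the adjustment falls on it rather than on A and B.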

23 Outline

1 Optimally reconciled forecasts
2 Fast computation
3 hts package for R
4 References

24 Fast computation: hierarchical data

Recall the hierarchy Total → A, B, C → AX, …, CZ, written as $Y_t = S B_t$ with $B_t$ the vector of bottom-level series.

25 Fast computation: hierarchies

Think of the hierarchy as a tree of trees:

    Total
    T_1  T_2  ...  T_K

Then the summing matrix contains K smaller summing matrices:
$$
S = \begin{pmatrix}
\mathbf{1}'_{n_1} & \mathbf{1}'_{n_2} & \cdots & \mathbf{1}'_{n_K} \\
S_1 & & & \\
& S_2 & & \\
& & \ddots & \\
& & & S_K
\end{pmatrix}
$$
where $\mathbf{1}_n$ is an $n$-vector of ones and tree $T_i$ has $n_i$ terminal nodes.

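The block structure suggests assembling $S$ recursively from the subtree summing matrices. A minimal sketch (the helper names are mine, not from the hts package):

```python
import numpy as np

# Assemble a hierarchy's summing matrix from its subtrees' summing matrices:
# the top row sums all leaves, below sits a block diagonal of the S_k.
# Function names are illustrative, not from the hts package.
def block_diag(mats):
    rows = sum(M.shape[0] for M in mats)
    cols = sum(M.shape[1] for M in mats)
    out = np.zeros((rows, cols))
    i = j = 0
    for M in mats:
        out[i:i + M.shape[0], j:j + M.shape[1]] = M
        i += M.shape[0]
        j += M.shape[1]
    return out

def combine(subtrees):
    n_leaves = sum(Sk.shape[1] for Sk in subtrees)
    return np.vstack([np.ones((1, n_leaves)), block_diag(subtrees)])

leaf = np.eye(1)                    # a single series is its own (trivial) tree
S_A = combine([leaf, leaf, leaf])   # subtree A with leaves AX, AY, AZ
S_B = combine([leaf, leaf])         # subtree B with leaves BX, BY
S = combine([S_A, S_B])             # full hierarchy: Total over A and B

print(S.shape)  # (8, 5): rows Total, A, AX, AY, AZ, B, BX, BY over 5 leaves
```

Note this recursion orders rows subtree-by-subtree; the ordering used by hts may differ, but the same recursive construction applies.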

27 Fast computation: hierarchies

$$
S'\Lambda S =
\begin{pmatrix}
S_1'\Lambda_1 S_1 & & \\
& \ddots & \\
& & S_K'\Lambda_K S_K
\end{pmatrix}
+ \lambda_0 J_n
$$

- $\lambda_0$ is the top-left element of $\Lambda$;
- $\Lambda_k$ is the block of $\Lambda$ corresponding to tree $T_k$;
- $J_n$ is an $n \times n$ matrix of ones, with $n = \sum_k n_k$.

Now apply the Sherman–Morrison formula…


29 Fast computation: hierarchies

Applying Sherman–Morrison:
$$
(S'\Lambda S)^{-1} =
\begin{pmatrix}
(S_1'\Lambda_1 S_1)^{-1} & & \\
& \ddots & \\
& & (S_K'\Lambda_K S_K)^{-1}
\end{pmatrix}
- c\, S_0
$$

- $S_0$ can be partitioned into $K^2$ blocks, with the $(k,\ell)$ block (of dimension $n_k \times n_\ell$) being $(S_k'\Lambda_k S_k)^{-1} J_{n_k,n_\ell} (S_\ell'\Lambda_\ell S_\ell)^{-1}$, where $J_{n_k,n_\ell}$ is an $n_k \times n_\ell$ matrix of ones;
- $c^{-1} = \lambda_0^{-1} + \sum_k \mathbf{1}'_{n_k} (S_k'\Lambda_k S_k)^{-1} \mathbf{1}_{n_k}$.

Each $S_k'\Lambda_k S_k$ can be inverted in the same way, and $S'\Lambda Y$ can also be computed recursively. The recursive calculations can be done in such a way that we never store any of the large matrices involved.


31 Fast computation: grouped data

Recall the grouped structure with row groups A, B, C and column groups X, Y, Z, written as $Y_t = S B_t$.

32 Fast computation: grouped data

$$
S = \begin{pmatrix}
\mathbf{1}'_m \otimes \mathbf{1}'_n \\
\mathbf{1}'_m \otimes I_n \\
I_m \otimes \mathbf{1}'_n \\
I_m \otimes I_n
\end{pmatrix}
\qquad
\begin{aligned}
m &= \text{number of rows} \\
n &= \text{number of columns}
\end{aligned}
$$

$$
S'\Lambda S = \lambda_{00} J_{mn} + (\Lambda_R \otimes J_n) + (J_m \otimes \Lambda_C) + \Lambda_U
$$

- $\Lambda_R$, $\Lambda_C$ and $\Lambda_U$ are the diagonal blocks of $\Lambda$ corresponding to the rows, columns and unaggregated (bottom-level) series;
- $\lambda_{00}$ corresponds to the aggregate of all series.

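The Kronecker decomposition of $S'\Lambda S$ above can be verified numerically; this sketch (mine, with $m = 2$, $n = 3$ and made-up weights) builds $S$ from Kronecker products and checks the identity:

```python
import numpy as np

# Build the grouped summing matrix from Kronecker products (m = 2, n = 3)
# and check S'ΛS = λ00 J_mn + Λ_R ⊗ J_n + J_m ⊗ Λ_C + Λ_U. Weights are made up.
m, n = 2, 3
om, on = np.ones((1, m)), np.ones((1, n))
S = np.vstack([np.kron(om, on),                 # overall total
               np.kron(om, np.eye(n)),          # column aggregates X, Y, Z
               np.kron(np.eye(m), on),          # row aggregates A, B
               np.eye(m * n)])                  # bottom level

rng = np.random.default_rng(0)
w = rng.random(1 + n + m + m * n) + 0.1         # positive diagonal weights
lam00 = w[0]
Lam_C = np.diag(w[1:1 + n])
Lam_R = np.diag(w[1 + n:1 + n + m])
Lam_U = np.diag(w[1 + n + m:])

lhs = S.T @ np.diag(w) @ S
rhs = (lam00 * np.ones((m * n, m * n))
       + np.kron(Lam_R, np.ones((n, n)))
       + np.kron(np.ones((m, m)), Lam_C)
       + Lam_U)
print(np.allclose(lhs, rhs))  # True
```

The identity holds exactly by the Kronecker mixed-product rule, so the check passes for any positive weights.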

34 Fast computation: grouped data

Applying Sherman–Morrison again:
$$
(S'\Lambda S)^{-1} = A - \frac{A \mathbf{1}_{mn} \mathbf{1}'_{mn} A}{\lambda_{00}^{-1} + \mathbf{1}'_{mn} A \mathbf{1}_{mn}},
\qquad
A = \Lambda_U^{-1} - \Lambda_U^{-1} (J_m \otimes D) \Lambda_U^{-1} - E M^{-1} E'.
$$

- $D$ is diagonal with elements $d_j = \lambda_{0j} \big/ \big(1 + \lambda_{0j} \sum_i \lambda_{ij}^{-1}\big)$;
- $E$ has $m \times m$ blocks, where the $(i,j)$ block $e_{ij}$ has $k$th element
$$
(e_{ij})_k =
\begin{cases}
\lambda_{i0}^{1/2} \lambda_{ik}^{-1} - \lambda_{i0}^{1/2} \lambda_{ik}^{-2} d_k, & i = j, \\
-\lambda_{j0}^{1/2} \lambda_{ik}^{-1} \lambda_{jk}^{-1} d_k, & i \neq j;
\end{cases}
$$
- $M$ is $m \times m$ with $(i,j)$ element
$$
(M)_{ij} =
\begin{cases}
1 + \lambda_{i0} \sum_k \lambda_{ik}^{-1} - \lambda_{i0} \sum_k \lambda_{ik}^{-2} d_k, & i = j, \\
-\lambda_{i0}^{1/2} \lambda_{j0}^{1/2} \sum_k \lambda_{ik}^{-1} \lambda_{jk}^{-1} d_k, & i \neq j.
\end{cases}
$$

Again, the calculations can be done in such a way that we never store any of the large matrices involved.

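The rank-one (Sherman–Morrison) step in this formula can be checked directly; a sketch (mine, with made-up weights) for a small grouped structure:

```python
import numpy as np

# Check (S'ΛS)^{-1} = A - A 1 1' A / (1/λ00 + 1' A 1) for grouped data,
# where A = (Λ_R ⊗ J_n + J_m ⊗ Λ_C + Λ_U)^{-1}. Weights are made up;
# here A is inverted directly purely for the check.
m, n = 2, 3
om, on = np.ones((1, m)), np.ones((1, n))
S = np.vstack([np.kron(om, on),
               np.kron(om, np.eye(n)),
               np.kron(np.eye(m), on),
               np.eye(m * n)])

rng = np.random.default_rng(2)
w = rng.random(1 + n + m + m * n) + 0.1
lam00 = w[0]
Lam_C = np.diag(w[1:1 + n])
Lam_R = np.diag(w[1 + n:1 + n + m])
Lam_U = np.diag(w[1 + n + m:])

B = (np.kron(Lam_R, np.ones((n, n)))
     + np.kron(np.ones((m, m)), Lam_C)
     + Lam_U)
A = np.linalg.inv(B)
one = np.ones(m * n)
u = A @ one
inv_sm = A - np.outer(u, u) / (1 / lam00 + one @ u)

print(np.allclose(inv_sm, np.linalg.inv(S.T @ np.diag(w) @ S)))  # True
```

The closed form for $A$ on the slide then removes the remaining explicit inverse, so nothing larger than $m \times m$ is ever factorised.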

36 Fast computation

When the time series are not strictly hierarchical and have more than two grouping variables:
- Use sparse matrix storage and arithmetic.
- Use an iterative approximation for inverting large sparse matrices (Paige & Saunders, 1982, ACM Trans. Math. Software).

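For structures too large or irregular for the closed-form recursions, the normal equations can be solved matrix-free. The slides point to the Paige & Saunders (1982) algorithm (LSQR); the sketch below (mine) instead uses plain conjugate gradients on the normal equations to show the same idea — only matrix–vector products with $S$ are needed, and $S'\Lambda S$ is never formed:

```python
import numpy as np

# Matrix-free solve of (S'ΛS) β = S'Λ ŷ by conjugate gradients.
# Only products with S and S' are used; S'ΛS is never formed.
# Toy hierarchy and forecasts are made up; the slides use LSQR instead.
S = np.vstack([np.ones((1, 9)),
               np.kron(np.eye(3), np.ones((1, 3))),
               np.eye(9)])                       # 13 x 9 summing matrix
lam = np.ones(13)                                # illustrative weights
y_hat = S @ np.arange(1.0, 10.0) + 0.01 * np.sin(np.arange(13.0))

matvec = lambda b: S.T @ (lam * (S @ b))         # b -> (S'ΛS) b, matrix-free
rhs = S.T @ (lam * y_hat)

beta = np.zeros(9)
r = rhs - matvec(beta)
p = r.copy()
for _ in range(50):                              # CG iterations
    Ap = matvec(p)
    alpha = (r @ r) / (p @ Ap)
    beta = beta + alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-12:
        break
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

y_tilde = S @ beta                               # reconciled forecasts are coherent
print(np.isclose(y_tilde[0], y_tilde[1:4].sum()))  # True
```

With sparse storage for $S$, each iteration costs only the number of nonzeros of $S$, which is what makes very large hierarchies tractable.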

39 Outline

1 Optimally reconciled forecasts
2 Fast computation
3 hts package for R
4 References

40 hts package for R

hts: Hierarchical and grouped time series
Methods for analysing and forecasting hierarchical and grouped time series.

Version: 4.3
Depends: forecast (≥ 5.0)
Imports: SparseM, parallel, utils
Author: Rob J Hyndman, Earo Wang and Alan Lee
Maintainer: Rob J Hyndman <Rob.Hyndman at monash.edu>
License: GPL (≥ 2)


42 Example using R

library(hts)
# bts is a matrix containing the bottom-level time series
# nodes describes the hierarchical structure
y <- hts(bts, nodes=list(2, c(3,2)))

[Diagram: Total → A, B; A → AX, AY, AZ; B → BX, BY]

43 Example using R

library(hts)
# bts is a matrix containing the bottom level time series
# nodes describes the hierarchical structure
y <- hts(bts, nodes=list(2, c(3,2)))

# Forecast 10-step-ahead using WLS combination method
# ETS used for each series by default
fc <- forecast(y, h=10)

44 forecast.gts function

Usage:
forecast(object, h,
         method = c("comb", "bu", "mo", "tdgsf", "tdgsa", "tdfp"),
         fmethod = c("ets", "rw", "arima"),
         weights = c("sd", "none", "nseries"),
         positive = FALSE, parallel = FALSE, num.cores = 2, ...)

Arguments:
- object: hierarchical time series object of class gts
- h: forecast horizon
- method: method for distributing forecasts within the hierarchy
- fmethod: forecasting method to use
- weights: weights used for the "optimal combination" method; weights = "sd" takes account of the standard deviation of forecasts
- positive: if TRUE, forecasts are forced to be strictly positive
- parallel: if TRUE, allow parallel processing
- num.cores: if parallel = TRUE, how many cores to use

45 Outline

1 Optimally reconciled forecasts
2 Fast computation
3 hts package for R
4 References


47 References

- RJ Hyndman, RA Ahmed, G Athanasopoulos, and HL Shang (2011). Optimal combination forecasts for hierarchical time series. Computational Statistics & Data Analysis 55(9).
- RJ Hyndman, AJ Lee, and E Wang (2014). Fast computation of reconciled forecasts for hierarchical and grouped time series. Working paper 17/14, Department of Econometrics & Business Statistics, Monash University.
- RJ Hyndman, AJ Lee, and E Wang (2014). hts: Hierarchical and grouped time series. cran.r-project.org/package=hts.
- RJ Hyndman and G Athanasopoulos (2014). Forecasting: principles and practice. OTexts. OTexts.org/fpp/.

Papers and R code: robjhyndman.com
Email: Rob.Hyndman@monash.edu


More information

Chapter 3. Vector spaces

Chapter 3. Vector spaces Chapter 3. Vector spaces Lecture notes for MA1111 P. Karageorgis pete@maths.tcd.ie 1/22 Linear combinations Suppose that v 1,v 2,...,v n and v are vectors in R m. Definition 3.1 Linear combination We say

More information

Topic 7 - Matrix Approach to Simple Linear Regression. Outline. Matrix. Matrix. Review of Matrices. Regression model in matrix form

Topic 7 - Matrix Approach to Simple Linear Regression. Outline. Matrix. Matrix. Review of Matrices. Regression model in matrix form Topic 7 - Matrix Approach to Simple Linear Regression Review of Matrices Outline Regression model in matrix form - Fall 03 Calculations using matrices Topic 7 Matrix Collection of elements arranged in

More information

CS412: Lecture #17. Mridul Aanjaneya. March 19, 2015

CS412: Lecture #17. Mridul Aanjaneya. March 19, 2015 CS: Lecture #7 Mridul Aanjaneya March 9, 5 Solving linear systems of equations Consider a lower triangular matrix L: l l l L = l 3 l 3 l 33 l n l nn A procedure similar to that for upper triangular systems

More information

Forecasting hierarchical and grouped time series through trace minimization

Forecasting hierarchical and grouped time series through trace minimization ISSN 1440-771X Department of Econometrics and Business Statistics http://business.monash.edu/econometrics-and-business-statistics/research/publications Forecasting hierarchical and grouped time series

More information

S N. hochdimensionaler Lyapunov- und Sylvestergleichungen. Peter Benner. Mathematik in Industrie und Technik Fakultät für Mathematik TU Chemnitz

S N. hochdimensionaler Lyapunov- und Sylvestergleichungen. Peter Benner. Mathematik in Industrie und Technik Fakultät für Mathematik TU Chemnitz Ansätze zur numerischen Lösung hochdimensionaler Lyapunov- und Sylvestergleichungen Peter Benner Mathematik in Industrie und Technik Fakultät für Mathematik TU Chemnitz S N SIMULATION www.tu-chemnitz.de/~benner

More information

Ma 3/103: Lecture 25 Linear Regression II: Hypothesis Testing and ANOVA

Ma 3/103: Lecture 25 Linear Regression II: Hypothesis Testing and ANOVA Ma 3/103: Lecture 25 Linear Regression II: Hypothesis Testing and ANOVA March 6, 2017 KC Border Linear Regression II March 6, 2017 1 / 44 1 OLS estimator 2 Restricted regression 3 Errors in variables 4

More information

Local linear forecasts using cubic smoothing splines

Local linear forecasts using cubic smoothing splines Local linear forecasts using cubic smoothing splines Rob J. Hyndman, Maxwell L. King, Ivet Pitrun, Baki Billah 13 January 2004 Abstract: We show how cubic smoothing splines fitted to univariate time series

More information

ARMA (and ARIMA) models are often expressed in backshift notation.

ARMA (and ARIMA) models are often expressed in backshift notation. Backshift Notation ARMA (and ARIMA) models are often expressed in backshift notation. B is the backshift operator (also called the lag operator ). It operates on time series, and means back up by one time

More information

Coskewness and Cokurtosis John W. Fowler July 9, 2005

Coskewness and Cokurtosis John W. Fowler July 9, 2005 Coskewness and Cokurtosis John W. Fowler July 9, 2005 The concept of a covariance matrix can be extended to higher moments; of particular interest are the third and fourth moments. A very common application

More information

Package Tcomp. June 6, 2018

Package Tcomp. June 6, 2018 Package Tcomp June 6, 2018 Title Data from the 2010 Tourism Forecasting Competition Version 1.0.1 The 1311 time series from the tourism forecasting competition conducted in 2010 and described in Athanasopoulos

More information

Methods for sparse analysis of high-dimensional data, II

Methods for sparse analysis of high-dimensional data, II Methods for sparse analysis of high-dimensional data, II Rachel Ward May 26, 2011 High dimensional data with low-dimensional structure 300 by 300 pixel images = 90, 000 dimensions 2 / 55 High dimensional

More information

1 Teaching notes on structural VARs.

1 Teaching notes on structural VARs. Bent E. Sørensen February 22, 2007 1 Teaching notes on structural VARs. 1.1 Vector MA models: 1.1.1 Probability theory The simplest (to analyze, estimation is a different matter) time series models are

More information

Financial Econometrics

Financial Econometrics Material : solution Class : Teacher(s) : zacharias psaradakis, marian vavra Example 1.1: Consider the linear regression model y Xβ + u, (1) where y is a (n 1) vector of observations on the dependent variable,

More information

Package TSPred. April 5, 2017

Package TSPred. April 5, 2017 Type Package Package TSPred April 5, 2017 Title Functions for Benchmarking Time Series Prediction Version 3.0.2 Date 2017-04-05 Author Rebecca Pontes Salles [aut, cre, cph] (CEFET/RJ), Eduardo Ogasawara

More information

1+t 2 (l) y = 2xy 3 (m) x = 2tx + 1 (n) x = 2tx + t (o) y = 1 + y (p) y = ty (q) y =

1+t 2 (l) y = 2xy 3 (m) x = 2tx + 1 (n) x = 2tx + t (o) y = 1 + y (p) y = ty (q) y = DIFFERENTIAL EQUATIONS. Solved exercises.. Find the set of all solutions of the following first order differential equations: (a) x = t (b) y = xy (c) x = x (d) x = (e) x = t (f) x = x t (g) x = x log

More information

A Review of Matrix Analysis

A Review of Matrix Analysis Matrix Notation Part Matrix Operations Matrices are simply rectangular arrays of quantities Each quantity in the array is called an element of the matrix and an element can be either a numerical value

More information

ELA THE MINIMUM-NORM LEAST-SQUARES SOLUTION OF A LINEAR SYSTEM AND SYMMETRIC RANK-ONE UPDATES

ELA THE MINIMUM-NORM LEAST-SQUARES SOLUTION OF A LINEAR SYSTEM AND SYMMETRIC RANK-ONE UPDATES Volume 22, pp. 480-489, May 20 THE MINIMUM-NORM LEAST-SQUARES SOLUTION OF A LINEAR SYSTEM AND SYMMETRIC RANK-ONE UPDATES XUZHOU CHEN AND JUN JI Abstract. In this paper, we study the Moore-Penrose inverse

More information

Ch 2: Simple Linear Regression

Ch 2: Simple Linear Regression Ch 2: Simple Linear Regression 1. Simple Linear Regression Model A simple regression model with a single regressor x is y = β 0 + β 1 x + ɛ, where we assume that the error ɛ is independent random component

More information

Fundamentals of Linear Algebra. Marcel B. Finan Arkansas Tech University c All Rights Reserved

Fundamentals of Linear Algebra. Marcel B. Finan Arkansas Tech University c All Rights Reserved Fundamentals of Linear Algebra Marcel B. Finan Arkansas Tech University c All Rights Reserved October 9, 200 2 PREFACE Linear algebra has evolved as a branch of mathematics with wide range of applications

More information

MA 527 first midterm review problems Hopefully final version as of October 2nd

MA 527 first midterm review problems Hopefully final version as of October 2nd MA 57 first midterm review problems Hopefully final version as of October nd The first midterm will be on Wednesday, October 4th, from 8 to 9 pm, in MTHW 0. It will cover all the material from the classes

More information

14 Multiple Linear Regression

14 Multiple Linear Regression B.Sc./Cert./M.Sc. Qualif. - Statistics: Theory and Practice 14 Multiple Linear Regression 14.1 The multiple linear regression model In simple linear regression, the response variable y is expressed in

More information

Lecture 16 Methods for System of Linear Equations (Linear Systems) Songting Luo. Department of Mathematics Iowa State University

Lecture 16 Methods for System of Linear Equations (Linear Systems) Songting Luo. Department of Mathematics Iowa State University Lecture 16 Methods for System of Linear Equations (Linear Systems) Songting Luo Department of Mathematics Iowa State University MATH 481 Numerical Methods for Differential Equations Songting Luo ( Department

More information

EE363 homework 2 solutions

EE363 homework 2 solutions EE363 Prof. S. Boyd EE363 homework 2 solutions. Derivative of matrix inverse. Suppose that X : R R n n, and that X(t is invertible. Show that ( d d dt X(t = X(t dt X(t X(t. Hint: differentiate X(tX(t =

More information

APPM 3310 Problem Set 4 Solutions

APPM 3310 Problem Set 4 Solutions APPM 33 Problem Set 4 Solutions. Problem.. Note: Since these are nonstandard definitions of addition and scalar multiplication, be sure to show that they satisfy all of the vector space axioms. Solution:

More information

Hierarchical Electricity Time Series Forecasting for Integrating Consumption Patterns Analysis and Aggregation Consistency

Hierarchical Electricity Time Series Forecasting for Integrating Consumption Patterns Analysis and Aggregation Consistency Hierarchical Electricity Time Series Forecasting for Integrating Consumption Patterns Analysis and Aggregation Consistency Yue Pang 1, Bo Yao 1, Xiangdong Zhou 1, Yong Zhang 2, Yiming Xu 1 and Zijing Tan

More information

Control Systems. Laplace domain analysis

Control Systems. Laplace domain analysis Control Systems Laplace domain analysis L. Lanari outline introduce the Laplace unilateral transform define its properties show its advantages in turning ODEs to algebraic equations define an Input/Output

More information

Math 671: Tensor Train decomposition methods

Math 671: Tensor Train decomposition methods Math 671: Eduardo Corona 1 1 University of Michigan at Ann Arbor December 8, 2016 Table of Contents 1 Preliminaries and goal 2 Unfolding matrices for tensorized arrays The Tensor Train decomposition 3

More information

Problem 1: (3 points) Recall that the dot product of two vectors in R 3 is

Problem 1: (3 points) Recall that the dot product of two vectors in R 3 is Linear Algebra, Spring 206 Homework 3 Name: Problem : (3 points) Recall that the dot product of two vectors in R 3 is a x b y = ax + by + cz, c z and this is essentially the same as the matrix multiplication

More information

Math Matrix Theory - Spring 2012

Math Matrix Theory - Spring 2012 Math 440 - Matrix Theory - Spring 202 HW #2 Solutions Which of the following are true? Why? If not true, give an example to show that If true, give your reasoning (a) Inverse of an elementary matrix is

More information

MODEL REDUCTION BY A CROSS-GRAMIAN APPROACH FOR DATA-SPARSE SYSTEMS

MODEL REDUCTION BY A CROSS-GRAMIAN APPROACH FOR DATA-SPARSE SYSTEMS MODEL REDUCTION BY A CROSS-GRAMIAN APPROACH FOR DATA-SPARSE SYSTEMS Ulrike Baur joint work with Peter Benner Mathematics in Industry and Technology Faculty of Mathematics Chemnitz University of Technology

More information

Relation of Pure Minimum Cost Flow Model to Linear Programming

Relation of Pure Minimum Cost Flow Model to Linear Programming Appendix A Page 1 Relation of Pure Minimum Cost Flow Model to Linear Programming The Network Model The network pure minimum cost flow model has m nodes. The external flows given by the vector b with m

More information

Department of Econometrics and Business Statistics

Department of Econometrics and Business Statistics ISSN 1440-771X Australia Department of Econometrics and Business Statistics http://www.buseco.monash.edu.au/depts/ebs/pubs/wpapers/ Exponential Smoothing: A Prediction Error Decomposition Principle Ralph

More information

Kalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q

Kalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Kalman Filter Kalman Filter Predict: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Update: K = P k k 1 Hk T (H k P k k 1 Hk T + R) 1 x k k = x k k 1 + K(z k H k x k k 1 ) P k k =(I

More information

Linear dynamical systems with inputs & outputs

Linear dynamical systems with inputs & outputs EE263 Autumn 215 S. Boyd and S. Lall Linear dynamical systems with inputs & outputs inputs & outputs: interpretations transfer function impulse and step responses examples 1 Inputs & outputs recall continuous-time

More information

Math 344 Lecture # Linear Systems

Math 344 Lecture # Linear Systems Math 344 Lecture #12 2.7 Linear Systems Through a choice of bases S and T for finite dimensional vector spaces V (with dimension n) and W (with dimension m), a linear equation L(v) = w becomes the linear

More information

Elementary maths for GMT

Elementary maths for GMT Elementary maths for GMT Linear Algebra Part 2: Matrices, Elimination and Determinant m n matrices The system of m linear equations in n variables x 1, x 2,, x n a 11 x 1 + a 12 x 2 + + a 1n x n = b 1

More information

Math Camp II. Basic Linear Algebra. Yiqing Xu. Aug 26, 2014 MIT

Math Camp II. Basic Linear Algebra. Yiqing Xu. Aug 26, 2014 MIT Math Camp II Basic Linear Algebra Yiqing Xu MIT Aug 26, 2014 1 Solving Systems of Linear Equations 2 Vectors and Vector Spaces 3 Matrices 4 Least Squares Systems of Linear Equations Definition A linear

More information

Forecasting 1 to h steps ahead using partial least squares

Forecasting 1 to h steps ahead using partial least squares Forecasting 1 to h steps ahead using partial least squares Philip Hans Franses Econometric Institute, Erasmus University Rotterdam November 10, 2006 Econometric Institute Report 2006-47 I thank Dick van

More information

Lecture Notes 6: Dynamic Equations Part B: Second and Higher-Order Linear Difference Equations in One Variable

Lecture Notes 6: Dynamic Equations Part B: Second and Higher-Order Linear Difference Equations in One Variable University of Warwick, EC9A0 Maths for Economists Peter J. Hammond 1 of 56 Lecture Notes 6: Dynamic Equations Part B: Second and Higher-Order Linear Difference Equations in One Variable Peter J. Hammond

More information

Linear Programming The Simplex Algorithm: Part II Chapter 5

Linear Programming The Simplex Algorithm: Part II Chapter 5 1 Linear Programming The Simplex Algorithm: Part II Chapter 5 University of Chicago Booth School of Business Kipp Martin October 17, 2017 Outline List of Files Key Concepts Revised Simplex Revised Simplex

More information

ECEN 420 LINEAR CONTROL SYSTEMS. Lecture 6 Mathematical Representation of Physical Systems II 1/67

ECEN 420 LINEAR CONTROL SYSTEMS. Lecture 6 Mathematical Representation of Physical Systems II 1/67 1/67 ECEN 420 LINEAR CONTROL SYSTEMS Lecture 6 Mathematical Representation of Physical Systems II State Variable Models for Dynamic Systems u 1 u 2 u ṙ. Internal Variables x 1, x 2 x n y 1 y 2. y m Figure

More information

Matrix operations Linear Algebra with Computer Science Application

Matrix operations Linear Algebra with Computer Science Application Linear Algebra with Computer Science Application February 14, 2018 1 Matrix operations 11 Matrix operations If A is an m n matrix that is, a matrix with m rows and n columns then the scalar entry in the

More information

Lec 2: Mathematical Economics

Lec 2: Mathematical Economics Lec 2: Mathematical Economics to Spectral Theory Sugata Bag Delhi School of Economics 24th August 2012 [SB] (Delhi School of Economics) Introductory Math Econ 24th August 2012 1 / 17 Definition: Eigen

More information

DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS

DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS ISSN 1440-771X ISBN 0 7326 1091 5 Prediction Intervals for Exponential Smoothing State Space Models Rob J Hyndman, Anne B Koehler, J. Keith Ord and Ralph D Snyder Working Paper 11/2001 2001 DEPARTMENT

More information

Quasi-Newton methods for minimization

Quasi-Newton methods for minimization Quasi-Newton methods for minimization Lectures for PHD course on Numerical optimization Enrico Bertolazzi DIMS Universitá di Trento November 21 December 14, 2011 Quasi-Newton methods for minimization 1

More information

Basic Concepts in Data Reconciliation. Chapter 6: Steady-State Data Reconciliation with Model Uncertainties

Basic Concepts in Data Reconciliation. Chapter 6: Steady-State Data Reconciliation with Model Uncertainties Chapter 6: Steady-State Data with Model Uncertainties CHAPTER 6 Steady-State Data with Model Uncertainties 6.1 Models with Uncertainties In the previous chapters, the models employed in the DR were considered

More information

15-780: LinearProgramming

15-780: LinearProgramming 15-780: LinearProgramming J. Zico Kolter February 1-3, 2016 1 Outline Introduction Some linear algebra review Linear programming Simplex algorithm Duality and dual simplex 2 Outline Introduction Some linear

More information

Linear Algebra. Session 8

Linear Algebra. Session 8 Linear Algebra. Session 8 Dr. Marco A Roque Sol 08/01/2017 Abstract Linear Algebra Range and kernel Let V, W be vector spaces and L : V W, be a linear mapping. Definition. The range (or image of L is the

More information

Consider the following example of a linear system:

Consider the following example of a linear system: LINEAR SYSTEMS Consider the following example of a linear system: Its unique solution is x + 2x 2 + 3x 3 = 5 x + x 3 = 3 3x + x 2 + 3x 3 = 3 x =, x 2 = 0, x 3 = 2 In general we want to solve n equations

More information

A matrix over a field F is a rectangular array of elements from F. The symbol

A matrix over a field F is a rectangular array of elements from F. The symbol Chapter MATRICES Matrix arithmetic A matrix over a field F is a rectangular array of elements from F The symbol M m n (F ) denotes the collection of all m n matrices over F Matrices will usually be denoted

More information

Principles of forecasting

Principles of forecasting 2.5 Forecasting Principles of forecasting Forecast based on conditional expectations Suppose we are interested in forecasting the value of y t+1 based on a set of variables X t (m 1 vector). Let y t+1

More information

An improved approximation algorithm for two-machine flow shop scheduling with an availability constraint

An improved approximation algorithm for two-machine flow shop scheduling with an availability constraint An improved approximation algorithm for two-machine flow shop scheduling with an availability constraint J. Breit Department of Information and Technology Management, Saarland University, Saarbrcken, Germany

More information

Matrix-valued functions

Matrix-valued functions Matrix-valued functions Aim lecture: We solve some first order linear homogeneous differential equations using exponentials of matrices. Recall as in MATH2, the any function R M mn (C) : t A(t) can be

More information

ECEN 605 LINEAR SYSTEMS. Lecture 7 Solution of State Equations 1/77

ECEN 605 LINEAR SYSTEMS. Lecture 7 Solution of State Equations 1/77 1/77 ECEN 605 LINEAR SYSTEMS Lecture 7 Solution of State Equations Solution of State Space Equations Recall from the previous Lecture note, for a system: ẋ(t) = A x(t) + B u(t) y(t) = C x(t) + D u(t),

More information

14 - Gaussian Stochastic Processes

14 - Gaussian Stochastic Processes 14-1 Gaussian Stochastic Processes S. Lall, Stanford 211.2.24.1 14 - Gaussian Stochastic Processes Linear systems driven by IID noise Evolution of mean and covariance Example: mass-spring system Steady-state

More information

Lecture Notes in Linear Algebra

Lecture Notes in Linear Algebra Lecture Notes in Linear Algebra Dr. Abdullah Al-Azemi Mathematics Department Kuwait University February 4, 2017 Contents 1 Linear Equations and Matrices 1 1.2 Matrices............................................

More information