THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES


The Society provides these solutions to assist candidates preparing for the examinations in future years and for the information of any other persons using the examinations. The solutions should NOT be seen as "model answers". Rather, they have been written out in considerable detail and are intended as learning aids. Users of the solutions should always be aware that in many cases there are valid alternative methods. Also, in the many cases where discussion is called for, there may be other valid points that could be made. While every care has been taken with the preparation of these solutions, the Society will not be responsible for any errors or omissions. The Society will not enter into any correspondence in respect of these solutions.

Note. In accordance with the convention used in the Society's examination papers, the notation log denotes logarithm to base e. Logarithms to any other base are explicitly identified, e.g. log 10.

© RSS 2009

Graduate Diploma, Module 3, 2009. Question 1

(i) G(z) = Σ_{i=0}^∞ p_i z^i.

(ii) G_n(z) = E(z^{X_n}) = Σ_{i=0}^∞ p_i E(z^{X_n} | X_1 = i) = Σ_{i=0}^∞ p_i [G_{n−1}(z)]^i = G(G_{n−1}(z)).

(iii) θ_n = P(X_n = 0) = G_n(0). Setting z = 0 in the relationship of part (ii), we obtain θ_n = G(θ_{n−1}) (n ≥ 2).

(iv) Letting n → ∞ in the result of part (iii), and noting that G is a continuous function of z so that G(θ_{n−1}) → G(θ) as n → ∞, we obtain the equation θ = G(θ).

We now have the special case as identified in the question.

(v) In this special case, quoting the standard result for a binomial distribution, G(z) = (1 − p + pz)^2.

(vi) θ_1 is simply the zero term of the binomial distribution, so θ_1 = (1 − p)^2. Equivalently, θ_1 = G_1(0) = G(0) = (1 − p)^2.

(vii) θ_2 = G(θ_1) = [1 − p + p(1 − p)^2]^2 = (1 − p)^2 [1 + p(1 − p)]^2 = (1 − p)^2 (1 + p − p^2)^2, as required.

(viii) θ is the smallest positive root of the equation θ = G(θ), i.e. of θ = (1 − p + pθ)^2, so we must solve p^2 θ^2 + (2p − 2p^2 − 1)θ + (1 − p)^2 = 0. Because θ = 1 is necessarily a root of the equation θ = G(θ), it is easy to factorize the quadratic to give (θ − 1)[p^2 θ − (1 − p)^2] = 0. It follows that the extinction probability is given by min{1, [(1 − p)/p]^2}. Hence the extinction probability is 1 if p ≤ ½ and is [(1 − p)/p]^2 if p > ½.
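As a quick numerical check of parts (v) to (viii) (not part of the examination answer), the iteration θ_n = G(θ_{n−1}) can be carried out directly. The sketch below, in Python, assumes the Binomial(2, p) offspring distribution and uses illustrative values of p.

def G(z, p):
    # PGF of the Binomial(2, p) offspring distribution
    return (1 - p + p * z) ** 2

def extinction_prob(p, n_iter=10_000):
    # iterate theta_n = G(theta_{n-1}), starting from theta_1 = G(0)
    theta = 0.0
    for _ in range(n_iter):
        theta = G(theta, p)
    return theta

for p in (0.3, 0.6, 0.8):
    theory = min(1.0, ((1 - p) / p) ** 2)
    print(p, round(extinction_prob(p), 4), round(theory, 4))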

Graduate Diploma, Module 3, 2009. Question 2

A Markov chain is said to be irreducible if it is possible, with non-zero probability, to move from any state in the state space to any other state. A chain is said to be recurrent if, starting from any state in the space, the probability of eventually returning to that state is 1. [These explanations may be put more formally in terms of n-step transition probabilities.] In the present case, because all the transition probabilities are non-zero, it is clearly possible to move from any state to any other state in one step, so the chain is irreducible. It is a general result that all finite irreducible Markov chains are recurrent.

The stationary distribution (π_1, π_2, π_3) is given by the solution of the equations

(2/5)π_1 + (1/5)π_2 + (1/5)π_3 = π_1
(2/5)π_1 + (3/5)π_2 + (2/5)π_3 = π_2
(1/5)π_1 + (1/5)π_2 + (2/5)π_3 = π_3,

which reduce to

3π_1 = π_2 + π_3
π_2 = π_1 + π_3
3π_3 = π_1 + π_2,

together with the normalisation condition π_1 + π_2 + π_3 = 1. It readily follows that the solution is (π_1, π_2, π_3) = (¼, ½, ¼).

The probabilities and hence also the proportions in the second generation are given by the terms of the matrix product of the row vector of first-generation proportions (2/5, 2/5, 1/5) with the transition matrix

2/5  2/5  1/5
1/5  3/5  1/5
1/5  2/5  2/5

which gives the vector of probabilities/proportions (7/25, 12/25, 6/25).

The approximate proportions that we would expect to find are the ones given by the stationary distribution (π_1, π_2, π_3) = (1/4, 1/2, 1/4). The reasoning lying behind this is as follows. Let p_ij(n) represent the n-step transition probability from state i to state j; then, for all i and j, p_ij(n) → π_j as n → ∞. Hence, after a large number, n, of generations, we would expect that p_ij(n) ≈ π_j. In a large population of individuals, each of whom has the same approximate probability π_j of being in state j, π_j is also the approximate proportion of the population who are in state j.
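The stationary distribution and the second-generation proportions can be checked numerically. The sketch below (Python with NumPy, not part of the examination answer) uses the transition matrix and initial proportions as set out above.

import numpy as np

# Transition matrix implied by the stationary equations above
P = np.array([[2/5, 2/5, 1/5],
              [1/5, 3/5, 1/5],
              [1/5, 2/5, 2/5]])

# Second-generation proportions from first-generation proportions (2/5, 2/5, 1/5)
start = np.array([2/5, 2/5, 1/5])
print(start @ P)                      # expect (7/25, 12/25, 6/25)

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalised
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
print(pi / pi.sum())                  # expect (1/4, 1/2, 1/4)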

Graduate Diploma, Module 3, 2009. Question 3

Because of the memory-less property of the exponential distribution, how long a line has been under repair is statistically independent of how much longer it will take to repair it.

Define states as follows.

0: no line runs
1: line 1 runs, line 2 under repair
2: line 1 under repair, line 2 runs
3: both lines run.

The instantaneous transition rates are as follows.

transition   rate
0 → 1        1/2
1 → 0        1/10
1 → 3        1/2
2 → 0        1/5
2 → 3        1/2
3 → 1        1/15
3 → 2        1/30

The equilibrium equations are

(1/2)π_0 = (1/10)π_1 + (1/5)π_2
(3/5)π_1 = (1/2)π_0 + (1/15)π_3
(7/10)π_2 = (1/30)π_3
(1/10)π_3 = (1/2)π_1 + (1/2)π_2.

These reduce to 25π_0 = 26π_2, 5π_1 = 16π_2, π_3 = 21π_2. Using the normalisation condition π_0 + π_1 + π_2 + π_3 = 1, it follows that (π_0, π_1, π_2, π_3) = k(26/25, 16/5, 1, 21), where 1/k = (26/25) + (16/5) + 1 + 21 = 656/25. So (π_0, π_1, π_2, π_3) = (13/328, 5/41, 25/656, 525/656) = (0.04, 0.12, 0.04, 0.80). In particular, the long-term proportion of time that the factory is unable to meet the production target is π_0 = 0.04.
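The equilibrium distribution can also be obtained numerically from the generator matrix. The sketch below (Python with NumPy, not part of the examination answer) assumes the transition rates as reconstructed above.

import numpy as np

# Generator matrix for states 0, 1, 2, 3 built from the rate table above
rates = {(0, 1): 1/2,
         (1, 0): 1/10, (1, 3): 1/2,
         (2, 0): 1/5,  (2, 3): 1/2,
         (3, 1): 1/15, (3, 2): 1/30}
Q = np.zeros((4, 4))
for (i, j), r in rates.items():
    Q[i, j] = r
np.fill_diagonal(Q, -Q.sum(axis=1))

# Solve pi Q = 0 subject to the probabilities summing to 1
A = np.vstack([Q.T, np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)    # expect approximately (0.04, 0.12, 0.04, 0.80)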

Graduate Diploma, Module 3, 2009. Question 4

(i) The state space is the set of all non-negative integers. The instantaneous transition rates are as follows.

transition   rate
i → i + 1    λ   (i ≥ 0)
i → i − 1    μ   (i ≥ 1)

(ii) The traffic intensity is defined by ρ = λ/μ. A necessary and sufficient condition for an equilibrium distribution to exist is ρ < 1, i.e. λ < μ.

(iii) The detailed balance equations are λπ_{n−1} = μπ_n (n ≥ 1). Thus π_n = ρπ_{n−1} (n ≥ 1) and, using this relation recursively, we find π_n = ρ^n π_0 (n ≥ 0). Using the normalisation condition Σπ_n = 1, we find, using the formula for the sum of a geometric series (or observing that we are dealing with a geometric distribution), that π_n = (1 − ρ)ρ^n (n ≥ 0), as required.

(iv) The service time distribution, i.e. here the waiting time distribution, for this model is exponential with parameter μ. The pdf is μe^{−μt} (t ≥ 0).

(v) The arriving customer's waiting time is the sum of n + 1 independently and identically distributed service times, each having an exponential distribution with parameter μ. These are the service times of the n customers ahead of him in the queue plus his own (note that, because of the memory-less property of the exponential distribution, the residual service time of the customer being served at the time of arrival of the new customer is also exponential with parameter μ). Using the note given in the question, the required pdf is μ^{n+1} t^n e^{−μt} / n! (t ≥ 0).

(vi) In equilibrium, from part (iii), the probability that an arriving customer finds n customers ahead of him in the queue is given by π_n = (1 − ρ)ρ^n (n ≥ 0). Thus, using the result of part (v), the pdf of his waiting time is given by

Σ_{n=0}^∞ (1 − ρ)ρ^n μ^{n+1} t^n e^{−μt} / n! = (1 − ρ)μ e^{−μt} Σ_{n=0}^∞ (ρμt)^n / n! = (μ − λ) e^{−μt} Σ_{n=0}^∞ (λt)^n / n! = (μ − λ) e^{−μt} e^{λt} = (μ − λ) e^{−(μ−λ)t}   (t ≥ 0),

which is the pdf of the exponential distribution with parameter μ − λ.
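The geometric-mixture calculation in part (vi) is easy to verify numerically. The sketch below (Python, not part of the examination answer) uses illustrative values λ = 2 and μ = 5 and sums the series term by term.

import math

lam, mu = 2.0, 5.0          # illustrative rates with rho < 1
rho = lam / mu

def mixture_pdf(t, terms=200):
    # sum over n of (1 - rho) rho^n mu^(n+1) t^n e^(-mu t) / n!
    total = 0.0
    term = (1 - rho) * mu * math.exp(-mu * t)    # n = 0 term
    for n in range(terms):
        total += term
        term *= rho * mu * t / (n + 1)           # ratio of successive terms
    return total

for t in (0.1, 0.5, 1.0, 2.0):
    print(t, mixture_pdf(t), (mu - lam) * math.exp(-(mu - lam) * t))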

Graduate Diploma, Module 3, 2009. Question 5

The autoregressive characteristic equation is 1 − (3/4)z + (1/8)z^2 = 0, which has roots z = 2, 4. Both the roots are greater than one in modulus, so the stationarity condition is satisfied.

On substituting Y_t = Σ_{i=0}^∞ ψ_i ε_{t−i} into the model Y_t = (3/4)Y_{t−1} − (1/8)Y_{t−2} + ε_t, we obtain

Σ_{i=0}^∞ ψ_i ε_{t−i} = (3/4) Σ_{i=0}^∞ ψ_i ε_{t−1−i} − (1/8) Σ_{i=0}^∞ ψ_i ε_{t−2−i} + ε_t = (3/4) Σ_{i=1}^∞ ψ_{i−1} ε_{t−i} − (1/8) Σ_{i=2}^∞ ψ_{i−2} ε_{t−i} + ε_t.

Equating coefficients of the ε_{t−i}, we obtain the following.

i = 0:  ψ_0 = 1
i = 1:  ψ_1 = (3/4)ψ_0 = 3/4
i ≥ 2:  ψ_i = (3/4)ψ_{i−1} − (1/8)ψ_{i−2}.

The last of these provides the required set of recurrence relations, for i ≥ 2, and the values for ψ_0 and ψ_1 provide the required initial conditions.

The general solution is of the form ψ_i = A_1 α_1^i + A_2 α_2^i (i ≥ 0), where A_1 and A_2 are arbitrary constants and α_1 and α_2 are the roots of the auxiliary equation α^2 = (3/4)α − (1/8). The roots of the auxiliary equation are [the inverses of the roots of the autoregressive characteristic equation above] 1/2 and 1/4. Hence the general solution is ψ_i = A_1 (1/2)^i + A_2 (1/4)^i. Using the initial conditions, A_1 + A_2 = 1 and (1/2)A_1 + (1/4)A_2 = 3/4. Hence A_1 = 2 and A_2 = −1, and the solution for the ψ_i is as stated in the question.

Generally, Var(Y_t) = σ^2 Σ_{i=0}^∞ ψ_i^2. In the present case, this gives

Var(Y_t) = σ^2 Σ_{i=0}^∞ [2(1/2)^i − (1/4)^i]^2 = σ^2 Σ_{i=0}^∞ [4(1/4)^i − 4(1/8)^i + (1/16)^i].

Summing the geometric series in this expression gives

Var(Y_t) = σ^2 [ 4/(1 − 1/4) − 4/(1 − 1/8) + 1/(1 − 1/16) ] = σ^2 [ 16/3 − 32/7 + 16/15 ] = 64σ^2/35.
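The recurrence, its closed-form solution and the variance can all be checked numerically. The sketch below (Python, not part of the examination answer) takes σ^2 = 1.

n = 200
psi = [1.0, 0.75]                       # initial conditions psi_0 and psi_1
for i in range(2, n):
    psi.append(0.75 * psi[i - 1] - 0.125 * psi[i - 2])

closed = [2 * 0.5**i - 0.25**i for i in range(n)]          # psi_i = 2(1/2)^i - (1/4)^i
print(max(abs(a - b) for a, b in zip(psi, closed)))        # essentially zero

print(sum(p * p for p in psi), 64 / 35)                    # both approximately 1.8286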

Graduate Diploma, Module 3, 2009. Question 6

(i) If the underlying trend is an exponential one, taking logarithms will transform the trend to a linear one, in which case an ARIMA model is likely to provide a better fit. If the variability of the series and, in particular, of any seasonal effects increases with increase in the underlying level of the series, taking logarithms will tend to stabilise the variation, and in this case also an ARIMA model is likely to provide a better fit.

(ii) Approximate 95% confidence limits are at ±2/√180 = ±0.149. So any autocorrelation outside these limits differs significantly from zero at the 5% level. We see that a number of autocorrelations lie well outside these limits, notably at lags 1, 6, 12, 18, 24, 30, 36. This clearly indicates the presence of seasonality of period 12 months and also suggests the presence of trend.

(iii) The purpose of taking differences is to eliminate the trend and the purpose of taking seasonal differences is to eliminate the seasonality. Approximate 95% confidence limits are at ±2/√167 = ±0.155. So any autocorrelation outside these limits differs significantly from zero at the 5% level. Here the only significant autocorrelations are at lag 1 and at lag 12. This shows that any trend and seasonality have been removed by the differencing to obtain a stationary series and suggests that the stationary series may be modelled by moving average terms at lags 1 and 12. So a seasonal ARIMA(0,1,1)×(0,1,1)_12 model is suggested.

(iv) A seasonal ARIMA(0,1,1)×(0,1,1)_12 model has been fitted. The equation of the fitted model is (see the parameter estimates in the computer output in the question)

(1 − L)(1 − L^12)Y_t = (1 − 0.8759L)(1 − 0.7789L^12)ε_t,

where L is the lag operator (backward shift operator) and {ε_t} is a white noise process, i.e.

(1 − L − L^12 + L^13)Y_t = (1 − 0.8759L − 0.7789L^12 + 0.6822L^13)ε_t

or

Y_t = Y_{t−1} + Y_{t−12} − Y_{t−13} + ε_t − 0.8759ε_{t−1} − 0.7789ε_{t−12} + 0.6822ε_{t−13}.

(v) None of the p-values of the modified Box-Pierce statistics is significant. So the residuals of the fitted model appear to come from a white noise process; our model appears to give a good fit to the data.

(vi) The forecast and 95% prediction interval for Y (the log series) at December 1995 are given in the computer output, the upper limit of the interval being 8.894. The forecast sales and prediction interval are given by taking exponentials of these values. This gives 5867 as the forecast sales for December 1995 and, approximately, (4710, 7290) as the 95% prediction interval, in litres.
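A fit of this kind could be reproduced along the following lines. This is only a sketch (not part of the examination answer), assuming the statsmodels package is available and that a file log_sales.txt (a hypothetical name) holds the 180 monthly log-sales values, which are not reproduced here.

import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

log_sales = np.loadtxt("log_sales.txt")     # hypothetical file of logged monthly sales

# ARIMA(0,1,1)x(0,1,1)_12 for the logged series
model = SARIMAX(log_sales, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
res = model.fit(disp=False)
print(res.summary())                        # MA(1) and seasonal MA(1) estimates

fc = res.get_forecast(steps=12)             # forecasts out to December 1995
mean = fc.predicted_mean[-1]
lo, hi = fc.conf_int(alpha=0.05)[-1]
print(np.exp(mean), np.exp(lo), np.exp(hi)) # back-transform to litres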

Graduate Diploma, Module 3, 2009. Question 7

The updating equations (for the multiplicative Holt-Winters method with seasonal period p = 12) are as follows.

L_t = α(Y_t / I_{t−p}) + (1 − α)(L_{t−1} + B_{t−1})
B_t = γ(L_t − L_{t−1}) + (1 − γ)B_{t−1}
I_t = δ(Y_t / L_t) + (1 − δ)I_{t−p}

ŷ_T(h) = (L_T + hB_T) I_{T−p+h}.

We require ŷ_T(1) and ŷ_T(12).

(a) ŷ_T(1) = (L_T + B_T) I_{T−11} = (313.14)(0.79) = … to the nearest whole number.

(b) ŷ_T(12) = (L_T + 12B_T) I_T = (4519.6)(0.97) = … to the nearest whole number.

For January 1994, the values are as follows.

Level:    L_t = α(Y_t/I_{t−12}) + (1 − α)(L_{t−1} + B_{t−1}) = 0.4(45/0.79) + 0.6(…) = …
Trend:    B_t = γ(L_t − L_{t−1}) + (1 − γ)B_{t−1} = 0.1(…) + 0.9(1.7) = …
Index:    I_t = δ(Y_t/L_t) + (1 − δ)I_{t−12} = 0.01(45/36.11) + 0.99(0.79) = 0.79
Fitted:   from the forecast formula, ŷ_{t−1}(1) = (L_{t−1} + B_{t−1}) I_{t−12} = …
Residual: Deaths − Fitted = … = 0.98

(v) Given some appropriately chosen initial values for the level and the trend and for the first twelve seasonal indices, for any given set of values of the smoothing constants α, γ and δ (each between 0 and 1 inclusive, of course) the numerical values of all the quantities in the table may be calculated for each month in the series. The sum of squares of the residuals (or some other appropriate function of the residuals) may be used as a measure of how well the Holt-Winters method with the chosen values of α, γ and δ performs. By looking at a grid of values of α, γ and δ or by carrying out a formal optimisation, the values that minimise the sum of squares of the residuals may be found as the best set of values to use.
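One update step and a forecast can be coded directly from the equations above. The sketch below (Python, not part of the examination answer) uses the smoothing constants quoted in the solution and purely illustrative values for the observation, level, trend and seasonal index.

def hw_update(y, level_prev, trend_prev, index_prev_p, alpha=0.4, gamma=0.1, delta=0.01):
    # multiplicative Holt-Winters updating equations
    level = alpha * (y / index_prev_p) + (1 - alpha) * (level_prev + trend_prev)
    trend = gamma * (level - level_prev) + (1 - gamma) * trend_prev
    index = delta * (y / level) + (1 - delta) * index_prev_p
    return level, trend, index

def hw_forecast(level, trend, index_future, h):
    # y_hat_T(h) = (L_T + h B_T) I_{T-p+h}
    return (level + h * trend) * index_future

# Illustrative numbers only (not the values from the question)
L, B, I = hw_update(y=420.0, level_prev=510.0, trend_prev=1.5, index_prev_p=0.80)
print(L, B, I)
print(hw_forecast(L, B, index_future=0.80, h=1))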

Graduate Diploma, Module 3, 2009. Question 8

(i) {Y_t} is an ARIMA(0,2,1) process.

(ii) Y_t = 2Y_{t−1} − Y_{t−2} + ε_t − θε_{t−1}.

(iii) ŷ_T(1) = E(Y_{T+1} | H_T) = E(2Y_T − Y_{T−1} + ε_{T+1} − θε_T | H_T) = 2Y_T − Y_{T−1} − θε_T.

[Note that these are conditional expectations given the entire history of the process up to and including time T. So Y_T and Y_{T−1} are known. Further, ε_T (sometimes called the "innovation" at time T) can be found using the one-step-ahead prediction at time T − 1 and the observed Y_T; this is as shown in part (iv), replacing T + 1 by T. ε_{T+1} cannot be found, of course, and so has expectation 0 as a white noise term.]

(iv) Y_{T+1} − ŷ_T(1) = ε_{T+1}.

(v) ŷ_T(2) = E(Y_{T+2} | H_T) = E(2Y_{T+1} − Y_T + ε_{T+2} − θε_{T+1} | H_T) = 2ŷ_T(1) − Y_T = 3Y_T − 2Y_{T−1} − 2θε_T (substituting from the result of part (iii)).

(vi) For h ≥ 3, setting t = T + h in the model equation, ŷ_T(h) = E(Y_{T+h} | H_T) = E(2Y_{T+h−1} − Y_{T+h−2} + ε_{T+h} − θε_{T+h−1} | H_T) = 2E(Y_{T+h−1} | H_T) − E(Y_{T+h−2} | H_T) + 0 = 2ŷ_T(h−1) − ŷ_T(h−2).

(vii) The general form of the solution of the difference equation of part (vi) (in the examination, this could be quoted or easily found) is ŷ_T(h) = A + Bh (h ≥ 1). To determine A and B, the initial conditions of parts (iii) and (v) give A + B = 2Y_T − Y_{T−1} − θε_T and A + 2B = 3Y_T − 2Y_{T−1} − 2θε_T. Hence B = b_T = Y_T − Y_{T−1} − θε_T and A = Y_T.

(viii) Using the result of part (iv), replacing T by T − 1, we have Y_T − ŷ_{T−1}(1) = ε_T. Note also that ŷ_{T−1}(1) = Y_{T−1} + b_{T−1}. Substituting into the expression for b_T in part (vii),

b_T = Y_T − Y_{T−1} − θ(Y_T − ŷ_{T−1}(1)) = Y_T − Y_{T−1} − θ(Y_T − Y_{T−1} − b_{T−1}) = (1 − θ)(Y_T − Y_{T−1}) + θb_{T−1}.
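The equivalence between the straight-line forecast function ŷ_T(h) = Y_T + h b_T and the recursion of part (vi) is easy to confirm numerically. The sketch below (Python, not part of the examination answer) uses purely illustrative values of θ, Y_T, Y_{T−1} and ε_T.

theta = 0.6
Y_T, Y_Tm1, eps_T = 103.0, 100.0, 1.2        # illustrative values only

b_T = Y_T - Y_Tm1 - theta * eps_T            # slope from part (vii)

# Recursion of part (vi) with the initial conditions from parts (iii) and (v)
f = {1: 2 * Y_T - Y_Tm1 - theta * eps_T,
     2: 3 * Y_T - 2 * Y_Tm1 - 2 * theta * eps_T}
for h in range(3, 11):
    f[h] = 2 * f[h - 1] - f[h - 2]

for h in range(1, 11):
    print(h, f[h], Y_T + h * b_T)            # the two columns agree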

More information