RANDOM PROCESSES. THE FINAL TEST.
Prof. R. Liptser and P. Chigansky
9:00-12:00, October 18, 2001
Student ID:

- Any supplementary material is allowed.
- The duration of the exam is 3 hours.
- Write briefly the main idea of your answers in the exam itself. If needed, give a reference to your copybook, where you may elaborate other technical details.

Good luck!
Problem 1. Let ξ and η be a pair of random variables such that Eξ = Eη = 0, Eξ² < ∞, Eη² < ∞ and Eξη ≠ 0. Here E(·|·) denotes the conditional expectation and Ê(·|·) the optimal linear estimate. Which of the following statements are generally true:

(a) E(η | Ê(η|ξ)) = E(η|ξ)

(b) Ê(η | E(η|ξ)) = E(η|ξ)

(c) Ê(η | E(η|ξ)) = Ê(η|ξ)

(d) Ê(η | Ê(η|ξ)) = Ê(η|ξ)
(e) {Ê(η|ξ) = ξ and Ê(ξ|η) = η} ⟹ η = ξ P-a.s.

(f) E(E(η|ξ) | η) = η ⟹ E(η|ξ) = η P-a.s.
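The contrast between the two estimates can be explored numerically. Below is a minimal Python sketch (an editorial aside, not part of the exam) for the illustrative choice ξ ~ N(0,1), η = ξ³, where the conditional expectation E(η|ξ) = ξ³ is known in closed form and, since both means are zero, the optimal linear estimate reduces to Ê(η|ξ) = (Eξη/Eξ²)ξ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
xi = rng.standard_normal(n)
eta = xi ** 3                 # E eta = 0 and E[xi*eta] = E[xi^4] = 3 != 0

# optimal linear estimate (zero means): Ehat(eta|xi) = (E[xi*eta]/E[xi^2]) * xi
c = np.mean(xi * eta) / np.mean(xi ** 2)
lin = c * xi
nonlin = xi ** 3              # conditional expectation E(eta|xi), exact here

mse_lin = np.mean((eta - lin) ** 2)
mse_nonlin = np.mean((eta - nonlin) ** 2)
print(round(c, 3), round(mse_lin, 3), mse_nonlin)
```

In this example the two estimates differ and have different mean square errors, which is the kind of gap that statements (a)-(f) probe.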
Problem 2. Let (X_n)_{n≥0} be a Markov chain taking values in the alphabet S = {1, ..., d}, with transition probabilities λ_{ij} = P{X_n = j | X_{n−1} = i} and initial distribution π_0(i) = P{X_0 = i}. Define the transition indicator process (ν_n)_{n≥1}:

ν_n = I(X_n ≠ X_{n−1})

(a) Which of the following filters generates π_n(i) = P(X_n = i | ν_1^n):

(1) \pi_n(i) = \frac{\lambda_{ii}\,\pi_{n-1}(i)}{\sum_{j}\lambda_{jj}\,\pi_{n-1}(j)}\,I(\nu_n=0) + \frac{\pi_{n-1}(i)\sum_{k\neq i}\lambda_{ik}}{\sum_{j}\pi_{n-1}(j)\sum_{k\neq j}\lambda_{jk}}\,I(\nu_n=1)

(2) \pi_n(i) = \frac{\lambda_{ii}\,\pi_{n-1}(i)}{\sum_{j}\lambda_{jj}\,\pi_{n-1}(j)}\,I(\nu_n=0) + \frac{\sum_{k\neq i}\lambda_{ki}\,\pi_{n-1}(k)}{\sum_{j}\sum_{k\neq j}\lambda_{kj}\,\pi_{n-1}(k)}\,I(\nu_n=1)

(3) \pi_n(i) = \pi_{n-1}(i)\,I(\nu_n=0) + \frac{\pi_{n-1}(i)\sum_{k\neq i}\lambda_{ik}}{\sum_{j}\pi_{n-1}(j)\sum_{k\neq j}\lambda_{jk}}\,I(\nu_n=1)

(4) \pi_n(i) = \pi_{n-1}(i)\,I(\nu_n=0) + \sum_{k}\lambda_{ki}\,\pi_{n-1}(k)\,I(\nu_n=1)

(b) Assume that an additional observation process is available:

Y_n = a(X_n) + ξ_n

where a : S → R, (ξ_n)_{n≥1} is a sequence of i.i.d. random variables and ξ_1 has a probability density function f(x). Write a_i = a(i). Which of the following filters generates the estimates ζ_n(i) = P(X_n = i | ν_1^n, Y_1^n):

(1) \zeta_n(i) = \frac{\lambda_{ii}\,\zeta_{n-1}(i)\,f(Y_n-a_i)}{\sum_{j}\lambda_{jj}\,\zeta_{n-1}(j)\,f(Y_n-a_j)}\,I(\nu_n=0) + \frac{\zeta_{n-1}(i)\sum_{k\neq i}\lambda_{ik}\,f(Y_n-a_k)}{\sum_{j}\zeta_{n-1}(j)\sum_{k\neq j}\lambda_{jk}\,f(Y_n-a_k)}\,I(\nu_n=1)

(2) \zeta_n(i) = \frac{\lambda_{ii}\,\zeta_{n-1}(i)\,f(Y_n-a_i)}{\sum_{j}\lambda_{jj}\,\zeta_{n-1}(j)\,f(Y_n-a_j)}\,I(\nu_n=0) + \frac{f(Y_n-a_i)\sum_{k\neq i}\lambda_{ki}\,\zeta_{n-1}(k)}{\sum_{j}f(Y_n-a_j)\sum_{k\neq j}\lambda_{kj}\,\zeta_{n-1}(k)}\,I(\nu_n=1)

(3) \zeta_n(i) = \zeta_{n-1}(i)\,f(Y_n-a_i)\,I(\nu_n=0) + \frac{\zeta_{n-1}(i)\sum_{k\neq i}\lambda_{ik}\,f(Y_n-a_k)}{\sum_{j}\zeta_{n-1}(j)\sum_{k\neq j}\lambda_{jk}\,f(Y_n-a_k)}\,I(\nu_n=1)

(4) \zeta_n(i) = \zeta_{n-1}(i)\,f(Y_n-a_i)\,I(\nu_n=0) + f(Y_n-a_i)\sum_{k}\lambda_{ki}\,\zeta_{n-1}(k)\,I(\nu_n=1)
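As a numerical cross-check for part (a) (an editorial aside, not part of the exam), one can simulate the chain and run the exact one-step Bayes recursion for P(X_n = i | ν_1^n) directly from the definition; any candidate filter can then be compared against its output. The transition matrix and the uniform initial law below are hypothetical examples, not from the problem.

```python
import numpy as np

rng = np.random.default_rng(1)
d, N = 3, 50
# hypothetical transition matrix and uniform initial law (not from the problem)
Lam = np.array([[0.6, 0.3, 0.1],
                [0.2, 0.5, 0.3],
                [0.3, 0.3, 0.4]])
pi0 = np.full(d, 1.0 / d)

# simulate the chain and the transition indicators nu_n = I(X_n != X_{n-1})
X = [rng.choice(d, p=pi0)]
for _ in range(N):
    X.append(rng.choice(d, p=Lam[X[-1]]))
nu = [int(X[n] != X[n - 1]) for n in range(1, N + 1)]

# exact Bayes recursion: pi_n(i) is proportional to the sum, over those k
# with I(i != k) == nu_n, of pi_{n-1}(k) * Lam[k, i]
pi = pi0.copy()
for n in range(1, N + 1):
    post = np.zeros(d)
    for i in range(d):
        for k in range(d):
            if int(i != k) == nu[n - 1]:
                post[i] += pi[k] * Lam[k, i]
    pi = post / post.sum()
print(np.round(pi, 4))
```

Running a candidate filter on the same simulated ν_1^n and comparing with `pi` at each step shows immediately whether the candidate computes the true posterior.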
Problem 3. Consider the process (X_n)_{n≥0} generated by the so-called bilinear recursion:

X_n = αX_{n−1} + βX_{n−1}ε_n,  X_0 ≡ 1

where α, β are real numbers and (ε_n)_{n≥1} is a sequence of i.i.d. random variables with Eε_1² < ∞.

(a) Determine in which of the following cases (X_n) converges in an appropriate sense (check all correct answers and explain briefly):

(1) α = β = 0.75, ε_1 is distributed uniformly on [−1, 1].
(2) α = β = 0.75, ε_1 takes two values {−1, 1} with equal probabilities.
(3) α = 0, β = 1, ε_1 takes two values {−1, 1} with equal probabilities.
(4) α = 1, β = 1/2, ε_1 takes two values {−1, 1} with equal probabilities.
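One way to build intuition for part (a) (an editorial aside, not part of the exam, and not a substitute for a proof) is to iterate the recursion numerically in each of the four cases and inspect terminal values of a few independent trajectories; the trajectory length and seed below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def final_value(alpha, beta, draw, n=200):
    """Iterate X_n = alpha*X_{n-1} + beta*X_{n-1}*eps_n from X_0 = 1, return X_n."""
    x = 1.0
    for e in draw(n):
        x = alpha * x + beta * x * e
    return x

uniform = lambda n: rng.uniform(-1.0, 1.0, n)     # eps_1 ~ U[-1, 1]
sign = lambda n: rng.choice([-1.0, 1.0], size=n)  # eps_1 = +/-1 with equal prob.

cases = [(0.75, 0.75, uniform),   # case (1)
         (0.75, 0.75, sign),      # case (2)
         (0.0, 1.0, sign),        # case (3)
         (1.0, 0.5, sign)]        # case (4)

results = []
for alpha, beta, draw in cases:
    ends = [final_value(alpha, beta, draw) for _ in range(20)]
    results.append(ends)
    print((alpha, beta), [round(x, 6) for x in ends])
```

The printed terminal values suggest which cases degenerate, which stay bounded, and which keep oscillating; the exam still asks for the mode of convergence and a brief explanation.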
Note: From now on assume Eε_1 = 0 and Eε_1² = 1.

(b) Consider the observation sequence

Y_n = X_{n−1} + ε_n

Write down the recursive equations for X̂_n = Ê(X_n | Y_1^n) and P_n = E(X_n − X̂_n)²:

X̂_n = ...
P_n = ...
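A brute-force Monte Carlo reference for part (b) (an editorial aside, not part of the exam): the best linear estimate Ê(X_n | Y_1^n) can be approximated by regressing X_n on (1, Y_1, ..., Y_n) over many simulated paths, which yields an empirical value of P_n against which a derived recursion can be checked. Gaussian ε_n and the parameter values α = β = 0.75 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Regress X_n on (1, Y_1, ..., Y_n) across many simulated paths to approximate
# the best linear estimate; alpha, beta and Gaussian noise are illustrative.
alpha, beta, n_steps, n_paths = 0.75, 0.75, 5, 100_000
X = np.ones(n_paths)
Ys = []
for _ in range(n_steps):
    eps = rng.standard_normal(n_paths)   # E eps = 0, E eps^2 = 1
    Y = X + eps                          # Y_n = X_{n-1} + eps_n (same noise)
    X = alpha * X + beta * X * eps       # X_n = alpha*X_{n-1} + beta*X_{n-1}*eps_n
    Ys.append(Y)

A = np.column_stack([np.ones(n_paths)] + Ys)
coef, *_ = np.linalg.lstsq(A, X, rcond=None)
Xhat = A @ coef                          # empirical linear estimate of X_n
P_n = np.mean((X - Xhat) ** 2)           # empirical E(X_n - Xhat_n)^2
print(P_n)
```

Note that the same ε_n drives both the signal and the observation, exactly as in the model above; a recursion derived on paper should reproduce `P_n` up to Monte Carlo error.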
(c) Consider the case (a)-(3), i.e.

X_n = X_{n−1}ε_n,  X_0 ≡ 1
Y_n = X_{n−1} + ε_n

where ε_1 = ±1 with equal probabilities. In this question Ê(X_n | Y_1^n) and E(X_n | Y_1^n) are understood as the linear and the nonlinear filters respectively. A filter is trivial if it does not depend on the observations, and precise if it estimates the signal exactly (i.e. with zero mean square error). Choose the correct statement and prove your answer:

(1) both the linear and the nonlinear filters are trivial
(2) the nonlinear filter is trivial, while the linear one is precise
(3) the nonlinear filter is precise, while the linear one is trivial
(4) both the linear and the nonlinear filters are precise
(5) none of the above

Proof:
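For numerical experimentation with part (c) (an editorial aside, not part of the exam, and not by itself a proof of anything), the sketch below simulates the model and prints the empirical covariances Cov(X_n, Y_m), m = 1, ..., n, which are the quantities entering the linear filter; the path count and horizon are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n = 200_000, 4
X = np.ones(n_paths)
Ys = []
for _ in range(n):
    eps = rng.choice([-1.0, 1.0], size=n_paths)
    Y = X + eps              # Y_k = X_{k-1} + eps_k
    X = X * eps              # X_k = X_{k-1} * eps_k   (case (a)-(3))
    Ys.append(Y)

# empirical covariances Cov(X_n, Y_m), m = 1..n, relevant to the linear filter
covs = [np.mean(X * Y) - np.mean(X) * np.mean(Y) for Y in Ys]
print([round(c, 4) for c in covs])
```

The magnitudes of these covariances hint at how much (or how little) the observations help a linear estimator; the exam still requires an exact argument for both filters.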