ECE534, Spring 2018: Solutions for Problem Set #5


1. Mean Value and Autocorrelation Functions

Consider a random process X(t) such that (i) X(t) \in \{+1, -1\}, (ii) the number of zero crossings N(t) in the interval (0, t) is described by a Poisson process with rate \lambda, and (iii) X(0) = 1. Compute the following:
(a) P(X(t) = 1)
(b) P(X(t) = -1)
(c) E[X(t)]
(d) R_{XX}(t_1, t_2), t_1, t_2 \ge 0

Let N(t) be a Poisson process with rate \lambda.

(a) Since X(0) = 1, the event \{X(t) = 1\} is equivalent to the event \{N(t) is even\}:

    P(X(t) = 1) = \sum_{k=0}^{\infty} P(N(t) = 2k) = \sum_{k=0}^{\infty} e^{-\lambda t} \frac{(\lambda t)^{2k}}{(2k)!} = e^{-\lambda t} \cdot \frac{1}{2}(e^{\lambda t} + e^{-\lambda t}) = \frac{1}{2}(1 + e^{-2\lambda t}),

where the last step is due to the fact that \sum_{k=0}^{\infty} \frac{x^{2k}}{(2k)!} = \frac{1}{2}(e^{x} + e^{-x}).

(b) The complementary event corresponds to \{N(t) is odd\}:

    P(X(t) = -1) = 1 - P(X(t) = 1) = \frac{1}{2}(1 - e^{-2\lambda t}).

(c) The expectation is

    E[X(t)] = (+1) P(X(t) = 1) + (-1) P(X(t) = -1) = e^{-2\lambda t}.

(d) For the autocorrelation function, let x_0 = 1, x_1 = -1 and note that:

    R_{XX}(t_1, t_2) = E[X(t_1) X(t_2)] = \sum_{i=0}^{1} \sum_{j=0}^{1} x_i x_j P(X(t_1) = x_i, X(t_2) = x_j)
                     = \sum_{i=0}^{1} \sum_{j=0}^{1} x_i x_j P(X(t_1) = x_i) P(X(t_2) = x_j | X(t_1) = x_i).    (1)

We now note that for t_2 > t_1, the number of sign changes in the interval (t_1, t_2) is the number of zero crossings in the same interval. Since the number of zero crossings is a Poisson process, N(t_2) - N(t_1) ~ Poi(\lambda (t_2 - t_1)). Therefore,

    P(X(t_2) = 1 | X(t_1) = 1) = P(X(t_2) = -1 | X(t_1) = -1) = \frac{1}{2}(1 + e^{-2\lambda (t_2 - t_1)}),    (2)
    P(X(t_2) = -1 | X(t_1) = 1) = P(X(t_2) = 1 | X(t_1) = -1) = \frac{1}{2}(1 - e^{-2\lambda (t_2 - t_1)}).    (3)

By (1)-(3) and parts (a) and (b), we have:

    R_{XX}(t_1, t_2) = e^{-2\lambda (t_2 - t_1)},  t_2 \ge t_1.

Using the symmetry of the autocorrelation function, we conclude that

    R_{XX}(t_1, t_2) = e^{-2\lambda |t_1 - t_2|},  t_1, t_2 \ge 0.

2. Low-pass Filtering in Discrete Time

Let X(n) be a discrete-time random process with autocorrelation function R_{XX}(m) = E[X(n) X(n+m)] = \sigma^2 \delta(m). Consider further a discrete-time process Y(n) described by the following difference equation:

    Y(n) = X(n) + a Y(n-1).

This difference equation represents a digital filter.
(a) Find the impulse response of this filter.
(b) Show that R_{XY}(m) = E[Y(n) X(n-m)] = a^m R_{XY}(0) for m \ge 0.
(c) Find the autocorrelation function of the output, R_{YY}(m) = E[Y(n) Y(n+m)]. Note: use the fact that R_{YY}(-m) = R_{YY}(m).

(a) Option 1: Take the z-transform of the difference equation Y(n) = X(n) + a Y(n-1):

    Y(z) = X(z) + a z^{-1} Y(z)   =>   H(z) = \frac{Y(z)}{X(z)} = \frac{1}{1 - a z^{-1}}.
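The closed forms of Problem 1(a)-(c) can be sanity-checked by Monte Carlo, using the even/odd zero-crossing characterization. A minimal sketch, assuming NumPy is available; the values of \lambda, t and the sample size are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n = 2.0, 0.7, 200_000

# N(t) ~ Poisson(lam * t); X(t) = +1 iff N(t) is even, since X(0) = 1
N = rng.poisson(lam * t, size=n)
X = np.where(N % 2 == 0, 1.0, -1.0)

p_plus = 0.5 * (1 + np.exp(-2 * lam * t))   # part (a)
p_minus = 0.5 * (1 - np.exp(-2 * lam * t))  # part (b)
mean_theory = np.exp(-2 * lam * t)          # part (c)

p_plus_hat = np.mean(X == 1.0)              # empirical P(X(t) = 1)
mean_hat = X.mean()                         # empirical E[X(t)]
```

The empirical frequencies agree with the formulas to within Monte Carlo error.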

Then, the (causal) inverse z-transform is the sought impulse response:

    h(n) = a^n u(n),

where u(n) denotes the unit step.

Option 2: Time-domain computation of the impulse response h(n), which is the output Y(n) corresponding to an impulse input X(n) = \delta(n). That is, h(n) = \delta(n) + a h(n-1). Assuming causality, h(n) = 0 for all n < 0 (which is an initial condition for this equation), and we have:

    h(0) = \delta(0) + a h(-1) = 1
    h(1) = \delta(1) + a h(0) = a
    h(2) = \delta(2) + a h(1) = a^2
    ...
    h(n) = \delta(n) + a h(n-1) = a^n,  n > 0.

Therefore, h(n) = a^n u(n).

(b) Option 1: We have R_{XY}(m) := E[X(n-m) Y(n)]. We examine two cases. For m < 0, the input X(n-m) is uncorrelated with any historical output Y(n), so that R_{XY}(m) = 0 for all m < 0. This is due to the fact that X(n) is a white sequence according to its autocorrelation function. For m \ge 0:

    R_{XY}(m) = E[X(n-m)(X(n) + a Y(n-1))] = \sigma^2 \delta(m) + a R_{XY}(m-1).

For all m > 0 we have \delta(m) = 0, and therefore (similarly to part (a)) for all m \ge 0,

    R_{XY}(m) = a R_{XY}(m-1)   =>   R_{XY}(m) = a^m R_{XY}(0).

Combining the two cases yields R_{XY}(m) = a^m R_{XY}(0) u(m).

Option 2: By convolution,

    R_{XY}(m) = E[X(n-m) Y(n)] = E[X(n-m) \sum_j h(j) X(n-j)] = \sum_j h(j) R_{XX}(m-j) = (h * R_{XX})(m) = \sigma^2 a^m u(m).

Since R_{XY}(0) = \sigma^2, we get R_{XY}(m) = R_{XY}(0) a^m u(m).
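The impulse response of Problem 2(a) can be checked by simply running the difference equation on a unit impulse. A plain-Python sketch; the value of a is an arbitrary illustrative choice:

```python
a, n_max = 0.6, 25

# run y(n) = x(n) + a*y(n-1) with x(n) = delta(n) and y(-1) = 0
h = []
y_prev = 0.0
for n in range(n_max):
    x = 1.0 if n == 0 else 0.0
    y_prev = x + a * y_prev
    h.append(y_prev)

closed_form = [a ** n for n in range(n_max)]  # h(n) = a^n u(n)
```

The recursion reproduces a^n exactly (up to floating-point rounding).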

(c) Option 1: Writing Y(1) = X(1) + a Y(0),

    R_{YY}(0) = E[Y(1)^2] = E[(X(1) + a Y(0))^2] = \sigma^2 + a^2 R_{YY}(0) + 2a E[X(1) Y(0)] = \sigma^2 + a^2 R_{YY}(0),

since the input X(1) is uncorrelated with the past output Y(0). Hence

    R_{YY}(0) = \frac{\sigma^2}{1 - a^2}.

For m > 0, we have:

    R_{YY}(m) = E[Y(m) Y(0)] = E[(X(m) + a Y(m-1)) Y(0)] = E[X(m) Y(0)] + a R_{YY}(m-1) = a R_{YY}(m-1).

Therefore, R_{YY}(m) = a^m R_{YY}(0) = \frac{\sigma^2 a^m}{1 - a^2}. Due to the symmetry of R_{YY} we conclude with

    R_{YY}(m) = \frac{\sigma^2 a^{|m|}}{1 - a^2}.

Option 2: For m > 0, we have

    R_{YY}(m) = E[Y(m) Y(0)] = E[\sum_{i \ge 0} a^i X(m-i) \sum_{j \ge 0} a^j X(-j)] = \sum_{i \ge 0} \sum_{j \ge 0} a^{i+j} E[X(m-i) X(-j)]
              = \sigma^2 \sum_{i \ge 0} \sum_{j \ge 0} a^{i+j} \delta(m + j - i) = \sigma^2 \sum_{j \ge 0} a^{m+2j} = \frac{\sigma^2 a^m}{1 - a^2},

and by symmetry we again have R_{YY}(m) = \frac{\sigma^2 a^{|m|}}{1 - a^2}.

3. Wide Sense Stationarity

Consider the harmonic oscillator X(t) = A \cos(\omega_0 t + \Theta), where \omega_0 is a constant and A is a random variable.
(a) Let \Theta = \theta_0, where \theta_0 is a constant. Argue that X(t) is not WSS for any value of E[A].
(b) Show that X(t) is WSS if \Theta ~ Unif[-\pi, \pi], independent of A.
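Returning to Problem 2(c), the output autocorrelation can be cross-checked by simulation. A sketch assuming NumPy; a, \sigma^2 and the Gaussian choice for X(n) are illustrative (only the second-order statistics matter):

```python
import numpy as np

rng = np.random.default_rng(1)
a, sigma2, n = 0.5, 1.0, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=n)  # white input, R_XX(m) = sigma2 * delta(m)
y = np.empty(n)
y_prev = 0.0
for i in range(n):                            # y(n) = x(n) + a*y(n-1)
    y_prev = x[i] + a * y_prev
    y[i] = y_prev

def r_hat(m):
    """Sample estimate of R_YY(m) = E[y(n) y(n+m)]."""
    return float(np.mean(y[: n - m] * y[m:]))

r_theory = [sigma2 * a ** abs(m) / (1 - a ** 2) for m in range(4)]
r_est = [r_hat(m) for m in range(4)]
```

The transient from the zero initial condition is negligible over a run of this length.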

(c) Continuing part (b), argue that X(t) is mean ergodic in the m.s. sense.

(a) In this case, the process X(t) = A \cos(\omega_0 t + \theta_0) has the following mean and autocorrelation functions:

    E[X(t)] = E[A] \cos(\omega_0 t + \theta_0),
    E[X(s) X(t)] = E[A^2] \cos(\omega_0 s + \theta_0) \cos(\omega_0 t + \theta_0).

If E[A] \ne 0, the mean is time-varying, and hence X(t) is not WSS. If E[A] = 0, then E[X(t)] = 0 for all t, which is time-independent. Nevertheless, the autocorrelation condition fails. Let s_1 = -\theta_0/\omega_0 and s_2 = (\pi/2 - \theta_0)/\omega_0. Then,

    E[X(s_1) X(s_1)] = E[A^2] \cos^2(0) = E[A^2],
    E[X(s_2) X(s_2)] = E[A^2] \cos^2(\pi/2) = 0.

Excluding the degenerate case A = 0 almost surely, these two autocorrelation values both correspond to the time difference \tau = s_1 - s_1 = s_2 - s_2 = 0, yet they are different. Hence, X(t) is not WSS in this case either.

(b) The process X(t) = A \cos(\omega_0 t + \Theta) has mean function

    E[X(t)] = E[A] E[\cos(\omega_0 t + \Theta)]   (independence of A, \Theta)
            = E[A] \frac{1}{2\pi} \int_{-\pi}^{\pi} \cos(\omega_0 t + \theta) d\theta = 0

and autocorrelation

    E[X(t) X(s)] = E[A^2] \frac{1}{2\pi} \int_{-\pi}^{\pi} \cos(\omega_0 t + \theta) \cos(\omega_0 s + \theta) d\theta
                 = \frac{E[A^2]}{4\pi} \int_{-\pi}^{\pi} [\cos(\omega_0 (t+s) + 2\theta) + \cos(\omega_0 (t-s))] d\theta
                 = \frac{E[A^2]}{2} \cos(\omega_0 (t-s)),

since \int_{-\pi}^{\pi} \cos(\omega_0 (t+s) + 2\theta) d\theta = 0. Hence, X(t) is WSS in this case.
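Both claims of Problem 3(b) — zero mean, and an autocorrelation that depends only on the lag — can be probed numerically. A NumPy sketch; \omega_0 and the distribution of A are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
w0, n = 3.0, 500_000
A = rng.normal(1.0, 2.0, size=n)         # any A independent of Theta works
Theta = rng.uniform(-np.pi, np.pi, size=n)

def X(t):
    return A * np.cos(w0 * t + Theta)

EA2 = 1.0 ** 2 + 2.0 ** 2                # E[A^2] for N(1, 2^2)

mean_hat = X(0.4).mean()
r1 = np.mean(X(0.9) * X(0.4))            # lag 0.5
r2 = np.mean(X(2.1) * X(1.6))            # same lag, shifted in time
r_theory = 0.5 * EA2 * np.cos(w0 * 0.5)  # (E[A^2]/2) cos(w0 * lag)
```

Both lag-0.5 estimates agree with each other and with the closed form, as WSS predicts.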

(c) We want to show that \frac{1}{T} \int_0^T X(t) dt -> \mu_X = 0 in the m.s. sense. We have:

    E[(\frac{1}{T} \int_0^T X(t) dt)^2] = \frac{1}{T^2} E[(\int_0^T A \cos(\omega_0 t + \Theta) dt)^2]
        = \frac{E[A^2]}{\omega_0^2 T^2} E[(\sin(\omega_0 T + \Theta) - \sin(\Theta))^2]
        \le \frac{4 E[A^2]}{\omega_0^2 T^2} -> 0  as T -> \infty.

4. Wiener Filtering

Let X(n) = S(n) + W(n) be the input to a (noncausal) Wiener filter aiming at optimally approximating the target signal S(n) in the MSE sense. Assume that S(n) and W(n) are zero-mean, uncorrelated processes such that X(n) is a WSS process.
(a) Find an expression for the z-transform of the optimal filter, H_{opt}(z).
(b) Let R_{SS}(k) = \frac{10}{27} (\frac{1}{2})^{|k|} and R_{WW}(k) = \frac{2}{3} \delta(k), i.e., W(n) is additive white noise. Compute the impulse response of the optimal filter h_{opt}(n), n \in Z.

(a) Note that R_{SX}(n) = E[S(0)(S(n) + W(n))] = R_{SS}(n) + R_{SW}(n) = R_{SS}(n), and likewise R_{XX}(n) = E[X(0) X(n)] = R_{SS}(n) + R_{WW}(n) + R_{SW}(n) + R_{WS}(n) = R_{SS}(n) + R_{WW}(n), since S and W are zero mean and uncorrelated. Therefore,

    S_{SX}(z) = S_{SS}(z),   S_{XX}(z) = S_{SS}(z) + S_{WW}(z).

As per the formula derived in class, we have

    H_{opt}(z) = \frac{S_{SX}(z)}{S_{XX}(z)} = \frac{S_{SS}(z)}{S_{SS}(z) + S_{WW}(z)}.

(b) We begin with a useful z-transform of a symmetric geometric sequence:

    Z(a^{|n|}) = \frac{(1 - a^2) z}{(1 - a z)(z - a)},   ROC \{ |a| < |z| < 1/|a| \}.    (4)
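The mean-ergodicity bound of Problem 3(c) is easy to verify because the time average has a closed form. The following sketch (NumPy; all parameter values illustrative) checks that every realization's time average lies within 2|A|/(\omega_0 T) of zero:

```python
import numpy as np

rng = np.random.default_rng(3)
w0, T, n = 2.0, 1000.0, 1000
A = rng.normal(0.0, 1.0, size=n)
Theta = rng.uniform(-np.pi, np.pi, size=n)

# (1/T) * integral_0^T A cos(w0*t + Theta) dt, antiderivative evaluated exactly
time_avg = A * (np.sin(w0 * T + Theta) - np.sin(Theta)) / (w0 * T)

bound = 2.0 * np.abs(A) / (w0 * T)      # |sin(.) - sin(.)| <= 2
ms_value = float(np.mean(time_avg ** 2))  # empirical mean-square of the time average
```

For T = 1000 the mean-square value is already tiny, consistent with the O(1/T^2) decay.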

Using this formula,

    S_{SS}(z) = \frac{10}{27} \cdot \frac{3z}{(2z - 1)(2 - z)} = \frac{10z}{9(2z - 1)(2 - z)},

and furthermore S_{WW}(z) = \frac{2}{3}. We therefore obtain:

    H_{opt}(z) = \frac{S_{SX}(z)}{S_{SS}(z) + S_{WW}(z)} = \frac{\frac{10z}{9(2z-1)(2-z)}}{\frac{10z}{9(2z-1)(2-z)} + \frac{2}{3}} = \frac{10z}{10z + 6(2z-1)(2-z)} = \frac{10z}{4(3-z)(3z-1)} = \frac{5z}{2(3-z)(3z-1)}.

This is of the form (4) with a = 1/3 (up to scaling), since Z((\frac{1}{3})^{|n|}) = \frac{8z}{(3-z)(3z-1)}; hence

    h_{opt}(n) = \frac{5}{16} (\frac{1}{3})^{|n|},   n \in Z.

5. Kalman Filtering (bonus problem)

Consider an unobservable signal defined by the recursion X_n = X_{n-1} \epsilon_n, where X_0 = 1 and \{\epsilon_n\}_{n \ge 1} is a sequence of independent random variables such that P(\epsilon_n = 1) = p_n and P(\epsilon_n = 0) = 1 - p_n, with 0 < p_n < 1 for all n. Assume that we observe Y_n = X_n + \xi_n, where \{\xi_n\}_{n \ge 1} is a sequence of i.i.d. random variables, independent of \{\epsilon_n\}_{n \ge 1}, with density f_\xi(x). Let E[\xi_n] = 0 and E[\xi_n^2] < \infty for any n.
(a) Find a suitable state space model for the Kalman filter.
(b) Derive the Kalman filter.

(a) Using the provided recursion for X_n, we can write:

    X_n = p_n X_{n-1} + (\epsilon_n - p_n) X_{n-1}.

Let v_n = (\epsilon_n - p_n) X_{n-1}. We then have:

    E[v_n] = E[(\epsilon_n - p_n) X_{n-1}] = E[\epsilon_n - p_n] E[X_{n-1}] = 0.

Also, for m < n, we have:

    E[v_n X_m] = E[\epsilon_n - p_n] E[X_{n-1} X_m] = 0.
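The closed form h_{opt}(n) = (5/16)(1/3)^{|n|} from Problem 4(b) can be cross-checked by sampling the spectra on the unit circle and inverting with an FFT. A NumPy sketch; the grid size is an arbitrary illustrative choice:

```python
import numpy as np

N = 4096
w = 2.0 * np.pi * np.arange(N) / N      # frequency grid, z = e^{jw}

# S_SS(e^{jw}) = (10/27)(1 - 1/4)/|1 - (1/2)e^{-jw}|^2, S_WW = 2/3
S_ss = (10.0 / 27.0) * 0.75 / np.abs(1.0 - 0.5 * np.exp(-1j * w)) ** 2
S_ww = 2.0 / 3.0
H = S_ss / (S_ss + S_ww)

# ifft of the sampled frequency response: h[m] ~ h_opt(m) for m >= 0,
# h[N + m] for m < 0 (aliasing of the two-sided tail is negligible here)
h = np.real(np.fft.ifft(H))

def h_theory(m):
    return (5.0 / 16.0) * (1.0 / 3.0) ** abs(m)
```

The numerically inverted filter matches the two-sided geometric closed form on both tails.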

Moreover, E[X_n^2] = E[\epsilon_n^2] E[X_{n-1}^2] = p_n E[X_{n-1}^2], since \epsilon_n^2 = \epsilon_n. Therefore, E[X_n^2] = \prod_{i=1}^{n} p_i. This leads to:

    E[v_n^2] = E[(\epsilon_n - p_n)^2] E[X_{n-1}^2] = p_n (1 - p_n) \prod_{i=1}^{n-1} p_i = (1 - p_n) \prod_{i=1}^{n} p_i.

Combining the above results, we obtain the following state space model for the Kalman filter:

    X_n = p_n X_{n-1} + v_n
    Y_n = X_n + \xi_n

(b) Let \hat{X}_n = E[X_n | Y_{1:n}], \Sigma_n = E[(X_n - \hat{X}_n)^2] and P_n = E[v_n^2]. Then,

    \hat{X}_n = p_n \hat{X}_{n-1} + \frac{p_n^2 \Sigma_{n-1} + P_n}{p_n^2 \Sigma_{n-1} + P_n + E[\xi_n^2]} (Y_n - p_n \hat{X}_{n-1}),

    \Sigma_n = (p_n^2 \Sigma_{n-1} + P_n) (1 - \frac{p_n^2 \Sigma_{n-1} + P_n}{p_n^2 \Sigma_{n-1} + P_n + E[\xi_n^2]})

are the corresponding Kalman filter recursions.

6. Decorrelation

Let X = [X_1, X_2]^T be a random vector such that

    Cov(X) = [ 2  1 ]
             [ 1  2 ].

Find a matrix A such that the entries of the random vector Y = AX are uncorrelated random variables.

We seek a matrix A such that Y = AX contains uncorrelated random variables. This implies that Cov(Y) has to be a diagonal matrix. Such an A corresponds to the transpose of the modal matrix of Cov(X). To see this, let the eigenvalue decomposition of this matrix be U \Lambda U^T. Note that

    Cov(Y) = E[(Y - E[Y])(Y - E[Y])^T] = E[(AX - A E[X])(AX - A E[X])^T] = A Cov(X) A^T = A U \Lambda U^T A^T.    (5)
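The recursions of Problem 5(b) translate directly into a short simulation. A NumPy sketch; the constant p_n = p and the Gaussian choice for \xi_n are illustrative assumptions (the derivation only needs E[\xi_n] = 0 and finite variance):

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps, p, var_xi = 50, 0.9, 0.25

x = 1.0                    # X_0 = 1
xhat, Sigma = 1.0, 0.0     # filter initialized at the known X_0
EX2_prev = 1.0             # E[X_{n-1}^2] = prod_{i < n} p_i

for n in range(1, n_steps + 1):
    # simulate the model: X_n = X_{n-1} * eps_n, Y_n = X_n + xi_n
    x = x * (1.0 if rng.random() < p else 0.0)
    y = x + rng.normal(0.0, np.sqrt(var_xi))

    # Kalman recursion from part (b)
    P = p * (1.0 - p) * EX2_prev            # E[v_n^2]
    pred_var = p * p * Sigma + P
    gain = pred_var / (pred_var + var_xi)
    xhat = p * xhat + gain * (y - p * xhat)
    Sigma = pred_var * (1.0 - gain)
    EX2_prev *= p                           # update to E[X_n^2]
```

Note that \Sigma_n = pred_var * var_xi / (pred_var + var_xi) is always below the observation noise variance, as a filter variance should be.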

Setting A = U^T and using the orthonormality of U, i.e., U U^T = U^T U = I, we obtain:

    Cov(Y) = \Lambda = [ \lambda_1  0 ]
                       [ 0  \lambda_2 ].

Here, \lambda_1, \lambda_2 are the eigenvalues of Cov(X) and also the eigenvalues of Cov(Y). To find A, we first solve the characteristic polynomial:

    det(Cov(X) - \lambda I) = 0.

The roots of this polynomial are \lambda_1 = 3, \lambda_2 = 1. To find the corresponding eigenvectors, which correspond to the columns of U, we solve the equations

    Cov(X) u_1 = \lambda_1 u_1,   Cov(X) u_2 = \lambda_2 u_2,

taking into account that u_1 \perp u_2. The resulting A is

    A = U^T = \frac{1}{\sqrt{2}} [ 1   1 ]
                                 [ 1  -1 ].

Note: A can also be chosen to whiten Cov(X). The corresponding choice is A = \Lambda^{-1/2} U^T. Plugging this choice into (5), we verify that Cov(Y) = I.

7. Gaussian Processes (bonus problem)

Consider two random processes \{X_n\}_{n \ge 0} and \{Y_n\}_{n \ge 0} generated by the following (nonlinear) recursion:

    X_{n+1} = a X_n + \frac{X_n \epsilon_{n+1} + Y_n \zeta_{n+1}}{\sqrt{X_n^2 + Y_n^2}},
    Y_{n+1} = b X_n + \frac{X_n \zeta_{n+1} - Y_n \epsilon_{n+1}}{\sqrt{X_n^2 + Y_n^2}},

where \{\epsilon_n\}_{n \ge 1} and \{\zeta_n\}_{n \ge 1} are i.i.d. Gaussian sequences with zero mean and unit variance. Moreover, assume that \{\epsilon_n\} and \{\zeta_n\} are independent and that a, b are constants. The initial pair [X_0, Y_0]^T is assumed to be a Gaussian vector with zero mean and invertible covariance \Sigma.
(a) Show that the joint process \{X_n, Y_n\}_{n \ge 0} is Gaussian.
(b) Derive the pdf of the vector [X_n, Y_n]^T.
(c) Derive the pdf of the vector [X_0, ..., X_n, Y_0, ..., Y_n]^T. Note: use the Markov property of the processes \{X_n\}_{n \ge 0} and \{Y_n\}_{n \ge 0}.
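For Problem 6, the construction is a one-liner with numpy.linalg.eigh, which returns an orthonormal eigenbasis (with eigenvalues in ascending order, so here \Lambda comes out as diag(1, 3)). A minimal sketch:

```python
import numpy as np

cov_x = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

lam, U = np.linalg.eigh(cov_x)         # cov_x = U @ diag(lam) @ U.T
A = U.T                                # decorrelating transform
cov_y = A @ cov_x @ A.T                # should equal diag(lam)

A_white = np.diag(lam ** -0.5) @ U.T   # whitening variant from the note
cov_y_white = A_white @ cov_x @ A_white.T
```

Both Cov(Y) = \Lambda and the whitened Cov(Y) = I come out as expected.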

(a) We provide a short description of the solution with the main ideas. The key observation is to rewrite the recursion pair in matrix form:

    Z_{n+1} = A Z_n + B_{X_n, Y_n} [\epsilon_{n+1}, \zeta_{n+1}]^T,   where Z_n = [X_n, Y_n]^T,   A = [ a  0 ]
                                                                                                      [ b  0 ],

and

    B_{X_n, Y_n} = \frac{1}{\sqrt{X_n^2 + Y_n^2}} [  X_n  Y_n ]
                                                  [ -Y_n  X_n ].

We now note that the matrix B_{X_n, Y_n} is orthonormal by its structure:

    B_{X_n, Y_n}^T B_{X_n, Y_n} = B_{X_n, Y_n} B_{X_n, Y_n}^T = I.

To establish Gaussianity of \{Z_n\}, we need to show that the characteristic function \Phi_n(u_{0:n}) = E[e^{j u_{0:n}^T Z_{0:n}}], where Z_{0:n} = [Z_0^T, Z_1^T, ..., Z_n^T]^T and u_{0:n} = [u_{0,X}, u_{0,Y}, u_{1,X}, u_{1,Y}, ..., u_{n,X}, u_{n,Y}]^T, has an exponent which is a quadratic function of u_{0:n}. This can be performed by using the Markov property of \{Z_n\} and the orthonormality of B_{X_n, Y_n}. Starting with

    \Phi_n(u_{0:n}) = E[E[e^{j u_{0:n}^T Z_{0:n}} | X_{0:n-1}, Y_{0:n-1}]] = E[e^{j u_{0:n-1}^T Z_{0:n-1}} E[e^{j(u_{n,X} X_n + u_{n,Y} Y_n)} | X_{0:n-1}, Y_{0:n-1}]],

one can show that

    E[e^{j(u_{n,X} X_n + u_{n,Y} Y_n)} | X_{0:n-1}, Y_{0:n-1}] = e^{j(a u_{n,X} X_{n-1} + b u_{n,Y} X_{n-1})} e^{-\frac{(u_{n,X})^2 + (u_{n,Y})^2}{2}},

since B_{X_{n-1}, Y_{n-1}} [\epsilon_n, \zeta_n]^T is a standard Gaussian vector given the past. Proceeding inductively on the term E[e^{j u_{0:n-1}^T Z_{0:n-1}} e^{j(a u_{n,X} X_{n-1} + b u_{n,Y} X_{n-1})}], one can establish the aforementioned desired form.

(b) Since Z_n is a Gaussian vector, the corresponding probability density function is completely characterized by the corresponding mean vector and covariance matrix. Clearly, E[Z_n] = 0,

since Z_0 is zero mean. Also,

    P_n = E[Z_n Z_n^T] = [ a  0 ] P_{n-1} [ a  0 ]^T + I,   P_0 = \Sigma.
                         [ b  0 ]         [ b  0 ]

Therefore,

    f_{Z_n}(z_n) = \frac{1}{2\pi det(P_n)^{1/2}} e^{-\frac{1}{2} z_n^T P_n^{-1} z_n}.

(c) Using the orthonormality of B_{X_n, Y_n}, it is easy to see that the process

    \frac{1}{\sqrt{X_n^2 + Y_n^2}} [  X_n  Y_n ] [ \epsilon_{n+1} ]
                                   [ -Y_n  X_n ] [ \zeta_{n+1}    ]

is Gaussian. Employing the Markov property of \{Z_n\}, one can establish that

    f_{X_0,...,X_n,Y_0,...,Y_n}(x_0, ..., x_n, y_0, ..., y_n) = \frac{1}{2\pi det(\Sigma)^{1/2}} e^{-\frac{1}{2} z_0^T \Sigma^{-1} z_0} \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} e^{-\frac{(x_i - a x_{i-1})^2}{2}} \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} e^{-\frac{(y_i - b x_{i-1})^2}{2}},

by following a similar approach as in part (a).

8. Markov Chains again (countable state space)

Consider a discrete-time Markov chain X with state space \mathcal{X}. Let the probability flow from a set A \subseteq \mathcal{X} to its complement A^c (with respect to \mathcal{X}) under a distribution \pi be given by:

    F(A, A^c) = \sum_{i \in A} \sum_{j \in A^c} \pi(i) p_{i,j}.

Theorem: \pi is a stationary distribution if and only if \sum_i \pi(i) = 1 and F(A, A^c) = F(A^c, A) for all A \subseteq \mathcal{X}.

Consider the birth-death chain of Figure 1, with state space \{0, 1, 2, ...\} and transition probability p to the right and q to the left, where p + q = 1.

[Figure 1: Markov Chain]

Find the stationary distribution of this chain and specify a condition under which this distribution exists.
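Two claims in Problem 7 are easy to test numerically: the matrix B is orthonormal for any nonzero (x, y), and one step of the recursion reproduces the covariance update P_1 = A \Sigma A^T + I. A NumPy sketch; a, b and \Sigma are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
a, b = 0.5, 0.3
A = np.array([[a, 0.0], [b, 0.0]])
Sigma0 = np.array([[1.0, 0.2], [0.2, 1.0]])

def B(x, y):
    """The orthonormal matrix B_{x,y} from part (a)."""
    r = np.hypot(x, y)
    return np.array([[x, y], [-y, x]]) / r

orth_err = float(np.abs(B(0.7, -1.9) @ B(0.7, -1.9).T - np.eye(2)).max())

# Monte Carlo check of one recursion step: P_1 = A Sigma0 A^T + I
n = 200_000
Z0 = rng.multivariate_normal([0.0, 0.0], Sigma0, size=n)
eps, zeta = rng.normal(size=n), rng.normal(size=n)
r = np.hypot(Z0[:, 0], Z0[:, 1])
X1 = a * Z0[:, 0] + (Z0[:, 0] * eps + Z0[:, 1] * zeta) / r
Y1 = b * Z0[:, 0] + (Z0[:, 0] * zeta - Z0[:, 1] * eps) / r

P1_mc = np.cov(np.vstack([X1, Y1]))     # empirical covariance of Z_1
P1 = A @ Sigma0 @ A.T + np.eye(2)       # recursion from part (b)
```

The empirical covariance matches the recursion to within Monte Carlo error.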

Note: You can find the stationary distribution using the classical approach. Nevertheless, the provided theorem can be used to simplify the derivation.

Option 1 (without the theorem): A stationary distribution \pi (in (infinite) vector form) must satisfy \pi P = \pi. We therefore obtain the following balance equation:

    p \pi(j-1) + q \pi(j+1) = \pi(j),   j \ge 1.

The characteristic polynomial for this recursion is

    q z^2 - z + p = 0,

with roots \lambda_1 = 1 and \lambda_2 = p/q. Therefore, the stationary distribution must have the form:

    \pi(j) = c_1 \lambda_1^j + c_2 \lambda_2^j = c_1 + c_2 (p/q)^j.

For convergence of \sum_j \pi(j) we require c_1 = 0 and p/q < 1, i.e., p < 1/2. The value of c_2 can be obtained by requiring \sum_j \pi(j) = 1 and equals c_2 = 1 - p/q.

Option 2 (using the provided theorem): Pick A = \{0, 1, ..., i\}, whose complement is A^c = \{i+1, i+2, ...\}. The respective flows from A to A^c and vice versa are easy to compute, since the interaction between A and A^c occurs only between states i and i+1:

    F(A, A^c) = \pi(i) p,   F(A^c, A) = \pi(i+1) q.

Equating the two flows, \pi(i) p = \pi(i+1) q yields a geometric sequence:

    \pi(i) = \pi(0) (p/q)^i.

As before, \sum_j \pi(j) converges only when p/q < 1, i.e., p < 1/2 (otherwise no stationary distribution exists, as the state is more likely to drift towards infinity and never return). Obtaining the stationary distribution is then easy:

    1 = \sum_{i \ge 0} \pi(i) = \pi(0) \sum_{i \ge 0} (p/q)^i = \frac{\pi(0)}{1 - p/q}   =>   \pi(i) = (1 - \frac{p}{q}) (\frac{p}{q})^i,   i \ge 0.
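The flow-balance characterization in Problem 8 can be verified directly on a truncation of the chain. A plain-Python sketch; p = 0.3 is an arbitrary choice satisfying p < 1/2:

```python
p = 0.3
q = 1.0 - p
rho = p / q                 # < 1, so the stationary distribution exists

# pi(i) = (1 - rho) * rho^i on a truncation of the state space
pi = [(1.0 - rho) * rho ** i for i in range(200)]

# cut balance across {0..i} | {i+1, ...}: pi(i) * p = pi(i+1) * q
balance_err = max(abs(pi[i] * p - pi[i + 1] * q) for i in range(199))
total = sum(pi)             # ~ 1; the geometric tail beyond 200 is negligible
```

Every cut is balanced and the truncated distribution sums to 1 up to a vanishing geometric tail.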


More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

SRI VIDYA COLLEGE OF ENGINEERING AND TECHNOLOGY UNIT 3 RANDOM PROCESS TWO MARK QUESTIONS

SRI VIDYA COLLEGE OF ENGINEERING AND TECHNOLOGY UNIT 3 RANDOM PROCESS TWO MARK QUESTIONS UNIT 3 RANDOM PROCESS TWO MARK QUESTIONS 1. Define random process? The sample space composed of functions of time is called a random process. 2. Define Stationary process? If a random process is divided

More information

Adaptive Systems Homework Assignment 1

Adaptive Systems Homework Assignment 1 Signal Processing and Speech Communication Lab. Graz University of Technology Adaptive Systems Homework Assignment 1 Name(s) Matr.No(s). The analytical part of your homework (your calculation sheets) as

More information

STAT 100C: Linear models

STAT 100C: Linear models STAT 100C: Linear models Arash A. Amini June 9, 2018 1 / 56 Table of Contents Multiple linear regression Linear model setup Estimation of β Geometric interpretation Estimation of σ 2 Hat matrix Gram matrix

More information

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be  ed to TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let

More information

16.584: Random Vectors

16.584: Random Vectors 1 16.584: Random Vectors Define X : (X 1, X 2,..X n ) T : n-dimensional Random Vector X 1 : X(t 1 ): May correspond to samples/measurements Generalize definition of PDF: F X (x) = P[X 1 x 1, X 2 x 2,...X

More information

Statistics 351 Probability I Fall 2006 (200630) Final Exam Solutions. θ α β Γ(α)Γ(β) (uv)α 1 (v uv) β 1 exp v }

Statistics 351 Probability I Fall 2006 (200630) Final Exam Solutions. θ α β Γ(α)Γ(β) (uv)α 1 (v uv) β 1 exp v } Statistics 35 Probability I Fall 6 (63 Final Exam Solutions Instructor: Michael Kozdron (a Solving for X and Y gives X UV and Y V UV, so that the Jacobian of this transformation is x x u v J y y v u v

More information

Gaussian Basics Random Processes Filtering of Random Processes Signal Space Concepts

Gaussian Basics Random Processes Filtering of Random Processes Signal Space Concepts White Gaussian Noise I Definition: A (real-valued) random process X t is called white Gaussian Noise if I X t is Gaussian for each time instance t I Mean: m X (t) =0 for all t I Autocorrelation function:

More information

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes EAS 305 Random Processes Viewgraph 1 of 10 Definitions: Random Processes A random process is a family of random variables indexed by a parameter t T, where T is called the index set λ i Experiment outcome

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the

More information

ECE Homework Set 3

ECE Homework Set 3 ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3

More information

Introduction to Probability and Stochastic Processes I

Introduction to Probability and Stochastic Processes I Introduction to Probability and Stochastic Processes I Lecture 3 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark Slides

More information

Lecture 04: Discrete Frequency Domain Analysis (z-transform)

Lecture 04: Discrete Frequency Domain Analysis (z-transform) Lecture 04: Discrete Frequency Domain Analysis (z-transform) John Chiverton School of Information Technology Mae Fah Luang University 1st Semester 2009/ 2552 Outline Overview Lecture Contents Introduction

More information

ELEN E4810: Digital Signal Processing Topic 2: Time domain

ELEN E4810: Digital Signal Processing Topic 2: Time domain ELEN E4810: Digital Signal Processing Topic 2: Time domain 1. Discrete-time systems 2. Convolution 3. Linear Constant-Coefficient Difference Equations (LCCDEs) 4. Correlation 1 1. Discrete-time systems

More information

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1 EE 650 Lecture 4 Intro to Estimation Theory Random Vectors EE 650 D. Van Alphen 1 Lecture Overview: Random Variables & Estimation Theory Functions of RV s (5.9) Introduction to Estimation Theory MMSE Estimation

More information

Discrete Time Fourier Transform (DTFT)

Discrete Time Fourier Transform (DTFT) Discrete Time Fourier Transform (DTFT) 1 Discrete Time Fourier Transform (DTFT) The DTFT is the Fourier transform of choice for analyzing infinite-length signals and systems Useful for conceptual, pencil-and-paper

More information

5 Operations on Multiple Random Variables

5 Operations on Multiple Random Variables EE360 Random Signal analysis Chapter 5: Operations on Multiple Random Variables 5 Operations on Multiple Random Variables Expected value of a function of r.v. s Two r.v. s: ḡ = E[g(X, Y )] = g(x, y)f X,Y

More information

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science. Fall Solutions for Problem Set 2

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science. Fall Solutions for Problem Set 2 Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Issued: Tuesday, September 5. 6.: Discrete-Time Signal Processing Fall 5 Solutions for Problem Set Problem.

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

1. Stochastic Processes and filtrations

1. Stochastic Processes and filtrations 1. Stochastic Processes and 1. Stoch. pr., A stochastic process (X t ) t T is a collection of random variables on (Ω, F) with values in a measurable space (S, S), i.e., for all t, In our case X t : Ω S

More information

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else ECE 450 Homework #3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3 4 5

More information

EE538 Final Exam Fall :20 pm -5:20 pm PHYS 223 Dec. 17, Cover Sheet

EE538 Final Exam Fall :20 pm -5:20 pm PHYS 223 Dec. 17, Cover Sheet EE538 Final Exam Fall 005 3:0 pm -5:0 pm PHYS 3 Dec. 17, 005 Cover Sheet Test Duration: 10 minutes. Open Book but Closed Notes. Calculators ARE allowed!! This test contains five problems. Each of the five

More information

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

EE538 Final Exam Fall 2007 Mon, Dec 10, 8-10 am RHPH 127 Dec. 10, Cover Sheet

EE538 Final Exam Fall 2007 Mon, Dec 10, 8-10 am RHPH 127 Dec. 10, Cover Sheet EE538 Final Exam Fall 2007 Mon, Dec 10, 8-10 am RHPH 127 Dec. 10, 2007 Cover Sheet Test Duration: 120 minutes. Open Book but Closed Notes. Calculators allowed!! This test contains five problems. Each of

More information

QUALIFYING EXAM IN SYSTEMS ENGINEERING

QUALIFYING EXAM IN SYSTEMS ENGINEERING QUALIFYING EXAM IN SYSTEMS ENGINEERING Written Exam: MAY 23, 2017, 9:00AM to 1:00PM, EMB 105 Oral Exam: May 25 or 26, 2017 Time/Location TBA (~1 hour per student) CLOSED BOOK, NO CHEAT SHEETS BASIC SCIENTIFIC

More information

Lecture Note 12: Kalman Filter

Lecture Note 12: Kalman Filter ECE 645: Estimation Theory Spring 2015 Instructor: Prof. Stanley H. Chan Lecture Note 12: Kalman Filter LaTeX prepared by Stylianos Chatzidakis) May 4, 2015 This lecture note is based on ECE 645Spring

More information

ECE531 Lecture 12: Linear Estimation and Causal Wiener-Kolmogorov Filtering

ECE531 Lecture 12: Linear Estimation and Causal Wiener-Kolmogorov Filtering ECE531 Lecture 12: Linear Estimation and Causal Wiener-Kolmogorov Filtering D. Richard Brown III Worcester Polytechnic Institute 16-Apr-2009 Worcester Polytechnic Institute D. Richard Brown III 16-Apr-2009

More information

Problem Sheet 1 Examples of Random Processes

Problem Sheet 1 Examples of Random Processes RANDOM'PROCESSES'AND'TIME'SERIES'ANALYSIS.'PART'II:'RANDOM'PROCESSES' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''Problem'Sheets' Problem Sheet 1 Examples of Random Processes 1. Give

More information

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X

More information

3F1 Random Processes Examples Paper (for all 6 lectures)

3F1 Random Processes Examples Paper (for all 6 lectures) 3F Random Processes Examples Paper (for all 6 lectures). Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories

More information

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3.0 INTRODUCTION The purpose of this chapter is to introduce estimators shortly. More elaborated courses on System Identification, which are given

More information

Discrete time processes

Discrete time processes Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following

More information

Review of some mathematical tools

Review of some mathematical tools MATHEMATICAL FOUNDATIONS OF SIGNAL PROCESSING Fall 2016 Benjamín Béjar Haro, Mihailo Kolundžija, Reza Parhizkar, Adam Scholefield Teaching assistants: Golnoosh Elhami, Hanjie Pan Review of some mathematical

More information

Stochastic Processes. Chapter Definitions

Stochastic Processes. Chapter Definitions Chapter 4 Stochastic Processes Clearly data assimilation schemes such as Optimal Interpolation are crucially dependent on the estimates of background and observation error statistics. Yet, we don t know

More information

MIT Spring 2015

MIT Spring 2015 Regression Analysis MIT 18.472 Dr. Kempthorne Spring 2015 1 Outline Regression Analysis 1 Regression Analysis 2 Multiple Linear Regression: Setup Data Set n cases i = 1, 2,..., n 1 Response (dependent)

More information

Digital Control & Digital Filters. Lectures 1 & 2

Digital Control & Digital Filters. Lectures 1 & 2 Digital Controls & Digital Filters Lectures 1 & 2, Professor Department of Electrical and Computer Engineering Colorado State University Spring 2017 Digital versus Analog Control Systems Block diagrams

More information

ESTIMATION THEORY. Chapter Estimation of Random Variables

ESTIMATION THEORY. Chapter Estimation of Random Variables Chapter ESTIMATION THEORY. Estimation of Random Variables Suppose X,Y,Y 2,...,Y n are random variables defined on the same probability space (Ω, S,P). We consider Y,...,Y n to be the observed random variables

More information

5 Linear Algebra and Inverse Problem

5 Linear Algebra and Inverse Problem 5 Linear Algebra and Inverse Problem 5.1 Introduction Direct problem ( Forward problem) is to find field quantities satisfying Governing equations, Boundary conditions, Initial conditions. The direct problem

More information

Solutions: Homework Set # 5

Solutions: Homework Set # 5 Signal Processing for Communications EPFL Winter Semester 2007/2008 Prof. Suhas Diggavi Handout # 22, Tuesday, November, 2007 Solutions: Homework Set # 5 Problem (a) Since h [n] = 0, we have (b) We can

More information

State-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Financial Econometrics / 49

State-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Financial Econometrics / 49 State-space Model Eduardo Rossi University of Pavia November 2013 Rossi State-space Model Financial Econometrics - 2013 1 / 49 Outline 1 Introduction 2 The Kalman filter 3 Forecast errors 4 State smoothing

More information