Stochastic Process II Dr.-Ing. Sudchai Boonto


Dr.-Ing. Sudchai Boonto, Department of Control System and Instrumentation Engineering, King Mongkut's University of Technology Thonburi, Thailand

Random process
Consider a random experiment specified by the outcomes ρ from some sample space S, by the events defined on S, and by the probabilities on these events. Suppose that to every outcome ρ ∈ S we assign a function of time according to some rule: X(t, ρ), t ∈ I. The graph of the function X(t, ρ) versus t, for fixed ρ, is called a realization, sample path, or sample function of the random process. For a fixed t_i from the index set I, X(t_i, ρ) is a random variable. The indexed family of random variables {X(t, ρ), t ∈ I} is called a random process or stochastic process.

Random process example
[Figure: three noise generators driven by the outcomes ρ_1, ρ_2, ρ_3 produce the sample paths x(t, ρ_1), x(t, ρ_2), x(t, ρ_3); the values x(t_1, ρ_j) and x(t_2, ρ_j) at the fixed times t_1 and t_2 are random variables.]

Random process: continuous vs discrete
A random process is said to be discrete-time if the index set I is a countable set, e.g., the set of integers or the set of nonnegative integers. We will usually use k to denote the time index and x(k) to denote the random process. A continuous-time stochastic process is one in which I is continuous, e.g., the real line or the nonnegative real line.

Random process: random sinusoids
Let ρ be selected at random from the interval [−1, 1]. Define the continuous-time random process X(t, ρ) by
X(t, ρ) = ρ sin(2πt),   −∞ < t < ∞.
The realizations of this random process are sinusoids with amplitude ρ, as shown in the figure for ρ = 0.9, ρ = 0.4, and ρ = 0.2.

Random process: random sinusoids
Let ρ be selected at random from the interval [−π, π], and let Y(t, ρ) = sin(2πt + ρ). The realizations of Y(t, ρ) are time-shifted versions of sin(2πt), as shown in the figure for ρ = 0 and ρ = π/2.
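As a quick numerical illustration (not part of the original slides), the Python/NumPy sketch below draws a few random outcomes ρ and generates the corresponding sample paths of X(t, ρ) = ρ sin(2πt) and Y(t, ρ) = sin(2πt + ρ); the time grid and the number of realizations are arbitrary choices.

# A minimal sketch, assuming rho ~ U[-1, 1] for X and rho ~ U[-pi, pi] for Y.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 401)          # time grid for the realizations

# Three realizations of X: each draw of rho fixes one sample path.
for rho in rng.uniform(-1.0, 1.0, size=3):
    x_path = rho * np.sin(2 * np.pi * t)
    print(f"X realization with rho = {rho:+.2f}, max amplitude = {np.abs(x_path).max():.2f}")

# Three realizations of Y: each draw of rho is a random phase shift.
for rho in rng.uniform(-np.pi, np.pi, size=3):
    y_path = np.sin(2 * np.pi * t + rho)
    print(f"Y realization with phase rho = {rho:+.2f}")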

Distribution and Density Functions
Consider the time sequence {x(k, ρ)}_{k=0}^{N−1}. The k-th sample x(k, ρ_j) of each run is a random variable. The first-order distribution function is defined as
F_{x(k)}(α) = P[x(k) ≤ α].
Assuming that this distribution function is differentiable, the first-order density function is defined as
f_{x(k)}(α) = dF_{x(k)}(α)/dα.

Distribution and Density Functions
The joint distribution function is
F_{x(t_1),x(t_2)}(α_1, α_2) = P[x(t_1) ≤ α_1, x(t_2) ≤ α_2],
and the joint density function is
f_{x(t_1),x(t_2)}(α_1, α_2) = ∂²F_{x(t_1),x(t_2)}(α_1, α_2) / (∂α_1 ∂α_2).

Expectations of random signals
Denoting the random signal by the time sequence x(k), the mean is also a time sequence, given by
µ_x(k) = E[x(k)] = ∫ α f_{x(k)}(α) dα.
The auto-correlation function of a random process x(k) is defined as the joint moment of x(k) and x(l):
R_x(k, l) = E[x(k)x(l)] = ∫∫ α_1 α_2 f_{x(k),x(l)}(α_1, α_2) dα_1 dα_2.
In general, the auto-correlation is a function of both k and l. The auto-covariance function of a random process x(k) is defined as the covariance of x(k) and x(l):
C_x(k, l) = E[(x(k) − µ_x(k))(x(l) − µ_x(l))] = R_x(k, l) − µ_x(k)µ_x(l).
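These expectations can be approximated by ensemble averages over many independent realizations. The sketch below is a minimal illustration of this idea, using an arbitrary test process (a random amplitude times a cosine plus noise) that is not taken from the slides.

# A minimal sketch: Monte Carlo ensemble averages approximate mu_x(k), R_x(k,l), C_x(k,l).
import numpy as np

rng = np.random.default_rng(1)
M, N = 20000, 8                            # M realizations, N time samples each
A = rng.normal(1.0, 0.5, size=(M, 1))      # random amplitude per realization (assumption)
k = np.arange(N)
x = A * np.cos(0.3 * k) + rng.normal(0.0, 0.1, size=(M, N))

mu = x.mean(axis=0)                        # mu_x(k)  ~ E[x(k)]
R = (x.T @ x) / M                          # R_x(k,l) ~ E[x(k) x(l)]
C = R - np.outer(mu, mu)                   # C_x(k,l) = R_x(k,l) - mu_x(k) mu_x(l)

print("mean      :", np.round(mu, 3))
print("var (diag):", np.round(np.diag(C), 3))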

Cross-correlation function
For k, l = 1, 2, the auto-covariance values can be arranged in a matrix
C_x = E [ (x(1) − µ_x(1))²    ∗
          ∗                   (x(2) − µ_x(2))² ],
where the entries ∗ are the cross terms. In particular, the diagonal entries give C_x(k, k) = var[x(k)]. The correlation coefficient of x(k) and x(l) is defined as
ρ_x(k, l) = C_x(k, l) / √(C_x(k, k) C_x(l, l)).
Note that C_x(k, k) and C_x(l, l) are scalars. The random signals x(k) and y(k) are uncorrelated if C_xy(k, l) = 0 for all k, l, and orthogonal if R_xy(k, l) = 0 for all k, l.

Random signals: example 1
Let x(k) = A cos(2πk), where A is some random variable. The mean of x(k) is
µ_x(k) = E[A cos(2πk)] = E[A] cos(2πk).
The auto-correlation is
R_x(k, l) = E[(A cos(2πk))(A cos(2πl))] = E[A²] cos(2πk) cos(2πl).
The auto-covariance is then
C_x(k, l) = R_x(k, l) − µ_x(k)µ_x(l) = {E[A²] − E[A]²} cos(2πk) cos(2πl) = var[A] cos(2πk) cos(2πl).

Random signals: example 2
Let x(k) = cos(ωk + Θ), where Θ is uniformly distributed in the interval (−π, π). The mean of x(k) is
µ_x(k) = E[cos(ωk + Θ)] = (1/2π) ∫_{−π}^{π} cos(ωk + θ) dθ = 0.
The auto-correlation and auto-covariance are then
C_x(k, l) = R_x(k, l) = E[cos(ωk + Θ) cos(ωl + Θ)]
          = (1/2π) ∫_{−π}^{π} ½ {cos(ω(k − l)) + cos(ω(k + l) + 2θ)} dθ
          = ½ cos(ω(k − l)),
using the identity cos(a) cos(b) = ½ {cos(a + b) + cos(a − b)}.
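A quick Monte Carlo check of this result (not from the slides): drawing many samples of Θ ~ U(−π, π) and averaging the product cos(ωk + Θ) cos(ωl + Θ) should reproduce ½ cos(ω(k − l)). The values of ω, k, and l below are arbitrary.

# A minimal sketch verifying R_x(k, l) = 0.5*cos(omega*(k - l)) for Example 2.
import numpy as np

rng = np.random.default_rng(4)
omega, M = 0.7, 200000
theta = rng.uniform(-np.pi, np.pi, size=M)

k, l = 5, 2
R_est = np.mean(np.cos(omega * k + theta) * np.cos(omega * l + theta))
print("estimate :", round(R_est, 4))
print("theory   :", round(0.5 * np.cos(omega * (k - l)), 4))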

Gaussian random signals
A discrete-time random signal x(k) is a Gaussian random signal if every collection of a finite number of samples of this random signal is jointly Gaussian. The probability density function of the samples x(k), k = 0, 1, 2, ..., N−1, of a Gaussian random signal is given by
f_{x(0),x(1),...,x(N−1)}(α_0, α_1, ..., α_{N−1}) = 1 / ((2π)^{N/2} det(C_x)^{1/2}) · exp( −½ (α − µ_x)^T C_x^{−1} (α − µ_x) ),
where
µ_x = [E[x(0)]  E[x(1)]  ···  E[x(N−1)]]^T

Gaussian random signals
and
C_x = [ C_x(0, 0)       C_x(0, 1)       ···   C_x(0, N−1)
        C_x(1, 0)       C_x(1, 1)       ···   C_x(1, N−1)
          ⋮               ⋮                      ⋮
        C_x(N−1, 0)     C_x(N−1, 1)     ···   C_x(N−1, N−1) ].
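The sketch below (a minimal illustration, not from the slides) draws one realization of a Gaussian random signal from a given mean vector µ_x and covariance matrix C_x, and evaluates the joint density above at that sample; the particular choice C_x(k, l) = 0.8^|k−l| is an arbitrary assumption.

# A minimal sketch: sampling a Gaussian random signal and evaluating its joint pdf.
import numpy as np

N = 4
mu = np.zeros(N)
# Example covariance: C_x(k, l) = 0.8**|k - l| (an assumption for illustration).
C = 0.8 ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

rng = np.random.default_rng(5)
x = rng.multivariate_normal(mu, C)            # one realization of x(0), ..., x(N-1)

d = x - mu
pdf = np.exp(-0.5 * d @ np.linalg.solve(C, d)) / (
    (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(C)))
print("sample:", np.round(x, 3), " joint pdf value:", pdf)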

IID random signals
IID stands for independent, identically distributed. An IID random signal x(k) is a sequence of independent, identically distributed random variables with common distribution function F_x(α), mean µ, and variance σ²:
F_{x(0),x(1),...,x(N−1)}(α_0, α_1, ..., α_{N−1}) = F_x(α_0) F_x(α_1) ··· F_x(α_{N−1}).
The mean of an IID process is obtained from
µ_x(k) = E[x(k)] = µ,  for all k.
Thus the mean is constant.

IID random signals
The auto-covariance function is obtained as follows. For k ≠ l,
C_x(k, l) = E[(x(k) − µ)(x(l) − µ)] = E[x(k) − µ] E[x(l) − µ] = 0,
since x(k) and x(l) are independent random variables. For k = l,
C_x(k, k) = E[(x(k) − µ)²] = σ².
In compact form, C_x(k, l) = σ² δ_{k,l}, where δ_{k,l} = 1 if k = l and 0 otherwise.

IID random signals
The auto-correlation function of the IID process is obtained from
R_x(k, l) = C_x(k, l) + µ².
The second term is µ² because the mean of an IID signal is constant.
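As a numerical illustration (not from the slides), the sketch below estimates R_x(k, l) and C_x(k, l) for an IID Gaussian sequence with mean µ and variance σ²; the estimated auto-covariance should be close to σ² on the diagonal and zero elsewhere.

# A minimal sketch: for an IID sequence, C_x(k, l) ~ sigma^2 * delta(k - l)
# and R_x(k, l) = C_x(k, l) + mu^2.
import numpy as np

rng = np.random.default_rng(6)
mu, sigma = 2.0, 1.5
M, N = 100000, 4
x = mu + sigma * rng.standard_normal((M, N))   # IID Gaussian, chosen for concreteness

R = (x.T @ x) / M                              # estimated R_x(k, l)
C = R - mu**2                                  # estimated C_x(k, l)
print("C_x ~\n", np.round(C, 2))               # ~ sigma^2 on the diagonal, 0 off it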

Stationary random signals
A discrete-time random signal x(k) is stationary if the joint probability distribution function of any finite number of samples does not depend on the placement of the time origin, that is,
F_{x(k_0),x(k_1),...,x(k_{N−1})}(α_0, α_1, ..., α_{N−1}) = F_{x(k_0+τ),x(k_1+τ),...,x(k_{N−1}+τ)}(α_0, α_1, ..., α_{N−1}),  for all τ ∈ Z.
The first-order probability distribution function of a stationary random process must therefore be independent of time, i.e.,
F_{x(k)}(α) = F_{x(k+τ)}(α) = F_x(α),  for all k, τ.

Stationary random signals
This implies that the mean and variance of x(k) are constant and independent of time:
µ_x(k) = E[x(k)] = µ_x,  for all k,
var[x(k)] = E[(x(k) − µ_x)²] = σ_x²,  for all k.
The second-order probability distribution function of a stationary random process can depend only on the time difference between the samples and not on the particular time of the samples:
F_{x(k),x(l)}(α_1, α_2) = F_{x(0),x(k−l)}(α_1, α_2),  for all k, l.

Stationary random signals
This implies that the auto-correlation and the auto-covariance of x(k) can depend only on k − l:
R_x(k, l) = R_x(k − l),  for all k, l,
C_x(k, l) = C_x(k − l),  for all k, l.
For an IID random signal, the joint distribution function for the samples at any N time instants 0, ..., N−1 is
F_{x(0),x(1),...,x(N−1)}(α_0, α_1, ..., α_{N−1}) = F_x(α_0) F_x(α_1) ··· F_x(α_{N−1}) = F_{x(0−τ),x(1−τ),...,x(N−1−τ)}(α_0, α_1, ..., α_{N−1}).
Thus the IID process is also stationary.

Wide-Sense Stationary random signals
In many situations we cannot determine whether a random process is stationary, but we can determine whether the mean is a constant,
µ_x(k) = m,  for all k,
and whether the auto-covariance (or, equivalently, the auto-correlation) is a function of k − l only:
C_x(k, l) = C_x(k − l),  for all k, l.
A random process that satisfies both conditions above is called a wide-sense stationary (WSS) process.

Wide-Sense Stationary random signals
A random signal x(k) is wide-sense stationary (WSS) if the following three conditions are satisfied:
i. its mean is constant: µ_x(k) = E[x(k)] = µ_x;
ii. its auto-correlation function R_x(k, l) depends only on the lag k − l;
iii. its variance is finite: var[x(k)] = E[(x(k) − µ_x)²] < ∞.

Wide-Sense Stationary random signals
Example: Let x(k) consist of two interleaved sequences of independent random variables. For k even, x(k) assumes the values ±1 with probability 1/2 each; for k odd, x(k) assumes the values 1/3 and −3 with probabilities 9/10 and 1/10, respectively. x(k) is not stationary, since its distribution varies with k. Its mean is
µ_x(k) = Σ_α α P[x(k) = α] = 1(1/2) − 1(1/2) = 0 for k even,  and  (1/3)(9/10) − 3(1/10) = 0 for k odd,
so µ_x(k) = 0 for all k. Since the samples are independent, the covariance function is
C_x(k, l) = E[x(k)] E[x(l)] = 0 for k ≠ l,  and  C_x(k, k) = E[x(k)²] = 1.
Therefore x(k) is wide-sense stationary.
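A quick simulation of this example (not from the slides) confirms that the even and odd samples have the same mean 0 and variance 1, even though their distributions differ:

# A minimal sketch: even k -> x(k) in {-1, +1} w.p. 1/2 each;
# odd k -> x(k) in {1/3, -3} w.p. 9/10 and 1/10. Both have mean 0, variance 1.
import numpy as np

rng = np.random.default_rng(2)
M = 200000
even = rng.choice([-1.0, 1.0], size=M, p=[0.5, 0.5])
odd = rng.choice([1.0 / 3.0, -3.0], size=M, p=[0.9, 0.1])

print("even k: mean %.3f, var %.3f" % (even.mean(), even.var()))
print("odd  k: mean %.3f, var %.3f" % (odd.mean(), odd.var()))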

Wide-Sense Stationary random signals
Example: white Gaussian noise (WGN). Since x(k) ~ N(0, σ²) for all k, we have µ_x(k) = 0 and σ_x²(k) = σ². Recalling that the auto-covariance of a sample with itself is just the variance, and that distinct samples are independent, we have
C_x(k, l) = 0 for k ≠ l,  and  C_x(k, l) = σ² for k = l,
or, in short, C_x(k, l) = σ_x² δ(k − l) for all k, l. In summary, for a WGN random process, µ_x(k) = 0 for all k and C_x(k, l) = σ² δ(k − l).
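The sketch below (not from the slides) estimates the sample covariance matrix of a white Gaussian noise sequence, which should be approximately σ²I:

# A minimal sketch: the sample covariance of WGN is approximately sigma^2 * I.
import numpy as np

rng = np.random.default_rng(3)
sigma = 0.5
M, N = 50000, 5                    # M realizations of N samples each
x = rng.normal(0.0, sigma, size=(M, N))

C = np.cov(x, rowvar=False)        # estimate of C_x(k, l)
print(np.round(C, 3))              # ~ 0.25 on the diagonal, ~ 0 elsewhere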

Wide-Sense Stationary random signals: auto-correlation function
The auto-correlation function R_x(τ) of a WSS random signal x(k) is symmetric in its argument τ, that is,
R_x(τ) = R_x(−τ),
since
R_x(τ) = E[x(k)x(k − τ)] = E[x(k − τ)x(k)] = R_x(−τ).
The auto-correlation function R_x(τ) of a WSS random signal x(k) also satisfies, for τ = 0,
R_x(0) = E[x(k)x(k)] ≥ 0.

Wide-Sense Stationary random signals: maximum of the auto-correlation function
The maximum of the auto-correlation function R_x(τ) of a WSS random signal x(k) occurs at τ = 0:
|R_x(τ)| ≤ R_x(0),  for all τ.
This follows from the Cauchy–Schwarz inequality, E[x(k)y(k)]² ≤ E[x(k)²] E[y(k)²]. Applying it to x(k + τ) and x(k), we obtain
R_x(τ)² = E[x(k)x(k + τ)]² ≤ E[x²(k)] E[x²(k + τ)] = R_x(0)².
Thus |R_x(τ)| ≤ R_x(0).

Ergodicity and time averages of random signals
Ergodicity of a random signal states that, for a stationary IID random signal x(k) with mean E[x(k)] = µ_x, the time average converges with probability unity to the mean value µ_x as the number of observations N goes to infinity. This is denoted by
P[ lim_{N→∞} (1/N) Σ_{k=0}^{N−1} x(k) = µ_x ] = 1.
More generally, the ergodic theorem states under what conditions statistical quantities characterizing a stationary random signal, such as its covariance function, can be derived with probability unity from a single realization of that random signal.

Ergodicity and time averages of random signals
Let {x(k)}_{k=0}^{N−1} and {y(k)}_{k=0}^{N−1} be realizations of the stationary random signals x(k) and y(k), respectively. Then, under the ergodicity assumption, we obtain relationships of the following kind:
P[ lim_{N→∞} (1/N) Σ_{k=0}^{N−1} x(k) = E[x(k)] ] = 1,
P[ lim_{N→∞} (1/N) Σ_{k=0}^{N−1} y(k) = E[y(k)] ] = 1.
If E[x(k)] and E[y(k)] are denoted by µ_x and µ_y, respectively, then
P[ lim_{N→∞} (1/N) Σ_{k=0}^{N−1} (x(k) − µ_x)(x(k − τ) − µ_x) = C_x(τ) ] = 1,
P[ lim_{N→∞} (1/N) Σ_{k=0}^{N−1} (x(k) − µ_x)(y(k − τ) − µ_y) = C_xy(τ) ] = 1.
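As a minimal illustration (not from the slides), the sketch below computes the time-average mean and a time-average covariance at lag τ = 3 from a single long IID realization; both should match the ensemble values (here µ_x = 1 and C_x(3) = 0):

# A minimal sketch: time averages over one long ergodic realization.
import numpy as np

rng = np.random.default_rng(7)
N = 200000
mu_x, sigma = 1.0, 2.0
x = mu_x + sigma * rng.standard_normal(N)       # one long IID realization

mu_hat = x.mean()                               # ~ E[x(k)] = 1.0
tau = 3
xc = x - mu_hat
C_hat = np.mean(xc[tau:] * xc[:-tau])           # ~ C_x(3) = 0 for an IID signal
print("time-average mean  :", round(mu_hat, 3))
print("time-average C_x(3):", round(C_hat, 4))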

Power spectra
The Fourier transform of a random signal is itself a random signal. To obtain a deterministic notion of the frequency content of a random signal, the power spectral density function, or power spectrum, is used. The spectrum of a signal can be thought of as the distribution of the signal's energy over the whole frequency band. Signal spectra are defined for WSS time sequences.

Power spectra: definition
Let x(k) and y(k) be two zero-mean WSS sequences with sampling time T. The (power) spectrum of x(k) is
Φ_x(ω) = Σ_{τ=−∞}^{∞} R_x(τ) e^{−jωτT},
and the cross-spectrum between x(k) and y(k) is
Φ_xy(ω) = Σ_{τ=−∞}^{∞} R_xy(τ) e^{−jωτT}.
The inverse DTFT applied to the spectrum yields
R_x(τ) = (T/2π) ∫_{−π/T}^{π/T} Φ_x(ω) e^{jωτT} dω.
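The sketch below (not from the slides) evaluates the spectrum definition numerically for an assumed auto-correlation R_x(τ) = a^|τ| with a = 0.5 and T = 1, truncating the infinite sum, and compares it with the closed-form spectrum (1 − a²)/(1 − 2a cos(ωT) + a²):

# A minimal sketch: Phi_x(omega) = sum_tau R_x(tau) * exp(-j*omega*tau*T)
# for R_x(tau) = a**|tau| (an assumed autocorrelation, a = 0.5, T = 1).
import numpy as np

T = 1.0                                         # sampling time
a = 0.5
taus = np.arange(-200, 201)                     # truncate the infinite sum
R = a ** np.abs(taus)

omega = np.linspace(-np.pi / T, np.pi / T, 1001)
Phi = np.real(np.sum(R[:, None] * np.exp(-1j * np.outer(taus, omega) * T), axis=0))

# Closed form for comparison: (1 - a^2) / (1 - 2*a*cos(omega*T) + a^2)
Phi_exact = (1 - a**2) / (1 - 2 * a * np.cos(omega * T) + a**2)
print("max abs error:", np.max(np.abs(Phi - Phi_exact)))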

Power spectra: properties
The power spectrum Φ_x(ω) is real-valued and symmetric with respect to ω, that is, Φ_x(−ω) = Φ_x(ω). Let a WSS random signal x(k) ∈ R with sampling time T and power spectrum Φ_x(ω) be given. Then
E[x(k)²] = (T/2π) ∫_{−π/T}^{π/T} Φ_x(ω) dω.
This property shows that the total energy of the signal x(k), given by E[x(k)²], is distributed over the frequency band −π/T ≤ ω ≤ π/T.

Power spectra: example
Let x(k) be a white-noise sequence with R_x(τ) = σ² δ(τ). Then
Φ_x(ω) = Σ_{τ=−∞}^{∞} R_x(τ) e^{−jωτT} = Σ_{τ=−∞}^{∞} σ² δ(τ) e^{−jωτT} = σ².
We need only consider frequencies in the range −π/T ≤ ω ≤ π/T, over which the spectrum is flat.
[Figure: Φ_x(ω) = σ², constant over −π/T ≤ ω ≤ π/T.]

Power spectra: filtering WSS random signals
Let u(k) be WSS and the input to a BIBO-stable LTI system with transfer function G(q) = Σ_{k=0}^{∞} g(k) q^{−k}, such that y(k) = G(q)u(k). Then
i. y(k) is WSS,
ii. Φ_yu(ω) = G(e^{jωT}) Φ_u(ω),
iii. Φ_y(ω) = |G(e^{jωT})|² Φ_u(ω).
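A numerical check of property iii (not from the slides, using NumPy and SciPy with an arbitrarily chosen stable filter G): white noise with Φ_u(ω) = σ² is passed through G, and the Welch estimate of Φ_y(ω) is compared with |G(e^{jωT})|² σ²:

# A minimal sketch: filtering white noise through a stable LTI filter and
# checking Phi_y(omega) ~ |G(e^{j*omega*T})|^2 * Phi_u(omega), with T = 1.
import numpy as np
from scipy import signal

rng = np.random.default_rng(8)
sigma, N = 1.0, 200000
u = sigma * rng.standard_normal(N)                  # white noise, Phi_u = sigma^2

b, a = [1.0, 0.5], [1.0, -0.8]                      # G(q) = (1 + 0.5 q^-1)/(1 - 0.8 q^-1), an assumption
y = signal.lfilter(b, a, u)

# Two-sided Welch estimate of Phi_y and the filter response on the same frequency grid.
f, Phi_y = signal.welch(y, fs=1.0, nperseg=4096, return_onesided=False)
_, G = signal.freqz(b, a, worN=2 * np.pi * f)

ratio = Phi_y / (np.abs(G) ** 2 * sigma**2)
print("mean of Phi_y / (|G|^2 Phi_u):", round(float(ratio.mean()), 3))  # ~ 1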

References
1. Michel Verhaegen and Vincent Verdult, Filtering and System Identification: A Least Squares Approach, Cambridge University Press.
2. Herbert Werner, Lecture notes on Control Systems Theory and Design, Hamburg University of Technology.
3. David T. Westwick and Robert E. Kearney, Identification of Nonlinear Physiological Systems, IEEE Press.
4. Alberto Leon-Garcia, Probability and Random Processes for Electrical Engineering, 2nd edition, Addison Wesley.
