Gaussian, Markov and stationary processes

1 Gaussian, Markov and stationary processes

Gonzalo Mateos
Dept. of ECE and Goergen Institute for Data Science
University of Rochester

November 16, 2018

2 Introduction and roadmap

Introduction and roadmap
Gaussian processes
Brownian motion and its variants
White Gaussian noise

3 Random processes

Random processes assign a function X(t) to a random event
  Without restrictions, there is little to say about them
The Markov property simplifies matters and is not too restrictive
We also constrained ourselves to discrete state spaces
  A further simplification, but one that might be too restrictive
Time t and the range of X(t) values are continuous in general
  Time and/or state may be discrete as particular cases
We restrict attention to (any one type, or a combination of types)
  Markov processes (memoryless)
  Gaussian processes (Gaussian probability distributions)
  Stationary processes (≈ limit distribution)

4 Markov processes

X(t) is a Markov process when the future is independent of the past
For all t > s and arbitrary values x(t), x(s) and x(u) for all u < s

  P(X(t) ≤ x(t) | X(s) = x(s), X(u) = x(u), u < s) = P(X(t) ≤ x(t) | X(s) = x(s))

The Markov property is defined in terms of cdfs, not pmfs
The Markov property is useful for the same reasons as in discrete time/state
  But not as useful as in discrete time/state
More details later

5 Gaussian processes

X(t) is a Gaussian process when all prob. distributions are Gaussian
For arbitrary n > 0 and times t_1, t_2, ..., t_n it holds that
  the values X(t_1), X(t_2), ..., X(t_n) are jointly Gaussian RVs
Simplifies the study because the Gaussian distribution is the simplest possible
  Suffices to know means, variances and (cross-)covariances
  A linear transformation of independent Gaussians is Gaussian
  A linear transformation of jointly Gaussian RVs is Gaussian
More details later

6 Markov processes + Gaussian processes

Markov (memoryless) and Gaussian properties are different
  Will study cases when both hold
Brownian motion, also known as the Wiener process
  Brownian motion with drift
  White noise
  Linear evolution models
Geometric Brownian motion
  Arbitrages
  Risk-neutral measures
  Pricing of stock options (Black-Scholes)

7 Stationary processes

Process X(t) is stationary if probabilities are invariant to time shifts
For arbitrary n > 0, times t_1, t_2, ..., t_n and an arbitrary time shift s

  P(X(t_1 + s) ≤ x_1, X(t_2 + s) ≤ x_2, ..., X(t_n + s) ≤ x_n)
    = P(X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_n) ≤ x_n)

The system's behavior is independent of the time origin
Follows from our success studying limit probabilities
  Study of a stationary process ≈ study of the limit distribution
Will study
  Spectral analysis of stationary random processes
  Linear filtering of stationary random processes
More details later

8 Gaussian processes

Introduction and roadmap
Gaussian processes
Brownian motion and its variants
White Gaussian noise

9 Jointly Gaussian random variables

Def: Random variables X_1, ..., X_n are jointly Gaussian (normal) if any
linear combination of them is Gaussian
  Given n > 0, for any scalars a_1, ..., a_n the RV (a = [a_1, ..., a_n]^T)

    Y = a_1 X_1 + a_2 X_2 + ... + a_n X_n = a^T X

  is Gaussian distributed
May also say the vector RV X = [X_1, ..., X_n]^T is Gaussian
Consider 2 dimensions ⇒ 2 RVs X_1 and X_2 are jointly normal
To describe the joint distribution we have to specify
  Means: µ_1 = E[X_1] and µ_2 = E[X_2]
  Variances: σ²_11 = var[X_1] = E[(X_1 - µ_1)²] and σ²_22 = var[X_2]
  Covariance: σ²_12 = cov(X_1, X_2) = E[(X_1 - µ_1)(X_2 - µ_2)] = σ²_21

10 Pdf of jointly Gaussian RVs in 2 dimensions

Define the mean vector µ = [µ_1, µ_2]^T and covariance matrix C ∈ R^{2×2}

  C = ( σ²_11  σ²_12 )
      ( σ²_21  σ²_22 )

C is symmetric, i.e., C^T = C, because σ²_21 = σ²_12
The joint pdf of X = [X_1, X_2]^T is given by

  f_X(x) = \frac{1}{2\pi \det^{1/2}(C)} \exp\left( -\frac{1}{2} (x - \mu)^T C^{-1} (x - \mu) \right)

Assumed that C is invertible, thus det(C) ≠ 0
If the pdf of X is f_X(x) above, one can verify Y = a^T X is Gaussian

11 Pdf of jointly Gaussian RVs in n dimensions

For X ∈ R^n (n dimensions) define µ = E[X] and the covariance matrix

  C := E[(X - µ)(X - µ)^T] = ( σ²_11  σ²_12  ...  σ²_1n )
                             ( σ²_21  σ²_22  ...  σ²_2n )
                             (  ...    ...   ...   ...  )
                             ( σ²_n1  σ²_n2  ...  σ²_nn )

C is symmetric; its (i, j)-th element is σ²_ij = cov(X_i, X_j)
The joint pdf of X is defined as before (almost, spot the difference)

  f_X(x) = \frac{1}{(2\pi)^{n/2} \det^{1/2}(C)} \exp\left( -\frac{1}{2} (x - \mu)^T C^{-1} (x - \mu) \right)

  C invertible and det(C) ≠ 0. All linear combinations normal
To fully specify the probability distribution of a Gaussian vector X
  The mean vector µ and covariance matrix C suffice

12 Notational aside and independence

With x ∈ R^n, µ ∈ R^n and C ∈ R^{n×n}, define the function N(x; µ, C) as

  N(x; µ, C) := \frac{1}{(2\pi)^{n/2} \det^{1/2}(C)} \exp\left( -\frac{1}{2} (x - \mu)^T C^{-1} (x - \mu) \right)

µ and C are parameters, x is the argument of the function
Let X ∈ R^n be a Gaussian vector with mean µ and covariance C
  Can write the pdf of X as f_X(x) = N(x; µ, C)
If X_1, ..., X_n are mutually independent, then C = diag(σ²_11, ..., σ²_nn) and

  f_X(x) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2_{ii}}} \exp\left( -\frac{(x_i - \mu_i)^2}{2\sigma^2_{ii}} \right)
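
As a quick numerical check (a sketch, not part of the original slides; assumes NumPy and SciPy are available, and all parameter values are illustrative), one can evaluate N(x; µ, C) straight from the formula above and compare against SciPy's implementation:

```python
# Sketch: evaluate the jointly Gaussian pdf N(x; mu, C) from the formula
# and compare with SciPy's multivariate_normal. Values are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

def gaussian_pdf(x, mu, C):
    """N(x; mu, C) = exp(-(x-mu)^T C^{-1} (x-mu)/2) / ((2 pi)^{n/2} det(C)^{1/2})."""
    n = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(C, d)          # (x-mu)^T C^{-1} (x-mu)
    norm_const = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * quad) / norm_const

mu = np.array([1.0, -2.0])
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])                    # symmetric, positive definite
x = np.array([0.5, -1.5])

print(gaussian_pdf(x, mu, C))                 # formula above
print(multivariate_normal(mu, C).pdf(x))      # SciPy agrees
```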

13 Gaussian processes

Gaussian processes (GPs) generalize Gaussian vectors to infinite dimensions
Def: X(t) is a GP if any linear combination of values of X(t) is Gaussian
  For arbitrary n > 0, times t_1, ..., t_n and constants a_1, ..., a_n

    Y = a_1 X(t_1) + a_2 X(t_2) + ... + a_n X(t_n)

  is Gaussian distributed
The time index t can be continuous or discrete
More generally, any linear functional of X(t) is normally distributed
  A functional is a function of a function
Ex: the (random) integral Y = \int_{t_1}^{t_2} X(t) dt is Gaussian distributed
  The integral functional is akin to a sum of X(t_i), for all t_i ∈ [t_1, t_2]

14 Joint pdfs in a Gaussian process

Consider times t_1, ..., t_n. The mean value µ(t_i) at such times is

  µ(t_i) = E[X(t_i)]

The covariance between values at times t_i and t_j is

  C(t_i, t_j) = E[(X(t_i) - µ(t_i))(X(t_j) - µ(t_j))]

The covariance matrix for the values X(t_1), ..., X(t_n) is then

  C(t_1, ..., t_n) = ( C(t_1, t_1)  C(t_1, t_2)  ...  C(t_1, t_n) )
                     ( C(t_2, t_1)  C(t_2, t_2)  ...  C(t_2, t_n) )
                     (     ...          ...      ...      ...     )
                     ( C(t_n, t_1)  C(t_n, t_2)  ...  C(t_n, t_n) )

The joint pdf of X(t_1), ..., X(t_n) is then given as

  f_{X(t_1),...,X(t_n)}(x_1, ..., x_n) = N([x_1, ..., x_n]^T; [µ(t_1), ..., µ(t_n)]^T, C(t_1, ..., t_n))
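
The sketch below (not from the slides; assumes NumPy) assembles µ(t_i) and C(t_i, t_j) on a time grid and draws samples. It uses the Brownian motion covariance C(t_1, t_2) = σ² min(t_1, t_2), derived later in these slides, purely as a concrete kernel:

```python
# Sketch: build mu(t_i) and C(t_i, t_j) on a grid of times and sample
# the resulting jointly Gaussian vector; the covariance is the Brownian
# motion kernel sigma^2 * min(t_i, t_j), used here as an example.
import numpy as np

sigma = 1.0
t = np.array([0.5, 1.0, 2.0, 3.0])            # times t_1, ..., t_n

mu = np.zeros_like(t)                         # mean function mu(t_i) = 0
C = sigma**2 * np.minimum.outer(t, t)         # C[i, j] = sigma^2 min(t_i, t_j)

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, C, size=100000)

print(np.round(np.cov(samples.T), 2))         # empirical covariance
print(C)                                      # should match C above
```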

15 Mean value and autocorrelation functions

To specify a Gaussian process, it suffices to specify:
  The mean value function µ(t) = E[X(t)]; and
  The autocorrelation function R(t_1, t_2) = E[X(t_1) X(t_2)]
The autocovariance is obtained as C(t_1, t_2) = R(t_1, t_2) - µ(t_1)µ(t_2)
For simplicity, will mostly consider processes with µ(t) = 0
  Otherwise, can define the process Y(t) = X(t) - µ_X(t)
  In such a case C(t_1, t_2) = R(t_1, t_2) because µ_Y(t) = 0
The autocorrelation is a symmetric function of the two variables t_1 and t_2

  R(t_1, t_2) = R(t_2, t_1)

16 Probabilities in a Gaussian process

All probs. in a GP can be expressed in terms of µ(t) and R(t_1, t_2)
For example, the pdf of X(t) is

  f_{X(t)}(x_t) = \frac{1}{\sqrt{2\pi (R(t,t) - \mu^2(t))}} \exp\left( -\frac{(x_t - \mu(t))^2}{2 (R(t,t) - \mu^2(t))} \right)

Notice that (X(t) - µ(t)) / \sqrt{R(t,t) - \mu^2(t)} is a standard Gaussian random variable

  P(X(t) > a) = \Phi\left( \frac{a - \mu(t)}{\sqrt{R(t,t) - \mu^2(t)}} \right),   where \Phi(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{u^2}{2} \right) du
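
A minimal sketch of this computation (not from the slides; assumes SciPy, with made-up values for µ(t), R(t, t) and a). Note Φ above is the complementary cdf, which is SciPy's survival function norm.sf:

```python
# Sketch: compute P(X(t) > a) by standardizing X(t), as on the slide.
# mu_t, R_tt and a are illustrative values.
import numpy as np
from scipy.stats import norm

mu_t = 0.5            # mu(t) at the time of interest
R_tt = 2.0            # R(t, t) at that time
a = 1.0

var_t = R_tt - mu_t**2                     # var[X(t)] = R(t,t) - mu^2(t)
p = norm.sf((a - mu_t) / np.sqrt(var_t))   # sf(x) = 1 - cdf(x), i.e., Phi(x) above
print(p)
```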

17 Joint and conditional probabilities in a GP

For a zero-mean GP X(t) consider two times t_1 and t_2
The covariance matrix for X(t_1) and X(t_2) is

  C = ( R(t_1, t_1)  R(t_1, t_2) )
      ( R(t_1, t_2)  R(t_2, t_2) )

The joint pdf of X(t_1) and X(t_2) is then given as (recall µ(t) = 0)

  f_{X(t_1),X(t_2)}(x_{t_1}, x_{t_2}) = \frac{1}{2\pi \det^{1/2}(C)} \exp\left( -\frac{1}{2} [x_{t_1}, x_{t_2}] C^{-1} [x_{t_1}, x_{t_2}]^T \right)

The conditional pdf of X(t_1) given X(t_2) is computed as

  f_{X(t_1)|X(t_2)}(x_{t_1} | x_{t_2}) = \frac{f_{X(t_1),X(t_2)}(x_{t_1}, x_{t_2})}{f_{X(t_2)}(x_{t_2})}
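
For a zero-mean bivariate Gaussian, the ratio above works out to another Gaussian with mean (R(t_1,t_2)/R(t_2,t_2)) x_{t_2} and variance R(t_1,t_1) - R(t_1,t_2)²/R(t_2,t_2); this closed form is the standard bivariate-normal result, not spelled out on the slide. The sketch below (assumes NumPy; Brownian covariance and all values illustrative) checks it by Monte Carlo:

```python
# Sketch: Monte Carlo check of the conditional pdf of X(t1) given X(t2)
# for a zero-mean GP, using an illustrative Brownian covariance.
import numpy as np

sigma, t1, t2 = 1.0, 1.0, 2.0
R11, R22 = sigma**2 * t1, sigma**2 * t2
R12 = sigma**2 * min(t1, t2)

cond_gain = R12 / R22                   # regression coefficient
cond_var = R11 - R12**2 / R22           # conditional variance

rng = np.random.default_rng(1)
C = np.array([[R11, R12], [R12, R22]])
x = rng.multivariate_normal([0.0, 0.0], C, size=500000)

# Keep samples with X(t2) near a chosen value and inspect X(t1)
x_t2 = 1.5
sel = np.abs(x[:, 1] - x_t2) < 0.05
print(x[sel, 0].mean(), cond_gain * x_t2)   # conditional mean
print(x[sel, 0].var(), cond_var)            # conditional variance
```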

18 Brownian motion and its variants

Introduction and roadmap
Gaussian processes
Brownian motion and its variants
White Gaussian noise

19 Brownian motion as limit of random walk

Gaussian processes are natural models due to the Central Limit Theorem
Let us reconsider a symmetric random walk in one dimension
[Figure: sample path x(t) versus time t, time interval = h]
Walker takes increasingly frequent and increasingly smaller steps

20 Brownian motion as limit of random walk

Gaussian processes are natural models due to the Central Limit Theorem
Let us reconsider a symmetric random walk in one dimension
[Figure: sample path x(t) versus time t, time interval = h/2]
Walker takes increasingly frequent and increasingly smaller steps

21 Brownian motion as limit of random walk

Gaussian processes are natural models due to the Central Limit Theorem
Let us reconsider a symmetric random walk in one dimension
[Figure: sample path x(t) versus time t, time interval = h/4]
Walker takes increasingly frequent and increasingly smaller steps

22 Random walk, time step h and step size σ√h

Let X(t) be the position at time t, with X(0) = 0
The time interval is h and σ√h is the size of each step
  Walker steps right or left w.p. 1/2 for each direction
Given X(t) = x, the prob. distribution of the position at time t + h is

  P(X(t + h) = x + σ√h | X(t) = x) = 1/2
  P(X(t + h) = x - σ√h | X(t) = x) = 1/2

Consider time T = Nh and index n = 1, 2, ..., N
Introduce step RVs Y_n = ±1, with P(Y_n = 1) = P(Y_n = -1) = 1/2
Can write X(nh) in terms of X((n-1)h) and Y_n as

  X(nh) = X((n-1)h) + (σ√h) Y_n

23 Central Limit Theorem as h → 0

Use the recursion to write X(T) = X(Nh) as (recall X(0) = 0)

  X(T) = X(Nh) = X(0) + (σ√h) \sum_{n=1}^{N} Y_n = (σ√h) \sum_{n=1}^{N} Y_n

Y_1, ..., Y_N are i.i.d. with zero mean and variance

  var[Y_n] = E[Y_n²] = (1/2)(1)² + (1/2)(-1)² = 1

As h → 0 we have N = T/h → ∞, and from the Central Limit Theorem

  \sum_{n=1}^{N} Y_n → N(0, N) = N(0, T/h)

  ⇒ X(T) → N(0, (σ²h)(T/h)) = N(0, σ²T)
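
The limit can be seen numerically. The sketch below (not from the slides; assumes NumPy, with illustrative parameters and a fixed seed) simulates the rescaled walk for shrinking h and checks that var[X(T)] approaches σ²T:

```python
# Sketch: simulate X(T) = sigma*sqrt(h) * sum_{n=1}^{N} Y_n for shrinking h
# and check that the variance approaches sigma^2 * T.
import numpy as np

sigma, T = 1.0, 2.0
rng = np.random.default_rng(2)

for h in [0.1, 0.01, 0.001]:
    N = int(T / h)
    Y = rng.choice([-1, 1], size=(5000, N))        # steps Y_n = +-1 w.p. 1/2
    X_T = sigma * np.sqrt(h) * Y.sum(axis=1)       # X(T) for 5000 realizations
    print(h, X_T.var())                            # should approach sigma^2 T = 2
```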

24 Conditional distribution of future values

More generally, consider times T = Nh and T + S = (N + M)h
Let X(T) = x(T) be given. Can write X(T + S) as

  X(T + S) = x(T) + (σ√h) \sum_{n=N+1}^{N+M} Y_n

From the Central Limit Theorem it then follows

  \sum_{n=N+1}^{N+M} Y_n → N(0, (N + M) - N) = N(0, S/h)

  ⇒ [X(T + S) | X(T) = x(T)] → N(x(T), σ²S)

25 Definition of Brownian motion

The former analysis was for motivational purposes
Def: A Brownian motion process (a.k.a. Wiener process) satisfies
(i) X(t) is normally distributed with zero mean and variance σ²t

  X(t) ~ N(0, σ²t)

(ii) Independent increments ⇒ For disjoint intervals (t_1, t_2) and (s_1, s_2),
the increments X(t_2) - X(t_1) and X(s_2) - X(s_1) are independent RVs
(iii) Stationary increments ⇒ The probability distribution of the increment
X(t + s) - X(s) is the same as the probability distribution of X(t)
Property (ii) ⇒ Brownian motion is a Markov process
Properties (i)-(iii) ⇒ Brownian motion is a Gaussian process
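
Properties (i)-(iii) suggest a direct way to simulate Brownian motion on a grid: accumulate independent N(0, σ² dt) increments. A minimal sketch (not from the slides; assumes NumPy, illustrative parameters):

```python
# Sketch: generate one Brownian motion path from properties (i)-(iii) by
# accumulating independent, stationary N(0, sigma^2 * dt) increments.
import numpy as np

sigma, T, n_steps = 1.0, 1.0, 1000
dt = T / n_steps
rng = np.random.default_rng(3)

increments = rng.normal(0.0, sigma * np.sqrt(dt), size=n_steps)
X = np.concatenate(([0.0], np.cumsum(increments)))   # X(0) = 0
t = np.linspace(0.0, T, n_steps + 1)
print(t[-1], X[-1])                                  # one realization of X(T)
```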

26 Mean and autocorrelation of Brownian motion

The mean function µ(t) = E[X(t)] is null for all times (by definition)

  µ(t) = E[X(t)] = 0

For the autocorrelation R_X(t_1, t_2) start with times t_1 < t_2
Use conditional expectations to write

  R_X(t_1, t_2) = E[X(t_1) X(t_2)] = E_{X(t_1)}[ E_{X(t_2)}[ X(t_1) X(t_2) | X(t_1) ] ]

In the innermost expectation X(t_1) is a given constant, then

  R_X(t_1, t_2) = E_{X(t_1)}[ X(t_1) E_{X(t_2)}[ X(t_2) | X(t_1) ] ]

Proceed by computing the innermost expectation

27 Autocorrelation of Brownian motion (continued)

The conditional distribution of X(t_2) given X(t_1) for t_1 < t_2 is

  [X(t_2) | X(t_1)] ~ N(X(t_1), σ²(t_2 - t_1))

The innermost expectation is E_{X(t_2)}[X(t_2) | X(t_1)] = X(t_1)
From where the autocorrelation follows as

  R_X(t_1, t_2) = E_{X(t_1)}[X(t_1) X(t_1)] = E_{X(t_1)}[X²(t_1)] = σ²t_1

Repeating the steps, if t_2 < t_1 ⇒ R_X(t_1, t_2) = σ²t_2
Autocorrelation of Brownian motion ⇒ R_X(t_1, t_2) = σ² min(t_1, t_2)
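
The min(t_1, t_2) formula is easy to confirm by simulation. A sketch (not from the slides; assumes NumPy, illustrative parameters):

```python
# Sketch: Monte Carlo check that E[X(t1) X(t2)] = sigma^2 * min(t1, t2)
# for simulated Brownian motion paths.
import numpy as np

sigma, T, n_steps, n_paths = 1.0, 2.0, 200, 50000
dt = T / n_steps
rng = np.random.default_rng(4)

inc = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(inc, axis=1)                    # X at times dt, 2dt, ..., T

i1, i2 = 49, 149                              # indices for t1 = 0.5, t2 = 1.5
t1, t2 = (i1 + 1) * dt, (i2 + 1) * dt
print(np.mean(X[:, i1] * X[:, i2]))           # empirical R_X(t1, t2)
print(sigma**2 * min(t1, t2))                 # = 0.5
```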

28 Brownian motion with drift

Similar to Brownian motion, but start from a biased random walk
Time interval h, step size σ√h, right or left with different probs.

  P(X(t + h) = x + σ√h | X(t) = x) = (1/2)(1 + (µ/σ)√h)
  P(X(t + h) = x - σ√h | X(t) = x) = (1/2)(1 - (µ/σ)√h)

If µ > 0 biased to the right, if µ < 0 biased to the left
The definition requires h small enough to make (µ/σ)√h ≤ 1
Notice that the bias vanishes as h → 0, same as the step size

29 Mean and variance of biased steps

Define step RVs Y_n = ±1, with probabilities

  P(Y_n = 1) = (1/2)(1 + (µ/σ)√h),   P(Y_n = -1) = (1/2)(1 - (µ/σ)√h)

The expected value of Y_n is

  E[Y_n] = (1) P(Y_n = 1) + (-1) P(Y_n = -1)
         = (1/2)(1 + (µ/σ)√h) - (1/2)(1 - (µ/σ)√h) = (µ/σ)√h

The second moment of Y_n is

  E[Y_n²] = (1)² P(Y_n = 1) + (-1)² P(Y_n = -1) = 1

The variance of Y_n is var[Y_n] = E[Y_n²] - E²[Y_n] = 1 - (µ²/σ²)h

30 Central Limit Theorem as h → 0

Consider time T = Nh and index n = 1, 2, ..., N. Write X(nh) as

  X(nh) = X((n-1)h) + (σ√h) Y_n

Use recursively to write X(T) = X(Nh) as

  X(T) = X(Nh) = X(0) + (σ√h) \sum_{n=1}^{N} Y_n = (σ√h) \sum_{n=1}^{N} Y_n

As h → 0 we have N → ∞ and \sum_{n=1}^{N} Y_n normally distributed
  ⇒ As h → 0, X(T) tends to be normally distributed by the CLT
  ⇒ Need to determine the mean and variance (and only the mean and variance)

31 Mean and variance of X(T)

Expected value of X(T) = scaled sum of E[Y_n] (recall T = Nh)

  E[X(T)] = (σ√h) N E[Y_n] = (σ√h) N (µ/σ)√h = µT

Variance of X(T) = scaled sum of variances of the independent Y_n

  var[X(T)] = (σ²h) N var[Y_n] = (σ²h) N (1 - (µ²/σ²)h) → σ²T

  Used T = Nh and 1 - (µ²/σ²)h → 1 as h → 0
Brownian motion with drift (BMD) ⇒ X(t) ~ N(µt, σ²t)
  Normal with mean µt and variance σ²t
  Independent and stationary increments
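
A sketch checking the limit (not from the slides; assumes NumPy, illustrative parameters): simulate the biased walk with a small h and compare the empirical mean and variance of X(T) against µT and σ²T:

```python
# Sketch: simulate the biased random walk and compare X(T) with the
# N(mu*T, sigma^2*T) limit for Brownian motion with drift.
import numpy as np

mu, sigma, T, h = 0.5, 1.0, 2.0, 0.001
N = int(T / h)
p_right = 0.5 * (1 + (mu / sigma) * np.sqrt(h))  # needs (mu/sigma)*sqrt(h) <= 1

rng = np.random.default_rng(5)
Y = np.where(rng.random((5000, N)) < p_right, 1, -1)
X_T = sigma * np.sqrt(h) * Y.sum(axis=1)

print(X_T.mean(), mu * T)        # ~ 1.0
print(X_T.var(), sigma**2 * T)   # ~ 2.0
```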

32 Geometric random walk

Suppose the next state follows by multiplying the current one by a random factor
  Compare with adding or subtracting a random quantity
Define RVs Y_n = ±1 with probabilities as in the biased random walk

  P(Y_n = 1) = (1/2)(1 + (µ/σ)√h),   P(Y_n = -1) = (1/2)(1 - (µ/σ)√h)

Def: The geometric random walk follows the recursion

  Z(nh) = Z((n-1)h) e^{(σ√h) Y_n}

When Y_n = 1, increase Z(nh) by the relative amount e^{σ√h}
When Y_n = -1, decrease Z(nh) by the relative amount e^{-σ√h}
  Notice e^{±σ√h} ≈ 1 ± σ√h
Useful to model investment returns

33 Geometric Brownian motion

Take logarithms on both sides of the recursive definition

  log(Z(nh)) = log(Z((n-1)h)) + (σ√h) Y_n

Define X(nh) = log(Z(nh)), thus the recursion for X(nh) is

  X(nh) = X((n-1)h) + (σ√h) Y_n

As h → 0, X(t) becomes a BMD with parameters µ and σ²
Def: Given a BMD X(t) with parameters µ and σ², the process Z(t)

  Z(t) = e^{X(t)}

is a geometric Brownian motion (GBM) with parameters µ and σ²
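
Per the definition, GBM can be simulated by exponentiating a simulated BMD path. A sketch (not from the slides; assumes NumPy, illustrative parameters):

```python
# Sketch: build geometric Brownian motion as Z(t) = exp(X(t)) with X(t)
# a Brownian motion with drift, as in the definition above.
import numpy as np

mu, sigma, T, n_steps = 0.1, 0.3, 1.0, 1000
dt = T / n_steps
rng = np.random.default_rng(6)

# BMD increments are N(mu*dt, sigma^2*dt), independent across steps
dX = rng.normal(mu * dt, sigma * np.sqrt(dt), size=n_steps)
X = np.concatenate(([0.0], np.cumsum(dX)))   # BMD path with X(0) = 0
Z = np.exp(X)                                # GBM path with Z(0) = 1
print(Z[-1])                                 # one realization of Z(T)
```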

34 White Gaussian noise

Introduction and roadmap
Gaussian processes
Brownian motion and its variants
White Gaussian noise

35 Dirac delta function

Consider a function δ_h(t) defined as

  δ_h(t) = { 1/h  if -h/2 ≤ t ≤ h/2
           { 0    else

Define the delta function as the limit of δ_h(t) as h → 0

  δ(t) = lim_{h→0} δ_h(t) = { ∞  if t = 0
                            { 0  else

Q: Is this a function? A: Of course not
Consider the integral of δ_h(t) over an interval that includes [-h/2, h/2]

  \int_a^b δ_h(t) dt = 1

The integral is 1 independently of h, for any a, b such that a ≤ -h/2 and h/2 ≤ b

36 Dirac delta function (continued)

Another integral involving δ_h(t) (for h small)

  \int_a^b f(t) δ_h(t) dt ≈ \int_{-h/2}^{h/2} f(0) (1/h) dt = f(0),   a ≤ -h/2, h/2 ≤ b

Def: The generalized function δ(t) is the entity having the property

  \int_a^b f(t) δ(t) dt = { f(0)  if a < 0 < b
                          { 0     else

A delta function is not defined; its action on other functions is
Interpretation: A delta function cannot be observed directly
  But it can be observed through its effect on other functions
The delta function helps to define derivatives of discontinuous functions

37 Heaviside's step function and delta function

Integral of the delta function between -∞ and t

  \int_{-∞}^t δ(u) du = { 0  if t < 0 }  := H(t)
                        { 1  if t > 0 }

H(t) is called Heaviside's step function
Define the derivative of Heaviside's step function as

  ∂H(t)/∂t = δ(t)

Maintains consistency with the fundamental theorem of calculus
[Figure: step function H(t) and impulse δ(t) versus time t]

38 White Gaussian noise

Def: A white Gaussian noise (WGN) process W(t) is a GP with
  Zero mean: µ(t) = E[W(t)] = 0 for all t
  Delta function autocorrelation: R_W(t_1, t_2) = σ²δ(t_1 - t_2)
To interpret W(t), consider a time step h and a process W_h(nh) with
(i) Normal distribution ⇒ W_h(nh) ~ N(0, σ²/h)
(ii) W_h(n_1 h) and W_h(n_2 h) independent for n_1 ≠ n_2
White noise W(t) is the limit of the process W_h(nh) as h → 0

  W(t) = lim_{h→0} W_h(nh),   with n = t/h

The process W_h(nh) is the discrete-time representation of WGN
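
A sketch of the discrete-time representation (not from the slides; assumes NumPy, illustrative parameters): generate W_h(nh) for shrinking h and observe the per-sample variance σ²/h growing as h shrinks:

```python
# Sketch: the discrete-time representation W_h(nh) ~ N(0, sigma^2/h),
# independent across n; its variance grows as the step h shrinks.
import numpy as np

sigma, T = 1.0, 1.0
rng = np.random.default_rng(7)

for h in [0.1, 0.01, 0.001]:
    n = int(T / h)
    W_h = rng.normal(0.0, sigma / np.sqrt(h), size=n)  # std = sigma/sqrt(h)
    print(h, W_h.var())                                # ~ sigma^2 / h
```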

39 Properties of white Gaussian noise

For different times t_1 and t_2, W(t_1) and W(t_2) are uncorrelated

  E[W(t_1) W(t_2)] = R_W(t_1, t_2) = 0,   t_1 ≠ t_2

But since W(t) is Gaussian, uncorrelatedness implies independence
  ⇒ Values of W(t) at different times are independent
WGN has infinite power

  E[W²(t)] = R_W(t, t) = σ²δ(0) = ∞

  ⇒ WGN does not represent any physical phenomenon
However, WGN is a convenient abstraction
  Approximates processes with large power and independent samples
Some processes can be modeled as post-processing of WGN
  Cannot observe WGN directly
  But can model its effect on systems, e.g., filters

40 Integral of white Gaussian noise

Consider the integral of a WGN process W(t)

  X(t) = \int_0^t W(u) du

Since integration is a linear functional and W(t) is a GP, X(t) is also a GP
To characterize X(t), just determine its mean and autocorrelation
The mean function µ(t) = E[X(t)] is null

  µ(t) = E[ \int_0^t W(u) du ] = \int_0^t E[W(u)] du = 0

The autocorrelation R_X(t_1, t_2) is given by (assume t_1 < t_2)

  R_X(t_1, t_2) = E[ ( \int_0^{t_1} W(u_1) du_1 ) ( \int_0^{t_2} W(u_2) du_2 ) ]

41 Integral of white Gaussian noise (continued)

Product of integrals is the double integral of the product

  R_X(t_1, t_2) = E[ \int_0^{t_1} \int_0^{t_2} W(u_1) W(u_2) du_2 du_1 ]

Interchange expectation and integration

  R_X(t_1, t_2) = \int_0^{t_1} \int_0^{t_2} E[W(u_1) W(u_2)] du_2 du_1

Use the definition and value of the autocorrelation R_W(u_1, u_2) = σ²δ(u_1 - u_2)

  R_X(t_1, t_2) = \int_0^{t_1} \int_0^{t_2} σ²δ(u_1 - u_2) du_2 du_1
                = \int_0^{t_1} \left[ \int_0^{t_1} σ²δ(u_1 - u_2) du_2 + \int_{t_1}^{t_2} σ²δ(u_1 - u_2) du_2 \right] du_1
                = \int_0^{t_1} σ² du_1 = σ² t_1

Same mean and autocorrelation functions as Brownian motion
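
Numerically, integrating the discrete-time WGN representation with a Riemann sum reproduces the Brownian motion variance σ²t. A sketch (not from the slides; assumes NumPy, illustrative parameters):

```python
# Sketch: integrate the discrete-time WGN representation,
# X(nh) ~ h * sum_k W_h(kh), and check var[X(T)] ~ sigma^2 * T.
import numpy as np

sigma, T, h, n_paths = 1.0, 1.0, 0.001, 10000
n = int(T / h)
rng = np.random.default_rng(8)

W_h = rng.normal(0.0, sigma / np.sqrt(h), size=(n_paths, n))
X = h * np.cumsum(W_h, axis=1)          # Riemann-sum approximation of the integral
print(X[:, -1].var(), sigma**2 * T)     # variance at t = T matches sigma^2 T
```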

42 White Gaussian noise and Brownian motion

GPs are uniquely determined by their mean and autocorrelation functions
  ⇒ The integral of WGN is a Brownian motion process
Conversely, the derivative of Brownian motion is WGN
With W(t) a WGN process and X(t) Brownian motion

  \int_0^t W(u) du = X(t),   ∂X(t)/∂t = W(t)

Brownian motion can also be interpreted as a sum of Gaussians
  Not Bernoullis as before with the random walk
  Any i.i.d. distribution with the same mean and variance works
This is all nice, but derivatives and integrals involve limits
  ⇒ What are these derivatives and integrals?

43 Mean-square derivative of a random process

Consider a realization x(t) of the random process X(t)
Def: The derivative of (lowercase) x(t) is

  ∂x(t)/∂t = lim_{h→0} (x(t + h) - x(t)) / h

  When this limit exists
The limit may not exist for all realizations
  Can define sure limits, a.s. limits, limits in probability, ...
The notion of convergence used here is in the mean-squared sense
Def: The process ∂X(t)/∂t is the mean-square sense derivative of X(t) if

  lim_{h→0} E[ ( (X(t + h) - X(t))/h - ∂X(t)/∂t )² ] = 0

44 Mean-square integral of a random process

Likewise, consider the integral of a realization x(t) of X(t)

  \int_a^b x(t) dt = lim_{h→0} \sum_{n=1}^{(b-a)/h} h x(a + nh)

The limit need not exist for all realizations
  Can define it in the sure sense, almost sure sense, in probability sense, ...
Again, adopt the definition in the mean-square sense
Def: The process \int_a^b X(t) dt is the mean-square sense integral of X(t) if

  lim_{h→0} E[ ( \sum_{n=1}^{(b-a)/h} h X(a + nh) - \int_a^b X(t) dt )² ] = 0

Mean-square sense convergence is convenient to work with GPs

45 Linear state model example

Def: A random process X(t) follows a linear state model if

  ∂X(t)/∂t = aX(t) + W(t)

with W(t) WGN with autocorrelation R_W(t_1, t_2) = σ²δ(t_1 - t_2)
Discrete-time representation of X(t) ⇒ X(nh) with step size h
Solving the differential equation between nh and (n+1)h (h small)

  X((n+1)h) ≈ X(nh) e^{ah} + \int_{nh}^{(n+1)h} W(t) dt

Defining X(n) := X(nh) and W(n) := \int_{nh}^{(n+1)h} W(t) dt, may write

  X(n + 1) ≈ (1 + ah) X(n) + W(n)

where E[W²(n)] = σ²h and W(n_1) is independent of W(n_2) for n_1 ≠ n_2
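
The discrete-time recursion is straightforward to simulate. A sketch (not from the slides; assumes NumPy, illustrative parameters; the continuous-time limit here is an Ornstein-Uhlenbeck process, whose stationary variance σ²/(2|a|) for a < 0 is used below as a sanity check, a fact not stated on the slide):

```python
# Sketch: simulate X(n+1) ~ (1 + a*h) X(n) + W(n) with var[W(n)] = sigma^2 * h.
import numpy as np

a, sigma, h, n_steps = -1.0, 0.5, 0.001, 10000   # a < 0 for a stable model
rng = np.random.default_rng(9)

X = np.zeros(n_steps + 1)
for n in range(n_steps):
    W_n = rng.normal(0.0, sigma * np.sqrt(h))    # W(n) ~ N(0, sigma^2 h)
    X[n + 1] = (1 + a * h) * X[n] + W_n

# After a burn-in, the sample variance approaches sigma^2 / (2|a|)
print(X[n_steps // 2:].var(), sigma**2 / (2 * abs(a)))
```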

46 Vector linear state model example

Def: A vector random process X(t) follows a linear state model if

  ∂X(t)/∂t = AX(t) + W(t)

with W(t) vector WGN with autocorrelation R_W(t_1, t_2) = σ²δ(t_1 - t_2)I
Discrete-time representation of X(t) ⇒ X(nh) with step size h
Solving the differential equation between nh and (n+1)h (h small)

  X((n+1)h) ≈ e^{Ah} X(nh) + \int_{nh}^{(n+1)h} W(t) dt

Defining X(n) := X(nh) and W(n) := \int_{nh}^{(n+1)h} W(t) dt, may write

  X(n + 1) ≈ (I + Ah) X(n) + W(n)

where E[W(n)W^T(n)] = σ²hI and W(n_1) is independent of W(n_2) for n_1 ≠ n_2

47 Glossary

Markov process
Gaussian process
Stationary process
Gaussian random vectors
Mean vector
Covariance matrix
Multivariate Gaussian pdf
Linear functional
Autocorrelation function
Brownian motion (Wiener process)
Brownian motion with drift
Geometric random walk
Geometric Brownian motion
Investment returns
Dirac delta function
Heaviside's step function
White Gaussian noise
Mean-square derivatives
Mean-square integrals
Linear (vector) state model
