Statistical signal processing


1 Statistical signal processing: a short overview of the fundamentals

2 Outline Random variables Random processes Stationarity Ergodicity Spectral analysis

3 Random variables and processes
Intuition: a random variable can be considered the outcome of an experiment. This can be a number (e.g., a lottery draw), a function of time (e.g., an ECG signal), or a function of space (e.g., noise in photographic images).
Random (or stochastic) process: a set of random variables. Repeating the experiment produces a set of realizations $x_1(t), x_2(t), \dots, x_n(t)$.
Experimental physical measure: the measured function $x(t)$ is a realization of the process.
NOTATION: $X$ denotes the RV, $x$ a realization.

4 Deterministic and random variables
Deterministic phenomenon: a precise relation between causes and effects; repeatability; predictability.
Stochastic phenomenon: the relation between causes and effects is not given in a mathematical sense, but there is a stochastic regularity among the different observations. The regularity can be observed if a large number of observations is carried out, so that the expectations of the involved variables can be inferred.

5 Random variables
A random variable (RV) is a real-valued variable which depends on the outcomes of an experiment $A$ and is such that $\{X \le x\}$, $x \in \mathbb{R}$, is an event (it gathers the outcomes of the experiment for which a probability is defined).
The probability of the events $X = -\infty$ and $X = +\infty$ is zero: $P\{X = -\infty\} = P\{X = +\infty\} = 0$.
Otherwise stated: $X(a)$ is a function defined over the domain $A$ (the set of possible results) whose value is a real number.

6 Random variables: résumé
A random variable is a mapping between the sample space and the real line (real-valued RV) or the complex plane (complex-valued RV).

7 Probability distribution function

8 Discrete and continuous RVs
The distinction concerns the values that can be taken by the RV.
Continuous: there is no restriction on the set of values that $X$ can take.
Discrete: there exists a countable sequence of distinct real numbers $x_i$ that the RV can take. For a discrete RV, the probability mass function replaces the density function:
$P_i = P\{X = x_i\}, \qquad \sum_i P_i = 1$
Mixed RV: a combination of the two types.

9 Examples
[Figures: distribution function $F(x)$ and density $f(x)$ for a continuous RV, and for a discrete RV taking the values $x_1, x_2, x_3$.]
Delta functions allow the pdf of a discrete variable to be derived: the height of each delta represents the probability of the corresponding event.

10 Examples
[Figure: distribution function $F(x)$ and density $f(x)$ of a mixed random variable.]

11 Probability density function

12 Joint random variables
Given two random variables $X, Y$ defined over the same space $S$, the joint distribution function is
$F_{XY}(x, y) = P\{X \le x, Y \le y\}$
with $F_{XY}(x, +\infty) = F_X(x)$, $F_{XY}(+\infty, y) = F_Y(y)$, $F_{XY}(-\infty, y) = F_{XY}(x, -\infty) = 0$.
The joint density is
$f_{XY}(x, y) = \dfrac{\partial^2 F_{XY}(x, y)}{\partial x\, \partial y}, \qquad F_{XY}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(u, v)\, du\, dv$
The events $X = +\infty$ and $Y = +\infty$ are certain; the events $X = -\infty$ and $Y = -\infty$ have probability zero.

13 Marginal density functions
$f_X(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\, dy \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\, dx$

14 Conditional density function
Conditional density of $Y$ given $X = x$:
$f_Y(y \mid x) = \dfrac{f_{XY}(x, y)}{f_X(x)}$
Conditional distribution function:
$F_Y(y \mid x) = \dfrac{\int_{-\infty}^{y} f_{XY}(x, u)\, du}{f_X(x)}$
Thus
$f_Y(y \mid x) = \dfrac{f_{XY}(x, y)}{f_X(x)} \qquad f_X(x \mid y) = \dfrac{f_{XY}(x, y)}{f_Y(y)}$

15 Independent RV

16 Independent RVs
When more than two RVs are involved, $X_1, X_2, \dots, X_n$ are independent iff
$F_{X_1 \cdots X_n}(x_1, x_2, \dots, x_n) = F_{X_1}(x_1)\, F_{X_2}(x_2) \cdots F_{X_n}(x_n)$
$f_{X_1 \cdots X_n}(x_1, x_2, \dots, x_n) = f_{X_1}(x_1)\, f_{X_2}(x_2) \cdots f_{X_n}(x_n)$

17 Moments of a RV
Expectations provide a description of the RV in terms of a few parameters, instead of specifying the entire distribution function or density function.
Mean (expected value):
$m = \mu_X = E\{X\} = \sum_{i} x_i\, P(x_i)$ (discrete random variable)
$m = \mu_X = E\{X\} = \int_{-\infty}^{+\infty} x\, f_X(x)\, dx$ (continuous random variable)
$k$-th order moment: $m_k = E\{X^k\} = \int_{-\infty}^{+\infty} x^k f_X(x)\, dx$
For any piecewise continuous function $g(x)$, the expected value of $Y = g(X)$ is
$E\{Y\} = E\{g(X)\} = \int_{-\infty}^{+\infty} g(x)\, f_X(x)\, dx$
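As a concrete illustration (a minimal sketch assuming Python with NumPy; the exponential distribution and $g(x) = x^2$ are arbitrary choices, not from the slides), the moments above can be estimated by sample averages:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)   # exponential RV with mean 2

m1 = x.mean()                # estimate of E{X}   (true value: 2)
m2 = (x ** 2).mean()         # estimate of E{X^2} (true value: 8)

g = lambda t: t ** 2         # E{g(X)} estimated directly from the samples
print(m1, m2, g(x).mean())
```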

18 Moments of a RV
Second-order moment:
$E\{X^2\} = \sum_{i} x_i^2\, P(x_i) = \int_{-\infty}^{+\infty} x^2 f_X(x)\, dx$
Variance: measures the dispersion of the distribution around the mean ($\sigma_X$: standard deviation)
$\sigma_X^2 = E\{(X - \mu_X)^2\} = \int_{-\infty}^{+\infty} (x - \mu_X)^2 f_X(x)\, dx$

19 Joint moments of order $(k+r)$
Joint moments:
$m_{k,r} = E\{X^k Y^r\} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x^k y^r f_{XY}(x, y)\, dx\, dy$
Central joint moments:
$\mu_{k,r} = E\{(X - \mu_X)^k (Y - \mu_Y)^r\} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \mu_X)^k (y - \mu_Y)^r f_{XY}(x, y)\, dx\, dy$
$\mu_{2,0} = \sigma_X^2 \qquad \mu_{0,2} = \sigma_Y^2$

20 Joint moments
Covariance:
$\mu_{1,1} = \sigma_{XY} = E\{(X - \mu_X)(Y - \mu_Y)\} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \mu_X)(y - \mu_Y)\, f_{XY}(x, y)\, dx\, dy = E\{XY\} - \mu_X \mu_Y$
Correlation coefficient: measures the similarity between two random variables, $-1 \le \rho(X, Y) \le 1$
$\rho(X, Y) = \dfrac{\sigma_{XY}}{\sigma_X \sigma_Y} = \dfrac{E\{(X - \mu_X)(Y - \mu_Y)\}}{\sqrt{E\{(X - \mu_X)^2\}\, E\{(Y - \mu_Y)^2\}}}$
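A small NumPy sketch (illustrative parameters, not from the slides) estimating the covariance and the correlation coefficient from paired samples:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)     # unit variance, true rho = 0.6

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))   # sigma_XY = E{XY} - mu_X*mu_Y
rho = cov_xy / (x.std() * y.std())                  # always in [-1, 1]
print(cov_xy, rho)                                  # both close to 0.6
```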

21 Uncorrelated RVs
Two RVs are uncorrelated iff
$\sigma_{XY} = 0 \iff E\{XY\} = E\{X\}\, E\{Y\}$
Warning: uncorrelation does not imply statistical independence, BUT statistical independence implies uncorrelation. Two uncorrelated RVs are also independent iff they are jointly Gaussian.
Uncorrelated RVs can be dependent; independent RVs are uncorrelated.

22 Orthogonal RVs
Orthogonality condition: $E\{XY\} = 0$
For zero-mean RVs, orthogonality is equivalent to uncorrelation:
$E\{X\} = E\{Y\} = 0 \;\Rightarrow\; \mathrm{Cov}(X, Y) = E\{XY\} - E\{X\}\, E\{Y\} = E\{XY\}$
so that $\mathrm{Cov}(X, Y) = 0 \iff E\{XY\} = 0$.

23 Sum of uncorrelated variables
If $Z$ is the sum of $X$ and $Y$ and these are uncorrelated, its variance is the sum of the respective variances:
$Z = X + Y \qquad \sigma_Z^2 = \sigma_X^2 + \sigma_Y^2$

24 Characteristic function
Expectation of the complex exponential:
$M_X(\omega) = E\{e^{j\omega X}\} = \int_{-\infty}^{+\infty} e^{j\omega x} f_X(x)\, dx$
$f_X(x) = \dfrac{1}{2\pi} \int_{-\infty}^{+\infty} e^{-j\omega x} M_X(\omega)\, d\omega$
$\left.\dfrac{\partial^n M_X(\omega)}{\partial \omega^n}\right|_{\omega=0} = j^n m_n$ ($n$-th order moment of $X$)
The characteristic function only depends on the moments of $X$ and can be used to estimate the pdf.

25 Law of large numbers
Let $X_1, X_2, \dots, X_N$ be a sequence of RVs with expected values $\mu_i$, and let $\hat\mu_N$ be the RV corresponding to their average:
$\hat\mu_N = \dfrac{1}{N} \sum_{i=1}^{N} X_i$
If the RVs are uncorrelated and have finite variance, then
$\hat\mu_N \to \dfrac{1}{N} \sum_{i=1}^{N} \mu_i$ (mean-square convergence)
If all the RVs have the same mean $\mu_i = \mu$, then $\hat\mu_N \to \mu$, with
$\sigma_{\hat\mu_N}^2 = E\{(\hat\mu_N - \mu)^2\} = \dfrac{1}{N^2} \sum_{i=1}^{N} \sigma_i^2$
The variance in the estimation of the expected value can be made arbitrarily small by averaging a sufficiently large number of RVs.
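A numerical check of the law of large numbers (a sketch assuming NumPy; the Gaussian samples and parameter values are arbitrary): the empirical variance of the $N$-sample average shrinks as $\sigma^2/N$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2 = 5.0, 4.0                # common mean and variance of the RVs
for N in (10, 100, 1000):
    # 5000 independent repetitions of the N-sample average mu_hat_N
    est = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(5000, N)).mean(axis=1)
    print(N, est.var(), sigma2 / N)  # empirical variance matches sigma^2 / N
```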

26 Central limit theorem
The sum of $n$ independent RVs having means $\mu_i$ and variances $\sigma_i^2$ is a RV with mean and variance
$\mu = \sum_{i=1}^{n} \mu_i \qquad \sigma^2 = \sum_{i=1}^{n} \sigma_i^2$
and its pdf is given by the convolution of the pdfs of the individual RVs:
$f(x) = f_{X_1}(x) * f_{X_2}(x) * \cdots * f_{X_n}(x)$
It converges to a Gaussian irrespective of the type of the single pdfs.

27 Central limit theorem
$f(x) \xrightarrow{n \to \infty} \dfrac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x - \mu)^2}{2\sigma^2}}$
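A minimal sketch of the theorem in action (assuming NumPy; summing uniform RVs is an arbitrary choice): the sum of $n$ independent uniforms is already close to Gaussian for small $n$, with the mean and variance given above:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 12                                    # number of summed independent RVs
u = rng.uniform(0.0, 1.0, size=(100_000, n))
s = u.sum(axis=1)                         # samples of the sum X_1 + ... + X_n

mu = n * 0.5                              # sum of the means
var = n * (1.0 / 12.0)                    # sum of the variances
print(s.mean(), mu)                       # ~6.0
print(s.var(), var)                       # ~1.0
# A histogram of s is already very close to the Gaussian N(mu, var)
```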

28 Random processes

29 Random processes

30 Random processes

31 Random processes
[Figure: realizations (functions of time) drawn from the space of realizations.]

32 Random processes
In a stochastic (random) process there is some indeterminacy in the future evolution, described by probability distributions. Even if the initial condition (starting point) is known, there are many paths the process might follow, some more probable than others.
Single realizations $x(t, a_1), x(t, a_2), \dots$ are deterministic functions of time.

33 Random processes and random variables
[Figure: realizations $x_1(t), x_2(t), x_3(t)$ sampled at the time instants $t_1, t_2, t_3$.]
A RV $X(t_i)$ is associated with each time instant $t_i$; it is defined by the set of values taken by all the realizations at $t = t_i$.
The set of RVs over all values of $t$ completely defines the random process.

34 Random variables
The set of values taken by the realizations $x_k(t)$ at a given instant $t_m$ generates a sequence $x_1(t_m), x_2(t_m), x_3(t_m), \dots$ describing a random variable $X(t_m)$.
This random variable can be described by a probability density function; for $n$ time instants:
$f_{X(t_1) \cdots X(t_n)}(x_1, x_2, \dots, x_n;\ t_1, t_2, \dots, t_n)$

35 Random processes

36 Parameters
Expected value of a random process:
$\mu_X(t) = E\{X(t)\} = \int_{-\infty}^{+\infty} x\, f_{X(t)}(x)\, dx$
Autocorrelation: expectation of the product of the RVs at instants $t_1$ and $t_2$:
$R_X(t_1, t_2) = E\{X(t_1)\, X(t_2)\} = \int\!\!\int x_1 x_2\, f_{X(t_1) X(t_2)}(x_1, x_2)\, dx_1\, dx_2$
Autocovariance: the corresponding central moment:
$C_X(t_1, t_2) = E\{(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))\} = R_X(t_1, t_2) - \mu_X(t_1)\, \mu_X(t_2)$

37 Parameters
Cross-correlation:
$R_{XY}(t_1, t_2) = E\{X(t_1)\, Y(t_2)\} = \int\!\!\int x\, y\, f_{X(t_1) Y(t_2)}(x, y)\, dx\, dy$
Cross-covariance:
$C_{XY}(t_1, t_2) = E\{(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))\} = R_{XY}(t_1, t_2) - \mu_X(t_1)\, \mu_Y(t_2)$
Warning: do not confuse $R_{XY}$ with $C_{XY}$; they coincide only for zero-mean processes.

38 Uncorrelated processes
If the covariance of two random processes $X(t)$ and $Y(t)$, evaluated at time instants $t_1$ and $t_2$, is zero for any $t_1$ and $t_2$, the two processes are uncorrelated:
$C_{XY}(t_1, t_2) = 0 \;\;\forall t_1, t_2 \iff R_{XY}(t_1, t_2) = \mu_X(t_1)\, \mu_Y(t_2) \;\;\forall t_1, t_2$

39 Stationarity
The pdfs which define the random process are time-invariant, i.e., they don't change when moving the origin of the time axis: $X(t_1)$ and $X(t_1 + \tau)$ have the same probability distribution function.
For a stationary random process the following relations hold, $\forall t, \Delta t$:
$f_{X(t)}(x) = f_{X(t + \Delta t)}(x) \qquad f_{X(t_1) X(t_2)}(x_1, x_2) = f_{X(t_1 + \Delta t) X(t_2 + \Delta t)}(x_1, x_2)$

40 Stationary random processes
The first-order probability density function does not depend on time; the second-order pdfs only depend on the time lag $\tau = t_1 - t_2$.
Thus the expected value is independent of time, while the covariance and autocorrelation depend only on the time lag.

41 Autocorrelation function of WSS processes
$R_X(\tau) = E\{X(t)\, X(t + \tau)\}$
The value of $R_X$ at the origin ($\tau = 0$) is the second-order moment and corresponds to the maximum value:
$R_X(0) = E\{X^2(t)\} \qquad |R_X(\tau)| \le R_X(0) \;\;\forall \tau$
Symmetry: with $R_{XY}(\tau) = E\{X(t)\, Y(t + \tau)\}$ and $R_{YX}(\tau) = E\{Y(t)\, X(t + \tau)\}$, letting $t' = t + \tau$ (i.e. $t = t' - \tau$) gives
$R_{XY}(\tau) = E\{X(t' - \tau)\, Y(t')\} = R_{YX}(-\tau)$
and in particular $R_X(\tau) = R_X(-\tau)$, consistent with $f_{X(t + \tau)}(x) = f_{X(t)}(x)$.

42 Autocorrelation function of WSS processes
The autocorrelation function of a wide-sense stationary process is symmetric about the origin. Its width is related to the correlation among the signal samples:
If $R_X(\tau)$ drops quickly, the samples are weakly correlated, which means they go through fast changes with time. Vice versa, if $R_X(\tau)$ drops slowly, the samples take similar values at close time instants, so slow signal changes are expected.
$R_X(\tau)$ is therefore related to the frequency content of the signal.

43 Example: White Gaussian Noise

44 Example: Filtered White Gaussian Noise

45 Example: Filtered White Gaussian Noise
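The figures for slides 43-45 are not reproduced here; the following sketch (assuming NumPy; the moving-average filter is an arbitrary stand-in for a low-pass filter) illustrates the same point: white Gaussian noise has an impulse-like autocorrelation, and low-pass filtering widens it:

```python
import numpy as np

rng = np.random.default_rng(4)
w = rng.normal(size=200_000)               # white Gaussian noise
x = np.convolve(w, np.ones(8) / 8, mode="same")   # low-pass filtered noise

def acorr(sig, max_lag):
    """Biased sample autocorrelation R[m], m = 0..max_lag."""
    sig = sig - sig.mean()
    return np.array([np.mean(sig[: len(sig) - m] * sig[m:])
                     for m in range(max_lag + 1)])

print(np.round(acorr(w, 5), 3))            # ~[1, 0, 0, 0, 0, 0]: impulse-like
print(np.round(acorr(x, 5), 3))            # decays gradually: correlation widened
```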

46 Time averages From a practical point of view, it is preferable to deal with a single sequence rather than an infinite ensemble of sequences. When the pdfs are independent of time (e.g. for stationary processes), it is reasonable to expect that the amplitude distribution of a long sequence corresponding to a single realization should be approximately equal to the probability density Similarly, the arithmetic average of a large number of samples of a single realization should be very close to the mean of the process. Time averages NB: Such time averages are functions of an infinite set of RV, and thus are properly viewed as RV themselves!

47 Time averages of single realizations
$m_x = \langle x(t) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, dt$ (temporal mean)
$MSE_x = \langle x^2(t) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x^2(t)\, dt$ (mean-square value)
$R_x(\tau) = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, x(t + \tau)\, dt$ (temporal autocorrelation)

48 Ergodicity
When the temporal averages are equal with probability 1 (namely, for almost all the realizations) to the corresponding ensemble averages, the process is said to be ergodic.
[Nested sets: ergodic processes ⊂ stationary processes ⊂ random processes.]

49 Ergodicity
Ergodicity implies stationarity: otherwise the ensemble averages would depend on time, which contradicts the hypothesis.
Temporal averages are the same for almost all the realizations, so that we can talk about the temporal average. Temporal and ensemble averages coincide:
$\langle x(t) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, dt = E\{X(t)\} = \mu_X$
$\langle x(t)\, x(t + \tau) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, x(t + \tau)\, dt = E\{X(t)\, X(t + \tau)\}$
For ergodic processes, a single realization is sufficient to completely characterize the process!
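A minimal illustration (assuming NumPy; an iid Gaussian process is used as a trivially ergodic example): the ensemble average at a fixed instant and the time average of a single realization agree:

```python
import numpy as np

rng = np.random.default_rng(5)
K, N, mu = 2000, 2000, 3.0
ensemble = mu + rng.normal(size=(K, N))   # K realizations, N samples each

ensemble_mean = ensemble[:, 100].mean()   # average across realizations at n = 100
time_mean = ensemble[0, :].mean()         # time average of a single realization
print(ensemble_mean, time_mean, mu)       # the three values agree
```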

50 Discrete time formulation

51 Discrete-time formulation for RPs
The temporal axis is sampled and the integer-valued index $n$ is used; all the rest remains the same. It's only a matter of using different notations and replacing integrals in the time domain with discrete summations.
A sequence $\{x[n]\}$ is considered one of an ensemble of sample sequences. A random process is an indexed set of random variables $X_n$. The family of random variables is characterized by a set of probability distribution functions that in general may be functions of the index $n$ (unless the process is stationary). For discrete-time signals, the index $n$ is associated with the discrete time variable.
An individual RV $X_n$ is described by the probability distribution function
(1) $P_{X_n}(x_n, n) = \Pr\{X_n \le x_n\}$
where $X_n$ denotes the RV and $x_n$ a particular value. The probability density function is obtained from (1) by differentiation and represents the probability of the RV falling in an infinitesimal interval $dx_n$ around $x_n$.

52 Discrete-time random processes
Each $X_n$ is a random variable; the values it takes over the different realizations of the process are its observations.
Ensemble averages: since a random process is an indexed set of RVs, it can be characterized by statistical averages of the RVs comprising the process (over the different realizations). Such averages are called ensemble averages.
Definitions. Average, or mean:
$m_{X_n} = E\{X_n\} = \int_{-\infty}^{+\infty} x_n\, p_{X_n}(x_n, n)\, dx_n, \qquad p_{X_n}(x_n, n) = \dfrac{\partial P_{X_n}(x_n, n)}{\partial x_n}$
where $E$ denotes the expectation. In general, the expected value (mean) depends on $n$.

53 Discrete-time random processes
Mean-square value (average power):
$\mathrm{rms}[X_n] = E\{|X_n|^2\} = \int_{-\infty}^{+\infty} |x_n|^2\, p_{X_n}(x_n, n)\, dx_n$
Variance:
$\mathrm{var}[X_n] = E\{|X_n - m_{X_n}|^2\} = \int_{-\infty}^{+\infty} |x_n - m_{X_n}|^2\, p_{X_n}(x_n, n)\, dx_n = \sigma_{X_n}^2$
In general, the mean and the variance are functions of time (index $n$), while they are constant for stationary processes.
The absolute value has been introduced to allow dealing with complex random processes $W_n = X_n + jY_n$ ($X_n$ and $Y_n$ are real random processes).

54 Discrete-time random processes
Autocorrelation sequence:
$\varphi_{XX}[n, m] = E\{X_n X_m^*\} = \int\!\!\int x_n x_m^*\, p_{X_n, X_m}(x_n, n, x_m, m)\, dx_n\, dx_m$
The values of $X_n$ and $X_m$ are taken across the different realizations.
Autocovariance sequence:
$\gamma_{XX}[n, m] = E\{(X_n - m_{X_n})(X_m - m_{X_m})^*\} = \varphi_{XX}[n, m] - m_{X_n} m_{X_m}^*$
Cross-correlations and cross-covariances are obtained when the same quantities are evaluated between two different processes (e.g., $X_n$ and $Y_m$).

55 Uncorrelation and independence
In general, the average of the product of two RVs is not equal to the product of the averages. If it is, the RVs are said to be uncorrelated:
(2) $\gamma_{XX}[n, m] = 0 \iff E\{X_n X_m\} = E\{X_n\}\, E\{X_m\} = m_{X_n} m_{X_m}$
and similarly $\gamma_{XY}[n, m] = 0 \iff E\{X_n Y_m\} = E\{X_n\}\, E\{Y_m\} = m_{X_n} m_{Y_m}$
Statistically independent processes:
(3) $p_{X_n, Y_m}(x_n, n, y_m, m) = p_{X_n}(x_n, n)\, p_{Y_m}(y_m, m)$
Condition (3) is stronger than condition (2): statistically independent processes are also uncorrelated, but NOT vice versa.

56 Stationary random processes
A stationary random process is characterized by an equilibrium condition in which the statistical properties are invariant to a shift of the time origin. Accordingly:
The first-order probability distribution is independent of time (the pdf is the same for all $n$); the joint probability distributions are also invariant to a shift of the time origin.
The first-order averages, like the mean and variance, are independent of time; the second-order averages, like the autocorrelation, depend only on the time difference $(m - n)$.
With the slightly different notation $X_n \to X[n]$:
(1) $m_X = E\{X[n]\} = \mu$ (independent of $n$)
$\sigma_X^2 = E\{|X[n] - \mu|^2\}$ (independent of $n$)
$\varphi_{XX}[n, n+m] = E\{X[n]\, X^*[n+m]\} = \varphi_{XX}[m]$ (dependent only on the time shift $m$)

57 Stationary random processes
Strict stationarity: the full probabilistic description is time-invariant.
Wide-sense stationarity: the probability distributions need not be time-invariant, but relations (1) still hold. In particular, relations (1) express second-order stationarity.
Linear operations preserve wide-sense stationarity: filtering by a linear time-invariant system (LTIS) conserves wide-sense stationarity.

58 Time averages
From a practical point of view, it is preferable to deal with a single sequence rather than an infinite ensemble of sequences. When the pdfs are independent of time (e.g., for stationary processes), it is reasonable to expect that the amplitude distribution of a long sequence corresponding to a single realization should be approximately equal to the probability density. Similarly, the arithmetic average of a large number of samples of a single realization should be very close to the mean of the process.
Time averages:
$\langle x_n \rangle = \lim_{L \to \infty} \dfrac{1}{2L+1} \sum_{n=-L}^{L} x_n$ (time average of a random process)
$\langle x_{n+m}\, x_n^* \rangle = \lim_{L \to \infty} \dfrac{1}{2L+1} \sum_{n=-L}^{L} x_{n+m}\, x_n^*$ (autocorrelation sequence)
NB: such time averages are functions of an infinite set of RVs, and thus are properly viewed as RVs themselves!

59 Ergodicity
For an ergodic process, time averages coincide with ensemble averages. That is, for a single realization (sequence $\{x[n]\}$):
$\langle x[n] \rangle = \lim_{L \to \infty} \dfrac{1}{2L+1} \sum_{n=-L}^{L} x[n] = E\{X_n\} = \mu$
$\langle x[n+m]\, x^*[n] \rangle = \lim_{L \to \infty} \dfrac{1}{2L+1} \sum_{n=-L}^{L} x[n+m]\, x^*[n] = \varphi_{XX}[m] = R_{XX}[m]$
Sample means and variances are estimates of the corresponding quantities, and as such are corrupted by estimation errors:
$\hat m_x = \dfrac{1}{L} \sum_{n=0}^{L-1} x[n] \qquad \hat\sigma_x^2 = \dfrac{1}{L} \sum_{n=0}^{L-1} |x[n] - \hat m_x|^2$
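A sketch of these finite-sample estimators (assuming NumPy; parameters are illustrative): computed on a few independent realizations, the estimates fluctuate around the true values, showing they are themselves RVs:

```python
import numpy as np

rng = np.random.default_rng(6)
L = 500
for _ in range(3):                        # three independent realizations
    x = 1.0 + 2.0 * rng.normal(size=L)    # true values: mu = 1, sigma^2 = 4
    m_hat = np.sum(x) / L                 # sample mean
    s2_hat = np.sum(np.abs(x - m_hat) ** 2) / L   # sample variance
    print(m_hat, s2_hat)                  # fluctuates around (1, 4)
```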

60 Ergodic random processes
We don't need to keep the index $n$ for $X_n$, and we can abbreviate it with $X$.
Consider a zero-mean, wide-sense stationary random process: the autocorrelation and the autocovariance coincide.

61 Covariance matrix
Given a sequence of RVs $X_1, X_2, \dots, X_n$ (representing the observations at given time instants), we can calculate the covariance between any pair of them and organize the results in a matrix.
Referring to the continuous-time case (the formalization generalizes to discrete time by replacing $t_n \to n$):
$C_X(t_i, t_j) = E\{(X(t_i) - \mu(t_i))(X(t_j) - \mu(t_j))\} = R_X(t_i, t_j) - \mu(t_i)\, \mu(t_j)$
$C_X(t_1, t_1) = C_{1,1} = \sigma_{1,1} = \sigma_1^2$
$C_X(t_1, t_2) = C_{1,2} = \sigma_{1,2}$
$C_X(t_1, t_3) = C_{1,3} = \sigma_{1,3}$
$\dots$
$C_X(t_1, t_n) = C_{1,n} = \sigma_{1,n}$

62 Covariance matrix
These data can be put in matrix form:
$\mathbf{C} = \begin{bmatrix} C_1 & C_{12} & \cdots & C_{1n} \\ C_{21} & C_2 & \cdots & C_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ C_{n1} & C_{n2} & \cdots & C_n \end{bmatrix} = \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_n^2 \end{bmatrix}$
The matrix element at position $(n, m)$ represents the covariance between the RVs $X_n$ and $X_m$. If the two RVs are uncorrelated, the element is null. THUS: the covariance matrix of uncorrelated RVs is diagonal.

63 Covariance matrix
[Figure: realizations along one axis, random variables $n = 1, 2, 3, \dots$ along the other; the observations of each RV run across the realizations.]
Vectors of RVs can be built by gathering the RVs in a column vector:
$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}$

64 Covariance matrix
Each RV $X_n$ corresponds to $k$ observations, which can be put in vector form as well:
$\mathbf{X}_1 = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,k} \end{bmatrix}$
Then the covariance matrix can be written as
$\mathbf{C} = E\{(\mathbf{X} - \boldsymbol\mu)(\mathbf{X} - \boldsymbol\mu)^T\}, \qquad \boldsymbol\mu = \begin{bmatrix} \mu_1 \\ \vdots \\ \mu_n \end{bmatrix}$

65 Covariance matrix
Proof:
$\mathbf{C} = E\{(\mathbf{X} - \boldsymbol\mu)(\mathbf{X} - \boldsymbol\mu)^T\} = E\left\{ \begin{bmatrix} X_1 - \mu_1 \\ X_2 - \mu_2 \\ \vdots \\ X_n - \mu_n \end{bmatrix} \begin{bmatrix} X_1 - \mu_1 & X_2 - \mu_2 & \cdots & X_n - \mu_n \end{bmatrix} \right\}$
$= E\left\{ \begin{bmatrix} (X_1 - \mu_1)^2 & \cdots & (X_1 - \mu_1)(X_n - \mu_n) \\ \vdots & \ddots & \vdots \\ (X_n - \mu_n)(X_1 - \mu_1) & \cdots & (X_n - \mu_n)^2 \end{bmatrix} \right\} = \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_n^2 \end{bmatrix}$
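A numerical version of this construction (assuming NumPy; the four standard deviations are arbitrary): the covariance matrix is estimated from $k$ observation vectors, and for uncorrelated components it comes out (nearly) diagonal:

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 4, 100_000                          # n RVs, k observations of each
stds = np.array([[1.0], [2.0], [0.5], [3.0]])
X = stds * rng.normal(size=(n, k))         # uncorrelated rows with given std devs

mu = X.mean(axis=1, keepdims=True)         # estimated mean vector
C = (X - mu) @ (X - mu).T / k              # estimate of E{(X - mu)(X - mu)^T}
print(np.round(C, 2))                      # ~diag(1, 4, 0.25, 9)
```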

66 Covariance matrix for WSS processes
For wide-sense stationary processes
$R_X(\tau) = R_X(-\tau) \qquad C_X(\tau) = C_X(-\tau)$
Thus the covariance matrix is symmetric about the diagonal:
$\mathbf{C} = \begin{bmatrix} C_1 & C_{12} & \cdots & C_{1n} \\ C_{12} & C_2 & \cdots & C_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ C_{1n} & C_{2n} & \cdots & C_n \end{bmatrix} = \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{12} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{1n} & \sigma_{2n} & \cdots & \sigma_n^2 \end{bmatrix}$

67 Gaussian random process
($\mathbf{C}$: covariance matrix)

68 Gaussian random processes
Example: for a Gaussian process whose samples are iid RVs, the columns of the data matrix are iid RVs, and the estimated covariance matrix is symmetric with off-diagonal elements close to zero.

69 Covariance matrix: properties
The covariance matrix is symmetric and nonnegative definite. The elements along the principal diagonal are the variances of the elements of the random vector; the elements off the principal diagonal are the covariances (hence, up to normalization, the correlation coefficients) between pairs of elements. Uncorrelated vector elements correspond to a diagonal covariance matrix.
Is it possible to define a linear transformation mapping the RP $X$ to a RP $Y$ such that $Y$ has a covariance matrix in diagonal form?

70 Karhunen-Loève transform
The KLT is a linear transform that maps the random process $X$ to a random process $Y$ whose covariance matrix is diagonal, i.e., whose components are uncorrelated. If $X$ is Gaussian, then the components of $Y$ are also independent: for Gaussian processes, uncorrelation is necessary and sufficient for independence.
Given a wide-sense stationary process with covariance matrix $\mathbf{C}_X$, we look for a linear transform $T$ such that $Y = T^T X$ with $\mathbf{C}_Y$ diagonal. It can be proved that $T$ consists of the eigenvectors of $\mathbf{C}_X$:
$\mathbf{C}_X \varphi_k = \lambda_k \varphi_k, \quad k = 0, \dots, N-1$
$\Phi = [\varphi_0\ \varphi_1\ \cdots\ \varphi_{N-1}]$ (eigenvector matrix; the eigenvectors are the columns)
$\Lambda = \mathrm{diag}(\lambda_0, \lambda_1, \dots, \lambda_{N-1})$ (diagonal eigenvalue matrix)

71 Properties
The eigenvector matrix $\Phi$ is square ($N \times N$), unitary and orthogonal:
$\Phi^T \Phi = I \qquad \Phi^{-1} = \Phi^T$
The eigenvectors form an orthonormal basis:
$\langle \varphi_i, \varphi_j \rangle = \varphi_i^T \varphi_j = \delta_{i,j} = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$
Projection on a basis:
$\mathbf{y} = \mathbf{A}\mathbf{x} = \begin{bmatrix} a_{0,0} & \cdots & a_{0,N-1} \\ \vdots & \ddots & \vdots \\ a_{N-1,0} & \cdots & a_{N-1,N-1} \end{bmatrix} \begin{bmatrix} x_0 \\ \vdots \\ x_{N-1} \end{bmatrix}, \qquad y_i = \langle \mathbf{a}_i, \mathbf{x} \rangle = \mathbf{a}_i^T \mathbf{x}$

72 KLT
Projection on the eigenvector basis.
Analysis:
$\mathbf{y} = \Phi^T \mathbf{x} = \begin{bmatrix} \varphi_0^T \mathbf{x} \\ \varphi_1^T \mathbf{x} \\ \vdots \\ \varphi_{N-1}^T \mathbf{x} \end{bmatrix}, \qquad y_i = \langle \varphi_i, \mathbf{x} \rangle = \varphi_i^T \mathbf{x}$
Synthesis:
$\mathbf{x} = \Phi \mathbf{y}$ (since $(\Phi^T)^{-1} = (\Phi^{-1})^T = \Phi$)

73 KLT
The KLT diagonalizes the covariance matrix:
$\mathbf{C}_Y = \Phi^T \mathbf{C}_X \Phi = \Lambda$
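A minimal sketch of the diagonalization (assuming NumPy; the Toeplitz covariance $\rho^{|i-j|}$ is an assumed example, not from the slides):

```python
import numpy as np

N, rho = 6, 0.9
# Toeplitz covariance C[i, j] = rho^|i-j| (an assumed WSS example)
C = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

lam, Phi = np.linalg.eigh(C)               # eigenvalues and orthonormal eigenvectors
C_Y = Phi.T @ C @ Phi                      # covariance after the KLT
print(np.allclose(C_Y, np.diag(lam)))      # True: C_Y is diagonal
```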

74 Principal Component Analysis (PCA)
Principal component analysis (PCA) is a mathematical procedure that transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components. The first principal component accounts for as much of the variability in the data as possible, and each succeeding component accounts for as much of the remaining variability as possible.
PCA is derived from the KLT. Due to its signal-decorrelation properties, the KLT can be used for compression by reducing the size of the dataset.

75 PCA Algorithm
1. Find the mean vector and the covariance matrix $\mathbf{C}_x$.
2. Find the eigenvalues $\lambda_i$, $i = 0, \dots, N-1$, sort them in descending order, and sort the eigenvectors $\varphi_i$, $i = 0, \dots, N-1$, accordingly.
3. Choose a lower dimensionality $m < N$ (following an energy-based criterion).
4. Construct the $N \times m$ transform matrix composed of the $m$ eigenvectors corresponding to the largest eigenvalues:
$\Phi_m = [\varphi_0\ \cdots\ \varphi_{m-1}]$
Analysis: $\mathbf{y} = \Phi_m^T \mathbf{x}$; synthesis: $\hat{\mathbf{x}} = \Phi_m \mathbf{y}$ (the columns of $\Phi_m$ are the basis vectors). A sketch of the algorithm follows below.
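A sketch of the four steps (assuming NumPy; the synthetic correlated data are an arbitrary choice): note that the reconstruction MSE equals the average of the discarded eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(9)
N, k, m = 8, 10_000, 3                     # N-dim data, k samples, keep m components

A = rng.normal(size=(N, N))                # mixing matrix -> correlated data
X = A @ rng.normal(size=(N, k))

# Steps 1-2: mean, covariance, eigen-pairs sorted by descending eigenvalue
mu = X.mean(axis=1, keepdims=True)
C = (X - mu) @ (X - mu).T / k
lam, Phi = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, Phi = lam[order], Phi[:, order]

# Steps 3-4: N x m transform from the m leading eigenvectors
Phi_m = Phi[:, :m]
Y = Phi_m.T @ (X - mu)                     # analysis
X_hat = Phi_m @ Y + mu                     # synthesis (approximation)

mse = np.mean((X - X_hat) ** 2)
print(mse, lam[m:].sum() / N)              # MSE = mean of the discarded eigenvalues
```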

76 Example
[Figure: reconstructions with m = 32 and m = 16 retained components.]

77 Example
[Figure: reconstruction with m = 8 retained components.]

78 Karhunen-Loève Transform
$Y = T^T X$
$\mathbf{C}_Y = T^T \mathbf{C}_X T = \Lambda$ (diagonal matrix of the eigenvalues)
$\mathbf{C}_X v_i = \lambda_i v_i$ (eigenvector equation for $\mathbf{C}_X$; $v_i$ is the eigenvector associated with the eigenvalue $\lambda_i$)
$T = [v_0\ v_1\ \cdots\ v_{N-1}]$ (the columns are the eigenvectors)
The matrix $T$ transforms $X$ into $Y$, whose covariance matrix is diagonal with elements $\Lambda_{ii} = \lambda_i = \mathrm{var}[y_i]$.

79 Properties of the KLT
It is optimal for Gaussian sources, namely it minimizes the MSE between the vector and its approximation when only $k$ out of $K$ transform coefficients are retained. It essentially removes the redundancy in the input vector, allowing better compression performance.
The KLT transforms a Gaussian random vector into a Gaussian random vector with statistically independent components. If the vector is not Gaussian, the components of $Y$ will be uncorrelated but not independent.
Under some conditions it is well approximated by the DCT, which in addition allows fast algorithms for its implementation (as in JPEG).

80 What about ergodicity?
The hypothesis of ergodicity (which implies stationarity) is often assumed in applications, because it allows focusing on a single realization to estimate the probability density function (and its parameters) of a random process.
What does this mean? For 1D signals (ECG, EEG): the measurements correspond to the realizations, and are thus used to study the signals through the estimation of the stochastic parameters. However, for 1D signals many realizations of a given process are often available.

81 Images
[Figure: a stack of k images, realizations 1 through k.]
Each image is the realization of a 2D random process; the process consists of $N \times M$ RVs. The observations of each RV run orthogonally to the image plane, that is, they gather the pixel values at position $(n, m)$ across the set of images. Ensemble averages should be evaluated over such RVs.
Assuming stationarity and ergodicity facilitates the task by allowing all the computations to be performed on a single image.

82 Images
Stationarity: averages are assumed to be equal across the image. What matters is the distance $d$ among the pixels: with $\tau \to d$,
$R_X(\tau) = R_X(-\tau) \to R_X(d) \qquad C_X(\tau) = C_X(-\tau) \to C_X(d)$
i.e., $C_X(d)$ is the same irrespective of the direction.
Ergodicity: all the calculations are performed locally; instead of looking at different realizations, the different moments are calculated on the image.
Simplification: columns represent the RVs, rows represent the realizations.

83 Images
[Figure: observations vs. realizations arranged in a matrix.]
The covariance matrix is symmetric about the major diagonal; covariances and correlations are evaluated between columns.
Limit: the stationarity and ergodicity assumptions are asymptotic: they assume that the number of realizations ($k$) and the size of each realization ($n$ in 1D, $N \times M$ in 2D) tend to infinity. When dealing with signals of finite size the hypotheses are not satisfied and the estimates are poor.

84 Example

85 Other solution
Consider the image as a set of subimages. Assuming that stationarity also holds locally, each subimage is considered a realization, and the covariance matrix is calculated over the subimages.
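A sketch of this block-based estimation (assuming NumPy; the random 256x256 "image" and the 8x8 block size are arbitrary stand-ins): each subimage, flattened to a vector, is treated as one realization:

```python
import numpy as np

rng = np.random.default_rng(10)
img = rng.normal(size=(256, 256))          # stand-in for a real image
B = 8                                      # subimage (block) size

# Each non-overlapping BxB block, flattened, is treated as one realization
blocks = (img.reshape(256 // B, B, 256 // B, B)
             .transpose(0, 2, 1, 3)
             .reshape(-1, B * B))          # shape: (1024, 64)

mu = blocks.mean(axis=0)
C = (blocks - mu).T @ (blocks - mu) / blocks.shape[0]
print(C.shape)                             # (64, 64): covariance over subimages
```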
