Part IV

Stochastic Image Analysis


Contents

IV Stochastic Image Analysis

7 Introduction to Stochastic Processes
  7.1 Probability
      7.1.1 Random events and subjective experience
      7.1.2 Axioms of probability
      7.1.3 Statistical independence
      7.1.4 The Bayes formula
  7.2 Random variables
      7.2.1 Distribution function and probability density function
      7.2.2 Expectation and moments
      7.2.3 The covariance matrix
      7.2.4 Complex random variables
      7.2.5 Functions of random variables
  7.3 Stochastic processes
      7.3.1 Definition
      7.3.2 Characterization of a stochastic process
      7.3.3 Statistical vs. time characterization of a random signal
      7.3.4 Stationary processes
      7.3.5 Ergodicity
      7.3.6 Spectral analysis of random signals
  7.4 Outlook

Chapter 7

Introduction to Stochastic Processes

Every statistical analysis must be built upon a mathematical model linking observable reality with the mechanism generating the observations. This model should be a parsimonious description of nature: its functional form should be simple and the number of its parameters and components should be a minimum. The model should be parameterized in such a way that each parameter can be interpreted easily and identified with some aspect of reality. The functional form should be sufficiently tractable to permit the sort of mathematical manipulations required for the estimation of its parameters and other inference about nature.

Mathematical models may be divided into three general classes:

1) Purely deterministic.
2) Deterministic with simple random components.
3) Stochastic.

Examples: Consider the observation of a falling object using a TV camera. Newtonian physics states that the distance traveled by a falling object is directly related to the squared time of fall. If atmospheric turbulence, observing errors and other transient effects can be ignored, the displacement can be calculated and predicted exactly. In the second kind of model, each observation is a function of a strictly deterministic component and a random term, e.g. camera electrical noise; the random components are assumed not to depend on one another for different observations. Stochastic models are constructed from fundamental random events as components, e.g. the observation of a butterfly flight using a noisy camera.

Further, to define these approaches more precisely, we present an overview of the following concepts: probability, random variable, stochastic process, random signal and information.

7.1 Probability

"Probability theory is nothing but common sense reduced to calculations." (Laplace, 1819)

7.1.1 Random events and subjective experience

Definition 1. "The probability of one event is the ratio of the number of cases favorable to it, to the number of all cases possible when nothing leads us to expect that any one of these cases should occur more than any other, which renders them, for us, equally possible." (Laplace, 1812)

Definition 2. "Probability is a representation of degrees of plausibility by real numbers." (Jeffrey, today's understanding)

The first definition is the classical "frequentist" approach. The second is the modern "Bayesian" interpretation. In these lectures the first definition will be used; the second will be mentioned only a few times, to turn the attention to fields of advanced statistics.

7.1.2 Axioms of probability

The concept of the probability Pr associated with an event E is introduced axiomatically.

    Set operation          Probability statement
    E1 ∪ E2                At least one of E1 or E2 occurs
    E1 ∩ E2                Both E1 and E2 occur
    Ē                      E does not occur
    ∅                      The impossible event
    E1 ∩ E2 = ∅            E1 and E2 are mutually exclusive
    E1 ⊂ E2                E1 implies E2

Table 7.1: Some set operations and corresponding probability statements.

Axioms:

1) Pr(E) ≥ 0
2) Pr(I) = 1, where I is the certain event
3) Pr(E1 ∪ E2) = Pr(E1) + Pr(E2) if E1 ∩ E2 = ∅ (sum axiom)
4) Pr(E1 ∩ E2) = Pr(E2|E1) Pr(E1) = Pr(E1|E2) Pr(E2) (product axiom)

Observation. The product axiom does not assume any exclusivity of the events. As a reminder, the notation Pr(A|B) denotes the probability of the event A conditioned on the event B.

As a consequence:

Pr(∅) = 0. The impossible event has zero probability. However, it does not follow that an event of zero probability is impossible (e.g. the occurrence of one particular real number in the interval [0, 1]).

0 ≤ Pr(E) ≤ 1.

Remark. These two conclusions are in the spirit of the second definition of probability.

7.1.3 Statistical independence

Two events are statistically independent if

    Pr(E1 ∩ E2) = Pr(E1) Pr(E2),                                        (7.1)

from which one deduces that

    Pr(E2|E1) = Pr(E2)                                                  (7.2)

and

    Pr(E1|E2) = Pr(E1).                                                 (7.3)

The probability of an event is not influenced by the fact that another event takes place.

7.1.4 The Bayes formula

Consider a mutually exclusive and complete set of events {H1, H2, ..., Hn} that is not independent of an event E in a certain experiment. We call the events Hi hypotheses and interpret them as hypothetical causes of the event E. The following decomposition formula can be written:

    Pr(E) = Σ_{i=1}^{n} Pr(E|Hi) Pr(Hi),                                (7.4)

and also the formula of Bayes:

    Pr(Hi|E) = Pr(E|Hi) Pr(Hi) / Pr(E).                                 (7.5)

The probability Pr(Hi|E) is the probability of the hypothesis Hi knowing that the event E has effectively occurred. It is called the "a posteriori" probability of Hi (one knows that E took place), and Pr(Hi) is called the "a priori" probability.

Remark. The Bayes formula can be understood as a formula for inverting conditional probabilities, i.e. computing Pr(Hi|E) given Pr(E|Hi) and Pr(Hi).
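To make Eqs. (7.4) and (7.5) concrete, here is a minimal Python sketch; the two hypotheses and all numerical values are invented purely for illustration:

```python
# Hypothetical a priori probabilities Pr(H_i) and likelihoods Pr(E|H_i)
priors = {"H1": 0.7, "H2": 0.3}
likelihoods = {"H1": 0.1, "H2": 0.8}

# Decomposition formula (7.4): total probability of the event E
pr_E = sum(likelihoods[h] * priors[h] for h in priors)

# Bayes formula (7.5): a posteriori probabilities Pr(H_i | E)
posteriors = {h: likelihoods[h] * priors[h] / pr_E for h in priors}

print(pr_E)        # 0.31
print(posteriors)  # {'H1': ~0.226, 'H2': ~0.774}
```

Although H1 is a priori the more probable cause, observing E inverts the ranking: E is much more likely under H2.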

7.2 Random variables

Consider an experiment characterized by its elementary events, supposed mutually independent and exclusive. A particular event consists of the union of several elementary events; its probability is the sum of the probabilities of the elementary events. A random variable (r.v.) is defined by a one-to-one correspondence with an ensemble of elementary events and is characterized by the probability distribution of these events.

Example. In an image, the fact that a pixel has a certain grey value is an elementary event characterized by its probability. Thus the ensemble of pixel grey levels, the intensity, is a random variable.

7.2.1 Distribution function and probability density function

Given a r.v. X defined on (−∞, ∞), the distribution function is defined as

    F(x) = Pr(X ≤ x)                                                    (7.6)

and has the following properties:

    F(−∞) = 0,   F(+∞) = 1,   F(b) − F(a) = Pr(a < X ≤ b),              (7.7)

and F(x) is a monotonic non-decreasing function. The probability density function (p.d.f.) is defined as

    p(x) = dF(x)/dx                                                     (7.8)

and is interpreted as

    Pr(x < X ≤ x + dx) = p(x) dx.                                       (7.9)

A distribution function and a p.d.f. can be defined for a multidimensional¹ r.v. X = (X1, ..., Xn)^T as

    F(x1, ..., xn) = Pr(X1 ≤ x1, ..., Xn ≤ xn)                          (7.10)

and

    p(x1, ..., xn) = ∂^n F(x1, ..., xn) / (∂x1 ... ∂xn),                (7.11)

with the interpretation

    Pr(x1 < X1 ≤ x1 + dx1, ..., xn < Xn ≤ xn + dxn) = p(x1, ..., xn) dx1 ... dxn.   (7.12)

Example. In an image, "interesting" patterns are e.g. lines. One considers the neighborhood of a pixel ω (Fig. 7.1) and defines the p.d.f. of a line on the neighborhood variables. The joint distribution p(x1, x2, ..., x9) captures the orientation of the line.

¹ Also called the joint p.d.f. of X1, ..., Xn.
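The definitions (7.6)-(7.9) can be illustrated on pixel data. A minimal sketch, assuming a synthetic random "image" in place of real data; it estimates the distribution function and the p.d.f. of the grey levels and checks the properties (7.7):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128))  # synthetic 8-bit image
x = np.sort(image.ravel())

# Empirical distribution function F(x) = Pr(X <= x), cf. Eq. (7.6)
F = np.arange(1, x.size + 1) / x.size

# Histogram estimate of the p.d.f. p(x) = dF/dx, cf. Eq. (7.8)
p, edges = np.histogram(image, bins=32, range=(0, 256), density=True)

assert np.all(np.diff(F) >= 0)      # F is monotonic non-decreasing
print(F[0], F[-1])                  # ~0 and exactly 1, cf. Eq. (7.7)
print(np.sum(p * np.diff(edges)))   # the p.d.f. estimate integrates to 1
```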

Figure 7.1: Example of a 3x3 pixel neighborhood and a diagonal image line.

Marginal p.d.f. Given an n-dimensional r.v. from which only k < n components are of interest, a marginal p.d.f. is defined as

    p(x1, ..., xk) = ∫ dx_{k+1} ... ∫ dx_n  p(x1, ..., xk, x_{k+1}, ..., xn).   (7.13)

Example. Assuming the knowledge of a joint p.d.f. characterizing edges on a 9-pixel neighborhood, the p.d.f. of the grey level in the center of the neighborhood is obtained as a marginal p.d.f. by integration over the 8 r.v.'s attached to the surrounding neighbors.

Conditional p.d.f. An n-dimensional joint p.d.f. is called conditional relative to n − k variables if these n − k variables have predefined values:

    p(x1, ..., xk | x_{k+1}, ..., xn) dx1 ... dxk
        = Pr(x1 < X1 ≤ x1 + dx1, ..., xk < Xk ≤ xk + dxk | X_{k+1} = x_{k+1}, ..., Xn = xn).   (7.14)

Using the Bayes formula one obtains:

    p(x1, ..., xk | x_{k+1}, ..., xn) = p(x1, ..., xk, x_{k+1}, ..., xn) / p(x_{k+1}, ..., xn).   (7.15)

In the more general case, using the vector notation x = (x1, ..., xn)^T, y = (y1, ..., yn)^T,

    p(x, y) = p(x|y) p(y) = p(y|x) p(x),                                (7.16)

    p(x) = ∫_y p(x, y) dy = ∫_y p(x|y) p(y) dy,
    p(y) = ∫_x p(x, y) dx = ∫_x p(y|x) p(x) dx.

Statistical independence. The generalization of statistical independence gives:

    p(x1, ..., xk | x_{k+1}, ..., xn) = p(x1, ..., xk)                  (7.17)

and

    p(x1, ..., xk, x_{k+1}, ..., xn) = p(x1, ..., xk) p(x_{k+1}, ..., xn).   (7.18)
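The marginalization rule (7.13) can be checked numerically. The following sketch assumes a correlated bivariate Gaussian joint p.d.f. (an illustrative choice, not taken from the notes); integrating the joint density over x2 should recover the analytic marginal of X1:

```python
import numpy as np

# Correlated bivariate Gaussian joint p.d.f. on a grid
x1, x2 = np.meshgrid(np.linspace(-5, 5, 401), np.linspace(-5, 5, 401),
                     indexing="ij")
rho = 0.8
joint = np.exp(-(x1**2 - 2*rho*x1*x2 + x2**2) / (2*(1 - rho**2))) \
        / (2*np.pi*np.sqrt(1 - rho**2))

# Marginal p.d.f. of X1, Eq. (7.13): integrate the joint over x2
dx = 10 / 400
marginal = joint.sum(axis=1) * dx

# Compare with the analytic standard-normal marginal of X1
analytic = np.exp(-x1[:, 0]**2 / 2) / np.sqrt(2*np.pi)
print(np.max(np.abs(marginal - analytic)))  # tiny discretization error
```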

7.2.2 Expectation and moments

Given a function f(X1, ..., Xn) of the r.v.'s X1, ..., Xn, the expectation operator E[·] is introduced:

    E[f(X1, ..., Xn)] = ∫ dx1 ... ∫ dxn  f(x1, ..., xn) p(x1, ..., xn).   (7.19)

The expectation operator allows the definition of the moments of a r.v. For the 2-dimensional case we have:

    m1  = ∫∫ dx1 dx2  x1 p(x1, x2)    = ∫ dx1  x1 p(x1),                (7.20)
    m2  = ∫∫ dx1 dx2  x2 p(x1, x2)    = ∫ dx2  x2 p(x2),
    R1  = ∫∫ dx1 dx2  x1² p(x1, x2)   = ∫ dx1  x1² p(x1),               (7.21)
    R12 = ∫∫ dx1 dx2  x1 x2 p(x1, x2),

and analogously R2 = ∫ dx2 x2² p(x2). The variances and the covariance are

    σ1² = E[(X1 − m1)²] = R1 − m1²,                                     (7.22)
    σ2² = E[(X2 − m2)²] = R2 − m2²,
    C12 = E[(X1 − m1)(X2 − m2)] = R12 − m1 m2.                          (7.23)

Remark. If the r.v.'s are independent, their covariance is zero:

    C12 = E[(X1 − m1)(X2 − m2)]                                         (7.24)
        = ∫ dx1 ∫ dx2  (x1 − m1)(x2 − m2) p(x1, x2)
        = ∫ dx1 (x1 − m1) p(x1) ∫ dx2 (x2 − m2) p(x2)                   (7.25)
        = 0,

where the factorization of p(x1, x2) uses the independence. However, if the covariance is zero one cannot conclude that the r.v.'s are independent. A simple example follows.

Example. We consider a r.v. Θ uniformly distributed on the interval [0, 2π]:

    p(θ) = 1/(2π)  for θ ∈ [0, 2π],  and 0 else.                        (7.26)

We define two other r.v.'s as

    X1 = sin Θ,                                                         (7.27)
    X2 = cos Θ.

We note that the means are zero:

    m1 = (1/2π) ∫_0^{2π} sin θ dθ = 0,                                  (7.28)
    m2 = (1/2π) ∫_0^{2π} cos θ dθ = 0.

The covariance can be computed as:

    c12 = ∫_0^{2π} sin θ cos θ  p(θ) dθ = (1/4π) ∫_0^{2π} sin 2θ dθ = 0.   (7.29)

We found that the r.v.'s X1 and X2 are uncorrelated. However, we observe that X1 and X2 are strongly dependent,

    X1² + X2² = 1                                                       (7.30)

(cos²θ + sin²θ = 1), but in a non-linear relationship. The covariance can describe only linear interrelations.

7.2.3 The covariance matrix

We define a vector X of random variables and its mean:

    X = [X1, ..., Xn]^T,                                                (7.31)
    m_X = [m1, ..., mn]^T.                                              (7.32)

The covariance matrix C_X of X is defined as

    C_X = E[(X − m_X)(X − m_X)^T]                                       (7.33)

and has the form

    C_X = [ σ1²  c12  ...  c1n ]
          [ c21  σ2²  ...  c2n ]                                        (7.34)
          [ ...  ...  ...  ... ]
          [ cn1  cn2  ...  σn² ]

of a symmetric, square, non-negative definite matrix. If the components of a random vector are independent, the covariance matrix is diagonal. However, a diagonal covariance matrix does not imply independence of the components; it implies only that they are mutually uncorrelated.
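The example above and the structure of the covariance matrix (7.34) can both be verified by simulation. A sketch that replaces the expectations by sample averages (an approximation):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2*np.pi, size=1_000_000)  # Theta ~ U[0, 2*pi]
X = np.stack([np.sin(theta), np.cos(theta)])     # X1 = sin, X2 = cos

# Sample covariance matrix, cf. Eq. (7.33): symmetric, here ~diagonal
C = np.cov(X)
print(np.round(C, 4))   # off-diagonal c12 ~ 0: X1, X2 uncorrelated

# ...yet X1 and X2 are perfectly (non-linearly) dependent, Eq. (7.30)
print(np.allclose(X[0]**2 + X[1]**2, 1.0))       # True
```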

7.2.4 Complex random variables

A complex r.v. Z has two components, a real part X and an imaginary part Y, each in its turn being a r.v.:

    Z = X + jY.                                                         (7.35)

The characterization of a complex r.v. is done by a 2-dimensional p.d.f.: the joint p.d.f. of the real and imaginary parts. The mean of a complex r.v. Z is also complex and given by

    m_Z = E[Z] = E[X + jY] = E[X] + jE[Y].                              (7.36)

The variance of a complex r.v. is

    σ_Z² = E[(Z − m_Z)(Z − m_Z)*] = σ_X² + σ_Y²,                        (7.37)

which is real. The covariance of Z is defined as:

    C_Z = C_X + C_Y + j(C_YX − C_XY).                                   (7.38)

7.2.5 Functions of random variables

We suppose a r.v. X of known p.d.f. p(x) and define another r.v. Y,

    Y = f(X),                                                           (7.39)

where f(·) is a bijection. We would like to find the p.d.f. p(y) of Y. Since f(·) is a bijective function,

    Pr(x0 < X ≤ x0 + dx) = Pr(y0 < Y ≤ y0 + dy),                        (7.40)

where

    y0 = f(x0)                                                          (7.41)

and

    dy = f'(x0) dx.                                                     (7.42)

Thus we obtain

    |p(x) dx| = |p(y) dy|                                               (7.43)

and

    p(y) = p(x) |dx/dy| = p(f⁻¹(y)) |dx/dy|,                            (7.44)

which gives the p.d.f. of the r.v. Y. In the multidimensional case, with

    Y1 = f1(X1, ..., Xn),                                               (7.45)
    ...
    Yn = fn(X1, ..., Xn),                                               (7.46)

where the fi are continuous and differentiable functions and the Jacobian of the transformation is non-vanishing, we have:

    |p(x1, ..., xn) dx1 ... dxn| = |p(y1, ..., yn) dy1 ... dyn|.        (7.47)

The p.d.f. of the r.v.'s after the transformation will be:

    p(y1, ..., yn) = p(x1, ..., xn) |∂(x1, ..., xn)/∂(y1, ..., yn)|,    (7.48)

where the Jacobian is defined by:

    ∂(x1, ..., xn)/∂(y1, ..., yn) = det [ ∂xi/∂yj ],  i, j = 1, ..., n.   (7.49)
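Relation (7.44) can be checked by sampling. A minimal sketch, assuming the bijection Y = exp(X) with X standard normal, so that p(y) is the log-normal density; the histogram of the transformed samples should match the analytic p.d.f.:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=1_000_000)   # X ~ N(0, 1)
y = np.exp(x)                              # Y = f(X), f bijective on R

# Analytic p.d.f. of Y via Eq. (7.44): p(y) = p(f^{-1}(y)) |dx/dy|,
# with x = log(y) and |dx/dy| = 1/y  (the log-normal density)
grid = np.linspace(0.05, 5.0, 200)
p_y = np.exp(-np.log(grid)**2 / 2) / (np.sqrt(2*np.pi) * grid)

# Density estimate from the samples on the same range
counts, edges = np.histogram(y, bins=200, range=(0.05, 5.0))
hist = counts / (y.size * np.diff(edges))
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - np.interp(centers, grid, p_y))))  # small
```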

Example. As an illustration we use the transformation from cartesian to polar coordinates in a 2-dimensional space:

    Y1 = √(X1² + X2²),                                                  (7.50)
    Y2 = arctan(X2/X1).

The inverse transformation is

    X1 = Y1 cos Y2,                                                     (7.51)
    X2 = Y1 sin Y2,

and it has the Jacobian

    ∂(x1, x2)/∂(y1, y2) = det [ cos φ   −ρ sin φ ]  =  ρ,               (7.52)
                              [ sin φ    ρ cos φ ]

where we write

    Y1 = ρ,   Y2 = φ.                                                   (7.53)

Given the p.d.f. p(x1, x2) of the r.v.'s X1, X2 in cartesian coordinates, one finds the p.d.f. in polar coordinates as

    p(ρ, φ) = ρ p(x1, x2),                                              (7.54)

i.e.

    p(ρ, φ) = ρ p(ρ cos φ, ρ sin φ),   ρ > 0 and 0 ≤ φ < 2π.            (7.55)

Further we consider an important case: X1 and X2 are independent r.v.'s with normal p.d.f.'s of zero mean and the same variance σ²:

    p(x1, x2) = p(x1) p(x2) = (1/2πσ²) exp(−(x1² + x2²)/2σ²).           (7.56)

For exemplification, one can consider the real part X1 and the imaginary part X2 of a Synthetic Aperture Radar (SAR) image² (Fig. 7.2) and their p.d.f.'s. In polar coordinates the p.d.f. is obtained in the form

    p(ρ, φ) = (ρ/2πσ²) exp(−ρ²/2σ²) = p(ρ) p(φ),                        (7.57)

where

    p(ρ) = (ρ/σ²) exp(−ρ²/2σ²)                                          (7.58)

is a Rayleigh distribution and

    p(φ) = 1/(2π)                                                       (7.59)

is a uniform distribution.

² Synthetic Aperture Radar images are obtained by coherently illuminating a scene with microwave radiation. SAR images are in the same important class of images as laser, sonar, echographic and computer tomography images.
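The Rayleigh/uniform factorization (7.57)-(7.59) can be reproduced by simulation. A sketch under the stated assumptions (independent zero-mean Gaussian real and imaginary parts); note that numpy's arctan2 returns phases in (−π, π] rather than [0, 2π), which does not affect the uniformity:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0
x1 = rng.normal(0.0, sigma, size=1_000_000)  # real part, N(0, sigma^2)
x2 = rng.normal(0.0, sigma, size=1_000_000)  # imaginary part

rho = np.hypot(x1, x2)                       # amplitude, Eq. (7.50)
phi = np.arctan2(x2, x1)                     # phase

# Mean of the Rayleigh amplitude, Eq. (7.58): sigma * sqrt(pi/2)
print(rho.mean(), sigma * np.sqrt(np.pi / 2))   # both ~1.2533

# The phase is uniform: a flat histogram at ~1/(2*pi), Eq. (7.59)
hist, _ = np.histogram(phi, bins=8, density=True)
print(np.round(hist, 3))                     # each bin ~0.159
```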

Figure 7.2: Synthetic Aperture Radar image, part of which (marked) was used for the example of Section 7.2.5.

The images in polar coordinates and their p.d.f.'s illustrate this: the ρ-image is the amplitude image and is Rayleigh distributed; the φ-image is the phase image and is uniformly distributed.

Remark. In this example we can observe that the "information" is contained in the amplitude image. The phase image has maximum uncertainty: there is no preferred phase value. The Rayleigh distribution of the amplitude suggests that such an image can be modeled as a noise-less amplitude multiplied with a noise. That is why it is an interesting exercise to compute the p.d.f. of the logarithmically transformed Rayleigh-distributed r.v.: the logarithm transforms the multiplicative nature of the noise into an additive one.

Linear transformation of r.v. Further we consider the vectors of r.v.'s X = [X1, ..., Xn]^T and Y = [Y1, ..., Yn]^T. Without loss of generality we consider the r.v.'s to have zero mean. A linear transformation is given by

    Y = AX,                                                             (7.60)

where A (n×n) is the matrix characterizing the transformation. With simple computations we obtain:

    E[Y] = A E[X],                                                      (7.61)
    C_Y = E[YY^T] = A E[XX^T] A^T = A C_X A^T,
    C_XY = E[XY^T] = E[XX^T] A^T = C_X A^T,
    C_XY = C_YX^T.                                                      (7.62)

An important linear transformation is the Karhunen-Loeve transform. We can write

    C_X = M Λ M^T,                                                      (7.63)

where Λ is the diagonal matrix of eigenvalues of C_X, and M is the orthogonal matrix having the eigenvectors of C_X as columns. It follows:

    Λ = M^T C_X M.                                                      (7.64)

The Karhunen-Loeve transform has the matrix

    A = M^T                                                             (7.65)

and transforms X into Y having a diagonal covariance matrix C_Y. The Karhunen-Loeve transform decorrelates the components of the r.v. vector X.
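A minimal numerical sketch of the Karhunen-Loeve transform (7.63)-(7.65), with an arbitrarily chosen mixing matrix used only to create correlated components:

```python
import numpy as np

rng = np.random.default_rng(4)
# Correlated zero-mean data: X = B W gives C_X ~ B B^T (illustrative)
B = np.array([[2.0, 0.0], [1.5, 0.5]])
X = B @ rng.normal(size=(2, 500_000))

C_X = np.cov(X)                      # sample covariance matrix

# Eigendecomposition C_X = M Lambda M^T, Eq. (7.63): the columns of M
# are the eigenvectors, lam holds the eigenvalues (diagonal of Lambda)
lam, M = np.linalg.eigh(C_X)

# Karhunen-Loeve transform A = M^T, Eq. (7.65): Y = A X is uncorrelated
Y = M.T @ X
print(np.round(np.cov(Y), 3))        # ~diagonal, entries ~lam
```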


7.3 Stochastic processes

7.3.1 Definition

In the previous sections we introduced two concepts: the random variable and the vector of random variables. A r.v. was defined by a one-to-one correspondence with the elementary events obtained as the result of an experiment. The vector of r.v.'s was introduced as a formal notation, but now we give it a more precise meaning: each component is in correspondence with a time moment³. In this way a stochastic process is defined by the one-to-one correspondence of the results of an experiment with time (space) functions. We can interpret a stochastic process as a process that generates random functions. Each random function is a particular realization of the stochastic process, called a random signal. Thus a stochastic process is the ensemble of random signals which evolve in time (Fig. 7.3).

Figure 7.3: Exemplification of a stochastic process x(t). Each x_k(t), for a fixed value k, is a particular realization, a random signal. Each ensemble {x_1(t), x_2(t), ...} for a given time moment t is a random variable.

A stochastic process is characterized by its across-realizations behavior for all time moments. The problematic of stochastic processes is to evidence and characterize not only the individual r.v.'s at fixed moments of time, but also the relationships and statistical dependencies between the r.v.'s obtained at different time measurements. Stochastic image analysis considers images as particular realizations of a spatial stochastic process and aims at the local and neighborhood analysis of the statistical dependencies of the pixel intensities.

³ Historically, time is the variable used in signal theory. In these lectures time will be substituted by space, t → (i, j), where (i, j) are the coordinates of a pixel in an image.
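The two views of a stochastic process, a single realization x^(k)(t) versus the r.v. x(t_k) taken across realizations, can be made explicit in code. A sketch with an assumed toy process (a variance-normalized random walk, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
K, N = 1000, 256                 # K realizations, N time (pixel) samples

# Toy stochastic process: a random walk normalized so that the r.v.
# at each time instant has unit variance (an assumed model)
noise = rng.normal(size=(K, N))
x = np.cumsum(noise, axis=1) / np.sqrt(np.arange(1, N + 1))

x_k = x[0]                       # one realization: a random signal
x_t = x[:, 100]                  # one time instant: a random variable
print(x_k.shape, x_t.shape)      # (256,) vs (1000,)
print(x_t.mean(), x_t.std())     # ensemble statistics at t = t_k
```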

7.3.2 Characterization of a stochastic process

A stochastic process is fully determined by the knowledge of all p.d.f.'s

    lim_{n→∞} p(x(t1), ..., x(tn)),                                     (7.66)

where x(tk) is the r.v. obtained by sampling the process at the time tk.

7.3.3 Statistical vs. time characterization of a random signal

The statistical characterization is done at a fixed time moment tk across all the realizations of the stochastic process; it is the characterization of a r.v. x(tk). The time characterization refers to one particular realization x^(k)(t) of the stochastic process, as a time-dependent random signal. The following table presents comparatively the stochastic and temporal characterizations of random signals in the first- and second-order statistics (*).

    Stochastic:
      Mean              m_X(t) = E[X(t)] = ∫ x p(x(t)) dx
      Variance          σ_X²(t) = E[X(t) X^T(t)]
      Covariance        C_XX(t1, t2) = E[X(t1) X^T(t2)]
      Mutual covariance C_XY(t1, t2) = E[X(t1) Y^T(t2)]

    Temporal:
      Mean              μ^(k)(t0) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} x^(k)(t0 + t) dt
      Variance          var^(k)(t0) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} [x^(k)(t0 + t)]² dt
      Autocorrelation   R_XX^(k)(t1, t2) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} x^(k)(t1 + t) x^(k)(t2 + t) dt
      Cross-correlation R_XY^(k)(t1, t2) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} x^(k)(t1 + t) y^(k)(t2 + t) dt

    (*) For simplicity the mean is assumed to be zero.

7.3.4 Stationary processes

A stochastic process is called strict-sense stationary if all its statistical properties are invariant to a shift of the time origin. A stochastic process is called wide-sense stationary if its mean is constant and its autocorrelation depends only on the distance t1 − t2:

    E[x(t)] = constant,                                                 (7.67)
    C_XX = C_XX(t1 − t2).

Example. We consider a random variable φ uniformly distributed on the interval [0, 2π] and define the random vector X(t):

    X(t) = [ X1(t) ]  =  [ cos(ωt + φ) ].                               (7.68)
           [ X2(t) ]     [ sin(ωt + φ) ]

The expectations are

    E[X1(t)] = 0,                                                       (7.69)
    E[X2(t)] = 0.

Thus the covariance of X(t) can be evaluated as

    C_X(t1, t2) = E[X(t1) X^T(t2)]
                = [ C_X1(t1, t2)    C_X1X2(t1, t2) ]                    (7.70)
                  [ C_X2X1(t1, t2)  C_X2(t1, t2)   ]

and further:

    C_X1(t1, t2) = E[X1(t1) X1(t2)]                                     (7.71)
                 = E[cos(ωt1 + φ) cos(ωt2 + φ)]
                 = (1/2) cos[ω(t2 − t1)].

In a similar way,

    C_X2(t1, t2) = (1/2) cos[ω(t1 − t2)]                                (7.72)

and

    C_X1X2(t1, t2) = −(1/2) sin[ω(t1 − t2)].                            (7.73)

Because the mean is zero and the covariance matrix depends only on the time difference t2 − t1, we conclude that the process X(t) is wide-sense stationary.
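The result (7.71) is easy to verify by Monte Carlo: averaging cos(ωt1 + φ) cos(ωt2 + φ) over many random phases should reproduce (1/2) cos[ω(t2 − t1)]. A sketch with arbitrarily chosen ω, t1, t2:

```python
import numpy as np

rng = np.random.default_rng(6)
omega, t1, t2 = 2.0, 0.3, 1.1
phi = rng.uniform(0, 2*np.pi, size=1_000_000)   # phase ~ U[0, 2*pi]

x1_t1 = np.cos(omega*t1 + phi)
x1_t2 = np.cos(omega*t2 + phi)

# Ensemble autocovariance at (t1, t2), Eq. (7.71)
print(np.mean(x1_t1 * x1_t2))                   # sample estimate
print(0.5 * np.cos(omega * (t2 - t1)))          # analytic value
```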

7.3.5 Ergodicity

Figure 7.4: Classification of stochastic processes (nested classes: stochastic processes ⊃ wide-sense stationary ⊃ strict-sense stationary ⊃ ergodic).

We have already seen from the definition of a stochastic process that two modalities of analysis are possible: stochastic and temporal. Ergodicity is the theory dealing with the study of the stochastic vs. temporal characterization of stochastic processes, e.g. is a temporal average equal to a stochastic expectation? A stochastic process is called ergodic if all its statistical properties can be determined from a single realization. Further we comment on only a limited case: the ergodicity of the mean. Supposing that X(t) is a stochastic process with constant mean

    E[x(t)] = η,                                                        (7.74)

we study under what condition the time average

    η_T = (1/2T) ∫_{−T}^{T} x(t) dt                                     (7.75)

is close to η. The necessary and sufficient condition for the process X(t) to be mean-ergodic is

    lim_{T→∞} (1/T) ∫_0^{T} (1 − τ/T) C_X(τ) dτ = 0.                    (7.76)

Remarks. Ergodicity necessarily requires strict-sense stationarity; the classification of stochastic processes is presented in Fig. 7.4. In practice it is very difficult to have access to enough realizations of a stochastic process, thus the ensemble statistical analysis is not possible; there remains only the possibility of temporal analysis of one realization.

7.3.6 Spectral analysis of random signals

The harmonic analysis of deterministic functions cannot be directly applied to random signals because in general

    ∫ |x^(k)(t)| dt → ∞.                                                (7.77)

The power spectrum density concept is therefore introduced:

    Φ(ω) = lim_{T→∞} E[ |X_T^(k)(ω)|² ] / T,                            (7.78)

where X_T^(k)(ω) is the Fourier transform of the realization x^(k) of the stochastic process X(t), restricted to the time interval T. One of the basic results of the spectral analysis of random signals is the Wiener-Khintchine theorem: the power spectrum density is the Fourier transform of the autocorrelation function,

    Φ(ω) = ∫ R_XX(τ) exp(−jωτ) dτ.                                      (7.79)

The Wiener-Khintchine theorem is recalled here to show the connection between stochastic and spectral analyses.
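The Wiener-Khintchine relation (7.79) has a discrete counterpart that can be checked directly: the inverse FFT of a periodogram equals the circular autocorrelation estimate of the signal. A sketch with an assumed toy stationary signal (white noise smoothed by a moving average):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 4096
w = rng.normal(size=N)                             # white Gaussian noise
x = np.convolve(w, np.ones(8) / 8, mode="same")    # toy stationary signal

# Periodogram estimate of the power spectrum density, cf. Eq. (7.78)
psd = np.abs(np.fft.fft(x))**2 / N

# Wiener-Khintchine, Eq. (7.79): the inverse FFT of the PSD gives the
# (circular) autocorrelation estimate of x
acf_from_psd = np.fft.ifft(psd).real
acf_direct = np.array([np.mean(x * np.roll(x, -k)) for k in range(8)])

print(acf_from_psd[:8] / acf_from_psd[0])
print(acf_direct / acf_direct[0])                  # identical up to rounding
```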

Figure 7.5: Exemplification of the Wiener-Khintchine theorem: a) texture image from the Brodatz album, b) the power spectrum density of the image and c) the autocorrelation function of the image. One can note the strong quasi-periodicity of the texture reflected in its stochastic characterization, the power spectrum density and the autocorrelation function. The signals in images b) and c) are Fourier pairs.

7.4 Outlook

This introduction was a short recapitulation of some of the basic concepts in statistics which will be used in the next chapters. Three nested notions have been discussed: the probabilistic event, the random variable and the stochastic process. The random signal (image) was defined as a particular realization of a stochastic process. Statistical independence and correlation have been commented on, intuitively showing the importance of non-linear interrelationships. Stationarity and ergodicity have been introduced from an "engineering" point of view.

