Random Variables

Overview
- Probability
- Random variables
- Transforms of pdfs
- Moments and cumulants
- Useful distributions
- Random vectors
- Linear transformations of random vectors
- The multivariate normal distribution
- Sums of independent random variables
- Central limit theorem

Taylor Series Expansion
Taylor series expansion about t = a:
  x(t) = x(a) + \frac{dx(t)}{dt}\Big|_{t=a}(t-a) + \frac{1}{2!}\frac{d^2 x(t)}{dt^2}\Big|_{t=a}(t-a)^2 + \cdots + \frac{1}{n!}\frac{d^n x(t)}{dt^n}\Big|_{t=a}(t-a)^n + r_n
where the remainder term is
  r_n = \frac{1}{(n+1)!}\frac{d^{n+1} x(t)}{dt^{n+1}}\Big|_{t=\tau}(t-a)^{n+1}, \quad \tau \in [t, a]

Introduction
- Random signals evolve in time in an unpredictable manner
- We must assume something doesn't change in order to use them
- Usually this is their average properties

Probability Space
- Let Ω denote the set of all possible outcomes ζ of an experiment
- Event: a subset of outcomes. The event is said to occur if the outcome of the experiment is one of the members of the subset; it is an "or" (union), not an "and" (intersection)
- A collection of subsets with certain properties is called a field and will be denoted F
- The probability of each event in the field is denoted Pr{·}
- The collection (Ω, F, Pr{·}) is called a probability space

J. McNames Portland State University ECE 538/638 Random Variables Ver
Random Variables
Random variable: a function that assigns a real number, x(ζ), to each outcome ζ in the sample space of a random experiment.
Conditions:
- {ζ : ζ ∈ Ω and -∞ < x(ζ) ≤ x₀} ∈ F for all x₀ ∈ R
- Pr{x(ζ) = +∞} = 0 and Pr{x(ζ) = -∞} = 0

Random Variable Properties
- The sample space Ω is the domain of the random variable
- The set of all values that x(·) can take on is the range of the random variable
- For now the range is the real line, R; later we will generalize to allow it to be a vector or a sequence
- This is a many-to-one mapping: a set of points ζ₁, ζ₂, ... may take on the same value of the random variable
- We will sometimes abbreviate random variable as RV
- The RV may be real- or complex-valued
- Random variables are actually deterministic functions of the events in the field: {ζ₁, ζ₂, ..., ζ_k}

Random Variable Properties Continued
- Notation: x(ζ) = x, where x(ζ) denotes a random variable (a function of the experimental outcome) and x denotes its value
- A random variable may be discrete-valued, continuous-valued, or mixed
- This is similar to spectral densities: if x(t) is nearly periodic, it has a discrete spectrum; most stationary random signals x(t) have continuous spectra; a combination of the two types has a mixed spectrum

Cumulative Distribution Function (cdf)
- Recall that probability is defined on the field of possible events
- Some of these events (subsets of outcomes) can be defined as {ζ : x(ζ) ≤ x}
- Cumulative distribution function (cdf): the probability of these events,
    F_x(x) ≜ Pr{x(ζ) ≤ x}
- Note the cdf is a function of x ∈ R
- The cdf is continuous from the right
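As a quick numerical sketch (not part of the original slides), the cdf of a simple random variable can be estimated from samples; the mapping x(ζ) = 2ζ - 1 and the sample size are arbitrary illustrative choices:

```python
import random

random.seed(0)

# Each "outcome" is a uniform draw zeta on [0, 1); the RV maps it to
# x = 2*zeta - 1, so x is uniform on [-1, 1).
samples = [2.0 * random.random() - 1.0 for _ in range(10_000)]

def empirical_cdf(samples, x):
    """Estimate F_x(x) = Pr{x(zeta) <= x} as the fraction of samples <= x."""
    return sum(1 for s in samples if s <= x) / len(samples)

# F_x should be ~0 at the left edge, ~0.5 at the median, ~1 at the right edge,
# and nondecreasing in between.
F_left  = empirical_cdf(samples, -1.0)
F_mid   = empirical_cdf(samples,  0.0)
F_right = empirical_cdf(samples,  1.0)
```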
Properties of the cdf
- 0 ≤ F_x(x) ≤ 1
- lim_{x→+∞} F_x(x) = 1 and lim_{x→-∞} F_x(x) = 0
- F_x(x) is a nondecreasing function of x; thus, if a < b, then F_x(a) ≤ F_x(b)
- F_x(x) is continuous from the right; that is, for h > 0, F_x(b) = lim_{h→0} F_x(b + h) = F_x(b⁺)
- Pr{a < x(ζ) ≤ b} = F_x(b) - F_x(a)
- Pr{x(ζ) = b} = F_x(b) - F_x(b⁻)

Probability Density Function (pdf)
- Probability density function (pdf): when it exists, it is defined as
    f_x(x) ≜ \frac{dF_x(x)}{dx}
- Thus, we also have
    F_x(x) = \int_{-\infty}^{x} f_x(u)\,du
- For a small interval Δx, f_x(x)Δx can be thought of as
    F_x(x + Δx) - F_x(x) = Pr{x < x(ζ) ≤ x + Δx}

Properties of the pdf
- f_x(x) ≥ 0
- Pr{a ≤ x(ζ) ≤ b} = \int_a^b f_x(u)\,du
- F_x(x) = \int_{-\infty}^{x} f_x(u)\,du
- \int_{-\infty}^{+\infty} f_x(u)\,du = 1
- A valid pdf can be formed from any nonnegative, piecewise-continuous function g(x) that has a finite integral
- The pdf must be defined for all real values of x; if x does not take on some values, this implies f_x(x) = 0 for those values

Point Statistics, Averages, and Moments
- In general, we will often not have a complete description of an RV (the pdf or cdf)
- Estimating a pdf or cdf is difficult in general, especially if x(ζ) is a sequence or vector
- However, we can often estimate some properties of the distribution without estimating the distribution itself; these are called point statistics
- We will only discuss a subset of point statistics: statistical averages, or moments
- These are useful because: in many cases we can estimate them accurately from data; they give us useful information about the distribution; and we don't have to know the distribution to estimate them
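The claim that any nonnegative, piecewise-continuous g(x) with a finite integral yields a valid pdf can be checked numerically. A minimal sketch, with g(x) = e^{-|x|} as an arbitrary choice and a crude Riemann sum standing in for the integral:

```python
import math

# g(x) = exp(-|x|) is nonnegative with finite integral (true value 2),
# so f(x) = g(x) / integral(g) is a valid pdf.
dx = 0.001
xs = [i * dx for i in range(-10_000, 10_001)]   # grid covering [-10, 10]
g  = [math.exp(-abs(x)) for x in xs]

area = sum(v * dx for v in g)                   # ~ 2
f    = [v / area for v in g]                    # normalized pdf

# F(x) = integral of f from -inf to x, built by a running sum.
F, run = [], 0.0
for v in f:
    run += v * dx
    F.append(run)

total = sum(v * dx for v in f)                  # ~ 1, as a pdf requires
F_at_zero = F[10_000]                           # ~ 0.5 by symmetry of g
```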
Expected Values Defined
Expected value: defined for a random variable x(ζ) as
  E[x(ζ)] ≜ μ_x = \int_{-\infty}^{+\infty} x f_x(x)\,dx
If x(ζ) is discrete-valued, the pdf consists only of impulses and the expectation can alternatively be written in terms of the pmf:
  E[x(ζ)] = \int_{-\infty}^{+\infty} x \Big( \sum_k p_k \delta(x - x_k) \Big) dx = \sum_k p_k \int_{-\infty}^{+\infty} x\,\delta(x - x_k)\,dx = \sum_k p_k x_k

Expectation Properties
- E[x(ζ)] ≜ μ_x = \int_{-\infty}^{+\infty} x f_x(x)\,dx is also called the mean of x(ζ)
- The mean can be regarded as the center of gravity of f_x(x)
- If f_x(x) is symmetric about a, f(x - a) = f(a - x), then μ_x = a; if f_x(x) is even, μ_x = 0
- Expectation is a linear operation: E[αx(ζ) + β] = αμ_x + β
- If y(ζ) = g(x(ζ)) is a function of the random variable x(ζ), then y(ζ) is also a random variable such that
    E[y(ζ)] = E[g(x(ζ))] = \int g(x) f_x(x)\,dx

Moments Defined
- mth-order moment of x(ζ):
    r_x^{(m)} ≜ E[x^m(ζ)] = \int x^m f_x(x)\,dx
- mth-order central moment of x(ζ):
    γ_x^{(m)} ≜ E[(x(ζ) - μ_x)^m] = \int (x - μ_x)^m f_x(x)\,dx
- Mean-squared value: the second-order moment, r_x^{(2)}. Note that E[x²(ζ)] ≠ E²[x(ζ)] in general

Moments and Definitions
- Variance of x(ζ): var[x(ζ)] ≜ σ_x² = γ_x^{(2)} = E[(x(ζ) - μ_x)²]
- Standard deviation of x(ζ): σ_x = \sqrt{var[x(ζ)]}
- Obvious moments: r_x^{(0)} = 1, r_x^{(1)} = μ_x
- Trivial central moments: γ_x^{(0)} = 1, γ_x^{(1)} = 0, γ_x^{(2)} = σ_x²
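Moments are exactly the kind of point statistic that is easy to estimate from data. A minimal sketch (the uniform distribution and sample size are illustrative choices; for x ~ U[0, 2], μ_x = 1, E[x²] = 4/3, and σ_x² = 1/3):

```python
import random

random.seed(1)
N = 200_000

# Samples of x(zeta) ~ U[0, 2].
x = [2.0 * random.random() for _ in range(N)]

mu_hat  = sum(x) / N                      # sample mean, estimates E[x]
r2_hat  = sum(v * v for v in x) / N       # second moment, estimates E[x^2]
var_hat = r2_hat - mu_hat ** 2            # gamma^(2) = r^(2) - mu^2

# Note E[x^2] != (E[x])^2 unless the variance is zero.
```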
Relationship of Moments
Moments and central moments are related:
  γ_x^{(m)} = \sum_{k=0}^{m} \binom{m}{k} (-1)^k μ_x^k\, r_x^{(m-k)}
It is also possible to calculate the first m moments from the first m central moments.

Characteristic Functions
Characteristic function of x(ζ):
  Φ_x(ξ) ≜ E[e^{jξx(ζ)}] = \int f_x(x) e^{jξx}\,dx
- Similar to the continuous-time Fourier transform; however, the exponent is positive (why?)
- We will not use F_x(ξ), to avoid confusion with the cdf (same as the text)
- The text claims the independent variable ξ should not be thought of as frequency (why?)
- This will be useful for handling sums of random variables, because the pdf of the sum of independent random variables is the convolution of their pdfs

Moment Generating Functions
Moment generating function of x(ζ):
  Φ_x(s) ≜ E[e^{sx(ζ)}] = \int f_x(x) e^{sx}\,dx
- Similar to the continuous-time Laplace transform; however, the exponent is positive (why?)
- Using a Taylor series expansion of e^{sx} about zero, we have
    Φ(s) = E[e^{sx(ζ)}] = E\Big[1 + sx(ζ) + \frac{[sx(ζ)]^2}{2!} + \cdots + \frac{[sx(ζ)]^m}{m!} + \cdots\Big]
         = 1 + sμ_x + \frac{s^2}{2!} r_x^{(2)} + \cdots + \frac{s^m}{m!} r_x^{(m)} + \cdots
- If all the moments of x(ζ) are known and exist, we can construct Φ(s) and solve for f_x(x) by the inverse Laplace transform. Thus the set of all moments (if they exist) completely defines the pdf!
- If Φ(s) is known and analytic (not in the book), we can use it to solve for the moments:
    r_x^{(m)} = \frac{d^m Φ(s)}{ds^m}\Big|_{s=0} = (-j)^m \frac{d^m Φ(ξ)}{dξ^m}\Big|_{ξ=0}
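The moment-from-MGF relation can be sketched numerically by differentiating Φ(s) at s = 0 with finite differences (an illustrative shortcut, not from the slides). For x ~ U[0, 1], Φ(s) = (e^s - 1)/s, and the first two moments are 1/2 and 1/3:

```python
import math

# MGF of x ~ U[0,1]: Phi(s) = (e^s - 1)/s, with Phi(0) = 1.
def mgf(s):
    return (math.exp(s) - 1.0) / s if s != 0.0 else 1.0

h = 1e-4
# r^(m) = d^m Phi / ds^m at s = 0, approximated by central differences:
r1 = (mgf(h) - mgf(-h)) / (2 * h)                  # ~ E[x]   = 1/2
r2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h ** 2    # ~ E[x^2] = 1/3
```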
Cumulants
- Cumulant generating function of x(ζ):
    Ψ_x(s) ≜ \ln Φ_x(s) = \ln E[e^{sx(ζ)}]
- Second characteristic function of x(ζ):
    Ψ_x(ξ) ≜ \ln Φ_x(ξ) = \ln E[e^{jξx(ζ)}]
- mth cumulant of x(ζ):
    κ_x^{(m)} ≜ \frac{d^m Ψ(s)}{ds^m}\Big|_{s=0} = (-j)^m \frac{d^m Ψ(ξ)}{dξ^m}\Big|_{ξ=0}
- Useful for higher-order moment analysis (an advanced SSP topic) and for working with products of characteristic functions

Common Random Variables
- Uniform distribution
- Normal distribution
- Cauchy distribution

Uniform Distribution
  f_x(x) = \begin{cases} \frac{1}{b-a} & a ≤ x ≤ b \\ 0 & \text{otherwise} \end{cases}
- Useful for situations in which outcomes are equally likely
- Often denoted x(ζ) ~ U[a, b]
- Mean and variance:
    μ_x = \frac{a+b}{2}, \qquad σ_x^2 = \frac{(b-a)^2}{12}

Normal Distribution
  f_x(x) = \frac{1}{\sqrt{2πσ_x^2}} \exp\Big( -\frac{(x-μ_x)^2}{2σ_x^2} \Big)
- Also called the Gaussian distribution
- Often denoted x(ζ) ~ N(μ_x, σ_x²)
- Arises naturally in many applications: central limit theorem (more later)
[Figure: Gaussian distribution function F(x) and Gaussian density function f(x)]
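The cumulant definition can be sketched the same way: differentiate Ψ(s) = ln Φ(s) at s = 0 (again an illustrative numerical shortcut). For x ~ U[0, 1], κ^(1) is the mean 1/2 and κ^(2) is the variance 1/12:

```python
import math

# Cumulant generating function Psi(s) = ln Phi(s) for x ~ U[0,1],
# where Phi(s) = (e^s - 1)/s and Psi(0) = ln 1 = 0.
def psi(s):
    return math.log((math.exp(s) - 1.0) / s) if s != 0.0 else 0.0

h = 1e-3
# kappa^(m) = d^m Psi / ds^m at s = 0, by central differences:
kappa1 = (psi(h) - psi(-h)) / (2 * h)                  # ~ mean     = 1/2
kappa2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h ** 2    # ~ variance = 1/12
```

Unlike raw moments, cumulants of independent sums simply add, which is why they appear again in the sums-of-random-variables slides.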
Normal Distribution Properties
  f_x(x) = \frac{1}{\sqrt{2πσ_x^2}} e^{-(x-μ_x)^2 / (2σ_x^2)}
  Φ_x(ξ) = \exp\big(jμ_x ξ - \tfrac{1}{2} σ_x^2 ξ^2\big)
- Defined completely by μ_x and σ_x²
- All higher-order moments can be determined in terms of the first two moments; higher-order moments provide no additional information
- Due to the central limit theorem, we would like to know how the distribution of a random variable differs from a normal distribution; cumulants generally provide this information
- For normally distributed random variables, κ_x^{(m)} = 0 for m > 2

Cauchy Random Variable
  f_x(x) = \frac{β}{π} \cdot \frac{1}{(x-μ)^2 + β^2}
  F_x(x) = \frac{1}{2} + \frac{1}{π} \arctan\Big(\frac{x-μ}{β}\Big)
  Φ_x(ξ) = \exp(jμξ - β|ξ|)
- A heavy-tailed distribution (relative to the Gaussian)
- Two parameters, μ and β
- Mean: μ_x = μ
- The variance does not exist because E[x²] does not exist
- The moment generating function does not exist for some values of s
- The sum of M independent Cauchy random variables is also Cauchy!
- An example of an infinite-variance random variable

Random Vectors
[Figure: mapping x(ζ) from outcomes ζ ∈ Ω to R^M]
- Much of what we have discussed generalizes to vector random variables in an obvious manner
- However, the lower-order moments have special properties and are important to signal processing
- Each outcome of the assumed underlying random experiment produces an entire random vector; each element of the vector is not generated independently from a separate experiment. This is an important concept
- Random M-vector: a real-valued vector containing M random variables,
    x = [x_1(ζ), x_2(ζ), ..., x_M(ζ)]^T
  where transpose is denoted by T
- Maps an abstract probability space to a vector-valued real space R^M; thus the range is an M-dimensional space
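The Cauchy distribution's heavy tails are easy to see by simulation (a sketch, not from the slides; samples are generated by inverting the cdf, x = μ + β tan(π(u - 1/2)) for u ~ U(0, 1)). The sample median converges to μ, while the tails routinely produce enormous samples, which is why the variance fails to exist:

```python
import math
import random

random.seed(2)
N = 100_000
mu, beta = 0.0, 1.0

# Cauchy samples by inverse-cdf sampling:
# F(x) = 1/2 + arctan((x - mu)/beta)/pi  =>  x = mu + beta*tan(pi*(u - 1/2)).
x = [mu + beta * math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]

sample_median = sorted(x)[N // 2]    # stable: converges to mu
largest = max(abs(v) for v in x)     # heavy tails: occasional huge samples
```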
Distribution and Density Functions
  F_x(x_1, ..., x_M) ≜ Pr{x_1(ζ) ≤ x_1, ..., x_M(ζ) ≤ x_M}
  f_x(x) = \lim_{Δx_1 → 0, ..., Δx_M → 0} \frac{Pr\{x_1 < x_1(ζ) ≤ x_1 + Δx_1, ..., x_M < x_M(ζ) ≤ x_M + Δx_M\}}{Δx_1 \cdots Δx_M}
- The commas denote an "and" condition: it is the probability that all M random variables x_i(ζ) are less than or equal to the stated values
- A random vector is completely characterized by its joint cdf or pdf
- Often written as F_x(x) = Pr{x(ζ) ≤ x}
- The pdf and cdf are related by
    f_x(x) = \frac{∂^M F_x(x)}{∂x_1 \cdots ∂x_M}, \qquad F_x(x) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_M} f_x(u)\,du_1 \cdots du_M

Independence
Two random variables x_1(ζ) and x_2(ζ) are independent if the events {x_1(ζ) ≤ x_1} and {x_2(ζ) ≤ x_2} are jointly independent:
  Pr{x_1(ζ) ≤ x_1, x_2(ζ) ≤ x_2} = Pr{x_1(ζ) ≤ x_1} Pr{x_2(ζ) ≤ x_2}
Equivalent conditions:
  F_{x_1,x_2}(x_1, x_2) = F_{x_1}(x_1) F_{x_2}(x_2)
  f_{x_1,x_2}(x_1, x_2) = f_{x_1}(x_1) f_{x_2}(x_2)

Vector Statistics
- In general it is very difficult to estimate vector pdfs and/or cdfs; this is called the curse of dimensionality
- Even if they are known, they are difficult to work with in general
- However, there is a rich statistical theory developed for second-order moments (mean and variance)
- We will focus on this aspect of SSP for this course
- Higher-order moments are an advanced topic covered later in the text; worth self-study
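The factorization of the joint cdf under independence can be checked empirically (a sketch with arbitrary evaluation points a, b; each experiment outcome produces two independent U(0, 1) draws, so F(a, b) should match F₁(a)·F₂(b)):

```python
import random

random.seed(3)
N = 100_000

# Each outcome zeta produces an entire 2-vector; here the two components
# are drawn independently, so the joint cdf should factor.
pairs = [(random.random(), random.random()) for _ in range(N)]

a, b = 0.3, 0.7
joint  = sum(1 for (u, v) in pairs if u <= a and v <= b) / N   # F(a, b)
marg_u = sum(1 for (u, _) in pairs if u <= a) / N              # F_1(a)
marg_v = sum(1 for (_, v) in pairs if v <= b) / N              # F_2(b)

# Independence: joint ~ marg_u * marg_v (true value 0.3 * 0.7 = 0.21).
```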
Vector Mean
Mean vector: defined for a random vector x(ζ) as
  μ_x = E[x(ζ)] = \begin{bmatrix} E[x_1(ζ)] \\ \vdots \\ E[x_M(ζ)] \end{bmatrix} = \begin{bmatrix} μ_1 \\ \vdots \\ μ_M \end{bmatrix}
- If the random vector is complex-valued, the integral of the vector expectation is taken over the entire C^M space
- The mean vector is simply the vector of means

Autocorrelation Matrix
Autocorrelation matrix: defined for a random vector x(ζ) as
  R_x ≜ E[x(ζ) x^H(ζ)] = \begin{bmatrix} r_{11} & \cdots & r_{1M} \\ \vdots & \ddots & \vdots \\ r_{M1} & \cdots & r_{MM} \end{bmatrix}
where
  r_{ii} ≜ E[|x_i(ζ)|^2], \qquad r_{ij} ≜ E[x_i(ζ) x_j^*(ζ)] = r_{ji}^*
- H denotes the conjugate (Hermitian) transpose operation
- This is a completely different definition from the autocorrelation defined for deterministic signals
- The r_{ii} are second-order moments of x_i(ζ), denoted earlier as r_{x_i}^{(2)}
- The r_{ij} measure the correlation between two random variables (this will be precisely defined later)
- The autocorrelation matrix R_x is Hermitian: R_x = R_x^H

Autocovariance Matrix
Autocovariance matrix: defined for a random vector x(ζ) as
  Γ_x ≜ E[(x(ζ) - μ_x)(x(ζ) - μ_x)^H] = \begin{bmatrix} γ_{11} & \cdots & γ_{1M} \\ \vdots & \ddots & \vdots \\ γ_{M1} & \cdots & γ_{MM} \end{bmatrix}
where
  γ_{ii} ≜ E[|x_i(ζ) - μ_i|^2] = σ_{x_i}^2
  γ_{ij} ≜ E[(x_i(ζ) - μ_i)(x_j(ζ) - μ_j)^*] = E[x_i(ζ) x_j^*(ζ)] - μ_i μ_j^* = γ_{ji}^*
- Sometimes γ_{ij} is denoted σ_{ij}
- γ_{ii} is sometimes called the self-variance of x_i(ζ); γ_{ij} is called the covariance of x_i(ζ) and x_j(ζ)

Correlation Coefficients
Correlation coefficient: denoted ρ_{ij} and defined for random variables x_i(ζ) and x_j(ζ) by γ_{ij} = ρ_{ij} σ_i σ_j, i.e.
  ρ_{ij} ≜ \frac{γ_{ij}}{σ_i σ_j}
- ρ_{ii} = 1 and -1 ≤ ρ_{ij} ≤ 1
- Extreme values of ρ_{ij} indicate a linear relationship between x_j(ζ) and x_i(ζ): x_j(ζ) = αx_i(ζ) + β. This does not tell you what α or β are
- ρ_{ij} = 1 implies α > 0; ρ_{ij} = -1 implies α < 0
- If ρ_{ij} = 0, x_i(ζ) and x_j(ζ) are said to be uncorrelated
- The covariance matrix Γ_x is Hermitian: Γ_x = Γ_x^H
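These definitions translate directly into sample estimates. A minimal pure-Python sketch (the construction x₂ = x₁ + 0.5u is an arbitrary way to get correlated components; for it, μ = [1/2, 3/4], γ₁₁ = γ₁₂ = 1/12):

```python
import random

random.seed(4)
N, M = 50_000, 2

# Correlated 2-vector: x1 ~ U[0,1] and x2 = x1 + 0.5*u with u ~ U[0,1],
# so the two components share randomness and gamma_12 = var(x1) = 1/12.
X = []
for _ in range(N):
    x1 = random.random()
    x2 = x1 + 0.5 * random.random()
    X.append((x1, x2))

# Sample mean vector mu[i] = average of component i.
mu = [sum(row[i] for row in X) / N for i in range(M)]

# Sample autocovariance matrix Gamma[i][j] ~ E[(x_i - mu_i)(x_j - mu_j)].
Gamma = [[sum((row[i] - mu[i]) * (row[j] - mu[j]) for row in X) / N
          for j in range(M)] for i in range(M)]
```

For real-valued data the Hermitian property reduces to symmetry, Γ[i][j] = Γ[j][i], which the estimate satisfies by construction.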
Autocorrelation and Autocovariance
  Γ_x = R_x - μ_x μ_x^H
- If μ_x = 0, then Γ_x = R_x
- If ρ_{ij} = 0 or γ_{ij} = 0, x_i(ζ) and x_j(ζ) are uncorrelated
- If r_{ij} = 0, x_i(ζ) and x_j(ζ) are orthogonal
- The degree of correlation is a weaker measure of interaction than independence
- If x_i(ζ) and x_j(ζ) are independent, then γ_{ij} = ρ_{ij} = 0. The converse is not true (independence is sufficient, but not necessary, for zero correlation)
- If x_i(ζ) and x_j(ζ) have a normal distribution, then γ_{ij} = ρ_{ij} = 0 implies x_i(ζ) and x_j(ζ) are independent
- If one or both random variables have zero mean and are uncorrelated, they are orthogonal

Cross-Correlation and Cross-Covariance
Cross-correlation matrix: defined for random vectors x(ζ) ∈ C^M and y(ζ) ∈ C^L as
  R_{xy} ≜ E[x(ζ) y^H(ζ)] = \begin{bmatrix} E[x_1(ζ) y_1^*(ζ)] & \cdots & E[x_1(ζ) y_L^*(ζ)] \\ \vdots & \ddots & \vdots \\ E[x_M(ζ) y_1^*(ζ)] & \cdots & E[x_M(ζ) y_L^*(ζ)] \end{bmatrix}
Cross-covariance matrix: defined for random vectors x(ζ) ∈ C^M and y(ζ) ∈ C^L as
  Γ_{xy} ≜ E[(x(ζ) - μ_x)(y(ζ) - μ_y)^H] = R_{xy} - μ_x μ_y^H
- Uncorrelated if Γ_{xy} = 0 or, equivalently, R_{xy} = μ_x μ_y^H
- Orthogonal if R_{xy} = 0

Linear Transformations
If y(ζ) = A x(ζ), then
  μ_y = E[y(ζ)] = E[A x(ζ)] = A E[x(ζ)] = A μ_x
  R_y = E[y(ζ) y^H(ζ)] = E[A x(ζ) x^H(ζ) A^H] = A R_x A^H
  Γ_y = A Γ_x A^H
  R_{xy} = R_x A^H, \qquad R_{yx} = A R_x
  Γ_{xy} = Γ_x A^H, \qquad Γ_{yx} = A Γ_x

Normal Random Vectors
  f_x(x) = \frac{1}{(2π)^{M/2} |Γ_x|^{1/2}} \exp\Big( -\frac{1}{2}(x - μ_x)^T Γ_x^{-1} (x - μ_x) \Big)
  Φ_x(ξ) = \exp\big(jξ^T μ_x - \tfrac{1}{2} ξ^T Γ_x ξ\big)
- Often denoted x(ζ) ~ N(μ_x, Γ_x)
- The exponent contains a positive definite quadratic function of x, (x - μ_x)^T Γ_x^{-1} (x - μ_x)
- The pdf is completely specified by μ_x and Γ_x; both are relatively easy to estimate in practice
- All higher-order moments can be calculated from these
- If all pairs of components are uncorrelated, they are also independent
- A linear transformation y = Ax + b is also normally distributed: y(ζ) ~ N(Aμ_x + b, AΓ_x A^H)
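The identity Γ_y = A Γ_x A^H can be verified by simulation (a sketch with an arbitrary real 2×2 A and independent U[0, 1] components, so Γ_x = (1/12)I and A^H = A^T):

```python
import random

random.seed(5)
N = 50_000
A = [[1.0, 2.0],
     [0.0, 1.0]]

# Independent x1, x2 ~ U[0,1]: Gamma_x = (1/12) * I.
X = [(random.random(), random.random()) for _ in range(N)]
# y = A x, applied sample by sample.
Y = [(A[0][0]*x1 + A[0][1]*x2, A[1][0]*x1 + A[1][1]*x2) for (x1, x2) in X]

def cov2(data):
    """Sample 2x2 covariance matrix of a list of 2-tuples."""
    n = len(data)
    m0 = sum(d[0] for d in data) / n
    m1 = sum(d[1] for d in data) / n
    c00 = sum((d[0] - m0) ** 2 for d in data) / n
    c11 = sum((d[1] - m1) ** 2 for d in data) / n
    c01 = sum((d[0] - m0) * (d[1] - m1) for d in data) / n
    return [[c00, c01], [c01, c11]]

Gy = cov2(Y)
# Theory: Gamma_y = A Gamma_x A^T = (1/12) * A A^T = [[5, 2], [2, 1]] / 12.
theory = [[sum(A[i][k] * A[j][k] for k in range(2)) / 12.0 for j in range(2)]
          for i in range(2)]
```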
Sums of Independent Random Variables
Let
  y(ζ) = \sum_k c_k x_k(ζ)
Then, for independent x_k(ζ),
  μ_y = \sum_k c_k μ_{x_k}, \qquad σ_y^2 = \sum_k |c_k|^2 σ_{x_k}^2
For y(ζ) = x_1(ζ) + x_2(ζ),
  Φ_y(ξ) = E[e^{jξy(ζ)}] = E[e^{jξ[x_1(ζ) + x_2(ζ)]}] = E[e^{jξx_1(ζ)}] E[e^{jξx_2(ζ)}] = Φ_{x_1}(ξ) Φ_{x_2}(ξ)
Therefore
  f_y(y) = f_{x_1}(y) * f_{x_2}(y)

Sums of Independent Random Variables Continued
This generalizes in the obvious manner. For y(ζ) = \sum_{k=1}^{M} x_k(ζ),
  f_y(y) = f_{x_1}(y) * f_{x_2}(y) * \cdots * f_{x_M}(y)
  Φ_y(ξ) = \prod_{k=1}^{M} Φ_{x_k}(ξ)
  Ψ_y(ξ) = \sum_{k=1}^{M} Ψ_{x_k}(ξ)
  κ_y^{(m)} = \sum_{k=1}^{M} κ_{x_k}^{(m)}

Stable Distributions
- Stable distributions: distributions that are preserved (self-reproduce) under convolution
- Examples: Gaussian distribution, Cauchy distribution
- The only stable distribution with finite variance is the Gaussian distribution
- The other stable distributions have infinite variance and possibly infinite mean
- Examples of applications: some diffusion processes, electrical noise, some random walks, coupled oscillators with friction
- Useful to model signals with large variability
- Very good discussion in the text (not critical to this class)

Central Limit Theorem
For y(ζ) = \sum_k x_k(ζ):
- If the x_i(ζ) are independent, identically distributed (IID) random variables and the distribution f_x(x) is stable, clearly y(ζ) converges to the same type of distribution as M → ∞
- Central limit theorem: if the mean and variance of each random variable exist (are finite), the random variables are mutually independent, and the random variables are identically distributed, then the distribution of the (suitably normalized) sum y(ζ) approaches a normal distribution as M → ∞
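The convolution result is easy to see for two independent U[0, 1] variables: the sum has the triangular pdf f_y(y) = y on [0, 1] and 2 - y on [1, 2], so samples pile up near y = 1 and the variance is 1/12 + 1/12 = 1/6. A quick Monte Carlo sketch:

```python
import random

random.seed(6)
N = 200_000

# y = x1 + x2 with x1, x2 independent U[0,1]; the pdf of y is the triangle
# obtained by convolving two unit rectangles.
y = [random.random() + random.random() for _ in range(N)]

mean_y = sum(y) / N                                  # true value 1
var_y  = sum((v - mean_y) ** 2 for v in y) / N       # true value 1/6

# Samples concentrate near the peak of the triangle at y = 1:
p_center = sum(1 for v in y if 0.9 < v <= 1.1) / N   # true value 0.19
p_edge   = sum(1 for v in y if v <= 0.2) / N         # true value 0.02
```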
Central Limit Theorem Comments
For y(ζ) = \sum_k x_k(ζ):
- The convergence often occurs even if the distributions are not identical
- The convergence is more accurate (rapid) near the mean (center) of the distribution; the approximation may be poor in the tails
- The convergence is in the cdf, not the pdf (consider discrete RVs)
- Does not apply when the variance of the RVs is infinite
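A small CLT sketch (M = 12 and the evaluation points are arbitrary choices): standardize the sum of M IID uniforms and compare its empirical cdf against the Gaussian cdf, since the convergence statement is about cdfs:

```python
import math
import random

random.seed(7)
M, N = 12, 20_000

# Standardized sum of M IID U[0,1] variables: mean M/2, variance M/12.
def z_sample():
    s = sum(random.random() for _ in range(M))
    return (s - M / 2.0) / math.sqrt(M / 12.0)

z = [z_sample() for _ in range(N)]

def normal_cdf(x):
    # Standard normal cdf via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def empirical_cdf(samples, x):
    return sum(1 for v in samples if v <= x) / len(samples)

# Agreement is best near the center, consistent with the comments above.
err0 = abs(empirical_cdf(z, 0.0) - normal_cdf(0.0))
err1 = abs(empirical_cdf(z, 1.0) - normal_cdf(1.0))
```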
More information3.0 PROBABILITY, RANDOM VARIABLES AND RANDOM PROCESSES
3.0 PROBABILITY, RANDOM VARIABLES AND RANDOM PROCESSES 3.1 Introduction In this chapter we will review the concepts of probabilit, rom variables rom processes. We begin b reviewing some of the definitions
More informationProbability Theory and Statistics. Peter Jochumzen
Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................
More informationReview: mostly probability and some statistics
Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random
More information5. Random Vectors. probabilities. characteristic function. cross correlation, cross covariance. Gaussian random vectors. functions of random vectors
EE401 (Semester 1) 5. Random Vectors Jitkomut Songsiri probabilities characteristic function cross correlation, cross covariance Gaussian random vectors functions of random vectors 5-1 Random vectors we
More informationIntroduction to Information Entropy Adapted from Papoulis (1991)
Introduction to Information Entropy Adapted from Papoulis (1991) Federico Lombardo Papoulis, A., Probability, Random Variables and Stochastic Processes, 3rd edition, McGraw ill, 1991. 1 1. INTRODUCTION
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions
More information3F1 Random Processes Examples Paper (for all 6 lectures)
3F Random Processes Examples Paper (for all 6 lectures). Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories
More informationEAS 305 Random Processes Viewgraph 1 of 10. Random Processes
EAS 305 Random Processes Viewgraph 1 of 10 Definitions: Random Processes A random process is a family of random variables indexed by a parameter t T, where T is called the index set λ i Experiment outcome
More informationMultivariate Distributions
IEOR E4602: Quantitative Risk Management Spring 2016 c 2016 by Martin Haugh Multivariate Distributions We will study multivariate distributions in these notes, focusing 1 in particular on multivariate
More informationReview (Probability & Linear Algebra)
Review (Probability & Linear Algebra) CE-725 : Statistical Pattern Recognition Sharif University of Technology Spring 2013 M. Soleymani Outline Axioms of probability theory Conditional probability, Joint
More informationUCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, Practice Final Examination (Winter 2017)
UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, 208 Practice Final Examination (Winter 207) There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable
More informationSection 8.1. Vector Notation
Section 8.1 Vector Notation Definition 8.1 Random Vector A random vector is a column vector X = [ X 1 ]. X n Each Xi is a random variable. Definition 8.2 Vector Sample Value A sample value of a random
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More information1 Joint and marginal distributions
DECEMBER 7, 204 LECTURE 2 JOINT (BIVARIATE) DISTRIBUTIONS, MARGINAL DISTRIBUTIONS, INDEPENDENCE So far we have considered one random variable at a time. However, in economics we are typically interested
More informationECE Lecture #10 Overview
ECE 450 - Lecture #0 Overview Introduction to Random Vectors CDF, PDF Mean Vector, Covariance Matrix Jointly Gaussian RV s: vector form of pdf Introduction to Random (or Stochastic) Processes Definitions
More informationChap 2.1 : Random Variables
Chap 2.1 : Random Variables Let Ω be sample space of a probability model, and X a function that maps every ξ Ω, toa unique point x R, the set of real numbers. Since the outcome ξ is not certain, so is
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationLecture 2: Review of Basic Probability Theory
ECE 830 Fall 2010 Statistical Signal Processing instructor: R. Nowak, scribe: R. Nowak Lecture 2: Review of Basic Probability Theory Probabilistic models will be used throughout the course to represent
More informationReview (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology
Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology M. Soleymani Fall 2012 Some slides have been adopted from Prof. H.R. Rabiee s and also Prof. R. Gutierrez-Osuna
More informationStatistics for scientists and engineers
Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3
More informationECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1
EE 650 Lecture 4 Intro to Estimation Theory Random Vectors EE 650 D. Van Alphen 1 Lecture Overview: Random Variables & Estimation Theory Functions of RV s (5.9) Introduction to Estimation Theory MMSE Estimation
More informationMath 3338, Homework 4. f(u)du. (Note that f is discontinuous, whereas F is continuous everywhere.)
Math 3338, Homework 4. Define a function f by f(x) = { if x < if x < if x. Calculate an explicit formula for F(x) x f(u)du. (Note that f is discontinuous, whereas F is continuous everywhere.) 2. Define
More informationChapter 4 : Expectation and Moments
ECE5: Analysis of Random Signals Fall 06 Chapter 4 : Expectation and Moments Dr. Salim El Rouayheb Scribe: Serge Kas Hanna, Lu Liu Expected Value of a Random Variable Definition. The expected or average
More informationChapter 4. Continuous Random Variables 4.1 PDF
Chapter 4 Continuous Random Variables In this chapter we study continuous random variables. The linkage between continuous and discrete random variables is the cumulative distribution (CDF) which we will
More informationWe introduce methods that are useful in:
Instructor: Shengyu Zhang Content Derived Distributions Covariance and Correlation Conditional Expectation and Variance Revisited Transforms Sum of a Random Number of Independent Random Variables more
More informationEcon 508B: Lecture 5
Econ 508B: Lecture 5 Expectation, MGF and CGF Hongyi Liu Washington University in St. Louis July 31, 2017 Hongyi Liu (Washington University in St. Louis) Math Camp 2017 Stats July 31, 2017 1 / 23 Outline
More informationIEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion
IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationEstimators as Random Variables
Estimation Theory Overview Properties Bias, Variance, and Mean Square Error Cramér-Rao lower bound Maimum likelihood Consistency Confidence intervals Properties of the mean estimator Introduction Up until
More information2 Continuous Random Variables and their Distributions
Name: Discussion-5 1 Introduction - Continuous random variables have a range in the form of Interval on the real number line. Union of non-overlapping intervals on real line. - We also know that for any
More informationMA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems
MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions
More informationProbability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.
Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of
More informationLecture 11. Probability Theory: an Overveiw
Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationRandom variables. DS GA 1002 Probability and Statistics for Data Science.
Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities
More informationIntroduction to Normal Distribution
Introduction to Normal Distribution Nathaniel E. Helwig Assistant Professor of Psychology and Statistics University of Minnesota (Twin Cities) Updated 17-Jan-2017 Nathaniel E. Helwig (U of Minnesota) Introduction
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More informationBasics on Probability. Jingrui He 09/11/2007
Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability
More informationLecture 7 Random Signal Analysis
Lecture 7 Random Signal Analysis 7. Introduction to Probability 7. Amplitude Distributions 7.3 Uniform, Gaussian, and Other Distributions 7.4 Power and Power Density Spectra 7.5 Properties of the Power
More informationECE 636: Systems identification
ECE 636: Systems identification Lectures 3 4 Random variables/signals (continued) Random/stochastic vectors Random signals and linear systems Random signals in the frequency domain υ ε x S z + y Experimental
More information2 (Statistics) Random variables
2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationGaussian, Markov and stationary processes
Gaussian, Markov and stationary processes Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More information