Introduction to Probability and Stochastic Processes - Part I


Introduction to Probability and Stochastic Processes - Part I
Lecture 2

Henrik Vie Christensen
vie@control.auc.dk
Department of Control Engineering
Institute of Electronic Systems
Aalborg University, Denmark

Slides originally by: Line Ørtoft Endelt

From Experiment to Probability

- Experiment, E
- Sample space, S (containing the outcomes of the experiment)
- Events and/or a random variable are defined on the sample space
- A probability measure is found/chosen

A probabilistic model contains:
- Sample space
- Probability measure
- Class of sets forming the domain of the probability measure

Example

The joint density function of X and Y is

    f_{X,Y}(x,y) = axy,  1 \le x \le 3, \ 2 \le y \le 4
    f_{X,Y}(x,y) = 0,    elsewhere

Find a:

    1 = \int_2^4 \int_1^3 axy \, dx \, dy = a \int_2^4 y \left[ \frac{x^2}{2} \right]_1^3 dy = a \int_2^4 4y \, dy = 24a

so a = 1/24.

Example (continued)

The marginal pdf of X:

    f_X(x) = \frac{1}{24} \int_2^4 xy \, dy = \frac{x}{4},  1 \le x \le 3
    f_X(x) = 0,  elsewhere

The distribution function of Y is

    F_Y(y) = 0,  y \le 2
    F_Y(y) = 1,  y > 4
    F_Y(y) = \frac{1}{24} \int_2^y \int_1^3 xv \, dx \, dv = \frac{1}{6} \int_2^y v \, dv = \frac{1}{12}(y^2 - 4),  2 \le y \le 4
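As a quick sanity check, the constant a = 1/24, the marginal, and the distribution function can all be verified numerically. A minimal sketch using scipy (the pdf and integration limits are taken from the example above):

```python
# Numerical check of the worked example above.
from scipy import integrate

a = 1.0 / 24.0

def f_xy(x, y):
    """Joint pdf f_{X,Y}(x,y) = a*x*y on 1 <= x <= 3, 2 <= y <= 4."""
    return a * x * y

# Total probability mass: dblquad integrates func(inner, outer), so the
# lambda takes (y, x) with x the outer variable running over [1, 3].
total, _ = integrate.dblquad(lambda y, x: f_xy(x, y), 1, 3, 2, 4)
print(total)   # ~1.0, confirming a = 1/24

# Marginal f_X(2): integrate the joint pdf over y; expect 2/4 = 0.5.
fx2, _ = integrate.quad(lambda y: f_xy(2, y), 2, 4)
print(fx2)     # ~0.5

# F_Y(3) = (3**2 - 4)/12 = 5/12: x inner over [1, 3], y outer over [2, 3].
Fy3, _ = integrate.dblquad(lambda x, y: f_xy(x, y), 2, 3, 1, 3)
print(Fy3)     # ~0.41667
```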

Uniform Probability Density Function

X has a uniform pdf if

    f_X(x) = \frac{1}{b-a},  a \le x \le b
    f_X(x) = 0,  elsewhere

The mean and variance are

    \mu_X = \frac{a+b}{2},  \sigma_X^2 = \frac{(b-a)^2}{12}

[Figure: The uniform pdf f_X(x) = 1/(b-a) on [a, b], and the corresponding distribution function F_X(x) rising linearly from 0 at a to 1 at b.]
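The moment formulas are easy to confirm by simulation. A minimal numpy sketch; the endpoints a = 1 and b = 5 are arbitrary illustration values, not from the slides:

```python
# Empirical check of the uniform mean and variance.
import numpy as np

rng = np.random.default_rng(0)
a, b = 1.0, 5.0
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (a + b) / 2)       # sample mean vs (a+b)/2 = 3.0
print(x.var(), (b - a)**2 / 12)    # sample variance vs (b-a)^2/12 = 4/3
```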

Gaussian Probability Density Function

Electrical noise in communication systems is often due to the cumulative effect of a large number of randomly moving charged particles, and hence the instantaneous value of the noise will tend to have a Gaussian distribution. The Gaussian pdf is given by

    f_X(x) = \frac{1}{\sqrt{2\pi\sigma_X^2}} \exp\left\{ -\frac{(x-\mu_X)^2}{2\sigma_X^2} \right\}

    P(X > a) = \int_a^\infty \frac{1}{\sqrt{2\pi\sigma_X^2}} \exp\left\{ -\frac{(x-\mu_X)^2}{2\sigma_X^2} \right\} dx = \int_{(a-\mu_X)/\sigma_X}^\infty \frac{1}{\sqrt{2\pi}} \exp\left\{ -\frac{z^2}{2} \right\} dz

Gaussian Probability Density Function

The Q function is defined by

    Q(y) = \frac{1}{\sqrt{2\pi}} \int_y^\infty \exp\left\{ -\frac{z^2}{2} \right\} dz,  y > 0

so that

    P(X > a) = Q[(a - \mu_X)/\sigma_X]

For the standard normal distribution (\mu = 0, \sigma = 1):

    P(X \le x) = 1 - Q(x)
    P(-a \le X \le a) = 2 P(-a \le X \le 0) = 2 P(0 \le X \le a)
    P(X \le 0) = 1/2 = Q(0)
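In code, Q is simply the survival function of the standard normal. A sketch using scipy.stats, checking the identities listed above:

```python
# The Q function as the standard normal tail probability.
from scipy.stats import norm

def Q(y):
    """Q(y) = 1/sqrt(2*pi) * integral from y to inf of exp(-z^2/2) dz."""
    return norm.sf(y)  # survival function, 1 - Phi(y)

print(Q(0.0))                     # 0.5, matching Q(0) = 1/2
print(1 - Q(1.0), norm.cdf(1.0))  # P(X <= x) = 1 - Q(x)

# P(X > a) for X ~ N(mu, sigma^2) via standardization:
mu, sigma, a = 2.0, 5.0, 8.0      # arbitrary illustration values
print(Q((a - mu) / sigma), norm.sf(a, loc=mu, scale=sigma))  # equal
```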

The Standard Normal Distribution

The normal distribution with \mu = 0 and \sigma_X^2 = 1 is called the standard normal distribution.

[Figure: the standard normal pdf, peaking at 1/\sqrt{2\pi} \approx 0.4.]

The Normal Distribution

[Figure: normal pdfs for (\mu = 0, \sigma = 0.5), (\mu = 0, \sigma = 1), (\mu = 0, \sigma = 2), and (\mu = 2, \sigma = 1).]

Example I

X: the voltage output of a noise generator, with a standard normal distribution. Find P(X > 2.3) and P(1 \le X \le 2.3).

    y     Q(y)       y      Q(y)
    0.90  0.1841     2.20   0.0139
    0.95  0.1711     2.30   0.0107
    1.00  0.1587     2.40   0.0082

    P(X > 2.3) = Q(2.3) \approx 0.0107
    P(1 \le X \le 2.3) = (1 - Q(2.3)) - (1 - Q(1)) = Q(1) - Q(2.3) \approx 0.148

Example II

V: the velocity of the wind at a certain location, normally distributed with \mu = 2 and \sigma = 5. Find P(-3 \le V \le 8).

    y     Q(y)
    1.00  0.1587
    1.20  0.1151

    P(-3 \le V \le 8) = \int_{-3}^{8} \frac{1}{\sqrt{2\pi \cdot 25}} \exp\left[ -\frac{(v-2)^2}{2 \cdot 25} \right] dv = \int_{(-3-2)/5}^{(8-2)/5} \frac{1}{\sqrt{2\pi}} \exp\left[ -\frac{x^2}{2} \right] dx
    = P(X \le 1.2) - P(X \le -1) = (1 - Q(1.2)) - (1 - Q(-1)) \approx 0.726

since Q(-1) = 1 - Q(1) due to symmetry.
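Both examples can be reproduced without the Q table. A sketch using scipy.stats; the results should agree with the tabulated values above to rounding:

```python
# Recomputing Examples I and II with scipy instead of the Q table.
from scipy.stats import norm

Q = norm.sf  # standard normal tail probability

# Example I: X standard normal.
print(Q(2.3))            # ~0.0107 = P(X > 2.3)
print(Q(1.0) - Q(2.3))   # ~0.148  = P(1 <= X <= 2.3)

# Example II: V ~ N(2, 25), P(-3 <= V <= 8).
lo, hi = (-3 - 2) / 5, (8 - 2) / 5             # standardized limits -1, 1.2
print(norm.cdf(hi) - norm.cdf(lo))             # ~0.726
print(norm.cdf(8, 2, 5) - norm.cdf(-3, 2, 5))  # same, without standardizing
```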

Bivariate Gaussian pdf

The bivariate Gaussian pdf is given by

    f_{X,Y}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left(\frac{x-\mu_X}{\sigma_X}\right)^2 - 2\rho \frac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^2 \right] \right\}

where

    \rho = \rho_{XY} = \frac{E\{(X-\mu_X)(Y-\mu_Y)\}}{\sigma_X\sigma_Y} = \frac{\sigma_{XY}}{\sigma_X\sigma_Y}
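To see that this is the two-dimensional case of the general multivariate Gaussian (slide further below), the formula can be checked against scipy's multivariate normal density. A sketch; the parameter values are arbitrary illustration choices:

```python
# The bivariate formula vs. the general multivariate Gaussian density.
import numpy as np
from scipy.stats import multivariate_normal

mx, my, sx, sy, rho = 1.0, -2.0, 2.0, 3.0, 0.5   # arbitrary parameters

def f_biv(x, y):
    """Bivariate Gaussian pdf written out as on the slide."""
    zx, zy = (x - mx) / sx, (y - my) / sy
    q = (zx**2 - 2 * rho * zx * zy + zy**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * sx * sy * np.sqrt(1 - rho**2))

cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
mvn = multivariate_normal(mean=[mx, my], cov=cov)
print(f_biv(0.5, -1.0), mvn.pdf([0.5, -1.0]))  # agree to rounding
```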

Complex Random Variables

Given two random variables X and Y, a complex random variable can be defined as

    Z = X + jY

and the expected value of g(Z) is defined as

    E\{g(Z)\} = \int\int g(z) f_{X,Y}(x,y) \, dx \, dy

    \mu_Z = E\{Z\} = E\{X\} + jE\{Y\} = \mu_X + j\mu_Y

    \sigma_Z^2 = E\{|Z - \mu_Z|^2\}

and the covariance of two complex RVs is

    C_{Z_m Z_n} = E\{(Z_m - \mu_{Z_m})^* (Z_n - \mu_{Z_n})\}
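A quick empirical check of these moment definitions; a sketch in which X and Y are arbitrary independent Gaussians chosen for illustration:

```python
# Mean and variance of a complex random variable Z = X + jY.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(1.0, 2.0, size=500_000)    # mu_X = 1, sigma_X = 2
Y = rng.normal(-1.0, 0.5, size=500_000)   # mu_Y = -1, sigma_Y = 0.5
Z = X + 1j * Y

mu_Z = Z.mean()                        # ~ mu_X + j*mu_Y = 1 - 1j
var_Z = np.mean(np.abs(Z - mu_Z)**2)   # ~ sigma_X^2 + sigma_Y^2 = 4.25
print(mu_Z, var_Z)
```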

Joint Distribution and Density

The joint probability distribution function for m random variables X_1, ..., X_m is given by

    F_{X_1,...,X_m}(x_1,...,x_m) = P[(X_1 \le x_1), ..., (X_m \le x_m)]

The density function for continuous random variables is the partial derivative of the distribution function,

    f_{X_1,...,X_m}(x_1,...,x_m) = \frac{\partial^m F_{X_1,...,X_m}(x_1,...,x_m)}{\partial x_1 \cdots \partial x_m}

Marginal PDF

The marginal density function for X_1 is given by

    f_{X_1}(x_1) = \int \cdots \int f_{X_1,...,X_m}(x_1,...,x_m) \, dx_2 \cdots dx_m

The joint marginal density function of two of the RVs is found as

    f_{X_1,X_2}(x_1,x_2) = \int \cdots \int f_{X_1,...,X_m}(x_1,...,x_m) \, dx_3 \cdots dx_m

Conditional PDF

The conditional density functions are given by

    f_{X_2,X_3,...,X_m | X_1}(x_2,x_3,...,x_m | x_1) = \frac{f_{X_1,X_2,...,X_m}(x_1,x_2,...,x_m)}{f_{X_1}(x_1)}

and

    f_{X_3,X_4,...,X_m | X_1,X_2}(x_3,x_4,...,x_m | x_1,x_2) = \frac{f_{X_1,X_2,...,X_m}(x_1,x_2,...,x_m)}{f_{X_1,X_2}(x_1,x_2)}

Expected values

The expected value of a function g(X_1,...,X_m) defined on m random variables is given by

    E\{g(X_1,...,X_m)\} = \int \cdots \int g(x_1,...,x_m) f_{X_1,...,X_m}(x_1,...,x_m) \, dx_1 \cdots dx_m

and the conditional expectation given X_1, X_2 is

    E\{g(X_1,...,X_m) | X_1, X_2\} = \int \cdots \int g(x_1,...,x_m) f_{X_3,...,X_m | X_1,X_2}(x_3,...,x_m | x_1,x_2) \, dx_3 \cdots dx_m

Mean value and covariance

The mean value and the covariance are

    \mu_{X_i} = E\{X_i\}

    \sigma_{X_i X_j} = E\{X_i X_j\} - \mu_{X_i} \mu_{X_j}

Note that when i = j, \sigma_{X_i X_i} is the variance of X_i.

Random vectors

The m random variables X_1,...,X_m can be represented using a vector

    X = (X_1, X_2, ..., X_m)^T  (a column vector)

A possible value of the random vector (relating to an outcome of the underlying experiment) is represented as x = (x_1, x_2, ..., x_m)^T. The joint PDF (the same as on the Joint Distribution and Density slide) is denoted by

    f_X(x) = f_{X_1,...,X_m}(x_1,...,x_m)

Mean Value and Covariance Matrix

The mean vector is defined as

    \mu_X = E(X) = (E(X_1), E(X_2), ..., E(X_m))^T

and the covariance matrix is defined as

    \Sigma_X = E\{XX^T\} - \mu_X \mu_X^T = \begin{bmatrix} \sigma_{X_1 X_1} & \sigma_{X_1 X_2} & \cdots & \sigma_{X_1 X_m} \\ \sigma_{X_2 X_1} & \sigma_{X_2 X_2} & \cdots & \sigma_{X_2 X_m} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{X_m X_1} & \sigma_{X_m X_2} & \cdots & \sigma_{X_m X_m} \end{bmatrix}
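These definitions translate directly into sample estimates. A minimal numpy sketch; the mean vector and covariance matrix used here are arbitrary illustration values:

```python
# Estimating the mean vector and covariance matrix from samples.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

X = rng.multivariate_normal(mu, Sigma, size=200_000)  # rows are samples

mu_hat = X.mean(axis=0)
# Sigma_X = E{X X^T} - mu mu^T, estimated from the samples:
Sigma_hat = X.T @ X / len(X) - np.outer(mu_hat, mu_hat)

print(mu_hat)     # ~mu
print(Sigma_hat)  # ~Sigma (np.cov(X.T) gives the same up to 1/(n-1))
```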

Independent RVs

The random variables are uncorrelated when their covariances are 0, i.e.

    \sigma_{X_i X_j} = \sigma_{ij} = 0,  for i \ne j

and independent when

    f_X(x) = f_{X_1,...,X_m}(x_1,...,x_m) = \prod_{i=1}^m f_{X_i}(x_i)

Multivariate Gaussian Distribution

A random vector X is multivariate Gaussian if its pdf is given by

    f_X(x) = \frac{1}{(2\pi)^{m/2} |\Sigma_X|^{1/2}} \exp\left[ -\frac{1}{2} (x - \mu_X)^T \Sigma_X^{-1} (x - \mu_X) \right]

where |\Sigma_X| is the determinant of \Sigma_X.
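A direct implementation of this formula, checked against scipy.stats.multivariate_normal; a sketch in which mu and Sigma are arbitrary illustration values:

```python
# The multivariate Gaussian density computed from the formula above.
import numpy as np
from scipy.stats import multivariate_normal

def mvn_pdf(x, mu, Sigma):
    """f_X(x) = (2*pi)^(-m/2) |Sigma|^(-1/2) exp(-(x-mu)^T Sigma^-1 (x-mu) / 2)."""
    m = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Sigma, d)  # (x-mu)^T Sigma^{-1} (x-mu)
    norm_const = (2 * np.pi) ** (m / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-quad / 2) / norm_const

mu = np.array([1.0, 0.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
x = np.array([0.5, -0.5])
print(mvn_pdf(x, mu, Sigma))
print(multivariate_normal(mu, Sigma).pdf(x))  # same value
```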

Multivariate Gaussian Distribution

If X has a multivariate Gaussian distribution, then:

1. If X is partitioned as

    X = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix},  X_1 = (X_1, ..., X_k)^T,  X_2 = (X_{k+1}, ..., X_m)^T

and

    \mu_X = \begin{bmatrix} \mu_{X_1} \\ \mu_{X_2} \end{bmatrix},  \Sigma_X = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{bmatrix}

then X_1 has a k-dimensional multivariate Gaussian distribution, with mean value \mu_{X_1} and covariance \Sigma_{11}.

Multivariate Gaussian Distribution

2. If \Sigma_X is diagonal, then the components of X are independent. Note that this ONLY holds for the Gaussian distribution.

3. If A is a k \times m matrix of rank k, then Y = AX has a k-variate Gaussian distribution with

    \mu_Y = A \mu_X,  \Sigma_Y = A \Sigma_X A^T
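Property 3 is easy to verify by Monte Carlo. A sketch with an arbitrary A, \mu_X and \Sigma_X chosen for illustration:

```python
# Monte Carlo check: Y = AX has mean A*mu_X and covariance A*Sigma_X*A^T.
import numpy as np

rng = np.random.default_rng(2)
mu_X = np.array([1.0, -1.0, 0.5])
Sigma_X = np.array([[1.0, 0.2, 0.0],
                    [0.2, 2.0, 0.5],
                    [0.0, 0.5, 1.0]])
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, -1.0]])   # k = 2, m = 3, rank 2

X = rng.multivariate_normal(mu_X, Sigma_X, size=200_000)
Y = X @ A.T                        # each row y = A x

print(Y.mean(axis=0), A @ mu_X)    # empirical mean vs A*mu_X
print(np.cov(Y.T))                 # empirical covariance
print(A @ Sigma_X @ A.T)           # theoretical A*Sigma_X*A^T
```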

Multivariate Gaussian Distribution

4. If X is partitioned as in 1, the conditional density of X_1 given X_2 = x_2 is a k-dimensional multivariate Gaussian with

    \mu_{X_1 | X_2} = E[X_1 | X_2 = x_2] = \mu_{X_1} + \Sigma_{12} \Sigma_{22}^{-1} (x_2 - \mu_{X_2})

    \Sigma_{X_1 | X_2} = \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21}
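Property 4 in code: a minimal sketch computing the conditional mean and covariance from a partitioned covariance matrix (the numbers are arbitrary illustration values):

```python
# Conditional Gaussian: mu and Sigma of X_1 given X_2 = x_2.
import numpy as np

mu1, mu2 = np.array([0.0]), np.array([1.0])
S11 = np.array([[2.0]])
S12 = np.array([[0.8]])
S21 = S12.T
S22 = np.array([[1.0]])

x2 = np.array([2.0])  # observed value of X_2

# mu_{X1|X2} = mu_1 + Sigma_12 Sigma_22^{-1} (x_2 - mu_2)
mu_cond = mu1 + S12 @ np.linalg.solve(S22, x2 - mu2)
# Sigma_{X1|X2} = Sigma_11 - Sigma_12 Sigma_22^{-1} Sigma_21
Sigma_cond = S11 - S12 @ np.linalg.solve(S22, S21)

print(mu_cond)     # [0.8]
print(Sigma_cond)  # [[1.36]]
```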