Stochastic processes Lecture 1: Multiple Random Variables Ch. 5

Stochastic processes Lecture 1: Multiple Random Variables, Ch. 5
Dr. Ir. Richard C. Hendriks, 26/04/18, Delft University of Technology

Organization
Plenary lectures. Book: R.D. Yates and D.J. Goodman, Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers (third edition). Slides, exams, answers to some selected exercises etc.: http://cas.tudelft.nl/education/courses/ee2s3/
(The least) preparation for the next lecture: make sure to have understood the material of the last lecture, read the corresponding chapters, and solved the selected exercises. Your responsibility: study and solve exercises before the next lecture.

Exam
The exam consists of two parts:
1. Part 1: mid-term exam on the 28th of May
2. Part 2: final exam on the 3rd of July
Both parts will consist of exercises from signal processing and stochastic processes (50/50). The end result is the average of the mid-term and final exam. The result of the mid-term exam is only valid if you take the final exam on July 3rd (hence, it is no longer valid for the re-exam). During the exam a HANDWRITTEN double-sided A4 formula sheet may be used.

Topics to be discussed (3rd edition)

Message of this course (1)
X(t) → linear system (LTI), h(n), h(t) → Y(t)

Message of this course (2)
X(t) → linear system (LTI), h(n), h(t) → Y(t)
Statistical descriptions of X(t): mean µ_X, autocorrelation function R_X(τ).
Statistical descriptions of Y(t): mean µ_Y = µ_X ∫ h(t) dt, and R_Y(τ) = h(τ) ∗ h(−τ) ∗ R_X(τ).

Message of this course (3)
Some questions to answer: What is a stochastic process? How to characterize a stochastic process? What is the autocorrelation function?
Statistical descriptions of the output Y(t) of an LTI system with WSS input X(t): mean µ_Y = µ_X ∫ h(t) dt, and R_Y(τ) = h(τ) ∗ h(−τ) ∗ R_X(τ).

Definition of a Random Variable (1)
Experiments often take place in a physical world: we observe physical quantities. Probability theory works in a mathematical world, with mathematical tools. How to map physical observations to numbers we can do mathematics on?
E.g. flip a coin: outcomes are head/tail. E.g. observe flooding because of frequent rainfall: outcomes are yes/no.

Definition of a Random Variable (2)
x = X(Outcome) = X(s): the outcome s lives in an arbitrary sample space (often related to the physical world), and X maps it to the real numbers ℝ.

Definition of a Random Variable (3)
Experiment: head or tail in 3 tosses. Outcomes: TTT, TTH, THT, HTT, THH, HTH, HHT, HHH. Random variable X(outcome): the number of heads H in three tosses, so X(s) ∈ {0, 1, 2, 3}.
A random variable is simply a rule to assign a number to each outcome in the sample space.

Definition of a Random Variable (4)
Events correspond to subsets of ℝ, e.g. {A < X(s) ≤ B}: "roll even" corresponds to {2, 4, 6}; "grown up" corresponds to [18, ∞).

From scalar RVs to stochastic processes
A scalar random variable is X(s) = X; a stochastic process is X(t, s) = X(t). One experiment ⇒ one outcome, or realization, of the stochastic/random process.

Scalar random variables
Discrete random variable: discrete sample space, a countable number of outcomes, each outcome with a non-zero probability of occurrence.
Continuous random variable: continuous sample space, an infinite number of outcomes. Example: pick a real-valued number between 3 and 5. How many outcomes? Every outcome has probability zero: P[X = x] = 0. Therefore we will consider events instead, e.g., 4 ≤ X ≤ 5.

How to describe a scalar RV?
Using the cumulative distribution function F_X(x). For continuous AND discrete RVs: F_X(x) = P[X ≤ x].
Properties:
F_X(−∞) = 0, F_X(∞) = 1
P[x_1 < X ≤ x_2] = F_X(x_2) − F_X(x_1)
For all x_0 ≥ x, F_X(x_0) ≥ F_X(x) (non-decreasing)
For continuous RVs: F_X(x) = P[X ≤ x] = ∫_{−∞}^{x} f_X(u) du

Example CDF
F_X(x) = P[X ≤ x]; reading the plot of F_X(x) at x = 10 and x = 20 gives P(10 < X ≤ 20) = F_X(20) − F_X(10).

PDF versus PMF
Discrete RVs are usually described by their probability mass function (PMF). Continuous RVs are usually described by their probability density function (PDF). For both cases the CDF can also be used. For discrete RVs the PDF also exists, which is useful for mixed random variables; the PDF then becomes a sum of delta functions.

Properties of the PDF
Instead of a PMF (for discrete RVs), we have a probability density function, or pdf, for continuous RVs: f_X(x) = dF_X(x)/dx.
A probability density is NOT a probability. It can even be larger than one!
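This point is easy to check numerically. The sketch below (an illustrative example, not from the slides) uses a uniform density on [0, 0.5], whose height 2 exceeds one while the total area is still one.

```python
import numpy as np

# A density can exceed 1: the uniform pdf on [0, 0.5] has height 2,
# but its total area (the total probability) is still 1.
def f_X(x):
    return np.where((x >= 0) & (x <= 0.5), 2.0, 0.0)

x = np.linspace(-1.0, 2.0, 300_001)
area = np.trapz(f_X(x), x)         # numerical integral of the pdf over the real line
print(f_X(0.25), round(area, 3))   # density value 2.0, total probability 1.0
```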

Example scalar RV: from PMF to CDF
Y takes the values 0, 1, 2 with P_Y(y) = 0.1, 0.09, 0.81. Plot the CDF F_Y(y). What is F_Y(1)?

Example scalar RV: from pdf to CDF
Given is the exponential pdf
f_T(t) = (1/3) e^{−t/3} for t ≥ 0, 0 otherwise.
Calculate the CDF F_T(t). Calculate the expected value E[T].

Example scalar RV: from pdf to CDF
Given is the exponential pdf f_T(t) = (1/3) e^{−t/3} for t ≥ 0, 0 otherwise.
F_T(t) = ∫_{−∞}^{t} f_T(u) du = ∫_{0}^{t} (1/3) e^{−u/3} du = [−e^{−u/3}]_0^t = 1 − e^{−t/3}
E[T] = ∫_0^∞ t (1/3) e^{−t/3} dt = [−t e^{−t/3}]_0^∞ + ∫_0^∞ e^{−t/3} dt = 0 + [−3 e^{−t/3}]_0^∞ = 3

From scalar RVs to multiple RVs
Model: Y = X + N. How to model this scenario? Using a pair of random variables: (X, Y) or (X, N).
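The worked example can be verified numerically; a minimal sketch, assuming NumPy, truncating the infinite upper limit at t = 100 (the tail mass there is negligible).

```python
import numpy as np

# Numerical check of the worked example: f_T(t) = (1/3) exp(-t/3), t >= 0.
t = np.linspace(0.0, 100.0, 1_000_001)   # [0, 100] stands in for [0, infinity)
f = (1.0 / 3.0) * np.exp(-t / 3.0)

mask = t <= 3.0
F_at_3 = np.trapz(f[mask], t[mask])      # CDF at t = 3: should match 1 - e^{-1}
E_T = np.trapz(t * f, t)                 # expected value: should be 3

print(round(F_at_3, 3), round(E_T, 3))   # 0.632 3.0
```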

Multiple random variables
We will now consider a collection of random variables X_1, X_2, ..., X_n (today, Chapter 5). Notice that we can also consider this as a vector X = [X_1, X_2, ..., X_n]^T (Chapter 8).

How to describe multiple RVs?
Remember for scalar RVs: CDF and pdf (or pmf). For multiple RVs we use the joint CDF F_{X,Y}(x, y) and the joint pdf f_{X,Y}(x, y) (or joint pmf P_{X,Y}(x, y)).

The joint cumulative distribution function
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]

The joint cumulative distribution function
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
Properties:
0 ≤ F_{X,Y}(x, y) ≤ 1
F_{X,Y}(∞, ∞) = 1
F_X(x) = F_{X,Y}(x, ∞), F_Y(y) = F_{X,Y}(∞, y)
F_{X,Y}(x, −∞) = 0, F_{X,Y}(−∞, y) = 0
If x ≤ x_1 and y ≤ y_1, then F_{X,Y}(x, y) ≤ F_{X,Y}(x_1, y_1)

Eample: j-cdf The joint probability mass function Using the j-cdf can be comple. It is often easier to work with the joint pmf and joint pdf. The j-pmf: P X,Y (, y) =P [X =, Y = y] Let S X,Y = {(, y) P X,Y (, y) > 0} be the set of possible (X, Y ) pairs. Eample: P X,Y (, y) y =0 y = y =2 =0 0.0 0 0 = 0.09 0.09 0 =2 0 0 0.8 X X 2S X y2s Y P X,Y (, y) = 4

The j-pmf for events
For discrete RVs X and Y and any set B in the X,Y-plane, the probability of the event {(X, Y) ∈ B} is
P[B] = Σ_{(x,y)∈B} P_{X,Y}(x, y).
Example: let B be the event {X = Y}. Give P[B] for the j-pmf
P_{X,Y}(x, y):  y=0    y=1    y=2
x=0:            0.01   0      0
x=1:            0.09   0.09   0
x=2:            0      0      0.81

The marginal pmf
Example: given is the j-pmf P_{X,Y}(x, y) above. Give P_X(x) and P_Y(y).
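These tabular computations can be sketched in a few lines; a minimal example assuming NumPy, and assuming the table entries are 0.01, 0.09, 0.09, 0.81 (the transcription appears to drop trailing 1s, and only these values sum to one).

```python
import numpy as np

# Joint pmf P_{X,Y}(x, y): rows indexed by x = 0,1,2, columns by y = 0,1,2.
# Values assume the transcription dropped trailing 1s: 0.01, 0.09, 0.09, 0.81.
P = np.array([[0.01, 0.00, 0.00],
              [0.09, 0.09, 0.00],
              [0.00, 0.00, 0.81]])

print(P.sum())        # a valid pmf sums to 1
print(P.sum(axis=1))  # marginal P_X(x): sum over y
print(P.sum(axis=0))  # marginal P_Y(y): sum over x
print(np.trace(P))    # P[X = Y]: sum of the diagonal entries
```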

Eample: From j-pmf to j-cdf P X,Y (, y) y =0 y = y =2 =0 0.0 0 0 = 0.09 0.09 0 =2 0 0 0.8 F X,Y (, y) =P [X apple, Y apple y] F X,Y (, y) y =0 y = y =2 =0 0.0 0.0 0.0 = 0. 0.9 0.9 =2 0. 0.9 From marginal pmf to j-pmf? P X,Y (, y) y =0 y = P X () =0?? 0.5 =?? 0.5 P Y (y) 0.5 0.5 6

From marginal pmf to j-pmf?
All three joint pmfs below have the same marginals P_X = P_Y = (0.5, 0.5):
P_{X,Y}(x, y):  y=0    y=1    P_X(x)
x=0:            0.5    0      0.5
x=1:            0      0.5    0.5
P_Y(y):         0.5    0.5

P_{X,Y}(x, y):  y=0    y=1    P_X(x)
x=0:            0.25   0.25   0.5
x=1:            0.25   0.25   0.5
P_Y(y):         0.5    0.5

P_{X,Y}(x, y):  y=0    y=1    P_X(x)
x=0:            0.1    0.4    0.5
x=1:            0.4    0.1    0.5
P_Y(y):         0.5    0.5

The j-pmf (or j-pdf) determines the marginal pmfs (or marginal pdfs), but not the other way around.
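The one-way relationship can be demonstrated directly; a small sketch (assuming NumPy) using two of the tables above:

```python
import numpy as np

# Two different joint pmfs on {0,1} x {0,1} with identical marginals (0.5, 0.5).
A = np.array([[0.5, 0.0],
              [0.0, 0.5]])    # X and Y always equal
B = np.array([[0.25, 0.25],
              [0.25, 0.25]])  # X and Y independent

same_marginals = (np.allclose(A.sum(axis=1), B.sum(axis=1)) and
                  np.allclose(A.sum(axis=0), B.sum(axis=0)))
different_joints = not np.allclose(A, B)
print(same_marginals, different_joints)   # True True
```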

The joint probability density function
For continuous multiple random variables we use the joint pdf. The j-pdf is related to the j-CDF by:
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y

Example j-pdf
Given F_{X,Y}(x, y) = (x − 5)(y − 6) for 5 ≤ x < 6, 6 ≤ y < 7 (0 otherwise):
f_{X,Y}(x, y) = ∂²/∂x∂y (x − 5)(y − 6) = 1 for 5 ≤ x < 6, 6 ≤ y < 7, 0 otherwise.

The joint probability density function
Remember the properties of the pdf for scalar RVs:
f_X(x) ≥ 0 for all x
F_X(x) = ∫_{−∞}^{x} f_X(u) du
∫_{−∞}^{∞} f_X(x) dx = 1
Similarly, for the j-pdf we thus have:
f_{X,Y}(x, y) ≥ 0 for all (x, y)
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1

The j-pdf and j-pmf thus simply extend what we already know from scalar RVs. However: finding the right integration area for these higher-dimensional integrals can be complicated!
P(A) = ∫∫_A f_{X,Y}(x, y) dx dy

Example
f_{X,Y}(x, y) = c for 0 ≤ y ≤ x ≤ 1, 0 otherwise. Value of c? The region is a triangle of area 1/2, so c = 2 makes the pdf integrate to one.
P(X > 1/2)?
P(X > 1/2) = ∫_{1/2}^{1} ∫_{0}^{x} 2 dy dx = 3/4
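A grid-based numerical check of this example (a sketch, assuming NumPy; the grid resolution is an arbitrary choice):

```python
import numpy as np

# f_{X,Y}(x, y) = c on the triangle 0 <= y <= x <= 1.
# The triangle has area 1/2, so c = 2 makes the density integrate to 1.
n = 1000
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
cell = (1.0 / (n - 1)) ** 2          # area of one grid cell
inside = y <= x

c = 2.0
total = c * inside.sum() * cell                   # ~ 1
p_half = c * (inside & (x > 0.5)).sum() * cell    # ~ 3/4

print(round(total, 2), round(p_half, 2))   # 1.0 0.75
```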

The marginal pdf
Remember for pmfs: P_Y(y) = Σ_{x∈S_X} P_{X,Y}(x, y) and P_X(x) = Σ_{y∈S_Y} P_{X,Y}(x, y).
How to obtain f_X(x) and f_Y(y) from f_{X,Y}(x, y)?
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
Proof using the CDF:
F_X(x) = P[X ≤ x] = ∫_{−∞}^{x} ( ∫_{−∞}^{∞} f_{X,Y}(u, y) dy ) du
f_X(x) = ∂F_X(x)/∂x

Illustration: a joint pdf f_{X,Y}(x, y) together with its marginals f_Y(y) and f_X(x).

Example marginal pdf
For the triangular j-pdf of the previous example (f_{X,Y}(x, y) = 2 for 0 ≤ y ≤ x ≤ 1): what are the marginal pdfs?

Independent RVs
Two RVs are independent if and only if:
Discrete RVs: P_{X,Y}(x, y) = P_X(x) P_Y(y)
Continuous RVs: f_{X,Y}(x, y) = f_X(x) f_Y(y)
Are X and Y independent here? No, as f_X(x) f_Y(y) ≠ f_{X,Y}(x, y).

Expected values of functions of RVs
Let g(X, Y) be a function of the variables X and Y. Expected value of W = g(X, Y), i.e., E[g(X, Y)]? One option: determine the pdf of W and calculate ∫ w f_W(w) dw. However, it is often difficult to find the pdf of W = g(X, Y). Much easier:
E[W] = ∫∫ g(x, y) f_{X,Y}(x, y) dx dy

Expected values of sums of RVs
For the sum of two RVs we have:
E[X + Y] = E[X] + E[Y] (typical value of X + Y)
Var[X + Y] = Var[X] + Var[Y] + 2 E[(X − E[X])(Y − E[Y])] (spread around E[X + Y]; the last term contains the covariance)
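A quick Monte Carlo sanity check of these two identities (a sketch, assuming NumPy; the particular distributions are a hypothetical choice, with Y = X + N mirroring the model earlier in the lecture):

```python
import numpy as np

# Monte Carlo check of E[X+Y] = E[X] + E[Y] and
# Var[X+Y] = Var[X] + Var[Y] + 2 Cov[X,Y].
rng = np.random.default_rng(0)
X = rng.normal(1.0, 2.0, 1_000_000)   # hypothetical choice of distributions
N = rng.normal(0.0, 1.0, 1_000_000)
Y = X + N                             # mirrors the model Y = X + N

lhs = np.var(X + Y)
rhs = np.var(X) + np.var(Y) + 2 * np.cov(X, Y, bias=True)[0, 1]
print(np.mean(X + Y))   # close to E[X] + E[Y] = 2
print(abs(lhs - rhs))   # essentially zero: the identity is exact for sample moments
```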

Covariance
Cov[X, Y] = E[(X − E[X])(Y − E[Y])]
E[X + Y]: typical value of X + Y. Var[X + Y]: spread around E[X + Y]. Cov[X, Y] = E[(X − E[X])(Y − E[Y])]: describes how X and Y vary together.

Covariance
(Scatter plots in the (x, y)-plane illustrating Cov[X, Y] > 0, Cov[X, Y] < 0, and two cases with Cov[X, Y] = 0.)

Correlation
The correlation between X and Y is r_{X,Y} = E[XY].
Covariance: Cov[X, Y] = E[(X − E[X])(Y − E[Y])] = r_{X,Y} − E[X]E[Y]
Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X, Y]
If X = Y: Cov[X, Y] = Var[X] = Var[Y] and r_{X,Y} = E[X²] = E[Y²]
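The identity Cov[X, Y] = r_{X,Y} − E[X]E[Y] can be checked on simulated data; a sketch assuming NumPy, with an arbitrary positively correlated pair where the true covariance is Var[X] = 1/12.

```python
import numpy as np

# Check Cov[X,Y] = r_{X,Y} - E[X]E[Y] on simulated data.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, 500_000)
Y = X + rng.uniform(0.0, 1.0, 500_000)   # positively correlated by construction

r = np.mean(X * Y)                 # correlation r_{X,Y} = E[XY]
cov = r - np.mean(X) * np.mean(Y)  # Cov[X,Y] = r_{X,Y} - E[X]E[Y]
print(cov)   # close to Cov[X, X+U] = Var[X] = 1/12 (about 0.083)
```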

Uncorrelated and orthogonal RVs
Two RVs are orthogonal if r_{X,Y} = 0.
Two RVs are uncorrelated if Cov[X, Y] = 0.
Independent RVs are uncorrelated.

Independent vs. uncorrelated

Example: covariance
(For the joint pdf on the region shown in the figure:) What are E[X] and E[Y]? What are E[XY] and Cov(X, Y)?

Example: covariance (continued)
(Scatter-plot panels of the example; only the axes survived transcription.)

Gaussian variables
Joint Gaussian pdf

Gaussian variables
f_{X,Y}(x, y) = 1 / (2π σ_X σ_Y √(1 − ρ²_{X,Y})) · exp( −[ ((x − E[X])/σ_X)² − 2ρ_{X,Y}(x − E[X])(y − E[Y])/(σ_X σ_Y) + ((y − E[Y])/σ_Y)² ] / (2(1 − ρ²_{X,Y})) )
For uncorrelated variables X and Y (Cov[X, Y] = 0):
f_{X,Y}(x, y) = (1/√(2πσ_X²)) e^{−(x − E[X])²/(2σ_X²)} · (1/√(2πσ_Y²)) e^{−(y − E[Y])²/(2σ_Y²)}

Gaussian variables
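The factorization for ρ = 0 can be checked by evaluating both sides at a point; a sketch assuming NumPy, with hypothetical helper names `biv_gauss` and `gauss`.

```python
import numpy as np

def biv_gauss(x, y, mx, my, sx, sy, rho):
    # Bivariate Gaussian pdf with means mx, my, std devs sx, sy, correlation rho.
    z = (((x - mx) / sx) ** 2
         - 2.0 * rho * (x - mx) * (y - my) / (sx * sy)
         + ((y - my) / sy) ** 2)
    norm = 2.0 * np.pi * sx * sy * np.sqrt(1.0 - rho ** 2)
    return np.exp(-z / (2.0 * (1.0 - rho ** 2))) / norm

def gauss(x, m, s):
    # Scalar Gaussian pdf.
    return np.exp(-((x - m) ** 2) / (2.0 * s ** 2)) / (np.sqrt(2.0 * np.pi) * s)

# With rho = 0 the joint pdf factors into the product of the two marginals.
x, y = 0.3, -1.2
print(np.isclose(biv_gauss(x, y, 0, 0, 1, 2, 0.0),
                 gauss(x, 0, 1) * gauss(y, 0, 2)))   # True
```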

Multivariate models

Practice before next lecture
Some selected exercises (3rd ed.): 5.1.1, 5.2.1, 5.2.2, 5.3.2, 5.4.1, 5.5.3, 5.5.9, 5.5.8