FACULTY OF ENGINEERING AND ARCHITECTURE. MATH 256 Probability and Random Processes. 04 Multiple Random Variables


Fall 2012
Instructor: Yrd. Doç. Dr. Didem Kivanc Tureli (didem@ieee.org, didem.kivanc@okan.edu.tr)

Mode and Median

MODE: the value which occurs most often. If there are two, three, or more values that have a high probability of occurring, the distribution is bimodal, trimodal, or multimodal.

MEDIAN: the value x for which P(X < x) ≤ 0.5 and P(X > x) ≤ 0.5. In the case of a discrete distribution a unique median may not exist.

Moments

If X is a random variable with mean μ = E[X], then:

kth raw moment of X:      m_k = E[X^k]
kth central moment of X:  E[(X − μ)^k]

Moment Generating Function

If X is a random variable with mean μ = E[X], then the moment generating function of X is defined as

M_X(t) = E[e^{tX}] = Σ_i e^{t x_i} p(x_i)      (discrete case)
M_X(t) = E[e^{tX}] = ∫ e^{tx} f_X(x) dx        (continuous case)

Expanding the exponential in a power series,

M_X(t) = E[e^{tX}] = E[1 + tX + (tX)²/2! + … + (tX)^k/k! + …]
       = 1 + t E[X] + (t²/2!) E[X²] + … + (t^k/k!) E[X^k] + …

Why do we care about the Moment Generating Function?

M_X(t) = E[e^{tX}] = E[1 + tX + (tX)²/2! + … + (tX)^k/k! + …]
       = 1 + t E[X] + (t²/2!) E[X²] + … + (t^k/k!) E[X^k] + …

To find the moments, take the kth derivative of the moment generating function and evaluate it at t = 0:

m_k = E[X^k] = M_X^(k)(0),   k = 1, 2, …   where   M_X^(k)(0) = d^k M_X(t)/dt^k evaluated at t = 0.

Characteristic Function

Ψ_X(ω) = E[e^{jωX}] = Σ_i e^{jω x_i} p(x_i)     (discrete case)
Ψ_X(ω) = E[e^{jωX}] = ∫ e^{jωx} f_X(x) dx       (continuous case)

where ω is a real variable and j is the square root of −1. Notice that if you know the moment generating function, you can find the characteristic function by setting t = jω. The probability density function is the inverse Fourier transform of the characteristic function:

f_X(x) = (1/2π) ∫ Ψ_X(ω) e^{−jωx} dω
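The moment-extraction property m_k = M_X^(k)(0) can be sanity-checked numerically. A minimal sketch, assuming a fair six-sided die as an illustrative discrete distribution (not from the slides) and approximating the derivatives at t = 0 by central finite differences:

```python
import math

# Illustrative distribution (assumption): a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

def mgf(t):
    """Moment generating function M_X(t) = sum_x e^{t x} p(x) (discrete case)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Approximate the first and second derivatives at t = 0 by central differences.
h = 1e-5
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # ≈ M'(0)  = E[X]   = 3.5
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ≈ M''(0) = E[X^2] ≈ 15.1667

# Direct moments for comparison.
ex = sum(x * p for x, p in pmf.items())
ex2 = sum(x * x * p for x, p in pmf.items())
print(round(m1, 3), round(ex, 3))
```

The finite-difference estimates agree with the directly computed moments to within the expected discretization and round-off error.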

Joint Distribution of Random Variables

Sometimes you want to study the relationship of two different random variables to each other. For instance, is there a correlation between height and grades? Is there a correlation between the number of hours someone sleeps and the amount of food they eat?

Independent Random Variables

Random variables X_1, …, X_n are independent if, for any two of the variables, the value of one gives us no information about the value of the other. Example: X = height of a random person on the street, T = temperature outside. X and T are independent random variables.

Formally: X and Y are independent if for any two sets of numbers A and B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}

Joint Probability Mass Function

The joint probability mass function of the discrete random variables X and Y, denoted f_XY(x, y), satisfies

(1) f_XY(x, y) ≥ 0
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)

Joint Probability Density Function

The joint probability density function of the continuous random variables X and Y, denoted f_XY(x, y), satisfies

(1) f_XY(x, y) ≥ 0
(2) ∫∫_{R²} f_XY(x, y) dx dy = 1
(3) For any region R of two-dimensional space, P((X, Y) ∈ R) = ∫∫_R f_XY(x, y) dx dy

Independent Random Variables

For independent random variables,

f_XY(x, y) = f_X(x) f_Y(y)

Joint Cumulative Distribution Function

The joint cumulative distribution function (or joint cdf) of X and Y, denoted F_XY(x, y), is the function defined by

F_XY(x, y) = P{X ≤ x, Y ≤ y}

If the events A and B below are independent (for every x and y), then the random variables X and Y are independent:

A = {ζ ∈ S : X(ζ) ≤ x},   B = {ζ ∈ S : Y(ζ) ≤ y}

Marginal Probability Density Function

If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are

f_X(x) = ∫ f_XY(x, y) dy    and    f_Y(y) = ∫ f_XY(x, y) dx

Example

Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is

f_XY(x, y) = 6 × 10⁻⁶ exp(−0.001x − 0.002y)    for 0 < x < y.

Check that the integral of f_XY(x, y) is 1 over the entire region. Find P(X ≤ 1000, Y ≤ 2000). Find P(Y > 2000).
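The three questions in the example can be checked numerically. A stdlib-only sketch using a midpoint Riemann sum; the 10 ms step and the 8000 ms truncation bound are arbitrary choices here, and accuracy is limited near the boundary x = y, so the comparisons are only approximate:

```python
import math

# Joint density from the example: f(x, y) = 6e-6 exp(-0.001 x - 0.002 y) for 0 < x < y.
def f(x, y):
    return 6e-6 * math.exp(-0.001 * x - 0.002 * y) if 0 < x < y else 0.0

def integrate(xlo, xhi, ylo, yhi, step=10.0):
    """Midpoint Riemann sum of f over the rectangle [xlo, xhi] x [ylo, yhi]."""
    total = 0.0
    for i in range(int((xhi - xlo) / step)):
        x = xlo + (i + 0.5) * step
        for j in range(int((yhi - ylo) / step)):
            y = ylo + (j + 0.5) * step
            total += f(x, y) * step * step
    return total

total = integrate(0, 8000, 0, 8000)   # should be close to 1
p_a = integrate(0, 1000, 0, 2000)     # P(X <= 1000, Y <= 2000); exact: (1 - e^-3) - 3 e^-4 (1 - e^-1) ≈ 0.915
p_b = integrate(0, 8000, 2000, 8000)  # P(Y > 2000); exact: 3 e^-4 (1 - e^-2) + e^-6 ≈ 0.0500
print(round(total, 2), round(p_a, 2), round(p_b, 3))
```

The exact values in the comments come from integrating the exponential density in closed form; the Riemann sum reproduces them to about two decimal places.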

Conditional Probability Density Function

Given continuous random variables X and Y, with joint probability density function f_XY(x, y), the conditional probability density function of Y, given X = x, is

f_{Y|X}(y) = f_XY(x, y) / f_X(x)    for f_X(x) > 0

Example continued: Find the conditional probability density function of Y given X = x, then calculate P(Y > 2000 | X = 1500).
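For this example the marginal works out to f_X(x) = 0.003 e^{−0.003x}, so the conditional density is f_{Y|X}(y) = 0.002 e^{−0.002(y − x)} for y > x; that is, given X = x, the remaining wait Y − x is exponential with rate 0.002. A small sketch of the resulting computation:

```python
import math

# Continuing the server example: f(x, y) = 6e-6 exp(-0.001 x - 0.002 y) for x < y
# has marginal f_X(x) = 0.003 exp(-0.003 x), so
# f_{Y|X}(y) = f(x, y) / f_X(x) = 0.002 exp(-0.002 (y - x)) for y > x.
def p_y_greater_given_x(y0, x):
    """P(Y > y0 | X = x) for y0 >= x, from the conditional density above."""
    return math.exp(-0.002 * (y0 - x))

p = p_y_greater_given_x(2000, 1500)
print(round(p, 4))  # e^{-1} ≈ 0.3679
```

So P(Y > 2000 | X = 1500) = e^{−0.002·500} = e^{−1} ≈ 0.368.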

Independent Random Variables

For independent random variables,

f_{Y|X}(y) = f_XY(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)

Conditional Mean and Variance

The conditional mean of Y given X = x, denoted E(Y | x) or μ_{Y|x}, is

E(Y | x) = ∫ y f_{Y|X}(y) dy

The conditional variance of Y given X = x, denoted V(Y | x) or σ²_{Y|x}, is

V(Y | x) = ∫ (y − μ_{Y|x})² f_{Y|X}(y) dy = ∫ y² f_{Y|X}(y) dy − μ²_{Y|x}

Example

Joint probability mass function of X and Y, where X is the number of bars of signal strength on your cellphone and Y is the number of times you need to repeat the name of a city to the airline operator:

  y \ x      1       2       3
    4       0.15    0.10    0.05
    3       0.02    0.10    0.05
    2       0.02    0.03    0.20
    1       0.01    0.02    0.25

The joint probability mass function is also sometimes called the joint distribution, so be careful.

Check that the probabilities sum to 1:

  0.15 + 0.02 + 0.02 + 0.01 = 0.20
  0.10 + 0.10 + 0.03 + 0.02 = 0.25
  0.05 + 0.05 + 0.20 + 0.25 = 0.55
  0.20 + 0.25 + 0.55 = 1

Find the joint cumulative distribution function

F_XY(x, y) = P(X ≤ x, Y ≤ y) is obtained by summing the joint PMF over all pairs (x′, y′) with x′ ≤ x and y′ ≤ y. For example, F_XY(2, 2) = 0.01 + 0.02 + 0.02 + 0.03 = 0.08, and F_XY(2, 3) = 0.08 + 0.02 + 0.10 = 0.20.

Joint prob. mass function f_XY(x, y):

  y \ x      1       2       3
    4       0.15    0.10    0.05
    3       0.02    0.10    0.05
    2       0.02    0.03    0.20
    1       0.01    0.02    0.25

Joint prob. distribution function F_XY(x, y):

  y \ x      1       2       3
    4       0.20    0.45    1.00
    3       0.05    0.20    0.70
    2       0.03    0.08    0.53
    1       0.01    0.03    0.28
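The CDF table can be reproduced mechanically from the joint PMF. A small sketch, storing the PMF as a dictionary keyed by (x, y) (a representation chosen here for illustration):

```python
# Joint PMF of the cellphone example: keys are (x, y) with
# x = number of bars of signal strength, y = repeats of the city name.
pmf = {
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}

def cdf(x, y):
    """F(x, y) = P(X <= x, Y <= y): sum the PMF over all (a, b) with a <= x, b <= y."""
    return sum(p for (a, b), p in pmf.items() if a <= x and b <= y)

print(round(cdf(2, 2), 2), round(cdf(2, 3), 2), round(cdf(3, 4), 2))
```

Evaluating cdf at every (x, y) pair reproduces the joint distribution table above, e.g. F(2, 2) = 0.08, F(2, 3) = 0.20, F(3, 4) = 1.00.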

Find the marginal probability mass functions

The marginal probability mass function of Y is obtained by summing the joint PMF along each row: p_Y(y) = Σ_x f_XY(x, y). For example, Pr(Y = 2) = 0.02 + 0.03 + 0.20 = 0.25.

The marginal probability mass function of X is obtained by summing the joint PMF down each column: p_X(x) = Σ_y f_XY(x, y). For example, Pr(X = 3) = 0.05 + 0.05 + 0.20 + 0.25 = 0.55.

The completed table with marginals:

  y \ x      1       2       3     Pr(Y = y)
    4       0.15    0.10    0.05     0.30
    3       0.02    0.10    0.05     0.17
    2       0.02    0.03    0.20     0.25
    1       0.01    0.02    0.25     0.28
  Pr(X = x) 0.20    0.25    0.55

The row sums give the marginal probability mass function p_Y(y); the column sums give the marginal probability mass function p_X(x).

Find the conditional probability mass function p_{X|Y}(x | y)

Each entry of the joint PMF is divided by the marginal of its row, p_{X|Y}(x | y) = f_XY(x, y) / p_Y(y):

  y \ x        1            2            3        Pr(Y = y)
    4       0.15/0.30    0.10/0.30    0.05/0.30     0.30
    3       0.02/0.17    0.10/0.17    0.05/0.17     0.17
    2       0.02/0.25    0.03/0.25    0.20/0.25     0.25
    1       0.01/0.28    0.02/0.28    0.25/0.28     0.28
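The marginals can likewise be computed mechanically. A small sketch, reusing the dictionary representation of the joint PMF (an illustrative choice, not from the slides):

```python
# Joint PMF of the cellphone example, keyed by (x, y).
pmf = {
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}

# p_X(x): sum down each column; p_Y(y): sum along each row.
p_x = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in (1, 2, 3)}
p_y = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in (1, 2, 3, 4)}
print(p_x)
print(p_y)
```

This reproduces the margin row and column of the table: p_X = (0.20, 0.25, 0.55) and p_Y = (0.28, 0.25, 0.17, 0.30), each summing to 1.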

Find the conditional probability mass function p_{Y|X}(y | x)

Each entry of the joint PMF is divided by the marginal of its column, p_{Y|X}(y | x) = f_XY(x, y) / p_X(x):

  y \ x        1            2            3
    4       0.15/0.20    0.10/0.25    0.05/0.55
    3       0.02/0.20    0.10/0.25    0.05/0.55
    2       0.02/0.20    0.03/0.25    0.20/0.55
    1       0.01/0.20    0.02/0.25    0.25/0.55
  Pr(X = x)   0.20         0.25         0.55

Find the conditional mean E(X | Y = y)

Using p_{X|Y}(x | y):

  E[X | Y = 4] = 1·(0.15/0.30) + 2·(0.10/0.30) + 3·(0.05/0.30) = 1.6667
  E[X | Y = 3] = 1·(0.02/0.17) + 2·(0.10/0.17) + 3·(0.05/0.17) = 2.1765
  E[X | Y = 2] = 1·(0.02/0.25) + 2·(0.03/0.25) + 3·(0.20/0.25) = 2.7200
  E[X | Y = 1] = 1·(0.01/0.28) + 2·(0.02/0.28) + 3·(0.25/0.28) = 2.8571

Find the conditional mean E(Y | X = x)

Using p_{Y|X}(y | x):

  E[Y | X = 1] = 4·(0.15/0.20) + 3·(0.02/0.20) + 2·(0.02/0.20) + 1·(0.01/0.20) = 3.5500
  E[Y | X = 2] = 4·(0.10/0.25) + 3·(0.10/0.25) + 2·(0.03/0.25) + 1·(0.02/0.25) = 3.1200
  E[Y | X = 3] = 4·(0.05/0.55) + 3·(0.05/0.55) + 2·(0.20/0.55) + 1·(0.25/0.55) = 1.8182
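The conditional means above can be checked directly from the joint PMF. A small sketch (dictionary representation assumed for illustration):

```python
# Joint PMF of the cellphone example, keyed by (x, y).
pmf = {
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}

def cond_mean_y_given_x(x):
    """E[Y | X = x] = sum_y y * p_{Y|X}(y|x), with p_{Y|X}(y|x) = f(x, y) / p_X(x)."""
    px = sum(p for (a, _), p in pmf.items() if a == x)
    return sum(b * p for (a, b), p in pmf.items() if a == x) / px

def cond_mean_x_given_y(y):
    """E[X | Y = y] = sum_x x * p_{X|Y}(x|y), with p_{X|Y}(x|y) = f(x, y) / p_Y(y)."""
    py = sum(p for (_, b), p in pmf.items() if b == y)
    return sum(a * p for (a, b), p in pmf.items() if b == y) / py

print(round(cond_mean_y_given_x(1), 4))  # 3.55
print(round(cond_mean_x_given_y(4), 4))  # 1.6667
```

Note the intuition the numbers confirm: with more bars of signal strength, the expected number of repeats drops (3.55 down to 1.82), and with many repeats, fewer bars are likely.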