Chapter 5 Joint Probability Distributions
1 Applied Statistics and Probability for Engineers, Sixth Edition. Douglas C. Montgomery, George C. Runger. Chapter 5: Joint Probability Distributions
2 Chapter 5: Joint Probability Distributions
CHAPTER OUTLINE
5-1 Two or More Random Variables: Joint Probability Distributions; Marginal Probability Distributions; Conditional Probability Distributions; Independence; More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions: Multinomial Probability Distribution; Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables
5-6 Moment Generating Functions
3 Learning Objectives for Chapter 5
After careful study of this chapter, you should be able to do the following:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint probability distributions.
3. Interpret and calculate covariances and correlations between random variables.
4. Use the multinomial distribution to determine probabilities.
5. Understand the properties of a bivariate normal distribution and draw contour plots for the probability density function.
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.
7. Determine the distribution of a general function of a random variable.
8. Calculate moment generating functions and use them to determine moments and distributions.
4 Joint Probability Mass Function
The joint probability mass function of the discrete random variables X and Y, denoted f_XY(x, y), satisfies:
(1) f_XY(x, y) ≥ 0
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)
5 Joint Probability Density Function
The joint probability density function for the continuous random variables X and Y, denoted f_XY(x, y), satisfies the following properties:
(1) f_XY(x, y) ≥ 0 for all x, y
(2) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_XY(x, y) dx dy = 1
(3) P((X, Y) ∈ R) = ∫∫_R f_XY(x, y) dx dy
Figure 5-2: Joint probability density function for the random variables X and Y. The probability that (X, Y) is in the region R is determined by the volume of f_XY(x, y) over the region R.
6 Example 5-2: Server Access Time (1)
Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). X and Y measure the wait from a common starting point, so x < y. The joint probability density function for X and Y is
f_XY(x, y) = k e^{-0.001x - 0.002y} for 0 < x < y, where k = 6 × 10^-6
Figure 5-4: The joint probability density function of X and Y is nonzero over the shaded region where x < y.
7 Example 5-2: Server Access Time (2)
The region with nonzero probability is shaded in Fig. 5-4. We verify that the density integrates to 1 as follows:
∫_0^∞ ∫_x^∞ k e^{-0.001x - 0.002y} dy dx = k ∫_0^∞ e^{-0.001x} (e^{-0.002x}/0.002) dx
= (k/0.002) ∫_0^∞ e^{-0.003x} dx = k/(0.002 × 0.003) = (6 × 10^-6)/(6 × 10^-6) = 1
8 Example 5-2: Server Access Time (3)
Now calculate a probability:
P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 k e^{-0.001x - 0.002y} dy dx
= (k/0.002) ∫_0^1000 e^{-0.001x} (e^{-0.002x} - e^{-4}) dx
= 0.003 ∫_0^1000 (e^{-0.003x} - e^{-4} e^{-0.001x}) dx
= 0.003 [ (1 - e^{-3})/0.003 - e^{-4}(1 - e^{-1})/0.001 ]
= (1 - e^{-3}) - 3e^{-4}(1 - e^{-1}) ≈ 0.915
Figure 5-5: Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.
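The closed-form answer (1 - e^{-3}) - 3e^{-4}(1 - e^{-1}) can be cross-checked numerically. The sketch below is an addition, not part of the slides; it assumes only the density f_XY(x, y) = k e^{-0.001x - 0.002y} stated above, and compares the closed form against a crude midpoint Riemann sum over the triangular region:

```python
import math

k = 6e-6  # normalizing constant from Example 5-2

def f(x, y):
    # Joint density, nonzero over 0 < x < y
    return k * math.exp(-0.001 * x - 0.002 * y)

# Closed-form value derived on the slide
closed = (1 - math.exp(-3)) - 3 * math.exp(-4) * (1 - math.exp(-1))

# Midpoint Riemann sum over 0 < x < 1000, x < y < 2000
n = 400
dx = 1000.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    dy = (2000.0 - x) / n
    for j in range(n):
        y = x + (j + 0.5) * dy
        total += f(x, y) * dx * dy
# closed and total both come out near 0.915
```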
9 Marginal Probability Distributions (discrete)
The marginal probability distribution of X is found by summing the probabilities in each column, whereas the marginal probability distribution of Y is found by summing the probabilities in each row:
f_X(x) = Σ_y f_XY(x, y)        f_Y(y) = Σ_x f_XY(x, y)
In Example 5-1, with y = response time (nearest second) and x = number of bars of signal strength, the marginal probability distributions f_X(x) and f_Y(y) are the column and row totals of the joint probability table.
10 Marginal Probability Density Function (continuous)
If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are
f_X(x) = ∫_{-∞}^{∞} f_XY(x, y) dy        f_Y(y) = ∫_{-∞}^{∞} f_XY(x, y) dx
11 Example 5-4: Server Access Time (1)
For the random variables that denote times in Example 5-2, find the probability that Y exceeds 2000 milliseconds.
One way is to integrate the joint PDF directly, using the picture to determine the limits:
P(Y > 2000) = ∫_0^2000 [ ∫_2000^∞ f_XY(x, y) dy ] dx + ∫_2000^∞ [ ∫_x^∞ f_XY(x, y) dy ] dx
The first term covers the left part of the dark region (x < 2000) and the second term the right part (x > 2000).
12 Example 5-4: Server Access Time (2)
Alternatively, find the marginal PDF of Y and then integrate it to find the desired probability:
f_Y(y) = ∫_0^y k e^{-0.001x - 0.002y} dx = k e^{-0.002y} (1 - e^{-0.001y})/0.001
= 6 × 10^-3 e^{-0.002y} (1 - e^{-0.001y}) for y > 0
P(Y > 2000) = ∫_2000^∞ 0.006 (e^{-0.002y} - e^{-0.003y}) dy
= 0.006 [ e^{-4}/0.002 - e^{-6}/0.003 ] = 3e^{-4} - 2e^{-6} ≈ 0.0500
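The marginal density and the tail probability can be checked numerically. The sketch below is an addition, not from the slides; it integrates the derived marginal f_Y(y) with the trapezoid rule and compares against the closed form 3e^{-4} - 2e^{-6}:

```python
import math

def f_Y(y):
    # Marginal density of Y derived in Example 5-4
    return 6e-3 * math.exp(-0.002 * y) * (1 - math.exp(-0.001 * y))

closed = 3 * math.exp(-4) - 2 * math.exp(-6)   # ≈ 0.05

# Trapezoid rule on [2000, 40000]; the tail beyond 40000 is negligible
a, b, n = 2000.0, 40000.0, 20000
h = (b - a) / n
numeric = (0.5 * (f_Y(a) + f_Y(b)) + sum(f_Y(a + i * h) for i in range(1, n))) * h
```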
13 Mean & Variance of a Marginal Distribution
E(X) and V(X) can be obtained by first calculating the marginal probability distribution of X and then determining E(X) and V(X) by the usual method. In the discrete case, with R the range of the marginal distribution,
E(X) = Σ_{x∈R} x f_X(x)        V(X) = Σ_{x∈R} (x - μ_X)^2 f_X(x)
E(Y) = Σ_{y∈R} y f_Y(y)        V(Y) = Σ_{y∈R} (y - μ_Y)^2 f_Y(y)
14 Mean & Variance for Example 5-1
With y = response time (nearest second) and x = number of bars of signal strength, augment the marginal tables with columns for y·f(y), y²·f(y), x·f(x), and x²·f(x). Summing these columns gives
E(X) = 2.35        V(X) = E(X²) - [E(X)]² = E(X²) - 2.35²
E(Y) = 2.49        V(Y) = E(Y²) - [E(Y)]² = E(Y²) - 2.49²
15 Conditional Probability Density Function
Given continuous random variables X and Y with joint probability density function f_XY(x, y), the conditional probability density function of Y given X = x is
f_{Y|x}(y) = f_XY(x, y)/f_X(x)  for f_X(x) > 0
It satisfies f_{Y|x}(y) ≥ 0 and ∫ f_{Y|x}(y) dy = 1.
16 Example 5-6: Conditional Probability (1)
From Example 5-2, determine the conditional PDF for Y given X = x.
First find the marginal density of X:
f_X(x) = ∫_x^∞ k e^{-0.001x - 0.002y} dy = k e^{-0.001x} (e^{-0.002x}/0.002) = 0.003 e^{-0.003x} for x > 0
Then
f_{Y|x}(y) = f_XY(x, y)/f_X(x) = (6 × 10^-6 e^{-0.001x - 0.002y})/(0.003 e^{-0.003x})
= 0.002 e^{0.002x - 0.002y} for 0 < x and x < y
17 Example 5-6: Conditional Probability (2)
Now find the probability that Y exceeds 2000 given that X = 1500:
P(Y > 2000 | X = 1500) = ∫_2000^∞ 0.002 e^{0.002(1500) - 0.002y} dy = ∫_2000^∞ 0.002 e^{3 - 0.002y} dy
= e^3 e^{-0.002(2000)} = e^3 e^{-4} = e^{-1} ≈ 0.368
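As a numerical sanity check (an addition, not in the original), the conditional density can be integrated over y > 2000 and compared with e^{-1}:

```python
import math

def f_cond(y, x=1500.0):
    # Conditional density of Y given X = x from Example 5-6
    return 0.002 * math.exp(0.002 * x - 0.002 * y)

# Trapezoid rule on [2000, 20000]; the tail beyond 20000 is negligible
a, b, n = 2000.0, 20000.0, 20000
h = (b - a) / n
p = (0.5 * (f_cond(a) + f_cond(b)) + sum(f_cond(a + i * h) for i in range(1, n))) * h
# p ≈ e^-1 ≈ 0.368
```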
18 Mean & Variance of Conditional Random Variables
The conditional mean of Y given X = x, denoted E(Y|x) or μ_{Y|x}, is
E(Y|x) = Σ_y y f_{Y|x}(y)
The conditional variance of Y given X = x, denoted V(Y|x) or σ²_{Y|x}, is
V(Y|x) = Σ_y (y - μ_{Y|x})² f_{Y|x}(y) = Σ_y y² f_{Y|x}(y) - μ²_{Y|x}
19 Example 5-8: Conditional Mean and Variance
From Examples 5-2 and 5-6, what is the conditional mean of Y given that x = 1500?
E(Y | X = 1500) = ∫_1500^∞ y (0.002 e^{3 - 0.002y}) dy = ∫_1500^∞ y (0.002 e^{-0.002(y - 1500)}) dy
Integrating by parts (or noting that, given X = 1500, Y is 1500 plus an exponential random variable with parameter 0.002) gives
E(Y | X = 1500) = 1500 + 1/0.002 = 1500 + 500 = 2000
If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.
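The conditional expectation can also be checked by simulation: given X = 1500, Y - 1500 follows an exponential distribution with rate 0.002 (mean 500), per the conditional density above. A sketch, not from the text:

```python
import random

random.seed(1)
x, lam = 1500.0, 0.002
# Given X = x, Y = x + Exp(lam), so samples of Y should average x + 1/lam = 2000
samples = [x + random.expovariate(lam) for _ in range(200000)]
mean = sum(samples) / len(samples)
```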
20 Example 5-9
For the discrete random variables in Example 5-1, what is the conditional mean of Y given X = 1?
Divide each joint probability f_XY(1, y) by the marginal probability f_X(1) to obtain the conditional distribution f_{Y|1}(y), then sum the columns y·f(y|x=1) and y²·f(y|x=1). The mean number of attempts given one bar is E(Y | X = 1) = 3.55, with variance V(Y | X = 1) = E(Y² | X = 1) - 3.55².
21 Independent Random Variables
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent:
(1) f_XY(x, y) = f_X(x) f_Y(y) for all x and y
(2) f_{Y|x}(y) = f_Y(y) for all x and y with f_X(x) > 0
(3) f_{X|y}(x) = f_X(x) for all x and y with f_Y(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the range of X and Y
22 Example 5-11: Independent Random Variables
Suppose Example 5-2 is modified so that the joint PDF is
f_XY(x, y) = 2 × 10^-6 e^{-0.001x - 0.002y} for x ≥ 0 and y ≥ 0
Are X and Y independent?
f_X(x) = ∫_0^∞ 2 × 10^-6 e^{-0.001x - 0.002y} dy = 0.001 e^{-0.001x} for x > 0
f_Y(y) = ∫_0^∞ 2 × 10^-6 e^{-0.001x - 0.002y} dx = 0.002 e^{-0.002y} for y > 0
Since f_XY(x, y) = f_X(x) f_Y(y), X and Y are independent.
Find the probability:
P(X > 1000, Y < 1000) = P(X > 1000) P(Y < 1000) = e^{-1}(1 - e^{-2}) ≈ 0.318
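Because the joint density factors into independent exponentials with rates 0.001 and 0.002, the probability can be verified both in closed form and by simulation. This is an added sketch, not part of the slides:

```python
import math
import random

closed = math.exp(-1) * (1 - math.exp(-2))   # P(X > 1000) * P(Y < 1000)

random.seed(7)
n = 200000
hits = sum(
    1
    for _ in range(n)
    if random.expovariate(0.001) > 1000 and random.expovariate(0.002) < 1000
)
mc = hits / n
# closed and mc both come out near 0.318
```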
23 Joint Probability Density Function
The joint probability density function for the continuous random variables X_1, X_2, ..., X_p, denoted f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p), satisfies the following properties:
(1) f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p) ≥ 0
(2) ∫ ... ∫ f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p) dx_1 dx_2 ... dx_p = 1
24 Example 5-14: Component Lifetimes
In an electronic assembly, let X_1, X_2, X_3, X_4 denote the lifetimes of 4 components in hours. The joint PDF, a product of exponential PDFs, is
f(x_1, x_2, x_3, x_4) = 9 × 10^-12 e^{-0.001x_1 - 0.002x_2 - 0.0015x_3 - 0.003x_4} for x_i ≥ 0
What is the probability that the device operates more than 1000 hours?
P(X_1 > 1000, X_2 > 1000, X_3 > 1000, X_4 > 1000) = e^{-1 - 2 - 1.5 - 3} = e^{-7.5} ≈ 0.00055
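Since the joint PDF factors into independent exponential lifetimes, the reliability computation is just a product of tail probabilities. A minimal sketch, assuming the rates 0.001, 0.002, 0.0015, 0.003 per hour (consistent with the constant 9 × 10^-12 and the exponent 7.5 on the slide):

```python
import math

rates = [0.001, 0.002, 0.0015, 0.003]   # assumed exponential rates of the 4 components
t = 1000.0

# P(X_i > t) = exp(-rate * t) for an exponential lifetime; independence => multiply
p = 1.0
for r in rates:
    p *= math.exp(-r * t)
# p = e^-7.5 ≈ 0.00055
```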
25 Marginal Probability Density Function
If the joint probability density function of X_1, X_2, ..., X_p is f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p), the marginal probability density function of any X_i is found by integrating the joint PDF over all the other variables:
f_{X_i}(x_i) = ∫ ... ∫ f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p) dx_1 ... dx_{i-1} dx_{i+1} ... dx_p
26 Mean & Variance of a Joint Distribution
The mean and variance of X_i can be determined from either the marginal PDF or the joint PDF as follows:
E(X_i) = ∫ x_i f_{X_i}(x_i) dx_i = ∫ ... ∫ x_i f_{X_1 X_2 ... X_p}(x_1, ..., x_p) dx_1 ... dx_p
V(X_i) = ∫ (x_i - μ_{X_i})² f_{X_i}(x_i) dx_i = ∫ ... ∫ (x_i - μ_{X_i})² f_{X_1 X_2 ... X_p}(x_1, ..., x_p) dx_1 ... dx_p
27 Example 5-16
Points that have positive probability in the joint probability distribution of three random variables X_1, X_2, X_3 are shown in the figure. Suppose the 10 points are equally likely with probability 0.1 each. The range is the set of non-negative integers with x_1 + x_2 + x_3 = 3.
List the marginal distribution of X_2 (writing f for f_{X_1 X_2 X_3}):
P(X_2 = 0) = f(3,0,0) + f(0,0,3) + f(1,0,2) + f(2,0,1) = 0.4
P(X_2 = 1) = f(2,1,0) + f(0,1,2) + f(1,1,1) = 0.3
P(X_2 = 2) = f(1,2,0) + f(0,2,1) = 0.2
P(X_2 = 3) = f(0,3,0) = 0.1
Also, E(X_2) = 0(0.4) + 1(0.3) + 2(0.2) + 3(0.1) = 1
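The marginal can also be found by brute-force enumeration of the range. The sketch below (an addition, not in the original) lists the non-negative integer triples summing to 3 and tallies X_2:

```python
from itertools import product

# All non-negative integer triples with x1 + x2 + x3 = 3; there are exactly 10
points = [pt for pt in product(range(4), repeat=3) if sum(pt) == 3]

# Each point has probability 0.1; tally the marginal distribution of X2
marginal = {k: sum(0.1 for pt in points if pt[1] == k) for k in range(4)}
mean = sum(k * pr for k, pr in marginal.items())
# marginal ≈ {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1} and mean ≈ 1
```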
28 Distribution of a Subset of Random Variables
29 Conditional Probability Distributions
Conditional probability distributions can be developed for multiple random variables by extension of the ideas used for two random variables. Suppose p = 5 and we wish to find the distribution conditional on X_4 and X_5:
f_{X_1 X_2 X_3 | x_4 x_5}(x_1, x_2, x_3) = f_{X_1 X_2 X_3 X_4 X_5}(x_1, x_2, x_3, x_4, x_5) / f_{X_4 X_5}(x_4, x_5)
for f_{X_4 X_5}(x_4, x_5) > 0.
30 Independence with Multiple Variables
The concept of independence can be extended to multiple variables: X_1, X_2, ..., X_p are independent if and only if
f_{X_1 X_2 ... X_p}(x_1, x_2, ..., x_p) = f_{X_1}(x_1) f_{X_2}(x_2) ... f_{X_p}(x_p) for all x_1, x_2, ..., x_p
31 Example 5-18: Layer Thickness
Suppose X_1, X_2, and X_3 represent the thickness in μm of a substrate, an active layer, and a coating layer of a chemical product. Assume that these variables are independent and normally distributed, with the means, standard deviations, and specification limits for each layer as tabled (for example, X_1 has mean 10,000 and upper specification limit 10,800).
What proportion of the product meets all specifications? Because the three thicknesses are independent,
P(all in limits) = P(X_1 in limits) · P(X_2 in limits) · P(X_3 in limits)
Which one of the three thicknesses has the least probability of meeting specs? Layer 3 has the least probability.
32 Covariance
Covariance is a measure of the relationship between two random variables. First, we need the expected value of a function of two random variables. Let h(X, Y) denote the function of interest:
E[h(X, Y)] = Σ_x Σ_y h(x, y) f_XY(x, y)  (X, Y discrete)
E[h(X, Y)] = ∫∫ h(x, y) f_XY(x, y) dx dy  (X, Y continuous)
33 Example 5-19: Expected Value of a Function of Two Random Variables
For the joint probability distribution of the two random variables in Example 5-1, calculate E[(X - μ_X)(Y - μ_Y)].
The result is obtained by multiplying (x - μ_X)(y - μ_Y) by f_XY(x, y) for each point in the range of (X, Y) and summing. First, μ_X and μ_Y were determined previously from the marginal distributions for X and Y: μ_X = 2.35 and μ_Y = 2.49.
34 Covariance Defined
The covariance between the random variables X and Y, denoted cov(X, Y) or σ_XY, is
σ_XY = E[(X - μ_X)(Y - μ_Y)] = E(XY) - μ_X μ_Y
35 Correlation (ρ = rho)
The correlation between the random variables X and Y, denoted ρ_XY, is
ρ_XY = cov(X, Y) / [√V(X) √V(Y)] = σ_XY / (σ_X σ_Y)
For any two random variables, -1 ≤ ρ_XY ≤ +1.
36 Example 5-21: Covariance & Correlation
Determine the covariance and correlation for the joint distribution f(x, y) in Figure 5-13 (a discrete joint distribution).
From the marginal distributions, μ_X = 1.8 and μ_Y = 1.8. Multiply (x - μ_X)(y - μ_Y) by f(x, y) at each point of positive probability and sum the products to obtain the covariance σ_XY; then divide by σ_X σ_Y to obtain the correlation. Note the strong positive correlation.
37 Independence Implies ρ = 0
If X and Y are independent random variables, then σ_XY = ρ_XY = 0.
However, ρ_XY = 0 is a necessary, but not a sufficient, condition for independence.
38 Example 5-23: Independence Implies Zero Covariance
Let f_XY(x, y) = xy/16 for 0 < x < 2 and 0 < y < 4. Show that σ_XY = E(XY) - E(X)E(Y) = 0.
The marginals are f_X(x) = ∫_0^4 (xy/16) dy = x/2 for 0 < x < 2 and f_Y(y) = ∫_0^2 (xy/16) dx = y/8 for 0 < y < 4, so
E(X) = ∫_0^2 x (x/2) dx = 4/3        E(Y) = ∫_0^4 y (y/8) dy = 8/3
E(XY) = ∫_0^2 ∫_0^4 xy (xy/16) dy dx = (1/16) ∫_0^2 x² dx ∫_0^4 y² dy = (1/16)(8/3)(64/3) = 32/9
σ_XY = E(XY) - E(X)E(Y) = 32/9 - (4/3)(8/3) = 0
Figure 5-15: A planar joint distribution.
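For the density f_XY(x, y) = xy/16 on 0 < x < 2, 0 < y < 4, the zero covariance can be confirmed numerically with a midpoint double Riemann sum (an added check, not from the text):

```python
def f(x, y):
    # Joint density from Example 5-23
    return x * y / 16.0

n = 400
dx, dy = 2.0 / n, 4.0 / n
mass = ex = ey = exy = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    for j in range(n):
        y = (j + 0.5) * dy
        w = f(x, y) * dx * dy
        mass += w
        ex += x * w
        ey += y * w
        exy += x * y * w
cov = exy - ex * ey
# mass ≈ 1, ex ≈ 4/3, ey ≈ 8/3, cov ≈ 0
```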
39 Multinomial Probability Distribution
Suppose a random experiment consists of a series of n trials. Assume that:
1) The outcome of each trial can be classified into one of k classes.
2) The probability of a trial resulting in each of the k outcomes is constant: p_1, p_2, ..., p_k.
3) The trials are independent.
The random variables X_1, X_2, ..., X_k that denote the number of outcomes in each class have a multinomial distribution with probability mass function
P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [n!/(x_1! x_2! ... x_k!)] p_1^{x_1} p_2^{x_2} ... p_k^{x_k}
for x_1 + x_2 + ... + x_k = n.
40 Example 5-25: Digital Channel
Of the 20 bits received over a digital channel, 14 are of excellent (E) quality, 3 are good (G), 2 are fair (F), and 1 is poor (P); the sequence received was EEEEEEEEEEEEEEGGGFFP. Let the random variables X_1, X_2, X_3, and X_4 denote the number of bits that are E, G, F, and P, respectively, in a transmission of 20 bits. With class probabilities 0.6, 0.3, 0.08, and 0.02, what is the probability that 12 bits are E, 6 bits are G, 2 are F, and 0 are P?
P(X_1 = 12, X_2 = 6, X_3 = 2, X_4 = 0) = [20!/(12! 6! 2! 0!)] 0.6^12 0.3^6 0.08^2 0.02^0 ≈ 0.0358
Using Excel: = (FACT(20)/(FACT(12)*FACT(6)*FACT(2))) * 0.6^12 * 0.3^6 * 0.08^2
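The same computation in code, mirroring the Excel formula on the slide (class probabilities 0.6, 0.3, 0.08, 0.02):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    # n!/(x1! x2! ... xk!) * p1^x1 * p2^x2 * ... * pk^xk
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)
    prob = float(coef)
    for x, pr in zip(counts, probs):
        prob *= pr ** x
    return prob

p = multinomial_pmf([12, 6, 2, 0], [0.6, 0.3, 0.08, 0.02])
# p ≈ 0.0358
```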
41 Multinomial Mean and Variance
The marginal distributions of the multinomial are binomial. If X_1, X_2, ..., X_k have a multinomial distribution, the marginal probability distribution of X_i is binomial with
E(X_i) = np_i and V(X_i) = np_i(1 - p_i)
42 Bivariate Normal Probability Density Function
The probability density function of a bivariate normal distribution is
f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ) = [1/(2π σ_X σ_Y √(1 - ρ²))] exp{ -u / [2(1 - ρ²)] }
where
u = (x - μ_X)²/σ_X² - 2ρ(x - μ_X)(y - μ_Y)/(σ_X σ_Y) + (y - μ_Y)²/σ_Y²
for -∞ < x < ∞ and -∞ < y < ∞.
Parameter limits: σ_X > 0, σ_Y > 0, -∞ < μ_X < ∞, -∞ < μ_Y < ∞, -1 < ρ < 1.
43 Marginal Distributions of the Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density function f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the marginal probability distributions of X and Y are normal with means μ_X and μ_Y and standard deviations σ_X and σ_Y, respectively.
44 Conditional Distribution of Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the conditional probability distribution of Y given X = x is normal with mean and variance
μ_{Y|x} = μ_Y + ρ (σ_Y/σ_X)(x - μ_X)
σ²_{Y|x} = σ_Y² (1 - ρ²)
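These two formulas are simple enough to wrap in a helper. The sketch below is illustrative (the function name and the example parameters are my own, not the book's):

```python
def bvn_conditional(x, mu_x, mu_y, sd_x, sd_y, rho):
    """Mean and variance of Y given X = x for a bivariate normal."""
    mean = mu_y + rho * (sd_y / sd_x) * (x - mu_x)
    var = sd_y ** 2 * (1 - rho ** 2)
    return mean, var

# With rho = 0.5, sd_x = 1, sd_y = 2: observing x one unit above its mean
# pulls the conditional mean of Y up by 1, and cuts the variance from 4 to 3
m, v = bvn_conditional(x=1.0, mu_x=0.0, mu_y=0.0, sd_x=1.0, sd_y=2.0, rho=0.5)
```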
45 Correlation of Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density function f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the correlation between X and Y is ρ.
46 Bivariate Normal Correlation and Independence
In general, zero correlation does not imply independence. But in the special case that X and Y have a bivariate normal distribution, if ρ = 0, then X and Y are independent.
47 Linear Functions of Random Variables
A function of random variables is itself a random variable. A function of random variables can be formed by either linear or nonlinear relationships. We limit our discussion here to linear functions.
Given random variables X_1, X_2, ..., X_p and constants c_1, c_2, ..., c_p,
Y = c_1 X_1 + c_2 X_2 + ... + c_p X_p
is a linear combination of X_1, X_2, ..., X_p.
48 Mean and Variance of a Linear Function
If X_1, X_2, ..., X_p are random variables and Y = c_1 X_1 + c_2 X_2 + ... + c_p X_p, then
E(Y) = c_1 E(X_1) + c_2 E(X_2) + ... + c_p E(X_p)
V(Y) = c_1² V(X_1) + c_2² V(X_2) + ... + c_p² V(X_p) + 2 Σ_{i<j} c_i c_j cov(X_i, X_j)
If X_1, X_2, ..., X_p are independent, the covariance terms vanish and
V(Y) = c_1² V(X_1) + c_2² V(X_2) + ... + c_p² V(X_p)
49 Example 5-31: Error Propagation
A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product?
Assuming the layer thicknesses are independent, the total thickness is X_1 + X_2 + X_3 and the variances add:
V(X_1 + X_2 + X_3) = 25 + 40 + 30 = 95 nm²
50 Mean and Variance of an Average
If X̄ = (X_1 + X_2 + ... + X_p)/p with E(X_i) = μ for each i, then
E(X̄) = μ
If, in addition, X_1, X_2, ..., X_p are independent with V(X_i) = σ² for each i, then
V(X̄) = σ²/p
51 Reproductive Property of the Normal Distribution
If X_1, X_2, ..., X_p are independent normal random variables with E(X_i) = μ_i and V(X_i) = σ_i², then
Y = c_1 X_1 + c_2 X_2 + ... + c_p X_p
is a normal random variable with
E(Y) = c_1 μ_1 + c_2 μ_2 + ... + c_p μ_p and V(Y) = c_1² σ_1² + c_2² σ_2² + ... + c_p² σ_p²
52 Example 5-32: Linear Function of Independent Normal Random Variables
Let the random variables X_1 and X_2 denote the length and width of a manufactured part, independent normal random variables with means 2 cm and 5 cm and standard deviations as tabled. What is the probability that the perimeter exceeds 14.5 cm?
Let Y = 2X_1 + 2X_2 = perimeter.
E(Y) = 2E(X_1) + 2E(X_2) = 2(2) + 2(5) = 14 cm
V(Y) = 2² V(X_1) + 2² V(X_2) = 0.2 cm², so SD(Y) = √0.2 cm
P(Y > 14.5) = P(Z > (14.5 - 14)/√0.2) ≈ 0.13
Using Excel: = 1 - NORMDIST(14.5, 14, SQRT(0.2), TRUE)
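The Excel call can be reproduced with the standard library using the complementary error function. A sketch, not part of the slides:

```python
import math

def normal_sf(x, mu, sigma):
    # Upper-tail probability P(X > x) for a normal distribution via erfc
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

p = normal_sf(14.5, mu=14.0, sigma=math.sqrt(0.2))
# p ≈ 0.13
```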
53 General Function of a Discrete Random Variable
Suppose that X is a discrete random variable with probability distribution f_X(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability mass function of the random variable Y is
f_Y(y) = f_X[u(y)]
54 Example 5-34: Function of a Discrete Random Variable
Let X be a geometric random variable with probability distribution f_X(x) = p(1 - p)^{x-1}, x = 1, 2, .... Find the probability distribution of Y = X².
Solution: Since X ≥ 1, the transformation is one-to-one. The inverse transform function is x = √y, so
f_Y(y) = p(1 - p)^{√y - 1}, y = 1, 4, 9, 16, ...
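Since the transformation only relabels the support, the transformed pmf must still sum to 1. A quick check (the value p = 0.3 is an arbitrary choice for illustration):

```python
p = 0.3   # arbitrary success probability for illustration

def f_Y(y):
    # pmf of Y = X^2: nonzero only at perfect squares y = 1, 4, 9, ...
    x = round(y ** 0.5)
    if x < 1 or x * x != y:
        return 0.0
    return p * (1 - p) ** (x - 1)

total = sum(f_Y(k * k) for k in range(1, 200))
# total ≈ 1 (geometric series; the truncated tail is negligible)
```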
55 General Function of a Continuous Random Variable
Suppose that X is a continuous random variable with probability distribution f_X(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability distribution of Y is
f_Y(y) = f_X[u(y)] |J|
where J = u′(y) is called the Jacobian of the transformation and the absolute value of J is used.
56 Example 5-35: Function of a Continuous Random Variable
Let X be a continuous random variable with probability distribution f_X(x) = x/8 for 0 ≤ x ≤ 4. Find the probability distribution of Y = h(X) = 2X + 4.
Note that Y has a one-to-one relationship to X.
x = u(y) = (y - 4)/2 and the Jacobian is J = u′(y) = 1/2
f_Y(y) = f_X[(y - 4)/2] |J| = [((y - 4)/2)/8] (1/2) = (y - 4)/32 for 4 ≤ y ≤ 12
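A numeric check that the transformed density is a valid PDF (an addition, not from the text):

```python
def f_Y(y):
    # Transformed density from Example 5-35
    return (y - 4.0) / 32.0 if 4.0 <= y <= 12.0 else 0.0

# Trapezoid rule over the support [4, 12]
n = 100000
h = 8.0 / n
total = (0.5 * (f_Y(4.0) + f_Y(12.0)) + sum(f_Y(4.0 + i * h) for i in range(1, n))) * h
# total ≈ 1
```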
57 Definition of Moments about the Origin
The rth moment about the origin of the random variable X is
μ′_r = E(X^r) = Σ_x x^r f(x)  (X discrete)
μ′_r = E(X^r) = ∫_{-∞}^{∞} x^r f(x) dx  (X continuous)
58 Definition of a Moment-Generating Function
The moment-generating function of the random variable X is the expected value of e^{tX} and is denoted by M_X(t). That is,
M_X(t) = E(e^{tX}) = Σ_x e^{tx} f(x)  (X discrete)
M_X(t) = E(e^{tX}) = ∫_{-∞}^{∞} e^{tx} f(x) dx  (X continuous)
Let X be a random variable with moment-generating function M_X(t). Then the moments are obtained by differentiation:
μ′_r = E(X^r) = d^r M_X(t)/dt^r evaluated at t = 0
59 Example 5-36: Moment-Generating Function for a Binomial Random Variable (1)
Let X follow a binomial distribution, that is,
f(x) = C(n, x) p^x (1 - p)^{n-x}, x = 0, 1, ..., n
Determine the moment-generating function and use it to verify that the mean and variance of the binomial random variable are μ = np and σ² = np(1 - p).
The moment-generating function is
M_X(t) = Σ_{x=0}^{n} e^{tx} C(n, x) p^x (1 - p)^{n-x} = Σ_{x=0}^{n} C(n, x) (p e^t)^x (1 - p)^{n-x}
which is the binomial expansion of [p e^t + (1 - p)]^n. The first and second derivatives are
M′_X(t) = n p e^t [1 + p(e^t - 1)]^{n-1}
M″_X(t) = n p e^t (1 - p + n p e^t) [1 + p(e^t - 1)]^{n-2}
60 Example 5-36: Moment-Generating Function for a Binomial Random Variable (2)
If we set t = 0 in the two derivatives above, we get
μ′_1 = M′_X(0) = np and μ′_2 = M″_X(0) = np(1 - p + np)
The variance is then
σ² = μ′_2 - (μ′_1)² = np(1 - p + np) - (np)² = np - np² = np(1 - p)
Hence, the mean is np and the variance is np(1 - p).
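The derivative computations can be double-checked by numerically differentiating M_X(t) = [p e^t + (1 - p)]^n at t = 0. This is a sketch with arbitrary parameters n = 20, p = 0.3 (so np = 6 and np(1 - p) = 4.2):

```python
import math

n, p = 20, 0.3

def M(t):
    # Binomial moment-generating function
    return (p * math.exp(t) + 1 - p) ** n

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # central difference ≈ M'(0) = np
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # second difference ≈ M''(0) = np(1 - p + np)
var = m2 - m1 ** 2                        # ≈ np(1 - p)
```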
61 Properties of Moment-Generating Functions
If X is a random variable and a is a constant, then
(1) M_{X+a}(t) = e^{at} M_X(t)
(2) M_{aX}(t) = M_X(at)
If X_1, X_2, ..., X_n are independent random variables with moment-generating functions M_{X_1}(t), M_{X_2}(t), ..., M_{X_n}(t), respectively, and if Y = X_1 + X_2 + ... + X_n, then the moment-generating function of Y is
M_Y(t) = M_{X_1}(t) · M_{X_2}(t) ··· M_{X_n}(t)
62 Example 5-38: Distribution of a Sum of Poisson Random Variables
Suppose that X_1 and X_2 are two independent Poisson random variables with parameters λ_1 and λ_2, respectively. Determine the probability distribution of Y = X_1 + X_2.
The moment-generating function of a Poisson random variable with parameter λ is
M_X(t) = e^{λ(e^t - 1)}
Hence, for X_1 and X_2,
M_{X_1}(t) = e^{λ_1(e^t - 1)} and M_{X_2}(t) = e^{λ_2(e^t - 1)}
Using M_Y(t) = M_{X_1}(t) · M_{X_2}(t), the moment-generating function of Y = X_1 + X_2 is
M_Y(t) = e^{λ_1(e^t - 1)} e^{λ_2(e^t - 1)} = e^{(λ_1 + λ_2)(e^t - 1)}
which is the moment-generating function of a Poisson random variable with parameter λ_1 + λ_2. Hence Y is Poisson with parameter λ_1 + λ_2.
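The MGF conclusion can be verified directly: convolving two Poisson pmfs must reproduce the Poisson pmf with the summed parameter. A sketch (the λ values are chosen arbitrarily):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 1.5, 2.5
diffs = []
for k in range(12):
    # P(Y = k) as a convolution of the two independent Poisson pmfs
    conv = sum(poisson_pmf(j, lam1) * poisson_pmf(k - j, lam2) for j in range(k + 1))
    diffs.append(abs(conv - poisson_pmf(k, lam1 + lam2)))
# every difference is at floating-point noise level
```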
63 Important Terms & Concepts for Chapter 5
Bivariate distribution; Bivariate normal distribution; Conditional mean; Conditional probability density function; Conditional probability mass function; Conditional variance; Contour plots; Correlation; Covariance; Error propagation; General functions of random variables; Independence; Joint probability density function; Joint probability mass function; Linear functions of random variables; Marginal probability distribution; Multinomial distribution; Reproductive property of the normal distribution
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationIE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes.
Closed book and notes. 60 minutes. A summary table of some univariate continuous distributions is provided. Four Pages. In this version of the Key, I try to be more complete than necessary to receive full
More informationStatistics, Data Analysis, and Simulation SS 2015
Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the
More informationDistributions of Functions of Random Variables. 5.1 Functions of One Random Variable
Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique
More informationIntroduction to Probability and Stocastic Processes - Part I
Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark
More informationMultivariate Distributions CIVL 7012/8012
Multivariate Distributions CIVL 7012/8012 Multivariate Distributions Engineers often are interested in more than one measurement from a single item. Multivariate distributions describe the probability
More informationMath 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is
Math 416 Lecture 3 Expected values The average or mean or expected value of x 1, x 2, x 3,..., x n is x 1 x 2... x n n x 1 1 n x 2 1 n... x n 1 n 1 n x i p x i where p x i 1 n is the probability of x i
More informationEE4601 Communication Systems
EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two
More information5. Conditional Distributions
1 of 12 7/16/2009 5:36 AM Virtual Laboratories > 3. Distributions > 1 2 3 4 5 6 7 8 5. Conditional Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an
More informationCS145: Probability & Computing
CS45: Probability & Computing Lecture 0: Continuous Bayes Rule, Joint and Marginal Probability Densities Instructor: Eli Upfal Brown University Computer Science Figure credits: Bertsekas & Tsitsiklis,
More informationStatistics and data analyses
Statistics and data analyses Designing experiments Measuring time Instrumental quality Precision Standard deviation depends on Number of measurements Detection quality Systematics and methology σ tot =
More informationCDA5530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables
CDA5530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Definition Random variable (R.V.) X: A function on sample space X: S R Cumulative distribution function
More information2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).
Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationChapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results
More informationUCSD ECE153 Handout #27 Prof. Young-Han Kim Tuesday, May 6, Solutions to Homework Set #5 (Prepared by TA Fatemeh Arbabjolfaei)
UCSD ECE53 Handout #7 Prof. Young-Han Kim Tuesday, May 6, 4 Solutions to Homework Set #5 (Prepared by TA Fatemeh Arbabjolfaei). Neural net. Let Y = X + Z, where the signal X U[,] and noise Z N(,) are independent.
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan
More informationElements of Probability Theory
Short Guides to Microeconometrics Fall 2016 Kurt Schmidheiny Unversität Basel Elements of Probability Theory Contents 1 Random Variables and Distributions 2 1.1 Univariate Random Variables and Distributions......
More informationRandom Variables. P(x) = P[X(e)] = P(e). (1)
Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationSolutions to Homework Set #5 (Prepared by Lele Wang) MSE = E [ (sgn(x) g(y)) 2],, where f X (x) = 1 2 2π e. e (x y)2 2 dx 2π
Solutions to Homework Set #5 (Prepared by Lele Wang). Neural net. Let Y X + Z, where the signal X U[,] and noise Z N(,) are independent. (a) Find the function g(y) that minimizes MSE E [ (sgn(x) g(y))
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationSolution to Assignment 3
The Chinese University of Hong Kong ENGG3D: Probability and Statistics for Engineers 5-6 Term Solution to Assignment 3 Hongyang Li, Francis Due: 3:pm, March Release Date: March 8, 6 Dear students, The
More informationData Analysis and Monte Carlo Methods
Lecturer: Allen Caldwell, Max Planck Institute for Physics & TUM Recitation Instructor: Oleksander (Alex) Volynets, MPP & TUM General Information: - Lectures will be held in English, Mondays 16-18:00 -
More informationSUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)
SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More informationMore than one variable
Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to
More informationCDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random
CDA6530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Definition Random variable (RV)X (R.V.) X: A function on sample space X: S R Cumulative distribution
More informationBMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution
Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random
More informationMath 416 Lecture 2 DEFINITION. Here are the multivariate versions: X, Y, Z iff P(X = x, Y = y, Z =z) = p(x, y, z) of X, Y, Z iff for all sets A, B, C,
Math 416 Lecture 2 DEFINITION. Here are the multivariate versions: PMF case: p(x, y, z) is the joint Probability Mass Function of X, Y, Z iff P(X = x, Y = y, Z =z) = p(x, y, z) PDF case: f(x, y, z) is
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationStatistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University
Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables
More informationBMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs
Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem
More informationLet X and Y denote two random variables. The joint distribution of these random
EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.
More informationFINAL EXAM: 3:30-5:30pm
ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.
More informationGeneral Random Variables
1/65 Chia-Ping Chen Professor Department of Computer Science and Engineering National Sun Yat-sen University Probability A general random variable is discrete, continuous, or mixed. A discrete random variable
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationWrite your Registration Number, Test Centre, Test Code and the Number of this booklet in the appropriate places on the answersheet.
2016 Booklet No. Test Code : PSA Forenoon Questions : 30 Time : 2 hours Write your Registration Number, Test Centre, Test Code and the Number of this booklet in the appropriate places on the answersheet.
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationSTAT 509 Section 3.4: Continuous Distributions. Probability distributions are used a bit differently for continuous r.v. s than for discrete r.v. s.
STAT 509 Section 3.4: Continuous Distributions Probability distributions are used a bit differently for continuous r.v. s than for discrete r.v. s. A continuous random variable is one for which the outcome
More information3-1. all x all y. [Figure 3.1]
- Chapter. Multivariate Distributions. All of the most interesting problems in statistics involve looking at more than a single measurement at a time, at relationships among measurements and comparisons
More informationContents 1. Contents
Contents 1 Contents 6 Distributions of Functions of Random Variables 2 6.1 Transformation of Discrete r.v.s............. 3 6.2 Method of Distribution Functions............. 6 6.3 Method of Transformations................
More informationChapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory
Chapter 1 Statistical Reasoning Why statistics? Uncertainty of nature (weather, earth movement, etc. ) Uncertainty in observation/sampling/measurement Variability of human operation/error imperfection
More informationStatistics for scientists and engineers
Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3
More informationExam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)
Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax
More informationMultiple Random Variables
Multiple Random Variables This Version: July 30, 2015 Multiple Random Variables 2 Now we consider models with more than one r.v. These are called multivariate models For instance: height and weight An
More informationFinal Exam # 3. Sta 230: Probability. December 16, 2012
Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets
More informationA Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.
A Probability Primer A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. Are you holding all the cards?? Random Events A random event, E,
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More information