C-N M406 Lecture Notes (part 4), based on Wackerly, Scheaffer and Mendenhall's Mathematical Stats with Applications (2002). B. A.

Lecture

Next we consider a topic that includes both discrete and continuous cases: the multivariate probability distribution. We now want to consider situations in which two or more r.v.s act in conjunction with each other. It is not hard to see the myriad of applications for this: temperatures on a hot plate, price predictions based on time and place, or any other probability based on a multidimensional vector. Of course, we will begin our study with the 2--3 dimensional function. This is a function whose domain is of dim = 2, whose range is of dim = 1, and whose overall graph is of dim = 3.

Definition 5.1 (p. 211) tells us that for discrete random variables Y1 and Y2, the joint probability function (jpf) is given by p(y1, y2) = P(Y1 = y1 and Y2 = y2) for (y1, y2) ∈ R^2. Theorem 5.1 says that any jpf p(y1, y2) has the following properties:

1. p(y1, y2) ≥ 0 for all (y1, y2) ∈ R^2.
2. Σ p(y1, y2) = 1, where the sum is over all (y1, y2) with nonzero probability.

Lecture

Now, this is good for discrete r.v.s, but Definition 5.2 (p. 212) covers discrete or continuous r.v.s: for any r.v.s Y1 and Y2, the joint cumulative distribution function (jcdf) is given by F(y1, y2) = P(Y1 ≤ y1 and Y2 ≤ y2) for (y1, y2) ∈ R^2. Definition 5.3 then says that if, for continuous r.v.s Y1 and Y2 with jcdf F(y1, y2), there exists a nonnegative function f(y1, y2) such that

F(y1, y2) = ∫_(-∞)^(y1) ∫_(-∞)^(y2) f(x1, x2) dx2 dx1 for (y1, y2) ∈ R^2,

then Y1 and Y2 are said to be jointly continuous r.v.s and f(y1, y2) is called the joint probability density function (jpdf). Note that in the continuous case the probability is represented by volume for the 2--3 dimensional jpdf. Remember that in the case of the 1--2 dimensional pdf, probability was represented by area. Is there a pattern here? Absolutely! The last dimension number of the graph gives the dimension of the probability.

Theorems 5.2 and 5.3 (p. 214) give some important properties of jcdf's. Using the extended real numbers (R_ext), for r.v.s Y1 and Y2 having jpdf f (in the jointly continuous case) and jcdf F (regardless), we see that:
1. F(-∞, -∞) = F(-∞, y2) = F(y1, -∞) = 0.
2. F(∞, ∞) = 1.
3. F(y1, y2) ≥ 0.

An interesting probability form is given here. For y1b ≥ y1a and y2b ≥ y2a (all ∈ R),

P(y1a < Y1 ≤ y1b and y2a < Y2 ≤ y2b) = F(y1b, y2b) − F(y1b, y2a) − F(y1a, y2b) + F(y1a, y2a).

An illustration of the r.v. space is helpful. It is easy to see that the value of F at the upper right corner of the rectangle must be decreased by each of the values at the upper left and lower right corners. Then the value at the lower left corner needs to be added back, since it has been discounted twice. Now, let's do some examples.
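Both Theorem 5.1 and the four-corner formula lend themselves to quick numeric checks. A minimal Python sketch (the discrete jpf below is a hypothetical example, not from the text; the continuous jcdf used for the rectangle check is F(y1, y2) = y1^2 y2^2, the jcdf derived for problem 5.6 in the next example):

```python
# A hypothetical discrete jpf, stored as {(y1, y2): probability}.
p = {
    (0, 0): 0.38, (0, 1): 0.14, (0, 2): 0.24,
    (1, 0): 0.17, (1, 1): 0.02, (1, 2): 0.05,
}
assert all(v >= 0 for v in p.values())       # Theorem 5.1, property 1
assert abs(sum(p.values()) - 1.0) < 1e-12    # Theorem 5.1, property 2

# Four-corner (rectangle) formula, checked against a direct calculation
# for the continuous jcdf F(y1, y2) = y1^2 * y2^2 on the unit square.
def F(y1, y2):
    clamp = lambda t: max(0.0, min(t, 1.0))  # outside (0,1)^2, use Thm 5.2
    return clamp(y1) ** 2 * clamp(y2) ** 2

def rect_prob(a1, b1, a2, b2):
    # P(a1 < Y1 <= b1 and a2 < Y2 <= b2) via the four corners of F
    return F(b1, b2) - F(b1, a2) - F(a1, b2) + F(a1, a2)

# For this product-form F the direct answer is (b1^2 - a1^2)(b2^2 - a2^2).
direct = (0.75 ** 2 - 0.25 ** 2) * (1.0 ** 2 - 0.5 ** 2)
```

Here `rect_prob(0.25, 0.75, 0.5, 1.0)` and `direct` agree (both equal .375), illustrating why the lower-left corner must be added back exactly once.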

Ex. In problem 5.6 (p. 219) we have the jpdf f(y1, y2) = k y1 y2 × I((y1, y2) ∈ (0, 1)^2). In this instance, the region (0, 1)^2 is called the support of the jpdf: the set on which the function is positive. Integration of f over the support yields

∫_0^1 ∫_0^1 k y1 y2 dy1 dy2 = k/4 = 1  =>  k = 4  =>  f(y1, y2) = 4 y1 y2 × I((y1, y2) ∈ (0, 1)^2).

To find the jcdf, we really only need to find the functional value on the support and utilize the properties of Theorems 5.2 and 5.3. So, for the vector (y1, y2) ∈ (0, 1)^2,

∫_0^(y1) ∫_0^(y2) 4 x1 x2 dx2 dx1 = 4 (y1^2/2)(y2^2/2) = y1^2 y2^2.

So the final result is

F(y1, y2) = y1^2 × I((y1, y2) ∈ (0, 1) × [1, ∞)) + y2^2 × I((y1, y2) ∈ [1, ∞) × (0, 1)) + y1^2 y2^2 × I((y1, y2) ∈ (0, 1)^2) + I((y1, y2) ∈ [1, ∞)^2).

Whew!!! I'm glad we have indicator functions! Here's a graph of F, the cumulative distribution function, illustrated in light brown (right). So P(Y1 < .5 and Y2 < .75) = F(.5, .75) = .5^2 × .75^2 = .1406. []

Ex. In problem 5.10 (p. 220) we have the vector (Y1, Y2) representing proportions of component chemicals in an insecticide, and their jpdf is given by f(y1, y2) = 2 × I((y1, y2) ∈ S), where S is the triangular support {(y1, y2): y1 ≥ 0, y2 ≥ 0, y1 + y2 ≤ 1}. f is illustrated (right) with the support in light orange; the actual jpdf, viewed from the third quadrant of R^2, is a slab of height 2 over S. We are asked to find P(Y1 ≤ .75 and Y2 ≤ .75) and P(Y1 ≤ .5 and Y2 ≤ .5). We'll start by locating the vector endpoint (.75, .75) in R^2. Next we'll use the symmetry of the support and integrate accordingly:

P(Y1 ≤ .75 and Y2 ≤ .75) = F(.75, .75) = 1 − 2 ∫_(.75)^1 ∫_0^(1−y1) 2 dy2 dy1 = 1 − 2 ∫_(.75)^1 (2y2 ]_0^(1−y1)) dy1 = 1 − 2 ∫_(.75)^1 (2 − 2y1) dy1 = 1 − 2(.0625) = .875.

The second probability is found by noting that the vector endpoint (.5, .5) lies on the boundary of S, so the volume determined by F(.5, .5) is actually half of the total volume (1 cu). Hence P(Y1 ≤ .5 and Y2 ≤ .5) = .5. Consider what this means in terms of the problem, and the nature of the jcdf F. []
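Both worked answers can be vetted with a crude midpoint (Riemann-sum) integration; a sketch, where the grid size n = 400 is an arbitrary choice:

```python
n = 400
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]  # midpoints of a grid on (0, 1)

# Problem 5.6: the total mass of y1*y2 on (0,1)^2 is 1/4, so k = 4.
mass = sum(a * b for a in pts for b in pts) * h * h
k = 1.0 / mass

# Its jcdf at (.5, .75): F = y1^2 * y2^2 = .1406...
F_val = 0.5 ** 2 * 0.75 ** 2

# Problem 5.10: f = 2 on the triangle y1 + y2 <= 1; P(Y1 <= .75, Y2 <= .75).
p = sum(2.0 for a in pts for b in pts
        if a + b <= 1 and a <= 0.75 and b <= 0.75) * h * h
```

The grid sum reproduces k = 4 essentially exactly (the midpoint rule is exact for linear integrands), and p lands within the grid's O(1/n) boundary error of .875.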

Student Problem set H

H1. Given here is the joint probability function associated with data obtained in a study of automobile accidents in which a child (under age 5) was in the car and at least one fatality occurred. Probabilities are given in gold. Specifically, the study focused on whether or not the child survived and what type of seat belt (if any) he or she used. Let

Y1 = 0 if the child survived, 1 if not;
Y2 = 0 if no belt was used, 1 if an adult belt was used, 2 if a car-seat belt was used.

Note that Y1 is the number of fatalities per child and, since children's car seats use two belts, Y2 is the number of seat belts in use at the time of the accident.

a) Verify that the preceding probability function satisfies Theorem 5.1.
b) Find F(1, 2). What is the interpretation of this value?
c) Find F(0, 2). What is the interpretation of this value?
d) Find P(Y1 = 0 | Y2 = 2) and interpret this result.

H2. Suppose that the random variables Y1 and Y2 have joint probability density function f(y1, y2) given by

f(y1, y2) = 6 y1^2 y2 for 0 ≤ y1 ≤ y2 and y1 + y2 ≤ 2; 0 elsewhere.

a) Verify that this is a valid joint density function.
b) What is P(Y1 + Y2 ≤ 1)?

H3. Use the joint probability function from H1 regarding the correlation between car accident fatalities and the use of seat belts.
a) Give the marginal probability functions for Y1 and Y2.
b) Give the conditional probability function for (Y2 | Y1 = 0).
c) What is the probability that a child survived given that she wore a car-seat belt?

H4. Let Y1 and Y2 ~ iid U(0, 1), and let 3 random observations be taken from each variable. Let Y1(max) and Y2(max) denote the maximum observation for each variable. Find the jpdf and jcdf for the vector <Y1(max), Y2(max)>.

Student Answer Set H

H1. a) Notice that all of the probabilities are at least 0 and sum to 1.
b) Note F(1, 2) = P(Y1 ≤ 1, Y2 ≤ 2) = 1. The interpretation of this value is that every child in the experiment either survived or didn't, and used either 0, 1, or 2 seat belts.
H2. a) Clearly, f(y1, y2) ≥ 0.
Then we take the double integral of the joint density function with y1 running from 0 to 1 and y2 from y1 to 2 − y1. Doing so gives 1, meaning that the joint density function is valid.
b) P(Y1 + Y2 < 1) is the double integral of the joint density function with y1 from 0 to ½ and y2 from y1 to 1 − y1, which gives 1/8 − 6/64 = 1/32 = .03125.
H3. a) P(Y1 = 0) = .76, P(Y1 = 1) = .24; P(Y2 = 0) = .55, P(Y2 = 1) = .16, P(Y2 = 2) = .29.
b) P(Y2 = 0 | Y1 = 0) = P(Y2 = 0, Y1 = 0)/P(Y1 = 0) = .38/.76 = .50; P(Y2 = 1 | Y1 = 0) = .18; P(Y2 = 2 | Y1 = 0) = .32.

c) The desired probability is P(Y1 = 0 | Y2 = 2) = .24/.29 ≈ .83. []

Lecture

Now that we have mastered the basics of multidimensional probability, we are ready for some extensions into conditional probability. First, some definitions, which we shall present in the continuous case (the discrete cases can be defined in the appropriate way). Definition 5.4 (p. 223) tells us that if Y1 and Y2 are jointly continuous random variables with jpdf f(y1, y2), then the marginal density functions for Y1 and Y2 respectively are

f1(y1) = ∫_(-∞)^∞ f(y1, y2) dy2 and f2(y2) = ∫_(-∞)^∞ f(y1, y2) dy1.

It is interesting to note again that the dimensionality is consistent. In the case of the 1--2 dimensional pdf, probability = area, and the height of the pdf is meaningless because it represents a line segment, which has 0 area. Here, in the 2--3 dimensional case, we see that probability = volume, and the value of the marginal pdf is given by area (having 0 volume), but serving the same purpose. In the discrete case, this can be seen quite well in a contingency table. Recall that in that case, using the jpmf, we had p(y1 | y2) = p(y1, y2)/p(y2) for p(y2) > 0. Well, the same type of thing occurs for jointly continuous r.v.s. Definition 5.6 (p. 227) says that if Y1 and Y2 have jpdf f(y1, y2), then the conditional distribution function of Y1 given Y2 is F(y1 | y2) = P(Y1 ≤ y1 | Y2 = y2) (note that y2 is a constant). Similarly, Definition 5.7 (p. 228) gives the conditional density function for Y1 given Y2 by f(y1 | y2) = f(y1, y2)/f2(y2), defined only for y2 such that f2(y2) > 0. Clearly, we must now pay close attention to our indicator functions.

Ex. Problem 5.22 (p. 230) gives us an excellent starting point. Here, recall from problem 5.6 that f(y1, y2) = 4 y1 y2 × I((y1, y2) ∈ (0, 1)^2). We will proceed to find the marginal and conditional densities for f:

f1(y1) = ∫_0^1 4 y1 y2 dy2 = 4 y1 (y2^2/2 ]_0^1) = 2 y1 × I(y1 ∈ (0, 1)), and
f2(y2) = ∫_0^1 4 y1 y2 dy1 = 4 y2 (y1^2/2 ]_0^1) = 2 y2 × I(y2 ∈ (0, 1)).
From this we can obtain

P(Y1 < .5 | Y2 > .75) = P(Y1 < .5 and Y2 > .75)/P(Y2 > .75) = (∫_(.75)^1 ∫_0^(.5) 4 y1 y2 dy1 dy2)/(∫_(.75)^1 2 y2 dy2) = .1094/.4375 = .25.

The conditional densities are now easy to obtain:

f(y1 | y2) = f(y1, y2)/f2(y2) = (4 y1 y2 × I((y1, y2) ∈ (0, 1)^2))/(2 y2 × I(y2 ∈ (0, 1))) = 2 y1 × (I(y1 ∈ (0, 1)) × I(y2 ∈ (0, 1)))/I(y2 ∈ (0, 1)) = 2 y1 × I((y1, y2) ∈ (0, 1)^2) (why?).

Similarly, f(y2 | y1) = f(y1, y2)/f1(y1) = 2 y2 × I((y1, y2) ∈ (0, 1)^2). We are now asked to find P(Y1 ≤ .75 | Y2 = .5). This is given by

∫_(-∞)^(.75) f(y1 | y2) dy1 = ∫_0^(.75) 2 y1 dy1 = .5625.

The moral of the story, again, is that indicator functions are terrific! []
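Because this density factors, both conditional answers reduce to one-dimensional calculus; a quick arithmetic check of the values computed above:

```python
# P(Y1 < .5 and Y2 > .75): with f = 4*y1*y2 on (0,1)^2 this splits as
# (integral of 2*y1 from 0 to .5) * (integral of 2*y2 from .75 to 1).
joint = (0.5 ** 2) * (1 - 0.75 ** 2)   # .25 * .4375 = .109375
marginal = 1 - 0.75 ** 2               # P(Y2 > .75) = .4375
cond_on_event = joint / marginal       # conditioning on an event -> .25

# P(Y1 <= .75 | Y2 = .5): integral of f(y1 | y2) = 2*y1 from 0 to .75.
cond_on_value = 0.75 ** 2              # conditioning on a value -> .5625
```

Note the two different kinds of conditioning: on an event (a ratio of probabilities) versus on an exact value (an integral of the conditional density).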

Ex. In problem 5.28 we have that Y1 and Y2 are jointly continuous and the jpdf is given by f(y1, y2) = 6 y1^2 y2 × I((y1, y2) ∈ S), where S = {(y1, y2): 0 ≤ y1 ≤ y2 and y1 + y2 ≤ 2}. The marginal densities are found by

f1(y1) = ∫_(-∞)^∞ f(y1, y2) dy2 = (∫_(y1)^(2−y1) 6 y1^2 y2 dy2) × I(y1 ∈ (0, 1)) = 12 y1^2 (1 − y1) × I(y1 ∈ (0, 1))

(we see here that Y1 ~ β(3, 2)), and

f2(y2) = ∫_(-∞)^∞ f(y1, y2) dy1 = (∫_0^(y2) 6 y1^2 y2 dy1) × I(y2 ∈ (0, 1)) + (∫_0^(2−y2) 6 y1^2 y2 dy1) × I(y2 ∈ (1, 2)) = 2 y2^4 × I(y2 ∈ (0, 1)) + 2 y2 (2 − y2)^3 × I(y2 ∈ (1, 2)).

Note, again, that the marginals must be functions of the r.v. in question only. The conditional density of Y2 given Y1 = y1 is given by

f(y2 | y1) = f(y1, y2)/f1(y1) = (6 y1^2 y2 × I((y1, y2) ∈ S))/(12 y1^2 (1 − y1) × I(y1 ∈ (0, 1))) = (y2/(2(1 − y1))) × I((y1, y2) ∈ S).

We can now find P(Y2 < 1.1 | Y1 = .6), which is given by

∫_(-∞)^(1.1) f(y2 | y1) dy2 = ∫_(-∞)^(1.1) (y2/(2(1 − y1))) × I((y1, y2) ∈ S) dy2 = 1.25 × ∫_(.6)^(1.1) y2 dy2 = .53125. []

Lecture

Recall from earlier days in statistics that two events were deemed independent iff P(A ∩ B) = P(A) × P(B). Well, the same sort of thing applies here in a larger sense. Definition 5.8 (p. 233) tells us that if Y1 and Y2 are jointly continuous r.v.s having cdfs F1(y1) and F2(y2) respectively, then they are independent iff F(y1, y2) = F1(y1) × F2(y2) for all vectors (y1, y2) in R^2. Now, the astute observer will note that the units have a hard time matching up here, and would be correct in noting that; however, we must state that this relationship is purely numerical. If this is not the case, then they are said to be dependent. Note that our previous definition dealt with events being independent, whereas now we are concerned with r.v.s. Clearly, the same type of definition can be developed for discrete r.v.s. Theorem 5.4 (p. 234) tells us that we can take the issue down to the jpdfs: specifically, for the previously defined r.v.s, Y1 and Y2 are independent iff f(y1, y2) = f1(y1) × f2(y2) for all vectors (y1, y2) ∈ R^2. Theorem 5.5 (p. 236) gives an even easier way to determine independence.
It says that if Y1 and Y2 are jointly continuous r.v.s with jpdf f(y1, y2), then they are independent iff f can be written as the product of a nonnegative function of y1 alone and a nonnegative function of y2 alone. So to show independence we can use either of Theorems 5.4 and 5.5. But to show dependence, we need only demonstrate that f(y1, y2) ≠ f1(y1) × f2(y2) for some vector (y1, y2) ∈ R^2.

Ex. We now reconsider our favorite jpdf (from problem 5.6) in problem 5.44: f(y1, y2) = 4 y1 y2 × I((y1, y2) ∈ (0, 1)^2). Note from problem 5.22 that f(y1, y2) = 2 y1 I(y1 ∈ (0, 1)) × 2 y2 I(y2 ∈ (0, 1)) = f1(y1) × f2(y2).

Therefore, Y1 and Y2 are independent by Theorem 5.4. []

Lecture

Now here are some problems.

Student Problem set I

I1. In exercise 5.18, we proved that f(y1, y2) = I((y1, y2) ∈ S), with S the support described in I5 below, is a valid joint probability density function for Y1, the amount of pollutant per sample collected above the stack without the cleaning device, and Y2, the amount collected above the stack with the cleaner. Are the amounts of pollutants per sample collected with and without the cleaning device independent?

I2. Suppose that the random variables Y1 and Y2 have joint probability density function given by

f(y1, y2) = 6 y1^2 y2 for 0 ≤ y1 ≤ y2 and y1 + y2 ≤ 2; 0 elsewhere.

Show that Y1 and Y2 are dependent random variables.

I3. Three balanced coins are tossed independently. Let Y1 = # of heads, and let Y2 = the amount of money won on a side bet, where

Y2 = 1·I(head on coin 1) + 2·I(first head on coin 2) + 3·I(first head on coin 3) − 1·I(Y1 = 0).

a) Derive f2(y2).
b) Find P(Y1 = 3 | Y2 = 1).

I4. A radioactive particle is uniformly and randomly located on the unit square, so that f(y1, y2) = I((y1, y2) ∈ (0, 1)^2).
a) Find the marginal density functions f1(y1) and f2(y2).
b) Find the conditional density function f(y2 | y1).
c) Find P(Y2 < .5 | Y1 = .7).

I5. The following 2--3 function,

f(y1, y2) = 1 for 0 ≤ y1 ≤ 2, 0 ≤ y2 ≤ 1, and 2y2 ≤ y1; 0 elsewhere,

is a valid joint probability density function for Y1, the amount of pollutant per sample collected above the stack without the cleaning device, and for Y2, the amount collected above the stack with the cleaner.
a) If we consider the stack with a cleaner installed, find the probability that the amount of pollutant in a given sample will exceed .5.
b) Given that the amount of pollutant in a sample taken above the stack with the cleaner is observed to be .5, find the probability that the amount of pollutant exceeds 1.5 above the other stack (without the cleaner).

I6.
Suppose Y1 and Y2 denote the proportions of time during which employees I and II actually performed their assigned tasks during a workday. The joint density of Y1 and Y2 is given by f(y1, y2) = (y1 + y2) × I((y1, y2) ∈ (0, 1) × (0, 1)).
a) Find the marginal density functions for Y1 and Y2.
b) Find P(Y1 > .5 | Y2 > .5).
c) If employee II spends exactly 50% of the day working on assigned duties, find the probability that employee I spends more than 75% of the day working on similar duties.

I7. The number of defects per yard, Y, for a certain fabric is known to have a Poisson distribution with parameter λ. However, λ itself is a random variable, with probability density function given by f(λ) = e^(−λ) × I(λ ∈ [0, ∞)). Find the unconditional probability function for Y.

Student Solution Set I

I1. The variables Y1 and Y2 are dependent, as the range of y1 values on which f(y1, y2) is defined depends on y2.

I2. Dependent, as the range of y1 values on which f(y1, y2) is defined depends on y2. More rigorously, one could verify from the solution to exercise 5.28 that f(y2 | Y1 = y1) ≠ f2(y2).

I5. a) We want to integrate f2(y2) = 2(1 − y2) from .5 to 1, which gives us .25.
b) First, we determine that f(y1 | y2) = 1/(2(1 − y2)) for 2y2 ≤ y1 ≤ 2. Then, since y2 = 1/2, f(y1 | y2 = 1/2) = 1 for 1 ≤ y1 ≤ 2. Therefore, P(Y1 > 1.5 | Y2 = 1/2) = 2 − 1.5 = .5.

I6. a) f1(y1) = y1 + 1/2 and f2(y2) = y2 + 1/2.
b) P(Y1 > 1/2, Y2 > 1/2) = 3/8 and P(Y2 > 1/2) = 5/8. Thus, P(Y1 > 1/2 | Y2 > 1/2) = (6/16)/(5/8) = 3/5.
c) First, we consider that f(y1 | y2) = (y1 + y2)/(y2 + 1/2). Then P(Y1 > .75 | Y2 = .5) = ∫_(.75)^1 f(y1 | y2 = .5) dy1 = 11/32 = .34375.

I7. We have, for λ > 0, P(Y = y | λ) = λ^y e^(−λ)/y! and f(λ) = e^(−λ) × I(λ ∈ [0, ∞)). Thus,

p(y) = ∫_0^∞ P(Y = y | λ) f(λ) dλ = (1/2)^(y+1) for y = 0, 1, 2, .... []
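Solutions I6 and I7 can be spot-checked. The sketch below verifies the I6 arithmetic in closed form and approximates I7's unconditional pmf by seeded Monte Carlo (the sample size is an arbitrary choice; the Poisson sampler uses Knuth's multiplication method, which is a standard approach, not the text's):

```python
import math
import random

# --- I6: f(y1, y2) = y1 + y2 on (0,1)^2 ---
# b) P(Y1 > 1/2, Y2 > 1/2) over P(Y2 > 1/2).
joint = 2 * ((1 - 0.5 ** 2) / 2) * 0.5   # symmetry: 2 * int(y1) * width = 3/8
marg = (1 - 0.5 ** 2) / 2 + 0.5 * 0.5    # int of (y2 + 1/2), .5 to 1 = 5/8
b = joint / marg                         # = 3/5

# c) integral of f(y1 | y2 = .5) = y1 + .5 from .75 to 1.
c = (1 - 0.75 ** 2) / 2 + 0.5 * 0.25     # = 11/32

# --- I7: Y | lambda ~ Poisson(lambda), lambda ~ Exponential(1) ---
# The claimed unconditional pmf is p(y) = (1/2)^(y+1); check P(Y = 0) = 1/2.
def poisson(lam, rng):
    # Knuth's multiplication method for one Poisson(lam) draw
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= L:
            return k
        k += 1

rng = random.Random(0)                   # seeded for reproducibility
N = 100_000
zeros = sum(
    poisson(-math.log(1.0 - rng.random()), rng) == 0 for _ in range(N)
)
est = zeros / N                          # should be near 1/2
```

The Monte Carlo estimate of P(Y = 0) lands within a few standard errors of 1/2, in line with the geometric-type answer above.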

Lecture

Finally, it is worth noting that the role played by indicator functions is huge. If the support of a jpdf lends itself to a factorization of the type seen in the previous example, then the r.v.s have a good chance of being independent. If, on the other hand, the support looks like that of problem 5.28, then it is unlikely that the r.v.s are independent.
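This support heuristic can be made concrete with a small numeric sketch: the product-support density of problem 5.6 factors at every grid point, while problem 5.28's triangular support defeats any factorization (the grid points are an arbitrary choice):

```python
# Independent case (problem 5.6): f = 4*y1*y2 on (0,1)^2.
f = lambda a, b: 4 * a * b if (0 < a < 1 and 0 < b < 1) else 0.0
f1 = lambda a: 2 * a if 0 < a < 1 else 0.0
f2 = lambda b: 2 * b if 0 < b < 1 else 0.0
pts = [(i + 0.5) / 10 for i in range(10)]
factors = all(abs(f(a, b) - f1(a) * f2(b)) < 1e-12 for a in pts for b in pts)

# Dependent case (problem 5.28): g = 6*y1^2*y2 on 0 <= y1 <= y2 <= 2 - y1.
g = lambda a, b: 6 * a * a * b if (0 <= a <= b <= 2 - a) else 0.0
g1 = lambda a: 12 * a * a * (1 - a) if 0 < a < 1 else 0.0       # Beta(3, 2)
g2 = lambda b: (2 * b ** 4 if 0 < b < 1 else
                (2 * b * (2 - b) ** 3 if 1 <= b < 2 else 0.0))
# One point outside S where both marginals are positive breaks factorization:
broken = g(0.9, 0.2) == 0.0 and g1(0.9) * g2(0.2) > 0

# And the conditional computed for problem 5.28: P(Y2 < 1.1 | Y1 = .6).
p = (1.1 ** 2 - 0.6 ** 2) / (2 * 2 * (1 - 0.6))                 # = .53125
```

The point (.9, .2) lies outside the triangle, so g vanishes there while the product of marginals does not, which is exactly the "non-rectangular support" tell described above.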


More information

Probability. Lecture Notes. Adolfo J. Rumbos

Probability. Lecture Notes. Adolfo J. Rumbos Probability Lecture Notes Adolfo J. Rumbos October 20, 204 2 Contents Introduction 5. An example from statistical inference................ 5 2 Probability Spaces 9 2. Sample Spaces and σ fields.....................

More information

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

Outline. Probability. Math 143. Department of Mathematics and Statistics Calvin College. Spring 2010

Outline. Probability. Math 143. Department of Mathematics and Statistics Calvin College. Spring 2010 Outline Math 143 Department of Mathematics and Statistics Calvin College Spring 2010 Outline Outline 1 Review Basics Random Variables Mean, Variance and Standard Deviation of Random Variables 2 More Review

More information

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com 1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

Week 2: Review of probability and statistics

Week 2: Review of probability and statistics Week 2: Review of probability and statistics Marcelo Coca Perraillon University of Colorado Anschutz Medical Campus Health Services Research Methods I HSMP 7607 2017 c 2017 PERRAILLON ALL RIGHTS RESERVED

More information

Solutions to Homework Set #3 (Prepared by Yu Xiang) Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solutions to Homework Set #3 (Prepared by Yu Xiang) Let the random variable Y be the time to get the n-th packet. Find the pdf of Y. Solutions to Homework Set #3 (Prepared by Yu Xiang). Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0,t]. Suppose N(t) is Poisson with pmf p N

More information

STAT 516 Midterm Exam 2 Friday, March 7, 2008

STAT 516 Midterm Exam 2 Friday, March 7, 2008 STAT 516 Midterm Exam 2 Friday, March 7, 2008 Name Purdue student ID (10 digits) 1. The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

Chapter 3. Chapter 3 sections

Chapter 3. Chapter 3 sections sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional Distributions 3.7 Multivariate Distributions

More information

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS Recall that a continuous random variable X is a random variable that takes all values in an interval or a set of intervals. The distribution of a continuous

More information

1 Probability Distributions

1 Probability Distributions 1 Probability Distributions In the chapter about descriptive statistics sample data were discussed, and tools introduced for describing the samples with numbers as well as with graphs. In this chapter

More information

Statistics 100A Homework 5 Solutions

Statistics 100A Homework 5 Solutions Chapter 5 Statistics 1A Homework 5 Solutions Ryan Rosario 1. Let X be a random variable with probability density function a What is the value of c? fx { c1 x 1 < x < 1 otherwise We know that for fx to

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Math 416 Lecture 2 DEFINITION. Here are the multivariate versions: X, Y, Z iff P(X = x, Y = y, Z =z) = p(x, y, z) of X, Y, Z iff for all sets A, B, C,

Math 416 Lecture 2 DEFINITION. Here are the multivariate versions: X, Y, Z iff P(X = x, Y = y, Z =z) = p(x, y, z) of X, Y, Z iff for all sets A, B, C, Math 416 Lecture 2 DEFINITION. Here are the multivariate versions: PMF case: p(x, y, z) is the joint Probability Mass Function of X, Y, Z iff P(X = x, Y = y, Z =z) = p(x, y, z) PDF case: f(x, y, z) is

More information

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay 1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

18.05 Exam 1. Table of normal probabilities: The last page of the exam contains a table of standard normal cdf values.

18.05 Exam 1. Table of normal probabilities: The last page of the exam contains a table of standard normal cdf values. Name 18.05 Exam 1 No books or calculators. You may have one 4 6 notecard with any information you like on it. 6 problems, 8 pages Use the back side of each page if you need more space. Simplifying expressions:

More information

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2 IEOR 316: Introduction to Operations Research: Stochastic Models Professor Whitt SOLUTIONS to Homework Assignment 2 More Probability Review: In the Ross textbook, Introduction to Probability Models, read

More information

Raquel Prado. Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010

Raquel Prado. Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010 Raquel Prado Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010 Final Exam (Type B) The midterm is closed-book, you are only allowed to use one page of notes and a calculator.

More information

9. DISCRETE PROBABILITY DISTRIBUTIONS

9. DISCRETE PROBABILITY DISTRIBUTIONS 9. DISCRETE PROBABILITY DISTRIBUTIONS Random Variable: A quantity that takes on different values depending on chance. Eg: Next quarter s sales of Coca Cola. The proportion of Super Bowl viewers surveyed

More information

STAT 201 Chapter 5. Probability

STAT 201 Chapter 5. Probability STAT 201 Chapter 5 Probability 1 2 Introduction to Probability Probability The way we quantify uncertainty. Subjective Probability A probability derived from an individual's personal judgment about whether

More information

Continuous-Valued Probability Review

Continuous-Valued Probability Review CS 6323 Continuous-Valued Probability Review Prof. Gregory Provan Department of Computer Science University College Cork 2 Overview Review of discrete distributions Continuous distributions 3 Discrete

More information

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100]

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100] HW7 Solutions. 5 pts.) James Bond James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from from base A to base B, distance km apart. Now suppose the plane takes off from

More information

Topic 5 Part 3 [257 marks]

Topic 5 Part 3 [257 marks] Topic 5 Part 3 [257 marks] Let 0 3 A = ( ) and 2 4 4 0 B = ( ). 5 1 1a. AB. 1b. Given that X 2A = B, find X. The following table shows the probability distribution of a discrete random variable X. 2a.

More information

Topic 4: Continuous random variables

Topic 4: Continuous random variables Topic 4: Continuous random variables Course 3, 216 Page Continuous random variables Definition (Continuous random variable): An r.v. X has a continuous distribution if there exists a non-negative function

More information

Probability, Random Processes and Inference

Probability, Random Processes and Inference INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx

More information

Probability and Probability Distributions. Dr. Mohammed Alahmed

Probability and Probability Distributions. Dr. Mohammed Alahmed Probability and Probability Distributions 1 Probability and Probability Distributions Usually we want to do more with data than just describing them! We might want to test certain specific inferences about

More information

ECO220Y Continuous Probability Distributions: Uniform and Triangle Readings: Chapter 9, sections

ECO220Y Continuous Probability Distributions: Uniform and Triangle Readings: Chapter 9, sections ECO220Y Continuous Probability Distributions: Uniform and Triangle Readings: Chapter 9, sections 9.8-9.9 Fall 2011 Lecture 8 Part 1 (Fall 2011) Probability Distributions Lecture 8 Part 1 1 / 19 Probability

More information

Lecture Notes 3 Multiple Random Variables. Joint, Marginal, and Conditional pmfs. Bayes Rule and Independence for pmfs

Lecture Notes 3 Multiple Random Variables. Joint, Marginal, and Conditional pmfs. Bayes Rule and Independence for pmfs Lecture Notes 3 Multiple Random Variables Joint, Marginal, and Conditional pmfs Bayes Rule and Independence for pmfs Joint, Marginal, and Conditional pdfs Bayes Rule and Independence for pdfs Functions

More information

Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras

Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras (Refer Slide Time: 00:23) Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras Lecture - 22 Independent

More information

Recap of Basic Probability Theory

Recap of Basic Probability Theory 02407 Stochastic Processes Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk

More information

Recap of Basic Probability Theory

Recap of Basic Probability Theory 02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk

More information

Normal Random Variables and Probability

Normal Random Variables and Probability Normal Random Variables and Probability An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2015 Discrete vs. Continuous Random Variables Think about the probability of selecting

More information

Lecture 13: Conditional Distributions and Joint Continuity Conditional Probability for Discrete Random Variables

Lecture 13: Conditional Distributions and Joint Continuity Conditional Probability for Discrete Random Variables EE5110: Probability Foundations for Electrical Engineers July-November 015 Lecture 13: Conditional Distributions and Joint Continuity Lecturer: Dr. Krishna Jagannathan Scribe: Subrahmanya Swamy P 13.1

More information

X = X X n, + X 2

X = X X n, + X 2 CS 70 Discrete Mathematics for CS Fall 2003 Wagner Lecture 22 Variance Question: At each time step, I flip a fair coin. If it comes up Heads, I walk one step to the right; if it comes up Tails, I walk

More information

ST 371 (IX): Theories of Sampling Distributions

ST 371 (IX): Theories of Sampling Distributions ST 371 (IX): Theories of Sampling Distributions 1 Sample, Population, Parameter and Statistic The major use of inferential statistics is to use information from a sample to infer characteristics about

More information

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1 EE 650 Lecture 4 Intro to Estimation Theory Random Vectors EE 650 D. Van Alphen 1 Lecture Overview: Random Variables & Estimation Theory Functions of RV s (5.9) Introduction to Estimation Theory MMSE Estimation

More information

Chapter 4 Continuous Random Variables and Probability Distributions

Chapter 4 Continuous Random Variables and Probability Distributions Chapter 4 Continuous Random Variables and Probability Distributions Part 3: The Exponential Distribution and the Poisson process Section 4.8 The Exponential Distribution 1 / 21 Exponential Distribution

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

Discrete Mathematics for CS Spring 2007 Luca Trevisan Lecture 20

Discrete Mathematics for CS Spring 2007 Luca Trevisan Lecture 20 CS 70 Discrete Mathematics for CS Spring 2007 Luca Trevisan Lecture 20 Today we shall discuss a measure of how close a random variable tends to be to its expectation. But first we need to see how to compute

More information

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS 1.1. The Rutherford-Chadwick-Ellis Experiment. About 90 years ago Ernest Rutherford and his collaborators at the Cavendish Laboratory in Cambridge conducted

More information

Change Of Variable Theorem: Multiple Dimensions

Change Of Variable Theorem: Multiple Dimensions Change Of Variable Theorem: Multiple Dimensions Moulinath Banerjee University of Michigan August 30, 01 Let (X, Y ) be a two-dimensional continuous random vector. Thus P (X = x, Y = y) = 0 for all (x,

More information

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information