Multivariate distributions


Introduction

We want to discuss collections of random variables (X_1, X_2, ..., X_n), which are known as random vectors. In the discrete case, we can define the density p(x, y) = P(X = x, Y = y). Remember that here the comma means "and." In the continuous case a density is a function f such that

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx.

Example 1: If f_{X,Y}(x, y) = c e^{-x} e^{-2y} for 0 < x < ∞ and x < y < ∞, what is c?

Answer. We use the fact that a density must integrate to 1. So

∫_0^∞ ∫_x^∞ c e^{-x} e^{-2y} dy dx = 1.

Recalling multivariable calculus, the left-hand side is

∫_0^∞ (c/2) e^{-x} e^{-2x} dx = c/6,

so c = 6.

The multivariate distribution function of (X, Y) is defined by F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y). In the continuous case, this is

F_{X,Y}(x, y) = ∫_{-∞}^x ∫_{-∞}^y f(u, v) dv du,

and so we have

f(x, y) = (∂²/∂x ∂y) F(x, y).

The extension to n random variables is exactly similar. We have

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx,

or

P((X, Y) ∈ D) = ∫∫_D f_{X,Y}(x, y) dy dx
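The normalization in Example 1 (c = 6) can be spot-checked numerically. A minimal sketch, assuming nothing beyond the density itself; the truncation point 15 and the grid step are arbitrary choices:

```python
# Midpoint-rule check that f(x, y) = 6 e^{-x} e^{-2y} on the wedge
# {0 < x < infinity, x < y < infinity} integrates to 1.  The infinite
# region is truncated at 15, where the density is negligible.
import math

def f(x, y):
    return 6.0 * math.exp(-x) * math.exp(-2.0 * y)

h = 0.02
n = int(15 / h)
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y > x:                    # only the region x < y contributes
            total += f(x, y) * h * h

print(total)   # close to 1, up to grid error along the diagonal
```

The same loop with c left symbolic would show the integral equals c/6, which is how the value c = 6 falls out.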

when D is the set {(x, y) : a ≤ x ≤ b, c ≤ y ≤ d}. One can show this holds when D is any set. For example,

P(X < Y) = ∫∫_{x<y} f_{X,Y}(x, y) dy dx.

If one has the joint density f_{X,Y}(x, y) of X and Y, one can recover the marginal densities of X and of Y:

f_X(x) = ∫_{-∞}^∞ f_{X,Y}(x, y) dy,    f_Y(y) = ∫_{-∞}^∞ f_{X,Y}(x, y) dx.

If we have a binomial with parameters n and p, this can be thought of as the number of successes in n trials, and

P(X = k) = (n!/(k!(n-k)!)) p^k (1-p)^{n-k}.

If we let k_1 = k, k_2 = n - k, p_1 = p, and p_2 = 1 - p, this can be rewritten as

(n!/(k_1! k_2!)) p_1^{k_1} p_2^{k_2},

as long as n = k_1 + k_2. Thus this is the probability of k_1 successes and k_2 failures, where the probabilities of success and failure are p_1 and p_2, respectively. A multinomial random vector is (X_1, ..., X_r) with

P(X_1 = n_1, ..., X_r = n_r) = (n!/(n_1! ··· n_r!)) p_1^{n_1} ··· p_r^{n_r},

where n_1 + ··· + n_r = n and p_1 + ··· + p_r = 1. Thus this generalizes the binomial to more than two categories.

In the discrete case we say X and Y are independent if P(X = x, Y = y) = P(X = x) P(Y = y) for all x and y. In the continuous case, X and Y are independent if P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for all pairs of subsets A, B of the reals. The left-hand side is an abbreviation for

P({ω : X(ω) is in A and Y(ω) is in B}),

and similarly for the right-hand side. In the discrete case, if we have independence,

p_{X,Y}(x, y) = P(X = x, Y = y) = P(X = x) P(Y = y) = p_X(x) p_Y(y).
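The claim that the multinomial with r = 2 categories is exactly the binomial can be checked directly. A small sketch; the helper `multinomial_pmf` and the parameter values are illustrative, not from the text:

```python
import math

def multinomial_pmf(counts, probs):
    """n!/(n_1! ... n_r!) * p_1^{n_1} ... p_r^{n_r}."""
    n = sum(counts)
    coef = math.factorial(n)
    for c in counts:
        coef //= math.factorial(c)
    p = 1.0
    for c, q in zip(counts, probs):
        p *= q ** c
    return coef * p

# With two categories (success/failure) this reduces to the binomial pmf.
n, p, k = 10, 0.3, 4
binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
multi = multinomial_pmf([k, n - k], [p, 1 - p])
assert abs(binom - multi) < 1e-12

# And the multinomial probabilities over all outcomes sum to 1 (here r = 3).
total = sum(multinomial_pmf([a, b, n - a - b], [0.2, 0.5, 0.3])
            for a in range(n + 1) for b in range(n + 1 - a))
assert abs(total - 1.0) < 1e-12
```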

In other words, the joint density p_{X,Y} factors. In the continuous case,

∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx = P(a ≤ X ≤ b, c ≤ Y ≤ d)
    = P(a ≤ X ≤ b) P(c ≤ Y ≤ d)
    = ∫_a^b f_X(x) dx · ∫_c^d f_Y(y) dy
    = ∫_a^b ∫_c^d f_X(x) f_Y(y) dy dx.

One can conclude from this by taking partial derivatives that f_{X,Y}(x, y) = f_X(x) f_Y(y), or again the joint density factors. Going the other way, one can also see that if the joint density factors, then one has independence.

Example 2: Suppose one has a floor made out of wood planks and one drops a needle onto it. What is the probability the needle crosses one of the cracks? Suppose the needle is of length L and the wood planks are D across, with L ≤ D.

Answer. Let X be the distance from the midpoint of the needle to the nearest crack and let Θ be the angle the needle makes with the vertical. Then X and Θ will be independent. X is uniform on [0, D/2] and Θ is uniform on [0, π/2]. A little geometry shows that the needle will cross a crack if L/2 > X/cos Θ, i.e., if X < (L/2) cos Θ. We have f_{X,Θ} = 4/(πD), and so we have to integrate this constant over the set where X < (L cos θ)/2, 0 ≤ θ ≤ π/2, and 0 ≤ X ≤ D/2. The integral is

∫_0^{π/2} ∫_0^{(L cos θ)/2} 4/(πD) dx dθ = 2L/(πD).

If X and Y are independent, then

P(X + Y ≤ a) = ∫∫_{x+y≤a} f_{X,Y}(x, y) dx dy
    = ∫∫_{x+y≤a} f_X(x) f_Y(y) dx dy
    = ∫_{-∞}^∞ ∫_{-∞}^{a-y} f_X(x) f_Y(y) dx dy
    = ∫_{-∞}^∞ F_X(a - y) f_Y(y) dy.

Differentiating with respect to a, we have

f_{X+Y}(a) = ∫_{-∞}^∞ f_X(a - y) f_Y(y) dy.

There are a number of cases where this is interesting.
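The Buffon needle answer 2L/(πD) lends itself to a quick Monte Carlo check. A sketch under the same setup as the example; the values L = 1, D = 2, the seed, and the sample size are arbitrary choices:

```python
import math
import random

random.seed(0)
L, D = 1.0, 2.0                # needle length and plank width, L <= D
n = 200_000
crossings = 0
for _ in range(n):
    x = random.uniform(0, D / 2)            # midpoint to nearest crack
    theta = random.uniform(0, math.pi / 2)  # angle with the vertical
    if x < (L / 2) * math.cos(theta):       # needle reaches the crack
        crossings += 1

estimate = crossings / n
exact = 2 * L / (math.pi * D)   # here 1/pi, roughly 0.318
print(estimate, exact)
```

With these parameters the empirical frequency settles near 1/π, matching the integral computed above.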

(1) If X is a gamma with parameters s and λ and Y is a gamma with parameters t and λ, then straightforward integration shows that X + Y is a gamma with parameters s + t and λ. In particular, the sum of n independent exponentials with parameter λ is a gamma with parameters n and λ.

(2) If Z is a N(0, 1), then

F_{Z²}(y) = P(Z² ≤ y) = P(-√y ≤ Z ≤ √y) = F_Z(√y) - F_Z(-√y).

Differentiating shows that f_{Z²}(y) = c e^{-y/2} (y/2)^{(1/2)-1}, or Z² is a gamma with parameters 1/2 and 1/2. So using (1) above, if the Z_i are independent N(0, 1)'s, then Σ_{i=1}^n Z_i² is a gamma with parameters n/2 and 1/2, i.e., a χ²_n.

(3) If X_i is a N(μ_i, σ_i²) and the X_i are independent, then some lengthy calculations show that Σ_{i=1}^n X_i is a N(Σ μ_i, Σ σ_i²).

(4) The analogue for discrete random variables is easier. If X and Y take only nonnegative integer values, we have

P(X + Y = r) = Σ_{k=0}^r P(X = k, Y = r - k) = Σ_{k=0}^r P(X = k) P(Y = r - k).

In the case where X is a Poisson with parameter λ and Y is a Poisson with parameter μ, we see that X + Y is a Poisson with parameter λ + μ. To check this, use the above formula to get

P(X + Y = r) = Σ_{k=0}^r P(X = k) P(Y = r - k)
    = Σ_{k=0}^r e^{-λ} (λ^k/k!) e^{-μ} (μ^{r-k}/(r-k)!)
    = e^{-(λ+μ)} (1/r!) Σ_{k=0}^r (r choose k) λ^k μ^{r-k}
    = e^{-(λ+μ)} (λ + μ)^r / r!,

using the binomial theorem.

Note that it is not always the case that the sum of two independent random variables will be a random variable of the same type. If X and Y are independent normals, then -Y is also a normal (with E(-Y) = -E Y and Var(-Y) = (-1)² Var Y = Var Y), and so X - Y is also normal.

To define a conditional density in the discrete case, we write

p_{X|Y=y}(x | y) = P(X = x | Y = y).
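Item (4) can be verified exactly: convolving two Poisson pmfs term by term reproduces the Poisson(λ + μ) pmf. A small sketch; the values λ = 1.5 and μ = 2.3 are arbitrary:

```python
import math

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, mu = 1.5, 2.3
for r in range(15):
    # P(X + Y = r) via the discrete convolution formula
    conv = sum(poisson_pmf(lam, k) * poisson_pmf(mu, r - k)
               for k in range(r + 1))
    # Poisson(lam + mu) pmf directly
    direct = poisson_pmf(lam + mu, r)
    assert abs(conv - direct) < 1e-12
print("convolution of Poisson pmfs matches Poisson(lam + mu)")
```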

This is equal to

P(X = x, Y = y)/P(Y = y) = p(x, y)/p_Y(y).

Analogously, we define in the continuous case

f_{X|Y=y}(x | y) = f(x, y)/f_Y(y).

Just as in the one-dimensional case, there is a change of variables formula. Let us recall how the formula goes in one dimension. If X has a density f_X and y = g(x) with g increasing, then

F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ≤ g^{-1}(y)) = F_X(g^{-1}(y)).

Taking the derivative, using the chain rule, and recalling that the derivative of g^{-1}(y) is 1/g'(g^{-1}(y)), we have

f_Y(y) = f_X(g^{-1}(y)) (g^{-1})'(y).

The higher dimensional case is very analogous. Suppose Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_1, X_2). Let h_1 and h_2 be such that X_1 = h_1(Y_1, Y_2) and X_2 = h_2(Y_1, Y_2). (These play the role of g^{-1}.) Let J be the Jacobian of the mapping (x_1, x_2) → (g_1(x_1, x_2), g_2(x_1, x_2)), so that

J = (∂g_1/∂x_1)(∂g_2/∂x_2) - (∂g_1/∂x_2)(∂g_2/∂x_1).

(This is the analogue of g'(y).) Using the change of variables theorem from multivariable calculus, we have

f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J|^{-1}.

Example 3: Suppose X_1 is N(0, 1), X_2 is N(0, 4), and X_1 and X_2 are independent. Let Y_1 = 2X_1 + X_2 and Y_2 = X_1 - 3X_2. Then

y_1 = g_1(x_1, x_2) = 2x_1 + x_2,    y_2 = g_2(x_1, x_2) = x_1 - 3x_2,

so

J = det [ 2  1 ; 1  -3 ] = -7.

(In general, J might depend on x, and hence on y.) Some algebra leads to

x_1 = (3/7) y_1 + (1/7) y_2,    x_2 = (1/7) y_1 - (2/7) y_2.

Since X_1 and X_2 are independent,

f_{X_1,X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = (1/√(2π)) e^{-x_1²/2} · (1/√(8π)) e^{-x_2²/8}.

Therefore

f_{Y_1,Y_2}(y_1, y_2) = (1/√(2π)) e^{-((3/7)y_1 + (1/7)y_2)²/2} · (1/√(8π)) e^{-((1/7)y_1 - (2/7)y_2)²/8} · (1/7).
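The algebra in Example 3 (the inverse map and the Jacobian) can be checked mechanically. A sketch; the random test points are arbitrary:

```python
import random

random.seed(1)

def forward(x1, x2):
    """(y1, y2) = (2 x1 + x2, x1 - 3 x2), the map in Example 3."""
    return 2 * x1 + x2, x1 - 3 * x2

def inverse(y1, y2):
    """The solved-for (x1, x2) = ((3 y1 + y2)/7, (y1 - 2 y2)/7)."""
    return (3 * y1 + y2) / 7, (y1 - 2 * y2) / 7

# Jacobian determinant of the forward map: 2*(-3) - 1*1 = -7
J = 2 * (-3) - 1 * 1
assert J == -7

# inverse(forward(.)) is the identity at many random points
for _ in range(1000):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 2)
    r1, r2 = inverse(*forward(x1, x2))
    assert abs(r1 - x1) < 1e-9 and abs(r2 - x2) < 1e-9
```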

Further examples and applications

Example 4: Suppose we roll two dice with sides 1, 1, 2, 2, 3, 3. Let X be the largest value obtained on any of the two dice. Let Y be the sum of the two dice. Find the joint p.m.f. of X and Y.

Answer. First we make a table of all the possible outcomes. Note that individually, X = 1, 2, 3 and Y = 2, 3, 4, 5, 6. The table of possible outcomes of (X, Y) jointly is (rows: first die, columns: second die; each value 1, 2, 3 is equally likely on each die):

        1       2       3
1       (1, 2)  (2, 3)  (3, 4)
2       (2, 3)  (2, 4)  (3, 5)
3       (3, 4)  (3, 5)  (3, 6)

Using this table, with each of the 9 cells having probability 1/9, the p.m.f. is given by:

X\Y     2       3       4       5       6
1       1/9     0       0       0       0
2       0       2/9     1/9     0       0
3       0       0       2/9     2/9     1/9

Example 5: Let X, Y have joint pdf

f(x, y) = c e^{-x} e^{-2y},  0 < x < ∞, 0 < y < ∞;  0 otherwise.

(a) Find c that makes this a joint pdf.

Answer. The region that we integrate over is the first quadrant, thus

1 = ∫_0^∞ ∫_0^∞ c e^{-x} e^{-2y} dx dy = c ∫_0^∞ e^{-2y} [-e^{-x}]_0^∞ dy = c ∫_0^∞ e^{-2y} dy = c/2.

Then c = 2.

(b) Find P(X < Y).

Copyright 2017 Phanuel Mariano, Patricia Alonso-Ruiz.

Answer. We need to draw the region. Let D = {(x, y) : 0 < x < y, 0 < y < ∞} and set up the double integral:

P(X < Y) = ∫∫_D f(x, y) dA = ∫_0^∞ ∫_0^y 2 e^{-x} e^{-2y} dx dy = ∫_0^∞ 2 e^{-2y} (1 - e^{-y}) dy = 1 - 2/3 = 1/3.

(c) Set up the double integral for P(X > 1, Y < 1).

Answer.

P(X > 1, Y < 1) = ∫_0^1 ∫_1^∞ 2 e^{-x} e^{-2y} dx dy = (1 - e^{-2}) e^{-1}.

(d) Find the marginal f_X(x).

Answer. We have

f_X(x) = ∫_{-∞}^∞ f(x, y) dy = ∫_0^∞ 2 e^{-x} e^{-2y} dy = e^{-x} [-e^{-2y}]_0^∞ = e^{-x}.

Example 6: Let X, Y be r.v.'s with joint pdf f(x, y) = 6 e^{-2x} e^{-3y}, 0 < x < ∞, 0 < y < ∞. Are X, Y independent?

Answer. Find f_X and f_Y and see if f = f_X f_Y. First,

f_X(x) = ∫_0^∞ 6 e^{-2x} e^{-3y} dy = 2 e^{-2x},
f_Y(y) = ∫_0^∞ 6 e^{-2x} e^{-3y} dx = 3 e^{-3y},

which are both exponential. Since f = f_X f_Y, yes, they are independent!

Example 7: Let X, Y have

f_{X,Y}(x, y) = x + y,  0 < x < 1, 0 < y < 1.

Are X, Y independent?

Answer. Note that there is no way to factor x + y = f_X(x) f_Y(y), hence they can't be independent.
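The joint p.m.f. for the dice with faces 1, 1, 2, 2, 3, 3 can be confirmed by enumerating the nine equally likely ordered outcomes. A sketch using exact fractions:

```python
from fractions import Fraction
from itertools import product

faces = [1, 2, 3]          # each value appears on two faces, so each has prob 1/3
pmf = {}
for d1, d2 in product(faces, repeat=2):   # 9 equally likely ordered outcomes
    x, y = max(d1, d2), d1 + d2           # X = largest value, Y = sum
    pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 9)

assert pmf[(1, 2)] == Fraction(1, 9)
assert pmf[(2, 3)] == Fraction(2, 9)
assert pmf[(3, 4)] == Fraction(2, 9)
assert pmf[(3, 6)] == Fraction(1, 9)
assert sum(pmf.values()) == 1
```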

Proposition 1. If X_i ~ N(μ_i, σ_i²) are independent for 1 ≤ i ≤ n, then

X_1 + ··· + X_n ~ N(μ_1 + ··· + μ_n, σ_1² + ··· + σ_n²).

In particular, if X ~ N(μ_x, σ_x²) and Y ~ N(μ_y, σ_y²), then X + Y ~ N(μ_x + μ_y, σ_x² + σ_y²) and X - Y ~ N(μ_x - μ_y, σ_x² + σ_y²). In general, for two independent Gaussians X and Y we have cX + dY ~ N(cμ_x + dμ_y, c²σ_x² + d²σ_y²).

Example 8: Suppose T ~ N(95, 50) and H ~ N(65, 11) represent the grades of T. and H. in their probability class.

(a) What is the probability that their average grade will be less than 90?

Answer. By the proposition, T + H ~ N(160, 61). Thus

P((T + H)/2 ≤ 90) = P(T + H ≤ 180) = P(Z ≤ (180 - 160)/√61) = Φ(2.56) ≈ 0.9948.

(b) What is the probability that H. will have scored higher than T.?

Answer. Using H - T ~ N(-30, 61), we compute

P(H > T) = P(H - T > 0) = 1 - P(H - T < 0) = 1 - P(Z ≤ (0 - (-30))/√61) = 1 - Φ(3.84) ≈ 0.0001.

(c) Answer question (b) if T ~ N(90, 64) and H ~ N(70, 25).

Answer. By the proposition, H - T ~ N(-20, 89), and so

P(H > T) = P(H - T > 0) = 1 - P(H - T < 0) = 1 - P(Z ≤ (0 - (-20))/√89) = 1 - Φ(2.12) ≈ 0.017.

Example 9: Suppose X_1, X_2 have joint distribution

f_{X_1,X_2}(x_1, x_2) = (x_1 + 3x_2)/2,  0 ≤ x_1 ≤ 1, 0 ≤ x_2 ≤ 1;  0 otherwise.

Find the joint pdf of Y_1 = 2X_1² + X_2 and Y_2 = X_2.

Answer. Step 1: Find the Jacobian. Here

y_1 = g_1(x_1, x_2) = 2x_1² + x_2,    y_2 = g_2(x_1, x_2) = x_2,

so

J(x_1, x_2) = 4x_1.

Step 2: Solve for x_1, x_2 and get

x_2 = y_2,    x_1 = √((y_1 - y_2)/2).

Step 3: The joint pdf of Y_1, Y_2 is given by the formula:

f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{-1}
    = f_{X_1,X_2}(√((y_1 - y_2)/2), y_2) · 1/(4√((y_1 - y_2)/2))
    = (√((y_1 - y_2)/2) + 3y_2) / (8√((y_1 - y_2)/2))  for y_2 ≤ y_1 ≤ 2 + y_2, 0 ≤ y_2 ≤ 1,

and 0 otherwise.
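The normal-probability computations in Example 8 can be reproduced with `math.erf`. A sketch, assuming (as in the worked answers above) that the relevant differences and sums are normal with variance 61 in parts (a) and (b) and variance 89 in part (c):

```python
import math

def phi(z):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# (a) average below 90: T + H ~ N(160, 61), so look at P(T + H <= 180)
p_a = phi((180 - 160) / math.sqrt(61))
# (b) H beats T: H - T ~ N(-30, 61)
p_b = 1 - phi(30 / math.sqrt(61))
# (c) with T ~ N(90, 64) and H ~ N(70, 25): H - T ~ N(-20, 89)
p_c = 1 - phi(20 / math.sqrt(89))

print(round(p_a, 4), p_b, round(p_c, 3))
```

The printed values agree with the table lookups Φ(2.56), 1 − Φ(3.84), and 1 − Φ(2.12) used in the example.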

Exercises

Exercise 1: Suppose that 2 balls are chosen without replacement from an urn consisting of 5 white and 8 red balls. Let X equal 1 if the first ball selected is white and zero otherwise. Let Y equal 1 if the second ball selected is white and zero otherwise.
(A) Find the probability mass function of X, Y.
(B) Find E(XY).
(C) Is it true that E(XY) = (E X)(E Y)?
(D) Are X, Y independent?

Exercise 2: Suppose you roll two fair dice. Find the probability mass function of X and Y, where X is the largest value obtained on any die and Y is the sum of the values.

Exercise 3: Suppose the joint density function of X and Y is f(x, y) = 1/4 for 0 < x < 2 and 0 < y < 2.
(A) Calculate P(0 < X < 1, 1/3 < Y < 4/3).
(B) Calculate P(XY < 2).
(C) Calculate the marginal distributions f_X(x) and f_Y(y).

Exercise 4: The joint probability density function of X and Y is given by

f(x, y) = 2 e^{-(x+2y)},  0 ≤ x < ∞, 0 ≤ y < ∞.

Find P(X < Y).

Exercise 5: Suppose X and Y are independent random variables, X is exponential with λ = 1/4, and Y is uniform on (2, 5). Calculate the probability that X + Y < 8.

Exercise 6: Consider X and Y given by the joint density

f(x, y) = 10x²y,  0 ≤ y ≤ x ≤ 1;  0 otherwise.

(A) Find the marginal pdfs f_X(x) and f_Y(y).
(B) Are X and Y independent random variables?
(C) Find P(Y ≤ X/2).
(D) Find P(Y ≤ X/4 | Y ≤ X/2).
(E) Find E[X].

Exercise 7: Consider X and Y given by the joint density

f(x, y) = 4xy,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 otherwise.

(A) Find the marginal pdfs f_X and f_Y.
(B) Are X and Y independent?
(C) Find E Y.

11 Exercise.8:.3. EXERCISES Consider X, Y given by the joint pdf (x + y) x, y f(x, y) 3 otherwise. Are X and Y independent random variables? Exercise.9: Suppose that gross weekly ticket sales for UConn basketball games are normally distributed with mean $,, and standard deviation $3,. What is the probability that the total gross ticket sales over the next two weeks exceeds $4, 6,? Exercise.: Suppose the joint density function of the random variable X and X is 4x x < x <, < x < f(x, x ) otherwise. Let Y X + X and Y X 3X. What is the joint density function of Y and Y? Exercise.: Suppose the joint density function of the random variable X and X is 3 f(x, x ) (x + x ) < x <, < x < otherwise. Let Y X x and Y X + 3X. What is the joint density function of Y and Y? Exercise.: We roll two dice. Let X be the minimum of the two numbers that appear, and let Y be the maximum. Find the joint probability mass function of (X, Y ), that is, P (X i, Y j): i j Also find the marginal probability mass functions of X and Y. Finally, find the conditional probability mass function of X given that Y is 5, P (X i Y 5), for i,..., 6.

Selected solutions

Solution to Exercise 1(A): We have

p(0, 0) = P(X = 0, Y = 0) = P(RR) = (8/13)(7/12) = 14/39,
p(1, 0) = P(X = 1, Y = 0) = P(WR) = (5/13)(8/12) = 10/39,
p(0, 1) = P(X = 0, Y = 1) = P(RW) = (8/13)(5/12) = 10/39,
p(1, 1) = P(X = 1, Y = 1) = P(WW) = (5/13)(4/12) = 5/39.

Solution to Exercise 1(B):

E(XY) = 1 · P(X = 1, Y = 1) = 5/39.

Solution to Exercise 1(C): Not true, because

(E X)(E Y) = P(X = 1) P(Y = 1) = (5/13)(5/13) = 25/169 ≠ 5/39.

Solution to Exercise 1(D): X and Y are not independent, because

P(X = 1, Y = 1) = 5/39 ≠ P(X = 1) P(Y = 1) = (5/13)².

Solution to Exercise 2: First we need to figure out what values X, Y can attain. Note that X can be any of 1, 2, 3, 4, 5, 6, but Y is the sum and can only be as low as 2 and as high as 12. First we make a table of possibilities for (X, Y) given the values of the dice. Recall X is the largest of the two, and Y is the sum of them. The possible outcomes are given by:

1st die\2nd die  1       2       3       4       5       6
1                (1, 2)  (2, 3)  (3, 4)  (4, 5)  (5, 6)  (6, 7)
2                (2, 3)  (2, 4)  (3, 5)  (4, 6)  (5, 7)  (6, 8)
3                (3, 4)  (3, 5)  (3, 6)  (4, 7)  (5, 8)  (6, 9)
4                (4, 5)  (4, 6)  (4, 7)  (4, 8)  (5, 9)  (6, 10)
5                (5, 6)  (5, 7)  (5, 8)  (5, 9)  (5, 10) (6, 11)
6                (6, 7)  (6, 8)  (6, 9)  (6, 10) (6, 11) (6, 12)

Then we make a table of the pmf p(x, y). Each cell above has probability 1/36; collecting cells with the same (x, y), the entries below are in units of 1/36 (blank = 0):

X\Y  2  3  4  5  6  7  8  9  10 11 12
1    1
2       2  1
3          2  2  1
4             2  2  2  1
5                2  2  2  2  1
6                   2  2  2  2  2  1

Solution to Exercise 3(A): We integrate the pdf over the bounds and get

∫_0^1 ∫_{1/3}^{4/3} (1/4) dy dx = (1/4)(1 - 0)(4/3 - 1/3) = 1/4.

Solution to Exercise 3(B): We need to find the region that is within 0 < x, y < 2 and has xy < 2. (Try to draw the region.) We get two regions from this: one with bounds 0 < x < 1, 0 < y < 2, and the region 1 < x < 2, 0 < y < 2/x. Then

P(XY < 2) = ∫_0^1 ∫_0^2 (1/4) dy dx + ∫_1^2 ∫_0^{2/x} (1/4) dy dx = 1/2 + ∫_1^2 (1/(2x)) dx = 1/2 + (1/2) ln 2.

Solution to Exercise 3(C): Recall that

f_X(x) = ∫_{-∞}^∞ f(x, y) dy = ∫_0^2 (1/4) dy = 1/2

for 0 < x < 2, and 0 otherwise. By symmetry, f_Y is the same.

Solution to Exercise 4: Draw a picture of the region and note the integral needs to be set up in the following way:

P(X < Y) = ∫_0^∞ ∫_0^y 2 e^{-(x+2y)} dx dy = ∫_0^∞ [2 e^{-2y} - 2 e^{-3y}] dy = [-e^{-2y} + (2/3) e^{-3y}]_0^∞ = 1 - 2/3 = 1/3.
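The value 1/2 + (ln 2)/2 ≈ 0.847 for P(XY < 2) can be spot-checked by simulation, since (X, Y) is uniform on the square (0, 2)². A sketch; the seed and sample size are arbitrary:

```python
import math
import random

random.seed(2)
n = 200_000
# X and Y independent uniforms on (0, 2); count how often XY < 2
hits = sum(1 for _ in range(n)
           if random.uniform(0, 2) * random.uniform(0, 2) < 2)

estimate = hits / n
exact = 0.5 + math.log(2) / 2
print(estimate, exact)
```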

Solution to Exercise 5: We know that

f_X(x) = (1/4) e^{-x/4} when x ≥ 0, and 0 otherwise,

and

f_Y(y) = 1/3 when 2 < y < 5, and 0 otherwise.

Since X, Y are independent, f_{X,Y} = f_X f_Y, thus

f_{X,Y}(x, y) = (1/12) e^{-x/4} when x ≥ 0 and 2 < y < 5, and 0 otherwise.

Draw the region {X + Y < 8}, which corresponds to x ≥ 0, 2 < y < 5, and x < 8 - y. Drawing a picture of the region, we get the corresponding bounds 2 < y < 5 and 0 < x < 8 - y, so that

P(X + Y < 8) = ∫_2^5 ∫_0^{8-y} (1/12) e^{-x/4} dx dy = (1/3) ∫_2^5 (1 - e^{(y-8)/4}) dy = 1 - (4/3)(e^{-3/4} - e^{-3/2}).

Solution to Exercise 6(A): We have

f_X(x) = 5x⁴ for 0 ≤ x ≤ 1, and 0 otherwise,
f_Y(y) = (10/3) y (1 - y³) for 0 ≤ y ≤ 1, and 0 otherwise.

Solution to Exercise 6(B): No, since f_{X,Y} ≠ f_X f_Y.

Solution to Exercise 6(C): P(Y ≤ X/2) = 1/4.

Solution to Exercise 6(D): Also 1/4, since P(Y ≤ X/4)/P(Y ≤ X/2) = (1/16)/(1/4).

Solution to Exercise 6(E): Use f_X and the definition of expected value, which gives E[X] = ∫_0^1 x · 5x⁴ dx = 5/6.

Solution to Exercise 7(A): f_X(x) = 2x and f_Y(y) = 2y, each on [0, 1].

Solution to Exercise 7(B): Yes! Since f(x, y) = f_X f_Y.

Solution to Exercise 7(C): We have E Y = ∫_0^1 y · 2y dy = 2/3.

Solution to Exercise 8: We get f_X(x) = (2/3)(x + 1) while f_Y(y) = (1 + 4y)/3, and f ≠ f_X f_Y, so X and Y are not independent.

Solution to Exercise 9: If W = X_1 + X_2 is the sales over the next two weeks, then W is normal with mean 2,200,000 + 2,200,000 = 4,400,000 and variance 230,000² + 230,000².

Thus the variance is 230,000² + 230,000², so the standard deviation is √2 · 230,000 ≈ 325,269. Hence

P(W > 4,600,000) = P(Z > (4,600,000 - 4,400,000)/325,269) = P(Z > 0.61) = 1 - Φ(0.61) ≈ 0.27.

Solution to Exercise 10: Step 1: Find the Jacobian. Here

y_1 = g_1(x_1, x_2) = 2x_1 + x_2,    y_2 = g_2(x_1, x_2) = x_1 - 3x_2,

so

J(x_1, x_2) = det [ 2  1 ; 1  -3 ] = -7.

Step 2: Solve for x_1, x_2 and get

x_1 = (3/7) y_1 + (1/7) y_2,
x_2 = (1/7) y_1 - (2/7) y_2.

Step 3: The joint pdf of Y_1, Y_2 is given by the formula:

f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{-1} = f_{X_1,X_2}((3/7)y_1 + (1/7)y_2, (1/7)y_1 - (2/7)y_2) · (1/7).

Since we are given the joint pdf of X_1 and X_2, plugging these in we have

f_{Y_1,Y_2}(y_1, y_2) = (4/343)(3y_1 + y_2)(y_1 - 2y_2) for 0 < 3y_1 + y_2 < 7 and 0 < y_1 - 2y_2 < 7, and 0 otherwise.

Solution to Exercise 11: Step 1: Find the Jacobian. Here

y_1 = g_1(x_1, x_2) = 2x_1 - x_2,    y_2 = g_2(x_1, x_2) = x_1 + 3x_2,

so

J(x_1, x_2) = det [ 2  -1 ; 1  3 ] = 7.

Step 2: Solve for x_1, x_2 and get

x_1 = (1/7)(3y_1 + y_2),
x_2 = (1/7)(-y_1 + 2y_2).

Step 3: The joint pdf of Y_1, Y_2 is given by the formula:

f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{-1} = f_{X_1,X_2}((1/7)(3y_1 + y_2), (1/7)(-y_1 + 2y_2)) · (1/7).

Since we are given the joint pdf of X_1 and X_2, plugging these in we have

f_{Y_1,Y_2}(y_1, y_2) = (3/686) [(3y_1 + y_2)² + (-y_1 + 2y_2)²] for 0 < 3y_1 + y_2 < 7 and 0 < -y_1 + 2y_2 < 7, and 0 otherwise.
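As a consistency check on these change-of-variables answers, the transformed density from the exercise with f(x_1, x_2) = 4x_1x_2, namely (4/343)(3y_1 + y_2)(y_1 - 2y_2) on 0 < 3y_1 + y_2 < 7, 0 < y_1 - 2y_2 < 7, should integrate to 1 over its parallelogram-shaped region. A crude midpoint-rule sketch; the grid resolution is an arbitrary choice:

```python
def f(y1, y2):
    """Transformed density; zero outside the image of the unit square."""
    if 0 < 3 * y1 + y2 < 7 and 0 < y1 - 2 * y2 < 7:
        return 4 / 343 * (3 * y1 + y2) * (y1 - 2 * y2)
    return 0.0

h = 0.005
total = 0.0
# bounding box of the region: y1 = 2x1 + x2 in (0, 3), y2 = x1 - 3x2 in (-3, 1)
for i in range(int(3 / h)):
    y1 = (i + 0.5) * h
    for j in range(int(4 / h)):
        y2 = -3 + (j + 0.5) * h
        total += f(y1, y2) * h * h

print(total)   # close to 1, up to grid error along the region's boundary
```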


More information

1 Joint and marginal distributions

1 Joint and marginal distributions DECEMBER 7, 204 LECTURE 2 JOINT (BIVARIATE) DISTRIBUTIONS, MARGINAL DISTRIBUTIONS, INDEPENDENCE So far we have considered one random variable at a time. However, in economics we are typically interested

More information

Continuous Distributions

Continuous Distributions Chapter 5 Continuous Distributions 5.1 Density and Distribution Functions In many situations random variables can take any value on the real line or in a certain subset of the real line. For concrete examples,

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

Statistics 100A Homework 5 Solutions

Statistics 100A Homework 5 Solutions Chapter 5 Statistics 1A Homework 5 Solutions Ryan Rosario 1. Let X be a random variable with probability density function a What is the value of c? fx { c1 x 1 < x < 1 otherwise We know that for fx to

More information

Homework 5 Solutions

Homework 5 Solutions 126/DCP126 Probability, Fall 214 Instructor: Prof. Wen-Guey Tzeng Homework 5 Solutions 5-Jan-215 1. Let the joint probability mass function of discrete random variables X and Y be given by { c(x + y) ifx

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Problem 1. Problem 2. Problem 3. Problem 4

Problem 1. Problem 2. Problem 3. Problem 4 Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

3 Conditional Expectation

3 Conditional Expectation 3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Random Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University.

Random Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University. Random Signals and Systems Chapter 3 Jitendra K Tugnait Professor Department of Electrical & Computer Engineering Auburn University Two Random Variables Previously, we only dealt with one random variable

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections Chapter 3 - continued 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results

More information

ENGG2430A-Homework 2

ENGG2430A-Homework 2 ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,

More information

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on

More information

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

M378K In-Class Assignment #1

M378K In-Class Assignment #1 The following problems are a review of M6K. M7K In-Class Assignment # Problem.. Complete the definition of mutual exclusivity of events below: Events A, B Ω are said to be mutually exclusive if A B =.

More information

Random Variables and Probability Distributions

Random Variables and Probability Distributions CHAPTER Random Variables and Probability Distributions Random Variables Suppose that to each point of a sample space we assign a number. We then have a function defined on the sample space. This function

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional

More information

Ch. 5 Joint Probability Distributions and Random Samples

Ch. 5 Joint Probability Distributions and Random Samples Ch. 5 Joint Probability Distributions and Random Samples 5. 1 Jointly Distributed Random Variables In chapters 3 and 4, we learned about probability distributions for a single random variable. However,

More information

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2},

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2}, ECE32 Spring 25 HW Solutions April 6, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009.

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009. NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total

More information

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) =

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) = Math 5. Rumbos Fall 07 Solutions to Review Problems for Exam. A bowl contains 5 chips of the same size and shape. Two chips are red and the other three are blue. Draw three chips from the bowl at random,

More information

Homework 9 (due November 24, 2009)

Homework 9 (due November 24, 2009) Homework 9 (due November 4, 9) Problem. The join probability density function of X and Y is given by: ( f(x, y) = c x + xy ) < x

More information

Stochastic Models of Manufacturing Systems

Stochastic Models of Manufacturing Systems Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/

More information

sheng@mail.ncyu.edu.tw Content Joint distribution functions Independent random variables Sums of independent random variables Conditional distributions: discrete case Conditional distributions: continuous

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random

More information

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

Chapter 2: Random Variables

Chapter 2: Random Variables ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

Random Variables (Continuous Case)

Random Variables (Continuous Case) Chapter 6 Random Variables (Continuous Case) Thus far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions

More information

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 1

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 1 CS434a/541a: Pattern Recognition Prof. Olga Veksler Lecture 1 1 Outline of the lecture Syllabus Introduction to Pattern Recognition Review of Probability/Statistics 2 Syllabus Prerequisite Analysis of

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 6: Probability and Random Processes Problem Set 3 Spring 9 Self-Graded Scores Due: February 8, 9 Submit your self-graded scores

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information