9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences Emery N. Brown
9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences
Emery N. Brown

Lecture 5: Conditional Distributions and Functions of Jointly Distributed Random Variables

I. Objectives

Understand the concept of a conditional distribution in the discrete and continuous cases.

Understand how to derive the distribution of the sum of two random variables.

Understand how to compute the distribution for the transformation of two or more random variables.

II. Conditional Distributions

Just as we used conditional probabilities in Lecture 1 to evaluate the likelihood of one event given another, we develop here the concepts of discrete and continuous conditional distributions, and the corresponding conditional probability mass functions and probability density functions, to evaluate the behavior of one random variable given knowledge of another.

A. Discrete Conditional Distributions

If X and Y are jointly distributed discrete random variables, the conditional probability that X = x_i given Y = y_j is

Pr(X = x_i | Y = y_j) = Pr(X = x_i, Y = y_j) / Pr(Y = y_j)
                      = p_xy(x_i, y_j) / p_y(y_j),    (5.1)

provided that p_y(y_j) > 0. This is the conditional probability mass function of X given Y = y_j. The numerator is a function of x for a fixed value of y. Similarly, we have

p_y|x(y_j | x_i) = Pr(Y = y_j | X = x_i) = p_xy(x_i, y_j) / p_x(x_i).    (5.2)

We also have

p_xy(x_i, y_j) = p_x(x_i) p_y|x(y_j | x_i) = p_y(y_j) p_x|y(x_i | y_j).    (5.3)

Example 5.1 Melencolia I. The following discrete joint probability mass function is based on the magic square in Albrecht Dürer's engraving Melencolia I.
Each entry of the 4 × 4 magic square, whose rows index X and whose columns index Y, is divided by 136, the sum of all the entries:

            Y = 1    Y = 2    Y = 3    Y = 4
X = 1      16/136    3/136    2/136   13/136
X = 2       5/136   10/136   11/136    8/136
X = 3       9/136    6/136    7/136   12/136
X = 4       4/136   15/136   14/136    1/136    (5.4)

Find Pr(Y = 1 | X = 3) and Pr(X = 4 | Y = 2).

Pr(Y = 1 | X = 3) = Pr(Y = 1, X = 3) / Pr(X = 3)
                  = Pr(Y = 1, X = 3) / ∑_{i=1}^{4} Pr(Y = y_i, X = 3)
                  = (9/136) / (34/136) = 9/34

Pr(X = 4 | Y = 2) = Pr(X = 4, Y = 2) / ∑_{i=1}^{4} Pr(X = x_i, Y = 2)
                  = (15/136) / (34/136) = 15/34    (5.5)

Example 5.2. Suppose that each acetylcholine molecule released at the neuromuscular junction of a frog has probability p of being defective. If the distribution of the number of molecules released in a 10 msec time interval is a Poisson distribution with parameter λ, what is the distribution of the number of defective acetylcholine molecules?

Let N be the number of acetylcholine molecules released in 10 msec and let X be the number that are defective. If we assume that one molecule being defective is independent of any other being defective, the conditional pmf of X given N = n is the binomial distribution

Pr(X = k | N = n) = C(n, k) p^k (1 − p)^{n−k},    (5.6)

where C(n, k) = n!/(k!(n − k)!).
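Before deriving the marginal distribution of X analytically, we can check where it is headed by simulation. The following sketch (not part of the original notes; stdlib Python, with arbitrary illustrative values λ = 5 and p = 0.3) thins each Poisson count molecule by molecule and compares the sample mean and variance of the defective count with λp:

```python
import random
import math

random.seed(1)

lam, p, n_trials = 5.0, 0.3, 200_000  # hypothetical illustrative values

def draw_poisson(lam):
    """Inverse-CDF draw of a Poisson(lam) variate using the running pmf."""
    u, k = random.random(), 0
    prob = math.exp(-lam)   # Pr(N = 0)
    cum = prob
    while u > cum:
        k += 1
        prob *= lam / k     # Pr(N = k) from Pr(N = k - 1)
        cum += prob
    return k

# Thin each count: each of the N molecules is defective with probability p.
counts = []
for _ in range(n_trials):
    n = draw_poisson(lam)
    counts.append(sum(random.random() < p for _ in range(n)))

mean = sum(counts) / n_trials
var = sum((c - mean) ** 2 for c in counts) / n_trials
print(mean, var)  # a Poisson(lam * p) model has mean = variance = 1.5
```

The simulated defective counts have mean and variance both close to λp = 1.5, the signature of a Poisson distribution, which is exactly what the derivation below establishes.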
By the Law of Total Probability (Lecture 1), with

Pr(N = n) = λ^n e^{−λ} / n!,    (5.7)

we have

Pr(X = k) = ∑_{n=0}^{∞} Pr(N = n) Pr(X = k | N = n)
          = ∑_{n=k}^{∞} [λ^n e^{−λ} / n!] C(n, k) p^k (1 − p)^{n−k}
          = ∑_{n=k}^{∞} [λ^n e^{−λ} / n!] [n! / (k!(n − k)!)] p^k (1 − p)^{n−k}
          = [(λp)^k e^{−λ} / k!] ∑_{n=k}^{∞} [λ(1 − p)]^{n−k} / (n − k)!
          = [(λp)^k / k!] e^{−λ} e^{λ(1−p)}
          = (λp)^k e^{−λp} / k!,    (5.8)

which is a Poisson model with rate parameter λp.

B. Continuous Conditional Distributions

If X and Y are continuous random variables with joint probability density f_xy(x, y), then the conditional probability density of Y given X is defined as

f_y|x(y | x) = f_xy(x, y) / f_x(x)    (5.9)

for 0 < f_x(x) < ∞, and 0 otherwise. If we define this in terms of differentials, we have

Pr(y < Y ≤ y + dy | x ≤ X ≤ x + dx) = Pr(y < Y ≤ y + dy, x ≤ X ≤ x + dx) / Pr(x ≤ X ≤ x + dx)
                                    = f_xy(x, y) dx dy / [f_x(x) dx]
                                    = f_xy(x, y) dy / f_x(x).    (5.10)
Hence, we have

f_y|x(y | x) = lim_{dy→0} Pr(y < Y ≤ y + dy | x ≤ X ≤ x + dx) / dy = f_xy(x, y) / f_x(x).

The numerator is a function of y for a fixed value of x. Similarly, we can define

f_x|y(x | y) = f_xy(x, y) / f_y(y),    f_xy(x, y) = f_y|x(y | x) f_x(x),    (5.11)

and the marginal probability density of Y is

f_y(y) = ∫ f_y|x(y | x) f_x(x) dx,    (5.12)

which is the Law of Total Probability for continuous random variables.

Example 4.6 (continued). Figure 4H shows the joint probability density of the two MEG sensors. Consider the distribution of the front sensor (y-axis) given that the value of the back sensor (x-axis) is 0.5. If the bivariate Gaussian probability density is a reasonable model for these data, then we can show that the conditional distribution of Y given X is the Gaussian probability density

f_Y|X(y | x) = [2πσ_y²(1 − ρ²)]^{−1/2} exp{ −[y − μ_y − ρ(σ_y/σ_x)(x − μ_x)]² / [2σ_y²(1 − ρ²)] }.    (5.13)

We have

E(Y | X) = μ_y + ρ(σ_y/σ_x)(x − μ_x)
Var(Y | X) = σ_y²(1 − ρ²).    (5.14)

Note that if ρ = 0, the conditional density of Y is the marginal Gaussian density with E(Y) = μ_y and Var(Y) = σ_y², consistent with the idea that X and Y would then be independent. Notice also that σ_y²(1 − ρ²) ≤ σ_y², showing that knowledge about X reduces the uncertainty in Y.
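The conditional mean and variance in Eq. 5.14 can be checked by simulation. This sketch (not from the original notes; stdlib Python with illustrative values μ_x = μ_y = 0, σ_x = σ_y = 1, ρ = 0.8) draws correlated Gaussian pairs and conditions on X falling in a narrow window around x = 0.5:

```python
import random

random.seed(2)

# Illustrative (hypothetical) parameters for the bivariate Gaussian.
mu_x, mu_y, sd_x, sd_y, rho = 0.0, 0.0, 1.0, 1.0, 0.8
n = 400_000

pairs = []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = mu_x + sd_x * z1
    # Mixing z1 into y gives corr(X, Y) = rho.
    y = mu_y + sd_y * (rho * z1 + (1 - rho ** 2) ** 0.5 * z2)
    pairs.append((x, y))

# Condition on X in a narrow window around x0 and compare with Eq. 5.14.
x0, h = 0.5, 0.05
sel = [y for x, y in pairs if abs(x - x0) < h]
cond_mean = sum(sel) / len(sel)
cond_var = sum((y - cond_mean) ** 2 for y in sel) / len(sel)

pred_mean = mu_y + rho * (sd_y / sd_x) * (x0 - mu_x)  # 0.4
pred_var = sd_y ** 2 * (1 - rho ** 2)                 # 0.36
print(cond_mean, pred_mean, cond_var, pred_var)
```

The windowed sample mean and variance track the predicted values, and the conditional variance is visibly smaller than the marginal variance of 1, illustrating the uncertainty reduction noted above.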
The conditional mean μ_y|x = μ_y + ρ(σ_y/σ_x)(x − μ_x) is called the regression line of y on x. It gives the best prediction of y given x. When we study regression analysis we will make this statement precise.

Example 4.5 (continued). In this example we have the joint and marginal densities

f_xy(x, y) = λ² e^{−λy},  0 ≤ x ≤ y < ∞,
f_x(x) = λ e^{−λx},  x ≥ 0,    (5.15)
f_y(y) = λ² y e^{−λy},  y ≥ 0.

The conditional densities are

f_y|x(y | x) = f_xy(x, y) / f_x(x) = λ² e^{−λy} / (λ e^{−λx}) = λ e^{−λ(y−x)}    (5.16)

for 0 < x < y < ∞, and

f_x|y(x | y) = f_xy(x, y) / f_y(y) = λ² e^{−λy} / (λ² y e^{−λy}) = 1/y    (5.17)

for 0 < x ≤ y. That is, X is uniformly distributed on (0, y]. Note that

f_xy(x, y) = f_y|x(y | x) f_x(x) = f_x|y(x | y) f_y(y).    (5.18)

We can simulate data from this joint probability density by either one of the following two algorithms.

Algorithm 5.1
1) Draw X from the exponential density f_x(x).
2) Draw Y from the exponential density f_y|x(y | x) on the interval [x, ∞).

Algorithm 5.2
1) Draw Y from the gamma density f_y(y).
2) Draw X from the uniform density on the interval [0, Y].
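Both algorithms can be sketched in a few lines of stdlib Python (not part of the original notes; λ = 1 is an arbitrary illustrative rate, and the gamma draw in Algorithm 5.2 is obtained as a sum of two exponential draws):

```python
import random

random.seed(3)
lam, n = 1.0, 200_000  # hypothetical illustrative rate and sample size

def algorithm_5_1():
    # 1) X ~ exponential(lam); 2) Y drawn from the shifted exponential
    # conditional density lam * exp(-lam * (y - x)) on [x, inf).
    x = random.expovariate(lam)
    y = x + random.expovariate(lam)
    return x, y

def algorithm_5_2():
    # 1) Y ~ gamma(2, lam), drawn here as a sum of two exponentials;
    # 2) X uniform on [0, Y].
    y = random.expovariate(lam) + random.expovariate(lam)
    x = random.uniform(0, y)
    return x, y

draws1 = [algorithm_5_1() for _ in range(n)]
draws2 = [algorithm_5_2() for _ in range(n)]

# The two joint distributions should agree; compare the marginal means,
# E(X) = 1/lam and E(Y) = 2/lam.
mean_x1 = sum(x for x, _ in draws1) / n
mean_y1 = sum(y for _, y in draws1) / n
mean_x2 = sum(x for x, _ in draws2) / n
mean_y2 = sum(y for _, y in draws2) / n
print(mean_x1, mean_x2, mean_y1, mean_y2)
```

The marginal means from the two algorithms agree, as Eq. 5.18 says they must: both factorizations describe the same joint density.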
We will show later in this lecture that Algorithm 5.2 amounts to the following statement: conditional on (or given) the sum of two exponential random variables, which is a gamma random variable, the distribution of the first random variable is uniform.

Example 5.3 Dendritic Dynamics (Lee et al., PLoS Biology 4(2):e29, 2006). To study the neuronal remodeling that occurs in the brain on a day-to-day basis, Elly Nedivi and colleagues used a multiphoton-based microscopy system for chronic in vivo imaging and reconstruction of entire neurons in the superficial layers of the rodent cerebral cortex. Over a period of months, they observed neurons extending and retracting existing branches, and in rare cases, budding new tips. Thirty-five of 59 non-pyramidal interneuron dendritic tips showed rearrangement, and 0 of 14 pyramidal cell dendritic tips showed rearrangement. The objective is to analyze the probability of rearrangement for the two types of neurons.

Assume the propensity to change is a probability p_i ∈ (0, 1), where i = 1 denotes the interneuron propensity and i = 2 the pyramidal neuron propensity. We will perform a Bayesian analysis of this problem. Assume that p_i has a beta distribution with parameters α and β. This represents our knowledge of p_i prior to the experiment and is called a prior distribution. Given p_i and X_i : Binomial(n_i, p_i), we have

f_x|p(x_i | p_i) = C(n_i, x_i) p_i^{x_i} (1 − p_i)^{n_i − x_i}    (5.19)

for x_i = 0, 1, ..., n_i, and f(p_i) is the beta density

f(p_i) = [Γ(α + β) / (Γ(α)Γ(β))] p_i^{α−1} (1 − p_i)^{β−1}.    (5.20)

Let's compute f(x_i) and f(p_i | x_i).
The joint distribution is the product of the pmf of the discrete random variable x_i and the density of the continuous random variable p_i:

f(x_i, p_i) = f(p_i) f(x_i | p_i)
            = [Γ(α + β) / (Γ(α)Γ(β))] p_i^{α−1} (1 − p_i)^{β−1} C(n_i, x_i) p_i^{x_i} (1 − p_i)^{n_i − x_i}    (5.21)
            = [Γ(α + β) / (Γ(α)Γ(β))] [Γ(n_i + 1) / (Γ(x_i + 1)Γ(n_i − x_i + 1))] p_i^{α + x_i − 1} (1 − p_i)^{n_i + β − x_i − 1},

using the fact that if x_i is an integer, then Γ(x_i + 1) = x_i! by Factoid 1 from the Addendum to Lecture 3, so that

C(n_i, x_i) = n_i! / (x_i!(n_i − x_i)!) = Γ(n_i + 1) / (Γ(x_i + 1)Γ(n_i − x_i + 1)).    (5.22)
To compute f(x_i) we consider

f(x_i) = ∫_0^1 f(x_i, p_i) dp_i.    (5.23)

From Lecture 3 and the Addendum to Lecture 3, we know that

∫_0^1 p_i^{α + x_i − 1} (1 − p_i)^{n_i + β − x_i − 1} dp_i = Γ(α + x_i)Γ(n_i + β − x_i) / Γ(α + β + n_i).    (5.24)

Hence,

f(x_i) = [Γ(α + β) / (Γ(α)Γ(β))] [Γ(n_i + 1) / (Γ(x_i + 1)Γ(n_i − x_i + 1))] [Γ(α + x_i)Γ(n_i + β − x_i) / Γ(α + β + n_i)]    (5.25)

for x_i = 0, 1, 2, ..., n_i. If we take f(p_i) to be the beta probability density with α = β = 1, then we have the uniform probability density on [0, 1]. This means we are indifferent among all the values of p_i in [0, 1] prior to the start of the experiment, and f(x_i) simplifies to

f(x_i) = 1 / (n_i + 1).    (5.26)

Verify this.

For the Bayesian analysis we will be interested in f(p_i | x_i), the distribution of the most likely values of p_i given the data x_i, for i = 1, 2. It is

f(p_i | x_i) = f(p_i, x_i) / f(x_i)
             = { [Γ(α + β)Γ(n_i + 1) / (Γ(α)Γ(β)Γ(x_i + 1)Γ(n_i − x_i + 1))] p_i^{α + x_i − 1} (1 − p_i)^{n_i + β − x_i − 1} }
               / { Γ(α + β)Γ(n_i + 1)Γ(α + x_i)Γ(n_i + β − x_i) / [Γ(α)Γ(β)Γ(x_i + 1)Γ(n_i − x_i + 1)Γ(α + β + n_i)] }    (5.27)
             = [Γ(α + β + n_i) / (Γ(α + x_i)Γ(n_i + β − x_i))] p_i^{α + x_i − 1} (1 − p_i)^{n_i + β − x_i − 1},

which is a beta probability density with parameters α + x_i and n_i + β − x_i. We have now laid the groundwork for an analysis we will be performing on these data in a few weeks.
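The "verify this" step and the marginal in Eq. 5.25 are easy to check numerically. This sketch (not in the original notes; stdlib Python, with n_i = 10 chosen arbitrarily) evaluates Eq. 5.25 and confirms Eq. 5.26 for the uniform prior α = β = 1:

```python
from math import comb, gamma

def marginal_pmf(x, n, a, b):
    """Eq. 5.25: the marginal f(x), a beta-binomial pmf."""
    return (gamma(a + b) / (gamma(a) * gamma(b))
            * comb(n, x)
            * gamma(a + x) * gamma(n + b - x) / gamma(a + b + n))

n_i = 10  # hypothetical number of trials
# With the uniform prior (alpha = beta = 1), Eq. 5.26 gives f(x) = 1/(n+1).
vals = [marginal_pmf(x, n_i, 1.0, 1.0) for x in range(n_i + 1)]
print(vals)  # every entry equals 1/11 up to rounding
```

Under the flat prior, every outcome x_i = 0, 1, ..., n_i is equally likely a priori, which is exactly what Eq. 5.26 says.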
III. Functions of Jointly Distributed Random Variables

The two functions of jointly distributed random variables X and Y that are most commonly considered are the sum and the quotient

Z = X + Y,    Z = Y / X.    (5.28)

We will consider only the sum. The quotient is discussed in Rice, pp.

A. Sum of Two Random Variables

1. Discrete Random Variables

If X and Y are discrete random variables with joint pmf p(x, y), then Z = X + Y corresponds to the set of all events with X = x and Y = z − x. Hence,

p_z(z) = ∑_{x=−∞}^{∞} p(x, z − x),    (5.29)

or, if X and Y are independent,

p_z(z) = ∑_{x=−∞}^{∞} p_x(x) p_y(z − x).    (5.30)

The sum is called the convolution of p_x and p_y.

Example 5.4. Suppose that the numbers of channel openings at two distant (independent) sites along the frog neuromuscular junction in a 500 msec interval are Poisson random variables with parameters λ_1 and λ_2. What is the probability mass function of the total number of channel openings in a 500 msec interval at the two sites?

We have X : P(λ_1) and Y : P(λ_2), and we let Z = X + Y. We want to consider the event

{Z = X + Y = n} = ∪_{k=0}^{n} {X = k, Y = n − k}.

Hence, we have
Pr(X + Y = n) = ∑_{k=0}^{n} Pr(X = k, Y = n − k)
             = ∑_{k=0}^{n} Pr(X = k) Pr(Y = n − k)
             = ∑_{k=0}^{n} [λ_1^k e^{−λ_1} / k!] [λ_2^{n−k} e^{−λ_2} / (n − k)!]
             = e^{−(λ_1 + λ_2)} ∑_{k=0}^{n} λ_1^k λ_2^{n−k} / (k!(n − k)!)
             = [e^{−(λ_1 + λ_2)} / n!] ∑_{k=0}^{n} [n! / (k!(n − k)!)] λ_1^k λ_2^{n−k}
             = (λ_1 + λ_2)^n e^{−(λ_1 + λ_2)} / n!.    (5.31)

We see that the sum of two independent Poisson random variables is again Poisson, with parameter λ_1 + λ_2. In general, if Z = ∑_{j=1}^{J} X_j where X_j : P(λ_j), then Z : P(∑_{j=1}^{J} λ_j). We will be able to establish this latter result rigorously once we develop the concept of the moment generating function in Lecture 6.

Figure 5A. Set of X and Y values considered in the computation of the convolution of X + Y.

2. Continuous Random Variables

Now we consider the continuous case. The event Z = X + Y ≤ z corresponds to the region R_z below the line x + y = z. Thus, if the joint pdf of X and Y is f(x, y), we have
F_z(z) = Pr(X + Y ≤ z) = ∫∫_{R_z} f(x, y) dx dy
       = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f(x, y) dy dx.    (5.32)

Let y = v − x; then dy = dv, and y = z − x implies v = y + x = z − x + x = z, so

F_z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z} f(x, v − x) dv dx.    (5.33)

If ∫ f(x, v − x) dx is continuous at z, then on differentiating we have

f_z(z) = ∫_{−∞}^{∞} f(x, z − x) dx,    (5.34)

and if X and Y are independent we obtain

f_z(z) = ∫_{−∞}^{∞} f_x(x) f_y(z − x) dx,    (5.35)

which is the convolution of f_x and f_y.

Figure 5B. Convolution of two uniform random variables yields a triangular distribution.

Example 5.5. Suppose X and Y are independent random variables, both uniformly distributed on (0, 1). Find the probability density of Z = X + Y (Figure 5B).
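Before working through the two cases analytically, a quick Monte Carlo sketch (not in the original notes; stdlib Python) shows the triangular shape on (0, 2) that the derivation produces:

```python
import random

random.seed(4)
n = 300_000

# Simulate Z = X + Y for independent uniform(0, 1) draws.
zs = [random.random() + random.random() for _ in range(n)]

def triangle_density(z):
    """The triangular density on (0, 2): z on (0, 1], 2 - z on (1, 2)."""
    return z if z <= 1 else 2 - z

# Compare a normalized histogram of the sums with the triangular density.
width = 0.25
ests = []
for i in range(8):
    lo = i * width
    frac = sum(lo < z <= lo + width for z in zs) / n
    mid = lo + width / 2
    ests.append((mid, frac / width))
    print(mid, frac / width, triangle_density(mid))
```

The normalized bin heights rise linearly to a peak at z = 1 and fall linearly afterward, matching the triangle density bin for bin.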
We have

f_x(x) = 1 for 0 < x < 1, and 0 otherwise,
f_y(y) = 1 for 0 < y < 1, and 0 otherwise,    (5.36)

and z ∈ (0, 2). To derive this density we must consider two cases.

Case i) If 0 < z ≤ 1, then x + y ≤ z requires x ∈ (0, z), and

f_z(z) = ∫ f_x(x) f_y(z − x) dx = ∫_0^z dx = z.    (5.37)

Case ii) If 1 < z < 2, then x + y = z requires x ∈ (z − 1, 1), and

f_z(z) = ∫_{z−1}^{1} dx = 1 − (z − 1) = 2 − z.    (5.38)

Thus f_z(z) is the triangle probability density.

Figure 5C. Convolution of two exponential random variables yields a gamma distribution.

Example 5.4 (continued). Suppose that the length of time that each of the two ion channels is open is an exponential random variable, with parameter λ_1 for the first and λ_2 for the second. Because the channels are distant, we assume that they are independent. Find the probability density for the sum of the channel opening times if λ_1 = λ_2 = λ (Figure 5C).

Let Z = X + Y; then we have

f(x, y) = f_x(x) f_y(y) = λ² e^{−λx} e^{−λy}    (5.39)

for x > 0 and y > 0.
f_z(z) = ∫_0^z f_x(x) f_y(z − x) dx = ∫_0^z λ² e^{−λx} e^{−λ(z−x)} dx
       = λ² e^{−λz} ∫_0^z dx = λ² z e^{−λz},    (5.40)

which is the probability density of a gamma random variable with α = 2 and β = λ. Note that the convolution here, as in the case of the uniform random variables, smooths out the probability densities and makes values in the middle more likely (Figure 5C). We saw the start of a similar phenomenon with the sum of two uniform random variables in Example 5.5 (Figure 5B), which has a triangular density. This is a key observation for understanding the Central Limit Theorem in Lecture 7.

B. General Transformations of Two Random Variables

We state the general case as a proposition.

Proposition 5.1. Suppose X and Y are continuous random variables with joint pdf f(x, y), and suppose X and Y are mapped onto U and V by the transformation

u = g_1(x, y)
v = g_2(x, y),    (5.41)

and the transformation can be inverted to obtain

x = h_1(u, v)
y = h_2(u, v).    (5.42)

Assume that g_1 and g_2 have continuous partial derivatives. Then the Jacobian is

J(u, v) = | ∂h_1/∂u  ∂h_1/∂v |
          | ∂h_2/∂u  ∂h_2/∂v |

and the absolute value of its determinant is

|J(u, v)| = | (∂h_1/∂u)(∂h_2/∂v) − (∂h_1/∂v)(∂h_2/∂u) |.    (5.43)

Then
f_uv(u, v) = f_xy(h_1(u, v), h_2(u, v)) |J(u, v)|    (5.44)

for (u, v) such that u = g_1(x, y) and v = g_2(x, y) for some (x, y), and 0 elsewhere.

Notice that we used a form of Proposition 5.1 to compute the normalizing constant for the beta distribution in the Addendum to Lecture 3.

Proof (heuristic): The function g is a one-to-one mapping of a region in the (x, y) plane into a region in the (u, v) plane. It must be the case that the probability of any small region in the (x, y) plane equals the probability of the corresponding small region in the (u, v) plane that it maps into under g. That is,

f_uv(u, v) d(u, v) = f_xy(x, y) d(x, y)
f_uv(u, v) = f_xy(x, y) d(x, y) / d(u, v)
f_uv(u, v) = f_xy(x, y) |J(u, v)|
f_uv(u, v) = f_xy(h_1(u, v), h_2(u, v)) |J(u, v)|,

where, by analogy with the univariate change of variables in Lecture 4, we take |J(u, v)|, the absolute value of the determinant of J(u, v), to ensure that f_uv(u, v) ≥ 0.

Example 5.6. If X and Y are independent gamma random variables with parameters (α, λ) and (β, λ) respectively, find the joint pdf of U = X + Y and V = X(X + Y)^{−1}. The joint density of X and Y is

f_xy(x, y) = [λ^α / Γ(α)] x^{α−1} e^{−λx} [λ^β / Γ(β)] y^{β−1} e^{−λy}
           = [λ^{α+β} / (Γ(α)Γ(β))] x^{α−1} y^{β−1} e^{−λ(x+y)}.    (5.45)

Now we have u = g_1(x, y) = x + y, v = g_2(x, y) = x(x + y)^{−1}, x = h_1(u, v) = uv, and y = h_2(u, v) = u(1 − v). The Jacobian is
∂h_1/∂u = v,      ∂h_1/∂v = u,
∂h_2/∂u = 1 − v,  ∂h_2/∂v = −u.    (5.46)

The absolute value of the determinant of the Jacobian is |J(u, v)| = |−uv − u(1 − v)| = u. Thus,

f_uv(u, v) = f_xy(uv, u(1 − v)) u
           = [λ^{α+β} / (Γ(α)Γ(β))] (uv)^{α−1} [u(1 − v)]^{β−1} e^{−λu} u    (5.47)
           = [λ^{α+β} / (Γ(α)Γ(β))] u^{α+β−1} e^{−λu} v^{α−1} (1 − v)^{β−1}.

Multiplying and dividing by Γ(α + β) yields

f_uv(u, v) = { [λ^{α+β} / Γ(α + β)] u^{α+β−1} e^{−λu} } { [Γ(α + β) / (Γ(α)Γ(β))] v^{α−1} (1 − v)^{β−1} } = f_u(u) f_v(v).    (5.48)

We see that f_u(u) is the probability density of a gamma random variable with parameters α + β and λ, whereas f_v(v) is the probability density of a beta random variable with parameters α and β. Therefore, the joint probability density f_uv(u, v) is the product of a gamma and a beta density. By Lecture 4, because f_uv(u, v) = f_u(u) f_v(v), we conclude that U and V are independent. Notice that the arguments used in Eqs. 5.45 to 5.48 are essentially the arguments used to compute the normalizing constant for the beta probability density in the Addendum to Lecture 3.

Example 5.7. Let X and Y be independent standard Gaussian random variables. Find the joint density of U = X and V = X + Y. We have

u = g_1(x, y) = x,    v = g_2(x, y) = x + y,    (5.49)

with inverse

x = h_1(u, v) = u,    y = h_2(u, v) = v − u.

The Jacobian is
∂h_1/∂u = 1,   ∂h_1/∂v = 0,
∂h_2/∂u = −1,  ∂h_2/∂v = 1,

and the absolute value of its determinant is

|J(u, v)| = |(1)(1) − (0)(−1)| = 1.    (5.50)

Therefore,

f_uv(u, v) = f_xy(u, v − u) |J| = f_x(u) f_y(v − u) |J|
           = (2π)^{−1} exp{−(1/2)[u² + (v − u)²]}
           = (2π)^{−1} exp{−(1/2)[u² + v² − 2uv + u²]}
           = (2π)^{−1} exp{−(1/2)[2u² + v² − 2uv]}.    (5.51)

Now matching up the terms on the right-hand side of Eq. 5.51 with the terms in the joint density function of the bivariate Gaussian density in Eq. 4.3 yields

μ_u = μ_v = 0,
σ_u σ_v (1 − ρ²)^{1/2} = 1,
σ_u² (1 − ρ²) = 1/2,    (5.52)
σ_v² (1 − ρ²) = 1,

whose solution is ρ = 2^{−1/2}, σ_u = 1, and σ_v² = 2. Therefore, U and V have a bivariate Gaussian probability density with μ_u = μ_v = 0, σ_u = 1, σ_v = 2^{1/2}, and ρ = 2^{−1/2}.

Proposition 5.1 generalizes to n random variables (see Rice, pp.).

IV. Summary

We have shown how to construct conditional pmfs and pdfs. In so doing, we have laid important groundwork for our statistical analyses. Similarly, understanding the convolution process needed to compute the pmf or pdf of the sum of two independent random variables gives us key insight into how an important result like the Central Limit Theorem comes about.
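As a numerical coda to Proposition 5.1, the gamma-beta factorization of Example 5.6 can be checked by simulation. This sketch (not part of the original notes; stdlib Python with arbitrary illustrative parameters α = 2, β = 3, λ = 1.5) verifies the marginal means of U and V and the near-zero covariance implied by their independence:

```python
import random

random.seed(6)
alpha, beta_, lam, n = 2.0, 3.0, 1.5, 200_000  # hypothetical parameters

us, vs = [], []
for _ in range(n):
    # random.gammavariate takes (shape, scale); the scale is 1/rate.
    x = random.gammavariate(alpha, 1 / lam)
    y = random.gammavariate(beta_, 1 / lam)
    us.append(x + y)          # U = X + Y
    vs.append(x / (x + y))    # V = X / (X + Y)

mean_u = sum(us) / n   # gamma(alpha + beta, lam) mean = (alpha + beta) / lam
mean_v = sum(vs) / n   # beta(alpha, beta) mean = alpha / (alpha + beta)
cov = sum((u - mean_u) * (v - mean_v) for u, v in zip(us, vs)) / n
print(mean_u, mean_v, cov)  # cov near 0, consistent with independence
```

The sample means match the gamma and beta marginals of Eq. 5.48, and the covariance of U and V is indistinguishable from zero, as independence requires.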
Acknowledgments

I am grateful to Paymon Hosseini for making the figures, to James Mutch for helpful editorial comments, and to Julie Scott for technical assistance.

References

Lee W, Huang H, Feng G, Sanes JR, Brown EN, So PT, Nedivi E. Dynamic remodeling of dendritic arbors in non-pyramidal neurons of adult visual cortex. Public Library of Science Biology 4(2):e29, 2006.

Rice JA. Mathematical Statistics and Data Analysis, 3rd edition. Boston, MA.
MIT OpenCourseWare
Statistics for Brain and Cognitive Science
Fall 2016

For information about citing these materials or our Terms of Use, visit:
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More informationSTAT 801: Mathematical Statistics. Distribution Theory
STAT 81: Mathematical Statistics Distribution Theory Basic Problem: Start with assumptions about f or CDF of random vector X (X 1,..., X p ). Define Y g(x 1,..., X p ) to be some function of X (usually
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationChapter 2 Continuous Distributions
Chapter Continuous Distributions Continuous random variables For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following
More informationMultiple Random Variables
Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x
More informationChapter 5. Random Variables (Continuous Case) 5.1 Basic definitions
Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on
More informationChapter 5 continued. Chapter 5 sections
Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationThis exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.
TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to
More informationCS145: Probability & Computing
CS45: Probability & Computing Lecture 0: Continuous Bayes Rule, Joint and Marginal Probability Densities Instructor: Eli Upfal Brown University Computer Science Figure credits: Bertsekas & Tsitsiklis,
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationClass 8 Review Problems solutions, 18.05, Spring 2014
Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots
More informationECE 302 Division 2 Exam 2 Solutions, 11/4/2009.
NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total
More informationBMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs
Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem
More informationBasics on Probability. Jingrui He 09/11/2007
Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability
More informationAPPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2
APPM/MATH 4/5520 Solutions to Exam I Review Problems. (a) f X (x ) f X,X 2 (x,x 2 )dx 2 x 2e x x 2 dx 2 2e 2x x was below x 2, but when marginalizing out x 2, we ran it over all values from 0 to and so
More informationStatistics for scientists and engineers
Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3
More informationMA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems
MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions
More informationUCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, Practice Final Examination (Winter 2017)
UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, 208 Practice Final Examination (Winter 207) There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable
More informationRandom Variables (Continuous Case)
Chapter 6 Random Variables (Continuous Case) Thus far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions
More informationJoint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:
Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationChapter 5. Chapter 5 sections
1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationThree hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER.
Three hours To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER EXTREME VALUES AND FINANCIAL RISK Examiner: Answer QUESTION 1, QUESTION
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested
More informationClosed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.
IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,
More informationTwo hours. Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER. 14 January :45 11:45
Two hours Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER PROBABILITY 2 14 January 2015 09:45 11:45 Answer ALL four questions in Section A (40 marks in total) and TWO of the THREE questions
More informationStatistics, Data Analysis, and Simulation SS 2015
Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler
More informationSTAT 430/510: Lecture 15
STAT 430/510: Lecture 15 James Piette June 23, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.4... Conditional Distribution: Discrete Def: The conditional
More informationCh3. Generating Random Variates with Non-Uniform Distributions
ST4231, Semester I, 2003-2004 Ch3. Generating Random Variates with Non-Uniform Distributions This chapter mainly focuses on methods for generating non-uniform random numbers based on the built-in standard
More informationMultivariate random variables
DS-GA 002 Lecture notes 3 Fall 206 Introduction Multivariate random variables Probabilistic models usually include multiple uncertain numerical quantities. In this section we develop tools to characterize
More informationMath 151. Rumbos Spring Solutions to Review Problems for Exam 2
Math 5. Rumbos Spring 22 Solutions to Review Problems for Exam 2. Let X and Y be independent Normal(, ) random variables. Put Z = Y X. Compute the distribution functions (z) and (z). Solution: Since X,
More informationPractice Examination # 3
Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single
More informationMATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours
MATH2750 This question paper consists of 8 printed pages, each of which is identified by the reference MATH275. All calculators must carry an approval sticker issued by the School of Mathematics. c UNIVERSITY
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More informationChapter 2: Random Variables
ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:
More information2. Conditional Expectation (9/15/12; cf. Ross)
2. Conditional Expectation (9/15/12; cf. Ross) Intro / Definition Examples Conditional Expectation Computing Probabilities by Conditioning 1 Intro / Definition Recall conditional probability: Pr(A B) Pr(A
More informationWe introduce methods that are useful in:
Instructor: Shengyu Zhang Content Derived Distributions Covariance and Correlation Conditional Expectation and Variance Revisited Transforms Sum of a Random Number of Independent Random Variables more
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More informationCh. 8 Math Preliminaries for Lossy Coding. 8.4 Info Theory Revisited
Ch. 8 Math Preliminaries for Lossy Coding 8.4 Info Theory Revisited 1 Info Theory Goals for Lossy Coding Again just as for the lossless case Info Theory provides: Basis for Algorithms & Bounds on Performance
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. Exercise 4.1 Let X be a random variable with p(x)
More informationCommon ontinuous random variables
Common ontinuous random variables CE 311S Earlier, we saw a number of distribution families Binomial Negative binomial Hypergeometric Poisson These were useful because they represented common situations:
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationMultivariate Random Variable
Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate
More informationStochastic Models of Manufacturing Systems
Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/
More informationChapter 4. Continuous Random Variables 4.1 PDF
Chapter 4 Continuous Random Variables In this chapter we study continuous random variables. The linkage between continuous and discrete random variables is the cumulative distribution (CDF) which we will
More informationBMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution
Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random
More informationStat410 Probability and Statistics II (F16)
Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two
More informationMathematical statistics
October 4 th, 2018 Lecture 12: Information Where are we? Week 1 Week 2 Week 4 Week 7 Week 10 Week 14 Probability reviews Chapter 6: Statistics and Sampling Distributions Chapter 7: Point Estimation Chapter
More information