ACM 116: Lectures 3-4


Outline

- Joint distributions
- The multivariate normal distribution
- Conditional distributions
- Independent random variables
- Conditional distributions and Monte Carlo: rejection sampling
- Variance of a random variable
- Covariance
- Conditional expectation

Joint distributions

Two random variables X and Y; we are interested in their joint outcome (X, Y) = (x, y), i.e. X = x and Y = y.

Example: a server on the web.
X: # customers hitting your server in the next hour
Y: # customers in the next 10 minutes, or # customers who will purchase an item in the next hour

Joint frequency function

X, Y discrete r.v.'s taking on values x_1, x_2, ... and y_1, y_2, ... respectively. The joint frequency (distribution) function is

    p(x_i, y_j) = P(X = x_i, Y = y_j)

Marginal probabilities:

    P(X = x_i) = Σ_j P(X = x_i, Y = y_j)

Why? The addition rule.

Joint frequency function

Several random variables: similar story. X_1, ..., X_m r.v.'s defined on the same sample space:

    p(x_1, ..., x_m) = P(X_1 = x_1, ..., X_m = x_m)

    p_{X_1}(x_1) = Σ_{x_2,...,x_m} p(x_1, x_2, ..., x_m)

    p_{X_1,X_2}(x_1, x_2) = Σ_{x_3,...,x_m} p(x_1, x_2, x_3, ..., x_m)
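The marginalization rule can be checked on a tiny example. A minimal Python sketch, where the joint pmf values are made up for illustration:

```python
# Hypothetical joint pmf p(x, y) for X, Y each taking values in {0, 1}.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def marginal_x(joint, x):
    """P(X = x) = sum over y of p(x, y) (the addition rule)."""
    return sum(p for (xi, yi), p in joint.items() if xi == x)

print(marginal_x(joint, 0))  # 0.1 + 0.2 = 0.3
print(marginal_x(joint, 1))  # 0.3 + 0.4 = 0.7
```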

Example: the multinomial distribution

n fixed independent experiments, each resulting in one of r possible outcomes with probabilities p_1, ..., p_r. Let X_i, 1 ≤ i ≤ r, be the number of outcomes of type i. What is the joint distribution of the X_i's?

    P(X_1 = x_1, ..., X_r = x_r) = n! / (x_1! ··· x_r!) · p_1^{x_1} ··· p_r^{x_r}

Why?

Example: the multinomial distribution

Each outcome of the experiments such that (X_1 = x_1, ..., X_r = x_r) has probability p_1^{x_1} ··· p_r^{x_r} (all such outcomes are equally likely). Indeed:

    First x_1 experiments of type 1: probability p_1^{x_1}
    Next x_2 experiments of type 2: probability p_2^{x_2}
    ...
    Last x_r experiments of type r: probability p_r^{x_r}

How many configurations?

    C(n, x_1) C(n − x_1, x_2) ··· C(x_r, x_r) = n! / (x_1! ··· x_r!)
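The multinomial pmf is easy to compute directly. A short sketch (the counts and probabilities below are arbitrary illustrative values):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(X1=x1, ..., Xr=xr) = n!/(x1! ... xr!) * p1^x1 ... pr^xr."""
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)          # multinomial coefficient
    p = 1.0
    for x, pi in zip(counts, probs):
        p *= pi ** x
    return coef * p

# n = 3 trials with outcome probabilities 0.2, 0.3, 0.5:
print(multinomial_pmf((1, 1, 1), (0.2, 0.3, 0.5)))  # 6 * 0.03 = 0.18
```

Summing the pmf over all compositions of n gives 1, as it must for a distribution.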

Joint distributions: the continuous case

X and Y continuous r.v.'s, with joint density function f(x, y) such that

    P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy

for any reasonable set A; P((X, Y) ∈ A) is the volume under f over A.

Interpretation (continuous case):

    P(x ≤ X ≤ x + dx, y ≤ Y ≤ y + dy) ≈ f(x, y) dx dy

Marginal density of X:

    f_X(x) = ∫ f_XY(x, y) dy

Joint distributions: the continuous case

Example: pick a point uniformly at random in the disk of radius 1:

    f(x, y) = 1/π  if x² + y² ≤ 1,
              0    otherwise

Let R be the distance from the point to the origin. What is the density of R?

Joint distributions: the continuous case

R: distance from the point to the origin. X: first coordinate.

Density of R: for 0 ≤ r ≤ 1,

    P(R ≤ r) = πr² / π = r²,   so   f_R(r) = 2r

and f_R(r) = 0 otherwise.

Density of X: for −1 ≤ x ≤ 1,

    f_X(x) = ∫ f_XY(x, y) dy = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π) √(1 − x²)

and f_X(x) = 0 otherwise.
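The distribution of R can be checked by simulation: draw points uniformly in the disk and estimate P(R ≤ 1/2), which should be (1/2)² = 1/4. A quick Monte Carlo sketch (sample size chosen arbitrarily):

```python
import random

random.seed(0)

def sample_disk():
    """Draw a point uniformly on the unit disk by rejection from the square."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

n = 200_000
count = 0
for _ in range(n):
    x, y = sample_disk()
    if x * x + y * y <= 0.25:    # the event {R <= 1/2}, since R^2 = x^2 + y^2
        count += 1

est = count / n
print(est)  # should be close to P(R <= 1/2) = 0.25
```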

Bivariate normal density

Five parameters: −∞ < μ_x, μ_y < ∞, 0 < σ_x, σ_y < ∞, −1 ≤ ρ ≤ 1. Extensively used in modeling. Galton, mid 19th century: modeling of the heights of fathers and sons, e.g. X = father's height, Y = son's height.

Density: with x = (x, y), μ = (μ_x, μ_y) the vector of means, and covariance matrix

    Σ = [ σ_x²      ρσ_xσ_y ]
        [ ρσ_xσ_y   σ_y²    ]

the density is

    f_XY(x, y) = 1 / (2π √det(Σ)) · exp( −(1/2) (x − μ)ᵀ Σ⁻¹ (x − μ) )

Bivariate normal density

Level sets are ellipses. An interesting calculation: the marginals are

    X ~ N(μ_x, σ_x²),   Y ~ N(μ_y, σ_y²)
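The moments can be checked numerically by building (X, Y) from two independent standard normals — a standard construction, not taken from these notes. The parameter values below are arbitrary:

```python
import math
import random

random.seed(1)
mu_x, mu_y, s_x, s_y, rho = 1.0, -2.0, 2.0, 0.5, 0.6

xs, ys = [], []
for _ in range(200_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = mu_x + s_x * z1
    y = mu_y + s_y * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    xs.append(x)
    ys.append(y)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
print(mean_x, var_x, var_y, cov)  # near mu_x, s_x^2, s_y^2, rho*s_x*s_y
```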

Independent random variables

Two random variables are independent if

    P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)

for any subsets A and B.

Discrete r.v.'s: independence iff P(X = x, Y = y) = P(X = x) P(Y = y).

Continuous r.v.'s: independence iff f_XY(x, y) = f_X(x) f_Y(y). (The general case is more complicated.)

Example

A node of a communication network: if two packets of information arrive within time τ of each other, they collide and then have to be retransmitted. The times of arrival are independent U[0, T]. What is the probability that they collide?

Independent random variables

T_1, T_2 independent U[0, T]. Joint distribution:

    f(t_1, t_2) = 1/T²,   the uniform distribution over the square [0, T]²

The collision event {|T_1 − T_2| ≤ τ} is a band of width 2τ around the diagonal of the square, so

    P(collision) = 1 − (1 − τ/T)² = 2τ/T − (τ/T)² = (τ/T)(2 − τ/T)
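The collision probability can be verified by simulation; a quick Monte Carlo sketch (the values T = 1 and τ = 0.1 are arbitrary):

```python
import random

random.seed(2)

T, tau = 1.0, 0.1
n = 200_000
# Count pairs of independent U[0, T] arrivals within tau of each other.
collisions = sum(abs(random.uniform(0, T) - random.uniform(0, T)) <= tau
                 for _ in range(n))
est = collisions / n
exact = (tau / T) * (2 - tau / T)   # = 0.19 for tau/T = 0.1
print(est, exact)
```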

Conditional distributions

Discrete case: X and Y jointly discrete r.v.'s. The conditional probability of X = x_i given Y = y_j is

    P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j)

Notation: p_{X|Y}.

Continuous case: X and Y jointly continuous r.v.'s. The conditional density of Y given X is defined to be

    f_{Y|X}(y|x) = f_XY(x, y) / f_X(x)

NB:

    P(y ≤ Y ≤ y + dy | x ≤ X ≤ x + dx) ≈ f_XY(x, y) dx dy / (f_X(x) dx) = f_XY(x, y) dy / f_X(x)

Conditional distributions

Independence: f_{Y|X} = f_Y.

Joint density = conditional × marginal:

    f_XY(x, y) = f_{Y|X}(y|x) f_X(x)

Law of total probability:

    f_Y(y) = ∫ f_{Y|X}(y|x) f_X(x) dx

NB: p_{X|Y}(x | Y = y) is a distribution:

    p_{X|Y}(x | Y = y) ≥ 0,   Σ_x p_{X|Y}(x | Y = y) = 1

i.e. a conditional distribution.

Example: imperfect particle detector

The detector detects each incoming particle with probability p only. Let N be the true number of particles, with a Poisson distribution

    P(N = n) = e^{−λ} λⁿ / n!

and let X be the detected number of particles. What is the distribution of the detected number of particles?

Conditional distributions

Given N = n, the detected count is Binomial: p_{X|N=n} ~ Binomial(n, p). Hence

    P(X = k) = Σ_{n≥k} P(N = n) P(X = k | N = n)
             = Σ_{n≥k} C(n, k) p^k (1 − p)^{n−k} e^{−λ} λⁿ / n!
             = ((λp)^k / k!) e^{−λ} Σ_{n≥k} (λ(1 − p))^{n−k} / (n − k)!
             = ((λp)^k / k!) e^{−λ} e^{λ(1−p)}
             = e^{−λp} (λp)^k / k!

so X ~ Poisson(λp).
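This "thinning" result can be checked by simulation: the detected counts should have mean and variance both near λp, as a Poisson(λp) must. A sketch with arbitrary λ and p, using a simple cdf-walk Poisson sampler (an assumption of this sketch, not part of the notes):

```python
import math
import random

random.seed(3)
lam, p = 4.0, 0.3

def poisson(lam):
    """Sample Poisson(lam) by walking the cdf (fine for moderate lam)."""
    u, k = random.random(), 0
    prob = math.exp(-lam)
    cdf = prob
    while u > cdf and k < 10_000:    # cap is a numerical safety net
        k += 1
        prob *= lam / k
        cdf += prob
    return k

n = 100_000
xs = []
for _ in range(n):
    N = poisson(lam)
    # each of the N particles is detected independently with probability p
    xs.append(sum(random.random() < p for _ in range(N)))

mean_x = sum(xs) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n
print(mean_x, var_x)  # both should be near lam * p = 1.2
```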

Conditional distributions

Example: bivariate normal density. If X and Y have a bivariate normal distribution, the distribution of Y given X = x is normal too:

    Y | X = x ~ N( μ_y + ρ (σ_y/σ_x)(x − μ_x), (1 − ρ²) σ_y² )

This is linear regression!

Sampling & Monte Carlo

Suppose f is a density function we wish to sample from; f is nonzero on an interval [a, b] and zero outside the interval (a and b may be infinite). Let M be a function such that M(x) ≥ f(x) on [a, b], and let

    m(x) = M(x) / ∫_a^b M(x) dx

The idea is to choose M so that it is easy to generate random variables from m. If [a, b] is finite, m can be chosen to be the uniform distribution.

Rejection sampling algorithm

Step 1: Generate T with the density m.
Step 2: Generate U, uniform on [0, 1] and independent of T.
        If U M(T) ≤ f(T), accept: X = T. Otherwise reject and go to Step 1.

[Figure: proposals with U M(T) below the graph of f are accepted; those between f and M are rejected.]

Remark: the algorithm is efficient if it accepts with high probability, i.e. if M is close to f.

Why does this work?

We want to show that P(X ∈ A) = ∫_A f(t) dt. Write I = ∫_a^b M(t) dt, so that m(t) = M(t)/I. Then

    P(X ∈ A) = P(T ∈ A | Accept) = P(T ∈ A and Accept) / P(Accept)

Conditioning on T = t (whose density is m),

    P(T ∈ A and Accept) = ∫_a^b P(T ∈ A and Accept | T = t) m(t) dt
                        = ∫_a^b P(U ≤ f(t)/M(t) and t ∈ A) m(t) dt
                        = ∫_A (f(t)/M(t)) m(t) dt = (1/I) ∫_A f(t) dt

Similarly,

    P(Accept) = ∫_a^b P(Accept | T = t) m(t) dt = ∫_a^b (f(t)/M(t)) m(t) dt = 1/I

and the ratio is ∫_A f(t) dt, as claimed.

Example

Suppose we want to sample from a density f supported on [0, 1] whose graph is shown below.

[Figure: graph of the density f over the interval [0, 1].]

In this case we let M be the maximum of f over the interval [0, 1],

    M(x) = max f,   0 ≤ x ≤ 1

so that m is the uniform density over [0, 1].

Implementation

    N = 1000; t = (1:N)/N;
    M = max(f(t));
    n = 2000;
    x = rejection_sampling(n, @f, M);

This routine samples n iid draws from the density f:

    function x = rejection_sampling(n, fun, M)
    x = [];
    for k = 1:n
        OK = 0;
        while not(OK)
            T = rand(1);           % generate T from the uniform proposal
            U = rand(1);           % generate U, uniform on [0,1]
            if (M*U <= fun(T))     % accept when U*M <= f(T)
                OK = 1;
                x = [x T];
            end
        end
    end
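For readers without MATLAB, here is a Python port of the same routine. The target density f(x) = 2x on [0, 1] is a hypothetical choice for illustration (so M = 2 and the proposal m is uniform on [0, 1]):

```python
import random

random.seed(4)

def f(x):
    """Hypothetical target density on [0, 1]: f(x) = 2x."""
    return 2.0 * x

def rejection_sampling(n, fun, M):
    """Draw n iid samples from fun by rejection, where M >= fun on [0, 1]."""
    out = []
    while len(out) < n:
        T = random.random()       # proposal from m = Uniform[0, 1]
        U = random.random()       # independent uniform acceptance variable
        if M * U <= fun(T):       # accept when U*M <= f(T)
            out.append(T)
    return out

xs = rejection_sampling(20_000, f, 2.0)
mean = sum(xs) / len(xs)
print(mean)  # E[X] = ∫ x · 2x dx = 2/3 for this f
```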

Visualization of the results

Figure 1: Histogram of the sampled data.

Variance and Standard Deviation

The expected value can be viewed as an indication of the central value of the density or frequency function; the variance or standard deviation gives an indication of the dispersion of the probability distribution.

The variance of a random variable X is the mean square deviation from E[X] = μ:

    Var(X) = E[(X − μ)²] = E[(X − E[X])²]

The standard deviation is the square root of the variance:

    SD(X) = √Var(X)

We often write μ = E[X] and σ² = Var(X), so that σ = SD(X). We may also calculate the variance as follows:

    Var(X) = E[X²] − (E[X])²

Examples

Bernoulli: X ~ Ber(p):

    Var(X) = p(1 − p)

Normal: X ~ N(μ, σ²):

    Var(X) = 1/(σ√(2π)) ∫ (x − μ)² e^{−(x−μ)²/(2σ²)} dx
           = σ²/√(2π) ∫ u² e^{−u²/2} du = σ²

(substituting u = (x − μ)/σ).

Remark: Y = a + bX ⟹ Var(Y) = b² Var(X).

Example: X standard normal (μ = 0, σ = 1) and Y = μ + σX ~ N(μ, σ²). Then Var(Y) = σ² Var(X) = σ².
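Both the Bernoulli formula and the scaling rule can be checked exactly on a small pmf. A sketch with an arbitrary p:

```python
# Bernoulli(p): check Var(X) = E[X^2] - (E[X])^2 = p(1 - p).
p = 0.3
ex = 0 * (1 - p) + 1 * p            # E[X]
ex2 = 0**2 * (1 - p) + 1**2 * p     # E[X^2]
var = ex2 - ex**2                   # shortcut formula
print(var, p * (1 - p))             # both 0.21

# Var(a + bX) = b^2 Var(X): the shift a drops out, the scale b squares.
a, b = 5.0, -2.0
var_y = b**2 * var
print(var_y)  # 4 * 0.21 = 0.84
```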

Covariance and correlation

The variance is a measure of the variability of a r.v.; the covariance is a measure of the joint variability of two r.v.'s. The covariance of two jointly distributed r.v.'s (X, Y) is given by

    Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[(X − μ_X)(Y − μ_Y)]

Another formula:

    Cov(X, Y) = E[XY] − E[X] E[Y]

Interpretation

Positive covariance: when X is larger than its mean, Y tends to be larger than its mean as well; when X is smaller than its mean, Y tends to be smaller than its mean as well.

Negative covariance: when X is larger than its mean, Y tends to be smaller than its mean; when X is smaller than its mean, Y tends to be larger than its mean.

Properties

1. Cov(X, X) = Var(X).

2. For two r.v.'s X and Y:

    Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y)

3. If X and Y are independent, then Cov(X, Y) = 0 (= E[XY] − E[X]E[Y]).

4. Let X_1, X_2, ..., X_m be m r.v.'s and Y_1, Y_2, ..., Y_n be n r.v.'s, and set S_1 = Σ_{i=1}^m X_i and S_2 = Σ_{j=1}^n Y_j. Then

    Var(S_1) = Σ_{i=1}^m Var(X_i) + 2 Σ_{i<j} Cov(X_i, X_j)

and

    Cov(S_1, S_2) = Σ_{i=1}^m Σ_{j=1}^n Cov(X_i, Y_j)

5. Suppose X_1, X_2, ..., X_m are independent:

    Var(X_1 + ··· + X_m) = Var(X_1) + ··· + Var(X_m)

Example: the variance of the binomial

X ~ Bin(n, p). X is the sum of n independent Bernoullis, X = I_1 + ··· + I_n, and therefore

    Var(X) = Σ_{i=1}^n Var(I_i) = Σ_{i=1}^n p(1 − p) = np(1 − p)
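The claim can be checked exactly by summing over the binomial pmf (the values n = 10, p = 0.3 below are arbitrary):

```python
from math import comb

n, p = 10, 0.3
# Binomial pmf: P(X = k) = C(n, k) p^k (1-p)^(n-k).
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean) ** 2 * q for k, q in enumerate(pmf))
print(mean, var)  # np = 3.0 and np(1-p) = 2.1
```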

Correlation

The correlation between two r.v.'s X and Y is defined by

    ρ = Cov(X, Y) / √(Var(X) Var(Y))

The correlation is a dimensionless quantity, and it obeys −1 ≤ ρ ≤ 1. It is a measure of the strength of the linear relationship between X and Y:

    ρ = ±1  ⟺  Y = aX + b

Remark: if (X, Y) is bivariate normal, the correlation between X and Y is the parameter ρ, and Cov(X, Y) = ρ σ_X σ_Y.

Conditional expectation

The conditional expectation of Y given X = x is the mean of the conditional distribution of Y given X = x. For example, in the discrete case we have

    E[Y | X = x] = Σ_y y p_{Y|X}(y|x)

and, more generally, the conditional expectation of h(Y) given X = x is

    E[h(Y) | X = x] = Σ_y h(y) p_{Y|X}(y|x)

Example

X ~ Binomial(n, p). Set m ≤ n and let Y be the number of successes in the first m trials. What are the conditional distribution and mean of Y given X = x?

Conditional distribution:

    P(Y = y | X = x) = C(m, y) C(n − m, x − y) / C(n, x)

Conditional mean: writing Y = I_1 + ··· + I_m,

    E(Y | X = x) = E(I_1 | X = x) + ··· + E(I_m | X = x)
                 = P(I_1 = 1 | X = x) + ··· + P(I_m = 1 | X = x)
                 = x/n + ··· + x/n = (m/n) x
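Both claims can be checked by enumerating the conditional pmf for small parameters (the values n = 10, m = 4, x = 6 are arbitrary):

```python
from math import comb

n, m, x = 10, 4, 6   # n trials total, first m trials, condition on X = x
# Conditional pmf: P(Y = y | X = x) = C(m, y) C(n-m, x-y) / C(n, x),
# supported where both binomial coefficients make sense.
ys = range(max(0, x - (n - m)), min(m, x) + 1)
pmf = {y: comb(m, y) * comb(n - m, x - y) / comb(n, x) for y in ys}

total = sum(pmf.values())
cond_mean = sum(y * q for y, q in pmf.items())
print(total)      # 1 (it is a distribution)
print(cond_mean)  # (m/n) x = 4*6/10 = 2.4
```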

Conditional expectation as a random variable

Assuming that the conditional expectation of Y given X = x exists for every x, it is a well-defined function of X and, hence, a random variable. (The expectation of Y given X = x is a function of x: E[Y | X = x] = g(x); then E[Y | X] = g(X) is a r.v.)

Example: E(Y | X) = (m/n) X.

This random variable has an expectation and a variance (provided absolute convergence, etc.). Its expectation is E[E(Y | X)].

Iterated Expectation Theorem

    E(Y) = E[E(Y | X)]

Interpretation: the expectation of Y can be calculated by first conditioning on X, finding E(Y | X), and then averaging this quantity over X:

    E[Y] = Σ_x E[Y | X = x] p_X(x),   where   E[Y | X = x] = Σ_y y p_{Y|X}(y|x)

Example: E(Y | X) = (m/n) X and E(X) = np give

    E(Y) = (m/n) E(X) = mp
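The binomial example E(Y) = mp can be checked by averaging (m/n)x over the binomial pmf of X (parameters below are arbitrary):

```python
from math import comb

n, m, p = 10, 4, 0.3
# E[Y] = E[E(Y | X)] = sum over x of (m/n) x * P(X = x) = (m/n) * np = mp
ey = sum((m / n) * x * comb(n, x) * p**x * (1 - p)**(n - x)
         for x in range(n + 1))
print(ey)  # m*p = 1.2
```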

Proof

We need to show E(Y) = Σ_x E(Y | X = x) P(X = x), where E(Y | X = x) = Σ_y y p_{Y|X}(y|x). We have

    Σ_x E(Y | X = x) p_X(x) = Σ_y y Σ_x p_{Y|X}(y|x) p_X(x)
                            = Σ_y y p_Y(y) = E(Y)

Random Sums

Consider sums of the type

    T = Σ_{i=1}^N X_i

where N is a r.v. with finite expectation E(N) < ∞, and the X_i's are independent of N, with common mean E[X_i] = E[X] for all i. Such sums arise in a variety of applications:

- Insurance companies might receive N claims in a given period of time, and the amounts of the individual claims may be modeled as r.v.'s X_1, X_2, ....
- N is the number of jobs in a single-server queue and X_i the service time for the ith job; T is the time to serve all the jobs in the queue.

By iterated expectation,

    E(T) = Σ_n E(T | N = n) P(N = n)

and

    E(T | N = n) = E( Σ_{i=1}^n X_i ) = n E(X)

i.e.

    E(T) = Σ_n n E(X) P(N = n) = E(N) E(X)

In other words, the average time to complete N jobs when N is random is the average value of N times the average time to complete a job.
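The identity E(T) = E(N) E(X) can be checked by simulation. A sketch with hypothetical choices N ~ Uniform{0, ..., 4} and X_i exponential with rate 2:

```python
import random

random.seed(5)

# T = X_1 + ... + X_N with N ~ Uniform{0,...,4} (E[N] = 2)
# and X_i ~ Exp(rate 2) (E[X] = 0.5), so E[T] should be 2 * 0.5 = 1.
n_trials = 100_000
total = 0.0
for _ in range(n_trials):
    N = random.randrange(5)
    total += sum(random.expovariate(2.0) for _ in range(N))

est = total / n_trials
print(est)  # should be close to E[N] * E[X] = 1.0
```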

The Moment Generating Function

The moment generating function (mgf) of a random variable X is given by

    M(t) = E[e^{tX}]

(In the continuous case, M(t) = ∫ e^{tx} f(x) dx, the Laplace transform of f.)

The mgf may not exist: if X is Cauchy distributed (f(x) = (π(1 + x²))⁻¹), the mgf does not exist for any t ≠ 0. The mgf of a normal random variable exists for all t.

Important property: if the mgf exists for t in an open interval containing zero, it uniquely determines the probability distribution.

Why the Name MGF?

    e^{tX} = 1 + tX + t²X²/2! + ··· + t^r X^r/r! + ···

    E(e^{tX}) = 1 + t E(X) + t² E(X²)/2! + ··· + t^r E(X^r)/r! + ···

Suppose that the mgf exists in an open interval containing 0. Then

    M(0) = 1,   M'(0) = E(X),   ...,   M^{(r)}(0) = E(X^r)

This may be useful to compute all the moments of a distribution.

Example: mgf of the Poisson distribution

    M(t) = Σ_{k≥0} e^{tk} e^{−λ} λ^k/k! = e^{−λ} Σ_{k≥0} (e^t λ)^k / k! = e^{−λ} e^{λe^t}

This gives

    M'(t) = λ e^t M(t),   M''(t) = λ e^t M(t) + λ² e^{2t} M(t)

It follows that if X is Poisson,

    E(X) = λ,   E(X²) = λ² + λ,   Var(X) = λ
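The moment formulas can be sanity-checked by differentiating the Poisson mgf numerically at t = 0 (finite differences; λ = 2.5 is an arbitrary choice):

```python
import math

lam = 2.5
M = lambda t: math.exp(lam * (math.exp(t) - 1))  # Poisson mgf: e^{λ(e^t − 1)}

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)             # central difference ≈ M'(0) = E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2     # second difference ≈ M''(0) = E[X^2]
print(m1, m2, m2 - m1**2)                 # ≈ λ, λ² + λ, and variance ≈ λ
```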

Mgf of Sums of Independent Random Variables

Very useful property: if X and Y are independent r.v.'s with mgf's M_X and M_Y, and S = X + Y, then

    M_S(t) = M_X(t) M_Y(t)

on the common interval where both mgf's exist. Why?

    M_S(t) = E(e^{t(X+Y)}) = E(e^{tX} e^{tY}) = E(e^{tX}) E(e^{tY}) = M_X(t) M_Y(t)

where the third equality uses independence. This extends in the obvious way to sums of more than two independent random variables.

Example

The sum of two independent Poisson r.v.'s is a Poisson r.v.: if X ~ Poi(λ) and Y ~ Poi(μ) are independent, then X + Y ~ Poi(λ + μ). Indeed,

    M_{X+Y}(t) = e^{−λ} e^{λe^t} · e^{−μ} e^{μe^t} = e^{−(λ+μ)} e^{(λ+μ)e^t}

and this is the mgf of a Poisson r.v. with parameter λ + μ.
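The identity can be confirmed numerically: the product of the two Poisson mgf's agrees with the Poi(λ + μ) mgf at every t (the parameter values and test points below are arbitrary):

```python
import math

def poisson_mgf(lam):
    """Return the mgf t -> e^{λ(e^t − 1)} of a Poisson(λ) r.v."""
    return lambda t: math.exp(lam * (math.exp(t) - 1))

lam, mu = 1.5, 2.0
Mx, My, Ms = poisson_mgf(lam), poisson_mgf(mu), poisson_mgf(lam + mu)

for t in (-1.0, -0.3, 0.0, 0.7):
    print(Mx(t) * My(t), Ms(t))  # the two columns agree
```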


More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Probability Background

Probability Background Probability Background Namrata Vaswani, Iowa State University August 24, 2015 Probability recap 1: EE 322 notes Quick test of concepts: Given random variables X 1, X 2,... X n. Compute the PDF of the second

More information

Actuarial Science Exam 1/P

Actuarial Science Exam 1/P Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,

More information

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance Covariance Lecture 0: Covariance / Correlation & General Bivariate Normal Sta30 / Mth 30 We have previously discussed Covariance in relation to the variance of the sum of two random variables Review Lecture

More information

Multivariate probability distributions and linear regression

Multivariate probability distributions and linear regression Multivariate probability distributions and linear regression Patrik Hoyer 1 Contents: Random variable, probability distribution Joint distribution Marginal distribution Conditional distribution Independence,

More information

Conditional distributions. Conditional expectation and conditional variance with respect to a variable.

Conditional distributions. Conditional expectation and conditional variance with respect to a variable. Conditional distributions Conditional expectation and conditional variance with respect to a variable Probability Theory and Stochastic Processes, summer semester 07/08 80408 Conditional distributions

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information

Lecture 1: Review on Probability and Statistics

Lecture 1: Review on Probability and Statistics STAT 516: Stochastic Modeling of Scientific Data Autumn 2018 Instructor: Yen-Chi Chen Lecture 1: Review on Probability and Statistics These notes are partially based on those of Mathias Drton. 1.1 Motivating

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

MATH Notebook 4 Fall 2018/2019

MATH Notebook 4 Fall 2018/2019 MATH442601 2 Notebook 4 Fall 2018/2019 prepared by Professor Jenny Baglivo c Copyright 2004-2019 by Jenny A. Baglivo. All Rights Reserved. 4 MATH442601 2 Notebook 4 3 4.1 Expected Value of a Random Variable............................

More information

Chapter 5 Class Notes

Chapter 5 Class Notes Chapter 5 Class Notes Sections 5.1 and 5.2 It is quite common to measure several variables (some of which may be correlated) and to examine the corresponding joint probability distribution One example

More information

Probability- the good parts version. I. Random variables and their distributions; continuous random variables.

Probability- the good parts version. I. Random variables and their distributions; continuous random variables. Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density

More information

Lecture 2: Review of Probability

Lecture 2: Review of Probability Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

Lecture 11. Multivariate Normal theory

Lecture 11. Multivariate Normal theory 10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances

More information

Lecture 5: Moment generating functions

Lecture 5: Moment generating functions Lecture 5: Moment generating functions Definition 2.3.6. The moment generating function (mgf) of a random variable X is { x e tx f M X (t) = E(e tx X (x) if X has a pmf ) = etx f X (x)dx if X has a pdf

More information

The Binomial distribution. Probability theory 2. Example. The Binomial distribution

The Binomial distribution. Probability theory 2. Example. The Binomial distribution Probability theory Tron Anders Moger September th 7 The Binomial distribution Bernoulli distribution: One experiment X i with two possible outcomes, probability of success P. If the experiment is repeated

More information

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution 1 ACM 116: Lecture 2 Agenda Independence Bayes rule Discrete random variables Bernoulli distribution Binomial distribution Continuous Random variables The Normal distribution Expected value of a random

More information

Stat 5101 Notes: Algorithms

Stat 5101 Notes: Algorithms Stat 5101 Notes: Algorithms Charles J. Geyer January 22, 2016 Contents 1 Calculating an Expectation or a Probability 3 1.1 From a PMF........................... 3 1.2 From a PDF...........................

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Lecture 4: Proofs for Expectation, Variance, and Covariance Formula

Lecture 4: Proofs for Expectation, Variance, and Covariance Formula Lecture 4: Proofs for Expectation, Variance, and Covariance Formula by Hiro Kasahara Vancouver School of Economics University of British Columbia Hiro Kasahara (UBC) Econ 325 1 / 28 Discrete Random Variables:

More information

Tom Salisbury

Tom Salisbury MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix: Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,

More information

Sampling Distributions

Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,

More information

Sampling Distributions

Sampling Distributions Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of

More information

Lecture 6: Special probability distributions. Summarizing probability distributions. Let X be a random variable with probability distribution

Lecture 6: Special probability distributions. Summarizing probability distributions. Let X be a random variable with probability distribution Econ 514: Probability and Statistics Lecture 6: Special probability distributions Summarizing probability distributions Let X be a random variable with probability distribution P X. We consider two types

More information

Chapter 4. Multivariate Distributions. Obviously, the marginal distributions may be obtained easily from the joint distribution:

Chapter 4. Multivariate Distributions. Obviously, the marginal distributions may be obtained easily from the joint distribution: 4.1 Bivariate Distributions. Chapter 4. Multivariate Distributions For a pair r.v.s (X,Y ), the Joint CDF is defined as F X,Y (x, y ) = P (X x,y y ). Obviously, the marginal distributions may be obtained

More information

Bivariate Paired Numerical Data

Bivariate Paired Numerical Data Bivariate Paired Numerical Data Pearson s correlation, Spearman s ρ and Kendall s τ, tests of independence University of California, San Diego Instructor: Ery Arias-Castro http://math.ucsd.edu/~eariasca/teaching.html

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections Chapter 3 - continued 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions

More information

Stochastic Models of Manufacturing Systems

Stochastic Models of Manufacturing Systems Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/

More information

Stochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet

Stochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet Stochastic Processes and Monte-Carlo Methods University of Massachusetts: Spring 2018 version Luc Rey-Bellet April 5, 2018 Contents 1 Simulation and the Monte-Carlo method 3 1.1 Review of probability............................

More information

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 8 Fall 2007

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 8 Fall 2007 UC Berkeley Department of Electrical Engineering and Computer Science EE 6: Probablity and Random Processes Problem Set 8 Fall 007 Issued: Thursday, October 5, 007 Due: Friday, November, 007 Reading: Bertsekas

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

1. Let X be a random variable with probability density function. 1 x < f(x) = 0 otherwise

1. Let X be a random variable with probability density function. 1 x < f(x) = 0 otherwise Name M36K Final. Let X be a random variable with probability density function { /x x < f(x = 0 otherwise Compute the following. You can leave your answers in integral form. (a ( points Find F X (t = P

More information