ECON 5350 Class Notes Review of Probability and Distribution Theory
- Leslie Scott
1 Random Variables

Definition. Let c represent an element of the sample space C of a random experiment, c ∈ C. A random variable is a one-to-one function X = X(c). An outcome of X is denoted x.

Example. Single coin toss:

   C = {c = T; c = H}
   X(c) = 0 if c = T
   X(c) = 1 if c = H

1.1 Probability Density Function (pdf)

Two types:

1. Discrete pdf. A function f(x) such that f(x) ≥ 0 and Σ_x f(x) = 1.
2. Continuous pdf. A function f(x) such that f(x) ≥ 0 and ∫ f(x) dx = 1.

See MATLAB example #1 for an example of calculating the area under a pdf.

Properties:

1. Pr(X = x) = f(x) in the discrete case, and Pr(X = x) = 0 in the continuous case.
2. Pr(a ≤ X ≤ b) = ∫_a^b f(x) dx.

1.2 Cumulative Distribution Function (cdf)

Two types:

1. Discrete cdf. A function F(x) such that Σ_{t ≤ x} f(t) = F(x).
2. Continuous cdf. A function F(x) such that ∫_{−∞}^x f(t) dt = F(x).

Properties:

1. F(b) − F(a) = ∫_{−∞}^b f(t) dt − ∫_{−∞}^a f(t) dt, where b ≥ a.
2. lim_{x→−∞} F(x) = 0.
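The notes point to MATLAB example #1 for the area under a pdf; that file is not reproduced here, so the following is a hypothetical Python stand-in. It checks numerically (trapezoidal rule) that a continuous pdf integrates to one over its support, and that Pr(a ≤ X ≤ b) is the area under f between a and b, using the standard normal density:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Normal density f(x) with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def area_under_pdf(f, a, b, n=100_000):
    """Approximate Pr(a <= X <= b) = integral of f from a to b by the trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Property: a pdf integrates to (approximately) 1 over its support.
total_mass = area_under_pdf(normal_pdf, -10, 10)

# Property: Pr(a <= X <= b) is the area under the pdf between a and b.
central_prob = area_under_pdf(normal_pdf, -1.96, 1.96)
```

The second value recovers the familiar fact that about 95% of the standard normal mass lies within ±1.96.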
3. lim_{x→+∞} F(x) = 1.
4. If x > y, then F(x) ≥ F(y).

2 Mathematical Expectations

Consider the continuous case only.

2.1 Mean

Definition. The mean or expected value of g(X) is given by E[g(X)] = ∫ g(x) f(x) dx.

Properties:

1. E(X) = µ = ∫ x f(x) dx is called the mean of X, or the first moment of the distribution.
2. E(·) is a linear operator. Let g(X) = a + bX. Then

   E[g(X)] = ∫ (a + bx) f(x) dx = ∫ a f(x) dx + ∫ bx f(x) dx = E(a) + E(bX) = a + bE(X).

3. Other measures of central tendency: median, mode.

2.2 Variance

Definition. The variance of g(X) is given by

   Var[g(X)] = E[{g(X) − E[g(X)]}²] = ∫ {g(x) − E[g(X)]}² f(x) dx.

Properties:

1. Let g(X) = X. We have

   Var(X) = σ² = ∫ (x − µ)² f(x) dx
          = ∫ x² f(x) dx − 2µ ∫ x f(x) dx + µ² ∫ f(x) dx
          = E(X²) − 2µE(X) + µ² = E(X²) − µ².
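The two expectation facts above, that E(·) is linear and that Var(X) = E(X²) − µ², can be verified on a small discrete pdf. This is a hypothetical fair-die example in Python (the notes' own examples use MATLAB):

```python
# Hypothetical discrete example (a fair die) illustrating E as a linear
# operator and the shortcut Var(X) = E(X^2) - mu^2.
pdf = {x: 1 / 6 for x in range(1, 7)}

E = lambda g: sum(g(x) * p for x, p in pdf.items())   # E[g(X)] = sum g(x) f(x)
mu = E(lambda x: x)                                   # E(X) = 3.5 for a fair die
var = E(lambda x: (x - mu) ** 2)                      # definitional variance

a, b = 2.0, 3.0
lhs = E(lambda x: a + b * x)                          # E(a + bX)
rhs = a + b * mu                                      # a + b E(X): linearity

shortcut = E(lambda x: x ** 2) - mu ** 2              # E(X^2) - mu^2
```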
3 2. V ar(x) is NOT a linear operator. Let g() = a + bx. V ar[g(x)] = {g() g(µ)} 2 f()d = b 2 ( µ) 2 f()d = b 2 V ar(x) = b 2 σ σ is called the standard deviation of X. 2.3 Other Moments The measure E(X r ) is called the r th moment of the distribution while E[(X µ) r ] is called the r th central moment of the distribution. r Central Moment Measure 1 E[(X µ)] = 2 E[(X µ) 2 ] = σ 2 variance (dispersion) 3 E[(X µ) 3 ] skewness (asymmetry) 4 E[(X µ) 4 ] kurtosis (tail thickness). Moment Generating Function (MGF). The MGF uniquely determines a pdf when it eists and is given by M(t) = E(e tx ) = e t f()d. The r th moment of a distribution is given by d r M(t) dt r t=. 2.4 Chebyshev s Inequality Definition. Let X be a random variable with σ 2 <. For any k >, Pr(µ kσ X µ + kσ) 1 1 k 2. Chebyshev s inequality is used to calculate upper (and lower) bounds on a random variable without having to know the eact distribution. Eample. Let X f() where f() = 1 2 3, 3 < < 3 3
and zero elsewhere. (Here µ = 0 and σ = 1.) If we let k = 3/2, we get

   Chebyshev: Pr(−3/2 ≤ X ≤ 3/2) ≥ 1 − 1/(3/2)² = 5/9 ≈ 0.556
   Exact:     Pr(−3/2 ≤ X ≤ 3/2) = ∫_{−3/2}^{3/2} 1/(2√3) dx = (1/(2√3))[(3/2) − (−3/2)] = √3/2 ≈ 0.866.

3 Specific Probability Distributions

3.1 Normal pdf

If X has a normal distribution, then

   f(x) = (1/(σ√(2π))) exp(−(x − µ)²/(2σ²))

where −∞ < x < ∞. In shorthand notation, X ~ N(µ, σ²).

Properties:

1. The normal pdf is symmetric.
2. Z = (X − µ)/σ ~ N(0, 1) is called a standardized random variable, and φ(z) = (1/√(2π)) exp(−0.5z²) is called the standard normal distribution.
3. Linear transformations of normal random variables are normal. If Y = a + bX where X ~ N(µ, σ²), then Y ~ N(a + bµ, b²σ²).

3.2 Chi-square pdf

If Z_i, i = 1, ..., n, are independently distributed N(0, 1) random variables, then Y = Σ_{i=1}^n Z_i² ~ χ²(n), where E(Y) = n and Var(Y) = 2n.

Exercise. Find the MGF for Y = Z² and use it to derive the mean and variance.

Answer. We begin by calculating the MGF for Z², where t < 0.5:

   M(t) = E(e^{tZ²}) = ∫ e^{tz²} φ(z) dz = (2π)^{−0.5} ∫ e^{(t−0.5)z²} dz = (2π)^{−0.5} ∫ e^{−0.5(1−2t)z²} dz.
Now using the method of substitution, let w = (1 − 2t)^{1/2} z, so that dw = (1 − 2t)^{1/2} dz. Making the substitution produces

   M(t) = (1 − 2t)^{−1/2} (2π)^{−0.5} ∫ e^{−0.5w²} dw = (1 − 2t)^{−1/2}.

To calculate the mean, we take the first derivative of M(t) and evaluate it at t = 0:

   µ = dM(t)/dt |_{t=0} = (1 − 2t)^{−3/2} |_{t=0} = 1.

To calculate the variance, we take the second derivative of M(t), evaluate it at t = 0, and subtract µ²:

   σ² = d²M(t)/dt² |_{t=0} − µ² = 3(1 − 2t)^{−5/2} |_{t=0} − µ² = 3 − 1 = 2.

3.3 F pdf

If X_1 and X_2 are independently distributed χ²(n_i) random variables, then

   F = (X_1/n_1)/(X_2/n_2) ~ F(n_1, n_2).

3.4 Student's t pdf

If Z ~ N(0, 1) and X ~ χ²(n) are independent, then

   T = Z/√(X/n) ~ t(n).

3.5 Lognormal pdf

If X ~ N(µ, σ²), then Y = exp(X) has the distribution

   f(y) = (1/(√(2π)σy)) exp[−0.5((ln(y) − µ)/σ)²]

for y > 0. Sometimes this is written as Y ~ LN(µ, σ²). The mean and variance of Y are

   E(Y) = exp(µ + σ²/2)  and  Var(Y) = exp(2µ + σ²)(exp(σ²) − 1).
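The lognormal moment formulas can be sanity-checked numerically. This is a sketch with illustrative parameters (µ = 0, σ = 0.5, my own choices, not from the notes): since Y = exp(X), E(Y) and E(Y²) are obtained by integrating exp(x) and exp(2x) against the normal pdf of X.

```python
import math

# Check E(Y) = exp(mu + sigma^2/2) and Var(Y) = exp(2mu + sigma^2)(exp(sigma^2) - 1)
# for Y = exp(X), X ~ N(mu, sigma^2), by numerical integration over x.
mu, sigma = 0.0, 0.5   # illustrative assumption

def normal_pdf(x):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def trapezoid(g, a, b, n=200_000):
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return s * h

lo, hi = mu - 10 * sigma, mu + 10 * sigma            # effectively the whole real line
mean_numeric = trapezoid(lambda x: math.exp(x) * normal_pdf(x), lo, hi)
mean_formula = math.exp(mu + sigma ** 2 / 2)

second_moment = trapezoid(lambda x: math.exp(2 * x) * normal_pdf(x), lo, hi)
var_numeric = second_moment - mean_numeric ** 2
var_formula = math.exp(2 * mu + sigma ** 2) * (math.exp(sigma ** 2) - 1)
```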
Properties:

1. If Y_1 ~ LN(µ_1, σ_1²) and Y_2 ~ LN(µ_2, σ_2²) are independent random variables, then Y_1 Y_2 ~ LN(µ_1 + µ_2, σ_1² + σ_2²).

3.6 Gamma pdf

The gamma distribution is given by

   f(x) = (1/(Γ(α)β^α)) x^{α−1} exp(−x/β)

for 0 < x < ∞. The mean and variance are E(X) = αβ and Var(X) = αβ².

Notes:

1. Γ(α) = ∫_0^∞ y^{α−1} exp(−y) dy is called the gamma function, α > 0.
2. Γ(α) = (α − 1)! if α is a positive integer.
3. Greene sets β = 1/λ and α = P.
4. When α = 1, you get the exponential pdf.
5. When α = n/2 and β = 2, you get the chi-square pdf.

Example. Gamma distributions are sometimes used to model waiting times. Let W be the waiting time until death for a human, with W ~ Gamma(α = 1, β = 80), so that the expected waiting time until death is 80 years. (Note: W ~ Exponential(β).) Find Pr(W ≤ 30).

   Pr(W ≤ 30) = ∫_0^{30} (1/(Γ(1)·80)) exp(−w/80) dw = ∫_0^{30} (1/80) exp(−w/80) dw
             = −exp(−w/80) |_{w=0}^{30} = −[exp(−30/80) − exp(0)] = 1 − exp(−3/8) ≈ 0.3127.

3.7 Beta pdf

If X_1 and X_2 are independently distributed Gamma random variables, then Y_1 = X_1 + X_2 and Y_2 = X_1/Y_1 are independently distributed. The marginal distribution f_2(y_2) of f(y_1, y_2) is called the beta pdf:

   g(y) = (Γ(α + β)/(Γ(α)Γ(β))) (y/c)^{α−1} [1 − (y/c)]^{β−1} (1/c)

where 0 ≤ y ≤ c. The mean and variance are E(Y) = cα/(α + β) and Var(Y) = c²αβ/[(α + β)²(α + β + 1)].
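The waiting-time example can be confirmed in code. A minimal Python sketch: Gamma(α = 1, β = 80) is Exponential with mean 80, so Pr(W ≤ 30) = 1 − exp(−30/80) ≈ 0.3127, which should also match numerical integration of the pdf.

```python
import math

# Waiting-time example: W ~ Gamma(alpha=1, beta=80) = Exponential with mean 80.
beta = 80.0

def exp_pdf(w):
    """Exponential pdf f(w) = (1/beta) exp(-w/beta), the Gamma pdf with alpha = 1."""
    return math.exp(-w / beta) / beta

def trapezoid(g, a, b, n=100_000):
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return s * h

p_closed = 1.0 - math.exp(-30.0 / beta)        # closed-form answer from the notes
p_numeric = trapezoid(exp_pdf, 0.0, 30.0)      # area under the pdf from 0 to 30
```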
3.8 Logistic pdf

The logistic distribution is

   f(x) = Λ(x)[1 − Λ(x)]

where −∞ < x < ∞ and Λ(x) = (1 + exp(−x))^{−1}. The mean and variance are E(X) = 0 and Var(X) = π²/3. A useful property of the logistic distribution is that the cdf has a closed-form solution: F(x) = Λ(x).

3.9 Cauchy pdf

If X_1 and X_2 are independently distributed N(0, 1), then Y = X_1/X_2 has the distribution

   f(y) = 1/(π(1 + y²))

where −∞ < y < ∞. The mean and the variance of the Cauchy pdf do not exist because the tails are too thick. See MATLAB example #2 for an example that graphs the Cauchy and standard normal pdfs.

3.10 Binomial pdf

The distribution for x successes in n trials is

   b(n, α, x) = C(n, x) α^x (1 − α)^{n−x}

where x = 0, 1, ..., n and 0 ≤ α ≤ 1. The mean and variance of the binomial distribution are E(X) = nα and Var(X) = nα(1 − α). The combinatorial formula for the number of ways to choose x objects from a set of n distinct objects is

   C(n, x) = n!/(x!(n − x)!).

3.11 Poisson pdf

The Poisson pdf is often used to model the number of changes in a fixed interval. The Poisson pdf is

   f(x) = exp(−λ)λ^x/x!

where x = 0, 1, ... and λ > 0. The mean and variance are E(X) = Var(X) = λ.
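A short Python sketch with hypothetical parameters (n = 10, α = 0.3 and λ = 4, my choices for illustration) confirming that the binomial and Poisson pmfs sum to one and have the stated means and variances:

```python
from math import comb, exp, factorial

# Binomial(n=10, alpha=0.3): check sum, mean n*alpha, variance n*alpha*(1-alpha).
n, alpha = 10, 0.3
binom = [comb(n, x) * alpha**x * (1 - alpha)**(n - x) for x in range(n + 1)]

b_total = sum(binom)
b_mean = sum(x * p for x, p in enumerate(binom))
b_var = sum(x**2 * p for x, p in enumerate(binom)) - b_mean**2

# Poisson(lam=4): mean and variance both equal lam. Truncating at x = 59 leaves
# a negligible tail for lam = 4.
lam = 4.0
poisson = [exp(-lam) * lam**x / factorial(x) for x in range(60)]
p_mean = sum(x * p for x, p in enumerate(poisson))
p_var = sum(x**2 * p for x, p in enumerate(poisson)) - p_mean**2
```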
4 Distributions of Functions of Random Variables

Let X_1, X_2, ..., X_n have joint pdf f(x_1, ..., x_n). What is the distribution of Y = g(X_1, X_2, ..., X_n)? To answer this question, we will use the change-of-variable technique.

Change-of-Variable Technique. Let X_1 and X_2 have joint pdf f(x_1, x_2). Let Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_1, X_2) be the transformed random variables. If A is the set where f > 0, then let B be the set defined by the one-to-one transformation of A to B. Then

   g(y_1, y_2) = f(h_1(y_1, y_2), h_2(y_1, y_2)) abs(J)

where (y_1, y_2) ∈ B, x_1 = h_1(y_1, y_2), x_2 = h_2(y_1, y_2), and J is the Jacobian determinant

   J = | ∂x_1/∂y_1  ∂x_1/∂y_2 |
       | ∂x_2/∂y_1  ∂x_2/∂y_2 |.

Example. Let X_1 and X_2 be uniformly distributed on 0 ≤ X_i ≤ 1. The random sample X_1, X_2 is jointly distributed as f(x_1, x_2) = f_1(x_1) f_2(x_2) = 1 over 0 ≤ x_1, x_2 ≤ 1 and zero elsewhere. Find the joint distribution of Y_1 = X_1 + X_2 and Y_2 = X_1 − X_2.

Answer. We know that x_1 = h_1(y_1, y_2) = 0.5(y_1 + y_2) and x_2 = h_2(y_1, y_2) = 0.5(y_1 − y_2). We also know that

   J = | 0.5   0.5 |  = −0.25 − 0.25 = −0.5.
       | 0.5  −0.5 |

Therefore,

   g(y_1, y_2) = f_1(h_1(y_1, y_2)) f_2(h_2(y_1, y_2)) abs(J) = 0.5

where (y_1, y_2) ∈ B, and zero elsewhere.

5 Joint Distributions

5.1 Joint pdfs and cdfs

A joint pdf for X_1 and X_2 gives Pr(X_1 = x_1, X_2 = x_2) = f(x_1, x_2). A proper joint pdf will have the properties ∫∫ f(x_1, x_2) dx_2 dx_1 = 1 and f(x_1, x_2) ≥ 0 for all x_1 and x_2.

A joint cdf for X_1 and X_2 is

   Pr(X_1 ≤ x_1, X_2 ≤ x_2) = F(x_1, x_2) = ∫_{−∞}^{x_1} ∫_{−∞}^{x_2} f(t_1, t_2) dt_2 dt_1.
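The change-of-variable example above can be checked by simulation. A hypothetical Python sketch: draw (X_1, X_2) uniform on the unit square and estimate Pr(Y_1 ≤ 1, Y_2 ≤ 0). In (x_1, x_2) coordinates this event is the triangle x_1 + x_2 ≤ 1, x_1 ≤ x_2, which has area 1/4; equivalently, it is the constant density 0.5 integrated over the image region in B, whose area is 2 × 1/4 = 1/2 (areas scale by |det ∂y/∂x| = 2 under the transformation). Either way the answer is 0.25.

```python
import random

# Monte Carlo check: Y1 = X1 + X2, Y2 = X1 - X2 with X1, X2 ~ U(0, 1) independent.
# The joint density of (Y1, Y2) is 0.5 on B, so Pr(Y1 <= 1, Y2 <= 0) = 0.25.
random.seed(0)                      # fixed seed for a reproducible estimate
N = 200_000
hits = 0
for _ in range(N):
    x1, x2 = random.random(), random.random()
    y1, y2 = x1 + x2, x1 - x2       # the transformation from the example
    if y1 <= 1.0 and y2 <= 0.0:
        hits += 1
estimate = hits / N                 # should be close to 0.25
```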
5.2 Marginal Distributions

The marginal pdf of X_1 is found by integrating over all X_2:

   f_1(x_1) = ∫ f(x_1, x_2) dx_2

and likewise for X_2.

Example. Let X_1 and X_2 have joint pdf

   f(x_1, x_2) = 2,  0 < x_1 < x_2 < 1

and zero elsewhere. Is this a proper pdf?

   ∫_0^1 ∫_{x_1}^1 2 dx_2 dx_1 = ∫_0^1 [2x_2]_{x_2=x_1}^1 dx_1 = ∫_0^1 2(1 − x_1) dx_1 = [2x_1 − x_1²]_0^1 = 1.

So yes, this is a proper pdf. The marginal distribution for X_1 is

   f_1(x_1) = ∫_{x_1}^1 2 dx_2 = 2(1 − x_1),  0 < x_1 < 1

and zero elsewhere. The marginal distribution for X_2 is

   f_2(x_2) = ∫_0^{x_2} 2 dx_1 = 2x_2,  0 < x_2 < 1

and zero elsewhere. See MATLAB example #4 for a graphical example of a joint and marginal pdf.

Notes:

1. Two random variables are stochastically independent if and only if f_1(x_1) f_2(x_2) = f(x_1, x_2).
2. In our example, X_1 and X_2 are not independent because f_1(x_1) f_2(x_2) = 4x_2(1 − x_1) ≠ 2 = f(x_1, x_2).
3. Moments (e.g., means and variances) in joint distributions are calculated using marginal densities (e.g., E(X_1) = ∫ x_1 f_1(x_1) dx_1).

5.3 Covariance and Correlation

Definition. The covariance between X and Y is

   cov(X, Y) = E[(X − µ_x)(Y − µ_y)] = E(XY) − µ_x µ_y.
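The marginal and covariance calculations for the example f(x_1, x_2) = 2 on 0 < x_1 < x_2 < 1 can be checked numerically. A Python sketch using the midpoint rule on a grid; the covariance target 1/36 is my own calculation (E(X_1 X_2) = 1/4, µ_1 = 1/3, µ_2 = 2/3), not stated in the notes:

```python
# Numeric check of f(x1, x2) = 2 on 0 < x1 < x2 < 1: total mass, the marginal
# f1(x1) = 2(1 - x1), and cov(X1, X2) = E(X1 X2) - mu1*mu2 = 1/36.
n = 1000
h = 1.0 / n

total = mu1 = mu2 = ex1x2 = 0.0
for i in range(n):
    x1 = (i + 0.5) * h                  # midpoint of cell i
    for j in range(n):
        x2 = (j + 0.5) * h              # midpoint of cell j
        if x1 < x2:                     # the support 0 < x1 < x2 < 1
            p = 2.0 * h * h             # density times cell area
            total += p
            mu1 += x1 * p
            mu2 += x2 * p
            ex1x2 += x1 * x2 * p

cov = ex1x2 - mu1 * mu2

# Marginal f1 at a test point: integrate out x2 > x1.
x1_test = 0.25
f1_num = sum(2.0 * h for j in range(n) if (j + 0.5) * h > x1_test)   # exact: 2(1 - 0.25) = 1.5
```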
Definition. The correlation coefficient between X and Y removes the dependence on the units of measurement:

   ρ = corr(X, Y) = cov(X, Y)/(σ_x σ_y)

where −1 ≤ ρ ≤ 1.

Notes:

1. If X and Y are independent, then cov(X, Y) = 0:

   cov(X, Y) = E(XY) − µ_x µ_y = ∫∫ xy f_x(x) f_y(y) dy dx − µ_x µ_y
             = ∫ x f_x(x) dx ∫ y f_y(y) dy − µ_x µ_y = µ_x µ_y − µ_x µ_y = 0.

2. However, cov(X, Y) = 0 does not imply stochastic independence. Consider the following joint distribution table:

            y = 0    y = 1    f_x(x)
   x = −1     0       1/3      1/3
   x = 0     1/3       0       1/3
   x = 1      0       1/3      1/3
   f_y(y)    1/3      2/3

   where µ_x = 0, µ_y = 2/3 and

   cov(X, Y) = Σ (x − µ_x)(y − µ_y) f(x, y) = (−1)(1/3)(1/3) + (0)(−2/3)(1/3) + (1)(1/3)(1/3) = 0.

   However, X and Y are not independent because for (x, y) = (0, 0) we have f_x(0) f_y(0) = 1/9 ≠ f(0, 0) = 1/3.

6 Conditional Distributions

Definition. The conditional pdf for X given Y is

   f(x|y) = f(x, y)/f_y(y).
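The zero-covariance-but-dependent table above, together with the conditional pdf definition, can be verified with exact rational arithmetic. A Python sketch:

```python
from fractions import Fraction as F

# The joint distribution from the notes: mass 1/3 on each of (-1,1), (0,0), (1,1).
joint = {(-1, 1): F(1, 3), (0, 0): F(1, 3), (1, 1): F(1, 3)}

# Marginals f_x and f_y by summing out the other variable.
fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, F(0)) + p
    fy[y] = fy.get(y, F(0)) + p

mu_x = sum(x * p for (x, _), p in joint.items())
mu_y = sum(y * p for (_, y), p in joint.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

# Conditional pmf f(x | y) = f(x, y) / f_y(y), evaluated for y = 1.
f_x_given_y1 = {x: joint.get((x, 1), F(0)) / fy[1] for x in fx}

# Independence would require f(x, y) = f_x(x) f_y(y) everywhere.
independent = all(joint.get((x, y), F(0)) == fx[x] * fy[y] for x in fx for y in fy)
```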
Properties:

1. If X and Y are independent, f(x|y) = f_x(x) and f(y|x) = f_y(y).
2. The conditional mean is E(X|Y) = ∫ x f(x|y) dx = µ_{x|y}.
3. The conditional variance is Var(X|Y) = ∫ (x − µ_{x|y})² f(x|y) dx.

7 Multivariate Distributions

Let X = (X_1, ..., X_n)′ be an (n × 1) column vector of random variables. The mean and variance of X are µ = E(X) = (µ_1, ..., µ_n)′ and

   Σ = Var(X) = E[(X − µ)(X − µ)′] =
       | σ_11  σ_12  ...  σ_1n |
       | σ_21  σ_22  ...  σ_2n |
       |  ...   ...  ...   ... |
       | σ_n1  σ_n2  ...  σ_nn |.

Properties:

1. Let W = A + BX. Then E(W) = A + BE(X).
2. The variance of W is

   Var(W) = E[(W − E(W))(W − E(W))′] = E[(BX − BE(X))(BX − BE(X))′]
          = E[B(X − E(X))(X − E(X))′B′] = BΣB′.

7.1 Multivariate Normal Distribution

Let X = (X_1, ..., X_n)′ ~ N(µ, Σ). The form of the multivariate normal pdf is

   f(x) = (2π)^{−n/2} |Σ|^{−1/2} exp[−0.5(x − µ)′Σ^{−1}(x − µ)].

See MATLAB example #5 for an example of a bivariate normal density function.

7.2 Quadratic Form in a Normal Vector

If (X − µ) is a normal vector, then the quadratic form

   Q = (X − µ)′Σ^{−1}(X − µ) ~ χ²(n).
Proof. The moment generating function of Q is

   M(t) = E(e^{tQ}) = ∫...∫ (2π)^{−n/2} |Σ|^{−1/2} exp[t(x − µ)′Σ^{−1}(x − µ) − 0.5(x − µ)′Σ^{−1}(x − µ)] dx_1 ... dx_n
        = ∫...∫ (2π)^{−n/2} |Σ|^{−1/2} exp[−0.5(x − µ)′(1 − 2t)Σ^{−1}(x − µ)] dx_1 ... dx_n.

Next, multiply and divide by (1 − 2t)^{n/2}:

   M(t) = (1 − 2t)^{−n/2} ∫...∫ (2π)^{−n/2} |Σ/(1 − 2t)|^{−1/2} exp[−0.5(x − µ)′(1 − 2t)Σ^{−1}(x − µ)] dx_1 ... dx_n
        = (1 − 2t)^{−n/2},  t < 0.5.

The numerator is the integral of a multivariate normal distribution with variance Σ/(1 − 2t), and so it equals one. M(t) then simplifies to the MGF of a χ²(n) random variable.

7.3 A Couple of Important Theorems

1. Let X ~ N(0, I) and A² = A (i.e., A is idempotent). Then X′AX ~ χ²(r), where the rank of A is r.
2. Let X ~ N(0, I). Then X′AX and X′BX are stochastically independent iff A′B = 0.
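The two theorems can be illustrated deterministically with the simplest idempotent matrix. This Python sketch uses the hypothetical choice A = x(x′x)⁻¹x′ with x = (1, 1)′, so A = [[0.5, 0.5], [0.5, 0.5]]; it checks that A² = A, that trace(A) = rank(A) = 1 (the chi-square degrees of freedom in Theorem 1), and that B = I − A satisfies A′B = 0 (the independence condition in Theorem 2, since A is symmetric here):

```python
# Deterministic checks on a 2x2 idempotent (projection) matrix.
def matmul(P, Q):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

A = [[0.5, 0.5], [0.5, 0.5]]                     # projection onto span{(1, 1)'}
I2 = [[1.0, 0.0], [0.0, 1.0]]
B = [[I2[i][j] - A[i][j] for j in range(2)] for i in range(2)]   # complementary projection

AA = matmul(A, A)                # A^2 should equal A (idempotency)
AB = matmul(A, B)                # A'B should be the zero matrix (A symmetric, so A' = A)
trace_A = A[0][0] + A[1][1]      # for idempotent A, trace = rank = 1 here
```

With X ~ N(0, I₂), X′AX = ((X_1 + X_2)/√2)² is then χ²(1), and X′BX = ((X_1 − X_2)/√2)² is an independent χ²(1).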
More informationMATH Notebook 4 Fall 2018/2019
MATH442601 2 Notebook 4 Fall 2018/2019 prepared by Professor Jenny Baglivo c Copyright 2004-2019 by Jenny A. Baglivo. All Rights Reserved. 4 MATH442601 2 Notebook 4 3 4.1 Expected Value of a Random Variable............................
More informationContinuous Probability Distributions
1 Chapter 5 Continuous Probability Distributions 5.1 Probability density function Example 5.1.1. Revisit Example 3.1.1. 11 12 13 14 15 16 21 22 23 24 25 26 S = 31 32 33 34 35 36 41 42 43 44 45 46 (5.1.1)
More informationA Few Special Distributions and Their Properties
A Few Special Distributions and Their Properties Econ 690 Purdue University Justin L. Tobias (Purdue) Distributional Catalog 1 / 20 Special Distributions and Their Associated Properties 1 Uniform Distribution
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More informationSTA 2201/442 Assignment 2
STA 2201/442 Assignment 2 1. This is about how to simulate from a continuous univariate distribution. Let the random variable X have a continuous distribution with density f X (x) and cumulative distribution
More informationChapter 3 Single Random Variables and Probability Distributions (Part 1)
Chapter 3 Single Random Variables and Probability Distributions (Part 1) Contents What is a Random Variable? Probability Distribution Functions Cumulative Distribution Function Probability Density Function
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationProblem 1. Problem 2. Problem 3. Problem 4
Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication
More informationStatistics 3657 : Moment Generating Functions
Statistics 3657 : Moment Generating Functions A useful tool for studying sums of independent random variables is generating functions. course we consider moment generating functions. In this Definition
More informationChapter 4. Chapter 4 sections
Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More informationSTATISTICS SYLLABUS UNIT I
STATISTICS SYLLABUS UNIT I (Probability Theory) Definition Classical and axiomatic approaches.laws of total and compound probability, conditional probability, Bayes Theorem. Random variable and its distribution
More informationNotes on Random Vectors and Multivariate Normal
MATH 590 Spring 06 Notes on Random Vectors and Multivariate Normal Properties of Random Vectors If X,, X n are random variables, then X = X,, X n ) is a random vector, with the cumulative distribution
More informationORF 245 Fundamentals of Statistics Chapter 4 Great Expectations
ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations Robert Vanderbei Fall 2014 Slides last edited on October 20, 2014 http://www.princeton.edu/ rvdb Definition The expectation of a random variable
More informationReview of Statistics I
Review of Statistics I Hüseyin Taştan 1 1 Department of Economics Yildiz Technical University April 17, 2010 1 Review of Distribution Theory Random variables, discrete vs continuous Probability distribution
More information4. Distributions of Functions of Random Variables
4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n
More informationContinuous Distributions
Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study
More informationi=1 k i=1 g i (Y )] = k f(t)dt and f(y) = F (y) except at possibly countably many points, E[g(Y )] = f(y)dy = 1, F(y) = y
Math 480 Exam 2 is Wed. Oct. 31. You are allowed 7 sheets of notes and a calculator. The exam emphasizes HW5-8, and Q5-8. From the 1st exam: The conditional probability of A given B is P(A B) = P(A B)
More informationElements of Probability Theory
Short Guides to Microeconometrics Fall 2016 Kurt Schmidheiny Unversität Basel Elements of Probability Theory Contents 1 Random Variables and Distributions 2 1.1 Univariate Random Variables and Distributions......
More information