Review of Probability Theory II


January 9-23, 2008

1 Expectation

If the sample space Ω = {ω_1, ω_2, ...} is countable and g is a real-valued function, then we define the expected value or the expectation of a function g of X by

    Eg(X) = Σ_i g(X(ω_i)) P{ω_i}.

To create a formula for discrete random variables, write R(x) for the set of ω so that X(ω) = x; then the sum above can be written

    Eg(X) = Σ_x Σ_{ω ∈ R(x)} g(X(ω)) P{ω} = Σ_x Σ_{ω ∈ R(x)} g(x) P{ω}
          = Σ_x g(x) Σ_{ω ∈ R(x)} P{ω} = Σ_x g(x) P{ω; X(ω) = x}
          = Σ_x g(x) P{X = x} = Σ_x g(x) p(x),

provided that the sum converges absolutely. Here p is the mass function for X.

For a continuous random variable with distribution function F and density f, choose a small positive value Δx, and let X̃ be the random variable obtained by rounding the value of X down to the nearest integer multiple of Δx. Then

    Eg(X̃) = Σ_x̃ g(x̃) P{X̃ = x̃} = Σ_x̃ g(x̃) P{x̃ ≤ X < x̃ + Δx}
           = Σ_x̃ g(x̃) (F(x̃ + Δx) - F(x̃)) ≈ Σ_x̃ g(x̃) f(x̃) Δx ≈ ∫ g(x) f(x) dx.

Provided that the integral converges absolutely, these approximations become an equality in the limit as Δx → 0.

Exercise 1. Let X_1 and X_2 be random variables on a countable sample space Ω having a common state space, let g_1 and g_2 be two real-valued functions on the state space, and let c_1 and c_2 be two numbers. Then

    E[c_1 g_1(X_1) + c_2 g_2(X_2)] = c_1 Eg_1(X_1) + c_2 Eg_2(X_2).
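The rounding construction above is easy to check numerically. Below is a minimal Python sketch (not part of the original notes) that applies the approximation Σ g(x̃) f(x̃) Δx to X ~ Exponential(1) with g(x) = x^2, a choice made here purely for illustration; the exact value is EX^2 = 2.

```python
import math

# Approximate E g(X) by rounding X down to multiples of dx, as in the
# construction of X-tilde above. Illustration: X ~ Exponential(1), g(x) = x^2,
# so the exact answer is E X^2 = 2.

def f(x):          # density of Exponential(1)
    return math.exp(-x)

def g(x):
    return x * x

dx = 0.001
# Sum over the grid x~ = 0, dx, 2dx, ..., truncated at x = 30 where the
# exponential tail is negligible.
approx = sum(g(k * dx) * f(k * dx) * dx for k in range(int(30 / dx)))

print(approx)      # close to the exact value 2
```

Shrinking dx tightens the approximation, exactly as the limit Δx → 0 in the text suggests.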

Several choices for g have special names.

1. If g(x) = x, then μ = EX is called variously the mean and the first moment.
2. If g(x) = x^k, then EX^k is called the k-th moment.
3. If g(x) = (x)_k, where (x)_k = x(x - 1) ··· (x - k + 1), then E(X)_k is called the k-th factorial moment.
4. If g(x) = (x - μ)^k, then E(X - μ)^k is called the k-th central moment.
5. The second central moment σ_X² = E(X - μ)² is called the variance. Note that

       Var(X) = E(X - μ)² = EX² - 2μEX + μ² = EX² - 2μ² + μ² = EX² - μ².

6. If X is R^d-valued and g(x) = e^{i⟨θ,x⟩}, where ⟨·,·⟩ is the standard inner product, then φ(θ) = Ee^{i⟨θ,X⟩} is called the Fourier transform or the characteristic function.
7. Similarly, if X is R^d-valued and g(x) = e^{⟨θ,x⟩}, then m(θ) = Ee^{⟨θ,X⟩} is called the Laplace transform or the moment generating function.
8. If X is Z⁺-valued and g(x) = z^x, then ρ(z) = Ez^X = Σ_{x=0}^∞ P{X = x} z^x is called the (probability) generating function.

Table of Discrete Random Variables

    random variable     parameters   mean         variance                             generating function
    Bernoulli           p            p            p(1 - p)                             (1 - p) + pz
    binomial            n, p         np           np(1 - p)                            ((1 - p) + pz)^n
    hypergeometric      N, n, k      nk/N         (nk/N)((N - k)/N)((N - n)/(N - 1))   (no simple form)
    geometric           p            (1 - p)/p    (1 - p)/p²                           p/(1 - (1 - p)z)
    negative binomial   a, p         a(1 - p)/p   a(1 - p)/p²                          (p/(1 - (1 - p)z))^a
    Poisson             λ            λ            λ                                    exp(-λ(1 - z))
    uniform             a, b         (a + b)/2    ((b - a + 1)² - 1)/12                (z^a - z^{b+1})/((b - a + 1)(1 - z))
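The identity Var(X) = EX² - μ² and the binomial row of the table can be verified directly from the binomial mass function. A short Python sketch (the parameter values n = 10, p = 0.3, z = 0.5 are arbitrary choices for illustration):

```python
from math import comb

# For X ~ binomial(n, p), check the table entries:
#   EX = np,  Var(X) = EX^2 - mu^2 = np(1-p),  E z^X = ((1-p) + pz)^n.
n, p, z = 10, 0.3, 0.5

pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * pmf[x] for x in range(n + 1))
second_moment = sum(x * x * pmf[x] for x in range(n + 1))
var = second_moment - mean**2                    # Var(X) = EX^2 - mu^2
gen = sum(z**x * pmf[x] for x in range(n + 1))   # rho(z) = E z^X

print(mean, var, gen)   # np = 3.0, np(1-p) = 2.1, ((1-p) + pz)^n = 0.85^10
```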

Table of Continuous Random Variables

    random variable  parameters   mean                variance                                    characteristic function
    beta             α, β         α/(α + β)           αβ/((α + β)²(α + β + 1))                    ₁F₁(α; α + β; iθ)
    Cauchy           μ, σ         none                none                                        exp(iμθ - σ|θ|)
    chi-squared      a            a                   2a                                          (1 - 2iθ)^{-a/2}
    exponential      λ            1/λ                 1/λ²                                        iλ/(θ + iλ)
    F                q, a         a/(a - 2), a > 2    2a²(q + a - 2)/(q(a - 4)(a - 2)²), a > 4    (no simple form)
    gamma            α, β         α/β                 α/β²                                        (iβ/(θ + iβ))^α
    Laplace          μ, σ         μ                   2σ²                                         exp(iμθ)/(1 + σ²θ²)
    normal           μ, σ         μ                   σ²                                          exp(iμθ - σ²θ²/2)
    Pareto           c, α         αc/(α - 1), α > 1   αc²/((α - 1)²(α - 2)), α > 2                (no simple form)
    t                a, μ, σ      μ, a > 1            σ²a/(a - 2), a > 2                          (no simple form)
    uniform          a, b         (a + b)/2           (b - a)²/12                                 (exp(iθb) - exp(iθa))/(iθ(b - a))

2 Joint Distributions and Conditioning

A pair of random variables X_1 and X_2 is called independent if for every pair of events A_1, A_2,

    P{X_1 ∈ A_1, X_2 ∈ A_2} = P{X_1 ∈ A_1} P{X_2 ∈ A_2}.        (8)

For their distribution functions F_{X_1} and F_{X_2}, (8) is equivalent to the factoring of the joint distribution function,

    F(x_1, x_2) = F_{X_1}(x_1) F_{X_2}(x_2),

to the factoring of the joint density for continuous random variables,

    f(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2),

to the factoring of the joint mass function for discrete random variables,

    p(x_1, x_2) = p_{X_1}(x_1) p_{X_2}(x_2),

and, finally, to the factoring of expectations,

    Eg_1(X_1)g_2(X_2) = Eg_1(X_1) Eg_2(X_2).

Definition 2. For a pair of random variables X_1 and X_2 with means μ_1 and μ_2, the covariance is defined by

    Cov(X_1, X_2) = E(X_1 - μ_1)(X_2 - μ_2) = EX_1X_2 - μ_1μ_2.

In particular, if X_1 and X_2 are independent, then Cov(X_1, X_2) = 0. The correlation is

    ρ(X_1, X_2) = Cov(X_1, X_2) / √(Var(X_1) Var(X_2)).
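The factoring of the joint mass function, and the resulting Cov(X_1, X_2) = 0, can be checked by direct computation on a small example. A Python sketch, with two made-up mass functions chosen only for illustration:

```python
import itertools

# Two independent discrete random variables X1, X2: build the joint mass
# function by the factoring p(x1, x2) = p_{X1}(x1) p_{X2}(x2), then verify
# Cov(X1, X2) = E X1 X2 - mu1 mu2 = 0.
p1 = {0: 0.2, 1: 0.8}          # mass function of X1 (made up for illustration)
p2 = {1: 0.5, 2: 0.3, 3: 0.2}  # mass function of X2 (made up for illustration)

joint = {(x1, x2): p1[x1] * p2[x2] for x1, x2 in itertools.product(p1, p2)}

mu1 = sum(x * q for x, q in p1.items())
mu2 = sum(x * q for x, q in p2.items())
e12 = sum(x1 * x2 * q for (x1, x2), q in joint.items())

cov = e12 - mu1 * mu2
print(cov)   # 0 up to floating-point rounding
```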

Exercise 3. Var(X_1 + ··· + X_n) = Σ_{i=1}^n Σ_{j=1}^n Cov(X_i, X_j).

For a pair of jointly continuous random variables, the marginal density of X is

    f_X(x) = ∫ f(x, y) dy.

The conditional density of Y given X is

    f_{Y|X}(y|x) = f(x, y) / f_X(x).

The conditional expectation is the expectation using the conditional density:

    E[g(Y) | X = x] = ∫ g(y) f_{Y|X}(y|x) dy.

Similar expressions for the marginal mass function and the conditional mass function, replacing integrals by sums, exist for discrete random variables. The conditional mass function of Y given X is

    p_{Y|X}(y|x) = p(x, y) / p_X(x).

The conditional expectation is the expectation using the conditional mass function:

    E[g(Y) | X = x] = Σ_y g(y) p_{Y|X}(y|x).

3 Law of Large Numbers

The law of large numbers states that the long-term empirical average of independent random variables X_1, X_2, ... having a common distribution function F possessing a mean μ converges to that mean: with probability 1,

    X̄_n = (1/n)(X_1 + X_2 + ··· + X_n) = S_n/n → μ   as n → ∞.

We can define the empirical distribution function

    F_n(x) = (1/n) #(observations from X_1, X_2, ..., X_n that are less than or equal to x)
           = (1/n) Σ_{i=1}^n I_{(-∞, x]}(X_i).

Then, by the strong law, we have, with probability 1,

    F_n(x) → F(x)   as n → ∞.

The Glivenko-Cantelli theorem states that this convergence is uniform in x.
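A simulation illustrates the Glivenko-Cantelli theorem. The Python sketch below (sample sizes and seed are arbitrary choices) draws Uniform(0,1) samples, for which F(x) = x on [0, 1], and computes the sup-distance between F_n and F:

```python
import random

# Empirical distribution function of n Uniform(0,1) draws: measure
# sup_x |F_n(x) - F(x)| with F(x) = x. By Glivenko-Cantelli the
# discrepancy should shrink as n grows.
random.seed(0)

def sup_discrepancy(n):
    xs = sorted(random.random() for _ in range(n))
    # F_n jumps by 1/n at each order statistic x_(i); the sup over x is
    # attained just before or at a jump, so check both sides of each one.
    return max(max(abs((i + 1) / n - x), abs(i / n - x))
               for i, x in enumerate(xs))

d_small, d_large = sup_discrepancy(100), sup_discrepancy(100_000)
print(d_small, d_large)   # the second is typically much smaller
```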

4 Central Limit Theorem

For the situation above, we have that X̄_n - μ → 0 as n → ∞ with probability 1. The central limit theorem states that if we magnify the difference by a factor of √n, then we see convergence of the distributions to a normal random variable.

Definition 4. A sequence of distribution functions {F_n; n ≥ 1} is said to converge in distribution to the distribution function F if

    lim_{n→∞} F_n(x) = F(x)

whenever x is a continuity point for F.

Theorem 5 (Central Limit Theorem). If the sequence {X_n; n ≥ 1} introduced above has common variance σ², then

    lim_{n→∞} P{(√n/σ)(X̄_n - μ) ≤ z} = Φ(z),

where Φ is the distribution function of a standard normal random variable. We often write

    (√n/σ)(X̄_n - μ) = (S_n - nμ)/(σ√n).
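The theorem can be illustrated by Monte Carlo. In the Python sketch below (Uniform(0,1) summands with μ = 1/2 and σ² = 1/12; the values n = 50 and 20,000 replicates are arbitrary choices), the empirical frequency of the event {(√n/σ)(X̄_n - μ) ≤ z} is compared with Φ(z):

```python
import math
import random

# Monte Carlo check of the CLT: standardize the mean of n Uniform(0,1)
# variables (mu = 1/2, sigma^2 = 1/12) and compare the frequency of
# { sqrt(n)(Xbar_n - mu)/sigma <= z } with Phi(z).
random.seed(1)
n, reps, z = 50, 20_000, 1.0
mu, sigma = 0.5, math.sqrt(1 / 12)

def standardized_mean():
    xbar = sum(random.random() for _ in range(n)) / n
    return math.sqrt(n) * (xbar - mu) / sigma

freq = sum(standardized_mean() <= z for _ in range(reps)) / reps
phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # Phi(z) for a standard normal

print(freq, phi)   # both near 0.84
```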