Week 2. Review of Probability, Random Variables and Univariate Distributions


Probability

Probability Motivation
What use is Probability Theory?
1. Probability models
2. Basis for statistical inference

Probability Outcomes and Events
Elementary Events: the distinguishable outcomes of an experiment
Sample Space: the set of all elementary events
Event: a subset of the sample space
Event Space: all events associated with an experiment
We work on a probability space, which consists of a sample space Ω and a σ-algebra of events. A σ-algebra is a collection of sets which
1. contains the empty set,
2. contains all complements,
3. and is stable under countable unions.
Example: Consider the roll of a die. What is an appropriate sample space and algebra of events?

Probability Outcomes and Events
Exercise 2.1
Prove that a set of size M (sample space) has 2^M subsets (event space).

Probability Probability Function
Probability Function: a set function with domain A (a σ-algebra of events) and counterdomain the interval [0, 1] satisfying
1. P[A] ≥ 0 for every A ∈ A
2. P[Ω] = 1
3. If A_1, A_2, ... is a sequence of mutually exclusive events in A (so that ∪_{i=1}^∞ A_i ∈ A), then
P[∪_{i=1}^∞ A_i] = Σ_{i=1}^∞ P[A_i]
Example: For the roll of the die it is natural to take Ω = {1, 2, 3, 4, 5, 6} and A to be the set of all subsets of Ω. Then P(A) = Card(A)/Card(Ω).
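The equally-likely model in the die example can be sketched in Python (the function and variable names here are my own, purely illustrative):

```python
from fractions import Fraction

# Sample space for one roll of a fair die
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(A) = Card(A) / Card(Omega) for the equally-likely model."""
    return Fraction(len(event & omega), len(omega))

# The axioms in action: P(Omega) = 1, and additivity over disjoint events
odd, even = {1, 3, 5}, {2, 4, 6}
print(prob(omega))             # 1
print(prob(odd) + prob(even))  # 1, since odd and even partition Omega
```

Using exact `Fraction` arithmetic avoids floating-point noise when checking the axioms.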

Probability Probability Function
Basic Rules of Probability
If A_1, A_2, ..., A_n are mutually exclusive elements in A, then
P[A_1 ∪ A_2 ∪ ... ∪ A_n] = Σ_{i=1}^n P[A_i]
If A ∈ A, then
P[Ā] = 1 − P[A]
For every two events A, B ∈ A,
P[A ∪ B] = P[A] + P[B] − P[AB]
If A, B ∈ A and A ⊂ B, then
P[A] ≤ P[B]
Here AB is shorthand for A ∩ B.

Probability Conditional Probability and Independence
Conditional Probability: the probability of event A given B,
P[A | B] = P[AB] / P[B] if P[B] > 0 (undefined for P[B] = 0)

Probability Conditional Probability and Independence
Rules of Conditional Probability
Theorem of Total Probability: If B_1, ..., B_n partition Ω and P[B_j] > 0, j = 1, ..., n, then
P[A] = Σ_{j=1}^n P[A | B_j] P[B_j]
Bayes' Formula: If B_1, ..., B_n partition Ω and P[B_j] > 0, j = 1, ..., n, then for every A ∈ A for which P[A] > 0,
P[B_k | A] = P[A | B_k] P[B_k] / Σ_{j=1}^n P[A | B_j] P[B_j]
Multiplication Rule: If A_1, ..., A_n ∈ A and P[A_1 ... A_{n−1}] > 0, then
P[A_1 ... A_n] = P[A_1] P[A_2 | A_1] ... P[A_n | A_1 ... A_{n−1}]

Probability Conditional Probability and Independence
Exercise 2.2
There are 5 urns, numbered 1 to 5. Each urn contains 10 balls. Urn i has i defective balls, i = 1, ..., 5. Consider the following experiment: first an urn is selected at random, and then a ball is selected at random from that urn. The experimenter does not know which urn was selected.
1. What is the probability that a defective ball will be selected?
2. If we have already selected the ball and noted that it is defective, what is the probability that it came from urn 5? Generalise to urn k, k = 1, ..., 5.
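The urn exercise is a direct application of total probability and Bayes' formula; a Python sketch of the computation (my own names, exact arithmetic):

```python
from fractions import Fraction

# Urn i (i = 1..5) holds 10 balls, i of them defective; an urn is picked
# uniformly at random, then a ball uniformly from that urn.
p_urn = Fraction(1, 5)
p_def_given_urn = {i: Fraction(i, 10) for i in range(1, 6)}

# Theorem of total probability: P[D] = sum_i P[D | urn i] P[urn i]
p_def = sum(p_def_given_urn[i] * p_urn for i in range(1, 6))
print(p_def)  # 3/10

# Bayes: P[urn k | D] = P[D | urn k] P[urn k] / P[D]
posterior = {k: p_def_given_urn[k] * p_urn / p_def for k in range(1, 6)}
print(posterior[5])  # 1/3
```

Note that the posterior probabilities are proportional to k, and they sum to 1 as they must.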

Probability Conditional Probability and Independence
Independent Events: events A and B are independent iff P[AB] = P[A]P[B]. It follows that
1. P[A | B] = P[A] if P[B] > 0
2. P[B | A] = P[B] if P[A] > 0

Probability Conditional Probability and Independence Exercise 2.3 Consider the experiment of tossing two dice. Let A = {total is odd}, B = {6 on the first die}, C = {total is seven}. 1. Are A and B independent? 2. Are A and C independent? 3. Are B and C independent?
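Exercise 2.3 can be checked by enumerating the 36 equally likely outcomes; a Python sketch (names are my own):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(pred):
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

A = lambda w: (w[0] + w[1]) % 2 == 1   # total is odd
B = lambda w: w[0] == 6                # 6 on the first die
C = lambda w: w[0] + w[1] == 7         # total is seven

def indep(E, F):
    """True iff P[EF] = P[E]P[F] exactly."""
    return prob(lambda w: E(w) and F(w)) == prob(E) * prob(F)

print(indep(A, B), indep(A, C), indep(B, C))  # True False True
```

A and C fail because a total of seven is always odd, so P[AC] = P[C] ≠ P[A]P[C].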

Probability Conditional Probability and Independence
Independence of Several Events: events A_1, ..., A_n are independent iff
1. P[A_i A_j] = P[A_i] P[A_j], i ≠ j
2. P[A_i A_j A_k] = P[A_i] P[A_j] P[A_k], i ≠ j, j ≠ k, i ≠ k
...
n. P[∩_{i=1}^n A_i] = Π_{i=1}^n P[A_i]

Probability Conditional Probability and Independence
Exercise 2.4
Show that pairwise independence does not imply independence, using the following events in the random experiment of tossing two dice:
1. A_1 = {odd face on first die}
2. A_2 = {odd face on second die}
3. A_3 = {odd total}
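Enumeration confirms the counterexample: each pair satisfies the product rule, but the triple does not. A Python sketch (my own names):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))

def prob(pred):
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

A1 = lambda w: w[0] % 2 == 1           # odd face on first die
A2 = lambda w: w[1] % 2 == 1           # odd face on second die
A3 = lambda w: (w[0] + w[1]) % 2 == 1  # odd total

# Each pair is independent (each probability is 1/2, each intersection 1/4)...
pair12 = prob(lambda w: A1(w) and A2(w)) == prob(A1) * prob(A2)
pair13 = prob(lambda w: A1(w) and A3(w)) == prob(A1) * prob(A3)
pair23 = prob(lambda w: A2(w) and A3(w)) == prob(A2) * prob(A3)

# ...but the triple is not: two odd faces force an even total
triple = prob(lambda w: A1(w) and A2(w) and A3(w))
print(pair12, pair13, pair23, triple)  # True True True 0
```

The triple intersection has probability 0, not the 1/8 that full independence would require.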

Random Variables Random Variables

Random Variables Random Variables and Cumulative Distribution Function
Random Variable: for a given probability space (Ω, A, P[·]), a function X with domain Ω and counterdomain the real line.
Distribution Function: the function F_X : R → [0, 1] such that
F_X(x) = P[X ≤ x] = P[{ω : X(ω) ≤ x}] for every x ∈ R.

Random Variables Random Variables and Cumulative Distribution Function
Properties of CDFs
1. lim_{x→−∞} F_X(x) = 0 and lim_{x→∞} F_X(x) = 1
2. F_X(a) ≤ F_X(b) for a < b
3. F_X(·) is continuous from the right: lim_{h↓0} F_X(x + h) = F_X(x)

Random Variables Random Variables and Cumulative Distribution Function Exercise 2.5 Consider the experiment of tossing two dice. Let X = {total of upturned faces} and Y = {absolute difference of upturned faces}. Sketch F Y.
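Before sketching F_Y by hand, its values can be tabulated by enumeration; a Python sketch (my own names):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
Y = [abs(a - b) for a, b in outcomes]  # absolute difference of upturned faces

def F_Y(y):
    """CDF: F_Y(y) = P[Y <= y]."""
    return Fraction(sum(1 for v in Y if v <= y), len(Y))

for y in range(6):
    print(y, F_Y(y))
# F_Y is a right-continuous step function with jumps at y = 0, 1, ..., 5
```

The printout gives the step heights to copy onto the sketch; F_Y(0) = 1/6 and F_Y(5) = 1.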

Random Variables Discrete Random Variables
Mass Functions
A random variable is discrete if the set of values it can take is countable. For a random variable X with distinct values x_1, x_2, ..., x_n, ..., the mass function is p_X : R → [0, 1] such that
p_X(x) = P[X = x_j] if x = x_j, j = 1, 2, ..., n, ...
p_X(x) = 0 if x ≠ x_j

Random Variables Discrete Random Variables
Properties of Mass Functions
1. p(x_j) ≥ 0, j = 1, 2, ...
2. p(x) = 0 for x ≠ x_j, j = 1, 2, ...
3. Σ_j p(x_j) = 1, where summation is over x_1, x_2, ..., x_n, ...

Random Variables Discrete Random Variables
Exercise 2.6
Consider the experiment of tossing two dice. Let X = {total of upturned faces} and Y = {absolute difference of upturned faces}. Give the probability function p_X and sketch it. Give p_Y.

Random Variables Expectations and Moments
Expectation
For discrete X with mass points x_1, x_2, ..., x_j, ...,
E[X] = Σ_j x_j p_X(x_j)

Random Variables Expectations and Moments
Let Y = g(X). Then Y is a discrete random variable taking values y_i = g(x_i). If g is strictly monotonic then the y_i are distinct and p_Y(y_i) = p_X(x_i). More generally, p_Y(y_i) = Σ_{j : g(x_j) = y_i} p_X(x_j). Provided the sums below are absolutely convergent, we have
E[g(X)] = E[Y] = Σ_k y_k p_Y(y_k) = Σ_k y_k Σ_{j : g(x_j) = y_k} p_X(x_j) = Σ_k Σ_{j : g(x_j) = y_k} g(x_j) p_X(x_j) = Σ_j g(x_j) p_X(x_j)

Random Variables Expectations and Moments
Variance
Let X be a random variable and let µ_X = E[X].
Discrete: for discrete X with mass points x_1, x_2, ..., x_j, ...,
Var[X] = Σ_j (x_j − µ_X)^2 p_X(x_j)
Variance in terms of expectations:
Var[X] = E[(X − E[X])^2] = E[X^2] − (E[X])^2

Random Variables Expectations and Moments Exercise 2.7 Consider the experiment of tossing two dice. Let X = {total of upturned faces} and Y = {absolute difference of upturned faces}. Compute E[X] and E[Y].
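The expectations in Exercise 2.7 follow from E[X] = Σ_j x_j p_X(x_j); a Python sketch of the computation (my own names):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
n = len(outcomes)

def expectation(values):
    """E = sum of value * mass over the distinct mass points."""
    return sum(Fraction(v * c, n) for v, c in Counter(values).items())

EX = expectation(a + b for a, b in outcomes)       # total of upturned faces
EY = expectation(abs(a - b) for a, b in outcomes)  # absolute difference
print(EX, EY)  # 7 and 35/18
```

E[X] = 7 also follows by symmetry; E[Y] = 35/18 is less obvious, which is where enumeration earns its keep.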

Random Variables Expectations and Moments
Properties of Expectations
1. E[c] = c for a constant c
2. E[c g(X)] = c E[g(X)] for a constant c
3. E[c_1 g_1(X) + c_2 g_2(X)] = c_1 E[g_1(X)] + c_2 E[g_2(X)]
4. E[g_1(X)] ≤ E[g_2(X)] if g_1(x) ≤ g_2(x) for all x

Random Variables Expectations and Moments
Two Useful Results
Chebyshev Inequality: for a RV X with finite variance,
P[|X − µ_X| ≥ r σ_X] = P[(X − µ_X)^2 ≥ r^2 σ_X^2] ≤ 1/r^2
Jensen Inequality: for a RV X with mean E[X] and g(·) a convex function,
E[g(X)] ≥ g(E[X])
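Chebyshev's bound can be checked exactly for the two-dice total; a Python sketch (my own names, exact arithmetic):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Exact check of Chebyshev's bound for X = total of two dice
outcomes = [a + b for a, b in product(range(1, 7), repeat=2)]
n = len(outcomes)
pmf = {x: Fraction(c, n) for x, c in Counter(outcomes).items()}

mu = sum(x * p for x, p in pmf.items())               # 7
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # 35/6

r = 2
tail = sum(p for x, p in pmf.items() if (x - mu) ** 2 >= r ** 2 * var)
print(tail, Fraction(1, r ** 2))  # 1/18 <= 1/4
```

The actual tail probability (1/18) is far below the bound (1/4): Chebyshev is universal but rarely tight.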

Random Variables Moments and Moment Generating Functions
Moments
Moments: for a RV X, the r-th moment of X is given by µ'_r = E[X^r], if the expectation exists.
Central Moments: for a RV X, the r-th moment of X about a is given by µ_r = E[(X − a)^r]; taking a = µ_X gives the r-th central moment.

Random Variables Moments and Moment Generating Functions
The First Four Moments about the Mean
µ_1 = E[(X − µ_X)] = 0
Variance: µ_2 = E[(X − µ_X)^2]
Skewness: µ_3 / σ_X^3 = E[(X − µ_X)^3] / σ_X^3
Kurtosis: µ_4 / σ_X^4 = E[(X − µ_X)^4] / σ_X^4

Random Variables Moments and Moment Generating Functions
Moment Generating Function
Let X be a RV with mass function p_X(·).
Discrete: m(t) = E[e^{tX}] = Σ_x e^{tx} p_X(x)
The MGF has the property that
dm(t)/dt |_{t=0} = µ_X
This can be extended to
d^r m(t)/dt^r |_{t=0} = E[X^r], the r-th moment

Random Variables Moments and Moment Generating Functions Exercise 2.8 Find the MGF of the binomial distribution ( ) n P (X = x) = p x (1 p) n x, x = 0, 1, 2,..., n x Use it to show that the mean is np and the variance is np(1 p)
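The binomial MGF is m(t) = (p e^t + 1 − p)^n, and its derivatives at t = 0 can be approximated numerically to check the claimed mean and variance; a Python sketch using central finite differences (step size h is my own illustrative choice):

```python
import math

# Numerical check that m(t) = (p e^t + 1 - p)^n has m'(0) = np and
# m''(0) - m'(0)^2 = np(1 - p), via central finite differences.
n, p = 20, 0.3

def m(t):
    return (p * math.exp(t) + 1 - p) ** n

h = 1e-4
m1 = (m(h) - m(-h)) / (2 * h)            # approximates m'(0) = E[X]
m2 = (m(h) - 2 * m(0) + m(-h)) / h ** 2  # approximates m''(0) = E[X^2]
print(m1, m2 - m1 ** 2)  # close to np = 6 and np(1-p) = 4.2
```

This is only a numerical sanity check; the exercise itself asks for the exact derivation by differentiating the MGF.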

Random Variables Moments and Moment Generating Functions
Other Distribution Summaries
Quantile: for a RV X, the q-th quantile η_q is the smallest number η satisfying F_X(η) ≥ q
Median: η_0.5
Interquartile Range: η_0.75 − η_0.25
Mode: the point at which p_X(·) attains its maximum

Special Univariate Distributions

Special Univariate Distributions Discrete Distributions
Discrete Uniform Distribution
p(x) = 1/N for x = 1, 2, 3, ..., N; 0 otherwise
E[X] = (N + 1)/2
Var[X] = (N^2 − 1)/12
[Figure: mass function of the Discrete Uniform(10) distribution, constant at 1/10]

Special Univariate Distributions Discrete Distributions
Bernoulli Distribution
p(x) = p^x (1 − p)^{1−x} for x = 0, 1, with 0 ≤ p ≤ 1; 0 otherwise
E[X] = p
Var[X] = p(1 − p) = pq
[Figure: mass function of the Bernoulli(0.4) distribution]

Special Univariate Distributions Discrete Distributions
Binomial Distribution
p(x) = (n choose x) p^x (1 − p)^{n−x} for x = 0, 1, ..., n, with 0 ≤ p ≤ 1; 0 otherwise
E[X] = np
Var[X] = np(1 − p)
[Figure: mass function of the Bin(20, 0.2) distribution]

Special Univariate Distributions Discrete Distributions
Hypergeometric Distribution
p(x; A, B, n) = (A choose x)(B choose n − x) / (A + B choose n) for x = 0, 1, ..., n; 0 otherwise
E[X] = np
Var[X] = npq (T − n)/(T − 1), where T = A + B, p = A/T and q = 1 − p
[Figure: mass function of the Hypergeometric(10, 5, 5) distribution]

Special Univariate Distributions Discrete Distributions
Poisson Distribution
p(x; λ) = e^{−λ} λ^x / x! for x = 0, 1, ..., with λ > 0; 0 otherwise
E[X] = Var[X] = λ
[Figure: mass function of the Poisson(2) distribution]

Special Univariate Distributions Discrete Distributions
Exercise 2.9
Prove that E[X] = Var[X] = λ using the MGF of the Poisson distribution.
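Alongside the MGF proof, the identity E[X] = Var[X] = λ can be verified numerically by summing the pmf far enough out that the truncated tail is negligible; a Python sketch (my own names):

```python
import math

# Direct numeric check that Poisson(lam) has mean and variance both
# equal to lam, truncating the infinite sum at x = 99.
lam = 2.0
pmf = [math.exp(-lam) * lam ** x / math.factorial(x) for x in range(100)]

mean = sum(x * p for x, p in enumerate(pmf))
var = sum((x - mean) ** 2 * p for x, p in enumerate(pmf))
print(mean, var)  # both approximately 2.0
```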

Special Univariate Distributions Discrete Distributions
Uses of the Poisson Distribution
For large n and small p, X ~ Bin(n, p) is approximately Poisson(np).
A Poisson process with rate λ per unit time is such that
1. X, the number of occurrences of an event in a given time interval of length t, is Poisson(λt)
2. The numbers of events in non-overlapping time intervals are independent
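The quality of the Poisson approximation to the binomial can be seen by comparing the two pmfs pointwise; a Python sketch (the parameter choices n = 1000, p = 0.002 are my own, picked so that np = 2):

```python
import math

def binom_pmf(x, n, p):
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    return math.exp(-lam) * lam ** x / math.factorial(x)

# For large n and small p, Bin(n, p) is close to Poisson(np)
n, p = 1000, 0.002  # np = 2
max_gap = max(abs(binom_pmf(x, n, p) - poisson_pmf(x, n * p))
              for x in range(20))
print(max_gap)  # small: the two pmfs agree to about 3 decimal places
```

A classical bound (Le Cam) limits the total variation distance by np^2, here 0.004, consistent with the tiny pointwise gap.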

Special Univariate Distributions Discrete Distributions
Exercise 2.10
The number X of insect larvae found on 1 cm^2 of a petri plate is assumed to follow a Poisson distribution with λ = 3. Find P(X ≤ 3), P(X > 1), P(2 ≤ X ≤ 4) and F(4.2). Find an example (plot the distribution on a petri plate) where the assumption of a Poisson distribution would not be reasonable.
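The requested probabilities reduce to partial sums of the Poisson pmf; a Python sketch (my own names):

```python
import math

lam = 3.0

def poisson_pmf(x):
    return math.exp(-lam) * lam ** x / math.factorial(x)

def F(t):
    """CDF at t: sum of the pmf over integers x <= t."""
    return sum(poisson_pmf(x) for x in range(math.floor(t) + 1))

print(F(3))         # P(X <= 3)
print(1 - F(1))     # P(X > 1)
print(F(4) - F(1))  # P(2 <= X <= 4) = F(4) - F(1)
print(F(4.2))       # equals F(4), since X is integer-valued
```

Note F(4.2) = F(4): the CDF of a discrete variable is flat between mass points.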

Special Univariate Distributions Discrete Distributions
Geometric and Negative Binomial Distribution
p(x; p, r) = (x + r − 1 choose x) p^r q^x = (−r choose x) p^r (−q)^x for x = 0, 1, ...; 0 otherwise, where 0 ≤ p ≤ 1 and q = 1 − p
E[X] = rq/p
Var[X] = rq/p^2
[Figure: mass function of the NB(1, 0.4) = Geometric(0.4) distribution]

Special Univariate Distributions Discrete Distributions
Exercise 2.11
Let X ~ NB(r, p). Find its MGF and use it to derive E[X] and Var[X].