
Example 1. Assume that X follows the normal distribution N(2, 2²). Estimate the probabilities: (a) P(X ≤ 3); (b) P(X ≥ 1); (c) P(1 ≤ X ≤ 3).

First of all, we note that µ = 2 and σ = 2.

(a) Since X ≤ 3 is equivalent to Z = (X − 2)/2 ≤ (3 − 2)/2 = 0.5, we find from the table that

P(X ≤ 3) = P(Z ≤ 0.5) = Φ(0.5) ≈ 0.6915.

(b) Since X ≥ 1 is equivalent to Z = (X − 2)/2 ≥ (1 − 2)/2 = −0.5, we find from the table that

P(X ≥ 1) = P(Z ≥ −0.5) = P(Z ≤ 0.5) = Φ(0.5) ≈ 0.6915.

(c) It follows from

P(X ≤ 1) = 1 − P(X ≥ 1) ≈ 1 − 0.6915 = 0.3085

that

P(1 ≤ X ≤ 3) = P(X ≤ 3) − P(X ≤ 1) ≈ 0.6915 − 0.3085 = 0.3830.

Exercise 1. Assume that X follows the normal distribution N(1, 9). Estimate (a) P(X ≤ 1.4); (b) P(X ≥ −1.22); (c) P(−1.22 ≤ X ≤ 1.4).
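As a quick numerical check of Example 1 (our addition, not part of the original notes), the following Python sketch replaces the N(0, 1) table by the error function from the standard library; the helper name Phi is ours.

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal c.d.f. Phi(z) = P(Z <= z), computed from the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 2, 2                                       # X ~ N(2, 2^2)
print(Phi((3 - mu) / sigma))                           # (a) P(X <= 3)      ~ 0.6915
print(1 - Phi((1 - mu) / sigma))                       # (b) P(X >= 1)      ~ 0.6915
print(Phi((3 - mu) / sigma) - Phi((1 - mu) / sigma))   # (c) P(1 <= X <= 3) ~ 0.3829
```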

Example 2. Estimate k such that P(Z ≥ k) = 0.1, where Z follows the standard normal distribution.

Note that for such k, we have

Φ(k) = P(Z ≤ k) = 1 − 0.1 = 0.9.

From the table we find

Φ(1.28) ≈ 0.8997 and Φ(1.29) ≈ 0.9015,

which means that k can be approximated by either 1.28 or 1.29. Knowing that k ∈ (1.28, 1.29), we can get a better approximation of k using the following linear interpolation:

(0.9 − 0.8997)/(0.9015 − 0.8997) = (k − 1.28)/(1.29 − 1.28),

which implies that k ≈ 1.28 + 1/600 ≈ 1.2817.

Exercise 2. Estimate k such that P(−k ≤ Z ≤ k) = 0.97, where Z is the standard normal random variable.
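Instead of reading the table backwards, Φ can also be inverted numerically. The sketch below is our addition, not part of the notes; Phi_inv is an illustrative helper that solves Φ(k) = p by bisection.

```python
from math import erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def Phi_inv(p, lo=-10.0, hi=10.0, tol=1e-10):
    """Solve Phi(k) = p by bisection -- the numerical analogue of reading the table backwards."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(Phi_inv(0.9))   # ~ 1.2816, consistent with k ~ 1.28 + 1/600 from the interpolation
```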

12.1 Some Special Probabilities

For any Z following N(0, 1), we have

P(−1 ≤ Z ≤ 1) ≈ 0.68, P(−2 ≤ Z ≤ 2) ≈ 0.95, and P(−3 ≤ Z ≤ 3) ≈ 0.99.

So, for any X following N(µ, σ²), we have

P(µ − σ ≤ X ≤ µ + σ) ≈ 0.68, P(µ − 2σ ≤ X ≤ µ + 2σ) ≈ 0.95, and P(µ − 3σ ≤ X ≤ µ + 3σ) ≈ 0.99.
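These three special probabilities are easy to verify numerically; the short check below is our addition, using the same Phi helper built from math.erf.

```python
from math import erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

for s in (1, 2, 3):
    print(s, round(Phi(s) - Phi(-s), 4))   # 0.6827, 0.9545, 0.9973
```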

13 Applications of the Normal Distribution

Example 3. Suppose the salary of a group of 1000 civil servants follows the normal distribution N(10000, 1000²). (a) Estimate the number of civil servants having salary less than 10500. (b) Estimate the lowest salary of the top 200 civil servants.

First of all, we note that µ = 10000 and σ = 1000. Let X be the salary of a civil servant, and let Z = (X − 10000)/1000.

(a) Then we have

P(X ≤ 10500) = P(Z ≤ (10500 − 10000)/1000) = P(Z ≤ 0.5) ≈ 0.6915 (from the N(0, 1) table).

So, the desired number of civil servants is approximately 1000 × 0.6915 ≈ 692.

(b) We are supposed to estimate k such that

P(X ≤ k) = (1000 − 200)/1000 = 0.8,

or equivalently,

P(Z ≤ (k − 10000)/1000) = Φ((k − 10000)/1000) = Φ(K) = 0.8, where K = (k − 10000)/1000.

From the table, we find

Φ(0.84) ≈ 0.7995 and Φ(0.85) ≈ 0.8023,

which means K can be approximated by either 0.84 or 0.85, and accordingly k can be approximated by either 10840 or 10850. To have a better approximation, we apply the linear interpolation

(0.8 − 0.7995)/(0.8023 − 0.7995) = (K − 0.84)/(0.85 − 0.84),

which means K can be approximated by 0.8418 and accordingly k by 10842.
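The sketch below is our addition and checks both parts of Example 3; Phi and Phi_inv are illustrative helpers standing in for the N(0, 1) table.

```python
from math import erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def Phi_inv(p, lo=-10.0, hi=10.0, tol=1e-10):
    """Bisection inverse of Phi, standing in for the table lookup."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

mu, sigma = 10000, 1000
# (a) expected number of the 1000 civil servants earning less than 10500
print(1000 * Phi((10500 - mu) / sigma))   # ~ 691.5, rounded to 692 in the notes
# (b) cutoff salary k with P(X <= k) = 0.8
print(mu + sigma * Phi_inv(0.8))          # ~ 10841.6, close to the interpolated 10842
```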

13.1 From Binomial or Poisson to Normal

This subsection presents the relationship between the binomial and Poisson distributions and the normal distribution. Let us first recall the central limit theorem.

Proposition 1 (Central limit theorem). Let X₁, X₂, ..., Xₙ be a sequence of independent, identically distributed random variables with mean µ and variance σ². Then the cumulative distribution function of the following random variable tends to that of the standard normal random variable as n → ∞:

Zₙ = (X₁ + X₂ + ... + Xₙ − nµ)/(σ√n).

Proposition 2. For a large λ, Poisson(λ) can be approximated by N(λ, λ).

Proof. It follows from the fact that (for integer λ) Poisson(λ) is the sum of λ independent Poisson(1) random variables, together with the central limit theorem.

Proposition 3. For a large n, Binomial(n, p) can be approximated by N(np, np(1 − p)).

Proof. It follows from the fact that Binomial(n, p) is the sum of n independent Bernoulli(p) random variables, together with the central limit theorem.
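To see Proposition 2 in action, here is a small check (our addition, with λ = 100 and the evaluation point 110 chosen only for illustration) comparing the exact Poisson c.d.f. with its N(λ, λ) approximation; the half-unit shift anticipates the adjustment discussed in Example 4 below.

```python
from math import erf, exp, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

lam, x = 100, 110                        # illustrative choice of lambda and evaluation point
# exact Poisson(lam) c.d.f. at x, accumulating the terms e^{-lam} lam^k / k! iteratively
term = cdf = exp(-lam)
for k in range(1, x + 1):
    term *= lam / k
    cdf += term
print(cdf)                               # exact P(Poisson(100) <= 110), about 0.85
print(Phi((x + 0.5 - lam) / sqrt(lam)))  # N(lam, lam) approximation with a half-unit shift
```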

Example 4. A fair coin is tossed 100 times. We can find the probability that the number of heads obtained is between 48 and 52 by using the normal approximation: the number of heads obtained from the 100 tosses follows Binomial(100, 0.5), which, by Proposition 3, can be approximated by N(50, 25). Letting X follow N(50, 25) and Z = (X − 50)/5, we then deduce that the total area of the five rectangles of the binomial probability histogram over 48, 49, 50, 51, 52 is approximated by P(47.5 ≤ X ≤ 52.5),¹ which can be computed as

P((47.5 − 50)/5 ≤ Z ≤ (52.5 − 50)/5) = P(−0.5 ≤ Z ≤ 0.5) ≈ 1 − 2(1 − 0.6915) = 0.383.

Exercise 3. A fair coin is tossed 200 times. Find the probability that the number of heads obtained is between 98 and 102 by using the normal distribution approximation.

¹ Since we are approximating the p.m.f. of a discrete random variable by a continuous one, some adjustment (the so-called continuity correction) has been made so as to get a better approximation.
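As a sanity check on Example 4 (our addition, not in the notes), the exact binomial probability can be compared with the normal approximation, continuity adjustment included:

```python
from math import comb, erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 100, 0.5
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(48, 53))
approx = Phi((52.5 - 50) / 5) - Phi((47.5 - 50) / 5)
print(exact, approx)   # both close to 0.383
```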

Example 5. Let Y₁, Y₂, ..., Yₙ be n independent Poisson random variables having the same parameter 1, that is, each Yᵢ follows Poisson(1). Then Zₙ = Y₁ + Y₂ + ... + Yₙ, the sum of the n Poisson random variables with parameter 1, is a Poisson random variable with parameter n, namely, Zₙ follows Poisson(n). By the central limit theorem, for large n, (Zₙ − n)/√n approximately follows the normal distribution N(0, 1). Then, we have

P(Zₙ = n) = P(n − 1 < Zₙ ≤ n) = P(−1/√n < (Zₙ − n)/√n ≤ 0)
≈ ∫_{−1/√n}^{0} (1/√(2π)) e^{−x²/2} dx
≈ ∫_{−1/√n}^{0} (1/√(2π)) dx = 1/√(2πn),

where we have used the fact that for large n, e^{−x²/2} ≈ 1 for x ∈ (−1/√n, 0). Now, since Zₙ is a Poisson random variable with parameter n, we have

P(Zₙ = n) = e^{−n} n^n / n!,

which, together with the derived approximation P(Zₙ = n) ≈ 1/√(2πn), implies that

n! ≈ n^{n+1/2} e^{−n} √(2π),

which is Stirling's formula. So, using the connection between the Poisson distribution and the normal distribution, we have given a heuristic argument for Stirling's formula.
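To get a feel for how good Stirling's formula is, the following short loop (our addition) compares n! with n^{n+1/2} e^{−n} √(2π) for a few values of n:

```python
from math import exp, factorial, pi, sqrt

for n in (5, 10, 50):
    stirling = n**(n + 0.5) * exp(-n) * sqrt(2 * pi)
    print(n, factorial(n), stirling, factorial(n) / stirling)   # the ratio tends to 1 as n grows
```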

A Summary

1. Bernoulli experiment
2. Bernoulli distribution Bernoulli(p)
3. Geometric distribution Geometric(p)
4. Binomial distribution Binomial(n, p)
5. Exponential distribution Expo(λ)
6. Poisson distribution Poisson(λ)
7. Normal distribution N(µ, σ²)
8. The standard normal distribution N(0, 1)
9. Linear interpolation method
10. Normal approximates binomial
11. Normal approximates Poisson