
PSTAT 160A Final Exam Solutions
December 10, 2015

Name: ____________________    Student ID #: ____________________

Problems 1–12, worth 10 points each. Total: 120 points.

1. (10 points) Take a Markov chain with the state space $\{1, 2, 3, 4\}$ and transition matrix
$$A = \begin{pmatrix} 0.1 & 0.2 & 0.3 & 0.4 \\ 0.3 & 0 & 0.5 & 0.2 \\ 0 & 0.2 & 0.8 & 0 \\ 0.5 & 0 & 0.5 & 0 \end{pmatrix}.$$
Find $P(X_2 = 4)$, if the initial distribution is $P(X_0 = 1) = 0.3$, $P(X_0 = 2) = 0.4$, $P(X_0 = 3) = 0$, $P(X_0 = 4) = 0.3$.

Solution: The distribution of $X_1$ is given by the vector $p(1) = p(0)A$, where $p(0) = [0.3,\ 0.4,\ 0,\ 0.3]$. Calculating this, we get $p(1) = [0.3,\ 0.06,\ 0.44,\ 0.2]$. Similarly, the distribution of $X_2$ is given by the vector $p(2) = p(1)A$. The fourth component of this vector is equal to $P(X_2 = 4)$. Calculating this, we get 0.132.
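As a sanity check, here is the same computation as a short Python sketch (assuming numpy is available; the matrix and initial distribution are those given above):

```python
import numpy as np

# Transition matrix from Problem 1.
A = np.array([
    [0.1, 0.2, 0.3, 0.4],
    [0.3, 0.0, 0.5, 0.2],
    [0.0, 0.2, 0.8, 0.0],
    [0.5, 0.0, 0.5, 0.0],
])

# Initial distribution P(X_0 = i).
p0 = np.array([0.3, 0.4, 0.0, 0.3])

p1 = p0 @ A   # distribution of X_1
p2 = p1 @ A   # distribution of X_2
print(p1)     # [0.3  0.06 0.44 0.2 ]
print(p2[3])  # ≈ 0.132 = P(X_2 = 4)
```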

2. (10 points) Fix the parameters $N_1 = 10000$, $N_2 = 5000$, $N_3 = 2000$, $\lambda_1 = 2$, $\lambda_2 = 3$, $\lambda_3 = 5$. Assume that an insurance company has $N_1$ clients with claims distributed as Poisson with parameter $\lambda_1$: $\mathrm{Poi}(\lambda_1)$, $N_2$ clients with claims $\mathrm{Poi}(\lambda_2)$, and $N_3$ clients with claims $\mathrm{Poi}(\lambda_3)$. Suppose that this company wants to assign to each client a premium proportional to the mean (expected value) of his claim, so that this client pays this premium. The company wants to collect enough money from the premiums so that it can pay all the claims with probability greater than or equal to 99%. Find the premium for each client.

Solution: Use the normal approximation (Central Limit Theorem). Let $X_1, \ldots, X_{N_1} \sim \mathrm{Poi}(\lambda_1)$ i.i.d., $Y_1, \ldots, Y_{N_2} \sim \mathrm{Poi}(\lambda_2)$ i.i.d., $Z_1, \ldots, Z_{N_3} \sim \mathrm{Poi}(\lambda_3)$ i.i.d. The total amount of claims is
$$S := X_1 + \ldots + X_{N_1} + Y_1 + \ldots + Y_{N_2} + Z_1 + \ldots + Z_{N_3}.$$
By the Central Limit Theorem, $(S - ES)/\sqrt{\operatorname{Var} S} \approx \mathcal{N}(0, 1)$. Because $E\xi = \operatorname{Var}\xi = \lambda$ for $\xi \sim \mathrm{Poi}(\lambda)$, we get
$$ES = N_1\lambda_1 + N_2\lambda_2 + N_3\lambda_3 = 45000, \qquad \operatorname{Var} S = N_1\lambda_1 + N_2\lambda_2 + N_3\lambda_3 = 45000.$$
Therefore
$$\frac{S - 45000}{\sqrt{45000}} \approx \mathcal{N}(0, 1), \quad \text{and} \quad P\left(\frac{S - 45000}{\sqrt{45000}} \le 2.326\right) = 0.99.$$
We can rewrite this as $P\left(S \le 45000 + 2.326\sqrt{45000}\right) = 0.99$. In other words, the company needs to collect $45000 + 2.326\sqrt{45000} \approx 45493$ from premiums. If the premium for a client with a $\mathrm{Poi}(\lambda_i)$ claim is $\lambda_i u$, for $i = 1, 2, 3$, then the total amount of premiums is $\lambda_1 u N_1 + \lambda_2 u N_2 + \lambda_3 u N_3 = 45000u$. Therefore we need $45000u = 45493$, so $u = 1.011$. The clients with $\mathrm{Poi}(2)$ claims are charged $2u = 2.022$, the clients with $\mathrm{Poi}(3)$ claims are charged $3u = 3.033$, and the clients with $\mathrm{Poi}(5)$ claims are charged $5u = 5.055$.
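The numbers above can be reproduced with a few lines of Python (a sketch; the 99% quantile 2.326 is taken from the normal table at the end of this exam):

```python
import math

N = [10000, 5000, 2000]
lam = [2, 3, 5]

# For Poisson claims, the mean and variance of the total S coincide.
mean = var = sum(n * l for n, l in zip(N, lam))  # 45000
z99 = 2.326                                      # 99% quantile of N(0, 1)

required = mean + z99 * math.sqrt(var)           # ≈ 45493
u = required / mean                              # ≈ 1.011 per unit of mean claim
for l in lam:
    print(f"Poi({l}) client pays {l * u:.3f}")
```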

3. (10 points) Find all stationary distributions for the Markov chain with the transition matrix
$$A = \begin{pmatrix} 0.4 & 0 & 0.6 \\ 0 & 1 & 0 \\ 0.1 & 0 & 0.9 \end{pmatrix}.$$
Solution: A stationary distribution $p = [p_1\ p_2\ p_3]$ must satisfy $p = pA$. Write this as a system of equations:
$$\begin{cases} p_1 = 0.4p_1 + 0.1p_3 \\ p_2 = p_2 \\ p_3 = 0.6p_1 + 0.9p_3 \end{cases} \;\Longrightarrow\; 0.6p_1 = 0.1p_3 \;\Longrightarrow\; p_3 = 6p_1.$$
Combining this with $p_1 + p_2 + p_3 = 1$ and denoting $p_2 = u$, we get $p_1 + p_3 = 1 - u$ and $6p_1 = p_3$, so $p_1 = \frac{1}{7}(1 - u)$ and $p_3 = \frac{6}{7}(1 - u)$. The final answer is (where $0 \le u \le 1$):
$$p = \left[\tfrac{1}{7}(1 - u),\; u,\; \tfrac{6}{7}(1 - u)\right].$$
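A quick way to confirm that every vector of this form is stationary is to check $p = pA$ numerically for a sample value of $u$ (a Python sketch assuming numpy):

```python
import numpy as np

A = np.array([
    [0.4, 0.0, 0.6],
    [0.0, 1.0, 0.0],
    [0.1, 0.0, 0.9],
])

u = 0.3  # any value in [0, 1] gives a stationary distribution
p = np.array([(1 - u) / 7, u, 6 * (1 - u) / 7])
print(np.allclose(p @ A, p), np.isclose(p.sum(), 1.0))  # True True
```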

4. (10 points) Toss a fair coin repeatedly. Let $F_n$ be the σ-subalgebra generated by the first $n$ results, $F_0 := \{\emptyset, \Omega\}$. Let $X_n$ be the number of Heads during the first $n$ tosses for $n = 1, 2, \ldots$ and $X_0 := 0$. Find a constant $c$ such that the process $(Y_n)_{n \ge 0}$ is an $(F_n)_{n \ge 0}$-martingale:
$$Y_n := 3X_n - cn, \quad n = 0, 1, 2, \ldots$$
Solution: We can represent $X_n$ as follows. For $n = 1, 2, \ldots$ let
$$Z_n = \begin{cases} 1, & \text{if Heads on the } n\text{th toss;} \\ 0, & \text{otherwise.} \end{cases}$$
Then $X_n = Z_1 + \ldots + Z_n$. Now, $Z_1, Z_2, \ldots$ are i.i.d. Bernoulli: $P(Z_i = 0) = P(Z_i = 1) = 1/2$. Also, $F_n$ is generated by $Z_1, \ldots, Z_n$. Therefore $Y_n = 3X_n - cn = 3(Z_1 + \ldots + Z_n) - cn$, and $Y_{n+1} = Y_n + 3Z_{n+1} - c$. Therefore, because $Y_n$ is $F_n$-measurable and $Z_{n+1}$ is independent of $F_n$, we have:
$$E(Y_{n+1} \mid F_n) = E(Y_n + 3Z_{n+1} - c \mid F_n) = Y_n + E(3Z_{n+1} - c).$$
To make $(Y_n)_{n \ge 0}$ a martingale, we need $E(3Z_{n+1} - c) = 0$. This can be rewritten as
$$3\left(0 \cdot \tfrac12 + 1 \cdot \tfrac12\right) - c = 0 \;\Longrightarrow\; c = \tfrac32.$$
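A Monte Carlo sketch in Python illustrating the result (the horizon and trial count are arbitrary choices, not part of the problem): with $c = 3/2$ the process stays centered, $EY_n = 0$ for every $n$, as a martingale started at $Y_0 = 0$ must be.

```python
import random

c = 1.5
n, trials = 20, 200_000
total = 0.0
for _ in range(trials):
    heads = sum(random.randint(0, 1) for _ in range(n))  # X_n for one run
    total += 3 * heads - c * n                           # Y_n for one run
print(total / trials)  # ≈ 0
```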

5. (10 points) Consider the probability space $\Omega = \{0, 1, 2, \ldots, 11\}$ with three random variables
$$X(\omega) = \begin{cases} 1, & \omega \le 5; \\ 2, & \omega \ge 6, \end{cases} \qquad Y(\omega) = \begin{cases} 1, & \omega \text{ is even;} \\ 10, & \omega \text{ is odd,} \end{cases} \qquad Z(\omega) = \begin{cases} 12, & \omega = 6, a, b; \\ -12, & \text{else.} \end{cases}$$
Here, $a, b \in \Omega$ are some elements, $a < b$. Find the values of $a$ and $b$ such that $Z$ is measurable with respect to the σ-subalgebra $F := \sigma(X, Y)$ generated by $X$ and $Y$.

Solution: The σ-subalgebra $F$ contains the events
$$\{X = 1, Y = 1\} = \{0, 2, 4\}, \quad \{X = 2, Y = 1\} = \{6, 8, 10\}, \quad \{X = 1, Y = 10\} = \{1, 3, 5\}, \quad \{X = 2, Y = 10\} = \{7, 9, 11\}.$$
Other events in $F$ (except the empty set $\emptyset$) are unions of these events. For $Z$ to be $F$-measurable, the event $\{Z = 12\}$ has to belong to this σ-subalgebra, that is, to be a union of some of these events. But $\{Z = 12\} = \{6, a, b\}$, so this is possible only if $\{6, a, b\} = \{6, 8, 10\}$, that is, $a = 8$, $b = 10$.

6. (10 points) Consider the following independent random variables: $X_n \sim \mathcal{N}(0, 3^{-n})$, $n = 1, 2, \ldots$ Show that for all $N = 1, 2, \ldots$, we have:
$$P\left(\max\left(0, X_1, X_1 + X_2, \ldots, X_1 + \ldots + X_N\right) \ge 6\right) \le \frac{1}{72}.$$
Solution: The process $(S_n)_{n \ge 0}$, defined by $S_0 := 0$, $S_n := X_1 + \ldots + X_n$, $n = 1, 2, \ldots$, is an $(F_n)_{n \ge 0}$-martingale, where $F_0 := \{\emptyset, \Omega\}$, and $F_n$ is generated by $X_1, \ldots, X_n$. Apply the martingale maximal inequality for $\lambda = 6$ and $p = 2$:
$$P\left(\max_{n = 0, \ldots, N} S_n \ge 6\right) \le \frac{E|S_N|^2}{\lambda^2}.$$
Now, $ES_N = EX_1 + \ldots + EX_N = 0$, and so
$$E|S_N|^2 = ES_N^2 = ES_N^2 - (ES_N)^2 = \operatorname{Var} S_N = \operatorname{Var} X_1 + \ldots + \operatorname{Var} X_N = 3^{-1} + 3^{-2} + \ldots + 3^{-N} \le \sum_{n=1}^{\infty} 3^{-n} = \frac12.$$
Therefore
$$\frac{E|S_N|^2}{\lambda^2} \le \frac{1/2}{6^2} = \frac{1}{72}.$$
This completes the proof.
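For illustration, a simulation sketch (the horizon $N = 20$ and the trial count are arbitrary choices): the empirical probability sits far below the bound $1/72 \approx 0.0139$, which is consistent with the inequality only guaranteeing an upper bound.

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 20, 100_000
# X_n ~ N(0, 3^{-n}), so the standard deviations are 3^{-n/2}.
X = rng.normal(0.0, np.sqrt(3.0 ** -np.arange(1, N + 1)), size=(trials, N))
S = np.cumsum(X, axis=1)                       # partial sums S_1, ..., S_N
running_max = np.maximum(S.max(axis=1), 0.0)   # max(0, S_1, ..., S_N)
print((running_max >= 6).mean(), "<=", 1 / 72)  # empirical ≈ 0, bound ≈ 0.0139
```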

7. (10 points) Take i.i.d. random variables $X_1, X_2, \ldots$, as well as a Poisson random variable $\tau \sim \mathrm{Poi}(\lambda)$, independent of $X_1, X_2, \ldots$ Assume $EX_i = \mu$, $\operatorname{Var} X_i = \sigma^2$, $Ee^{tX_i} = \varphi(t)$ for $t \in \mathbb{R}$. Consider the following sum:
$$S = \sum_{k=1}^{\tau} X_k.$$
Show that $Ee^{tS} = e^{\lambda(\varphi(t) - 1)}$ for $t \in \mathbb{R}$. Using this, express $ES$ and $\operatorname{Var} S$ as combinations of $\mu$, $\sigma^2$, and $\lambda$.

Solution: Because $\tau \sim \mathrm{Poi}(\lambda)$, we have $P(\tau = k) = \frac{\lambda^k}{k!}e^{-\lambda}$ for $k = 0, 1, 2, \ldots$ Therefore (because $X_1, X_2, \ldots$ are i.i.d.)
$$\psi(t) := Ee^{tS} = \sum_{k=0}^{\infty} Ee^{t(X_1 + \ldots + X_k)} \frac{\lambda^k}{k!} e^{-\lambda} = \sum_{k=0}^{\infty} \left(Ee^{tX_i}\right)^k \frac{\lambda^k}{k!} e^{-\lambda} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\varphi(t)\lambda)^k}{k!} = e^{-\lambda} e^{\lambda\varphi(t)} = e^{\lambda(\varphi(t) - 1)}.$$
Next,
$$\psi'(t) = \lambda\varphi'(t)\psi(t), \qquad \psi''(t) = \lambda\varphi''(t)\psi(t) + \lambda^2(\varphi'(t))^2\psi(t).$$
Letting $t = 0$, we have $\psi(0) = 1$, $\varphi'(0) = EX_i = \mu$, $\varphi''(0) = EX_i^2 = \mu^2 + \sigma^2$, and therefore
$$ES = \psi'(0) = \lambda\varphi'(0)\psi(0) = \lambda\mu, \qquad ES^2 = \psi''(0) = \lambda(\mu^2 + \sigma^2) + \lambda^2\mu^2 = \mu^2(\lambda + \lambda^2) + \sigma^2\lambda,$$
so that $\operatorname{Var} S = ES^2 - (ES)^2 = \lambda(\mu^2 + \sigma^2)$.
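A Monte Carlo check of the two moment formulas, sketched in Python with numpy. The values of $\lambda$, $\mu$, $\sigma$ are arbitrary choices, and normal claims are used as one concrete choice of the $X_i$: conditionally on $\tau = k$, the sum of $k$ i.i.d. $\mathcal{N}(\mu, \sigma^2)$ variables is exactly $\mathcal{N}(k\mu, k\sigma^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, sigma = 4.0, 1.5, 2.0  # arbitrary parameters for this check
trials = 200_000

tau = rng.poisson(lam, size=trials)           # claim counts tau ~ Poi(lam)
# Given tau = k, S ~ N(k*mu, k*sigma^2); scale 0 when tau = 0 gives S = 0.
S = rng.normal(tau * mu, sigma * np.sqrt(tau))
print(S.mean(), lam * mu)                      # both ≈ 6.0
print(S.var(), lam * (mu**2 + sigma**2))       # both ≈ 25.0
```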

8. (10 points) Consider the following Markov chain:
$$A = \begin{pmatrix} 0.4 & 0 & 0.3 & 0.3 & 0 \\ 0 & 0.3 & 0.6 & 0.1 & 0 \\ 0 & 0 & 0.5 & 0.5 & 0 \\ 0 & 0 & 0.1 & 0.9 & 0 \\ 0 & 0.2 & 0.3 & 0 & 0.5 \end{pmatrix}.$$
For each transient state $i$, find the mean time $m_i$ spent in $i$ if you start from $i$.

Solution: There are three transient states: $i = 1, 2, 5$. The corresponding submatrix is
$$P = \begin{pmatrix} 0.4 & 0 & 0 \\ 0 & 0.3 & 0 \\ 0 & 0.2 & 0.5 \end{pmatrix}.$$
Let us calculate $(I_3 - P)^{-1}$, where
$$I_3 - P = \begin{pmatrix} 0.6 & 0 & 0 \\ 0 & 0.7 & 0 \\ 0 & -0.2 & 0.5 \end{pmatrix}.$$
We can calculate this inverse matrix by splitting into two blocks: the upper-left corner element and the $2 \times 2$ lower-right corner matrix. Let us invert this latter matrix:
$$\begin{pmatrix} 0.7 & 0 \\ -0.2 & 0.5 \end{pmatrix}^{-1} = \frac{1}{0.7 \cdot 0.5 - 0 \cdot (-0.2)} \begin{pmatrix} 0.5 & 0 \\ 0.2 & 0.7 \end{pmatrix} = \begin{pmatrix} 10/7 & 0 \\ 4/7 & 2 \end{pmatrix}.$$
Therefore
$$(I_3 - P)^{-1} = \begin{pmatrix} 5/3 & 0 & 0 \\ 0 & 10/7 & 0 \\ 0 & 4/7 & 2 \end{pmatrix},$$
and the diagonal entries give $m_1 = \frac53$, $m_2 = \frac{10}{7}$, $m_5 = 2$.
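The diagonal of the fundamental matrix $(I_3 - P)^{-1}$ can also be checked numerically (a Python sketch assuming numpy):

```python
import numpy as np

# Substochastic matrix over the transient states 1, 2, 5 (in that order).
P = np.array([
    [0.4, 0.0, 0.0],
    [0.0, 0.3, 0.0],
    [0.0, 0.2, 0.5],
])

M = np.linalg.inv(np.eye(3) - P)  # fundamental matrix
print(np.diag(M))                 # [5/3, 10/7, 2] ≈ [1.667 1.429 2.   ]
```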

9. (10 points) For a random walk $(X_n)_{n \ge 0}$, starting from $X_0 = -2$, with $p = 0.4$, $q = 0.6$, find
$$P\left(X_6 = 0,\; X_{12} = 2,\; X_n \ge -3,\; n = 0, \ldots, 12\right).$$
Solution: First, let us find the number of trajectories of the random walk from $(0, -2)$ to $(6, 0)$ which stay above the line $y = -3$. This number is equal to the total number of trajectories from $(0, -2)$ to $(6, 0)$ minus the number of such trajectories which touch or cross the line $y = -3$. The latter number, by the reflection principle, is equal to the number of trajectories from $(0, -2)$ to the symmetric reflection of $(6, 0)$ with respect to the line $y = -3$, that is, to $(6, -6)$.

The number of trajectories from $(0, -2)$ to $(6, 0)$: if they make $a$ steps up and $b$ steps down, then $a + b = 6$ and $a - b = 0 - (-2) = 2$, so $a = 4$ and $b = 2$. We need to choose 2 downward steps from the total of 6 steps. This can be done in $\binom{6}{2} = \frac{6 \cdot 5}{2} = 15$ ways.

The number of trajectories from $(0, -2)$ to $(6, -6)$: if they make $a$ steps up and $b$ steps down, then $a + b = 6$ and $a - b = -6 - (-2) = -4$, so $a = 1$ and $b = 5$. We need to choose 1 upward step from the total of 6 steps. This can be done in 6 ways.

Thus, the number of trajectories from $(0, -2)$ to $(6, 0)$ which stay above $y = -3$ is equal to $15 - 6 = 9$. Each such trajectory has 4 steps upward and 2 steps downward, so its probability is $p^4q^2$. Therefore, $P(X_6 = 0,\; X_n \ge -3,\; n = 0, \ldots, 6) = 9p^4q^2$.

Now, consider the second half of the trajectory. Let us find the number of trajectories of the random walk from $(6, 0)$ to $(12, 2)$ which stay above the line $y = -3$. Every such trajectory makes $a$ steps upward and $b$ steps downward, so $a + b = 6$, $a - b = 2$, hence $a = 4$ and $b = 2$. The trajectory makes only 2 steps downward, so it simply cannot touch the line $y = -3$ if it starts from $(6, 0)$. So we need simply to find the number of trajectories from $(6, 0)$ to $(12, 2)$. This is equivalent to choosing 4 upward steps from 6 total steps, which can be done in $\binom{6}{4} = \frac{6 \cdot 5}{2} = 15$ ways. So there are 15 such trajectories. Each has probability $p^4q^2$, so the total probability that the random walk goes from $(6, 0)$ to $(12, 2)$ is $15p^4q^2$.

Multiplying it by the probability $9p^4q^2$ above, we get the answer: $135p^8q^4 \approx 0.0115$.
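Since there are only $2^{12} = 4096$ possible trajectories, the answer can also be verified by brute-force enumeration (a Python sketch):

```python
from itertools import product

p, q = 0.4, 0.6
total = 0.0
for steps in product([1, -1], repeat=12):   # all 12-step up/down trajectories
    x, path = -2, [-2]
    for s in steps:
        x += s
        path.append(x)
    if path[6] == 0 and path[12] == 2 and min(path) >= -3:
        ups = steps.count(1)
        total += p**ups * q**(12 - ups)
print(total, 135 * p**8 * q**4)  # both ≈ 0.011466
```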

10. (10 points) Find the probability that the Markov chain, starting from 3, hits 4 before hitting 1:
$$A = \begin{pmatrix} 0.1 & 0.2 & 0.3 & 0.4 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0.1 & 0.4 \\ 0.4 & 0.3 & 0.2 & 0.1 \end{pmatrix}.$$
Solution: Suppose that $p(x)$ is the probability that, starting from $x$, the Markov chain hits 4 before 1. Then $p(4) = 1$, $p(1) = 0$, and by the Markov property we have:
$$p(3) = 0.5p(2) + 0.1p(3) + 0.4p(4), \qquad p(2) = 0.5p(1) + 0.5p(3).$$
We can rewrite these latter equations as $0.9p(3) = 0.5p(2) + 0.4$ and $p(2) = 0.5p(3)$. Solving them:
$$0.9p(3) = 0.25p(3) + 0.4 \;\Longrightarrow\; 0.65p(3) = 0.4 \;\Longrightarrow\; p(3) = \frac{0.4}{0.65} = \frac{8}{13}.$$
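The same two first-step equations can be solved as a $2 \times 2$ linear system (a Python sketch assuming numpy):

```python
import numpy as np

# Unknowns h(2), h(3) for h(x) = P(hit 4 before 1 | start at x):
#   h(2) - 0.5*h(3) = 0          (using h(1) = 0)
#   -0.5*h(2) + 0.9*h(3) = 0.4   (using h(4) = 1)
coeff = np.array([[1.0, -0.5],
                  [-0.5, 0.9]])
rhs = np.array([0.0, 0.4])
h2, h3 = np.linalg.solve(coeff, rhs)
print(h3, 8 / 13)  # both ≈ 0.61538
```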

11. (10 points) Chebyshev's inequality states that, for $\lambda > 0$ and a random variable $X$, we have:
$$P(|X - EX| \ge \lambda) \le \frac{\operatorname{Var} X}{\lambda^2}.$$
For $\lambda = 2$, find an example of $X$ which turns it into an equality, but with positive left- and right-hand sides.

Solution: Try $X$ with the distribution $P(X = 2) = P(X = -2) = 0.5$. Then $EX = 2 \cdot 0.5 + (-2) \cdot 0.5 = 0$, and $\operatorname{Var} X = EX^2 - (EX)^2 = EX^2 = 4$, because $X^2 \equiv 4$. Therefore, the left and right sides of this inequality are both equal to 1.

12. (10 points) Suppose the company has $N_1 = 10000$ clients, who have car insurance for $n_1 = 10$ years, and $N_2 = 20000$ clients, who have car insurance for $n_2 = 12$ years. In each group, three-quarters of the clients are careful drivers, and one-quarter are wild drivers. Each month, a careful driver can have an accident with probability $p_1 = 2 \cdot 10^{-7}$, and a wild driver can have an accident with probability $p_2 = 5 \cdot 10^{-7}$. All accidents occur independently. For each accident, the company has to pay \$1000. The company wants to collect enough money from the premiums so that it can pay all the claims with probability greater than or equal to 95%. Find the amount of money needed to do this.

Solution: First, let us model the total number $X$ of accidents with a Poisson random variable with parameter
$$\lambda = \frac14 \cdot 12n_1N_1p_2 + \frac14 \cdot 12n_2N_2p_2 + \frac34 \cdot 12n_1N_1p_1 + \frac34 \cdot 12n_2N_2p_1 = 12\left(\frac14 p_2 + \frac34 p_1\right)(n_1N_1 + n_2N_2) = 33 \cdot 10^{-7} \cdot 340000 = 1.122.$$
Let us find $n$ such that $P(X \le n) \ge 95\%$. We have $P(X = k) = \frac{\lambda^k}{k!}e^{-\lambda}$, $k = 0, 1, 2, \ldots$, and after calculations we find $P(X \le 2) \approx 0.896 < 0.95 \le P(X \le 3) \approx 0.973$, so $n = 3$. The company needs \$3000.
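The value $n = 3$ can be found by accumulating Poisson probabilities until the 95% level is reached (a Python sketch):

```python
import math

lam = 1.122
target, cdf, k = 0.95, 0.0, 0
while True:
    cdf += math.exp(-lam) * lam**k / math.factorial(k)  # add P(X = k)
    if cdf >= target:
        break
    k += 1
print(k, cdf)  # n = 3, P(X <= 3) ≈ 0.9725
```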

Cumulative Probabilities of the Standard Normal Distribution

The table gives the probabilities $\alpha = \Phi(z)$ to the left of given $z$ values for the standard normal distribution. For example, the probability that a standard normal random variable $Z$ is less than 1.53 is found at the intersection of the 1.5 row and the 0.03 column, thus $\Phi(1.53) = P(Z \le 1.53) = 0.9370$. By symmetry, $\Phi(-z) = 1 - \Phi(z)$ for all $z$.

z     0.00    0.01    0.02    0.03    0.04    0.05    0.06    0.07    0.08    0.09
0.0   0.5000  0.5040  0.5080  0.5120  0.5160  0.5199  0.5239  0.5279  0.5319  0.5359
0.1   0.5398  0.5438  0.5478  0.5517  0.5557  0.5596  0.5636  0.5675  0.5714  0.5753
0.2   0.5793  0.5832  0.5871  0.5910  0.5948  0.5987  0.6026  0.6064  0.6103  0.6141
0.3   0.6179  0.6217  0.6255  0.6293  0.6331  0.6368  0.6406  0.6443  0.6480  0.6517
0.4   0.6554  0.6591  0.6628  0.6664  0.6700  0.6736  0.6772  0.6808  0.6844  0.6879
0.5   0.6915  0.6950  0.6985  0.7019  0.7054  0.7088  0.7123  0.7157  0.7190  0.7224
0.6   0.7257  0.7291  0.7324  0.7357  0.7389  0.7422  0.7454  0.7486  0.7517  0.7549
0.7   0.7580  0.7611  0.7642  0.7673  0.7704  0.7734  0.7764  0.7794  0.7823  0.7852
0.8   0.7881  0.7910  0.7939  0.7967  0.7995  0.8023  0.8051  0.8078  0.8106  0.8133
0.9   0.8159  0.8186  0.8212  0.8238  0.8264  0.8289  0.8315  0.8340  0.8365  0.8389
1.0   0.8413  0.8438  0.8461  0.8485  0.8508  0.8531  0.8554  0.8577  0.8599  0.8621
1.1   0.8643  0.8665  0.8686  0.8708  0.8729  0.8749  0.8770  0.8790  0.8810  0.8830
1.2   0.8849  0.8869  0.8888  0.8907  0.8925  0.8944  0.8962  0.8980  0.8997  0.9015
1.3   0.9032  0.9049  0.9066  0.9082  0.9099  0.9115  0.9131  0.9147  0.9162  0.9177
1.4   0.9192  0.9207  0.9222  0.9236  0.9251  0.9265  0.9279  0.9292  0.9306  0.9319
1.5   0.9332  0.9345  0.9357  0.9370  0.9382  0.9394  0.9406  0.9418  0.9429  0.9441
1.6   0.9452  0.9463  0.9474  0.9484  0.9495  0.9505  0.9515  0.9525  0.9535  0.9545
1.7   0.9554  0.9564  0.9573  0.9582  0.9591  0.9599  0.9608  0.9616  0.9625  0.9633
1.8   0.9641  0.9649  0.9656  0.9664  0.9671  0.9678  0.9686  0.9693  0.9699  0.9706
1.9   0.9713  0.9719  0.9726  0.9732  0.9738  0.9744  0.9750  0.9756  0.9761  0.9767
2.0   0.9772  0.9778  0.9783  0.9788  0.9793  0.9798  0.9803  0.9808  0.9812  0.9817
2.1   0.9821  0.9826  0.9830  0.9834  0.9838  0.9842  0.9846  0.9850  0.9854  0.9857
2.2   0.9861  0.9864  0.9868  0.9871  0.9875  0.9878  0.9881  0.9884  0.9887  0.9890
2.3   0.9893  0.9896  0.9898  0.9901  0.9904  0.9906  0.9909  0.9911  0.9913  0.9916
2.4   0.9918  0.9920  0.9922  0.9925  0.9927  0.9929  0.9931  0.9932  0.9934  0.9936
2.5   0.9938  0.9940  0.9941  0.9943  0.9945  0.9946  0.9948  0.9949  0.9951  0.9952
2.6   0.9953  0.9955  0.9956  0.9957  0.9959  0.9960  0.9961  0.9962  0.9963  0.9964
2.7   0.9965  0.9966  0.9967  0.9968  0.9969  0.9970  0.9971  0.9972  0.9973  0.9974
2.8   0.9974  0.9975  0.9976  0.9977  0.9977  0.9978  0.9979  0.9979  0.9980  0.9981
2.9   0.9981  0.9982  0.9982  0.9983  0.9984  0.9984  0.9985  0.9985  0.9986  0.9986
3.0   0.9987  0.9987  0.9987  0.9988  0.9988  0.9989  0.9989  0.9989  0.9990  0.9990
3.1   0.9990  0.9991  0.9991  0.9991  0.9992  0.9992  0.9992  0.9992  0.9993  0.9993
3.2   0.9993  0.9993  0.9994  0.9994  0.9994  0.9994  0.9994  0.9995  0.9995  0.9995
3.3   0.9995  0.9995  0.9995  0.9996  0.9996  0.9996  0.9996  0.9996  0.9996  0.9997
3.4   0.9997  0.9997  0.9997  0.9997  0.9997  0.9997  0.9997  0.9997  0.9997  0.9998

Quantiles of the Standard Normal Distribution

For selected probabilities $\alpha$, the table shows the values of the quantiles $z_\alpha$ such that $\Phi(z_\alpha) = P(Z \le z_\alpha) = \alpha$, where $Z$ is a standard normal random variable. The quantiles satisfy the relation $z_{1-\alpha} = -z_\alpha$.

α     0.9    0.95   0.975  0.99   0.995  0.999
z_α   1.282  1.645  1.960  2.326  2.576  3.090