Twelfth Problem Assignment


EECS 401 (not graded)

PROBLEM 1
Let $X_1, X_2, \dots$ be a sequence of independent random variables, each uniformly distributed between 0 and 1. Consider the sequences defined by
(a) $Y_n = \max(X_1, X_2, \dots, X_n)$,
(b) $Y_n$ is the median of the values of $X_1, X_2, \dots, X_{2n+1}$.
Determine whether or not the sequence $\{Y_n\}$ converges in probability in each case.

(a) $Y_n \to 1$ in probability. Since $Y_n \le 1$ always, for any $\varepsilon \in (0, 1)$,
$$P(|Y_n - 1| > \varepsilon) = P(Y_n < 1 - \varepsilon) = \prod_{i=1}^{n} P(X_i < 1 - \varepsilon) = (1 - \varepsilon)^n \to 0 \quad \text{as } n \to \infty.$$

(b) $Y_n \to \tfrac12$ in probability. We have
$$P(|Y_n - 0.5| > \varepsilon) \le P(Y_n > 0.5 + \varepsilon) + P(Y_n < 0.5 - \varepsilon),$$
and each of these terms converges to zero. Consider $P(Y_n > 0.5 + \varepsilon)$. For the event $\{Y_n \ge 0.5 + \varepsilon\}$ to occur, at least $n+1$ of the random variables $X_1, X_2, \dots, X_{2n+1}$ must have a value of $0.5 + \varepsilon$ or larger. Let $Z_i$ be a Bernoulli random variable which is equal to 1 if and only if $X_i \ge 0.5 + \varepsilon$, so that $P(Z_i = 1) = 0.5 - \varepsilon$. Then the event $\{Y_n \ge 0.5 + \varepsilon\}$ is the same as the event $\{(Z_1 + \cdots + Z_{2n+1})/(2n+1) \ge (n+1)/(2n+1)\}$, and $(n+1)/(2n+1) > 0.5$. By the weak law of large numbers, $(Z_1 + \cdots + Z_{2n+1})/(2n+1)$ converges in probability to $0.5 - \varepsilon < 0.5$. Therefore
$$P\left(\frac{Z_1 + \cdots + Z_{2n+1}}{2n+1} \ge 0.5\right)$$
converges to zero, which implies that $P(Y_n \ge 0.5 + \varepsilon)$ also converges to zero. By a similar argument $P(Y_n \le 0.5 - \varepsilon)$ converges to zero, and therefore $Y_n$ converges to $0.5$ in probability.
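Not part of the original solution, but both limits are easy to confirm empirically. A minimal Monte Carlo sketch, assuming NumPy; the trial count and tolerance are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
trials, eps = 5_000, 0.05

for n in (10, 100, 1000):
    X = rng.random((trials, 2 * n + 1))        # i.i.d. uniform(0, 1) samples
    Y_max = X[:, :n].max(axis=1)               # Y_n = max(X_1, ..., X_n)
    Y_med = np.median(X, axis=1)               # median of X_1, ..., X_{2n+1}
    print(n,
          np.mean(np.abs(Y_max - 1.0) > eps),  # -> 0, so Y_n -> 1 in probability
          np.mean(np.abs(Y_med - 0.5) > eps))  # -> 0, so Y_n -> 1/2 in probability
```

Both empirical tail frequencies shrink toward zero as $n$ grows, as the proofs predict.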

PROBLEM 2
(a) Show the one-sided Chebyshev inequality: if $X$ is a random variable with mean 0 and finite variance $\sigma^2$, then for any $a > 0$,
$$P(X \ge a) \le \frac{\sigma^2}{\sigma^2 + a^2}.$$

For any $b \ge 0$, applying Markov's inequality to the nonnegative random variable $(X + b)^2$ gives
$$P(X \ge a) = P(X + b \ge a + b) \le \frac{E[(X+b)^2]}{(a+b)^2} = \frac{\sigma^2 + b^2}{(a+b)^2}.$$
Treat the right-hand side as a function of $b$ and minimize with respect to $b$. The minimum is attained at $b = \sigma^2/a$. Substituting back, we get
$$P(X \ge a) \le \frac{\sigma^2 + \sigma^4/a^2}{(a + \sigma^2/a)^2} = \frac{\sigma^2}{\sigma^2 + a^2}.$$

(b) Generalize to the case when $E[X] = \mu$. In particular, show that
$$P(X \ge \mu + a) \le \frac{\sigma^2}{\sigma^2 + a^2}, \qquad P(X \le \mu - a) \le \frac{\sigma^2}{\sigma^2 + a^2}.$$

Take $X$ to be $X - \mu$ in part (a), and the first result follows. Take $X$ to be $\mu - X$, and the second result follows.

PROBLEM 3
Many people believe that the daily change of price of a company's stock on the stock market is a random variable with mean 0 and variance $\sigma^2$. That is, if $Y_n$ represents the price of the stock on the $n$th day, then
$$Y_n = Y_{n-1} + X_n, \qquad n \ge 1,$$
where $X_1, X_2, \dots$ are independent and identically distributed random variables with mean 0 and variance $\sigma^2$. Suppose that the stock's price today is 100. If $\sigma^2 = 1$, what can you say about the probability that the stock's price will exceed 105 after 10 days?

Given that $Y_0 = 100$, we want to bound $P(Y_{10} > 105)$. Note that $Y_{10} = Y_0 + X_1 + X_2 + \cdots + X_{10}$. Therefore $E[Y_{10}] = 100$ and $\mathrm{Var}(Y_{10}) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \cdots + \mathrm{Var}(X_{10}) = 10$. Now, using Chebyshev's inequality,
$$P(Y_{10} > 105) = P(Y_{10} - 100 > 5) \le P(|Y_{10} - 100| > 5) \le \frac{\mathrm{Var}(Y_{10})}{5^2} = \frac{10}{25} = 0.4.$$
We can get a tighter bound using the one-sided Chebyshev inequality of Problem 2:
$$P(Y_{10} > 105) = P(Y_{10} > 100 + 5) \le \frac{\mathrm{Var}(Y_{10})}{\mathrm{Var}(Y_{10}) + 5^2} = \frac{10}{35} \approx 0.286.$$
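The problem fixes only the mean and variance of the daily changes, so any step distribution with mean 0 and variance 1 can illustrate the bounds. A Monte Carlo sketch, assuming standard normal steps purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 1_000_000

# The daily changes are specified only up to mean 0 and variance 1;
# standard normal steps are an arbitrary choice consistent with that.
Y10 = 100 + rng.standard_normal((trials, 10)).sum(axis=1)

print("empirical P(Y10 > 105):", np.mean(Y10 > 105))   # ~0.057 for normal steps
print("Chebyshev bound:", 10 / 25)                     # 0.4
print("one-sided Chebyshev bound:", 10 / 35)           # ~0.286
```

Both bounds hold for every step distribution with the given moments; the gap to the empirical value reflects their distribution-free generality.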

PROBLEM 4
Consider a Gaussian random variable $X$ with mean $m > 0$ and variance $\sigma^2 = (m/2)^2$.

(a) Determine the probability that an experimental value of $X$ is negative.
$$P(X < 0) = P\left(\frac{X - m}{m/2} < -2\right) = P(Z < -2) = \Phi(-2) = 1 - \Phi(2) \approx 0.0228.$$

(b) Determine the probability that the sum of four independent experimental values of $X$ is negative. Let $Y = X_1 + X_2 + X_3 + X_4$. Then $E[Y] = 4m$ and $\mathrm{Var}(Y) = 4(m/2)^2 = m^2$. Thus
$$P(Y < 0) = P\left(\frac{Y - 4m}{m} < -4\right) = \Phi(-4) = 1 - \Phi(4) \approx 0.$$

(c) Two independent experimental values of $X$ ($X_1$, $X_2$) are obtained. Determine the PDF, mean, and variance of (i) $aX_1 + bX_2$, (ii) $aX_1 - bX_2$, (iii) $|X_1 - X_2|$, where $a, b > 0$.

(i) $aX_1 + bX_2$: a linear combination of independent Gaussian random variables is again Gaussian, here with mean $(a+b)m$ and variance $(a^2 + b^2)m^2/4$.

(ii) $aX_1 - bX_2$: likewise Gaussian, with mean $(a-b)m$ and variance $(a^2 + b^2)m^2/4$.

(iii) $|X_1 - X_2|$: let $Y = X_1 - X_2$; then $Y$ is Gaussian with mean 0 and variance $m^2/2$. Folding the density of $Y$ about the origin, the density of $Z = |Y|$ is
$$f_Z(z) = \frac{2}{m\sqrt{\pi}}\, e^{-z^2/m^2}, \qquad z \ge 0,$$
and
$$E[Z] = \int_0^\infty \frac{2z}{m\sqrt{\pi}}\, e^{-z^2/m^2}\, dz = \frac{m}{\sqrt{\pi}},$$
$$\mathrm{Var}(Z) = E[Z^2] - (E[Z])^2 = \frac{m^2}{2} - \frac{m^2}{\pi} = \frac{(\pi - 2)\, m^2}{2\pi},$$
where we used $E[Z^2] = E[Y^2] = \mathrm{Var}(Y) = m^2/2$.

(d) Determine the probability that the product of four independent experimental values of $X$ is negative. Let $Y = X_1 X_2 X_3 X_4$ and let $p$ be the probability that $X_i < 0$. From part (a),
$$p = P(X_i < 0) = \Phi(-2) = 1 - \Phi(2) \approx 0.0228.$$
$Y$ is negative if and only if an odd number of the $X_i$ are negative, that is, either exactly one or exactly three of them:
$$P(Y < 0) = \binom{4}{1} p (1-p)^3 + \binom{4}{3} p^3 (1-p) \approx 0.085.$$
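A hedged numerical check of parts (a), (b), (d), and the moments in (c)(iii); the value $m = 2$ is an arbitrary assumption, and the three probabilities do not depend on it:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 2.0                                        # arbitrary assumption; any m > 0 works
X = rng.normal(m, m / 2, size=(1_000_000, 4))  # four independent values per trial

print("P(X < 0):      ", np.mean(X[:, 0] < 0))         # ~0.0228
print("P(sum < 0):    ", np.mean(X.sum(axis=1) < 0))   # ~3e-5, effectively 0
print("P(product < 0):", np.mean(X.prod(axis=1) < 0))  # ~0.085

Z = np.abs(X[:, 0] - X[:, 1])                  # |X1 - X2| from part (c)(iii)
print("E[Z] vs m/sqrt(pi):", Z.mean(), m / np.sqrt(np.pi))
print("Var(Z) vs exact:   ", Z.var(), (np.pi - 2) * m**2 / (2 * np.pi))
```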

PROBLEM 5
Consider a Poisson process with mean arrival rate $\lambda = 1$ and let $X_n$ be the number of arrivals between 0 and $n$. Does $X_n/n$ converge in probability?

We know that $X_n$ is a Poisson random variable with parameter $\lambda n = n$. Therefore $E[X_n] = \mathrm{var}(X_n) = n$, which means that $E[X_n/n] = 1$ and $\mathrm{var}(X_n/n) = 1/n$. By using the Chebyshev inequality, we have the following for any $\epsilon > 0$:
$$P\left(\left|\frac{X_n}{n} - 1\right| \ge \epsilon\right) \le \frac{\mathrm{var}(X_n/n)}{\epsilon^2} = \frac{1/n}{\epsilon^2}.$$
Therefore, taking the limit as $n \to \infty$:
$$\lim_{n \to \infty} P\left(\left|\frac{X_n}{n} - 1\right| \ge \epsilon\right) \le \lim_{n \to \infty} \frac{1/n}{\epsilon^2} = 0.$$
Since a probability is always greater than or equal to 0, the above inequality implies that
$$\lim_{n \to \infty} P\left(\left|\frac{X_n}{n} - 1\right| \ge \epsilon\right) = 0.$$
Thus, we have shown that $X_n/n$ converges in probability to 1.
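A small simulation, assuming NumPy, comparing the empirical tail probability with the Chebyshev bound $1/(n\epsilon^2)$ used in the proof ($\epsilon = 0.2$ chosen arbitrarily; the bound is vacuous until it drops below 1, but it always holds):

```python
import numpy as np

rng = np.random.default_rng(3)
trials, eps = 100_000, 0.2

for n in (10, 100, 1000):
    Xn = rng.poisson(lam=n, size=trials)        # X_n ~ Poisson(n) when lambda = 1
    tail = np.mean(np.abs(Xn / n - 1) >= eps)   # empirical P(|X_n/n - 1| >= eps)
    print(n, tail, 1 / (n * eps**2))            # vs the Chebyshev bound 1/(n eps^2)
```

The empirical tail decays much faster than $1/n$; the Chebyshev bound is loose but sufficient for the convergence proof.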

PROBLEM 6
Suppose that the number of units produced daily at factory A is a random variable with mean 20 and standard deviation 3, and the number produced at factory B is a random variable with mean 18 and standard deviation 6. Assuming independence, derive an upper bound for the probability that more units are produced today at factory B than at factory A.

Let $X$ be the number of units produced by factory A and let $Y$ be the number of units produced by factory B. Then $E[Y - X] = E[Y] - E[X] = 18 - 20 = -2$. Also, since $Y$ and $X$ are independent,
$$\mathrm{var}(Y - X) = \mathrm{var}(Y) + \mathrm{var}(-X) = \mathrm{var}(Y) + \mathrm{var}(X) = 36 + 9 = 45.$$
We can use this information to apply the one-sided Chebyshev inequality and obtain the following upper bound:
$$P(Y - X \ge 0) = P\big(Y - X \ge (-2) + 2\big) \le \frac{\mathrm{var}(Y - X)}{\mathrm{var}(Y - X) + 2^2} = \frac{45}{49}.$$
Thus, an upper bound for the probability that factory B produces at least as many units as factory A is $45/49 \approx 0.92$.

PROBLEM 7
Consider the number of 3s which result from 600 tosses of a fair six-sided die.

(a) Determine the probability that there are exactly one hundred 3s, using a form of Stirling's approximation for $n!$ which is very accurate for these values, $n! \approx \sqrt{2\pi}\, e^{-n}\, n^{n+0.5}$.
$$P(\text{exactly one hundred 3s}) = \binom{600}{100} \left(\frac{1}{6}\right)^{100} \left(\frac{5}{6}\right)^{500} = \frac{600!}{100!\, 500!} \cdot \frac{5^{500}}{6^{600}}$$
$$\approx \frac{\sqrt{2\pi}\, e^{-600}\, 600^{600.5}}{\sqrt{2\pi}\, e^{-100}\, 100^{100.5} \cdot \sqrt{2\pi}\, e^{-500}\, 500^{500.5}} \cdot \frac{5^{500}}{6^{600}} = \frac{1}{\sqrt{2\pi}} \sqrt{\frac{600}{500 \cdot 100}} \approx 0.0437.$$
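Before moving to parts (b) and (c), a quick sanity check of part (a), assuming SciPy is available, comparing the exact binomial PMF against the Stirling-based value:

```python
import math
from scipy.stats import binom

exact = binom.pmf(100, 600, 1/6)                                   # exact binomial PMF
stirling = math.sqrt(600 / (500 * 100)) / math.sqrt(2 * math.pi)   # part (a) formula
print(exact, stirling)   # ~0.04369 vs ~0.04372: Stirling is accurate to ~0.1% here
```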

(b) Use the Poisson approximation to the binomial PMF to obtain the probability that there are exactly one hundred 3s. With parameter $\lambda = 600 \cdot \tfrac{1}{6} = 100$, and using the same Stirling approximation for $100!$:
$$P(\text{one hundred 3s}) \approx \frac{100^{100}\, e^{-100}}{100!} \approx \frac{100^{100}\, e^{-100}}{\sqrt{2\pi}\, e^{-100}\, 100^{100.5}} = \frac{1}{10\sqrt{2\pi}} \approx 0.0399.$$

(c) Repeat part (b), using the central limit theorem intelligently. Let $X_i$ denote the indicator of the event that the outcome of the $i$th toss is a 3; $X_i$ is a Bernoulli random variable with parameter $1/6$. Let $K = \sum_{i=1}^{600} X_i$. Then $E[K] = 100$ and $\mathrm{Var}(K) = 600 \cdot \tfrac{1}{6} \cdot \tfrac{5}{6} = \tfrac{500}{6}$, so $\sigma_K = \sqrt{500/6} \approx 9.13$. Using the central limit theorem "intelligently" means treating the event $\{K = 100\}$ as $\{99.5 < K < 100.5\}$ (the continuity correction):
$$P(K = 100) \approx \Phi\left(\frac{100.5 - 100}{9.13}\right) - \Phi\left(\frac{99.5 - 100}{9.13}\right) = 2\Phi(0.0548) - 1 \approx 0.0437,$$
in close agreement with part (a).

(d) Use the Chebyshev inequality to find a lower bound on the probability that the number of 3s is between 97 and 103 inclusive, between 90 and 110 inclusive, and between 60 and 140 inclusive.

From Chebyshev's inequality, with $\mathrm{Var}(K) = 500/6 \approx 83.3$,
$$P(|K - 100| \le a) \ge 1 - \frac{\mathrm{Var}(K)}{a^2}.$$

a = 3:  bound is $1 - 83.3/9 < 0$, i.e. vacuous (no information)
a = 10: bound is $1 - 83.3/100 \approx 0.167$
a = 40: bound is $1 - 83.3/1600 \approx 0.948$

(e) Repeat part (d), using the central limit theorem and employing the DeMoivre-Laplace result when it appears relevant. Compare your answers with those obtained above, and comment.

Using the DeMoivre-Laplace approximation with a continuity correction,
$$P(100 - a \le K \le 100 + a) \approx 2\Phi\left(\frac{a + 0.5}{9.13}\right) - 1.$$

a = 3:  $2\Phi(0.383) - 1 \approx 0.299$
a = 10: $2\Phi(1.150) - 1 \approx 0.750$
a = 40: $2\Phi(4.44) - 1 \approx 1.000$

Close to the mean of the sum, the central limit theorem gives a useful estimate while the Chebyshev bound is weak or even vacuous. Far from the mean, the normal approximation of the tails becomes unreliable, and Chebyshev's inequality, though crude, has the advantage of being a guaranteed bound.
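The three approximations and both tables can be checked numerically against the exact binomial probabilities; a sketch assuming SciPy:

```python
import math
from scipy.stats import binom, norm, poisson

n, p = 600, 1/6
mu, sigma = n * p, math.sqrt(n * p * (1 - p))          # 100 and ~9.13

print("exact  :", binom.pmf(100, n, p))                # ~0.0437
print("Poisson:", poisson.pmf(100, mu))                # ~0.0399
print("CLT    :", norm.cdf(0.5 / sigma) - norm.cdf(-0.5 / sigma))

for a in (3, 10, 40):
    chebyshev = max(0.0, 1 - sigma**2 / a**2)          # guaranteed lower bound
    clt = 2 * norm.cdf((a + 0.5) / sigma) - 1          # DeMoivre-Laplace estimate
    exact = binom.cdf(100 + a, n, p) - binom.cdf(99 - a, n, p)
    print(a, chebyshev, clt, exact)
```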

PROBLEM 8
Student scores on exams given by a certain instructor have mean 74 and standard deviation 14. This instructor is about to give two exams, one to a class of size 25 and the other to a class of size 64.

Let $X_i$ denote the score of student $i$ and let $M_n = \frac{1}{n}\sum_{i=1}^{n} X_i$. Then $E[M_n] = 74$ and $\mathrm{Var}(M_n) = 196/n$.

(a) Approximate the probability that the average test score in the class of size 25 exceeds 80. By the one-sided Chebyshev inequality,
$$P(M_{25} \ge 74 + 6) \le \frac{196/25}{196/25 + 36} = \frac{49}{274} \approx 0.179.$$

(b) Repeat part (a) for the class of size 64.
$$P(M_{64} \ge 74 + 6) \le \frac{196/64}{196/64 + 36} = \frac{49}{625} \approx 0.078.$$

(c) Approximate the probability that the average test score in the larger class exceeds that of the other class by over 2.2 points. Let $M = M_{64} - M_{25}$. Then $E[M] = 0$ and
$$\mathrm{Var}(M) = \mathrm{Var}(M_{64}) + \mathrm{Var}(M_{25}) = 196\left(\frac{1}{64} + \frac{1}{25}\right) \approx 10.9025.$$
Thus
$$P(M \ge 2.2) \le \frac{\mathrm{Var}(M)}{\mathrm{Var}(M) + 2.2^2} \approx 0.6925.$$

(d) Approximate the probability that the average test score in the smaller class exceeds that of the other class by over 2.2 points. By symmetry ($-M$ has the same mean and variance as $M$),
$$P(-M \ge 2.2) \le \frac{\mathrm{Var}(M)}{\mathrm{Var}(M) + 2.2^2} \approx 0.6925.$$
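All four bounds are instances of the one-sided Chebyshev formula from Problem 2, so a small helper makes the arithmetic explicit (a sketch, not part of the original solution):

```python
def one_sided_chebyshev(var: float, a: float) -> float:
    """Upper bound on P(X - E[X] >= a) for a > 0 (Problem 2)."""
    return var / (var + a * a)

print(one_sided_chebyshev(196 / 25, 6))                # (a) ~0.179
print(one_sided_chebyshev(196 / 64, 6))                # (b) ~0.078
print(one_sided_chebyshev(196 * (1/25 + 1/64), 2.2))   # (c), (d) ~0.6925
```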

PROBLEM 9
The energy of any individual particle in a certain system is an independent random variable with probability density function
$$f_E(e) = \begin{cases} 2e^{-2e} & e \ge 0 \\ 0 & e < 0 \end{cases}$$
The total energy in the system is the sum of the energies of the individual particles.

The energy of each particle is an exponential random variable with mean $\tfrac12$ and variance $\tfrac14$. If there are $n$ particles in the system, the total energy $K$ has mean $n/2$ and variance $n/4$.

(a) If there are 1,600 particles in the system, determine the probability that there are between 780 and 840 energy units in the system. In this case $E[K] = 800$ and $\mathrm{Var}(K) = 400$. Using the central limit theorem,
$$P(780 \le K \le 840) \approx \Phi\left(\frac{840 - 800}{20}\right) - \Phi\left(\frac{780 - 800}{20}\right) = \Phi(2) - \Phi(-1) \approx 0.8185.$$

(b) What is the largest number of particles the system may contain if the probability that its total energy is less than 440 units must be at least 0.9725? Suppose the number of particles in the system is $n$, so that $E[K] = n/2$ and $\mathrm{Var}(K) = n/4$. The Markov inequality gives $P(K \ge 440) \le (n/2)/440 = n/880$, which guarantees $P(K < 440) \ge 0.9725$ only for $n \le 880 \cdot 0.0275 \approx 24$, a very conservative answer. Using the central limit theorem instead, and noting that $\Phi(1.92) \approx 0.9725$,
$$P(K < 440) \approx \Phi\left(\frac{440 - n/2}{\sqrt{n}/2}\right) \ge 0.9725 \quad \Longleftrightarrow \quad \frac{880 - n}{\sqrt{n}} \ge 1.92.$$
Solving the quadratic in $\sqrt{n}$ gives $n \le 824.7$. Hence the largest number of particles that the system may contain is 824.

(c) Each particle will escape from the system if its energy exceeds $(\ln 3)/2$ units. If the system originally contained 4,800 particles, what is the probability that at least 1,700 particles will escape?

Let $Y_i$ be the indicator of the event that particle $i$ escapes. Then
$$P(Y_i = 1) = \int_{(\ln 3)/2}^{\infty} 2e^{-2x}\, dx = e^{-\ln 3} = \frac13.$$
Let $L = \sum_{i=1}^{4800} Y_i$, the number of particles that escape. Then $E[L] = 4800/3 = 1600$ and $\mathrm{Var}(L) = 4800 \cdot \tfrac13 \cdot \tfrac23 = \tfrac{3200}{3} \approx 1066.7$. Using the one-sided Chebyshev inequality, we have
$$P(L \ge 1600 + 100) \le \frac{1066.7}{1066.7 + 100^2} = \frac{8}{83} \approx 0.096.$$

(d) If there are 10 particles in the system, determine an exact expression for the PDF of the total energy in the system. The total energy $T$ is the sum of 10 independent exponential random variables, so its PDF is an Erlang density of order 10:
$$f_T(x) = \begin{cases} \dfrac{2^{10}\, x^9\, e^{-2x}}{9!} & x \ge 0 \\ 0 & x < 0 \end{cases}$$
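Since the total energy is a sum of i.i.d. exponentials, it is exactly gamma (Erlang) distributed, which allows an exact cross-check of the approximations above; a sketch assuming SciPy:

```python
import numpy as np
from scipy.stats import gamma, norm

# (a) CLT estimate vs the exact distribution of K: a sum of 1600 exponentials
#     with rate 2 is gamma with shape 1600 and scale 1/2.
print(norm.cdf(2) - norm.cdf(-1))                                         # ~0.8186
print(gamma.cdf(840, a=1600, scale=0.5) - gamma.cdf(780, a=1600, scale=0.5))

# (b) check: with n = 824 particles, P(K < 440) is right at the requirement
print(gamma.cdf(440, a=824, scale=0.5))                                   # ~0.97

# (c) exact escape probability per particle
print(np.exp(-2 * np.log(3) / 2))                                         # 1/3
```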

PROBLEM 10
A set of 200 people, consisting of 100 men and 100 women, is randomly divided into 100 pairs of 2 each. Give an upper bound to the probability that at most 30 of these pairs will consist of a man and a woman.

Number the 100 men from 1 to 100 and let $X_i$ be the indicator of the event that man $i$ is paired with a woman, that is,
$$X_i = \begin{cases} 1 & \text{if man } i \text{ is paired with a woman} \\ 0 & \text{otherwise} \end{cases}$$
Let $X = \sum_{i=1}^{100} X_i$ denote the number of man-woman pairs. Since man $i$'s partner is equally likely to be any of the other 199 people,
$$E[X_i] = P(X_i = 1) = \frac{100}{199}.$$
Also, for $i \ne j$,
$$E[X_i X_j] = P(X_i = 1, X_j = 1) = P(X_i = 1)\, P(X_j = 1 \mid X_i = 1) = \frac{100}{199} \cdot \frac{99}{197}.$$
Therefore
$$E[X] = \sum_{i=1}^{100} E[X_i] = 100 \cdot \frac{100}{199} \approx 50.25$$
and
$$\mathrm{Var}(X) = \sum_{i=1}^{100} \mathrm{Var}(X_i) + 2\sum_{i<j} \mathrm{Cov}(X_i, X_j) = 100 \cdot \frac{100}{199} \cdot \frac{99}{199} + 2\binom{100}{2}\left[\frac{100}{199} \cdot \frac{99}{197} - \left(\frac{100}{199}\right)^2\right] \approx 25.126.$$

(a) Using Chebyshev's inequality:
$$P(X \le 30) \le P(|X - 50.25| \ge 20.25) \le \frac{25.126}{(20.25)^2} \approx 0.061.$$

(b) Using the one-sided Chebyshev inequality:
$$P(X \le 30) = P(X - 50.25 \le -20.25) \le \frac{25.126}{25.126 + (20.25)^2} \approx 0.058.$$

(c) Using the central limit theorem:
$$P(X \le 30) \approx \Phi\left(\frac{30 - 50.25}{\sqrt{25.126}}\right) = \Phi(-4.04) \approx 0.$$
Notice that 30 is far from the mean, so the central limit theorem gives a poor approximation here, and unlike the bounds above it comes with no guarantee.
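The pairing model is easy to simulate directly, which corroborates the computed mean and variance; a NumPy sketch (trial count arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
trials = 20_000
counts = np.empty(trials, dtype=int)

for t in range(trials):
    perm = rng.permutation(200)          # people 0..99 are men, 100..199 are women
    pairs = perm.reshape(100, 2)         # consecutive entries form the 100 pairs
    counts[t] = np.sum((pairs < 100).sum(axis=1) == 1)   # pairs with exactly one man

print(counts.mean(), counts.var())       # ~50.25 and ~25.1, matching the solution
print(np.mean(counts <= 30))             # essentially 0, consistent with part (c)
```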

PROBLEM 11
The number of rabbits in generation $i$ is $N_i$. The variable $N_{i+1}$, the number of rabbits in generation $i+1$, depends on the random effects of light, heat, water, food, and predator population, as well as on the number $N_i$. The relation is
$$P_{N_{i+1}}(j N_i) = \begin{cases} 0.2 & j = 1 \\ 0.3 & j = 2 \\ 0.5 & j = 3 \end{cases}$$

This states, for instance, that with probability 0.5 there will be three times as many rabbits in generation $i+1$ as there were in generation $i$. The rabbit population in generation 1 was 2. Find an approximation to the number of rabbits in generation 12. (Hint: to use the central limit theorem, you must find an expression involving a sum of random variables.)

Let $J_i$ denote the independent, identically distributed growth factor from generation $i-1$ to generation $i$, taking the values 1, 2, 3 with probabilities 0.2, 0.3, 0.5. Then
$$N_{12} = 2\, J_2 J_3 \cdots J_{12}, \qquad \log N_{12} = \log 2 + \log J_2 + \cdots + \log J_{12}.$$
Taking logarithms turns the product into a sum of 11 i.i.d. random variables with (using base-10 logarithms)
$$E[\log J] = 0.2 \log 1 + 0.3 \log 2 + 0.5 \log 3 \approx 0.3289.$$
By the weak law of large numbers, the sum concentrates near its mean, so in probability
$$\log N_{12} \approx \log 2 + 11\, E[\log J] \approx 0.3010 + 11 \cdot 0.3289 = 3.9186,$$
$$N_{12} \approx 10^{3.9186} \approx 8290.$$
Therefore there are approximately 8290 rabbits in generation 12 (assuming that rabbits do not die).
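A simulation sketch, assuming NumPy, contrasting this typical (law-of-large-numbers) value with the expectation:

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 100_000

# Eleven i.i.d. growth factors J_2, ..., J_12 per trial, starting from 2 rabbits
J = rng.choice([1, 2, 3], p=[0.2, 0.3, 0.5], size=(trials, 11))
N12 = 2 * J.prod(axis=1)

print(np.exp(np.log(N12).mean()))   # geometric-mean (typical) value, ~8290
print(np.median(N12))               # median is in the same range
print(N12.mean())                   # E[N12] = 2 * 2.3**11 ~ 19000, much larger
```

The value 8290 describes the typical population; the expectation is dominated by rare fast-growing trajectories, which is why it comes out more than twice as large.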