
QUESTION BOOKLET

EE 126 Spring 2006 Midterm #2
Thursday, April 13, 11:10-12:30pm

DO NOT OPEN THIS QUESTION BOOKLET UNTIL YOU ARE TOLD TO DO SO

You have 80 minutes to complete the midterm. The midterm consists of three problems, provided in the question booklet (THIS BOOKLET), that are in no particular order of difficulty.

Write your solution to each problem in the space provided in the solution booklet (THE OTHER BOOKLET). Try to be neat! If we can't read it, we can't grade it.

You may give an answer in the form of an arithmetic expression (sums, products, ratios, factorials) that could be evaluated using a calculator. Expressions like $\binom{8}{3}$ or $\sum_{k=1}^{5} (1/2)^k$ are also fine.

A correct answer does not guarantee full credit, and a wrong answer does not guarantee loss of credit. You should concisely explain your reasoning and show all relevant work. The grade on each problem is based on our judgment of your understanding as reflected by what you have written.

This is a closed-book exam except for two sheets of handwritten notes (one 8.5 × 11 page, both sides, OR two pages, single side only), plus a calculator.

Problem 1: (13 points) Diana the Daredevil is trying to break the world land-speed record using her rocket-powered motorcycle. In order to do so, she needs to cover more than 500 yards in 10 seconds. If her motorcycle has $Z$ pounds of rocket fuel to start, then it travels a distance of $X = 50\sqrt{Z}$ yards in 10 seconds.

(a) (5 points) Suppose that the amount of rocket fuel $Z$ is a random variable uniformly distributed over $[50, 150]$. Compute the PDF and CDF of the distance $X$.

(b) (2 points) Compute the probability that Diana breaks the world record on any given trial.

(c) (5 points) Now suppose that Diana is allowed to take a total of $n$ trials in order to try and break the world record. (Here $n$ is some fixed positive integer, i.e., $n \in \{1, 2, 3, \ldots\}$.) The amount of rocket fuel at the start is independent from trial to trial. How large does $n$ have to be in order for Diana to have a better than 90% chance of breaking the world record over the $n$ trials?

Solution: (a) (5pt) We first compute the CDF $F_X(x) = P(X < x) = P(50\sqrt{Z} < x) = P\big(Z < (x/50)^2\big)$:

$$F_X(x) = \begin{cases} 0 & \text{if } x < 50\sqrt{50} \\[4pt] \dfrac{(x/50)^2 - 50}{100} & \text{if } 50\sqrt{50} \le x < 50\sqrt{150} \\[4pt] 1 & \text{if } x \ge 50\sqrt{150} \end{cases}$$

Using this CDF, we can derive the PDF:

$$f_X(x) = \frac{d}{dx} F_X(x) = \begin{cases} \dfrac{x}{125000} & \text{if } 50\sqrt{50} \le x < 50\sqrt{150} \\[4pt] 0 & \text{otherwise} \end{cases}$$

(b) (2pt) The probability that Diana breaks the world record equals the probability that her motorcycle travels a distance greater than 500 yards in 10 seconds:

$$P(X > 500) = 1 - F_X(500) = 1 - \frac{(500/50)^2 - 50}{100} = 1 - \frac{1}{2} = \frac{1}{2}.$$
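As a sanity check on parts (a) and (b), the record probability can be estimated by Monte Carlo. This is a sketch under the reconstructed parameters $Z \sim \text{Uniform}[50, 150]$ and $X = 50\sqrt{Z}$ (digits were dropped in the transcription, so these values are inferred from the solution's algebra):

```python
import math
import random

def sample_distance(rng):
    """Draw fuel Z ~ Uniform[50, 150] and return the distance X = 50*sqrt(Z)."""
    z = rng.uniform(50.0, 150.0)
    return 50.0 * math.sqrt(z)

def record_probability(num_trials=200_000, seed=0):
    """Monte Carlo estimate of P(X > 500); the closed form gives exactly 1/2."""
    rng = random.Random(seed)
    hits = sum(sample_distance(rng) > 500.0 for _ in range(num_trials))
    return hits / num_trials
```

With 200,000 trials the estimate lands within a fraction of a percent of the exact value 1/2, since $X > 500$ exactly when $Z > 100$, the midpoint of the fuel distribution.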

(c) (5pt) The probability that Diana breaks the world record within $n$ trials equals the probability that the maximum distance traveled over the $n$ trials exceeds the record distance. Let $Y = \max(X_1, X_2, \ldots, X_n)$ denote the maximum distance traveled, where the $X_i$ are independent and identically distributed copies of $X$. Then

$$F_Y(y) = P(Y < y) = P(\max(X_1, \ldots, X_n) < y) = P(X_1 < y, \ldots, X_n < y) = P(X_1 < y)^n = F_X(y)^n,$$

so the probability of breaking the record is

$$P(Y > 500) = 1 - F_Y(500) = 1 - F_X(500)^n = 1 - \left(\frac{1}{2}\right)^n.$$

The minimum $n$ for which $1 - (1/2)^n \ge 0.9$ is $n = 4$.
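The part (c) bound can be checked by brute force: scan upward until $1 - (1/2)^n$ clears the target (0.9 here, reflecting the "90%" that the transcription rendered as "9%"):

```python
def min_trials(target=0.9, p_single=0.5):
    """Smallest n with P(break record in n i.i.d. trials) = 1 - (1 - p_single)**n >= target."""
    n = 1
    while 1.0 - (1.0 - p_single) ** n < target:
        n += 1
    return n
```

`min_trials()` returns 4, matching the solution: $1 - (1/2)^4 = 0.9375 \ge 0.9$, while $1 - (1/2)^3 = 0.875$ falls short.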

Problem 2: (13 points) A car dealer sells two models of cars, Ray and Sprint. A Ray car breaks after a time $R$ days, where $R$ is exponentially distributed with parameter $\lambda_R$. Similarly, a Sprint car breaks after a random time $S$ that is exponentially distributed with parameter $\lambda_S$. The random variables $R$ and $S$ are independent. When it is broken, a car is brought to the dealer for repair. The cost (in dollars) of fixing a broken Ray is a random variable uniformly distributed over the interval $[1, 3]$, whereas the cost (in dollars) of fixing a broken Sprint is a random variable uniformly distributed over $[2, 4]$. The costs of fixing different cars are independent.

(a) (3 points) Given that a Ray did not break in the first $r$ days, what is the expected time before the Ray breaks?

(b) (4 points) What is the probability that a Ray breaks before a Sprint?

(c) (6 points) On some day, the dealer has to fix $N_R$ cars of the Ray model and $N_S$ cars of the Sprint model. Here $N_R$ and $N_S$ are independent Poisson random variables with parameters $\mu_R$ and $\mu_S$ respectively. The dealer is interested in estimating the total revenue $Z$ that he will earn from repairing all the cars that day. Compute the expected value, variance, and the moment generating function of $Z$.

Solution: (a) The expected time before the Ray breaks is

$$E[R \mid R > r] = r + \int_0^\infty P(R > r + h \mid R > r)\, dh.$$

From the memorylessness of the exponential random variable, $P(R > r + h \mid R > r) = P(R > h)$, so

$$E[R \mid R > r] = r + \int_0^\infty P(R > h)\, dh = r + \int_0^\infty e^{-\lambda_R h}\, dh = r + \frac{1}{\lambda_R}.$$

(b) The probability that a Ray breaks before a Sprint is

$$P(R < S) = \int_0^\infty \int_0^s \lambda_R e^{-\lambda_R r}\, \lambda_S e^{-\lambda_S s}\, dr\, ds = \int_0^\infty \big(1 - e^{-\lambda_R s}\big)\, \lambda_S e^{-\lambda_S s}\, ds = \frac{\lambda_R}{\lambda_R + \lambda_S}.$$
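Both closed forms in (a) and (b) are easy to verify by simulation. This sketch uses illustrative rates $\lambda_R = 1$, $\lambda_S = 2$ (the problem leaves the rates symbolic, so these values are assumptions for the demo only):

```python
import random

def race_probability(lam_r, lam_s, n=200_000, seed=0):
    """Estimate P(R < S) for independent exponentials; closed form: lam_r / (lam_r + lam_s)."""
    rng = random.Random(seed)
    wins = sum(rng.expovariate(lam_r) < rng.expovariate(lam_s) for _ in range(n))
    return wins / n

def mean_given_survival(lam_r, r, n=50_000, seed=0):
    """Estimate E[R | R > r] by rejection sampling; memorylessness predicts r + 1/lam_r."""
    rng = random.Random(seed)
    total, kept = 0.0, 0
    while kept < n:
        x = rng.expovariate(lam_r)
        if x > r:          # keep only samples consistent with the conditioning event
            total += x
            kept += 1
    return total / kept
```

With $\lambda_R = 1$, $\lambda_S = 2$, the race estimate converges to $1/3$, and `mean_given_survival(1.0, 1.0)` converges to $1 + 1/1 = 2$, illustrating that the surviving Ray is "as good as new."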

(c) The moment generating function of a random variable $X$ uniformly distributed over $[a, b]$ is

$$M_X(s) = \int_a^b e^{sx} \frac{1}{b - a}\, dx = \frac{e^{sb} - e^{sa}}{s(b - a)}.$$

The moment generating function of a Poisson random variable $N$ with parameter $\mu$ is $M_N(s) = e^{\mu(e^s - 1)}$. The total revenue $Z$ can be expressed as the sum of independent random variables

$$Z = \sum_{i=1}^{N_R} Y_{Ri} + \sum_{j=1}^{N_S} Y_{Sj},$$

where the $Y_{Ri}$ are independent random variables uniformly distributed over $[1, 3]$, and the $Y_{Sj}$ are independent random variables uniformly distributed over $[2, 4]$. The variables $Y_{Ri}$ and $Y_{Sj}$ are independent for any $i = 1, \ldots, N_R$ and $j = 1, \ldots, N_S$. Hence, the moment generating function of $Z$ is

$$\begin{aligned}
M_Z(s) &= E\big[e^{sZ}\big] = E\Big[e^{s\sum_{i=1}^{N_R} Y_{Ri} + s\sum_{j=1}^{N_S} Y_{Sj}}\Big] \\
&= E\Big[e^{s\sum_{i=1}^{N_R} Y_{Ri}}\Big]\, E\Big[e^{s\sum_{j=1}^{N_S} Y_{Sj}}\Big] \\
&= E\Big[E\big[e^{s\sum_{i=1}^{N_R} Y_{Ri}} \,\big|\, N_R\big]\Big]\, E\Big[E\big[e^{s\sum_{j=1}^{N_S} Y_{Sj}} \,\big|\, N_S\big]\Big] \\
&= E\big[M_{Y_R}(s)^{N_R}\big]\, E\big[M_{Y_S}(s)^{N_S}\big] \\
&= M_{N_R}(s)\Big|_{e^s \to M_{Y_R}(s)} \; M_{N_S}(s)\Big|_{e^s \to M_{Y_S}(s)} \\
&= e^{\mu_R (M_{Y_R}(s) - 1)}\, e^{\mu_S (M_{Y_S}(s) - 1)} \\
&= \exp\!\Big(\mu_R\Big(\frac{e^{3s} - e^{s}}{2s} - 1\Big)\Big)\, \exp\!\Big(\mu_S\Big(\frac{e^{4s} - e^{2s}}{2s} - 1\Big)\Big).
\end{aligned}$$

By iterated expectation and the law of total variance,

$$E[Z] = E[N_R]\, E[Y_{Ri}] + E[N_S]\, E[Y_{Sj}] = 2\mu_R + 3\mu_S,$$

$$\operatorname{Var}(Z) = \operatorname{Var}(N_R) E[Y_{Ri}]^2 + E[N_R] \operatorname{Var}(Y_{Ri}) + \operatorname{Var}(N_S) E[Y_{Sj}]^2 + E[N_S] \operatorname{Var}(Y_{Sj}) = \mu_R \cdot 2^2 + \mu_R \cdot \frac{2^2}{12} + \mu_S \cdot 3^2 + \mu_S \cdot \frac{2^2}{12} = \frac{13\mu_R}{3} + \frac{28\mu_S}{3}.$$
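The compound-Poisson mean and variance above can be confirmed by direct simulation. The sketch below assumes illustrative workloads $\mu_R = 4$, $\mu_S = 6$ (arbitrary values; the problem keeps them symbolic) and implements its own small Poisson sampler since the standard library lacks one:

```python
import math
import random

def poisson_sample(rng, mu):
    """Knuth's multiplication method for a Poisson(mu) draw (fine for small mu)."""
    threshold = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def revenue_moments(mu_r=4.0, mu_s=6.0, n=100_000, seed=0):
    """Sample the dealer's daily revenue Z and return (sample mean, sample variance).
    Theory: E[Z] = 2*mu_r + 3*mu_s and Var(Z) = 13*mu_r/3 + 28*mu_s/3."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        z = sum(rng.uniform(1.0, 3.0) for _ in range(poisson_sample(rng, mu_r)))
        z += sum(rng.uniform(2.0, 4.0) for _ in range(poisson_sample(rng, mu_s)))
        samples.append(z)
    mean = sum(samples) / n
    var = sum((z - mean) ** 2 for z in samples) / (n - 1)
    return mean, var
```

For $\mu_R = 4$, $\mu_S = 6$ the theoretical values are $E[Z] = 26$ and $\operatorname{Var}(Z) = 220/3 \approx 73.3$, and the sample moments converge to them.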

Problem 3: (15 points) George the Gambler is a good poker player: every round that he plays, he wins a random amount $Z$ of money, distributed as $Z \sim N(\mu, \sigma^2)$ with $\mu > 0$. Every night, George goes to the local poker club and plays $T = 1 + V$ rounds of poker, where $V$ is a Poisson random variable with parameter $\lambda = 5$. Let $X$ be the total amount of money that George wins in a given night.

(a) (2 points) In the absence of any further information, what is the best estimate of $X$? (Here "best" is measured by the minimum mean-squared error.)

(b) (3 points) Suppose that you observe that George plays $T = t$ rounds of poker (where $t$ is some positive integer). Now what is the best estimate of $X$ (again measured in terms of minimum mean-squared error)?

(c) (5 points) Now suppose that by peeking into George's wallet at the end of the night, you make a noisy observation $Y$ of the amount of money $X$ that he won, say of the form $Y = X + W$, where $W \sim N(0, 1)$ is Gaussian noise independent of $X$. Compute the Bayes least squares estimate of $X$ based on observing $T = t$ and $Y = y$. Also compute the linear least squares estimate (LLSE) of $X$ based on observing $T = t$ and $Y = y$, and the error variance of the LLSE.

(d) (5 points) Now suppose that you observe that $\{T \le 2\}$ and $Y = y$. Compute the Bayes least squares estimate of $X$ based on this information.

Solution: (a) We know that the best estimate of $X$ is $E[X]$. The mean $E[X]$ can be computed using the law of iterated expectation:

$$E[X] = E\big[E[X \mid T]\big] = E[T\mu] = \mu(1 + E[V]) = \mu(1 + 5) = 6\mu.$$

(b) If we observe $T = t$, then the best estimate of $X$ becomes $E[X \mid T = t]$. When $T = t$, $X = Z_1 + Z_2 + \cdots + Z_t$ is a Gaussian random variable with mean $t\mu$ and variance $t\sigma^2$. So $E[X \mid T = t] = E[Z_1 + Z_2 + \cdots + Z_t] = t\mu$.

(c) Given that $T = t$, we know that $X$ is a Gaussian random variable. The variable $Y$ is a noisy observation of $X$ contaminated by additive Gaussian noise $W$, independent of $X$. So conditioned on $T = t$, $Y$ itself is Gaussian, and $X, Y$ are jointly Gaussian random variables.

In this case, we know that the Bayes least squares estimate coincides with the LLSE. Conditioned on $T = t$, the variable $Y$ is Gaussian with mean $t\mu$ and variance $t\sigma^2 + 1$. Therefore, we have

$$E[X \mid T = t, Y = y] = t\mu + \frac{t\sigma^2}{t\sigma^2 + 1}(y - t\mu).$$

The error variance is

$$(1 - \rho^2)\,\sigma_X^2 = \sigma_X^2\left(1 - \frac{\sigma_X^2}{\sigma_Y^2}\right) = t\sigma^2\left(1 - \frac{t\sigma^2}{t\sigma^2 + 1}\right) = \frac{t\sigma^2}{t\sigma^2 + 1}.$$
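Because $X$ and $Y$ are jointly Gaussian given $T = t$, the estimator formula and its error variance can be confirmed empirically. The sketch below uses illustrative values $t = 3$, $\mu = 1$, $\sigma^2 = 4$ (assumptions for the demo; the problem leaves them symbolic):

```python
import random

def llse(t, y, mu, sigma2):
    """LLSE of X given T = t, Y = y:  t*mu + (t*sigma2 / (t*sigma2 + 1)) * (y - t*mu)."""
    return t * mu + (t * sigma2 / (t * sigma2 + 1.0)) * (y - t * mu)

def empirical_error_variance(t=3, mu=1.0, sigma2=4.0, n=100_000, seed=0):
    """Mean-squared error of the LLSE over simulated (X, Y) pairs given T = t.
    Theory predicts t*sigma2 / (t*sigma2 + 1)."""
    rng = random.Random(seed)
    sigma = sigma2 ** 0.5
    sq_err = 0.0
    for _ in range(n):
        x = sum(rng.gauss(mu, sigma) for _ in range(t))  # X = Z_1 + ... + Z_t
        y = x + rng.gauss(0.0, 1.0)                      # noisy wallet observation
        sq_err += (x - llse(t, y, mu, sigma2)) ** 2
    return sq_err / n
```

For $t = 3$, $\sigma^2 = 4$ the predicted error variance is $12/13 \approx 0.923$, noticeably below the prior variance $t\sigma^2 = 12$: peeking into the wallet is very informative because the observation noise has unit variance.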

(d) Using the conditional version of the law of total expectation, we have

$$E[X \mid T \le 2, Y = y] = \sum_{t=1}^{2} P[T = t \mid T \le 2, Y = y]\; E[X \mid T = t, Y = y].$$

From part (c), we have

$$E[X \mid T = t, Y = y] = t\mu + \frac{t\sigma^2}{t\sigma^2 + 1}(y - t\mu) \quad \text{for } t = 1, 2.$$

Using Bayes' rule, for $t = 1, 2$, we have

$$P[T = t \mid T \le 2, Y = y] = \frac{P[T = t, T \le 2, Y = y]}{P[T \le 2, Y = y]} = \frac{P[T = t, Y = y]}{P[T \le 2, Y = y]} = \frac{f_{Y \mid T}(y \mid t)\, P[T = t]}{f_{Y \mid T}(y \mid 1)\, P[T = 1] + f_{Y \mid T}(y \mid 2)\, P[T = 2]}.$$

Since $T = 1 + V$ with $V$ Poisson with parameter $\lambda = 5$, we have $P(T = 1) = e^{-5}$ and $P(T = 2) = 5e^{-5}$. Conditioned on $T = t$, the variable $Y$ is Gaussian with mean $t\mu$ and variance $t\sigma^2 + 1$, which specifies the form of the conditional PDFs $f_{Y \mid T}$.
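Part (d) is thus a two-component mixture of the part-(c) estimates, weighted by the posterior of $T$ given $\{T \le 2\}$ and $Y = y$. A direct numeric sketch (with the symbolic $\mu$, $\sigma^2$ passed in as arguments):

```python
import math

def normal_pdf(y, mean, var):
    """Density of N(mean, var) at y."""
    return math.exp(-(y - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def blse_T_at_most_2(y, mu, sigma2, lam=5.0):
    """E[X | T <= 2, Y = y]: average the part-(c) estimates over the posterior of T.
    Prior on T = 1 + V, V ~ Poisson(lam):  P(T=1) = e^-lam,  P(T=2) = lam * e^-lam."""
    prior = {1: math.exp(-lam), 2: lam * math.exp(-lam)}
    like = {t: normal_pdf(y, t * mu, t * sigma2 + 1.0) for t in (1, 2)}  # Y | T=t
    norm = sum(prior[t] * like[t] for t in (1, 2))
    est = 0.0
    for t in (1, 2):
        weight = prior[t] * like[t] / norm  # P[T = t | T <= 2, Y = y]
        cond = t * mu + (t * sigma2 / (t * sigma2 + 1.0)) * (y - t * mu)  # part (c)
        est += weight * cond
    return est
```

Because the weights form a proper posterior over $t \in \{1, 2\}$, the answer always lies between the two conditional estimates from part (c).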